Science.gov

Sample records for management independent verification

  1. Software risk management through independent verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Zhou, Tong C.; Wood, Ralph

    1995-01-01

    Software project managers need tools to estimate and track project goals in a continuous fashion before, during, and after development of a system. In addition, they need an ability to compare the current project status with past project profiles to validate management intuition, identify problems, and then direct appropriate resources to the sources of problems. This paper describes a measurement-based approach to calculating the risk inherent in meeting project goals that leverages past project metrics and existing estimation and tracking models. We introduce the IV&V Goal/Questions/Metrics model, explain its use in the software development life cycle, and describe our attempts to validate the model through the reverse engineering of existing projects.

  2. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort...

  3. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... missing statutory or regulatory deadlines for automation that is intended to meet program requirements; (2... program offices in the development and implementation of the project. (b) Independent Verification...

  4. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... missing statutory or regulatory deadlines for automation that is intended to meet program requirements; (2... program offices in the development and implementation of the project. (b) Independent Verification...

  5. A Tutorial on Text-Independent Speaker Verification

    NASA Astrophysics Data System (ADS)

    Bimbot, Frédéric; Bonastre, Jean-François; Fredouille, Corinne; Gravier, Guillaume; Magrin-Chagnolleau, Ivan; Meignier, Sylvain; Merlin, Teva; Ortega-García, Javier; Petrovska-Delacrétaz, Dijana; Reynolds, Douglas A.

    2004-12-01

    This paper presents an overview of a state-of-the-art text-independent speaker verification system. First, an introduction proposes a modular scheme of the training and test phases of a speaker verification system. Then, the speech parameterization most commonly used in speaker verification, namely, cepstral analysis, is detailed. Gaussian mixture modeling, which is the speaker modeling technique used in most systems, is then explained. A few speaker modeling alternatives, namely, neural networks and support vector machines, are mentioned. Normalization of scores is then explained, as this is a very important step to deal with real-world data. The evaluation of a speaker verification system is then detailed, and the detection error trade-off (DET) curve is explained. Several extensions of speaker verification are then enumerated, including speaker tracking and segmentation by speakers. Then, some applications of speaker verification are proposed, including on-site applications, remote applications, applications relative to structuring audio information, and games. Issues concerning the forensic area are then recalled, as we believe it is very important to inform people about the actual performance and limitations of speaker verification systems. This paper concludes by giving a few research trends in speaker verification for the next couple of years.
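
The Gaussian mixture modeling described above reduces, at verification time, to comparing the average log-likelihood of the test frames under the claimed speaker's model against a background (world) model. A minimal illustrative sketch with invented one-dimensional "cepstral" frames and hand-set model parameters (all values here are hypothetical, not from the paper):

```python
import math

def gauss_logpdf(x, mean, var):
    # log density of a 1-D Gaussian
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def gmm_loglik(frames, weights, means, variances):
    # average per-frame log-likelihood under a 1-D diagonal GMM
    total = 0.0
    for x in frames:
        comp = [math.log(w) + gauss_logpdf(x, m, v)
                for w, m, v in zip(weights, means, variances)]
        mx = max(comp)  # log-sum-exp for numerical stability
        total += mx + math.log(sum(math.exp(c - mx) for c in comp))
    return total / len(frames)

def llr_score(frames, target, ubm):
    # log-likelihood ratio: claimed-speaker model vs. background model
    return gmm_loglik(frames, *target) - gmm_loglik(frames, *ubm)

# toy models: (weights, means, variances)
target = ([0.5, 0.5], [-1.0, 1.0], [0.25, 0.25])
ubm = ([1.0], [0.0], [4.0])
claimed = [-1.1, 0.9, -0.95, 1.05]   # frames near the target modes
impostor = [3.0, -3.2, 3.1, -2.9]    # frames far from them
print(llr_score(claimed, target, ubm) > llr_score(impostor, target, ubm))  # True
```

A real system would use multivariate cepstral vectors and hundreds of mixture components, but the accept/reject decision is still a threshold on this ratio.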

  6. Systems analysis-independent analysis and verification

    SciTech Connect

    Badin, J.S.; DiPietro, J.P.

    1995-09-01

    The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes a formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions and by stimulating discussions, feedback, and coordination of key players, allowing them to assess the analysis, evaluate the trade-offs, and address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed system and identify system development needs. Systems that are investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), a wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel a PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.

  7. Systems analysis - independent analysis and verification

    SciTech Connect

    DiPietro, J.P.; Skolnik, E.G.; Badin, J.S.

    1996-10-01

    The Hydrogen Program of the U.S. Department of Energy (DOE) funds a portfolio of activities ranging from conceptual research to pilot plant testing. The long-term research projects support DOE's goal of a sustainable, domestically based energy system, and the development activities are focused on hydrogen-based energy systems that can be commercially viable in the near-term. Energetics develops analytic products that enable the Hydrogen Program Manager to assess the potential for near- and long-term R&D activities to satisfy DOE and energy market criteria. This work is based on a pathway analysis methodology. The authors consider an energy component (e.g., hydrogen production from biomass gasification, hybrid hydrogen internal combustion engine (ICE) vehicle) within a complete energy system. The work involves close interaction with the principal investigators to ensure accurate representation of the component technology. Comparisons are made with the current cost and performance of fossil-based and alternative renewable energy systems, and sensitivity analyses are conducted to determine the effect of changes in cost and performance parameters on the projects' viability.

  8. Applying Independent Verification and Validation to Automatic Test Equipment

    NASA Technical Reports Server (NTRS)

    Calhoun, Cynthia C.

    1997-01-01

    This paper describes a general overview of applying Independent Verification and Validation (IV&V) to Automatic Test Equipment (ATE). The overview is not inclusive of all IV&V activities that can occur, or of all development and maintenance items that can be validated and verified, during the IV&V process. A sampling of possible IV&V activities that can occur within each phase of the ATE life cycle is described.

  9. Ares I-X Range Safety Simulation Verification and Analysis Independent Validation and Verification

    NASA Technical Reports Server (NTRS)

    Merry, Carl M.; Tarpley, Ashley F.; Craig, A. Scott; Tartabini, Paul V.; Brewer, Joan D.; Davis, Jerel G.; Dulski, Matthew B.; Gimenez, Adrian; Barron, M. Kyle

    2011-01-01

    NASA's Ares I-X vehicle launched on a suborbital test flight from the Eastern Range in Florida on October 28, 2009. To obtain approval for launch, a range safety final flight data package was generated to meet the data requirements defined in the Air Force Space Command Manual 91-710 Volume 2. The delivery included products such as a nominal trajectory, trajectory envelopes, stage disposal data and footprints, and a malfunction turn analysis. The Air Force's 45th Space Wing uses these products to ensure public and launch area safety. Due to the criticality of these data, an independent validation and verification effort was undertaken to ensure data quality and adherence to requirements. As a result, the product package was delivered with the confidence that independent organizations using separate simulation software generated data to meet the range requirements and yielded consistent results. This document captures Ares I-X final flight data package verification and validation analysis, including the methodology used to validate and verify simulation inputs, execution, and results, and presents lessons learned during the process.

  10. Independent verification and validation for Space Shuttle flight software

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The Committee for Review of Oversight Mechanisms for Space Shuttle Software was asked by the National Aeronautics and Space Administration's (NASA) Office of Space Flight to determine the need to continue independent verification and validation (IV&V) for Space Shuttle flight software. The Committee found that the current IV&V process is necessary to maintain NASA's stringent safety and quality requirements for man-rated vehicles. Therefore, the Committee does not support NASA's plan to eliminate funding for the IV&V effort in fiscal year 1993. The Committee believes that the Space Shuttle software development process is not adequate without IV&V and that elimination of IV&V as currently practiced will adversely affect the overall quality and safety of the software, both now and in the future. Furthermore, the Committee was told that no organization within NASA has the expertise or the manpower to replace the current IV&V function in a timely fashion, nor will building this expertise elsewhere necessarily reduce cost. Thus, the Committee does not recommend moving IV&V functions to other organizations within NASA unless the current IV&V is maintained for as long as it takes to build comparable expertise in the replacing organization.

  11. Independent Verification Final Summary Report for the David Witherspoon, Inc. 1630 Site Knoxville, Tennessee

    SciTech Connect

    P.C. Weaver

    2009-04-29

    The primary objective of the independent verification was to determine if BJC performed the appropriate actions to meet the specified “hot spot” cleanup criteria of 500 picocuries per gram (pCi/g) uranium-238 (U-238) in surface soil. Specific tasks performed by the independent verification team (IVT) to satisfy this objective included: 1) performing radiological walkover surveys, and 2) collecting soil samples for independent analyses. The independent verification (IV) efforts were designed to evaluate radioactive contaminants (specifically U-238) in the exposed surfaces below one foot of the original site grade, given that the top one foot layer of soil on the site was removed in its entirety.

  12. INDEPENDENT VERIFICATION OF THE BUILDING 3550 SLAB AT OAK RIDGE NATIONAL LABORATORY OAK RIDGE, TENNESSEE

    SciTech Connect

    Weaver, Phyllis C.

    2012-05-08

    The Oak Ridge Institute for Science and Education (ORISE) has completed the independent verification survey of the Building 3550 Slab. The results of this effort are provided. The objective of this verification survey is to provide independent review and field assessment of remediation actions conducted by Safety and Ecology Corporation (SEC) to document that the final radiological condition of the slab meets the release guidelines. Verification survey activities on the Building 3550 Slab included scans, measurements, and the collection of smears. Scans for alpha, alpha plus beta, and gamma activity identified several areas that were investigated.

  13. Hydrologic data-verification management program plan

    USGS Publications Warehouse

    Alexander, C.W.

    1982-01-01

    Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept, describing a master data-verification program that uses multiple special-purpose subroutines and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
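
The screen-file concept described above (a master verification program driven by per-parameter criteria) can be sketched as follows. The parameter names and plausible-range limits are invented for illustration and are not from the USGS plan:

```python
# hypothetical screen file: verification criteria (plausible ranges) per parameter
SCREEN = {
    "discharge_cfs": (0.0, 50000.0),
    "gage_height_ft": (-1.0, 40.0),
    "water_temp_c": (-0.5, 35.0),
}

def verify_record(record):
    """Flag every value that falls outside its screen-file range."""
    flags = []
    for param, value in record.items():
        lo, hi = SCREEN[param]
        if not (lo <= value <= hi):
            flags.append((param, value))
    return flags

# one retrieved field record; the gage height is implausibly high
rec = {"discharge_cfs": 1200.0, "gage_height_ft": 55.2, "water_temp_c": 12.1}
print(verify_record(rec))  # [('gage_height_ft', 55.2)]
```

Flagged values would be routed to manual review rather than stored directly in user-accessible files.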

  14. Space telescope observatory management system preliminary test and verification plan

    NASA Technical Reports Server (NTRS)

    Fritz, J. S.; Kaldenbach, C. F.; Williams, W. B.

    1982-01-01

    The preliminary plan for the Space Telescope Observatory Management System Test and Verification (TAV) is provided. Methodology, test scenarios, test plans and procedure formats, schedules, and the TAV organization are included. Supporting information is provided.

  15. 49 CFR Appendix D to Part 236 - Independent Review of Verification and Validation

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... D Appendix D to Part 236 Transportation Other Regulations Relating to Transportation (Continued..., AND APPLIANCES Pt. 236, App. D Appendix D to Part 236—Independent Review of Verification and..., in FRA's judgment, for FRA to monitor the assessment. (d) The reviewer shall evaluate the...

  16. 49 CFR Appendix D to Part 236 - Independent Review of Verification and Validation

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... D Appendix D to Part 236 Transportation Other Regulations Relating to Transportation (Continued..., AND APPLIANCES Pt. 236, App. D Appendix D to Part 236—Independent Review of Verification and..., in FRA's judgment, for FRA to monitor the assessment. (d) The reviewer shall evaluate the...

  17. INDEPENDENT VERIFICATION SURVEY REPORT FOR ZONE 1 OF THE EAST TENNESSEE TECHNOLOGY PARK IN OAK RIDGE, TENNESSEE

    SciTech Connect

    King, David A.

    2012-08-16

    Oak Ridge Associated Universities (ORAU) conducted in-process inspections and independent verification (IV) surveys in support of DOE's remedial efforts in Zone 1 of East Tennessee Technology Park (ETTP) in Oak Ridge, Tennessee. Inspections concluded that the remediation contractor's soil removal and survey objectives were satisfied and the dynamic verification strategy (DVS) was implemented as designed. Independent verification (IV) activities included gamma walkover surveys and soil sample collection/analysis over multiple exposure units (EUs).

  18. SU-E-T-49: A Multi-Institutional Study of Independent Dose Verification for IMRT

    SciTech Connect

    Baba, H; Tachibana, H; Kamima, T; Takahashi, R; Kawai, D; Sugawara, Y; Yamamoto, T; Sato, A; Yamashita, M

    2015-06-15

    Purpose: AAPM TG114 does not cover independent verification for IMRT. We conducted a study of independent dose verification for IMRT in seven institutes to show its feasibility. Methods: 384 IMRT plans for the prostate and head and neck (HN) sites were collected from the institutes, where planning was performed using Eclipse and Pinnacle3 with the two techniques of step and shoot (S&S) and sliding window (SW). All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based and uses CT images to compute the radiological path length. An ion-chamber measurement in a water-equivalent slab phantom was performed to compare the doses computed using the TPS and the independent dose verification program. Additionally, the agreement in dose computed in patient CT images between the TPS and the SMU was assessed. The dose of the composite beams in the plan was evaluated. Results: The agreement between the measurement and the SMU was −2.3±1.9% and −5.6±3.6% for the prostate and HN sites, respectively. The agreement between the TPSs and the SMU was −2.1±1.9% and −3.0±3.7% for the prostate and HN sites, respectively. There was a negative systematic difference with similar standard deviation, and the difference was larger for the HN site. The S&S technique showed a statistically significant difference from the SW technique, because the Clarkson-based method in the independent program cannot consider the dose under the MLC and therefore underestimates it. Conclusion: The accuracy would be improved if the Clarkson-based algorithm were modified for IMRT, and the tolerance level would then be within 5%.
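
The agreement figures quoted in such studies (mean ± 1 SD of the per-plan percent dose difference between the independent check and the TPS) can be computed as in this minimal sketch; the dose values below are invented for demonstration, not taken from the study:

```python
import statistics

def percent_diffs(independent, tps):
    # per-plan difference (independent - TPS) / TPS, in percent
    return [100.0 * (i - t) / t for i, t in zip(independent, tps)]

def agreement(independent, tps):
    # mean and sample standard deviation of the percent differences
    d = percent_diffs(independent, tps)
    return statistics.mean(d), statistics.stdev(d)

# hypothetical composite-beam doses (Gy): independent check vs. TPS
indep = [1.96, 1.94, 1.98, 1.95]
tps = [2.00, 2.00, 2.00, 2.00]
mean, sd = agreement(indep, tps)
print(f"{mean:+.1f} ± {sd:.1f} %")
```

A consistently negative mean, as in the abstract, indicates a systematic underestimate by the independent program rather than random noise.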

  19. INDEPENDENT VERIFICATION OF THE CENTRAL CAMPUS AND SOUTHEAST LABORATORY COMPLEX BUILDING SLABS AT OAK RIDGE NATIONAL LABORATORY, OAK RIDGE, TENNESSEE

    SciTech Connect

    Weaver, Phyllis C.

    2012-07-24

    Oak Ridge Associated Universities/Oak Ridge Institute for Science and Education (ORAU/ORISE) has completed the independent verification survey of the Central Campus and Southeast Lab Complex Building Slabs. The results of this effort are provided. The objective of this verification survey was to provide independent review and field assessment of remediation actions conducted by SEC, and to independently assess whether the final radiological condition of the slabs met the release guidelines.

  20. An independent verification and validation of the Future Theater Level Model conceptual model

    SciTech Connect

    Hartley, D.S. III; Kruse, K.L.; Martellaro, A.J.; Packard, S.L.; Thomas, B. Jr.; Turley, V.K.

    1994-08-01

    This report describes the methodology and results of independent verification and validation performed on a combat model in its design stage. The combat model is the Future Theater Level Model (FTLM), under development by The Joint Staff/J-8. J-8 has undertaken its development to provide an analysis tool that addresses the uncertainties of combat more directly than previous models and yields more rapid study results. The methodology adopted for this verification and validation consisted of document analyses. Included were detailed examination of the FTLM design documents (at all stages of development), the FTLM Mission Needs Statement, and selected documentation for other theater level combat models. These documents were compared to assess the FTLM as to its design stage, its purpose as an analytical combat model, and its capabilities as specified in the Mission Needs Statement. The conceptual design passed those tests. The recommendations included specific modifications as well as a recommendation for continued development. The methodology is significant because independent verification and validation have not been previously reported as being performed on a combat model in its design stage. The results are significant because The Joint Staff/J-8 will be using the recommendations from this study in determining whether to proceed with development of the model.

  1. Formal verification of a set of memory management units

    NASA Technical Reports Server (NTRS)

    Schubert, E. Thomas; Levitt, K.; Cohen, Gerald C.

    1992-01-01

    This document describes the verification of a set of memory management units (MMU). The verification effort demonstrates the use of hierarchical decomposition and abstract theories. The MMUs can be organized into a complexity hierarchy. Each new level in the hierarchy adds a few significant features or modifications to the lower level MMU. The units described include: (1) a page check translation look-aside module (TLM); (2) a page check TLM with supervisor line; (3) a base bounds MMU; (4) a virtual address translation MMU; and (5) a virtual address translation MMU with memory resident segment table.

  2. How NASA's Independent Verification and Validation (IV&V) Program Builds Reliability into a Space Mission Software System (SMSS)

    NASA Technical Reports Server (NTRS)

    Fisher, Marcus S.; Northey, Jeffrey; Stanton, William

    2014-01-01

    The purpose of this presentation is to outline how the NASA Independent Verification and Validation (IV&V) Program helps to build reliability into the Space Mission Software Systems (SMSSs) that its customers develop.

  3. Benchmark testing and independent verification of the VS2DT computer code

    NASA Astrophysics Data System (ADS)

    McCord, James T.; Goodrich, Michael T.

    1994-11-01

    The finite difference flow and transport simulator VS2DT was benchmark tested against several other codes which solve the same equations (Richards equation for flow and the Advection-Dispersion equation for transport). The benchmark problems investigated transient two-dimensional flow in a heterogeneous soil profile with a localized water source at the ground surface. The VS2DT code performed as well as or better than all other codes when considering mass balance characteristics and computational speed. It was also rated highly relative to the other codes with regard to ease-of-use. Following the benchmark study, the code was verified against two analytical solutions, one for two-dimensional flow and one for two-dimensional transport. These independent verifications show reasonable agreement with the analytical solutions, and complement the one-dimensional verification problems published in the code's original documentation.

  4. Benchmark testing and independent verification of the VS2DT computer code

    SciTech Connect

    McCord, J.T.; Goodrich, M.T.

    1994-11-01

    The finite difference flow and transport simulator VS2DT was benchmark tested against several other codes which solve the same equations (Richards equation for flow and the Advection-Dispersion equation for transport). The benchmark problems investigated transient two-dimensional flow in a heterogeneous soil profile with a localized water source at the ground surface. The VS2DT code performed as well as or better than all other codes when considering mass balance characteristics and computational speed. It was also rated highly relative to the other codes with regard to ease-of-use. Following the benchmark study, the code was verified against two analytical solutions, one for two-dimensional flow and one for two-dimensional transport. These independent verifications show reasonable agreement with the analytical solutions, and complement the one-dimensional verification problems published in the code's original documentation.

  5. A method for online verification of adapted fields using an independent dose monitor

    SciTech Connect

    Chang, Jina; Norrlinger, Bernhard D.; Heaton, Robert K.; Jaffray, David A.; Cho, Young-Bin; Islam, Mohammad K.; Mahon, Robert

    2013-07-15

    Purpose: Clinical implementation of online adaptive radiotherapy requires generation of modified fields and a method of dosimetric verification in a short time. We present a method of treatment field modification to account for patient setup error, and an online method of verification using an independent monitoring system.Methods: The fields are modified by translating each multileaf collimator (MLC) defined aperture in the direction of the patient setup error, and magnifying to account for distance variation to the marked isocentre. A modified version of a previously reported online beam monitoring system, the integral quality monitoring (IQM) system, was investigated for validation of adapted fields. The system consists of a large area ion-chamber with a spatial gradient in electrode separation to provide a spatially sensitive signal for each beam segment, mounted below the MLC, and a calculation algorithm to predict the signal. IMRT plans of ten prostate patients have been modified in response to six randomly chosen setup errors in three orthogonal directions.Results: A total of approximately 49 beams for the modified fields were verified by the IQM system, of which 97% of measured IQM signal agree with the predicted value to within 2%.Conclusions: The modified IQM system was found to be suitable for online verification of adapted treatment fields.

  6. Independent verification and validation testing of the FLASH computer code, Version 3.0

    SciTech Connect

    Martian, P.; Chung, J.N. (Dept. of Mechanical and Materials Engineering)

    1992-06-01

    Independent testing of the FLASH computer code, Version 3.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at various Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Verification tests and validation tests were used to determine the operational status of the FLASH computer code. These tests were specifically designed to test the correctness of the FORTRAN coding, computational accuracy, and suitability for simulating actual hydrologic conditions. This testing was performed using a structured evaluation protocol which consisted of blind testing, independent applications, and graduated difficulty of test cases. Both quantitative and qualitative testing was performed by evaluating relative root mean square values and graphical comparisons of the numerical, analytical, and experimental data. Four verification tests were used to check the computational accuracy and correctness of the FORTRAN coding, and three validation tests were used to check the suitability for simulating actual conditions. These test cases ranged in complexity from simple 1-D saturated flow to 2-D variably saturated problems. The verification tests showed excellent quantitative agreement between the FLASH results and analytical solutions. The validation tests showed good qualitative agreement with the experimental data. Based on the results of this testing, it was concluded that the FLASH code is a versatile and powerful two-dimensional analysis tool for fluid flow. In conclusion, all aspects of the code that were tested, except for the unit gradient bottom boundary condition, were found to be fully operational and ready for use in hydrological and environmental studies.
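
The relative root-mean-square evaluation mentioned above, comparing a numerical solution with an analytical one at matching nodes, can be sketched as follows. The pressure-head profiles are invented for illustration, not taken from the FLASH test cases:

```python
import math

def relative_rms(numerical, analytical):
    """Relative root-mean-square error of a numerical solution
    against an analytical solution, evaluated at matching nodes."""
    num = sum((n - a) ** 2 for n, a in zip(numerical, analytical))
    den = sum(a ** 2 for a in analytical)
    return math.sqrt(num / den)

# hypothetical pressure-head profiles at five matching nodes
analytical = [1.00, 0.80, 0.55, 0.30, 0.10]
numerical = [1.01, 0.79, 0.56, 0.29, 0.11]
err = relative_rms(numerical, analytical)
print(err < 0.02)  # True: a small relative RMS indicates good agreement
```

Graphical comparison would complement this single summary number, as the report describes for the qualitative validation tests.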

  7. Commitment at Work and Independence from Management.

    ERIC Educational Resources Information Center

    Belanger, Jacques; Edwards, Paul K.; Wright, Martyn

    2003-01-01

    Case study of a Canadian aluminum smelter through 15 interviews, observation, and employee survey (n=214) revealed high commitment, acceptance of change, and worker independence from management. This pattern emerged from a traditionally strong union presence. Comparison with other cases underlines the centrality of collective organization to…

  8. Cryogenic fluid management experiment trunnion fatigue verification

    NASA Technical Reports Server (NTRS)

    Bailey, W. J.; Fester, D. A.; Toth, J. M., Jr.; Kasper, H. J.

    1983-01-01

    A subcritical liquid hydrogen orbital storage and transfer experiment was designed for flight in the Shuttle cargo bay. The Cryogenic Fluid Management Experiment (CFME) includes a liquid hydrogen tank supported in a vacuum jacket by two fiberglass epoxy trunnion mounts. This composite material was selected for the trunnions since it provides desirable strength, weight and thermal characteristics for supporting cryogenic tankage. An experimental program was conducted to provide material property and fatigue data for S-glass epoxy composite materials at ambient and liquid hydrogen temperatures and to verify structural integrity of the CFME trunnion supports.

  9. SU-E-T-32: A Feasibility Study of Independent Dose Verification for IMAT

    SciTech Connect

    Kamima, T; Takahashi, R; Sato, Y; Baba, H; Tachibana, H; Yamashita, M; Sugawara, Y

    2015-06-15

    Purpose: To assess the feasibility of independent dose verification (Indp) for intensity modulated arc therapy (IMAT). Methods: An independent dose calculation software program (Simple MU Analysis, Triangle Products, JP) was used in this study, which computes the radiological path length from the surface to the reference point for each control point using the patient's CT image dataset, while the MLC aperture shape is modeled from the MLC information in the DICOM-RT plan. Dose calculation was performed using a modified Clarkson method considering MLC transmission and dosimetric leaf gap. In this study, IMAT plans from 120 patients at two sites (prostate / head and neck) from four institutes were retrospectively analyzed to compare the Indp to the TPS using patient CT images. In addition, an ion-chamber measurement was performed to verify the accuracy of the TPS and the Indp in a water-equivalent phantom. Results: The agreements between the Indp and the TPS (mean±1SD) were −0.8±2.4% and −1.3±3.8% for the prostate and head and neck regions, respectively. The measurement comparison showed similar results (−0.8±1.6% and 0.1±4.6% for prostate and head and neck). The variation was larger for the head and neck because more segments placed the reference point under the MLC, where the modified Clarkson method cannot consider the smooth falloff of the leaf penumbra. Conclusion: The independent verification program would be practical and effective as a secondary check for IMAT, with sufficient accuracy in both the measurement and the CT-based calculation. The accuracy would be improved further by considering the falloff of the leaf penumbra.

  10. An algorithm for independent verification of Gamma Knife{sup TM} treatment plans

    SciTech Connect

    Beck, James; Berndt, Anita

    2004-10-01

    A formalism for independent treatment verification has been developed for Gamma Knife{sup TM} radiosurgery in analogy to the second checks performed routinely in the field of external beam radiotherapy. A verification algorithm is presented and evaluated based on its agreement with treatment planning calculations for the first 40 Canadian Gamma Knife{sup TM} patients. The algorithm is used to calculate the irradiation time for each shot, and the value of the dose at the maximum dose point in each calculation matrix. Data entry consists of information included on the plan printout, and can be streamlined by using an optional plan import feature. Calculated shot times differed from those generated by the treatment planning software by an average of 0.3%, with a standard deviation of 1.4%. The agreement of dose maxima was comparable, with an average of -0.2% and a standard deviation of 1.3%. Consistently accurate comparisons were observed for centrally located lesions treated with a small number of shots. Large discrepancies were almost all associated with dose plans utilizing a large number of collimator plugs, for which the simplifying approximations used by the program are known to break down.

  11. Text-independent writer identification and verification using textural and allographic features.

    PubMed

    Bulacu, Marius; Schomaker, Lambert

    2007-04-01

    The identification of a person on the basis of scanned images of handwriting is a useful biometric modality with application in forensic and historic document analysis and constitutes an exemplary study area within the research field of behavioral biometrics. We developed new and very effective techniques for automatic writer identification and verification that use probability distribution functions (PDFs) extracted from the handwriting images to characterize writer individuality. A defining property of our methods is that they are designed to be independent of the textual content of the handwritten samples. Our methods operate at two levels of analysis: the texture level and the character-shape (allograph) level. At the texture level, we use contour-based joint directional PDFs that encode orientation and curvature information to give an intimate characterization of individual handwriting style. In our analysis at the allograph level, the writer is considered to be characterized by a stochastic pattern generator of ink-trace fragments, or graphemes. The PDF of these simple shapes in a given handwriting sample is characteristic for the writer and is computed using a common shape codebook obtained by grapheme clustering. Combining multiple features (directional, grapheme, and run-length PDFs) yields increased writer identification and verification performance. The proposed methods are applicable to free-style handwriting (both cursive and isolated) and have practical feasibility, under the assumption that a few text lines of handwritten material are available in order to obtain reliable probability estimates.
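
    The texture-level idea, an orientation PDF computed from handwriting contours and compared between samples, can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: real systems use joint orientation-curvature ("hinge") PDFs and many bins, and the contours here are synthetic strokes.

```python
import math

def direction_pdf(contour, bins=12):
    """Orientation PDF p(phi) from successive contour points,
    a texture-level feature in the spirit of contour-based
    directional PDFs for writer identification."""
    hist = [0.0] * bins
    for (x0, y0), (x1, y1) in zip(contour, contour[1:]):
        phi = math.atan2(y1 - y0, x1 - x0) % math.pi  # undirected, [0, pi)
        hist[min(int(phi / math.pi * bins), bins - 1)] += 1.0
    total = sum(hist) or 1.0
    return [h / total for h in hist]

def chi_square(p, q):
    """Chi-square distance between two PDFs (smaller = more similar)."""
    return sum((a - b) ** 2 / (a + b) for a, b in zip(p, q) if a + b > 0)

# Two synthetic ink traces: a horizontal stroke and a diagonal one.
horizontal = [(x, 0) for x in range(10)]
diagonal   = [(x, x) for x in range(10)]
d_same = chi_square(direction_pdf(horizontal), direction_pdf(horizontal))
d_diff = chi_square(direction_pdf(horizontal), direction_pdf(diagonal))
```

    A writer-identification system would rank all enrolled samples by such distances; combining several PDFs (directional, grapheme, run-length), as the abstract notes, improves the ranking.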

  12. Independent Verification and Validation of Complex User Interfaces: A Human Factors Approach

    NASA Technical Reports Server (NTRS)

    Whitmore, Mihriban; Berman, Andrea; Chmielewski, Cynthia

    1996-01-01

    The Usability Testing and Analysis Facility (UTAF) at the NASA Johnson Space Center has identified and evaluated a potential automated software interface inspection tool capable of assessing the degree to which space-related critical and high-risk software system user interfaces meet objective human factors standards across each NASA program and project. Testing consisted of two distinct phases. Phase 1 compared analysis times and similarity of results for the automated tool and for human-computer interface (HCI) experts. In Phase 2, HCI experts critiqued the prototype tool's user interface. Based on this evaluation, it appears that a more fully developed version of the tool will be a promising complement to a human factors-oriented independent verification and validation (IV&V) process.

  13. Independent verification of plutonium decontamination on Johnston Atoll (1992--1996)

    SciTech Connect

    Wilson-Nichols, M.J.; Wilson, J.E.; McDowell-Boyer, L.M.; Davidson, J.R.; Egidi, P.V.; Coleman, R.L.

    1998-05-01

    The Field Command, Defense Special Weapons Agency (FCDSWA) (formerly FCDNA) contracted Oak Ridge National Laboratory (ORNL) Environmental Technology Section (ETS) to conduct an independent verification (IV) of the Johnston Atoll (JA) Plutonium Decontamination Project by an interagency agreement with the US Department of Energy in 1992. The main island is contaminated with the transuranic elements plutonium and americium, and soil decontamination activities have been ongoing since 1984. FCDSWA has selected a remedy that employs a system of sorting contaminated particles from the coral/soil matrix, allowing uncontaminated soil to be reused. The objective of IV is to evaluate the effectiveness of remedial action. The IV contractor's task is to determine whether the remedial action contractor has effectively reduced contamination to levels within established criteria and whether the supporting documentation describing the remedial action is adequate. ORNL conducted four interrelated tasks from 1992 through 1996 to accomplish the IV mission. This document is a compilation and summary of those activities, in addition to a comprehensive review of the history of the project.

  14. US-VISIT Independent Verification and Validation Project: Test Bed Establishment Report

    SciTech Connect

    Jensen, N W; Gansemer, J D

    2011-01-21

    This document describes the computational and data systems available at the Lawrence Livermore National Laboratory for use on the US-VISIT Independent Verification and Validation (IV&V) project. This system - composed of data, software and hardware - is designed to be as close a representation of the operational ADIS system as is required to verify and validate US-VISIT methodologies; it is not required to reproduce the computational capabilities of the enterprise-class operational system. During FY10, the test bed was simplified from the FY09 version by reducing the number of database host computers from three to one, significantly reducing maintenance and overhead while increasing system throughput. During the current performance period, a database transfer was performed as a set of Data Pump Export files. The previous RMAN backup from 2007 required the availability of an AIX system, which is not needed when using Data Pump. Due to efficiencies in the new system and process, the database refresh was loaded in a much shorter time frame than previously required. The FY10 Oracle test bed now consists of a single Linux platform hosting two Oracle databases: the 2007 copy and the October 2010 refresh.

  15. Patient-specific IMRT verification using independent fluence-based dose calculation software: experimental benchmarking and initial clinical experience

    NASA Astrophysics Data System (ADS)

    Georg, Dietmar; Stock, Markus; Kroupa, Bernhard; Olofsson, Jörgen; Nyholm, Tufve; Ahnesjö, Anders; Karlsson, Mikael

    2007-08-01

    Experimental methods are commonly used for patient-specific intensity-modulated radiotherapy (IMRT) verification. The purpose of this study was to investigate the accuracy and performance of independent dose calculation software (denoted as 'MUV' (monitor unit verification)) for patient-specific quality assurance (QA). 52 patients receiving step-and-shoot IMRT were considered. IMRT plans were recalculated by the treatment planning system (TPS) in a dedicated QA phantom, in which an experimental 1D and 2D verification (0.3 cm³ ionization chamber; films) was performed. Additionally, an independent dose calculation was performed. The fluence-based algorithm of MUV accounts for collimator transmission, rounded leaf ends, tongue-and-groove effect, backscatter to the monitor chamber and scatter from the flattening filter. The dose calculation utilizes a pencil beam model based on a beam quality index. DICOM RT files from patient plans, exported from the TPS, were directly used as patient-specific input data in MUV. For composite IMRT plans, average deviations in the high dose region between ionization chamber measurements and point dose calculations performed with the TPS and MUV were 1.6 ± 1.2% and 0.5 ± 1.1% (1 S.D.). The dose deviations between MUV and TPS slightly depended on the distance from the isocentre position. For individual intensity-modulated beams (total 367), an average deviation of 1.1 ± 2.9% was determined between calculations performed with the TPS and with MUV, with maximum deviations up to 14%. However, absolute dose deviations were mostly less than 3 cGy. Based on the current results, we aim to apply a confidence limit of 3% (with respect to the prescribed dose) or 6 cGy for routine IMRT verification. For off-axis points at distances larger than 5 cm and for low dose regions, we consider 5% dose deviation or 10 cGy acceptable. The time needed for an independent calculation compares very favourably with the net time for an experimental approach.
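
    The proposed confidence limits translate directly into a pass/fail rule for routine verification. A minimal sketch of that rule, using the 3%/6 cGy figures from the abstract (the beam deviations below are invented for illustration):

```python
def passes_tolerance(dev_percent, dev_cgy, pct_limit=3.0, abs_limit_cgy=6.0):
    """A deviation passes if EITHER the relative or the absolute
    criterion is met (the 'or' rule keeps low-dose points from
    failing on percentage alone)."""
    return abs(dev_percent) <= pct_limit or abs(dev_cgy) <= abs_limit_cgy

# Illustrative point-dose deviations: (percent, cGy).
checks = [(1.6, 3.2), (4.0, 2.5), (7.0, 14.0)]
results = [passes_tolerance(p, a) for p, a in checks]
```

    A per-point variant would switch to the 5%/10 cGy limits for off-axis or low-dose points, as proposed above.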

  16. A Multitier System for the Verification, Visualization and Management of CHIMERA

    SciTech Connect

    Lingerfelt, Eric J; Messer, Bronson; Osborne, James A; Budiardja, R. D.; Mezzacappa, Anthony

    2011-01-01

    CHIMERA is a multi-dimensional radiation hydrodynamics code designed to study core-collapse supernovae. The code is made up of three essentially independent parts: a hydrodynamics module, a nuclear burning module, and a neutrino transport solver combined within an operator-split approach. Given CHIMERA's complexity and pace of ongoing development, a new support system, Bellerophon, has been designed and implemented to perform automated verification, visualization and management tasks while integrating with other workflow systems utilized by CHIMERA's development group. In order to achieve these goals, a multitier approach has been adopted. By integrating supercomputing platforms, visualization clusters, a dedicated web server and a client-side desktop application, this system attempts to provide an encapsulated, end-to-end solution to these needs.

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM: QUALITY AND MANAGEMENT PLAN FOR THE PILOT PERIOD (1995-2000)

    EPA Science Inventory

    Based upon the structure and specifications in ANSI/ASQC E4-1994, Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs, the Environmental Technology Verification (ETV) program Quality and Management Plan (QMP) f...

  18. Results of the independent radiological verification survey at the former Bridgeport Brass Company Facility, Seymour, Connecticut (SSC001)

    SciTech Connect

    Foley, R.D.; Rice, D.E.; Allred, J.F.; Brown, K.S.

    1995-03-01

    At the request of the USDOE, a team from ORNL conducted an independent radiological verification survey at the former Bridgeport Brass Company Facility, Seymour, Connecticut, from September 1992 to March 1993. The purpose of the survey was to determine whether residual levels of radioactivity inside the Ruffert Building and selected adjacent areas were remediated to levels below DOE guidelines for FUSRAP sites. The property was contaminated with radioactive residues of ²³⁸U from uranium processing experiments conducted by Reactive Metals, Inc., from 1962 to 1964 for the Atomic Energy Commission. A previous radiological survey did not characterize the entire floor space because equipment that could not be moved at the time made it inaccessible for radiological surveys. During the remediation process, additional areas of elevated radioactivity were discovered under stationary equipment, which required additional remediation and further verification. Results of the independent radiological verification survey confirm that, with the exception of the drain system inside the building, residual uranium contamination has been remediated to levels below DOE guidelines for unrestricted release of property at FUSRAP sites, both inside and outside the Ruffert Building. However, certain sections of the drain system retain uranium contamination above DOE surface guideline levels. These sections of pipe are addressed in separate, referenced documentation.

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION: A VEHICLE FOR INDEPENDENT, CREDIBLE PERFORMANCE RESULTS ON COMMERCIALLY READY TECHNOLOGIES

    EPA Science Inventory

    The paper discusses the U. S. Environmental Protection Agency's Environmental Technology Verification (ETV) Program: its history, operations, past successes, and future plans. Begun in 1995 in response to President Clinton's "Bridge to a Sustainable Future" as a means to work wit...

  20. VERIFICATION TESTING OF AIR POLLUTION CONTROL TECHNOLOGY QUALITY MANAGEMENT PLAN

    EPA Science Inventory

    This document is the basis for quality assurance for the Air Pollution Control Technology Verification Center (APCT Center) operated under the U.S. Environmental Protection Agency (EPA). It describes the policies, organizational structure, responsibilities, procedures, and qualit...

  1. SU-E-T-48: A Multi-Institutional Study of Independent Dose Verification for Conventional, SRS and SBRT

    SciTech Connect

    Takahashi, R; Kamima, T; Tachibana, H; Baba, H; Itano, M; Yamazaki, T; Ishibashi, S; Higuchi, Y; Shimizu, H; Yamamoto, T; Yamashita, M; Sugawara, Y; Sato, A; Nishiyama, S; Kawai, D; Miyaoka, S

    2015-06-15

    Purpose: To present the results of a multi-institutional study of independent dose verification for conventional, stereotactic radiosurgery (SRS), and stereotactic body radiotherapy (SBRT) plans, based on the action levels of AAPM TG-114. Methods: This study was performed at 12 institutions in Japan. To eliminate bias in the independent dose verification program (Indp), all institutions used the same CT-based independent dose verification software (Simple MU Analysis, Triangle Products, JP) with a Clarkson-based algorithm. Eclipse (AAA, PBC), Pinnacle³ (Adaptive Convolve) and Xio (Superposition) were used as treatment planning systems (TPS). Confidence limits (CL, mean±2SD) of the dose difference between the TPS and the Indp were evaluated for 18 sites (head, breast, lung, pelvis, etc.). Results: A retrospective analysis of 6352 treatment fields was conducted. The CLs for conventional, SRS and SBRT plans were 1.0±3.7%, 2.0±2.5% and 6.2±4.4%, respectively. In conventional plans, most sites were within the 5% TG-114 action level, but systematic differences were observed for breast (4.0±4.0%) and lung (2.5±5.8%). In SRS plans, the results showed good agreement with the action level. In SBRT plans, the discrepancy between the TPS and the Indp varied with the dose calculation algorithm of the TPS. Conclusion: The dose calculation algorithms of both the TPS and the Indp affect the action level. It is effective to set site-specific tolerances, especially for sites where the inhomogeneity correction strongly affects the dose distribution.
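
    The site-specific confidence limits (CL = mean ± 2 SD of the TPS-vs-Indp percent dose difference) used throughout this study can be computed with a short routine; the records below are synthetic, not study data:

```python
from collections import defaultdict
import statistics

def confidence_limits(records):
    """Per-site confidence limit CL = (mean, 2*SD) of the
    TPS-vs-independent percent dose difference (TG-114 style)."""
    by_site = defaultdict(list)
    for site, dev in records:
        by_site[site].append(dev)
    return {site: (statistics.mean(d), 2 * statistics.stdev(d))
            for site, d in by_site.items() if len(d) > 1}

# Synthetic percent differences for two sites.
records = [("pelvis", 0.5), ("pelvis", 1.5), ("pelvis", 1.0),
           ("lung", 2.0), ("lung", 6.0), ("lung", 4.0)]
cl = confidence_limits(records)
```

    The wider 2 SD term for the synthetic lung data mirrors the study's observation that sites with strong inhomogeneity effects need looser, site-specific tolerances.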

  2. Results of the independent radiological verification survey at 79 Avenue E, Lodi, New Jersey (LJ091V)

    SciTech Connect

    Rodriguez, R.E.; Uziel, M.S.

    1996-09-01

    Prior to remediation, thorium residues in excess of applicable DOE guidelines were found in the eastern corner of the backyard of the property at 79 Avenue B, Lodi, New Jersey. Decontamination, which consisted of excavation and removal of contaminated soil, was performed by subcontractors under the direction of Bechtel National, Inc. The independent radiological verification survey described in this report was performed by the Measurement Applications and Development Group at Oak Ridge National Laboratory to verify that the final remedial action had reduced contamination levels to within authorized limits. The property was thoroughly investigated outdoors for radionuclide residues. Surface gamma exposure rates were below guideline levels and comparable to background levels in the area. All soil concentration measurements were below the limits prescribed by applicable DOE guidelines for protection against radiation. Analysis of the data contained in the post-remedial action report and the results of this independent radiological verification survey by ORNL confirm that all radiological measurements fall below the limits prescribed by the DOE guidelines established for this site. The property at 79 Avenue B successfully meets the DOE remedial action objectives.

  3. Final Report Independent Verification Survey of the High Flux Beam Reactor, Building 802 Fan House Brookhaven National Laboratory Upton, New York

    SciTech Connect

    Harpeneau, Evan M.

    2011-06-24

    On May 9, 2011, ORISE conducted verification survey activities including scans, sampling, and the collection of smears of the remaining soils and off-gas pipe associated with the 802 Fan House within the HFBR (High Flux Beam Reactor) Complex at BNL. Based on independent scan and sample results obtained during verification activities at the HFBR 802 Fan House, ORISE is of the opinion that the final status survey (FSS) unit meets the applicable site cleanup objectives established for as-left radiological conditions.

  4. An independent system for real-time dynamic multileaf collimation trajectory verification using EPID

    NASA Astrophysics Data System (ADS)

    Fuangrod, Todsaporn; Woodruff, Henry C.; Rowshanfarzad, Pejman; O'Connor, Daryl J.; Middleton, Richard H.; Greer, Peter B.

    2014-01-01

    A new tool has been developed to verify the trajectory of dynamic multileaf collimators (MLCs) used in advanced radiotherapy techniques, using only image frames measured by the electronic portal imaging device (EPID). The prescribed leaf positions are resampled to a higher resolution in a pre-processing stage to improve the verification precision. Measured MLC positions are extracted from the EPID frames using a template-matching method. A cosine similarity metric is then applied to synchronise measured and planned leaf positions for comparison; three additional comparison functions were incorporated to ensure robust synchronisation. MLC leaf trajectory error detection was simulated for both intensity-modulated radiation therapy (IMRT) (prostate) and volumetric modulated arc therapy (VMAT) (head-and-neck) deliveries with anthropomorphic phantoms in the beam. The overall accuracy of MLC positions automatically extracted from EPID image frames was approximately 0.5 mm. The MLC leaf trajectory verification system can detect leaf position errors during IMRT and VMAT with a tolerance of 3.5 mm within 1 s.
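
    The synchronisation step, matching a measured EPID frame to a planned control point by cosine similarity of the leaf-position vectors, can be sketched as below. The three-leaf positions are invented examples; the published system also resamples the planned trajectory and adds further comparison functions, because cosine similarity alone can be ambiguous between similar apertures:

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two leaf-position vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def best_matching_control_point(measured, planned_trajectory):
    """Index of the planned control point whose leaf-position vector
    is most similar to the measured EPID frame."""
    return max(range(len(planned_trajectory)),
               key=lambda i: cosine_similarity(measured, planned_trajectory[i]))

planned = [[10.0, 12.0, 14.0],   # control point 0 (leaf positions, mm)
           [20.0, 22.0, 24.0],   # control point 1
           [30.0, 32.0, 34.0]]   # control point 2
measured = [20.3, 21.8, 24.1]    # positions extracted from one EPID frame
idx = best_matching_control_point(measured, planned)
```

    Once a frame is synchronised to control point `idx`, the per-leaf differences against the planned positions give the trajectory error to be checked against the tolerance.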

  5. SU-E-T-505: CT-Based Independent Dose Verification for RapidArc Plan as a Secondary Check

    SciTech Connect

    Tachibana, H; Baba, H; Kamima, T; Takahashi, R

    2014-06-01

    Purpose: To design and develop a CT-based independent dose verification for RapidArc plans, and to show the effectiveness of inhomogeneity correction in the secondary check. Methods: To independently compute the radiological path from the body surface to the reference point and the equivalent field sizes from the multiple MLC aperture shapes in the RapidArc MLC sequences, DICOM files of the CT images, structures and RapidArc plan were imported into our in-house software. The radiological path was computed using a three-dimensional CT array for each segment. The multiple MLC aperture shapes were used to compute the tissue maximum ratio and phantom scatter factor with the Clarkson method. In this study, two RapidArc plans for oropharynx cancer were used to compare the doses from a CT-based calculation and from a water-equivalent phantom calculation (using the contoured body structure) to the dose from the treatment planning system (TPS). Results: For one plan, both calculations agreed well with the TPS (within 1%). In the other case, the CT-based calculation agreed better than the water-equivalent phantom calculation (CT-based: -2.8% vs. water-based: -3.8%), because multiple structures lay along the beam paths, so the radiological path lengths in the CT-based and water-homogeneous phantom calculations differed considerably. Conclusion: RapidArc treatments are performed at many sites (head, chest, abdomen, pelvis), which include inhomogeneous media. Therefore, the more reliable CT-based calculation may be used as a secondary check for independent verification.
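
    The radiological path computation described above, accumulating relative density along the ray through the CT array, can be sketched with a simple fixed-step ray march. The voxel values and geometry are toy numbers, and a production implementation would use proper voxel traversal and interpolation rather than nearest-neighbour sampling:

```python
def radiological_path(density, start, direction, step=1.0, n_steps=None):
    """Water-equivalent depth: integrate relative density along the
    ray from `start` (voxel coordinates) in `direction` (unit vector),
    nearest-neighbour sampling for simplicity."""
    nz, ny, nx = len(density), len(density[0]), len(density[0][0])
    if n_steps is None:
        n_steps = max(nx, ny, nz)
    x, y, z = start
    dx, dy, dz = direction
    wed = 0.0
    for _ in range(n_steps):
        i, j, k = int(round(z)), int(round(y)), int(round(x))
        if not (0 <= i < nz and 0 <= j < ny and 0 <= k < nx):
            break  # ray has left the CT volume
        wed += density[i][j][k] * step
        x, y, z = x + dx * step, y + dy * step, z + dz * step
    return wed

# Toy 1 x 1 x 10 "CT": 4 voxels of lung (0.3) then 6 of water (1.0).
ct = [[[0.3] * 4 + [1.0] * 6]]
wed = radiological_path(ct, start=(0.0, 0.0, 0.0), direction=(1.0, 0.0, 0.0))
```

    A water-equivalent phantom calculation would replace every voxel inside the body contour with density 1.0, which is exactly why the two calculations diverge when low-density structures lie along the beam path.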

  6. Inside Sweden's Independent Public Schools: Innovations in Management.

    ERIC Educational Resources Information Center

    Raham, Helen

    2003-01-01

    Profiles three Swedish tuition-free, independent public schools. Independent schools were formed after the Swedish government enacted school choice legislation in 1992 resulting in the replacement of private schools with a system of tuition-free, self-managed public schools. These schools (now over 800) provide parents with alternatives to…

  7. SU-E-T-50: A Multi-Institutional Study of Independent Dose Verification Software Program for Lung SBRT

    SciTech Connect

    Kawai, D; Takahashi, R; Kamima, T; Baba, H; Yamamoto, T; Kubo, Y; Ishibashi, S; Higuchi, Y; Takahashi, H; Tachibana, H

    2015-06-15

    Purpose: The accuracy of the dose distribution depends on the treatment planning system, especially in heterogeneous regions, so the tolerance level (TL) of a secondary check using independent dose verification may vary for lung SBRT plans. We conducted a multi-institutional study to evaluate the tolerance level of lung SBRT plans following AAPM TG-114. Methods: Five institutes in Japan participated in this study. All institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based and uses CT images to compute the radiological path length. The Analytical Anisotropic Algorithm (AAA), Pencil Beam Convolution with the modified Batho method (PBC-B) and Adaptive Convolve (AC) were used for lung SBRT planning. A measurement using an ion chamber was performed in a heterogeneous phantom to compare doses from the three algorithms and the SMU to the measured dose. In addition, a retrospective analysis of clinical lung SBRT plans (547 beams from 77 patients) was conducted to evaluate the confidence limit (CL, average±2SD) of the dose difference between the three algorithms and the SMU. Results: Compared to the measurement, the AAA showed a larger systematic dose error (2.9±3.2%) than PBC-B and AC, and the Clarkson-based SMU showed a still larger error of 5.8±3.8%. The CLs for the clinical plans were 7.7±6.0% (AAA), 5.3±3.3% (AC) and 5.7±3.4% (PBC-B). Conclusion: The TLs were evaluated from the CLs. A Clarkson-based system shows a large systematic variation because of its inhomogeneity correction, and the AAA also showed significant variation. Thus, the difference in inhomogeneity correction as well as the dependence on the dose calculation engine must be considered.

  8. OPC verification and hotspot management for yield enhancement through layout analysis

    NASA Astrophysics Data System (ADS)

    Yoo, Gyun; Kim, Jungchan; Lee, Taehyeong; Jung, Areum; Yang, Hyunjo; Yim, Donggyu; Park, Sungki; Maruyama, Kotaro; Yamamoto, Masahiro; Vikram, Abhishek; Park, Sangho

    2011-03-01

    As design rules shrink, various techniques such as RET and DFM have been continuously developed and applied in the lithography field, and we have worked not only to obtain sufficient process window with those techniques but also to feed back hot spots to the OPC process for yield improvement in mass production. The OPC verification procedure, which iterates from OPC to wafer verification until the CD targets are met and hot spots are cleared, is becoming more important to ensure robust and accurate patterning and tight hotspot management. Generally, wafer verification results, which demonstrate how well OPC corrections are made, need to be fed back to the OPC engineers in an effective and accurate way. First, however, it is not possible to cover all transistors in a full chip with the OPC monitoring points that have traditionally been used for wafer verification. Second, the hot spots extracted by an OPC simulator are not always reliable enough to represent defect information for the full chip. Finally, doing this with CD-SEM measurement takes considerable turnaround time (TAT) and labor. These difficulties in wafer verification can be mitigated by design-based analysis. Optimal OPC monitoring points are created by classifying all transistors in the full-chip layout, and the hotspot set is selected by a pattern-matching process using NanoScope™, a fast design-based analysis tool, seeded with a small number of hotspots extracted by the OPC simulator from the full-chip layout. Each set is then used for wafer verification with the design-based inspection tool NGR2150™. In this paper, a new verification methodology based on design-based analysis is introduced as an alternative method for effective control of OPC accuracy and hotspot management.
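
    The pattern-classification step, grouping layout clips with identical geometry into classes so that one representative per class needs wafer verification, can be sketched with a translation-invariant signature. This is a deliberately simplified illustration (rectangles as (x, y, w, h) tuples), not the NanoScope or NGR2150 algorithm:

```python
from collections import defaultdict

def pattern_signature(clip):
    """Canonical signature for a layout clip (a list of rectangles),
    translation-invariant so identical patterns anywhere on the chip
    fall into one class."""
    x0 = min(r[0] for r in clip)
    y0 = min(r[1] for r in clip)
    return tuple(sorted((x - x0, y - y0, w, h) for x, y, w, h in clip))

def classify(clips):
    """Group named clips by signature; one member per class would be
    chosen as the wafer-verification representative."""
    classes = defaultdict(list)
    for name, clip in clips:
        classes[pattern_signature(clip)].append(name)
    return classes

clips = [("siteA", [(0, 0, 10, 2), (0, 4, 10, 2)]),
         ("siteB", [(100, 50, 10, 2), (100, 54, 10, 2)]),  # same pattern, shifted
         ("siteC", [(0, 0, 10, 2), (0, 6, 10, 2)])]        # wider gap: new class
classes = classify(clips)
n_classes = len(classes)
```

    A production tool would also normalize rotation/mirroring and match against simulated hotspot clips, but the class-then-sample structure is the same.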

  9. 49 CFR 236.1017 - Independent third party Verification and Validation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... that consideration of the methodology used in the risk assessment (§ 236.913(g)(2)(vii)) shall apply only to the extent that a comparative risk assessment was required. To the extent practicable, FRA... Validation. (a) The PTCSP must be supported by an independent third-party assessment when the...

  10. 49 CFR 236.1017 - Independent third party verification and validation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... that consideration of the methodology used in the risk assessment (§ 236.913(g)(2)(vii)) shall apply only to the extent that a comparative risk assessment was required. To the extent practicable, FRA... validation. (a) The PTCSP must be supported by an independent third-party assessment when the...

  11. 49 CFR 236.1017 - Independent third party Verification and Validation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... that consideration of the methodology used in the risk assessment (§ 236.913(g)(2)(vii)) shall apply only to the extent that a comparative risk assessment was required. To the extent practicable, FRA... Validation. (a) The PTCSP must be supported by an independent third-party assessment when the...

  12. 49 CFR 236.1017 - Independent third party Verification and Validation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... that consideration of the methodology used in the risk assessment (§ 236.913(g)(2)(vii)) shall apply only to the extent that a comparative risk assessment was required. To the extent practicable, FRA... Validation. (a) The PTCSP must be supported by an independent third-party assessment when the...

  13. 49 CFR 236.1017 - Independent third party Verification and Validation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... that consideration of the methodology used in the risk assessment (§ 236.913(g)(2)(vii)) shall apply only to the extent that a comparative risk assessment was required. To the extent practicable, FRA... Validation. (a) The PTCSP must be supported by an independent third-party assessment when the...

  14. 49 CFR Appendix D to Part 236 - Independent Review of Verification and Validation

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... assessment is to provide an independent evaluation of the product manufacturer's utilization of safety design... design and development of the product. At a minimum, the reviewer shall compare the supplier processes... the product design which considers the safety elements listed in paragraph (b) of appendix C to...

  15. 49 CFR Appendix D to Part 236 - Independent Review of Verification and Validation

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... assessment is to provide an independent evaluation of the product manufacturer's utilization of safety design... design and development of the product. At a minimum, the reviewer shall compare the supplier processes... the product design which considers the safety elements listed in paragraph (b) of appendix C to...

  16. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: STORMWATER MANAGEMENT STORMFILTER® TREATMENT SYSTEM USING PERLITE MEDIA

    EPA Science Inventory

    Verification testing of the Stormwater Management, Inc. StormFilter® Using Perlite Filter Media was conducted on a 0.7 acre drainage basin near downtown Griffin, Georgia. The system consists of an inlet bay, flow spreader, cartridge bay, overflow baffle, and outlet bay, housed in...

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: STORMWATER SOURCE AREA TREATMENT DEVICE: STORMWATER MANAGEMENT INC., STORMSCREEN� TREATMENT SYSTEM

    EPA Science Inventory

    Verification Testing of the Stormwater Management, Inc. StormScreen treatment technology was performed during a 12-month period starting in May 2003. The system was previously installed in a city-owned right-of-way near downtown Griffin, GA, and is a device for removing trash,...

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: STORMWATER SOURCE AREA TREATMENT DEVICE - STORMWATER MANAGEMENT INC., CATCH BASIN STORMFILTER®

    EPA Science Inventory

    Verification testing of the Stormwater Management CatchBasin StormFilter® (CBSF) was conducted on a 0.16 acre drainage basin at the City of St. Clair Shores, Michigan Department of Public Works facility. The four-cartridge CBSF consists of a storm grate and filter chamber inlet b...

  19. An Independent Verification and Validation of the Future Theater Level Model Conceptual Model

    DTIC Science & Technology

    1994-08-01

    FLOT: Forward Line of Troops; FTLM: Future Theater Level Model; GUI: Graphical User Interface; I/O: Input/Output; IV&V: Independent Verification and Validation. The four areas concerned with input and output (I/O) have been grouped together; an accompanying figure shows the Analytic Structure with the I/O group, the operational group, and the logistics area.

  20. Multifractal analysis of managed and independent float exchange rates

    NASA Astrophysics Data System (ADS)

    Stošić, Darko; Stošić, Dusan; Stošić, Tatijana; Stanley, H. Eugene

    2015-06-01

    We investigate multifractal properties of daily price changes in currency rates using multifractal detrended fluctuation analysis (MF-DFA). We analyze managed and independent floating currency rates in eight countries, and determine the changes in the multifractal spectrum when transitioning between the two regimes. We find that after the transition from a managed to an independent float regime, the changes in the multifractal spectrum (position of maximum and width) indicate an increase in market efficiency. The observed changes are more pronounced for developed countries that have a well-established trading market. After shuffling the series, we find that the multifractality is due to both a broad probability density function and long-term correlations for the managed float regime, while for the independent float regime it is in most cases caused by the broad probability density function.
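
    The core of MF-DFA at a single moment order (q = 2, i.e. ordinary DFA) can be sketched as follows; uncorrelated noise should yield a Hurst-type exponent near 0.5, while long-term correlated series deviate from it. This is a minimal sketch of the fluctuation function only, not the full MF-DFA of the study, which also varies q and measures the spectrum's position and width:

```python
import numpy as np

def dfa_fluctuation(x, scale):
    """DFA fluctuation F(s) at one scale: integrate the series,
    detrend each window with a linear fit, RMS of the residuals.
    MF-DFA generalizes this by raising window variances to q/2."""
    profile = np.cumsum(x - np.mean(x))
    n = len(profile) // scale
    t = np.arange(scale)
    var = []
    for k in range(n):
        seg = profile[k * scale:(k + 1) * scale]
        coef = np.polyfit(t, seg, 1)               # local linear trend
        var.append(np.mean((seg - np.polyval(coef, t)) ** 2))
    return np.sqrt(np.mean(var))

rng = np.random.default_rng(0)
noise = rng.standard_normal(4096)                  # uncorrelated surrogate
scales = [16, 32, 64, 128]
fs = [dfa_fluctuation(noise, s) for s in scales]
# Hurst-type exponent from the log-log slope; ~0.5 for white noise.
h = np.polyfit(np.log(scales), np.log(fs), 1)[0]
```

    Shuffling a real return series, as done in the study, destroys the long-term correlations while preserving the distribution, so comparing h (or the full spectrum) before and after shuffling separates the two sources of multifractality.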

  1. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 & Vol 2

    SciTech Connect

    PARSONS, J.E.

    2000-07-15

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of "Do work safely." The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented.

  2. Verification of a Quality Management Theory: Using a Delphi Study

    PubMed Central

    Mosadeghrad, Ali Mohammad

    2013-01-01

    Background: A model of quality management called Strategic Collaborative Quality Management (SCQM) model was developed based on the quality management literature review, the findings of a survey on quality management assessment in healthcare organisations, semi-structured interviews with healthcare stakeholders, and a Delphi study on healthcare quality management experts. The purpose of this study was to verify the SCQM model. Methods: The proposed model was further developed using feedback from thirty quality management experts using a Delphi method. Further, a guidebook for its implementation was prepared including a road map and performance measurement. Results: The research led to the development of a context-specific model of quality management for healthcare organisations and a series of guidelines for its implementation. Conclusion: A proper model of quality management should be developed and implemented properly in healthcare organisations to achieve business excellence. PMID:24596883

  3. A simple method of independent treatment time verification in gamma knife radiosurgery using integral dose

    SciTech Connect

    Jin Jianyue; Drzymala, Robert; Li Zuofeng

    2004-12-01

    The purpose of this study is to develop a simple independent dose calculation method to verify treatment plans for Leksell Gamma Knife radiosurgery. Our approach uses the total integral dose within the skull as an end point for comparison. The total integral dose is computed using a spreadsheet and is compared to that obtained from Leksell GammaPlan®. It is calculated as the sum of the integral doses of 201 beams, each passing through a cylindrical volume. The average length of the cylinders is estimated from the Skull-Scaler measurement data taken before treatment. Correction factors are applied to the length of the cylinder depending on the location of a shot in the skull. The radius of the cylinder corresponds to the collimator aperture of the helmet, with a correction factor for the beam penumbra and scattering. We have tested our simple spreadsheet program using treatment plans of 40 patients treated with Gamma Knife® in our center. These patients differ in geometry, size, lesion locations, collimator helmet, and treatment complexities. Results show that differences between our calculations and treatment planning results are typically within ±3%, with a maximum difference of ±3.8%. We demonstrate that our spreadsheet program is a convenient and effective independent method to verify treatment planning irradiation times prior to implementation of Gamma Knife radiosurgery.
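
    The cylindrical beam model described above can be sketched in a few lines: the total integral dose is approximated as the sum over beams of mean dose times cylinder volume (pi * r^2 * L). The beam data below are purely illustrative, not values from the paper; the per-beam corrections for shot location, penumbra, and scatter are assumed to be folded into the effective radius and length.

```python
import math

def integral_dose(beams):
    """Total integral dose as the sum over beams of dose x cylinder
    volume (pi * r^2 * L), following the cylindrical beam model.
    Each beam is a dict with mean dose (Gy), effective radius (cm),
    and path length through the skull (cm)."""
    return sum(b["dose"] * math.pi * b["radius"] ** 2 * b["length"]
               for b in beams)

# Toy example: three identical beams, 4 mm collimator (0.2 cm effective
# radius), 15 cm average path length, 1 Gy mean dose each.
beams = [{"dose": 1.0, "radius": 0.2, "length": 15.0}] * 3
print(round(integral_dose(beams), 4))  # Gy*cm^3; a real plan sums 201 beams
```

A spreadsheet implements the same arithmetic with one row per beam, which is why the check is fast enough to run before every treatment.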

  4. Independent verification survey report for exposure units Z2-24, Z2-31, Z2-32, AND Z2-36 in zone 2 of the East Tennessee technology park Oak Ridge, Tennessee

    SciTech Connect

    King, David A.

    2013-10-01

    The U.S. Department of Energy (DOE) Oak Ridge Office of Environmental Management selected Oak Ridge Associated Universities (ORAU), through the Oak Ridge Institute for Science and Education (ORISE) contract, to perform independent verification (IV) at Zone 2 of the East Tennessee Technology Park (ETTP) in Oak Ridge, Tennessee. ORAU has concluded IV surveys, per the project-specific plan (PSP) (ORAU 2013a) covering exposure units (EUs) Z2-24, -31, -32, and -36. The objective of this effort was to verify the target EUs comply with requirements in the Zone 2 Record of Decision (ROD) (DOE 2005), as implemented by using the dynamic verification strategy presented in the dynamic work plan (DWP) (BJC 2007); and confirm commitments in the DWP were adequately implemented, as verified via IV surveys and soil sampling.

  5. Independent verification of the values of electroelastic constants of alpha quartz

    NASA Astrophysics Data System (ADS)

    Hruska, Carl K.

    1987-02-01

    The component values of the cumulative electroelastic tensor of quartz found by C. K. Hruska (Proceedings of the 31st Annual Frequency Control Symposium, June 1977, pp. 159-170; available from NTIS, Springfield, VA, as document AD A088221) by means of the resonator method are independently confirmed by G. A. Reider, E. Kittinger, and J. Tichy [J. Appl. Phys. 53, 8716 (1982)] using a time-delay experiment. The disagreement of several orders of magnitude with other authors who also used the resonator method is explained and resolved. There can be no doubt that the magnitude of the electroelastic tensor components lies in the range of 10⁻¹ to 1 N/(V m) and that the standard errors of their determined values can be as small as a few percent.

  6. Independent Verification and Validation Of SAPHIRE 8 Software Design and Interface Design Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE software design and interface design is to assess the activities that result in the development, documentation, and review of a software design that meets the requirements defined in the software requirements documentation. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production. IV&V reviewed the requirements specified in the NRC Form 189s to verify that these requirements were included in SAPHIRE's Software Verification and Validation Plan (SVVP) design specification.

  7. Imaging for dismantlement verification: information management and analysis algorithms

    SciTech Connect

    Seifert, Allen; Miller, Erin A.; Myjak, Mitchell J.; Robinson, Sean M.; Jarman, Kenneth D.; Misner, Alex C.; Pitts, W. Karl; Woodring, Mitchell L.

    2010-09-01

    The level of detail discernible in imaging techniques has generally excluded them from consideration as verification tools in inspection regimes. An image will almost certainly contain highly sensitive information, and storing a comparison image will almost certainly violate a cardinal principle of information barriers: that no sensitive information be stored in the system. To overcome this problem, some features of the image might be reduced to a few parameters suitable for definition as an attribute. However, this process must be performed with care. Computing the perimeter, area, and intensity of an object, for example, might reveal sensitive information relating to shape, size, and material composition. This paper presents three analysis algorithms that reduce full image information to non-sensitive feature information. Ultimately, the algorithms are intended to provide only a yes/no response verifying the presence of features in the image. We evaluate the algorithms on both their technical performance in image analysis, and their application with and without an explicitly constructed information barrier. The underlying images can be highly detailed, since they are dynamically generated behind the information barrier. We consider the use of active (conventional) radiography alone and in tandem with passive (auto) radiography.
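
    The attribute approach described above can be illustrated with a toy sketch: reduce a binary "image" to a single yes/no answer (here, whether the object's pixel area falls within an expected band), so that no sensitive image data leaves the information barrier. The image, the area attribute, and the acceptance band are all hypothetical; the paper's actual algorithms operate on dynamically generated radiographs.

```python
# Toy binary "image": 1 marks object pixels, 0 background.
image = [
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
]

# Reduce the image to one non-sensitive scalar attribute: object area.
area = sum(sum(row) for row in image)  # pixel count of the object

# Only the yes/no result crosses the information barrier, never the image.
expected_min, expected_max = 4, 8  # hypothetical acceptance band
print("yes" if expected_min <= area <= expected_max else "no")
```

Note that even this simple attribute must be chosen with care: as the abstract warns, area combined with perimeter and intensity can already leak shape and material information.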

  8. DOE handbook: Integrated safety management systems (ISMS) verification team leader's handbook

    SciTech Connect

    1999-06-01

    The primary purpose of this handbook is to provide guidance to the ISMS verification Team Leader and the verification team in conducting ISMS verifications. The handbook describes methods and approaches for the review of the ISMS documentation (Phase I) and ISMS implementation (Phase II) and provides information useful to the Team Leader in preparing the review plan, selecting and training the team, coordinating the conduct of the verification, and documenting the results. The process and techniques described are based on the results of several pilot ISMS verifications that have been conducted across the DOE complex. A secondary purpose of this handbook is to provide information useful in developing DOE personnel to conduct these reviews. Specifically, this handbook describes methods and approaches to: (1) Develop the scope of the Phase 1 and Phase 2 review processes to be consistent with the history, hazards, and complexity of the site, facility, or activity; (2) Develop procedures for the conduct of the Phase 1 review, validating that the ISMS documentation satisfies the DEAR clause as amplified in DOE Policies 450.4, 450.5, 450.6 and associated guidance and that DOE can effectively execute responsibilities as described in the Functions, Responsibilities, and Authorities Manual (FRAM); (3) Develop procedures for the conduct of the Phase 2 review, validating that the description approved by the Approval Authority, following or concurrent with the Phase 1 review, has been implemented; and (4) Describe a methodology by which the DOE ISMS verification teams will be advised, trained, and/or mentored to conduct subsequent ISMS verifications. The handbook provides proven methods and approaches for verifying that commitments related to the DEAR, the FRAM, and associated amplifying guidance are in place and implemented in nuclear and high risk facilities. This handbook also contains useful guidance to line managers when preparing for a review of ISMS for radiological

  9. Independent Verification Survey of the Clean Coral Storage Pile at the Johnston Atoll Plutonium Contaminated Soil Remediation Project

    SciTech Connect

    Wilson-Nichols, M.J.; Egidi, P.V.; Roemer, E.K.; Schlosser, R.M.

    2000-09-01

    The Oak Ridge National Laboratory (ORNL) Environmental Technology Section conducted an independent verification (IV) survey of the clean storage pile at the Johnston Atoll Plutonium Contaminated Soil Remediation Project (JAPCSRP) from January 18-25, 1999. The goal of the JAPCSRP is to restore a 24-acre area that was contaminated with plutonium oxide particles during nuclear testing in the 1960s. The selected remedy was a soil sorting operation that combined radiological measurements and mining processes to identify and sequester plutonium-contaminated soil. The soil sorter operated from about 1990 to 1998. The remaining clean soil is stored on-site for planned beneficial use on Johnston Island. The clean storage pile currently consists of approximately 120,000 m³ of coral. ORNL conducted the survey according to a Sampling and Analysis Plan, which proposed to provide an IV of the clean pile by collecting a minimum number (99) of samples. The goal was to ascertain with 95% confidence whether 97% of the processed soil is less than or equal to the accepted guideline (500 Bq/kg or 13.5 pCi/g) total transuranic (TRU) activity.
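
    The minimum sample count of 99 is consistent with the standard zero-failure acceptance-sampling bound: if at least 3% of the soil exceeded the guideline, the chance that n random samples all pass is at most 0.97^n, so n is chosen to drive that probability below 5%. The plan itself does not state this derivation, so the following is a plausible reconstruction:

```python
import math

# Smallest n such that 0.97**n <= 0.05, i.e. if more than 3% of the soil
# exceeded the 500 Bq/kg guideline, the probability that all n samples
# pass would be below 5%.
confidence, fraction = 0.95, 0.97
n = math.ceil(math.log(1 - confidence) / math.log(fraction))
print(n)  # 99, matching the minimum sample count in the plan
```

With 98 samples the pass-all probability is still just above 5%, which is why 99 is the minimum.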

  10. Environmental Technology Verification Program Materials Management and Remediation Center Generic Protocol for Verification of In Situ Chemical Oxidation

    EPA Science Inventory

    The protocol provides generic procedures for implementing a verification test for the performance of in situ chemical oxidation (ISCO), focused specifically to expand the application of ISCO at manufactured gas plants with polyaromatic hydrocarbon (PAH) contamination (MGP/PAH) an...

  11. Adaptive beamlet-based finite-size pencil beam dose calculation for independent verification of IMRT and VMAT

    SciTech Connect

    Park, Justin C.; Li, Jonathan G.; Arhjoul, Lahcen; Yan, Guanghua; Lu, Bo; Fan, Qiyong; Liu, Chihray

    2015-04-15

    Purpose: The use of sophisticated dose calculation procedures in modern radiation therapy treatment planning is unavoidable in order to account for complex treatment fields created by multileaf collimators (MLCs). As a consequence, independent volumetric dose verification is time consuming, which affects the efficiency of the clinical workflow. In this study, the authors present an efficient adaptive beamlet-based finite-size pencil beam (AB-FSPB) dose calculation algorithm that minimizes the computational procedure while preserving accuracy. Methods: The computational time of the finite-size pencil beam (FSPB) algorithm is proportional to the number of infinitesimal and identical beamlets that constitute an arbitrary field shape. In AB-FSPB, the dose distribution from each beamlet is mathematically modeled such that the beamlets representing an arbitrary field shape no longer need to be infinitesimal or identical. As a result, it is possible to represent an arbitrary field shape with a combination of differently sized beamlets, using a minimal number of them. In addition, the authors included model parameters to account for the MLC's rounded leaf edge and transmission. Results: Root mean square errors (RMSEs) between the treatment planning system and conventional FSPB on a 10 × 10 cm² square field using 10 × 10, 2.5 × 2.5, and 0.5 × 0.5 cm² beamlet sizes were 4.90%, 3.19%, and 2.87%, respectively, compared with RMSEs of 1.10%, 1.11%, and 1.14% for AB-FSPB. This finding holds true for a larger square field size of 25 × 25 cm², where RMSEs for 25 × 25, 2.5 × 2.5, and 0.5 × 0.5 cm² beamlet sizes were 5.41%, 4.76%, and 3.54% in FSPB, respectively, compared with RMSEs of 0.86%, 0.83%, and 0.88% for AB-FSPB. It was found that AB-FSPB could successfully account for the MLC transmissions without major discrepancy. The algorithm was also graphics processing unit (GPU) compatible to maximize its computational speed. For an intensity modulated radiation therapy (
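
    The computational saving comes purely from the beamlet count. A toy counting illustration (with hypothetical leaf positions and sizes, not the published algorithm): each MLC leaf pair defines an open segment that an adaptive scheme can cover with one variable-width beamlet, whereas a fixed-grid FSPB needs many identical small beamlets for the same aperture.

```python
# Hypothetical MLC aperture: one open segment [left, right] (cm) per leaf pair.
leaf_pairs = [(-2.0, 2.0), (-3.0, 3.5), (-1.0, 4.0)]
leaf_width = 0.5   # cm, physical leaf width (hypothetical)
beamlet = 0.25     # cm, conventional fixed FSPB beamlet size (hypothetical)

# Adaptive decomposition: one variable-size beamlet per leaf-pair segment.
n_adaptive = len(leaf_pairs)

# Fixed decomposition: tile each segment with identical small beamlets.
n_fixed = sum(int((r - l) / beamlet) * int(leaf_width / beamlet)
              for l, r in leaf_pairs)

print(n_adaptive, n_fixed)  # 3 vs 124 beamlet evaluations
```

Since the per-beamlet dose kernel dominates the cost, a ~40x reduction in beamlet count translates almost directly into runtime, which is the effect AB-FSPB exploits.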

  12. Revalidation and quality assurance: the application of the MUSIQ framework in independent verification visits to healthcare organisations

    PubMed Central

    Griffin, Ann; Viney, Rowena; Welland, Trevor; Gafson, Irene

    2017-01-01

    Objectives We present a national evaluation of the impact of independent verification visits (IVVs) performed by National Health Service (NHS) England as part of quality assuring medical revalidation. Organisational visits are central to NHS quality assurance. They are costly, yet little empirical research evidence exists concerning their impact, and what does exist is conflicting. Setting The focus was on healthcare providers in the NHS (in secondary care) and private sector across England, who were designated bodies (DBs). DBs are healthcare organisations that have a statutory responsibility, via the lead clinician, the responsible officer (RO), to implement medical revalidation. Participants All ROs who had undergone an IVV in England in 2014 and 2015 were invited to participate. 46 ROs were interviewed. Ethnographic data were gathered at 18 observations of the IVVs and 20 IVV post-visit reports underwent documentary analysis. Primary and secondary outcome measures Primary outcomes were the findings pertaining to the effectiveness of the IVV system in supporting the revalidation processes at the DBs. Secondary outcomes were methodological, relating to the Model for Understanding Success in Quality (MUSIQ) and how its application to the IVV reveals the relevance of contextual factors described in the model. Results The impact of the IVVs varied by DB according to three major themes: the personal context of the RO; the organisational context of the DB; and the visit and its impact. ROs were largely satisfied with visits which raised the status of appraisal within their organisations. Inadequate or untimely feedback was associated with dissatisfaction. Conclusions Influencing teams whose prime responsibility is establishing processes and evaluating progress was crucial for internal quality improvement. Visits acted as a nudge, generating internal quality review, which was reinforced by visit teams with relevant expertise. Diverse team membership, knowledge transfer

  13. Independent Verification and Validation Of SAPHIRE 8 Volume 3 Users' Guide Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE 8 Volume 3 Users’ Guide is to assess the user documentation for its completeness, correctness, and consistency with respect to requirements for user interface and for any functionality that can be invoked by the user. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production.

  14. Cryogenic Fluid Management Experiment (CFME) trunnion verification testing

    NASA Astrophysics Data System (ADS)

    Bailey, W. J.; Fester, D. A.

    1983-12-01

    The Cryogenic Fluid Management Experiment (CFME) was designed to characterize subcritical liquid hydrogen storage and expulsion in the low-g space environment. The CFME has now become the storage and supply tank for the Cryogenic Fluid Management Facility, which includes transfer line and receiver tanks as well. The liquid hydrogen storage and supply vessel is supported within a vacuum jacket by two fiberglass/epoxy composite trunnions that were analyzed and designed for this purpose. Analysis using the limited available data indicated the trunnion was the most fatigue-critical component in the storage vessel. Before committing the complete storage tank assembly to environmental testing, an experimental assessment was performed to verify the capability of the trunnion design to withstand expected vibration and loading conditions. Three tasks were conducted to evaluate trunnion integrity. The first determined the fatigue properties of the trunnion composite laminate materials. Tests at both ambient and liquid hydrogen temperatures showed composite material fatigue properties far in excess of those expected. Next, an assessment of the adequacy of the trunnion designs was performed (based on the tested material properties).

  15. Cryogenic Fluid Management Experiment (CFME) trunnion verification testing

    NASA Technical Reports Server (NTRS)

    Bailey, W. J.; Fester, D. A.

    1983-01-01

    The Cryogenic Fluid Management Experiment (CFME) was designed to characterize subcritical liquid hydrogen storage and expulsion in the low-g space environment. The CFME has now become the storage and supply tank for the Cryogenic Fluid Management Facility, which includes transfer line and receiver tanks as well. The liquid hydrogen storage and supply vessel is supported within a vacuum jacket by two fiberglass/epoxy composite trunnions that were analyzed and designed for this purpose. Analysis using the limited available data indicated the trunnion was the most fatigue-critical component in the storage vessel. Before committing the complete storage tank assembly to environmental testing, an experimental assessment was performed to verify the capability of the trunnion design to withstand expected vibration and loading conditions. Three tasks were conducted to evaluate trunnion integrity. The first determined the fatigue properties of the trunnion composite laminate materials. Tests at both ambient and liquid hydrogen temperatures showed composite material fatigue properties far in excess of those expected. Next, an assessment of the adequacy of the trunnion designs was performed (based on the tested material properties).

  16. Checklists for Business Managers. A Tool for Effective Independent School Management.

    ERIC Educational Resources Information Center

    National Association of Independent Schools, Boston, MA.

    The business office guides of the departments of education of Illinois and New Jersey served as the basic resource documents in forming this guide for independent school business managers. The checklists are grouped under the following headings: financial management, insurance and risk management, records retention, purchasing, nonacademic staff,…

  17. The SAMS: Smartphone Addiction Management System and verification.

    PubMed

    Lee, Heyoung; Ahn, Heejune; Choi, Samwook; Choi, Wanbok

    2014-01-01

    While the popularity of smartphones has given enormous convenience to our lives, their pathological use has created a new mental health concern among the community. Hence, intensive research is being conducted on the etiology and treatment of the condition. However, the traditional clinical approach, based on surveys and interviews, has serious limitations: health professionals cannot perform continual assessment and intervention for the affected group, and the subjectivity of assessment is questionable. To cope with these limitations, a comprehensive ICT (Information and Communications Technology) system called SAMS (Smartphone Addiction Management System) is developed for objective assessment and intervention. The SAMS system consists of an Android smartphone application and a web application server. The SAMS client monitors the user's application usage together with GPS location and Internet access location, and transmits the data to the SAMS server. The SAMS server stores the usage data and performs key statistical data analysis and usage intervention according to the clinicians' decision. To verify the reliability and efficacy of the developed system, a comparison study with survey-based screening using the K-SAS (Korean Smartphone Addiction Scale), as well as self-field trials, was performed. The comparison study used usage data from 14 users, adults aged 19 to 50, who left at least 1 week of usage logs and completed the survey questionnaires. The field trial fully verified the accuracy of the time, location, and Internet access information in the usage measurement and the reliability of the system operation over more than 2 weeks. The comparison study showed that daily use count has a strong correlation with K-SAS scores, whereas daily use times do not strongly correlate for potentially addicted users. The correlation coefficients of count and time with total K-SAS score are CC = 0.62 and CC = 0.07, respectively, and the t-test analysis for the
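
    The count-versus-time contrast reported above is an ordinary Pearson correlation comparison. A minimal sketch with invented per-user aggregates (not the study's data) shows the shape of the analysis: launch counts that track the addiction score correlate strongly, while noisy use times do not.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical per-user aggregates, illustrative only.
k_sas = [20, 35, 50, 65, 80]           # K-SAS total scores
counts = [30, 55, 70, 95, 120]         # daily app-launch counts
times = [180, 90, 240, 120, 200]       # daily use minutes (noisy)

print(round(pearson(k_sas, counts), 2), round(pearson(k_sas, times), 2))
```

The same computation over the 14 real users yields the CC = 0.62 (count) versus CC = 0.07 (time) contrast the abstract reports.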

  18. Orion GN&C Fault Management System Verification: Scope And Methodology

    NASA Technical Reports Server (NTRS)

    Brown, Denise; Weiler, David; Flanary, Ronald

    2016-01-01

    In order to ensure long-term ability to meet mission goals and to provide for the safety of the public, ground personnel, and any crew members, nearly all spacecraft include a fault management (FM) system. For a manned vehicle such as Orion, the safety of the crew is of paramount importance. The goal of the Orion Guidance, Navigation and Control (GN&C) fault management system is to detect, isolate, and respond to faults before they can result in harm to the human crew or loss of the spacecraft. Verification of fault management/fault protection capability is challenging due to the large number of possible faults in a complex spacecraft, the inherent unpredictability of faults, the complexity of interactions among the various spacecraft components, and the inability to easily quantify human reactions to failure scenarios. The Orion GN&C Fault Detection, Isolation, and Recovery (FDIR) team has developed a methodology for bounding the scope of FM system verification while ensuring sufficient coverage of the failure space and providing high confidence that the fault management system meets all safety requirements. The methodology utilizes a swarm search algorithm to identify failure cases that can result in catastrophic loss of the crew or the vehicle and rare event sequential Monte Carlo to verify safety and FDIR performance requirements.

  19. Project TEAMS (Techniques and Education for Achieving Management Skills): Independent Business Owner/Managers.

    ERIC Educational Resources Information Center

    Platte Technical Community Coll., Columbus, NE.

    These Project TEAMS (Techniques and Education for Achieving Managerial Skills) instructional materials consist of five units for use in training independent business owner/managers. The first unit contains materials which deal with management skills relating to personal characteristics of successful business people, knowledge of self and chosen…

  20. Independent Business Owner/Managers. Project TEAMS. (Techniques and Education for Achieving Management Skills).

    ERIC Educational Resources Information Center

    Platte Technical Community Coll., Columbus, NE.

    Prepared as part of Platte Technical Community College's project to help managers and supervisors develop practical, up-to-date managerial skills in a relatively short time, this instructional workbook provides information and exercises applicable to on-the-job situations encountered by independent business owner/managers. Unit I provides…

  1. Independent Verification and Validation Of SAPHIRE 8 Software Quality Assurance Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-02-01

    This report provides an evaluation of the Software Quality Assurance Plan. The plan is intended to ensure that all actions necessary for the software life cycle (verification and validation activities; documentation and deliverables; project management; configuration management; nonconformance reporting and corrective action; and quality assessment and improvement) have been planned, and that a systematic pattern of actions provides adequate confidence that the software product conforms to established technical requirements and meets the contractual commitments prepared by the sponsor, the Nuclear Regulatory Commission.

  2. Independent Verification Survey of the Clean Coral Storage Pile at the Johnston Atoll Plutonium-Contaminated Soil Remediation Project

    SciTech Connect

    Wilson-Nichols, M.J.

    2000-12-07

    The Oak Ridge National Laboratory (ORNL) Environmental Technology Section conducted an independent verification (IV) survey of the clean storage pile at the Johnston Atoll Plutonium Contaminated Soil Remediation Project (JAPCSRP) from January 18-25, 1999. The goal of the JAPCSRP is to restore a 24-acre area that was contaminated with plutonium oxide particles during nuclear testing in the 1960s. The selected remedy was a soil sorting operation that combined radiological measurements and mining processes to identify and sequester plutonium-contaminated soil. The soil sorter operated from about 1990 to 1998. The remaining clean soil is stored on-site for planned beneficial use on Johnston Island. The clean storage pile currently consists of approximately 120,000 m³ of coral. ORNL conducted the survey according to a Sampling and Analysis Plan, which proposed to provide an IV of the clean pile by collecting a minimum number (99) of samples. The goal was to ascertain with 95% confidence whether 97% of the processed soil is less than or equal to the accepted guideline (500 Bq/kg or 13.5 pCi/g) total transuranic (TRU) activity. In previous IV tasks, ORNL has (1) evaluated and tested the soil sorter system software and hardware and (2) evaluated the quality control (QC) program used at the soil sorter plant. The IV has found that the soil sorter decontamination was effective and significantly reduced plutonium contamination in the soil processed at the JA site. The Field Command Defense Threat Reduction Agency currently plans to re-use soil from the clean pile as a cover to remaining contamination in portions of the radiological control area. Therefore, ORNL was requested to provide an IV. The survey team collected samples from 103 random locations within the top 4 ft of the clean storage pile. The samples were analyzed in the on-site radioanalytical counting laboratory with an American Nuclear Systems (ANS) field instrument used for the detection of low

  3. Transient analysis for the Tajoura Critical Facility with IRT-2M HEU fuel and IRT-4M LEU fuel: ANL independent verification results.

    SciTech Connect

    Garner, P. L.; Hanan, N. A.

    2005-12-02

    Calculations have been performed for postulated transients in the Critical Facility at the Tajoura Nuclear Research Center (TNRC) in Libya. These calculations have been performed at the request of staff of the Renewable Energy and Water Desalinization Research Center (REWDRC) who are performing similar calculations. The transients considered were established during a working meeting between ANL and REWDRC staff on October 1-2, 2005 and subsequent email correspondence. Calculations were performed for the current high-enriched uranium (HEU) core and the proposed low-enriched uranium (LEU) core. These calculations have been performed independently from those being performed by REWDRC and serve as one step in the verification process.

  4. Independent Verification and Validation Of SAPHIRE 8 Software Acceptance Test Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE 8 Software Acceptance Test Plan is to assess the approach to be taken for intended testing activities. The plan typically identifies the items to be tested, the requirements being tested, the testing to be performed, test schedules, personnel requirements, reporting requirements, evaluation criteria, and any risks requiring contingency planning. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production.

  5. Global climate change mitigation and sustainable forest management--The challenge of monitoring and verification

    SciTech Connect

    Makundi, Willy R.

    1997-12-31

    In this paper, sustainable forest management is discussed within the historical and theoretical framework of the sustainable development debate. The various criteria and indicators for sustainable forest management put forth by different institutions are critically explored. Specific types of climate change mitigation policies/projects in the forest sector are identified and examined in the light of the general criteria for sustainable forest management. Areas of compatibility and contradiction between the climate mitigation objectives and the minimum criteria for sustainable forest management are identified and discussed. Emphasis is put on the problems of monitoring and verifying carbon benefits associated with such projects given their impacts on pre-existing policy objectives on sustainable forest management. The implications of such policy interactions on assignment of carbon credits from forest projects under Joint Implementation/Activities Implemented Jointly initiatives are discussed. The paper concludes that a comprehensive monitoring and verification regime must include an impact assessment on the criteria covered under other agreements such as the Biodiversity and/or Desertification Conventions. The actual carbon credit assigned to a specific project should at least take into account the negative impacts on the criteria for sustainable forest management. The value of the impacts and/or the procedure to evaluate them need to be established by interested parties such as the Councils of the respective Conventions.

  6. Verification Testing of Air Pollution Control Technology Quality Management Plan Revision 2.3

    EPA Pesticide Factsheets

    The Air Pollution Control Technology Verification Center was established in 1995 as part of the EPA’s Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technologies’ performance.

  7. Managing Headship Transitions in U.S. Independent Schools

    ERIC Educational Resources Information Center

    Kane, Pearl Rock; Barbaro, Justin

    2016-01-01

    Headship transitions in U.S. independent schools represent critical organizational events that affect multiple school constituencies, including faculty, staff, and students. With recent projections forecasting a high level of impending headship transitions in independent schools, this paper seeks to capture how second-year U.S. independent school…

  8. INDEPENDENT VERIFICATION SURVEY OF THE SPRU LOWER LEVEL HILLSIDE AREA AT THE KNOLLS ATOMIC POWER LABORATORY NISKAYUNA, NEW YORK

    SciTech Connect

    Harpenau, Evan M.; Weaver, Phyllis C.

    2012-06-06

    From August 10 through August 19, 2011, and from October 23 through November 4, 2011, ORAU/ORISE conducted verification survey activities at the Separations Process Research Unit (SPRU) site that included in-process inspections, surface scans, and soil sampling of the Lower Level Hillside Area. According to the Type-B Investigation Report, Sr-90 was the primary contributor to the majority of the activity (60 times greater than the Cs-137 activity). The evaluation of the scan data and sample results obtained during verification activities determined that the primary radionuclide of concern, Sr-90, was well below the agreed-upon soil cleanup objective (SCO) of 30 pCi/g for the site. However, the concentration of Cs-137 in the four judgmental samples collected in final status survey (FSS) Units A and B was greater than the SCO. Both ORAU and aRc surveys identified higher Cs-137 concentrations in FSS Units A and B; the greatest concentrations were identified in FSS Unit A.

  9. SU-E-T-351: Verification of Monitor Unit Calculation for Lung Stereotactic Body Radiation Therapy Using a Secondary Independent Planning System

    SciTech Connect

    Tsuruta, Y; Nakata, M; Higashimura, K; Nakamura, M; Miyabe, Y; Akimoto, M; Ono, T; Mukumoto, N; Ishihara, Y; Matsuo, Y; Mizowaki, T; Hiraoka, M

    2014-06-01

    Purpose: To compare isocenter (IC) dose between X-ray Voxel Monte Carlo (XVMC) and Acuros XB (AXB) as part of an independent verification of monitor unit (MU) calculations for lung stereotactic body radiation therapy (SBRT) using a secondary independent treatment planning system (TPS). Methods: Treatment plans of 110 lesions from 101 patients who underwent lung SBRT with Vero4DRT (Mitsubishi Heavy Industries, Ltd., Japan, and BrainLAB, Feldkirchen, Germany) were evaluated retrospectively. Dose distribution was calculated with XVMC in iPlan 4.5.1 (BrainLAB, Feldkirchen, Germany) on averaged intensity projection images. The spatial resolution and mean variance were 2 mm and 2%, respectively. The clinical treatment plans were transferred from iPlan to Eclipse (Varian Medical Systems, Palo Alto, CA, USA), and doses were recalculated with a well-commissioned AXB ver. 11.0.31 while maintaining the XVMC-calculated MUs and beam arrangement. Dose calculations were made in the dose-to-medium reporting mode with a calculation grid size of 2.5 mm. The mean and standard deviation (SD) of the IC dose difference between XVMC and AXB were calculated, and the tolerance level was defined as |mean|+2SD. Additionally, the relationship between IC dose difference and the size of the planning target volume (PTV) or the computed tomography (CT) value of the internal target volume (ITV) was evaluated. Results: The mean±SD of the IC dose difference between XVMC and AXB was −0.32±0.73%. The tolerance level was 1.8%. Absolute IC dose differences exceeding the tolerance level were observed in 3 patients (2.8%). There were no strong correlations between IC dose difference and PTV size (R=−0.14) or CT value of ITV (R=−0.33). Conclusion: The present study suggests that independent verification of MU calculations for lung SBRT using a secondary TPS is useful.
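
The outlier test described above (tolerance level = |mean| + 2SD of the isocenter dose differences) is simple to reproduce; the sketch below uses illustrative numbers, not the study's data:

```python
def tolerance_level(diffs):
    """Tolerance level |mean| + 2*SD (population SD) of the percent
    isocenter dose differences."""
    n = len(diffs)
    mean = sum(diffs) / n
    sd = (sum((d - mean) ** 2 for d in diffs) / n) ** 0.5
    return abs(mean) + 2.0 * sd

def flag_outliers(diffs):
    """Indices of plans whose absolute difference exceeds the tolerance."""
    tol = tolerance_level(diffs)
    return [i for i, d in enumerate(diffs) if abs(d) > tol]

# Illustrative XVMC-vs-AXB differences (%), not the study's data
diffs = [-0.4, -0.1, -0.9, 0.3, -0.5, -2.6, 0.1, -0.2]
tol = tolerance_level(diffs)        # 2.24 for these numbers
outliers = flag_outliers(diffs)     # plan index 5 exceeds it
```

Plans flagged this way would be sent back for a manual check of the MU calculation rather than rejected outright.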

  10. Packaged low-level waste verification system

    SciTech Connect

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-08-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites for disposal. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS system was developed under a cost-share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL).

  11. SU-E-T-490: Independent Three-Dimensional (3D) Dose Verification of VMAT/SBRT Using EPID and Cloud Computing

    SciTech Connect

    Ding, A; Han, B; Bush, K; Wang, L; Xing, L

    2015-06-15

    Purpose: Dosimetric verification of VMAT/SBRT is currently performed on one or two planes in a phantom with either film or array detectors. A robust and easy-to-use 3D dosimetric tool has been sought since the advent of conformal radiation therapy. Here we present such a strategy for an independent 3D VMAT/SBRT plan verification system using a combination of EPID and cloud-based Monte Carlo (MC) dose calculation. Methods: The 3D dosimetric verification proceeds in two steps. First, the plan was delivered with a high-resolution portable EPID mounted on the gantry, and the EPID-captured gantry-angle-resolved VMAT/SBRT field images were converted into fluence using the EPID pixel response function derived from MC simulations. The fluence was resampled and used as the input for an in-house developed Amazon cloud-based MC software to reconstruct the 3D dose distribution. The accuracy of the developed 3D dosimetric tool was assessed using a Delta4 phantom with various field sizes (square, circular, rectangular, and irregular MLC fields) and different patient cases. The method was applied to validate VMAT/SBRT plans using WFF and FFF photon beams (Varian TrueBeam STx). Results: The proposed method yielded results consistent with the Delta4 measurements. For points on the two detector planes, agreement within 1.5% was found for all test fields. Patient VMAT/SBRT plan studies revealed a similar level of accuracy: average γ-index passing rates of 99.2±0.6% (3mm/3%), 97.4±2.4% (2mm/2%), and 72.6±8.4% (1mm/1%). Conclusion: A valuable 3D dosimetric verification strategy has been developed for VMAT/SBRT plan validation. The technique provides a viable solution for a number of intractable dosimetry problems, such as small fields and plans with high dose gradients.
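
The γ-index passing rates quoted above combine a dose tolerance and a distance-to-agreement. A deliberately simplified 1D, global-normalization version is sketched below (clinical tools search in 3D with interpolation; the profiles here are illustrative):

```python
import math

def gamma_1d(ref, meas, spacing_mm, dose_tol_pct=3.0, dist_tol_mm=3.0):
    """Simplified global 1D gamma index: ref and meas are dose profiles
    sampled on the same grid; dose difference is normalized to the
    reference maximum (global normalization)."""
    dmax = max(ref)
    gammas = []
    for i, dr in enumerate(ref):
        best = math.inf
        for j, dm in enumerate(meas):
            dd = 100.0 * (dm - dr) / dmax / dose_tol_pct   # dose axis
            dx = (j - i) * spacing_mm / dist_tol_mm        # distance axis
            best = min(best, math.hypot(dd, dx))
        gammas.append(best)
    return gammas

def passing_rate(gammas):
    """Percent of points with gamma <= 1."""
    return 100.0 * sum(g <= 1.0 for g in gammas) / len(gammas)

ref  = [10, 30, 60, 90, 100, 90, 60, 30, 10]
meas = [11, 29, 64, 88, 101, 92, 58, 31, 9]
rate = passing_rate(gamma_1d(ref, meas, spacing_mm=2.0))   # one point fails
```

Tightening the criteria (e.g., 1 mm/1%) shrinks both tolerances, which is why passing rates drop sharply at that level in the study.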

  12. Results of the independent radiological verification survey at the former Associate Aircraft Tool and Manufacturing Company site, Fairfield, Ohio (FOH001)

    SciTech Connect

    Rice, D.E.; Murray, M.E.; Brown, K.S.

    1996-01-01

    The former Associate Aircraft Tool and Manufacturing Company site is located at 3550 Dixie Highway, Fairfield, Ohio. Associate Aircraft Tool and Manufacturing Company produced hollow uranium slugs in a machine shop at the site in 1956. The work was performed for National Lead of Ohio in a contract with the Atomic Energy Commission to augment the capacity of the Feed Materials Production Center at Fernald in the development of nuclear energy for defense-related projects. The current occupant of the building, Force Control, operates a multipurpose machine shop. At the request of the US Department of Energy (DOE), a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at the former Associate Aircraft Tool and Manufacturing Company Site, Fairfield, Ohio. The survey was performed from February to May of 1995. The purpose of the survey was to verify that radioactivity from residues of {sup 238}U was remediated to a level below acceptable DOE guideline levels.

  13. Neutronic, steady-state, and transient analyses for the Kazakhstan VVR-K reactor with LEU fuel: ANL independent verification results

    SciTech Connect

    Hanan, Nelson A.; Garner, Patrick L.

    2015-08-01

    Calculations have been performed for steady state and postulated transients in the VVR-K reactor at the Institute of Nuclear Physics (INP), Kazakhstan. (The reactor designation in Cyrillic is BBP-K; transliterating characters to English gives VVR-K but translating words gives WWR-K.) These calculations have been performed at the request of staff of the INP, who are performing similar calculations. The selection of the transients considered started during working meetings and email correspondence between Argonne National Laboratory (ANL) and INP staff; in the end, the transients were defined by the INP staff. Calculations were performed for the fresh low-enriched uranium (LEU) core and for four subsequent cores as beryllium is added to maintain criticality during the first 15 cycles. These calculations have been performed independently from those being performed by INP and serve as one step in the verification process.

  14. Transient analyses for the Uzbekistan VVR-SM reactor with IRT-3M HEU fuel and IRT-4M LEU fuel : ANL independent verification results.

    SciTech Connect

    Garner, P. L.; Hanan, N. A.; Nuclear Engineering Division

    2007-09-24

    Calculations have been performed for postulated transients in the VVR-SM Reactor at the Institute of Nuclear Physics (INP) of the Academy of Sciences in the Republic of Uzbekistan. (The reactor designation in Cyrillic is BBP-CM; transliterating characters to English gives VVR-SM but translating words gives WWR-SM.) These calculations have been performed at the request of staff of the INP, who are performing similar calculations. The transients considered were established during working meetings between Argonne National Laboratory (ANL) and INP staff during summer 2006 [Ref. 1], subsequent email correspondence, and subsequent staff visits. Calculations were performed for the current high-enriched uranium (HEU) core, the proposed low-enriched uranium (LEU) core, and one mixed HEU-LEU core during the transition. These calculations have been performed independently from those being performed by INP and serve as one step in the verification process.

  15. TH-E-BRE-11: Adaptive-Beamlet Based Finite Size Pencil Beam (AB-FSPB) Dose Calculation Algorithm for Independent Verification of IMRT and VMAT

    SciTech Connect

    Park, C; Arhjoul, L; Yan, G; Lu, B; Li, J; Liu, C

    2014-06-15

    Purpose: In current IMRT and VMAT settings, sophisticated dose calculation procedures are unavoidable in order to account for the complex treatment fields created by MLCs. As a consequence, independent volumetric dose verification is time consuming, which affects the efficiency of the clinical workflow. In this study, the authors present an efficient pencil-beam-based dose calculation algorithm that minimizes the computational procedure while preserving accuracy. Methods: The computational time of the Finite Size Pencil Beam (FSPB) algorithm is proportional to the number of infinitesimal, identical beamlets that constitute the arbitrary field shape. In AB-FSPB, the dose distribution from each beamlet is modelled mathematically such that the beamlets representing an arbitrary field shape no longer need to be infinitesimal or identical. In consequence, it is possible to represent an arbitrary field shape with a minimal number of beamlets of different sizes. Results: On comparing FSPB with AB-FSPB, the complexity of the algorithm was reduced significantly. For a 25 by 25 cm2 square field, 1 beamlet of 25 by 25 cm2 was sufficient to calculate dose in AB-FSPB, whereas in conventional FSPB a minimum of 2500 beamlets of 0.5 by 0.5 cm2 size were needed to calculate a dose comparable to the result computed from the Treatment Planning System (TPS). The algorithm was also found to be GPU compatible, maximizing its computational speed. On calculating 3D dose for an IMRT plan (∼30 control points) and a VMAT plan (∼90 control points) with grid size 2.0 mm (200 by 200 by 200), the dose could be computed within 3∼5 and 10∼15 seconds, respectively. Conclusion: The authors have developed an efficient pencil-beam-type dose calculation algorithm called AB-FSPB. Its fast computation and GPU compatibility have shown performance better than conventional FSPB, enabling the implementation of AB-FSPB in the clinical environment for independent
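
One way to realize the adaptive-beamlet idea is greedy merging of identical row-wise aperture runs into rectangles; this is an illustration of the concept, not the authors' algorithm:

```python
def adaptive_beamlets(mask):
    """Greedily decompose a binary aperture mask into rectangles: find
    row-wise runs of open cells, then merge identical runs on consecutive
    rows into taller rectangles. Returns (row, col, height, width) tuples."""
    rows, cols = len(mask), len(mask[0])
    open_rects = {}   # (col, width) -> [top_row, height] still growing
    beamlets = []
    for r in range(rows + 1):           # extra pass closes remaining rects
        row_runs = set()
        if r < rows:
            c = 0
            while c < cols:
                if mask[r][c]:
                    start = c
                    while c < cols and mask[r][c]:
                        c += 1
                    row_runs.add((start, c - start))
                else:
                    c += 1
        for key in list(open_rects):    # run ended: emit finished rectangle
            if key not in row_runs:
                top, h = open_rects.pop(key)
                beamlets.append((top, key[0], h, key[1]))
        for key in row_runs:            # run continues or starts
            if key in open_rects:
                open_rects[key][1] += 1
            else:
                open_rects[key] = [r, 1]
    return beamlets

# A 4 x 4 open field: one adaptive beamlet versus 16 fixed 1 x 1 beamlets
square = [[1] * 4 for _ in range(4)]
n_adaptive = len(adaptive_beamlets(square))   # 1
```

For an open square field the decomposition collapses to a single beamlet, mirroring the paper's 1-versus-2500 comparison; irregular MLC shapes produce a handful of rectangles rather than thousands of fixed cells.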

  16. Verification of recursive probabilistic integration (RPI) method for fatigue life management using non-destructive inspections

    NASA Astrophysics Data System (ADS)

    Chen, Tzikang J.; Shiao, Michael

    2016-04-01

    This paper verifies a generic and efficient assessment concept for probabilistic fatigue life management. The concept is based on an integration of damage tolerance methodology, simulation methods [1, 2], and a probabilistic algorithm, RPI (recursive probability integration) [3-9], that considers maintenance for damage-tolerance- and risk-based fatigue life management. RPI is an efficient semi-analytical probabilistic method for risk assessment subject to various uncertainties, such as variability in material properties including crack growth rate, initial flaw size, repair quality, random-process modeling of flight loads for failure analysis, and inspection reliability represented by probability of detection (POD). In addition, unlike traditional Monte Carlo simulation (MCS), which requires a rerun whenever the maintenance plan is changed, RPI can repeatedly use a small set of baseline random crack-growth histories, excluding maintenance-related parameters, from a single MCS for various maintenance plans. In order to fully assess the RPI method, a verification procedure was performed. In this study, MC simulations on the order of several hundred billion runs were conducted for various flight conditions, material properties, inspection schedules, PODs, and repair/replacement strategies. Since MC simulation is time consuming, the simulations were conducted in parallel on DoD High Performance Computers (HPC) using a specialized random number generator for parallel computing. The study has shown that the RPI method is several orders of magnitude more efficient than traditional Monte Carlo simulation.
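
The reuse idea behind RPI (one set of baseline crack-growth histories evaluated against many inspection plans, instead of a fresh Monte Carlo run per plan) can be sketched with a toy growth and POD model; every constant below is illustrative:

```python
import random

def crack_history(a0, growth_mean, growth_sd, n_flights):
    """One random crack-growth history (toy multiplicative growth model):
    returns crack size after each flight."""
    a, hist = a0, []
    for _ in range(n_flights):
        a *= 1.0 + max(0.0, random.gauss(growth_mean, growth_sd))
        hist.append(a)
    return hist

def pod(a, a50=2.0, beta=3.0):
    """Toy log-logistic probability of detection as a function of crack size."""
    return a ** beta / (a ** beta + a50 ** beta)

def prob_failure(histories, a_crit, inspections):
    """Expected fraction of histories that reach a_crit while escaping every
    scheduled inspection. The same baseline histories are reused for every
    inspection plan, which is the reuse idea behind RPI."""
    total = 0.0
    for hist in histories:
        if max(hist) < a_crit:
            continue                      # never reaches critical size
        escape = 1.0
        for t in inspections:
            escape *= 1.0 - pod(hist[t])  # chance of missing it each time
        total += escape
    return total / len(histories)

random.seed(1)
baseline = [crack_history(0.1, 0.02, 0.01, 200) for _ in range(2000)]
p_no_insp = prob_failure(baseline, a_crit=3.0, inspections=[])
p_two_insp = prob_failure(baseline, a_crit=3.0, inspections=[100, 150])
```

Changing the inspection schedule only changes the cheap `prob_failure` pass over the stored histories; the expensive simulation step is never repeated, which is the source of RPI's efficiency advantage.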

  17. Independent Validation and Verification of Process Design and Optimization Technology Diagnostic and Control of Natural Gas Fired Furnaces via Flame Image Analysis Technology

    SciTech Connect

    Cox, Daryl

    2009-05-01

    The United States Department of Energy, Industrial Technologies Program has invested in emerging Process Design and Optimization Technologies (PDOT) to encourage the development of new initiatives that might result in energy savings in industrial processes. Gas-fired furnaces present a harsh environment, often making accurate determination of correct air/fuel ratios a challenge. Operation with the correct air/fuel ratio, and especially with balanced burners in multi-burner combustion equipment, can result in improved system efficiency, yielding lower operating costs and reduced emissions. Flame Image Analysis offers a way to improve individual burner performance by identifying and correcting fuel-rich burners. The anticipated benefits of this technology are improved furnace thermal efficiency and lower NOx emissions. Independent validation and verification (V&V) testing of the FIA technology was performed at Missouri Forge, Inc., in Doniphan, Missouri by Environ International Corporation (V&V contractor) and Enterprise Energy and Research (EE&R), the developer of the technology. The test site was selected by the technology developer and accepted by Environ after a meeting held at Missouri Forge. As stated in the solicitation for the V&V contractor: 'The objective of this activity is to provide independent verification and validation of the performance of this new technology when demonstrated in industrial applications. A primary goal for the V&V process will be to independently evaluate if this technology, when demonstrated in an industrial application, can be utilized to save a significant amount of the operating energy cost. The Seller will also independently evaluate the other benefits of the demonstrated technology that were previously identified by the developer, including those related to product quality, productivity, environmental impact, etc.'
A test plan was provided by the technology developer and is included as an appendix to the summary report submitted
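
As background to the burner-balancing problem, a common rule of thumb relates dry flue-gas O2 to excess air for natural-gas combustion; a fuel-rich burner reads near-zero O2. This is a standard tuning approximation, not part of the FIA technology itself:

```python
def excess_air_pct(o2_dry_pct):
    """Approximate excess air (%) from dry flue-gas O2 (%), a standard
    rule of thumb for natural-gas combustion tuning: air contains
    about 20.9% O2, so EA = 100 * O2 / (20.9 - O2)."""
    return 100.0 * o2_dry_pct / (20.9 - o2_dry_pct)

o2_reading = 3.0                       # percent O2 in dry flue gas
ea = excess_air_pct(o2_reading)        # roughly 17% excess air
```

Balancing burners so that each runs at a similar, modest excess-air level is what yields the efficiency and NOx benefits the abstract anticipates.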

  18. FINAL REPORT – INDEPENDENT VERIFICATION SURVEY SUMMARY AND RESULTS FOR THE ARGONNE NATIONAL LABORATORY BUILDING 330 PROJECT FOOTPRINT, ARGONNE, ILLINOIS

    SciTech Connect

    ERIKA N. BAILEY

    2012-02-29

    ORISE conducted onsite verification activities of the Building 330 project footprint during the period of June 6 through June 7, 2011. The verification activities included technical reviews of project documents, visual inspections, radiation surface scans, and sampling and analysis. The draft verification report was issued in July 2011 with findings and recommendations. The contractor performed additional evaluations and remediation.

  19. Study of space shuttle orbiter system management computer function. Volume 2: Automated performance verification concepts

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The findings are presented of investigations on concepts and techniques in automated performance verification. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.

  20. The Management of Independent Secondary School Libraries in England and Wales: The Skills and Perceptions of Library Managers

    ERIC Educational Resources Information Center

    Turner, Richard; Matthews, Graham; Ashcroft, Linda; Farrow, Janet

    2007-01-01

    This paper investigates aspects of the management of independent secondary school libraries in England and Wales. It is based on a survey of 150 independent school library managers, with a response rate of 68.7 percent, which was carried out as part of an ongoing PhD research project. The paper considers a range of issues important to school…

  1. Identification, Verification, and Compilation of Produced Water Management Practices for Conventional Oil and Gas Production Operations

    SciTech Connect

    Rachel Henderson

    2007-09-30

    The project is titled 'Identification, Verification, and Compilation of Produced Water Management Practices for Conventional Oil and Gas Production Operations'. The Interstate Oil and Gas Compact Commission (IOGCC), headquartered in Oklahoma City, Oklahoma, is the principal investigator, and the IOGCC has partnered with ALL Consulting, Inc., headquartered in Tulsa, Oklahoma, in this project. State agencies that have also partnered in the project are the Wyoming Oil and Gas Conservation Commission, the Montana Board of Oil and Gas Conservation, the Kansas Oil and Gas Conservation Division, the Oklahoma Oil and Gas Conservation Division, and the Alaska Oil and Gas Conservation Commission. The objective is to characterize produced water quality and management practices for the handling, treating, and disposing of produced water from conventional oil and gas operations throughout the industry nationwide. Water produced from these operations varies greatly in quality and quantity and is often the single largest barrier to the economic viability of wells. The lack of data, coupled with renewed emphasis on domestic oil and gas development, has prompted many experts to speculate that the number of wells drilled over the next 20 years will approach 3 million, or near the number of current wells. This level of exploration and development undoubtedly will draw the attention of environmental communities, focusing their concerns on produced water management based on perceived potential impacts to fresh water resources. Therefore, it is imperative that produced water management practices be performed in a manner that best minimizes environmental impacts. This is being accomplished by compiling current best management practices for produced water from conventional oil and gas operations and by developing an analysis tool based on a geographic information system (GIS) to assist in the understanding of watershed-issued permits.
That would allow management costs to be kept in line with

  2. International Space Station Atmosphere Control and Supply, Atmosphere Revitalization, and Water Recovery and Management Subsystem - Verification for Node 1

    NASA Technical Reports Server (NTRS)

    Williams, David E.

    2007-01-01

    The International Space Station (ISS) Node 1 Environmental Control and Life Support (ECLS) System is comprised of five subsystems: Atmosphere Control and Supply (ACS), Atmosphere Revitalization (AR), Fire Detection and Suppression (FDS), Temperature and Humidity Control (THC), and Water Recovery and Management (WRM). This paper provides a summary of the nominal operation of the Node 1 ACS, AR, and WRM design and detailed Element Verification methodologies utilized during the Qualification phase for Node 1.

  3. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  4. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  5. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  6. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  7. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  8. Space station data management system - A common GSE test interface for systems testing and verification

    NASA Technical Reports Server (NTRS)

    Martinez, Pedro A.; Dunn, Kevin W.

    1987-01-01

    This paper examines the fundamental problems and goals associated with test, verification, and flight-certification of man-rated distributed data systems. First, a summary of the characteristics of modern computer systems that affect the testing process is provided. Then, verification requirements are expressed in terms of an overall test philosophy for distributed computer systems. This test philosophy stems from previous experience that was gained with centralized systems (Apollo and the Space Shuttle), and deals directly with the new problems that verification of distributed systems may present. Finally, a description of potential hardware and software tools to help solve these problems is provided.

  9. Independent Verification of Research Reactor Operation (Analysis of the Georgian IRT-M Reactor by the Isotope Ratio Method)

    SciTech Connect

    Cliff, John B.; Frank, Douglas P.; Gerlach, David C.; Gesh, Christopher J.; Little, Winston W.; Reid, Bruce D.; Tsiklauri, Georgi V.; Abramidze, Sh; Rostomashvili, Z.; Kiknadze, G.; Dzhavakhishvily, O.; Nabakhtiani, G.

    2010-08-11

    The U.S. Department of Energy’s Office of Nonproliferation and International Security (NA-24) develops technologies to aid in implementing international nuclear safeguards. The Isotope Ratio Method (IRM) was successfully developed in 2005–2007 by Pacific Northwest National Laboratory (PNNL) and the Republic of Georgia’s Andronikashvili Institute of Physics as a generic technology to verify the declared operation of water-moderated research reactors, independent of spent fuel inventory. IRM estimates the energy produced over the operating lifetime of a fission reactor by measuring the ratios of the isotopes of trace impurity elements in non-fuel reactor components.
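
The physics behind IRM can be illustrated with a single-capture-chain toy model: a stable impurity nuclide transmutes under neutron capture, so the daughter-to-parent ratio encodes the integrated fluence, and lifetime energy production scales with that fluence under a fixed spectrum. The real method models specific impurity chains with reactor-physics codes; the cross section below is a hypothetical placeholder:

```python
import math

def fluence_from_ratio(ratio, sigma_cm2):
    """Neutron fluence (n/cm^2) implied by a daughter-to-parent atom ratio
    for a single neutron-capture chain, ignoring daughter burnup:
    N_d / N_p = exp(sigma * phi) - 1, so phi = ln(1 + ratio) / sigma."""
    return math.log(1.0 + ratio) / sigma_cm2

sigma = 1.0e-24                        # hypothetical 1-barn capture cross section
phi = fluence_from_ratio(0.01, sigma)  # fluence implied by a measured 1% ratio
```

Because the impurity sits in permanent structural components, the estimate cannot be reset by swapping out fuel, which is what makes the method independent of the spent fuel inventory.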

  10. Report of the Space Shuttle Management Independent Review Team

    NASA Technical Reports Server (NTRS)

    1995-01-01

    At the request of the NASA Administrator a team was formed to review the Space Shuttle Program and propose a new management system that could significantly reduce operating costs. Composed of a group of people with broad and extensive experience in spaceflight and related areas, the team received briefings from the NASA organizations and most of the supporting contractors involved in the Shuttle Program. In addition, a number of chief executives from the supporting contractors provided advice and suggestions. The team found that the present management system has functioned reasonably well despite its diffuse structure. The team also determined that the shuttle has become a mature and reliable system, and--in terms of a manned rocket-propelled space launch system--is about as safe as today's technology will provide. In addition, NASA has reduced shuttle operating costs by about 25 percent over the past 3 years. The program, however, remains in a quasi-development mode and yearly costs remain higher than required. Given the current NASA-contractor structure and incentives, it is difficult to establish cost reduction as a primary goal and implement changes to achieve efficiencies. As a result, the team sought to create a management structure and associated environment that enables and motivates the Program to further reduce operational costs. Accordingly, the review team concluded that the NASA Space Shuttle Program should (1) establish a clear set of program goals, placing a greater emphasis on cost-efficient operations and user-friendly payload integration; (2) redefine the management structure, separating development and operations and disengaging NASA from the daily operation of the space shuttle; and (3) provide the necessary environment and conditions within the program to pursue these goals.

  11. Software verification and testing

    NASA Technical Reports Server (NTRS)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  12. Provenance In Sensor Data Management: A Cohesive, Independent Solution

    SciTech Connect

    Hensley, Zachary P; Sanyal, Jibonananda; New, Joshua Ryan

    2014-01-01

    In today's information-driven workplaces, data is constantly undergoing transformations and being moved around. The typical business-as-usual approach is to use email attachments, shared network locations, databases, and now, the cloud. More often than not, there are multiple versions of the data sitting in different locations, and users of this data are confounded by the lack of metadata describing its provenance, or in other words, its lineage. Our project aims to solve this issue in the context of sensor data. The Oak Ridge National Laboratory's Building Technologies Research and Integration Center has reconfigurable commercial buildings deployed on the Flexible Research Platforms (FRPs). These FRPs are instrumented with a large number of sensors which measure variables such as HVAC efficiency, relative humidity, and temperature gradients across doors, windows, and walls. Sub-minute resolution data from hundreds of channels is acquired. This sensor data was traditionally saved to a shared network location accessible to a number of scientists for performing complicated simulation and analysis tasks. The sensor data also participates in elaborate quality assurance exercises as a result of inherent faults. Sometimes, faults are induced to observe building behavior. It became apparent that proper scientific controls required managing not just the data acquisition and delivery but also the metadata associated with temporal subsets of the sensor data. We built a system named ProvDMS, or Provenance Data Management System for the FRPs, which allows researchers both to retrieve data of interest and to trace data lineage.
This provides researchers a one-stop shop for comprehensive views of various data transformations, allowing them to trace their data to its source so that experiments, and derivations of experiments, may be reused and reproduced without much overhead
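
ProvDMS internals are not described in the abstract, so the following is a generic hash-chained lineage sketch of the trace-to-source capability it describes; all names and operations are hypothetical:

```python
import hashlib
import json

class ProvenanceRecord:
    """Minimal lineage record: each derived dataset stores its parent's
    content hash, so any version can be traced back to the raw sensor pull."""
    def __init__(self, name, payload, parent_hash=None, operation="ingest"):
        self.name = name
        self.parent_hash = parent_hash
        self.operation = operation
        self.hash = hashlib.sha256(
            json.dumps({"name": name, "payload": payload,
                        "parent": parent_hash, "op": operation},
                       sort_keys=True).encode()).hexdigest()

def lineage(record, registry):
    """Walk parent hashes back to the original ingest record and return
    the ordered list of operations that produced this dataset."""
    chain = [record.operation]
    while record.parent_hash is not None:
        record = registry[record.parent_hash]
        chain.append(record.operation)
    return list(reversed(chain))

# Hypothetical FRP channel: raw pull -> QA filtering -> hourly resample
raw = ProvenanceRecord("frp1_hvac", [21.4, 21.6, 21.5])
clean = ProvenanceRecord("frp1_hvac_qa", [21.4, 21.6, 21.5],
                         raw.hash, operation="qa_filter")
hourly = ProvenanceRecord("frp1_hvac_hourly", [21.5],
                          clean.hash, operation="resample")
registry = {r.hash: r for r in (raw, clean, hourly)}
history = lineage(hourly, registry)    # ['ingest', 'qa_filter', 'resample']
```

Content hashing also detects silent edits: any change to a payload changes its hash, breaking the chain for downstream records that still reference the old version.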

  13. Demand-side management implementation and verification at Fort Drum, New York

    SciTech Connect

    Armstrong, P.R.; Dixon, D.R.; Richman, E.E.; Rowley, S.E.

    1994-12-01

    Through the Facility Energy Decision Screening (FEDS) process, the US Army Forces Command (FORSCOM) has identified present value savings of nearly $47 million in cost-effective energy conservation measures (ECMs) at Fort Drum, New York. With associated costs of more than $16 million (1992 $), the measures provide a net present value of $30.6 million for all identified projects. By implementing all cost-effective ECMs, Fort Drum can reduce its annual energy use by more than 230,000 MBtu (11% of its fossil energy consumption) and more than 27,000 MWh (32% of its electric energy consumption). The annual cost of energy services will decrease by $2.8 million (20%) at current energy rates. The servicing utility (Niagara Mohawk Power Corporation) has informally agreed to finance and implement cost-effective ECMs and to participate in the verification of energy savings. Verification baselining is under way; implementation of retrofit projects is expected to begin in late 1994. The utility-administered financing and contracting arrangements and the alternative federal programs for implementing the projects are described. The verification protocols and sampling plans for audit, indirect, and direct measurement levels of verification and the responsibilities of Fort Drum, the utility, the energy service companies (ESCOs), and Pacific Northwest Laboratory (PNL) in the verification process are also presented. A preliminary weather-normalized model of baseline energy consumption has been developed based on a full year's metered data.
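
Weather-normalized baselining of the kind described is commonly a degree-day regression: fit pre-retrofit energy use against heating degree-days, then compare post-retrofit metered use with what the model predicts for the same weather. A minimal ordinary-least-squares sketch with illustrative numbers (not Fort Drum data):

```python
def fit_baseline(hdd, energy):
    """Ordinary least squares for energy = base + slope * heating degree-days."""
    n = len(hdd)
    mx = sum(hdd) / n
    my = sum(energy) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(hdd, energy)) / \
            sum((x - mx) ** 2 for x in hdd)
    return my - slope * mx, slope

# Illustrative monthly data: heating degree-days vs metered MBtu
hdd    = [1200, 1000, 700, 300, 100, 50]
energy = [2600, 2200, 1650, 850, 450, 350]
base, slope = fit_baseline(hdd, energy)

def verified_savings(hdd_month, metered_month):
    """Savings = baseline prediction for this weather minus actual use."""
    return (base + slope * hdd_month) - metered_month
```

Normalizing to degree-days keeps a mild winter from being mistaken for retrofit savings, which is the point of the baselining step in the verification protocol.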

  14. Reducing software security risk through an integrated approach research initiative model based verification of the Secure Socket Layer (SSL) Protocol

    NASA Technical Reports Server (NTRS)

    Powell, John D.

    2003-01-01

    This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk (RSSR) Through an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level, and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California Institute of Technology), where the research is being carried out.

  15. Managing Complexity in the MSL/Curiosity Entry, Descent, and Landing Flight Software and Avionics Verification and Validation Campaign

    NASA Technical Reports Server (NTRS)

    Stehura, Aaron; Rozek, Matthew

    2013-01-01

    The complexity of the Mars Science Laboratory (MSL) mission presented the Entry, Descent, and Landing systems engineering team with many challenges in its Verification and Validation (V&V) campaign. This paper describes some of the logistical hurdles related to managing a complex set of requirements, test venues, test objectives, and analysis products in the implementation of a specific portion of the overall V&V program to test the interaction of flight software with the MSL avionics suite. Application-specific solutions to these problems are presented herein, which can be generalized to other space missions and to similar formidable systems engineering problems.

  16. Proceedings of the International Workshop on Sustainable Forest Management: Monitoring and Verification of Greenhouse Gases

    SciTech Connect

    Sathaye, Jayant; Makundi, Willy; Goldberg, Beth; Andrasko, Ken; Sanchez, Arturo

    1997-07-01

    The International Workshop on Sustainable Forest Management: Monitoring and Verification of Greenhouse Gases was held in San Jose, Costa Rica, July 29-31, 1996. The main objectives of the workshop were to: (1) assemble key practitioners of forestry greenhouse gas (GHG) or carbon offset projects, remote sensing of land cover change, guidelines development, and the forest products certification movement, to offer presentations and small-group discussions on findings relevant to the crucial need for the development of guidelines for monitoring and verifying offset projects; and (2) disseminate the findings, at appropriate venues, to interested carbon offset project developers and forestry and climate change policy makers, who need guidance and consistency of methods to reduce project transaction costs and increase the probable reliability of carbon benefits. The workshop brought together about 45 participants from developed, developing, and transition countries. The participants included researchers, government officials, project developers, and staff from regional and international agencies. Each shared his or her perspectives based on experience in the development and use of methods for monitoring and verifying carbon flows from forest areas and projects. A shared sense among the participants was that methods for monitoring forestry projects are well established, and the techniques are known and used extensively, particularly in production forestry. Introducing climate change, with its long-term perspective, often conflicts with the shorter-term perspective of most forestry projects and standard accounting principles. The resolution of these conflicts may require national and international agreements among the affected parties. The establishment of guidelines and protocols for better methods that are sensitive to regional issues will be an important first step to increase the credibility of forestry projects as viable mitigation options. The workshop deliberations led

  17. Independent Verification and Validation Program

    NASA Technical Reports Server (NTRS)

    Bailey, Brandon T.

    2015-01-01

    Presentation to be given to European Space Agency counterparts to give an overview of NASA's IVV Program and the layout and structure of the Software Testing and Research laboratory maintained at IVV. Seeking STI-ITAR review due to the international audience. Most of the information has been presented to public audiences in the past, with some variations on data, or is in the public domain.

  18. Neutronics, steady-state, and transient analyses for the Poland MARIA reactor for irradiation testing of LEU lead test fuel assemblies from CERCA : ANL independent verification results.

    SciTech Connect

    Garner, P. L.; Hanan, N. A.

    2011-06-07

    The MARIA reactor at the Institute of Atomic Energy (IAE) in Swierk (30 km SE of Warsaw) in the Republic of Poland is considering conversion from high-enriched uranium (HEU) to low-enriched uranium (LEU) fuel assemblies (FA). The FA design in MARIA is rather unique; a suitable LEU FA has never been designed or tested. IAE has contracted with CERCA (the fuel supply portion of AREVA in France) to supply 2 lead test assemblies (LTA). The LTAs will be irradiated in MARIA to burnup level of at least 40% for both LTAs and to 60% for one LTA. IAE may decide to purchase additional LEU FAs for a full core conversion after the test irradiation. The Reactor Safety Committee within IAE and the National Atomic Energy Agency in Poland (PAA) must approve the LTA irradiation process. The approval will be based, in part, on IAE submitting revisions to portions of the Safety Analysis Report (SAR) which are affected by the insertion of the LTAs. (A similar process will be required for the full core conversion to LEU fuel.) The analysis required was established during working meetings between Argonne National Laboratory (ANL) and IAE staff during August 2006, subsequent email correspondence, and subsequent staff visits. The analysis needs to consider the current high-enriched uranium (HEU) core and 4 core configurations containing 1 and 2 LEU LTAs in various core positions. Calculations have been performed at ANL in support of the LTA irradiation. These calculations are summarized in this report and include criticality, burn-up, neutronics parameters, steady-state thermal hydraulics, and postulated transients. These calculations have been performed at the request of the IAE staff, who are performing similar calculations to be used in their SAR amendment submittal to the PAA. The ANL analysis has been performed independently from that being performed by IAE and should only be used as one step in the verification process.

  19. Packaged low-level waste verification system

    SciTech Connect

    Tuite, K.; Winberg, M.R.; McIsaac, C.V.

    1995-12-31

    The Department of Energy through the National Low-Level Waste Management Program and WMG Inc. have entered into a joint development effort to design, build, and demonstrate the Packaged Low-Level Waste Verification System. Currently, states and low-level radioactive waste disposal site operators have no method to independently verify the radionuclide content of packaged low-level waste that arrives at disposal sites for disposition. At this time, the disposal site relies on the low-level waste generator shipping manifests and accompanying records to ensure that low-level waste received meets the site's waste acceptance criteria. The subject invention provides the equipment, software, and methods to enable the independent verification of low-level waste shipping records to ensure that the site's waste acceptance criteria are being met. The objective of the prototype system is to demonstrate a mobile system capable of independently verifying the content of packaged low-level waste.

  20. GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...

  1. Formal verification of an MMU and MMU cache

    NASA Technical Reports Server (NTRS)

    Schubert, E. T.

    1991-01-01

    We describe the formal verification of a hardware subsystem consisting of a memory management unit and a cache. These devices are verified independently and then shown to interact correctly when composed. The MMU authorizes memory requests and translates virtual addresses to real addresses. The cache improves performance by maintaining a LRU (least recently used) list from the memory resident segment table.

  2. Enhancing independent time-management skills of individuals with mental retardation using a Palmtop personal computer.

    PubMed

    Davies, Daniel K; Stock, Steven E; Wehmeyer, Michael L

    2002-10-01

    Achieving greater independence for individuals with mental retardation depends upon the acquisition of several key skills, including time-management and scheduling skills. The ability to perform tasks according to a schedule is essential to domains like independent living and employment. The use of a portable schedule prompting system to increase independence and self-regulation in time-management for individuals with mental retardation was examined. Twelve people with mental retardation participated in a comparison of their use of the technology system to perform tasks on a schedule with use of a written schedule. Results demonstrated the utility of a Palmtop computer with schedule prompting software to increase independence in the performance of vocational and daily living tasks by individuals with mental retardation.

  3. Integrated Safety Management System Phase 1 and 2 Verification for the Environmental Restoration Contractor Volumes 1 and 2

    SciTech Connect

    CARTER, R.P.

    2000-04-04

    DOE Policy 450.4 mandates that safety be integrated into all aspects of the management and operations of its facilities. The goal of an institutionalized Integrated Safety Management System (ISMS) is to have a single integrated system that includes Environment, Safety, and Health requirements in the work planning and execution processes to ensure the protection of the worker, public, environment, and federal property over the life cycle of the Environmental Restoration (ER) Project. The purpose of this Environmental Restoration Contractor (ERC) ISMS Phase I/II Verification was to determine whether ISMS programs and processes were institutionalized within the ER Project, whether these programs and processes were implemented, and whether the system had promoted the development of a safety-conscious work culture.

  4. SU-E-T-24: A Simple Correction-Based Method for Independent Monitor Unit (MU) Verification in Monte Carlo (MC) Lung SBRT Plans

    SciTech Connect

    Pokhrel, D; Badkul, R; Jiang, H; Estes, C; Kumar, P; Wang, F

    2014-06-01

    Purpose: Lung SBRT uses hypo-fractionated doses in small non-IMRT fields with tissue-heterogeneity-corrected plans. An independent MU verification is mandatory for safe and effective delivery of the treatment plan. This report compares planned MUs obtained from the iPlan XVMC algorithm against a spreadsheet-based hand calculation using the most commonly used simple TMR-based method. Methods: Treatment plans of 15 patients who underwent MC-based lung SBRT to 50 Gy in 5 fractions for PTV V100% = 95% were studied. ITVs were delineated on MIP images based on 4D-CT scans. PTVs (ITV + 5 mm margins) ranged from 10.1 to 106.5 cc (average = 48.6 cc). MC SBRT plans were generated using a combination of non-coplanar conformal arcs/beams with the iPlan XVMC algorithm (BrainLAB iPlan ver. 4.1.2) for a Novalis-TX with micro-MLCs and a 6 MV SRS (1000 MU/min) beam. These plans were re-computed using the heterogeneity-corrected pencil-beam (PB-hete) algorithm without changing any beam parameters, such as MLCs/MUs. The dose ratio PB-hete/MC gave beam-by-beam inhomogeneity correction factors (ICFs): individual correction. For an independent second check, MC MUs were verified using a TMR-based hand calculation, which yielded an average ICF: average correction; the TMR-based hand calculation systematically underestimated MC MUs by ∼5%. Also, the first 10 MC plans were verified with an ion-chamber measurement in a homogeneous phantom. Results: For both beams/arcs, the mean PB-hete dose was systematically overestimated by 5.5±2.6% and the mean hand-calculated MU systematically underestimated by 5.5±2.5% compared to XVMC. With individual correction, mean hand-calculated MUs matched XVMC to within -0.3±1.4%/0.4±1.4% for beams/arcs, respectively. After the average 5% correction, hand-calculated MUs matched XVMC to within 0.5±2.5%/0.6±2.0% for beams/arcs, respectively. Only a small dependence on tumor volume (TV)/field size (FS) was observed. Ion-chamber measurements were within ±3.0%. Conclusion: PB-hete overestimates dose to lung tumor relative to
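    The correction idea in this abstract can be sketched numerically. The function names and the dose/MU values below are hypothetical, assuming only the ~5% average correction described above.

```python
# Sketch of the inhomogeneity-correction-factor (ICF) idea: ICF is the
# beam-by-beam dose ratio PB-hete / MC, and the average ICF is applied to
# TMR-based hand-calculated MUs. All numbers are hypothetical.

def icf(d_pb_hete, d_mc):
    """Beam-by-beam ICF as the dose ratio PB-hete / MC."""
    return d_pb_hete / d_mc

def corrected_mu(mu_hand, correction):
    """Scale a TMR-based hand-calculated MU by a correction factor."""
    return mu_hand * correction

# PB-hete overestimates dose by ~5%, so hand-calculated MUs underestimate
# the MC MUs by ~5%; applying the average ICF closes most of that gap.
beam_icfs = [1.06, 1.05, 1.04]            # hypothetical per-beam ICFs
avg_icf = sum(beam_icfs) / len(beam_icfs)
mu_hand = 200.0                           # hypothetical hand-calculated MU
print(round(corrected_mu(mu_hand, avg_icf), 1))  # → 210.0
```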

  5. Leveraging Independent Management and Chief Engineer Hierarchy: Vertically and Horizontally-Derived Technical Authority Value

    NASA Technical Reports Server (NTRS)

    Barley, Bryan; Newhouse, Marilyn

    2012-01-01

    In the development of complex spacecraft missions, project management authority is usually extended hierarchically from NASA's highest agency levels down to the implementing institution's project team level, through both the center and the program. In parallel with management authority, NASA utilizes a complementary, but independent, hierarchy of technical authority (TA) that extends from the agency level to the project, again, through both the center and the program. The chief engineers (CEs) who serve in this technical authority capacity oversee and report on the technical status and ensure sound engineering practices, controls, and management of the projects and programs. At the lowest level, implementing institutions assign project CEs to technically engage projects, lead development teams, and ensure sound technical principles, processes, and issue resolution. At the middle level, programs and centers independently use CEs to ensure the technical success of their projects and programs. At the agency level, NASA's mission directorate CEs maintain technical cognizance over every program and project in their directorate and advise directorate management on the technical, cost, schedule, and programmatic health of each. As part of this vertically-extended CE team, a program level CE manages a continually varying balance between penetration depth and breadth across his or her assigned missions. Teamwork issues and information integration become critical for management at all levels to ensure value-added use of both the synergy available between CEs at the various agency levels, and the independence of the technical authority at each organization.

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT HYDRO COMPLIANCE MANAGEMENT, INC. HYDRO-KLEEN FILTRATION SYSTEM, 03/07/WQPC-SWP, SEPTEMBER 2003

    EPA Science Inventory

    Verification testing of the Hydro-Kleen(TM) Filtration System, a catch-basin filter designed to reduce hydrocarbon, sediment, and metals contamination from surface water flows, was conducted at NSF International in Ann Arbor, Michigan. A Hydro-Kleen(TM) system was fitted into a ...

  7. Management of Students' Independent Work through the Project Technology in Foreign Language Education

    ERIC Educational Resources Information Center

    Lakova, Assel; Chaklikova, Assel

    2016-01-01

    This article focuses on the management of students' independent work in the specialty "Journalism" on the subject "Special Foreign Language" in high school through project-based learning, which is one of the most important and modern types of tasks. The goal of this work is to theoretically and experimentally prove the…

  8. The Managing Independent Living Program and the Development of Reasoning Processes.

    ERIC Educational Resources Information Center

    Thomas, Ruth G.

    The Managing Independent Living Program provides instruction in reasoning skills to analyze career and life planning, housing, and consumer needs of persons in transition. Although the program was implemented and tested with adult female offenders, it is readily adaptable to both institutionalized and noninstitutionalized persons and for secondary…

  9. Integrated Safety Management System Phase I Verification for the Plutonium Finishing Plant (PFP) [VOL 1 & 2

    SciTech Connect

    SETH, S.S.

    2000-01-10

    U.S. Department of Energy (DOE) Policy 450.4, Safety Management System Policy commits to institutionalizing an Integrated Safety Management System (ISMS) throughout the DOE complex as a means of accomplishing its missions safely. DOE Acquisition Regulation 970.5204-2 requires that contractors manage and perform work in accordance with a documented safety management system.

  10. Leading and Managing Today's Independent School: A Qualitative Analysis of the Skills and Practices of Experienced Heads of Independent Schools in the New York Metropolitan Area

    ERIC Educational Resources Information Center

    Juhel, Jean-Marc

    2016-01-01

    This article presents the findings of a qualitative study conducted in 2014 with 16 experienced heads of school in the New York metropolitan area. The study was designed to better understand the skills and practices that they view as critical to leading and managing independent schools. The data collected speak to each head's ability to manage the…

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: STORMWATER SOURCE AREA TREATMENT DEVICE; PRACTICAL BEST MANAGEMENT OF GEORGIA, INC., CRYSTALSTREAM™ WATER QUALITY VAULT MODEL 1056

    EPA Science Inventory

    Verification testing of the Practical Best Management, Inc., CrystalStream™ stormwater treatment system was conducted over a 15-month period starting in March, 2003. The system was installed in a test site in Griffin, Georgia, and served a drainage basin of approximately 4 ...

  12. Approaches in highly parameterized inversion - GENIE, a general model-independent TCP/IP run manager

    USGS Publications Warehouse

    Muffels, Christopher T.; Schreuder, Willem A.; Doherty, John E.; Karanovic, Marinko; Tonkin, Matthew J.; Hunt, Randall J.; Welter, David E.

    2012-01-01

    GENIE is a model-independent suite of programs that can be used to generally distribute, manage, and execute multiple model runs via the TCP/IP infrastructure. The suite consists of a file distribution interface, a run manager, a run executor, and a routine that can be compiled as part of a program and used to exchange model runs with the run manager. Because communication is via a standard protocol (TCP/IP), any computer connected to the Internet can serve in any of the capacities offered by this suite. Model independence is consistent with the existing template and instruction file protocols of the widely used PEST parameter estimation program. This report describes (1) the problem addressed; (2) the approach used by GENIE to queue, distribute, and retrieve model runs; and (3) user instructions, classes, and functions developed. It also includes (4) an example to illustrate the linking of GENIE with Parallel PEST using the interface routine.
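    The queue/distribute/retrieve pattern the abstract describes can be sketched in miniature. Here threads stand in for GENIE's TCP/IP-connected run executors, and the toy "model" function is purely illustrative.

```python
# Sketch of a run manager's distribute/execute/retrieve loop, with threads
# standing in for networked run executors. Names are illustrative only.
import queue
import threading

def model_run(params):
    """Stand-in for a forward model run: return a fake objective value."""
    return sum(p * p for p in params)

def worker(jobs, results):
    # Each executor pulls queued runs until none remain, then exits.
    while True:
        try:
            run_id, params = jobs.get_nowait()
        except queue.Empty:
            return
        results.put((run_id, model_run(params)))

jobs, results = queue.Queue(), queue.Queue()
for run_id, params in enumerate([[1.0, 2.0], [3.0], [0.5, 0.5]]):
    jobs.put((run_id, params))

threads = [threading.Thread(target=worker, args=(jobs, results)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(results.queue))  # → [(0, 5.0), (1, 9.0), (2, 0.5)]
```

    In the real suite the exchange happens over sockets, so any Internet-connected machine can take the executor role; the in-process queues here only mirror the control flow.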

  13. PERFORMANCE VERIFICATION OF ANIMAL WATER TREATMENT TECHNOLOGIES THROUGH EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of in...

  14. Spent Nuclear Fuel (SNF) project Integrated Safety Management System phase I and II Verification Review Plan

    SciTech Connect

    CARTER, R.P.

    1999-11-19

    The U.S. Department of Energy (DOE) commits to accomplishing its mission safely. To ensure this objective is met, DOE issued DOE P 450.4, Safety Management System Policy, and incorporated safety management into the DOE Acquisition Regulations ([DEAR] 48 CFR 970.5204-2 and 90.5204-78). Integrated Safety Management (ISM) requires contractors to integrate safety into management and work practices at all levels so that missions are achieved while protecting the public, the worker, and the environment. The contractor is required to describe the Integrated Safety Management System (ISMS) to be used to implement the safety performance objective.

  15. Verification of JUPITER Standard Analysis Method for Upgrading Joyo MK-III Core Design and Management

    NASA Astrophysics Data System (ADS)

    Maeda, Shigetaka; Ito, Chikara; Sekine, Takashi; Aoyama, Takafumi

    In the experimental fast reactor Joyo, loading of irradiation test rigs causes a decrease in excess reactivity because the rigs contain less fissile material than the driver fuel. In order to carry out duty operation cycles using as many irradiation rigs as possible, it is necessary to upgrade the core performance to increase its excess reactivity and irradiation capacity. Core modification plans have been considered, such as the installation of advanced radial reflectors and a reduction of the number of control rods. To implement such core modifications, it is first necessary to improve the prediction accuracy in core design and to optimize safety margins. In the present study, verification of the JUPITER fast reactor standard analysis method was conducted through a comparison between the calculated and the measured Joyo MK-III core characteristics, and it was concluded that the accuracy for a small sodium-cooled fast reactor with a hard neutron spectrum was within 5% of unity. It was shown that the performance of the irradiation bed core could be upgraded by improving the prediction accuracy of the core characteristics and optimizing safety margins.
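    The acceptance check behind a statement like "within 5% of unity" is a calculated-to-experimental (C/E) ratio test, which can be sketched as follows; the values are hypothetical, not Joyo data.

```python
# Sketch of a C/E (calculated-to-experimental) verification check: a method
# is judged acceptable when the ratio stays within a stated band around
# unity (5% here, matching the abstract). Values are hypothetical.

def within_band(calculated, measured, tolerance=0.05):
    """True when the C/E ratio lies within `tolerance` of unity."""
    return abs(calculated / measured - 1.0) <= tolerance

print(within_band(1.02, 1.00))  # 2% off unity: acceptable → True
print(within_band(1.08, 1.00))  # 8% off unity: outside the band → False
```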

  16. Fluor Daniel Hanford Inc. integrated safety management system phase 1 verification final report

    SciTech Connect

    PARSONS, J.E.

    1999-10-28

    The purpose of this review is to verify the adequacy of documentation as submitted to the Approval Authority by Fluor Daniel Hanford, Inc. (FDH). This review covers not only the Integrated Safety Management System (ISMS) System Description documentation, but also the procedures, policies, and manuals of practice used to implement safety management in an environment of organizational restructuring. The FDH ISMS should support the Hanford Strategic Plan (DOE-RL 1996) to safely clean up and manage the site's legacy waste; deploy science and technology while incorporating the ISMS theme to "Do work safely"; and protect human health and the environment.

  17. Data Verification Tools for Minimizing Management Costs of Dense Air-Quality Monitoring Networks.

    PubMed

    Miskell, Georgia; Salmond, Jennifer; Alavi-Shoshtari, Maryam; Bart, Mark; Ainslie, Bruce; Grange, Stuart; McKendry, Ian G; Henshaw, Geoff S; Williams, David E

    2016-01-19

    Aiming to minimize the costs, both capital expenditure and maintenance, of an extensive air-quality measurement network, we present simple statistical methods, requiring no extensive training data sets, for automated real-time verification of the reliability of data delivered by a spatially dense hybrid network of both low-cost and reference ozone measurement instruments. Ozone is a pollutant with a relatively smooth large-scale spatial spread, although there can be significant small-scale variations. We take advantage of these characteristics and demonstrate detection of instrument calibration drift within a few days using a rolling 72 h comparison of hourly averaged data from the test instrument with that from suitably defined proxies. We define the required characteristics of the proxy measurements by working from a definition of the network purpose and specification: in this case, reliable determination of the proportion of hourly averaged ozone measurements above a threshold in any given day, and detection of calibration drift greater than ±30% in slope or ±5 parts per billion in offset. By analyzing results of a study of an extensive deployment of low-cost instruments in the Lower Fraser Valley, we demonstrate that proxies can be established using land-use criteria and that simple statistical comparisons can identify low-cost instruments that are not stable and therefore need replacing. We propose that a minimal set of compliant reference instruments can be used to verify the reliability of data from a much more extensive network of low-cost devices.
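    The rolling-comparison test can be sketched as a least-squares fit of test-instrument data against a proxy, flagging drift against the stated tolerances (±30% slope, ±5 ppb offset). The data and helper names below are illustrative assumptions, not the paper's exact procedure.

```python
# Sketch of drift detection over a 72 h window: regress hourly averages from
# a test instrument against a proxy and flag calibration drift when the
# fitted slope or offset leaves the tolerances. Data are hypothetical.

def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

def drifted(proxy, test, slope_tol=0.30, offset_tol=5.0):
    """True when slope departs from 1 by > slope_tol or |offset| > offset_tol."""
    slope, offset = fit_line(proxy, test)
    return abs(slope - 1.0) > slope_tol or abs(offset) > offset_tol

proxy = [20 + 0.5 * h for h in range(72)]   # hypothetical hourly ozone, ppb
ok_instrument = [v + 1.0 for v in proxy]    # 1 ppb offset: within spec
bad_instrument = [1.5 * v for v in proxy]   # 50% slope drift: flagged
print(drifted(proxy, ok_instrument), drifted(proxy, bad_instrument))  # → False True
```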

  18. Development of an airborne remote sensing system for crop pest management: System integration and verification

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Remote sensing along with Global Positioning Systems, Geographic Information Systems, and variable rate technology has been developed, which scientists can implement to help farmers maximize the economic and environmental benefits of crop pest management through precision agriculture. Airborne remo...

  19. River Protection Project Integrated safety management system phase II verification review plan - 7/29/99

    SciTech Connect

    SHOOP, D.S.

    1999-09-10

    The purpose of this review is to verify the implementation status of the Integrated Safety Management System (ISMS) for the River Protection Project (RPP) facilities managed by Fluor Daniel Hanford, Inc. (FDH) and operated by Lockheed Martin Hanford Company (LMHC). This review will also ascertain whether within RPP facilities and operations the work planning and execution processes are in place and functioning to effectively protect the health and safety of the workers, public, environment, and federal property over the RPP life cycle. The RPP ISMS should support the Hanford Strategic Plan (DOERL-96-92) to safely clean up and manage the site's legacy waste and deploy science and technology while incorporating the ISMS central theme to "Do work safely" and protect human health and the environment.

  20. Environmental Technology Verification Program Quality Management Plan, Version 3.0

    EPA Science Inventory

    The ETV QMP is a document that addresses specific policies and procedures that have been established for managing quality-related activities in the ETV program. It is the “blueprint” that defines an organization’s QA policies and procedures; the criteria for and areas of QA appli...

  1. Experimental Verification and Integration of a Next Generation Smart Power Management System

    NASA Astrophysics Data System (ADS)

    Clemmer, Tavis B.

    With the increase in energy demand by the residential community in this country and the diminishing fossil fuel resources being used for electric energy production, there is a need for a system to efficiently manage power within a residence. The Smart Green Power Node (SGPN) is a next-generation energy management system that automates on-site energy production, storage, consumption, and grid usage to yield the most savings for both the utility and the consumer. Such a system automatically manages on-site distributed generation sources, such as a PhotoVoltaic (PV) input and battery storage, to curtail grid energy usage when the price is high. The SGPN high-level control features an advanced modular algorithm that incorporates weather data for projected PV generation, battery health monitoring algorithms, user preferences for load prioritization within the home in case of an outage, Time of Use (ToU) grid power pricing, and status of on-site resources to intelligently schedule and manage power flow between the grid, loads, and the on-site resources. The SGPN has a scalable, modular architecture such that it can be customized for user-specific applications. This drove the topology for the SGPN, which connects on-site resources at a low-voltage DC microbus; a two-stage bi-directional inverter/rectifier then couples the AC load and residential grid connection to on-site generation. The SGPN has been designed and built and is undergoing testing. Hardware test results obtained are consistent with the design goals and indicate that the SGPN is a viable system, with some changes recommended and future work identified.
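    As a rough illustration of the kind of rule such a scheduler might apply (not the SGPN's actual algorithm), a single source-selection step could look like this, with all names, thresholds, and values hypothetical:

```python
# Toy sketch of price-aware source selection: serve the load from PV first,
# then battery when the grid price is high, then the grid. Everything here
# is a hypothetical illustration, not the SGPN control algorithm.

def dispatch(load_kw, pv_kw, battery_kw, grid_price, price_threshold=0.20):
    """Split a load across PV, battery, and grid for one scheduling step."""
    plan = {"pv": 0.0, "battery": 0.0, "grid": 0.0}
    plan["pv"] = min(load_kw, pv_kw)          # free on-site generation first
    remaining = load_kw - plan["pv"]
    if remaining > 0 and grid_price > price_threshold:
        plan["battery"] = min(remaining, battery_kw)  # avoid expensive grid power
        remaining -= plan["battery"]
    plan["grid"] = remaining                  # grid covers whatever is left
    return plan

print(dispatch(load_kw=5.0, pv_kw=2.0, battery_kw=2.0, grid_price=0.30))
# → {'pv': 2.0, 'battery': 2.0, 'grid': 1.0}
```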

  2. The role of data management in discipline-independent data visualization

    NASA Technical Reports Server (NTRS)

    Treinish, Lloyd A.

    1990-01-01

    The Common Data Format (CDF) is described in terms of its support applications for the database management of visualization systems. The CDF is a self-describing data abstraction technique for the storage and manipulation of multidimensional data based on block structures. The discipline-independent approach is designed to manage, manipulate, archive, display, and analyze data, and can be applied to heterogeneous equipment communicating different data structures over networks. An improved CDF version incorporates hyperplane access, allowing random aggregate access to subdimensional blocks within a multidimensional variable. Also discussed is the visualization pipeline, which controls the flow of data and supports the visualization of different classes of data representation techniques. The system is found to accommodate a large variety of scientific data structures and large disk-based data sets.
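    Hyperplane access, pulling a lower-dimensional block out of a multidimensional variable by fixing one index, can be sketched with plain nested lists standing in for a CDF variable; the helper below is illustrative only.

```python
# Sketch of "hyperplane" access: extract the (n-1)-dimensional slice of a
# 2-D variable by fixing one index along a chosen axis. Plain nested lists
# stand in for a CDF multidimensional variable; names are illustrative.

def hyperplane(var, axis, index):
    """Return the 1-D slice of 2-D `var` at `index` along `axis`."""
    if axis == 0:
        return var[index]                 # a whole row
    return [row[index] for row in var]    # a whole column

var = [[1, 2, 3],
       [4, 5, 6]]                         # hypothetical 2x3 variable

print(hyperplane(var, 0, 1))  # → [4, 5, 6]
print(hyperplane(var, 1, 2))  # → [3, 6]
```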

  3. Moving the journey towards independence: Adolescents transitioning to successful diabetes self-management

    PubMed Central

    Strickland, C. June

    2016-01-01

    Purpose To gain a greater understanding of adolescents' experiences living with Type 1 diabetes mellitus (T1DM) and create a theoretical paradigm. Methods Grounded theory as described by Glaser was used. Fifteen in-depth interviews were conducted with adolescents ages 11 to 15 with T1DM. Symbolic interactionism is the theoretical framework for grounded theory. Data were collected, transcribed, coded, and analyzed simultaneously using constant comparative analysis, and findings were grounded in the words of participants. Results A theoretical model was created around the concept of “normalizing”. Normalizing was defined as the ability to integrate diabetes into one’s daily life to make diabetes ‘part of me’. Phase four of the model, and the focus of this manuscript, was “Moving the Journey towards Independence” and included: 1) taking over care, 2) experiencing conflict with parents, and 3) realizing diabetes is hard. The major task for adolescents in this phase was separating from parents to independently manage diabetes. The normalizing task for this phase was “taking on the burden of care”. Adolescents described challenges with independent care and increased parental conflict, including fearing needles, forgetting insulin, feeling embarrassed, and believing that diabetes was a burden in their life. Additionally, juggling the multiple responsibilities of home, school, and work along with managing a chronic illness during adolescence is challenging. Conclusions Transitioning to diabetes self-management is a challenge for adolescents. This model advances understanding of the moving processes in adolescents transitioning; additionally, hypotheses are presented that may be used for developing interventions to promote success in self-management. PMID:26190456

  4. The Home Independence Program with non-health professionals as care managers: an evaluation

    PubMed Central

    Lewin, Gill; Concanen, Karyn; Youens, David

    2016-01-01

    The Home Independence Program (HIP), an Australian restorative home care/reablement service for older adults, has been shown to be effective in reducing functional dependency and increasing functional mobility, confidence in everyday activities, and quality of life. These gains were found to translate into a reduced need for ongoing care services and reduced health and aged care costs over time. Despite these positive outcomes, few Australian home care agencies have adopted the service model – a key reason being that few Australian providers employ health professionals, who act as care managers under the HIP service model. A call for proposals from Health Workforce Australia for projects to expand the scope of practice of health/aged care staff then provided the opportunity to develop, implement, and evaluate a service delivery model, in which nonprofessionals replaced the health professionals as Care Managers in the HIP service. Seventy older people who received the HIP Coordinator (HIPC) service participated in the outcomes evaluation. On a range of personal outcome measures, the group showed statistically significant improvement at 3 and 12 months compared to baseline. On each outcome, the improvement observed was larger than that observed in a previous trial in which the service was delivered by health professionals. However, differences in the timing of data collection between the two studies mean that a direct comparison cannot be made. Clients in both studies showed a similarly reduced need for ongoing home care services at both follow-up points. The outcomes achieved by HIPC, with non-health professionals as Care Managers, were positive and can be considered to compare favorably with the outcomes achieved in HIP when health professionals take the Care Manager role. These findings will be of interest to managers of home care services and to policy makers interested in reducing the long-term care needs of older community dwelling individuals. PMID:27382264

  5. Independent validation of the Pain Management Plan in a multi-disciplinary pain team setting

    PubMed Central

    Quinlan, Joanna; Hughes, Richard; Laird, David

    2016-01-01

    Context/background: The Pain Management Plan (PP) is a brief cognitive behavioural therapy (CBT) self-management programme for people living with persistent pain that can be individually facilitated or provided in a group setting. Evidence of PP efficacy has been reported previously by the pain centres involved in its development. Objectives: To provide a fully independent evaluation of the PP and compare the results with the findings reported by Cole et al. Methods: The PP programme was delivered by the County Durham Pain Team (Co. Durham PT) as outlined in training sessions led by Cole et al. Pre- and post-intervention quantitative and patient-experience measures were repeated, with reliable and clinically significant change determined and compared to the original evaluation. Results: Of the 69 participants who completed the programme, 33% achieved reliable change and 20% clinically significant change on the Pain Self-Efficacy Questionnaire (PSEQ). Across the Brief Pain Inventory (BPI) interference domains, between 11% and 22% of participants achieved clinically significant change. There were high levels of positive patient feedback, with 25% of participants scoring 100% satisfaction; mean participant satisfaction across the population was 88%. Conclusion: The results from this evaluation validate those reported by Cole et al., demonstrating clinically significant improvement in pain and health functioning and high patient satisfaction. Both evaluations emphasise the potential of this programme as an early intervention delivered within a stratified care pain pathway. This approach could optimise the use of finite resources and improve wider access to pain management. PMID:27867506
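
Reliable change and clinically significant change of the kind reported above are conventionally computed with the Jacobson–Truax method; the sketch below assumes that method, and the PSEQ-like score, baseline SD, and reliability value are invented for illustration rather than taken from the study.

```python
import math

# Jacobson-Truax reliable change index (RCI): a pre-to-post change counts
# as "reliable" when it exceeds the change plausibly produced by
# measurement error alone. All numbers below are illustrative.

def reliable_change_index(pre, post, sd_baseline, reliability):
    """RCI = (post - pre) / SEdiff, where SEdiff = sqrt(2) * SEm."""
    sem = sd_baseline * math.sqrt(1.0 - reliability)   # standard error of measurement
    se_diff = math.sqrt(2.0) * sem                     # SE of the difference score
    return (post - pre) / se_diff

# Hypothetical example: a PSEQ-style score rising from 20 to 35, with an
# assumed baseline SD of 12 and test-retest reliability of 0.9.
rci = reliable_change_index(pre=20, post=35, sd_baseline=12.0, reliability=0.9)
print(round(rci, 2), abs(rci) > 1.96)  # |RCI| > 1.96 marks reliable change
```

A change of the same direction but smaller magnitude (say, 20 to 22 under the same assumptions) falls below the 1.96 cutoff and would not be counted as reliable.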

  6. W-026, Waste Receiving and Processing Facility data management system validation and verification report

    SciTech Connect

    Palmer, M.E.

    1997-12-05

    This V and V Report includes analysis of two revisions of the DMS [data management system] System Requirements Specification (SRS) and the Preliminary System Design Document (PSDD); the source code for the DMS Communication Module (DMSCOM) messages; the source code for selected DMS screens; and the code for the BWAS Simulator. BDM Federal analysts used a series of matrices to: compare the requirements in the System Requirements Specification (SRS) to the specifications found in the System Design Document (SDD), to ensure the design supports the business functions; compare the discrete parts of the SDD with each other, to ensure that the design is consistent and cohesive; compare the source code of the DMS Communication Module with the specifications, to ensure that the resultant messages will support the design; compare the source code of selected screens to the specifications, to ensure that the resultant system screens will support the design; and compare the source code of the BWAS simulator with the requirements for interfacing with DMS messages and data transfers relating to the BWAS operations.
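
The matrix-based comparisons described above amount to a requirements-traceability check, which can be sketched in a few lines; every identifier below is hypothetical, not taken from the W-026 documents.

```python
# Sketch of a requirements-traceability check: map each SRS requirement to
# the SDD design elements claiming to satisfy it, then flag requirements
# with no covering design element and design elements that trace to no
# requirement. All identifiers are hypothetical.

def trace(requirements, design_elements):
    """Return (uncovered requirements, orphan design elements)."""
    covered = {req for reqs in design_elements.values() for req in reqs}
    uncovered = sorted(r for r in requirements if r not in covered)
    orphans = sorted(d for d, reqs in design_elements.items() if not reqs)
    return uncovered, orphans

srs = ["REQ-001", "REQ-002", "REQ-003"]   # hypothetical SRS requirement IDs
sdd = {                                   # SDD element -> requirements it traces to
    "SDD-A": ["REQ-001"],
    "SDD-B": ["REQ-001", "REQ-003"],
    "SDD-C": [],                          # traces to nothing: a design orphan
}

uncovered, orphans = trace(srs, sdd)
print(uncovered, orphans)  # REQ-002 lacks design coverage; SDD-C is an orphan
```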

  7. Phase two of Site 300's ecological risk assessment: Model verification and risk management

    SciTech Connect

    Carlson, T.M.; Gregory, S.D.

    1995-12-31

    The authors completed the baseline ecological risk assessment (ERA) for Lawrence Livermore National Laboratory's Site 300 in 1993. Using data collection and modeling techniques adapted from the human health risk assessment (HRA), they evaluated the potential hazard of contaminants in environmental media to ecological receptors. They identified potential hazards to (1) aquatic invertebrates from heavy metal contaminants in surface water, (2) burrowing vertebrates from contaminants volatilizing from subsurface soil into burrow air, and (3) grazing deer and burrowing vertebrates from cadmium contamination in surface soil. They recently began collecting data to refine the estimates of potential hazard to these ecological receptors. Bioassay results from the surface water failed to verify a hazard to aquatic invertebrates. Soil vapor surveys of subsurface burrows did verify the presence of high concentrations of volatile organic compounds (VOCs). However, they have not yet verified a true impact on the burrowing populations. The authors also completed an extensive surface soil sampling program, which identified local hot spots of cadmium contamination. In addition, they have been collecting data on the land use patterns of the deer population. Their data indicate that deer do not typically use those areas with cadmium surface soil contamination. Information from this phase of the ERA, along with the results of the HRA, will direct the selection of remedial alternatives for the site. For the ecological receptors, remedial alternatives include developing a risk management program to ensure that (1) sensitive burrowing species (such as rare or endangered species) do not use areas of surface or subsurface contamination, and (2) deer populations do not use areas of surface soil contamination.

  8. Independent management and financial review, Yucca Mountain Project, Nevada. Final report, Appendix

    SciTech Connect

    1995-07-15

    The Nuclear Waste Policy Act of 1982 (Public Law 97-425), as amended by Public Law 100-203, December 22, 1987, established the Office of Civilian Radioactive Waste Management (OCRWM) within the Department of Energy (DOE), and directed the Office to investigate a site at Yucca Mountain, Nevada, to determine if this site is suitable for the construction of a repository for the disposal of high level nuclear waste. Work on site characterization has been under way for several years. Thus far, about $1.47 billion has been spent on Yucca Mountain programs. This work has been funded by Congressional appropriations from a Nuclear Waste Fund to which contributions have been made by electric utility ratepayers through electric utilities generating power from nuclear power stations. The Secretary of Energy and the Governor of the State of Nevada have appointed one person each to a panel to oversee an objective, independent financial and management evaluation of the Yucca Mountain Project. The requirements for the work will include an analysis of (1) the Yucca Mountain financial and contract management techniques and controls; (2) Project schedules and credibility of the proposed milestones; (3) Project organizational effectiveness and internal planning processes; and (4) adequacy of funding levels and funding priorities, including the cost of infrastructure and scientific studies. The recipient will provide monthly progress reports, and the following reports/documents will be presented as deliverables under the contract: (1) Financial and Contract Management Preliminary Report; (2) Project Scheduling Preliminary Report; (3) Project Organizational Effectiveness Preliminary Report; (4) Project Funding Levels and Funding Priorities Preliminary Report; and (5) Final Report.

  9. Independent naturalists make matchless contributions to science and resource management (Invited)

    NASA Astrophysics Data System (ADS)

    Crimmins, T. M.; Crimmins, M.; Bertelsen, C. D.

    2013-12-01

    Much of the recent growth in PPSR, or public participation in scientific research, has been in 'contributory' or 'collaborative'-type PPSR projects, where non-scientists' roles are primarily data collection or some participation in other aspects of project design or execution. A less common PPSR model, referred to as 'collegial' in recent literature, is characterized by dedicated naturalists collecting rich and extensive data sets outside of an organized program and then working with professional scientists to analyze these data and disseminate findings. The three collaborators on this presentation represent an example of the collegial model; our team comprises an independent naturalist who has collected over 150,000 records of plant flowering phenology spanning three decades, a professional climatologist, and a professional plant ecologist. Together, we have documented fundamental plant-climate relationships and seasonal patterns in flowering in the Sonoran Desert region, as well as changes in flowering community composition and distribution associated with changing climate conditions, in the form of seven peer-reviewed journal articles and several conference presentations and proceedings. These novel findings address critical gaps in our understanding of plant ecology in the Sky Islands region and have been incorporated into the Southwest Climate Change and other regional planning documents. The data resource amassed by a single dedicated individual, far beyond what most researchers or resource managers could assemble, has been instrumental in documenting fundamental ecological relationships in the Sky Islands region and how these systems are changing in a period of rapidly changing climate. The research findings resulting from this partnership also have the potential to directly affect management decisions.
The watershed under study, managed by the US Forest Service, has been

  10. Independent practice associations and physician-hospital organizations can improve care management for smaller practices.

    PubMed

    Casalino, Lawrence P; Wu, Frances M; Ryan, Andrew M; Copeland, Kennon; Rittenhouse, Diane R; Ramsay, Patricia P; Shortell, Stephen M

    2013-08-01

    Pay-for-performance, public reporting, and accountable care organization programs place pressures on physicians to use health information technology and organized care management processes to improve the care they provide. But physician practices that are not large may lack the resources and size to implement such processes. We used data from a unique national survey of 1,164 practices with fewer than twenty physicians to provide the first information available on the extent to which independent practice associations (IPAs) and physician-hospital organizations (PHOs) might make it possible for these smaller practices to share resources to improve care. Nearly a quarter of the practices participated in an IPA or a PHO that accounted for a significant proportion of their patients. On average, practices participating in these organizations provided nearly three times as many care management processes for patients with chronic conditions as nonparticipating practices did (10.4 versus 3.8). Half of these processes were provided only by IPAs or PHOs. These organizations may provide a way for small and medium-size practices to systematically improve care and participate in accountable care organizations.

  11. The behavior of multiple independent managers and ecological traits interact to determine prevalence of weeds.

    PubMed

    Coutts, Shaun R; Yokomizo, Hiroyuki; Buckley, Yvonne M

    2013-04-01

    Management of damaging invasive plants is often undertaken by multiple decision makers, each managing only a small part of the invader's population. As weeds can move between properties and re-infest eradicated sites from unmanaged sources, the dynamics of multiple decision makers play a significant role in weed prevalence and invasion risk at the landscape scale. We used a spatially explicit agent-based simulation to determine how individual agent behavior, in concert with weed population ecology, determined weed prevalence. We compared two invasive grass species that differ in ecology, control methods, and costs: Nassella trichotoma (serrated tussock) and Eragrostis curvula (African love grass). The way decision makers reacted to the benefit of management had a large effect on the extent of a weed. If benefits of weed control outweighed the costs, and either net benefit was very large or all agents were very sensitive to net benefits, then agents tended to act synchronously, reducing the pool of infested agents available to spread the weed. As N. trichotoma was more damaging than E. curvula and had more effective control methods, agents chose to manage it more often, which resulted in lower prevalence of N. trichotoma. A relatively low number of agents who were intrinsically less motivated to control weeds led to increased prevalence of both species. This was particularly apparent when long-distance dispersal meant each infested agent increased the invasion risk for a large portion of the landscape. In this case, a small proportion of land managers reluctant to control, regardless of costs and benefits, could lead to the whole landscape being infested, even when local control stopped new infestations. Social pressure was important, but only if it was independent of weed prevalence, suggesting that early access to information, and incentives to act on that information, may be crucial in stopping a weed from infesting large areas.
The response of our model to both
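
A minimal version of this kind of agent-based dynamic can be sketched as follows; the grid size, payoff, and spread rule are illustrative assumptions, not the parameters of the study's model.

```python
import random

# Minimal agent-based sketch: each land manager controls an infested patch
# only when the perceived net benefit of control is positive, a "reluctant"
# minority never controls, and any infested, uncontrolled patch spreads the
# weed to one randomly chosen patch per step (long-distance dispersal).
# All parameters are illustrative, not the study's.

def step(infested, net_benefit, reluctant, rng, n_agents):
    new = set(infested)
    for agent in infested:
        if net_benefit > 0 and agent not in reluctant:
            new.discard(agent)                # local control succeeds
        else:
            new.add(rng.randrange(n_agents))  # uncontrolled patch spreads
    return new

def simulate(n_agents=100, n_reluctant=5, net_benefit=1.0, steps=50, seed=1):
    rng = random.Random(seed)
    reluctant = set(range(n_reluctant))       # agents who never control
    infested = {0}                            # a single initial infestation
    for _ in range(steps):
        infested = step(infested, net_benefit, reluctant, rng, n_agents)
    return infested

# With positive net benefit and no reluctant agents the weed is eliminated;
# a small reluctant minority keeps re-seeding the landscape indefinitely.
print(len(simulate(n_reluctant=0)), len(simulate(n_reluctant=5)))
```

The sketch reproduces the qualitative point of the abstract: prevalence is driven less by the average manager than by the few who never act, because each of their patches exposes the whole landscape to re-infestation.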

  12. Results of the independent verification of radiological remedial action at 600 South Clayhill Drive (AKA 600 South Cemetery Road), Monticello, Utah (MS00145)

    SciTech Connect

    Wilson, M.J.; Crutcher, J.W.

    1991-07-01

    In 1980 the site of a vanadium and uranium mill at Monticello, Utah, was accepted into the US Department of Energy's (DOE's) Surplus Facilities Management Program, with the objectives of restoring the government-owned mill site to safe levels of radioactivity, disposing of or containing the tailings in an environmentally safe manner, and performing remedial actions on off-site (vicinity) properties that had been contaminated by radioactive material resulting from mill operations. During 1986 and 1987, UNC Geotech, the remedial action contractor designated by DOE, performed remedial action on the vicinity property at 600 South Cemetery Road (updated by San Juan County and the state of Utah to 600 South Clayhill Drive), Monticello, Utah. The Pollutant Assessments Group (PAG) of Oak Ridge National Laboratory was assigned the responsibility of verifying the data supporting the adequacy of remedial action and confirming the site's compliance with DOE guidelines. The PAG found that the site successfully meets the DOE remedial action objectives. Procedures used by PAG are described. 3 refs., 2 tabs.

  13. Low-Intrusion Techniques and Sensitive Information Management for Warhead Counting and Verification: FY2011 Annual Report

    SciTech Connect

    Jarman, Kenneth D.; Robinson, Sean M.; McDonald, Benjamin S.; Gilbert, Andrew J.; Misner, Alex C.; Pitts, W. Karl; White, Timothy A.; Seifert, Allen; Miller, Erin A.

    2011-09-01

    Future arms control treaties may push nuclear weapons limits to unprecedented low levels and may entail precise counting of warheads as well as distinguishing between strategic and tactical nuclear weapons. Such advances will require assessment of form and function to confidently verify the presence or absence of nuclear warheads and/or their components. Imaging with penetrating radiation can provide such an assessment and could thus play a unique role in inspection scenarios. Yet many imaging capabilities have been viewed as too intrusive from the perspective of revealing weapon design details, and the potential for the release of sensitive information poses challenges in verification settings. A widely held perception is that verification through radiography requires images of sufficient quality that an expert (e.g., a trained inspector or an image-matching algorithm) can verify the presence or absence of components of a device. The concept of information barriers (IBs) has been established to prevent access to relevant weapon-design information by inspectors (or algorithms), and has, to date, limited the usefulness of radiographic inspection. The challenge of this project is to demonstrate that radiographic information can be used behind an IB to improve the capabilities of treaty-verification weapons-inspection systems.
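
One way to read the information-barrier idea above is that the inspection system reduces a sensitive radiograph to a single unclassified pass/fail attribute: all image data stays behind the barrier, and only a boolean leaves it. The sketch below is an interpretive toy, not the project's design; the attenuation grids, similarity attribute, and threshold are all invented.

```python
# Toy information-barrier sketch: the comparison of a measured image
# against a reference template happens entirely inside this function,
# and only a pass/fail bit is released. Data and threshold are invented.

def behind_barrier_verdict(measured, template, max_mismatch=0.1):
    """Compare a measured image to a reference template; emit only a bool."""
    assert len(measured) == len(template)
    diff = sum(abs(m - t) for m, t in zip(measured, template))
    total = sum(template) or 1.0
    return (diff / total) <= max_mismatch   # nothing else leaves the barrier

reference = [0.9, 0.8, 0.9, 0.7]        # hypothetical attenuation profile
candidate_ok = [0.88, 0.81, 0.9, 0.71]  # close match to the template
candidate_bad = [0.1, 0.2, 0.1, 0.9]    # a different object entirely

print(behind_barrier_verdict(candidate_ok, reference))
print(behind_barrier_verdict(candidate_bad, reference))
```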

  14. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  15. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  16. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  17. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  18. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  19. Independent External Peer Review Report of the Dredged Material Management Plan for Green Bay, Wisconsin

    DTIC Science & Technology

    2011-06-27

    implementation, maintenance, and continual improvement of its International Organization for Standardization (ISO) 9001:2008-compliant Quality Management ... managing the IEPR. Noblis performed the requirements of this contract in accordance with its Quality Management System, which is compliant with ... International Organization for Standardization (ISO) 9000. Specifically, Noblis prepared a Work Plan to define and manage the process for conducting the IEPR

  20. 10 CFR 300.11 - Independent verification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... DEPARTMENT OF ENERGY CLIMATE CHANGE VOLUNTARY GREENHOUSE GAS REPORTING PROGRAM: GENERAL GUIDELINES § 300.11..., Health and Safety Auditor Certification: California Climate Action Registry; Clean Development Mechanism... statements (or lack thereof) of any significant changes in entity boundaries, products, or processes;...

  1. 10 CFR 300.11 - Independent verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... DEPARTMENT OF ENERGY CLIMATE CHANGE VOLUNTARY GREENHOUSE GAS REPORTING PROGRAM: GENERAL GUIDELINES § 300.11..., Health and Safety Auditor Certification: California Climate Action Registry; Clean Development Mechanism... statements (or lack thereof) of any significant changes in entity boundaries, products, or processes;...

  2. 10 CFR 300.11 - Independent verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... DEPARTMENT OF ENERGY CLIMATE CHANGE VOLUNTARY GREENHOUSE GAS REPORTING PROGRAM: GENERAL GUIDELINES § 300.11..., Health and Safety Auditor Certification: California Climate Action Registry; Clean Development Mechanism... statements (or lack thereof) of any significant changes in entity boundaries, products, or processes;...

  3. 10 CFR 300.11 - Independent verification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... DEPARTMENT OF ENERGY CLIMATE CHANGE VOLUNTARY GREENHOUSE GAS REPORTING PROGRAM: GENERAL GUIDELINES § 300.11..., Health and Safety Auditor Certification: California Climate Action Registry; Clean Development Mechanism... statements (or lack thereof) of any significant changes in entity boundaries, products, or processes;...

  4. 10 CFR 300.11 - Independent verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... DEPARTMENT OF ENERGY CLIMATE CHANGE VOLUNTARY GREENHOUSE GAS REPORTING PROGRAM: GENERAL GUIDELINES § 300.11..., Health and Safety Auditor Certification: California Climate Action Registry; Clean Development Mechanism... statements (or lack thereof) of any significant changes in entity boundaries, products, or processes;...

  5. Verification Games: Crowd-Sourced Formal Verification

    DTIC Science & Technology

    2016-03-01

    Under Crowd-Sourced Formal Verification, the verification tools developed by the Programming Languages and Software Engineering group were improved. A series of games ... were developed by the Center for Game Science: Pipe Jam, Traffic Jam, Flow Jam and Paradox. Verification tools and games were integrated to verify ...

  6. Verification of RESRAD-build computer code, version 3.1.

    SciTech Connect

    2003-06-02

    RESRAD-BUILD is a computer model for analyzing the radiological doses resulting from the remediation and occupancy of buildings contaminated with radioactive material. It is part of a family of codes that includes RESRAD, RESRAD-CHEM, RESRAD-RECYCLE, RESRAD-BASELINE, and RESRAD-ECORISK. The RESRAD-BUILD models were developed and codified by Argonne National Laboratory (ANL); version 1.5 of the code and the user's manual were publicly released in 1994. The original version of the code was written for the Microsoft DOS operating system. However, subsequent versions of the code were written for the Microsoft Windows operating system. The purpose of the present verification task (which includes validation as defined in the standard) is to provide an independent review of the latest version of RESRAD-BUILD under the guidance provided by ANSI/ANS-10.4 for verification and validation of existing computer programs. This approach consists of an a posteriori V&V review that takes advantage of available program development products as well as user experience. The purpose, as specified in ANSI/ANS-10.4, is to determine whether the program produces valid responses when used to analyze problems within a specific domain of applications, and to document the level of verification. The culmination of these efforts is the production of this formal Verification Report. The first step in performing the verification of an existing program was the preparation of a Verification Review Plan. The review plan consisted of identifying: reason(s) why a posteriori verification is to be performed; scope and objectives for the level of verification selected; development products to be used for the review; availability and use of user experience; and actions to be taken to supplement missing or unavailable development products. The purpose, scope and objectives for the level of verification selected are described in this section of the Verification Report.
The development products that were used for

  7. Swarm Verification

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "search for the needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide-and-conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
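
The "thousand chickens" idea can be illustrated with a toy sketch: many cheap, randomized, depth-bounded searches of one state space, each seeded differently so their coverage diversifies. This is an illustrative analogy only; the paper's actual method applies diversified search strategies to explicit-state model checking, and the bit-string state space and parameters below are invented.

```python
import random

# Toy swarm sketch: each worker does a few random walks down a binary
# state tree with its own seed; the swarm's coverage is the union of
# all workers' coverage. State space and parameters are invented.

def bounded_random_walks(seed, depth, walks):
    """One swarm worker: random walks from the root, recording states seen."""
    rng = random.Random(seed)
    seen = set()
    for _ in range(walks):
        state = ""
        for _ in range(depth):
            state += rng.choice("01")   # pick one branch of the state tree
            seen.add(state)
    return seen

def swarm(n_workers, depth, walks):
    """Combined coverage of all workers (sequential here; in a real
    swarm each worker is an independent parallel job)."""
    covered = set()
    for seed in range(n_workers):
        covered |= bounded_random_walks(seed, depth, walks)
    return covered

# A single bounded worker covers only part of the space; adding
# differently-seeded workers can only enlarge the covered set.
one = bounded_random_walks(0, depth=4, walks=8)
many = swarm(n_workers=20, depth=4, walks=8)
print(len(one), len(many))
```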

  8. Stormwater Management for Federal Facilities under Section 438 of the Energy Independence and Security Act

    EPA Pesticide Factsheets

    Federal agencies are required to reduce stormwater runoff from federal development and redevelopment projects to protect water resources. Options include a variety of stormwater management practices like green infrastructure or low impact development

  9. Successful Management of Refractory Dialysis Independent Wegener's Granulomatosis with Combination of Therapeutic Plasma Exchange and Rituximab.

    PubMed

    Malhotra, Sheetal; Dhawan, Hari Krishan; Sharma, Ratti Ram; Marwaha, Neelam; Sharma, Aman

    2016-06-01

    Wegener's granulomatosis (WG) is an autoimmune, antineutrophil cytoplasmic antibody-mediated necrotizing vasculitis involving the renal and upper and lower respiratory systems. Treatment relies on a combination of immunosuppressive drugs and a tapering regimen of glucocorticoids. Therapeutic plasma exchange (TPE) has been recognized as a second-line treatment. We report the successful use of TPE in combination with rituximab in achieving remission in a patient with WG (dialysis independent) not responding to conventional therapy.

  10. Current Problems of Improving the Environmental Certification and Output Compliance Verification in the Context of Environmental Management in Kazakhstan

    ERIC Educational Resources Information Center

    Zhambaev, Yerzhan S.; Sagieva, Galia K.; Bazarbek, Bakhytzhan Zh.; Akkulov, Rustem T.

    2016-01-01

    The article discusses the issues of improving the activity of subjects of environmental management in accordance with international environmental standards and national environmental legislation. The article deals with the problem of ensuring the implementation of international environmental standards, the introduction of eco-management, and the…

  11. Crowd-Sourced Program Verification

    DTIC Science & Technology

    2012-12-01

    ... the contractor constructed a prototype of a crowd-sourced verification system that takes as input a given program and produces as output a

  12. THIRD PARTY TECHNOLOGY PERFORMANCE VERIFICATION DATA FROM A STAKEHOLDER-DRIVEN TECHNOLOGY TESTING PROGRAM

    EPA Science Inventory

    The Greenhouse Gas (GHG) Technology Verification Center is one of 12 independently operated verification centers established by the U.S. Environmental Protection Agency. The Center provides third-party performance data to stakeholders interested in environmental technologies tha...

  13. Reasons in Support of Data Security and Data Security Management as Two Independent Concepts: A New Model

    PubMed Central

    Moghaddasi, Hamid; Kamkarhaghighi, Mehran

    2016-01-01

    Introduction: Any information which is generated and saved needs to be protected against accidental or intentional losses and manipulations if it is to be used by the intended users in due time. As information technology has spread, information managers have adopted numerous measures to achieve data security within data storage systems. Background: The “data security models” presented thus far have unanimously highlighted the significance of data security management. For further clarification, the current study first introduces the “needs and improvement” cycle; the study will then present some independent definitions, together with a support umbrella, in an attempt to shed light on data security management. Findings: Data security focuses on three features or attributes known as integrity, identity of sender(s) and identity of receiver(s). Management in data security follows an endless evolutionary process, to keep up with new developments in information technology and communication. In this process, management develops new characteristics with greater capabilities to achieve better data security. The characteristics, continuously increasing in number, with a special focus on control, are as follows: private zone, confidentiality, availability, non-repudiation, possession, accountability, authenticity, authentication and auditability. Conclusion: Data security management steadily progresses, resulting in more sophisticated features. The developments are in line with new developments in information and communication technology and novel advances in intrusion detection systems (IDS). Attention to differences between data security and data security management by international organizations such as the International Organization for Standardization (ISO) and the International Telecommunication Union (ITU) is necessary if information quality is to be enhanced. PMID:27857823

  14. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  15. Independent technical evaluation and recommendations for contaminated groundwater at the department of energy office of legacy management Riverton processing site

    SciTech Connect

    Looney, Brian B.; Denham, Miles E.; Eddy-Dilek, Carol A.

    2014-04-01

    The U.S. Department of Energy Office of Legacy Management (DOE-LM) manages the legacy contamination at the Riverton, WY, Processing Site – a former uranium milling site that operated from 1958 to 1963. The tailings and associated materials were removed in 1988-1989 and contaminants are currently flushing from the groundwater. DOE-LM commissioned an independent technical team to assess the status of the contaminant flushing, identify any issues or opportunities for DOE-LM, and provide key recommendations. The team applied a range of technical frameworks – spatial, temporal, hydrological and geochemical – in performing the evaluation. In each topic area, an in depth evaluation was performed using DOE-LM site data (e.g., chemical measurements in groundwater, surface water and soil, water levels, and historical records) along with information collected during the December 2013 site visit (e.g., plant type survey, geomorphology, and minerals that were observed, collected and evaluated).

  16. Welfare Eligibility: Deficit Reduction Act Income Verification Issues. Fact Sheet for the Ranking Minority Member, Subcommittee on Oversight of Government Management, Committee on Governmental Affairs, United States Senate.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC. Div. of Human Resources.

    This income and eligibility verification system (IEVS) database was created to aid the implementation of data exchanges among federal and state agencies. These exchanges are important for income and eligibility verification of persons who receive benefits from welfare and unemployment programs. Attempts are being made to match the computer…
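
The computer matching the fact sheet refers to is, at its core, a join of benefit records against externally reported wage records on a shared identifier, followed by a discrepancy check. The sketch below illustrates that shape; the record layout, identifiers, and tolerance are hypothetical, not the IEVS schema.

```python
# Sketch of income-and-eligibility cross-matching: join benefit records
# to employer-reported wage records on a shared identifier and flag cases
# whose reported income understates wages by more than a tolerance.
# Record layout and threshold are hypothetical.

def match_discrepancies(benefit_records, wage_records, threshold=100):
    wages = {r["id"]: r["quarterly_wages"] for r in wage_records}
    flagged = []
    for rec in benefit_records:
        actual = wages.get(rec["id"])     # no wage record -> nothing to compare
        if actual is not None and actual - rec["reported_income"] > threshold:
            flagged.append(rec["id"])
    return flagged

benefits = [
    {"id": "A1", "reported_income": 0},
    {"id": "B2", "reported_income": 3000},
]
wages = [
    {"id": "A1", "quarterly_wages": 2500},  # unreported income: flagged
    {"id": "B2", "quarterly_wages": 3050},  # within tolerance: not flagged
]
print(match_discrepancies(benefits, wages))
```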

  17. International Space Station United States Laboratory Module Water Recovery Management Subsystem Verification from Flight 5A to Stage ULF2

    NASA Technical Reports Server (NTRS)

    Williams, David E.; Labuda, Laura

    2009-01-01

    The International Space Station (ISS) Environmental Control and Life Support (ECLS) system comprises seven subsystems: Atmosphere Control and Supply (ACS), Atmosphere Revitalization (AR), Fire Detection and Suppression (FDS), Temperature and Humidity Control (THC), Vacuum System (VS), Water Recovery and Management (WRM), and Waste Management (WM). This paper provides a summary of the nominal operation of the United States (U.S.) Laboratory Module WRM design and the detailed element methodologies utilized during the Qualification phase of the U.S. Laboratory Module prior to launch, as well as the Qualification of all of the modification kits added to it from Flight 5A up to and including Stage ULF2.

  18. An automatic medication self-management and monitoring system for independently living patients.

    PubMed

    McCall, Corey; Maynes, Branden; Zou, Cliff C; Zhang, Ning J

    2013-04-01

    This paper describes the development, prototyping, and evaluation of RMAIS (RFID-based Medication Adherence Intelligence System). Previous work in this field has resulted in devices that are either costly or too complicated for general (especially elderly) patients to operate. RMAIS provides a practical and economical means for ordinary patients to easily manage their own medications, taking the right dosage of medicine at the prescribed time in a fully automatic way. The system design has the following features: (1) fully automatic operation for easy medication by using the built-in scale for dosage measurement and a motorized rotation plate to deliver the right medicine container in front of a patient, (2) various medication reminder messages for patients, and noncompliance alerts for caregivers (such as doctors, relatives or social workers who take care of the patients), and (3) incremental and economical adoption by pharmacies, patients, and insurance companies.

  19. Risk management and market efficiency on the Midwest Independent System Operator electricity exchange

    NASA Astrophysics Data System (ADS)

    Jones, Kevin

    Midwest Independent Transmission System Operator, Inc. (MISO) is a non-profit regional transmission organization (RTO) that oversees electricity production and transmission across thirteen states and one Canadian province. MISO also operates an electronic exchange for buying and selling electricity for each of its five regional hubs. MISO oversees two types of markets. The forward market, which is referred to as the day-ahead (DA) market, allows market participants to place demand bids and supply offers on electricity to be delivered at a specified hour the following day. The equilibrium price, known as the locational marginal price (LMP), is determined by MISO after receiving sale offers and purchase bids from market participants. MISO also coordinates a spot market, which is known as the real-time (RT) market. Traders in the real-time market must submit bids and offers by thirty minutes prior to the hour for which the trade will be executed. After receiving purchase and sale offers for a given hour in the real time market, MISO then determines the LMP for that particular hour. The existence of the DA and RT markets allows producers and retailers to hedge against the large fluctuations that are common in electricity prices. Hedge ratios on the MISO exchange are estimated using various techniques. No hedge ratio technique examined consistently outperforms the unhedged portfolio in terms of variance reduction. Consequently, none of the hedge ratio methods in this study meet the general interpretation of FASB guidelines for a highly effective hedge. One of the major goals of deregulation is to bring about competition and increased efficiency in electricity markets. Previous research suggests that electricity exchanges may not be weak-form market efficient. A simple moving average trading rule is found to produce statistically and economically significant profits on the MISO exchange. This could call the long-term survivability of the MISO exchange into question.
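
The abstract does not specify which hedge ratio techniques were estimated, but the standard baseline is the minimum-variance (OLS) hedge ratio, with hedge effectiveness measured as the fraction of variance removed. The sketch below is illustrative only; the function names and the synthetic data are assumptions, not details from the study.

```python
import numpy as np

def min_variance_hedge_ratio(spot_changes, futures_changes):
    """OLS minimum-variance hedge ratio: h* = Cov(dS, dF) / Var(dF)."""
    spot = np.asarray(spot_changes, dtype=float)
    fut = np.asarray(futures_changes, dtype=float)
    return np.cov(spot, fut, ddof=1)[0, 1] / np.var(fut, ddof=1)

def variance_reduction(spot_changes, futures_changes, h):
    """Fraction of spot-price variance removed by hedging h units
    of the futures (DA) position against the spot (RT) position."""
    spot = np.asarray(spot_changes, dtype=float)
    fut = np.asarray(futures_changes, dtype=float)
    hedged = spot - h * fut
    return 1.0 - np.var(hedged, ddof=1) / np.var(spot, ddof=1)
```

A hedge is commonly called "highly effective" under FASB guidelines when it offsets most of the price variance; the study reports that no estimated ratio consistently cleared that bar on MISO data.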

  20. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, are one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  1. Columbus pressurized module verification

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Comandatore, Emanuele

    1986-01-01

    The baseline verification approach of the COLUMBUS Pressurized Module was defined during the A and B1 project phases. Peculiarities of the verification program are the testing requirements derived from the permanent manned presence in space. The model philosophy and the test program have been developed in line with the overall verification concept. Such critical areas as meteoroid protection, heat pipe radiators, and module seals are identified and tested. Verification problem areas are identified and recommendations for the next development phase are proposed.

  2. Independent management and financial review, Yucca Mountain Project, Nevada. Final report

    SciTech Connect

    1995-07-15

    The Yucca Mountain Project is one part of the Department of Energy's Office of Civilian Radioactive Waste Management Program (the Program), which was established by the Nuclear Waste Policy Act of 1982, as amended in 1987. The Program's goal is to site the nation's first geologic repository for the permanent disposal of high-level nuclear waste, in the form of spent fuel rod assemblies, generated by the nuclear power industry and a smaller quantity of Government radioactive waste. The Program, which also encompasses the transportation system and the multipurpose canister system, was not the subject of this Report. The subject of this Review was only the Yucca Mountain Project in Nevada. While the Review was directed toward the Yucca Mountain Project rather than the Program as a whole, there are certain elements of the Project which cannot be addressed except through discussion of some Program issues. An example is the Total System Life Cycle Cost addressed in Section 7 of this report. Where Program issues are discussed in this Report, the reader is reminded of the scope limitations of the National Association of Regulatory Utility Commissioners (NARUC) contract to review only the Yucca Mountain Project. The primary scope of the Review was to respond to the specific criteria contained in the NARUC scope of work. In responding to these criteria, the Review Team understood that some interested parties have expressed concern over the requirements of the Nuclear Waste Policy Act relative to the Yucca Mountain Project and the nature of activities currently being carried out by the Department of Energy at the Yucca Mountain Project site. The Review Team has attempted to analyze relevant portions of the Nuclear Waste Policy Act as Amended, but has not conducted a thorough analysis of this legislation that could lead to any specific legal conclusions about all aspects of it.

  3. Sensor to User - NASA/EOS Data for Coastal Zone Management Applications Developed from Integrated Analyses: Verification, Validation and Benchmark Report

    NASA Technical Reports Server (NTRS)

    Hall, Callie; Arnone, Robert

    2006-01-01

    The NASA Applied Sciences Program seeks to transfer NASA data, models, and knowledge into the hands of end-users by forming links with partner agencies and associated decision support tools (DSTs). Through the NASA REASoN (Research, Education and Applications Solutions Network) Cooperative Agreement, the Oceanography Division of the Naval Research Laboratory (NRLSSC) is developing new products through the integration of data from NASA Earth-Sun System assets with coastal ocean forecast models and other available data to enhance coastal management in the Gulf of Mexico. The recipient federal agency for this research effort is the National Oceanic and Atmospheric Administration (NOAA). The contents of this report detail the effort to further the goals of the NASA Applied Sciences Program by demonstrating the use of NASA satellite products combined with data-assimilating ocean models to provide near real-time information to maritime users and coastal managers of the Gulf of Mexico. This effort provides new and improved capabilities for monitoring, assessing, and predicting the coastal environment. Coastal managers can exploit these capabilities through enhanced DSTs at federal, state and local agencies. The project addresses three major issues facing coastal managers: 1) Harmful Algal Blooms (HABs); 2) hypoxia; and 3) freshwater fluxes to the coastal ocean. A suite of ocean products capable of describing Ocean Weather is assembled on a daily basis as the foundation for this semi-operational multiyear effort. This continuous real-time capability brings decision makers a new ability to monitor both normal and anomalous coastal ocean conditions with a steady flow of satellite and ocean model conditions. Furthermore, as the baseline data sets are used more extensively and the customer list increases, customer feedback is obtained and additional customized products are developed and provided to decision makers. Continual customer feedback and response with new improved

  4. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library, one that maintains and manages test and validation data, is proposed, based on RRC Robot arm and CASE backhoe validation and a comparative study of DADS, DISCOS, and CONTOPS, which are existing public domain and commercial multibody dynamic simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.

  5. M&V Guidelines: Measurement and Verification for Performance-Based Contracts Version 4.0

    SciTech Connect

    2015-11-02

    Document outlines the Federal Energy Management Program's standard procedures and guidelines for measurement and verification (M&V) for federal energy managers, procurement officials, and energy service providers.

  6. Event-Based Prospective Memory Is Independently Associated with Self-Report of Medication Management in Older Adults

    PubMed Central

    Woods, Steven Paul; Weinborn, Michael; Maxwell, Brenton R.; Gummery, Alice; Mo, Kevin; Ng, Amanda R. J.; Bucks, Romola S.

    2014-01-01

    Background Identifying potentially modifiable risk factors for medication non-adherence in older adults is important in order to enhance screening and intervention efforts designed to improve medication-taking behavior and health outcomes. The current study sought to determine the unique contribution of prospective memory (i.e., “remembering to remember”) to successful self-reported medication management in older adults. Methods Sixty-five older adults with current medication prescriptions completed a comprehensive research evaluation of sociodemographic, psychiatric, and neurocognitive functioning, which included the Memory for Adherence to Medication Scale (MAMS), Prospective and Retrospective Memory Questionnaire (PRMQ), and a performance-based measure of prospective memory that measured both semantically-related and semantically-unrelated cue-intention (i.e., when-what) pairings. Results A series of hierarchical regressions controlling for biopsychosocial, other neurocognitive, and medication-related factors showed that elevated complaints on the PM scale of the PRMQ and worse performance on an objective semantically-unrelated event-based prospective memory task were independent predictors of poorer medication adherence as measured by the MAMS. Conclusions Prospective memory plays an important role in self-report of successful medication management among older adults. Findings may have implications for screening for older individuals “at risk” of non-adherence, as well as the development of prospective memory-based interventions to improve medication adherence and, ultimately, long-term health outcomes in older adults. PMID:24410357

  7. 77 FR 60714 - Information Collection Activities: Legacy Data Verification Process (LDVP); Submitted for Office...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-04

    ... Bureau of Safety and Environmental Enforcement Information Collection Activities: Legacy Data Verification Process (LDVP); Submitted for Office of Management and Budget (OMB) Review; Comment Request ACTION... the paperwork requirements for the Notice to Lessees (NTL) on the Legacy Data Verification...

  8. Modeling in the State Flow Environment to Support Launch Vehicle Verification Testing for Mission and Fault Management Algorithms in the NASA Space Launch System

    NASA Technical Reports Server (NTRS)

    Trevino, Luis; Berg, Peter; England, Dwight; Johnson, Stephen B.

    2016-01-01

    Analysis methods and testing processes are essential activities in the engineering development and verification of the National Aeronautics and Space Administration's (NASA) new Space Launch System (SLS). Central to mission success is reliable verification of the Mission and Fault Management (M&FM) algorithms for the SLS launch vehicle (LV) flight software. This is particularly difficult because M&FM algorithms integrate and operate LV subsystems, which consist of diverse forms of hardware and software themselves, with equally diverse integration from the engineering disciplines of LV subsystems. M&FM operation of SLS requires a changing mix of LV automation. During pre-launch the LV is primarily operated by the Kennedy Space Center (KSC) Ground Systems Development and Operations (GSDO) organization with some LV automation of time-critical functions, and much more autonomous LV operations during ascent that have crucial interactions with the Orion crew capsule, its astronauts, and with mission controllers at the Johnson Space Center. M&FM algorithms must perform all nominal mission commanding via the flight computer to control LV states from pre-launch through disposal and also address failure conditions by initiating autonomous or commanded aborts (crew capsule escape from the failing LV), redundancy management of failing subsystems and components, and safing actions to reduce or prevent threats to ground systems and crew. To address the criticality of the verification testing of these algorithms, the NASA M&FM team has utilized the State Flow environment (SFE) with its existing Vehicle Management End-to-End Testbed (VMET) platform, which also hosts vendor-supplied physics-based LV subsystem models. The human-derived M&FM algorithms are designed and vetted in Integrated Development Teams composed of design and development disciplines such as Systems Engineering, Flight Software (FSW), Safety and Mission Assurance (S&MA) and major subsystems and vehicle elements

  9. SU-E-T-77: A Statistical Approach to Manage Quality for Pre-Treatment Verification in IMRT/VMAT

    SciTech Connect

    Jassal, K; Sarkar, B; Mohanti, B; Roy, S; Ganesh, T; Munshi, A; Chougule, A; Sachdev, K

    2015-06-15

    Objective: The study presents the application of a simple statistical process control (SPC) concept to the analysis of pre-treatment quality assurance procedures for planar dose measurements performed using a 2D array and an a-Si electronic portal imaging device (a-Si EPID). Method: A total of 195 patients across four anatomical sites were selected for the study: brain (n1=45), head & neck (n2=45), thorax (n3=50) and pelvis (n4=55). Pre-treatment quality assurance for the clinically acceptable IMRT/VMAT plans was measured with the 2D array and the accelerator's a-Si EPID. After the γ-analysis, control charts and the quality index Cpm were evaluated for each cohort. Results: The mean and σ of the γ (3%/3 mm) pass rates were 99.9% ± 1.15% for the a-Si EPID and 99.6% ± 1.06% for the array. Among all plans, γmax was consistently lower for the 2D array than for the a-Si EPID. Fig. 1 presents the X-bar control charts for every cohort. Cpm values for the a-Si EPID were found to be higher than for the array; detailed results are presented in Table 1. Conclusion: The present study demonstrates the value of control charts for quality management in newer radiotherapy clinics and provides a pictorial overview of clinic performance for advanced radiotherapy techniques. Higher Cpm values for the EPID indicate its higher efficiency compared with array-based measurements.
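
The Cpm quality index referenced in the abstract is the Taguchi process capability index, which penalizes both spread and deviation from a target value. A minimal sketch of the standard formula follows; the specification limits and sample values are hypothetical, not taken from the study.

```python
import math

def cpm(values, lsl, usl, target):
    """Taguchi capability index: Cpm = (USL - LSL) / (6 * tau),
    where tau^2 is the mean squared deviation from the target.
    Larger Cpm means the process sits tightly on target relative
    to the width of the specification band."""
    n = len(values)
    tau = math.sqrt(sum((x - target) ** 2 for x in values) / n)
    return (usl - lsl) / (6.0 * tau)
```

For gamma pass-rate QA data, the target would typically be 100% with a lower specification limit set by the clinic's action level.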

  10. Radiography Facility - Building 239 Independent Validation Review

    SciTech Connect

    Altenbach, T J; Beaulieu, R A; Watson, J F; Wong, H J

    2010-02-02

    The purpose of this task was to perform an Independent Validation Review to evaluate the successful implementation and effectiveness of Safety Basis controls, including new and revised controls, to support the implementation of a new DSA/TSR for B239. This task addresses Milestone 2 of FY10 PEP 7.6.6. As the first IVR ever conducted on an LLNL nuclear facility, it was designated a pilot project. The review follows the outline developed for Milestone 1 of the PEP, which is based on the DOE Draft Guide for Performance of Independent Verification Review of Safety Basis Controls. A formal Safety Basis procedure will be developed later, based on the lessons learned with this pilot project. Note, this review is termed a "Validation" in order to be consistent with the PEP definition and to address issues historically raised about verification mechanisms at LLNL. Validation is intended to confirm that implementing mechanisms realistically establish the ability of a TSR LCO, administrative control, or safety management program to accomplish its intended safety function and that the controls are being implemented. This effort should not, however, be confused with a compliance assessment against all relevant DOE requirements and national standards. Nor is it used as a vehicle to question the derivation of controls already approved by LSO unless a given TSR statement simply cannot be implemented as stated.

  11. INTERIM REPORT--INDEPENDENT VERIFICATION SURVEY OF SECTION 3, SURVEY UNITS 1, 4 AND 5 EXCAVATED SURFACES, WHITTAKER CORPORATION, REYNOLDS INDUSTRIAL PARK, TRANSFER, PENNSYLVANIA DCN: 5002-SR-04-0

    SciTech Connect

    ADAMS, WADE C

    2013-04-18

    At the Pennsylvania Department of Environmental Protection's request, ORAU's IEAV program conducted verification surveys on the excavated surfaces of Section 3, SUs 1, 4, and 5 at the Whittaker site on March 13 and 14, 2013. The survey activities included visual inspections, gamma radiation surface scans, gamma activity measurements, and soil sampling activities. Verification activities also included the review and assessment of the licensee's project documentation and methodologies. Surface scans identified four areas of elevated direct gamma radiation distinguishable from background; one area within SUs 1 and 4 and two areas within SU5. One area within SU5 was remediated by removing a golf-ball-sized piece of slag while ORAU staff was onsite. With the exception of the golf-ball-sized piece of slag within SU5, a review of the ESL Section 3 EXS data packages for SUs 1, 4, and 5 indicated that these locations of elevated gamma radiation were also identified by the ESL gamma scans and that ESL personnel performed additional investigations and soil sampling within these areas. The investigative results indicated that the areas met the release criteria.

  12. HDM/PASCAL Verification System User's Manual

    NASA Technical Reports Server (NTRS)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided, such as maintaining a database, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION OF BAGHOUSE FILTRATION PRODUCTS

    EPA Science Inventory

    The Environmental Technology Verification Program (ETV) was started by EPA in 1995 to generate independent credible data on the performance of innovative technologies that have potential to improve protection of public health and the environment. ETV does not approve or certify p...

  14. Design of experiments with multiple independent variables: a resource management perspective on complete and reduced factorial designs.

    PubMed

    Collins, Linda M; Dziak, John J; Li, Runze

    2009-09-01

    An investigator who plans to conduct an experiment with multiple independent variables must decide whether to use a complete or reduced factorial design. This article advocates a resource management perspective on making this decision, in which the investigator seeks a strategic balance between service to scientific objectives and economy. Considerations in making design decisions include whether research questions are framed as main effects or simple effects; whether and which effects are aliased (confounded) in a particular design; the number of experimental conditions that must be implemented in a particular design and the number of experimental subjects the design requires to maintain the desired level of statistical power; and the costs associated with implementing experimental conditions and obtaining experimental subjects. In this article, four design options are compared: complete factorial, individual experiments, single factor, and fractional factorial. Complete and fractional factorial designs and single-factor designs are generally more economical than conducting individual experiments on each factor. Although relatively unfamiliar to behavioral scientists, fractional factorial designs merit serious consideration because of their economy and versatility.
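
To make the aliasing idea concrete: in a two-level half-fraction design, one factor's column is set equal to the interaction of the others, which halves the run count at the cost of confounding that factor with the interaction. The sketch below is a generic illustration of this construction, not a procedure from the article.

```python
from itertools import product

def full_factorial(k):
    """All 2^k runs of a two-level design, levels coded -1/+1."""
    return [list(run) for run in product((-1, 1), repeat=k)]

def half_fraction(k):
    """2^(k-1) half fraction: the last factor is aliased with the
    interaction of the first k-1 factors (defining relation I = AB...K),
    so it is confounded with that interaction."""
    runs = []
    for base in product((-1, 1), repeat=k - 1):
        last = 1
        for level in base:
            last *= level  # last factor = product of the others
        runs.append(list(base) + [last])
    return runs
```

With k = 3, the half fraction needs 4 runs instead of 8, but the main effect of C cannot be distinguished from the AB interaction.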

  15. Reliable Entanglement Verification

    NASA Astrophysics Data System (ADS)

    Arrazola, Juan; Gittsovich, Oleg; Donohue, John; Lavoie, Jonathan; Resch, Kevin; Lütkenhaus, Norbert

    2013-05-01

    Entanglement plays a central role in quantum protocols. It is therefore important to be able to verify the presence of entanglement in physical systems from experimental data. In the evaluation of these data, the proper treatment of statistical effects requires special attention, as one can never claim to have verified the presence of entanglement with certainty. Recently, increased attention has been paid to the development of proper frameworks to pose and to answer these types of questions. In this work, we apply recent results by Christandl and Renner on reliable quantum state tomography to construct a reliable entanglement verification procedure based on the concept of confidence regions. The statements made do not require the specification of a prior distribution nor the assumption of an independent and identically distributed (i.i.d.) source of states. Moreover, we develop efficient numerical tools that are necessary to employ this approach in practice, rendering the procedure ready to be employed in current experiments. We demonstrate this fact by analyzing the data of an experiment where photonic entangled two-photon states were generated and whose entanglement is verified with the use of an accessible nonlinear witness.

  16. Cold fusion verification

    NASA Astrophysics Data System (ADS)

    North, M. H.; Mastny, G. F.; Wesley, E. J.

    1991-03-01

    The objective of this work was to verify and reproduce experimental observations of Cold Nuclear Fusion (CNF), as originally reported in 1989. The method was to start with the original report and add such additional information as became available to build a set of operational electrolytic CNF cells. Verification was to be achieved by first observing cells for neutron production, and for those cells that demonstrated a nuclear effect, careful calorimetric measurements were planned. The authors concluded, after laboratory experience, reading published work, talking with others in the field, and attending conferences, that CNF probably is a chimera and will go the way of N-rays and polywater. The neutron detector used for these tests was a completely packaged unit built into a metal suitcase that afforded electrostatic shielding for the detectors and self-contained electronics. It was battery-powered, although it was on charge for most of the long tests. The sensor element consists of He detectors arranged in three independent layers in a solid moderating block. The count from each of the three layers, as well as the sum of all the detectors, was brought out and recorded separately. The neutron measurements were made with both the neutron detector and the sample tested in a cave made of thick moderating material that surrounded the two units on the sides and bottom.

  17. Verification issues for rule-based expert systems

    NASA Technical Reports Server (NTRS)

    Culbert, Chris; Riley, Gary; Savely, Robert T.

    1987-01-01

    Verification and validation of expert systems is very important for the future success of this technology. Software will never be used in non-trivial applications unless the program developers can assure both users and managers that the software is reliable and generally free from error. Therefore, verification and validation of expert systems must be done. The primary hindrance to effective verification and validation is the use of methodologies which do not produce testable requirements. An extension of the flight technique panels used in previous NASA programs should provide both documented requirements and very high levels of verification for expert systems.

  18. Fuel Retrieval System Design Verification Report

    SciTech Connect

    GROTH, B.D.

    2000-04-11

    The Fuel Retrieval Subproject was established as part of the Spent Nuclear Fuel Project (SNF Project) to retrieve and repackage the SNF located in the K Basins. The Fuel Retrieval System (FRS) construction work is complete in the KW Basin, and start-up testing is underway. Design modifications and construction planning are also underway for the KE Basin. An independent review of the design verification process as applied to the K Basin projects was initiated in support of preparation for the SNF Project operational readiness review (ORR). A Design Verification Status Questionnaire (Table 1) is included, which addresses Corrective Action SNF-EG-MA-EG-20000060, Item No. 9 (Miller 2000).

  19. CS-10 Verification Survey at Former McClellan AFB, Sacramento, CA

    DTIC Science & Technology

    2013-09-26

    An independent radiological assessment/verification survey was conducted from 20-22 Feb 2013 at site CS-10 on former McClellan AFB, CA. Radium-226 (Ra-226) was the sole radionuclide of concern. Keywords: USAF School of Aerospace Medicine (USAFSAM), former McClellan AFB, radium-226, verification survey, final status survey, independent radiological assessment.

  20. 24 CFR 985.3 - Indicators, HUD verification methods and ratings.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... in Federal awards and whose Section 8 programs are not audited by an independent auditor (IA), will... annual IA audit report is a HUD verification method. For those PHAs, the SEMAP score and overall... list. (24 CFR 982.54(d)(1) and 982.204(a)) (2) HUD verification method: The independent auditor...

  1. Voltage verification unit

    DOEpatents

    Martin, Edward J.

    2008-01-15

    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol that involves a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out and, once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit, all without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  2. Greater years of maternal schooling and higher scores on academic achievement tests are independently associated with improved management of child diarrhea by rural Guatemalan mothers.

    PubMed

    Webb, Aimee L; Ramakrishnan, Usha; Stein, Aryeh D; Sellen, Daniel W; Merchant, Moeza; Martorell, Reynaldo

    2010-09-01

    Appropriate home management can alleviate many of the consequences of diarrhea, including malnutrition, impaired development, growth faltering, and mortality. Maternal cognitive ability, years of schooling, and acquired academic skills are hypothesized to improve child health by improving maternal child care practices, such as illness management. Using information collected longitudinally in 1996-1999 from 466 rural Guatemalan women with children <36 months, we examined the independent associations between maternal years of schooling, academic skills, and scores on the Raven's Progressive Matrices and an illness management index (IMI). Women scoring in the lowest and middle tertiles of academic skills scored lower on the IMI compared to women in the highest tertile (-0.24 [95% CI: -0.54, 0.07]; -0.30 [95% CI: -0.54, -0.06], respectively), independent of sociodemographic factors, schooling, and Raven's scores. Among mothers with less than 1 year of schooling, scoring in the lowest tertile on the Raven's Progressive Matrices compared to the highest was significantly associated with scoring one point lower on the IMI (-1.18 [95% CI: -2.20, -0.17]). Greater academic skills were independently associated with maternal care during episodes of infant diarrhea. Schooling of young girls and/or community-based programs that provide women with academic skills such as literacy, numeracy and knowledge could potentially improve mothers' caregiving practices.

  3. Can self-verification strivings fully transcend the self-other barrier? Seeking verification of ingroup identities.

    PubMed

    Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B

    2009-12-01

    Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1 to 3), the findings also show that people strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.

  4. [The implementation of an independent and differentiated pain management SOP (Standard Operating Procedure) for the interdisciplinary intensive care unit].

    PubMed

    Aust, Hansjörg; Wulf, Hinnerk; Vassiliou, Timon

    2013-03-01

    Up to the present day, pain management in the ICU (intensive care unit) remains an unresolved clinical problem due to patient heterogeneity, with complex variation in the etiopathology and treatment of the underlying diseases. Therapeutic strategies in the form of a standard operating procedure (SOP) are therefore necessary to improve pain management for intensive care patients. Common guidelines for analgosedation are often inadequate to reflect the clinical situation. In particular, in an ICU setting without the permanent presence of a physician, a missing pain management SOP results in delayed pain therapy caused by therapeutic uncertainty among the nursing staff. In addition to our pre-existing SOP for analgosedation, we implemented a pain management SOP for our interdisciplinary, anaesthesiological ICU. An exploratory survey among the nursing staff was conducted to assess the efficacy of the SOP. The results of the evaluation after a 6-month follow-up indicated a faster onset of pain management and good acceptance by the nursing staff.

  5. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Deney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.
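The labeling idea can be sketched concretely. The toy below is an illustrative reconstruction, not the paper's calculus: a weakest-precondition pass threads a label trail through each Hoare rule, so every generated VC carries a human-readable explanation of its origin. The statement encoding, label strings, and simplifications (naive textual substitution; the assert rule drops the postcondition conjunct, which is harmless when the postcondition is True) are all invented for illustration.

```python
# Toy labeled weakest-precondition calculator (invented encoding).
def wp(stmt, post, trail):
    """Return (precondition, labeled VCs) for stmt against postcondition post."""
    kind = stmt[0]
    if kind == "assign":                      # ("assign", var, expr)
        _, var, expr = stmt
        # Naive textual substitution stands in for capture-avoiding substitution.
        return post.replace(var, f"({expr})"), []
    if kind == "assert":                      # ("assert", cond)
        _, cond = stmt
        vc = {"condition": cond,
              "explanation": trail + ["established by assert"]}
        # Simplified rule: ignores post (fine here, where post is "True").
        return cond, [vc]
    if kind == "seq":                         # ("seq", s1, s2)
        _, s1, s2 = stmt
        mid, vcs2 = wp(s2, post, trail + ["second half of sequence"])
        pre, vcs1 = wp(s1, mid, trail + ["first half of sequence"])
        return pre, vcs1 + vcs2
    raise ValueError(f"unknown statement kind: {kind}")

prog = ("seq", ("assign", "x", "x + 1"), ("assert", "x > 0"))
pre, vcs = wp(prog, "True", ["entry of program"])
print(pre)                      # precondition after substitution
print(vcs[0]["explanation"])    # label trail rendered for the single VC
```

Because the labels are carried by the calculus itself, the explanation of each VC can be rendered without re-inspecting the VC's logical content, which is the point the abstract makes.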

  6. Nuclear disarmament verification

    SciTech Connect

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  7. Voice verification upgrade

    NASA Astrophysics Data System (ADS)

    Davis, R. L.; Sinnamon, J. T.; Cox, D. L.

    1982-06-01

    This contractor had two major objectives. The first was to build, test, and deliver to the government an entry control system using speaker verification (voice authentication) as the mechanism for verifying the user's claimed identity. This system included a physical mantrap, with an integral weight scale to prevent more than one user from gaining access with one verification (tailgating). The speaker verification part of the entry control system contained all the updates and embellishments to the algorithm that was developed earlier for the BISS (Base and Installation Security System) system under contract with the Electronic Systems Division of the USAF. These updates were tested prior to and during the contract on an operational system used at Texas Instruments in Dallas, Texas, for controlling entry to the Corporate Information Center (CIC).

  8. Verification and validation benchmarks.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
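The manufactured-solutions recommendation above can be illustrated with a minimal sketch (not taken from the paper): manufacture an exact solution u(x) = sin(x) for the model problem u'' = f, derive the forcing term f = -sin(x), solve with a second-order finite-difference scheme, and confirm that the observed convergence order matches the theoretical value of 2.

```python
# Minimal method-of-manufactured-solutions check (illustrative only).
import math

def solve_poisson(n):
    """Solve u'' = f on (0, pi), u(0) = u(pi) = 0, with f = -sin(x),
    so the manufactured exact solution is u = sin(x). Returns max error."""
    h = math.pi / n
    x = [i * h for i in range(n + 1)]
    # Centered difference, rearranged to -u[i-1] + 2u[i] - u[i+1] = h^2 sin(x_i)
    a, b, c = -1.0, 2.0, -1.0                      # sub-, main-, super-diagonal
    d = [h * h * math.sin(xi) for xi in x[1:n]]    # right-hand side
    m = n - 1                                      # interior unknowns
    cp, dp = [0.0] * m, [0.0] * m                  # Thomas algorithm sweeps
    cp[0], dp[0] = c / b, d[0] / b
    for i in range(1, m):
        denom = b - a * cp[i - 1]
        cp[i] = c / denom
        dp[i] = (d[i] - a * dp[i - 1]) / denom
    u = [0.0] * m
    u[-1] = dp[-1]
    for i in range(m - 2, -1, -1):
        u[i] = dp[i] - cp[i] * u[i + 1]
    return max(abs(ui - math.sin(xi)) for ui, xi in zip(u, x[1:n]))

e1, e2 = solve_poisson(40), solve_poisson(80)
order = math.log(e1 / e2) / math.log(2)
print(f"observed order ~ {order:.2f}")   # should approach the theoretical 2
```

The same pattern scales to real codes: because the manufactured solution is known exactly, the discretization error is observable directly, which is what makes such benchmarks suitable for code (as opposed to solution) verification.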

  9. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "Did the product meet the stated specification, performance, or design documentation?" In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using the verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts", which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the system engineer in NPR 7123.1A, NASA Systems Engineering Processes and Requirements, with regard to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  10. Cognitive Bias in Systems Verification

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: Patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: Economics, Politics, Intelligence, Marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways. Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. Worth considering at key points in the process.

  11. General Environmental Verification Specification

    NASA Technical Reports Server (NTRS)

    Milne, J. Scott, Jr.; Kaufman, Daniel S.

    2003-01-01

    The NASA Goddard Space Flight Center's General Environmental Verification Specification (GEVS) for STS and ELV Payloads, Subsystems, and Components is currently being revised based on lessons learned from GSFC engineering and flight assurance. The GEVS has been used by Goddard flight projects for the past 17 years as a baseline from which to tailor their environmental test programs. A summary of the requirements and updates is presented along with the rationale behind the changes. The major test areas covered by the GEVS include mechanical, thermal, and EMC, as well as more general requirements for planning and tracking of the verification programs.

  12. Voice Verification Upgrade.

    DTIC Science & Technology

    1982-06-01

    to develop speaker verification techniques for use over degraded communication channels -- specifically telephone lines. A test of BISS type speaker...verification technology was performed on a degraded channel and compensation techniques were then developed. The fifth program (Total Voice SV...Voice Verification Upgrade, R. L. Davis, J. T. Sinnamon, and D. L. Cox. Rome Air Development Center

  13. Are Independent Probes Truly Independent?

    ERIC Educational Resources Information Center

    Camp, Gino; Pecher, Diane; Schmidt, Henk G.; Zeelenberg, Rene

    2009-01-01

    The independent cue technique has been developed to test traditional interference theories against inhibition theories of forgetting. In the present study, the authors tested the critical criterion for the independence of independent cues: Studied cues not presented during test (and unrelated to test cues) should not contribute to the retrieval…

  14. 76 FR 54810 - Submission for Review: 3206-0215, Verification of Full-Time School Attendance, RI 25-49

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-02

    ... MANAGEMENT Submission for Review: 3206-0215, Verification of Full-Time School Attendance, RI 25-49 AGENCY: U..., Verification of Full-Time School Attendance. As required by the Paperwork Reduction Act of 1995 (Pub. L. 104-13... to (202) 395-6974. SUPPLEMENTARY INFORMATION: RI 25-49, Verification of Full-Time School...

  15. 76 FR 29805 - Submission for Review: Verification of Full-Time School Attendance, RI 25-49

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-23

    ... MANAGEMENT Submission for Review: Verification of Full-Time School Attendance, RI 25-49 AGENCY: U.S. Office... opportunity to comment on a revised information collection request (ICR) 3206-0215, Verification of Full-Time...@opm.gov or faxed to (202) 606-0910. SUPPLEMENTARY INFORMATION: RI 25-49, Verification of...

  16. Environment, Safety and Health independent evaluation of Fernald Environmental Restoration Management Company`s (FERMCO) Comprehensive Environmental Occupational Safety and Health Program (CEOSHP)

    SciTech Connect

    Not Available

    1994-04-01

    The Office of Environmental Management (EM) requested the Office of Environment, Safety and Health (EH) to perform an independent evaluation of Fernald Environmental Restoration Management Corporation's (FERMCO's) Comprehensive Environmental Occupational Safety and Health Program (CEOSHP) document. In 1992, FERMCO was awarded the Department of Energy's (DOE) first Environmental Restoration Management Contract and developed the CEOSHP to respond to contract requirements. EH limited its review to the CEOSHP because this document constitutes FERMCO's written environment, safety and health (ES&H) program document and thus provides the basis for FERMCO's ES&H program. EH's independent review identified several major areas of the CEOSHP that need to be revised if it is to function successfully as the program-level document for FERMCO's environment, safety and health program. The problems identified occur throughout the document and apply across the three CEOSHP sections evaluated by EH: the Occupational Safety and Health program, the Environmental Protection program, and the Radiological Control program. Primary findings: the CEOSHP (1) does not fully reflect the occupational safety and health, environmental protection, and radiological control requirements of the Department; (2) does not convey a strong sense of management leadership of the program or clearly delineate employee rights, responsibilities, and roles in FERMCO's ES&H program; (3) is not a program management-level document; (4) does not describe a "seamless" ES&H program; and (5) does not clearly convey how FERMCO's ES&H program actually works. EH's detailed evaluation of FERMCO's CEOSHP, along with specific recommendations, is presented in Sections 2, 3, and 4 of this report. EH believes that EM will find this review and analysis useful in its efforts to assist FERMCO in a comprehensive redrafting of the CEOSHP.

  17. Competencies of Leaders and Managers in Educational Research and Development. Independent Research and Development Project Reports. Report No. 4.

    ERIC Educational Resources Information Center

    DeAnda, Natividad

    This report clarifies pilot efforts which address new problem areas in educational needs. The project was initiated to determine the specific competencies essential to the management of educational projects in research and development. The goals of the project were to establish a profile of identified competencies for use in planning the content…

  18. Computer Graphics Verification

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Video processing creates technical animation sequences using studio quality equipment to realistically represent fluid flow over space shuttle surfaces, helicopter rotors, and turbine blades. Computer Systems Co-op Tim Weatherford performing computer graphics verification. Part of Co-op brochure.

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    This presentation will be given at the EPA Science Forum 2005 in Washington, DC. The Environmental Technology Verification Program (ETV) was initiated in 1995 to speed implementation of new and innovative commercial-ready environmental technologies by providing objective, 3rd pa...

  20. FPGA Verification Accelerator (FVAX)

    NASA Technical Reports Server (NTRS)

    Oh, Jane; Burke, Gary

    2008-01-01

    Is Verification Acceleration Possible? - Increasing the visibility of the internal nodes of the FPGA results in much faster debug time - Forcing internal signals directly allows a problem condition to be set up very quickly. Is this all? - No, this is part of a comprehensive effort to improve the JPL FPGA design and V&V process.

  1. Exomars Mission Verification Approach

    NASA Astrophysics Data System (ADS)

    Cassi, Carlo; Gilardi, Franco; Bethge, Boris

    According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions: A first mission will be launched in 2016 under ESA lead, with the objectives to demonstrate the European capability to safely land a surface package on Mars, to perform Mars atmosphere investigation, and to provide communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft-composite, made up of an "Entry Descent & Landing Demonstrator Module (EDM)" and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the orbiter for Mars atmosphere characterisation. A second mission, with its launch foreseen in 2018, is led by NASA, who provides the spacecraft and launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 meters, collecting samples and investigating them for signs of past and present life with exobiological experiments, and investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e.: the Spacecraft Composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process of the above products is quite complex and will include some peculiarities with limited or no heritage in Europe. Furthermore the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the verification activities flow and the sharing of tests

  2. Integrated testing and verification system for research flight software

    NASA Technical Reports Server (NTRS)

    Taylor, R. N.

    1979-01-01

    The MUST (Multipurpose User-oriented Software Technology) program is being developed to cut the cost of producing research flight software through a system of software support tools. An integrated verification and testing capability was designed as part of MUST. Documentation, verification and test options are provided with special attention on real-time, multiprocessing issues. The needs of the entire software production cycle were considered, with effective management and reduced lifecycle costs as foremost goals.

  3. INDEPENDENT TECHNICAL ASSESSMENT OF MANAGEMENT OF STORMWATER AND WASTEWATER AT THE SEPARATIONS PROCESS RESEARCH UNIT (SPRU) DISPOSITION PROJECT, NEW YORK

    SciTech Connect

    Abitz, R.; Jackson, D.; Eddy-Dilek, C.

    2011-06-27

    The U.S. Department of Energy (DOE) is currently evaluating the water management procedures at the Separations Process Research Unit (SPRU). The facility has three issues related to water management that require technical assistance: (1) due to an excessive rainfall event in October 2010, contaminated water collected in the basements of the G2 and H2 buildings, and as a result of this event the contractor has had to collect and dispose of water offsite; (2) the failure of a sump pump at a KAPL outfall resulted in a Notice of Violation issued by the New York State Department of Environmental Conservation (NYSDEC) and a subsequent Consent Order, and on-site water now requires treatment and off-site disposition; and (3) stormwater infiltration has resulted in Strontium-90 levels discharged to the storm drains that exceed NR standards. The contractor has indicated that water management at SPRU requires major staff resources (at least 50 persons). The purpose of this review is to determine whether the contractor's technical approach warrants the large number of staff resources and to ensure that the technical approach is compliant with federal, state, and NR requirements.

  4. Improved Verification for Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Powell, Mark A.

    2008-01-01

    Aerospace systems are subject to many stringent performance requirements to be verified with low risk. This report investigates verification planning using conditional approaches vice the standard classical statistical methods, and usage of historical surrogate data for requirement validation and in verification planning. The example used in this report to illustrate the results of these investigations is a proposed mission assurance requirement with the concomitant maximum acceptable verification risk for the NASA Constellation Program Orion Launch Abort System (LAS). This report demonstrates the following improvements: 1) verification planning using conditional approaches vice classical statistical methods results in plans that are more achievable and feasible; 2) historical surrogate data can be used to bound validation of performance requirements; and, 3) incorporation of historical surrogate data in verification planning using conditional approaches produces even less costly and more reasonable verification plans. The procedures presented in this report may produce similar improvements and cost savings in verification for any stringent performance requirement for an aerospace system.
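The gap between classical and conditional planning can be illustrated with a generic zero-failure reliability demonstration. The sketch below uses invented numbers and a uniform Beta prior; it does not reproduce the LAS requirement or the report's actual models, only the general idea that crediting surrogate data shrinks the required verification effort.

```python
# Classical vs. Bayesian (conditional) zero-failure test planning sketch.
import math

def classical_trials(reliability, confidence):
    """Zero-failure trials n such that 1 - reliability**n >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

def bayesian_trials(reliability, confidence, prior_successes):
    """Extra zero-failure trials needed when prior_successes surrogate
    successes seed a uniform Beta prior, giving the posterior
    P(R > r) = 1 - r**(1 + prior_successes + n) for zero total failures."""
    n = 0
    while 1.0 - reliability ** (1 + prior_successes + n) < confidence:
        n += 1
    return n

r, c = 0.999, 0.90                   # invented reliability target / confidence
print(classical_trials(r, c))        # thousands of trials starting from scratch
print(bayesian_trials(r, c, 1500))   # far fewer once surrogate data are credited
```

The design choice mirrors the report's argument: the conditional plan conditions the same acceptance criterion on historical evidence, so the remaining verification burden becomes achievable and less costly.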

  5. Endodontic management of a C-shaped maxillary first molar with three independent buccal root canals by using cone-beam computed tomography

    PubMed Central

    Karanxha, Lorena; Kim, Hee-Jin; Hong, Sung-Ok; Lee, Wan; Kim, Pyung-Sik

    2012-01-01

    The aim of this study was to present a method for endodontic management of a maxillary first molar with unusual C-shaped morphology of the buccal root verified by cone-beam computed tomography (CBCT) images. This rare anatomical variation was confirmed using CBCT, and nonsurgical endodontic treatment was performed by meticulous evaluation of the pulpal floor. Posttreatment image revealed 3 independent canals in the buccal root obturated efficiently to the accepted lengths in all 3 canals. Our study describes a unique C-shaped variation of the root canal system in a maxillary first molar, involving the 3 buccal canals. In addition, our study highlights the usefulness of CBCT imaging for accurate diagnosis and management of this unusual canal morphology. PMID:23429761

  6. Distilling the Verification Process for Prognostics Algorithms

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil; Saxena, Abhinav; Celaya, Jose R.; Goebel, Kai

    2013-01-01

    The goal of prognostics and health management (PHM) systems is to ensure system safety, and reduce downtime and maintenance costs. It is important that a PHM system is verified and validated before it can be successfully deployed. Prognostics algorithms are integral parts of PHM systems. This paper investigates a systematic process of verification of such prognostics algorithms. To this end, first, this paper distinguishes between technology maturation and product development. Then, the paper describes the verification process for a prognostics algorithm as it moves up to higher maturity levels. This process is shown to be an iterative process where verification activities are interleaved with validation activities at each maturation level. In this work, we adopt the concept of technology readiness levels (TRLs) to represent the different maturity levels of a prognostics algorithm. It is shown that at each TRL, the verification of a prognostics algorithm depends on verifying the different components of the algorithm according to the requirements laid out by the PHM system that adopts this prognostics algorithm. Finally, using simplified examples, the systematic process for verifying a prognostics algorithm is demonstrated as the prognostics algorithm moves up TRLs.

  7. Fluor Hanford Integrated Safety Management System Phase 1 Verification 04/12/2000 Thru 04/28/2000 Volume 1 and 2

    SciTech Connect

    PARSONS, J.E.

    2000-03-01

    The U.S. Department of Energy (DOE) commits to accomplishing its mission safely. To ensure this objective is met, DOE issued DOE P 450.4, Safety Management System Policy, and incorporated safety management into the DOE Acquisition Regulations ([DEAR] 48 CFR 970.5204-2 and 90.5204-78).

  8. The consequences of disposal of low-level radioactive waste from the Fernald Environmental Management Project: Report of the DOE/Nevada Independent Panel

    SciTech Connect

    Crowe, B.; Hansen, W.; Waters, R.; Sully, M.; Levitt, D.

    1998-04-01

    The Department of Energy (DOE) convened a panel of independent scientists to assess the performance impact of shallow burial of low-level radioactive waste from the Fernald Environmental Management Project, in light of a transportation incident in December 1997 involving this waste stream. The Fernald waste has been transported to the Nevada Test Site and disposed in the Area 5 Radioactive Waste Management Site (RWMS) since 1993. A separate DOE investigation of the incident established that the waste has been buried in stress-fractured metal boxes, and some of the waste contained excess moisture (high-volumetric water contents). The Independent Panel was charged with determining whether disposition of this waste in the Area 5 RWMS has impacted the conclusions of a previously completed performance assessment in which the site was judged to meet required performance objectives. To assess the performance impact on Area 5, the panel members developed a series of questions. The three areas addressed in these questions were (1) reduced container integrity, (2) the impact of reduced container integrity on subsidence of waste in the disposal pits and (3) excess moisture in the waste. The panel has concluded that there is no performance impact from reduced container integrity--no performance is allocated to the container in the conservative assumptions used in performance assessment. Similarly, the process controlling post-closure subsidence results primarily from void space within and between containers, and the container is assumed to degrade and collapse within 100 years.

  9. Microcode Verification Project.

    DTIC Science & Technology

    1980-05-01

    Microcode Verification Project. University of Southern California. Stephen D...in the production, testing, and maintenance of Air Force software. This effort was undertaken in response to that goal. The objective of the effort was...rather than hard wiring, is a recent development in computer technology. Hardware diagnostics do not fulfill testing requirements for these computers

  10. Robust verification analysis

    NASA Astrophysics Data System (ADS)

    Rider, William; Witkowski, Walt; Kamm, James R.; Wildey, Tim

    2016-02-01

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework, and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data and expert informed error estimation including uncertainties for both the solution itself and order of convergence. Our method produces high quality results for the well-behaved cases relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.
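One ingredient of the approach, the use of median rather than mean statistics, can be shown with a minimal illustration (invented data; this is not the authors' full constrained-optimization framework): estimate the observed convergence order from consecutive grid pairs and note how the median resists an anomalous entry that drags down the mean.

```python
# Median vs. mean estimates of observed convergence order (invented data).
import math
from statistics import mean, median

def pairwise_orders(h, e):
    """Observed order from each grid pair: p = ln(e_i/e_{i+1}) / ln(h_i/h_{i+1})."""
    return [math.log(e[i] / e[i + 1]) / math.log(h[i] / h[i + 1])
            for i in range(len(e) - 1)]

h = [0.1, 0.05, 0.025, 0.0125, 0.00625]          # grid spacings, halved each time
e = [8.0e-3, 2.0e-3, 5.0e-4, 1.25e-4, 9.6e-5]    # errors; the last entry is anomalous

p = pairwise_orders(h, e)
print(f"mean order   {mean(p):.2f}")             # dragged down by the outlier
print(f"median order {median(p):.2f}")           # stays near the nominal order 2
```

In the full methodology this robust summary is combined with expert-judgment constraints (e.g., bounds on admissible convergence rates), which the toy above omits.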

  11. Robust verification analysis

    SciTech Connect

    Rider, William; Witkowski, Walt; Kamm, James R.; Wildey, Tim

    2016-02-15

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework, and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data and expert informed error estimation including uncertainties for both the solution itself and order of convergence. Our method produces high quality results for the well-behaved cases relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.

  12. RESRAD-BUILD verification.

    SciTech Connect

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-31

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft® Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified.
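The hand-verification pattern described above can be shown with an invented mini-example: recompute one model quantity independently of the "code" and compare the two values. The checked model here is simple radioactive decay with approximate textbook half-lives; none of these numbers are RESRAD-BUILD inputs or outputs.

```python
# Independent spreadsheet-style check of a single model quantity (invented example).
import math

HALF_LIFE_YR = {"H-3": 12.32, "Co-60": 5.27}   # approximate textbook half-lives

def activity(nuclide, a0, t_years):
    """Activity after t_years: A(t) = A0 * exp(-lambda * t)."""
    lam = math.log(2.0) / HALF_LIFE_YR[nuclide]   # decay constant (1/yr)
    return a0 * math.exp(-lam * t_years)

# "code" result vs. an independently computed check at exactly one half-life,
# where the answer must be half the initial activity
code_result = activity("Co-60", 100.0, 5.27)
hand_check = 100.0 * 0.5
print(abs(code_result - hand_check) < 1e-9)       # the two values agree
```

Scaled up over every pathway and radionuclide, this is the step-by-step external verification the abstract describes.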

  13. Salam's independence

    NASA Astrophysics Data System (ADS)

    Fraser, Gordon

    2009-01-01

    In his kind review of my biography of the Nobel laureate Abdus Salam (December 2008 pp45-46), John W Moffat wrongly claims that Salam had "independently thought of the idea of parity violation in weak interactions".

  14. Environmental Technology Verification (ETV) Program: Site Characterization and Monitoring Technologies Center

    EPA Pesticide Factsheets

    The ETV Site Characterization and Monitoring Technology Pilot is composed of EPA, DoD, DOE, other Federal agencies, state regulators, technology evaluation and verification entities, and potential end users of these technologies to facilitate independent..

  15. Quantitative Measures for Software Independent Verification and Validation

    NASA Technical Reports Server (NTRS)

    Lee, Alice

    1996-01-01

    As software is maintained or reused, it undergoes an evolution which tends to increase the overall complexity of the code. To understand the effects of this, we brought in statistics experts and leading researchers in software complexity, reliability, and their interrelationships. This effort has resulted in our ability to statistically correlate specific code complexity attributes, in orthogonal domains, to errors found over time in the HAL/S flight software which flies in the Space Shuttle. Although only a prototype-tools experiment, the result of this research appears to be extendable to all other NASA software, given appropriate data similar to that logged for the Shuttle onboard software. Our research has demonstrated that a more complete domain coverage can be mathematically demonstrated with the approach we have applied, thereby ensuring full insight into the cause-and-effect relationship between the complexity of a software system and the fault density of that system. By applying the operational profile we can characterize the dynamic effects of software path complexity under this same approach. We now have the ability to measure specific attributes which have been statistically demonstrated to correlate to increased error probability, and to know which actions to take, for each complexity domain. Shuttle software verifiers can now monitor the changes in the software complexity, assess the added or decreased risk of software faults in modified code, and determine necessary corrections. The reports, tool documentation, user's guides, and new approach that have resulted from this research effort represent advances in the state of the art of software quality and reliability assurance. Details describing how to apply this technique to other NASA code are contained in this document.

  16. SAPHIRE 8 Software Independent Verification and Validation Plan

    SciTech Connect

    Rae J. Nims; Kent M. Norris

    2010-02-01

    SAPHIRE 8 is being developed with a phased, cyclic, iterative rapid application development methodology. Accordingly, a similarly iterative approach is being taken for the IV&V activities on each vital software object. The IV&V plan is structured around NUREG/BR-0167, “Software Quality Assurance Program and Guidelines,” February 1993. The Nuclear Regulatory Research Office Instruction No. PRM-12, “Software Quality Assurance for RES Sponsored Codes,” March 26, 2007, specifies that RES-sponsored software is to be evaluated against NUREG/BR-0167. Per the guidance in NUREG/BR-0167, SAPHIRE is classified as “Level 1”; Level 1 software corresponds to technical application software used in a safety decision.

  17. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... be required in the case of APD projects that meet any of the following criteria: (1) Are at risk of...) Are at risk of failing to meet a critical milestone; (3) Indicate the need for a new project or total... Social Security Act; (5) Are at risk of failure, major delay, or cost overrun in their...

  18. Characteristics Verification of an Independently Controllable Electromagnetic Spherical Motor

    PubMed Central

    Maeda, Shuhei; Hirata, Katsuhiro; Niguchi, Noboru

    2014-01-01

    We have been developing electromagnetic spherical actuators capable of three-degree-of-freedom rotation. However, these actuators require complex control to realize simultaneous triaxial drive, because rotation around one axis interferes with rotation around another. In this paper, we propose a new three-degree-of-freedom actuator where 3-axes rotation can be controlled easily. The basic structure and the operating principle of the actuator are described. Then the torque characteristics and the dynamic characteristics are computed by employing 3D-FEM and the effectiveness of this actuator is clarified. Finally, the experimental results using the prototype of the actuator are shown to verify the dynamic performance. PMID:24919011

  19. Independent Evaluation of the integrated Community Case Management of Childhood Illness Strategy in Malawi Using a National Evaluation Platform Design.

    PubMed

    Amouzou, Agbessi; Kanyuka, Mercy; Hazel, Elizabeth; Heidkamp, Rebecca; Marsh, Andrew; Mleme, Tiope; Munthali, Spy; Park, Lois; Banda, Benjamin; Moulton, Lawrence H; Black, Robert E; Hill, Kenneth; Perin, Jamie; Victora, Cesar G; Bryce, Jennifer

    2016-03-01

    We evaluated the impact of integrated community case management of childhood illness (iCCM) on careseeking for childhood illness and child mortality in Malawi, using a National Evaluation Platform dose-response design with 27 districts as units of analysis. "Dose" variables included density of iCCM providers, drug availability, and supervision, measured through a cross-sectional cellular telephone survey of all iCCM-trained providers. "Response" variables were changes between 2010 and 2014 in careseeking and mortality in children aged 2-59 months, measured through household surveys. iCCM implementation strength was not associated with changes in careseeking or mortality. There were fewer than one iCCM-ready provider per 1,000 under-five children per district. About 70% of sick children were taken outside the home for care in both 2010 and 2014. Careseeking from iCCM providers increased over time from about 2% to 10%; careseeking from other providers fell by a similar amount. Likely contributors to the failure to find impact include low density of iCCM providers, geographic targeting of iCCM to "hard-to-reach" areas although women did not identify distance from a provider as a barrier to health care, and displacement of facility careseeking by iCCM careseeking. This suggests that targeting iCCM solely based on geographic barriers may need to be reconsidered.

  20. Quantum money with classical verification

    SciTech Connect

    Gavinsky, Dmitry

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; although this property is not directly related to the possibility of classical verification, none of the earlier quantum money constructions is known to possess it.

  1. Quantum money with classical verification

    NASA Astrophysics Data System (ADS)

    Gavinsky, Dmitry

    2014-12-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; although this property is not directly related to the possibility of classical verification, none of the earlier quantum money constructions is known to possess it.

  2. Signal verification can promote reliable signalling

    PubMed Central

    Broom, Mark; Ruxton, Graeme D.; Schaefer, H. Martin

    2013-01-01

    The central question in communication theory is whether communication is reliable, and if so, which mechanisms select for reliability. The primary approach in the past has been to attribute reliability to strategic costs associated with signalling as predicted by the handicap principle. Yet, reliability can arise through other mechanisms, such as signal verification; but the theoretical understanding of such mechanisms has received relatively little attention. Here, we model whether verification can lead to reliability in repeated interactions that typically characterize mutualisms. Specifically, we model whether fruit consumers that discriminate among poor- and good-quality fruits within a population can select for reliable fruit signals. In our model, plants either signal or they do not; costs associated with signalling are fixed and independent of plant quality. We find parameter combinations where discriminating fruit consumers can select for signal reliability by abandoning unprofitable plants more quickly. This self-serving behaviour imposes costs upon plants as a by-product, rendering it unprofitable for unrewarding plants to signal. Thus, strategic costs to signalling are not a prerequisite for reliable communication. We expect verification to more generally explain signal reliability in repeated consumer–resource interactions that typify mutualisms but also in antagonistic interactions such as mimicry and aposematism. PMID:24068354

  3. Automating engineering verification in ALMA subsystems

    NASA Astrophysics Data System (ADS)

    Ortiz, José; Castillo, Jorge

    2014-08-01

    The Atacama Large Millimeter/submillimeter Array is an interferometer comprising 66 individual high-precision antennas located at over 5000 meters altitude in the north of Chile. Several complex electronic subsystems need to be meticulously tested at different stages of an antenna commissioning, both independently and when integrated together. First subsystem integration takes place at the Operations Support Facilities (OSF), at an altitude of 3000 meters. Second integration occurs at the high-altitude Array Operations Site (AOS), where combined performance with the Central Local Oscillator (CLO) and Correlator is also assessed. In addition, there are several other events requiring complete or partial verification of instrument specification compliance, such as parts replacements, calibration, relocation within the AOS, preventive maintenance, and troubleshooting due to poor performance in scientific observations. Restricted engineering time allocation and the constant pressure of minimizing downtime in a 24/7 astronomical observatory impose the need to complete (and report) the aforementioned verifications in the least possible time. Array-wide disturbances, such as global power interruptions and the subsequent recovery, generate the added challenge of executing this checkout on multiple antenna elements at once. This paper presents the outcome of the automation of engineering verification setup, execution, notification, and reporting in ALMA, and how these efforts have resulted in a dramatic reduction of both the time and the operator training required. Signal Path Connectivity (SPC) checkout is introduced as a notable case of such automation.

  4. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: EVALUATION OF THE XP-SWMM STORMWATER WASTEWATER MANAGEMENT MODEL, VERSION 8.2, 2000, FROM XP SOFTWARE, INC.

    EPA Science Inventory

    XP-SWMM is a commercial software package used throughout the United States and around the world for simulation of storm, sanitary and combined sewer systems. It was designed based on the EPA Storm Water Management Model (EPA SWMM), but has enhancements and additional algorithms f...

  5. Independent Living.

    ERIC Educational Resources Information Center

    Nathanson, Jeanne H., Ed.

    1994-01-01

    This issue of "OSERS" addresses the subject of independent living of individuals with disabilities. The issue includes a message from Judith E. Heumann, the Assistant Secretary of the Office of Special Education and Rehabilitative Services (OSERS), and 10 papers. Papers have the following titles and authors: "Changes in the…

  6. Verification of LHS distributions.

    SciTech Connect

    Swiler, Laura Painton

    2006-04-01

    This document provides verification test results for normal, lognormal, and uniform distributions that are used in Sandia's Latin Hypercube Sampling (LHS) software. The purpose of this testing is to verify that the sample values being generated in LHS are distributed according to the desired distribution types. The testing of distribution correctness is done by examining summary statistics, graphical comparisons using quantile-quantile plots, and formal statistical tests such as the Chi-square test, the Kolmogorov-Smirnov test, and the Anderson-Darling test. The overall results from the testing indicate that the generation of normal, lognormal, and uniform distributions in LHS is acceptable.
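    The distribution-correctness testing described above can be sketched in miniature: generate a one-dimensional Latin hypercube sample of U(0,1) and compare its empirical CDF with the target CDF via the Kolmogorov-Smirnov statistic. This is an illustrative sketch, not Sandia's LHS code; the sample size and seed are arbitrary.

    ```python
    import math
    import random

    def lhs_uniform(n, rng):
        """One-dimensional Latin hypercube sample of U(0,1): one draw per equal-probability stratum."""
        pts = [(i + rng.random()) / n for i in range(n)]
        rng.shuffle(pts)
        return pts

    def ks_statistic_uniform(samples):
        """Kolmogorov-Smirnov distance between the empirical CDF and the U(0,1) CDF."""
        xs = sorted(samples)
        n = len(xs)
        return max(max((i + 1) / n - x, x - i / n) for i, x in enumerate(xs))

    rng = random.Random(42)
    d = ks_statistic_uniform(lhs_uniform(1000, rng))

    # For a correct generator, D should fall well below the ~5% critical value
    # 1.36 / sqrt(n) that applies to plain random sampling (stratified LHS
    # output is even more regular than that).
    print(d < 1.36 / math.sqrt(1000))
    ```

    A deliberately broken generator (for example, one that skips a stratum) would push the statistic above the critical value, which is exactly what this style of test is designed to catch.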

  7. Production readiness verification testing

    NASA Technical Reports Server (NTRS)

    James, A. M.; Bohon, H. L.

    1980-01-01

    A Production Readiness Verification Testing (PRVT) program has been established to determine whether structures fabricated from advanced composites can be committed on a production basis to commercial airline service. The program utilizes subcomponents that reflect the variabilities in structure that can realistically be expected from current production and quality control technology to estimate the production qualities, variation in static strength, and durability of advanced composite structures. The results of the static tests, and a durability assessment after one year of continuous load/environment testing of twenty-two duplicates of each of two structural components (a segment of the front spar and cover of a vertical stabilizer box structure), are discussed.

  8. 30 CFR 585.705 - When must I use a Certified Verification Agent (CVA)?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (CVA)? 585.705 Section 585.705 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF THE... Facility Design, Fabrication, and Installation Certified Verification Agent § 585.705 When must I use a Certified Verification Agent (CVA)? You must use a CVA to review and certify the Facility Design Report,...

  9. 30 CFR 585.705 - When must I use a Certified Verification Agent (CVA)?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... (CVA)? 585.705 Section 585.705 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF THE... Facility Design, Fabrication, and Installation Certified Verification Agent § 585.705 When must I use a Certified Verification Agent (CVA)? You must use a CVA to review and certify the Facility Design Report,...

  10. 30 CFR 585.705 - When must I use a Certified Verification Agent (CVA)?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (CVA)? 585.705 Section 585.705 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF THE... Facility Design, Fabrication, and Installation Certified Verification Agent § 585.705 When must I use a Certified Verification Agent (CVA)? You must use a CVA to review and certify the Facility Design Report,...

  11. 30 CFR 285.705 - When must I use a Certified Verification Agent (CVA)?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (CVA)? 285.705 Section 285.705 Mineral Resources MINERALS MANAGEMENT SERVICE, DEPARTMENT OF THE... Facility Design, Fabrication, and Installation Certified Verification Agent § 285.705 When must I use a Certified Verification Agent (CVA)? You must use a CVA to review and certify the Facility Design Report,...

  12. 76 FR 60112 - Consent Based Social Security Number Verification (CBSV) Service

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-28

    ... addition to the benefit of providing high volume, centralized SSN verification services to the business community in a secure manner, CBSV provides us with cost and workload management benefits. New Information... ADMINISTRATION Consent Based Social Security Number Verification (CBSV) Service AGENCY: Social...

  13. 24 CFR 985.3 - Indicators, HUD verification methods and ratings.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Indicators, HUD verification... HOUSING AND URBAN DEVELOPMENT SECTION 8 MANAGEMENT ASSESSMENT PROGRAM (SEMAP) General § 985.3 Indicators, HUD verification methods and ratings. This section states the performance indicators that are used...

  14. 24 CFR 985.3 - Indicators, HUD verification methods and ratings.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 4 2011-04-01 2011-04-01 false Indicators, HUD verification... HOUSING AND URBAN DEVELOPMENT SECTION 8 MANAGEMENT ASSESSMENT PROGRAM (SEMAP) General § 985.3 Indicators, HUD verification methods and ratings. This section states the performance indicators that are used...

  15. Independence and Survival.

    ERIC Educational Resources Information Center

    James, H. Thomas

    Independent schools that are of viable size, well managed, and strategically located to meet competition will survive and prosper past the current financial crisis. We live in a complex technological society with insatiable demands for knowledgeable people to keep it running. The future will be marked by the orderly selection of qualified people,…

  16. Independent technical review, handbook

    SciTech Connect

    Not Available

    1994-02-01

    Purpose: To provide an independent engineering review of the major projects being funded by the Department of Energy, Office of Environmental Restoration and Waste Management. The independent engineering review will address whether the engineering practice is sufficiently developed to a point where a major project can be executed without significant technical problems. The independent review will focus on questions related to: (1) adequacy of development of the technical base of understanding; (2) status of development and availability of technology among the various alternatives; (3) status and availability of the industrial infrastructure to support project design, equipment fabrication, facility construction, and process and program/project operation; (4) adequacy of the design effort to provide a sound foundation to support execution of the project; (5) ability of the organization to fully integrate the system, and to direct, manage, and control the execution of a complex major project.

  17. Software Verification and Validation Procedure

    SciTech Connect

    Olund, Thomas S.

    2008-09-15

    This Software Verification and Validation procedure provides the action steps for the Tank Waste Information Network System (TWINS) testing process. The primary objective of the testing process is to provide assurance that the software functions as intended, and meets the requirements specified by the client. Verification and validation establish the primary basis for TWINS software product acceptance.

  18. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  19. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the openSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.

  20. TFE Verification Program

    SciTech Connect

    Not Available

    1993-05-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE (thermionic fuel element) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and early 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern.

  1. Understanding independence

    NASA Astrophysics Data System (ADS)

    Annan, James; Hargreaves, Julia

    2016-04-01

    In order to perform any Bayesian processing of a model ensemble, we need a prior over the ensemble members. In the case of multimodel ensembles such as CMIP, the historical approach of "model democracy" (i.e. equal weight for all models in the sample) is no longer credible (if it ever was) due to model duplication and inbreeding. The question of "model independence" is central to the question of prior weights. However, although this question has been repeatedly raised, it has not yet been satisfactorily addressed. Here I will discuss the issue of independence and present a theoretical foundation for understanding and analysing the ensemble in this context. I will also present some simple examples showing how these ideas may be applied and developed.

  2. Knowledge-Based Aircraft Automation: Managers Guide on the use of Artificial Intelligence for Aircraft Automation and Verification and Validation Approach for a Neural-Based Flight Controller

    NASA Technical Reports Server (NTRS)

    Broderick, Ron

    1997-01-01

    The ultimate goal of this report was to integrate the powerful tools of artificial intelligence into the traditional process of software development. To maintain the US aerospace competitive advantage, traditional aerospace and software engineers need to more easily incorporate the technology of artificial intelligence into the advanced aerospace systems being designed today. The future goal was to transition artificial intelligence from an emerging technology to a standard technology that is considered early in the life cycle process to develop state-of-the-art aircraft automation systems. This report addressed the future goal in two ways. First, it provided a matrix that identified typical aircraft automation applications conducive to various artificial intelligence methods. The purpose of this matrix was to provide top-level guidance to managers contemplating the possible use of artificial intelligence in the development of aircraft automation. Second, the report provided a methodology to formally evaluate neural networks as part of the traditional process of software development. The matrix was developed by organizing the discipline of artificial intelligence into the following six methods: logical, object representation-based, distributed, uncertainty management, temporal and neurocomputing. Next, a study of existing aircraft automation applications that have been conducive to artificial intelligence implementation resulted in the following five categories: pilot-vehicle interface, system status and diagnosis, situation assessment, automatic flight planning, and aircraft flight control. The resulting matrix provided management guidance to understand artificial intelligence as it applied to aircraft automation. The approach taken to develop a methodology to formally evaluate neural networks as part of the software engineering life cycle was to start with the existing software quality assurance standards and to change these standards to include neural network

  3. Correction, improvement and model verification of CARE 3, version 3

    NASA Technical Reports Server (NTRS)

    Rose, D. M.; Manke, J. W.; Altschul, R. E.; Nelson, D. L.

    1987-01-01

    An independent verification of the CARE 3 mathematical model and computer code was conducted and reported in NASA Contractor Report 166096, 'Review and Verification of CARE 3 Mathematical Model and Code: Interim Report.' The study uncovered some implementation errors that were corrected and are reported in this document. The corrected CARE 3 program is called version 4. The document 'Correction, Improvement, and Model Verification of CARE 3, Version 3' was written in April 1984. It is being published now because it has been determined to contain a more accurate representation of CARE 3 than the preceding document of April 1983. This edition supersedes NASA-CR-166122, 'Correction and Improvement of CARE 3, Version 3,' April 1983.

  4. Verification and validation of RADMODL Version 1.0

    SciTech Connect

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A), were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  5. Content Independence in Multimedia Databases.

    ERIC Educational Resources Information Center

    de Vries, Arjen P.

    2001-01-01

    Investigates the role of data management in multimedia digital libraries, and its implications for the design of database management systems. Introduces the notions of content abstraction and content independence. Proposes a blueprint of a new class of database technology, which supports the basic functionality for the management of both content…

  6. Online fingerprint verification.

    PubMed

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints: local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications.

  7. Verification of VENTSAR

    SciTech Connect

    Simpkins, A.A.

    1995-01-01

    The VENTSAR code is an upgraded and improved version of the VENTX code, which estimates concentrations on or near a building from a release at a nearby location. The code calculates the concentrations either for a given meteorological exceedance probability or for a given stability and wind speed combination. A single building can be modeled which lies in the path of the plume, or a penthouse can be added to the top of the building. Plume rise may also be considered. Release types can be either chemical or radioactive. Downwind concentrations are determined at user-specified incremental distances. This verification report was prepared to demonstrate that VENTSAR is properly executing all algorithms and transferring data. Hand calculations were also performed to ensure proper application of methodologies.
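    The hand calculations mentioned above typically check a code against closed-form relations. As a hedged illustration (a standard flat-terrain Gaussian plume formula, not VENTSAR's building-effects model), the ground-level centerline concentration from an elevated release can be computed directly; all parameter values below are hypothetical.

    ```python
    import math

    # Textbook Gaussian plume centerline relation (NOT VENTSAR's building model):
    # chi = Q / (pi * u * sigma_y * sigma_z) * exp(-H**2 / (2 * sigma_z**2))

    def plume_centerline(Q, u, sigma_y, sigma_z, H):
        """Ground-level centerline concentration (g/m^3) from an elevated point source.

        Q: release rate (g/s); u: wind speed (m/s);
        sigma_y, sigma_z: dispersion coefficients (m); H: effective release height (m).
        """
        return Q / (math.pi * u * sigma_y * sigma_z) * math.exp(-H**2 / (2 * sigma_z**2))

    # Hypothetical case: 10 g/s release, 4 m/s wind, sigma_y = 30 m, sigma_z = 15 m, 20 m stack.
    chi = plume_centerline(10.0, 4.0, 30.0, 15.0, 20.0)
    print(f"{chi:.2e}")
    ```

    Repeating such a calculation at several downwind distances (with the corresponding sigma values) is the kind of incremental-distance check the verification report describes.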

  8. Shift Verification and Validation

    SciTech Connect

    Pandya, Tara M.; Evans, Thomas M.; Davidson, Gregory G; Johnson, Seth R.; Godfrey, Andrew T.

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.

  9. Assessment and Verification of National Vocational Qualifications: Policy and Practice.

    ERIC Educational Resources Information Center

    Konrad, John

    2000-01-01

    To overcome problems of assessment and verification of National Vocational Qualifications, the system should move from narrow quality control to total quality management. Situated learning in communities of practice, including assessors and assessees, should be developed. This requires radically different quality criteria and professional…

  10. 'Independence' Panorama

    NASA Technical Reports Server (NTRS)

    2005-01-01


    This is the Spirit 'Independence' panorama, acquired on martian days, or sols, 536 to 543 (July 6 to 13, 2005), from a position in the 'Columbia Hills' near the summit of 'Husband Hill.' The summit of 'Husband Hill' is the peak near the right side of this panorama and is about 100 meters (328 feet) away from the rover and about 30 meters (98 feet) higher in elevation. The rocky outcrops downhill and on the left side of this mosaic include 'Larry's Lookout' and 'Cumberland Ridge,' which Spirit explored in April, May, and June of 2005.

    The panorama spans 360 degrees and consists of 108 individual images, each acquired with five filters of the rover's panoramic camera. The approximate true color of the mosaic was generated using the camera's 750-, 530-, and 480-nanometer filters. During the 8 martian days, or sols, that it took to acquire this image, the lighting varied considerably, partly because of imaging at different times of sol, and partly because of small sol-to-sol variations in the dustiness of the atmosphere. These slight changes produced some image seams and rock shadows. These seams have been eliminated from the sky portion of the mosaic to better simulate the vista a person standing on Mars would see. However, it is often not possible or practical to smooth out such seams for regions of rock, soil, rover tracks or solar panels. Such is the nature of acquiring and assembling large panoramas from the rovers.

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PHOTOACOUSTIC SPECTROPHOTOMETER INNOVA AIR TECH INSTRUMENTS MODEL 1312 MULTI-GAS MONITOR

    EPA Science Inventory

    The U.S. Environmental Protection Agency, through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. This report documents demons...

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PORTABLE GAS CHROMATOGRAPH - SENTEX SYSTEMS, INC. SCENTOGRAPH PLUS II

    EPA Science Inventory

    The U.S. Environmental Protection Agency, through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. This report documents demons...

  13. System maintenance verification and validation plan for the TWRS controlled baseline database system

    SciTech Connect

    Spencer, S.G.

    1998-09-23

    TWRS Controlled Baseline Database, formally known as the Performance Measurement Control System, is used to track and monitor TWRS project management baseline information. This document contains the verification and validation approach for system documentation changes within the database system.

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PORTABLE GAS CHROMATOGRAPH - PERKIN-ELMER PHOTOVAC, INC. VOYAGER

    EPA Science Inventory

    The U.S. Environmental Protection Agency, through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. Reports document the performa...

  15. Better Buildings Alliance, Advanced Rooftop Unit Campaign: Rooftop Unit Measurement and Verification (Fact Sheet)

    SciTech Connect

    Not Available

    2014-09-01

    This document provides facility managers and building owners an introduction to measurement and verification (M&V) methods for estimating the energy and cost savings of rooftop unit replacement or retrofit projects, in order to estimate paybacks or justify future projects.
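The payback estimate mentioned above is, in its simplest form, project cost divided by annual energy cost savings. A minimal sketch with hypothetical retrofit numbers (the function name and all values are illustrative, not from the fact sheet):

```python
def simple_payback_years(annual_kwh_saved, rate_usd_per_kwh, project_cost_usd):
    """Simple payback: installed cost divided by annual energy cost savings."""
    annual_savings_usd = annual_kwh_saved * rate_usd_per_kwh
    return project_cost_usd / annual_savings_usd

# Hypothetical rooftop unit retrofit: 12,000 kWh/yr saved at $0.10/kWh,
# with a $6,000 installed cost.
years = simple_payback_years(12_000, 0.10, 6_000)
print(f"Simple payback: {years:.1f} years")  # → Simple payback: 5.0 years
```

Actual M&V methods refine the savings term by comparing metered pre- and post-retrofit consumption, normalized for weather and occupancy.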

  16. Generic interpreters and microprocessor verification

    NASA Technical Reports Server (NTRS)

    Windley, Phillip J.

    1990-01-01

    The following topics are covered in viewgraph form: (1) generic interpreters; (2) Viper microprocessors; (3) microprocessor verification; (4) determining correctness; (5) hierarchical decomposition; (6) interpreter theory; (7) AVM-1; (8) phase-level specification; and (9) future work.

  17. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    NASA Astrophysics Data System (ADS)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.

  18. Organics Verification Study for Sinclair and Dyes Inlets, Washington

    SciTech Connect

    Kohn, Nancy P.; Brandenberger, Jill M.; Niewolny, Laurie A.; Johnston, Robert K.

    2006-09-28

    Sinclair and Dyes Inlets near Bremerton, Washington, are on the State of Washington 1998 303(d) list of impaired waters because of fecal coliform contamination in marine water, metals in sediment and fish tissue, and organics in sediment and fish tissue. Because significant cleanup and source control activities have been conducted in the inlets since the data supporting the 1998 303(d) listings were collected, two verification studies were performed to address the 303(d) segments that were listed for metal and organic contaminants in marine sediment. The Metals Verification Study (MVS) was conducted in 2003; the final report, Metals Verification Study for Sinclair and Dyes Inlets, Washington, was published in March 2004 (Kohn et al. 2004). This report describes the Organics Verification Study that was conducted in 2005. The study approach was similar to the MVS in that many surface sediment samples were screened for the major classes of organic contaminants, and then the screening results and other available data were used to select a subset of samples for quantitative chemical analysis. Because the MVS was designed to obtain representative data on concentrations of contaminants in surface sediment throughout Sinclair Inlet, Dyes Inlet, Port Orchard Passage, and Rich Passage, aliquots of the 160 MVS sediment samples were used in the analysis for the Organics Verification Study. However, unlike metals screening methods, organics screening methods are not specific to individual organic compounds, and are not available for some target organics. Therefore, only the quantitative analytical results were used in the organics verification evaluation. The results of the Organics Verification Study showed that sediment quality outside of Sinclair Inlet is unlikely to be impaired because of organic contaminants. Similar to the results for metals, in Sinclair Inlet, the distribution of residual organic contaminants is generally limited to nearshore areas already within the

  19. Aerospace Nickel-cadmium Cell Verification

    NASA Technical Reports Server (NTRS)

    Manzo, Michelle A.; Strawn, D. Michael; Hall, Stephen W.

    2001-01-01

    During the early years of satellites, NASA successfully flew "NASA-Standard" nickel-cadmium (Ni-Cd) cells manufactured by GE/Gates/SAFT on a variety of spacecraft. In 1992 a NASA Battery Review Board determined that the strategy of a NASA Standard Cell and Battery Specification and the accompanying NASA control of a standard manufacturing control document (MCD) for Ni-Cd cells and batteries was unwarranted. As a result of that determination, standards were abandoned and the use of cells other than the NASA Standard was required. In order to gain insight into the performance and characteristics of the various aerospace Ni-Cd products available, tasks were initiated within the NASA Aerospace Flight Battery Systems Program that involved the procurement and testing of representative aerospace Ni-Cd cell designs. A standard set of test conditions was established in order to provide similar information about the products from various vendors. The objective of this testing was to provide independent verification of representative commercial flight cells available in the marketplace today. This paper will provide a summary of the verification tests run on cells from various manufacturers: Sanyo 35 ampere-hour (Ah) standard and 35 Ah advanced Ni-Cd cells, SAFT 50 Ah Ni-Cd cells, and Eagle-Picher 21 Ah Magnum and 21 Ah Super Ni-Cd cells were put through a full evaluation. A limited number of 18 and 55 Ah cells from Acme Electric were also tested to provide an initial evaluation of the Acme aerospace cell designs. Additionally, 35 Ah aerospace design Ni-MH cells from Sanyo were evaluated under the standard conditions established for this program. The test program is essentially complete. The cell design parameters, the verification test plan, and the details of the test results will be discussed.

  20. Verification and validation of control system software

    SciTech Connect

    Munro, J.K. Jr.; Kisner, R.A.; Bhadtt, S.C.

    1991-01-01

    The following guidelines are proposed for verification and validation (V&V) of nuclear power plant control system software: (a) use risk management to decide what and how much V&V is needed; (b) classify each software application using a scheme that reflects what type and how much V&V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs.

  1. Woodward Effect Experimental Verifications

    NASA Astrophysics Data System (ADS)

    March, Paul

    2004-02-01

    The work of J. F. Woodward (1990; 1996a; 1996b; 1998; 2002a; 2002b; 2004) on the existence of "mass fluctuations" and their use in exotic propulsion schemes was examined for possible application in improving space flight propulsion and power generation. Woodward examined Einstein's General Relativity Theory (GRT) and assumed that if the strong Machian interpretation of GRT as well as gravitational/inertia-like Wheeler-Feynman radiation reaction forces hold, then when an elementary particle is accelerated through a potential gradient, its rest mass should fluctuate around its mean value during its acceleration. Woodward also used GRT to clarify the precise experimental conditions necessary for observing and exploiting these mass fluctuations, or the "Woodward effect" (W-E). Later, in collaboration with his former graduate student T. Mahood, Woodward also pushed the experimental verification boundaries of these proposals. If these purported mass fluctuations occur as Woodward claims, and his assumption that gravity and inertia are both byproducts of the same GRT-based phenomenon per Mach's Principle is correct, then many innovative applications such as propellantless propulsion and gravitational exotic matter generators may be feasible. This paper examines the reality of mass fluctuations and the feasibility of using the W-E to design propellantless propulsion devices in the near to mid-term future. The latest experimental results, utilizing MHD-like force rectification systems, will also be presented.

  2. Verification Survey of the Building 315 Zero Power Reactor-6 Facility, Argonne National Laboratory-East, Argonne, Illinois

    SciTech Connect

    W. C. Adams

    2007-05-25

    Oak Ridge Institute for Science and Education (ORISE) conducted independent verification radiological survey activities at Argonne National Laboratory’s Building 315, Zero Power Reactor-6 facility in Argonne, Illinois. Independent verification survey activities included document and data reviews, alpha plus beta and gamma surface scans, alpha and beta surface activity measurements, and instrumentation comparisons. An interim letter report and a draft report, documenting the verification survey findings, were submitted to the DOE on November 8, 2006 and February 22, 2007, respectively (ORISE 2006b and 2007).

  3. Hard and Soft Safety Verifications

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jon; Anderson, Brenda

    2012-01-01

    The purpose of this paper is to examine the differences between, and the effects of, hard and soft safety verifications. Initially, the terminology should be defined and clarified. A hard safety verification is a datum which demonstrates how a safety control is enacted. An example of this is relief valve testing. A soft safety verification is something which is usually described as "nice to have" but is not necessary to prove safe operation. An example of a soft verification is the loss of the Solid Rocket Booster (SRB) casings from Shuttle flight STS-4. When the main parachutes failed, the casings impacted the water and sank. In the nose cap of the SRBs, video cameras recorded the release of the parachutes to determine safe operation and to provide information for potential anomaly resolution. Generally, examination of the casings and nozzles contributed to understanding of the newly developed boosters and their operation. Safety verification of SRB operation was demonstrated by examination for erosion or wear of the casings and nozzle. Loss of the SRBs and associated data did not delay the launch of the next Shuttle flight.

  4. Structural verification for GAS experiments

    NASA Technical Reports Server (NTRS)

    Peden, Mark Daniel

    1992-01-01

    The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of the experiment's structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered, with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (Stress Corrosion Cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verifications (tests and analyses) will be outlined, especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.

  5. Technical challenges for dismantlement verification

    SciTech Connect

    Olinger, C.T.; Stanbro, W.D.; Johnston, R.G.; Nakhleh, C.W.; Dreicer, J.S.

    1997-11-01

    In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion will focus on the technical issues raised by warhead arms controls. Technical complications arise from several sources. These will be discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors will discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, in the process identifying limitations and vulnerabilities. They expect that these considerations will play a large role in any future arms reduction effort and, therefore, should be addressed in a timely fashion.

  6. Test/QA Plan for Verification of Radio Frequency Identification (RFID) for Tracking Hazardous Waste Shipments across International Borders

    EPA Science Inventory

    The verification test will be conducted under the auspices of the U.S. Environmental Protection Agency (EPA) through the Environmental Technology Verification (ETV) Program. It will be performed by Battelle, which is managing the ETV Advanced Monitoring Systems (AMS) Center throu...

  7. Safeguards for spent fuels: Verification problems

    SciTech Connect

    Pillay, K.K.S.; Picard, R.R.

    1991-01-01

    The accumulation of large quantities of spent nuclear fuels world-wide is a serious problem for international safeguards. A number of International Atomic Energy Agency (IAEA) member states, including the US, consider spent fuel to be a material form for which safeguards cannot be terminated, even after permanent disposal in a geologic repository. Because safeguards requirements for spent fuels are different from those of conventional bulk-handling and item-accounting facilities, there is room for innovation to design a unique safeguards regime for spent fuels that satisfies the goals of the nuclear nonproliferation treaty at a reasonable cost to both the facility and the IAEA. Various strategies being pursued for long-term management of spent fuels are examined with a realistic example to illustrate the problems of verifying safeguards under the present regime. Verification of a safeguards regime for spent fuels requires a mix of standard safeguards approaches, such as quantitative verification and use of seals, with other measures that are unique to spent fuels. 17 refs.

  8. Subsurface barrier integrity verification using perfluorocarbon tracers

    SciTech Connect

    Sullivan, T.M.; Heiser, J.; Milian, L.; Senum, G.

    1996-12-01

    Subsurface barriers are an extremely promising remediation option for many waste management problems. Gas phase tracers include perfluorocarbon tracers (PFTs) and chlorofluorocarbon tracers (CFCs). Both have been applied for leak detection in subsurface systems. The focus of this report is to describe the barrier verification tests conducted using PFTs and the analysis of the data from the tests. PFT verification tests have been performed on a simulated waste pit at the Hanford Geotechnical facility and on an actual waste pit at Brookhaven National Laboratory (BNL). The objective of these tests was to demonstrate proof-of-concept that PFT technology can be used to determine if small breaches form in the barrier and to estimate the effectiveness of the barrier in preventing migration of the gas tracer to the monitoring wells. The subsurface barrier systems created at Hanford and BNL are described. The experimental results and the analysis of the data follow. Based on the findings of this study, conclusions are offered and suggestions for future work are presented.

  9. Monitoring/Verification using DMS: TATP Example

    SciTech Connect

    Stephan Weeks, Kevin Kyle, Manuel Manard

    2008-05-30

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations-management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a “smart dust” sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. Fast GC is the leading field analytical method for gas phase separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15–300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-rad ionization source, for peroxide-based explosive measurements.

  10. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  11. CHEMICAL INDUCTION MIXER VERIFICATION - ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Wet-Weather Flow Technologies Pilot of the Environmental Technology Verification (ETV) Program, which is supported by the U.S. Environmental Protection Agency and facilitated by NSF International, has recently evaluated the performance of chemical induction mixers used for di...

  12. 38 CFR 21.6160 - Independent living services.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... independent living services which may be furnished include: (1) Training in independent living skills; (2) Health management programs; (3) Identification of appropriate housing accommodations; and (4)...

  13. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  14. Definition of ground test for Large Space Structure (LSS) control verification

    NASA Technical Reports Server (NTRS)

    Waites, H. B.; Doane, G. B., III; Tollison, D. K.

    1984-01-01

    An overview for the definition of a ground test for the verification of Large Space Structure (LSS) control is given. The definition contains information on the description of the LSS ground verification experiment, the project management scheme, the design, development, fabrication and checkout of the subsystems, the systems engineering and integration, the hardware subsystems, the software, and a summary which includes future LSS ground test plans. Upon completion of these items, NASA/Marshall Space Flight Center will have an LSS ground test facility which will provide sufficient data on dynamics and control verification of LSS so that LSS flight system operations can be reasonably ensured.

  15. Verification and validation of impinging round jet simulations using an adaptive FEM

    NASA Astrophysics Data System (ADS)

    Pelletier, Dominique; Turgeon, Éric; Tremblay, Dominique

    2004-03-01

    This paper illustrates the use of an adaptive finite element method as a means of achieving verification of codes and simulations of impinging round jets, that is, obtaining numerical predictions with controlled accuracy. Validation of these grid-independent solutions is then performed by comparing predictions to measurements. We adopt the standard and accepted definitions of verification and validation (Technical Report AIAA-G-077-1998, American Institute of Aeronautics and Astronautics, 1998; Verification and Validation in Computational Science and Engineering. Hermosa Publishers: Albuquerque, NM, 1998). Mesh adaptation is used to perform the systematic and rigorous grid refinement studies required for both verification and validation in CFD. This ensures that discrepancies observed between predictions and measurements are due to deficiencies in the mathematical model of the flow. Issues in verification and validation are discussed. The paper presents an example of code verification by the method of manufactured solutions. Examples of successful and unsuccessful validation for laminar and turbulent impinging jets show that agreement with experiments is achieved only with a good mathematical model of the flow physics combined with accurate numerical solution of the differential equations. The paper emphasizes good CFD practice to systematically achieve verification so that validation studies are always performed on solid grounds.
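The grid refinement studies described above typically report an observed order of accuracy: if a manufactured solution gives discretization errors on two grids related by refinement ratio r, then e_coarse / e_fine = r**p. A minimal sketch of that calculation (illustrative error values, not from the paper):

```python
import math

def observed_order(e_coarse, e_fine, r):
    """Observed order of accuracy p from errors on two grids with
    refinement ratio r: e_coarse / e_fine = r**p."""
    return math.log(e_coarse / e_fine) / math.log(r)

# Hypothetical errors against a manufactured solution, halving the mesh
# size each time (r = 2); a second-order scheme should give p near 2.
errors = [4.0e-2, 1.0e-2, 2.5e-3]
for e1, e2 in zip(errors, errors[1:]):
    print(f"p = {observed_order(e1, e2, 2.0):.2f}")
```

Matching the observed order to the scheme's formal order is the usual pass criterion in code verification by manufactured solutions.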

  16. Automated verification system user's guide

    NASA Technical Reports Server (NTRS)

    Hoffman, R. H.

    1972-01-01

    Descriptions of the operational requirements for all of the programs of the Automated Verification System (AVS) are provided. The AVS programs are: (1) FORTRAN code analysis and instrumentation program (QAMOD); (2) Test Effectiveness Evaluation Program (QAPROC); (3) Transfer Control Variable Tracking Program (QATRAK); (4) Program Anatomy Table Generator (TABGEN); and (5) Network Path Analysis Program (RAMBLE).

  17. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-07-16

    This paper will explore the difficulties of deep reductions by examining the technical verification challenges. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking beyond New START, the next step will likely include warhead limits in the neighborhood of 1000 (Pifer 2010). Further reductions will include stepping stones at hundreds of warheads, and then tens of warheads, before final elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1000, hundreds, and tens. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain-of-custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national lab complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  18. Crew Exploration Vehicle (CEV) Avionics Integration Laboratory (CAIL) Independent Analysis

    NASA Technical Reports Server (NTRS)

    Davis, Mitchell L.; Aguilar, Michael L.; Mora, Victor D.; Regenie, Victoria A.; Ritz, William F.

    2009-01-01

    Two approaches were compared to the Crew Exploration Vehicle (CEV) Avionics Integration Laboratory (CAIL) approach: the Flat-Sat and the Shuttle Avionics Integration Laboratory (SAIL). The Flat-Sat and CAIL/SAIL approaches are two different tools designed to mitigate different risks. The Flat-Sat approach is designed to develop a mission concept into a flight avionics system and associated ground controller. The SAIL approach is designed to aid in the flight readiness verification of the flight avionics system. The approaches are complementary in addressing both the system development risks and mission verification risks. The following NESC team findings were identified: the CAIL assumption is that the flight subsystems will be matured for the system-level verification; the Flat-Sat and SAIL approaches are two different tools designed to mitigate different risks. The following NESC team recommendation was provided: define, document, and manage a detailed interface between the design and development (EDL and other integration labs) and the verification laboratory (CAIL).

  19. U.S. verification method disputed

    NASA Astrophysics Data System (ADS)

    Maggs, William Ward

    Milo Nordyke, senior scientist at Lawrence Livermore National Laboratory in Livermore, Calif., testified October 6 at a Senate Foreign Affairs Committee hearing on Soviet test ban noncompliance and the recently concluded Joint Verification Experiment. He said that the government's method for on-site test monitoring is intrusive, expensive, and could limit some U.S. weapon design programs. In addition, Gregory Van der Vink of the congressional Office of Technology Assessment presented new evidence that White House charges that the Soviet Union has not complied with the current 150-kiloton test limit are probably without basis. Also testifying were Paul Robinson, U.S. negotiator for the Nuclear Testing Talks; Peter Sharfman, program manager for International Security and Commerce at OTA; and physicist David Hafemeister of California Polytechnic State University, San Luis Obispo.

  20. On Crowd-verification of Biological Networks

    PubMed Central

    Ansari, Sam; Binder, Jean; Boue, Stephanie; Di Fabio, Anselmo; Hayes, William; Hoeng, Julia; Iskandar, Anita; Kleiman, Robin; Norel, Raquel; O’Neel, Bruce; Peitsch, Manuel C.; Poussin, Carine; Pratt, Dexter; Rhrissorrakrai, Kahn; Schlage, Walter K.; Stolovitzky, Gustavo; Talikka, Marja

    2013-01-01

    Biological networks with a structured syntax are a powerful way of representing biological information generated from high density data; however, they can become unwieldy to manage as their size and complexity increase. This article presents a crowd-verification approach for the visualization and expansion of biological networks. Web-based graphical interfaces allow visualization of causal and correlative biological relationships represented using Biological Expression Language (BEL). Crowdsourcing principles enable participants to communally annotate these relationships based on literature evidence. Gamification principles are incorporated to further engage domain experts throughout biology to gather robust peer-reviewed information from which relationships can be identified and verified. The resulting network models will represent the current status of biological knowledge within the defined boundaries, here processes related to human lung disease. These models are amenable to computational analysis. For some period following conclusion of the challenge, the published models will remain available for continuous use and expansion by the scientific community. PMID:24151423

  1. Design and Verification Guidelines for Vibroacoustic and Transient Environments

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Design and verification guidelines for vibroacoustic and transient environments contain many basic methods that are common throughout the aerospace industry. However, there are some significant differences in methodology between NASA/MSFC and others - both government agencies and contractors. The purpose of this document is to provide the general guidelines used by the Component Analysis Branch, ED23, at MSFC, for the application of the vibroacoustic and transient technology to all launch vehicle and payload components and experiments managed by NASA/MSFC. This document is intended as a tool to be utilized by the MSFC program management and their contractors as a guide for the design and verification of flight hardware.

  2. A DICOM-RT-based toolbox for the evaluation and verification of radiotherapy plans

    NASA Astrophysics Data System (ADS)

    Spezi, E.; Lewis, D. G.; Smith, C. W.

    2002-12-01

    The verification of radiotherapy plans is an essential step in the treatment planning process. This is especially important for highly conformal and IMRT plans which produce non-intuitive fluence maps and complex 3D dose distributions. In this work we present a DICOM (Digital Imaging and Communication in Medicine) based toolbox, developed for the evaluation and the verification of radiotherapy treatment plans. The toolbox offers the possibility of importing treatment plans generated with different calculation algorithms and/or different optimization engines and evaluating dose distributions on an independent platform. Furthermore the radiotherapy set-up can be exported to the BEAM Monte Carlo code system for dose verification. This can be done by simulating the irradiation of the patient CT dataset or the irradiation of a software-generated water phantom. We show the application of some of the functions implemented in this toolbox for the evaluation and verification of an IMRT treatment of the head and neck region.
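The independent dose verification the abstract describes ultimately comes down to comparing two co-registered dose distributions, e.g., the treatment planning system's against a Monte Carlo recomputation. A minimal NumPy sketch of such a comparison (the function name and the 3% tolerance are illustrative choices, not part of the toolbox's actual API):

```python
import numpy as np

def compare_dose_grids(plan_dose, mc_dose, tolerance=0.03):
    """Compare two co-registered 3D dose grids (e.g., TPS vs. Monte Carlo).

    Differences are normalized to the reference maximum dose, and the
    fraction of voxels deviating by more than `tolerance` (3% here) is
    reported alongside the maximum and mean deviation.
    """
    ref = np.asarray(plan_dose, dtype=float)
    test = np.asarray(mc_dose, dtype=float)
    diff = np.abs(test - ref) / ref.max()   # deviation as fraction of max dose
    return {
        "max_diff": float(diff.max()),
        "mean_diff": float(diff.mean()),
        "fail_fraction": float(np.mean(diff > tolerance)),
    }

# Toy example: the recomputed dose agrees within 2% everywhere
plan = np.ones((4, 4, 4))
mc = plan * 0.98
report = compare_dose_grids(plan, mc)
print(report["fail_fraction"])  # → 0.0
```

Real plan verification adds spatial tolerance as well (gamma analysis), but the normalization-and-threshold pattern above is the core of a dose-difference check.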

  3. A DICOM-RT-based toolbox for the evaluation and verification of radiotherapy plans.

    PubMed

    Spezi, E; Lewis, D G; Smith, C W

    2002-12-07

    The verification of radiotherapy plans is an essential step in the treatment planning process. This is especially important for highly conformal and IMRT plans which produce non-intuitive fluence maps and complex 3D dose distributions. In this work we present a DICOM (Digital Imaging and Communication in Medicine) based toolbox, developed for the evaluation and the verification of radiotherapy treatment plans. The toolbox offers the possibility of importing treatment plans generated with different calculation algorithms and/or different optimization engines and evaluating dose distributions on an independent platform. Furthermore the radiotherapy set-up can be exported to the BEAM Monte Carlo code system for dose verification. This can be done by simulating the irradiation of the patient CT dataset or the irradiation of a software-generated water phantom. We show the application of some of the functions implemented in this toolbox for the evaluation and verification of an IMRT treatment of the head and neck region.

  4. Results from an Independent View on The Validation of Safety-Critical Space Systems

    NASA Astrophysics Data System (ADS)

    Silva, N.; Lopes, R.; Esper, A.; Barbosa, R.

    2013-08-01

    Independent verification and validation (IV&V) has been a key process for decades and is considered in several international standards. One of the activities described in the “ESA ISVV Guide” is independent test verification (stated as Integration/Unit Test Procedures and Test Data Verification). This activity is commonly overlooked, since customers do not really see the added value of thoroughly checking the validation team's work (it could be seen as testing the testers' work). This article presents the consolidated results of a large set of independent test verification activities, including the main difficulties, the results obtained, and the advantages and disadvantages of these activities for industry. This study will support customers in opting in or out of this task in future IV&V contracts, since we provide concrete results from real case studies in the space embedded systems domain.

  5. Managing Multiple Sources of Information in an Independent K-12 Private School: A Case Study in a Student Information Systems Evaluation

    ERIC Educational Resources Information Center

    Yares, Ali Chava Kaufman

    2010-01-01

    Information is everywhere and finding the best method to manage it is a problem that all types of organizations have to deal with. Schools use Student Information Systems (SIS) to manage Student Data, Financial Information, Development, Human Resources, Admission, Financial Aid, Enrollment, Scheduling, and Health Information. A survey of 107…

  6. Extremely accurate sequential verification of RELAP5-3D

    SciTech Connect

    Mesina, George L.; Aumiller, David L.; Buschman, Francis X.

    2015-11-19

    Large computer programs like RELAP5-3D solve complex systems of governing, closure and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user-interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations and comparing calculations against analytical solutions and the method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also provides tests that code development introduces no unintended consequences in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, running multiple cases in a single code execution, and modes of coupled/uncoupled operation. Finally, mathematical analyses of the adequacy of the checks used in the comparisons are provided.
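The sequential-verification idea, comparing calculations between consecutive code versions to catch unintended changes, can be sketched schematically. The function and data names below are hypothetical stand-ins for comparing a code's recorded output variables between version N and version N+1, not RELAP5-3D's actual tooling:

```python
import numpy as np

def sequential_verify(prev_run, new_run, rtol=1e-12):
    """Compare output-variable histories from consecutive code versions.

    Sequential verification assumes the previous version was already
    verified; any difference beyond round-off in the new version's
    results flags a potentially unintended change. Returns the names
    of variables that drifted.
    """
    drifted = []
    for name, prev_vals in prev_run.items():
        new_vals = np.asarray(new_run[name])
        if not np.allclose(np.asarray(prev_vals), new_vals, rtol=rtol, atol=0.0):
            drifted.append(name)
    return drifted

# Version N+1 reproduces pressure exactly but perturbs temperature
prev = {"pressure": [1.0e5, 1.2e5], "temperature": [300.0, 305.0]}
new = {"pressure": [1.0e5, 1.2e5], "temperature": [300.0, 305.001]}
print(sequential_verify(prev, new))  # → ['temperature']
```

The tight relative tolerance is the point: because the comparison is version-against-version rather than code-against-experiment, agreement is expected to round-off, and anything larger warrants investigation.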

  7. Extremely accurate sequential verification of RELAP5-3D

    DOE PAGES

    Mesina, George L.; Aumiller, David L.; Buschman, Francis X.

    2015-11-19

    Large computer programs like RELAP5-3D solve complex systems of governing, closure and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user-interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations and comparing calculations against analytical solutions and the method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also provides tests that code development introduces no unintended consequences in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, running multiple cases in a single code execution, and modes of coupled/uncoupled operation. Finally, mathematical analyses of the adequacy of the checks used in the comparisons are provided.

  8. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  9. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  10. How Formal Dynamic Verification Tools Facilitate Novel Concurrency Visualizations

    NASA Astrophysics Data System (ADS)

    Aananthakrishnan, Sriram; Delisi, Michael; Vakkalanka, Sarvani; Vo, Anh; Gopalakrishnan, Ganesh; Kirby, Robert M.; Thakur, Rajeev

    With the exploding scale of concurrency, presenting valuable pieces of information collected by formal verification tools intuitively and graphically can greatly enhance concurrent system debugging. Traditional MPI program debuggers present trace views of MPI program executions. Such views are redundant, often containing equivalent traces that permute independent MPI calls. In our ISP formal dynamic verifier for MPI programs, we present a collection of alternate views made possible by the use of formal dynamic verification. Some of ISP’s views help pinpoint errors, some facilitate discerning errors by eliminating redundancy, while others help understand the program better by displaying concurrent event orderings that must be respected by all MPI implementations, in the form of completes-before graphs. In this paper, we describe ISP’s graphical user interface (GUI) capabilities in all these areas, which are currently supported by a portable Java-based GUI, a Microsoft Visual Studio GUI, and an Eclipse-based GUI whose development is in progress.

  11. External Verification of the Bundle Adjustment in Photogrammetric Software Using the Damped Bundle Adjustment Toolbox

    NASA Astrophysics Data System (ADS)

    Börlin, Niclas; Grussenmeyer, Pierre

    2016-06-01

    The aim of this paper is to investigate whether the Matlab-based Damped Bundle Adjustment Toolbox (DBAT) can be used to provide independent verification of the BA computation of two popular software packages, PhotoModeler (PM) and PhotoScan (PS). For frame camera data sets with lens distortion, DBAT is able to reprocess and replicate subsets of PM results with high accuracy. For lens-distortion-free data sets, DBAT can furthermore provide comparative results between PM and PS. Data sets for the discussed projects are available from the authors. The use of an external verification tool such as DBAT will enable users to obtain an independent verification of the computations of their software. In addition, DBAT can provide computation of quality parameters such as estimated standard deviations, correlation between parameters, etc., something that should be part of best practice for any photogrammetric software. Finally, as the code is free and open-source, users can add computations of their own.

  12. Wavelet Features Based Fingerprint Verification

    NASA Astrophysics Data System (ADS)

    Bagadi, Shweta U.; Thalange, Asha V.; Jain, Giridhar P.

    2010-11-01

    In this work we present an automatic fingerprint identification system based on Level 3 features. Systems based only on minutiae features do not perform well for poor-quality images. In practice, we often encounter extremely dry or wet fingerprint images with cuts, warts, etc. For such fingerprints, minutiae-based systems show poor performance in real-time authentication applications. To alleviate the problem of poor-quality fingerprints and to improve the overall performance of the system, this paper proposes fingerprint verification based on wavelet statistical features and co-occurrence matrix features. The features include mean, standard deviation, energy, entropy, contrast, local homogeneity, cluster shade, cluster prominence, and information measure of correlation. In this method, matching can be done between the input image and the stored template without exhaustive search using the extracted features. The wavelet-transform-based approach is better than the existing minutiae-based method and takes less response time, making it suitable for on-line verification with high accuracy.
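As a rough illustration of the wavelet statistical features listed (mean, standard deviation, energy, entropy), the sketch below computes a one-level Haar decomposition in plain NumPy and extracts those statistics per subband. The wavelet basis, normalization, and feature set here are illustrative; the paper's actual pipeline may differ:

```python
import numpy as np

def haar2d(img):
    """One-level 2D Haar decomposition into (LL, LH, HL, HH) subbands."""
    a = np.asarray(img, dtype=float)
    # pairwise averages/differences along columns, then along rows
    lo = (a[:, 0::2] + a[:, 1::2]) / 2.0
    hi = (a[:, 0::2] - a[:, 1::2]) / 2.0
    ll = (lo[0::2] + lo[1::2]) / 2.0   # approximation
    lh = (lo[0::2] - lo[1::2]) / 2.0   # horizontal detail
    hl = (hi[0::2] + hi[1::2]) / 2.0   # vertical detail
    hh = (hi[0::2] - hi[1::2]) / 2.0   # diagonal detail
    return ll, lh, hl, hh

def subband_features(band):
    """Statistical features of one subband: mean, std, energy, entropy."""
    b = np.abs(band).ravel()
    energy = float(np.sum(b ** 2))
    # normalize coefficient magnitudes to a distribution for entropy
    p = b / b.sum() if b.sum() > 0 else np.full(b.size, 1.0 / b.size)
    p = p[p > 0]
    entropy = float(-np.sum(p * np.log2(p)))
    return {"mean": float(b.mean()), "std": float(b.std()),
            "energy": energy, "entropy": entropy}

# A flat image has zero energy in every high-frequency subband
ll, lh, hl, hh = haar2d(np.ones((8, 8)))
print(subband_features(hh)["energy"])  # → 0.0
```

Matching then reduces to comparing such feature vectors (e.g., by Euclidean distance) instead of exhaustively aligning minutiae, which is what makes the method fast.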

  13. NEXT Thruster Component Verification Testing

    NASA Technical Reports Server (NTRS)

    Pinero, Luis R.; Sovey, James S.

    2007-01-01

    Component testing is a critical part of thruster life validation activities under NASA's Evolutionary Xenon Thruster (NEXT) project. The high voltage propellant isolators were selected for design verification testing. Even though they are based on a heritage design, design changes were made because the isolators will be operated under different environmental conditions including temperature, voltage, and pressure. The life test of two NEXT isolators was therefore initiated and has accumulated more than 10,000 hr of operation. Measurements to date indicate only a negligibly small increase in leakage current. The cathode heaters were also selected for verification testing. The technology to fabricate these heaters, developed for the International Space Station plasma contactor hollow cathode assembly, was transferred to Aerojet for the fabrication of the NEXT prototype model ion thrusters. Testing the contractor-fabricated heaters is necessary to validate fabrication processes for high-reliability heaters. This paper documents the status of the propellant isolator and cathode heater tests.

  14. Ontology Matching with Semantic Verification

    PubMed Central

    Jean-Mary, Yves R.; Shironoshita, E. Patrick; Kabuka, Mansur R.

    2009-01-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies. PMID:20186256
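A heavily simplified sketch of the match-then-verify idea the abstract describes: compute a lexical similarity between concept names, derive a greedy alignment, then run a semantic check on the result. This is not the ASMOV algorithm itself; the token-Jaccard measure and the "crisscross" hierarchy check below are illustrative stand-ins, and ASMOV additionally iterates and mixes in structural and extensional scores:

```python
def token_sim(a, b):
    """Lexical similarity: Jaccard overlap of lowercase name tokens."""
    ta, tb = set(a.lower().split("_")), set(b.lower().split("_"))
    return len(ta & tb) / len(ta | tb)

def align(src, tgt, threshold=0.5):
    """Greedy one-to-one alignment by descending lexical similarity."""
    pairs = sorted(((token_sim(s, t), s, t) for s in src for t in tgt),
                   reverse=True)
    mapping, used = {}, set()
    for score, s, t in pairs:
        if score >= threshold and s not in mapping and t not in used:
            mapping[s] = t
            used.add(t)
    return mapping

def verify(mapping, src_parent, tgt_parent):
    """Semantic check: flag pairs where a concept's image ends up an
    ancestor of its own parent's image (an inverted hierarchy)."""
    bad = []
    for s, t in mapping.items():
        p = src_parent.get(s)
        if p in mapping and tgt_parent.get(mapping[p]) == t:
            bad.append((s, t))
    return bad

src = ["blood_cell", "red_blood_cell"]
tgt = ["blood_cell", "red_blood_cell"]
m = align(src, tgt)
print(m["red_blood_cell"])  # → red_blood_cell
```

The value of the verification pass is that it rejects alignments a purely lexical matcher would happily produce, which is the distinction the paper's results quantify.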

  15. Verification and transparency in future arms control

    SciTech Connect

    Pilat, J.F.

    1996-09-01

    Verification's importance has changed dramatically over time, although it has always been at the forefront of arms control. The goals and measures of verification and the criteria for success have changed with the times as well, reflecting such factors as the centrality of the prospective agreement to East-West relations during the Cold War, the state of relations between the United States and the Soviet Union, and the technologies available for monitoring. Verification's role may be declining in the post-Cold War period. The prospects for such a development will depend, first and foremost, on the high costs of traditional arms control, especially those associated with requirements for verification. Moreover, the growing interest in informal, or non-negotiated, arms control does not allow for verification provisions by the very nature of these arrangements. Multilateral agreements are also becoming more prominent and argue against highly effective verification measures, in part because of fears of promoting proliferation by opening sensitive facilities to inspectors from potential proliferant states. As a result, it is likely that transparency and confidence-building measures will achieve greater prominence, both as supplements to and substitutes for traditional verification. Such measures are not panaceas and do not offer all that we came to expect from verification during the Cold War. But they may be the best possible means to deal with current problems of arms reductions and restraints at acceptable levels of expenditure.

  16. 30 CFR 227.601 - What are a State's responsibilities if it performs automated verification?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 2 2010-07-01 2010-07-01 false What are a State's responsibilities if it performs automated verification? 227.601 Section 227.601 Mineral Resources MINERALS MANAGEMENT SERVICE, DEPARTMENT OF THE INTERIOR MINERALS REVENUE MANAGEMENT DELEGATION TO STATES States' Responsibilities...

  17. 77 FR 28401 - Information Collection Activities: Legacy Data Verification Process (LDVP); Submitted for Office...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-14

    ... Bureau of Safety and Environmental Enforcement Information Collection Activities: Legacy Data Verification Process (LDVP); Submitted for Office of Management and Budget (OMB) Review; Comment Request ACTION... comments on a collection of information that we will submit to the Office of Management and Budget...

  18. Structural System Identification Technology Verification

    DTIC Science & Technology

    1981-11-01

    USAAVRADCOM-TR-81-D-28. Structural System Identification Technology Verification. N. Giansante, A. Berman, W. O. Flannelly, et al. Approved for public release; distribution unlimited. Prepared for the Applied Technology Laboratory, U.S. Army Research and Technology Laboratories (AVRADCOM), Fort Eustis, Va. 23604. Applied Technology Laboratory position statement: the Applied Technology Laboratory has been involved in the development of structural system identification technology.

  19. Formal verification of AI software

    NASA Technical Reports Server (NTRS)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  20. Nuclear Data Verification and Standardization

    SciTech Connect

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs, which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear-based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards, including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  1. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve the scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce the analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related programs.
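The core idea, checking equivalence only over impacted behaviors, can be illustrated with a bounded stand-in for symbolic execution and the decision procedure. Everything below is hypothetical: the two versions, the impact classifier, and the exhaustive small-domain check that replaces a real solver query:

```python
from itertools import product

def impacted(x, y):
    """Stand-in for an impact summary: the edit only touched the
    x > 0 branch, so behaviors with x <= 0 are unimpacted."""
    return x > 0

def v1(x, y):
    return x * y if x > 0 else y

def v2(x, y):
    # refactored version: impacted branch rewritten, else branch untouched
    return y * x if x > 0 else y

def equivalent_on_impacted(f, g, domain):
    """Bounded equivalence check restricted to impacted behaviors;
    a real tool would discharge this query with a decision procedure."""
    return all(f(x, y) == g(x, y) for x, y in domain if impacted(x, y))

dom = list(product(range(-3, 4), repeat=2))
print(equivalent_on_impacted(v1, v2, dom))  # → True
```

Skipping the unimpacted behaviors is what buys the scalability: the syntactically identical paths never reach the (expensive) equivalence query at all.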

  2. Earthquake Forecasting, Validation and Verification

    NASA Astrophysics Data System (ADS)

    Rundle, J.; Holliday, J.; Turcotte, D.; Donnellan, A.; Tiampo, K.; Klein, B.

    2009-05-01

    Techniques for earthquake forecasting are in development using both seismicity data mining methods, as well as numerical simulations. The former rely on the development of methods to recognize patterns in data, while the latter rely on the use of dynamical models that attempt to faithfully replicate the actual fault systems. Testing such forecasts is necessary not only to determine forecast quality, but also to improve forecasts. A large number of techniques to validate and verify forecasts have been developed for weather and financial applications. Many of these have been elaborated in public locations, including, for example, the URL as listed below. Typically, the goal is to test for forecast resolution, reliability and sharpness. A good forecast is characterized by consistency, quality and value. Most, if not all of these forecast verification procedures can be readily applied to earthquake forecasts as well. In this talk, we discuss both methods of forecasting, as well as validation and verification using a number of these standard methods. We show how these test methods might be useful for both fault-based forecasting, a group of forecast methods that includes the WGCEP and simulator-based renewal models, and grid-based forecasting, which includes the Relative Intensity, Pattern Informatics, and smoothed seismicity methods. We find that applying these standard methods of forecast verification is straightforward. Judgments about the quality of a given forecast method can often depend on the test applied, as well as on the preconceptions and biases of the persons conducting the tests.
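Among the standard verification measures the talk refers to, the Brier score and its associated skill score are perhaps the simplest, and they transfer directly from weather to earthquake forecasting. A minimal sketch (toy numbers, not real forecast data):

```python
import numpy as np

def brier_score(p, outcome):
    """Mean squared error of probability forecasts against 0/1 outcomes;
    lower is better, 0 is a perfect deterministic forecast."""
    p = np.asarray(p, dtype=float)
    o = np.asarray(outcome, dtype=float)
    return float(np.mean((p - o) ** 2))

def brier_skill_score(p, outcome):
    """Skill relative to always forecasting the climatological base rate;
    positive values mean the forecast beats the reference."""
    base = np.full(len(outcome), np.mean(outcome))
    return 1.0 - brier_score(p, outcome) / brier_score(base, outcome)

probs    = [0.9, 0.8, 0.1, 0.2]   # forecast probabilities of an event
occurred = [1,   1,   0,   0]     # observed outcomes
print(round(brier_score(probs, occurred), 3))  # → 0.025
```

The same outcome vector can feed reliability diagrams and sharpness measures, which is why, as the abstract notes, judgments about forecast quality can depend on which test is applied.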

  3. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-06-01

    Many papers have dealt with the political difficulties and ramifications of deep nuclear arms reductions and the issues of “Going to Zero”. Political issues include extended deterrence, conventional weapons, ballistic missile defense, and regional and geo-political security issues. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking beyond New START, the next step will likely include warhead limits in the neighborhood of 1000. Further reductions will include stepping stones at 1000 warheads, hundreds of warheads, and then tens of warheads before final elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1000 warheads, hundreds of warheads, and tens of warheads. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain-of-custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national laboratory complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  4. Realistic weather simulations and forecast verification with COSMO-EULAG

    NASA Astrophysics Data System (ADS)

    Wójcik, Damian; Piotrowski, Zbigniew; Rosa, Bogdan; Ziemiański, Michał

    2015-04-01

    Research conducted at the Polish Institute of Meteorology and Water Management, National Research Institute, in collaboration with the Consortium for Small Scale Modeling (COSMO) resulted in the development of a new prototype model, COSMO-EULAG. The dynamical core of the new model is based on the anelastic set of equations and numerics adopted from the EULAG model. The core is coupled, to first-order accuracy, to the COSMO physical parameterizations involving turbulence, friction, radiation, moist processes and surface fluxes. The tool is capable of computing weather forecasts in mountainous areas for horizontal resolutions ranging from 2.2 km to 0.1 km and with slopes reaching 82 degrees of inclination. The use of EULAG allows the model to profit from EULAG's desirable conservative properties and numerical robustness, confirmed in a number of benchmark tests and widely documented in the scientific literature. In this study we show a realistic case study of Alpine summer convection simulated by COSMO-EULAG. It compares the convection-permitting realization of the flow using a 2.2 km horizontal grid size, typical for contemporary very-high-resolution regional NWP forecasts, with an LES-type realization using a grid size of 100 m. The study presents a comparison of flow, cloud and precipitation structure together with the reference results of a standard compressible COSMO Runge-Kutta model forecast at 2.2 km horizontal resolution. The case study results are supplemented by COSMO-EULAG forecast verification results for an Alpine domain at 2.2 km horizontal resolution. Wind, temperature, cloud, humidity and precipitation scores are presented. The verification period covers one summer month (June 2013) and one autumn month (November 2013). Verification is based on data collected by a network of approximately 200 stations (surface data verification) and 6 stations (upper-air verification) located in the Alps and vicinity.
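Surface-station verification of the kind described typically reduces to continuous scores such as bias (mean error), MAE, and RMSE of forecast values against observations. A minimal sketch with hypothetical station values (the variable names and numbers are illustrative, not from the COSMO-EULAG study):

```python
import numpy as np

def verification_scores(forecast, observed):
    """Standard continuous verification scores against station data:
    bias (mean error), mean absolute error, and root-mean-square error."""
    f = np.asarray(forecast, dtype=float)
    o = np.asarray(observed, dtype=float)
    err = f - o
    return {"bias": float(err.mean()),
            "mae": float(np.abs(err).mean()),
            "rmse": float(np.sqrt((err ** 2).mean()))}

# 2 m temperature (°C) at four hypothetical Alpine stations
fcst = [14.2, 15.0, 13.1, 16.4]
obs  = [14.0, 15.5, 12.9, 16.0]
print(round(verification_scores(fcst, obs)["rmse"], 3))  # → 0.35
```

In practice these scores are aggregated per month and per variable (wind, temperature, humidity, precipitation), which matches the June/November station-network verification the abstract describes.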

  5. Interim Letter Report - Verification Survey Results for Activities Performed in March 2009 for the Vitrification Test Facility Warehouse at the West Valley Demonstration Project, Ashford, New York

    SciTech Connect

    B.D. Estes

    2009-04-24

    The objective of the verification activities was to provide independent radiological surveys and data for use by the Department of Energy (DOE) to ensure that the building satisfies the requirements for release without radiological controls.

  6. Letter Report - Verification Results for the Non-Real Property Radiological Release Program at the West Valley Demonstration Project, Ashford, New York

    SciTech Connect

    M.A. Buchholz

    2009-04-29

    The objective of the verification activities is to provide an independent review of the design, implementation, and performance of the radiological unrestricted release program for personal property, materials, and equipment (non-real property).

  7. Subsurface barrier verification technologies, informal report

    SciTech Connect

    Heiser, J.H.

    1994-06-01

    One of the more promising remediation options available to the DOE waste management community is subsurface barriers. Some of the uses of subsurface barriers include surrounding and/or containing buried waste, as secondary confinement of underground storage tanks, to direct or contain subsurface contaminant plumes and to restrict remediation methods, such as vacuum extraction, to a limited area. To be most effective the barriers should be continuous and depending on use, have few or no breaches. A breach may be formed through numerous pathways including: discontinuous grout application, from joints between panels and from cracking due to grout curing or wet-dry cycling. The ability to verify barrier integrity is valuable to the DOE, EPA, and commercial sector and will be required to gain full public acceptance of subsurface barriers as either primary or secondary confinement at waste sites. It is recognized that no suitable method exists for the verification of an emplaced barrier's integrity. The large size and deep placement of subsurface barriers makes detection of leaks challenging. This becomes magnified if the permissible leakage from the site is low. Detection of small cracks (fractions of an inch) at depths of 100 feet or more has not been possible using existing surface geophysical techniques. Compounding the problem of locating flaws in a barrier is the fact that no placement technology can guarantee the completeness or integrity of the emplaced barrier. This report summarizes several commonly used or promising technologies that have been or may be applied to in-situ barrier continuity verification.

  8. Block 2 SRM conceptual design studies. Volume 1, Book 2: Preliminary development and verification plan

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Activities that will be conducted in support of the development and verification of the Block 2 Solid Rocket Motor (SRM) are described. Development includes design, fabrication, processing, and testing activities in which the results are fed back into the project. Verification includes analytical and test activities which demonstrate SRM component/subassembly/assembly capability to perform its intended function. The management organization responsible for formulating and implementing the verification program is introduced. It also identifies the controls which will monitor and track the verification program. Integral with the design and certification of the SRM are other pieces of equipment used in transportation, handling, and testing which influence the reliability and maintainability of the SRM configuration. The certification of this equipment is also discussed.

  9. PERFORMANCE VERIFICATION OF STORMWATER TREATMENT DEVICES UNDER EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program's goal is to further environmental protection by a...

  10. IN PURSUIT OF AN INTERNATIONAL APPROACH TO QUALITY ASSURANCE FOR ENVIRONMENTAL TECHNOLOGY VERIFICATION

    EPA Science Inventory

    In the mid-1990's, the USEPA began the Environmental Technology Verification (ETV) Program in order to provide purchasers of environmental technology with independently acquired, quality-assured test data upon which to base their purchasing decisions. From the beginning, a str...

  11. The Evolution of Improved Baghouse Filter Media as Observed in the Environmental Technology Verification Program

    EPA Science Inventory

    The U.S. EPA implemented the Environmental Technology Verification (ETV) program in 1995 to generate independent and credible data on the performance of innovative technologies that have the potential to improve protection of public health and the environment. Results are publicl...

  12. Compendium of Arms Control Verification Proposals.

    DTIC Science & Technology

    1982-03-01

    The compendium describes in general the significant features of each verification method concerned. Chapters A to C deal with verification by direct on-site inspection (i.e., increasing as confidence develops), chapter D with control or observation posts, and chapter E with verification by examination of records.

  13. Magnetic cleanliness verification approach on tethered satellite

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Braghin, Massimo; Grande, Maurizio

    1990-01-01

    Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.

  14. Using tools for verification, documentation and testing

    NASA Technical Reports Server (NTRS)

    Osterweil, L. J.

    1978-01-01

    Methodologies are discussed on four of the major approaches to program upgrading -- namely dynamic testing, symbolic execution, formal verification and static analysis. The different patterns of strengths, weaknesses and applications of these approaches are shown. It is demonstrated that these patterns are in many ways complementary, offering the hope that they can be coordinated and unified into a single comprehensive program testing and verification system capable of performing a diverse and useful variety of error detection, verification and documentation functions.

  15. A glutathione-independent glyoxalase of the DJ-1 superfamily plays an important role in managing metabolically generated methylglyoxal in Candida albicans.

    PubMed

    Hasim, Sahar; Hussin, Nur Ahmad; Alomar, Fadhel; Bidasee, Keshore R; Nickerson, Kenneth W; Wilson, Mark A

    2014-01-17

    Methylglyoxal is a cytotoxic reactive carbonyl compound produced by central metabolism. Dedicated glyoxalases convert methylglyoxal to d-lactate using multiple catalytic strategies. In this study, the DJ-1 superfamily member ORF 19.251/GLX3 from Candida albicans is shown to possess glyoxalase activity, making this the first demonstrated glutathione-independent glyoxalase in fungi. The crystal structure of Glx3p indicates that the protein is a monomer containing the catalytic triad Cys(136)-His(137)-Glu(168). Purified Glx3p has an in vitro methylglyoxalase activity (Km = 5.5 mM and kcat = 7.8 s(-1)) that is significantly greater than that of more distantly related members of the DJ-1 superfamily. A close Glx3p homolog from Saccharomyces cerevisiae (YDR533C/Hsp31) also has glyoxalase activity, suggesting that fungal members of the Hsp31 clade of the DJ-1 superfamily are all probable glutathione-independent glyoxalases. A homozygous glx3 null mutant in C. albicans strain SC5314 displays greater sensitivity to millimolar levels of exogenous methylglyoxal, elevated levels of intracellular methylglyoxal, and carbon source-dependent growth defects, especially when grown on glycerol. These phenotypic defects are complemented by restoration of the wild-type GLX3 locus. The growth defect of Glx3-deficient cells in glycerol is also partially complemented by added inorganic phosphate, which is not observed for wild-type or glucose-grown cells. Therefore, C. albicans Glx3 and its fungal homologs are physiologically relevant glutathione-independent glyoxalases that are not redundant with the previously characterized glutathione-dependent GLO1/GLO2 system. In addition to its role in detoxifying glyoxals, Glx3 and its close homologs may have other important roles in stress response.
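
The reported kinetic constants (Km = 5.5 mM, kcat = 7.8 s(-1)) imply a standard Michaelis-Menten rate law for Glx3p. A minimal sketch of the implied turnover follows; the enzyme concentration used is hypothetical, chosen only to illustrate the half-saturation property:

```python
# Michaelis-Menten rate for Glx3p, using the constants reported above.
# The enzyme concentration below is hypothetical, for illustration only.
KM = 5.5e-3      # Michaelis constant, M (5.5 mM)
KCAT = 7.8       # turnover number, 1/s

def rate(substrate_m, enzyme_m):
    """v = kcat * [E] * [S] / (Km + [S]); returns M/s."""
    return KCAT * enzyme_m * substrate_m / (KM + substrate_m)

# At [S] = Km, the rate is exactly half of Vmax = kcat * [E], by definition.
e = 1e-6                      # hypothetical 1 uM enzyme
vmax = KCAT * e
assert abs(rate(KM, e) - vmax / 2) < 1e-12
```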

  16. Managing Change in the Nonprofit Sector: Lessons from the Evolution of Five Independent Research Libraries. Jossey-Bass Nonprofit Sector Series.

    ERIC Educational Resources Information Center

    Bergman, Jed I.; And Others

    This book presents a historical review of five private research libraries in the United States and analyzes how these five nonprofit organizations managed the pressures of change that all nonprofits face. Part one contains five case studies: (1) the Huntington Library, Art Collections, and Botanical Gardens; (2) the Pierpont Morgan Library; (3)…

  17. Duty of Care and Autonomy: How Support Workers Managed the Tension between Protecting Service Users from Risk and Promoting Their Independence in a Specialist Group Home

    ERIC Educational Resources Information Center

    Hawkins, R.; Redley, M.; Holland, A. J.

    2011-01-01

    Background: In the UK those paid to support adults with intellectual disabilities must manage two potentially conflicting duties that are set out in policy documents as being vital to their role: protecting service users (their duty of care) and recognising service users' autonomy. This study focuses specifically on the support of people with the…

  18. [Discussion on care management operation of a visiting nurse--a case of increased ADL by the support of independent life].

    PubMed

    Shiraishi, Minako

    2002-12-01

    In addition to the visiting nursing service conventionally provided, the Department of Long-term Care Insurance Service of this hospital inaugurated its home care supporting service in April 2000. Senior citizens rated higher in the degree of necessity of care in the initial accreditation and in the renewal accreditation of the Long-term Care Insurance tend to have had fewer changes in their services over the last two years. At present, care managers of various professions are involved in the home care supporting services and have no choice but to provide care in non-specialty areas. Under these circumstances, care management by a visiting nurse helped an elderly man increase his ADL and live on his own, and the case is introduced in this article. Mr. K.T. developed angina pectoris at the age of 76, had recurrences of complications with repeated transfers between hospitals, and was eventually admitted to this hospital. Though his muscular strength and ADL had declined because of his long bed-ridden life, he was discharged from the hospital. Nursing services centered on visiting nursing were provided as the home care supporting service when home medical care for the patient was started. Since Mr. K.T. required medical management, he and his family members were not sure whether it would be possible to care for him at home, and they required health and life guidance as well as mental support. Therefore, visiting nursing care was provided by a nurse to assess the needs and condition of the patient, which reduced anxiety and encouraged him. As a result, his ADL increased and his degree of necessity of care decreased from 4 to 2. This success is attributed to the visiting nurse's appropriate care management based on medical expertise from the perspective of nursing, and to the introduction of necessary services at the necessary time based on appropriate assessment of changes in the patient's physical condition and willingness and in the nursing condition of family members. Coordination with the staffs engaged in each

  19. Verification and Implementation of Operations Safety Controls for Flight Missions

    NASA Technical Reports Server (NTRS)

    Jones, Cheryl L.; Smalls, James R.; Carrier, Alicia S.

    2010-01-01

    Approximately eleven years ago, the International Space Station launched its first module from Russia, the Functional Cargo Block (FGB). Safety and Mission Assurance (S&MA) Operations (Ops) Engineers played an integral part in that endeavor by executing strict flight product verification as well as continued staffing of S&MA's console in the Mission Evaluation Room (MER) for that flight mission. How were these engineers able to conduct such a complicated task? They conducted it based on product verification that consisted of ensuring that safety requirements were adequately contained in all flight products that affected crew safety. S&MA Ops engineers apply both systems engineering and project management principles in order to gain an appropriate level of technical knowledge necessary to perform thorough reviews covering the subsystem(s) affected. They also ensured that mission priorities were carried out with great attention to detail and success.

  20. Formal Verification of Air Traffic Conflict Prevention Bands Algorithms

    NASA Technical Reports Server (NTRS)

    Narkawicz, Anthony J.; Munoz, Cesar A.; Dowek, Gilles

    2010-01-01

    In air traffic management, a pairwise conflict is a predicted loss of separation between two aircraft, referred to as the ownship and the intruder. A conflict prevention bands system computes ranges of maneuvers for the ownship that characterize regions in the airspace that are either conflict-free or 'don't go' zones that the ownship has to avoid. Conflict prevention bands are surprisingly difficult to define and analyze. Errors in the calculation of prevention bands may result in incorrect separation assurance information being displayed to pilots or air traffic controllers. This paper presents provably correct 3-dimensional prevention bands algorithms for ranges of track angle, ground speed, and vertical speed maneuvers. The algorithms have been mechanically verified in the Prototype Verification System (PVS). The verification presented in this paper extends in a non-trivial way that of previously published 2-dimensional algorithms.
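
The notion of a pairwise conflict described above can be sketched as a straight-line closest-approach check between ownship and intruder. The separation threshold, lookahead time, and state vectors below are illustrative assumptions, not taken from the PVS-verified algorithms:

```python
# Minimal 2-D pairwise conflict check: does horizontal separation fall
# below threshold D within lookahead time T, assuming straight-line motion?
# D, T, and the example states are illustrative, not from the verified algorithms.

def horizontal_conflict(rel_pos, rel_vel, D=5.0, T=300.0):
    """rel_pos/rel_vel: intruder state minus ownship state (nmi, nmi/s)."""
    sx, sy = rel_pos
    vx, vy = rel_vel
    a = vx * vx + vy * vy
    if a == 0.0:                        # no relative motion: check current range
        return sx * sx + sy * sy < D * D
    t_cpa = -(sx * vx + sy * vy) / a    # time of closest approach
    t = min(max(t_cpa, 0.0), T)         # clamp into the lookahead window [0, T]
    dx, dy = sx + vx * t, sy + vy * t   # relative position at that time
    return dx * dx + dy * dy < D * D

# Head-on geometry: intruder 20 nmi ahead, closing at 0.2 nmi/s.
print(horizontal_conflict((20.0, 0.0), (-0.2, 0.0)))  # True
```

A prevention-bands system would, in effect, sweep candidate ownship maneuvers through a check like this one and report the ranges that come back conflict-free.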

  1. Automated verification of flight software. User's manual

    NASA Technical Reports Server (NTRS)

    Saib, S. H.

    1982-01-01

    AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.

  2. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.

  3. Input apparatus for dynamic signature verification systems

    DOEpatents

    EerNisse, Errol P.; Land, Cecil E.; Snelling, Jay B.

    1978-01-01

    The disclosure relates to signature verification input apparatus comprising a writing instrument and platen containing piezoelectric transducers which generate signals in response to writing pressures.

  4. Hailstorms over Switzerland: Verification of Crowd-sourced Data

    NASA Astrophysics Data System (ADS)

    Noti, Pascal-Andreas; Martynov, Andrey; Hering, Alessandro; Martius, Olivia

    2016-04-01

    The reports of smartphone users witnessing hailstorms can be used as a source of independent, ground-based observation data on ground-reaching hailstorms with high temporal and spatial resolution. The presented work focuses on the verification of crowd-sourced data collected over Switzerland with the help of a smartphone application recently developed by MeteoSwiss. The precise location, the time of hail precipitation, and the hailstone size are included in the crowd-sourced data, which are assessed on the basis of the weather radar data of MeteoSwiss. Two radar-based hail detection algorithms in use at MeteoSwiss, POH (Probability of Hail) and MESHS (Maximum Expected Severe Hail Size), are confronted with the crowd-sourced data. The available data and investigation period span June to August 2015. Filter criteria have been applied in order to remove false reports from the crowd-sourced data. Neighborhood methods have been introduced to reduce the uncertainties that result from spatial and temporal biases. The crowd-sourced and radar data are converted into binary sequences according to previously set thresholds, allowing for categorical verification. Verification scores (e.g. hit rate) are then calculated from a 2x2 contingency table. The hail reporting activity and the patterns corresponding to "hail" and "no hail" reports sent from smartphones have been analyzed. The relationship between the reported hailstone sizes and the two radar-based hail detection algorithms has been investigated.
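
The categorical verification step described above reduces to counting the four cells of a 2x2 contingency table and deriving scores from them. A minimal sketch follows; the counts are hypothetical, not taken from the study:

```python
# Categorical verification scores from a 2x2 contingency table.
# The counts below are hypothetical, for illustration only.
hits = 42            # radar says hail, crowd-sourced report confirms
misses = 18          # report confirms hail, radar algorithm missed it
false_alarms = 11    # radar says hail, no confirming report
correct_negs = 929   # neither radar nor reports indicate hail

hit_rate = hits / (hits + misses)                    # probability of detection
false_alarm_ratio = false_alarms / (hits + false_alarms)
csi = hits / (hits + misses + false_alarms)          # critical success index

print(f"POD={hit_rate:.2f} FAR={false_alarm_ratio:.2f} CSI={csi:.2f}")
```

In the study's setting, each cell would be filled by comparing the thresholded radar field (POH or MESHS) against the filtered crowd-sourced reports within the chosen neighborhood window.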

  5. Verification and validation of COBRA-SFS transient analysis capability

    SciTech Connect

    Rector, D.R.; Michener, T.E.; Cuta, J.M.

    1998-05-01

    This report provides documentation of the verification and validation testing of the transient capability in the COBRA-SFS code, and is organized into three main sections. The primary documentation of the code was published in September 1995, with the release of COBRA-SFS, Cycle 2. The validation and verification supporting the release and licensing of COBRA-SFS was based solely on steady-state applications, even though the appropriate transient terms have been included in the conservation equations from the first cycle. Section 2.0, COBRA-SFS Code Description, presents a capsule description of the code, and a summary of the conservation equations solved to obtain the flow and temperature fields within a cask or assembly model. This section repeats in abbreviated form the code description presented in the primary documentation (Michener et al. 1995), and is meant to serve as a quick reference, rather than independent documentation of all code features and capabilities. Section 3.0, Transient Capability Verification, presents a set of comparisons between code calculations and analytical solutions for selected heat transfer and fluid flow problems. Section 4.0, Transient Capability Validation, presents comparisons between code calculations and experimental data obtained in spent fuel storage cask tests. Based on the comparisons presented in Sections 3.0 and 4.0, conclusions and recommendations for application of COBRA-SFS to transient analysis are presented in Section 5.0.

  6. 40 CFR 1065.550 - Gas analyzer range verification and drift verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Cycles § 1065.550 Gas analyzer range verification and drift verification. (a) Range verification. If an... with a CLD and the removed water is corrected based on measured CO2, CO, THC, and NOX concentrations...-specific emissions over the entire duty cycle for drift. For each constituent to be verified, both sets...

  7. Verification of Sulfate Attack Penetration Rates for Saltstone Disposal Unit Modeling

    SciTech Connect

    Flach, G. P.

    2015-05-12

    Recent Special Analysis modeling of Saltstone Disposal Units consider sulfate attack on concrete and utilize degradation rates estimated from Cementitious Barriers Partnership software simulations. This study provides an independent verification of those simulation results using an alternative analysis method and an independent characterization data source. The sulfate penetration depths estimated herein are similar to the best-estimate values in SRNL-STI-2013-00118 Rev. 2 and well below the nominal values subsequently used to define Saltstone Special Analysis base cases.

  8. Independent Peer Reviews

    SciTech Connect

    2012-03-16

    Independent Assessments: DOE's Systems Integrator convenes independent technical reviews to gauge progress toward meeting specific technical targets and to provide technical information necessary for key decisions.

  9. Verification survey report of the south waste tank farm training/test tower and hazardous waste storage lockers at the West Valley demonstration project, West Valley, New York

    SciTech Connect

    Weaver, Phyllis C.

    2012-08-29

    A team from ORAU's Independent Environmental Assessment and Verification Program performed verification survey activities on the South Test Tower and four Hazardous Waste Storage Lockers. Scan data collected by ORAU determined that both the alpha and the alpha-plus-beta activity were representative of radiological background conditions. The count rate distribution showed no outliers that would be indicative of alpha or alpha-plus-beta count rates in excess of background. It is the opinion of ORAU that the independent verification data collected support the site's conclusions that the South Tower and Lockers sufficiently meet the site criteria for release to recycle and reuse.

  10. Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important

  11. Structural dynamics verification facility study

    NASA Technical Reports Server (NTRS)

    Kiraly, L. J.; Hirchbein, M. S.; Mcaleese, J. M.; Fleming, D. P.

    1981-01-01

    The need for a structural dynamics verification facility to support structures programs was studied. Most of the industry operated facilities are used for highly focused research, component development, and problem solving, and are not used for the generic understanding of the coupled dynamic response of major engine subsystems. Capabilities for the proposed facility include: the ability to both excite and measure coupled structural dynamic response of elastic blades on elastic shafting, the mechanical simulation of various dynamical loadings representative of those seen in operating engines, and the measurement of engine dynamic deflections and interface forces caused by alternative engine mounting configurations and compliances.

  12. Optimal Imaging for Treaty Verification

    SciTech Connect

    Brubaker, Erik; Hilton, Nathan R.; Johnson, William; Marleau, Peter; Kupinski, Matthew; MacGahan, Christopher Jonathan

    2014-09-01

    Future arms control treaty verification regimes may use radiation imaging measurements to confirm and track nuclear warheads or other treaty accountable items (TAIs). This project leverages advanced inference methods developed for medical and adaptive imaging to improve task performance in arms control applications. Additionally, we seek a method to acquire and analyze imaging data of declared TAIs without creating an image of those objects or otherwise storing or revealing any classified information. Such a method would avoid the use of classified-information barriers (IB).

  13. Why do verification and validation?

    DOE PAGES

    Hu, Kenneth T.; Paez, Thomas L.

    2016-02-19

    In this discussion paper, we explore different ways to assess the value of verification and validation (V&V) of engineering models. We first present a literature review on the value of V&V and then use value chains and decision trees to show how value can be assessed from a decision maker's perspective. In this context, the value is what the decision maker is willing to pay for V&V analysis with the understanding that the V&V results are uncertain. Finally, the 2014 Sandia V&V Challenge Workshop is used to illustrate these ideas.
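
The decision-tree framing above can be sketched numerically: the value of V&V is the gain in expected payoff from deciding with the V&V result versus deciding without it. All probabilities and payoffs below are hypothetical, chosen only to make the mechanics concrete:

```python
# Expected value of V&V information, decision-tree style.
# All probabilities and payoffs are hypothetical, for illustration only.
p_model_ok = 0.7   # prior probability that the model is adequate

# Payoff for (action, model-actually-adequate?). Redesign is a safe fallback.
payoff = {('deploy', True): 100, ('deploy', False): -200,
          ('redesign', True): 20, ('redesign', False): 20}

def ev(action, p_ok):
    """Expected payoff of an action under belief p_ok."""
    return p_ok * payoff[(action, True)] + (1 - p_ok) * payoff[(action, False)]

# Without V&V: pick the action with the best expected payoff under the prior.
ev_no_vv = max(ev('deploy', p_model_ok), ev('redesign', p_model_ok))

# With a perfect V&V test: learn the true state, then act optimally in each branch.
ev_perfect = p_model_ok * payoff[('deploy', True)] + \
             (1 - p_model_ok) * payoff[('redesign', False)]

# Upper bound on what the decision maker should pay for the V&V analysis.
value_of_vv = ev_perfect - ev_no_vv
print(ev_no_vv, ev_perfect, value_of_vv)
```

An imperfect (uncertain) V&V result, as the paper emphasizes, would be handled the same way but with Bayesian updating of p_ok in each branch, yielding a value between zero and this perfect-information bound.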

  14. Approaches to wind resource verification

    NASA Technical Reports Server (NTRS)

    Barchet, W. R.

    1982-01-01

    Verification of the regional wind energy resource assessments produced by the Pacific Northwest Laboratory addresses the question: Is the magnitude of the resource given in the assessments truly representative of the area of interest? Approaches using qualitative indicators of wind speed (tree deformation, eolian features), old and new data of opportunity not at sites specifically chosen for their exposure to the wind, and data by design from locations specifically selected to be good wind sites are described. Data requirements and evaluation procedures for verifying the resource are discussed.

  15. CFE verification: The decision to inspect

    SciTech Connect

    Allentuck, J.

    1990-01-01

    Verification of compliance with the provisions of the treaty on Conventional Forces-Europe (CFE) is subject to inspection quotas of various kinds. Thus the decision to carry out a specific inspection or verification activity must be prudently made. This decision process is outlined, and means for "conserving quotas" are suggested. 4 refs., 1 fig.

  16. Identity Verification, Control, and Aggression in Marriage

    ERIC Educational Resources Information Center

    Stets, Jan E.; Burke, Peter J.

    2005-01-01

    In this research we study the identity verification process and its effects in marriage. Drawing on identity control theory, we hypothesize that a lack of verification in the spouse identity (1) threatens stable self-meanings and interaction patterns between spouses, and (2) challenges a (nonverified) spouse's perception of control over the…

  17. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Verification program. 460.17 Section...

  18. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Verification program. 460.17 Section...

  19. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Verification program. 460.17 Section...

  20. 24 CFR 4001.112 - Income verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 5 2010-04-01 2010-04-01 false Income verification. 4001.112... Requirements and Underwriting Procedures § 4001.112 Income verification. The mortgagee shall use FHA's procedures to verify the mortgagor's income and shall comply with the following additional requirements:...

  1. 24 CFR 4001.112 - Income verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 5 2011-04-01 2011-04-01 false Income verification. 4001.112... Requirements and Underwriting Procedures § 4001.112 Income verification. The mortgagee shall use FHA's procedures to verify the mortgagor's income and shall comply with the following additional requirements:...

  2. IMPROVING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATIONS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) began the Environmental Technology Verification (ETV) Program in 1995 as a means of working with the private sector to establish a market-based verification process available to all environmental technologies. Under EPA's Office of R...

  3. The monitoring and verification of nuclear weapons

    SciTech Connect

    Garwin, Richard L.

    2014-05-09

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers.

  4. 18 CFR 158.5 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 158.5 Section 158.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  5. 18 CFR 158.5 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Verification. 158.5 Section 158.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  6. 18 CFR 286.107 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 286.107 Section 286.107 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... Contested Audit Findings and Proposed Remedies § 286.107 Verification. The facts stated in the...

  7. 18 CFR 41.5 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 41.5 Section 41.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  8. 18 CFR 349.5 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 349.5 Section 349.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... PROPOSED REMEDIES § 349.5 Verification. The facts stated in the memorandum must be sworn to by...

  9. 18 CFR 286.107 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 286.107 Section 286.107 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... Contested Audit Findings and Proposed Remedies § 286.107 Verification. The facts stated in the...

  10. 18 CFR 41.5 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 41.5 Section 41.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  11. 18 CFR 41.5 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Verification. 41.5 Section 41.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  12. 18 CFR 349.5 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Verification. 349.5 Section 349.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... PROPOSED REMEDIES § 349.5 Verification. The facts stated in the memorandum must be sworn to by...

  13. 18 CFR 158.5 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 158.5 Section 158.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  14. 18 CFR 349.5 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 349.5 Section 349.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... PROPOSED REMEDIES § 349.5 Verification. The facts stated in the memorandum must be sworn to by...

  15. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 1 2013-10-01 2013-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  16. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  17. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 1 2014-10-01 2014-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  18. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  19. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION FOR INDOOR AIR PRODUCTS

    EPA Science Inventory

    The paper discusses environmental technology verification (ETV) for indoor air products. RTI is developing the framework for a verification testing program for indoor air products, as part of EPA's ETV program. RTI is establishing test protocols for products that fit into three...

  1. Guidelines for qualifying cleaning and verification materials

    NASA Technical Reports Server (NTRS)

    Webb, D.

    1995-01-01

This document is intended to provide guidance in identifying technical issues that must be addressed in a comprehensive qualification plan for materials used in cleaning and cleanliness verification processes. Information presented herein is intended to facilitate development of a definitive checklist that should address all pertinent materials issues when down-selecting a cleaning/verification medium.

  2. Gender Verification of Female Olympic Athletes.

    ERIC Educational Resources Information Center

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  3. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... a flight. Verification must include flight testing. ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Verification program. 460.17 Section 460.17... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17...

  4. ENVIRONMENTAL TECHNOLOGY VERIFICATION AND INDOOR AIR

    EPA Science Inventory

    The paper discusses environmental technology verification and indoor air. RTI has responsibility for a pilot program for indoor air products as part of the U.S. EPA's Environmental Technology Verification (ETV) program. The program objective is to further the development of sel...

  5. Coherent lidar design and performance verification

    NASA Technical Reports Server (NTRS)

    Frehlich, Rod

    1993-01-01

The verification of LAWS beam alignment in space can be achieved by a measurement of heterodyne efficiency using the surface return. The crucial element is a direct detection signal that can be identified for each surface return. This should be satisfied for LAWS but will not be satisfied for descoped LAWS. The performance of algorithms for velocity estimation can be described with two basic parameters: the number of coherently detected photo-electrons per estimate and the number of independent signal samples per estimate. The average error of spectral domain velocity estimation algorithms is bounded by a new periodogram Cramer-Rao Bound. Comparison of the periodogram CRB with the exact CRB indicates that a factor-of-two improvement in velocity accuracy is possible using non-spectral domain estimators. This improvement has been demonstrated with a maximum-likelihood estimator. The comparison of velocity estimation algorithms for 2 and 10 micron coherent lidar was performed by assuming all the system design parameters are fixed and the signal statistics are dominated by a 1 m/s rms wind fluctuation over the range gate. The beam alignment requirements for 2 micron are much more severe than for a 10 micron lidar. The effects of the random backscattered field on estimating the alignment error are a major problem for space-based lidar operation, especially if the heterodyne efficiency cannot be estimated. For LAWS, the biggest science payoff would result from a short transmitted pulse, on the order of 0.5 microseconds instead of 3 microseconds. The numerical errors for simulation of laser propagation in the atmosphere have been determined as a joint project with the University of California, San Diego. Useful scaling laws were obtained for Kolmogorov atmospheric refractive turbulence and for atmospheric refractive turbulence characterized by an inner scale. This permits verification of the simulation procedure, which is essential for the evaluation of the effects of
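
    The spectral-domain estimation discussed above can be sketched with a toy periodogram velocity estimator: peak-pick the DFT of the heterodyne signal and convert the Doppler frequency to velocity via v = f * wavelength / 2. The parameters below are illustrative, not the LAWS design values, and the simulation is noise-free for clarity.

```python
import cmath, math

def periodogram_velocity(samples, dt, wavelength):
    """Velocity whose Doppler bin maximizes a brute-force DFT periodogram."""
    n = len(samples)
    best_power, best_k = -1.0, 0
    for k in range(n):
        s = sum(samples[j] * cmath.exp(-2j * math.pi * k * j / n) for j in range(n))
        power = abs(s) ** 2
        if power > best_power:
            best_power, best_k = power, k
    f = best_k / (n * dt)           # peak frequency in Hz (aliasing ignored)
    return f * wavelength / 2.0     # Doppler relation: f = 2 v / wavelength

# Simulated heterodyne return: 5 m/s target, 2-micron lidar (illustrative values)
wavelength, v_true, dt, n = 2e-6, 5.0, 1e-8, 256
f_doppler = 2.0 * v_true / wavelength          # 5 MHz Doppler shift
samples = [cmath.exp(2j * math.pi * f_doppler * j * dt) for j in range(n)]
v_est = periodogram_velocity(samples, dt, wavelength)
```

    The velocity resolution here is one DFT bin (about 0.39 m/s for these parameters); finer accuracy requires peak interpolation or more samples per estimate, which is exactly the trade-off the abstract parameterizes by independent samples per estimate.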

  6. Video-based fingerprint verification.

    PubMed

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-09-04

Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After the preprocessing and alignment steps, "inside similarity" and "outside similarity" are defined and calculated to take advantage of both the dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single-impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple-impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low.
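
    The EER quoted above can be made concrete with a small sketch: sweep a decision threshold over genuine and impostor match scores and report the point where the false acceptance and false rejection rates are closest. The scores below are invented for illustration, not data from the paper.

```python
def far_frr(genuine, impostor, threshold):
    """False acceptance / false rejection rates at one decision threshold."""
    far = sum(s >= threshold for s in impostor) / len(impostor)
    frr = sum(s < threshold for s in genuine) / len(genuine)
    return far, frr

def equal_error_rate(genuine, impostor):
    """Sweep observed scores; return (EER, threshold) where FAR and FRR are closest."""
    best = None
    for t in sorted(set(genuine) | set(impostor)):
        far, frr = far_frr(genuine, impostor, t)
        gap = abs(far - frr)
        if best is None or gap < best[0]:
            best = (gap, (far + frr) / 2.0, t)
    return best[1], best[2]

# Invented match scores (higher = more similar); not data from the paper
genuine = [0.9, 0.8, 0.85, 0.7, 0.6, 0.95]
impostor = [0.3, 0.4, 0.2, 0.55, 0.35, 0.65]
eer, threshold = equal_error_rate(genuine, impostor)
```

    A deployed system would operate away from the EER point, trading a higher FRR for the low FAR that retail fraud prevention requires.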

  7. Quantitative reactive modeling and verification.

    PubMed

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  8. Modular verification of concurrent systems

    SciTech Connect

    Sobel, A.E.K.

    1986-01-01

During the last ten years, a number of authors have proposed verification techniques that allow one to prove properties of individual processes by using global assumptions about the behavior of the remaining processes in the distributed program. As a result, one must justify these global assumptions before drawing any conclusions regarding the correctness of the entire program. This justification is often the most difficult part of the proof and presents a serious obstacle to hierarchical program development. This thesis develops a new approach to the verification of concurrent systems. The approach is modular and supports compositional development of programs, since the proofs of each individual process of a program are completely isolated from all others. The generality of this approach is illustrated by applying it to a representative set of contemporary concurrent programming languages, namely CSP, Ada, Distributed Processes, and a shared-variable language. It is also shown how the approach may be used to deal with a number of other constructs that have been proposed for inclusion in concurrent languages: FORK and JOIN primitives, nested monitor calls, path expressions, atomic transactions, and asynchronous message passing. These results support the argument that the approach is universal and can be used to design proof systems for any concurrent language.

  9. The effect of mystery shopper reports on age verification for tobacco purchases.

    PubMed

    Krevor, Brad S; Ponicki, William R; Grube, Joel W; DeJong, William

    2011-09-01

    Mystery shops involving attempted tobacco purchases by young buyers have been implemented in order to monitor retail stores' performance in refusing underage sales. Anecdotal evidence suggests that mystery shop visits with immediate feedback to store personnel can improve age verification. This study investigated the effect of monthly and twice-monthly mystery shop reports on age verification. Mystery shoppers visited 45 Walgreens stores 20 times. The stores were randomly assigned to 1 of 3 conditions. Control group stores received no feedback, whereas 2 treatment groups received feedback communications on every visit (twice monthly) or on every second visit (monthly) after baseline. Logit regression models tested whether each treatment group improved verification rates relative to the control group. Postbaseline verification rates were higher in both treatment groups than in the control group, but only the stores receiving monthly communications had a significantly greater improvement compared with the control group stores. Verification rates increased significantly during the study period for all 3 groups, with delayed improvement among control group stores. Communication between managers regarding the mystery shop program may account for the delayed age-verification improvements observed in the control group stores. Encouraging interstore communication might extend the benefits of mystery shop programs beyond those stores that receive this intervention.
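
    The logit-model idea above can be sketched with a stdlib-only toy: regress pass/fail age-verification outcomes on a store-group indicator and inspect the sign and size of the treatment coefficient. This is a minimal illustration with made-up counts, not the study's actual analysis (which modeled repeated visits across three groups over time).

```python
import math

def fit_logit(xs, ys, steps=10000, lr=1.0):
    """Fit y ~ sigmoid(b0 + b1*x) by gradient ascent on the mean log-likelihood."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p          # gradient w.r.t. intercept
            g1 += (y - p) * x    # gradient w.r.t. treatment coefficient
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# x = 1 for a treatment-store visit; y = 1 if the clerk verified age (made-up counts)
xs = [0] * 20 + [1] * 20
ys = [1] * 8 + [0] * 12 + [1] * 15 + [0] * 5   # 40% vs. 75% verification
b0, b1 = fit_logit(xs, ys)
```

    A positive `b1` corresponds to higher verification odds in the treatment stores; a real analysis would add standard errors and time trends before claiming significance.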

  10. Space transportation system payload interface verification

    NASA Technical Reports Server (NTRS)

    Everline, R. T.

    1977-01-01

    The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).

  11. Hierarchical Design and Verification for VLSI

    NASA Technical Reports Server (NTRS)

    Shostak, R. E.; Elliott, W. D.; Levitt, K. N.

    1983-01-01

The specification and verification work is described in detail, and some of the problems and issues to be resolved in their application to Very Large Scale Integration (VLSI) systems are examined. The hierarchical design methodologies enable a system architect or design team to decompose a complex design into a formal hierarchy of levels of abstraction. The first step in program verification is tree formation. The next step after tree formation is the generation of the verification conditions themselves from the trees. The approach taken here is similar in spirit to the corresponding step in program verification but requires modeling of the semantics of circuit elements rather than program statements. The last step is that of proving the verification conditions using a mechanical theorem prover.
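
    The step from a design to verification conditions can be illustrated, very loosely, by the classic weakest-precondition calculation for straight-line programs: backward-substitute each assignment into the postcondition, then hand the resulting condition to a theorem prover. This toy sketch uses assumed names and is not from the report; it only evaluates the condition on sample inputs rather than proving it.

```python
import re

def substitute(cond, var, expr):
    """Replace whole-word occurrences of var in cond with the parenthesized expr."""
    return re.sub(rf"\b{var}\b", "(" + expr + ")", cond)

def wp(assignments, post):
    """Weakest precondition of a straight-line program, by backward substitution."""
    cond = post
    for var, expr in reversed(assignments):
        cond = substitute(cond, var, expr)
    return cond

# Program: y := x + 1; z := y * y    with postcondition z >= 0
prog = [("y", "x + 1"), ("z", "y * y")]
vc = wp(prog, "z >= 0")
# Crude sanity check by evaluation over sample inputs (a prover would prove it):
assert all(eval(vc, {"x": x}) for x in range(-10, 11))
```

    For hardware, the same backward step is applied to models of circuit-element semantics instead of program statements, which is the adaptation the abstract describes.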

  12. Monthly and seasonally verification of precipitation in Poland

    NASA Astrophysics Data System (ADS)

    Starosta, K.; Linkowska, J.

    2009-04-01

The national meteorological service of Poland, the Institute of Meteorology and Water Management (IMWM), joined COSMO (the Consortium for Small-Scale Modelling) in July 2004. In Poland, the COSMO_PL model version 3.5 ran until June 2007; since July 2007, model version 4.0 has been running. The model runs in an operational mode at 14-km grid spacing, twice a day (00 UTC, 12 UTC). A version with 7-km grid spacing is also run for scientific research. Monthly and seasonal verification of the 24-hour (06 UTC - 06 UTC) accumulated precipitation is presented in this paper. The precipitation field of COSMO_LM was verified against the rain gauge network (308 points). The verification was made for every month and all seasons from December 2007 to December 2008, for three forecast days and for selected thresholds: 0.5, 1, 2.5, 5, 10, 20, 25, 30 mm. The following indices from the contingency table were calculated: FBI (frequency bias), POD (probability of detection), PON (probability of detection of non-events), FAR (false alarm rate), TSS (true skill statistic), HSS (Heidke skill score), and ETS (equitable threat score). Percentile ranks and the ROC (relative operating characteristic) are also presented. The ROC is a graph of the hit rate (Y-axis) against the false alarm rate (X-axis) for different decision thresholds.
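
    For reference, the listed indices can all be computed from a single 2x2 contingency table of hits, misses, false alarms, and correct negatives for one precipitation threshold. The sketch below uses illustrative counts, not the study's data.

```python
def verification_scores(hits, misses, false_alarms, correct_negatives):
    """Standard 2x2 contingency-table forecast scores."""
    n = hits + misses + false_alarms + correct_negatives
    fbi = (hits + false_alarms) / (hits + misses)                 # frequency bias
    pod = hits / (hits + misses)                                  # probability of detection
    pon = correct_negatives / (correct_negatives + false_alarms)  # detection of non-events
    pofd = false_alarms / (false_alarms + correct_negatives)      # prob. of false detection
    far = false_alarms / (hits + false_alarms)                    # false alarm ratio
    tss = pod - pofd                                              # true skill statistic
    hits_random = (hits + misses) * (hits + false_alarms) / n     # chance-expected hits
    ets = (hits - hits_random) / (hits + misses + false_alarms - hits_random)
    expected_correct = ((hits + misses) * (hits + false_alarms)
                        + (misses + correct_negatives) * (false_alarms + correct_negatives)) / n
    hss = (hits + correct_negatives - expected_correct) / (n - expected_correct)
    return {"FBI": fbi, "POD": pod, "PON": pon, "FAR": far,
            "TSS": tss, "HSS": hss, "ETS": ets}

# Illustrative counts for one threshold (not the study's data)
scores = verification_scores(hits=42, misses=18, false_alarms=12, correct_negatives=228)
```

    FBI above 1 indicates over-forecasting of the event; TSS, HSS, and ETS are 1 for a perfect forecast and near 0 for a forecast with no skill beyond chance.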

  14. Reactive system verification case study: Fault-tolerant transputer communication

    NASA Technical Reports Server (NTRS)

    Crane, D. Francis; Hamory, Philip J.

    1993-01-01

    A reactive program is one which engages in an ongoing interaction with its environment. A system which is controlled by an embedded reactive program is called a reactive system. Examples of reactive systems are aircraft flight management systems, bank automatic teller machine (ATM) networks, airline reservation systems, and computer operating systems. Reactive systems are often naturally modeled (for logical design purposes) as a composition of autonomous processes which progress concurrently and which communicate to share information and/or to coordinate activities. Formal (i.e., mathematical) frameworks for system verification are tools used to increase the users' confidence that a system design satisfies its specification. A framework for reactive system verification includes formal languages for system modeling and for behavior specification and decision procedures and/or proof-systems for verifying that the system model satisfies the system specifications. Using the Ostroff framework for reactive system verification, an approach to achieving fault-tolerant communication between transputers was shown to be effective. The key components of the design, the decoupler processes, may be viewed as discrete-event-controllers introduced to constrain system behavior such that system specifications are satisfied. The Ostroff framework was also effective. The expressiveness of the modeling language permitted construction of a faithful model of the transputer network. The relevant specifications were readily expressed in the specification language. The set of decision procedures provided was adequate to verify the specifications of interest. The need for improved support for system behavior visualization is emphasized.

  15. AUTOMATED, HIGHLY ACCURATE VERIFICATION OF RELAP5-3D

    SciTech Connect

    George L Mesina; David Aumiller; Francis Buschman

    2014-07-01

Computer programs that analyze light water reactor safety solve complex systems of governing, closure and special process equations to model the underlying physics. In addition, these programs incorporate many other features and are quite large. RELAP5-3D [1] has over 300,000 lines of coding for physics, input, output, data management, user interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. Verification ensures that a program is built right by checking that it meets its design specifications. Recently, there has been increased emphasis on the development of automated verification processes that compare coding against its documented algorithms and equations and compare its calculations against analytical solutions and the method of manufactured solutions [2]. For the first time, the ability exists to ensure that the data transfer operations associated with timestep advancement/repeating and writing/reading a solution to a file have no unintended consequences. To ensure that the code performs as intended over its extensive list of applications, an automated and highly accurate verification method has been modified and applied to RELAP5-3D. Furthermore, mathematical analysis of the adequacy of the checks used in the comparisons is provided.
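
    Verification against analytical or manufactured solutions can be illustrated on a much smaller scale than RELAP5-3D: choose an exact solution, apply the discrete operator, and confirm that the observed convergence order matches the scheme's design order. This sketch is generic, not RELAP5-3D coding.

```python
import math

def second_derivative_error(n):
    """Max error of the central-difference second derivative of sin(x) on [0, pi]."""
    h = math.pi / n
    worst = 0.0
    for i in range(1, n):
        x = i * h
        approx = (math.sin(x - h) - 2.0 * math.sin(x) + math.sin(x + h)) / h**2
        worst = max(worst, abs(approx - (-math.sin(x))))  # exact value is -sin(x)
    return worst

# Halving h should cut the error by ~4x for a second-order scheme
e_coarse = second_derivative_error(50)
e_fine = second_derivative_error(100)
observed_order = math.log(e_coarse / e_fine) / math.log(2.0)
```

    An observed order well below the design order is the signature of a coding or algorithm-documentation mismatch, which is what automated checks of this kind are built to catch.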

  16. Guidance and Control Software Project Data - Volume 3: Verification Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the verification documents from the GCS project. Volume 3 contains four appendices: A. Software Verification Cases and Procedures for the Guidance and Control Software Project; B. Software Verification Results for the Pluto Implementation of the Guidance and Control Software; C. Review Records for the Pluto Implementation of the Guidance and Control Software; and D. Test Results Logs for the Pluto Implementation of the Guidance and Control Software.

  17. Cleared for Launch - Lessons Learned from the OSIRIS-REx System Requirements Verification Program

    NASA Technical Reports Server (NTRS)

    Stevens, Craig; Adams, Angela; Williams, Bradley; Goodloe, Colby

    2017-01-01

    Requirements verification of a large flight system is a challenge. It is especially challenging for engineers taking on their first role in space systems engineering. This paper describes our approach to verification of the Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer (OSIRIS-REx) system requirements. It also captures lessons learned along the way from developing systems engineers embroiled in this process. We begin with an overview of the mission and science objectives as well as the project requirements verification program strategy. A description of the requirements flow down is presented including our implementation for managing the thousands of program and element level requirements and associated verification data. We discuss both successes and methods to improve the managing of this data across multiple organizational interfaces. Our approach to verifying system requirements at multiple levels of assembly is presented using examples from our work at instrument, spacecraft, and ground segment levels. We include a discussion of system end-to-end testing limitations and their impacts to the verification program. Finally, we describe lessons learned that are applicable to all emerging space systems engineers using our unique perspectives across multiple organizations of a large NASA program.

  18. KAT-7 Science Verification Highlights

    NASA Astrophysics Data System (ADS)

    Lucero, Danielle M.; Carignan, Claude; KAT-7 Science Data; Processing Team, KAT-7 Science Commissioning Team

    2015-01-01

    KAT-7 is a pathfinder of the Square Kilometer Array precursor MeerKAT, which is under construction. Its short baselines and low system temperature make it sensitive to large scale, low surface brightness emission. This makes it an ideal instrument to use in searches for faint extended radio emission and low surface density extraplanar gas. We present an update on the progress of several such ongoing KAT-7 science verification projects. These include a large scale radio continuum and polarization survey of the Galactic Center, deep HI observations (100+ hours) of nearby disk galaxies (e.g. NGC253 and NGC3109), and targeted searches for HI tidal tails in galaxy groups (e.g. IC1459). A brief status update for MeerKAT will also be presented if time permits.

  19. MFTF sensor verification computer program

    SciTech Connect

    Chow, H.K.

    1984-11-09

The design, requirements document and implementation of the MFE Sensor Verification System were accomplished by the Measurement Engineering Section (MES), a group which provides instrumentation for the MFTF magnet diagnostics. The sensors, installed on and around the magnets and solenoids, housed in a vacuum chamber, will supply information about the temperature, strain, pressure, liquid helium level and magnet voltage to the facility operator for evaluation. As the sensors are installed, records must be maintained as to their initial resistance values. Also, as the work progresses, monthly checks will be made to ensure continued sensor health. Finally, after the MFTF-B demonstration, yearly checks will be performed, as well as checks of sensors as problems develop. The software to acquire and store the data was written by Harry Chow, Computations Department. The acquired data will be transferred to the MFE data base computer system.

  20. Verification of NASA Emergent Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time make it more difficult to design them and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study of swarm-based missions with which to experiment and test current formal methods with intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.

  1. Retail applications of signature verification

    NASA Astrophysics Data System (ADS)

    Zimmerman, Thomas G.; Russell, Gregory F.; Heilper, Andre; Smith, Barton A.; Hu, Jianying; Markman, Dmitry; Graham, Jon E.; Drews, Clemens

    2004-08-01

The dramatic rise in identity theft, the ever-pressing need to provide convenient checkout services to attract and retain loyal customers, and the growing use of multi-function signature capture devices in the retail sector provide favorable conditions for the deployment of dynamic signature verification (DSV) in retail settings. We report on the development of a DSV system to meet the needs of the retail sector. We currently have a database of approximately 10,000 signatures collected from 600 subjects and forgers. Previous work at IBM on DSV has been merged and extended to achieve robust performance on pen position data available from commercial point-of-sale hardware, achieving equal error rates on skilled forgeries and authentic signatures of 1.5% to 4%.

  2. Verification of FANTASTIC integrated code

    NASA Technical Reports Server (NTRS)

    Chauhan, Rajinder Singh

    1987-01-01

    FANTASTIC is an acronym for Failure Analysis Nonlinear Thermal and Structural Integrated Code. This program was developed by Failure Analysis Associates, Palo Alto, Calif., for MSFC to improve the accuracy of solid rocket motor nozzle analysis. FANTASTIC has three modules: FACT - thermochemical analysis; FAHT - heat transfer analysis; and FAST - structural analysis. All modules have keywords for data input. Work is in progress for the verification of the FAHT module, which is done by using data for various problems with known solutions as inputs to the FAHT module. The information obtained is used to identify problem areas of the code and passed on to the developer for debugging purposes. Failure Analysis Associates have revised the first version of the FANTASTIC code and a new improved version has been released to the Thermal Systems Branch.

  3. Performing Verification and Validation in Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  4. How frequent are overactive bladder symptoms in women with urodynamic verification of an overactive bladder?

    PubMed Central

    Yeniel, Ahmet Özgür; Ergenoğlu, Mete Ahmet; Meseri, Reci; Aşkar, Niyazi; İtil, İsmail Mete

    2012-01-01

    Objective: To determine the relationship between overactive bladder symptoms and urodynamic verification of overactive bladder. Material and Methods: Between June 2011 and November 2011, 159 patients underwent urodynamics (UDS) at our urogynecology unit in the Ege University Hospital. Of these, 95 patients who complained of urgency, had no overt neurological disease or bladder outlet obstruction, and were not taking any medication affecting lower urinary tract function were evaluated. SPSS (ver. 15.0) was used to evaluate the data; the chi-square test and the t test for independent samples were used for analysis. Results: The mean age was 54.5±12. Frequency was the most common symptom in women with overactive bladder (OAB) (82.1%), followed by nocturia (57.8%) and urgency urinary incontinence (57.8%). The incidence of detrusor overactivity (DOA) was 38.9%. There was no significant relationship between the presence of DOA and OAB symptoms. Leakage at urodynamics was found in 46.3%, with no significant association with DOA. Total bladder capacity was significantly lower in women who had DOA (p<0.001). Conclusion: It appears that overactive bladder symptoms do not predict detrusor overactivity. Urodynamic investigation is not mandatory in the initial management of women with only OAB symptoms. PMID:24592016
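
The chi-square test used in studies like this checks whether a symptom and urodynamically confirmed DOA are associated in a 2x2 contingency table. A minimal pure-Python sketch of the statistic (the counts below are hypothetical, not the study's data):

```python
def chi_square_2x2(table):
    # table = [[a, b], [c, d]]: symptom present/absent vs DOA present/absent
    (a, b), (c, d) = table
    n = a + b + c + d
    # Expected counts under the independence hypothesis: row total * col total / n
    exp = [[(a + b) * (a + c) / n, (a + b) * (b + d) / n],
           [(c + d) * (a + c) / n, (c + d) * (b + d) / n]]
    obs = [[a, b], [c, d]]
    return sum((obs[i][j] - exp[i][j]) ** 2 / exp[i][j]
               for i in range(2) for j in range(2))

stat = chi_square_2x2([[20, 35], [17, 23]])       # hypothetical counts
print(f"chi2 = {stat:.3f}  significant: {stat > 3.841}")
# 3.841 is the chi-square critical value for df=1 at alpha=0.05
```

A statistic below the critical value, as here, is the "no significant relationship" outcome the abstract reports.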

  5. Independent Study in Idaho.

    ERIC Educational Resources Information Center

    Idaho Univ., Moscow.

    This guide to independent study in Idaho begins with introductory information on the following aspects of independent study: the Independent Study in Idaho consortium, student eligibility, special needs, starting dates, registration, costs, textbooks and instructional materials, e-mail and faxing, refunds, choosing a course, time limits, speed…

  6. Enhanced Cancelable Biometrics for Online Signature Verification

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo; Inuma, Manabu; Shikata, Junji; Otsuka, Akira

    Cancelable approaches for biometric person authentication have been studied to protect enrolled biometric data, and several algorithms have been proposed. One drawback of cancelable approaches is that their performance is inferior to that of non-cancelable approaches. In this paper, we propose a scheme to improve the performance of a cancelable approach for online signature verification. Our scheme generates two cancelable datasets from one raw dataset and uses them for verification. Preliminary experiments were performed using a distance-based online signature verification algorithm. The experimental results show that our proposed scheme is promising.

  7. Formal specification and verification of Ada software

    NASA Technical Reports Server (NTRS)

    Hird, Geoffrey R.

    1991-01-01

    The use of formal methods in software development achieves levels of quality assurance unobtainable by other means. The Larch approach to specification is described, and the specification of avionics software designed to implement the logic of a flight control system is given as an example. Penelope, an Ada verification environment, is described. The Penelope user inputs mathematical definitions, Larch-style specifications, and Ada code, and performs machine-assisted proofs that the code obeys its specifications. As an example, the verification of a binary search function is considered. Emphasis is given to techniques assisting the reuse of a verification effort on modified code.

  8. Liquefied Natural Gas (LNG) dispenser verification device

    NASA Astrophysics Data System (ADS)

    Xiong, Maotao; Yang, Jie-bin; Zhao, Pu-jun; Yu, Bo; Deng, Wan-quan

    2013-01-01

    The working principle and calibration status of LNG (Liquefied Natural Gas) dispensers in China are introduced. To address the shortcomings of the weighing method for calibrating LNG dispensers, an LNG dispenser verification device has been developed. The device uses the master meter method to verify LNG dispensers in the field. Experimental results indicate that the device has stable performance, a high accuracy level, and a flexible construction, reaching an internationally advanced level. The device will promote the development of the LNG dispenser industry in China and improve the technical level of LNG dispenser manufacturing.

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PORTABLE GAS CHROMATOGRAPH ELECTRONIC SENSOR TECHNOLOGY MODEL 4100

    EPA Science Inventory

    The U.S. Environmental Protection Agency, through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. As part of this program, the...

  10. 30 CFR 285.705 - When must I use a Certified Verification Agent (CVA)?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (CVA)? 285.705 Section 285.705 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, REGULATION, AND... OUTER CONTINENTAL SHELF Facility Design, Fabrication, and Installation Certified Verification Agent... the Facility Design Report, the Fabrication and Installation Report, and the Project Modifications...

  11. Proficiency Verification Systems: A Large-Scale, Flexible-Use Program for Evaluating Achievement in Mathematics.

    ERIC Educational Resources Information Center

    Buchanan, Aaron D.; Milazzo, Patricia A.

    Proficiency Verification Systems (PVS) is a new concept in providing management information about local achievement in basic skills. The program includes a network of assessment and reporting components which can be combined in varied ways to generate proficiency information about individual pupils and groups, for teachers, principals, and school…

  12. 47 CFR 64.1120 - Verification of orders for telecommunications service.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... not be owned, managed, controlled, or directed by the carrier or the carrier's marketing agent; must... carrier's marketing agent; and must operate in a location physically separate from the carrier or the carrier's marketing agent. (i) Methods of third party verification. Automated third party...

  13. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  14. 28 CFR 603.1 - Jurisdiction of the Independent Counsel

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... JURISDICTION OF THE INDEPENDENT COUNSEL: IN RE MADISON GUARANTY SAVINGS & LOAN ASSOCIATION § 603.1 Jurisdiction of the Independent Counsel (a) The Independent Counsel: In re Madison Guaranty Savings & Loan... Corporation; or (3) Capital Management Services. (b) The Independent Counsel: In re Madison Guaranty...

  15. Space Station automated systems testing/verification and the Galileo Orbiter fault protection design/verification

    NASA Technical Reports Server (NTRS)

    Landano, M. R.; Easter, R. W.

    1984-01-01

    Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements lead to a number of issues and uncertainties which require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.

  16. An Approach to Keeping Independent Colleges Independent.

    ERIC Educational Resources Information Center

    Northwest Area Foundation, St. Paul, Minn.

    As a result of the financial difficulties faced by independent colleges in the northwestern United States, the Northwest Area Foundation in 1972 surveyed the administrations of 80 private colleges to get a profile of the colleges, a list of their current problems, and some indication of how the problems might be approached. The three top problems…

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: STORMWATER TECHNOLOGIES

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techn...

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: FUEL CELLS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techno...

  19. 78 FR 58492 - Generator Verification Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ... Energy Regulatory Commission 18 CFR Part 40 Generator Verification Reliability Standards AGENCY: Federal... approve the following Reliability Standards that were submitted to the Commission for approval by the North American Electric Reliability Corporation, the Commission-certified Electric...

  20. Environmental Technology Verification Program (ETV) Policy Compendium

    EPA Science Inventory

    The Policy Compendium summarizes operational decisions made to date by participants in the U.S. Environmental Protection Agency's (EPA's) Environmental Technology Verification Program (ETV) to encourage consistency among the ETV centers. The policies contained herein evolved fro...

  1. Engineering drawing field verification program. Revision 3

    SciTech Connect

    Ulk, P.F.

    1994-10-12

    Safe, efficient operation of waste tank farm facilities depends in part on the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, defines the degree of visual observation to be performed, and specifies how the results are documented. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification, for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer-aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented.

  2. The PASCAL-HDM Verification System

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The PASCAL-HDM verification system is described. This system supports the mechanical generation of verification conditions from PASCAL programs and HDM-SPECIAL specifications using the Floyd-Hoare axiomatic method. Tools are provided to parse programs and specifications, check their static semantics, generate verification conditions from Hoare rules, and translate the verification conditions appropriately for proof using the Shostak Theorem Prover. The differences between standard PASCAL and the language handled by this system are explained. These consist mostly of restrictions to the standard language definition, the only extensions or modifications being the addition of specifications to the code and the requirement that references to a function of no arguments have empty parentheses.
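
The Floyd-Hoare machinery behind such systems generates verification conditions by substituting expressions backward through assignments (the assignment axiom {P[e/x]} x := e {P}). A toy weakest-precondition generator for a two-statement language, using naive textual substitution (an illustrative sketch, not the actual PASCAL-HDM implementation):

```python
def wp(stmt, post):
    """Weakest precondition of a statement relative to a postcondition string."""
    kind = stmt[0]
    if kind == "assign":              # ("assign", var, expr): substitute expr for var
        _, var, expr = stmt
        # Naive textual replace; a real tool substitutes over a parsed syntax tree
        return post.replace(var, f"({expr})")
    if kind == "seq":                 # ("seq", s1, s2): wp(s1, wp(s2, post))
        _, s1, s2 = stmt
        return wp(s1, wp(s2, post))
    raise ValueError(f"unknown statement kind: {kind}")

# Verification condition for {x >= 0}  y := x + 1; x := y * 2  {x > 0}
prog = ("seq", ("assign", "y", "x + 1"), ("assign", "x", "y * 2"))
print("x >= 0  ==>  " + wp(prog, "x > 0"))
# prints: x >= 0  ==>  ((x + 1) * 2) > 0
```

The resulting implication is the verification condition that would be handed to a theorem prover such as Shostak's.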

  3. MAMA Software Features: Quantification Verification Documentation-1

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-21

    This document reviews the verification of the basic shape quantification attributes in the MAMA software against hand calculations in order to show that the calculations are implemented mathematically correctly and give the expected quantification results.

  4. Verification of the Calore thermal analysis code.

    SciTech Connect

    Dowding, Kevin J.; Blackwell, Bennie Francis

    2004-07-01

    Calore is the ASC code developed to model steady and transient thermal diffusion with chemistry and dynamic enclosure radiation. An integral part of the software development process is code verification, which addresses the question 'Are we correctly solving the model equations?' This process aids the developers in that it identifies potential software bugs and gives the thermal analyst confidence that a properly prepared input will produce satisfactory output. Grid refinement studies have been performed on problems for which we have analytical solutions. In this talk, the code verification process is overviewed and recent results are presented. Recent verification studies have focused on transient nonlinear heat conduction and verifying algorithms associated with (tied) contact and adaptive mesh refinement. In addition, an approach to measure the coverage of the verification test suite relative to intended code applications is discussed.
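
Grid refinement studies of this kind typically report the observed order of convergence: if refining the mesh cuts the error against the analytical solution at the rate the discretization promises, the equations are being solved correctly. A sketch with hypothetical error norms (not Calore data):

```python
import math

def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
    # p = log(e_coarse / e_fine) / log(r): the slope of error vs grid spacing.
    # For a second-order scheme, halving h should quarter the error (p ~ 2).
    return math.log(err_coarse / err_fine) / math.log(refinement_ratio)

# Hypothetical L2 errors against an analytical conduction solution
errors = [4.0e-3, 1.0e-3, 2.5e-4]   # grids of spacing h, h/2, h/4
for ec, ef in zip(errors, errors[1:]):
    print(f"observed order = {observed_order(ec, ef):.2f}")
```

An observed order matching the theoretical order across successive grid pairs is the usual acceptance criterion in a code verification study.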

  5. 9 CFR 417.8 - Agency verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... deviation occurs; (d) Reviewing the critical limits; (e) Reviewing other records pertaining to the...

  6. The NPARC Alliance Verification and Validation Archive

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Dudek, Julianne C.; Tatum, Kenneth E.

    2000-01-01

    The NPARC Alliance (National Project for Applications oriented Research in CFD) maintains a publicly-available, web-based verification and validation archive as part of the development and support of the WIND CFD code. The verification and validation methods used for the cases attempt to follow the policies and guidelines of the ASME and AIAA. The emphasis is on air-breathing propulsion flow fields with Mach numbers ranging from low-subsonic to hypersonic.

  7. Transmutation Fuel Performance Code Thermal Model Verification

    SciTech Connect

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the verification of the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model's temperature calculations agree with those of the commercial software ABAQUS (Version 6.4-4). This report outlines the verification methodology, code input, and calculation results.

  8. Dynamic testing for shuttle design verification

    NASA Technical Reports Server (NTRS)

    Green, C. E.; Leadbetter, S. A.; Rheinfurth, M. H.

    1972-01-01

    Space shuttle design verification requires dynamic data from full scale structural component and assembly tests. Wind tunnel and other scaled model tests are also required early in the development program to support the analytical models used in design verification. Presented is a design philosophy based on mathematical modeling of the structural system strongly supported by a comprehensive test program; some of the types of required tests are outlined.

  9. 30 CFR 250.911 - If my platform is subject to the Platform Verification Program, what must I do?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... a project management timeline, Gantt Chart, that depicts when interim and final reports required by... 30 Mineral Resources 2 2010-07-01 2010-07-01 false If my platform is subject to the Platform Verification Program, what must I do? 250.911 Section 250.911 Mineral Resources MINERALS MANAGEMENT...

  10. A Runtime Verification Framework for Control System Simulation

    SciTech Connect

    Ciraci, Selim; Fuller, Jason C.; Daily, Jeffrey A.; Makhmalbaf, Atefe; Callahan, Charles D.

    2014-08-02

    In a standard workflow for the validation of a control system, the control system is implemented as an extension to a simulator. Such simulators are complex software systems, and engineers may unknowingly violate constraints a simulator places on extensions. As such, errors may be introduced in the implementation of either the control system or the simulator, leading to invalid simulation results. This paper presents a novel runtime verification approach for verifying control system implementations within simulators. The major contribution of the approach is the two-tier specification process. In the first tier, engineers model constraints using a domain-specific language tailored to modeling a controller’s response to changes in its input. The language is high-level and effectively hides the implementation details of the simulator, allowing engineers to specify design-level constraints independent of low-level simulator interfaces. In the second tier, simulator developers provide mapping rules for mapping design-level constraints to the implementation of the simulator. Using the rules, an automated tool transforms the design-level specifications into simulator-specific runtime verification specifications and generates monitoring code which is injected into the implementation of the simulator. During simulation, these monitors observe the input and output variables of the control system and report changes to the verifier. The verifier checks whether these changes follow the constraints of the control system. We describe application of this approach to the verification of the constraints of an HVAC control system implemented with the power grid simulator GridLAB-D.
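
The injected monitors described above can be pictured as small state machines that watch a controller's input and output variables and flag constraint violations. A simplified sketch of one such monitor (the constraint, names, and trace below are hypothetical, not GridLAB-D's actual interface):

```python
class ResponseMonitor:
    """Checks a hypothetical design-level constraint: after the input rises
    above `setpoint`, the controller's output must drop within `deadline`
    observed steps."""
    def __init__(self, setpoint, deadline):
        self.setpoint, self.deadline = setpoint, deadline
        self.pending = None          # step at which we began awaiting a response
        self.violations = []

    def observe(self, step, inp, out, prev_out):
        if self.pending is not None:
            if out < prev_out:
                self.pending = None                  # controller responded in time
            elif step - self.pending >= self.deadline:
                self.violations.append(step)         # deadline missed
                self.pending = None
        if self.pending is None and inp > self.setpoint:
            self.pending = step                      # start the deadline clock

m = ResponseMonitor(setpoint=25.0, deadline=2)
trace = [(22.0, 1.0), (27.0, 1.0), (27.0, 1.0), (27.0, 1.0)]  # (input, output)
prev = None
for i, (inp, out) in enumerate(trace):
    m.observe(i, inp, out, prev if prev is not None else out)
    prev = out
print(m.violations)   # the output never drops, so the deadline is missed
```

In the paper's architecture this checking logic would be generated from the design-level specification and injected into the simulator rather than hand-written.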

  11. National Verification System of the National Meteorological Center, China

    NASA Astrophysics Data System (ADS)

    Zhang, Jinyan; Wei, Qing; Qi, Dan

    2016-04-01

    Product Quality Verification Division for official weather forecasting of China was founded in April 2011. It is affiliated to the Forecast System Laboratory (FSL), National Meteorological Center (NMC), China. There are three employees in this department; I am one of them and am in charge of the Product Quality Verification Division at NMC, China. After five years of construction, an integrated real-time National Verification System of NMC, China has been established. At present, its primary roles include: 1) to verify official weather forecasting quality of NMC, China; 2) to verify the official city weather forecasting quality of the Provincial Meteorological Bureaus; 3) to evaluate forecasting quality for each forecaster in NMC, China. To verify official weather forecasting quality of NMC, China, we have developed:
    • Grid QPF verification module (including upscaling)
    • Grid temperature, humidity and wind forecast verification module
    • Severe convective weather forecast verification module
    • Typhoon forecast verification module
    • Disaster forecast verification module
    • Disaster warning verification module
    • Medium and extended period forecast verification module
    • Objective elements forecast verification module
    • Ensemble precipitation probabilistic forecast verification module
    To verify the official city weather forecasting quality of the Provincial Meteorological Bureaus, we have developed:
    • City elements forecast verification module
    • Public heavy rain forecast verification module
    • City air quality forecast verification module
    To evaluate forecasting quality for each forecaster in NMC, China, we have developed:
    • Off-duty forecaster QPF practice evaluation module
    • QPF evaluation module for forecasters
    • Severe convective weather forecast evaluation module
    • Typhoon track forecast evaluation module for forecasters
    • Disaster warning evaluation module for forecasters
    • Medium and extended period forecast evaluation module
    The further

  12. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Niton XLt 700 Series (XLt) XRF Services x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XLt analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XLt analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy
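
Accuracy and precision in instrument evaluations like this one are commonly summarized as percent recovery against a certified concentration and relative standard deviation (RSD) over replicate readings. A minimal sketch with hypothetical replicate values (not the demonstration's data):

```python
import statistics

def recovery_and_rsd(measured, certified):
    # Accuracy: mean percent recovery vs the certified value.
    # Precision: relative standard deviation (%) of the replicates.
    mean = statistics.fmean(measured)
    recovery = 100.0 * mean / certified
    rsd = 100.0 * statistics.stdev(measured) / mean
    return recovery, rsd

# Hypothetical replicate Pb readings (mg/kg) on a sample certified at 500 mg/kg
rec, rsd = recovery_and_rsd([480, 505, 512, 498], 500.0)
print(f"recovery {rec:.1f}%  RSD {rsd:.1f}%")
```

Recovery near 100% with low RSD across the target elements is what such a verification report would count as good accuracy and precision; matrix effects show up as element- or site-dependent bias in the recovery.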

  13. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Elvatech, Ltd. ElvaX (ElvaX) x-ray fluorescence (XRF) analyzer distributed in the United States by Xcalibur XRF Services (Xcalibur), was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ElvaX analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ElvaX analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as s

  14. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Oxford ED2000 x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ED2000 analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ED2000 analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was assessed by com

  15. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Rigaku ZSX Mini II (ZSX Mini II) XRF Services x-ray fluorescence (XRF) analyzer was demon-strated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ZSX Mini II analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ZSX Mini II analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element con

  16. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Innov-X XT400 Series (XT400) x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XT400 analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XT400 analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was as

  17. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Rontec PicoTAX x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the PicoTAX analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the PicoTAX analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was assessed by c

  18. Learning Assumptions for Compositional Verification

    NASA Technical Reports Server (NTRS)

    Cobleigh, Jamieson M.; Giannakopoulou, Dimitra; Pasareanu, Corina; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Compositional verification is a promising approach to addressing the state explosion problem associated with model checking. One compositional technique advocates proving properties of a system by checking properties of its components in an assume-guarantee style. However, the application of this technique is difficult because it involves non-trivial human input. This paper presents a novel framework for performing assume-guarantee reasoning in an incremental and fully automated fashion. To check a component against a property, our approach generates assumptions that the environment needs to satisfy for the property to hold. These assumptions are then discharged on the rest of the system. Assumptions are computed by a learning algorithm. They are initially approximate, but become gradually more precise by means of counterexamples obtained by model checking the component and its environment, alternately. This iterative process may at any stage conclude that the property is either true or false in the system. We have implemented our approach in the LTSA tool and applied it to the analysis of a NASA system.

  19. Visual inspection for CTBT verification

    SciTech Connect

    Hawkins, W.; Wohletz, K.

    1997-03-01

    On-site visual inspection will play an essential role in future Comprehensive Test Ban Treaty (CTBT) verification. Although seismic and remote sensing techniques are the best understood and most developed methods for detection of evasive testing of nuclear weapons, visual inspection can greatly augment the certainty and detail of understanding provided by these more traditional methods. Not only can visual inspection offer "ground truth" in cases of suspected nuclear testing, but it also can provide accurate source location and testing media properties necessary for detailed analysis of seismic records. For testing in violation of the CTBT, an offending party may attempt to conceal the test, most likely by underground burial. While such concealment may not prevent seismic detection, evidence of test deployment, location, and yield can be disguised. In this light, if a suspicious event is detected by seismic or other remote methods, visual inspection of the event area is necessary to document any evidence that might support a claim of nuclear testing and to provide data needed to further interpret seismic records and guide further investigations. However, the methods for visual inspection are neither widely known nor widely appreciated, and experience is presently limited. Visual inspection can be achieved by simple, non-intrusive means, primarily geological in nature, and it is the purpose of this report to describe the considerations, procedures, and equipment required to field such an inspection.

  20. ALMA Band 5 Science Verification

    NASA Astrophysics Data System (ADS)

    Humphreys, L.; Biggs, A.; Immer, K.; Laing, R.; Liu, H. B.; Marconi, G.; Mroczkowski, T.; Testi, L.; Yagoubov, P.

    2017-03-01

    ALMA Band 5 (163–211 GHz) was recently commissioned and Science Verification (SV) observations were obtained in the latter half of 2016. A primary scientific focus of this band is the H2O line at 183.3 GHz, which can be observed around 15% of the time when the precipitable water vapour is sufficiently low (< 0.5 mm). Many more lines are covered in Band 5 and can be observed for over 70% of the time on Chajnantor, requiring similar restrictions to those for ALMA Bands 4 and 6. Examples include the H218O line at 203 GHz, some of the bright (3–2) lines of singly and doubly deuterated forms of formaldehyde, the (2–1) lines of HCO+, HCN, HNC, N2H+ and several of their isotopologues. A young star-forming region near the centre of the Milky Way, an evolved star also in our Galaxy, and a nearby ultraluminous infrared galaxy (ULIRG) were observed as part of the SV process and the data are briefly described. The reduced data, along with imaged data products, are now public and demonstrate the power of ALMA for high-resolution studies of H2O and other molecules in a variety of astronomical targets.

  1. Towards a Theory for Integration of Mathematical Verification and Empirical Testing

    NASA Technical Reports Server (NTRS)

    Lowry, Michael; Boyd, Mark; Kulkarni, Deepak

    1998-01-01

    From the viewpoint of a project manager responsible for the V&V (verification and validation) of a software system, mathematical verification techniques provide a possibly useful orthogonal dimension to otherwise standard empirical testing. However, the value they add to an empirical testing regime both in terms of coverage and in fault detection has been difficult to quantify. Furthermore, potential cost savings from replacing testing with mathematical verification techniques cannot be realized until the tradeoffs and synergies can be formulated. Integration of formal verification with empirical testing is also difficult because the idealized view of mathematical verification providing a correctness proof with total coverage is unrealistic and does not reflect the limitations imposed by computational complexity of mathematical techniques. This paper first describes a framework based on software reliability and formalized fault models for a theory of software design fault detection - and hence the utility of various tools for debugging. It then describes a utility model for integrating mathematical and empirical techniques with respect to fault detection and coverage analysis. It then considers the optimal combination of black-box testing, white-box (structural) testing, and formal methods in V&V of a software system. Using case studies from NASA software systems, it then demonstrates how this utility model can be used in practice.

  2. Hybrid Enrichment Verification Array: Module Characterization Studies

    SciTech Connect

    Zalavadia, Mital A.; Smith, Leon E.; McDonald, Benjamin S.; Kulisek, Jonathan A.; Mace, Emily K.; Deshmukh, Nikhil S.

    2016-03-01

    The work presented in this report is focused on the characterization and refinement of the Hybrid Enrichment Verification Array (HEVA) approach, which combines the traditional 186-keV 235U signature with high-energy prompt gamma rays from neutron capture in the detector and surrounding collimator material, to determine the relative enrichment and 235U mass of the cylinder. The design of the HEVA modules (hardware and software) deployed in the current field trial builds on over seven years of study and evolution by PNNL, and consists of a ø3''×3'' NaI(Tl) scintillator coupled to an Osprey digital multi-channel analyzer tube base from Canberra. The core of the HEVA methodology, the high-energy prompt gamma-ray signature, serves as an indirect method for the measurement of total neutron emission from the cylinder. A method for measuring the intrinsic efficiency of this “non-traditional” neutron signature and the results from a benchmark experiment are presented. Also discussed are potential perturbing effects on the non-traditional signature, including short-lived activation of materials in the HEVA module. Modeling and empirical results are presented to demonstrate that such effects are expected to be negligible for the envisioned implementation scenario. In comparison to previous versions, the new design boosts the high-energy prompt gamma-ray signature, provides more flexible and effective collimation, and improves count-rate management via commercially available pulse-processing electronics with a special modification prompted by PNNL.

  3. Verification of operating software for cooperative monitoring applications

    SciTech Connect

    Tolk, K.M.; Rembold, R.K.

    1997-08-01

    Monitoring agencies often use computer-based equipment to control instruments and to collect data at sites that are being monitored under international safeguards or other cooperative monitoring agreements. In order for these data to serve as an independent verification of data supplied by the host at the facility, the software used must be trusted by the monitoring agency. The monitoring party must be sure that the software has not been altered to give results that could lead to erroneous conclusions about nuclear materials inventories or other operating conditions at the site. The host might also want to verify that the software being used is the software that was previously inspected, in order to be assured that only data allowed under the agreement are being collected. This paper presents a method to provide this verification using keyed hash functions and explains how the proposed method overcomes possible vulnerabilities in methods currently in use, such as loading the software from trusted disks. The use of public key data authentication for this purpose is also discussed.
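
    A minimal sketch of the keyed-hash idea described above, using Python's standard `hmac` module. The image bytes, key, and reference digest below are illustrative placeholders, not artifacts of any actual monitoring system: the monitoring party holds the key, records a digest of the inspected software, and later checks that the fielded software still produces the same digest.

```python
import hashlib
import hmac

def keyed_digest(image: bytes, key: bytes) -> str:
    """Compute a keyed hash (HMAC-SHA256) of a software image."""
    return hmac.new(key, image, hashlib.sha256).hexdigest()

def verify_software(image: bytes, key: bytes, reference: str) -> bool:
    """Constant-time comparison against the digest recorded at inspection time."""
    return hmac.compare_digest(keyed_digest(image, key), reference)

# Placeholder values for illustration only.
inspected = b"monitoring-software-v1.0"
key = b"agency-secret-key"  # held only by the monitoring party
reference = keyed_digest(inspected, key)

assert verify_software(inspected, key, reference)              # unmodified image passes
assert not verify_software(b"tampered image", key, reference)  # altered image fails
```

    Because the digest depends on a key the host does not possess, the host cannot alter the software and recompute a matching digest, which is the advantage over an unkeyed checksum.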

  4. Formal verification of a microcoded VIPER microprocessor using HOL

    NASA Technical Reports Server (NTRS)

    Levitt, Karl; Arora, Tejkumar; Leung, Tony; Kalvala, Sara; Schubert, E. Thomas; Windley, Philip; Heckman, Mark; Cohen, Gerald C.

    1993-01-01

    The Royal Signals and Radar Establishment (RSRE) and members of the Hardware Verification Group at Cambridge University conducted a joint effort to prove the correspondence between the electronic block model and the top level specification of Viper. Unfortunately, the proof became too complex and unmanageable within the given time and funding constraints, and is thus incomplete as of the date of this report. This report describes an independent attempt to use the HOL (Cambridge Higher Order Logic) mechanical verifier to verify Viper. Deriving from recent results in hardware verification research at UC Davis, the approach has been to redesign the electronic block model to make it microcoded and to structure the proof in a series of decreasingly abstract interpreter levels, the lowest being the electronic block level. The highest level is the RSRE Viper instruction set. Owing to the new approach and some results on the proof of generic interpreters as applied to simple microprocessors, this attempt required an effort approximately an order of magnitude less than the previous one.

  5. 321 B North and PRL-S006 Verification Surveys at Former McClellan AFB, Sacramento, CA

    DTIC Science & Technology

    2013-01-22

    The verification surveys were conducted after remediation and after the contractor had completed its Final Status Survey (FSS) sampling. Radium-226 (Ra-226) was the sole radionuclide of concern. Keywords: USAF School of Aerospace Medicine (USAFSAM), former McClellan AFB, radium-226, verification survey, final status survey, independent radiological assessment.

  6. Concepts of Model Verification and Validation

    SciTech Connect

    B.H. Thacker; S.W. Doebling; F.M. Hemez; M.C. Anderson; J.E. Pepin; E.A. Rodriguez

    2004-10-30

    Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. The development of guidelines and procedures for conducting a model V&V program are currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models. The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model V&V for all

  7. American Independence. Fifth Grade.

    ERIC Educational Resources Information Center

    Crosby, Annette

    This fifth grade teaching unit covers early conflicts between the American colonies and Britain, battles of the American Revolutionary War, and the Declaration of Independence. Knowledge goals address the pre-revolutionary acts enforced by the British, the concepts of conflict and independence, and the major events and significant people from the…

  8. Independence of Internal Auditors.

    ERIC Educational Resources Information Center

    Montondon, Lucille; Meixner, Wilda F.

    1993-01-01

    A survey of 288 college and university auditors investigated patterns in their appointment, reporting, and supervisory practices as indicators of independence and objectivity. Results indicate a weakness in the positioning of internal auditing within institutions, possibly compromising auditor independence. Because the auditing function is…

  9. PRL S-030A Verification Survey at Former McClellan AFB, Sacramento, CA

    DTIC Science & Technology

    2013-03-25

    The verification survey was performed after the contractor (Cabrera Services, Inc.) had completed the majority of the excavation of contaminated soils from the site. Radium-226 (Ra-226) was the sole radionuclide of concern. Keywords: USAF School of Aerospace Medicine (USAFSAM), former McClellan AFB, radium-226, verification survey, final status survey, independent radiological assessment.

  10. Towards the Formal Verification of a Distributed Real-Time Automotive System

    NASA Technical Reports Server (NTRS)

    Endres, Erik; Mueller, Christian; Shadrin, Andrey; Tverdyshev, Sergey

    2010-01-01

    We present the status of a project that aims at building, and formally and pervasively verifying, a distributed automotive system. The target system is a gate-level model which consists of several interconnected electronic control units with independent clocks. This model is verified against the specification as seen by a system programmer. The automotive system is implemented on several FPGA boards. The pervasive verification is carried out using a combination of interactive theorem proving (Isabelle/HOL) and model checking (LTL).

  11. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone due to the intensive and customized process necessary to verify each individual component model and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.

  12. Monitoring and verification R&D

    SciTech Connect

    Pilat, Joseph F; Budlong - Sylvester, Kory W; Fearey, Bryan L

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R&D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R&D required to address these gaps and other monitoring and verification challenges.

  13. Device-Independent Certification of a Nonprojective Qubit Measurement

    NASA Astrophysics Data System (ADS)

    Gómez, Esteban S.; Gómez, Santiago; González, Pablo; Cañas, Gustavo; Barra, Johanna F.; Delgado, Aldo; Xavier, Guilherme B.; Cabello, Adán; Kleinmann, Matthias; Vértesi, Tamás; Lima, Gustavo

    2016-12-01

    Quantum measurements on a two-level system can have more than two independent outcomes, and in this case, the measurement cannot be projective. Measurements of this general type are essential to an operational approach to quantum theory, but so far, the nonprojective character of a measurement can only be verified experimentally by already assuming a specific quantum model of parts of the experimental setup. Here, we overcome this restriction by using a device-independent approach. In an experiment on pairs of polarization-entangled photonic qubits we violate by more than 8 standard deviations a Bell-like correlation inequality that is valid for all sets of two-outcome measurements in any dimension. We combine this with a device-independent verification that the system is best described by two qubits, which therefore constitutes the first device-independent certification of a nonprojective quantum measurement.
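
    The Bell-like inequality used above generalizes the familiar CHSH inequality. As a minimal illustration of the kind of device-independent test involved (the CHSH case, not the authors' actual inequality), the following computes the quantum-mechanical CHSH value for a maximally entangled pair at the standard optimal settings and checks that it exceeds the local-hidden-variable bound of 2.

```python
import math

def correlation(a: float, b: float) -> float:
    # Quantum prediction for the singlet state: E(a, b) = -cos(a - b),
    # where a and b are the two parties' analyzer angles in radians.
    return -math.cos(a - b)

# Standard CHSH measurement settings that maximize the quantum violation.
a0, a1 = 0.0, math.pi / 2
b0, b1 = math.pi / 4, 3 * math.pi / 4

# CHSH combination S = E(a0,b0) - E(a0,b1) + E(a1,b0) + E(a1,b1).
S = (correlation(a0, b0) - correlation(a0, b1)
     + correlation(a1, b0) + correlation(a1, b1))

# Local hidden-variable models obey |S| <= 2; quantum mechanics reaches
# Tsirelson's bound 2*sqrt(2) ~ 2.828 at these settings.
assert abs(S) > 2
```

    In the experiment the analogous role is played by measured correlations, so the violation certifies a property of the devices without assuming a quantum model of the apparatus.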

  14. Device-Independent Certification of a Nonprojective Qubit Measurement.

    PubMed

    Gómez, Esteban S; Gómez, Santiago; González, Pablo; Cañas, Gustavo; Barra, Johanna F; Delgado, Aldo; Xavier, Guilherme B; Cabello, Adán; Kleinmann, Matthias; Vértesi, Tamás; Lima, Gustavo

    2016-12-23

    Quantum measurements on a two-level system can have more than two independent outcomes, and in this case, the measurement cannot be projective. Measurements of this general type are essential to an operational approach to quantum theory, but so far, the nonprojective character of a measurement can only be verified experimentally by already assuming a specific quantum model of parts of the experimental setup. Here, we overcome this restriction by using a device-independent approach. In an experiment on pairs of polarization-entangled photonic qubits we violate by more than 8 standard deviations a Bell-like correlation inequality that is valid for all sets of two-outcome measurements in any dimension. We combine this with a device-independent verification that the system is best described by two qubits, which therefore constitutes the first device-independent certification of a nonprojective quantum measurement.

  15. Shell Element Verification & Regression Problems for DYNA3D

    SciTech Connect

    Zywicz, E

    2008-02-01

    A series of quasi-static regression/verification problems were developed for the triangular and quadrilateral shell element formulations contained in Lawrence Livermore National Laboratory's explicit finite element program DYNA3D. Each regression problem imposes both displacement- and force-type boundary conditions to probe the five independent nodal degrees of freedom employed in the targeted formulation. When applicable, the finite element results are compared with small-strain linear-elastic closed-form reference solutions to verify select aspects of each formulation's implementation. Although all problems in the suite depict the same geometry, material behavior, and loading conditions, each problem represents a unique combination of shell formulation, stabilization method, and integration rule. Collectively, the thirty-six new regression problems in the test suite cover nine different shell formulations, three hourglass stabilization methods, and three families of through-thickness integration rules.

  16. The DES Science Verification Weak Lensing Shear Catalogs

    SciTech Connect

    Jarvis, M.

    2016-05-01

    We present weak lensing shear catalogs for 139 square degrees of data taken during the Science Verification (SV) time for the new Dark Energy Camera (DECam) being used for the Dark Energy Survey (DES). We describe our object selection, point spread function estimation and shear measurement procedures using two independent shear pipelines, IM3SHAPE and NGMIX, which produce catalogs of 2.12 million and 3.44 million galaxies respectively. We also detail a set of null tests for the shear measurements and find that they pass the requirements for systematic errors at the level necessary for weak lensing science applications using the SV data. Furthermore, we discuss some of the planned algorithmic improvements that will be necessary to produce sufficiently accurate shear catalogs for the full 5-year DES, which is expected to cover 5000 square degrees.

  17. The DES Science Verification weak lensing shear catalogues

    NASA Astrophysics Data System (ADS)

    Jarvis, M.; Sheldon, E.; Zuntz, J.; Kacprzak, T.; Bridle, S. L.; Amara, A.; Armstrong, R.; Becker, M. R.; Bernstein, G. M.; Bonnett, C.; Chang, C.; Das, R.; Dietrich, J. P.; Drlica-Wagner, A.; Eifler, T. F.; Gangkofner, C.; Gruen, D.; Hirsch, M.; Huff, E. M.; Jain, B.; Kent, S.; Kirk, D.; MacCrann, N.; Melchior, P.; Plazas, A. A.; Refregier, A.; Rowe, B.; Rykoff, E. S.; Samuroff, S.; Sánchez, C.; Suchyta, E.; Troxel, M. A.; Vikram, V.; Abbott, T.; Abdalla, F. B.; Allam, S.; Annis, J.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Capozzi, D.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Castander, F. J.; Clampitt, J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Doel, P.; Fausti Neto, A.; Flaugher, B.; Fosalba, P.; Frieman, J.; Gaztanaga, E.; Gerdes, D. W.; Gruendl, R. A.; Gutierrez, G.; Honscheid, K.; James, D. J.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Li, T. S.; Lima, M.; March, M.; Martini, P.; Miquel, R.; Mohr, J. J.; Neilsen, E.; Nord, B.; Ogando, R.; Reil, K.; Romer, A. K.; Roodman, A.; Sako, M.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Swanson, M. E. C.; Tarle, G.; Thaler, J.; Thomas, D.; Walker, A. R.; Wechsler, R. H.

    2016-08-01

    We present weak lensing shear catalogues for 139 square degrees of data taken during the Science Verification (SV) time for the new Dark Energy Camera (DECam) being used for the Dark Energy Survey (DES). We describe our object selection, point spread function estimation and shear measurement procedures using two independent shear pipelines, IM3SHAPE and NGMIX, which produce catalogues of 2.12 million and 3.44 million galaxies, respectively. We detail a set of null tests for the shear measurements and find that they pass the requirements for systematic errors at the level necessary for weak lensing science applications using the SV data. We also discuss some of the planned algorithmic improvements that will be necessary to produce sufficiently accurate shear catalogues for the full 5-yr DES, which is expected to cover 5000 square degrees.

  18. The DES Science Verification Weak Lensing Shear Catalogs

    DOE PAGES

    Jarvis, M.

    2016-05-01

    We present weak lensing shear catalogs for 139 square degrees of data taken during the Science Verification (SV) time for the new Dark Energy Camera (DECam) being used for the Dark Energy Survey (DES). We describe our object selection, point spread function estimation and shear measurement procedures using two independent shear pipelines, IM3SHAPE and NGMIX, which produce catalogs of 2.12 million and 3.44 million galaxies respectively. We also detail a set of null tests for the shear measurements and find that they pass the requirements for systematic errors at the level necessary for weak lensing science applications using the SV data. Furthermore, we discuss some of the planned algorithmic improvements that will be necessary to produce sufficiently accurate shear catalogs for the full 5-year DES, which is expected to cover 5000 square degrees.

  19. INF verification: a guide for the perplexed

    SciTech Connect

    Mendelsohn, J.

    1987-09-01

    The administration has dug itself some deep holes on the verification issue. It will have to conclude an arms control treaty without having resolved earlier (but highly questionable) compliance issues on which it has placed great emphasis. It will probably have to abandon its more sweeping (and unnecessary) on-site inspection (OSI) proposals because of adverse security and political implications for the United States and its allies. And, finally, it will probably have to present to the Congress an INF treaty that will provide for a considerably less stringent (but nonetheless adequate) verification regime than it had originally demanded. It is difficult to dispel the impression that, when the likelihood of concluding an INF treaty seemed remote, the administration indulged its penchant for intrusive and non-negotiable verification measures. As the possibility of, and eagerness for, a treaty increased, and as the Soviet Union shifted its policy from one of resistance to OSI to one of indicating that on-site verification involved reciprocal obligations, the administration was forced to scale back its OSI rhetoric. This re-evaluation of OSI by the administration does not make the INF treaty any less verifiable; from the outset the Reagan administration was asking for a far more extensive verification package than was necessary, practicable, acceptable, or negotiable.

  20. Neighborhood Repulsed Metric Learning for Kinship Verification.

    PubMed

    Lu, Jiwen; Zhou, Xiuzhuang; Tan, Yap-Pen; Shang, Yuanyuan; Zhou, Jie

    2013-07-16

    Kinship verification from facial images is an interesting and challenging problem in computer vision, and there are very limited attempts to tackle this problem in the literature. In this paper, we propose a new neighborhood repulsed metric learning (NRML) method for kinship verification. Motivated by the fact that interclass samples (without kinship relations) with higher similarity usually lie in a neighborhood and are more easily misclassified than those with lower similarity, we aim to learn a distance metric under which the intraclass samples (with kinship relations) are pulled as close as possible and interclass samples lying in a neighborhood are repulsed and pushed away as far as possible, simultaneously, such that more discriminative information can be exploited for verification. To make better use of multiple feature descriptors to extract complementary information, we further propose a multiview NRML (MNRML) method to seek a common distance metric to perform multiple feature fusion to improve the kinship verification performance. Experimental results are presented to demonstrate the efficacy of our proposed methods. Lastly, we also test human ability in kinship verification from facial images and our experimental results show that our methods are comparable to that of human observers.

  1. Neighborhood repulsed metric learning for kinship verification.

    PubMed

    Lu, Jiwen; Zhou, Xiuzhuang; Tan, Yap-Pen; Shang, Yuanyuan; Zhou, Jie

    2014-02-01

    Kinship verification from facial images is an interesting and challenging problem in computer vision, and there are very limited attempts to tackle this problem in the literature. In this paper, we propose a new neighborhood repulsed metric learning (NRML) method for kinship verification. Motivated by the fact that interclass samples (without a kinship relation) with higher similarity usually lie in a neighborhood and are more easily misclassified than those with lower similarity, we aim to learn a distance metric under which the intraclass samples (with a kinship relation) are pulled as close as possible and interclass samples lying in a neighborhood are repulsed and pushed away as far as possible, simultaneously, such that more discriminative information can be exploited for verification. To make better use of multiple feature descriptors to extract complementary information, we further propose a multiview NRML (MNRML) method to seek a common distance metric to perform multiple feature fusion to improve the kinship verification performance. Experimental results are presented to demonstrate the efficacy of our proposed methods. Finally, we also test human ability in kinship verification from facial images and our experimental results show that our methods are comparable to that of human observers.
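
    Metric-learning methods of this family score a face pair under a learned positive semidefinite matrix M, declaring kinship when the distance falls below a tuned threshold. The sketch below shows only that distance computation with NumPy; the feature vectors and the diagonal M are hypothetical placeholders, and the NRML optimization itself is not reproduced here.

```python
import numpy as np

def metric_distance(x, y, M):
    """Distance under a learned PSD matrix M: sqrt((x - y)^T M (x - y))."""
    d = np.asarray(x) - np.asarray(y)
    return float(np.sqrt(d @ M @ d))

# Illustrative 3-D feature vectors for a face pair (placeholders, not real descriptors).
x = np.array([1.0, 0.0, 2.0])
y = np.array([0.0, 1.0, 2.0])

# With M = I the metric reduces to plain Euclidean distance.
assert np.isclose(metric_distance(x, y, np.eye(3)), np.sqrt(2.0))

# A hypothetical learned M that down-weights the first feature shrinks the distance:
# this is how training can pull intraclass (kin) pairs together along directions
# that are uninformative for discrimination.
M = np.diag([0.1, 1.0, 1.0])
assert metric_distance(x, y, M) < metric_distance(x, y, np.eye(3))
```

    The NRML objective chooses M so that kin pairs end up close and hard (neighboring) non-kin pairs end up far under exactly this kind of distance.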

  2. Hybrid Deep Learning for Face Verification.

    PubMed

    Sun, Yi; Wang, Xiaogang; Tang, Xiaoou

    2016-10-01

    This paper proposes a hybrid convolutional network (ConvNet)-Restricted Boltzmann Machine (RBM) model for face verification. A key contribution of this work is to learn high-level relational visual features with rich identity similarity information. The deep ConvNets in our model start by extracting local relational visual features from two face images in comparison, which are further processed through multiple layers to extract high-level and global relational features. To keep enough discriminative information, we use the last hidden layer neuron activations of the ConvNet as features for face verification instead of those of the output layer. To characterize face similarities from different aspects, we concatenate the features extracted from different face region pairs by different deep ConvNets. The resulting high-dimensional relational features are classified by an RBM for face verification. After pre-training each ConvNet and the RBM separately, the entire hybrid network is jointly optimized to further improve the accuracy. Various aspects of the ConvNet structures, relational features, and face verification classifiers are investigated. Our model achieves the state-of-the-art face verification performance on the challenging LFW dataset under both the unrestricted protocol and the setting when outside data is allowed to be used for training.

  3. Monitoring/Verification using DMS: TATP Example

    SciTech Connect

    Stephan Weeks; Kevin Kyle

    2008-03-01

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a 'smart dust' sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the use of explosives or chemical and biological weapons in terrorist activities. Two peroxide-based liquid explosives, triacetone triperoxide (TATP) and hexamethylene triperoxide diamine (HMTD), are synthesized from common chemicals such as hydrogen peroxide, acetone, sulfuric acid, ammonia, and citric acid (Figure 1). Recipes can be readily found on the Internet by anyone seeking to generate sufficient quantities of these highly explosive chemicals to cause considerable collateral damage. Detection of TATP and HMTD by advanced sensing systems can provide the early warning necessary to prevent terror plots from coming to fruition. DMS is currently one of the foremost emerging technologies for the separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. DMS separates and identifies ions at ambient pressures by utilizing the non-linear dependence of an ion's mobility on the radio frequency (rf) electric field strength. GC is widely considered to be one of the leading analytical methods for the separation of chemical species in complex mixtures. Advances in the technique have led to the development of low-thermal-mass fast GC columns. These columns are capable of

  4. MACCS2 development and verification efforts

    SciTech Connect

    Young, M.; Chanin, D.

    1997-03-01

    MACCS2 represents a major enhancement of the capabilities of its predecessor MACCS, the MELCOR Accident Consequence Code System. MACCS, released in 1987, was developed to estimate the potential impacts to the surrounding public of severe accidents at nuclear power plants. The principal phenomena considered in MACCS/MACCS2 are atmospheric transport and deposition under time-variant meteorology, short-term and long-term mitigative actions and exposure pathways, deterministic and stochastic health effects, and economic costs. MACCS2 was developed as a general-purpose analytical tool applicable to diverse reactor and nonreactor facilities. The MACCS2 package includes three primary enhancements: (1) a more flexible emergency response model, (2) an expanded library of radionuclides, and (3) a semidynamic food-chain model. In addition, errors that had been identified in MACCS version 1.5.11.1 were corrected, including an error that prevented the code from providing intermediate-phase results. The MACCS2 version 1.10 beta test was released to the beta-test group in May 1995. In addition, the University of New Mexico (UNM) has completed an independent verification study of the code package. Since the beta-test release of MACCS2 version 1.10, a number of minor errors have been identified and corrected, and a number of enhancements have been added to the code package. The code enhancements added since the beta-test release of version 1.10 include: (1) an option to allow the user to input the σy and σz plume expansion parameters in a table-lookup form for incremental downwind distances, (2) an option to define different initial dimensions for up to four segments of a release, (3) an enhancement to the COMIDA2 food-chain model preprocessor to allow the user to supply externally calculated tables of tritium food-chain dose per unit deposition on farmland to support analyses of tritium releases, and (4) the capability to calculate direction-dependent doses.
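Enhancement (1), user-supplied σy/σz tables versus downwind distance, can be illustrated with a toy ground-level Gaussian plume calculation. The table values and release parameters below are hypothetical, and this sketch is not MACCS2's actual dispersion model.

```python
import math
import bisect

# hypothetical user-supplied plume-expansion parameters vs downwind distance (m)
DIST  = [100.0, 500.0, 1000.0, 5000.0]
SIG_Y = [8.0,   35.0,  65.0,   200.0]
SIG_Z = [5.0,   18.0,  32.0,   90.0]

def interp(x, xs, ys):
    """Piecewise-linear table lookup, clamped at both ends."""
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    i = bisect.bisect_right(xs, x)
    t = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
    return ys[i - 1] + t * (ys[i] - ys[i - 1])

def centerline_conc(q, u, x, h):
    """Ground-level centerline concentration of a Gaussian plume:
    C = Q / (pi * u * sigma_y * sigma_z) * exp(-h^2 / (2 sigma_z^2)),
    with sigma_y, sigma_z taken from the lookup tables above."""
    sy = interp(x, DIST, SIG_Y)
    sz = interp(x, DIST, SIG_Z)
    return q / (math.pi * u * sy * sz) * math.exp(-h * h / (2.0 * sz * sz))
```

As expected for a ground-level release, the centerline concentration falls off with downwind distance as the tabulated σ values grow.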

  5. Media independent interface

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The work done on the Media Independent Interface (MII) Interface Control Document (ICD) program is described, and recommendations based on it are made. Explanations and rationale for the content of the ICD itself are presented.

  6. PBL Verification with Radiosonde and Aircraft Data

    NASA Astrophysics Data System (ADS)

    Tsidulko, M.; McQueen, J.; Dimego, G.; Ek, M.

    2008-12-01

    Boundary layer depth is an important characteristic in weather forecasting, and it is a key parameter in air quality modeling, determining the extent of turbulence and dispersion of pollutants. Real-time PBL depths from the NAM (WRF/NMM) model are verified against different types of observations. PBL depth verification is incorporated into the NCEP verification system, including the ability to provide a range of statistical characteristics for the boundary layer heights. For the model, several types of boundary layer definitions are used: PBL height from the TKE scheme and from the critical Ri number approach, as well as mixed layer depth, are compared with observations. Observed PBL depths are determined by applying the Ri number approach to radiosonde profiles. A preliminary study of using ACARS data for PBL verification is also conducted.
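The radiosonde-based diagnosis mentioned in the abstract can be sketched as follows: compute the bulk Richardson number relative to the surface at each sounding level and take the lowest level where it exceeds a critical value. The critical value of 0.25 and the synthetic profile in the test are common but assumed choices here, not values from the paper.

```python
G = 9.81        # gravitational acceleration (m s^-2)
RI_CRIT = 0.25  # critical bulk Richardson number (a common, assumed choice)

def pbl_height(z, theta_v, u, v):
    """Return the lowest level (m) at which the bulk Richardson number,
    computed relative to the lowest level, first exceeds RI_CRIT.
    z: heights (m), theta_v: virtual potential temperature (K),
    u, v: wind components (m/s). A simplified sketch of the approach."""
    z0, t0 = z[0], theta_v[0]
    for k in range(1, len(z)):
        wind2 = u[k] ** 2 + v[k] ** 2
        if wind2 == 0.0:
            continue  # bulk Ri undefined in calm layers; skip
        ri = (G / t0) * (theta_v[k] - t0) * (z[k] - z0) / wind2
        if ri > RI_CRIT:
            return z[k]
    return z[-1]  # PBL top not found below the top of the profile
```

Applying the same definition to model profiles and to soundings keeps the verification consistent between forecast and observation.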

  7. Active alignment/contact verification system

    DOEpatents

    Greenbaum, William M.

    2000-01-01

    A system involving an active (i.e., electrical) technique for the verification of: (1) close-tolerance mechanical alignment between two components, and (2) electrical contact between mating parts through an elastomeric interface. For example, the two components may be an alumina carrier and a printed circuit board, two extremely small, high-density mating parts that require alignment within a fraction of a mil, as well as a specified interface point of engagement between the parts. The system comprises pairs of conductive structures defined in the surface layers of the alumina carrier and the printed circuit board, for example. The first pair of conductive structures relates to item (1) above and permits alignment verification between mating parts. The second pair of conductive structures relates to item (2) above and permits verification of electrical contact between mating parts.

  8. Land Ice Verification and Validation Kit

    SciTech Connect

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community-prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of tests where differences occur.
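At its core, a bit-for-bit evaluation asks whether regenerated model output is byte-identical to a stored benchmark, which a checksum comparison answers directly. The sketch below is generic and not LIVV's implementation; the file names in the demo are hypothetical.

```python
import hashlib
import os
import tempfile

def file_digest(path):
    """SHA-256 of a file, read in chunks, for bit-for-bit comparison."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def bit_for_bit(path_a, path_b):
    """True only if the two output files are byte-identical."""
    return file_digest(path_a) == file_digest(path_b)

# demo: a benchmark, an identical regenerated output, and a perturbed one
d = tempfile.mkdtemp()
paths = {}
for name, data in [("bench.out", b"0.12345 0.67890\n"),
                   ("same.out",  b"0.12345 0.67890\n"),
                   ("diff.out",  b"0.12345 0.67891\n")]:
    p = os.path.join(d, name)
    with open(p, "wb") as f:
        f.write(data)
    paths[name] = p
```

When bit-for-bit agreement fails (e.g. after a compiler or platform change), toolkits like LIVV fall back to statistical or plotted comparisons of the differences.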

  9. Critical Surface Cleaning and Verification Alternatives

    NASA Technical Reports Server (NTRS)

    Melton, Donald M.; McCool, A. (Technical Monitor)

    2000-01-01

    As a result of federal and state requirements, historical critical-cleaning and verification solvents such as Freon 113, Freon TMC, and trichloroethylene (TCE) are either highly regulated or no longer available. Interim replacements such as HCFC 225 have been qualified; however, toxicity and future phase-out regulations necessitate long-term solutions. The scope of this project was to qualify a safe and environmentally compliant LOX surface-verification alternative to Freon 113, TCE, and HCFC 225. The main effort was focused on initiating the evaluation and qualification of HCFC 225G as an alternate LOX verification solvent. The project was scoped in FY 99/00 to perform LOX compatibility, cleaning efficiency, and qualification on flight hardware.

  10. 75 FR 1492 - Commission Guidance Regarding Independent Public Accountant Engagements Performed Pursuant to...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-11

    ... provides direction with respect to the independent verification and internal control report as required... such resignation, dismissal, removal, or other termination. III. Internal Control Report Rule 206(4)-2... receive from its related person an internal control report related to its or its affiliates'...

  11. SU-E-T-602: Patient-Specific Online Dose Verification Based On Transmission Detector Measurements

    SciTech Connect

    Thoelking, J; Yuvaraj, S; Jens, F; Lohr, F; Wenz, F; Wertz, H

    2015-06-15

    Purpose: Intensity-modulated radiotherapy requires a comprehensive quality assurance program in general and, ideally, independent verification of dose delivery. Since conventional 2D detector arrays allow only pre-treatment verification, there is a debate concerning the need for online dose verification. This study presents the clinical performance, including dosimetric plan verification in 2D as well as in 3D, and the error-detection abilities of a new transmission detector (TD) for online dose verification of a 6 MV photon beam. Methods: To validate the dosimetric performance of the new device, dose reconstruction based on TD measurements was compared to a conventional pre-treatment verification method (reference) and to the treatment planning system (TPS) for 18 IMRT and VMAT treatment plans. Furthermore, dose reconstruction inside the patient based on TD read-out was evaluated by comparing various dose-volume indices and 3D gamma evaluations against independent dose computation and the TPS. To investigate the sensitivity of the new device, different types of systematic and random errors in leaf positions and linac output were introduced into IMRT treatment sequences. Results: The 2D gamma index evaluation of TD-based dose reconstruction showed excellent agreement for all IMRT and VMAT plans compared to reference measurements (99.3±1.2)% and the TPS (99.1±0.7)%. Good agreement was also obtained for 3D dose reconstruction based on TD read-out compared to independent dose computation (mean gamma value of PTV = 0.27±0.04). Only a minimal dose underestimation within the target volume was observed when analyzing DVH indices (<1%). Positional errors in leaf banks larger than 1 mm and errors in linac output larger than 2% could be clearly identified with the TD. Conclusion: Since 2D and 3D evaluations for all IMRT and VMAT treatment plans were in excellent agreement with reference measurements and dose computation, the new TD is suitable to qualify for routine treatment plan
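The 2D gamma index used in this kind of plan verification combines a dose-difference criterion with a distance-to-agreement (DTA) criterion: a measured point passes if some nearby reference point agrees within both tolerances. A brute-force sketch, assuming a global 3%/3 mm criterion (clinical implementations use interpolation, dose thresholds, and faster search):

```python
import math

def gamma_pass_rate(ref, meas, spacing, dta=3.0, dd=0.03):
    """Brute-force global gamma analysis on a 2D dose grid.
    ref, meas: 2D lists of dose values on the same grid;
    spacing: grid spacing (mm); dta: distance-to-agreement (mm);
    dd: dose-difference tolerance as a fraction of the max reference dose.
    Returns the fraction of points with gamma <= 1."""
    ny, nx = len(ref), len(ref[0])
    dmax = max(max(row) for row in ref)
    passed = total = 0
    for i in range(ny):
        for j in range(nx):
            best = float("inf")
            for k in range(ny):       # search all reference points
                for l in range(nx):
                    dist = spacing * math.hypot(i - k, j - l)
                    ddiff = (meas[i][j] - ref[k][l]) / dmax
                    g2 = (dist / dta) ** 2 + (ddiff / dd) ** 2
                    best = min(best, g2)
            total += 1
            passed += best <= 1.0     # gamma^2 <= 1 iff gamma <= 1
    return passed / total
```

Identical distributions pass everywhere; a gross dose error fails everywhere, which is the behavior the reported (99.3±1.2)% agreement figures summarize.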

  12. Hierarchical Representation Learning for Kinship Verification.

    PubMed

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications, such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that provide kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effects of participant gender and age and of the kin-relation pair of the stimulus are analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject, to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product-of-likelihood-ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  13. Hierarchical Representation Learning for Kinship Verification.

    PubMed

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2016-09-14

    Kinship verification has a number of applications, such as organizing large collections of images and recognizing resemblances among humans. In this research, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that provide kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effects of participant gender and age and of the kin-relation pair of the stimulus are analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical Kinship Verification via Representation Learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify the kin accurately. A new WVU Kinship Database is created, which consists of multiple images per subject, to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU Kinship database and on four existing benchmark datasets. Further, kinship information is used as a soft biometric modality to boost the performance of face verification via product-of-likelihood-ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  14. Pavement management

    SciTech Connect

    Ross, F.R.; Connor, B.; Lytton, R.L.; Darter, M.I.; Shahin, M.Y.

    1982-01-01

    The 11 papers in this report deal with the following areas: effect of pavement roughness on vehicle fuel consumption; rational seasonal load restrictions and overload permits; state-level pavement monitoring program; data requirements for long-term monitoring of pavements as a basis for development of multiple regression relations; simplified pavement management at the network level; combined priority programming of maintenance and rehabilitation for pavement networks; Arizona pavement management system: Phase 2-verification of performance prediction models and development of data base; overview of paver pavement management system; economic analysis of field implementation of paver pavement management system; development of a statewide pavement maintenance management system; and, prediction of pavement maintenance expenditure by using a statistical cost function.

  15. 37 CFR 380.6 - Verification of royalty payments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... TRANSMISSIONS, NEW SUBSCRIPTION SERVICES AND THE MAKING OF EPHEMERAL REPRODUCTIONS § 380.6 Verification of... purpose of the audit. The Collective shall retain the report of the verification for a period of not...

  16. Generic Protocol for the Verification of Ballast Water Treatment Technology

    EPA Science Inventory

    In anticipation of the need to address performance verification and subsequent approval of new and innovative ballast water treatment technologies for shipboard installation, the U.S Coast Guard and the Environmental Protection Agency‘s Environmental Technology Verification Progr...

  17. Jet Propulsion Laboratory Environmental Verification Processes and Test Effectiveness

    NASA Technical Reports Server (NTRS)

    Hoffman, Alan R.; Green, Nelson W.

    2006-01-01

    Viewgraphs on the JPL processes for environmental verification and testing of aerospace systems are presented. The topics include: 1) Processes: a) JPL Design Principles, b) JPL Flight Project Practices; 2) Environmental Verification; and 3) Test Effectiveness Assessment: Inflight Anomaly Trends.

  18. Data Storage Accounting and Verification at LHC experiments

    NASA Astrophysics Data System (ADS)

    Huang, C.-H.; Lanciotti, E.; Magini, N.; Ratnikova, N.; Sanchez-Hernandez, A.; Serfon, C.; Wildish, T.; Zhang, X.

    2012-12-01

    All major experiments at the Large Hadron Collider (LHC) need to measure real storage usage at the Grid sites. This information is equally important for resource management, planning, and operations. To verify the consistency of central catalogs, experiments are asking sites to provide a full list of the files they have on storage, including size, checksum, and other file attributes. Such storage dumps, provided at regular intervals, give a realistic view of the storage resource usage by the experiments. Regular monitoring of the space usage and data verification serve as additional internal checks of the system integrity and performance. Both the importance and the complexity of these tasks increase with the constant growth of the total data volumes during the active data taking period at the LHC. The use of common solutions helps to reduce the maintenance costs, both at the large Tier1 facilities supporting multiple virtual organizations and at the small sites that often lack manpower. We discuss requirements and solutions to the common tasks of data storage accounting and verification, and present experiment-specific strategies and implementations used within the LHC experiments according to their computing models.
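The catalog-versus-dump consistency check described above reduces to three set operations: files the catalog expects but the site lacks, files the site holds that the catalog does not know about ("dark data"), and files present in both whose size or checksum disagree. A generic sketch, assuming both sides are reduced to a mapping from logical file name to (size, checksum); each LHC experiment has its own formats and tooling.

```python
def check_consistency(catalog, site_dump):
    """Compare a central-catalog file list against a site storage dump.
    Both arguments: dict mapping logical file name -> (size, checksum).
    Returns (missing_at_site, dark_data, corrupt) as sorted lists."""
    missing = sorted(set(catalog) - set(site_dump))
    dark = sorted(set(site_dump) - set(catalog))
    corrupt = sorted(f for f in set(catalog) & set(site_dump)
                     if catalog[f] != site_dump[f])
    return missing, dark, corrupt
```

Run against periodic storage dumps, such a check flags lost files for re-replication, dark data for cleanup, and checksum mismatches for corruption handling.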

  19. Static and Dynamic Verification of Critical Software for Space Applications

    NASA Astrophysics Data System (ADS)

    Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.

    Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European-driven global navigation system Galileo and its associated applications, e.g. air traffic management and vessel and car navigation, will significantly expand the already stringent safety requirements for space-based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With a long operational lifetime, and being extremely difficult to maintain and upgrade, at least compared with "mainstream" software development, the importance of ensuring their correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that the onboard software is correct and error-free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. But, traditionally, these have been applied in isolation. One of the main reasons is the immaturity of this field as concerns its application to the growing number of software products within space systems. This paper presents an innovative way of combining static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The proposed methodology is based on the combination of Software FMEA and FTA with fault-injection techniques. The case study herein described is implemented with support from two tools: the SoftCare tool for the SFMEA and SFTA

  20. The formal verification of generic interpreters

    NASA Technical Reports Server (NTRS)

    Windley, P.; Levitt, K.; Cohen, G. C.

    1991-01-01

    The task assignment 3 of the design and validation of digital flight control systems suitable for fly-by-wire applications is studied. Task 3 is associated with formal verification of embedded systems. In particular, results are presented that provide a methodological approach to microprocessor verification. A hierarchical decomposition strategy for specifying microprocessors is also presented. A theory of generic interpreters is presented that can be used to model microprocessor behavior. The generic interpreter theory abstracts away the details of instruction functionality, leaving a general model of what an interpreter does.

  1. Proton Therapy Verification with PET Imaging

    PubMed Central

    Zhu, Xuping; Fakhri, Georges El

    2013-01-01

    Proton therapy is very sensitive to uncertainties introduced during treatment planning and dose delivery. PET imaging of proton induced positron emitter distributions is the only practical approach for in vivo, in situ verification of proton therapy. This article reviews the current status of proton therapy verification with PET imaging. The different data detecting systems (in-beam, in-room and off-line PET), calculation methods for the prediction of proton induced PET activity distributions, and approaches for data evaluation are discussed. PMID:24312147
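One common way to evaluate PET-based verification data is to compare the distal falloff of the predicted and measured activity depth profiles: a shift between the two 50% falloff positions indicates a proton range deviation. A minimal sketch on 1D profiles; the depth values and profiles in the test are synthetic, and clinical evaluation methods are more sophisticated.

```python
def distal_falloff(depth, activity, frac=0.5):
    """Depth at which the distal (far) edge of an activity profile falls
    to `frac` of its maximum, found by linear interpolation past the peak."""
    amax = max(activity)
    thresh = frac * amax
    ipeak = activity.index(amax)
    for i in range(ipeak, len(activity) - 1):
        if activity[i] >= thresh > activity[i + 1]:
            t = (activity[i] - thresh) / (activity[i] - activity[i + 1])
            return depth[i] + t * (depth[i + 1] - depth[i])
    return depth[-1]

def range_shift(depth, predicted, measured):
    """Measured-minus-predicted shift of the distal 50% falloff position
    (same units as `depth`); a simple indicator of range deviation."""
    return distal_falloff(depth, measured) - distal_falloff(depth, predicted)
```

A nonzero shift flags a discrepancy between planned and delivered dose that warrants investigation.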

  2. Verification of Plan Models Using UPPAAL

    NASA Technical Reports Server (NTRS)

    Khatib, Lina; Muscettola, Nicola; Haveland, Klaus; Lau, Sonic (Technical Monitor)

    2001-01-01

    This paper describes work on the verification of HSTS, the planner and scheduler of the Remote Agent autonomous control system deployed in Deep Space 1 (DS1). The verification is done using UPPAAL, a real time model checking tool. We start by motivating our work in the introduction. Then we give a brief description of HSTS and UPPAAL. After that, we give a mapping of HSTS models into UPPAAL and we present samples of plan model properties one may want to verify. Finally, we conclude with a summary.

  3. Challenges in High-Assurance Runtime Verification

    NASA Technical Reports Server (NTRS)

    Goodloe, Alwyn E.

    2016-01-01

    Safety-critical systems are growing more complex and becoming increasingly autonomous. Runtime Verification (RV) has the potential to provide protections when a system cannot be assured by conventional means, but only if the RV itself can be trusted. In this paper, we proffer a number of challenges to realizing high-assurance RV and illustrate how we have addressed them in our research. We argue that high-assurance RV provides a rich target for automated verification tools in hope of fostering closer collaboration among the communities.
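A runtime-verification monitor observes the system's event stream and checks it against a formal property online. The toy monitor below checks a bounded-response safety property, "after an 'overheat' event, a 'shutdown' must occur within `deadline` subsequent events"; the event names and property are invented for illustration, and high-assurance RV frameworks generate such monitors from verified specifications rather than writing them by hand.

```python
class Monitor:
    """Minimal runtime-verification monitor for the bounded-response
    property: after "overheat", "shutdown" must occur within `deadline`
    subsequent events. step() returns False once the property is violated."""

    def __init__(self, deadline=3):
        self.deadline = deadline
        self.pending = None      # events left to observe a shutdown, or None
        self.violated = False

    def step(self, event):
        if self.pending is not None:
            if event == "shutdown":
                self.pending = None          # obligation discharged
            else:
                self.pending -= 1
                if self.pending == 0:        # deadline expired
                    self.violated = True
                    self.pending = None
        elif event == "overheat":
            self.pending = self.deadline     # start the countdown
        return not self.violated
```

A trace that shuts down promptly after an overheat satisfies the property; one that keeps running past the deadline is flagged, at which point the system could switch to a safe fallback.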

  4. Verification Of Tooling For Robotic Welding

    NASA Technical Reports Server (NTRS)

    Osterloh, Mark R.; Sliwinski, Karen E.; Anderson, Ronald R.

    1991-01-01

    Computer simulations, robotic inspections, and visual inspections are performed to detect discrepancies. The method for verification of tooling for robotic welding involves a combination of computer simulations and visual inspections. The verification process ensures the accuracy of the mathematical model representing the tooling in the off-line programming system that numerically simulates the operation of the robotic welding system. The process helps prevent damaging collisions between welding equipment and the workpiece, ensures the tooling is positioned and oriented properly with respect to the workpiece, and/or determines whether the tooling must be modified or adjusted to achieve the foregoing objectives.

  5. Electric power system test and verification program

    NASA Technical Reports Server (NTRS)

    Rylicki, Daniel S.; Robinson, Frank, Jr.

    1994-01-01

    Space Station Freedom's (SSF's) electric power system (EPS) hardware and software verification is performed at all levels of integration, from components to assembly and system level tests. Careful planning is essential to ensure the EPS is tested properly on the ground prior to launch. The results of the test performed on breadboard model hardware and analyses completed to date have been evaluated and used to plan for design qualification and flight acceptance test phases. These results and plans indicate the verification program for SSF's 75-kW EPS would have been successful and completed in time to support the scheduled first element launch.

  6. On Backward-Style Anonymity Verification

    NASA Astrophysics Data System (ADS)

    Kawabe, Yoshinobu; Mano, Ken; Sakurada, Hideki; Tsukada, Yasuyuki

    Many Internet services and protocols should guarantee anonymity; for example, an electronic voting system should guarantee to prevent the disclosure of who voted for which candidate. To prove trace anonymity, which is an extension of the formulation of anonymity by Schneider and Sidiropoulos, this paper presents an inductive method based on backward anonymous simulations. We show that the existence of an image-finite backward anonymous simulation implies trace anonymity. We also demonstrate the anonymity verification of an e-voting protocol (the FOO protocol) with our backward anonymous simulation technique. When proving the trace anonymity, this paper employs a computer-assisted verification tool based on a theorem prover.

  7. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  8. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  9. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  10. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  11. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  12. 40 CFR 1066.240 - Torque transducer verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Torque transducer verification. 1066... POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification. Verify torque-measurement systems by performing the verifications described in §§ 1066.270...

  13. 15 CFR 748.13 - Delivery Verification (DV).

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 15 Commerce and Foreign Trade 2 2011-01-01 2011-01-01 false Delivery Verification (DV). 748.13... (CLASSIFICATION, ADVISORY, AND LICENSE) AND DOCUMENTATION § 748.13 Delivery Verification (DV). (a) Scope. (1) BIS may request the licensee to obtain verifications of delivery on a selective basis. A...

  14. 15 CFR 748.13 - Delivery Verification (DV).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 2 2010-01-01 2010-01-01 false Delivery Verification (DV). 748.13... (CLASSIFICATION, ADVISORY, AND LICENSE) AND DOCUMENTATION § 748.13 Delivery Verification (DV). (a) Scope. (1) BIS may request the licensee to obtain verifications of delivery on a selective basis. A...

  15. 21 CFR 120.11 - Verification and validation.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Verification and validation. 120.11 Section 120.11 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD... § 120.11 Verification and validation. (a) Verification. Each processor shall verify that the...

  16. Sterilization of compounded parenteral preparations: verification of autoclaves.

    PubMed

    Rahe, Hank

    2013-01-01

    This article discusses the basic principles for verification of a sterilization process and provides a recommended approach to assure that autoclaves deliver the sterility-assured levels required for patient safety. Included is a summary of the protocol and verification (validation) results of a previously published case study involving autoclaves. To assure the sterility of compounded preparations, a verification procedure must be in place.
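A standard physical measure used alongside biological indicators when verifying a sterilization cycle is the accumulated lethality F0, the equivalent exposure time at 121.1 °C computed from the recorded time-temperature profile with the conventional z = 10 °C model. This is an illustrative calculation only; an actual autoclave verification follows the applicable validation protocol.

```python
def f0_value(times_min, temps_c, t_ref=121.1, z=10.0):
    """Accumulated lethality F0 (equivalent minutes at 121.1 degrees C)
    from a recorded time-temperature profile, using the standard
    z = 10 degrees C model: lethality rate = 10**((T - t_ref) / z).
    Accumulated by trapezoidal integration between samples."""
    f0 = 0.0
    for i in range(1, len(times_min)):
        dt = times_min[i] - times_min[i - 1]
        l0 = 10.0 ** ((temps_c[i - 1] - t_ref) / z)
        l1 = 10.0 ** ((temps_c[i] - t_ref) / z)
        f0 += dt * (l0 + l1) / 2.0
    return f0
```

Holding the chamber at 121.1 °C for 15 minutes yields F0 = 15, while 10 minutes at 111.1 °C contributes only 1 equivalent minute, which is why temperature excursions dominate the verification record.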

  17. 38 CFR 21.146 - Independent instructor course.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... independent instructor course in a rehabilitation plan. A veteran and his or her case manager may include an independent instructor course in a rehabilitation plan, other than one involving a farm cooperative program... instructor course. The case manager, the veteran, and the instructor should jointly plan the training...

  18. Homebirth and independent midwifery.

    PubMed

    Harris, G

    2000-07-01

    Why do women choose to give birth at home, and midwives to work independently, in a culture that does little to support this option? This article looks at the reasons childbearing women and midwives make these choices and the barriers to achieving them. The safety of the homebirth option is supported by reference to analyses of mortality and morbidity. Homebirth practices and levels of success are compared in Australia and New Zealand (NZ), in particular, and in the Netherlands, England and America. The success and popularity of homebirths are analysed in terms of socio-economic status. The current situation and challenges of independent midwifery in Darwin are described.

  19. 21 CFR 123.8 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Verification. 123.8 Section 123.8 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... processor shall verify that the HACCP plan is adequate to control food safety hazards that are...

  20. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...