Science.gov

Sample records for management independent verification

  1. Software risk management through independent verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Zhou, Tong C.; Wood, Ralph

    1995-01-01

    Software project managers need tools to estimate and track project goals in a continuous fashion before, during, and after development of a system. In addition, they need the ability to compare the current project status with past project profiles to validate management intuition, identify problems, and then direct appropriate resources to the sources of problems. This paper describes a measurement-based approach to calculating the risk inherent in meeting project goals that leverages past project metrics and existing estimation and tracking models. We introduce the IV&V Goal/Questions/Metrics model, explain its use in the software development life cycle, and describe our attempts to validate the model through the reverse engineering of existing projects.

  2. Guide to good practices for independent verification

    SciTech Connect

    1998-12-01

    This Guide to Good Practices is written to enhance understanding of, and provide direction for, Independent Verification, Chapter X of Department of Energy (DOE) Order 5480.19, Conduct of Operations Requirements for DOE Facilities. The practices in this guide should be considered when planning or reviewing independent verification activities. Contractors are advised to adopt procedures that meet the intent of DOE Order 5480.19. Independent Verification is an element of an effective Conduct of Operations program. The complexity and array of activities performed in DOE facilities dictate the necessity for coordinated independent verification activities to promote safe and efficient operations.

  3. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort...

  4. Independent Verification and Validation Of SAPHIRE 8 Risk Management Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2009-11-01

    This report provides an evaluation of the risk management activities for the SAPHIRE 8 project. Risk management is intended to ensure a methodology for conducting risk management planning, identification, analysis, responses, and monitoring and control activities associated with the SAPHIRE project work, and to meet the contractual commitments prepared by the sponsor, the Nuclear Regulatory Commission.

  5. Systems analysis-independent analysis and verification

    SciTech Connect

    Badin, J.S.; DiPietro, J.P.

    1995-09-01

    The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. It can greatly accelerate the development of a system by minimizing the risk of costly design evolutions and by stimulating discussion, feedback, and coordination among key players, allowing them to assess the analysis, evaluate trade-offs, and address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed system and identify system development needs. Systems investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), a wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel a PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.

  6. A Tutorial on Text-Independent Speaker Verification

    NASA Astrophysics Data System (ADS)

    Bimbot, Frédéric; Bonastre, Jean-François; Fredouille, Corinne; Gravier, Guillaume; Magrin-Chagnolleau, Ivan; Meignier, Sylvain; Merlin, Teva; Ortega-García, Javier; Petrovska-Delacrétaz, Dijana; Reynolds, Douglas A.

    2004-12-01

    This paper presents an overview of a state-of-the-art text-independent speaker verification system. First, an introduction proposes a modular scheme of the training and test phases of a speaker verification system. Then, the speech parameterization most commonly used in speaker verification, namely cepstral analysis, is detailed. Gaussian mixture modeling, which is the speaker modeling technique used in most systems, is then explained. A few speaker modeling alternatives, namely neural networks and support vector machines, are mentioned. Normalization of scores is then explained, as this is a very important step for dealing with real-world data. The evaluation of a speaker verification system is then detailed, and the detection error trade-off (DET) curve is explained. Several extensions of speaker verification are then enumerated, including speaker tracking and segmentation by speakers. Then, some applications of speaker verification are proposed, including on-site applications, remote applications, applications for structuring audio information, and games. Issues concerning the forensic area are then recalled, as we believe it is very important to inform people about the actual performance and limitations of speaker verification systems. This paper concludes by giving a few research trends in speaker verification for the next couple of years.
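
    The Gaussian mixture modeling and likelihood-ratio scoring that this tutorial describes can be sketched as follows. This is a minimal illustration, not the paper's system: the Gaussian-distributed features are synthetic stand-ins for real cepstral (MFCC) features, and the model sizes and zero decision threshold are illustrative assumptions.

```python
# Sketch of GMM-based text-independent speaker verification:
# score a test utterance by the log-likelihood ratio between a
# speaker model and a universal background model (UBM).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Hypothetical 12-dimensional "cepstral" features: background population,
# target-speaker enrollment data, and two test utterances.
background = rng.normal(0.0, 1.0, size=(2000, 12))
target_train = rng.normal(0.8, 1.0, size=(400, 12))
test_target = rng.normal(0.8, 1.0, size=(200, 12))
test_impostor = rng.normal(-0.8, 1.0, size=(200, 12))

# Fit the UBM on background speech and the speaker model on enrollment data.
ubm = GaussianMixture(n_components=4, random_state=0).fit(background)
spk = GaussianMixture(n_components=4, random_state=0).fit(target_train)

def llr(x):
    # Average per-frame log-likelihood ratio: speaker model vs. UBM.
    return spk.score(x) - ubm.score(x)

# Accept when the score exceeds a threshold (0.0 here for illustration).
print(llr(test_target) > 0.0)    # expected: True
print(llr(test_impostor) > 0.0)  # expected: False
```

    In practice the speaker model is usually MAP-adapted from the UBM rather than trained from scratch, and the threshold is set from a DET curve on development data.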

  7. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... planning services. (6) Develop performance metrics which allow tracking project completion against... 45 Public Welfare 1 2014-10-01 2014-10-01 false Independent Verification and Validation. 95.626 Section 95.626 Public Welfare Department of Health and Human Services GENERAL ADMINISTRATION...

  8. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... planning services. (6) Develop performance metrics which allow tracking project completion against... 45 Public Welfare 1 2012-10-01 2012-10-01 false Independent Verification and Validation. 95.626 Section 95.626 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION...

  9. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... planning services. (6) Develop performance metrics which allow tracking project completion against... 45 Public Welfare 1 2013-10-01 2013-10-01 false Independent Verification and Validation. 95.626 Section 95.626 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION...

  10. Final Report - Independent Verification Survey Report for the Waste Loading Area, Former Hazardous Waste Management Facility, Brookhaven National Laboratory, Upton, New York

    SciTech Connect

    P.C. Weaver

    2008-08-19

    The objective of the verification survey was to obtain evidence by means of measurements and sampling to confirm that the final radiological conditions were less than the established release criteria. This objective was achieved via multiple verification components including document reviews to determine the accuracy and adequacy of FSS documentation.

  11. Systems analysis - independent analysis and verification

    SciTech Connect

    DiPietro, J.P.; Skolnik, E.G.; Badin, J.S.

    1996-10-01

    The Hydrogen Program of the U.S. Department of Energy (DOE) funds a portfolio of activities ranging from conceptual research to pilot plant testing. The long-term research projects support DOE's goal of a sustainable, domestically based energy system, and the development activities are focused on hydrogen-based energy systems that can be commercially viable in the near-term. Energetics develops analytic products that enable the Hydrogen Program Manager to assess the potential for near- and long-term R&D activities to satisfy DOE and energy market criteria. This work is based on a pathway analysis methodology. The authors consider an energy component (e.g., hydrogen production from biomass gasification, hybrid hydrogen internal combustion engine (ICE) vehicle) within a complete energy system. The work involves close interaction with the principal investigators to ensure accurate representation of the component technology. Comparisons are made with the current cost and performance of fossil-based and alternative renewable energy systems, and sensitivity analyses are conducted to determine the effect of changes in cost and performance parameters on the projects' viability.

  12. Applying Independent Verification and Validation to Automatic Test Equipment

    NASA Technical Reports Server (NTRS)

    Calhoun, Cynthia C.

    1997-01-01

    This paper describes a general overview of applying Independent Verification and Validation (IV&V) to Automatic Test Equipment (ATE). The overview is not inclusive of all IV&V activities that can occur or of all development and maintenance items that can be validated and verified, during the IV&V process. A sampling of possible IV&V activities that can occur within each phase of the ATE life cycle are described.

  13. Verification of Oncentra brachytherapy planning using independent calculation

    NASA Astrophysics Data System (ADS)

    Safian, N. A. M.; Abdullah, N. H.; Abdullah, R.; Chiang, C. S.

    2016-03-01

    This study investigated a verification technique for treatment plan quality assurance in brachytherapy. The aim was to verify point doses in 192Ir high dose rate (HDR) brachytherapy between the Oncentra Masterplan brachytherapy treatment planning system (TPS) and independent calculation software at the rectum, bladder, and prescription points for both paired-ovoid and full-catheter setups. The Oncentra TPS output text files were automatically loaded into the verification programme, which was developed based on spreadsheets. The output consists of source coordinates, desired calculation point coordinates, and the dwell times of a patient plan. The source strength and reference dates were entered into the programme, and point dose calculations were then performed independently. The programme presents its results as a comparison of its calculated point doses with the corresponding Oncentra TPS values. For the 40 clinical cases, consisting of two fractions for each of 20 patients, the percentage differences show agreement between the TPS and the independent calculation within 2%. Because the programme takes only a few minutes to use, it is recommended for implementation as a verification technique in clinical brachytherapy dosimetry.
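
    The independent point-dose check described above can be illustrated with a simplified TG-43-style point-source calculation. This is a sketch only, not the authors' spreadsheet: the dose-rate constant is an approximate published value for 192Ir, the radial dose and anisotropy functions are omitted, and the source strength, dwell positions, and TPS dose value are hypothetical.

```python
# Simplified independent point-dose check for HDR 192Ir brachytherapy:
# inverse-square sum over dwell positions times the dose-rate constant.
# Radial dose and anisotropy corrections are omitted (illustrative only).

LAMBDA_IR192 = 1.109  # cGy / (h * U), approximate dose-rate constant for 192Ir

def point_dose(sk, dwells, point):
    """Dose (cGy) at `point` (cm) from dwell positions (x, y, z, time_s)."""
    dose = 0.0
    for x, y, z, t in dwells:
        r2 = (point[0]-x)**2 + (point[1]-y)**2 + (point[2]-z)**2  # cm^2
        dose += sk * LAMBDA_IR192 * (t / 3600.0) / r2
    return dose

# Hypothetical plan: air-kerma strength 40000 U, two 10-second dwells,
# dose evaluated at a prescription point 1 cm from the catheter axis.
sk = 40000.0
dwells = [(0.0, 0.0, 0.0, 10.0), (0.0, 0.0, 0.5, 10.0)]
d = point_dose(sk, dwells, (1.0, 0.0, 0.0))

# Agreement check against a hypothetical TPS value, 2% tolerance.
tps_dose = 225.0
diff_pct = abs(d - tps_dose) / tps_dose * 100.0
print(round(d, 1), diff_pct < 2.0)  # 221.8 True
```

    A clinical implementation would read the source coordinates, dwell times, and source strength from the TPS export, as the study describes, and apply the full TG-43 formalism.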

  14. A practical experience with independent verification and validation

    NASA Technical Reports Server (NTRS)

    Page, Gerald; Mcgarry, Frank E.; Card, David N.

    1985-01-01

    One approach to reducing software cost and increasing reliability is the use of an independent verification and validation (IV&V) methodology. The Software Engineering Laboratory (SEL) applied the IV&V methodology to two medium-size flight dynamics software development projects. Then, to measure the effectiveness of the IV&V approach, the SEL compared these two projects with two similar past projects, using measures such as productivity, reliability, and maintainability. Results indicated that the use of the IV&V methodology did not help the overall process or improve the product in these cases.

  15. Experimental measurement-device-independent verification of quantum steering

    NASA Astrophysics Data System (ADS)

    Kocsis, Sacha; Hall, Michael J. W.; Bennet, Adam J.; Saunders, Dylan J.; Pryde, Geoff J.

    2015-01-01

    Bell non-locality between distant quantum systems—that is, joint correlations which violate a Bell inequality—can be verified without trusting the measurement devices used, nor those performing the measurements. This leads to unconditionally secure protocols for quantum information tasks such as cryptographic key distribution. However, complete verification of Bell non-locality requires high detection efficiencies, and is not robust to typical transmission losses over long distances. In contrast, quantum or Einstein-Podolsky-Rosen steering, a weaker form of quantum correlation, can be verified for arbitrarily low detection efficiencies and high losses. The cost is that current steering-verification protocols require complete trust in one of the measurement devices and its operator, allowing only one-sided secure key distribution. Here we present measurement-device-independent steering protocols that remove this need for trust, even when Bell non-locality is not present. We experimentally demonstrate this principle for singlet states and states that do not violate a Bell inequality.

  16. 49 CFR Appendix D to Part 236 - Independent Review of Verification and Validation

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 4 2012-10-01 2012-10-01 false Independent Review of Verification and Validation..., AND APPLIANCES Pt. 236, App. D Appendix D to Part 236—Independent Review of Verification and Validation (a) This appendix provides minimum requirements for independent third-party assessment of...

  17. Ares I-X Range Safety Simulation Verification and Analysis Independent Validation and Verification

    NASA Technical Reports Server (NTRS)

    Merry, Carl M.; Tarpley, Ashley F.; Craig, A. Scott; Tartabini, Paul V.; Brewer, Joan D.; Davis, Jerel G.; Dulski, Matthew B.; Gimenez, Adrian; Barron, M. Kyle

    2011-01-01

    NASA's Ares I-X vehicle launched on a suborbital test flight from the Eastern Range in Florida on October 28, 2009. To obtain approval for launch, a range safety final flight data package was generated to meet the data requirements defined in the Air Force Space Command Manual 91-710 Volume 2. The delivery included products such as a nominal trajectory, trajectory envelopes, stage disposal data and footprints, and a malfunction turn analysis. The Air Force's 45th Space Wing uses these products to ensure public and launch area safety. Due to the criticality of these data, an independent validation and verification effort was undertaken to ensure data quality and adherence to requirements. As a result, the product package was delivered with the confidence that independent organizations, using separate simulation software, generated data that met the range requirements and yielded consistent results. This document captures the Ares I-X final flight data package verification and validation analysis, including the methodology used to validate and verify simulation inputs, execution, and results, and presents lessons learned during the process.

  18. 49 CFR Appendix D to Part 236 - Independent Review of Verification and Validation

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Independent Review of Verification and Validation... Validation (a) This appendix provides minimum requirements for independent third-party assessment of product safety verification and validation pursuant to subpart H or subpart I of this part. The goal of...

  19. 49 CFR Appendix D to Part 236 - Independent Review of Verification and Validation

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 4 2011-10-01 2011-10-01 false Independent Review of Verification and Validation... Validation (a) This appendix provides minimum requirements for independent third-party assessment of product safety verification and validation pursuant to subpart H or subpart I of this part. The goal of...

  20. Verification of redundancy management design

    NASA Technical Reports Server (NTRS)

    Gelderloos, H. C.; Wilson, D. V.

    1978-01-01

    Statistical method checks designs by simulating system operating conditions and adding error factors. Method has potential applicability to commercial and industrial situations where redundancy management system is used to detect and isolate failed components.

  1. Independent verification and validation for Space Shuttle flight software

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The Committee for Review of Oversight Mechanisms for Space Shuttle Software was asked by the National Aeronautics and Space Administration's (NASA) Office of Space Flight to determine the need to continue independent verification and validation (IV&V) for Space Shuttle flight software. The Committee found that the current IV&V process is necessary to maintain NASA's stringent safety and quality requirements for man-rated vehicles. Therefore, the Committee does not support NASA's plan to eliminate funding for the IV&V effort in fiscal year 1993. The Committee believes that the Space Shuttle software development process is not adequate without IV&V and that elimination of IV&V as currently practiced will adversely affect the overall quality and safety of the software, both now and in the future. Furthermore, the Committee was told that no organization within NASA has the expertise or the manpower to replace the current IV&V function in a timely fashion, nor will building this expertise elsewhere necessarily reduce cost. Thus, the Committee does not recommend moving IV&V functions to other organizations within NASA unless the current IV&V is maintained for as long as it takes to build comparable expertise in the replacing organization.

  2. Independent verification and validation of large software requirement specification databases

    SciTech Connect

    Twitchell, K.E.

    1992-04-01

    To enhance quality, an independent verification and validation (IV&V) review is conducted as software requirements are defined. Requirements are inspected for consistency and completeness. IV&V strives to detect defects early in the software development life cycle and to prevent problems before they occur. The IV&V review process of a massive software requirements specification, the Reserve Component Automation System (RCAS) Functional Description (FD), is explored. Analysis of the RCAS FD error history determined that there are no predictors of errors. The size of the FD mandates electronic analysis of the databases. Software which successfully performs automated consistency and completeness checks is discussed. The process of verifying the quality of analysis software is described. The use of intuitive ad hoc techniques, in addition to the automatic analysis of the databases, is required because of the varying content of the requirements databases. The ad hoc investigation process is discussed. Case studies are provided to illustrate how the process works. This thesis demonstrates that it is possible to perform an IV&V review on a massive software requirements specification. Automatic analysis enables inspecting for completeness and consistency. The work with the RCAS FD clearly indicates that the IV&V review process is not static; it must continually grow, adapt, and change as conditions warrant. The ad hoc investigation process provides this required flexibility. This process also analyzes errors discovered by manual review and automatic processing. The analysis results in the development of new algorithms and the addition of new programs to the automatic inspection software.

  3. Independent Verification Final Summary Report for the David Witherspoon, Inc. 1630 Site Knoxville, Tennessee

    SciTech Connect

    P.C. Weaver

    2009-04-29

    The primary objective of the independent verification was to determine if BJC performed the appropriate actions to meet the specified “hot spot” cleanup criteria of 500 picocuries per gram (pCi/g) uranium-238 (U-238) in surface soil. Specific tasks performed by the independent verification team (IVT) to satisfy this objective included: 1) performing radiological walkover surveys, and 2) collecting soil samples for independent analyses. The independent verification (IV) efforts were designed to evaluate radioactive contaminants (specifically U-238) in the exposed surfaces below one foot of the original site grade, given that the top one foot layer of soil on the site was removed in its entirety.

  5. Independent Verification Survey Report for the Operable Unit-1 Miamisburg Closure Project, Miamisburg, OH

    SciTech Connect

    Weaver, P.

    2008-03-17

    The objectives of the independent verification survey were to confirm that remedial actions have been effective in meeting established release criteria and that documentation accurately and adequately describes the current radiological and chemical conditions of the MCP site.

  6. Integrated safety management system verification: Volume 2

    SciTech Connect

    Christensen, R.F.

    1998-08-10

    Department of Energy (DOE) Policy (P) 450.4, Safety Management System Policy, commits to institutionalization of an Integrated Safety Management System (ISMS) throughout the DOE complex. The DOE Acquisition Regulations (DEAR, 48 CFR 970) require contractors to manage and perform work in accordance with a documented ISMS. Guidance and expectations have been provided to PNNL by incorporation into the operating contract (Contract DE-AC06-76RLO 1830) and by letter. The contract requires that the contractor submit a description of its ISMS for approval by DOE. PNNL submitted its proposed Safety Management System Description for approval on November 25, 1997. RL tentatively approved acceptance of the description pursuant to a favorable recommendation from this review. The Integrated Safety Management System Verification is a review of the adequacy of the ISMS description in fulfilling the requirements of the DEAR and the DOE Policy. The purpose of this review is to provide the Richland Operations Office Manager with a recommendation for approval of the ISMS description of the Pacific Northwest Laboratory, based upon compliance with the requirements of 48 CFR 970.5204(-2 and -78), and to verify the extent and maturity of ISMS implementation within the Laboratory. Further, the review will provide a model for other DOE laboratories managed by the Office of the Assistant Secretary for Energy Research.

  7. INDEPENDENT VERIFICATION OF THE BUILDING 3550 SLAB AT OAK RIDGE NATIONAL LABORATORY OAK RIDGE, TENNESSEE

    SciTech Connect

    Weaver, Phyllis C.

    2012-05-08

    The Oak Ridge Institute for Science and Education (ORISE) has completed the independent verification survey of the Building 3550 Slab, and the results of this effort are provided. The objective of this verification survey is to provide independent review and field assessment of remediation actions conducted by Safety and Ecology Corporation (SEC), to document that the final radiological condition of the slab meets the release guidelines. Verification survey activities on the Building 3550 Slab included scans, measurements, and the collection of smears. Scans for alpha, alpha plus beta, and gamma activity identified several areas that were investigated.

  8. 49 CFR 236.1017 - Independent third party verification and validation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Independent third party verification and validation. 236.1017 Section 236.1017 Transportation Other Regulations Relating to Transportation (Continued... validation. (a) The PTCSP must be supported by an independent third-party assessment when the...

  9. 49 CFR 236.1017 - Independent third party Verification and Validation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 4 2011-10-01 2011-10-01 false Independent third party Verification and Validation. 236.1017 Section 236.1017 Transportation Other Regulations Relating to Transportation (Continued... Validation. (a) The PTCSP must be supported by an independent third-party assessment when the...

  10. Improving semi-text-independent method of writer verification using difference vector

    NASA Astrophysics Data System (ADS)

    Li, Xin; Ding, Xiaoqing

    2009-01-01

    The semi-text-independent method of writer verification based on a linear framework can use all characters of two handwriting samples to discriminate between writers when the text contents are known. The handwritings may share only a small number of characters, or even none at all. This fills the gap between the classical text-dependent methods and the text-independent methods of writer verification. Moreover, in this paper, information about the identity of each character is used in the semi-text-independent method. Two types of standard templates, generated from many writer-unknown handwritten samples and printed samples of each character, are introduced to represent the content information of each character. Difference vectors of the character samples are obtained by subtracting the standard templates from the original feature vectors, and they replace the original vectors in the writer verification process. By removing a large amount of content information while retaining the style information, the verification accuracy of the semi-text-independent method is improved. On a handwriting database of 30 writers, when the query and reference handwritings each consist of 30 distinct characters, the average equal error rate (EER) of writer verification reaches 9.96%. When the handwritings contain 50 characters, the average EER falls to 6.34%, which is 23.9% lower than the EER without the difference vectors.
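
    The difference-vector idea above, subtracting a per-character standard template so that mostly writer-style information remains, can be sketched as follows. This is a toy illustration under stated assumptions: the features, templates, and writer-style offsets are synthetic stand-ins for real handwriting features, and the simple mean-difference comparison is not the paper's classifier.

```python
# Sketch of difference-vector writer comparison: remove content information
# (per-character templates) and compare what remains (writer style).
import numpy as np

rng = np.random.default_rng(1)
n_chars, dim = 30, 16

# Standard template per character (content information), shared by all writers.
templates = rng.normal(0.0, 1.0, size=(n_chars, dim))

def handwriting(writer_style):
    # A writer's sample: content template + writer-specific style offset + noise.
    return templates + writer_style + rng.normal(0.0, 0.1, size=(n_chars, dim))

style_a = rng.normal(0.0, 0.3, size=dim)
style_b = rng.normal(0.0, 0.3, size=dim)

query = handwriting(style_a)
ref_same = handwriting(style_a)    # same writer as the query
ref_other = handwriting(style_b)   # different writer

def dissimilarity(h1, h2):
    # Compare difference vectors (content removed), averaged over characters.
    d1, d2 = h1 - templates, h2 - templates
    return float(np.linalg.norm(d1.mean(axis=0) - d2.mean(axis=0)))

# Same-writer pairs should be closer than different-writer pairs.
print(dissimilarity(query, ref_same) < dissimilarity(query, ref_other))  # True
```

    Subtracting the template cancels the shared content term, so the residuals are dominated by the writer-style offset, which is why the same-writer comparison yields the smaller distance.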

  11. Design and performance verification of a passive propellant management system

    NASA Technical Reports Server (NTRS)

    Hess, D. A.; Regnier, W. W.

    1978-01-01

    This paper describes the design and verification testing of a reusable passive propellant management system. The system was designed to acquire propellant in low- or zero-g environments and also retain this propellant under high axially directed accelerations that may be experienced during launch and orbit-to-orbit transfer. The system design requirements were established to satisfy generally the requirements for a large number of potential NASA and military applications, such as orbit-to-orbit shuttles and satellite vehicles. The resulting concept was a multicompartmented tank with independent surface tension acquisition channels in each compartment. The tank was designed to provide a minimum expulsion efficiency of 98 percent when subjected to the simultaneous conditions of acceleration, vibration, and outflow. The system design has the unique capability to demonstrate low-g performance in a 1-g test environment, and the test program summarized was structured around this capability.

  12. Space telescope observatory management system preliminary test and verification plan

    NASA Technical Reports Server (NTRS)

    Fritz, J. S.; Kaldenbach, C. F.; Williams, W. B.

    1982-01-01

    The preliminary plan for the Space Telescope Observatory Management System Test and Verification (TAV) is provided. Methodology, test scenarios, test plans and procedure formats, schedules, and the TAV organization are included. Supporting information is provided.

  13. Camera-based independent couch height verification in radiation oncology.

    PubMed

    Kusters, Martijn; Louwe, Rob; Biemans-van Kastel, Liesbeth; Nieuwenkamp, Henk; Zahradnik, Rien; Claessen, Roy; van Seters, Ronald; Huizenga, Henk

    2015-01-01

    For specific radiation therapy (RT) treatments, it is advantageous to use the isocenter-to-couch distance (ICD) for initial patient setup.(1) Since sagging of the treatment couch is not properly taken into account by the electronic readout of the treatment machine, this readout cannot be used for initial patient positioning using the ICD. Therefore, initial patient positioning to the prescribed ICD has been carried out using a ruler prior to each treatment fraction in our institution. However, the ruler method is laborious, and logging of data is not possible. The objective of this study is to replace the ruler-based setup of the couch height with an independent, user-friendly, optical camera-based method in which the radiation technologists need only move the couch to the correct height, shown on a display. A camera-based independent couch height measurement system (ICHS) was developed in cooperation with Panasonic Electric Works Western Europe. Clinical data showed that the ICHS is at least as accurate as a ruler for verifying the ICD. The system has been successfully implemented in seven treatment rooms since 10 September 2012. The benefits of this system are a more streamlined workflow, reduction of human errors during initial patient setup, and logging of the actual couch height at the isocenter. Daily QA shows that the systems are stable and operate within the set 1 mm tolerance; regular QA of the system is necessary to guarantee that it works correctly. PMID:26699308

  14. 49 CFR Appendix D to Part 236 - Independent Review of Verification and Validation

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... D Appendix D to Part 236 Transportation Other Regulations Relating to Transportation (Continued..., AND APPLIANCES Pt. 236, App. D Appendix D to Part 236—Independent Review of Verification and..., in FRA's judgment, for FRA to monitor the assessment. (d) The reviewer shall evaluate the...

  15. 49 CFR Appendix D to Part 236 - Independent Review of Verification and Validation

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... D Appendix D to Part 236 Transportation Other Regulations Relating to Transportation (Continued..., AND APPLIANCES Pt. 236, App. D Appendix D to Part 236—Independent Review of Verification and..., in FRA's judgment, for FRA to monitor the assessment. (d) The reviewer shall evaluate the...

  16. ETV - ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) - RISK MANAGEMENT

    EPA Science Inventory

    In October 1995, the Environmental Technology Verification (ETV) Program was established by EPA. The goal of ETV is to provide credible performance data for commercial-ready environmental technologies to speed their implementation for the benefit of vendors, purchasers, permitter...

  17. INDEPENDENT VERIFICATION SURVEY REPORT FOR ZONE 1 OF THE EAST TENNESSEE TECHNOLOGY PARK IN OAK RIDGE, TENNESSEE

    SciTech Connect

    King, David A.

    2012-08-16

    Oak Ridge Associated Universities (ORAU) conducted in-process inspections and independent verification (IV) surveys in support of DOE's remedial efforts in Zone 1 of the East Tennessee Technology Park (ETTP) in Oak Ridge, Tennessee. Inspections concluded that the remediation contractor's soil removal and survey objectives were satisfied and that the dynamic verification strategy (DVS) was implemented as designed. IV activities included gamma walkover surveys and soil sample collection and analysis over multiple exposure units (EUs).

  18. Results of the independent verification survey at the Old Betatron Building, Granite City, Illinois

    SciTech Connect

    Murray, M.E.; Brown, K.S.

    1994-07-01

    A team from the Measurement Applications and Development Group, Oak Ridge National Laboratory (ORNL), conducted an independent verification of the radiological condition of the Old Betatron Building, Granite City, Illinois, at the request of the Department of Energy in June 1993. The building is owned by the National Steel Corporation. The contamination present resulted from the handling of uranium metal slabs during the time the betatron facility was used to x-ray the slabs for metallurgical defects. The designation survey did not characterize the entire floor space because of obstructing equipment and debris. Therefore, prior to remediation by Bechtel National, Incorporated (BNI), a thorough characterization of the floor was conducted, and the results were immediately conveyed to on-site BNI staff. An independent verification assessment was also performed after the cleanup activities were completed under the direction of BNI. The process of characterization, remediation, and verification was accomplished within a five-day period. Based on the results of the independent verification assessment, the Old Betatron Building was determined to meet the DOE radiological guidelines for unrestricted use.

  19. Independent Verification and Validation (IV and V) - Adding Mission Assurance to NASA Flight Software

    NASA Technical Reports Server (NTRS)

    Savarino, Shirley; Krasner, Sanford; Huy, Frank

    2011-01-01

    The NASA Independent Verification and Validation (IV&V) Facility's objective is to identify potential defects in flight software using independent analysis techniques. This paper describes the tailored IV&V techniques that have been developed in support of critical interactions on the Mars Science Laboratory (MSL) project, scheduled to launch in November 2011. The IV&V techniques for interface analysis use independently developed sequence diagrams of critical scenarios. The results from these analyses have had a positive impact on the requirements flow-down, consistency among MSL requirements, and identification of missing requirements. The results of these analyses and their positive impact on the MSL project are provided.

  20. INDEPENDENT VERIFICATION OF THE CENTRAL CAMPUS AND SOUTHEAST LABORATORY COMPLEX BUILDING SLABS AT OAK RIDGE NATIONAL LABORATORY, OAK RIDGE, TENNESSEE

    SciTech Connect

    Weaver, Phyllis C.

    2012-07-24

    Oak Ridge Associated Universities/Oak Ridge Institute for Science and Education (ORAU/ORISE) has completed the independent verification survey of the Central Campus and Southeast Lab Complex Building Slabs. The results of this effort are provided. The objective of this verification survey was to provide independent review and field assessment of remediation actions conducted by SEC, and to independently assess whether the final radiological condition of the slabs met the release guidelines.

  1. An independent verification and validation of the Future Theater Level Model conceptual model

    SciTech Connect

    Hartley, D.S. III; Kruse, K.L.; Martellaro, A.J.; Packard, S.L.; Thomas, B. Jr.; Turley, V.K.

    1994-08-01

    This report describes the methodology and results of independent verification and validation performed on a combat model in its design stage. The combat model is the Future Theater Level Model (FTLM), under development by The Joint Staff/J-8. J-8 has undertaken its development to provide an analysis tool that addresses the uncertainties of combat more directly than previous models and yields more rapid study results. The methodology adopted for this verification and validation consisted of document analyses. Included were detailed examination of the FTLM design documents (at all stages of development), the FTLM Mission Needs Statement, and selected documentation for other theater level combat models. These documents were compared to assess the FTLM as to its design stage, its purpose as an analytical combat model, and its capabilities as specified in the Mission Needs Statement. The conceptual design passed those tests. The recommendations included specific modifications as well as a recommendation for continued development. The methodology is significant because independent verification and validation have not been previously reported as being performed on a combat model in its design stage. The results are significant because The Joint Staff/J-8 will be using the recommendations from this study in determining whether to proceed with development of the model.

  2. How NASA's Independent Verification and Validation (IV&V) Program Builds Reliability into a Space Mission Software System (SMSS)

    NASA Technical Reports Server (NTRS)

    Fisher, Marcus S.; Northey, Jeffrey; Stanton, William

    2014-01-01

    The purpose of this presentation is to outline how the NASA Independent Verification and Validation (IVV) Program helps to build reliability into the Space Mission Software Systems (SMSSs) that its customers develop.

  3. Benchmark testing and independent verification of the VS2DT computer code

    SciTech Connect

    McCord, J.T.; Goodrich, M.T.

    1994-11-01

    The finite difference flow and transport simulator VS2DT was benchmark tested against several other codes which solve the same equations (Richards equation for flow and the Advection-Dispersion equation for transport). The benchmark problems investigated transient two-dimensional flow in a heterogeneous soil profile with a localized water source at the ground surface. The VS2DT code performed as well as or better than all other codes when considering mass balance characteristics and computational speed. It was also rated highly relative to the other codes with regard to ease-of-use. Following the benchmark study, the code was verified against two analytical solutions, one for two-dimensional flow and one for two-dimensional transport. These independent verifications show reasonable agreement with the analytical solutions, and complement the one-dimensional verification problems published in the code's original documentation.

  4. Formal verification of a set of memory management units

    NASA Technical Reports Server (NTRS)

    Schubert, E. Thomas; Levitt, K.; Cohen, Gerald C.

    1992-01-01

    This document describes the verification of a set of memory management units (MMU). The verification effort demonstrates the use of hierarchical decomposition and abstract theories. The MMUs can be organized into a complexity hierarchy. Each new level in the hierarchy adds a few significant features or modifications to the lower level MMU. The units described include: (1) a page check translation look-aside module (TLM); (2) a page check TLM with supervisor line; (3) a base bounds MMU; (4) a virtual address translation MMU; and (5) a virtual address translation MMU with memory resident segment table.

  5. Commitment at Work and Independence from Management.

    ERIC Educational Resources Information Center

    Belanger, Jacques; Edwards, Paul K.; Wright, Martyn

    2003-01-01

    Case study of a Canadian aluminum smelter through 15 interviews, observation, and employee survey (n=214) revealed high commitment, acceptance of change, and worker independence from management. This pattern emerged from a traditionally strong union presence. Comparison with other cases underlines the centrality of collective organization to…

  6. Business Management for Independent Schools. Fourth Edition.

    ERIC Educational Resources Information Center

    National Association of Independent Schools, Boston, MA.

    This fourth edition of a guide for independent school business managers has been produced in looseleaf format so that changes may be made promptly as decisions of regulatory bodies require modifications in current practice. Fourteen chapters are organized under three broad topic headings. Chapters in part 1, Accounting and Financial Reporting,…

  7. A method for online verification of adapted fields using an independent dose monitor

    SciTech Connect

    Chang Jina; Norrlinger, Bernhard D.; Heaton, Robert K.; Jaffray, David A.; Cho, Young-Bin; Islam, Mohammad K.; Mahon, Robert

    2013-07-15

    Purpose: Clinical implementation of online adaptive radiotherapy requires generation of modified fields and a method of dosimetric verification in a short time. We present a method of treatment field modification to account for patient setup error, and an online method of verification using an independent monitoring system. Methods: The fields are modified by translating each multileaf collimator (MLC)-defined aperture in the direction of the patient setup error, and magnifying it to account for the change in distance to the marked isocentre. A modified version of a previously reported online beam monitoring system, the integral quality monitoring (IQM) system, was investigated for validation of the adapted fields. The system consists of a large-area ion chamber with a spatial gradient in electrode separation, mounted below the MLC, that provides a spatially sensitive signal for each beam segment, and a calculation algorithm to predict the signal. IMRT plans of ten prostate patients were modified in response to six randomly chosen setup errors in three orthogonal directions. Results: A total of approximately 49 beams from the modified fields were verified by the IQM system; 97% of the measured IQM signals agreed with the predicted values to within 2%. Conclusions: The modified IQM system was found to be suitable for online verification of adapted treatment fields.
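
    The field-modification step, translating each MLC aperture by the setup error and magnifying for the distance change, can be sketched as below. All names, the default source-to-axis distance, and the ordering of the two operations are assumptions for illustration, not the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class LeafPair:
    left_mm: float    # leaf-edge positions projected to the isocentre plane
    right_mm: float

def adapt_aperture(pairs, lateral_shift_mm, depth_shift_mm, sad_mm=1000.0):
    """Translate each MLC-defined opening by the lateral setup error and
    scale it for the change in source-to-target distance (beam divergence).
    A hypothetical sketch of the adaptation idea described in the abstract."""
    mag = sad_mm / (sad_mm - depth_shift_mm)   # magnification from distance change
    return [LeafPair(left_mm=(p.left_mm + lateral_shift_mm) * mag,
                     right_mm=(p.right_mm + lateral_shift_mm) * mag)
            for p in pairs]
```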

  8. Independent verification and validation testing of the FLASH computer code, Version 3.0

    SciTech Connect

    Martian, P.; Chung, J.N.

    1992-06-01

    Independent testing of the FLASH computer code, Version 3.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at various Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Verification tests and validation tests were used to determine the operational status of the FLASH computer code. These tests were specifically designed to test the correctness of the FORTRAN coding, computational accuracy, and suitability for simulating actual hydrologic conditions. The testing was performed using a structured evaluation protocol consisting of blind testing, independent applications, and test cases of graduated difficulty. Both quantitative and qualitative testing were performed by evaluating relative root-mean-square values and graphical comparisons of the numerical, analytical, and experimental data. Four verification tests were used to check the computational accuracy and correctness of the FORTRAN coding, and three validation tests were used to check the suitability for simulating actual conditions. These test cases ranged in complexity from simple 1-D saturated flow to 2-D variably saturated problems. The verification tests showed excellent quantitative agreement between the FLASH results and analytical solutions. The validation tests showed good qualitative agreement with the experimental data. Based on the results of this testing, it was concluded that the FLASH code is a versatile and powerful two-dimensional analysis tool for fluid flow. All aspects of the code that were tested, except for the unit gradient bottom boundary condition, were found to be fully operational and ready for use in hydrological and environmental studies.
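
    The relative root-mean-square values mentioned above quantify how far a computed solution departs from an analytical or experimental reference. One plausible form of the metric is sketched below; the report may define it differently, and the function name is illustrative.

```python
import numpy as np

def relative_rms_percent(computed, reference):
    """Relative root-mean-square difference, in percent, between a computed
    solution and a reference (analytical or experimental) solution.
    One possible definition of the metric named in the abstract."""
    c = np.asarray(computed, dtype=float)
    ref = np.asarray(reference, dtype=float)
    return float(100.0 * np.sqrt(np.mean((c - ref) ** 2))
                 / np.sqrt(np.mean(ref ** 2)))
```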

  9. Independent verification and validation testing of the FLASH computer code, Version 3.0

    SciTech Connect

    Martian, P.; Chung, J.N. (Dept. of Mechanical and Materials Engineering)

    1992-06-01

    Independent testing of the FLASH computer code, Version 3.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at various Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Verification tests and validation tests were used to determine the operational status of the FLASH computer code. These tests were specifically designed to test the correctness of the FORTRAN coding, computational accuracy, and suitability for simulating actual hydrologic conditions. The testing was performed using a structured evaluation protocol consisting of blind testing, independent applications, and test cases of graduated difficulty. Both quantitative and qualitative testing were performed by evaluating relative root-mean-square values and graphical comparisons of the numerical, analytical, and experimental data. Four verification tests were used to check the computational accuracy and correctness of the FORTRAN coding, and three validation tests were used to check the suitability for simulating actual conditions. These test cases ranged in complexity from simple 1-D saturated flow to 2-D variably saturated problems. The verification tests showed excellent quantitative agreement between the FLASH results and analytical solutions. The validation tests showed good qualitative agreement with the experimental data. Based on the results of this testing, it was concluded that the FLASH code is a versatile and powerful two-dimensional analysis tool for fluid flow. All aspects of the code that were tested, except for the unit gradient bottom boundary condition, were found to be fully operational and ready for use in hydrological and environmental studies.

  10. Cryogenic fluid management experiment trunnion fatigue verification

    NASA Technical Reports Server (NTRS)

    Bailey, W. J.; Fester, D. A.; Toth, J. M., Jr.; Kasper, H. J.

    1983-01-01

    A subcritical liquid hydrogen orbital storage and transfer experiment was designed for flight in the Shuttle cargo bay. The Cryogenic Fluid Management Experiment (CFME) includes a liquid hydrogen tank supported in a vacuum jacket by two fiberglass epoxy trunnion mounts. This composite material was selected for the trunnions since it provides desirable strength, weight and thermal characteristics for supporting cryogenic tankage. An experimental program was conducted to provide material property and fatigue data for S-glass epoxy composite materials at ambient and liquid hydrogen temperatures and to verify structural integrity of the CFME trunnion supports.

  11. Methodology evaluation: Effects of independent verification and integration on one class of application

    NASA Technical Reports Server (NTRS)

    Page, J.

    1981-01-01

    The effects of an independent verification and integration (V and I) methodology on one class of application are described. Resource profiles are discussed. The development environment is reviewed. Seven measures are presented to test the hypothesis that V and I improve the development and product. The V and I methodology provided: (1) a decrease in requirements ambiguities and misinterpretation; (2) no decrease in design errors; (3) no decrease in the cost of correcting errors; (4) a decrease in the cost of system and acceptance testing; (5) an increase in early discovery of errors; (6) no improvement in the quality of software put into operation; and (7) a decrease in productivity and an increase in cost.

  12. Independent Verification and Validation of Complex User Interfaces: A Human Factors Approach

    NASA Technical Reports Server (NTRS)

    Whitmore, Mihriban; Berman, Andrea; Chmielewski, Cynthia

    1996-01-01

    The Usability Testing and Analysis Facility (UTAF) at the NASA Johnson Space Center has identified and evaluated a potential automated software interface inspection tool capable of assessing the degree to which space-related critical and high-risk software system user interfaces meet objective human factors standards across each NASA program and project. Testing consisted of two distinct phases. Phase 1 compared analysis times and similarity of results for the automated tool and for human-computer interface (HCI) experts. In Phase 2, HCI experts critiqued the prototype tool's user interface. Based on this evaluation, it appears that a more fully developed version of the tool will be a promising complement to a human factors-oriented independent verification and validation (IV&V) process.

  13. Independent calculation-based verification of IMRT plans using a 3D dose-calculation engine

    SciTech Connect

    Arumugam, Sankar; Xing, Aitang; Goozee, Gary; Holloway, Lois

    2013-01-01

    Independent monitor unit verification of intensity-modulated radiation therapy (IMRT) plans requires detailed 3-dimensional (3D) dose verification. The aim of this study was to investigate using a 3D dose engine in a second commercial treatment planning system (TPS) for this task, facilitated by in-house software. Our department has XiO and Pinnacle TPSs, both with IMRT planning capability and modeled for an Elekta-Synergy 6 MV photon beam. These systems allow the transfer of computed tomography (CT) data and RT structures between them but do not allow IMRT plans to be transferred. To provide this connectivity, an in-house computer programme was developed to convert radiation therapy prescription (RTP) files as generated by many planning systems into either XiO or Pinnacle IMRT file formats. Utilization of the technique and software was assessed by transferring 14 IMRT plans from XiO and Pinnacle onto the other system and performing 3D dose verification. The accuracy of the conversion process was checked by comparing the 3D dose matrices and dose volume histograms (DVHs) of structures for the recalculated plan on the same system. The developed software successfully transferred IMRT plans generated by 1 planning system into the other. Comparison of planning target volume (PTV) DVHs for the original and recalculated plans showed good agreement; a maximum difference of 2% in mean dose, − 2.5% in D95, and 2.9% in V95 was observed. Similarly, a DVH comparison of organs at risk showed a maximum difference of +7.7% between the original and recalculated plans for structures in both high- and medium-dose regions. However, for structures in low-dose regions (less than 15% of prescription dose) a difference in mean dose up to +21.1% was observed between XiO and Pinnacle calculations. A dose matrix comparison of original and recalculated plans in XiO and Pinnacle TPSs was performed using gamma analysis with 3%/3 mm criteria. The mean and standard deviation of pixels passing
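
    The D95 and V95 figures compared above can be computed directly from a structure's voxel doses. A minimal sketch follows; the function and parameter names are illustrative, not from the authors' software.

```python
import numpy as np

def dvh_metrics(voxel_doses_gy, prescription_gy):
    """D95 (minimum dose received by 95% of the volume) and V95 (percent
    of the volume receiving at least 95% of the prescription dose),
    computed from a structure's voxel doses. Illustrative sketch only."""
    d = np.asarray(voxel_doses_gy, dtype=float)
    d95 = float(np.percentile(d, 5))   # 95% of voxels receive at least this dose
    v95 = float((d >= 0.95 * prescription_gy).mean() * 100.0)
    return d95, v95
```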

  14. Independent calculation-based verification of IMRT plans using a 3D dose-calculation engine.

    PubMed

    Arumugam, Sankar; Xing, Aitang; Goozee, Gary; Holloway, Lois

    2013-01-01

    Independent monitor unit verification of intensity-modulated radiation therapy (IMRT) plans requires detailed 3-dimensional (3D) dose verification. The aim of this study was to investigate using a 3D dose engine in a second commercial treatment planning system (TPS) for this task, facilitated by in-house software. Our department has XiO and Pinnacle TPSs, both with IMRT planning capability and modeled for an Elekta-Synergy 6 MV photon beam. These systems allow the transfer of computed tomography (CT) data and RT structures between them but do not allow IMRT plans to be transferred. To provide this connectivity, an in-house computer programme was developed to convert radiation therapy prescription (RTP) files as generated by many planning systems into either XiO or Pinnacle IMRT file formats. Utilization of the technique and software was assessed by transferring 14 IMRT plans from XiO and Pinnacle onto the other system and performing 3D dose verification. The accuracy of the conversion process was checked by comparing the 3D dose matrices and dose volume histograms (DVHs) of structures for the recalculated plan on the same system. The developed software successfully transferred IMRT plans generated by 1 planning system into the other. Comparison of planning target volume (PTV) DVHs for the original and recalculated plans showed good agreement; a maximum difference of 2% in mean dose, - 2.5% in D95, and 2.9% in V95 was observed. Similarly, a DVH comparison of organs at risk showed a maximum difference of +7.7% between the original and recalculated plans for structures in both high- and medium-dose regions. However, for structures in low-dose regions (less than 15% of prescription dose) a difference in mean dose up to +21.1% was observed between XiO and Pinnacle calculations. A dose matrix comparison of original and recalculated plans in XiO and Pinnacle TPSs was performed using gamma analysis with 3%/3 mm criteria.
The mean and standard deviation of pixels passing gamma
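
    The gamma analysis with 3%/3 mm criteria mentioned above combines a dose-difference test with a distance-to-agreement (DTA) search. The sketch below is a simplified, brute-force 2D version for illustration, with a global dose criterion; it is not a clinical implementation, and all names are assumptions.

```python
import numpy as np

def gamma_pass_rate(ref_dose, eval_dose, spacing_mm, dose_tol=0.03, dta_mm=3.0):
    """Simplified global 2D gamma analysis (3%/3 mm by default).

    ref_dose, eval_dose: 2D dose arrays on the same grid;
    spacing_mm: pixel spacing in mm. Brute-force search over all shifts
    within the DTA radius; returns the percent of pixels with gamma <= 1.
    """
    dose_crit = dose_tol * ref_dose.max()     # global dose-difference criterion
    r = int(np.ceil(dta_mm / spacing_mm))     # search radius in pixels
    gamma_sq = np.full(ref_dose.shape, np.inf)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            dist_sq = (dy * spacing_mm) ** 2 + (dx * spacing_mm) ** 2
            if dist_sq > dta_mm ** 2:
                continue
            shifted = np.roll(np.roll(eval_dose, dy, axis=0), dx, axis=1)
            dose_diff_sq = ((shifted - ref_dose) / dose_crit) ** 2
            gamma_sq = np.minimum(gamma_sq, dose_diff_sq + dist_sq / dta_mm ** 2)
    return float((np.sqrt(gamma_sq) <= 1.0).mean() * 100.0)
```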

  15. Independent Verification and Validation Of SAPHIRE 8 Software Requirements Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE requirements definition is to assess the activities that result in the specification, documentation, and review of the requirements that the software product must satisfy, including functionality, performance, design constraints, attributes, and external interfaces. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production. IV&V reviewed the requirements specified in the NRC Form 189s to verify that these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP).
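
    The traceability check described above, verifying that every requirement from the source documents appears in the verification plan, amounts to a set difference. A toy sketch, with hypothetical requirement identifiers:

```python
def untraced_requirements(source_reqs, plan_reqs):
    """Requirement IDs present in the source documents (e.g., the NRC
    Form 189s) but absent from the verification plan (e.g., the SVVP).
    Illustrative only; real traceability tools track far more metadata."""
    return sorted(set(source_reqs) - set(plan_reqs))
```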

  16. Independent Verification and Validation Of SAPHIRE 8 Software Requirements Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2009-09-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE requirements definition is to assess the activities that result in the specification, documentation, and review of the requirements that the software product must satisfy, including functionality, performance, design constraints, attributes, and external interfaces. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production. IV&V reviewed the requirements specified in the NRC Form 189s to verify that these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP).

  17. US-VISIT Independent Verification and Validation Project: Test Bed Establishment Report

    SciTech Connect

    Jensen, N W; Gansemer, J D

    2011-01-21

    This document describes the computational and data systems available at the Lawrence Livermore National Laboratory for use on the US-VISIT Independent Verification and Validation (IV&V) project. This system, composed of data, software, and hardware, is designed to be as close a representation of the operational ADIS system as is required to verify and validate US-VISIT methodologies; it is not required to reproduce the computational capabilities of the enterprise-class operational system. During FY10, the test bed was simplified from the FY09 version by reducing the number of database host computers from three to one, significantly reducing maintenance and overhead while increasing system throughput. During the current performance period, a database transfer was performed as a set of Data Pump Export files. The previous RMAN backup from 2007 required the availability of an AIX system, which is not needed when using Data Pump. Due to efficiencies in the new system and process, the database refresh was loaded in a much shorter time frame than was previously required. The FY10 Oracle test bed now consists of a single Linux platform hosting two Oracle databases, including the 2007 copy as well as the October 2010 refresh.

  18. Independent verification of plutonium decontamination on Johnston Atoll (1992--1996)

    SciTech Connect

    Wilson-Nichols, M.J.; Wilson, J.E.; McDowell-Boyer, L.M.; Davidson, J.R.; Egidi, P.V.; Coleman, R.L.

    1998-05-01

    The Field Command, Defense Special Weapons Agency (FCDSWA) (formerly FCDNA) contracted Oak Ridge National Laboratory (ORNL) Environmental Technology Section (ETS) to conduct an independent verification (IV) of the Johnston Atoll (JA) Plutonium Decontamination Project by an interagency agreement with the US Department of Energy in 1992. The main island is contaminated with the transuranic elements plutonium and americium, and soil decontamination activities have been ongoing since 1984. FCDSWA has selected a remedy that employs a system of sorting contaminated particles from the coral/soil matrix, allowing uncontaminated soil to be reused. The objective of IV is to evaluate the effectiveness of remedial action. The IV contractor's task is to determine whether the remedial action contractor has effectively reduced contamination to levels within established criteria and whether the supporting documentation describing the remedial action is adequate. ORNL conducted four interrelated tasks from 1992 through 1996 to accomplish the IV mission. This document is a compilation and summary of those activities, in addition to a comprehensive review of the history of the project.

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT STORMWATER MANAGEMENT INC., STORMFILTER SYSTEM WITH ZPG MEDIA

    EPA Science Inventory

    Verification testing of the Stormwater Management, Inc. StormFilter Using ZPG Filter Media was conducted on a 0.19 acre portion of the eastbound highway surface of Interstate 794, at an area commonly referred to as the "Riverwalk" site near downtown Milwaukee, Wisconsin...

  20. Stirling Research Laboratory Providing Independent Performance Verification of Convertors for a Stirling Radioisotope Generator

    NASA Technical Reports Server (NTRS)

    Thieme, Lanny G.

    2002-01-01

    The Department of Energy (DOE), Germantown, Maryland, Stirling Technology Company (STC), Kennewick, Washington, and NASA Glenn Research Center are developing a free-piston Stirling convertor for a high-efficiency Stirling Radioisotope Generator for NASA Space Science missions. This generator is being developed for multimission use, including providing electric power for unmanned Mars rovers and for deep space missions. STC is developing the 55-W Technology Demonstration Convertor (TDC) under contract to DOE. Glenn is conducting an in-house technology project to assist in developing the convertor for readiness for space qualification and mission implementation. As part of this effort, a Stirling Research Laboratory was established to test the TDC's and related technologies. A key task is providing an independent verification and validation of the TDC performance. Four TDC's are now being tested at Glenn. Acceptance testing has been completed for all convertors, and in general, performance agreed well with that achieved by STC prior to the delivery of the convertors. Performance mapping has also been completed on two of the convertors over a range of hot-end temperatures (450 to 650 C), cold-end temperatures (80 to 120 C), and piston amplitudes (5.2 to 6.2 mm). These test data are available online at http://www.grc.nasa.gov/WWW/tmsb/. The TDC's can be tested in either a horizontal orientation with dual-opposed convertors or in a vertical orientation with a single convertor. Synchronized dual-opposed pairs are used for dynamically balanced operation that results in very low levels of vibration. The Stirling Research Laboratory also supports launch environment testing of the TDC's in Glenn's Structural Dynamics Laboratory and electromagnetic interference and electromagnetic compatibility characterization and reduction efforts. In addition, the TDC's will be used for long-term endurance testing, and preparations are underway for unattended operation.

  1. Business Management for Independent Schools. Third Edition.

    ERIC Educational Resources Information Center

    National Association of Independent Schools, Boston, MA.

    This business management manual discusses school accounting and reporting principles; in particular, financial management, computerization, and records retention techniques. First is described the basic accounting principles, plant funds, endowment funds, operational funds, chart of accounts, and financial states of the school's annual financial…

  2. The Business Manager in the Independent School.

    ERIC Educational Resources Information Center

    Ritter, Paul M.

    The responsibilities of the school business manager have grown more complex since this manual was first published in 1967. As a consequence, this edition updates it and defines the complexities that have evolved since then. The guide begins with an outline of the work of the business manager and the relation of that work to the rest of the school.…

  3. A Multitier System for the Verification, Visualization and Management of CHIMERA

    SciTech Connect

    Lingerfelt, Eric J; Messer, Bronson; Osborne, James A; Budiardja, R. D.; Mezzacappa, Anthony

    2011-01-01

    CHIMERA is a multi-dimensional radiation hydrodynamics code designed to study core-collapse supernovae. The code is made up of three essentially independent parts: a hydrodynamics module, a nuclear burning module, and a neutrino transport solver combined within an operator-split approach. Given CHIMERA's complexity and pace of ongoing development, a new support system, Bellerophon, has been designed and implemented to perform automated verification, visualization and management tasks while integrating with other workflow systems utilized by CHIMERA's development group. In order to achieve these goals, a multitier approach has been adopted. By integrating supercomputing platforms, visualization clusters, a dedicated web server and a client-side desktop application, this system attempts to provide an encapsulated, end-to-end solution to these needs.

  4. ENVIRONMENTAL TECHNOLOGY VERIFICATION: A VEHICLE FOR INDEPENDENT, CREDIBLE PERFORMANCE RESULTS ON COMMERCIALLY READY TECHNOLOGIES

    EPA Science Inventory

    The paper discusses the U. S. Environmental Protection Agency's Environmental Technology Verification (ETV) Program: its history, operations, past successes, and future plans. Begun in 1995 in response to President Clinton's "Bridge to a Sustainable Future" as a means to work wit...

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM: QUALITY AND MANAGEMENT PLAN FOR THE PILOT PERIOD (1995-2000)

    EPA Science Inventory

    Based upon the structure and specifications in ANSI/ASQC E4-1994, Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs, the Environmental Technology Verification (ETV) program Quality and Management Plan (QMP) f...

  6. VERIFICATION TESTING OF AIR POLLUTION CONTROL TECHNOLOGY QUALITY MANAGEMENT PLAN

    EPA Science Inventory

    This document is the basis for quality assurance for the Air Pollution Control Technology Verification Center (APCT Center) operated under the U.S. Environmental Protection Agency (EPA). It describes the policies, organizational structure, responsibilities, procedures, and qualit...

  7. Inside Sweden's Independent Public Schools: Innovations in Management.

    ERIC Educational Resources Information Center

    Raham, Helen

    2003-01-01

    Profiles three Swedish tuition-free, independent public schools. Independent schools were formed after the Swedish government enacted school choice legislation in 1992 resulting in the replacement of private schools with a system of tuition-free, self-managed public schools. These schools (now over 800) provide parents with alternatives to…

  8. An independent system for real-time dynamic multileaf collimation trajectory verification using EPID.

    PubMed

    Fuangrod, Todsaporn; Woodruff, Henry C; Rowshanfarzad, Pejman; O'Connor, Daryl J; Middleton, Richard H; Greer, Peter B

    2014-01-01

    A new tool has been developed to verify the trajectory of dynamic multileaf collimators (MLCs) used in advanced radiotherapy techniques, using only measured image frames from the electronic portal imaging device (EPID). The prescribed leaf positions are resampled to a higher resolution in a pre-processing stage to improve the verification precision. Measured MLC positions are extracted from the EPID frames using a template matching method. A cosine similarity metric is then applied to synchronise measured and planned leaf positions for comparison. Three additional comparison functions were incorporated to ensure robust synchronisation. MLC leaf trajectory error detection was simulated for both intensity modulated radiation therapy (IMRT) (prostate) and volumetric modulated arc therapy (VMAT) (head-and-neck) deliveries with anthropomorphic phantoms in the beam. The overall accuracy of MLC positions automatically extracted from EPID image frames was approximately 0.5 mm. The MLC leaf trajectory verification system can detect leaf position errors during IMRT and VMAT with a tolerance of 3.5 mm within 1 s. PMID:24334552
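    The synchronisation idea from this abstract can be sketched as follows: for each measured EPID frame, pick the planned control point whose leaf-position vector is most similar under the cosine metric. This is an illustrative reconstruction, not the authors' code; the leaf-position values are invented.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two leaf-position vectors."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def synchronise(measured_frame, planned_trajectory):
    """Index of the planned control point whose leaf positions best
    match the measured EPID frame under the cosine metric."""
    scores = [cosine_similarity(measured_frame, p) for p in planned_trajectory]
    return int(np.argmax(scores))

# Toy example: 3 planned control points for a 5-leaf bank (positions in mm)
planned = np.array([[10.0, 12.0, 15.0, 12.0, 10.0],
                    [20.0, 22.0, 25.0, 22.0, 20.0],
                    [30.0, 32.0, 35.0, 32.0, 30.0]])
measured = np.array([19.6, 22.3, 24.8, 21.9, 20.2])  # noisy measured frame
print(synchronise(measured, planned))  # → 1 (middle control point)
```

    In practice additional comparison functions (as the paper notes) are needed, because nearby control points can have nearly identical similarity scores.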

  9. Final Report Independent Verification Survey of the High Flux Beam Reactor, Building 802 Fan House Brookhaven National Laboratory Upton, New York

    SciTech Connect

    Harpeneau, Evan M.

    2011-06-24

    On May 9, 2011, ORISE conducted verification survey activities including scans, sampling, and the collection of smears of the remaining soils and off-gas pipe associated with the 802 Fan House within the HFBR (High Flux Beam Reactor) Complex at BNL. ORISE is of the opinion, based on independent scan and sample results obtained during verification activities at the HFBR 802 Fan House, that the FSS (final status survey) unit meets the applicable site cleanup objectives established for as-left radiological conditions.

  10. SU-E-T-505: CT-Based Independent Dose Verification for RapidArc Plan as a Secondary Check

    SciTech Connect

    Tachibana, H; Baba, H; Kamima, T; Takahashi, R

    2014-06-01

    Purpose: To design and develop a CT-based independent dose verification for the RapidArc plan, and to show the effectiveness of inhomogeneity correction in the secondary check of the plan. Methods: To independently compute the radiological path from the body surface to the reference point and the equivalent field sizes from the multiple MLC aperture shapes in the RapidArc MLC sequences, DICOM files of the CT image, structure set, and RapidArc plan were imported into our in-house software. The radiological path was computed using three-dimensional CT arrays for each segment. The multiple MLC aperture shapes were used to compute the tissue maximum ratio and phantom scatter factor using the Clarkson method. In this study, two RapidArc plans for oropharynx cancer were used to compare the doses from the CT-based calculation and from a water-equivalent phantom calculation using the contoured body structure against the dose from the treatment planning system (TPS). Results: For one plan, both calculations show good agreement (within 1%). In the other case, however, the CT-based calculation agrees better than the water-equivalent phantom calculation (CT-based: -2.8% vs. water-based: -3.8%), because multiple structures lay along the beam paths and the radiological path length in the CT-based calculation differed considerably from the path in the water-homogeneous phantom calculation. Conclusion: RapidArc treatments are delivered at many sites (head, chest, abdomen, and pelvis) that include inhomogeneous media. Therefore, the more reliable CT-based calculation may be used as a secondary check for independent verification.

  11. The integrated exposure uptake biokinetic model for lead in children: independent validation and verification.

    PubMed Central

    Zaragoza, L; Hogan, K

    1998-01-01

    The U.S. Environmental Protection Agency employs a model, the integrated exposure uptake biokinetic (IEUBK) model for lead in children, for the assessment of risks to children posed by environmental lead at hazardous waste sites. This paper describes the results of an effort to verify the consistency of the documentation with the computer model and to test the computer code using a group independent from those involved in the model's development. The review concluded that the IEUBK model correctly calculates the equations specified in the IEUBK model theory documentation. However, several issues were identified with the model documentation, model performance, and the documentation of the C++ source code. These issues affect the ability of an independent reviewer to understand the workings of the IEUBK model, but not the model's reliability. As a result of these findings, recommendations have been provided for updating the model documentation. PMID:9860914

  12. Imaging for dismantlement verification: information management and analysis algorithms

    SciTech Connect

    Robinson, Sean M.; Jarman, Kenneth D.; Pitts, W. Karl; Seifert, Allen; Misner, Alex C.; Woodring, Mitchell L.; Myjak, Mitchell J.

    2012-01-11

    The level of detail discernible in imaging techniques has generally excluded them from consideration as verification tools in inspection regimes. An image will almost certainly contain highly sensitive information, and storing a comparison image will almost certainly violate a cardinal principle of information barriers: that no sensitive information be stored in the system. To overcome this problem, some features of the image might be reduced to a few parameters suitable for definition as an attribute, which must be non-sensitive to be acceptable in an Information Barrier regime. However, this process must be performed with care. Features like the perimeter, area, and intensity of an object, for example, might reveal sensitive information. Any data-reduction technique must provide sufficient information to discriminate a real object from a spoofed or incorrect one, while avoiding disclosure (or storage) of any sensitive object qualities. Ultimately, algorithms are intended to provide only a yes/no response verifying the presence of features in the image. We discuss the utility of imaging for arms control applications and present three image-based verification algorithms in this context. The algorithms reduce full image information to non-sensitive feature information, in a process that is intended to enable verification while eliminating the possibility of image reconstruction. The underlying images can be highly detailed, since they are dynamically generated behind an information barrier. We consider the use of active (conventional) radiography alone and in tandem with passive (auto) radiography. We study these algorithms in terms of technical performance in image analysis and application to an information barrier scheme.

  13. A simple method of independent treatment time verification in gamma knife radiosurgery using integral dose

    SciTech Connect

    Jin Jianyue; Drzymala, Robert; Li Zuofeng

    2004-12-01

    The purpose of this study is to develop a simple independent dose calculation method to verify treatment plans for Leksell Gamma Knife radiosurgery. Our approach uses the total integral dose within the skull as an end point for comparison. The total integral dose is computed using a spreadsheet and is compared to that obtained from Leksell GammaPlan®. It is calculated as the sum of the integral doses of 201 beams, each passing through a cylindrical volume. The average length of the cylinders is estimated from the Skull-Scaler measurement data taken before treatment. Correction factors are applied to the length of the cylinder depending on the location of a shot in the skull. The radius of the cylinder corresponds to the collimator aperture of the helmet, with a correction factor for the beam penumbra and scattering. We have tested our simple spreadsheet program using treatment plans of 40 patients treated with the Gamma Knife® in our center. These patients differ in geometry, size, lesion locations, collimator helmet, and treatment complexities. Results show that differences between our calculations and treatment planning results are typically within ±3%, with a maximum difference of ±3.8%. We demonstrate that our spreadsheet program is a convenient and effective independent method to verify treatment planning irradiation times prior to implementation of Gamma Knife radiosurgery.
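    The spreadsheet check described above can be sketched in a few lines: each beam is modelled as a cylinder whose radius comes from the collimator and whose length comes from the skull measurements, and the 201 per-beam integral doses are summed. This is a simplified illustration; the paper's penumbra, scatter, and shot-location correction factors are omitted, and all numbers below are hypothetical.

```python
import math

def beam_integral_dose(dose_gy, radius_mm, length_mm):
    """Integral dose (Gy·mm³) of one beam, modelled as a cylinder of
    tissue with the collimator radius and the estimated path length."""
    return dose_gy * math.pi * radius_mm ** 2 * length_mm

def total_integral_dose(shots, beams_per_shot=201):
    """Sum of the cylinder integral doses over all beams of every shot.
    Each shot is (dose per beam in Gy, collimator radius mm, mean path mm)."""
    return sum(beams_per_shot * beam_integral_dose(d, r, l)
               for d, r, l in shots)

# Hypothetical plan: one shot with an 8 mm helmet (4 mm beam radius)
independent = total_integral_dose([(0.05, 4.0, 160.0)])
planned = 8.0e4  # value that would come from the planning system (invented)
print(abs(independent - planned) / planned < 0.03)  # inside a ±3% tolerance?
```

    The final comparison mirrors the ±3% agreement criterion the authors report against GammaPlan.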

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: STORMWATER SOURCE AREA TREATMENT DEVICE - STORMWATER MANAGEMENT INC., CATCH BASIN STORMFILTER®

    EPA Science Inventory

    Verification testing of the Stormwater Management CatchBasin StormFilter® (CBSF) was conducted on a 0.16 acre drainage basin at the City of St. Clair Shores, Michigan Department of Public Works facility. The four-cartridge CBSF consists of a storm grate and filter chamber inlet b...

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: STORMWATER MANAGEMENT STORMFILTER® TREATMENT SYSTEM USING PERLITE MEDIA

    EPA Science Inventory

    Verification testing of the Stormwater Management, Inc. StormFilter® Using Perlite Filter Media was conducted on a 0.7 acre drainage basin near downtown Griffin, Georgia. The system consists of an inlet bay, flow spreader, cartridge bay, overflow baffle, and outlet bay, housed in...

  16. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: STORMWATER SOURCE AREA TREATMENT DEVICE: STORMWATER MANAGEMENT INC., STORMSCREEN® TREATMENT SYSTEM

    EPA Science Inventory

    Verification Testing of the Stormwater Management, Inc. StormScreen treatment technology was performed during a 12-month period starting in May, 2003. The system was previously installed in a city-owned right-of-way near downtown Griffin, GA., and is a device for removing trash,...

  17. Verification of a Quality Management Theory: Using a Delphi Study

    PubMed Central

    Mosadeghrad, Ali Mohammad

    2013-01-01

    Background: A model of quality management called Strategic Collaborative Quality Management (SCQM) model was developed based on the quality management literature review, the findings of a survey on quality management assessment in healthcare organisations, semi-structured interviews with healthcare stakeholders, and a Delphi study on healthcare quality management experts. The purpose of this study was to verify the SCQM model. Methods: The proposed model was further developed using feedback from thirty quality management experts using a Delphi method. Further, a guidebook for its implementation was prepared including a road map and performance measurement. Results: The research led to the development of a context-specific model of quality management for healthcare organisations and a series of guidelines for its implementation. Conclusion: A proper model of quality management should be developed and implemented properly in healthcare organisations to achieve business excellence. PMID:24596883

  18. Advanced reservoir management for independent oil and gas producers

    SciTech Connect

    Sgro, A.G.; Kendall, R.P.; Kindel, J.M.; Webster, R.B.; Whitney, E.M.

    1996-11-01

    There are more than fifty-two hundred oil and gas producers operating in the United States today. Many of these companies have instituted improved oil recovery programs in some form, but very few have had access to state-of-the-art modeling technologies routinely used by major producers to manage these projects. Since independent operators are playing an increasingly important role in the production of hydrocarbons in the United States, it is important to promote state-of-the-art management practices, including the planning and monitoring of improved oil recovery projects, within this community. This is one of the goals of the Strategic Technologies Council, a special interest group of independent oil and gas producers. Reservoir management technologies have the potential to increase oil recovery while simultaneously reducing production costs. These technologies were pioneered by major producers and are routinely used by them. Independent producers confront two problems adopting this approach: the high cost of acquiring these technologies and the high cost of using them even if they were available. Effective use of reservoir management tools requires, in general, the services of a professional (geoscientist or engineer) who is already familiar with the details of setting up, running, and interpreting computer models.

  19. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 & Vol 2

    SciTech Connect

    PARSONS, J.E.

    2000-07-15

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of ''Do work safely.'' The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented.

  20. Independent Verification and Validation Of SAPHIRE 8 Software Design and Interface Design Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE software design and interface design is to assess the activities that result in the development, documentation, and review of a software design that meets the requirements defined in the software requirements documentation. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production. IV&V reviewed the requirements specified in the NRC Form 189s to verify that these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP) design specification.

  1. Independent Verification and Validation Of SAPHIRE 8 Software Design and Interface Design Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2009-10-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE software design and interface design is to assess the activities that result in the development, documentation, and review of a software design that meets the requirements defined in the software requirements documentation. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production. IV&V reviewed the requirements specified in the NRC Form 189s to verify that these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP) design specification.

  2. Independent Verification Survey of the Clean Coral Storage Pile at the Johnston Atoll Plutonium Contaminated Soil Remediation Project

    SciTech Connect

    Wilson-Nichols, M.J.; Egidi, P.V.; Roemer, E.K.; Schlosser, R.M.

    2000-09-01

    The Oak Ridge National Laboratory (ORNL) Environmental Technology Section conducted an independent verification (IV) survey of the clean storage pile at the Johnston Atoll Plutonium Contaminated Soil Remediation Project (JAPCSRP) from January 18-25, 1999. The goal of the JAPCSRP is to restore a 24-acre area that was contaminated with plutonium oxide particles during nuclear testing in the 1960s. The selected remedy was a soil sorting operation that combined radiological measurements and mining processes to identify and sequester plutonium-contaminated soil. The soil sorter operated from about 1990 to 1998. The remaining clean soil is stored on-site for planned beneficial use on Johnston Island. The clean storage pile currently consists of approximately 120,000 m3 of coral. ORNL conducted the survey according to a Sampling and Analysis Plan, which proposed to provide an IV of the clean pile by collecting a minimum number (99) of samples. The goal was to ascertain with 95% confidence whether 97% of the processed soil is at or below the accepted guideline (500 Bq/kg, or 13.5 pCi/g) for total transuranic (TRU) activity.
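    The 99-sample figure is consistent with a standard nonparametric argument: if more than 3% of the pile exceeded the guideline, the chance that n random samples all pass is at most 0.97^n, so n is chosen to push that probability below 5%. A minimal sketch of that arithmetic (an illustration, not ORNL's actual sampling plan):

```python
import math

def min_samples(confidence=0.95, fraction=0.97):
    """Smallest n with fraction**n <= 1 - confidence, i.e. enough random
    samples that a pile with more than (1 - fraction) of its soil over
    the guideline would, with the stated confidence, yield at least one
    failing sample."""
    return math.ceil(math.log(1.0 - confidence) / math.log(fraction))

print(min_samples())  # → 99 samples for 95% confidence that 97% comply
```

    The same formula with fraction=0.95 gives the classic "59 samples" rule sometimes used in compliance surveys.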

  3. Imaging for dismantlement verification: information management and analysis algorithms

    SciTech Connect

    Seifert, Allen; Miller, Erin A.; Myjak, Mitchell J.; Robinson, Sean M.; Jarman, Kenneth D.; Misner, Alex C.; Pitts, W. Karl; Woodring, Mitchell L.

    2010-09-01

    The level of detail discernible in imaging techniques has generally excluded them from consideration as verification tools in inspection regimes. An image will almost certainly contain highly sensitive information, and storing a comparison image will almost certainly violate a cardinal principle of information barriers: that no sensitive information be stored in the system. To overcome this problem, some features of the image might be reduced to a few parameters suitable for definition as an attribute. However, this process must be performed with care. Computing the perimeter, area, and intensity of an object, for example, might reveal sensitive information relating to shape, size, and material composition. This paper presents three analysis algorithms that reduce full image information to non-sensitive feature information. Ultimately, the algorithms are intended to provide only a yes/no response verifying the presence of features in the image. We evaluate the algorithms on both their technical performance in image analysis, and their application with and without an explicitly constructed information barrier. The underlying images can be highly detailed, since they are dynamically generated behind the information barrier. We consider the use of active (conventional) radiography alone and in tandem with passive (auto) radiography.

  4. A DVE Time Management Simulation and Verification Platform Based on Causality Consistency Middleware

    NASA Astrophysics Data System (ADS)

    Zhou, Hangjun; Zhang, Wei; Peng, Yuxing; Li, Sikun

    When designing a time management algorithm for DVEs, researchers are often slowed by having to implement the trivial but fundamental details of simulation and verification. A platform that already provides these details is therefore desirable, yet to our knowledge none has been achieved in any published work. In this paper, we are the first to design and realize a DVE time management simulation and verification platform providing exactly the same interfaces as those defined by the HLA Interface Specification. Moreover, our platform is based on a newly designed causality consistency middleware and offers a comparison of three kinds of time management services: CO, RO and TSO. The experimental results show that the platform incurs only a small overhead and that its performance effectively lets researchers focus solely on improving their algorithm designs.

  5. DOE handbook: Integrated safety management systems (ISMS) verification team leader's handbook

    SciTech Connect

    1999-06-01

    The primary purpose of this handbook is to provide guidance to the ISMS verification Team Leader and the verification team in conducting ISMS verifications. The handbook describes methods and approaches for the review of the ISMS documentation (Phase I) and ISMS implementation (Phase II) and provides information useful to the Team Leader in preparing the review plan, selecting and training the team, coordinating the conduct of the verification, and documenting the results. The process and techniques described are based on the results of several pilot ISMS verifications that have been conducted across the DOE complex. A secondary purpose of this handbook is to provide information useful in developing DOE personnel to conduct these reviews. Specifically, this handbook describes methods and approaches to: (1) Develop the scope of the Phase 1 and Phase 2 review processes to be consistent with the history, hazards, and complexity of the site, facility, or activity; (2) Develop procedures for the conduct of the Phase 1 review, validating that the ISMS documentation satisfies the DEAR clause as amplified in DOE Policies 450.4, 450.5, 450.6 and associated guidance and that DOE can effectively execute responsibilities as described in the Functions, Responsibilities, and Authorities Manual (FRAM); (3) Develop procedures for the conduct of the Phase 2 review, validating that the description approved by the Approval Authority, following or concurrent with the Phase 1 review, has been implemented; and (4) Describe a methodology by which the DOE ISMS verification teams will be advised, trained, and/or mentored to conduct subsequent ISMS verifications. The handbook provides proven methods and approaches for verifying that commitments related to the DEAR, the FRAM, and associated amplifying guidance are in place and implemented in nuclear and high risk facilities. This handbook also contains useful guidance to line managers when preparing for a review of ISMS for radiological

  6. Checklists for Business Managers. A Tool for Effective Independent School Management.

    ERIC Educational Resources Information Center

    National Association of Independent Schools, Boston, MA.

    The business office guides of the departments of education of Illinois and New Jersey served as the basic resource documents in forming this guide for independent school business managers. The checklists are grouped under the following headings: financial management, insurance and risk management, records retention, purchasing, nonacademic staff,…

  7. Management of citation verification requests for multiple projects at Sandia National Laboratories

    SciTech Connect

    Crawford, C.S.

    1995-12-31

    Sandia National Laboratories' (SNL) Technical Library is now responsible for providing citation verification management support for all references cited in technical reports issued by the Nuclear Waste Management (NWM) Program. This paper describes how this process is managed for the Yucca Mountain Site Characterization Project (YMP), Waste Isolation Pilot Plant (WIPP), Idaho National Engineering Laboratory (INEL), and Greater Confinement Disposal (GCD) projects. Since technical reports are the main product of these projects, emphasis is placed on meeting the constantly evolving needs of these customers in a timely and cost-effective manner.

  8. Adaptive beamlet-based finite-size pencil beam dose calculation for independent verification of IMRT and VMAT

    SciTech Connect

    Park, Justin C.; Li, Jonathan G.; Arhjoul, Lahcen; Yan, Guanghua; Lu, Bo; Fan, Qiyong; Liu, Chihray

    2015-04-15

    Purpose: The use of sophisticated dose calculation procedures in modern radiation therapy treatment planning is inevitable in order to account for complex treatment fields created by multileaf collimators (MLCs). As a consequence, independent volumetric dose verification is time consuming, which affects the efficiency of clinical workflow. In this study, the authors present an efficient adaptive beamlet-based finite-size pencil beam (AB-FSPB) dose calculation algorithm that minimizes the computational procedure while preserving the accuracy. Methods: The computational time of the finite-size pencil beam (FSPB) algorithm is proportional to the number of infinitesimal and identical beamlets that constitute an arbitrary field shape. In AB-FSPB, the dose distribution from each beamlet is mathematically modeled such that the beamlets representing an arbitrary field shape no longer need to be infinitesimal nor identical. As a result, it is possible to represent an arbitrary field shape with combinations of differently sized beamlets and a minimal number of them. In addition, the authors included model parameters to account for the rounded leaf edge and transmission of the MLC. Results: Root mean square error (RMSE) between the treatment planning system and conventional FSPB on a 10 × 10 cm² square field using 10 × 10, 2.5 × 2.5, and 0.5 × 0.5 cm² beamlet sizes was 4.90%, 3.19%, and 2.87%, respectively, compared with RMSE of 1.10%, 1.11%, and 1.14% for AB-FSPB. This finding holds true for a larger square field size of 25 × 25 cm², where RMSE for 25 × 25, 2.5 × 2.5, and 0.5 × 0.5 cm² beamlet sizes was 5.41%, 4.76%, and 3.54% in FSPB, respectively, compared with RMSE of 0.86%, 0.83%, and 0.88% for AB-FSPB. It was found that AB-FSPB could successfully account for the MLC transmissions without major discrepancy. The algorithm was also graphical processing unit (GPU) compatible to maximize its computational speed. For an intensity modulated radiation therapy (
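    The comparison metric reported in this abstract can be illustrated with a short helper. The normalisation convention (percent of the reference grid's maximum dose) is an assumption for illustration, and the dose grids below are invented:

```python
import numpy as np

def rmse_percent(test_dose, ref_dose):
    """Root mean square error between two dose grids, expressed as a
    percentage of the reference grid's maximum dose (assumed convention)."""
    test = np.asarray(test_dose, dtype=float)
    ref = np.asarray(ref_dose, dtype=float)
    return 100.0 * np.sqrt(np.mean((test - ref) ** 2)) / ref.max()

# Toy 2x2 dose grids (Gy): a reference calculation and an independent one
ref = [[2.00, 1.00], [1.00, 0.50]]
test = [[2.02, 0.98], [1.01, 0.52]]
print(round(rmse_percent(test, ref), 2))  # → 0.9
```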

  9. Independent Verification and Validation Of SAPHIRE 8 Volume 3 Users' Guide Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE 8 Volume 3 Users’ Guide is to assess the user documentation for its completeness, correctness, and consistency with respect to requirements for user interface and for any functionality that can be invoked by the user. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production.

  10. Cryogenic Fluid Management Experiment (CFME) trunnion verification testing

    NASA Technical Reports Server (NTRS)

    Bailey, W. J.; Fester, D. A.

    1983-01-01

    The Cryogenic Fluid Management Experiment (CFME) was designed to characterize subcritical liquid hydrogen storage and expulsion in the low-g space environment. The CFME has now become the storage and supply tank for the Cryogenic Fluid Management Facility, which includes transfer line and receiver tanks as well. The liquid hydrogen storage and supply vessel is supported within a vacuum jacket by two fiberglass/epoxy composite trunnions, which were analyzed and designed. Analysis using the limited available data indicated the trunnion was the most fatigue-critical component in the storage vessel. Before committing the complete storage tank assembly to environmental testing, an experimental assessment was performed to verify the capability of the trunnion design to withstand expected vibration and loading conditions. Three tasks were conducted to evaluate trunnion integrity. The first determined the fatigue properties of the trunnion composite laminate materials. Tests at both ambient and liquid hydrogen temperatures showed composite material fatigue properties far in excess of those expected. Next, an assessment of the adequacy of the trunnion designs was performed (based on the tested material properties).

  11. Environmental Technology Verification Program Materials Management and Remediation Center Generic Protocol for Verification of In Situ Chemical Oxidation

    EPA Science Inventory

    The protocol provides generic procedures for implementing a verification test for the performance of in situ chemical oxidation (ISCO), focused specifically to expand the application of ISCO at manufactured gas plants with polyaromatic hydrocarbon (PAH) contamination (MGP/PAH) an...

  12. Energy management and control system verification study. Master's thesis

    SciTech Connect

    Boulware, K.E.; Williamson, G.C.

    1983-09-01

    Energy Management and Control Systems (EMCS) are being installed and operated throughout the Air Force. Millions of dollars have been spent on EMCS, but no study has conclusively proved that EMCS has actually saved the Air Force energy. This thesis used the regression subprogram of the Statistical Package for the Social Sciences (SPSS) to determine whether these systems are indeed saving the Air Force energy. Previous studies have shown that multiple linear regression (MLR) is the best statistical predictor of base energy consumption. Eight bases were selected that had an operational EMCS. Two EMCS bases were compared with one control base for each of four CONUS winter heating zones. The results indicated that small (less than 2%) energy savings have occurred at half of the EMCS bases studied. Therefore, this study does not conclusively prove that EMCSs have saved energy on Air Force bases. However, the methodology developed in this report could be applied on a broader scale to develop a more conclusive result.
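    The multiple-linear-regression approach described above can be sketched with an ordinary least-squares fit. The predictor variables (heating degree-days, floor area) and all numbers are hypothetical, chosen only to show the fitting step, not the thesis's actual data:

```python
import numpy as np

# Hypothetical monthly data per base: heating degree-days, floor area (Msqft)
X = np.array([[520.0, 1.2],
              [610.0, 1.2],
              [430.0, 1.5],
              [700.0, 1.5],
              [550.0, 1.8]])
y = np.array([810.0, 905.0, 790.0, 1010.0, 960.0])  # energy use (MWh), invented

A = np.column_stack([np.ones(len(X)), X])   # add an intercept column
coef, residuals, rank, _ = np.linalg.lstsq(A, y, rcond=None)
predicted = A @ coef                        # regression estimate per month
print(predicted.shape)
```

    Comparing such predictions for EMCS and control bases is what lets the study attribute (or fail to attribute) consumption differences to the EMCS itself.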

  13. Project TEAMS (Techniques and Education for Achieving Management Skills): Independent Business Owner/Managers.

    ERIC Educational Resources Information Center

    Platte Technical Community Coll., Columbus, NE.

    These Project TEAMS (Techniques and Education for Achieving Managerial Skills) instructional materials consist of five units for use in training independent business owner/managers. The first unit contains materials which deal with management skills relating to personal characteristics of successful business people, knowledge of self and chosen…

  14. Independent Business Owner/Managers. Project TEAMS. (Techniques and Education for Achieving Management Skills).

    ERIC Educational Resources Information Center

    Platte Technical Community Coll., Columbus, NE.

    Prepared as part of Platte Technical Community College's project to help managers and supervisors develop practical, up-to-date managerial skills in a relatively short time, this instructional workbook provides information and exercises applicable to on-the-job situations encountered by independent business owner/managers. Unit I provides…

  15. The SAMS: Smartphone Addiction Management System and verification.

    PubMed

    Lee, Heyoung; Ahn, Heejune; Choi, Samwook; Choi, Wanbok

    2014-01-01

    While the popularity of smartphones has given enormous convenience to our lives, their pathological use has created a new mental health concern in the community. Hence, intensive research is being conducted on the etiology and treatment of the condition. However, the traditional clinical approach based on surveys and interviews has serious limitations: health professionals cannot perform continual assessment and intervention for the affected group, and the subjectivity of assessment is questionable. To cope with these limitations, a comprehensive ICT (Information and Communications Technology) system called SAMS (Smartphone Addiction Management System) was developed for objective assessment and intervention. The SAMS system consists of an Android smartphone application and a web application server. The SAMS client monitors the user's application usage together with GPS location and Internet access location, and transmits the data to the SAMS server. The SAMS server stores the usage data and performs key statistical data analysis and usage intervention according to the clinicians' decisions. To verify the reliability and efficacy of the developed system, a comparison study with survey-based screening using the K-SAS (Korean Smartphone Addiction Scale), as well as self-field trials, was performed. The comparison study used usage data from 14 users, adults aged 19 to 50, who left at least 1 week of usage logs and completed the survey questionnaires. The field trial fully verified the accuracy of the time, location, and Internet access information in the usage measurement and the reliability of the system operation over more than 2 weeks. The comparison study showed that daily use count has a strong correlation with K-SAS scores, whereas daily use times do not strongly correlate for potentially addicted users. The correlation coefficients of count and time with total K-SAS score are CC = 0.62 and CC = 0.07, respectively, and the t-test analysis for the
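The reported correlation check (daily use count and daily use time against K-SAS score) amounts to a Pearson coefficient. A minimal sketch with synthetic stand-in data, since the study's per-user logs are not public:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data for 14 hypothetical users, mimicking the study's setup:
# use count correlated with the survey score, use time uncorrelated.
ksas_score = rng.uniform(20, 80, size=14)                      # survey totals
daily_use_count = 2 * ksas_score + rng.normal(0, 15, size=14)  # correlated
daily_use_time = rng.uniform(60, 300, size=14)                 # uncorrelated

cc_count = np.corrcoef(daily_use_count, ksas_score)[0, 1]
cc_time = np.corrcoef(daily_use_time, ksas_score)[0, 1]
print(f"count vs K-SAS: CC = {cc_count:.2f}")
print(f"time  vs K-SAS: CC = {cc_time:.2f}")
```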

  16. SUMMARY AND RESULTS LETTER REPORT – INDEPENDENT VERIFICATION OF THE HIGH FLUX BEAM REACTOR UNDERGROUND UTILITIES REMOVAL PROJECT, PHASE 3: TRENCHES 2, 3, AND 4 BROOKHAVEN NATIONAL LABORATORY UPTON, NEW YORK

    SciTech Connect

    E.M. Harpenau

    2010-11-15

    5098-LR-02-0 SUMMARY AND RESULTS LETTER REPORT – INDEPENDENT VERIFICATION OF THE HIGH FLUX BEAM REACTOR UNDERGROUND UTILITIES REMOVAL PROJECT, PHASE 3 TRENCHES 2, 3, AND 4 BROOKHAVEN NATIONAL LABORATORY

  17. Orion GN&C Fault Management System Verification: Scope And Methodology

    NASA Technical Reports Server (NTRS)

    Brown, Denise; Weiler, David; Flanary, Ronald

    2016-01-01

    In order to ensure long-term ability to meet mission goals and to provide for the safety of the public, ground personnel, and any crew members, nearly all spacecraft include a fault management (FM) system. For a manned vehicle such as Orion, the safety of the crew is of paramount importance. The goal of the Orion Guidance, Navigation and Control (GN&C) fault management system is to detect, isolate, and respond to faults before they can result in harm to the human crew or loss of the spacecraft. Verification of fault management/fault protection capability is challenging due to the large number of possible faults in a complex spacecraft, the inherent unpredictability of faults, the complexity of interactions among the various spacecraft components, and the inability to easily quantify human reactions to failure scenarios. The Orion GN&C Fault Detection, Isolation, and Recovery (FDIR) team has developed a methodology for bounding the scope of FM system verification while ensuring sufficient coverage of the failure space and providing high confidence that the fault management system meets all safety requirements. The methodology utilizes a swarm search algorithm to identify failure cases that can result in catastrophic loss of the crew or the vehicle and rare event sequential Monte Carlo to verify safety and FDIR performance requirements.

  18. Environmental restoration and waste management department independent safety review committee program management plan

    SciTech Connect

    Not Available

    1992-10-01

    This Program Management Plan (PMP) describes and governs the Independent Safety Review Committee (ISRC) established within the Environmental Restoration and Waste Management Department (ER&WMD). The ISRC performs independent safety reviews for the ER&WMD as required and specified by the governing documents mentioned above. This PMP defines the ISRC organization, work plan, and scope of work. The PMP is organized consistent with the requirements of DOE Order 4700.1, Project Management System. For the purpose of readability, this document shall use the term "program" to include not only the chartered activities of the ISRC, but also the related activities conducted by the chairman and staff. This PMP is subordinate to the ER&WMD Implementing Program Management Plan, EGG-WM-10220.

  19. Environmental restoration and waste management department independent safety review committee program management plan

    SciTech Connect

    Not Available

    1992-10-01

    This Program Management Plan (PMP) describes and governs the Independent Safety Review Committee (ISRC) established within the Environmental Restoration and Waste Management Department (ER&WMD). The ISRC performs independent safety reviews for the ER&WMD as required and specified by the governing documents mentioned above. This PMP defines the ISRC organization, work plan, and scope of work. The PMP is organized consistent with the requirements of DOE Order 4700.1, Project Management System. For the purpose of readability, this document shall use the term "program" to include not only the chartered activities of the ISRC, but also the related activities conducted by the chairman and staff. This PMP is subordinate to the ER&WMD Implementing Program Management Plan, EGG-WM-10220.

  20. Independent Verification Survey of the Clean Coral Storage Pile at the Johnston Atoll Plutonium-Contaminated Soil Remediation Project

    SciTech Connect

    Wilson-Nichols, M.J.

    2000-12-07

    The Oak Ridge National Laboratory (ORNL) Environmental Technology Section conducted an independent verification (IV) survey of the clean storage pile at the Johnston Atoll Plutonium Contaminated Soil Remediation Project (JAPCSRP) from January 18-25, 1999. The goal of the JAPCSRP is to restore a 24-acre area that was contaminated with plutonium oxide particles during nuclear testing in the 1960s. The selected remedy was a soil sorting operation that combined radiological measurements and mining processes to identify and sequester plutonium-contaminated soil. The soil sorter operated from about 1990 to 1998. The remaining clean soil is stored on-site for planned beneficial use on Johnston Island. The clean storage pile currently consists of approximately 120,000 m³ of coral. ORNL conducted the survey according to a Sampling and Analysis Plan, which proposed to provide an IV of the clean pile by collecting a minimum number (99) of samples. The goal was to ascertain with 95% confidence whether 97% of the processed soil is less than or equal to the accepted guideline (500 Bq/kg or 13.5 pCi/g) of total transuranic (TRU) activity. In previous IV tasks, ORNL has (1) evaluated and tested the soil sorter system software and hardware and (2) evaluated the quality control (QC) program used at the soil sorter plant. The IV found that the soil sorter decontamination was effective and significantly reduced plutonium contamination in the soil processed at the JA site. The Field Command Defense Threat Reduction Agency currently plans to re-use soil from the clean pile as a cover over remaining contamination in portions of the radiological control area. Therefore, ORNL was requested to provide an IV. The survey team collected samples from 103 random locations within the top 4 ft of the clean storage pile. The samples were analyzed in the on-site radioanalytical counting laboratory with an American Nuclear Systems (ANS) field instrument used for the detection of low
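The sampling design quoted above (a minimum of 99 samples for 95% confidence that 97% of the soil meets the guideline) follows from a standard zero-failure binomial argument: if more than 3% of the pile exceeded the guideline, the chance that n random samples all pass would be at most 0.97^n. A minimal check, assuming simple random sampling:

```python
import math

confidence = 0.95   # desired confidence level
compliance = 0.97   # fraction of soil required at or below the guideline

# Smallest n with compliance**n <= 1 - confidence
n = math.ceil(math.log(1 - confidence) / math.log(compliance))
print(n)  # 99
```

This reproduces the minimum sample count stated in the abstract.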

  1. Independent Verification and Validation Of SAPHIRE 8 Software Quality Assurance Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-03-01

    This report provides an evaluation of the Software Quality Assurance Plan. The Software Quality Assurance Plan is intended to ensure that all actions necessary for the software life cycle (verification and validation activities; documentation and deliverables; project management; configuration management; nonconformance reporting and corrective action; and quality assessment and improvement) have been planned, and that a systematic pattern of actions is in place to provide adequate confidence that the software product conforms to established technical requirements and meets the contractual commitments prepared by the sponsor, the Nuclear Regulatory Commission.

  2. Independent Verification and Validation Of SAPHIRE 8 Software Quality Assurance Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-02-01

    This report provides an evaluation of the Software Quality Assurance Plan. The Software Quality Assurance Plan is intended to ensure that all actions necessary for the software life cycle (verification and validation activities; documentation and deliverables; project management; configuration management; nonconformance reporting and corrective action; and quality assessment and improvement) have been planned, and that a systematic pattern of actions is in place to provide adequate confidence that the software product conforms to established technical requirements and meets the contractual commitments prepared by the sponsor, the Nuclear Regulatory Commission.

  3. Transient analysis for the Tajoura Critical Facility with IRT-2M HEU fuel and IRT-4M LEU fuel: ANL independent verification results.

    SciTech Connect

    Garner, P. L.; Hanan, N. A.

    2005-12-02

    Calculations have been performed for postulated transients in the Critical Facility at the Tajoura Nuclear Research Center (TNRC) in Libya. These calculations have been performed at the request of staff of the Renewable Energy and Water Desalinization Research Center (REWDRC) who are performing similar calculations. The transients considered were established during a working meeting between ANL and REWDRC staff on October 1-2, 2005 and subsequent email correspondence. Calculations were performed for the current high-enriched uranium (HEU) core and the proposed low-enriched uranium (LEU) core. These calculations have been performed independently from those being performed by REWDRC and serve as one step in the verification process.

  4. Independent Verification and Validation Of SAPHIRE 8 Software Acceptance Test Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE 8 Software Acceptance Test Plan is to assess the approach to be taken for intended testing activities. The plan typically identifies the items to be tested, the requirements being tested, the testing to be performed, test schedules, personnel requirements, reporting requirements, evaluation criteria, and any risks requiring contingency planning. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production.

  5. Global climate change mitigation and sustainable forest management--The challenge of monitoring and verification

    SciTech Connect

    Makundi, Willy R.

    1997-12-31

    In this paper, sustainable forest management is discussed within the historical and theoretical framework of the sustainable development debate. The various criteria and indicators for sustainable forest management put forth by different institutions are critically explored. Specific types of climate change mitigation policies/projects in the forest sector are identified and examined in the light of the general criteria for sustainable forest management. Areas of compatibility and contradiction between the climate mitigation objectives and the minimum criteria for sustainable forest management are identified and discussed. Emphasis is put on the problems of monitoring and verifying carbon benefits associated with such projects given their impacts on pre-existing policy objectives on sustainable forest management. The implications of such policy interactions on assignment of carbon credits from forest projects under Joint Implementation/Activities Implemented Jointly initiatives are discussed. The paper concludes that a comprehensive monitoring and verification regime must include an impact assessment on the criteria covered under other agreements such as the Biodiversity and/or Desertification Conventions. The actual carbon credit assigned to a specific project should at least take into account the negative impacts on the criteria for sustainable forest management. The value of the impacts and/or the procedure to evaluate them need to be established by interested parties such as the Councils of the respective Conventions.

  6. River Protection Project Integrated safety management system phase II verification report, volumes I and II - 8/19/99

    SciTech Connect

    SHOOP, D.S.

    1999-09-10

    The Department of Energy policy (DOE P 450.4) is that safety is integrated into all aspects of the management and operations of its facilities. In simple and straightforward terms, the Department will "Do work safely." The purpose of this River Protection Project (RPP) Integrated Safety Management System (ISMS) Phase II Verification was to determine whether ISMS programs and processes are implemented within RPP to accomplish the goal of "Do work safely." The goal of an implemented ISMS is to have a single integrated system that includes Environment, Safety, and Health (ES&H) requirements in the work planning and execution processes to ensure the protection of the worker, public, environment, and federal property over the RPP life cycle. The ISMS is comprised of (1) the described functions, components, processes, and interfaces (system map or blueprint) and (2) the personnel who are executing their assigned roles and responsibilities to manage and control the ISMS. Therefore, this review evaluated both the "paper" and "people" aspects of the ISMS to ensure that the system is implemented within RPP. Richland Operations Office (RL) conducted an ISMS Phase I Verification of the TWRS from September 28-October 9, 1998. The resulting verification report recommended that TWRS-RL and the contractor proceed with Phase II of ISMS verification, given that the concerns identified in the Phase I verification review are incorporated into the Phase II implementation plan.

  7. Management by Objectives: A Guide for Starting an Independent School.

    ERIC Educational Resources Information Center

    Heggins, Martha Jean Adams

    Because education itself is a business, starting an independent school is much like starting a private business. This guide, designed to provide a sound basis for planning, implementing, and evaluating an independent school, focuses on various objectives, tasks, and other operational concerns that are essential in the initial planning process.…

  8. INDEPENDENT VERIFICATION SURVEY OF THE SPRU LOWER LEVEL HILLSIDE AREA AT THE KNOLLS ATOMIC POWER LABORATORY NISKAYUNA, NEW YORK

    SciTech Connect

    Harpenau, Evan M.; Weaver, Phyllis C.

    2012-06-06

    During August 10, 2011 through August 19, 2011, and October 23, 2011 through November 4, 2011, ORAU/ORISE conducted verification survey activities at the Separations Process Research Unit (SPRU) site that included in-process inspections, surface scans, and soil sampling of the Lower Level Hillside Area. According to the Type-B Investigation Report, Sr-90 was the primary contributor to the majority of the activity (60 times greater than the Cs-137 activity). The evaluation of the scan data and sample results obtained during verification activities determined that the primary radionuclide of concern, Sr-90, was well below the agreed-upon soil cleanup objective (SCO) of 30 pCi/g for the site. However, the concentration of Cs-137 in the four judgmental samples collected in final status survey (FSS) Units A and B was greater than the SCO. Both ORAU and aRc surveys identified higher Cs-137 concentrations in FSS Units A and B; the greatest concentrations were identified in FSS Unit A.

  9. SU-E-T-351: Verification of Monitor Unit Calculation for Lung Stereotactic Body Radiation Therapy Using a Secondary Independent Planning System

    SciTech Connect

    Tsuruta, Y; Nakata, M; Higashimura, K; Nakamura, M; Miyabe, Y; Akimoto, M; Ono, T; Mukumoto, N; Ishihara, Y; Matsuo, Y; Mizowaki, T; Hiraoka, M

    2014-06-01

    Purpose: To compare isocenter (IC) dose between X-ray Voxel Monte Carlo (XVMC) and Acuros XB (AXB) as part of an independent verification of monitor unit (MU) calculation for lung stereotactic body radiation therapy (SBRT) using a secondary independent treatment planning system (TPS). Methods: Treatment plans of 110 lesions from 101 patients who underwent lung SBRT with Vero4DRT (Mitsubishi Heavy Industries, Ltd., Japan, and BrainLAB, Feldkirchen, Germany) were evaluated retrospectively. Dose distribution was calculated with XVMC in iPlan 4.5.1 (BrainLAB, Feldkirchen, Germany) on averaged intensity projection images. The spatial resolution and mean variance were 2 mm and 2%, respectively. The clinical treatment plans were transferred from iPlan to Eclipse (Varian Medical Systems, Palo Alto, CA, USA), and doses were recalculated with well-commissioned AXB ver. 11.0.31 while maintaining the XVMC-calculated MUs and beam arrangement. Dose calculations were made in the dose-to-medium dose reporting mode with a calculation grid size of 2.5 mm. The mean and standard deviation (SD) of the IC dose difference between XVMC and AXB were calculated. The tolerance level was defined as |mean|+2SD. Additionally, the relationship between IC dose difference and the size of the planning target volume (PTV) or the computed tomography (CT) value of the internal target volume (ITV) was evaluated. Results: The mean±SD of the IC dose difference between XVMC and AXB was −0.32±0.73%. The tolerance level was 1.8%. Absolute IC dose differences exceeding the tolerance level were observed in 3 patients (2.8%). There were no strong correlations between IC dose difference and PTV size (R=−0.14) or CT value of ITV (R=−0.33). Conclusion: The present study suggested that independent verification of MU calculation for lung SBRT using a secondary TPS is useful.
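The tolerance-level construction used in this study (|mean| + 2 SD of the per-plan isocenter dose differences) is straightforward to reproduce. The values below are synthetic draws matching the reported mean and SD, not the actual 110-plan data:

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic percent dose differences (XVMC vs AXB) for 110 plans,
# drawn around the reported mean of -0.32% and SD of 0.73%.
ic_dose_diff_pct = rng.normal(-0.32, 0.73, size=110)

mean = ic_dose_diff_pct.mean()
sd = ic_dose_diff_pct.std(ddof=1)
tolerance = abs(mean) + 2 * sd                       # |mean| + 2SD rule
outliers = int(np.sum(np.abs(ic_dose_diff_pct) > tolerance))
print(f"mean = {mean:.2f}%, SD = {sd:.2f}%, tolerance = {tolerance:.1f}%, outliers = {outliers}")
```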

  10. Utilizing Self-Management to Teach Independence on the Job.

    ERIC Educational Resources Information Center

    Lagomarcino, Thomas R.; And Others

    1989-01-01

    The paper presents a seven-step model that job coaches may use to teach self-management to employees with severe disabilities in supported employment settings. Steps include: identifying the problem, establishing a range of acceptable behavior, selecting self-management procedures, training self-management skills by withdrawing external…

  11. Management of the JWST MIRI pFM environmental and performance verification test campaign

    NASA Astrophysics Data System (ADS)

    Eccleston, Paul; Glasse, Alistair; Grundy, Timothy; Detre, Örs Hunor; O'Sullivan, Brian; Shaughnessy, Bryan; Sykes, Jon; Thatcher, John; Walker, Helen; Wells, Martyn; Wright, Gillian; Wright, David

    2012-09-01

    The Mid-Infrared Instrument (MIRI) is one of four scientific instruments on the James Webb Space Telescope (JWST) observatory, scheduled for launch in 2018. It will provide unique capabilities to probe the distant or deeply dust-enshrouded regions of the Universe, investigating the history of star and planet formation from the earliest universe to the present day. To enable this the instrument optical module must be cooled below 7K, presenting specific challenges for the environmental testing and calibration activities. The assembly, integration and verification (AIV) activities for the proto-flight model (pFM) instrument ran from March 2010 to May 2012 at RAL where the instrument has been put through a full suite of environmental and performance tests with a non-conventional single cryo-test approach. In this paper we present an overview of the testing conducted on the MIRI pFM including ambient alignment testing, vibration testing, gravity release testing, cryogenic performance and calibration testing, functional testing at ambient and operational temperatures, thermal balance tests, and Electro-Magnetic Compatibility (EMC) testing. We discuss how tests were planned and managed to ensure that the whole AIV process remained on schedule and give an insight into the lessons learned from this process. We also show how the process of requirement verification for this complex system was managed and documented. We describe how the risks associated with a single long duration test at operating temperature were controlled so that the complete suite of environmental tests could be used to build up a full picture of instrument compliance.

  12. Neutronic, steady-state, and transient analyses for the Kazakhstan VVR-K reactor with LEU fuel: ANL independent verification results

    SciTech Connect

    Hanan, Nelson A.; Garner, Patrick L.

    2015-08-01

    Calculations have been performed for steady state and postulated transients in the VVR-K reactor at the Institute of Nuclear Physics (INP), Kazakhstan. (The reactor designation in Cyrillic is BBP-K; transliterating characters to English gives VVR-K but translating words gives WWR-K.) These calculations have been performed at the request of staff of the INP who are performing similar calculations. The selection of the transients considered started during working meetings and email correspondence between Argonne National Laboratory (ANL) and INP staff. In the end, the transients were defined by the INP staff. Calculations were performed for the fresh low-enriched uranium (LEU) core and for four subsequent cores as beryllium is added to maintain criticality during the first 15 cycles. These calculations have been performed independently from those being performed by INP and serve as one step in the verification process.

  13. Transient analyses for the Uzbekistan VVR-SM reactor with IRT-3M HEU fuel and IRT-4M LEU fuel : ANL independent verification results.

    SciTech Connect

    Garner, P. L.; Hanan, N. A.; Nuclear Engineering Division

    2007-09-24

    Calculations have been performed for postulated transients in the VVR-SM Reactor at the Institute of Nuclear Physics (INP) of the Academy of Sciences in the Republic of Uzbekistan. (The reactor designation in Cyrillic is BBP-CM; transliterating characters to English gives VVRSM but translating words gives WWR-SM.) These calculations have been performed at the request of staff of the INP who are performing similar calculations. The transients considered were established during working meetings between Argonne National Laboratory (ANL) and INP staff during summer 2006 [Ref. 1], subsequent email correspondence, and subsequent staff visits. Calculations were performed for the current high-enriched uranium (HEU) core, the proposed low-enriched uranium (LEU) core, and one mixed HEU-LEU core during the transition. These calculations have been performed independently from those being performed by INP and serve as one step in the verification process.

  14. The Management of Independent Secondary School Libraries in England and Wales: The Skills and Perceptions of Library Managers

    ERIC Educational Resources Information Center

    Turner, Richard; Matthews, Graham; Ashcroft, Linda; Farrow, Janet

    2007-01-01

    This paper investigates aspects of the management of independent secondary school libraries in England and Wales. It is based on a survey of 150 independent school library managers, with a response rate of 68.7 percent, which was carried out as part of an ongoing PhD research project. The paper considers a range of issues important to school…

  15. TH-E-BRE-11: Adaptive-Beamlet Based Finite Size Pencil Beam (AB-FSPB) Dose Calculation Algorithm for Independent Verification of IMRT and VMAT

    SciTech Connect

    Park, C; Arhjoul, L; Yan, G; Lu, B; Li, J; Liu, C

    2014-06-15

    Purpose: In current IMRT and VMAT settings, the use of a sophisticated dose calculation procedure is inevitable in order to account for the complex treatment fields created by MLCs. As a consequence, the independent volumetric dose verification procedure is time consuming, which affects the efficiency of the clinical workflow. In this study, the authors present an efficient Pencil Beam based dose calculation algorithm that minimizes the computational procedure while preserving accuracy. Methods: The computational time of the Finite Size Pencil Beam (FSPB) algorithm is proportional to the number of infinitesimal identical beamlets that constitute the arbitrary field shape. In AB-FSPB, the dose distribution from each beamlet is mathematically modelled such that the beamlets representing an arbitrary field shape no longer need to be infinitesimal or identical. In consequence, it is possible to represent an arbitrary field shape with a combination of differently sized and a minimal number of beamlets. Results: On comparing FSPB with AB-FSPB, the complexity of the algorithm was reduced significantly. For a 25 by 25 cm2 square field, 1 beamlet of 25 by 25 cm2 was sufficient to calculate dose in AB-FSPB, whereas in conventional FSPB, a minimum of 2500 beamlets of 0.5 by 0.5 cm2 size were needed to calculate a dose comparable to the result computed from the Treatment Planning System (TPS). The algorithm was also found to be GPU compatible, maximizing its computational speed. On calculating the 3D dose of an IMRT plan (∼30 control points) and a VMAT plan (∼90 control points) with grid size 2.0 mm (200 by 200 by 200), the dose could be computed within 3∼5 and 10∼15 seconds, respectively. Conclusion: The authors have developed an efficient Pencil Beam type dose calculation algorithm called AB-FSPB. Its fast computation together with GPU compatibility has shown performance better than conventional FSPB. This completely enables the implementation of AB-FSPB in the clinical environment for independent
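The beamlet-count comparison in the Results section can be checked by simple arithmetic, using the field and beamlet sizes quoted in the abstract:

```python
# A 25 x 25 cm^2 open field needs (25/0.5)^2 fixed 0.5 x 0.5 cm^2 beamlets
# under conventional FSPB, but a single adaptive beamlet under AB-FSPB.
field_cm = 25.0
fixed_beamlet_cm = 0.5

fspb_beamlets = int((field_cm / fixed_beamlet_cm) ** 2)
ab_fspb_beamlets = 1  # one beamlet sized to the whole square field
print(fspb_beamlets, ab_fspb_beamlets)  # 2500 1
```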

  16. Packaged low-level waste verification system

    SciTech Connect

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-08-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites for disposal. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS system was developed under a cost share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL).

  17. Independent Validation and Verification of Process Design and Optimization Technology Diagnostic and Control of Natural Gas Fired Furnaces via Flame Image Analysis Technology

    SciTech Connect

    Cox, Daryl

    2009-05-01

    The United States Department of Energy, Industrial Technologies Program has invested in emerging Process Design and Optimization Technologies (PDOT) to encourage the development of new initiatives that might result in energy savings in industrial processes. Gas fired furnaces present a harsh environment, often making accurate determination of correct air/fuel ratios a challenge. Operation with the correct air/fuel ratio, and especially with balanced burners in multi-burner combustion equipment, can result in improved system efficiency, yielding lower operating costs and reduced emissions. Flame Image Analysis offers a way to improve individual burner performance by identifying and correcting fuel-rich burners. The anticipated benefit of this technology is improved furnace thermal efficiency and lower NOx emissions. Independent validation and verification (V&V) testing of the FIA technology was performed at Missouri Forge, Inc., in Doniphan, Missouri by Environ International Corporation (V&V contractor) and Enterprise Energy and Research (EE&R), the developer of the technology. The test site was selected by the technology developer and accepted by Environ after a meeting held at Missouri Forge. As stated in the solicitation for the V&V contractor, 'The objective of this activity is to provide independent verification and validation of the performance of this new technology when demonstrated in industrial applications. A primary goal for the V&V process will be to independently evaluate if this technology, when demonstrated in an industrial application, can be utilized to save a significant amount of the operating energy cost. The Seller will also independently evaluate the other benefits of the demonstrated technology that were previously identified by the developer, including those related to product quality, productivity, environmental impact, etc'. A test plan was provided by the technology developer and is included as an appendix to the summary report submitted

  18. Verification of recursive probabilistic integration (RPI) method for fatigue life management using non-destructive inspections

    NASA Astrophysics Data System (ADS)

    Chen, Tzikang J.; Shiao, Michael

    2016-04-01

    This paper verified a generic and efficient assessment concept for probabilistic fatigue life management. The concept is developed based on an integration of damage tolerance methodology, simulation methods [1, 2], and a probabilistic algorithm, RPI (recursive probability integration) [3-9], considering maintenance for damage tolerance and risk-based fatigue life management. RPI is an efficient semi-analytical probabilistic method for risk assessment subject to various uncertainties such as the variability in material properties including crack growth rate, initial flaw size, repair quality, random process modeling of flight loads for failure analysis, and inspection reliability represented by probability of detection (POD). In addition, unlike traditional Monte Carlo simulations (MCS), which require a rerun of the MCS when the maintenance plan is changed, RPI can repeatedly use a small set of baseline random crack growth histories, excluding maintenance-related parameters, from a single MCS for various maintenance plans. In order to fully appreciate the RPI method, a verification procedure was performed. In this study, MC simulations on the order of several hundred billion runs were conducted for various flight conditions, material properties, inspection scheduling, POD, and repair/replacement strategies. Since MC simulations are time consuming, the simulations were conducted in parallel on DoD High Performance Computers (HPC) using a specialized random number generator for parallel computing. The study has shown that the RPI method is several orders of magnitude more efficient than traditional Monte Carlo simulations.
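For intuition about what RPI avoids re-simulating, here is a deliberately tiny Monte Carlo sketch of maintenance-aware risk assessment: stochastic crack growth, inspection with a probability-of-detection (POD) curve, and repair of detected cracks. Every constant and the POD form here are invented for illustration; RPI's contribution, per the abstract, is to reuse baseline growth histories semi-analytically rather than rerun this loop for each maintenance plan.

```python
import numpy as np

rng = np.random.default_rng(3)
n_sim = 100_000
a_crit = 10.0  # critical crack size (mm); assumed for the sketch

def pod(a_mm):
    # Log-logistic POD curve: larger cracks are easier to detect.
    return 1.0 / (1.0 + np.exp(-(np.log(a_mm) - np.log(1.0)) / 0.4))

a_insp = rng.lognormal(mean=-1.0, sigma=0.5, size=n_sim)  # initial flaw sizes
a_none = a_insp.copy()                                    # no-maintenance twin

for block in range(4):                       # four usage blocks
    growth = rng.lognormal(mean=0.5, sigma=0.3, size=n_sim)
    a_insp = a_insp * growth                 # multiplicative crack growth
    a_none = a_none * growth                 # same histories, no inspections
    detected = rng.random(n_sim) < pod(a_insp)
    a_insp[detected] = 0.1                   # detected cracks repaired small

p_fail_insp = float(np.mean(a_insp >= a_crit))
p_fail_none = float(np.mean(a_none >= a_crit))
print(f"P(failure) with inspections: {p_fail_insp:.5f}, without: {p_fail_none:.5f}")
```

Changing the inspection schedule here forces a full rerun; reusing the stored growth histories across maintenance plans is exactly the saving the RPI approach formalizes.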

  19. FINAL REPORT –INDEPENDENT VERIFICATION SURVEY SUMMARY AND RESULTS FOR THE ARGONNE NATIONAL LABORATORY BUILDING 330 PROJECT FOOTPRINT, ARGONNE, ILLINOIS

    SciTech Connect

    ERIKA N. BAILEY

    2012-02-29

    ORISE conducted onsite verification activities of the Building 330 project footprint during the period of June 6 through June 7, 2011. The verification activities included technical reviews of project documents, visual inspections, radiation surface scans, and sampling and analysis. The draft verification report was issued in July 2011 with findings and recommendations. The contractor performed additional evaluations and remediation.

  20. Report of the Space Shuttle Management Independent Review Team

    NASA Technical Reports Server (NTRS)

    1995-01-01

    At the request of the NASA Administrator a team was formed to review the Space Shuttle Program and propose a new management system that could significantly reduce operating costs. Composed of a group of people with broad and extensive experience in spaceflight and related areas, the team received briefings from the NASA organizations and most of the supporting contractors involved in the Shuttle Program. In addition, a number of chief executives from the supporting contractors provided advice and suggestions. The team found that the present management system has functioned reasonably well despite its diffuse structure. The team also determined that the shuttle has become a mature and reliable system, and--in terms of a manned rocket-propelled space launch system--is about as safe as today's technology will provide. In addition, NASA has reduced shuttle operating costs by about 25 percent over the past 3 years. The program, however, remains in a quasi-development mode and yearly costs remain higher than required. Given the current NASA-contractor structure and incentives, it is difficult to establish cost reduction as a primary goal and implement changes to achieve efficiencies. As a result, the team sought to create a management structure and associated environment that enables and motivates the Program to further reduce operational costs. Accordingly, the review team concluded that the NASA Space Shuttle Program should (1) establish a clear set of program goals, placing a greater emphasis on cost-efficient operations and user-friendly payload integration; (2) redefine the management structure, separating development and operations and disengaging NASA from the daily operation of the space shuttle; and (3) provide the necessary environment and conditions within the program to pursue these goals.

  1. Study of space shuttle orbiter system management computer function. Volume 2: Automated performance verification concepts

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The findings are presented of investigations on concepts and techniques in automated performance verification. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.

  2. Identification, Verification, and Compilation of Produced Water Management Practices for Conventional Oil and Gas Production Operations

    SciTech Connect

    Rachel Henderson

    2007-09-30

    The project is titled 'Identification, Verification, and Compilation of Produced Water Management Practices for Conventional Oil and Gas Production Operations'. The Interstate Oil and Gas Compact Commission (IOGCC), headquartered in Oklahoma City, Oklahoma, is the principal investigator, and it has partnered with ALL Consulting, Inc., headquartered in Tulsa, Oklahoma, on this project. State agencies that have also partnered in the project are the Wyoming Oil and Gas Conservation Commission, the Montana Board of Oil and Gas Conservation, the Kansas Oil and Gas Conservation Division, the Oklahoma Oil and Gas Conservation Division, and the Alaska Oil and Gas Conservation Commission. The objective is to characterize produced water quality and management practices for handling, treating, and disposing of produced water from conventional oil and gas operations throughout the industry nationwide. Water produced from these operations varies greatly in quality and quantity and is often the single largest barrier to the economic viability of wells. The lack of data, coupled with renewed emphasis on domestic oil and gas development, has prompted many experts to speculate that the number of wells drilled over the next 20 years will approach 3 million, or nearly the number of current wells. This level of exploration and development undoubtedly will draw the attention of environmental communities, focusing their concerns on produced water management based on perceived potential impacts to fresh water resources. Therefore, it is imperative that produced water management practices be performed in a manner that minimizes environmental impacts. This is being accomplished by compiling current best management practices for produced water from conventional oil and gas operations and by developing an analysis tool based on a geographic information system (GIS) to assist in the understanding of watershed-issued permits.
That would allow management costs to be kept in line with

  3. INDEPENDENT VERIFICATION SURVEY REPORT FOR EXPOSURE UNITS Z2-24, Z2-31, Z2-32, AND Z2-36 IN ZONE 2 OF THE EAST TENNESSEE TECHNOLOGY PARK OAK RIDGE, TENNESSEE

    SciTech Connect

    2013-10-10

    The U.S. Department of Energy (DOE) Oak Ridge Office of Environmental Management selected Oak Ridge Associated Universities (ORAU), through the Oak Ridge Institute for Science and Education (ORISE) contract, to perform independent verification (IV) at Zone 2 of the East Tennessee Technology Park (ETTP) in Oak Ridge, Tennessee. ORAU has completed IV surveys, per the project-specific plan (PSP) (ORAU 2013a), covering exposure units (EUs) Z2-24, -31, -32, and -36. The objective of this effort was to verify the following:
    • Target EUs comply with requirements in the Zone 2 Record of Decision (ROD) (DOE 2005), as implemented using the dynamic verification strategy presented in the dynamic work plan (DWP) (BJC 2007)
    • Commitments in the DWP were adequately implemented, as verified via IV surveys and soil sampling
    The Zone 2 ROD establishes maximum remediation level (RLmax) values and average RL (RLavg) values for the primary contaminants of concern (COCs): U-234, U-235, U-238, Cs-137, Np-237, Ra-226, Th-232, arsenic, mercury, and polychlorinated biphenyls (PCBs). Table 1.1 lists Zone 2 COCs with associated RLs. Additional radiological and chemical contaminants were also identified during past characterization and monitoring actions, though the ROD does not present RLs for these potential contaminants. IV activities focused on the identification and quantification of ROD-specific COCs in surface soils, but also generated data for other analytes to support future decisions. ORAU personnel also reviewed EU-specific phased construction completion reports (PCCRs) to focus IV activities and identify potential judgmental sample locations, if any.

  4. Provenance In Sensor Data Management: A Cohesive, Independent Solution

    SciTech Connect

    Hensley, Zachary P; Sanyal, Jibonananda; New, Joshua Ryan

    2014-01-01

    In today's information-driven workplaces, data is constantly undergoing transformations and being moved around. The typical business-as-usual approach is to use email attachments, shared network locations, databases, and now, the cloud. More often than not, there are multiple versions of the data sitting in different locations and users of this data are confounded by the lack of metadata describing its provenance, or in other words, its lineage. Our project is aimed to solve this issue in the context of sensor data. The Oak Ridge National Laboratory's Building Technologies Research and Integration Center has reconfigurable commercial buildings deployed on the Flexible Research Platforms (FRPs). These FRPs are instrumented with a large number of sensors which measure a number of variables such as HVAC efficiency, relative humidity, and temperature gradients across doors, windows, and walls. Sub-minute resolution data from hundreds of channels is acquired. This sensor data, traditionally, was saved to a shared network location which was accessible to a number of scientists for performing complicated simulation and analysis tasks. The sensor data also participates in elaborate quality assurance exercises as a result of inherent faults. Sometimes, faults are induced to observe building behavior. It became apparent that proper scientific controls required not just managing the data acquisition and delivery, but to also manage the metadata associated with temporal subsets of the sensor data. We built a system named ProvDMS, or Provenance Data Management System for the FRPs, which would both allow researchers to retrieve data of interest as well as trace data lineage. 
This provides researchers with a one-stop shop for comprehensive views of various data transformations, allowing them to trace data back to its source so that experiments, and derivations of experiments, can be reused and reproduced without much overhead.
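The lineage tracing the abstract describes can be sketched as a chain of derived data artifacts. ProvDMS internals are not given in this excerpt, so the record structure, field names, and the FRP dataset names below are hypothetical; the sketch only shows the core idea of walking parent links back to the original sensor acquisition.

```python
from dataclasses import dataclass

@dataclass
class Artifact:
    """A dataset version with a link to the artifact it was derived from."""
    name: str
    transform: str = "acquired"        # operation that produced this version
    parent: "Artifact | None" = None

def lineage(a):
    """Walk parent links back to the source; return (name, transform) pairs
    in source-to-current order."""
    chain = []
    while a is not None:
        chain.append((a.name, a.transform))
        a = a.parent
    return list(reversed(chain))

# Hypothetical FRP sensor data passing through two transformations.
raw = Artifact("frp_sensors_raw")
qa  = Artifact("frp_sensors_qa", "outlier_removal", raw)
agg = Artifact("frp_sensors_hourly", "hourly_average", qa)

print(lineage(agg))
```

With such a chain recorded alongside each temporal subset, a researcher can tell at a glance whether a file on a shared drive is the raw acquisition, the quality-assured version, or a downstream aggregate.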

  5. International Space Station Atmosphere Control and Supply, Atmosphere Revitalization, and Water Recovery and Management Subsystem - Verification for Node 1

    NASA Technical Reports Server (NTRS)

    Williams, David E.

    2007-01-01

    The International Space Station (ISS) Node 1 Environmental Control and Life Support (ECLS) System is comprised of five subsystems: Atmosphere Control and Supply (ACS), Atmosphere Revitalization (AR), Fire Detection and Suppression (FDS), Temperature and Humidity Control (THC), and Water Recovery and Management (WRM). This paper provides a summary of the nominal operation of the Node 1 ACS, AR, and WRM design and detailed Element Verification methodologies utilized during the Qualification phase for Node 1.

  6. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  7. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  8. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  9. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  10. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  11. Geometric verification

    NASA Technical Reports Server (NTRS)

    Grebowsky, G. J.

    1982-01-01

    Present LANDSAT data formats are reviewed to clarify how the geodetic location and registration capabilities were defined for P-tape products and RBV data. Since there is only one geometric model used in the master data processor, the geometric location accuracy of P-tape products depends on the absolute accuracy of the model, and registration accuracy is determined by the stability of the model. Due primarily to inaccuracies in data provided by the LANDSAT attitude management system, desired accuracies are obtained only by using ground control points and a correlation process. Verification of system performance with regard to geodetic location requires the capability to determine pixel positions of map points in a P-tape array. Verification of registration performance requires the capability to determine pixel positions of common points (not necessarily map points) in two or more P-tape arrays for a given world reference system scene. Techniques for registration verification can be more varied and automated since map data are not required. The verification of LACIE extractions is used as an example.

  12. INDEPENDENT VERIFICATION SURVEY SUMMARY AND RESULTS FOR SUB-SLAB SOILS ASSOCIATED WITH THE FORMER BUILDING K-33, OAK RIDGE, TENNESSEE

    SciTech Connect

    NICK A. ALTIC

    2012-09-20

    At DOE’s request, ORAU conducted confirmatory surveys of the K-33 sub-slab soil during the period of August 2011 through May 2012. The survey activities included visual inspections and measurement and sampling activities. LSRS was forthcoming with information relating to surface scan results. Scans performed by the contractor were of adequate coverage and overall data appear to represent actual site conditions. However, the LSRS technicians failed to identify several areas of elevated direct gamma radiation. Most of the samples taken by ORAU at locations of elevated instrument response were above the remediation concentration for one or more radionuclides of concern (ROC). The contractor was, however, quick to perform additional remediation of areas identified to have contamination above the guidelines. Further investigation by ORAU was not requested once additional remediation was completed. It is presumed the remediation contractor’s future PCCR will present detailed and conclusive evidence that K-33 sub-slab soils either comply or do not comply with record of decision (ROD) criteria. However, ORAU concludes, based on both independent verification (IV) data and data provided by LSRS, that the remediation contractor followed appropriate and applicable procedures and that the associated data adequately represent site conditions.

  13. Software verification and testing

    NASA Technical Reports Server (NTRS)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  14. Demand-side management implementation and verification at Fort Drum, New York

    SciTech Connect

    Armstrong, P.R.; Dixon, D.R.; Richman, E.E.; Rowley, S.E.

    1995-06-01

    Through the Facility Energy Decision Screening (FEDS) process, the U.S. Army Forces Command (FORSCOM) has identified present value savings of nearly $47 million in cost-effective energy conservation measures (ECMs) at Fort Drum, New York. With associated costs of more than $16 million (1992 $), the measures provide a net present value of $30.6 million for all identified projects. By implementing all cost-effective ECMs, Fort Drum can reduce its annual energy use by more than 230,000 MBtu (11% of its fossil energy consumption) and more than 27,000 MWh (32% of its electric energy consumption). The annual cost of energy services will decrease by $2.8 million (20%) at current energy rates. The servicing utility (Niagara Mohawk Power Corporation) has informally agreed to finance and implement cost-effective ECMs and to participate in the verification of energy savings. Verification baselining is under way; implementation of retrofit projects is expected to begin in late 1994. The utility-administered financing and contracting arrangements and the alternative federal programs for implementing the projects are described. The verification protocols and sampling plans for audit, indirect, and direct measurement levels of verification and the responsibilities of Fort Drum, the utility, the energy service companies (ESCos), and Pacific Northwest Laboratory (PNL) in the verification process are also presented. A preliminary weather-normalized model of baseline energy consumption has been developed based on a full year's metered data.
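A quick back-of-envelope check confirms the abstract's figures are internally consistent; the $16.4M cost is an assumption chosen to match "more than $16 million" and the stated $30.6M net present value.

```python
# Sanity check of the reported Fort Drum FEDS figures.
pv_savings = 47.0      # $M, present value of identified savings
pv_costs = 16.4        # $M, assumed ("more than $16 million")
npv = pv_savings - pv_costs
print(round(npv, 1))   # consistent with the reported $30.6M NPV

# The stated savings fractions imply the installation's totals:
fossil_saved_mbtu = 230_000            # 11% of fossil consumption
electric_saved_mwh = 27_000            # 32% of electric consumption
total_fossil = fossil_saved_mbtu / 0.11    # ~2.1 million MBtu implied
total_electric = electric_saved_mwh / 0.32 # ~84,000 MWh implied
```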

  15. Demand-side management implementation and verification at Fort Drum, New York

    SciTech Connect

    Armstrong, P.R.; Dixon, D.R.; Richman, E.E.; Rowley, S.E.

    1994-12-01

    Through the Facility Energy Decision Screening (FEDS) process, the US Army Forces Command (FORSCOM) has identified present value savings of nearly $47 million in cost-effective energy conservation measures (ECMs) at Fort Drum, New York. With associated costs of more than $16 million (1992 $), the measures provide a net present value of $30.6 million for all identified projects. By implementing all cost-effective ECMs, Fort Drum can reduce its annual energy use by more than 230,000 MBtu (11% of its fossil energy consumption) and more than 27,000 MWh (32% of its electric energy consumption). The annual cost of energy services will decrease by $2.8 million (20%) at current energy rates. The servicing utility (Niagara Mohawk Power Corporation) has informally agreed to finance and implement cost-effective ECMs and to participate in the verification of energy savings. Verification baselining is under way; implementation of retrofit projects is expected to begin in late 1994. The utility-administered financing and contracting arrangements and the alternative federal programs for implementing the projects are described. The verification protocols and sampling plans for audit, indirect, and direct measurement levels of verification and the responsibilities of Fort Drum, the utility, the energy service companies (ESCOs), and Pacific Northwest Laboratory (PNL) in the verification process are also presented. A preliminary weather-normalized model of baseline energy consumption has been developed based on a full year's metered data.

  16. Independent Verification and Validation Program

    NASA Technical Reports Server (NTRS)

    Bailey, Brandon T.

    2015-01-01

    Presentation to be given to European Space Agency counterparts to give an overview of NASA's IVV Program and the layout and structure of the Software Testing and Research laboratory maintained at IVV. Seeking STI-ITAR review due to the international audience. Most of the information has been presented to public audiences in the past, with some variations on data, or is in the public domain.

  17. Neutronics, steady-state, and transient analyses for the Poland MARIA reactor for irradiation testing of LEU lead test fuel assemblies from CERCA : ANL independent verification results.

    SciTech Connect

    Garner, P. L.; Hanan, N. A.

    2011-06-07

    The MARIA reactor at the Institute of Atomic Energy (IAE) in Swierk (30 km SE of Warsaw) in the Republic of Poland is considering conversion from high-enriched uranium (HEU) to low-enriched uranium (LEU) fuel assemblies (FAs). The FA design in MARIA is rather unique; a suitable LEU FA has never been designed or tested. IAE has contracted with CERCA (the fuel supply portion of AREVA in France) to supply two lead test assemblies (LTAs). The LTAs will be irradiated in MARIA to a burnup level of at least 40% for both LTAs and 60% for one LTA. IAE may decide to purchase additional LEU FAs for a full core conversion after the test irradiation. The Reactor Safety Committee within IAE and the National Atomic Energy Agency in Poland (PAA) must approve the LTA irradiation process. The approval will be based, in part, on IAE submitting revisions to the portions of the Safety Analysis Report (SAR) that are affected by the insertion of the LTAs. (A similar process will be required for the full core conversion to LEU fuel.) The analysis required was established during working meetings between Argonne National Laboratory (ANL) and IAE staff in August 2006, subsequent email correspondence, and subsequent staff visits. The analysis needs to consider the current HEU core and four core configurations containing one and two LEU LTAs in various core positions. Calculations have been performed at ANL in support of the LTA irradiation. These calculations are summarized in this report and include criticality, burnup, neutronics parameters, steady-state thermal hydraulics, and postulated transients. They have been performed at the request of the IAE staff, who are performing similar calculations to be used in their SAR amendment submittal to the PAA. The ANL analysis has been performed independently from that being performed by IAE and should only be used as one step in the verification process.

  18. Proceedings of the International Workshop on Sustainable ForestManagement: Monitoring and Verification of Greenhouse Gases

    SciTech Connect

    Sathaye , Jayant; Makundi , Willy; Goldberg ,Beth; Andrasko , Ken; Sanchez , Arturo

    1997-07-01

    The International Workshop on Sustainable Forest Management: Monitoring and Verification of Greenhouse Gases was held in San Jose, Costa Rica, July 29-31, 1996. The main objectives of the workshop were to: (1) assemble key practitioners of forestry greenhouse gas (GHG) or carbon offset projects, remote sensing of land cover change, guidelines development, and the forest products certification movement, to offer presentations and small group discussions on findings relevant to the crucial need for the development of guidelines for monitoring and verifying offset projects, and (2) disseminate the findings to interested carbon offset project developers and forestry and climate change policy makers, who need guidance and consistency of methods to reduce project transaction costs and increase probable reliability of carbon benefits, at appropriate venues. The workshop brought together about 45 participants from developed, developing, and transition countries. The participants included researchers, government officials, project developers, and staff from regional and international agencies. Each shared his or her perspectives based on experience in the development and use of methods for monitoring and verifying carbon flows from forest areas and projects. A shared sense among the participants was that methods for monitoring forestry projects are well established, and the techniques are known and used extensively, particularly in production forestry. Introducing climate change with its long-term perspective is often in conflict with the shorter-term perspective of most forestry projects and standard accounting principles. The resolution of these conflicts may require national and international agreements among the affected parties. The establishment of guidelines and protocols for better methods that are sensitive to regional issues will be an important first step to increase the credibility of forestry projects as viable mitigation options. 
The workshop deliberations led

  19. Reducing software security risk through an integrated approach research initiative model based verification of the Secure Socket Layer (SSL) Protocol

    NASA Technical Reports Server (NTRS)

    Powell, John D.

    2003-01-01

    This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk (RSSR) Through an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level, and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California Institute of Technology), where the research is being carried out.

  20. Managing Complexity in the MSL/Curiosity Entry, Descent, and Landing Flight Software and Avionics Verification and Validation Campaign

    NASA Technical Reports Server (NTRS)

    Stehura, Aaron; Rozek, Matthew

    2013-01-01

    The complexity of the Mars Science Laboratory (MSL) mission presented the Entry, Descent, and Landing systems engineering team with many challenges in its Verification and Validation (V&V) campaign. This paper describes some of the logistical hurdles related to managing a complex set of requirements, test venues, test objectives, and analysis products in the implementation of a specific portion of the overall V&V program to test the interaction of flight software with the MSL avionics suite. Application-specific solutions to these problems are presented herein, which can be generalized to other space missions and to similar formidable systems engineering problems.

  1. Leveraging Independent Management and Chief Engineer Hierarchy: Vertically and Horizontally-Derived Technical Authority Value

    NASA Technical Reports Server (NTRS)

    Barley, Bryan; Newhouse, Marilyn

    2012-01-01

    In the development of complex spacecraft missions, project management authority is usually extended hierarchically from NASA's highest agency levels down to the implementing institution's project team level, through both the center and the program. In parallel with management authority, NASA utilizes a complementary, but independent, hierarchy of technical authority (TA) that extends from the agency level to the project, again, through both the center and the program. The chief engineers (CEs) who serve in this technical authority capacity oversee and report on the technical status and ensure sound engineering practices, controls, and management of the projects and programs. At the lowest level, implementing institutions assign project CEs to technically engage projects, lead development teams, and ensure sound technical principles, processes, and issue resolution. At the middle level, programs and centers independently use CEs to ensure the technical success of their projects and programs. At the agency level, NASA's mission directorate CEs maintain technical cognizance over every program and project in their directorate and advise directorate management on the technical, cost, schedule, and programmatic health of each. As part of this vertically-extended CE team, a program level CE manages a continually varying balance between penetration depth and breadth across his or her assigned missions. Teamwork issues and information integration become critical for management at all levels to ensure value-added use of both the synergy available between CEs at the various agency levels, and the independence of the technical authority at each organization.

  2. Packaged low-level waste verification system

    SciTech Connect

    Tuite, K.; Winberg, M.R.; McIsaac, C.V.

    1995-12-31

    The Department of Energy through the National Low-Level Waste Management Program and WMG Inc. have entered into a joint development effort to design, build, and demonstrate the Packaged Low-Level Waste Verification System. Currently, states and low-level radioactive waste disposal site operators have no method to independently verify the radionuclide content of packaged low-level waste that arrives at disposal sites for disposition. At this time, the disposal site relies on the low-level waste generator shipping manifests and accompanying records to ensure that low-level waste received meets the site's waste acceptance criteria. The subject invention provides the equipment, software, and methods to enable the independent verification of low-level waste shipping records to ensure that the site's waste acceptance criteria are being met. The objective of the prototype system is to demonstrate a mobile system capable of independently verifying the content of packaged low-level waste.

  3. SU-E-T-24: A Simple Correction-Based Method for Independent Monitor Unit (MU) Verification in Monte Carlo (MC) Lung SBRT Plans

    SciTech Connect

    Pokhrel, D; Badkul, R; Jiang, H; Estes, C; Kumar, P; Wang, F

    2014-06-01

    Purpose: Lung SBRT uses hypofractionated doses in small non-IMRT fields with tissue-heterogeneity-corrected plans. An independent MU verification is mandatory for safe and effective delivery of the treatment plan. This report compares planned MUs obtained from the iPlan XVMC algorithm against spreadsheet-based hand calculation using the most commonly used simple TMR-based method. Methods: Treatment plans of 15 patients who underwent MC-based lung SBRT to 50 Gy in 5 fractions (PTV V100% = 95%) were studied. The ITV was delineated on MIP images based on 4D-CT scans. PTVs (ITV + 5 mm margins) ranged from 10.1 to 106.5 cc (average = 48.6 cc). MC SBRT plans were generated using a combination of non-coplanar conformal arcs/beams with the iPlan XVMC algorithm (BrainLAB iPlan ver. 4.1.2) for a Novalis-TX equipped with micro-MLCs and a 6 MV-SRS (1000 MU/min) beam. These plans were recomputed using the heterogeneity-corrected Pencil-Beam (PB-hete) algorithm without changing any beam parameters, such as MLCs/MUs. The dose ratio PB-hete/MC gave beam-by-beam inhomogeneity correction factors (ICFs): individual correction. For an independent second check, MC MUs were verified using TMR-based hand calculation, which systematically underestimated MC MUs by ∼5%; the average ICF was applied as an average correction. Also, the first 10 MC plans were verified with ion-chamber measurements using a homogeneous phantom. Results: For both beams/arcs, mean PB-hete dose was systematically overestimated by 5.5±2.6% and mean hand-calculated MUs were systematically underestimated by 5.5±2.5% compared to XVMC. With individual correction, mean hand-calculated MUs matched XVMC within -0.3±1.4%/0.4±1.4% for beams/arcs, respectively. After an average 5% correction, hand-calculated MUs matched XVMC within 0.5±2.5%/0.6±2.0% for beams/arcs, respectively. A smaller dependence on tumor volume (TV)/field size (FS) was also observed. Ion-chamber measurements were within ±3.0%. Conclusion: PB-hete overestimates dose to lung tumor relative to
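The TMR-based second check described above can be sketched generically. The relation below is the standard textbook monitor-unit form, not the authors' spreadsheet, and every numeric value (dose, output, TMR, scatter factor, 5% correction) is illustrative only.

```python
def tmr_mu(dose_cgy, output_cgy_per_mu, tmr, scatter, icf=1.0):
    """Generic TMR-based monitor-unit hand calculation:
    MU = D / (k * TMR * S * ICF), where k is the reference output
    (cGy/MU), S lumps the scatter factors, and ICF is an
    inhomogeneity correction factor (1.0 = homogeneous water).
    """
    return dose_cgy / (output_cgy_per_mu * tmr * scatter * icf)

# Illustrative numbers: 1000 cGy per fraction, 1 cGy/MU reference output.
mu_uncorrected = tmr_mu(1000.0, 1.0, tmr=0.85, scatter=0.97)

# Applying an average ~5% correction, as in the report's approach,
# raises the hand-calculated MUs toward the Monte Carlo values.
mu_corrected = tmr_mu(1000.0, 1.0, tmr=0.85, scatter=0.97, icf=1 / 1.05)
print(round(mu_uncorrected, 1), round(mu_corrected, 1))
```

The individual-correction variant in the report works the same way, except that the ICF is taken beam-by-beam from the PB-hete/MC dose ratio rather than as a single plan-wide average.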

  4. Approaches in highly parameterized inversion - GENIE, a general model-independent TCP/IP run manager

    USGS Publications Warehouse

    Muffels, Christopher T.; Schreuder, Willem A.; Doherty, John E.; Karanovic, Marinko; Tonkin, Matthew J.; Hunt, Randall J.; Welter, David E.

    2012-01-01

    GENIE is a model-independent suite of programs that can be used to generally distribute, manage, and execute multiple model runs via the TCP/IP infrastructure. The suite consists of a file distribution interface, a run manager, a run executer, and a routine that can be compiled as part of a program and used to exchange model runs with the run manager. Because communication is via a standard protocol (TCP/IP), any computer connected to the Internet can serve in any of the capacities offered by this suite. Model independence is consistent with the existing template and instruction file protocols of the widely used PEST parameter estimation program. This report describes (1) the problem addressed; (2) the approach used by GENIE to queue, distribute, and retrieve model runs; and (3) user instructions, classes, and functions developed. It also includes (4) an example to illustrate the linking of GENIE with Parallel PEST using the interface routine.
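The queue-distribute-retrieve pattern the report describes can be sketched with a minimal TCP run manager. This is a greatly simplified stand-in, not GENIE's actual wire protocol or component design: the "model" is a toy function, each worker handles one run, and parameters travel as plain text.

```python
import socket
import threading

def manager(srv, params, results):
    """Hand one parameter set to each connecting worker and record
    the value it sends back (a toy stand-in for a full model run)."""
    for p in params:
        conn, _ = srv.accept()
        conn.sendall(str(p).encode())          # distribute the run
        results[p] = float(conn.recv(1024))    # retrieve the result
        conn.close()

def worker(addr):
    """Connect to the run manager, fetch a run, evaluate the 'model',
    and send the output back."""
    with socket.create_connection(addr) as s:
        p = float(s.recv(1024))
        s.sendall(str(p * p).encode())         # toy model: y = p**2

srv = socket.create_server(("127.0.0.1", 0))   # bind any free port
addr = srv.getsockname()
results = {}
mgr = threading.Thread(target=manager, args=(srv, [1.0, 2.0, 3.0], results))
mgr.start()
workers = [threading.Thread(target=worker, args=(addr,)) for _ in range(3)]
for w in workers:
    w.start()
for w in workers:
    w.join()
mgr.join()
srv.close()
print(results)  # {1.0: 1.0, 2.0: 4.0, 3.0: 9.0}
```

Because the transport is plain TCP, the worker threads here could just as well be processes on other machines, which is the point the report makes about any Internet-connected computer serving in any capacity.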

  5. GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...

  6. Integrated Safety Management System Phase 1 and 2 Verification for the Environmental Restoration Contractor Volumes 1 and 2

    SciTech Connect

    CARTER, R.P.

    2000-04-04

    DOE Policy 450.4 mandates that safety be integrated into all aspects of the management and operations of its facilities. The goal of an institutionalized Integrated Safety Management System (ISMS) is to have a single integrated system that includes Environment, Safety, and Health requirements in the work planning and execution processes to ensure the protection of the worker, public, environment, and federal property over the life cycle of the Environmental Restoration (ER) Project. The purpose of this Environmental Restoration Contractor (ERC) ISMS Phase I/II Verification was to determine whether ISMS programs and processes were institutionalized within the ER Project, whether these programs and processes were implemented, and whether the system had promoted the development of a safety-conscious work culture.

  7. Climate Change Risk Management Consulting: The opportunity for an independent business practice

    NASA Astrophysics Data System (ADS)

    Ciccozzi, R.

    2009-04-01

    The Paper outlines the main questions to be addressed with reference to the actual demand of climate change risk management consulting, in the financial services. Moreover, the Project shall also try to investigate if the Catastrophe Modelling Industry can start and manage a business practice specialised on climate change risk exposures. In this context, the Paper aims at testing the possibility to build a sound business case, based upon typical MBA course analysis tools, such as PEST(LE), SWOT, etc. Specific references to the tools to be used and to other contribution from academic literature and general documentation are also discussed in the body of the Paper and listed at the end. The analysis shall also focus on the core competencies required for an independent climate change risk management consulting business practice, with the purpose to outline a valid definition of how to achieve competitive advantage in climate change risk management consulting.

  8. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT HYDRO COMPLIANCE MANAGEMENT, INC. HYDRO-KLEEN FILTRATION SYSTEM, 03/07/WQPC-SWP, SEPTEMBER 2003

    EPA Science Inventory

    Verification testing of the Hydro-Kleen(TM) Filtration System, a catch-basin filter designed to reduce hydrocarbon, sediment, and metals contamination from surface water flows, was conducted at NSF International in Ann Arbor, Michigan. A Hydro-Kleen(TM) system was fitted into a ...

  9. Integrated Safety Management System Phase I Verification for the Plutonium Finishing Plant (PFP) [VOL 1 & 2

    SciTech Connect

    SETH, S.S.

    2000-01-10

    U.S. Department of Energy (DOE) Policy 450.4, Safety Management System Policy commits to institutionalizing an Integrated Safety Management System (ISMS) throughout the DOE complex as a means of accomplishing its missions safely. DOE Acquisition Regulation 970.5204-2 requires that contractors manage and perform work in accordance with a documented safety management system.

  10. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: STORMWATER SOURCE AREA TREATMENT DEVICE; PRACTICAL BEST MANAGEMENT OF GEORGIA, INC., CRYSTALSTREAM™ WATER QUALITY VAULT MODEL 1056

    EPA Science Inventory

    Verification testing of the Practical Best Management, Inc., CrystalStream™ stormwater treatment system was conducted over a 15-month period starting in March, 2003. The system was installed in a test site in Griffin, Georgia, and served a drainage basin of approximately 4 ...

  11. Spent Nuclear Fuel (SNF) project Integrated Safety Management System phase I and II Verification Review Plan

    SciTech Connect

    CARTER, R.P.

    1999-11-19

    The U.S. Department of Energy (DOE) commits to accomplishing its mission safely. To ensure this objective is met, DOE issued DOE P 450.4, Safety Management System Policy, and incorporated safety management into the DOE Acquisition Regulations ([DEAR] 48 CFR 970.5204-2 and 90.5204-78). Integrated Safety Management (ISM) requires contractors to integrate safety into management and work practices at all levels so that missions are achieved while protecting the public, the worker, and the environment. The contractor is required to describe the Integrated Safety Management System (ISMS) to be used to implement the safety performance objective.

  12. PERFORMANCE VERIFICATION OF ANIMAL WATER TREATMENT TECHNOLOGIES THROUGH EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of in...

  13. Verification of JUPITER Standard Analysis Method for Upgrading Joyo MK-III Core Design and Management

    NASA Astrophysics Data System (ADS)

    Maeda, Shigetaka; Ito, Chikara; Sekine, Takashi; Aoyama, Takafumi

    In the experimental fast reactor Joyo, loading of irradiation test rigs causes a decrease in excess reactivity because the rigs contain less fissile material than the driver fuel. In order to carry out duty operation cycles using as many irradiation rigs as possible, it is necessary to upgrade the core performance to increase its excess reactivity and irradiation capacity. Core modification plans have been considered, such as the installation of advanced radial reflectors and a reduction in the number of control rods. To implement such core modifications, it is first necessary to improve the prediction accuracy of the core design and to optimize safety margins. In the present study, the JUPITER fast reactor standard analysis method was verified through a comparison between the calculated and measured Joyo MK-III core characteristics; for this small sodium-cooled fast reactor with a hard neutron spectrum, the calculated characteristics agreed with measurement to within 5% of unity. It was shown that the performance of the irradiation bed core could be upgraded through the improved prediction accuracy of the core characteristics and optimization of safety margins.

  14. The role of data management in discipline-independent data visualization

    NASA Technical Reports Server (NTRS)

    Treinish, Lloyd A.

    1990-01-01

    The common data format (CDF) is described in terms of its support applications for the database management of visualization systems. The CDF is a self-describing data abstraction technique for the storage and manipulation of multidimensional data that are based on block structures. The discipline-independent approach is designed to manage, manipulate, archive, display, and analyze data, and can be applied to heterogeneous equipment communicating different data structures over networks. An improved CDF version incorporates a hyperplane access allowing random aggregate access to subdimensional blocks within a multidimensional variable. The visualization pipeline is also discussed, which controls the flow of data and permits the visualization of different classes of data representation techniques. The system is found to accommodate a large variety of scientific data structures and large disk-based data sets.

  15. Fluor Daniel Hanford Inc. integrated safety management system phase 1 verification final report

    SciTech Connect

    PARSONS, J.E.

    1999-10-28

    The purpose of this review is to verify the adequacy of documentation as submitted to the Approval Authority by Fluor Daniel Hanford, Inc. (FDH). This review is not only a review of the Integrated Safety Management System (ISMS) System Description documentation, but is also a review of the procedures, policies, and manuals of practice used to implement safety management in an environment of organizational restructuring. The FDH ISMS should support the Hanford Strategic Plan (DOE-RL 1996) to safely clean up and manage the site's legacy waste; deploy science and technology while incorporating the ISMS theme to "Do work safely"; and protect human health and the environment.

  16. Development of an airborne remote sensing system for crop pest management: System integration and verification

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Remote sensing along with Global Positioning Systems, Geographic Information Systems, and variable rate technology has been developed, which scientists can implement to help farmers maximize the economic and environmental benefits of crop pest management through precision agriculture. Airborne remo...

  17. Data Verification Tools for Minimizing Management Costs of Dense Air-Quality Monitoring Networks.

    PubMed

    Miskell, Georgia; Salmond, Jennifer; Alavi-Shoshtari, Maryam; Bart, Mark; Ainslie, Bruce; Grange, Stuart; McKendry, Ian G; Henshaw, Geoff S; Williams, David E

    2016-01-19

    Aiming at minimizing the costs, both of capital expenditure and maintenance, of an extensive air-quality measurement network, we present simple statistical methods that do not require extensive training data sets for automated real-time verification of the reliability of data delivered by a spatially dense hybrid network of both low-cost and reference ozone measurement instruments. Ozone is a pollutant that has a relatively smooth spatial spread over a large scale although there can be significant small-scale variations. We take advantage of these characteristics and demonstrate detection of instrument calibration drift within a few days using a rolling 72 h comparison of hourly averaged data from the test instrument with that from suitably defined proxies. We define the required characteristics of the proxy measurements by working from a definition of the network purpose and specification, in this case reliable determination of the proportion of hourly averaged ozone measurements that are above a threshold in any given day, and detection of calibration drift of greater than ±30% in slope or ±5 parts-per-billion in offset. By analyzing results of a study of an extensive deployment of low-cost instruments in the Lower Fraser Valley, we demonstrate that proxies can be established using land-use criteria and that simple statistical comparisons can identify low-cost instruments that are not stable and therefore need replacing. We propose that a minimal set of compliant reference instruments can be used to verify the reliability of data from a much more extensive network of low-cost devices. PMID:26654467
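
The rolling drift check described above can be sketched as an ordinary least-squares fit of the test instrument's hourly readings against a proxy over a 72-hour window, flagging slope drift beyond ±30% or offset drift beyond ±5 ppb. The tolerance thresholds match the specification stated in the abstract; the code itself is a simplified reconstruction, not the authors' implementation.

```python
# Simplified rolling-window drift check (reconstruction, not the paper's code).
import statistics

def drift_flag(test_hourly, proxy_hourly, slope_tol=0.30, offset_tol=5.0):
    """True if an OLS fit of test vs. proxy hourly ozone readings departs
    from the ideal (slope 1, offset 0) by more than the tolerances."""
    mean_x = statistics.fmean(proxy_hourly)
    mean_y = statistics.fmean(test_hourly)
    sxx = sum((x - mean_x) ** 2 for x in proxy_hourly)
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(proxy_hourly, test_hourly))
    slope = sxy / sxx
    offset = mean_y - slope * mean_x
    return abs(slope - 1.0) > slope_tol or abs(offset) > offset_tol
```

In use, `test_hourly` and `proxy_hourly` would each hold the most recent 72 hourly averages, re-evaluated as the window rolls forward.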

  18. River Protection Project Integrated safety management system phase II verification review plan - 7/29/99

    SciTech Connect

    SHOOP, D.S.

    1999-09-10

    The purpose of this review is to verify the implementation status of the Integrated Safety Management System (ISMS) for the River Protection Project (RPP) facilities managed by Fluor Daniel Hanford, Inc. (FDH) and operated by Lockheed Martin Hanford Company (LMHC). This review will also ascertain whether within RPP facilities and operations the work planning and execution processes are in place and functioning to effectively protect the health and safety of the workers, public, environment, and federal property over the RPP life cycle. The RPP ISMS should support the Hanford Strategic Plan (DOERL-96-92) to safely clean up and manage the site's legacy waste and deploy science and technology while incorporating the ISMS central theme to "Do work safely" and protect human health and the environment.

  19. The Home Independence Program with non-health professionals as care managers: an evaluation.

    PubMed

    Lewin, Gill; Concanen, Karyn; Youens, David

    2016-01-01

    The Home Independence Program (HIP), an Australian restorative home care/reablement service for older adults, has been shown to be effective in reducing functional dependency and increasing functional mobility, confidence in everyday activities, and quality of life. These gains were found to translate into a reduced need for ongoing care services and reduced health and aged care costs over time. Despite these positive outcomes, few Australian home care agencies have adopted the service model - a key reason being that few Australian providers employ health professionals, who act as care managers under the HIP service model. A call for proposals from Health Workforce Australia for projects to expand the scope of practice of health/aged care staff then provided the opportunity to develop, implement, and evaluate a service delivery model, in which nonprofessionals replaced the health professionals as Care Managers in the HIP service. Seventy older people who received the HIP Coordinator (HIPC) service participated in the outcomes evaluation. On a range of personal outcome measures, the group showed statistically significant improvement at 3 and 12 months compared to baseline. On each outcome, the improvement observed was larger than that observed in a previous trial in which the service was delivered by health professionals. However, differences in the timing of data collection between the two studies mean that a direct comparison cannot be made. Clients in both studies showed a similarly reduced need for ongoing home care services at both follow-up points. The outcomes achieved by HIPC, with non-health professionals as Care Managers, were positive and can be considered to compare favorably with the outcomes achieved in HIP when health professionals take the Care Manager role. These findings will be of interest to managers of home care services and to policy makers interested in reducing the long-term care needs of older community dwelling individuals. PMID:27382264

  20. The Home Independence Program with non-health professionals as care managers: an evaluation

    PubMed Central

    Lewin, Gill; Concanen, Karyn; Youens, David

    2016-01-01

    The Home Independence Program (HIP), an Australian restorative home care/reablement service for older adults, has been shown to be effective in reducing functional dependency and increasing functional mobility, confidence in everyday activities, and quality of life. These gains were found to translate into a reduced need for ongoing care services and reduced health and aged care costs over time. Despite these positive outcomes, few Australian home care agencies have adopted the service model – a key reason being that few Australian providers employ health professionals, who act as care managers under the HIP service model. A call for proposals from Health Workforce Australia for projects to expand the scope of practice of health/aged care staff then provided the opportunity to develop, implement, and evaluate a service delivery model, in which nonprofessionals replaced the health professionals as Care Managers in the HIP service. Seventy older people who received the HIP Coordinator (HIPC) service participated in the outcomes evaluation. On a range of personal outcome measures, the group showed statistically significant improvement at 3 and 12 months compared to baseline. On each outcome, the improvement observed was larger than that observed in a previous trial in which the service was delivered by health professionals. However, differences in the timing of data collection between the two studies mean that a direct comparison cannot be made. Clients in both studies showed a similarly reduced need for ongoing home care services at both follow-up points. The outcomes achieved by HIPC, with non-health professionals as Care Managers, were positive and can be considered to compare favorably with the outcomes achieved in HIP when health professionals take the Care Manager role. These findings will be of interest to managers of home care services and to policy makers interested in reducing the long-term care needs of older community dwelling individuals. PMID:27382264

  1. Environmental Technology Verification Program Quality Management Plan, Version 3.0

    EPA Science Inventory

    The ETV QMP is a document that addresses specific policies and procedures that have been established for managing quality-related activities in the ETV program. It is the “blueprint” that defines an organization’s QA policies and procedures; the criteria for and areas of QA appli...

  2. Experimental Verification and Integration of a Next Generation Smart Power Management System

    NASA Astrophysics Data System (ADS)

    Clemmer, Tavis B.

    With the increase in residential energy demand in this country and the diminishing fossil fuel resources used for electric energy production, there is a need for a system to efficiently manage power within a residence. The Smart Green Power Node (SGPN) is a next-generation energy management system that automates on-site energy production, storage, consumption, and grid usage to yield the most savings for both the utility and the consumer. Such a system automatically manages on-site distributed generation sources, such as a PhotoVoltaic (PV) input and battery storage, to curtail grid energy usage when the price is high. The SGPN high-level control features an advanced modular algorithm that incorporates weather data for projected PV generation, battery health monitoring algorithms, user preferences for load prioritization within the home in case of an outage, Time of Use (ToU) grid power pricing, and status of on-site resources to intelligently schedule and manage power flow between the grid, loads, and the on-site resources. The SGPN has a scalable, modular architecture such that it can be customized for user-specific applications. This drove the topology for the SGPN, which connects on-site resources at a low-voltage DC microbus; a two-stage bi-directional inverter/rectifier then couples the AC load and residential grid connection to on-site generation. The SGPN has been designed, built, and is undergoing testing. Hardware test results obtained are consistent with the design goals set and indicate that the SGPN is a viable system, with recommended changes and future work.

  3. Independent naturalists make matchless contributions to science and resource management (Invited)

    NASA Astrophysics Data System (ADS)

    Crimmins, T. M.; Crimmins, M.; Bertelsen, C. D.

    2013-12-01

    Much of the recent growth in PPSR, or public participation in scientific research, has been in 'contributory' or 'collaborative'-type PPSR projects, where non-scientists' roles are primarily data collection or some participation in other aspects of project design or execution. A less common PPSR model, referred to as 'collegial' in recent literature, is characterized by dedicated naturalists collecting rich and extensive data sets outside of an organized program and then working with professional scientists to analyze these data and disseminate findings. The three collaborators on this presentation represent an example of the collegial model; our team comprises an independent naturalist who has collected over 150,000 records of plant flowering phenology spanning three decades, a professional climatologist, and a professional plant ecologist. Together, we have documented fundamental plant-climate relationships and seasonal patterns in flowering in the Sonoran Desert region, as well as changes in flowering community composition and distribution associated with changing climate conditions, in the form of seven peer-reviewed journal articles and several conference presentations and proceedings. These novel findings address critical gaps in our understanding of plant ecology in the Sky Islands region and have been incorporated into the Southwest Climate Change and other regional planning documents. The data resource amassed by a single very dedicated individual, far beyond what nearly any researcher or resource manager could accomplish, has been instrumental in documenting fundamental ecological relationships in the Sky Islands region, as well as how these systems are changing in this period of rapidly changing climate. The research findings that have resulted from this partnership also have the potential to directly affect management decisions. The watershed under study, managed by the US Forest Service, has been

  4. Independent management and financial review, Yucca Mountain Project, Nevada. Final report, Appendix

    SciTech Connect

    1995-07-15

    The Nuclear Waste Policy Act of 1982 (Public Law 97-425), as amended by Public Law 100-203, December 22, 1987, established the Office of Civilian Radioactive Waste Management (OCRWM) within the Department of Energy (DOE), and directed the Office to investigate a site at Yucca Mountain, Nevada, to determine if this site is suitable for the construction of a repository for the disposal of high-level nuclear waste. Work on site characterization has been under way for several years. Thus far, about $1.47 billion has been spent on Yucca Mountain programs. This work has been funded by Congressional appropriations from a Nuclear Waste Fund to which contributions have been made by electric utility ratepayers through electric utilities generating power from nuclear power stations. The Secretary of Energy and the Governor of the State of Nevada have each appointed one person to a panel to oversee an objective, independent financial and management evaluation of the Yucca Mountain Project. The requirements for the work include an analysis of (1) Yucca Mountain financial and contract management techniques and controls; (2) Project schedules and credibility of the proposed milestones; (3) Project organizational effectiveness and internal planning processes; and (4) adequacy of funding levels and funding priorities, including the cost of infrastructure and scientific studies. The recipient will provide monthly progress reports, and the following reports/documents will be presented as deliverables under the contract: (1) Financial and Contract Management Preliminary Report; (2) Project Scheduling Preliminary Report; (3) Project Organizational Effectiveness Preliminary Report; (4) Project Funding Levels and Funding Priorities Preliminary Report; and (5) Final Report.

  5. Independent practice associations and physician-hospital organizations can improve care management for smaller practices.

    PubMed

    Casalino, Lawrence P; Wu, Frances M; Ryan, Andrew M; Copeland, Kennon; Rittenhouse, Diane R; Ramsay, Patricia P; Shortell, Stephen M

    2013-08-01

    Pay-for-performance, public reporting, and accountable care organization programs place pressures on physicians to use health information technology and organized care management processes to improve the care they provide. But physician practices that are not large may lack the resources and size to implement such processes. We used data from a unique national survey of 1,164 practices with fewer than twenty physicians to provide the first information available on the extent to which independent practice associations (IPAs) and physician-hospital organizations (PHOs) might make it possible for these smaller practices to share resources to improve care. Nearly a quarter of the practices participated in an IPA or a PHO that accounted for a significant proportion of their patients. On average, practices participating in these organizations provided nearly three times as many care management processes for patients with chronic conditions as nonparticipating practices did (10.4 versus 3.8). Half of these processes were provided only by IPAs or PHOs. These organizations may provide a way for small and medium-size practices to systematically improve care and participate in accountable care organizations. PMID:23918481

  6. The behavior of multiple independent managers and ecological traits interact to determine prevalence of weeds.

    PubMed

    Coutts, Shaun R; Yokomizo, Hiroyuki; Buckley, Yvonne M

    2013-04-01

    Management of damaging invasive plants is often undertaken by multiple decision makers, each managing only a small part of the invader's population. As weeds can move between properties and re-infest eradicated sites from unmanaged sources, the dynamics of multiple decision makers plays a significant role in weed prevalence and invasion risk at the landscape scale. We used a spatially explicit agent-based simulation to determine how individual agent behavior, in concert with weed population ecology, determined weed prevalence. We compared two invasive grass species that differ in ecology, control methods, and costs: Nassella trichotoma (serrated tussock) and Eragrostis curvula (African love grass). The way decision makers reacted to the benefit of management had a large effect on the extent of a weed. If benefits of weed control outweighed the costs, and either net benefit was very large or all agents were very sensitive to net benefits, then agents tended to act synchronously, reducing the pool of infested agents available to spread the weed. As N. trichotoma was more damaging than E. curvula and had more effective control methods, agents chose to manage it more often, which resulted in lower prevalence of N. trichotoma. A relatively low number of agents who were intrinsically less motivated to control weeds led to increased prevalence of both species. This was particularly apparent when long-distance dispersal meant each infested agent increased the invasion risk for a large portion of the landscape. In this case, a small proportion of land managers reluctant to control, regardless of costs and benefits, could lead to the whole landscape being infested, even when local control stopped new infestations. Social pressure was important, but only if it was independent of weed prevalence, suggesting that early access to information, and incentives to act on that information, may be crucial in stopping a weed from infesting large areas. The response of our model to both
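
A toy version of the dynamic described above; this minimal reconstruction (assumptions mine, not the authors' model) keeps only two ingredients: weeds spread to neighbouring properties each step, and each manager controls its infestation only when perceived benefit exceeds cost, with a fixed fraction of "reluctant" managers who never control.

```python
# Minimal agent-based weed model on a ring of land managers (illustrative).
import random

def simulate(n_agents, steps, benefit, cost, reluctant_frac, seed=0):
    """Return the final infested fraction of a ring of land managers."""
    rng = random.Random(seed)
    infested = [i == 0 for i in range(n_agents)]      # one initial infestation
    reluctant = [rng.random() < reluctant_frac for _ in range(n_agents)]
    for _ in range(steps):
        spread = list(infested)
        for i, inf in enumerate(infested):
            if inf:                                    # local dispersal
                spread[(i - 1) % n_agents] = True
                spread[(i + 1) % n_agents] = True
        # An agent clears its infestation only if it is willing (not
        # reluctant) and control is worthwhile (benefit exceeds cost).
        infested = [inf and (reluctant[i] or benefit <= cost)
                    for i, inf in enumerate(spread)]
    return sum(infested) / n_agents
```

Even this toy reproduces the qualitative result: when benefit exceeds cost and no agent is reluctant, the weed is eradicated, while any reluctant agents act as persistent re-infestation sources.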

  7. W-026, Waste Receiving and Processing Facility data management system validation and verification report

    SciTech Connect

    Palmer, M.E.

    1997-12-05

    This V and V Report includes analysis of two revisions of the DMS [data management system] System Requirements Specification (SRS) and the Preliminary System Design Document (PSDD); the source code for the DMS Communication Module (DMSCOM) messages; the source code for selected DMS screens; and the code for the BWAS Simulator. BDM Federal analysts used a series of matrices to (1) compare the requirements in the System Requirements Specification (SRS) to the specifications found in the System Design Document (SDD), to ensure the design supports the business functions; (2) compare the discrete parts of the SDD with each other, to ensure that the design is consistent and cohesive; (3) compare the source code of the DMS Communication Module with the specifications, to ensure that the resultant messages will support the design; (4) compare the source code of selected screens to the specifications, to ensure that the resultant system screens will support the design; and (5) compare the source code of the BWAS simulator with the requirements to interface with DMS messages and data transfers relating to BWAS operations.
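
The matrix comparisons described above are essentially traceability checks. A minimal sketch, with invented requirement and design-element names, that flags SRS requirements not covered by any SDD design element:

```python
# Requirements-to-design traceability check (names are hypothetical).

def trace_gaps(srs_reqs, sdd_coverage):
    """Return requirement IDs present in the SRS but absent from the SDD
    coverage matrix, i.e. requirements the design does not support."""
    covered = {req for reqs in sdd_coverage.values() for req in reqs}
    return sorted(set(srs_reqs) - covered)

# Example: design elements mapped to the requirements they satisfy.
gaps = trace_gaps(["R1", "R2", "R3"],
                  {"ScreenA": ["R1"], "DMSCOM": ["R2"]})   # -> ["R3"]
```

The same pattern, run in both directions, also surfaces design elements that trace to no requirement, which is how such matrices check that a design is consistent and cohesive.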

  8. Condition Self-Management in Pediatric Spina Bifida: A Longitudinal Investigation of Medical Adherence, Responsibility-Sharing, and Independence Skills

    PubMed Central

    Psihogios, Alexandra M.; Kolbuck, Victoria

    2015-01-01

    Objective This study aimed to evaluate rates of medical adherence, responsibility, and independence skills across late childhood and adolescence in youth with spina bifida (SB) and to explore associations among these disease self-management variables. Method 111 youth with SB, their parents, and a health professional participated at two time points. Informants completed questionnaires regarding medical adherence, responsibility-sharing, and child independence skills. Results Youth gained more responsibility and independence skills across time, although adherence rates did not follow a similar trajectory. Increased child medical responsibility was related to poorer adherence, and father-reported independence skills were associated with increased child responsibility. Conclusions This study highlights medical domains that are the most difficult for families to manage (e.g., skin checks). Although youth appear to gain more autonomy across time, ongoing parental involvement in medical care may be necessary to achieve optimal adherence across adolescence. PMID:26002195

  9. Migration check tool: automatic plan verification following treatment management systems upgrade and database migration.

    PubMed

    Hadley, Scott W; White, Dale; Chen, Xiaoping; Moran, Jean M; Keranen, Wayne M

    2013-01-01

    Software upgrades of the treatment management system (TMS) sometimes require that all data be migrated from one version of the database to another. It is necessary to verify that the data are correctly migrated to assure patient safety. It is impossible to verify by hand the thousands of parameters that go into each patient's radiation therapy treatment plan. Repeating pretreatment QA is costly, time-consuming, and may be inadequate in detecting errors that are introduced during the migration. In this work we investigate the use of an automatic Plan Comparison Tool to verify that plan data have been correctly migrated to a new version of a TMS database from an older version. We developed software to query and compare treatment plans between different versions of the TMS. The same plan in the two TMS systems is translated into an XML schema. A plan comparison module takes the two XML schemas as input and reports any differences in parameters between the two versions of the same plan by applying a schema mapping. A console application is used to query the database to obtain a list of active or in-preparation plans to be tested. It then runs in batch mode to compare all the plans, and a report of success or failure of the comparison is saved for review. This software tool was used as part of a software upgrade and database migration from Varian's Aria 8.9 to Aria 11 TMS. Parameters were compared for 358 treatment plans in 89 minutes. This direct comparison of all plan parameters in the migrated TMS against the previous TMS surpasses current QA methods that relied on repeating pretreatment QA measurements or labor-intensive and fallible hand comparisons. PMID:24257281
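
The comparison step can be sketched as follows. The XML structure and element names here are invented for illustration; the actual tool worked against the Varian TMS's own plan schemas.

```python
# Diff two XML serializations of a treatment plan (illustrative structure).
import xml.etree.ElementTree as ET

def walk(root, prefix=""):
    """Yield (path, element) pairs for every leaf element under root.
    Note: real plans would need indexed paths for repeated siblings
    (e.g., multiple <beam> elements)."""
    for child in root:
        path = f"{prefix}/{child.tag}"
        if len(child) == 0:
            yield path, child
        else:
            yield from walk(child, path)

def compare_plans(xml_old, xml_new):
    """Return {parameter path: (old value, new value)} for mismatches."""
    old = {path: el.text for path, el in walk(ET.fromstring(xml_old))}
    new = {path: el.text for path, el in walk(ET.fromstring(xml_new))}
    return {p: (old.get(p), new.get(p))
            for p in sorted(set(old) | set(new))
            if old.get(p) != new.get(p)}
```

Run over a batch of plans, an empty result dictionary for every plan is the "success" outcome that the migration preserved all parameters.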

  10. Phase two of Site 300's ecological risk assessment: Model verification and risk management

    SciTech Connect

    Carlson, T.M.; Gregory, S.D.

    1995-12-31

    The authors completed the baseline ecological risk assessment (ERA) for Lawrence Livermore National Laboratory's Site 300 in 1993. Using data collection and modeling techniques adapted from the human health risk assessment (HRA), they evaluated the potential hazard of contaminants in environmental media to ecological receptors. They identified potential hazards to (1) aquatic invertebrates from heavy metal contaminants in surface water, (2) burrowing vertebrates from contaminants volatilizing from subsurface soil into burrow air, and (3) grazing deer and burrowing vertebrates from cadmium contamination in surface soil. They recently began collecting data to refine the estimates of potential hazard to these ecological receptors. Bioassay results from the surface water failed to verify a hazard to aquatic invertebrates. Soil vapor surveys of subsurface burrows did verify the presence of high concentrations of volatile organic compounds (VOCs). However, they have not yet verified a true impact on the burrowing populations. The authors also completed an extensive surface soil sampling program, which identified local hot spots of cadmium contamination. In addition, they have been collecting data on the land use patterns of the deer population. Their data indicate that deer do not typically use those areas with cadmium surface soil contamination. Information from this phase of the ERA, along with the results of the HRA, will direct the selection of remedial alternatives for the site. For the ecological receptors, remedial alternatives include developing a risk management program which ensures that (1) sensitive burrowing species (such as rare or endangered species) do not use areas of surface or subsurface contamination, and (2) deer populations do not use areas of surface soil contamination.

  11. Life Management: Moving Out! Solving Practical Problems for Independent Living. Utah Home Economics and Family Life Curriculum Guide.

    ERIC Educational Resources Information Center

    Utah State Office of Education, Salt Lake City.

    This guide, which has been developed for Utah's home economics and family life education program, contains materials for use in teaching a life management course emphasizing the problem-solving skills required for independent living. Discussed first are the assumptions underlying the curriculum, development of the guide, and suggestions for its…

  12. Design of Experiments with Multiple Independent Variables: A Resource Management Perspective on Complete and Reduced Factorial Designs

    ERIC Educational Resources Information Center

    Collins, Linda M.; Dziak, John J.; Li, Runze

    2009-01-01

    An investigator who plans to conduct an experiment with multiple independent variables must decide whether to use a complete or reduced factorial design. This article advocates a resource management perspective on making this decision, in which the investigator seeks a strategic balance between service to scientific objectives and economy.…

  13. Urban/industrial pollution for the New York City-Washington, D. C., corridor, 1996-1998: 1. Providing independent verification of CO and PCE emissions inventories

    NASA Astrophysics Data System (ADS)

    Barnes, Diana H.; Wofsy, Steven C.; Fehlau, Brian P.; Gottlieb, Elaine W.; Elkins, James W.; Dutton, Geoffrey S.; Montzka, Stephen A.

    2003-03-01

    Atmospheric mixing ratios of carbon monoxide (CO) and perchloroethylene (PCE, C2Cl4) were measured above the canopy at Harvard Forest, MA every half-hour for 3 years starting in January 1996. Pollution enhancements are strongly correlated with winds from the southwest, the direction of the New York City-Washington, D. C., corridor, as compared to background levels observed during northwest winds traveling from Canada. We establish the ratio of CO to PCE pollution enhancements by wind direction, by season, and by year and use these results to test the quality of county-level and national source emission inventories for these two gases. The EPA carbon monoxide emission county-level inventories and the McCulloch and Midgley sales-based national-level PCE release estimates are found to be in accord with our independent observations of urban/industrial releases. For the New York City-Washington, D. C., corridor the inventory-based CO/PCE emissions ratio of 584 (kg/kg) for 1996 falls well within the range of observationally-based ΔCO/ΔPCE pollution plume ratios of 388 to 706 (kg/kg) and is only 11% higher than the observed mean of 521 ± 90 (kg/kg). On the basis of this agreement, PCE emission estimates for 1997 and 1998 are derived from the CO inventory emissions values and the observed ΔCO/ΔPCE ratios in pollution plumes for those years; despite the call for voluntary cutbacks, urban/industrial emissions of PCE appear to be on the rise.
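    The abstract's consistency check and its plume-ratio-based emission derivation can be sketched as follows, using only the 1996 figures quoted above; the function names are illustrative, not from the study.

    ```python
    # Hedged sketch of the inventory-vs-observation comparison described
    # in the abstract. All numeric values are taken from the abstract;
    # helper names are invented for illustration.

    def within_plume_range(inventory_ratio, lo, hi):
        """True if an inventory-based CO/PCE ratio (kg/kg) falls inside
        the observed pollution-plume ratio range."""
        return lo <= inventory_ratio <= hi

    def pce_from_co(co_inventory_kg, dco_dpce_ratio):
        """Derive a PCE emission estimate from a CO inventory value and an
        observed plume ratio, as done for 1997-1998 in the abstract."""
        return co_inventory_kg / dco_dpce_ratio

    # 1996 figures for the NYC-Washington corridor (kg/kg)
    inventory_ratio = 584.0              # EPA CO inventory / sales-based PCE estimate
    plume_lo, plume_hi = 388.0, 706.0    # observed plume-ratio range
    observed_mean, observed_sd = 521.0, 90.0

    consistent = within_plume_range(inventory_ratio, plume_lo, plume_hi)
    relative_excess = (inventory_ratio - observed_mean) / observed_mean
    ```

    Running the check confirms the abstract's conclusion: the inventory ratio sits inside the observed range, within roughly one standard deviation of the observed mean.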

  14. Low-Intrusion Techniques and Sensitive Information Management for Warhead Counting and Verification: FY2011 Annual Report

    SciTech Connect

    Jarman, Kenneth D.; Robinson, Sean M.; McDonald, Benjamin S.; Gilbert, Andrew J.; Misner, Alex C.; Pitts, W. Karl; White, Timothy A.; Seifert, Allen; Miller, Erin A.

    2011-09-01

    Future arms control treaties may push nuclear weapons limits to unprecedented low levels and may entail precise counting of warheads as well as distinguishing between strategic and tactical nuclear weapons. Such advances will require assessment of form and function to confidently verify the presence or absence of nuclear warheads and/or their components. Imaging with penetrating radiation can provide such an assessment and could thus play a unique role in inspection scenarios. Yet many imaging capabilities have been viewed as too intrusive from the perspective of revealing weapon design details, and the potential for the release of sensitive information poses challenges in verification settings. A widely held perception is that verification through radiography requires images of sufficient quality that an expert (e.g., a trained inspector or an image-matching algorithm) can verify the presence or absence of components of a device. The concept of information barriers (IBs) has been established to prevent access to relevant weapon-design information by inspectors (or algorithms), and has, to date, limited the usefulness of radiographic inspection. The challenge of this project is to demonstrate that radiographic information can be used behind an IB to improve the capabilities of treaty-verification weapons-inspection systems.

  15. Interim Letter Report - Verification Survey of Partial Grid E9, David Witherspoon, Inc. 1630 Site Knoxville, Tennessee

    SciTech Connect

    P.C. Weaver

    2008-06-12

    The objective was to conduct verification surveys of available grids at the DWI 1630 site in Knoxville, Tennessee. A representative with the Independent Environmental Assessment and Verification (IEAV) team from ORISE conducted a verification survey of a partial area within Grid E9.

  16. Telescope performance verification

    NASA Astrophysics Data System (ADS)

    Swart, Gerhard P.; Buckley, David A. H.

    2004-09-01

    While Systems Engineering appears to be widely applied on the very large telescopes, it is lacking in the development of many of the medium and small telescopes currently in progress. The latter projects rely heavily on the experience of the project team, verbal requirements and conjecture based on the successes and failures of other telescopes. Furthermore, it is considered an unaffordable luxury to "close-the-loop" by carefully analysing and documenting the requirements and then verifying the telescope's compliance with them. In this paper the authors contend that a Systems Engineering approach is a keystone in the development of any telescope and that verification of the telescope's performance is not only an important management tool but also forms the basis upon which successful telescope operation can be built. The development of the Southern African Large Telescope (SALT) has followed such an approach and is now in the verification phase of its development. Parts of the SALT verification process will be discussed in some detail to illustrate the suitability of this approach, including oversight by the telescope shareholders, recording of requirements and results, design verification and performance testing. Initial test results will be presented where appropriate.

  17. Survey of management and housing in farrowing quarters among independent and integrated swine farms in Québec.

    PubMed Central

    Ravel, A; D'Allaire, S; Bigras-Poulin, M

    1996-01-01

    Forty-eight randomly selected owner-operated swine breeding farms (independent farms) and 38 belonging to 5 integrated organizations specializing in swine production chosen from the largest in the province of Québec (integrated farms) were separately described regarding their general characteristics, sow feeding, management practices, and housing features in farrowing quarters. The parallel description of these 2 groups of farms aids in understanding what is done in the field. It also provides insight into potential differences between independent and integrated farms. Generally speaking, production tended to be more specialized and concentrated in integrated organizations. Specifically, more new practices seemed to have been adopted on the integrated farms, and their stockpersons seemed to have a more proactive style of management in farrowing quarters. Increased size of operations, proximity of information sources, profits yielded by new practices, and ease of implementation are discussed as explanations for this higher rate of adoption of new techniques among the organizations. These differences between the independent farms and the integrated organizations appeared to be all related to basic differences in their respective sizes. Although some differences were observed within, as well as between, each organization, many similarities were found across the majority of farms within each organization, thus supporting the existence of policies specific to each organization. Although these findings have to be confirmed before being generalized, they tend to suggest that independent swine farms and integrated organizations should be considered differently. PMID:8825989

  18. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  19. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  20. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  1. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  2. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  3. VERIFICATION OF GLOBAL CLIMATE CHANGE MITIGATION TECHNOLOGIES

    EPA Science Inventory

    This is a continuation of independent performance evaluations of environmental technologies under EPA's Environmental Technology Verification Program. Emissions of some greenhouse gases, most notably methane, can be controlled profitably now, even in the absence of regulations. ...

  4. 10 CFR 300.11 - Independent verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... DEPARTMENT OF ENERGY CLIMATE CHANGE VOLUNTARY GREENHOUSE GAS REPORTING PROGRAM: GENERAL GUIDELINES § 300.11..., Health and Safety Auditor Certification: California Climate Action Registry; Clean Development Mechanism..., such as the California Climate Action Registry Certification Protocol, the Climate Leaders...

  5. 10 CFR 300.11 - Independent verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... DEPARTMENT OF ENERGY CLIMATE CHANGE VOLUNTARY GREENHOUSE GAS REPORTING PROGRAM: GENERAL GUIDELINES § 300.11..., Health and Safety Auditor Certification: California Climate Action Registry; Clean Development Mechanism..., such as the California Climate Action Registry Certification Protocol, the Climate Leaders...

  6. 10 CFR 300.11 - Independent verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... DEPARTMENT OF ENERGY CLIMATE CHANGE VOLUNTARY GREENHOUSE GAS REPORTING PROGRAM: GENERAL GUIDELINES § 300.11..., Health and Safety Auditor Certification: California Climate Action Registry; Clean Development Mechanism..., such as the California Climate Action Registry Certification Protocol, the Climate Leaders...

  7. 10 CFR 300.11 - Independent verification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... DEPARTMENT OF ENERGY CLIMATE CHANGE VOLUNTARY GREENHOUSE GAS REPORTING PROGRAM: GENERAL GUIDELINES § 300.11..., Health and Safety Auditor Certification: California Climate Action Registry; Clean Development Mechanism..., such as the California Climate Action Registry Certification Protocol, the Climate Leaders...

  8. 10 CFR 300.11 - Independent verification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... DEPARTMENT OF ENERGY CLIMATE CHANGE VOLUNTARY GREENHOUSE GAS REPORTING PROGRAM: GENERAL GUIDELINES § 300.11..., Health and Safety Auditor Certification: California Climate Action Registry; Clean Development Mechanism..., such as the California Climate Action Registry Certification Protocol, the Climate Leaders...

  9. Case managers' and independent living counselors' perspectives on health promotion activities for individuals with physical and developmental disabilities.

    PubMed

    James, Aimee S; Shireman, Theresa I

    2010-01-01

    A fundamental component of maximizing the quality of life for individuals with disabilities is quality health care. We describe the perspectives of case managers and independent living counselors on the role of health promotion as a component of targeted case management services. Respondents held health promotion as an essential element of maximizing the quality of life for individuals with disabilities, although they spent more time on social services as compared to medical services. Their confidence in assisting the individuals they serve with respect to health promotion and disease management activities was demonstrably weaker than their reported knowledge levels for most items. Barriers to accessing those services might create this apparent disconnect between knowledge and confidence. PMID:21104516

  10. Computer systems for dental practice management: a new generation of independent dental software.

    PubMed

    Gilboe, D B; Scott, D A

    1994-04-01

    A new generation of computer programs for dental patient management eliminates total dependence on the vendor for programming support. The software design enables information collected with the dental system to be transferred to popular off-the-shelf programs designed for business. A simplified example is used to illustrate to practitioners the advantages of this type of data structure management. Programs designed on this basis offer optimum performance and expandability for both present and future needs. PMID:7940401

  11. A software package for the data-independent management of multidimensional data

    NASA Technical Reports Server (NTRS)

    Treinish, Lloyd A.; Gough, Michel L.

    1987-01-01

    The Common Data Format (CDF), a structure which provides true data independence for applications software and has been developed at the National Space Science Data Center, is discussed. The background to the CDF is reviewed, and the CDF is described. The conceptual organization of the CDF is discussed, and a sample CDF structure is shown and described. The implementation of CDF, its status, and its applications are examined.

  12. Successful Management of Refractory Dialysis Independent Wegener's Granulomatosis with Combination of Therapeutic Plasma Exchange and Rituximab.

    PubMed

    Malhotra, Sheetal; Dhawan, Hari Krishan; Sharma, Ratti Ram; Marwaha, Neelam; Sharma, Aman

    2016-06-01

    Wegener's granulomatosis (WG) is an autoimmune, antineutrophil cytoplasmic antibody-mediated necrotizing vasculitis involving the renal and upper and lower respiratory systems. Treatment relies on a combination of immunosuppressive drugs and tapering regimen of glucocorticoids. Therapeutic plasma exchange (TPE) has been recognized as a second line treatment. We report the successful use of TPE in combination with rituximab in achieving remission in a patient with WG (dialysis independent) not responding to conventional therapy. PMID:27408429

  13. ETV - VERIFICATION TESTING (ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM)

    EPA Science Inventory

    Verification testing is a major component of the Environmental Technology Verification (ETV) program. The ETV Program was instituted to verify the performance of innovative technical solutions to problems that threaten human health or the environment and was created to substantia...

  14. Independent technical evaluation and recommendations for contaminated groundwater at the department of energy office of legacy management Riverton processing site

    SciTech Connect

    Looney, Brian B.; Denham, Miles E.; Eddy-Dilek, Carol A.

    2014-04-01

    The U.S. Department of Energy Office of Legacy Management (DOE-LM) manages the legacy contamination at the Riverton, WY, Processing Site – a former uranium milling site that operated from 1958 to 1963. The tailings and associated materials were removed in 1988-1989 and contaminants are currently flushing from the groundwater. DOE-LM commissioned an independent technical team to assess the status of the contaminant flushing, identify any issues or opportunities for DOE-LM, and provide key recommendations. The team applied a range of technical frameworks – spatial, temporal, hydrological and geochemical – in performing the evaluation. In each topic area, an in-depth evaluation was performed using DOE-LM site data (e.g., chemical measurements in groundwater, surface water and soil, water levels, and historical records) along with information collected during the December 2013 site visit (e.g., plant type survey, geomorphology, and minerals that were observed, collected and evaluated).

  15. Swarm Verification

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.

  16. Establishing an Independent Mobile Health Program for Chronic Disease Self-Management Support in Bolivia

    PubMed Central

    Piette, John D.; Valverde, Helen; Marinec, Nicolle; Jantz, Rachel; Kamis, Kevin; de la Vega, Carlos Lazo; Woolley, Timothy; Pinto, Bismarck

    2014-01-01

    Background: Mobile health (m-health) work in low- and middle-income countries (LMICs) mainly consists of small pilot programs with an unclear path to scaling and dissemination. We describe the deployment and testing of an m-health platform for non-communicable disease (NCD) self-management support in Bolivia. Methods: Three hundred sixty-four primary care patients in La Paz with diabetes or hypertension completed surveys about their use of mobile phones, health and access to care. One hundred sixty-five of those patients then participated in a 12-week demonstration of automated telephone monitoring and self-management support. Weekly interactive voice response (IVR) calls were made from a platform established at a university in La Paz, under the direction of the regional health ministry. Results: Thirty-seven percent of survey respondents spoke indigenous languages at home and 38% had six or fewer years of education. Eighty-two percent had a mobile phone, 45% used text messaging with a standard phone, and 9% had a smartphone. Smartphones were least common among patients who were older, spoke indigenous languages, or had less education. IVR program participants completed 1007 self-management support calls with an overall response rate of 51%. IVR call completion was lower among older adults, but was not related to patients’ ethnicity, health status, or healthcare access. IVR health and self-care reports were consistent with information reported during in-person baseline interviews. Patients’ likelihood of reporting excellent, very good, or good health (versus fair or poor health) via IVR increased during program participation and was associated with better medication adherence. Patients completing follow-up interviews were satisfied with the program, with 19/20 (95%) reporting that they would recommend it to a friend. Conclusion: By collaborating with LMICs, m-health programs can be transferred from higher-resource centers to LMICs and implemented in ways that

  17. Costs and risks of weekend anesthesia staffing at 6 independently managed surgical suites.

    PubMed

    Dexter, Franklin; Epstein, Richard H; Marsh, H Michael

    2002-10-01

    We previously developed a statistical method that managers can use to assure that nurse anesthetists are on call on weekends for as few hours as possible while providing a specified level of care for operating room (OR) patients. The statistically derived staffing solutions are optimal, meaning that the total number of staffed hours is guaranteed to be as low as possible to achieve the specified risk of being unable to care for patients as promptly as they had in the recent past. We used the statistical method to review nurse anesthetist weekend staffing at 6 surgical suites that were part of a healthcare system with a cost-conscious management team. Four of the suites had already made staffing changes resulting in a greater than 6% risk of being understaffed. One suite had adequate current staffing but slightly exceeded the minimum total staffing hours. One suite had more anesthetist coverage than was needed, resulting in excess staffing costs greater than $200,000 per year. We conclude that the principal value of the statistical method may be in helping healthcare system administrators and anesthetists quantify the impact of contemplated reductions in staffing on their risk of understaffing and prolonging patients' wait for OR care. PMID:12425127
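    The general idea of trading staffed hours against a specified understaffing risk can be sketched with an empirical-quantile calculation; this is an illustrative simplification, not the authors' statistical method, and the demand data below are invented.

    ```python
    # Hedged sketch (not the paper's method): choose the minimum staffed
    # hours so that the fraction of past weekends whose demand exceeded
    # that level is at most a target risk.
    import math

    def minimum_staffed_hours(historical_demand_hours, risk):
        """Smallest staffing level exceeded on at most `risk` of past weekends,
        taken as a conservative (1 - risk) empirical quantile."""
        ordered = sorted(historical_demand_hours)
        k = math.ceil((1.0 - risk) * len(ordered)) - 1
        return ordered[max(k, 0)]

    # Illustrative data: anesthetist hours demanded on 10 past weekends
    demand = [6, 8, 8, 9, 10, 10, 11, 12, 14, 20]
    staffed = minimum_staffed_hours(demand, risk=0.10)  # covers all but the worst weekend
    ```

    With a 10% accepted risk, staffing at 14 hours covers 9 of the 10 historical weekends; driving the risk to zero would require staffing for the 20-hour worst case, which is the cost-versus-risk trade the abstract describes.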

  18. Assisting the Frail Elderly Living in Subsidized Housing for the Independent Elderly: A Profile of the Management and Its Support Priorities.

    ERIC Educational Resources Information Center

    Heumann, Leonard F.

    1988-01-01

    Surveyed 64 site managers of subsidized housing for independent elderly to determine staff response to residents' loss of functional independence. Found that most sites lacked adequate staff with training to monitor aging population. No clear policies were found regarding when to retain or transfer residents. Staff appeared unprepared to locate or…

  19. Risk management and market efficiency on the Midwest Independent System Operator electricity exchange

    NASA Astrophysics Data System (ADS)

    Jones, Kevin

    Midwest Independent Transmission System Operator, Inc. (MISO) is a non-profit regional transmission organization (RTO) that oversees electricity production and transmission across thirteen states and one Canadian province. MISO also operates an electronic exchange for buying and selling electricity for each of its five regional hubs. MISO oversees two types of markets. The forward market, which is referred to as the day-ahead (DA) market, allows market participants to place demand bids and supply offers on electricity to be delivered at a specified hour the following day. The equilibrium price, known as the locational marginal price (LMP), is determined by MISO after receiving sale offers and purchase bids from market participants. MISO also coordinates a spot market, which is known as the real-time (RT) market. Traders in the real-time market must submit bids and offers by thirty minutes prior to the hour for which the trade will be executed. After receiving purchase and sale offers for a given hour in the real time market, MISO then determines the LMP for that particular hour. The existence of the DA and RT markets allows producers and retailers to hedge against the large fluctuations that are common in electricity prices. Hedge ratios on the MISO exchange are estimated using various techniques. No hedge ratio technique examined consistently outperforms the unhedged portfolio in terms of variance reduction. Consequently, none of the hedge ratio methods in this study meet the general interpretation of FASB guidelines for a highly effective hedge. One of the major goals of deregulation is to bring about competition and increased efficiency in electricity markets. Previous research suggests that electricity exchanges may not be weak-form market efficient. A simple moving average trading rule is found to produce statistically and economically significant profits on the MISO exchange. This could call the long-term survivability of the MISO exchange into question.
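    One standard estimation technique of the kind the study examines is the textbook minimum-variance hedge ratio, h* = Cov(ΔS, ΔF) / Var(ΔF). The sketch below is illustrative only: the price changes are invented, not MISO data, and (as the abstract reports) no such method actually reduced variance reliably on the MISO exchange.

    ```python
    # Hedged sketch: minimum-variance hedge ratio and the variance-reduction
    # check used to judge hedge effectiveness. Synthetic data, chosen so the
    # hedge works; real MISO prices behaved far less cooperatively.
    from statistics import covariance, variance  # requires Python 3.10+

    def hedge_ratio(spot_changes, futures_changes):
        """OLS/minimum-variance hedge ratio h* = Cov(dS, dF) / Var(dF)."""
        return covariance(spot_changes, futures_changes) / variance(futures_changes)

    def hedged_variance(spot_changes, futures_changes, h):
        """Variance of the hedged position dS - h * dF."""
        residual = [s - h * f for s, f in zip(spot_changes, futures_changes)]
        return variance(residual)

    # Invented day-ahead (futures-like) and real-time (spot-like) price changes
    spot = [1.0, -2.0, 0.5, 3.0, -1.5]
    futures = [0.8, -1.6, 0.4, 2.5, -1.2]

    h = hedge_ratio(spot, futures)
    effective = hedged_variance(spot, futures, h) < variance(spot)
    ```

    A "highly effective" hedge in the FASB sense requires a large, consistent variance reduction of this kind, which is the criterion the MISO hedge ratios failed to meet.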

  20. 12 CFR 715.8 - Requirements for verification of accounts and passbooks.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... scope on which to base conclusions concerning management's financial reporting objectives; and (v... to base conclusions concerning management's financial reporting objectives to provide assurance that... CREDIT UNIONS SUPERVISORY COMMITTEE AUDITS AND VERIFICATIONS § 715.8 Requirements for verification...

  1. Independent management and financial review, Yucca Mountain Project, Nevada. Final report

    SciTech Connect

    1995-07-15

    The Yucca Mountain Project is one part of the Department of Energy's Office of Civilian Radioactive Waste Management Program (the Program) which was established by the Nuclear Waste Policy Act of 1982, and as amended in 1987. The Program's goal is to site the nation's first geologic repository for the permanent disposal of high-level nuclear waste, in the form of spent fuel rod assemblies, generated by the nuclear power industry and a smaller quantity of Government radioactive waste. The Program, which also encompasses the transportation system and the multipurpose canister system, was not the subject of this Report. The subject of this Review was only the Yucca Mountain Project in Nevada. While the Review was directed toward the Yucca Mountain Project rather than the Program as a whole, there are certain elements of the Project which cannot be addressed except through discussion of some Program issues. An example is the Total System Life Cycle Cost addressed in Section 7 of this report. Where Program issues are discussed in this Report, the reader is reminded of the scope limitations of the National Association of Regulatory Utility Commissioners (NARUC) contract to review only the Yucca Mountain Project. The primary scope of the Review was to respond to the specific criteria contained in the NARUC scope of work. In responding to these criteria, the Review Team understood that some interested parties have expressed concern over the requirements of the Nuclear Waste Policy Act relative to the Yucca Mountain Project and the nature of activities currently being carried out by the Department of Energy at the Yucca Mountain Project site. The Review Team has attempted to analyze relevant portions of the Nuclear Waste Policy Act as Amended, but has not conducted a thorough analysis of this legislation that could lead to any specific legal conclusions about all aspects of it.

  2. THIRD PARTY TECHNOLOGY PERFORMANCE VERIFICATION DATA FROM A STAKEHOLDER-DRIVEN TECHNOLOGY TESTING PROGRAM

    EPA Science Inventory

    The Greenhouse Gas (GHG) Technology Verification Center is one of 12 independently operated verification centers established by the U.S. Environmental Protection Agency. The Center provides third-party performance data to stakeholders interested in environmental technologies tha...

  3. A review on the technologies and services used in the self-management of health and independent living of elderly.

    PubMed

    Arif, Mohammad Jafar; El Emary, Ibrahiem M M; Koutsouris, Dimitrios-Dionisios

    2014-01-01

    As the number of aged people is rapidly growing, the need for health and living care of aged people living alone becomes imperative. The telecare systems are able to provide flexible services for older people suffering from chronic diseases, but are largely user group oriented. However, it is common in elderly to show symptoms of a combination of (chronic) diseases. Moreover, elderly are totally dependent on a third person as they are unable to perform a number of basic functions at home. They also feel cut off from the social fabric. Old people living in remote places typically use a telephone that dials a social alarm control center or mobile social alarm systems and monitoring systems. This study examines the existing solutions related to elderly assistance and proposes an advanced solution based on web technology for the self-management of health and independent living of elderly. PMID:25134962

  4. Welfare Eligibility: Deficit Reduction Act Income Verification Issues. Fact Sheet for the Ranking Minority Member, Subcommittee on Oversight of Government Management, Committee on Governmental Affairs, United States Senate.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC. Div. of Human Resources.

    This income and eligibility verification system (IEVS) database was created to aid the implementation of data exchanges among federal and state agencies. These exchanges are important for income and eligibility verification of persons who receive benefits from welfare and unemployment programs. Attempts are being made to match the computer…

  5. International Space Station United States Laboratory Module Water Recovery Management Subsystem Verification from Flight 5A to Stage ULF2

    NASA Technical Reports Server (NTRS)

    Williams, David E.; Labuda, Laura

    2009-01-01

    The International Space Station (ISS) Environmental Control and Life Support (ECLS) system comprises seven subsystems: Atmosphere Control and Supply (ACS), Atmosphere Revitalization (AR), Fire Detection and Suppression (FDS), Temperature and Humidity Control (THC), Vacuum System (VS), Water Recovery and Management (WRM), and Waste Management (WM). This paper provides a summary of the nominal operation of the United States (U.S.) Laboratory Module WRM design and detailed element methodologies utilized during the Qualification phase of the U.S. Laboratory Module prior to launch and the Qualification of all of the modification kits added to it from Flight 5A up to and including Stage ULF2.

  6. REPORT ON THE VERIFICATION AND APPLICATION OF THE UPDATED STORMWATER MANAGEMENT MODEL TO PREDICT POLLUTANT LOADINGS TO MEET TMDL REQUIREMENTS.

    EPA Science Inventory

    In September 2003, the initial phase of the modernization of the EPA Storm Water Management Model( SWMM), SWMM5 Beta Version B, is expected to be completed. The work in this phase includes (1) a revised architecture of the SWMM computational engine, using object oriented programm...

  7. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  8. Sensor to User - NASA/EOS Data for Coastal Zone Management Applications Developed from Integrated Analyses: Verification, Validation and Benchmark Report

    NASA Technical Reports Server (NTRS)

    Hall, Callie; Arnone, Robert

    2006-01-01

    The NASA Applied Sciences Program seeks to transfer NASA data, models, and knowledge into the hands of end-users by forming links with partner agencies and associated decision support tools (DSTs). Through the NASA REASoN (Research, Education and Applications Solutions Network) Cooperative Agreement, the Oceanography Division of the Naval Research Laboratory (NRLSSC) is developing new products through the integration of data from NASA Earth-Sun System assets with coastal ocean forecast models and other available data to enhance coastal management in the Gulf of Mexico. The recipient federal agency for this research effort is the National Oceanic and Atmospheric Administration (NOAA). The contents of this report detail the effort to further the goals of the NASA Applied Sciences Program by demonstrating the use of NASA satellite products combined with data-assimilating ocean models to provide near real-time information to maritime users and coastal managers of the Gulf of Mexico. This effort provides new and improved capabilities for monitoring, assessing, and predicting the coastal environment. Coastal managers can exploit these capabilities through enhanced DSTs at federal, state, and local agencies. The project addresses three major issues facing coastal managers: 1) Harmful Algal Blooms (HABs); 2) hypoxia; and 3) freshwater fluxes to the coastal ocean. A suite of ocean products capable of describing Ocean Weather is assembled on a daily basis as the foundation for this semi-operational multiyear effort. This continuous real-time capability brings decision makers a new ability to monitor both normal and anomalous coastal ocean conditions with a steady flow of satellite and ocean model conditions. Furthermore, as the baseline data sets are used more extensively and the customer list increases, customer feedback is obtained and additional customized products are developed and provided to decision makers. Continual customer feedback and response with new improved

  9. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level Boolean equivalence verification such as that done using BDDs and Model Checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  10. Requirements, Verification, and Compliance (RVC) Database Tool

    NASA Technical Reports Server (NTRS)

    Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale

    2001-01-01

    This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information normally contained within documents (e.g. specifications, plans) is contained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".

  11. Radiography Facility - Building 239 Independent Validation Review

    SciTech Connect

    Altenbach, T J; Beaulieu, R A; Watson, J F; Wong, H J

    2010-02-02

    The purpose of this task was to perform an Independent Validation Review to evaluate the successful implementation and effectiveness of Safety Basis controls, including new and revised controls, to support the implementation of a new DSA/TSR for B239. This task addresses Milestone 2 of FY10 PEP 7.6.6. As the first IVR ever conducted on an LLNL nuclear facility, it was designated a pilot project. The review follows the outline developed for Milestone 1 of the PEP, which is based on the DOE Draft Guide for Performance of Independent Verification Review of Safety Basis Controls. A formal Safety Basis procedure will be developed later, based on the lessons learned with this pilot project. Note that this review is termed a "Validation" in order to be consistent with the PEP definition and address issues historically raised about verification mechanisms at LLNL. Validation is intended to confirm that implementing mechanisms realistically establish the ability of a TSR LCO, administrative control, or safety management program to accomplish its intended safety function and that the controls are being implemented. This effort should not, however, be confused with a compliance assessment against all relevant DOE requirements and national standards. Nor is it used as a vehicle to question the derivation of controls already approved by LSO unless a given TSR statement simply cannot be implemented as stated.

  12. Design of Experiments with Multiple Independent Variables: A Resource Management Perspective on Complete and Reduced Factorial Designs

    PubMed Central

    Collins, Linda M.; Dziak, John J.; Li, Runze

    2009-01-01

    An investigator who plans to conduct experiments with multiple independent variables must decide whether to use a complete or reduced factorial design. This article advocates a resource management perspective on making this decision, in which the investigator seeks a strategic balance between service to scientific objectives and economy. Considerations in making design decisions include whether research questions are framed as main effects or simple effects; whether and which effects are aliased (confounded) in a particular design; the number of experimental conditions that must be implemented in a particular design and the number of experimental subjects the design requires to maintain the desired level of statistical power; and the costs associated with implementing experimental conditions and obtaining experimental subjects. In this article four design options are compared: complete factorial, individual experiments, single factor, and fractional factorial designs. Complete and fractional factorial designs and single factor designs are generally more economical than conducting individual experiments on each factor. Although relatively unfamiliar to behavioral scientists, fractional factorial designs merit serious consideration because of their economy and versatility. PMID:19719358
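
    The aliasing trade-off described in this abstract can be illustrated with a small sketch (not taken from the article itself): a 2^(3-1) half-fraction built from the illustrative defining relation I = ABC halves the number of experimental conditions at the cost of aliasing the main effect of C with the A×B interaction.

```python
from itertools import product

# Full 2^3 factorial: 8 runs over factors A, B, C coded as -1/+1.
full = [dict(zip("ABC", levels)) for levels in product((-1, 1), repeat=3)]

# Half-fraction with defining relation I = ABC: keep runs where A*B*C = +1.
half = [run for run in full if run["A"] * run["B"] * run["C"] == 1]

assert len(full) == 8
assert len(half) == 4  # 2^(3-1) = 4 conditions instead of 8

# Aliasing check: in every retained run C equals A*B, so the main effect
# of C cannot be distinguished from the A*B interaction in this design.
assert all(run["C"] == run["A"] * run["B"] for run in half)
```

    The economy the authors describe is visible directly: half the experimental conditions, purchased by accepting a known aliasing pattern.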

  13. Verification of Adaptive Systems

    SciTech Connect

    Pullum, Laura L; Cui, Xiaohui; Vassev, Emil; Hinchey, Mike; Rouff, Christopher; Buskens, Richard

    2012-01-01

    Adaptive systems are critical for future space and other unmanned and intelligent systems. Verification of these systems is also critical for their use in systems with potential harm to human life or with large financial investments. Due to their nondeterministic nature and extremely large state space, current methods for verification of software systems are not adequate to provide a high level of assurance for them. The combination of stabilization science, high performance computing simulations, compositional verification and traditional verification techniques, plus operational monitors, provides a complete approach to verification and deployment of adaptive systems that has not been used before. This paper gives an overview of this approach.

  14. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC Robot arm and CASE backhoe validation and a comparative study of DADS, DISCOS, and CONTOPS, which are existing public domain and commercial multibody dynamic simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.

  15. M&V Guidelines: Measurement and Verification for Performance-Based Contracts Version 4.0

    SciTech Connect

    2015-11-02

    Document outlines the Federal Energy Management Program's standard procedures and guidelines for measurement and verification (M&V) for federal energy managers, procurement officials, and energy service providers.

  16. INTERIM REPORT--INDEPENDENT VERIFICATION SURVEY OF SECTION 3, SURVEY UNITS 1, 4 AND 5 EXCAVATED SURFACES, WHITTAKER CORPORATION, REYNOLDS INDUSTRIAL PARK, TRANSFER, PENNSYLVANIA DCN: 5002-SR-04-0

    SciTech Connect

    ADAMS, WADE C

    2013-04-18

    At the Pennsylvania Department of Environmental Protection's request, ORAU's IEAV program conducted verification surveys on the excavated surfaces of Section 3, SUs 1, 4, and 5 at the Whittaker site on March 13 and 14, 2013. The survey activities included visual inspections, gamma radiation surface scans, gamma activity measurements, and soil sampling activities. Verification activities also included the review and assessment of the licensee's project documentation and methodologies. Surface scans identified four areas of elevated direct gamma radiation distinguishable from background; one area within SUs 1 and 4 and two areas within SU5. One area within SU5 was remediated by removing a golf-ball-sized piece of slag while ORAU staff was onsite. With the exception of the golf-ball-sized piece of slag within SU5, a review of the ESL Section 3 EXS data packages for SUs 1, 4, and 5 indicated that these locations of elevated gamma radiation were also identified by the ESL gamma scans and that ESL personnel performed additional investigations and soil sampling within these areas. The investigative results indicated that the areas met the release criteria.

  17. Fuel Retrieval System (FRS) Design Verification

    SciTech Connect

    YANOCHKO, R.M.

    2000-01-27

    This document was prepared as part of an independent review to explain design verification activities already completed and to define the remaining design verification actions for the Fuel Retrieval System. The Fuel Retrieval Subproject was established as part of the Spent Nuclear Fuel Project (SNF Project) to retrieve and repackage the SNF located in the K Basins. The Fuel Retrieval System (FRS) construction work is complete in the KW Basin, and start-up testing is underway. Design modifications and construction planning are also underway for the KE Basin. An independent review of the design verification process as applied to the K Basin projects was initiated in support of preparation for the SNF Project operational readiness review (ORR).

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION OF BAGHOUSE FILTRATION PRODUCTS

    EPA Science Inventory

    The Environmental Technology Verification Program (ETV) was started by EPA in 1995 to generate independent credible data on the performance of innovative technologies that have potential to improve protection of public health and the environment. ETV does not approve or certify p...

  19. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study, which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data for validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  20. HDM/PASCAL Verification System User's Manual

    NASA Technical Reports Server (NTRS)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  1. Verification issues for rule-based expert systems

    NASA Technical Reports Server (NTRS)

    Culbert, Chris; Riley, Gary; Savely, Robert T.

    1987-01-01

    Verification and validation of expert systems is very important for the future success of this technology. Software will never be used in non-trivial applications unless the program developers can assure both users and managers that the software is reliable and generally free from error. Therefore, verification and validation of expert systems must be done. The primary hindrance to effective verification and validation is the use of methodologies which do not produce testable requirements. An extension of the flight technique panels used in previous NASA programs should provide both documented requirements and very high levels of verification for expert systems.

  2. Verification of post permanently manned configuration Space Station elements

    NASA Technical Reports Server (NTRS)

    Scully, E. J.; Edwards, M. D.

    1986-01-01

    An account is given of the techniques and ground systems designed to fulfill post permanently manned configuration (PMC) Space Station verification tasks. Consideration is given to analysis using computer math models and computer-aided interface verification systems, testing using simulators and interface fixtures, and special inspection. It is noted that an initial Space Station design that accommodates and facilitates verification is crucial to an effective verification program, as is proper instrumentation, built-in test capability, and a precise configuration management, control, and record system. It is concluded that post-PMC verification should be accounted for both in the initial Space Station design and in the subsequent development of initial assembly flight verification techniques and capabilities.

  3. Greater years of maternal schooling and higher scores on academic achievement tests are independently associated with improved management of child diarrhea by rural Guatemalan mothers.

    PubMed

    Webb, Aimee L; Ramakrishnan, Usha; Stein, Aryeh D; Sellen, Daniel W; Merchant, Moeza; Martorell, Reynaldo

    2010-09-01

    Appropriate home management can alleviate many of the consequences of diarrhea including malnutrition, impaired development, growth faltering, and mortality. Maternal cognitive ability, years of schooling, and acquired academic skills are hypothesized to improve child health by improving maternal child care practices, such as illness management. Using information collected longitudinally in 1996-1999 from 466 rural Guatemalan women with children <36 months, we examined the independent associations between maternal years of schooling, academic skills, and scores on the Raven's Progressive Matrices and an illness management index (IMI). Women scoring in the lowest and middle tertiles of academic skills scored lower on the IMI compared to women in the highest tertile (-0.24 [95% CI: -0.54, 0.07]; -0.30 [95% CI: -0.54, -0.06], respectively) independent of sociodemographic factors, schooling, and Raven's scores. Among mothers with less than 1 year of schooling, scoring in the lowest tertile on the Raven's Progressive Matrices compared to the highest was significantly associated with scoring one point lower on the IMI (-1.18 [95% CI: -2.20, -0.17]). Greater academic skills were independently associated with maternal care during episodes of infant diarrhea. Schooling of young girls and/or community based programs that provide women with academic skills such as literacy, numeracy and knowledge could potentially improve mothers' care giving practices. PMID:19685178

  4. [The implementation of an independent and differentiated pain management SOP (Standard Operating Procedure) for the interdisciplinary intensive care unit].

    PubMed

    Aust, Hansjörg; Wulf, Hinnerk; Vassiliou, Timon

    2013-03-01

    Up to the present day, pain management in the ICU (intensive care unit) is an unresolved clinical problem due to patient heterogeneity, with complex variation in the etiopathology and treatment of the underlying diseases. Therefore, therapeutic strategies in the form of a standard operating procedure (SOP) are necessary to improve pain management for intensive care patients. Common guidelines for analgosedation are often inadequate to reflect the clinical situation. In particular, in an ICU setting without the permanent presence of a physician, a missing pain management SOP results in delayed pain therapy caused by therapeutic uncertainty among the nursing staff. In addition to our pre-existing SOP for analgosedation, we implemented a pain management SOP for our interdisciplinary, anaesthesiologic ICU. An exploratory survey among the nursing staff was conducted to assess the efficacy of the SOP. The results of the evaluation after a 6-month follow-up indicated a faster onset of pain management and good acceptance by the nursing staff. PMID:23589009

  5. 30 CFR 227.601 - What are a State's responsibilities if it performs automated verification?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... performs automated verification? 227.601 Section 227.601 Mineral Resources MINERALS MANAGEMENT SERVICE... Perform Delegated Functions § 227.601 What are a State's responsibilities if it performs automated verification? To perform automated verification of production reports or royalty reports, you must: (a)...

  6. Are Independent Probes Truly Independent?

    ERIC Educational Resources Information Center

    Camp, Gino; Pecher, Diane; Schmidt, Henk G.; Zeelenberg, Rene

    2009-01-01

    The independent cue technique has been developed to test traditional interference theories against inhibition theories of forgetting. In the present study, the authors tested the critical criterion for the independence of independent cues: Studied cues not presented during test (and unrelated to test cues) should not contribute to the retrieval…

  7. Voltage verification unit

    DOEpatents

    Martin, Edward J.

    2008-01-15

    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol: a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out and, once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit, all without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  8. Verification of RADTRAN

    SciTech Connect

    Kanipe, F.L.; Neuhauser, K.S.

    1995-12-31

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes.

  9. 340 and 310 drawing field verification

    SciTech Connect

    Langdon, J.

    1996-09-27

    The purpose of the drawing field verification work plan is to provide reliable drawings for the 310 Treated Effluent Disposal Facility (TEDF) and 340 Waste Handling Facility (340 Facility). The initial scope of this work plan is to provide field verified and updated versions of all the 340 Facility essential drawings. This plan can also be used for field verification of any other drawings that the facility management directs to be so updated. Any drawings revised by this work plan will be issued in an AutoCAD format.

  10. INDEPENDENT TECHNICAL ASSESSMENT OF MANAGEMENT OF STORMWATER AND WASTEWATER AT THE SEPARATIONS PROCESS RESEARCH UNIT (SPRU) DISPOSITION PROJECT, NEW YORK

    SciTech Connect

    Abitz, R.; Jackson, D.; Eddy-Dilek, C.

    2011-06-27

    The U.S. Department of Energy (DOE) is currently evaluating the water management procedures at the Separations Process Research Unit (SPRU). The facility has three issues related to water management that require technical assistance: (1) due to an excessive rainfall event in October 2010, contaminated water collected in the basements of the G2 and H2 buildings, and as a result of this event the contractor has had to collect and dispose of water offsite; (2) the failure of a sump pump at a KAPL outfall resulted in a Notice of Violation issued by the New York State Department of Environmental Conservation (NYSDEC) and a subsequent Consent Order, and on-site water now requires treatment and off-site disposition; and (3) stormwater infiltration has resulted in Strontium-90 levels discharged to the storm drains that exceed NR standards. The contractor has indicated that water management at SPRU requires major staff resources (at least 50 persons). The purpose of this review is to determine whether the contractor's technical approach warrants the large number of staff resources and to ensure that the technical approach is compliant and in accordance with federal, state, and NR requirements.

  11. Salam's independence

    NASA Astrophysics Data System (ADS)

    Fraser, Gordon

    2009-01-01

    In his kind review of my biography of the Nobel laureate Abdus Salam (December 2008 pp45-46), John W Moffat wrongly claims that Salam had "independently thought of the idea of parity violation in weak interactions".

  12. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.
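
    As a minimal, hedged illustration of where VCs come from (independent of the labeling technique this paper describes), the Hoare assignment rule {Q[e/x]} x := e {Q} can be modeled by evaluating the postcondition in the state that results from the assignment; all names below are illustrative.

```python
# Hoare assignment rule: {Q[e/x]} x := e {Q}.
# Predicates are modeled as functions from a program state (dict) to bool.

def assign(var, expr):
    """Return a state transformer for the assignment var := expr(state)."""
    def run(state):
        new = dict(state)
        new[var] = expr(state)
        return new
    return run

def wp_assign(var, expr, post):
    """Weakest precondition of var := expr with respect to post."""
    return lambda state: post({**state, var: expr(state)})

# Postcondition Q: x > 0.  Command: x := x + 1.
post = lambda s: s["x"] > 0
pre = wp_assign("x", lambda s: s["x"] + 1, post)

assert pre({"x": 0})       # 0 + 1 > 0, so x = 0 satisfies the wp
assert not pre({"x": -1})  # -1 + 1 = 0, so x = -1 does not
assert post(assign("x", lambda s: s["x"] + 1)({"x": 0}))
```

    A VC generator applies such rules over a whole program; the paper's contribution is attaching labels to these rules so the resulting conditions can be explained, which this sketch does not attempt.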

  13. Nuclear disarmament verification

    SciTech Connect

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  14. Wind gust warning verification

    NASA Astrophysics Data System (ADS)

    Primo, Cristina

    2016-07-01

    Operational meteorological centres around the world increasingly include warnings as one of their regular forecast products. Warnings are issued to warn the public about extreme weather situations that might occur leading to damages and losses. In forecasting these extreme events, meteorological centres help their potential users in preventing the damage or losses they might suffer. However, verifying these warnings requires specific methods. This is due not only to the fact that they happen rarely, but also because a new temporal dimension is added when defining a warning, namely the time window of the forecasted event. This paper analyses the issues that might appear when dealing with warning verification. It also proposes some new verification approaches that can be applied to wind warnings. These new techniques are later applied to a real life example, the verification of wind gust warnings at the German Meteorological Centre ("Deutscher Wetterdienst"). Finally, the results obtained from the latter are discussed.
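
    The standard 2x2 contingency-table scores commonly used as a starting point in warning verification can be sketched as follows; this is a generic illustration, not the specific techniques proposed in this paper, and the counts are invented.

```python
def contingency_scores(hits, misses, false_alarms):
    """Standard 2x2 warning-verification scores from event counts."""
    pod = hits / (hits + misses)                 # probability of detection
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    csi = hits / (hits + misses + false_alarms)  # critical success index
    return pod, far, csi

pod, far, csi = contingency_scores(hits=40, misses=10, false_alarms=10)
assert abs(pod - 0.8) < 1e-12      # 40 / 50 warnings verified
assert abs(far - 0.2) < 1e-12      # 10 / 50 warnings were false alarms
assert abs(csi - 40 / 60) < 1e-12
```

    The complication the paper addresses is that a warning covers a time window, so deciding what counts as a "hit" is itself nontrivial before such scores can be computed.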

  15. Verification and validation benchmarks.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
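
    A minimal sketch of code verification against an exact solution, in the spirit of the manufactured-solution benchmarks recommended above: solve u'' = -sin(x) on [0, pi] (whose exact solution is u = sin(x)) with a second-order finite-difference scheme and confirm that the observed convergence order matches the formal order. The solver and grid sizes are illustrative, not from the paper.

```python
import math

def solve_poisson(n):
    """Solve u'' = -sin(x) on [0, pi], u(0) = u(pi) = 0, with the 3-point
    stencil on n intervals; return the max-norm error vs. u = sin(x)."""
    h = math.pi / n
    d = [-math.sin((i + 1) * h) * h * h for i in range(n - 1)]  # h^2 * f
    # Thomas algorithm for the tridiagonal system (1, -2, 1).
    a, b, c = 1.0, -2.0, 1.0
    cp, dp = [0.0] * (n - 1), [0.0] * (n - 1)
    cp[0], dp[0] = c / b, d[0] / b
    for i in range(1, n - 1):
        m = b - a * cp[i - 1]
        cp[i] = c / m
        dp[i] = (d[i] - a * dp[i - 1]) / m
    u = [0.0] * (n - 1)
    u[-1] = dp[-1]
    for i in range(n - 3, -1, -1):
        u[i] = dp[i] - cp[i] * u[i + 1]
    return max(abs(u[i] - math.sin((i + 1) * h)) for i in range(n - 1))

# Error should drop by ~4x when the grid is refined 2x (formal order 2).
e1, e2 = solve_poisson(32), solve_poisson(64)
order = math.log(e1 / e2, 2)
assert e2 < e1
assert 1.8 < order < 2.2  # observed order matches the scheme's formal order
```

    This is the essence of a code verification benchmark: the exact solution is known by construction, so any failure to reproduce the formal convergence order points to a coding error rather than a modeling error.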

  16. TFE verification program

    NASA Astrophysics Data System (ADS)

    1994-01-01

    This is the final semiannual progress report for the Thermionic Fuel Elements (TFE) verification. A decision was made in August 1993 to begin a Close Out Program on October 1, 1993. Final reports summarizing the design analyses and test activities of the TFE Verification Program will be written, stand-alone documents for each task. The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein includes evaluated test data, design evaluations, the results of analyses and the significance of results.

  17. General Environmental Verification Specification

    NASA Technical Reports Server (NTRS)

    Milne, J. Scott, Jr.; Kaufman, Daniel S.

    2003-01-01

    The NASA Goddard Space Flight Center's General Environmental Verification Specification (GEVS) for STS and ELV Payloads, Subsystems, and Components is currently being revised based on lessons learned from GSFC engineering and flight assurance. The GEVS has been used by Goddard flight projects for the past 17 years as a baseline from which to tailor their environmental test programs. A summary of the requirements and updates is presented along with the rationale behind the changes. The major test areas covered by the GEVS include mechanical, thermal, and EMC, as well as more general requirements for planning and tracking of the verification programs.

  18. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  19. Cognitive Bias in Systems Verification

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is studied in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge; studies are based on correlations, and strict cause and effect is difficult to pinpoint. The effects cited in the paper and discussed here have been replicated many times over and appear sound. Many biases have been described, but it is still unclear whether they are all distinct; there may be only a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: overconfidence leads to questionable decisions to deploy; availability to an inability to conceive critical tests; representativeness to overinterpretation of results; positive test strategies to confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated, and is worth considering at key points in the process.

  20. Telomerase Enzyme Inhibition (TEI) and Cytolytic Therapy in the Management of Androgen Independent Osseous Metastatic Prostate Cancer

    PubMed Central

    Li, Yingming; Malaeb, Bahaa S.; Li, Zhong-ze; Thompson, Melissa G.; Chen, Zhi; Corey, David R.; Hsieh, Jer-Tsong; Shay, Jerry W.; Koeneman, Kenneth S.

    2014-01-01

    BACKGROUND Recurrent prostate cancer can be osseous, androgen independent, and lethal. The purpose is to discern the efficacy of synthetic small-molecule telomerase enzyme inhibitors (TEI), alone or in combination with other cytotoxic therapies, in controlling metastatic osseous prostate cancer. METHODS C4-2B was pre-treated with a match or mismatch TEI for 6 weeks and then inoculated into nude mice subcutaneously or intraosseously. In a separate experiment, untreated C4-2B was injected into the femur of nude mice. The mice were divided into seven systemic “combination” treatment groups of control, Ad-BSP-E1a virus, docetaxel, mismatch, and match TEI. Serum PSA was followed longitudinally. Histology analyses and histomorphometry were performed. Repeated measures analysis was applied for statistical analysis, and the Bonferroni method was used for multiple comparisons. RESULTS In the pre-treated study, the PSA of match-treated cells in the subcutaneous or intraosseous model was significantly lower than in the mismatch TEI or PBS treated groups (P <0.05). Histology revealed increased fibrosis, apoptosis, and decreased PSA staining in the match TEI treated subcutaneous xenografts. In the combination treatment study, the PSA was significantly lower in the single/double and triple treatment groups than in the control (P <0.05). Histology revealed that triple therapy mice had normal femur architecture. Histomorphometrics revealed that the areas of femur tumor and woven bone were significantly positively correlated (P =0.007). CONCLUSIONS Multiple lines of data point toward the efficacy of systemically administered telomerase inhibitors. Combining cytotoxic regimens with telomerase inhibitors could be beneficial in controlling prostate cancer. Clinical trials are warranted to explore the efficacy of TEI in prostate cancer. PMID:20043297

  1. Context Effects in Sentence Verification.

    ERIC Educational Resources Information Center

    Kiger, John I.; Glass, Arnold L.

    1981-01-01

    Three experiments examined what happens to reaction time to verify easy items when they are mixed with difficult items in a verification task. Subjects' verification of simple arithmetic equations and sentences took longer when these items were placed in a difficult list. Difficult sentences also slowed the verification of easy arithmetic equations. (Author/RD)

  2. Independent Evaluation of the integrated Community Case Management of Childhood Illness Strategy in Malawi Using a National Evaluation Platform Design

    PubMed Central

    Amouzou, Agbessi; Kanyuka, Mercy; Hazel, Elizabeth; Heidkamp, Rebecca; Marsh, Andrew; Mleme, Tiope; Munthali, Spy; Park, Lois; Banda, Benjamin; Moulton, Lawrence H.; Black, Robert E.; Hill, Kenneth; Perin, Jamie; Victora, Cesar G.; Bryce, Jennifer

    2016-01-01

    We evaluated the impact of integrated community case management of childhood illness (iCCM) on careseeking for childhood illness and child mortality in Malawi, using a National Evaluation Platform dose-response design with 27 districts as units of analysis. “Dose” variables included density of iCCM providers, drug availability, and supervision, measured through a cross-sectional cellular telephone survey of all iCCM-trained providers. “Response” variables were changes between 2010 and 2014 in careseeking and mortality in children aged 2–59 months, measured through household surveys. iCCM implementation strength was not associated with changes in careseeking or mortality. There were fewer than one iCCM-ready provider per 1,000 under-five children per district. About 70% of sick children were taken outside the home for care in both 2010 and 2014. Careseeking from iCCM providers increased over time from about 2% to 10%; careseeking from other providers fell by a similar amount. Likely contributors to the failure to find impact include low density of iCCM providers, geographic targeting of iCCM to “hard-to-reach” areas although women did not identify distance from a provider as a barrier to health care, and displacement of facility careseeking by iCCM careseeking. This suggests that targeting iCCM solely based on geographic barriers may need to be reconsidered. PMID:26787158

  3. Independent Evaluation of the integrated Community Case Management of Childhood Illness Strategy in Malawi Using a National Evaluation Platform Design.

    PubMed

    Amouzou, Agbessi; Kanyuka, Mercy; Hazel, Elizabeth; Heidkamp, Rebecca; Marsh, Andrew; Mleme, Tiope; Munthali, Spy; Park, Lois; Banda, Benjamin; Moulton, Lawrence H; Black, Robert E; Hill, Kenneth; Perin, Jamie; Victora, Cesar G; Bryce, Jennifer

    2016-03-01

    We evaluated the impact of integrated community case management of childhood illness (iCCM) on careseeking for childhood illness and child mortality in Malawi, using a National Evaluation Platform dose-response design with 27 districts as units of analysis. "Dose" variables included density of iCCM providers, drug availability, and supervision, measured through a cross-sectional cellular telephone survey of all iCCM-trained providers. "Response" variables were changes between 2010 and 2014 in careseeking and mortality in children aged 2-59 months, measured through household surveys. iCCM implementation strength was not associated with changes in careseeking or mortality. There were fewer than one iCCM-ready provider per 1,000 under-five children per district. About 70% of sick children were taken outside the home for care in both 2010 and 2014. Careseeking from iCCM providers increased over time from about 2% to 10%; careseeking from other providers fell by a similar amount. Likely contributors to the failure to find impact include low density of iCCM providers, geographic targeting of iCCM to "hard-to-reach" areas although women did not identify distance from a provider as a barrier to health care, and displacement of facility careseeking by iCCM careseeking. This suggests that targeting iCCM solely based on geographic barriers may need to be reconsidered. PMID:26787158

  4. Computer Graphics Verification

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Video processing creates technical animation sequences using studio-quality equipment to realistically represent fluid flow over space shuttle surfaces, helicopter rotors, and turbine blades. Computer Systems Co-op Tim Weatherford is shown performing computer graphics verification; part of a Co-op brochure.

  5. FPGA Verification Accelerator (FVAX)

    NASA Technical Reports Server (NTRS)

    Oh, Jane; Burke, Gary

    2008-01-01

    Is verification acceleration possible? Increasing the visibility of the internal nodes of the FPGA results in much faster debug time, and forcing internal signals directly allows a problem condition to be set up very quickly. Is this all? No; this is part of a comprehensive effort to improve the JPL FPGA design and V&V process.

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    This presentation will be given at the EPA Science Forum 2005 in Washington, DC. The Environmental Technology Verification Program (ETV) was initiated in 1995 to speed implementation of new and innovative commercial-ready environmental technologies by providing objective, 3rd pa...

  7. Exomars Mission Verification Approach

    NASA Astrophysics Data System (ADS)

    Cassi, Carlo; Gilardi, Franco; Bethge, Boris

    According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions. A first mission will be launched in 2016 under ESA lead, with the objectives to demonstrate the European capability to safely land a surface package on Mars, to perform Mars atmosphere investigation, and to provide communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an "Entry Descent & Landing Demonstrator Module (EDM)" and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the Orbiter for Mars atmosphere characterisation. A second mission, with its launch foreseen in 2018, is led by NASA, who provides the spacecraft and launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 meters, collecting samples and investigating them for signs of past and present life with exobiological experiments, and investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e.: the spacecraft composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission, and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process of the above products is quite complex and will include some peculiarities with limited or no heritage in Europe. Furthermore, the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the verification activities flow and the sharing of tests

  8. Automated claim and payment verification.

    PubMed

    Segal, Mark J; Morris, Susan; Rubin, James M O

    2002-01-01

    Since the start of managed care, there has been steady deterioration in the ability of physicians, hospitals, payors, and patients to understand reimbursement and the contracts and payment policies that drive it. This lack of transparency has generated administrative costs, confusion, and mistrust. It is therefore essential that physicians, hospitals, and payors have rapid access to accurate information on contractual payment terms. This article summarizes problems with contract-based reimbursement and needed responses by medical practices. It describes an innovative, Internet-based claims and payment verification service, Phynance, which automatically verifies the accuracy of all claims and payments by payor, contract and line item. This service enables practices to know and apply the one, true, contractually obligated allowable. The article details implementation costs and processes and anticipated return on investment. The resulting transparency improves business processes throughout health care, increasing efficiency and lowering costs for physicians, hospitals, payors, employers--and patients. PMID:12122814
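    The line-item verification the article describes can be reduced to a simple comparison of each payment against the contractually obligated allowable. The sketch below illustrates the idea only; the procedure codes, fee schedule, and amounts are hypothetical, and a production service such as the one described would draw allowables from the actual payor contracts.

```python
# Minimal sketch of line-item payment verification against contractual
# allowables. All codes and dollar amounts below are hypothetical.
allowables = {"99213": 75.00, "99214": 110.00}  # contract fee schedule

# (procedure code, amount actually paid by the payor)
paid_claims = [("99213", 75.00), ("99214", 98.50), ("99213", 60.00)]

def find_underpayments(claims, schedule):
    """Return (code, shortfall) for each line item paid below its allowable."""
    return [(code, round(schedule[code] - paid, 2))
            for code, paid in claims
            if paid < schedule[code]]

print(find_underpayments(paid_claims, allowables))
# → [('99214', 11.5), ('99213', 15.0)]
```

Flagged shortfalls would then feed the kind of appeal and reconciliation workflow the article attributes to such services.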

  9. 76 FR 29805 - Submission for Review: Verification of Full-Time School Attendance, RI 25-49

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-23

    ... MANAGEMENT Submission for Review: Verification of Full-Time School Attendance, RI 25-49 AGENCY: U.S. Office... opportunity to comment on a revised information collection request (ICR) 3206-0215, Verification of Full-Time...@opm.gov or faxed to (202) 606-0910. SUPPLEMENTARY INFORMATION: RI 25-49, Verification of...

  10. Multibody modeling and verification

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1989-01-01

    A summary of a ten week project on flexible multibody modeling, verification and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted for gathering information on the existence of experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research in the area of modeling was also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between the large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focuses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and the Robot Enhancement (RE) laboratories were considered during the laboratory development.

  11. Fluor Hanford Integrated Safety Management System Phase 1 Verification 04/12/2000 Thru 04/28/2000 Volume 1 and 2

    SciTech Connect

    PARSONS, J.E.

    2000-03-01

    The U.S. Department of Energy (DOE) commits to accomplishing its mission safely. To ensure this objective is met, DOE issued DOE P 450.4, Safety Management System Policy, and incorporated safety management into the DOE Acquisition Regulations ([DEAR] 48 CFR 970.5204-2 and 90.5204-78).

  12. Integrated testing and verification system for research flight software

    NASA Technical Reports Server (NTRS)

    Taylor, R. N.

    1979-01-01

    The MUST (Multipurpose User-oriented Software Technology) program is being developed to cut the cost of producing research flight software through a system of software support tools. An integrated verification and testing capability was designed as part of MUST. Documentation, verification, and test options are provided, with special attention to real-time, multiprocessing issues. The needs of the entire software production cycle were considered, with effective management and reduced lifecycle costs as foremost goals.

  13. 24 CFR 81.102 - Verification and enforcement to ensure GSE data integrity.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Verification and enforcement to ensure GSE data integrity. 81.102 Section 81.102 Housing and Urban Development Office of the Secretary... Provisions § 81.102 Verification and enforcement to ensure GSE data integrity. (a) Independent...

  14. 24 CFR 81.102 - Verification and enforcement to ensure GSE data integrity.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 1 2011-04-01 2011-04-01 false Verification and enforcement to ensure GSE data integrity. 81.102 Section 81.102 Housing and Urban Development Office of the Secretary... Provisions § 81.102 Verification and enforcement to ensure GSE data integrity. (a) Independent...

  15. Quantitative Measures for Software Independent Verification and Validation

    NASA Technical Reports Server (NTRS)

    Lee, Alice

    1996-01-01

    As software is maintained or reused, it undergoes an evolution which tends to increase the overall complexity of the code. To understand the effects of this, we brought in statistics experts and leading researchers in software complexity, reliability, and their interrelationships. Their project has resulted in our ability to statistically correlate specific code complexity attributes, in orthogonal domains, to errors found over time in the HAL/S flight software which flies in the Space Shuttle. Although only a prototype-tools experiment, the result of this research appears to be extendable to all other NASA software, given appropriate data similar to that logged for the Shuttle onboard software. Our research has demonstrated that a more complete domain coverage can be mathematically demonstrated with the approach we have applied, thereby ensuring full insight into the cause-and-effect relationship between the complexity of a software system and the fault density of that system. By applying the operational profile we can characterize the dynamic effects of software path complexity under this same approach. We now have the ability to measure specific attributes which have been statistically demonstrated to correlate to increased error probability, and to know which actions to take for each complexity domain. Shuttle software verifiers can now monitor changes in software complexity, assess the added or decreased risk of software faults in modified code, and determine necessary corrections. The reports, tool documentation, user's guides, and new approach that have resulted from this research effort represent advances in the state of the art of software quality and reliability assurance. Details describing how to apply this technique to other NASA code are contained in this document.

  16. Characteristics verification of an independently controllable electromagnetic spherical motor.

    PubMed

    Maeda, Shuhei; Hirata, Katsuhiro; Niguchi, Noboru

    2014-01-01

    We have been developing electromagnetic spherical actuators capable of three-degree-of-freedom rotation. However, these actuators require complex control to realize simultaneous triaxial drive, because rotation around one axis interferes with rotation around another. In this paper, we propose a new three-degree-of-freedom actuator where 3-axes rotation can be controlled easily. The basic structure and the operating principle of the actuator are described. Then the torque characteristics and the dynamic characteristics are computed by employing 3D-FEM and the effectiveness of this actuator is clarified. Finally, the experimental results using the prototype of the actuator are shown to verify the dynamic performance. PMID:24919011

  17. Characteristics Verification of an Independently Controllable Electromagnetic Spherical Motor

    PubMed Central

    Maeda, Shuhei; Hirata, Katsuhiro; Niguchi, Noboru

    2014-01-01

    We have been developing electromagnetic spherical actuators capable of three-degree-of-freedom rotation. However, these actuators require complex control to realize simultaneous triaxial drive, because rotation around one axis interferes with rotation around another. In this paper, we propose a new three-degree-of-freedom actuator where 3-axes rotation can be controlled easily. The basic structure and the operating principle of the actuator are described. Then the torque characteristics and the dynamic characteristics are computed by employing 3D-FEM and the effectiveness of this actuator is clarified. Finally, the experimental results using the prototype of the actuator are shown to verify the dynamic performance. PMID:24919011

  18. SAPHIRE 8 Software Independent Verification and Validation Plan

    SciTech Connect

    Rae J. Nims

    2009-04-01

    SAPHIRE 8 is being developed with a phased or cyclic iterative rapid application development methodology. Due to this approach, a similar approach is being taken for the IV&V activities on each vital software object. The IV&V plan is structured around NUREG/BR-0167, “Software Quality Assurance Program and Guidelines,” February 1993. The Nuclear Regulatory Research Office Instruction No.: PRM-12, “Software Quality Assurance for RES Sponsored Codes,” March 26, 2007 specifies that RES-sponsored software is to be evaluated against NUREG/BR-0167. Per the guidance in NUREG/BR-0167, SAPHIRE is classified as “Level 1.” Level 1 software corresponds to technical application software used in a safety decision.

  19. Independent Verification and Validation (IV and V) Criteria

    NASA Technical Reports Server (NTRS)

    McGill, Kenneth

    2000-01-01

    The purpose of this appendix is to establish quantifiable criteria for determining whether IV&V should be applied to a given software development. Since IV&V should begin in the Formulation Subprocess of a project, the process here described is based on metrics which are available before project approval.

  20. SAPHIRE 8 Software Independent Verification and Validation Plan

    SciTech Connect

    Rae J. Nims; Kent M. Norris

    2010-02-01

    SAPHIRE 8 is being developed with a phased or cyclic iterative rapid application development methodology. Due to this approach, a similar approach is being taken for the IV&V activities on each vital software object. The IV&V plan is structured around NUREG/BR-0167, “Software Quality Assurance Program and Guidelines,” February 1993. The Nuclear Regulatory Research Office Instruction No.: PRM-12, “Software Quality Assurance for RES Sponsored Codes,” March 26, 2007 specifies that RES-sponsored software is to be evaluated against NUREG/BR-0167. Per the guidance in NUREG/BR-0167, SAPHIRE is classified as “Level 1.” Level 1 software corresponds to technical application software used in a safety decision.

  1. Subtype-independent near full-length HIV-1 genome sequencing and assembly to be used in large molecular epidemiological studies and clinical management

    PubMed Central

    Grossmann, Sebastian; Nowak, Piotr; Neogi, Ujjwal

    2015-01-01

    Introduction HIV-1 near full-length genome (HIV-NFLG) sequencing from plasma is an attractive multidimensional tool to apply in large-scale population-based molecular epidemiological studies. It also enables genotypic resistance testing (GRT) for all drug target sites allowing effective intervention strategies for control and prevention in high-risk population groups. Thus, the main objective of this study was to develop a simplified subtype-independent, cost- and labour-efficient HIV-NFLG protocol that can be used in clinical management as well as in molecular epidemiological studies. Methods Plasma samples (n=30) were obtained from HIV-1B (n=10), HIV-1C (n=10), CRF01_AE (n=5) and CRF01_AG (n=5) infected individuals with minimum viral load >1120 copies/ml. The amplification was performed with two large amplicons of 5.5 kb and 3.7 kb, sequenced with 17 primers to obtain HIV-NFLG. GRT was validated against ViroSeq™ HIV-1 Genotyping System. Results After excluding four plasma samples with low-quality RNA, a total of 26 samples were attempted. Among them, NFLG was obtained from 24 (92%) samples with the lowest viral load being 3000 copies/ml. High (>99%) concordance was observed between HIV-NFLG and ViroSeq™ when determining the drug resistance mutations (DRMs). The N384I connection mutation was additionally detected by NFLG in two samples. Conclusions Our high efficiency subtype-independent HIV-NFLG is a simple and promising approach to be used in large-scale molecular epidemiological studies. It will facilitate the understanding of the HIV-1 pandemic population dynamics and outline effective intervention strategies. Furthermore, it can potentially be applicable in clinical management of drug resistance by evaluating DRMs against all available antiretrovirals in a single assay. PMID:26115688

  2. Distilling the Verification Process for Prognostics Algorithms

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil; Saxena, Abhinav; Celaya, Jose R.; Goebel, Kai

    2013-01-01

    The goal of prognostics and health management (PHM) systems is to ensure system safety, and reduce downtime and maintenance costs. It is important that a PHM system is verified and validated before it can be successfully deployed. Prognostics algorithms are integral parts of PHM systems. This paper investigates a systematic process of verification of such prognostics algorithms. To this end, first, this paper distinguishes between technology maturation and product development. Then, the paper describes the verification process for a prognostics algorithm as it moves up to higher maturity levels. This process is shown to be an iterative process where verification activities are interleaved with validation activities at each maturation level. In this work, we adopt the concept of technology readiness levels (TRLs) to represent the different maturity levels of a prognostics algorithm. It is shown that at each TRL, the verification of a prognostics algorithm depends on verifying the different components of the algorithm according to the requirements laid out by the PHM system that adopts this prognostics algorithm. Finally, using simplified examples, the systematic process for verifying a prognostics algorithm is demonstrated as the prognostics algorithm moves up TRLs.

  3. Independence and Survival.

    ERIC Educational Resources Information Center

    James, H. Thomas

    Independent schools that are of viable size, well managed, and strategically located to meet competition will survive and prosper past the current financial crisis. We live in a complex technological society with insatiable demands for knowledgeable people to keep it running. The future will be marked by the orderly selection of qualified people,…

  4. Independence, Disengagement, and Discipline

    ERIC Educational Resources Information Center

    Rubin, Ron

    2012-01-01

    School disengagement is linked to a lack of opportunities for students to fulfill their needs for independence and self-determination. Young people have little say about what, when, where, and how they will learn, the criteria used to assess their success, and the content of school and classroom rules. Traditional behavior management discourages…

  5. Independent technical review, handbook

    SciTech Connect

    Not Available

    1994-02-01

    Purpose: Provide an independent engineering review of the major projects being funded by the Department of Energy, Office of Environmental Restoration and Waste Management. The independent engineering review will address questions of whether the engineering practice is sufficiently developed to a point where a major project can be executed without significant technical problems. The independent review will focus on questions related to: (1) Adequacy of development of the technical base of understanding; (2) Status of development and availability of technology among the various alternatives; (3) Status and availability of the industrial infrastructure to support project design, equipment fabrication, facility construction, and process and program/project operation; (4) Adequacy of the design effort to provide a sound foundation to support execution of project; (5) Ability of the organization to fully integrate the system, and direct, manage, and control the execution of a complex major project.

  6. Understanding independence

    NASA Astrophysics Data System (ADS)

    Annan, James; Hargreaves, Julia

    2016-04-01

    In order to perform any Bayesian processing of a model ensemble, we need a prior over the ensemble members. In the case of multimodel ensembles such as CMIP, the historical approach of "model democracy" (i.e. equal weight for all models in the sample) is no longer credible (if it ever was) due to model duplication and inbreeding. The question of "model independence" is central to the question of prior weights. However, although this question has been repeatedly raised, it has not yet been satisfactorily addressed. Here I will discuss the issue of independence and present a theoretical foundation for understanding and analysing the ensemble in this context. I will also present some simple examples showing how these ideas may be applied and developed.

  7. Robust verification analysis

    NASA Astrophysics Data System (ADS)

    Rider, William; Witkowski, Walt; Kamm, James R.; Wildey, Tim

    2016-02-01

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework, and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data and expert informed error estimation including uncertainties for both the solution itself and order of convergence. Our method produces high quality results for the well-behaved cases relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.
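    The median-over-error-models idea in this abstract can be illustrated in miniature: compute an observed convergence order for every grid triplet, discard orders outside expert-judgment bounds (standing in for the paper's constraints), and take the median rather than the mean. This is a hedged, simplified sketch of the general approach, not the authors' implementation, and the grid values below are manufactured for illustration.

```python
import math
from statistics import median

def observed_order(f_coarse, f_mid, f_fine, r):
    """Observed convergence order from three successive grid levels
    with constant refinement ratio r (Richardson-style estimate)."""
    return math.log(abs((f_mid - f_coarse) / (f_fine - f_mid))) / math.log(r)

def robust_error_estimate(solutions, r, p_bounds=(0.5, 3.0)):
    """Median convergence order over all consecutive grid triplets,
    keeping only orders inside expert-judgment bounds, followed by a
    Richardson error estimate for the finest-grid solution."""
    orders = [observed_order(*solutions[i:i + 3], r)
              for i in range(len(solutions) - 2)]
    accepted = [p for p in orders if p_bounds[0] <= p <= p_bounds[1]]
    p_med = median(accepted)  # robust statistic: median, not mean
    err = abs(solutions[-1] - solutions[-2]) / (r ** p_med - 1)
    return p_med, err

# Solutions of a model problem on grids h = 0.4, 0.2, 0.1, 0.05 (r = 2),
# manufactured so the true answer is 1.0 with second-order error ~h^2:
p, err = robust_error_estimate([1.16, 1.04, 1.01, 1.0025], 2)
# p ≈ 2.0, err ≈ 0.0025 (the actual fine-grid error)
```

The median makes a single anomalous triplet far less damaging than it would be to a mean-based fit, which is the behavior the abstract claims for ill-behaved sequences.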

  8. RESRAD-BUILD verification.

    SciTech Connect

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-31

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft® Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified.

  9. TFE verification program

    NASA Astrophysics Data System (ADS)

    1990-03-01

    The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a Thermionic Fuel Element (TFE) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern. The general logic and strategy of the program to achieve its objectives is shown. Five prior programs form the basis for the TFE Verification Program: (1) AEC/NASA program of the 1960s and early 1970s; (2) SP-100 concept development program; (3) SP-100 thermionic technology program; (4) Thermionic irradiations program in TRIGA in FY-88; and (5) Thermionic Technology Program in 1986 and 1987.

  10. TFE Verification Program

    SciTech Connect

    Not Available

    1990-03-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern. The general logic and strategy of the program to achieve its objectives is shown on Fig. 1-1. Five prior programs form the basis for the TFE Verification Program: (1) AEC/NASA program of the 1960s and early 1970s; (2) SP-100 concept development program; (3) SP-100 thermionic technology program; (4) Thermionic irradiations program in TRIGA in FY-86; and (5) Thermionic Technology Program in 1986 and 1987. 18 refs., 64 figs., 43 tabs.

  11. 10 CFR 72.79 - Facility information and verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Facility information and verification. 72.79 Section 72.79 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSING REQUIREMENTS FOR THE INDEPENDENT STORAGE OF SPENT NUCLEAR FUEL, HIGH-LEVEL RADIOACTIVE WASTE, AND REACTOR-RELATED GREATER THAN CLASS C WASTE...

  12. Content Independence in Multimedia Databases.

    ERIC Educational Resources Information Center

    de Vries, Arjen P.

    2001-01-01

    Investigates the role of data management in multimedia digital libraries, and its implications for the design of database management systems. Introduces the notions of content abstraction and content independence. Proposes a blueprint of a new class of database technology, which supports the basic functionality for the management of both content…

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: EVALUATION OF THE XP-SWMM STORMWATER WASTEWATER MANAGEMENT MODEL, VERSION 8.2, 2000, FROM XP SOFTWARE, INC.

    EPA Science Inventory

    XP-SWMM is a commercial software package used throughout the United States and around the world for simulation of storm, sanitary and combined sewer systems. It was designed based on the EPA Storm Water Management Model (EPA SWMM), but has enhancements and additional algorithms f...

  14. Continuous verification using multimodal biometrics.

    PubMed

    Sim, Terence; Zhang, Sheng; Janakiraman, Rajkumar; Kumar, Sandeep

    2007-04-01

    Conventional verification systems, such as those controlling access to a secure room, do not usually require the user to reauthenticate himself for continued access to the protected resource. This may not be sufficient for high-security environments in which the protected resource needs to be continuously monitored for unauthorized use. In such cases, continuous verification is needed. In this paper, we present the theory, architecture, implementation, and performance of a multimodal biometrics verification system that continuously verifies the presence of a logged-in user. Two modalities are currently used--face and fingerprint--but our theory can be readily extended to include more modalities. We show that continuous verification imposes additional requirements on multimodal fusion when compared to conventional verification systems. We also argue that the usual performance metrics of false accept and false reject rates are insufficient yardsticks for continuous verification and propose new metrics against which we benchmark our system. PMID:17299225
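
    The fusion machinery in the paper is more elaborate than can be reproduced here, but its central requirement can be sketched: confidence in the user's presence must decay between observations and be refreshed by whichever modality reports next. The half-life, score scale, and function names below are illustrative assumptions, not the paper's parameters:

```python
import math

def presence_confidence(observations, now, half_life=30.0):
    """Decay each modality's last match score exponentially with age
    (half_life in seconds), then fuse by taking the best surviving score.
    observations: list of (timestamp, score in [0, 1]) tuples."""
    decayed = [score * math.exp(-math.log(2.0) * (now - t) / half_life)
               for t, score in observations]
    return max(decayed, default=0.0)

# A strong face match from 30 s ago now counts for only half its weight.
print(round(presence_confidence([(0.0, 0.9)], now=30.0), 3))  # → 0.45
```

    A system built this way would re-lock the session whenever the fused confidence falls below a threshold, which is the continuous-verification behaviour the abstract describes.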

  15. Formal Verification of Large Software Systems

    NASA Technical Reports Server (NTRS)

    Yin, Xiang; Knight, John

    2010-01-01

    We introduce a scalable proof structure to facilitate formal verification of large software systems. In our approach, we mechanically synthesize an abstract specification from the software implementation, match its static operational structure to that of the original specification, and organize the proof as the conjunction of a series of lemmas about the specification structure. By setting up a different lemma for each distinct element and proving each lemma independently, we obtain the important benefit that the proof scales easily for large systems. We present details of the approach and an illustration of its application on a challenge problem from the security domain.

  16. Quantum money with classical verification

    SciTech Connect

    Gavinsky, Dmitry

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, but none of the earlier quantum money constructions is known to possess it.

  17. Quantum money with classical verification

    NASA Astrophysics Data System (ADS)

    Gavinsky, Dmitry

    2014-12-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, but none of the earlier quantum money constructions is known to possess it.

  18. Signal verification can promote reliable signalling.

    PubMed

    Broom, Mark; Ruxton, Graeme D; Schaefer, H Martin

    2013-11-22

    The central question in communication theory is whether communication is reliable, and if so, which mechanisms select for reliability. The primary approach in the past has been to attribute reliability to strategic costs associated with signalling as predicted by the handicap principle. Yet, reliability can arise through other mechanisms, such as signal verification; but the theoretical understanding of such mechanisms has received relatively little attention. Here, we model whether verification can lead to reliability in repeated interactions that typically characterize mutualisms. Specifically, we model whether fruit consumers that discriminate among poor- and good-quality fruits within a population can select for reliable fruit signals. In our model, plants either signal or they do not; costs associated with signalling are fixed and independent of plant quality. We find parameter combinations where discriminating fruit consumers can select for signal reliability by abandoning unprofitable plants more quickly. This self-serving behaviour imposes costs upon plants as a by-product, rendering it unprofitable for unrewarding plants to signal. Thus, strategic costs to signalling are not a prerequisite for reliable communication. We expect verification to more generally explain signal reliability in repeated consumer-resource interactions that typify mutualisms but also in antagonistic interactions such as mimicry and aposematism. PMID:24068354

  19. Automating engineering verification in ALMA subsystems

    NASA Astrophysics Data System (ADS)

    Ortiz, José; Castillo, Jorge

    2014-08-01

    The Atacama Large Millimeter/submillimeter Array is an interferometer comprising 66 individual high-precision antennas located at over 5000 meters altitude in the north of Chile. Several complex electronic subsystems need to be meticulously tested at different stages of an antenna commissioning, both independently and when integrated together. First subsystem integration takes place at the Operations Support Facilities (OSF), at an altitude of 3000 meters. Second integration occurs at the high-altitude Array Operations Site (AOS), where combined performance with the Central Local Oscillator (CLO) and Correlator is also assessed. In addition, there are several other events requiring complete or partial verification of instrument specifications compliance, such as parts replacements, calibration, relocation within the AOS, preventive maintenance and troubleshooting due to poor performance in scientific observations. Restricted engineering time allocation and the constant pressure of minimizing downtime in a 24/7 astronomical observatory impose the need to complete (and report) the aforementioned verifications in the least possible time. Array-wide disturbances, such as global power interruptions and the subsequent recovery, generate the added challenge of executing this checkout on multiple antenna elements at once. This paper presents the outcome of the automation of engineering verification setup, execution, notification and reporting in ALMA, and how these efforts have resulted in a dramatic reduction of both the time and the operator training required. Signal Path Connectivity (SPC) checkout is introduced as a notable case of such automation.

  20. Signal verification can promote reliable signalling

    PubMed Central

    Broom, Mark; Ruxton, Graeme D.; Schaefer, H. Martin

    2013-01-01

    The central question in communication theory is whether communication is reliable, and if so, which mechanisms select for reliability. The primary approach in the past has been to attribute reliability to strategic costs associated with signalling as predicted by the handicap principle. Yet, reliability can arise through other mechanisms, such as signal verification; but the theoretical understanding of such mechanisms has received relatively little attention. Here, we model whether verification can lead to reliability in repeated interactions that typically characterize mutualisms. Specifically, we model whether fruit consumers that discriminate among poor- and good-quality fruits within a population can select for reliable fruit signals. In our model, plants either signal or they do not; costs associated with signalling are fixed and independent of plant quality. We find parameter combinations where discriminating fruit consumers can select for signal reliability by abandoning unprofitable plants more quickly. This self-serving behaviour imposes costs upon plants as a by-product, rendering it unprofitable for unrewarding plants to signal. Thus, strategic costs to signalling are not a prerequisite for reliable communication. We expect verification to more generally explain signal reliability in repeated consumer–resource interactions that typify mutualisms but also in antagonistic interactions such as mimicry and aposematism. PMID:24068354

  1. 'Independence' Panorama

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This is the Spirit 'Independence' panorama, acquired on martian days, or sols, 536 to 543 (July 6 to 13, 2005), from a position in the 'Columbia Hills' near the summit of 'Husband Hill.' The summit of 'Husband Hill' is the peak near the right side of this panorama and is about 100 meters (328 feet) away from the rover and about 30 meters (98 feet) higher in elevation. The rocky outcrops downhill and on the left side of this mosaic include 'Larry's Lookout' and 'Cumberland Ridge,' which Spirit explored in April, May, and June of 2005.

    The panorama spans 360 degrees and consists of 108 individual images, each acquired with five filters of the rover's panoramic camera. The approximate true color of the mosaic was generated using the camera's 750-, 530-, and 480-nanometer filters. During the 8 martian days, or sols, that it took to acquire this image, the lighting varied considerably, partly because of imaging at different times of sol, and partly because of small sol-to-sol variations in the dustiness of the atmosphere. These slight changes produced some image seams and rock shadows. These seams have been eliminated from the sky portion of the mosaic to better simulate the vista a person standing on Mars would see. However, it is often not possible or practical to smooth out such seams for regions of rock, soil, rover tracks or solar panels. Such is the nature of acquiring and assembling large panoramas from the rovers.

  2. Optical verification of the James Webb Space Telescope

    NASA Astrophysics Data System (ADS)

    McComas, Brian; Rifelli, Rich; Barto, Allison; Contos, Adam; Whitman, Tony; Wells, Conrad; Hagopian, John

    2006-06-01

    The optical system of the James Webb Space Telescope (JWST) is split between two of the Observatory's elements, the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM). The OTE optical design consists of a primary mirror of 18 hexagonal segments (25 m² clear aperture), a secondary mirror, a tertiary mirror, and a flat fine steering mirror used for fine guidance control. All optical components are made of beryllium. The primary and secondary mirror elements have hexapod actuation that provides six-degree-of-freedom rigid-body adjustment. The optical components are mounted to a very stable truss structure made of composite materials. The OTE structure also supports the ISIM. The ISIM contains the Science Instruments (SIs) and Fine Guidance Sensor (FGS) needed for acquiring mission science data and for Observatory pointing and control, and provides mechanical support for the SIs and FGS. The optical performance of the telescope is a key metric for the success of JWST. To ensure proper performance, the JWST optical verification program is a comprehensive, incremental, end-to-end verification program which includes multiple, independent cross-checks of key optical performance metrics to reduce the risk of on-orbit telescope performance issues. This paper discusses the verification testing and analysis necessary to verify the Observatory's image quality and sensitivity requirements. This verification starts with component-level verification and ends with Observatory-level verification at Johnson Space Center. The optical verification of JWST is thus a comprehensive, incremental, end-to-end program which includes both test and analysis.

  3. Verification of LHS distributions.

    SciTech Connect

    Swiler, Laura Painton

    2006-04-01

    This document provides verification test results for normal, lognormal, and uniform distributions that are used in Sandia's Latin Hypercube Sampling (LHS) software. The purpose of this testing is to verify that the sample values being generated in LHS are distributed according to the desired distribution types. The testing of distribution correctness is done by examining summary statistics, graphical comparisons using quantile-quantile plots, and formal statistical tests such as the Chi-square test, the Kolmogorov-Smirnov test, and the Anderson-Darling test. The overall results from the testing indicate that the generation of normal, lognormal, and uniform distributions in LHS is acceptable.
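
    A Kolmogorov-Smirnov check of the kind described fits in a few lines of standard-library Python. The sketch below tests one-dimensional Latin-hypercube-style stratified normal samples against the target CDF; the bisection inverse CDF and the sample size are illustrative choices, not LHS's implementation:

```python
import math
import random

def normal_cdf(x, mu=0.0, sigma=1.0):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def ks_statistic(samples, cdf):
    """One-sample Kolmogorov-Smirnov statistic: largest gap between the
    empirical CDF of the samples and the target CDF."""
    xs = sorted(samples)
    n = len(xs)
    return max(max((i + 1) / n - cdf(x), cdf(x) - i / n)
               for i, x in enumerate(xs))

def lhs_normal(n, rng):
    """One uniform draw per equal-probability stratum, mapped through the
    normal inverse CDF (approximated here by bisection on normal_cdf)."""
    def inv_cdf(p, lo=-10.0, hi=10.0):
        for _ in range(60):
            mid = (lo + hi) / 2.0
            lo, hi = (mid, hi) if normal_cdf(mid) < p else (lo, mid)
        return (lo + hi) / 2.0
    return [inv_cdf((k + rng.random()) / n) for k in range(n)]

rng = random.Random(42)
d = ks_statistic(lhs_normal(500, rng), normal_cdf)
print(d < 1.36 / math.sqrt(500))  # large-sample 5% critical value → True
```

    Stratified samples hug the target CDF much more tightly than plain random samples, so the statistic here is far below the critical value.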

  4. Knowledge-Based Aircraft Automation: Managers Guide on the use of Artificial Intelligence for Aircraft Automation and Verification and Validation Approach for a Neural-Based Flight Controller

    NASA Technical Reports Server (NTRS)

    Broderick, Ron

    1997-01-01

    The ultimate goal of this report was to integrate the powerful tools of artificial intelligence into the traditional process of software development. To maintain the US aerospace competitive advantage, traditional aerospace and software engineers need to more easily incorporate the technology of artificial intelligence into the advanced aerospace systems being designed today. The future goal was to transition artificial intelligence from an emerging technology to a standard technology that is considered early in the life cycle process to develop state-of-the-art aircraft automation systems. This report addressed the future goal in two ways. First, it provided a matrix that identified typical aircraft automation applications conducive to various artificial intelligence methods. The purpose of this matrix was to provide top-level guidance to managers contemplating the possible use of artificial intelligence in the development of aircraft automation. Second, the report provided a methodology to formally evaluate neural networks as part of the traditional process of software development. The matrix was developed by organizing the discipline of artificial intelligence into the following six methods: logical, object representation-based, distributed, uncertainty management, temporal and neurocomputing. Next, a study of existing aircraft automation applications that have been conducive to artificial intelligence implementation resulted in the following five categories: pilot-vehicle interface, system status and diagnosis, situation assessment, automatic flight planning, and aircraft flight control. The resulting matrix provided management guidance to understand artificial intelligence as it applied to aircraft automation. The approach taken to develop a methodology to formally evaluate neural networks as part of the software engineering life cycle was to start with the existing software quality assurance standards and to change these standards to include neural network

  5. The independent medical examination.

    PubMed

    Ameis, Arthur; Zasler, Nathan D

    2002-05-01

    The physiatrist, owing to expertise in impairment and disability analysis, is able to offer the medicolegal process considerable assistance. This chapter describes the scope and process of the independent medical examination (IME) and provides an overview of its component parts. Practical guidelines are provided for performing a physiatric IME of professional standard, and for serving as an impartial, expert witness. Caveats are described regarding testifying and medicolegal ethical issues along with practice management advice. PMID:12122847

  6. TFE Verification Program

    SciTech Connect

    Not Available

    1993-05-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE (thermionic fuel element) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and early 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern.

  7. Software Verification and Validation Procedure

    SciTech Connect

    Olund, Thomas S.

    2008-09-15

    This Software Verification and Validation procedure provides the action steps for the Tank Waste Information Network System (TWINS) testing process. The primary objective of the testing process is to provide assurance that the software functions as intended, and meets the requirements specified by the client. Verification and validation establish the primary basis for TWINS software product acceptance.

  8. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the openSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.
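
    Self-composition reduces a two-run property, such as the absence of secret-dependent behaviour, to a property of a single program composed with a renamed copy of itself. The toy sketch below illustrates only the idea, via bounded exploration rather than a deductive proof, and is not the paper's RC4 development:

```python
def cipher_step(public_in, secret_key):
    """Toy cipher round: the ciphertext may depend on the key, but the
    status flag (the publicly observable output here) must not."""
    ciphertext = (public_in + secret_key) % 256
    status = public_in % 2          # independent of the key by design
    return ciphertext, status

def self_composed_check(publics, keys):
    """Self-composition: run two copies of the program on equal public
    inputs but arbitrary key inputs, and require that the public
    observation (status) agrees. A deductive tool discharges this
    universally; this sketch only explores a bounded input space."""
    return all(cipher_step(p, k1)[1] == cipher_step(p, k2)[1]
               for p in publics for k1 in keys for k2 in keys)

print(self_composed_check(range(8), range(8)))  # → True
```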

  9. HDL to verification logic translator

    NASA Astrophysics Data System (ADS)

    Gambles, J. W.; Windley, P. J.

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in insuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-ordered logic (HOL) verification system. The translated definitions become the low level basis of circuit verification which in turn increases the designer's confidence in the correctness of higher level behavioral models.

  10. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in insuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-ordered logic (HOL) verification system. The translated definitions become the low level basis of circuit verification which in turn increases the designer's confidence in the correctness of higher level behavioral models.

  11. 28 CFR 541.30 - Lack of verification of need for protection.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 28 Judicial Administration 2 2012-07-01 2012-07-01 false Lack of verification of need for protection. 541.30 Section 541.30 Judicial Administration BUREAU OF PRISONS, DEPARTMENT OF JUSTICE INSTITUTIONAL MANAGEMENT INMATE DISCIPLINE AND SPECIAL HOUSING UNITS Special Housing Units § 541.30 Lack of verification of need for protection. If a...

  12. Verification of micro-beam irradiation

    NASA Astrophysics Data System (ADS)

    Li, Qiongge; Juang, Titania; Beth, Rachel; Chang, Sha; Oldham, Mark

    2015-01-01

    Micro-beam Radiation Therapy (MRT) is an experimental radiation therapy with provocative experimental data indicating potential for improved efficacy in some diseases. Here we demonstrated a comprehensive micro-beam verification method utilizing high-resolution (50 µm) PRESAGE/Micro-Optical-CT 3D dosimetry. A small cylindrical PRESAGE dosimeter was irradiated by a novel compact Carbon-Nano-Tube (CNT) field-emission-based MRT system. Percentage Depth Dose (PDD), Peak-to-Valley Dose Ratio (PVDR) and beam width (FWHM) data were obtained and analyzed from a three-strip radiation experiment. This verification revealed a fast dose drop-off with depth, a beam width preserved with depth (the FWHM averaged across the three beams remains constant (405.3 µm, sigma = 13.2 µm) between depths of 3.0 and 14.0 mm), and a high PVDR that increases with depth, from 6.3 at 3.0 mm to 8.6 at 14.0 mm. Operating procedures such as precise dosimeter mounting, robust mechanical motions (especially rotation) and stray-light artifact management were developed and optimized to achieve a more accurate dosimetric verification method.
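
    Both figures of merit are straightforward to extract from a lateral dose profile. The sketch below uses a hypothetical Gaussian strip on a uniform valley dose; the numbers are illustrative, not the measured PRESAGE data:

```python
import math

def pvdr_and_fwhm(positions_um, dose):
    """Peak-to-valley dose ratio and full width at half maximum of a single
    lateral micro-beam dose profile (positions in micrometres)."""
    peak, valley = max(dose), min(dose)
    half = peak / 2.0
    above = [x for x, d in zip(positions_um, dose) if d >= half]
    return peak / valley, above[-1] - above[0]

# One ~400 um wide Gaussian strip on a uniform background (valley) dose.
x = [float(i - 1000) for i in range(2001)]                  # -1000..1000 um
profile = [8.0 * math.exp(-0.5 * (xi / 170.0) ** 2) + 1.0 for xi in x]
pvdr, fwhm = pvdr_and_fwhm(x, profile)
print(round(pvdr, 1), round(fwhm))  # → 9.0 436
```

    On multi-strip data one would apply this per strip and average, as the abstract's averaged FWHM suggests.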

  13. Hybrid Verification of an Air Traffic Operational Concept

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar A.

    2005-01-01

    A concept of operations for air traffic management consists of a set of flight rules and procedures aimed at keeping aircraft safely separated. This paper reports on the formal verification of separation properties of NASA's Small Aircraft Transportation System, Higher Volume Operations (SATS HVO) concept for non-towered, non-radar airports. Based on a geometric description of the SATS HVO airspace, we derive analytical formulas to compute spacing requirements on nominal approaches. Then, we model the operational concept as a hybrid non-deterministic asynchronous state transition system. Using an explicit-state exploration technique, we show that the spacing requirements are always satisfied on nominal approaches. All the mathematical development presented in this paper has been formally verified in the Prototype Verification System (PVS). Keywords: formal verification, hybrid systems, air traffic management, theorem proving.
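
    Explicit-state exploration of the kind named above enumerates every reachable state of a non-deterministic transition system and checks an invariant in each. The toy model below is illustrative only, not the SATS HVO model; it checks a holding-capacity invariant for aircraft at two hypothetical approach fixes:

```python
from collections import deque

def check_invariant(initial_states, successors, invariant):
    """Breadth-first exploration of all reachable states; returns the first
    state that violates the invariant, or None if it always holds."""
    seen = set(initial_states)
    queue = deque(initial_states)
    while queue:
        state = queue.popleft()
        if not invariant(state):
            return state
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return None

# Toy model: (a, b) aircraft holding at two fixes, capacity 2 at each.
CAP = 2
def moves(state):
    a, b = state
    out = []
    if a < CAP:
        out.append((a + 1, b))          # new arrival at fix 1
    if a > 0 and b < CAP:
        out.append((a - 1, b + 1))      # handoff from fix 1 to fix 2
    if b > 0:
        out.append((a, b - 1))          # approach completed
    return out

violation = check_invariant([(0, 0)], moves, lambda s: s[0] + s[1] <= 2 * CAP)
print(violation)  # → None: the invariant holds in every reachable state
```

    The state space here is finite by construction; tools like PVS instead discharge such properties by proof, without enumerating states.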

  14. Verification and validation of RADMODL Version 1.0

    SciTech Connect

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A), were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  15. Correction, improvement and model verification of CARE 3, version 3

    NASA Technical Reports Server (NTRS)

    Rose, D. M.; Manke, J. W.; Altschul, R. E.; Nelson, D. L.

    1987-01-01

    An independent verification of the CARE 3 mathematical model and computer code was conducted and reported in NASA Contractor Report 166096, Review and Verification of CARE 3 Mathematical Model and Code: Interim Report. The study uncovered some implementation errors that were corrected and are reported in this document. The corrected CARE 3 program is called version 4. The document Correction, Improvement, and Model Verification of CARE 3, Version 3 was thus written in April 1984. It is being published now as it has been determined to contain a more accurate representation of CARE 3 than the preceding document of April 1983. This edition supersedes NASA-CR-166122, entitled 'Correction and Improvement of CARE 3, Version 3,' April 1983.

  16. Cold Flow Verification Test Facility

    SciTech Connect

    Shamsi, A.; Shadle, L.J.

    1996-12-31

    The cold flow verification test facility consists of a 15-foot high, 3-foot diameter, domed vessel made of clear acrylic in two flanged sections. The unit can operate at pressures up to 14 psig. The internals include a 10-foot high jetting fluidized bed, a cylindrical baffle that hangs from the dome, and a rotating grate for control of continuous solids removal. The fluid bed is continuously fed solids (20 to 150 lb/hr) through a central nozzle made up of concentric pipes. It can be configured as either a half or full cylinder of various dimensions. The fluid bed has flow loops for separate air flow control for conveying solids (inner jet, 500 to 100000 scfh), make-up into the jet (outer jet, 500 to 8000 scfh), spargers in the solids removal annulus (100 to 2000 scfh), and 6 air jets (20 to 200 scfh) on the sloping conical grid. Additional air (500 to 10000 scfh) can be added to the top of the dome and under the rotating grate. The outer vessel, the hanging cylindrical baffle or skirt, and the rotating grate can be used to study issues concerning moving bed reactors. There is ample allowance for access and instrumentation in the outer shell. Furthermore, this facility is available for future Cooperative Research and Development Agreements (CRADA) to study issues and problems associated with fluid- and fixed-bed reactors. The design allows testing of different dimensions and geometries.

  17. Online fingerprint verification.

    PubMed

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on the emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints. Local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study, a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications. PMID:17365425

  18. Verification of VENTSAR

    SciTech Connect

    Simpkins, A.A.

    1995-01-01

    The VENTSAR code is an upgraded and improved version of the VENTX code, which estimates concentrations on or near a building resulting from a release at a nearby location. The code calculates concentrations either for a given meteorological exceedance probability or for a given stability and wind-speed combination. A single building lying in the path of the plume can be modeled, and a penthouse can be added to the top of the building. Plume rise may also be considered. Releases can be either chemical or radioactive. Downwind concentrations are determined at user-specified incremental distances. This verification report was prepared to demonstrate that VENTSAR properly executes all algorithms and transfers data. Hand calculations were also performed to ensure proper application of the methodologies.
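VENTSAR's internal algorithms are not given in the abstract; as a rough illustration of the kind of hand calculation such a verification report checks, a generic Gaussian plume ground-level centerline concentration can be sketched (textbook quantities only, not VENTSAR's actual variables or methodology):

```python
import math

def plume_concentration(q_rel, u_wind, sigma_y, sigma_z, h_eff):
    """Ground-level centerline concentration from a continuous point
    release, using the standard Gaussian plume equation with ground
    reflection.
    q_rel:   release rate (g/s)
    u_wind:  mean wind speed (m/s)
    sigma_y, sigma_z: lateral/vertical dispersion coefficients (m),
                      evaluated at the downwind distance of interest
    h_eff:   effective release height (m)
    Returns concentration in g/m^3."""
    return (q_rel / (math.pi * u_wind * sigma_y * sigma_z)
            * math.exp(-h_eff**2 / (2.0 * sigma_z**2)))
```

For a ground-level release (h_eff = 0) this reduces to Q / (pi * u * sigma_y * sigma_z), which makes the hand check trivial.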

  19. Infrared scanner concept verification test report

    NASA Technical Reports Server (NTRS)

    Bachtel, F. D.

    1980-01-01

    The test results from a concept verification test conducted to assess the use of an infrared scanner as a remote temperature sensing device for the space shuttle program are presented. The temperature and geometric resolution limits, atmospheric attenuation effects including conditions with fog and rain, and the problem of surface emissivity variations are included. It is concluded that the basic concept of using an infrared scanner to determine near-freezing surface temperatures is feasible. The major problem identified concerns infrared reflections, which result in significant errors if not controlled. Actions taken to manage these errors resulted in design and operational constraints to control the viewing angle and surface emissivity.

  20. 49 CFR Appendix F to Part 236 - Minimum Requirements of FRA Directed Independent Third-Party Assessment of PTC System Safety...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Third-Party Assessment of PTC System Safety Verification and Validation F Appendix F to Part 236... Safety Verification and Validation (a) This appendix provides minimum requirements for mandatory independent third-party assessment of PTC system safety verification and validation pursuant to subpart H or...

  1. 49 CFR Appendix F to Part 236 - Minimum Requirements of FRA Directed Independent Third-Party Assessment of PTC System Safety...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Third-Party Assessment of PTC System Safety Verification and Validation F Appendix F to Part 236... Safety Verification and Validation (a) This appendix provides minimum requirements for mandatory independent third-party assessment of PTC system safety verification and validation pursuant to subpart H or...

  2. Assessment and Verification of National Vocational Qualifications: Policy and Practice.

    ERIC Educational Resources Information Center

    Konrad, John

    2000-01-01

    To overcome problems of assessment and verification of National Vocational Qualifications, the system should move from narrow quality control to total quality management. Situated learning in communities of practice, including assessors and assessees, should be developed. This requires radically different quality criteria and professional…

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PORTABLE GAS CHROMATOGRAPH - PERKIN-ELMER PHOTOVAC, INC. VOYAGER

    EPA Science Inventory

    The U.S Environmental Protection Agency, through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. Reports document the performa...

  4. Better Buildings Alliance, Advanced Rooftop Unit Campaign: Rooftop Unit Measurement and Verification (Fact Sheet)

    SciTech Connect

    Not Available

    2014-09-01

    This document provides facility managers and building owners an introduction to measurement and verification (M&V) methods for estimating the energy and cost savings of rooftop unit replacement or retrofit projects, in order to estimate paybacks or justify future projects.
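As a hedged illustration of the M&V idea (a generic whole-building approach, not taken from the fact sheet), savings are typically computed by fitting a baseline model to pre-retrofit data and comparing its predictions against measured post-retrofit use:

```python
def fit_baseline(xs, ys):
    """Least-squares line y = a + b*x fitted to baseline-period data,
    e.g. monthly kWh (ys) vs. cooling degree days (xs)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

def avoided_energy(a, b, post_xs, post_ys):
    """Savings = adjusted baseline (what the old equipment would have
    used under post-period conditions) minus measured post-retrofit use."""
    return sum((a + b * x) - y for x, y in zip(post_xs, post_ys))
```

Multiplying the avoided energy by the utility rate gives the cost savings used in a simple payback estimate.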

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PHOTOACOUSTIC SPECTROPHOTOMETER INNOVA AIR TECH INSTRUMENTS MODEL 1312 MULTI-GAS MONITOR

    EPA Science Inventory

    The U.S. Environmental Protection Agency, Through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. This report documents demons...

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PORTABLE GAS CHROMATOGRAPH - SENTEX SYSTEMS, INC. SCENTOGRAPH PLUS II

    EPA Science Inventory

    The U.S. Environmental Protection Agency, through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. This report documents demons...

  7. Biometric verification with correlation filters.

    PubMed

    Vijaya Kumar, B V K; Savvides, Marios; Xie, Chunyan; Venkataramani, Krithika; Thornton, Jason; Mahalanobis, Abhijit

    2004-01-10

    Using biometrics for subject verification can significantly improve security over that of approaches based on passwords and personal identification numbers, both of which people tend to lose or forget. In biometric verification the system tries to match an input biometric (such as a fingerprint, face image, or iris image) to a stored biometric template. Thus correlation filter techniques are attractive candidates for the matching precision needed in biometric verification. In particular, advanced correlation filters, such as synthetic discriminant function filters, can offer very good matching performance in the presence of variability in these biometric images (e.g., facial expressions, illumination changes, etc.). We investigate the performance of advanced correlation filters for face, fingerprint, and iris biometric verification. PMID:14735958
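The abstract gives no implementation; a minimal sketch of the correlation-filter matching idea, using a direct 1-D cross-correlation and a peak-to-sidelobe ratio (PSR) as the match score (a common figure of merit for such filters; the actual work uses 2-D advanced filters such as synthetic discriminant functions), could look like:

```python
import statistics

def cross_correlate(signal, template):
    """Direct (non-FFT) cross-correlation of a 1-D signal with a
    template; a sharp peak indicates a match at that shift."""
    n, m = len(signal), len(template)
    return [sum(signal[i + j] * template[j] for j in range(m))
            for i in range(n - m + 1)]

def peak_to_sidelobe(corr, exclude=1):
    """Peak-to-sidelobe ratio: (peak - mean) / std of the correlation
    output, with a small region around the peak excluded. Higher PSR
    means a more confident match."""
    k = max(range(len(corr)), key=lambda i: corr[i])
    side = [c for i, c in enumerate(corr) if abs(i - k) > exclude]
    return (corr[k] - statistics.mean(side)) / statistics.stdev(side)
```

Verification then reduces to thresholding the PSR: accept the claimed identity if the score exceeds a tuned threshold.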

  8. Biometric verification with correlation filters

    NASA Astrophysics Data System (ADS)

    Vijaya Kumar, B. V. K.; Savvides, Marios; Xie, Chunyan; Venkataramani, Krithika; Thornton, Jason; Mahalanobis, Abhijit

    2004-01-01

    Using biometrics for subject verification can significantly improve security over that of approaches based on passwords and personal identification numbers, both of which people tend to lose or forget. In biometric verification the system tries to match an input biometric (such as a fingerprint, face image, or iris image) to a stored biometric template. Thus correlation filter techniques are attractive candidates for the matching precision needed in biometric verification. In particular, advanced correlation filters, such as synthetic discriminant function filters, can offer very good matching performance in the presence of variability in these biometric images (e.g., facial expressions, illumination changes, etc.). We investigate the performance of advanced correlation filters for face, fingerprint, and iris biometric verification.

  9. TPS verification with UUT simulation

    NASA Astrophysics Data System (ADS)

    Wang, Guohua; Meng, Xiaofeng; Zhao, Ruixian

    2006-11-01

    TPS (Test Program Set) verification, or first-article acceptance testing, commonly depends on fault-insertion experiments on the UUT (Unit Under Test). However, the failure modes that can be injected on a UUT are limited, and injection is almost infeasible when the UUT is still in development or in a distributed state. To resolve this problem, a TPS verification method based on UUT interface-signal simulation is put forward. Interoperability between the ATS (automatic test system) and the UUT simulation platform is essential to realizing automatic TPS verification. After analyzing the ATS software architecture, an approach to realizing interoperability between the ATS software and the UUT simulation platform is proposed. The UUT simulation platform software architecture is then proposed based on the ATS software architecture, and the hardware composition and software architecture of the UUT simulation are described in detail. The UUT simulation platform has been applied in avionics equipment TPS development, debugging, and verification.

  10. Aerospace Nickel-cadmium Cell Verification

    NASA Technical Reports Server (NTRS)

    Manzo, Michelle A.; Strawn, D. Michael; Hall, Stephen W.

    2001-01-01

    During the early years of satellites, NASA successfully flew "NASA-Standard" nickel-cadmium (Ni-Cd) cells manufactured by GE/Gates/SAFT on a variety of spacecraft. In 1992 a NASA Battery Review Board determined that the strategy of a NASA Standard Cell and Battery Specification, and the accompanying NASA control of a standard manufacturing control document (MCD) for Ni-Cd cells and batteries, was unwarranted. As a result of that determination, the standards were abandoned and the use of cells other than the NASA Standard was required. In order to gain insight into the performance and characteristics of the various aerospace Ni-Cd products available, tasks were initiated within the NASA Aerospace Flight Battery Systems Program that involved the procurement and testing of representative aerospace Ni-Cd cell designs. A standard set of test conditions was established in order to provide comparable information about the products from various vendors. The objective of this testing was to provide independent verification of representative commercial flight cells available in the marketplace today. This paper provides a summary of the verification tests run on cells from various manufacturers: Sanyo 35 ampere-hour (Ah) standard and 35 Ah advanced Ni-Cd cells, SAFT 50 Ah Ni-Cd cells, and Eagle-Picher 21 Ah Magnum and 21 Ah Super Ni-Cd(TM) cells were put through a full evaluation. A limited number of 18 and 55 Ah cells from Acme Electric were also tested to provide an initial evaluation of the Acme aerospace cell designs. Additionally, 35 Ah aerospace-design Ni-MH cells from Sanyo were evaluated under the standard conditions established for this program. The test program is essentially complete. The cell design parameters, the verification test plan, and the details of the test results will be discussed.

  11. Employee vs independent contractor.

    PubMed

    Kolender, Ellen

    2012-01-01

    Finding qualified personnel for the cancer registry department has become increasingly difficult, as experienced abstractors retire and cancer diagnoses increase. Faced with hiring challenges, managers turn to teleworkers to fill positions and accomplish work in a timely manner. Suddenly, the hospital hires new legal staff and all telework agreements are disrupted. The question arises: Are teleworkers employees or independent contractors? Creating telework positions requires approval from the legal department and human resources. Caught off-guard in the last quarter of the year, I found myself again faced with hiring challenges. PMID:23599033

  12. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    NASA Astrophysics Data System (ADS)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
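A toy illustration (plain Python, not LSST's actual SysML/Enterprise Architect model) of the traceability chain the paper describes, from requirement to Verification Plan to Verification Event, might be modeled as:

```python
from dataclasses import dataclass, field

@dataclass
class VerificationPlan:
    """One requirement's plan, per the elements named in the paper."""
    requirement: str        # ID of the requirement being verified
    method: str             # Test / Analysis / Inspection / Demonstration
    level: str              # e.g., subsystem, system
    owner: str
    success_criteria: str

@dataclass
class VerificationEvent:
    """A collection of verification activities executed together."""
    name: str
    activities: list = field(default_factory=list)  # VerificationPlan items

    def requirements_covered(self):
        """Trace an event back to the requirements it verifies."""
        return sorted({a.requirement for a in self.activities})
```

The point of such a model is that coverage queries (which requirements does event X verify? which requirements have no event?) become trivial, which is the traceability the SysML approach provides.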

  13. Organics Verification Study for Sinclair and Dyes Inlets, Washington

    SciTech Connect

    Kohn, Nancy P.; Brandenberger, Jill M.; Niewolny, Laurie A.; Johnston, Robert K.

    2006-09-28

    Sinclair and Dyes Inlets near Bremerton, Washington, are on the State of Washington 1998 303(d) list of impaired waters because of fecal coliform contamination in marine water, metals in sediment and fish tissue, and organics in sediment and fish tissue. Because significant cleanup and source control activities have been conducted in the inlets since the data supporting the 1998 303(d) listings were collected, two verification studies were performed to address the 303(d) segments that were listed for metal and organic contaminants in marine sediment. The Metals Verification Study (MVS) was conducted in 2003; the final report, Metals Verification Study for Sinclair and Dyes Inlets, Washington, was published in March 2004 (Kohn et al. 2004). This report describes the Organics Verification Study that was conducted in 2005. The study approach was similar to the MVS in that many surface sediment samples were screened for the major classes of organic contaminants, and then the screening results and other available data were used to select a subset of samples for quantitative chemical analysis. Because the MVS was designed to obtain representative data on concentrations of contaminants in surface sediment throughout Sinclair Inlet, Dyes Inlet, Port Orchard Passage, and Rich Passage, aliquots of the 160 MVS sediment samples were used in the analysis for the Organics Verification Study. However, unlike metals screening methods, organics screening methods are not specific to individual organic compounds, and are not available for some target organics. Therefore, only the quantitative analytical results were used in the organics verification evaluation. The results of the Organics Verification Study showed that sediment quality outside of Sinclair Inlet is unlikely to be impaired because of organic contaminants. Similar to the results for metals, in Sinclair Inlet, the distribution of residual organic contaminants is generally limited to nearshore areas already within the

  14. Verification and validation of control system software

    SciTech Connect

    Munro, J.K. Jr.; Kisner, R.A.; Bhadtt, S.C.

    1991-01-01

    The following guidelines are proposed for verification and validation (V&V) of nuclear power plant control system software: (a) use risk management to decide what and how much V&V is needed; (b) classify each software application using a scheme that reflects what type and how much V&V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs.

  15. Multimodal Speaker Verification Based on Electroglottograph Signal and Glottal Activity Detection

    NASA Astrophysics Data System (ADS)

    Ćirović, Zoran; Milosavljević, Milan; Banjac, Zoran

    2010-12-01

    To achieve robust speaker verification, we propose a multimodal method that includes additional non-audio features and a glottal activity detector. An electroglottograph (EGG) is applied as the non-audio sensor, and parameters of the EGG signal are used to augment the conventional audio feature vector. The algorithm for EGG parameterization is based on the shape of the idealized waveform and the glottal activity detector. We compare our algorithm with a conventional one in terms of verification accuracy in a high-noise environment. All experiments are performed using a Gaussian mixture model recognition system. The results show a significant improvement in text-independent speaker verification in high-noise environments and opportunities for further improvements in this area.
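As a hedged sketch of the Gaussian mixture model scoring such systems use (a drastically simplified one-dimensional stand-in, not the paper's actual recognizer), verification can be framed as a log-likelihood ratio between a speaker model and a background model:

```python
import math

def gauss(x, mu, var):
    """Univariate Gaussian density."""
    return math.exp(-(x - mu) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def mixture_ll(frames, weights, mus, variances):
    """Average per-frame log-likelihood of 1-D feature frames under a
    Gaussian mixture model given by (weights, means, variances)."""
    return sum(
        math.log(sum(w * gauss(x, m, v)
                     for w, m, v in zip(weights, mus, variances)))
        for x in frames) / len(frames)

def llr_score(frames, speaker_gmm, background_gmm):
    """Log-likelihood ratio of the claimed speaker vs. the background
    model; accept the claim if the score exceeds a tuned threshold."""
    return mixture_ll(frames, *speaker_gmm) - mixture_ll(frames, *background_gmm)
```

In a real system the frames are multidimensional cepstral (and here, EGG-augmented) vectors and the mixtures are trained from enrollment data; the decision rule is the same.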

  16. Fifty years of progress in speaker verification

    NASA Astrophysics Data System (ADS)

    Rosenberg, Aaron E.

    2004-10-01

    The modern era in speaker recognition started about 50 years ago at Bell Laboratories with the controversial invention of the voiceprint technique for speaker identification based on expert analysis of speech spectrograms. Early speaker recognition research concentrated on finding acoustic-phonetic features effective in discriminating speakers. The first truly automatic text-dependent speaker verification systems were based on time contours, or templates, of speaker-specific acoustic features. An important element of these systems was the ability to time-warp sample templates against model templates in order to provide useful comparisons. Most modern text-dependent speaker verification systems are based on statistical representations of acoustic features analyzed as a function of time over specified utterances, most notably the hidden Markov model (HMM) representation. Modern text-independent systems are based on vector quantization representations and, more recently, on Gaussian mixture model (GMM) representations. An important ingredient of statistically based systems is likelihood-ratio decision techniques making use of speaker background models. Some recent research has shown how to extract higher-level features based on speaking behavior and combine them with lower-level acoustic features for improved performance. The talk will present these topics in historical order, showing the evolution of techniques.
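The template time-warping described above is classically done with dynamic time warping (DTW); a minimal pure-Python version of the standard algorithm (a generic implementation, not tied to any particular historical system) is:

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two feature sequences,
    allowing non-linear time alignment: each element of one sequence
    may align with one or more elements of the other."""
    inf = float("inf")
    n, m = len(a), len(b)
    # d[i][j] = best cumulative cost aligning a[:i] with b[:j]
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # stretch a
                                 d[i][j - 1],      # stretch b
                                 d[i - 1][j - 1])  # match and advance both
    return d[n][m]
```

A sample utterance spoken slightly slower than the stored template still scores a low distance, which is exactly why early template-based verifiers needed this alignment step.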

  17. Evaluation of verification methods for input-accountability measurements

    SciTech Connect

    Maeck, W. J.

    1980-01-01

    As part of TASTEX-related programs, two independent methods have been evaluated for verifying the amount of Pu charged to the head end of a nuclear fuel processing plant. The first is the Pu/U (gravimetric) method, TASTEX Task L, and the second is the tracer method, designated Task M. Summaries of the basic technology, results of various studies under actual plant conditions, and future requirements are given for each task.

  18. VESTA: a system-level verification environment based on C++

    NASA Astrophysics Data System (ADS)

    Shahdadpuri, Mahendra V.; Sosa, Javier; Navarro, Héctor; Montiel-Nelson, Juan A.; Sarmiento, Roberto

    2003-04-01

    System verification is an important issue at every design step to ensure complete system correctness. The verification effort is becoming more time-consuming due to the increase in design complexity. New environments are necessary to reduce the complexity of this task and, most importantly, the time to develop it. Among the languages used in verification, C++ is powerful enough to encapsulate the necessary concepts in a set of classes and templates. This work introduces a framework that allows describing and verifying highly complex systems in a user-friendly and speedy way with C++ classes. These encapsulate hardware description and verification concepts and can be reused throughout a project and also across development projects. Furthermore, the resulting libraries provide an easy-to-use interface for describing systems and writing test benches in C++, with a transparent connection to an HDL simulator. VESTA includes advanced memory management with an extremely versatile linked list, whose access mode can change on the fly to a FIFO, a LIFO, or a memory-array access mode, among others. Experimental results demonstrate that the basic types provided by our verification environment surpass the features of non-commercial solutions such as OpenVera or TestBuilder and commercial solutions such as the 'e' language. The results also show significant productivity gains in creating reusable testbenches and in debugging simulation runs.

  19. Towards Formal Verification of a Separation Microkernel

    NASA Astrophysics Data System (ADS)

    Butterfield, Andrew; Sanan, David; Hinchey, Mike

    2013-08-01

    The best approach to verifying an IMA separation kernel is to use a (fixed) time-space partitioning kernel with a multiple independent levels of separation (MILS) architecture. We describe an activity that explores the cost and feasibility of doing a formal verification of such a kernel to the Common Criteria (CC) levels mandated by the Separation Kernel Protection Profile (SKPP). We are developing a Reference Specification of such a kernel, and are using higher-order logic (HOL) to construct formal models of this specification and key separation properties. We then plan to do a dry run of part of a formal proof of those properties using the Isabelle/HOL theorem prover.

  20. Coherent Lidar Design and Performance Verification

    NASA Technical Reports Server (NTRS)

    Frehlich, Rod

    1996-01-01

    This final report summarizes the investigative results from the 3 complete years of funding and corresponding publications are listed. The first year saw the verification of beam alignment for coherent Doppler lidar in space by using the surface return. The second year saw the analysis and computerized simulation of using heterodyne efficiency as an absolute measure of performance of coherent Doppler lidar. A new method was proposed to determine the estimation error for Doppler lidar wind measurements without the need for an independent wind measurement. Coherent Doppler lidar signal covariance, including wind shear and turbulence, was derived and calculated for typical atmospheric conditions. The effects of wind turbulence defined by Kolmogorov spatial statistics were investigated theoretically and with simulations. The third year saw the performance of coherent Doppler lidar in the weak signal regime determined by computer simulations using the best velocity estimators. Improved algorithms for extracting the performance of velocity estimators with wind turbulence included were also produced.

  1. Verification Survey of the Building 315 Zero Power Reactor-6 Facility, Argonne National Laboratory-East, Argonne, Illinois

    SciTech Connect

    W. C. Adams

    2007-05-25

    Oak Ridge Institute for Science and Education (ORISE) conducted independent verification radiological survey activities at Argonne National Laboratory’s Building 315, Zero Power Reactor-6 facility in Argonne, Illinois. Independent verification survey activities included document and data reviews, alpha plus beta and gamma surface scans, alpha and beta surface activity measurements, and instrumentation comparisons. An interim letter report and a draft report, documenting the verification survey findings, were submitted to the DOE on November 8, 2006 and February 22, 2007, respectively (ORISE 2006b and 2007).

  2. Independent Evaluation: Insights from Public Accounting

    ERIC Educational Resources Information Center

    Brown, Abigail B.; Klerman, Jacob Alex

    2012-01-01

    Background: Maintaining the independence of contract government program evaluation presents significant contracting challenges. The ideal outcome for an agency is often both the impression of an independent evaluation "and" a glowing report. In this, independent evaluation is like financial statement audits: firm management wants both a public…

  3. Managing Multiple Sources of Information in an Independent K-12 Private School: A Case Study in a Student Information Systems Evaluation

    ERIC Educational Resources Information Center

    Yares, Ali Chava Kaufman

    2010-01-01

    Information is everywhere and finding the best method to manage it is a problem that all types of organizations have to deal with. Schools use Student Information Systems (SIS) to manage Student Data, Financial Information, Development, Human Resources, Admission, Financial Aid, Enrollment, Scheduling, and Health Information. A survey of 107…

  4. Independence Generalizing Monotone and Boolean Independences

    NASA Astrophysics Data System (ADS)

    Hasebe, Takahiro

    2011-01-01

    We define conditionally monotone independence in two states which interpolates monotone and Boolean ones. This independence is associative, and therefore leads to a natural probability theory in a non-commutative algebra.

  5. Hard and Soft Safety Verifications

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jon; Anderson, Brenda

    2012-01-01

    The purpose of this paper is to examine the differences between, and the effects of, hard and soft safety verifications. Initially, the terminology should be defined and clarified. A hard safety verification is a datum that demonstrates how a safety control is enacted; an example is relief valve testing. A soft safety verification is something usually described as nice to have but not necessary to prove safe operation. An example of a soft verification is the loss of the Solid Rocket Booster (SRB) casings from Shuttle flight STS-4. When the main parachutes failed, the casings impacted the water and sank. In the nose cap of the SRBs, video cameras recorded the release of the parachutes to determine safe operation and to provide information for potential anomaly resolution. Generally, examination of the casings and nozzles contributed to understanding of the newly developed boosters and their operation. Safety verification of SRB operation was demonstrated by examining the casings and nozzles for erosion or wear. Loss of the SRBs and the associated data did not delay the launch of the next Shuttle flight.

  6. Structural verification for GAS experiments

    NASA Technical Reports Server (NTRS)

    Peden, Mark Daniel

    1992-01-01

    The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of its experiment structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (Stress Corrosion Cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verifications (tests and analyses) will be outlined especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.

  7. Space Telescope performance and verification

    NASA Technical Reports Server (NTRS)

    Wright, W. F.

    1980-01-01

    The verification philosophy for the Space Telescope (ST) has evolved from years of experience with multispacecraft programs modified by the new factors introduced by the Space Transportation System. At the systems level of test, the ST will undergo joint qualification/acceptance tests with environment simulation using Lockheed's large spacecraft test facilities. These tests continue the process of detecting workmanship defects and module interface incompatibilities. The test program culminates in an 'all up' ST environmental test verification program resulting in a 'ready to launch' ST.

  8. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  9. Subsurface barrier integrity verification using perfluorocarbon tracers

    SciTech Connect

    Sullivan, T.M.; Heiser, J.; Milian, L.; Senum, G.

    1996-12-01

    Subsurface barriers are an extremely promising remediation option for many waste management problems. Gas-phase tracers include perfluorocarbon tracers (PFTs) and chlorofluorocarbon tracers (CFCs); both have been applied for leak detection in subsurface systems. The focus of this report is to describe the barrier verification tests conducted using PFTs and the analysis of the data from those tests. PFT verification tests have been performed on a simulated waste pit at the Hanford Geotechnical facility and on an actual waste pit at Brookhaven National Laboratory (BNL). The objective of these tests was to demonstrate proof-of-concept that PFT technology can be used to determine whether small breaches form in the barrier and to estimate the effectiveness of the barrier in preventing migration of the gas tracer to the monitoring wells. The subsurface barrier systems created at Hanford and BNL are described, followed by the experimental results and the analysis of the data. Based on the findings of this study, conclusions are offered and suggestions for future work are presented.

  10. Monitoring/Verification using DMS: TATP Example

    SciTech Connect

    Stephan Weeks, Kevin Kyle, Manuel Manard

    2008-05-30

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations-management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a “smart dust” sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. Fast GC is the leading field analytical method for gas phase separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15–300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-rad ionization source, for peroxide-based explosive measurements.

  11. Crew Exploration Vehicle (CEV) Avionics Integration Laboratory (CAIL) Independent Analysis

    NASA Technical Reports Server (NTRS)

    Davis, Mitchell L.; Aguilar, Michael L.; Mora, Victor D.; Regenie, Victoria A.; Ritz, William F.

    2009-01-01

    Two approaches were compared to the Crew Exploration Vehicle (CEV) Avionics Integration Laboratory (CAIL) approach: the Flat-Sat and the Shuttle Avionics Integration Laboratory (SAIL). The Flat-Sat and CAIL/SAIL approaches are two different tools designed to mitigate different risks. The Flat-Sat approach is designed to develop a mission concept into a flight avionics system and associated ground controller. The SAIL approach is designed to aid in the flight-readiness verification of the flight avionics system. The approaches are complementary, addressing both the system development risks and mission verification risks. The following NESC team findings were identified: the CAIL assumption is that the flight subsystems will be matured for system-level verification; the Flat-Sat and SAIL approaches are two different tools designed to mitigate different risks. The following NESC team recommendation was provided: define, document, and manage a detailed interface between the design and development laboratories (EDL and other integration labs) and the verification laboratory (CAIL).

  12. Test/QA Plan for Verification of Radio Frequency Identification (RFID) for Tracking Hazardous Waste Shipments across International Borders

    EPA Science Inventory

    The verification test will be conducted under the auspices of the U.S. Environmental Protection Agency (EPA) through the Environmental Technology Verification (ETV) Program. It will be performed by Battelle, which is managing the ETV Advanced Monitoring Systems (AMS) Center throu...

  13. Results from an Independent View on The Validation of Safety-Critical Space Systems

    NASA Astrophysics Data System (ADS)

    Silva, N.; Lopes, R.; Esper, A.; Barbosa, R.

    2013-08-01

    Independent verification and validation (IV&V) has been a key process for decades and is considered in several international standards. One of the activities described in the “ESA ISVV Guide” is independent test verification (stated as Integration/Unit Test Procedures and Test Data Verification). This activity is commonly overlooked, since customers do not really see the added value of thoroughly checking the validation team's work (it could be seen as testing the testers' work). This article presents the consolidated results of a large set of independent test verification activities, including the main difficulties, the results obtained, and the advantages and disadvantages of these activities for industry. This study will support customers in opting in or out of this task in future IV&V contracts, since we provide concrete results from real case studies in the space embedded systems domain.

  14. A verification system of RMAP protocol controller

    NASA Astrophysics Data System (ADS)

    Khanov, V. Kh; Shakhmatov, A. V.; Chekmarev, S. A.

    2015-01-01

    The functional verification problem for IP blocks of an RMAP protocol controller is considered. The application of a verification method using fully functional models of the processor and the internal bus of a system-on-chip is justified. Principles for constructing a verification system based on this approach are proposed. Practical results of creating a verification system for the RMAP protocol controller IP block are presented.

  15. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  16. 78 FR 58492 - Generator Verification Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ... Energy Regulatory Commission 18 CFR Part 40 Generator Verification Reliability Standards AGENCY: Federal... Organization: MOD-025-2 (Verification and Data Reporting of Generator Real and Reactive Power Capability and Synchronous Condenser Reactive Power Capability), MOD- 026-1 (Verification of Models and Data for...

  17. Verification and validation of impinging round jet simulations using an adaptive FEM

    NASA Astrophysics Data System (ADS)

    Pelletier, Dominique; Turgeon, Éric; Tremblay, Dominique

    2004-03-01

    This paper illustrates the use of an adaptive finite element method as a means of achieving verification of codes and simulations of impinging round jets, that is, obtaining numerical predictions with controlled accuracy. Validation of these grid-independent solutions is then performed by comparing predictions to measurements. We adopt the standard and accepted definitions of verification and validation (Technical Report AIAA-G-077-1998, American Institute of Aeronautics and Astronautics, 1998; Verification and Validation in Computational Science and Engineering. Hermosa Publishers: Albuquerque, NM, 1998). Mesh adaptation is used to perform the systematic and rigorous grid refinement studies required for both verification and validation in CFD. This ensures that discrepancies observed between predictions and measurements are due to deficiencies in the mathematical model of the flow. Issues in verification and validation are discussed. The paper presents an example of code verification by the method of manufactured solutions. Examples of successful and unsuccessful validation for laminar and turbulent impinging jets show that agreement with experiments is achieved only with a good mathematical model of the flow physics combined with accurate numerical solution of the differential equations. The paper emphasizes good CFD practice to systematically achieve verification so that validation studies are always performed on solid grounds.
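
    The grid refinement studies described above are commonly summarized by an observed order of accuracy and a Richardson extrapolate computed from solutions on three systematically refined grids. A minimal sketch of that calculation, with invented numbers rather than results from the paper:

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy p from solutions on three
    systematically refined grids with constant refinement ratio r."""
    return math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)

def richardson_extrapolate(f_medium, f_fine, r, p):
    """Estimate of the grid-converged value from the two finest grids."""
    return f_fine + (f_fine - f_medium) / (r**p - 1)

# Illustrative solutions converging toward 1.0 with second-order behavior
p = observed_order(1.04, 1.01, 1.0025, r=2.0)
f_converged = richardson_extrapolate(1.01, 1.0025, r=2.0, p=p)
```

    When the observed order matches the scheme's formal order, the refinement study supports the claim that the solution is grid-independent to the reported accuracy.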

  18. Definition of ground test for Large Space Structure (LSS) control verification

    NASA Technical Reports Server (NTRS)

    Waites, H. B.; Doane, G. B., III; Tollison, D. K.

    1984-01-01

    An overview for the definition of a ground test for the verification of Large Space Structure (LSS) control is given. The definition contains information on the description of the LSS ground verification experiment, the project management scheme, the design, development, fabrication and checkout of the subsystems, the systems engineering and integration, the hardware subsystems, the software, and a summary which includes future LSS ground test plans. Upon completion of these items, NASA/Marshall Space Flight Center will have an LSS ground test facility which will provide sufficient data on dynamics and control verification of LSS so that LSS flight system operations can be reasonably ensured.

  19. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-07-16

    This paper will explore the difficulties of deep reductions by examining the technical verification challenges. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking post New START, the next step will likely include warhead limits in the neighborhood of 1000 (Pifer 2010). Further reductions will include stepping stones at 100s of warheads, and then 10s of warheads, before final elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1000, 100s, and 10s. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain of custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national lab complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  20. Visual Attention During Sentence Verification.

    ERIC Educational Resources Information Center

    Lucas, Peter A.

    Eye movement data were collected for 28 college students reading 32 sentences with sentence verification questions. The factors observed were target sentence voice (active/passive), probe voice, and correct response (true/false). Pairs of subjects received the same set of stimuli, but with agents and objects in the sentences reversed. As expected,…

  1. Improved method for coliform verification.

    PubMed

    Diehl, J D

    1991-02-01

    Modification of a method for coliform verification presented in Standard Methods for the Examination of Water and Wastewater is described. Modification of the method, which is based on beta-galactosidase production, involves incorporation of a lactose operon inducer in medium upon which presumptive coliform isolates are cultured prior to beta-galactosidase assay. PMID:1901712

  2. Improved method for coliform verification.

    PubMed Central

    Diehl, J D

    1991-01-01

    Modification of a method for coliform verification presented in Standard Methods for the Examination of Water and Wastewater is described. Modification of the method, which is based on beta-galactosidase production, involves incorporation of a lactose operon inducer in medium upon which presumptive coliform isolates are cultured prior to beta-galactosidase assay. PMID:1901712

  3. A scheme for symmetrization verification

    NASA Astrophysics Data System (ADS)

    Sancho, Pedro

    2011-08-01

    We propose a scheme for symmetrization verification in two-particle systems, based on one-particle detection and state determination. In contrast to previous proposals, it does not follow a Hong-Ou-Mandel-type approach. Moreover, the technique can be used to generate superposition states of single particles.

  4. VERIFICATION OF WATER QUALITY MODELS

    EPA Science Inventory

    The basic concepts of water quality models are reviewed and the need to recognize calibration and verification of models with observed data is stressed. Post auditing of models after environmental control procedures are implemented is necessary to determine true model prediction ...

  5. Independent Peer Reviews

    SciTech Connect

    2012-03-16

    Independent Assessments: DOE's Systems Integrator convenes independent technical reviews to gauge progress toward meeting specific technical targets and to provide technical information necessary for key decisions.

  6. On Crowd-verification of Biological Networks

    PubMed Central

    Ansari, Sam; Binder, Jean; Boue, Stephanie; Di Fabio, Anselmo; Hayes, William; Hoeng, Julia; Iskandar, Anita; Kleiman, Robin; Norel, Raquel; O’Neel, Bruce; Peitsch, Manuel C.; Poussin, Carine; Pratt, Dexter; Rhrissorrakrai, Kahn; Schlage, Walter K.; Stolovitzky, Gustavo; Talikka, Marja

    2013-01-01

    Biological networks with a structured syntax are a powerful way of representing biological information generated from high density data; however, they can become unwieldy to manage as their size and complexity increase. This article presents a crowd-verification approach for the visualization and expansion of biological networks. Web-based graphical interfaces allow visualization of causal and correlative biological relationships represented using Biological Expression Language (BEL). Crowdsourcing principles enable participants to communally annotate these relationships based on literature evidence. Gamification principles are incorporated to further engage domain experts throughout biology to gather robust peer-reviewed information from which relationships can be identified and verified. The resulting network models will represent the current status of biological knowledge within the defined boundaries, here processes related to human lung disease. These models are amenable to computational analysis. For some period following conclusion of the challenge, the published models will remain available for continuous use and expansion by the scientific community. PMID:24151423

  7. On Crowd-verification of Biological Networks.

    PubMed

    Ansari, Sam; Binder, Jean; Boue, Stephanie; Di Fabio, Anselmo; Hayes, William; Hoeng, Julia; Iskandar, Anita; Kleiman, Robin; Norel, Raquel; O'Neel, Bruce; Peitsch, Manuel C; Poussin, Carine; Pratt, Dexter; Rhrissorrakrai, Kahn; Schlage, Walter K; Stolovitzky, Gustavo; Talikka, Marja

    2013-01-01

    Biological networks with a structured syntax are a powerful way of representing biological information generated from high density data; however, they can become unwieldy to manage as their size and complexity increase. This article presents a crowd-verification approach for the visualization and expansion of biological networks. Web-based graphical interfaces allow visualization of causal and correlative biological relationships represented using Biological Expression Language (BEL). Crowdsourcing principles enable participants to communally annotate these relationships based on literature evidence. Gamification principles are incorporated to further engage domain experts throughout biology to gather robust peer-reviewed information from which relationships can be identified and verified. The resulting network models will represent the current status of biological knowledge within the defined boundaries, here processes related to human lung disease. These models are amenable to computational analysis. For some period following conclusion of the challenge, the published models will remain available for continuous use and expansion by the scientific community. PMID:24151423

  8. Fingerprint verification on medical image reporting system.

    PubMed

    Chen, Yen-Cheng; Chen, Liang-Kuang; Tsai, Ming-Dar; Chiu, Hou-Chang; Chiu, Jainn-Shiun; Chong, Chee-Fah

    2008-03-01

    The healthcare industry is currently going through extensive changes through the adoption of robust, interoperable healthcare information technology by means of electronic medical records (EMR). However, a major concern with EMR is adequate confidentiality of the individual records being managed electronically. Multiple access points over an open network like the Internet increase the risk of patient data interception. The obligation is on healthcare providers to procure information security solutions that do not hamper patient care while still providing the confidentiality of patient information. Medical images are also part of the EMR and need to be protected from unauthorized users. This study integrates fingerprint verification, DICOM objects, digital signatures, and digital envelopes to ensure that access to the hospital Picture Archiving and Communication System (PACS) or radiology information system (RIS) is granted only to certified parties. PMID:18178287
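
    The integrity half of the sign-then-envelope pattern described above can be sketched as hash-then-authenticate. This is an illustrative stand-in, not the paper's implementation: a real PACS deployment would use an asymmetric digital signature (e.g., RSA over a SHA-256 digest of the DICOM object) plus a hybrid-encryption envelope, whereas this self-contained sketch uses a keyed HMAC so it runs with only the standard library:

```python
import hashlib
import hmac
import os

def sign_report(report: bytes, key: bytes) -> bytes:
    """Hash the serialized object, then authenticate the digest.
    An HMAC stands in for the asymmetric signature step."""
    digest = hashlib.sha256(report).digest()
    return hmac.new(key, digest, hashlib.sha256).digest()

def verify_report(report: bytes, tag: bytes, key: bytes) -> bool:
    """Constant-time check that the report has not been altered."""
    return hmac.compare_digest(sign_report(report, key), tag)

key = os.urandom(32)                      # shared secret for the sketch
report = b"(serialized DICOM report bytes)"
tag = sign_report(report, key)
assert verify_report(report, tag, key)            # untouched report passes
assert not verify_report(report + b"x", tag, key) # any alteration fails
```

    The confidentiality half (the digital envelope) would then encrypt the report and tag under a per-recipient session key, which is outside the scope of this sketch.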

  9. Class 1E software verification and validation: Past, present, and future

    SciTech Connect

    Persons, W.L.; Lawrence, J.D.

    1993-10-01

    This paper discusses work in progress that addresses software verification and validation (V&V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and discussion of the historical evolution of software V&V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V&V auditor, and government regulator. An account of the experience of the field test of the Verification Audit Plan and Report generated from the V&V Guidelines is presented along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V&V Guidelines is introduced. The paper concludes with a glossary and bibliography.

  10. Duty of Care and Autonomy: How Support Workers Managed the Tension between Protecting Service Users from Risk and Promoting Their Independence in a Specialist Group Home

    ERIC Educational Resources Information Center

    Hawkins, R.; Redley, M.; Holland, A. J.

    2011-01-01

    Background: In the UK those paid to support adults with intellectual disabilities must manage two potentially conflicting duties that are set out in policy documents as being vital to their role: protecting service users (their duty of care) and recognising service users' autonomy. This study focuses specifically on the support of people with the…

  11. A Spectral Verification of the HELIOS-2 Lattice Physics Code

    SciTech Connect

    D. S. Crawford; B. D. Ganapol; D. W. Nigg

    2012-11-01

    Core modeling of the Advanced Test Reactor (ATR) at INL is currently undergoing a significant update through the Core Modeling Update Project. The intent of the project is to bring ATR core modeling in line with today’s standard of computational efficiency and verification and validation practices. The HELIOS-2 lattice physics code is the lead code among several reactor physics codes dedicated to modernizing ATR core analysis. This presentation is concerned with an independent verification of the HELIOS-2 spectral representation, including the slowing-down and thermalization algorithm and its data dependency. Here, we describe and demonstrate a recently developed, simple cross-section generation algorithm based entirely on analytical multigroup parameters for both the slowing-down and thermal spectrum. The new capability features fine group detail to assess the flux and multiplication factor dependencies on cross-section data sets, using the fundamental infinite medium as an example.
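
    The infinite-medium benchmark mentioned above reduces to a production-over-absorption balance once multigroup cross sections and fluxes are in hand. A minimal sketch with invented two-group numbers (not ATR or HELIOS-2 data):

```python
def k_infinity(nu_sigma_f, sigma_a, flux):
    """Infinite-medium multiplication factor from multigroup data:
    neutron production over absorption, weighted by the group fluxes."""
    production = sum(nsf * phi for nsf, phi in zip(nu_sigma_f, flux))
    absorption = sum(sa * phi for sa, phi in zip(sigma_a, flux))
    return production / absorption

# Two-group toy data: group 1 = fast, group 2 = thermal (illustrative values)
k = k_infinity(nu_sigma_f=[0.008, 0.285],  # nu * Sigma_f per group (1/cm)
               sigma_a=[0.012, 0.121],      # Sigma_a per group (1/cm)
               flux=[0.6, 0.4])             # normalized group fluxes
```

    Comparing such a hand calculation against the code's reported multiplication factor for the same data set is one way to probe the spectral representation independently of the full lattice solution.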

  12. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  13. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  14. Knowledge-based system verification and validation

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1990-01-01

    The objective of this task is to develop and evaluate a methodology for verification and validation (V&V) of knowledge-based systems (KBS) for space station applications with high reliability requirements. The approach consists of three interrelated tasks. The first task is to evaluate the effectiveness of various validation methods for space station applications. The second task is to recommend requirements for KBS V&V for Space Station Freedom (SSF). The third task is to recommend modifications to the SSF to support the development of KBS using effective software engineering and validation techniques. To accomplish the first task, three complementary techniques will be evaluated: (1) Sensitivity Analysis (Worcester Polytechnic Institute); (2) Formal Verification of Safety Properties (SRI International); and (3) Consistency and Completeness Checking (Lockheed AI Center). During FY89 and FY90, each contractor will independently demonstrate the use of his technique on the fault detection, isolation, and reconfiguration (FDIR) KBS for the manned maneuvering unit (MMU), a rule-based system implemented in LISP. During FY91, the application of each of the techniques to other knowledge representations and KBS architectures will be addressed. After evaluation of the results of the first task and examination of Space Station Freedom V&V requirements for conventional software, a comprehensive KBS V&V methodology will be developed and documented. Development of highly reliable KBSs cannot be accomplished without effective software engineering methods. Using the results of current in-house research to develop and assess software engineering methods for KBSs, as well as assessment of techniques being developed elsewhere, an effective software engineering methodology for space station KBSs will be developed, and modification of the SSF to support these tools and methods will be addressed.

  15. Extremely accurate sequential verification of RELAP5-3D

    DOE PAGESBeta

    Mesina, George L.; Aumiller, David L.; Buschman, Francis X.

    2015-11-19

    Large computer programs like RELAP5-3D solve complex systems of governing, closure and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user-interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations and comparing calculations against analytical solutions and method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also tests that no unintended consequences result from code development in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, multiple cases in a single code execution, and modes of coupled/uncoupled operation. In conclusion, mathematical analyses of the adequacy of the checks used in the comparisons are provided.
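
    The core of sequential verification, comparing calculations between consecutive code versions, can be sketched as a relative-difference check over a set of output variables. The variable names and tolerance below are illustrative, not taken from RELAP5-3D:

```python
def sequential_verify(baseline, candidate, rel_tol=1e-12):
    """Compare named output values from consecutive code versions.
    Any relative difference above rel_tol flags an unintended change."""
    flagged = {}
    for name, old in baseline.items():
        new = candidate[name]
        denom = max(abs(old), abs(new), 1e-300)  # guard against zero values
        rel = abs(new - old) / denom
        if rel > rel_tol:
            flagged[name] = rel
    return flagged

# Hypothetical plant variables from version N and version N+1
base = {"pressure": 15.5e6, "temperature": 615.0, "mass_flow": 88.2}
cand = {"pressure": 15.5e6, "temperature": 615.0, "mass_flow": 88.2}
assert sequential_verify(base, cand) == {}   # identical runs: no flags

cand["temperature"] = 615.1                  # simulate an unintended change
assert "temperature" in sequential_verify(base, cand)
```

    In practice the comparison runs automatically over many test cases and all archived variables, so a single flagged entry is enough to stop a release until the change is explained.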

  16. Extremely accurate sequential verification of RELAP5-3D

    SciTech Connect

    Mesina, George L.; Aumiller, David L.; Buschman, Francis X.

    2015-11-19

    Large computer programs like RELAP5-3D solve complex systems of governing, closure and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user-interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations and comparing calculations against analytical solutions and method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also tests that no unintended consequences result from code development in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, multiple cases in a single code execution, and modes of coupled/uncoupled operation. In conclusion, mathematical analyses of the adequacy of the checks used in the comparisons are provided.

  17. External Verification of the Bundle Adjustment in Photogrammetric Software Using the Damped Bundle Adjustment Toolbox

    NASA Astrophysics Data System (ADS)

    Börlin, Niclas; Grussenmeyer, Pierre

    2016-06-01

    The aim of this paper is to investigate whether the Matlab-based Damped Bundle Adjustment Toolbox (DBAT) can be used to provide independent verification of the bundle adjustment (BA) computation of two popular software packages, PhotoModeler (PM) and PhotoScan (PS). For frame camera data sets with lens distortion, DBAT is able to reprocess and replicate subsets of PM results with high accuracy. For lens-distortion-free data sets, DBAT can furthermore provide comparative results between PM and PS. Data sets for the discussed projects are available from the authors. The use of an external verification tool such as DBAT will enable users to get an independent verification of the computations of their software. In addition, DBAT can provide computation of quality parameters such as estimated standard deviations, correlation between parameters, etc., something that should be part of best practice for any photogrammetric software. Finally, as the code is free and open-source, users can add computations of their own.
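
    The quality parameters mentioned above follow directly from the posterior covariance matrix of the adjustment: standard deviations are the square roots of the diagonal, and correlations are the covariances normalized by those deviations. A minimal sketch (the covariance values are invented, not DBAT output):

```python
import math

def std_and_corr(cov):
    """Parameter standard deviations and correlation matrix from a
    posterior covariance matrix (pure Python, list-of-lists input)."""
    n = len(cov)
    std = [math.sqrt(cov[i][i]) for i in range(n)]
    corr = [[cov[i][j] / (std[i] * std[j]) for j in range(n)]
            for i in range(n)]
    return std, corr

# Toy 2x2 covariance for two adjusted parameters
cov = [[4.0, 1.0],
       [1.0, 1.0]]
std, corr = std_and_corr(cov)
# std -> [2.0, 1.0]; off-diagonal correlation 0.5
```

    High off-diagonal correlations are a common warning sign in self-calibration, which is why reporting them is suggested as best practice in the paper.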

  18. COS Internal NUV Wavelength Verification

    NASA Astrophysics Data System (ADS)

    Keyes, Charles

    2009-07-01

    This program will be executed after the uplink of the OSM2 position updates derived from the determination of the wavelength-scale zero points and desired spectral ranges for each grating in activity COS14 {program 11474 - COS NUV Internal/External Wavelength Scales}. This program will verify that the operational spectral ranges for each grating, central wavelength, and FP-POS are those desired. Subsequent to a successful verification, COS NUV ERO observations and NUV science can be enabled. An internal wavelength calibration spectrum using the default PtNe lamp {lamp 1} with each NUV grating at each central wavelength setting and each FP-POS position will be obtained for the verification. Additional exposures and waits between certain exposures will be required to avoid - and to evaluate - mechanism drifts.

  19. COS Internal FUV Wavelength Verification

    NASA Astrophysics Data System (ADS)

    Keyes, Charles

    2009-07-01

    This program will be executed after the uplink of the OSM1 position updates derived from the determination of the wavelength-scale zero points and desired spectral ranges for each grating in activity COS29 {program 11487 - COS FUV Internal/External Wavelength Scales}. This program will verify that the operational spectral ranges for each grating, central wavelength, and FP-POS are those desired. Subsequent to a successful verification, COS FUV ERO observations that require accurate wavelength scales {if any} and FUV science can be enabled. An internal wavelength calibration spectrum using the default PtNe lamp {lamp 1} with each FUV grating at each central wavelength setting and each FP-POS position will be obtained for the verification. Additional exposures and waits between certain exposures will be required to avoid - and to evaluate - mechanism drifts.

  20. NEXT Thruster Component Verification Testing

    NASA Technical Reports Server (NTRS)

    Pinero, Luis R.; Sovey, James S.

    2007-01-01

    Component testing is a critical part of thruster life validation activities under NASA's Evolutionary Xenon Thruster (NEXT) project. The high-voltage propellant isolators were selected for design verification testing. Even though they are based on a heritage design, design changes were made because the isolators will be operated under different environmental conditions, including temperature, voltage, and pressure. The life test of two NEXT isolators was therefore initiated and has accumulated more than 10,000 hr of operation. Measurements to date indicate only a negligibly small increase in leakage current. The cathode heaters were also selected for verification testing. The technology to fabricate these heaters, developed for the International Space Station plasma contactor hollow cathode assembly, was transferred to Aerojet for the fabrication of the NEXT prototype model ion thrusters. Testing the contractor-fabricated heaters is necessary to validate fabrication processes for high-reliability heaters. This paper documents the status of the propellant isolator and cathode heater tests.

  1. Interim Letter Report - Verification Survey Results for Activities Performed in March 2009 for the Vitrification Test Facility Warehouse at the West Valley Demonstration Project, Ashford, New York

    SciTech Connect

    B.D. Estes

    2009-04-24

    The objective of the verification activities was to provide independent radiological surveys and data for use by the Department of Energy (DOE) to ensure that the building satisfies the requirements for release without radiological controls.

  2. Letter Report - Verification Results for the Non-Real Property Radiological Release Program at the West Valley Demonstration Project, Ashford, New York

    SciTech Connect

    M.A. Buchholz

    2009-04-29

    The objective of the verification activities is to provide an independent review of the design, implementation, and performance of the radiological unrestricted release program for personal property, materials, and equipment (non-real property).

  3. Formal verification of AI software

    NASA Technical Reports Server (NTRS)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  4. Verification and transparency in future arms control

    SciTech Connect

    Pilat, J.F.

    1996-09-01

    Verification's importance has changed dramatically over time, although it always has been in the forefront of arms control. The goals and measures of verification and the criteria for success have changed with the times as well, reflecting such factors as the centrality of the prospective agreement to East-West relations during the Cold War, the state of relations between the United States and the Soviet Union, and the technologies available for monitoring. Verification's role may be declining in the post-Cold War period. The prospects for such a development will depend, first and foremost, on the high costs of traditional arms control, especially those associated with requirements for verification. Moreover, the growing interest in informal, or non-negotiated, arms control does not allow for verification provisions by the very nature of these arrangements. Multilateral agreements are also becoming more prominent and argue against highly effective verification measures, in part because of fears of promoting proliferation by opening sensitive facilities to inspectors from potential proliferant states. As a result, it is likely that transparency and confidence-building measures will achieve greater prominence, both as supplements to and substitutes for traditional verification. Such measures are not panaceas and do not offer all that we came to expect from verification during the Cold War. But they may be the best possible means to deal with current problems of arms reductions and restraints at acceptable levels of expenditure.

  5. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-06-01

    Many papers have dealt with the political difficulties and ramifications of deep nuclear arms reductions and the issues of “Going to Zero”. Political issues include extended deterrence, conventional weapons, ballistic missile defense, and regional and geo-political security issues. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking beyond New START, the next step will likely include warhead limits in the neighborhood of 1000. Further reductions will include stepping stones at 1000 warheads, hundreds of warheads, and then tens of warheads before final elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1000, hundreds, and tens. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain-of-custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national laboratory complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  6. Nuclear Data Verification and Standardization

    SciTech Connect

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs, which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear-based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is the critical evaluation of neutron interaction data standards, including international coordination. Testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  7. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve the scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce the analysis cost of the current version. Regression verification addresses the problem of proving the equivalence of closely related programs.
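    A toy sketch of the impacted/unimpacted idea (an invented illustration, not the authors' tool): only inputs that reach a changed branch need an equivalence check, and a finite bound stands in for the symbolic-execution depth bound.

```python
# Invented example: two versions of a function where a static diff shows only
# the x >= 0 branch changed. Equivalence is checked only for the impacted
# inputs, within a finite bound standing in for the symbolic depth bound.

def v1(x):
    if x < 0:
        return -x       # unchanged branch
    return 2 * x        # original code

def v2(x):
    if x < 0:
        return -x       # unchanged branch
    return x + x        # syntactically different rewrite

def impacted(x):
    # Only inputs reaching the changed branch need an equivalence check.
    return x >= 0

BOUND = 1000
mismatches = [x for x in range(-BOUND, BOUND) if impacted(x) and v1(x) != v2(x)]
print(mismatches == [])  # True: impacted behaviors are equivalent up to the bound
```

    In the real technique the impacted behaviors come from change-impact analysis plus symbolic execution, and equivalence of the summaries is discharged by a decision procedure rather than by enumeration.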

  8. The Targon FN System for the Management of Intracapsular Neck of Femur Fractures: Minimum 2-Year Experience and Outcome in an Independent Hospital

    PubMed Central

    Tissingh, Elizabeth; Wartenberg, Kakra; Aggarwal, Saurabh; Ismail, Fikry; Orakwe, Sam; Khan, Farid

    2015-01-01

    Background The Targon FN implant was developed in 2007 to treat intracapsular neck of femur fractures. Early results from the design centre have been good in terms of fracture complications. We wished to see if these results could be reproduced in an independent institution. Methods The records of consecutive patients treated with this implant between 2008 and 2011 at Queen Elizabeth Hospital were identified and collected for this study. Operations were performed by all grades of surgeons under supervision as appropriate. These patients went on to have both clinical and radiological assessment of fracture healing and function. Results Fifty-one patients were identified, with 43 patients available for final follow-up. The average age was 66 years with a minimum follow-up of 24 months. A non-union rate of 0% in the undisplaced fracture group and 1 in 12 (8%) in the displaced fracture group was observed. An avascular necrosis rate of 6% and 8% was observed for undisplaced and displaced fracture types, respectively. No significant change from premorbid to postoperative ambulation was observed, and there were no wound complications. Conclusions Our study shows results similar to those of the design centre, which are superior to those currently found in the literature for the more traditional fixation methods. It also shows that the promising results with this new implant seen at the design institution can be reproduced by all cadres of surgeons in non-specialist practice. PMID:25729515

  9. Independent Schools - Independent Thinking - Independent Art: Testing Assumptions.

    ERIC Educational Resources Information Center

    Carnes, Virginia

    This study consists of a review of selected educational reform issues from the past 10 years that deal with changing attitudes towards art and art instruction in the context of independent private sector schools. The major focus of the study is in visual arts and examines various programs and initiatives with an art focus. Programs include…

  10. Realistic weather simulations and forecast verification with COSMO-EULAG

    NASA Astrophysics Data System (ADS)

    Wójcik, Damian; Piotrowski, Zbigniew; Rosa, Bogdan; Ziemiański, Michał

    2015-04-01

    Research conducted at the Polish Institute of Meteorology and Water Management, National Research Institute, in collaboration with the Consortium for Small Scale Modeling (COSMO) resulted in the development of a new prototype model, COSMO-EULAG. The dynamical core of the new model is based on an anelastic set of equations and numerics adopted from the EULAG model. The core is coupled, to first-order accuracy, to the COSMO physical parameterizations involving turbulence, friction, radiation, moist processes and surface fluxes. The tool is capable of computing weather forecasts in mountainous areas for horizontal resolutions ranging from 2.2 km to 0.1 km and with slopes reaching 82 degrees of inclination. Employing EULAG allows the model to benefit from its desirable conservative properties and numerical robustness, confirmed in a number of benchmark tests and widely documented in the scientific literature. In this study we show a realistic case study of Alpine summer convection simulated by COSMO-EULAG. It compares the convection-permitting realization of the flow using a 2.2 km horizontal grid size, typical of contemporary very-high-resolution regional NWP forecasts, with an LES-type realization using a grid size of 100 m. The study presents a comparison of flow, cloud and precipitation structure together with the reference results of a standard compressible COSMO Runge-Kutta model forecast at 2.2 km horizontal resolution. The case study results are supplemented by COSMO-EULAG forecast verification results for an Alpine domain at 2.2 km horizontal resolution. Wind, temperature, cloud, humidity and precipitation scores are presented. The verification period covers one summer month (June 2013) and one autumn month (November 2013). Verification is based on data collected by a network of approximately 200 stations (surface data verification) and 6 stations (upper-air verification) located in the Alps and their vicinity.

  11. IN PURSUIT OF AN INTERNATIONAL APPROACH TO QUALITY ASSURANCE FOR ENVIRONMENTAL TECHNOLOGY VERIFICATION

    EPA Science Inventory

    In the mid-1990's, the USEPA began the Environmental Technology Verification (ETV) Program in order to provide purchasers of environmental technology with independently acquired, quality-assured, test data, upon which to base their purchasing decisions. From the beginning, a str...

  12. The Evolution of Improved Baghouse Filter Media as Observed in the Environmental Technology Verification Program

    EPA Science Inventory

    The U.S. EPA implemented the Environmental Technology Verification (ETV) program in 1995 to generate independent and credible data on the performance of innovative technologies that have the potential to improve protection of public health and the environment. Results are publicl...

  13. Gender verification in competitive sports.

    PubMed

    Simpson, J L; Ljungqvist, A; de la Chapelle, A; Ferguson-Smith, M A; Genel, M; Carlson, A S; Ehrhardt, A A; Ferris, E

    1993-11-01

    The possibility that men might masquerade as women and be unfair competitors in women's sports is accepted as outrageous by athletes and the public alike. Since the 1930s, media reports have fuelled claims that individuals who once competed as female athletes subsequently appeared to be men. In most of these cases there was probably ambiguity of the external genitalia, possibly as a result of male pseudohermaphroditism. Nonetheless, beginning at the Rome Olympic Games in 1960, the International Amateur Athletics Federation (IAAF) began establishing rules of eligibility for women athletes. Initially, physical examination was used as a method for gender verification, but this plan was widely resented. Thus, sex chromatin testing (buccal smear) was introduced at the Mexico City Olympic Games in 1968. The principle was that genetic females (46,XX) show a single X-chromatic mass, whereas males (46,XY) do not. Unfortunately, sex chromatin analysis fell out of common diagnostic use by geneticists shortly after the International Olympic Committee (IOC) began its implementation for gender verification. The lack of laboratories routinely performing the test aggravated the problem of errors in interpretation by inexperienced workers, yielding false-positive and false-negative results. However, an even greater problem is that there exist phenotypic females with male sex chromatin patterns (e.g. androgen insensitivity, XY gonadal dysgenesis). These individuals have no athletic advantage as a result of their congenital abnormality and reasonably should not be excluded from competition. That is, only the chromosomal (genetic) sex is analysed by sex chromatin testing, not the anatomical or psychosocial status. For all the above reasons sex chromatin testing unfairly excludes many athletes. Although the IOC offered follow-up physical examinations that could have restored eligibility for those 'failing' sex chromatin tests, most affected athletes seemed to prefer to 'retire'. All

  14. Subsurface barrier verification technologies, informal report

    SciTech Connect

    Heiser, J.H.

    1994-06-01

    One of the more promising remediation options available to the DOE waste management community is subsurface barriers. Uses of subsurface barriers include surrounding and/or containing buried waste, providing secondary confinement of underground storage tanks, directing or containing subsurface contaminant plumes, and restricting remediation methods, such as vacuum extraction, to a limited area. To be most effective the barriers should be continuous and, depending on use, have few or no breaches. A breach may form through numerous pathways, including discontinuous grout application, joints between panels, and cracking due to grout curing or wet-dry cycling. The ability to verify barrier integrity is valuable to the DOE, EPA, and commercial sector and will be required to gain full public acceptance of subsurface barriers as either primary or secondary confinement at waste sites. It is recognized that no suitable method exists for the verification of an emplaced barrier's integrity. The large size and deep placement of subsurface barriers make detection of leaks challenging. This becomes magnified if the permissible leakage from the site is low. Detection of small cracks (fractions of an inch) at depths of 100 feet or more has not been possible using existing surface geophysical techniques. Compounding the problem of locating flaws in a barrier is the fact that no placement technology can guarantee the completeness or integrity of the emplaced barrier. This report summarizes several commonly used or promising technologies that have been or may be applied to in-situ barrier continuity verification.

  15. Development of Independent-type Optical CT

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Tatsushi; Shiozawa, Daigoro; Rokunohe, Toshiaki; Kida, Junzo; Zhang, Wei

    Optical current transformers (optical CTs) can be made much smaller and lighter than conventional electromagnetic induction transformers because of their simple structure, and they contribute to improved equipment reliability because of their excellent surge-resistance performance. The authors consider optical CTs to be next-generation transformers and are conducting research and development of optical CTs aimed at measurement and protection applications in electric power systems. Specifically, we developed an independent-type optical CT by drawing on basic data on optical CTs accumulated for large-current characteristics, temperature characteristics, vibration-resistance characteristics, and so on. In performance verification, type tests complying with IEC standards, such as short-time current tests, insulation tests, and accuracy tests, showed good results. This report describes the basic principle and configuration of optical CTs. Then, as basic characteristics of optical CTs, the conditions and results of verification tests for the dielectric breakdown characteristics of sensor fibers, large-current characteristics, temperature characteristics, and vibration-resistance characteristics are described. Finally, the development outline of the independent-type optical CT, aimed at application to all-digital substations, and its type test results are described.

  16. Survey Matches Newspapers with Criteria of Independence.

    ERIC Educational Resources Information Center

    Kopenhaver, Lillian Lodge

    1989-01-01

    Reports representative survey responses from 51 2-year college newspapers and 224 university newspapers regarding publication boards, advisers, general managers, publishers, and finances. Discusses the extent to which the newspapers are and are perceived to be independent. (DMM)

  17. Verification of Sulfate Attack Penetration Rates for Saltstone Disposal Unit Modeling

    SciTech Connect

    Flach, G. P.

    2015-05-12

    Recent Special Analysis modeling of Saltstone Disposal Units considers sulfate attack on concrete and utilizes degradation rates estimated from Cementitious Barriers Partnership software simulations. This study provides an independent verification of those simulation results using an alternative analysis method and an independent characterization data source. The sulfate penetration depths estimated herein are similar to the best-estimate values in SRNL-STI-2013-00118 Rev. 2 and well below the nominal values subsequently used to define the Saltstone Special Analysis base cases.

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION: GENERIC VERIFICATION PROTOCOL FOR BIOLOGICAL AND AEROSOL TESTING OF GENERAL VENTILATION AIR CLEANERS

    EPA Science Inventory

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  19. PERFORMANCE VERIFICATION OF STORMWATER TREATMENT DEVICES UNDER EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program's goal is to further environmental protection by a...

  20. Block 2 SRM conceptual design studies. Volume 1, Book 2: Preliminary development and verification plan

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Activities that will be conducted in support of the development and verification of the Block 2 Solid Rocket Motor (SRM) are described. Development includes design, fabrication, processing, and testing activities in which the results are fed back into the project. Verification includes analytical and test activities which demonstrate SRM component/subassembly/assembly capability to perform its intended function. The management organization responsible for formulating and implementing the verification program is introduced. It also identifies the controls which will monitor and track the verification program. Integral with the design and certification of the SRM are other pieces of equipment used in transportation, handling, and testing which influence the reliability and maintainability of the SRM configuration. The certification of this equipment is also discussed.

  1. Magnetic cleanliness verification approach on tethered satellite

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Braghin, Massimo; Grande, Maurizio

    1990-01-01

    Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.

  2. Hailstorms over Switzerland: Verification of Crowd-sourced Data

    NASA Astrophysics Data System (ADS)

    Noti, Pascal-Andreas; Martynov, Andrey; Hering, Alessandro; Martius, Olivia

    2016-04-01

    The reports of smartphone users witnessing hailstorms can be used as a source of independent, ground-based observation data on ground-reaching hailstorms with high temporal and spatial resolution. The presented work focuses on the verification of crowd-sourced data collected over Switzerland with the help of a smartphone application recently developed by MeteoSwiss. The precise location and time of hail precipitation and the hailstone size are included in the crowd-sourced data, assessed on the basis of the weather radar data of MeteoSwiss. Two radar-based hail detection algorithms in use at MeteoSwiss, POH (Probability of Hail) and MESHS (Maximum Expected Severe Hail Size), are confronted with the crowd-sourced data. The available data and the investigation period span June to August 2015. Filter criteria have been applied in order to remove false reports from the crowd-sourced data. Neighborhood methods have been introduced to reduce the uncertainties that result from spatial and temporal biases. The crowd-sourced and radar data are converted into binary sequences according to previously set thresholds, allowing a categorical verification to be applied. Verification scores (e.g. hit rate) are then calculated from a 2x2 contingency table. The hail reporting activity and the patterns corresponding to "hail" and "no hail" reports sent from smartphones have been analyzed. The relationship between the reported hailstone sizes and both radar-based hail detection algorithms has been investigated.
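    The categorical verification step described above can be sketched in a few lines. The binary sequences below are made-up illustration data (not MeteoSwiss observations); the scores follow the standard 2x2 contingency-table definitions.

```python
# Made-up binary sequences for illustration (1 = hail, 0 = no hail); in the
# study these come from thresholded crowd reports and radar algorithm output.
reports = [1, 0, 1, 1, 0, 0, 1, 0]   # crowd-sourced observations
radar   = [1, 0, 0, 1, 0, 1, 1, 0]   # radar-based detection (e.g. POH above threshold)

# 2x2 contingency table
hits         = sum(1 for o, f in zip(reports, radar) if o and f)
misses       = sum(1 for o, f in zip(reports, radar) if o and not f)
false_alarms = sum(1 for o, f in zip(reports, radar) if not o and f)
corr_negs    = sum(1 for o, f in zip(reports, radar) if not o and not f)

hit_rate = hits / (hits + misses)               # probability of detection
far      = false_alarms / (hits + false_alarms) # false alarm ratio
print(hit_rate, far)  # 0.75 0.25
```

    The same table also yields other common scores, e.g. the critical success index hits / (hits + misses + false_alarms).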

  3. Verification and validation of COBRA-SFS transient analysis capability

    SciTech Connect

    Rector, D.R.; Michener, T.E.; Cuta, J.M.

    1998-05-01

    This report provides documentation of the verification and validation testing of the transient capability in the COBRA-SFS code, and is organized into three main sections. The primary documentation of the code was published in September 1995, with the release of COBRA-SFS, Cycle 2. The validation and verification supporting the release and licensing of COBRA-SFS was based solely on steady-state applications, even though the appropriate transient terms have been included in the conservation equations from the first cycle. Section 2.0, COBRA-SFS Code Description, presents a capsule description of the code, and a summary of the conservation equations solved to obtain the flow and temperature fields within a cask or assembly model. This section repeats in abbreviated form the code description presented in the primary documentation (Michener et al. 1995), and is meant to serve as a quick reference, rather than independent documentation of all code features and capabilities. Section 3.0, Transient Capability Verification, presents a set of comparisons between code calculations and analytical solutions for selected heat transfer and fluid flow problems. Section 4.0, Transient Capability Validation, presents comparisons between code calculations and experimental data obtained in spent fuel storage cask tests. Based on the comparisons presented in Sections 3.0 and 4.0, conclusions and recommendations for application of COBRA-SFS to transient analysis are presented in Section 5.0.

  4. Verification survey report of the south waste tank farm training/test tower and hazardous waste storage lockers at the West Valley demonstration project, West Valley, New York

    SciTech Connect

    Weaver, Phyllis C.

    2012-08-29

    A team from ORAU's Independent Environmental Assessment and Verification Program performed verification survey activities on the South Test Tower and four Hazardous Waste Storage Lockers. Scan data collected by ORAU determined that both the alpha and alpha-plus-beta activity levels were representative of radiological background conditions. The count rate distribution showed no outliers that would be indicative of alpha or alpha-plus-beta count rates in excess of background. It is the opinion of ORAU that the independent verification data collected support the site's conclusions that the South Tower and Lockers sufficiently meet the site criteria for release for recycle and reuse.

  5. Verification and Implementation of Operations Safety Controls for Flight Missions

    NASA Technical Reports Server (NTRS)

    Jones, Cheryl L.; Smalls, James R.; Carrier, Alicia S.

    2010-01-01

    Approximately eleven years ago, the International Space Station launched its first module from Russia, the Functional Cargo Block (FGB). Safety and Mission Assurance (S&MA) Operations (Ops) Engineers played an integral part in that endeavor by executing strict flight product verification as well as continued staffing of S&MA's console in the Mission Evaluation Room (MER) for that flight mission. How were these engineers able to conduct such a complicated task? They did so through product verification that consisted of ensuring that safety requirements were adequately contained in all flight products affecting crew safety. S&MA Ops engineers apply both systems engineering and project management principles in order to gain an appropriate level of technical knowledge to perform thorough reviews covering the affected subsystem(s). They also ensured that mission priorities were carried out with great detail and success.

  6. Formal Verification of Air Traffic Conflict Prevention Bands Algorithms

    NASA Technical Reports Server (NTRS)

    Narkawicz, Anthony J.; Munoz, Cesar A.; Dowek, Gilles

    2010-01-01

    In air traffic management, a pairwise conflict is a predicted loss of separation between two aircraft, referred to as the ownship and the intruder. A conflict prevention bands system computes ranges of maneuvers for the ownship that characterize regions in the airspace that are either conflict-free or 'don't go' zones that the ownship has to avoid. Conflict prevention bands are surprisingly difficult to define and analyze. Errors in the calculation of prevention bands may result in incorrect separation assurance information being displayed to pilots or air traffic controllers. This paper presents provably correct 3-dimensional prevention bands algorithms for ranges of track angle, ground speed, and vertical speed maneuvers. The algorithms have been mechanically verified in the Prototype Verification System (PVS). The verification presented in this paper extends in a non-trivial way that of previously published 2-dimensional algorithms.
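    As a rough illustration of what a track-angle prevention band computes (a simplified 2-D sketch with invented parameters, not the verified PVS algorithms), one can predict the horizontal separation for each candidate track angle over a lookahead window and mark angles that violate a minimum-separation radius as 'don't go':

```python
import math

# Simplified 2-D sketch with invented parameters (separation radius, speeds,
# intruder state); real prevention-bands algorithms are analytic and verified.
SEP = 5.0          # required horizontal separation (nmi), assumed
LOOKAHEAD = 300    # lookahead window (s), assumed
OWN_SPEED = 0.12   # ownship ground speed (nmi/s), assumed

intruder_pos = (20.0, 0.0)    # intruder position relative to ownship (nmi)
intruder_vel = (-0.10, 0.0)   # intruder velocity (nmi/s): head-on from the east

def min_separation(track_deg):
    """Minimum predicted distance to the intruder over the lookahead window."""
    vx = OWN_SPEED * math.sin(math.radians(track_deg))  # track measured from north
    vy = OWN_SPEED * math.cos(math.radians(track_deg))
    return min(
        math.hypot(intruder_pos[0] + t * (intruder_vel[0] - vx),
                   intruder_pos[1] + t * (intruder_vel[1] - vy))
        for t in range(LOOKAHEAD + 1)
    )

# 'Don't go' band: candidate track angles whose predicted separation is too small.
dont_go = [a for a in range(0, 360, 5) if min_separation(a) < SEP]
print(dont_go)
```

    With these assumed values the 'don't go' band clusters around the 90-degree track pointing at the intruder; a verified algorithm computes such band boundaries analytically rather than by sampling.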

  7. Automatic verification methods for finite state systems

    SciTech Connect

    Sifakis, J.

    1990-01-01

    This volume contains the proceedings of a workshop devoted to the verification of finite state systems. The workshop focused on the development and use of methods, tools and theories for automatic verification of finite state systems. The goal at the workshop was to compare verification methods and tools to assist the applications designer. The papers review verification techniques for finite state systems and evaluate their relative advantages. The techniques considered cover various specification formalisms such as process algebras, automata and logics. Most of the papers focus on exploitation of existing results in three application areas: hardware design, communication protocols and real-time systems.

  8. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
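    The Horn-clause encoding of verification conditions can be illustrated on a toy loop (an invented example, not SeaHorn's actual encoding). For the program `x = 0; while x < 10: x += 1` with safety property `x <= 10`, the three Horn clauses are Inv(0); Inv(x) and x < 10 implies Inv(x+1); and Inv(x) implies x <= 10. A candidate invariant can be checked against each clause over a finite domain:

```python
# Toy check of Horn-clause verification conditions (illustrative example only).
# Candidate invariant for the loop `x = 0; while x < 10: x += 1`:
inv = lambda x: 0 <= x <= 10

DOMAIN = range(-5, 20)  # finite stand-in for the integers

init_ok   = inv(0)                                                # Inv(0)
induct_ok = all(inv(x + 1) for x in DOMAIN if inv(x) and x < 10)  # Inv(x) & x<10 -> Inv(x+1)
safe_ok   = all(x <= 10 for x in DOMAIN if inv(x))                # Inv(x) -> x <= 10

print(init_ok and induct_ok and safe_ok)  # True: the invariant discharges all three clauses
```

    A Horn-clause solver searches for such an invariant automatically; here we only confirm that a given candidate satisfies each clause.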

  9. Input apparatus for dynamic signature verification systems

    DOEpatents

    EerNisse, Errol P.; Land, Cecil E.; Snelling, Jay B.

    1978-01-01

    The disclosure relates to signature verification input apparatus comprising a writing instrument and platen containing piezoelectric transducers which generate signals in response to writing pressures.

  10. 28 CFR 603.1 - Jurisdiction of the Independent Counsel

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... JURISDICTION OF THE INDEPENDENT COUNSEL: IN RE MADISON GUARANTY SAVINGS & LOAN ASSOCIATION § 603.1 Jurisdiction of the Independent Counsel (a) The Independent Counsel: In re Madison Guaranty Savings & Loan... Corporation; or (3) Capital Management Services. (b) The Independent Counsel: In re Madison Guaranty...

  11. 28 CFR 603.1 - Jurisdiction of the Independent Counsel

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... JURISDICTION OF THE INDEPENDENT COUNSEL: IN RE MADISON GUARANTY SAVINGS & LOAN ASSOCIATION § 603.1 Jurisdiction of the Independent Counsel (a) The Independent Counsel: In re Madison Guaranty Savings & Loan... Corporation; or (3) Capital Management Services. (b) The Independent Counsel: In re Madison Guaranty...

  12. Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important

  13. Field analytical technology verification: The ETV Site Characterization Program

    SciTech Connect

    Einfeld, W.; Jenkins, R.A.; Dindal, A.B.

    1998-06-01

    Innovative field characterization and monitoring technologies are often slow to be adopted by the environmental engineering/consulting community because of concerns that their performance has not been proven by an independent testing body, and/or they have not received the EPA's blessing on a regional or national level. The purpose of the EPA Environmental Technology Verification (ETV) Site Characterization Pilot, a joint effort between EPA and DOE, is to accelerate the acceptance of technologies that reduce the cost and increase the speed of environmental clean-up and monitoring. Technology verifications that have been completed or are underway include: in situ technologies for the characterization of sub-surface hydrocarbon plumes, field-portable GC/MS systems, field-portable X-ray fluorescence analyzers, soil sampling technologies, field-portable PCB analyzers, analyzers for VOC analysis at the wellhead, and decision support software systems to aid site sample collection and contaminant plume definition. The verification process follows a somewhat generic pathway. A user-community need is identified, the vendor community is canvassed, and relevant, interested companies are selected. A demonstration plan is prepared by the verification organization and circulated to participants prior to the field activities. Field trials are normally held at two geologically or environmentally different sites and typically require one week at each site. Samples (soil, soil gas, water, surface wipe, etc.) provided to the vendor at the demonstration include site-specific samples and standards or performance evaluation samples. Sample splits are sent to a pre-selected laboratory for analysis using a reference method. Laboratory data are used for comparison with field technology results during the data analysis phase of the demonstration.

  14. Conformance Verification of Privacy Policies

    NASA Astrophysics Data System (ADS)

    Fu, Xiang

    Web applications are both the consumers and providers of information. To increase customer confidence, many websites choose to publish their privacy protection policies. However, policy conformance is often neglected. We propose a logic based framework for formally specifying and reasoning about the implementation of privacy protection by a web application. A first order extension of computation tree logic is used to specify a policy. A verification paradigm, built upon a static control/data flow analysis, is presented to verify if a policy is satisfied.
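    The paper's framework rests on a first-order extension of computation tree logic over a static control/data flow analysis. A much-simplified illustration of the underlying idea, checking that no execution path releases personal data before a consent step, can be phrased as plain graph reachability. All node names and the toy control-flow graph below are hypothetical, not from the paper:

```python
from collections import deque

# Hypothetical control-flow graph of a web application: node -> successors.
cfg = {
    "entry": ["collect_email", "show_page"],
    "collect_email": ["ask_consent", "send_to_partner"],  # bad edge: release before consent
    "ask_consent": ["send_to_partner"],
    "show_page": [],
    "send_to_partner": [],
}

def violates_policy(cfg, start, sensitive, guard):
    """Return True if some path reaches `sensitive` without passing through `guard`."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        if node == sensitive:
            return True            # reached the data release without consent
        if node == guard:
            continue               # paths through the guard are compliant; stop here
        for nxt in cfg.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

print(violates_policy(cfg, "entry", "send_to_partner", "ask_consent"))  # True: bad edge exists
```

    A real conformance checker would track data values and temporal operators, not just reachability, but the verdict has the same shape: either every path satisfies the policy, or a counterexample path is reported.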

  15. Why do verification and validation?

    DOE PAGES Beta

    Hu, Kenneth T.; Paez, Thomas L.

    2016-02-19

    In this discussion paper, we explore different ways to assess the value of verification and validation (V&V) of engineering models. We first present a literature review on the value of V&V and then use value chains and decision trees to show how value can be assessed from a decision maker's perspective. In this context, the value is what the decision maker is willing to pay for V&V analysis with the understanding that the V&V results are uncertain. Finally, the 2014 Sandia V&V Challenge Workshop is used to illustrate these ideas.

  16. Science verification results from PMAS

    NASA Astrophysics Data System (ADS)

    Roth, M. M.; Becker, T.; Böhm, P.; Kelz, A.

    2004-02-01

    PMAS, the Potsdam Multi-Aperture Spectrophotometer, is a new integral field instrument which was commissioned at the Calar Alto 3.5m Telescope in May 2001. We report on results obtained from a science verification run in October 2001. We present observations of the low-metallicity blue compact dwarf galaxy SBS0335-052, the ultra-luminous X-ray Source X-1 in the Holmberg II galaxy, the quadruple gravitational lens system Q2237+0305 (the ``Einstein Cross''), the Galactic planetary nebula NGC7027, and extragalactic planetary nebulae in M31. PMAS is now available as a common user instrument at Calar Alto Observatory.

  17. Structural dynamics verification facility study

    NASA Technical Reports Server (NTRS)

    Kiraly, L. J.; Hirchbein, M. S.; Mcaleese, J. M.; Fleming, D. P.

    1981-01-01

    The need for a structural dynamics verification facility to support structures programs was studied. Most of the industry operated facilities are used for highly focused research, component development, and problem solving, and are not used for the generic understanding of the coupled dynamic response of major engine subsystems. Capabilities for the proposed facility include: the ability to both excite and measure coupled structural dynamic response of elastic blades on elastic shafting, the mechanical simulation of various dynamical loadings representative of those seen in operating engines, and the measurement of engine dynamic deflections and interface forces caused by alternative engine mounting configurations and compliances.

  18. Peer Support for Achieving Independence in Diabetes (Peer-AID): Design, methods and baseline characteristics of a randomized controlled trial of community health worker assisted diabetes self-management support

    PubMed Central

    Nelson, Karin; Drain, Nathan; Robinson, June; Kapp, Janet; Hebert, Paul; Taylor, Leslie; Silverman, Julie; Kiefer, Meghan; Lessler, Dan; Krieger, James

    2014-01-01

    Background & Objectives: Community health workers (CHWs) may be an important mechanism to provide diabetes self-management to disadvantaged populations. We describe the design and baseline results of a trial evaluating a home-based CHW intervention. Methods & Research Design: Peer Support for Achieving Independence in Diabetes (Peer-AID) is a randomized, controlled trial evaluating a home-based CHW-delivered diabetes self-management intervention versus usual care. The study recruited participants from 3 health systems. Change in A1c measured at 12 months is the primary outcome. Changes in blood pressure, lipids, health care utilization, health-related quality of life, self-efficacy and diabetes self-management behaviors at 12 months are secondary outcomes. Results: A total of 1,438 patients were identified by medical record review as potentially eligible, 445 patients were screened by telephone for eligibility and 287 were randomized. Groups were comparable at baseline on socio-demographic and clinical characteristics. All participants were low-income and were from diverse racial and ethnic backgrounds. The mean A1c was 8.9%, mean BMI was above the obese range, and non-adherence to diabetes medications was high. The cohort had high rates of co-morbid disease and low self-reported health status. Although one-third reported no health insurance, the mean number of visits to a physician in the past year was 5.7. Trial results are pending. Conclusions: Peer-AID recruited and enrolled a diverse group of low income participants with poorly controlled type 2 diabetes and delivered a home-based diabetes self-management program. If effective, replication of the Peer-AID intervention in community based settings could contribute to improved control of diabetes in vulnerable populations. PMID:24956324

  19. Coherent lidar design and performance verification

    NASA Technical Reports Server (NTRS)

    Frehlich, Rod

    1993-01-01

    The verification of LAWS beam alignment in space can be achieved by a measurement of heterodyne efficiency using the surface return. The crucial element is a direct detection signal that can be identified for each surface return. This should be satisfied for LAWS but will not be satisfied for descoped LAWS. The performance of algorithms for velocity estimation can be described with two basic parameters: the number of coherently detected photo-electrons per estimate and the number of independent signal samples per estimate. The average error of spectral domain velocity estimation algorithms is bounded by a new periodogram Cramer-Rao Bound. Comparison of the periodogram CRB with the exact CRB indicates a factor of two improvement in velocity accuracy is possible using non-spectral domain estimators. This improvement has been demonstrated with a maximum-likelihood estimator. The comparison of velocity estimation algorithms for 2 and 10 micron coherent lidar was performed by assuming all the system design parameters are fixed and the signal statistics are dominated by a 1 m/s rms wind fluctuation over the range gate. The beam alignment requirements for 2 micron are much more severe than for a 10 micron lidar. The effects of the random backscattered field on estimating the alignment error are a major problem for space based lidar operation, especially if the heterodyne efficiency cannot be estimated. For LAWS, the biggest science payoff would result from a short transmitted pulse, on the order of 0.5 microseconds instead of 3 microseconds. The numerical errors for simulation of laser propagation in the atmosphere have been determined as a joint project with the University of California, San Diego. Useful scaling laws were obtained for Kolmogorov atmospheric refractive turbulence and an atmospheric refractive turbulence characterized with an inner scale. This permits verification of the simulation procedure which is essential for the evaluation of the effects of

  20. GENERIC VERIFICATION PROTOCOL FOR THE VERIFICATION OF PESTICIDE SPRAY DRIFT REDUCTION TECHNOLOGIES FOR ROW AND FIELD CROPS

    EPA Science Inventory

    This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...

  1. Runtime Verification with State Estimation

    NASA Technical Reports Server (NTRS)

    Stoller, Scott D.; Bartocci, Ezio; Seyster, Justin; Grosu, Radu; Havelund, Klaus; Smolka, Scott A.; Zadok, Erez

    2011-01-01

    We introduce the concept of Runtime Verification with State Estimation and show how this concept can be applied to estimate the probability that a temporal property is satisfied by a run of a program when monitoring overhead is reduced by sampling. In such situations, there may be gaps in the observed program executions, thus making accurate estimation challenging. To deal with the effects of sampling on runtime verification, we view event sequences as observation sequences of a Hidden Markov Model (HMM), use an HMM model of the monitored program to "fill in" sampling-induced gaps in observation sequences, and extend the classic forward algorithm for HMM state estimation (which determines the probability of a state sequence, given an observation sequence) to compute the probability that the property is satisfied by an execution of the program. To validate our approach, we present a case study based on the mission software for a Mars rover. The results of our case study demonstrate high prediction accuracy for the probabilities computed by our algorithm. They also show that our technique is much more accurate than simply evaluating the temporal property on the given observation sequences, ignoring the gaps.
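    The classic forward algorithm that the paper extends can be sketched in a few lines. This is a generic textbook version on a toy two-state HMM; the model numbers are illustrative and have no connection to the Mars rover case study:

```python
# Classic HMM forward algorithm: probability of an observation sequence,
# summed over all hidden state paths. Toy 2-state, 2-symbol model.
states = [0, 1]
start = [0.6, 0.4]                      # initial state distribution
trans = [[0.7, 0.3], [0.4, 0.6]]        # trans[i][j] = P(next=j | current=i)
emit = [[0.9, 0.1], [0.2, 0.8]]         # emit[i][o]  = P(obs=o | state=i)

def forward(obs):
    """Return P(obs sequence) under the model via the forward recursion."""
    # alpha[j] = P(obs so far, current state = j)
    alpha = [start[s] * emit[s][obs[0]] for s in states]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * trans[i][j] for i in states) * emit[j][o]
                 for j in states]
    return sum(alpha)

print(forward([0, 1, 0]))
```

    The paper's extension runs a recursion of this shape while also tracking, per state, the probability that the monitored temporal property holds, so that sampling gaps are marginalized out rather than ignored.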

  2. RISKIND verification and benchmark comparisons

    SciTech Connect

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were also compared with results from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  3. Video-Based Fingerprint Verification

    PubMed Central

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-01-01

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, “inside similarity” and “outside similarity” are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low. PMID:24008283
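    The equal error rate (EER) used to report these results is the operating point where the false acceptance rate equals the false rejection rate. A minimal sketch of how it is computed from match scores, on tiny made-up score lists rather than any real fingerprint data:

```python
# Equal error rate: threshold at which false accept rate (impostor scores at
# or above threshold) equals false reject rate (genuine scores below it).
genuine = [0.9, 0.8, 0.75, 0.7, 0.4]    # match scores from genuine pairs (toy data)
impostor = [0.1, 0.2, 0.3, 0.45, 0.6]   # match scores from impostor pairs (toy data)

def far_frr(threshold):
    far = sum(s >= threshold for s in impostor) / len(impostor)
    frr = sum(s < threshold for s in genuine) / len(genuine)
    return far, frr

def eer():
    """Scan candidate thresholds; report the rate where |FAR - FRR| is smallest."""
    best = min((abs(far_frr(t)[0] - far_frr(t)[1]), t)
               for t in sorted(genuine + impostor))
    far, frr = far_frr(best[1])
    return (far + frr) / 2

print(eer())  # → 0.2
```

    Lowering the EER, as the video-based method does by 60 percent relative to single impressions, means both error rates can be driven down simultaneously at a suitable threshold.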

  4. The Ontogeny of the Verification System.

    ERIC Educational Resources Information Center

    Akiyama, M. Michael; Guillory, Andrea W.

    1983-01-01

    Young children found it difficult to verify negative statements, but found affirmative statements, affirmative questions, and negative questions equally easy to deal with. It is proposed that children acquire the answering system earlier than the verification system, and use answering to verify statements before acquiring the verification system.…

  5. The monitoring and verification of nuclear weapons

    SciTech Connect

    Garwin, Richard L.

    2014-05-09

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers.

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION AND INDOOR AIR

    EPA Science Inventory

    The paper discusses environmental technology verification and indoor air. RTI has responsibility for a pilot program for indoor air products as part of the U.S. EPA's Environmental Technology Verification (ETV) program. The program objective is to further the development of sel...

  7. 25 CFR 61.8 - Verification forms.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR TRIBAL GOVERNMENT PREPARATION OF ROLLS OF INDIANS § 61.8... enrollment, a verification form, to be completed and returned, shall be mailed to each previous enrollee using the last address of record. The verification form will be used to ascertain the previous...

  8. ENVIRONMENTAL TECHNOLOGY VERIFICATION FOR INDOOR AIR PRODUCTS

    EPA Science Inventory

    The paper discusses environmental technology verification (ETV) for indoor air products. RTI is developing the framework for a verification testing program for indoor air products, as part of EPA's ETV program. RTI is establishing test protocols for products that fit into three...

  9. Gender Verification of Female Olympic Athletes.

    ERIC Educational Resources Information Center

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  10. The monitoring and verification of nuclear weapons

    NASA Astrophysics Data System (ADS)

    Garwin, Richard L.

    2014-05-01

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers.

  11. 40 CFR 1066.220 - Linearity verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Linearity verification. 1066.220 Section 1066.220 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.220 Linearity verification. (a) Scope and frequency. Perform linearity...

  12. IMPROVING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATIONS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) began the Environmental Technology Verification (ETV) Program in 1995 as a means of working with the private sector to establish a market-based verification process available to all environmental technologies. Under EPA's Office of R...

  13. HTGR analytical methods and design verification

    SciTech Connect

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier.

  14. Students' Verification Strategies for Combinatorial Problems

    ERIC Educational Resources Information Center

    Mashiach Eizenberg, Michal; Zaslavsky, Orit

    2004-01-01

    We focus on a major difficulty in solving combinatorial problems, namely, on the verification of a solution. Our study aimed at identifying undergraduate students' tendencies to verify their solutions, and the verification strategies that they employ when solving these problems. In addition, an attempt was made to evaluate the level of efficiency…

  15. Verification testing of advanced environmental monitoring systems

    SciTech Connect

    Kelly, T.J.; Riggs, K.B.; Fuerst, R.G.

    1999-03-01

    This paper describes the Advanced Monitoring Systems (AMS) pilot project, one of 12 pilots comprising the US EPA's Environmental Technology Verification (ETV) program. The aim of ETV is to promote the acceptance of environmental technologies in the marketplace, through objective third-party verification of technology performance.

  16. Hierarchical Design and Verification for VLSI

    NASA Technical Reports Server (NTRS)

    Shostak, R. E.; Elliott, W. D.; Levitt, K. N.

    1983-01-01

    The specification and verification work is described in detail, and some of the problems and issues to be resolved in their application to Very Large Scale Integration (VLSI) systems are examined. The hierarchical design methodologies enable a system architect or design team to decompose a complex design into a formal hierarchy of levels of abstraction. The first step in program verification is tree formation. The next step after tree formation is the generation from the trees of the verification conditions themselves. The approach taken here is similar in spirit to the corresponding step in program verification but requires modeling of the semantics of circuit elements rather than program statements. The last step is that of proving the verification conditions using a mechanical theorem-prover.

  17. New method of verificating optical flat flatness

    NASA Astrophysics Data System (ADS)

    Sun, Hao; Li, Xueyuan; Han, Sen; Zhu, Jianrong; Guo, Zhenglai; Fu, Yuegang

    2014-11-01

    Optical flats are commonly used in optical testing instruments, and flatness is the most important of their form-error parameters. As a measurement criterion, the optical flat flatness (OFF) index needs good precision. Current measurement practice in China depends heavily on manual visual interpretation, characterizing flatness through discrete points; the efficiency and accuracy of this method cannot meet the demands of industrial development. To improve testing efficiency and measurement accuracy, it is necessary to develop an optical flat verification system that can obtain full-surface information rapidly and efficiently while complying with current national metrological verification procedures. This paper reviews the current optical flat verification method and solves the problems of previous tests by using a new method and its supporting software. Final results show that the new system improves verification efficiency and accuracy in comparison with the method of the JJG 28-2000 metrological verification procedure.
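    One common way to reduce full-surface data to a single flatness number is the peak-to-valley deviation of the measured points from a least-squares best-fit plane. The sketch below assumes that convention (the abstract does not state which definition the system uses) and uses a small made-up point grid:

```python
# Flatness as peak-to-valley deviation from a least-squares best-fit plane
# z = a + b*x + c*y. Points (x, y, height) are illustrative only.
pts = [(0, 0, 0.00), (1, 0, 0.02), (2, 0, 0.01),
       (0, 1, 0.01), (1, 1, 0.05), (2, 1, 0.02),
       (0, 2, 0.00), (1, 2, 0.02), (2, 2, 0.01)]

def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0] * 3
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def flatness(pts):
    # Normal equations for the least-squares fit z ~ a + b*x + c*y.
    n = len(pts)
    sx = sum(p[0] for p in pts); sy = sum(p[1] for p in pts); sz = sum(p[2] for p in pts)
    sxx = sum(p[0] ** 2 for p in pts); syy = sum(p[1] ** 2 for p in pts)
    sxy = sum(p[0] * p[1] for p in pts)
    sxz = sum(p[0] * p[2] for p in pts); syz = sum(p[1] * p[2] for p in pts)
    a, b, c = solve3([[n, sx, sy], [sx, sxx, sxy], [sy, sxy, syy]], [sz, sxz, syz])
    resid = [z - (a + b * x + c * y) for x, y, z in pts]
    return max(resid) - min(resid)   # peak-to-valley deviation

print(round(flatness(pts), 4))
```

    A real system would fit thousands of interferometrically measured points, but the reduction from a surface map to a single index has this shape.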

  18. Space transportation system payload interface verification

    NASA Technical Reports Server (NTRS)

    Everline, R. T.

    1977-01-01

    The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).

  19. The Effect of Mystery Shopper Reports on Age Verification for Tobacco Purchases

    PubMed Central

    KREVOR, BRAD S.; PONICKI, WILLIAM R.; GRUBE, JOEL W.; DeJONG, WILLIAM

    2011-01-01

    Mystery shops (MS) involving attempted tobacco purchases by young buyers have been employed to monitor retail stores’ performance in refusing underage sales. Anecdotal evidence suggests that MS visits with immediate feedback to store personnel can improve age verification. This study investigated the impact of monthly and twice-monthly MS reports on age verification. Forty-five Walgreens stores were each visited 20 times by mystery shoppers. The stores were randomly assigned to one of three conditions. Control group stores received no feedback, whereas two treatment groups received feedback communications every visit (twice monthly) or every second visit (monthly) after baseline. Logit regression models tested whether each treatment group improved verification rates relative to the control group. Post-baseline verification rates were higher in both treatment groups than in the control group, but only the stores receiving monthly communications had a significantly greater improvement than control group stores. Verification rates increased significantly during the study period for all three groups, with delayed improvement among control group stores. Communication between managers regarding the MS program may account for the delayed age-verification improvements observed in the control group stores. Encouraging inter-store communication might extend the benefits of MS programs beyond those stores that receive this intervention. PMID:21541874

  20. The effect of mystery shopper reports on age verification for tobacco purchases.

    PubMed

    Krevor, Brad S; Ponicki, William R; Grube, Joel W; DeJong, William

    2011-09-01

    Mystery shops involving attempted tobacco purchases by young buyers have been implemented in order to monitor retail stores' performance in refusing underage sales. Anecdotal evidence suggests that mystery shop visits with immediate feedback to store personnel can improve age verification. This study investigated the effect of monthly and twice-monthly mystery shop reports on age verification. Mystery shoppers visited 45 Walgreens stores 20 times. The stores were randomly assigned to 1 of 3 conditions. Control group stores received no feedback, whereas 2 treatment groups received feedback communications on every visit (twice monthly) or on every second visit (monthly) after baseline. Logit regression models tested whether each treatment group improved verification rates relative to the control group. Postbaseline verification rates were higher in both treatment groups than in the control group, but only the stores receiving monthly communications had a significantly greater improvement compared with the control group stores. Verification rates increased significantly during the study period for all 3 groups, with delayed improvement among control group stores. Communication between managers regarding the mystery shop program may account for the delayed age-verification improvements observed in the control group stores. Encouraging interstore communication might extend the benefits of mystery shop programs beyond those stores that receive this intervention. PMID:21541874
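    The study's logit regressions compare verification odds between treatment and control stores. The elementary quantity behind such a comparison is the odds ratio of a 2x2 table; the sketch below uses hypothetical counts, not the study's data:

```python
# Odds ratio comparing age-verification success between two groups of stores.
# Counts are hypothetical and for illustration only.
def odds_ratio(success_t, fail_t, success_c, fail_c):
    """Odds of success in the treatment group divided by odds in the control group."""
    return (success_t / fail_t) / (success_c / fail_c)

# e.g. monthly-feedback stores verified on 80 of 100 visits; control on 60 of 100
print(odds_ratio(80, 20, 60, 40))  # (80/20)/(60/40) ≈ 2.67
```

    A logit model estimates the logarithm of this ratio while adjusting for baseline rates and repeated visits to the same store, which is why the paper reports model-based rather than raw comparisons.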

  1. Monthly and seasonally verification of precipitation in Poland

    NASA Astrophysics Data System (ADS)

    Starosta, K.; Linkowska, J.

    2009-04-01

    The national meteorological service of Poland - the Institute of Meteorology and Water Management (IMWM) - joined COSMO, the Consortium for Small-Scale Modelling, in July 2004. In Poland, the COSMO_PL model version 3.5 ran until June 2007; since July 2007, model version 4.0 has been running. The model runs in an operational mode at 14-km grid spacing, twice a day (00 UTC, 12 UTC). For scientific research, a model with 7-km grid spacing is also run. Monthly and seasonal verification of the 24-hour (06 UTC - 06 UTC) accumulated precipitation is presented in this paper. The precipitation field of COSMO_LM was verified against a rain gauge network (308 points). The verification was made for every month and all seasons from December 2007 to December 2008, for three forecast days at selected thresholds: 0.5, 1, 2.5, 5, 10, 20, 25, 30 mm. The following indices from the contingency table were calculated: FBI (bias), POD (probability of detection), PON (probability of detection of non-events), FAR (false alarm rate), TSS (true skill statistic), HSS (Heidke skill score), ETS (equitable threat score). Percentile ranks and the ROC (relative operating characteristic) are also presented. The ROC is a graph of the hit rate (Y-axis) against the false alarm rate (X-axis) for different decision thresholds.
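    The indices listed in the abstract all derive from the same 2x2 contingency table of hits (a), false alarms (b), misses (c), and correct negatives (d). A sketch with the standard formulas, using illustrative counts (the variable names follow common verification usage and are not taken from the paper):

```python
# Categorical verification scores from a 2x2 contingency table:
#   a = hits, b = false alarms, c = misses, d = correct negatives.
def scores(a, b, c, d):
    n = a + b + c + d
    fbi = (a + b) / (a + c)                  # frequency bias (FBI)
    pod = a / (a + c)                        # probability of detection
    pon = d / (b + d)                        # probability of detecting non-events
    far = b / (a + b)                        # false alarm ratio (often called "rate")
    tss = pod + pon - 1                      # true skill statistic
    correct_random = ((a + b) * (a + c) + (b + d) * (c + d)) / n
    hss = (a + d - correct_random) / (n - correct_random)  # Heidke skill score
    a_random = (a + b) * (a + c) / n
    ets = (a - a_random) / (a + b + c - a_random)          # equitable threat score
    return {"FBI": fbi, "POD": pod, "PON": pon, "FAR": far,
            "TSS": tss, "HSS": hss, "ETS": ets}

print(scores(a=30, b=10, c=20, d=40))
```

    Each precipitation threshold (0.5 mm, 1 mm, and so on) yields its own table, so every index is reported per threshold and per forecast day.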

  3. Reactive system verification case study: Fault-tolerant transputer communication

    NASA Technical Reports Server (NTRS)

    Crane, D. Francis; Hamory, Philip J.

    1993-01-01

    A reactive program is one which engages in an ongoing interaction with its environment. A system which is controlled by an embedded reactive program is called a reactive system. Examples of reactive systems are aircraft flight management systems, bank automatic teller machine (ATM) networks, airline reservation systems, and computer operating systems. Reactive systems are often naturally modeled (for logical design purposes) as a composition of autonomous processes which progress concurrently and which communicate to share information and/or to coordinate activities. Formal (i.e., mathematical) frameworks for system verification are tools used to increase the users' confidence that a system design satisfies its specification. A framework for reactive system verification includes formal languages for system modeling and for behavior specification and decision procedures and/or proof-systems for verifying that the system model satisfies the system specifications. Using the Ostroff framework for reactive system verification, an approach to achieving fault-tolerant communication between transputers was shown to be effective. The key components of the design, the decoupler processes, may be viewed as discrete-event-controllers introduced to constrain system behavior such that system specifications are satisfied. The Ostroff framework was also effective. The expressiveness of the modeling language permitted construction of a faithful model of the transputer network. The relevant specifications were readily expressed in the specification language. The set of decision procedures provided was adequate to verify the specifications of interest. The need for improved support for system behavior visualization is emphasized.

  4. AUTOMATED, HIGHLY ACCURATE VERIFICATION OF RELAP5-3D

    SciTech Connect

    George L Mesina; David Aumiller; Francis Buschman

    2014-07-01

    Computer programs that analyze light water reactor safety solve complex systems of governing, closure, and special process equations to model the underlying physics. In addition, these programs incorporate many other features and are quite large. RELAP5-3D[1] has over 300,000 lines of code for physics, input, output, data management, user interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. Verification ensures that a program is built right by checking that it meets its design specifications. Recently, there has been increased emphasis on the development of automated verification processes that compare coding against its documented algorithms and equations and compare its calculations against analytical solutions and the method of manufactured solutions[2]. For the first time, the ability exists to ensure that the data transfer operations associated with timestep advancement/repeating and writing/reading a solution to a file have no unintended consequences. To ensure that the code performs as intended over its extensive list of applications, an automated and highly accurate verification method has been modified and applied to RELAP5-3D. Furthermore, mathematical analysis of the adequacy of the checks used in the comparisons is provided.
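    The method of manufactured solutions mentioned in the abstract can be illustrated on a problem far simpler than RELAP5-3D: pick an exact solution, derive the forcing term analytically, and confirm that the solver converges at its designed order. The sketch below applies the idea to a 1-D Poisson problem with second-order central differences; it illustrates the verification technique, not the paper's code:

```python
import math

def solve_poisson(n, f, ua, ub):
    """Central differences for -u'' = f on [0,1] with Dirichlet BCs u(0)=ua, u(1)=ub.
    Solves the (-1, 2, -1) tridiagonal system with the Thomas algorithm."""
    h = 1.0 / (n + 1)
    x = [(i + 1) * h for i in range(n)]
    b = [f(xi) * h * h for xi in x]
    b[0] += ua
    b[-1] += ub
    cp, bp = [0.0] * n, [0.0] * n
    cp[0], bp[0] = -0.5, b[0] / 2.0
    for i in range(1, n):
        m = 2.0 + cp[i - 1]
        cp[i] = -1.0 / m
        bp[i] = (b[i] + bp[i - 1]) / m
    u = [0.0] * n
    u[-1] = bp[-1]
    for i in range(n - 2, -1, -1):
        u[i] = bp[i] - cp[i] * u[i + 1]
    return x, u

u_exact = lambda x: math.sin(math.pi * x)            # manufactured solution
f = lambda x: math.pi ** 2 * math.sin(math.pi * x)   # forcing: f = -u_exact''

def max_error(n):
    x, u = solve_poisson(n, f, u_exact(0.0), u_exact(1.0))
    return max(abs(ui - u_exact(xi)) for xi, ui in zip(x, u))

# Refine the grid and check the observed order of accuracy (h1/h2 = 41/21).
e1, e2 = max_error(20), max_error(40)
order = math.log(e1 / e2) / math.log((40 + 1) / (20 + 1))
print(round(order, 1))  # observed order should be close to the design order, 2
```

    An automated verification harness runs checks of this kind across the code's option space on every build, which is the capability the abstract describes for RELAP5-3D.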

  5. Independence of Internal Auditors.

    ERIC Educational Resources Information Center

    Montondon, Lucille; Meixner, Wilda F.

    1993-01-01

    A survey of 288 college and university auditors investigated patterns in their appointment, reporting, and supervisory practices as indicators of independence and objectivity. Results indicate a weakness in the positioning of internal auditing within institutions, possibly compromising auditor independence. Because the auditing function is…

  6. American Independence. Fifth Grade.

    ERIC Educational Resources Information Center

    Crosby, Annette

    This fifth grade teaching unit covers early conflicts between the American colonies and Britain, battles of the American Revolutionary War, and the Declaration of Independence. Knowledge goals address the pre-revolutionary acts enforced by the British, the concepts of conflict and independence, and the major events and significant people from the…

  7. Fostering Musical Independence

    ERIC Educational Resources Information Center

    Shieh, Eric; Allsup, Randall Everett

    2016-01-01

    Musical independence has always been an essential aim of musical instruction. But this objective can refer to everything from high levels of musical expertise to more student choice in the classroom. While most conceptualizations of musical independence emphasize the demonstration of knowledge and skills within particular music traditions, this…

  8. Centering on Independent Study.

    ERIC Educational Resources Information Center

    Miller, Stephanie

    Independent study is an instructional approach that can have enormous power in the classroom. It can be used successfully with students at all ability levels, even though it is often associated with gifted students. Independent study is an opportunity for students to study a subject of their own choosing under the guidance of a teacher. The…

  9. Guidance and Control Software Project Data - Volume 3: Verification Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the verification documents from the GCS project. Volume 3 contains four appendices: A. Software Verification Cases and Procedures for the Guidance and Control Software Project; B. Software Verification Results for the Pluto Implementation of the Guidance and Control Software; C. Review Records for the Pluto Implementation of the Guidance and Control Software; and D. Test Results Logs for the Pluto Implementation of the Guidance and Control Software.

  10. How frequent are overactive bladder symptoms in women with urodynamic verification of an overactive bladder?

    PubMed Central

    Yeniel, Ahmet Özgür; Ergenoğlu, Mete Ahmet; Meseri, Reci; Aşkar, Niyazi; İtil, İsmail Mete

    2012-01-01

    Objective To determine the relationship between overactive bladder symptoms and urodynamic verification of overactive bladder. Material and Methods Between June 2011 and November 2011, 159 patients underwent urodynamics (UDS) at our urogynecology unit in the Ege University Hospital. Of these, 95 patients who complained of urgency, did not have any overt neurological diseases or bladder outlet obstruction, and did not take any medication affecting lower urinary tract function were evaluated. SPSS (ver. 15.0) was used to evaluate the data, and the chi-square test and the t test for independent samples were used for analysis. Results The mean age was 54.5±12 years. Frequency was the most common symptom in women with overactive bladder (OAB) (82.1%), followed by nocturia (57.8%) and urgency urinary incontinence (57.8%). The incidence of detrusor overactivity (DOA) was 38.9%. There was no significant relationship between the presence of DOA and OAB symptoms. Leakage during urodynamics was found in 46.3%, with no significant association with detrusor overactivity. Total bladder capacity was significantly lower in women who had DOA (p<0.001). Conclusion It appears that overactive bladder symptoms do not predict detrusor overactivity. Urodynamic investigation is not mandatory in the initial management of women with only OAB symptoms. PMID:24592016
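
    The chi-square analysis used in the study can be sketched for a 2x2 table. The counts below are hypothetical and are not taken from the paper; the sketch only shows the mechanics of testing symptom-vs-DOA association.

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 contingency table
    [[a, b], [c, d]] (no continuity correction)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts (NOT from the paper): urgency incontinence vs. DOA
stat = chi2_2x2(20, 35, 17, 23)

# Compare against the chi-square critical value at alpha = 0.05, 1 df
CRIT_05_DF1 = 3.841
print(stat, stat > CRIT_05_DF1)  # about 0.37, False: no significant association
```

    A statistic below the critical value, as here, is the kind of result behind the paper's finding that OAB symptoms and detrusor overactivity were not significantly associated.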

  11. Verification of NASA Emergent Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for the verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study with which to experiment with and test current formal methods for intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.

  12. Verification of FANTASTIC integrated code

    NASA Technical Reports Server (NTRS)

    Chauhan, Rajinder Singh

    1987-01-01

    FANTASTIC is an acronym for Failure Analysis Nonlinear Thermal and Structural Integrated Code. This program was developed by Failure Analysis Associates, Palo Alto, Calif., for MSFC to improve the accuracy of solid rocket motor nozzle analysis. FANTASTIC has three modules: FACT - thermochemical analysis; FAHT - heat transfer analysis; and FAST - structural analysis. All modules have keywords for data input. Work is in progress on the verification of the FAHT module, which is done by using data for various problems with known solutions as inputs to the FAHT module. The information obtained is used to identify problem areas of the code and is passed on to the developer for debugging purposes. Failure Analysis Associates has revised the first version of the FANTASTIC code, and a new improved version has been released to the Thermal Systems Branch.

  13. Retail applications of signature verification

    NASA Astrophysics Data System (ADS)

    Zimmerman, Thomas G.; Russell, Gregory F.; Heilper, Andre; Smith, Barton A.; Hu, Jianying; Markman, Dmitry; Graham, Jon E.; Drews, Clemens

    2004-08-01

    The dramatic rise in identity theft, the ever-pressing need to provide convenient checkout services to attract and retain loyal customers, and the growing use of multi-function signature capture devices in the retail sector provide favorable conditions for the deployment of dynamic signature verification (DSV) in retail settings. We report on the development of a DSV system to meet the needs of the retail sector. We currently have a database of approximately 10,000 signatures collected from 600 subjects and forgers. Previous work at IBM on DSV has been merged and extended to achieve robust performance on the pen position data available from commercial point-of-sale hardware, achieving equal error rates on skilled forgeries and authentic signatures of 1.5% to 4%.
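
    The equal error rate quoted above is the operating point where the false accept and false reject rates coincide. The sketch below computes it from score lists; the scores are hypothetical, not IBM data, and the convention assumed is that a higher score means "more likely genuine".

```python
def equal_error_rate(genuine, forgery):
    """Equal error rate for a verifier where higher score = more likely genuine.

    Sweeps candidate thresholds and returns the error rate at the point
    where the false accept rate (forgeries accepted) and false reject
    rate (genuine signatures rejected) are closest.
    """
    best = None
    for t in sorted(set(genuine) | set(forgery)):
        far = sum(s >= t for s in forgery) / len(forgery)
        frr = sum(s < t for s in genuine) / len(genuine)
        if best is None or abs(far - frr) < abs(best[0] - best[1]):
            best = (far, frr)
    far, frr = best
    return (far + frr) / 2

# Hypothetical score distributions for one enrolled signer
genuine = [0.9, 0.85, 0.8, 0.75, 0.7, 0.65]
forgery = [0.6, 0.5, 0.55, 0.4, 0.3, 0.72]
print(equal_error_rate(genuine, forgery))  # 1/6, i.e. about 0.167
```

    On a real deployment the EER would be estimated over many signers and skilled forgeries, as in the 10,000-signature database described in the abstract.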

  14. Performing Verification and Validation in Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  15. Liquefied Natural Gas (LNG) dispenser verification device

    NASA Astrophysics Data System (ADS)

    Xiong, Maotao; Yang, Jie-bin; Zhao, Pu-jun; Yu, Bo; Deng, Wan-quan

    2013-01-01

    The working principle and calibration status of LNG (Liquefied Natural Gas) dispensers in China are introduced. Because of the shortcomings of the weighing method for calibrating LNG dispensers, an LNG dispenser verification device has been developed. The device uses the master meter method to verify LNG dispensers in the field. Experimental results indicate that the device has stable performance, a high accuracy level, and a flexible construction, reaching an internationally advanced level. The LNG dispenser verification device will promote the development of the LNG dispenser industry in China and improve the technical level of LNG dispenser manufacture.

  16. 77 FR 60714 - Information Collection Activities: Legacy Data Verification Process (LDVP); Submitted for Office...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-04

    ... consultation process, on May 14, 2012, we published a Federal Register notice (77 FR 28401) announcing that we... Bureau of Safety and Environmental Enforcement Information Collection Activities: Legacy Data Verification Process (LDVP); Submitted for Office of Management and Budget (OMB) Review; Comment Request...

  17. 78 FR 6849 - Agency Information Collection (Verification of VA Benefits) Activity Under OMB Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-31

    ... AFFAIRS Agency Information Collection (Verification of VA Benefits) Activity Under OMB Review AGENCY... abstracted below to the Office of Management and Budget (OMB) for review and comment. The PRA submission... VA Benefits, VA Form 26-8937. OMB Control Number: 2900-0406. ] Type of Review: Extension of...

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PORTABLE GAS CHROMATOGRAPH ELECTRONIC SENSOR TECHNOLOGY MODEL 4100

    EPA Science Inventory

    The U.S. Environmental Protection Agency, through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. As part of this program, the...

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - GAS CHROMATOGRAPH/MASS SPECTROMETER INFICON, INC. HAPSITE

    EPA Science Inventory

    The U.S. Environmental Protection Agency, through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. As part of this program, the...

  20. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  1. 49 CFR Appendix F to Part 236 - Minimum Requirements of FRA Directed Independent Third-Party Assessment of PTC System Safety...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 4 2012-10-01 2012-10-01 false Minimum Requirements of FRA Directed Independent... F to Part 236—Minimum Requirements of FRA Directed Independent Third-Party Assessment of PTC System... independent third-party assessment of PTC system safety verification and validation pursuant to subpart H or...

  2. 49 CFR Appendix F to Part 236 - Minimum Requirements of FRA Directed Independent Third-Party Assessment of PTC System Safety...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 4 2014-10-01 2014-10-01 false Minimum Requirements of FRA Directed Independent... F to Part 236—Minimum Requirements of FRA Directed Independent Third-Party Assessment of PTC System... independent third-party assessment of PTC system safety verification and validation pursuant to subpart H or...

  3. 49 CFR Appendix F to Part 236 - Minimum Requirements of FRA Directed Independent Third-Party Assessment of PTC System Safety...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 4 2013-10-01 2013-10-01 false Minimum Requirements of FRA Directed Independent... F to Part 236—Minimum Requirements of FRA Directed Independent Third-Party Assessment of PTC System... independent third-party assessment of PTC system safety verification and validation pursuant to subpart H or...

  4. Data Machine Independence

    1994-12-30

    Data-machine independence achieved by using four technologies (ASN.1, XDR, SDS, and ZEBRA) has been evaluated by encoding two different applications in each of the above and comparing the results against the standard programming method using C.

  5. Media independent interface

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The work done on the Media Independent Interface (MII) Interface Control Document (ICD) program is described and recommendations based on it were made. Explanations and rationale for the content of the ICD itself are presented.

  6. The PASCAL-HDM Verification System

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The PASCAL-HDM verification system is described. This system supports the mechanical generation of verification conditions from PASCAL programs and HDM-SPECIAL specifications using the Floyd-Hoare axiomatic method. Tools are provided to parse programs and specifications, check their static semantics, generate verification conditions from Hoare rules, and translate the verification conditions appropriately for proof using the Shostak Theorem Prover. The differences between standard PASCAL and the language handled by this system are explained. These consist mostly of restrictions to the standard language definition; the only extensions or modifications are the addition of specifications to the code and the requirement that references to a function of no arguments have empty parentheses.
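
    The mechanical generation of verification conditions from Hoare rules can be sketched for the simplest case, the assignment axiom wp(x := e, Q) = Q[e/x]. The Python fragment below is purely illustrative and is not part of the PASCAL-HDM system; it applies the axiom by textual substitution over a sequence of assignments.

```python
import re

def wp_assign(var, expr, post):
    """Weakest precondition of `var := expr` w.r.t. postcondition `post`,
    per the Hoare assignment axiom wp(x := e, Q) = Q[e/x], computed by
    textual substitution on whole-word occurrences of the variable."""
    return re.sub(rf"\b{var}\b", f"({expr})", post)

def wp_seq(stmts, post):
    """wp of a sequence of assignments, applied back to front."""
    for var, expr in reversed(stmts):
        post = wp_assign(var, expr, post)
    return post

# Verification condition for {x >= 0} y := x + 1; z := y * 2 {z >= 2}
vc = wp_seq([("y", "x + 1"), ("z", "y * 2")], "z >= 2")
print(vc)  # ((x + 1) * 2) >= 2
```

    The resulting verification condition, "x >= 0 implies ((x + 1) * 2) >= 2", is the kind of formula a system like PASCAL-HDM would hand off to a theorem prover to discharge.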

  7. Engineering drawing field verification program. Revision 3

    SciTech Connect

    Ulk, P.F.

    1994-10-12

    Safe, efficient operation of waste tank farm facilities depends in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, the degree of visual observation to be performed, and the documentation of results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification, for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer-aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented.

  8. MAMA Software Features: Quantification Verification Documentation-1

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-21

    This document reviews the verification of the basic shape quantification attributes in the MAMA software against hand calculations in order to show that the calculations are implemented mathematically correctly and give the expected quantification results.

  9. U.S. Environmental Technology Verification Program

    EPA Science Inventory

    Overview of the U.S. Environmental Technology Verification Program (ETV), the ETV Greenhouse Gas Technology Center, and energy-related ETV projects. Presented at the Department of Energy's National Renewable Laboratory in Boulder, Colorado on June 23, 2008.

  10. Calibration and verification of environmental models

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Weinberg, N.; Hiser, H.

    1976-01-01

    The problems of calibration and verification of mesoscale models used for investigating power plant discharges are considered. The value of remote sensors for data acquisition is discussed as well as an investigation of Biscayne Bay in southern Florida.

  11. Verification timer for AECL 780 Cobalt unit.

    PubMed

    Smathers, J B; Holly, F E

    1984-05-01

    To obtain verification of the proper time setting of the motorized run-down timer for an AECL 780 Cobalt Unit, a digital timer is described which can be added to the system for under $300. PMID:6735762

  12. Electronic Verification at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Johnson, T. W.

    1995-01-01

    This document reviews some current applications of Electronic Verification and the benefits such applications are providing the Kennedy Space Center (KSC). It also previews some new technologies, including statistics regarding performance and possible utilization of the technology.

  13. THE EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM

    EPA Science Inventory

    The Environmental Protection Agency (EPA) instituted the Environmental Technology Verification Program--or ETV--to verify the performance of innovative technical solutions to problems that threaten human health or the environment. ETV was created to substantially accelerate the e...

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: FUEL CELLS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techno...

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: STORMWATER TECHNOLOGIES

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techn...

  16. Handbook: Design of automated redundancy verification

    NASA Technical Reports Server (NTRS)

    Ford, F. A.; Hasslinger, T. W.; Moreno, F. J.

    1971-01-01

    The use of the handbook is discussed and the design progress is reviewed. A description of the problem is presented, and examples are given to illustrate the necessity for redundancy verification, along with the types of situations to which it is typically applied. Reusable space vehicles, such as the space shuttle, are recognized as being significant in the development of the automated redundancy verification problem.

  17. The NPARC Alliance Verification and Validation Archive

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Dudek, Julianne C.; Tatum, Kenneth E.

    2000-01-01

    The NPARC Alliance (National Project for Applications oriented Research in CFD) maintains a publicly-available, web-based verification and validation archive as part of the development and support of the WIND CFD code. The verification and validation methods used for the cases attempt to follow the policies and guidelines of the ASME and AIAA. The emphasis is on air-breathing propulsion flow fields with Mach numbers ranging from low-subsonic to hypersonic.

  18. Transmutation Fuel Performance Code Thermal Model Verification

    SciTech Connect

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    FRAPCON fuel performance code is being modified to be able to model performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort for verification of the FRAPCON thermal model. It was found that, with minor modifications, FRAPCON thermal model temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, code input, and calculation results.

  19. A Runtime Verification Framework for Control System Simulation

    SciTech Connect

    Ciraci, Selim; Fuller, Jason C.; Daily, Jeffrey A.; Makhmalbaf, Atefe; Callahan, Charles D.

    2014-08-02

    In a standard workflow for the validation of a control system, the control system is implemented as an extension to a simulator. Such simulators are complex software systems, and engineers may unknowingly violate constraints a simulator places on extensions. As such, errors may be introduced in the implementation of either the control system or the simulator, leading to invalid simulation results. This paper presents a novel runtime verification approach for verifying control system implementations within simulators. The major contribution of the approach is the two-tier specification process. In the first tier, engineers model constraints using a domain-specific language tailored to modeling a controller's response to changes in its input. The language is high-level and effectively hides the implementation details of the simulator, allowing engineers to specify design-level constraints independent of low-level simulator interfaces. In the second tier, simulator developers provide mapping rules for mapping design-level constraints to the implementation of the simulator. Using the rules, an automated tool transforms the design-level specifications into simulator-specific runtime verification specifications and generates monitoring code which is injected into the implementation of the simulator. During simulation, these monitors observe the input and output variables of the control system and report changes to the verifier. The verifier checks whether these changes follow the constraints of the control system. We describe the application of this approach to the verification of the constraints of an HVAC control system implemented with the power grid simulator GridLAB-D.
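
    The monitoring idea described in the abstract can be illustrated with a toy monitor. The sketch below is hypothetical and does not use GridLAB-D's interfaces: it checks one design-level constraint on a controller's response to its input, namely that the heat command must drop to zero within a deadline once the temperature exceeds the setpoint.

```python
class ResponseMonitor:
    """Minimal runtime-verification monitor (illustrative only).

    Checks the design-level constraint: once the observed temperature
    rises above the setpoint, the controller's heat command must go to 0
    within `deadline` further observed steps.
    """
    def __init__(self, deadline):
        self.deadline = deadline
        self.pending = None      # steps remaining to react, or None
        self.violations = []     # steps at which the deadline was missed

    def observe(self, step, temp, setpoint, heat_cmd):
        if temp > setpoint and heat_cmd > 0:
            if self.pending is None:
                self.pending = self.deadline   # constraint armed
            elif self.pending == 0:
                self.violations.append(step)   # deadline missed
            else:
                self.pending -= 1
        else:
            self.pending = None                # constraint satisfied / inactive

# Simulated trace: temperature overshoots at step 2; controller reacts at step 4
monitor = ResponseMonitor(deadline=2)
trace = [(0, 19.0, 21.0, 1), (1, 20.5, 21.0, 1), (2, 21.5, 21.0, 1),
         (3, 21.8, 21.0, 1), (4, 21.9, 21.0, 0)]
for step, temp, sp, cmd in trace:
    monitor.observe(step, temp, sp, cmd)
print(monitor.violations)  # [] -- the 2-step deadline is met here
```

    In the paper's two-tier scheme, a constraint like this would be written once in the design-level language, and mapping rules would bind `temp`, `setpoint`, and `heat_cmd` to the simulator's actual variables.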

  20. 76 FR 54810 - Submission for Review: 3206-0215, Verification of Full-Time School Attendance, RI 25-49

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-02

    ... MANAGEMENT Submission for Review: 3206-0215, Verification of Full-Time School Attendance, RI 25-49 AGENCY: U... on May 23, 2011 at Volume 76 FR 29805 allowing for a 60-day public comment period. No comments were... Management must confirm that a full-time enrollment has been maintained. Analysis Agency:...

  1. Verification and Validation in Computational Fluid Dynamics

    SciTech Connect

    OBERKAMPF, WILLIAM L.; TRUCANO, TIMOTHY G.

    2002-03-01

    Verification and validation (V and V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V and V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V and V, and develops a number of extensions to existing ideas. The review of the development of V and V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V and V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized.

  2. Results Oriented Management in Education. Project R.O.M.E. The Verification and Validation of Principal Competencies and Performance Indicators: Assessment Design--Procedures--Instrumentation--Field Test Results. Volume 3--Instrument Appendix to Accompany Final Report.

    ERIC Educational Resources Information Center

    Georgia State Dept. of Education, Atlanta.

    This document represents a complete compilation of instruments used by the University of Georgia Project R.O.M.E. (Results Oriented Management in Education) staff to field test the Georgia Principal Assessment System in order to validate high priority principal competencies and performance indicators during the 1974-75 project year. The seven…

  3. Development of advanced seal verification

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.; Kosten, Susan E.; Abushagur, Mustafa A.

    1992-01-01

    The purpose of this research is to develop a technique to monitor and ensure seal integrity with a sensor that has no active elements to burn out during a long-duration activity, such as a leakage test or, especially, a mission in space. The original concept proposed is that by implementing fiber optic sensors, changes in the integrity of a seal can be monitored in real time, and at no time should the optical fiber sensor fail. The electrical components which provide optical excitation and detection through the fiber are not part of the seal; hence, if these electrical components fail, they can be easily changed without breaking the seal. The optical connections required for the concept to work do present a functional problem to work out. The utility of the optical fiber sensor for seal monitoring should be general enough that the degradation of a seal can be determined before catastrophic failure occurs and appropriate action taken. Two parallel efforts were performed in determining the feasibility of using optical fiber sensors for seal verification. In one study, interferometric measurement of the mechanical response of the optical fiber sensors to seal integrity was studied. In a second study, the optical fiber was fitted to a typical vacuum chamber and feasibility studies on microbend experiments in the vacuum chamber were performed. An attempt was also made to quantify the amount of pressure actually being applied to the optical fiber using finite element analysis software by Algor.

  4. Learning Assumptions for Compositional Verification

    NASA Technical Reports Server (NTRS)

    Cobleigh, Jamieson M.; Giannakopoulou, Dimitra; Pasareanu, Corina; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Compositional verification is a promising approach to addressing the state explosion problem associated with model checking. One compositional technique advocates proving properties of a system by checking properties of its components in an assume-guarantee style. However, the application of this technique is difficult because it involves non-trivial human input. This paper presents a novel framework for performing assume-guarantee reasoning in an incremental and fully automated fashion. To check a component against a property, our approach generates assumptions that the environment needs to satisfy for the property to hold. These assumptions are then discharged on the rest of the system. Assumptions are computed by a learning algorithm. They are initially approximate, but become gradually more precise by means of counterexamples obtained by model checking the component and its environment, alternately. This iterative process may at any stage conclude that the property is either true or false in the system. We have implemented our approach in the LTSA tool and applied it to the analysis of a NASA system.

  5. Visual inspection for CTBT verification

    SciTech Connect

    Hawkins, W.; Wohletz, K.

    1997-03-01

    On-site visual inspection will play an essential role in future Comprehensive Test Ban Treaty (CTBT) verification. Although seismic and remote sensing techniques are the best understood and most developed methods for detection of evasive testing of nuclear weapons, visual inspection can greatly augment the certainty and detail of understanding provided by these more traditional methods. Not only can visual inspection offer "ground truth" in cases of suspected nuclear testing, but it also can provide accurate source location and testing media properties necessary for detailed analysis of seismic records. For testing in violation of the CTBT, an offending party may attempt to conceal the test, which most likely will be achieved by underground burial. While such concealment may not prevent seismic detection, evidence of test deployment, location, and yield can be disguised. In this light, if a suspicious event is detected by seismic or other remote methods, visual inspection of the event area is necessary to document any evidence that might support a claim of nuclear testing and provide data needed to further interpret seismic records and guide further investigations. However, the methods for visual inspection are neither widely known nor appreciated, and experience is presently limited. Visual inspection can be achieved by simple, non-intrusive means, primarily geological in nature, and it is the purpose of this report to describe the considerations, procedures, and equipment required to field such an inspection.

  6. Verification of excess defense material

    SciTech Connect

    Fearey, B.L.; Pilat, J.F.; Eccleston, G.W.; Nicholas, N.J.; Tape, J.W.

    1997-12-01

    The international community in the post-Cold War period has expressed an interest in the International Atomic Energy Agency (IAEA) using its expertise in support of the arms control and disarmament process in unprecedented ways. The pledges of the US and Russian presidents to place excess defense materials under some type of international inspections raises the prospect of using IAEA safeguards approaches for monitoring excess materials, which include both classified and unclassified materials. Although the IAEA has suggested the need to address inspections of both types of materials, the most troublesome and potentially difficult problems involve approaches to the inspection of classified materials. The key issue for placing classified nuclear components and materials under IAEA safeguards is the conflict between these traditional IAEA materials accounting procedures and the US classification laws and nonproliferation policy designed to prevent the disclosure of critical weapon-design information. Possible verification approaches to classified excess defense materials could be based on item accountancy, attributes measurements, and containment and surveillance. Such approaches are not wholly new; in fact, they are quite well established for certain unclassified materials. Such concepts may be applicable to classified items, but the precise approaches have yet to be identified, fully tested, or evaluated for technical and political feasibility, or for their possible acceptability in an international inspection regime. Substantial work remains in these areas. This paper examines many of the challenges presented by international inspections of classified materials.

  7. Verification survey of the 17th Street Drainage Area, Santa Susana Field Laboratory, The Boeing Company, Ventura County, California

    SciTech Connect

    John R. Morton

    2000-04-14

    An independent (third-party) verification of contractor remedial actions at the subject site confirms that the remedial actions have been effective in meeting established and site-specific guidelines and that the documentation accurately and adequately describes the radiological conditions at the site.

  8. National Verification System of National Meteorological Center , China

    NASA Astrophysics Data System (ADS)

    Zhang, Jinyan; Wei, Qing; Qi, Dan

    2016-04-01

    Product Quality Verification Division for official weather forecasting of China was founded in April 2011. It is affiliated with the Forecast System Laboratory (FSL), National Meteorological Center (NMC), China. The division has three employees; I am one of them and am in charge of the Product Quality Verification Division at NMC, China. After five years of construction, an integrated real-time National Verification System of NMC, China has been established. At present, its primary roles are: 1) to verify the official weather forecasting quality of NMC, China; 2) to verify the official city weather forecasting quality of the Provincial Meteorological Bureaus; 3) to evaluate the forecasting quality of each forecaster in NMC, China. To verify the official weather forecasting quality of NMC, China, we have developed:
    • Grid QPF verification module (including upscaling)
    • Grid temperature, humidity and wind forecast verification module
    • Severe convective weather forecast verification module
    • Typhoon forecast verification module
    • Disaster forecast verification module
    • Disaster warning verification module
    • Medium and extended period forecast verification module
    • Objective elements forecast verification module
    • Ensemble precipitation probabilistic forecast verification module
    To verify the official city weather forecasting quality of the Provincial Meteorological Bureaus, we have developed:
    • City elements forecast verification module
    • Public heavy rain forecast verification module
    • City air quality forecast verification module
    To evaluate the forecasting quality of each forecaster in NMC, China, we have developed:
    • Off-duty forecaster QPF practice evaluation module
    • QPF evaluation module for forecasters
    • Severe convective weather forecast evaluation module
    • Typhoon track forecast evaluation module for forecasters
    • Disaster warning evaluation module for forecasters
    • Medium and extended period forecast evaluation module
    The further
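
    Grid QPF verification modules of this kind conventionally reduce forecast and observed precipitation grids to a yes/no contingency table at a rain threshold and report standard categorical scores. A minimal sketch under that assumption (the abstract does not name the system's actual metrics; the score definitions below are the standard categorical ones):

```python
# Minimal categorical verification of a gridded QPF, assuming the usual
# hits/misses/false-alarms contingency-table scores: threat score (TS/CSI),
# probability of detection (POD), false alarm ratio (FAR), frequency bias.
# For simplicity the grids are flattened to 1-D sequences and at least one
# event is assumed to occur (no zero-division guards in this sketch).
def categorical_scores(forecast, observed, threshold):
    hits = misses = false_alarms = 0
    for f, o in zip(forecast, observed):
        fe, oe = f >= threshold, o >= threshold  # forecast/observed event?
        if fe and oe:
            hits += 1
        elif oe:
            misses += 1
        elif fe:
            false_alarms += 1
    return {
        "TS":   hits / (hits + misses + false_alarms),    # threat score (CSI)
        "POD":  hits / (hits + misses),                   # prob. of detection
        "FAR":  false_alarms / (hits + false_alarms),     # false alarm ratio
        "bias": (hits + false_alarms) / (hits + misses),  # frequency bias
    }

# Hypothetical 25-mm (heavy rain) verification over eight grid points:
fcst = [30, 0, 12, 40, 5, 28, 0, 50]
obs  = [25, 0, 30, 35, 0, 10, 26, 45]
print(categorical_scores(fcst, obs, threshold=25))
```

    Upscaling, mentioned for the grid QPF module, would simply mean aggregating both grids to a coarser resolution before forming the contingency table.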

  9. Independent NOAA considered

    NASA Astrophysics Data System (ADS)

    Richman, Barbara T.

    A proposal to pull the National Oceanic and Atmospheric Administration (NOAA) out of the Department of Commerce and make it an independent agency was the subject of a recent congressional hearing. Supporters within the science community and in Congress said that an independent NOAA would benefit by being more visible and by not being tied to a cabinet-level department whose main concerns lie elsewhere. The proposal's critics, however, cautioned that making NOAA independent could make it even more vulnerable to the budget axe and would sever the agency's direct access to the President. The separation of NOAA from Commerce was contained in a June 1 proposal by President Ronald Reagan that also called for all federal trade functions under the Department of Commerce to be reorganized into a new Department of International Trade and Industry (DITI).

  10. Formal verification of a microcoded VIPER microprocessor using HOL

    NASA Technical Reports Server (NTRS)

    Levitt, Karl; Arora, Tejkumar; Leung, Tony; Kalvala, Sara; Schubert, E. Thomas; Windley, Philip; Heckman, Mark; Cohen, Gerald C.

    1993-01-01

    The Royal Signals and Radar Establishment (RSRE) and members of the Hardware Verification Group at Cambridge University conducted a joint effort to prove the correspondence between the electronic block model and the top level specification of Viper. Unfortunately, the proof became too complex and unmanageable within the given time and funding constraints, and is thus incomplete as of the date of this report. This report describes an independent attempt to use the HOL (Cambridge Higher Order Logic) mechanical verifier to verify Viper. Deriving from recent results in hardware verification research at UC Davis, the approach has been to redesign the electronic block model to make it microcoded and to structure the proof in a series of decreasingly abstract interpreter levels, the lowest being the electronic block level. The highest level is the RSRE Viper instruction set. Owing to the new approach and some results on the proof of generic interpreters as applied to simple microprocessors, this attempt required an effort approximately an order of magnitude less than the previous one.

  11. Supporting independent inventors

    SciTech Connect

    Bernard, M.J. III; Whalley, P. (Loyola Univ., Chicago, IL. Dept. of Sociology)

    1989-01-01

    Independent inventors contribute products to the marketplace despite the well-financed brain trusts at corporate, university, and federal R and D laboratories. But given the environment in which the basement/garage inventor labors, transferring a worthwhile invention into a commercial product is quite difficult. There is a growing effort by many state and local agencies and organizations to improve the inventor's working environment and begin to routinize the process of developing the ideas and inventions of independent inventors into commercial products. 4 refs.

  12. Towards the Formal Verification of a Distributed Real-Time Automotive System

    NASA Technical Reports Server (NTRS)

    Endres, Erik; Mueller, Christian; Shadrin, Andrey; Tverdyshev, Sergey

    2010-01-01

    We present the status of a project which aims at building, formally and pervasively verifying a distributed automotive system. The target system is a gate-level model which consists of several interconnected electronic control units with independent clocks. This model is verified against the specification as seen by a system programmer. The automotive system is implemented on several FPGA boards. The pervasive verification is carried out using combination of interactive theorem proving (Isabelle/HOL) and model checking (LTL).

  13. 24 CFR 985.3 - Indicators, HUD verification methods and ratings.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... list. (24 CFR 982.54(d)(1) and 982.204(a)) (2) HUD verification method: The independent auditor (IA... the reasonable rent. (24 CFR 982.4, 24 CFR 982.54(d)(15), 982.158(f)(7) and 982.507) (2) HUD... in determining the gross rent. (24 CFR part 5, subpart F and 24 CFR 982.516) (2) HUD...

  14. 24 CFR 985.3 - Indicators, HUD verification methods and ratings.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... list. (24 CFR 982.54(d)(1) and 982.204(a)) (2) HUD verification method: The independent auditor (IA... the reasonable rent. (24 CFR 982.4, 24 CFR 982.54(d)(15), 982.158(f)(7) and 982.507) (2) HUD... in determining the gross rent. (24 CFR part 5, subpart F and 24 CFR 982.516) (2) HUD...

  15. 24 CFR 985.3 - Indicators, HUD verification methods and ratings.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... list. (24 CFR 982.54(d)(1) and 982.204(a)) (2) HUD verification method: The independent auditor (IA... the reasonable rent. (24 CFR 982.4, 24 CFR 982.54(d)(15), 982.158(f)(7) and 982.507) (2) HUD... in determining the gross rent. (24 CFR part 5, subpart F and 24 CFR 982.516) (2) HUD...

  16. 24 CFR 985.3 - Indicators, HUD verification methods and ratings.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... list. (24 CFR 982.54(d)(1) and 982.204(a)) (2) HUD verification method: The independent auditor (IA... the reasonable rent. (24 CFR 982.4, 24 CFR 982.54(d)(15), 982.158(f)(7) and 982.507) (2) HUD... in determining the gross rent. (24 CFR part 5, subpart F and 24 CFR 982.516) (2) HUD...

  17. 24 CFR 985.3 - Indicators, HUD verification methods and ratings.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... list. (24 CFR 982.54(d)(1) and 982.204(a)) (2) HUD verification method: The independent auditor (IA... the reasonable rent. (24 CFR 982.4, 24 CFR 982.54(d)(15), 982.158(f)(7) and 982.507) (2) HUD... in determining the gross rent. (24 CFR part 5, subpart F and 24 CFR 982.516) (2) HUD...

  18. Concepts of Model Verification and Validation

    SciTech Connect

    B.H. Thacker; S.W. Doebling; F.M. Hemez; M.C. Anderson; J.E. Pepin; E.A. Rodriguez

    2004-10-30

    Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. Guidelines and procedures for conducting a model V&V program are currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models. The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model V&V for all

  19. 40 CFR 1065.550 - Gas analyzer range verification and drift verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Gas analyzer range verification and drift verification. 1065.550 Section 1065.550 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Performing an Emission Test Over Specified Duty Cycles § 1065.550 Gas analyzer...

  20. Towards a Theory for Integration of Mathematical Verification and Empirical Testing

    NASA Technical Reports Server (NTRS)

    Lowry, Michael; Boyd, Mark; Kulkarni, Deepak

    1998-01-01

    From the viewpoint of a project manager responsible for the V&V (verification and validation) of a software system, mathematical verification techniques provide a possibly useful orthogonal dimension to otherwise standard empirical testing. However, the value they add to an empirical testing regime, both in terms of coverage and fault detection, has been difficult to quantify. Furthermore, potential cost savings from replacing testing with mathematical verification techniques cannot be realized until the tradeoffs and synergies can be formulated. Integration of formal verification with empirical testing is also difficult because the idealized view of mathematical verification providing a correctness proof with total coverage is unrealistic and does not reflect the limitations imposed by the computational complexity of mathematical techniques. This paper first describes a framework, based on software reliability and formalized fault models, for a theory of software design fault detection, and hence of the utility of various tools for debugging. It then describes a utility model for integrating mathematical and empirical techniques with respect to fault detection and coverage analysis, and considers the optimal combination of black-box testing, white-box (structural) testing, and formal methods in the V&V of a software system. Using case studies from NASA software systems, it demonstrates how this utility model can be used in practice.

  1. Monitoring and verification R&D

    SciTech Connect

    Pilat, Joseph F; Budlong - Sylvester, Kory W; Fearey, Bryan L

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R&D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R&D required to address these gaps and other monitoring and verification challenges.

  2. Postcard from Independence, Mo.

    ERIC Educational Resources Information Center

    Archer, Jeff

    2004-01-01

    This article reports results showing that the Independence, Missouri school district failed to meet almost every one of its improvement goals under the No Child Left Behind Act. The state accreditation system stresses improvement over past scores, while the federal law demands specified amounts of annual progress toward the ultimate goal of 100…

  3. Touchstones of Independence.

    ERIC Educational Resources Information Center

    Roha, Thomas Arden

    1999-01-01

    Foundations affiliated with public higher education institutions can avoid having to open records for public scrutiny, by having independent boards of directors, occupying leased office space or paying market value for university space, using only foundation personnel, retaining legal counsel, being forthcoming with information and use of public…

  4. Independent Human Studies.

    ERIC Educational Resources Information Center

    Kaplan, Suzanne; Wilson, Gordon

    1978-01-01

    The Independent Human Studies program at Schoolcraft College offers an alternative method of earning academic credits. Students delineate an area of study, pose research questions, gather resources, synthesize the information, state the thesis, choose the method of presentation, set schedules, and take responsibility for meeting deadlines. (MB)

  5. Caring about Independent Lives

    ERIC Educational Resources Information Center

    Christensen, Karen

    2010-01-01

    With the rhetoric of independence, new cash for care systems were introduced in many developed welfare states at the end of the 20th century. These systems allow local authorities to pay people who are eligible for community care services directly, to enable them to employ their own careworkers. Despite the obvious importance of the careworker's…

  6. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone due to the intensive and customized process necessary to verify each individual component model and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
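
    One class of automated check such tools can perform is purely structural: because an FFM is a directed graph of failure-effect propagation, every failure-mode node should reach at least one node whose effect is observable (e.g., a sensor). A small sketch of that reachability check (hypothetical; not the actual NASA tooling, and the node names are invented):

```python
# Structural check on a functional fault model (FFM): verify that every
# failure-mode node can propagate, along directed edges, to some node whose
# effect is observable. Failure modes with no such path are likely modeling
# errors (or real detectability gaps) and are returned for review.
from collections import deque

def undetectable_failure_modes(edges, failure_modes, observable):
    graph = {}
    for src, dst in edges:
        graph.setdefault(src, []).append(dst)

    def reaches_observable(start):
        seen, queue = {start}, deque([start])
        while queue:                     # breadth-first search from the fault
            node = queue.popleft()
            if node in observable:
                return True
            for nxt in graph.get(node, ()):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return False

    return [fm for fm in failure_modes if not reaches_observable(fm)]

# Hypothetical model: a stuck valve propagates to a pressure sensor, but a
# leak fault has no path to any sensor -- a finding the check surfaces.
edges = [("valve_stuck", "low_flow"), ("low_flow", "pressure_sensor"),
         ("leak", "fluid_loss")]
print(undetectable_failure_modes(edges, ["valve_stuck", "leak"],
                                 {"pressure_sensor"}))  # → ['leak']
```

    Checks like this scale linearly with model size, which is what makes automating them attractive compared with manually walking each component model.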

  7. The DES Science Verification weak lensing shear catalogues

    NASA Astrophysics Data System (ADS)

    Jarvis, M.; Sheldon, E.; Zuntz, J.; Kacprzak, T.; Bridle, S. L.; Amara, A.; Armstrong, R.; Becker, M. R.; Bernstein, G. M.; Bonnett, C.; Chang, C.; Das, R.; Dietrich, J. P.; Drlica-Wagner, A.; Eifler, T. F.; Gangkofner, C.; Gruen, D.; Hirsch, M.; Huff, E. M.; Jain, B.; Kent, S.; Kirk, D.; MacCrann, N.; Melchior, P.; Plazas, A. A.; Refregier, A.; Rowe, B.; Rykoff, E. S.; Samuroff, S.; Sánchez, C.; Suchyta, E.; Troxel, M. A.; Vikram, V.; Abbott, T.; Abdalla, F. B.; Allam, S.; Annis, J.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Capozzi, D.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Castander, F. J.; Clampitt, J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Doel, P.; Fausti Neto, A.; Flaugher, B.; Fosalba, P.; Frieman, J.; Gaztanaga, E.; Gerdes, D. W.; Gruendl, R. A.; Gutierrez, G.; Honscheid, K.; James, D. J.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Li, T. S.; Lima, M.; March, M.; Martini, P.; Miquel, R.; Mohr, J. J.; Neilsen, E.; Nord, B.; Ogando, R.; Reil, K.; Romer, A. K.; Roodman, A.; Sako, M.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Swanson, M. E. C.; Tarle, G.; Thaler, J.; Thomas, D.; Walker, A. R.; Wechsler, R. H.

    2016-08-01

    We present weak lensing shear catalogues for 139 square degrees of data taken during the Science Verification (SV) time for the new Dark Energy Camera (DECam) being used for the Dark Energy Survey (DES). We describe our object selection, point spread function estimation and shear measurement procedures using two independent shear pipelines, IM3SHAPE and NGMIX, which produce catalogues of 2.12 million and 3.44 million galaxies respectively. We detail a set of null tests for the shear measurements and find that they pass the requirements for systematic errors at the level necessary for weak lensing science applications using the SV data. We also discuss some of the planned algorithmic improvements that will be necessary to produce sufficiently accurate shear catalogues for the full 5-year DES, which is expected to cover 5000 square degrees.

  8. Shell Element Verification & Regression Problems for DYNA3D

    SciTech Connect

    Zywicz, E

    2008-02-01

    A series of quasi-static regression/verification problems were developed for the triangular and quadrilateral shell element formulations contained in Lawrence Livermore National Laboratory's explicit finite element program DYNA3D. Each regression problem imposes both displacement- and force-type boundary conditions to probe the five independent nodal degrees of freedom employed in the targeted formulation. When applicable, the finite element results are compared with small-strain linear-elastic closed-form reference solutions to verify select aspects of the formulations implementation. Although all problems in the suite depict the same geometry, material behavior, and loading conditions, each problem represents a unique combination of shell formulation, stabilization method, and integration rule. Collectively, the thirty-six new regression problems in the test suite cover nine different shell formulations, three hourglass stabilization methods, and three families of through-thickness integration rules.

  9. The DES Science Verification Weak Lensing Shear Catalogs

    SciTech Connect

    Jarvis, M.

    2015-07-20

    We present weak lensing shear catalogs for 139 square degrees of data taken during the Science Verification (SV) time for the new Dark Energy Camera (DECam) being used for the Dark Energy Survey (DES). We describe our object selection, point spread function estimation and shear measurement procedures using two independent shear pipelines, IM3SHAPE and NGMIX, which produce catalogs of 2.12 million and 3.44 million galaxies respectively. We also detail a set of null tests for the shear measurements and find that they pass the requirements for systematic errors at the level necessary for weak lensing science applications using the SV data. Furthermore, we discuss some of the planned algorithmic improvements that will be necessary to produce sufficiently accurate shear catalogs for the full 5-year DES, which is expected to cover 5000 square degrees.

  10. The DES Science Verification weak lensing shear catalogues

    NASA Astrophysics Data System (ADS)

    Jarvis, M.; Sheldon, E.; Zuntz, J.; Kacprzak, T.; Bridle, S. L.; Amara, A.; Armstrong, R.; Becker, M. R.; Bernstein, G. M.; Bonnett, C.; Chang, C.; Das, R.; Dietrich, J. P.; Drlica-Wagner, A.; Eifler, T. F.; Gangkofner, C.; Gruen, D.; Hirsch, M.; Huff, E. M.; Jain, B.; Kent, S.; Kirk, D.; MacCrann, N.; Melchior, P.; Plazas, A. A.; Refregier, A.; Rowe, B.; Rykoff, E. S.; Samuroff, S.; Sánchez, C.; Suchyta, E.; Troxel, M. A.; Vikram, V.; Abbott, T.; Abdalla, F. B.; Allam, S.; Annis, J.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Capozzi, D.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Castander, F. J.; Clampitt, J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Doel, P.; Fausti Neto, A.; Flaugher, B.; Fosalba, P.; Frieman, J.; Gaztanaga, E.; Gerdes, D. W.; Gruendl, R. A.; Gutierrez, G.; Honscheid, K.; James, D. J.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Li, T. S.; Lima, M.; March, M.; Martini, P.; Miquel, R.; Mohr, J. J.; Neilsen, E.; Nord, B.; Ogando, R.; Reil, K.; Romer, A. K.; Roodman, A.; Sako, M.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Swanson, M. E. C.; Tarle, G.; Thaler, J.; Thomas, D.; Walker, A. R.; Wechsler, R. H.

    2016-08-01

    We present weak lensing shear catalogues for 139 square degrees of data taken during the Science Verification (SV) time for the new Dark Energy Camera (DECam) being used for the Dark Energy Survey (DES). We describe our object selection, point spread function estimation and shear measurement procedures using two independent shear pipelines, IM3SHAPE and NGMIX, which produce catalogues of 2.12 million and 3.44 million galaxies, respectively. We detail a set of null tests for the shear measurements and find that they pass the requirements for systematic errors at the level necessary for weak lensing science applications using the SV data. We also discuss some of the planned algorithmic improvements that will be necessary to produce sufficiently accurate shear catalogues for the full 5-yr DES, which is expected to cover 5000 square degrees.

  11. The DES Science Verification Weak Lensing Shear Catalogs

    DOE PAGESBeta

    Jarvis, M.

    2016-05-01

    We present weak lensing shear catalogs for 139 square degrees of data taken during the Science Verification (SV) time for the new Dark Energy Camera (DECam) being used for the Dark Energy Survey (DES). We describe our object selection, point spread function estimation and shear measurement procedures using two independent shear pipelines, IM3SHAPE and NGMIX, which produce catalogs of 2.12 million and 3.44 million galaxies respectively. We also detail a set of null tests for the shear measurements and find that they pass the requirements for systematic errors at the level necessary for weak lensing science applications using the SV data. Furthermore, we discuss some of the planned algorithmic improvements that will be necessary to produce sufficiently accurate shear catalogs for the full 5-year DES, which is expected to cover 5000 square degrees.

  12. The DES Science Verification Weak Lensing Shear Catalogues

    NASA Astrophysics Data System (ADS)

    Jarvis, M.; Sheldon, E.; Zuntz, J.; Kacprzak, T.; Bridle, S. L.; Amara, A.; Armstrong, R.; Becker, M. R.; Bernstein, G. M.; Bonnett, C.; Chang, C.; Das, R.; Dietrich, J. P.; Drlica-Wagner, A.; Eifler, T. F.; Gangkofner, C.; Gruen, D.; Hirsch, M.; Huff, E. M.; Jain, B.; Kent, S.; Kirk, D.; MacCrann, N.; Melchior, P.; Plazas, A. A.; Refregier, A.; Rowe, B.; Rykoff, E. S.; Samuroff, S.; Sánchez, C.; Suchyta, E.; Troxel, M. A.; Vikram, V.; Abbott, T.; Abdalla, F. B.; Allam, S.; Annis, J.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Capozzi, D.; Rosell, A. Carnero; Kind, M. Carrasco; Carretero, J.; Castander, F. J.; Clampitt, J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Doel, P.; Neto, A. Fausti; Flaugher, B.; Fosalba, P.; Frieman, J.; Gaztanaga, E.; Gerdes, D. W.; Gruendl, R. A.; Gutierrez, G.; Honscheid, K.; James, D. J.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Li, T. S.; Lima, M.; March, M.; Martini, P.; Miquel, R.; Mohr, J. J.; Neilsen, E.; Nord, B.; Ogando, R.; Reil, K.; Romer, A. K.; Roodman, A.; Sako, M.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Swanson, M. E. C.; Tarle, G.; Thaler, J.; Thomas, D.; Walker, A. R.; Wechsler, R. H.

    2016-05-01

    We present weak lensing shear catalogues for 139 square degrees of data taken during the Science Verification (SV) time for the new Dark Energy Camera (DECam) being used for the Dark Energy Survey (DES). We describe our object selection, point spread function estimation and shear measurement procedures using two independent shear pipelines, IM3SHAPE and NGMIX, which produce catalogues of 2.12 million and 3.44 million galaxies respectively. We detail a set of null tests for the shear measurements and find that they pass the requirements for systematic errors at the level necessary for weak lensing science applications using the SV data. We also discuss some of the planned algorithmic improvements that will be necessary to produce sufficiently accurate shear catalogues for the full 5-year DES, which is expected to cover 5000 square degrees.

  13. Ozone Monitoring Instrument geolocation verification

    NASA Astrophysics Data System (ADS)

    Kroon, M.; Dobber, M. R.; Dirksen, R.; Veefkind, J. P.; van den Oord, G. H. J.; Levelt, P. F.

    2008-08-01

    Verification of the geolocation assigned to individual ground pixels as measured by the Ozone Monitoring Instrument (OMI) aboard the NASA EOS-Aura satellite was performed by comparing geophysical Earth surface details as observed in OMI false color images with the high-resolution continental outline vector map as provided by the Interactive Data Language (IDL) software tool from ITT Visual Information Solutions. The OMI false color images are generated from the OMI visible channel by integration over 20-nm-wide spectral bands of the Earth radiance intensity around 484 nm, 420 nm, and 360 nm wavelength per ground pixel. Proportional to the integrated intensity, we assign color values composed of CRT standard red, green, and blue to the OMI ground pixels. Earth surface details studied are mostly high-contrast coast lines where arid land or desert meets deep blue ocean. The IDL high-resolution vector map is based on the 1993 CIA World Database II Map with a 1-km accuracy. Our results indicate that the average OMI geolocation offset over the years 2005-2006 is 0.79 km in latitude and 0.29 km in longitude, with a standard deviation of 1.64 km in latitude and 2.04 km in longitude, respectively. Relative to the OMI nadir pixel size, one obtains mean displacements of ˜6.1% in latitude and ˜1.2% in longitude, with standard deviations of 12.6% and 7.9%, respectively. We conclude that the geolocation assigned to individual OMI ground pixels is sufficiently accurate to support scientific studies of atmospheric features as observed in OMI level 2 satellite data products, such as air quality issues on urban scales or volcanic eruptions and their plumes, that occur on spatial scales comparable to or smaller than OMI nadir pixels.
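
    The quoted relative displacements follow directly from the mean offsets and the OMI nadir footprint, commonly quoted as roughly 13 km along-track (latitude) by 24 km across-track (longitude). A quick check of that arithmetic (the exact pixel dimensions are an assumption here, not stated in the abstract):

```python
# Reproduce the mean relative-displacement figures from the reported
# geolocation offsets, assuming an OMI nadir footprint of approximately
# 13 km along-track (latitude) by 24 km across-track (longitude).
PIXEL_KM = {"lat": 13.0, "lon": 24.0}

def relative_displacement(offset_km, axis):
    """Offset expressed as a percentage of the assumed nadir pixel size."""
    return 100.0 * offset_km / PIXEL_KM[axis]

print(round(relative_displacement(0.79, "lat"), 1))  # → 6.1 (% of pixel)
print(round(relative_displacement(0.29, "lon"), 1))  # → 1.2 (% of pixel)
```

    The standard-deviation percentages depend more sensitively on the exact across-track footprint assumed, so they are not reproduced here.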

  14. Pavement management

    SciTech Connect

    Ross, F.R.; Connor, B.; Lytton, R.L.; Darter, M.I.; Shahin, M.Y.

    1982-01-01

    The 11 papers in this report deal with the following areas: effect of pavement roughness on vehicle fuel consumption; rational seasonal load restrictions and overload permits; state-level pavement monitoring program; data requirements for long-term monitoring of pavements as a basis for development of multiple regression relations; simplified pavement management at the network level; combined priority programming of maintenance and rehabilitation for pavement networks; Arizona pavement management system: Phase 2-verification of performance prediction models and development of data base; overview of paver pavement management system; economic analysis of field implementation of paver pavement management system; development of a statewide pavement maintenance management system; and prediction of pavement maintenance expenditure by using a statistical cost function.

  15. INDEPENDENT TECHNICAL REVIEW OF THE FOCUSED FEASIBILITY STUDY AND PROPOSED PLAN FOR DESIGNATED SOLID WASTE MANAGEMENT UNITS CONTRIBUTING TO THE SOUTHWEST GROUNDWATER PLUME AT THE PADUCAH GASEOUS DIFFUSION PLANT

    SciTech Connect

    Looney, B.; Eddy-Dilek, C.; Amidon, M.; Rossabi, J.; Stewart, L.

    2011-05-31

    The U.S. Department of Energy (DOE) is currently developing a Proposed Plan (PP) for remediation of designated sources of chlorinated solvents that contribute contamination to the Southwest (SW) Groundwater Plume at the Paducah Gaseous Diffusion Plant (PGDP) in Paducah, KY. The principal contaminants in the SW Plume are trichloroethene (TCE) and other volatile organic compounds (VOCs); these industrial solvents were used and disposed of in various facilities and locations at PGDP. In the SW Plume area, residual TCE sources are primarily in the fine-grained sediments of the Upper Continental Recharge System (UCRS), a partially saturated zone that delivers contaminants downward into the coarse-grained Regional Gravel Aquifer (RGA). The RGA serves as the significant lateral groundwater transport pathway for the plume. In the SW Plume area, the four main contributing TCE source units are: (1) Solid Waste Management Unit (SWMU) 1 / Oil Landfarm; (2) C-720 Building TCE Northeast Spill Site (SWMU 211A); (3) C-720 Building TCE Southeast Spill Site (SWMU 211B); and (4) C-747 Contaminated Burial Yard (SWMU 4). The PP presents the Preferred Alternatives for remediation of VOCs in the UCRS at the Oil Landfarm and the C-720 Building spill sites. The basis for the PP is documented in a Focused Feasibility Study (FFS) (DOE, 2011) and a Site Investigation Report (SI) (DOE, 2007). The SW Plume is currently within the boundaries of PGDP (i.e., does not extend off-site). Nonetheless, reasonable mitigation of the multiple contaminant sources contributing to the SW Plume is one of the necessary components identified in the PGDP End State Vision (DOE, 2005). Because of the importance of the proposed actions, DOE assembled an Independent Technical Review (ITR) team to provide input and assistance in finalizing the PP.

  16. Chartering an Experience Bank: A Guide to Forming a Local Association of Independent Businessmen.

    ERIC Educational Resources Information Center

    Center for Venture Management, Milwaukee, WI.

    This monograph briefly explores the nature of independent business and the entrepreneur turned business manager, and presents the concept of a Council of Independent Businessmen for the advancement of independent business managers. A variety of experiences contribute to learning of any kind. The entrepreneur-manager should develop some means to…

  17. MACCS2 development and verification efforts

    SciTech Connect

    Young, M.; Chanin, D.

    1997-03-01

    MACCS2 represents a major enhancement of the capabilities of its predecessor MACCS, the MELCOR Accident Consequence Code System. MACCS, released in 1987, was developed to estimate the potential impacts to the surrounding public of severe accidents at nuclear power plants. The principal phenomena considered in MACCS/MACCS2 are atmospheric transport and deposition under time-variant meteorology, short-term and long-term mitigative actions and exposure pathways, deterministic and stochastic health effects, and economic costs. MACCS2 was developed as a general-purpose analytical tool applicable to diverse reactor and nonreactor facilities. The MACCS2 package includes three primary enhancements: (1) a more flexible emergency response model, (2) an expanded library of radionuclides, and (3) a semidynamic food-chain model. In addition, errors that had been identified in MACCS version 1.5.11.1 were corrected, including an error that prevented the code from providing intermediate-phase results. MACCS2 version 1.10 was released to the beta-test group in May 1995. In addition, the University of New Mexico (UNM) has completed an independent verification study of the code package. Since the beta-test release of MACCS2 version 1.10, a number of minor errors have been identified and corrected, and a number of enhancements have been added to the code package. The code enhancements added since the beta-test release of version 1.10 include: (1) an option to allow the user to input the {sigma}{sub y} and {sigma}{sub z} plume expansion parameters in a table-lookup form for incremental downwind distances, (2) an option to define different initial dimensions for up to four segments of a release, (3) an enhancement to the COMIDA2 food-chain model preprocessor to allow the user to supply externally calculated tables of tritium food-chain dose per unit deposition on farmland to support analyses of tritium releases, and (4) the capability to calculate direction-dependent doses.
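Enhancement (1) above, a table lookup of the sigma-y and sigma-z plume expansion parameters at incremental downwind distances, amounts to piecewise-linear interpolation between user-supplied table entries. A minimal sketch follows; the distances and sigma values are illustrative placeholders, not MACCS2 data, and the interpolation scheme is an assumption for the sketch.

```python
# Hypothetical sketch of a sigma-vs-downwind-distance table lookup with linear
# interpolation, clamped to the table endpoints outside its range.
from bisect import bisect_right

def sigma_lookup(x_km, distances_km, sigmas_m):
    """Interpolate a plume expansion parameter at downwind distance x_km."""
    if x_km <= distances_km[0]:
        return sigmas_m[0]
    if x_km >= distances_km[-1]:
        return sigmas_m[-1]
    i = bisect_right(distances_km, x_km) - 1  # last table entry at or below x_km
    frac = (x_km - distances_km[i]) / (distances_km[i + 1] - distances_km[i])
    return sigmas_m[i] + frac * (sigmas_m[i + 1] - sigmas_m[i])

# Illustrative sigma-y table (downwind distance in km -> sigma-y in m):
dists = [0.1, 1.0, 10.0, 100.0]
sig_y = [10.0, 80.0, 550.0, 4000.0]
```

In practice such a table would be evaluated at each plume segment's downwind distance in place of a built-in dispersion parameterization.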

  18. Monitoring/Verification using DMS: TATP Example

    SciTech Connect

    Stephan Weeks; Kevin Kyle

    2008-03-01

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a 'smart dust' sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the use of explosives or chemical and biological weapons in terrorist activities. Two peroxide-based liquid explosives, triacetone triperoxide (TATP) and hexamethylene triperoxide diamine (HMTD), are synthesized from common chemicals such as hydrogen peroxide, acetone, sulfuric acid, ammonia, and citric acid (Figure 1). Recipes can be readily found on the Internet by anyone seeking to generate sufficient quantities of these highly explosive chemicals to cause considerable collateral damage. Detection of TATP and HMTD by advanced sensing systems can provide the early warning necessary to prevent terror plots from coming to fruition. DMS is currently one of the foremost emerging technologies for the separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. DMS separates and identifies ions at ambient pressures by utilizing the non-linear dependence of an ion's mobility on the radio frequency (rf) electric field strength. GC is widely considered to be one of the leading analytical methods for the separation of chemical species in complex mixtures. Advances in the technique have led to the development of low-thermal-mass fast GC columns. These columns are capable of
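The non-linear field dependence of ion mobility that DMS exploits is commonly modeled with an alpha-function expansion, K(E) = K0(1 + a2(E/N)^2 + a4(E/N)^4); under an area-balanced asymmetric rf waveform, this field dependence produces a species-specific net drift that a small compensation field can null. The sketch below illustrates that principle only; the alpha coefficients and waveform values are invented for illustration and carry no physical units.

```python
# Hedged sketch of the DMS separation principle: field-dependent mobility plus
# an asymmetric, area-balanced waveform yields a nonzero net drift per cycle.

def mobility(k0, e_over_n, a2=1e-5, a4=-1e-10):
    """Field-dependent mobility via a truncated alpha-function expansion."""
    return k0 * (1.0 + a2 * e_over_n**2 + a4 * e_over_n**4)

def net_drift_per_cycle(k0, e_high, e_low, t_high, t_low, a2=1e-5, a4=-1e-10):
    """Net ion displacement over one rf cycle (arbitrary units).

    The waveform is area-balanced (e_high * t_high == -e_low * t_low), so a
    field-independent mobility (a2 = a4 = 0) gives exactly zero net drift;
    any field dependence leaves a residual drift the DMS compensates for.
    """
    return (mobility(k0, e_high, a2, a4) * e_high * t_high
            + mobility(k0, e_low, a2, a4) * e_low * t_low)
```

Scanning the DC compensation field that restores zero net drift, species by species, is what produces the DMS separation.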

  19. Agent independent task planning

    NASA Technical Reports Server (NTRS)

    Davis, William S.

    1990-01-01

    Agent-Independent Planning is a technique that allows the construction of activity plans without regard to the agent that will perform them. Once generated, a plan is then validated and translated into instructions for a particular agent, whether a robot, crewmember, or software-based control system. Because Space Station Freedom (SSF) is planned for orbital operations for approximately thirty years, it will almost certainly experience numerous enhancements and upgrades, including upgrades in robotic manipulators. Agent-Independent Planning provides the capability to construct plans for SSF operations, independent of specific robotic systems, by combining techniques of object-oriented modeling, nonlinear planning, and temporal logic. Since a plan is validated using the physical and functional models of a particular agent, new robotic systems can be developed and integrated with existing operations in a robust manner. This technique also provides the capability to generate plans for crewmembers with varying skill levels, and later apply these same plans to more sophisticated robotic manipulators made available by advances in technology.
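The validate-then-translate step described above can be illustrated with a toy sketch: an abstract plan of agent-independent actions is checked against an agent's capability model and only then rendered into agent-specific instructions. Everything here (agent names, capability sets, action vocabulary) is hypothetical, not NASA's representation.

```python
# Toy illustration: abstract actions are validated against a per-agent
# capability model before being translated into concrete instructions.

AGENT_CAPABILITIES = {
    "robot_arm": {"grasp", "move", "release"},
    "crewmember": {"grasp", "move", "release", "inspect"},
}

def translate_plan(plan, agent):
    """Validate each abstract (action, target) pair, then emit instructions."""
    caps = AGENT_CAPABILITIES[agent]
    unsupported = [action for action, _ in plan if action not in caps]
    if unsupported:
        raise ValueError(f"{agent} cannot perform: {unsupported}")
    return [f"{agent}: {action} {target}" for action, target in plan]

# The same abstract plan can be bound to any agent that passes validation:
plan = [("grasp", "ORU-3"), ("move", "bay-2"), ("release", "ORU-3")]
```

A new robotic system would be integrated by adding its capability (and physical/functional) model, with no change to existing plans.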

  20. PBL Verification with Radiosonde and Aircraft Data

    NASA Astrophysics Data System (ADS)

    Tsidulko, M.; McQueen, J.; Dimego, G.; Ek, M.

    2008-12-01

    Boundary layer depth is an important characteristic in weather forecasting and a key parameter in air quality modeling, determining the extent of turbulence and dispersion of pollutants. Real-time PBL depths from the NAM (WRF/NMM) model are verified against different types of observations. PBL depth verification is incorporated into the NCEP verification system, including the ability to provide a range of statistical characteristics for the boundary layer heights. For the model, several boundary layer definitions are used: PBL height from the TKE scheme, the critical Ri number approach, and mixed layer depth are compared with observations. Observed PBL depths are determined by applying the Ri number approach to radiosonde profiles. A preliminary study of using ACARS aircraft data for PBL verification is also conducted.
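The bulk Richardson number approach mentioned above diagnoses PBL depth from a sounding by scanning upward from the surface and marking the first level where Ri_b reaches a critical value (0.25 is a common choice, assumed here). A minimal sketch under those standard assumptions, with illustrative profile arrays, not the NCEP implementation:

```python
# Hedged sketch: PBL height as the first sounding level where the bulk
# Richardson number Ri_b = g * (thetav - thetav_sfc) * (z - z_sfc)
#                          / (thetav_sfc * (u^2 + v^2))
# meets or exceeds a critical value.
G = 9.81  # gravitational acceleration, m s^-2

def bulk_richardson_pbl(z, thetav, u, v, ri_crit=0.25):
    """Return the height (m) of the first level where Ri_b >= ri_crit.

    z: heights above ground (m); thetav: virtual potential temperature (K);
    u, v: wind components (m/s); index 0 is the surface level.
    """
    for i in range(1, len(z)):
        shear2 = max(u[i] ** 2 + v[i] ** 2, 1e-6)  # guard against calm layers
        ri = G * (thetav[i] - thetav[0]) * (z[i] - z[0]) / (thetav[0] * shear2)
        if ri >= ri_crit:
            return z[i]
    return z[-1]  # no level exceeded the threshold within the profile
```

Applied to radiosonde profiles this yields the "observed" PBL depth against which the model definitions (TKE scheme, critical Ri, mixed layer depth) are verified.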