Conducting Research from Small University Observatories: Investigating Exoplanet Candidates
NASA Astrophysics Data System (ADS)
Moreland, Kimberly D.
2018-01-01
Kepler has to date discovered 4,496 exoplanet candidates, but only half are confirmed, and only a handful are thought to be Earth sized and in the habitable zone. Planet verification often involves extensive follow-up observations, which are both time and resource intensive. The data set collected by Kepler is massive and will be studied for decades. University and other small observatories, such as the one at Texas State University, are in a good position to assist with the exoplanet candidate verification process. By performing extended monitoring campaigns, which are otherwise cost ineffective for larger observatories, students gain valuable research experience and contribute useful data and results to the scientific community.
Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems
NASA Technical Reports Server (NTRS)
Berg, Melanie; LaBel, Kenneth A.
2016-01-01
We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: Field programmable gate array (FPGA), Triple Modular Redundancy (TMR), Verification, Trust, Reliability.
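To make the redundancy pattern behind this abstract concrete, the following is a minimal, hypothetical majority-voter sketch in Python; it illustrates the general TMR voting topology only and is not the verification method proposed by the authors.

```python
def tmr_vote(a, b, c):
    """Majority vote over three redundant copies of a result.

    Returns the value agreed on by at least two copies; raises if all
    three disagree (an uncorrectable fault for a single TMR voter).
    """
    if a == b or a == c:
        return a
    if b == c:
        return b
    raise ValueError("all three redundant copies disagree")


def run_redundant(fn, *args):
    """Execute the same computation three times and vote on the result.

    In an FPGA the three copies would be physically separate logic; here
    the triplication is only simulated to illustrate the topology.
    """
    return tmr_vote(fn(*args), fn(*args), fn(*args))


if __name__ == "__main__":
    print(run_redundant(lambda x: x * 2 + 1, 20))  # -> 41
```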
The MCNP6 Analytic Criticality Benchmark Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
2016-06-16
Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from the set of benchmark experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. For the remaining problems, many of them were reformulated to permit execution in either multigroup mode or in the normal continuous-energy mode for MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance to MCNP verification methods.
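As a schematic illustration of the code-verification step described above, the sketch below compares computed k-eff values against exact analytical benchmark values within a tolerance; the benchmark names, values, and tolerance are placeholders, not data from the MCNP suite.

```python
# Hypothetical verification harness: compare code results against exact
# analytical benchmark solutions (all values below are placeholders).
analytic_keff = {"bench-A": 1.000000, "bench-B": 0.900000}
computed_keff = {"bench-A": 0.999973, "bench-B": 0.900051}

TOLERANCE = 1.0e-4  # acceptable |computed - analytic| for a "pass"

for name, exact in analytic_keff.items():
    diff = computed_keff[name] - exact
    status = "PASS" if abs(diff) <= TOLERANCE else "FAIL"
    print(f"{name}: k_exact={exact:.6f} k_calc={computed_keff[name]:.6f} "
          f"diff={diff:+.2e} {status}")
```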
Working Memory Mechanism in Proportional Quantifier Verification
ERIC Educational Resources Information Center
Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria
2014-01-01
The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…
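For concreteness, here is a minimal sketch of what verifying a proportional quantifier amounts to computationally; the display and counts are illustrative, not the study's stimuli.

```python
def verify_more_than_half(dots, color="blue"):
    """Return True if strictly more than half of the dots have the color."""
    matching = sum(1 for d in dots if d == color)
    return matching > len(dots) / 2

# Example display: 8 blue and 7 yellow dots -> the sentence is true.
display = ["blue"] * 8 + ["yellow"] * 7
print(verify_more_than_half(display))  # True
```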
Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools
NASA Technical Reports Server (NTRS)
Bis, Rachael; Maul, William A.
2015-01-01
Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone due to the intensive and customized process necessary to verify each individual component model and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
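As an illustration of the kind of automated check such a tool might perform, the sketch below verifies reachability of a failure effect in a toy directed graph; the node names and graph are hypothetical, not the NASA FFMs or tools.

```python
from collections import deque

# Hypothetical functional fault model: failure effects propagate along edges.
ffm_edges = {
    "valve_stuck": ["low_flow"],
    "low_flow": ["pump_cavitation", "low_pressure_alarm"],
    "pump_cavitation": ["pump_failure"],
    "low_pressure_alarm": [],
    "pump_failure": [],
}

def reachable_effects(graph, failure_mode):
    """Breadth-first search for every effect reachable from a failure mode."""
    seen, queue = set(), deque([failure_mode])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Verification-style check: the model must show that a stuck valve can
# eventually manifest as a pump failure.
assert "pump_failure" in reachable_effects(ffm_edges, "valve_stuck")
```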
Verification of Triple Modular Redundancy Insertion for Reliable and Trusted Systems
NASA Technical Reports Server (NTRS)
Berg, Melanie; LaBel, Kenneth
2016-01-01
If a system is required to be protected using triple modular redundancy (TMR), improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process and the complexity of digital designs, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems.
Design and Realization of Controllable Ultrasonic Fault Detector Automatic Verification System
NASA Astrophysics Data System (ADS)
Sun, Jing-Feng; Liu, Hui-Ying; Guo, Hui-Juan; Shu, Rong; Wei, Kai-Li
Ultrasonic flaw detection equipment with a remote control interface is investigated, and an automatic verification system is developed. By using extensible markup language (XML) to build the protocol instruction set and the data-analysis method database in the system software, the design is made controllable and the diversity of proprietary device interfaces and protocols is accommodated. By cascading a signal generator with a fixed attenuator, a dynamic error compensation method is proposed that performs the role of the fixed attenuator in traditional verification and improves the accuracy of the verification results. Operating results of the automatic verification system confirm the feasibility of the hardware and software architecture design and the correctness of the analysis method, while eliminating the cumbersome operations of the traditional verification process and reducing the labor intensity for test personnel.
DOE Office of Scientific and Technical Information (OSTI.GOV)
T.J. Vitkus
2008-06-25
The objectives of the verification survey were to confirm that accessible surfaces of the three laboratories meet the DOE’s established criteria for residual contamination. Drain pipes and ductwork were not included within the survey scope.
Code of Federal Regulations, 2010 CFR
2010-07-01
... confirmation: A process by which the Secretary, by means of a matching program conducted with the INS, compares... records of that status maintained by the INS in its Alien Status Verification Index (ASVI) system for the... the INS, in response to the submission of INS Document Verification Form G-845 by an institution...
Verification Processes in Recognition Memory: The Role of Natural Language Mediators
ERIC Educational Resources Information Center
Marshall, Philip H.; Smith, Randolph A. S.
1977-01-01
The existence of verification processes in recognition memory was confirmed in the context of Adams' (Adams & Bray, 1970) closed-loop theory. Subjects' recognition was tested following a learning session. The expectation was that data would reveal consistent internal relationships supporting the position that natural language mediation plays…
HDL to verification logic translator
NASA Technical Reports Server (NTRS)
Gambles, J. W.; Windley, P. J.
1992-01-01
The ever-increasing number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.
An Efficient Location Verification Scheme for Static Wireless Sensor Networks.
Kim, In-Hwan; Kim, Bo-Sung; Song, JooSeok
2017-01-24
In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect location estimation process from attacks, they are not enough to eliminate the wrong location estimations in some situations. The location verification can be the solution to the situations or be the second-line defense. The problem of most of the location verifications is the explicit involvement of many sensors in the verification process and requirements, such as special hardware, a dedicated verifier and the trusted third party, which causes more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSN called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between location claimant and verifier for the location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with the other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect the malicious sensors by over 90% when sensors in the network have five or more neighbors.
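A toy sketch of the geometric idea behind region-based location verification follows (hypothetical coordinates and radio range, not the MSRLV protocol itself): a verifier accepts a claimed position only if it lies in the region the claimant and verifier can mutually reach.

```python
import math

def in_shared_region(claimed_xy, verifier_xy, radio_range):
    """Accept the claim only if it lies within the verifier's radio range,
    i.e., inside the region the claimant and verifier can mutually reach."""
    dx = claimed_xy[0] - verifier_xy[0]
    dy = claimed_xy[1] - verifier_xy[1]
    return math.hypot(dx, dy) <= radio_range

# Hypothetical example: a claim at (35, 40) is accepted by a verifier at
# (30, 30) with a 15 m range, while a claim at (80, 80) is rejected.
print(in_shared_region((35, 40), (30, 30), 15))  # True
print(in_shared_region((80, 80), (30, 30), 15))  # False
```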
An Efficient Location Verification Scheme for Static Wireless Sensor Networks
Kim, In-hwan; Kim, Bo-sung; Song, JooSeok
2017-01-01
In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect location estimation process from attacks, they are not enough to eliminate the wrong location estimations in some situations. The location verification can be the solution to the situations or be the second-line defense. The problem of most of the location verifications is the explicit involvement of many sensors in the verification process and requirements, such as special hardware, a dedicated verifier and the trusted third party, which causes more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSN called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between location claimant and verifier for the location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with the other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect the malicious sensors by over 90% when sensors in the network have five or more neighbors. PMID:28125007
Study of techniques for redundancy verification without disrupting systems, phases 1-3
NASA Technical Reports Server (NTRS)
1970-01-01
The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.
Student-Teacher Linkage Verification: Model Process and Recommendations
ERIC Educational Resources Information Center
Watson, Jeffery; Graham, Matthew; Thorn, Christopher A.
2012-01-01
As momentum grows for tracking the role of individual educators in student performance, school districts across the country are implementing projects that involve linking teachers to their students. Programs that link teachers to student outcomes require a verification process for student-teacher linkages. Linkage verification improves accuracy by…
Why Verifying Diagnostic Decisions with a Checklist Can Help: Insights from Eye Tracking
ERIC Educational Resources Information Center
Sibbald, Matthew; de Bruin, Anique B. H.; Yu, Eric; van Merrienboer, Jeroen J. G.
2015-01-01
Making a diagnosis involves ratifying or verifying a proposed answer. Formalizing this verification process with checklists, which highlight key variables involved in the diagnostic decision, is often advocated. However, the mechanisms by which a checklist might allow clinicians to improve their verification process have not been well studied. We…
Systematic Model-in-the-Loop Test of Embedded Control Systems
NASA Astrophysics Data System (ADS)
Krupp, Alexander; Müller, Wolfgang
Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural language formalization, there is no standardized and accepted means for formal property definition to serve as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.
Comparative Cognitive Task Analyses of Experimental Science and Instructional Laboratory Courses
NASA Astrophysics Data System (ADS)
Wieman, Carl
2015-09-01
Undergraduate instructional labs in physics generate intense opinions. Their advocates are passionate as to their importance for teaching physics as an experimental activity and providing "hands-on" learning experiences, while their detractors (often but not entirely students) offer harsh criticisms that they are pointless, confusing and unsatisfying, and "cookbook." Here, both to help understand the reason for such discrepant views and to aid in the design of instructional lab courses, I compare the mental tasks or types of thinking ("cognitive task analysis") associated with a physicist doing tabletop experimental research with the cognitive tasks of students in an introductory physics instructional lab involving traditional verification/confirmation exercises.
NASA Astrophysics Data System (ADS)
Bańkowski, Wojciech; Król, Jan; Gałązka, Karol; Liphardt, Adam; Horodecka, Renata
2018-05-01
Recycling of bituminous pavements is an issue increasingly being discussed in Poland. The analysis of domestic and foreign experience indicates a need to develop this technology in our country, in particular hot feeding and production technologies. Various steps are being taken in this direction, including research projects. One of them is the InnGA project entitled: “Reclaimed asphalt pavement: Innovative technology of bituminous mixtures using material from reclaimed asphalt pavement”. The paper presents the results of research involving the design of bituminous mixtures that meet the required properties while exceeding the reclaimed asphalt content permitted by the technical guidelines. It presents selected bituminous mixtures with a RAP content of up to 50% and the results of tests verifying the industrial production of those mixtures. The article discusses the details of the design process for mixtures with a high content of reclaimed asphalt and the production trials carried out, and reviews the results of the industrial production verification tests. Testing included basic tests according to the Polish technical requirements of WT-2 as well as extended functional testing. The conducted tests and analyses helped to determine the usefulness of the developed bituminous mixtures for use in experimental sections and confirmed the possibility of using an increased amount of reclaimed asphalt, up to 50%, in mixtures intended for construction of national roads.
Simulation verification techniques study
NASA Technical Reports Server (NTRS)
Schoonmaker, P. B.; Wenglinski, T. H.
1975-01-01
Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.
The capability of lithography simulation based on MVM-SEM® system
NASA Astrophysics Data System (ADS)
Yoshikawa, Shingo; Fujii, Nobuaki; Kanno, Koichi; Imai, Hidemichi; Hayano, Katsuya; Miyashita, Hiroyuki; Shida, Soichi; Murakawa, Tsutomu; Kuribara, Masayuki; Matsumoto, Jun; Nakamura, Takayuki; Matsushita, Shohei; Hara, Daisuke; Pang, Linyong
2015-10-01
Lithography at the 1X nm technology node uses SMO-ILT, NTD, or even more complex patterns. Therefore, in mask defect inspection, defect verification becomes more difficult because many nuisance defects are detected on aggressive mask features. A key technology in mask manufacturing is defect verification using an aerial image simulator or other printability simulation. AIMS™ technology correlates excellently with the wafer and is the standard tool for defect verification; however, it is difficult to use when a hundred or more defects must be verified. We previously reported the capability of defect verification based on lithography simulation with a SEM system whose architecture and software show excellent correlation for simple line-and-space patterns [1]. In this paper, we use a next-generation SEM system combined with a lithography simulation tool for SMO-ILT, NTD, and other complex pattern lithography. Furthermore, we use three-dimensional (3D) lithography simulation based on the Multi Vision Metrology SEM (MVM-SEM®) system. Finally, we confirm the performance of the 2D and 3D lithography simulation based on the SEM system for photomask verification.
Experimental evaluation of fingerprint verification system based on double random phase encoding
NASA Astrophysics Data System (ADS)
Suzuki, Hiroyuki; Yamaguchi, Masahiro; Yachida, Masuyoshi; Ohyama, Nagaaki; Tashima, Hideaki; Obi, Takashi
2006-03-01
We proposed a smart card holder authentication system that combines fingerprint verification with PIN verification by applying a double random phase encoding scheme. In this system, the probability of accurate verification of an authorized individual decreases when the fingerprint is shifted significantly. In this paper, a review of the proposed system is presented and a preprocessing step for improving the false rejection rate is proposed. In the proposed method, the position difference between two fingerprint images is estimated by using an optimized template for core detection. When the estimated difference exceeds the permissible level, the user inputs the fingerprint again. The effectiveness of the proposed method is confirmed by a computational experiment; its results show that the false rejection rate is improved.
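The preprocessing described above amounts to estimating the translation between two fingerprint images; below is a minimal FFT cross-correlation sketch on synthetic data, assuming nothing about the authors' optimized core-detection template.

```python
import numpy as np

def estimate_shift(reference, probe):
    """Estimate the (row, col) shift between two equally sized images
    using the peak of their FFT-based cross-correlation."""
    f_ref = np.fft.fft2(reference)
    f_prb = np.fft.fft2(probe)
    xcorr = np.fft.ifft2(f_ref * np.conj(f_prb)).real
    peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
    # Wrap indices larger than half the image size back to negative shifts.
    shift = [int(p) if p <= s // 2 else int(p - s)
             for p, s in zip(peak, xcorr.shape)]
    return tuple(shift)

# Synthetic check: shift an image by (5, -3) and recover the offset.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
moved = np.roll(np.roll(img, 5, axis=0), -3, axis=1)
print(estimate_shift(moved, img))  # (5, -3)
```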
Code Verification Results of an LLNL ASC Code on Some Tri-Lab Verification Test Suite Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, S R; Bihari, B L; Salari, K
As scientific codes become more complex and involve larger numbers of developers and algorithms, chances for algorithmic implementation mistakes increase. In this environment, code verification becomes essential to building confidence in the code implementation. This paper will present first results of a new code verification effort within LLNL's B Division. In particular, we will show results of code verification of the LLNL ASC ARES code on the test problems: Su Olson non-equilibrium radiation diffusion, Sod shock tube, Sedov point blast modeled with shock hydrodynamics, and Noh implosion.
The purpose of this SOP is to define the steps involved in data entry and data verification of physical forms. It applies to the data entry and data verification of all physical forms. The procedure defined herein was developed for use in the Arizona NHEXAS project and the "Bor...
Murias, Juan M; Pogliaghi, Silvia; Paterson, Donald H
2018-01-01
The accuracy of an exhaustive ramp incremental (RI) test to determine maximal oxygen uptake (V̇O2max) was recently questioned and the utilization of a verification phase proposed as a gold standard. This study compared the oxygen uptake (V̇O2) during a RI test to that obtained during a verification phase aimed to confirm attainment of V̇O2max. Sixty-one healthy males [31 older (O) 65 ± 5 yrs; 30 younger (Y) 25 ± 4 yrs] performed a RI test (15-20 W/min for O and 25 W/min for Y). At the end of the RI test, a 5-min recovery period was followed by a verification phase of constant load cycling to fatigue at either 85% (n = 16) or 105% (n = 45) of the peak power output obtained from the RI test. The highest V̇O2 after the RI test (39.8 ± 11.5 mL·kg⁻¹·min⁻¹) and the verification phase (40.1 ± 11.2 mL·kg⁻¹·min⁻¹) were not different (p = 0.33) and they were highly correlated (r = 0.99; p < 0.01). This response was not affected by age or intensity of the verification phase. The Bland-Altman analysis revealed a very small absolute bias (-0.25 mL·kg⁻¹·min⁻¹, not different from 0) and a precision of ±1.56 mL·kg⁻¹·min⁻¹ between measures. This study indicated that a verification phase does not highlight an under-estimation of V̇O2max derived from a RI test, in a large and heterogeneous group of healthy younger and older men naïve to laboratory testing procedures. Moreover, only minor within-individual differences were observed between the maximal V̇O2 elicited during the RI and the verification phase. Thus a verification phase does not add any validation to the determination of V̇O2max. Therefore, the recommendation that a verification phase should become a gold standard procedure, although initially appealing, is not supported by the experimental data.
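The bias and precision reported above follow the standard Bland-Altman computation; a generic sketch with made-up paired values is shown below.

```python
import numpy as np

# Hypothetical paired measurements (mL·kg^-1·min^-1): ramp test vs. verification.
vo2_ramp = np.array([40.2, 35.1, 52.3, 28.7, 44.0])
vo2_verif = np.array([40.5, 34.8, 52.0, 29.1, 44.3])

diff = vo2_verif - vo2_ramp
bias = diff.mean()                           # mean difference between measures
sd = diff.std(ddof=1)                        # SD of the differences
loa = (bias - 1.96 * sd, bias + 1.96 * sd)   # 95% limits of agreement

print(f"bias = {bias:+.2f}, limits of agreement = "
      f"[{loa[0]:+.2f}, {loa[1]:+.2f}] mL·kg^-1·min^-1")
```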
A Secure Framework for Location Verification in Pervasive Computing
NASA Astrophysics Data System (ADS)
Liu, Dawei; Lee, Moon-Chuen; Wu, Dan
The way people use computing devices has been changed in some way by the relatively new pervasive computing paradigm. For example, a person can use a mobile device to obtain its location information at anytime and anywhere. There are several security issues concerning whether this information is reliable in a pervasive environment. For example, a malicious user may disable the localization system by broadcasting a forged location, and it may impersonate other users by eavesdropping their locations. In this paper, we address the verification of location information in a secure manner. We first present the design challenges for location verification, and then propose a two-layer framework VerPer for secure location verification in a pervasive computing environment. Real world GPS-based wireless sensor network experiments confirm the effectiveness of the proposed framework.
Comments for A Conference on Verification in the 21st Century
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doyle, James E.
2012-06-12
The author offers 5 points for the discussion of Verification and Technology: (1) Experience with the implementation of arms limitation and arms reduction agreements confirms that technology alone has never been relied upon to provide effective verification. (2) The historical practice of verification of arms control treaties between Cold War rivals may constrain the cooperative and innovative use of technology for transparency, verification and confidence building in the future. (3) An area that has been identified by many, including the US State Department and NNSA, as being rich for exploration for potential uses of technology for transparency and verification is information and communications technology (ICT). This includes social media, crowd-sourcing, the internet of things, and the concept of societal verification, but there are issues. (4) On the issue of the extent to which verification technologies are keeping pace with the demands of future protocols and agreements, I think the more direct question is ''are they effective in supporting the objectives of the treaty or agreement?'' In this regard it is important to acknowledge that there is a verification grand challenge at our doorstep. That is ''how does one verify limitations on nuclear warheads in national stockpiles?'' (5) Finally, while recognizing the daunting political and security challenges of such an approach, multilateral engagement and cooperation at the conceptual and technical levels provides benefits for addressing future verification challenges.
What is the Final Verification of Engineering Requirements?
NASA Technical Reports Server (NTRS)
Poole, Eric
2010-01-01
This slide presentation reviews the process of development through the final verification of engineering requirements. The definition of the requirements is driven by basic needs, and should be reviewed by both the supplier and the customer. All involved need to agree upon a formal set of requirements, including changes to the original requirements document. After the requirements have been developed, the engineering team begins to design the system. The final design is reviewed by other organizations. The final operational system must satisfy the original requirements, though many verifications should be performed during the process. The verification methods that are used are test, inspection, analysis and demonstration. The plan for verification should be created once the system requirements are documented. The plan should include assurances that every requirement is formally verified, that the methods and the responsible organizations are specified, and that the plan is reviewed by all parties. The option of having the engineering team involved in all phases of the development, as opposed to having some other organization continue the process once the design is complete, is also discussed.
Ascertainment and verification of diabetes in the EPIC-NL study.
Sluijs, I; van der A, D L; Beulens, J W J; Spijkerman, A M W; Ros, M M; Grobbee, D E; van der Schouw, Y T
2010-08-01
The objectives of this study were to describe in detail the ascertainment and verification of prevalent and incident diabetes in the Dutch contributor to the European Prospective Investigation into Cancer and Nutrition (EPIC-NL cohort) and to examine to what extent ascertained diabetes agreed with general practitioner (GP) and pharmacy records. In total, 40,011 adults, aged 21 to 70 years at baseline, were included. Diabetes was ascertained via self-report, linkage to registers of hospital discharge diagnoses (HDD) and a urinary glucose strip test. Ascertained diabetes cases were verified against GP or pharmacist information using mailed questionnaires. At baseline, 795 (2.0%) diabetes cases were ascertained, and 1494 (3.7%) during a mean follow-up of ten years. The majority was ascertained via self-report only (56.7%), or self-report in combination with HDD (18.0%). After verification of ascertained diabetes cases, 1532 (66.9%) [corrected] were defined as having diabetes, 495 (21.6%) as non-diabetic individuals, and 262 (11.5%) as uncertain. Of the 1538 cases ascertained by self-report, 1350 (positive predictive value: 87.8%) were confirmed by GP or pharmacist. Cases ascertained via self-report in combination with HDD were most often confirmed (334 (positive predictive value: 96.0%)). Two out of three ascertained diabetes cases were confirmed to have been diagnosed with diabetes by their GP or pharmacist. Diabetes cases ascertained via self-report in combination with HDD had the highest confirmation.
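The positive predictive values quoted above follow directly from the confirmed and ascertained counts; a small sketch of the calculation, using the self-report numbers from the abstract, is given below.

```python
def positive_predictive_value(confirmed, total_ascertained):
    """PPV = confirmed cases / all cases ascertained by the method."""
    return confirmed / total_ascertained

# Self-reported diabetes cases confirmed by GP or pharmacist (from the abstract).
print(f"{positive_predictive_value(1350, 1538):.1%}")  # ~87.8%
```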
NASA Astrophysics Data System (ADS)
Kim, Youngmi; Choi, Jae-Young; Choi, Kwangseon; Choi, Jung-Hoe; Lee, Sooryong
2011-04-01
As IC design complexity keeps increasing, it becomes more and more difficult to ensure pattern transfer after optical proximity correction (OPC), due to the continuous reduction of layout dimensions and the lithographic limitations imposed by the k1 factor. To guarantee imaging fidelity, resolution enhancement technologies (RET) such as off-axis illumination (OAI), different types of phase shift masks, and OPC techniques have been developed. For model-based OPC, post-OPC verification solutions that cross-check the contour image against the target layout continue to be developed: methods for contour generation and matching to the target structure, and methods for filtering and sorting patterns to eliminate false errors and duplicate patterns. Detecting only real errors while excluding false errors is the key to an accurate and fast verification process, saving not only review time and engineering resources but also overall wafer process time. In the typical case of post-OPC verification for the metal-contact/via coverage (CC) check, the verification solution outputs a huge number of errors because of borderless design, so it is practically impossible to review and correct all of them. This can cause the OPC engineer to miss real defects and, at the least, delay time to market. In this paper, we studied a method for increasing the efficiency of post-OPC verification, especially for the CC check. For metal layers, the final CD after the etch process shows various CD biases depending on the distance to neighboring patterns, so it is more reasonable to consider the final metal shape when confirming contact/via coverage. By optimizing the biasing rule for different pitches and shapes of metal lines, we obtained more accurate and efficient verification results and reduced the review time needed to find real errors. This paper presents a suggestion for increasing the efficiency of the OPC verification process by applying a simple biasing rule to the metal layout instead of applying an etch model.
International Cooperative for Aerosol Prediction Workshop on Aerosol Forecast Verification
NASA Technical Reports Server (NTRS)
Benedetti, Angela; Reid, Jeffrey S.; Colarco, Peter R.
2011-01-01
The purpose of this workshop was to reinforce the working partnership between centers that are actively involved in global aerosol forecasting, and to discuss issues related to forecast verification. Participants included representatives from operational centers with global aerosol forecasting requirements, a panel of experts on Numerical Weather Prediction and Air Quality forecast verification, data providers, and several observers from the research community. The presentations centered on a review of current NWP and AQ practices, with subsequent discussion focused on the challenges in defining appropriate verification measures for the next generation of aerosol forecast systems.
Fluorescence In Situ Hybridization Probe Validation for Clinical Use.
Gu, Jun; Smith, Janice L; Dowling, Patricia K
2017-01-01
In this chapter, we provide a systematic overview of the published guidelines and validation procedures for fluorescence in situ hybridization (FISH) probes for clinical diagnostic use. FISH probes, which are classified as molecular probes or analyte-specific reagents (ASRs), have been extensively used in vitro for both clinical diagnosis and research. Most commercially available FISH probes in the United States are strictly regulated by the U.S. Food and Drug Administration (FDA), the Centers for Disease Control and Prevention (CDC), the Centers for Medicare & Medicaid Services (CMS), the Clinical Laboratory Improvement Amendments (CLIA), and the College of American Pathologists (CAP). Although home-brewed FISH probes, defined as probes made in-house or acquired from a source that does not supply them to other laboratories, are not regulated by these agencies, they too must undergo the same individual validation process prior to clinical use as their commercial counterparts. Validation of a FISH probe involves initial validation and ongoing verification of the test system. Initial validation includes assessment of a probe's technical specifications, establishment of its standard operational procedure (SOP), determination of its clinical sensitivity and specificity, development of its cutoff, baseline, and normal reference ranges, gathering of analytics, confirmation of its applicability to a specific research or clinical setting, testing of samples with or without the abnormalities that the probe is meant to detect, staff training, and report building. Ongoing verification of the test system involves testing additional normal and abnormal samples using the same method employed during the initial validation of the probe.
The U.S. Environmental Protection Agency has created the Environmental Technology Verification Program to facilitate the deployment of innovative or improved environmental technologies through high quality, peer reviewed data on technology performance to those involved in the des...
NASA Astrophysics Data System (ADS)
Wei, T. B.; Chen, Y. L.; Lin, H. R.; Huang, S. Y.; Yeh, T. C. J.; Wen, J. C.
2016-12-01
In groundwater studies, many researchers have used hydraulic tomography (HT) with field-site pumping tests to estimate, by inverse analysis, the heterogeneous spatial distribution of hydraulic properties, showing that most field-site aquifers have heterogeneous spatial distributions of hydrogeological parameters. Huang et al. [2011] used non-redundant verification analysis, with pumping-well locations changed and observation-well locations fixed, in the inverse and forward analyses to demonstrate the feasibility of estimating the heterogeneous spatial distribution of hydraulic properties of a field-site aquifer with a steady-state model. In the literature to date, only the steady-state case has been considered, with non-redundant verification analysis using changed pumping-well locations and fixed observation-well locations for the inverse and forward analyses. Previous studies have not examined redundant verification with fixed observation-well locations, or non-redundant verification with changed observation-well locations, under either fixed or changed pumping-well locations, so the influence of these various combinations on the hydraulic tomography method remains unexplored. In this study, we carried out forward analyses with both the redundant verification method and the non-redundant verification method to examine their influence on hydraulic tomography in the transient case. We also discuss an actual case at the NYUST campus site to demonstrate the effectiveness of hydraulic tomography methods and to confirm the feasibility of the inverse and forward analyses from the analysis results. Keywords: Hydraulic Tomography, Redundant Verification, Heterogeneous, Inverse, Forward
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marleau, Peter; Brubaker, Erik; Deland, Sharon M.
This report summarizes the discussion and conclusions reached during a table top exercise held at Sandia National Laboratories, Albuquerque on September 3, 2014 regarding a recently described approach for nuclear warhead verification based on the cryptographic concept of a zero-knowledge protocol (ZKP) presented in a recent paper authored by Glaser, Barak, and Goldston. A panel of Sandia National Laboratories researchers, whose expertise includes radiation instrumentation design and development, cryptography, and arms control verification implementation, jointly reviewed the paper and identified specific challenges to implementing the approach as well as some opportunities. It was noted that ZKP as used in cryptography is a useful model for the arms control verification problem, but the direct analogy to arms control breaks down quickly. The ZKP methodology for warhead verification fits within the general class of template-based verification techniques, where a reference measurement is used to confirm that a given object is like another object that has already been accepted as a warhead by some other means. This can be a powerful verification approach, but requires independent means to trust the authenticity of the reference warhead - a standard that may be difficult to achieve, which the ZKP authors do not directly address. Despite some technical challenges, the concept of last-minute selection of the pre-loads and equipment could be a valuable component of a verification regime.
Martin, Edward J [Virginia Beach, VA
2008-01-15
A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol that involves a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out and, once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit; this can be accomplished without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.
A Roadmap for the Implementation of Continued Process Verification.
Boyer, Marcus; Gampfer, Joerg; Zamamiri, Abdel; Payne, Robin
2016-01-01
In 2014, the members of the BioPhorum Operations Group (BPOG) produced a 100-page continued process verification case study, entitled "Continued Process Verification: An Industry Position Paper with Example Protocol". This case study captures the thought processes involved in creating a continued process verification plan for a new product in response to the U.S. Food and Drug Administration's guidance on the subject introduced in 2011. In so doing, it provided the specific example of a plan developed for a new monoclonal antibody product based on the "A MAb Case Study" that preceded it in 2009. This document provides a roadmap that draws on the content of the continued process verification case study to provide a step-by-step guide in a more accessible form, with reference to a process map of the product life cycle. It could be used as a basis for continued process verification implementation in a number of different scenarios: for a single product and process; for a single site; to assist in the sharing of data monitoring responsibilities among sites; or to assist in establishing data monitoring agreements between a customer company and a contract manufacturing organization. The U.S. Food and Drug Administration issued guidance on the management of manufacturing processes designed to improve quality and control of drug products. This involved increased focus on regular monitoring of manufacturing processes, reporting of the results, and the taking of opportunities to improve. The guidance and practice associated with it is known as continued process verification. This paper summarizes good practice in responding to continued process verification guidance, gathered from subject matter experts in the biopharmaceutical industry. © PDA, Inc. 2016.
Self-Verification of Ability through Biased Performance Memory.
ERIC Educational Resources Information Center
Karabenick, Stuart A.; LeBlanc, Daniel
Evidence points to a pervasive tendency for persons to behave in ways that maintain their existing cognitive structures. One strategy by which this self-verification is made more probable involves information processing. Through attention, encoding and retrieval, and the interpretation of events, persons process information so that self-confirmatory…
The development, verification, and comparison study between LC-MS libraries for two manufacturers’ instruments and a verified protocol are discussed. The LC-MS library protocol was verified through an inter-laboratory study that involved Federal, State, and private laboratories. ...
High stakes in INF verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krepon, M.
1987-06-01
The stakes involved in negotiating INF verification arrangements are high. While these proposals deal only with intermediate-range ground-launched cruise and mobile missiles, if properly devised they could help pave the way for comprehensive limits on other cruise missiles and strategic mobile missiles. In contrast, poorly drafted monitoring provisions could compromise national industrial security and generate numerous compliance controversies. Any verification regime will require new openness on both sides, but that means significant risks as well as opportunities. US and Soviet negotiators could spend weeks, months, and even years working out in painstaking detail verification provisions for medium-range missiles. Alternatively, if the two sides wished to conclude an INF agreement quickly, they could defer most of the difficult verification issues to the strategic arms negotiations.
NASA Astrophysics Data System (ADS)
Szeleszczuk, Łukasz; Gubica, Tomasz; Zimniak, Andrzej; Pisklak, Dariusz M.; Dąbrowska, Kinga; Cyrański, Michał K.; Kańska, Marianna
2017-10-01
A convenient method for the indirect crystal structure verification of methyl glycosides was demonstrated. Single-crystal X-ray diffraction structures for methyl glycoside acetates were deacetylated and subsequently subjected to DFT calculations under periodic boundary conditions. Solid-state NMR spectroscopy served as a guide for the calculations. A high level of accuracy of the modelled crystal structures of methyl glycosides was confirmed by comparison with published results of a neutron diffraction study using the RMSD method.
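As an illustration of the RMSD comparison mentioned above, here is a generic sketch with toy coordinates, assuming the structures are already aligned and the atoms already matched; it is not the authors' workflow.

```python
import numpy as np

def rmsd(coords_a, coords_b):
    """Root-mean-square deviation between two matched sets of atomic positions."""
    coords_a, coords_b = np.asarray(coords_a), np.asarray(coords_b)
    return np.sqrt(((coords_a - coords_b) ** 2).sum(axis=1).mean())

# Toy example: DFT-optimized vs. reference (neutron) positions, in angstroms.
dft = [[0.00, 0.00, 0.00], [1.09, 0.00, 0.00], [0.00, 1.43, 0.00]]
ref = [[0.01, -0.02, 0.00], [1.10, 0.01, 0.00], [0.00, 1.41, 0.01]]
print(f"RMSD = {rmsd(dft, ref):.3f} Å")
```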
75 FR 54966 - Privacy Act of 1974: Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-09
... agencies for purposes of verification of income for determining eligibility for benefits. 38 U.S.C. 1710(a... income verification process. The VA records involved in the match are ``Enrollment and Eligibility Records--VA'' (147VA16). The SSA records are from the Earnings Recording and Self- Employment Income...
78 FR 21713 - Privacy Act of 1974: Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-11
... income information from other agencies for purposes of verification of income for determining eligibility... data needed for the income verification process. The VA records involved in the match are ``Enrollment and Eligibility Records--VA'' (147VA16). The SSA records are from the Earnings Recording and Self...
ETV VR/VS SUNSET LABORATORY MODEL 4 OC-EC FIELD ANALYZER
This is a verification report statement that describes the verification test which was conducted over a period of approximately 30 days (April 5 to May 7, 2013) and involved the continuous operation of duplicate Model 4 OC-EC analyzers at the Battelle Columbus Operations Special ...
The verification test was conducted over a period of 30 days (October 1 to October 31, 2008) and involved the continuous operation of duplicate semi-continuous monitoring technologies at the Burdens Creek Air Monitoring Site, an existing ambient-air monitoring station located near...
NASA Astrophysics Data System (ADS)
Molenda, Michał; Ratman-Kłosińska, Izabela
2018-03-01
Many innovative environmental technologies never reach the market because they are new and cannot demonstrate a successful track record of previous applications. This fact is a serious obstacle on their way to the market. Lack of credible data on the performance of a technology causes mistrust of investors in innovations, especially from the public sector, who seek effective solutions without taking on the technical and financial risks associated with their implementation. Environmental technology verification (ETV) offers a credible, robust and transparent process that results in a third-party confirmation of the claims made by the providers about the performance of novel environmental technologies. Verifications of performance are supported by high-quality, independent test data. In that way ETV as a tool helps establish vendor credibility and buyer confidence. Several countries across the world have implemented ETV in the form of national or regional programmes. ETV in the European Union was implemented as a voluntary scheme in the form of a pilot programme. The European Commission launched the Environmental Technology Verification Pilot Programme of the European Union (EU ETV) in 2011. The paper describes the European model of ETV as set up and put into operation under the EU ETV Pilot Programme. The goal, objectives, technological scope and entities involved are presented. An attempt has been made to summarise the results of the EU ETV scheme's performance for the period from 2012, when the programme became fully operational, until the first half of 2016. The study was aimed at analysing the overall organisation and efficiency of the EU ETV Pilot Programme. The study was based on the analysis of documents concerning the operation of the EU ETV system. For this purpose, a relevant statistical analysis of the data on the performance of the EU ETV system provided by the European Commission was carried out.
The Maximal Oxygen Uptake Verification Phase: a Light at the End of the Tunnel?
Schaun, Gustavo Z
2017-12-08
Commonly performed during an incremental test to exhaustion, maximal oxygen uptake (V̇O2max) assessment has become a recurring practice in clinical and experimental settings. To validate the test, several criteria were proposed. In this context, the plateau in oxygen uptake (V̇O2) is inconsistent in its frequency, reducing its usefulness as a robust method to determine "true" V̇O2max. Moreover, secondary criteria previously suggested, such as expiratory exchange ratios or percentages of maximal heart rate, are highly dependent on protocol design and often are achieved at V̇O2 percentages well below V̇O2max. Thus, an alternative method termed verification phase was proposed. Currently, it is clear that the verification phase can be a practical and sensitive method to confirm V̇O2max; however, procedures to conduct it are not standardized across the literature and no previous research has tried to summarize how it has been employed. Therefore, in this review the knowledge on the verification phase was updated, while suggestions on how it can be performed (e.g. intensity, duration, recovery) were provided according to population and protocol design. Future studies should focus on identifying a verification protocol feasible for different populations and on comparing square-wave and multistage verification phases. Additionally, studies assessing verification phases in different patient populations are still warranted.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 49 Transportation 1 2013-10-01 2013-10-01 false What are the MRO's functions in reviewing laboratory confirmed non-negative drug test results? 40.129 Section 40.129 Transportation Office of the Secretary of Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Medical Review Officers and the Verification Proces...
The U.S. EPA has created the Environmental Technology Verification (ETV) program to provide high quality, peer reviewed data on technology performance to those involved in the design, distribution, financing, permitting, purchase, and use of environmental technologies. The Air Po...
Liang, Yun; Kim, Gwe-Ya; Pawlicki, Todd; Mundt, Arno J; Mell, Loren K
2013-03-04
The purpose of this study was to develop dosimetry verification procedures for volumetric-modulated arc therapy (VMAT)-based total marrow irradiation (TMI). The VMAT-based TMI plans were generated for three patients: one child and two adults. The planning target volume (PTV) was defined as the bony skeleton, from head to mid-femur, with a 3 mm margin. A plan strategy similar to published studies was adopted. The PTV was divided into head and neck, chest, and pelvic regions, with separate plans, each of which is composed of 2-3 arcs/fields. Multiple isocenters were evenly distributed along the patient's axial direction. The focus of this study is to establish a dosimetry quality assurance procedure involving both two-dimensional (2D) and three-dimensional (3D) volumetric verifications, which is desirable for a large PTV treated with multiple isocenters. The 2D dose verification was performed with film for gamma evaluation, and the absolute point dose was measured with an ion chamber, with attention to hot/cold spots at the junctions between neighboring plans. The 3D volumetric dose verification used commercial dose reconstruction software to reconstruct dose from electronic portal imaging device (EPID) images. The gamma evaluation criteria in both 2D and 3D verification were 5% absolute point dose difference and 3 mm distance to agreement. With film dosimetry, the overall average gamma passing rate was 98.2% and the absolute dose difference was 3.9% in junction areas among the test patients; with volumetric portal dosimetry, the corresponding numbers were 90.7% and 2.4%. A dosimetry verification procedure involving both 2D and 3D was developed for VMAT-based TMI. The initial results are encouraging and warrant further investigation in clinical trials.
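For reference, a simplified one-dimensional sketch of the gamma-index idea behind the 5%/3 mm criterion used above is shown below (brute-force search over toy profiles; clinical tools use optimized 2D/3D implementations).

```python
import numpy as np

def gamma_1d(ref_dose, eval_dose, positions, dose_tol=0.05, dist_tol=3.0):
    """Simplified 1D gamma index with global dose normalization.

    For each reference point, take the minimum over all evaluated points of
    sqrt((dose_diff / (dose_tol * max_ref))**2 + (distance / dist_tol)**2);
    gamma <= 1 counts as a pass.
    """
    dose_crit = dose_tol * np.max(ref_dose)
    gammas = []
    for x_r, d_r in zip(positions, ref_dose):
        terms = ((eval_dose - d_r) / dose_crit) ** 2 \
                + ((positions - x_r) / dist_tol) ** 2
        gammas.append(np.sqrt(terms.min()))
    return np.array(gammas)

# Toy dose profiles on a 1 mm grid: the "measured" profile is shifted by
# 0.5 mm and scaled by 2% relative to the planned (reference) profile.
x = np.arange(0.0, 50.0, 1.0)
planned = 2.0 * np.exp(-((x - 25.0) / 10.0) ** 2)
measured = 1.02 * 2.0 * np.exp(-((x - 25.5) / 10.0) ** 2)

gamma = gamma_1d(planned, measured, x)
print(f"gamma passing rate: {(gamma <= 1.0).mean():.1%}")
```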
A physical zero-knowledge object-comparison system for nuclear warhead verification
Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; d'Errico, Francesco
2016-01-01
Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications. PMID:27649477
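A toy numerical sketch of the preload idea behind this kind of differential radiography follows (made-up count values and a noise-free comparison; the real system uses physical neutron detectors and statistical tests).

```python
import numpy as np

MAX_COUNTS = 1000  # hypothetical full-scale detector count

# Host-measured transmission radiograph of the reference ("true") item.
reference_counts = np.array([120, 450, 300, 80, 950])

# Detectors are preloaded with the complement, so a matching object should
# top every detector up to the same flat value, revealing no geometry.
preload = MAX_COUNTS - reference_counts

def inspect(candidate_counts, tolerance=0):
    """Return True if the preloaded detectors read a flat response."""
    total = preload + candidate_counts
    return np.all(np.abs(total - MAX_COUNTS) <= tolerance)

genuine = np.array([120, 450, 300, 80, 950])   # identical object
hoax = np.array([120, 450, 300, 300, 700])     # material rearranged

print(inspect(genuine))  # True  -> flat response, object confirmed
print(inspect(hoax))     # False -> deviation flags a mismatch
```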
A physical zero-knowledge object-comparison system for nuclear warhead verification.
Philippe, Sébastien; Goldston, Robert J; Glaser, Alexander; d'Errico, Francesco
2016-09-20
Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.
A physical zero-knowledge object-comparison system for nuclear warhead verification
NASA Astrophysics Data System (ADS)
Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; D'Errico, Francesco
2016-09-01
Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.
A physical zero-knowledge object-comparison system for nuclear warhead verification
Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; ...
2016-09-20
Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.
A physical zero-knowledge object-comparison system for nuclear warhead verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander
Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.
NASA Technical Reports Server (NTRS)
Chiang, T.; Tessarzik, J. M.; Badgley, R. H.
1972-01-01
The primary aim of this investigation was verification of basic methods which are to be used in cataloging elastomer dynamic properties (stiffness and damping) in terms of viscoelastic model constants. These constants may then be used to predict dynamic properties for general elastomer shapes and operating conditions, thereby permitting optimum application of elastomers as energy absorption and/or energy storage devices in the control of vibrations in a broad variety of applications. The efforts reported involved: (1) literature search; (2) the design, fabrication and use of a test rig for obtaining elastomer dynamic test data over a wide range of frequencies, amplitudes, and preloads; and (3) the reduction of the test data, by means of a selected three-element elastomer model and specialized curve fitting techniques, to material properties. Material constants thus obtained have been used to calculate stiffness and damping for comparison with measured test data. These comparisons are excellent for a number of test conditions and only fair to poor for others. The results confirm the validity of the basic approach of the overall program and the mechanics of the cataloging procedure, and at the same time suggest areas in which refinements should be made.
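The abstract does not state which three-element network was adopted; purely as an illustration, a standard linear solid (a spring in parallel with a Maxwell arm) gives frequency-dependent storage and loss stiffness from three constants, and those constants can be fit to measured stiffness/damping data by least squares. The model choice, parameter names, and starting values below are assumptions, not the report's catalogued values.

import numpy as np
from scipy.optimize import least_squares

def complex_stiffness(omega, k0, k1, c1):
    """Standard linear solid: spring k0 in parallel with a Maxwell arm (spring k1 in series with dashpot c1).
    Real part = storage stiffness, imaginary part = damping (loss) stiffness."""
    tau = c1 / k1
    return k0 + k1 * (1j * omega * tau) / (1.0 + 1j * omega * tau)

def residuals(params, omega, k_storage_meas, k_loss_meas):
    k = complex_stiffness(omega, *params)
    return np.concatenate([k.real - k_storage_meas, k.imag - k_loss_meas])

# fit = least_squares(residuals, x0=[1e5, 5e4, 1e2], args=(omega, k_storage, k_loss))
# k0, k1, c1 = fit.x   # the three material constants used to predict stiffness and damping elsewhere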
Self-verification and depression among youth psychiatric inpatients.
Joiner, T E; Katz, J; Lew, A S
1997-11-01
According to self-verification theory (e.g., W.B. Swann, 1983), people are motivated to preserve stable self-concepts by seeking self-confirming interpersonal responses, even if the responses are negative. In the current study of 72 youth psychiatric inpatients (36 boys; 36 girls; ages 7-17, M = 13.18; SD = 2.59), the authors provide the 1st test of self-verification theory among a youth sample. Participants completed self-report questionnaires on depression, self-esteem, anxiety, negative and positive affect, and interest in negative feedback from others. The authors made chart diagnoses available, and they collected peer rejection ratings. Consistent with hypotheses, the authors found that interest in negative feedback was associated with depression, was predictive of peer rejection (but only within relatively longer peer relationships), was more highly related to cognitive than emotional aspects of depression, and was specifically associated with depression, rather than being generally associated with emotional distress. The authors discuss implications for self-verification theory and for the phenomenology of youth depression.
The Evolution of the NASA Commercial Crew Program Mission Assurance Process
NASA Technical Reports Server (NTRS)
Canfield, Amy C.
2016-01-01
In 2010, the National Aeronautics and Space Administration (NASA) established the Commercial Crew Program (CCP) in order to provide human access to the International Space Station and low Earth orbit via the commercial (non-governmental) sector. A particular challenge to NASA has been how to determine that the Commercial Provider's transportation system complies with programmatic safety requirements. The process used in this determination is the Safety Technical Review Board which reviews and approves provider submitted hazard reports. One significant product of the review is a set of hazard control verifications. In past NASA programs, 100% of these safety critical verifications were typically confirmed by NASA. The traditional Safety and Mission Assurance (S&MA) model does not support the nature of the CCP. To that end, NASA S&MA is implementing a Risk Based Assurance process to determine which hazard control verifications require NASA authentication. Additionally, a Shared Assurance Model is also being developed to efficiently use the available resources to execute the verifications.
Effectiveness of electrocardiographic guidance in CVAD tip placement.
Walker, Graham; Chan, Raymond J; Alexandrou, Evan; Webster, Joan; Rickard, Claire
International standard practice for confirming correct placement of a central venous access device is the chest X-ray. The intracavitary electrocardiogram-based insertion method is radiation-free and allows real-time placement verification, providing immediate treatment and a reduced requirement for post-procedural repositioning. Relevant databases were searched for prospective randomised controlled trials (RCTs) or quasi RCTs that compared the effectiveness of electrocardiogram-guided catheter tip positioning with placement using surface-anatomy-guided insertion plus chest X-ray confirmation. The primary outcome was accurate catheter tip placement. Secondary outcomes included complications, patient satisfaction and costs. Five studies involving 729 participants were included. Electrocardiogram-guided insertion was more accurate than surface-anatomy-guided insertion (odds ratio: 8.3; 95% confidence interval (CI) 1.38-50.07; p=0.02). There was a lack of reporting on complications, patient satisfaction and costs. The evidence suggests that intracavitary electrocardiogram-based positioning is superior to surface-anatomy-guided positioning of central venous access devices, leading to significantly more successful placements. This technique could potentially remove the requirement for a post-procedural chest X-ray, especially during peripherally inserted central catheter (PICC) line insertion.
Code of Federal Regulations, 2011 CFR
2011-10-01
...), except as otherwise provided in this section. (c) In the verification interview, you must explain the laboratory findings to the employee and address technical questions or issues the employee may raise. (d) You... at the time of the verification interview. As the MRO, you have discretion to extend the time...
Code of Federal Regulations, 2010 CFR
2010-10-01
...), except as otherwise provided in this section. (c) In the verification interview, you must explain the laboratory findings to the employee and address technical questions or issues the employee may raise. (d) You... at the time of the verification interview. As the MRO, you have discretion to extend the time...
Closed Loop Requirements and Analysis Management
NASA Technical Reports Server (NTRS)
Lamoreaux, Michael; Verhoef, Brett
2015-01-01
Effective systems engineering involves the use of analysis in the derivation of requirements and verification of designs against those requirements. The initial development of requirements often depends on analysis for the technical definition of specific aspects of a product. Following the allocation of system-level requirements to a product's components, the closure of those requirements often involves analytical approaches to verify that the requirement criteria have been satisfied. Meanwhile, changes that occur in between these two processes need to be managed in order to achieve a closed-loop requirement derivation/verification process. Herein are presented concepts for employing emerging Teamcenter capabilities to jointly manage requirements and analysis data such that analytical techniques are utilized to effectively derive and allocate requirements, analyses are consulted and updated during the change evaluation processes, and analyses are leveraged during the design verification process. Recommendations on concept validation case studies are also discussed.
Simple method to verify OPC data based on exposure condition
NASA Astrophysics Data System (ADS)
Moon, James; Ahn, Young-Bae; Oh, Sey-Young; Nam, Byung-Ho; Yim, Dong Gyu
2006-03-01
In a world where sub-100 nm lithography tools are an everyday household item for device makers, devices are shrinking at a rate no one could have imagined. With the device shrinking at such a high rate, the demands placed on Optical Proximity Correction (OPC) are like never before. To meet this demand with respect to the shrinkage rate of the device, more aggressive OPC tactics are involved. Aggressive OPC tactics are a must for sub-100 nm lithography technology, but they eventually result in greater room for OPC error and greater complexity of the OPC data. Until now, Optical Rule Check (ORC) or Design Rule Check (DRC) was used to verify this complex OPC error, but each of these methods has its pros and cons. ORC verification of OPC data is rather accurate process-wise, but inspection of a full-chip device requires a lot of money (computers, software, ...) and patience (run time). DRC has no such disadvantage, but the accuracy of the verification is a total downfall process-wise. In this study, we created a new method for OPC data verification that combines the best of both the ORC and DRC verification methods: a method that inspects the biasing of the OPC data with respect to the illumination condition of the process involved. This new method of verification was applied to the 80 nm technology ISOLATION and GATE layers of a 512M DRAM device and showed accuracy equivalent to ORC inspection with the run time of DRC verification.
Current status of verification practices in clinical biochemistry in Spain.
Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè
2013-09-01
Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
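As a small illustration of one criterion of the kind surveyed above, a delta check holds a result for manual review when it differs too much from the patient's previous value; the limits and example values below are hypothetical, not figures from the survey.

def delta_check(current, previous, abs_limit, rel_limit):
    """Flag a result for manual review when it differs from the previous value by more than
    an absolute limit or a relative (fractional) limit."""
    if previous is None:
        return False                      # no prior result, nothing to compare against
    delta = abs(current - previous)
    return delta > abs_limit or delta > rel_limit * abs(previous)

# hypothetical potassium rule (mmol/L): hold if the change exceeds 1.0 mmol/L or 25%
needs_review = delta_check(current=5.6, previous=4.1, abs_limit=1.0, rel_limit=0.25)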
Concept Verification Test - Evaluation of Spacelab/Payload operation concepts
NASA Technical Reports Server (NTRS)
Mcbrayer, R. O.; Watters, H. H.
1977-01-01
The Concept Verification Test (CVT) procedure is used to study Spacelab operational concepts by conducting mission simulations in a General Purpose Laboratory (GPL) which represents a possible design of Spacelab. In conjunction with the laboratory a Mission Development Simulator, a Data Management System Simulator, a Spacelab Simulator, and Shuttle Interface Simulator have been designed. (The Spacelab Simulator is more functionally and physically representative of the Spacelab than the GPL.) Four simulations of Spacelab mission experimentation were performed, two involving several scientific disciplines, one involving life sciences, and the last involving material sciences. The purpose of the CVT project is to support the pre-design and development of payload carriers and payloads, and to coordinate hardware, software, and operational concepts of different developers and users.
Land Ice Verification and Validation Kit
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-07-15
To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process for these models is being coordinated through a new, robust, python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community-prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of tests where differences occur.
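The simplest form of the bit-for-bit evaluation mentioned above is a byte-level comparison of a test output against a stored benchmark; LIVV itself compares variables inside the model output files, so the hash comparison and file names below are only a crude, hypothetical stand-in.

import hashlib

def bit_for_bit(path_a, path_b, chunk=1 << 20):
    """True if two model output files are byte-identical."""
    def digest(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(chunk), b""):
                h.update(block)
        return h.digest()
    return digest(path_a) == digest(path_b)

# same = bit_for_bit("benchmark/thickness.nc", "test/thickness.nc")   # hypothetical paths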
NASA Astrophysics Data System (ADS)
Zhong, Shenlu; Li, Mengjiao; Tang, Xiajie; He, Weiqing; Wang, Xiaogang
2017-01-01
A novel optical information verification and encryption method is proposed based on the inference principle and phase retrieval with sparsity constraints. In this method, a target image is encrypted into two phase-only masks (POMs), which comprise sparse phase data used for verification. Both of the two POMs need to be authenticated before being applied for decryption. The target image can be optically reconstructed when the two authenticated POMs are Fourier transformed and convolved by the correct decryption key, which is also generated in the encryption process. No holographic scheme is involved in the proposed optical verification and encryption system, and there is also no problem of information disclosure in the two authenticable POMs. Numerical simulation results demonstrate the validity and good performance of this new proposed method.
Active alignment/contact verification system
Greenbaum, William M.
2000-01-01
A system involving an active (i.e., electrical) technique for the verification of: 1) close-tolerance mechanical alignment between two components, and 2) electrical contact between mating parts through an elastomeric interface. For example, the two components may be an alumina carrier and a printed circuit board, two mating parts that are extremely small, high-density parts and require alignment within a fraction of a mil, as well as a specified interface point of engagement between the parts. The system comprises pairs of conductive structures defined in the surface layers of the alumina carrier and the printed circuit board, for example. The first pair of conductive structures relates to item (1) above and permits alignment verification between mating parts. The second pair of conductive structures relates to item (2) above and permits verification of electrical contact between mating parts.
Sierra/Aria 4.48 Verification Manual.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sierra Thermal Fluid Development Team
Presented in this document is a portion of the tests that exist in the Sierra Thermal/Fluids verification test suite. Each of these tests is run nightly with the Sierra/TF code suite, and the results of each test are checked under mesh refinement against the correct analytic result. For each of the tests presented in this document, the test setup, the derivation of the analytic solution, and a comparison of the code results to the analytic solution are provided. This document can be used to confirm that a given code capability is verified, or referenced as a compilation of example problems.
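A check under mesh refinement against an analytic result usually amounts to computing the error on successively finer meshes and confirming the observed convergence order; a minimal sketch, with illustrative error values and an assumed refinement ratio of 2:

import math

def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
    """Observed convergence order p from errors on meshes of size h and h/r."""
    return math.log(err_coarse / err_fine) / math.log(refinement_ratio)

# errors vs. the analytic solution on meshes h, h/2, h/4 (illustrative numbers only)
errors = [4.0e-3, 1.1e-3, 2.9e-4]
orders = [observed_order(errors[i], errors[i + 1]) for i in range(len(errors) - 1)]
# a capability that is nominally second-order accurate should give orders close to 2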
VeriClick: an efficient tool for table format verification
NASA Astrophysics Data System (ADS)
Nagy, George; Tamhankar, Mangesh
2012-01-01
The essential layout attributes of a visual table can be defined by the location of four critical grid cells. Although these critical cells can often be located by automated analysis, some means of human interaction is necessary for correcting residual errors. VeriClick is a macro-enabled spreadsheet interface that provides ground-truthing, confirmation, correction, and verification functions for CSV tables. All user actions are logged. Experimental results of seven subjects on one hundred tables suggest that VeriClick can provide a ten- to twenty-fold speedup over performing the same functions with standard spreadsheet editing commands.
Fingerprint changes and verification failure among patients with hand dermatitis.
Lee, Chew Kek; Chang, Choong Chor; Johar, Asmah; Puwira, Othman; Roshidah, Baba
2013-03-01
To determine the prevalence of fingerprint verification failure and to define and quantify the fingerprint changes associated with fingerprint verification failure. Case-control study. Referral public dermatology center. The study included 100 consecutive patients with clinical hand dermatitis involving the palmar distal phalanx of either thumb and 100 age-, sex-, and ethnicity-matched controls. Patients with an altered thumb print due to other causes and palmar hyperhidrosis were excluded. Fingerprint verification (pass/fail) and hand eczema severity index score. Twenty-seven percent of patients failed fingerprint verification compared with 2% of controls. Fingerprint verification failure was associated with a higher hand eczema severity index score (P < .001). The main fingerprint abnormalities were fingerprint dystrophy (42.0%) and abnormal white lines (79.5%). The number of abnormal white lines was significantly higher among the patients with hand dermatitis compared with controls (P = .001). Among the patients with hand dermatitis, the odds of failing fingerprint verification with fingerprint dystrophy was 4.01. The presence of broad lines and long lines was associated with a greater odds of fingerprint verification failure (odds ratio [OR], 8.04; 95% CI, 3.56-18.17 and OR, 2.37; 95% CI, 1.31-4.27, respectively), while the presence of thin lines was protective of verification failure (OR, 0.45; 95% CI, 0.23-0.89). Fingerprint verification failure is a significant problem among patients with more severe hand dermatitis. It is mainly due to fingerprint dystrophy and abnormal white lines. Malaysian National Medical Research Register Identifier: NMRR-11-30-8226.
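The odds ratios and confidence intervals quoted above follow the usual 2x2-table calculation; the sketch below shows that calculation in generic form, with hypothetical counts since the paper's raw tables are not reproduced here.

import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed & failed, b = exposed & passed, c = unexposed & failed, d = unexposed & passed."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# hypothetical counts: verification failure vs. success in patients and controls
print(odds_ratio_ci(27, 73, 2, 98))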
Validation and verification of a virtual environment for training naval submarine officers
NASA Astrophysics Data System (ADS)
Zeltzer, David L.; Pioch, Nicholas J.
1996-04-01
A prototype virtual environment (VE) has been developed for training a submarine officer of the deck (OOD) to perform in-harbor navigation on a surfaced submarine. The OOD, stationed on the conning tower of the vessel, is responsible for monitoring the progress of the boat as it negotiates a marked channel, as well as verifying the navigational suggestions of the below-deck piloting team. The VE system allows an OOD trainee to view a particular harbor and associated waterway through a head-mounted display, receive spoken reports from a simulated piloting team, give spoken commands to the helmsman, and receive verbal confirmation of command execution from the helm. The task analysis of in-harbor navigation and the derivation of application requirements are briefly described. This is followed by a discussion of the implementation of the prototype. This implementation underwent a series of validation and verification assessment activities, including operational validation, data validation, and software verification of individual software modules as well as the integrated system. Validation and verification procedures are discussed with respect to the OOD application in particular, and with respect to VE applications in general.
NASA Astrophysics Data System (ADS)
Kunii, Masaru; Saito, Kazuo; Seko, Hiromu; Hara, Masahiro; Hara, Tabito; Yamaguchi, Munehiko; Gong, Jiandong; Charron, Martin; Du, Jun; Wang, Yong; Chen, Dehui
2011-05-01
During the period around the Beijing 2008 Olympic Games, the Beijing 2008 Olympics Research and Development Project (B08RDP) was conducted as part of the World Weather Research Program short-range weather forecasting research project. Mesoscale ensemble prediction (MEP) experiments were carried out by six organizations in near-real time, in order to share their experiences in the development of MEP systems. The purpose of this study is to objectively verify these experiments and to clarify the problems associated with the current MEP systems through the same experiences. Verification was performed using the MEP outputs interpolated into a common verification domain with a horizontal resolution of 15 km. For all systems, the ensemble spreads grew as the forecast time increased, and the ensemble mean improved the forecast errors compared with individual control forecasts in the verification against the analysis fields. However, each system exhibited individual characteristics according to the MEP method. Some participants used physical perturbation methods. The significance of these methods was confirmed by the verification. However, the mean error (ME) of the ensemble forecast in some systems was worse than that of the individual control forecast. This result suggests that it is necessary to pay careful attention to physical perturbations.
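Two of the verification quantities discussed above, the ensemble spread and the mean error (ME) of the ensemble mean against an analysis field, reduce to simple domain averages; a minimal sketch, with array names and shapes assumed rather than taken from the B08RDP verification setup:

import numpy as np

def ensemble_scores(forecasts, analysis):
    """forecasts: array of shape (n_members, ny, nx); analysis: array of shape (ny, nx).
    Returns the domain-averaged ensemble spread (std. dev. about the ensemble mean),
    the mean error (bias) of the ensemble mean, and its RMSE."""
    ens_mean = forecasts.mean(axis=0)
    spread = forecasts.std(axis=0, ddof=1).mean()
    me = (ens_mean - analysis).mean()
    rmse = np.sqrt(((ens_mean - analysis) ** 2).mean())
    return spread, me, rmse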
Data requirements for verification of ram glow chemistry
NASA Technical Reports Server (NTRS)
Swenson, G. R.; Mende, S. B.
1985-01-01
A set of questions is posed regarding the surface chemistry producing the ram glow on the space shuttle. The questions surround verification of the chemical cycle involved in the physical processes leading to the glow. The questions, and a matrix of measurements required for most answers, are presented. The measurements include knowledge of the flux composition to and from a ram surface as well as spectroscopic signatures from the UV to the visible to the IR. A pallet set of experiments proposed to accomplish the measurements is discussed. An interim experiment involving an available infrared instrument to be operated from the shuttle Orbiter cabin is also discussed.
The Effect of Mystery Shopper Reports on Age Verification for Tobacco Purchases
KREVOR, BRAD S.; PONICKI, WILLIAM R.; GRUBE, JOEL W.; DeJONG, WILLIAM
2011-01-01
Mystery shops (MS) involving attempted tobacco purchases by young buyers have been employed to monitor retail stores’ performance in refusing underage sales. Anecdotal evidence suggests that MS visits with immediate feedback to store personnel can improve age verification. This study investigated the impact of monthly and twice-monthly MS reports on age verification. Forty-five Walgreens stores were each visited 20 times by mystery shoppers. The stores were randomly assigned to one of three conditions. Control group stores received no feedback, whereas two treatment groups received feedback communications every visit (twice monthly) or every second visit (monthly) after baseline. Logit regression models tested whether each treatment group improved verification rates relative to the control group. Post-baseline verification rates were higher in both treatment groups than in the control group, but only the stores receiving monthly communications had a significantly greater improvement than control group stores. Verification rates increased significantly during the study period for all three groups, with delayed improvement among control group stores. Communication between managers regarding the MS program may account for the delayed age-verification improvements observed in the control group stores. Encouraging inter-store communication might extend the benefits of MS programs beyond those stores that receive this intervention. PMID:21541874
The effect of mystery shopper reports on age verification for tobacco purchases.
Krevor, Brad S; Ponicki, William R; Grube, Joel W; DeJong, William
2011-09-01
Mystery shops involving attempted tobacco purchases by young buyers have been implemented in order to monitor retail stores' performance in refusing underage sales. Anecdotal evidence suggests that mystery shop visits with immediate feedback to store personnel can improve age verification. This study investigated the effect of monthly and twice-monthly mystery shop reports on age verification. Mystery shoppers visited 45 Walgreens stores 20 times. The stores were randomly assigned to 1 of 3 conditions. Control group stores received no feedback, whereas 2 treatment groups received feedback communications on every visit (twice monthly) or on every second visit (monthly) after baseline. Logit regression models tested whether each treatment group improved verification rates relative to the control group. Postbaseline verification rates were higher in both treatment groups than in the control group, but only the stores receiving monthly communications had a significantly greater improvement compared with the control group stores. Verification rates increased significantly during the study period for all 3 groups, with delayed improvement among control group stores. Communication between managers regarding the mystery shop program may account for the delayed age-verification improvements observed in the control group stores. Encouraging interstore communication might extend the benefits of mystery shop programs beyond those stores that receive this intervention. Copyright © Taylor & Francis Group, LLC
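The logit regression described in both versions of this study can be sketched as a pass/fail outcome per visit regressed on treatment group, study phase, and their interaction; the data layout, effect sizes, and formula below are assumptions for illustration, not the authors' specification.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
groups = np.repeat(["control", "monthly", "twice_monthly"], 300)   # one row per visit (synthetic)
post = np.tile([0, 1], 450)                                        # 0 = baseline, 1 = post-baseline
lift = {"control": 0.05, "monthly": 0.20, "twice_monthly": 0.12}   # hypothetical improvements
p = np.array([0.70 + lift[g] * t for g, t in zip(groups, post)])
visits = pd.DataFrame({"passed": rng.binomial(1, p), "group": groups, "post": post})

# each group's post-baseline change in the odds of passing is compared with the control group's
model = smf.logit("passed ~ C(group, Treatment('control')) * post", data=visits).fit()
print(model.params)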
Sierra/SolidMechanics 4.48 Verification Tests Manual.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plews, Julia A.; Crane, Nathan K; de Frias, Gabriel Jose
2018-03-01
Presented in this document is a small portion of the tests that exist in the Sierra / SolidMechanics (Sierra / SM) verification test suite. Most of these tests are run nightly with the Sierra / SM code suite, and the results of the test are checked versus the correct analytical result. For each of the tests presented in this document, the test setup, a description of the analytic solution, and comparison of the Sierra / SM code results to the analytic solution is provided. Mesh convergence is also checked on a nightly basis for several of these tests. This document can be used to confirm that a given code capability is verified or referenced as a compilation of example problems. Additional example problems are provided in the Sierra / SM Example Problems Manual. Note, many other verification tests exist in the Sierra / SM test suite, but have not yet been included in this manual.
Sierra/SolidMechanics 4.48 Verification Tests Manual.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plews, Julia A.; Crane, Nathan K.; de Frias, Gabriel Jose
Presented in this document is a small portion of the tests that exist in the Sierra / SolidMechanics (Sierra / SM) verification test suite. Most of these tests are run nightly with the Sierra / SM code suite, and the results of the test are checked versus the correct analytical result. For each of the tests presented in this document, the test setup, a description of the analytic solution, and comparison of the Sierra / SM code results to the analytic solution is provided. Mesh convergence is also checked on a nightly basis for several of these tests. This document can be used to confirm that a given code capability is verified or referenced as a compilation of example problems. Additional example problems are provided in the Sierra / SM Example Problems Manual. Note, many other verification tests exist in the Sierra / SM test suite, but have not yet been included in this manual.
Katz, Jennifer; Joiner, Thomas E
2002-02-01
We contend that close relationships provide adults with optimal opportunities for personal growth when relationship partners provide accurate, honest feedback. Accordingly, it was predicted that young adults would experience greater relationship quality with partners who evaluated them in a manner consistent with their own self-evaluations. Three empirical tests of this self-verification hypothesis as applied to close dyads were conducted. In Study 1, young adults in dating relationships were most intimate with and somewhat more committed to partners when they perceived that partners evaluated them as they evaluated themselves. Self-verification effects were pronounced for those involved in more serious dating relationships. In Study 2, men reported the greatest esteem for same-sex roommates who evaluated them in a self-verifying manner. Results from Study 2 were replicated and extended to both male and female roommate dyads in Study 3. Further, self-verification effects were most pronounced for young adults with high emotional empathy. Results suggest that self-verification theory is useful for understanding dyadic adjustment across a variety of relational contexts in young adulthood. Implications of self-verification processes for adult personal development are outlined within an identity negotiation framework.
Pella, A; Riboldi, M; Tagaste, B; Bianculli, D; Desplanques, M; Fontana, G; Cerveri, P; Seregni, M; Fattori, G; Orecchia, R; Baroni, G
2014-08-01
In an increasing number of clinical indications, radiotherapy with accelerated particles shows relevant advantages when compared with high-energy X-ray irradiation. However, due to the finite range of ions, particle therapy can be severely compromised by setup errors and geometric uncertainties. The purpose of this work is to describe the commissioning and the design of the quality assurance procedures for the patient positioning and setup verification systems at the Italian National Center for Oncological Hadrontherapy (CNAO). The accuracy of the systems installed at CNAO and devoted to patient positioning and setup verification has been assessed using a laser tracking device. The accuracy in calibration and in image-based setup verification relying on the in-room X-ray imaging system was also quantified. Quality assurance tests to check the integration among all patient setup systems were designed, and records of daily QA tests since the start of clinical operation (2011) are presented. The overall accuracy of the patient positioning system and the patient verification system motion was proved to be below 0.5 mm under all the examined conditions, with median values below the 0.3 mm threshold. Image-based registration in phantom studies exhibited sub-millimetric accuracy in setup verification at both cranial and extra-cranial sites. The calibration residuals of the OTS were found consistent with the expectations, with peak values below 0.3 mm. Quality assurance tests, performed daily before clinical operation, confirm adequate integration and sub-millimetric setup accuracy. Robotic patient positioning was successfully integrated with optical tracking and stereoscopic X-ray verification for patient setup in particle therapy. Sub-millimetric setup accuracy was achieved and consistently verified in daily clinical operation.
Nord, B.; Buckley-Geer, E.; Lin, H.; ...
2016-08-05
We report the observation and confirmation of the first group- and cluster-scale strong gravitational lensing systems found in Dark Energy Survey data. Through visual inspection of data from the Science Verification season, we identified 53 candidate systems. We then obtained spectroscopic follow-up of 21 candidates using the Gemini Multi-object Spectrograph at the Gemini South telescope and the Inamori-Magellan Areal Camera and Spectrograph at the Magellan/Baade telescope. With this follow-up, we confirmed six candidates as gravitational lenses: three of the systems are newly discovered, and the remaining three were previously known. Of the 21 observed candidates, the remaining 15 either were not detected in spectroscopic observations, were observed and did not exhibit continuum emission (or spectral features), or were ruled out as lensing systems. The confirmed sample consists of one group-scale and five galaxy-cluster-scale lenses. The lensed sources range in redshift z ~ 0.80-3.2 and in i-band surface brightness i_SB ~ 23-25 mag arcsec^-2 (2'' aperture). For each of the six systems, we estimate the Einstein radius θ_E and the enclosed mass M_enc, which have ranges θ_E ~ 5''-9'' and M_enc ~ 8 × 10^12 to 6 × 10^13 M_⊙, respectively.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nord, B.; Buckley-Geer, E.; Lin, H.
We report the observation and confirmation of the first group- and cluster-scale strong gravitational lensing systems found in Dark Energy Survey data. Through visual inspection of data from the Science Verification season, we identified 53 candidate systems. We then obtained spectroscopic follow-up of 21 candidates using the Gemini Multi-object Spectrograph at the Gemini South telescope and the Inamori-Magellan Areal Camera and Spectrograph at the Magellan/Baade telescope. With this follow-up, we confirmed six candidates as gravitational lenses: three of the systems are newly discovered, and the remaining three were previously known. Of the 21 observed candidates, the remaining 15 either were not detected in spectroscopic observations, were observed and did not exhibit continuum emission (or spectral features), or were ruled out as lensing systems. The confirmed sample consists of one group-scale and five galaxy-cluster-scale lenses. The lensed sources range in redshift z ~ 0.80-3.2 and in i-band surface brightness i_SB ~ 23-25 mag arcsec^-2 (2'' aperture). For each of the six systems, we estimate the Einstein radius θ_E and the enclosed mass M_enc, which have ranges θ_E ~ 5''-9'' and M_enc ~ 8 × 10^12 to 6 × 10^13 M_⊙, respectively.
Toward Automatic Verification of Goal-Oriented Flow Simulations
NASA Technical Reports Server (NTRS)
Nemec, Marian; Aftosmis, Michael J.
2014-01-01
We demonstrate the power of adaptive mesh refinement with adjoint-based error estimates in verification of simulations governed by the steady Euler equations. The flow equations are discretized using a finite volume scheme on a Cartesian mesh with cut cells at the wall boundaries. The discretization error in selected simulation outputs is estimated using the method of adjoint-weighted residuals. Practical aspects of the implementation are emphasized, particularly in the formulation of the refinement criterion and the mesh adaptation strategy. Following a thorough code verification example, we demonstrate simulation verification of two- and three-dimensional problems. These involve an airfoil performance database, a pressure signature of a body in supersonic flow and a launch abort with strong jet interactions. The results show reliable estimates and automatic control of discretization error in all simulations at an affordable computational cost. Moreover, the approach remains effective even when theoretical assumptions, e.g., steady-state and solution smoothness, are relaxed.
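A schematic of the adjoint-weighted residual estimate referred to above: the residual of the coarse solution, evaluated in a finer (embedded) space, is weighted by the adjoint solution for the output of interest, and the same products drive the refinement criterion. The function and array names are idealized stand-ins, not the paper's cut-cell finite-volume implementation.

import numpy as np

def adjoint_error_estimate(residual_fine_of_coarse, adjoint_fine):
    """Output-error estimate delta_J ~ psi_h . R_h(u_H): the fine-space residual of the
    injected coarse solution u_H weighted by the fine-space adjoint psi_h."""
    return float(np.dot(adjoint_fine, residual_fine_of_coarse))

def refinement_indicators(residual_fine_of_coarse, adjoint_fine, cell_map):
    """Per-coarse-cell adaptation indicator: sum of |psi_i * R_i| over the fine cells
    (index lists in cell_map) contained in each coarse cell."""
    contrib = np.abs(adjoint_fine * residual_fine_of_coarse)
    return np.array([contrib[idx].sum() for idx in cell_map])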
Arms Control Verification: ’Bridge’ Theories and the Politics of Expediency.
1983-04-01
that the compliance verification dilemma, a uniquely American problem, creates a set of opportunities that are, in fact, among the principal reasons for...laws of the class struggle. While Americans were arguing among themselves about whether detente should involve political "linkage," the Chairman...required an equivalent American willingness to persevere indefinitely. But to generate that kind of fervor among the voting populace would have required
Dynamic analysis for shuttle design verification
NASA Technical Reports Server (NTRS)
Fralich, R. W.; Green, C. E.; Rheinfurth, M. H.
1972-01-01
Two approaches that are used for determining the modes and frequencies of space shuttle structures are discussed. The first method, direct numerical analysis, involves finite element mathematical modeling of the space shuttle structure in order to use computer programs for dynamic structural analysis. The second method utilizes modal-coupling techniques of experimental verification made by vibrating only spacecraft components and by deducing modes and frequencies of the complete vehicle from results obtained in the component tests.
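In the first approach, the modes and frequencies come from the generalized eigenvalue problem K φ = ω² M φ assembled from the finite element model; a minimal sketch with small stand-in matrices (not shuttle data):

import numpy as np
from scipy.linalg import eigh

# stand-in 3-DOF stiffness and mass matrices for illustration only
K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  1.0]]) * 1.0e6     # N/m
M = np.diag([10.0, 10.0, 10.0])                # kg

omega_sq, phi = eigh(K, M)                     # solves K phi = omega^2 M phi
freqs_hz = np.sqrt(omega_sq) / (2.0 * np.pi)   # natural frequencies; columns of phi are mode shapes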
Horseshoes in a Chaotic System with Only One Stable Equilibrium
NASA Astrophysics Data System (ADS)
Huan, Songmei; Li, Qingdu; Yang, Xiao-Song
To confirm the numerically demonstrated chaotic behavior in a chaotic system with only one stable equilibrium reported by Wang and Chen, we resort to Poincaré map technique and present a rigorous computer-assisted verification of horseshoe chaos by virtue of topological horseshoes theory.
Bayesian Estimation of Combined Accuracy for Tests with Verification Bias
Broemeling, Lyle D.
2011-01-01
This presentation will emphasize the estimation of the combined accuracy of two or more tests when verification bias is present. Verification bias occurs when some of the subjects are not subject to the gold standard. The approach is Bayesian where the estimation of test accuracy is based on the posterior distribution of the relevant parameter. Accuracy of two combined binary tests is estimated employing either “believe the positive” or “believe the negative” rule, then the true and false positive fractions for each rule are computed for two tests. In order to perform the analysis, the missing at random assumption is imposed, and an interesting example is provided by estimating the combined accuracy of CT and MRI to diagnose lung cancer. The Bayesian approach is extended to two ordinal tests when verification bias is present, and the accuracy of the combined tests is based on the ROC area of the risk function. An example involving mammography with two readers with extreme verification bias illustrates the estimation of the combined test accuracy for ordinal tests. PMID:26859487
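For two binary tests, the "believe the positive" rule calls the combination positive if either test is positive, and "believe the negative" only if both are. Under a conditional-independence assumption (made here purely for illustration; the paper instead estimates these quantities from posterior distributions, with the missing-at-random assumption handling unverified subjects), the combined fractions follow directly:

def believe_the_positive(tpf1, fpf1, tpf2, fpf2):
    """Combined TPF/FPF when the pair is positive if either test is positive,
    assuming the tests are conditionally independent given disease status."""
    return tpf1 + tpf2 - tpf1 * tpf2, fpf1 + fpf2 - fpf1 * fpf2

def believe_the_negative(tpf1, fpf1, tpf2, fpf2):
    """Combined TPF/FPF when the pair is positive only if both tests are positive."""
    return tpf1 * tpf2, fpf1 * fpf2

# illustrative accuracies for two imaging tests (not the CT/MRI values from the example)
print(believe_the_positive(0.85, 0.10, 0.80, 0.15))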
Verification of chloride adsorption effect of mortar with salt adsorbent
NASA Astrophysics Data System (ADS)
Hoshina, T.; Nakajima, N.; Sudo, H.; Date, S.
2017-11-01
In order to investigate the chloride adsorption effect of mortar mixed with chloride adsorbent, an electrophoresis test using mortar specimens and a repeated immersion-and-drying test were conducted to evaluate the chloride adsorption effect. As a result, it was confirmed that the soluble salt content that causes corrosion of rebar in the specimen was reduced by the chloride adsorbent, and a corrosion-inhibiting effect on the rebar was also obtained. It was also confirmed that increasing the dosage of the chloride adsorbent increases the chloride adsorption effect.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henry, Michael J.; Cramer, Nicholas O.; Benz, Jacob M.
Traditional arms control treaty verification activities typically involve a combination of technical measurements via physical and chemical sensors, state declarations, political agreements, and on-site inspections involving international subject matter experts. However, the ubiquity of the internet, and the electronic sharing of data that it enables, has made available a wealth of open source information with the potential to benefit verification efforts. Open source information is already being used by organizations such as the International Atomic Energy Agency to support the verification of state-declared information, prepare inspectors for in-field activities, and to maintain situational awareness. The recent explosion in social media use has opened new doors to exploring the attitudes, moods, and activities around a given topic. Social media platforms, such as Twitter, Facebook, and YouTube, offer an opportunity for individuals, as well as institutions, to participate in a global conversation at minimal cost. Social media data can also provide a more data-rich environment, with text data being augmented with images, videos, and location data. The research described in this paper investigates the utility of applying social media signatures as potential arms control and nonproliferation treaty verification tools and technologies, as determined through a series of case studies. The treaty relevant events that these case studies touch upon include detection of undeclared facilities or activities, determination of unknown events recorded by the International Monitoring System (IMS), and the global media response to the occurrence of an Indian missile launch. The case studies examine how social media can be used to fill an information gap and provide additional confidence to a verification activity. The case studies represent, either directly or through a proxy, instances where social media information may be available that could potentially augment the evaluation of an event. The goal of this paper is to instigate a discussion within the verification community as to where and how social media can be effectively utilized to complement and enhance traditional treaty verification efforts. In addition, this paper seeks to identify areas of future research and development necessary to adapt social media analytic tools and techniques, and to form the seed for social media analytics to aid and inform arms control and nonproliferation policymakers and analysts. While social media analysis (as well as open source analysis as a whole) will not ever be able to replace traditional arms control verification measures, they do supply unique signatures that can augment existing analysis.
INF verification: a guide for the perplexed
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mendelsohn, J.
1987-09-01
The administration has dug itself some deep holes on the verification issue. It will have to conclude an arms control treaty without having resolved earlier (but highly questionable) compliance issues on which it has placed great emphasis. It will probably have to abandon its more sweeping (and unnecessary) on-site inspection (OSI) proposals because of adverse security and political implications for the United States and its allies. And, finally, it will probably have to present to the Congress an INF treaty that will provide for a considerably less-stringent (but nonetheless adequate) verification regime than it had originally demanded. It is difficult to dispel the impression that, when the likelihood of concluding an INF treaty seemed remote, the administration indulged its penchant for intrusive and non-negotiable verification measures. As the possibility of, and eagerness for, a treaty increased, and as the Soviet Union shifted its policy from one of resistance to OSI to one of indicating that on-site verification involved reciprocal obligations, the administration was forced to scale back its OSI rhetoric. This re-evaluation of OSI by the administration does not make the INF treaty any less verifiable; from the outset the Reagan administration was asking for a far more extensive verification package than was necessary, practicable, acceptable, or negotiable.
Hehmke, Bernd; Berg, Sabine; Salzsieder, Eckhard
2017-05-01
Continuous standardized verification of the accuracy of blood glucose meter systems for self-monitoring after their introduction into the market is an important clinical tool to assure reliable performance of subsequently released lots of strips. Moreover, such published verification studies permit comparison of different blood glucose monitoring systems and, thus, are increasingly involved in the process of evidence-based purchase decision making.
NASA Technical Reports Server (NTRS)
Spruce, Joseph P.; Hall, Calllie; McPherson, Terry; Spiering, Bruce; Brown, Richard; Estep, Lee; Lunde, Bruce; Guest, DeNeice; Navard, Andy; Pagnutti, Mary;
2006-01-01
This report discusses verification and validation (V&V) assessment of Moderate Resolution Imaging Spectroradiometer (MODIS) ocean data products contributed by the Naval Research Laboratory (NRL) and Applied Coherent Technologies (ACT) Corporation to the National Oceanic and Atmospheric Administration's (NOAA) Near Real Time (NRT) Harmful Algal Blooms Observing System (HABSOS). HABSOS is a maturing decision support tool (DST) used by NOAA and its partners involved with coastal and public health management.
2006-09-30
High-Pressure Waterjet • CO2 Pellet/Turbine Wheel • Ultrahigh-Pressure Waterjet • Process Water Reuse/Recycle • Cross-Flow Microfiltration ...documented on a process or laboratory form. Corrective action will involve taking all necessary steps to restore a measuring system to proper working order...In all cases, a nonconformance will be rectified before sample processing and analysis continues. If corrective action does not restore
Kinetic-Family-Drawing Styles and Emotionally Disturbed Childhood Behavior
ERIC Educational Resources Information Center
McPhee, John P.; Wegner, Kenneth W.
1976-01-01
The purpose of the study was to achieve empirical verification that Kinetic-Family-Drawing (KFD) styles were reflected in the drawings of children with moderately/severely emotionally disturbed interpersonal relationships, as opposed to the nonstylized drawings of adjusted children. The results confirmed the general existence of KFD style; however, style was not…
Design Authority in the Test Programme Definition: The Alenia Spazio Experience
NASA Astrophysics Data System (ADS)
Messidoro, P.; Sacchi, E.; Beruto, E.; Fleming, P.; Marucchi Chierro, P.-P.
2004-08-01
In addition, since the Verification and Test Programme is a significant part of the spacecraft development life cycle in terms of cost and time, the discussion very often has the objective of optimizing the verification campaign by deleting or limiting some testing activities. The increased market pressure to reduce a project's schedule and cost gives rise to a dialectic process inside project teams, involving programme management and design authorities, in order to optimize the verification and testing programme. The paper introduces the Alenia Spazio experience in this context, drawn from real project life on different products and missions (science, TLC, EO, manned, transportation, military, commercial, recurrent and one-of-a-kind). Usually the applicable verification and testing standards (e.g. ECSS-E-10 part 2 "Verification" and ECSS-E-10 part 3 "Testing" [1]) are tailored to the specific project on the basis of its peculiar mission constraints. The model philosophy and the associated verification and test programme are defined following an iterative process which suitably combines several aspects (including, for example, test requirements and facilities), as shown in Fig. 1 (from ECSS-E-10). The considered cases are mainly oriented to thermal and mechanical verification, where the benefits of possible test programme optimizations are more significant. Considering thermal qualification and acceptance testing (i.e. Thermal Balance and Thermal Vacuum), the lessons learned from the development of several satellites are presented together with the corresponding recommended approaches. In particular, cases are indicated in which a proper Thermal Balance Test is mandatory and others, in the presence of a more recurrent design, where qualification by analysis could be envisaged. The importance of a proper Thermal Vacuum exposure for workmanship verification is also highlighted. Similar considerations are summarized for the mechanical testing, with particular emphasis on the importance of Modal Survey, Static and Sine Vibration Tests in the qualification stage, in combination with the effectiveness of the Vibro-Acoustic Test in acceptance. The apparent relative importance of the Sine Vibration Test for workmanship verification in specific circumstances is also highlighted. (Fig. 1: Model philosophy, Verification and Test Programme definition.) The verification of the project requirements is planned through a combination of suitable verification methods (in particular Analysis and Test) at the different verification levels (from System down to Equipment), in the proper verification stages (e.g. in Qualification and Acceptance).
Verification of a Byzantine-Fault-Tolerant Self-stabilizing Protocol for Clock Synchronization
NASA Technical Reports Server (NTRS)
Malekpour, Mahyar R.
2008-01-01
This paper presents the mechanical verification of a simplified model of a rapid Byzantine-fault-tolerant self-stabilizing protocol for distributed clock synchronization systems. This protocol does not rely on any assumptions about the initial state of the system except for the presence of sufficient good nodes, thus making the weakest possible assumptions and producing the strongest results. This protocol tolerates bursts of transient failures, and deterministically converges within a time bound that is a linear function of the self-stabilization period. A simplified model of the protocol is verified using the Symbolic Model Verifier (SMV). The system under study consists of 4 nodes, where at most one of the nodes is assumed to be Byzantine faulty. The model checking effort is focused on verifying correctness of the simplified model of the protocol in the presence of a permanent Byzantine fault as well as confirmation of claims of determinism and linear convergence with respect to the self-stabilization period. Although model checking results of the simplified model of the protocol confirm the theoretical predictions, these results do not necessarily confirm that the protocol solves the general case of this problem. Modeling challenges of the protocol and the system are addressed. A number of abstractions are utilized in order to reduce the state space.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nabavizadeh, Nima, E-mail: nabaviza@ohsu.edu; Elliott, David A.; Chen, Yiyi
Purpose: To survey image guided radiation therapy (IGRT) practice patterns, as well as IGRT's impact on clinical workflow and planning treatment volumes (PTVs). Methods and Materials: A sample of 5979 treatment site-specific surveys was e-mailed to the membership of the American Society for Radiation Oncology (ASTRO), with questions pertaining to IGRT modality/frequency, PTV expansions, method of image verification, and perceived utility/value of IGRT. On-line image verification was defined as images obtained and reviewed by the physician before treatment. Off-line image verification was defined as images obtained before treatment and then reviewed by the physician before the next treatment. Results: Of 601 evaluable responses, 95% reported IGRT capabilities other than portal imaging. The majority (92%) used volumetric imaging (cone-beam CT [CBCT] or megavoltage CT), with volumetric imaging being the most commonly used modality for all sites except breast. The majority of respondents obtained daily CBCTs for head and neck intensity modulated radiation therapy (IMRT), lung 3-dimensional conformal radiation therapy or IMRT, anus or pelvis IMRT, prostate IMRT, and prostatic fossa IMRT. For all sites, on-line image verification was most frequently performed during the first few fractions only. No association was seen between IGRT frequency or CBCT utilization and clinical treatment volume to PTV expansions. Of the 208 academic radiation oncologists who reported working with residents, only 41% reported trainee involvement in IGRT verification processes. Conclusion: Consensus guidelines, further evidence-based approaches for PTV margin selection, and greater resident involvement are needed for standardized use of IGRT practices.
Nabavizadeh, Nima; Elliott, David A; Chen, Yiyi; Kusano, Aaron S; Mitin, Timur; Thomas, Charles R; Holland, John M
2016-03-15
To survey image guided radiation therapy (IGRT) practice patterns, as well as IGRT's impact on clinical workflow and planning treatment volumes (PTVs). A sample of 5979 treatment site-specific surveys was e-mailed to the membership of the American Society for Radiation Oncology (ASTRO), with questions pertaining to IGRT modality/frequency, PTV expansions, method of image verification, and perceived utility/value of IGRT. On-line image verification was defined as images obtained and reviewed by the physician before treatment. Off-line image verification was defined as images obtained before treatment and then reviewed by the physician before the next treatment. Of 601 evaluable responses, 95% reported IGRT capabilities other than portal imaging. The majority (92%) used volumetric imaging (cone-beam CT [CBCT] or megavoltage CT), with volumetric imaging being the most commonly used modality for all sites except breast. The majority of respondents obtained daily CBCTs for head and neck intensity modulated radiation therapy (IMRT), lung 3-dimensional conformal radiation therapy or IMRT, anus or pelvis IMRT, prostate IMRT, and prostatic fossa IMRT. For all sites, on-line image verification was most frequently performed during the first few fractions only. No association was seen between IGRT frequency or CBCT utilization and clinical treatment volume to PTV expansions. Of the 208 academic radiation oncologists who reported working with residents, only 41% reported trainee involvement in IGRT verification processes. Consensus guidelines, further evidence-based approaches for PTV margin selection, and greater resident involvement are needed for standardized use of IGRT practices. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Gravitz, Robert M.; Hale, Joseph
2006-01-01
NASA's Exploration Systems Mission Directorate (ESMD) is implementing a management approach for modeling and simulation (M&S) that will provide decision-makers information on the model's fidelity, credibility, and quality. This information will allow the decision-maker to understand the risks involved in using a model's results in the decision-making process. This presentation will discuss NASA's approach for verification and validation (V&V) of its models or simulations supporting space exploration. This presentation will describe NASA's V&V process and the associated M&S verification and validation (V&V) activities required to support the decision-making process. The M&S V&V Plan and V&V Report templates for ESMD will also be illustrated.
Cognitive Bias in Systems Verification
NASA Technical Reports Server (NTRS)
Larson, Steve
2012-01-01
Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is studied in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways. Overconfidence -> questionable decisions to deploy. Availability -> inability to conceive critical tests. Representativeness -> overinterpretation of results. Positive test strategies -> confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. Worth considering at key points in the process.
NASA Astrophysics Data System (ADS)
Nord, B.; Buckley-Geer, E.; Lin, H.; Diehl, H. T.; Helsby, J.; Kuropatkin, N.; Amara, A.; Collett, T.; Allam, S.; Caminha, G. B.; De Bom, C.; Desai, S.; Dúmet-Montoya, H.; Pereira, M. Elidaiana da S.; Finley, D. A.; Flaugher, B.; Furlanetto, C.; Gaitsch, H.; Gill, M.; Merritt, K. W.; More, A.; Tucker, D.; Saro, A.; Rykoff, E. S.; Rozo, E.; Birrer, S.; Abdalla, F. B.; Agnello, A.; Auger, M.; Brunner, R. J.; Carrasco Kind, M.; Castander, F. J.; Cunha, C. E.; da Costa, L. N.; Foley, R. J.; Gerdes, D. W.; Glazebrook, K.; Gschwend, J.; Hartley, W.; Kessler, R.; Lagattuta, D.; Lewis, G.; Maia, M. A. G.; Makler, M.; Menanteau, F.; Niernberg, A.; Scolnic, D.; Vieira, J. D.; Gramillano, R.; Abbott, T. M. C.; Banerji, M.; Benoit-Lévy, A.; Brooks, D.; Burke, D. L.; Capozzi, D.; Carnero Rosell, A.; Carretero, J.; D'Andrea, C. B.; Dietrich, J. P.; Doel, P.; Evrard, A. E.; Frieman, J.; Gaztanaga, E.; Gruen, D.; Honscheid, K.; James, D. J.; Kuehn, K.; Li, T. S.; Lima, M.; Marshall, J. L.; Martini, P.; Melchior, P.; Miquel, R.; Neilsen, E.; Nichol, R. C.; Ogando, R.; Plazas, A. A.; Romer, A. K.; Sako, M.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thaler, J.; Walker, A. R.; Wester, W.; Zhang, Y.; DES Collaboration
2016-08-01
We report the observation and confirmation of the first group- and cluster-scale strong gravitational lensing systems found in Dark Energy Survey data. Through visual inspection of data from the Science Verification season, we identified 53 candidate systems. We then obtained spectroscopic follow-up of 21 candidates using the Gemini Multi-object Spectrograph at the Gemini South telescope and the Inamori-Magellan Areal Camera and Spectrograph at the Magellan/Baade telescope. With this follow-up, we confirmed six candidates as gravitational lenses: three of the systems are newly discovered, and the remaining three were previously known. Of the 21 observed candidates, the remaining 15 either were not detected in spectroscopic observations, were observed and did not exhibit continuum emission (or spectral features), or were ruled out as lensing systems. The confirmed sample consists of one group-scale and five galaxy-cluster-scale lenses. The lensed sources range in redshift z ˜ 0.80-3.2 and in i-band surface brightness i_SB ˜ 23-25 mag arcsec^-2 (2″ aperture). For each of the six systems, we estimate the Einstein radius θ_E and the enclosed mass M_enc, which have ranges θ_E ˜ 5″-9″ and M_enc ˜ 8 × 10^12 to 6 × 10^13 M_⊙, respectively. This paper includes data gathered with the 6.5 m Magellan Telescopes located at Las Campanas Observatory, Chile.
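As context for the quoted mass range, the sketch below shows how an Einstein radius is commonly converted to an enclosed (Einstein) mass under a point-mass or singular-isothermal-sphere approximation; the redshifts, the 7″ radius, and the choice of a Planck15 cosmology are illustrative assumptions, not values taken from the survey.

```python
# Hypothetical illustration: converting an Einstein radius into an enclosed mass
# under a point-mass/SIS approximation with a Planck15 cosmology. The redshifts
# and the angle below are illustrative values, not measurements from the survey.
import astropy.units as u
from astropy.constants import c, G
from astropy.cosmology import Planck15 as cosmo

z_lens, z_src = 0.45, 2.0               # assumed lens and source redshifts
theta_E = (7.0 * u.arcsec).to(u.rad)    # assumed Einstein radius

D_l = cosmo.angular_diameter_distance(z_lens)
D_s = cosmo.angular_diameter_distance(z_src)
D_ls = cosmo.angular_diameter_distance_z1z2(z_lens, z_src)

# M_enc = theta_E^2 * c^2 / (4 G) * D_l * D_s / D_ls
M_enc = (theta_E.value ** 2 * c ** 2 / (4 * G) * D_l * D_s / D_ls).to(u.Msun)
print(f"enclosed (Einstein) mass ~ {M_enc:.2e}")
```

With these assumed inputs the estimate comes out near 10^13 M_⊙, i.e. within the order of magnitude quoted above.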
Yu, Shihui; Kielt, Matthew; Stegner, Andrew L; Kibiryeva, Nataliya; Bittel, Douglas C; Cooley, Linda D
2009-12-01
The American College of Medical Genetics guidelines for microarray analysis for constitutional cytogenetic abnormalities require abnormal or ambiguous results from microarray-based comparative genomic hybridization (aCGH) analysis be confirmed by an alternative method. We employed quantitative real-time polymerase chain reaction (qPCR) technology using SYBR Green I reagents for confirmation of 93 abnormal aCGH results (50 deletions and 43 duplications) and 54 parental samples. A novel qPCR protocol using DNA sequences coding for X-linked lethal diseases in males for designing reference primers was established. Of the 81 sets of test primers used for confirmation of 93 abnormal copy number variants (CNVs) in 80 patients, 71 sets worked after the initial primer design (88%), 9 sets were redesigned once, and 1 set twice because of poor amplification. Fifty-four parental samples were tested using 33 sets of test primers to follow up 34 CNVs in 30 patients. Nineteen CNVs were confirmed as inherited, 13 were negative in both parents, and 2 were inconclusive due to a negative result in a single parent. The qPCR assessment clarified aCGH results in two cases and corrected a fluorescence in situ hybridization result in one case. Our data illustrate that qPCR methodology using SYBR Green I reagents is accurate, highly sensitive, specific, rapid, and cost-effective for verification of chromosomal imbalances detected by aCGH in the clinical setting.
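The abstract does not spell out the copy-number arithmetic, so the following is only a generic sketch of relative quantification by the comparative Ct (ΔΔCt) approach with SYBR Green-style data; the Ct values, the assumed two-copy control, and the assumed 100% amplification efficiency are illustrative, not data from the study.

```python
# Illustrative comparative-Ct estimate of relative copy number at a test locus,
# normalized to a reference locus and to a normal (two-copy) control sample.
# This is a generic qPCR calculation, not the exact protocol of the study.
def relative_copy_number(ct_test_patient, ct_ref_patient,
                         ct_test_control, ct_ref_control,
                         control_copies=2, efficiency=2.0):
    d_ct_patient = ct_test_patient - ct_ref_patient
    d_ct_control = ct_test_control - ct_ref_control
    dd_ct = d_ct_patient - d_ct_control
    return control_copies * efficiency ** (-dd_ct)

# A heterozygous deletion should come out near one copy:
print(relative_copy_number(26.8, 25.0, 25.8, 25.0))  # ~1.0 copy
```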
NASA Astrophysics Data System (ADS)
Chen, C.; Rundle, J. B.; Holliday, J. R.; Nanjo, K.; Turcotte, D. L.; Li, S.; Tiampo, K. F.
2005-12-01
Forecast verification procedures for statistical events with binary outcomes typically rely on the use of contingency tables and Relative Operating Characteristic (ROC) diagrams. Originally developed for the statistical evaluation of tornado forecasts on a county-by-county basis, these methods can be adapted to the evaluation of competing earthquake forecasts. Here we apply these methods retrospectively to two forecasts for the m = 7.3 1999 Chi-Chi, Taiwan, earthquake. These forecasts are based on a method, Pattern Informatics (PI), that locates likely sites for future large earthquakes based on large change in activity of the smallest earthquakes. A competing null hypothesis, Relative Intensity (RI), is based on the idea that future large earthquake locations are correlated with sites having the greatest frequency of small earthquakes. We show that for Taiwan, the PI forecast method is superior to the RI forecast null hypothesis. Inspection of the two maps indicates that their forecast locations are indeed quite different. Our results confirm an earlier result suggesting that the earthquake preparation process for events such as the Chi-Chi earthquake involves anomalous changes in activation or quiescence, and that signatures of these processes can be detected in precursory seismicity data. Furthermore, we find that our methods can accurately forecast the locations of aftershocks from precursory seismicity changes alone, implying that the main shock together with its aftershocks represent a single manifestation of the formation of a high-stress region nucleating prior to the main shock.
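A minimal sketch of the verification machinery described above (contingency-table counts and an ROC curve built by thresholding a gridded forecast score against binary outcomes) is given below; the scores and outcomes are synthetic, and no attempt is made to reproduce the PI or RI forecasts themselves.

```python
# Sketch of forecast verification with a contingency table and an ROC curve:
# each grid cell gets a forecast score and a binary outcome (event / no event).
# Thresholding the score at many levels traces out hit rate vs. false-alarm rate.
import numpy as np

rng = np.random.default_rng(0)
score = rng.random(1000)                       # synthetic forecast score per cell
occurred = rng.random(1000) < 0.3 * score      # synthetic binary outcomes

def roc_points(score, occurred, thresholds):
    pts = []
    for t in thresholds:
        forecast = score >= t
        hits = np.sum(forecast & occurred)
        misses = np.sum(~forecast & occurred)
        false_alarms = np.sum(forecast & ~occurred)
        correct_negs = np.sum(~forecast & ~occurred)
        hit_rate = hits / (hits + misses)
        false_alarm_rate = false_alarms / (false_alarms + correct_negs)
        pts.append((false_alarm_rate, hit_rate))
    return np.array(pts)

pts = roc_points(score, occurred, np.linspace(0, 1, 51))
far, hr = pts[::-1, 0], pts[::-1, 1]                      # increasing false-alarm rate
auc = np.sum(np.diff(far) * (hr[1:] + hr[:-1]) / 2.0)     # trapezoidal area under ROC
print(f"area under ROC curve ~ {auc:.2f}  (0.5 = no skill, 1.0 = perfect)")
```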
2009-10-01
will guarantee a solid base for the future. The content of this publication has been reproduced directly from material supplied by RTO or the...intensity threat involving a local population wanting to break into the camp to steal material and food supplies; and • A higher intensity threat...combatant evacuation operations, distribute emergency supplies, and evacuate/relocate refugees and displaced persons. Specified NLW-relevant tasks are
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-04-28
This health and safety plan sets forth the requirements and procedures to protect the personnel involved in the Lead Source Removal Project at the Former YS-86O Firing Ranges. This project will be conducted in a manner that ensures the protection of the safety and health of workers, the public, and the environment. The purpose of this removal action is to address lead contaminated soil and reduce a potential risk to human health and the environment. This site is an operable unit within the Upper East Fork Poplar Creek watershed. The removal action will contribute to early source actions within the watershed. The project will accomplish this through the removal of lead-contaminated soil in the target areas of the two small arms firing ranges. This plan covers the removal actions at the Former YS-86O Firing Ranges. These actions involve the excavation of lead-contaminated soils, the removal of the concrete trench and macadam (asphalt) paths, verification/confirmation sampling, grading and revegetation. The primary hazards include temperature extremes, equipment operation, noise, potential lead exposure, uneven and slippery working surfaces, and insects.
2009-01-01
Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075
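As a toy illustration of the kind of property checking being automated here (not the GNA, NUSMV, or CADP interfaces themselves), the sketch below enumerates the reachable states of a small invented Boolean network and checks one invariant and one reachability property.

```python
# Toy state-space exploration in the spirit of model checking: enumerate all
# states reachable from an initial state of a tiny Boolean "network" and check
# (1) an invariant and (2) whether a target state is reachable. The network and
# the properties are invented for illustration; real tools (NuSMV, CADP) work on
# much richer temporal-logic specifications.
from collections import deque

def successors(state):
    a, b, c = state
    # invented asynchronous update rules: a repressed by c, b driven by a, c driven by b
    return {(int(not c), b, c), (a, a, c), (a, b, b)}

def reachable(initial):
    seen, frontier = {initial}, deque([initial])
    while frontier:
        for nxt in successors(frontier.popleft()):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

states = reachable((1, 0, 0))
invariant_holds = all(not (s[0] and s[2]) for s in states)   # "a and c never both on"
target_reachable = (0, 1, 1) in states                       # "state with b and c on, a off"
print(len(states), invariant_holds, target_reachable)
```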
Integrated Medical Model Verification, Validation, and Credibility
NASA Technical Reports Server (NTRS)
Walton, Marlei; Kerstman, Eric; Foy, Millennia; Shah, Ronak; Saile, Lynn; Boley, Lynn; Butler, Doug; Myers, Jerry
2014-01-01
The Integrated Medical Model (IMM) was designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic (stochastic process) model based on historical data, cohort data, and subject matter expert opinion. A probabilistic approach is taken since exact (deterministic) results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation. METHODS: In 2008, the IMM team developed a comprehensive verification and validation (VV) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) was published, the IMM team updated their verification, validation, and credibility (VVC) project plan to meet 7009 requirements and include 7009 tools in reporting VVC status of the IMM. RESULTS: IMM VVC updates are compiled recurrently and include 7009 Compliance and Credibility matrices, IMM VV Plan status, and a synopsis of any changes or updates to the IMM during the reporting period. Reporting tools have evolved over the lifetime of the IMM project to better communicate VVC status. This has included refining original 7009 methodology with augmentation from the NASA-STD-7009 Guidance Document. End user requests and requirements are being satisfied as evidenced by ISS Program acceptance of IMM risk forecasts, transition to an operational model and simulation tool, and completion of service requests from a broad end user consortium including Operations, Science and Technology Planning, and Exploration Planning. CONCLUSIONS: The VVC approach established by the IMM project of combining the IMM VV Plan with 7009 requirements is comprehensive and includes the involvement of end users at every stage in IMM evolution. Methods and techniques used to quantify the VVC status of the IMM have not only received approval from the local NASA community but have also garnered recognition by other federal agencies seeking to develop similar guidelines in the medical modeling community.
Scenarios for exercising technical approaches to verified nuclear reductions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doyle, James
2010-01-01
Presidents Obama and Medvedev in April 2009 committed to a continuing process of step-by-step nuclear arms reductions beyond the new START treaty that was signed April 8, 2010 and to the eventual goal of a world free of nuclear weapons. In addition, the US Nuclear Posture Review released April 6, 2010 commits the US to initiate a comprehensive national research and development program to support continued progress toward a world free of nuclear weapons, including expanded work on verification technologies and the development of transparency measures. It is impossible to predict the specific directions that US-RU nuclear arms reductions will take over the next 5-10 years. Additional bilateral treaties could be reached requiring effective verification as indicated by statements made by the Obama administration. There could also be transparency agreements or other initiatives (unilateral, bilateral or multilateral) that require monitoring with a standard of verification lower than formal arms control, but still needing to establish confidence to domestic, bilateral and multilateral audiences that declared actions are implemented. The US Nuclear Posture Review and other statements give some indication of the kinds of actions and declarations that may need to be confirmed in a bilateral or multilateral setting. Several new elements of the nuclear arsenals could be directly limited. For example, it is likely that both strategic and nonstrategic nuclear warheads (deployed and in storage), warhead components, and aggregate stocks of such items could be accountable under a future treaty or transparency agreement. In addition, new initiatives or agreements may require the verified dismantlement of a certain number of nuclear warheads over a specified time period. Eventually procedures for confirming the elimination of nuclear warheads, components and fissile materials from military stocks will need to be established. This paper is intended to provide useful background information for establishing a conceptual approach to a five-year technical program plan for research and development of nuclear arms reductions verification and transparency technologies and procedures.
Grelewska-Nowotko, Katarzyna; Żurawska-Zajfert, Magdalena; Żmijewska, Ewelina; Sowa, Sławomir
2018-05-01
In recent years, digital polymerase chain reaction (dPCR), a new molecular biology technique, has been gaining in popularity. Among many other applications, this technique can also be used for the detection and quantification of genetically modified organisms (GMOs) in food and feed. It might replace the currently widely used real-time PCR method (qPCR), by overcoming problems related to the PCR inhibition and the requirement of certified reference materials to be used as a calibrant. In theory, validated qPCR methods can be easily transferred to the dPCR platform. However, optimization of the PCR conditions might be necessary. In this study, we report the transfer of two validated qPCR methods for quantification of maize DAS1507 and NK603 events to the droplet dPCR (ddPCR) platform. After some optimization, both methods have been verified according to the guidance of the European Network of GMO Laboratories (ENGL) on analytical method verification (ENGL working group on "Method Verification." (2011) Verification of Analytical Methods for GMO Testing When Implementing Interlaboratory Validated Methods). Digital PCR methods performed equally or better than the qPCR methods. Optimized ddPCR methods confirm their suitability for GMO determination in food and feed.
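For readers unfamiliar with the quantification step, the sketch below shows the generic Poisson arithmetic behind droplet dPCR concentration estimates and a target/reference GM ratio; the droplet counts, the 0.85 nL droplet volume, and the assay labels in the comments are illustrative assumptions rather than data from the verification study.

```python
# Sketch of droplet digital PCR quantification: the fraction of positive droplets
# is converted to mean copies per droplet via the Poisson correction, then to
# copies per microliter, and the GM content is taken as the target/reference ratio.
import math

def copies_per_ul(positive, total, droplet_volume_ul=0.00085):
    p = positive / total
    lam = -math.log(1.0 - p)            # mean target copies per droplet
    return lam / droplet_volume_ul

target_conc = copies_per_ul(positive=1450, total=15000)      # event-specific assay (illustrative)
reference_conc = copies_per_ul(positive=14100, total=15000)  # taxon-specific assay (illustrative)
print(f"GM content ~ {100 * target_conc / reference_conc:.1f} % (copy/copy)")
```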
The NASA Commercial Crew Program (CCP) Mission Assurance Process
NASA Technical Reports Server (NTRS)
Canfield, Amy
2016-01-01
In 2010, NASA established the Commercial Crew Program in order to provide human access to the International Space Station and low Earth orbit via the commercial (non-governmental) sector. A particular challenge for NASA has been how to determine that a commercial provider's transportation system complies with Programmatic safety requirements. The process used in this determination is the Safety Technical Review Board, which reviews and approves provider-submitted Hazard Reports. One significant product of the review is a set of hazard control verifications. In past NASA programs, 100 percent of these safety-critical verifications were typically confirmed by NASA. The traditional Safety and Mission Assurance (SMA) model does not support the nature of the Commercial Crew Program. To that end, NASA SMA is implementing a Risk Based Assurance (RBA) process to determine which hazard control verifications require NASA authentication. Additionally, a Shared Assurance Model is also being developed to efficiently use the available resources to execute the verifications. This paper will describe the evolution of the CCP Mission Assurance process from the beginning of the Program to its current incarnation. Topics to be covered include a short history of the CCP; the development of the Programmatic mission assurance requirements; the current safety review process; a description of the RBA process and its products; and a description of the Shared Assurance Model.
Improving semi-text-independent method of writer verification using difference vector
NASA Astrophysics Data System (ADS)
Li, Xin; Ding, Xiaoqing
2009-01-01
The semi-text-independent method of writer verification based on the linear framework is a method that can use all characters of two handwritings to discriminate between writers when the text contents are known. The two handwritings may contain only small numbers of characters, which can even be totally different between the samples. This fills the gap between the classical text-dependent and text-independent methods of writer verification. Moreover, in this paper the identity of each character is exploited by the semi-text-independent method. Two types of standard templates, generated from many writer-unknown handwritten samples and printed samples of each character, are introduced to represent the content information of each character. The difference vectors of the character samples are obtained by subtracting the standard templates from the original feature vectors and are used to replace the original vectors in the writer verification process. By removing a large amount of content information while retaining the style information, the verification accuracy of the semi-text-independent method is improved. On a handwriting database involving 30 writers, when the query handwriting and the reference handwriting each consist of 30 distinct characters, the average equal error rate (EER) of writer verification reaches 9.96%. When the handwritings contain 50 characters, the average EER falls to 6.34%, which is 23.9% lower than the EER obtained without the difference vectors.
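The following sketch illustrates the difference-vector idea and an equal error rate computation in simplified form; the feature dimensions, templates, style model, and score distributions are all invented stand-ins, not the features or matcher used in the paper.

```python
# Sketch of the difference-vector idea plus an equal error rate (EER) computation.
import numpy as np

rng = np.random.default_rng(1)
n_chars, dim = 30, 16
templates = rng.normal(size=(n_chars, dim))      # per-character "standard templates"

def difference_vectors(features, char_ids):
    # subtract the content (template) part, keeping mostly style information
    return features - templates[char_ids]

# synthetic handwriting: template + writer-specific style offset + noise
char_ids = rng.integers(0, n_chars, size=50)
writer_style = 0.3 * rng.normal(size=dim)
features = templates[char_ids] + writer_style + 0.1 * rng.normal(size=(50, dim))
style_only = difference_vectors(features, char_ids)   # content largely removed
# (a real matcher would compare such style summaries between two handwritings)

def equal_error_rate(genuine_scores, impostor_scores):
    thresholds = np.sort(np.concatenate([genuine_scores, impostor_scores]))
    best_gap, eer = 2.0, 1.0
    for t in thresholds:
        frr = np.mean(genuine_scores < t)     # false rejection rate at threshold t
        far = np.mean(impostor_scores >= t)   # false acceptance rate at threshold t
        if abs(frr - far) < best_gap:
            best_gap, eer = abs(frr - far), (frr + far) / 2.0
    return eer

# synthetic genuine/impostor similarity scores, just to exercise the EER code
genuine = rng.normal(-1.0, 0.5, 200)
impostor = rng.normal(-2.0, 0.5, 200)
print(f"EER ~ {100 * equal_error_rate(genuine, impostor):.1f}%")
```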
Verification testing to confirm VO2max attainment in persons with spinal cord injury.
Astorino, Todd A; Bediamol, Noelle; Cotoia, Sarah; Ines, Kenneth; Koeu, Nicolas; Menard, Natasha; Nyugen, Brianna; Olivo, Cassandra; Phillips, Gabrielle; Tirados, Ardreen; Cruz, Gabriela Velasco
2018-01-22
Maximal oxygen uptake (VO2max) is a widely used measure of cardiorespiratory fitness, aerobic function, and overall health risk. Although VO2max has been measured for almost 100 yr, no standardized criteria exist to verify VO2max attainment. Studies document that incidence of 'true' VO2max obtained from incremental exercise (INC) can be confirmed using a subsequent verification test (VER). In this study, we examined efficacy of VER in persons with spinal cord injury (SCI). Repeated measures, within-subjects study. University laboratory in San Diego, CA. Ten individuals (age and injury duration = 33.3 ± 10.5 yr and 6.8 ± 6.2 yr) with SCI and 10 able-bodied (AB) individuals (age = 24.1 ± 7.4 yr). Peak oxygen uptake (VO2peak) was determined during INC on an arm ergometer followed by VER at 105 percent of peak power output (%PPO). Gas exchange data, heart rate (HR), and blood lactate concentration (BLa) were measured during exercise. Across all participants, VO2peak was highly related between protocols (ICC = 0.98) and the mean difference was equal to 0.08 ± 0.11 L/min. Compared to INC, VO2peak from VER was not different in SCI (1.30 ± 0.45 L/min vs. 1.31 ± 0.43 L/min) but higher in AB (1.63 ± 0.40 L/min vs. 1.76 ± 0.40 L/min). Data show similar VO2peak between incremental and verification tests in SCI, suggesting that VER confirms VO2max attainment. However, in AB participants completing arm ergometry, VER is essential to validate appearance of 'true' VO2peak.
Integrated testing and verification system for research flight software design document
NASA Technical Reports Server (NTRS)
Taylor, R. N.; Merilatt, R. L.; Osterweil, L. J.
1979-01-01
The NASA Langley Research Center is developing the MUST (Multipurpose User-oriented Software Technology) program to cut the cost of producing research flight software through a system of software support tools. The HAL/S language is the primary subject of the design. Boeing Computer Services Company (BCS) has designed an integrated verification and testing capability as part of MUST. Documentation, verification, and test options are provided, with special attention to real-time, multiprocessing issues. The needs of the entire software production cycle have been considered, with effective management and reduced lifecycle costs as foremost goals. Capabilities have been included in the design for static detection of data flow anomalies involving communicating concurrent processes. Some types of ill-formed process synchronization and deadlock are also detected statically.
A process improvement model for software verification and validation
NASA Technical Reports Server (NTRS)
Callahan, John; Sabolish, George
1994-01-01
We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.
Adapting an evidence-based model to retain adolescent study participants in longitudinal research.
Davis, Erin; Demby, Hilary; Jenner, Lynne Woodward; Gregory, Alethia; Broussard, Marsha
2016-02-01
Maintaining contact with and collecting outcome data from adolescent study participants can present a significant challenge for researchers conducting longitudinal studies. Establishing an organized and effective protocol for participant follow-up is crucial to reduce attrition and maintain high retention rates. This paper describes our methods in using and adapting the evidence-based Engagement, Verification, Maintenance, and Confirmation (EVMC) model to follow up with adolescents 6 and 12 months after implementation of a health program. It extends previous research by focusing on two key modifications to the model: (1) the central role of cell phones and texting to maintain contact with study participants throughout the EVMC process and, (2) use of responsive two-way communication between staff and participants and flexible administration modes and methods in the confirmation phase to ensure that busy teens not only respond to contacts, but also complete data collection. These strategies have resulted in high overall retention rates (87-91%) with adolescent study participants at each follow-up data collection point without the utilization of other, more involved tracking measures. The methods and findings presented may be valuable for other researchers with limited resources planning for or engaged in collecting follow-up outcome data from adolescents enrolled in longitudinal studies. Copyright © 2015. Published by Elsevier Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Talley, Darren G.
2017-04-01
This report describes the work and results of the verification and validation (V&V) of the version 1.0 release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, the equation of motion for fuel element thermal expansion, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This V&V effort was intended to confirm that the code shows good agreement between simulation and actual ACRR operations.
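As an illustration of one ingredient of such a coupled solution, the sketch below integrates the six-group point reactor kinetics equations for a step reactivity insertion with SciPy; the kinetics parameters and the insertion are generic illustrative values, and the fuel-element heat transfer, thermal expansion, and coolant equations that Razorback couples to are omitted.

```python
# Sketch of the point reactor kinetics part of such a coupled solution: six
# delayed-neutron precursor groups driven by a step reactivity insertion.
# Parameters are generic illustrative values, not ACRR or Razorback data.
import numpy as np
from scipy.integrate import solve_ivp

beta_i = np.array([2.1e-4, 1.4e-3, 1.3e-3, 2.6e-3, 7.5e-4, 2.7e-4])
lam_i = np.array([0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01])   # decay constants, 1/s
beta, Lambda = beta_i.sum(), 1.0e-5                            # neutron generation time, s
rho = 0.5 * beta                                               # step reactivity insertion

def kinetics(t, y):
    n, c = y[0], y[1:]
    dn = (rho - beta) / Lambda * n + np.dot(lam_i, c)
    dc = beta_i / Lambda * n - lam_i * c
    return np.concatenate(([dn], dc))

n0 = 1.0
c0 = beta_i / (lam_i * Lambda) * n0        # equilibrium precursor concentrations
sol = solve_ivp(kinetics, (0.0, 1.0), np.concatenate(([n0], c0)),
                method="LSODA", rtol=1e-8, atol=1e-10)
print(f"relative power after 1 s ~ {sol.y[0, -1]:.2f}")
```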
Development tests for the 2.5 megawatt Mod-2 wind turbine generator
NASA Technical Reports Server (NTRS)
Andrews, J. S.; Baskin, J. M.
1982-01-01
The 2.5 megawatt MOD-2 wind turbine generator test program is discussed. The development of the 2.5 megawatt MOD-2 wind turbine generator included an extensive program of testing which encompassed verification of analytical procedures, component development, and integrated system verification. The test program was to assure achievement of the thirty year design operational life of the wind turbine system as well as to minimize costly design modifications which would otherwise have been required during on-site system testing. Computer codes were modified, the fatigue life of structural and dynamic components was verified, mechanical and electrical components and subsystems were functionally checked and modified where necessary to meet system specifications, and measured dynamic responses of coupled systems confirmed analytical predictions.
24 CFR 5.210 - Purpose, applicability, and Federal preemption.
Code of Federal Regulations, 2010 CFR
2010-04-01
... and Urban Development GENERAL HUD PROGRAM REQUIREMENTS; WAIVERS Disclosure and Verification of Social... subpart involves income information from SWICAs, and wages, net earnings from self-employment, payments of...
Numerical Simulation of the Fluid-Structure Interaction of a Surface Effect Ship Bow Seal
NASA Astrophysics Data System (ADS)
Bloxom, Andrew L.
Numerical simulations of fluid-structure interaction (FSI) problems were performed in an effort to verify and validate a commercially available FSI tool. This tool uses an iterative partitioned coupling scheme between CD-adapco's STAR-CCM+ finite volume fluid solver and Simulia's Abaqus finite element structural solver to simulate the FSI response of a system. Preliminary verification and validation (V&V) work was carried out to understand the numerical behavior of the codes individually and together as a FSI tool. Verification and validation work that was completed included code order verification of the respective fluid and structural solvers with Couette-Poiseuille flow and Euler-Bernoulli beam theory. These results confirmed the second-order accuracy of the spatial discretizations used. Following that, a mixture of solution verifications and model calibrations was performed with the inclusion of the physics models implemented in the solution of the FSI problems. Solution verifications were completed for fluid and structural stand-alone models as well as for the coupled FSI solutions. These results re-confirmed the spatial order of accuracy, but for more complex flows and physics models, as well as the order of accuracy of the temporal discretizations. In lieu of a good material definition, model calibration is performed to reproduce the experimental results. This work used model calibration for both instances of hyperelastic materials that were presented in the literature as validation cases, because those materials were defined only as linear elastic. Calibrated, three-dimensional models of the bow seal on the University of Michigan bow seal test platform showed the ability to reproduce the experimental results qualitatively through averaging of the forces and seal displacements. These simulations represent the only current 3D results for this case. One significant result of this study is the ability to visualize the flow around the seal and to directly measure the seal resistances at varying cushion pressures, seal immersions, forward speeds, and different seal materials. SES design analysis could greatly benefit from the inclusion of flexible seals in simulations, and this work is a positive step in that direction. In future work, the inclusion of more complex seal geometries and contact will further enhance the capability of this tool.
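The order-verification arithmetic referred to above is simple enough to show directly: with errors measured against an exact solution on successively refined grids, the observed order of accuracy follows from the error ratio and the refinement ratio. The error values below are invented purely to illustrate the calculation.

```python
# Sketch of code-order verification: given discretization error norms against an
# exact solution (e.g., Couette-Poiseuille flow or an Euler-Bernoulli beam) on
# refined grids, compute the observed order of accuracy. Error values are made up.
import math

h = [0.04, 0.02, 0.01]            # grid spacings (refinement ratio r = 2)
err = [3.9e-3, 1.0e-3, 2.6e-4]    # illustrative error norms vs. exact solution

for (h1, e1), (h2, e2) in zip(zip(h, err), zip(h[1:], err[1:])):
    p = math.log(e1 / e2) / math.log(h1 / h2)
    print(f"observed order between h={h1} and h={h2}: p ~ {p:.2f}")
# values near 2 are consistent with a formally second-order scheme
```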
NASA Technical Reports Server (NTRS)
Malekpour, Mahyar R.
2007-01-01
This report presents the mechanical verification of a simplified model of a rapid Byzantine-fault-tolerant self-stabilizing protocol for distributed clock synchronization systems. This protocol does not rely on any assumptions about the initial state of the system. This protocol tolerates bursts of transient failures, and deterministically converges within a time bound that is a linear function of the self-stabilization period. A simplified model of the protocol is verified using the Symbolic Model Verifier (SMV) [SMV]. The system under study consists of 4 nodes, where at most one of the nodes is assumed to be Byzantine faulty. The model checking effort is focused on verifying correctness of the simplified model of the protocol in the presence of a permanent Byzantine fault as well as confirmation of claims of determinism and linear convergence with respect to the self-stabilization period. Although model checking results of the simplified model of the protocol confirm the theoretical predictions, these results do not necessarily confirm that the protocol solves the general case of this problem. Modeling challenges of the protocol and the system are addressed. A number of abstractions are utilized in order to reduce the state space. Also, additional innovative state space reduction techniques are introduced that can be used in future verification efforts applied to this and other protocols.
7 CFR 1980.398 - Unauthorized assistance and other deficiencies.
Code of Federal Regulations, 2011 CFR
2011-01-01
... proper change orders. (3) Fraud or misrepresentation. A deficiency that involves an action by the Lender... debarment. Examples of this type of deficiency include falsified Verifications of Employment, false...
DOE R&D Accomplishments Database
Lederman, L. M.
1963-01-09
The prediction and verification of the neutrino are reviewed, together with the V-A theory for its interactions (particularly the difficulties with the apparent existence of two neutrinos and the high-energy cross section). The Brookhaven experiment confirming the existence of two neutrinos and the increase of the cross section with momentum is then described, and future neutrino experiments are considered. (D.C.W.)
The Second NASA Formal Methods Workshop 1992
NASA Technical Reports Server (NTRS)
Johnson, Sally C. (Compiler); Holloway, C. Michael (Compiler); Butler, Ricky W. (Compiler)
1992-01-01
The primary goal of the workshop was to bring together formal methods researchers and aerospace industry engineers to investigate new opportunities for applying formal methods to aerospace problems. The first part of the workshop was tutorial in nature. The second part of the workshop explored the potential of formal methods to address current aerospace design and verification problems. The third part of the workshop involved on-line demonstrations of state-of-the-art formal verification tools. Also, a detailed survey was filled in by the attendees; the results of the survey are compiled.
[Research progress on enlargement of medicinal resources of Paridis Rhizome].
Cheng, Li; Zhen, Yan; Chen, Min; Huang, Lu-qi
2015-08-01
Currently, as an important raw material for Chinese traditional patent medicines, Paridis Rhizome is in great demand, which has driven up its price. In order to protect the wild resources and satisfy the market demand for Paridis Rhizome, research has been conducted in various directions, involving its chemical composition, pharmacological action, clinical application, resource investigation, artificial cultivation, etc. Herein, the chemical studies of the genus Paris, the aerial parts of Paridis Rhizome, gummy and starchy Paridis Rhizome, and the endophytes in Paridis Rhizome are reviewed and analyzed in order to explore substitutes for Paridis Rhizome and provide a reference for the enlargement of its medicinal resources. These studies indicate that steroidal saponins, the important chemical constituents of Paridis Rhizome, have been detected in the genus Paris, the aerial parts of Paridis Rhizome, gummy Paridis Rhizome, and the endophytes in Paridis Rhizome. However, further experimental studies and clinical verification work should be carried out to confirm the final substitute.
Kobayashi, Hiroki; Harada, Hiroko; Nakamura, Masaomi; Futamura, Yushi; Ito, Akihiro; Yoshida, Minoru; Iemura, Shun-Ichiro; Shin-Ya, Kazuo; Doi, Takayuki; Takahashi, Takashi; Natsume, Tohru; Imoto, Masaya; Sakakibara, Yasubumi
2012-04-05
Identification of the target proteins of bioactive compounds is critical for elucidating the mode of action; however, target identification has been difficult in general, mostly due to the low sensitivity of detection using affinity chromatography followed by CBB staining and MS/MS analysis. We applied our protocol of predicting target proteins, combining in silico screening and experimental verification, to incednine, which inhibits the anti-apoptotic function of Bcl-xL by an unknown mechanism. One hundred eighty-two target protein candidates were computationally predicted to bind to incednine by the statistical prediction method, and the predictions were verified by in vitro binding of incednine to seven proteins, whose expression can be confirmed in our cell system. As a result, 40% accuracy of the computational predictions was achieved, and we newly identified three incednine-binding proteins. This study revealed that our proposed protocol of predicting target proteins, combining in silico screening and experimental verification, is useful, and it provides new insight into a strategy for identifying target proteins of small molecules.
Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey
2010-09-01
Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification. This involves the analysis of residues on soiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominantly employed analytical technique. The aim of this study was to develop a portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS). SERS was conducted using a portable Raman spectrometer and a commercially available SERS substrate to develop a rapid and portable cleaning verification system for nelarabine. Samples of standard solutions and swab extracts were deposited onto the SERS active surfaces, allowed to dry, and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable. It is possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS active surface. Understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.
Towards the Verification of Human-Robot Teams
NASA Technical Reports Server (NTRS)
Fisher, Michael; Pearce, Edward; Wooldridge, Mike; Sierhuis, Maarten; Visser, Willem; Bordini, Rafael H.
2005-01-01
Human-Agent collaboration is increasingly important. Not only do high-profile activities such as NASA missions to Mars intend to employ such teams, but our everyday activities involving interaction with computational devices fall into this category. In many of these scenarios, we are expected to trust that the agents will do what we expect and that the agents and humans will work together as expected. But how can we be sure? In this paper, we bring together previous work on the verification of multi-agent systems with work on the modelling of human-agent teamwork. Specifically, we target human-robot teamwork. This paper provides an outline of the way we are using formal verification techniques in order to analyse such collaborative activities. A particular application is the analysis of human-robot teams intended for use in future space exploration.
40 CFR 82.40 - Technician training and certification.
Code of Federal Regulations, 2010 CFR
2010-07-01
... address in § 82.38(a) verification that the program meets all of the following standards: (1) Training... training, training through self-study of instructional material, or on-site training involving instructors...
VINCI: the VLT Interferometer commissioning instrument
NASA Astrophysics Data System (ADS)
Kervella, Pierre; Coudé du Foresto, Vincent; Glindemann, Andreas; Hofmann, Reiner
2000-07-01
The Very Large Telescope Interferometer (VLTI) is a complex system, made of a large number of separated elements. To prepare for an early, successful operation, it will require a period of extensive testing and verification to ensure that the many devices involved work properly together and can produce meaningful data. This paper describes the concept chosen for the VLTI commissioning instrument, LEONARDO da VINCI, and details its functionalities. It is a fiber-based two-way beam combiner, associated with an artificial star and an alignment verification unit. The technical commissioning of the VLTI is foreseen as a stepwise process: fringes will first be obtained with the commissioning instrument in an autonomous mode (no other parts of the VLTI involved); then the VLTI telescopes and optical trains will be tested in autocollimation; finally, fringes will be observed on the sky.
Using Automation to Improve the Flight Software Testing Process
NASA Technical Reports Server (NTRS)
ODonnell, James R., Jr.; Andrews, Stephen F.; Morgenstern, Wendy M.; Bartholomew, Maureen O.; McComas, David C.; Bauer, Frank H. (Technical Monitor)
2001-01-01
One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, attitude control, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on previous missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the perceived benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.
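A minimal sketch of the comparison step in such automated test verification is shown below: simulation truth data are interpolated onto the flight-software test timeline and each channel is checked against a tolerance. The channel names, tolerances, and data are placeholders, not the MAP or HiFi toolchain.

```python
# Illustrative sketch of automated test verification: align flight-software test
# telemetry with simulation truth data and flag channels whose difference exceeds
# a tolerance. Names, tolerances, and data are invented stand-ins.
import numpy as np

tolerances = {"wheel_speed_rpm": 5.0, "body_rate_x_dps": 0.01}

def verify_channel(t_test, y_test, t_sim, y_sim, tol):
    y_sim_on_test = np.interp(t_test, t_sim, y_sim)   # synchronize the two records
    worst = np.max(np.abs(y_test - y_sim_on_test))
    return worst <= tol, worst

# synthetic stand-in data just to exercise the comparison
t = np.linspace(0, 60, 601)
results = {}
for name, tol in tolerances.items():
    truth = np.sin(t / 10.0)
    measured = truth + np.random.default_rng(2).normal(scale=0.001, size=t.size)
    results[name] = verify_channel(t, measured, t, truth, tol)

for name, (ok, worst) in results.items():
    print(f"{name}: {'PASS' if ok else 'FAIL'} (max |diff| = {worst:.4g})")
```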
Using Automation to Improve the Flight Software Testing Process
NASA Technical Reports Server (NTRS)
ODonnell, James R., Jr.; Morgenstern, Wendy M.; Bartholomew, Maureen O.
2001-01-01
One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, knowledge of attitude control and attitude control hardware, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on other missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.
Neuropathological Assessment as an Endpoint in Clinical Trial Design.
Gentleman, Steve; Liu, Alan King Lun
2018-01-01
Different neurodegenerative conditions can have complex, overlapping clinical presentations that make accurate diagnosis during life very challenging. For this reason, confirmation of the clinical diagnosis still requires postmortem verification. This is particularly relevant for clinical trials of novel therapeutics where it is important to ascertain what disease and/or pathology modifying effects the therapeutics have had. Furthermore, it is important to confirm that patients in the trial actually had the correct clinical diagnosis as this will have a major bearing on the interpretation of trial results. Here we present a simple protocol for pathological assessment of neurodegenerative changes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bunch, Kyle J.; Williams, Laura S.; Jones, Anthony M.
The 2010 ratification of the New START Treaty has been widely regarded as a noteworthy national security achievement for both the Obama administration and the Medvedev-Putin regime, but deeper cuts are envisioned under future arms control regimes. Future verification needs will include monitoring the storage of warhead components and fissile materials and verifying dismantlement of warheads, pits, secondaries, and other materials. From both the diplomatic and technical perspectives, verification under future arms control regimes will pose new challenges. Since acceptable verification technology must protect sensitive design information and attributes, non-nuclear non-sensitive signatures may provide a significant verification tool without the use of additional information barriers. The use of electromagnetic signatures to monitor nuclear material storage containers is a promising technology with the potential to fulfill these challenging requirements. Research performed at Pacific Northwest National Laboratory (PNNL) has demonstrated that low frequency electromagnetic signatures of sealed metallic containers can be used to confirm the presence of specific components on a “yes/no” basis without revealing classified information. Arms control inspectors might use this technique to verify the presence or absence of monitored items, including both nuclear and non-nuclear materials. Although additional research is needed to study signature aspects such as uniqueness and investigate container-specific scenarios, the technique potentially offers a rapid and cost-effective tool to verify reduction and dismantlement of U.S. and Russian nuclear weapons.
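Purely as an illustration of a "yes/no" comparison against a stored signature (and not the PNNL measurement technique itself), the sketch below matches a measured low-frequency response to a reference template with a normalized correlation and a fixed threshold; the frequency band, signatures, and threshold are invented.

```python
# Heavily simplified "yes/no" signature comparison: a measured low-frequency
# response is correlated with a stored reference template and accepted if the
# normalized correlation exceeds a fixed threshold. Everything here is invented.
import numpy as np

rng = np.random.default_rng(3)
freqs = np.linspace(10, 1000, 200)                 # Hz, illustrative band
reference = np.exp(-((freqs - 300) / 80.0) ** 2)   # stored template signature

def matches(measured, reference, threshold=0.95):
    a = (measured - measured.mean()) / measured.std()
    b = (reference - reference.mean()) / reference.std()
    return float(np.dot(a, b) / a.size) >= threshold

same_item = reference + rng.normal(scale=0.02, size=freqs.size)
other_item = np.exp(-((freqs - 600) / 120.0) ** 2)
print(matches(same_item, reference), matches(other_item, reference))  # True False
```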
Nejo, Takahide; Oya, Soichi; Tsukasa, Tsuchiya; Yamaguchi, Naomi; Matsui, Toru
2016-12-01
Several bedside approaches used in combination with thoracoabdominal X-ray are widely used to avoid severe complications that have been reported during nasogastric tube management. Although confirmation by X-ray is considered the gold standard, it is not yet perfect. We present 2 cases of rare complications in which the routine verification methods could not detect all the complications related to the nasogastric tube placement. Case 1 was a 17-year-old male who presented with a brain tumor and repeatedly required nasogastric tube placement. Despite normal auscultatory and X-ray findings, the patient's condition deteriorated rapidly after resuming the enteral nutrition (EN). Computed tomography images showed the presence of hepatic portal venous gas (HPVG). Urgent upper gastrointestinal endoscopy showed esophagogastric submucosal tunneling of the tube that required an emergency open total gastrectomy. Case 2 was a 76-year-old man with long-term EN after stroke. While the last auscultatory verification was normal, he suddenly developed extensive HPVG due to gastric mucosal injury following EN, which resulted in progressive intestinal necrosis, general peritonitis, and death. These 2 cases indicated that routine verification methods consisting of auscultation and X-ray may not be completely reliable, and the awareness of the limitations of these methods should be reaffirmed because expeditious examinations and necessary interventions are critical in preventing life-threatening complications.
NASA Astrophysics Data System (ADS)
Villiger, Arturo; Schaer, Stefan; Dach, Rolf; Prange, Lars; Jäggi, Adrian
2017-04-01
It is common to handle code biases in the Global Navigation Satellite System (GNSS) data analysis as conventional differential code biases (DCBs): P1-C1, P1-P2, and P2-C2. Due to the increasing number of signals and systems in conjunction with various tracking modes for the different signals (as defined in RINEX3 format), the number of DCBs would increase drastically and the bookkeeping becomes almost unbearable. The Center for Orbit Determination in Europe (CODE) has thus changed its processing scheme to observable-specific signal biases (OSB). This means that for each observation involved all related satellite and receiver biases are considered. The OSB contributions from various ionosphere analyses (geometry-free linear combination) using different observables and frequencies and from clock analyses (ionosphere-free linear combination) are then combined on normal equation level. By this, one consistent set of OSB values per satellite and receiver can be obtained that contains all information needed for GNSS-related processing. This advanced procedure of code bias handling is now also applied to the IGS (International GNSS Service) MGEX (Multi-GNSS Experiment) procedure at CODE. Results for the biases from the legacy IGS solution as well as the CODE MGEX processing (considering GPS, GLONASS, Galileo, BeiDou, and QZSS) are presented. The consistency with the traditional method is confirmed and the new results are discussed regarding the long-term stability. When processing code data, it is essential to know the true observable types in order to correct for the associated biases. CODE has been verifying the receiver tracking technologies for GPS based on estimated DCB multipliers (for the RINEX 2 case). With the change to OSB, the original verification approach was extended to search for the best fitting observable types based on known OSB values. In essence, a multiplier parameter is estimated for each involved GNSS observable type. This implies that we could recover, for receivers tracking a combination of signals, even the factors of these combinations. The verification of the observable types is crucial to identify the correct observable types of RINEX 2 data (which does not contain the signal modulation in comparison to RINEX 3). The correct information of the used observable types is essential for precise point positioning (PPP) applications and GNSS ambiguity resolution. Multi-GNSS OSBs and verified receiver tracking modes are essential to get best possible multi-GNSS solutions for geodynamic purposes and other applications.
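The multiplier idea can be illustrated with a small synthetic example: for each candidate observable type, a known per-satellite bias pattern is scaled to best fit a receiver's code residuals, and the candidate whose fitted multiplier is closest to +1 is retained. The observable codes, bias values, and residuals below are invented, and the real estimation is embedded in a full GNSS least-squares adjustment rather than this stand-alone fit.

```python
# Sketch of verifying an unknown tracking mode by fitting a multiplier to known
# bias patterns. Bias values and residuals are synthetic, for illustration only.
import numpy as np

rng = np.random.default_rng(4)
n_sats = 30
known_osb = {                                # candidate per-satellite bias patterns (ns)
    "C1C": rng.normal(0.0, 2.0, n_sats),
    "C1W": rng.normal(0.0, 2.0, n_sats),
}
true_type = "C1W"
residuals = known_osb[true_type] + rng.normal(0.0, 0.3, n_sats)   # simulated code residuals

def fitted_multiplier(pattern, residuals):
    # least-squares scale factor between the candidate bias pattern and residuals
    return float(np.dot(pattern, residuals) / np.dot(pattern, pattern))

for obs_type, pattern in known_osb.items():
    print(f"{obs_type}: multiplier ~ {fitted_multiplier(pattern, residuals):+.2f}")
# the candidate with a multiplier near +1 is taken as the receiver's tracking type
```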
Initial verification and validation of RAZORBACK - A research reactor transient analysis code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Talley, Darren G.
2015-09-01
This report describes the work and results of the initial verification and validation (V&V) of the beta release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This initial V&V effort was intended to confirm that the code work to date shows good agreement between simulation and actual ACRR operations, indicating that the subsequent V&V effort for the official release of the code will be successful.
1966-08-01
AS-202, the second Saturn IB launch vehicle developed by the Marshall Space Flight Center, lifts off from Cape Canaveral, Florida, August 25, 1966. Primary mission objectives included the confirmation of projected launch loads, demonstration of spacecraft component separation, and verification of heat shield adequacy at high reentry rates. In all, nine Saturn IB flights were made, ending with the Apollo-Soyuz Test Project (ASTP) in July 1975.
The F1000Research: Ebola article collection
Piot, Peter
2014-01-01
The explosion of information about Ebola requires rapid publication, transparent verification and unrestricted access. I urge everyone involved in all aspects of the Ebola epidemic to openly and rapidly report their experiences and findings. PMID:25580233
Code of Federal Regulations, 2013 CFR
2013-10-01
... 49 Transportation 1 2013-10-01 2013-10-01 false On what basis does the MRO verify test results involving adulteration or substitution? 40.145 Section 40.145 Transportation Office of the Secretary of Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Medical Review Officers and the Verification Process §...
Predicted and tested performance of durable TPS
NASA Technical Reports Server (NTRS)
Shideler, John L.
1992-01-01
The development of thermal protection systems (TPS) for aerospace vehicles involves combining material selection, concept design, and verification tests to evaluate the effectiveness of the system. The present paper reviews verification tests of two metallic and one carbon-carbon thermal protection system. The test conditions are, in general, representative of Space Shuttle design flight conditions which may be more or less severe than conditions required for future space transportation systems. The results of this study are intended to help establish a preliminary data base from which the designers of future entry vehicles can evaluate the applicability of future concepts to their vehicles.
NASA Technical Reports Server (NTRS)
Salazar, Giovanni; Droba, Justin C.; Oliver, Brandon; Amar, Adam J.
2016-01-01
With the recent development of multi-dimensional thermal protection system (TPS) material response codes, including the capability to account for radiative heating has become a requirement. This paper presents the recent efforts to implement such capabilities in the CHarring Ablator Response (CHAR) code developed at NASA's Johnson Space Center. This work also describes the different numerical methods implemented in the code to compute view factors for radiation problems involving multiple surfaces. Furthermore, verification and validation of the code's radiation capabilities are demonstrated by comparing solutions to analytical results, to other codes, and to radiant test data.
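To make the view-factor computation concrete, the sketch below estimates a diffuse view factor between two finite surfaces by Monte Carlo sampling of the double-area integral; the geometry (two directly opposed parallel unit squares) is an illustrative assumption, not one of the CHAR verification cases.

```python
# Monte Carlo estimate of a diffuse view factor between two finite surfaces,
# the kind of quantity needed for multi-surface radiation problems. Geometry:
# two parallel, directly opposed unit squares separated by a unit gap.
import numpy as np

rng = np.random.default_rng(5)
n, gap = 200_000, 1.0
a2 = 1.0                                     # area of surface 2 (unit square)

p1 = np.column_stack([rng.random(n), rng.random(n), np.zeros(n)])       # points on surface 1
p2 = np.column_stack([rng.random(n), rng.random(n), np.full(n, gap)])   # points on surface 2

d = p2 - p1
s2 = np.sum(d * d, axis=1)                   # squared separation of each point pair
cos1 = d[:, 2] / np.sqrt(s2)                 # both surface normals lie along +/- z
cos2 = cos1
f12 = a2 * np.mean(cos1 * cos2 / (np.pi * s2))   # F_1->2 = A2 * <cos1*cos2/(pi*S^2)>
print(f"F(1->2) ~ {f12:.3f}")                # ~0.2 for this geometry
```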
NASA Astrophysics Data System (ADS)
Graham, Michelle; Gray, David
As wireless networks become increasingly ubiquitous, the demand for a method of locating a device has increased dramatically. Location Based Services are now commonplace but there are few methods of verifying or guaranteeing a location provided by a user without some specialised hardware, especially in larger scale networks. We propose a system for the verification of location claims, using proof gathered from neighbouring devices. In this paper we introduce a protocol to protect this proof gathering process, protecting the privacy of all involved parties and securing it from intruders and malicious claiming devices. We present the protocol in stages, extending the security of this protocol to allow for flexibility within its application. The Secure Location Verification Proof Gathering Protocol (SLVPGP) has been designed to function within the area of Vehicular Networks, although its application could be extended to any device with wireless & cryptographic capabilities.
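A toy sketch of the proof-gathering idea (with HMAC standing in for the cryptography, and invented message fields, keys, and trust assumptions rather than the actual SLVPGP messages) is shown below: each neighbouring device endorses the claim with a keyed tag that the verifier can later check.

```python
# Toy sketch of gathering signed "proof" of a location claim from neighbouring
# devices. HMAC stands in for the real cryptography; fields and keys are invented.
import hmac, hashlib, json, time

neighbour_keys = {"nbr-1": b"key-one", "nbr-2": b"key-two"}   # shared with the verifier

def endorse(neighbour_id, claimer_id, claimed_location):
    payload = json.dumps({"nbr": neighbour_id, "claimer": claimer_id,
                          "loc": claimed_location, "ts": int(time.time())},
                         sort_keys=True).encode()
    tag = hmac.new(neighbour_keys[neighbour_id], payload, hashlib.sha256).hexdigest()
    return payload, tag

def verify(payload, tag):
    neighbour_id = json.loads(payload)["nbr"]
    expected = hmac.new(neighbour_keys[neighbour_id], payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

proofs = [endorse(n, "vehicle-42", [53.35, -6.26]) for n in neighbour_keys]
print(all(verify(p, t) for p, t in proofs))   # verifier accepts the claim's proofs
```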
Space Weather Models and Their Validation and Verification at the CCMC
NASA Technical Reports Server (NTRS)
Hesse, Michael
2010-01-01
The Community Coordinated Modeling Center (CCMC) is a US multi-agency activity with a dual mission. With equal emphasis, CCMC strives to provide science support to the international space research community through the execution of advanced space plasma simulations, and it endeavors to support the space weather needs of the US and partners. Space weather support involves a broad spectrum, from designing robust forecasting systems and transitioning them to forecasters, to providing space weather updates and forecasts to NASA's robotic mission operators. All of these activities have to rely on validation and verification of models and their products, so users and forecasters have the means to assign confidence levels to the space weather information. In this presentation, we provide an overview of space weather models resident at CCMC, as well as of validation and verification activities undertaken at CCMC or through the use of CCMC services.
Stratway: A Modular Approach to Strategic Conflict Resolution
NASA Technical Reports Server (NTRS)
Hagen, George E.; Butler, Ricky W.; Maddalon, Jeffrey M.
2011-01-01
In this paper we introduce Stratway, a modular approach to finding long-term strategic resolutions to conflicts between aircraft. The modular approach provides both advantages and disadvantages. Our primary concern is to investigate the implications for the verification of safety-critical properties of a strategic resolution algorithm. By partitioning the problem into verifiable modules, much stronger verification claims can be established. Since strategic resolution involves searching for solutions over an enormous state space, Stratway, like most similar algorithms, searches these spaces by applying heuristics, which present especially difficult verification challenges. An advantage of a modular approach is that it makes a clear distinction between the resolution function and the trajectory generation function. This allows the resolution computation to be independent of any particular vehicle. The Stratway algorithm was developed in both Java and C++ and is available through an open source license. Additionally, there is a visualization application that is helpful when analyzing and quickly creating conflict scenarios.
A new technique for measuring listening and reading literacy in developing countries
NASA Astrophysics Data System (ADS)
Greene, Barbara A.; Royer, James M.; Anzalone, Stephen
1990-03-01
One problem in evaluating educational interventions in developing countries is the absence of tests that adequately reflect the culture and curriculum. The Sentence Verification Technique is a new procedure for measuring reading and listening comprehension that allows for the development of tests based on materials indigenous to a given culture. The validity of using the Sentence Verification Technique to measure reading comprehension in Grenada was evaluated in the present study. The study involved 786 students at standards 3, 4 and 5. The tests for each standard consisted of passages that varied in difficulty. The students identified as high ability students in all three standards performed better than those identified as low ability. All students performed better with easier passages. Additionally, students in higher standards performed better than students in lower standards on a given passage. These results supported the claim that the Sentence Verification Technique is a valid measure of reading comprehension in Grenada.
Pfeiffer, P; Bache, S; Isbye, D L; Rudolph, S S; Rovsing, L; Børglum, J
2012-05-01
Ultrasound (US) may have an emerging role as an adjunct in verification of endotracheal intubation. Obtaining optimal US images in obese patients is generally regarded as more difficult than for other patients. This study compared the time consumption of bilateral lung US with auscultation and capnography for verifying endotracheal intubation in obese patients. This was a prospective, paired and investigator-blinded study performed in the operating theatre. Twenty-four adult patients requiring endotracheal intubation for bariatric surgery were included. During post-intubation bag ventilation, bilateral lung US was performed for detection of lung sliding indicating lung ventilation, simultaneously with capnography and auscultation of the epigastrium and chest. The primary outcome measure was the time difference to confirmed endotracheal intubation between US and auscultation alone. The secondary outcome measure was the time difference between US and auscultation combined with capnography. Both methods verified endotracheal tube placement in all patients. No significant difference was found between US and auscultation alone. Median time for verification by auscultation alone was 47.5 s [interquartile range (IQR) 40-51 s], with a mean difference of -0.3 s in favor of US (95% confidence interval -3.5 to 2.9 s), P = 0.87. Comparing US with the combination of auscultation and capnography, there was a significant difference between the two methods. Median time for verification by US was 43 s (IQR 40-51 s) vs. 55 s (IQR 46-65 s), P < 0.0001. In obese patients, verification of endotracheal tube placement with US is as fast as auscultation alone and faster than the standard method of auscultation and capnography. © 2012 The Authors. Acta Anaesthesiologica Scandinavica © 2012 The Acta Anaesthesiologica Scandinavica Foundation.
Verification of a Constraint Force Equation Methodology for Modeling Multi-Body Stage Separation
NASA Technical Reports Server (NTRS)
Tartabini, Paul V.; Roithmayr, Carlos; Toniolo, Matthew D.; Karlgaard, Christopher; Pamadi, Bandu N.
2008-01-01
This paper discusses the verification of the Constraint Force Equation (CFE) methodology and its implementation in the Program to Optimize Simulated Trajectories II (POST2) for multibody separation problems using three specially designed test cases. The first test case involves two rigid bodies connected by a fixed joint; the second case involves two rigid bodies connected with a universal joint; and the third test case is that of Mach 7 separation of the Hyper-X vehicle. For the first two cases, the POST2/CFE solutions compared well with those obtained using industry standard benchmark codes, namely AUTOLEV and ADAMS. For the Hyper-X case, the POST2/CFE solutions were in reasonable agreement with the flight test data. The CFE implementation in POST2 facilitates the analysis and simulation of stage separation as an integral part of POST2 for seamless end-to-end simulations of launch vehicle trajectories.
Influenza forecasting with Google Flu Trends.
Dugas, Andrea Freyer; Jalalpour, Mehdi; Gel, Yulia; Levin, Scott; Torcaso, Fred; Igusa, Takeru; Rothman, Richard E
2013-01-01
We developed a practical influenza forecast model based on real-time, geographically focused, and easy-to-access data, designed to provide individual medical centers with advanced warning of the expected number of influenza cases, thus allowing for sufficient time to implement interventions. Second, we evaluated the effects of incorporating a real-time influenza surveillance system, Google Flu Trends, and meteorological and temporal information on forecast accuracy. Forecast models designed to predict one week in advance were developed from weekly counts of confirmed influenza cases over seven seasons (2004-2011) divided into seven training and out-of-sample verification sets. Forecasting procedures using classical Box-Jenkins, generalized linear models (GLM), and generalized linear autoregressive moving average (GARMA) methods were employed to develop the final model and assess the relative contribution of external variables such as Google Flu Trends, meteorological data, and temporal information. A GARMA(3,0) forecast model with Negative Binomial distribution integrating Google Flu Trends information provided the most accurate influenza case predictions. The model, on average, predicts weekly influenza cases during 7 out-of-sample outbreaks within 7 cases for 83% of estimates. Google Flu Trends data was the only source of external information to provide statistically significant forecast improvements over the base model in four of the seven out-of-sample verification sets. Overall, the p-value of adding this external information to the model is 0.0005. The other exogenous variables did not yield a statistically significant improvement in any of the verification sets. Integer-valued autoregression of influenza cases provides a strong base forecast model, which is enhanced by the addition of Google Flu Trends, confirming the predictive capabilities of search-query-based syndromic surveillance. This accessible and flexible forecast model can be used by individual medical centers to provide advanced warning of future influenza cases.
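As an illustration of the modeling approach described above, the sketch below fits a negative binomial count regression with three autoregressive lag terms and a Google Flu Trends covariate. It is a simplified stand-in for the paper's GARMA(3,0) model, not a reproduction of it; the synthetic case counts, the fabricated GFT signal, and all variable names are illustrative assumptions.

```python
# Minimal sketch: negative binomial regression with three autoregressive lags
# plus a Google Flu Trends covariate. Data and variable names are illustrative;
# this approximates, rather than reproduces, a GARMA(3,0) model.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
weeks = 200
cases = pd.Series(rng.poisson(20, weeks)).rolling(3, min_periods=1).mean().round()
gft = cases.shift(1).fillna(cases.mean()) + rng.normal(0, 2, weeks)  # hypothetical GFT signal

df = pd.DataFrame({"cases": cases, "gft": gft})
for lag in (1, 2, 3):                        # three autoregressive terms
    df[f"lag{lag}"] = df["cases"].shift(lag)
df = df.dropna()

X = sm.add_constant(df[["lag1", "lag2", "lag3", "gft"]])
model = sm.GLM(df["cases"], X, family=sm.families.NegativeBinomial()).fit()
forecast = model.predict(X.iloc[[-1]])       # one-step prediction for the last week
print(model.summary().tables[1])
print("one-step prediction:", float(np.asarray(forecast)[0]))
```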
Proton Therapy Dose Characterization and Verification
2016-10-01
...than recommended, as these patients are on a separate UPENN research study where the maximum accepted dose was 6700 cGy. ... Research Protection Office. 8.0 Data Handling and Record Keeping: All patients must have a signed Informed Consent Form and an on-study confirmation... Phase 1 of this award concentrated on designing and building a multi-leaf collimator for use in proton therapy. Phase 2 focused on studying the...
Kalkhoff, Will; Marcussen, Kristen; Serpe, Richard T
2016-07-01
After many years of research across disciplines, it remains unclear whether people are more motivated to seek appraisals that accurately match self-views (self-verification) or are as favorable as possible (self-enhancement). Within sociology, mixed findings in identity theory have fueled the debate. A problem here is that a commonly employed statistical approach does not take into account the direction of a discrepancy between how we see ourselves and how we think others see us in terms of a given identity, yet doing so is critical for determining which self-motive is at play. We offer a test of three competing models of identity processes, including a new "mixed motivations" model where self-verification and self-enhancement operate simultaneously. We compare the models using the conventional statistical approach versus response surface analysis. The latter method allows us to determine whether identity discrepancies involving over-evaluation are as distressing as those involving under-evaluation. We use nationally representative data and compare results across four different identities and multiple outcomes. The two statistical approaches lead to the same conclusions more often than not and mostly support identity theory and its assumption that people seek self-verification. However, response surface tests reveal patterns that are mistaken as evidence of self-verification by conventional procedures, especially for the spouse identity. We also find that identity discrepancies have different effects on distress and self-conscious emotions (guilt and shame). Our findings have implications not only for research on self and identity across disciplines, but also for many other areas of research that incorporate these concepts and/or use difference scores as explanatory variables. Copyright © 2016 Elsevier Inc. All rights reserved.
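The response surface approach mentioned above can be sketched as a quadratic polynomial regression of distress on self-view and reflected appraisal, with the slope and curvature along the incongruence line distinguishing over-evaluation from under-evaluation. The toy data, variable names, and the simple squared-discrepancy outcome below are illustrative assumptions, not the study's measures.

```python
# Minimal response-surface sketch: regress distress on self-view (s), reflected
# appraisal (o), and their quadratic/interaction terms, then inspect the surface
# along the line of incongruence (o = -s) to compare over- vs. under-evaluation.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
s = rng.normal(0, 1, n)                      # how respondents see themselves
o = s + rng.normal(0, 1, n)                  # how they think others see them
distress = (o - s) ** 2 + rng.normal(0, 0.5, n)   # toy self-verification pattern

X = sm.add_constant(np.column_stack([s, o, s**2, s*o, o**2]))
fit = sm.OLS(distress, X).fit()
b0, b1, b2, b3, b4, b5 = fit.params

# Along o = -s the surface reduces to a curve in s; asymmetry between s > 0
# (under-evaluation) and s < 0 (over-evaluation) appears in the linear slope.
slope_incongruence = b1 - b2                 # linear term along the incongruence line
curvature_incongruence = b3 - b4 + b5        # curvature along the incongruence line
print(fit.params, slope_incongruence, curvature_incongruence)
```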
Verifying a Computer Algorithm Mathematically.
ERIC Educational Resources Information Center
Olson, Alton T.
1986-01-01
Presents an example of mathematics from an algorithmic point of view, with emphasis on the design and verification of this algorithm. The program involves finding roots for algebraic equations using the half-interval search algorithm. The program listing is included. (JN)
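For reference, a minimal half-interval (bisection) search looks like the sketch below; the example function, interval, and tolerance are illustrative and are not taken from the article's program listing.

```python
# Half-interval (bisection) search for a root of f on [a, b], assuming f(a) and
# f(b) have opposite signs; a generic sketch of the algorithm discussed above.
def half_interval_search(f, a, b, tol=1e-10, max_iter=200):
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(max_iter):
        mid = (a + b) / 2.0
        fmid = f(mid)
        if abs(fmid) < tol or (b - a) / 2.0 < tol:
            return mid
        if fa * fmid < 0:        # root lies in the left half
            b, fb = mid, fmid
        else:                    # root lies in the right half
            a, fa = mid, fmid
    return (a + b) / 2.0

# Example: root of x^3 - x - 2 on [1, 2] (approximately 1.5213797)
print(half_interval_search(lambda x: x**3 - x - 2, 1.0, 2.0))
```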
NASA Technical Reports Server (NTRS)
Miner, Paul S.
1993-01-01
A critical function in a fault-tolerant computer architecture is the synchronization of the redundant computing elements. The synchronization algorithm must include safeguards to ensure that failed components do not corrupt the behavior of good clocks. Reasoning about fault-tolerant clock synchronization is difficult because of the possibility of subtle interactions involving failed components. Therefore, mechanical proof systems are used to ensure that the verification of the synchronization system is correct. In 1987, Schneider presented a general proof of correctness for several fault-tolerant clock synchronization algorithms. Subsequently, Shankar verified Schneider's proof by using the mechanical proof system EHDM. This proof ensures that any system satisfying its underlying assumptions will provide Byzantine fault-tolerant clock synchronization. The utility of Shankar's mechanization of Schneider's theory for the verification of clock synchronization systems is explored. Some limitations of Shankar's mechanically verified theory were encountered. With minor modifications to the theory, a mechanically checked proof is provided that removes these limitations. The revised theory also allows for proven recovery from transient faults. Use of the revised theory is illustrated with the verification of an abstract design of a clock synchronization system.
NASA Astrophysics Data System (ADS)
Lukyanov, A. D.; Alekseev, V. V.; Bogomolov, Yu V.; Dunaeva, O. A.; Malakhov, V. V.; Mayorov, A. G.; Rodenko, S. A.
2017-01-01
Analysis of the experimental data on primary positron and antiproton fluxes obtained by the PAMELA spectrometer, recently confirmed by the AMS-02 spectrometer, is of considerable interest for the scientific community, especially at energies above 100 GV, where a signal from dark matter particles could appear. In this work we present a method for verification of the charge sign of high-energy antiprotons measured by the magnetic tracking system of the PAMELA spectrometer, which can be imitated by protons due to scattering or finite instrumental resolution at high energies (so-called "spillover"). We base our approach on developing a set of distinctive features represented by differently computed rigidities and training an AdaBoost classifier, which achieves a good classification accuracy of 98% on Monte Carlo simulation data for rigidities up to 600 GV.
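A minimal sketch of the classification step described above, assuming a small set of rigidity-based features and synthetic labels; the feature values, class separation, and sample sizes are invented for illustration, whereas the actual analysis uses quantities reconstructed by the PAMELA tracking system.

```python
# Train AdaBoost on rigidity-based features to separate true antiprotons from
# proton "spillover". All data here are synthetic and purely illustrative.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5000
# Hypothetical features: rigidities reconstructed by different fit methods.
antiprotons = rng.normal(loc=[-1.0, -1.0, -1.0], scale=0.3, size=(n, 3))
spillover   = rng.normal(loc=[ 0.2,  0.4,  0.1], scale=0.8, size=(n, 3))
X = np.vstack([antiprotons, spillover])
y = np.hstack([np.ones(n), np.zeros(n)])     # 1 = antiproton, 0 = proton spillover

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```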
Saeidian, Hamid; Babri, Mehran; Ramezani, Atefeh; Ashrafi, Davood; Sarabadani, Mansour; Naseri, Mohammad Taghi
2013-01-01
The electron ionization (EI) mass spectra of a series of O-alkyl O-2-(N,N-dialkylamino)ethyl alkylphosphonites (phosphonates), which are precursors of nerve agents, were studied for Chemical Weapons Convention (CWC) verification. General EI fragmentation pathways were constructed and discussed. Proposed fragment structures were confirmed by analyzing fragment ions of deuterated analogs and by density functional theory (DFT) calculations. The observed fragment ions arise from different fragmentation pathways such as hydrogen and McLafferty+1 rearrangements, and alkene, amine and alkoxy elimination by alpha- or beta-cleavage processes. The fragment ions allow unequivocal identification of the compounds of interest, including isomeric compounds. The presence and abundance of fragment ions were found to depend on the size and structure of the alkyl group attached to the nitrogen, phosphorus and oxygen atoms.
Self-verification in clinical depression: the desire for negative evaluation.
Giesler, R B; Josephs, R A; Swann, W B
1996-08-01
Do clinically depressed individuals seek favorable or unfavorable information about the self? Self-verification theory makes the counterintuitive prediction that depressed individuals solicit feedback that confirms their negative self-views. To test this prediction, participants were classified on the basis of a structured clinical interview and self-report measures into high self-esteem, low self-esteem, and depressed groups. All participants were offered a choice between receiving favorable or unfavorable feedback; 82% of the depressed participants chose the unfavorable feedback, compared to 64% of the low self-esteem participants and 25% of the high self-esteem participants. Additional evidence indicated that depressed individuals also failed to exploit fully an opportunity to acquire favorable evaluations that were self-verifying. The authors discuss how seeking negative evaluations and failing to seek favorable evaluations may help maintain depression.
NASA Technical Reports Server (NTRS)
Panek, Joseph W.
2001-01-01
The proper operation of the Electronically Scanned Pressure (ESP) System is critical to accomplishing the following goals: acquisition of highly accurate pressure data for the development of aerospace and commercial aviation systems, and continuous confirmation of data quality to avoid costly, unplanned, repeat wind tunnel or turbine testing. Standard automated setup and checkout routines are necessary to accomplish these goals. Data verification and integrity checks occur at three distinct stages: pretest pressure tubing and system checkouts, daily system validation, and in-test confirmation of critical system parameters. This paper will give an overview of the existing hardware, software and methods used to validate data integrity.
Control structural interaction testbed: A model for multiple flexible body verification
NASA Technical Reports Server (NTRS)
Chory, M. A.; Cohen, A. L.; Manning, R. A.; Narigon, M. L.; Spector, V. A.
1993-01-01
Conventional end-to-end ground tests for verification of control system performance become increasingly complicated with the development of large, multiple flexible body spacecraft structures. The expense of accurately reproducing the on-orbit dynamic environment and the attendant difficulties in reducing and accounting for ground test effects limits the value of these tests. TRW has developed a building block approach whereby a combination of analysis, simulation, and test has replaced end-to-end performance verification by ground test. Tests are performed at the component, subsystem, and system level on engineering testbeds. These tests are aimed at authenticating models to be used in end-to-end performance verification simulations: component and subassembly engineering tests and analyses establish models and critical parameters, unit level engineering and acceptance tests refine models, and subsystem level tests confirm the models' overall behavior. The Precision Control of Agile Spacecraft (PCAS) project has developed a control structural interaction testbed with a multibody flexible structure to investigate new methods of precision control. This testbed is a model for TRW's approach to verifying control system performance. This approach has several advantages: (1) no allocation for test measurement errors is required, increasing flight hardware design allocations; (2) the approach permits greater latitude in investigating off-nominal conditions and parametric sensitivities; and (3) the simulation approach is cost effective, because the investment is in understanding the root behavior of the flight hardware and not in the ground test equipment and environment.
NASA Formal Methods Workshop, 1990
NASA Technical Reports Server (NTRS)
Butler, Ricky W. (Compiler)
1990-01-01
The workshop brought together researchers involved in the NASA formal methods research effort for detailed technical interchange and provided a mechanism for interaction with representatives from the FAA and the aerospace industry. The workshop also included speakers from industry to debrief the formal methods researchers on the current state of practice in flight critical system design, verification, and certification. The goals were: define and characterize the verification problem for ultra-reliable life critical flight control systems and the current state of practice in industry today; determine the proper role of formal methods in addressing these problems, and assess the state of the art and recent progress toward applying formal methods to this area.
Design of the software development and verification system (SWDVS) for shuttle NASA study task 35
NASA Technical Reports Server (NTRS)
Drane, L. W.; Mccoy, B. J.; Silver, L. W.
1973-01-01
An overview of the Software Development and Verification System (SWDVS) for the space shuttle is presented. The design considerations, goals, assumptions, and major features of the design are examined. A scenario that shows three persons involved in flight software development using the SWDVS in response to a program change request is developed. The SWDVS is described from the standpoint of different groups of people with different responsibilities in the shuttle program to show the functional requirements that influenced the SWDVS design. The software elements of the SWDVS that satisfy the requirements of the different groups are identified.
The Use of Remote Sensing Satellites for Verification in International Law
NASA Astrophysics Data System (ADS)
Hettling, J. K.
The contribution addresses a very sensitive topic which is currently gaining significance and importance in the international community. It involves questions of international law as well as the consideration of new developments and decisions in international politics. The paper will begin with the meaning and current status of verification in international law as well as the legal basis of satellite remote sensing in international treaties and resolutions. For the verification part, this implies giving a definition of verification and naming its fields of application and the different means of verification. For the remote sensing part, it involves the identification of relevant provisions in the Outer Space Treaty and the United Nations General Assembly Principles on Remote Sensing. Furthermore, practical examples shall be considered: to what extent have remote sensing satellites been used to verify international obligations? Are there treaties which would considerably profit from the use of remote sensing satellites? In this respect, there are various examples which can be contemplated, such as the ABM Treaty (even though out of force now), the SALT and START Agreements, the Chemical Weapons Convention and the Comprehensive Test Ban Treaty. It will also be mentioned that NGOs have started to verify international conventions, e.g. Landmine Monitor is verifying the Mine-Ban Convention. Apart from verifying arms control and disarmament treaties, satellites can also strengthen the negotiation of peace agreements (such as the Dayton Peace Talks) and help prevent international conflicts from arising. Verification has played an increasingly prominent role in high-profile UN operations. Verification and monitoring can be applied to the whole range of elements that constitute a peace implementation process, ranging from the military aspects through electoral monitoring and human rights monitoring, from negotiating an accord to finally monitoring it. Last but not least, the problem of enforcing international obligations needs to be addressed, especially the dependence of international law on the will of political leaders and their respective national interests.
Sivanesam, Kalkena; Shu, Irene; Huggins, Kelly N. L.; Tatarek-Nossol, Marianna; Kapurniotu, Aphrodite; Andersen, Niels H.
2016-01-01
Versions of a previously discovered β-hairpin peptide inhibitor of IAPP aggregation that are stabilized in that conformation, or even forced to remain in the hairpin conformation by a backbone cyclization constraint, display superior activity as inhibitors. The cyclized hairpin, cyclo-WW2, displays inhibitory activity at sub-stoichiometric concentrations relative to this amyloidogenic peptide. The hairpin binding hypothesis stands confirmed. PMID:27317951
Automatic high throughput empty ISO container verification
NASA Astrophysics Data System (ADS)
Chalmers, Alex
2007-04-01
Encouraging results are presented for the automatic analysis of radiographic images of a continuous stream of ISO containers to confirm they are truly empty. A series of image processing algorithms are described that process real-time data acquired during the actual inspection of each container and assigns each to one of the classes "empty", "not empty" or "suspect threat". This research is one step towards achieving fully automated analysis of cargo containers.
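A rough sketch of the triage logic implied above, assuming transmission-style radiographs and an empty-container reference image; the thresholds, the baseline model, and the three-way decision rule are illustrative assumptions, not the deployed algorithms.

```python
# Illustrative sketch: compare each radiographic image against an empty-container
# attenuation baseline and assign one of three classes. All numbers are assumed.
import numpy as np

EMPTY_BASELINE = 0.05      # fractional attenuation excess still treated as empty
SUSPECT_MARGIN = 0.30      # excess above which the container is flagged as a threat

def classify_container(image: np.ndarray, empty_reference: np.ndarray) -> str:
    """Classify a radiograph as 'empty', 'not empty', or 'suspect threat'."""
    excess = np.mean(np.clip(empty_reference - image, 0, None))  # extra attenuation vs. empty
    if excess < EMPTY_BASELINE:
        return "empty"
    if excess < SUSPECT_MARGIN:
        return "not empty"
    return "suspect threat"

# Toy usage with synthetic 64x64 transmission images (1.0 = no attenuation).
ref = np.full((64, 64), 0.95)
cargo = ref.copy(); cargo[20:40, 20:40] = 0.4   # a dense object inside
print(classify_container(ref - 0.01, ref), classify_container(cargo, ref))
```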
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bunch, Kyle J.; Jones, Anthony M.; Ramuhalli, Pradeep
The ratification and ongoing implementation of the New START Treaty have been widely regarded as noteworthy global security achievements for both the Obama Administration and the Putin (formerly Medvedev) regime. But deeper cuts that move beyond the United States and Russia to engage the P-5 and other nuclear weapons possessor states are envisioned under future arms control regimes, and are indeed required for the P-5 in accordance with their Article VI disarmament obligations in the Nuclear Non-Proliferation Treaty. Future verification needs will include monitoring the cessation of production of new fissile material for weapons, monitoring storage of warhead components and fissile materials and verifying dismantlement of warheads, pits, secondary stages, and other materials. A fundamental challenge to implementing a nuclear disarmament regime is the ability to thwart unauthorized material diversion throughout the dismantlement and disposition process through strong chain of custody implementation. Verifying the declared presence, or absence, of nuclear materials and weapons components throughout the dismantlement and disposition lifecycle is a critical aspect of the disarmament process. From both the diplomatic and technical perspectives, verification under these future arms control regimes will require new solutions. Since any acceptable verification technology must protect sensitive design information and attributes to prevent the release of classified or other proliferation-sensitive information, non-nuclear non-sensitive modalities may provide significant new verification tools which do not require the use of additional information barriers. Alternative verification technologies based upon electromagnetic and acoustics could potentially play an important role in fulfilling the challenging requirements of future verification regimes. For example, researchers at the Pacific Northwest National Laboratory (PNNL) have demonstrated that low frequency electromagnetic signatures of sealed metallic containers can be used to rapidly confirm the presence of specific components on a yes/no basis without revealing classified information. PNNL researchers have also used ultrasonic measurements to obtain images of material microstructures which may be used as templates or unique identifiers of treaty-limited items. Such alternative technologies are suitable for application in various stages of weapons dismantlement and often include the advantage of an inherent information barrier due to the inability to extract classified weapon design information from the collected data. As a result, these types of technologies complement radiation-based verification methods for arms control. This article presents an overview of several alternative verification technologies that are suitable for supporting a future, broader and more intrusive arms control regime that spans the nuclear weapons disarmament lifecycle. The general capabilities and limitations of each verification modality are discussed and example technologies are presented. Potential applications are defined in the context of the nuclear material and weapons lifecycle. Example applications range from authentication (e.g., tracking and signatures within the chain of custody from downloading through weapons storage, unclassified templates and unique identification) to verification of absence and final material disposition.
Simulating flow around scaled model of a hypersonic vehicle in wind tunnel
NASA Astrophysics Data System (ADS)
Markova, T. V.; Aksenov, A. A.; Zhluktov, S. V.; Savitsky, D. V.; Gavrilov, A. D.; Son, E. E.; Prokhorov, A. N.
2016-11-01
A prospective hypersonic HEXAFLY aircraft is considered in this paper. In order to obtain the aerodynamic characteristics of a new design of the aircraft, experiments with a scaled model have been carried out in a wind tunnel under different conditions. The runs have been performed at different angles of attack with and without hydrogen combustion in the scaled propulsion engine. However, the measured physical quantities do not provide all the information about the flowfield. Numerical simulation can complement the experimental data as well as reduce the number of wind tunnel experiments. Besides that, reliable CFD software can be used for calculations of the aerodynamic characteristics of any possible design of the full-scale aircraft under different operating conditions. The reliability of the numerical predictions must be confirmed in a verification study of the software. The present work is aimed at numerical investigation of the flowfield around and inside the scaled model of the HEXAFLY-CIAM module under wind tunnel conditions. A cold run (without combustion) was selected for this study. The calculations are performed in the FlowVision CFD software. The flow characteristics are compared against the available experimental data. The verification study carried out confirms the capability of the FlowVision CFD software to calculate the flows discussed.
NASA Astrophysics Data System (ADS)
Osawa, Yuta; Imoto, Shoichi; Kusaka, Sachie; Sato, Fuminobu; Tanoshita, Masahiro; Murata, Isao
2017-09-01
Boron Neutron Capture Therapy (BNCT) is a promising new cancer therapy that limits the impact on normal cells. In Japan, Accelerator Based Neutron Sources (ABNS) are being developed for BNCT. For the spread of ABNS-based BNCT, the neutron field should be characterized beforehand. For this purpose, we have been developing a low-energy neutron spectrometer based on a 3He position-sensitive proportional counter. In this study, a new intense epi-thermal neutron field was developed with a DT neutron source for verification of the validity of the spectrometer. After the development, the neutron field characteristics were experimentally evaluated by using activation foils. As a result, we confirmed that an epi-thermal neutron field was successfully developed while substantially suppressing fast neutrons. Thereafter, the neutron spectrometer was verified experimentally. In the verification, although the measured detection depth distribution agreed well with the distribution calculated by MCNP, the unfolded spectrum was significantly different from the calculated neutron spectrum due to the contribution of side-incident neutrons. Therefore, we designed a new neutron collimator consisting of a polyethylene pre-collimator and a boron carbide neutron absorber, and confirmed numerically that it could suppress the side-incident neutrons and shape the neutron flux into a pencil-like beam.
Gorlin-Goltz syndrome: incidental finding on routine ct scan following car accident.
Kalogeropoulou, Christina; Zampakis, Petros; Kazantzi, Santra; Kraniotis, Pantelis; Mastronikolis, Nicholas S
2009-11-25
Gorlin-Goltz syndrome is a rare hereditary disease. Pathogenesis of the syndrome is attributed to abnormalities in the long arm of chromosome 9 (q22.3-q31) and loss or mutations of human patched gene (PTCH1 gene). Multiple basal cell carcinomas (BCCs), odontogenic keratocysts, skeletal abnormalities, hyperkeratosis of palms and soles, intracranial ectopic calcifications of the falx cerebri and facial dysmorphism are considered the main clinical features. Diagnosis is based upon established major and minor clinical and radiological criteria and ideally confirmed by DNA analysis. Because of the different systems affected, a multidisciplinary approach team of various experts is required for a successful management. We report the case of a 19 year-old female who was involved in a car accident and found to present imaging findings of Gorlin-Goltz syndrome during a routine whole body computed tomography (CT) scan in order to exclude traumatic injuries. Radiologic findings of the syndrome are easily identifiable on CT scans and may prompt to early verification of the disease, which is very important for regular follow-up and better survival rates from the co-existent diseases.
The calculating brain: an fMRI study.
Rickard, T C; Romero, S G; Basso, G; Wharton, C; Flitman, S; Grafman, J
2000-01-01
To explore brain areas involved in basic numerical computation, functional magnetic resonance imaging (fMRI) scanning was performed on college students during performance of three tasks: simple arithmetic, numerical magnitude judgment, and a perceptual-motor control task. For the arithmetic task relative to the other tasks, results for all eight subjects revealed bilateral activation in Brodmann's area 44, in dorsolateral prefrontal cortex (areas 9 and 10), in inferior and superior parietal areas, and in lingual and fusiform gyri. Activation was stronger on the left for all subjects, but only at Brodmann's area 44 and the parietal cortices. No activation was observed in the arithmetic task in several other areas previously implicated for arithmetic, including the angular and supramarginal gyri and the basal ganglia. In fact, angular and supramarginal gyri were significantly deactivated by the verification task relative to both the magnitude judgment and control tasks for every subject. Areas activated by the magnitude task relative to the control were more variable, but in five subjects included bilateral inferior parietal cortex. These results confirm some existing hypotheses regarding the neural basis of numerical processes, invite revision of others, and suggest productive lines for future investigation.
Experimental test of quantum nonlocality in three-photon Greenberger-Horne-Zeilinger entanglement
Pan; Bouwmeester; Daniell; Weinfurter; Zeilinger
2000-02-03
Bell's theorem states that certain statistical correlations predicted by quantum physics for measurements on two-particle systems cannot be understood within a realistic picture based on local properties of each individual particle, even if the two particles are separated by large distances. Einstein, Podolsky and Rosen first recognized the fundamental significance of these quantum correlations (termed 'entanglement' by Schrodinger) and the two-particle quantum predictions have found ever-increasing experimental support. A more striking conflict between quantum mechanical and local realistic predictions (for perfect correlations) has been discovered; but experimental verification has been difficult, as it requires entanglement between at least three particles. Here we report experimental confirmation of this conflict, using our recently developed method to observe three-photon entanglement, or 'Greenberger-Horne-Zeilinger' (GHZ) states. The results of three specific experiments, involving measurements of polarization correlations between three photons, lead to predictions for a fourth experiment; quantum physical predictions are mutually contradictory with expectations based on local realism. We find the results of the fourth experiment to be in agreement with the quantum prediction and in striking conflict with local realism.
Maryáš, Josef; Faktor, Jakub; Dvořáková, Monika; Struhárová, Iva; Grell, Peter; Bouchal, Pavel
2014-03-01
Metastases are responsible for most deaths in patients with solid tumors. There is thus an urgent clinical need to better understand the exact molecular mechanisms and to find novel therapeutic targets and biomarkers of metastatic disease in various tumors. Metastases are formed in a complicated biological process called the metastatic cascade. Up to now, proteomics has enabled the identification of a number of metastasis-associated proteins and potential biomarkers in cancer tissues, microdissected cells, model systems, and secretomes. Expression profiles and the biological roles of key proteins were confirmed in verification and functional experiments. This communication reviews these observations and analyses the methodological aspects of the proteomics approaches used. Moreover, it reviews the contribution of current proteomics to the functional characterization and interactome analysis of proteins involved in various events of the metastatic cascade. It is evident that ongoing technical progress will further increase the proteome coverage and sample capacity of proteomics technologies, giving complex answers to the clinical and functional questions asked. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Radiocarbon as a Reactive Tracer for Tracking Permanent CO 2 Storage in Basaltic Rocks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matter, Juerg; Stute, Martin; Schlosser, Peter
In view of concerns about the long-term integrity and containment of CO2 storage in geologic reservoirs, many efforts have been made to improve the monitoring, verification and accounting methods for geologically stored CO2. Our project aimed to demonstrate that carbon-14 (14C) could be used as a reactive tracer to monitor geochemical reactions and evaluate the extent of mineral trapping of CO2 in basaltic rocks. The capacity of a storage reservoir for mineral trapping of CO2 is largely a function of host rock composition. Mineral carbonation involves combining CO2 with divalent cations including Ca2+, Mg2+ and Fe2+. The most abundant geological sources for these cations are basaltic rocks. Based on initial storage capacity estimates, we know that basalts have the necessary capacity to store millions to billions of tons of CO2 via in situ mineral carbonation. However, little is known about CO2-fluid-rock reactions occurring in a basaltic storage reservoir during and after CO2 injection. None of the common monitoring and verification techniques have been able to provide a surveying tool for mineral trapping. The most direct method for quantitative monitoring and accounting involves tagging the injected CO2 with 14C, because 14C is not present in deep geologic reservoirs prior to injection. Accordingly, we conducted two CO2 injection tests at the CarbFix pilot injection site in Iceland to study the feasibility of 14C as a reactive tracer for monitoring CO2-fluid-rock reactions and CO2 mineralization. Our newly developed monitoring techniques, using 14C as a reactive tracer, have been successfully demonstrated. For the first time, permanent and safe disposal of CO2 as environmentally benign carbonate minerals in basaltic rocks could be shown. Over 95% of the injected CO2 at the CarbFix pilot injection site was mineralized to carbonate minerals in less than two years after injection. Our monitoring results confirm that CO2 mineralization in basaltic rocks is far faster than previously postulated.
The U.S. EPA operates the Environmental and Sustainable Technology Evaluation (ESTE) program to facilitate the deployment of innovative technologies through performance verification and information dissemination. This ESTE project involved evaluation of co-firing common woody bio...
How streamlining telecommunications can cut IT expense.
McIntyre, Greg
2016-02-01
Hospitals and health systems can save IT expenses by implementing more efficient processes in accordance with the principles of effective telecommunications expense management. This approach involves three primary steps: Inventory of existing infrastructure. Charge verification. Optimization of rates and design for continual improvement.
Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier
2017-03-14
Results-based financing (RBF) has been introduced in many countries across Africa and a growing literature is building around the assessment of their impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study aims at exploring the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting the RBF implementation in Benin. While our participant observation and operational collaboration with the project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with Community-based Organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative and community verification. Results show that the verification processes are complex, costly and time-consuming, and in practice they end up differing from what was originally designed. We explore the consequences of this for the operation of the scheme and for its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delink effort and reward. Additionally, the limited integration of the verification activities of district teams with their routine tasks causes a further verticalization of the health system. Our results highlight the potential disconnect between the theory of change behind RBF and the actual scheme's implementation. The implications are relevant at the methodological level, stressing the importance of analyzing implementation processes to fully understand results, as well as at the operational level, pointing to the need to carefully adapt the design of RBF schemes (including verification and other key functions) to the context and to allow room to iteratively modify it during implementation. They also question whether the rationale for thorough and costly verification is justified, or whether adaptations are possible.
In vivo dose verification of IMRT treated head and neck cancer patients.
Engström, Per E; Haraldsson, Pia; Landberg, Torsten; Sand Hansen, Hanne; Aage Engelholm, Svend; Nyström, Håkan
2005-01-01
An independent in vivo dose verification procedure for IMRT treatments of head and neck cancers was developed. Results of 177 intracavitary TLD measurements from 10 patients are presented. The study includes data from 10 patients with cancer of the rhinopharynx or the thyroid treated with dynamic IMRT. Dose verification was performed by insertion of a flexible naso-oesophageal tube containing TLD rods and markers for EPID and simulator image detection. Part of the study focussed on investigating the accuracy of the TPS calculations in the presence of inhomogeneities. Phantom measurements and Monte Carlo simulations were performed for a number of geometries involving lateral electronic disequilibrium and steep density shifts. The in vivo TLD measurements correlated well with the predictions of the treatment planning system with a measured/calculated dose ratio of 1.002+/-0.051 (1 SD, N=177). The measurements were easily performed and well tolerated by the patients. We conclude that in vivo intracavitary dosimetry with TLD is suitable and accurate for dose determination in intensity-modulated beams.
NASA Technical Reports Server (NTRS)
Amar, Adam J.; Blackwell, Ben F.; Edwards, Jack R.
2007-01-01
The development and verification of a one-dimensional material thermal response code with ablation is presented. The implicit time integrator, control volume finite element spatial discretization, and Newton's method for nonlinear iteration on the entire system of residual equations have been implemented and verified for the thermochemical ablation of internally decomposing materials. This study is a continuation of the work presented in "One-Dimensional Ablation with Pyrolysis Gas Flow Using a Full Newton's Method and Finite Control Volume Procedure" (AIAA-2006-2910), which described the derivation, implementation, and verification of the constant density solid energy equation terms and boundary conditions. The present study extends the model to decomposing materials including decomposition kinetics, pyrolysis gas flow through the porous char layer, and a mixture (solid and gas) energy equation. Verification results are presented for the thermochemical ablation of a carbon-phenolic ablator which involves the solution of the entire system of governing equations.
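The solution strategy summarized above (implicit time integration with Newton iteration on the full residual vector) can be sketched generically as follows; the one-equation "material", the finite-difference Jacobian, and all parameter values are stand-ins for illustration, not the CHAR/ablation governing equations or their discretization.

```python
# Generic sketch: implicit time step with Newton iteration on a residual vector.
import numpy as np

def newton_solve(residual, u0, tol=1e-10, max_iter=25):
    """Solve residual(u) = 0 for the state vector u by Newton's method."""
    u = u0.astype(float).copy()
    for _ in range(max_iter):
        r = residual(u)
        if np.linalg.norm(r) < tol:
            return u
        # Finite-difference Jacobian (practical only for small systems)
        n, eps = u.size, 1e-7
        J = np.empty((n, n))
        for j in range(n):
            du = np.zeros(n); du[j] = eps
            J[:, j] = (residual(u + du) - r) / eps
        u -= np.linalg.solve(J, r)
    raise RuntimeError("Newton iteration did not converge")

# Toy implicit step for dT/dt = -k(T) * T with k(T) = 1 + T^2 (nonlinear decay)
dt, T_old = 0.1, np.array([2.0])
res = lambda T: (T - T_old) / dt + (1.0 + T**2) * T
print(newton_solve(res, T_old))
```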
Personal Identification by Keystroke Dynamics in Japanese Free Text Typing
NASA Astrophysics Data System (ADS)
Samura, Toshiharu; Nishimura, Haruhiko
Biometrics is classified into verification and identification. Many researchers on keystroke dynamics have treated verification of a fixed short password used for user login. In this research, we focus on identification and investigate several characteristics of keystroke dynamics in Japanese free text typing. We developed Web-based typing software to collect keystroke data over a Local Area Network and performed experiments on a total of 112 subjects, from which three groups by typing level were constructed: beginner level and above, normal level and above, and middle level and above. Based on identification methods using the weighted Euclidean distance and a neural network applied to the feature indexes extracted from Japanese texts, we evaluated identification performance for the three groups. As a result, high personal identification accuracy was confirmed for both methods, in proportion to the typing level of the group.
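A minimal sketch of identification by weighted Euclidean distance over keystroke feature vectors, under the assumption that each enrolled user has a mean feature profile and inverse-variance weights; the feature choice (mean inter-key latencies) and all numbers are illustrative, not the paper's extracted indexes.

```python
# Nearest-profile identification with a weighted Euclidean distance.
import numpy as np

def identify(sample: np.ndarray, profiles: dict, weights: np.ndarray) -> str:
    """Return the enrolled user whose profile is closest to the sample."""
    def wdist(a, b):
        return np.sqrt(np.sum(weights * (a - b) ** 2))
    return min(profiles, key=lambda user: wdist(sample, profiles[user]))

# Toy enrollment: per-user mean feature vectors and inverse-variance weights.
profiles = {
    "user_a": np.array([120.0, 95.0, 150.0]),   # ms latencies for three key pairs
    "user_b": np.array([180.0, 140.0, 210.0]),
}
weights = 1.0 / np.array([15.0, 10.0, 20.0]) ** 2   # emphasize low-variance features
sample = np.array([125.0, 98.0, 155.0])
print(identify(sample, profiles, weights))          # expected: user_a
```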
Safety Verification of the Small Aircraft Transportation System Concept of Operations
NASA Technical Reports Server (NTRS)
Carreno, Victor; Munoz, Cesar
2005-01-01
A critical factor in the adoption of any new aeronautical technology or concept of operation is safety. Traditionally, safety is accomplished through a rigorous process that involves human factors, low and high fidelity simulations, and flight experiments. As this process is usually performed on final products or functional prototypes, concept modifications resulting from this process are very expensive to implement. This paper describes an approach to system safety that can take place at early stages of a concept design. It is based on a set of mathematical techniques and tools known as formal methods. In contrast to testing and simulation, formal methods provide the capability of exhaustive state exploration analysis. We present the safety analysis and verification performed for the Small Aircraft Transportation System (SATS) Concept of Operations (ConOps). The concept of operations is modeled using discrete and hybrid mathematical models. These models are then analyzed using formal methods. The objective of the analysis is to show, in a mathematical framework, that the concept of operation complies with a set of safety requirements. It is also shown that the ConOps has some desirable characteristics such as liveness and absence of deadlock. The analysis and verification are performed in the Prototype Verification System (PVS), which is a computer-based specification language and theorem proving assistant.
Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model
NASA Astrophysics Data System (ADS)
Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal
How to Capture and Preserve Digital Evidence Securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected in the crime scene has a vital importance. On one side, it is a very challenging task for forensics professionals to collect them without any loss or damage. On the other, there is the second problem of providing the integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, there is not any previous work proposing a systematic model having a holistic view to address all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping as well as latest technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.
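The integrity and authenticity idea behind the model can be sketched as a hash-and-sign step followed by later verification. Ed25519 is used here only for brevity, and the PKI certificates, secure time-stamping, and GPS/EDGE elements of PKIDEV are deliberately not shown; this is an assumed minimal illustration, not the model itself.

```python
# Hash the captured evidence, sign the digest, and let a verifier check it later.
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

def sign_evidence(private_key: Ed25519PrivateKey, evidence: bytes) -> tuple[bytes, bytes]:
    digest = hashlib.sha256(evidence).digest()
    return digest, private_key.sign(digest)

def verify_evidence(public_key, evidence: bytes, digest: bytes, signature: bytes) -> bool:
    if hashlib.sha256(evidence).digest() != digest:
        return False                      # evidence was altered after capture
    try:
        public_key.verify(signature, digest)
        return True
    except InvalidSignature:
        return False

key = Ed25519PrivateKey.generate()
evidence = b"disk image acquired at the crime scene"
digest, sig = sign_evidence(key, evidence)
print(verify_evidence(key.public_key(), evidence, digest, sig))           # True
print(verify_evidence(key.public_key(), evidence + b"x", digest, sig))    # False
```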
An Optimized Online Verification Imaging Procedure for External Beam Partial Breast Irradiation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willis, David J., E-mail: David.Willis@petermac.or; Royal Melbourne Institute of Technology University, Melbourne, Victoria; Kron, Tomas
2011-07-01
The purpose of this study was to evaluate the capabilities of a kilovoltage (kV) on-board imager (OBI)-equipped linear accelerator in the setting of on-line verification imaging for external-beam partial breast irradiation. Available imaging techniques were optimized and assessed for image quality using a modified anthropomorphic phantom. Imaging dose was also assessed. Imaging techniques were assessed for physical clearance between patient and treatment machine using a volunteer. Nonorthogonal kV image pairs were identified as optimal in terms of image quality, clearance, and dose. After institutional review board approval, this approach was used for 17 patients receiving accelerated partial breast irradiation. Imaging was performed before every fraction verification with online correction of setup deviations >5 mm (total image sessions = 170). Treatment staff rated risk of collision and visibility of tumor bed surgical clips where present. Image session duration and detected setup deviations were recorded. For all cases, both image projections (n = 34) had low collision risk. Surgical clips were rated as well visualized in all cases where they were present (n = 5). The average imaging session time was 6 min, 16 sec, and a reduction in duration was observed as staff became familiar with the technique. Setup deviations of up to 1.3 cm were detected before treatment and subsequently confirmed offline. Nonorthogonal kV image pairs allowed effective and efficient online verification for partial breast irradiation. It has yet to be tested in a multicenter study to determine whether it is dependent on skilled treatment staff.
NASA Astrophysics Data System (ADS)
Raikovskiy, N. A.; Tretyakov, A. V.; Abramov, S. A.; Nazmeev, F. G.; Pavlichev, S. V.
2017-08-01
The paper presents a numerical method, based on ANSYS CFX, for studying the cooling medium flowing in the water jacket of a self-lubricating sliding bearing. The results of the numerical calculations show satisfactory agreement with the empirical data obtained on the testbed. The verification data confirm that this numerical technique can be applied to the analysis of coolant flows in a self-lubricating bearing containing a water jacket.
Mechanical Properties for Advanced Engine Materials
1992-04-01
electric potential differences, fractography, and model verification. 2.1.2 Creep: This investigation [Khobaib] studied the creep behavior of SCS-6/Ti-24Al-1... dependent behavior [Bushnell; Hunsaker et al.] into an in-house code, MAGNA [Brockman], was more cost effective than obtaining, learning, and modifying a new... the applicability of Eqn. (13) to disks with deep notches. These results correlated well with Eqn. (13) as shown in Fig. 4.4.3-1 and confirmed its
Sivanesam, Kalkena; Shu, Irene; Huggins, Kelly N L; Tatarek-Nossol, Marianna; Kapurniotu, Aphrodite; Andersen, Niels H
2016-08-01
Versions of a previously discovered β-hairpin peptide inhibitor of IAPP aggregation that are stabilized in that conformation, or even forced to remain in the hairpin conformation by a backbone cyclization constraint, display superior activity as inhibitors. The cyclized hairpin, cyclo-WW2, displays inhibitory activity at substoichiometric concentrations relative to this amyloidogenic peptide. The hairpin-binding hypothesis stands confirmed. © 2016 Federation of European Biochemical Societies.
NASA Technical Reports Server (NTRS)
Bruce, Kevin R.
1986-01-01
A Mach/CAS control system using an elevator was designed and developed for use on the NASA TCV B737 aircraft to support research in profile descent procedures and approach energy management. The system was designed using linear analysis techniques primarily. The results were confirmed and the system validated at additional flight conditions using a nonlinear 737 aircraft simulation. All design requirements were satisfied.
2010-08-01
effortless flow. Varies speech flow for stylistic effect, e.g., to emphasize a point. Uses appropriate discourse markers and connectors spontaneously. L3... were equally represented in Cognitive Aspects of Cross-Linguistic Communication (15%), Pilot Controller Interactions (15%), and Verification...Confirmation of Messages (15%). Cognitive Aspects of Cross-linguistic Communication: The speed of communication and understanding is probably a comfortable
2012-01-30
CFRP LAMINATES FOR MARINE USE. Grant number: N00014-06-1-1139. Author(s): Miyano, Yasushi. ...prediction of CFRP laminates proposed and confirmed experimentally in the previous ONR project of Grant # N000140110949 was verified theoretically and refined... DURABILITY OF CFRP LAMINATES FOR MARINE USE. Principal Investigator: Yasushi Miyano. Co-principal Investigator: Isao Kimpara. Materials System
NASA Technical Reports Server (NTRS)
Salazar, Giovanni; Droba, Justin C.; Oliver, Brandon; Amar, Adam J.
2016-01-01
With the recent development of multi-dimensional thermal protection system (TPS) material response codes, the capability to account for surface-to-surface radiation exchange in complex geometries is critical. This paper presents recent efforts to implement such capabilities in the CHarring Ablator Response (CHAR) code developed at NASA's Johnson Space Center. This work also describes the different numerical methods implemented in the code to compute geometric view factors for radiation problems involving multiple surfaces. Verification of the code's radiation capabilities and results of a code-to-code comparison are presented. Finally, a demonstration case of a two-dimensional ablating cavity with enclosure radiation accounting for a changing geometry is shown.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Jun Soo; Choi, Yong Joon; Smith, Curtis Lee
2016-09-01
This document addresses two subjects involved with the RELAP-7 Software Verification and Validation Plan (SVVP): (i) the principles and plan to assure the independence of RELAP-7 assessment through the code development process, and (ii) the work performed to establish the RELAP-7 assessment plan, i.e., the assessment strategy, literature review, and identification of RELAP-7 requirements. Then, the Requirements Traceability Matrices (RTMs) proposed in a previous document (INL-EXT-15-36684) are updated. These RTMs provide an efficient way to evaluate the RELAP-7 development status as well as the maturity of RELAP-7 assessment through the development process.
Verification of intravenous catheter placement by auscultation--a simple, noninvasive technique.
Lehavi, Amit; Rudich, Utay; Schechtman, Moshe; Katz, Yeshayahu Shai
2014-01-01
Verification of proper placement of an intravenous catheter may not always be simple. We evaluated the auscultation technique for this purpose. Twenty healthy volunteers were randomized for an 18G catheter inserted intravenously in either the right (12) or left arm (8), and subcutaneously in the opposite arm. A standard stethoscope was placed over an area approximately 3 cm proximal to the tip of the catheter in the presumed direction of the vein to grade, on a 0-6 scale, the murmur heard by rapidly injecting 2 mL of NaCl 0.9% solution. The auscultation was evaluated by a blinded staff anesthesiologist. All 20 intravenous injections were evaluated as flow murmurs and were graded an average of 5.65 (±0.98), whereas all 20 subcutaneous injections were evaluated as either crackles or no sound and were graded an average of 2.00 (±1.38), without negative results. Sensitivity was calculated as 95%. Specificity and Kappa could not be calculated due to an empty false-positive group. As it is simple, handy and noninvasive, we recommend using the auscultation technique for verification of the proper placement of an intravenous catheter when uncertain of its position. Data obtained in our limited sample of healthy subjects need to be confirmed in the clinical setting.
Quantifying residues from postharvest fumigation of almonds and walnuts with propylene oxide
USDA-ARS?s Scientific Manuscript database
A novel analytical approach, involving solvent extraction with methyl tert-butyl ether (MTBE) followed by gas chromatography (GC), was developed to quantify residues that result from the postharvest fumigation of almonds and walnuts with propylene oxide (PPO). Verification and quantification of PPO,...
Barlow-Stewart, Kristine; Taylor, Sandra D; Treloar, Susan A; Stranger, Mark; Otlowski, Margaret
2009-03-01
To undertake a systematic process of verification of consumer accounts of alleged genetic discrimination. Verification of incidents reported in life insurance and other contexts that met the criteria of genetic discrimination, and the impact of fear of such treatment, was determined, with consent, through interview, document analysis and where appropriate, direct contact with the third party involved. The process comprised obtaining evidence that the alleged incident was accurately reported and determining whether the decision or action seemed to be justifiable and/or ethical. Reported incidents of genetic discrimination were verified in life insurance access, underwriting and coercion (9), applications for worker's compensation (1) and early release from prison (1) and in two cases of fear of discrimination impacting on access to genetic testing. Relevant conditions were inherited cancer susceptibility (8), Huntington disease (3), hereditary hemochromatosis (1), and polycystic kidney disease (1). In two cases, the reversal of an adverse underwriting decision to standard rate after intervention with insurers by genetics health professionals was verified. The mismatch between consumer and third party accounts in three life insurance incidents involved miscommunication or lack of information provision by financial advisers. These first cases of verified genetic discrimination make it essential for policies and guidelines to be developed and implemented to ensure appropriate use of genetic test results in insurance underwriting, to promote education and training in the financial industry, and to provide support for consumers and health professionals undertaking challenges of adverse decisions.
Universal Verification Methodology Based Register Test Automation Flow.
Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu
2016-05-01
In today's SoC design, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task. Therefore, we need an efficient way to perform verification with less effort in a shorter time. In this work, we suggest a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard methodology, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate the models into a test-bench environment because it requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe the register specification in IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template which is translated to an IP-XACT description, from which register models can be easily generated using commercial tools. On the other hand, we also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use the register model without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor, etc.) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time to verify the functionality of registers.
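As a rough illustration of the flow described above, the sketch below (Python, with entirely hypothetical register fields and a deliberately minimal XML output, not the authors' tool or the full IP-XACT schema) shows the kind of spreadsheet-to-register-description translation from which commercial tools could then generate register models.

```python
# Minimal sketch (assumed field names, not the authors' tool): translate a
# spreadsheet-style register description into a bare-bones IP-XACT-like XML
# fragment, the intermediate step from which register models can be generated.
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical spreadsheet rows: register name, offset, width, access
SPREADSHEET = """name,offset,width,access
CTRL,0x00,32,read-write
STATUS,0x04,32,read-only
"""

def rows_to_ipxact(csv_text: str) -> str:
    root = ET.Element("addressBlock")
    for row in csv.DictReader(io.StringIO(csv_text)):
        reg = ET.SubElement(root, "register")
        ET.SubElement(reg, "name").text = row["name"]
        ET.SubElement(reg, "addressOffset").text = row["offset"]
        ET.SubElement(reg, "size").text = row["width"]
        ET.SubElement(reg, "access").text = row["access"]
    return ET.tostring(root, encoding="unicode")

if __name__ == "__main__":
    print(rows_to_ipxact(SPREADSHEET))
```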
Requirements Development for the NASA Advanced Engineering Environment (AEE)
NASA Technical Reports Server (NTRS)
Rogers, Eric; Hale, Joseph P.; Zook, Keith; Gowda, Sanjay; Salas, Andrea O.
2003-01-01
The requirements development process for the Advanced Engineering Environment (AEE) is presented. This environment has been developed to allow NASA to perform independent analysis and design of space transportation architectures and technologies. Given the highly collaborative and distributed nature of AEE, a variety of organizations are involved in the development, operations and management of the system. Furthermore, there are additional organizations involved representing external customers and stakeholders. Thorough coordination and effective communication are essential to translate desired expectations of the system into requirements. Functional, verifiable requirements for this (and indeed any) system are necessary to fulfill several roles. Requirements serve as a contractual tool, a configuration management tool, and an engineering tool, sometimes simultaneously. The role of requirements as an engineering tool is particularly important because a stable set of requirements for a system provides a common framework of system scope and characterization among team members. Furthermore, the requirements provide the basis for checking completion of system elements and form the basis for system verification. Requirements are at the core of systems engineering. The AEE Project has undertaken a thorough process to translate the desires and expectations of external customers and stakeholders into functional system-level requirements that are captured with sufficient rigor to allow development planning, resource allocation and system-level design, development, implementation and verification. These requirements are maintained in an integrated, relational database that provides traceability to governing Program requirements and also to verification methods and subsystem-level requirements.
Sridhar, L; Karthikraj, R; Lakshmi, V V S; Raju, N Prasada; Prabhakar, S
2014-08-01
Rapid detection and identification of chemical warfare agents and related precursors/degradation products in various environmental matrices is of paramount importance for verification of standards set by the chemical weapons convention (CWC). Nitrogen mustards, N,N-dialkylaminoethyl-2-chlorides, N,N-dialkylaminoethanols, N-alkyldiethanolamines, and triethanolamine, which are listed CWC scheduled chemicals, are prone to undergo N-oxidation in environmental matrices or during decontamination process. Thus, screening of the oxidized products of these compounds is also an important task in the verification process because the presence of these products reveals alleged use of nitrogen mustards or precursors of VX compounds. The N-oxides of aminoethanols and aminoethylchlorides easily produce [M + H](+) ions under electrospray ionization conditions, and their collision-induced dissociation spectra include a specific neutral loss of 48 u (OH + CH2OH) and 66 u (OH + CH2Cl), respectively. Based on this specific fragmentation, a rapid screening method was developed for screening of the N-oxides by applying neutral loss scan technique. The method was validated and the applicability of the method was demonstrated by analyzing positive and negative samples. The method was useful in the detection of N-oxides of aminoethanols and aminoethylchlorides in environmental matrices at trace levels (LOD, up to 500 ppb), even in the presence of complex masking agents, without the use of time-consuming sample preparation methods and chromatographic steps. This method is advantageous for the off-site verification program and also for participation in official proficiency tests conducted by the Organization for the Prohibition of Chemical Weapons (OPCW), the Netherlands. The structure of N-oxides can be confirmed by the MS/MS experiments on the detected peaks. A liquid chromatography-mass spectrometry (LC-MS) method was developed for the separation of isomeric N-oxides of aminoethanols and aminoethylchlorides using a C18 Hilic column. Critical isomeric compounds can be confirmed by LC-MS/MS experiments, after detecting the N-oxides from the neutral loss scanning method.
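The neutral-loss screening logic lends itself to a simple illustration. The sketch below (Python) flags precursor/product ion pairs whose mass difference matches the characteristic 48 u or 66 u losses; the peak values, tolerance, and labels are hypothetical and not taken from the published method.

```python
# Hedged sketch of the neutral-loss screening idea described above: flag any
# precursor/product ion pair whose mass difference matches the characteristic
# losses of 48 u (OH + CH2OH) or 66 u (OH + CH2Cl). Peak lists are hypothetical.
NEUTRAL_LOSSES = {"aminoethanol N-oxide": 48.0, "aminoethylchloride N-oxide": 66.0}
TOLERANCE = 0.3  # u, assumed instrument tolerance

def screen(precursor_mz, product_mzs):
    hits = []
    for label, loss in NEUTRAL_LOSSES.items():
        if any(abs((precursor_mz - p) - loss) <= TOLERANCE for p in product_mzs):
            hits.append(label)
    return hits

# Example: a hypothetical [M + H]+ at 134.1 with a product ion at 86.1 (loss of 48 u)
print(screen(134.1, [86.1, 116.1]))  # -> ['aminoethanol N-oxide']
```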
Yamamoto, Takashi; Noma, Yukio; Sakai, Shin-Ichi
2016-07-02
A series of verification tests were carried out in order to confirm that polychlorinated naphthalenes (PCNs) contained in synthetic rubber products (Neoprene FB products) and aerosol adhesives, which were accidentally imported into Japan, could be thermally destroyed using an industrial waste incinerator. In the verification tests, Neoprene FB products containing PCNs at a concentration of 2800 mg/kg were added to industrial wastes at a ratio of 600 mg Neoprene FB product/kg-waste, and then incinerated at an average temperature of 985 °C. Total PCN concentrations were 14 ng/m³N in stack gas, 5.7 ng/g in bottom ash, 0.98 ng/g in boiler dust, and 1.2 ng/g in fly ash. Destruction efficiency (DE) and destruction removal efficiency (DRE) of congener No. 38/40, which is considered an input marker congener, were 99.9974% and 99.9995%, respectively. The following dioxin concentrations were found: 0.11 ng-TEQ/m³N for the stack gas, 0.096 ng-TEQ/g for the bottom ash, 0.010 ng-TEQ/g for the boiler dust, and 0.072 ng-TEQ/g for the fly ash. Since the PCN levels in the PCN destruction test were even slightly lower than in the baseline test without PCN addition, the detected PCNs are to a large degree unintentionally produced and do not mainly stem from the input material. Also, the dioxin levels did not change. From these results, we confirmed that PCNs contained in Neoprene FB products and aerosol adhesives could be destroyed to a high degree by high-temperature incineration. Therefore, all recalled Neoprene FB products and aerosol adhesives containing PCNs were successfully treated under the same conditions as the verification tests.
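As a hedged illustration of the DE/DRE bookkeeping, the sketch below uses one common convention (DE accounts for the marker congener leaving in all residues, DRE for stack emissions only); the input and output masses are placeholders, not the study's mass-balance data.

```python
# Illustrative sketch of destruction efficiency (DE) and destruction removal
# efficiency (DRE), under the assumed convention that DE counts the marker
# congener leaving in all residues and DRE counts only stack emissions.
# The masses below are placeholders chosen to reproduce the quoted percentages.
def destruction_efficiency(mass_in_g, mass_out_residues_g):
    return 100.0 * (mass_in_g - mass_out_residues_g) / mass_in_g

def destruction_removal_efficiency(mass_in_g, mass_out_stack_g):
    return 100.0 * (mass_in_g - mass_out_stack_g) / mass_in_g

mass_in = 10.0        # g of marker congener fed to the incinerator (hypothetical)
print(destruction_efficiency(mass_in, 0.00026))          # -> 99.9974
print(destruction_removal_efficiency(mass_in, 0.00005))  # -> 99.9995
```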
Chutz, Noah; Skutsch, Margaret
2016-01-01
There have been many calls for community participation in MRV (measuring, reporting, verification) for REDD+. This paper examines whether community involvement in MRV is a requirement, why it appears desirable to REDD+ agencies and external actors, and under what conditions communities might be interested in participating. It asks What’s in it for communities? What might communities gain from such an involvement? What could they lose? It embraces a broader approach which we call community MMM which involves mapping, measuring and monitoring of forest and other natural resources for issues which are of interest to the community itself. We focus on cases in México because the country has an unusually high proportion of forests under community communal ownership. In particular, we refer to a recent REDD+ initiative—CONAFOR-LAIF, in which local communities select and approve local people to participate in community-based monitoring activities. From these local initiatives we identify the specific and the general drivers for communities to be involved in mapping, measuring and monitoring of their own territories and their natural resources. We present evidence that communities are more interested in this wider approach than in a narrow focus on carbon monitoring. Finally we review what the challenges to reconciling MMM with MRV requirements are likely to be. PMID:27300439
Using Replication Projects in Teaching Research Methods
ERIC Educational Resources Information Center
Standing, Lionel G.; Grenier, Manuel; Lane, Erica A.; Roberts, Meigan S.; Sykes, Sarah J.
2014-01-01
It is suggested that replication projects may be valuable in teaching research methods, and also address the current need in psychology for more independent verification of published studies. Their use in an undergraduate methods course is described, involving student teams who performed direct replications of four well-known experiments, yielding…
Publication Of Oceanographic Data on CD-ROM
NASA Technical Reports Server (NTRS)
Hilland, Jeffrey E.; Smith, Elizabeth A.; Martin, Michael D.
1992-01-01
Large collections of oceanographic data and other large collections of data published on CD-ROM's in formats facilitating access and analysis. Involves four major steps: preprocessing, premastering, mastering, and verification. Large capacity, small size, commercial availability, long-life, and standard format of CD-ROM's offer advantages over computer-compatible magnetic tape.
30 CFR 227.600 - What automated verification functions may a State perform?
Code of Federal Regulations, 2010 CFR
2010-07-01
... involves systematic monitoring of production and royalty reports to identify and resolve reporting or... reported by royalty reporters to sales and transfer volumes reported by production reporters. If you request delegation of automated comparison of sales and production volumes, you must perform at least the...
Abstract for 1999 Rational Software User Conference
NASA Technical Reports Server (NTRS)
Dunphy, Julia; Rouquette, Nicolas; Feather, Martin; Tung, Yu-Wen
1999-01-01
We develop spacecraft fault-protection software at NASA/JPL. Challenges exemplified by our task: 1) high-quality systems - need for extensive validation & verification; 2) multi-disciplinary context - involves experts from diverse areas; 3) embedded systems - must adapt to external practices, notations, etc.; and 4) development pressures - NASA's mandate of "better, faster, cheaper".
A Comparison of the Effects of Two Instructional Sequences Involving Science Laboratory Activities.
ERIC Educational Resources Information Center
Ivins, Jerry Edward
This study attempted to determine if students learn science concepts better when laboratories are used to verify concepts already introduced through lectures and textbooks (verification laboratories) or whether achievement and retention are improved when laboratories are used to introduce new concepts (directed discovery learning laboratories). The…
Photovoltaic system criteria documents. Volume 5: Safety criteria for photovoltaic applications
NASA Technical Reports Server (NTRS)
Koenig, John C.; Billitti, Joseph W.; Tallon, John M.
1979-01-01
A methodology is described for determining potential safety hazards involved in the construction and operation of photovoltaic power systems, and guidelines are provided for the implementation of safety considerations in the specification, design and operation of photovoltaic systems. Safety verification procedures for use in solar photovoltaic systems are established.
A significant challenge in environmental studies is to determine the onset and extent of MTBE bioremediation at an affected site, which may involve indirect approaches such as microcosm verification of microbial activities at a given site. Stable isotopic fractionation is cha...
Moderation in the Certificates of General Education for Adults. Guidelines for Providers.
ERIC Educational Resources Information Center
Council of Adult Education, Melbourne (Australia).
This document provides guidelines for the process of moderation and verification of assessments for educators involved in adult education. As used in the education establishment in Australia, "moderation" is the process of ensuring the standardization of assessment. Through the moderation process, assessment procedures conducted in a…
18 CFR 3b.222 - Identification requirements.
Code of Federal Regulations, 2010 CFR
2010-04-01
... disclose a social security number. (h) No verification of identity will be required of individuals seeking... in the record will be used to determine identity. (c) If the system manager determines that the data... individual involved, a signed notarized statement asserting identity or some other reasonable means to verify...
2011-12-01
...therefore a more general approach uses the pseudo-inverse shown in Equation (12) to obtain the commanded gimbal rate ... gimbal motor. Approaching the problem from this perspective increases the complexity significantly and the relationship between motor current and ... included in this document confirms the equations that Schaub and Junkins developed. The approaches used in the two derivations are sufficiently...
Quantum money with nearly optimal error tolerance
NASA Astrophysics Data System (ADS)
Amiri, Ryan; Arrazola, Juan Miguel
2017-06-01
We present a family of quantum money schemes with classical verification which display a number of benefits over previous proposals. Our schemes are based on hidden matching quantum retrieval games and they tolerate noise up to 23 % , which we conjecture reaches 25 % asymptotically as the dimension of the underlying hidden matching states is increased. Furthermore, we prove that 25 % is the maximum tolerable noise for a wide class of quantum money schemes with classical verification, meaning our schemes are almost optimally noise tolerant. We use methods in semidefinite programming to prove security in a substantially different manner to previous proposals, leading to two main advantages: first, coin verification involves only a constant number of states (with respect to coin size), thereby allowing for smaller coins; second, the reusability of coins within our scheme grows linearly with the size of the coin, which is known to be optimal. Last, we suggest methods by which the coins in our protocol could be implemented using weak coherent states and verified using existing experimental techniques, even in the presence of detector inefficiencies.
A zero-knowledge protocol for nuclear warhead verification
NASA Astrophysics Data System (ADS)
Glaser, Alexander; Barak, Boaz; Goldston, Robert J.
2014-06-01
The verification of nuclear warheads for arms control involves a paradox: international inspectors will have to gain high confidence in the authenticity of submitted items while learning nothing about them. Proposed inspection systems featuring `information barriers', designed to hide measurements stored in electronic systems, are at risk of tampering and snooping. Here we show the viability of a fundamentally new approach to nuclear warhead verification that incorporates a zero-knowledge protocol, which is designed in such a way that sensitive information is never measured and so does not need to be hidden. We interrogate submitted items with energetic neutrons, making, in effect, differential measurements of both neutron transmission and emission. Calculations for scenarios in which material is diverted from a test object show that a high degree of discrimination can be achieved while revealing zero information. Our ideas for a physical zero-knowledge system could have applications beyond the context of nuclear disarmament. The proposed technique suggests a way to perform comparisons or computations on personal or confidential data without measuring the data in the first place.
Innovative Research Program: Supershields for Gamma-Ray Astronomy
NASA Technical Reports Server (NTRS)
Hailey, Charles J.
2000-01-01
The supershield project evaluated the importance of novel shield configurations for suppressing neutron-induced background in new classes of gamma-ray detectors such as CZT. The basic concept was to use a two-part shield. The outer shield material heavily moderates the incoming neutron spectrum. This moderated neutron beam is then more easily absorbed by the inner material, which is an efficient neutron absorber. This approach is, in principle, more efficient than that in previous attempts to make neutron shields. These previous attempts involved biatomic, monolithic shields (e.g., LiH) in which the shield consisted of a single material but with two types of atoms - one for moderating and one for absorbing. The problem with this type of monolithic shield is that moderating neutrons, without efficiently absorbing them, leads to the leakage into the detector of neutrons with a low-energy component (approx. 10-100 keV). These low-energy neutrons are particularly problematic for many types of detectors. The project was roughly divided into three phases. In the first phase we attempted to carefully define the neutron source function incident on any space instrument. This is essential since the design of any shield depends on the shape of the incident neutron spectrum. We found that approximations commonly used in gamma-ray astronomy for the photon background are inadequate. In addition, we found that secondary neutrons produced in any passive shield, and dominated by inelastic neutron scattering, are far more important than background due to neutron activation. The second phase of our work involved design of supershield geometries (one and three dimensional) in order to compare different shield configurations and materials for their effectiveness as neutron shields. Moreover, we wanted to compare these supershields with previous neutron shields to confirm the performance differences between the supershield (two material) and monolithic (one material) designs and to understand the physics origins of these differences more clearly. The third phase of the supershield program involved the benchmarking of the supershield designs through direct experimental verification. This required fabricating various supershields and exposing them to beams of neutrons to directly characterize their performance. With explicit verification that our modeling procedures can be used with confidence, we are now in a position to design shields for realistic space geometries. Using the supershield modeling capacity developed as part of this program we are attempting to evaluate their utility for a specific proposed mission--the Energetic X-ray Imaging Survey Telescope (EXIST). It is anticipated that this experiment, which is limited by internal background at high energies, might benefit from a neutron shield.
Feasibility of biochemical verification in a web-based smoking cessation study.
Cha, Sarah; Ganz, Ollie; Cohn, Amy M; Ehlke, Sarah J; Graham, Amanda L
2017-10-01
Cogent arguments have been made against the need for biochemical verification in population-based studies with low-demand characteristics. Despite this fact, studies involving digital interventions (low-demand) are often required in peer review to report biochemically verified abstinence. To address this discrepancy, we examined the feasibility and costs of biochemical verification in a web-based study conducted with a national sample. Participants were 600 U.S. adult current smokers who registered on a web-based smoking cessation program and completed surveys at baseline and 3 months. Saliva sampling kits were sent to participants who reported 7-day abstinence at 3 months, and analyzed for cotinine. The response rate at 3 months was 41.2% (n=247): 93 participants reported 7-day abstinence (38%) and were mailed a saliva kit (71% returned). The discordance rate was 36.4%. Participants with discordant responses were more likely to report 3-month use of nicotine replacement therapy or e-cigarettes than those with concordant responses (79.2% vs. 45.2%, p=0.007). The total cost of saliva sampling was $8280 ($125/sample). Biochemical verification was both time- and cost-intensive, and yielded a relatively small number of samples due to low response rates and use of other nicotine products during the follow-up period. There was a high rate of discordance of self-reported abstinence and saliva testing. Costs for data collection may be prohibitive for studies with large sample sizes or limited budgets. Our findings echo previous statements that biochemical verification is not necessary in population-based studies, and add evidence specific to technology-based studies. Copyright © 2017 Elsevier Ltd. All rights reserved.
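A quick arithmetic sketch (values taken from the abstract, rounding approximate) reproduces the reported per-sample cost:

```python
# Rough arithmetic check of the cost figures reported above (values taken from
# the abstract; rounding is approximate).
mailed_kits = 93
return_rate = 0.71
total_cost_usd = 8280

returned_kits = round(mailed_kits * return_rate)   # ~66 analyzed samples
cost_per_sample = total_cost_usd / returned_kits
print(returned_kits, round(cost_per_sample))        # 66, ~125 USD per sample
```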
Eligibility Guidance/Free and Reduced Price Policy Handbook.
ERIC Educational Resources Information Center
New York State Education Dept., Albany. Bureau of School Food Management and Nutrition.
This handbook is designed to serve as a resource guide to New York school district officials who are involved in the application approval, hearing, and verification processes for the National School Lunch and School Breakfast Programs, and for those exercising the free milk option of the Special Milk Program. Detailed information and clarification…
Task Listings Resulting from the Vocational Competency Measures Project. Memorandum Report.
ERIC Educational Resources Information Center
American Institutes for Research in the Behavioral Sciences, Palo Alto, CA.
This memorandum report consists of 14 task listings resulting from the Vocational Competency Measures Project. (The Vocational Competency Measures Project was a test development project that involved the writing and verification of task listings for 14 vocational occupational areas through over 225 interviews conducted in 27 states.) Provided in…
Software architecture standard for simulation virtual machine, version 2.0
NASA Technical Reports Server (NTRS)
Sturtevant, Robert; Wessale, William
1994-01-01
The Simulation Virtual Machine (SVM) is an Ada architecture which eases the effort involved in real-time software maintenance and sustaining engineering. The Software Architecture Standard defines the infrastructure from which all the simulation models are built. SVM was developed for and used in the Space Station Verification and Training Facility.
Studies in support of an SNM cutoff agreement: The PUREX exercise
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stanbro, W.D.; Libby, R.; Segal, J.
1995-07-01
On September 23, 1993, President Clinton, in a speech before the United Nations General Assembly, called for an international agreement banning the production of plutonium and highly enriched uranium for nuclear explosive purposes. A major element of any verification regime for such an agreement would probably involve inspections of reprocessing plants in Nuclear Nonproliferation Treaty weapons states. Many of these are large facilities built in the 1950s with no thought that they would be subject to international inspection. To learn about some of the problems that might be involved in the inspection of such large, old facilities, the Department of Energy, Office of Arms Control and Nonproliferation, sponsored a mock inspection exercise at the PUREX plant on the Hanford Site. This exercise examined a series of alternatives for inspections of the PUREX as a model for this type of facility at other locations. A series of conclusions were developed that can be used to guide the development of verification regimes for a cutoff agreement at reprocessing facilities.
Additional confirmation of the validity of laboratory simulation of cloud radiances
NASA Technical Reports Server (NTRS)
Davis, J. M.; Cox, S. K.
1986-01-01
The results of a laboratory experiment are presented that provide additional verification of the methodology adopted for simulation of the radiances reflected from fields of optically thick clouds using the Cloud Field Optical Simulator (CFOS) at Colorado State University. The comparison of these data with their theoretically derived counterparts indicates that the crucial mechanism of cloud-to-cloud radiance field interaction is accurately simulated in the CFOS experiments and adds confidence to the manner in which the optical depth is scaled.
BASIC BIOCHEMICAL AND CLINICAL ASPECTS OF NONINVASIVE TESTS HELIC.
Dmitrienko, M A; Dmitrienko, V S; Kornienko, E A; Parolova, N I; Colomina, E O; Aronov, E B
The biochemical process that lies at the core of non-invasive detection of Helicobacter pylori with the help of the HELIC Ammonia breath test, manufactured by AMA Co Ltd., St. Petersburg, is shown. Patents from various countries, describing ammonia as an H. pylori diagnostic marker, are reviewed. Approaches for evaluating the efficacy of the test system are analyzed, and validation and verification data are provided. High diagnostic characteristics are confirmed by the results of comparative studies on patients of different age groups, reaching 97% sensitivity and 96% specificity.
NASA Astrophysics Data System (ADS)
Miyahara, M.; Furuta, M.; Takekawa, T.; Oda, S.; Koshikawa, T.; Akiba, T.; Mori, T.; Mimura, T.; Sawada, C.; Yamaguchi, T.; Nishioka, S.; Tada, M.
2009-07-01
An irradiation detection method using the difference in the radiation sensitivity of heat-treated microorganisms was developed as one of the microbiological detection methods for irradiated foods. This detection method is based on the difference in the viable cell count before and after heat treatment (70 °C for 10 min). The method was verified in a collaborative blind trial conducted by nine inspecting agencies in Japan. The samples used for this trial were five kinds of spices consisting of non-irradiated, 5 kGy irradiated, and 7 kGy irradiated black pepper, allspice, oregano, sage, and paprika, respectively. As a result of this collaboration, a high percentage (80%) of correct answers was obtained for irradiated black pepper and allspice. However, the method was less successful for irradiated oregano, sage, and paprika. It might be possible to use this detection method for preliminary screening of irradiated foods, but further work is necessary to confirm these findings.
Ogle, Stephen; Davis, Kenneth J.; Lauvaux, Thomas; ...
2015-03-10
Verifying national greenhouse gas (GHG) emissions inventories is a critical step to ensure that reported emissions data to the United Nations Framework Convention on Climate Change (UNFCCC) are accurate and representative of a country’s contribution to GHG concentrations in the atmosphere. Verification could include a variety of evidence, but arguably the most convincing verification would be confirmation of a change in GHG concentrations in the atmosphere that is consistent with reported emissions to the UNFCCC. We report here on a case study evaluating this option based on a prototype atmospheric CO2 measurement network deployed in the Mid-Continent Region of the conterminous United States. We found that the atmospheric CO2 measurement data did verify the accuracy of the emissions inventory within the confidence limits of the emissions estimates, suggesting that this technology could be further developed and deployed more widely in the future for verifying reported emissions.
NASA Astrophysics Data System (ADS)
Sakamoto, Yasuaki; Kashiwagi, Takayuki; Hasegawa, Hitoshi; Sasakawa, Takashi; Fujii, Nobuo
This paper describes the design considerations and experimental verification of an LIM rail brake armature. In order to generate power and maximize the braking force density despite the limited area between the armature and the rail and the limited space available for installation, we studied a design method that is suitable for designing an LIM rail brake armature; we considered adoption of a ring winding structure. To examine the validity of the proposed design method, we developed a prototype ring winding armature for the rail brakes and examined its electromagnetic characteristics in a dynamic test system with roller rigs. By repeating various tests, we confirmed that unnecessary magnetic field components, which were expected to be present under high speed running condition or when a ring winding armature was used, were not present. Further, the necessary magnetic field component and braking force attained the desired values. These studies have helped us to develop a basic design method that is suitable for designing the LIM rail brake armatures.
NASA Technical Reports Server (NTRS)
Pujar, Vijay V.; Cawley, James D.; Levine, S. (Technical Monitor)
2000-01-01
Earlier results from computer simulation studies suggest a correlation between the spatial distribution of stacking errors in the Beta-SiC structure and features observed in X-ray diffraction patterns of the material. Reported here are experimental results obtained from two types of nominally Beta-SiC specimens, which yield distinct XRD data. These samples were analyzed using high resolution transmission electron microscopy (HRTEM) and the stacking error distribution was directly determined. The HRTEM results compare well to those deduced by matching the XRD data with simulated spectra, confirming the hypothesis that the XRD data is indicative not only of the presence and density of stacking errors, but also that it can yield information regarding their distribution. In addition, the stacking error population in both specimens is related to their synthesis conditions and it appears that it is similar to the relation developed by others to explain the formation of the corresponding polytypes.
Experimental Validation of L1 Adaptive Control: Rohrs' Counterexample in Flight
NASA Technical Reports Server (NTRS)
Xargay, Enric; Hovakimyan, Naira; Dobrokhodov, Vladimir; Kaminer, Issac; Kitsios, Ioannis; Cao, Chengyu; Gregory, Irene M.; Valavani, Lena
2010-01-01
The paper presents new results on the verification and in-flight validation of an L1 adaptive flight control system, and proposes a general methodology for verification and validation of adaptive flight control algorithms. The proposed framework is based on Rohrs counterexample, a benchmark problem presented in the early 80s to show the limitations of adaptive controllers developed at that time. In this paper, the framework is used to evaluate the performance and robustness characteristics of an L1 adaptive control augmentation loop implemented onboard a small unmanned aerial vehicle. Hardware-in-the-loop simulations and flight test results confirm the ability of the L1 adaptive controller to maintain stability and predictable performance of the closed loop adaptive system in the presence of general (artificially injected) unmodeled dynamics. The results demonstrate the advantages of L1 adaptive control as a verifiable robust adaptive control architecture with the potential of reducing flight control design costs and facilitating the transition of adaptive control into advanced flight control systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
King, David A.
The U.S. Department of Energy (DOE) Oak Ridge Office of Environmental Management selected Oak Ridge Associated Universities (ORAU), through the Oak Ridge Institute for Science and Education (ORISE) contract, to perform independent verification (IV) at Zone 2 of the East Tennessee Technology Park (ETTP) in Oak Ridge, Tennessee. ORAU has concluded IV surveys, per the project-specific plan (PSP) (ORAU 2013a) covering exposure units (EUs) Z2-24, -31, -32, and -36. The objective of this effort was to verify the target EUs comply with requirements in the Zone 2 Record of Decision (ROD) (DOE 2005), as implemented by using the dynamic verification strategy presented in the dynamic work plan (DWP) (BJC 2007); and confirm commitments in the DWP were adequately implemented, as verified via IV surveys and soil sampling.
[Does action semantic knowledge influence mental simulation in sentence comprehension?].
Mochizuki, Masaya; Naito, Katsuo
2012-04-01
This research investigated whether action semantic knowledge influences mental simulation during sentence comprehension. In Experiment 1, we confirmed that the words of face-related objects include the perceptual knowledge about the actions that bring the object to the face. In Experiment 2, we used an acceptability judgment task and a word-picture verification task to compare the perceptual information that is activated by the comprehension of sentences describing an action using face-related objects near the face (near-sentence) or far from the face (far-sentence). Results showed that participants took a longer time to judge the acceptability of the far-sentence than the near-sentence. Verification times were significantly faster when the actions in the pictures matched the action described in the sentences than when they were mismatched. These findings suggest that action semantic knowledge influences sentence processing, and that perceptual information corresponding to the content of the sentence is activated regardless of the action semantic knowledge at the end of the sentence processing.
Applying fault tree analysis to the prevention of wrong-site surgery.
Abecassis, Zachary A; McElroy, Lisa M; Patel, Ronak M; Khorzad, Rebeca; Carroll, Charles; Mehrotra, Sanjay
2015-01-01
Wrong-site surgery (WSS) is a rare event that occurs to hundreds of patients each year. Despite national implementation of the Universal Protocol over the past decade, development of effective interventions remains a challenge. We performed a systematic review of the literature reporting root causes of WSS and used the results to perform a fault tree analysis to assess the reliability of the system in preventing WSS and identifying high-priority targets for interventions aimed at reducing WSS. Process components where a single error could result in WSS were labeled with OR gates; process aspects reinforced by verification were labeled with AND gates. The overall redundancy of the system was evaluated based on prevalence of AND gates and OR gates. In total, 37 studies described risk factors for WSS. The fault tree contains 35 faults, most of which fall into five main categories. Despite the Universal Protocol mandating patient verification, surgical site signing, and a brief time-out, a large proportion of the process relies on human transcription and verification. Fault tree analysis provides a standardized perspective of errors or faults within the system of surgical scheduling and site confirmation. It can be adapted by institutions or specialties to lead to more targeted interventions to increase redundancy and reliability within the preoperative process. Copyright © 2015 Elsevier Inc. All rights reserved.
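As a hedged illustration of how AND and OR gates combine in such an analysis, the sketch below evaluates a toy top-event probability; the event names and probabilities are invented and are not drawn from the review.

```python
# Minimal sketch of fault tree evaluation with OR gates (any single error
# suffices) and AND gates (an error must also slip past a verification step).
# Event names and probabilities are invented for illustration only.
def or_gate(*probs):        # assumes independent basic events
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def and_gate(*probs):       # assumes independent basic events
    q = 1.0
    for p in probs:
        q *= p
    return q

scheduling_error = 1e-3                     # single-point failure -> OR branch
site_marking = and_gate(1e-3, 1e-2)         # error AND missed verification
timeout_check = and_gate(1e-3, 5e-2)        # error AND ineffective time-out
p_wss = or_gate(scheduling_error, site_marking, timeout_check)
print(f"Top-event probability per case: {p_wss:.2e}")
```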
Gorlin-Goltz syndrome: incidental finding on routine ct scan following car accident
2009-01-01
Introduction Gorlin-Goltz syndrome is a rare hereditary disease. Pathogenesis of the syndrome is attributed to abnormalities in the long arm of chromosome 9 (q22.3-q31) and loss or mutations of the human patched gene (PTCH1). Multiple basal cell carcinomas (BCCs), odontogenic keratocysts, skeletal abnormalities, hyperkeratosis of palms and soles, intracranial ectopic calcifications of the falx cerebri and facial dysmorphism are considered the main clinical features. Diagnosis is based upon established major and minor clinical and radiological criteria and ideally confirmed by DNA analysis. Because of the different systems affected, a multidisciplinary team of various experts is required for successful management. Case presentation We report the case of a 19-year-old female who was involved in a car accident and was found to present imaging findings of Gorlin-Goltz syndrome on a routine whole-body computed tomography (CT) scan performed to exclude traumatic injuries. Conclusion Radiologic findings of the syndrome are easily identifiable on CT scans and may prompt early verification of the disease, which is very important for regular follow-up and better survival rates from the co-existent diseases. PMID:20062724
Perspective: Surface freezing in water: A nexus of experiments and simulations
NASA Astrophysics Data System (ADS)
Haji-Akbari, Amir; Debenedetti, Pablo G.
2017-08-01
Surface freezing is a phenomenon in which crystallization is enhanced at a vapor-liquid interface. In some systems, such as n-alkanes, this enhancement is dramatic and results in the formation of a crystalline layer at the free interface even at temperatures slightly above the equilibrium bulk freezing temperature. There are, however, systems in which the enhancement is purely kinetic and only involves faster nucleation at or near the interface. The first, thermodynamic, type of surface freezing is easier to confirm in experiments, requiring only the verification of the existence of crystalline order at the interface. The second, kinetic, type of surface freezing is far more difficult to prove experimentally. One material that is suspected of undergoing the second type of surface freezing is liquid water. Despite strong indications that the freezing of liquid water is kinetically enhanced at vapor-liquid interfaces, the findings are far from conclusive, and the topic remains controversial. In this perspective, we present a simple thermodynamic framework to understand conceptually and distinguish these two types of surface freezing. We then briefly survey fifteen years of experimental and computational work aimed at elucidating the surface freezing conundrum in water.
[Infrastructure and contents of clinical data management plan].
Shen, Tong; Xu, Lie-dong; Fu, Hai-jun; Liu, Yan; He, Jia; Chen, Ping-yan; Song, Yu-fei
2015-11-01
Establishment of a quality management system (QMS) plays a critical role in clinical data management (CDM). The objectives of CDM are to ensure the quality and integrity of the trial data. Thus, every stage or element that may impact the quality outcomes of clinical studies should be kept under controlled conditions; this covers the full life cycle of CDM, from data collection and handling to the statistical analysis of trial data. Based on the QMS, this paper provides consensus on how to develop a compliant clinical data management plan (CDMP). According to the essential requirements of CDM, the CDMP should encompass each process of data collection, data capture and cleaning, medical coding, data verification and reconciliation, database monitoring and management, external data transmission and integration, data documentation, data quality assurance, and so on. Creating and following the data management plan at each designed data management step, and dynamically recording the systems used, actions taken, and parties involved, will build and confirm regulated data management processes, standard operating procedures, and effective quality metrics in all data management activities. The CDMP is one of the most important data management documents and is the solid foundation for clinical data quality.
Verification and Validation of Digitally Upgraded Control Rooms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boring, Ronald; Lau, Nathan
2015-09-01
As nuclear power plants undertake main control room modernization, a challenge is the lack of a clearly defined human factors process to follow. Verification and validation (V&V) as applied in the nuclear power community has tended to involve efforts such as integrated system validation, which comes at the tail end of the design stage. To fill in guidance gaps and create a step-by-step process for control room modernization, we have developed the Guideline for Operational Nuclear Usability and Knowledge Elicitation (GONUKE). This approach builds on best practices in the software industry, which prescribe an iterative user-centered approach featuring multiple cycles of design and evaluation. Nuclear regulatory guidance for control room design emphasizes summative evaluation—which occurs after the design is complete. In the GONUKE approach, evaluation is also performed at the formative stage of design—early in the design cycle using mockups and prototypes for evaluation. The evaluation may involve expert review (e.g., software heuristic evaluation at the formative stage and design verification against human factors standards like NUREG-0700 at the summative stage). The evaluation may also involve user testing (e.g., usability testing at the formative stage and integrated system validation at the summative stage). An additional, often overlooked component of evaluation is knowledge elicitation, which captures operator insights into the system. In this report we outline these evaluation types across design phases that support the overall modernization process. The objective is to provide industry-suitable guidance for steps to be taken in support of the design and evaluation of a new human-machine interface (HMI) in the control room. We suggest the value of early-stage V&V and highlight how this early-stage V&V can help improve the design process for control room modernization. We argue that there is a need to overcome two shortcomings of V&V in current practice—the propensity for late-stage V&V and the use of increasingly complex psychological assessment measures for V&V.
Central mechanisms for force and motion--towards computational synthesis of human movement.
Hemami, Hooshang; Dariush, Behzad
2012-12-01
Anatomical, physiological and experimental research on the human body can be supplemented by computational synthesis of the human body for all movement: routine daily activities, sports, dancing, and artistic and exploratory involvements. The synthesis requires thorough knowledge about all subsystems of the human body and their interactions, and allows for integration of known knowledge in working modules. It also affords confirmation and/or verification of scientific hypotheses about workings of the central nervous system (CNS). A simple step in this direction is explored here for controlling the forces of constraint. It requires co-activation of agonist-antagonist musculature. The desired trajectories of motion and the force of contact have to be provided by the CNS. The spinal control involves projection onto a muscular subset that induces the force of contact. The projection of force in the sensory motor cortex is implemented via a well-defined neural population unit, and is executed in the spinal cord by a standard integral controller requiring input from tendon organs. The sensory motor cortex structure is extended to the case for directing motion via two neural population units with vision input and spindle efferents. Digital computer simulations show the feasibility of the system. The formulation is modular and can be extended to multi-link limbs, robot and humanoid systems with many pairs of actuators or muscles. It can be expanded to include reticular activating structures and learning. Copyright © 2012 Elsevier Ltd. All rights reserved.
Experimental verification of multipartite entanglement in quantum networks
McCutcheon, W.; Pappa, A.; Bell, B. A.; McMillan, A.; Chailloux, A.; Lawson, T.; Mafu, M.; Markham, D.; Diamanti, E.; Kerenidis, I.; Rarity, J. G.; Tame, M. S.
2016-01-01
Multipartite entangled states are a fundamental resource for a wide range of quantum information processing tasks. In particular, in quantum networks, it is essential for the parties involved to be able to verify if entanglement is present before they carry out a given distributed task. Here we design and experimentally demonstrate a protocol that allows any party in a network to check if a source is distributing a genuinely multipartite entangled state, even in the presence of untrusted parties. The protocol remains secure against dishonest behaviour of the source and other parties, including the use of system imperfections to their advantage. We demonstrate the verification protocol in a three- and four-party setting using polarization-entangled photons, highlighting its potential for realistic photonic quantum communication and networking applications. PMID:27827361
Prediction Interval Development for Wind-Tunnel Balance Check-Loading
NASA Technical Reports Server (NTRS)
Landman, Drew; Toro, Kenneth G.; Commo, Sean A.; Lynn, Keith C.
2014-01-01
Results from the Facility Analysis Verification and Operational Reliability project revealed a critical gap in capability in ground-based aeronautics research applications. Without a standardized process for check-loading the wind-tunnel balance or the model system, the quality of the aerodynamic force data collected varied significantly between facilities. A prediction interval is required in order to confirm a check-loading. The prediction interval provides an expected upper and lower bound on balance load prediction at a given confidence level. A method has been developed which accounts for sources of variability due to calibration and check-load application. The prediction interval method of calculation and a case study demonstrating its use are provided. Validation of the methods is demonstrated for the case study based on the probability of capture of confirmation points.
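For illustration, the sketch below computes a textbook ordinary-least-squares prediction interval for a check-load point; it is not the project's specific formulation, and the calibration data are synthetic.

```python
# Sketch of a standard OLS prediction interval, as one way to bound an expected
# balance response during check-loading. Calibration data below are synthetic;
# this is not the project's formulation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = np.linspace(0, 100, 20)                     # applied calibration loads
y = 0.05 * x + rng.normal(0, 0.02, x.size)      # synthetic balance output

n = x.size
b1, b0 = np.polyfit(x, y, 1)                    # slope, intercept
resid = y - (b0 + b1 * x)
s = np.sqrt(np.sum(resid**2) / (n - 2))
x0 = 55.0                                       # check-load point
se = s * np.sqrt(1 + 1/n + (x0 - x.mean())**2 / np.sum((x - x.mean())**2))
t = stats.t.ppf(0.975, n - 2)                   # 95% two-sided interval
y0 = b0 + b1 * x0
print(f"predicted {y0:.4f}, 95% PI [{y0 - t*se:.4f}, {y0 + t*se:.4f}]")
```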
Engineering support activities for the Apollo 17 Surface Electrical Properties Experiment.
NASA Technical Reports Server (NTRS)
Cubley, H. D.
1972-01-01
Description of the engineering support activities which were required to ensure fulfillment of objectives specified for the Apollo 17 SEP (Surface Electrical Properties) Experiment. Attention is given to procedural steps involving verification of hardware acceptability to the astronauts, computer simulation of the experiment hardware, field trials, receiver antenna pattern measurements, and the qualification test program.
Code of Federal Regulations, 2011 CFR
2011-07-01
... certificate or a Safety Management Certificate; (3) Periodic audits including— (i) An annual verification... safety management audit and when is it required to be completed? 96.320 Section 96.320 Navigation and... SAFE OPERATION OF VESSELS AND SAFETY MANAGEMENT SYSTEMS How Will Safety Management Systems Be...
A proposed standard method for polarimetric calibration and calibration verification
NASA Astrophysics Data System (ADS)
Persons, Christopher M.; Jones, Michael W.; Farlow, Craig A.; Morell, L. Denise; Gulley, Michael G.; Spradley, Kevin D.
2007-09-01
Accurate calibration of polarimetric sensors is critical to reducing and analyzing phenomenology data, producing uniform polarimetric imagery for deployable sensors, and ensuring predictable performance of polarimetric algorithms. It is desirable to develop a standard calibration method, including verification reporting, in order to increase credibility with customers and foster communication and understanding within the polarimetric community. This paper seeks to facilitate discussions within the community on arriving at such standards. Both the calibration and verification methods presented here are performed easily with common polarimetric equipment, and are applicable to visible and infrared systems with either partial Stokes or full Stokes sensitivity. The calibration procedure has been used on infrared and visible polarimetric imagers over a six year period, and resulting imagery has been presented previously at conferences and workshops. The proposed calibration method involves the familiar calculation of the polarimetric data reduction matrix by measuring the polarimeter's response to a set of input Stokes vectors. With this method, however, linear combinations of Stokes vectors are used to generate highly accurate input states. This allows the direct measurement of all system effects, in contrast with fitting modeled calibration parameters to measured data. This direct measurement of the data reduction matrix allows higher order effects that are difficult to model to be discovered and corrected for in calibration. This paper begins with a detailed tutorial on the proposed calibration and verification reporting methods. Example results are then presented for a LWIR rotating half-wave retarder polarimeter.
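A minimal sketch of the underlying calibration algebra is given below: the polarimeter's response to a set of known input Stokes vectors is measured, and the data reduction matrix is taken as the pseudo-inverse of the estimated system matrix. The system matrix and input states here are hypothetical, not the authors' hardware or their specific linear-combination procedure.

```python
# Hedged sketch of the basic calibration algebra: measure responses to known
# input Stokes vectors W (columns), estimate the system matrix, and take the
# data reduction matrix as its pseudo-inverse so that S = M @ p recovers the
# input Stokes vector from a measurement vector p. All values are hypothetical.
import numpy as np

# Known input Stokes vectors (columns): unpolarized, 0 deg, 45 deg, circular
W = np.array([[1, 1, 1, 1],
              [0, 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)

A_true = np.random.default_rng(1).normal(size=(4, 4))   # unknown system matrix
P = A_true @ W                                           # measured responses

A_est = P @ np.linalg.pinv(W)          # estimated system matrix
M_reduction = np.linalg.pinv(A_est)    # data reduction matrix

test_state = np.array([1, 0.3, 0.1, 0.0])
print(np.allclose(M_reduction @ (A_true @ test_state), test_state))  # True
```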
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cherpak, Amanda
Purpose: The Octavius 1000 SRS detector was commissioned in December 2014 and is used routinely for verification of all SRS and SBRT plans. Results of verifications were analyzed to assess trends and limitations of the device and planning methods. Methods: Plans were delivered using a True Beam STx and results were evaluated using gamma analysis (95%, 3%/3mm) and absolute dose difference (5%). Verification results were analyzed based on several plan parameters including tumour volume, degree of modulation and prescribed dose. Results: During a 12 month period, a total of 124 patient plans were verified using the Octavius detector. Thirteen plans failed the gamma criteria, while 7 plans failed based on the absolute dose difference. When binned according to degree of modulation, a significant correlation was found between MU/cGy and both mean dose difference (r=0.78, p<0.05) and gamma (r=−0.60, p<0.05). When data were binned according to tumour volume, the standard deviation of average gamma dropped from 2.2%–3.7% for volumes less than 30 cm³ to below 1% for volumes greater than 30 cm³. Conclusions: The majority of plans and verification failures involved tumour volumes smaller than 30 cm³. This was expected due to the nature of disease treated with SBRT and SRS techniques and did not increase the rate of failure. Correlations found with MU/cGy indicate that as modulation increased, results deteriorated but not beyond the previously set thresholds.
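For context, the correlation statistics quoted above are of the kind sketched below (Pearson r between plan modulation and mean dose difference); the arrays are invented stand-ins, not the clinic's verification records.

```python
# Small sketch of the correlation analysis reported above: Pearson r (and
# p-value) between plan modulation (MU/cGy) and mean dose difference.
# The arrays below are invented stand-ins for illustration only.
import numpy as np
from scipy import stats

mu_per_cgy = np.array([2.1, 2.8, 3.5, 4.0, 4.6, 5.2, 5.9, 6.3])
mean_dose_diff = np.array([0.5, 0.9, 1.2, 1.8, 2.1, 2.6, 3.0, 3.4])  # percent

r, p = stats.pearsonr(mu_per_cgy, mean_dose_diff)
print(f"r = {r:.2f}, p = {p:.3f}")
```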
A simple method for verifying the deployment of the TOMS-EP solar arrays
NASA Technical Reports Server (NTRS)
Koppersmith, James R.; Ketchum, Eleanor
1995-01-01
The Total Ozone Mapping Spectrometer-Earth Probe (TOMS-EP) mission relies upon a successful deployment of the spacecraft's solar arrays. Several methods of verification are being employed to ascertain the solar array deployment status, with each requiring differing amounts of data. This paper describes a robust attitude-independent verification method that utilizes telemetry from the coarse Sun sensors (CSS's) and the three-axis magnetometers (TAM's) to determine the solar array deployment status - and it can do so with only a few, not necessarily contiguous, points of data. The method developed assumes that the solar arrays are deployed. Telemetry data from the CSS and TAM are converted to the Sun and magnetic field vectors in spacecraft body coordinates, and the angle between them is calculated. Deployment is indicated if this angle is within a certain error tolerance of the angle between the reference Sun and magnetic field vectors. Although several other methods can indicate a non-deployed state, with this method there is a 70% confidence level in confirming deployment as well as a nearly 100% certainty in confirming a non-deployed state. In addition, the spacecraft attitude (which is not known during the first orbit after launch) is not needed for this algorithm because the angle between the Sun and magnetic field vectors is independent of the spacecraft attitude. This technique can be applied to any spacecraft with a TAM and with CSS's mounted on the solar array(s).
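A minimal sketch of the attitude-independent check, assuming hypothetical unit vectors, compares the Sun-to-field angle measured in the body frame with the same angle computed from reference models:

```python
# Minimal sketch of the attitude-independent check described above: compare the
# angle between the measured Sun and magnetic-field vectors (body frame) with
# the angle between the reference vectors (inertial frame). All vectors and the
# tolerance are hypothetical placeholders, not mission data.
import numpy as np

def angle_deg(u, v):
    u, v = np.asarray(u, float), np.asarray(v, float)
    c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

sun_body = [1.0, 0.0, 0.0]          # from CSS telemetry (hypothetical)
mag_body = [0.0, 0.55, 0.84]        # from TAM telemetry (hypothetical)
sun_ref  = [0.0, 1.0, 0.0]          # reference Sun vector (ephemeris)
mag_ref  = [0.55, 0.0, 0.84]        # reference field vector (geomagnetic model)

tolerance = 5.0                      # deg, assumed error budget
deployed = abs(angle_deg(sun_body, mag_body) - angle_deg(sun_ref, mag_ref)) < tolerance
print("deployment indicated" if deployed else "inconclusive / not deployed")
```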
Nakamura, Sayaka; Sato, Hiroaki; Tanaka, Reiko; Yaguchi, Takashi
2016-01-01
We have previously proposed a rapid identification method for bacterial strains based on the profiles of their ribosomal subunit proteins (RSPs), observed using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). This method can perform phylogenetic characterization based on the mass of housekeeping RSP biomarkers, ideally calculated from amino acid sequence information registered in public protein databases. With the aim of extending its field of application to medical mycology, this study investigates the actual state of information of RSPs of eukaryotic fungi registered in public protein databases through the characterization of ribosomal protein fractions extracted from genome-sequenced Aspergillus fumigatus strains Af293 and A1163 as a model. In this process, we have found that the public protein databases harbor problems. The RSP names are in confusion, so we have provisionally unified them using the yeast naming system. The most serious problem is that many incorrect sequences are registered in the public protein databases. Surprisingly, more than half of the sequences are incorrect, due chiefly to mis-annotation of exon/intron structures. These errors could be corrected by a combination of in silico inspection by sequence homology analysis and MALDI-TOF MS measurements. We were also able to confirm conserved post-translational modifications in eleven RSPs. After these verifications, the masses of 31 expressed RSPs under 20,000 Da could be accurately confirmed. These RSPs have a potential to be useful biomarkers for identifying clinical isolates of A. fumigatus .
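As a hedged illustration of the biomarker mass calculation, the sketch below sums average residue masses over an amino-acid sequence and adds one water; the residue masses are rounded, post-translational modifications are ignored, and the example sequence is made up rather than an A. fumigatus ribosomal protein.

```python
# Hedged sketch of the mass calculation underlying RSP biomarkers: sum average
# residue masses over the sequence and add one water. Post-translational
# modifications (e.g., N-terminal methionine loss, methylation) would have to
# be added separately; the short peptide below is a made-up example.
AVG_RESIDUE_MASS = {
    "G": 57.05, "A": 71.08, "S": 87.08, "P": 97.12, "V": 99.13,
    "T": 101.10, "C": 103.14, "L": 113.16, "I": 113.16, "N": 114.10,
    "D": 115.09, "Q": 128.13, "K": 128.17, "E": 129.12, "M": 131.19,
    "H": 137.14, "F": 147.18, "R": 156.19, "Y": 163.18, "W": 186.21,
}
WATER = 18.02

def average_mass(sequence):
    return sum(AVG_RESIDUE_MASS[aa] for aa in sequence.upper()) + WATER

print(round(average_mass("MKTAYIAKQR"), 1))  # hypothetical peptide mass in Da
```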
Quantitative ultrasonic evaluation of engineering properties in metals, composites and ceramics
NASA Technical Reports Server (NTRS)
Vary, A.
1980-01-01
Ultrasonic technology from the perspective of nondestructive evaluation approaches to material strength prediction and property verification is reviewed. Emergent advanced technology involving quantitative ultrasonic techniques for materials characterization is described. Ultrasonic methods are particularly useful in this area because they involve mechanical elastic waves that are strongly modulated by the same morphological factors that govern mechanical strength and dynamic failure processes. It is emphasized that the technology is in its infancy and that much effort is still required before all the available techniques can be transferred from laboratory to industrial environments.
NASA Technical Reports Server (NTRS)
Oishi, Meeko; Tomlin, Claire; Degani, Asaf
2003-01-01
Human interaction with complex hybrid systems involves the user, the automation's discrete mode logic, and the underlying continuous dynamics of the physical system. Often the user-interface of such systems displays a reduced set of information about the entire system. In safety-critical systems, how can we identify user-interface designs which do not have adequate information, or which may confuse the user? Here we describe a methodology, based on hybrid system analysis, to verify that a user-interface contains information necessary to safely complete a desired procedure or task. Verification within a hybrid framework allows us to account for the continuous dynamics underlying the simple, discrete representations displayed to the user. We provide two examples: a car traveling through a yellow light at an intersection and an aircraft autopilot in a landing/go-around maneuver. The examples demonstrate the general nature of this methodology, which is applicable to hybrid systems (not fully automated) which have operational constraints we can pose in terms of safety. This methodology differs from existing work in hybrid system verification in that we directly account for the user's interactions with the system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moran, B.; Stern, W.; Colley, J.
International Atomic Energy Agency (IAEA) safeguards involves verification activities at a wide range of facilities in a variety of operational phases (e.g., under construction, start-up, operating, shutdown, closed-down, and decommissioned). Safeguards optimization for each different facility type and operational phase is essential for the effectiveness of safeguards implementation. The IAEA’s current guidance regarding safeguards for the different facility types in the various lifecycle phases is provided in its Design Information Examination (DIE) and Verification (DIV) procedure. Greater efficiency in safeguarding facilities that are shut down or closed down, including those being decommissioned, could allow the IAEA to use a greater portion of its effort to conduct other verification activities. Consequently, the National Nuclear Security Administration’s Office of International Nuclear Safeguards sponsored this study to evaluate whether there is an opportunity to optimize safeguards approaches for facilities that are shut down or closed down. The purpose of this paper is to examine existing safeguards approaches for shutdown and closed-down facilities, including facilities being decommissioned, and to seek to identify whether they may be optimized.
Supersonic gas-liquid cleaning system
NASA Technical Reports Server (NTRS)
Caimi, Raoul E. B.; Thaxton, Eric A.
1994-01-01
A system to perform cleaning and cleanliness verification is being developed to replace solvent flush methods using CFC 113 for fluid system components. The system is designed for two purposes: internal and external cleaning and verification. External cleaning is performed with the nozzle mounted at the end of a wand similar to a conventional pressure washer. Internal cleaning is performed with a variety of fixtures designed for specific applications. Internal cleaning includes tubes, pipes, flex hoses, and active fluid components such as valves and regulators. The system uses gas-liquid supersonic nozzles to generate high impingement velocities at the surface of the object to be cleaned. Compressed air or any inert gas may be used to provide the conveying medium for the liquid. The converging-diverging nozzles accelerate the gas-liquid mixture to supersonic velocities. The liquid being accelerated may be any solvent including water. This system may be used commercially to replace CFC and other solvent cleaning methods widely used to remove dust, dirt, flux, and lubricants. In addition, cleanliness verification can be performed without the solvents which are typically involved. This paper will present the technical details of the system, the results achieved during testing at KSC, and future applications for this system.
Monitoring/Verification using DMS: TATP Example
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stephan Weeks, Kevin Kyle, Manuel Manard
Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations-management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a “smart dust” sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. Fast GC is the leading field analytical method for gas phase separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15–300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-rad ionization source, for peroxide-based explosive measurements.
Monitoring/Verification Using DMS: TATP Example
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kevin Kyle; Stephan Weeks
Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations-management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a “smart dust” sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. GC is the leading analytical method for the separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15–300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-rad ionization source, for peroxide-based explosive measurements.
Supersonic gas-liquid cleaning system
NASA Astrophysics Data System (ADS)
Caimi, Raoul E. B.; Thaxton, Eric A.
1994-02-01
A system to perform cleaning and cleanliness verification is being developed to replace solvent flush methods using CFC 113 for fluid system components. The system is designed for two purposes: internal and external cleaning and verification. External cleaning is performed with the nozzle mounted at the end of a wand similar to a conventional pressure washer. Internal cleaning is performed with a variety of fixtures designed for specific applications. Internal cleaning includes tubes, pipes, flex hoses, and active fluid components such as valves and regulators. The system uses gas-liquid supersonic nozzles to generate high impingement velocities at the surface of the object to be cleaned. Compressed air or any inert gas may be used to provide the conveying medium for the liquid. The converging-diverging nozzles accelerate the gas-liquid mixture to supersonic velocities. The liquid being accelerated may be any solvent including water. This system may be used commercially to replace CFC and other solvent cleaning methods widely used to remove dust, dirt, flux, and lubricants. In addition, cleanliness verification can be performed without the solvents which are typically involved. This paper will present the technical details of the system, the results achieved during testing at KSC, and future applications for this system.
30 CFR 250.913 - When must I resubmit Platform Verification Program plans?
Code of Federal Regulations, 2010 CFR
2010-07-01
... Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication verification, or installation verification...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sacuta, Norm; Young, Aleana; Worth, Kyle
2015-12-22
The IEAGHG Weyburn-Midale CO₂ Monitoring and Storage Project (WMP) began in 2000 with the first four years of research that confirmed the suitability of the containment complex of the Weyburn oil field in southeastern Saskatchewan as a storage location for CO₂ injected as part of enhanced oil recovery (EOR) operations. The first half of this report covers research conducted from 2010 to 2012, under the funding of the United States Department of Energy (contract DEFE0002697), the Government of Canada, and various other governmental and industry sponsors. The work includes more in-depth analysis of various components of a measurement, monitoring and verification (MMV) program through investigation of data on site characterization and geological integrity, wellbore integrity, storage monitoring (geophysical and geochemical), and performance/risk assessment. These results then led to the development of a Best Practices Manual (BPM) providing oilfield and project operators with guidance on CO₂ storage and CO₂-EOR. In 2013, the USDOE and Government of Saskatchewan exercised an optional phase of the same project to further develop and deploy applied research tools, technologies, and methodologies to the data and research at Weyburn with the aim of assisting regulators and operators in transitioning CO₂-EOR operations into permanent storage. This work, detailed in the second half of this report, involves seven targeted research projects – evaluating the minimum dataset for confirming secure storage; additional overburden monitoring; passive seismic monitoring; history-matched modelling; developing proper wellbore design; casing corrosion evaluation; and assessment of post CO₂-injected core samples. The results from the final and optional phases of the Weyburn-Midale Project confirm the suitability of CO₂-EOR fields for the injection of CO₂, and further, highlight the necessary MMV and follow-up monitoring required for these operations to be considered permanent storage.
2013-01-01
Background In emergency settings, verification of endotracheal tube (ETT) location is important for critically ill patients. Unrecognised oesophageal intubation can be disastrous. Many methods are used for verification of the endotracheal tube location; none are ideal. Quantitative waveform capnography is considered the standard of care for this purpose but is not always available and is expensive. Therefore, this feasibility study was conducted to compare a cheaper alternative, bedside upper airway ultrasonography, to waveform capnography for verification of endotracheal tube location after intubation. Methods This was a prospective, single-centre, observational study, conducted at the HRPB, Ipoh. It included patients who were intubated in the emergency department from 28 March 2012 to 17 August 2012. A waiver of consent had been obtained from the Medical Research Ethics Committee. Bedside upper airway ultrasonography was performed after intubation and compared to waveform capnography. Specificity, sensitivity, positive and negative predictive value and likelihood ratio were calculated. Results A sample of 107 patients was analysed, and 6 (5.6%) had oesophageal intubations. The overall accuracy of bedside upper airway ultrasonography was 98.1% (95% confidence interval (CI) 93.0% to 100.0%). The kappa value (Κ) was 0.85, indicating a very good agreement between the bedside upper airway ultrasonography and waveform capnography. Thus, bedside upper airway ultrasonography is in concordance with waveform capnography. The sensitivity, specificity, positive predictive value and negative predictive value of bedside upper airway ultrasonography were 98.0% (95% CI 93.0% to 99.8%), 100% (95% CI 54.1% to 100.0%), 100% (95% CI 96.3% to 100.0%) and 75.0% (95% CI 34.9% to 96.8%). The likelihood ratio of a positive test is infinite and the likelihood ratio of a negative test is 0.0198 (95% CI 0.005 to 0.0781). The mean confirmation time by ultrasound was 16.4 s. No adverse effects were recorded. Conclusions Our study shows that ultrasonography can replace waveform capnography in confirming ETT placement in centres without capnography. This can reduce the incidence of unrecognised oesophageal intubation and prevent morbidity and mortality. Trial registration National Medical Research Register NMRR11100810230. PMID:23826756
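For readers who want to reproduce the reported test statistics, the sketch below recomputes sensitivity, specificity, predictive values, accuracy and Cohen's kappa from a 2x2 table. The counts are reconstructed from the percentages quoted in the abstract (an assumption, since the paper's raw table is not given here).

```python
# Counts reconstructed from the reported percentages (an assumption, not the paper's table):
# 107 intubations, 6 oesophageal; ultrasound missed 2 tracheal placements.
TP, FN, FP, TN = 99, 2, 0, 6   # "positive" = tracheal placement indicated

def diagnostic_summary(tp, fn, fp, tn):
    n = tp + fn + fp + tn
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv  = tp / (tp + fp)
    npv  = tn / (tn + fn)
    acc  = (tp + tn) / n
    # Cohen's kappa between the index test and the reference standard
    po = acc
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (po - pe) / (1 - pe)
    return sens, spec, ppv, npv, acc, kappa

print([round(x, 3) for x in diagnostic_summary(TP, FN, FP, TN)])
# -> sensitivity 0.980, specificity 1.0, PPV 1.0, NPV 0.75, accuracy 0.981, kappa ~0.85
```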
Generic Verification Protocol for Verification of Online Turbidimeters
This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...
Nakajima, Masashi; Hiraoka, Takahiro; Hirohara, Yoko; Oshika, Tetsuro; Mihashi, Toshifumi
2015-01-01
Several researchers studied the longitudinal chromatic aberration (LCA) of the human eye and observed that it does not change due to age. We measured the LCA of 45 subjects’ normal right eyes at three distinct wavelengths (561, 690, and 840 nm) using a Hartmann–Shack wavefront aberrometer (HSWA) while consecutively switching between three light sources for wavefront sensing. We confirmed that the LCA of the human eye does not change due to age between 22 and 57 years. PMID:26203391
Empirical testing of an analytical model predicting electrical isolation of photovoltaic modules
NASA Astrophysics Data System (ADS)
Garcia, A., III; Minning, C. P.; Cuddihy, E. F.
A major design requirement for photovoltaic modules is that the encapsulation system be capable of withstanding large DC potentials without electrical breakdown. Presented is a simple analytical model which can be used to estimate material thickness to meet this requirement for a candidate encapsulation system or to predict the breakdown voltage of an existing module design. A series of electrical tests to verify the model are described in detail. The results of these verification tests confirmed the utility of the analytical model for preliminary design of photovoltaic modules.
NASA Technical Reports Server (NTRS)
Axelrad, P.; Cox, A. E.; Crumpton, K. S.
1997-01-01
An algorithm is presented which uses observations of Global Positioning System (GPS) signals reflected from the ocean surface and acquired by a GPS receiver onboard an altimetric satellite to compute the ionospheric delay present in the altimeter measurement. This eliminates the requirement for a dual frequency altimeter for many Earth observing missions. A ground-based experiment is described which confirms the presence of these ocean-bounced signals and demonstrates the potential for altimeter ionospheric correction at the centimeter level.
Micro Computer Tomography for medical device and pharmaceutical packaging analysis.
Hindelang, Florine; Zurbach, Raphael; Roggo, Yves
2015-04-10
Biomedical device and medicine product manufacturing are long processes facing global competition. As technology evolves with time, the level of quality, safety and reliability increases simultaneously. Micro Computer Tomography (Micro CT) is a tool allowing a deep investigation of products: it can contribute to quality improvement. This article presents the numerous applications of Micro CT for medical device and pharmaceutical packaging analysis. The samples investigated confirmed CT suitability for verification of integrity, measurements and defect detections in a non-destructive manner. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
1982-01-01
Tests to verify the as-designed performance of all circuits within the thematic mapper electronics module unit are described. Specifically, the tests involved the evaluation of the scan line corrector driver, shutter drivers function, cal lamp controller function, post amplifier function, command decoder verification unit, and the temperature and actuator controllers function.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wagner, R.A.
1980-12-01
This comparison study involves a preliminary verification of finite element calculations. The methodology of the comparison study consists of solving four example problems with both the SPECTROM finite element program and the MARC-CDC general purpose finite element program. The results show close agreement for all example problems.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Triangular Transactions Involving Commodities Covered by a U.S. Import Certificate § 748.10(e). 0694-0012... Delivery Verification Certificate §§ 748.13 and 762.2(b). 0694-0017 International Import Certificate § 748... Assurance Requirement of License Exception TSR (Technology and Software Under Restriction) §§ 740.3(d) and...
Fractionated Proton Radiotherapy for Benign Cavernous Sinus Meningiomas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Slater, Jerry D., E-mail: jdslater@dominion.llumc.edu; Loredo, Lilia N.; Chung, Arthur
2012-08-01
Purpose: To evaluate the efficacy of fractionated proton radiotherapy for a population of patients with benign cavernous sinus meningiomas. Methods and Materials: Between 1991 and 2002, 72 patients were treated at Loma Linda University Medical Center with proton therapy for cavernous sinus meningiomas. Fifty-one patients had biopsy or subtotal resection; 47 had World Health Organization grade 1 pathology. Twenty-one patients had no histologic verification. Twenty-two patients received primary proton therapy; 30 had 1 previous surgery; 20 had more than 1 surgery. The mean gross tumor volume was 27.6 cm³; mean clinical target volume was 52.9 cm³. Median total doses for patients with and without histologic verification were 59 and 57 Gy, respectively. Mean and median follow-up periods were 74 months. Results: The overall 5-year actuarial control rate was 96%; the control rate was 99% in patients with grade 1 or absent histologic findings and 50% for those with atypical histology. All 21 patients who did not have histologic verification and 46 of 47 patients with histologic confirmation of grade 1 tumor demonstrated disease control at 5 years. Control rates for patients without previous surgery, 1 surgery, and 2 or more surgeries were 95%, 96%, and 95%, respectively. Conclusions: Fractionated proton radiotherapy for grade 1 cavernous sinus meningiomas achieves excellent control rates with minimal toxicities, regardless of surgical intervention or use of histologic diagnosis. Disease control for large lesions can be achieved by primary fractionated proton therapy.
NASA Astrophysics Data System (ADS)
Yim, S.-W.; Yu, S.-D.; Kim, H.-R.; Kim, M.-J.; Park, C.-R.; Yang, S.-E.; Kim, W.-S.; Hyun, O.-B.; Sim, J.; Park, K.-B.; Oh, I.-S.
2010-11-01
We have constructed and completed the preparation for a long-term operation test of a superconducting fault current limiter (SFCL) in a Korea Electric Power Corporation (KEPCO) test grid. The SFCL, with a rating of 22.9 kV/630 A, 3-phase, has been connected to the 22.9 kV test grid equipped with reclosers and other protection devices in the Gochang Power Testing Center of KEPCO. The main goals of the test are the verification of SFCL performance and protection coordination studies. A line-commutation type SFCL was fabricated and installed for this project, and the superconducting components were cooled by a cryo-cooler to 77 K in sub-cooled liquid nitrogen pressurized by 3 bar of helium gas. The verification test includes unmanned long-term operation, with and without loads, and fault tests. Since the test site is 170 km away from the laboratory, we will adopt unmanned operation with real-time remote monitoring and control over a high-speed internet connection. For the fault tests, we will apply fault currents up to around 8 kA (rms) to the SFCL using an artificial fault generator. The fault tests may allow us not only to confirm the current limiting capability of the SFCL, but also to adjust the SFCL-recloser coordination, such as resetting over-current relay parameters. This paper describes the construction of the testing facilities and discusses the plans for the verification tests.
Upgrades at the NASA Langley Research Center National Transonic Facility
NASA Technical Reports Server (NTRS)
Paryz, Roman W.
2012-01-01
Several projects have been completed or are nearing completion at the NASA Langley Research Center (LaRC) National Transonic Facility (NTF). The addition of a Model Flow-Control/Propulsion Simulation test capability to the NTF provides a unique, transonic, high-Reynolds number test capability that is well suited for research in propulsion airframe integration studies, circulation control high-lift concepts, powered lift, and cruise separation flow control. A 1992-vintage Facility Automation System (FAS) that performs the control functions for tunnel pressure, temperature, Mach number, model position, safety interlock and supervisory controls was replaced using current, commercially available components. This FAS upgrade also involved a design study for the replacement of the facility Mach measurement system and the development of a software-based simulation model of NTF processes and control systems. The FAS upgrades were validated by a post-upgrade verification wind tunnel test. The data acquisition system (DAS) upgrade project involves the design, purchase, build, integration, installation and verification of a new DAS by replacing several early-1990s-vintage computer systems with state-of-the-art hardware/software. This paper provides an update on the progress made in these efforts. See reference 1.
Model verification of mixed dynamic systems. [POGO problem in liquid propellant rockets
NASA Technical Reports Server (NTRS)
Chrostowski, J. D.; Evensen, D. A.; Hasselman, T. K.
1978-01-01
A parameter-estimation method is described for verifying the mathematical model of mixed (combined interactive components from various engineering fields) dynamic systems against pertinent experimental data. The model verification problem is divided into two separate parts: defining a proper model and evaluating the parameters of that model. The main idea is to use differences between measured and predicted behavior (response) to adjust automatically the key parameters of a model so as to minimize response differences. To achieve the goal of modeling flexibility, the method combines the convenience of automated matrix generation with the generality of direct matrix input. The equations of motion are treated in first-order form, allowing for nonsymmetric matrices, modeling of general networks, and complex-mode analysis. The effectiveness of the method is demonstrated for an example problem involving a complex hydraulic-mechanical system.
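A toy version of the parameter-adjustment idea, under stated assumptions: a single-degree-of-freedom oscillator is written in first-order form, a synthetic "measured" response is generated, and a least-squares routine adjusts the damping and stiffness so that predicted and measured responses agree. This is only an illustration of the principle, not the authors' automated matrix-generation machinery.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def response(params, t):
    """Predicted displacement of a 1-DOF oscillator in first-order form z' = A z,
    with z = [x, v]; params = (damping c, stiffness k), unit mass assumed."""
    c, k = params
    A = np.array([[0.0, 1.0], [-k, -c]])
    sol = solve_ivp(lambda _, z: A @ z, (t[0], t[-1]), [1.0, 0.0], t_eval=t)
    return sol.y[0]

t = np.linspace(0.0, 5.0, 200)
# Synthetic "test data": true parameters plus measurement noise.
measured = response([0.8, 9.0], t) + np.random.normal(0.0, 0.01, t.size)

# Adjust the model parameters to minimize the response difference.
fit = least_squares(lambda p: response(p, t) - measured, x0=[0.3, 5.0])
print("estimated damping, stiffness:", fit.x)
```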
Quality dependent fusion of intramodal and multimodal biometric experts
NASA Astrophysics Data System (ADS)
Kittler, J.; Poh, N.; Fatukasi, O.; Messer, K.; Kryszczuk, K.; Richiardi, J.; Drygajlo, A.
2007-04-01
We address the problem of score level fusion of intramodal and multimodal experts in the context of biometric identity verification. We investigate the merits of confidence based weighting of component experts. In contrast to the conventional approach where confidence values are derived from scores, we use instead raw measures of biometric data quality to control the influence of each expert on the final fused score. We show that quality based fusion gives better performance than quality free fusion. The use of quality weighted scores as features in the definition of the fusion functions leads to further improvements. We demonstrate that the achievable performance gain is also affected by the choice of fusion architecture. The evaluation of the proposed methodology involves six face verification experts and one speech verification expert. It is carried out on the XM2VTS database.
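The core of quality-based weighting can be sketched very simply: derive fusion weights from raw quality measures rather than from the scores themselves. The function, score values, quality values, and decision threshold below are all hypothetical; the paper's actual fusion functions and architectures are more elaborate.

```python
import numpy as np

def quality_weighted_fusion(scores, qualities):
    """Fuse per-expert match scores with weights derived from raw sample-quality
    measures (normalised to sum to one); higher-quality samples get more say."""
    scores = np.asarray(scores, dtype=float)
    q = np.asarray(qualities, dtype=float)
    w = q / q.sum()
    return float(np.dot(w, scores))

# Hypothetical normalised scores from six face experts and one speech expert,
# with per-sample quality measures (e.g. focus/illumination for face, SNR for speech).
scores    = [0.71, 0.64, 0.80, 0.55, 0.69, 0.75, 0.90]
qualities = [0.9, 0.4, 0.8, 0.2, 0.7, 0.6, 1.0]
accept = quality_weighted_fusion(scores, qualities) > 0.65   # threshold from development data
print(accept)
```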
Collapse of Experimental Colloidal Aging using Record Dynamics
NASA Astrophysics Data System (ADS)
Robe, Dominic; Boettcher, Stefan; Sibani, Paolo; Yunker, Peter
The theoretical framework of record dynamics (RD) posits that aging behavior in jammed systems is controlled by short, rare events involving activation of only a few degrees of freedom. RD predicts dynamics in an aging system to progress with the logarithm of t/tw. This prediction has been verified through new analysis of experimental data on an aging 2D colloidal system. MSD and persistence curves spanning three orders of magnitude in waiting time are collapsed. These predictions have also been found consistent with a number of experiments and simulations, but verification of the specific assumptions that RD makes about the underlying statistics of these rare events has been elusive. Here the observation of individual particles allows for the first time the direct verification of the assumptions about event rates and sizes. This work is supported by NSF Grant DMR-1207431.
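A small sketch of the collapse the abstract describes: if the MSD grows logarithmically in t/tw, curves taken at different waiting times fall on one another when plotted against t/tw. The synthetic_msd form and amplitude are assumptions used only to illustrate the replotting, not the experimental data.

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic aging MSD curves that grow logarithmically in t/tw, the RD prediction;
# real data would come from particle tracking at several waiting times tw.
def synthetic_msd(t, tw, amplitude=0.05):
    return amplitude * np.log(t / tw)

for tw in (10.0, 100.0, 1000.0):
    t = np.logspace(np.log10(tw), np.log10(tw) + 3, 50)   # three decades past tw
    plt.plot(t / tw, synthetic_msd(t, tw), label=f"tw = {tw:g}")

plt.xscale("log")
plt.xlabel("t / tw")
plt.ylabel("MSD")
plt.legend()
plt.show()   # curves for all waiting times fall on a single line vs log(t/tw)
```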
Cassini's Test Methodology for Flight Software Verification and Operations
NASA Technical Reports Server (NTRS)
Wang, Eric; Brown, Jay
2007-01-01
The Cassini spacecraft was launched on 15 October 1997 on a Titan IV-B launch vehicle. The spacecraft is comprised of various subsystems, including the Attitude and Articulation Control Subsystem (AACS). The AACS Flight Software (FSW) and its development has been an ongoing effort, from the design, development and finally operations. As planned, major modifications to certain FSW functions were designed, tested, verified and uploaded during the cruise phase of the mission. Each flight software upload involved extensive verification testing. A standardized FSW testing methodology was used to verify the integrity of the flight software. This paper summarizes the flight software testing methodology used for verifying FSW from pre-launch through the prime mission, with an emphasis on flight experience testing during the first 2.5 years of the prime mission (July 2004 through January 2007).
Hypersonic CFD applications for the National Aero-Space Plane
NASA Technical Reports Server (NTRS)
Richardson, Pamela F.; Mcclinton, Charles R.; Bittner, Robert D.; Dilley, A. Douglas; Edwards, Kelvin W.
1989-01-01
Design and analysis of the NASP depends heavily upon developing the critical technology areas that cover the entire engineering design of the vehicle. These areas include materials, structures, propulsion systems, propellants, integration of airframe and propulsion systems, controls, subsystems, and aerodynamics areas. Currently, verification of many of the classical engineering tools relies heavily on computational fluid dynamics. Advances are being made in the development of CFD codes to accomplish nose-to-tail analyses for hypersonic aircraft. Additional details involving the partial development, analysis, verification, and application of the CFL3D code and the SPARK combustor code are discussed. A nonequilibrium version of CFL3D that is presently being developed and tested is also described. Example calculations are given for research hypersonic aircraft geometries, and comparisons with experimental data show good agreement.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.
Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm.
Verification Games: Crowd-Sourced Formal Verification
2016-03-01
Final technical report, University of Washington, March 2016, covering the Verification Games: Crowd-Sourced Formal Verification project (contract FA8750..., period of performance June 2012 – September 2015). The report abstract begins: "Over the more than three years of the project Verification Games: Crowd-sourced..."
Verification and extension of the MBL technique for photo resist pattern shape measurement
NASA Astrophysics Data System (ADS)
Isawa, Miki; Tanaka, Maki; Kazumi, Hideyuki; Shishido, Chie; Hamamatsu, Akira; Hasegawa, Norio; De Bisschop, Peter; Laidler, David; Leray, Philippe; Cheng, Shaunee
2011-03-01
To achieve pattern shape measurement with CD-SEM, the Model Based Library (MBL) technique is under development. In this study, several libraries, each consisting of a double-trapezoid model placed in an optimized layout, were used to measure various pattern layouts. To verify the accuracy of MBL photoresist pattern shape measurement, CD-AFM measurements were carried out as a reference metrology. The two sets of results were compared, and we confirmed a linear correlation between them. To expand the application field of the MBL technique, it was then applied to end-of-line (EOL) shape measurement to demonstrate this capability. Finally, we confirmed that the MBL could be applied to more localized shape measurements, such as hot-spot analysis.
Unmaking the bomb: Verifying limits on the stockpiles of nuclear weapons
NASA Astrophysics Data System (ADS)
Glaser, Alexander
2017-11-01
Verifying limits on the stockpiles of nuclear weapons may require the ability for international inspectors to account for individual warheads, even when non-deployed, and to confirm the authenticity of nuclear warheads prior to dismantlement. These are fundamentally new challenges for nuclear verification, and they have been known for some time; unfortunately, due to a lack of sense of urgency, research in this area has not made substantial progress over the past 20 years. This chapter explores the central outstanding issues and offers a number of possible paths forward. In the case of confirming numerical limits, these include innovative tagging techniques and approaches solely based on declarations using modern cryptographic escrow schemes; with regard to warhead confirmation, there has recently been increasing interest in developing fundamentally new measurement approaches where, in one form or another, sensitive information is not acquired in the first place. Overall, new international R&D efforts could more usefully focus on non-intrusive technologies and approaches, which may show more promise for early demonstration and adoption. In the meantime, while warhead dismantlements remain unverified, nuclear weapon states ought to begin to document warhead assembly, refurbishment, and dismantlement activities and movements of warheads and warhead components through the weapons complex in ways that international inspectors will find credible at a later time. Again, such a process could be enabled by modern cryptographic techniques such as blockchaining. Finally, and perhaps most importantly, it is important to recognize that the main reason for the complexity of technologies and approaches needed for nuclear disarmament verification is the requirement to protect information that nuclear weapon states consider sensitive. Ultimately, if information security concerns cannot be resolved to the satisfaction of all stakeholders, an alternative would be to "reveal the secret" and to make available select warhead design information.
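As a concrete illustration of the "blockchaining" idea mentioned above, the sketch below appends declarations to a simple hash chain and verifies its integrity afterwards; any later edit to an earlier record breaks the chain. This is a toy construction with hypothetical record contents, not a proposal for an actual verification regime.

```python
import hashlib, json, time

def append_record(chain, payload):
    """Append a declaration (e.g. a warhead assembly/dismantlement event) to a simple
    hash chain; each block commits to the previous one, so later tampering is detectable."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"timestamp": time.time(), "payload": payload, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    chain.append(block)
    return chain

def verify_chain(chain):
    """Recompute every block hash and check the back-links; False if anything was altered."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != block["hash"]:
            return False
        if i and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
append_record(chain, {"event": "dismantlement", "item_id": "W-0001"})       # hypothetical entry
append_record(chain, {"event": "component transfer", "item_id": "W-0001"})  # hypothetical entry
print(verify_chain(chain))   # -> True; editing an earlier payload would make this False
```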
Monte Carlo verification of radiotherapy treatments with CloudMC.
Miras, Hector; Jiménez, Rubén; Perales, Álvaro; Terrón, José Antonio; Bertolet, Alejandro; Ortiz, Antonio; Macías, José
2018-06-27
A new implementation has been made on CloudMC, a cloud-based platform presented in a previous work, in order to provide services for radiotherapy treatment verification by means of Monte Carlo in a fast, easy and economical way. A description of the architecture of the application and the new developments implemented is presented together with the results of the tests carried out to validate its performance. CloudMC has been developed over Microsoft Azure cloud. It is based on a map/reduce implementation for Monte Carlo calculations distribution over a dynamic cluster of virtual machines in order to reduce calculation time. CloudMC has been updated with new methods to read and process the information related to radiotherapy treatment verification: CT image set, treatment plan, structures and dose distribution files in DICOM format. Some tests have been designed in order to determine, for the different tasks, the most suitable type of virtual machines from those available in Azure. Finally, the performance of Monte Carlo verification in CloudMC is studied through three real cases that involve different treatment techniques, linac models and Monte Carlo codes. Considering computational and economic factors, D1_v2 and G1 virtual machines were selected as the default type for the Worker Roles and the Reducer Role respectively. Calculation times up to 33 min and costs of 16 € were achieved for the verification cases presented when a statistical uncertainty below 2% (2σ) was required. The costs were reduced to 3-6 € when uncertainty requirements are relaxed to 4%. Advantages like high computational power, scalability, easy access and pay-per-usage model, make Monte Carlo cloud-based solutions, like the one presented in this work, an important step forward to solve the long-lived problem of truly introducing the Monte Carlo algorithms in the daily routine of the radiotherapy planning process.
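A minimal sketch of the map/reduce pattern described above, using local processes in place of Azure virtual machines: each "map" task runs an independent batch of histories with its own seed, and the "reduce" step merges the partial tallies into a mean and statistical uncertainty. The toy per-history "dose" is a placeholder, not a Monte Carlo transport calculation.

```python
import random
from concurrent.futures import ProcessPoolExecutor

def mc_worker(args):
    """One "map" task: run an independent batch of histories with its own seed and
    return the partial sums needed to combine results later (a toy dose tally)."""
    seed, n_histories = args
    rng = random.Random(seed)
    total = total_sq = 0.0
    for _ in range(n_histories):
        dose = rng.expovariate(1.0)      # placeholder for a real particle history
        total += dose
        total_sq += dose * dose
    return n_histories, total, total_sq

def reduce_batches(batches):
    """The "reduce" step: merge partial tallies and estimate the mean dose and its uncertainty."""
    n = sum(b[0] for b in batches)
    s = sum(b[1] for b in batches)
    s2 = sum(b[2] for b in batches)
    mean = s / n
    var = max(s2 / n - mean**2, 0.0) / n
    return mean, var**0.5

if __name__ == "__main__":
    jobs = [(seed, 100_000) for seed in range(8)]          # 8 stand-in "virtual machines"
    with ProcessPoolExecutor() as pool:
        batches = list(pool.map(mc_worker, jobs))
    print(reduce_batches(batches))
```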
Parodi, Katia; Paganetti, Harald; Cascio, Ethan; Flanz, Jacob B.; Bonab, Ali A.; Alpert, Nathaniel M.; Lohmann, Kevin; Bortfeld, Thomas
2008-01-01
The feasibility of off-line positron emission tomography/computed tomography (PET/CT) for routine three dimensional in-vivo treatment verification of proton radiation therapy is currently under investigation at Massachusetts General Hospital in Boston. In preparation for clinical trials, phantom experiments were carried out to investigate the sensitivity and accuracy of the method depending on irradiation and imaging parameters. Furthermore, they addressed the feasibility of PET/CT as a robust verification tool in the presence of metallic implants. These produce x-ray CT artifacts and fluence perturbations which may compromise the accuracy of treatment planning algorithms. Spread-out Bragg peak proton fields were delivered to different phantoms consisting of polymethylmethacrylate (PMMA), PMMA stacked with lung and bone equivalent materials, and PMMA with titanium rods to mimic implants in patients. PET data were acquired in list mode starting within 20 min after irradiation at a commercial lutetium oxyorthosilicate (LSO)-based PET/CT scanner. The amount and spatial distribution of the measured activity could be well reproduced by calculations based on the GEANT4 and FLUKA Monte Carlo codes. This phantom study supports the potential of millimeter accuracy for range monitoring and lateral field position verification even after low therapeutic dose exposures of 2 Gy, despite the delay between irradiation and imaging. It also indicates the value of PET for treatment verification in the presence of metallic implants, demonstrating a higher sensitivity to fluence perturbations in comparison to a commercial analytical treatment planning system. Finally, it addresses the suitability of LSO-based PET detectors for hadron therapy monitoring. This unconventional application of PET involves count rates which are orders of magnitude lower than in diagnostic tracer imaging, i.e., the signal of interest is comparable to the noise originating from the intrinsic radioactivity of the detector itself. In addition to PET alone, PET/CT imaging provides accurate information on the position of the imaged object and may assess possible anatomical changes during fractionated radiotherapy in clinical applications. PMID:17388158
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toltz, A; Seuntjens, J; Hoesl, M
Purpose: With the aim of reducing acute esophageal radiation toxicity in pediatric patients receiving craniospinal irradiation (CSI), we investigated the implementation of an in-vivo, adaptive proton therapy range verification methodology. Simulation experiments and in-phantom measurements were conducted to validate the range verification technique for this clinical application. Methods: A silicon diode array system has been developed and experimentally tested in phantom for passively scattered proton beam range verification for a prostate treatment case by correlating properties of the detector signal to the water equivalent path length (WEPL). We propose to extend the methodology to verify range distal to the vertebral body for pediatric CSI cases by placing this small volume dosimeter in the esophagus of the anesthetized patient immediately prior to treatment. A set of calibration measurements was performed to establish a time signal to WEPL fit for a “scout” beam in a solid water phantom. Measurements are compared against Monte Carlo simulation in GEANT4 using the Tool for Particle Simulation (TOPAS). Results: Measurements with the diode array in a spread out Bragg peak of 14 cm modulation width and 15 cm range (177 MeV passively scattered beam) in solid water were successfully validated against proton fluence rate simulations in TOPAS. The resulting calibration curve allows for a sensitivity analysis of detector system response with dose rate in simulation and with individual diode position through simulation on patient CT data. Conclusion: Feasibility has been shown for the application of this range verification methodology to pediatric CSI. An in-vivo measurement to determine the WEPL to the inner surface of the esophagus will allow for personalized adjustment of the treatment plan to ensure sparing of the esophagus while confirming target coverage. A. Toltz acknowledges partial support by the CREATE Medical Physics Research Training Network grant of the Natural Sciences and Engineering Research Council (Grant number: 432290)
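The calibration step, mapping a diode-signal property to water-equivalent path length, can be sketched as a simple curve fit. The signal values, the choice of a quadratic fit, and the planned range below are assumptions for illustration; the actual correlation used by the system is established experimentally.

```python
import numpy as np

# Hypothetical calibration data: a signal property measured by the diode array
# (e.g. a characteristic time-signal width) at known water-equivalent depths.
wepl_cm = np.array([4.0, 6.0, 8.0, 10.0, 12.0, 14.0])
signal  = np.array([0.92, 0.81, 0.68, 0.55, 0.41, 0.27])   # arbitrary units

# Fit signal -> WEPL with a low-order polynomial; a scout-beam measurement on the
# patient can then be converted to WEPL and compared against the planned range.
coeffs = np.polyfit(signal, wepl_cm, deg=2)

def estimated_wepl(measured_signal):
    return float(np.polyval(coeffs, measured_signal))

planned_wepl = 9.5                      # cm, from the treatment plan (illustrative)
measurement = 0.57
print(f"estimated WEPL = {estimated_wepl(measurement):.2f} cm,"
      f" deviation = {estimated_wepl(measurement) - planned_wepl:+.2f} cm")
```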
Science verification of operational aerosol and cloud products for TROPOMI on Sentinel-5 precursor
NASA Astrophysics Data System (ADS)
Lelli, Luca; Gimeno-Garcia, Sebastian; Sanders, Abram; Sneep, Maarten; Rozanov, Vladimir V.; Kokhanvosky, Alexander A.; Loyola, Diego; Burrows, John P.
2016-04-01
With the approaching launch of the Sentinel-5 precursor (S-5P) satellite, scheduled for mid-2016, one preparatory task of the L2 working group (composed of the Institute of Environmental Physics IUP Bremen, the Royal Netherlands Meteorological Institute KNMI De Bilt, and the German Aerospace Center DLR Oberpfaffenhofen) has been the assessment of biases among aerosol and cloud products that will be inferred by the respective algorithms from measurements of the platform's payload TROPOspheric Monitoring Instrument (TROPOMI). The instrument will measure terrestrial radiance with varying moderate spectral resolutions from the ultraviolet throughout the shortwave infrared. Specifically, all the operational and verification algorithms involved in this comparison exploit the sensitivity of molecular oxygen absorption (the A-band, 755-775 nm, with a resolution of 0.54 nm) to changes in optical and geometrical parameters of tropospheric scattering layers. Therefore, aerosol layer height (ALH) and thickness (AOT), cloud top height (CTH), thickness (COT) and albedo (CA) are the targeted properties. First, the verification of these properties has been accomplished upon synchronisation of the respective forward radiative transfer models for a variety of atmospheric scenarios. Then, biases against independent techniques have been evaluated with real measurements of selected GOME-2 orbits. Global seasonal bias assessment has been carried out for CTH, CA and COT, whereas the verification of ALH and AOT is based on the analysis of the ash plume emitted by the Icelandic volcanic eruption Eyjafjallajökull in May 2010 and selected dust scenes off the Saharan west coast sensed by SCIAMACHY in year 2009.
Development of an inpatient operational pharmacy productivity model.
Naseman, Ryan W; Lopez, Ben R; Forrey, Ryan A; Weber, Robert J; Kipp, Kris M
2015-02-01
An innovative model for measuring the operational productivity of medication order management in inpatient settings is described. Order verification within a computerized prescriber order-entry system was chosen as the pharmacy workload driver. To account for inherent variability in the tasks involved in processing different types of orders, pharmaceutical products were grouped by class, and each class was assigned a time standard, or "medication complexity weight," reflecting the intensity of pharmacist and technician activities (verification of drug indication, verification of appropriate dosing, adverse-event prevention and monitoring, medication preparation, product checking, product delivery, returns processing, nurse/provider education, and problem-order resolution). The resulting "weighted verifications" (WV) model allows productivity monitoring by job function (pharmacist versus technician) to guide hiring and staffing decisions. A 9-month historical sample of verified medication orders was analyzed using the WV model, and the calculations were compared with values derived from two established models—one based on the Case Mix Index (CMI) and the other based on the proprietary Pharmacy Intensity Score (PIS). Evaluation of Pearson correlation coefficients indicated that values calculated using the WV model were highly correlated with those derived from the CMI- and PIS-based models (r = 0.845 and 0.886, respectively). Relative to the comparator models, the WV model offered the advantage of less period-to-period variability. The WV model yielded productivity data that correlated closely with values calculated using two validated workload management models. The model may be used as an alternative measure of pharmacy operational productivity. Copyright © 2015 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
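A minimal sketch of the weighted-verifications calculation and its comparison against another model: workload is the sum of class-specific complexity weights over verified orders, and agreement between models is assessed with a Pearson correlation. The class names, weights, and monthly figures below are hypothetical, not the paper's time standards.

```python
import numpy as np

# Hypothetical complexity weights per medication class (relative effort per verified
# order); the paper's actual time standards are not reproduced here.
WEIGHTS = {"oral solid": 1.0, "iv admixture": 3.5, "chemotherapy": 6.0, "tpn": 8.0}

def weighted_verifications(orders):
    """Workload for a period: sum of complexity weights over all verified orders."""
    return sum(WEIGHTS[med_class] for med_class in orders)

# Monthly workload estimates from two models (illustrative numbers only).
wv = np.array([weighted_verifications(m) for m in (
        ["oral solid"] * 900 + ["iv admixture"] * 200,
        ["oral solid"] * 1100 + ["iv admixture"] * 260 + ["chemotherapy"] * 30,
        ["oral solid"] * 800 + ["iv admixture"] * 150 + ["tpn"] * 20)])
cmi_based = np.array([1650, 2100, 1450])   # hypothetical CMI-based workload values

print("Pearson r vs CMI-based model:", np.corrcoef(wv, cmi_based)[0, 1])
```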
Coronary Artery Diagnosis Aided by Neural Network
NASA Astrophysics Data System (ADS)
Stefko, Kamil
2007-01-01
Coronary artery disease is due to atheromatous narrowing and subsequent occlusion of the coronary vessel. Application of an optimised feed-forward multi-layer back-propagation neural network (MLBP) for detection of narrowing in coronary artery vessels is presented in this paper. The research was performed using 580 data records from traditional ECG exercise tests confirmed by coronary arteriography results. Each record of the training database included a description of the state of a patient, providing input data for the neural network. The main component of the input data was the level and slope of the ST segment of a 12-lead ECG signal recorded at rest and after effort (48 floating-point values). Coronary arteriography results (verifying the existence or absence of more than 50% stenosis of the particular coronary vessels) were used as the correct neural network training output pattern. More than 96% of cases were correctly recognised by a specially optimised and thoroughly verified neural network. The leave-one-out method was used for neural network verification, so all 580 data records could be used for training as well as for verification of the neural network.
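A rough analogue of the study design, assuming scikit-learn is available: a small feed-forward network is evaluated with leave-one-out cross-validation on synthetic data standing in for the 48 ST-segment features. A reduced synthetic sample is used so the example runs quickly; the study itself used 580 clinical records.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Synthetic stand-in for the clinical records: 48 ST-segment features per patient and a
# binary label for >50% stenosis confirmed by arteriography (fabricated for illustration).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 48))
y = (X[:, :4].sum(axis=1) + rng.normal(scale=0.5, size=100) > 0).astype(int)

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=500, random_state=0)
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())   # one model fit per held-out patient
print("leave-one-out accuracy:", scores.mean())
```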
NASA Astrophysics Data System (ADS)
Jovanović, J.; Petronijević, R. B.; Lukić, M.; Karan, D.; Parunović, N.; Branković-Lazić, I.
2017-09-01
During the previous development of a chemometric method for estimating the amount of added colorant in meat products, it was noticed that the natural colorant most commonly added to boiled sausages, E 120, has different CIE-LAB behavior compared to artificial colors that are used for the same purpose. This has opened the possibility of transforming the developed method into a method for identifying the addition of natural or synthetic colorants in boiled sausages based on the measurement of the color of the cross-section. After recalibration of the CIE-LAB method using linear discriminant analysis, verification was performed on 76 boiled sausages, of either frankfurter or Parisian sausage types. The accuracy and reliability of the classification were confirmed by comparison with the standard HPLC method. Results showed that the LDA + CIE-LAB method can be applied with high accuracy, 93.42%, to estimate food color type in boiled sausages. Natural orange colors can give false positive results. Pigments from spice mixtures had no significant effect on CIE-LAB results.
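The classification step can be sketched as linear discriminant analysis on CIE-LAB coordinates. The colour distributions for the "natural" and "synthetic" classes below are fabricated for illustration; only the overall sample size mirrors the study.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Synthetic CIE-LAB cross-section colour measurements (L*, a*, b*) for sausages
# coloured with a natural (label 0) or synthetic (label 1) colorant; illustrative only.
rng = np.random.default_rng(1)
natural   = rng.normal(loc=[55.0, 18.0, 12.0], scale=[3.0, 2.0, 2.0], size=(40, 3))
synthetic = rng.normal(loc=[52.0, 24.0, 9.0],  scale=[3.0, 2.0, 2.0], size=(36, 3))
X = np.vstack([natural, synthetic])
y = np.array([0] * len(natural) + [1] * len(synthetic))

lda = LinearDiscriminantAnalysis()
print("cross-validated accuracy:", cross_val_score(lda, X, y, cv=5).mean())
```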
Joiner, T E
1999-09-01
It is suggested that self-verification theory may provide insight as to why bulimic symptoms often persist for years, sometimes even despite intervention. In an effort to meet basic needs for self-confirmation, bulimic women may invite the very responses they fear (e.g., negative feedback about appearance), and thus propagate their symptoms. It was thus predicted that interest in negative feedback would be correlated with body dissatisfaction and bulimic symptoms, and that interest in negative feedback would serve as a risk factor for development of later symptoms, via the mediating effects of increased body dissatisfaction. Seventy-nine undergraduate women completed self-report assessments of interest in negative feedback, bulimic symptoms, and body dissatisfaction. Results supported the prediction that, despite serious concerns about body appearance, bulimic women were interested in the very feedback that would aggravate these concerns. Moreover, interest in negative feedback appeared to serve as a risk factor for development of later symptoms, via the mediating effects of increased body dissatisfaction. The clinical implications of these findings are discussed.
Collusion-aware privacy-preserving range query in tiered wireless sensor networks.
Zhang, Xiaoying; Dong, Lei; Peng, Hui; Chen, Hong; Zhao, Suyun; Li, Cuiping
2014-12-11
Wireless sensor networks (WSNs) are indispensable building blocks for the Internet of Things (IoT). With the development of WSNs, privacy issues have drawn more attention. Existing work on the privacy-preserving range query mainly focuses on privacy preservation and integrity verification in two-tiered WSNs in the case of compromised master nodes, but neglects the damage of node collusion. In this paper, we propose a series of collusion-aware privacy-preserving range query protocols in two-tiered WSNs. To the best of our knowledge, this paper is the first to consider collusion attacks for a range query in tiered WSNs while fulfilling the preservation of privacy and integrity. To preserve the privacy of data and queries, we propose a novel encoding scheme to conceal sensitive information. To preserve the integrity of the results, we present a verification scheme using the correlation among data. In addition, two schemes are further presented to improve result accuracy and reduce communication cost. Finally, theoretical analysis and experimental results confirm the efficiency, accuracy and privacy of our proposals.
Collusion-Aware Privacy-Preserving Range Query in Tiered Wireless Sensor Networks†
Zhang, Xiaoying; Dong, Lei; Peng, Hui; Chen, Hong; Zhao, Suyun; Li, Cuiping
2014-01-01
Wireless sensor networks (WSNs) are indispensable building blocks for the Internet of Things (IoT). With the development of WSNs, privacy issues have drawn more attention. Existing work on the privacy-preserving range query mainly focuses on privacy preservation and integrity verification in two-tiered WSNs in the case of compromised master nodes, but neglects the damage of node collusion. In this paper, we propose a series of collusion-aware privacy-preserving range query protocols in two-tiered WSNs. To the best of our knowledge, this paper is the first to consider collusion attacks for a range query in tiered WSNs while fulfilling the preservation of privacy and integrity. To preserve the privacy of data and queries, we propose a novel encoding scheme to conceal sensitive information. To preserve the integrity of the results, we present a verification scheme using the correlation among data. In addition, two schemes are further presented to improve result accuracy and reduce communication cost. Finally, theoretical analysis and experimental results confirm the efficiency, accuracy and privacy of our proposals. PMID:25615731
Impact of Finger Type in Fingerprint Authentication
NASA Astrophysics Data System (ADS)
Gafurov, Davrondzhon; Bours, Patrick; Yang, Bian; Busch, Christoph
Nowadays the fingerprint verification system is the most widespread and accepted biometric technology; it explores various features of the human fingers for this purpose. In general, every normal person has 10 fingers of different sizes. Although it is claimed that recognition performance with the little fingers can be less accurate compared to other finger types, to the best of our knowledge, this has not been investigated yet. This paper presents our study on the influence of finger type on fingerprint recognition performance. For the analysis we employ two fingerprint verification software packages (one public and one commercial). We conduct tests on the GUC100 multi-sensor fingerprint database, which contains fingerprint images of all 10 fingers from 100 subjects. Our analysis indeed confirms that performance with the small fingers is less accurate than performance with the other fingers of the hand. It also appears that the best performance is obtained with the thumb or index fingers. For example, performance deterioration from the best fingers (i.e. index or thumb) to the worst fingers (i.e. small ones) can be in the range of 184%-1352%.
Role to Be Played by Independent Geotechnical Supervision in the Foundation for Bridge Construction
NASA Astrophysics Data System (ADS)
Sobala, Dariusz; Rybak, Jarosław
2017-10-01
This paper presents remarks on the necessity of employing an independent and, above all, ethical geotechnical supervision. From the design phase through the entire construction process, the importance of the geotechnical engineer is stated in legal acts. Numerous testing technologies serve to calibrate geotechnical technologies and allow the quality and capacity of piles to be confirmed. Special emphasis is paid to the involvement of scientific and research institutions, which can not only provide services but also post-process and systematize the collected data; such databases make new codes, methods and recommendations possible. The selection of deep foundations for bridge-type structures most often depends on complex geotechnical conditions, concentrated loads and constraints on pier displacements. The last of these, prior to the more common introduction of the design-and-construct system, could also serve as a convenient justification for a design engineer who imposed a deep foundation because he did not want, or was not able, to estimate the effect of pier settlement on the structure. The paper provides notes on the need to engage a geotechnical supervision service of high competency and ethical quality during the engineering and construction stages of foundations for bridge-type structures, where legal requirements demand special consideration. Successive stages of projects are reviewed, and research methods used for the current calibration of geotechnical technologies and the verification of geotechnical work quality are analysed. Special attention is given to the potential involvement of independent R&D institutions which, apart from rendering specific services, also collect and systemize research results, enabling, in the long term, the revision of engineering standards, instructions and guidelines.
Nakamura, Sayaka; Sato, Hiroaki; Tanaka, Reiko; Yaguchi, Takashi
2016-01-01
We have previously proposed a rapid identification method for bacterial strains based on the profiles of their ribosomal subunit proteins (RSPs), observed using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). This method can perform phylogenetic characterization based on the mass of housekeeping RSP biomarkers, ideally calculated from amino acid sequence information registered in public protein databases. With the aim of extending its field of application to medical mycology, this study investigates the actual state of information of RSPs of eukaryotic fungi registered in public protein databases through the characterization of ribosomal protein fractions extracted from genome-sequenced Aspergillus fumigatus strains Af293 and A1163 as a model. In this process, we have found that the public protein databases harbor problems. The RSP names are in confusion, so we have provisionally unified them using the yeast naming system. The most serious problem is that many incorrect sequences are registered in the public protein databases. Surprisingly, more than half of the sequences are incorrect, due chiefly to mis-annotation of exon/intron structures. These errors could be corrected by a combination of in silico inspection by sequence homology analysis and MALDI-TOF MS measurements. We were also able to confirm conserved post-translational modifications in eleven RSPs. After these verifications, the masses of 31 expressed RSPs under 20,000 Da could be accurately confirmed. These RSPs have a potential to be useful biomarkers for identifying clinical isolates of A. fumigatus. PMID:27843740
Study of solution procedures for nonlinear structural equations
NASA Technical Reports Server (NTRS)
Young, C. T., II; Jones, R. F., Jr.
1980-01-01
A method for the reduction of the cost of solution of large nonlinear structural equations was developed. Verification was made using the MARC-STRUC structure finite element program with test cases involving single and multiple degrees of freedom for static geometric nonlinearities. The method developed was designed to exist within the envelope of accuracy and convergence characteristic of the particular finite element methodology used.
Rule-Based Runtime Verification
NASA Technical Reports Server (NTRS)
Barringer, Howard; Goldberg, Allen; Havelund, Klaus; Sen, Koushik
2003-01-01
We present a rule-based framework for defining and implementing finite trace monitoring logics, including future and past time temporal logic, extended regular expressions, real-time logics, interval logics, forms of quantified temporal logics, and so on. Our logic, EAGLE, is implemented as a Java library and involves novel techniques for rule definition, manipulation and execution. Monitoring is done on a state-by-state basis, without storing the execution trace.
Kapiriri, Lydia
2017-06-19
While there have been efforts to develop frameworks to guide healthcare priority setting, there has been limited focus on evaluation frameworks. Moreover, while the few frameworks identify quality indicators for successful priority setting, they do not provide the users with strategies to verify these indicators. Kapiriri and Martin (Health Care Anal 18:129-147, 2010) developed a framework for evaluating priority setting in low and middle income countries. This framework provides both parameters for successful priority setting and proposes means of their verification. Before its use in real-life contexts, this paper presents results from a validation process of the framework. The framework validation involved 53 policy makers and priority setting researchers at the global, national and sub-national levels (in Uganda). They were requested to indicate the relative importance of the proposed parameters as well as the feasibility of obtaining the related information. We also pilot tested the proposed means of verification. Almost all the respondents evaluated all the parameters, including the contextual factors, as 'very important'. However, some respondents at the global level did not rate 'presence of incentives to comply', 'reduced disagreements', 'increased public understanding', 'improved institutional accountability' and 'meeting the ministry of health objectives' as highly, which could be a reflection of their levels of decision making. All the proposed means of verification were assessed as feasible, with the exception of meeting observations, which would require an insider. These findings were consistent with those obtained from the pilot testing. These findings are relevant to policy makers and researchers involved in priority setting in low and middle income countries. To the best of our knowledge, this is one of the few initiatives that has involved potential users of a framework (at the global level and in a low income country) in its validation. The favorable validation of all the parameters at the national and sub-national levels implies that the framework has potential usefulness at those levels, as is. The parameters that were disputed at the global level necessitate further discussion when using the framework at that level.
King, Shelby M.; Higgins, J. William; Nino, Celina R.; Smith, Timothy R.; Paffenroth, Elizabeth H.; Fairbairn, Casey E.; Docuyanan, Abigail; Shah, Vishal D.; Chen, Alice E.; Presnell, Sharon C.; Nguyen, Deborah G.
2017-01-01
Due to its exposure to high concentrations of xenobiotics, the kidney proximal tubule is a primary site of nephrotoxicity and resulting attrition in the drug development pipeline. Current pre-clinical methods using 2D cell cultures and animal models are unable to fully recapitulate clinical drug responses due to limited in vitro functional lifespan, or species-specific differences. Using Organovo's proprietary 3D bioprinting platform, we have developed a fully cellular human in vitro model of the proximal tubule interstitial interface comprising renal fibroblasts, endothelial cells, and primary human renal proximal tubule epithelial cells to enable more accurate prediction of tissue-level clinical outcomes. Histological characterization demonstrated formation of extensive microvascular networks supported by endogenous extracellular matrix deposition. The epithelial cells of the 3D proximal tubule tissues demonstrated tight junction formation and expression of renal uptake and efflux transporters; the polarized localization and function of P-gp and SGLT2 were confirmed. Treatment of 3D proximal tubule tissues with the nephrotoxin cisplatin induced loss of tissue viability and epithelial cells in a dose-dependent fashion, and cimetidine rescued these effects, confirming the role of the OCT2 transporter in cisplatin-induced nephrotoxicity. The tissues also demonstrated a fibrotic response to TGFβ as assessed by an increase in gene expression associated with human fibrosis and histological verification of excess extracellular matrix deposition. Together, these results suggest that the bioprinted 3D proximal tubule model can serve as a test bed for the mechanistic assessment of human nephrotoxicity and the development of pathogenic states involving epithelial-interstitial interactions, making them an important adjunct to animal studies. PMID:28337147
Termopoli, Veronica; Famiglini, Giorgio; Palma, Pierangela; Magrini, Laura; Cappiello, Achille
2015-03-01
Sudden infant death syndrome (SIDS) and sudden intrauterine unexpected death syndrome (SIUDS) are an unresolved teaser in the social-medical and health setting of modern medicine and are the result of multifactorial interactions. Recently, prenatal exposure to environmental contaminants has been associated with negative pregnancy outcomes, and verification of their presence in fetal and newborn tissues is of crucial importance. A gas chromatography-tandem mass spectrometry (MS/MS) method, using a triple quadrupole analyzer, is proposed to assess the presence of 20 organochlorine pesticides, two organophosphate pesticides, one carbamate (boscalid), and a phenol (bisphenol A) in human brain tissues. Samples were collected during autopsies of infants and fetuses that died suddenly without any evident cause. The method involves a liquid-solid extraction using n-hexane as the extraction solvent. The extracts were purified with Florisil cartridges prior to the final determination. Recovery experiments using lamb brain spiked at three different concentrations in the range of 1-50 ng g(-1) were performed, with recoveries ranging from 79 to 106%. Intraday and interday repeatability were evaluated, and relative standard deviations lower than 10% and 18%, respectively, were obtained. The selectivity and sensitivity achieved in multiple reaction monitoring mode allowed us to achieve quantification and confirmation in a real matrix at levels as low as 0.2-0.6 ng g(-1). Two MS/MS transitions were acquired for each analyte, using the Q/q ratio as the confirmatory parameter. This method was applied to the analysis of 14 cerebral cortex samples (ten SIUDS and four SIDS cases), and confirmed the presence of several selected compounds.
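The method above reports spike recoveries and uses the Q/q ion ratio as the confirmatory parameter. As an illustration only (not the authors' code), the snippet below shows the two routine calculations involved; the ±20% ratio tolerance and all numeric values are assumed example figures, not taken from the paper.

```python
# Illustrative sketch: spike-recovery calculation and a Q/q ion-ratio
# confirmation check typical of MRM methods. Tolerance and numbers are
# assumed example values, not data from the study.

def recovery_percent(measured_ng_g, spiked_ng_g):
    return 100.0 * measured_ng_g / spiked_ng_g

def confirmed_by_ion_ratio(quantifier_area, qualifier_area,
                           reference_ratio, tolerance=0.20):
    ratio = qualifier_area / quantifier_area
    return abs(ratio - reference_ratio) <= tolerance * reference_ratio

print(recovery_percent(measured_ng_g=42.5, spiked_ng_g=50.0))    # 85.0 %
print(confirmed_by_ion_ratio(120000, 54000, reference_ratio=0.45))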
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, M; Jung, J; Yoon, D
Purpose: Respiratory gated radiation therapy (RGRT) gives accurate results when a patient’s breathing is stable and regular. Thus, the patient should be fully aware during respiratory pattern training before undergoing the RGRT treatment. In order to bypass the process of respiratory pattern training, we propose a target location prediction system for RGRT that uses only natural respiratory volume, and confirm its application. Methods: In order to verify the proposed target location prediction system, an in-house phantom set was used. This set involves a chest phantom including target, external markers, and motion generator. Natural respiratory volume signals were generated using the random function in MATLAB code. In the chest phantom, the target takes a linear motion based on the respiratory signal. After a four-dimensional computed tomography (4DCT) scan of the in-house phantom, the motion trajectory was derived as a linear equation. The accuracy of the linear equation was compared with that of the motion algorithm used by the operating motion generator. In addition, we attempted target location prediction using random respiratory volume values. Results: The correspondence rate of the linear equation derived from the 4DCT images with the motion algorithm of the motion generator was 99.41%. In addition, the average error rate of target location prediction was 1.23% for 26 cases. Conclusion: We confirmed the applicability of our proposed target location prediction system for RGRT using natural respiratory volume. If additional clinical studies can be conducted, a more accurate prediction system can be realized without requiring respiratory pattern training.
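The workflow above derives a linear motion trajectory from 4DCT and then predicts target position from respiratory volume. The sketch below illustrates that step under stated assumptions (a simple least-squares fit; all volume and position values are hypothetical, not the study's data).

```python
# Illustrative sketch of the assumed workflow (not the authors' code): fit a
# linear relation between respiratory volume and target position from 4DCT
# phases, then predict the target location for new natural-breathing volumes.
import numpy as np

# Hypothetical 4DCT samples: respiratory volume (a.u.) vs target position (mm).
volume_4dct = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
position_4dct = np.array([0.0, 2.1, 4.0, 6.2, 7.9, 10.1])

slope, intercept = np.polyfit(volume_4dct, position_4dct, deg=1)

def predict_position(volume):
    return slope * volume + intercept

new_volumes = np.array([0.15, 0.55, 0.90])     # natural respiratory signal
print(predict_position(new_volumes))           # predicted target positions (mm)
```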
Chen, C; Xiang, J Y; Hu, W; Xie, Y B; Wang, T J; Cui, J W; Xu, Y; Liu, Z; Xiang, H; Xie, Q
2015-11-01
To screen and identify safe micro-organisms used during Douchi fermentation, and verify the feasibility of producing high-quality Douchi using these identified micro-organisms. PCR-denaturing gradient gel electrophoresis (DGGE) and automatic amino-acid analyser were used to investigate the microbial diversity and free amino acids (FAAs) content of 10 commercial Douchi samples. The correlations between microbial communities and FAAs were analysed by statistical analysis. Ten strains with significant positive correlation were identified. Then an experiment on Douchi fermentation by identified strains was carried out, and the nutritional composition in Douchi was analysed. Results showed that FAAs and relative content of isoflavone aglycones in verification Douchi samples were generally higher than those in commercial Douchi samples. Our study indicated that fungi, yeasts, Bacillus and lactic acid bacteria were the key players in Douchi fermentation, and with identified probiotic micro-organisms participating in fermentation, a higher quality Douchi product was produced. This is the first report to analyse and confirm the key micro-organisms during Douchi fermentation by statistical analysis. This work proves fermentation micro-organisms to be the key influencing factor of Douchi quality, and demonstrates the feasibility of fermenting Douchi using identified starter micro-organisms. © 2015 The Society for Applied Microbiology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harper, Kyle; Truong, Thanh-Tam; Magwood, Leroy
In the process of decontaminating and decommissioning (D&D) older nuclear facilities, special precautions must be taken with removable or airborne contamination. One possible strategy utilizes foams and fixatives to affix these loose contaminants. Many foams and fixatives are already commercially available, either generically or sold specifically for D&D. However, due to a lack of relevant testing in a radioactive environment, additional verification is needed to confirm that these products not only affix contamination to their surfaces, but also will function in a D&D environment. Several significant safety factors, including flammability and worker safety, can be analyzed through the process of headspace analysis, a technique that analyzes the off gas formed before or during the curing process of the foam/fixative, usually using gas chromatography-mass spectrometry (GC-MS). This process focuses on the volatile components of a chemical, which move freely between the solid/liquid form within the sample and the gaseous form in the area above the sample (the headspace). Between possibly hot conditions in a D&D situation and heat created in a foaming reaction, the volatility of many chemicals can change, and thus different gases can be released at different times throughout the reaction. This project focused on analysis of volatile chemicals involved in the process of using foams and fixatives to identify any potential hazardous or flammable compounds.
The SeaHorn Verification Framework
NASA Technical Reports Server (NTRS)
Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.
2015-01-01
In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
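SeaHorn encodes verification conditions as Horn clauses over an unknown inductive invariant. The sketch below is a conceptual illustration of that encoding, not SeaHorn itself: it writes out the three clauses for a tiny loop and brute-force checks one candidate invariant (a real Horn-clause solver would search for the invariant automatically).

```python
# Conceptual sketch (not SeaHorn): the safety of a tiny loop
#   x = 0; while x < 10: x += 2;  assert x != 11
# can be encoded as constrained Horn clauses over an invariant Inv(x):
#   (1)  x == 0                    -> Inv(x)        (initiation)
#   (2)  Inv(x) and x < 10         -> Inv(x + 2)    (consecution)
#   (3)  Inv(x) and not (x < 10)   -> x != 11       (safety)
# A Horn-clause solver searches for Inv; here we just check one candidate.

def inv(x):                 # candidate invariant: x is even and 0 <= x <= 10
    return x % 2 == 0 and 0 <= x <= 10

def clauses_hold(domain):
    init = all(inv(x) for x in domain if x == 0)
    cons = all(inv(x + 2) for x in domain if inv(x) and x < 10)
    safe = all(x != 11 for x in domain if inv(x) and not x < 10)
    return init and cons and safe

print(clauses_hold(range(-20, 21)))   # True: the candidate discharges all clauses
```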
Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems
NASA Technical Reports Server (NTRS)
Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris
2010-01-01
Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.
NASA Astrophysics Data System (ADS)
Miller, Jacob; Sanders, Stephen; Miyake, Akimasa
2017-12-01
While quantum speed-up in solving certain decision problems by a fault-tolerant universal quantum computer has been promised, a timely research interest includes how far one can reduce the resource requirement to demonstrate a provable advantage in quantum devices without demanding quantum error correction, which is crucial for prolonging the coherence time of qubits. We propose a model device made of locally interacting multiple qubits, designed such that simultaneous single-qubit measurements on it can output probability distributions whose average-case sampling is classically intractable, under similar assumptions as the sampling of noninteracting bosons and instantaneous quantum circuits. Notably, in contrast to these previous unitary-based realizations, our measurement-based implementation has two distinctive features. (i) Our implementation involves no adaptation of measurement bases, leading output probability distributions to be generated in constant time, independent of the system size. Thus, it could be implemented in principle without quantum error correction. (ii) Verifying the classical intractability of our sampling is done by changing the Pauli measurement bases only at certain output qubits. Our usage of random commuting quantum circuits in place of computationally universal circuits allows a unique unification of sampling and verification, so they require the same physical resource requirements in contrast to the more demanding verification protocols seen elsewhere in the literature.
Technical review of SRT-CMA-930058 revalidation studies of Mark 16 experiments: J70
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reed, R.L.
1993-10-25
This study is a reperformance of a set of MGBS-TGAL criticality safety code validation calculations previously reported by Clark. The reperformance was needed because the records of the previous calculations could not be located in current APG files and records. As noted by the author, preliminary attempts to reproduce the Clark results by direct modeling in MGBS and TGAL were unsuccessful. Consultation with Clark indicated that the MGBS-TGAL (EXPT) option within the KOKO system should be used to set up the MGBS and TGAL input data records. The results of the study indicate that the technique used by Clark has been established and that the technique is now documented for future use. File records of the calculations have also been established in APG files. The review was performed per QAP 11--14 of 1Q34. Since the reviewer was involved in developing the procedural technique used for this study, this review cannot be considered a fully independent review, but should be considered a verification that the document contains adequate information to allow a new user to perform similar calculations, a verification of the procedure by performing several calculations independently with identical results to the reported results, and a verification of the readability of the report.
Spacecraft servicing demonstration plan
NASA Technical Reports Server (NTRS)
Bergonz, F. H.; Bulboaca, M. A.; Derocher, W. L., Jr.
1984-01-01
A preliminary spacecraft servicing demonstration plan is prepared which leads to a fully verified operational on-orbit servicing system based on the module exchange, refueling, and resupply technologies. The resulting system can be applied at the space station, in low Earth orbit with an orbital maneuvering vehicle (OMV), or be carried with an OMV to geosynchronous orbit by an orbital transfer vehicle. The three phase plan includes ground demonstrations, cargo bay demonstrations, and free flight verifications. The plan emphasizes the exchange of multimission modular spacecraft (MMS) modules which involves space repairable satellites. Three servicer mechanism configurations are the engineering test unit, a protoflight quality unit, and two fully operational units that have been qualified and documented for use in free flight verification activity. The plan balances costs and risks by overlapping study phases, utilizing existing equipment for ground demonstrations, maximizing use of existing MMS equipment, and rental of a spacecraft bus.
Verification and Implementation of Operations Safety Controls for Flight Missions
NASA Technical Reports Server (NTRS)
Smalls, James R.; Jones, Cheryl L.; Carrier, Alicia S.
2010-01-01
There are several engineering disciplines, such as reliability, supportability, quality assurance, human factors, risk management, safety, etc. Safety is an extremely important engineering specialty within NASA, and the consequence involving a loss of crew is considered a catastrophic event. Safety is not difficult to achieve when properly integrated at the beginning of each space systems project/start of mission planning. The key is to ensure proper handling of safety verification throughout each flight/mission phase. Today, Safety and Mission Assurance (S&MA) operations engineers continue to conduct these flight product reviews across all open flight products. As such, these reviews help ensure that each mission is accomplished with safety requirements along with controls heavily embedded in applicable flight products. Most importantly, the S&MA operations engineers are required to look for important design and operations controls so that safety is strictly adhered to as well as reflected in the final flight product.
On Biometrics With Eye Movements.
Zhang, Youming; Juhola, Martti
2017-09-01
Eye movements are a relatively novel data source for biometric identification. When video cameras applied to eye tracking become smaller and more efficient, this data source could offer interesting opportunities for the development of eye movement biometrics. In this paper, we study primarily biometric identification as seen as a classification task of multiple classes, and secondarily biometric verification considered as binary classification. Our research is based on the saccadic eye movement signal measurements from 109 young subjects. In order to test the data measured, we use a procedure of biometric identification according to the one-versus-one (subject) principle. In a development from our previous research, which also involved biometric verification based on saccadic eye movements, we now apply another eye movement tracker device with a higher sampling frequency of 250 Hz. The results obtained are good, with correct identification rates at 80-90% at their best.
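The study treats identification as multi-class classification under a one-versus-one scheme. As a hedged illustration of that setup (not the authors' pipeline), the sketch below trains a one-vs-one classifier on toy saccade-like features; the feature meanings, subject count, and data are all invented for the example.

```python
# Illustrative sketch (not the authors' pipeline): biometric identification as
# multi-class classification with a one-versus-one scheme over simple saccade
# features (e.g. amplitude, peak velocity, duration).
import numpy as np
from sklearn.multiclass import OneVsOneClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_subjects, n_samples, n_features = 5, 40, 3   # toy stand-in for 109 subjects
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(n_samples, n_features))
               for i in range(n_subjects)])
y = np.repeat(np.arange(n_subjects), n_samples)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = OneVsOneClassifier(SVC(kernel="rbf")).fit(X_tr, y_tr)
print("identification rate:", clf.score(X_te, y_te))
```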
An Integrated Environment for Efficient Formal Design and Verification
NASA Technical Reports Server (NTRS)
1998-01-01
The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.
NASA Astrophysics Data System (ADS)
Sarkar, Biplab; Ray, Jyotirmoy; Ganesh, Tharmarnadar; Manikandan, Arjunan; Munshi, Anusheel; Rathinamuthu, Sasikumar; Kaur, Harpreet; Anbazhagan, Satheeshkumar; Giri, Upendra K.; Roy, Soumya; Jassal, Kanan; Kalyan Mohanti, Bidhu
2018-04-01
The aim of this article is to derive and verify a mathematical formulation for the reduction of the six-dimensional (6D) positional inaccuracies of patients (lateral, longitudinal, vertical, pitch, roll and yaw) to three-dimensional (3D) linear shifts. The formulation was mathematically and experimentally tested and verified for 169 stereotactic radiotherapy patients. The mathematical verification involves the comparison of any (one) of the calculated rotational coordinates with the corresponding value from the 6D shifts obtained by cone beam computed tomography (CBCT). The experimental verification involves three sets of measurements using an ArcCHECK phantom, when (i) the phantom was not moved (neutral position: 0MES), (ii) the position of the phantom shifted by 6D shifts obtained from CBCT (6DMES) from neutral position and (iii) the phantom shifted from its neutral position by 3D shifts reduced from 6D shifts (3DMES). Dose volume histogram and statistical comparisons were made between ≤ft< TPSCAL{\\text -}0MES \\right> and ≤ft< 3DMES{\\text -6DMES} \\right> . The mathematical verification was performed by a comparison of the calculated and measured yaw (γ°) rotation values, which gave a straight line, Y = 1X with a goodness of fit as R 2 = 0.9982. The verification, based on measurements, gave a planning target volume receiving 100% of the dose (V100%) as 99.1 ± 1.9%, 96.3 ± 1.8%, 74.3 ± 1.9% and 72.6 ± 2.8% for the calculated treatment planning system values TPSCAL, 0MES, 3DMES and 6DMES, respectively. The statistical significance (p-values: paired sample t-test) of V100% were found to be 0.03 for the paired sample ≤ft< 3DMES{\\text -6DMES} \\right> and 0.01 for ≤ft< 0MES{\\text -TPSCAL} \\right> . In this paper, a mathematical method to reduce 6D shifts to 3D shifts is presented. The mathematical method is verified by using well-matched values between the measured and calculated γ°. Measurements done on the ArcCHECK phantom also proved that the proposed methodology is correct. The post-correction of the table position condition introduces a minimal spatial dose delivery error in the frameless stereotactic system, using a 6D motion enabled robotic couch. This formulation enables the reduction of 6D positional inaccuracies to 3D linear shifts, and hence allows the treatment of patients with frameless stereotactic radiosurgery by using only a 3D linear motion enabled couch.
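A simple way to picture the 6D-to-3D reduction is that, for a target offset from the isocenter, a small rigid 6D correction displaces the target by a single 3D vector that a linear couch can reproduce. The sketch below illustrates that geometric idea only; it is not the authors' exact formulation, and the offsets, shifts, and angles are hypothetical.

```python
# Illustrative sketch only (not the paper's exact formulation): for a target
# at position r relative to the isocenter, a rigid 6D correction (translation
# t plus pitch/roll/yaw rotations) displaces the target by approximately
#   d = t + R(pitch, roll, yaw) @ r - r,
# and d can be applied as a purely 3D couch shift for that target.
import numpy as np

def rotation_matrix(pitch, roll, yaw):          # angles in radians
    cx, sx = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(roll),  np.sin(roll)
    cz, sz = np.cos(yaw),   np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

r = np.array([30.0, -12.0, 45.0])               # target offset from isocenter (mm)
t = np.array([1.2, -0.8, 2.1])                  # CBCT translational shifts (mm)
pitch, roll, yaw = np.radians([0.6, -0.4, 1.1]) # CBCT rotational shifts (deg -> rad)

d = t + rotation_matrix(pitch, roll, yaw) @ r - r
print("equivalent 3D couch shift (mm):", np.round(d, 2))
```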
GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER
The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...
Reicher, Joshua; Reicher, Danielle; Reicher, Murray
2007-06-01
Improper positioning of the endotracheal tube during intubation poses a serious health risk to patients. In one prospective study of 219 critically ill patients, 14% required endotracheal tube repositioning after intubation [Brunel et al. Chest 1989; 96: 1043-1045]. While a variety of techniques are used to confirm proper tube placement, a chest X-ray is usually employed for definitive verification. Radio frequency identification (RFID) technology, in which an RFID reader emits and receives a signal from an RFID tag, may be useful in evaluating endotracheal tube position. RFID technology has already been approved for use in humans as a safe and effective tool in a variety of applications. The use of handheld RFID detectors and RFID tag-labeled endotracheal tubes could allow for easy and accurate bedside monitoring of endotracheal tube position, once initial proper placement is confirmed.
Ebola hemorrhagic fever associated with novel virus strain, Uganda, 2007-2008.
Wamala, Joseph F; Lukwago, Luswa; Malimbo, Mugagga; Nguku, Patrick; Yoti, Zabulon; Musenero, Monica; Amone, Jackson; Mbabazi, William; Nanyunja, Miriam; Zaramba, Sam; Opio, Alex; Lutwama, Julius J; Talisuna, Ambrose O; Okware, Sam I
2010-07-01
During August 2007-February 2008, the novel Bundibugyo ebolavirus species was identified during an outbreak of Ebola viral hemorrhagic fever in Bundibugyo district, western Uganda. To characterize the outbreak as a requisite for determining response, we instituted a case-series investigation. We identified 192 suspected cases, of which 42 (22%) were laboratory positive for the novel species; 74 (38%) were probable, and 77 (40%) were negative. Laboratory confirmation lagged behind outbreak verification by 3 months. Bundibugyo ebolavirus was less fatal (case-fatality rate 34%) than Ebola viruses that had caused previous outbreaks in the region, and most transmission was associated with handling of dead persons without appropriate protection (adjusted odds ratio 3.83, 95% confidence interval 1.78-8.23). Our study highlights the need for maintaining a high index of suspicion for viral hemorrhagic fevers among healthcare workers, building local capacity for laboratory confirmation of viral hemorrhagic fevers, and institutionalizing standard precautions.
Experimental verification of a radiofrequency power model for Wi-Fi technology.
Fang, Minyu; Malone, David
2010-04-01
When assessing the power emitted from a Wi-Fi network, it has been observed that these networks operate at a relatively low duty cycle. In this paper, we extend a recently introduced model of emitted power in Wi-Fi networks to cover conditions where devices do not always have packets to transmit. We present experimental results to validate the original model and its extension by developing approximate, but practical, testbed measurement techniques. The accuracy of the models is confirmed, with small relative errors: less than 5-10%. Moreover, we confirm that the greatest power is emitted when the network is saturated with traffic. Using this, we give a simple technique to quickly estimate power output based on traffic levels and give examples showing how this might be used in practice to predict current or future power output from a Wi-Fi network.
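The central observation above is that time-averaged emitted power tracks the radio's duty cycle, which saturates under heavy traffic. The sketch below is a minimal, assumed simplification of that relationship (it ignores protocol overheads and is not the paper's model); all parameter values are illustrative.

```python
# Minimal sketch (assumed simplification, not the paper's model): the
# time-averaged emitted power scales with the fraction of time the radio is
# actually transmitting (duty cycle), estimated here from traffic volume.

def duty_cycle(frames_per_second, frame_bits, phy_rate_bps):
    """Fraction of airtime spent transmitting, ignoring protocol overheads."""
    return min(1.0, frames_per_second * frame_bits / phy_rate_bps)

def average_emitted_power_mw(tx_power_mw, frames_per_second,
                             frame_bits=12000, phy_rate_bps=54e6):
    return tx_power_mw * duty_cycle(frames_per_second, frame_bits, phy_rate_bps)

# Lightly loaded vs saturated network (100 mW transmitter):
print(average_emitted_power_mw(100, frames_per_second=50))    # low duty cycle
print(average_emitted_power_mw(100, frames_per_second=5000))  # near saturation
```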
Wang, Yingbing; Ebuoma, Lilian; Saksena, Mansi; Liu, Bob; Specht, Michelle; Rafferty, Elizabeth
2014-08-01
Use of mobile digital specimen radiography systems expedites intraoperative verification of excised breast specimens. The purpose of this study was to evaluate the performance of such a system for verifying targets. A retrospective review included 100 consecutive pairs of breast specimen radiographs. Specimens were imaged in the operating room with a mobile digital specimen radiography system and then with a conventional digital mammography system in the radiology department. Two expert reviewers independently scored each image for image quality on a 3-point scale and confidence in target visualization on a 5-point scale. A target was considered confidently verified only if both reviewers declared the target to be confidently detected. The 100 specimens contained a total of 174 targets, including 85 clips (49%), 53 calcifications (30%), 35 masses (20%), and one architectural distortion (1%). Although a significantly higher percentage of mobile digital specimen radiographs were considered poor quality by at least one reviewer (25%) compared with conventional digital mammograms (1%), 169 targets (97%) were confidently verified with mobile specimen radiography; 172 targets (98%) were verified with conventional digital mammography. Three faint masses were not confidently verified with mobile specimen radiography, and conventional digital mammography was needed for confirmation. One faint mass and one architectural distortion were not confidently verified with either method. Mobile digital specimen radiography allows high diagnostic confidence for verification of target excision in breast specimens across target types, despite lower image quality. Substituting this modality for conventional digital mammography can eliminate delays associated with specimen transport, potentially decreasing surgical duration and increasing operating room throughput.
PANIC: A General-purpose Panoramic Near-infrared Camera for the Calar Alto Observatory
NASA Astrophysics Data System (ADS)
Cárdenas Vázquez, M.-C.; Dorner, B.; Huber, A.; Sánchez-Blanco, E.; Alter, M.; Rodríguez Gómez, J. F.; Bizenberger, P.; Naranjo, V.; Ibáñez Mengual, J.-M.; Panduro, J.; García Segura, A. J.; Mall, U.; Fernández, M.; Laun, W.; Ferro Rodríguez, I. M.; Helmling, J.; Terrón, V.; Meisenheimer, K.; Fried, J. W.; Mathar, R. J.; Baumeister, H.; Rohloff, R.-R.; Storz, C.; Verdes-Montenegro, L.; Bouy, H.; Ubierna, M.; Fopp, P.; Funke, B.
2018-02-01
PANIC is the new PAnoramic Near-Infrared Camera for Calar Alto and is a project jointly developed by the MPIA in Heidelberg, Germany, and the IAA in Granada, Spain, for the German-Spanish Astronomical Center at Calar Alto Observatory (CAHA; Almería, Spain). This new instrument works with the 2.2 m and 3.5 m CAHA telescopes, covering a field of view of 30 × 30 arcmin and 15 × 15 arcmin, respectively, with a sampling of 4096 × 4096 pixels. It is designed for the spectral bands from Z to Ks and can also be equipped with narrowband filters. The instrument was delivered to the observatory in 2014 October and was commissioned at both telescopes between 2014 November and 2015 June. Science verification at the 2.2 m telescope was carried out during the second semester of 2015 and the instrument is now in full operation. We describe the design, assembly, integration, and verification process, the final laboratory tests and the PANIC instrument performance. We also present first-light data obtained during the commissioning and preliminary results of the scientific verification. The final optical model and the theoretical performance of the camera were updated according to the as-built data. The laboratory tests were made with a star simulator. Finally, the commissioning phase was done at both telescopes to validate the camera's real performance on sky. The final laboratory tests confirmed the expected camera performance, complying with the scientific requirements. The commissioning phase on sky has been accomplished.
18 CFR 281.213 - Data Verification Committee.
Code of Federal Regulations, 2011 CFR
2011-04-01
... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of... § 281.213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification...
18 CFR 281.213 - Data Verification Committee.
Code of Federal Regulations, 2010 CFR
2010-04-01
... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of... § 281.213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification...
The politics of verification and the control of nuclear tests, 1945-1980
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallagher, N.W.
1990-01-01
This dissertation addresses two questions: (1) why has agreement been reached on verification regimes to support some arms control accords but not others; and (2) what determines the extent to which verification arrangements promote stable cooperation. This study develops an alternative framework for analysis by examining the politics of verification at two levels. The logical politics of verification are shaped by the structure of the problem of evaluating cooperation under semi-anarchical conditions. The practical politics of verification are driven by players' attempts to use verification arguments to promote their desired security outcome. The historical material shows that agreements on verification regimes are reached when key domestic and international players desire an arms control accord and believe that workable verification will not have intolerable costs. Clearer understanding of how verification is itself a political problem, and how players manipulate it to promote other goals, is necessary if the politics of verification are to support rather than undermine the development of stable cooperation.
Square wave voltammetry at the dropping mercury electrode: Experimental
Turner, J.A.; Christie, J.H.; Vukovic, M.; Osteryoung, R.A.
1977-01-01
Experimental verification of earlier theoretical work for square wave voltammetry at the dropping mercury electrode is given. Experiments using ferric oxalate and cadmium(II) in HCl confirm excellent agreement with theory. Experimental peak heights and peak widths are found to be within 2% of calculated results. An example of trace analysis using square wave voltammetry at the DME is presented. The technique is shown to have the same order of sensitivity as differential pulse polarography but is much faster to perform. A detection limit for cadmium in 0.1 M HCl for the system used here was 7 × 10⁻⁸ M.
Experimental verification of the rainbow trapping effect in adiabatic plasmonic gratings
Gan, Qiaoqiang; Gao, Yongkang; Wagner, Kyle; Vezenov, Dmitri; Ding, Yujie J.; Bartoli, Filbert J.
2011-01-01
We report the experimental observation of a trapped rainbow in adiabatically graded metallic gratings, designed to validate theoretical predictions for this unique plasmonic structure. One-dimensional graded nanogratings were fabricated and their surface dispersion properties tailored by varying the grating groove depth, whose dimensions were confirmed by atomic force microscopy. Tunable plasmonic bandgaps were observed experimentally, and direct optical measurements on graded grating structures show that light of different wavelengths in the 500–700-nm region is “trapped” at different positions along the grating, consistent with computer simulations, thus verifying the “rainbow” trapping effect. PMID:21402936
The little ice age as recorded in the stratigraphy of the tropical quelccaya ice cap.
Thompson, L G; Mosley-Thompson, E; Dansgaard, W; Grootes, P M
1986-10-17
The analyses of two ice cores from a southern tropical ice cap provide a record of climatic conditions over 1000 years for a region where other proxy records are nearly absent. Annual variations in visible dust layers, oxygen isotopes, microparticle concentrations, conductivity, and identification of the historical (A.D. 1600) Huaynaputina ash permit accurate dating and time-scale verification. The fact that the Little Ice Age (about A.D. 1500 to 1900) stands out as a significant climatic event in the oxygen isotope and electrical conductivity records confirms the worldwide character of this event.
Highly reliable oxide VCSELs for datacom applications
NASA Astrophysics Data System (ADS)
Aeby, Ian; Collins, Doug; Gibson, Brian; Helms, Christopher J.; Hou, Hong Q.; Lou, Wenlin; Bossert, David J.; Wang, Charlie X.
2003-06-01
In this paper we describe the processes and procedures that have been developed to ensure high reliability for Emcore's 850 nm oxide-confined GaAs VCSELs. Evidence from on-going accelerated life testing and other reliability studies confirming that this process yields reliable products will be discussed. We will present data and analysis techniques used to determine the activation energy and acceleration factors for the dominant wear-out failure mechanisms for our devices as well as our estimated MTTF of greater than 2 million use hours. We conclude with a summary of internal verification and field return rate validation data.
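Accelerated-life studies of this kind typically extrapolate stress-test lifetimes to use conditions with an Arrhenius acceleration factor. The sketch below shows that standard calculation with placeholder numbers; the activation energy, temperatures, and stress-test MTTF are assumed example values, not Emcore data.

```python
# Illustrative Arrhenius calculation (placeholder numbers, not Emcore data):
# extrapolate an accelerated-life-test MTTF at stress temperature to use
# temperature via AF = exp(Ea/k * (1/T_use - 1/T_stress)).
import math

K_BOLTZMANN_EV = 8.617e-5   # eV/K

def acceleration_factor(ea_ev, t_use_c, t_stress_c):
    t_use, t_stress = t_use_c + 273.15, t_stress_c + 273.15
    return math.exp(ea_ev / K_BOLTZMANN_EV * (1.0 / t_use - 1.0 / t_stress))

af = acceleration_factor(ea_ev=0.7, t_use_c=40.0, t_stress_c=150.0)
mttf_use_hours = 20_000 * af    # hypothetical MTTF measured at stress
print(f"AF = {af:.1f}, extrapolated MTTF = {mttf_use_hours:.2e} hours")
```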
Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers
NASA Technical Reports Server (NTRS)
Levitt, K. N.; Schwartz, R.; Hare, D.; Moore, J. S.; Melliar-Smith, P. M.; Shostak, R. E.; Boyer, R. S.; Green, M. W.; Elliott, W. D.
1983-01-01
A number of methodologies for verifying systems and computer based tools that assist users in verifying their systems were developed. These tools were applied to verify in part the SIFT ultrareliable aircraft computer. Topics covered included: STP theorem prover; design verification of SIFT; high level language code verification; assembly language level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.
Integrated Medical Model (IMM) Project Verification, Validation, and Credibility (VVandC)
NASA Technical Reports Server (NTRS)
Walton, M.; Boley, L.; Keenan, L.; Kerstman, E.; Shah, R.; Young, M.; Saile, L.; Garcia, Y.; Meyers, J.; Reyes, D.
2015-01-01
The Integrated Medical Model (IMM) Project supports end user requests by employing the Integrated Medical Evidence Database (iMED) and IMM tools as well as subject matter expertise within the Project. The iMED houses data used by the IMM. The IMM is designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic model based on historical data, cohort data, and subject matter expert opinion. A stochastic approach is taken because deterministic results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation, as well as lay the foundation for external review and application. METHODS: In 2008, the Project team developed a comprehensive verification and validation (VV) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) [1] was published, the Project team updated their verification, validation, and credibility (VVC) project plan to meet 7009 requirements and include 7009 tools in reporting VVC status of the IMM. Construction of these tools included meeting documentation and evidence requirements sufficient to meet external review success criteria. RESULTS: IMM Project VVC updates are compiled recurrently and include updates to the 7009 Compliance and Credibility matrices. Reporting tools have evolved over the lifetime of the IMM Project to better communicate VVC status. This has included refining original 7009 methodology with augmentation from the HRP NASA-STD-7009 Guidance Document working group and the NASA-HDBK-7009 [2]. End user requests and requirements are being satisfied as evidenced by ISS Program acceptance of IMM risk forecasts, transition to an operational model and simulation tool, and completion of service requests from a broad end user consortium including operations, science and technology planning, and exploration planning. IMM v4.0 is slated for operational release in FY2015, and current VVC assessments illustrate the expected VVC status prior to the completion of customer-led external review efforts. CONCLUSIONS: The VVC approach established by the IMM Project of incorporating Project-specific recommended practices and guidelines for implementing the 7009 requirements is comprehensive and includes the involvement of end users at every stage in IMM evolution. Methods and techniques used to quantify the VVC status of the IMM Project represented a critical communication tool in providing clear and concise suitability assessments to IMM customers. These processes have not only received approval from the local NASA community but have also garnered recognition by other federal agencies seeking to develop similar guidelines in the medical modeling community.
NASA Astrophysics Data System (ADS)
Deer, Maria Soledad
The auditory experience of using a hearing aid or a cochlear implant simultaneously with a cell phone is driven by a number of factors. These factors are: radiofrequency and baseband interference, speech intelligibility, sound quality, handset design, volume control and signal strength. The purpose of this study was to develop a tool to be used by hearing aid and cochlear implant users in retail stores as they try cell phones before buying them. This tool is meant to be an efficient, practical and systematic consumer selection tool that will capture and document information on all the domains that play a role in the auditory experience of using a cell phone with a hearing aid or cochlear implant. The development of this consumer tool involved three steps as follows: preparation, verification and measurement of success according to a predefined criterion. First, the consumer tool, consisting of a comparison chart and speech material, was prepared. Second, the consumer tool was evaluated by groups of subjects in a two-step verification process. Phase I was conducted in a controlled setting and it was followed by Phase II which took place in real world (field) conditions. In order to perform a systematic evaluation of the consumer tool two questionnaires were developed: one questionnaire for each phase. Both questionnaires involved five quantitative variables scored with the use of ratings scales. These ratings were averaged yielding an Overall Consumer Performance Score. A qualitative performance category corresponding to the Mean Opinion Score (MOS) was allocated to each final score within a scale ranging from 1 to 5 (where 5 = excellent and 1 = bad). Finally, the consumer tool development was determined to be successful if at least 80% of the participants in verification Phase II rated the comparison chart as excellent or good according to the qualitative MOS score. The results for verification Phase II (field conditions) indicated that the Overall Consumer Performance score for 92% of the subjects (11/12) was 3.7 and above corresponding to Good and Excellent MOS qualitative categories. It was concluded that this is a practical and efficient tool for hearing aid/cochlear implant users as they approach a cell phone selection process.
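The scoring procedure described above averages five rating-scale variables into an Overall Consumer Performance Score and maps it to an MOS category. The snippet below illustrates that arithmetic only; the category cut-offs and the example ratings are assumptions for illustration, not taken from the study.

```python
# Illustrative scoring sketch (category cut-offs are assumed, not taken from
# the study): average five 1-5 ratings into an Overall Consumer Performance
# Score and map it onto MOS-style qualitative categories.

MOS_LABELS = {5: "excellent", 4: "good", 3: "fair", 2: "poor", 1: "bad"}

def overall_score(ratings):
    assert len(ratings) == 5, "five quantitative variables expected"
    return sum(ratings) / len(ratings)

def mos_category(score):
    return MOS_LABELS[max(1, min(5, round(score)))]

ratings = [4, 5, 3, 4, 4]          # hypothetical ratings from one participant
score = overall_score(ratings)
print(score, mos_category(score))  # 4.0 good
```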
40 CFR 1065.920 - PEMS calibrations and verifications.
Code of Federal Regulations, 2014 CFR
2014-07-01
... § 1065.920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all the... verifications and analysis. It may also be necessary to limit the range of conditions under which the PEMS can... additional information or analysis to support your conclusions. (b) Overall verification. This paragraph (b...
This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...
Aerospace Nickel-cadmium Cell Verification
NASA Technical Reports Server (NTRS)
Manzo, Michelle A.; Strawn, D. Michael; Hall, Stephen W.
2001-01-01
During the early years of satellites, NASA successfully flew "NASA-Standard" nickel-cadmium (Ni-Cd) cells manufactured by GE/Gates/SAFT on a variety of spacecraft. In 1992 a NASA Battery Review Board determined that the strategy of a NASA Standard Cell and Battery Specification and the accompanying NASA control of a standard manufacturing control document (MCD) for Ni-Cd cells and batteries was unwarranted. As a result of that determination, standards were abandoned and the use of cells other than the NASA Standard was required. In order to gain insight into the performance and characteristics of the various aerospace Ni-Cd products available, tasks were initiated within the NASA Aerospace Flight Battery Systems Program that involved the procurement and testing of representative aerospace Ni-Cd cell designs. A standard set of test conditions was established in order to provide similar information about the products from various vendors. The objective of this testing was to provide independent verification of representative commercial flight cells available in the marketplace today. This paper will provide a summary of the verification tests run on cells from various manufacturers: Sanyo 35 ampere-hour (Ah) standard and 35 Ah advanced Ni-Cd cells, SAFT 50 Ah Ni-Cd cells, and Eagle-Picher 21 Ah Magnum and 21 Ah Super Ni-Cd™ cells were put through a full evaluation. A limited number of 18 and 55 Ah cells from Acme Electric were also tested to provide an initial evaluation of the Acme aerospace cell designs. Additionally, 35 Ah aerospace-design Ni-MH cells from Sanyo were evaluated under the standard conditions established for this program. The test program is essentially complete. The cell design parameters, the verification test plan and the details of the test results will be discussed.
Formal development of a clock synchronization circuit
NASA Technical Reports Server (NTRS)
Miner, Paul S.
1995-01-01
This talk presents the latest stage in formal development of a fault-tolerant clock synchronization circuit. The development spans from a high level specification of the required properties to a circuit realizing the core function of the system. An abstract description of an algorithm has been verified to satisfy the high-level properties using the mechanical verification system EHDM. This abstract description is recast as a behavioral specification input to the Digital Design Derivation system (DDD) developed at Indiana University. DDD provides a formal design algebra for developing correct digital hardware. Using DDD as the principal design environment, a core circuit implementing the clock synchronization algorithm was developed. The design process consisted of standard DDD transformations augmented with an ad hoc refinement justified using the Prototype Verification System (PVS) from SRI International. Subsequent to the above development, Wilfredo Torres-Pomales discovered an area-efficient realization of the same function. Establishing correctness of this optimization requires reasoning in arithmetic, so a general verification is outside the domain of both DDD transformations and model-checking techniques. DDD represents digital hardware by systems of mutually recursive stream equations. A collection of PVS theories was developed to aid in reasoning about DDD-style streams. These theories include a combinator for defining streams that satisfy stream equations, and a means for proving stream equivalence by exhibiting a stream bisimulation. DDD was used to isolate the sub-system involved in Torres-Pomales' optimization. The equivalence between the original design and the optimized design was verified in PVS by exhibiting a suitable bisimulation. The verification depended upon type constraints on the input streams and made extensive use of the PVS type system. The dependent types in PVS provided a useful mechanism for defining an appropriate bisimulation.
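To make the stream-equation view concrete, the sketch below models two hardware descriptions as streams (generators) and spot-checks that an "optimized" variant produces the same observable stream as the original. This is only an illustrative bounded comparison under assumed toy definitions; the actual equivalence in the work above is established by a bisimulation proof in PVS.

```python
# Conceptual sketch only: DDD-style mutually recursive stream equations can be
# modelled as generators, and equivalence of an original and an "optimized"
# stream can be spot-checked step by step (the paper establishes it properly
# with a bisimulation proof in PVS; this is just a bounded comparison).
from itertools import islice

def counter_original(limit):
    """Stream equation: s(0) = 0, s(n+1) = (s(n) + 1) mod limit."""
    x = 0
    while True:
        yield x
        x = (x + 1) % limit

def counter_optimized(limit):
    """Area-efficient variant: same observable stream, different update rule."""
    x = 0
    while True:
        yield x
        x = 0 if x == limit - 1 else x + 1

def equivalent_up_to(n, a, b):
    return all(x == y for x, y in zip(islice(a, n), islice(b, n)))

print(equivalent_up_to(1000, counter_original(6), counter_optimized(6)))  # True
```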
An evaluation of superminicomputers for thermal analysis
NASA Technical Reports Server (NTRS)
Storaasli, O. O.; Vidal, J. B.; Jones, G. K.
1982-01-01
The use of superminicomputers for solving a series of increasingly complex thermal analysis problems is investigated. The approach involved (1) installation and verification of the SPAR thermal analyzer software on superminicomputers at Langley Research Center and Goddard Space Flight Center, (2) solution of six increasingly complex thermal problems on this equipment, and (3) comparison of solution (accuracy, CPU time, turnaround time, and cost) with solutions on large mainframe computers.
Development and Experimental Verification of Surface Effects in a Fluidic Model
2006-01-01
[List-of-figures fragment: emission spectra from a He plasma inside a polystyrene microchannel; Figure 30: emission spectra from a mixed hexafluoroethylene/He plasma inside the microchannel; Figure 35: adsorption of glucose oxidase to different polymer surfaces, shown to have a significant effect on electroosmotic flow.] The approach involves neglecting non-ideal (convective-diffusive) effects by assuming well-mixed protein in contact with an idealized surface.
Commissary Services: AFSC 612XX and Civilian Equivalent
1992-02-01
Customer Service, Wee-Serve Operations, Storeworker, Commissary Operation Management, and System Verification. Because of this wide dispersion across the... [Table fragment, distribution of members across jobs by group: Personnel 86% / 0 / 0 / 0; Storeworker 0 / 7% / 5% / 0; Quality Assurance Evaluators 0 / * / * / 0; Training Management 0 / * / 2%; Commissary Operation Management 0 / 7% / 46% / 68%; Wee-...] ...skill level members work in the Commissary Operation Management job, while smaller percentages work in the Senior Management job. Continued involvement of
Formal Validation of Aerospace Software
NASA Astrophysics Data System (ADS)
Lesens, David; Moy, Yannick; Kanig, Johannes
2013-08-01
Any single error in critical software can have catastrophic consequences. Even though failures are usually not advertised, some software bugs have become famous, such as the error in the MIM-104 Patriot. For space systems, experience shows that software errors are a serious concern: more than half of all satellite failures from 2000 to 2003 involved software. To address this concern, this paper addresses the use of formal verification of software developed in Ada.
NASA Technical Reports Server (NTRS)
Szuch, J. R.; Seldner, K.; Cwynar, D. S.
1977-01-01
A real time, hybrid computer simulation of a turbofan engine is described. Controls research programs involving that engine are supported by the simulation. The real time simulation is shown to match the steady state and transient performance of the engine over a wide range of flight conditions and power settings. The simulation equations, FORTRAN listing, and analog patching diagrams are included.
Diez, P; Hoskin, P J; Aird, E G A
2007-10-01
This questionnaire forms the basis of the quality assurance (QA) programme for the UK randomized Phase III study of the Stanford V regimen versus ABVD for treatment of advanced Hodgkin's disease, to assess differences between participating centres in treatment planning and delivery of involved-field radiotherapy for Hodgkin's lymphoma. The questionnaire, which was circulated amongst 42 participating centres, consisted of seven sections: target volume definition and dose prescription; critical structures; patient positioning and irradiation techniques; planning; dose calculation; verification; and future developments. The results are based on 25 responses. One-third plan using CT alone, one-third use solely the simulator and the rest individualize, depending on disease site. Eleven centres determine a dose distribution for each patient. Technique depends on disease site and whether CT or simulator planning is employed. Most departments apply isocentric techniques and use immobilization and customized shielding. In vivo dosimetry is performed in 7 centres and treatment verification occurs in 24 hospitals. In conclusion, the planning and delivery of treatment for lymphoma patients varies across the country. Conventional planning is still widespread but most centres are moving to CT-based planning and virtual simulation with extended use of immobilization, customized shielding and compensation.
Meneghetti, Chiara; Labate, Enia; Pazzaglia, Francesca; Hamilton, Colin; Gyselinck, Valérie
2017-05-01
This study examines the involvement of spatial and visual working memory (WM) in the construction of flexible spatial models derived from survey and route descriptions. Sixty young adults listened to environment descriptions, 30 from a survey perspective and the other 30 from a route perspective, while they performed spatial (spatial tapping [ST]) and visual (dynamic visual noise [DVN]) secondary tasks, believed to overload the spatial and visual WM components, respectively, or no secondary task (control, C). Their mental representations of the environment were tested by free recall and a verification test with both route and survey statements. Results showed that, for both recall tasks, accuracy was worse in the ST than in the C or DVN conditions. In the verification test, both ST and DVN decreased accuracy more for sentences testing spatial relations from the opposite perspective to the one learnt than for sentences from the same perspective; only ST had a stronger interference effect than the C condition for sentences from the opposite perspective to the one learnt. Overall, these findings indicate that both visual and spatial WM, and especially the latter, are involved in the construction of perspective-flexible spatial models. © 2016 The British Psychological Society.
30 CFR 250.913 - When must I resubmit Platform Verification Program plans?
Code of Federal Regulations, 2011 CFR
2011-07-01
... CONTINENTAL SHELF Platforms and Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication...
30 CFR 250.909 - What is the Platform Verification Program?
Code of Federal Regulations, 2011 CFR
2011-07-01
... Platforms and Structures Platform Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms...
Requirement Assurance: A Verification Process
NASA Technical Reports Server (NTRS)
Alexander, Michael G.
2011-01-01
Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts", which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A, NASA Systems Engineering Processes and Requirements, with regard to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.
30 CFR 250.909 - What is the Platform Verification Program?
Code of Federal Regulations, 2010 CFR
2010-07-01
... Platform Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms; platforms of a new or unique design...
Property-driven functional verification technique for high-speed vision system-on-chip processor
NASA Astrophysics Data System (ADS)
Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian
2017-04-01
The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This vision chip verification complexity is also related to the fact that in most vision chip design cycles, extensive efforts are focused on how to optimize chip metrics such as performance, power, and area. Design functional verification is not explicitly considered at the earlier stages, at which the soundest decisions are made. In this paper, we propose a semi-automatic property-driven verification technique. The implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can reduce the verification effort by up to 20% for a complex vision chip design while also reducing the simulation and debugging overheads.
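The abstract does not reproduce the authors' property-space formulation, but the basic idea of checking design properties against simulated behaviour can be sketched as follows. This is a minimal illustration only; the signal names, the property, and the trace format are assumptions, not the authors' tool.

```python
# Minimal sketch (not the authors' verification flow): checking a design
# property against a simulation trace of a hypothetical vision-chip pipeline.
# Signal names and the property itself are illustrative assumptions.

def check_property(trace, prop):
    """Return the first cycle at which `prop` fails, or None if it always holds."""
    for cycle, signals in enumerate(trace):
        if not prop(signals):
            return cycle
    return None

# Hypothetical property: the output-valid flag may only be asserted while the
# pixel-processing pipeline is enabled.
def valid_implies_enabled(signals):
    return (not signals["out_valid"]) or signals["pipe_enable"]

trace = [
    {"pipe_enable": False, "out_valid": False},
    {"pipe_enable": True,  "out_valid": False},
    {"pipe_enable": True,  "out_valid": True},
]

failure = check_property(trace, valid_implies_enabled)
print("property holds" if failure is None else f"property fails at cycle {failure}")
```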
Hydrologic data-verification management program plan
Alexander, C.W.
1982-01-01
Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept describing a master data-verification program using multiple special-purpose subroutines, and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
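The kind of screen-file-driven check described above can be illustrated with a short sketch. This is not the USGS implementation; the criteria values, field names, and flag rules are assumptions chosen only to show the idea of range and rate-of-change screening.

```python
# Minimal sketch (not the USGS routines): screening hydrologic records against
# a "screen file" of verification criteria before release to users.
# Criteria values and field names are illustrative assumptions.

screen_criteria = {
    "stage_ft":      {"min": 0.0, "max": 30.0,    "max_step": 2.0},
    "discharge_cfs": {"min": 0.0, "max": 50000.0, "max_step": 5000.0},
}

def screen_series(name, values, criteria):
    """Flag out-of-range values and implausibly large jumps between samples."""
    c = criteria[name]
    flags = []
    for i, v in enumerate(values):
        if not (c["min"] <= v <= c["max"]):
            flags.append((i, v, "out of range"))
        elif i > 0 and abs(v - values[i - 1]) > c["max_step"]:
            flags.append((i, v, "rate-of-change exceeded"))
    return flags

print(screen_series("stage_ft", [1.2, 1.3, 9.9, 10.1, -0.5], screen_criteria))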
NASA Astrophysics Data System (ADS)
Noufal, Manthala Padannayil; Abdullah, Kallikuzhiyil Kochunny; Niyas, Puzhakkal; Subha, Pallimanhayil Abdul Raheem
2017-12-01
Aim: This study evaluates the impacts of using different evaluation criteria on gamma pass rates in two commercially available QA methods employed for the verification of VMAT plans using different hypothetical planning target volumes (PTVs) and anatomical regions. Introduction: Volumetric modulated arc therapy (VMAT) is a widely accepted technique to deliver highly conformal treatment in a very efficient manner. As their level of complexity is high in comparison to intensity-modulated radiotherapy (IMRT), the implementation of stringent quality assurance (QA) before treatment delivery is of paramount importance. Material and Methods: Two sets of VMAT plans were generated using Eclipse planning systems, one with five different complex hypothetical three-dimensional PTVs and one including three anatomical regions. The verification of these plans was performed using a MatriXX ionization chamber array embedded inside a MultiCube phantom and a Varian EPID dosimetric system attached to a Clinac iX. The plans were evaluated based on the 3%/3 mm, 2%/2 mm, and 1%/1 mm global gamma criteria and with three low-dose threshold values (0%, 10%, and 20%). Results: The gamma pass rates were above 95% in all VMAT plans when the 3%/3 mm gamma criterion was used and no threshold was applied. In both systems, the pass rates decreased as the criteria became stricter. Higher pass rates were observed when no threshold was applied, and they tended to decrease for 10% and 20% thresholds. Conclusion: The results confirm the suitability of the equipment used and the validity of the plans. The study also confirmed that the threshold settings greatly affect the gamma pass rates, especially for lower gamma criteria.
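The global gamma evaluation with a dose-difference criterion, a distance-to-agreement criterion, and a low-dose threshold can be sketched in a few lines. This is a brute-force simplification for illustration only, not the commercial QA software used in the study; the grid spacing, search window, and test doses are assumptions.

```python
# Minimal sketch (not the commercial QA software): brute-force global gamma
# index on a 2D dose grid with a dose-difference criterion (% of max dose),
# a distance-to-agreement criterion (mm), and a low-dose threshold.
import numpy as np

def gamma_pass_rate(ref, eval_, spacing_mm, dd_pct, dta_mm, threshold_pct):
    dmax = ref.max()
    dd = dd_pct / 100.0 * dmax
    search = int(np.ceil(3 * dta_mm / spacing_mm))        # local search window
    ny, nx = ref.shape
    passed = total = 0
    for iy in range(ny):
        for ix in range(nx):
            if ref[iy, ix] < threshold_pct / 100.0 * dmax:
                continue                                   # below low-dose threshold
            total += 1
            best = np.inf
            for jy in range(max(0, iy - search), min(ny, iy + search + 1)):
                for jx in range(max(0, ix - search), min(nx, ix + search + 1)):
                    r2 = ((jy - iy) ** 2 + (jx - ix) ** 2) * spacing_mm ** 2
                    d2 = (eval_[jy, jx] - ref[iy, ix]) ** 2
                    best = min(best, r2 / dta_mm ** 2 + d2 / dd ** 2)
            passed += best <= 1.0                          # gamma <= 1 passes
    return 100.0 * passed / total if total else float("nan")

ref = np.random.default_rng(0).uniform(0.0, 2.0, (40, 40))      # mock planned dose
eval_ = ref + np.random.default_rng(1).normal(0.0, 0.02, ref.shape)  # mock measured dose
print(gamma_pass_rate(ref, eval_, spacing_mm=1.0, dd_pct=3, dta_mm=3, threshold_pct=10))
```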
SU-F-T-469: A Clinically Observed Discrepancy Between Image-Based and Log- Based MLC Position
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neal, B; Ahmed, M; Siebers, J
2016-06-15
Purpose: To present a clinical case which challenges the base assumption of log-file based QA, by showing that the actual position of a MLC leaf can suddenly deviate from its programmed and logged position by >1 mm as observed with real-time imaging. Methods: An EPID-based exit-fluence dosimetry system designed to prevent gross delivery errors was used in cine mode to capture portal images during treatment. Visual monitoring identified an anomalous MLC leaf pair gap not otherwise detected by the automatic position verification. The position of the erred leaf was measured on EPID images and log files were analyzed for the treatment in question, the prior day's treatment, and for daily MLC test patterns acquired on those treatment days. Additional standard test patterns were used to quantify the leaf position. Results: Whereas the log file reported no difference between planned and recorded positions, image-based measurements showed the leaf to be 1.3±0.1 mm medial from the planned position. This offset was confirmed with the test pattern irradiations. Conclusion: It has been clinically observed that log-file derived leaf positions can differ from their actual positions by >1 mm, and therefore cannot be considered to be the actual leaf positions. This cautions the use of log-based methods for MLC or patient quality assurance without independent confirmation of log integrity. Frequent verification of MLC positions through independent means is a necessary precondition to trusting log file records. Intra-treatment EPID imaging provides a method to capture departures from MLC planned positions. Work was supported in part by Varian Medical Systems.
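The independent cross-check advocated here amounts to comparing image-measured leaf positions with log-reported positions and flagging deviations beyond an action level. The sketch below is illustrative only; leaf labels, positions, and the 1 mm tolerance are assumptions, not data from the case report.

```python
# Minimal sketch (not the clinical system): flag leaves whose image-based
# position differs from the log-file position by more than a tolerance.

def flag_leaf_discrepancies(logged_mm, measured_mm, tol_mm=1.0):
    """Return (leaf, logged, measured, difference) for leaves beyond tolerance."""
    out = []
    for leaf, logged in logged_mm.items():
        diff = measured_mm[leaf] - logged
        if abs(diff) > tol_mm:
            out.append((leaf, logged, measured_mm[leaf], diff))
    return out

logged = {"A23": -12.0, "A24": -12.0, "B23": 15.0}
measured = {"A23": -13.3, "A24": -12.1, "B23": 15.2}   # A23 is 1.3 mm off
print(flag_leaf_discrepancies(logged, measured))
```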
Lodge, Keri-Michèle; Milnes, David; Gilbody, Simon M
2011-03-01
Background Identifying patients with learning disabilities within primary care is central to initiatives for improving the health of this population. UK general practitioners (GPs) receive additional income for maintaining registers of patients with learning disabilities as part of the Quality and Outcomes Framework (QOF), and may opt to provide Directed Enhanced Services (DES), which requires practices to maintain registers of patients with moderate or severe learning disabilities and offer them annual health checks.Objectives This paper describes the development of a register of patients with moderate or severe learning disabilities at one UK general practice.Methods A Read code search of one UK general practice's electronic medical records was conducted in order to identify patients with learning disabilities. Confirmation of diagnoses was sought by scrutinising records and GP verification. Cross-referencing with the practice QOF register of patients with learning disabilities of any severity, and the local authority's list of clients with learning disabilities, was performed.Results Of 15 001 patients, 229 (1.5%) were identified by the Read code search as possibly having learning disabilities. Scrutiny of records and GP verification confirmed 64 had learning disabilities and 24 did not, but the presence or absence of learning disability remained unclear in 141 cases. Cross-referencing with the QOF register (n=81) and local authority list (n=49) revealed little overlap.Conclusion Identifying learning disability and assessing its severity are tasks GPs may be unfamiliar with, and relying on Read code searches may result in under-detection. Further research is needed to define optimum strategies for identifying, cross-referencing and validating practice-based registers of patients with learning disabilities.
SU-E-J-34: Setup Accuracy in Spine SBRT Using CBCT 6D Image Guidance in Comparison with 6D ExacTrac
DOE Office of Scientific and Technical Information (OSTI.GOV)
Han, Z; Yip, S; Lewis, J
2015-06-15
Purpose: Volumetric information of the spine captured on CBCT can potentially improve the accuracy in spine SBRT setup that has been commonly performed through 2D radiographs. This work evaluates the setup accuracy in spine SBRT using 6D CBCT image guidance that recently became available on Varian systems. Methods: ExacTrac radiographs have been commonly used for spine SBRT setup. The setup process involves first positioning patients with lasers followed by localization imaging, registration, and repositioning. Verification images are then taken providing the residual errors (ExacTracRE) before beam on. CBCT verification is also acquired in our institute. The availability of both ExacTrac and CBCT verifications allows a comparison study. 41 verification CBCT of 16 patients were retrospectively registered with the planning CT enabling 6D corrections, giving CBCT residual errors (CBCTRE) which were compared with ExacTracRE. Results: The RMS discrepancies between CBCTRE and ExacTracRE are 1.70mm, 1.66mm, 1.56mm in vertical, longitudinal and lateral directions and 0.27°, 0.49°, 0.35° in yaw, roll and pitch respectively. The corresponding mean discrepancies (and standard deviation) are 0.62mm (1.60mm), 0.00mm (1.68mm), −0.80mm (1.36mm) and 0.05° (0.58°), 0.11° (0.48°), −0.16° (0.32°). Of the 41 CBCT, 17 had high-Z surgical implants. No significant difference in ExacTrac-to-CBCT discrepancy was observed between patients with and without the implants. Conclusion: Multiple factors can contribute to the discrepancies between CBCT and ExacTrac: 1) the imaging iso-centers of the two systems, while calibrated to coincide, can be different; 2) the ROI used for registration can be different especially if ribs were included in ExacTrac images; 3) small patient motion can occur between the two verification image acquisitions; 4) the algorithms can be different between CBCT (volumetric) and ExacTrac (radiographic) registrations.
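The RMS, mean, and standard deviation figures quoted above come directly from per-fraction differences between the two systems' residual errors. A minimal sketch of that summary step is shown below; the sample values are invented for illustration, not the study's data.

```python
# Minimal sketch (not the clinical analysis): summarizing per-fraction
# CBCT-minus-ExacTrac residual setup differences for one axis.
import numpy as np

def summarize(diffs_mm):
    d = np.asarray(diffs_mm, dtype=float)
    return {"rms": float(np.sqrt(np.mean(d ** 2))),
            "mean": float(d.mean()),
            "sd": float(d.std(ddof=1))}

vertical_diffs = [0.4, -1.1, 2.0, 0.7, -0.3, 1.5]   # CBCT minus ExacTrac, mm (invented)
print(summarize(vertical_diffs))
```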
Contributions to the study of the mechanisms of photodynamic cross-linking of proteins
NASA Astrophysics Data System (ADS)
Shen, Hui-Rong
The illumination of proteins in solution and in cells in the presence of photosensitizers may lead to the inter- and/or intramolecular crosslinking of the proteins (photosensitized or photodynamic crosslinking). This phenomenon appears to be involved in the photohemolysis of red cells, cataract development, skin photoaging, photodynamic therapy for cancers, laser welding of tissues, biomaterial modification, and other biological situations. Although the processes involved in the photocrosslinking of proteins have been extensively studied, the mechanisms involved are still largely unknown. The main objectives of the studies reported in this dissertation were to investigate the detailed mechanisms involved in the photocrosslinking of proteins and to determine the chemical nature of the crosslinks formed. The first part of this study was devoted to the verification of the roles of His, Lys and Tyr in the photodynamic crosslinking of proteins. The crosslinking reaction was modeled using tailor-made water-soluble synthetic N-(2-hydroxypropyl)methacrylamide (HPMA) copolymers containing epsilon-aminocaproic acid side chains terminating in His, Lys or tyrosinamide residues photosensitized by rose bengal (RB) and flavin mononucleotide (FMN). RB typically produces singlet oxygen, whereas FMN produces both singlet oxygen and radicals. His-His and His-Lys crosslinks were formed with RB as the sensitizer. RB-sensitization did not crosslink Tyr residues, whereas FMN coupled two Tyr residues via a radical pathway. Protection of the His and/or Lys residues in ribonuclease A (RNase A) significantly inhibited the extent of intermolecular crosslinking, and confirmed the key roles played by His and Lys in crosslinking reactions. The second part of this study involved the elucidation of the detailed reaction mechanisms and the chemical structures of His-His and Tyr-Tyr crosslinks. N-benzoyl-histidine (Bz-His) and N-acetyl-tyrosine (Ac-Tyr) were used to model the photosensitized crosslinking of proteins involving His and Tyr residues. Photocrosslinking of Bz-His was performed in phosphate buffer at pH 7.4 with immobilized RB beads as sensitizer. The main dimerized product was isolated and characterized. Its chemical structure was established by MS and NMR methods. Ac-Tyr was photocrosslinked with FMN as the sensitizer at pH 6.0; oxygen was necessary. Three main crosslinked dimers were obtained. Their chemical structures were determined by MS and NMR data.
STELLAR: fast and exact local alignments
2011-01-01
Background Large-scale comparison of genomic sequences requires reliable tools for the search of local alignments. Practical local aligners are in general fast, but heuristic, and hence sometimes miss significant matches. Results We present here the local pairwise aligner STELLAR that has full sensitivity for ε-alignments, i.e. guarantees to report all local alignments of a given minimal length and maximal error rate. The aligner is composed of two steps, filtering and verification. We apply the SWIFT algorithm for lossless filtering, and have developed a new verification strategy that we prove to be exact. Our results on simulated and real genomic data confirm and quantify the conjecture that heuristic tools like BLAST or BLAT miss a large percentage of significant local alignments. Conclusions STELLAR is very practical and fast on very long sequences which makes it a suitable new tool for finding local alignments between genomic sequences under the edit distance model. Binaries are freely available for Linux, Windows, and Mac OS X at http://www.seqan.de/projects/stellar. The source code is freely distributed with the SeqAn C++ library version 1.3 and later at http://www.seqan.de. PMID:22151882
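The verification step described above guarantees that every reported match is a true ε-alignment, i.e. long enough and with an error rate no larger than ε. STELLAR's own exact verification strategy is not reproduced here; the sketch below only illustrates the acceptance condition itself, with sequences and thresholds chosen as assumptions.

```python
# Minimal sketch (not the STELLAR algorithm): check that a candidate local
# match qualifies as an eps-alignment (minimal length, bounded edit-distance
# error rate). Sequences and thresholds are illustrative assumptions.

def edit_distance(a, b):
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

def is_eps_alignment(seg_a, seg_b, min_len=50, eps=0.05):
    length = max(len(seg_a), len(seg_b))
    return length >= min_len and edit_distance(seg_a, seg_b) <= eps * length

print(is_eps_alignment("ACGT" * 15, "ACGT" * 14 + "ACGA", min_len=50, eps=0.05))
```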
NASA Astrophysics Data System (ADS)
Czirjak, Daniel
2017-04-01
Remote sensing platforms have consistently demonstrated the ability to detect, and in some cases identify, specific targets of interest, and photovoltaic solar panels are shown to have a unique spectral signature that is consistent across multiple manufacturers and construction methods. Solar panels are shown to be detectable in hyperspectral imagery using common statistical target detection methods such as the adaptive cosine estimator, and false alarms can be mitigated through the use of a spectral verification process that eliminates pixels that do not have the key spectral features of the photovoltaic solar panel reflectance spectrum. The normalized solar panel index is described and is a key component in the false-alarm mitigation process. After spectral verification, these solar panel arrays are confirmed on openly available literal imagery and can be measured using numerous open-source algorithms and tools. The measurements allow for the assessment of overall solar power generation capacity using an equation that accounts for solar insolation, the area of the solar panels, and the panels' efficiency in converting solar energy to power. Using a known location with readily available information, the methods outlined in this paper estimate the power generation capabilities within 6% of the rated power.
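The capacity estimate described in the last step is essentially insolation times detected panel area times conversion efficiency. A minimal worked example is given below; the numeric inputs are assumptions for illustration, not values from the paper.

```python
# Minimal sketch of the capacity estimate described above: rated power from
# insolation, detected panel area, and panel efficiency. Inputs are invented.

def estimated_power_w(insolation_w_per_m2, panel_area_m2, efficiency):
    """Power generation capacity = insolation x area x conversion efficiency."""
    return insolation_w_per_m2 * panel_area_m2 * efficiency

# e.g. 1000 W/m^2 reference insolation, 120 m^2 of detected panels, 18% efficiency
print(estimated_power_w(1000.0, 120.0, 0.18))   # 21600 W, i.e. about 21.6 kW
```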
Formal Verification of a Power Controller Using the Real-Time Model Checker UPPAAL
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Larsen, Kim Guldstrand; Skou, Arne
1999-01-01
A real-time system for power-down control in audio/video components is modeled and verified using the real-time model checker UPPAAL. The system is supposed to reside in an audio/video component and control (read from and write to) links to neighbor audio/video components such as TV, VCR and remote-control. In particular, the system is responsible for the powering up and down of the component in between the arrival of data, and in order to do so in a safe way without loss of data, it is essential that no link interrupts are lost. Hence, a component system is a multitasking system with hard real-time requirements, and we present techniques for modeling time consumption in such a multitasked, prioritized system. The work has been carried out in a collaboration between Aalborg University and the audio/video company B&O. By modeling the system, 3 design errors were identified and corrected, and the following verification confirmed the validity of the design but also revealed the necessity for an upper limit of the interrupt frequency. The resulting design has been implemented and it is going to be incorporated as part of a new product line.
NASA Astrophysics Data System (ADS)
Smoczek, Jaroslaw
2015-10-01
The paper deals with the problem of reducing residual vibration and limiting transient oscillations of a flexible, underactuated system under varying operating conditions. A comparative study of generalized predictive control (GPC) and a fuzzy scheduling scheme, developed on the basis of P1-TS fuzzy theory, a local pole placement method, and interval analysis of closed-loop system polynomial coefficients, is applied to the problem of flexible crane control. Two alternatives of the GPC-based method are proposed that allow the technique to be realized either with or without a payload deflection sensor. The first control technique is based on the recursive least squares (RLS) method, applied to estimate on-line the parameters of a linear parameter-varying (LPV) model of the crane dynamics. The second GPC-based approach uses payload deflection feedback estimated with a pendulum model whose parameters are interpolated using the P1-TS fuzzy system. Feasibility and applicability of the developed methods were confirmed through experimental verification performed on a laboratory-scale overhead crane.
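The on-line identification step mentioned above relies on the standard recursive least squares update. The sketch below shows that generic update only; the regressor layout, forgetting factor, and test system are assumptions and do not reproduce the authors' crane model.

```python
# Minimal sketch (not the authors' controller): one step of recursive least
# squares (RLS) for on-line estimation of a linear model y_k = phi_k^T theta.
import numpy as np

def rls_step(theta, P, phi, y, lam=0.98):
    """Update parameter estimate `theta` and covariance `P` with one sample."""
    phi = phi.reshape(-1, 1)
    K = P @ phi / (lam + float(phi.T @ P @ phi))      # gain vector
    err = y - float(phi.T @ theta)                     # prediction error
    theta = theta + K * err
    P = (P - K @ phi.T @ P) / lam                      # forgetting-factor update
    return theta, P

theta = np.zeros((2, 1))
P = np.eye(2) * 1000.0
for k in range(100):                                   # identify y = 2*u1 - 0.5*u2
    phi = np.random.default_rng(k).uniform(-1, 1, 2)
    y = 2.0 * phi[0] - 0.5 * phi[1]
    theta, P = rls_step(theta, P, phi, y)
print(theta.ravel())                                   # approaches [2.0, -0.5]
```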
NASA Astrophysics Data System (ADS)
Hönicke, Philipp; Kolbe, Michael; Müller, Matthias; Mantler, Michael; Krämer, Markus; Beckhoff, Burkhard
2014-10-01
An experimental method for the verification of the individually different energy dependencies of L1-, L2-, and L3- subshell photoionization cross sections is described. The results obtained for Pd and Mo are well in line with theory regarding both energy dependency and absolute values, and confirm the theoretically calculated cross sections by Scofield from the early 1970s and, partially, more recent data by Trzhaskovskaya, Nefedov, and Yarzhemsky. The data also demonstrate the questionability of quantitative x-ray spectroscopic results based on the widely used fixed jump ratio approximated cross sections with energy independent ratios. The experiments are carried out by employing the radiometrically calibrated instrumentation of the Physikalisch-Technische Bundesanstalt at the electron storage ring BESSY II in Berlin; the obtained fluorescent intensities are thereby calibrated at an absolute level in reference to the International System of Units. Experimentally determined fixed fluorescence line ratios for each subshell are used for a reliable deconvolution of overlapping fluorescence lines. The relevant fundamental parameters of Mo and Pd are also determined experimentally in order to calculate the subshell photoionization cross sections independently of any database.
NASA Technical Reports Server (NTRS)
Windley, P. J.
1991-01-01
In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, are one tool for increasing computer dependability in the face of an exponentially increasing testing effort.
An automatic agricultural zone classification procedure for crop inventory satellite images
NASA Technical Reports Server (NTRS)
Parada, N. D. J. (Principal Investigator); Kux, H. J.; Velasco, F. R. D.; Deoliveira, M. O. B.
1982-01-01
A classification procedure for assessing the areal proportion of crops in multispectral scanner images is discussed. The procedure is divided into four parts: labeling, classification, proportion estimation, and evaluation. It also has the following characteristics: multitemporal classification, the need for only minimal field information, and a verification capability between automatic classification and analyst labeling. The processing steps and the main algorithms involved are discussed. An outlook on the future of this technology is also presented.
Testing of the line element of special relativity with rotating systems
NASA Technical Reports Server (NTRS)
Vargas, Jose G.; Torr, Douglas G.
1989-01-01
Experiments with rotating systems are examined from the point of view of a test theory of the Lorentz transformations (LTs), permitting, in principle, the verification of the simultaneity relation. The significance of the experiments involved in the testing of the LTs can be determined using Robertson's test theory (RTT). A revised RTT is discussed, and attention is given to the Ehrenfest paradox in connection with the testing of the LTs.
Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)
NASA Astrophysics Data System (ADS)
Selvy, Brian M.; Claver, Charles; Angeli, George
2014-08-01
This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
NASA Astrophysics Data System (ADS)
Houtz, Derek Anderson
Microwave radiometers allow remote sensing of earth and atmospheric temperatures from space, anytime, anywhere, through clouds, and in the dark. Data from microwave radiometers are high-impact operational inputs to weather forecasts, and are used to provide a vast array of climate data products including land and sea surface temperatures, soil moisture, ocean salinity, cloud precipitation and moisture height profiles, and even wind speed and direction, to name a few. Space-borne microwave radiometers have a major weakness when it comes to long-term climate trends due to their lack of traceability. Because there is no standard, or absolute reference, for microwave brightness temperature, nationally or internationally, individual instruments must each rely on their own internal calibration source to set an absolute reference to the fundamental unit of Kelvin. This causes each subsequent instrument to have a calibration offset and there is no 'true' reference. The work introduced in this thesis addresses this vacancy by proposing and introducing a NIST microwave brightness temperature source that may act as the primary reference. The NIST standard will allow pre-launch calibration of radiometers across a broad range of remote sensing pertinent frequencies between 18 GHz and 220 GHz. The blackbody will be capable of reaching temperatures ranging between liquid nitrogen boiling at approximately 77 K and warm-target temperature of 350 K. The brightness temperature of the source has associated standard uncertainty ranging as a function of frequency between 0.084 K and 0.111 K. The standard can be transferred to the calibration source in the instrument, providing traceability of all subsequent measurements back to the primary standard. The development of the NIST standard source involved predicting and measuring its brightness temperature, and minimizing the associated uncertainty of this quantity. Uniform and constant physical temperature along with well characterized and maximized emissivity are fundamental to a well characterized blackbody. The chosen geometry is a microwave absorber coated copper cone. Electromagnetic and thermal simulations are introduced to optimize the design. Experimental verifications of the simulated quantities confirm the predicted performance of the blackbody.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dartevelle, Sebastian
2007-10-01
Large-scale volcanic eruptions are hazardous events that cannot be described by detailed and accurate in situ measurement: hence, little to no real-time data exists to rigorously validate current computer models of these events. In addition, such phenomenology involves highly complex, nonlinear, and unsteady physical behaviors upon many spatial and time scales. As a result, volcanic explosive phenomenology is poorly understood in terms of its physics, and inadequately constrained in terms of initial, boundary, and inflow conditions. Nevertheless, code verification and validation become even more critical because more and more volcanologists use numerical data for assessment and mitigation of volcanic hazards. In this report, we evaluate the process of model and code development in the context of geophysical multiphase flows. We describe: (1) the conception of a theoretical, multiphase, Navier-Stokes model, (2) its implementation into a numerical code, (3) the verification of the code, and (4) the validation of such a model within the context of turbulent and underexpanded jet physics. Within the validation framework, we suggest focusing on the key physics that control the volcanic clouds—namely, momentum-driven supersonic jet and buoyancy-driven turbulent plume. For instance, we propose to compare numerical results against a set of simple and well-constrained analog experiments, which uniquely and unambiguously represent each of the key phenomenology.
Robotic control and inspection verification
NASA Technical Reports Server (NTRS)
Davis, Virgil Leon
1991-01-01
Three areas of possible commercialization involving robots at the Kennedy Space Center (KSC) are discussed: a six degree-of-freedom target tracking system for remote umbilical operations; an intelligent torque sensing end effector for operating hand valves in hazardous locations; and an automatic radiator inspection device, a 13 by 65 foot robotic mechanism involving completely redundant motors, drives, and controls. Aspects concerning the first two innovations can be integrated to enable robots or teleoperators to perform tasks involving orientation and panel actuation operations that can be done with existing technology rather than waiting for telerobots to incorporate artificial intelligence (AI) to perform 'smart' autonomous operations. The third robot involves the application of complete control hardware redundancy to enable performance of work over and near expensive Space Shuttle hardware. The consumer marketplace may wish to explore commercialization of similar component redundancy techniques for applications when a robot would not normally be used because of reliability concerns.
Self-verification motives at the collective level of self-definition.
Chen, Serena; Chen, Karen Y; Shaw, Lindsay
2004-01-01
Three studies examined self-verification motives in relation to collective aspects of the self. Several moderators of collective self-verification were also examined--namely, the certainty with which collective self-views are held, the nature of one's ties to a source of self-verification, the salience of the collective self, and the importance of group identification. Evidence for collective self-verification emerged across all studies, particularly when collective self-views were held with high certainty (Studies 1 and 2), perceivers were somehow tied to the source of self-verification (Study 1), the collective self was salient (Study 2), and group identification was important (Study 3). To the authors' knowledge, these studies are the first to examine self-verification at the collective level of self-definition. The parallel and distinct ways in which self-verification processes may operate at different levels of self-definition are discussed.
Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems
NASA Technical Reports Server (NTRS)
Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)
2003-01-01
Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.
Hatherly, K E; Smylie, J C; Rodger, A; Dally, M J; Davis, S R; Millar, J L
2001-01-01
At the William Buckland Radiotherapy Center (WBRC), field-only electronic portal image (EPI) hard copies are used for radiation treatment field verification for whole brain, breast, chest, spine, and large pelvic fields, as determined by a previous study. A subsequent research project, addressing the quality of double exposed EPI hard copies for sites where field only EPI was not considered adequate to determine field placement, has been undertaken. The double exposed EPI hard copies were compared to conventional double exposed port films for small pelvic, partial brain, and head and neck fields and for a miscellaneous group. All double exposed EPIs were captured during routine clinical procedures using liquid ion chamber cassettes. EPI hard copies were generated using a Visiplex multi-format camera. In sites where port film remained the preferred verification format, the port films were generated as per department protocol. In addition EPIs were collected specifically for this project. Four radiation oncologists performed the evaluation of EPI and port film images independently with a questionnaire completed at each stage of the evaluation process to assess the following: adequacy of information in the image to assess field placement; adequacy of information for determining field placement correction; and the clinician's preferred choice of imaging for field placement assessment. The results indicate that double exposed EPI hard copies generally do contain sufficient information to permit evaluation of field placement and can replace conventional double exposed port films in a significant number of sites. These include the following: pelvis fields < 12 x 12 cm, partial brain fields, and a miscellaneous group. However, for radical head and neck fields, the preferred verification image format remained port film due to the image hard copy size and improved contrast for this media. Thus in this department hard copy EPI is the preferred modality of field verification for all sites except radical head and neck treatments. This should result in an increase in efficiency of workload management and patient care.
Verification Testing: Meet User Needs Figure of Merit
NASA Technical Reports Server (NTRS)
Kelly, Bryan W.; Welch, Bryan W.
2017-01-01
Verification is the process through which Modeling and Simulation (M&S) software goes to ensure that it has been rigorously tested and debugged for its intended use. Validation confirms that said software accurately models and represents the real world system. Credibility gives an assessment of the development and testing effort that the software has gone through as well as how accurate and reliable test results are. Together, these three components form Verification, Validation, and Credibility (VV&C), the process by which all NASA modeling software is to be tested to ensure that it is ready for implementation. NASA created this process following the CAIB (Columbia Accident Investigation Board) report seeking to understand the reasons the Columbia space shuttle failed during reentry. The report's conclusion was that the accident was fully avoidable; however, among other issues, the necessary data to make an informed decision was not there, and the result was complete loss of the shuttle and crew. In an effort to mitigate this problem, NASA put out their Standard for Models and Simulations, currently in version NASA-STD-7009A, in which they detailed their recommendations, requirements and rationale for the different components of VV&C. They did this with the intention that it would allow people receiving M&S software to clearly understand and have data from the past development effort. This in turn would allow people who had not worked with the M&S software before to move forward with greater confidence and efficiency in their work. This particular project looks to perform Verification on several MATLAB (Registered Trademark) (The MathWorks, Inc.) scripts that will be later implemented in a website interface. It seeks to take note of and define the limits of operation, the units and significance, and the expected datatype and format of the inputs and outputs of each of the scripts. This is intended to prevent the code from attempting to make incorrect or impossible calculations. Additionally, this project will look at the coding generally and note inconsistencies, redundancies, and other aspects that may become problematic or slow down the code's run time. Scripts lacking documentation will also be commented and cataloged.
Image Hashes as Templates for Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Janik, Tadeusz; Jarman, Kenneth D.; Robinson, Sean M.
2012-07-17
Imaging systems can provide measurements that confidently assess characteristics of nuclear weapons and dismantled weapon components, and such assessment will be needed in future verification for arms control. Yet imaging is often viewed as too intrusive, raising concern about the ability to protect sensitive information. In particular, the prospect of using image-based templates for verifying the presence or absence of a warhead, or of the declared configuration of fissile material in storage, may be rejected out-of-hand as being too vulnerable to violation of information barrier (IB) principles. Development of a rigorous approach for generating and comparing reduced-information templates from images, and assessing the security, sensitivity, and robustness of verification using such templates, are needed to address these concerns. We discuss our efforts to develop such a rigorous approach based on a combination of image-feature extraction and encryption-utilizing hash functions to confirm proffered declarations, providing strong classified data security while maintaining high confidence for verification. The proposed work is focused on developing secure, robust, tamper-sensitive and automatic techniques that may enable the comparison of non-sensitive hashed image data outside an IB. It is rooted in research on so-called perceptual hash functions for image comparison, at the interface of signal/image processing, pattern recognition, cryptography, and information theory. Such perceptual or robust image hashing—which, strictly speaking, is not truly cryptographic hashing—has extensive application in content authentication and information retrieval, database search, and security assurance. Applying and extending the principles of perceptual hashing to imaging for arms control, we propose techniques that are sensitive to altering, forging and tampering of the imaged object yet robust and tolerant to content-preserving image distortions and noise. Ensuring that the information contained in the hashed image data (available out-of-IB) cannot be used to extract sensitive information about the imaged object is of primary concern. Thus the techniques are characterized by high unpredictability to guarantee security. We will present an assessment of the performance of our techniques with respect to security, sensitivity and robustness on the basis of a methodical and mathematically precise framework.
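The general flavour of perceptual hashing can be shown with a very simple "average hash" plus Hamming-distance comparison. This sketch is emphatically not the authors' technique (their approach must additionally meet security and information-barrier requirements); the block size, threshold rule, and test images are assumptions for illustration only.

```python
# Minimal sketch (not the proposed technique): a simple perceptual "average
# hash" and Hamming-distance comparison between a declared and a measured image.
import numpy as np

def average_hash(image, blocks=8):
    """Downsample the image to blocks x blocks block means, threshold at the mean."""
    img = np.asarray(image, dtype=float)
    bh, bw = img.shape[0] // blocks, img.shape[1] // blocks
    small = img[: blocks * bh, : blocks * bw].reshape(blocks, bh, blocks, bw).mean(axis=(1, 3))
    return (small > small.mean()).flatten()

def hamming(h1, h2):
    return int(np.count_nonzero(h1 != h2))

rng = np.random.default_rng(0)
declared = rng.uniform(0, 1, (64, 64))
measured = declared + rng.normal(0, 0.01, declared.shape)   # content-preserving noise
tampered = rng.uniform(0, 1, (64, 64))                      # a different object

print(hamming(average_hash(declared), average_hash(measured)))   # small distance
print(hamming(average_hash(declared), average_hash(tampered)))   # large distance
```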
The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of in...
A study to define and verify a model of interactive-constructive elementary school science teaching
NASA Astrophysics Data System (ADS)
Henriques, Laura
This study took place within a four year systemic reform effort collaboratively undertaken by the Science Education Center at the University of Iowa and a local school district. Key features of the inservice project included the use of children's literature as a springboard into inquiry based science investigations, activities to increase parents' involvement in children's science learning and extensive inservice opportunities for elementary teachers to increase content knowledge and content-pedagogical knowledge. The overarching goal of this elementary science teacher enhancement project was to move teachers towards an interactive-constructivist model of teaching and learning. This study had three components. The first was the definition of the prototype teacher indicated by the project's goals and supported by science education research. The second involved the generation of a model to show relationships between teacher-generated products, demographics and their subsequent teaching behaviors. The third involved the verification of the hypothesized model using data collected on 15 original participants. Demographic information, survey responses, interview and written responses to scenarios were among the data collected as source variables. These were scored using a rubric designed to measure constructivist practices in science teaching. Videotapes of science teaching and revised science curricula were collected as downstream variables and scored using the ESTEEM observational rubric and a rubric developed for the project. Results indicate that newer teachers were more likely to implement features of the project. Those teachers who were philosophically aligned with project goals before project involvement were also more likely to implement features of the project. Other associations between reported beliefs, planning and classroom implementations were not confirmed by these data. Data show that teachers reported higher levels of implementation than their classroom teaching indicated. Qualitative analysis indicated teachers who were more likely to implement the goals of this project were flexible, spontaneous, and able to give students more choices and responsibilities for their own learning. These teachers routinely used children's ideas and interests to guide instruction. Recommendations for future inservice are included. Discussion centers around elements of time, teacher input, teacher reflection, teachers as leaders and leaders' modeling of advocated practices.
Roodbeen, Ruud T J; Schelleman-Offermans, Karen; Lemmens, Paul H H M
2016-06-01
Age limits are effective in reducing alcohol- and tobacco-related harm; however, their effectiveness depends on the extent to which they are complied with. This study aimed to investigate the effectiveness of different age verification systems (AVSs) implemented by 400 Dutch supermarkets on requesting a valid age verification (ID) and on sellers' compliance. A mixed method design was used. Compliance was measured by 800 alcohol and tobacco purchase attempts by 17-year-old mystery shoppers. To analyze the effectiveness of AVSs, logistic regression analyses were performed. Insight into facilitating and hindering factors in the purchase process was obtained by 13 interviews with supermarket managers. Only a tendency toward a positive effect of the presence of the keying-in-date-of-birth AVS or ID swiper/checker on ID requests was found for both alcohol and tobacco purchase attempts. The use of the keying-in-date-of-birth AVS or ID swiper/checker significantly increased the odds of compliance after an ID was requested, for both alcohol and tobacco purchase attempts. Managers indicated that ID requests and compliance could be facilitated by providing cashiers with sufficient managerial support, technical support, and regular training about the purchase process and use of the AVS. The use of AVSs that calculate and confirm for cashiers whether a customer has reached the legal purchase age significantly increases the odds that cashiers comply with the age limits for alcohol and tobacco. Future research should gain insight into how usage of effective AVSs can be improved and explore the feasibility of implementation and effectiveness in other outlets. Copyright © 2016 The Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
Contextual Variation, Familiarity, Academic Literacy, and Rural Adolescents' Idiom Knowledge.
Qualls, Constance Dean; O'Brien, Rose M; Blood, Gordon W; Hammer, Carol Scheffner
2003-01-01
The paucity of data on idiom development in adolescents, particularly rural adolescents, limits the ability of speech-language pathologists and educators to test and teach idioms appropriately in this population. This study was designed to delineate the interrelationships between context, familiarity, and academic literacy relative to rural adolescents' idiom knowledge. Ninety-five rural eighth graders (M age=13.4 years) were quasi-randomly assigned to complete the Idiom Comprehension Test (Qualls & Harris, 1999) in one of three contexts: idioms in a short story (n=25), idioms in isolation (n=32), and idioms in a verification task (n=38). For all conditions, the identical 24 idioms (8 each of high, moderate, and low familiarity; Nippold & Rudzinski, 1993) were presented. For a subset (N=54) of the students, reading and language arts scores from the California Achievement Tests (5th ed., 1993), a standardized achievement test, were correlated with performance on the idiom test. Performance in the story condition and on high-familiarity idioms showed the greatest accuracy. For the isolation and verification conditions, context interacted with familiarity. Associations existed between idiom performance and reading ability and idiom performance and language literacy, but only for the story and verification conditions. High-proficiency readers showed the greatest idiom accuracy. The results support the notion that context facilitates idiom comprehension for rural adolescents, and that idiom testing should consider not only context, but idiom familiarity as well. Thus, local norms should be established. Findings also confirm that good readers are better at comprehending idioms, likely resulting from enriched vocabulary obtained through reading. These normative data indicate what might be expected when testing idiom knowledge in adolescents with language impairments.
42 CFR 457.380 - Eligibility verification.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Requirements: Eligibility, Screening, Applications, and Enrollment § 457.380 Eligibility verification. (a) The ... State may establish reasonable eligibility verification mechanisms to promote enrollment of eligible...
The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program�s goal is to further environmental protection by a...
Turbulence Modeling Verification and Validation
NASA Technical Reports Server (NTRS)
Rumsey, Christopher L.
2014-01-01
Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important steps in the process. Verification insures that the CFD code is solving the equations as intended (no errors in the implementation). This is typically done either through the method of manufactured solutions (MMS) or through careful step-by-step comparisons with other verified codes. After the verification step is concluded, validation is performed to document the ability of the turbulence model to represent different types of flow physics. Validation can involve a large number of test case comparisons with experiments, theory, or DNS. Organized workshops have proved to be valuable resources for the turbulence modeling community in its pursuit of turbulence modeling verification and validation. Workshop contributors using different CFD codes run the same cases, often according to strict guidelines, and compare results. Through these comparisons, it is often possible to (1) identify codes that have likely implementation errors, and (2) gain insight into the capabilities and shortcomings of different turbulence models to predict the flow physics associated with particular types of flows. These are valuable lessons because they help bring consistency to CFD codes by encouraging the correction of faulty programming and facilitating the adoption of better models. They also sometimes point to specific areas needed for improvement in the models. In this paper, several recent workshops are summarized primarily from the point of view of turbulence modeling verification and validation. 
Furthermore, the NASA Langley Turbulence Modeling Resource website is described. The purpose of this site is to provide a central location where RANS turbulence models are documented, and test cases, grids, and data are provided. The goal of this paper is to provide an abbreviated survey of turbulence modeling verification and validation efforts, summarize some of the outcomes, and give some ideas for future endeavors in this area.
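The method of manufactured solutions mentioned above can be illustrated on a toy problem: pick an analytic solution, derive its source term, solve on successively refined grids, and check that the observed order of accuracy matches the formal order of the scheme. The sketch below uses a 1D Poisson solver rather than a RANS code, purely as an illustration of the verification idea.

```python
# Minimal sketch of the method of manufactured solutions (MMS) for code
# verification, applied to a toy 1D Poisson solver: choose u(x) = sin(pi x),
# derive the source term analytically, solve on two grids, and confirm the
# observed order of accuracy is near the formal order (2).
import numpy as np

def solve_poisson(n):
    """Solve -u'' = f on (0,1) with u(0)=u(1)=0 using second-order central differences."""
    x = np.linspace(0.0, 1.0, n + 1)
    h = x[1] - x[0]
    f = np.pi ** 2 * np.sin(np.pi * x[1:-1])          # manufactured source for u = sin(pi x)
    A = (np.diag(2.0 * np.ones(n - 1)) - np.diag(np.ones(n - 2), 1)
         - np.diag(np.ones(n - 2), -1)) / h ** 2
    u = np.linalg.solve(A, f)
    return np.max(np.abs(u - np.sin(np.pi * x[1:-1])))  # discretization error (max norm)

e_coarse, e_fine = solve_poisson(32), solve_poisson(64)
print("observed order of accuracy:", np.log2(e_coarse / e_fine))   # close to 2
```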
Optimization of Magnet Arrangement in Double-Layer Interior Permanent-Magnet Motors
NASA Astrophysics Data System (ADS)
Yamazaki, Katsumi; Kitayuguchi, Kazuya
The arrangement of permanent magnets in double-layer interior permanent-magnet motors is optimized for variable-speed applications. First, the arrangement of magnets is decided by automatic optimization. Next, the superiority of the optimized motor is discussed using the d- and q-axis equivalent circuits that consider the magnetic saturation of the rotor core. Finally, experimental verification is carried out by using a prototype motor. It is confirmed that the maximum torque of the optimized motor under both low-speed and high-speed conditions is higher than that of conventional motors because of its relatively large q-axis inductance and small d-axis inductance.
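The role of the inductance difference can be seen from the standard dq-frame torque expression for an interior PM machine, in which the reluctance term (Ld − Lq)·id·iq adds torque when Lq > Ld and id is driven negative. The numeric parameter values below are illustrative assumptions, not the prototype motor's data.

```python
# Minimal sketch of the standard dq-frame torque expression for an interior
# PM machine; all parameter values are illustrative assumptions.

def ipm_torque(pole_pairs, psi_m, L_d, L_q, i_d, i_q):
    """Electromagnetic torque T = 1.5*p*(psi_m*iq + (Ld - Lq)*id*iq)."""
    return 1.5 * pole_pairs * (psi_m * i_q + (L_d - L_q) * i_d * i_q)

# With Lq > Ld, injecting negative id contributes positive reluctance torque.
print(ipm_torque(pole_pairs=3, psi_m=0.08, L_d=0.002, L_q=0.006, i_d=-40.0, i_q=60.0))
```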
Experimental verification and simulation of negative index of refraction using Snell's law.
Parazzoli, C G; Greegor, R B; Li, K; Koltenbah, B E C; Tanielian, M
2003-03-14
We report the results of a Snell's law experiment on a negative index of refraction material in free space from 12.6 to 13.2 GHz. Numerical simulations using Maxwell's equations solvers show good agreement with the experimental results, confirming the existence of negative index of refraction materials. The index of refraction is a function of frequency. At 12.6 GHz we measure and compute the real part of the index of refraction to be -1.05. The measurements and simulations of the electromagnetic field profiles were performed at distances of 14λ and 28λ from the sample; the fields were also computed at 100λ.
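A short worked example of Snell's law with the measured index shows why a negative index produces a refracted ray on the same side of the normal as the incident ray. The 30-degree incidence angle is an assumption for illustration; only the index value -1.05 comes from the abstract.

```python
# Minimal worked example of Snell's law with a negative refractive index.
import math

def refraction_angle_deg(n1, n2, theta1_deg):
    """Solve n1*sin(theta1) = n2*sin(theta2) for theta2 (in degrees)."""
    return math.degrees(math.asin(n1 * math.sin(math.radians(theta1_deg)) / n2))

print(refraction_angle_deg(1.0, -1.05, 30.0))   # about -28.4 degrees: negative refraction
```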
Sequence verification as quality-control step for production of cDNA microarrays.
Taylor, E; Cogdell, D; Coombes, K; Hu, L; Ramdas, L; Tabor, A; Hamilton, S; Zhang, W
2001-07-01
To generate cDNA arrays in our core laboratory, we amplified about 2300 PCR products from a human, sequence-verified cDNA clone library. As a quality-control step, we sequenced the PCR products immediately before printing. The sequence information was used to search the GenBank database to confirm the identities. Although these clones were previously sequence verified by the company, we found that only 79% of the clones matched the original database after handling. Our experience strongly indicates the necessity to sequence verify the clones at the final stage before printing on microarray slides and to modify the gene list accordingly.
[Verification of Learning Effects by Team-based Learning].
Ono, Shin-Ichi; Ito, Yoshihisa; Ishige, Kumiko; Inokuchi, Norio; Kosuge, Yasuhiro; Asami, Satoru; Izumisawa, Megumi; Kobayashi, Hiroko; Hayashi, Hiroyuki; Suzuki, Takashi; Kishikawa, Yukinaga; Hata, Harumi; Kose, Eiji; Tabata, Kei-Ichi
2017-11-01
It has been recommended that active learning methods, such as team-based learning (TBL) and problem-based learning (PBL), be introduced into university classes by the Central Council for Education. As such, for the past 3 years, we have implemented TBL in a medical therapeutics course for fourth-year students. Based upon our experience, TBL is characterized as follows: TBL needs fewer teachers than PBL to conduct a TBL module. TBL enables both students and teachers to recognize and confirm the learning results from preparation and reviewing. TBL develops students' responsibility for themselves and their teams, and likely facilitates learning activities through peer assessment.
NASA Astrophysics Data System (ADS)
Yamaji, Minoru; Oshima, Juro; Hidaka, Motohiko
2009-06-01
Evidence for the coupled electron/proton transfer mechanism of the phenolic H-atom transfer between triplet π,π* 3,3'-carbonylbis(7-diethylaminocoumarin) (CBC) and phenol derivatives is obtained by using laser photolysis techniques. It was confirmed that the quenching rate constants of triplet CBC by phenols having positive Hammett constants do not follow the Rehm-Weller equation for electron transfer, while those by phenols with negative Hammett constants do. From the viewpoint of thermodynamic parameters for electron transfer, the crucial factors for phenolic H-atom transfer to the π,π* triplet are discussed.
Simulation Credibility: Advances in Verification, Validation, and Uncertainty Quantification
NASA Technical Reports Server (NTRS)
Mehta, Unmeel B. (Editor); Eklund, Dean R.; Romero, Vicente J.; Pearce, Jeffrey A.; Keim, Nicholas S.
2016-01-01
Decision makers and other users of simulations need to know quantified simulation credibility to make simulation-based critical decisions and effectively use simulations, respectively. The credibility of a simulation is quantified by its accuracy in terms of uncertainty, and the responsibility of establishing credibility lies with the creator of the simulation. In this volume, we present some state-of-the-art philosophies, principles, and frameworks. The contributing authors involved in this publication have been dedicated to advancing simulation credibility. They detail and provide examples of key advances over the last 10 years in the processes used to quantify simulation credibility: verification, validation, and uncertainty quantification. The philosophies and assessment methods presented here are anticipated to be useful to other technical communities conducting continuum physics-based simulations; for example, issues related to the establishment of simulation credibility in the discipline of propulsion are discussed. We envision that simulation creators will find this volume very useful to guide and assist them in quantitatively conveying the credibility of their simulations.
Verification of Numerical Programs: From Real Numbers to Floating Point Numbers
NASA Technical Reports Server (NTRS)
Goodloe, Alwyn E.; Munoz, Cesar; Kirchner, Florent; Correnson, Loiec
2013-01-01
Numerical algorithms lie at the heart of many safety-critical aerospace systems. The complexity and hybrid nature of these systems often requires the use of interactive theorem provers to verify that these algorithms are logically correct. Usually, proofs involving numerical computations are conducted in the infinitely precise realm of the field of real numbers. However, numerical computations in these algorithms are often implemented using floating point numbers. The use of a finite representation of real numbers introduces uncertainties as to whether the properties verified in the theoretical setting hold in practice. This short paper describes work in progress aimed at addressing these concerns. Given a formally proven algorithm, written in the Program Verification System (PVS), the Frama-C suite of tools is used to identify sufficient conditions and verify that under such conditions the rounding errors arising in a C implementation of the algorithm do not affect its correctness. The technique is illustrated using an algorithm for detecting loss of separation among aircraft.
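A minimal illustration of the real-versus-floating-point gap discussed above (not the PVS/Frama-C workflow itself): a property that holds over the reals, associativity of addition, fails in binary64, and a compensated sum differs from a naive one:

```python
import math

# Associativity holds for real numbers but not for binary64 floating point.
lhs = (0.1 + 0.2) + 0.3
rhs = 0.1 + (0.2 + 0.3)
print(lhs, rhs, lhs == rhs)     # 0.6000000000000001 0.6 False

# A naive sum accumulates rounding error that a compensated sum avoids.
xs = [0.1] * 10
print(sum(xs), math.fsum(xs))   # 0.9999999999999999 1.0
```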
Applications of a Fast Neutron Detector System to Verification of Special Nuclear Materials
NASA Astrophysics Data System (ADS)
Mayo, Douglas R.; Byrd, Roger C.; Ensslin, Norbert; Krick, Merlyn S.; Mercer, David J.; Miller, Michael C.; Prettyman, Thomas H.; Russo, Phyllis A.
1998-04-01
An array of boron-loaded plastic optically coupled to bismuth germanate scintillators has been developed to detect neutrons for measurement of special nuclear materials. The phoswich detection system has the advantage of high neutron detection efficiency and short die-away time. This is achieved by mixing the moderator (plastic) and the detector (^10B) at the molecular level. Simulations indicate that the neutron capture probabilities equal or exceed those of current thermal neutron multiplicity techniques, which keep the moderator (polyethylene) and detectors (^3He gas proportional tubes) macroscopically separate. Experiments have been performed to characterize the response of these detectors and validate computer simulations. The fast neutron detection system may be applied to the quantitative assay of plutonium in high (α,n) backgrounds, with emphasis on safeguards and environmental scenarios. Additional applications of the instrument, in a non-quantitative mode, have been tested for possible verification activities involving dismantlement of nuclear weapons. A description of the detector system, simulations, and preliminary data will be presented.
Safeguards by Design Challenge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alwin, Jennifer Louise
The International Atomic Energy Agency (IAEA) defines Safeguards as a system of inspection and verification of the peaceful uses of nuclear materials as part of the Nuclear Nonproliferation Treaty. IAEA oversees safeguards worldwide. Safeguards by Design (SBD) involves incorporation of safeguards technologies, techniques, and instrumentation during the design phase of a facility, rather than after the fact. Design challenge goals are the following: Design a system of safeguards technologies, techniques, and instrumentation for inspection and verification of the peaceful uses of nuclear materials. Cost should be minimized to work with the IAEA's limited budget. Dose to workers should always be as low as reasonably achievable (ALARA). Time is of the essence in operating facilities and flow of material should not be interrupted significantly. Proprietary process information in facilities may need to be protected, thus the amount of information obtained by inspectors should be the minimum required to achieve the measurement goal. Then three different design challenges are detailed: Plutonium Waste Item Measurement System, Marine-based Modular Reactor, and Floating Nuclear Power Plant (FNPP).
Cognitive neuroscience in forensic science: understanding and utilizing the human element
Dror, Itiel E.
2015-01-01
The human element plays a critical role in forensic science. It is not limited only to issues relating to forensic decision-making, such as bias, but also relates to most aspects of forensic work (some of which even take place before a crime is ever committed or long after the verification of the forensic conclusion). In this paper, I explicate many aspects of forensic work that involve the human element and therefore show the relevance (and potential contribution) of cognitive neuroscience to forensic science. The 10 aspects covered in this paper are proactive forensic science, selection during recruitment, training, crime scene investigation, forensic decision-making, verification and conflict resolution, reporting, the role of the forensic examiner, presentation in court and judicial decisions. As the forensic community is taking on the challenges introduced by the realization that the human element is critical for forensic work, new opportunities emerge that allow for considerable improvement and enhancement of the forensic science endeavour. PMID:26101281
Cognitive neuroscience in forensic science: understanding and utilizing the human element.
Dror, Itiel E
2015-08-05
The human element plays a critical role in forensic science. It is not limited only to issues relating to forensic decision-making, such as bias, but also relates to most aspects of forensic work (some of which even take place before a crime is ever committed or long after the verification of the forensic conclusion). In this paper, I explicate many aspects of forensic work that involve the human element and therefore show the relevance (and potential contribution) of cognitive neuroscience to forensic science. The 10 aspects covered in this paper are proactive forensic science, selection during recruitment, training, crime scene investigation, forensic decision-making, verification and conflict resolution, reporting, the role of the forensic examiner, presentation in court and judicial decisions. As the forensic community is taking on the challenges introduced by the realization that the human element is critical for forensic work, new opportunities emerge that allow for considerable improvement and enhancement of the forensic science endeavour. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
Learning Assumptions for Compositional Verification
NASA Technical Reports Server (NTRS)
Cobleigh, Jamieson M.; Giannakopoulou, Dimitra; Pasareanu, Corina; Clancy, Daniel (Technical Monitor)
2002-01-01
Compositional verification is a promising approach to addressing the state explosion problem associated with model checking. One compositional technique advocates proving properties of a system by checking properties of its components in an assume-guarantee style. However, the application of this technique is difficult because it involves non-trivial human input. This paper presents a novel framework for performing assume-guarantee reasoning in an incremental and fully automated fashion. To check a component against a property, our approach generates assumptions that the environment needs to satisfy for the property to hold. These assumptions are then discharged on the rest of the system. Assumptions are computed by a learning algorithm. They are initially approximate, but become gradually more precise by means of counterexamples obtained by model checking the component and its environment, alternately. This iterative process may at any stage conclude that the property is either true or false in the system. We have implemented our approach in the LTSA tool and applied it to the analysis of a NASA system.
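The assume-guarantee rule underlying this framework can be stated as follows, where ⟨A⟩ M ⟨P⟩ means that whenever component M operates in an environment satisfying assumption A, property P holds (this is the standard non-circular rule; the paper's exact notation may differ):

```latex
\frac{\langle A\rangle\, M_1\, \langle P\rangle \qquad \langle \mathit{true}\rangle\, M_2\, \langle A\rangle}
     {\langle \mathit{true}\rangle\, M_1 \parallel M_2\, \langle P\rangle}
```

The learning algorithm iteratively refines A: if the first premise fails, the counterexample strengthens the assumption; if the second fails, the counterexample either weakens the assumption or witnesses a genuine violation of P in the composed system.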
EPA has created the Environmental Technology Verification Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The Air Pollution Control Technology Verification Center, a cente...
40 CFR 1066.240 - Torque transducer verification.
Code of Federal Regulations, 2014 CFR
2014-07-01
... POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification. Verify torque-measurement systems by performing the verifications described in §§ 1066.270 and... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Torque transducer verification. 1066...
NASA Astrophysics Data System (ADS)
Kumar, Gokula; Norhafizah, I.; Shazril, I.; Nursyatina, AR; Aziz, MZ Abdul; Zin, Hafiz M.; Zakir, MK; Norjayadi; Norliza, AS; Ismail, A.; Khairun, N.
2017-05-01
This case report describes the complex radical 3D-conformal radiotherapy treatment planning, dosimetric issues, and outcome of definitive treatment of unresectable carcinoma of the vulva in a 42-year-old woman. The patient presented with a large fungating mass of the vulva, confirmed on biopsy as keratinizing squamous cell carcinoma. Further staging investigation revealed locally advanced disease (T4) with bilateral inguinal lymph node involvement; there was no systemic metastasis or intra-pelvic nodal disease. The patient was seen by the Gynae-Oncology team and the disease was deemed unresectable without significant morbidity. She was treated to a total dose of 64.8 Gy in 36 fractions over 7 weeks with concurrent weekly cisplatin in 2 phases. A 3D-conformal radiotherapy technique using the modified segmental boost technique (MSBT; large PA and small AP photon fields with inguinal electron matching) was used. TLD chips were used for in-vivo dose verification in phases 1 and 2 of the treatment. At completion of the planned radiotherapy, the patient had a complete clinical response, grade 2-3 skin toxicity, grade 2 rectal toxicity, and grade 2 dysuria. Vulval squamous cell carcinomas are very radiosensitive tumours, and the skills of the treating radiation oncologist, dosimetrists, physicist, radiation therapists and nurses are of foremost importance in ensuring good clinical outcomes.
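As a quick arithmetic check of the prescription quoted above (a trivial sketch, nothing from the report beyond the quoted numbers):

```python
total_dose_gy, fractions, weeks = 64.8, 36, 7
print(total_dose_gy / fractions)  # 1.8 Gy per fraction
print(fractions / weeks)          # ~5.1 fractions per week, i.e. a conventional near-daily schedule
```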
Hatten, James R.; Parsley, Michael; Barton, Gary; Batt, Thomas; Fosness, Ryan L.
2018-01-01
A study was conducted to identify habitat characteristics associated with age 0+ White Sturgeon (Acipenser transmontanus Richardson, 1863) recruitment in three reaches of the Columbia River Basin: Skamania reach (consistent recruitment), John Day reach (intermittent/inconsistent recruitment), and Kootenai reach (no recruitment). Our modeling approach involved numerous steps. First, we collected information about substrate, embeddedness, and hydrodynamics in each reach. Second, we developed a set of spatially explicit predictor variables. Third, we built two habitat (probability) models with Skamania reach training data where White Sturgeon recruitment was consistent. Fourth, we created spawning maps of each reach by populating the habitat models with in-reach physical metrics (substrate, embeddedness, and hydrodynamics). Fifth, we examined model accuracy by overlaying spawning locations in Skamania and Kootenai reaches with habitat predictions obtained from probability models. Sixth, we simulated how predicted habitat changed in each reach after manipulating physical conditions to more closely match Skamania reach. Model verification confirmed White Sturgeon generally spawned in locations with higher model probabilities in Skamania and Kootenai reaches, indicating the utility of extrapolating the models. Model simulations revealed significant gains in White Sturgeon habitat in all reaches when spring flow increased, gravel/cobble composition increased, or embeddedness decreased. The habitat models appear well suited to assist managers when identifying reach-specific factors limiting White Sturgeon recruitment in the Columbia River Basin or throughout its range.
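A minimal sketch of a habitat-probability model in the spirit described above; the predictor names (depth, velocity, embeddedness), the synthetic "spawning observed" rule, and all values are hypothetical, not the study's actual variables or training data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical predictors: depth (m), velocity (m/s), embeddedness (fraction).
rng = np.random.default_rng(0)
X = rng.uniform([1.0, 0.1, 0.0], [20.0, 3.0, 1.0], size=(200, 3))
y = ((X[:, 1] > 1.5) & (X[:, 2] < 0.4)).astype(int)    # synthetic "spawning observed" rule

model = LogisticRegression().fit(X, y)                 # habitat (probability) model
print(model.predict_proba([[12.0, 2.0, 0.2]])[0, 1])   # predicted habitat probability for one cell
```

Populating such a fitted model with reach-specific physical metrics, as the study does, then amounts to evaluating predict_proba over every grid cell of a reach.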
Learning the Rules of the Game
NASA Astrophysics Data System (ADS)
Smith, Donald A.
2018-03-01
Games have often been used in the classroom to teach physics ideas and concepts, but there has been less published on games that can be used to teach scientific thinking. D. Maloney and M. Masters describe an activity in which students attempt to infer rules to a game from a history of moves, but the students don't actually play the game. Giving the list of moves allows the instructor to emphasize the important fact that nature usually gives us incomplete data sets, but it does make the activity less immersive. E. Kimmel suggested letting students attempt to figure out the rules to Reversi by playing it, but this game only has two players, which makes it difficult to apply in a classroom setting. Kimmel himself admits the choice of Reversi is somewhat arbitrary. There are games, however, that are designed to make the process of figuring out the rules an integral aspect of play. These games involve more people and require only a deck or two of cards. I present here an activity constructed around the card game Mao, which can be used to help students recognize aspects of scientific thinking. The game is particularly good at illustrating the importance of falsification tests (questions designed to elicit a negative answer) over verification tests (examples that confirm what is already suspected) for illuminating the underlying rules.
Park, Seong Ho; Han, Kyunghwa
2018-03-01
The use of artificial intelligence in medicine is currently an issue of great interest, especially with regard to the diagnostic or predictive analysis of medical images. Adoption of an artificial intelligence tool in clinical practice requires careful confirmation of its clinical utility. Herein, the authors explain key methodology points involved in a clinical evaluation of artificial intelligence technology for use in medicine, especially high-dimensional or overparameterized diagnostic or predictive models in which artificial deep neural networks are used, mainly from the standpoints of clinical epidemiology and biostatistics. First, statistical methods for assessing the discrimination and calibration performances of a diagnostic or predictive model are summarized. Next, the effects of disease manifestation spectrum and disease prevalence on the performance results are explained. This is followed by a discussion of the difference between evaluating performance with internal and external datasets, the importance of using an adequate external dataset obtained from a well-defined clinical cohort to avoid overestimating clinical performance as a result of overfitting in high-dimensional or overparameterized classification models and spectrum bias, and the essentials for achieving a more robust clinical evaluation. Finally, the authors review the role of clinical trials and observational outcome studies for ultimate clinical verification of diagnostic or predictive artificial intelligence tools through patient outcomes, beyond performance metrics, and how to design such studies. © RSNA, 2018.
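A minimal sketch of the two performance aspects named above, discrimination and calibration, using scikit-learn; the outcome labels and predicted probabilities below are synthetic stand-ins for a validation dataset, not data from any model discussed in the article:

```python
import numpy as np
from sklearn.metrics import roc_auc_score
from sklearn.calibration import calibration_curve

# Synthetic stand-in for an external validation set.
rng = np.random.default_rng(1)
y_true = rng.integers(0, 2, 500)                      # observed outcomes
p_hat = 0.3 * y_true + 0.35 + 0.2 * rng.random(500)   # model's predicted probabilities

print("discrimination (AUC):", roc_auc_score(y_true, p_hat))
frac_pos, mean_pred = calibration_curve(y_true, p_hat, n_bins=5)
print("calibration (observed vs predicted per bin):",
      list(zip(frac_pos.round(2), mean_pred.round(2))))
```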
48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.
Code of Federal Regulations, 2010 CFR
2010-10-01
... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...
48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.
Code of Federal Regulations, 2012 CFR
2012-10-01
... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...
48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.
Code of Federal Regulations, 2011 CFR
2011-10-01
... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...
48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.
Code of Federal Regulations, 2013 CFR
2013-10-01
... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...
48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.
Code of Federal Regulations, 2014 CFR
2014-10-01
... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...
Joint ETV/NOWATECH verification protocol for the Sorbisense GSW40 passive sampler
Environmental technology verification (ETV) is an independent (third party) assessment of the performance of a technology or a product for a specified application, under defined conditions and quality assurance. This verification is a joint verification with the US EPA ETV schem...
The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...
Multi-canister overpack project -- verification and validation, MCNP 4A
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldmann, L.H.
This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software requires that it be compiled specifically for the machine it is to be used on. Therefore, to facilitate ease in the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed the software checks for verification by performing a file comparison on the new output file and the old output file. Any differences between any of the files will cause a verification error. Due to the manner in which the verification is completed, a verification error does not necessarily indicate a problem. This indicates that a closer look at the output files is needed to determine the cause of the error.
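A minimal sketch of the file-comparison step described above (not the package's actual script); the file names are placeholders:

```python
import difflib
import pathlib

# Placeholder file names; any difference flags a potential verification error
# that needs a closer look at the output files.
new = pathlib.Path("sample01.out.new").read_text().splitlines()
ref = pathlib.Path("sample01.out.ref").read_text().splitlines()
diff = list(difflib.unified_diff(ref, new, fromfile="reference", tofile="new", lineterm=""))
print("verification OK" if not diff else "\n".join(diff))
```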
What makes a hospital manager competent at the middle and senior levels?
Liang, Zhanming; Leggat, Sandra G; Howard, Peter F; Koh, Lee
2013-11-01
The purpose of this paper is to confirm the core competencies required for middle to senior level managers in Victorian public hospitals in both metropolitan and regional/rural areas. This exploratory mixed-methods study used a three-step approach which included position description content analysis, focus group discussions, and an online competency verification and identification survey. The study validated a number of key tasks required for senior and middle level hospital managers (levels II, III and IV) and identified and confirmed the essential competencies for completing these key tasks effectively. As a result, six core competencies have been confirmed as common to the II, III and IV management levels in both the Melbourne metropolitan and regional/rural areas. These six core competencies, required for middle to senior level managers in public hospitals, provide guidance for the further development of the competency-based educational approach for training the current management workforce and preparing future health service managers. With the detailed descriptions of the six core competencies, healthcare organisations and training institutions will be able to assess the competency gaps and managerial training needs of current health service managers and develop training programs accordingly.
2007-01-15
it can detect specifically proscribed content changes to critical files (e.g., illegal shells inserted into /etc/passwd). Fourth, it can detect the...UNIX password management involves a pair of inter-related files (/etc/passwd and /etc/shadow). The corresponding access patterns seen at the storage...content integrity verification is utilized. As a concrete example, consider a UNIX system password file (/etc/passwd), which consists of a set of well
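Illustrative only, and not the storage-level detector described in the excerpt: the kind of content-integrity verification mentioned for a critical file such as /etc/passwd can be sketched as a hash comparison against a recorded baseline; the baseline file name is a placeholder:

```python
import hashlib
import json
import pathlib

BASELINE = pathlib.Path("passwd.baseline.json")  # assumed location of recorded hashes

def sha256_of(path: str) -> str:
    return hashlib.sha256(pathlib.Path(path).read_bytes()).hexdigest()

def unchanged(path: str = "/etc/passwd") -> bool:
    baseline = json.loads(BASELINE.read_text())  # e.g. {"/etc/passwd": "<hex digest>"}
    return sha256_of(path) == baseline.get(path)

# unchanged() returning False would indicate an unexpected content change.
```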
Assessment of the first radiances received from the VISSR Atmospheric Sounder (VAS) instrument
NASA Technical Reports Server (NTRS)
Chesters, D.; Uccellini, L. W.; Montgomery, H.; Mostek, A.; Robinson, W.
1981-01-01
The first orderly, calibrated radiances from the VAS-D instrument on the GOES-4 satellite are examined for: image quality, radiometric precision, radiation transfer verification at clear air radiosonde sites, regression retrieval accuracy, and mesoscale analysis features. Postlaunch problems involving calibration and data processing irregularities of scientific or operational significance are included. The radiances provide good visual and relative radiometric data for empirically conditioned retrievals of mesoscale temperature and moisture fields in clear air.
NASA Technical Reports Server (NTRS)
Landano, M. R.; Easter, R. W.
1984-01-01
Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements lead to a number of issues and uncertainties which require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.
Verification of S&D Solutions for Network Communications and Devices
NASA Astrophysics Data System (ADS)
Rudolph, Carsten; Compagna, Luca; Carbone, Roberto; Muñoz, Antonio; Repp, Jürgen
This chapter describes the tool-supported verification of S&D Solutions on the level of network communications and devices. First, the general goals and challenges of verification in the context of AmI systems are highlighted and the role of verification and validation within the SERENITY processes is explained. Then, SERENITY extensions to the SH Verification tool are explained using small examples. Finally, the applicability of existing verification tools is discussed in the context of the AVISPA toolset. The two different tools show that, for the security analysis of network- and device-level S&D Patterns, relevant complementary approaches exist and can be used.
Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk
NASA Technical Reports Server (NTRS)
Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.
2014-01-01
The projected impact of compositional verification research conducted by the National Aeronautics and Space Administration System-Wide Safety and Assurance Technologies on aviation safety risk was assessed. Software and compositional verification was described. Traditional verification techniques have two major problems: testing occurs at the prototype stage, where error discovery can be quite costly, and it is impossible to test all potential interactions, leaving some errors undetected until the system is used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution to addressing increasingly larger and more complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, 1 research action, 5 operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.
78 FR 27882 - VA Veteran-Owned Small Business (VOSB) Verification Guidelines
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-13
... Verification Self-Assessment Tool that walks the veteran through the regulation and how it applies to the...) Verification Guidelines AGENCY: Department of Veterans Affairs. ACTION: Advanced notice of proposed rulemaking... regulations governing the Department of Veterans Affairs (VA) Veteran-Owned Small Business (VOSB) Verification...
78 FR 32010 - Pipeline Safety: Public Workshop on Integrity Verification Process
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-28
.... PHMSA-2013-0119] Pipeline Safety: Public Workshop on Integrity Verification Process AGENCY: Pipeline and... announcing a public workshop to be held on the concept of ``Integrity Verification Process.'' The Integrity Verification Process shares similar characteristics with fitness for service processes. At this workshop, the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
P.C. Weaver
2008-06-12
Conduct verification surveys of available grids at the DWI 1630 in Knoxville, Tennessee. A representative with the Independent Environmental Assessment and Verification (IEAV) team from ORISE conducted a verification survey of a partial area within Grid E9.
40 CFR 1065.395 - Inertial PM balance verifications.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by the...
40 CFR 1065.395 - Inertial PM balance verifications.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by the...
22 CFR 123.14 - Import certificate/delivery verification procedure.
Code of Federal Regulations, 2010 CFR
2010-04-01
... REGULATIONS LICENSES FOR THE EXPORT OF DEFENSE ARTICLES § 123.14 Import certificate/delivery verification procedure. (a) The Import Certificate/Delivery Verification Procedure is designed to assure that a commodity... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Import certificate/delivery verification...
22 CFR 123.14 - Import certificate/delivery verification procedure.
Code of Federal Regulations, 2011 CFR
2011-04-01
... REGULATIONS LICENSES FOR THE EXPORT OF DEFENSE ARTICLES § 123.14 Import certificate/delivery verification procedure. (a) The Import Certificate/Delivery Verification Procedure is designed to assure that a commodity... 22 Foreign Relations 1 2011-04-01 2011-04-01 false Import certificate/delivery verification...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 45 Public Welfare 1 2013-10-01 2013-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 45 Public Welfare 1 2014-10-01 2014-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 45 Public Welfare 1 2011-10-01 2011-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 45 Public Welfare 1 2012-10-01 2012-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
24 CFR 5.512 - Verification of eligible immigration status.
Code of Federal Regulations, 2010 CFR
2010-04-01
... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...
Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation
NASA Technical Reports Server (NTRS)
Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna
2000-01-01
This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.
Towards the formal verification of the requirements and design of a processor interface unit
NASA Technical Reports Server (NTRS)
Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.
1993-01-01
The formal verification of the design and partial requirements for a Processor Interface Unit (PIU) using the Higher Order Logic (HOL) theorem-proving system is described. The processor interface unit is a single-chip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. It provides the opportunity to investigate the specification and verification of a real-world subsystem within a commercially-developed fault-tolerant computer. An overview of the PIU verification effort is given. The actual HOL listings from the verification effort are documented in a companion NASA contractor report entitled 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit - HOL Listings', including the general-purpose HOL theories and definitions that support the PIU verification as well as tactics used in the proofs.
7 CFR 272.8 - State income and eligibility verification system.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 4 2010-01-01 2010-01-01 false State income and eligibility verification system. 272... PARTICIPATING STATE AGENCIES § 272.8 State income and eligibility verification system. (a) General. (1) State agencies may maintain and use an income and eligibility verification system (IEVS), as specified in this...
24 CFR 985.3 - Indicators, HUD verification methods and ratings.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Indicators, HUD verification..., HUD verification methods and ratings. This section states the performance indicators that are used to assess PHA Section 8 management. HUD will use the verification method identified for each indicator in...
78 FR 56268 - Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-12
.... PHMSA-2013-0119] Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension... public workshop on ``Integrity Verification Process'' which took place on August 7, 2013. The notice also sought comments on the proposed ``Integrity Verification Process.'' In response to the comments received...
10 CFR 9.54 - Verification of identity of individuals making requests.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 1 2010-01-01 2010-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-12
...-AQ06 Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing... correct certain portions of the Protocol Gas Verification Program and Minimum Competency Requirements for... final rule that amends the Agency's Protocol Gas Verification Program (PGVP) and the minimum competency...
30 CFR 227.601 - What are a State's responsibilities if it performs automated verification?
Code of Federal Regulations, 2010 CFR
2010-07-01
... performs automated verification? 227.601 Section 227.601 Mineral Resources MINERALS MANAGEMENT SERVICE... Perform Delegated Functions § 227.601 What are a State's responsibilities if it performs automated verification? To perform automated verification of production reports or royalty reports, you must: (a) Verify...
10 CFR 9.54 - Verification of identity of individuals making requests.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 1 2013-01-01 2013-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...
10 CFR 9.54 - Verification of identity of individuals making requests.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 1 2012-01-01 2012-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...
10 CFR 9.54 - Verification of identity of individuals making requests.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 1 2014-01-01 2014-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...
Verification test report on a solar heating and hot water system
NASA Technical Reports Server (NTRS)
1978-01-01
Information is provided on the development, qualification and acceptance verification of commercial solar heating and hot water systems and components. The verification includes the performances, the efficiences and the various methods used, such as similarity, analysis, inspection, test, etc., that are applicable to satisfying the verification requirements.
46 CFR 61.40-3 - Design verification testing.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 46 Shipping 2 2011-10-01 2011-10-01 false Design verification testing. 61.40-3 Section 61.40-3... INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design verification testing. (a) Tests must verify that automated vital systems are designed, constructed, and operate in...
Kalinová, Jana P; Tříska, Jan; Vrchotová, Naděžda; Moos, Martin
2014-08-15
Caprolactam, a precursor of Nylon-6, is widespread in the environment along with the synthetic polymers derived from it, which could explain its being found in plants. The aim of this work was to confirm the previously described presence of caprolactam in dry and sprouted achenes, as well as in achene exudates, of common buckwheat (Fagopyrum esculentum Moench). When the lyophilized sprouted and dry buckwheat achenes, along with exudates from growth experiments with caprolactam-free medium, were analysed by HPLC, no caprolactam was found. After addition of caprolactam into the growth medium, we confirmed the uptake of caprolactam in the lyophilized sprouted buckwheat achenes. The uptake of caprolactam is also a function of light conditions during the growth experiments. Caprolactam also reduces the content of phenolic compounds, especially rutin, vitexin, isovitexin, orientin, and homoorientin, in buckwheat plants. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Frandsen, Benjamin A.; Brunelli, Michela; Page, Katharine; Uemura, Yasutomo J.; Staunton, Julie B.; Billinge, Simon J. L.
2016-05-01
We present a temperature-dependent atomic and magnetic pair distribution function (PDF) analysis of neutron total scattering measurements of antiferromagnetic MnO, an archetypal strongly correlated transition-metal oxide. The known antiferromagnetic ground-state structure fits the low-temperature data closely with refined parameters that agree with conventional techniques, confirming the reliability of the newly developed magnetic PDF method. The measurements performed in the paramagnetic phase reveal significant short-range magnetic correlations on a ˜1 nm length scale that differ substantially from the low-temperature long-range spin arrangement. Ab initio calculations using a self-interaction-corrected local spin density approximation of density functional theory predict magnetic interactions dominated by Anderson superexchange and reproduce the measured short-range magnetic correlations to a high degree of accuracy. Further calculations simulating an additional contribution from a direct exchange interaction show much worse agreement with the data. The Anderson superexchange model for MnO is thus verified by experimentation and confirmed by ab initio theory.
Verification and Validation Studies for the LAVA CFD Solver
NASA Technical Reports Server (NTRS)
Moini-Yekta, Shayan; Barad, Michael F; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.
2013-01-01
The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of 2D Euler equations, 3D Navier-Stokes equations as well as turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.
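A minimal sketch of verification by the Method of Manufactured Solutions, independent of LAVA itself: manufacture u(x) = sin(πx) for the model problem -u'' = f with homogeneous boundary conditions, solve with a second-order central-difference scheme on two grids, and confirm an observed order of accuracy near 2:

```python
import numpy as np

# Manufactured solution u(x) = sin(pi x) for -u'' = f with u(0) = u(1) = 0,
# so the source term is f(x) = pi^2 sin(pi x).
def max_error(n):
    x = np.linspace(0.0, 1.0, n + 1)
    h = 1.0 / n
    f = np.pi**2 * np.sin(np.pi * x[1:-1])
    A = (np.diag(2.0 * np.ones(n - 1))
         - np.diag(np.ones(n - 2), 1)
         - np.diag(np.ones(n - 2), -1)) / h**2
    u = np.linalg.solve(A, f)
    return np.max(np.abs(u - np.sin(np.pi * x[1:-1])))

e_coarse, e_fine = max_error(32), max_error(64)
print("observed order:", np.log2(e_coarse / e_fine))  # ~2.0 verifies the discretization
```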
Kuo, Chin-Chi; Balakrishnan, Poojitha; Hsein, Yenh-Chen; Wu, Vin-Cent; Chueh, Shih-Chieh Jeff; Chen, Yung-Ming; Wu, Kwan-Dun; Wang, Ming-Jiuh
2015-09-01
The diagnosis of primary aldosteronism (PA) among the older-aged population has posed a crucial challenge. Among patients over 50 years old, this trial assessed comparability of the performance of two PA diagnostic tests: losartan and captopril suppression tests. A post-hoc subgroup analysis from a prospective cohort was conducted by the TAIPAI (Taiwan Primary Aldosteronism Investigation) group between July 2003 and July 2006. Of the 160 patients in the cohort, 60 patients over 50 years old received captopril and losartan tests to confirm PA. Among the 60 patients over 50 years old, 31 patients had PA confirmed by standardized protocol. The area under the receiver-operating characteristic (ROC) curve for post-captopril aldosterone was significantly less than that for post-losartan plasma aldosterone concentration (PAC) (0.87 vs 0.94, p=0.02). Using the aldosterone-renin ratio (ARR)>35 with PAC>10 ng/dl, the specificity was 82.76% vs 93.1% and the sensitivity was 77.42% vs 87.10% for the captopril and losartan tests, respectively. The equivalence between the two tests were confirmed by the exact McNemar's test (p=1.0). The losartan test showed comparable accuracy to confirm PA. Verification of this "elderly-friendly" confirmatory test will be the first step to prepare a specific diagnostic model of PA for the older-aged population. © The Author(s) 2014.
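A minimal sketch of the exact McNemar comparison used above to assess equivalence of the two confirmatory tests; the discordant-pair counts are hypothetical, not taken from the study:

```python
from scipy.stats import binomtest

# Discordant pairs: b = captopril positive / losartan negative,
#                   c = captopril negative / losartan positive.
b, c = 3, 3                                     # hypothetical counts
p_value = binomtest(b, n=b + c, p=0.5).pvalue   # exact (binomial) McNemar test
print(p_value)                                  # 1.0 here -> no evidence the two tests differ
```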
Kuo, Chin-Chi; Balakrishnan, Poojitha; Hsein, Yenh-Chen; Wu, Vin-Cent; Chueh, Shih-Chieh Jeff; Chen, Yung-Ming; Wu, Kwan-Dun; Wang, Ming-Jiuh
2013-01-01
Objective The diagnosis of primary aldosteronism (PA) among the older-aged population has posed a crucial challenge. Among patients over 50 years old, this trial assessed comparability of the performance of two PA diagnostic tests: losartan and captopril suppression tests. Methods A post-hoc subgroup analysis from a prospective cohort was conducted by the TAIPAI (Taiwan Primary Aldosteronism Investigation) group between July 2003 and July 2006. Of the 160 patients in the cohort, 60 patients over 50 years old received captopril and losartan tests to confirm PA. Results Among the 60 patients over 50 years old, 31 patients had PA confirmed by standardized protocol. The area under the receiver-operating characteristic (ROC) curve of the post-captopril aldosterone was significantly less than that of the post-losartan plasma aldosterone concentration (0.87 vs. 0.94, p = 0.02). Using ARR>35 with PAC>10 ng/dL, the specificity was 82.76% vs. 93.1% and the sensitivity was 77.42% vs. 87.10% for the captopril and losartan tests, respectively. The equivalence between the two tests was confirmed by the exact McNemar test (p = 1.0). Conclusion The losartan test showed comparable accuracy to confirm PA. Verification of this "elderly-friendly" confirmatory test will be the first step in preparing a specific diagnostic model of PA for the older-aged population. PMID:25031295
Code of Federal Regulations, 2010 CFR
2010-07-01
... confidentiality of, Statements of Account, Verification Auditor's Reports, and other verification information... GENERAL PROVISIONS § 201.29 Access to, and confidentiality of, Statements of Account, Verification Auditor... Account, including the Primary Auditor's Reports, filed under 17 U.S.C. 1003(c) and access to a Verifying...
Formal Multilevel Hierarchical Verification of Synchronous MOS VLSI Circuits.
1987-06-01
...depend on whether it is performing flat verification or hierarchical verification. The primary operations of Silica Pithecus when performing flat...signals never arise. The primary operation of Silica Pithecus when performing hierarchical verification is processing constraints to show they hold.
Code of Federal Regulations, 2011 CFR
2011-10-01
...; verification of identity of individuals making requests; accompanying persons; and procedures for... Procedures and Requirements § 802.7 Requests: How, where, and when presented; verification of identity of... which the record is contained. (d) Verification of identity of requester. (1) For written requests, the...
Code of Federal Regulations, 2013 CFR
2013-10-01
...; verification of identity of individuals making requests; accompanying persons; and procedures for... Procedures and Requirements § 802.7 Requests: How, where, and when presented; verification of identity of... which the record is contained. (d) Verification of identity of requester. (1) For written requests, the...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-02
... OFFICE OF PERSONNEL MANAGEMENT Submission for Review: 3206-0215, Verification of Full-Time School...) 3206-0215, Verification of Full-Time School Attendance. As required by the Paperwork Reduction Act of... or faxed to (202) 395-6974. SUPPLEMENTARY INFORMATION: RI 25-49, Verification of Full-Time School...
25 CFR 61.8 - Verification forms.
Code of Federal Regulations, 2010 CFR
2010-04-01
... using the last address of record. The verification form will be used to ascertain the previous enrollee... death. Name and/or address changes will only be made if the verification form is signed by an adult... 25 Indians 1 2010-04-01 2010-04-01 false Verification forms. 61.8 Section 61.8 Indians BUREAU OF...
NASA Astrophysics Data System (ADS)
Karam, Walid; Mokbel, Chafic; Greige, Hanna; Chollet, Gerard
2006-05-01
A GMM based audio visual speaker verification system is described, and an Active Appearance Model with a linear speaker transformation system is used to evaluate the robustness of the verification. An Active Appearance Model (AAM) is used to automatically locate and track a speaker's face in a video recording. A Gaussian Mixture Model (GMM) based classifier (BECARS) is used for face verification. GMM training and testing is accomplished on DCT based extracted features of the detected faces. On the audio side, speech features are extracted and used for speaker verification with the GMM based classifier. Fusion of both audio and video modalities for audio visual speaker verification is compared with face verification and speaker verification systems. To improve the robustness of the multimodal biometric identity verification system, an audio visual imposture system is envisioned. It consists of an automatic voice transformation technique that an impostor may use to assume the identity of an authorized client. Features of the transformed voice are then combined with the corresponding appearance features and fed into the GMM based system BECARS for training. An attempt is made to increase the impostor acceptance rate and thereby analyze the robustness of the verification system. Experiments are being conducted on the BANCA database, with a prospect of experimenting on the newly developed PDAtabase created within the scope of the SecurePhone project.
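A minimal sketch of GMM-based verification scoring in the spirit of the system described (not the BECARS classifier itself); the feature vectors are synthetic stand-ins for DCT/MFCC frames and the decision threshold is arbitrary:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic feature vectors standing in for DCT/MFCC frames.
rng = np.random.default_rng(2)
client_feats = rng.normal(0.5, 1.0, size=(500, 12))      # enrollment data for the claimed client
background_feats = rng.normal(0.0, 1.0, size=(2000, 12))  # background population (UBM-style model)
trial_feats = rng.normal(0.5, 1.0, size=(100, 12))        # features from the verification trial

client = GaussianMixture(n_components=4, random_state=0).fit(client_feats)
background = GaussianMixture(n_components=4, random_state=0).fit(background_feats)

# Average log-likelihood ratio; the 0.0 threshold is arbitrary for illustration.
score = client.score(trial_feats) - background.score(trial_feats)
print("accept" if score > 0.0 else "reject", round(score, 3))
```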
Expert system verification and validation study
NASA Technical Reports Server (NTRS)
French, Scott W.; Hamilton, David
1992-01-01
Five workshops on verification and validation (V&V) of expert systems (ES) were taught during this recent period of performance. Two key activities, previously performed under this contract, supported these recent workshops: (1) a survey of the state-of-the-practice of V&V of ES, and (2) development of workshop material and the first class. The first activity involved performing an extensive survey of ES developers in order to answer several questions regarding the state-of-the-practice in V&V of ES. These questions related to the amount and type of V&V done and the successfulness of this V&V. The next key activity involved developing an intensive hands-on workshop in V&V of ES. This activity involved surveying a large number of V&V techniques, conventional as well as ES-specific ones. In addition to explaining the techniques, we showed how each technique could be applied to a sample problem. References were included in the workshop material, and cross-referenced to techniques, so that students would know where to go to find additional information about each technique. In addition to teaching specific techniques, we included an extensive amount of material on V&V concepts and how to develop a V&V plan for an ES project. We felt this material was necessary so that developers would be prepared to develop an orderly and structured approach to V&V. That is, they would have a process that supported the use of the specific techniques. Finally, to provide hands-on experience, we developed a set of case study exercises. These exercises provided an opportunity for the students to apply all the material (concepts, techniques, and planning material) to a realistic problem.
Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B
2009-12-01
Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1 to 3), the findings also show that they strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.
A study of applications scribe frame data verifications using design rule check
NASA Astrophysics Data System (ADS)
Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki
2013-06-01
In semiconductor manufacturing, scribe frame data is generally generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables for scanner alignment, wafer inspection and customer-specified marks, and at the end we check that the scribe frame design conforms to the alignment and inspection mark specifications. Recently, in COT (customer owned tooling) business or new technology development, there has been no effective verification method for the scribe frame data, and verification takes a lot of time. Therefore, we tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which is used in device verification. We present the scheme of scribe frame data verification using DRC that we applied. First, verification rules are created based on the specifications of the scanner, inspection and others, and a mark library is also created for pattern matching. Next, DRC verification is performed on the scribe frame data; this DRC verification includes pattern matching using the mark library. As a result, our experiments demonstrated that, by use of pattern matching and DRC verification, our new method can yield speed improvements of more than 12 percent compared with conventional mark checks by visual inspection, and the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks. It is easy to maintain and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luke, S J
2011-12-20
This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms control verification approaches for nuclear weapons and components. Information barriers, when used with a measurement system, allow for the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has led on the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of the biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether - in a biological weapons verification regime - the presence or absence of a weapon pathogen can be determined without revealing any information about possible sensitive or proprietary information contained in the genetic materials being declared under a verification regime. This study indicates that a verification regime could be constructed using a small number of pathogens that spans the range of known biological weapons agents. Since the number of possible pathogens is small, it is possible and prudent to treat these pathogens as analogies to attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime. To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of the genetic information associated with the various pathogens. In addition, it has been determined that a suitable information barrier could be applied to this technology when the verification regime has been defined. Finally, the report posits a path forward for additional development of information barriers in a biological weapons verification regime. This path forward has shown that a new analysis approach, coined Information Loss Analysis, might need to be pursued so that a numerical understanding of how information can be lost in specific measurement systems can be achieved.
Formal Analysis of the Remote Agent Before and After Flight
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Lowry, Mike; Park, SeungJoon; Pecheur, Charles; Penix, John; Visser, Willem; White, Jon L.
2000-01-01
This paper describes two separate efforts that used the SPIN model checker to verify deep space autonomy flight software. The first effort occurred at the beginning of a spiral development process and found five concurrency errors early in the design cycle that the developers acknowledge would not have been found through testing. This effort required a substantial manual modeling effort involving both abstraction and translation from the prototype LISP code to the PROMELA language used by SPIN. This experience and others led to research to address the gap between formal method tools and the development cycle used by software developers. The Java PathFinder tool which directly translates from Java to PROMELA was developed as part of this research, as well as automatic abstraction tools. In 1999 the flight software flew on a space mission, and a deadlock occurred in a sibling subsystem to the one which was the focus of the first verification effort. A second quick-response "cleanroom" verification effort found the concurrency error in a short amount of time. The error was isomorphic to one of the concurrency errors found during the first verification effort. The paper demonstrates that formal methods tools can find concurrency errors that indeed lead to loss of spacecraft functions, even for the complex software required for autonomy. Second, it describes progress in automatic translation and abstraction that eventually will enable formal methods tools to be inserted directly into the aerospace software development cycle.
A web-based system for supporting global land cover data production
NASA Astrophysics Data System (ADS)
Han, Gang; Chen, Jun; He, Chaoying; Li, Songnian; Wu, Hao; Liao, Anping; Peng, Shu
2015-05-01
Global land cover (GLC) data production and verification process is very complicated, time consuming and labor intensive, requiring huge amount of imagery data and ancillary data and involving many people, often from different geographic locations. The efficient integration of various kinds of ancillary data and effective collaborative classification in large area land cover mapping requires advanced supporting tools. This paper presents the design and development of a web-based system for supporting 30-m resolution GLC data production by combining geo-spatial web-service and Computer Support Collaborative Work (CSCW) technology. Based on the analysis of the functional and non-functional requirements from GLC mapping, a three tiers system model is proposed with four major parts, i.e., multisource data resources, data and function services, interactive mapping and production management. The prototyping and implementation of the system have been realised by a combination of Open Source Software (OSS) and commercially available off-the-shelf system. This web-based system not only facilitates the integration of heterogeneous data and services required by GLC data production, but also provides online access, visualization and analysis of the images, ancillary data and interim 30 m global land-cover maps. The system further supports online collaborative quality check and verification workflows. It has been successfully applied to China's 30-m resolution GLC mapping project, and has improved significantly the efficiency of GLC data production and verification. The concepts developed through this study should also benefit other GLC or regional land-cover data production efforts.
NASA Astrophysics Data System (ADS)
Grijpink, Jan
2004-06-01
Biometric systems may vary along at least twelve dimensions. We need to exploit this variety to manoeuvre biometrics into place so that it can realise its social potential. Two perspectives on biometrics are then proposed, revealing that biometrics will probably be ineffective in combating identity fraud, organised crime and terrorism: (1) the value chain perspective explains the first barrier: our strong preference for large-scale biometric systems for general compulsory use. These biometric systems cause successful infringements to spread unnoticed. A biometric system will only function adequately if biometrics is indispensable for solving the dominant chain problem; multi-chain use of biometrics takes it beyond the boundaries of good manageability. (2) the identity fraud perspective exposes the second barrier: our traditional approach to identity verification. We focus on identity documents, neglecting the person and the situation involved. Moreover, western legal cultures have made identity verification procedures known, transparent, uniform and predictable, so we have developed a blind spot to identity fraud. Biometrics offers good potential for better checking of persons, but will probably be used to enhance identity documents instead. Biometrics will only pay off if it confronts the identity fraudster with less predictable verification processes and a greater risk that the identity fraud will be spotted. Standardised large-scale applications of biometrics for general compulsory use without countervailing measures will probably produce the reverse. This contribution tentatively presents a few headlines for an overall biometrics strategy that could better resist identity fraud.
Energetic arousal and language: predictions from the computational theory of quantifiers processing.
Zajenkowski, Marcin
2013-10-01
The author examines the relationship between energetic arousal (EA) and the processing of sentences containing natural-language quantifiers. Previous studies and theories have shown that energy may differentially affect various cognitive functions. Recent investigations devoted to quantifiers strongly support the theory that various types of quantifiers involve different cognitive functions in the sentence-picture verification task. In the present study, 201 students were presented with a sentence-picture verification task consisting of simple propositions containing a quantifier that referred to the color of a car on display. Color pictures of cars accompanied the propositions. In addition, the level of participants' EA was measured before and after the verification task. It was found that EA and performance on proportional quantifiers (e.g., "More than half of the cars are red") are in an inverted U-shaped relationship. This result may be explained by the fact that proportional sentences engage working memory to a high degree, and previous models of EA-cognition associations have been based on the assumption that tasks that require parallel attentional and memory processes are best performed when energy is moderate. The research described in the present article has several applications, as it shows the optimal human conditions for verbal comprehension. For instance, it may be important in workplace design to control the level of arousal experienced by office staff when work is mostly related to the processing of complex texts. Energy level may be influenced by many factors, such as noise, time of day, or thermal conditions.
Arms Control Verification Is No Longer a Stumbling Block to True Disarmament
1992-05-01
that they had had since 1945. The Atomic Energy Commission failed to reach an agreement due to the deadlock between the two plans. The Soviet...the Soviets exploding their first nuclear device on September 23, 1949... The talks on general and complete disarmament had failed. The failure to...and by the general state of international and domestic politics at the time. Adequate verification involves the acceptance of a degree of
Get Your Requirements Straight: Storyboarding Revisited
NASA Astrophysics Data System (ADS)
Haesen, Mieke; Luyten, Kris; Coninx, Karin
Current user-centred software engineering (UCSE) approaches provide many techniques to combine know-how available in multidisciplinary teams. Although the involvement of various disciplines is beneficial for the user experience of the future application, the transition from a user needs analysis to a structured interaction analysis and UI design is not always straightforward. We propose storyboards, enriched by metadata, to specify functional and non-functional requirements. Accompanying tool support should facilitate the creation and use of storyboards. We used a meta-storyboard for the verification of storyboarding approaches.
Toward a therapy for mitochondrial disease
Viscomi, Carlo
2016-01-01
Mitochondrial disorders are a group of genetic diseases affecting the energy-converting process of oxidative phosphorylation. The extreme variability of symptoms, organ involvement, and clinical course represents a challenge to the development of effective therapeutic interventions. However, new possibilities have recently been emerging from studies in model organisms and are awaiting verification in humans. I will discuss here the most promising experimental approaches and the challenges we face in translating them into the clinic. The current clinical trials will also be briefly reviewed. PMID:27911730
NASA Technical Reports Server (NTRS)
Buchanan, H.; Nixon, D.; Joyce, R.
1974-01-01
A simulation of the Skylab attitude and pointing control system (APCS) is outlined and discussed. Implementation is via a large hybrid computer and includes those factors affecting system momentum management, propellant consumption, and overall vehicle performance. The important features of the flight system are discussed; the mathematical models necessary for this treatment are outlined; and the decisions involved in implementation are discussed. A brief summary of the goals and capabilities of this tool is also included.
NASA Technical Reports Server (NTRS)
Putkovich, K.
1981-01-01
Initial test results indicated that the Global Positioning System/Time Transfer Unit (GPS/TTU) performed well within the + or - 100 nanosecond range required by the original system specification. Subsequent testing involved the verification of GPS time at the master control site via portable clocks and the acquisition and tracking of as many passes of the space vehicles currently in operation as possible. A description and discussion of the testing, system modifications, test results obtained, and an evaluation of both GPS and the GPS/TTU are presented.
Fatigue and fracture: Overview
NASA Technical Reports Server (NTRS)
Halford, G. R.
1984-01-01
A brief overview of the status of the fatigue and fracture programs is given. The programs involve the development of appropriate analytic material behavior models for cyclic stress-strain-temperature-time behavior, cyclic crack initiation, and cyclic crack propagation. The underlying thrust of these programs is the development and verification of workable engineering methods for calculating, in advance of service, the local cyclic stress-strain response at the critical life-governing location in hot section components, and the resultant crack initiation and crack growth lifetimes.
Modelling crystal growth: Convection in an asymmetrically heated ampoule
NASA Technical Reports Server (NTRS)
Alexander, J. Iwan D.; Rosenberger, Franz; Pulicani, J. P.; Krukowski, S.; Ouazzani, Jalil
1990-01-01
The objective was to develop and implement a numerical method capable of solving the nonlinear partial differential equations governing heat, mass, and momentum transfer in a 3-D cylindrical geometry in order to examine the character of convection in an asymmetrically heated cylindrical ampoule. The details of the numerical method, including verification tests involving comparison with results obtained from other methods, are presented. The results of the study of 3-D convection in an asymmetrically heated cylinder are described.
External tank aerothermal design criteria verification
NASA Technical Reports Server (NTRS)
Praharaj, Sarat C.; Saladino, Anthony J.
1991-01-01
If a Space Shuttle Main Engine (SSME) fails during the initial 160 seconds of the Shuttle flight, a return-to-launch-site maneuver will be implemented. The period of concern for this task is the pitch-around maneuver when the vehicle is flying backward. The intent of this report is to identify and define the flowfield at the most critical locations from an environment perspective. The solution procedure used to predict the plume heating rates involves both computational analysis and engineering modeling.
The Air Pollution Control Technology Verification Center has selected general ventilation air cleaners as a technology area. The Generic Verification Protocol for Biological and Aerosol Testing of General Ventilation Air Cleaners is on the Environmental Technology Verification we...
49 CFR 40.135 - What does the MRO tell the employee at the beginning of the verification interview?
Code of Federal Regulations, 2010 CFR
2010-10-01
... beginning of the verification interview? 40.135 Section 40.135 Transportation Office of the Secretary of... verification interview? (a) As the MRO, you must tell the employee that the laboratory has determined that the... finding of adulteration or substitution. (b) You must explain the verification interview process to the...
40 CFR 1065.550 - Gas analyzer range verification and drift verification.
Code of Federal Regulations, 2014 CFR
2014-07-01
... with a CLD and the removed water is corrected based on measured CO2, CO, THC, and NOX concentrations... concentration subcomponents (e.g., THC and CH4 for NMHC) separately. For example, for NMHC measurements, perform drift verification on NMHC; do not verify THC and CH4 separately. (2) Drift verification requires two...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-22
.... SUPPLEMENTARY INFORMATION: RI 38-107, Verification of Who is Getting Payments, is designed for use by the... OFFICE OF PERSONNEL MANAGEMENT Submission for Review: Verification of Who Is Getting Payments, RI... currently approved information collection request (ICR) 3206-0197, Verification of Who is Getting Payments...
Automated verification of flight software. User's manual
NASA Technical Reports Server (NTRS)
Saib, S. H.
1982-01-01
AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and the reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.
Research on key technology of the verification system of steel rule based on vision measurement
NASA Astrophysics Data System (ADS)
Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun
2018-01-01
The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, yields low precision and low efficiency. A machine-vision-based verification system for steel rules is designed with reference to JJG1-1999 Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method for the pixel equivalent and decontaminates the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measuring results strongly prove that these methods not only meet the precision of the verification regulation, but also improve the reliability and efficiency of the verification system.
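The abstract does not detail the new pixel-equivalent calibration, so the sketch below only illustrates the generic idea: a feature of known physical length fixes the millimetre-per-pixel scale, which then converts detected graduation-line positions into length deviations. The function names and the numbers in the example are hypothetical.

```python
import numpy as np

def pixel_equivalent(ref_length_mm, ref_length_px):
    """Calibrate the image scale from a reference feature of known length (mm per pixel)."""
    return ref_length_mm / ref_length_px

def graduation_errors(line_positions_px, pitch_mm, scale_mm_per_px):
    """Deviation of each detected graduation line from its nominal position."""
    positions_mm = np.asarray(line_positions_px) * scale_mm_per_px
    nominal_mm = np.arange(len(positions_mm)) * pitch_mm
    # align the first detected line with the first nominal graduation
    return positions_mm - (positions_mm[0] - nominal_mm[0]) - nominal_mm

# illustrative use: a 10 mm reference imaged as 2000 px gives 5 µm per pixel
scale = pixel_equivalent(10.0, 2000.0)
errs = graduation_errors([0.0, 201.0, 399.5, 600.8], pitch_mm=1.0, scale_mm_per_px=scale)
```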
Space transportation system payload interface verification
NASA Technical Reports Server (NTRS)
Everline, R. T.
1977-01-01
The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).
Bot, Maarten; van den Munckhof, Pepijn; Bakay, Roy; Stebbins, Glenn; Verhagen Metman, Leo
2017-01-01
Objective To determine the accuracy of intraoperative computed tomography (iCT) in localizing deep brain stimulation (DBS) electrodes by comparing this modality with postoperative magnetic resonance imaging (MRI). Background Optimal lead placement is a critical factor for the outcome of DBS procedures and preferably confirmed during surgery. iCT offers 3-dimensional verification of both microelectrode and lead location during DBS surgery. However, accurate electrode representation on iCT has not been extensively studied. Methods DBS surgery was performed using the Leksell stereotactic G frame. Stereotactic coordinates of 52 DBS leads were determined on both iCT and postoperative MRI and compared with intended final target coordinates. The resulting absolute differences in X (medial-lateral), Y (anterior-posterior), and Z (dorsal-ventral) coordinates (ΔX, ΔY, and ΔZ) for both modalities were then used to calculate the euclidean distance. Results Euclidean distances were 2.7 ± 1.1 and 2.5 ± 1.2 mm for MRI and iCT, respectively (p = 0.2). Conclusion Postoperative MRI and iCT show equivalent DBS lead representation. Intraoperative localization of both microelectrode and DBS lead in stereotactic space enables direct adjustments. Verification of lead placement with postoperative MRI, considered to be the gold standard, is unnecessary. PMID:28601874
Bot, Maarten; van den Munckhof, Pepijn; Bakay, Roy; Stebbins, Glenn; Verhagen Metman, Leo
2017-01-01
To determine the accuracy of intraoperative computed tomography (iCT) in localizing deep brain stimulation (DBS) electrodes by comparing this modality with postoperative magnetic resonance imaging (MRI). Optimal lead placement is a critical factor for the outcome of DBS procedures and preferably confirmed during surgery. iCT offers 3-dimensional verification of both microelectrode and lead location during DBS surgery. However, accurate electrode representation on iCT has not been extensively studied. DBS surgery was performed using the Leksell stereotactic G frame. Stereotactic coordinates of 52 DBS leads were determined on both iCT and postoperative MRI and compared with intended final target coordinates. The resulting absolute differences in X (medial-lateral), Y (anterior-posterior), and Z (dorsal-ventral) coordinates (ΔX, ΔY, and ΔZ) for both modalities were then used to calculate the euclidean distance. Euclidean distances were 2.7 ± 1.1 and 2.5 ± 1.2 mm for MRI and iCT, respectively (p = 0.2). Postoperative MRI and iCT show equivalent DBS lead representation. Intraoperative localization of both microelectrode and DBS lead in stereotactic space enables direct adjustments. Verification of lead placement with postoperative MRI, considered to be the gold standard, is unnecessary. © 2017 The Author(s) Published by S. Karger AG, Basel.
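For completeness, the distance measure used in both reports is simply the euclidean norm of the absolute coordinate differences; a tiny sketch with hypothetical numbers:

```python
import math

def euclidean_distance(dx, dy, dz):
    """Distance between measured lead position and intended target, in mm."""
    return math.sqrt(dx**2 + dy**2 + dz**2)

# e.g. |ΔX| = 1.2 mm, |ΔY| = 1.8 mm, |ΔZ| = 1.4 mm gives roughly 2.6 mm
d = euclidean_distance(1.2, 1.8, 1.4)
```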
Hydrostatic paradox: experimental verification of pressure equilibrium
NASA Astrophysics Data System (ADS)
Kodejška, Č.; Ganci, S.; Říha, J.; Sedláčková, H.
2017-11-01
This work is focused on the experimental verification of the balance between the atmospheric pressure acting on the sheet of paper that closes, from below, a cylinder completely or partially filled with water, and the hydrostatic pressure of the water column acting against it. The paper first gives a theoretical analysis of the problem, based on the equation for an isothermal process and on the equality of pressures inside and outside the cylinder. The measured values confirm the theoretically predicted quadratic dependence of the air pressure inside the cylinder on the level of the liquid in the cylinder; the maximum change in the volume of air within the cylinder occurs when the height of the water column L is one half of the total height of the vessel H. The measurements were made for different diameters of the cylinder and with plates made of different materials placed at the bottom of the cylinder to prevent liquid from flowing out. The measured values were subjected to statistical analysis, which demonstrated the validity of the null hypothesis, i.e. that the measured values are not statistically significantly different from the theoretically calculated ones at the statistical significance level α = 0.05.
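A minimal sketch of the underlying theory (assuming an ideal gas, an isothermal process, a massless sheet, and incompressible water) reproduces the L(H − L) dependence and the maximum at L = H/2 reported above:

```latex
% Pressure balance at the sheet closing the cylinder of total height H:
\[
  p_{\mathrm{air}} + \rho g L = p_0
  \quad\Longrightarrow\quad
  p_{\mathrm{air}} = p_0 - \rho g L .
\]
% Isothermal expansion of the trapped air (initial height H - L; the water level drops by x):
\[
  p_0 (H - L) = p_{\mathrm{air}} (H - L + x)
  \quad\Longrightarrow\quad
  x = \frac{\rho g L (H - L)}{p_0 - \rho g L}
  \;\approx\; \frac{\rho g}{p_0}\, L (H - L) \quad (\rho g L \ll p_0),
\]
% which is quadratic in L and maximal at L = H/2, as found experimentally.
```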
Experimental Evaluation of Verification and Validation Tools on Martian Rover Software
NASA Technical Reports Server (NTRS)
Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Venet, Arnaud; Visser, Willem; Washington, Rich
2003-01-01
We report on a study to determine the maturity of different verification and validation technologies (V&V) on a representative example of NASA flight software. The study consisted of a controlled experiment where three technologies (static analysis, runtime analysis and model checking) were compared to traditional testing with respect to their ability to find seeded errors in a prototype Mars Rover. What makes this study unique is that it is the first (to the best of our knowledge) to do a controlled experiment to compare formal methods based tools to testing on a realistic industrial-size example where the emphasis was on collecting as much data on the performance of the tools and the participants as possible. The paper includes a description of the Rover code that was analyzed, the tools used as well as a detailed description of the experimental setup and the results. Due to the complexity of setting up the experiment, our results can not be generalized, but we believe it can still serve as a valuable point of reference for future studies of this kind. It did confirm the belief we had that advanced tools can outperform testing when trying to locate concurrency errors. Furthermore the results of the experiment inspired a novel framework for testing the next generation of the Rover.
Performance verification testing of the UltraStrip Systems, Inc., Mobile Emergency Filtration System (MEFS) was conducted under EPA's Environmental Technology Verification (ETV) Program at the EPA Test and Evaluation (T&E) Facility in Cincinnati, Ohio, during November, 2003, thr...
Model Based Verification of Cyber Range Event Environments
2015-12-10
Model Based Verification of Cyber Range Event Environments Suresh K. Damodaran MIT Lincoln Laboratory 244 Wood St., Lexington, MA, USA...apply model based verification to cyber range event environment configurations, allowing for the early detection of errors in event environment...Environment Representation (CCER) ontology. We also provide an overview of a methodology to specify verification rules and the corresponding error
The Danish Environmental Technology Verification program (DANETV) Water Test Centre operated by DHI, is supported by the Danish Ministry for Science, Technology and Innovation. DANETV, the United States Environmental Protection Agency Environmental Technology Verification Progra...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1979-03-01
The certification and verification of the Northrup Model NSC-01-0732 Fresnel lens tracking solar collector are presented. A certification statement is included with signatures and a separate report on the structural analysis of the collector system. System verification against the Interim Performance Criteria are indicated by matrices with verification discussion, analysis, and enclosed test results.
Collins, Jennifer; Pecher, Diane; Zeelenberg, René; Coulson, Seana
2011-01-01
The perceptual modalities associated with property words, such as flicker or click, have previously been demonstrated to affect subsequent property verification judgments (Pecher et al., 2003). Known as the conceptual modality switch effect, this finding supports the claim that brain systems for perception and action help subserve the representation of concepts. The present study addressed the cognitive and neural substrate of this effect by recording event-related potentials (ERPs) as participants performed a property verification task with visual or auditory properties in key trials. We found that for visual property verifications, modality switching was associated with an increased amplitude N400. For auditory verifications, switching led to a larger late positive complex. Observed ERP effects of modality switching suggest property words access perceptual brain systems. Moreover, the timing and pattern of the effects suggest perceptual systems impact the decision-making stage in the verification of auditory properties, and the semantic stage in the verification of visual properties. PMID:21713128
Collins, Jennifer; Pecher, Diane; Zeelenberg, René; Coulson, Seana
2011-01-01
The perceptual modalities associated with property words, such as flicker or click, have previously been demonstrated to affect subsequent property verification judgments (Pecher et al., 2003). Known as the conceptual modality switch effect, this finding supports the claim that brain systems for perception and action help subserve the representation of concepts. The present study addressed the cognitive and neural substrate of this effect by recording event-related potentials (ERPs) as participants performed a property verification task with visual or auditory properties in key trials. We found that for visual property verifications, modality switching was associated with an increased amplitude N400. For auditory verifications, switching led to a larger late positive complex. Observed ERP effects of modality switching suggest property words access perceptual brain systems. Moreover, the timing and pattern of the effects suggest perceptual systems impact the decision-making stage in the verification of auditory properties, and the semantic stage in the verification of visual properties.
Generation and confirmation of a (100 x 100)-dimensional entangled quantum system.
Krenn, Mario; Huber, Marcus; Fickler, Robert; Lapkiewicz, Radek; Ramelow, Sven; Zeilinger, Anton
2014-04-29
Entangled quantum systems have properties that have fundamentally overthrown the classical worldview. Increasing the complexity of entangled states by expanding their dimensionality allows the implementation of novel fundamental tests of nature, and moreover also enables genuinely new protocols for quantum information processing. Here we present the creation of a (100 × 100)-dimensional entangled quantum system, using spatial modes of photons. For its verification we develop a novel nonlinear criterion which infers entanglement dimensionality of a global state by using only information about its subspace correlations. This allows very practical experimental implementation as well as highly efficient extraction of entanglement dimensionality information. Applications in quantum cryptography and other protocols are very promising.
Generation and confirmation of a (100 × 100)-dimensional entangled quantum system
Krenn, Mario; Huber, Marcus; Fickler, Robert; Lapkiewicz, Radek; Ramelow, Sven; Zeilinger, Anton
2014-01-01
Entangled quantum systems have properties that have fundamentally overthrown the classical worldview. Increasing the complexity of entangled states by expanding their dimensionality allows the implementation of novel fundamental tests of nature, and moreover also enables genuinely new protocols for quantum information processing. Here we present the creation of a (100 × 100)-dimensional entangled quantum system, using spatial modes of photons. For its verification we develop a novel nonlinear criterion which infers entanglement dimensionality of a global state by using only information about its subspace correlations. This allows very practical experimental implementation as well as highly efficient extraction of entanglement dimensionality information. Applications in quantum cryptography and other protocols are very promising. PMID:24706902
USB environment measurements based on full-scale static engine ground tests
NASA Technical Reports Server (NTRS)
Sussman, M. B.; Harkonen, D. L.; Reed, J. B.
1976-01-01
Flow turning parameters, static pressures, surface temperatures, surface fluctuating pressures and acceleration levels were measured in the environment of a full-scale upper surface blowing (USB) propulsive lift test configuration. The test components included a flightworthy CF6-50D engine, nacelle, and USB flap assembly utilized in conjunction with ground verification testing of the USAF YC-14 Advanced Medium STOL Transport propulsion system. Results, based on a preliminary analysis of the data, generally show reasonable agreement with predicted levels based on model data. However, additional detailed analysis is required to confirm the preliminary evaluation, to help delineate certain discrepancies with model data, and to establish a basis for future flight test comparisons.
Cross-view gait recognition using joint Bayesian
NASA Astrophysics Data System (ADS)
Li, Chao; Sun, Shouqian; Chen, Xiaoyu; Min, Xin
2017-07-01
Human gait, as a soft biometric, helps to recognize people by the way they walk. To further improve recognition performance under cross-view conditions, we propose Joint Bayesian to model the view variance. We evaluated our proposed method on the largest-population (OULP) dataset, which makes our result statistically reliable. As a result, we confirmed that our proposed method significantly outperformed state-of-the-art approaches for both identification and verification tasks. Finally, a sensitivity analysis on the number of training subjects was conducted; we find that Joint Bayesian can achieve competitive results even with a small subset of training subjects (100 subjects). For further comparison, experimental results, learning models, and test codes are available.
Evaluation of HCFC AK 225 Alternatives for Precision Cleaning and Verification
NASA Technical Reports Server (NTRS)
Melton, D. M.
1998-01-01
Maintaining qualified cleaning and verification processes is essential in a production environment. Environmental regulations have impacted, and continue to impact, cleaning and verification processing for components and large structures, both at the Michoud Assembly Facility and at component suppliers. The goal of the effort was to assure that cleaning and verification proceed unimpeded and that qualified, environmentally compliant material and process replacements are implemented and perform to specifications. The approach consisted of (1) selection of a Supersonic Gas-Liquid Cleaning System; (2) selection and evaluation of three cleaning and verification solvents as candidate alternatives to HCFC 225 (Vertrel 423 (HCFC), Vertrel MCA (HFC/1,2-dichloroethylene), and HFE 7100DE (HFE/1,2-dichloroethylene)); and (3) evaluation of an analytical instrumental post-cleaning verification technique. This document is presented in viewgraph format.
Alecu, S; Dadarlat, V; Stanciu, E; Ionescu-Tirgoviste, C; Konerth, A M
1997-01-01
Diabetes represents a heterogeneous group of disturbances, which can have a different aetiology, but have in common glucidic, lipidic and proteinic metabolic disturbances. Insulin-dependent diabetes appears in genetically susceptible persons, as an autoimmune disease activated by environment factors. Epidemiological studies performed in different countries, notice the increasing of diabetes cases in the last decades. Therefore the informatic system EtioDiab (from Etiopathological diabetes) has been developed. The purpose of this system is to assist the medical research regarding the environment factors involved in the etiopathogenesis of insulin-dependent diabetes. The system offers the possibility of calculation of many statistic indicators, of graphic representation of the recorded data, of verification of the statistical hypotheses.
NASA Technical Reports Server (NTRS)
Mckay, C. W.; Bown, R. L.
1985-01-01
The space station data management system involves networks of computing resources that must work cooperatively and reliably over an indefinite life span. This program requires a long schedule of modular growth and an even longer period of maintenance and operation. The development and operation of space station computing resources will involve a spectrum of systems and software life cycle activities distributed across a variety of hosts, an integration, verification, and validation host with test bed, and distributed targets. The requirement for the early establishment and use of an appropriate Computer Systems and Software Engineering Support Environment is identified. This environment will support the research and development productivity challenges presented by the space station computing system.
Pinealitis accompanying equine recurrent uveitis.
Kalsow, C M; Dwyer, A E; Smith, A W; Nifong, T P
1993-01-01
There is no direct verification of pineal gland involvement in human uveitis. Specimens of pineal tissue are not available during active uveitis in human patients. Naturally occurring uveitis in horses gives us an opportunity to examine tissues during active ocular inflammation. We examined the pineal gland of a horse that was killed because it had become blind during an episode of uveitis. The clinical history and histopathology of the eyes were consistent with post-leptospiral equine recurrent uveitis. The pineal gland of this horse had significant inflammatory infiltration consisting mainly of lymphocytes with some eosinophils. This observation of pinealitis accompanying equine uveitis supports the animal models of experimental autoimmune uveoretinitis with associated pinealitis and suggests that the pineal gland may be involved in some human uveitides. PMID:8435400
Compromises produced by the dialectic between self-verification and self-enhancement.
Morling, B; Epstein, S
1997-12-01
Three studies of people's reactions to evaluative feedback demonstrated that the dialectic between self-enhancement and self-verification results in compromises between these 2 motives, as hypothesized in cognitive-experiential self-theory. The demonstration was facilitated by 2 procedural improvements: Enhancement and verification were established by calibrating evaluative feedback against self appraisals, and degree of enhancement and of verification were varied along a continuum, rather than categorically. There was also support for the hypotheses that processing in an intuitive-experiential mode favors enhancement and processing in an analytical-rational mode favors verification in the kinds of situations investigated.
NASA Technical Reports Server (NTRS)
1978-01-01
The verification process and requirements for the ascent guidance interfaces and the ascent integrated guidance, navigation and control system for the space shuttle orbiter are defined as well as portions of supporting systems which directly interface with the system. The ascent phase of verification covers the normal and ATO ascent through the final OMS-2 circularization burn (all of OPS-1), the AOA ascent through the OMS-1 burn, and the RTLS ascent through ET separation (all of MM 601). In addition, OPS translation verification is defined. Verification trees and roadmaps are given.
Hypothesis testing in students: Sequences, stages, and instructional strategies
NASA Astrophysics Data System (ADS)
Moshman, David; Thompson, Pat A.
Six sequences in the development of hypothesis-testing conceptions are proposed, involving (a) interpretation of the hypothesis; (b) the distinction between using theories and testing theories; (c) the consideration of multiple possibilities; (d) the relation of theory and data; (e) the nature of verification and falsification; and (f) the relation of truth and falsity. An alternative account is then provided involving three global stages: concrete operations, formal operations, and a postformal metaconstructive stage. Relative advantages and difficulties of the stage and sequence conceptualizations are discussed. Finally, three families of teaching strategy are distinguished, which emphasize, respectively: (a) social transmission of knowledge; (b) carefully sequenced empirical experience by the student; and (c) self-regulated cognitive activity of the student. It is argued on the basis of Piaget's theory that the last of these plays a crucial role in the construction of such logical reasoning strategies as those involved in testing hypotheses.
NASA Technical Reports Server (NTRS)
1989-01-01
The design and verification requirements are defined which are appropriate to hardware at the detail, subassembly, component, and engine levels, and these requirements are correlated to the development demonstrations which provide verification that design objectives are achieved. The high pressure fuel turbopump requirements verification matrix provides correlation between design requirements and the tests required to verify that the requirements have been met.
Quantum money with classical verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gavinsky, Dmitry
We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.
Quantum money with classical verification
NASA Astrophysics Data System (ADS)
Gavinsky, Dmitry
2014-12-01
We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doebling, Scott William
The purposes of the verification project are to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.
Towards composition of verified hardware devices
NASA Technical Reports Server (NTRS)
Schubert, E. Thomas; Levitt, K.; Cohen, G. C.
1991-01-01
Computers are being used where no affordable level of testing is adequate. Safety- and life-critical systems must find a replacement for exhaustive testing to guarantee their correctness: a mathematical proof. Hardware verification research has focused on device verification and has largely ignored verification of system composition. To address these deficiencies, we examine how the current hardware verification methodology can be extended to verify complete systems.
Control and Non-Payload Communications (CNPC) Prototype Radio Verification Test Report
NASA Technical Reports Server (NTRS)
Bishop, William D.; Frantz, Brian D.; Thadhani, Suresh K.; Young, Daniel P.
2017-01-01
This report provides an overview and results from the verification of the specifications that define the operational capabilities of the airborne and ground, L-band and C-band, Control and Non-Payload Communications radio link system. An overview of system verification is provided along with an overview of the operation of the radio. Measurement results are presented for verification of the radio's operation.
Verification of operational solar flare forecast: Case of Regional Warning Center Japan
NASA Astrophysics Data System (ADS)
Kubo, Yûki; Den, Mitsue; Ishii, Mamoru
2017-08-01
In this article, we discuss a verification study of an operational solar flare forecast in the Regional Warning Center (RWC) Japan. The RWC Japan has been issuing four-categorical deterministic solar flare forecasts for a long time. In this forecast verification study, we used solar flare forecast data accumulated over 16 years (from 2000 to 2015). We compiled the forecast data together with solar flare data obtained with the Geostationary Operational Environmental Satellites (GOES). Using the compiled data sets, we estimated some conventional scalar verification measures with 95% confidence intervals. We also estimated a multi-categorical scalar verification measure. These scalar verification measures were compared with those obtained by the persistence method and recurrence method. As solar activity varied during the 16 years, we also applied verification analyses to four subsets of forecast-observation pair data with different solar activity levels. We cannot conclude definitely that there are significant performance differences between the forecasts of RWC Japan and the persistence method, although a slightly significant difference is found for some event definitions. We propose to use a scalar verification measure to assess the judgment skill of the operational solar flare forecast. Finally, we propose a verification strategy for deterministic operational solar flare forecasting. For dichotomous forecast, a set of proposed verification measures is a frequency bias for bias, proportion correct and critical success index for accuracy, probability of detection for discrimination, false alarm ratio for reliability, Peirce skill score for forecast skill, and symmetric extremal dependence index for association. For multi-categorical forecast, we propose a set of verification measures as marginal distributions of forecast and observation for bias, proportion correct for accuracy, correlation coefficient and joint probability distribution for association, the likelihood distribution for discrimination, the calibration distribution for reliability and resolution, and the Gandin-Murphy-Gerrity score and judgment skill score for skill.
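For the dichotomous case, all of the scalar measures proposed above follow from the 2 × 2 contingency table of forecasts against observed flares. The sketch below uses the standard textbook definitions; the variable names and the example counts are ours, not taken from the RWC Japan data set.

```python
import math

def dichotomous_scores(hits, false_alarms, misses, correct_negatives):
    """Conventional verification measures from a 2x2 forecast/event contingency table."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    n = a + b + c + d
    pod = a / (a + c)      # probability of detection (discrimination)
    far = b / (a + b)      # false alarm ratio (reliability)
    pofd = b / (b + d)     # probability of false detection
    return {
        "frequency_bias": (a + b) / (a + c),
        "proportion_correct": (a + d) / n,
        "critical_success_index": a / (a + b + c),
        "probability_of_detection": pod,
        "false_alarm_ratio": far,
        "peirce_skill_score": pod - pofd,
        "symmetric_extremal_dependence_index":
            (math.log(pofd) - math.log(pod) - math.log(1 - pofd) + math.log(1 - pod))
            / (math.log(pofd) + math.log(pod) + math.log(1 - pofd) + math.log(1 - pod)),
    }

# hypothetical 16-year tally of "flare forecast" versus "flare observed"
scores = dichotomous_scores(hits=120, false_alarms=80, misses=40, correct_negatives=4000)
```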
How well should probabilistic seismic hazard maps work?
NASA Astrophysics Data System (ADS)
Vanneste, K.; Stein, S.; Camelbeeck, T.; Vleminckx, B.
2016-12-01
Recent large earthquakes that gave rise to shaking much stronger than shown in earthquake hazard maps have stimulated discussion about how well these maps forecast future shaking. These discussions have brought home the fact that although the maps are designed to achieve certain goals, we know little about how well they actually perform. As for any other forecast, this question involves verification and validation. Verification involves assessing how well the algorithm used to produce hazard maps implements the conceptual PSHA model ("have we built the model right?"). Validation asks how well the model forecasts the shaking that actually occurs ("have we built the right model?"). We explore the verification issue by simulating the shaking history of an area with an assumed distribution of earthquakes, frequency-magnitude relation, temporal occurrence model, and ground-motion prediction equation. We compare the "observed" shaking at many sites over time to that predicted by a hazard map generated for the same set of parameters. PSHA predicts that the fraction of sites at which shaking will exceed the mapped value is p = 1 - exp(-t/T), where t is the duration of observations and T is the map's return period. This implies that shaking in large earthquakes is typically greater than shown on hazard maps, as has occurred in a number of cases. A large number of simulated earthquake histories yield distributions of shaking consistent with this forecast, with a scatter about this value that decreases as t/T increases. The median results are somewhat lower than predicted for small values of t/T and approach the predicted value for larger values of t/T. Hence, the algorithm appears to be internally consistent and can be regarded as verified for this set of simulations. Validation is more complicated because a real observed earthquake history can yield a fractional exceedance significantly higher or lower than that predicted while still being consistent with the hazard map in question. As a result, given that in the real world we have only a single sample, it is hard to assess whether a misfit between a map and observations arises by chance or reflects a biased map.
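The verification exercise can be mimicked with a toy Monte Carlo in which exceedances of the mapped shaking arrive as a Poisson process with rate 1/T at every site; the fraction of sites exceeded in t years is then compared with 1 − exp(−t/T). This is a deliberately minimal stand-in for the full PSHA simulation described above, with illustrative parameter values.

```python
import numpy as np

def simulated_exceedance_fraction(n_sites, t_years, return_period, n_histories=1000, seed=0):
    """Fraction of sites where mapped shaking is exceeded at least once in t years,
    assuming exceedances arrive as a Poisson process with rate 1/return_period."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(lam=t_years / return_period, size=(n_histories, n_sites))
    return (counts > 0).mean(axis=1)          # one fraction per simulated history

t, T = 50.0, 475.0                            # e.g. 50 yr of observations vs a 475 yr map
fractions = simulated_exceedance_fraction(n_sites=200, t_years=t, return_period=T)
predicted = 1.0 - np.exp(-t / T)              # about 0.10
# np.median(fractions) scatters around `predicted`, tightening as t/T grows
```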
Ishii, Tadashi; Nakayama, Masaharu; Abe, Michiaki; Takayama, Shin; Kamei, Takashi; Abe, Yoshiko; Yamadera, Jun; Amito, Koichiro; Morino, Kazuma
2016-10-01
Introduction There were 5,385 deceased and 710 missing in the Ishinomaki medical zone following the Great East Japan Earthquake that occurred in Japan on March 11, 2011. The Ishinomaki Zone Joint Relief Team (IZJRT) was formed to unify the relief teams of all organizations joining in support of the Ishinomaki area. The IZJRT expanded relief activity as they continued to manually collect and analyze assessments of essential information for maintaining health in all 328 shelters using a paper-type survey. However, the IZJRT spent an enormous amount of time and effort entering and analyzing these data because the work was vastly complex. Therefore, an assessment system must be developed that can tabulate shelter assessment data correctly and efficiently. The objective of this report was to describe the development and verification of a system to rapidly assess evacuation centers in preparation for the next major disaster. Report Based on experiences with the complex work during the disaster, software called the "Rapid Assessment System of Evacuation Center Condition featuring Gonryo and Miyagi" (RASECC-GM) was developed to enter, tabulate, and manage the shelter assessment data. Further, a verification test was conducted during a large-scale Self-Defense Force (SDF) training exercise to confirm its feasibility, usability, and accuracy. The RASECC-GM comprises three screens: (1) the "Data Entry screen," allowing for quick entry on tablet devices of 19 assessment items, including shelter administrator, living and sanitary conditions, and a tally of the injured and sick; (2) the "Relief Team/Shelter Management screen," for registering information on relief teams and shelters; and (3) the "Data Tabulation screen," which allows tabulation of the data entered for each shelter, as well as viewing and sorting from a disaster headquarters' computer. During the verification test, data of mock shelters entered online were tabulated quickly and accurately on a mock disaster headquarters' computer. Likewise, data entered offline also were tabulated quickly on the mock disaster headquarters' computer when the tablet device was moved into an online environment. The RASECC-GM, a system for rapidly assessing the condition of evacuation centers, was developed. Tests verify that users of the system would be able to easily, quickly, and accurately assess vast quantities of data from multiple shelters in a major disaster and immediately manage the inputted data at the disaster headquarters. Ishii T , Nakayama M , Abe M , Takayama S , Kamei T , Abe Y , Yamadera J , Amito K , Morino K . Development and verification of a mobile shelter assessment system "Rapid Assessment System of Evacuation Center Condition featuring Gonryo and Miyagi (RASECC-GM)" for major disasters. Prehosp Disaster Med. 2016;31(5):539-546.
40 CFR 1065.372 - NDUV analyzer HC and H2O interference verification.
Code of Federal Regulations, 2010 CFR
2010-07-01
... compensation algorithms that utilize measurements of other gases to meet this interference verification, simultaneously conduct such measurements to test the algorithms during the analyzer interference verification. (c...
Formal verification of an oral messages algorithm for interactive consistency
NASA Technical Reports Server (NTRS)
Rushby, John
1992-01-01
The formal specification and verification of an algorithm for Interactive Consistency based on the Oral Messages algorithm for Byzantine Agreement is described. We compare our treatment with that of Bevier and Young, who presented a formal specification and verification for a very similar algorithm. Unlike Bevier and Young, who observed that 'the invariant maintained in the recursive subcases of the algorithm is significantly more complicated than is suggested by the published proof' and who found its formal verification 'a fairly difficult exercise in mechanical theorem proving,' our treatment is very close to the previously published analysis of the algorithm, and our formal specification and verification are straightforward. This example illustrates how delicate choices in the formulation of the problem can have significant impact on the readability of its formal specification and on the tractability of its formal verification.
A methodology for the rigorous verification of plasma simulation codes
NASA Astrophysics Data System (ADS)
Riva, Fabio
2016-10-01
The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: verification, which is a mathematical issue targeted at assessing that the physical model is correctly solved, and validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on verification, which in turn is composed of code verification, targeted at assessing that a physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on the Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
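The solution-verification step mentioned above rests on Richardson-type convergence analysis. The sketch below uses the standard formulas for the observed order of accuracy and the extrapolated error estimate from three systematically refined grids; it is generic and not tied to GBS or to any particular quantity, and the sample values are invented.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy p from solutions on three grids with refinement ratio r."""
    return math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)

def richardson_error_estimate(f_medium, f_fine, r, p):
    """Estimated discretization error remaining in the fine-grid solution."""
    return (f_fine - f_medium) / (r**p - 1.0)

# hypothetical functional values from three grids, refinement ratio 2
p = observed_order(f_coarse=1.0520, f_medium=1.0130, f_fine=1.0032, r=2.0)   # ~2 expected
err = richardson_error_estimate(1.0130, 1.0032, r=2.0, p=p)
f_extrapolated = 1.0032 + err          # Richardson-extrapolated estimate of the exact value
```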
A Quantitative Approach to the Formal Verification of Real-Time Systems.
1996-09-01
Computer Science. A Quantitative Approach to the Formal Verification of Real-Time Systems. Sergio Vale Aguiar Campos. September 1996. CMU-CS-96-199...implied, of NSF, the Semiconductor Research Corporation, ARPA or the U.S. government. Keywords: real-time systems, formal verification, symbolic
Kleene Algebra and Bytecode Verification
2016-04-27
computing the star (Kleene closure) of a matrix of transfer functions. In this paper we show how this general framework applies to the problem of Java ...bytecode verification. We show how to specify transfer functions arising in Java bytecode verification in such a way that the Kleene algebra operations...potentially improve the performance over the standard worklist algorithm when a small cutset can be found. Key words: Java , bytecode, verification, static
Security Verification of Secure MANET Routing Protocols
2012-03-22
SECURITY VERIFICATION OF SECURE MANET ROUTING PROTOCOLS. Thesis. Matthew F. Steele, B.S.E.E., Captain, USAF. AFIT/GCS/ENG/12-03. DEPARTMENT OF THE AIR FORCE, AIR... Presented to the Faculty, Department of Electrical and Computer... DISTRIBUTION UNLIMITED.
Zhu, Ling-Ling; Lv, Na; Zhou, Quan
2016-12-01
We read, with great interest, the study by Baldwin and Rodriguez (2016), which describes the role of the verification nurse and details the verification process in identifying errors related to chemotherapy orders. We strongly agree with their finding that a verification nurse, collaborating closely with the prescribing physician, pharmacist, and treating nurse, can better identify errors and maintain safety during chemotherapy administration.
Schlaberg, Robert; Mitchell, Michael J; Taggart, Edward W; She, Rosemary C
2012-01-01
US Food and Drug Administration (FDA)-approved diagnostic tests based on molecular genetic technologies are becoming available for an increasing number of microbial pathogens. Advances in technology and lower costs have moved molecular diagnostic tests formerly performed for research purposes only into much wider use in clinical microbiology laboratories. To provide an example of laboratory studies performed to verify the performance of an FDA-approved assay for the detection of Clostridium difficile cytotoxin B compared with the manufacturer's performance standards. We describe the process and protocols used by a laboratory for verification of an FDA-approved assay, assess data from the verification studies, and implement the assay after verification. Performance data from the verification studies conducted by the laboratory were consistent with the manufacturer's performance standards and the assay was implemented into the laboratory's test menu. Verification studies are required for FDA-approved diagnostic assays prior to use in patient care. Laboratories should develop a standardized approach to verification studies that can be adapted and applied to different types of assays. We describe the verification of an FDA-approved real-time polymerase chain reaction assay for the detection of a toxin gene in a bacterial pathogen.
Requirements, Verification, and Compliance (RVC) Database Tool
NASA Technical Reports Server (NTRS)
Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale
2001-01-01
This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information normally contained within documents (e.g. specifications, plans) is contained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".
NASA Astrophysics Data System (ADS)
Ferreira, Paulo; Kristoufek, Ladislav
2017-11-01
We analyse the covered interest parity (CIP) using two novel regression frameworks based on cross-correlation analysis (detrended cross-correlation analysis and detrending moving-average cross-correlation analysis), which allow for studying the relationships at different scales and work well under non-stationarity and heavy tails. CIP is a measure of capital mobility commonly used to analyse financial integration, which remains an interesting subject of study in the context of the European Union. The importance of this feature stems from the fact that the adoption of a common currency brings certain benefits for countries but also involves risks, such as the loss of economic instruments with which to face possible asymmetric shocks. While studying the Eurozone members could explain some problems with the common currency, studying the non-Euro countries is important for assessing whether they are positioned to reap the possible benefits. Our results point to CIP being verified mainly in the Central European countries, while in the remaining countries the verification of the parity is only residual.
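As a rough illustration of the scale-dependent regression framework (not the authors' code), the sketch below estimates a DCCA-based slope beta(s) = F2_xy(s) / F2_xx(s) from detrended covariances, using linear detrending and non-overlapping windows; the box handling and detrending order are simplifying assumptions.

    import numpy as np

    def dcca_fluctuation(x, y, s):
        """Detrended covariance F2_xy(s) for box size s (linear detrending)."""
        X = np.cumsum(x - x.mean())          # integrated profiles
        Y = np.cumsum(y - y.mean())
        n = len(X)
        t = np.arange(s)
        covs = []
        for start in range(0, n - s + 1, s):  # non-overlapping boxes
            xb, yb = X[start:start + s], Y[start:start + s]
            rx = xb - np.polyval(np.polyfit(t, xb, 1), t)  # remove linear trend
            ry = yb - np.polyval(np.polyfit(t, yb, 1), t)
            covs.append(np.mean(rx * ry))
        return np.mean(covs)

    def dcca_beta(x, y, s):
        """Scale-dependent regression slope beta(s) = F2_xy(s) / F2_xx(s)."""
        return dcca_fluctuation(x, y, s) / dcca_fluctuation(x, x, s)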
Faster Double-Size Bipartite Multiplication out of Montgomery Multipliers
NASA Astrophysics Data System (ADS)
Yoshino, Masayuki; Okeya, Katsuyuki; Vuillaume, Camille
This paper proposes novel algorithms for computing double-size modular multiplications with few modulus-dependent precomputations. Low-end devices such as smartcards are usually equipped with hardware Montgomery multipliers. However, due to progress in mathematical attacks, security institutions such as NIST have steadily demanded longer bit-lengths for public-key cryptography, making the multipliers quickly obsolete. In an attempt to extend the lifespan of such multipliers, double-size techniques compute modular multiplications with twice the bit-length of the multipliers. Techniques are known for extending the bit-length of classical Euclidean multipliers, of Montgomery multipliers, and of the combination thereof, namely bipartite multipliers. However, unlike classical and bipartite multiplications, Montgomery multiplications involve modulus-dependent precomputations, which account for a large part of an RSA encryption or signature verification. The proposed double-size technique simulates double-size multiplications based on single-size Montgomery multipliers, and yet the precomputations are essentially free: in a 2048-bit RSA encryption or signature verification with public exponent e = 2^16 + 1, the proposal with a 1024-bit Montgomery multiplier is at least 1.5 times faster than previous double-size Montgomery multiplications.
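For background only, the following sketch shows a plain single-size Montgomery reduction (REDC) of the kind such hardware multipliers implement; it is not the paper's double-size or bipartite construction, and the parameter names are ours.

    def montgomery_setup(n, r_bits):
        """Precompute constants for REDC with R = 2**r_bits (n must be odd)."""
        r = 1 << r_bits
        n_prime = (-pow(n, -1, r)) % r       # n * n_prime == -1 (mod R)
        return r, n_prime

    def redc(t, n, r_bits, n_prime):
        """Montgomery reduction: returns t * R^-1 mod n."""
        r_mask = (1 << r_bits) - 1
        m = ((t & r_mask) * n_prime) & r_mask
        u = (t + m * n) >> r_bits
        return u - n if u >= n else u

    def mont_mul(a, b, n, r_bits, n_prime):
        """Montgomery product of a and b, both given in Montgomery form."""
        return redc(a * b, n, r_bits, n_prime)

    # Quick check with small numbers (n odd, R > n)
    n, r_bits = 97, 8
    r, n_prime = montgomery_setup(n, r_bits)
    a, b = 42, 73
    am, bm = (a * r) % n, (b * r) % n        # convert to Montgomery form
    assert redc(mont_mul(am, bm, n, r_bits, n_prime), n, r_bits, n_prime) == (a * b) % n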
Fast regional readout CMOS Image Sensor for dynamic MLC tracking
NASA Astrophysics Data System (ADS)
Zin, H.; Harris, E.; Osmond, J.; Evans, P.
2014-03-01
Advanced radiotherapy techniques such as volumetric modulated arc therapy (VMAT) require verification of the complex beam delivery, including tracking of multileaf collimators (MLC) and monitoring of the dose rate. This work explores the feasibility of a prototype complementary metal-oxide-semiconductor image sensor (CIS) for tracking these complex treatments by utilising fast region-of-interest (ROI) readout functionality. An automatic edge-tracking algorithm was used to locate the MLC leaf edges moving at various speeds (from a moving triangle field shape) and imaged at various sensor frame rates. The CIS demonstrated successful edge detection of the dynamic MLC motion to within an accuracy of 1.0 mm. This demonstrates the feasibility of using the sensor to verify treatment deliveries involving dynamic MLC at up to ~400 frames per second (equivalent to the linac pulse rate), which is superior to current techniques such as electronic portal imaging devices (EPIDs). The CIS provides the basis for an essential real-time verification tool, useful in assessing accurate delivery of complex high-energy radiation to the tumour and, ultimately, in achieving better cure rates for cancer patients.
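The abstract does not give the edge-tracking algorithm in detail; as a simple stand-in, the Python sketch below locates a leaf edge in each ROI row by a half-maximum threshold with linear sub-pixel interpolation, which is one common approach but not necessarily the one used in this work.

    import numpy as np

    def leaf_edge_positions(roi, threshold=None):
        """Estimate an MLC leaf-edge position in each row of a ROI image.

        roi: 2D array of pixel intensities, rows perpendicular to leaf travel (assumed).
        Returns sub-pixel column indices where intensity crosses the half-maximum level.
        """
        edges = []
        for row in roi:
            half = threshold if threshold is not None else 0.5 * (row.max() + row.min())
            above = row >= half
            # first transition between open (bright) and blocked (dark) field
            idx = np.argmax(~above) if above[0] else np.argmax(above)
            if idx == 0:
                edges.append(np.nan)          # no transition found in this row
                continue
            x0, x1 = idx - 1, idx
            # linear interpolation between the two pixels straddling the half-maximum
            frac = (half - row[x0]) / (row[x1] - row[x0])
            edges.append(x0 + frac)
        return np.array(edges)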