Sample records for unique verification context

  1. [Uniqueness seeking behavior as a self-verification: an alternative approach to the study of uniqueness].

    PubMed

    Yamaoka, S

    1995-06-01

    Uniqueness theory explains that extremely high perceived similarity between self and others evokes negative emotional reactions and causes uniqueness seeking behavior. However, the theory conceptualizes similarity so ambiguously that it appears to suffer from low predictive validity. The purpose of the current article is to propose an alternative explanation of uniqueness seeking behavior. It posits that perceived uniqueness deprivation is a threat to self-concepts, and therefore causes self-verification behavior. Two levels of self verification are conceived: one based on personal categorization and the other on social categorization. The present approach regards uniqueness seeking behavior as the personal-level self verification. To test these propositions, a 2 (very high or moderate similarity information) x 2 (with or without outgroup information) x 2 (high or low need for uniqueness) between-subject factorial-design experiment was conducted with 95 university students. Results supported the self-verification approach, and were discussed in terms of effects of uniqueness deprivation, levels of self-categorization, and individual differences in need for uniqueness.

  2. Verification of road databases using multiple road models

    NASA Astrophysics Data System (ADS)

    Ziems, Marcel; Rottensteiner, Franz; Heipke, Christian

    2017-08-01

    In this paper a new approach for automatic road database verification based on remote sensing images is presented. In contrast to existing methods, the applicability of the new approach is not restricted to specific road types, context areas or geographic regions. This is achieved by combining several state-of-the-art road detection and road verification approaches that work well under different circumstances. Each one serves as an independent module representing a unique road model and a specific processing strategy. All modules provide independent solutions for the verification problem of each road object stored in the database in the form of two probability distributions: the first for the state of a database object (correct or incorrect), and the second for the state of the underlying road model (applicable or not applicable). In accordance with Dempster-Shafer theory, both distributions are mapped to a new state space comprising the classes correct, incorrect and unknown. Statistical reasoning is applied to obtain the optimal state of a road object. A comparison with state-of-the-art road detection approaches using benchmark datasets shows that in general the proposed approach provides results with larger completeness. Additional experiments reveal that the proposed method supports the design of a highly reliable semi-automatic approach for road database verification.
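    The per-module mapping and fusion step described in this abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the mass assignment (applicability discounting) and the decision rule are assumptions for the sketch.

    ```python
    # Each module reports p_correct (state of the database object) and
    # p_applicable (state of its road model). These are mapped to belief
    # masses over {correct, incorrect, unknown} and fused with Dempster's
    # rule; 'unknown' plays the role of the full frame {correct, incorrect}.

    def module_masses(p_correct, p_applicable):
        """Map a module's two probabilities to a basic mass assignment."""
        return {
            "correct": p_applicable * p_correct,
            "incorrect": p_applicable * (1.0 - p_correct),
            "unknown": 1.0 - p_applicable,  # model not applicable -> ignorance
        }

    def combine(m1, m2):
        """Dempster's rule of combination (assumes non-total conflict)."""
        conflict = (m1["correct"] * m2["incorrect"] +
                    m1["incorrect"] * m2["correct"])
        norm = 1.0 - conflict
        return {
            "correct": (m1["correct"] * m2["correct"] +
                        m1["correct"] * m2["unknown"] +
                        m1["unknown"] * m2["correct"]) / norm,
            "incorrect": (m1["incorrect"] * m2["incorrect"] +
                          m1["incorrect"] * m2["unknown"] +
                          m1["unknown"] * m2["incorrect"]) / norm,
            "unknown": m1["unknown"] * m2["unknown"] / norm,
        }

    # Two hypothetical modules assessing the same road object:
    fused = combine(module_masses(0.9, 0.8), module_masses(0.7, 0.5))
    state = max(fused, key=fused.get)  # most-supported state of the object
    ```

    Note the effect of the second distribution: a module whose road model is judged inapplicable contributes mostly to "unknown" and therefore barely influences the fused verdict.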

  3. From model conception to verification and validation, a global approach to multiphase Navier-Stokes models with an emphasis on volcanic explosive phenomenology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dartevelle, Sebastian

    2007-10-01

    Large-scale volcanic eruptions are hazardous events that cannot be described by detailed and accurate in situ measurements: hence, little to no real-time data exists to rigorously validate current computer models of these events. In addition, such phenomenology involves highly complex, nonlinear, and unsteady physical behaviors across many spatial and time scales. As a result, volcanic explosive phenomenology is poorly understood in terms of its physics, and inadequately constrained in terms of initial, boundary, and inflow conditions. Nevertheless, code verification and validation become even more critical because more and more volcanologists use numerical data for assessment and mitigation of volcanic hazards. In this report, we evaluate the process of model and code development in the context of geophysical multiphase flows. We describe: (1) the conception of a theoretical, multiphase, Navier-Stokes model, (2) its implementation into a numerical code, (3) the verification of the code, and (4) the validation of such a model within the context of turbulent and underexpanded jet physics. Within the validation framework, we suggest focusing on the key physics that control volcanic clouds, namely, the momentum-driven supersonic jet and the buoyancy-driven turbulent plume. For instance, we propose to compare numerical results against a set of simple and well-constrained analog experiments, which uniquely and unambiguously represent each of the key phenomena.

  4. Verification of S&D Solutions for Network Communications and Devices

    NASA Astrophysics Data System (ADS)

    Rudolph, Carsten; Compagna, Luca; Carbone, Roberto; Muñoz, Antonio; Repp, Jürgen

    This chapter describes the tool-supported verification of S&D Solutions on the level of network communications and devices. First, the general goals and challenges of verification in the context of AmI systems are highlighted and the role of verification and validation within the SERENITY processes is explained. Then, SERENITY extensions to the SH VErification tool are explained using small examples. Finally, the applicability of existing verification tools is discussed in the context of the AVISPA toolset. The two different tools show that, for the security analysis of network and device S&D Patterns, relevant complementary approaches exist and can be used.

  5. 30 CFR 250.909 - What is the Platform Verification Program?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 2 2010-07-01 2010-07-01 false What is the Platform Verification Program? 250... Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms; platforms of a new or unique design...

  6. Supporting Technology for Chain of Custody of Nuclear Weapons and Materials throughout the Dismantlement and Disposition Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bunch, Kyle J.; Jones, Anthony M.; Ramuhalli, Pradeep

    The ratification and ongoing implementation of the New START Treaty have been widely regarded as noteworthy global security achievements for both the Obama Administration and the Putin (formerly Medvedev) regime. But deeper cuts that move beyond the United States and Russia to engage the P-5 and other nuclear weapons possessor states are envisioned under future arms control regimes, and are indeed required for the P-5 in accordance with their Article VI disarmament obligations in the Nuclear Non-Proliferation Treaty. Future verification needs will include monitoring the cessation of production of new fissile material for weapons, monitoring storage of warhead components and fissile materials, and verifying dismantlement of warheads, pits, secondary stages, and other materials. A fundamental challenge to implementing a nuclear disarmament regime is the ability to thwart unauthorized material diversion throughout the dismantlement and disposition process through strong chain of custody implementation. Verifying the declared presence, or absence, of nuclear materials and weapons components throughout the dismantlement and disposition lifecycle is a critical aspect of the disarmament process. From both the diplomatic and technical perspectives, verification under these future arms control regimes will require new solutions. Since any acceptable verification technology must protect sensitive design information and attributes to prevent the release of classified or other proliferation-sensitive information, non-nuclear, non-sensitive modalities may provide significant new verification tools which do not require the use of additional information barriers. Alternative verification technologies based upon electromagnetics and acoustics could potentially play an important role in fulfilling the challenging requirements of future verification regimes.
    For example, researchers at the Pacific Northwest National Laboratory (PNNL) have demonstrated that low frequency electromagnetic signatures of sealed metallic containers can be used to rapidly confirm the presence of specific components on a yes/no basis without revealing classified information. PNNL researchers have also used ultrasonic measurements to obtain images of material microstructures which may be used as templates or unique identifiers of treaty-limited items. Such alternative technologies are suitable for application in various stages of weapons dismantlement and often include the advantage of an inherent information barrier due to the inability to extract classified weapon design information from the collected data. As a result, these types of technologies complement radiation-based verification methods for arms control. This article presents an overview of several alternative verification technologies that are suitable for supporting a future, broader and more intrusive arms control regime that spans the nuclear weapons disarmament lifecycle. The general capabilities and limitations of each verification modality are discussed and example technologies are presented. Potential applications are defined in the context of the nuclear material and weapons lifecycle. Example applications range from authentication (e.g., tracking and signatures within the chain of custody from downloading through weapons storage, unclassified templates and unique identification) to verification of absence and final material disposition.

  7. Verification of Autonomous Systems for Space Applications

    NASA Technical Reports Server (NTRS)

    Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.

    2006-01-01

    Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, that flexibility also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of the behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.

  8. Can legality verification enhance local rights to forest resources? Piloting the policy learning protocol in the Peruvian forest context

    Treesearch

    B. Cashore; I. Visseren-Hamakers; P. Caro Torres; W. de Jong; A. Denvir; D. Humphreys; Kathleen McGinley; G. Auld; S. Lupberger; C. McDermott; S. Sax; D. Yin

    2016-01-01

    This report, “Can Legality Verification Enhance Local Rights to Forest Resources? Piloting the policy learning protocol in the Peruvian forest context,” documents the testing of the 11-step Policy Learning Protocol in Peru in 2015-16. The Protocol (Cashore et al. 2014) enables actors to draw from international policy initiatives in order to improve...

  9. Formal Verification for a Next-Generation Space Shuttle

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy D.; Pecheur, Charles; Koga, Dennis (Technical Monitor)

    2002-01-01

    This paper discusses the verification and validation (V&V) of advanced software used for integrated vehicle health monitoring (IVHM), in the context of NASA's next-generation space shuttle. We survey the current V&V practice and standards used in selected NASA projects, review applicable formal verification techniques, and discuss their integration into existing development practice and standards. We also describe two verification tools, JMPL2SMV and Livingstone PathFinder, that can be used to thoroughly verify diagnosis applications that use model-based reasoning, such as the Livingstone system.

  10. Characterization of palmprints by wavelet signatures via directional context modeling.

    PubMed

    Zhang, Lei; Zhang, David

    2004-06-01

    The palmprint is one of the most reliable physiological characteristics that can be used to distinguish between individuals. Current palmprint-based systems are more user friendly, more cost effective, and require fewer data signatures than traditional fingerprint-based identification systems. The principal lines and wrinkles captured in a low-resolution palmprint image provide more than enough information to uniquely identify an individual. This paper presents a palmprint identification scheme that characterizes a palmprint using a set of statistical signatures. The palmprint is first transformed into the wavelet domain, and the directional context of each wavelet subband is defined and computed in order to collect the predominant coefficients of its principal lines and wrinkles. A set of statistical signatures, which includes gravity center, density, spatial dispersivity and energy, is then defined to characterize the palmprint with the selected directional context values. A classification and identification scheme based on these signatures is subsequently developed. This scheme exploits the features of principal lines and prominent wrinkles sufficiently and achieves satisfactory results. Compared with the line-segments-matching or interesting-points-matching based palmprint verification schemes, the proposed scheme uses a much smaller amount of data signatures. It also provides a convenient classification strategy and more accurate identification.
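    The statistical signatures named in this abstract (gravity center, density, energy) are simple functions of a subband's coefficients. The sketch below illustrates them for one subband; the magnitude-threshold rule standing in for the paper's directional-context selection of predominant coefficients is an assumption for the sketch.

    ```python
    # Compute per-subband statistical signatures from a 2-D array of
    # wavelet coefficients (here a plain list of lists). "Significant"
    # coefficients are selected by a magnitude threshold, a simplified
    # stand-in for the directional-context selection described above.

    def subband_signatures(coeffs, threshold=1.0):
        rows, cols = len(coeffs), len(coeffs[0])
        significant = [(i, j) for i in range(rows) for j in range(cols)
                       if abs(coeffs[i][j]) >= threshold]
        energy = sum(c * c for row in coeffs for c in row)
        density = len(significant) / (rows * cols)
        if significant:
            gx = sum(i for i, _ in significant) / len(significant)
            gy = sum(j for _, j in significant) / len(significant)
        else:
            gx = gy = 0.0
        return {"energy": energy, "density": density, "gravity": (gx, gy)}

    # Toy 2x2 subband: two large coefficients model a principal line.
    sig = subband_signatures([[0.2, 2.0], [1.5, 0.1]])
    ```

    Concatenating such signatures across subbands yields the compact feature vector that the paper matches instead of raw line segments or interest points.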

  11. Formal verification and testing: An integrated approach to validating Ada programs

    NASA Technical Reports Server (NTRS)

    Cohen, Norman H.

    1986-01-01

    An integrated set of tools called a validation environment is proposed to support the validation of Ada programs by a combination of methods. A Modular Ada Validation Environment (MAVEN) is described which proposes a context in which formal verification can fit into the industrial development of Ada software.

  12. Verification Processes in Recognition Memory: The Role of Natural Language Mediators

    ERIC Educational Resources Information Center

    Marshall, Philip H.; Smith, Randolph A. S.

    1977-01-01

    The existence of verification processes in recognition memory was confirmed in the context of Adams' (Adams & Bray, 1970) closed-loop theory. Subjects' recognition was tested following a learning session. The expectation was that data would reveal consistent internal relationships supporting the position that natural language mediation plays…

  13. Monitoring and verification R&D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pilat, Joseph F; Budlong - Sylvester, Kory W; Fearey, Bryan L

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R&D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R&D required to address these gaps and other monitoring and verification challenges.

  14. SSME lifetime prediction and verification, integrating environments, structures, materials: The challenge

    NASA Technical Reports Server (NTRS)

    Ryan, R. S.; Salter, L. D.; Young, G. M., III; Munafo, P. M.

    1985-01-01

    The planned missions for the space shuttle dictated a unique and technology-extending rocket engine. The high specific impulse requirements in conjunction with a 55-mission lifetime, plus volume and weight constraints, produced unique structural design, manufacturing, and verification requirements. Operations from Earth to orbit produce severe dynamic environments, which couple with the extreme pressure and thermal environments associated with the high performance, creating large low-cycle loads and high alternating stresses above the endurance limit, which result in high sensitivity to alternating stresses. Combining all of these effects resulted in requirements for exotic materials, which are more susceptible to manufacturing problems, and the use of an all-welded structure. The challenge of integrating environments, dynamics, structures, and materials into a verified SSME structure is discussed. The verification program and developmental flight results are included. The first six shuttle flights had engine performance as predicted with no failures. The engine system has met the basic design challenges.

  15. Biometric template revocation

    NASA Astrophysics Data System (ADS)

    Arndt, Craig M.

    2004-08-01

    Biometrics are a powerful technology for identifying humans both locally and at a distance. In order to perform identification or verification, biometric systems capture an image of some biometric of a user or subject. The image is then converted mathematically to a representation of the person called a template. Since every human in the world is different, each human will have different biometric images (different fingerprints, faces, etc.). This is what makes biometrics useful for identification. However, unlike a credit card number or a password, which can be given to a person and later revoked if it is compromised, a biometric is with the person for life. The problem then is to develop biometric templates which can be easily revoked and reissued, which are also unique to the user, and which can be easily used for identification and verification. In this paper we develop and present a method to generate a set of templates which are fully unique to the individual and also revocable. By using basis-set compression algorithms in an n-dimensional orthogonal space, we can represent a given biometric image in an infinite number of equally valued and unique ways. The verification and biometric matching system would be presented with a given template and a revocation code. The code then indicates where in the sequence of n-dimensional vectors to start the recognition.
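    The revocable-template idea in this abstract can be illustrated with a minimal sketch. As an assumption for the sketch, a key-seeded signed permutation (one simple orthogonal transform) stands in for the paper's n-dimensional basis-set compression; all names here are illustrative, not the authors' method.

    ```python
    # Revocable biometric template via a code-seeded orthogonal transform.
    # Reissuing with a new revocation code yields a new, differently
    # transformed template from the same underlying biometric features.
    import random

    def make_template(features, revocation_code):
        """Apply a signed permutation (an orthogonal transform) derived
        deterministically from the revocation code."""
        rng = random.Random(revocation_code)
        perm = list(range(len(features)))
        rng.shuffle(perm)
        signs = [rng.choice((-1.0, 1.0)) for _ in features]
        return [signs[k] * features[perm[k]] for k in range(len(features))]

    def verify(candidate_features, stored_template, revocation_code, tol=1e-6):
        """Re-derive the template from the probe features and compare."""
        probe = make_template(candidate_features, revocation_code)
        return all(abs(a - b) <= tol for a, b in zip(probe, stored_template))

    features = [0.4, 1.2, -0.7, 0.9]          # toy extracted feature vector
    t_old = make_template(features, revocation_code=1234)
    t_new = make_template(features, revocation_code=9876)  # reissued template
    ```

    Because the transform is orthogonal, the template preserves the geometry needed for matching while the stored vector itself can be discarded and reissued if compromised.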

  16. Verification Image of The Veins on The Back Palm with Modified Local Line Binary Pattern (MLLBP) and Histogram

    NASA Astrophysics Data System (ADS)

    Prijono, Agus; Darmawan Hangkawidjaja, Aan; Ratnadewi; Saleh Ahmar, Ansari

    2018-01-01

    The verification methods in use today, such as fingerprints, signatures, and personal identification numbers (PINs) in banking systems, identity cards, and attendance systems, are easily copied and forged. This leaves such systems insecure and vulnerable to access by unauthorized persons. In this research, a verification system is implemented using images of the blood vessels on the back of the palm; this form of recognition is more difficult to imitate because the vessels are located inside the human body, so it is safer to use. The blood vessel pattern on the back of the human hand is unique; even twins have different blood vessel images. Moreover, the blood vessel image does not depend on a person's age, so it can be used long-term, except in cases of accident or disease. Because the vein pattern is unique, it can be used to recognize a person. In this paper, we use a modified method to identify a person based on blood vessel images, namely the Modified Local Line Binary Pattern (MLLBP). Matching of the extracted blood vessel image features uses the Hamming distance. One verification test case computes the percentage of correct acceptances of the same person; a rejection error occurs if a person is not matched by the system against his or her own data. Comparing 15 probe images against 5 enrolled vein images for each of 10 persons resulted in 80.67% successful verification. Another verification test case compares images from different persons, i.e., forgeries; the verification is correct if the system rejects the forged image. Rejecting forgeries across ten different persons achieved a result of 94%.
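    The matching stage described in this abstract can be sketched as follows. MLLBP feature extraction is out of scope here, so two binary feature strings stand in for its output; the acceptance threshold is an assumed value, not taken from the paper.

    ```python
    # Hamming-distance matching of binary feature strings, as used for
    # comparing MLLBP features of dorsal-hand vein images.

    def hamming_distance(bits_a, bits_b):
        """Number of positions at which two equal-length bit strings differ."""
        if len(bits_a) != len(bits_b):
            raise ValueError("feature vectors must have equal length")
        return sum(a != b for a, b in zip(bits_a, bits_b))

    def verify(enrolled_bits, probe_bits, max_fraction=0.25):
        """Accept the probe if the normalized Hamming distance is within
        the threshold (0.25 is an assumed value for the sketch)."""
        d = hamming_distance(enrolled_bits, probe_bits)
        return d / len(enrolled_bits) <= max_fraction

    enrolled = "1011001110100101"
    genuine  = "1011001010100101"   # one bit differs from the enrolled image
    forged   = "0100110001011010"   # complement: every bit differs
    ```

    In the paper's second test case, a forgery is handled correctly exactly when `verify` returns False for the impostor's feature string.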

  17. Identification and verification of hybridoma-derived monoclonal antibody variable region sequences using recombinant DNA technology and mass spectrometry

    USDA-ARS?s Scientific Manuscript database

    Antibody engineering requires the identification of antigen binding domains or variable regions (VR) unique to each antibody. It is the VR that define the unique antigen binding properties and proper sequence identification is essential for functional evaluation and performance of recombinant antibo...

  18. Verification Techniques for Parameter Selection and Bayesian Model Calibration Presented for an HIV Model

    NASA Astrophysics Data System (ADS)

    Wentworth, Mami Tonoe

    Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification is defined as quantifying and reducing uncertainties, and the objective is to quantify uncertainties in parameters, models and measurements, and to propagate these uncertainties through the model, so that one can make a predictive estimate with quantified uncertainties. Two of the aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation, we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of an HIV disease. These are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, implying that they cannot be uniquely determined from the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impact on the model response are determined. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is part of nuclear reactor models.
We employ this simple heat model to illustrate verification techniques for model calibration. For Bayesian model calibration, we employ adaptive Metropolis algorithms to construct densities for input parameters in the heat model and the HIV model. To quantify the uncertainty in the parameters, we employ two MCMC algorithms: Delayed Rejection Adaptive Metropolis (DRAM) [33] and Differential Evolution Adaptive Metropolis (DREAM) [66, 68]. The densities obtained using these methods are compared to those obtained through the direct numerical evaluation of the Bayes' formula. We also combine uncertainties in input parameters and measurement errors to construct predictive estimates for a model response. A significant emphasis is on the development and illustration of techniques to verify the accuracy of sampling-based Metropolis algorithms. We verify the accuracy of DRAM and DREAM by comparing chains, densities and correlations obtained using DRAM, DREAM and the direct evaluation of Bayes formula. We also perform similar analysis for credible and prediction intervals for responses. Once the parameters are estimated, we employ energy statistics test [63, 64] to compare the densities obtained by different methods for the HIV model. The energy statistics are used to test the equality of distributions. We also consider parameter selection and verification techniques for models having one or more parameters that are noninfluential in the sense that they minimally impact model outputs. We illustrate these techniques for a dynamic HIV model but note that the parameter selection and verification framework is applicable to a wide range of biological and physical models. 
To accommodate the nonlinear input to output relations, which are typical for such models, we focus on global sensitivity analysis techniques, including those based on partial correlations, Sobol indices based on second-order model representations, and Morris indices, as well as a parameter selection technique based on standard errors. A significant objective is to provide verification strategies to assess the accuracy of those techniques, which we illustrate in the context of the HIV model. Finally, we examine active subspace methods as an alternative to parameter subset selection techniques. The objective of active subspace methods is to determine the subspace of inputs that most strongly affect the model response, and to reduce the dimension of the input space. The major difference between active subspace methods and parameter selection techniques is that parameter selection identifies influential parameters whereas subspace selection identifies a linear combination of parameters that impacts the model responses significantly. We employ active subspace methods discussed in [22] for the HIV model and present a verification that the active subspace successfully reduces the input dimensions.
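    The Bayesian calibration step described in this record rests on Metropolis-type sampling; DRAM and DREAM add delayed rejection, adaptation, and differential-evolution proposals on top of the random-walk core sketched below. The Gaussian target is an assumption standing in for a real posterior; using a target with a known mean lets the chain be checked against a known answer, in the spirit of the verification framework discussed above.

    ```python
    # Minimal random-walk Metropolis sampler with a known Gaussian target,
    # so the chain's posterior mean can be verified against mu.
    import math
    import random

    def log_posterior(theta, mu=2.0, sigma=0.5):
        """Stand-in log-posterior: Gaussian with known mean and std."""
        return -0.5 * ((theta - mu) / sigma) ** 2

    def metropolis(n_samples, theta0=0.0, step=0.5, seed=42):
        rng = random.Random(seed)
        chain, theta, logp = [], theta0, log_posterior(theta0)
        for _ in range(n_samples):
            proposal = theta + rng.gauss(0.0, step)
            logp_prop = log_posterior(proposal)
            # Metropolis accept/reject (guarding against log(0)):
            if logp_prop >= logp or rng.random() < math.exp(logp_prop - logp):
                theta, logp = proposal, logp_prop
            chain.append(theta)
        return chain

    chain = metropolis(20000)
    burned = chain[5000:]                # discard burn-in
    mean = sum(burned) / len(burned)     # should approach mu = 2.0
    ```

    Comparing such chains, densities, and credible intervals against a direct evaluation of Bayes' formula is exactly the kind of accuracy check the dissertation applies to DRAM and DREAM.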

  19. Arms Control Verification: ’Bridge’ Theories and the Politics of Expediency.

    DTIC Science & Technology

    1983-04-01

    that the compliance verification dilemma, a uniquely American problem, creates a set of opportunities that are, in fact, among the principal reasons for...laws of the class struggle. While Americans were arguing among themselves about whether detente should involve political "linkage," the Chairman...required an equivalent American willingness to persevere indefinitely. But to generate that kind of fervor among the voting populace would have required

  20. Digital Pharmacovigilance and Disease Surveillance: Combining Traditional and Big-Data Systems for Better Public Health

    PubMed Central

    Salathé, Marcel

    2016-01-01

    The digital revolution has contributed to very large data sets (ie, big data) relevant for public health. The two major data sources are electronic health records from traditional health systems and patient-generated data. As the two data sources have complementary strengths—high veracity in the data from traditional sources and high velocity and variety in patient-generated data—they can be combined to build more-robust public health systems. However, they also have unique challenges. Patient-generated data in particular are often completely unstructured and highly context dependent, posing essentially a machine-learning challenge. Some recent examples from infectious disease surveillance and adverse drug event monitoring demonstrate that the technical challenges can be solved. Despite these advances, the problem of verification remains, and unless traditional and digital epidemiologic approaches are combined, these data sources will be constrained by their intrinsic limits. PMID:28830106

  1. Integrated Disposal Facility FY 2016: ILAW Verification and Validation of the eSTOMP Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freedman, Vicky L.; Bacon, Diana H.; Fang, Yilin

    2016-05-13

    This document describes two sets of simulations carried out to further verify and validate the eSTOMP simulator. In this report, a distinction is made between verification and validation, and the focus is on verifying eSTOMP through a series of published benchmarks on cementitious wastes, and validating eSTOMP based on a lysimeter experiment for the glassified waste. These activities are carried out within the context of a scientific view of validation that asserts that models can only be invalidated, and that model validation (and verification) is a subjective assessment.

  2. National Center for Nuclear Security - NCNS

    ScienceCinema

    None

    2018-01-16

    As the United States embarks on a new era of nuclear arms control, the tools for treaty verification must be accurate and reliable, and must work at stand-off distances. The National Center for Nuclear Security, or NCNS, at the Nevada National Security Site, is poised to become the proving ground for these technologies. The center is a unique test bed for non-proliferation and arms control treaty verification technologies. The NNSS is an ideal location for these kinds of activities because of its multiple environments, its cadre of experienced nuclear personnel, and the artifacts of atmospheric and underground nuclear weapons explosions. The NCNS will provide future treaty negotiators with solid data on verification and inspection regimes and a realistic environment in which future treaty verification specialists can be trained. Work on warhead monitoring at the NCNS will also support future arms reduction treaties.

  3. Options and Risk for Qualification of Electric Propulsion System

    NASA Technical Reports Server (NTRS)

    Bailey, Michelle; Daniel, Charles; Cook, Steve (Technical Monitor)

    2002-01-01

    Electric propulsion vehicle systems encompass a wide range of propulsion alternatives, including solar and nuclear, which present unique circumstances for qualification. This paper will address the alternatives for qualification of electric propulsion spacecraft systems. The approach taken will be to address the considerations for qualification at the various levels of systems definition. Additionally, for each level of qualification the system-level risk implications will be developed. The paper will also explore the implications of analysis versus test at various levels of systems definition, while retaining the objectives of a verification program. The limitations of terrestrial testing will be explored along with the risk and implications of orbital demonstration testing. The paper will seek to develop a template for structuring a verification program based on cost, risk and value return. A successful verification program should establish controls and define the objectives of the verification compliance program. Finally, the paper will address the political and programmatic factors which may impact options for system verification.

  4. Simulation validation and management

    NASA Astrophysics Data System (ADS)

    Illgen, John D.

    1995-06-01

    Illgen Simulation Technologies, Inc., has been working on interactive verification and validation programs for the past six years. As a result, it has evolved a methodology that has been adopted and successfully implemented by a number of different verification and validation programs. This methodology employs a unique suite of computer-aided software engineering (CASE) tools to reverse engineer source code and produce analytical outputs (flow charts and tables) that aid the engineer/analyst in the verification and validation process. We have found that the use of CASE tools saves time, which equates to improvements in both schedule and cost. This paper will describe the ISTI-developed methodology and how CASE tools are used in its support. Case studies will be discussed.

  5. A Tool for Intersecting Context-Free Grammars and Its Applications

    NASA Technical Reports Server (NTRS)

    Gange, Graeme; Navas, Jorge A.; Schachte, Peter; Sondergaard, Harald; Stuckey, Peter J.

    2015-01-01

    This paper describes a tool for intersecting context-free grammars. Since this problem is undecidable the tool follows a refinement-based approach and implements a novel refinement which is complete for regularly separable grammars. We show its effectiveness for safety verification of recursive multi-threaded programs.

  6. A uniqueness-and-anonymity-preserving remote user authentication scheme for connected health care.

    PubMed

    Chang, Ya-Fen; Yu, Shih-Hui; Shiao, Ding-Rui

    2013-04-01

    Connected health care provides new opportunities for improving financial and clinical performance. Many connected health care applications, such as telecare medicine information systems, personally controlled health records systems, and patient monitoring, have been proposed. Correct, quality care is the goal of connected health care, and user authentication can ensure the legitimacy of patients. After reviewing authentication schemes for connected health care applications, we find that many of them cannot protect patient privacy: others can trace users/patients through the transmitted data. Moreover, the verification tokens these schemes use to authenticate users or servers are limited to passwords, smart cards and RFID tags, which are neither unique nor hard to copy. Biometric characteristics such as the iris, face, voiceprint and fingerprint, by contrast, are unique, easy to verify, and hard to copy. In this paper, a biometrics-based user authentication scheme is proposed to ensure uniqueness and anonymity at the same time. With the proposed scheme, only the legal user/patient himself/herself can access the remote server, and no one can trace him/her from the transmitted data.

  7. Structural Design Requirements and Factors of Safety for Spaceflight Hardware: For Human Spaceflight. Revision A

    NASA Technical Reports Server (NTRS)

    Bernstein, Karen S.; Kujala, Rod; Fogt, Vince; Romine, Paul

    2011-01-01

    This document establishes the structural requirements for human-rated spaceflight hardware including launch vehicles, spacecraft and payloads. These requirements are applicable to Government Furnished Equipment activities as well as all related contractor, subcontractor and commercial efforts. These requirements are not imposed on systems other than human-rated spacecraft, such as ground test articles, but may be tailored for use in specific cases where it is prudent to do so such as for personnel safety or when assets are at risk. The requirements in this document are focused on design rather than verification. Implementation of the requirements is expected to be described in a Structural Verification Plan (SVP), which should describe the verification of each structural item for the applicable requirements. The SVP may also document unique verifications that meet or exceed these requirements with NASA Technical Authority approval.

  8. Engineering of the LISA Pathfinder mission—making the experiment a practical reality

    NASA Astrophysics Data System (ADS)

    Warren, Carl; Dunbar, Neil; Backler, Mike

    2009-05-01

    LISA Pathfinder represents a unique challenge in the development of scientific spacecraft—not only is the LISA Test Package (LTP) payload a complex integrated development, placing stringent requirements on its developers and the spacecraft, but the payload also acts as the core sensor and actuator for the spacecraft, making the tasks of control design, software development and system verification unusually difficult. The micro-propulsion system which provides the remaining actuation also presents substantial development and verification challenges. As the mission approaches the system critical design review, flight hardware is completing verification and the process of verification using software and hardware simulators and test benches is underway. Preparation for operations has started, but critical milestones for LTP and field effect electric propulsion (FEEP) lie ahead. This paper summarizes the status of the present development and outlines the key challenges that must be overcome on the way to launch.

  9. Hardware proofs using EHDM and the RSRE verification methodology

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Sjogren, Jon A.

    1988-01-01

    Examined is a methodology for hardware verification developed by the Royal Signals and Radar Establishment (RSRE) in the context of SRI International's Enhanced Hierarchical Design Methodology (EHDM) specification/verification system. The methodology utilizes a four-level specification hierarchy with the following levels: functional level, finite automata model, block model, and circuit level. The properties of a level are proved as theorems in the level below it. This methodology is applied to a 6-bit counter problem and is critically examined. The specifications are written in EHDM's specification language, Extended Special, and the proof experience suggests improvements to both the RSRE methodology and the EHDM system.

  10. Thermal System Verification and Model Validation for NASA's Cryogenic Passively Cooled James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Cleveland, Paul E.; Parrish, Keith A.

    2005-01-01

    A thorough and unique thermal verification and model validation plan has been developed for NASA's James Webb Space Telescope. The JWST observatory consists of a large deployed-aperture optical telescope passively cooled to below 50 Kelvin along with a suite of several instruments passively and actively cooled to below 37 Kelvin and 7 Kelvin, respectively. Passive cooling to these extremely low temperatures is made feasible by the use of a large deployed high-efficiency sunshield and an orbit location at the L2 Lagrange point. Another enabling feature is the scale or size of the observatory, which allows for large radiator sizes that are compatible with the expected power dissipation of the instruments and large-format Mercury Cadmium Telluride (HgCdTe) detector arrays. This passive cooling concept is simple, reliable, and mission enabling when compared to the alternatives of mechanical coolers and stored cryogens. However, these same large-scale observatory features, which make passive cooling viable, also prevent the typical flight-configuration, fully-deployed thermal balance test that is the keystone of most space missions' thermal verification plans. JWST is simply too large in its deployed configuration to be properly thermal balance tested in the facilities that currently exist. This reality, combined with a mission thermal concept with little to no flight heritage, has necessitated a unique and alternative approach to thermal system verification and model validation. This paper describes the thermal verification and model validation plan that has been developed for JWST. The plan relies on judicious use of cryogenic and thermal design margin, a completely independent thermal modeling cross-check utilizing different analysis teams and software packages, and finally, a comprehensive set of thermal tests that occur at different levels of JWST assembly.
After a brief description of the JWST mission and thermal architecture, a detailed description of the three aspects of the thermal verification and model validation plan is presented.

  11. Contextual Variation, Familiarity, Academic Literacy, and Rural Adolescents' Idiom Knowledge.

    PubMed

    Qualls, Constance Dean; O'Brien, Rose M; Blood, Gordon W; Hammer, Carol Scheffner

    2003-01-01

    The paucity of data on idiom development in adolescents, particularly rural adolescents, limits the ability of speech-language pathologists and educators to test and teach idioms appropriately in this population. This study was designed to delineate the interrelationships between context, familiarity, and academic literacy relative to rural adolescents' idiom knowledge. Ninety-five rural eighth graders (M age=13.4 years) were quasi-randomly assigned to complete the Idiom Comprehension Test (Qualls & Harris, 1999) in one of three contexts: idioms in a short story (n=25), idioms in isolation (n=32), and idioms in a verification task (n=38). For all conditions, the identical 24 idioms (8 each of high, moderate, and low familiarity; Nippold & Rudzinski, 1993) were presented. For a subset (N=54) of the students, reading and language arts scores from the California Achievement Tests (5th ed., 1993), a standardized achievement test, were correlated with performance on the idiom test. Performance in the story condition and on high-familiarity idioms showed the greatest accuracy. For the isolation and verification conditions, context interacted with familiarity. Associations existed between idiom performance and reading ability and idiom performance and language literacy, but only for the story and verification conditions. High-proficiency readers showed the greatest idiom accuracy. The results support the notion that context facilitates idiom comprehension for rural adolescents, and that idiom testing should consider not only context, but idiom familiarity as well. Thus, local norms should be established. Findings also confirm that good readers are better at comprehending idioms, likely resulting from enriched vocabulary obtained through reading. These normative data indicate what might be expected when testing idiom knowledge in adolescents with language impairments.

  12. Cleared for Launch - Lessons Learned from the OSIRIS-REx System Requirements Verification Program

    NASA Technical Reports Server (NTRS)

    Stevens, Craig; Adams, Angela; Williams, Bradley; Goodloe, Colby

    2017-01-01

    Requirements verification of a large flight system is a challenge. It is especially challenging for engineers taking on their first role in space systems engineering. This paper describes our approach to verification of the Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer (OSIRIS-REx) system requirements. It also captures lessons learned along the way from developing systems engineers embroiled in this process. We begin with an overview of the mission and science objectives as well as the project requirements verification program strategy. A description of the requirements flow down is presented, including our implementation for managing the thousands of program- and element-level requirements and associated verification data. We discuss both successes and methods to improve the management of these data across multiple organizational interfaces. Our approach to verifying system requirements at multiple levels of assembly is presented using examples from our work at instrument, spacecraft, and ground segment levels. We include a discussion of system end-to-end testing limitations and their impacts on the verification program. Finally, we describe lessons learned that are applicable to all emerging space systems engineers using our unique perspectives across multiple organizations of a large NASA program.

  13. Secure Image Hash Comparison for Warhead Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bruillard, Paul J.; Jarman, Kenneth D.; Robinson, Sean M.

    2014-06-06

    The effort to inspect and verify warheads in the context of possible future arms control treaties is rife with security and implementation issues. In this paper we review prior work on perceptual image hashing for template-based warhead verification. Furthermore, we formalize the notion of perceptual hashes and demonstrate that large classes of such functions are likely not cryptographically secure. We close with a brief discussion of fully homomorphic encryption as an alternative technique.
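
    To make the notion of a perceptual hash concrete, the following is a minimal average-hash sketch (an illustrative construction of our own, not the template-based scheme reviewed in the paper):

```python
def average_hash(pixels):
    """Toy perceptual hash of a grayscale image given as a 2D list.

    Each bit records whether a pixel exceeds the mean intensity, so
    small brightness perturbations leave most bits unchanged."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    """Number of differing bits; a small distance means perceptually close."""
    return sum(a != b for a, b in zip(h1, h2))

img = [[10, 200], [220, 30]]
noisy = [[12, 198], [221, 28]]  # slightly perturbed copy of img
assert hamming(average_hash(img), average_hash(noisy)) == 0
```

    The property that makes such a hash useful for verification, namely that perceptually close inputs collide, also gives an adversary room to search for colliding inputs, which is one intuition behind the claim that large classes of perceptual hashes are unlikely to be cryptographically secure.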

  14. Verification and Optimal Control of Context-Sensitive Probabilistic Boolean Networks Using Model Checking and Polynomial Optimization

    PubMed Central

    Hiraishi, Kunihiko

    2014-01-01

    One of the significant topics in systems biology is to develop control theory of gene regulatory networks (GRNs). In typical control of GRNs, expression of some genes is inhibited (activated) by manipulating external stimuli and expression of other genes. Control theory of GRNs is expected to be applied to gene therapy technologies in the future. In this paper, a control method using a Boolean network (BN) is studied. A BN is widely used as a model of GRNs, and gene expression is represented by a binary value (ON or OFF). In particular, a context-sensitive probabilistic Boolean network (CS-PBN), one of the extended models of BNs, is used. For CS-PBNs, the verification problem and the optimal control problem are considered. For the verification problem, a solution method using the probabilistic model checker PRISM is proposed. For the optimal control problem, a solution method using polynomial optimization is proposed. Finally, a numerical example on the WNT5A network, which is related to melanoma, is presented. The proposed methods provide useful tools for control theory of GRNs. PMID:24587766
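
    A CS-PBN can be sketched in a few lines. The following toy simulator (our own illustration, unrelated to the paper's PRISM encoding) shows the two ingredients: a set of constituent Boolean networks ("contexts") and a random context-switching step.

```python
import random

def cs_pbn_step(state, ctx, contexts, probs, q, rng):
    """One synchronous update of a toy context-sensitive probabilistic
    Boolean network. 'contexts' is a list of constituent networks, each
    a tuple of Boolean update functions (one per gene); with probability
    q the active context is re-drawn according to 'probs' before the
    genes update."""
    if rng.random() < q:
        ctx = rng.choices(range(len(contexts)), weights=probs)[0]
    return tuple(f(state) for f in contexts[ctx]), ctx

# Two genes, two contexts: context 0 swaps the genes, context 1 ANDs them.
contexts = [
    (lambda s: s[1], lambda s: s[0]),
    (lambda s: s[0] and s[1], lambda s: s[0] and s[1]),
]
state, ctx = cs_pbn_step((1, 0), 0, contexts, [0.5, 0.5],
                         q=0.0, rng=random.Random(0))
assert (state, ctx) == ((0, 1), 0)  # q=0: context fixed, genes swap
```

    Iterating this step yields sample trajectories; probabilistic model checking as done with PRISM instead reasons exhaustively over the induced Markov chain rather than by simulation.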

  15. Kepler Planet Detection Metrics: Window and One-Sigma Depth Functions for Data Release 25

    NASA Technical Reports Server (NTRS)

    Burke, Christopher J.; Catanzarite, Joseph

    2017-01-01

    This document describes the window and one-sigma depth functions relevant to the Transiting Planet Search (TPS) algorithm in the Kepler pipeline (Jenkins 2002; Jenkins et al. 2017). The window function specifies the fraction of unique orbital ephemeris epochs over which three transits are observable as a function of orbital period. In this context, the epoch and orbital period together comprise the ephemeris of an orbiting companion, and ephemerides with the same period are considered equivalent if their epochs differ by an integer multiple of the period. The one-sigma depth function specifies the depth of a signal (in ppm) for a given light curve that results in a one-sigma detection of a transit signature as a function of orbital period when averaged over all unique orbital ephemerides. These planet detection metrics quantify the ability of TPS to detect a transiting planet signature on a star-by-star basis. They are uniquely applicable to a specific Kepler data release, since they are dependent on the details of the light curves searched and the functionality of the TPS algorithm used to perform the search. This document describes the window and one-sigma depth functions relevant to Kepler Data Release 25 (DR25), where the data were processed (Thompson et al. 2016) and searched (Twicken et al. 2016) with the SOC 9.3 pipeline. In Section 4, we describe significant differences from those reported in Kepler Data Release 24 (Burke & Seader 2016) and document our verification method.
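
    In simplified form, the window function can be computed as the fraction of epochs whose transit times land on at least three observed cadences. The sketch below is our own illustration, not the TPS implementation; discrete unit time steps and a boolean observing mask are simplifying assumptions.

```python
def window_function(observed, period):
    """Fraction of the 'period' unique epochs for which at least three
    transits fall on observed time steps; 'observed' is one boolean per
    unit time step, and transits recur every 'period' steps."""
    n = len(observed)
    good = 0
    for epoch in range(period):
        transits = sum(observed[t] for t in range(epoch, n, period))
        if transits >= 3:
            good += 1
    return good / period

# 30 time steps with a data gap in the middle (steps 10-19 unobserved).
mask = [True] * 10 + [False] * 10 + [True] * 10
assert window_function(mask, 5) == 1.0   # every epoch still sees 4 transits
assert window_function(mask, 10) == 0.0  # every epoch loses its middle transit
```

    The example shows why the metric is period-dependent: the same data gap is harmless at short periods but fatal near periods commensurate with the gap spacing.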

  16. EURATOM safeguards efforts in the development of spent fuel verification methods by non-destructive assay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matloch, L.; Vaccaro, S.; Couland, M.

    The back end of the nuclear fuel cycle continues to develop. The European Commission, particularly the Nuclear Safeguards Directorate of the Directorate General for Energy, implements Euratom safeguards and needs to adapt to this situation. The verification methods for spent nuclear fuel, which EURATOM inspectors can use, require continuous improvement. Whereas the Euratom on-site laboratories provide accurate verification results for fuel undergoing reprocessing, the situation is different for spent fuel which is destined for final storage. In particular, new needs arise from the increasing number of cask loadings for interim dry storage and the advanced plans for the construction of encapsulation plants and geological repositories. Various scenarios present verification challenges. In this context, EURATOM Safeguards, often in cooperation with other stakeholders, is committed to further improvement of NDA methods for spent fuel verification. In this effort EURATOM plays various roles, ranging from definition of inspection needs to direct participation in development of measurement systems, including support of research in the framework of international agreements and via the EC Support Program to the IAEA. This paper presents recent progress in selected NDA methods. These methods have been conceived to satisfy different spent fuel verification needs, ranging from attribute testing to pin-level partial defect verification. (authors)

  17. Comprehending how visual context influences incremental sentence processing: insights from ERPs and picture-sentence verification

    PubMed Central

    Knoeferle, Pia; Urbach, Thomas P.; Kutas, Marta

    2010-01-01

    To re-establish picture-sentence verification (discredited, possibly, for its over-reliance on post-sentence response time (RT) measures) as a task for situated comprehension, we collected event-related brain potentials (ERPs) as participants read a subject-verb-object sentence, and RTs indicating whether or not the verb matched a previously depicted action. For mismatches (vs. matches), speeded RTs were longer, verb N400s over centro-parietal scalp larger, and ERPs to the object noun more negative. RTs (congruence effect) correlated inversely with the centro-parietal verb N400s, and positively with the object ERP congruence effects. Verb N400s, object ERPs, and verbal working memory scores predicted more variance in RT effects (50%) than N400s alone. Thus, (1) verification processing is not all post-sentence; (2) simple priming cannot account for these results; and (3) verification tasks can inform studies of situated comprehension. PMID:20701712

  18. Being known, intimate, and valued: global self-verification and dyadic adjustment in couples and roommates.

    PubMed

    Katz, Jennifer; Joiner, Thomas E

    2002-02-01

    We contend that close relationships provide adults with optimal opportunities for personal growth when relationship partners provide accurate, honest feedback. Accordingly, it was predicted that young adults would experience the greatest relationship quality with partners who evaluated them in a manner consistent with their own self-evaluations. Three empirical tests of this self-verification hypothesis as applied to close dyads were conducted. In Study 1, young adults in dating relationships were most intimate with and somewhat more committed to partners when they perceived that partners evaluated them as they evaluated themselves. Self-verification effects were pronounced for those involved in more serious dating relationships. In Study 2, men reported the greatest esteem for same-sex roommates who evaluated them in a self-verifying manner. Results from Study 2 were replicated and extended to both male and female roommate dyads in Study 3. Further, self-verification effects were most pronounced for young adults with high emotional empathy. Results suggest that self-verification theory is useful for understanding dyadic adjustment across a variety of relational contexts in young adulthood. Implications of self-verification processes for adult personal development are outlined within an identity negotiation framework.

  19. Numerical Modeling of Ablation Heat Transfer

    NASA Technical Reports Server (NTRS)

    Ewing, Mark E.; Laker, Travis S.; Walker, David T.

    2013-01-01

    A unique numerical method has been developed for solving one-dimensional ablation heat transfer problems. This paper provides a comprehensive description of the method, along with detailed derivations of the governing equations. This methodology supports solutions for traditional ablation modeling including such effects as heat transfer, material decomposition, pyrolysis gas permeation and heat exchange, and thermochemical surface erosion. The numerical scheme utilizes a control-volume approach with a variable grid to account for surface movement. This method directly supports implementation of nontraditional models such as material swelling and mechanical erosion, extending capabilities for modeling complex ablation phenomena. Verifications of the numerical implementation are provided using analytical solutions, code comparisons, and the method of manufactured solutions. These verifications are used to demonstrate solution accuracy and proper error convergence rates. A simple demonstration of a mechanical erosion (spallation) model is also provided to illustrate the unique capabilities of the method.

  20. Development and verification of an agent-based model of opinion leadership.

    PubMed

    Anderson, Christine A; Titler, Marita G

    2014-09-27

    The use of opinion leaders is a strategy used to speed the process of translating research into practice. Much is still unknown about opinion leader attributes and activities and the context in which they are most effective. Agent-based modeling is a methodological tool that enables demonstration of the interactive and dynamic effects of individuals and their behaviors on other individuals in the environment. The purpose of this study was to develop and test an agent-based model of opinion leadership. The details of the design and verification of the model are presented. The agent-based model was developed by using a software development platform to translate an underlying conceptual model of opinion leadership into a computer model. Individual agent attributes (for example, motives and credibility) and behaviors (seeking or providing an opinion) were specified as variables in the model in the context of a fictitious patient care unit. The verification process was designed to test whether or not the agent-based model was capable of reproducing the conditions of the preliminary conceptual model. The verification methods included iterative programmatic testing ('debugging') and exploratory analysis of simulated data obtained from execution of the model. The simulation tests included a parameter sweep, in which the model input variables were adjusted systematically, followed by an individual time series experiment. Statistical analysis of model output for the 288 possible simulation scenarios in the parameter sweep revealed that the agent-based model was performing consistently with the posited relationships in the underlying model. Nurse opinion leaders act on the strength of their beliefs and, as a result, become an opinion resource for their uncertain colleagues, depending on their perceived credibility. Over time, some nurses consistently act as this type of resource and have the potential to emerge as opinion leaders in a context where uncertainty exists. 
The development and testing of agent-based models is an iterative process. The opinion leader model presented here provides a basic structure for continued model development, ongoing verification, and the establishment of validation procedures, including empirical data collection.
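
    The parameter-sweep style of verification described above can be illustrated with a deliberately tiny stand-in model; the attributes, probabilities, and acceptance rule here are hypothetical, not those of the authors' model.

```python
import itertools
import random
import statistics

def run_model(credibility, motive, seed, steps=50):
    """Toy stand-in for one simulation run: agents seek opinions from a
    leader with the given credibility; returns the fraction of steps on
    which advice was accepted. Purely illustrative."""
    rng = random.Random(seed)
    accepted = sum(rng.random() < credibility * motive for _ in range(steps))
    return accepted / steps

# Parameter sweep: vary the inputs systematically, replicate each scenario,
# then check that outputs move in the direction the conceptual model posits.
results = {}
for cred, mot in itertools.product([0.2, 0.8], [0.5, 1.0]):
    runs = [run_model(cred, mot, seed) for seed in range(20)]
    results[(cred, mot)] = statistics.mean(runs)

# Verification check: higher credibility should yield more acceptance.
assert results[(0.8, 1.0)] > results[(0.2, 1.0)]
```

    The final assertion is the essence of verification as used here: the implemented model reproduces the qualitative relationships posited by the conceptual model across the swept scenarios.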

  1. Contribution of the infrasound technology to characterize large scale atmospheric disturbances and impact on infrasound monitoring

    NASA Astrophysics Data System (ADS)

    Blanc, Elisabeth; Le Pichon, Alexis; Ceranna, Lars; Pilger, Christoph; Charlton Perez, Andrew; Smets, Pieter

    2016-04-01

    The International Monitoring System (IMS) developed for the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) provides a unique global description of atmospheric disturbances generating infrasound, such as extreme events (e.g. meteors, volcanoes, earthquakes, and severe weather) or human activity (e.g. explosions and supersonic airplanes). The analysis of the detected signals, recorded at global scales and over nearly 15 years at some stations, demonstrates that large-scale atmospheric disturbances strongly affect infrasound propagation. Their time scales vary from several tens of minutes to hours and days. Their effects are on average well resolved by the current model predictions; however, accurate spatial and temporal description is lacking in both weather and climate models. This study reviews recent results using the infrasound technology to characterize these large-scale disturbances, including (i) wind fluctuations induced by gravity waves, generating infrasound partial reflections and modifications of the infrasound waveguide, (ii) convection from thunderstorms and mountain waves, generating gravity waves, (iii) stratospheric warming events, which yield wind inversions in the stratosphere, and (iv) planetary waves, which control the global atmospheric circulation. Improved knowledge of these disturbances and their assimilation in future models is an important objective of the ARISE (Atmospheric dynamics Research InfraStructure in Europe) project. This is essential in the context of the future verification of the CTBT, as enhanced atmospheric models are necessary to assess the IMS network performance at higher resolution, reduce source location errors, and improve characterization methods.

  2. Digital Pharmacovigilance and Disease Surveillance: Combining Traditional and Big-Data Systems for Better Public Health.

    PubMed

    Salathé, Marcel

    2016-12-01

    The digital revolution has contributed to very large data sets (ie, big data) relevant for public health. The two major data sources are electronic health records from traditional health systems and patient-generated data. As the two data sources have complementary strengths-high veracity in the data from traditional sources and high velocity and variety in patient-generated data-they can be combined to build more-robust public health systems. However, they also have unique challenges. Patient-generated data in particular are often completely unstructured and highly context dependent, posing essentially a machine-learning challenge. Some recent examples from infectious disease surveillance and adverse drug event monitoring demonstrate that the technical challenges can be solved. Despite these advances, the problem of verification remains, and unless traditional and digital epidemiologic approaches are combined, these data sources will be constrained by their intrinsic limits. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America.

  3. Early Detection of Steel Rebar Corrosion by Acoustic Emission Monitoring

    DOT National Transportation Integrated Search

    1995-01-01

    Acoustic emission monitoring was performed in a unique way on concrete specimens containing reinforcing steel and the acoustic emission events correlated with the presence of rebar corrosion. Verification of rebar corrosion was done by galvanic curre...

  4. DISCOVER-AQ: a unique acoustic propagation verification and validation data set

    DOT National Transportation Integrated Search

    2015-08-09

    In 2013, the National Aeronautics and Space Administration conducted a month-long flight test for the Deriving Information on Surface conditions from Column and Vertically Resolved Observations Relevant to Air Quality research effort in Houston...

  5. Comparing the Effectiveness of Verification and Inquiry Laboratories in Supporting Undergraduate Science Students in Constructing Arguments around Socioscientific Issues

    ERIC Educational Resources Information Center

    Grooms, Jonathon; Sampson, Victor; Golden, Barry

    2014-01-01

    This quasi-experimental study uses a pre-/post-intervention approach to investigate the quality of undergraduate students' arguments in the context of socioscientific issues (SSI) based on experiencing a semester of traditional "cookbook" instruction (N = 79) or a semester of argument-based instruction (N = 73) in the context of an…

  6. Verification and classification bias interactions in diagnostic test accuracy studies for fine-needle aspiration biopsy.

    PubMed

    Schmidt, Robert L; Walker, Brandon S; Cohen, Michael B

    2015-03-01

    Reliable estimates of accuracy are important for any diagnostic test. Diagnostic accuracy studies are subject to unique sources of bias. Verification bias and classification bias are 2 sources of bias that commonly occur in diagnostic accuracy studies. Statistical methods are available to estimate the impact of these sources of bias when they occur alone. The impact of interactions when these types of bias occur together has not been investigated. We developed mathematical relationships to show the combined effect of verification bias and classification bias. A wide range of case scenarios were generated to assess the impact of bias components and interactions on total bias. Interactions between verification bias and classification bias caused overestimation of sensitivity and underestimation of specificity. Interactions had more effect on sensitivity than specificity. Sensitivity was overestimated by at least 7% in approximately 6% of the tested scenarios. Specificity was underestimated by at least 7% in less than 0.1% of the scenarios. Interactions between verification bias and classification bias create distortions in accuracy estimates that are greater than would be predicted from each source of bias acting independently. © 2014 American Cancer Society.
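
    The direction of the distortions reported here already appears with partial verification bias acting alone, as a small numerical sketch shows (the verification probabilities and cell counts below are hypothetical, not taken from the study):

```python
def naive_estimates(tp, fp, fn, tn, verify_pos=1.0, verify_neg=0.2):
    """Expected naive sensitivity/specificity under partial verification:
    index-test positives are verified with probability verify_pos and
    negatives with verify_neg, and only verified cases enter the
    estimates. tp, fp, fn, tn are the true counts in the full cohort."""
    v_tp, v_fp = tp * verify_pos, fp * verify_pos
    v_fn, v_tn = fn * verify_neg, tn * verify_neg
    return v_tp / (v_tp + v_fn), v_tn / (v_tn + v_fp)

# True sensitivity 0.8 (80/100) and specificity 0.9 (180/200).
sens, spec = naive_estimates(tp=80, fp=20, fn=20, tn=180)
assert sens > 0.8   # sensitivity overestimated (~0.95)
assert spec < 0.9   # specificity underestimated (~0.64)
```

    Because test-negative cases are under-verified, false negatives are disproportionately dropped from the verified subset, inflating sensitivity; the symmetric loss of true negatives deflates specificity. The paper's point is that adding classification bias on top of this compounds the distortion beyond what either source predicts alone.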

  7. Growing skull hemangioma: first and unique description in a patient with Klippel-Trénaunay-Weber syndrome.

    PubMed

    van der Loo, Lars E; Beckervordersandforth, Jan; Colon, Albert J; Schijns, Olaf E M G

    2017-02-01

    We present the first and unique case of a rapid-growing skull hemangioma in a patient with Klippel-Trénaunay-Weber syndrome. This case report provides evidence that not all rapid-growing, osteolytic skull lesions need have a malignant character, but they certainly need histopathological verification. This material offers an insight into the list of rare pathological diagnoses in an infrequent syndrome.

  8. Quantitative reactive modeling and verification.

    PubMed

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  9. The Maximal Oxygen Uptake Verification Phase: a Light at the End of the Tunnel?

    PubMed

    Schaun, Gustavo Z

    2017-12-08

Commonly performed during an incremental test to exhaustion, maximal oxygen uptake (V̇O2max) assessment has become a recurring practice in clinical and experimental settings. To validate the test, several criteria were proposed. In this context, the plateau in oxygen uptake (V̇O2) is inconsistent in its frequency, reducing its usefulness as a robust method to determine "true" V̇O2max. Moreover, secondary criteria previously suggested, such as expiratory exchange ratios or percentages of maximal heart rate, are highly dependent on protocol design and often are achieved at V̇O2 percentages well below V̇O2max. Thus, an alternative method termed the verification phase was proposed. Currently, it is clear that the verification phase can be a practical and sensitive method to confirm V̇O2max; however, procedures to conduct it are not standardized across the literature, and no previous research has attempted to summarize how it has been employed. Therefore, this review updates the knowledge on the verification phase and provides suggestions on how it can be performed (e.g. intensity, duration, recovery) according to population and protocol design. Future studies should focus on identifying a verification protocol feasible for different populations and on comparing square-wave and multistage verification phases. Additionally, studies assessing verification phases in different patient populations are still warranted.

  10. The Effect of Job Performance Aids on Quality Assurance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fosshage, Erik

Job performance aids (JPAs) have been studied for many decades in a variety of disciplines and for many different types of tasks, yet this is the first known research experiment using JPAs in a quality assurance (QA) context. The objective of this thesis was to assess whether a JPA has an effect on the performance of a QA observer performing the concurrent dual verification technique for a basic assembly task. The JPA used in this study was a simple checklist, and the design borrows heavily from prior research on task analysis and other human factors principles. The assembly task and QA construct of concurrent dual verification are consistent with those of a high-consequence manufacturing environment. Results showed that the JPA had only a limited effect on QA performance in the context of this experiment. However, there were three important and unexpected findings that may draw interest from a variety of practitioners. First, a novel testing methodology sensitive enough to measure the effects of a JPA on performance was created. Second, the discovery that there are different probabilities of detection for different types of error in a QA context may be the most far-reaching result. Third, these results highlight the limitations of concurrent dual verification as a control against defects. It is hoped that both the methodology and results of this study provide an effective baseline from which to launch future research activities.

  11. Forest Carbon Monitoring and Reporting for REDD+: What Future for Africa?

    PubMed

    Gizachew, Belachew; Duguma, Lalisa A

    2016-11-01

A climate change mitigation mechanism for emissions reduction from reduced deforestation and forest degradation, plus forest conservation, sustainable management of forests, and enhancement of carbon stocks (REDD+), has received international political support in the climate change negotiations. The mechanism will require, among other things, an unprecedented technical capacity for monitoring, reporting and verification of carbon emissions from the forest sector. Functional monitoring, reporting and verification requires inventories of forest area, carbon stocks and their changes, both for constructing the forest reference emissions level and for compiling reports on actual emissions; such inventories are essentially lacking in developing countries, particularly in Africa. The purpose of this essay is to contribute to a better understanding of the state and prospects of forest monitoring and reporting in the context of REDD+ in Africa. We argue that monitoring and reporting capacities in Africa fall short of the stringent requirements of the methodological guidance for monitoring, reporting and verification for REDD+, and that this may weaken the prospects for successfully implementing REDD+ on the continent. We present the challenges and prospects in national forest inventory, remote sensing and reporting infrastructures. North-South and South-South collaboration, as well as governments' own investments in monitoring, reporting and verification systems, could help Africa leapfrog in monitoring and reporting. These could be delivered through negotiations for the transfer of technology, technical capacities, and experience that exist among developed countries that traditionally compile forest carbon reports in the context of the Kyoto Protocol.

  12. 47 CFR 2.954 - Identification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Identification. 2.954 Section 2.954 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL... subject only to verification shall be uniquely identified by the person responsible for marketing or...

  13. 47 CFR 2.954 - Identification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Identification. 2.954 Section 2.954 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL... subject only to verification shall be uniquely identified by the person responsible for marketing or...

  14. Development of automated optical verification technologies for control systems

    NASA Astrophysics Data System (ADS)

    Volegov, Peter L.; Podgornov, Vladimir A.

    1999-08-01

The report considers optical techniques for the automated identity verification of objects, designed for control systems at nuclear facilities. It presents the results of experimental research and of the development of pattern-recognition techniques, carried out under ISTC project number 772, aimed at identifying unique features of a controlled object's surface structure and the effects of its random treatment. Possibilities for industrial adoption of the developed technologies within the framework of US-Russian lab-to-lab cooperation, including the development of up-to-date systems for nuclear material control and accounting, are also examined.

  15. Why do verification and validation?

    DOE PAGES

    Hu, Kenneth T.; Paez, Thomas L.

    2016-02-19

In this discussion paper, we explore different ways to assess the value of verification and validation (V&V) of engineering models. We first present a literature review on the value of V&V and then use value chains and decision trees to show how value can be assessed from a decision maker's perspective. In this context, the value is what the decision maker is willing to pay for V&V analysis, with the understanding that the V&V results are uncertain. Finally, the 2014 Sandia V&V Challenge Workshop is used to illustrate these ideas.
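The decision-tree view described above can be made concrete with a toy expected-value calculation. All probabilities and payoffs below are invented for illustration and are not from the workshop; the sketch assumes a hypothetically perfect V&V result, so it only bounds what a decision maker might pay.

```python
def expected_value(p_model_ok, payoff_ok, payoff_bad):
    """Expected payoff of deploying a model with no further information."""
    return p_model_ok * payoff_ok + (1 - p_model_ok) * payoff_bad

def value_of_vv(p_model_ok, payoff_ok, payoff_bad):
    """Expected value of (hypothetically perfect) V&V information.

    With a perfect V&V verdict in hand, the decision maker deploys only
    when the model is adequate and walks away (payoff 0) otherwise. The
    difference versus the best uninformed choice bounds the price one
    would pay for the V&V analysis.
    """
    with_info = p_model_ok * payoff_ok  # deploy only on a 'pass' verdict
    without_info = max(expected_value(p_model_ok, payoff_ok, payoff_bad), 0.0)
    return with_info - without_info

# e.g. 80% chance the model is adequate, +100 if it is, -300 if it is not
print(value_of_vv(0.8, 100.0, -300.0))
```

This is the standard expected-value-of-perfect-information construction; real V&V verdicts are themselves uncertain, which is exactly the point the paper raises.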

  16. Microscopy as a statistical, Rényi-Ulam, half-lie game: a new heuristic search strategy to accelerate imaging.

    PubMed

    Drumm, Daniel W; Greentree, Andrew D

    2017-11-07

    Finding a fluorescent target in a biological environment is a common and pressing microscopy problem. This task is formally analogous to the canonical search problem. In ideal (noise-free, truthful) search problems, the well-known binary search is optimal. The case of half-lies, where one of two responses to a search query may be deceptive, introduces a richer, Rényi-Ulam problem and is particularly relevant to practical microscopy. We analyse microscopy in the contexts of Rényi-Ulam games and half-lies, developing a new family of heuristics. We show the cost of insisting on verification by positive result in search algorithms; for the zero-half-lie case bisectioning with verification incurs a 50% penalty in the average number of queries required. The optimal partitioning of search spaces directly following verification in the presence of random half-lies is determined. Trisectioning with verification is shown to be the most efficient heuristic of the family in a majority of cases.
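As a point of reference for the search-game framing above, here is a minimal sketch of the ideal, truthful bisection baseline against which half-lie strategies are compared (a simple query counter over a discrete search space; the paper's Rényi-Ulam heuristics are not reproduced here):

```python
def bisect_search(target, n):
    """Noise-free binary search over [0, n); returns (index, query_count).

    Each query asks whether the target lies below the midpoint of the
    current interval. This is the ideal, truthful baseline; half-lie
    games allow one of the two responses to be deceptive, and insisting
    on verifying positive results adds further queries on top of this.
    """
    lo, hi, queries = 0, n, 0
    while hi - lo > 1:
        mid = (lo + hi) // 2
        queries += 1
        if target < mid:  # one yes/no query per iteration
            hi = mid
        else:
            lo = mid
    return lo, queries

idx, q = bisect_search(700, 1024)  # 10 yes/no queries for n = 1024
```

For a power-of-two space of size n this baseline needs log2(n) queries; per the abstract, bisection with verification-by-positive-result incurs roughly a 50% penalty on average even with zero half-lies.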

  17. Comparison and quantitative verification of mapping algorithms for whole genome bisulfite sequencing

    USDA-ARS?s Scientific Manuscript database

    Coupling bisulfite conversion with next-generation sequencing (Bisulfite-seq) enables genome-wide measurement of DNA methylation, but poses unique challenges for mapping. However, despite a proliferation of Bisulfite-seq mapping tools, no systematic comparison of their genomic coverage and quantitat...

  18. Accurate Biomass Estimation via Bayesian Adaptive Sampling

    NASA Technical Reports Server (NTRS)

    Wheeler, Kevin R.; Knuth, Kevin H.; Castle, Joseph P.; Lvov, Nikolay

    2005-01-01

The following concepts were introduced: (a) Bayesian adaptive sampling for solving biomass estimation; (b) characterization of MISR Rahman model parameters conditioned upon MODIS landcover; (c) a rigorous non-parametric Bayesian approach to analytic mixture model determination; and (d) a unique U.S. asset for science product validation and verification.

  19. Exploration of Uncertainty in Glacier Modelling

    NASA Technical Reports Server (NTRS)

    Thompson, David E.

    1999-01-01

    There are procedures and methods for verification of coding algebra and for validations of models and calculations that are in use in the aerospace computational fluid dynamics (CFD) community. These methods would be efficacious if used by the glacier dynamics modelling community. This paper is a presentation of some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modelling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modelling community, and establishes a context for these within overall solution quality assessment. Finally, an information architecture and interactive interface is introduced and advocated. This Integrated Cryospheric Exploration (ICE) Environment is proposed for exploring and managing sources of uncertainty in glacier modelling codes and methods, and for supporting scientific numerical exploration and verification. The details and functionality of this Environment are described based on modifications of a system already developed for CFD modelling and analysis.

  20. Automated System Calibration and Verification of the Position Measurements for the Los Alamos Isotope Production Facility and the Switchyard Kicker Facilities

    NASA Astrophysics Data System (ADS)

    Barr, D.; Gilpatrick, J. D.; Martinez, D.; Shurter, R. B.

    2004-11-01

    The Los Alamos Neutron Science Center (LANSCE) facility at Los Alamos National Laboratory has constructed both an Isotope Production Facility (IPF) and a Switchyard Kicker (XDK) as additions to the H+ and H- accelerator. These additions contain eleven Beam Position Monitors (BPMs) that measure the beam's position throughout the transport. The analog electronics within each processing module determines the beam position using the log-ratio technique. For system reliability, calibrations compensate for various temperature drifts and other imperfections in the processing electronics components. Additionally, verifications are periodically implemented by a PC running a National Instruments LabVIEW virtual instrument (VI) to verify continued system and cable integrity. The VI communicates with the processor cards via a PCI/MXI-3 VXI-crate communication module. Previously, accelerator operators performed BPM system calibrations typically once per day while beam was explicitly turned off. One of this new measurement system's unique achievements is its automated calibration and verification capability. Taking advantage of the pulsed nature of the LANSCE-facility beams, the integrated electronics hardware and VI perform calibration and verification operations between beam pulses without interrupting production beam delivery. The design, construction, and performance results of the automated calibration and verification portion of this position measurement system will be the topic of this paper.
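The log-ratio technique mentioned above maps the ratio of opposing pickup-electrode signals to a beam displacement. A minimal sketch follows; the scale constant k and the signal values are illustrative assumptions, not LANSCE calibration data:

```python
import math

def log_ratio_position(a, b, k=1.0):
    """Beam displacement from two opposing electrode amplitudes.

    x ~= k * log10(a / b): a centered beam (a == b) yields x = 0, and
    the sign of x indicates which electrode the beam sits closer to.
    In practice k comes from the calibration step the paper automates.
    """
    if a <= 0 or b <= 0:
        raise ValueError("electrode signals must be positive")
    return k * math.log10(a / b)

print(log_ratio_position(1.0, 1.0))  # centered beam -> 0.0
```

The appeal of the log-ratio form is that it is insensitive to common-mode gain: scaling both a and b by the same factor leaves the position estimate unchanged, which is why calibration focuses on drifts that affect the two channels unequally.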

  1. Model-Driven Test Generation of Distributed Systems

    NASA Technical Reports Server (NTRS)

    Easwaran, Arvind; Hall, Brendan; Schweiker, Kevin

    2012-01-01

This report describes a novel test generation technique for distributed systems. Utilizing formal models and formal verification tools, specifically the Symbolic Analysis Laboratory (SAL) tool-suite from SRI, we present techniques to generate concurrent test vectors for distributed systems. These are initially explored within an informal test validation context and later extended to achieve full MC/DC coverage of the TTEthernet protocol operating within a system-centric context.

  2. Verification and Validation Methodology of Real-Time Adaptive Neural Networks for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Gupta, Pramod; Loparo, Kenneth; Mackall, Dale; Schumann, Johann; Soares, Fola

    2004-01-01

Recent research has shown that adaptive neural-network-based control systems are very effective in restoring stability and control of an aircraft in the presence of damage or failures. The application of an adaptive neural network within a flight-critical control system requires a thorough and proven process to ensure safe and proper flight operation. Unique testing tools have been developed, as part of a process to perform verification and validation (V&V) of the real-time adaptive neural networks used in a recent adaptive flight control system, to evaluate the performance of the online-trained neural networks. The tools will aid certification by the FAA and support the successful deployment of neural-network-based adaptive controllers in safety-critical applications. The V&V process is evaluated against a typical neural adaptive controller, and the results are discussed.

  3. Verification and Validation (V&V) Methodologies for Multiphase Turbulent and Explosive Flows. V&V Case Studies of Computer Simulations from Los Alamos National Laboratory GMFIX codes

    NASA Astrophysics Data System (ADS)

    Dartevelle, S.

    2006-12-01

Large-scale volcanic eruptions are inherently hazardous events that cannot be characterized by detailed and accurate in situ measurements, so explosive volcanic phenomenology is inadequately constrained in terms of initial and inflow conditions. Consequently, little to no real-time data exist to verify and validate computer codes developed to model these geophysical events as a whole. Nevertheless, code Verification and Validation remains a necessary step, particularly as volcanologists increasingly use numerical results for the mitigation of volcanic hazards. The Verification and Validation (V&V) process formally assesses the level of 'credibility' of numerical results produced within a range of specific applications. The first step, Verification, is 'the process of determining that a model implementation accurately represents the conceptual description of the model', which requires either exact analytical solutions or highly accurate simplified experimental data. The second step, Validation, is 'the process of determining the degree to which a model is an accurate representation of the real world', which requires complex experimental data of the 'real world' physics. The Verification step is rather simple to achieve formally, while, in the context of 'real world' explosive volcanism, Validation is nearly impossible. Hence, instead of validating the code against the whole of large-scale, unconstrained volcanic phenomenology, we suggest focusing on the key physics that control these volcanic clouds, viz., momentum-driven supersonic jets and multiphase turbulence. We propose to compare numerical results against a set of simple but well-constrained analog experiments that uniquely and unambiguously represent these two key phenomena separately.
Herewith, we use GMFIX (Geophysical Multiphase Flow with Interphase eXchange, v1.62), a set of multiphase CFD FORTRAN codes recently redeveloped to meet the strict quality assurance, verification, and validation requirements of the Office of Civilian Radioactive Waste Management of the US Department of Energy. GMFIX solves the Navier-Stokes and energy partial differential equations for each phase, with appropriate turbulence modeling and interfacial coupling between phases. For momentum-driven single- to multi-phase underexpanded jets, the position of the first Mach disk is known empirically as a function of both the pressure ratio, K, and the particle mass fraction, Phi, at the nozzle: the higher K, the further downstream the Mach disk, and the higher Phi, the further upstream the first Mach disk. We show that GMFIX captures both of these essential features. In addition, GMFIX reproduces all the properties found in these jets, such as expansion fans, incident and reflected shocks, and subsequent downstream Mach disks, which makes the code well suited for further investigations of equivalent volcanological phenomena. One of the other most challenging aspects of volcanic phenomenology is the multiphase nature of turbulence. We therefore also validated GMFIX by comparing velocity profiles and turbulence quantities against well-constrained analog experiments: the computed velocity profiles agree with the analog ones, as do the profiles of turbulence-quantity production. Overall, the Verification and Validation experiments, although inherently challenging, suggest that GMFIX captures the most essential dynamical properties of multiphase, supersonic flows and jets.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dinh, Nam; Athe, Paridhi; Jones, Christopher

The Virtual Environment for Reactor Applications (VERA) code suite is assessed in terms of capability and credibility against the Consortium for Advanced Simulation of Light Water Reactors (CASL) Verification and Validation Plan (presented herein) in the context of three selected challenge problems: CRUD-Induced Power Shift (CIPS), Departure from Nucleate Boiling (DNB), and Pellet-Clad Interaction (PCI). Capability refers to evidence of required functionality for capturing phenomena of interest, while credibility refers to the evidence that provides confidence in the calculated results. For this assessment, each challenge problem defines a set of phenomenological requirements against which the VERA software is assessed. This approach, in turn, enables the focused assessment of only those capabilities relevant to the challenge problem. The evaluation of VERA against the challenge problem requirements represents a capability assessment. The mechanism for assessment is the Sandia-developed Predictive Capability Maturity Model (PCMM), which, for this assessment, evaluates VERA on eight major criteria: (1) Representation and Geometric Fidelity, (2) Physics and Material Model Fidelity, (3) Software Quality Assurance and Engineering, (4) Code Verification, (5) Solution Verification, (6) Separate Effects Model Validation, (7) Integral Effects Model Validation, and (8) Uncertainty Quantification. For each attribute, a maturity score from zero to three is assigned in the context of each challenge problem. The evaluation of these eight elements constitutes the credibility assessment for VERA.

  5. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luke, S J

    2011-12-20

This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms-control verification approaches for nuclear weapons and components. Information barriers, when used with a measurement system, allow for the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has led the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., the UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether - in a biological weapons verification regime - the presence or absence of a weapon pathogen can be determined without revealing any information about possible sensitive or proprietary information contained in the genetic materials being declared under a verification regime. This study indicates that a verification regime could be constructed using a small number of pathogens that spans the range of known biological weapons agents. Since the number of possible pathogens is small, it is possible and prudent to treat these pathogens as analogues to attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime.
To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of the genetic information associated with the various pathogens. In addition, it has been determined that a suitable information barrier could be applied to this technology once the verification regime has been defined. Finally, the report posits a path forward for additional development of information barriers in a biological weapons verification regime. This path forward suggests that a new analysis approach, termed Information Loss Analysis, may need to be pursued to achieve a numerical understanding of how information can be lost in specific measurement systems.

  6. Design Authority in the Test Programme Definition: The Alenia Spazio Experience

    NASA Astrophysics Data System (ADS)

    Messidoro, P.; Sacchi, E.; Beruto, E.; Fleming, P.; Marucchi Chierro, P.-P.

    2004-08-01

In addition, since the Verification and Test Programme is a significant part of the spacecraft development life cycle in terms of cost and time, the aim of the discussion is very often to optimize the verification campaign by the possible deletion or limitation of some testing activities. Increased market pressure to reduce project schedule and cost is generating a dialectic process inside project teams, involving programme management and design authorities, aimed at optimizing the verification and testing programme. The paper introduces the Alenia Spazio experience in this context, drawn from real project life across different products and missions (science, TLC, EO, manned, transportation, military, commercial, recurrent and one-of-a-kind). Usually, the applicable verification and testing standards (e.g. ECSS-E-10 part 2 "Verification" and ECSS-E-10 part 3 "Testing" [1]) are tailored to the specific project on the basis of its particular mission constraints. The Model Philosophy and the associated verification and test programme are defined through an iterative process that suitably combines several aspects (including, for example, test requirements and facilities), as shown in Fig. 1 (from ECSS-E-10). The cases considered are mainly oriented to thermal and mechanical verification, where the benefits of possible test programme optimizations are most significant. For thermal qualification and acceptance testing (i.e. Thermal Balance and Thermal Vacuum), the lessons learned from the development of several satellites are presented together with the corresponding recommended approaches. In particular, cases are indicated in which a proper Thermal Balance Test is mandatory, and others, in the presence of a more recurrent design, where qualification by analysis could be envisaged. The importance of a proper Thermal Vacuum exposure for workmanship verification is also highlighted.
Similar considerations are summarized for mechanical testing, with particular emphasis on the importance of Modal Survey, Static and Sine Vibration Tests in the qualification stage, in combination with the effectiveness of the Vibro-Acoustic Test in acceptance. The apparent relative importance of the Sine Vibration Test for workmanship verification in specific circumstances is also highlighted. (Fig. 1: Model philosophy, Verification and Test Programme definition.) The verification of the project requirements is planned through a combination of suitable verification methods (in particular Analysis and Test) at the different verification levels (from System down to Equipment), in the proper verification stages (e.g. Qualification and Acceptance).

  7. Self-verification as a mediator of mothers' self-fulfilling effects on adolescents' educational attainment.

    PubMed

    Scherr, Kyle C; Madon, Stephanie; Guyll, Max; Willard, Jennifer; Spoth, Richard

    2011-05-01

    This research examined whether self-verification acts as a general mediational process of self-fulfilling prophecies. The authors tested this hypothesis by examining whether self-verification processes mediated self-fulfilling prophecy effects within a different context and with a different belief and a different outcome than has been used in prior research. Results of longitudinal data obtained from mothers and their adolescents (N=332) indicated that mothers' beliefs about their adolescents' educational outcomes had a significant indirect effect on adolescents' academic attainment through adolescents' educational aspirations. This effect, observed over a 6-year span, provided evidence that mothers' self-fulfilling effects occurred, in part, because mothers' false beliefs influenced their adolescents' own educational aspirations, which adolescents then self-verified through their educational attainment. The theoretical and applied implications of these findings are discussed.

  8. Self-Verification as a Mediator of Mothers’ Self-Fulfilling Effects on Adolescents’ Educational Attainment

    PubMed Central

    Scherr, Kyle C.; Madon, Stephanie; Guyll, Max; Willard, Jennifer; Spoth, Richard

    2013-01-01

This research examined whether self-verification acts as a general mediational process of self-fulfilling prophecies. The authors tested this hypothesis by examining whether self-verification processes mediated self-fulfilling prophecy effects within a different context and with a different belief and a different outcome than has been used in prior research. Results of longitudinal data obtained from mothers and their adolescents (N = 332) indicated that mothers’ beliefs about their adolescents’ educational outcomes had a significant indirect effect on adolescents’ academic attainment through adolescents’ educational aspirations. This effect, observed over a six-year span, provided evidence that mothers’ self-fulfilling effects occurred, in part, because mothers’ false beliefs influenced their adolescents’ own educational aspirations, which adolescents then self-verified through their educational attainment. The theoretical and applied implications of these findings are discussed. PMID:21357755

  9. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    PubMed Central

    2009-01-01

    Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075

  10. Face recognition in the thermal infrared domain

    NASA Astrophysics Data System (ADS)

    Kowalski, M.; Grudzień, A.; Palka, N.; Szustakowski, M.

    2017-10-01

Biometrics refers to unique human characteristics. Each unique characteristic may be used to label and describe individuals and for the automatic recognition of a person based on physiological or behavioural properties. One of the most natural and most popular biometric traits is the face. Most research on face recognition is based on visible light, and state-of-the-art face recognition systems operating in the visible spectrum achieve very high recognition accuracy under controlled environmental conditions. Thermal infrared imagery seems to be a promising alternative or complement to visible-range imaging due to its relatively high resistance to illumination changes. A thermal infrared image of the human face presents its unique heat signature and can be used for recognition. The characteristics of thermal images maintain advantages over visible-light images and can be used to improve algorithms for human face recognition in several respects. Mid-wavelength and far-wavelength infrared, also referred to as thermal infrared, thus seem to be promising alternatives. We present a study on 1:1 recognition in the thermal infrared domain. The two approaches we consider are stand-off face verification of a non-moving person and stop-less face verification on the move. The paper presents the methodology of our studies and the challenges for face recognition systems in the thermal infrared domain.

  11. Methods and Procedures in PIRLS 2016

    ERIC Educational Resources Information Center

    Martin, Michael O., Ed.; Mullis, Ina V. S., Ed.; Hooper, Martin, Ed.

    2017-01-01

    "Methods and Procedures in PIRLS 2016" documents the development of the Progress in International Reading Literacy Study (PIRLS) assessments and questionnaires and describes the methods used in sampling, translation verification, data collection, database construction, and the construction of the achievement and context questionnaire…

  12. Determinants of Standard Errors of MLEs in Confirmatory Factor Analysis

    ERIC Educational Resources Information Center

    Yuan, Ke-Hai; Cheng, Ying; Zhang, Wei

    2010-01-01

    This paper studies changes of standard errors (SE) of the normal-distribution-based maximum likelihood estimates (MLE) for confirmatory factor models as model parameters vary. Using logical analysis, simplified formulas and numerical verification, monotonic relationships between SEs and factor loadings as well as unique variances are found.…

  13. Within-culture variations of uniqueness: towards an integrative approach based on social status, gender, life contexts, and interpersonal comparison.

    PubMed

    Causse, Elsa; Félonneau, Marie-Line

    2014-01-01

    Research on uniqueness is widely focused on cross-cultural comparisons and tends to postulate a certain form of within-culture homogeneity. Taking the opposite course of this classic posture, we aimed at testing an integrative approach enabling the study of within-culture variations of uniqueness. This approach considered different sources of variation: social status, gender, life contexts, and interpersonal comparison. Four hundred seventy-nine participants completed a measure based on descriptions of "self" and "other." Results showed important variations of uniqueness. An interaction between social status and life contexts revealed the expression of uniqueness in the low-status group. This study highlights the complexity of uniqueness that appears to be related to both cultural ideology and social hierarchy.

  14. Unique Relations of Age and Delinquency with Cognitive Control

    ERIC Educational Resources Information Center

    Iselin, Anne-Marie R.; DeCoster, Jamie

    2012-01-01

    Context processing has significant empirical support as an explanation of age- and psychopathology-related deficiencies in cognitive control. We examined whether context processing generalizes to younger individuals who are in trouble with the law. We tested whether age and delinquency might have unique relations to context processing skills in…

  15. A Lightweight RFID Mutual Authentication Protocol Based on Physical Unclonable Function.

    PubMed

    Xu, He; Ding, Jie; Li, Peng; Zhu, Feng; Wang, Ruchuan

    2018-03-02

    With the fast development of the Internet of Things, Radio Frequency Identification (RFID) has been widely applied in many areas. Nevertheless, even as RFID technology makes life more convenient, its security problems are gradually being exposed. In particular, the appearance of a large number of fake and counterfeit goods has caused massive losses for both producers and customers, for which the clone tag is a serious security threat. If attackers acquire the complete information of a tag, they can obtain its unique identifier by technological means. In general, because a tag carries no extra identifier, it is difficult to distinguish an original tag from its clone. Once the legal tag data is obtained, attackers are able to clone the tag. Therefore, this paper presents an efficient RFID mutual verification protocol. The protocol is based on a Physical Unclonable Function (PUF) and lightweight cryptography to achieve efficient verification of a single tag. It comprises three processes: tag recognition, mutual verification, and update. In tag recognition, the reader identifies the tag; in mutual verification, the reader and tag verify each other's authenticity; the update process maintains the latest secret key for subsequent verifications. Analysis results show that this protocol strikes a good balance between performance and security.
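The three-phase flow (recognition, mutual verification, key update) can be sketched in a few lines. This is a simplified stand-in, not the paper's actual protocol: the PUF is simulated with an HMAC over a device secret (a real PUF derives its responses from uncopyable physical variations, with no stored secret), and HMAC-SHA256 stands in for the lightweight primitives a constrained tag would actually use:

```python
import hashlib
import hmac

def simulated_puf(device_secret: bytes, challenge: bytes) -> bytes:
    # Stand-in for a real PUF: a physical tag would derive this
    # response from manufacturing variations, not a stored secret.
    return hmac.new(device_secret, challenge, hashlib.sha256).digest()

class Tag:
    def __init__(self, device_secret, shared_key):
        self.device_secret = device_secret
        self.key = shared_key
    def respond(self, challenge):
        return simulated_puf(self.device_secret, challenge)
    def verify_reader(self, challenge, proof):
        expected = hmac.new(self.key, challenge, hashlib.sha256).digest()
        return hmac.compare_digest(expected, proof)
    def update_key(self, challenge):
        self.key = hashlib.sha256(self.key + challenge).digest()

class Reader:
    def __init__(self, crp_store, shared_key):
        self.crp_store = crp_store   # pre-enrolled challenge-response pairs
        self.key = shared_key
    def authenticate(self, tag):
        challenge, expected = self.crp_store.pop()   # use each CRP once
        # 1. Tag recognition/verification: check the PUF response.
        if not hmac.compare_digest(tag.respond(challenge), expected):
            return False
        # 2. Mutual verification: reader proves itself with the shared key.
        proof = hmac.new(self.key, challenge, hashlib.sha256).digest()
        if not tag.verify_reader(challenge, proof):
            return False
        # 3. Update: both sides roll the shared key forward.
        self.key = hashlib.sha256(self.key + challenge).digest()
        tag.update_key(challenge)
        return True

# Enrollment: the reader pre-collects challenge-response pairs.
secret, key = b"device-secret", b"shared-key"
challenge = b"\x01" * 16
tag = Tag(secret, key)
reader = Reader([(challenge, simulated_puf(secret, challenge))], key)
print(reader.authenticate(tag))  # True
```

A clone that copies the tag's stored data but not its physical structure cannot reproduce the PUF response, so recognition fails, which is the property the protocol relies on.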

  16. A Lightweight RFID Mutual Authentication Protocol Based on Physical Unclonable Function

    PubMed Central

    Ding, Jie; Zhu, Feng; Wang, Ruchuan

    2018-01-01

    With the fast development of the Internet of Things, Radio Frequency Identification (RFID) has been widely applied in many areas. Nevertheless, even as RFID technology makes life more convenient, its security problems are gradually being exposed. In particular, the appearance of a large number of fake and counterfeit goods has caused massive losses for both producers and customers, for which the clone tag is a serious security threat. If attackers acquire the complete information of a tag, they can obtain its unique identifier by technological means. In general, because a tag carries no extra identifier, it is difficult to distinguish an original tag from its clone. Once the legal tag data is obtained, attackers are able to clone the tag. Therefore, this paper presents an efficient RFID mutual verification protocol. The protocol is based on a Physical Unclonable Function (PUF) and lightweight cryptography to achieve efficient verification of a single tag. It comprises three processes: tag recognition, mutual verification, and update. In tag recognition, the reader identifies the tag; in mutual verification, the reader and tag verify each other's authenticity; the update process maintains the latest secret key for subsequent verifications. Analysis results show that this protocol strikes a good balance between performance and security. PMID:29498684

  17. Reachability analysis of real-time systems using time Petri nets.

    PubMed

    Wang, J; Deng, Y; Xu, G

    2000-01-01

    Time Petri nets (TPNs) are a popular Petri net model for the specification and verification of real-time systems. A fundamental and widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive the end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using that technique. In this paper, we present a new reachability-based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called the clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class, based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.
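As a toy illustration of computing end-to-end delays by reachability (a strong simplification: deterministic firing delays instead of TPN firing intervals, an acyclic net, and no CS-class construction), delays can be enumerated by exploring markings while accumulating elapsed time:

```python
# Toy time Petri net with deterministic firing delays.
# Each transition: (input places, output places, delay).
transitions = {
    "t1": ({"start"}, {"p1"}, 2),
    "t2": ({"p1"}, {"p2"}, 3),
    "t3": ({"p1"}, {"p3"}, 1),
    "t4": ({"p2"}, {"end"}, 4),
    "t5": ({"p3"}, {"end"}, 8),
}

def arrival_times(marking, elapsed=0, goal="end"):
    """Enumerate end-to-end delays: the times at which `goal`
    becomes marked, over all firing sequences (acyclic net)."""
    results = []
    if goal in marking:
        results.append(elapsed)
    for name, (ins, outs, delay) in transitions.items():
        if ins <= marking:  # transition enabled
            results.extend(arrival_times((marking - ins) | outs,
                                         elapsed + delay, goal))
    return results

delays = arrival_times(frozenset({"start"}))
print(sorted(delays))  # [9, 11]: t1,t2,t4 gives 2+3+4; t1,t3,t5 gives 2+1+8
```

The paper's CS-class technique solves the much harder interval-timed version of this question symbolically, where each transition may fire anywhere within a time interval.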

  18. Biomarker Discovery and Verification of Esophageal Squamous Cell Carcinoma Using Integration of SWATH/MRM.

    PubMed

    Hou, Guixue; Lou, Xiaomin; Sun, Yulin; Xu, Shaohang; Zi, Jin; Wang, Quanhui; Zhou, Baojin; Han, Bo; Wu, Lin; Zhao, Xiaohang; Lin, Liang; Liu, Siqi

    2015-09-04

    We propose an efficient integration of SWATH with MRM for biomarker discovery and verification when the corresponding ion library is well established. We strictly controlled the false-positive rate associated with SWATH MS signals and carefully selected the target peptides for SWATH and MRM. We collected 10 paired samples of esophageal squamous cell carcinoma (ESCC) tissue from tumors and adjacent regions and, using SWATH, quantified 1758 unique proteins with an FDR of 1% at the protein level, of which 467 proteins were abundance-dependent with respect to ESCC. After carefully evaluating the SWATH MS signals of the up-regulated proteins, we selected 120 proteins for MRM verification. MRM analysis of the pooled and individual esophageal tissues yielded 116 proteins whose abundance responses to ESCC were similar to those acquired with SWATH. Because the ESCC-related proteins included a high percentage of secreted proteins, we conducted the MRM assay on patient sera collected pre- and postoperation. Of the 116 target proteins, 42 were identified in the ESCC sera, including 11 with lowered abundances postoperation. Coupling SWATH and MRM is thus feasible and efficient for the discovery and verification of cancer-related protein biomarkers.

  19. Consistent model driven architecture

    NASA Astrophysics Data System (ADS)

    Niepostyn, Stanisław J.

    2015-09-01

    The goal of MDA is to produce software systems from abstract models in a way that restricts human interaction to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in natural language. Consequently, verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. Verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. a Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.
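A cross-diagram consistency rule of the kind described can be sketched as a simple check between two model views. The rule and the diagram encoding below are hypothetical illustrations, not the paper's actual rule set:

```python
# Hypothetical consistency rule: every message sent to an object in a
# sequence diagram must correspond to an operation of that object's class.
class_diagram = {
    "Order": {"addItem", "total"},
    "Inventory": {"reserve"},
}
sequence_diagram = [
    ("Order", "addItem"),
    ("Order", "total"),
    ("Inventory", "release"),   # inconsistent: no such operation
]

def inconsistencies(classes, messages):
    """Return every (class, operation) message with no matching
    operation in the class diagram."""
    return [(cls, op) for cls, op in messages
            if op not in classes.get(cls, set())]

print(inconsistencies(class_diagram, sequence_diagram))
# [('Inventory', 'release')]
```

Catching such mismatches mechanically, before code generation, is what makes automatic workflow-application generation from the models feasible.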

  20. 30 CFR 250.904 - What is the Platform Approval Program?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... these requirements will satisfy MMS criteria for approval of fixed platforms of a proven design that... approval for a floating platform; a platform of unique design; or a platform being installed in deepwater (> 400 ft.) or a frontier area, you must also meet the requirements of the Platform Verification Program...

  1. 30 CFR 250.904 - What is the Platform Approval Program?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... criteria for approval of fixed platforms of a proven design that will be placed in the shallow water areas... of unique design; or a platform being installed in deepwater (> 400 ft.) or a frontier area, you must also meet the requirements of the Platform Verification Program. The requirements of the Platform...

  2. Numerical verification of two-component dental implant in the context of fatigue life for various load cases.

    PubMed

    Szajek, Krzysztof; Wierszycki, Marcin

    2016-01-01

    Dental implant design is a complex process which must consider many limitations, both biological and mechanical in nature. In earlier studies, a complete procedure for the improvement of a two-component dental implant was proposed. However, the optimization tasks carried out required an assumption about the representative load case, which raised doubts about optimality for the other load cases. This paper deals with verification of the optimal design in the context of fatigue life, and its main goal is to answer the question of whether the assumed load scenario (a solely horizontal occlusal load) leads to a design that is also "safe" for oblique occlusal loads regardless of the angle from the implant axis. The verification is carried out with a series of finite element analyses for a wide spectrum of physiologically justified loads. A design-of-experiments methodology with a full factorial technique is utilized. All computations are done in the Abaqus suite. The maximal Mises stress and normalized effective stress amplitude for various load cases are discussed and compared with the assumed "safe" limit (the equivalent of a fatigue life of 5e6 cycles). The obtained results prove that the coronal-apical load component should be taken into consideration when the fatigue life of the two-component dental implant is optimized. However, its influence in the analyzed case is small and does not change the fact that the fatigue life improvement is observed for all components within the whole range of analyzed loads.

  3. NASA's Approach to Software Assurance

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha

    2015-01-01

    NASA defines software assurance as: the planned and systematic set of activities that ensure conformance of software life cycle processes and products to requirements, standards, and procedures via quality, safety, reliability, and independent verification and validation. NASA's implementation of this approach to the quality, safety, reliability, security, and verification and validation of software is brought together in one discipline, software assurance. Organizationally, NASA has software assurance at each NASA center, a Software Assurance Manager at NASA Headquarters, a Software Assurance Technical Fellow (currently the same person as the SA Manager), and an Independent Verification and Validation Organization with its own facility. As an umbrella risk mitigation strategy for safety and mission success assurance of NASA's software, software assurance covers a wide area and is better structured to address the dynamic changes in how software is developed, used, and managed, as well as its increasingly complex functionality. Being flexible, risk based, and prepared for challenges in software at NASA is essential, especially as much of our software is unique for each mission.

  4. The space shuttle launch vehicle aerodynamic verification challenges

    NASA Technical Reports Server (NTRS)

    Wallace, R. O.; Austin, L. D.; Hondros, J. G.; Surber, T. E.; Gaines, L. M.; Hamilton, J. T.

    1985-01-01

    The Space Shuttle aerodynamics and performance communities were challenged to verify the Space Shuttle vehicle (SSV) aerodynamics and system performance by flight measurements. Historically, launch vehicle flight test programs which faced these same challenges were unmanned instrumented flights of simple aerodynamically shaped vehicles. However, the manned SSV flight test program made these challenges more complex because of the unique aerodynamic configuration powered by the first man-rated solid rocket boosters (SRB). The analyses of flight data did not verify the aerodynamics or performance preflight predictions of the first flight of the Space Transportation System (STS-1). However, these analyses have defined the SSV aerodynamics and verified system performance. The aerodynamics community also was challenged to understand the discrepancy between the wind tunnel and flight defined aerodynamics. The preflight analysis challenges, the aerodynamic extraction challenges, and the postflight analyses challenges which led to the SSV system performance verification and which will lead to the verification of the operational ascent aerodynamics data base are presented.

  5. Using crypts as iris minutiae

    NASA Astrophysics Data System (ADS)

    Shen, Feng; Flynn, Patrick J.

    2013-05-01

    Iris recognition is one of the most reliable biometric technologies for identity recognition and verification, but it has not been used in a forensic context because the representation and matching of iris features are not straightforward for traditional iris recognition techniques. In this paper we concentrate on the iris crypt as a visible feature used to represent the characteristics of irises in a similar way to fingerprint minutiae. The matching of crypts is based on their appearances and locations. The number of matching crypt pairs found between two irises can be used for identity verification and the convenience of manual inspection makes iris crypts a potential candidate for forensic applications.
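Crypt matching by location, as described above, can be sketched with a greedy nearest-neighbor pairing. The distance threshold and coordinates below are illustrative assumptions, not the paper's actual matching procedure (which also compares crypt appearance):

```python
from math import dist

def match_crypts(crypts_a, crypts_b, max_distance=10.0):
    """Greedily pair crypts from two irises by location proximity
    and return the number of matching pairs."""
    unmatched_b = list(crypts_b)
    pairs = 0
    for a in crypts_a:
        candidates = [b for b in unmatched_b if dist(a, b) <= max_distance]
        if candidates:
            best = min(candidates, key=lambda b: dist(a, b))
            unmatched_b.remove(best)  # each crypt matches at most once
            pairs += 1
    return pairs

# Crypt centers as (x, y) coordinates in a normalized iris image.
probe = [(12, 40), (55, 80), (90, 33)]
gallery = [(14, 42), (57, 77), (200, 10)]
print(match_crypts(probe, gallery))  # 2
```

The resulting pair count plays the role of a minutiae match score: identity is accepted when it exceeds a decision threshold, and the pairs themselves can be inspected manually, which is the property that makes crypts attractive for forensics.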

  6. Rats Remember Items in Context Using Episodic Memory.

    PubMed

    Panoz-Brown, Danielle; Corbin, Hannah E; Dalecki, Stefan J; Gentry, Meredith; Brotheridge, Sydney; Sluka, Christina M; Wu, Jie-En; Crystal, Jonathon D

    2016-10-24

    Vivid episodic memories in people have been characterized as the replay of unique events in sequential order [1-3]. Animal models of episodic memory have successfully documented episodic memory of a single event (e.g., [4-8]). However, a fundamental feature of episodic memory in people is that it involves multiple events, and notably, episodic memory impairments in human diseases are not limited to a single event. Critically, it is not known whether animals remember many unique events using episodic memory. Here, we show that rats remember many unique events and the contexts in which the events occurred using episodic memory. We used an olfactory memory assessment in which new (but not old) odors were rewarded using 32 items. Rats were presented with 16 odors in one context and the same odors in a second context. To attain high accuracy, the rats needed to remember item in context because each odor was rewarded as a new item in each context. The demands on item-in-context memory were varied by assessing memory with 2, 3, 5, or 15 unpredictable transitions between contexts, and item-in-context memory survived a 45 min retention interval challenge. When the memory of item in context was put in conflict with non-episodic familiarity cues, rats relied on item in context using episodic memory. Our findings suggest that rats remember multiple unique events and the contexts in which these events occurred using episodic memory and support the view that rats may be used to model fundamental aspects of human cognition. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Limitations in learning: How treatment verifications fail and what to do about it?

    PubMed

    Richardson, Susan; Thomadsen, Bruce

    The purposes of this study were to discuss why classic incident learning systems have been insufficient for patient safety improvement, to examine failures in treatment verification, and to provide context for the reasons behind these failures and the lessons that can be learned from them. Historically, incident learning in brachytherapy has been performed via database mining, which might include reading event reports and incidents and then incorporating verification procedures to prevent similar incidents. A description of both classic event-reporting databases and current incident learning and reporting systems is given. Real examples of treatment failures based on firsthand knowledge are presented to evaluate the effectiveness of verification. These failures are described and analyzed by outlining potential pitfalls and problems. Databases and incident learning systems can be limited in value and fail to provide enough detail for physicists seeking process improvement. Four examples of treatment verification failures experienced firsthand by experienced brachytherapy physicists are described. These include both under-verification and over-verification of various treatment processes. Database mining is an insufficient method to effect substantial improvements in the practice of brachytherapy. New incident learning systems are still immature and being tested. Instead, a new method of shared learning and implementation of changes must be created. Copyright © 2017 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  8. A Perfect Platform: Combining Contingency Management with Medications for Drug Abuse

    PubMed Central

    Carroll, Kathleen M.; Rounsaville, Bruce J.

    2008-01-01

    Contingency management (CM) procedures, which provide concrete reinforcers or rewards contingent on verification of discrete targeted behaviors, such as drug-free urines, have been demonstrated to be effective in a number of clinical trials. However, to date only a few trials have capitalized on the unique strengths and capabilities of CM as an ideal platform to improve response to, or address weaknesses of, many pharmacotherapies used in the treatment of drug abuse. In this review, we describe the multiple potential uses of CM as a platform for pharmacotherapy: reducing illicit drug use in the context of agonist therapies; fostering medication compliance with antagonists, aversive agents, and HIV medications; fostering a period of abstinence prior to initiation of agents used to treat comorbid psychiatric conditions, or, in the context of vaccines, fostering adequate periods of abstinence while titer levels are building; and enhancing the effectiveness of anticraving agents through additive or synergistic effects. Although its multiple strengths render it an almost perfect platform, CM does have some weaknesses that have limited its use to date, including cost, the short-term nature of its effects, and the need for training. Future treatment development of CM as a medication platform needs to counter these issues by focusing on CM applications with large potential benefit, developing simple or automated methods for CM delivery, and placing greater emphasis on the process of transitioning away from formal CM treatment. PMID:17613963

  9. 32 CFR Attachment 3 to Part 855 - Landing Permit Application Instructions

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ....13. Block 9c, Title. Self-explanatory. A3.1.14. Block 9d, Telephone Number. Self-explanatory. A3.1.15.... Block 3. Self-explanatory. (Users will not necessarily be denied landing rights if pilots are not... requested, it may be approved if warranted by unique circumstances. (The verification specified for each...

  10. ON AN ALLEGED TRUTH/FALSITY ASYMMETRY IN CONTEXT SHIFTING EXPERIMENTS

    PubMed Central

    Hansen, Nat

    2012-01-01

    Keith DeRose has argued that context shifting experiments should be designed in a specific way in order to accommodate what he calls a ‘truth/falsity asymmetry’. I explain and critique DeRose's reasons for proposing this modification to contextualist methodology, drawing on recent experimental studies of DeRose's bank cases as well as experimental findings about the verification of affirmative and negative statements. While DeRose's arguments for his particular modification to contextualist methodology fail, the lesson of his proposal is that there is good reason to pay close attention to several subtle aspects of the design of context shifting experiments. PMID:25821248

  11. Abstract for 1999 Rational Software User Conference

    NASA Technical Reports Server (NTRS)

    Dunphy, Julia; Rouquette, Nicolas; Feather, Martin; Tung, Yu-Wen

    1999-01-01

    We develop spacecraft fault-protection software at NASA/JPL. Challenges exemplified by our task: 1) high-quality systems - need for extensive validation & verification; 2) multi-disciplinary context - involves experts from diverse areas; 3) embedded systems - must adapt to external practices, notations, etc.; and 4) development pressures - NASA's mandate of "better, faster, cheaper".

  12. Verification of the CFD simulation system SAUNA for complex aircraft configurations

    NASA Astrophysics Data System (ADS)

    Shaw, Jonathon A.; Peace, Andrew J.; May, Nicholas E.; Pocock, Mark F.

    1994-04-01

    This paper is concerned with the verification for complex aircraft configurations of an advanced CFD simulation system known by the acronym SAUNA. A brief description of the complete system is given, including its unique use of differing grid generation strategies (structured, unstructured or both) depending on the geometric complexity of the addressed configuration. The majority of the paper focuses on the application of SAUNA to a variety of configurations from the military aircraft, civil aircraft and missile areas. Mesh generation issues are discussed for each geometry and experimental data are used to assess the accuracy of the inviscid (Euler) model used. It is shown that flexibility and accuracy are combined in an efficient manner, thus demonstrating the value of SAUNA in aerodynamic design.

  13. Post-OPC verification using a full-chip pattern-based simulation verification method

    NASA Astrophysics Data System (ADS)

    Hung, Chi-Yuan; Wang, Ching-Heng; Ma, Cliff; Zhang, Gary

    2005-11-01

    In this paper, we evaluated and investigated techniques for performing fast full-chip post-OPC verification using a commercial product platform. A number of databases from several technology nodes, i.e. 0.13um, 0.11um and 90nm, were used in the investigation. Although it has been proven that our OPC technology is robust in general for most cases, given the variety of tape-outs with complicated design styles and technologies it is difficult to develop a "complete" or "bullet-proof" OPC algorithm that covers every possible layout pattern. In the evaluation, among dozens of databases, some OPC databases were found by model-based post-OPC checking to contain errors, which could be costly in manufacturing - reticle, wafer process, and, more importantly, production delay. From such full-chip OPC database verification, we have learned that optimizing OPC models and recipes on a limited set of test-chip designs may not provide sufficient coverage across the range of designs to be produced in the process, and fatal errors (such as pinching or bridging), poor CD distribution, and process-sensitive patterns may still occur. As a result, more than one reticle tape-out cycle is not uncommon to prove models and recipes that approach the center of the process for a range of designs. We therefore describe a full-chip pattern-based simulation verification flow that serves both OPC model and recipe development and post-OPC verification after production release of the OPC. Lastly, we discuss the differences between the new pattern-based and conventional edge-based verification tools and summarize the advantages of our new tool and methodology: 1) accuracy: superior inspection algorithms, down to 1 nm accuracy with the new pattern-based approach; 2) high-speed performance: pattern-centric algorithms give the best full-chip inspection efficiency; 3) powerful analysis capability: flexible error distribution, grouping, interactive viewing, and hierarchical pattern extraction to narrow down to unique patterns/cells.
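The grouping step, narrowing thousands of flagged sites down to unique patterns for review, can be illustrated by bucketing detected error sites by a signature of their surrounding geometry. The signatures below are hypothetical stand-ins for the tool's extracted pattern clips:

```python
from collections import defaultdict

# Each detected error site carries a small clip of surrounding geometry;
# identical clips are the same pattern, so reviewers inspect each only once.
error_sites = [
    ((120, 45),  "line-end:narrow:0.11um"),
    ((980, 310), "line-end:narrow:0.11um"),
    ((412, 77),  "corner:bridge-risk:90nm"),
]

groups = defaultdict(list)
for location, clip_signature in error_sites:
    groups[clip_signature].append(location)

for signature, locations in groups.items():
    print(signature, "x", len(locations))
```

Three flagged sites collapse to two unique patterns here; on a full chip the reduction is what makes manual disposition of verification results tractable.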

  14. Loads and Structural Dynamics Requirements for Spaceflight Hardware

    NASA Technical Reports Server (NTRS)

    Schultz, Kenneth P.

    2011-01-01

    The purpose of this document is to establish requirements relating to the loads and structural dynamics technical discipline for NASA and commercial spaceflight launch vehicle and spacecraft hardware. Requirements are defined for the development of structural design loads and recommendations regarding methodologies and practices for the conduct of load analyses are provided. As such, this document represents an implementation of NASA STD-5002. Requirements are also defined for structural mathematical model development and verification to ensure sufficient accuracy of predicted responses. Finally, requirements for model/data delivery and exchange are specified to facilitate interactions between Launch Vehicle Providers (LVPs), Spacecraft Providers (SCPs), and the NASA Technical Authority (TA) providing insight/oversight and serving in the Independent Verification and Validation role. In addition to the analysis-related requirements described above, a set of requirements are established concerning coupling phenomena or other interaction between structural dynamics and aerodynamic environments or control or propulsion system elements. Such requirements may reasonably be considered structure or control system design criteria, since good engineering practice dictates consideration of and/or elimination of the identified conditions in the development of those subsystems. The requirements are included here, however, to ensure that such considerations are captured in the design space for launch vehicles (LV), spacecraft (SC) and the Launch Abort Vehicle (LAV). The requirements in this document are focused on analyses to be performed to develop data needed to support structural verification. 
As described in JSC 65828, Structural Design Requirements and Factors of Safety for Spaceflight Hardware, implementation of the structural verification requirements is expected to be described in a Structural Verification Plan (SVP), which should describe the verification of each structural item for the applicable requirements. The requirement for and expected contents of the SVP are defined in JSC 65828. The SVP may also document unique verifications that meet or exceed these requirements with Technical Authority approval.

  15. Unique identification code for medical fundus images using blood vessel pattern for tele-ophthalmology applications.

    PubMed

    Singh, Anushikha; Dutta, Malay Kishore; Sharma, Dilip Kumar

    2016-10-01

    Identification of fundus images during transmission and storage in databases for tele-ophthalmology applications is an important issue in the modern era. The proposed work presents a novel, accurate method for the generation of a unique identification code for fundus images for tele-ophthalmology applications and storage in databases. Unlike existing methods of steganography and watermarking, this method does not tamper with the medical image, as nothing is embedded in this approach and there is no loss of medical information. A strategic combination of the unique blood vessel pattern and the patient ID is used to generate the unique identification code for the digital fundus images. The segmented blood vessel pattern near the optic disc is strategically combined with the patient ID to generate a unique identification code for the image. The proposed method of medical image identification is tested on the publicly available DRIVE and MESSIDOR databases of fundus images, and the results are encouraging. Experimental results indicate the uniqueness of the identification code and the lossless recovery of patient identity from the code for integrity verification of fundus images. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
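Since the abstract requires lossless recovery of the patient identity, the "strategic combination" must be reversible, so a one-way hash would not suffice. A minimal sketch of a reversible combination follows; the paper's actual scheme is not specified here, and the flat concatenation below is an assumption for illustration:

```python
import binascii

def encode_id(vessel_bits: bytes, patient_id: str) -> str:
    """Combine the vessel-pattern signature with the patient ID so
    that the identity can later be recovered losslessly."""
    return binascii.hexlify(vessel_bits).decode() + "-" + patient_id

def recover_patient_id(code: str) -> str:
    """Invert the combination to get the patient ID back."""
    return code.split("-", 1)[1]

# `vessel_bits` would be the binarized segmentation near the optic
# disc; here a stand-in byte string is used.
code = encode_id(b"\x9a\x3f", "PATIENT-0042")
print(code)                      # 9a3f-PATIENT-0042
print(recover_patient_id(code))  # PATIENT-0042
```

The vessel-derived half makes the code unique per image, while the recoverable ID half supports the integrity check: a received image whose segmented vessels no longer reproduce the stored signature has been tampered with or mislabeled.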

  16. Systems Approach to Arms Control Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allen, K; Neimeyer, I; Listner, C

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  17. Striving to be known by significant others: automatic activation of self-verification goals in relationship contexts.

    PubMed

    Kraus, Michael W; Chen, Serena

    2009-07-01

    Extending research on the automatic activation of goals associated with significant others, the authors hypothesized that self-verification goals typically pursued with significant others are automatically elicited when a significant-other representation is activated. Supporting this hypothesis, the activation of a significant-other representation through priming (Experiments 1 and 3) or through a transference encounter (Experiment 2) led participants to seek feedback that verifies their preexisting self-views. Specifically, significant-other primed participants desired self-verifying feedback, in general (Experiment 1), from an upcoming interaction partner (Experiment 2), and relative to acquaintance-primed participants and favorable feedback (Experiment 3). Finally, self-verification goals were activated, especially for relational self-views deemed high in importance to participants' self-concepts (Experiment 2) and held with high certainty (Experiment 3). Implications for research on self-evaluative goals, the relational self, and the automatic goal activation literature are discussed, as are consequences for close relationships. (PsycINFO Database Record (c) 2009 APA, all rights reserved).

  18. Markov Chains For Testing Redundant Software

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Sjogren, Jon A.

    1990-01-01

    Preliminary design developed for validation experiment that addresses problems unique to assuring extremely high quality of multiple-version programs in process-control software. Approach takes into account inertia of controlled system in sense it takes more than one failure of control program to cause controlled system to fail. Verification procedure consists of two steps: experimentation (numerical simulation) and computation, with Markov model for each step.
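
    The two-step procedure above combines simulation with a Markov computation. As a purely illustrative sketch (not the report's actual model), the chain below tracks consecutive control-program failures, so the controlled system fails only after `k` consecutive bad control cycles, capturing the inertia mentioned in the abstract; `p_fail`, `k` and `n_steps` are assumed parameters.

    ```python
    def failure_probability(p_fail, k, n_steps):
        # State i (i < k) = number of consecutive failed control cycles so far;
        # state k = controlled-system failure (absorbing). A successful control
        # cycle resets the count to zero, reflecting the system's inertia.
        probs = [1.0] + [0.0] * k
        for _ in range(n_steps):
            nxt = [0.0] * (k + 1)
            nxt[0] = sum(probs[i] * (1 - p_fail) for i in range(k))
            for i in range(k):
                nxt[i + 1] += probs[i] * p_fail
            nxt[k] += probs[k]  # absorbing state keeps its mass
            probs = nxt
        return probs[k]
    ```

    With `k = 1` the model reduces to the usual "any failure is fatal" case; larger `k` shows how tolerance of transient control failures lowers the system failure probability.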

  19. Contributions of acculturation, enculturation, discrimination, and personality traits to social anxiety among Chinese immigrants: A context-specific assessment.

    PubMed

    Fang, Ke; Friedlander, Myrna; Pieterse, Alex L

    2016-01-01

    Based on the diathesis-stress model of anxiety, this study examined the contributions of cultural processes, perceived racial discrimination, and personality traits to social anxiety among Chinese immigrants. Further guided by the theory of intergroup anxiety, this study also adopted a context-specific approach to distinguish between participants' experience of social anxiety when interacting with European Americans versus with other Chinese in the United States. This quantitative and ex post facto study used a convenience sample of 140 first-generation Chinese immigrants. Participants were recruited through e-mails from different university and community groups across the United States. The sample included 55 men and 82 women (3 did not specify), with an average age of 36 years. Results showed that more social anxiety was reported in the European American context than in the Chinese ethnic context. The full models accounted for almost half the variance in anxiety in each context. Although personality accounted for the most variance, the cultural variables and discrimination contributed 14% of the unique variance in the European American context. Notably, low acculturation, high neuroticism, and low extraversion were unique contributors to social anxiety with European Americans, whereas in the Chinese ethnic context only low extraversion was a unique contributor; more discrimination was uniquely significant in both contexts. The findings suggest a need to contextualize the research and clinical assessment of social anxiety, and have implications for culturally sensitive counseling with immigrants. (c) 2016 APA, all rights reserved.

  20. A Platform Architecture for Sensor Data Processing and Verification in Buildings

    ERIC Educational Resources Information Center

    Ortiz, Jorge Jose

    2013-01-01

    This thesis examines the state of the art of building information systems and evaluates their architecture in the context of emerging technologies and applications for deep analysis of the built environment. We observe that modern building information systems are difficult to extend, do not provide general services for application development, do…

  1. Automatic Verification of Serializers.

    DTIC Science & Technology

    1980-03-01

    2.5 Using semaphores to implement serializers … 2.6 A comparison of … of concurrency control, while Hewitt has concentrated on more primitive control of concurrency in a context where programs communicate by passing … translation of serializers into clusters and semaphores is given as a possible implementation strategy. Chapter 3 presents a simple semantic model that sup…

  2. Families of Functions and Functions of Proof

    ERIC Educational Resources Information Center

    Landman, Greisy Winicki

    2002-01-01

    This article describes an activity for secondary school students that may constitute an appropriate opportunity to discuss with them the idea of proof, particularly in an algebraic context. During the activity the students may experience and understand some of the roles played by proof in mathematics in addition to verification of truth:…

  3. Using Dynamic Geometry to Expand Mathematics Teachers' Understanding of Proof

    ERIC Educational Resources Information Center

    de Villiers, Michael

    2004-01-01

    This paper gives a broad descriptive account of some activities that the author has designed using Sketchpad to develop teachers' understanding of other functions of proof than just the traditional function of 'verification'. These other functions of proof illustrated here are those of explanation, discovery and systematization (in the context of…

  4. Direct observation of how the heavy-fermion state develops in CeCoIn5

    NASA Astrophysics Data System (ADS)

    Chen, Q. Y.; Xu, D. F.; Niu, X. H.; Jiang, J.; Peng, R.; Xu, H. C.; Wen, C. H. P.; Ding, Z. F.; Huang, K.; Shu, L.; Zhang, Y. J.; Lee, H.; Strocov, V. N.; Shi, M.; Bisti, F.; Schmitt, T.; Huang, Y. B.; Dudin, P.; Lai, X. C.; Kirchner, S.; Yuan, H. Q.; Feng, D. L.

    2017-07-01

    Heavy-fermion systems share some of the strange metal phenomenology seen in other unconventional superconductors, providing a unique opportunity to set strange metals in a broader context. Central to understanding heavy-fermion systems is the interplay of localization and itinerancy. These materials acquire high electronic masses and a concomitant Fermi volume increase as the f electrons delocalize at low temperatures. However, despite the wide-spread acceptance of this view, a direct microscopic verification has been lacking. Here we report high-resolution angle-resolved photoemission measurements on CeCoIn5, a prototypical heavy-fermion compound, which spectroscopically resolve the development of band hybridization and the Fermi surface expansion over a wide temperature region. Unexpectedly, the localized-to-itinerant transition occurs at surprisingly high temperatures, yet f electrons are still largely localized even at the lowest temperature. These findings point to an unanticipated role played by crystal-field excitations in the strange metal behavior of CeCoIn5. Our results offer a comprehensive experimental picture of the heavy-fermion formation, setting the stage for understanding the emergent properties, including unconventional superconductivity, in this and related materials.

  5. Comprehension of idioms in adolescents with language-based learning disabilities compared to their typically developing peers.

    PubMed

    Qualls, Constance Dean; Lantz, Jennifer M; Pietrzyk, Rose M; Blood, Gordon W; Hammer, Carol Scheffner

    2004-01-01

    Adolescents with language-based learning disabilities (LBLD) often interpret idioms literally. When idioms are provided in an enriched context, comprehension is compromised further because of the LBLD student's inability to assign multiple meanings to words, assemble and integrate information, and go beyond a local referent to derive a global, coherent meaning. This study tested the effects of context and familiarity on comprehension of 24 idioms in 22 adolescents with LBLD. The students completed the Idiom Comprehension Test (ICT) [Language, Speech, and Hearing Services in Schools 30 (1999) 141; LSHSS 34 (2003) 69] in one of two conditions: in a story or during a verification task. Within each condition were three familiarity levels: high, moderate, and low. The LBLD adolescents' data were then compared to previously collected data from 21 age-, gender-, and reading ability-matched typically developing (TD) peers. The relations between reading and language literacy and idiom comprehension were also examined in the LBLD adolescents. Results showed that: (a) the LBLD adolescents generally performed poorly relative to their TD counterparts; however, the groups performed comparably on the high and moderate familiarity idioms in the verification condition; (b) the LBLD adolescents performed significantly better in the verification condition than in the story condition; and (c) reading ability was associated with comprehension of the low familiarity idioms in the story condition only. Findings are discussed relative to implications for speech-language pathologists (SLPs) and educators working with adolescents with LBLD. 
As a result of this activity, the participant will be able to (1) describe the importance of metalinguistic maturity for comprehension of idioms and other figures of speech; (2) understand the roles of context and familiarity when assessing idiom comprehension in adolescents with LBLD; and (3) critically evaluate assessments of idiom comprehension and determine their appropriateness for use with adolescents with LBLD.

  6. eBiometrics: an enhanced multi-biometrics authentication technique for real-time remote applications on mobile devices

    NASA Astrophysics Data System (ADS)

    Kuseler, Torben; Lami, Ihsan; Jassim, Sabah; Sellahewa, Harin

    2010-04-01

    The use of mobile communication devices with advanced sensors is growing rapidly. These sensors are enabling functions such as image capture, location applications, and biometric authentication such as fingerprint verification and face & handwritten signature recognition. Such ubiquitous devices are essential tools in today's global economic activities, enabling anywhere-anytime financial and business transactions. Cryptographic functions and biometric-based authentication can enhance the security and confidentiality of mobile transactions. Biometric template security techniques in real-time biometric-based authentication are key factors for successful identity verification solutions, but are vulnerable to determined attacks by both fraudulent software and hardware. The EU-funded SecurePhone project has designed and implemented a multimodal biometric user authentication system on a prototype mobile communication device. However, various implementations of this project have resulted in long verification times or reduced accuracy and/or security. This paper proposes to use built-in-self-test techniques to ensure no tampering has taken place in the verification process prior to performing the actual biometric authentication. These techniques utilise the user's personal identification number as a seed to generate a unique signature. This signature is then used to test the integrity of the verification process. This study also proposes the use of a combination of biometric modalities to provide application-specific authentication in a secure environment, thus achieving an optimum security level with effective processing time. That is, they ensure that the necessary authentication steps and algorithms running on the mobile device's application processor cannot be undermined or modified by an imposter to gain unauthorized access to the secure system.
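
    As a hypothetical sketch of the PIN-seeded integrity check described above (not the SecurePhone implementation; the function name and sampling scheme are illustrative), the PIN can seed a deterministic PRNG that samples the verification code image, whose digest serves as the unique signature:

    ```python
    import hashlib
    import random

    def integrity_signature(pin, code_bytes):
        # Illustrative sketch: the user's PIN seeds a deterministic PRNG that
        # samples byte offsets of the verification code image; hashing the
        # sampled bytes yields a signature that changes if the code is tampered
        # with, without storing the code or the PIN themselves.
        rng = random.Random(pin)
        offsets = [rng.randrange(len(code_bytes)) for _ in range(64)]
        sample = bytes(code_bytes[o] for o in offsets)
        return hashlib.sha256(sample).hexdigest()
    ```

    The device would compare this signature against a reference value computed at enrolment before running the actual biometric matcher.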

  7. Improvement of a uniqueness-and-anonymity-preserving user authentication scheme for connected health care.

    PubMed

    Xie, Qi; Liu, Wenhao; Wang, Shengbao; Han, Lidong; Hu, Bin; Wu, Ting

    2014-09-01

    Patient privacy, security and mutual authentication between the patient and the medical server are important mechanisms in connected health care applications, such as telecare medical information systems and personally controlled health records systems. In 2013, Wen showed that Das et al.'s scheme is vulnerable to the replay attack, user impersonation attacks and off-line guessing attacks, and then proposed an improved scheme using biometrics, password and smart card to overcome these weaknesses. However, we show that Wen's scheme is still vulnerable to off-line password guessing attacks, and does not provide user anonymity or perfect forward secrecy. Further, we propose an improved scheme to fix these weaknesses, and use ProVerif, a formal verification tool based on the applied pi calculus, to prove its security and authentication properties.
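
    As a generic, hypothetical sketch of the mutual-authentication goal such schemes target (this is not Xie et al.'s protocol, and the function names are illustrative), keyed challenge-response with fresh nonces lets each party prove knowledge of a shared key:

    ```python
    import hashlib
    import hmac
    import os

    def prove(key, nonce, role):
        # The response binds the shared key, the challenger's fresh nonce and
        # the prover's role, so a reply cannot be replayed or reflected back.
        return hmac.new(key, role + nonce, hashlib.sha256).digest()

    def mutual_auth(shared_key):
        # Each side challenges the other with a fresh nonce and checks the reply
        # in constant time; both must succeed for mutual authentication.
        n_server, n_client = os.urandom(16), os.urandom(16)
        client_reply = prove(shared_key, n_server, b"client")
        server_reply = prove(shared_key, n_client, b"server")
        server_accepts = hmac.compare_digest(client_reply, prove(shared_key, n_server, b"client"))
        client_accepts = hmac.compare_digest(server_reply, prove(shared_key, n_client, b"server"))
        return server_accepts and client_accepts
    ```

    Real schemes of this family additionally derive the key from password, smart card and biometric factors, and randomize identities to provide anonymity and forward secrecy.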

  8. Electronic and School-Based Victimization: Unique Contexts for Adjustment Difficulties during Adolescence

    ERIC Educational Resources Information Center

    Fredstrom, Bridget K.; Adams, Ryan E.; Gilman, Rich

    2011-01-01

    Previous research suggests that school-based and electronic victimization have similar negative consequences, yet it is unclear whether these two contexts offer overlapping or unique associations with adolescents' adjustment. 802 ninth-graders (43% male, mean age = 15.84 years), majority being Caucasian (82%), completed measures assessing the…

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bunch, Kyle J.; Williams, Laura S.; Jones, Anthony M.

    The 2010 ratification of the New START Treaty has been widely regarded as a noteworthy national security achievement for both the Obama administration and the Medvedev-Putin regime, but deeper cuts are envisioned under future arms control regimes. Future verification needs will include monitoring the storage of warhead components and fissile materials and verifying dismantlement of warheads, pits, secondaries, and other materials. From both the diplomatic and technical perspectives, verification under future arms control regimes will pose new challenges. Since acceptable verification technology must protect sensitive design information and attributes, non-nuclear non-sensitive signatures may provide a significant verification tool without the use of additional information barriers. The use of electromagnetic signatures to monitor nuclear material storage containers is a promising technology with the potential to fulfill these challenging requirements. Research performed at Pacific Northwest National Laboratory (PNNL) has demonstrated that low frequency electromagnetic signatures of sealed metallic containers can be used to confirm the presence of specific components on a “yes/no” basis without revealing classified information. Arms control inspectors might use this technique to verify the presence or absence of monitored items, including both nuclear and non-nuclear materials. Although additional research is needed to study signature aspects such as uniqueness and investigate container-specific scenarios, the technique potentially offers a rapid and cost-effective tool to verify reduction and dismantlement of U.S. and Russian nuclear weapons.
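
    A minimal sketch of the "yes/no" comparison idea, under the assumption (not stated in the abstract) that the instrument compares a measured signature vector against a stored template and releases only the pass/fail bit:

    ```python
    def yes_no_match(measured, template, tolerance):
        # Hypothetical comparator: only the boolean result leaves the
        # instrument, so the raw signature values, which could carry design
        # information, are never revealed to the inspector.
        if len(measured) != len(template):
            return False
        rms = (sum((m - t) ** 2 for m, t in zip(measured, template)) / len(template)) ** 0.5
        return rms <= tolerance
    ```

    The tolerance would be set from repeated measurements of reference containers, trading false alarms against sensitivity to diverted material.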

  10. Atmospheric transport modelling in support of CTBT verification—overview and basic concepts

    NASA Astrophysics Data System (ADS)

    Wotawa, Gerhard; De Geer, Lars-Erik; Denier, Philippe; Kalinowski, Martin; Toivonen, Harri; D'Amours, Real; Desiato, Franco; Issartel, Jean-Pierre; Langer, Matthias; Seibert, Petra; Frank, Andreas; Sloan, Craig; Yamazawa, Hiromi

    Under the provisions of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global monitoring system comprising different verification technologies is currently being set up. The network will include 80 radionuclide (RN) stations distributed all over the globe that measure treaty-relevant radioactive species. While the seismic subsystem cannot distinguish between chemical and nuclear explosions, RN monitoring would provide the "smoking gun" of a possible treaty violation. Atmospheric transport modelling (ATM) will be an integral part of CTBT verification, since it provides a geo-temporal location capability for the RN technology. In this paper, the basic concept for the future ATM software system to be installed at the International Data Centre is laid out. The system is based on the operational computation of multi-dimensional source-receptor sensitivity fields for all RN samples by means of adjoint tracer transport modelling. While the source-receptor matrix methodology has already been applied in the past, the system that we suggest will be unique and unprecedented, since it is global, real-time and aims at uncovering source scenarios that are compatible with measurements. Furthermore, it has to deal with source dilution ratios that are by orders of magnitude larger than in typical transport model applications. This new verification software will need continuous scientific attention, and may well provide a prototype system for future applications in areas of environmental monitoring, emergency response and verification of other international agreements and treaties.
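
    The source-receptor matrix idea above can be illustrated with a toy screening step (an assumed simplification, not the IDC system's algorithm): given sensitivities of each RN sample to a unit release in each grid cell, single-source hypotheses are kept only if one release magnitude reproduces every sample.

    ```python
    def compatible_sources(srs, measurements, rel_tol=0.2):
        # srs[i][j]: sensitivity of sample i to a unit release in grid cell j
        # (one row of the source-receptor matrix per measurement). A cell j is
        # "compatible" if a single release magnitude q explains all samples to
        # within rel_tol (assumes positive sensitivities and measurements).
        n_samples, n_cells = len(srs), len(srs[0])
        hits = []
        for j in range(n_cells):
            sens = [srs[i][j] for i in range(n_samples)]
            if any(s == 0 for s in sens):
                continue  # cell j cannot explain a sample it never influences
            q = [measurements[i] / sens[i] for i in range(n_samples)]
            q_mean = sum(q) / len(q)
            if all(abs(x - q_mean) <= rel_tol * q_mean for x in q):
                hits.append((j, q_mean))
        return hits
    ```

    The extreme dilution ratios mentioned in the abstract mean that, in practice, such screening must also account for detection limits and measurement uncertainty rather than exact ratios.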

  11. A Tool for Verification and Validation of Neural Network Based Adaptive Controllers for High Assurance Systems

    NASA Technical Reports Server (NTRS)

    Gupta, Pramod; Schumann, Johann

    2004-01-01

    High reliability of mission- and safety-critical software systems has been identified by NASA as a high-priority technology challenge. We present an approach for the performance analysis of a neural network (NN) in an advanced adaptive control system. This problem is important in the context of safety-critical applications that require certification, such as flight software in aircraft. We have developed a tool to measure the performance of the NN during operation by calculating a confidence interval (error bar) around the NN's output. Our tool can be used during pre-deployment verification as well as monitoring the network performance during operation. The tool has been implemented in Simulink and simulation results on an F-15 aircraft are presented.
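
    One simple way to obtain such an error bar, offered here as a hypothetical sketch rather than the NASA tool's actual algorithm, is to treat the spread of an ensemble of NN outputs for the same input as an empirical confidence interval:

    ```python
    import statistics

    def error_bar(ensemble_outputs, z=1.96):
        # Illustrative sketch: the spread of an ensemble of NN outputs (or of
        # stochastic forward passes) for one input gives an empirical interval
        # around the mean prediction; a wide bar flags low confidence.
        mean = statistics.fmean(ensemble_outputs)
        half_width = z * statistics.stdev(ensemble_outputs)
        return mean - half_width, mean, mean + half_width
    ```

    A monitor can then compare the bar's width against a certification threshold and reject or down-weight the adaptive signal when confidence is low.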

  12. Runtime Verification in Context : Can Optimizing Error Detection Improve Fault Diagnosis

    NASA Technical Reports Server (NTRS)

    Dwyer, Matthew B.; Purandare, Rahul; Person, Suzette

    2010-01-01

    Runtime verification has primarily been developed and evaluated as a means of enriching the software testing process. While many researchers have pointed to its potential applicability in online approaches to software fault tolerance, there has been a dearth of work exploring the details of how that might be accomplished. In this paper, we describe how a component-oriented approach to software health management exposes the connections between program execution, error detection, fault diagnosis, and recovery. We identify both research challenges and opportunities in exploiting those connections. Specifically, we describe how recent approaches to reducing the overhead of runtime monitoring aimed at error detection might be adapted to reduce the overhead and improve the effectiveness of fault diagnosis.

  13. Exploring the e-cigarette e-commerce marketplace: Identifying Internet e-cigarette marketing characteristics and regulatory gaps.

    PubMed

    Mackey, Tim K; Miner, Angela; Cuomo, Raphael E

    2015-11-01

    The electronic cigarette (e-cigarette) market is maturing into a billion-dollar industry. Expansion includes new channels of access not sufficiently assessed, including Internet sales of e-cigarettes. This study identifies unique e-cigarette Internet vendor characteristics, including geographic location, promotional strategies, use of social networking, presence/absence of age verification, and consumer warning representation. We performed structured Internet search engine queries and used inclusion/exclusion criteria to identify e-cigarette vendors. We then conducted content analysis of characteristics of interest. Our examination yielded 57 e-cigarette Internet vendors including 54.4% (n=31) that sold exclusively online. The vast majority of websites (96.5%, n=55) were located in the U.S. Vendors used a variety of sales promotion strategies to market e-cigarettes including 70.2% (n=40) that used more than one social network service (SNS) and 42.1% (n=24) that used more than one promotional sales strategy. Most vendors (68.4%, n=39) displayed one or more health warnings on their website, but often displayed them in smaller font or in their terms and conditions. Additionally, 35.1% (n=20) of vendors did not have any detectable age verification process. E-cigarette Internet vendors are actively engaged in various promotional activities to increase the appeal and presence of their products online. In the absence of FDA regulations specific to the Internet, the e-cigarette e-commerce marketplace is likely to grow. This digital environment poses unique challenges requiring targeted policy-making including robust online age verification, monitoring of SNS marketing, and greater scrutiny of certain forms of marketing promotional practices. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  14. Verification, Validation and Accreditation using AADL

    DTIC Science & Technology

    2011-05-03

    [Garbled OCR fragment of a table] Context-specific … component, c; max. height (surface relative), hsr; max. height (absolute), ha; PWB; digital oscillator; Warning Module PWB.

  15. Current Problems of Improving the Environmental Certification and Output Compliance Verification in the Context of Environmental Management in Kazakhstan

    ERIC Educational Resources Information Center

    Zhambaev, Yerzhan S.; Sagieva, Galia K.; Bazarbek, Bakhytzhan Zh.; Akkulov, Rustem T.

    2016-01-01

    The article discusses the issues of improving the activity of subjects of environmental management in accordance with international environmental standards and national environmental legislation. The article deals with the problem of ensuring the implementation of international environmental standards, the introduction of eco-management, and the…

  16. Standards for the calibration of extensometers

    NASA Astrophysics Data System (ADS)

    Loveday, Malcolm S.

    1991-10-01

    The consequences of the impending publication of a new European Standard, BS EN 10002 Pt 4 'Metallic Materials: Verification of Extensometers Used in Uniaxial Testing', which was based on the equivalent International Standard ISO 9513, are considered within the context of the new standard superseding the present British Standard, BS 3846. The three standards are compared and the differences are highlighted.

  17. Duality based direct resolution of unique profiles using zero concentration region information.

    PubMed

    Tavakkoli, Elnaz; Rajkó, Róbert; Abdollahi, Hamid

    2018-07-01

    Self Modeling Curve Resolution (SMCR) is a class of techniques concerned with estimating the pure profiles underlying a set of measurements on chemical systems. In general, the estimated profiles are ambiguous (non-unique) unless certain special conditions are fulfilled. Incorporating adequate information can reduce the so-called rotational ambiguity effectively and, in the most desirable cases, lead to a unique solution. Therefore, studies on the circumstances resulting in a unique solution are of particular importance. The conditions for a unique solution can be studied based on the duality principle. In a bilinear chemical (e.g., spectroscopic) data matrix, there is a natural duality between its row and column vector spaces under minimal constraints (non-negativity of concentrations and absorbances). This article shows the conditions for a unique solution according to the duality concept, using zero-concentration-region information. A simulated dataset of three components and an experimental system with synthetic mixtures containing three amino acids (tyrosine, phenylalanine and tryptophan) are analyzed. It is shown that, in the presence of sufficient information, a reliable unique solution is obtained, which is valuable in analytical qualification and for quantitative verification analysis. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Representation of the contextual statistical model by hyperbolic amplitudes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khrennikov, Andrei

    We continue the development of a so-called contextual statistical model (here context has the meaning of a complex of physical conditions). It is shown that, besides contexts producing the conventional trigonometric cos-interference, there exist contexts producing the hyperbolic cos-interference. Starting with the corresponding interference formula of total probability we represent such contexts by hyperbolic probabilistic amplitudes or in the abstract formalism by normalized vectors of a hyperbolic analogue of the Hilbert space. There is obtained a hyperbolic Born's rule. Incompatible observables are represented by noncommutative operators. This paper can be considered as the first step towards hyperbolic quantum probability. We also discuss possibilities of experimental verification of hyperbolic quantum mechanics: in physics of elementary particles, string theory as well as in experiments with nonphysical systems, e.g., in psychology, cognitive sciences, and economy.

  19. Representation of the contextual statistical model by hyperbolic amplitudes

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2005-06-01

    We continue the development of a so-called contextual statistical model (here context has the meaning of a complex of physical conditions). It is shown that, besides contexts producing the conventional trigonometric cos-interference, there exist contexts producing the hyperbolic cos-interference. Starting with the corresponding interference formula of total probability we represent such contexts by hyperbolic probabilistic amplitudes or in the abstract formalism by normalized vectors of a hyperbolic analogue of the Hilbert space. There is obtained a hyperbolic Born's rule. Incompatible observables are represented by noncommutative operators. This paper can be considered as the first step towards hyperbolic quantum probability. We also discuss possibilities of experimental verification of hyperbolic quantum mechanics: in physics of elementary particles, string theory as well as in experiments with nonphysical systems, e.g., in psychology, cognitive sciences, and economy.
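
    The interference formula of total probability referred to in both records can be written out explicitly; the following is a standard statement of Khrennikov's contextual framework rather than an equation quoted from the abstracts:

    ```latex
    \[
    P(b \mid C) = P(a_1)\,P(b \mid a_1) + P(a_2)\,P(b \mid a_2)
                + 2\lambda \sqrt{P(a_1)\,P(b \mid a_1)\,P(a_2)\,P(b \mid a_2)},
    \]
    ```

    where the interference coefficient \(\lambda\) takes the form \(\lambda = \cos\theta\) for trigonometric contexts (\(|\lambda| \le 1\)) and \(\lambda = \pm\cosh\theta\) for hyperbolic contexts (\(|\lambda| > 1\)); the latter are the contexts represented by hyperbolic amplitudes and the hyperbolic analogue of Born's rule.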

  20. Exploring implementation practices in results-based financing: the case of the verification in Benin.

    PubMed

    Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier

    2017-03-14

    Results-based financing (RBF) has been introduced in many countries across Africa and a growing literature is building around the assessment of their impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study aims at exploring the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting the RBF implementation in Benin. While our participant observation and operational collaboration with project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with Community-based Organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative and community verification. Results show that the verification processes are complex, costly and time-consuming, and in practice they end up differing from what was designed originally. We explore the consequences of this for the operation of the scheme and for its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delink effort and reward. 
Additionally, the limited integration of the verification activities of district teams with their routine tasks causes a further verticalization of the health system. Our results highlight the potential disconnect between the theory of change behind RBF and the actual scheme's implementation. The implications are relevant at the methodological level, stressing the importance of analyzing implementation processes to fully understand results, as well as at the operational level, pointing to the need to carefully adapt the design of RBF schemes (including verification and other key functions) to the context and to allow room to iteratively modify it during implementation. They also question whether the rationale for thorough and costly verification is justified, or whether adaptations are possible.

  1. Guidance and Control Software Project Data - Volume 3: Verification Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the verification documents from the GCS project. Volume 3 contains four appendices: A. Software Verification Cases and Procedures for the Guidance and Control Software Project; B. Software Verification Results for the Pluto Implementation of the Guidance and Control Software; C. Review Records for the Pluto Implementation of the Guidance and Control Software; and D. Test Results Logs for the Pluto Implementation of the Guidance and Control Software.

  2. Design for Verification: Using Design Patterns to Build Reliable Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Koga, Dennis (Technical Monitor)

    2003-01-01

    Components so far have been mainly used in commercial software development to reduce time to market. While some effort has been spent on formal aspects of components, most of this was done in the context of programming language or operating system framework integration. As a consequence, increased reliability of composed systems is mainly regarded as a side effect of a more rigid testing of pre-fabricated components. In contrast to this, Design for Verification (D4V) puts the focus on component specific property guarantees, which are used to design systems with high reliability requirements. D4V components are domain specific design pattern instances with well-defined property guarantees and usage rules, which are suitable for automatic verification. The guaranteed properties are explicitly used to select components according to key system requirements. The D4V hypothesis is that the same general architecture and design principles leading to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the limitations of conventional reliability assurance measures, such as too large a state space or too many execution paths.

  3. Using ICT techniques for improving mechatronic systems' dependability

    NASA Astrophysics Data System (ADS)

    Miron, Emanuel; Silva, João P. M. A.; Machado, José; Olaru, Dumitru; Prisacaru, Gheorghe

    2013-10-01

    The use of analysis techniques such as simulation and formal verification for industrial controllers is difficult in an industrial context, because these techniques often require substantial investment in skilled human resources with sufficient theoretical knowledge of those domains. This paper aims mainly to show that it is possible to obtain a timed automata model for formal verification purposes from the CAD model of a mechanical component. This systematic approach can be used by companies for the analysis of industrial controller programs. To this end, the paper discusses how best to systematize these procedures; it describes only the first step of a complex process, presents the main difficulties that may be encountered, and proposes a possible way of handling them. A library for formal verification purposes is obtained from original 3D CAD models using a Software as a Service (SaaS) platform, which has become a common delivery model for many applications because SaaS is typically accessed by users via the internet.

  4. A zero-knowledge protocol for nuclear warhead verification

    NASA Astrophysics Data System (ADS)

    Glaser, Alexander; Barak, Boaz; Goldston, Robert J.

    2014-06-01

    The verification of nuclear warheads for arms control involves a paradox: international inspectors will have to gain high confidence in the authenticity of submitted items while learning nothing about them. Proposed inspection systems featuring `information barriers', designed to hide measurements stored in electronic systems, are at risk of tampering and snooping. Here we show the viability of a fundamentally new approach to nuclear warhead verification that incorporates a zero-knowledge protocol, which is designed in such a way that sensitive information is never measured and so does not need to be hidden. We interrogate submitted items with energetic neutrons, making, in effect, differential measurements of both neutron transmission and emission. Calculations for scenarios in which material is diverted from a test object show that a high degree of discrimination can be achieved while revealing zero information. Our ideas for a physical zero-knowledge system could have applications beyond the context of nuclear disarmament. The proposed technique suggests a way to perform comparisons or computations on personal or confidential data without measuring the data in the first place.
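
    The preload idea behind such a physical zero-knowledge measurement can be illustrated numerically. In the sketch below (a toy model; the counts, `TOTAL`, and the `preload`/`passes` names are illustrative assumptions, not the authors' instrument), the inspector preloads each detector with the complement of the counts expected from a genuine item, so a valid item drives every detector to the same flat total and the readout itself reveals nothing about the item:

```python
# Toy sketch of a "preloaded detector" zero-knowledge check.
TOTAL = 1000  # common target count per detector (illustrative)

def preload(expected):
    """Inspector preloads each detector with the complement of the
    counts a genuine item is expected to produce."""
    return [TOTAL - e for e in expected]

def readout(measured, preloads):
    """Detector totals after measuring the item on top of the preloads."""
    return [m + p for m, p in zip(measured, preloads)]

def passes(measured, preloads, tolerance=0):
    """A genuine item drives every detector to the same flat total,
    so a passing readout carries no information about the item."""
    return all(abs(t - TOTAL) <= tolerance
               for t in readout(measured, preloads))

expected = [420, 135, 310, 275]   # template from a trusted reference item
loads = preload(expected)

genuine = [420, 135, 310, 275]    # matches the template
diverted = [420, 135, 260, 275]   # material removed: detector 2 reads low
```

    A diverted item leaves at least one detector off the flat total and fails the check, while a genuine item yields a perfectly uninformative readout; real neutron counts are stochastic, so a nonzero tolerance would be used.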

  5. Using Model Replication to Improve the Reliability of Agent-Based Models

    NASA Astrophysics Data System (ADS)

    Zhong, Wei; Kim, Yushim

    The basic presupposition of model replication activities for a computational model such as an agent-based model (ABM) is that, as a robust and reliable tool, it must be replicable in other computing settings. This assumption has recently gained attention in the artificial society and simulation community due to the challenges of model verification and validation. Illustrating the replication, by a different author and in NetLogo, of an ABM representing fraudulent behavior in a public service delivery system originally developed in the Java-based MASON toolkit, this paper exemplifies how model replication exercises provide unique opportunities for the model verification and validation process. At the same time, it helps accumulate best practices and patterns of model replication and contributes to the agenda of developing a standard methodological protocol for agent-based social simulation.
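
    The replication check itself can be sketched in a few lines: two independently written implementations of the same toy stochastic agent model are run with identical parameters and seed, and their aggregate outputs are compared. The random-walk model below is a made-up stand-in for the fraud ABM discussed in the paper:

```python
import random

def model_impl_a(n_agents, steps, seed):
    """Reference implementation: each agent takes +/-1 random steps;
    returns mean absolute displacement as the summary statistic."""
    rng = random.Random(seed)
    agents = [0] * n_agents
    for _ in range(steps):
        agents = [a + rng.choice((-1, 1)) for a in agents]
    return sum(abs(a) for a in agents) / n_agents

def model_impl_b(n_agents, steps, seed):
    """Replicated implementation: the same model, written independently
    (explicit loops instead of comprehensions)."""
    rng = random.Random(seed)
    agents = [0 for _ in range(n_agents)]
    for _ in range(steps):
        for i in range(n_agents):
            agents[i] += rng.choice((-1, 1))
    return sum(abs(a) for a in agents) / n_agents

def replicates(stat_a, stat_b, tol=1e-9):
    """Replication check: do the two implementations agree?"""
    return abs(stat_a - stat_b) <= tol
```

    With a shared seed and identical draw order the two implementations agree exactly; across platforms or toolkits (MASON vs. NetLogo), a replication study would instead compare output distributions statistically.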

  6. EXAMINING THE ROLE AND RESEARCH CHALLENGES OF SOCIAL MEDIA AS A TOOL FOR NONPROLIFERATION AND ARMS CONTROL TREATY VERIFICATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henry, Michael J.; Cramer, Nicholas O.; Benz, Jacob M.

    Traditional arms control treaty verification activities typically involve a combination of technical measurements via physical and chemical sensors, state declarations, political agreements, and on-site inspections involving international subject matter experts. However, the ubiquity of the internet, and the electronic sharing of data that it enables, has made available a wealth of open source information with the potential to benefit verification efforts. Open source information is already being used by organizations such as the International Atomic Energy Agency to support the verification of state-declared information, prepare inspectors for in-field activities, and to maintain situational awareness. The recent explosion in social media use has opened new doors to exploring the attitudes, moods, and activities around a given topic. Social media platforms, such as Twitter, Facebook, and YouTube, offer an opportunity for individuals, as well as institutions, to participate in a global conversation at minimal cost. Social media data can also provide a more data-rich environment, with text data being augmented with images, videos, and location data. The research described in this paper investigates the utility of applying social media signatures as potential arms control and nonproliferation treaty verification tools and technologies, as determined through a series of case studies. The treaty relevant events that these case studies touch upon include detection of undeclared facilities or activities, determination of unknown events recorded by the International Monitoring System (IMS), and the global media response to the occurrence of an Indian missile launch. The case studies examine how social media can be used to fill an information gap and provide additional confidence to a verification activity. 
The case studies represent, either directly or through a proxy, instances where social media information may be available that could potentially augment the evaluation of an event. The goal of this paper is to instigate a discussion within the verification community as to where and how social media can be effectively utilized to complement and enhance traditional treaty verification efforts. In addition, this paper seeks to identify areas of future research and development necessary to adapt social media analytic tools and techniques, and to form the seed for social media analytics to aid and inform arms control and nonproliferation policymakers and analysts. While social media analysis (as well as open source analysis as a whole) will not ever be able to replace traditional arms control verification measures, they do supply unique signatures that can augment existing analysis.

  7. User interface and operational issues with thermionic space power systems

    NASA Technical Reports Server (NTRS)

    Dahlberg, R. C.; Fisher, C. R.

    1987-01-01

    Thermionic space power systems have unique features which facilitate predeployment operations, provide operational flexibility and simplify the interface with the user. These were studied in some detail during the SP-100 program from 1983 to 1985. Three examples are reviewed in this paper: (1) system readiness verification in the prelaunch phase; (2) startup, shutdown, and dormancy in the operations phase; (3) part-load operation in the operations phase.

  8. Some General Principles in Cryogenic Design, Implementation, and Testing

    NASA Technical Reports Server (NTRS)

    Dipirro, Michael James

    2015-01-01

    Brief Course Description: In 2 hours only the most basic principles of cryogenics can be presented. I will concentrate on the differences between a room temperature thermal analysis and cryogenic thermal analysis, namely temperature dependent properties. I will talk about practical materials for thermal contact and isolation. I will finish by describing the verification process and instrumentation used that is unique to cryogenic (in general less than 100K) systems.

  9. Analog Video Authentication and Seal Verification Equipment Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gregory Lancaster

    Under contract to the US Department of Energy in support of arms control treaty verification activities, the Savannah River National Laboratory in conjunction with the Pacific Northwest National Laboratory, the Idaho National Laboratory and Milagro Consulting, LLC developed equipment for use within a chain of custody regime. This paper discusses two specific devices, the Authentication Through the Lens (ATL) analog video authentication system and a photographic multi-seal reader. Both of these devices have been demonstrated in a field trial, and the experience gained throughout is also discussed. Typically, cryptographic methods are used to prove the authenticity of digital images and video used in arms control chain of custody applications. However, in some applications analog cameras are used. Since cryptographic authentication methods will not work on analog video streams, a simple method of authenticating analog video was developed and tested. A photographic multi-seal reader was developed to image different types of visual unique identifiers for use in chain of custody and authentication activities. This seal reader is unique in its ability to image various types of seals including the Cobra Seal, Reflective Particle Tags, and adhesive seals. Flicker comparison is used to compare before and after images collected with the seal reader in order to detect tampering and verify the integrity of the seal.
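
    The flicker-comparison step can be approximated numerically. The sketch below is illustrative only (real seal images are 2-D and need registration before differencing): it counts pixels that changed between before and after images and flags the seal when too many differ:

```python
def changed_pixels(before, after, threshold=10):
    """Count pixels whose intensity changed by more than `threshold`
    between before/after images (flat lists of 0-255 values)."""
    if len(before) != len(after):
        raise ValueError("images must have the same dimensions")
    return sum(1 for b, a in zip(before, after) if abs(b - a) > threshold)

def seal_intact(before, after, threshold=10, max_changed=1):
    """Flag possible tampering when too many pixels differ; small
    differences are tolerated as sensor noise."""
    return changed_pixels(before, after, threshold) <= max_changed

before = [100, 100, 200, 200, 50, 50]
intact = [102, 99, 201, 198, 50, 53]    # sensor noise only
tampered = [100, 100, 40, 45, 50, 50]   # seal region altered
```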

  10. Considering land tenure in REDD+ participatory measurement, reporting, and verification: A case study from Indonesia

    PubMed Central

    Bong, Indah Waty; DePuy, Walker Holton; Jihadah, Lina Farida

    2017-01-01

    Measurement, Reporting, and Verification (MRV) systems are thought to be essential for effective carbon accounting and joint REDD+ carbon, conservation, and social development goals. Community participation in MRV (PMRV) has been shown to be both cost effective and accurate, as well as a method to potentially advance stakeholder empowerment and perceptions of legitimacy. Recognizing land tenure as a long-standing point of tension in REDD+ planning, we argue that its engagement also has a key role to play in developing a legitimate PMRV. Using household surveys, key informant interviews, and participatory mapping exercises, we present three ‘lived’ land tenure contexts in Indonesia to highlight their socially and ecologically situated natures and to consider the role of tenure pluralism in shaping PMRV. We then raise and interrogate three questions for incorporating lived land tenure contexts into a legitimate PMRV system: 1) Who holds the right to conduct PMRV activities?; 2) How are the impacts of PMRV differentially distributed within local communities?; and 3) What is the relationship between tenure security and motivation to participate in PMRV? We conclude with implementation lessons for REDD+ practitioners, including the benefits of collaborative practices, and point to critical areas for further research. PMID:28406908

  11. Considering land tenure in REDD+ participatory measurement, reporting, and verification: A case study from Indonesia.

    PubMed

    Felker, Mary Elizabeth; Bong, Indah Waty; DePuy, Walker Holton; Jihadah, Lina Farida

    2017-01-01

    Measurement, Reporting, and Verification (MRV) systems are thought to be essential for effective carbon accounting and joint REDD+ carbon, conservation, and social development goals. Community participation in MRV (PMRV) has been shown to be both cost effective and accurate, as well as a method to potentially advance stakeholder empowerment and perceptions of legitimacy. Recognizing land tenure as a long-standing point of tension in REDD+ planning, we argue that its engagement also has a key role to play in developing a legitimate PMRV. Using household surveys, key informant interviews, and participatory mapping exercises, we present three 'lived' land tenure contexts in Indonesia to highlight their socially and ecologically situated natures and to consider the role of tenure pluralism in shaping PMRV. We then raise and interrogate three questions for incorporating lived land tenure contexts into a legitimate PMRV system: 1) Who holds the right to conduct PMRV activities?; 2) How are the impacts of PMRV differentially distributed within local communities?; and 3) What is the relationship between tenure security and motivation to participate in PMRV? We conclude with implementation lessons for REDD+ practitioners, including the benefits of collaborative practices, and point to critical areas for further research.

  12. Transitioning Enhanced Land Surface Initialization and Model Verification Capabilities to the Kenya Meteorological Department (KMD)

    NASA Technical Reports Server (NTRS)

    Case, Jonathan L.; Mungai, John; Sakwa, Vincent; Zavodsky, Bradley T.; Srikishen, Jayanthi; Limaye, Ashutosh; Blankenship, Clay B.

    2016-01-01

    Flooding, severe weather, and drought are key forecasting challenges for the Kenya Meteorological Department (KMD), based in Nairobi, Kenya. Atmospheric processes leading to convection, excessive precipitation and/or prolonged drought can be strongly influenced by land cover, vegetation, and soil moisture content, especially during anomalous conditions and dry/wet seasonal transitions. It is thus important to represent accurately land surface state variables (green vegetation fraction, soil moisture, and soil temperature) in Numerical Weather Prediction (NWP) models. The NASA SERVIR and the Short-term Prediction Research and Transition (SPoRT) programs in Huntsville, AL have established a working partnership with KMD to enhance its regional modeling capabilities. SPoRT and SERVIR are providing experimental land surface initialization datasets and model verification capabilities for capacity building at KMD. To support its forecasting operations, KMD is running experimental configurations of the Weather Research and Forecasting (WRF; Skamarock et al. 2008) model on a 12-km/4-km nested regional domain over eastern Africa, incorporating the land surface datasets provided by NASA SPoRT and SERVIR. SPoRT, SERVIR, and KMD participated in two training sessions in March 2014 and June 2015 to foster the collaboration and use of unique land surface datasets and model verification capabilities. Enhanced regional modeling capabilities have the potential to improve guidance in support of daily operations and high-impact weather and climate outlooks over Eastern Africa. For enhanced land-surface initialization, the NASA Land Information System (LIS) is run over Eastern Africa at 3-km resolution, providing real-time land surface initialization data in place of interpolated global model soil moisture and temperature data available at coarser resolutions. 
Additionally, real-time green vegetation fraction (GVF) composites from the Suomi-NPP VIIRS instrument are being incorporated into the KMD-WRF runs, using the product generated by NOAA/NESDIS. Model verification capabilities are also being transitioned to KMD using NCAR's Model Evaluation Tools (MET; Brown et al. 2009) software in conjunction with a SPoRT-developed scripting package, in order to quantify and compare errors in simulated temperature, moisture and precipitation in the experimental WRF model simulations. This extended abstract and accompanying presentation summarize the efforts and training done to date to support this unique regional modeling initiative at KMD. To honor the memory of Dr. Peter J. Lamb and his extensive efforts in bolstering weather and climate science and capacity-building in Africa, we offer this contribution to the special Peter J. Lamb symposium. The remainder of this extended abstract is organized as follows. The collaborating international organizations involved in the project are presented in Section 2. Background information on the unique land surface input datasets is presented in Section 3. The hands-on training sessions from March 2014 and June 2015 are described in Section 4. Sample experimental WRF output and verification from the June 2015 training are given in Section 5. A summary is given in Section 6, followed by Acknowledgements and References.
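
    In the simplest continuous case, the verification statistics such a toolchain computes reduce to bias and RMSE between paired forecast and observation values. A minimal sketch (function names and sample values are mine, not MET's API):

```python
import math

def bias(forecast, observed):
    """Mean error: positive means the model runs high on average."""
    return sum(f - o for f, o in zip(forecast, observed)) / len(forecast)

def rmse(forecast, observed):
    """Root-mean-square error between paired forecast/observation values."""
    return math.sqrt(sum((f - o) ** 2
                         for f, o in zip(forecast, observed)) / len(forecast))

wrf_t2m = [24.1, 25.3, 26.0, 27.2]   # simulated 2 m temperature (deg C)
obs_t2m = [23.5, 25.0, 26.5, 26.8]   # matched station observations
```

    MET additionally handles grid-to-point matching, categorical scores against thresholds, and confidence intervals, all of which this sketch omits.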

  13. Optical security verification for blurred fingerprints

    NASA Astrophysics Data System (ADS)

    Soon, Boon Y.; Karim, Mohammad A.; Alam, Mohammad S.

    1998-12-01

    Optical fingerprint security verification is gaining popularity, as it has the potential to perform correlation at the speed of light. With advancements in optical security verification techniques, the authentication process can be made almost foolproof and reliable for financial transactions, banking, etc. In law enforcement, a fingerprint obtained from a crime scene may be blurred, making it a poor candidate for correlation purposes. Therefore, the blurred fingerprint needs to be clarified before it is used in the correlation process. There are several different types of blur, such as linear motion blur and defocus blur induced by aberrations of the imaging system; in addition, the blur function may or may not be known. In this paper, we propose non-singularity inverse filtering in the frequency/power domain for deblurring fingerprints with known motion-induced blur. This filtering process is incorporated with the power spectrum subtraction technique, a uniqueness comparison scheme, and the separated target and reference planes method in the joint transform correlator. The proposed hardware implementation is a hybrid electronic-optical correlator system. The performance of the proposed system is verified with computer simulation for both cases: with and without additive random noise corruption.
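
    The core deblurring step, inverse filtering with a non-singularity safeguard, can be sketched in one dimension (the paper works with 2-D fingerprint images and a joint transform correlator; the signal, kernel, and `eps` threshold below are illustrative assumptions):

```python
import cmath

def dft(x):
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n)
                for k in range(n)) for j in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[j] * cmath.exp(2j * cmath.pi * j * k / n)
                for j in range(n)) / n for k in range(n)]

def blur(signal, kernel):
    """Known blur modeled as circular convolution with `kernel`."""
    n = len(signal)
    return [sum(signal[(i - j) % n] * kernel[j]
                for j in range(len(kernel))) for i in range(n)]

def inverse_filter(blurred, kernel, eps=1e-6):
    """Divide by the blur spectrum, skipping near-singular frequencies
    (the non-singularity safeguard) instead of amplifying noise there."""
    n = len(blurred)
    K = dft(kernel + [0.0] * (n - len(kernel)))
    B = dft(blurred)
    X = [b / k if abs(k) > eps else 0.0 for b, k in zip(B, K)]
    return [round(v.real, 6) for v in idft(X)]

original = [0.0, 1.0, 3.0, 2.0, 0.0, 0.0, 1.0, 0.0]  # toy 1-D "scanline"
kernel = [0.75, 0.25]  # asymmetric two-sample motion blur (sums to 1)
```

    Frequencies where the blur spectrum is near zero are zeroed rather than divided; dividing there would amplify noise without bound, which is exactly what the non-singularity condition guards against.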

  14. Absolute pitch memory: its prevalence among musicians and dependence on the testing context.

    PubMed

    Wong, Yetta Kwailing; Wong, Alan C-N

    2014-04-01

    Absolute pitch (AP) is widely believed to be a rare ability possessed by only a small group of gifted and special individuals (AP possessors). While AP has fascinated psychologists, neuroscientists, and musicians for more than a century, no theory can satisfactorily explain why this ability is so rare and difficult to learn. Here, we show that AP ability appears rare because of the methodological issues of the standard pitch-naming test. Specifically, the standard test unnecessarily poses a high decisional demand on AP judgments and uses a testing context that is highly inconsistent with one's musical training. These extra cognitive challenges are not central to AP memory per se and have thus led to consistent underestimation of AP ability in the population. Using the standard test, we replicated the typical findings that the accuracy for general violinists was low (12.38 %; chance level = 0 %). With identical stimuli, scoring criteria, and participants, violinists attained 25 % accuracy in a pitch verification test in which the decisional demand of AP judgment was reduced. When the testing context was increasingly similar to their musical experience, verification accuracy improved further and reached 39 %, three times higher than that for the standard test. Results were replicated with a separate group of pianists. Our findings challenge current theories about AP and suggest that the prevalence of AP among musicians has been highly underestimated in prior work. A multimodal framework is proposed to better explain AP memory.

  15. Decision Engines for Software Analysis Using Satisfiability Modulo Theories Solvers

    NASA Technical Reports Server (NTRS)

    Bjorner, Nikolaj

    2010-01-01

    The area of software analysis, testing and verification is now undergoing a revolution thanks to the use of automated and scalable support for logical methods. A well-recognized premise is that at the core of software analysis engines is invariably a component using logical formulas for describing states and transformations between system states. The process of using this information for discovering and checking program properties (including such important properties as safety and security) amounts to automatic theorem proving. In particular, theorem provers that directly support common software constructs offer a compelling basis. Such provers are commonly called satisfiability modulo theories (SMT) solvers. Z3 is a state-of-the-art SMT solver developed at Microsoft Research. It can be used to check the satisfiability of logical formulas over one or more theories such as arithmetic, bit-vectors, lists, records and arrays. The talk describes some of the technology behind modern SMT solvers, including the solver Z3. Z3 is currently mainly targeted at solving problems that arise in software analysis and verification. It has been applied in various contexts, such as systems for dynamic symbolic execution (Pex, SAGE, Vigilante), for program verification and extended static checking (Spec#/Boogie, VCC, HAVOC), for software model checking (Yogi, SLAM), model-based design (FORMULA), security protocol code (F7), and program run-time analysis and invariant generation (VS3). We will describe how it integrates support for a variety of theories that arise naturally in the context of the applications. There are several new promising avenues, and the talk will touch on some of these and the challenges related to SMT solvers.
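
    What an SMT solver decides can be made concrete with a deliberately naive sketch: given constraints over a theory (here linear integer arithmetic), find a satisfying assignment or report that none exists. Z3 does this symbolically over unbounded domains; the brute-force search below only illustrates the question being answered, not how solvers answer it:

```python
from itertools import product

def solve(constraints, variables, lo=-10, hi=10):
    """Return one satisfying assignment over a finite integer range,
    or None if the constraints are unsatisfiable on that range.
    Real SMT solvers decide this symbolically, without enumeration."""
    for values in product(range(lo, hi + 1), repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if all(c(assignment) for c in constraints):
            return assignment
    return None

# x + y == 7 and x > 2*y and y >= 0: satisfiable
sat = solve(
    [lambda a: a["x"] + a["y"] == 7,
     lambda a: a["x"] > 2 * a["y"],
     lambda a: a["y"] >= 0],
    ["x", "y"],
)

# x > y and y > x: unsatisfiable over the integers
unsat = solve(
    [lambda a: a["x"] > a["y"], lambda a: a["y"] > a["x"]],
    ["x", "y"],
)
```

    A program verifier typically asks the dual question: it hands the solver the negation of a desired property, and an "unsat" answer means the property holds on every execution.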

  16. Keystroke dynamics in the pre-touchscreen era

    PubMed Central

    Ahmad, Nasir; Szymkowiak, Andrea; Campbell, Paul A.

    2013-01-01

    Biometric authentication seeks to measure an individual’s unique physiological attributes for the purpose of identity verification. Conventionally, this task has been realized via analyses of fingerprints or signature iris patterns. However, whilst such methods effectively offer a superior security protocol compared with password-based approaches for example, their substantial infrastructure costs, and intrusive nature, make them undesirable and indeed impractical for many scenarios. An alternative approach seeks to develop similarly robust screening protocols through analysis of typing patterns, formally known as keystroke dynamics. Here, keystroke analysis methodologies can utilize multiple variables, and a range of mathematical techniques, in order to extract individuals’ typing signatures. Such variables may include measurement of the period between key presses, and/or releases, or even key-strike pressures. Statistical methods, neural networks, and fuzzy logic have often formed the basis for quantitative analysis on the data gathered, typically from conventional computer keyboards. Extension to more recent technologies such as numerical keypads and touch-screen devices is in its infancy, but obviously important as such devices grow in popularity. Here, we review the state of knowledge pertaining to authentication via conventional keyboards with a view toward indicating how this platform of knowledge can be exploited and extended into the newly emergent type-based technological contexts. PMID:24391568

  17. Keystroke dynamics in the pre-touchscreen era.

    PubMed

    Ahmad, Nasir; Szymkowiak, Andrea; Campbell, Paul A

    2013-12-19

    Biometric authentication seeks to measure an individual's unique physiological attributes for the purpose of identity verification. Conventionally, this task has been realized via analyses of fingerprints or signature iris patterns. However, whilst such methods effectively offer a superior security protocol compared with password-based approaches for example, their substantial infrastructure costs, and intrusive nature, make them undesirable and indeed impractical for many scenarios. An alternative approach seeks to develop similarly robust screening protocols through analysis of typing patterns, formally known as keystroke dynamics. Here, keystroke analysis methodologies can utilize multiple variables, and a range of mathematical techniques, in order to extract individuals' typing signatures. Such variables may include measurement of the period between key presses, and/or releases, or even key-strike pressures. Statistical methods, neural networks, and fuzzy logic have often formed the basis for quantitative analysis on the data gathered, typically from conventional computer keyboards. Extension to more recent technologies such as numerical keypads and touch-screen devices is in its infancy, but obviously important as such devices grow in popularity. Here, we review the state of knowledge pertaining to authentication via conventional keyboards with a view toward indicating how this platform of knowledge can be exploited and extended into the newly emergent type-based technological contexts.
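
    The timing features at the heart of keystroke dynamics can be sketched directly. Below, dwell times (how long each key is held) and flight times (release to next press) form a typing signature, and verification thresholds the mean absolute deviation from an enrolled template; the event data, names, and threshold are fabricated for illustration:

```python
def features(events):
    """events: (key, press_ms, release_ms) tuples in typing order.
    Returns dwell times (hold duration per key) followed by flight
    times (previous release to next press)."""
    dwells = [release - press for _, press, release in events]
    flights = [events[i + 1][1] - events[i][2]
               for i in range(len(events) - 1)]
    return dwells + flights

def verify(template, sample, threshold=25.0):
    """Accept when the mean absolute deviation of the sample's timing
    features from the enrolled template is under a tuned threshold."""
    t, s = features(template), features(sample)
    mad = sum(abs(a - b) for a, b in zip(t, s)) / len(t)
    return mad < threshold

enrolled  = [("p", 0, 95), ("a", 140, 230), ("s", 270, 360), ("s", 400, 490)]
same_user = [("p", 0, 100), ("a", 150, 235), ("s", 280, 370), ("s", 410, 495)]
impostor  = [("p", 0, 60), ("a", 300, 350), ("s", 520, 580), ("s", 700, 790)]
```

    Production systems replace the single threshold with statistical models, neural networks, or fuzzy logic trained over many enrollment samples, as the review describes.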

  18. International Space Station Passive Thermal Control System Analysis, Top Ten Lessons-Learned

    NASA Technical Reports Server (NTRS)

    Iovine, John

    2011-01-01

    The International Space Station (ISS) has been on-orbit for over 10 years, and there have been numerous technical challenges along the way from design to assembly to on-orbit anomalies and repairs. The Passive Thermal Control System (PTCS) management team has been a key player in successfully dealing with these challenges. The PTCS team performs thermal analysis in support of design and verification, launch and assembly constraints, integration, sustaining engineering, failure response, and model validation. This analysis is a significant body of work and provides a unique opportunity to compile a wealth of real world engineering and analysis knowledge and the corresponding lessons-learned. The analysis lessons encompass the full life cycle of flight hardware from design to on-orbit performance and sustaining engineering. These lessons can provide significant insight for new projects and programs. Key areas to be presented include thermal model fidelity, verification methods, analysis uncertainty, and operations support.

  19. Category V Compliant Container for Mars Sample Return Missions

    NASA Technical Reports Server (NTRS)

    Dolgin, Benjamin; Sanok, Joseph; Sevilla, Donald; Bement, Laurence J.

    2000-01-01

    A novel containerization technique that satisfies Planetary Protection (PP) Category V requirements has been developed and demonstrated on the mock-up of the Mars Sample Return Container. The proposed approach uses explosive welding with a sacrificial layer and cut-through-the-seam techniques. The technology produces a container that is free from Martian contaminants on an atomic level. The containerization technique can be used on any celestial body that may support life. A major advantage of the proposed technology is the possibility of very fast (less than an hour) verification of both containment and cleanliness with typical metallurgical laboratory equipment. No separate biological verification is required. In addition to Category V requirements, the proposed container presents a surface that is clean from any, even nonviable organisms, and any molecular fragments of biological origin that are unique to Mars or any other celestial body other than Earth.

  20. An Empirical Verification of a-priori Learning Models on Mailing Archives in the Context of Online Learning Activities of Participants in Free/Libre Open Source Software (FLOSS) Communities

    ERIC Educational Resources Information Center

    Mukala, Patrick; Cerone, Antonio; Turini, Franco

    2017-01-01

    Free/Libre Open Source Software (FLOSS) environments are increasingly dubbed as learning environments where practical software engineering skills can be acquired. Numerous studies have extensively investigated how knowledge is acquired in these environments through a collaborative learning model that defines a learning process. Such a learning…

  1. Sensorimotor simulations underlie conceptual representations: modality-specific effects of prior activation.

    PubMed

    Pecher, Diane; Zeelenberg, René; Barsalou, Lawrence W

    2004-02-01

    According to the perceptual symbols theory (Barsalou, 1999), sensorimotor simulations underlie the representation of concepts. Simulations are componential in the sense that they vary with the context in which the concept is presented. In the present study, we investigated whether representations are affected by recent experiences with a concept. Concept names (e.g., APPLE) were presented twice in a property verification task with a different property on each occasion. The two properties were either from the same perceptual modality (e.g., green, shiny) or from different modalities (e.g., tart, shiny). All stimuli were words. There was a lag of several intervening trials between the first and second presentation. Verification times and error rates for the second presentation of the concept were higher if the properties were from different modalities than if they were from the same modality.

  2. Quality dependent fusion of intramodal and multimodal biometric experts

    NASA Astrophysics Data System (ADS)

    Kittler, J.; Poh, N.; Fatukasi, O.; Messer, K.; Kryszczuk, K.; Richiardi, J.; Drygajlo, A.

    2007-04-01

    We address the problem of score-level fusion of intramodal and multimodal experts in the context of biometric identity verification. We investigate the merits of confidence-based weighting of component experts. In contrast to the conventional approach, where confidence values are derived from scores, we instead use raw measures of biometric data quality to control the influence of each expert on the final fused score. We show that quality-based fusion gives better performance than quality-free fusion. The use of quality-weighted scores as features in the definition of the fusion functions leads to further improvements. We demonstrate that the achievable performance gain is also affected by the choice of fusion architecture. The evaluation of the proposed methodology involves six face experts and one speech verification expert. It is carried out on the XM2VTS database.
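
    In its simplest fixed (untrained) form, quality-based fusion weights each expert's score by a raw quality measure of its input sample, so that, for example, a score from a blurred face image counts for less. The numbers below are illustrative, not from the XM2VTS experiments:

```python
def fuse_mean(scores):
    """Quality-free baseline: plain average of the expert scores."""
    return sum(scores) / len(scores)

def fuse_quality(scores, qualities):
    """Quality-weighted fusion: each expert's score is weighted by a raw
    quality measure of its input sample, so degraded samples count less."""
    return sum(s * q for s, q in zip(scores, qualities)) / sum(qualities)

# One speech expert and two face experts for a genuine claim; the second
# face image is blurred, so its score collapsed and its quality is low.
scores = [0.9, 0.85, 0.2]
qualities = [1.0, 0.9, 0.1]
```

    Here the quality-weighted fused score stays close to the two trustworthy experts instead of being dragged down by the degraded sample; the paper goes further by feeding quality-weighted scores into trained fusion functions.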

  3. Human body as a set of biometric features identified by means of optoelectronics

    NASA Astrophysics Data System (ADS)

    Podbielska, Halina; Bauer, Joanna

    2005-09-01

    The human body possesses many unique, singular features that are impossible to copy or forge. Nowadays, establishing and ensuring public security requires specially designed devices and systems. Biometrics is a field of science and technology that exploits human body characteristics for people recognition, identifying the most characteristic and unique features in order to design and construct systems capable of recognizing people. This paper gives an overview of achievements in biometrics. The verification and identification processes are explained, along with the way biometric recognition systems are evaluated. The human biometrics most frequently used in practice are briefly presented, including fingerprints, facial imaging (including thermal characteristics), hand geometry and iris patterns.
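
    The evaluation of a biometric recognition system is conventionally reported as false acceptance and false rejection rates at a chosen decision threshold. A minimal sketch with made-up match scores:

```python
def far(impostor_scores, threshold):
    """False acceptance rate: impostor attempts scoring at or above
    the decision threshold."""
    return sum(s >= threshold for s in impostor_scores) / len(impostor_scores)

def frr(genuine_scores, threshold):
    """False rejection rate: genuine attempts scoring below it."""
    return sum(s < threshold for s in genuine_scores) / len(genuine_scores)

genuine = [0.91, 0.85, 0.78, 0.95, 0.60]   # match scores, genuine users
impostor = [0.10, 0.35, 0.55, 0.20, 0.42]  # match scores, impostors
# Raising the threshold trades false acceptances for false rejections;
# systems are often compared at the point where the two rates are equal.
```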

  4. Capturing Safety Requirements to Enable Effective Task Allocation Between Humans and Automaton in Increasingly Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Neogi, Natasha A.

    2016-01-01

    There is a current drive towards enabling the deployment of increasingly autonomous systems in the National Airspace System (NAS). However, shifting the traditional roles and responsibilities between humans and automation for safety critical tasks must be managed carefully, otherwise the current emergent safety properties of the NAS may be disrupted. In this paper, a verification activity to assess the emergent safety properties of a clearly defined, safety critical, operational scenario that possesses tasks that can be fluidly allocated between human and automated agents is conducted. Task allocation role sets were proposed for a human-automation team performing a contingency maneuver in a reduced crew context. A safety critical contingency procedure (engine out on takeoff) was modeled in the Soar cognitive architecture, then translated into the Hybrid Input Output formalism. Verification activities were then performed to determine whether or not the safety properties held over the increasingly autonomous system. The verification activities lead to the development of several key insights regarding the implicit assumptions on agent capability. It subsequently illustrated the usefulness of task annotations associated with specialized requirements (e.g., communication, timing etc.), and demonstrated the feasibility of this approach.
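
    The kind of check such a verification activity performs can be sketched as explicit-state reachability: explore every state the human-automation team can reach and confirm that no unsafe state is among them. The engine-out state machine below is a made-up abstraction for illustration, not the paper's Soar or Hybrid Input Output model:

```python
from collections import deque

# Hypothetical abstraction of an engine-out-on-takeoff procedure with
# tasks shared between crew and automation (illustrative states only).
transitions = {
    "takeoff_roll": ["engine_out", "climb"],
    "engine_out": ["crew_ack", "unacked_engine_out"],
    "unacked_engine_out": ["crew_ack"],     # automation re-alerts the crew
    "crew_ack": ["auto_trim", "manual_trim"],
    "auto_trim": ["safe_climb"],
    "manual_trim": ["safe_climb"],
    "climb": ["safe_climb"],
    "safe_climb": [],
}

def reachable(start, graph):
    """Breadth-first exploration of every state reachable from `start`."""
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in graph[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def safety_holds(start, graph, bad_states):
    """The safety property holds iff no bad state is reachable."""
    return reachable(start, graph).isdisjoint(bad_states)
```

    Reallocating a task between human and automation amounts to editing the transition structure; re-running the same reachability check then shows whether the emergent safety property survives the new role set.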

  5. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data.

    PubMed

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-07-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users.

  6. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data

    PubMed Central

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-01-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users. PMID:20501601

  7. Expert system verification and validation study. Delivery 1: Survey and interview questions

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The NASA funded questionnaire is presented to help define the state-of-the-practice in the formal evaluation of Expert Systems on current NASA and industry applications. The answers to this questionnaire, together with follow-up interviews, will provide realistic answers to the following questions: (1) How much evaluation is being performed; (2) What evaluation techniques are in use; and (3) What, if any, are the unique issues in evaluating Expert Systems.

  8. Research of aerohydrodynamic and aeroelastic processes on PNRPU HPC system

    NASA Astrophysics Data System (ADS)

    Modorskii, V. Ya.; Shevelev, N. A.

    2016-10-01

    Research into aerohydrodynamic and aeroelastic processes with the High Performance Computing Complex at PNRPU is actively conducted within the university's priority development area, "Aviation engine and gas turbine technology". Work is carried out in two directions: the development and use of domestic software, and the use of well-known foreign licensed application software packages. In addition, a third direction is being developed, associated with the verification of computational experiments through physical modeling on unique proprietary experimental installations.

  9. System and method for simultaneously collecting serial number information from numerous identity tags

    DOEpatents

    Doty, Michael A.

    1997-01-01

    A system and method for simultaneously collecting serial number information reports from numerous colliding coded-radio-frequency identity tags. Each tag has a unique multi-digit serial number that is stored in non-volatile RAM. A reader transmits an ASCII-coded "D" character on a carrier of about 900 MHz and a power illumination field having a frequency of about 1.6 GHz. A one MHz tone is modulated on the 1.6 GHz carrier as a timing clock for a microprocessor in each of the identity tags. Over a thousand such tags may be in the vicinity, and each is powered up and clocked by the 1.6 GHz power illumination field. Each identity tag looks for the "D" interrogator modulated on the 900 MHz carrier, and each uses a digit of its serial number to time a response. Clear responses received by the reader are repeated for verification. If no verification or a wrong number is received by any identity tag, it uses a second digit together with the first to time out a more extended period for response. Ultimately, the entire serial number will be used in the worst-case collision environments; and since the serial numbers are defined as being unique, the final possibility will be successful because a clear time-slot channel will be available.

  10. System and method for simultaneously collecting serial number information from numerous identity tags

    DOEpatents

    Doty, M.A.

    1997-01-07

    A system and method are disclosed for simultaneously collecting serial number information reports from numerous colliding coded-radio-frequency identity tags. Each tag has a unique multi-digit serial number that is stored in non-volatile RAM. A reader transmits an ASCII-coded ``D`` character on a carrier of about 900 MHz and a power illumination field having a frequency of about 1.6 GHz. A one MHz tone is modulated on the 1.6 GHz carrier as a timing clock for a microprocessor in each of the identity tags. Over a thousand such tags may be in the vicinity, and each is powered up and clocked by the 1.6 GHz power illumination field. Each identity tag looks for the ``D`` interrogator modulated on the 900 MHz carrier, and each uses a digit of its serial number to time a response. Clear responses received by the reader are repeated for verification. If no verification or a wrong number is received by any identity tag, it uses a second digit together with the first to time out a more extended period for response. Ultimately, the entire serial number will be used in the worst-case collision environments; and since the serial numbers are defined as being unique, the final possibility will be successful because a clear time-slot channel will be available. 5 figs.
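    The digit-timed anti-collision scheme in the two patent records above can be sketched in a few lines: each tag times its reply using a prefix of its unique serial number, and tags that collide extend the prefix (ultimately to the full serial number) until each reply lands in a clear time slot. Radio timing and the reader's verification echo are abstracted into slot bookkeeping, and the serial numbers are invented:

```python
# Toy simulation of the digit-timed anti-collision scheme: tags reply in a
# time slot derived from a prefix of their unique serial number; colliding
# tags lengthen the prefix on the next round. Serial numbers are invented.

def resolve(serials):
    """Return {serial: prefix length needed} for every tag to report clearly."""
    resolved, pending, depth = {}, list(serials), 1
    while pending:
        slots = {}
        for s in pending:                        # reply slot from a serial prefix
            slots.setdefault(s[:depth], []).append(s)
        pending = []
        for tags in slots.values():
            if len(tags) == 1:
                resolved[tags[0]] = depth        # clear slot: reader verifies it
            else:
                pending.extend(tags)             # collision: lengthen the prefix
        depth += 1
    return resolved

print(resolve(["4217", "4305", "4388", "9001"]))
# "9001" reports in round 1; "4305"/"4388" need three digits to separate
```

    Because the serial numbers are unique, the loop always terminates, mirroring the patent's worst-case argument that the full serial number guarantees a clear time-slot channel.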

  11. WE-EF-303-10: Single- Detector Proton Radiography as a Portal Imaging Equivalent for Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doolan, P; Bentefour, E; Testa, M

    2015-06-15

    Purpose: In proton therapy, patient alignment is of critical importance due to the sensitivity of the proton range to tissue heterogeneities. Traditionally, proton radiography is used for verification of the water-equivalent path length (WEPL), which dictates the depth protons reach. In this work we propose its use for alignment. Additionally, many new proton centers have cone-beam computed tomography in place of beamline X-ray imaging, so proton radiography offers a unique patient alignment verification similar to portal imaging in photon therapy. Method: Proton radiographs of a CIRS head phantom were acquired using the Beam Imaging System (BIS) (IBA, Louvain-la-Neuve) in a horizontal beamline. A scattered beam was produced using a small, dedicated range modulator (RM) wheel fabricated out of aluminum. The RM wheel was rotated slowly (20 sec/rev) using a stepper motor to compensate for the frame rate of the BIS (120 ms). Dose rate functions (DRFs) over two RM wheel rotations were acquired. Calibration was made with known thicknesses of homogeneous solid water. For each pixel, the time width, skewness and kurtosis of the DRFs were computed. The time width was used to compute the object WEPL. In the heterogeneous phantom, the excess skewness and excess kurtosis (i.e., the difference from the homogeneous cases) were computed and assessed for suitability for patient setup. Results: The technique allowed for the simultaneous production of images that can be used for WEPL verification, showing few internal details, and excess skewness and kurtosis images that can be used for soft-tissue alignment. These latter images highlight areas where range mixing has occurred, correlating with phantom heterogeneities. Conclusion: The excess skewness and kurtosis images contain details that are not visible in the WET images. These images, unique to the time-resolved proton radiographic method, could be used for patient setup according to soft tissues.
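    The per-pixel moment analysis described in this abstract can be sketched as follows: each pixel's DRF over one modulator-wheel cycle is treated as a distribution in time, and its width, skewness and kurtosis are computed, with "excess" values taken as differences from a homogeneous calibration case. All curves below are synthetic stand-ins, not measured DRFs:

```python
import numpy as np

# Sketch of the per-pixel dose-rate-function (DRF) analysis: width, skewness
# and kurtosis of a DRF treated as a time distribution; "excess" values are
# differences from a homogeneous reference. All curves here are synthetic.

def drf_moments(t, drf):
    """Return (time width, skewness, excess kurtosis) of a sampled DRF."""
    w = drf / drf.sum()                          # normalize to a distribution
    mean = (w * t).sum()
    std = np.sqrt((w * (t - mean) ** 2).sum())
    skew = (w * ((t - mean) / std) ** 3).sum()
    kurt = (w * ((t - mean) / std) ** 4).sum() - 3.0
    return std, skew, kurt

t = np.linspace(0.0, 1.0, 200)                   # one RM wheel rotation (arb. units)
homogeneous = np.exp(-0.5 * ((t - 0.5) / 0.08) ** 2)       # symmetric reference DRF
mixed = homogeneous + 0.3 * np.exp(-0.5 * ((t - 0.75) / 0.04) ** 2)  # range-mixing lobe

ref = drf_moments(t, homogeneous)
pix = drf_moments(t, mixed)
excess_skew, excess_kurt = pix[1] - ref[1], pix[2] - ref[2]
print(excess_skew, excess_kurt)                  # nonzero where ranges mix
```

    The asymmetric secondary lobe, standing in for range mixing at a heterogeneity, widens the DRF and produces a nonzero excess skewness, which is the quantity the abstract maps per pixel for soft-tissue alignment.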

  12. Crewed Space Vehicle Battery Safety Requirements

    NASA Technical Reports Server (NTRS)

    Jeevarajan, Judith A.; Darcy, Eric C.

    2014-01-01

    This requirements document is applicable to all batteries on crewed spacecraft, including vehicle, payload, and crew equipment batteries. It defines the specific provisions required to design a battery that is safe for ground personnel and crew members to handle and/or operate during all applicable phases of crewed missions, safe for use in the enclosed environment of a crewed space vehicle, and safe for use in launch vehicles, as well as in unpressurized spaces adjacent to the habitable portion of a space vehicle. The required provisions encompass hazard controls, design evaluation, and verification. The extent of the hazard controls and verification required depends on the applicability and credibility of the hazard to the specific battery design and applicable missions under review. Evaluation of the design and verification program results shall be completed prior to certification for flight and ground operations. This requirements document is geared toward the designers of battery systems to be used in crewed vehicles, crew equipment, crew suits, or batteries to be used in crewed vehicle systems and payloads (or experiments). This requirements document also applies to ground handling and testing of flight batteries. Specific design and verification requirements for a battery are dependent upon the battery chemistry, capacity, complexity, charging, environment, and application. The variety of battery chemistries available, combined with the variety of battery-powered applications, results in each battery application having specific, unique requirements pertinent to the specific battery application. However, there are basic requirements for all battery designs and applications, which are listed in section 4. Section 5 includes a description of hazards and controls and also includes requirements.

  13. Computer Security Models

    DTIC Science & Technology

    1984-09-01

    Report MTR9S31, September 1984. Authors: J. K. Millen, C. M. Cerniglia. Contract sponsor: OUSDRE/C31 & ESO/ALEE. Cited reference: "Verification Technique for a Class of Security Kernels," International Symposium on Programming, Lecture Notes in Computer Science 137, Springer-Verlag, New York. Abstract: The purpose of this report is to provide a basis for evaluating security models in the context of secure computer system development.

  14. Motivation Matters: Lessons for REDD+ Participatory Measurement, Reporting and Verification from Three Decades of Child Health Participatory Monitoring in Indonesia.

    PubMed

    Ekowati, Dian; Hofstee, Carola; Praputra, Andhika Vega; Sheil, Douglas

    2016-01-01

    Participatory Measurement, Reporting and Verification (PMRV), in the context of reducing emissions from deforestation and forest degradation with its co-benefits (REDD+) requires sustained monitoring and reporting by community members. This requirement appears challenging and has yet to be achieved. Other successful, long established, community self-monitoring and reporting systems may provide valuable lessons. The Indonesian integrated village healthcare program (Posyandu) was initiated in the 1980s and still provides effective and successful participatory measurement and reporting of child health status across the diverse, and often remote, communities of Indonesia. Posyandu activities focus on the growth and development of children under the age of five by recording their height and weight and reporting these monthly to the Ministry of Health. Here we focus on the local Posyandu personnel (kaders) and their motivations and incentives for contributing. While Posyandu and REDD+ measurement and reporting activities differ, there are sufficient commonalities to draw useful lessons. We find that the Posyandu kaders are motivated by their interests in health care, by their belief that it benefits the community, and by encouragement by local leaders. Recognition from the community, status within the system, training opportunities, competition among communities, and small payments provide incentives to sustain participation. We examine these lessons in the context of REDD+.

  15. Quantum supremacy in constant-time measurement-based computation: A unified architecture for sampling and verification

    NASA Astrophysics Data System (ADS)

    Miller, Jacob; Sanders, Stephen; Miyake, Akimasa

    2017-12-01

    While quantum speed-up in solving certain decision problems by a fault-tolerant universal quantum computer has been promised, a timely research interest includes how far one can reduce the resource requirement to demonstrate a provable advantage in quantum devices without demanding quantum error correction, which is crucial for prolonging the coherence time of qubits. We propose a model device made of locally interacting multiple qubits, designed such that simultaneous single-qubit measurements on it can output probability distributions whose average-case sampling is classically intractable, under similar assumptions as the sampling of noninteracting bosons and instantaneous quantum circuits. Notably, in contrast to these previous unitary-based realizations, our measurement-based implementation has two distinctive features. (i) Our implementation involves no adaptation of measurement bases, leading output probability distributions to be generated in constant time, independent of the system size. Thus, it could be implemented in principle without quantum error correction. (ii) Verifying the classical intractability of our sampling is done by changing the Pauli measurement bases only at certain output qubits. Our usage of random commuting quantum circuits in place of computationally universal circuits allows a unique unification of sampling and verification, so they require the same physical resource requirements in contrast to the more demanding verification protocols seen elsewhere in the literature.

  16. Hylemetry versus Biometry: a new method to certificate the lithography authenticity

    NASA Astrophysics Data System (ADS)

    Schirripa Spagnolo, Giuseppe; Cozzella, Lorenzo; Simonetti, Carla

    2011-06-01

    When we buy an artwork, a certificate of authenticity contains specific details about it. Unfortunately, these certificates are often exchanged between similar artworks: the seller supplies a copy of an original certificate to attest that a non-original artwork is an original one. A solution to this problem is a mechanism that links the certificate to one specific artwork. This requires finding unique, unrepeatable and unchangeable characteristics of the single artwork. In this paper we propose a new lithography certification scheme based on the distribution of the color spots that compose the lithography itself. With the high-resolution acquisition media available today, analysis methods typical of speckle metrology can be applied. In the verification phase it is only necessary to acquire the same portion of the lithography, extract the verification information, use the private key to obtain the corresponding information from the certificate, and compare the two using a comparison threshold. Because the acquired image may be rotated and translated with respect to the reference, image-correlation techniques from speckle metrology are applied to estimate and correct the rotation and translation errors, so that the extracted and acquired images can be compared under the best conditions, guaranteeing correct originality verification.
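    The translation-recovery step described in this abstract is commonly done in speckle metrology with FFT phase correlation; a minimal sketch under that assumption, with rotation handling and the cryptographic comparison against the certificate omitted and a synthetic random pattern standing in for the color-spot patch:

```python
import numpy as np

# Sketch of translation recovery via FFT phase correlation, as used in
# speckle metrology. Rotation correction and certificate comparison are
# omitted; the "lithography patch" is a synthetic random pattern.

def phase_correlation_shift(ref, img):
    """Estimate the cyclic (row, col) shift that maps ref onto img."""
    F_ref, F_img = np.fft.fft2(ref), np.fft.fft2(img)
    cross = np.conj(F_ref) * F_img
    cross /= np.abs(cross) + 1e-12               # keep phase information only
    peak = np.unravel_index(np.argmax(np.fft.ifft2(cross).real), ref.shape)
    # Peaks past the array midpoint correspond to negative shifts.
    return tuple(int(p) - s if p > s // 2 else int(p)
                 for p, s in zip(peak, ref.shape))

rng = np.random.default_rng(0)
ref = rng.standard_normal((64, 64))              # stand-in for a color-spot patch
img = np.roll(ref, (3, -5), axis=(0, 1))         # acquired patch, cyclically shifted
print(phase_correlation_shift(ref, img))         # → (3, -5)
```

    Once the shift (and, in the full method, the rotation) is removed, the registered images can be compared against the threshold described in the abstract.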

  17. Precision cleaning verification of fluid components by air/water impingement and total carbon analysis

    NASA Technical Reports Server (NTRS)

    Barile, Ronald G.; Fogarty, Chris; Cantrell, Chris; Melton, Gregory S.

    1994-01-01

    NASA personnel at Kennedy Space Center's Material Science Laboratory have developed new environmentally sound precision cleaning and verification techniques for systems and components found at the center. This technology is required to replace existing methods traditionally employing CFC-113. The new patent-pending technique of precision cleaning verification is for large components of cryogenic fluid systems. These are stainless steel, sand cast valve bodies with internal surface areas ranging from 0.2 to 0.9 sq m. Extrapolation of this technique to components of even larger sizes (by orders of magnitude) is planned. Currently, the verification process is completely manual. In the new technique, a high velocity, low volume water stream impacts the part to be verified. This process is referred to as Breathing Air/Water Impingement and forms the basis for the Impingement Verification System (IVS). The system is unique in that a gas stream is used to accelerate the water droplets to high speeds. Water is injected into the gas stream in a small, continuous amount. The air/water mixture is then passed through a converging/diverging nozzle where the gas is accelerated to supersonic velocities. These droplets impart sufficient energy to the precision cleaned surface to place non-volatile residue (NVR) contaminants into suspension in the water. The sample water is collected and its NVR level is determined by total organic carbon (TOC) analysis at 880 C. The TOC, in ppm carbon, is used to establish the NVR level. A correlation between the present gravimetric CFC-113 NVR and the IVS NVR is found from experimental sensitivity factors measured for various contaminants. The sensitivity has the units of ppm of carbon per mg/sq ft of contaminant. In this paper, the equipment is described and data are presented showing the development of the sensitivity factors from a test set including four NVRs impinged from witness plates of 0.05 to 0.75 sq m.

  18. Precision Cleaning Verification of Fluid Components by Air/Water Impingement and Total Carbon Analysis

    NASA Technical Reports Server (NTRS)

    Barile, Ronald G.; Fogarty, Chris; Cantrell, Chris; Melton, Gregory S.

    1995-01-01

    NASA personnel at Kennedy Space Center's Material Science Laboratory have developed new environmentally sound precision cleaning and verification techniques for systems and components found at the center. This technology is required to replace existing methods traditionally employing CFC-113. The new patent-pending technique of precision cleaning verification is for large components of cryogenic fluid systems. These are stainless steel, sand cast valve bodies with internal surface areas ranging from 0.2 to 0.9 m(exp 2). Extrapolation of this technique to components of even larger sizes (by orders of magnitude) is planned. Currently, the verification process is completely manual. In the new technique, a high velocity, low volume water stream impacts the part to be verified. This process is referred to as Breathing Air/Water Impingement and forms the basis for the Impingement Verification System (IVS). The system is unique in that a gas stream is used to accelerate the water droplets to high speeds. Water is injected into the gas stream in a small, continuous amount. The air/water mixture is then passed through a converging-diverging nozzle where the gas is accelerated to supersonic velocities. These droplets impart sufficient energy to the precision cleaned surface to place non-volatile residue (NVR) contaminants into suspension in the water. The sample water is collected and its NVR level is determined by total organic carbon (TOC) analysis at 880 C. The TOC, in ppm carbon, is used to establish the NVR level. A correlation between the present gravimetric CFC-113 NVR and the IVS NVR is found from experimental sensitivity factors measured for various contaminants. The sensitivity has the units of ppm of carbon per mg/ft(exp 2) of contaminant. In this paper, the equipment is described and data are presented showing the development of the sensitivity factors from a test set including four NVRs impinged from witness plates of 0.05 to 0.75 m(exp 2).
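    The sensitivity-factor correlation in the two records above implies a simple conversion: sensitivity has units of ppm carbon per (mg/sq ft) of contaminant, so the NVR level is the measured TOC divided by the contaminant's sensitivity. A minimal sketch, in which the factor values are hypothetical placeholders rather than data from the paper:

```python
# Sketch of the TOC-to-NVR conversion implied by the sensitivity factors:
# sensitivity is in ppm carbon per (mg/sq ft) of contaminant, so
# NVR = TOC / sensitivity. Factor values below are hypothetical.

SENSITIVITY = {                  # ppm C per mg/sq ft (illustrative values)
    "hydraulic_oil": 2.5,
    "grease": 1.8,
}

def nvr_level(toc_ppm, contaminant):
    """Convert a TOC reading (ppm carbon) to an NVR level in mg/sq ft."""
    return toc_ppm / SENSITIVITY[contaminant]

print(nvr_level(5.0, "hydraulic_oil"))   # → 2.0 mg/sq ft
```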

  19. Effect of study context on item recollection.

    PubMed

    Skinner, Erin I; Fernandes, Myra A

    2010-07-01

    We examined how visual context information provided during encoding, and unrelated to the target word, affected later recollection for words presented alone using a remember-know paradigm. Experiments 1A and 1B showed that participants had better overall memory (specifically, recollection) for words studied with pictures of intact faces than for words studied with pictures of scrambled or inverted faces. Experiment 2 replicated these results and showed that recollection was higher for words studied with pictures of faces than when no image accompanied the study word. In Experiment 3 participants showed equivalent memory for words studied with unique faces as for those studied with a repeatedly presented face. Results suggest that recollection benefits when visual context information high in meaningful content accompanies study words and that this benefit is not related to the uniqueness of the context. We suggest that participants use elaborative processes to integrate item and meaningful contexts into ensemble information, improving subsequent item recollection.

  20. A Framework for Performing Verification and Validation in Reuse Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1997-01-01

    Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  1. Security Police Career Ladders AFSCs 811X0, 811X2, and 811X2A.

    DTIC Science & Technology

    1984-11-01

    MONITORS (GRP658): PERCENT MEMBERS PERFORMING TASKS (N=186)
    J424 PERFORM SPCDS OPERATOR REACTIONS TO SENSOR ALARM, LINE FAULT, OR UNIQUE LINE FAULT ... MESSAGES: 96
    J426 PERFORM SPCDS VERIFICATION PROCEDURES: 96
    J423 PERFORM SMALL PERMANENT COMMUNICATIONS DISPLAY SEGMENT (SPCDS) SHUT-DOWN PROCEDURES: 92
    J425 PERFORM SPCDS START-UP PROCEDURES: 91
    J419 PERFORM BISS OPERATOR REACTION TO PRIME POWER LOSS OR SEVERE WEATHER WARNINGS: 91
    E192 MAKE ENTRIES ON AF ...

  2. Is it Code Imperfection or 'Garbage In, Garbage Out'? Outline of Experiences from a Comprehensive ADR Code Verification

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F. A.

    2013-12-01

    The ADR (advection-diffusion-reaction) equation describes many physical phenomena of interest in the field of water quality in natural streams and groundwater. In many cases, such as density-driven flow, multiphase reactive transport, and sediment transport, one or more terms in the ADR equation become nonlinear, so numerical tools are the only practical choice for solving these PDEs. All numerical solvers developed for the transport equation must undergo a code verification procedure before they are put into practice. Code verification is a mathematical activity to uncover failures and check for rigorous discretization of PDEs and implementation of initial/boundary conditions. In computational PDE work, verification is not a well-defined procedure with a clear path. Verification tests should therefore be designed and implemented with in-depth knowledge of the numerical algorithms and the physics of the phenomena, as well as the mathematical behavior of the solution; even test results need to be analyzed mathematically to distinguish between an inherent limitation of an algorithm and a coding error. Code verification thus remains something of an art, in which innovative methods and case-based tricks are common. This study presents the full verification of a general transport code. To that end, a complete test suite is designed to probe the ADR solver comprehensively and discover all possible imperfections, and we convey our experiences in finding several errors that were not detectable with routine verification techniques. We developed a test suite including hundreds of unit tests and system tests, with a gradual increment in complexity from simple tests up to the most sophisticated level.
    Appropriate verification metrics are defined for the required capabilities of the solver: mass conservation, convergence order, capability in handling stiff problems, non-negative concentration, shape preservation, and absence of spurious wiggles. We thereby provide objective, quantitative values, as opposed to subjective qualitative descriptions such as 'weak' or 'satisfactory' agreement with those metrics. Testing starts from a simple case of unidirectional advection, proceeds to bidirectional advection and tidal flow, and builds up to nonlinear cases; tests are designed to check nonlinearity in velocity, dispersivity and reactions. For all of these cases we conduct mesh-convergence tests, which compare the observed order of accuracy of the results with the formal order of accuracy of the discretization. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh-convergence study, and appropriate remedies, are also discussed. For cases in which appropriate benchmarks for a mesh-convergence study are not available, we utilize symmetry, complete Richardson extrapolation, and the method of false injection to uncover bugs. Detailed discussions of the capabilities of these code verification techniques are given. Auxiliary subroutines for automation of the test suite and report generation were designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into the ADR solver's capabilities. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport.
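    The mesh-convergence metric at the heart of such verification suites can be sketched in a few lines: the observed order of accuracy between two grid levels is p = log(e_coarse/e_fine) / log(r), to be compared against the scheme's formal order. The error values below are synthetic, not from the study:

```python
import math

# Sketch of the observed-order-of-accuracy test used in mesh-convergence
# studies: p = log(e_coarse/e_fine) / log(refinement ratio), compared with
# the formal order of the discretization. Error values are synthetic.

def observed_order(e_coarse, e_fine, refinement_ratio=2.0):
    """Observed order of accuracy between two grid levels."""
    return math.log(e_coarse / e_fine) / math.log(refinement_ratio)

# Errors shrinking by 4x per mesh halving indicate second-order behaviour.
errors = [1.0e-2, 2.5e-3, 6.25e-4]
orders = [observed_order(errors[i], errors[i + 1]) for i in range(len(errors) - 1)]
print(orders)
```

    An observed order that falls well below the formal order on refined grids is exactly the kind of symptom the abstract describes, although (as it notes) Peclet or Damkohler scale effects can conceal the asymptotic rate unless the grids are fine enough.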

  3. Combining Model-driven and Schema-based Program Synthesis

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Whittle, John

    2004-01-01

    We describe ongoing work which aims to extend the schema-based program synthesis paradigm with explicit models. In this context, schemas can be considered as model-to-model transformations. The combination of schemas with explicit models offers a number of advantages, namely, that building synthesis systems becomes much easier since the models can be used in verification and in adaptation of the synthesis systems. We illustrate our approach using an example from signal processing.

  4. Optimization of Multimedia English Teaching in Context Creation

    ERIC Educational Resources Information Center

    Yang, Weiyan; Fang, Fan

    2008-01-01

    Using multimedia to create a context in which to teach English has unique advantages. This paper explores the characteristics of multimedia and how to use them to optimize the context of English teaching. Eight principles are put forward, specifically Systematization, Authenticity, Appropriateness, Interactivity,…

  5. WORDGRAPH: Keyword-in-Context Visualization for NETSPEAK's Wildcard Search.

    PubMed

    Riehmann, Patrick; Gruendl, Henning; Potthast, Martin; Trenkmann, Martin; Stein, Benno; Froehlich, Benno

    2012-09-01

    The WORDGRAPH helps writers in visually choosing phrases while writing a text. It checks for the commonness of phrases and allows for the retrieval of alternatives by means of wildcard queries. To support such queries, we implement a scalable retrieval engine, which returns high-quality results within milliseconds using a probabilistic retrieval strategy. The results are displayed as a WORDGRAPH visualization or as a textual list. The graphical interface provides an effective means for interactive exploration of search results using filter techniques, query expansion, and navigation. Our observations indicate that, of three investigated retrieval tasks, the textual interface is sufficient for the phrase verification task, whereas both interfaces support context-sensitive word choice, and the WORDGRAPH best supports the exploration of a phrase's context or the underlying corpus. Our user study confirms these observations and shows that the WORDGRAPH is generally preferred over the textual result list for queries containing multiple wildcards.
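    The wildcard phrase queries described in this abstract can be illustrated with a toy lookup in which '*' stands for exactly one word and candidate phrases are ranked by an occurrence count; the corpus and counts are invented, and this is not the probabilistic retrieval engine of the actual system:

```python
import re

# Toy wildcard phrase lookup: '*' matches exactly one word; matching
# phrases are returned most common first. Corpus and counts are invented.

CORPUS = {                       # phrase -> occurrence count (illustrative)
    "waiting for the results": 920,
    "waiting for the response": 310,
    "waiting for the train": 45,
}

def wildcard_query(pattern):
    """Return matching phrases, most common first; '*' matches one word."""
    rx = re.compile("^" + r"\s".join(
        r"\S+" if tok == "*" else re.escape(tok)
        for tok in pattern.split()) + "$")
    hits = [(count, phrase) for phrase, count in CORPUS.items() if rx.match(phrase)]
    return [phrase for count, phrase in sorted(hits, reverse=True)]

print(wildcard_query("waiting for the *"))
```

    Ranking matches by commonness is what lets a writer use such a query to check which continuation of a phrase is idiomatic.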

  6. Building the Qualification File of EGNOS with DOORS

    NASA Astrophysics Data System (ADS)

    Fabre, J.

    2008-08-01

    EGNOS, the European Satellite-Based Augmentation System (SBAS) to GPS, is approaching its final deployment and is being initially operated towards qualification and certification, to reach operational capability by 2008/2009. A very important milestone in the development process is the System Qualification Review (QR). As the verification phase aims at demonstrating that the EGNOS system design meets the applicable requirements, the QR declares the completion of verification activities. The main document to present at the QR is a consolidated, consistent and complete Qualification File. The information included shall give the QR reviewers confidence that the performed qualification activities are complete. An important issue for the project team is therefore to focus on synthetic and consistent information, and to make the presentation as clear as possible. Traceability to applicable requirements shall be systematically presented. Moreover, in order to support verification justification, references to details shall be available, and the reviewer shall have the possibility to link automatically to the documents containing this detailed information. In that frame, Thales Alenia Space has implemented strong support in terms of methodology and tooling, providing the System Engineering and Verification teams with a single reference technical database in which all team members consult the applicable requirements, compliance statements, justifications and design data, and record the information necessary to build the final Qualification File. This paper presents the EGNOS context, the Qualification File contents, and the methodology implemented, based on Thales Alenia Space practices and in line with ECSS. Finally, it shows how the Qualification File is built in a DOORS environment.

  7. On the assessment of biological life support system operation range

    NASA Astrophysics Data System (ADS)

    Bartsev, Sergey

    Biological life support systems (BLSS) can be used in long-term space missions only if a well-thought-out assessment of the allowable operating range is available. The range has to account for both the permissible working parameters of the BLSS and the critical level of perturbations of its stationary state. A direct approach, outlining the range by statistical treatment of experimental data on BLSS destruction, does not seem applicable for ethical, economic, and time-saving reasons. A mathematical model is the only tool for generalizing experimental data and extrapolating the revealed regularities beyond empirical experience. The problem is that the quality of extrapolation depends on the adequacy of the corresponding model verification, but good verification requires a wide range of experimental data for fitting, which is not achievable for manned experimental BLSS. A possible way to improve the extrapolation quality of inevitably poorly verified models of manned BLSS is to extrapolate the general tendency obtained from theoretical and experimental investigations of unmanned LSS. Possibilities and limitations of this approach are discussed.

  8. In vivo proton range verification: a review

    NASA Astrophysics Data System (ADS)

    Knopf, Antje-Christin; Lomax, Antony

    2013-08-01

    Protons are an interesting modality for radiotherapy because of their well-defined range and favourable depth-dose characteristics. On the other hand, these same characteristics lead to added uncertainties in their delivery. This is particularly the case at the distal end of proton dose distributions, where the dose gradient can be extremely steep. In practice, however, this gradient is rarely used to spare critical normal tissues because of worries about its exact position in the patient. Reasons for this uncertainty are inaccuracies and the non-uniqueness of the calibration from CT Hounsfield units to proton stopping powers, imaging artefacts (e.g. due to metal implants) and anatomical changes of the patient during treatment. In order to improve the precision of proton therapy, it would therefore be extremely desirable to verify the proton range in vivo, either prior to, during, or after therapy. In this review, we describe and compare state-of-the-art in vivo proton range verification methods currently being proposed, developed or clinically implemented.

  9. Energy- and time-resolved detection of prompt gamma-rays for proton range verification.

    PubMed

    Verburg, Joost M; Riley, Kent; Bortfeld, Thomas; Seco, Joao

    2013-10-21

    In this work, we present experimental results of a novel prompt gamma-ray detector for proton beam range verification. The detection system features an actively shielded cerium-doped lanthanum(III) bromide scintillator, coupled to a digital data acquisition system. The acquisition was synchronized to the cyclotron radio frequency to separate the prompt gamma-ray signals from the later-arriving neutron-induced background. We designed the detector to provide a high energy resolution and an effective reduction of background events, enabling discrete proton-induced prompt gamma lines to be resolved. Measuring discrete prompt gamma lines has several benefits for range verification. As the discrete energies correspond to specific nuclear transitions, the magnitudes of the different gamma lines have unique correlations with the proton energy and can be directly related to nuclear reaction cross sections. The quantification of discrete gamma lines also enables elemental analysis of tissue in the beam path, providing a better prediction of prompt gamma-ray yields. We present the results of experiments in which a water phantom was irradiated with proton pencil-beams in a clinical proton therapy gantry. A slit collimator was used to collimate the prompt gamma-rays, and measurements were performed at 27 positions along the path of proton beams with ranges of 9, 16 and 23 g cm^-2 in water. The magnitudes of discrete gamma lines at 4.44, 5.2 and 6.13 MeV were quantified. The prompt gamma lines were found to be clearly resolved in dimensions of energy and time, and had a reproducible correlation with the proton depth-dose curve. We conclude that the measurement of discrete prompt gamma-rays for in vivo range verification of clinical proton beams is feasible, and plan to further study methods and detector designs for clinical use.
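    The RF-synchronized selection of discrete gamma lines can be illustrated with a toy event filter: gate on arrival time relative to the cyclotron RF to reject late neutron-induced background, then bin the surviving events against the known line energies. The event list, time gate, and energy tolerance below are invented for illustration, not the detector's actual parameters:

```python
# Hypothetical event list: (energy_MeV, time_ns relative to the RF phase).
events = [
    (4.44, 2.1), (6.13, 3.0), (5.20, 2.5),   # prompt gammas, early in the RF period
    (2.22, 18.0), (7.50, 22.0),              # neutron-induced background, late
    (4.44, 19.5),                            # late background matching a line energy
]

PROMPT_GATE_NS = (0.0, 8.0)       # assumed prompt time window
LINES_MEV = [4.44, 5.20, 6.13]    # discrete prompt gamma lines of interest
TOLERANCE = 0.1                   # assumed energy acceptance window

def count_line_events(events):
    """Count events per gamma line after the RF-synchronized time gate."""
    counts = {line: 0 for line in LINES_MEV}
    for energy, t in events:
        if not (PROMPT_GATE_NS[0] <= t <= PROMPT_GATE_NS[1]):
            continue  # reject later-arriving, neutron-induced background
        for line in LINES_MEV:
            if abs(energy - line) <= TOLERANCE:
                counts[line] += 1
    return counts

print(count_line_events(events))
```

    Note how the 4.44 MeV event arriving at 19.5 ns is rejected by the time gate even though its energy matches a line, which is the point of the energy-and-time-resolved design.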

  10. Wilderness management within an Australian interstate context

    Treesearch

    J. Elkinton

    2015-01-01

    Conservation reserve management within an interstate context is not unique to Australia. Nevertheless, the policies and procedures derived from various state legislation often create ambiguity for interstate land management agencies. This compels such agencies to adopt specific approaches and strategies to landscape management. This paper provides a context for...

  11. Emotional Intelligence and Organizational Context in Educational Leadership

    ERIC Educational Resources Information Center

    Horne, Matthew R.

    2017-01-01

    This qualitative, multiple case study investigated how educational leaders used and manifested Emotional Intelligence (EI) skills and abilities in unique organizational contexts. The study was conducted with five principals in a large, urban school district. The principals were selected to participate based on the organizational context of their…

  12. Initial Verification of GEOS-4 Aerosols Using CALIPSO and MODIS: Scene Classification

    NASA Technical Reports Server (NTRS)

    Welton, Ellsworth J.; Colarco, Peter R.; Hlavka, Dennis; Levy, Robert C.; Vaughan, Mark A.; daSilva, Arlindo

    2007-01-01

    A-train sensors such as MODIS and MISR provide column aerosol properties, and in the process a means of estimating aerosol type (e.g. smoke vs. dust). Correct classification of aerosol type is important because retrievals are often dependent upon selection of the right aerosol model. In addition, aerosol scene classification helps place the retrieved products in context for comparisons and analysis with aerosol transport models. The recent addition of CALIPSO to the A-train now provides a means of classifying aerosol distribution with altitude. CALIPSO level 1 products include profiles of attenuated backscatter at 532 and 1064 nm, and depolarization at 532 nm. Backscatter intensity, wavelength ratio, and depolarization provide information on the vertical profile of aerosol concentration, size, and shape. Thus similar estimates of aerosol type using MODIS or MISR are possible with CALIPSO, and the combination of data from all sensors provides a means of 3D aerosol scene classification. The NASA Goddard Earth Observing System general circulation model and data assimilation system (GEOS-4) provides global 3D aerosol mass for sulfate, sea salt, dust, and black and organic carbon. A GEOS-4 aerosol scene classification algorithm has been developed to provide estimates of aerosol mixtures along the flight track for NASA's Geoscience Laser Altimeter System (GLAS) satellite lidar. GLAS was launched in 2003 and did not have the benefit of depolarization measurements or other sensors from the A-train. Aerosol typing from GLAS data alone was not possible, and the GEOS-4 aerosol classifier has been used to identify aerosol type and improve the retrieval of GLAS products. Here we compare 3D aerosol scene classification using CALIPSO and MODIS with the GEOS-4 aerosol classifier. Dust, smoke, and pollution examples will be discussed in the context of providing an initial verification of the 3D GEOS-4 aerosol products. Prior model verification has only been attempted with surface mass comparisons and column optical depth from AERONET and MODIS.
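    The lidar-based typing described above (depolarization as a shape proxy, wavelength ratio as a size proxy) can be sketched as a simple decision rule. The thresholds and class names below are illustrative assumptions, not the operational CALIPSO aerosol-subtype values:

```python
def classify_aerosol(depol_532, color_ratio):
    """Toy aerosol typing from CALIPSO-like observables.

    depol_532:   532 nm particulate depolarization ratio (particle-shape proxy)
    color_ratio: 1064/532 nm backscatter ratio (particle-size proxy)
    Thresholds are invented for illustration only.
    """
    if depol_532 > 0.2:
        return "dust"            # non-spherical particles depolarize strongly
    if color_ratio < 0.5:
        return "smoke/pollution" # fine-mode, spectrally steep particles
    return "marine/sea salt"     # large, spherical particles

# Illustrative profile bins: (depolarization, color ratio) per altitude level.
profile = [(0.35, 1.0), (0.05, 0.3), (0.04, 0.9)]
print([classify_aerosol(d, c) for d, c in profile])
```

    The operational algorithms add backscatter intensity, altitude, and surface type to this decision, but the structure — thresholding a small set of optical observables — is the same.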

  13. V&V Within Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1996-01-01

    Verification and Validation (V&V) is used to increase the level of assurance of critical software, particularly that of safety-critical and mission-critical software. V&V is a systems engineering discipline that evaluates the software in a systems context, and is currently applied during the development of a specific application system. In order to bring the effectiveness of V&V to bear within reuse-based software engineering, V&V must be incorporated within the domain engineering process.

  14. C. V. Raman and Colonial Physics: Acoustics and the Quantum

    NASA Astrophysics Data System (ADS)

    Banerjee, Somaditya

    2014-06-01

    Presenting the social and historical context of Chandrasekhara Venkata Raman, this paper clarifies the nature and development of his work in early twentieth-century colonial India. Raman's early fascination with acoustics became the basis of his later insights into the nature of the light quantum. His work on light scattering played an important role in the experimental verification of quantum mechanics. In general, Raman's worldview corrects certain Orientalist stereotypes about scientific practice in Asia.

  15. Motivation Matters: Lessons for REDD+ Participatory Measurement, Reporting and Verification from Three Decades of Child Health Participatory Monitoring in Indonesia

    PubMed Central

    Ekowati, Dian; Hofstee, Carola; Praputra, Andhika Vega; Sheil, Douglas

    2016-01-01

    Participatory Measurement, Reporting and Verification (PMRV), in the context of reducing emissions from deforestation and forest degradation with its co-benefits (REDD+) requires sustained monitoring and reporting by community members. This requirement appears challenging and has yet to be achieved. Other successful, long established, community self-monitoring and reporting systems may provide valuable lessons. The Indonesian integrated village healthcare program (Posyandu) was initiated in the 1980s and still provides effective and successful participatory measurement and reporting of child health status across the diverse, and often remote, communities of Indonesia. Posyandu activities focus on the growth and development of children under the age of five by recording their height and weight and reporting these monthly to the Ministry of Health. Here we focus on the local Posyandu personnel (kaders) and their motivations and incentives for contributing. While Posyandu and REDD+ measurement and reporting activities differ, there are sufficient commonalities to draw useful lessons. We find that the Posyandu kaders are motivated by their interests in health care, by their belief that it benefits the community, and by encouragement by local leaders. Recognition from the community, status within the system, training opportunities, competition among communities, and small payments provide incentives to sustain participation. We examine these lessons in the context of REDD+. PMID:27806053

  16. Modeling and Simulation Verification, Validation and Accreditation (VV&A): A New Undertaking for the Exploration Systems Mission Directorate

    NASA Technical Reports Server (NTRS)

    Prill, Mark E.

    2005-01-01

    The overall presentation was focused to provide, for the Verification, Validation, and Accreditation (VV&A) session audience, a snapshot review of the Exploration Systems Mission Directorate's (ESMD) investigation into implementation of a modeling and simulation (M&S) VV&A program. The presentation provides some legacy ESMD reference material, including information on the then-current organizational structure and the M&S (Simulation Based Acquisition (SBA)) focus contained therein, to provide a context for the proposed M&S VV&A approach. This reference material briefly highlights the SBA goals and objectives, and outlines FY05 M&S development and implementation consistent with the Subjective Assessment, Constructive Assessment, Operator-in-the-Loop Assessment, Hardware-in-the-Loop Assessment, and In-Service Operations Assessment M&S construct, the NASA Exploration Information Ontology Model (NExIOM) data model, and integration with the Windchill-based Integrated Collaborative Environment (ICE). The presentation then addresses the ESMD team's initial conclusions regarding an M&S VV&A program, summarizes the general VV&A implementation approach anticipated, and outlines some of the recognized VV&A program challenges, all within the broader context of the overarching Integrated Modeling and Simulation (IM&S) environment at both the ESMD and Agency (NASA) levels. The presentation concludes with a status on the current M&S organization's progress to date relative to the recommended IM&S implementation activity.

  17. Propel: Tools and Methods for Practical Source Code Model Checking

    NASA Technical Reports Server (NTRS)

    Mansouri-Samani, Massoud; Mehlitz, Peter; Markosian, Lawrence; OMalley, Owen; Martin, Dale; Moore, Lantz; Penix, John; Visser, Willem

    2003-01-01

    The work reported here is an overview and snapshot of a project to develop practical model checking tools for in-the-loop verification of NASA's mission-critical, multithreaded programs in Java and C++. Our strategy is to develop and evaluate both a design concept that enables the application of model checking technology to C++ and Java, and a model checking toolset for C++ and Java. The design concept and the associated model checking toolset is called Propel. It builds upon the Java PathFinder (JPF) tool, an explicit state model checker for Java applications developed by the Automated Software Engineering group at NASA Ames Research Center. The design concept that we are developing is Design for Verification (D4V). This is an adaptation of existing best design practices that has the desired side-effect of enhancing verifiability by improving modularity and decreasing accidental complexity. D4V, we believe, enhances the applicability of a variety of V&V approaches; we are developing the concept in the context of model checking. The model checking toolset, Propel, is based on extending JPF to handle C++. Our principal tasks in developing the toolset are to build a translator from C++ to Java, productize JPF, and evaluate the toolset in the context of D4V. Through all these tasks we are testing Propel capabilities on customer applications.

  18. Very fast road database verification using textured 3D city models obtained from airborne imagery

    NASA Astrophysics Data System (ADS)

    Bulatov, Dimitri; Ziems, Marcel; Rottensteiner, Franz; Pohl, Melanie

    2014-10-01

    Road databases are known to be an important part of any geodata infrastructure, e.g. as the basis for urban planning or emergency services. Updating road databases for crisis events must be performed quickly and with the highest possible degree of automation. We present a semi-automatic algorithm for road verification using textured 3D city models, starting from aerial or even UAV-images. This algorithm contains two processes, which exchange input and output, but basically run independently from each other. These processes are textured urban terrain reconstruction and road verification. The first process contains a dense photogrammetric reconstruction of 3D geometry of the scene using depth maps. The second process is our core procedure, since it contains various methods for road verification. Each method represents a unique road model and a specific strategy, and thus is able to deal with a specific type of roads. Each method is designed to provide two probability distributions, where the first describes the state of a road object (correct, incorrect), and the second describes the state of its underlying road model (applicable, not applicable). Based on the Dempster-Shafer Theory, both distributions are mapped to a single distribution that refers to three states: correct, incorrect, and unknown. With respect to the interaction of both processes, the normalized elevation map and the digital orthophoto generated during 3D reconstruction are the necessary input - together with initial road database entries - for the road verification process. If the entries of the database are too obsolete or not available at all, sensor data evaluation enables classification of the road pixels of the elevation map followed by road map extraction by means of vectorization and filtering of the geometrically and topologically inconsistent objects. 
    Depending on time constraints and the availability of a geo-database for buildings, the urban terrain reconstruction procedure outputs semantic models of buildings, trees, and ground. Buildings and ground are textured by means of the available images. This facilitates orientation in the model and the interactive verification of the road objects that were initially classified as unknown. The three main modules of the texturing algorithm are: pose estimation (if the videos are not geo-referenced), occlusion analysis, and texture synthesis.
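    The fusion step described in this abstract — each verification module yields one distribution over the object state (correct/incorrect) and one over its model's applicability, which are mapped to a frame {correct, incorrect, unknown} and combined under the Dempster-Shafer Theory — can be sketched as follows. The mapping and the example probabilities are an illustrative reading of the abstract, not the authors' exact formulation:

```python
def module_masses(p_correct, p_applicable):
    """Map a module's two distributions to basic belief masses over
    {correct, incorrect, unknown}. The mass of the model not being
    applicable is assigned to 'unknown' (an assumed mapping)."""
    return {
        "correct":   p_correct * p_applicable,
        "incorrect": (1.0 - p_correct) * p_applicable,
        "unknown":   1.0 - p_applicable,
    }

def dempster_combine(m1, m2):
    """Dempster's rule on the frame {correct, incorrect};
    'unknown' plays the role of the full frame Theta."""
    conflict = m1["correct"] * m2["incorrect"] + m1["incorrect"] * m2["correct"]
    norm = 1.0 - conflict  # renormalization after discarding conflicting mass
    c = (m1["correct"] * m2["correct"]
         + m1["correct"] * m2["unknown"]
         + m1["unknown"] * m2["correct"]) / norm
    i = (m1["incorrect"] * m2["incorrect"]
         + m1["incorrect"] * m2["unknown"]
         + m1["unknown"] * m2["incorrect"]) / norm
    u = m1["unknown"] * m2["unknown"] / norm
    return {"correct": c, "incorrect": i, "unknown": u}

# Two road-verification modules, each with its own confidence and applicability:
m = dempster_combine(module_masses(0.9, 0.8), module_masses(0.7, 0.5))
print(max(m, key=m.get))  # prints "correct"
```

    A module whose road model is inapplicable (low `p_applicable`) contributes mostly 'unknown' mass and therefore barely influences the combined verdict, which is the practical appeal of this fusion scheme.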

  19. The Rural Context and Post-Secondary School Enrollment: An Ecological Systems Approach

    ERIC Educational Resources Information Center

    Demi, Mary Ann; Coleman-Jensen, Alisha; Snyder, Anastasia R.

    2010-01-01

    This study uses an ecological systems framework to examine how indicators of individual, family, and school contexts are associated with post-secondary educational enrollment among a sample of rural youth. Structural equation modeling allows us to examine both direct and indirect effects of these contexts on school enrollment. Unique elements of…

  20. Postures and Motions Library Development for Verification of Ground Crew Human Systems Integration Requirements

    NASA Technical Reports Server (NTRS)

    Jackson, Mariea Dunn; Dischinger, Charles; Stambolian, Damon; Henderson, Gena

    2012-01-01

    Spacecraft and launch vehicle ground processing activities require a variety of unique human activities. These activities are being documented in a primitive motion capture library. The library will be used by human factors engineers in the future to infuse realistic human activities into CAD models to verify ground systems human factors requirements. As the primitive models are being developed for the library, the project has selected several current human factors issues to be addressed for the SLS and Orion launch systems. This paper explains how the motion capture of unique ground systems activities is being used to verify the human factors analysis requirements for ground systems used to process the SLS and Orion vehicles, and how the primitive models will be applied to future spacecraft and launch vehicle processing.

  1. Postures and Motions Library Development for Verification of Ground Crew Human Factors Requirements

    NASA Technical Reports Server (NTRS)

    Stambolian, Damon; Henderson, Gena; Jackson, Mariea Dunn; Dischinger, Charles

    2013-01-01

    Spacecraft and launch vehicle ground processing activities require a variety of unique human activities. These activities are being documented in a primitive motion capture library. The library will be used by human factors engineering analysts to infuse real to life human activities into the CAD models to verify ground systems human factors requirements. As the primitive models are being developed for the library, the project has selected several current human factors issues to be addressed for the Space Launch System (SLS) and Orion launch systems. This paper explains how the motion capture of unique ground systems activities is being used to verify the human factors engineering requirements for ground systems used to process the SLS and Orion vehicles, and how the primitive models will be applied to future spacecraft and launch vehicle processing.

  2. The role of criteria in design and management of space systems

    NASA Technical Reports Server (NTRS)

    Blair, J. C.; Ryan, R. S.

    1992-01-01

    Explicit requirements and standards arising in connection with space systems management serve as a framework for technical management and furnish legally binding control of development, verification, and operations. As a project develops, additional requirements unique to the system in question are derived; these are designated 'derived requirements'. The reliability and cost-effectiveness of a space system are best ensured when a balance is struck between formal (legally binding) and informal requirements. Attention is presently given to the development of criteria consistent with total quality management.

  3. Grid Modernization Laboratory Consortium - Testing and Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroposki, Benjamin; Skare, Paul; Pratt, Rob

    This paper highlights some of the unique testing capabilities and projects being performed at several national laboratories as part of the U. S. Department of Energy Grid Modernization Laboratory Consortium. As part of this effort, the Grid Modernization Laboratory Consortium Testing Network is being developed to accelerate grid modernization by enabling access to a comprehensive testing infrastructure and creating a repository of validated models and simulation tools that will be publicly available. This work is key to accelerating the development, validation, standardization, adoption, and deployment of new grid technologies to help meet U. S. energy goals.

  4. Advanced fingerprint verification software

    NASA Astrophysics Data System (ADS)

    Baradarani, A.; Taylor, J. R. B.; Severin, F.; Maev, R. Gr.

    2016-05-01

    We have developed a fingerprint software package that can be used in a wide range of applications, from law enforcement to public and private security systems, and to personal devices such as laptops, vehicles, and door-locks. The software and processing units are a unique implementation of new and sophisticated algorithms that compete with the best current systems in the world. Development of the software package has been in line with the third generation of our ultrasonic fingerprinting machine. Solid and robust performance is achieved in the presence of misplaced and low-quality fingerprints.

  5. 4MOST systems engineering: from conceptual design to preliminary design review

    NASA Astrophysics Data System (ADS)

    Bellido-Tirado, Olga; Frey, Steffen; Barden, Samuel C.; Brynnel, Joar; Giannone, Domenico; Haynes, Roger; de Jong, Roelof S.; Phillips, Daniel; Schnurr, Olivier; Walcher, Jakob; Winkler, Roland

    2016-08-01

    The 4MOST Facility is a high-multiplex, wide-field, fibre-fed spectrograph system for the ESO VISTA telescope. It aims to create a world-class spectroscopic survey facility unique in its combination of wide-field multiplex, spectral resolution, spectral coverage, and sensitivity. At the end of 2014, after a successful concept optimization design phase, 4MOST entered its Preliminary Design Phase. Here we present the process and tools adopted during the Preliminary Design Phase to define the subsystem specifications, coordinate the interface control documents, and draft the system verification procedures.

  6. Efficient Ada multitasking on a RISC register window architecture

    NASA Technical Reports Server (NTRS)

    Kearns, J. P.; Quammen, D.

    1987-01-01

    This work addresses the problem of reducing context switch overhead on a processor which supports a large register file - a register file much like that which is part of the Berkeley RISC processors and several other emerging architectures (which are not necessarily reduced instruction set machines in the purest sense). Such a reduction in overhead is particularly desirable in a real-time embedded application, in which task-to-task context switch overhead may result in failure to meet crucial deadlines. A storage management technique by which a context switch may be implemented as cheaply as a procedure call is presented. The essence of this technique is the avoidance of the save/restore of registers on the context switch. This is achieved through analysis of the static source text of an Ada tasking program. Information gained during that analysis directs the optimized storage management strategy for that program at run time. A formal verification of the technique in terms of an operational control model and an evaluation of the technique's performance via simulations driven by synthetic Ada program traces are presented.

  7. Future Combat System Spinout 1 Technical Field Test - Establishing and Implementing Models and Simulations System of Systems Verification, Validation and Accreditation Practices, Methodologies and Procedures

    DTIC Science & Technology

    2009-11-24

    assisted by the Brigade Combat Team (BCT) Modernization effort, the use of Models and Simulations (M&S) becomes more crucial in supporting major...in 2008 via a slice of the Current Force (CF) BCT structure. To ensure realistic operational context, an M&S System-of-Systems (SoS) level...messages, and constructive representation of platforms, vehicles, and terrain. The M&S federation also provided test control, data collection, and live

  8. Computational Materials: Modeling and Simulation of Nanostructured Materials and Systems

    NASA Technical Reports Server (NTRS)

    Gates, Thomas S.; Hinkley, Jeffrey A.

    2003-01-01

    The paper provides details on the structure and implementation of the Computational Materials program at the NASA Langley Research Center. Examples are given that illustrate the suggested approaches to predicting the behavior and influencing the design of nanostructured materials such as high-performance polymers, composites, and nanotube-reinforced polymers. Primary simulation and measurement methods applicable to multi-scale modeling are outlined. Key challenges including verification and validation of models are highlighted and discussed within the context of NASA's broad mission objectives.

  9. Addressing the Rural Context in Literacies Research: A Call to Action

    ERIC Educational Resources Information Center

    Azano, Amy Price

    2015-01-01

    The article features a discussion about rural contexts in literacy research. Rural students compose a significant portion of K-12 students, and rural schools have unique challenges, such as limited funding and resources. However, as a field of literacy researchers, we do not privilege place equitably across multiple contexts. The article serves as…

  10. Contextual Leadership Practices: The Case of a Successful School Principal in Malaysia

    ERIC Educational Resources Information Center

    Noman, Mohammad; Awang Hashim, Rosna; Shaik Abdullah, Sarimah

    2018-01-01

    The study of context-based leadership practices has gained currency during the last decade. This study aims to complement the recent efforts of researchers in identifying the context-based leadership practices of successful school leaders, and deliberating how these practices are enacted within their own unique contexts. An in-depth case study was…

  11. Experiencing Restorative Justice Practices within the Context of an Academic Course--A Phenomenological Mixed Methods Study

    ERIC Educational Resources Information Center

    Dedinsky, Paul C.

    2012-01-01

    This study explored restorative justice arising in the context of an academic high school course in which students learned restorative justice principles and strategies. Given that the literature provided limited guidance of restorative justice in this context, these novel circumstances presented a unique opportunity for study. The central…

  12. Enriching regulatory networks by bootstrap learning using optimised GO-based gene similarity and gene links mined from PubMed abstracts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, Ronald C.; Sanfilippo, Antonio P.; McDermott, Jason E.

    2011-02-18

    Transcriptional regulatory networks are being determined using “reverse engineering” methods that infer connections based on correlations in gene state. Corroboration of such networks through independent means such as evidence from the biomedical literature is desirable. Here, we explore a novel approach, a bootstrapping version of our previous Cross-Ontological Analytic method (XOA) that can be used for semi-automated annotation and verification of inferred regulatory connections, as well as for discovery of additional functional relationships between the genes. First, we use our annotation and network expansion method on a biological network learned entirely from the literature. We show how new relevant links between genes can be iteratively derived using a gene similarity measure based on the Gene Ontology that is optimized on the input network at each iteration. Second, we apply our method to annotation, verification, and expansion of a set of regulatory connections found by the Context Likelihood of Relatedness algorithm.
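    The iterative network-expansion idea — add gene-gene links whose similarity clears a threshold tuned on the current network, then repeat — can be sketched as below. The gene names, similarity values, and fixed threshold are illustrative stand-ins; the actual method uses a GO-based similarity measure re-optimized on the network at each iteration:

```python
# Hypothetical GO-based similarity scores between gene pairs.
SIMILARITY = {
    ("geneA", "geneB"): 0.91,
    ("geneA", "geneC"): 0.62,
    ("geneB", "geneD"): 0.88,
    ("geneC", "geneD"): 0.40,
}

def expand_network(seed_links, threshold=0.8, max_iter=5):
    """Iteratively add high-similarity links that touch the current network."""
    network = set(seed_links)
    for _ in range(max_iter):
        genes = {g for link in network for g in link}
        # Candidate links pass the threshold and attach to a gene already present.
        new = {pair for pair, sim in SIMILARITY.items()
               if sim >= threshold
               and (pair[0] in genes or pair[1] in genes)
               and pair not in network}
        if not new:
            break  # fixed point reached: no further links qualify
        network |= new
    return network

net = expand_network({("geneA", "geneB")})
print(sorted(net))
```

    Starting from the single seed link, the sketch pulls in (geneB, geneD) at 0.88 but leaves the sub-threshold pairs out, mirroring how the bootstrap grows a literature-derived network conservatively.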

  13. Formal Methods for Verification and Validation of Partial Specifications: A Case Study

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Callahan, John

    1997-01-01

    This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety-critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors without the burden of full proofs of correctness. We describe a case study of the use of partial formal models for V&V of the requirements for Fault Detection, Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and that it is the process of formalization, rather than the end product, that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.

  14. Carbon sequestration and its role in the global carbon cycle

    USGS Publications Warehouse

    McPherson, Brian J.; Sundquist, Eric T.

    2009-01-01

    For carbon sequestration, the issues of monitoring, risk assessment, and verification of carbon content and storage efficacy are perhaps the most uncertain. Yet these issues are also the most critical challenges facing the broader context of carbon sequestration as a means for addressing climate change. In response to these challenges, Carbon Sequestration and Its Role in the Global Carbon Cycle presents current perspectives and research that combine five major areas:
    • The global carbon cycle and verification and assessment of global carbon sources and sinks
    • Potential capacity and temporal/spatial scales of terrestrial, oceanic, and geologic carbon storage
    • Assessing risks and benefits associated with terrestrial, oceanic, and geologic carbon storage
    • Predicting, monitoring, and verifying effectiveness of different forms of carbon storage
    • Suggested new CO2 sequestration research and management paradigms for the future
    The volume is based on a Chapman Conference and will appeal to the rapidly growing group of scientists and engineers examining methods for deliberate carbon sequestration through storage in plants, soils, the oceans, and geological repositories.

  15. Formally Verified Practical Algorithms for Recovery from Loss of Separation

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Munoz, Caesar A.

    2009-01-01

    In this paper, we develop and formally verify practical algorithms for recovery from loss of separation. The formal verification is performed in the context of a criteria-based framework. This framework provides rigorous definitions of horizontal and vertical maneuver correctness that guarantee divergence and achieve horizontal and vertical separation. The algorithms are shown to be independently correct, that is, separation is achieved when only one aircraft maneuvers, and implicitly coordinated, that is, separation is also achieved when both aircraft maneuver. In this paper we improve the horizontal criteria over our previous work. An important benefit of the criteria approach is that different aircraft can execute different algorithms and implicit coordination will still be achieved, as long as they all meet the explicit criteria of the framework. Towards this end we have sought to make the criteria as general as possible. The framework presented in this paper has been formalized and mechanically verified in the Prototype Verification System (PVS).
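    A minimal horizontal divergence check in the spirit of the paper's criteria is that two aircraft diverge when their relative position and relative velocity have a positive dot product, i.e. the horizontal range is increasing. This is an illustrative criterion only, not the verified PVS formulation of the paper's framework:

```python
def diverging(s_rel, v_rel):
    """True if horizontal separation is increasing.

    s_rel: relative horizontal position (x, y) of aircraft 2 w.r.t. aircraft 1
    v_rel: relative horizontal velocity (vx, vy)
    The range rate has the sign of the dot product s_rel . v_rel.
    """
    return s_rel[0] * v_rel[0] + s_rel[1] * v_rel[1] > 0

# Separating: relative velocity has a positive component along the line of sight.
print(diverging((1.0, 0.0), (0.5, 0.2)))   # prints True
# Closing: relative velocity points back toward the other aircraft.
print(diverging((1.0, 0.0), (-0.5, 0.0)))  # prints False
```

    Implicit coordination means any pair of maneuvers each satisfying such a shared criterion is jointly safe, which is why the paper verifies the criteria rather than any single algorithm.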

  16. Towards a Compositional SPIN

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Giannakopoulou, Dimitra

    2006-01-01

    This paper discusses our initial experience with introducing automated assume-guarantee verification based on learning in the SPIN tool. We believe that compositional verification techniques such as assume-guarantee reasoning could complement the state-reduction techniques that SPIN already supports, thus increasing the size of systems that SPIN can handle. We present a "light-weight" approach to evaluating the benefits of learning-based assume-guarantee reasoning in the context of SPIN: we turn our previous implementation of learning for the LTSA tool into a main program that externally invokes SPIN to provide the model checking-related answers. Despite its performance overheads (which mandate a future implementation within SPIN itself), this approach provides accurate information about the savings in memory. We have experimented with several versions of learning-based assume guarantee reasoning, including a novel heuristic introduced here for generating component assumptions when their environment is unavailable. We illustrate the benefits of learning-based assume-guarantee reasoning in SPIN through the example of a resource arbiter for a spacecraft. Keywords: assume-guarantee reasoning, model checking, learning.
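    The proof rule underlying this style of reasoning is the standard non-circular assume-guarantee rule (the paper's framework may differ in detail): two smaller checks are combined into a conclusion about the composed system, with the L* algorithm iteratively learning the assumption A and the model checker (here SPIN) serving as the oracle for its membership and equivalence queries.

```latex
% Non-circular assume-guarantee rule for components M_1, M_2,
% property P, and learned assumption A:
\frac{\;\langle A\rangle\, M_1\, \langle P\rangle
      \qquad
      \langle \mathit{true}\rangle\, M_2\, \langle A\rangle\;}
     {\;\langle \mathit{true}\rangle\, M_1 \parallel M_2\, \langle P\rangle\;}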

  17. Viability Study for an Unattended UF6 Cylinder Verification Station: Phase I Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Leon E.; Miller, Karen A.; Garner, James R.

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass and identification for all declared UF6 cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The US Support Program team consisted of Pacific Northwest National Laboratory (PNNL, lead), Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL) and Savannah River National Laboratory (SRNL). At the core of the viability study is a long-term field trial of a prototype UCVS system at a Westinghouse fuel fabrication facility. A key outcome of the study is a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA) and Passive Neutron Enrichment Meter (PNEM). This report provides context for the UCVS concept and the field trial: potential UCVS implementation concepts at an enrichment facility; an overview of the UCVS prototype design; and field trial objectives and activities. Field trial results and interpretation are presented, with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that the contents of a given cylinder are consistent with previous scans. A modeling study, combined with field-measured instrument uncertainties, provides an assessment of the partial-defect sensitivity of HEVA and PNEM for both one-time assay and (repeated) NDA Fingerprint verification scenarios. The findings presented in this report represent a significant step forward in the community’s understanding of the strengths and limitations of the PNEM and HEVA NDA methods, and the viability of the UCVS concept in front-end fuel cycle facilities. This experience will inform Phase II of the UCVS viability study, should the IAEA pursue it.

  18. Evaluation of streamflow forecast for the National Water Model of U.S. National Weather Service

    NASA Astrophysics Data System (ADS)

    Rafieeinasab, A.; McCreight, J. L.; Dugger, A. L.; Gochis, D.; Karsten, L. R.; Zhang, Y.; Cosgrove, B.; Liu, Y.

    2016-12-01

    The National Water Model (NWM), an implementation of the community WRF-Hydro modeling system, is an operational hydrologic forecasting model for the contiguous United States. The model forecasts distributed hydrologic states and fluxes, including soil moisture, snowpack, ET, and ponded water. In particular, the NWM provides streamflow forecasts at more than 2.7 million river reaches for three forecast ranges: short (15 hr), medium (10 days), and long (30 days). In this study, we verify short and medium range streamflow forecasts in the context of the verification of their respective quantitative precipitation forecasts/forcing (QPF), the High Resolution Rapid Refresh (HRRR) and the Global Forecast System (GFS). The streamflow evaluation is performed for summer of 2016 at more than 6,000 USGS gauges. Both individual forecasts and forecast lead times are examined. Selected case studies of extreme events aim to provide insight into the quality of the NWM streamflow forecasts. A goal of this comparison is to address how much streamflow bias originates from precipitation forcing bias. To this end, precipitation verification is performed over the contributing areas above (and between assimilated) USGS gauge locations. Precipitation verification is based on the aggregated, blended StageIV/StageII data as the "reference truth". We summarize the skill of the streamflow forecasts, their skill relative to the QPF, and make recommendations for improving NWM forecast skill.
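    Skill comparisons of this kind typically rest on simple paired-sample scores. The following is a minimal sketch of two common streamflow verification measures; it is illustrative only (the NWM evaluation uses its own metric suite, and the data here are invented):

```python
# Sketch of two common verification scores for streamflow forecasts,
# assuming paired arrays of forecast and observed discharge at one gauge.

def percent_bias(forecast, observed):
    """Percent bias: positive values indicate overforecasting."""
    return 100.0 * (sum(f - o for f, o in zip(forecast, observed))
                    / sum(observed))

def nash_sutcliffe(forecast, observed):
    """Nash-Sutcliffe efficiency: 1 is perfect; <= 0 is no better than
    always forecasting the observed mean."""
    mean_obs = sum(observed) / len(observed)
    num = sum((f - o) ** 2 for f, o in zip(forecast, observed))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

# Example: a forecast that tracks the observations closely scores near 1.
obs = [10.0, 12.0, 20.0, 35.0, 18.0]
fcst = [11.0, 13.0, 19.0, 33.0, 17.0]
print(round(percent_bias(fcst, obs), 2))   # -2.11 (slight underforecast)
print(round(nash_sutcliffe(fcst, obs), 3)) # 0.979
```

    Computing such scores per lead time, once against observations and once for the QPF against the precipitation analysis, is one way to attribute streamflow bias to forcing bias.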

  19. Verification assessment of piston boundary conditions for Lagrangian simulation of compressible flow similarity solutions

    DOE PAGES

    Ramsey, Scott D.; Ivancic, Philip R.; Lilieholm, Jennifer F.

    2015-12-10

    This work is concerned with the use of similarity solutions of the compressible flow equations as benchmarks or verification test problems for finite-volume compressible flow simulation software. In practice, this effort can be complicated by the infinite spatial/temporal extent of many candidate solutions or “test problems.” Methods can be devised with the intention of ameliorating this inconsistency with the finite nature of computational simulation; the exact strategy will depend on the code and problem archetypes under investigation. For example, self-similar shock wave propagation can be represented in Lagrangian compressible flow simulations as rigid boundary-driven flow, even if no such “piston” is present in the counterpart mathematical similarity solution. The purpose of this work is to investigate in detail the methodology of representing self-similar shock wave propagation as a piston-driven flow in the context of various test problems featuring simple closed-form solutions of infinite spatial/temporal extent. The closed-form solutions allow for the derivation of similarly closed-form piston boundary conditions (BCs) for use in Lagrangian compressible flow solvers. Finally, the consequences of utilizing these BCs (as opposed to directly initializing the self-similar solution in a computational spatial grid) are investigated in terms of common code verification analysis metrics (e.g., shock strength/position errors and global convergence rates).
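    The basic idea of deriving a piston boundary condition from a closed-form solution can be sketched on a toy velocity field (not one of the paper's test problems): a rigid piston that "rides" the flow follows the path X(t) with dX/dt = u(X(t), t), which can be integrated from the closed-form u.

```python
# Toy sketch: derive a rigid "piston" trajectory from a closed-form velocity
# field u(x, t) by integrating dX/dt = u(X, t) with classical RK4.
# Here u(x, t) = x / t (a simple self-similar expansion), for which the
# exact piston path starting at X(t0) = x0 is X(t) = x0 * t / t0.

def u(x, t):
    return x / t

def piston_path(x0, t0, t1, steps=1000):
    """Integrate dX/dt = u(X, t) from t0 to t1 with RK4; return X(t1)."""
    h = (t1 - t0) / steps
    x, t = x0, t0
    for _ in range(steps):
        k1 = u(x, t)
        k2 = u(x + 0.5 * h * k1, t + 0.5 * h)
        k3 = u(x + 0.5 * h * k2, t + 0.5 * h)
        k4 = u(x + h * k3, t + h)
        x += (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return x

# The numerical piston path reproduces the exact similarity solution:
print(piston_path(1.0, 1.0, 2.0))  # ~2.0 (exact: x0 * t1 / t0)
```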

  1. Development of a Spanish language fertility educational brochure for pediatric oncology families.

    PubMed

    Murphy, D; Kashal, P; Quinn, G P; Sawczyn, K K; Termuhlen, A M

    2014-08-01

    Education materials detailing fertility preservation options geared towards pediatric oncology patients are inadequately available, particularly materials that are culturally tailored. An English language pediatric fertility preservation brochure was developed in 2011, and given the significance of family building among Hispanics, it is important to transcreate materials for these audiences using learner verification to explore the unique preferences of the population. Design: qualitative face-to-face interviews and focus groups with Spanish-speaking patients (n = 10), parents (n = 10), and healthcare providers (n = 5); suggestions for revisions were tested with focus groups of the same population (N = 16). Main outcome measures: design, readability, likelihood to read, and overall opinion. Feedback was organized into 2 distinct themes: design and reader action. Overall, the majority of parents and patients wanted personal accounts of other patients who had undergone fertility preservation, as well as photos of actual patients. The medical terminology in the brochure was acceptable and understood by most. The majority of participants who preferred the design with vivid colors and patterns explained this was because that brochure also contained more relevant information; however, both brochures had identical information. Many participants explained they would be receptive to receiving the brochure and that the reproductive health information should be reinforced throughout cancer care. A learner verification approach to creating pediatric educational materials can judiciously identify unique preferences for information. These results will be utilized to educate Spanish-speaking pediatric oncology patients and their parents to improve decision-making processes regarding future parenthood. Copyright © 2014 North American Society for Pediatric and Adolescent Gynecology. Published by Elsevier Inc. All rights reserved.

  2. Sexual Harassment and Assault in the U.S. Military: A Review of Policy and Research Trends.

    PubMed

    Stander, Valerie A; Thomsen, Cynthia J

    2016-01-01

    Recently, there has been increasing concern regarding the problem of sexual violence in the military. Because sexual harassment and assault are more closely intertwined in the military than in most civilian contexts, the military context affords a unique opportunity to study the interrelationships between these two types of sexual violence. In this review, we briefly summarize existing research on military sexual trauma prevalence rates, effects on victims, and risk factors, as well as prevention and response programs in the military context. In each of these topic areas, we emphasize issues unique to the complex interplay between sexual harassment and assault in the military and make recommendations for future research. Reprint & Copyright © 2016 Association of Military Surgeons of the U.S.

  3. Efficient design and verification of diagnostics for impurity transport experiments.

    PubMed

    Chilenski, M A; Greenwald, M J; Marzouk, Y M; Rice, J E; White, A E

    2018-01-01

    Recent attempts to measure impurity transport in Alcator C-Mod using an x-ray imaging crystal spectrometer and laser blow-off impurity injector have failed to yield unique reconstructions of the transport coefficient profiles. This paper presents a fast, linearized model which was constructed to estimate diagnostic requirements for impurity transport experiments. The analysis shows that the spectroscopic diagnostics on Alcator C-Mod should be capable of inferring simple profiles of impurity diffusion D_Z and convection V_Z accurate to better than ±10% uncertainty, suggesting that the failure to infer unique D_Z and V_Z from experimental data is attributable to an inadequate analysis procedure rather than the result of insufficient diagnostics. Furthermore, the analysis reveals that even a modest spatial resolution can overcome a low time resolution. This approach can be adapted to design and verify diagnostics for transport experiments on any magnetic confinement device.

  4. Simple thermal to thermal face verification method based on local texture descriptors

    NASA Astrophysics Data System (ADS)

    Grudzien, A.; Palka, Norbert; Kowalski, M.

    2017-08-01

    Biometrics is the science of studying and analyzing the physical structure of the human body and the behaviour of people. Biometrics has found many applications, ranging from border control and forensic systems for criminal investigations to systems for access control. Unique identifiers, also referred to as modalities, are used to distinguish individuals. One of the most common and natural human identifiers is the face. As a result of decades of investigation, face recognition has achieved a high level of maturity; however, recognition in the visible spectrum is still challenging due to illumination effects and new ways of spoofing. One alternative is recognition of the face in a different part of the light spectrum, e.g., the infrared. Thermal infrared offers new possibilities for human recognition due to its specific properties as well as mature equipment. In this paper we present a scheme for subject verification using facial images in the thermal range. The study is focused on local feature extraction methods and on similarity metrics. We present a comparison of two local texture-based descriptors for thermal 1-to-1 face recognition.
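    A 1-to-1 verification pipeline of this general shape can be sketched with one of the simplest local texture descriptors, a basic 3x3 local binary pattern (LBP) histogram, compared with cosine similarity. This is an illustrative toy, not the paper's descriptors or metrics; real thermal-face pipelines are far more elaborate.

```python
# Minimal sketch of descriptor-based 1-to-1 face verification:
# 3x3 LBP histogram + cosine similarity, thresholded for a decision.

def lbp_histogram(img):
    """img: 2D list of grayscale values. Returns a 256-bin LBP histogram."""
    hist = [0] * 256
    rows, cols = len(img), len(img[0])
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            center = img[r][c]
            code = 0
            for bit, (dr, dc) in enumerate(offsets):
                if img[r + dr][c + dc] >= center:
                    code |= 1 << bit
            hist[code] += 1
    return hist

def cosine_similarity(h1, h2):
    dot = sum(a * b for a, b in zip(h1, h2))
    n1 = sum(a * a for a in h1) ** 0.5
    n2 = sum(b * b for b in h2) ** 0.5
    return dot / (n1 * n2) if n1 and n2 else 0.0

# Identical images produce identical descriptors (similarity 1.0);
# a verification decision would threshold this score.
face = [[10, 20, 30], [40, 50, 60], [70, 80, 90]]
print(cosine_similarity(lbp_histogram(face), lbp_histogram(face)))  # 1.0
```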

  5. Development of a Targeted Smoking Relapse-Prevention Intervention for Cancer Patients.

    PubMed

    Meltzer, Lauren R; Meade, Cathy D; Diaz, Diana B; Carrington, Monica S; Brandon, Thomas H; Jacobsen, Paul B; McCaffrey, Judith C; Haura, Eric B; Simmons, Vani N

    2018-04-01

    We describe the series of iterative steps used to develop a smoking relapse-prevention intervention customized to the needs of cancer patients. Informed by relevant literature and a series of preliminary studies, an educational tool (DVD) was developed to target the unique smoking relapse risk factors among cancer patients. Learner verification interviews were conducted with 10 cancer patients who recently quit smoking to elicit feedback and inform the development of the DVD. The DVD was then refined using iterative processes and feedback from the learner verification interviews. Major changes focused on visual appeal, and the inclusion of additional testimonials and graphics to increase comprehension of key points and further emphasize the message that the patient is in control of their ability to maintain their smoking abstinence. Together, these steps resulted in the creation of a DVD titled Surviving Smokefree®, which represents the first smoking relapse-prevention intervention for cancer patients. If found effective, the Surviving Smokefree® DVD is an easily disseminable and low-cost portable intervention which can assist cancer patients in maintaining smoking abstinence.

  6. Cooperative GN&C development in a rapid prototyping environment. [flight software design for space vehicles

    NASA Technical Reports Server (NTRS)

    Bordano, Aldo; Uhde-Lacovara, JO; Devall, Ray; Partin, Charles; Sugano, Jeff; Doane, Kent; Compton, Jim

    1993-01-01

    The Navigation, Control and Aeronautics Division (NCAD) at NASA-JSC is exploring ways of producing Guidance, Navigation and Control (GN&C) flight software faster, better, and cheaper. To achieve these goals, NCAD established two hardware/software facilities that take an avionics design project from initial inception through high-fidelity real-time hardware-in-the-loop testing. Commercially available software products are used to develop the GN&C algorithms in block-diagram form and then automatically generate source code from these diagrams. A high-fidelity real-time hardware-in-the-loop laboratory provides users with the capability to analyze mass memory usage within the targeted flight computer, verify hardware interfaces, conduct system-level verification, performance, and acceptance testing, and perform mission verification using reconfigurable and mission-unique data. To evaluate these concepts and tools, NCAD embarked on a project to build a real-time 6-DOF simulation of the Soyuz Assured Crew Return Vehicle flight software. To date, a productivity increase of 185 percent has been seen over traditional NASA methods for developing flight software.

  7. Euler equation existence, non-uniqueness and mesh converged statistics

    PubMed Central

    Glimm, James; Sharp, David H.; Lim, Hyunkyung; Kaufman, Ryan; Hu, Wenlin

    2015-01-01

    We review existence and non-uniqueness results for the Euler equation of fluid flow. These results are placed in the context of physical models and their solutions. Non-uniqueness is in direct conflict with the purpose of practical simulations, so that a mitigating strategy, outlined here, is important. We illustrate these issues in an examination of mesh converged turbulent statistics, with comparison to laboratory experiments. PMID:26261361

  8. Anxiety as a context for understanding associations between hypochondriasis, obsessive-compulsive, and panic attack symptoms.

    PubMed

    Longley, Susan L; Calamari, John E; Wu, Kevin; Wade, Michael

    2010-12-01

    In the context of the integrative model of anxiety and depression, we examined whether the essential problem of hypochondriasis is one of anxiety. When analyzed, data from a large nonclinical sample corresponded to the integrative model's characterization of anxiety as composed of both broad, shared and specific, unique symptom factors. The unique hypochondriasis, obsessive-compulsive, and panic attack symptom factors all had correlational patterns expected of anxiety with the shared, broad factors of negative emotionality and positive emotionality. A confirmatory factor analysis showed a higher-order, bifactor model was the best fit to our data; the shared and the unique hypochondriasis and anxiety symptom factors both contributed substantial variance. This study provides refinements to an empirically based taxonomy and clarifies what hypochondriasis is and, importantly, what it is not. Copyright © 2010. Published by Elsevier Ltd.

  9. Source Characterization of Underground Explosions from Combined Regional Moment Tensor and First-Motion Analysis

    DOE PAGES

    Chiang, Andrea; Dreger, Douglas S.; Ford, Sean R.; ...

    2014-07-08

    In this study, we investigate the 14 September 1988 U.S.–Soviet Joint Verification Experiment nuclear test at the Semipalatinsk test site in eastern Kazakhstan and two nuclear explosions conducted less than 10 years later at the Chinese Lop Nor test site. These events were very sparsely recorded by stations located within 1600 km, and in each case only three or four stations were available in the regional distance range. We have utilized a regional distance seismic waveform method, fitting long-period, complete, three-component waveforms jointly with first-motion observations from regional stations and teleseismic arrays. The combination of long-period waveforms and first-motion observations provides a unique discrimination of these sparsely recorded events in the context of the Hudson et al. (1989) source-type diagram. We demonstrate through a series of jackknife tests and sensitivity analyses that the source type of the explosions is well constrained. One event, a 1996 Lop Nor shaft explosion, displays large Love waves and possibly reversed Rayleigh waves at one station, indicative of a large F-factor. We show the combination of long-period waveforms and P-wave first motions is able to discriminate this event as explosion-like and distinct from earthquakes and collapses. We further demonstrate the behavior of network sensitivity solutions for models of tectonic release and spall-based tensile damage over a range of F-factors and K-factors.

  11. Experimental verification of the rainbow trapping effect in adiabatic plasmonic gratings

    PubMed Central

    Gan, Qiaoqiang; Gao, Yongkang; Wagner, Kyle; Vezenov, Dmitri; Ding, Yujie J.; Bartoli, Filbert J.

    2011-01-01

    We report the experimental observation of a trapped rainbow in adiabatically graded metallic gratings, designed to validate theoretical predictions for this unique plasmonic structure. One-dimensional graded nanogratings were fabricated and their surface dispersion properties tailored by varying the grating groove depth, whose dimensions were confirmed by atomic force microscopy. Tunable plasmonic bandgaps were observed experimentally, and direct optical measurements on graded grating structures show that light of different wavelengths in the 500–700-nm region is “trapped” at different positions along the grating, consistent with computer simulations, thus verifying the “rainbow” trapping effect. PMID:21402936

  12. Using Automated Theorem Provers to Certify Auto-Generated Aerospace Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd; Schumann, Johann

    2004-01-01

    We describe a system for the automated certification of safety properties of NASA software. The system uses Hoare-style program verification technology to generate proof obligations which are then processed by an automated first-order theorem prover (ATP). For full automation, however, the obligations must be aggressively preprocessed and simplified. We describe the unique requirements this places on the ATP and demonstrate how the individual simplification stages, which are implemented by rewriting, influence the ability of the ATP to solve the proof tasks. Experiments on more than 25,000 tasks were carried out using Vampire, Spass, and e-setheo.
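    The rewriting-based simplification step can be illustrated with a toy bottom-up simplifier for boolean proof obligations. The rules and term encoding here are invented for illustration and are not the paper's actual rewrite system:

```python
# Sketch of rewriting-based simplification of boolean proof obligations.
# Terms: ('and', a, b), ('or', a, b), ('not', a), True, False, or a
# variable name such as 'x'.

def simplify(t):
    """Bottom-up rewriting with a few boolean identity rules."""
    if not isinstance(t, tuple):
        return t
    op, *args = t
    args = [simplify(a) for a in args]  # simplify subterms first
    if op == 'and':
        a, b = args
        if a is True:
            return b
        if b is True:
            return a
        if a is False or b is False:
            return False
    if op == 'or':
        a, b = args
        if a is False:
            return b
        if b is False:
            return a
        if a is True or b is True:
            return True
    if op == 'not':
        (a,) = args
        if a is True:
            return False
        if a is False:
            return True
    return (op, *args)  # no rule applied; rebuild the term

# and(x, and(true, not(false)))  simplifies to  x
print(simplify(('and', 'x', ('and', True, ('not', False)))))  # 'x'
```

    In a certification pipeline, obligations that simplify all the way to True need never reach the prover at all; the rest are smaller and easier to solve.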

  13. Experimental Verification of the Theory of Wind-Tunnel Boundary Interference

    NASA Technical Reports Server (NTRS)

    Theodorsen, Theodore; Silverstein, Abe

    1935-01-01

    The results of an experimental investigation on the boundary-correction factor are presented in this report. The values of the boundary-correction factor from the theory, which at the present time is virtually completed, are given in the report for all conventional types of tunnels. With the isolation of certain disturbing effects, the experimental boundary-correction factor was found to be in satisfactory agreement with the theoretically predicted values, thus verifying the soundness and sufficiency of the theoretical analysis. The establishment of a considerable velocity distortion, in the nature of a unique blocking effect, constitutes a principal result of the investigation.

  14. Use of IMS data and its potential for research through global noble gases concentration maps

    NASA Astrophysics Data System (ADS)

    Terzi, Lucrezia; Kalinowski, Martin; Gueibe, Christophe; Camps, Johan; Gheddou, Abdelhakim; Kusmierczyk-Michulec, Jolanta; Schoeppner, Michael

    2017-04-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT) established, for verification purposes, a global monitoring system for atmospheric radioisotopes and noble gas radioactivity. Daily activity concentrations have been collected worldwide for over 15 years, providing unique data sets with long-term time series that can be used for analysis of atmospheric circulation dynamics. In this study, we emphasize the value of worldwide noble gas data by reconstructing global xenon concentration maps and comparing these observations with ATM simulations. By creating a residual plot, we can improve our understanding of the source estimation level for each region.

  15. Building on Our Teaching Assets: The Unique Pedagogical Contributions of Bilingual Educators

    ERIC Educational Resources Information Center

    Hopkins, Megan

    2013-01-01

    This article examines the unique contributions that bilingual and bilingually credentialed teachers make to the instruction of emergent bilinguals in the United States. This mixed methodological study involved 474 teachers in Arizona, California, and Texas, which represent distinct language policy contexts. Results revealed that, irrespective of…

  16. Transforming Teaching Challenges into Learning Opportunities: Interdisciplinary Reflective Collaboration

    ERIC Educational Resources Information Center

    Callaghan, Ronel

    2015-01-01

    Teaching in higher education poses unique sets of challenges, especially for academics in the engineering, built sciences and information science education disciplines. This article focuses on how reflective collaboration can support academics in their quest to find unique solutions to challenges in different academic contexts. A reflective…

  17. Context-Dependent Learning in People With Parkinson's Disease.

    PubMed

    Lee, Ya-Yun; Winstein, Carolee J; Gordon, James; Petzinger, Giselle M; Zelinski, Elizabeth M; Fisher, Beth E

    2016-01-01

    Context-dependent learning is a phenomenon in which people demonstrate superior performance in the context in which they originally learned a skill but perform less well in a novel context. This study investigated context-dependent learning in people with Parkinson's disease (PD) and age-matched nondisabled adults. All participants practiced 3 finger sequences, each embedded within a unique context (colors and locations on a computer screen). One day after practice, the participants were tested under conditions in which the sequence-context associations either remained the same as during practice or were changed (SWITCH). Compared with nondisabled adults, people with PD demonstrated significantly greater decrement in performance (especially movement time) under the SWITCH condition, suggesting that individuals with PD are more context dependent than nondisabled adults.

  18. Monte Carlo modeling of HD120 multileaf collimator on Varian TrueBeam linear accelerator for verification of 6X and 6X FFF VMAT SABR treatment plans

    PubMed Central

    Gete, Ermias; Duzenli, Cheryl; Teke, Tony

    2014-01-01

    A Monte Carlo (MC) validation of the vendor-supplied Varian TrueBeam 6 MV flattened (6X) phase-space file and the first implementation of the Siebers-Keall MC MLC model as applied to the HD120 MLC (for 6X flat and 6X flattening filter-free (6X FFF) beams) are described. The MC model is validated in the context of VMAT patient-specific quality assurance. The Monte Carlo commissioning process involves: 1) validating the calculated open-field percentage depth doses (PDDs), profiles, and output factors (OFs); 2) adapting the Siebers-Keall MLC model to match the new HD120-MLC geometry and material composition; 3) determining the absolute dose conversion factor for the MC calculation; and 4) validating this entire linac/MLC model in the context of dose calculation verification for clinical VMAT plans. MC PDDs for the 6X beams agree with the measured data to within 2.0% for field sizes ranging from 2 × 2 to 40 × 40 cm². Measured and MC profiles show agreement in the 50% field width and the 80%-20% penumbra region to within 1.3 mm for all square field sizes. MC OFs for square fields from 2 × 2 to 40 × 40 cm² agree with measurement to within 1.6%. Verification of VMAT SABR lung, liver, and vertebra plans demonstrates that measured and MC ion chamber doses agree within 0.6% for the 6X beam and within 2.0% for the 6X FFF beam. A 3D gamma factor analysis demonstrates that for the 6X beam, > 99% of voxels meet the pass criterion (3%/3 mm). For the 6X FFF beam, > 94% of voxels meet this criterion. The TrueBeam accelerator delivering 6X and 6X FFF beams with the HD120 MLC can be modeled in Monte Carlo to provide an independent 3D dose calculation for clinical VMAT plans. This quality assurance tool has been used clinically to verify over 140 6X and 16 6X FFF TrueBeam treatment plans. PACS number: 87.55.K- PMID:24892341
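    The gamma analysis used for such comparisons can be sketched in one dimension (clinical tools work in 3D, and this toy uses global normalization; it is not the paper's implementation): each reference point passes if some evaluated point is simultaneously close in dose (3%) and in distance (3 mm).

```python
# Sketch of a 1D gamma analysis (3%/3 mm) comparing two dose profiles
# sampled on a common grid with the given spacing in mm.

def gamma_pass_rate(ref, eval_, spacing_mm, dose_tol=0.03, dist_mm=3.0):
    """Fraction of reference points with gamma <= 1
    (global dose normalization to max(ref))."""
    dmax = max(ref)
    passed = 0
    for i, d_ref in enumerate(ref):
        best = float('inf')  # minimum squared gamma over evaluated points
        for j, d_eval in enumerate(eval_):
            dd = (d_eval - d_ref) / (dose_tol * dmax)   # dose difference term
            dx = (j - i) * spacing_mm / dist_mm          # distance term
            best = min(best, dd * dd + dx * dx)
        if best <= 1.0:
            passed += 1
    return passed / len(ref)

# Identical profiles pass everywhere:
profile = [0.2, 0.5, 1.0, 0.5, 0.2]
print(gamma_pass_rate(profile, profile, spacing_mm=1.0))  # 1.0
```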

  19. Apollo Soyuz Test Project Weights and Mass Properties Operational Management System

    NASA Technical Reports Server (NTRS)

    Collins, M. A., Jr.; Hischke, E. R.

    1975-01-01

    The Apollo Soyuz Test Project (ASTP) Weights and Mass Properties Operational Management System was established to assure a timely and authoritative method of acquiring, controlling, generating, and disseminating an official set of vehicle weights and mass properties data. This paper provides an overview of the system and its interaction with the various aspects of vehicle and component design, mission planning, hardware and software simulations and verification, and real-time mission support activities. The effect of vehicle configuration, design maturity, and consumables updates is discussed in the context of weight control.

  20. "Context-Specific" Teacher Preparation for New York City: An Exploration of the Content of Context in Bard College's Urban Teacher Residency Program

    ERIC Educational Resources Information Center

    Hammerness, Karen; Craig, Elizabeth

    2016-01-01

    In this article, we examine a residency program that was developed to prepare teachers specifically for New York City schools--the Bard College Master of Arts in Teaching Urban Teacher Residency program. This focused preparation on the particular urban context of New York City provides us with a unique opportunity to examine the nature of…

  1. Self-management support at the end of life: Patients', carers' and professionals' perspectives on managing medicines.

    PubMed

    Campling, N; Richardson, A; Mulvey, M; Bennett, M; Johnston, B; Latter, S

    2017-11-01

    Pain is a symptom frequently reported by patients approaching the end of life, and it is well established that patients and carers hold fears relating to opioids and experience side effects related to their use. The management of medicines is intrinsic to achieving effective pain relief. The concept of self-management support, whilst well characterised in the context of chronic illness, has not been elaborated with respect to end of life care. To identify patient, carer and professional views on the concept of self-management support at end of life, specifically in relation to analgesia and related medicines (for side-effect management), in order to describe, characterise and explain self-management support in this context. Qualitative design; data collection methods involved focus groups and interviews. Topics included the meaning of self-management support in this context, roles and behaviours adopted to manage pain-related medicines, and factors that influence these. A largely deductive approach was used, involving verification and validation of key frameworks from the literature, but with capacity for new findings to emerge. Participants were drawn from two different localities in England, one North, the other South. Interviews with patients and carers took place in their own homes, and focus groups with healthcare professionals were held at local hospices. 38 individuals participated: 15 patients in the last year of life and 4 carers under the care of community-based specialist palliative care services, and 19 specialist palliative care health professionals (predominantly community palliative care nurses). The concept of self-management support had salience for patients, carers and specialist nurses, alongside some unique features specific to the end of life context. Specifically, self-management was identified as an ever-changing process enacted along a continuum of behaviours, fluctuating from full to no engagement. Disease progression and frequent changes in symptoms and side effects led to a complex web of roles and behaviours, varying day by day, if not hour by hour. Data confirmed that previously proposed professional roles were enacted to support self-management. Furthermore, as patients, carers and clinical nurse specialists worked together to achieve effective pain management, they enacted and interacted in the roles of advocate, educator, facilitator, problem solver, communicator, goal setter, monitor and reporter. The study has demonstrated what self-management support at end of life entails and how it is enacted in practice. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Paediatric Palliative Care and Intellectual Disability--A Unique Context

    ERIC Educational Resources Information Center

    Duc, Jacqueline K.; Herbert, Anthony Robert; Heussler, Helen S.

    2017-01-01

    Background: Paediatric palliative care is a nuanced area of practice with additional complexities in the context of intellectual disability. There is currently minimal research to guide clinicians working in this challenging area of care. Method: This study describes the complex care of children with life-limiting conditions and intellectual…

  3. Maternal Sensitivity and Child Responsiveness: Associations with Social Context, Maternal Characteristics, and Child Characteristics in a Multivariate Analysis

    ERIC Educational Resources Information Center

    Bornstein, Marc H.; Hendricks, Charlene; Haynes, O. Maurice; Painter, Kathleen M.

    2007-01-01

    This study examined unique associations of multiple distal context variables (family socioeconomic status [SES], maternal employment, and paternal parenting) and proximal maternal (personality, intelligence, and knowledge; behavior, self-perceptions, and attributions) and child (age, gender, representation, language, and sociability)…

  4. Role of Communication in the Context of Educating Children with Attention-Deficit/Hyperactivity Disorder: Parents' and Teachers' Perspectives

    ERIC Educational Resources Information Center

    Koro-Ljungberg, Mirka; Bussing, Regina; Wilder, JeffriAnne; Gary, Faye

    2011-01-01

    Recent school policies increasingly support "parent-integrated" school environments, which benefit from effective parent-school collaborations and strong communication skills to ensure optimal educational outcomes. However, invisible disabilities, such as attention-deficit/hyperactivity disorder, provide unique sociopolitical contexts that shape…

  5. Establishing Quality Assurance in the South African Context

    ERIC Educational Resources Information Center

    Strydom, A. H.; Strydom, J. F.

    2004-01-01

    This paper provides perspectives on the unique challenges and opportunities facing the national auditing and accreditation system in South African higher education. In doing so, the quality assurance contexts of developed countries, Africa and South Africa are considered and the issues of uncertainty and conformity are highlighted. This is…

  6. Verification of consumers' experiences and perceptions of genetic discrimination and its impact on utilization of genetic testing.

    PubMed

    Barlow-Stewart, Kristine; Taylor, Sandra D; Treloar, Susan A; Stranger, Mark; Otlowski, Margaret

    2009-03-01

    To undertake a systematic process of verification of consumer accounts of alleged genetic discrimination. Verification of incidents reported in life insurance and other contexts that met the criteria of genetic discrimination, and of the impact of fear of such treatment, was undertaken, with consent, through interview, document analysis and, where appropriate, direct contact with the third party involved. The process comprised obtaining evidence that the alleged incident was accurately reported and determining whether the decision or action seemed to be justifiable and/or ethical. Reported incidents of genetic discrimination were verified in life insurance access, underwriting and coercion (9), in applications for worker's compensation (1) and early release from prison (1), and in two cases of fear of discrimination affecting access to genetic testing. Relevant conditions were inherited cancer susceptibility (8), Huntington disease (3), hereditary hemochromatosis (1), and polycystic kidney disease (1). In two cases, the reversal of an adverse underwriting decision to a standard rate, after intervention with insurers by genetics health professionals, was verified. The mismatch between consumer and third-party accounts in three life insurance incidents involved miscommunication or a lack of information provision by financial advisers. These first cases of verified genetic discrimination make it essential for policies and guidelines to be developed and implemented to ensure appropriate use of genetic test results in insurance underwriting, to promote education and training in the financial industry, and to provide support for consumers and health professionals undertaking challenges of adverse decisions.

  7. Differentiation of Illusory and True Halo in Writing Scores

    ERIC Educational Resources Information Center

    Lai, Emily R.; Wolfe, Edward W.; Vickers, Daisy

    2015-01-01

    This report summarizes an empirical study that addresses two related topics within the context of writing assessment--illusory halo and how much unique information is provided by multiple analytic scores. Specifically, we address the issue of whether unique information is provided by analytic scores assigned to student writing, beyond what is…

  8. X-ray Cryogenic Facility (XRCF) Handbook

    NASA Technical Reports Server (NTRS)

    Kegley, Jeffrey R.

    2016-01-01

    The X-ray & Cryogenic Facility (XRCF) Handbook is a guide for planning operations at the facility. A summary of the capabilities, policies, and procedures is provided to enhance project coordination between the facility user and XRCF personnel. This handbook includes basic information that will enable the XRCF to effectively plan and support test activities. In addition, this handbook describes the facilities and systems available at the XRCF for supporting test operations. The XRCF was built in 1989 to meet the stringent requirements associated with calibration of X-ray optics, instruments, and telescopes, and was subsequently modified in 1999 and 2005 to perform the challenging cryogenic verification of ultraviolet, optical, and infrared mirrors. These unique and premier specialty capabilities, coupled with its ability to meet multiple generic thermal vacuum test requirements for large payloads, make the XRCF the most versatile and adaptable space environmental test facility in the Agency. XRCF is also recognized as the newest, most cost-effective, most highly utilized facility in the portfolio and as one of only five NASA facilities having unique capabilities. The XRCF is capable of supporting, and has supported, missions during all phases from technology development to flight verification. Programs/projects that have benefited from XRCF include Chandra, Solar X-ray Imager, Hinode, and the James Webb Space Telescope. All test programs have been completed on schedule and within budget and have experienced no delays due to facility readiness or failures. XRCF is currently supporting Strategic Astrophysics Technology Development for Cosmic Origins. Throughout the years, XRCF has partnered with and continues to maintain positive working relationships with organizations such as ATK, Ball Aerospace, Northrop Grumman Aerospace, Excelis (formerly Kodak/ITT), Smithsonian Astrophysical Observatory, Goddard Space Flight Center, University of Alabama Huntsville, and more.

  9. The Explosive Universe with Gaia

    NASA Astrophysics Data System (ADS)

    Wyrzykowski, Łukasz; Hodgkin, Simon T.; Blagorodnova, Nadejda; Belokurov, Vasily

    2014-01-01

    The Gaia mission will observe the entire sky for 5 years, providing ultra-precise astrometric, photometric and spectroscopic measurements for a billion stars in the Galaxy. Hence Gaia naturally becomes an all-sky multi-epoch photometric survey, which will monitor and detect variability with millimag precision as well as new transient sources such as supernovae, novae, microlensing events, tidal disruption events and asteroids, among others. The Gaia data flow allows for quick detection of anomalies within 24-48 h of observation. Such a near-real-time survey will be able to detect about 6000 supernovae brighter than 19 mag up to redshifts of z ≈ 0.15. The on-board low-resolution (R ≈ 100) spectrograph will allow for early and robust classification of transients and will minimise the false-alert rate, even providing redshift estimates for supernovae. Gaia will also offer a unique possibility for detecting astrometric shifts in microlensing events, which, combined with Gaia's and ground-based photometry, will provide unique mass measurements of lenses, constraints on the dark matter content of the Milky Way and possible detections of free-floating black holes. Alerts from Gaia will be publicly available soon after each detection is verified and tested. First alerts are expected early in 2014 and will be used for ground-based verification. All facilities are invited to join the verification and follow-up effort. Alerts will be published on a web page, via Skyalert.org and via a mailing list. Each alert will contain coordinates, the Gaia light curve and low-resolution spectra, classification and cross-matching results. More information on the Gaia Science Alerts can be found here: http://www.ast.cam.ac.uk/ioa/wikis/gsawgwiki/ The full version of the poster is available here: http://www.ast.cam.ac.uk/ioa/wikis/gsawgwiki/images/1/13/GaiaAlertsPosterIAUS298.pdf

  10. Management of the JWST MIRI pFM environmental and performance verification test campaign

    NASA Astrophysics Data System (ADS)

    Eccleston, Paul; Glasse, Alistair; Grundy, Timothy; Detre, Örs Hunor; O'Sullivan, Brian; Shaughnessy, Bryan; Sykes, Jon; Thatcher, John; Walker, Helen; Wells, Martyn; Wright, Gillian; Wright, David

    2012-09-01

    The Mid-Infrared Instrument (MIRI) is one of four scientific instruments on the James Webb Space Telescope (JWST) observatory, scheduled for launch in 2018. It will provide unique capabilities to probe the distant or deeply dust-enshrouded regions of the Universe, investigating the history of star and planet formation from the earliest universe to the present day. To enable this, the instrument's optical module must be cooled below 7 K, presenting specific challenges for the environmental testing and calibration activities. The assembly, integration and verification (AIV) activities for the proto-flight model (pFM) instrument ran from March 2010 to May 2012 at RAL, where the instrument was put through a full suite of environmental and performance tests with a non-conventional single cryo-test approach. In this paper we present an overview of the testing conducted on the MIRI pFM, including ambient alignment testing, vibration testing, gravity release testing, cryogenic performance and calibration testing, functional testing at ambient and operational temperatures, thermal balance tests, and electromagnetic compatibility (EMC) testing. We discuss how tests were planned and managed to ensure that the whole AIV process remained on schedule, and give an insight into the lessons learned from this process. We also show how the process of requirement verification for this complex system was managed and documented. We describe how the risks associated with a single long-duration test at operating temperature were controlled so that the complete suite of environmental tests could be used to build up a full picture of instrument compliance.

  11. What is new about covered interest parity condition in the European Union? Evidence from fractal cross-correlation regressions

    NASA Astrophysics Data System (ADS)

    Ferreira, Paulo; Kristoufek, Ladislav

    2017-11-01

    We analyse the covered interest parity (CIP) using two novel regression frameworks based on cross-correlation analysis (detrended cross-correlation analysis and detrending moving-average cross-correlation analysis), which allow for studying the relationships at different scales and work well under non-stationarity and heavy tails. CIP is a measure of capital mobility commonly used to analyse financial integration, and it remains an interesting subject of study in the context of the European Union. Its importance relates to the fact that the adoption of a common currency brings certain benefits for countries but also involves risks, such as the loss of economic instruments to face possible asymmetric shocks. While studying the Eurozone members could explain some problems in the common currency, studying the non-Euro countries is important to assess whether they are fit to reap the possible benefits. Our results point to CIP being verified mainly in the Central European countries, while in the remaining countries the verification of the parity is only residual.
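    The detrended cross-correlation machinery behind such regressions can be illustrated with a compact sketch of the DCCA cross-correlation coefficient. This is a generic, textbook-style implementation under common conventions (linear detrending in overlapping windows), not the authors' code; function and parameter names are assumptions:

```python
import numpy as np

def dcca_coefficient(x, y, n):
    """Detrended cross-correlation coefficient at window size n (sketch).

    Both series are integrated (cumulative sum of deviations from the mean),
    linearly detrended in overlapping windows of length n+1, and the ratio
    of the detrended covariance to the product of detrended standard
    deviations is returned.
    """
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    X = np.cumsum(x - x.mean())   # integrated profiles
    Y = np.cumsum(y - y.mean())
    t = np.arange(n + 1)
    cov = varx = vary = 0.0
    for start in range(len(X) - n):
        xs = X[start:start + n + 1]
        ys = Y[start:start + n + 1]
        # residuals after removing the local linear trend in this window
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        cov += np.mean(rx * ry)
        varx += np.mean(rx * rx)
        vary += np.mean(ry * ry)
    return cov / np.sqrt(varx * vary)
```

    Computing the coefficient across a range of window sizes n is what gives the scale-dependent view of the relationship mentioned in the abstract.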

  12. Measurement of self-evaluative motives: a shopping scenario.

    PubMed

    Wajda, Theresa A; Kolbe, Richard; Hu, Michael Y; Cui, Annie Peng

    2008-08-01

    To develop measures of consumers' self-evaluative motives of Self-verification, Self-enhancement, and Self-improvement within the context of a mall shopping environment, an initial set of 49 items was generated by conducting three focus-group sessions. These items were subsequently converted into shopping-dependent motive statements. 250 undergraduate college students responded on a 7-point scale to each statement as it related to the acquisition of recent personal shopping goods. An exploratory factor analysis yielded five factors accounting for 57.7% of the variance, three of which corresponded to the Self-verification motive (five items), the Self-enhancement motive (three items), and the Self-improvement motive (six items). These 14 items, along with 9 reconstructed items, formed a set of 23 retained items that was subjected to additional testing. In a final round of data collection, 169 college students provided data for exploratory factor analysis, and 11 items were used in confirmatory factor analysis. Analysis indicated that the 11-item scale adequately captured measures of the three self-evaluative motives; however, further data reduction produced a 9-item scale with marked improvement in statistical fit over the 11-item scale.

  13. A model-based design and validation approach with OMEGA-UML and the IF toolset

    NASA Astrophysics Data System (ADS)

    Ben-hafaiedh, Imene; Constant, Olivier; Graf, Susanne; Robbana, Riadh

    2009-03-01

    Intelligent embedded systems such as autonomous robots and other industrial systems are becoming increasingly heterogeneous with respect to the platforms on which they are implemented, and their software architectures are thus more complex to design and analyse. In this context, it is important to have well-defined design methodologies supported by (1) high-level design concepts that allow designers to master the design complexity, (2) concepts for the expression of non-functional requirements, and (3) analysis tools that can verify or invalidate whether the system under development will be able to conform to its requirements. We illustrate such an approach for the design of complex embedded systems using a small case study as a running example for illustration purposes. We briefly present the important concepts of the OMEGA-RT UML profile, show how we use this profile in a modelling approach, and explain how these concepts are used in the IFx verification toolbox to integrate validation into the design flow and make scalable verification possible.

  14. Mashup Model and Verification Using Mashup Processing Network

    NASA Astrophysics Data System (ADS)

    Zahoor, Ehtesham; Perrin, Olivier; Godart, Claude

    Mashups are defined as lightweight Web applications that aggregate data from different Web services, are built using ad hoc composition, and are not concerned with long-term stability and robustness. In this paper we present a pattern-based approach, called Mashup Processing Network (MPN). The idea is based on the Event Processing Network and is intended to facilitate the creation, modeling and verification of mashups. MPN provides a view of how different actors interact in mashup development, namely the producer, the consumer, the mashup processing agent and the communication channels. It also supports modeling transformations and validations of data, and offers validation of both functional and non-functional requirements, such as reliable messaging and security, that are key issues within the enterprise context. We have enriched the model with a set of processing operations, categorized into data composition, transformation and validation. These processing operations can be seen as a set of patterns for facilitating the mashup development process. MPN also paves the way for realizing a Mashup Oriented Architecture, where mashups along with services are used as building blocks for application development.
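    The three categories of processing operations named in the abstract (composition, transformation, validation) can be sketched as a toy pipeline. Everything below, including the feed contents, function names and the temperature example, is hypothetical and only illustrates the general pattern, not the paper's actual model:

```python
# Hypothetical sketch of an MPN-style pipeline: producers feed events through
# processing agents (composition -> transformation -> validation) to a consumer.

def compose(*feeds):
    """Data composition: merge several source feeds into one event stream."""
    return [event for feed in feeds for event in feed]

def transform(events, fn):
    """Data transformation: map each event through a transformation function."""
    return [fn(e) for e in events]

def validate(events, predicate):
    """Data validation: keep only events satisfying the predicate."""
    return [e for e in events if predicate(e)]

# Two hypothetical producer feeds of temperature readings (Fahrenheit)
feed_a = [{"src": "A", "temp_f": 68}, {"src": "A", "temp_f": 200}]
feed_b = [{"src": "B", "temp_f": 75}]

events = compose(feed_a, feed_b)
events = transform(events, lambda e: {**e, "temp_c": round((e["temp_f"] - 32) * 5 / 9)})
events = validate(events, lambda e: -40 <= e["temp_c"] <= 60)  # drop implausible readings

print([e["temp_c"] for e in events])  # → [20, 24]
```

    In MPN terms, each function plays the role of a processing agent, and the list passed between them stands in for a communication channel.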

  15. The species translation challenge—A systems biology perspective on human and rat bronchial epithelial cells

    PubMed Central

    Poussin, Carine; Mathis, Carole; Alexopoulos, Leonidas G; Messinis, Dimitris E; Dulize, Rémi H J; Belcastro, Vincenzo; Melas, Ioannis N; Sakellaropoulos, Theodore; Rhrissorrakrai, Kahn; Bilal, Erhan; Meyer, Pablo; Talikka, Marja; Boué, Stéphanie; Norel, Raquel; Rice, John J; Stolovitzky, Gustavo; Ivanov, Nikolai V; Peitsch, Manuel C; Hoeng, Julia

    2014-01-01

    The biological response to external cues such as drugs, chemicals, viruses and hormones is an essential question in biomedicine and in the field of toxicology, and cannot be easily studied in humans. Thus, biomedical research has continuously relied on animal models for studying the impact of these compounds and attempted to ‘translate’ the results to humans. In this context, the SBV IMPROVER (Systems Biology Verification for Industrial Methodology for PROcess VErification in Research) collaborative initiative, which uses crowd-sourcing techniques to address fundamental questions in systems biology, invited scientists to deploy their own computational methodologies to make predictions on species translatability. A multi-layer systems biology dataset was generated, comprising phosphoproteomics, transcriptomics and cytokine data derived from normal human (NHBE) and rat (NRBE) bronchial epithelial cells exposed in parallel to more than 50 different stimuli under identical conditions. The present manuscript describes in detail the experimental settings, generation, processing and quality control analysis of this multi-layer omics dataset, which is accessible in public repositories for further intra- and inter-species translation studies. PMID:25977767

  16. The species translation challenge-a systems biology perspective on human and rat bronchial epithelial cells.

    PubMed

    Poussin, Carine; Mathis, Carole; Alexopoulos, Leonidas G; Messinis, Dimitris E; Dulize, Rémi H J; Belcastro, Vincenzo; Melas, Ioannis N; Sakellaropoulos, Theodore; Rhrissorrakrai, Kahn; Bilal, Erhan; Meyer, Pablo; Talikka, Marja; Boué, Stéphanie; Norel, Raquel; Rice, John J; Stolovitzky, Gustavo; Ivanov, Nikolai V; Peitsch, Manuel C; Hoeng, Julia

    2014-01-01

    The biological response to external cues such as drugs, chemicals, viruses and hormones is an essential question in biomedicine and in the field of toxicology, and cannot be easily studied in humans. Thus, biomedical research has continuously relied on animal models for studying the impact of these compounds and attempted to 'translate' the results to humans. In this context, the SBV IMPROVER (Systems Biology Verification for Industrial Methodology for PROcess VErification in Research) collaborative initiative, which uses crowd-sourcing techniques to address fundamental questions in systems biology, invited scientists to deploy their own computational methodologies to make predictions on species translatability. A multi-layer systems biology dataset was generated, comprising phosphoproteomics, transcriptomics and cytokine data derived from normal human (NHBE) and rat (NRBE) bronchial epithelial cells exposed in parallel to more than 50 different stimuli under identical conditions. The present manuscript describes in detail the experimental settings, generation, processing and quality control analysis of this multi-layer omics dataset, which is accessible in public repositories for further intra- and inter-species translation studies.

  17. 30 CFR 250.913 - When must I resubmit Platform Verification Program plans?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication verification, or installation verification...

  18. School Psychology in Rural Contexts: Ethical, Professional, and Legal Issues

    ERIC Educational Resources Information Center

    Edwards, Lynn M.; Sullivan, Amanda L.

    2014-01-01

    Delivering psychological services in rural communities presents a number of unique challenges for practitioners relative to their peers in urban and suburban communities. In this article, the authors describe the current context of rural schools and examine the ethical and legal issues school psychologists may face when practicing in rural…

  19. Key Educational Factors in the Education of Students with a Medical Condition

    ERIC Educational Resources Information Center

    Capurso, Michele; Dennis, John L.

    2017-01-01

    The education of children with a medical condition represents a unique educational context. The key educational factors that can help these children continue their education despite the burdens associated with their illness were discussed and analysed by a pool of experts for an EU funded project. In this context, "relationships,"…

  20. Managers as Writers: A Metanalysis of Research in Context.

    ERIC Educational Resources Information Center

    Smeltzer, Larry R.; Thomas, Gail Fann

    1994-01-01

    Argues that managers write within a unique context, and, thus, much of what is known about writing in general or professional writing may not apply. Reviews the literature on managerial writing, finding a paucity of research and a heavy emphasis on survey methodology. Offers six general research questions for future research. (SR)

  1. Text and Context: "The Passion of the Christ" and Other Jesus Films

    ERIC Educational Resources Information Center

    Gilmour, Peter

    2005-01-01

    This article approaches the immense popularity of Mel Gibson's 2004 film, "The Passion of the Christ" as a significant artifact in the contemporary public, cultural curriculum, and a unique opportunity for religious educators to build on its notoriety. Five interrelated contexts are identified and explored to assist religious educators more deeply…

  2. A Comparison of Adolescents' Friendship Networks by Advanced Coursework Participation Status

    ERIC Educational Resources Information Center

    Barber, Carolyn; Wasson, Jillian Woodford

    2015-01-01

    Friendships serve as a source of support and as a context for developing social competence. Although advanced coursework may provide a unique context for the development of friendships, more research is needed to explore exactly what differences exist. Using the National Longitudinal Study of Adolescent Health and the Adolescent Health and…

  3. Reading and Spelling Skills in German Third Graders: Examining the Role of Student and Context Characteristics

    ERIC Educational Resources Information Center

    Suchodoletz, Antje; Larsen, Ross A. A.; Gunzenhauser, Catherine; Fäsche, Anika

    2015-01-01

    Background: Educational processes and outcomes are influenced by a multitude of factors, including individual and contextual characteristics. Recently, studies have demonstrated that student and context characteristics may produce unique and cumulative effects on educational outcomes. Aims: The study aimed to investigate (1) the relative…

  4. Early Childhood Educators Attitudes towards Playful Aggression among Boys: Exploring the Importance of Situational Context

    ERIC Educational Resources Information Center

    Hart, Jennifer L.

    2016-01-01

    The current study investigates the influence of situational context on perceptions of playful aggression. Using an online data collection instrument embedded with video vignettes showing young boys engaged in aggressive play behaviour, 36 situational profiles that are defined by the unique combinations of variables believed to influence attitudes…

  5. The Environmental Context of Patient Safety and Medical Errors

    ERIC Educational Resources Information Center

    Wholey, Douglas; Moscovice, Ira; Hietpas, Terry; Holtzman, Jeremy

    2004-01-01

    The environmental context of patient safety and medical errors was explored with specific interest in rural settings. Special attention was paid to unique features of rural health care organizations and their environment that relate to the patient safety issue and medical errors (including the distribution of patients, types of adverse events…

  6. Children's Activity Levels and Lesson Context during Summer Swim Instruction

    ERIC Educational Resources Information Center

    Schwamberger, Benjamin; Wahl-Alexander, Zachary

    2016-01-01

    Summer swim programs provide a unique opportunity to engage children in PA as well as an important lifesaving skill. Offering summer swim programs is critical, especially for minority populations who tend to have higher rates of drowning, specifically in youth populations. The purpose of this study was to determine the lesson context and…

  7. Leading Up in the Scholarship of Teaching and Learning

    ERIC Educational Resources Information Center

    Miller-Young, Janice E.; Anderson, Catherine; Kiceniuk, Deborah; Mooney, Julie; Riddell, Jessica; Schmidt Hanbidge, Alice; Ward, Veronica; Wideman, Maureen A.; Chick, Nancy

    2017-01-01

    Scholarship of teaching and learning (SoTL) scholars, including those who are not in formal positions of leadership, are uniquely positioned to engage in leadership activities that can grow the field, influence their colleagues, and effect change in their local contexts as well as in institutional, disciplinary, and the broader Canadian contexts.…

  8. But Does It Work? Reflective Activities, Learning Outcomes and Instrumental Learning in Continuing Professional Development

    ERIC Educational Resources Information Center

    Roessger, Kevin M.

    2015-01-01

    This paper examines the relationship between reflective practice and instrumental learning within the context of continuing professional development (CPD). It is argued that instrumental learning is a unique process of adult learning, and reflective practice's impact on learning outcomes in instrumental learning contexts remains unclear. A…

  9. A Survey on the Feasibility of Sound Classification on Wireless Sensor Nodes

    PubMed Central

    Salomons, Etto L.; Havinga, Paul J. M.

    2015-01-01

    Wireless sensor networks are well suited to gaining context awareness in indoor environments. As sound waves form a rich source of context information, equipping the nodes with microphones can be of great benefit. However, the algorithms used to extract features from sound waves are often highly computationally intensive, which can be problematic as wireless nodes are usually restricted in resources. In order to make a proper decision about which features to use, we survey how sound is used in the literature for global sound classification, age and gender classification, emotion recognition, person verification and identification, and indoor and outdoor environmental sound classification. The results of the surveyed algorithms are compared with respect to accuracy and computational load. The accuracies are taken from the surveyed papers; the computational loads are determined by benchmarking the algorithms on an actual sensor node. We conclude that for indoor context awareness, the low-cost algorithms for feature extraction perform as well as the more computationally intensive variants. As feature extraction still requires a large amount of processing time, we present four possible strategies to deal with this problem. PMID:25822142
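    As an illustration of the "low-cost feature" end of the spectrum discussed above, here is a sketch of two cheap per-frame audio features, short-time energy and zero-crossing rate. The feature set, frame length and hop size are illustrative assumptions, not the survey's benchmark configuration:

```python
import numpy as np

def frame_features(signal, frame_len=256, hop=128):
    """Per-frame short-time energy and zero-crossing rate (sketch).

    Both features are cheap to compute, which makes them candidates for
    resource-constrained sensor nodes. Returns a list of (energy, zcr)
    tuples, one per frame.
    """
    feats = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        energy = float(np.mean(frame ** 2))
        # zero-crossing rate: fraction of adjacent samples whose sign differs
        zcr = float(np.mean(np.abs(np.diff(np.signbit(frame).astype(int)))))
        feats.append((energy, zcr))
    return feats
```

    A higher-pitched sound yields a higher zero-crossing rate at the same energy, which is why even these trivial features already separate some sound classes.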

  10. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  11. Distributed Engine Control Empirical/Analytical Verification Tools

    NASA Technical Reports Server (NTRS)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of surviving the harsh engine operating environment while providing decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to easily assemble a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration of all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating distributed engine control system (DCS) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and hardware simulator provides the capability to simulate virtual subcomponents, as well as to swap in actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the C-MAPSS engine simulator and controlled through Simulink.
The distributed engine control simulation, evaluation, and analysis technology provides unique capabilities to study the effects of a given change to the control system in the context of the distributed paradigm. The simulation tool can support treatment of all components within the control system, both virtual and real; these include the communication data network, smart sensor and actuator nodes, the centralized control system (the FADEC, or full-authority digital engine control), and the aircraft engine itself. The DECsim tool allows simulation-based prototyping of control laws, control architectures, and decentralization strategies before hardware is integrated into the system. With the configuration specified, the simulator allows a variety of key factors to be systematically assessed, including control system performance, reliability, weight, and bandwidth utilization.

  12. Unlocking Women's Leadership Potential: A Curricular Example for Developing Female Leaders in Academia

    ERIC Educational Resources Information Center

    Knipfer, Kristin; Shaughnessy, Brooke; Hentschel, Tanja; Schmid, Ellen

    2017-01-01

    Women in academia face unique challenges when it comes to advancing to professorship. Using latest research about gender and academic leadership, we present a training curriculum that is sensitive to the unique demands of women in and aspiring to leadership positions in academia. The context-specific and evidence-based approach and a focus on…

  13. The Emergence of Hybrid Role Conflict in Conflicting Settings: A Unique Challenge for School Leaders

    ERIC Educational Resources Information Center

    Nir, Adam E.

    2011-01-01

    To what extent do divided cities characterized by geopolitical conflicts and a variety of contradictory expectations create a distinctive context and a unique professional conflict for individuals holding boundary-spanning roles? Data collected in a set of in-depth interviews conducted with school principals leading Arab schools located in East…

  14. The role of perceived well-being in the family, school and peer context in adolescents' subjective health complaints: evidence from a Greek cross-sectional study.

    PubMed

    Petanidou, Dimitra; Daskagianni, Evangelie; Dimitrakaki, Christine; Kolaitis, Gerasimos; Tountas, Yannis

    2013-11-28

    During adolescence children are usually confronted with an expanding social arena. Apart from families, schools and neighbourhoods, peers, classmates, teachers, and other adult figures gain increasing importance for adolescent socio-emotional adjustment. The aim of the present study was to investigate the extent to which Greek adolescents' perceived well-being in three main social contexts (family, school and peers) predicted self-reported Subjective Health Complaints. Questionnaires were administered to a Greek nation-wide, random, school-based sample of children aged 12-18 years in 2003. Data from 1,087 adolescents were analyzed. A hierarchical regression model with Subjective Health Complaints as the outcome variable was employed in order to i) control for the effects of previously well-established demographic factors (sex, age and subjective economic status) and ii) identify the unique proportion of variance attributed to each context. Bivariate correlations and multicollinearity were also explored. As hypothesized, adolescents' perceived well-being in each of the three social contexts appeared to hold unique proportions of variance in self-reported Subjective Health Complaints, after controlling for the effects of sex, age and subjective economic status. In addition, our final model confirmed that the explained variance in SHC was accumulated from each social context studied. The regression models were statistically significant and explained a total of approximately 24% of the variance in Subjective Health Complaints. Our study delineated the unique and cumulative contributions of adolescents' perceived well-being in the family, school and peer setting in the explanation of Subjective Health Complaints. Apart from families, schools, teachers and peers appear to have a salient role in adolescent psychosomatic adjustment. 
A thorough understanding of the relationship between adolescents' Subjective Health Complaints and perceived well-being in their social contexts could not only lead to more effective tailored initiatives, but also promote a multi- and inter-disciplinary culture in adolescent psychosomatic health.
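    The study's hierarchical regression strategy, entering demographic controls first and then attributing the incremental R² to each added block, can be sketched as follows. This is an illustrative reconstruction on synthetic data, not the study's actual variables or results; the variable names and coefficients are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Synthetic stand-ins for the study's blocks (illustrative only):
# a demographics block, a "context" block, and an outcome.
demo = rng.normal(size=(n, 3))        # e.g. sex, age, economic status
context = rng.normal(size=(n, 2))     # e.g. family and school well-being
y = demo @ [0.2, 0.1, 0.3] + context @ [0.5, 0.4] + rng.normal(size=n)

def r_squared(X, y):
    """R^2 of an ordinary least squares fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

r2_demo = r_squared(demo, y)                              # step 1: controls
r2_full = r_squared(np.column_stack([demo, context]), y)  # step 2: + context
delta_r2 = r2_full - r2_demo  # variance uniquely attributed to the block
```

    The increment `delta_r2` is what a hierarchical model reports as the unique proportion of variance for the newly entered block, after the earlier blocks have been controlled for.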

  15. The role of perceived well-being in the family, school and peer context in adolescents’ subjective health complaints: evidence from a Greek cross-sectional study

    PubMed Central

    2013-01-01

    Background During adolescence children are usually confronted with an expanding social arena. Apart from families, schools and neighbourhoods, peers, classmates, teachers, and other adult figures gain increasing importance for adolescent socio-emotional adjustment. The aim of the present study was to investigate the extent to which Greek adolescents’ perceived well-being in three main social contexts (family, school and peers) predicted self-reported Subjective Health Complaints. Methods Questionnaires were administered to a Greek nation-wide, random, school-based sample of children aged 12–18 years in 2003. Data from 1,087 adolescents were analyzed. A hierarchical regression model with Subjective Health Complaints as the outcome variable was employed in order to i) control for the effects of previously well-established demographic factors (sex, age and subjective economic status) and ii) identify the unique proportion of variance attributed to each context. Bivariate correlations and multicollinearity were also explored. Results As hypothesized, adolescents’ perceived well-being in each of the three social contexts appeared to hold unique proportions of variance in self-reported Subjective Health Complaints, after controlling for the effects of sex, age and subjective economic status. In addition, our final model confirmed that the explained variance in SHC was accumulated from each social context studied. The regression models were statistically significant and explained a total of approximately 24% of the variance in Subjective Health Complaints. Conclusions Our study delineated the unique and cumulative contributions of adolescents’ perceived well-being in the family, school and peer setting in the explanation of Subjective Health Complaints. Apart from families, schools, teachers and peers appear to have a salient role in adolescent psychosomatic adjustment. 
A thorough understanding of the relationship between adolescents’ Subjective Health Complaints and perceived well-being in their social contexts could not only lead to more effective tailored initiatives, but also promote a multi- and inter-disciplinary culture in adolescent psychosomatic health. PMID:24283390

  16. Quantum integrability and functional equations

    NASA Astrophysics Data System (ADS)

    Volin, Dmytro

    2010-03-01

    In this thesis a general procedure to represent the integral Bethe Ansatz equations in the form of a Riemann-Hilbert problem is given. This allows us to study integrable spin chains in the thermodynamic limit in a simple way. Based on the functional equations, we give a procedure for finding the subleading orders in the solution of various integral equations solved to leading order by the Wiener-Hopf technique. The integral equations are studied in the context of the AdS/CFT correspondence, where their solution allows verification of the integrability conjecture up to two loops of the strong coupling expansion. In the context of two-dimensional sigma models we analyze the large-order behavior of the asymptotic perturbative expansion. The experience gained with the functional representation of the integral equations also allowed us to solve explicitly the crossing equations that appear in the AdS/CFT spectral problem.

  17. International nurse migration: impacts on New Zealand.

    PubMed

    North, Nicola

    2007-08-01

    As a source and destination country, nurse flows in and out of New Zealand (NZ) are examined to determine impacts and regional contexts. A descriptive statistics method was used to analyze secondary data on nurses added to the register, NZ nurse qualifications verified by overseas authorities, nursing workforce data, and census data. The analysis found that international movement of nurses was minimal during the 1990s, but from 2001 a sharp jump in the verification of NZ-registered nurses (RNs) by overseas authorities coincided with an equivalent increase in international RNs (IRNs) added to the NZ nursing register, a pattern that has been sustained to the present. Movement of NZ RNs to Australia is expedited by the Trans-Tasman Agreement, whereas entry of IRNs to NZ is facilitated by nursing being an identified Priority Occupation. Future research needs to consider health system and nurse workforce contexts and take a regional perspective on migration patterns.

  18. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    At the SPACEHAB Facility, STS-96 Mission Specialist Ellen Ochoa and Commander Kent Rominger pause during a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. Other crew members at KSC for the IVT are Pilot Rick Husband and Mission Specialists Tamara Jernigan, Dan Barry, Julie Payette and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m. EDT.

  19. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, STS-96 Mission Specialist Julie Payette closes a container, part of the equipment to be carried on the SPACEHAB and mission STS-96. She and other crew members Commander Kent Rominger, Pilot Rick Husband, and Mission Specialists Ellen Ochoa, Tamara Jernigan, Dan Barry and Valery Tokarev of Russia are at KSC for a payload Interface Verification Test for the upcoming mission to the International Space Station. Mission STS-96 carries the SPACEHAB Logistics Double Module, which has equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  20. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Posing on the platform next to the SPACEHAB Logistics Double Module in the SPACEHAB Facility are the STS-96 crew (from left) Mission Specialists Dan Barry, Tamara Jernigan, Valery Tokarev of Russia, and Julie Payette; Pilot Rick Husband; Mission Specialist Ellen Ochoa; and Commander Kent Rominger. The crew is at KSC for a payload Interface Verification Test for their upcoming mission to the International Space Station. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  1. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    At the SPACEHAB Facility, STS-96 Mission Specialist Ellen Ochoa and Commander Kent Rominger smile for the camera during a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. Other crew members at KSC for the IVT are Pilot Rick Husband and Mission Specialists Tamara Jernigan, Dan Barry, Julie Payette and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m. EDT.

  2. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    During a payload Interface Verification Test (IVT) for the upcoming mission to the International Space Station, Chris Jaskolka of Boeing points out a piece of equipment in the SPACEHAB module to STS-96 Commander Kent Rominger, Mission Specialist Ellen Ochoa and Pilot Rick Husband. Other crew members visiting KSC for the IVT are Mission Specialists Tamara Jernigan, Dan Barry, Julie Payette and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m. EDT.

  3. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, STS-96 Mission Specialists Dan Barry and Tamara Jernigan discuss procedures during a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. Other STS-96 crew members at KSC for the IVT are Commander Kent Rominger, Pilot Rick Husband and Mission Specialists Ellen Ochoa, Julie Payette and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  4. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, James Behling, with Boeing, talks about equipment for mission STS-96 during a payload Interface Verification Test (IVT). Watching are (from left) Mission Specialists Ellen Ochoa, Julie Payette and Dan Barry, and Pilot Rick Husband. Other STS-96 crew members at KSC for the IVT are Commander Kent Rominger and Mission Specialists Tamara Jernigan and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  5. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    During a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station, STS-96 Mission Specialists Julie Payette, Dan Barry, and Valery Tokarev of Russia, look at a Sequential Shunt Unit in the SPACEHAB Facility. Other crew members at KSC for the IVT are Commander Kent Rominger, Pilot Rick Husband, and Mission Specialists Ellen Ochoa and Tamara Jernigan. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m. EDT.

  6. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility for a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station are (left to right) Mission Specialists Valery Tokarev, Julie Payette (holding a lithium hydroxide canister) and Dan Barry. Other crew members at KSC for the IVT are Commander Kent Rominger, Pilot Rick Husband and Mission Specialists Ellen Ochoa and Tamara Jernigan. Mission STS-96 carries the SPACEHAB Logistics Double Module, which has equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  7. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, the STS-96 crew looks over equipment during a payload Interface Verification Test for the upcoming mission to the International Space Station. From left are Commander Kent Rominger, Mission Specialists Tamara Jernigan and Valery Tokarev of Russia, Pilot Rick Husband, and Mission Specialists Ellen Ochoa and Julie Payette (backs to the camera). They are listening to Chris Jaskolka of Boeing talk about the equipment. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m. EDT.

  8. Curve Set Feature-Based Robust and Fast Pose Estimation Algorithm

    PubMed Central

    Hashimoto, Koichi

    2017-01-01

    Bin picking refers to picking randomly piled objects from a bin for industrial production purposes, and robotic bin picking is commonly used in automated assembly lines. To achieve higher productivity, a fast and robust pose estimation algorithm is necessary to recognize and localize the randomly piled parts. This paper proposes a pose estimation algorithm for bin picking tasks using point cloud data. A novel descriptor, the Curve Set Feature (CSF), is proposed to describe a point by the surface fluctuation around it, and is also capable of evaluating poses. The Rotation Match Feature (RMF) is proposed to match CSFs efficiently. The matching process combines the 2D-space matching idea of the original Point Pair Feature (PPF) algorithm with nearest neighbor search. A voxel-based pose verification method is introduced to evaluate the poses and is shown to be more than 30 times faster than the kd-tree-based verification method. Our algorithm is evaluated against a large number of synthetic and real scenes and proves to be robust to noise, able to detect metal parts, and both more accurate and more than 10 times faster than PPF and the Oriented, Unique and Repeatable (OUR)-Clustered Viewpoint Feature Histogram (CVFH). PMID:28771216
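    The speed advantage of voxel-based pose verification over a kd-tree comes from replacing a per-point nearest-neighbor query with a constant-time hash lookup of an occupied voxel. The sketch below illustrates that general idea only; it is not the paper's implementation, and the function names and voxel size are invented for the example.

```python
import numpy as np

def voxelize(points, voxel_size):
    """Hash scene points into a set of occupied integer voxel indices."""
    return {tuple(idx) for idx in np.floor(points / voxel_size).astype(int)}

def verify_pose(model, R, t, occupied, voxel_size):
    """Score a pose hypothesis (R, t) by the fraction of transformed
    model points that land in an occupied scene voxel. Each point costs
    one O(1) set lookup, versus a log-time nearest-neighbor query per
    point with a k-d tree."""
    transformed = model @ R.T + t
    hits = sum(tuple(idx) in occupied
               for idx in np.floor(transformed / voxel_size).astype(int))
    return hits / len(model)

# Toy check: the identity pose on the scene's own points scores 1.0,
# while a pose that moves the model far away scores 0.0.
scene = np.random.default_rng(1).uniform(0.0, 1.0, size=(500, 3))
occ = voxelize(scene, 0.05)
score = verify_pose(scene, np.eye(3), np.zeros(3), occ, 0.05)
```

    A real verifier would also penalize points that fall in free space in front of observed surfaces, but the occupancy lookup above is the part that yields the reported order-of-magnitude speedup.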

  9. Upgrades at the NASA Langley Research Center National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Paryz, Roman W.

    2012-01-01

    Several projects have been completed or are nearing completion at the NASA Langley Research Center (LaRC) National Transonic Facility (NTF). The addition of a Model Flow-Control/Propulsion Simulation test capability to the NTF provides a unique, transonic, high-Reynolds-number test capability that is well suited for research in propulsion airframe integration studies, circulation control high-lift concepts, powered lift, and cruise separation flow control. A 1992-vintage Facility Automation System (FAS) that performs the control functions for tunnel pressure, temperature, Mach number, model position, safety interlocks and supervisory controls was replaced using current, commercially available components. This FAS upgrade also involved a design study for the replacement of the facility Mach measurement system and the development of a software-based simulation model of NTF processes and control systems. The FAS upgrades were validated by a post-upgrade verification wind tunnel test. The data acquisition system (DAS) upgrade project involves the design, purchase, build, integration, installation and verification of a new DAS, replacing several early-1990s computer systems with state-of-the-art hardware and software. This paper provides an update on the progress made in these efforts. See reference 1.

  10. Detecting photovoltaic solar panels using hyperspectral imagery and estimating solar power production

    NASA Astrophysics Data System (ADS)

    Czirjak, Daniel

    2017-04-01

    Remote sensing platforms have consistently demonstrated the ability to detect, and in some cases identify, specific targets of interest, and photovoltaic solar panels are shown to have a unique spectral signature that is consistent across multiple manufacturers and construction methods. Solar panels are proven to be detectable in hyperspectral imagery using common statistical target detection methods such as the adaptive cosine estimator, and false alarms can be mitigated through the use of a spectral verification process that eliminates pixels lacking the key spectral features of the photovoltaic solar panel reflectance spectrum. The normalized solar panel index is described and is a key component of the false-alarm mitigation process. After spectral verification, these solar panel arrays are confirmed in openly available literal imagery and can be measured using numerous open-source algorithms and tools. The measurements allow for the assessment of overall solar power generation capacity using an equation that accounts for solar insolation, the area of the solar panels, and the efficiency of the solar panels' conversion of solar energy to power. Using a known location with readily available information, the methods outlined in this paper estimate the power generation capabilities to within 6% of the rated power.
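    The capacity equation described in the abstract, power as the product of insolation, panel area, and conversion efficiency, can be written out directly. The numbers below are illustrative assumptions, not values from the paper.

```python
def solar_power_estimate(insolation_w_m2, panel_area_m2, efficiency):
    """Estimate generation capacity as insolation x area x efficiency.
    insolation_w_m2: incident solar irradiance in W/m^2
    panel_area_m2:   total measured panel area in m^2
    efficiency:      fraction of incident energy converted to power
    """
    return insolation_w_m2 * panel_area_m2 * efficiency

# Illustrative assumptions: 1000 W/m^2 reference insolation,
# 60 m^2 of measured panels, 18% conversion efficiency.
p_watts = solar_power_estimate(1000.0, 60.0, 0.18)  # 10800.0 W, i.e. 10.8 kW
```

    In the paper's workflow the area term comes from measuring the verified arrays in literal imagery, while insolation and efficiency are looked up for the site and panel type.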

  11. Software Testing and Verification in Climate Model Development

    NASA Technical Reports Server (NTRS)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to complex multi-disciplinary systems. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance in systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.

  12. Double patterning from design enablement to verification

    NASA Astrophysics Data System (ADS)

    Abercrombie, David; Lacour, Pat; El-Sewefy, Omar; Volkov, Alex; Levine, Evgueni; Arb, Kellen; Reid, Chris; Li, Qiao; Ghosh, Pradiptya

    2011-11-01

    Litho-etch-litho-etch (LELE) is the double patterning (DP) technology of choice for 20 nm contact, via, and lower metal layers. We discuss the unique design and process characteristics of LELE DP, the challenges they present, and various solutions.
    ∘ We examine DP design methodologies and current DP conflict feedback mechanisms, and how they can help designers identify and resolve conflicts.
    ∘ In place and route (P&R), the placement engine must now be aware of the assumptions made during IP cell design, and use placement directives provided by the library designer. We examine the new effects DP introduces in detail routing, discuss how multiple choices of LELE and the cut allowances can lead to different solutions, and describe new capabilities required by detail routers and P&R engines.
    ∘ We discuss why LELE DP cuts and overlaps are critical to optical process correction (OPC), and how a hybrid mechanism of rule- and model-based overlap generation can provide a fast and effective solution.
    ∘ With two litho-etch steps, mask misalignment and image rounding are now verification considerations. We present enhancements to the OPCVerify engine that check for pinching and bridging in the presence of DP overlay errors and acute angles.

  13. Resilience amongst Australian Aboriginal Youth: An Ecological Analysis of Factors Associated with Psychosocial Functioning in High and Low Family Risk Contexts

    PubMed Central

    Hopkins, Katrina D.; Zubrick, Stephen R.; Taylor, Catherine L.

    2014-01-01

    We investigate whether the profile of factors protecting psychosocial functioning of high risk exposed Australian Aboriginal youth is the same as that promoting psychosocial functioning in low risk exposed youth. Data on 1,021 youth aged 12–17 years were drawn from the Western Australian Aboriginal Child Health Survey (WAACHS 2000–2002), a population representative survey of the health and well-being of Aboriginal children, their families and community contexts. A person-centered approach was used to define four groups of youth cross-classified according to level of risk exposure (high/low) and psychosocial functioning (good/poor). Multivariate logistic regression was used to model the influence of individual, family, cultural and community factors on psychosocial outcomes separately for youth in high and low family-risk contexts. Results showed that in high family risk contexts, prosocial friendship and low area-level socioeconomic status uniquely protected psychosocial functioning. However, in low family risk contexts the perception of racism increased the likelihood of poor psychosocial functioning. For youth in both high and low risk contexts, higher self-esteem and self-regulation were associated with good psychosocial functioning although the relationship was non-linear. These findings demonstrate that an empirical resilience framework of analysis can identify potent protective processes operating uniquely in contexts of high risk and is the first to describe distinct profiles of risk, protective and promotive factors within high and low risk exposed Australian Aboriginal youth. PMID:25068434

  14. Experimental verification of Space Platform battery discharger design optimization

    NASA Astrophysics Data System (ADS)

    Sable, Dan M.; Deuty, Scott; Lee, Fred C.; Cho, Bo H.

    The detailed design of two candidate topologies for the Space Platform battery discharger, a four module boost converter (FMBC) and a voltage-fed push-pull autotransformer (VFPPAT), is presented. Each has unique problems. The FMBC requires careful design and analysis in order to obtain good dynamic performance. This is due to the presence of a right-half-plane (RHP) zero in the control-to-output transfer function. The VFPPAT presents a challenging power stage design in order to yield high efficiency and light component weight. The authors describe the design of each of these converters and compare their efficiency, weight, and dynamic characteristics.

  15. A novel trauma leadership model reflective of changing times.

    PubMed

    DʼHuyvetter, Cecile; Cogbill, Thomas H

    2014-01-01

    As a result of generational changes in the health care workforce, we sought to evaluate our current Trauma Medical Director Leadership model. We assessed the responsibilities, accountability, time requirements, cost, and provider satisfaction with the current leadership model. Three new providers who had recently completed fellowship training were hired, each with unique professional desires, skill sets, and experience. Our goal was to establish a comprehensive, cost-effective, accountable leadership model that enabled provider satisfaction and equalized leadership responsibilities. A 3-pronged team model was established with a Medical Director title and responsibilities rotating per the American College of Surgeons verification cycle to develop leadership skills and lessen hierarchical differences.

  16. An Update on the Mechanical and EM Performance of the Composite Dish Verification Antenna (DVA-1) for the SKA

    NASA Technical Reports Server (NTRS)

    Lacy, G. E.; Fleming, M.; Baker, L.; Imbriale, W.; Cortes-Medellin, G.; Veidt, B.; Hovey, G. J.; DeBoer, D.

    2012-01-01

    This paper will give an overview of the unique mechanical and optical design of the DVA-1 telescope. The rim supported carbon fibre reflector surfaces are designed to be both low cost and have high performance under wind, gravity, and thermal loads. The shaped offset Gregorian optics offer low and stable side lobes along with a large area at the secondary focus for multiple feeds with no aperture blockage. Telescope performance under ideal conditions as well as performance under gravity, wind, and thermal loads will be compared directly using calculated radiation patterns for each of these operating conditions.

  17. Capillary electrophoresis for the analysis of contaminants in emerging food safety issues and food traceability.

    PubMed

    Vallejo-Cordoba, Belinda; González-Córdova, Aarón F

    2010-07-01

    This review presents an overview of the applicability of CE in the analysis of chemical and biological contaminants involved in emerging food safety issues. Additionally, the usefulness of CE-based genetic analyzers as a unique tool in food traceability verification systems is presented. First, analytical approaches for the determination of melamine and specific food allergens in different foods are discussed. Second, natural toxin analysis by CE is updated from the last review reported in 2008. Finally, the analysis of prion proteins associated with the "mad cow" crises and the application of CE-based genetic analyzers for meat traceability are summarized.

  18. Head related transfer functions measurement and processing for the purpose of creating a spatial sound environment

    NASA Astrophysics Data System (ADS)

    Pec, Michał; Bujacz, Michał; Strumiłło, Paweł

    2008-01-01

    The use of Head Related Transfer Functions (HRTFs) in audio processing is a popular method of obtaining spatialized sound. HRTFs describe disturbances caused in the sound wave by the human body, especially by the head and the ear pinnae. Since these shapes are unique, HRTFs differ greatly from person to person. For this reason measurement of personalized HRTFs is justified. Measured HRTFs also need further processing to be utilized in a system producing spatialized sound. This paper describes a system designed for efficient collection of Head Related Transfer Functions as well as the measurement, interpolation and verification procedures.
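    Applying a measured HRTF pair is, in essence, convolving the source signal with the left- and right-ear head-related impulse responses (HRIRs). A minimal NumPy sketch, using synthetic placeholder impulse responses rather than measured data:

```python
import numpy as np

def spatialize(mono, hrir_left, hrir_right):
    """Render a mono signal to binaural stereo by convolving it with
    the head-related impulse response for each ear."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right], axis=0)

# Synthetic stand-ins: a pure delay plus attenuation per ear mimics the
# interaural time and level differences a real HRIR encodes.
fs = 44100
mono = np.sin(2 * np.pi * 440 * np.arange(fs // 10) / fs)
hrir_l = np.zeros(64); hrir_l[3] = 1.0    # arrives earlier and louder
hrir_r = np.zeros(64); hrir_r[12] = 0.6   # later and quieter: source on the left
stereo = spatialize(mono, hrir_l, hrir_r)
print(stereo.shape)  # (2, 4473)
```

    Real systems interpolate between HRIRs measured at discrete directions, as the abstract notes, and typically perform the convolution in the frequency domain for efficiency.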

  19. Experimental verification of Space Platform battery discharger design optimization

    NASA Technical Reports Server (NTRS)

    Sable, Dan M.; Deuty, Scott; Lee, Fred C.; Cho, Bo H.

    1991-01-01

    The detailed design of two candidate topologies for the Space Platform battery discharger, a four module boost converter (FMBC) and a voltage-fed push-pull autotransformer (VFPPAT), is presented. Each has unique problems. The FMBC requires careful design and analysis in order to obtain good dynamic performance. This is due to the presence of a right-half-plane (RHP) zero in the control-to-output transfer function. The VFPPAT presents a challenging power stage design in order to yield high efficiency and light component weight. The authors describe the design of each of these converters and compare their efficiency, weight, and dynamic characteristics.

  20. Beyond rules: The next generation of expert systems

    NASA Technical Reports Server (NTRS)

    Ferguson, Jay C.; Wagner, Robert E.

    1987-01-01

    The PARAGON Representation, Management, and Manipulation system is introduced. The concepts of knowledge representation, knowledge management, and knowledge manipulation are combined in a comprehensive system for solving real world problems requiring high levels of expertise in a real time environment. In most applications the complexity of the problem and the representation used to describe the domain knowledge tend to obscure the information from which solutions are derived. This inhibits the acquisition and verification/validation of domain knowledge, places severe constraints on the ability to extend and maintain a knowledge base, and makes generic problem solving strategies difficult to develop. A unique hybrid system was developed to overcome these traditional limitations.

  1. Verification Games: Crowd-Sourced Formal Verification

    DTIC Science & Technology

    2016-03-01

    Final technical report (University of Washington, March 2016) for the project Verification Games: Crowd-Sourced Formal Verification, covering June 2012 through September 2015.

  2. Agreement between self-reported and physically verified male circumcision status in Nyanza region, Kenya: Evidence from the TASCO study.

    PubMed

    Odoyo-June, Elijah; Agot, Kawango; Mboya, Edward; Grund, Jonathan; Musingila, Paul; Emusu, Donath; Soo, Leonard; Otieno-Nyunya, Boaz

    2018-01-01

    Self-reported male circumcision (MC) status is widely used to estimate community prevalence of circumcision, although its accuracy varies in different settings depending on the extent of misreporting. Despite this challenge, self-reported MC status remains essential because it is the most feasible method of collecting MC status data in community surveys. Therefore, its accuracy is an important determinant of the reliability of MC prevalence estimates based on such surveys. We measured the concurrence between self-reported and physically verified MC status among men aged 25-39 years during a baseline household survey for a study to test strategies for enhancing MC uptake by older men in the Nyanza region of Kenya. The objective was to determine the accuracy of self-reported MC status in communities where MC for HIV prevention is being rolled out. Agreement between self-reported and physically verified MC status was measured among 4,232 men. A structured questionnaire was used to collect data on MC status, followed by physical examination to verify the actual MC status, whose outcome was recorded as fully circumcised (no foreskin), partially circumcised (foreskin extends past the coronal sulcus but covers less than half of the glans) or uncircumcised (foreskin covers half or more of the glans). The sensitivity and specificity of self-reported MC status were calculated using physically verified MC status as the gold standard. Out of 4,232 men, 2,197 (51.9%) reported being circumcised, of whom 99.0% were confirmed to be fully circumcised on physical examination. Among 2,035 men who reported being uncircumcised, 93.7% (1,907/2,035) were confirmed uncircumcised on physical examination. Agreement between self-reported and physically verified MC status was almost perfect, kappa (k) = 98.6% (95% CI, 98.1%-99.1%). The sensitivity of self-reported circumcision was 99.6% (95% CI, 99.2-99.8), while the specificity was 99.0% (95% CI, 98.4-99.4); neither differed significantly by age group based on chi-square tests. The rate of consenting to physical verification of MC status differed by client characteristics: unemployed men were more likely to consent to physical verification than employed men (odds ratio [OR] = 1.48; 95% CI, 1.30-1.69), and those with post-secondary education were less likely to consent than those with primary education or less (OR = 0.61; 95% CI, 0.51-0.74). In this Kenyan context, both sensitivity and specificity of self-reported MC status were high; therefore, MC prevalence estimates based on self-reported MC status should be deemed accurate and applicable for planning. However, MC programs should assess the accuracy of self-reported MC status periodically for any secular changes that may undermine its usefulness for estimating community MC prevalence in their unique settings.
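    Agreement statistics of this kind follow directly from a 2x2 self-report versus gold-standard table. A minimal sketch of the generic formulas (illustrative counts, not the study's data or analysis code):

```python
def agreement_stats(tp, fn, fp, tn):
    """Sensitivity, specificity and Cohen's kappa from a 2x2 table,
    treating physical verification as the gold standard.

    tp: self-reported positive, verified positive
    fn: self-reported negative, verified positive
    fp: self-reported positive, verified negative
    tn: self-reported negative, verified negative
    """
    n = tp + fn + fp + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    observed = (tp + tn) / n
    # Chance agreement is computed from the marginal totals.
    expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (observed - expected) / (1 - expected)
    return sensitivity, specificity, kappa

# Hypothetical counts chosen to mimic near-perfect agreement.
sens, spec, kappa = agreement_stats(tp=990, fn=10, fp=8, tn=992)
print(round(sens, 3), round(spec, 3), round(kappa, 3))  # 0.99 0.992 0.982
```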

  3. V&V of Fault Management: Challenges and Successes

    NASA Technical Reports Server (NTRS)

    Fesq, Lorraine M.; Costello, Ken; Ohi, Don; Lu, Tiffany; Newhouse, Marilyn

    2013-01-01

    This paper describes the results of a special breakout session of the NASA Independent Verification and Validation (IV&V) Workshop held in the fall of 2012 entitled "V&V of Fault Management: Challenges and Successes." The NASA IV&V Program is in a unique position to interact with projects across all of the NASA development domains. Using this unique opportunity, the IV&V program convened a breakout session to enable IV&V teams to share their challenges and successes with respect to the V&V of Fault Management (FM) architectures and software. The presentations and discussions provided practical examples of pitfalls encountered while performing V&V of FM including the lack of consistent designs for implementing fault monitors and the fact that FM information is not centralized but scattered among many diverse project artifacts. The discussions also solidified the need for an early commitment to developing FM in parallel with the spacecraft systems as well as clearly defining FM terminology within a project.

  4. The Potential of Using Brain Images for Authentication

    PubMed Central

    Zhou, Zongtan; Shen, Hui; Hu, Dewen

    2014-01-01

    Biometric recognition (also known as biometrics) refers to the automated recognition of individuals based on their biological or behavioral traits. Examples of biometric traits include fingerprint, palmprint, iris, and face. The brain is the most important and complex organ in the human body. Can it be used as a biometric trait? In this study, we analyze the uniqueness of the brain and try to use the brain for identity authentication. The proposed brain-based verification system operates in two stages: gray matter extraction and gray matter matching. A modified brain segmentation algorithm is implemented for extracting gray matter from an input brain image. Then, an alignment-based matching algorithm is developed for brain matching. Experimental results on two data sets show that the proposed brain recognition system meets the high accuracy requirement of identity authentication. Though the acquisition of brain images is currently still time consuming and expensive, brain images are highly unique and show potential for authentication from a pattern recognition perspective. PMID:25126604

  5. The potential of using brain images for authentication.

    PubMed

    Chen, Fanglin; Zhou, Zongtan; Shen, Hui; Hu, Dewen

    2014-01-01

    Biometric recognition (also known as biometrics) refers to the automated recognition of individuals based on their biological or behavioral traits. Examples of biometric traits include fingerprint, palmprint, iris, and face. The brain is the most important and complex organ in the human body. Can it be used as a biometric trait? In this study, we analyze the uniqueness of the brain and try to use the brain for identity authentication. The proposed brain-based verification system operates in two stages: gray matter extraction and gray matter matching. A modified brain segmentation algorithm is implemented for extracting gray matter from an input brain image. Then, an alignment-based matching algorithm is developed for brain matching. Experimental results on two data sets show that the proposed brain recognition system meets the high accuracy requirement of identity authentication. Though the acquisition of brain images is currently still time consuming and expensive, brain images are highly unique and show potential for authentication from a pattern recognition perspective.

  6. A Practical Guide To Solar Array Simulation And PCDU Test

    NASA Astrophysics Data System (ADS)

    Schmitz, Noah; Carroll, Greg; Clegg, Russell

    2011-10-01

    Solar arrays consisting of multiple photovoltaic segments provide power to satellites and charge internal batteries for use during eclipse. Solar arrays have unique I-V characteristics and output power which vary with environmental and operational conditions such as temperature, irradiance, spin, and eclipse. Therefore, specialty power solutions are needed to properly test the satellite on the ground, especially the Power Control and Distribution Unit (PCDU) and the Array Power Regulator (APR). This paper explores some practical and theoretical considerations that should be taken into account when choosing a commercial, off-the-shelf solar array simulator (SAS) for verification of the satellite PCDU. An SAS is a unique power supply with I-V output characteristics that emulate the solar arrays used to power a satellite. It is important to think about the strengths and the limitations of this emulation capability, how closely the SAS approximates a real solar panel, and how best to design a system using SASs as components.
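    The nonlinear I-V characteristic an SAS must emulate is commonly described by the single-diode photovoltaic model; a minimal sketch with illustrative parameters (not from any specific array, and omitting series and shunt resistance):

```python
import math

def string_current(v, i_ph=2.5, i_0=1e-9, n=1.3, t_kelvin=300.0, cells=96):
    """Idealized single-diode model: I = I_ph - I_0*(exp(V/V_t) - 1).

    i_ph: photogenerated current (A), scales with irradiance;
    i_0: diode saturation current (A); n: diode ideality factor;
    cells: number of series-connected cells in the string.
    """
    k_over_q = 8.617333262e-5  # Boltzmann constant / electron charge (V/K)
    v_t = n * k_over_q * t_kelvin * cells  # thermal voltage of the string
    return i_ph - i_0 * (math.exp(v / v_t) - 1.0)

# Short-circuit current equals the photocurrent; current falls off
# sharply as the voltage approaches open circuit.
print(round(string_current(0.0), 3))  # 2.5
```

    An SAS sweeps an operating point along a curve of this shape, which is why its small-signal output impedance, and hence PCDU control-loop behavior, differs markedly between the current-source and voltage-source regions.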

  7. Tongue prints: A novel biometric and potential forensic tool.

    PubMed

    Radhika, T; Jeddy, Nadeem; Nithya, S

    2016-01-01

    Tongue is a vital internal organ well encased within the oral cavity and protected from the environment. It has unique features which differ from individual to individual and even between identical twins. The color, shape, and surface features are characteristic of every individual, and this serves as a tool for identification. Many modes of biometric systems have come into existence, such as fingerprint, iris scan, skin color, signature verification, voice recognition, and face recognition. The search for a new, secure personal identification method has led to the use of the lingual impression or the tongue print as a method of biometric authentication. Tongue characteristics exhibit sexual dimorphism thus aiding in the identification of the person. Emerging as a novel biometric tool, tongue prints also hold the promise of a potential forensic tool. This review highlights the uniqueness of tongue prints and its superiority over other biometric identification systems. The various methods of tongue print collection and the classification of tongue features are also elucidated.

  8. Optimising the benefits of community health workers' unique position between communities and the health sector: A comparative analysis of factors shaping relationships in four countries.

    PubMed

    Kok, Maryse C; Ormel, Hermen; Broerse, Jacqueline E W; Kane, Sumit; Namakhoma, Ireen; Otiso, Lilian; Sidat, Moshin; Kea, Aschenaki Z; Taegtmeyer, Miriam; Theobald, Sally; Dieleman, Marjolein

    2017-11-01

    Community health workers (CHWs) have a unique position between communities and the health sector. The strength of CHWs' relationships with both sides influences their motivation and performance. This qualitative comparative study aimed at understanding similarities and differences in how relationships between CHWs, communities and the health sector were shaped in different Sub-Saharan African settings. The study demonstrates a complex interplay of influences on trust and CHWs' relationships with their communities and actors in the health sector. Mechanisms influencing relationships were feelings of (dis)connectedness, (un)familiarity and serving the same goals, and perceptions of received support, respect, competence, honesty, fairness and recognition. Sometimes, constrained relationships between CHWs and the health sector resulted in weaker relationships between CHWs and communities. The broader context (such as the socio-economic situation) and programme context (related to, for example, task-shifting, volunteering and supervision) in which these mechanisms took place were identified. Policy-makers and programme managers should take into account the broader context and could adjust CHW programmes so that they trigger mechanisms that generate trusting relationships between CHWs, communities and other actors in the health system. This can contribute to enabling CHWs to perform well and responding to the opportunities offered by their unique intermediary position.

  9. Exploring Sources and Influences of Social Capital on Community College Students' First-Year Success: Does Age Make a Difference?

    ERIC Educational Resources Information Center

    Wang, Xueli; Wickersham, Kelly; Lee, Yen; Chan, Hsun-Yu

    2018-01-01

    Background/Context: Although numerous studies have emerged shedding light on community college student success, the enduring role of social capital is often overlooked. Furthermore, when conceptualizing social capital in the community college context and its diverse student population, age represents a unique form of diversity in these…

  10. Unique Nature of the Quality of Life in the Context of Extreme Climatic, Geographical and Specific Socio-Cultural Living Conditions

    ERIC Educational Resources Information Center

    Kulik, Anastasia; Neyaskina, Yuliya; Frizen, Marina; Shiryaeva, Olga; Surikova, Yana

    2016-01-01

    This article presents the results of a detailed empirical research, aimed at studying the quality of life in the context of extreme climatic, geographical and specific sociocultural living conditions. Our research is based on the methodological approach including social, economical, ecological and psychological characteristics and reflecting…

  11. Quality Issues and Trends in Teacher Education: Perspectives and Concerns

    ERIC Educational Resources Information Center

    V., Asha J.

    2016-01-01

    The diversity in educational contexts found in India should be viewed as a valuable feature and as a unique challenge. In an era of greater globalization and educational standardization, of policy borrowing and of international comparisons of achievement, there is a high demand and need to respect context and to appreciate how countries with…

  12. Contexts That Matter to the Leadership Development of Latino Male College Students: A Mixed Methods Perspective

    ERIC Educational Resources Information Center

    Garcia, Gina A.; Huerta, Adrian H.; Ramirez, Jenesis J.; Patrón, Oscar E.

    2017-01-01

    As the number of Latino males entering college increases, there is a need to understand their unique leadership experiences. This study used a convergent parallel mixed methods design to understand what contexts contribute to Latino male undergraduate students' leadership development, capacity, and experiences. Quantitative data were gathered by…

  13. Effect of Vowel Context on the Recognition of Initial Consonants in Kannada.

    PubMed

    Kalaiah, Mohan Kumar; Bhat, Jayashree S

    2017-09-01

    The present study was carried out to investigate the effect of vowel context on the recognition of Kannada consonants in quiet for young adults. A total of 17 young adults with normal hearing in both ears participated in the study. The stimuli included consonant-vowel syllables, spoken by 12 native speakers of Kannada. Consonant recognition task was carried out as a closed-set (fourteen-alternative forced-choice). The present study showed an effect of vowel context on the perception of consonants. Maximum consonant recognition score was obtained in the /o/ vowel context, followed by the /a/ and /u/ vowel contexts, and then the /e/ context. Poorest consonant recognition score was obtained in the vowel context /i/. Vowel context has an effect on the recognition of Kannada consonants, and the vowel effect was unique for Kannada consonants.

  14. The challenge of doing science in wilderness: historical, legal, and policy context

    Treesearch

    Peter Landres; Judy Alderson; David J. Parsons

    2003-01-01

    Lands designated by Congress under the Wilderness Act of 1964 (Public Law 88-577) offer unique opportunities for social and biophysical research in areas that are relatively unmodified by modern human actions. Wilderness designation also imposes a unique set of constraints on the methods that may be used or permitted to conduct this research. For example, legislated...

  15. Starting small: Revisiting young children's perceptions of social withdrawal in China.

    PubMed

    Ding, Xuechen; Coplan, Robert J; Sang, Biao; Liu, Junsheng; Pan, Tingting; Cheng, Chen

    2015-06-01

    In this reply to the commentaries by Xinyin Chen, Charissa Cheah, Yiyuan Xu, and Dawn Watling, we further discuss the conceptual and methodological challenges that arise when attempting to study beliefs about social withdrawal (1) in the unique cultural context of China and (2) in the unique developmental age period of early childhood. © 2015 The British Psychological Society.

  16. Skeletal age and age verification in youth sport.

    PubMed

    Malina, Robert M

    2011-11-01

    Problems with accurate chronological age (CA) reporting occur on a more or less regular basis in youth sports. As a result, there is increasing discussion of age verification. Use of 'bone age' or skeletal age (SA) for the purpose of estimating or verifying CA has been used in medicolegal contexts for many years and also in youth sport competitions. This article reviews the concept of SA, and the three most commonly used methods of assessment. Variation in SA within CA groups among male soccer players and female artistic gymnasts is evaluated relative to the use of SA as a tool for verification of CA. Corresponding data for athletes in several other sports are also summarized. Among adolescent males, a significant number of athletes will be identified as older than a CA cutoff because of advanced skeletal maturation when they in fact have a valid CA. SA assessments of soccer players are comparable to MRI assessments of epiphyseal-diaphyseal union of the distal radius in under-17 soccer players. Both protocols indicate a relatively large number of false negatives among youth players aged 15-17 years. Among adolescent females, a significant number of age-eligible artistic gymnasts will be identified as younger than the CA cutoff because of later skeletal maturation when in fact they have a valid CA. There is also the possibility of false positives: identifying gymnasts as younger than the CA cutoff because of late skeletal maturation when they have a valid CA. The risk of false negatives and false positives implies that SA is not a valid indicator of CA.

  17. Meeting the Challenges of Exploration Systems: Health Management Technologies for Aerospace Systems With Emphasis on Propulsion

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.; Sowers, T. Shane; Maul, William A.

    2005-01-01

    The constraints of future Exploration Missions will require unique Integrated System Health Management (ISHM) capabilities throughout the mission. An ambitious launch schedule, human-rating requirements, long quiescent periods, limited human access for repair or replacement, and long communication delays all require an ISHM system that can span distinct yet interdependent vehicle subsystems, anticipate failure states, provide autonomous remediation, and support the Exploration Mission from beginning to end. NASA Glenn Research Center has developed and applied health management system technologies to aerospace propulsion systems for almost two decades. Lessons learned from past activities help define the approach to proper ISHM development: sensor selection, which identifies sensor sets required for accurate health assessment; data qualification and validation, which ensures the integrity of measurement data from sensor to data system; fault detection and isolation, which uses measurements in a component/subsystem context to detect faults and identify their point of origin; information fusion and diagnostic decision criteria, which align data from similar and disparate sources in time and use those data to perform higher-level system diagnosis; and verification and validation, which uses data, real or simulated, to provide variable exposure to the diagnostic system for faults that may only manifest themselves in actual implementation, as well as faults that are detectable via hardware testing. This presentation describes a framework for developing health management systems and highlights the health management research activities performed by the Controls and Dynamics Branch at the NASA Glenn Research Center. It illustrates how those activities contribute to the development of solutions for Integrated System Health Management.

  18. SMAP Instrument Mechanical System Engineering

    NASA Technical Reports Server (NTRS)

    Slimko, Eric; French, Richard; Riggs, Benjamin

    2013-01-01

    The Soil Moisture Active Passive (SMAP) mission, scheduled for launch by the end of 2014, is being developed to measure the soil moisture and soil freeze/thaw state on a global scale over a three-year period. The accuracy, resolution, and global coverage of SMAP measurements are invaluable across many science and applications disciplines including hydrology, climate, carbon cycle, and the meteorological, environment, and ecology applications communities. The SMAP observatory is composed of a despun bus and a spinning instrument platform that includes both a deployable 6 meter aperture low structural frequency Astromesh reflector and a spin control system. The instrument section has engendered challenging mechanical system issues associated with the antenna deployment, flexible antenna pointing in the context of a multitude of disturbances, spun section mass properties, spin control system development, and overall integration with the flight system on both mechanical and control system levels. Moreover, the multitude of organizations involved, including two major vendors providing the spin subsystem and reflector boom assembly plus the flight system mechanical and guidance, navigation, and control teams, has led to several unique system engineering challenges. Capturing the key physics associated with the function of the flight system has been challenging due to the many different domains that are applicable. Key interfaces and operational concepts have led to complex negotiations because of the large number of organizations that integrate with the instrument mechanical system. Additionally, the verification and validation concerns associated with the mechanical system have required far-reaching involvement from both the flight system and other subsystems. The SMAP instrument mechanical systems engineering issues and their solutions are described in this paper.

  19. Applications of Advanced Technology for Monitoring Forest Carbon to Support Climate Change Mitigation

    NASA Astrophysics Data System (ADS)

    Birdsey, R.; Hurtt, G. C.; Dubayah, R.; Hagen, S. C.; Vargas, R.; Nehrkorn, T.; Domke, G. M.; Houghton, R. A.

    2015-12-01

    Measurement, Reporting, and Verification (MRV) is a broad concept guiding the application of monitoring technology to the needs of countries or entities for reporting and verifying reductions in greenhouse gas emissions or increases in greenhouse gas sinks. Credibility, cost-effectiveness, and compatibility are important features of global MRV efforts that can support implementation of climate change mitigation programs such as Reducing Emissions from Deforestation and Forest Degradation and Sustainable Forest Management (REDD+). Applications of MRV technology may be tailored to individual country circumstances following guidance provided by the Intergovernmental Panel on Climate Change; hence, there is no single approach that is uniquely viable but rather a range of ways to integrate new MRV methods. MRV technology is advancing rapidly with new remote sensing and advanced measurement of atmospheric CO2, and in situ terrestrial and ocean measurements, coupled with improvements in data analysis, modeling, and assessing uncertainty. Here we briefly summarize some of the most application-ready MRV technologies being developed under NASA's Carbon Monitoring System (CMS) program, and illustrate how these technologies may be applied for monitoring forests using several case studies that span a range of scales, country circumstances, and stakeholder reporting requirements. We also include remarks about the potential role of advanced monitoring technology in the context of the global climate accord expected to result from the 21st session of the Conference of the Parties to the United Nations Framework Convention on Climate Change, taking place in December 2015 in Paris, France.

  20. A unique memory process modulated by emotion underpins successful odor recognition and episodic retrieval in humans

    PubMed Central

    Saive, Anne-Lise; Royet, Jean-Pierre; Ravel, Nadine; Thévenet, Marc; Garcia, Samuel; Plailly, Jane

    2014-01-01

    We behaviorally explore the link between olfaction, emotion and memory by testing the hypothesis that the emotion carried by odors facilitates the memory of specific unique events. To investigate this idea, we used a novel behavioral approach inspired by a paradigm developed by our team to study episodic memory in humans in a way that is both controlled and as ecologically valid as possible. The participants freely explored three unique and rich laboratory episodes; each episode consisted of three unfamiliar odors (What) positioned at three specific locations (Where) within a visual context (Which context). During the retrieval test, which occurred 24–72 h after the encoding, odors were used to trigger the retrieval of the complex episodes. The participants were proficient in recognizing the target odors among distractors and retrieving the visuospatial context in which they were encountered. The episodic nature of the task generated high and stable memory performances, which were accompanied by faster responses and slower and deeper breathing. Successful odor recognition and episodic memory were not related to differences in odor investigation at encoding. However, memory performances were influenced by the emotional content of the odors, regardless of odor valence, with both pleasant and unpleasant odors generating higher recognition and episodic retrieval than neutral odors. Finally, the present study also suggested that when the binding between the odors and the spatio-contextual features of the episode was successful, odor recognition and episodic retrieval collapsed into a unique memory process that began as soon as the participants smelled the odors. PMID:24936176

  1. Automation bias and verification complexity: a systematic review.

    PubMed

    Lyell, David; Coiera, Enrico

    2017-03-01

    While potentially reducing decision errors, decision support systems can introduce new types of errors. Automation bias (AB) occurs when users become overreliant on decision support, which reduces vigilance in information seeking and processing. Most research originates from the human factors literature, where the prevailing view is that AB occurs only in multitasking environments. This review compares the human factors and health care literature, focusing on the apparent association of AB with multitasking and task complexity. We searched EMBASE, Medline, Compendex, Inspec, IEEE Xplore, Scopus, Web of Science, PsycINFO, and Business Source Premier from 1983 to 2015. Evaluation studies where task execution was assisted by automation and resulted in errors were included. Participants needed to be able to verify automation correctness and perform the task manually. Tasks were identified and grouped. Task and automation type and presence of multitasking were noted. Each task was rated for its verification complexity. Of 890 papers identified, 40 met the inclusion criteria; 6 were in health care. Contrary to the prevailing human factors view, AB was found in single tasks, typically involving diagnosis rather than monitoring, and with high verification complexity. The literature is fragmented, with large discrepancies in how AB is reported. Few studies reported the statistical significance of AB compared to a control condition. AB appears to be associated with the degree of cognitive load experienced in decision tasks, and does not appear to be uniquely associated with multitasking. Strategies to minimize AB might focus on cognitive load reduction. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  2. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
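    The Horn-clause view of verification conditions can be pictured with a toy example. The sketch below is illustrative only and is not SeaHorn's actual encoding (SeaHorn operates on LLVM bitcode and hands its clauses to Horn solvers): it states the three standard verification conditions for a counting loop as Horn-style implications over a candidate inductive invariant, and discharges them by brute force over a small integer domain. All function names are hypothetical.

```python
# Toy illustration of verification conditions for the program
#   x = 0; while x < 10: x += 1;  assert x == 10
# written as three Horn-style implications over a candidate invariant.

def inv(x):
    """Candidate inductive invariant: 0 <= x <= 10."""
    return 0 <= x <= 10

def check_vcs(domain=range(-5, 20)):
    # VC1 (initiation):   x == 0            =>  inv(x)
    # VC2 (consecution):  inv(x) and x < 10 =>  inv(x + 1)
    # VC3 (safety):       inv(x) and x >= 10 =>  x == 10
    vc1 = inv(0)
    vc2 = all(inv(x + 1) for x in domain if inv(x) and x < 10)
    vc3 = all(x == 10 for x in domain if inv(x) and x >= 10)
    return vc1 and vc2 and vc3

print(check_vcs())  # True: the invariant discharges all three VCs
```

    With a weaker invariant such as 0 <= x <= 11, the safety condition fails, which is the kind of feedback a Horn-clause solver reports as a counterexample.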

  3. Multispectral and hyperspectral advanced characterization of soldier's camouflage equipment

    NASA Astrophysics Data System (ADS)

    Farley, Vincent; Kastek, Mariusz; Chamberland, Martin; Piątkowski, Tadeusz; Lagueux, Philippe; Dulski, Rafał; Trzaskawka, Piotr

    2013-05-01

    The requirements for soldier camouflage in the context of modern warfare are becoming more complex and challenging given the emergence of novel infrared sensors. There is a pressing need for the development of adapted fabrics and soldier camouflage devices to provide efficient camouflage in both the visible and infrared spectral ranges. The Military University of Technology has conducted an intensive project to develop new materials and fabrics to further improve the camouflage efficiency of soldiers. The developed materials feature visible and infrared properties that make them unique and adaptable to various military contexts. This paper presents the details of an advanced measurement campaign on those unique materials in which multispectral and hyperspectral infrared measurements are correlated.

  4. Multispectral and hyperspectral advanced characterization of soldier's camouflage equipment

    NASA Astrophysics Data System (ADS)

    Lagueux, Philippe; Kastek, Mariusz; Chamberland, Martin; Piątkowski, Tadeusz; Farley, Vincent; Dulski, Rafał; Trzaskawka, Piotr

    2013-10-01

    The requirements for soldier camouflage in the context of modern warfare are becoming more complex and challenging given the emergence of novel infrared sensors. There is a pressing need for the development of adapted fabrics and soldier camouflage devices to provide efficient camouflage in both the visible and infrared spectral ranges. The Military University of Technology has conducted an intensive project to develop new materials and fabrics to further improve the camouflage efficiency of soldiers. The developed materials feature visible and infrared properties that make them unique and adaptable to various military contexts. This paper presents the details of an advanced measurement campaign on those unique materials in which multispectral and hyperspectral infrared measurements are correlated.

  5. The long tail of a demon drug: The 'bath salts' risk environment.

    PubMed

    Elliott, Luther; Benoit, Ellen; Campos, Stephanie; Dunlap, Eloise

    2018-01-01

    Using the case of synthetic cathinones (commonly referred to as 'bath salts' in the US context), this paper analyses structural factors surrounding novel psychoactive substances (NPS) as contributing to the unique risk environment surrounding their use. Drawing on interviews with 39 people who use bath salts from four U.S. cities and analysis of the infrastructural, social, economic, and policy contexts, we document the unique harms related to changing contexts for illicit drug regulation, manufacture, and consumption. Findings suggest that NPS and designer drug markets, which are highly reliant upon the internet, share characteristics of the entertainment industry which has come to rely more heavily upon profits derived from the 'long tail' of myriad lesser-known products and the diminished centrality of 'superstars' and 'hits'. Findings point toward increased theoretical and policy attention to changing drug market structures, more rigorous evaluations of drug 'analogues' legislation and greater involvement with NPS education and testing by harm reduction agencies. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Classroom and Teacher Support in Kindergarten: Associations with the Behavioral and Academic Adjustment of Low-Income Students

    PubMed Central

    Lee, Phyllis; Bierman, Karen L.

    2016-01-01

    For socio-economically disadvantaged children, a positive experience in kindergarten may play a particularly important role in fostering the behavioral adjustment and learning engagement necessary for school success. Prior research has identified supportive student-teacher relationships and classroom emotional support as two features of the classroom context that can promote student adjustment; however, very few studies have examined these two aspects of the classroom context simultaneously. Given their modest inter-correlations, these dimensions of classroom context may have both unique and shared associations with child progress. This study followed 164 children as they transitioned from Head Start into elementary school, and regressions revealed significant unique associations between each type of kindergarten support and children’s aggressive behaviors, social withdrawal, learning engagement, and emergent literacy skills in first grade, controlling for their pre-kindergarten adjustment. In addition, learning engagement significantly mediated the association between a supportive relationship with the kindergarten teacher and first grade literacy skills. PMID:27274606

  7. A Cross-Cultural Analysis of Advertisements from High-Context Cultures and Low-Context Cultures

    ERIC Educational Resources Information Center

    Bai, He

    2016-01-01

    With the development of economy and the change of social culture, advertisements have penetrated our life slowly and done a lot to the commercial markets. Advertisements have often been analyzed in a stylistic way for its unique language style. But language is an important part, as well as a carrier, of culture. Advertising language, as other…

  8. Mirth and Murder: Crime Scene Investigation as a Work Context for Examining Humor Applications

    ERIC Educational Resources Information Center

    Roth, Gene L.; Vivona, Brian

    2010-01-01

    Within work settings, humor is used by workers for a wide variety of purposes. This study examines humor applications of a specific type of worker in a unique work context: crime scene investigation. Crime scene investigators examine death and its details. Members of crime scene units observe death much more frequently than other police officers…

  9. Case-Based Teaching in a Bilingual Context: Perceptions of Business Faculty in Hong Kong

    ERIC Educational Resources Information Center

    Jackson, Jane

    2004-01-01

    Case methods of teaching are now common in business education programs worldwide. This problem-based approach, however, can pose unique challenges in bilingual contexts, especially if the students are more familiar with transmission modes of learning. This paper focuses on an investigation of case-based teaching in Hong Kong. By way of surveys and…

  10. The Michigan Context and Performance Report Card: Public Elementary and Middle Schools, 2015

    ERIC Educational Resources Information Center

    Spalding, Audrey; DeGrow, Ben

    2016-01-01

    This is the Mackinac Center's fourth school report card and covers elementary and middle schools. A similar report card was published in 2013, and this edition includes two years' worth of new data. A unique characteristic of this report card is that it takes into consideration the "context" of a school when assessing its performance.…

  11. Situated Willingness to Communicate in an L2: Interplay of Individual Characteristics and Context

    ERIC Educational Resources Information Center

    Yashima, Tomoko; MacIntyre, Peter D.; Ikeda, Maiko

    2018-01-01

    Recently, situated willingness to communicate (WTC) has received increasing research attention in addition to traditional quantitative studies of trait-like WTC. This article is an addition to the former but unique in two ways. First, it investigates both trait and state WTC in a classroom context and explores ways to combine the two to reach a…

  12. Complementing the Australian Primary School Health and Physical Education (HPE) Curriculum: Exploring Children's HPE Learning Experiences within Varying School Ground Equipment Contexts

    ERIC Educational Resources Information Center

    Hyndman, Brendon; Mahony, Linda; Te Ava, Aue; Smith, Sue; Nutton, Georgie

    2017-01-01

    This paper unearths how primary school children experience and can complement the Australian HPE curriculum within three unique school ground equipment scenarios that include an "empty", "loose parts" and a "traditional" school ground context. Using direct observation, 490 scans were undertaken of the school grounds…

  13. GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...

  14. Seismic Safety Of Simple Masonry Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guadagnuolo, Mariateresa; Faella, Giuseppe

    2008-07-08

    Several masonry buildings comply with the rules for simple buildings provided by seismic codes. For these buildings explicit safety verifications are not compulsory if specific code rules are fulfilled. In fact it is assumed that their fulfilment ensures a suitable seismic behaviour of buildings and thus adequate safety under earthquakes. Italian and European seismic codes differ in the requirements for simple masonry buildings, mostly concerning the building typology, the building geometry and the acceleration at site. Clearly, a large percentage of the buildings deemed simple by the codes should also satisfy the numerical safety verification, so that designers who must use the codes are not left with confusion and uncertainty. This paper aims at evaluating the seismic response of some simple unreinforced masonry buildings that comply with the provisions of the new Italian seismic code. Two-story buildings, having different geometry, are analysed and results from nonlinear static analyses performed by varying the acceleration at site are presented and discussed. Indications of the congruence between the code rules and the results of numerical analyses performed according to the code itself are supplied; in this context, the obtained results can contribute to improving the seismic code requirements.

  15. Measurement of radiation damage of water-based liquid scintillator and liquid scintillator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bignell, L. J.; Diwan, M. V.; Hans, S.

    2015-10-19

    Liquid scintillating phantoms have been proposed as a means to perform real-time 3D dosimetry for proton therapy treatment plan verification. We have studied what effect radiation damage to the scintillator will have upon this application. We have performed measurements of the degradation of the light yield and optical attenuation length of liquid scintillator and water-based liquid scintillator after irradiation by 201 MeV proton beams that deposited doses of approximately 52 Gy, 300 Gy, and 800 Gy in the scintillator. Liquid scintillator and water-based liquid scintillator (composed of 5% scintillating phase) exhibit light yield reductions of 1.74 ± 0.55% and 1.31 ± 0.59%, respectively, after ≈ 800 Gy of proton dose. Although some increased optical attenuation was observed in the irradiated samples, the measured reduction in light yield is also due to damage to the scintillation light production. Based on our results and conservative estimates of the expected dose in a clinical context, a scintillating phantom used for proton therapy treatment plan verification would exhibit a systematic light yield reduction of approximately 0.1% after a year of operation.
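    The quoted ~0.1%-per-year figure can be sanity-checked with a linear extrapolation. In the sketch below, only the measured losses at ≈800 Gy come from the study; the 50 Gy annual delivered dose is a hypothetical clinical assumption introduced here for illustration.

```python
# Linear extrapolation of light-yield loss with accumulated dose.
# loss_at_800gy_percent comes from the measurements quoted above;
# the 50 Gy/year clinical dose is an assumed, illustrative value.

def yield_loss_percent(dose_gy, loss_at_800gy_percent):
    """Projected light-yield loss (%) after dose_gy, assuming linearity."""
    return loss_at_800gy_percent * dose_gy / 800.0

annual_dose_gy = 50.0  # hypothetical yearly dose to the phantom
for label, loss_800 in (("liquid scintillator", 1.74),
                        ("water-based LS", 1.31)):
    print(f"{label}: ~{yield_loss_percent(annual_dose_gy, loss_800):.2f}%/year")
```

    Both materials land near the ~0.1% per year systematic reduction stated in the abstract under this assumption.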

  16. A Study of Feature Combination for Vehicle Detection Based on Image Processing

    PubMed Central

    2014-01-01

    Video analytics play a critical role in most recent traffic monitoring and driver assistance systems. In this context, the correct detection and classification of surrounding vehicles through image analysis has been the focus of extensive research in recent years. Most work reported on image-based vehicle verification uses supervised classification approaches and resorts to techniques such as histograms of oriented gradients (HOG), principal component analysis (PCA), and Gabor filters, among others. Unfortunately, existing approaches are lacking in two respects: first, comparison between methods using a common body of work has not been addressed; second, no study of the combination potential of popular features for vehicle classification has been reported. In this study the performance of the different techniques is first reviewed and compared using a common public database. Then, the combination capabilities of these techniques are explored and a methodology is presented for the fusion of classifiers built upon them, taking into account also the vehicle pose. The study unveils the limitations of single-feature based classification and makes clear that fusion of classifiers is highly beneficial for vehicle verification. PMID:24672299
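    Score-level fusion of the kind evaluated in the study can be pictured with a minimal sketch. The scores, weights, and function names below are hypothetical, and the paper's actual methodology additionally conditions the fusion on vehicle pose.

```python
# Minimal score-level fusion: each feature-based classifier (HOG, PCA,
# Gabor, ...) emits a probability that a patch contains a vehicle, and the
# fused score is their weighted average. All numbers here are illustrative.

def fuse_scores(scores, weights=None):
    """Weighted average of per-classifier vehicle probabilities."""
    if weights is None:
        weights = [1.0] * len(scores)
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

per_classifier = {"hog": 0.82, "pca": 0.55, "gabor": 0.70}
fused = fuse_scores(list(per_classifier.values()))
print(f"fused vehicle score: {fused:.2f}")  # -> 0.69
is_vehicle = fused >= 0.5
```

    Unequal weights let a validation set favor the stronger features, which is one simple way a fused classifier can outperform any single-feature classifier.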

  17. An Overview of Integration and Test of the James Webb Space Telescope Integrated Science Instrument Module

    NASA Technical Reports Server (NTRS)

    Drury, Michael; Becker, Neil; Bos, Brent; Davila, Pamela; Frey, Bradley; Hylan, Jason; Marsh, James; McGuffey, Douglas; Novak, Maria; Ohl, Raymond

    2007-01-01

    The James Webb Space Telescope (JWST) is a 6.6m diameter, segmented, deployable telescope for cryogenic IR space astronomy (approx.40K). The JWST Observatory architecture includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM) element that contains four science instruments (SI) including a Guider. The SIs and Guider are mounted to a composite metering structure with outer dimensions of 2.1x2.2x1.9m. The SI and Guider units are integrated to the ISIM structure and optically tested at NASA/Goddard Space Flight Center as an instrument suite using a high-fidelity, cryogenic JWST telescope simulator that features a 1.5m diameter powered mirror. The SIs are integrated and aligned to the structure under ambient, clean room conditions. SI performance, including focus, pupil shear and wavefront error, is evaluated at the operating temperature. We present an overview of the ISIM integration within the context of Observatory-level construction. We describe the integration and verification plan for the ISIM element, including an overview of our incremental verification approach, ambient mechanical integration and test plans and optical alignment and cryogenic test plans. We describe key ground support equipment and facilities.

  18. New biometric modalities using internal physical characteristics

    NASA Astrophysics Data System (ADS)

    Mortenson, Juliana (Brooks)

    2010-04-01

    Biometrics is described as the science of identifying people based on physical characteristics such as their fingerprints, facial features, hand geometry, iris patterns, palm prints, or speech recognition. Notably, all of these physical characteristics are visible or detectable from the exterior of the body. These external characteristics can be lifted, photographed, copied or recorded for unauthorized access to a biometric system. Individual humans are unique internally, however, just as they are unique externally. New biometric modalities have been developed which identify people based on their unique internal characteristics. For example, "Boneprints™" use acoustic fields to scan the unique bone density pattern of a thumb pressed on a small acoustic sensor. Thanks to advances in piezoelectric materials, the acoustic sensor can be placed in virtually any device such as a steering wheel, door handle, or keyboard. Similarly, "Imp-Prints™" measure the electrical impedance patterns of a hand to identify or verify a person's identity. Small impedance sensors can be easily embedded in devices such as smart cards, handles, or wall mounts. These internal biometric modalities rely on physical characteristics which are not visible or photographable, providing an added level of security. In addition, both the acoustic and impedance methods can be combined with physiologic measurements such as acoustic Doppler or impedance plethysmography, respectively. Added verification that the biometric pattern came from a living person can be obtained. These new biometric modalities have the potential to allay user concerns over protection of privacy, while providing a higher level of security.

  19. Infrared sensing of non-observable human biometrics

    NASA Astrophysics Data System (ADS)

    Willmore, Michael R.

    2005-05-01

    Interest and growth of biometric recognition technologies surged after 9/11. Once a technology mainly used for identity verification in law enforcement, biometrics are now being considered as a secure means of providing identity assurance in security related applications. Biometric recognition in law enforcement must, by necessity, use attributes of human uniqueness that are both observable and vulnerable to compromise. Privacy and protection of an individual's identity is not assured during criminal activity. However, a security system must rely on identity assurance for access control to physical or logical spaces while not being vulnerable to compromise and protecting the privacy of an individual. The solution resides in the use of non-observable attributes of human uniqueness to perform the biometric recognition process. This discussion will begin by presenting some key perspectives about biometric recognition and the characteristic differences between observable and non-observable biometric attributes. An introduction to the design, development, and testing of the Thermo-ID system will follow. The Thermo-ID system is an emerging biometric recognition technology that uses non-observable patterns of infrared energy naturally emanating from within the human body. As with all biometric systems, the infrared patterns recorded and compared within the Thermo-ID system are unique and individually distinguishable permitting a link to be confirmed between an individual and a claimed or previously established identity. The non-observable characteristics of infrared patterns of human uniqueness ensure both the privacy and protection of an individual using this type of biometric recognition system.

  20. 18 CFR 281.213 - Data Verification Committee.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Data Verification....213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification...

  1. 18 CFR 281.213 - Data Verification Committee.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Data Verification....213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification...

  2. The politics of verification and the control of nuclear tests, 1945-1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallagher, N.W.

    1990-01-01

    This dissertation addresses two questions: (1) why has agreement been reached on verification regimes to support some arms control accords but not others; and (2) what determines the extent to which verification arrangements promote stable cooperation. This study develops an alternative framework for analysis by examining the politics of verification at two levels. The logical politics of verification are shaped by the structure of the problem of evaluating cooperation under semi-anarchical conditions. The practical politics of verification are driven by players' attempts to use verification arguments to promote their desired security outcome. The historical material shows that agreements on verification regimes are reached when key domestic and international players desire an arms control accord and believe that workable verification will not have intolerable costs. Clearer understanding of how verification is itself a political problem, and how players manipulate it to promote other goals, is necessary if the politics of verification are to support rather than undermine the development of stable cooperation.

  3. Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers

    NASA Technical Reports Server (NTRS)

    Levitt, K. N.; Schwartz, R.; Hare, D.; Moore, J. S.; Melliar-Smith, P. M.; Shostak, R. E.; Boyer, R. S.; Green, M. W.; Elliott, W. D.

    1983-01-01

    A number of methodologies for verifying systems, along with computer-based tools that assist users in verifying their systems, were developed. These tools were applied to verify in part the SIFT ultrareliable aircraft computer. Topics covered included: the STP theorem prover; design verification of SIFT; high level language code verification; assembly language level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.

  4. Assessing the Claims of Participatory Measurement, Reporting and Verification (PMRV) in Achieving REDD+ Outcomes: A Systematic Review

    PubMed Central

    Hawthorne, Sandra; Boissière, Manuel; Felker, Mary Elizabeth; Atmadja, Stibniati

    2016-01-01

    Participation of local communities in the Measurement, Reporting and Verification (MRV) of forest changes has been promoted as a strategy that lowers the cost of MRV and increases their engagement with REDD+. This systematic review of the literature assessed the claims of participatory MRV (PMRV) in achieving REDD+ outcomes. We identified 29 PMRV publications, consisting of 20 peer-reviewed and 9 non-peer-reviewed publications, with 14 being empirically based studies. The evidence supporting PMRV claims was categorized as empirical finding, citation or assumption. Our analysis of the empirical studies showed that PMRV projects were conducted in 17 countries on three tropical continents and across various forest and land tenure types. Most of these projects tested the feasibility of participatory measurement or monitoring, which limited the participation of local communities to data gathering. PMRV claims of providing accurate local biomass measurements and lowering MRV cost were well supported with empirical evidence. Claims that PMRV supports REDD+ social outcomes that affect local communities directly, such as increased environmental awareness and equity in benefit sharing, were supported with less empirical evidence than REDD+ technical outcomes. This may be due to the difficulties in measuring social outcomes and the slow progress in the development and implementation of REDD+ components outside of experimental research contexts. Although lessons from other monitoring contexts have been used to support PMRV claims, they are only applicable when the enabling conditions can be replicated in REDD+ contexts. There is a need for more empirical evidence to support PMRV claims on achieving REDD+ social outcomes, which may be addressed with more opportunities and rigorous methods for assessing REDD+ social outcomes. Integrating future PMRV studies into local REDD+ implementations may help create those opportunities, while increasing the participation of local communities as local REDD+ stakeholders. Further development and testing of a participatory reporting framework are required to integrate PMRV data with the national database. Publication of empirical PMRV studies is encouraged to guide when, where and how PMRV should be implemented. PMID:27812110

  5. Airborne Wind Profiling With the Data Acquisition and Processing System for a Pulsed 2-Micron Coherent Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, Jeffrey Y.; Koch, Grady J.; Kavaya, Michael J.

    2012-01-01

    A pulsed 2-micron coherent Doppler lidar system from NASA Langley Research Center in Virginia flew on NASA's DC-8 aircraft during the NASA Genesis and Rapid Intensification Processes (GRIP) campaign in the summer of 2010. The participation was part of the Doppler Aerosol Wind Lidar (DAWN) Air project. Selected results of airborne wind profiling are presented and compared with the dropsonde data for verification purposes. Panoramic presentations of different wind parameters over a nominal observation time span are also presented for selected GRIP data sets. The real-time data acquisition and analysis software that was employed during the GRIP campaign is introduced with its unique features.

  6. Challenges in verification and validation of autonomous systems for space exploration

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Jonsson, Ari

    2005-01-01

    Space exploration applications offer a unique opportunity for the development and deployment of autonomous systems, due to limited communications, large distances, and the great expense of direct operation. At the same time, the risk and cost of space missions lead to reluctance to take on new, complex and difficult-to-understand technology. A key issue in addressing these concerns is the validation of autonomous systems. In recent years, higher-level autonomous systems have been applied in space applications. In this presentation, we will highlight those autonomous systems, and discuss issues in validating these systems. We will then look to future demands on validating autonomous systems for space, and identify promising technologies and open issues.

  7. Items Supporting the Hanford Internal Dosimetry Program Implementation of the IMBA Computer Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carbaugh, Eugene H.; Bihl, Donald E.

    2008-01-07

    The Hanford Internal Dosimetry Program has adopted the computer code IMBA (Integrated Modules for Bioassay Analysis) as its primary code for bioassay data evaluation and dose assessment using methodologies of ICRP Publications 60, 66, 67, 68, and 78. The adoption of this code was part of the implementation plan for the June 8, 2007 amendments to 10 CFR 835. This information release includes action items unique to IMBA that were required by PNNL quality assurance standards for implementation of safety software. Copies of the IMBA software verification test plan and the outline of the briefing given to new users are also included.

  8. Design and Testing of a Transcutaneous RF Recharging System for a Fetal Micropacemaker.

    PubMed

    Vest, Adriana N; Zhou, Li; Huang, Xuechen; Norekyan, Viktoria; Bar-Cohen, Yaniv; Chmait, Ramen H; Loeb, Gerald Eli

    2017-04-01

    We have developed a rechargeable fetal micropacemaker in order to treat severe fetal bradycardia with comorbid hydrops fetalis. The necessarily small form factor of the device, small patient population, and fetal anatomy put unique constraints on the design of the recharging system. To overcome these constraints, a custom high-power field generator was built, and the recharging process was controlled by using pacing rate as a measure of battery state, a feature of the relaxation oscillator used to generate stimuli. The design and in vitro and in vivo verification of the recharging system are presented here, showing successful generation of recharging current in a fetal lamb model.

  9. Design and Testing of a Transcutaneous RF Recharging System for a Fetal Micropacemaker

    PubMed Central

    Vest, Adriana N.; Zhou, Li; Huang, Xuechen; Norekyan, Viktoria; Bar-Cohen, Yaniv; Chmait, Ramen H.; Loeb, Gerald Eli

    2017-01-01

    We have developed a rechargeable fetal micropacemaker in order to treat severe fetal bradycardia with comorbid hydrops fetalis. The necessarily small form factor of the device, small patient population, and fetal anatomy put unique constraints on the design of the recharging system. To overcome these constraints, a custom high-power field generator was built, and the recharging process was controlled by using pacing rate as a measure of battery state, a feature of the relaxation oscillator used to generate stimuli. The design and in vitro and in vivo verification of the recharging system are presented here, showing successful generation of recharging current in a fetal lamb model. PMID:28212097

  10. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ....920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all the... verifications and analysis. It may also be necessary to limit the range of conditions under which the PEMS can... additional information or analysis to support your conclusions. (b) Overall verification. This paragraph (b...

  11. GENERIC VERIFICATION PROTOCOL FOR THE VERIFICATION OF PESTICIDE SPRAY DRIFT REDUCTION TECHNOLOGIES FOR ROW AND FIELD CROPS

    EPA Science Inventory

    This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...

  12. The effects of environmental context on recognition memory and claims of remembering.

    PubMed

    Hockley, William E

    2008-11-01

    Recognition memory for words was tested in same or different contexts using the remember/know response procedure. Context was manipulated by presenting words in different screen colors and locations and by presenting words against real-world photographs. Overall hit and false-alarm rates were higher for tests presented in an old context compared to a new context. This concordant effect was seen in both remember responses and estimates of familiarity. Similar results were found for rearranged pairings of old study contexts and targets, for study contexts that were unique or were repeated with different words, and for new picture contexts that were physically similar to old contexts. Similar results were also found when subjects focused attention on the study words, but a different pattern of results was obtained when subjects explicitly associated the study words with their picture context. The results show that subjective feelings of recollection play a role in the effects of environmental context but are likely based more on a sense of familiarity that is evoked by the context than on explicit associations between targets and their study context.

  13. 30 CFR 250.913 - When must I resubmit Platform Verification Program plans?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... CONTINENTAL SHELF Platforms and Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication... 30 Mineral Resources 2 2011-07-01 2011-07-01 false When must I resubmit Platform Verification...

  14. 30 CFR 250.909 - What is the Platform Verification Program?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 2 2011-07-01 2011-07-01 false What is the Platform Verification Program? 250... Platforms and Structures Platform Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms...

  15. Study of techniques for redundancy verification without disrupting systems, phases 1-3

    NASA Technical Reports Server (NTRS)

    1970-01-01

    The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.

  16. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "Did the product meet the stated specification, performance, or design documentation?" In order to ensure the system was built correctly, the practicing systems engineer must verify each product requirement using the verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts," which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the systems engineer in NPR 7123.1A, NASA Systems Engineering Processes and Requirements, with regard to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  17. Weaving Action Learning into the Fabric of Manufacturing: The Impact of Humble Inquiry and Structured Reflection in a Cross-Cultural Context

    ERIC Educational Resources Information Center

    Luckman, Elizabeth A.

    2017-01-01

    This account of practice examines the implementation of and reactions to action learning through the Lean methodology in a unique, cross-cultural context. I review my time spent as a Lean coach; engaging with, training, and using action learning with employees in a garment manufacturing facility located in Bali, Indonesia. This research addresses…

  18. Attitudes and Attitude Change.

    PubMed

    Albarracin, Dolores; Shavitt, Sharon

    2018-01-04

    This review covers research on attitudes and attitude change published between 2010 and 2017. We characterize this period as one of significant progress toward an understanding of how attitudes form and change in three critical contexts. The first context is the person, as attitudes change in connection to values, general goals, language, emotions, and human development. The second context is social relationships, which link attitude change to the communicator of persuasive messages, social media, and culture. The third context is sociohistorical and highlights the influence of unique events, including sociopolitical, economic, and climatic occurrences. In conclusion, many important recent findings reflect the fact that holism, with a focus on situating attitudes within their personal, social, and historical contexts, has become the zeitgeist of attitude research during this period.

  19. Molecular docking analysis of known flavonoids as dual COX-2 inhibitors in the context of cancer

    PubMed Central

    Dash, Raju; Uddin, Mir Muhammad Nasir; Hosen, S.M. Zahid; Rahim, Zahed Bin; Dinar, Abu Mansur; Kabir, Mohammad Shah Hafez; Sultan, Ramiz Ahmed; Islam, Ashekul; Hossain, Md Kamrul

    2015-01-01

    Cyclooxygenase-2 (COX-2) catalyzes the synthesis of prostaglandin E2, which is associated with tumor growth, infiltration, and metastasis in preclinical experiments. Known inhibitors of COX-2 exhibit toxicity. Therefore, it is of interest to screen natural compounds such as flavonoids against COX-2. Molecular docking of 12 known flavonoids against COX-2 was performed with FlexX and ArgusLab. All compounds showed a favourable binding energy of > -10 kJ/mol in FlexX and > -8 kcal/mol in ArgusLab. However, these data require in vitro and in vivo verification for further consideration. PMID:26770028
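The screening step described in this abstract amounts to keeping only compounds whose docking scores clear both cutoffs, interpreting "favourable" as more negative than the stated threshold. A minimal sketch of that filter, with made-up compound names and score values (the abstract does not reproduce per-compound results):

```python
# Hypothetical docking scores for illustration only; the paper's actual
# per-compound values are not reproduced here. More negative = more
# favourable predicted binding.
scores = {
    "compound_A": {"flexx_kj": -22.4, "arguslab_kcal": -9.1},
    "compound_B": {"flexx_kj": -18.7, "arguslab_kcal": -8.6},
    "compound_C": {"flexx_kj": -7.9,  "arguslab_kcal": -6.2},
}

# Keep compounds beating both cutoffs: more negative than -10 kJ/mol in
# FlexX and more negative than -8 kcal/mol in ArgusLab.
hits = [name for name, s in scores.items()
        if s["flexx_kj"] < -10.0 and s["arguslab_kcal"] < -8.0]
print(hits)  # ['compound_A', 'compound_B']
```

Applying the same dual-threshold filter to real docking output is a one-line change: replace the hypothetical dictionary with parsed FlexX and ArgusLab scores.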

  20. NASA/DOD Aerospace Knowledge Diffusion Research Project. XXIV - A general approach to measuring the value of aerospace information products and services

    NASA Technical Reports Server (NTRS)

    Brinberg, Herbert R.; Pinelli, Thomas E.

    1993-01-01

    This paper discusses the various approaches to measuring the value of information, first defining the meanings of information, economics of information, and value. It concludes that no general model of measuring the value of information is possible and that the usual approaches, such as cost/benefit equations, have very limited applications. It also concludes that in specific contexts with given goals for newly developed products and services or newly acquired information, there is a basis for its objective valuation. The axioms and inputs for such a model are described and directions for further verification and analysis are proposed.

  1. Assume-Guarantee Abstraction Refinement Meets Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Bogomolov, Sergiy; Frehse, Goran; Greitschus, Marius; Grosu, Radu; Pasareanu, Corina S.; Podelski, Andreas; Strump, Thomas

    2014-01-01

    Compositional verification techniques in the assume-guarantee style have been successfully applied to transition systems to efficiently reduce the search space by leveraging the compositional nature of the systems under consideration. We adapt these techniques to the domain of hybrid systems with affine dynamics. To build assumptions we introduce an abstraction based on location merging. We integrate the assume-guarantee style analysis with automatic abstraction refinement. We have implemented our approach in the symbolic hybrid model checker SpaceEx. The evaluation shows its practical potential. To the best of our knowledge, this is the first work combining assume-guarantee reasoning with automatic abstraction refinement in the context of hybrid automata.

  2. A retrospective of the GREGOR solar telescope in scientific literature

    NASA Astrophysics Data System (ADS)

    Denker, C.; von der Lühe, O.; Feller, A.; Arlt, K.; Balthasar, H.; Bauer, S.-M.; Bello González, N.; Berkefeld, Th.; Caligari, P.; Collados, M.; Fischer, A.; Granzer, T.; Hahn, T.; Halbgewachs, C.; Heidecke, F.; Hofmann, A.; Kentischer, T.; Klvaňa, M.; Kneer, F.; Lagg, A.; Nicklas, H.; Popow, E.; Puschmann, K. G.; Rendtel, J.; Schmidt, D.; Schmidt, W.; Sobotka, M.; Solanki, S. K.; Soltau, D.; Staude, J.; Strassmeier, K. G.; Volkmer, R.; Waldmann, T.; Wiehr, E.; Wittmann, A. D.; Woche, M.

    2012-11-01

    In this review, we look back on the literature on the GREGOR solar telescope project, including science cases, telescope subsystems, and post-focus instruments. The articles date back to the year 2000, when the initial concepts for a new solar telescope on Tenerife were first presented at scientific meetings. This comprehensive bibliography covers the literature through the year 2012, i.e., the final stages of commissioning and science verification. Taking stock of the various publications in peer-reviewed journals and conference proceedings also provides the "historical" context for the reference articles in this special issue of Astronomische Nachrichten/Astronomical Notes.

  3. Formal Verification of Safety Buffers for State-Based Conflict Detection and Resolution

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Jeannin, Jean-Baptiste; Munoz, Cesar A.

    2010-01-01

    The information provided by global positioning systems is never totally exact, and there are always errors when measuring position and velocity of moving objects such as aircraft. This paper studies the effects of these errors in the actual separation of aircraft in the context of state-based conflict detection and resolution. Assuming that the state information is uncertain but that bounds on the errors are known, this paper provides an analytical definition of a safety buffer and sufficient conditions under which this buffer guarantees that actual conflicts are detected and solved. The results are presented as theorems, which were formally proven using a mechanical theorem prover.
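The core idea in this abstract, that a buffer can absorb bounded state errors, can be illustrated with a small sketch. The function names, the buffer rule (inflating the required separation by the worst-case combined position error), and the numeric values below are illustrative assumptions, not the paper's formal definitions:

```python
# Illustrative sketch (not the paper's formal safety-buffer definition):
# a conservative horizontal-separation check under bounded position error.
# If each aircraft's reported position may be off by at most pos_err_m
# meters, two aircraft can be up to 2 * pos_err_m closer than they appear,
# so the detection threshold is inflated by that worst case.

def required_separation(min_sep_m: float, pos_err_m: float) -> float:
    """Minimum separation inflated by the worst-case combined error."""
    return min_sep_m + 2.0 * pos_err_m

def conflict_detected(apparent_sep_m: float,
                      min_sep_m: float,
                      pos_err_m: float) -> bool:
    """Flag a conflict whenever the buffered threshold is violated."""
    return apparent_sep_m < required_separation(min_sep_m, pos_err_m)

# Example: 9260 m (~5 NM) minimum separation, 100 m position-error bound.
print(conflict_detected(9400.0, 9260.0, 100.0))  # True: 9400 < 9460
print(conflict_detected(9500.0, 9260.0, 100.0))  # False: 9500 >= 9460
```

The conservatism is deliberate: with the buffer in place, any pair that is truly in conflict is guaranteed to appear in conflict, which is the property the paper proves formally for its buffer definitions.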

  4. ESSAA: Embedded system safety analysis assistant

    NASA Technical Reports Server (NTRS)

    Wallace, Peter; Holzer, Joseph; Guarro, Sergio; Hyatt, Larry

    1987-01-01

    The Embedded System Safety Analysis Assistant (ESSAA) is a knowledge-based tool that can assist in identifying disaster scenarios in which embedded software issues hazardous control commands to the surrounding hardware. ESSAA is intended to work from outputs to inputs, as a complement to simulation and verification methods. Rather than treating the software in isolation, it examines the context in which the software is to be deployed. Given a specified disastrous outcome, ESSAA works from a qualitative, abstract model of the complete system to infer sets of environmental conditions and/or failures that could cause that outcome. The scenarios can then be examined in depth for plausibility using existing techniques.

  5. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, the STS-96 crew looks at equipment as part of a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. From left are Mission Specialist Ellen Ochoa (behind the opened storage cover), Commander Kent Rominger, Pilot Rick Husband (holding a lithium hydroxide canister) and Mission Specialists Dan Barry, Valery Tokarev of Russia and Julie Payette. In the background is TTI interpreter Valentina Maydell. The other crew member at KSC for the IVT is Mission Specialist Tamara Jernigan. Mission STS-96 carries the SPACEHAB Logistics Double Module, which has equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  6. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, STS-96 crew members look over equipment during a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. From left are Khristal Parker, with Boeing; Mission Specialist Dan Barry, Pilot Rick Husband, Mission Specialist Tamara Jernigan, and at the far right, Mission Specialist Julie Payette. An unidentified worker is in the background. Also at KSC for the IVT are Commander Kent Rominger and Mission Specialists Ellen Ochoa and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  7. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, (left to right) STS-96 Pilot Rick Husband and Mission Specialists Julie Payette and Ellen Ochoa work the straps on the Sequential Shunt Unit (SSU) in front of them. The STS-96 crew is at KSC for a payload Interface Verification Test (IVT) for its upcoming mission to the International Space Station. Other crew members at KSC for the IVT are Commander Kent Rominger and Mission Specialists Tamara Jernigan, Dan Barry and Valery Tokarev of Russia. The SSU is part of the cargo on Mission STS-96, which carries the SPACEHAB Logistics Double Module, with equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  8. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, STS-96 Mission Specialist Valery Tokarev of Russia (left) and Commander Kent Rominger (second from right) listen to Lynn Ashby (far right), with JSC, talking about the SPACEHAB equipment in front of them during a payload Interface Verification Test (IVT). In the background behind Tokarev is TTI interpreter Valentina Maydell. Other STS-96 crew members at KSC for the IVT are Pilot Rick Husband and Mission Specialists Dan Barry, Ellen Ochoa, Tamara Jernigan and Julie Payette. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  9. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    During a payload Interface Verification Test (IVT) in the SPACEHAB Facility, STS-96 Mission Specialist Valery Tokarev of Russia (second from left) and Commander Kent Rominger learn about the Sequential Shunt Unit (SSU) in front of them from Lynn Ashby (far right), with Johnson Space Center. At the far left looking on is TTI interpreter Valentina Maydell. Other crew members at KSC for the IVT are Pilot Rick Husband and Mission Specialists Ellen Ochoa, Tamara Jernigan, Dan Barry and Julie Payette. The SSU is part of the cargo on Mission STS-96, which carries the SPACEHAB Logistics Double Module, with equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  10. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, STS-96 Mission Specialist Valery Tokarev (in foreground) of the Russian Space Agency closes a container, part of the equipment that will be in the SPACEHAB module on mission STS-96. Behind Tokarev are Pilot Rick Husband (left) and Mission Specialist Dan Barry (right). Other crew members at KSC for a payload Interface Verification Test for the upcoming mission to the International Space Station are Commander Kent Rominger and Mission Specialists Ellen Ochoa, Tamara Jernigan and Julie Payette. Mission STS-96 carries the SPACEHAB Logistics Double Module, which has equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  11. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    During a payload Interface Verification Test (IVT) in the SPACEHAB Facility, STS-96 Mission Specialist Tamara Jernigan checks over instructions while Mission Specialist Dan Barry looks up from the Sequential Shunt Unit (SSU) in front of him to other equipment Lynn Ashby (right), with Johnson Space Center, is pointing at. Other crew members at KSC for the IVT are Commander Kent Rominger, Pilot Rick Husband, and Mission Specialists Ellen Ochoa, Julie Payette and Valery Tokarev of Russia. The SSU is part of the cargo on Mission STS-96, which carries the SPACEHAB Logistics Double Module, with equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  12. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    During a payload Interface Verification Test (IVT) in the SPACEHAB Facility, STS-96 Pilot Rick Husband and Mission Specialist Ellen Ochoa (on the left) and Mission Specialist Julie Payette (on the far right) listen to Khristal Parker (second from right), with Boeing, explain the equipment in front of them. Other crew members at KSC for the IVT are Commander Kent Rominger and Mission Specialists Tamara Jernigan, Dan Barry and Valery Tokarev of Russia. The Sequential Shunt Unit (SSU) is part of the cargo on Mission STS-96, which carries the SPACEHAB Logistics Double Module, with equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  13. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility for a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station are (kneeling) STS-96 Mission Specialists Julie Payette and Ellen Ochoa, Pilot Rick Husband, and (standing at right) Mission Specialist Dan Barry. At the left is James Behling, with Boeing, explaining some of the equipment that will be on board STS-96. Other STS-96 crew members at KSC for the IVT are Commander Kent Rominger and Mission Specialists Tamara Jernigan and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  14. Decontamination and management of human remains following incidents of hazardous chemical release.

    PubMed

    Hauschild, Veronique D; Watson, Annetta; Bock, Robert

    2012-01-01

    The objective is to provide specific guidance and resources for the systematic and orderly decontamination of human remains resulting from a chemical terrorist attack or accidental chemical release. A detailed review and a health-based decision-criteria protocol are summarized. The protocol's basis and logic are derived from analyses of compound-specific toxicological data and chemical/physical characteristics. The guidance is suitable for civilian or military settings where human remains potentially contaminated with hazardous chemicals may be present, such as sites of transportation accidents, terrorist operations, or medical examiner processing points. The guidance is developed from data characterizing controlled experiments with laboratory animals, fabrics, and materiel. Logic and specific procedures for decontamination and management of remains, protection of mortuary affairs personnel, and decision criteria to determine when remains are sufficiently decontaminated are presented. Established procedures, as well as existing materiel and available equipment for decontamination and verification, provide reasonable means to mitigate chemical hazards from chemically exposed remains. Unique scenarios, such as those involving supralethal concentrations of certain liquid chemical warfare agents, may prove difficult to decontaminate but can be resolved in a timely manner by application of the characterized systematic approaches. Decision criteria and protocols to "clear" decontaminated remains for transport and processing are also provided. Once appropriate decontamination and verification have been accomplished, normal procedures for management and release of remains can be followed.

  15. Thermal design verification testing of the Clementine spacecraft: Quick, cheap, and useful

    NASA Technical Reports Server (NTRS)

    Kim, Jeong H.; Hyman, Nelson L.

    1994-01-01

    At this writing, Clementine had successfully fulfilled its moon-mapping mission; at this reading it will have also, with continued good fortune, taken a close look at the asteroid Geographos. The thermal design that made all this possible was indeed formidable in many respects, with very high ratios of requirements to available resources and of performance to cost and mass. There was no question that a test verification of this quite unique and complex design was essential, but it had to be squeezed into an unyielding schedule and executed with bare-bones cost and manpower. After describing the thermal control subsystem's features, we report the drama, close calls, and cost-cutting, and how objectives were achieved under severe handicap but (thankfully) with little management and documentation interference. Topics include the newly refurbished chamber (ready just in time), the reality level of the engineering model, using the analytical thermal model, the manner of environment simulation, the hand-scratched film heaters, the functioning of all three types of heat pipes (but not all heat pipes), and the BMDO sensors' checkout through the chamber window. Test results revealed some surprises and much valuable data, resulting in thermal model and flight hardware refinements. We conclude with the level of correlation between predictions and both test temperatures and flight telemetry.

  16. Systematic Transcreation of Self-Help Smoking Cessation Materials for Hispanic/Latino Smokers: Improving Cultural Relevance and Acceptability.

    PubMed

    Piñeiro, Bárbara; Díaz, Diana R; Monsalve, Luis M; Martínez, Úrsula; Meade, Cathy D; Meltzer, Lauren R; Brandon, Karen O; Unrod, Marina; Brandon, Thomas H; Simmons, Vani N

    2018-01-01

    Smoking-related illnesses are the leading causes of death among Hispanics/Latinos. Yet, there are few smoking cessation interventions targeted for this population. The goal of this study was to "transcreate" an existing, previously validated, English-language self-help smoking cessation intervention, titled Forever Free®: Stop Smoking for Good, for Spanish-speaking smokers. Rather than simply translating the materials, our transcreation process involved culturally adapting the intervention to enhance acceptability and receptivity of the information. We utilized a multiphase qualitative approach (focus groups and learner verification interviews) to develop a linguistically and culturally relevant intervention for the diverse sub-ethnic groups of Hispanic/Latino smokers. Focus group findings indicated a need to underscore several additional cultural characteristics and themes such as the need to address familism and unique stressors faced by immigrants and to provide information regarding nicotine replacement therapy. Learner verification findings indicated a need to further emphasize financial and social benefits of quitting smoking and to discuss how family and friends can support the quit attempt. These steps led to the development of a Spanish-language smoking cessation intervention titled, Libre del cigarillo, por mi familia y por mí: Guía para dejar de fumar, that is currently being tested in a national randomized controlled trial.

  17. Experimental Evaluation of Verification and Validation Tools on Martian Rover Software

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Venet, Arnaud; Visser, Willem; Washington, Rich

    2003-01-01

    We report on a study to determine the maturity of different verification and validation technologies (V&V) on a representative example of NASA flight software. The study consisted of a controlled experiment where three technologies (static analysis, runtime analysis and model checking) were compared to traditional testing with respect to their ability to find seeded errors in a prototype Mars Rover. What makes this study unique is that it is the first (to the best of our knowledge) to do a controlled experiment to compare formal-methods-based tools to testing on a realistic industrial-size example where the emphasis was on collecting as much data on the performance of the tools and the participants as possible. The paper includes a description of the Rover code that was analyzed, the tools used, as well as a detailed description of the experimental setup and the results. Due to the complexity of setting up the experiment, our results cannot be generalized, but we believe it can still serve as a valuable point of reference for future studies of this kind. It did confirm the belief we had that advanced tools can outperform testing when trying to locate concurrency errors. Furthermore, the results of the experiment inspired a novel framework for testing the next generation of the Rover.

  18. Property-driven functional verification technique for high-speed vision system-on-chip processor

    NASA Astrophysics Data System (ADS)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This vision chip verification complexity is also related to the fact that in most vision chip design cycles, extensive efforts are focused on how to optimize chip metrics such as performance, power, and area. Design functional verification is not explicitly considered at an earlier stage, at which the most sound decisions are made. In this paper, we propose a semi-automatic property-driven verification technique. The implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can effectively reduce the verification effort by up to 20% for a complex vision chip design while also reducing simulation and debugging overheads.

  19. Hydrologic data-verification management program plan

    USGS Publications Warehouse

    Alexander, C.W.

    1982-01-01

    Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept, in which a master data-verification program uses multiple special-purpose subroutines and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
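
    The "master program plus screen file" concept described in this record can be sketched in a few lines. This is a hypothetical illustration: the parameter names, criteria, and checks below are invented, not the USGS/WATSTORE implementation.

```python
# Hypothetical sketch: a generic master verification routine applies
# special-purpose subroutines using per-parameter criteria read from a
# "screen file". Parameter names and thresholds are invented.

def check_range(values, lo, hi):
    """Return indices of values outside the allowed range."""
    return [i for i, v in enumerate(values) if not (lo <= v <= hi)]

def check_spike(values, max_step):
    """Return indices where the jump from the previous value is too large."""
    return [i for i in range(1, len(values))
            if abs(values[i] - values[i - 1]) > max_step]

# "Screen file": verification criteria keyed by parameter.
SCREEN = {
    "gage_height_ft": {"range": (0.0, 30.0), "max_step": 2.0},
    "discharge_cfs": {"range": (0.0, 5000.0), "max_step": 500.0},
}

def verify(parameter, values, screen=SCREEN):
    """Master routine: apply every criterion defined for the parameter."""
    criteria = screen[parameter]
    lo, hi = criteria["range"]
    return {
        "out_of_range": check_range(values, lo, hi),
        "spike": check_spike(values, criteria["max_step"]),
    }

flags = verify("gage_height_ft", [1.2, 1.3, 9.9, 1.4, -0.5])
# flags == {"out_of_range": [4], "spike": [2, 3]}
```

    A real screen file would live outside the code and hold many more criteria, but the dispatch pattern is the point: the master routine stays generic while each criterion is a small special-purpose subroutine.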

  20. Development and Use of Engineering Standards for Computational Fluid Dynamics for Complex Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Lee, Hyung B.; Ghia, Urmila; Bayyuk, Sami; Oberkampf, William L.; Roy, Christopher J.; Benek, John A.; Rumsey, Christopher L.; Powers, Joseph M.; Bush, Robert H.; Mani, Mortaza

    2016-01-01

    Computational fluid dynamics (CFD) and other advanced modeling and simulation (M&S) methods are increasingly relied on for predictive performance, reliability and safety of engineering systems. Analysts, designers, decision makers, and project managers, who must depend on simulation, need practical techniques and methods for assessing simulation credibility. The AIAA Guide for Verification and Validation of Computational Fluid Dynamics Simulations (AIAA G-077-1998 (2002)), originally published in 1998, was the first engineering standards document available to the engineering community for verification and validation (V&V) of simulations. Much progress has been made in these areas since 1998. The AIAA Committee on Standards for CFD is currently updating this Guide to incorporate in it the important developments that have taken place in V&V concepts, methods, and practices, particularly with regard to the broader context of predictive capability and uncertainty quantification (UQ) methods and approaches. This paper will provide an overview of the changes and extensions currently underway to update the AIAA Guide. Specifically, a framework for predictive capability will be described for incorporating a wide range of error and uncertainty sources identified during the modeling, verification, and validation processes, with the goal of estimating the total prediction uncertainty of the simulation. The Guide's goal is to provide a foundation for understanding and addressing major issues and concepts in predictive CFD. However, this Guide will not recommend specific approaches in these areas as the field is rapidly evolving. It is hoped that the guidelines provided in this paper, and explained in more detail in the Guide, will aid in the research, development, and use of CFD in engineering decision-making.
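
    The framework described above, estimating total prediction uncertainty from many error and uncertainty sources, can be illustrated with a root-sum-square aggregation, as used in validation frameworks such as ASME V&V 20. The source names and magnitudes below are hypothetical, and the AIAA Guide does not necessarily prescribe this exact formula.

```python
import math

# Illustrative aggregation of independent error/uncertainty sources into
# a total prediction uncertainty via root-sum-square (RSS), in the spirit
# of validation frameworks such as ASME V&V 20. Values are hypothetical.

sources = {
    "numerical (discretization)": 0.8,  # u_num
    "input parameters": 1.5,            # u_input
    "experimental data": 0.6,           # u_D
}

u_total = math.sqrt(sum(u ** 2 for u in sources.values()))
# u_total is about 1.80 (the square root of 0.8**2 + 1.5**2 + 0.6**2)
```

    RSS aggregation assumes the sources are independent; correlated sources require a covariance treatment instead.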

  1. Development and verification of a real-time stochastic precipitation nowcasting system for urban hydrology in Belgium

    NASA Astrophysics Data System (ADS)

    Foresti, L.; Reyniers, M.; Seed, A.; Delobbe, L.

    2016-01-01

    The Short-Term Ensemble Prediction System (STEPS) is implemented in real-time at the Royal Meteorological Institute (RMI) of Belgium. The main idea behind STEPS is to quantify the forecast uncertainty by adding stochastic perturbations to the deterministic Lagrangian extrapolation of radar images. The stochastic perturbations are designed to account for the unpredictable precipitation growth and decay processes and to reproduce the dynamic scaling of precipitation fields, i.e., the observation that large-scale rainfall structures are more persistent and predictable than small-scale convective cells. This paper presents the development, adaptation and verification of the STEPS system for Belgium (STEPS-BE). STEPS-BE provides in real-time 20-member ensemble precipitation nowcasts at 1 km and 5 min resolutions up to 2 h lead time using a 4 C-band radar composite as input. In the context of the PLURISK project, STEPS forecasts were generated to be used as input in sewer system hydraulic models for nowcasting urban inundations in the cities of Ghent and Leuven. Comprehensive forecast verification was performed in order to detect systematic biases over the given urban areas and to analyze the reliability of probabilistic forecasts for a set of case studies in 2013 and 2014. The forecast biases over the cities of Leuven and Ghent were found to be small, which is encouraging for future integration of STEPS nowcasts into the hydraulic models. Probabilistic forecasts of exceeding 0.5 mm h-1 are reliable up to 60-90 min lead time, while the ones of exceeding 5.0 mm h-1 are only reliable up to 30 min. The STEPS ensembles are slightly under-dispersive and represent only 75-90 % of the forecast errors.
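
    The core STEPS idea, a deterministic Lagrangian extrapolation plus a stochastic perturbation per ensemble member, can be sketched as follows. The real system uses radar-derived motion fields and spatially correlated, scale-dependent noise from a cascade decomposition; this sketch substitutes a fixed pixel shift and white noise purely for illustration.

```python
import numpy as np

# Toy sketch of the STEPS ensemble idea: each member is a deterministic
# extrapolation of the latest radar field plus a stochastic growth/decay
# perturbation. Field size, advection, and noise scale are illustrative.

rng = np.random.default_rng(42)
radar = rng.gamma(shape=2.0, scale=1.0, size=(64, 64))  # rain rate, mm/h
advection = (1, 2)                 # displacement in pixels per time step
n_members, perturb_std = 20, 0.5

def extrapolate(field, shift):
    """Deterministic Lagrangian extrapolation as a simple pixel shift."""
    return np.roll(field, shift, axis=(0, 1))

members = []
for _ in range(n_members):
    deterministic = extrapolate(radar, advection)
    noise = rng.normal(0.0, perturb_std, size=radar.shape)
    members.append(np.clip(deterministic + noise, 0.0, None))
ensemble = np.stack(members)       # shape: (20, 64, 64)

# Probabilistic product: per-pixel probability of exceeding 0.5 mm/h,
# the lower of the two thresholds verified in the study above.
p_exceed = (ensemble > 0.5).mean(axis=0)
```

    Verification statements like "reliable up to 60-90 min lead time" then compare such exceedance probabilities against observed frequencies.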

  2. Development and verification of a real-time stochastic precipitation nowcasting system for urban hydrology in Belgium

    NASA Astrophysics Data System (ADS)

    Foresti, L.; Reyniers, M.; Seed, A.; Delobbe, L.

    2015-07-01

    The Short-Term Ensemble Prediction System (STEPS) is implemented in real-time at the Royal Meteorological Institute (RMI) of Belgium. The main idea behind STEPS is to quantify the forecast uncertainty by adding stochastic perturbations to the deterministic Lagrangian extrapolation of radar images. The stochastic perturbations are designed to account for the unpredictable precipitation growth and decay processes and to reproduce the dynamic scaling of precipitation fields, i.e., the observation that large-scale rainfall structures are more persistent and predictable than small-scale convective cells. This paper presents the development, adaptation and verification of the STEPS system for Belgium (STEPS-BE). STEPS-BE provides in real-time 20-member ensemble precipitation nowcasts at 1 km and 5 min resolution up to 2 h lead time using a 4 C-band radar composite as input. In the context of the PLURISK project, STEPS forecasts were generated to be used as input in sewer system hydraulic models for nowcasting urban inundations in the cities of Ghent and Leuven. Comprehensive forecast verification was performed in order to detect systematic biases over the given urban areas and to analyze the reliability of probabilistic forecasts for a set of case studies in 2013 and 2014. The forecast biases over the cities of Leuven and Ghent were found to be small, which is encouraging for future integration of STEPS nowcasts into the hydraulic models. Probabilistic forecasts of exceeding 0.5 mm h-1 are reliable up to 60-90 min lead time, while the ones of exceeding 5.0 mm h-1 are only reliable up to 30 min. The STEPS ensembles are slightly under-dispersive and represent only 80-90 % of the forecast errors.

  3. Evaluation of an alternative in vitro test battery for detecting reproductive toxicants in a grouping context.

    PubMed

    Kroese, E Dinant; Bosgra, Sieto; Buist, Harrie E; Lewin, Geertje; van der Linden, Sander C; Man, Hai-yen; Piersma, Aldert H; Rorije, Emiel; Schulpen, Sjors H W; Schwarz, Michael; Uibel, Frederik; van Vugt-Lussenburg, Barbara M A; Wolterbeek, Andre P M; van der Burg, Bart

    2015-08-01

    Previously we showed a battery consisting of CALUX transcriptional activation assays, the ReProGlo assay, the embryonic stem cell test, and the zebrafish embryotoxicity assay as 'apical' tests to correctly predict developmental toxicity for 11 out of 12 compounds, and to explain the one false negative [7]. Here we report on applying this battery within the context of grouping and read across, put forward as a potential tool to fill data gaps and avoid animal testing, to distinguish in vivo non- or weak developmental toxicants from potent developmental toxicants within groups of structural analogs. The battery correctly distinguished 2-methylhexanoic acid, monomethyl phthalate, and monobutyltin trichloride as non- or weak developmental toxicants from structurally related developmental toxicants valproic acid, mono-ethylhexyl phthalate, and tributyltin chloride, respectively, and, therefore, holds promise as a biological verification model in grouping and read across approaches. The relevance of toxicokinetic information is indicated. Copyright © 2014 Elsevier Inc. All rights reserved.

  4. Verification and Validation in a Rapid Software Development Process

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Easterbrook, Steve M.

    1997-01-01

    The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in the use of new methods in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing on the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.

  5. Autonomous indoor wayfinding for individuals with cognitive impairments

    PubMed Central

    2010-01-01

    Background A challenge to individuals with cognitive impairments in wayfinding is how to remain oriented, recall routines, and travel in unfamiliar areas while relying on limited cognitive capacity. While people without disabilities often use maps or written directions as navigation tools or for remaining oriented, this cognitively impaired population is very sensitive to issues of abstraction (e.g. icons on maps or signage) and presents the designer with a challenge to tailor navigation information specific to each user and context. Methods This paper describes an approach to providing distributed cognition support of travel guidance for persons with cognitive disabilities. A solution is proposed based on passive near-field RFID tags and scanning PDAs. A prototype is built and tested in field experiments with real subjects. The unique strength of the system is the ability to provide unique-to-the-user prompts that are triggered by context. The key to the approach is to spread the context awareness across the system, with the context being flagged by the RFID tags and the appropriate response being evoked by displaying the appropriate path guidance images indexed by the intersection of specific end-user and context ID embedded in RFID tags. Results We found that passive RFIDs generally served as good context for triggering navigation prompts, although individual differences in effectiveness varied. The results of controlled experiments provided more evidence with regard to the applicability of the proposed autonomous indoor wayfinding method. Conclusions Our findings suggest that the ability to adapt indoor wayfinding devices for appropriate timing of directions and standing orientation will be particularly important. PMID:20840786
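
    The prompt-selection logic described in this record, guidance indexed by the intersection of end-user and context ID, reduces to a keyed lookup. All IDs and image names below are hypothetical.

```python
# Hypothetical sketch: the guidance image is indexed by the pair
# (end-user ID, context ID read from a passive RFID tag).

PROMPTS = {
    ("user_07", "tag_lobby"): "user07_lobby_turn_left.png",
    ("user_07", "tag_corridor"): "user07_corridor_go_straight.png",
    ("user_12", "tag_lobby"): "user12_lobby_photo_of_elevator.png",
}

def prompt_for(user_id, tag_id, default="call_caregiver.png"):
    """Return the user- and context-specific guidance image, or a fallback."""
    return PROMPTS.get((user_id, tag_id), default)

# prompt_for("user_07", "tag_lobby") -> "user07_lobby_turn_left.png"
# prompt_for("user_99", "tag_lobby") -> "call_caregiver.png" (unknown pair)
```

    Spreading context awareness across the system means the tag carries only the context ID; the device resolves the (user, context) pair into a prompt tailored to that user.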

  6. Isotropic band gaps and freeform waveguides observed in hyperuniform disordered photonic solids

    PubMed Central

    Man, Weining; Florescu, Marian; Williamson, Eric Paul; He, Yingquan; Hashemizad, Seyed Reza; Leung, Brian Y. C.; Liner, Devin Robert; Torquato, Salvatore; Chaikin, Paul M.; Steinhardt, Paul J.

    2013-01-01

    Recently, disordered photonic media and random textured surfaces have attracted increasing attention as strong light diffusers with broadband and wide-angle properties. We report the experimental realization of an isotropic complete photonic band gap (PBG) in a 2D disordered dielectric structure. This structure is designed by a constrained optimization method, which combines advantages of both isotropy due to disorder and controlled scattering properties due to low-density fluctuations (hyperuniformity) and uniform local topology. Our experiments use a modular design composed of Al2O3 walls and cylinders arranged in a hyperuniform disordered network. We observe a complete PBG in the microwave region, in good agreement with theoretical simulations, and show that the intrinsic isotropy of this unique class of PBG materials enables remarkable design freedom, including the realization of waveguides with arbitrary bending angles impossible in photonic crystals. This experimental verification of a complete PBG and realization of functional defects in this unique class of materials demonstrate their potential as building blocks for precise manipulation of photons in planar optical microcircuits and has implications for disordered acoustic and electronic band gap materials. PMID:24043795

  7. Bi-Axial Solar Array Drive Mechanism: Design, Build and Environmental Testing

    NASA Technical Reports Server (NTRS)

    Scheidegger, Noemy; Ferris, Mark; Phillips, Nigel

    2014-01-01

    The development of the Bi-Axial Solar Array Drive Mechanism (BSADM) presented in this paper is a demonstration of SSTL's unique space manufacturing approach that enables rapid development cycles for cost-effective products that meet ever-challenging mission requirements. The BSADM is designed to orient a solar array wing towards the sun, using its first rotation axis to track the sun, and its second rotation axis to compensate for the satellite orbit and attitude changes needed for a successful payload operation. The tight development schedule, with manufacture of 7 Flight Models within 1.5 years after kick-off, is offset by the risk reduction of using qualified key component families from other proven SSTL mechanisms. This allowed focusing the BSADM design activities on the mechanism features that are unique to the BSADM, and having an Engineering Qualification Model (EQM) built 8 months after kick-off. The EQM is currently undergoing a full environmental qualification test campaign. This paper presents the BSADM design approach that enabled meeting such a challenging schedule, its design particularities, and the ongoing verification activities.

  8. Systematic Model-in-the-Loop Test of Embedded Control Systems

    NASA Astrophysics Data System (ADS)

    Krupp, Alexander; Müller, Wolfgang

    Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural language formalization, there is no standardized and accepted means for formal property definition as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.
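
    The Classification Tree Method underlying CTM/ES partitions each input of the system under test into equivalence classes and draws test cases from their cross product. A minimal sketch, with invented classifications:

```python
from itertools import product

# Sketch of the Classification Tree Method idea behind CTM/ES: partition
# each input into equivalence classes, then form test cases as
# combinations of classes. The classifications below are invented.

classifications = {
    "throttle": ["idle", "partial", "full"],
    "brake": ["released", "pressed"],
    "temperature": ["cold", "nominal", "hot"],
}

# Exhaustive combination (practical only for small trees); CTM tools
# typically let the engineer select a covering subset of rows instead.
test_cases = [dict(zip(classifications, combo))
              for combo in product(*classifications.values())]
# len(test_cases) == 3 * 2 * 3 == 18
```

    The enhanced method described in the record additionally attaches formal properties (in a hardware verification language) to such test descriptions, which this sketch does not attempt.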

  9. Mapping the Route of Leadership Education: Caution Ahead

    DTIC Science & Technology

    2004-01-01

    apprenticeship, and study of educational purpose. Such context-stripped research-based knowledge cannot substitute for professional knowledge.” — Joe L...concrete, rational processes in high esteem. Reviewing the J9 proposal required us to step back and review educational strategies for developing...tion. Those who learn and employ that knowledge in unique contexts are rightly described as professionals; in them lies the heart and soul of the

  10. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic as opposed to low-level Boolean equivalence verification such as that done using BDDs and Model Checking. Specification and verification, sometimes called formal methods, are one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  11. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    NASA Astrophysics Data System (ADS)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource-loaded PMCS task-based activities, ensuring all requirements will be verified.
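
    The per-requirement Verification Plan structure described in this record maps naturally onto a simple record type. The example requirement and values below are hypothetical, not taken from the LSST model.

```python
from dataclasses import dataclass, field
from typing import List

# Sketch of the per-requirement Verification Plan record described above.
# Field names follow the record's terminology; values are invented.

@dataclass
class VerificationPlan:
    requirement_id: str
    verification_requirement: str
    success_criteria: str
    methods: List[str]   # e.g. Test, Analysis, Inspection, Demonstration
    level: str           # e.g. "Subsystem", "System"
    owner: str
    events: List[str] = field(default_factory=list)  # concurrent groupings

plan = VerificationPlan(
    requirement_id="REQ-0042",
    verification_requirement="Verify image quality under nominal seeing.",
    success_criteria="Delivered PSF FWHM within the budget allocation.",
    methods=["Test", "Analysis"],
    level="System",
    owner="Systems Engineering",
    events=["Verification Event 3"],
)
```

    In the SysML model each method becomes a Verification Activity; grouping activities into Verification Events is what enables the schedule mapping into Primavera P6.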

  12. Contextuality in canonical systems of random variables

    NASA Astrophysics Data System (ADS)

    Dzhafarov, Ehtibar N.; Cervantes, Víctor H.; Kujala, Janne V.

    2017-10-01

    Random variables representing measurements, broadly understood to include any responses to any inputs, form a system in which each of them is uniquely identified by its content (that which it measures) and its context (the conditions under which it is recorded). Two random variables are jointly distributed if and only if they share a context. In a canonical representation of a system, all random variables are binary, and every content-sharing pair of random variables has a unique maximal coupling (the joint distribution imposed on them so that they coincide with maximal possible probability). The system is contextual if these maximal couplings are incompatible with the joint distributions of the context-sharing random variables. We propose to represent any system of measurements in a canonical form and to consider the system contextual if and only if its canonical representation is contextual. As an illustration, we establish a criterion for contextuality of the canonical system consisting of all dichotomizations of a single pair of content-sharing categorical random variables. This article is part of the themed issue 'Second quantum revolution: foundational questions'.
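
    For two binary random variables, the maximal coupling mentioned above has a closed form: variables with P(X=1)=p and P(Y=1)=q can be made to coincide with probability 1 - |p - q|, and the coupling achieving it is unique. A short sketch:

```python
# Maximal coupling of two binary random variables with P(X=1)=p and
# P(Y=1)=q: the joint distribution under which they agree with the
# maximal possible probability, 1 - |p - q|.

def maximal_coupling(p, q):
    """Joint probabilities {(x, y): P(X=x, Y=y)} of the maximal coupling."""
    return {
        (1, 1): min(p, q),
        (0, 0): min(1 - p, 1 - q),
        (1, 0): max(p - q, 0.0),
        (0, 1): max(q - p, 0.0),
    }

joint = maximal_coupling(0.7, 0.4)
agreement = joint[(1, 1)] + joint[(0, 0)]
# agreement is 1 - |0.7 - 0.4| = 0.7 (up to floating-point rounding)
```

    Contextuality then amounts to asking whether all such pairwise maximal couplings can coexist with the empirically given joint distributions of context-sharing variables.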

  13. Self-verification motives at the collective level of self-definition.

    PubMed

    Chen, Serena; Chen, Karen Y; Shaw, Lindsay

    2004-01-01

    Three studies examined self-verification motives in relation to collective aspects of the self. Several moderators of collective self-verification were also examined--namely, the certainty with which collective self-views are held, the nature of one's ties to a source of self-verification, the salience of the collective self, and the importance of group identification. Evidence for collective self-verification emerged across all studies, particularly when collective self-views were held with high certainty (Studies 1 and 2), perceivers were somehow tied to the source of self-verification (Study 1), the collective self was salient (Study 2), and group identification was important (Study 3). To the authors' knowledge, these studies are the first to examine self-verification at the collective level of self-definition. The parallel and distinct ways in which self-verification processes may operate at different levels of self-definition are discussed.

  14. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  15. Discrete Circuits Support Generalized versus Context-Specific Vocal Learning in the Songbird.

    PubMed

    Tian, Lucas Y; Brainard, Michael S

    2017-12-06

    Motor skills depend on the reuse of individual gestures in multiple sequential contexts (e.g., a single phoneme in different words). Yet optimal performance requires that a given gesture be modified appropriately depending on the sequence in which it occurs. To investigate the neural architecture underlying such context-dependent modifications, we studied Bengalese finch song, which, like speech, consists of variable sequences of "syllables." We found that when birds are instructed to modify a syllable in one sequential context, learning generalizes across contexts; however, if unique instruction is provided in different contexts, learning is specific for each context. Using localized inactivation of a cortical-basal ganglia circuit specialized for song, we show that this balance between generalization and specificity reflects a hierarchical organization of neural substrates. Primary motor circuitry encodes a core syllable representation that contributes to generalization, while top-down input from cortical-basal ganglia circuitry biases this representation to enable context-specific learning. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. An object-oriented approach to deploying highly configurable Web interfaces for the ATLAS experiment

    NASA Astrophysics Data System (ADS)

    Lange, Bruno; Maidantchik, Carmen; Pommes, Kathy; Pavani, Varlen; Arosa, Breno; Abreu, Igor

    2015-12-01

    The ATLAS Technical Coordination maintains 17 Web systems to support its operation. These applications, whilst ranging from managing the process of publishing scientific papers to monitoring radiation levels in the equipment in the experimental cavern, are constantly prone to changes in requirements due to the collaborative nature of the experiment and its management. In this context, a Web framework is proposed to unify the generation of the supporting interfaces. FENCE assembles classes to build applications by making extensive use of JSON configuration files. It relies heavily on Glance, a technology that was set forth in 2003 to create an abstraction layer on top of the heterogeneous sources that store the technical coordination data. Once Glance maps out the database modeling, records can be referenced in the configuration files by wrapping unique identifiers in double enclosing brackets. The deployed content can be individually secured by attaching clearance attributes to their description, thus ensuring that view/edit privileges are granted to eligible users only. The framework also provides tools for securely writing into a database. Fully HTML5-compliant multi-step forms can be generated from their JSON description to ensure that the submitted data comply with a series of constraints. Input validation is carried out primarily on the server side but, following progressive enhancement guidelines, verification might also be performed on the client side by enabling specific markup data attributes, which are then handed over to the jQuery validation plug-in. User monitoring is accomplished by thoroughly logging user requests along with any POST data. Documentation is built from the source code using the phpDocumentor tool and made readily available for developers online. FENCE, therefore, speeds up the implementation of Web interfaces and reduces the response time to requirement changes by minimizing maintenance overhead.
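
    The double-bracket record references described in this record can be sketched as a small resolver. The concrete placeholder syntax, identifiers, and lookup table below are invented for illustration; they are not FENCE's or Glance's actual implementation.

```python
import json
import re

# Hedged sketch: a JSON interface description references database
# records by wrapping unique identifiers in double brackets; rendering
# substitutes each reference with the mapped record's value.

config = json.loads("""
{
  "title": "Radiation monitor",
  "body": "Latest reading: [[rec_1021]] entered by [[rec_88]]"
}
""")

RECORDS = {"rec_1021": "12.3 uSv/h", "rec_88": "B. Lange"}

def resolve(text, records=RECORDS):
    """Replace each [[identifier]] with the referenced record's value."""
    return re.sub(r"\[\[(\w+)\]\]", lambda m: records[m.group(1)], text)

rendered = resolve(config["body"])
# rendered == "Latest reading: 12.3 uSv/h entered by B. Lange"
```

    Keeping the interface description declarative is what lets clearance attributes and validation rules be attached per element in the same configuration file.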

  17. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  18. PERFORMANCE VERIFICATION OF ANIMAL WATER TREATMENT TECHNOLOGIES THROUGH EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of in...

  19. National Centers for Environmental Prediction

    Science.gov Websites

    Products Operational Forecast Graphics Experimental Forecast Graphics Verification and Diagnostics Model PARALLEL/EXPERIMENTAL MODEL FORECAST GRAPHICS OPERATIONAL VERIFICATION / DIAGNOSTICS PARALLEL VERIFICATION Developmental Air Quality Forecasts and Verification Back to Table of Contents 2. PARALLEL/EXPERIMENTAL GRAPHICS

  1. A survey of context recognition in surgery.

    PubMed

    Pernek, Igor; Ferscha, Alois

    2017-10-01

    With the introduction of operating rooms of the future, context awareness has gained importance in the surgical environment. This paper organizes and reviews different approaches for recognition of context in surgery. Major electronic research databases were queried to obtain relevant publications submitted between the years 2010 and 2015. Three different types of context were identified: (i) the surgical workflow context, (ii) the surgeon's cognitive state context, and (iii) the technical state context. A total of 52 relevant studies were identified and grouped based on the type of context detected and the sensors used. Different approaches were summarized to provide recommendations for future research. There is still room for improvement in terms of the methods used and the evaluations performed. Machine learning should be used more extensively to uncover hidden relationships between different properties of the surgeon's state, particularly when performing cognitive context recognition. Furthermore, validation protocols should be improved by performing more evaluations in situ and with a higher number of unique participants. The paper also provides a structured outline of recent context recognition methods to facilitate development of new-generation context-aware surgical support systems.

  2. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  3. The project management office: transforming healthcare in the context of a hospital redevelopment project.

    PubMed

    Richer, Marie-Claire; Marchionni, Caroline; Lavoie-Tremblay, Melanie; Aubry, Monique

    2013-01-01

    It has been shown that classifying projects into a typology allows improved allocation of resources and promotes project success. However, a typology of healthcare projects has yet to be developed. The projects encountered by the Transition Support Office at the McGill University Health Centre in Montreal, Quebec, where a major redevelopment project is under way, were classified into a typology unique to the healthcare context. Examples of the 3 project types, Process, People, and Practice, are provided to clarify the specific support strategies and context-adapted interventions that were instrumental to their success.

  4. The role of trait mindfulness in the pain experience of adolescents.

    PubMed

    Petter, Mark; Chambers, Christine T; McGrath, Patrick J; Dick, Bruce D

    2013-12-01

    Trait mindfulness appears to mitigate pain among adult clinical populations and has a unique relationship with pain catastrophizing. However, little is understood about this phenomenon among adolescents. The association between trait mindfulness and pain in both real-world and experimental contexts was examined in a community sample of adolescents. Participants were 198 adolescents who completed measures of trait mindfulness, pain catastrophizing, and pain interference, as well as an interview on day-to-day pain before undergoing an acute experimental pain task. Following the task, they provided ratings of pain intensity and state catastrophizing. Results showed that with regard to day-to-day pains, mindfulness was a significant and unique predictor of pain interference, and this relationship was partially mediated by pain catastrophizing. Mindfulness also had an indirect relationship with experimental pain intensity and tolerance. These associations were mediated by catastrophizing during the pain task. These findings highlight the association between trait mindfulness and both real-world and experimental pain and offer insight into how mindfulness may affect pain among youth. Findings are discussed in the context of current psychological models of pediatric pain and future avenues for research. This article highlights the association between trait mindfulness and pain variables among adolescents in both real-world and experimental pain settings. These findings offer further evidence of the unique relationship between trait mindfulness and pain catastrophizing in affecting pain variables across pain contexts and populations. Copyright © 2013 American Pain Society. Published by Elsevier Inc. All rights reserved.

  5. 42 CFR 457.380 - Eligibility verification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Eligibility verification. 457.380 Section 457.380... Requirements: Eligibility, Screening, Applications, and Enrollment § 457.380 Eligibility verification. (a) The... State may establish reasonable eligibility verification mechanisms to promote enrollment of eligible...

  6. PERFORMANCE VERIFICATION OF STORMWATER TREATMENT DEVICES UNDER EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program's goal is to further environmental protection by a...

  7. Comparison of statistical models for writer verification

    NASA Astrophysics Data System (ADS)

    Srihari, Sargur; Ball, Gregory R.

    2009-01-01

    A novel statistical model is proposed for determining whether a pair of documents, a known and a questioned one, were written by the same individual. The goal of this formulation is to learn the specific uniqueness of style in a particular author's writing, given the known document. Since there are often insufficient samples to extrapolate a generalized model of a writer's handwriting based solely on that document, we instead generalize over the differences between the author and a large population of known different writers. This contrasts with an earlier proposed model in which probability distributions were set a priori, without learning. We show the performance of the model, along with a comparison to the older, non-learning model, over which it shows significant improvement.

  8. A resonance based model of biological evolution

    NASA Astrophysics Data System (ADS)

    Damasco, Achille; Giuliani, Alessandro

    2017-04-01

    We propose a coarse-grained physical model of evolution. The proposed model is, at least in principle, amenable to experimental verification, even if this looks like a conundrum: evolution is a unique historical process, and the tape cannot be rewound and played again. Nevertheless, we can imagine a phenomenological scenario tailored upon state transitions in physical chemistry, in which different agents of evolution play the role of the elements of a state transition, like thermal noise or resonance effects. The abstract model we propose can be of help for sketching hypotheses and for making sense of some well-known features of natural history, like the so-called Cambrian explosion. The possibility of an experimental proof of the model is discussed as well.

  9. Space Shuttle Orbiter - Leading edge structural design/analysis and material allowables

    NASA Technical Reports Server (NTRS)

    Johnson, D. W.; Curry, D. M.; Kelly, R. E.

    1986-01-01

    Reinforced Carbon-Carbon (RCC), a structural composite whose development was targeted for the high temperature reentry environments of reusable space vehicles, has successfully demonstrated that capability on the Space Shuttle Orbiter. Unique mechanical properties, particularly at elevated temperatures up to 3000 F, make this material ideally suited for the 'hot' regions of multimission space vehicles. Design allowable characterization testing, full-scale development and qualification testing, and structural analysis techniques will be presented herein that briefly chart the history of the RCC material from infancy to eventual multimission certification for the Orbiter. Included are discussions pertaining to the development of the design allowable data base, manipulation of the test data into usable forms, and the analytical verification process.

  10. Rotating Rake Turbofan Duct Mode Measurement System

    NASA Technical Reports Server (NTRS)

    Sutliff, Daniel L.

    2005-01-01

    An experimental measurement system was developed and implemented by the NASA Glenn Research Center in the 1990s to measure turbofan duct acoustic modes. The system is a continuously rotating radial microphone rake that is inserted into the duct. This Rotating Rake provides a complete map of the acoustic duct modes present in a ducted fan and has been used on a variety of test articles: from a low-speed, concept test rig, to a full-scale production turbofan engine. The Rotating Rake has been critical in developing and evaluating a number of noise reduction concepts as well as providing experimental databases for verification of several aero-acoustic codes. A more detailed derivation of the unique Rotating Rake equations is presented in the appendix.

  11. Decomposed Photo Response Non-Uniformity for Digital Forensic Analysis

    NASA Astrophysics Data System (ADS)

    Li, Yue; Li, Chang-Tsun

    The last few years have seen the application of Photo Response Non-Uniformity noise (PRNU), a unique stochastic fingerprint of image sensors, to various types of digital forensic investigations, such as source device identification and integrity verification. In this work we propose a new way of extracting the PRNU noise pattern, called Decomposed PRNU (DPRNU), by exploiting the difference between the physical and artificial color components of photos taken by digital cameras that use a Color Filter Array for interpolating artificial components from physical ones. Experimental results presented in this work show the superiority of the proposed DPRNU over the commonly used version. We also propose a new performance metric, Corrected Positive Rate (CPR), to evaluate the performance of the common PRNU and the proposed DPRNU.
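    The baseline PRNU pipeline this work builds on, extracting a noise residual, averaging residuals into a camera fingerprint, and matching by normalized correlation, can be sketched as follows. This is the common version, not the decomposed variant proposed above, and the box blur is a simple stand-in for the wavelet denoisers typically used in the PRNU literature:

    ```python
    import numpy as np

    def noise_residual(img, k=3):
        """Noise residual: image minus a box-blurred (denoised) version.
        Simple stand-in for the wavelet denoisers used in PRNU work."""
        img = np.asarray(img, dtype=float)
        pad = k // 2
        padded = np.pad(img, pad, mode="edge")
        blurred = np.zeros_like(img)
        for dy in range(k):          # sum the k*k shifted windows
            for dx in range(k):
                blurred += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
        return img - blurred / (k * k)

    def estimate_fingerprint(images):
        """Average the residuals of several images from the same camera."""
        return np.mean([noise_residual(im) for im in images], axis=0)

    def correlate(a, b):
        """Normalized correlation for matching a residual to a fingerprint."""
        a = a - a.mean()
        b = b - b.mean()
        return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
    ```

    Source identification then amounts to thresholding `correlate(noise_residual(query), fingerprint)`: a residual from the same sensor correlates with the fingerprint far more strongly than one from a different sensor.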

  12. An international approach to Mission to Planet Earth

    NASA Technical Reports Server (NTRS)

    Lawrence, Robert M.; Sadeh, Willy Z.; Tsygichko, Viktor N.

    1992-01-01

    The new international political constellation resulting from the disintegration of the Soviet Union opens up unique opportunities for cooperation in the space arena. Precedents since 1955 indicate a pervasive interest in mutual cooperation to use military reconnaissance and surveillance satellites for space observations to enforce treaty verification and compliance. One avenue that offers immediate prospects for fruitful cooperation is the incorporation of the military reconnaissance and surveillance satellite capabilities of both the U.S. and Russia into the Mission to Planet Earth. Formation of a United Nations Satellite (UNSAT) fleet drawn from American and Russian space assets is proposed. The role of UNSAT is to provide worldwide monitoring of both military and environmental activities under the umbrella of the Mission to Planet Earth.

  13. Characterization and modeling of an advanced flexible thermal protection material for space applications

    NASA Technical Reports Server (NTRS)

    Clayton, Joseph P.; Tinker, Michael L.

    1991-01-01

    This paper describes experimental and analytical characterization of a new flexible thermal protection material known as Tailorable Advanced Blanket Insulation (TABI). This material utilizes a three-dimensional ceramic fabric core structure and an insulation filler. TABI is the leading candidate for use in deployable aeroassisted vehicle designs. Such designs require extensive structural modeling, and the most significant in-plane material properties necessary for model development are measured and analytically verified in this study. Unique test methods are developed for damping measurements. Mathematical models are developed for verification of the experimental modulus and damping data, and finally, transverse properties are described in terms of the in-plane properties through use of a 12-dof finite difference model of a simple TABI configuration.

  14. The Earth Observing System AM Spacecraft - Thermal Control Subsystem

    NASA Technical Reports Server (NTRS)

    Chalmers, D.; Fredley, J.; Scott, C.

    1993-01-01

    Mission requirements for the EOS-AM Spacecraft intended to monitor global changes of the entire earth system are considered. The spacecraft is based on an instrument set containing the Advanced Spaceborne Thermal Emission and Reflection radiometer (ASTER), Clouds and Earth's Radiant Energy System (CERES), Multiangle Imaging Spectro-Radiometer (MISR), Moderate-Resolution Imaging Spectrometer (MODIS), and Measurements of Pollution in the Troposphere (MOPITT). Emphasis is placed on the design, analysis, development, and verification plans for the unique EOS-AM Thermal Control Subsystem (TCS) aimed at providing the required environments for all the onboard equipment in a densely packed layout. The TCS design maximizes the use of proven thermal design techniques and materials, in conjunction with a capillary pumped two-phase heat transport system for instrument thermal control.

  15. A bimodal biometric identification system

    NASA Astrophysics Data System (ADS)

    Laghari, Mohammad S.; Khuwaja, Gulzar A.

    2013-03-01

    Biometrics consists of methods for uniquely recognizing humans based upon one or more intrinsic physical or behavioral traits. Physical traits are related to the shape of the body; behavioral traits are related to the behavior of a person. However, biometric authentication systems suffer from imprecision and difficulty in person recognition due to a number of reasons, and no single biometric is expected to effectively satisfy the requirements of all verification and/or identification applications. Bimodal biometric systems are expected to be more reliable due to the presence of two pieces of evidence, and also to be able to meet the severe performance requirements imposed by various applications. This paper presents a neural-network-based bimodal biometric identification system using human face and handwritten signature features.

  16. Testing Orion's Fairing Separation System

    NASA Technical Reports Server (NTRS)

    Martinez, Henry; Cloutier, Chris; Lemmon, Heber; Rakes, Daniel; Oldham, Joe; Schlagel, Keith

    2014-01-01

    Traditional fairing systems are designed to fully encapsulate and protect their payload from the harsh ascent environment including acoustic vibrations, aerodynamic forces and heating. The Orion fairing separation system performs this function and more by also sharing approximately half of the vehicle structural load during ascent. This load-share condition through launch and during jettison allows for a substantial increase in mass to orbit. A series of component-level development tests were completed to evaluate and characterize each component within Orion's unique fairing separation system. Two full-scale separation tests were performed to verify system-level functionality and provide verification data. This paper summarizes the fairing spring, Pyramidal Separation Mechanism and forward seal system component-level development tests, system-level separation tests, and lessons learned.

  17. Environmental Technology Verification Report -- Baghouse filtration products, GE Energy QG061 filtration media ( tested May 2007)

    EPA Science Inventory

    EPA has created the Environmental Technology Verification Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The Air Pollution Control Technology Verification Center, a cente...

  18. 40 CFR 1066.240 - Torque transducer verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification. Verify torque-measurement systems by performing the verifications described in §§ 1066.270 and... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Torque transducer verification. 1066...

  19. 48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...

  20. 48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...

  1. 48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...

  2. 48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...

  3. 48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...

  4. Joint ETV/NOWATECH verification protocol for the Sorbisense GSW40 passive sampler

    EPA Science Inventory

    Environmental technology verification (ETV) is an independent (third party) assessment of the performance of a technology or a product for a specified application, under defined conditions and quality assurance. This verification is a joint verification with the US EPA ETV schem...

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PERFORMANCE VERIFICATION OF THE W.L. GORE & ASSOCIATES GORE-SORBER SCREENING SURVEY

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  6. Multi-canister overpack project -- verification and validation, MCNP 4A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldmann, L.H.

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software requires that it be compiled specifically for the machine it is to be used on. Therefore, to facilitate the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison on the new output files and the old output files. Any difference between any of the files will cause a verification error. Due to the manner in which the verification is completed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error.
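    The installation check described above, regenerate the sample-problem outputs and diff them against stored reference outputs, can be sketched in a few lines. The directory layout, file pattern, and function name are assumptions for illustration, not the actual MCNP tooling:

    ```python
    import difflib
    from pathlib import Path

    def verify_outputs(new_dir, ref_dir, pattern="*.out"):
        """Compare freshly generated sample-problem outputs against reference
        outputs. Returns a dict of per-file differences; as noted above, a
        mismatch flags the file for closer inspection rather than proving a
        real problem (e.g. timestamps or round-off may differ)."""
        failures = {}
        for ref in sorted(Path(ref_dir).glob(pattern)):
            new = Path(new_dir) / ref.name
            if not new.exists():
                failures[ref.name] = ["missing output file"]
                continue
            diff = list(difflib.unified_diff(
                ref.read_text().splitlines(),
                new.read_text().splitlines(),
                lineterm=""))
            if diff:
                failures[ref.name] = diff
        return failures  # empty dict -> verification passed
    ```

    Keeping the raw diff per file (rather than a pass/fail bit) supports exactly the workflow the record describes: when a "verification error" appears, the analyst inspects the differences to decide whether they are benign.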

  7. Mineralogy of Huygens Basin, Mars: A Transect of Noachian Highlands Crust

    NASA Astrophysics Data System (ADS)

    Seelos, K. D.; Ackiss, S. E.; Seelos, F. P.; McBeck, J. A.; Buczkowski, D. L.; Hash, C. D.; Viviano, C. E.; Murchie, S. L.

    2018-06-01

    Huygens crater represents a unique probe of the Noachian crust in the Hellas rim region. We have identified four mineralogic units within a morphologic context to understand the ancient Martian highlands.

  8. Second order gyrokinetic theory for particle-in-cell codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tronko, Natalia; Bottino, Alberto; Sonnendrücker, Eric

    2016-08-15

    The main idea of the gyrokinetic dynamical reduction consists in the systematic removal of the fast-scale motion (the gyromotion) from the dynamics of the plasma, resulting in a considerable simplification and a significant gain of computational time. The gyrokinetic Maxwell–Vlasov equations are nowadays implemented in numerical codes for modeling (both laboratory and astrophysical) strongly magnetized plasmas. Different versions of the reduced set of equations exist, depending on the construction of the gyrokinetic reduction procedure and the approximations performed in the derivation. The purpose of this article is to explicitly show the connection between the general second-order gyrokinetic Maxwell–Vlasov system issued from the modern gyrokinetic theory and the model currently implemented in the global electromagnetic Particle-in-Cell code ORB5. Necessary information about the modern gyrokinetic formalism is given, together with a consistent derivation of the gyrokinetic Maxwell–Vlasov equations from first principles. The variational formulation of the dynamics is used to obtain the corresponding energy conservation law, which in turn is used for the verification of the energy conservation diagnostics currently implemented in ORB5. This work fits within the context of the code verification project VeriGyro, currently run at the IPP Max-Planck-Institut in collaboration with other European institutions.

  9. Transparency in practice: Evidence from 'verification analyses' issued by the Polish Agency for Health Technology Assessment in 2012-2015.

    PubMed

    Ozierański, Piotr; Löblová, Olga; Nicholls, Natalia; Csanádi, Marcell; Kaló, Zoltán; McKee, Martin; King, Lawrence

    2018-01-08

    Transparency is recognised to be a key underpinning of the work of health technology assessment (HTA) agencies, yet it has only recently become a subject of systematic inquiry. We contribute to this research field by considering the Polish Agency for Health Technology Assessment (AHTAPol). We situate the AHTAPol in a broader context by comparing it with the National Institute for Health and Care Excellence (NICE) in England. To this end, we analyse all 332 assessment reports, called verification analyses, that the AHTAPol issued from 2012 to 2015, and a stratified sample of 22 Evidence Review Group reports published by NICE in the same period. Overall, by increasingly presenting its key conclusions in assessment reports, the AHTAPol has reached the transparency standards set out by NICE in transparency of HTA outputs. The AHTAPol is more transparent than NICE in certain aspects of the HTA process, such as providing rationales for redacting assessment reports and providing summaries of expert opinions. Nevertheless, it is less transparent in other areas of the HTA process, such as including information on expert conflicts of interest. Our findings have important implications for understanding HTA in Poland and more broadly. We use them to formulate recommendations for policymakers.

  10. Formal methods and digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1993-01-01

    This report has been prepared to supplement a forthcoming chapter on formal methods in the FAA Digital Systems Validation Handbook. Its purpose is as follows: to outline the technical basis for formal methods in computer science; to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used on board aircraft; and to suggest factors for consideration when formal methods are offered in support of certification. These latter factors assume the context for software development and assurance described in RTCA document DO-178B, 'Software Considerations in Airborne Systems and Equipment Certification,' Dec. 1992.

  11. NASA/DOD Aerospace Knowledge Diffusion Research Project. Paper 24: A general approach to measuring the value of aerospace information products and services

    NASA Technical Reports Server (NTRS)

    Brinberg, Herbert R.; Pinelli, Thomas E.

    1993-01-01

    This paper discusses the various approaches to measuring the value of information, first defining the meanings of information, economics of information, and value. It concludes that no general model of measuring the value of information is possible and that the usual approaches, such as cost/benefit equations, have very limited applications. It also concludes that in specific contexts with given goals for newly developed products and services or newly acquired information there is a basis for its objective valuation. The axioms and inputs for such a model are described and directions for further verification and analysis are proposed.

  12. NARMER-1: a photon point-kernel code with build-up factors

    NASA Astrophysics Data System (ADS)

    Visonneau, Thierry; Pangault, Laurence; Malouch, Fadhel; Malvagi, Fausto; Dolci, Florence

    2017-09-01

    This paper presents an overview of NARMER-1, the new generation of photon point-kernel code developed by the Reactor Studies and Applied Mathematics Unit (SERMA) at the CEA Saclay Center. After a short introduction giving some historical background and the current context of development of the code, the paper presents the principles implemented in the calculation and the physical quantities computed, and surveys the generic features: programming language, computer platforms, geometry package, sources description, etc. Moreover, specific and recent features are also detailed: exclusion sphere, tetrahedral meshes, parallel operations. Then some points about verification and validation are presented. Finally, we present some tools that can help the user with operations like visualization and pre-treatment.
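    The point-kernel method with build-up factors named in the title has a standard textbook form: the uncollided flux from an isotropic point source falls off geometrically and attenuates exponentially through the shield, and a multiplicative build-up factor B(μr) accounts for scattered photons. A minimal sketch of that general formula follows; it illustrates the method only and is not NARMER-1's implementation (the linear build-up used in the example is purely illustrative):

    ```python
    import math

    def point_kernel_flux(source, r_cm, mu, buildup=lambda mfp: 1.0):
        """Photon flux at distance r (cm) from an isotropic point source of
        strength `source`, behind a shield with attenuation coefficient mu
        (1/cm):  phi = S / (4*pi*r^2) * exp(-mu*r) * B(mu*r)."""
        mfp = mu * r_cm                                # thickness in mean free paths
        geometric = source / (4.0 * math.pi * r_cm ** 2)
        return geometric * math.exp(-mfp) * buildup(mfp)

    # Example with an illustrative linear build-up approximation B(x) = 1 + x.
    flux = point_kernel_flux(1.0e6, 100.0, 0.05, buildup=lambda x: 1.0 + x)
    ```

    With mu = 0 and B = 1 the expression reduces to the bare inverse-square law, which makes the geometric and attenuation terms easy to check independently.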

  13. The value of crime scene and site visitation by forensic psychologists and psychiatrists.

    PubMed

    Mohandie, Kris; Meloy, J Reid

    2013-05-01

    Site visits and crime scene visitation by forensic psychologists and psychiatrists may enhance the accuracy and credibility of their forensic work in criminal, civil, and other important contexts. This ethically sound technique of after-the-fact data collection and verification offers numerous potential benefits to the forensic mental health professional: clarifying the subject's actions, assessing the reliability of witness reports, identifying contextual determinants of behavior, and more fully illuminating subject motivation and decision-making. Limitations and suggested guidelines for conducting site visits are offered. Guidelines include preplanning, arranging for an informed guide to accompany and narrate the visit, and conducting the site visit prior to forensic examinations. © 2013 American Academy of Forensic Sciences.

  14. The Role of the DOE Weapons Laboratories in a Changing National Security Environment: CNSS Papers No. 8, April 1988

    DOE R&D Accomplishments Database

    Hecker, S. S.

    1988-04-01

    The contributions of the Department of Energy (DOE) nuclear weapons laboratories to the nation's security are reviewed in testimony before the Subcommittee on Procurement and Military Nuclear Systems of the House Armed Services Committee. Also presented are contributions that technology will make in maintaining the strategic balance through deterrence, treaty verification, and a sound nuclear weapons complex as the nation prepares for significant arms control initiatives. The DOE nuclear weapons laboratories can contribute to the broader context of national security, one that recognizes that military strength can be maintained over the long term only if it is built upon the foundations of economic strength and energy security.

  15. Space Station automated systems testing/verification and the Galileo Orbiter fault protection design/verification

    NASA Technical Reports Server (NTRS)

    Landano, M. R.; Easter, R. W.

    1984-01-01

    Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements lead to a number of issues and uncertainties which require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.

  16. Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk

    NASA Technical Reports Server (NTRS)

    Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

The projected impact of compositional verification research conducted by the National Aeronautics and Space Administration System-Wide Safety and Assurance Technologies on aviation safety risk was assessed. Software and compositional verification are described. Traditional verification techniques have two major problems: testing occurs at the prototype stage, where error discovery can be quite costly, and not all potential interactions can be tested, leaving some errors undetected until the system reaches the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution for addressing increasingly large and complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, 1 research action, 5 operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.

  17. 78 FR 27882 - VA Veteran-Owned Small Business (VOSB) Verification Guidelines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-13

    ... Verification Self-Assessment Tool that walks the veteran through the regulation and how it applies to the...) Verification Guidelines AGENCY: Department of Veterans Affairs. ACTION: Advanced notice of proposed rulemaking... regulations governing the Department of Veterans Affairs (VA) Veteran-Owned Small Business (VOSB) Verification...

  18. 78 FR 32010 - Pipeline Safety: Public Workshop on Integrity Verification Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-28

    .... PHMSA-2013-0119] Pipeline Safety: Public Workshop on Integrity Verification Process AGENCY: Pipeline and... announcing a public workshop to be held on the concept of ``Integrity Verification Process.'' The Integrity Verification Process shares similar characteristics with fitness for service processes. At this workshop, the...

  19. Interim Letter Report - Verification Survey of Partial Grid E9, David Witherspoon, Inc. 1630 Site Knoxville, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P.C. Weaver

    2008-06-12

Verification surveys of available grids were conducted at the DWI 1630 site in Knoxville, Tennessee. A representative of the Independent Environmental Assessment and Verification (IEAV) team from ORISE conducted a verification survey of a partial area within Grid E9.

  20. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by the...

  1. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by the...

  2. 22 CFR 123.14 - Import certificate/delivery verification procedure.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... REGULATIONS LICENSES FOR THE EXPORT OF DEFENSE ARTICLES § 123.14 Import certificate/delivery verification procedure. (a) The Import Certificate/Delivery Verification Procedure is designed to assure that a commodity... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Import certificate/delivery verification...

  3. 22 CFR 123.14 - Import certificate/delivery verification procedure.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... REGULATIONS LICENSES FOR THE EXPORT OF DEFENSE ARTICLES § 123.14 Import certificate/delivery verification procedure. (a) The Import Certificate/Delivery Verification Procedure is designed to assure that a commodity... 22 Foreign Relations 1 2011-04-01 2011-04-01 false Import certificate/delivery verification...

  4. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 1 2013-10-01 2013-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  5. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  6. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  7. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 1 2012-10-01 2012-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  8. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  9. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    NASA Technical Reports Server (NTRS)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  10. Transnational Intersectionality in Family Therapy With Resettled Refugees.

    PubMed

    Gangamma, Rashmi; Shipman, Daran

    2018-04-01

In this article, we discuss incorporating the transnational intersectionality framework in family therapy with resettled refugees. Transnational intersectionality is an extension of the framework of intersectionality which helps to better understand complexities of power and oppression across national contexts and their influence on refugees' lives. Adopting this framework alerts family therapists to: (a) develop critical awareness of refugees' transnational contexts; (b) understand differences in experiences of social identities across contexts; (c) acknowledge postmigration factors of oppression affecting resettlement; and (d) critically reflect upon therapist-interpreter-client intersectionalities. This shifts our conceptualization of therapy with refugees to actively consider the transnational contexts that refugees uniquely occupy. We describe the framework and provide two case illustrations to highlight its usefulness. © 2017 American Association for Marriage and Family Therapy.

  11. An Efficient Location Verification Scheme for Static Wireless Sensor Networks.

    PubMed

    Kim, In-Hwan; Kim, Bo-Sung; Song, JooSeok

    2017-01-24

In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimations in some situations. Location verification can be a solution in these situations, or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process and requirements such as special hardware, a dedicated verifier, and a trusted third party, which cause more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between location claimant and verifier for the location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect malicious sensors at a rate of over 90% when sensors in the network have five or more neighbors.
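The abstract leaves the protocol details to the paper itself, but the core idea of checking a location claim against a region shared by claimant and verifier can be illustrated with a toy disc model. The geometry, radio range, and checks below are illustrative assumptions, not the actual MSRLV protocol:

```python
import math

# Toy sketch of a region-based plausibility check for a claimed location.
# A point lies in the "mutually-shared region" of two sensors if it is
# within radio range of both. All names and parameters are hypothetical.
RANGE = 10.0  # assumed common radio range

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def in_shared_region(point, verifier, claimed):
    """True if `point` can hear both the verifier and the claimed location."""
    return dist(point, verifier) <= RANGE and dist(point, claimed) <= RANGE

verifier = (0.0, 0.0)
claimed = (8.0, 0.0)   # the claimant says it is here
witness = (4.0, 3.0)   # a common neighbor of both

# The claim is plausible only if claimant and verifier are mutually in
# range, and a witness inside the shared region can corroborate it.
plausible = dist(verifier, claimed) <= RANGE and in_shared_region(witness, verifier, claimed)
print(plausible)  # → True
```

In the real scheme the shared region lets neighbors participate implicitly, which is what drives the communication and computation savings reported above.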

  12. An Efficient Location Verification Scheme for Static Wireless Sensor Networks

    PubMed Central

    Kim, In-hwan; Kim, Bo-sung; Song, JooSeok

    2017-01-01

In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimations in some situations. Location verification can be a solution in these situations, or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process and requirements such as special hardware, a dedicated verifier, and a trusted third party, which cause more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between location claimant and verifier for the location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect malicious sensors at a rate of over 90% when sensors in the network have five or more neighbors. PMID:28125007

  13. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

Functional fault models (FFMs) are a directed-graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone, owing to the intensive and customized process necessary to verify each individual component model, and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
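The paper does not enumerate the tools' specific rules, but the flavor of automated structural checks on a directed fault-propagation graph can be sketched. The graph, node names, and the two checks below are illustrative assumptions, not the NASA tools' actual rule set:

```python
# Sketch: verify a small fault-propagation graph for (1) edges that
# reference undeclared nodes and (2) nodes unreachable from any failure
# mode. All model content here is hypothetical.
from collections import deque

nodes = {"pump_fail", "low_flow", "overheat", "shutdown"}
edges = [("pump_fail", "low_flow"), ("low_flow", "overheat"),
         ("overheat", "shutdown")]
failure_modes = {"pump_fail"}  # root causes where propagation starts

def verify_ffm(nodes, edges, failure_modes):
    errors = []
    # Check 1: every edge must reference declared nodes.
    for src, dst in edges:
        if src not in nodes or dst not in nodes:
            errors.append(f"dangling edge: {src} -> {dst}")
    # Check 2: every node must be reachable from some failure mode
    # (breadth-first search over the propagation edges).
    adj = {}
    for src, dst in edges:
        adj.setdefault(src, []).append(dst)
    seen, queue = set(failure_modes), deque(failure_modes)
    while queue:
        for nxt in adj.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    for orphan in sorted(nodes - seen):
        errors.append(f"unreachable node: {orphan}")
    return errors

print(verify_ffm(nodes, edges, failure_modes))  # → []
```

Checks of this mechanical kind are exactly what is error prone and burdensome to do by hand across many component models, which is the motivation the abstract gives for automation.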

  14. Towards the formal verification of the requirements and design of a processor interface unit

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.

    1993-01-01

The formal verification of the design and partial requirements for a Processor Interface Unit (PIU) using the Higher Order Logic (HOL) theorem-proving system is described. The processor interface unit is a single-chip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. It provides the opportunity to investigate the specification and verification of a real-world subsystem within a commercially developed fault-tolerant computer. An overview of the PIU verification effort is given. The actual HOL listings from the verification effort are documented in a companion NASA contractor report entitled 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit - HOL Listings', which includes the general-purpose HOL theories and definitions that support the PIU verification as well as the tactics used in the proofs.

  15. IoT Contextual Factors on Healthcare.

    PubMed

    Michalakis, Konstantinos; Caridakis, George

    2017-01-01

With the emergence of the Internet of Things (IoT), new services in healthcare will become available and existing systems will be integrated into the IoT framework, providing automated medical supervision and efficient medical treatment. Context awareness plays a critical role in realizing the vision of the IoT, providing rich contextual information that can help the system act more efficiently. Since context in healthcare has unique characteristics, it is necessary to define an appropriate context-aware framework for healthcare IoT applications. We identify context as it is perceived in healthcare applications and describe the associated context-aware procedures. We also present an architecture that connects the sensors that measure biometric data with the sensory networks of the environment and the various IoT middleware that reside in the geographical area. Finally, we discuss the challenges for the realization of this vision.

  16. Hybrid context aware recommender systems

    NASA Astrophysics Data System (ADS)

    Jain, Rajshree; Tyagi, Jaya; Singh, Sandeep Kumar; Alam, Taj

    2017-10-01

Recommender systems and context awareness are currently vital fields of research. Most hybrid recommendation systems implement content-based and collaborative filtering techniques, whereas this work combines context and collaborative filtering. The paper presents a hybrid context-aware recommender system for books and movies that gives recommendations based on the user context as well as user or item similarity. It also addresses the issue of dimensionality reduction using weighted pre-filtering based on a dynamically entered user context and context preference. This unique step helps to reduce the size of the dataset for collaborative filtering. Bias-subtracted collaborative filtering is used so that a user's ratings are considered relative to that user's own baseline rather than as absolute values. Cosine similarity is used as the metric to determine the similarity between users or items. The unknown ratings are predicted and evaluated using the mean squared error (MSE) on train and test datasets. The overall process personalizes recommendations and gives more accurate results with reduced complexity in collaborative filtering.
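The bias-subtracted, cosine-similarity collaborative filtering step described in this abstract can be sketched in a few lines. The ratings matrix and the exact weighting scheme below are illustrative assumptions, not the paper's dataset or formulation:

```python
import numpy as np

# Toy user-item ratings matrix; 0 = unknown. All data hypothetical.
R = np.array([
    [5.0, 3.0, 0.0, 1.0],
    [4.0, 0.0, 0.0, 1.0],
    [1.0, 1.0, 0.0, 5.0],
    [1.0, 0.0, 4.0, 4.0],
])

mask = R > 0
# Bias subtraction: work with each user's deviation from their own mean.
user_means = R.sum(axis=1) / mask.sum(axis=1)
D = np.where(mask, R - user_means[:, None], 0.0)

def cosine(u, v):
    nu, nv = np.linalg.norm(u), np.linalg.norm(v)
    return u @ v / (nu * nv) if nu and nv else 0.0

def predict(user, item):
    """User-based prediction: the user's mean plus a similarity-weighted
    average of other users' deviations on this item."""
    sims = np.array([cosine(D[user], D[other]) if other != user else 0.0
                     for other in range(R.shape[0])])
    relevant = mask[:, item] & (sims != 0)
    if not relevant.any():
        return user_means[user]
    w = sims[relevant]
    return user_means[user] + w @ D[relevant, item] / np.abs(w).sum()

print(round(predict(0, 2), 2))  # → 2.0
```

Working on deviations rather than raw ratings is what makes a habitual 5-star rater and a habitual 2-star rater comparable, which is the point of the bias subtraction.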

  17. 7 CFR 272.8 - State income and eligibility verification system.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false State income and eligibility verification system. 272... PARTICIPATING STATE AGENCIES § 272.8 State income and eligibility verification system. (a) General. (1) State agencies may maintain and use an income and eligibility verification system (IEVS), as specified in this...

  18. 24 CFR 985.3 - Indicators, HUD verification methods and ratings.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Indicators, HUD verification..., HUD verification methods and ratings. This section states the performance indicators that are used to assess PHA Section 8 management. HUD will use the verification method identified for each indicator in...

  19. 78 FR 56268 - Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-12

    .... PHMSA-2013-0119] Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension... public workshop on ``Integrity Verification Process'' which took place on August 7, 2013. The notice also sought comments on the proposed ``Integrity Verification Process.'' In response to the comments received...

  20. 10 CFR 9.54 - Verification of identity of individuals making requests.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...

  1. 76 FR 50164 - Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-12

    ...-AQ06 Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing... correct certain portions of the Protocol Gas Verification Program and Minimum Competency Requirements for... final rule that amends the Agency's Protocol Gas Verification Program (PGVP) and the minimum competency...

  2. 30 CFR 227.601 - What are a State's responsibilities if it performs automated verification?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... performs automated verification? 227.601 Section 227.601 Mineral Resources MINERALS MANAGEMENT SERVICE... Perform Delegated Functions § 227.601 What are a State's responsibilities if it performs automated verification? To perform automated verification of production reports or royalty reports, you must: (a) Verify...

  3. 10 CFR 9.54 - Verification of identity of individuals making requests.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...

  4. 10 CFR 9.54 - Verification of identity of individuals making requests.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...

  5. 10 CFR 9.54 - Verification of identity of individuals making requests.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...

  6. Verification test report on a solar heating and hot water system

    NASA Technical Reports Server (NTRS)

    1978-01-01

Information is provided on the development, qualification, and acceptance verification of commercial solar heating and hot water systems and components. The verification covers performance, efficiency, and the various methods used, such as similarity, analysis, inspection, and test, that are applicable to satisfying the verification requirements.

  7. 46 CFR 61.40-3 - Design verification testing.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Design verification testing. 61.40-3 Section 61.40-3... INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design verification testing. (a) Tests must verify that automated vital systems are designed, constructed, and operate in...

  8. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…
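The verification task the studies use can be made concrete in a few lines; a minimal sketch, assuming a hypothetical scene of colored dots:

```python
# Verifying a proportional quantifier such as "More than half of the
# dots are blue" amounts to comparing two cardinalities.
def more_than_half(items, predicate):
    satisfying = sum(1 for x in items if predicate(x))
    return satisfying * 2 > len(items)

dots = ["blue", "blue", "yellow", "blue", "yellow"]
print(more_than_half(dots, lambda c: c == "blue"))  # → True
```

Unlike "some" (find one witness) or "all" (find one counterexample), this check requires keeping two counts and comparing them, which is consistent with the higher working-memory demand the paper reports for proportional sentences.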

  9. Active Mirror Predictive and Requirements Verification Software (AMP-ReVS)

    NASA Technical Reports Server (NTRS)

    Basinger, Scott A.

    2012-01-01

This software is designed to predict large active mirror performance at various stages in the fabrication lifecycle of the mirror. It was developed for 1-meter class powered mirrors for astronomical purposes, but is extensible to other geometries. The package accepts finite element model (FEM) inputs and laboratory-measured data for large optical-quality mirrors with active figure control. It computes phenomenological contributions to the surface figure error using several built-in optimization techniques. These phenomena include stresses induced in the mirror by the manufacturing process and the support structure, the test procedure, high spatial frequency errors introduced by the polishing process, and other process-dependent deleterious effects due to light-weighting of the mirror. Then, depending on the maturity of the mirror, it either predicts the best surface figure error that the mirror will attain, or it verifies that the requirements for the error sources have been met once the best surface figure error has been measured. The unique feature of this software is that it ties together physical phenomenology with wavefront sensing and control techniques and various optimization methods, including convex optimization, Kalman filtering, and quadratic programming, both to generate predictive models and to perform requirements verification. This software combines three distinct disciplines: wavefront control, predictive models based on FEM, and requirements verification using measured data, in robust, reusable code that is applicable to any large optics for ground and space telescopes. The software also includes state-of-the-art wavefront control algorithms that allow closed-loop performance to be computed. It allows quantitative trade studies to be performed for optical systems engineering, including computing the best surface figure error under various testing and operating conditions. After the mirror manufacturing process and testing have been completed, the software package can be used to verify that the underlying requirements have been met.

  10. International Low Impact Docking System (iLIDS) Project Technical Requirements Specification, Revision F

    NASA Technical Reports Server (NTRS)

    Lewis, James L.

    2011-01-01

The NASA Docking System (NDS) is NASA's implementation of the emerging International Docking System Standard (IDSS) using low impact docking technology. The NASA Docking System Project (NDSP) is the International Space Station (ISS) Program's project to produce the NDS, Common Docking Adapter (CDA), and Docking Hub. The NDS design evolved from the Low Impact Docking System (LIDS). The acronym international Low Impact Docking System (iLIDS) is also used to describe this system as well as the Government Furnished Equipment (GFE) project designing the NDS for the NDSP. NDS and iLIDS may be used interchangeably. This document will use the acronym iLIDS. Some of the heritage documentation and implementations (e.g., software command names, requirement identification (ID), figures, etc.) used on NDS will continue to use the LIDS acronym. This specification defines the technical requirements for the iLIDS GFE delivered to the NDSP by the iLIDS project. This document contains requirements for two iLIDS configurations, SEZ29101800-301 and SEZ29101800-302. Requirements with the statement "iLIDS shall" are for all configurations. Examples of requirements that are unique to a single configuration may be identified as "iLIDS (-301) shall" or "iLIDS (-302) shall". Furthermore, to allow a requirement to encompass all configurations with an exception, the requirement may be designated as "iLIDS (excluding -302) shall". Verification requirements for the iLIDS project are identified in the Verification Matrix (VM) provided in the iLIDS Verification and Validation Document, JSC-63966. The following definitions differentiate between requirements and other statements: Shall: the only verb used for binding requirements. Should/May: verbs used for stating non-mandatory goals. Will: the verb used for stating facts or declarations of purpose. A Definition of Terms table is provided in Appendix B to define those terms with specific tailored uses in this document.

  11. Cultural safety in New Zealand midwifery practice. Part 2.

    PubMed

    Farry, Annabel; Crowther, Susan

    2014-01-01

    Midwives in New Zealand work within a unique cultural context. This calls for an understanding and appreciation of biculturalism and the equal status of Mãori and Europeans as the nation's founding peoples. This paper is the second of two papers that explore the notions of cultural safety and competence. Exploration and discussion take place in the New Zealand context, yet have transferable implications for midwives everywhere. This second paper focuses on midwifery education and practice.

  12. Adapting the Unique Minds Program: Exploring the Feasibility of a Multiple Family Intervention for Children with Learning Disabilities in the Context of Spain.

    PubMed

    López-Larrosa, Silvia; González-Seijas, Rosa M; Carpenter, John S W

    2017-06-01

    The Unique Minds Program (Stern, Unique Minds Program, 1999) addresses the socio-emotional needs of children with learning disabilities (LD) and their families. Children and their parents work together in a multiple family group to learn more about LD and themselves as people with the capacity to solve problems in a collaborative way, including problems in family school relationships. This article reports the cultural adaptation of the program for use in Spain and findings from a feasibility study involving three multiple family groups and a total of 15 children and 15 mothers, using a pre-post design. This Spanish adaptation of the program is called "Mentes Únicas". Standardized outcome measures indicated an overall statistically significant decrease in children's self-rated maladjustment and relationship difficulties by the end of the program. Improvements were endorsed by most mothers, although they were not always recognized by the children's teachers. The program had a high level of acceptability: Mothers and children felt safe, understood, and helped throughout the sessions. The efficacy of the adapted intervention for the context of Spain remains to be tested in a more rigorous study. © 2016 Family Process Institute.

  13. Iron Deficiency Anemia: Focus on Infectious Diseases in Lesser Developed Countries

    PubMed Central

    Shaw, Julia G.; Friedman, Jennifer F.

    2011-01-01

    Iron deficiency anemia is thought to affect the health of more than one billion people worldwide, with the greatest burden of disease experienced in lesser developed countries, particularly women of reproductive age and children. This greater disease burden is due to both nutritional and infectious etiologies. Individuals in lesser developed countries have diets that are much lower in iron, less access to multivitamins for young children and pregnant women, and increased rates of fertility which increase demands for iron through the life course. Infectious diseases, particularly parasitic diseases, also lead to both extracorporeal iron loss and anemia of inflammation, which decreases bioavailability of iron to host tissues. This paper will address the unique etiologies and consequences of both iron deficiency anemia and the alterations in iron absorption and distribution seen in the context of anemia of inflammation. Implications for diagnosis and treatment in this unique context will also be discussed. PMID:21738863

  14. Parenting Specificity: An Examination of the Relation Between Three Parenting Behaviors and Child Problem Behaviors in the Context of a History of Caregiver Depression

    PubMed Central

    McKee, Laura; Forehand, Rex; Rakow, Aaron; Reeslund, Kristen; Roland, Erin; Hardcastle, Emily; Compas, Bruce

    2009-01-01

    The aim of this study was to advance our understanding of the relations between three specific parenting behaviors (warmth, monitoring, and discipline) and two child outcomes (internalizing and externalizing problems) within the context of parental depression. Using an approach recommended by A. Caron, B. Weiss, V. Harris, and T. Carron (2006), unique and differential specificity were examined. Ninety-seven parents with a history of depression and 136 of their 9- to 15-year-old children served as participants. Children reported parenting behaviors and parents reported child problem behaviors. The findings indicated that warmth/involvement, but not monitoring or discipline, was uniquely related to externalizing problems and differentially related to internalizing and externalizing problems. The findings suggest that parental warmth has implications for interventions conducted with children living in families with a history of parental depression. PMID:18391048

  15. Toxic and Beneficial Potential of Silver Nanoparticles: The Two Sides of the Same Coin.

    PubMed

    Souza, Lilian Rodrigues Rosa; da Silva, Veronica Santana; Franchi, Leonardo Pereira; de Souza, Tiago Alves Jorge

    2018-01-01

    Nanotechnology has enabled great changes in the chemical, biological and physical properties of metals compared to their bulk counterparts. Within this context, silver nanoparticles (AgNPs) play a major role due to their unique properties, being widely used in daily products such as fabrics, washing machines, water filters, food and medicine. However, AgNPs can enter cells via a "Trojan-horse" type mechanism, potentially leading to cellular autophagy, apoptosis or necrosis. On the other hand, this cytotoxicity mechanism can be harnessed to develop drug nanocarriers and anticancer therapies. The increasing use of these NPs entails their release into the environment, upsetting the balance of ecosystems and posing a threat to human health. In this context, the possible deleterious effects of these NPs on biotic and abiotic ecosystem components represent an obstacle that must be overcome in order to guarantee the safe use of their unique properties.

  16. Spring-summer temperatures reconstructed for northern Switzerland and southwestern Germany from winter rye harvest dates, 1454-1970

    NASA Astrophysics Data System (ADS)

    Wetter, O.; Pfister, C.

    2011-11-01

    This paper presents a unique 517-yr long documentary data-based reconstruction of spring-summer (MAMJJ) temperatures for northern Switzerland and south-western Germany from 1454 to 1970. It is composed of 25 partial series of winter grain (Secale cereale) harvest starting dates (WGHD) that are partly based on harvest related bookkeeping of institutions (hospitals, municipalities), partly on (early) phenological observations. The resulting main Basel WGHD series was homogenised with regard to dating style, data type and altitude. The calibration and verification approach was applied using the homogenous HISTALP temperature series from 1774-1824 for calibration (r = 0.78) and from 1920-1970 for verification (r = 0.75). The latter result is obtained despite the weak data base available for 1870-1950. Temperature reconstructions based on WGHD are more influenced by spring temperatures than those based on grape harvest dates (GHD), because rye, in contrast to vines, begins to grow as soon as sunlight brings the plant to above freezing. The earliest and latest harvest dates were checked for consistency with narrative documentary weather reports. Comparisons with other European documentary-based GHD and WGHD temperature reconstructions generally reveal significant correlations decreasing with the distance from Switzerland. The new Basel WGHD series shows better skills in representing highly climate change sensitive variations of Swiss Alpine glaciers than available GHD series.
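    The split-sample calibration/verification procedure described above can be sketched as follows. This is an illustrative toy, not the authors' code: the harvest-date and temperature series are synthetic stand-ins, and the slope, noise level and period lengths are arbitrary assumptions.

```python
import random
import statistics as st

def pearson_r(xs, ys):
    """Pearson correlation coefficient."""
    mx, my = st.mean(xs), st.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def fit_linear(xs, ys):
    """Least-squares slope and intercept for y ~ a*x + b."""
    mx, my = st.mean(xs), st.mean(ys)
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Synthetic stand-in data: an earlier harvest (smaller day-of-year)
# corresponds to a warmer spring-summer season.
random.seed(1)
harvest_doy = [random.uniform(185, 215) for _ in range(100)]
temperature = [25.0 - 0.08 * d + random.gauss(0, 0.3) for d in harvest_doy]

cal_x, cal_y = harvest_doy[:50], temperature[:50]   # "calibration period"
ver_x, ver_y = harvest_doy[50:], temperature[50:]   # "verification period"

a, b = fit_linear(cal_x, cal_y)          # transfer function from calibration
predicted = [a * x + b for x in ver_x]
r_cal = pearson_r(cal_x, cal_y)          # skill in the calibration period
r_ver = pearson_r(predicted, ver_y)      # skill on the withheld period
print(round(abs(r_cal), 2), round(r_ver, 2))
```

    The point of the withheld verification period is that a high correlation there cannot be an artifact of fitting; the paper's r = 0.78 (calibration) and r = 0.75 (verification) are computed in this spirit.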

  17. Spring-summer temperatures reconstructed for northern Switzerland and south-western Germany from winter rye harvest dates, 1454-1970

    NASA Astrophysics Data System (ADS)

    Wetter, O.; Pfister, C.

    2011-08-01

    This paper presents a unique 517-yr long documentary data-based reconstruction of spring-summer (MAMJJ) temperatures for northern Switzerland and south-western Germany from 1454 to 1970. It is composed of 25 partial series of winter grain (Secale cereale) harvest starting dates (WGHD) that are partly based on harvest related bookkeeping of institutions (hospitals, municipalities), partly on (early) phenological observations. The resulting main Basel WGHD series was homogenised with regard to dating style, data type and altitude. The calibration and verification approach was applied using the homogenous HISTALP temperature series from 1774-1824 for calibration (r = 0.78) and from 1920-1970 for verification (r = 0.75). The latter result is obtained despite the weak data base available for 1870-1950. Temperature reconstructions based on WGHD are more influenced by spring temperatures than those based on grape harvest dates (GHD), because rye, in contrast to vines, begins to grow as soon as sunlight brings the plant to above freezing. The earliest and latest harvest dates were checked for consistency with narrative documentary weather reports. Comparisons with other European documentary-based GHD and WGHD temperature reconstructions generally reveal significant correlations decreasing with the distance from Switzerland. The new Basel WGHD series shows better skills in representing highly climate change sensitive variations of Swiss Alpine glaciers than available GHD series.

  18. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, (from left) STS-96 Mission Specialist Julie Payette, Pilot Rick Husband and Mission Specialist Ellen Ochoa learn about the Sequential Shunt Unit (SSU) in front of them from Lynn Ashby (far right), with Johnson Space Center. The STS-96 crew is at KSC for a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. Other crew members at KSC for the IVT are Commander Kent Rominger and Mission Specialists Tamara Jernigan, Dan Barry and Valery Tokarev of Russia. The SSU is part of the cargo on Mission STS-96, which carries the SPACEHAB Logistics Double Module, with equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform spacewalking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 at about 9:32 a.m.

  19. Availability of buprenorphine on the Internet for purchase without a prescription

    PubMed Central

    Bachhuber, Marcus A.; Cunningham, Chinazo O.

    2012-01-01

    Background Use of illicit buprenorphine is increasingly recognized, but it is unknown whether the Internet currently represents an accessible source. Methods A series of Internet searches were conducted. Twenty searches were performed on two different search engines. The first 100 results of each search were classified into categories based on content. All Internet pharmacies were searched for buprenorphine preparations and, if available, sites were examined to determine whether a prescription was required for purchase, the cost of buprenorphine, the geographical origin of the pharmacy, and evidence of validation by an online pharmacy verification service. Results Of the 2,000 links examined, 1,422 were unique; 6% of links were to illicit commercial sites, 2% to legitimate commercial sites, and 2% to illicit portal sites, which contained links to many illicit commercial sites. Twenty pharmacies offering buprenorphine for purchase without a prescription were identified. The monthly cost of a typical starting dose of 2 mg buprenorphine daily ranged between $232 and $1,163 USD. No pharmacies were listed by online pharmacy verification services. Conclusion Twenty online pharmacies advertising buprenorphine formulations for sale without a prescription were identified. Prices varied widely between illicit pharmacies but were uniformly more expensive than legitimate pharmacies. Illicitly obtained buprenorphine formulations appear to be relatively inaccessible and at high cost on the Internet. PMID:23201172

  20. Verification and Validation Studies for the LAVA CFD Solver

    NASA Technical Reports Server (NTRS)

    Moini-Yekta, Shayan; Barad, Michael F; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.

    2013-01-01

    The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of 2D Euler equations, 3D Navier-Stokes equations as well as turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.
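    The core of MMS verification, as used above, is to choose an exact solution, derive the matching source term analytically, and confirm that the discretization error shrinks at the theoretical rate under grid refinement. A minimal sketch for a 1D Poisson problem (not the LAVA solver itself; the scheme and manufactured solution are illustrative choices):

```python
import math

def solve_poisson(n):
    """Second-order finite-difference solve of -u''(x) = f(x) on [0,1] with
    u(0) = u(1) = 0, where f is manufactured from the chosen exact solution
    u(x) = sin(pi x). Returns the max-norm error against the exact solution."""
    h = 1.0 / n
    x = [i * h for i in range(n + 1)]
    f = [math.pi ** 2 * math.sin(math.pi * xi) for xi in x]  # f = -u''
    # Thomas algorithm for the tridiagonal system (2 on the diagonal,
    # -1 on both off-diagonals), unknowns u_1 .. u_{n-1}.
    a, b, c = -1.0, 2.0, -1.0
    cp = [0.0] * (n - 1)
    dp = [0.0] * (n - 1)
    cp[0] = c / b
    dp[0] = h * h * f[1] / b
    for i in range(1, n - 1):
        denom = b - a * cp[i - 1]
        cp[i] = c / denom
        dp[i] = (h * h * f[i + 1] - a * dp[i - 1]) / denom
    u = [0.0] * (n + 1)          # boundary values stay zero
    for i in range(n - 2, -1, -1):
        u[i + 1] = dp[i] - cp[i] * u[i + 2]
    return max(abs(u[i] - math.sin(math.pi * x[i])) for i in range(n + 1))

e_coarse, e_fine = solve_poisson(32), solve_poisson(64)
order = math.log(e_coarse / e_fine, 2)  # observed order, ~2 for this scheme
print(round(order, 1))
```

    Halving the grid spacing should cut the error by about a factor of four for a second-order scheme; a mismatch between the observed and theoretical order signals a coding or discretization error, which is exactly what the paper's MMS tests are designed to catch.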

  1. 37 CFR 201.29 - Access to, and confidentiality of, Statements of Account, Verification Auditor's Reports, and...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... confidentiality of, Statements of Account, Verification Auditor's Reports, and other verification information... GENERAL PROVISIONS § 201.29 Access to, and confidentiality of, Statements of Account, Verification Auditor... Account, including the Primary Auditor's Reports, filed under 17 U.S.C. 1003(c) and access to a Verifying...

  2. Formal Multilevel Hierarchical Verification of Synchronous MOS VLSI Circuits.

    DTIC Science & Technology

    1987-06-01

    ...depend on whether it is performing flat verification or hierarchical verification. The primary operations of Silica Pithecus when performing flat...signals never arise. The primary operation of Silica Pithecus when performing hierarchical verification is processing constraints to show they hold

  3. 49 CFR 802.7 - Requests: How, where, and when presented; verification of identity of individuals making requests...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...; verification of identity of individuals making requests; accompanying persons; and procedures for... Procedures and Requirements § 802.7 Requests: How, where, and when presented; verification of identity of... which the record is contained. (d) Verification of identity of requester. (1) For written requests, the...

  4. 49 CFR 802.7 - Requests: How, where, and when presented; verification of identity of individuals making requests...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...; verification of identity of individuals making requests; accompanying persons; and procedures for... Procedures and Requirements § 802.7 Requests: How, where, and when presented; verification of identity of... which the record is contained. (d) Verification of identity of requester. (1) For written requests, the...

  5. 76 FR 54810 - Submission for Review: 3206-0215, Verification of Full-Time School Attendance, RI 25-49

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-02

    ... OFFICE OF PERSONNEL MANAGEMENT Submission for Review: 3206-0215, Verification of Full-Time School...) 3206-0215, Verification of Full-Time School Attendance. As required by the Paperwork Reduction Act of... or faxed to (202) 395-6974. SUPPLEMENTARY INFORMATION: RI 25-49, Verification of Full-Time School...

  6. 25 CFR 61.8 - Verification forms.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... using the last address of record. The verification form will be used to ascertain the previous enrollee... death. Name and/or address changes will only be made if the verification form is signed by an adult... 25 Indians 1 2010-04-01 2010-04-01 false Verification forms. 61.8 Section 61.8 Indians BUREAU OF...

  7. A new lineage of braconid wasps in Burmese Cenomanian amber (Hymenoptera, Braconidae).

    PubMed

    Engel, Michael S; Huang, Diying; Cai, Chenyang; Alqarni, Abdulaziz S

    2018-01-01

    A new braconid wasp from the Upper Cretaceous (Cenomanian) amber of the Hukawng Valley in Kachin State, Myanmar is described and figured from a unique female. Seneciobracon novalatus Engel & Huang, gen. et sp. n., is placed in a distinct subfamily, Seneciobraconinae Engel & Huang, subfam. n., owing to the presence of a unique combination of primitive protorhyssaline-like traits, with an otherwise more derived wing venation. The fossil is discussed in the context of other Cretaceous Braconidae.

  8. A new lineage of braconid wasps in Burmese Cenomanian amber (Hymenoptera, Braconidae)

    PubMed Central

    Engel, Michael S.; Huang, Diying; Cai, Chenyang; Alqarni, Abdulaziz S.

    2018-01-01

    Abstract A new braconid wasp from the Upper Cretaceous (Cenomanian) amber of the Hukawng Valley in Kachin State, Myanmar is described and figured from a unique female. Seneciobracon novalatus Engel & Huang, gen. et sp. n., is placed in a distinct subfamily, Seneciobraconinae Engel & Huang, subfam. n., owing to the presence of a unique combination of primitive protorhyssaline-like traits, with an otherwise more derived wing venation. The fossil is discussed in the context of other Cretaceous Braconidae. PMID:29416397

  9. Audio-visual imposture

    NASA Astrophysics Data System (ADS)

    Karam, Walid; Mokbel, Chafic; Greige, Hanna; Chollet, Gerard

    2006-05-01

    A GMM based audio visual speaker verification system is described, and an Active Appearance Model with a linear speaker transformation system is used to evaluate the robustness of the verification. An Active Appearance Model (AAM) is used to automatically locate and track a speaker's face in a video recording. A Gaussian Mixture Model (GMM) based classifier (BECARS) is used for face verification. GMM training and testing is accomplished on DCT based extracted features of the detected faces. On the audio side, speech features are extracted and used for speaker verification with the GMM based classifier. Fusion of both audio and video modalities for audio visual speaker verification is compared with face verification and speaker verification systems. To improve the robustness of the multimodal biometric identity verification system, an audio visual imposture system is envisioned. It consists of an automatic voice transformation technique that an impostor may use to assume the identity of an authorized client. Features of the transformed voice are then combined with the corresponding appearance features and fed into the GMM based system BECARS for training. An attempt is made to increase the acceptance rate of the impostor and to analyze the robustness of the verification system. Experiments are being conducted on the BANCA database, with a prospect of experimenting on the PDAtabase newly developed within the scope of the SecurePhone project.
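    The GMM scoring underlying such verification systems is a log-likelihood ratio between a client model and a world (background) model. A minimal 1-D sketch, with entirely hypothetical model parameters and features (real systems such as BECARS operate on multidimensional cepstral/DCT feature vectors):

```python
import math

def gmm_logpdf(gmm, x):
    """Log-density of a 1-D Gaussian mixture at x; the mixture is a list of
    (weight, mean, variance) components. Uses log-sum-exp for stability."""
    logs = [math.log(w) - 0.5 * (math.log(2 * math.pi * v) + (x - m) ** 2 / v)
            for w, m, v in gmm]
    top = max(logs)
    return top + math.log(sum(math.exp(l - top) for l in logs))

def verification_score(client_gmm, world_gmm, features):
    """Average log-likelihood ratio of the client vs. the background model."""
    return sum(gmm_logpdf(client_gmm, x) - gmm_logpdf(world_gmm, x)
               for x in features) / len(features)

# Hypothetical models: client speaker centred near 1.0, broader background
# model centred near 0. All numbers are illustrative.
client = [(0.6, 1.0, 0.2), (0.4, 1.5, 0.3)]
world = [(0.5, 0.0, 1.0), (0.5, 0.5, 1.5)]

genuine = [1.1, 0.9, 1.4, 1.2]     # features resembling the client
impostor = [-0.8, 0.1, -0.3, 0.4]  # features resembling the background

threshold = 0.0
print(verification_score(client, world, genuine) > threshold)   # accept
print(verification_score(client, world, impostor) > threshold)  # reject
```

    A voice-transformation attack of the kind the paper envisions aims to move the impostor's features toward the client model, pushing the log-likelihood ratio above the decision threshold.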

  10. Can self-verification strivings fully transcend the self-other barrier? Seeking verification of ingroup identities.

    PubMed

    Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B

    2009-12-01

    Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1-3), the findings also show that people strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.

  11. Current status of verification practices in clinical biochemistry in Spain.

    PubMed

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
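    A minimal sketch of the kind of autoverification rule the survey asks about, combining verification limits with a delta check against the patient's previous result. The analyte, limits and delta threshold below are illustrative assumptions, not values reported by the study:

```python
def autoverify(value, ver_low, ver_high, previous=None, delta_limit=None):
    """Minimal autoverification rule: hold a result for manual review if it
    falls outside the verification limits or fails a delta check against the
    previous result from the same patient; otherwise release it automatically."""
    if not (ver_low <= value <= ver_high):
        return "hold: outside verification limits"
    if previous is not None and delta_limit is not None:
        relative_change = abs(value - previous) / previous
        if relative_change > delta_limit:
            return "hold: delta check failed"
    return "release"

# Hypothetical potassium results in mmol/L with illustrative limits.
print(autoverify(4.2, 2.5, 6.5))                                  # release
print(autoverify(7.1, 2.5, 6.5))                                  # hold (limits)
print(autoverify(5.9, 2.5, 6.5, previous=3.8, delta_limit=0.25))  # hold (delta)
```

    Real laboratory information systems layer further criteria on top of this (internal quality control flags, instrument warnings, concordance between parameters), which is exactly the rule set whose standardization the survey examines.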

  12. A study of applications scribe frame data verifications using design rule check

    NASA Astrophysics Data System (ADS)

    Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki

    2013-06-01

    In semiconductor manufacturing, scribe frame data is generally generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables of scanner alignment, wafer inspection and customer-specified marks. We check that the scribe frame design conforms to the specifications of alignment and inspection marks at the end. Recently, in COT (customer owned tooling) business or new technology development, there has been no effective verification method for scribe frame data, and verification takes a great deal of time. Therefore, we tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which is used in device verification. We present the scheme of scribe frame data verification using DRC that we applied. First, verification rules are created based on the specifications of the scanner, inspection and others, and a mark library is also created for pattern matching. Next, DRC verification is performed on the scribe frame data; this verification includes pattern matching using the mark library. As a result, our experiments demonstrated that, by use of pattern matching and DRC verification, our new method can yield speed improvements of more than 12 percent compared to conventional mark checks by visual inspection, and the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks. It is easy to maintain and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.
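    The mark-checking idea can be illustrated with a toy DRC pass over rectangles. The rules, geometry and the simplified gap metric below are assumptions for illustration, not the authors' actual rule deck:

```python
# Marks as axis-aligned rectangles: (x1, y1, x2, y2), units arbitrary.
def width_violations(rects, min_width):
    """Flag rectangles narrower than the minimum-width rule in either axis."""
    return [r for r in rects if min(r[2] - r[0], r[3] - r[1]) < min_width]

def spacing_violations(rects, min_space):
    """Flag pairs of non-overlapping rectangles closer than the minimum
    spacing rule. The gap metric here is the larger of the horizontal and
    vertical clearances, a simplification of a true Euclidean DRC check."""
    bad = []
    for i in range(len(rects)):
        for j in range(i + 1, len(rects)):
            a, b = rects[i], rects[j]
            dx = max(b[0] - a[2], a[0] - b[2], 0)  # horizontal gap, 0 if overlapping
            dy = max(b[1] - a[3], a[1] - b[3], 0)  # vertical gap
            if max(dx, dy) < min_space and (dx > 0 or dy > 0):
                bad.append((i, j))
    return bad

marks = [(0, 0, 10, 10), (12, 0, 22, 10), (40, 0, 41, 10)]
print(width_violations(marks, min_width=2))    # the 1-unit-wide mark
print(spacing_violations(marks, min_space=5))  # marks 0 and 1, only 2 units apart
```

    A production flow would pair rule checks like these with pattern matching against the mark library, so that each mark is verified both for geometric legality and for identity against its library master.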

  13. Challenges for Multilevel Health Disparities Research in a Transdisciplinary Environment

    PubMed Central

    Holmes, John H.; Lehman, Amy; Hade, Erinn; Ferketich, Amy K.; Gehlert, Sarah; Rauscher, Garth H.; Abrams, Judith; Bird, Chloe E.

    2008-01-01

    Numerous factors play a part in health disparities. Although health disparities are manifested at the level of the individual, other contexts should be considered when investigating the associations of disparities with clinical outcomes. These contexts include families, neighborhoods, social organizations, and healthcare facilities. This paper reports on health disparities research as a multilevel research domain from the perspective of a large national initiative. The Centers for Population Health and Health Disparities (CPHHD) program was established by the NIH to examine the highly dimensional, complex nature of disparities and their effects on health. Because of its inherently transdisciplinary nature, the CPHHD program provides a unique environment in which to perform multilevel health disparities research. During the course of the program, the CPHHD centers have experienced challenges specific to this type of research. The challenges were categorized along three axes: sources of subjects and data, data characteristics, and multilevel analysis and interpretation. The CPHHDs collectively offer a unique example of how these challenges are met; just as importantly, they reveal a broad range of issues that health disparities researchers should consider as they pursue transdisciplinary investigations in this domain, particularly in the context of a large team science initiative. PMID:18619398

  14. Family predictors of disease management over one year in Latino and European American patients with type 2 diabetes.

    PubMed

    Chesla, Catherine A; Fisher, Lawrence; Skaff, Marilyn M; Mullan, Joseph T; Gilliss, Catherine L; Kanter, Richard

    2003-01-01

    Family context is thought to influence chronic disease management but few studies have longitudinally examined these relationships. Research on families and chronic illness has focused almost exclusively on European American families. In this prospective study we tested a multidimensional model of family influence on disease management in type 2 diabetes in a bi-ethnic sample of European Americans and Latinos. Specifically, we tested how baseline family characteristics (structure, world view, and emotion management) predicted change in disease management over one year in 104 European American and 57 Latino patients with type 2 diabetes. We found that emotion management predicted change in disease management in both groups of patients as hypothesized, while family world view predicted change in both ethnic groups but in the predicted direction only for European Americans. Examining family context within ethnic groups is required to elucidate unique cultural patterns. Attending to culturally unique interpretations of constructs and measures is warranted. The import of family emotion management, specifically conflict resolution, in disease management deserves further study to support clinical intervention development. Examining multiple domains of family life and multidimensional health outcomes strengthens our capacity to develop theory about family contexts and individual health.

  15. Revalidation and quality assurance: the application of the MUSIQ framework in independent verification visits to healthcare organisations

    PubMed Central

    Griffin, Ann; Viney, Rowena; Welland, Trevor; Gafson, Irene

    2017-01-01

    Objectives We present a national evaluation of the impact of independent verification visits (IVVs) performed by National Health Service (NHS) England as part of quality assuring medical revalidation. Organisational visits are central to NHS quality assurance. They are costly, yet little empirical research evidence exists concerning their impact, and what does exist is conflicting. Setting The focus was on healthcare providers in the NHS (in secondary care) and private sector across England, who were designated bodies (DBs). DBs are healthcare organisations that have a statutory responsibility, via the lead clinician, the responsible officer (RO), to implement medical revalidation. Participants All ROs who had undergone an IVV in England in 2014 and 2015 were invited to participate. 46 ROs were interviewed. Ethnographic data were gathered at 18 observations of the IVVs and 20 IVV post visit reports underwent documentary analysis. Primary and secondary outcome measures Primary outcomes were the findings pertaining to the effectiveness of the IVV system in supporting the revalidation processes at the DBs. Secondary outcomes were methodological, relating to the Model for Understanding Success in Quality (MUSIQ) and how its application to the IVV reveals the relevance of contextual factors described in the model. Results The impact of the IVVs varied by DB according to three major themes: the personal context of the RO; the organisational context of the DB; and the visit and its impact. ROs were largely satisfied with visits which raised the status of appraisal within their organisations. Inadequate or untimely feedback was associated with dissatisfaction. Conclusions Influencing teams whose prime responsibility is establishing processes and evaluating progress was crucial for internal quality improvement. Visits acted as a nudge, generating internal quality review, which was reinforced by visit teams with relevant expertise. 
Diverse team membership, knowledge transfer and timely feedback made visits more impactful. PMID:28196952

  16. Revalidation and quality assurance: the application of the MUSIQ framework in independent verification visits to healthcare organisations.

    PubMed

    Griffin, Ann; McKeown, Alex; Viney, Rowena; Rich, Antonia; Welland, Trevor; Gafson, Irene; Woolf, Katherine

    2017-02-14

    We present a national evaluation of the impact of independent verification visits (IVVs) performed by National Health Service (NHS) England as part of quality assuring medical revalidation. Organisational visits are central to NHS quality assurance. They are costly, yet little empirical research evidence exists concerning their impact, and what does exist is conflicting. The focus was on healthcare providers in the NHS (in secondary care) and private sector across England, who were designated bodies (DBs). DBs are healthcare organisations that have a statutory responsibility, via the lead clinician, the responsible officer (RO), to implement medical revalidation. All ROs who had undergone an IVV in England in 2014 and 2015 were invited to participate. 46 ROs were interviewed. Ethnographic data were gathered at 18 observations of the IVVs and 20 IVV post visit reports underwent documentary analysis. Primary outcomes were the findings pertaining to the effectiveness of the IVV system in supporting the revalidation processes at the DBs. Secondary outcomes were methodological, relating to the Model for Understanding Success in Quality (MUSIQ) and how its application to the IVV reveals the relevance of contextual factors described in the model. The impact of the IVVs varied by DB according to three major themes: the personal context of the RO; the organisational context of the DB; and the visit and its impact. ROs were largely satisfied with visits which raised the status of appraisal within their organisations. Inadequate or untimely feedback was associated with dissatisfaction. Influencing teams whose prime responsibility is establishing processes and evaluating progress was crucial for internal quality improvement. Visits acted as a nudge, generating internal quality review, which was reinforced by visit teams with relevant expertise. Diverse team membership, knowledge transfer and timely feedback made visits more impactful. 

  17. Correctional nursing: a study protocol to develop an educational intervention to optimize nursing practice in a unique context.

    PubMed

    Almost, Joan; Gifford, Wendy A; Doran, Diane; Ogilvie, Linda; Miller, Crystal; Rose, Don N; Squires, Mae

    2013-06-21

    Nurses are the primary healthcare providers in correctional facilities. A solid knowledge and expertise that includes the use of research evidence in clinical decision making is needed to optimize nursing practice and promote positive health outcomes within these settings. The institutional emphasis on custodial care within a heavily secured, regulated, and punitive environment presents unique contextual challenges for nursing practice. Subsequently, correctional nurses are not always able to obtain training or ongoing education that is required for broad scopes of practice. The purpose of the proposed study is to develop an educational intervention for correctional nurses to support the provision of evidence-informed care. A two-phase mixed methods research design will be used. The setting will be three provincial correctional facilities. Phase one will focus on identifying nurses' scope of practice and practice needs, describing work environment characteristics that support evidence-informed practice and developing the intervention. Semi-structured interviews will be completed with nurses and nurse managers. To set priorities for the intervention, a Delphi process will be used to rank the learning needs identified by participants. Based on findings, an online intervention will be developed. Phase two will involve evaluating the acceptability and feasibility of the intervention to inform a future experimental design. The context of provincial correctional facilities presents unique challenges for nurses' provision of care. This study will generate information to address practice and learning needs specific to correctional nurses. Interventions tailored to barriers and supports within specific contexts are important to enable nurses to provide evidence-informed care.

  18. Performance-Based Financing to Strengthen the Health System in Benin: Challenging the Mainstream Approach

    PubMed Central

    Paul, Elisabeth; Lamine Dramé, Mohamed; Kashala, Jean-Pierre; Ekambi Ndema, Armand; Kounnou, Marcel; Codjovi Aïssan, Julien; Gyselinck, Karel

    2018-01-01

    Background: Performance-based financing (PBF) is often proposed as a way to improve health system performance. In Benin, PBF was launched in 2012 through a World Bank-supported project. The Belgian Development Agency (BTC) followed suit through a health system strengthening (HSS) project. This paper analyses and draws lessons from the experience of BTC-supported PBF alternative approach – especially with regards to institutional aspects, the role of demand-side actors, ownership, and cost-effectiveness – and explores the mechanisms at stake so as to better understand how the "PBF package" functions and produces effects Methods: An exploratory, theory-driven evaluation approach was adopted. Causal mechanisms through which PBF is hypothesised to impact on results were singled out and explored. This paper stems from the co-authors’ capitalisation of experiences; mixed methods were used to collect, triangulate and analyse information. Results are structured along Witter et al framework. Results: Influence of context is strong over PBF in Benin; the policy is donor-driven. BTC did not adopt the World Bank’s mainstream PBF model, but developed an alternative approach in line with its HSS support programme, which is grounded on existing domestic institutions. The main features of this approach are described (decentralised governance, peer review verification, counter-verification entrusted to health service users’ platforms), as well as its adaptive process. PBF has contributed to strengthen various aspects of the health system and led to modest progress in utilisation of health services, but noticeable improvements in healthcare quality. Three mechanisms explaining observed outcomes within the context are described: comprehensive HSS at district level; acting on health workers’ motivation through a complex package of incentives; and increased accountability by reinforcing dialogue with demand-side actors. 
Cost-effectiveness and sustainability issues are also discussed. Conclusion: BTC's alternative PBF approach is promising in terms of effects, ownership and sustainability, and is less resource-consuming. This experience shows that PBF is not a uniform or rigid model, and it opens policy ground for recipient governments to set their own emphases and priorities and to design ad hoc models adapted to their context specificities. However, integrating PBF within the normal functioning of local health systems, in line with other reforms, is a big challenge. PMID:29325401

  19. Environmental Technology Verification: Supplement to Test/QA Plan for Biological and Aerosol Testing of General Ventilation Air Cleaners; Bioaerosol Inactivation Efficiency by HVAC In-Duct Ultraviolet Light Air Cleaners

    EPA Science Inventory

    The Air Pollution Control Technology Verification Center has selected general ventilation air cleaners as a technology area. The Generic Verification Protocol for Biological and Aerosol Testing of General Ventilation Air Cleaners is on the Environmental Technology Verification we...

  20. 49 CFR 40.135 - What does the MRO tell the employee at the beginning of the verification interview?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... beginning of the verification interview? 40.135 Section 40.135 Transportation Office of the Secretary of... verification interview? (a) As the MRO, you must tell the employee that the laboratory has determined that the... finding of adulteration or substitution. (b) You must explain the verification interview process to the...

  1. 40 CFR 1065.550 - Gas analyzer range verification and drift verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... with a CLD and the removed water is corrected based on measured CO2, CO, THC, and NOX concentrations... concentration subcomponents (e.g., THC and CH4 for NMHC) separately. For example, for NMHC measurements, perform drift verification on NMHC; do not verify THC and CH4 separately. (2) Drift verification requires two...

  2. 76 FR 44051 - Submission for Review: Verification of Who Is Getting Payments, RI 38-107 and RI 38-147

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-22

    .... SUPPLEMENTARY INFORMATION: RI 38-107, Verification of Who is Getting Payments, is designed for use by the... OFFICE OF PERSONNEL MANAGEMENT Submission for Review: Verification of Who Is Getting Payments, RI... currently approved information collection request (ICR) 3206-0197, Verification of Who is Getting Payments...

  3. Automated verification of flight software. User's manual

    NASA Technical Reports Server (NTRS)

    Saib, S. H.

    1982-01-01

    AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and the reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.

  4. Research on key technology of the verification system of steel rule based on vision measurement

    NASA Astrophysics Data System (ADS)

    Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun

    2018-01-01

    The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, suffers from low precision and low efficiency. A machine vision based verification system for steel rules is designed with reference to JJG1-1999, Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method for pixel equivalent and decontaminates the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measuring results strongly demonstrate that these methods not only meet the precision requirements of the verification regulation, but also improve the reliability and efficiency of the verification system.

  5. Space transportation system payload interface verification

    NASA Technical Reports Server (NTRS)

    Everline, R. T.

    1977-01-01

    The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).

  6. Mentoring of Junior Faculty.

    ERIC Educational Resources Information Center

    Campbell, William H.

    1992-01-01

    Some personal aspects of the mentoring relationship between senior and junior faculty are discussed, including the "psychological contract" between mentor and protege, the unique role played by the mentor in an organizational context, mentor characteristics, and 10 specific principles of effective mentoring. (MSE)

  7. Mapping the Universe: Slices and Bubbles.

    ERIC Educational Resources Information Center

    Geller, Margaret J.

    1990-01-01

    Map making is described in the context of extraterrestrial areas. An analogy to terrestrial map making is used to provide some background. The status of projects designed to map extraterrestrial areas are discussed including problems unique to this science. (CW)

  8. Ethical review of research on human subjects at Unilever: reflections on governance.

    PubMed

    Sheehan, Mark; Marti, Vernon; Roberts, Tony

    2014-07-01

    This article considers the process of ethical review of research on human subjects at a very large multinational consumer products company. The commercial context of this research throws up unique challenges and opportunities that make the ethics of the process of oversight distinct from mainstream medical research. Reflection on the justification of governance processes sheds important, contrasting light on the ethics of governance of other forms and context of research. © 2013 John Wiley & Sons Ltd.

  9. Influence of the Redundant Verification and the Non-Redundant Verification on the Hydraulic Tomography

    NASA Astrophysics Data System (ADS)

    Wei, T. B.; Chen, Y. L.; Lin, H. R.; Huang, S. Y.; Yeh, T. C. J.; Wen, J. C.

    2016-12-01

    In groundwater studies, hydraulic tomography (HT) based on field-site pumping tests is widely used to estimate the heterogeneous spatial distribution of aquifer hydraulic properties, and many such studies have confirmed that most field-site aquifers have heterogeneous hydrogeological parameter fields. Huang et al. [2011] applied a non-redundant verification analysis, in which pumping-well locations were changed while observation wells remained fixed in both the inverse and forward analyses, to demonstrate the feasibility of estimating the heterogeneous spatial distribution of hydraulic properties of a field-site aquifer with a steady-state model. The literature to date, however, covers only this steady-state, non-redundant configuration (changed pumping wells, fixed observation wells). The various combinations of fixed or changed pumping-well locations with fixed observation wells (redundant verification) or changed observation wells (non-redundant verification) have not yet been explored for their influence on the hydraulic tomography method. In this study, both the redundant and the non-redundant verification methods were carried out in forward analyses to examine their influence on hydraulic tomography under transient conditions. An actual case at the NYUST campus site is discussed to demonstrate the effectiveness of the hydraulic tomography methods, and the analysis results confirm the feasibility of the inverse and forward analyses. Keywords: Hydraulic Tomography, Redundant Verification, Heterogeneous, Inverse, Forward

  10. Fingerprint changes and verification failure among patients with hand dermatitis.

    PubMed

    Lee, Chew Kek; Chang, Choong Chor; Johar, Asmah; Puwira, Othman; Roshidah, Baba

    2013-03-01

    To determine the prevalence of fingerprint verification failure and to define and quantify the fingerprint changes associated with fingerprint verification failure. Case-control study. Referral public dermatology center. The study included 100 consecutive patients with clinical hand dermatitis involving the palmar distal phalanx of either thumb and 100 age-, sex-, and ethnicity-matched controls. Patients with an altered thumb print due to other causes and palmar hyperhidrosis were excluded. Fingerprint verification (pass/fail) and hand eczema severity index score. Twenty-seven percent of patients failed fingerprint verification compared with 2% of controls. Fingerprint verification failure was associated with a higher hand eczema severity index score (P < .001). The main fingerprint abnormalities were fingerprint dystrophy (42.0%) and abnormal white lines (79.5%). The number of abnormal white lines was significantly higher among the patients with hand dermatitis compared with controls (P = .001). Among the patients with hand dermatitis, the odds of failing fingerprint verification with fingerprint dystrophy was 4.01. The presence of broad lines and long lines was associated with greater odds of fingerprint verification failure (odds ratio [OR], 8.04; 95% CI, 3.56-18.17 and OR, 2.37; 95% CI, 1.31-4.27, respectively), while the presence of thin lines was protective against verification failure (OR, 0.45; 95% CI, 0.23-0.89). Fingerprint verification failure is a significant problem among patients with more severe hand dermatitis. It is mainly due to fingerprint dystrophy and abnormal white lines. Malaysian National Medical Research Register Identifier: NMRR-11-30-8226
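
    The odds ratios and 95% confidence intervals reported above are the standard output of a 2x2 contingency analysis. As a minimal sketch of how such figures are computed (the Wald method is assumed; the counts below are hypothetical illustrations, not the study's data):

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Odds ratio and Wald 95% CI from a 2x2 table:
        a = exposed cases,   b = exposed non-cases,
        c = unexposed cases, d = unexposed non-cases.
        """
        or_ = (a * d) / (b * c)
        # Standard error of log(OR) under the Wald approximation
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lower = math.exp(math.log(or_) - z * se)
        upper = math.exp(math.log(or_) + z * se)
        return or_, lower, upper

    # Hypothetical counts for illustration only:
    or_, lower, upper = odds_ratio_ci(20, 30, 7, 43)
    print(f"OR = {or_:.2f}, 95% CI {lower:.2f}-{upper:.2f}")
    ```

    A CI whose lower bound exceeds 1 (as for broad lines, 3.56-18.17) indicates a statistically significant risk factor, while an interval entirely below 1 (as for thin lines, 0.23-0.89) indicates a protective association.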

  11. Comments for A Conference on Verification in the 21st Century

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doyle, James E.

    2012-06-12

    The author offers 5 points for the discussion of Verification and Technology: (1) Experience with the implementation of arms limitation and arms reduction agreements confirms that technology alone has never been relied upon to provide effective verification. (2) The historical practice of verification of arms control treaties between Cold War rivals may constrain the cooperative and innovative use of technology for transparency, verification and confidence building in the future. (3) An area that has been identified by many, including the US State Department and NNSA, as being rich for exploration for potential uses of technology for transparency and verification is information and communications technology (ICT). This includes social media, crowd-sourcing, the internet of things, and the concept of societal verification, but there are issues. (4) On the issue of the extent to which verification technologies are keeping pace with the demands of future protocols and agreements, I think the more direct question is ''are they effective in supporting the objectives of the treaty or agreement?'' In this regard it is important to acknowledge that there is a verification grand challenge at our doorstep. That is ''how does one verify limitations on nuclear warheads in national stockpiles?'' (5) Finally, while recognizing the daunting political and security challenges of such an approach, multilateral engagement and cooperation at the conceptual and technical levels provides benefits for addressing future verification challenges.

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR TREATMENT OF WASTEWATER GENERATED DURING DECONTAMINATION ACTIVITIES - ULTRASTRIP SYSTEMS, INC., MOBILE EMERGENCY FILTRATION SYSTEM (MEFS) - 04/14/WQPC-HS

    EPA Science Inventory

    Performance verification testing of the UltraStrip Systems, Inc., Mobile Emergency Filtration System (MEFS) was conducted under EPA's Environmental Technology Verification (ETV) Program at the EPA Test and Evaluation (T&E) Facility in Cincinnati, Ohio, during November, 2003, thr...

  13. Model Based Verification of Cyber Range Event Environments

    DTIC Science & Technology

    2015-12-10

    Model Based Verification of Cyber Range Event Environments Suresh K. Damodaran MIT Lincoln Laboratory 244 Wood St., Lexington, MA, USA...apply model based verification to cyber range event environment configurations, allowing for the early detection of errors in event environment...Environment Representation (CCER) ontology. We also provide an overview of a methodology to specify verification rules and the corresponding error

  14. Process Document, Joint Verification Protocol, and Joint Test Plan for Verification of HACH-LANGE GmbH LUMIStox 300 Bench Top Luminometer and ECLOX Handheld Luminometer for Luminescent Bacteria Test for use in Wastewater

    EPA Science Inventory

    The Danish Environmental Technology Verification program (DANETV) Water Test Centre operated by DHI, is supported by the Danish Ministry for Science, Technology and Innovation. DANETV, the United States Environmental Protection Agency Environmental Technology Verification Progra...

  15. Certification and verification for Northrup Model NSC-01-0732 Fresnel lens concentrating solar collector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1979-03-01

    The certification and verification of the Northrup Model NSC-01-0732 Fresnel lens tracking solar collector are presented. A certification statement is included with signatures and a separate report on the structural analysis of the collector system. System verification against the Interim Performance Criteria are indicated by matrices with verification discussion, analysis, and enclosed test results.

  16. Worldwide governmental efforts to locate and destroy chemical weapons and weapons materials: minimizing risk in transport and destruction.

    PubMed

    Trapp, Ralf

    2006-09-01

    The article gives an overview on worldwide efforts to eliminate chemical weapons and facilities for their production in the context of the implementation of the 1997 Chemical Weapons Convention (CWC). It highlights the objectives of the Organisation for the Prohibition of Chemical Weapons (OPCW), the international agency set up in The Hague to implement the CWC, and provides an overview of the present status of implementation of the CWC requirements with respect to chemical weapons (CW) destruction under strict international verification. It addresses new requirements that result from an increased threat that terrorists might attempt to acquire or manufacture CW or related materials. The article provides an overview of risks associated with CW and their elimination, from storage or recovery to destruction. It differentiates between CW in stockpile and old/abandoned CW, and gives an overview on the factors and key processes that risk assessment, management, and communication need to address. This discussion is set in the overall context of the CWC that requires the completion of the destruction of all declared CW stockpiles by 2012 at the latest.

  17. The HUPO proteomics standards initiative--overcoming the fragmentation of proteomics data.

    PubMed

    Hermjakob, Henning

    2006-09-01

    Proteomics is a key field of modern biomolecular research, with many small and large scale efforts producing a wealth of proteomics data. However, the vast majority of this data is never exploited to its full potential. Even in publicly funded projects, often the raw data generated in a specific context is analysed, conclusions are drawn and published, but little attention is paid to systematic documentation, archiving, and public access to the data supporting the scientific results. It is often difficult to validate the results stated in a particular publication, and even simple global questions like "In which cellular contexts has my protein of interest been observed?" can currently not be answered with realistic effort, due to a lack of standardised reporting and collection of proteomics data. The Proteomics Standards Initiative (PSI), a work group of the Human Proteome Organisation (HUPO), defines community standards for data representation in proteomics to facilitate systematic data capture, comparison, exchange and verification. In this article we provide an overview of PSI organisational structure, activities, and current results, as well as ways to get involved in the broad-based, open PSI process.

  18. Security evaluation and assurance of electronic health records.

    PubMed

    Weber-Jahnke, Jens H

    2009-01-01

    Electronic Health Records (EHRs) maintain information of sensitive nature. Security requirements in this context are typically multilateral, encompassing the viewpoints of multiple stakeholders. Two main research questions arise from a security assurance point of view, namely how to demonstrate the internal correctness of EHRs and how to demonstrate their conformance in relation to multilateral security regulations. The above notions of correctness and conformance directly relate to the general concept of system verification, which asks the question "are we building the system right?" This should not be confused with the concept of system validation, which asks the question "are we building the right system?" Much of the research in the medical informatics community has been concerned with the latter aspect (validation). However, trustworthy security requires assurances that standards are followed and specifications are met. The objective of this paper is to contribute to filling this gap. We give an introduction to fundamentals of security assurance, summarize current assurance standards, and report on experiences with using security assurance methodology applied to the EHR domain, specifically focusing on case studies in the Canadian context.

  19. Verification of a SEU model for advanced 1-micron CMOS structures using heavy ions

    NASA Technical Reports Server (NTRS)

    Cable, J. S.; Carter, J. R.; Witteles, A. A.

    1986-01-01

    Modeling and test results are reported for 1 micron CMOS circuits. Analytical predictions are correlated with experimental data, and sensitivities to process and design variations are discussed. Unique features involved in predicting the SEU performance of these devices are described. The results show that the critical charge for upset exhibits a strong dependence on pulse width for very fast devices, and upset predictions must factor in the pulse shape. Acceptable SEU error rates can be achieved for a 1 micron bulk CMOS process. A thin retrograde well provides complete SEU immunity for N channel hits at normal incidence angle. Source interconnect resistance can be important parameter in determining upset rates, and Cf-252 testing can be a valuable tool for cost-effective SEU testing.

  20. Accessing defect dynamics using intense, nanosecond pulsed ion beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Persaud, A.; Barnard, J. J.; Guo, H.

    2015-06-18

    Gaining in-situ access to the relaxation dynamics of radiation-induced defects will lead to a better understanding of materials and is important for the verification of theoretical models and simulations. We show preliminary results from experiments at the new Neutralized Drift Compression Experiment (NDCX-II) at Lawrence Berkeley National Laboratory that will enable in-situ access to defect dynamics through pump-probe experiments. Here, the unique capabilities of the NDCX-II accelerator to generate intense, nanosecond pulsed ion beams are utilized. Preliminary data from channeling experiments using lithium and potassium ions and silicon membranes are shown. We compare these data to simulation results using Crystal-TRIM. Furthermore, we discuss the improvements that will bring the accelerator to higher performance levels and the new diagnostic tools that are being incorporated.
