Sample records for economic verification experiments

  1. An evaluation of SEASAT-A candidate ocean industry economic verification experiments

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A description of the candidate economic verification experiments which could be performed with SEASAT is provided. Experiments have been identified in each area of ocean-based activity expected to show an economic impact from the use of operational SEASAT data: Arctic operations, the ocean fishing industry, the offshore oil and natural gas industry, and ice monitoring and coastal zone applications.

  2. SEASAT - A candidate ocean industry economic verification experiments

    NASA Technical Reports Server (NTRS)

    Miller, B. P.

    1976-01-01

    The economic benefits of an operational SEASAT system are discussed in the areas of marine transportation, offshore oil and natural gas exploration and development, ocean fishing, and Arctic operations. A description of the candidate economic verification experiments which could be performed with SEASAT-A is given. With the exception of Arctic operations, experiments have been identified in each area of ocean-based activity expected to show an economic impact from the use of operational SEASAT data, including the offshore oil and natural gas industry, ice monitoring, and coastal zone applications. Emphasis has been placed on identifying and developing those experiments which meet criteria for: (1) end-user participation; (2) SEASAT-A data utility; (3) measurability of operational parameters to demonstrate economic effect; and (4) non-proprietary nature of results.

  3. A plan for application system verification tests: The value of improved meteorological information, volume 1. [economic consequences of improved meteorological information]

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The framework within which the Applications Systems Verification Tests (ASVTs) are performed and the economic consequences of improved meteorological information are demonstrated is described. This framework considers the impact of improved information on decision processes, the data needed to demonstrate the economic impact of the improved information, the availability of those data, the methodology for collecting and analyzing the data and demonstrating the economic impact, and the possible methods of data collection. Three ASVTs are considered, and program outlines and plans are developed for performing experiments to demonstrate the economic consequences of improved meteorological information. The ASVTs are concerned with the citrus crop in Florida, the cotton crop in Mississippi, and a group of diverse crops in Oregon. The program outlines and plans include schedules, manpower estimates, and funding requirements.

  4. 14 CFR 211.11 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Title 14, Aeronautics and Space. Office of the Secretary, Department of Transportation (Aviation Proceedings), Economic Regulations, Applications for Permits to Foreign Air Carriers, General Requirements, § 211.11 Verification...

  5. On the assessment of biological life support system operation range

    NASA Astrophysics Data System (ADS)

    Bartsev, Sergey

    Biological life support systems (BLSS) can be used in long-term space missions only if a well-thought-out assessment of the allowable operating range is obtained. The range has to account for both the permissible working parameters of the BLSS and the critical level of perturbation of its stationary state. A direct approach to outlining the range by statistical treatment of experimental data on BLSS destruction does not seem applicable for ethical, economic, and time reasons. A mathematical model is the only tool for generalizing experimental data and extrapolating the revealed regularities beyond empirical experience. The problem is that the quality of extrapolation depends on the adequacy of model verification, but good verification requires a wide range of experimental data for fitting, which is not achievable for manned experimental BLSS. A possible way to improve the extrapolation quality of inevitably poorly verified models of manned BLSS is to extrapolate the general tendencies obtained from theoretical and experimental investigations of unmanned LSS. The possibilities and limitations of this approach are discussed.

  6. Can self-verification strivings fully transcend the self-other barrier? Seeking verification of ingroup identities.

    PubMed

    Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B

    2009-12-01

    Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1-3), the findings also show that people strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.

  7. Evaluation and economic value of winter weather forecasts

    NASA Astrophysics Data System (ADS)

    Snyder, Derrick W.

    State and local highway agencies spend millions of dollars each year to deploy winter operation teams to plow snow and de-ice roadways. Accurate and timely weather forecast information is critical for effective decision making. Students from Purdue University partnered with the Indiana Department of Transportation to create an experimental winter weather forecast service for the 2012-2013 winter season in Indiana to support this decision making. One forecast product, an hourly timeline of winter weather hazards produced daily, was evaluated for quality and economic value. Verification of the forecasts was performed with data from the Rapid Refresh numerical weather model. Two objective verification criteria were developed to evaluate the performance of the timeline forecasts. Under both criteria, the timeline forecasts had issues with reliability and discrimination, systematically over-forecasting the amount of winter weather that was observed while also missing significant winter weather events. Despite these quality issues, the forecasts still showed significant, but varied, economic value compared to climatology. The economic value of the forecasts was estimated at $29.5 million or $4.1 million, depending on the verification criteria used. Limitations of this valuation system are discussed and a framework is developed for more thorough studies in the future.
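
    The record does not say how the dollar values were derived; a common way to express forecast value against climatology is the static cost-loss ("relative value") model. The sketch below is a minimal Python illustration of that calculation with invented contingency-table counts and cost-loss ratio, not the study's actual data or method.

    ```python
    # Minimal cost-loss "relative value" sketch for a binary winter-weather forecast.
    # All counts and the cost-loss ratio below are hypothetical.

    def relative_value(hits, misses, false_alarms, correct_negatives, cost_loss_ratio):
        """Value relative to climatology: 1 = perfect forecast, 0 = no better than climatology."""
        n = hits + misses + false_alarms + correct_negatives
        h, m, f = hits / n, misses / n, false_alarms / n
        s = (hits + misses) / n              # climatological event frequency
        r = cost_loss_ratio                  # cost of protecting / loss avoided
        e_clim = min(r, s)                   # expected expense using climatology only
        e_perfect = s * r                    # expected expense with a perfect forecast
        e_forecast = (h + f) * r + m         # expected expense when acting on the forecast
        return (e_clim - e_forecast) / (e_clim - e_perfect)

    print(relative_value(hits=42, misses=18, false_alarms=55, correct_negatives=85,
                         cost_loss_ratio=0.3))
    ```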

  8. Formal methods for dependable real-time systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1993-01-01

    The motivation for using formal methods to specify and reason about real-time properties is outlined, and approaches that have been proposed and used are sketched. The formal verifications of clock synchronization algorithms show that mechanically supported reasoning about complex real-time behavior is feasible. However, the effectiveness of verification systems has increased significantly since those verifications were performed, and it is to be expected that verifications of comparable difficulty will become fairly routine. The current challenge lies in developing perspicuous and economical approaches to the formalization and specification of real-time properties.

  9. An Environmental Technology Verification (ETV) Testing of a Ballast Exchange Assurance Meter (BEAM) 100

    EPA Science Inventory

    Mid-ocean ballast water exchange (BWE) is mandatory for all vessels entering U.S. waters from outside the 200-mile exclusive economic zone. To support such regulation, accurate and portable verification tools are needed for determining that BWE has taken place. One parameter pr...

  10. 24 CFR 3286.207 - Process for obtaining installation license.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... installation license must submit verification of the experience required in § 3286.205(a). This verification may be in the form of statements by past or present employers or a self-certification that the applicant meets those experience requirements, but HUD may contact the applicant for additional verification...

  11. 24 CFR 3286.307 - Process for obtaining trainer's qualification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... verification of the experience required in § 3286.305. This verification may be in the form of statements by past or present employers or a self-certification that the applicant meets those experience requirements, but HUD may contact the applicant for additional verification at any time. The applicant must...

  12. Formal Verification of the AAMP-FV Microcode

    NASA Technical Reports Server (NTRS)

    Miller, Steven P.; Greve, David A.; Wilding, Matthew M.; Srivas, Mandayam

    1999-01-01

    This report describes the experiences of Collins Avionics & Communications and SRI International in formally specifying and verifying the microcode in a Rockwell proprietary microprocessor, the AAMP-FV, using the PVS verification system. This project built extensively on earlier experiences using PVS to verify the microcode in the AAMP5, a complex, pipelined microprocessor designed for use in avionics displays and global positioning systems. While the AAMP5 experiment demonstrated the technical feasibility of formal verification of microcode, the steep learning curve encountered left unanswered the question of whether it could be performed at reasonable cost. The AAMP-FV project was conducted to determine whether the experience gained on the AAMP5 project could be used to make formal verification of microcode cost effective for safety-critical and high volume devices.

  13. The Mailbox Computer System for the IAEA verification experiment on HEU downblending at the Portsmouth Gaseous Diffusion Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aronson, A.L.; Gordon, D.M.

    In April 1996, the United States (US) added the Portsmouth Gaseous Diffusion Plant to the list of facilities eligible for the application of International Atomic Energy Agency (IAEA) safeguards. At that time, the US proposed that the IAEA carry out a "verification experiment" at the plant with respect to downblending of about 13 metric tons of highly enriched uranium (HEU) in the form of uranium hexafluoride (UF6). During the period December 1997 through July 1998, the IAEA carried out the requested verification experiment. The verification approach used for this experiment included, among other measures, the entry of process-operational data by the facility operator on a near-real-time basis into a "mailbox" computer located within a tamper-indicating enclosure sealed by the IAEA.

  14. Aqueous cleaning and verification processes for precision cleaning of small parts

    NASA Technical Reports Server (NTRS)

    Allen, Gale J.; Fishell, Kenneth A.

    1995-01-01

    The NASA Kennedy Space Center (KSC) Materials Science Laboratory (MSL) has developed a totally aqueous process for precision cleaning and verification of small components. In 1990 the Precision Cleaning Facility at KSC used approximately 228,000 kg (500,000 lbs) of chlorofluorocarbon (CFC) 113 in the cleaning operations. It is estimated that current CFC 113 usage has been reduced by 75 percent and it is projected that a 90 percent reduction will be achieved by the end of calendar year 1994. The cleaning process developed utilizes aqueous degreasers, aqueous surfactants, and ultrasonics in the cleaning operation and an aqueous surfactant, ultrasonics, and Total Organic Carbon Analyzer (TOCA) in the nonvolatile residue (NVR) and particulate analysis for verification of cleanliness. The cleaning and verification process is presented in its entirety, with comparison to the CFC 113 cleaning and verification process, including economic and labor costs/savings.

  15. Advanced surveillance of environmental radiation in automatic networks.

    PubMed

    Benito, G; Sáez, J C; Blázquez, J B; Quiñones, J

    2018-06-01

    The objective of this study is the verification of the operation of a radiation monitoring network composed of several sensors. The malfunction of a surveillance network has security and economic consequences, which derive from its maintenance and which could be avoided by early detection. The proposed method is based on a kind of multivariate distance, and the methodology has been tested at CIEMAT's local radiological early warning network.
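
    The record identifies the method only as "a kind of multivariate distance". A common choice for this kind of network check is the Mahalanobis distance; the Python sketch below is illustrative only, with synthetic sensor data and an arbitrary decision rule rather than the authors' published statistic.

    ```python
    # Sketch: flag a possibly anomalous vector of network readings using the
    # Mahalanobis distance to the network's normal-operation statistics.
    # The data, units, and threshold are illustrative, not the CIEMAT network's values.
    import numpy as np

    rng = np.random.default_rng(0)
    normal = rng.normal(loc=100.0, scale=5.0, size=(500, 6))   # 6 sensors, synthetic "nSv/h"
    mean = normal.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(normal, rowvar=False))

    def mahalanobis(x):
        d = x - mean
        return float(np.sqrt(d @ cov_inv @ d))

    reading = np.array([101.0, 98.5, 102.3, 99.1, 160.0, 100.4])  # one sensor far off
    print(mahalanobis(reading))   # a large distance suggests a sensor or network fault
    ```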

  16. Experimental Verification of Boyle's Law and the Ideal Gas Law

    ERIC Educational Resources Information Center

    Ivanov, Dragia Trifonov

    2007-01-01

    Two new experiments are offered concerning the experimental verification of Boyle's law and the ideal gas law. To carry out the experiments, glass tubes, water, a syringe and a metal manometer are used. The pressure of the saturated water vapour is taken into consideration. For educational purposes, the experiments are characterized by their…

  17. Audio-visual imposture

    NASA Astrophysics Data System (ADS)

    Karam, Walid; Mokbel, Chafic; Greige, Hanna; Chollet, Gerard

    2006-05-01

    A GMM-based audio-visual speaker verification system is described, and an Active Appearance Model with a linear speaker transformation system is used to evaluate the robustness of the verification. An Active Appearance Model (AAM) is used to automatically locate and track a speaker's face in a video recording. A Gaussian Mixture Model (GMM) based classifier (BECARS) is used for face verification. GMM training and testing is accomplished on DCT-based features extracted from the detected faces. On the audio side, speech features are extracted and used for speaker verification with the GMM-based classifier. Fusion of the audio and video modalities for audio-visual speaker verification is compared with the face verification and speaker verification systems. To improve the robustness of the multimodal biometric identity verification system, an audio-visual imposture system is envisioned. It consists of an automatic voice transformation technique that an impostor may use to assume the identity of an authorized client. Features of the transformed voice are then combined with the corresponding appearance features and fed into the GMM-based system BECARS for training. An attempt is made to increase the acceptance rate of the impostor and to analyze the robustness of the verification system. Experiments are being conducted on the BANCA database, with a prospect of experimenting on the PDAtabase newly developed within the scope of the SecurePhone project.
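
    For readers unfamiliar with GMM-based verification, the sketch below shows the usual client-versus-background log-likelihood-ratio decision in Python, using scikit-learn as a stand-in for BECARS. The feature arrays and threshold are synthetic placeholders; real systems use DCT or cepstral features as described above.

    ```python
    # Sketch of GMM-based verification: accept a claimed identity when the log-likelihood
    # under the client model sufficiently exceeds that under a background ("world") model.
    # Feature extraction is out of scope; the arrays below are synthetic stand-ins.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(1)
    client_train = rng.normal(0.0, 1.0, size=(400, 20))    # client enrolment features
    world_train = rng.normal(0.5, 1.5, size=(4000, 20))    # background population features

    client_gmm = GaussianMixture(n_components=8, covariance_type='diag').fit(client_train)
    world_gmm = GaussianMixture(n_components=32, covariance_type='diag').fit(world_train)

    def verify(features, threshold=0.0):
        llr = client_gmm.score(features) - world_gmm.score(features)  # mean log-likelihood ratio
        return llr, llr > threshold

    print(verify(rng.normal(0.0, 1.0, size=(150, 20))))    # a genuine-like test segment
    ```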

  18. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
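
    SeaHorn represents verification conditions as Horn clauses. As a rough illustration of that representation (independent of SeaHorn's own pipeline), the sketch below uses Z3's fixedpoint engine from Python to check a trivial counter invariant; the example program and relation names are hypothetical.

    ```python
    # A toy Horn-clause safety check with Z3's fixedpoint engine (pip install z3-solver).
    # Program modelled: a counter that starts at 0 and only increments.
    from z3 import Fixedpoint, Function, IntSort, BoolSort, Int, And

    fp = Fixedpoint()
    fp.set(engine='spacer')                       # Horn-clause (CHC) solving engine

    inv = Function('inv', IntSort(), BoolSort())  # reachable-state relation
    x = Int('x')
    fp.register_relation(inv)
    fp.declare_var(x)

    fp.rule(inv(0))                               # initial state: x = 0
    fp.rule(inv(x + 1), inv(x))                   # transition: x := x + 1

    # Safety query: is a state with x < 0 reachable?  'unsat' means it is not.
    print(fp.query(And(inv(x), x < 0)))
    ```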

  19. The MiniCLEAN Dark Matter Experiment

    NASA Astrophysics Data System (ADS)

    Schnee, Richard; Deap/Clean Collaboration

    2011-10-01

    The MiniCLEAN dark matter experiment exploits a single-phase liquid argon (LAr) detector, instrumented with photomultiplier tubes submerged in the cryogen with nearly 4π coverage of a 500 kg target (150 kg fiducial) mass. The high light yield and the large difference in singlet/triplet scintillation time profiles in LAr provide an effective defense against radioactive backgrounds through pulse-shape discrimination and event position reconstruction. The detector is also designed for a liquid neon target which, in the event of a positive signal in LAr, will enable an independent verification of backgrounds and provide a unique test of the expected A² dependence of the WIMP interaction rate. The conceptually simple design can be scaled to target masses in excess of 10 tons in a relatively straightforward and economic manner. The experimental technique and current status of MiniCLEAN will be summarized.

  20. Structural verification for GAS experiments

    NASA Technical Reports Server (NTRS)

    Peden, Mark Daniel

    1992-01-01

    The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of the experiment's structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered, with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (Stress Corrosion Cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verification (tests and analyses) will be outlined, especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.

  1. Verification and intercomparison of mesoscale ensemble prediction systems in the Beijing 2008 Olympics Research and Development Project

    NASA Astrophysics Data System (ADS)

    Kunii, Masaru; Saito, Kazuo; Seko, Hiromu; Hara, Masahiro; Hara, Tabito; Yamaguchi, Munehiko; Gong, Jiandong; Charron, Martin; Du, Jun; Wang, Yong; Chen, Dehui

    2011-05-01

    During the period around the Beijing 2008 Olympic Games, the Beijing 2008 Olympics Research and Development Project (B08RDP) was conducted as part of the World Weather Research Program short-range weather forecasting research project. Mesoscale ensemble prediction (MEP) experiments were carried out by six organizations in near-real time, in order to share their experiences in the development of MEP systems. The purpose of this study is to objectively verify these experiments and to clarify the problems associated with the current MEP systems through this shared experience. Verification was performed using the MEP outputs interpolated onto a common verification domain with a horizontal resolution of 15 km. For all systems, the ensemble spread grew as the forecast time increased, and the ensemble mean reduced the forecast errors compared with the individual control forecasts in the verification against the analysis fields. However, each system exhibited individual characteristics according to its MEP method. Some participants used physical perturbation methods, and the significance of these methods was confirmed by the verification. However, the mean error (ME) of the ensemble forecast in some systems was worse than that of the individual control forecast. This result suggests that it is necessary to pay careful attention to physical perturbations.
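
    The basic quantities compared in such verifications are the control-forecast error, the ensemble-mean error, and the ensemble spread. The Python sketch below computes them on synthetic fields purely to illustrate the bookkeeping; it is not the B08RDP verification code, which also interpolates all systems to a common 15 km grid.

    ```python
    # Sketch of basic ensemble-verification quantities: control RMSE, ensemble-mean RMSE,
    # and ensemble spread, computed against a verifying analysis. Fields are synthetic.
    import numpy as np

    rng = np.random.default_rng(2)
    analysis = rng.normal(size=(120, 100))                    # verifying analysis ("truth")
    control = analysis + rng.normal(0.0, 1.0, size=analysis.shape)
    members = analysis + rng.normal(0.0, 1.0, size=(10,) + analysis.shape)  # 10 members

    ens_mean = members.mean(axis=0)
    rmse_control = np.sqrt(np.mean((control - analysis) ** 2))
    rmse_ens_mean = np.sqrt(np.mean((ens_mean - analysis) ** 2))
    spread = np.sqrt(np.mean(members.var(axis=0, ddof=1)))

    print(f"control RMSE={rmse_control:.3f}  ens-mean RMSE={rmse_ens_mean:.3f}  spread={spread:.3f}")
    ```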

  2. Apollo experience report: Guidance and control systems. Engineering simulation program

    NASA Technical Reports Server (NTRS)

    Gilbert, D. W.

    1973-01-01

    The Apollo Program experience from early 1962 to July 1969 with respect to the engineering-simulation support and the problems encountered is summarized in this report. Engineering simulation in support of the Apollo guidance and control system is discussed in terms of design analysis and verification, certification of hardware in closed-loop operation, verification of hardware/software compatibility, and verification of both software and procedures for each mission. The magnitude, time, and cost of the engineering simulations are described with respect to hardware availability, NASA and contractor facilities (for verification of the command module, the lunar module, and the primary guidance, navigation, and control system), and scheduling and planning considerations. Recommendations are made regarding implementation of similar, large-scale simulations for future programs.

  3. Software verification plan for GCS. [guidance and control software

    NASA Technical Reports Server (NTRS)

    Dent, Leslie A.; Shagnea, Anita M.; Hayhurst, Kelly J.

    1990-01-01

    This verification plan is written as part of an experiment designed to study the fundamental characteristics of the software failure process. The experiment will be conducted using several implementations of software that were produced according to industry-standard guidelines, namely the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, for the development of flight software. This plan fulfills the DO-178A requirements for providing instructions on the testing of each implementation of software. The plan details the verification activities to be performed at each phase in the development process, contains a step by step description of the testing procedures, and discusses all of the tools used throughout the verification process.

  4. Magnetic cleanliness verification approach on tethered satellite

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Braghin, Massimo; Grande, Maurizio

    1990-01-01

    Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.

  5. Recent progress in econophysics: Chaos, leverage, and business cycles as revealed by agent-based modeling and human experiments

    NASA Astrophysics Data System (ADS)

    Xin, Chen; Huang, Ji-Ping

    2017-12-01

    Agent-based modeling and controlled human experiments serve as two fundamental research methods in the field of econophysics. Agent-based modeling has been in development for over 20 years, but how to design virtual agents with high levels of human-like "intelligence" remains a challenge. On the other hand, experimental econophysics is an emerging field; however, there is a lack of experience and paradigms related to the field. Here, we review some of the most recent research results obtained through the use of these two methods concerning financial problems such as chaos, leverage, and business cycles. We also review the principles behind assessments of agents' intelligence levels, and some relevant designs for human experiments. The main theme of this review is to show that by combining theory, agent-based modeling, and controlled human experiments, one can obtain more reliable and credible results on account of better verification of theory; in this way, a wider range of economic and financial problems and phenomena can be studied.
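
    As a minimal illustration of what agent-based modeling means in this context (not any specific model from the review), the Python sketch below lets heterogeneous trend-following agents generate a price series from their aggregate order imbalance.

    ```python
    # Toy agent-based market sketch, for illustration only: each agent buys or sells based
    # on a noisy private signal plus the last return, and price moves with net order imbalance.
    import numpy as np

    rng = np.random.default_rng(3)
    n_agents, n_steps = 200, 500
    trend_sensitivity = rng.uniform(-1.0, 1.0, size=n_agents)  # heterogeneous chartists
    prices = [100.0]
    last_return = 0.0
    for _ in range(n_steps):
        signal = trend_sensitivity * last_return + rng.normal(0.0, 0.5, size=n_agents)
        orders = np.sign(signal)                   # +1 buy, -1 sell
        imbalance = orders.sum() / n_agents
        new_price = prices[-1] * np.exp(0.01 * imbalance)
        last_return = np.log(new_price / prices[-1])
        prices.append(new_price)

    print(min(prices), max(prices))
    ```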

  6. Striving to be known by significant others: automatic activation of self-verification goals in relationship contexts.

    PubMed

    Kraus, Michael W; Chen, Serena

    2009-07-01

    Extending research on the automatic activation of goals associated with significant others, the authors hypothesized that self-verification goals typically pursued with significant others are automatically elicited when a significant-other representation is activated. Supporting this hypothesis, the activation of a significant-other representation through priming (Experiments 1 and 3) or through a transference encounter (Experiment 2) led participants to seek feedback that verifies their preexisting self-views. Specifically, significant-other primed participants desired self-verifying feedback, in general (Experiment 1), from an upcoming interaction partner (Experiment 2), and relative to acquaintance-primed participants and favorable feedback (Experiment 3). Finally, self-verification goals were activated, especially for relational self-views deemed high in importance to participants' self-concepts (Experiment 2) and held with high certainty (Experiment 3). Implications for research on self-evaluative goals, the relational self, and the automatic goal activation literature are discussed, as are consequences for close relationships. (PsycINFO Database Record (c) 2009 APA, all rights reserved).

  7. Synesthesia affects verification of simple arithmetic equations.

    PubMed

    Ghirardelli, Thomas G; Mills, Carol Bergfeld; Zilioli, Monica K C; Bailey, Leah P; Kretschmar, Paige K

    2010-01-01

    To investigate the effects of color-digit synesthesia on numerical representation, we presented a synesthete (referred to here as SE) and controls with mathematical equations for verification. In Experiment 1, SE verified addition equations made up of digits that either matched or mismatched her color-digit photisms or were in black. In Experiment 2A, the addends were presented in the different color conditions and the solution was presented in black, whereas in Experiment 2B the addends were presented in black and the solutions were presented in the different color conditions. In Experiment 3, multiplication and division equations were presented in the same color conditions as in Experiment 1. SE responded significantly faster to equations that matched her photisms than to those that did not; controls did not show this effect. These results suggest that photisms influence the processing of digits in arithmetic verification, replicating and extending previous findings.

  8. EM&V for Energy Efficiency Policies and Initiatives

    EPA Pesticide Factsheets

    Learn how representatives of jurisdictions, companies, and other entities can use evaluation, measurement, and verification (EM&V) in demand-side energy efficiency (EE) investments to achieve intended environmental, energy, and economic goals.

  9. Research on key technology of the verification system of steel rule based on vision measurement

    NASA Astrophysics Data System (ADS)

    Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun

    2018-01-01

    The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, suffers from low precision and low efficiency. A machine vision based verification system for steel rules is designed with reference to JJG 1-1999, Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method for the pixel equivalent and decontaminates the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measurement results strongly demonstrate that these methods not only meet the precision required by the verification regulation, but also improve the reliability and efficiency of the verification system.
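
    The "pixel equivalent" mentioned above is the physical length represented by one image pixel. The Python sketch below shows the basic conversion with hypothetical numbers; the paper's calibration method is more elaborate than this single-reference version.

    ```python
    # Minimal sketch of pixel-equivalent calibration for vision-based length measurement.
    # Numbers are hypothetical; a real system would average over many reference features.

    def pixel_equivalent(known_length_mm, measured_pixels):
        """Millimetres represented by one pixel, from a reference feature of known length."""
        return known_length_mm / measured_pixels

    k = pixel_equivalent(known_length_mm=10.0, measured_pixels=1873.2)  # e.g. a 10 mm gauge block
    line_spacing_px = 187.6                                             # detected graduation spacing
    print(f"pixel equivalent = {k:.6f} mm/px, spacing = {line_spacing_px * k:.4f} mm")
    ```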

  10. Application verification research of cloud computing technology in the field of real time aerospace experiment

    NASA Astrophysics Data System (ADS)

    Wan, Junwei; Chen, Hongyan; Zhao, Jing

    2017-08-01

    To meet the real-time, reliability, and safety requirements of aerospace experiments, a single-center cloud computing application verification platform was constructed. At the IaaS level, the feasibility of applying cloud computing technology to the field of aerospace experiments was tested and verified. Based on the analysis of the test results, a preliminary conclusion is reached: the cloud computing platform can be applied to compute-intensive aerospace experiment workloads. For I/O-intensive workloads, the traditional physical machine is recommended.

  11. JPL control/structure interaction test bed real-time control computer architecture

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1989-01-01

    The Control/Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development and verification of new design concepts - such as active structure - and new tools - such as combined structure and control optimization algorithm - and their verification in ground and possibly flight test. A focus mission spacecraft was designed based upon a space interferometer and is the basis for design of the ground test article. The ground test bed objectives include verification of the spacecraft design concepts, the active structure elements and certain design tools such as the new combined structures and controls optimization tool. In anticipation of CSI technology flight experiments, the test bed control electronics must emulate the computation capacity and control architectures of space qualifiable systems as well as the command and control networks that will be used to connect investigators with the flight experiment hardware. The Test Bed facility electronics were functionally partitioned into three units: a laboratory data acquisition system for structural parameter identification and performance verification; an experiment supervisory computer to oversee the experiment, monitor the environmental parameters and perform data logging; and a multilevel real-time control computing system. The design of the Test Bed electronics is presented along with hardware and software component descriptions. The system should break new ground in experimental control electronics and is of interest to anyone working in the verification of control concepts for large structures.

  12. Wiltech Component Cleaning and Refurbishment Facility CFC Elimination Plan at NASA Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Williamson, Steve; Aman, Bob; Aurigema, Andrew; Melendez, Orlando

    1999-01-01

    The Wiltech Component Cleaning & Refurbishment Facility (WT-CCRF) at NASA Kennedy Space Center performs precision cleaning on approximately 200,000 metallic and non-metallic components every year. WT-CCRF has developed a CFC elimination plan consisting of aqueous cleaning and verification and an economical dual-solvent strategy as the alternative solvent solution. Aqueous verification methodologies were implemented two years ago on a variety of Ground Support Equipment (GSE) components and sampling equipment. Today, 50% of the current workload is verified using aqueous methods and 90% of the total workload is degreased aqueously using Zonyl and Brulin surfactants in ultrasonic baths. An additional estimated 20% solvent savings could be achieved if the proposed expanded use of aqueous methods is approved. Aqueous cleaning has been shown to be effective, environmentally friendly, and economical (i.e., cost of materials, equipment, facilities, and labor).

  13. Psychiatric Residents' Attitudes toward and Experiences with the Clinical-Skills Verification Process: A Pilot Study on U.S. and International Medical Graduates

    ERIC Educational Resources Information Center

    Rao, Nyapati R.; Kodali, Rahul; Mian, Ayesha; Ramtekkar, Ujjwal; Kamarajan, Chella; Jibson, Michael D.

    2012-01-01

    Objective: The authors report on a pilot study of the experiences and perceptions of foreign international medical graduate (F-IMG), United States international medical graduate (US-IMG), and United States medical graduate (USMG) psychiatric residents with the newly mandated Clinical Skills Verification (CSV) process. The goal was to identify and…

  14. Design Authority in the Test Programme Definition: The Alenia Spazio Experience

    NASA Astrophysics Data System (ADS)

    Messidoro, P.; Sacchi, E.; Beruto, E.; Fleming, P.; Marucchi Chierro, P.-P.

    2004-08-01

    In addition, since the verification and test programme is a significant part of the spacecraft development life cycle in terms of cost and time, the discussion very often has the objective of optimizing the verification campaign by deleting or limiting some testing activities. The increased market pressure to reduce project schedule and cost is driving a dialectic process inside the project teams, involving program management and design authorities, in order to optimize the verification and testing programme. The paper introduces the Alenia Spazio experience in this context, drawn from real project life on different products and missions (science, TLC, EO, manned, transportation, military, commercial, recurrent, and one-of-a-kind). Usually the applicable verification and testing standards (e.g. ECSS-E-10 Part 2 "Verification" and ECSS-E-10 Part 3 "Testing" [1]) are tailored to the specific project on the basis of its particular mission constraints. The Model Philosophy and the associated verification and test programme are defined following an iterative process which suitably combines several aspects, including test requirements and facilities (Fig. 1, from ECSS-E-10: model philosophy, verification and test programme definition). The cases considered are mainly oriented to thermal and mechanical verification, where the benefits of possible test programme optimizations are more significant. For thermal qualification and acceptance testing (i.e. Thermal Balance and Thermal Vacuum), the lessons learned from the development of several satellites are presented together with the corresponding recommended approaches. In particular, the cases are indicated in which a proper Thermal Balance Test is mandatory, and others, in the presence of a more recurrent design, where qualification by analysis could be envisaged. The importance of a proper Thermal Vacuum exposure for workmanship verification is also highlighted. Similar considerations are summarized for the mechanical testing, with particular emphasis on the importance of Modal Survey, Static and Sine Vibration Tests in the qualification stage, in combination with the effectiveness of the Vibro-Acoustic Test in acceptance. The apparent relative importance of the Sine Vibration Test for workmanship verification in specific circumstances is also highlighted. The verification of the project requirements is planned through a combination of suitable verification methods (in particular Analysis and Test) at the different verification levels (from System down to Equipment), in the proper verification stages (e.g. Qualification and Acceptance).

  15. Spectral Analysis of Forecast Error Investigated with an Observing System Simulation Experiment

    NASA Technical Reports Server (NTRS)

    Prive, N. C.; Errico, Ronald M.

    2015-01-01

    The spectra of analysis and forecast error are examined using the observing system simulation experiment (OSSE) framework developed at the National Aeronautics and Space Administration Global Modeling and Assimilation Office (NASA GMAO). A global numerical weather prediction model, the Goddard Earth Observing System version 5 (GEOS-5) with Gridpoint Statistical Interpolation (GSI) data assimilation, is cycled for two months with once-daily forecasts to 336 hours to generate a control case. Verification of forecast errors using the Nature Run as truth is compared with verification of forecast errors using self-analysis; significant underestimation of forecast errors is seen using self-analysis verification for up to 48 hours. Likewise, self-analysis verification significantly overestimates the error growth rates of the early forecast, as well as mischaracterizing the spatial scales at which the strongest growth occurs. The Nature Run-verified error variances exhibit a complicated progression of growth, particularly for low-wavenumber errors. In a second experiment, cycling of the model and data assimilation over the same period is repeated, but using synthetic observations with different explicitly added observation errors having the same error variances as the control experiment, thus creating a different realization of the control. The forecast errors of the two experiments become more correlated during the early forecast period, with correlations increasing for up to 72 hours before beginning to decrease.
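
    A simple way to see why self-analysis verification understates early forecast error is that the analysis shares part of the forecast's error. The Python sketch below illustrates this with synthetic data; the numbers and the degree of correlation are invented, not taken from the OSSE.

    ```python
    # Sketch of the verification comparison above: error measured against the known truth
    # versus against the experiment's own analysis. Synthetic scalar fields only.
    import numpy as np

    rng = np.random.default_rng(4)
    truth = rng.normal(size=100_000)
    analysis_error = rng.normal(0.0, 0.5, size=truth.shape)
    analysis = truth + analysis_error
    # Early-range forecast error is partly correlated with the analysis error
    forecast = truth + 0.8 * analysis_error + rng.normal(0.0, 0.5, size=truth.shape)

    rmse_vs_truth = np.sqrt(np.mean((forecast - truth) ** 2))
    rmse_vs_self_analysis = np.sqrt(np.mean((forecast - analysis) ** 2))
    print(rmse_vs_truth, rmse_vs_self_analysis)   # the self-analysis number is smaller
    ```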

  16. Monte Carlo verification of radiotherapy treatments with CloudMC.

    PubMed

    Miras, Hector; Jiménez, Rubén; Perales, Álvaro; Terrón, José Antonio; Bertolet, Alejandro; Ortiz, Antonio; Macías, José

    2018-06-27

    A new implementation has been made on CloudMC, a cloud-based platform presented in a previous work, in order to provide services for radiotherapy treatment verification by means of Monte Carlo in a fast, easy, and economical way. A description of the architecture of the application and of the new developments implemented is presented, together with the results of the tests carried out to validate its performance. CloudMC has been developed on the Microsoft Azure cloud. It is based on a map/reduce implementation for distributing Monte Carlo calculations over a dynamic cluster of virtual machines in order to reduce calculation time. CloudMC has been updated with new methods to read and process the information related to radiotherapy treatment verification: CT image set, treatment plan, structures, and dose distribution files in DICOM format. Tests were designed in order to determine, for the different tasks, the most suitable type of virtual machine from those available in Azure. Finally, the performance of Monte Carlo verification in CloudMC is studied through three real cases that involve different treatment techniques, linac models, and Monte Carlo codes. Considering computational and economic factors, D1_v2 and G1 virtual machines were selected as the default type for the Worker Roles and the Reducer Role, respectively. Calculation times of up to 33 min and costs of 16 € were achieved for the verification cases presented when a statistical uncertainty below 2% (2σ) was required. The costs were reduced to 3-6 € when the uncertainty requirement was relaxed to 4%. Advantages like high computational power, scalability, easy access, and a pay-per-usage model make Monte Carlo cloud-based solutions, like the one presented in this work, an important step forward in solving the long-standing problem of truly introducing Monte Carlo algorithms into the daily routine of the radiotherapy planning process.
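
    Dose verifications of this kind are typically scored with the gamma index; the Python example below is a brute-force 1-D version of the 3%/3 mm criterion on a synthetic dose profile, intended only to show the formula, not CloudMC's implementation.

    ```python
    # Brute-force 1-D gamma-index sketch (3% dose / 3 mm distance-to-agreement).
    # Real systems work on 2-D/3-D dose grids; this shows only the core criterion.
    import numpy as np

    def gamma_pass_rate(ref, eval_dose, spacing_mm, dose_tol=0.03, dist_tol_mm=3.0):
        """Fraction of reference points with gamma <= 1 (global dose normalisation)."""
        x = np.arange(len(ref)) * spacing_mm
        d_norm = dose_tol * ref.max()
        passed = []
        for xi, dr in zip(x, ref):
            gamma_sq = ((x - xi) / dist_tol_mm) ** 2 + ((eval_dose - dr) / d_norm) ** 2
            passed.append(np.sqrt(gamma_sq.min()) <= 1.0)
        return float(np.mean(passed))

    ref = np.exp(-((np.arange(100) - 50.0) / 15.0) ** 2)        # synthetic reference profile
    ev = 1.02 * np.exp(-((np.arange(100) - 51.0) / 15.0) ** 2)  # slightly shifted and scaled
    print(gamma_pass_rate(ref, ev, spacing_mm=1.0))
    ```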

  17. An investigation of the effects of relevant samples and a comparison of verification versus discovery based lab design

    NASA Astrophysics Data System (ADS)

    Rieben, James C., Jr.

    This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity of solutions experiment and an upper level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect of relevant (or "real world") samples on student learning and a verification-based lab design versus a discovery-based lab design. With the cellulose regeneration experiment, the effect of a discovery-based lab design vs. a verification-based lab design was the sole focus. Evaluation surveys consisting of six questions were used at three different times to assess student knowledge of experimental concepts. In the general chemistry laboratory portion of this study, four experimental variants were employed to investigate the effect of relevance and lab design on student learning. These variants consisted of a traditional (or verification) lab design, a traditional lab design using "real world" samples, a new lab design employing real world samples/situations using unknown samples, and the new lab design using real world samples/situations that were known to the student. Data used in this analysis were collected during the Fall 08, Winter 09, and Fall 09 terms. For the second part of this study a cellulose regeneration experiment was employed to investigate the effects of lab design. A demonstration creating regenerated cellulose "rayon" was modified and converted to an efficient and low-waste experiment. In the first variant students tested their products and verified a list of physical properties. In the second variant, students filled in a blank physical property chart with their own experimental results for the physical properties. Results from the conductivity experiment show significant student learning of the effects of concentration on conductivity and how to use conductivity to differentiate solution types with the use of real world samples. In the organic chemistry experiment, results suggest that the discovery-based design improved student retention of the chain length differentiation by physical properties relative to the verification-based design.

  18. SU-F-T-269: Preliminary Experience of Kuwait Cancer Control Center (KCCC) On IMRT Treatment Planning and Pre-Treatment Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sethuraman, TKR; Sherif, M; Subramanian, N

    Purpose: The complexity of IMRT delivery requires pre-treatment quality assurance and plan verification. KCCC has implemented IMRT clinically in a few sites and will extend it to all sites. Recently, our Varian linear accelerator and Eclipse planning system were upgraded from Millennium 80 to 120 Multileaf Collimator (MLC) and from v8.6 to v11.0, respectively. Our preliminary experience with pre-treatment quality assurance verification is discussed. Methods: Eight breast, three prostate, and one hypopharynx cancer patients were planned with step-and-shoot IMRT. All breast cases were planned before the upgrade, with 60% of cases treated. The ICRU 83 recommendations were followed for the dose prescription and constraints to OAR for all cases. Point dose measurement was done with a CIRS cylindrical phantom and a PTW 0.125 cc ionization chamber. Measured dose was compared with calculated dose at the point of measurement. A MapCHECK diode array phantom was used for the plan verification. Planned and measured doses were compared by applying a gamma index of 3% (dose difference) / 3 mm DTA (distance to agreement). For all cases, a plan is considered successful if more than 95% of the tested diodes pass the gamma test. A prostate case was chosen to compare the plan verification before and after the upgrade. Results: Point dose measurement results were in agreement with the calculated doses. The maximum deviation observed was 2.3%. The passing rate of the average gamma index was higher than 97% for the plan verification of all cases. A similar result was observed for plan verification of the chosen prostate case before and after the upgrade. Conclusion: Our preliminary experience with the obtained results validates the accuracy of our QA process and provides confidence to extend IMRT to all sites in Kuwait.

  19. The Yes-No Question Answering System and Statement Verification.

    ERIC Educational Resources Information Center

    Akiyama, M. Michael; And Others

    1979-01-01

    Two experiments investigated the relationship of verification to the answering of yes-no questions. Subjects verified simple statements or answered simple questions. Various proposals concerning the relative difficulty of answering questions and verifying statements were considered, and a model was proposed. (SW)

  20. Is identity per se irrelevant? A contrarian view of self-verification effects.

    PubMed

    Gregg, Aiden P

    2009-01-01

    Self-verification theory (SVT) posits that people who hold negative self-views, such as depressive patients, ironically strive to verify that these self-views are correct, by actively seeking out critical feedback or interaction partners who evaluate them unfavorably. Such verification strivings are allegedly directed towards maximizing subjective perceptions of prediction and control. Nonetheless, verification strivings are also alleged to stabilize maladaptive self-perceptions, thereby hindering therapeutic recovery. Despite the widespread acceptance of SVT, I contend that the evidence for it is weak and circumstantial. In particular, I contend that most or all major findings cited in support of SVT can be more economically explained in terms of raison oblige theory (ROT). ROT posits that people with negative self-views solicit critical feedback, not because they want it, but because their self-view inclines them to regard it as probative, a necessary condition for considering it worth obtaining. Relevant findings are reviewed and reinterpreted with an emphasis on depression, and some new empirical data are reported. (c) 2008 Wiley-Liss, Inc.

  1. Using computer graphics to enhance astronaut and systems safety

    NASA Technical Reports Server (NTRS)

    Brown, J. W.

    1985-01-01

    Computer graphics is being employed at the NASA Johnson Space Center as a tool to perform rapid, efficient and economical analyses for man-machine integration, flight operations development and systems engineering. The Operator Station Design System (OSDS), a computer-based facility featuring a highly flexible and versatile interactive software package, PLAID, is described. This unique evaluation tool, with its expanding data base of Space Shuttle elements, various payloads, experiments, crew equipment and man models, supports a multitude of technical evaluations, including spacecraft and workstation layout, definition of astronaut visual access, flight techniques development, cargo integration and crew training. As OSDS is being applied to the Space Shuttle, Orbiter payloads (including the European Space Agency's Spacelab) and future space vehicles and stations, astronaut and systems safety are being enhanced. Typical OSDS examples are presented. By performing physical and operational evaluations during early conceptual phases, supporting systems verification for flight readiness, and applying its capabilities to real-time mission support, the OSDS provides the wherewithal to satisfy a growing need of the current and future space programs for efficient, economical analyses.

  2. Knowledge Based Systems (KBS) Verification, Validation, Evaluation, and Testing (VVE&T) Bibliography: Topical Categorization

    DTIC Science & Technology

    2003-03-01

    Different?," Jour. of Experimental & Theoretical Artificial Intelligence, Special Issue on Al for Systems Validation and Verification, 12(4), 2000, pp...Hamilton, D., " Experiences in Improving the State of Practice in Verification and Validation of Knowledge-Based Systems," Workshop Notes of the AAAI...Unsuspected Power of the Standard Turing Test," Jour. of Experimental & Theoretical Artificial Intelligence., 12, 2000, pp3 3 1-3 4 0 . [30] Gaschnig

  3. Remote age verification to prevent underage alcohol sales. First results from Dutch liquor stores and the economic viability of national adoption.

    PubMed

    van Hoof, Joris J; van Velthoven, Ben C J

    2015-04-01

    Alcohol consumption among minors is a popular topic in the public health debate, including in the Netherlands. Compliance with the legal age limits for selling alcohol proves to be rather low. Some Dutch liquor stores (outlets with an exclusive license to sell off-premise drinks with 15% alcohol or more) have recently adopted a remote age verification system. This paper discusses the first results of the use of the system. We use data from 67 liquor stores that adopted Ageviewers, a remote age verification system, in 2011. A remote validator judges the customer's age using camera footage and asks for an ID if there is any doubt. The system then sends a signal to the cash register, which approves or rejects the alcohol purchase. Of the 367,346 purchase attempts in the database, 8,374 were rejected or aborted for age-related reasons. This figure amounts to an average of 1.12 underage alcohol purchase attempts per sales day in each participating liquor store. Scaling up to a national level, the figures suggest at least 1 million underage alcohol purchase attempts per year in Dutch liquor stores. Underage alcohol purchases can be prevented by the nationwide adoption of remote age verification. However, given the lax enforcement of the age limits by the government, adopting such a system on a voluntary basis is generally not in the economic interest of the liquor stores. Obligatory installation of the system in off-premise alcohol outlets may pass a social cost-benefit test if certain conditions are fulfilled. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Definition of ground test for Large Space Structure (LSS) control verification

    NASA Technical Reports Server (NTRS)

    Waites, H. B.; Doane, G. B., III; Tollison, D. K.

    1984-01-01

    An overview for the definition of a ground test for the verification of Large Space Structure (LSS) control is given. The definition contains information on the description of the LSS ground verification experiment, the project management scheme, the design, development, fabrication and checkout of the subsystems, the systems engineering and integration, the hardware subsystems, the software, and a summary which includes future LSS ground test plans. Upon completion of these items, NASA/Marshall Space Flight Center will have an LSS ground test facility which will provide sufficient data on dynamics and control verification of LSS so that LSS flight system operations can be reasonably ensured.

  5. Space station data management system - A common GSE test interface for systems testing and verification

    NASA Technical Reports Server (NTRS)

    Martinez, Pedro A.; Dunn, Kevin W.

    1987-01-01

    This paper examines the fundamental problems and goals associated with test, verification, and flight-certification of man-rated distributed data systems. First, a summary of the characteristics of modern computer systems that affect the testing process is provided. Then, verification requirements are expressed in terms of an overall test philosophy for distributed computer systems. This test philosophy stems from previous experience that was gained with centralized systems (Apollo and the Space Shuttle), and deals directly with the new problems that verification of distributed systems may present. Finally, a description of potential hardware and software tools to help solve these problems is provided.

  6. Mitigating errors caused by interruptions during medication verification and administration: interventions in a simulated ambulatory chemotherapy setting.

    PubMed

    Prakash, Varuna; Koczmara, Christine; Savage, Pamela; Trip, Katherine; Stewart, Janice; McCurdie, Tara; Cafazzo, Joseph A; Trbovich, Patricia

    2014-11-01

    Nurses are frequently interrupted during medication verification and administration; however, few interventions exist to mitigate resulting errors, and the impact of these interventions on medication safety is poorly understood. The study objectives were to (A) assess the effects of interruptions on medication verification and administration errors, and (B) design and test the effectiveness of targeted interventions at reducing these errors. The study focused on medication verification and administration in an ambulatory chemotherapy setting. A simulation laboratory experiment was conducted to determine interruption-related error rates during specific medication verification and administration tasks. Interventions to reduce these errors were developed through a participatory design process, and their error reduction effectiveness was assessed through a postintervention experiment. Significantly more nurses committed medication errors when interrupted than when uninterrupted. With use of interventions when interrupted, significantly fewer nurses made errors in verifying medication volumes contained in syringes (16/18; 89% preintervention error rate vs 11/19; 58% postintervention error rate; p=0.038; Fisher's exact test) and programmed in ambulatory pumps (17/18; 94% preintervention vs 11/19; 58% postintervention; p=0.012). The rate of error commission significantly decreased with use of interventions when interrupted during intravenous push (16/18; 89% preintervention vs 6/19; 32% postintervention; p=0.017) and pump programming (7/18; 39% preintervention vs 1/19; 5% postintervention; p=0.017). No statistically significant differences were observed for other medication verification tasks. Interruptions can lead to medication verification and administration errors. Interventions were highly effective at reducing unanticipated errors of commission in medication administration tasks, but showed mixed effectiveness at reducing predictable errors of detection in medication verification tasks. These findings can be generalised and adapted to mitigate interruption-related errors in other settings where medication verification and administration are required. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
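
    The reported p-values come from Fisher's exact test on 2x2 tables built from the counts quoted above. The Python sketch below reproduces the syringe-volume comparison; a one-sided test matches the reported p = 0.038, which suggests a one-sided alternative was used, though the paper itself should be consulted to confirm.

    ```python
    # Fisher's exact test on the syringe-volume verification counts quoted above:
    # 16/18 nurses erred pre-intervention vs 11/19 post-intervention.
    from scipy.stats import fisher_exact

    table = [[16, 18 - 16],   # pre-intervention: errors, no errors
             [11, 19 - 11]]   # post-intervention: errors, no errors
    odds_ratio, p_one_sided = fisher_exact(table, alternative='greater')
    print(odds_ratio, p_one_sided)   # p is approximately 0.038, as reported above
    ```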

  7. Mitigating errors caused by interruptions during medication verification and administration: interventions in a simulated ambulatory chemotherapy setting

    PubMed Central

    Prakash, Varuna; Koczmara, Christine; Savage, Pamela; Trip, Katherine; Stewart, Janice; McCurdie, Tara; Cafazzo, Joseph A; Trbovich, Patricia

    2014-01-01

    Background: Nurses are frequently interrupted during medication verification and administration; however, few interventions exist to mitigate resulting errors, and the impact of these interventions on medication safety is poorly understood. Objective: The study objectives were to (A) assess the effects of interruptions on medication verification and administration errors, and (B) design and test the effectiveness of targeted interventions at reducing these errors. Methods: The study focused on medication verification and administration in an ambulatory chemotherapy setting. A simulation laboratory experiment was conducted to determine interruption-related error rates during specific medication verification and administration tasks. Interventions to reduce these errors were developed through a participatory design process, and their error reduction effectiveness was assessed through a postintervention experiment. Results: Significantly more nurses committed medication errors when interrupted than when uninterrupted. With use of interventions when interrupted, significantly fewer nurses made errors in verifying medication volumes contained in syringes (16/18; 89% preintervention error rate vs 11/19; 58% postintervention error rate; p=0.038; Fisher's exact test) and programmed in ambulatory pumps (17/18; 94% preintervention vs 11/19; 58% postintervention; p=0.012). The rate of error commission significantly decreased with use of interventions when interrupted during intravenous push (16/18; 89% preintervention vs 6/19; 32% postintervention; p=0.017) and pump programming (7/18; 39% preintervention vs 1/19; 5% postintervention; p=0.017). No statistically significant differences were observed for other medication verification tasks. Conclusions: Interruptions can lead to medication verification and administration errors. Interventions were highly effective at reducing unanticipated errors of commission in medication administration tasks, but showed mixed effectiveness at reducing predictable errors of detection in medication verification tasks. These findings can be generalised and adapted to mitigate interruption-related errors in other settings where medication verification and administration are required. PMID:24906806

  8. A study of compositional verification based IMA integration method

    NASA Astrophysics Data System (ADS)

    Huang, Hui; Zhang, Guoquan; Xu, Wanmeng

    2018-03-01

    The rapid development of avionics is driving the application of integrated modular avionics (IMA) systems. While IMA improves avionics system integration, it also increases the complexity of system test, so the IMA test method needs to be simplified. An IMA system provides a module platform that runs multiple applications and shares processing resources. Compared with a federated avionics system, an IMA system makes failures harder to isolate; the critical problem for IMA system verification is therefore how to test resources shared by multiple applications. For a simple avionics system, traditional test methods can readily exercise the whole system, but for a complex, highly integrated avionics system, complete testing is impractical. This paper therefore proposes applying compositional-verification theory to IMA system test, reducing the number of test processes and improving efficiency, and consequently economizing the cost of IMA system integration.

  9. Towards the development of a rapid, portable, surface enhanced Raman spectroscopy based cleaning verification system for the drug nelarabine.

    PubMed

    Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey

    2010-09-01

    Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification. This involves the analysis of residues on spoiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominantly employed analytical technique. The aim of this study was to develop a portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS). SERS was conducted using a portable Raman spectrometer and a commercially available SERS substrate to develop a rapid and portable cleaning verification system for nelarabine. Samples of standard solutions and swab extracts were deposited onto the SERS active surfaces, allowed to dry and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable. It is possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS active surface. Understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.

  10. Clinical Skills Verification, Formative Feedback, and Psychiatry Residency Trainees

    ERIC Educational Resources Information Center

    Dalack, Gregory W.; Jibson, Michael D.

    2012-01-01

    Objective: The authors describe the implementation of Clinical Skills Verification (CSV) in their program as an in-training assessment intended primarily to provide formative feedback to trainees, strengthen the supervisory experience, and identify the need for remediation of interviewing skills, and secondarily to demonstrate resident competence…

  11. Statement Verification: A Stochastic Model of Judgment and Response.

    ERIC Educational Resources Information Center

    Wallsten, Thomas S.; Gonzalez-Vallejo, Claudia

    1994-01-01

    A stochastic judgment model (SJM) is presented as a framework for addressing issues in statement verification and probability judgment. Results of 5 experiments with 264 undergraduates support the validity of the model and provide new information that is interpreted in terms of the SJM. (SLD)

  12. Blood collection tubes as medical devices: The potential to affect assays and proposed verification and validation processes for the clinical laboratory.

    PubMed

    Bowen, Raffick A R; Adcock, Dorothy M

    2016-12-01

    Blood collection tubes (BCTs) are an often under-recognized variable in the preanalytical phase of clinical laboratory testing. Unfortunately, even the best-designed and manufactured BCTs may not work well in all clinical settings. Clinical laboratories, in collaboration with healthcare providers, should carefully evaluate BCTs prior to putting them into clinical use to determine their limitations and ensure that patients are not placed at risk because of inaccuracies due to poor tube performance. Selection of the best BCTs can be achieved through comparing advertising materials, reviewing the literature, observing the device at a scientific meeting, receiving a demonstration, evaluating the device under simulated conditions, or testing the device with patient samples. Although many publications have discussed method validations, few detail how to perform experiments for tube verification and validation. This article highlights the most common and impactful variables related to BCTs and discusses the validation studies that a typical clinical laboratory should perform when selecting BCTs. We also present a brief review of how in vitro diagnostic devices, particularly BCTs, are regulated in the United States, the European Union, and Canada. The verification and validation of BCTs will help to avoid the economic and human costs associated with incorrect test results, including poor patient care, unnecessary testing, and delays in test results. We urge laboratorians, tube manufacturers, diagnostic companies, and other researchers to take all the necessary steps to protect against the adverse effects of BCT components and their additives on clinical assays. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  13. Cluster man/system design requirements and verification. [for Skylab program

    NASA Technical Reports Server (NTRS)

    Watters, H. H.

    1974-01-01

    Discussion of the procedures employed for determining the man/system requirements that guided Skylab design, and review of the techniques used for implementing the man/system design verification. The foremost lesson learned from the design need anticipation and design verification experience is the necessity to allow for human capabilities of in-flight maintenance and repair. It is now known that the entire program was salvaged by a series of unplanned maintenance and repair events which were implemented in spite of poor design provisions for maintenance.

  14. Speed and Accuracy in the Processing of False Statements About Semantic Information.

    ERIC Educational Resources Information Center

    Ratcliff, Roger

    1982-01-01

    A standard reaction time procedure and a response signal procedure were used on data from eight experiments on semantic verifications. Results suggest that simple models of the semantic verification task that assume a single yes/no dimension on which discrimination is made are not correct. (Author/PN)

  15. Formulating face verification with semidefinite programming.

    PubMed

    Yan, Shuicheng; Liu, Jianzhuang; Tang, Xiaoou; Huang, Thomas S

    2007-11-01

    This paper presents a unified solution to three unsolved problems existing in face verification with subspace learning techniques: selection of verification threshold, automatic determination of subspace dimension, and deducing feature fusing weights. In contrast to previous algorithms which search for the projection matrix directly, our new algorithm investigates a similarity metric matrix (SMM). With a certain verification threshold, this matrix is learned by a semidefinite programming approach, along with the constraints of the kindred pairs with similarity larger than the threshold, and inhomogeneous pairs with similarity smaller than the threshold. Then, the subspace dimension and the feature fusing weights are simultaneously inferred from the singular value decomposition of the derived SMM. In addition, the weighted and tensor extensions are proposed to further improve the algorithmic effectiveness and efficiency, respectively. Essentially, the verification is conducted within an affine subspace in this new algorithm and is, hence, called the affine subspace for verification (ASV). Extensive experiments show that the ASV can achieve encouraging face verification accuracy in comparison to other subspace algorithms, even without the need to explore any parameters.

  16. Guidelines for mission integration, a summary report

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Guidelines are presented for instrument/experiment developers concerning hardware design, flight verification, and operations and mission implementation requirements. Interface requirements between the STS and instruments/experiments are defined. Interface constraints and design guidelines are presented along with integrated payload requirements for Spacelab Missions 1, 2, and 3. Interim data are suggested for use during hardware development until more detailed information is developed when a complete mission and an integrated payload system are defined. Safety requirements, flight verification requirements, and operations procedures are defined.

  17. Alternative sample sizes for verification dose experiments and dose audits

    NASA Astrophysics Data System (ADS)

    Taylor, W. A.; Hansen, J. M.

    1999-01-01

    ISO 11137 (1995), "Sterilization of Health Care Products—Requirements for Validation and Routine Control—Radiation Sterilization", provides sampling plans for performing initial verification dose experiments and quarterly dose audits. Alternative sampling plans are presented which provide equivalent protection. These sampling plans can significantly reduce the cost of testing. These alternative sampling plans have been included in a draft ISO Technical Report (type 2). This paper examines the rationale behind the proposed alternative sampling plans. The protection provided by the current verification and audit sampling plans is first examined. Then methods for identifying equivalent plans are highlighted. Finally, methods for comparing the cost associated with the different plans are provided. This paper includes additional guidance for selecting between the original and alternative sampling plans not included in the technical report.
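
    The notion of "equivalent protection" underlying the alternative sampling plans can be made concrete by comparing operating characteristic curves, i.e. the probability that a plan passes as a function of the true positive (non-sterile) fraction. The sketch below is purely illustrative: the plan sizes and acceptance numbers are hypothetical and are not the plans defined in ISO 11137 or the draft technical report.

    ```python
    # Hedged sketch: compare the pass probability of two hypothetical
    # verification-dose sampling plans as a function of the true positive fraction p.
    from scipy.stats import binom

    def pass_probability(n_units, max_positives, p_positive):
        """P(number of positive sterility tests <= max_positives) when testing n_units."""
        return binom.cdf(max_positives, n_units, p_positive)

    for p in (0.01, 0.05, 0.10, 0.20):
        plan_a = pass_probability(100, 2, p)  # e.g. 100 units, accept on <= 2 positives
        plan_b = pass_probability(55, 1, p)   # e.g. 55 units, accept on <= 1 positive
        print(f"p = {p:.2f}: plan A pass prob = {plan_a:.3f}, plan B pass prob = {plan_b:.3f}")
    ```

    Two plans offer comparable protection when their pass probabilities stay close over the range of positive fractions of practical concern; cost can then be compared through the number of units consumed.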

  18. Overview of the TOPEX/Poseidon Platform Harvest Verification Experiment

    NASA Technical Reports Server (NTRS)

    Morris, Charles S.; DiNardo, Steven J.; Christensen, Edward J.

    1995-01-01

    An overview is given of the in situ measurement system installed on Texaco's Platform Harvest for verification of the sea level measurement from the TOPEX/Poseidon satellite. The prelaunch error budget suggested that the total root mean square (RMS) error due to measurements made at this verification site would be less than 4 cm. The actual error budget for the verification site is within these original specifications. However, evaluation of the sea level data from three measurement systems at the platform has resulted in unexpectedly large differences between the systems. Comparison of the sea level measurements from the different tide gauge systems has led to a better understanding of the problems of measuring sea level in relatively deep ocean. As of May 1994, the Platform Harvest verification site has successfully supported 60 TOPEX/Poseidon overflights.

  19. 14 CFR Sec. 1-5 - Records.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Records. Sec. 1-5 Section 1-5 Aeronautics and Space OFFICE OF THE SECRETARY, DEPARTMENT OF TRANSPORTATION (AVIATION PROCEEDINGS) ECONOMIC... information as will render certain the identification of all facts essential to a verification of the nature...

  20. ETV/ESTCP Demonstration Plan - Demonstration and Verification of a Turbine Power Generation System Utilizing Renewable Fuel: Landfill Gas

    EPA Science Inventory

    This Test and Quality Assurance Plan (TQAP) provides data quality objectives for the success factors that were validated during this demonstration, which include energy production, emissions and emission reductions compared to alternative systems, economics, and operability, including r...

  1. Multi-response optimization of Artemia hatching process using split-split-plot design based response surface methodology

    PubMed Central

    Arun, V. V.; Saharan, Neelam; Ramasubramanian, V.; Babitha Rani, A. M.; Salin, K. R.; Sontakke, Ravindra; Haridas, Harsha; Pazhayamadom, Deepak George

    2017-01-01

    A novel method, BBD-SSPD, is proposed by combining the Box-Behnken Design (BBD) and the Split-Split Plot Design (SSPD), ensuring a minimum number of experimental runs and leading to economical utilization in multi-factorial experiments. The brine shrimp Artemia was tested to study the combined effects of photoperiod, temperature and salinity, each with three levels, on the hatching percentage and hatching time of their cysts. The BBD was employed to select 13 treatment combinations out of the 27 possible combinations, which were grouped in an SSPD arrangement. Multiple responses were optimized simultaneously using Derringer's desirability function. Photoperiod and temperature, as well as the temperature-salinity interaction, were found to significantly affect the hatching percentage of Artemia, while the hatching time was significantly influenced by photoperiod, temperature, and their interaction. The optimum conditions were a 23 h photoperiod, 29 °C temperature and 28 ppt salinity, resulting in 96.8% hatching in 18.94 h. To verify the results obtained from the BBD-SSPD experiment, the experiment was repeated with the same setup, and the results of the verification experiment were similar to those of the original experiment. It is expected that this method would be suitable for optimizing the hatching process of animal eggs. PMID:28091611
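
    Derringer's desirability approach, used above to optimize hatching percentage and hatching time simultaneously, maps each response onto a 0-1 desirability and combines the individual desirabilities with a geometric mean. A minimal sketch follows; the response bounds are assumed purely for illustration and are not taken from the study.

    ```python
    # Hedged sketch of Derringer's desirability: maximize hatching percentage,
    # minimize hatching time. Lower/upper bounds are illustrative assumptions.
    import numpy as np

    def desirability_maximize(y, low, high, weight=1.0):
        """0 below `low`, 1 above `high`, power-scaled in between."""
        return np.clip((y - low) / (high - low), 0.0, 1.0) ** weight

    def desirability_minimize(y, low, high, weight=1.0):
        """1 below `low`, 0 above `high`."""
        return np.clip((high - y) / (high - low), 0.0, 1.0) ** weight

    hatching_percent = 96.8  # response 1 from a candidate run
    hatching_time_h = 18.94  # response 2 from the same run

    d1 = desirability_maximize(hatching_percent, low=50.0, high=100.0)
    d2 = desirability_minimize(hatching_time_h, low=15.0, high=30.0)
    overall = (d1 * d2) ** 0.5  # geometric mean of the individual desirabilities
    print(f"d1 = {d1:.3f}, d2 = {d2:.3f}, overall desirability = {overall:.3f}")
    ```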

  2. 24 CFR 572.110 - Identifying and selecting eligible families for homeownership.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... otherwise qualified eligible families who have completed participation in one of the following economic self-sufficiency programs: Project Self-Sufficiency, Operation Bootstrap, Family Self-Sufficiency, JOBS, and any... for the disclosure and verification of social security numbers, as provided by part 5, subpart B, of...

  3. IMRT verification using a radiochromic/optical-CT dosimetry system

    NASA Astrophysics Data System (ADS)

    Oldham, Mark; Guo, Pengyi; Gluckman, Gary; Adamovics, John

    2006-12-01

    This work represents our first experiences relating to IMRT verification using a relatively new 3D dosimetry system consisting of a PRESAGE(TM) dosimeter (Heuris Pharma LLC) and an optical-CT scanning system (OCTOPUS(TM), MGS Inc). This work builds in a step-wise manner on prior work in our lab.

  4. Comments for A Conference on Verification in the 21st Century

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doyle, James E.

    2012-06-12

    The author offers 5 points for the discussion of Verification and Technology: (1) Experience with the implementation of arms limitation and arms reduction agreements confirms that technology alone has never been relied upon to provide effective verification. (2) The historical practice of verification of arms control treaties between Cold War rivals may constrain the cooperative and innovative use of technology for transparency, verification and confidence building in the future. (3) An area that has been identified by many, including the US State Department and NNSA, as being rich for exploration for potential uses of technology for transparency and verification is information and communications technology (ICT). This includes social media, crowd-sourcing, the internet of things, and the concept of societal verification, but there are issues. (4) On the issue of the extent to which verification technologies are keeping pace with the demands of future protocols and agreements, I think the more direct question is "are they effective in supporting the objectives of the treaty or agreement?" In this regard it is important to acknowledge that there is a verification grand challenge at our doorstep. That is, "how does one verify limitations on nuclear warheads in national stockpiles?" (5) Finally, while recognizing the daunting political and security challenges of such an approach, multilateral engagement and cooperation at the conceptual and technical levels provide benefits for addressing future verification challenges.

  5. Technical experiences of implementing a wireless tracking and facial biometric verification system for a clinical environment

    NASA Astrophysics Data System (ADS)

    Liu, Brent; Lee, Jasper; Documet, Jorge; Guo, Bing; King, Nelson; Huang, H. K.

    2006-03-01

    By implementing a tracking and verification system, clinical facilities can effectively monitor workflow and heighten information security in today's growing demand towards digital imaging informatics. This paper presents the technical design and implementation experiences encountered during the development of a Location Tracking and Verification System (LTVS) for a clinical environment. LTVS integrates facial biometrics with wireless tracking so that administrators can manage and monitor patient and staff through a web-based application. Implementation challenges fall into three main areas: 1) Development and Integration, 2) Calibration and Optimization of Wi-Fi Tracking System, and 3) Clinical Implementation. An initial prototype LTVS has been implemented within USC's Healthcare Consultation Center II Outpatient Facility, which currently has a fully digital imaging department environment with integrated HIS/RIS/PACS/VR (Voice Recognition).

  6. Value of the GENS Forecast Ensemble as a Tool for Adaptation of Economic Activity to Climate Change

    NASA Astrophysics Data System (ADS)

    Hancock, L. O.; Alpert, J. C.; Kordzakhia, M.

    2009-12-01

    In an atmosphere of uncertainty as to the magnitude and direction of climate change in upcoming decades, one adaptation mechanism has emerged with consensus support: the upgrade and dissemination of spatially-resolved, accurate forecasts tailored to the needs of users. Forecasting can facilitate the changeover from dependence on climatology that is increasingly out of date. The best forecasters are local, but local forecasters face great constraints in some countries. Indeed, it is no coincidence that some areas subject to great weather variability and strong processes of climate change are economically vulnerable: mountainous regions, for example, where heavy and erratic flooding can destroy the value built up by households over years. It follows that those best placed to benefit from forecasting upgrades may not be those who have invested in the greatest capacity to date. More-flexible use of the global forecasts may contribute to adaptation. NOAA anticipated several years ago that their forecasts could be used in new ways in the future, and accordingly prepared sockets for easy access to their archives. These could be used to empower various national and regional capacities. Verification to identify practical lead times for the economically important variables is a needed first step. This presentation presents the verification that our team has undertaken, a pilot effort in which we considered variables of interest to economic actors in several lower income countries, cf. shepherds in a remote area of Central Asia, and verified the ensemble forecasts of those variables.
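
    Verification of ensemble forecasts against observations, as described in this record, typically begins with simple accuracy scores computed per lead time. The sketch below uses synthetic data; the array shapes, variable names and scores are illustrative assumptions rather than the team's actual verification pipeline.

    ```python
    # Hedged sketch: bias and RMSE of an ensemble-mean forecast per lead time.
    # `forecasts` has shape (n_cases, n_members, n_lead_times); all data are synthetic.
    import numpy as np

    rng = np.random.default_rng(0)
    n_cases, n_members, n_leads = 200, 20, 10
    observations = rng.normal(0.0, 1.0, size=(n_cases, n_leads))
    forecast_spread = 0.5 + 0.1 * np.arange(n_leads)  # error grows with lead time
    forecasts = observations[:, None, :] + rng.normal(0.0, forecast_spread,
                                                      size=(n_cases, n_members, n_leads))

    ensemble_mean = forecasts.mean(axis=1)              # (n_cases, n_leads)
    bias = (ensemble_mean - observations).mean(axis=0)  # per lead time
    rmse = np.sqrt(((ensemble_mean - observations) ** 2).mean(axis=0))

    for lead, (b, r) in enumerate(zip(bias, rmse)):
        print(f"lead {lead}: bias = {b:+.3f}, RMSE = {r:.3f}")
    ```

    A practical lead-time limit for a given variable could then be read off as the lead beyond which the score degrades past a user-defined threshold.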

  7. Requirements, Verification, and Compliance (RVC) Database Tool

    NASA Technical Reports Server (NTRS)

    Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale

    2001-01-01

    This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information normally contained within documents (e.g. specifications, plans) is contained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".

  8. Experimental evaluation of fingerprint verification system based on double random phase encoding

    NASA Astrophysics Data System (ADS)

    Suzuki, Hiroyuki; Yamaguchi, Masahiro; Yachida, Masuyoshi; Ohyama, Nagaaki; Tashima, Hideaki; Obi, Takashi

    2006-03-01

    We proposed a smart card holder authentication system that combines fingerprint verification with PIN verification by applying a double random phase encoding scheme. In this system, the probability of accurate verification of an authorized individual reduces when the fingerprint is shifted significantly. In this paper, a review of the proposed system is presented and preprocessing for improving the false rejection rate is proposed. In the proposed method, the position difference between two fingerprint images is estimated by using an optimized template for core detection. When the estimated difference exceeds the permissible level, the user inputs the fingerprint again. The effectiveness of the proposed method is confirmed by a computational experiment; its results show that the false rejection rate is improved.
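
    Double random phase encoding, as applied to the fingerprint data above, multiplies the input by a random phase mask in the spatial domain and by a second mask in the Fourier domain. The sketch below illustrates only the basic encode/decode transform with assumed mask sizes; the optical implementation and the verification pipeline of the paper are more involved.

    ```python
    # Hedged sketch of double random phase encoding (DRPE) of a 2-D image.
    import numpy as np

    rng = np.random.default_rng(42)

    def drpe_encode(image, phase1, phase2):
        """Multiply by exp(i*phase1), FFT, multiply by exp(i*phase2), inverse FFT."""
        field = image * np.exp(1j * phase1)
        spectrum = np.fft.fft2(field) * np.exp(1j * phase2)
        return np.fft.ifft2(spectrum)

    def drpe_decode(encoded, phase1, phase2):
        """Undo the two phase multiplications using the same keys."""
        spectrum = np.fft.fft2(encoded) * np.exp(-1j * phase2)
        return (np.fft.ifft2(spectrum) * np.exp(-1j * phase1)).real

    image = rng.random((64, 64))                     # stand-in for a fingerprint image
    phase1 = rng.uniform(0, 2 * np.pi, image.shape)  # spatial-domain key
    phase2 = rng.uniform(0, 2 * np.pi, image.shape)  # Fourier-domain key

    encoded = drpe_encode(image, phase1, phase2)
    recovered = drpe_decode(encoded, phase1, phase2)
    print("max reconstruction error:", np.abs(recovered - image).max())
    ```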

  9. Simulation environment based on the Universal Verification Methodology

    NASA Astrophysics Data System (ADS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to the device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. Moreover, additional scoreboards flag undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.

  10. Definition of ground test for verification of large space structure control

    NASA Technical Reports Server (NTRS)

    Doane, G. B., III; Glaese, J. R.; Tollison, D. K.; Howsman, T. G.; Curtis, S. (Editor); Banks, B.

    1984-01-01

    Control theory and design, dynamic system modelling, and simulation of test scenarios are the main ideas discussed. The overall effort is the achievement at Marshall Space Flight Center of a successful ground test experiment of a large space structure. A simplified planar model for ground test verification was developed, and the elimination from that model of the uncontrollable rigid body modes was examined. The hardware/software aspects of computation speed were also studied.

  11. Integrated Verification Experiment data collected as part of the Los Alamos National Laboratory's Source Region Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fitzgerald, T.J.; Carlos, R.C.; Argo, P.E.

    As part of the integrated verification experiment (IVE), we deployed a network of hf ionospheric sounders to detect the effects of acoustic waves generated by surface ground motion following underground nuclear tests at the Nevada Test Site. The network sampled up to four geographic locations in the ionosphere from almost directly overhead of the surface ground zero out to a horizontal range of 60 km. We present sample results for four of the IVEs: Misty Echo, Texarkana, Mineral Quarry, and Bexar.

  12. Using Academia-Industry Partnerships to Enhance Software Verification & Validation Education via Active Learning Tools

    ERIC Educational Resources Information Center

    Acharya, Sushil; Manohar, Priyadarshan; Wu, Peter; Schilling, Walter

    2017-01-01

    Imparting real world experiences in a software verification and validation (SV&V) course is often a challenge due to the lack of effective active learning tools. This pedagogical requirement is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains. Realizing the…

  13. Experimental verification of vapor deposition rate theory in high velocity burner rigs

    NASA Technical Reports Server (NTRS)

    Gokoglu, Suleyman A.; Santoro, Gilbert J.

    1985-01-01

    The main objective has been the experimental verification of the corrosive vapor deposition theory in high-temperature, high-velocity environments. Towards this end a Mach 0.3 burner-rig apparatus was built to measure deposition rates from salt-seeded (mostly Na salts) combustion gases on an internally cooled cylindrical collector. Deposition experiments are underway.

  14. Cognitive Bias in Systems Verification

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. The effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct; there may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: overconfidence leads to questionable decisions to deploy; availability leads to an inability to conceive critical tests; representativeness leads to overinterpretation of results; positive test strategies lead to confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. It is worth considering at key points in the process.

  15. Approaching the investigation of plasma turbulence through a rigorous verification and validation procedure: A practical example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ricci, P., E-mail: paolo.ricci@epfl.ch; Riva, F.; Theiler, C.

    In the present work, a Verification and Validation procedure is presented and applied showing, through a practical example, how it can contribute to advancing our physics understanding of plasma turbulence. Bridging the gap between plasma physics and other scientific domains, in particular, the computational fluid dynamics community, a rigorous methodology for the verification of a plasma simulation code is presented, based on the method of manufactured solutions. This methodology assesses that the model equations are correctly solved, within the order of accuracy of the numerical scheme. The technique to carry out a solution verification is described to provide a rigorous estimate of the uncertainty affecting the numerical results. A methodology for plasma turbulence code validation is also discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The Verification and Validation methodology is then applied to the study of plasma turbulence in the basic plasma physics experiment TORPEX [Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulations carried out with the GBS code [Ricci et al., Plasma Phys. Controlled Fusion 54, 124047 (2012)]. The validation procedure allows progress in the understanding of the turbulent dynamics in TORPEX, by pinpointing the presence of a turbulent regime transition, due to the competition between the resistive and ideal interchange instabilities.

  16. Face verification with balanced thresholds.

    PubMed

    Yan, Shuicheng; Xu, Dong; Tang, Xiaoou

    2007-01-01

    The process of face verification is guided by a pre-learned global threshold, which, however, is often inconsistent with class-specific optimal thresholds. It is, hence, beneficial to pursue a balance of the class-specific thresholds in the model-learning stage. In this paper, we present a new dimensionality reduction algorithm tailored to the verification task that ensures threshold balance. This is achieved by the following aspects. First, feasibility is guaranteed by employing an affine transformation matrix, instead of the conventional projection matrix, for dimensionality reduction, and, hence, we call the proposed algorithm threshold balanced transformation (TBT). Then, the affine transformation matrix, constrained as the product of an orthogonal matrix and a diagonal matrix, is optimized to improve the threshold balance and classification capability in an iterative manner. Unlike most algorithms for face verification which are directly transplanted from face identification literature, TBT is specifically designed for face verification and clarifies the intrinsic distinction between these two tasks. Experiments on three benchmark face databases demonstrate that TBT significantly outperforms the state-of-the-art subspace techniques for face verification.

  17. A study of applications scribe frame data verifications using design rule check

    NASA Astrophysics Data System (ADS)

    Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki

    2013-06-01

    In semiconductor manufacturing, scribe frame data is generally generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables for scanner alignment, wafer inspection and customer-specified marks, and the design is finally checked for conformance to the alignment and inspection mark specifications. Recently, in the COT (customer owned tooling) business and in new technology development, there has been no effective verification method for scribe frame data, and verification takes a long time. We therefore tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which is normally used in device verification, and we describe here the verification scheme we applied. First, verification rules are created based on the scanner, inspection and other specifications, and a mark library is created for pattern matching. Next, DRC verification, which includes pattern matching against the mark library, is performed on the scribe frame data. Our experiments demonstrated that, by using pattern matching and DRC verification, the new method yields speed improvements of more than 12 percent compared to conventional mark checks by visual inspection, and the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks. It is easy to maintain and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.

  18. Sub-pixel mineral mapping using EO-1 Hyperion hyperspectral data

    NASA Astrophysics Data System (ADS)

    Kumar, C.; Shetty, A.; Raval, S.; Champatiray, P. K.; Sharma, R.

    2014-11-01

    This study describes the utility of Earth Observation (EO)-1 Hyperion data for sub-pixel mineral investigation using the Mixture Tuned Target Constrained Interference Minimized Filter (MTTCIMF) algorithm in the hostile mountainous terrain of the Rajsamand district of Rajasthan, which hosts economic mineralization such as lead, zinc, and copper. The study encompasses pre-processing, data reduction, Pixel Purity Index (PPI) analysis, and extraction of endmembers of surface minerals such as illite, montmorillonite, phlogopite, dolomite and chlorite from the reflectance image. These endmembers were then assessed against the USGS mineral spectral library and laboratory spectra of rock samples collected from the field. Subsequently, the MTTCIMF algorithm was applied to the processed image to obtain a distribution map for each detected mineral. A virtual verification method, which uses image information directly to evaluate the result, was adopted to assess the classified image, yielding an overall accuracy of 68% and a kappa coefficient of 0.6. Sub-pixel mineral information of reasonable accuracy could be a valuable guide for the geological and exploration community in targeting expensive ground and/or laboratory work to discover economic deposits. Thus, the study demonstrates the feasibility of Hyperion data for sub-pixel mineral mapping using the MTTCIMF algorithm in a cost- and time-effective approach.
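
    MTTCIMF is a specific target-detection filter; a simpler, commonly used stand-in for illustrating sub-pixel abundance estimation is linear unmixing of a pixel spectrum against endmember spectra. The sketch below uses synthetic spectra and non-negative least squares and is not the MTTCIMF algorithm applied in the study.

    ```python
    # Hedged sketch: sub-pixel abundance estimation by non-negative least squares
    # unmixing of a pixel spectrum against endmember spectra (synthetic data).
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(1)
    n_bands = 50

    # Columns are endmember spectra (stand-ins for e.g. illite, dolomite, chlorite).
    endmembers = rng.random((n_bands, 3))

    # A mixed pixel: 60% / 30% / 10% of the three endmembers, plus noise.
    true_abundances = np.array([0.6, 0.3, 0.1])
    pixel = endmembers @ true_abundances + rng.normal(0, 0.01, n_bands)

    abundances, _ = nnls(endmembers, pixel)
    abundances /= abundances.sum()  # approximate sum-to-one normalization
    print("estimated abundances:", np.round(abundances, 3))
    ```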

  19. A methodology for the rigorous verification of plasma simulation codes

    NASA Astrophysics Data System (ADS)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: the verification, which is a mathematical issue targeted to assess that the physical model is correctly solved, and the validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on the verification, which in turn is composed of the code verification, targeted to assess that a physical model is correctly implemented in a simulation code, and the solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for the code verification, based on the method of manufactured solutions, as well as a solution verification based on the Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate the plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
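
    Solution verification by Richardson extrapolation, mentioned above, estimates the observed order of accuracy and a grid-converged value from solutions on systematically refined grids. A minimal sketch for a constant refinement ratio follows; the numbers are illustrative and unrelated to GBS.

    ```python
    # Hedged sketch: observed order of accuracy and Richardson-extrapolated value
    # from a quantity computed on three grids with constant refinement ratio r.
    import math

    def observed_order(f_coarse, f_medium, f_fine, r):
        """p such that the change between grids shrinks like h**p."""
        return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

    def richardson_extrapolate(f_medium, f_fine, r, p):
        """Estimate of the grid-converged value from the two finest solutions."""
        return f_fine + (f_fine - f_medium) / (r**p - 1.0)

    # Illustrative values only: the same quantity computed with spacings h, h/2, h/4.
    f_h, f_h2, f_h4 = 1.105, 1.026, 1.0065
    r = 2.0
    p = observed_order(f_h, f_h2, f_h4, r)
    f_converged = richardson_extrapolate(f_h2, f_h4, r, p)
    print(f"observed order p = {p:.2f}, extrapolated value = {f_converged:.4f}")
    ```

    The gap between the finest-grid solution and the extrapolated value then serves as the numerical-error estimate attached to the simulation result.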

  20. Simulated Order Verification and Medication Reconciliation during an Introductory Pharmacy Practice Experience.

    PubMed

    Metzger, Nicole L; Chesson, Melissa M; Momary, Kathryn M

    2015-09-25

    Objective. To create, implement, and assess a simulated medication reconciliation and an order verification activity using hospital training software. Design. A simulated patient with medication orders and home medications was built into existing hospital training software. Students in an institutional introductory pharmacy practice experience (IPPE) reconciled the patient's medications and determined whether or not to verify the inpatient orders based on his medical history and laboratory data. After reconciliation, students identified medication discrepancies and documented their rationale for rejecting inpatient orders. Assessment. For a 3-year period, the majority of students agreed the simulation enhanced their learning, taught valuable clinical decision-making skills, integrated material from previous courses, and stimulated their interest in institutional pharmacy. Overall feedback from student evaluations about the IPPE also was favorable. Conclusion. Use of existing hospital training software can affordably simulate the pharmacist's role in order verification and medication reconciliation, as well as improve clinical decision-making.

  1. Signature Verification Based on Handwritten Text Recognition

    NASA Astrophysics Data System (ADS)

    Viriri, Serestina; Tapamo, Jules-R.

    Signatures continue to be an important biometric trait because it remains widely used primarily for authenticating the identity of human beings. This paper presents an efficient text-based directional signature recognition algorithm which verifies signatures, even when they are composed of special unconstrained cursive characters which are superimposed and embellished. This algorithm extends the character-based signature verification technique. The experiments carried out on the GPDS signature database and an additional database created from signatures captured using the ePadInk tablet, show that the approach is effective and efficient, with a positive verification rate of 94.95%.

  2. Integrated Disposal Facility FY 2016: ILAW Verification and Validation of the eSTOMP Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freedman, Vicky L.; Bacon, Diana H.; Fang, Yilin

    2016-05-13

    This document describes two sets of simulations carried out to further verify and validate the eSTOMP simulator. In this report, a distinction is made between verification and validation, and the focus is on verifying eSTOMP through a series of published benchmarks on cementitious wastes, and validating eSTOMP based on a lysimeter experiment for the glassified waste. These activities are carried out within the context of a scientific view of validation that asserts that models can only be invalidated, and that model validation (and verification) is a subjective assessment.

  3. A Secure Framework for Location Verification in Pervasive Computing

    NASA Astrophysics Data System (ADS)

    Liu, Dawei; Lee, Moon-Chuen; Wu, Dan

    The way people use computing devices has been changed in some way by the relatively new pervasive computing paradigm. For example, a person can use a mobile device to obtain its location information at anytime and anywhere. There are several security issues concerning whether this information is reliable in a pervasive environment. For example, a malicious user may disable the localization system by broadcasting a forged location, and it may impersonate other users by eavesdropping their locations. In this paper, we address the verification of location information in a secure manner. We first present the design challenges for location verification, and then propose a two-layer framework VerPer for secure location verification in a pervasive computing environment. Real world GPS-based wireless sensor network experiments confirm the effectiveness of the proposed framework.

  4. A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bravenec, Ronald

    My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to only continue the verification efforts.

  5. Evaluation of geotechnical monitoring data from the ESF North Ramp Starter Tunnel, April 1994 to June 1995. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-11-01

    This report presents the results of instrumentation measurements and observations made during construction of the North Ramp Starter Tunnel (NRST) of the Exploratory Studies Facility (ESF). The information in this report was developed as part of the Design Verification Study, Section 8.3.1.15.1.8 of the Yucca Mountain Site Characterization Plan (DOE 1988). The ESF is being constructed by the US Department of Energy (DOE) to evaluate the feasibility of locating a potential high-level nuclear waste repository on lands within and adjacent to the Nevada Test Site (NTS), Nye County, Nevada. The Design Verification Studies are performed to collect information during construction of the ESF that will be useful for design and construction of the potential repository. Four experiments make up the Design Verification Study: Evaluation of Mining Methods, Monitoring Drift Stability, Monitoring of Ground Support Systems, and The Air Quality and Ventilation Experiment. This report describes Sandia National Laboratories' (SNL) efforts in the first three of these experiments in the NRST.

  6. Field evaluations of the VDmax approach for substantiation of a 25 kGy sterilization dose and its application to other preselected doses

    NASA Astrophysics Data System (ADS)

    Kowalski, John B.; Herring, Craig; Baryschpolec, Lisa; Reger, John; Patel, Jay; Feeney, Mary; Tallentire, Alan

    2002-08-01

    The International and European standards for radiation sterilization require evidence of the effectiveness of a minimum sterilization dose of 25 kGy but do not provide detailed guidance on how this evidence can be generated. An approach, designated VDmax, has recently been described and computer evaluated to provide safe and unambiguous substantiation of a 25 kGy sterilization dose. The approach has been further developed into a practical method, which has been subjected to field evaluations at three manufacturing facilities which produce different types of medical devices. The three facilities each used a different overall evaluation strategy: Facility A used VDmax for quarterly dose audits; Facility B compared VDmax and Method 1 in side-by-side parallel experiments; and Facility C, a new facility at start-up, used VDmax for initial substantiation of 25 kGy and subsequent quarterly dose audits. A common element at all three facilities was the use of 10 product units for irradiation in the verification dose experiment. The field evaluations of the VDmax method were successful at all three facilities; they included many different types of medical devices/product families with a wide range of average bioburden and sample item portion values used in the verification dose experiments. Overall, around 500 verification dose experiments were performed and no failures were observed. In the side-by-side parallel experiments, the outcomes of the VDmax experiments were consistent with the outcomes observed with Method 1. The VDmax approach has been extended to sterilization doses >25 and <25 kGy; verification doses have been derived for sterilization doses of 15, 20, 30, and 35 kGy. Widespread application of the VDmax method for doses other than 25 kGy must await controlled field evaluations and the development of appropriate specifications/standards.

  7. Integrated Verification Experiment data collected as part of the Los Alamos National Laboratory's Source Region Program. Appendix D: Ionospheric measurements for IVEs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fitzgerald, T.J.; Carlos, R.C.; Argo, P.E.

    As part of the integrated verification experiment (IVE), we deployed a network of hf ionospheric sounders to detect the effects of acoustic waves generated by surface ground motion following underground nuclear tests at the Nevada Test Site. The network sampled up to four geographic locations in the ionosphere from almost directly overhead of the surface ground zero out to a horizontal range of 60 km. We present sample results for four of the IVEs: Misty Echo, Texarkana, Mineral Quarry, and Bexar.

  8. Development of independent MU/treatment time verification algorithm for non-IMRT treatment planning: A clinical experience

    NASA Astrophysics Data System (ADS)

    Tatli, Hamza; Yucel, Derya; Yilmaz, Sercan; Fayda, Merdan

    2018-02-01

    The aim of this study is to develop an algorithm for independent MU/treatment time (TT) verification of non-IMRT treatment plans, as part of a QA program to ensure treatment delivery accuracy. Two radiotherapy delivery units and their treatment planning systems (TPS) were commissioned at the Liv Hospital Radiation Medicine Center, Tbilisi, Georgia. Beam data were collected according to the vendors' collection guidelines and AAPM report recommendations, and processed in Microsoft Excel during in-house algorithm development. The algorithm is designed and optimized for calculating SSD and SAD treatment plans based on AAPM TG-114 dose calculation recommendations, and is coded and embedded in an MS Excel spreadsheet as a preliminary verification algorithm (VA). Treatment verification plans were created with the TPSs based on IAEA TRS 430 recommendations and also calculated with the VA; point measurements were collected with a solid water phantom and compared. The study showed that the in-house VA can be used for MU/TT verification of non-IMRT plans.
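
    An independent MU check of this kind recomputes the monitor units from the prescription and tabulated beam data and compares them with the TPS value. The sketch below shows the general shape of such a check for a simple SSD field; the dosimetric factors, the simplified formula and the tolerance are hypothetical placeholders, not the authors' exact TG-114-based formulation or commissioned beam data.

    ```python
    # Hedged sketch: independent monitor-unit (MU) check for a simple SSD field.
    # All numeric factors are hypothetical placeholders, not commissioned beam data.

    def independent_mu(dose_cGy, ref_dose_rate_cGy_per_MU, output_factor, pdd_percent):
        """MU needed to deliver dose_cGy at the calculation point."""
        return dose_cGy / (ref_dose_rate_cGy_per_MU * output_factor * pdd_percent / 100.0)

    tps_mu = 243.0  # MU reported by the treatment planning system (example value)
    mu_check = independent_mu(dose_cGy=200.0,
                              ref_dose_rate_cGy_per_MU=1.0,  # reference calibration
                              output_factor=0.98,            # field-size output factor
                              pdd_percent=86.0)              # percent depth dose at the point

    deviation_percent = 100.0 * (tps_mu - mu_check) / mu_check
    tolerance_percent = 3.0  # hypothetical action level
    print(f"independent MU = {mu_check:.1f}, TPS MU = {tps_mu:.1f}, "
          f"deviation = {deviation_percent:+.1f}% "
          f"({'OK' if abs(deviation_percent) <= tolerance_percent else 'investigate'})")
    ```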

  9. [Uniqueness seeking behavior as a self-verification: an alternative approach to the study of uniqueness].

    PubMed

    Yamaoka, S

    1995-06-01

    Uniqueness theory explains that extremely high perceived similarity between self and others evokes negative emotional reactions and causes uniqueness-seeking behavior. However, the theory conceptualizes similarity so ambiguously that it appears to suffer from low predictive validity. The purpose of the current article is to propose an alternative explanation of uniqueness-seeking behavior. It posits that perceived uniqueness deprivation is a threat to self-concepts, and therefore causes self-verification behavior. Two levels of self-verification are conceived: one based on personal categorization and the other on social categorization. The present approach regards uniqueness-seeking behavior as personal-level self-verification. To test these propositions, a 2 (very high or moderate similarity information) x 2 (with or without outgroup information) x 2 (high or low need for uniqueness) between-subject factorial-design experiment was conducted with 95 university students. Results supported the self-verification approach, and were discussed in terms of effects of uniqueness deprivation, levels of self-categorization, and individual differences in need for uniqueness.

  10. You Can't See the Real Me: Attachment Avoidance, Self-Verification, and Self-Concept Clarity.

    PubMed

    Emery, Lydia F; Gardner, Wendi L; Carswell, Kathleen L; Finkel, Eli J

    2018-03-01

    Attachment shapes people's experiences in their close relationships and their self-views. Although attachment avoidance and anxiety both undermine relationships, past research has primarily emphasized detrimental effects of anxiety on the self-concept. However, as partners can help people maintain stable self-views, avoidant individuals' negative views of others might place them at risk for self-concept confusion. We hypothesized that avoidance would predict lower self-concept clarity and that less self-verification from partners would mediate this association. Attachment avoidance was associated with lower self-concept clarity (Studies 1-5), an effect that was mediated by low self-verification (Studies 2-3). The association between avoidance and self-verification was mediated by less self-disclosure and less trust in partner feedback (Study 4). Longitudinally, avoidance predicted changes in self-verification, which in turn predicted changes in self-concept clarity (Study 5). Thus, avoidant individuals' reluctance to trust or become too close to others may result in hidden costs to the self-concept.

  11. Formal verification of medical monitoring software using Z language: a representative sample.

    PubMed

    Babamir, Seyed Morteza; Borhani, Mehdi

    2012-08-01

    Medical monitoring systems are useful aids that assist physicians in keeping patients under constant surveillance; however, whether the systems take sound decisions is a concern for physicians. As a result, verification of the systems' behavior in monitoring patients is a matter of significance. In modern medical systems, patient monitoring is undertaken by software, so software verification of modern medical systems has received attention. Such verification can be achieved with formal languages, which have mathematical foundations. Among others, the Z language is a suitable formal language that has been used for formal verification of systems. This study aims to present a constructive method for verifying a representative sample of a medical system, in which the system is visually specified and formally verified against patient constraints stated in the Z language. Exploiting our past experience in formally modeling the Continuous Infusion Insulin Pump (CIIP), we take the CIIP system as a representative sample of medical systems for the present study. The system is responsible for monitoring a diabetic's blood sugar.

  12. The MCNP6 Analytic Criticality Benchmark Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    2016-06-16

    Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from the set of benchmark experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. For the remaining problems, many of them were reformulated to permit execution in either multigroup mode or in the normal continuous-energy mode for MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance to MCNP verification methods.
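
    Verification against an analytical criticality benchmark ultimately reduces to checking whether the Monte Carlo k-effective agrees with the exact value within its reported statistical uncertainty. A minimal sketch of such a check follows; the numbers and the 3-sigma criterion are illustrative assumptions.

    ```python
    # Hedged sketch: compare a Monte Carlo k-effective (with 1-sigma statistical
    # uncertainty) against an exact analytical benchmark value.

    def benchmark_check(k_calc, k_sigma, k_exact, n_sigma=3.0):
        """Pass if the analytic value lies within n_sigma of the calculated value."""
        deviation = k_calc - k_exact
        return abs(deviation) <= n_sigma * k_sigma, deviation

    k_calculated, sigma = 0.99968, 0.00021  # e.g. from a continuous-energy run
    k_analytic = 1.00000                    # exact benchmark solution

    passed, dev = benchmark_check(k_calculated, sigma, k_analytic)
    print(f"deviation = {dev:+.5f} ({dev / sigma:+.2f} sigma), pass = {passed}")
    ```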

  13. Space Construction Automated Fabrication Experiment Definition Study (SCAFEDS). Volume 3: Requirements

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The performance, design, and verification requirements for the space construction automated fabrication experiment (SCAFE) are defined and the source of each imposed or derived requirement is identified.

  14. Testing and Demonstrating Speaker Verification Technology in Iraqi-Arabic as Part of the Iraqi Enrollment Via Voice Authentication Project (IEVAP) in Support of the Global War on Terrorism (GWOT)

    DTIC Science & Technology

    2007-09-01

    [Extraction fragment: a list of languages (Australian/New Zealand English, Canadian French, Cantonese, European French, German, Italian, Japanese, Jordanian Arabic, Mandarin, Portuguese) and a partial quote noting that, within the congruence model, the environment "includes people, other organizations, social and economic forces, and legal constraints" [28

  15. Assessing the Value-Added by the Environmental Testing Process with the Aide of Physics/Engineering of Failure Evaluations

    NASA Technical Reports Server (NTRS)

    Cornford, S.; Gibbel, M.

    1997-01-01

    NASA's Code QT Test Effectiveness Program is funding a series of applied research activities focused on utilizing the principles of physics and engineering of failure and those of engineering economics to assess and improve the value-added by the various validation and verification activities to organizations.

  16. Integrated Verification Experiment data collected as part of the Los Alamos National Laboratory's Source Region Program. Appendix B: Surface ground motion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weaver, T.A.; Baker, D.F.; Edwards, C.L.

    1993-10-01

    Surface ground motion was recorded for many of the Integrated Verification Experiments using standard 10-, 25- and 100-g accelerometers, force-balanced accelerometers and, for some events, using golf balls and 0.39-cm steel balls as surface inertial gauges (SIGs). This report contains the semi-processed acceleration, velocity, and displacement data for the accelerometers fielded and the individual observations for the SIG experiments. Most acceleration, velocity, and displacement records have had calibrations applied and have been deramped, offset corrected, and deglitched but are otherwise unfiltered or processed from their original records. Digital data for all of these records are stored at Los Alamos National Laboratory.

  17. A UVM simulation environment for the study, optimization and verification of HL-LHC digital pixel readout chips

    NASA Astrophysics Data System (ADS)

    Marconi, S.; Conti, E.; Christiansen, J.; Placidi, P.

    2018-05-01

    The operating conditions of the High Luminosity upgrade of the Large Hadron Collider are very demanding for the design of next generation hybrid pixel readout chips in terms of particle rate, radiation level and data bandwidth. To this purpose, the RD53 Collaboration has developed for the ATLAS and CMS experiments a dedicated simulation and verification environment using industry-consolidated tools and methodologies, such as SystemVerilog and the Universal Verification Methodology (UVM). This paper presents how the so-called VEPIX53 environment has first guided the design of digital architectures, optimized for processing and buffering very high particle rates, and secondly how it has been reused for the functional verification of the first large scale demonstrator chip designed by the collaboration, which has recently been submitted.

  18. Design of verification platform for wireless vision sensor networks

    NASA Astrophysics Data System (ADS)

    Ye, Juanjuan; Shang, Fei; Yu, Chuang

    2017-08-01

    At present, the majority of research on wireless vision sensor networks (WVSNs) still remains at the software simulation stage, and very few verification platforms for WVSNs are available for use. This situation seriously restricts the transition from theoretical research on WVSNs to practical application. Therefore, it is necessary to study the construction of a verification platform for WVSNs. This paper combines a wireless transceiver module, a visual information acquisition module and a power acquisition module to design a high-performance wireless vision sensor node built around an ARM11 microprocessor, selects AODV as the routing protocol, and sets up a verification platform called AdvanWorks for WVSNs. Experiments show that AdvanWorks can successfully achieve image acquisition, coding and wireless transmission, and can obtain the effective distance parameters between nodes, which lays a good foundation for follow-up applications of WVSNs.

  19. Control of operating parameters of laser ceilometers with the application of fiber optic delay line imitation

    NASA Astrophysics Data System (ADS)

    Kim, A. A.; Klochkov, D. V.; Konyaev, M. A.; Mihaylenko, A. S.

    2017-11-01

    The article considers the problem of controlling and verifying the basic performance parameters of laser ceilometers and describes an alternative method based on the use of a multi-length fiber optic delay line that simulates an atmospheric track. The results of the described experiment demonstrate the great potential of this method for inspection and verification procedures for laser ceilometers.

  20. TeleOperator/telePresence System (TOPS) Concept Verification Model (CVM) development

    NASA Technical Reports Server (NTRS)

    Shimamoto, Mike S.

    1993-01-01

    The development of an anthropomorphic, undersea manipulator system, the TeleOperator/telePresence System (TOPS) Concept Verification Model (CVM) is described. The TOPS system's design philosophy, which results from NRaD's experience in undersea vehicles and manipulator systems development and operations, is presented. The TOPS design approach, task teams, manipulator, and vision system development and results, conclusions, and recommendations are presented.

  1. Dynamic Calibration and Verification Device of Measurement System for Dynamic Characteristic Coefficients of Sliding Bearing

    PubMed Central

    Chen, Runlin; Wei, Yangyang; Shi, Zhaoyang; Yuan, Xiaoyang

    2016-01-01

    The identification accuracy of dynamic characteristic coefficients is difficult to guarantee because of errors in the measurement system itself. A novel dynamic calibration method for such measurement systems is proposed in this paper to eliminate these errors. Unlike calibration against a suspended mass, in this method the verification device is a spring-mass system that can simulate the dynamic characteristics of a sliding bearing. The verification device was built and the calibration experiment carried out over a wide frequency range, with the bearing stiffness simulated by disc springs. The experimental results show that the amplitude errors of the measurement system are small in the range 10 Hz–100 Hz, while the phase errors increase with frequency. A simulated identification experiment in the range 10 Hz–30 Hz preliminarily verifies that the calibration data in this range can adequately support dynamic characteristics tests of sliding bearings in this range. Bearing experiments over wider frequency ranges require higher manufacturing and installation precision of the calibration device, and the calibration procedures themselves should be improved. PMID:27483283
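
    The amplitude and phase errors reported in such calibration experiments can be estimated at a single excitation frequency by correlating the measured channel against the reference excitation. The sketch below illustrates that calculation only; the sampling rate, excitation frequency and injected errors are made-up values, not data from the paper.

        import numpy as np

        def single_tone_phasor(signal, fs, f0):
            """Estimate the complex phasor of `signal` at frequency f0 (Hz) by correlation."""
            t = np.arange(len(signal)) / fs
            ref = np.exp(-2j * np.pi * f0 * t)
            return 2.0 * np.mean(signal * ref)  # amplitude * exp(j*phase), up to a common offset

        # Illustrative data: a 20 Hz excitation measured with a small gain and phase error.
        fs, f0, n = 5000.0, 20.0, 5000
        t = np.arange(n) / fs
        excitation = 1.0 * np.sin(2 * np.pi * f0 * t)
        measured = 0.97 * np.sin(2 * np.pi * f0 * t - np.deg2rad(3.0)) + 0.01 * np.random.randn(n)

        px, pm = single_tone_phasor(excitation, fs, f0), single_tone_phasor(measured, fs, f0)
        amp_error = abs(pm) / abs(px) - 1.0        # relative amplitude error of the channel
        phase_error = np.angle(pm / px, deg=True)  # phase error in degrees
        print(f"amplitude error {amp_error:+.3%}, phase error {phase_error:+.2f} deg")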

  2. Being known, intimate, and valued: global self-verification and dyadic adjustment in couples and roommates.

    PubMed

    Katz, Jennifer; Joiner, Thomas E

    2002-02-01

    We contend that close relationships provide adults with optimal opportunities for personal growth when relationship partners provide accurate, honest feedback. Accordingly, it was predicted that young adults would experience greater relationship quality with partners who evaluated them in a manner consistent with their own self-evaluations. Three empirical tests of this self-verification hypothesis as applied to close dyads were conducted. In Study 1, young adults in dating relationships were most intimate with, and somewhat more committed to, partners when they perceived that those partners evaluated them as they evaluated themselves. Self-verification effects were pronounced for those involved in more serious dating relationships. In Study 2, men reported the greatest esteem for same-sex roommates who evaluated them in a self-verifying manner. Results from Study 2 were replicated and extended to both male and female roommate dyads in Study 3. Further, self-verification effects were most pronounced for young adults with high emotional empathy. Results suggest that self-verification theory is useful for understanding dyadic adjustment across a variety of relational contexts in young adulthood. Implications of self-verification processes for adult personal development are outlined within an identity negotiation framework.

  3. Annual verifications--a tick-box exercise?

    PubMed

    Walker, Gwen; Williams, David

    2014-09-01

    With the onus on healthcare providers and their staff to protect patients against all elements of 'avoidable harm' perhaps never greater, Gwen Walker, a highly experienced infection prevention control nurse specialist, and David Williams, MD of Approved Air, who has 30 years' experience in validation and verification of ventilation and ultraclean ventilation systems, examine changing requirements for, and trends in, operating theatre ventilation. Validation and verification reporting on such vital HVAC equipment should not, they argue, merely be viewed as a 'tick-box exercise'; it should instead 'comprehensively inform key stakeholders, and ultimately form part of clinical governance, thus protecting those ultimately named responsible for organisation-wide safety at Trust board level'.

  4. US corn and soybeans exploratory experiment

    NASA Technical Reports Server (NTRS)

    Carnes, J. G. (Principal Investigator)

    1981-01-01

    The results from the U.S. corn/soybeans exploratory experiment which was completed during FY 1980 are summarized. The experiment consisted of two parts: the classification procedures verification test and the simulated aggregation test. Evaluations of labeling, proportion estimation, and aggregation procedures are presented.

  5. Payload crew training complex simulation engineer's handbook

    NASA Technical Reports Server (NTRS)

    Shipman, D. L.

    1984-01-01

    The Simulation Engineer's Handbook is a guide for new engineers assigned to Experiment Simulation and a reference for engineers previously assigned. The experiment simulation process, development of experiment simulator requirements, development of experiment simulator hardware and software, and the verification of experiment simulators are discussed. The training required for experiment simulation is extensive and is only referenced in the handbook.

  6. Towards SMOS: The 2006 National Airborne Field Experiment Plan

    NASA Astrophysics Data System (ADS)

    Walker, J. P.; Merlin, O.; Panciera, R.; Kalma, J. D.

    2006-05-01

    The 2006 National Airborne Field Experiment (NAFE) is the second in a series of two intensive experiments to be conducted in different parts of Australia. The NAFE'05 experiment was undertaken in the Goulburn River catchment during November 2005, with the objective of providing high resolution data for process level understanding of soil moisture retrieval, scaling and data assimilation. The NAFE'06 experiment will be undertaken in the Murrumbidgee catchment during November 2006, with the objective of providing data for SMOS (Soil Moisture and Ocean Salinity) level soil moisture retrieval, downscaling and data assimilation. To meet this objective, PLMR (Polarimetric L-band Multibeam Radiometer) and supporting instruments (TIR and NDVI) will be flown at an altitude of 10,000 ft AGL to provide 1km resolution passive microwave data (and 20m TIR) across a 50km x 50km area every 2-3 days. This will both simulate a SMOS pixel and provide the 1km soil moisture data required for downscaling verification, allowing downscaling and near-surface soil moisture assimilation techniques to be tested with remote sensing data which is consistent with that from current (MODIS) and planned (SMOS) satellite sensors. Additionally, two transects will be flown across the area to provide both 1km multi-angular passive microwave data for SMOS algorithm development, and on the same day, 50m resolution passive microwave data for algorithm verification. The study area contains a total of 13 soil moisture profile and rainfall monitoring sites for assimilation verification, and the transect flight lines are planned to go through 5 of these. Ground monitoring of surface soil moisture and vegetation for algorithm verification will be targeted at these 5 focus farms, with soil moisture measurements made at 250m spacing for 1km resolution flights and 50m spacing for 50m resolution flights. While this experiment has a particular emphasis on the remote sensing of soil moisture, it is open to collaboration from interested scientists from all disciplines of environmental remote sensing and its application. See www.nafe.unimelb.edu.au for more detailed information on these experiments.

  7. eBiometrics: an enhanced multi-biometrics authentication technique for real-time remote applications on mobile devices

    NASA Astrophysics Data System (ADS)

    Kuseler, Torben; Lami, Ihsan; Jassim, Sabah; Sellahewa, Harin

    2010-04-01

    The use of mobile communication devices with advanced sensors is growing rapidly. These sensors enable functions such as image capture, location applications, and biometric authentication such as fingerprint verification and face and handwritten-signature recognition. Such ubiquitous devices are essential tools in today's global economic activities, enabling anywhere-anytime financial and business transactions. Cryptographic functions and biometric-based authentication can enhance the security and confidentiality of mobile transactions. Biometric template security techniques in real-time biometric-based authentication are key factors for successful identity verification solutions, but they are vulnerable to determined attacks by both fraudulent software and hardware. The EU-funded SecurePhone project has designed and implemented a multimodal biometric user authentication system on a prototype mobile communication device. However, various implementations of this project have resulted in long verification times or reduced accuracy and/or security. This paper proposes the use of built-in self-test techniques to ensure that no tampering has taken place in the verification process prior to performing the actual biometric authentication. These techniques utilise the user's personal identification number as a seed to generate a unique signature, which is then used to test the integrity of the verification process. The study also proposes the use of a combination of biometric modalities to provide application-specific authentication in a secure environment, achieving an optimum security level with effective processing time, i.e. ensuring that the authentication steps and algorithms running on the mobile device's application processor cannot be undermined or modified by an impostor to gain unauthorized access to the secure system.
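
    The built-in self-test idea, where the PIN seeds a signature that checks the verification code's integrity before biometric matching runs, could be sketched roughly as below. The key-derivation parameters, salt and module bytes are assumptions for illustration, not the SecurePhone implementation.

        import hashlib
        import hmac

        def integrity_tag(pin: str, salt: bytes, module_bytes: bytes) -> bytes:
            """Derive a key from the PIN and sign the verification module with it."""
            key = hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)
            return hmac.new(key, module_bytes, hashlib.sha256).digest()

        def self_test(pin: str, salt: bytes, module_bytes: bytes, expected_tag: bytes) -> bool:
            """Return True only if the module is unmodified; run before biometric matching."""
            return hmac.compare_digest(integrity_tag(pin, salt, module_bytes), expected_tag)

        # Illustrative usage with placeholder data.
        salt = b"device-unique-salt"
        module = b"...bytes of the verification routine..."
        tag_at_enrolment = integrity_tag("1234", salt, module)
        assert self_test("1234", salt, module, tag_at_enrolment)
        assert not self_test("1234", salt, module + b"tampered", tag_at_enrolment)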

  8. Age verification cards fail to fully prevent minors from accessing tobacco products.

    PubMed

    Kanda, Hideyuki; Osaki, Yoneatsu; Ohida, Takashi; Kaneita, Yoshitaka; Munezawa, Takeshi

    2011-03-01

    Proper age verification can prevent minors from accessing tobacco products. For this reason, electronic locking devices based on a proof-of-age system using cards were installed in almost every tobacco vending machine across Japan and Germany to restrict sales to minors. We aimed to clarify the associations between the amount smoked by high school students and the use of age verification cards by conducting a nationwide cross-sectional survey of students in Japan. This survey was conducted in 2008. We asked high school students, aged 13-18 years, in Japan about their smoking behaviour, where they purchased cigarettes, whether they had used age verification cards, and, if so, how they obtained the card. As the amount smoked increased, the prevalence of purchasing cigarettes from vending machines also rose for both males and females. The percentage of those with experience of using an age verification card was also higher among those who smoked more. Someone outside the family was the most common source of cards. Surprisingly, around 5% of males and females in the group with the highest smoking levels had applied for cards themselves. Age verification cards cannot fully prevent minors from accessing tobacco products. These findings suggest that a total ban on tobacco vending machines, not an age verification system, is needed to prevent sales to minors.

  9. Comparison of OpenFOAM and EllipSys3D actuator line methods with (NEW) MEXICO results

    NASA Astrophysics Data System (ADS)

    Nathan, J.; Meyer Forsting, A. R.; Troldborg, N.; Masson, C.

    2017-05-01

    The Actuator Line Method has existed for more than a decade and has become a well-established choice for simulating wind rotors in computational fluid dynamics. Numerous implementations exist and are used in the wind energy research community. These codes have been compared against experimental data such as the MEXICO experiment, but comparisons between codes have often been made only on a very broad scale. This study therefore first attempts a code-to-code verification by comparing two different implementations, namely an adapted version of SOWFA/OpenFOAM and EllipSys3D, and then a validation by comparing against experimental results from the MEXICO and NEW MEXICO experiments.

  10. Data requirements for verification of ram glow chemistry

    NASA Technical Reports Server (NTRS)

    Swenson, G. R.; Mende, S. B.

    1985-01-01

    A set of questions is posed regarding the surface chemistry producing the ram glow on the Space Shuttle. The questions surround verification of the chemical cycle involved in the physical processes leading to the glow. The questions, and a matrix of the measurements required to answer most of them, are presented. The measurements include knowledge of the flux composition to and from a ram surface as well as spectroscopic signatures from the UV through the visible to the IR. A pallet set of experiments proposed to accomplish the measurements is discussed. An interim experiment involving an available infrared instrument to be operated from the Shuttle Orbiter cabin is also discussed.

  11. Direct and full-scale experimental verifications towards ground-satellite quantum key distribution

    NASA Astrophysics Data System (ADS)

    Wang, Jian-Yu; Yang, Bin; Liao, Sheng-Kai; Zhang, Liang; Shen, Qi; Hu, Xiao-Fang; Wu, Jin-Cai; Yang, Shi-Ji; Jiang, Hao; Tang, Yan-Lin; Zhong, Bo; Liang, Hao; Liu, Wei-Yue; Hu, Yi-Hua; Huang, Yong-Mei; Qi, Bo; Ren, Ji-Gang; Pan, Ge-Sheng; Yin, Juan; Jia, Jian-Jun; Chen, Yu-Ao; Chen, Kai; Peng, Cheng-Zhi; Pan, Jian-Wei

    2013-05-01

    Quantum key distribution (QKD) provides the only intrinsically unconditionally secure method of communication, based on the principles of quantum mechanics. Compared with fibre-based demonstrations, free-space links could provide the most appealing solution for communication over much larger distances. Despite significant efforts, all realizations to date rely on stationary sites. Experimental verifications are therefore extremely crucial for applications involving a typical low Earth orbit satellite. To achieve direct and full-scale verification of our set-up, we carried out three independent experiments with a decoy-state QKD system covering all of these conditions. The system was operated on a moving platform (using a turntable), on a floating platform (using a hot-air balloon), and with a high-loss channel to demonstrate performance under conditions of rapid motion, attitude change, vibration, random movement of satellites, and a high-loss regime. The experiments address wide ranges of all leading parameters relevant to low Earth orbit satellites. Our results pave the way towards ground-satellite QKD and a global quantum communication network.

  12. The greenhouse theory of climate change - A test by an inadvertent global experiment

    NASA Technical Reports Server (NTRS)

    Ramanathan, V.

    1988-01-01

    The greenhouse theory of climate change has reached the crucial stage of verification. Surface warming as large as that predicted by models would be unprecedented during an interglacial period such as the present. The theory, its scope for verification, and the emerging complexities of the climate feedback mechanisms are discussed in this paper. The evidence for change is described and competing nonclimatic forcings are discussed.

  13. Engineering of the LISA Pathfinder mission—making the experiment a practical reality

    NASA Astrophysics Data System (ADS)

    Warren, Carl; Dunbar, Neil; Backler, Mike

    2009-05-01

    LISA Pathfinder represents a unique challenge in the development of scientific spacecraft—not only is the LISA Test Package (LTP) payload a complex integrated development, placing stringent requirements on its developers and the spacecraft, but the payload also acts as the core sensor and actuator for the spacecraft, making the tasks of control design, software development and system verification unusually difficult. The micro-propulsion system which provides the remaining actuation also presents substantial development and verification challenges. As the mission approaches the system critical design review, flight hardware is completing verification and the process of verification using software and hardware simulators and test benches is underway. Preparation for operations has started, but critical milestones for LTP and field effect electric propulsion (FEEP) lie ahead. This paper summarizes the status of the present development and outlines the key challenges that must be overcome on the way to launch.

  14. Verification of Decision-Analytic Models for Health Economic Evaluations: An Overview.

    PubMed

    Dasbach, Erik J; Elbasha, Elamin H

    2017-07-01

    Decision-analytic models for cost-effectiveness analysis are developed in a variety of software packages where the accuracy of the computer code is seldom verified. Although modeling guidelines recommend using state-of-the-art quality assurance and control methods for software engineering to verify models, the fields of pharmacoeconomics and health technology assessment (HTA) have yet to establish and adopt guidance on how to verify health and economic models. The objective of this paper is to introduce to our field the variety of methods the software engineering field uses to verify that software performs as expected. We identify how many of these methods can be incorporated in the development process of decision-analytic models in order to reduce errors and increase transparency. Given the breadth of methods used in software engineering, we recommend a more in-depth initiative to be undertaken (e.g., by an ISPOR-SMDM Task Force) to define the best practices for model verification in our field and to accelerate adoption. Establishing a general guidance for verifying models will benefit the pharmacoeconomics and HTA communities by increasing accuracy of computer programming, transparency, accessibility, sharing, understandability, and trust of models.
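
    As a small illustration of the software-engineering style checks the authors advocate, the engine of a cost-effectiveness model can be wrapped in automated assertions. The three-state Markov cohort model below is a hypothetical example with invented transition probabilities, not a model from the paper.

        import numpy as np

        def markov_cohort(transition, start, cycles):
            """Propagate a cohort distribution through a discrete-time Markov model."""
            trace = [np.asarray(start, dtype=float)]
            for _ in range(cycles):
                trace.append(trace[-1] @ transition)
            return np.array(trace)

        # Hypothetical states: healthy, sick, dead (absorbing).
        P = np.array([[0.90, 0.08, 0.02],
                      [0.00, 0.85, 0.15],
                      [0.00, 0.00, 1.00]])
        trace = markov_cohort(P, start=[1.0, 0.0, 0.0], cycles=40)

        # Verification checks of the kind a test suite might automate.
        assert np.allclose(P.sum(axis=1), 1.0)          # rows are probability distributions
        assert np.allclose(trace.sum(axis=1), 1.0)      # cohort is conserved every cycle
        assert np.all(np.diff(trace[:, 2]) >= -1e-12)   # absorbing state never shrinks
        print("all verification checks passed")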

  15. Extending to seasonal scales the current usage of short range weather forecasts and climate projections for water management in Spain

    NASA Astrophysics Data System (ADS)

    Rodriguez-Camino, Ernesto; Voces, José; Sánchez, Eroteida; Navascues, Beatriz; Pouget, Laurent; Roldan, Tamara; Gómez, Manuel; Cabello, Angels; Comas, Pau; Pastor, Fernando; Concepción García-Gómez, M.°; José Gil, Juan; Gil, Delfina; Galván, Rogelio; Solera, Abel

    2016-04-01

    This presentation first briefly describes the current use of weather forecasts and climate projections delivered by AEMET for water management in Spain. The potential use of seasonal climate predictions for water management, in particular of dams, is then discussed in more depth, using a pilot experience carried out by a multidisciplinary group coordinated by AEMET and the DG for Water of Spain. This initiative is being developed in the framework of the national implementation of the GFCS and the European project EUPORIAS. Among the main components of this experience are meteorological and hydrological observations, and an empirical seasonal forecasting technique that provides an ensemble of water reservoir inflows. These forecasted inflows feed a prediction model for the dam state that has been adapted for this purpose. The full system is being tested retrospectively, over several decades, for selected water reservoirs located in different Spanish river basins. The assessment includes an objective verification of the probabilistic seasonal forecasts using standard metrics, and the evaluation of the potential social and economic benefits, with special attention to drought and flooding conditions. The methodology for implementing these seasonal predictions in the decision-making process is being developed in close collaboration with the end users participating in this pilot experience.
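
    The objective verification of probabilistic seasonal forecasts mentioned above typically relies on scores of the kind sketched here. This is a generic illustration with made-up probabilities and outcomes, not the metrics or data actually used in the Spanish pilot.

        import numpy as np

        def brier_score(prob_forecasts, outcomes):
            """Brier score for probabilistic forecasts of a binary event (lower is better)."""
            p = np.asarray(prob_forecasts, dtype=float)
            o = np.asarray(outcomes, dtype=float)
            return float(np.mean((p - o) ** 2))

        def brier_skill_score(prob_forecasts, outcomes):
            """Skill relative to always forecasting the climatological frequency."""
            o = np.asarray(outcomes, dtype=float)
            reference = brier_score(np.full_like(o, o.mean()), o)
            return 1.0 - brier_score(prob_forecasts, outcomes) / reference

        # Illustrative ensemble-derived probabilities that seasonal inflow falls in the lower tercile.
        forecast_probs = [0.2, 0.7, 0.1, 0.8, 0.4, 0.6]
        observed_events = [0, 1, 0, 1, 1, 0]
        print(brier_score(forecast_probs, observed_events),
              brier_skill_score(forecast_probs, observed_events))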

  16. Environmental Verification Experiment for the Explorer Platform (EVEEP)

    NASA Technical Reports Server (NTRS)

    Norris, Bonnie; Lorentson, Chris

    1992-01-01

    Satellites and long-life spacecraft require effective contamination control measures to ensure data accuracy and maintain overall system performance margins. Satellite and spacecraft contamination can occur from either molecular or particulate matter. Some of the sources of the molecular species are as follows: mass loss from nonmetallic materials; venting of confined spacecraft or experiment volumes; exhaust effluents from attitude control systems; integration and test activities; and improper cleaning of surfaces. Some of the sources of particulates are as follows: leaks or purges which condense upon vacuum exposure; abrasion of movable surfaces; and micrometeoroid impacts. The Environmental Verification Experiment for the Explorer Platform (EVEEP) was designed to investigate the following aspects of spacecraft contamination control: materials selection; contamination modeling of existing designs; and thermal vacuum testing of a spacecraft with contamination monitors.

  17. Study for verification testing of the helmet-mounted display in the Japanese Experimental Module.

    PubMed

    Nakajima, I; Yamamoto, I; Kato, H; Inokuchi, S; Nemoto, M

    2000-02-01

    Our purpose is to propose a research and development project in the field of telemedicine. The proposed Multimedia Telemedicine Experiment for Extra-Vehicular Activity will entail experiments designed to support astronaut health management during Extra-Vehicular Activity (EVA). Experiments will have relevant applications to the Japanese Experimental Module (JEM) operated by National Space Development Agency of Japan (NASDA) for the International Space Station (ISS). In essence, this is a proposal for verification testing of the Helmet-Mounted Display (HMD), which enables astronauts to verify their own blood pressures and electrocardiograms, and to view a display of instructions from the ground station and listings of work procedures. Specifically, HMD is a device designed to project images and data inside the astronaut's helmet. We consider this R&D proposal to be one of the most suitable projects under consideration in response to NASDA's open invitation calling for medical experiments to be conducted on JEM.

  18. Guidance and Control Software Project Data - Volume 3: Verification Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the verification documents from the GCS project. Volume 3 contains four appendices: A. Software Verification Cases and Procedures for the Guidance and Control Software Project; B. Software Verification Results for the Pluto Implementation of the Guidance and Control Software; C. Review Records for the Pluto Implementation of the Guidance and Control Software; and D. Test Results Logs for the Pluto Implementation of the Guidance and Control Software.

  19. Practical Application of Model Checking in Software Verification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Skakkebaek, Jens Ulrik

    1999-01-01

    This paper presents our experiences in applying JAVA PATHFINDER (JPF), a recently developed JAVA-to-SPIN translator, to find synchronization bugs in a Chinese Chess game server application written in JAVA. We give an overview of JPF and the subset of JAVA that it supports, and describe the abstraction and verification of the game server. Finally, we analyze the results of the effort. We argue that abstraction by under-approximation is necessary for abstracting sufficiently small models for verification purposes; that user guidance is crucial for effective abstraction; and that current model checkers do not conveniently support the computational models of software in general and JAVA in particular.

  20. Design and Verification Guidelines for Vibroacoustic and Transient Environments

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Design and verification guidelines for vibroacoustic and transient environments contain many basic methods that are common throughout the aerospace industry. However, there are some significant differences in methodology between NASA/MSFC and others - both government agencies and contractors. The purpose of this document is to provide the general guidelines used by the Component Analysis Branch, ED23, at MSFC, for the application of vibroacoustic and transient technology to all launch vehicle and payload components and experiments managed by NASA/MSFC. This document is intended as a tool to be utilized by MSFC program management and their contractors as a guide for the design and verification of flight hardware.

  1. Design exploration and verification platform, based on high-level modeling and FPGA prototyping, for fast and flexible digital communication in physics experiments

    NASA Astrophysics Data System (ADS)

    Magazzù, G.; Borgese, G.; Costantino, N.; Fanucci, L.; Incandela, J.; Saponara, S.

    2013-02-01

    In many research fields, such as high energy physics (HEP), astrophysics, nuclear medicine or space engineering with harsh operating conditions, the use of fast and flexible digital communication protocols is becoming more and more important. The possibility of having a smart and tested top-down design flow for the design of a new protocol for the control/readout of front-end electronics is very useful. To this aim, and to reduce development time, costs and risks, this paper describes an innovative design/verification flow applied, as an example case study, to a new communication protocol called FF-LYNX. After a description of the main FF-LYNX features, the paper presents: the definition of a parametric SystemC-based Integrated Simulation Environment (ISE) for high-level protocol definition and validation; the set-up of figures of merit to drive the design space exploration; the use of the ISE for early analysis of the achievable performance when adopting the new communication protocol and its interfaces in a new (or upgraded) physics experiment; the design of VHDL IP cores for the TX and RX protocol interfaces; their implementation on an FPGA-based emulator for functional verification; and finally the modification of the FPGA-based emulator for testing the ASIC chipset that implements the rad-tolerant protocol interfaces. For every step, significant results are shown to underline the usefulness of this design and verification approach, which can be applied to any new digital protocol development for smart detectors in physics experiments.

  2. 6th Annual CMMI Technology Conference and User Group

    DTIC Science & Technology

    2006-11-17

    Operationally Oriented; Customer Focused Proven Approach – Level of Detail Beginner. Decision Table (DT) is a tabular representation with tailoring options to... written to reflect the experience of the author. Software Engineering led the process charge in the '80s – Used Flowcharts – CASE tools – “data... Postponed PCR. Verification Steps • EPG configuration audits • EPG configuration status reports. Flowcharts and Entry, Task, Verification and eXit

  3. Which Accelerates Faster--A Falling Ball or a Porsche?

    ERIC Educational Resources Information Center

    Rall, James D.; Abdul-Razzaq, Wathiq

    2012-01-01

    An introductory physics experiment has been developed to address the issues seen in conventional physics lab classes including assumption verification, technological dependencies, and real world motivation for the experiment. The experiment has little technology dependence and compares the acceleration due to gravity by using position versus time…

  4. Residual Negative Pressure in Vacuum Tubes Might Increase the Risk of Spurious Hemolysis.

    PubMed

    Xiao, Tong-Tong; Zhang, Qiao-Xin; Hu, Jing; Ouyang, Hui-Zhen; Cai, Ying-Mu

    2017-05-01

    We planned a study to establish whether spurious hemolysis may occur when negative pressure remains in vacuum tubes. Four tubes with different vacuum levels (-54, -65, -74, and -86 kPa) were used to examine blood drawn from one healthy volunteer; the tubes were allowed to stand for different times (1, 2, 3, and 4 hours). The plasma was separated and immediately tested for free hemoglobin (FHb). Thirty patients were enrolled in a verification experiment. The degree of hemolysis observed was greater when the remaining negative pressure was higher. Significant differences were recorded in the verification experiment. The results suggest that residual negative pressure might increase the risk of spurious hemolysis.

  5. Proton beam characterization in the experimental room of the Trento Proton Therapy facility

    NASA Astrophysics Data System (ADS)

    Tommasino, F.; Rovituso, M.; Fabiano, S.; Piffer, S.; Manea, C.; Lorentini, S.; Lanzone, S.; Wang, Z.; Pasini, M.; Burger, W. J.; La Tessa, C.; Scifoni, E.; Schwarz, M.; Durante, M.

    2017-10-01

    As proton therapy is becoming an established treatment methodology for cancer patients, the number of proton centres is gradually growing worldwide. The economic effort of building these facilities is motivated by the clinical aspects, but might also be supported by their potential relevance for the research community. Experiments with high-energy protons are needed not only for medical physics applications, but also represent an essential part of activities dedicated to detector development, space research, radiation hardness tests, as well as fundamental research in nuclear and particle physics. Here we present the characterization of the beam line installed in the experimental room of the Trento Proton Therapy Centre (Italy). Measurements of beam spot size and envelope, range verification and proton flux were performed in the energy range between 70 and 228 MeV. Methods for reducing the proton flux from typical treatment values of 106-109 particles/s down to 101-105 particles/s were also investigated. These data confirm that a proton beam produced in a clinical centre built by a commercial company can be exploited for a broad spectrum of experimental activities. The results presented here will be used as a reference for future experiments.

  6. An Exploratory Analysis of Economic Factors in the Navy Total Force Strength Model (NTFSM)

    DTIC Science & Technology

    2015-12-01

    NTFSM is still in the testing phase and its overall behavior is largely unknown. In particular, the analysts that NTFSM was designed to help are...

  7. Computer-Based Automation of Discrete Product Manufacture: A preliminary Discussion of Feasibility and Impact

    DTIC Science & Technology

    1974-07-01

    automated manufacturing processes and a rough technoeconomic evaluation of those concepts. Our evaluation is largely based on estimates; therefore, the... must be subjected to thorough analysis and experimental verification before they can be considered definitive. They are being published at this time... hardware and sensor technology, manufacturing engineering, automation, and economic analysis. Members of this team inspected over thirty manufacturing

  8. Analytical and policy issues in energy economics: Uses of the FRS data base

    NASA Astrophysics Data System (ADS)

    1981-12-01

    The relevant literature concerning several major analytical and policy issues in energy economics is reviewed and criticized. The possible uses of the Financial Reporting System (FRS) data base for the analysis of energy policy issues are investigated. Certain features of FRS data suggest several ways in which the data base can be used by policy makers. FRS data are collected on the firm level, and different segments of the same firm operating in different markets can be separately identified. The methods of collection as well as FRS's elaborate data verification process guarantee a high degree of accuracy and consistency among firms.

  9. Modeling interfacial fracture in Sierra.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Arthur A.; Ohashi, Yuki; Lu, Wei-Yang

    2013-09-01

    This report summarizes computational efforts to model interfacial fracture using cohesive zone models in the SIERRA/SolidMechanics (SIERRA/SM) finite element code. Cohesive surface elements were used to model crack initiation and propagation along predefined paths. Mesh convergence was observed with SIERRA/SM for numerous geometries. As the funding for this project came from the Advanced Simulation and Computing Verification and Validation (ASC V&V) focus area, considerable effort was spent performing verification and validation. Code verification was performed to compare code predictions to analytical solutions for simple three-element simulations as well as a higher-fidelity simulation of a double-cantilever beam. Parameter identification was conducted with Dakota using experimental results on asymmetric double-cantilever beam (ADCB) and end-notched-flexure (ENF) experiments conducted under Campaign-6 funding. Discretization convergence studies were also performed with respect to mesh size and time step, and an optimization study was completed for mode II delamination using the ENF geometry. Throughout this verification process, numerous SIERRA/SM bugs were found and reported, all of which have been fixed, leading to over a 10-fold increase in convergence rates. Finally, mixed-mode flexure experiments were performed for validation. One of the unexplained issues encountered was material property variability for ostensibly the same composite material. Since the variability is not fully understood, it is difficult to accurately assess uncertainty when performing predictions.

  10. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendt, Fabian F; Yu, Yi-Hsiang; Nielsen, Kim

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 was proposed by Bob Thresher (National Renewable Energy Laboratory) in 2015 and approved by the OES Executive Committee EXCO in 2016. The kickoff workshop took place in September 2016, wherein the initial baseline task was defined. Experience from similar offshore wind validation/verification projects (OC3-OC5 conducted within the International Energy Agency Wind Task 30) [1], [2] showed that a simple test case would help the initial cooperation to present results in a comparable way. A heaving sphere was chosen as the first test case. The team of project participants simulated different numerical experiments, such as heave decay tests and regular and irregular wave cases. The simulation results are presented and discussed in this paper.
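
    As a rough illustration of the kind of numerical experiment compared in such studies, the sketch below integrates a linearised heave decay of a floating sphere (hydrostatic stiffness from the waterplane area, constant added mass, linear radiation damping) and reads off the damping ratio from the logarithmic decrement. All coefficients are made-up placeholders, not Task 10 values.

        import numpy as np

        # Placeholder coefficients for a linearised heaving sphere (SI units).
        rho, g = 1025.0, 9.81
        radius = 2.5
        mass = 0.5 * rho * (4.0 / 3.0) * np.pi * radius**3   # half-submerged sphere
        added_mass = 0.5 * mass                              # crude constant added mass
        stiffness = rho * g * np.pi * radius**2              # waterplane hydrostatic stiffness
        damping = 5.0e3                                      # linear radiation damping

        def heave_decay(z0, dt=1e-3, t_end=60.0):
            """Integrate (m+a) z'' + b z' + k z = 0 with semi-implicit Euler."""
            n = int(t_end / dt)
            z, v = np.empty(n), np.empty(n)
            z[0], v[0] = z0, 0.0
            for i in range(1, n):
                acc = -(damping * v[i - 1] + stiffness * z[i - 1]) / (mass + added_mass)
                v[i] = v[i - 1] + dt * acc
                z[i] = z[i - 1] + dt * v[i]
            return z

        z = heave_decay(z0=1.0)
        peaks = z[(z > np.roll(z, 1)) & (z > np.roll(z, -1)) & (z > 0)]   # positive local maxima
        log_dec = np.log(peaks[0] / peaks[1])
        print("damping ratio from log decrement:", log_dec / np.sqrt(4 * np.pi**2 + log_dec**2))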

  11. High-resolution face verification using pore-scale facial features.

    PubMed

    Li, Dong; Zhou, Huiling; Lam, Kin-Man

    2015-08-01

    Face recognition methods, which usually represent face images using holistic or local facial features, rely heavily on alignment. Their performance also suffers severe degradation under variations in expression or pose, especially when there is only one gallery image per subject. With the easy access to high-resolution (HR) face images nowadays, some HR face databases have recently been developed. However, few studies have tackled the use of HR information for face recognition or verification. In this paper, we propose a pose-invariant face-verification method, robust to alignment errors, that uses HR information in the form of pore-scale facial features. A new keypoint descriptor, namely pore-Principal Component Analysis (PCA)-Scale Invariant Feature Transform (PPCASIFT), adapted from PCA-SIFT, is devised for the extraction of a compact set of distinctive pore-scale facial features. Having matched the pore-scale features of two face regions, an effective robust-fitting scheme is proposed for the face-verification task. Experiments show that, with only one frontal-view gallery image per subject, our proposed method outperforms a number of standard verification methods, and can achieve excellent accuracy even when the faces are subject to large variations in expression and pose.
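
    A hedged sketch of the general matching step that such keypoint-based approaches rely on: nearest-neighbour descriptor matching with a ratio test, operating on precomputed descriptor arrays. The PPCASIFT extraction itself is not reproduced here, and the descriptor dimension and threshold are assumptions, not values from the paper.

        import numpy as np

        def ratio_test_matches(desc_a, desc_b, ratio=0.8):
            """Match descriptor rows of desc_a to desc_b, keeping only unambiguous matches.

            desc_a, desc_b: (n, d) and (m, d) arrays of keypoint descriptors.
            Returns a list of (index_in_a, index_in_b) pairs passing the ratio test.
            """
            matches = []
            for i, d in enumerate(desc_a):
                dists = np.linalg.norm(desc_b - d, axis=1)
                order = np.argsort(dists)
                best, second = dists[order[0]], dists[order[1]]
                if best < ratio * second:          # reject ambiguous correspondences
                    matches.append((i, int(order[0])))
            return matches

        # Illustrative usage with random stand-ins for pore-scale descriptors.
        rng = np.random.default_rng(0)
        gallery = rng.normal(size=(200, 36))
        probe = gallery[:50] + 0.05 * rng.normal(size=(50, 36))   # noisy copies of 50 keypoints
        print(len(ratio_test_matches(probe, gallery)), "matches out of 50 probe keypoints")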

  12. Multi-mode energy management strategy for fuel cell electric vehicles based on driving pattern identification using learning vector quantization neural network algorithm

    NASA Astrophysics Data System (ADS)

    Song, Ke; Li, Feiqiang; Hu, Xiao; He, Lin; Niu, Wenxu; Lu, Sihao; Zhang, Tong

    2018-06-01

    The development of fuel cell electric vehicles can, to a certain extent, alleviate worldwide energy and environmental issues. Since a single energy management strategy cannot cope with the complex road conditions of an actual vehicle, this article proposes a multi-mode energy management strategy for electric vehicles with a fuel cell range extender based on driving-pattern identification, comprising a driving-pattern recognizer and a multi-mode energy management controller. The paper introduces a learning vector quantization (LVQ) neural network to design the driving-pattern recognizer from a vehicle's driving information. The multi-mode strategy can automatically switch to the genetic-algorithm-optimized thermostat strategy under specific driving conditions, according to the differences in the pattern recognition results. Simulation experiments were carried out after the model's validity had been verified on a dynamometer test bench. The simulation results show that the proposed strategy achieves better economic performance than the single-mode thermostat strategy under dynamic driving conditions.
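
    A minimal LVQ1 sketch of the pattern-recognizer idea: one prototype per driving-pattern class is nudged toward (or away from) feature vectors such as mean speed and acceleration variability. The features, class labels and learning rate are illustrative assumptions, not the paper's values.

        import numpy as np

        def train_lvq1(features, labels, n_classes, epochs=30, lr=0.05, seed=0):
            """Learning vector quantization (LVQ1) with one prototype per class."""
            rng = np.random.default_rng(seed)
            protos = np.array([features[labels == c].mean(axis=0) for c in range(n_classes)])
            for _ in range(epochs):
                for i in rng.permutation(len(features)):
                    x, y = features[i], labels[i]
                    w = np.argmin(np.linalg.norm(protos - x, axis=1))  # winning prototype
                    sign = 1.0 if w == y else -1.0                     # attract or repel
                    protos[w] += sign * lr * (x - protos[w])
            return protos

        def classify(protos, x):
            return int(np.argmin(np.linalg.norm(protos - x, axis=1)))

        # Illustrative features: [mean speed (km/h), std of acceleration (m/s^2)].
        rng = np.random.default_rng(1)
        urban = rng.normal([25.0, 1.2], [5.0, 0.2], size=(100, 2))     # class 0
        highway = rng.normal([95.0, 0.4], [8.0, 0.1], size=(100, 2))   # class 1
        X = np.vstack([urban, highway])
        y = np.array([0] * 100 + [1] * 100)
        protos = train_lvq1(X, y, n_classes=2)
        print(classify(protos, np.array([30.0, 1.0])), classify(protos, np.array([100.0, 0.3])))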

  13. Development and preliminary verification of the 3D core neutronic code: COCO

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, H.; Mo, K.; Li, W.

    With its recent booming economic growth and the environmental concerns that follow, China is proactively pushing forward nuclear power development and encouraging the tapping of clean energy. Under this situation, CGNPC, as one of the largest energy enterprises in China, is planning to develop its own nuclear technology in order to support the growing number of nuclear plants either under construction or in operation. This paper introduces the recent progress in software development at CGNPC. The focus is placed on the physical models and preliminary verification results from the recent development of the 3D core neutronic code COCO. In the COCO code, the non-linear Green's function method is employed to calculate the neutron flux. In order to use the discontinuity factor, the Neumann (second kind) boundary condition is utilized in the Green's function nodal method. Additionally, the COCO code includes the necessary physical models, e.g. a single-channel thermal-hydraulic module, burnup module, pin power reconstruction module and cross-section interpolation module. The preliminary verification results show that the COCO code is sufficient for reactor core design and analysis for pressurized water reactors (PWR). (authors)

  14. Clinical verification in homeopathy and allergic conditions.

    PubMed

    Van Wassenhoven, Michel

    2013-01-01

    The literature on clinical research in allergic conditions treated with homeopathy includes a meta-analysis of randomised controlled trials (RCT) for hay fever with positive conclusions and two positive RCTs in asthma. Cohort surveys using validated Quality of Life questionnaires have shown improvement in asthma in children, general allergic conditions and skin diseases. Economic surveys have shown positive results in eczema, allergy, seasonal allergic rhinitis, asthma, food allergy and chronic allergic rhinitis. This paper reports clinical verification of homeopathic symptoms in all patients and especially in various allergic conditions in my own primary care practice. For preventive treatments in hay fever patients, Arsenicum album was the most effective homeopathic medicine followed by Nux vomica, Pulsatilla pratensis, Gelsemium, Sarsaparilla, Silicea and Natrum muriaticum. For asthma patients, Arsenicum iodatum appeared most effective, followed by Lachesis, Calcarea arsenicosa, Carbo vegetabilis and Silicea. For eczema and urticaria, Mezereum was most effective, followed by Lycopodium, Sepia, Arsenicum iodatum, Calcarea carbonica and Psorinum. The choice of homeopathic medicine depends on the presence of other associated symptoms and 'constitutional' features. Repertories should be updated by including results of such clinical verifications of homeopathic prescribing symptoms. Copyright © 2012 The Faculty of Homeopathy. Published by Elsevier Ltd. All rights reserved.

  15. Industrial methodology for process verification in research (IMPROVER): toward systems biology verification

    PubMed Central

    Meyer, Pablo; Hoeng, Julia; Rice, J. Jeremy; Norel, Raquel; Sprengel, Jörg; Stolle, Katrin; Bonk, Thomas; Corthesy, Stephanie; Royyuru, Ajay; Peitsch, Manuel C.; Stolovitzky, Gustavo

    2012-01-01

    Motivation: Analyses and algorithmic predictions based on high-throughput data are essential for the success of systems biology in academic and industrial settings. Organizations, such as companies and academic consortia, conduct large multi-year scientific studies that entail the collection and analysis of thousands of individual experiments, often over many physical sites and with internal and outsourced components. To extract maximum value, the interested parties need to verify the accuracy and reproducibility of data and methods before the initiation of such large multi-year studies. However, systematic and well-established verification procedures do not exist for automated collection and analysis workflows in systems biology, which could lead to inaccurate conclusions. Results: We present here a review of the current state of systems biology verification and a detailed methodology to address its shortcomings. This methodology, named ‘Industrial Methodology for Process Verification in Research’ or IMPROVER, consists of evaluating a research program by dividing a workflow into smaller building blocks that are individually verified. The verification of each building block can be done internally by members of the research program or externally by ‘crowd-sourcing’ to an interested community. www.sbvimprover.com Implementation: This methodology could become the preferred choice to verify systems biology research workflows that are becoming increasingly complex and sophisticated in industrial and academic settings. Contact: gustavo@us.ibm.com PMID:22423044

  16. Observations on CFD Verification and Validation from the AIAA Drag Prediction Workshops

    NASA Technical Reports Server (NTRS)

    Morrison, Joseph H.; Kleb, Bil; Vassberg, John C.

    2014-01-01

    The authors provide observations from the AIAA Drag Prediction Workshops that have spanned over a decade and from a recent validation experiment at NASA Langley. These workshops provide an assessment of the predictive capability of forces and moments, focused on drag, for transonic transports. It is very difficult to manage the consistency of results in a workshop setting to perform verification and validation at the scientific level, but it may be sufficient to assess it at the level of practice. Observations thus far: 1) due to simplifications in the workshop test cases, wind tunnel data are not necessarily the “correct” results that CFD should match, 2) an average of core CFD data is not necessarily a better estimate of the true solution, as it is merely an average of other solutions and has many coupled sources of variation, 3) outlier solutions should be investigated and understood, and 4) the DPW series does not have the systematic build-up and definition on both the computational and experimental sides that is required for detailed verification and validation. Several observations regarding the importance of the grid, effects of physical modeling, benefits of open forums, and guidance for validation experiments are discussed. The increased variation in results when predicting regions of flow separation, and the increased variation due to interaction effects, e.g., between fuselage and horizontal tail, point out the need for validation data sets for these important flow phenomena. Experiences with a recent validation experiment at NASA Langley are included to provide guidance on validation experiments.

  17. Is it Code Imperfection or 'garbage in Garbage Out'? Outline of Experiences from a Comprehensive Adr Code Verification

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F. A.

    2013-12-01

    The ADR equation describes many physical phenomena of interest in the field of water quality in natural streams and groundwater. In many cases, such as density-driven flow, multiphase reactive transport, and sediment transport, one or more terms in the ADR equation may become nonlinear. For that reason, numerical tools are the only practical choice for solving these PDEs. All numerical solvers developed for the transport equation need to undergo a code verification procedure before they are put into practice. Code verification is a mathematical activity to uncover failures and check for rigorous discretization of PDEs and implementation of initial/boundary conditions. In the context of computational PDEs, verification is not a well-defined procedure with a clear path. Thus, verification tests should be designed and implemented with in-depth knowledge of the numerical algorithms and the physics of the phenomena, as well as the mathematical behavior of the solution. Even test results need to be analyzed mathematically to distinguish between an inherent limitation of an algorithm and a coding error. Therefore, it is well known that code verification is a state-of-the-art activity in which innovative methods and case-based tricks are very common. This study presents the full verification of a general transport code. To that end, a complete test suite was designed to probe the ADR solver comprehensively and discover all possible imperfections. In this study we convey our experiences in finding several errors which were not detectable with routine verification techniques. We developed a test suite including hundreds of unit tests and system tests. The test package increases gradually in complexity, starting from simple tests and building up to the most sophisticated level. Appropriate verification metrics are defined for the required capabilities of the solver as follows: mass conservation, convergence order, capability in handling stiff problems, nonnegative concentration, shape preservation, and spurious wiggles. Thereby, we provide objective, quantitative values as opposed to subjective qualitative descriptions such as 'weak' or 'satisfactory' agreement with those metrics. We start testing from a simple case of unidirectional advection, then bidirectional advection and tidal flow, and build up to nonlinear cases. Tests are designed to check nonlinearity in velocity, dispersivity and reactions. For all of the mentioned cases we conduct mesh convergence tests, which compare the results' observed order of accuracy with the formal order of accuracy of the discretization. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh convergence study, and appropriate remedies, are also discussed. For the cases in which appropriate benchmarks for a mesh convergence study are not available, we utilize symmetry, complete Richardson extrapolation and the method of false injection to uncover bugs. Detailed discussions of the capabilities of the mentioned code verification techniques are given. Auxiliary subroutines for automation of the test suite and report generation were designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into the ADR solver's capabilities. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport.
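
    One of the central checks mentioned above, comparing the observed order of accuracy with the formal order of the discretization, reduces to a simple computation once errors on successively refined meshes are available. The sketch below shows that calculation for a generic centred-difference example; it is only an illustration of the principle, not the authors' test suite.

        import numpy as np

        def observed_order(errors, refinement_ratio=2.0):
            """Observed order of accuracy from errors on successively refined meshes."""
            e = np.asarray(errors, dtype=float)
            return np.log(e[:-1] / e[1:]) / np.log(refinement_ratio)

        # Illustrative convergence study: error of a centred-difference derivative
        # of sin(x) at x = 1, which should converge at second order.
        def disc_error(h):
            x = 1.0
            return abs((np.sin(x + h) - np.sin(x - h)) / (2 * h) - np.cos(x))

        hs = [0.1 / 2**k for k in range(5)]
        errs = [disc_error(h) for h in hs]
        print(observed_order(errs))   # entries should approach the formal order, 2.0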

  18. The concept verification testing of materials science payloads

    NASA Technical Reports Server (NTRS)

    Griner, C. S.; Johnston, M. H.; Whitaker, A.

    1976-01-01

    The Concept Verification Testing (CVT) project at the Marshall Space Flight Center, Alabama, is a developmental activity that supports Shuttle payload projects such as Spacelab. It provides an operational 1-g environment for testing NASA and other-agency experiment and support system concepts that may be used in the Shuttle. A dedicated materials science payload was tested in the General Purpose Laboratory to assess the requirements of a space processing payload on a Spacelab-type facility. Physical and functional integration of the experiments into the facility was studied, and the impact of the experiments on the facility (and vice versa) was evaluated. A follow-up test, designated CVT Test IVA, was also held. The purpose of this test was to repeat the Test IV experiments with a crew composed of selected and trained scientists. These personnel were not required to have prior knowledge of the materials science disciplines, but were required to have a basic knowledge of science and the scientific method.

  19. The migration response to the Legal Arizona Workers Act

    PubMed Central

    Ellis, Mark; Wright, Richard; Townley, Matthew; Copeland, Kristy

    2014-01-01

    The 2008 Legal Arizona Workers Act (LAWA) requires all public and private employers to authenticate the legal status of their workers using the federal employment verification system known as E-Verify. With LAWA, Arizona became the first state to have a universal mandate for employment verification. While LAWA targets unauthorized workers, most of whom are Latino immigrants, other groups could experience LAWA’s effects, such as those who share households with undocumented workers. In addition, employers may seek to minimize their risk of LAWA penalties by not hiring those who appear to them as more likely to be unauthorized, such as naturalized Latino immigrants and US-born Latinos. Existing research has found a reduction in foreign-born Latino employment and population in response to LAWA. This paper asks a different question: have groups that are most likely to be affected by the law migrated to other states? We find a significant and sustained increase in the internal outmigration rate from Arizona of foreign-born, noncitizen Latinos - the group most likely to include the unauthorized - after the passage of LAWA. There was no significant LAWA internal migration response by foreign-born Latino citizens. US-born Latinos showed some signs of a LAWA-induced internal migration response after the law went into effect, but it is not sustained. The results indicate that local and state immigration policy can alter the settlement geography of the foreign born. This leads us to speculate about how immigrant settlement may adjust in the coming years to the intersecting geographies of post-recession economic opportunity and tiered immigration policies. PMID:25018590

  20. Engineering support activities for the Apollo 17 Surface Electrical Properties Experiment.

    NASA Technical Reports Server (NTRS)

    Cubley, H. D.

    1972-01-01

    Description of the engineering support activities which were required to ensure fulfillment of objectives specified for the Apollo 17 SEP (Surface Electrical Properties) Experiment. Attention is given to procedural steps involving verification of hardware acceptability to the astronauts, computer simulation of the experiment hardware, field trials, receiver antenna pattern measurements, and the qualification test program.

  1. An Investigation of the Effects of Relevant Samples and a Comparison of Verification versus Discovery Based Lab Design

    ERIC Educational Resources Information Center

    Rieben, James C., Jr.

    2010-01-01

    This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity of solutions experiment and an upper level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect…

  2. Collaborative Localization and Location Verification in WSNs

    PubMed Central

    Miao, Chunyu; Dai, Guoyong; Ying, Kezhen; Chen, Qingzhang

    2015-01-01

    Localization is one of the most important technologies in wireless sensor networks. A lightweight distributed node localization scheme is proposed by considering the limited computational capacity of WSNs. The proposed scheme introduces the virtual force model to determine the location by incremental refinement. Aiming at solving the drifting problem and the malicious anchor problem, a location verification algorithm based on the virtual force model is presented. In addition, an anchor promotion algorithm using the localization reliability model is proposed to re-locate the drifted nodes. Extended simulation experiments indicate that the localization algorithm has relatively high precision and the location verification algorithm has relatively high accuracy. The communication overhead of these algorithms is relatively low, and the whole set of reliable localization methods is practical as well as comprehensive. PMID:25954948
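
    A rough sketch of the virtual-force idea: an unknown node iteratively moves along the sum of spring-like forces that push or pull it toward consistency with the ranges measured to a few anchors. The anchor positions, ranges and step size below are invented for illustration and are not the paper's algorithm parameters.

        import numpy as np

        def virtual_force_refine(anchors, ranges, guess, step=0.2, iters=200):
            """Incrementally refine a node position estimate from anchor range measurements."""
            pos = np.array(guess, dtype=float)
            for _ in range(iters):
                force = np.zeros(2)
                for a, r in zip(anchors, ranges):
                    diff = pos - a
                    dist = np.linalg.norm(diff) + 1e-9
                    # Spring force: a positive residual pushes the node away from the anchor,
                    # a negative residual pulls it closer.
                    force += (r - dist) * diff / dist
                pos += step * force
            return pos

        anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
        true_pos = np.array([4.0, 3.0])
        ranges = np.linalg.norm(anchors - true_pos, axis=1) + 0.1  # noisy range measurements
        print(virtual_force_refine(anchors, ranges, guess=[5.0, 5.0]))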

  3. Verification of Space Station Secondary Power System Stability Using Design of Experiment

    NASA Technical Reports Server (NTRS)

    Karimi, Kamiar J.; Booker, Andrew J.; Mong, Alvin C.; Manners, Bruce

    1998-01-01

    This paper describes analytical methods used in verification of large DC power systems with applications to the International Space Station (ISS). Large DC power systems contain many switching power converters with negative resistance characteristics. The ISS power system presents numerous challenges with respect to system stability such as complex sources and undefined loads. The Space Station program has developed impedance specifications for sources and loads. The overall approach to system stability consists of specific hardware requirements coupled with extensive system analysis and testing. Testing of large complex distributed power systems is not practical due to the size and complexity of the system. Computer modeling has been extensively used to develop hardware specifications as well as to identify system configurations for lab testing. The statistical method of Design of Experiments (DoE) is used as an analysis tool for verification of these large systems. DoE reduces the number of computer runs which are necessary to analyze the performance of a complex power system consisting of hundreds of DC/DC converters. DoE also provides valuable information about the effect of changes in system parameters on the performance of the system. DoE provides information about various operating scenarios and identifies those with potential for instability. In this paper we will describe how we have used computer modeling to analyze a large DC power system. A brief description of DoE is given. Examples of applications of DoE to analysis and verification of the ISS power system are provided.
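
    A minimal sketch of the kind of two-level factorial screening that DoE enables is given below: each factor is run at a low and a high level, a stand-in stability metric is evaluated for every combination, and main effects are ranked. The factor names, ranges, and the placeholder metric are assumptions for illustration only; the actual ISS analysis relied on detailed converter models.

        # Hypothetical sketch: two-level full factorial screening of parameters that
        # might influence DC power system stability, ranking main effects.
        from itertools import product

        factors = {                                  # illustrative (low, high) ranges
            "source_impedance_ohm": (0.05, 0.20),
            "filter_capacitance_F": (100e-6, 470e-6),
            "load_power_W": (500.0, 2000.0),
            "cable_inductance_H": (1e-6, 10e-6),
        }

        def damping_margin(p):
            # Placeholder for a real system model or simulation run.
            return (p["filter_capacitance_F"] * 1e4 - p["load_power_W"] * 1e-4
                    - p["source_impedance_ohm"] - p["cable_inductance_H"] * 1e5)

        names = list(factors)
        runs = []
        for levels in product((0, 1), repeat=len(names)):     # 2^k design points
            point = {n: factors[n][lv] for n, lv in zip(names, levels)}
            runs.append((levels, damping_margin(point)))

        # Main effect of each factor: mean response at its high level minus at its low level.
        for i, n in enumerate(names):
            hi = [y for lv, y in runs if lv[i] == 1]
            lo = [y for lv, y in runs if lv[i] == 0]
            print(f"{n:>22s}: effect = {sum(hi)/len(hi) - sum(lo)/len(lo):+.3f}")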

  4. The End-To-End Safety Verification Process Implemented to Ensure Safe Operations of the Columbus Research Module

    NASA Astrophysics Data System (ADS)

    Arndt, J.; Kreimer, J.

    2010-09-01

    The European Space Laboratory COLUMBUS was launched in February 2008 aboard NASA Space Shuttle Atlantis. Since its successful docking and activation, this manned laboratory has formed part of the International Space Station (ISS). Depending on the objectives of the Mission Increments, the on-orbit configuration of the COLUMBUS Module varies with each increment. This paper describes the end-to-end verification that has been implemented to ensure safe operations under a changing on-orbit configuration. That verification process has to cover not only the configuration changes foreseen by the Mission Increment planning but also configuration changes on short notice that become necessary due to near real-time requests initiated by crew or Flight Control, and changes - most challenging since unpredictable - due to on-orbit anomalies. The subject of the safety verification is, on one hand, the on-orbit configuration itself, including the hardware and software products, and on the other hand the related ground facilities needed for commanding of and communication with the on-orbit system. The operational products, e.g. the procedures prepared for crew and ground control in accordance with increment planning, are also subject to the overall safety verification. In order to analyse the on-orbit configuration for potential hazards and to verify the implementation of the related safety-required hazard controls, a hierarchical approach is applied. The key element of the analytical safety integration of the whole COLUMBUS Payload Complement, including hardware owned by International Partners, is the Integrated Experiment Hazard Assessment (IEHA). The IEHA especially identifies those hazardous scenarios which could potentially arise through physical and operational interaction of experiments. A major challenge is the implementation of a safety process that is rigid enough to provide reliable verification of on-board safety and yet flexible enough for manned space operations with scientific objectives. In the period of COLUMBUS operations since launch, a number of lessons learned have already been implemented, especially in the IEHA, which improve the flexibility of on-board operations without degrading safety.

  5. Computer Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pronskikh, V. S.

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of the scientific practice of high importance and have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to a model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or a model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation, generally applicable in practice and based on differences in epistemic strategies and scopes.

  6. Verification of road databases using multiple road models

    NASA Astrophysics Data System (ADS)

    Ziems, Marcel; Rottensteiner, Franz; Heipke, Christian

    2017-08-01

    In this paper a new approach for automatic road database verification based on remote sensing images is presented. In contrast to existing methods, the applicability of the new approach is not restricted to specific road types, context areas or geographic regions. This is achieved by combining several state-of-the-art road detection and road verification approaches that work well under different circumstances. Each one serves as an independent module representing a unique road model and a specific processing strategy. All modules provide independent solutions for the verification problem of each road object stored in the database in the form of two probability distributions, the first one for the state of a database object (correct or incorrect), and a second one for the state of the underlying road model (applicable or not applicable). In accordance with the Dempster-Shafer Theory, both distributions are mapped to a new state space comprising the classes correct, incorrect and unknown. Statistical reasoning is applied to obtain the optimal state of a road object. A comparison with state-of-the-art road detection approaches using benchmark datasets shows that in general the proposed approach provides results with higher completeness. Additional experiments reveal that, based on the proposed method, a highly reliable semi-automatic approach for road database verification can be designed.
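
    The Dempster-Shafer combination step described above can be illustrated with a small sketch in which each module assigns mass to "correct", "incorrect", or the whole frame (playing the role of "unknown"). The module outputs below are made-up numbers, not results from the paper.

        # Hypothetical sketch: Dempster's rule of combination over {correct, incorrect},
        # with mass on the whole frame standing in for "unknown" (model not applicable).
        FRAME = frozenset({"correct", "incorrect"})

        def combine(m1, m2):
            """Combine two mass functions given as dicts over frozensets of states."""
            out = {frozenset({"correct"}): 0.0, frozenset({"incorrect"}): 0.0, FRAME: 0.0}
            conflict = 0.0
            for a, wa in m1.items():
                for b, wb in m2.items():
                    inter = a & b
                    if inter:
                        out[inter] += wa * wb
                    else:
                        conflict += wa * wb            # mass assigned to disjoint states
            return {s: w / (1.0 - conflict) for s, w in out.items()}

        # Module 1: confident road model leaning "correct"; module 2: weak model, mostly unknown.
        m1 = {frozenset({"correct"}): 0.7, frozenset({"incorrect"}): 0.1, FRAME: 0.2}
        m2 = {frozenset({"correct"}): 0.2, frozenset({"incorrect"}): 0.1, FRAME: 0.7}
        for subset, mass in combine(m1, m2).items():
            print(sorted(subset), round(mass, 3))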

  7. A Protocol-Analytic Study of Metacognition in Mathematical Problem Solving.

    ERIC Educational Resources Information Center

    Cai, Jinfa

    1994-01-01

    Metacognitive behaviors of subjects having high (n=2) and low (n=2) levels of mathematical experience were compared across four cognitive processes in mathematical problem solving: orientation, organization, execution, and verification. High-experience subjects engaged in self-regulation and spent more time on orientation and organization. (36…

  8. 40 CFR 80.1450 - What are the registration requirements under the RFS program?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... professional work experience in the chemical engineering field or related to renewable fuel production. (B) For... third-party engineering review and written report and verification of the information provided pursuant... professional engineer licensed in the United States with professional work experience in the chemical...

  9. Verification of the modified model of drying process of a polymer liquid film on a flat substrate by experiment (3) - using organic solvent

    NASA Astrophysics Data System (ADS)

    Kagami, Hiroyuki

    2007-05-01

    We have proposed and modified a model of the drying process of a polymer solution coated on a flat substrate for flat polymer film fabrication, and have presented the results through Photomask Japan 2002, 2003, 2004, Smart Materials, Nano-, and Micro-Smart Systems 2006, and so on. For example, numerical simulation of the model qualitatively reproduces a typical thickness profile of the polymer film formed after drying, that is, a profile in which the edge of the film is thicker and the region just next to the edge's bump is thinner. We have also clarified the dependence of the distribution of polymer molecules on a flat substrate on various parameters, based on analysis of many numerical simulations. We then performed several kinds of experiments to verify the modified model and reported the results through Photomask Japan 2005 and 2006. We observed some results supporting the modified model, but we could not observe the characteristic valley region next to the edge's bump of the polymer film after drying. After trials of various improved experiments, we concluded that the characteristic region did not appear because water, which vaporizes more slowly than organic solvents, was used as the solvent. In this study we therefore adopted an organic solvent instead of water for the experiments. As a result, the characteristic region mentioned above could be seen, and we could verify the model more accurately. In this paper, we present verification of the model through these improved experiments using an organic solvent.

  10. Can Economics Provide Insights into Trust Infrastructure?

    NASA Astrophysics Data System (ADS)

    Vishik, Claire

    Many security technologies require infrastructure for authentication, verification, and other processes. In many cases, viable and innovative security technologies are never adopted on a large scale because the necessary infrastructure is slow to emerge. Analyses of such technologies typically focus on their technical flaws, and research emphasizes innovative approaches to stronger implementation of the core features. However, it can be observed that in many cases the success of the adoption pattern depends on non-technical issues rather than on the technology: lack of economic incentives, difficulties in finding initial investment, and inadequate government support. While a growing body of research is dedicated to the economics of security and privacy in general, few theoretical studies in this area have been completed, and even fewer look at the economics of “trust infrastructure” beyond simple “cost of ownership” models. This exploratory paper takes a look at some approaches in theoretical economics to determine if they can provide useful insights into security infrastructure technologies and architectures that have the best chance to be adopted. We attempt to discover if models used in theoretical economics can help inform technology developers of the optimal business models that offer a better chance for quick infrastructure deployment.

  11. Acoustic emissions verification testing of International Space Station experiment racks at the NASA Glenn Research Center Acoustical Testing Laboratory

    NASA Astrophysics Data System (ADS)

    Akers, James C.; Passe, Paul J.; Cooper, Beth A.

    2005-09-01

    The Acoustical Testing Laboratory (ATL) at the NASA John H. Glenn Research Center (GRC) in Cleveland, OH, provides acoustic emission testing and noise control engineering services for a variety of specialized customers, particularly developers of equipment and science experiments manifested for NASA's manned space missions. The ATL's primary customer has been the Fluids and Combustion Facility (FCF), a multirack microgravity research facility being developed at GRC for the USA Laboratory Module of the International Space Station (ISS). Since opening in September 2000, ATL has conducted acoustic emission testing of components, subassemblies, and partially populated FCF engineering model racks. The culmination of this effort has been the acoustic emission verification tests on the FCF Combustion Integrated Rack (CIR) and Fluids Integrated Rack (FIR), employing a procedure that incorporates ISO 11201 ("Acoustics-Noise emitted by machinery and equipment-Measurement of emission sound pressure levels at a work station and at other specified positions-Engineering method in an essentially free field over a reflecting plane"). This paper will provide an overview of the test methodology, software, and hardware developed to perform the acoustic emission verification tests on the CIR and FIR flight racks and lessons learned from these tests.

  12. CFD modeling and experimental verification of a single-stage coaxial Stirling-type pulse tube cryocooler without either double-inlet or multi-bypass operating at 30-35 K using mixed stainless steel mesh regenerator matrices

    NASA Astrophysics Data System (ADS)

    Dang, Haizheng; Zhao, Yibo

    2016-09-01

    This paper presents the CFD modeling and experimental verification of a single-stage inertance tube coaxial Stirling-type pulse tube cryocooler operating at 30-35 K using mixed stainless steel mesh regenerator matrices without either double-inlet or multi-bypass. A two-dimensional axisymmetric CFD model with the thermal non-equilibrium mode is developed to simulate the internal process, and the underlying mechanism by which mixed matrices significantly reduce the regenerator losses is discussed in detail based on the given six cases. The modeling also indicates that the combination of the given mesh segments can be optimized to achieve the highest cooling efficiency or the largest exergy ratio, and verification experiments are then conducted in which satisfactory agreement between simulated and measured results is observed. The experiments achieve a no-load temperature of 27.2 K and a cooling power of 0.78 W at 35 K, or 0.29 W at 30 K, with an input electric power of 220 W and a reject temperature of 300 K.
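
    From the reported operating points, one common figure of merit can be checked quickly: the fraction of Carnot efficiency, (Qc/Pin)*(Th - Tc)/Tc. The small sketch below uses only the numbers quoted in the abstract; the paper's own exergy-ratio definition may differ.

        # Quick check of percent-of-Carnot from the reported numbers; the paper's
        # exergy ratio may be defined differently, so treat this as illustrative.
        def percent_of_carnot(Qc, Pin, Tc, Th):
            return 100.0 * (Qc / Pin) * (Th - Tc) / Tc

        print(f"{percent_of_carnot(Qc=0.78, Pin=220.0, Tc=35.0, Th=300.0):.2f} % of Carnot at 35 K")
        print(f"{percent_of_carnot(Qc=0.29, Pin=220.0, Tc=30.0, Th=300.0):.2f} % of Carnot at 30 K")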

  13. Verification of Space Weather Forecasts using Terrestrial Weather Approaches

    NASA Astrophysics Data System (ADS)

    Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.

    2015-12-01

    The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasters, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times, assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center - a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as rank probability skill score, and comparing forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help MOSWOC forecasters view verification results in near real-time; plans to objectively assess flare forecasts under the EU Horizon 2020 FLARECAST project; and summarise ISES efforts to achieve consensus on verification.
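
    The 2x2 contingency-table verification mentioned above reduces to a handful of standard scores once forecasts and observations are paired into hits, false alarms, misses, and correct rejections. The sketch below computes the probability of detection, false alarm ratio, and Heidke skill score; the counts are hypothetical, not MOSWOC statistics.

        # Hypothetical sketch: standard measures from a 2x2 contingency table with
        # hits a, false alarms b, misses c, and correct rejections d.
        def contingency_scores(a, b, c, d):
            n = a + b + c + d
            pod = a / (a + c)                         # probability of detection (hit rate)
            far = b / (a + b)                         # false alarm ratio
            expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n
            hss = (a + d - expected) / (n - expected) # Heidke skill score vs. random chance
            return pod, far, hss

        pod, far, hss = contingency_scores(a=30, b=10, c=8, d=52)   # made-up counts
        print(f"POD={pod:.2f}  FAR={far:.2f}  HSS={hss:.2f}")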

  14. ESTEST: A Framework for the Verification and Validation of Electronic Structure Codes

    NASA Astrophysics Data System (ADS)

    Yuan, Gary; Gygi, Francois

    2011-03-01

    ESTEST is a verification and validation (V&V) framework for electronic structure codes that supports Qbox, Quantum Espresso, ABINIT, the Exciting Code and plans support for many more. We discuss various approaches to the electronic structure V&V problem implemented in ESTEST that are related to parsing, formats, data management, search, comparison and analyses. Additionally, an early experiment in the distribution of V&V ESTEST servers among the electronic structure community will be presented. Supported by NSF-OCI 0749217 and DOE FC02-06ER25777.

  15. Verification of the quantum dimension effects in electrical conductivity with different topology of laser-induced thin-film structures

    NASA Astrophysics Data System (ADS)

    Arakelian, S.; Kucherik, A.; Kutrovskaya, S.; Osipov, A.; Istratov, A.; Skryabin, I.

    2018-01-01

    A clear physical model for the verification of quantum states in nanocluster structures with jump/tunneling electroconductivity is under study in both theory and experiment. Emphasis is placed on low-dimensional structures in which structural phase transitions occur and a tendency toward strongly enhanced electroconductivity is obtained. The results give us an opportunity to establish a basis for new physical principles for creating functional elements for optoelectronics and photonics in a hybrid set-up (optics + electrophysics) using the nanocluster technology approach.

  16. Experimental verification of a Monte Carlo-based MLC simulation model for IMRT dose calculations in heterogeneous media

    NASA Astrophysics Data System (ADS)

    Tyagi, N.; Curran, B. H.; Roberson, P. L.; Moran, J. M.; Acosta, E.; Fraass, B. A.

    2008-02-01

    IMRT often requires delivering small fields which may suffer from electronic disequilibrium effects. The presence of heterogeneities, particularly low-density tissues in patients, complicates such situations. In this study, we report on verification of the DPM MC code for IMRT treatment planning in heterogeneous media, using a previously developed model of the Varian 120-leaf MLC. The purpose of this study is twofold: (a) design a comprehensive list of experiments in heterogeneous media for verification of any dose calculation algorithm and (b) verify our MLC model in these heterogeneous type geometries that mimic an actual patient geometry for IMRT treatment. The measurements have been done using an IMRT head and neck phantom (CIRS phantom) and slab phantom geometries. Verification of the MLC model has been carried out using point doses measured with an A14 slim line (SL) ion chamber inside a tissue-equivalent and a bone-equivalent material using the CIRS phantom. Planar doses using lung and bone equivalent slabs have been measured and compared using EDR films (Kodak, Rochester, NY).

  17. A software engineering approach to expert system design and verification

    NASA Technical Reports Server (NTRS)

    Bochsler, Daniel C.; Goodwin, Mary Ann

    1988-01-01

    Software engineering design and verification methods for developing expert systems are not yet well defined. Integration of expert system technology into software production environments will require effective software engineering methodologies to support the entire life cycle of expert systems. The software engineering methods used to design and verify an expert system, RENEX, are discussed. RENEX demonstrates autonomous rendezvous and proximity operations, including replanning trajectory events and subsystem fault detection, onboard a space vehicle during flight. The RENEX designers utilized a number of software engineering methodologies to deal with the complex problems inherent in this system. An overview is presented of the methods utilized. Details of the verification process receive special emphasis. The benefits and weaknesses of the methods for supporting the development life cycle of expert systems are evaluated, and recommendations are made based on the overall experiences with the methods.

  18. EPA ENVIRONMENTAL TECHNOLOGY EXPERIENCE

    EPA Science Inventory

    THE USEPA's Environmental Technology Verification for Metal Finishing Pollution Prevention Technologies (ETV-MF) Program verifies the performance of innovative, commercial-ready technologies designed to improve industry performance and achieve cost-effective pollution prevention ...

  19. Reliability of Iris Recognition as a Means of Identity Verification and Future Impact on Transportation Worker Identification Credential

    DTIC Science & Technology

    2008-03-01

    Table-of-contents excerpt only: Time Frame of the Experiment; Eyeglasses; Observed Results; Eyeglasses are a Factor.

  20. Modality Switching Cost during Property Verification by 7 Years of Age

    ERIC Educational Resources Information Center

    Ambrosi, Solene; Kalenine, Solene; Blaye, Agnes; Bonthoux, Francoise

    2011-01-01

    Recent studies in neuroimagery and cognitive psychology support the view of sensory-motor based knowledge: when processing an object concept, neural systems would re-enact previous experiences with this object. In this experiment, a conceptual switching cost paradigm derived from Pecher, Zeelenberg, and Barsalou (2003, 2004) was used to…

  1. Conservation of Mechanical and Electric Energy: Simple Experimental Verification

    ERIC Educational Resources Information Center

    Ponikvar, D.; Planinsic, G.

    2009-01-01

    Two similar experiments on conservation of energy and transformation of mechanical into electrical energy are presented. Both can be used in classes, as they offer numerous possibilities for discussion with students and are simple to perform. Results are presented and are precise within 20% for the version of the experiment where measured values…

  2. Satellite Power Systems (SPS) concept definition study. Volume 6: SPS technology requirements and verification

    NASA Technical Reports Server (NTRS)

    Hanley, G.

    1978-01-01

    Volume 6 of the SPS Concept Definition Study is presented and also incorporates results of NASA/MSFC in-house effort. This volume includes a supporting research and technology summary. Other volumes of the final report that provide additional detail are as follows: (1) Executive Summary; (2) SPS System Requirements; (3) SPS Concept Evolution; (4) SPS Point Design Definition; (5) Transportation and Operations Analysis; and Volume 7, SPS Program Plan and Economic Analysis.

  3. Satellite power systems (SPS) concept definition study. Volume 7: SPS program plan and economic analysis, appendixes

    NASA Technical Reports Server (NTRS)

    Hanley, G.

    1978-01-01

    Three appendixes in support of Volume 7 are contained in this document. The three appendixes are: (1) Satellite Power System Work Breakdown Structure Dictionary; (2) SPS cost Estimating Relationships; and (3) Financial and Operational Concept. Other volumes of the final report that provide additional detail are: Executive Summary; SPS Systems Requirements; SPS Concept Evolution; SPS Point Design Definition; Transportation and Operations Analysis; and SPS Technology Requirements and Verification.

  4. Finding 'paydirt' on the moon and asteroids

    NASA Technical Reports Server (NTRS)

    Staehle, R. L.

    1983-01-01

    Lunar polar region water ice, the Trojan asteroids of the earth, accessible, volatile substance-rich near-earth asteroids, and lunar gas deposits, are theoretically identified extraterrestrial resources for application to space transportation whose existence and economical exploitability could be confirmed by explorations conducted with relatively simple spacecraft. Any of these resources could improve the economics of interorbit transportation, thereby permitting launch vehicle payloads to be devoted to the transport of revenue-generating or services-providing equipment, rather than to the large propellant volumes required for the placing of large payloads on station. Among the verification missions cited is a simple lunar prospector orbiter, carrying a gamma-ray spectrometer and an electromagnetic sounder, which could ascertain the presence of water ice at the lunar poles.

  5. Commander Lousma works with EEVT experiment and cryogenic tube on aft middeck

    NASA Image and Video Library

    1982-03-31

    Commander Jack Lousma works with Electrophoresis Equipment Verification Test (EEVT) electrophoresis unit, cryogenic freezer and tube, and stowage locker equipment located on crew compartment middeck aft bulkhead.

  6. Precision Cleaning and Verification Processes Used at Marshall Space Flight Center for Critical Hardware Applications

    NASA Technical Reports Server (NTRS)

    Caruso, Salvadore V.; Cox, Jack A.; McGee, Kathleen A.

    1999-01-01

    This presentation discusses Marshall Space Flight Center operations and responsibilities. These are propulsion, microgravity experiments, the International Space Station, space transportation systems, and advanced vehicle research.

  7. Creep fatigue life prediction for engine hot section materials (ISOTROPIC)

    NASA Technical Reports Server (NTRS)

    Nelson, R. S.; Schoendorf, J. F.; Lin, L. S.

    1986-01-01

    The specific activities summarized include: verification experiments (base program); thermomechanical cycling model; multiaxial stress state model; cumulative loading model; screening of potential environmental and protective coating models; and environmental attack model.

  8. The 2014 Sandia Verification and Validation Challenge: Problem statement

    DOE PAGES

    Hu, Kenneth; Orient, George

    2016-01-18

    This paper presents a case study in utilizing information from experiments, models, and verification and validation (V&V) to support a decision. It consists of a simple system with data and models provided, plus a safety requirement to assess. The goal is to pose a problem that is flexible enough to allow challengers to demonstrate a variety of approaches, but constrained enough to focus attention on a theme. This was accomplished by providing a good deal of background information in addition to the data, models, and code, but directing the participants' activities with specific deliverables. In this challenge, the theme is how to gather and present evidence about the quality of model predictions, in order to support a decision. This case study formed the basis of the 2014 Sandia V&V Challenge Workshop and this resulting special edition of the ASME Journal of Verification, Validation, and Uncertainty Quantification.

  9. Simulation of Laboratory Tests of Steel Arch Support

    NASA Astrophysics Data System (ADS)

    Horyl, Petr; Šňupárek, Richard; Maršálek, Pavel; Pacześniowski, Krzysztof

    2017-03-01

    The total load-bearing capacity of steel arch yielding roadway supports is among their most important characteristics. These values can be obtained in two ways: experimental measurements in a specialized laboratory or computer modelling by FEM. Experimental measurements are significantly more expensive and more time-consuming. A computer model, however, is very valuable, and for proper tuning it requires verification by experiment. In the cooperating workplaces of GIG Katowice, VSB-Technical University of Ostrava and the Institute of Geonics ASCR this verification was successful. The present article discusses the conditions and results of this verification for static problems. The output is a tuned computer model, which may be used for other calculations to obtain the load-bearing capacity of other types of steel arch supports. The effects of changes in other parameters, such as the material properties of the steel, torque magnitudes, and friction coefficient values, can be determined relatively quickly by changing the properties of the investigated steel arch supports.

  10. IEEE/NASA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation

    NASA Technical Reports Server (NTRS)

    Margaria, Tiziana (Editor); Steffen, Bernhard (Editor); Hichey, Michael G.

    2005-01-01

    This volume contains the Preliminary Proceedings of the 2005 IEEE ISoLA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation, with a special track on the theme of Formal Methods in Human and Robotic Space Exploration. The workshop was held on 23-24 September 2005 at the Loyola College Graduate Center, Columbia, MD, USA. The idea behind the Workshop arose from the experience and feedback of ISoLA 2004, the 1st International Symposium on Leveraging Applications of Formal Methods held in Paphos (Cyprus) last October-November. ISoLA 2004 served the need of providing a forum for developers, users, and researchers to discuss issues related to the adoption and use of rigorous tools and methods for the specification, analysis, verification, certification, construction, test, and maintenance of systems from the point of view of their different application domains.

  11. Study of the penetration of a plate made of titanium alloy VT6 with a steel ball

    NASA Astrophysics Data System (ADS)

    Buzyurkin, A. E.

    2018-03-01

    The purpose of this work is the development and verification of mathematical relationships, adapted to the finite element analysis package LS-DYNA, that describe the deformation and fracture of a titanium plate in a high-speed collision. Using data from experiments on the interaction of a steel ball with a titanium plate made of VT6 alloy, verification was performed of the available constants needed to describe the behavior of the material with the Johnson-Cook relationships, as well as of the parameters of the fracture model used in the numerical modeling of the collision process. An analysis of experimental data on the interaction of a spherical impactor with a plate showed that the deformation-hardening data accepted for VT6 alloy as a first approximation in the Johnson-Cook model overestimate the residual velocities of the impactor when it pierces the plate.
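
    For reference, the Johnson-Cook flow stress mentioned above combines strain hardening, strain-rate hardening, and thermal softening as sigma = (A + B*eps^n)(1 + C*ln(edot/edot0))(1 - T*^m), with T* the homologous temperature. The sketch below evaluates that expression; the constants are generic placeholders, not the VT6 parameters discussed in the record.

        # Illustrative evaluation of the Johnson-Cook flow stress; the constants below
        # are placeholders, not the VT6 (Ti-6Al-4V class) values used in the study.
        import math

        def johnson_cook_stress(eps, edot, T, A=1000e6, B=780e6, n=0.47,
                                C=0.028, m=1.0, edot0=1.0, T_room=293.0, T_melt=1900.0):
            T_star = min(max((T - T_room) / (T_melt - T_room), 0.0), 1.0)
            rate_term = 1.0 + C * math.log(max(edot / edot0, 1e-12))
            return (A + B * eps ** n) * rate_term * (1.0 - T_star ** m)

        # Flow stress at 10% plastic strain, strain rate 1e4 1/s, 600 K (all hypothetical):
        print(f"{johnson_cook_stress(eps=0.1, edot=1e4, T=600.0) / 1e6:.0f} MPa")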

  12. Development of a verification program for deployable truss advanced technology

    NASA Technical Reports Server (NTRS)

    Dyer, Jack E.

    1988-01-01

    Use of large deployable space structures to satisfy the growth demands of space systems is contingent upon reducing the associated risks that pervade many related technical disciplines. The overall objectives of this program were to develop a detailed plan to verify deployable truss advanced technology applicable to future large space structures and to develop a preliminary design of a deployable truss reflector/beam structure for use as a technology demonstration test article. The planning is based on a Shuttle flight experiment program using deployable 5 and 15 meter aperture tetrahedral truss reflectors and a 20 m long deployable truss beam structure. The plan addresses validation of analytical methods, the degree to which ground testing adequately simulates flight, and in-space testing requirements for large precision antenna designs. Based on an assessment of future NASA and DOD space system requirements, the program was developed to verify four critical technology areas: deployment, shape accuracy and control, pointing and alignment, and articulation and maneuvers. The flight experiment technology verification objectives can be met using two shuttle flights with the total experiment integrated on a single Shuttle Test Experiment Platform (STEP) and a Mission Peculiar Experiment Support Structure (MPESS). First flight of the experiment can be achieved 60 months after go-ahead with a total program duration of 90 months.

  13. L band push broom microwave radiometer: Soil moisture verification and time series experiment Delmarva Peninsula

    NASA Technical Reports Server (NTRS)

    Jackson, T. J.; Shiue, J.; Oneill, P.; Wang, J.; Fuchs, J.; Owe, M.

    1984-01-01

    The verification of a multi-sensor aircraft system developed to study soil moisture applications is discussed. This system consisted of a three beam push broom L band microwave radiometer, a thermal infrared scanner, a multispectral scanner, video and photographic cameras, and an onboard navigational instrument. Ten flights were made over agricultural sites in Maryland and Delaware with little or no vegetation cover. Comparisons of aircraft and ground measurements showed that the system was reliable and consistent. Time series analysis of microwave and evaporation data showed a strong similarity that indicates a potential direction for future research.

  14. Damage Detection and Verification System (DDVS) for In-Situ Health Monitoring

    NASA Technical Reports Server (NTRS)

    Williams, Martha K.; Lewis, Mark; Szafran, J.; Shelton, C.; Ludwig, L.; Gibson, T.; Lane, J.; Trautwein, T.

    2015-01-01

    Project presentation for Game Changing Program Smart Book Release. The Damage Detection and Verification System (DDVS) expands the Flat Surface Damage Detection System (FSDDS) sensory panels' damage detection capabilities and includes an autonomous inspection capability utilizing cameras and dynamic computer vision algorithms to verify system health. Objectives of this formulation task are to establish the concept of operations, formulate the system requirements for a potential ISS flight experiment, and develop a preliminary design of an autonomous inspection capability system that will be demonstrated as a proof-of-concept ground-based damage detection and inspection system.

  15. Apollo experience report: Communications system flight evaluation and verification

    NASA Technical Reports Server (NTRS)

    Travis, D.; Royston, C. L., Jr.

    1972-01-01

    Flight tests of the synergetic operation of the spacecraft and earth-based communications equipment were accomplished during Apollo missions AS-202 through Apollo 12. The primary goals of these tests were to verify that the communications system would adequately support lunar landing missions and to establish the inflight communications system performance characteristics. To attain these goals, a communications system flight verification and evaluation team was established. The concept of the team operations, the evolution of the evaluation processes, synopses of the team activities associated with each mission, and major conclusions and recommendations resulting from the performance evaluation are presented.

  16. Distinctive Features Hold a Privileged Status in the Computation of Word Meaning: Implications for Theories of Semantic Memory

    ERIC Educational Resources Information Center

    Cree, George S.; McNorgan, Chris; McRae, Ken

    2006-01-01

    The authors present data from 2 feature verification experiments designed to determine whether distinctive features have a privileged status in the computation of word meaning. They use an attractor-based connectionist model of semantic memory to derive predictions for the experiments. Contrary to central predictions of the conceptual structure…

  17. Neutrino Physics

    DOE R&D Accomplishments Database

    Lederman, L. M.

    1963-01-09

    The prediction and verification of the neutrino are reviewed, together with the V-A theory for its interactions (particularly the difficulties with the apparent existence of two neutrinos and the high energy cross section). The Brookhaven experiment confirming the existence of two neutrinos and the cross section increase with momentum is then described, and future neutrino experiments are considered. (D.C.W.)

  18. The F1000Research: Ebola article collection

    PubMed Central

    Piot, Peter

    2014-01-01

    The explosion of information about Ebola requires rapid publication, transparent verification and unrestricted access. I urge everyone involved in all aspects of the Ebola epidemic to openly and rapidly report their experiences and findings. PMID:25580233

  19. Introduction of Building Information Modeling (BIM) Technologies in Construction

    NASA Astrophysics Data System (ADS)

    Milyutina, M. A.

    2018-05-01

    The issues of introducing building information modeling (BIM) in the construction industry are considered in this work. The advantages of this approach and the prospects for transitioning to new design technologies, construction process management, and operation in the near future are stated. The importance of developing pilot projects that identify the ways and means of verifying the regulatory and technical base, as well as economic indicators, in the transition to building information technologies in construction is noted.

  20. Satellite power system (SPS) concept definition study. Volume 3: Experimental verification definition

    NASA Technical Reports Server (NTRS)

    Hanley, G. M.

    1980-01-01

    An evolutionary Satellite Power Systems development plan was prepared. Planning analysis was directed toward the evolution of a scenario that met the stated objectives, was technically possible and economically attractive, and took into account constraining considerations, such as requirements for very large scale end-to-end demonstration in a compressed time frame, the relative cost/technical merits of ground testing versus space testing, and the need for large mass flow capability to low Earth orbit and geosynchronous orbit at reasonable cost per pound.

  1. Verification and Validation (V&V) Methodologies for Multiphase Turbulent and Explosive Flows. V&V Case Studies of Computer Simulations from Los Alamos National Laboratory GMFIX codes

    NASA Astrophysics Data System (ADS)

    Dartevelle, S.

    2006-12-01

    Large-scale volcanic eruptions are inherently hazardous events and hence cannot be described by detailed and accurate in situ measurements; as a result, volcanic explosive phenomenology is inadequately constrained in terms of initial and inflow conditions. Consequently, little to no real-time data exist to verify and validate computer codes developed to model these geophysical events as a whole. However, code verification and validation remains a necessary step, particularly when volcanologists use numerical data for mitigation of volcanic hazards, as is more often done nowadays. The Verification and Validation (V&V) process formally assesses the level of 'credibility' of numerical results produced within a range of specific applications. The first step, Verification, is 'the process of determining that a model implementation accurately represents the conceptual description of the model', which requires either exact analytical solutions or highly accurate simplified experimental data. The second step, Validation, is 'the process of determining the degree to which a model is an accurate representation of the real world', which requires complex experimental data of the 'real world' physics. The Verification step is rather simple to achieve formally, while, in the 'real world' explosive volcanism context, the second step, Validation, is all but impossible. Hence, instead of validating the computer code against the whole large-scale unconstrained volcanic phenomenology, we suggest focusing on the key physics which control these volcanic clouds, viz., momentum-driven supersonic jets and multiphase turbulence. We propose to compare numerical results against a set of simple but well-constrained analog experiments, which uniquely and unambiguously represent these two key phenomenologies separately. Herewith, we use GMFIX (Geophysical Multiphase Flow with Interphase eXchange, v1.62), a set of multiphase CFD FORTRAN codes, which have been recently redeveloped to meet the strict Quality Assurance, verification, and validation requirements of the Office of Civilian Radioactive Waste Management of the US Dept of Energy. GMFIX solves Navier-Stokes and energy partial differential equations for each phase with appropriate turbulence and interfacial coupling between phases. For momentum-driven single- to multi-phase underexpanded jets, the position of the first Mach disk is known empirically as a function of both the pressure ratio, K, and the particle mass fraction, Phi, at the nozzle. Namely, the higher K, the further downstream the Mach disk, and the higher Phi, the further upstream the first Mach disk. We show that GMFIX captures these two essential features. In addition, GMFIX displays all the properties found in these jets, such as expansion fans, incident and reflected shocks, and subsequent downstream Mach disks, which makes this code ideal for further investigations of equivalent volcanological phenomena. One of the other most challenging aspects of volcanic phenomenology is the multiphase nature of turbulence. We also validated GMFIX by comparing the velocity profiles and turbulence quantities against well-constrained analog experiments. The simulated velocity profiles agree with the analog ones, as do those of the production of turbulent quantities. Overall, the Verification and Validation experiments, although inherently challenging, suggest GMFIX captures the most essential dynamical properties of multiphase and supersonic flows and jets.

  2. Participate or Observe? Effects of Economic Classroom Experiments on Students' Economic Literacy

    ERIC Educational Resources Information Center

    Grol, Roel; Sent, Esther-Mirjam; de Vries, Bregje

    2017-01-01

    Economic classroom experiments are controlled interactive learning exercises targeting the comprehension of economic concepts in an inductive way. Aiming at increasing students' knowledge of economic concepts, two types of economic classroom experiments are examined in a sample of 134 secondary school students. In the interactive research…

  3. The use of robots for arms control treaty verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michalowski, S.J.

    1991-01-01

    Many aspects of the superpower relationship now present a new set of challenges and opportunities, including the vital area of arms control. This report addresses one such possibility: the use of robots for the verification of arms control treaties. The central idea of this report is far from commonly accepted. In fact, it was only encountered once in the bibliographic review phase of the project. Nonetheless, the incentive for using robots is simple and coincides with that of industrial applications: to replace or supplement human activity in the performance of tasks for which human participation is unnecessary, undesirable, impossible, too dangerous or too expensive. As in industry, robots should replace workers (in this case, arms control inspectors) only when questions of efficiency, reliability, safety, security and cost-effectiveness have been answered satisfactorily. In writing this report, it is not our purpose to strongly advocate the application of robots in verification. Rather, we wish to explore the significant aspects, pro and con, of applying experience from the field of flexible automation to the complex task of assuring arms control treaty compliance. We want to establish a framework for further discussion of this topic and to define criteria for evaluating future proposals. The author's expertise is in robots, not arms control. His practical experience has been in developing systems for use in the rehabilitation of severely disabled persons (such as quadriplegics), who can use robots for assistance during activities of everyday living, as well as in vocational applications. This creates a special interest in implementations that, in some way, include a human operator in the control scheme of the robot. As we hope to show in this report, such interactive systems offer the greatest promise of making a contribution to the challenging problems of treaty verification. 15 refs.

  4. Proceedings of the Joint IAEA/CSNI Specialists' Meeting on Fracture Mechanics Verification by Large-Scale Testing held at Pollard Auditorium, Oak Ridge, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pugh, C.E.; Bass, B.R.; Keeney, J.A.

    This report contains 40 papers that were presented at the Joint IAEA/CSNI Specialists' Meeting on Fracture Mechanics Verification by Large-Scale Testing held at the Pollard Auditorium, Oak Ridge, Tennessee, during the week of October 26-29, 1992. The papers are printed in the order of their presentation in each session and describe recent large-scale fracture (brittle and/or ductile) experiments, analyses of these experiments, and comparisons between predictions and experimental results. The goal of the meeting was to allow international experts to examine the fracture behavior of various materials and structures under conditions relevant to nuclear reactor components and operating environments. The emphasis was on the ability of various fracture models and analysis methods to predict the wide range of experimental data now available. The individual papers have been cataloged separately.

  5. Summary electrophoretic data base on human embryonic kidney cell strain 8514

    NASA Technical Reports Server (NTRS)

    Plank, L. D.; Kunze, M. E.; Arquiza, M. V.; Morrison, D. R.; Todd, P. W.

    1985-01-01

    To properly plan the electrophoresis equipment verification test (EEVT) and continuous flow electrophoresis system (CFES) experiments with human embryonic kidney cells, first a candidate cell lot had to be chosen on the basis of electrophoretic heterogeneity, growth potential, cytogenetics, and urokinase production. Cell lot 8514 from MA Bioproducts, Inc. was chosen for this purpose, and several essential analytical electrophoresis experiments were performed to test its final suitability for these experiments.

  6. Rapid Verification of Candidate Serological Biomarkers Using Gel-based, Label-free Multiple Reaction Monitoring

    PubMed Central

    Tang, Hsin-Yao; Beer, Lynn A.; Barnhart, Kurt T.; Speicher, David W.

    2011-01-01

    Stable isotope dilution-multiple reaction monitoring-mass spectrometry (SID-MRM-MS) has emerged as a promising platform for verification of serological candidate biomarkers. However, cost and time needed to synthesize and evaluate stable isotope peptides, optimize spike-in assays, and generate standard curves, quickly becomes unattractive when testing many candidate biomarkers. In this study, we demonstrate that label-free multiplexed MRM-MS coupled with major protein depletion and 1-D gel separation is a time-efficient, cost-effective initial biomarker verification strategy requiring less than 100 μl serum. Furthermore, SDS gel fractionation can resolve different molecular weight forms of targeted proteins with potential diagnostic value. Because fractionation is at the protein level, consistency of peptide quantitation profiles across fractions permits rapid detection of quantitation problems for specific peptides from a given protein. Despite the lack of internal standards, the entire workflow can be highly reproducible, and long-term reproducibility of relative protein abundance can be obtained using different mass spectrometers and LC methods with external reference standards. Quantitation down to ~200 pg/mL could be achieved using this workflow. Hence, the label-free GeLC-MRM workflow enables rapid, sensitive, and economical initial screening of large numbers of candidate biomarkers prior to setting up SID-MRM assays or immunoassays for the most promising candidate biomarkers. PMID:21726088

  7. Rapid verification of candidate serological biomarkers using gel-based, label-free multiple reaction monitoring.

    PubMed

    Tang, Hsin-Yao; Beer, Lynn A; Barnhart, Kurt T; Speicher, David W

    2011-09-02

    Stable isotope dilution-multiple reaction monitoring-mass spectrometry (SID-MRM-MS) has emerged as a promising platform for verification of serological candidate biomarkers. However, cost and time needed to synthesize and evaluate stable isotope peptides, optimize spike-in assays, and generate standard curves quickly becomes unattractive when testing many candidate biomarkers. In this study, we demonstrate that label-free multiplexed MRM-MS coupled with major protein depletion and 1D gel separation is a time-efficient, cost-effective initial biomarker verification strategy requiring less than 100 μL of serum. Furthermore, SDS gel fractionation can resolve different molecular weight forms of targeted proteins with potential diagnostic value. Because fractionation is at the protein level, consistency of peptide quantitation profiles across fractions permits rapid detection of quantitation problems for specific peptides from a given protein. Despite the lack of internal standards, the entire workflow can be highly reproducible, and long-term reproducibility of relative protein abundance can be obtained using different mass spectrometers and LC methods with external reference standards. Quantitation down to ~200 pg/mL could be achieved using this workflow. Hence, the label-free GeLC-MRM workflow enables rapid, sensitive, and economical initial screening of large numbers of candidate biomarkers prior to setting up SID-MRM assays or immunoassays for the most promising candidate biomarkers.
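
    A bare-bones sketch of the label-free quantitation idea described above is shown below: transition peak areas are summed per peptide, peptides are averaged per protein, and the result is expressed relative to an external reference run. The protein, peptide, and transition names and areas are entirely hypothetical, and the published workflow involves additional normalization and quality checks.

        # Hypothetical sketch: label-free relative quantitation from MRM peak areas.
        from collections import defaultdict

        def protein_abundance(transition_areas):
            """transition_areas: {(protein, peptide, transition): peak_area}"""
            peptide_totals = defaultdict(float)
            for (prot, pep, _), area in transition_areas.items():
                peptide_totals[(prot, pep)] += area           # sum transitions per peptide
            per_protein = defaultdict(list)
            for (prot, _), total in peptide_totals.items():
                per_protein[prot].append(total)
            return {p: sum(v) / len(v) for p, v in per_protein.items()}   # mean over peptides

        sample = {("ProteinA", "PEPTIDEONEK", "y7"): 1.2e5,
                  ("ProteinA", "PEPTIDEONEK", "y8"): 0.9e5}
        reference = {("ProteinA", "PEPTIDEONEK", "y7"): 1.0e5,
                     ("ProteinA", "PEPTIDEONEK", "y8"): 1.0e5}
        s, r = protein_abundance(sample), protein_abundance(reference)
        print({p: round(s[p] / r[p], 2) for p in s})          # fold change vs. reference run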

  8. Do Workers Who Experience Conflict between the Work and Family Domains Hit a "Glass Ceiling?": A Meta-Analytic Examination

    ERIC Educational Resources Information Center

    Hoobler, Jenny M.; Hu, Jia; Wilson, Morgan

    2010-01-01

    Based in Conservation of Resources (COR; Hobfoll, 1989) and self-verification (Swann, 1987) theories, we argue that when workers experience conflict between the work and family domains, this should have implications for evaluations of their work performance and ultimately affect more "objective" career outcomes such as salary and hierarchical…

  9. Development of a Scalable Testbed for Mobile Olfaction Verification.

    PubMed

    Zakaria, Syed Muhammad Mamduh Syed; Visvanathan, Retnam; Kamarudin, Kamarulzaman; Yeon, Ahmad Shakaff Ali; Md Shakaff, Ali Yeon; Zakaria, Ammar; Kamarudin, Latifah Munirah

    2015-12-09

    The lack of ground truth gas dispersion and experiment verification information has impeded the development of mobile olfaction systems, especially for real-world conditions. In this paper, an integrated testbed for mobile gas sensing experiments is presented. The integrated 3 m × 6 m testbed was built to provide real-time ground truth information for mobile olfaction system development. The testbed consists of a 72-gas-sensor array, namely the Large Gas Sensor Array (LGSA), a localization system based on cameras, and a wireless communication backbone for robot communication and integration into the testbed system. Furthermore, the data collected from the testbed may be streamed into a simulation environment to expedite development. Calibration results using ethanol have shown that using a large number of gas sensors in the LGSA is feasible and can produce coherent signals when exposed to the same concentrations. The results have shown that the testbed was able to capture the time-varying characteristics and the variability of the gas plume in a 2 h experiment, thus providing time-dependent ground truth concentration maps. The authors have demonstrated the ability of the mobile olfaction testbed to monitor, verify, and thus provide insight into gas distribution mapping experiments.

  10. Development of a Scalable Testbed for Mobile Olfaction Verification

    PubMed Central

    Syed Zakaria, Syed Muhammad Mamduh; Visvanathan, Retnam; Kamarudin, Kamarulzaman; Ali Yeon, Ahmad Shakaff; Md. Shakaff, Ali Yeon; Zakaria, Ammar; Kamarudin, Latifah Munirah

    2015-01-01

    The lack of ground truth gas dispersion and experiment verification information has impeded the development of mobile olfaction systems, especially for real-world conditions. In this paper, an integrated testbed for mobile gas sensing experiments is presented. The integrated 3 m × 6 m testbed was built to provide real-time ground truth information for mobile olfaction system development. The testbed consists of a 72-gas-sensor array, namely the Large Gas Sensor Array (LGSA), a localization system based on cameras, and a wireless communication backbone for robot communication and integration into the testbed system. Furthermore, the data collected from the testbed may be streamed into a simulation environment to expedite development. Calibration results using ethanol have shown that using a large number of gas sensors in the LGSA is feasible and can produce coherent signals when exposed to the same concentrations. The results have shown that the testbed was able to capture the time-varying characteristics and the variability of the gas plume in a 2 h experiment, thus providing time-dependent ground truth concentration maps. The authors have demonstrated the ability of the mobile olfaction testbed to monitor, verify, and thus provide insight into gas distribution mapping experiments. PMID:26690175

  11. Feasibility and systems definition study for Microwave Multi-Application Payload (MMAP)

    NASA Technical Reports Server (NTRS)

    Horton, J. B.; Allen, C. C.; Massaro, M. J.; Zemany, J. L.; Murrell, J. W.; Stanhouse, R. W.; Condon, G. P.; Stone, R. F.; Swana, J.; Afifi, M.

    1977-01-01

    Work completed on three Shuttle/Spacelab experiments is examined: the Adaptive Multibeam Phased Array Antenna (AMPA) Experiment, the Electromagnetic Environment Experiment (EEE), and the Millimeter Wave Communications Experiment (MWCE). Results included the definition of operating modes, sequence of operation, radii of operation about several ground stations, signal format, footprints of typical orbits, and preliminary definition of ground and user terminals. Conceptual hardware designs, Spacelab interfaces, data handling methods, experiment testing and verification studies were included. The MWCE-MOD I was defined conceptually for a steerable high gain antenna.

  12. Future Launch Vehicle Structures - Expendable and Reusable Elements

    NASA Astrophysics Data System (ADS)

    Obersteiner, M. H.; Borriello, G.

    2002-01-01

    Further evolution of existing expendable launch vehicles will be an obvious element influencing the future of space transportation. Beyond this, reusability might be the change with the highest potential for essential improvement. The expected cost reduction and, contributing to this, the improvement of reliability including safe mission abort capability are driving this idea. Although there are ideas for semi-reusable launch vehicles, typically two-stage vehicles - a reusable first stage or booster(s) and an expendable second or upper stage - it should be kept in mind that the benefit of reusability will only dominate if a large enough share of the vehicle influences the cost calculation. Today there is the understanding that additional technology preparation and verification will be necessary to master reusability and obtain sufficient benefits compared with existing launch vehicles. This understanding is based on several technology and system concept preparation and verification programmes carried out mainly in the US but partially also in Europe and Japan. The major areas of necessary further activity are: system concepts including business plan considerations; sub-system or component technology refinement; system design and operation know-how and capabilities; and verification and demonstration oriented towards mastering future missions. One of the most important aspects for the creation of those coming programmes and activities will be the iterative process of requirements definition, derived from concept analyses including economic considerations and from the results achieved and verified within technology and verification programmes. It is the intention of this paper to provide major trends for those requirements, focused on future launch vehicle structures. This will include requirements valid only for reusable launch vehicles and those common to expendable, semi-reusable, and reusable launch vehicles. Structures and materials is, and will remain, one of the important technology areas to be improved. This includes: primary structures; thermal protection systems (for high and low temperatures); hot structures (leading edges, engine cowling, ...); and tanks (for various propellants and fluids, cryo, ...). Requirements to be considered include material properties and a variety of load definitions, static and dynamic. Based on existing knowledge and experience with expendable launch vehicles (Ariane, ...) and aircraft, a combined understanding needs to be established to provide the basis for an efficient RLV design. Health monitoring will support the cost-efficient operation of future reusable structures, but will also need a sound understanding of loads and failure mechanisms as a basis. Risk mitigation will require several steps of demonstration towards cost-efficient RLV (structures) operation. Typically this has started, or will start, with basic technology, evolving to component demonstration (TPS, tanks, ...) and finally resulting in the demonstration of cost-efficient reuse operation. This paper will also include a programmatic logic concerning future LV structures demonstration.

  13. Nuclear Energy Experiments to the Center for Global Security and Cooperation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osborn, Douglas M.

    2015-06-01

    This is to serve as verification that the Center 6200 experimental pieces supplied to the Technology Training and Demonstration Area within the Center for Global Security and Cooperation are indeed unclassified and approved for unlimited release for viewing.

  14. Density measurement verification for hot mixed asphalt concrete pavement construction.

    DOT National Transportation Integrated Search

    2010-06-01

    Oregon Department of Transportation (ODOT) requires a minimum density for the construction of dense-graded hot mix asphalt concrete (HMAC) pavements to ensure the likelihood that the pavement will not experience distresses that reduce the expected se...

  15. Density measurement verification for hot mix asphalt concrete pavement construction.

    DOT National Transportation Integrated Search

    2010-06-01

    Oregon Department of Transportation (ODOT) requires a minimum density for the construction of dense-graded hot mix asphalt concrete (HMAC) pavements to ensure the likelihood that the pavement will not experience distresses that reduce the expected se...

  16. TETAM Model Verification Study. Volume I. Representation of Intervisibility, Initial Comparisons

    DTIC Science & Technology

    1976-02-01

    simulation models in terms of firings, engagements, and losses between tank and antitank as compared with the field data collected during the free play battles of Field Experiment 11.8 are found in Volume III. (Author)

  17. How to Find a Bug in Ten Thousand Lines Transport Solver? Outline of Experiences from AN Advection-Diffusion Code Verification

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F.

    2011-12-01

    Almost all natural phenomena on Earth are highly nonlinear, and even simplified equations describing nature usually end up being nonlinear partial differential equations. The advection-diffusion-reaction (ADR) transport equation is pivotal in atmospheric sciences and water quality, and for practical purposes it must be solved numerically, so academics and engineers rely heavily on numerical codes. Such codes require verification before they are used in science and engineering applications. Model verification is a mathematical procedure whereby a numerical code is checked to ensure the governing equation is solved as described in the design document. CFD verification is not a straightforward, well-defined course; only a complete test suite can uncover all the limitations and bugs, and results must be assessed to distinguish between bug-induced defects and the innate limitations of a numerical scheme. As Roache (2009) noted, numerical verification is a state-of-the-art procedure, and sometimes novel tricks work out. This study conveys a synopsis of the experience we gained during a comprehensive verification process for a transport solver. A test suite was designed, including unit tests and algorithmic tests, layered in complexity along several dimensions from simple to complex. Acceptance criteria were defined for the desirable capabilities of the transport code, such as order of accuracy, mass conservation, handling of stiff source terms, spurious oscillation, and preservation of the initial shape. At the beginning, a mesh-convergence study, which is the core craft of verification, was performed. To that end, analytical solutions of the ADR equation were gathered and a new solution was derived; in more general cases, the lack of an analytical solution can be overcome through Richardson extrapolation and the method of manufactured solutions. Two bugs that remained concealed during the mesh-convergence study were then uncovered with the method of false injection and by visualization of the results. Symmetry played a dual role: one bug was hidden by the symmetric nature of a test (it was detected afterwards using artificial false injection), while self-symmetry was used to design a new test in a case where the analytical solution of the ADR equation was unknown. Auxiliary subroutines were designed to check and post-process mass conservation and oscillatory behavior. Finally, the capability of the solver was also checked for stiff reaction source terms. The test suite was not only a decent tool for error detection but also provided thorough feedback on the ADR solver's limitations. Such information is the crux of any rigorous numerical modeling of surface/subsurface pollution transport.
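    As a concrete illustration of the mesh-convergence step described above, the sketch below computes the observed order of accuracy from errors measured on successively refined grids. It is a minimal example under assumed inputs (the error values and the refinement ratio are hypothetical), not code from the study itself.

    ```python
    import numpy as np

    def l2_error(numerical, exact):
        """Discrete L2 norm of the error between a numerical and an exact solution."""
        return np.sqrt(np.mean((numerical - exact) ** 2))

    def observed_order(errors, refinement_ratio=2.0):
        """Observed order of accuracy from errors on successively refined grids."""
        errors = np.asarray(errors, dtype=float)
        return np.log(errors[:-1] / errors[1:]) / np.log(refinement_ratio)

    # Hypothetical example: errors from a solver run on grids with dx halved each time.
    errors = [4.1e-2, 1.05e-2, 2.7e-3, 6.8e-4]
    print(observed_order(errors))   # should approach ~2 for a second-order scheme
    ```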

  18. Satellite Power System (SPS) concept definition study (exhibit C)

    NASA Technical Reports Server (NTRS)

    Haley, G. M.

    1979-01-01

    The major outputs of the study are the constructability studies which resulted in the definition of the concepts for satellite, rectenna, and satellite construction base construction. Transportation analyses resulted in definition of heavy-lift launch vehicle, electric orbit transfer vehicle, personnel orbit transfer vehicle, and intra-orbit transfer vehicle as well as overall operations related to transportation systems. The experiment/verification program definition resulted in the definition of elements for the Ground-Based Experimental Research and Key Technology plans. These studies also resulted in conceptual approaches for early space technology verification. The cost analysis defined the overall program and cost data for all program elements and phases.

  19. Main propulsion system test requirements for the two-engine Shuttle-C

    NASA Technical Reports Server (NTRS)

    Lynn, E. E.; Platt, G. K.

    1989-01-01

    The Shuttle-C is an unmanned cargo carrying derivative of the space shuttle with optional two or three space shuttle main engines (SSME's), whereas the shuttle has three SSME's. Design and operational differences between the Shuttle-C and shuttle were assessed to determine requirements for additional main propulsion system (MPS) verification testing. Also, reviews were made of the shuttle main propulsion test program objectives and test results and shuttle flight experience. It was concluded that, if significant MPS modifications are not made beyond those currently planned, then main propulsion system verification can be concluded with an on-pad flight readiness firing.

  20. Conducting Research from Small University Observatories: Investigating Exoplanet Candidates

    NASA Astrophysics Data System (ADS)

    Moreland, Kimberly D.

    2018-01-01

    Kepler has to date discovered 4,496 exoplanet candidates, but only half are confirmed, and only a handful are thought to be Earth-sized and in the habitable zone. Planet verification often involves extensive follow-up observations, which are both time and resource intensive. The data set collected by Kepler is massive and will be studied for decades. University/small observatories, such as the one at Texas State University, are in a good position to assist with the exoplanet candidate verification process. By performing extended monitoring campaigns, which are otherwise cost-ineffective for larger observatories, students gain valuable research experience and contribute valuable data and results to the scientific community.

  1. Biometrics, identification and surveillance.

    PubMed

    Lyon, David

    2008-11-01

    Governing by identity describes the emerging regime of a globalizing, mobile world. Governance depends on identification but identification increasingly depends on biometrics. This 'solution' to difficulties of verification is described and some technical weaknesses are discussed. The role of biometrics in classification systems is also considered and is shown to contain possible prejudice in relation to racialized criteria of identity. Lastly, the culture of biometric identification is shown to be limited to abstract data, artificially separated from the lived experience of the body including the orientation to others. It is proposed that creators of national ID systems in particular address these crucial deficiencies in their attempt to provide new modes of verification.

  2. Feasibility and systems definition study for microwave multi-application payload (MMAP)

    NASA Technical Reports Server (NTRS)

    Horton, J. B.; Allen, C. C.; Massaro, M. J.; Zemany, J. L.; Murrell, J. W.; Stanhouse, R. W.; Condon, G. P.; Stone, R. F.

    1977-01-01

    There were three Shuttle/Spacelab experiments: adaptive multibeam phased array antenna (AMPA) experiment, electromagnetic environment experiment (EEE), and millimeter wave communications experiment (MWCE). Work on the AMPA experiment was completed. Results included the definition of operating modes, sequence of operation, radii of operation about several ground stations, signal format, footprints of typical orbits, and a preliminary definition of ground and user terminals. Definition of the MOD I EEE included conceptual hardware designs, spacelab interfaces, preliminary data handling methods, experiment tests and verification, and EMC studies. The MWCE was defined conceptually for a steerable high gain antenna.

  3. Determination of somatropin charged variants by capillary zone electrophoresis - optimisation, verification and implementation of the European pharmacopoeia method.

    PubMed

    Storms, S M; Feltus, A; Barker, A R; Joly, M-A; Girard, M

    2009-03-01

    Measurement of somatropin charged variants by isoelectric focusing was replaced with capillary zone electrophoresis in the January 2006 European Pharmacopoeia Supplement 5.3, based on results from an interlaboratory collaborative study. Due to incompatibilities and method-robustness issues encountered prior to verification, a number of method parameters required optimisation. As the use of a diode array detector at 195 nm or 200 nm led to a loss of resolution, a variable wavelength detector using a 200 nm filter was employed. Improved injection repeatability was obtained by increasing the injection time and pressure, and changing the sample diluent from water to running buffer. Finally, definition of capillary pre-treatment and rinse procedures resulted in more consistent separations over time. Method verification data are presented demonstrating linearity, specificity, repeatability, intermediate precision, limit of quantitation, sample stability, solution stability, and robustness. Based on these experiments, several modifications to the current method have been recommended and incorporated into the European Pharmacopoeia to help improve method performance across laboratories globally.

  4. Systems Approach to Arms Control Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allen, K; Neimeyer, I; Listner, C

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  5. Safeguardability of the vitrification option for disposal of plutonium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pillay, K.K.S.

    1996-05-01

    Safeguardability of the vitrification option for plutonium disposition is rather complex and there is no experience base in either domestic or international safeguards for this approach. In the present treaty regime between the US and the states of the former Soviet Union, bilateral verifications are considered more likely, with potential for a third-party verification of safeguards. There are serious technological limitations to applying conventional bulk handling facility safeguards techniques to achieve independent verification of plutonium in borosilicate glass. If vitrification is the final disposition option chosen, maintaining continuity of knowledge of plutonium in glass matrices, especially those containing boron and those spiked with high-level wastes or 137Cs, is beyond the capability of present-day safeguards technologies and nondestructive assay techniques. The alternative to quantitative measurement of fissile content is to maintain continuity of knowledge through a combination of containment and surveillance, which is not the international norm for bulk handling facilities.

  6. 7 Processes that Enable NASA Software Engineering Technologies: Value-Added Process Engineering

    NASA Technical Reports Server (NTRS)

    Housch, Helen; Godfrey, Sally

    2011-01-01

    The presentation reviews Agency process requirements and the purpose, benefits, and experiences of seven software engineering processes. The processes include: product integration, configuration management, verification, software assurance, measurement and analysis, requirements management, and planning and monitoring.

  7. National Centers for Environmental Prediction

    Science.gov Websites


  8. Microwave scattering models and basic experiments

    NASA Technical Reports Server (NTRS)

    Fung, Adrian K.

    1989-01-01

    Progress is summarized which has been made in four areas of study: (1) scattering model development for sparsely populated media, such as a forested area; (2) scattering model development for dense media, such as a sea ice medium or a snow covered terrain; (3) model development for randomly rough surfaces; and (4) design and conduct of basic scattering and attenuation experiments suitable for the verification of theoretical models.

  9. Distance Metric Learning Using Privileged Information for Face Verification and Person Re-Identification.

    PubMed

    Xu, Xinxing; Li, Wen; Xu, Dong

    2015-12-01

    In this paper, we propose a new approach to improve face verification and person re-identification in the RGB images by leveraging a set of RGB-D data, in which we have additional depth images in the training data captured using depth cameras such as Kinect. In particular, we extract visual features and depth features from the RGB images and depth images, respectively. As the depth features are available only in the training data, we treat the depth features as privileged information, and we formulate this task as a distance metric learning with privileged information problem. Unlike the traditional face verification and person re-identification tasks that only use visual features, we further employ the extra depth features in the training data to improve the learning of distance metric in the training process. Based on the information-theoretic metric learning (ITML) method, we propose a new formulation called ITML with privileged information (ITML+) for this task. We also present an efficient algorithm based on the cyclic projection method for solving the proposed ITML+ formulation. Extensive experiments on the challenging faces data sets EUROCOM and CurtinFaces for face verification as well as the BIWI RGBD-ID data set for person re-identification demonstrate the effectiveness of our proposed approach.
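    For context on how a learned metric is used at verification time, the following is a minimal sketch of a Mahalanobis-style decision rule; the feature vectors, the matrix M and the threshold are placeholders, and the sketch does not reproduce the ITML+ training procedure itself.

    ```python
    import numpy as np

    def mahalanobis_sq(x, y, M):
        """Squared Mahalanobis distance d_M(x, y)^2 = (x - y)^T M (x - y)."""
        d = x - y
        return float(d @ M @ d)

    def same_identity(x, y, M, threshold):
        """Declare a match when the learned distance falls below a tuned threshold."""
        return mahalanobis_sq(x, y, M) < threshold

    # Hypothetical 4-D visual features and an identity metric as a stand-in
    # for a matrix obtained from ITML-style training.
    rng = np.random.default_rng(0)
    x, y = rng.normal(size=4), rng.normal(size=4)
    M = np.eye(4)
    print(same_identity(x, y, M, threshold=2.0))
    ```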

  10. Safety Verification of the Small Aircraft Transportation System Concept of Operations

    NASA Technical Reports Server (NTRS)

    Carreno, Victor; Munoz, Cesar

    2005-01-01

    A critical factor in the adoption of any new aeronautical technology or concept of operation is safety. Traditionally, safety is accomplished through a rigorous process that involves human factors, low and high fidelity simulations, and flight experiments. As this process is usually performed on final products or functional prototypes, concept modifications resulting from this process are very expensive to implement. This paper describes an approach to system safety that can take place at early stages of a concept design. It is based on a set of mathematical techniques and tools known as formal methods. In contrast to testing and simulation, formal methods provide the capability of exhaustive state exploration analysis. We present the safety analysis and verification performed for the Small Aircraft Transportation System (SATS) Concept of Operations (ConOps). The concept of operations is modeled using discrete and hybrid mathematical models. These models are then analyzed using formal methods. The objective of the analysis is to show, in a mathematical framework, that the concept of operation complies with a set of safety requirements. It is also shown that the ConOps has some desirable characteristics such as liveness and absence of deadlock. The analysis and verification are performed in the Prototype Verification System (PVS), which is a computer-based specification language and theorem-proving assistant.

  11. Preventing illegal tobacco and alcohol sales to minors through electronic age-verification devices: a field effectiveness study.

    PubMed

    Krevor, Brad; Capitman, John A; Oblak, Leslie; Cannon, Joanna B; Ruwe, Mathilda

    2003-01-01

    Efforts to prohibit the sales of tobacco and alcohol products to minors are widespread. Electronic Age Verification (EAV) devices are one possible means to improve compliance with sales-to-minors laws. The purpose of this study was to evaluate the implementation and effectiveness of EAV devices in terms of the frequency and accuracy of age verification, as well as to examine the impact of EAVs on the retailer environment. Two study locations were selected: Tallahassee, Florida and Iowa City, Iowa. Retail stores were invited to participate in the study, producing a self-selected experimental group. Stores that did not elect to test the EAVs comprised the comparison group. The data sources included: 1) mystery shopper inspections: two pre- and five post-EAV installation mystery shopper inspections of tobacco and alcohol retailers; 2) retail clerk and manager interviews; and 3) customer interviews. The study found that installing EAV devices with minimal training and encouragement did not increase age verification and underage sales refusal. Surveyed clerks reported positive experiences using the electronic ID readers and customers reported almost no discomfort about being asked to swipe their IDs. Observations from this study support the need for a more comprehensive system for responsible retailing.

  12. Exomars Mission Verification Approach

    NASA Astrophysics Data System (ADS)

    Cassi, Carlo; Gilardi, Franco; Bethge, Boris

    According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions. A first mission will be launched in 2016 under ESA lead, with the objectives to demonstrate the European capability to safely land a surface package on Mars, to perform Mars atmosphere investigation, and to provide communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an Entry, Descent & Landing Demonstrator Module (EDM) and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the Orbiter for Mars atmosphere characterisation. A second mission, with its launch foreseen in 2018, is led by NASA, who provides the spacecraft and launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 metres, collecting samples and investigating them for signs of past and present life with exobiological experiments, and investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e. the spacecraft composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission, and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process for the above products is quite complex and includes some peculiarities with limited or no heritage in Europe. Furthermore, the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the flow of verification activities and the sharing of tests between the different levels (system, modules, subsystems, etc.), and giving an overview of the main tests defined at spacecraft level. The paper is mainly focused on the verification aspects of the EDL Demonstrator Module and the Rover Module, for which an intense testing activity without previous heritage in Europe is foreseen. In particular, the Descent Module has to survive Mars atmospheric entry and landing, and its surface platform has to stay operational for 8 sols on the Martian surface, transmitting scientific data to the Orbiter. The Rover Module has to perform a 180-sol mission in the Mars surface environment. These operating conditions cannot be verified by analysis alone; consequently a test campaign is defined, including mechanical tests to simulate the entry loads, thermal tests in a Mars environment, and the simulation of rover operations on a 'Mars like' terrain. Finally, the paper presents an overview of the documentation flow defined to ensure the correct translation of the mission requirements into verification activities (test, analysis, review of design) up to the final verification close-out of those requirements with the final verification reports.

  13. Statistical Post-Processing of Wind Speed Forecasts to Estimate Relative Economic Value

    NASA Astrophysics Data System (ADS)

    Courtney, Jennifer; Lynch, Peter; Sweeney, Conor

    2013-04-01

    The objective of this research is to get the best possible wind speed forecasts for the wind energy industry by using an optimal combination of well-established forecasting and post-processing methods. We start with the ECMWF 51-member ensemble prediction system (EPS), which is underdispersive and hence uncalibrated. We aim to produce wind speed forecasts that are more accurate and calibrated than the EPS. The 51 members of the EPS are clustered to 8 weighted representative members (RMs), chosen to minimize the within-cluster spread while maximizing the inter-cluster spread. The forecasts are then downscaled using two limited area models, WRF and COSMO, at two resolutions, 14 km and 3 km. This process creates four distinguishable ensembles which are used as input to statistical post-processes requiring multi-model forecasts. Two such processes are presented here. The first, Bayesian Model Averaging, has been proven to provide more calibrated and accurate wind speed forecasts than the ECMWF EPS using this multi-model input data. The second, heteroscedastic censored regression, is also showing positive results. We compare the two post-processing methods, applied to a year of hindcast wind speed data around Ireland, using an array of deterministic and probabilistic verification techniques, such as MAE, CRPS, probability integral transforms and verification rank histograms, to show which method provides the most accurate and calibrated forecasts. However, the value of a forecast to an end-user cannot be fully quantified by just the accuracy and calibration measurements mentioned, as the relationship between skill and value is complex. Capturing the full potential of the forecast benefits also requires detailed knowledge of the end-users' weather-sensitive decision-making processes and, most importantly, the economic impact it will have on their income. Finally, we present the continuous relative economic value of both post-processing methods to identify which is more beneficial to the wind energy industry of Ireland.
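    As an illustration of one of the probabilistic verification measures mentioned above, the sketch below evaluates the empirical CRPS of an ensemble wind speed forecast against a single observation. The forecast and observation values are hypothetical; this is not the study's verification code.

    ```python
    import numpy as np

    def ensemble_crps(members, obs):
        """Empirical CRPS of an ensemble forecast against a single observation:
        mean |x_i - y| - 0.5 * mean |x_i - x_j| over all member pairs."""
        members = np.asarray(members, dtype=float)
        term1 = np.mean(np.abs(members - obs))
        term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
        return term1 - term2

    # Hypothetical 8-member wind speed forecast (m/s) and the verifying observation.
    forecast = [6.1, 7.4, 5.8, 6.9, 8.2, 7.0, 6.5, 7.8]
    print(ensemble_crps(forecast, obs=7.2))
    ```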

  14. [Development of Markov models for economics evaluation of strategies on hepatitis B vaccination and population-based antiviral treatment in China].

    PubMed

    Yang, P C; Zhang, S X; Sun, P P; Cai, Y L; Lin, Y; Zou, Y H

    2017-07-10

    Objective: To construct Markov models that reflect the reality of prevention and treatment interventions against hepatitis B virus (HBV) infection, simulate the natural history of HBV infection in different age groups, and provide evidence for economic evaluations of hepatitis B vaccination and population-based antiviral treatment in China. Methods: According to the theory and techniques of Markov chains, Markov models of the Chinese HBV epidemic were developed based on national data and related literature from home and abroad, including the settings of Markov model states, allowable transitions, and initial and transition probabilities. The model construction, operation and verification were conducted using the software TreeAge Pro 2015. Results: Several types of Markov models were constructed to describe the disease progression of HBV infection acquired in the neonatal period, perinatal period or adulthood, the progression of chronic hepatitis B after antiviral therapy, hepatitis B prevention and control in adults, chronic hepatitis B antiviral treatment, and the natural progression of chronic hepatitis B in the general population. The model for the newborn was fundamental and included ten states: susceptibility to HBV, HBsAg clearance, immune tolerance, immune clearance, low replication, HBeAg-negative CHB, compensated cirrhosis, decompensated cirrhosis, hepatocellular carcinoma (HCC) and death. The HBV-susceptible state was excluded in the perinatal-period model, and the immune tolerance state was excluded in the adulthood model. The model for the general population included only two states, survival and death. Among the 5 types of models, there were 9 initial states assigned initial probabilities and 27 states with transition probabilities. The results of model verification showed that the probability curves were basically consistent with the situation of the HBV epidemic in China. Conclusion: The Markov models developed can be used in economic evaluations of hepatitis B vaccination and treatment for the elimination of HBV infection in China, although the structures and parameters of the models involve uncertainty and are dynamic in nature.
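    To illustrate the mechanics of a Markov cohort model of the kind described, the sketch below advances a cohort through a toy three-state chain. The states and annual transition probabilities are illustrative placeholders, not the parameters of the published models.

    ```python
    import numpy as np

    # Illustrative three-state Markov cohort model (chronic HBV, cirrhosis, death).
    # The annual transition probabilities below are hypothetical placeholders.
    P = np.array([
        [0.95, 0.04, 0.01],   # chronic HBV -> chronic, cirrhosis, death
        [0.00, 0.90, 0.10],   # cirrhosis   -> chronic, cirrhosis, death
        [0.00, 0.00, 1.00],   # death is absorbing
    ])

    state = np.array([1.0, 0.0, 0.0])   # start the whole cohort in the chronic state
    for year in range(40):
        state = state @ P                # one annual Markov cycle

    print(state)   # cohort distribution across the three states after 40 cycles
    ```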

  15. Suite of Benchmark Tests to Conduct Mesh-Convergence Analysis of Nonlinear and Non-constant Coefficient Transport Codes

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F. A.

    2014-12-01

    Verification of geophysics codes is imperative to avoid serious academic as well as practical consequences. When access to a given source code is not possible, the Method of Manufactured Solutions (MMS) cannot be employed for code verification. In contrast, employing the Method of Exact Solutions (MES) has several practical advantages. In this research, we first provide four new one-dimensional analytical solutions designed for code verification; these solutions are able to uncover particular imperfections in solvers of the advection-diffusion-reaction (ADR) equation, such as those involving nonlinear advection, diffusion or source terms, as well as non-constant coefficients. After that, we provide a solution of Burgers' equation in a novel setup. The proposed solutions satisfy the continuity of mass for the ambient flow, which is a crucial factor for coupled hydrodynamics-transport solvers. Then, we use the derived analytical solutions for code verification. To clarify gray-literature issues in the verification of transport codes, we designed a comprehensive test suite to uncover any imperfection in transport solvers via a hierarchical increase in the level of test complexity. The test suite includes hundreds of unit tests and system tests to check the various portions of the code. The tests start with a simple case of unidirectional advection; they then cover bidirectional advection and tidal flow and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity and reactions. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh-convergence study and appropriate remedies are also discussed. For cases in which appropriate benchmarks for a mesh-convergence study are not available, we utilize symmetry. Auxiliary subroutines for automation of the test suite and report generation are designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into the ADR solver's capabilities. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport. We also convey our experiences in finding several errors which were not detectable with routine verification techniques.
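    As an example of the kind of unit test such a suite might contain, the sketch below checks that a first-order upwind advection step conserves total mass on a periodic 1-D grid. The scheme, grid and tolerance are assumptions chosen for illustration, not the study's actual test code.

    ```python
    import numpy as np

    def advect_periodic(c, u, dx, dt):
        """First-order upwind advection step on a periodic 1-D grid (u > 0 assumed)."""
        return c - u * dt / dx * (c - np.roll(c, 1))

    def test_mass_conservation():
        """Total mass should be unchanged by advection on a periodic domain."""
        x = np.linspace(0.0, 1.0, 200, endpoint=False)
        c = np.exp(-200.0 * (x - 0.5) ** 2)        # initial Gaussian plume
        mass0 = c.sum()
        for _ in range(500):                       # CFL = u*dt/dx = 0.4 here
            c = advect_periodic(c, u=1.0, dx=x[1] - x[0], dt=0.002)
        assert abs(c.sum() - mass0) < 1e-10 * mass0

    test_mass_conservation()
    ```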

  16. Patient-specific IMRT verification using independent fluence-based dose calculation software: experimental benchmarking and initial clinical experience.

    PubMed

    Georg, Dietmar; Stock, Markus; Kroupa, Bernhard; Olofsson, Jörgen; Nyholm, Tufve; Ahnesjö, Anders; Karlsson, Mikael

    2007-08-21

    Experimental methods are commonly used for patient-specific intensity-modulated radiotherapy (IMRT) verification. The purpose of this study was to investigate the accuracy and performance of independent dose calculation software (denoted as 'MUV' (monitor unit verification)) for patient-specific quality assurance (QA). 52 patients receiving step-and-shoot IMRT were considered. IMRT plans were recalculated by the treatment planning systems (TPS) in a dedicated QA phantom, in which an experimental 1D and 2D verification (0.3 cm(3) ionization chamber; films) was performed. Additionally, an independent dose calculation was performed. The fluence-based algorithm of MUV accounts for collimator transmission, rounded leaf ends, tongue-and-groove effect, backscatter to the monitor chamber and scatter from the flattening filter. The dose calculation utilizes a pencil beam model based on a beam quality index. DICOM RT files from patient plans, exported from the TPS, were directly used as patient-specific input data in MUV. For composite IMRT plans, average deviations in the high dose region between ionization chamber measurements and point dose calculations performed with the TPS and MUV were 1.6 +/- 1.2% and 0.5 +/- 1.1% (1 S.D.). The dose deviations between MUV and TPS slightly depended on the distance from the isocentre position. For individual intensity-modulated beams (total 367), an average deviation of 1.1 +/- 2.9% was determined between calculations performed with the TPS and with MUV, with maximum deviations up to 14%. However, absolute dose deviations were mostly less than 3 cGy. Based on the current results, we aim to apply a confidence limit of 3% (with respect to the prescribed dose) or 6 cGy for routine IMRT verification. For off-axis points at distances larger than 5 cm and for low dose regions, we consider 5% dose deviation or 10 cGy acceptable. The time needed for an independent calculation compares very favourably with the net time for an experimental approach. The physical effects modelled in the dose calculation software MUV allow accurate dose calculations in individual verification points. Independent calculations may be used to replace experimental dose verification once the IMRT programme is mature.

  17. Two years experience with quality assurance protocol for patient related Rapid Arc treatment plan verification using a two dimensional ionization chamber array

    PubMed Central

    2011-01-01

    Purpose: To verify the dose distribution and number of monitor units (MU) for dynamic treatment techniques such as volumetric modulated single-arc radiation therapy (Rapid Arc), each patient treatment plan has to be verified prior to the first treatment. The purpose of this study was to develop a patient-related treatment plan verification protocol using a two-dimensional ionization chamber array (MatriXX, IBA, Schwarzenbruck, Germany). Method: Measurements were made to determine the dependence of the 2D ionization chamber array response on beam direction and field size, and the reproducibility of the measurements was checked. For the patient-related verifications, the original Rapid Arc treatment plan was projected onto a CT dataset of the MatriXX and the dose distribution was calculated. After irradiation of the Rapid Arc verification plans, measured and calculated 2D dose distributions were compared using the gamma evaluation method implemented in the measuring software OmniPro (version 1.5, IBA, Schwarzenbruck, Germany). Results: The dependence of the 2D ionization chamber array response on field size and beam direction showed a passing rate of 99% for single-arc measurements with field sizes between 7 cm × 7 cm and 24 cm × 24 cm; for field sizes smaller than 7 cm × 7 cm or larger than 24 cm × 24 cm the passing rate was below 99%. The reproducibility corresponded to passing rates between 99% and 100%. The accuracy of the whole process, including the uncertainty of the measuring system, treatment planning system, linear accelerator and isocentric laser system in the treatment room, was acceptable for treatment plan verification using gamma criteria of 3% and 3 mm (2D global gamma index). Conclusion: It was possible to verify the 2D dose distribution and MU of Rapid Arc treatment plans using the MatriXX, and its use for Rapid Arc treatment plan verification in clinical routine is reasonable. With a passing-rate criterion of 99%, the verification protocol is able to detect clinically significant errors. PMID:21342509
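    For reference, the sketch below shows a brute-force global 2-D gamma analysis of the kind used for the 3%/3 mm criterion mentioned above; the function name, search-window heuristic and thresholds are illustrative choices, not the OmniPro implementation.

    ```python
    import numpy as np

    def gamma_pass_rate(ref, eval_, spacing_mm, dd=0.03, dta_mm=3.0, threshold=0.1):
        """Brute-force 2-D global gamma analysis (dose difference / distance to agreement).

        ref, eval_ : 2-D dose arrays on the same grid
        spacing_mm : pixel spacing in mm
        dd         : dose-difference criterion as a fraction of the global maximum dose
        dta_mm     : distance-to-agreement criterion in mm
        threshold  : ignore reference points below this fraction of the maximum dose
        """
        norm_dose = dd * ref.max()
        ny, nx = ref.shape
        yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
        search = int(np.ceil(3 * dta_mm / spacing_mm))   # local search window in pixels
        gammas = []
        for iy, ix in zip(*np.nonzero(ref > threshold * ref.max())):
            y0, y1 = max(0, iy - search), min(ny, iy + search + 1)
            x0, x1 = max(0, ix - search), min(nx, ix + search + 1)
            dist2 = ((yy[y0:y1, x0:x1] - iy) ** 2 + (xx[y0:y1, x0:x1] - ix) ** 2) * spacing_mm ** 2
            dose2 = (eval_[y0:y1, x0:x1] - ref[iy, ix]) ** 2
            gamma2 = dist2 / dta_mm ** 2 + dose2 / norm_dose ** 2
            gammas.append(np.sqrt(gamma2.min()))
        return 100.0 * np.mean(np.array(gammas) <= 1.0)
    ```

    Restricting the search to a window of three times the distance-to-agreement does not change the pass/fail decision, since any point outside that window already contributes a distance term with gamma greater than 3.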

  18. Renewable Energy SCADA/Training Using NASA's Advanced Technology Communication Satellite

    NASA Technical Reports Server (NTRS)

    Kalu, A.; Emrich, C.; Ventre, G.; Wilson, W.; Acosta, Roberto (Technical Monitor)

    2000-01-01

    The lack of electrical energy in the rural communities of developing countries is well known, as is the economic unfeasibility of providing much-needed energy to these regions via electric grids. Renewable energy (RE) can provide an economic advantage over conventional forms in meeting some of these energy needs. The use of a Supervisory Control and Data Acquisition (SCADA) arrangement via satellite could enable experts at remote locations to provide technical assistance to local trainees while they acquire a measure of proficiency with a newly installed RE system through hands-on training programs using the same communications link. Upon full mastery of the technologies, indigenous personnel could also employ similar SCADA arrangements to remotely monitor and control their constellation of RE systems. Two separate ACTS technology verification experiments (TVEs) have demonstrated that the portability of the Ultra Small Aperture Terminal (USAT) and the versatility of NASA's Advanced Communications Technology Satellite (ACTS), as well as the advantages of Ka band satellites, can be invaluable in providing energy training via distance education (DE), and for implementing renewable energy system SCADA. What has not been tested is the capability of these technologies to implement renewable energy DE and SCADA simultaneously. Such concurrent implementations will be useful for preparing trainees in developing countries for their eventual SCADA operations. The project described in this correspondence is the first effort, to our knowledge, in this specific TVE. The setup for this experiment consists of a one-Watt USAT located at Florida Solar Energy Center (FSEC) connected to two satellite modems tuned to different frequencies to establish two duplex ACTS Ka-band communication channels. A short training program on operation and maintenance of the system will be delivered while simultaneously monitoring and controlling the hybrid RE system using the same satellite communications link. The trainees will include faculty and students from Savannah State University, and staff from FSEC. An interactive internet link will be used to allow faculty from the University of West Indies to participate in the training session.

  19. USML-1 Glovebox experiments

    NASA Technical Reports Server (NTRS)

    Naumann, Robert J.

    1995-01-01

    This report covers the development of and results from three experiments that were flown in the Materials Science Glovebox on USML-1: Marangoni convection in Closed Containers (MCCC), Double Float Zone (DFZ), and Fiber Pulling in Microgravity (FPM). The Glovebox provided a convenient, low cost method for doing simple 'try and see' experiments that could test new concepts or elucidate microgravity phenomena. Since the Glovebox provided essentially one (or possibly two) levels of confinement, many of the stringent verification and test requirements on the experiment apparatus could be relaxed and a streamlined test and verification plan for flight qualification could be implemented. Furthermore, the experiments were contained in their own carrying cases whose external configurations could be identified early in the integration sequence for stowage considerations while delivery of the actual experiment apparatus could be postponed until only a few months before flight. This minimized the time fluids must be contained and reduced the possibility of corrosive reactions that could ruin the experiment. In many respects, this exercise was as much about developing a simpler, cheaper way of doing crew-assisted science as it was about the actual scientific accomplishments of the individual experiments. The Marangoni Convection in Closed Containers experiment was designed to study the effects of a void space in a simulated Bridgman crystal growth configuration and to determine if surface tension driven convective flows that may result from thermal gradients along any free surfaces could affect the solidification process. The Fiber Pulling in Microgravity experiment sought to separate the role of gravity drainage from capillarity effects in the break-up of slender cylindrical liquid columns. The Stability of a Double Float Zone experiment explored the feasibility of a quasi-containerless process in which a solidifying material is suspended by two liquid bridges of its own melt.

  20. TOUGH Simulations of the Updegraff's Set of Fluid and Heat Flow Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moridis, G.J.; Pruess

    1992-11-01

    The TOUGH code [Pruess, 1987] for two-phase flow of water, air, and heat in permeable media has been exercised on a suite of test problems originally selected and simulated by C. D. Updegraff [1989]. These include five 'verification' problems for which analytical or numerical solutions are available, and three 'validation' problems that model laboratory fluid and heat flow experiments. All problems could be run without any code modifications (*). Good and efficient numerical performance, as well as accurate results, were obtained throughout. Additional code verification and validation problems from the literature are briefly summarized, and suggestions are given for proper applications of TOUGH and related codes.

  1. Verification technology of remote sensing camera satellite imaging simulation based on ray tracing

    NASA Astrophysics Data System (ADS)

    Gu, Qiongqiong; Chen, Xiaomei; Yang, Deyun

    2017-08-01

    Remote sensing satellite camera imaging simulation technology is broadly used to evaluate satellite imaging quality and to test the data application system, but the simulation precision is hard to examine. In this paper, we propose an experimental simulation verification method based on the comparison of test parameter variations. Using the ray-tracing-based simulation model, the experiment verifies the model precision by changing the types of devices that correspond to the parameters of the model. The experimental results show that the similarity between the image produced by the ray-tracing-based model and the experimental image is 91.4%, so the model can simulate the remote sensing satellite imaging system very well.

  2. Analytical torque calculation and experimental verification of synchronous permanent magnet couplings with Halbach arrays

    NASA Astrophysics Data System (ADS)

    Seo, Sung-Won; Kim, Young-Hyun; Lee, Jung-Ho; Choi, Jang-Young

    2018-05-01

    This paper presents analytical torque calculation and experimental verification of synchronous permanent magnet couplings (SPMCs) with Halbach arrays. A Halbach array is composed of various numbers of segments per pole; we calculate and compare the magnetic torques for 2, 3, and 4 segments. First, based on the magnetic vector potential, and using a 2D polar coordinate system, we obtain analytical solutions for the magnetic field. Next, through a series of processes, we perform magnetic torque calculations using the derived solutions and the Maxwell stress tensor. Finally, the analytical results are verified by comparison with the results of 2D and 3D finite element analysis and the results of an experiment.
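    For reference, air-gap torque computations with the Maxwell stress tensor in a 2-D polar coordinate system are commonly written in the form below; the symbols are defined in the comments, and this is the generic textbook expression rather than necessarily the exact formulation used by the authors.

    ```latex
    % Air-gap torque from the Maxwell stress tensor in 2-D polar coordinates
    % (L_stk: stack length, r: radius of the integration circle in the air gap,
    %  B_r, B_theta: radial and tangential flux-density components, mu_0: vacuum permeability)
    T = \frac{L_{\mathrm{stk}}\, r^{2}}{\mu_{0}} \int_{0}^{2\pi} B_{r}(r,\theta)\, B_{\theta}(r,\theta)\, \mathrm{d}\theta
    ```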

  3. Effects of Part-based Similarity on Visual Search: The Frankenbear Experiment

    PubMed Central

    Alexander, Robert G.; Zelinsky, Gregory J.

    2012-01-01

    Do the target-distractor and distractor-distractor similarity relationships known to exist for simple stimuli extend to real-world objects, and are these effects expressed in search guidance or target verification? Parts of photorealistic distractors were replaced with target parts to create four levels of target-distractor similarity under heterogeneous and homogeneous conditions. We found that increasing target-distractor similarity and decreasing distractor-distractor similarity impaired search guidance and target verification, but that target-distractor similarity and heterogeneity/homogeneity interacted only in measures of guidance; distractor homogeneity lessens effects of target-distractor similarity by causing gaze to fixate the target sooner, not by speeding target detection following its fixation. PMID:22227607

  4. Verification and Validation of Autonomy Software at NASA

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles

    2000-01-01

    Autonomous software holds the promise of new operation possibilities, easier design and development, and lower operating costs. However, as these systems close control loops and arbitrate resources on board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing a better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis towards model-based autonomy, model-checking techniques and concrete experiments at NASA.

  5. Verification and Validation of Autonomy Software at NASA

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles

    2000-01-01

    Autonomous software holds the promise of new operation possibilities, easier design and development, and lower operating costs. However, as these systems close control loops and arbitrate resources on-board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing a better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis towards model-based autonomy, model-checking techniques, and concrete experiments at NASA.

  6. Verification and Validation of Adaptive and Intelligent Systems with Flight Test Results

    NASA Technical Reports Server (NTRS)

    Burken, John J.; Larson, Richard R.

    2009-01-01

    F-15 IFCS project goals are: a) Demonstrate Control Approaches that can Efficiently Optimize Aircraft Performance in both Normal and Failure Conditions [A] & [B] failures. b) Advance Neural Network-Based Flight Control Technology for New Aerospace Systems Designs with a Pilot in the Loop. Gen II objectives include: a) Implement and Fly a Direct Adaptive Neural Network Based Flight Controller; b) Demonstrate the Ability of the System to Adapt to Simulated System Failures: 1) Suppress Transients Associated with Failure; 2) Re-Establish Sufficient Control and Handling of Vehicle for Safe Recovery; c) Provide Flight Experience for Development of Verification and Validation Processes for Flight Critical Neural Network Software.

  7. Combustion Fundamentals Research

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Increased emphasis is placed on fundamental and generic research at Lewis Research Center with less systems development efforts. This is especially true in combustion research, where the study of combustion fundamentals has grown significantly in order to better address the perceived long term technical needs of the aerospace industry. The main thrusts for this combustion fundamentals program area are as follows: analytical models of combustion processes, model verification experiments, fundamental combustion experiments, and advanced numeric techniques.

  8. Experiences using IAEA Code of practice for radiation sterilization of tissue allografts: Validation and routine control

    NASA Astrophysics Data System (ADS)

    Hilmy, N.; Febrida, A.; Basril, A.

    2007-11-01

    The problems in using International Standard (ISO) 11137 for validation of the radiation sterilization dose (RSD) of tissue allografts are the limited and low numbers of uniform samples per production batch, i.e., products obtained from a single donor. An allograft is a graft transplanted between two different individuals of the same species. The minimum number of uniform samples needed for the verification dose (VD) experiment at the selected sterility assurance level (SAL) per production batch according to the IAEA Code is 20, i.e., 10 for bioburden determination and the remaining 10 for the sterility test. Three methods of the IAEA Code have been used for validation of the RSD, i.e., method A1, which is a modification of method 1 of ISO 11137:1995, method B (ISO 13409:1996), and method C (AAMI TIR 27:2001). This paper describes VD experiments using uniform products obtained from one cadaver donor, i.e., cancellous bones and demineralized bone powders, and amnion grafts from one living donor. Results of the verification dose experiments show that the RSD is 15.4 kGy for cancellous and demineralized bone grafts and 19.2 kGy for amnion grafts according to method A1, and 25 kGy according to methods B and C.

  9. Experimental Evaluation of Verification and Validation Tools on Martian Rover Software

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareani, Corina; Venet, Arnaud; Visser, Willem; Washington, Rich

    2003-01-01

    We report on a study to determine the maturity of different verification and validation technologies (V&V) on a representative example of NASA flight software. The study consisted of a controlled experiment where three technologies (static analysis, runtime analysis and model checking) were compared to traditional testing with respect to their ability to find seeded errors in a prototype Mars Rover. What makes this study unique is that it is the first (to the best of our knowledge) to do a controlled experiment to compare formal methods based tools to testing on a realistic industrial-size example where the emphasis was on collecting as much data on the performance of the tools and the participants as possible. The paper includes a description of the Rover code that was analyzed, the tools used as well as a detailed description of the experimental setup and the results. Due to the complexity of setting up the experiment, our results can not be generalized, but we believe it can still serve as a valuable point of reference for future studies of this kind. It did confirm the belief we had that advanced tools can outperform testing when trying to locate concurrency errors. Furthermore the results of the experiment inspired a novel framework for testing the next generation of the Rover.

  10. High-Resolution Fast-Neutron Spectrometry for Arms Control and Treaty Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David L. Chichester; James T. Johnson; Edward H. Seabury

    2012-07-01

    Many nondestructive nuclear analysis techniques have been developed to support the measurement needs of arms control and treaty verification, including gross photon and neutron counting, low- and high-resolution gamma spectrometry, time-correlated neutron measurements, and photon and neutron imaging. One notable measurement technique that has not been extensively studied to date for these applications is high-resolution fast-neutron spectrometry (HRFNS). Applied for arms control and treaty verification, HRFNS has the potential to serve as a complementary measurement approach to these other techniques by providing a means to either qualitatively or quantitatively determine the composition and thickness of non-nuclear materials surrounding neutron-emitting materials. The technique uses the normally-occurring neutrons present in arms control and treaty verification objects of interest as an internal source of neutrons for performing active-interrogation transmission measurements. Most low-Z nuclei of interest for arms control and treaty verification, including 9Be, 12C, 14N, and 16O, possess fast-neutron resonance features in their absorption cross sections in the 0.5- to 5-MeV energy range. Measuring the selective removal of source neutrons over this energy range, assuming for example a fission-spectrum starting distribution, may be used to estimate the stoichiometric composition of intervening materials between the neutron source and detector. At a simpler level, determination of the emitted fast-neutron spectrum may be used for fingerprinting 'known' assemblies for later use in template-matching tests. As with photon spectrometry, automated analysis of fast-neutron spectra may be performed to support decision making and reporting systems protected behind information barriers. This paper will report recent work at Idaho National Laboratory to explore the feasibility of using HRFNS for arms control and treaty verification applications, including simulations and experiments, using fission-spectrum neutron sources to assess neutron transmission through composite low-Z attenuators.
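    The transmission measurement described above reduces, in its simplest form, to energy-dependent exponential attenuation; the expression below is the generic relation, with the slab composition treated as an assumption for illustration rather than a result from the reported work.

    ```latex
    % Energy-dependent neutron transmission through a slab of thickness t containing
    % nuclide species i with number density N_i and total cross section sigma_i(E)
    T(E) = \frac{\Phi(E)}{\Phi_{0}(E)} = \exp\!\Big(-\,t \sum_{i} N_{i}\, \sigma_{i}(E)\Big)
    ```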

  11. Environmental Technology Verification Report: Grouts for Wastewater Collection Systems, Avanti International AV-118 Acrylic Chemical Grout

    EPA Science Inventory

    Municipalities are discovering rapid degradation of infrastructures in wastewater collection and treatment facilities due to the infiltration of water from the surrounding environments. Wastewater facilities are not only wet, but also experience hydrostatic pressure conditions un...

  12. [Does action semantic knowledge influence mental simulation in sentence comprehension?].

    PubMed

    Mochizuki, Masaya; Naito, Katsuo

    2012-04-01

    This research investigated whether action semantic knowledge influences mental simulation during sentence comprehension. In Experiment 1, we confirmed that the words of face-related objects include the perceptual knowledge about the actions that bring the object to the face. In Experiment 2, we used an acceptability judgment task and a word-picture verification task to compare the perceptual information that is activated by the comprehension of sentences describing an action using face-related objects near the face (near-sentence) or far from the face (far-sentence). Results showed that participants took a longer time to judge the acceptability of the far-sentence than the near-sentence. Verification times were significantly faster when the actions in the pictures matched the action described in the sentences than when they were mismatched. These findings suggest that action semantic knowledge influences sentence processing, and that perceptual information corresponding to the content of the sentence is activated regardless of the action semantic knowledge at the end of the sentence processing.

  13. Towards a Compositional SPIN

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Giannakopoulou, Dimitra

    2006-01-01

    This paper discusses our initial experience with introducing automated assume-guarantee verification based on learning in the SPIN tool. We believe that compositional verification techniques such as assume-guarantee reasoning could complement the state-reduction techniques that SPIN already supports, thus increasing the size of systems that SPIN can handle. We present a "light-weight" approach to evaluating the benefits of learning-based assume-guarantee reasoning in the context of SPIN: we turn our previous implementation of learning for the LTSA tool into a main program that externally invokes SPIN to provide the model checking-related answers. Despite its performance overheads (which mandate a future implementation within SPIN itself), this approach provides accurate information about the savings in memory. We have experimented with several versions of learning-based assume guarantee reasoning, including a novel heuristic introduced here for generating component assumptions when their environment is unavailable. We illustrate the benefits of learning-based assume-guarantee reasoning in SPIN through the example of a resource arbiter for a spacecraft. Keywords: assume-guarantee reasoning, model checking, learning.

  14. What is new about covered interest parity condition in the European Union? Evidence from fractal cross-correlation regressions

    NASA Astrophysics Data System (ADS)

    Ferreira, Paulo; Kristoufek, Ladislav

    2017-11-01

    We analyse the covered interest parity (CIP) using two novel regression frameworks based on cross-correlation analysis (detrended cross-correlation analysis and detrending moving-average cross-correlation analysis), which allow for studying the relationships at different scales and work well under non-stationarity and heavy tails. CIP is a measure of capital mobility commonly used to analyse financial integration, which remains an interesting subject of study in the context of the European Union. The importance of this subject is related to the fact that the adoption of a common currency is associated with some benefits for countries, but also involves some risks, such as the loss of economic instruments to face possible asymmetric shocks. While studying the Eurozone members could explain some problems in the common currency, studying the non-Euro countries is important to analyse whether they are fit to reap the possible benefits. Our results point to verification of CIP mainly in the Central European countries, while in the remaining countries the parity is verified only residually.
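    As a sketch of how a detrended cross-correlation coefficient can be computed at a given scale, the code below implements one common variant with non-overlapping windows and linear detrending; the window handling and the synthetic example data are assumptions for illustration, not the authors' exact estimator.

    ```python
    import numpy as np

    def detrended_covariance(x, y, scale):
        """Average covariance of linearly detrended profiles in non-overlapping windows."""
        X = np.cumsum(x - x.mean())          # profile of series x
        Y = np.cumsum(y - y.mean())          # profile of series y
        n_win = len(X) // scale
        t = np.arange(scale)
        cov = []
        for k in range(n_win):
            xs = X[k * scale:(k + 1) * scale]
            ys = Y[k * scale:(k + 1) * scale]
            xr = xs - np.polyval(np.polyfit(t, xs, 1), t)   # residuals of local linear fit
            yr = ys - np.polyval(np.polyfit(t, ys, 1), t)
            cov.append(np.mean(xr * yr))
        return np.mean(cov)

    def rho_dcca(x, y, scale):
        """Detrended cross-correlation coefficient at a given window size (scale)."""
        f2_xy = detrended_covariance(x, y, scale)
        f2_xx = detrended_covariance(x, x, scale)
        f2_yy = detrended_covariance(y, y, scale)
        return f2_xy / np.sqrt(f2_xx * f2_yy)

    # Hypothetical example with two correlated noise series.
    rng = np.random.default_rng(1)
    z = rng.normal(size=2000)
    x = z + 0.5 * rng.normal(size=2000)
    y = z + 0.5 * rng.normal(size=2000)
    print(rho_dcca(x, y, scale=50))
    ```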

  15. Remote Sensing Product Verification and Validation at the NASA Stennis Space Center

    NASA Technical Reports Server (NTRS)

    Stanley, Thomas M.

    2005-01-01

    Remote sensing data product verification and validation (V&V) is critical to successful science research and applications development. People who use remote sensing products to make policy, economic, or scientific decisions require confidence in and an understanding of the products' characteristics to make informed decisions about the products' use. NASA data products of coarse to moderate spatial resolution are validated by NASA science teams. NASA's Stennis Space Center (SSC) serves as the science validation team lead for validating commercial data products of moderate to high spatial resolution. At SSC, the Applications Research Toolbox simulates sensors and targets, and the Instrument Validation Laboratory validates critical sensors. The SSC V&V Site consists of radiometric tarps, a network of ground control points, a water surface temperature sensor, an atmospheric measurement system, painted concrete radial target and edge targets, and other instrumentation. NASA's Applied Sciences Directorate participates in the Joint Agency Commercial Imagery Evaluation (JACIE) team formed by NASA, the U.S. Geological Survey, and the National Geospatial-Intelligence Agency to characterize commercial systems and imagery.

  16. An Investigation of Widespread Ozone Damage to the Soybean Crop in the Upper Midwest Determined From Ground-Based and Satellite Measurements

    NASA Technical Reports Server (NTRS)

    Fishman, Jack; Creilson, John K.; Parker, Peter A.; Ainsworth, Elizabeth A.; Vining, G. Geoffrey; Szarka, John; Booker, Fitzgerald L.; Xu, Xiaojing

    2010-01-01

    Elevated concentrations of ground-level ozone (O3) are frequently measured over farmland regions in many parts of the world. While numerous experimental studies show that O3 can significantly decrease crop productivity, independent verifications of yield losses at current ambient O3 concentrations in rural locations are sparse. In this study, soybean crop yield data during a 5-year period over the Midwest of the United States were combined with ground and satellite O3 measurements to provide evidence that yield losses on the order of 10% could be estimated through the use of a multiple linear regression model. Yield loss trends based on both conventional ground-based instrumentation and satellite-derived tropospheric O3 measurements were statistically significant and were consistent with results obtained from open-top chamber experiments and an open-air experimental facility (SoyFACE, Soybean Free Air Concentration Enrichment) in central Illinois. Our analysis suggests that such losses are a relatively new phenomenon due to the increase in background tropospheric O3 levels over recent decades. Extrapolation of these findings supports previous studies that estimate the global economic loss to the farming community of more than $10 billion annually.
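
    The yield-loss estimate rests on an ordinary multiple linear regression of yield on ozone exposure and other covariates. A minimal sketch of that kind of fit, with invented numbers standing in for the county-level data, might look as follows; the variables, values, and the 40 ppb versus 60 ppb comparison are illustrative assumptions, not the study's actual model.

      import numpy as np

      # Hypothetical county-level records: seasonal mean ozone (ppb), rainfall (mm),
      # and soybean yield (bushels/acre). Values are invented for illustration only.
      o3     = np.array([38, 42, 47, 51, 55, 58, 61, 65], dtype=float)
      rain   = np.array([480, 500, 460, 520, 450, 470, 440, 430], dtype=float)
      yield_ = np.array([52, 51, 49, 50, 46, 45, 43, 41], dtype=float)

      # Ordinary least-squares fit of a multiple linear regression model.
      X = np.column_stack([np.ones_like(o3), o3, rain])
      beta, *_ = np.linalg.lstsq(X, yield_, rcond=None)
      b0, b_o3, b_rain = beta

      # Estimated yield loss from raising seasonal O3 from a 40 ppb baseline to
      # 60 ppb, holding rainfall at its mean.
      base = b0 + b_o3*40 + b_rain*rain.mean()
      high = b0 + b_o3*60 + b_rain*rain.mean()
      print(f"estimated relative loss: {100*(base - high)/base:.1f}%")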

  17. 30 CFR 250.913 - When must I resubmit Platform Verification Program plans?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication verification, or installation verification... 30 Mineral Resources 2 2010-07-01 2010-07-01 false When must I resubmit Platform Verification...

  18. Experimental demonstration of an isotope-sensitive warhead verification technique using nuclear resonance fluorescence.

    PubMed

    Vavrek, Jayson R; Henderson, Brian S; Danagoulian, Areg

    2018-04-24

    Future nuclear arms reduction efforts will require technologies to verify that warheads slated for dismantlement are authentic without revealing any sensitive weapons design information to international inspectors. Despite several decades of research, no technology has met these requirements simultaneously. Recent work by Kemp et al. [Kemp RS, Danagoulian A, Macdonald RR, Vavrek JR (2016) Proc Natl Acad Sci USA 113:8618-8623] has produced a novel physical cryptographic verification protocol that approaches this treaty verification problem by exploiting the isotope-specific nature of nuclear resonance fluorescence (NRF) measurements to verify the authenticity of a warhead. To protect sensitive information, the NRF signal from the warhead is convolved with that of an encryption foil that contains key warhead isotopes in amounts unknown to the inspector. The convolved spectrum from a candidate warhead is statistically compared against that from an authenticated template warhead to determine whether the candidate itself is authentic. Here we report on recent proof-of-concept warhead verification experiments conducted at the Massachusetts Institute of Technology. Using high-purity germanium (HPGe) detectors, we measured NRF spectra from the interrogation of proxy "genuine" and "hoax" objects by a 2.52 MeV endpoint bremsstrahlung beam. The observed differences in NRF intensities near 2.2 MeV indicate that the physical cryptographic protocol can distinguish between proxy genuine and hoax objects with high confidence in realistic measurement times.
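
    The statistical comparison of candidate and template spectra can be thought of as a hypothesis test on the measured counts in the relevant NRF lines. A minimal sketch, assuming Poisson counting statistics, equal measurement times, and invented numbers, is shown below; the actual protocol's test statistic and acceptance criteria may differ.

      import numpy as np
      from scipy import stats

      # Hypothetical net counts in a few NRF lines near 2.2 MeV for the template
      # and a candidate object; the numbers are invented for illustration.
      template  = np.array([1180, 950, 1420])
      candidate = np.array([1145, 980, 1398])

      # For Poisson-distributed counts, a per-line z statistic for equal means is
      # (n1 - n2) / sqrt(n1 + n2); the lines are combined with a chi-square test.
      z = (template - candidate) / np.sqrt(template + candidate)
      chi2 = np.sum(z**2)
      p = stats.chi2.sf(chi2, df=len(z))
      print("chi2 =", round(chi2, 2), " p =", round(p, 3),
            "-> consistent with template" if p > 0.05 else "-> flagged as different")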

  19. 0-6674 : improving fracture resistance measurement in asphalt binder specification with verification on asphalt mixture cracking performance.

    DOT National Transportation Integrated Search

    2014-08-01

    The current performance grading (PG) specification for asphalt binders is based primarily on the study of unmodified asphalt binders. Over the years, experience has proven that the PG grading system, while good for ensuring overall quality, fails in ...

  20. Experimental quantum verification in the presence of temporally correlated noise

    NASA Astrophysics Data System (ADS)

    Mavadia, S.; Edmunds, C. L.; Hempel, C.; Ball, H.; Roy, F.; Stace, T. M.; Biercuk, M. J.

    2018-02-01

    Growth in the capabilities of quantum information hardware mandates access to techniques for performance verification that function under realistic laboratory conditions. Here we experimentally characterise the impact of common temporally correlated noise processes on both randomised benchmarking (RB) and gate-set tomography (GST). Our analysis highlights the role of sequence structure in enhancing or suppressing the sensitivity of quantum verification protocols to either slowly or rapidly varying noise, which we treat in the limiting cases of quasi-DC miscalibration and white noise power spectra. We perform experiments with a single trapped 171Yb+ ion qubit and inject engineered noise (∝ σ_z) to probe protocol performance. Experiments on RB validate predictions that measured fidelities over sequences are described by a gamma distribution varying between approximately Gaussian, and a broad, highly skewed distribution for rapidly and slowly varying noise, respectively. Similarly we find a strong gate set dependence of default experimental GST procedures in the presence of correlated errors, leading to significant deviations between estimated and calculated diamond distances in the presence of correlated σ_z errors. Numerical simulations demonstrate that expansion of the gate set to include negative rotations can suppress these discrepancies and increase reported diamond distances by orders of magnitude for the same error processes. Similar effects do not occur for correlated σ_x or σ_y errors or depolarising noise processes, highlighting the impact of the critical interplay of selected gate set and the gauge optimisation process on the meaning of the reported diamond norm in correlated noise environments.
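
    The reported gamma-distributed spread of RB outcomes can be illustrated with a small fitting exercise: generate synthetic sequence-averaged fidelities for the two noise regimes and fit a gamma distribution to the infidelities. The generative parameters below are invented for illustration and are not the experimental values.

      import numpy as np
      from scipy import stats

      # Synthetic randomized-benchmarking survival probabilities over many random
      # sequences of fixed length; values are invented to mimic the two regimes.
      rng = np.random.default_rng(1)
      fast_noise = rng.normal(loc=0.93, scale=0.01, size=500)           # ~Gaussian spread
      slow_noise = 1 - rng.gamma(shape=1.5, scale=0.04, size=500)       # broad, skewed spread

      for label, fid in (("rapidly varying", fast_noise), ("slowly varying", slow_noise)):
          # Fit a gamma distribution to the sequence-to-sequence infidelity spread.
          infid = 1 - fid
          shape, loc, scale = stats.gamma.fit(infid, floc=0)
          print(f"{label}: gamma shape ~ {shape:.1f} (large -> near-Gaussian, small -> skewed)")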

  1. The Effect of Job Performance Aids on Quality Assurance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fosshage, Erik

    Job performance aids (JPAs) have been studied for many decades in a variety of disciplines and for many different types of tasks, yet this is the first known research experiment using JPAs in a quality assurance (QA) context. The objective of this thesis was to assess whether a JPA has an effect on the performance of a QA observer performing the concurrent dual verification technique for a basic assembly task. The JPA used in this study was a simple checklist, and the design borrows heavily from prior research on task analysis and other human factors principles. The assembly task and QA construct of concurrent dual verification are consistent with those of a high consequence manufacturing environment. Results showed that the JPA had only a limited effect on QA performance in the context of this experiment. However, there were three important and unexpected findings that may draw interest from a variety of practitioners. First, a novel testing methodology sensitive enough to measure the effects of a JPA on performance was created. Second, the discovery that there are different probabilities of detection for different types of error in a QA context may be the most far-reaching result. Third, these results highlight the limitations of concurrent dual verification as a control against defects. It is hoped that both the methodology and results of this study are an effective baseline from which to launch future research activities.

  2. AdaBoost-based on-line signature verifier

    NASA Astrophysics Data System (ADS)

    Hongo, Yasunori; Muramatsu, Daigo; Matsumoto, Takashi

    2005-03-01

    Authentication of individuals is rapidly becoming an important issue. The authors previously proposed a pen-input online signature verification algorithm. The algorithm considers a writer's signature as a trajectory of pen position, pen pressure, pen azimuth, and pen altitude that evolves over time, so that it is dynamic and biometric. Many algorithms have been proposed and reported to achieve accuracy for on-line signature verification, but setting the threshold value for these algorithms is a problem. In this paper, we introduce a user-generic model generated by AdaBoost, which resolves this problem. When user-specific models (one model for each user) are used for signature verification problems, the models must be generated using only genuine signatures; forged signatures are not available, because impostors do not provide forged signatures for training in advance. By introducing a user-generic model, however, we can make use of other writers' forged signatures in addition to the genuine signatures for learning. AdaBoost is a well-known classification algorithm whose final decision depends on the sign of the output value, so it is not necessary to set a threshold value. A preliminary experiment was performed on a database consisting of data from 50 individuals. This set consists of western-alphabet-based signatures provided by a European research group. In this experiment, our algorithm gives an FRR of 1.88% and an FAR of 1.60%. Since no fine-tuning was done, this preliminary result looks very promising.
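
    A minimal sketch of the user-generic idea, using scikit-learn's AdaBoost classifier on synthetic dissimilarity features, is given below; the feature construction, values, and class balance are assumptions for illustration, not the paper's actual features. Note that acceptance depends only on the sign of the boosted score, so no per-user threshold is needed.

      import numpy as np
      from sklearn.ensemble import AdaBoostClassifier

      # User-generic model: one classifier over (questioned, reference) feature
      # dissimilarities, trained on genuine and forged pairs pooled across writers.
      # Feature values are synthetic stand-ins for pen-trajectory distances.
      rng = np.random.default_rng(0)
      genuine = rng.normal(loc=0.3, scale=0.1, size=(200, 4))   # small dissimilarities
      forgery = rng.normal(loc=0.8, scale=0.2, size=(200, 4))   # larger dissimilarities
      X = np.vstack([genuine, forgery])
      y = np.array([1]*200 + [0]*200)                            # 1 = genuine, 0 = forgery

      clf = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X, y)

      # Decision depends only on the sign of the boosted score.
      questioned = rng.normal(loc=0.35, scale=0.1, size=(1, 4))
      score = clf.decision_function(questioned)[0]
      print("accepted as genuine" if score > 0 else "rejected as forgery")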

  3. Preliminary Consideration of the ADS Research in China

    NASA Astrophysics Data System (ADS)

    Fang, Shouxian; Fu, Shinian

    2002-08-01

    Power supply is a key issue for China's further economic development. To meet the needs of economic growth in the next century, nuclear energy must take a larger share of newly added power capacity. However, the present nuclear power stations, dominated worldwide by the PWR, face some difficulties. Recently, a new concept called ADS (Accelerator Driven Subcritical system) has emerged that can avoid these difficulties and is recognized as a most promising system for fission energy. So, during this early period of nuclear power development in our country, it is worthwhile to exploit this novel idea. In this paper, the ADS research program and a proposed verification facility are described. The facility consists of a 300 MeV/3 mA low-energy accelerator, a swimming-pool reactor and some basic research equipment. Beam physics in the intense-beam accelerator, such as beam halo formation, is also discussed.

  4. The simple economics of risk-sharing agreements between the NHS and the pharmaceutical industry.

    PubMed

    Barros, Pedro Pita

    2011-04-01

    The introduction of new (and expensive) pharmaceutical products is one of the major challenges for health systems. The search for new institutional arrangements is natural; the use of so-called risk-sharing agreements is one example. Recent discussions have somewhat neglected the economic fundamentals underlying risk-sharing agreements. We argue here that risk-sharing agreements, although attractive due to the principle of paying by results, also entail risks: too many patients may be put under treatment, and prices are likely to be adjusted upward in anticipation of future risk-sharing agreements between the pharmaceutical company and the third-party payer. One available instrument is a verification cost per patient treated, which allows the first-best allocation of patients to the new treatment to be obtained under the agreement. Overall, the welfare effects of risk-sharing agreements are ambiguous, and caution is urged regarding their use. Copyright © 2010 John Wiley & Sons, Ltd.

  5. Space power system design and development from an economic point of view

    NASA Technical Reports Server (NTRS)

    Hazelrigg, G. A., Jr.

    1977-01-01

    The concept of a satellite solar power system offers a feasible, but unproven, long-range energy alternative. While the basic physics of these systems is understood, many developments are necessary in order to reduce the system cost to the point of being cost-competitive with alternative energy sources. Thus, a substantial technology advancement and verification program, plus test and demonstration satellite programs are necessary before a full-scale satellite can be designed and built. It is important to properly identify those elements of the technology that should be subject to development efforts, the goals of the corresponding development programs and the appropriate funding levels and schedules. Systems studies and designs play a major role in rationally formulating a development program. This paper uses an economic approach to place these studies into a framework for formulating a viable satellite solar power system development plan.

  6. [Thought Experiments of Economic Surplus: Science and Economy in Ernst Mach's Epistemology].

    PubMed

    Wulz, Monika

    2015-03-01

    Thought Experiments of Economic Surplus: Science and Economy in Ernst Mach's Epistemology. Thought experiments are an important element in Ernst Mach's epistemology: they facilitate amplifying our knowledge by experimenting with thoughts; they thus exceed empirical experience and suspend the quest for immediate utility. From an economic perspective, Mach suggested that thought experiments depend on the production of an economic surplus, based on a division of labor that relieves the individual's struggle for survival. Thus, as frequently emphasized, it is not only the 'economy of thought' that is an important feature of Mach's epistemology; the socioeconomic conditions of science also play a decisive role. The paper discusses the mental and social economic aspects of experimental thinking in Mach's epistemology and examines them within the contemporary evolutionary, physiological, and economic contexts. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Testing of the line element of special relativity with rotating systems

    NASA Technical Reports Server (NTRS)

    Vargas, Jose G.; Torr, Douglas G.

    1989-01-01

    Experiments with rotating systems are examined from the point of view of a test theory of the Lorentz transformations (LTs), permitting, in principle, the verification of the simultaneity relation. The significance of the experiments involved in the testing of the LTs can be determined using Robertson's test theory (RTT). A revised RTT is discussed, and attention is given to the Ehrenfest paradox in connection with the testing of the LTs.

  8. Using Field Experiments to Change the Template of How We Teach Economics

    ERIC Educational Resources Information Center

    List, John A.

    2014-01-01

    In this article, the author explains why field experiments can improve what we teach and how we teach economics. Economists no longer operate as passive observers of economic phenomena. Instead, they participate actively in the research process by collecting data from field experiments to investigate the economics of everyday life. This change can…

  9. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  10. Control/structure interaction design methodology

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.; Layman, William E.

    1989-01-01

    The Control Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development and verification of new design concepts (such as active structure) and new tools (such as a combined structure and control optimization algorithm), and their verification in ground and possibly flight tests. The new CSI design methodology is centered around interdisciplinary engineers using new tools that closely integrate structures and controls. Verification is an important CSI theme, and analysts will be closely integrated with the CSI Test Bed laboratory. Components, concepts, tools and algorithms will be developed and tested in the lab and in future Shuttle-based flight experiments. The design methodology is summarized in block diagrams depicting the evolution of a spacecraft design and descriptions of analytical capabilities used in the process. The multiyear JPL CSI implementation plan is described along with the essentials of several new tools. A distributed network of computation servers and workstations was designed that will provide a state-of-the-art development base for the CSI technologies.

  11. Attention and implicit memory in the category-verification and lexical decision tasks.

    PubMed

    Mulligan, Neil W; Peterson, Daniel

    2008-05-01

    Prior research on implicit memory appeared to support three generalizations: conceptual tests are affected by divided attention, perceptual tasks are affected by certain divided-attention manipulations, and all types of priming are affected by selective attention. These generalizations are challenged in experiments using the implicit tests of category verification and lexical decision. First, both tasks were unaffected by divided-attention tasks known to impact other priming tasks. Second, both tasks were unaffected by a manipulation of selective attention in which colored words were either named or their colors identified. Thus category verification, unlike other conceptual tasks, appears unaffected by divided attention and by some selective-attention tasks, and lexical decision, unlike other perceptual tasks, appears unaffected by a difficult divided-attention task and by some selective-attention tasks. Finally, both tasks were affected by a selective-attention task in which attention was manipulated across objects (rather than within objects), indicating some susceptibility to selective attention. The results contradict an analysis on the basis of the conceptual-perceptual distinction and other more specific hypotheses, but are consistent with the distinction between production and identification priming.

  12. Posttest calculation of the PBF LOC-11B and LOC-11C experiments using RELAP4/MOD6. [PWR]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendrix, C.E.

    Comparisons between RELAP4/MOD6, Update 4 code-calculated and measured experimental data are presented for the PBF LOC-11C and LOC-11B experiments. Independent code verification techniques are now being developed and this study represents a preliminary effort applying structured criteria for developing computer models, selecting code input, and performing base-run analyses. Where deficiencies are indicated in the base-case representation of the experiment, methods of code and criteria improvement are developed and appropriate recommendations are made.

  13. Thermal physics in practice and its confrontation with school physics

    NASA Astrophysics Data System (ADS)

    Vochozka, Vladimír; Tesař, Jiří; Bednář, Vít

    2017-01-01

    Concepts such as heat, specific heat capacity and other terms of thermal physics are very abstract. For better understanding, teaching should include newly conceived experiments focused on the everyday experience of students. The paper evaluates thermal phenomena with the help of an infrared camera and surface temperature sensors for on-line measurement. The article focuses on the experimental verification of the law of conservation of energy in thermal physics, comparing the specific heat capacities of various substances and confronting the results with the established experience of pupils.
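
    A worked example of the energy-conservation check behind such a mixing experiment (with illustrative masses and temperatures, not the article's measured values) is sketched below.

      # Minimal check of energy conservation for a mixing experiment of the kind
      # measured with temperature sensors: hot water poured into cold water.
      # Heat lost by the hot sample equals heat gained by the cold one:
      #   m_hot * c * (T_hot - T_eq) = m_cold * c * (T_eq - T_cold)
      c_water = 4186.0              # J/(kg*K), specific heat capacity of water
      m_hot, T_hot = 0.20, 80.0     # kg, deg C  (illustrative values)
      m_cold, T_cold = 0.30, 20.0   # kg, deg C

      T_eq = (m_hot*T_hot + m_cold*T_cold) / (m_hot + m_cold)
      Q_released = m_hot*c_water*(T_hot - T_eq)
      Q_absorbed = m_cold*c_water*(T_eq - T_cold)
      print(f"equilibrium temperature: {T_eq:.1f} C")
      print(f"heat released {Q_released:.0f} J vs absorbed {Q_absorbed:.0f} J")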

  14. Experiment S-191 visible and infrared spectrometer

    NASA Technical Reports Server (NTRS)

    Linnell, E. R.

    1974-01-01

    The design, development, fabrication test, and utilization of the visible and infrared spectrometer portion of the S-191 experiment, part of the Earth Resources Experiment Package, on board Skylab is discussed. The S-191 program is described, as well as conclusions and recommendations for improvement of this type of instrument for future applications. Design requirements, instrument design approaches, and the test verification program are presented along with test results, including flight hardware calibration data. A brief discussion of operation during the Skylab mission is included. Documentation associated with the program is listed.

  15. Missouri Soybean Association Biodiesel Demonstration Project: Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ludwig, Dale; Hamilton, Jill

    The Missouri Soybean Association (MSA) and the National Biodiesel Board (NBB) partnered together to implement the MSA Biodiesel Demonstration project under a United States Department of Energy (DOE) grant. The goal of this project was to provide decision makers and fleet managers with information that could lead to the increased use of domestically produced renewable fuels and could reduce the harmful impacts of school bus diesel exhaust on children. This project was initiated in September 2004 and completed in April 2011. The project carried out a broad range of activities organized under four areas: 1. Petroleum and related industry education program for fuel suppliers; 2. Fleet evaluation program using B20 with a Missouri school district; 3. Outreach and awareness campaign for school district fleet managers; and 4. Support of ongoing B20 Fleet Evaluation Team (FET) data collection efforts with existing school districts. Technical support to the biodiesel industry was also provided through NBB’s Troubleshooting Hotline. The hotline program was established in 2008 to troubleshoot fuel quality issues and help facilitate smooth implementation of the RFS and is described in greater detail under Milestone A.1 - Promote Instruction and Guidance on Best Practices. As a result of this project’s efforts, MSA and NBB were able to successfully reach out to and support a broad spectrum of biodiesel users in Missouri and New England. The MSA Biodiesel Demonstration was funded through a FY2004 Renewable Energy Resources Congressional earmark. The initial focus of this project was to test and evaluate biodiesel blends coupled with diesel oxidation catalysts as an emissions reduction technology for school bus fleets in the United States. The project was designed to verify emissions reductions using Environmental Protection Agency (EPA) protocols, then document – with school bus fleet experience – the viability of utilizing B20 blends. The fleet experience was expected to support ongoing industry efforts to collect existing data and to increase awareness and knowledge among school district fleet managers. However, three years into the project, the original intent of the engine verification was no longer deemed by equipment manufacturers to be of sufficient economic interest to enter into a partnership. In response, MSA requested a project extension and re-scope to eliminate the aftermarket equipment verification and replace it with a petroleum education program. The revised project maintained four task areas with the following modifications. The first component was directed at increasing national compliance with newly initiated state level fuel blend mandates through a distributor education program. Component two was modified to eliminate the verification element and, instead, document operational data from biodiesel use in a district school bus fleet. Components three and four were unchanged and maintained their purpose of expanding upon the existing knowledge base of biodiesel use in school bus fleets.

  16. An Audit of Ward Experience as a Tool for Teaching Diagnosis in Pulmonary Medicine

    ERIC Educational Resources Information Center

    Kale, Madhav K.; and others

    1969-01-01

    Analyzes and compares diagnoses made over a five-year period to determine whether differences between diagnoses of students and those of specialists justified further verification by specialized staff, and whether the proportion of such differences had changed with time because of more effective teaching. (WM)

  17. Investigating Navy Officer Retention Using Data Farming

    DTIC Science & Technology

    2015-09-01

    ... runs on Microsoft Access. Contractors from SAG Corporation translated the code into Visual Basic for Applications (VBA), bringing several benefits ... Accessions ... Promotions ... Strategic Actions Group; SEED, Simulation Experiments & Efficient Design; URL, Unrestricted Line; VBA, Visual Basic for Applications; VV&A, Verification ...

  18. 25 CFR 38.5 - Qualifications for educators.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... verification by the ASE or the AEPA. Employees who falsify experience and employment history may be subject to... formal education and State certification requirements for tribal members who are hired to teach tribal... higher than the rate paid to qualified educators in teaching positions at that school. (c) Identification...

  19. Mormon Clients' Experiences of Conversion Therapy: The Need for a New Treatment Approach

    ERIC Educational Resources Information Center

    Beckstead, A. Lee; Morrow, Susan L.

    2004-01-01

    Perspectives were gathered of 50 Mormon individuals who had undergone counseling to change their sexual orientation. The data were analyzed using the constant comparative method and participant verification, thereby developing a grounded theory. A model emerged that depicted participants' intrapersonal and interpersonal motivations for seeking…

  20. Soft Drinks, Mind Reading, and Number Theory

    ERIC Educational Resources Information Center

    Schultz, Kyle T.

    2009-01-01

    Proof is a central component of mathematicians' work, used for verification, explanation, discovery, and communication. Unfortunately, high school students' experiences with proof are often limited to verifying mathematical statements or relationships that are already known to be true. As a result, students often fail to grasp the true nature of…

  1. Using Replication Projects in Teaching Research Methods

    ERIC Educational Resources Information Center

    Standing, Lionel G.; Grenier, Manuel; Lane, Erica A.; Roberts, Meigan S.; Sykes, Sarah J.

    2014-01-01

    It is suggested that replication projects may be valuable in teaching research methods, and also address the current need in psychology for more independent verification of published studies. Their use in an undergraduate methods course is described, involving student teams who performed direct replications of four well-known experiments, yielding…

  2. Putting the Laboratory at the Center of Teaching Chemistry

    ERIC Educational Resources Information Center

    Bopegedera, A. M. R. P.

    2011-01-01

    This article describes an effective approach to teaching chemistry by bringing the laboratory to the center of teaching, to bring the excitement of discovery to the learning process. The lectures and laboratories are closely integrated to provide a holistic learning experience. The laboratories progress from verification to open-inquiry and…

  3. Bistatic radar sea state monitoring system design

    NASA Technical Reports Server (NTRS)

    Ruck, G. T.; Krichbaum, C. K.; Everly, J. O.

    1975-01-01

    Remote measurement of the two-dimensional surface wave height spectrum of the ocean by the use of bistatic radar techniques was examined. Potential feasibility and experimental verification by field experiment are suggested. The required experimental hardware is defined along with the designing, assembling, and testing of several required experimental hardware components.

  4. Root-sum-square structural strength verification approach

    NASA Technical Reports Server (NTRS)

    Lee, Henry M.

    1994-01-01

    Utilizing a proposed fixture design, or some variation thereof, this report presents a verification approach for strength testing space flight payload components, electronics boxes, mechanisms, lines, fittings, etc., which traditionally do not lend themselves to classical static loading. The fixture, through use of the ordered Euler rotation angles derived herein, can be mounted on existing vibration shakers and provides an innovative method of applying single-axis flight load vectors. The versatile fixture effectively loads protoflight or prototype components in all three axes simultaneously by use of a sinusoidal burst of the desired magnitude at less than one-third the first resonant frequency. Cost savings along with improved hardware confidence are shown. The end product is an efficient way to verify experiment hardware for both random vibration and strength.
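
    A minimal sketch of the geometry involved, assuming a simple yaw-then-pitch angle convention and illustrative load values rather than the report's actual derivation, is shown below: the root-sum-square resultant sets the single-axis shaker input, and the two rotations orient the fixture so that the prescribed x, y and z load components are applied simultaneously.

      import numpy as np

      def single_axis_setup(load_xyz):
          """Given target flight limit loads (g) in the component x, y, z axes,
          return the single-axis input magnitude and two ordered rotation angles
          (about z, then about the rotated y) that point the shaker axis along
          the resultant. The angle convention is illustrative, not the report's."""
          lx, ly, lz = load_xyz
          magnitude = np.sqrt(lx**2 + ly**2 + lz**2)             # root-sum-square resultant
          yaw   = np.degrees(np.arctan2(ly, lx))                 # rotation about +z
          pitch = np.degrees(np.arctan2(lz, np.hypot(lx, ly)))   # elevation out of the x-y plane
          return magnitude, yaw, pitch

      mag, yaw, pitch = single_axis_setup((6.0, 4.0, 9.0))       # example flight loads, g
      print(f"shaker input {mag:.2f} g, fixture yaw {yaw:.1f} deg, pitch {pitch:.1f} deg")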

  5. Age and gender-invariant features of handwritten signatures for verification systems

    NASA Astrophysics Data System (ADS)

    AbdAli, Sura; Putz-Leszczynska, Joanna

    2014-11-01

    Handwritten signatures are among the most natural biometrics in the study of human physiological and behavioral patterns. Behavioral biometrics include signatures, which may differ with the owner's gender or age because of intrinsic or extrinsic factors. This paper presents the results of the authors' research on the influence of age and gender on verification factors. The experiments in this research were conducted using a database that contains signatures and their associated metadata. The algorithm used is based on the universal forgery feature idea, where a global classifier is able to classify a signature as genuine or as a forgery without actual knowledge of the signature template and its owner. Additionally, dimensionality reduction with the MRMR method is discussed.

  6. NEUTRON MULTIPLICITY AND ACTIVE WELL NEUTRON COINCIDENCE VERIFICATION MEASUREMENTS PERFORMED FOR MARCH 2009 SEMI-ANNUAL DOE INVENTORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dewberry, R.; Ayers, J.; Tietze, F.

    The Analytical Development (AD) Section field nuclear measurement group performed six 'best available technique' verification measurements to satisfy a DOE requirement instituted for the March 2009 semi-annual inventory. The requirement of reference 1 yielded the need for the SRNL Research Operations Department Material Control & Accountability (MC&A) group to measure the Pu content of five items and the highly enriched uranium (HEU) content of two. No 14Q-qualified measurement equipment was available to satisfy the requirement. The AD field nuclear group has routinely performed the required Confirmatory Measurements for the semi-annual inventories for fifteen years using sodium iodide and high purity germanium (HpGe) γ-ray pulse height analysis nondestructive assay (NDA) instruments. With appropriate γ-ray acquisition modeling, the HpGe spectrometers can be used to perform verification-type quantitative assay for Pu-isotopics and HEU content. The AD nuclear NDA group is widely experienced with this type of measurement and reports content for these species in requested process control, MC&A booking, and holdup measurement assays Site-wide. However, none of the AD HpGe γ-ray spectrometers have been 14Q-qualified, and the requirement of reference 1 specifically excluded a γ-ray PHA measurement from those it would accept for the required verification measurements. The requirement of reference 1 was a new requirement for which the Savannah River National Laboratory (SRNL) Research Operations Department (ROD) MC&A group was unprepared. The criteria for exemption from verification were: (1) isotope content below 50 grams; (2) intrinsically tamper indicating or TID sealed items which contain a Category IV quantity of material; (3) assembled components; and (4) laboratory samples. Therefore all SRNL Material Balance Area (MBA) items with greater than 50 grams total Pu or greater than 50 grams HEU were subject to a verification measurement. The pass/fail criteria of reference 7 stated 'The facility will report measured values, book values, and statistical control limits for the selected items to DOE SR...', and 'The site/facility operator must develop, document, and maintain measurement methods for all nuclear material on inventory'. These new requirements exceeded SRNL's experience with prior semi-annual inventory expectations, but allowed the AD nuclear field measurement group to demonstrate its excellent adaptability and superior flexibility in responding to unpredicted expectations from the DOE customer. The requirements yielded five SRNL items subject to Pu verification and two SRNL items subject to HEU verification. These items are listed and described in Table 1.

  7. Dosimetric accuracy of Kodak EDR2 film for IMRT verifications.

    PubMed

    Childress, Nathan L; Salehpour, Mohammad; Dong, Lei; Bloch, Charles; White, R Allen; Rosen, Isaac I

    2005-02-01

    Patient-specific intensity-modulated radiotherapy (IMRT) verifications require an accurate two-dimensional dosimeter that is not labor-intensive. We assessed the precision and reproducibility of film calibrations over time, measured the elemental composition of the film, measured the intermittency effect, and measured the dosimetric accuracy and reproducibility of calibrated Kodak EDR2 film for single-beam verifications in a solid water phantom and for full-plan verifications in a Rexolite phantom. Repeated measurements of the film sensitometric curve in a single experiment yielded overall uncertainties in dose of 2.1% local and 0.8% relative to 300 cGy. 547 film calibrations over an 18-month period, exposed to a range of doses from 0 to a maximum of 240 MU or 360 MU and using 6 MV or 18 MV energies, had optical density (OD) standard deviations that were 7%-15% of their average values. This indicates that daily film calibrations are essential when EDR2 film is used to obtain absolute dose results. An elemental analysis of EDR2 film revealed that it contains 60% as much silver and 20% as much bromine as Kodak XV2 film. EDR2 film also has an unusual 1.69:1 silver:halide molar ratio, compared with the XV2 film's 1.02:1 ratio, which may affect its chemical reactions. To test EDR2's intermittency effect, the OD generated by a single 300 MU exposure was compared to the ODs generated by exposing the film 1 MU, 2 MU, and 4 MU at a time to a total of 300 MU. An ion chamber recorded the relative dose of all intermittency measurements to account for machine output variations. Using small MU bursts to expose the film resulted in delivery times of 4 to 14 minutes and lowered the film's OD by approximately 2% for both 6 and 18 MV beams. This effect may result in EDR2 film underestimating absolute doses for patient verifications that require long delivery times. After using a calibration to convert EDR2 film's OD to dose values, film measurements agreed within 2% relative difference and 2 mm criteria to ion chamber measurements for both sliding window and step-and-shoot fluence map verifications. Calibrated film results agreed with ion chamber measurements to within 5 % /2 mm criteria for transverse-plane full-plan verifications, but were consistently low. When properly calibrated, EDR2 film can be an adequate two-dimensional dosimeter for IMRT verifications, although it may underestimate doses in regions with long exposure times.
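
    The daily calibration workflow implied here (expose calibration films, fit a sensitometric curve, then convert measured OD to dose) can be sketched as follows; the calibration points, the choice of a cubic fit, and the toy OD map are assumptions for illustration only.

      import numpy as np

      # Hypothetical daily sensitometric data: delivered dose (cGy) and measured
      # net optical density for EDR2 calibration films (values invented).
      dose_cal = np.array([0, 50, 100, 150, 200, 250, 300], dtype=float)
      od_cal   = np.array([0.00, 0.21, 0.41, 0.59, 0.76, 0.92, 1.06])

      # Fit a low-order polynomial dose-vs-OD curve for the day's calibration and
      # use it to convert a measured film OD map to dose.
      coeffs = np.polyfit(od_cal, dose_cal, 3)
      od_measured = np.array([[0.35, 0.48], [0.55, 0.70]])   # toy 2x2 "film" region
      dose_map = np.polyval(coeffs, od_measured)
      print(np.round(dose_map, 1))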

  8. Verification Games: Crowd-Sourced Formal Verification

    DTIC Science & Technology

    2016-03-01

    Final technical report, University of Washington, March 2016, covering June 2012 - September 2015 (contract FA8750-...). Abstract (fragment): Over the more than three years of the project Verification Games: Crowd-Sourced Formal Verification ...

  9. IGARSS 89: Canadian Symposium on Remote Sensing (12th) (Symposium Canadien sur la Teledetection): Quantitative Remote Sensing: An Economic Tool for the Nineties Held in Vancouver, Canada on 10-14 July 1989. Volume 2. Tuesday, July 11

    DTIC Science & Technology

    1989-07-14

    ... blunder-free DEMs, the manual verification of Ground Control Points (GCPs) and an editing stage are still needed ... the orbit and attitude of ... control the influence (weight) of the sources and model the non-Gaussian data ... each information class only had one subclass ... infinitesimally close laminae sliding parallel to a line of shear; a quasi-brittle inner regime and a non-linear viscous outer ...

  10. E-st@r-I experience: Valuable knowledge for improving the e-st@r-II design

    NASA Astrophysics Data System (ADS)

    Corpino, S.; Obiols-Rabasa, G.; Mozzillo, R.; Nichele, F.

    2016-04-01

    Many universities all over the world have now established hands-on education programs based on CubeSats. These small and cheap platforms are becoming more and more attractive also for other-than-educational missions, such as technology demonstration, science applications, and Earth observation. This new paradigm requires the development of adequate technology to increase CubeSat performance and mission reliability, because educationally-driven missions have often failed. In 2013 the ESA Education Office launched the Fly Your Satellite! Programme, which aims at increasing CubeSat mission reliability through several actions: to improve design implementation, to define best practices for conducting the verification process, and to make the CubeSat community aware of the importance of verification. Within this framework, the CubeSat team at Politecnico di Torino developed the e-st@r-II CubeSat as a follow-on of the e-st@r-I satellite, launched in 2012 on the VEGA Maiden Flight. E-st@r-I and e-st@r-II are both 1U satellites with educational and technology demonstration objectives: to give hands-on experience to university students and to test an active attitude determination and control system based on inertial and magnetic measurements with magnetic actuation. This paper describes the know-how gained thanks to the e-st@r-I mission, and how this heritage has been translated into the improvement of the new CubeSat in several areas and lifecycle phases. The CubeSat design has been reviewed to reduce the complexity of the assembly procedure and to deal with possible failures of the on-board computer, for example by re-coding the software in the communications subsystem. New procedures have been designed and assessed for the verification campaign according to ECSS rules and with the support of ESA specialists. Different operating modes have been implemented to handle some anomalies observed during the operations of the first satellite. A new version of the on-board software is one of the main modifications; in particular, the activation sequence of the satellite has been modified to achieve a stepwise switch-on of the satellite. In conclusion, the e-st@r-I experience has provided valuable lessons during its development, verification and on-orbit operations. This know-how has become crucial for the development of the e-st@r-II CubeSat, as illustrated in this article.

  11. A silicon strip detector array for energy verification and quality assurance in heavy ion therapy.

    PubMed

    Debrot, Emily; Newall, Matthew; Guatelli, Susanna; Petasecca, Marco; Matsufuji, Naruhiro; Rosenfeld, Anatoly B

    2018-02-01

    The measurement of depth dose profiles for range and energy verification of heavy ion beams is an important aspect of quality assurance procedures for heavy ion therapy facilities. The steep dose gradients in the Bragg peak region of these profiles require the use of detectors with high spatial resolution. The aim of this work is to characterize a one-dimensional monolithic silicon detector array called the "serial Dose Magnifying Glass" (sDMG) as an independent ion beam energy and range verification system for quality assurance conducted for ion beams used in heavy ion therapy. The sDMG detector consists of two linear arrays of 128 silicon sensitive volumes, each with an effective size of 2 mm × 50 μm × 100 μm, fabricated on a p-type substrate at a pitch of 200 μm along a single axis of detection. The detector was characterized for beam energy and range verification by measuring the response of the detector when irradiated with a 290 MeV/u 12C ion broad beam incident along the single axis of the detector embedded in a PMMA phantom. The energy of the 12C ion beam incident on the detector and the residual energy of an ion beam incident on the phantom were determined from the measured Bragg peak position in the sDMG. Ad hoc Monte Carlo simulations of the experimental setup were also performed to give further insight into the detector response. The relative response profiles along the single axis measured with the sDMG detector showed good agreement between experiment and simulation, with the position of the Bragg peak determined to fall within 0.2 mm, or 1.1% of the range in the detector, for the two cases. The energy of the beam incident on the detector was found to vary by less than 1% between experiment and simulation. The beam energy incident on the phantom was determined to be (280.9 ± 0.8) MeV/u from the experimental and (280.9 ± 0.2) MeV/u from the simulated profiles. These values coincide with the expected energy of 281 MeV/u. The sDMG detector response was studied experimentally and characterized using a Monte Carlo simulation. The sDMG detector was found to accurately determine the 12C beam energy and is suited for fast energy and range verification quality assurance. It is proposed that the sDMG is also applicable for verification of treatment planning systems that rely on particle range. © 2017 American Association of Physicists in Medicine.
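
    A minimal sketch of the range/energy verification step, locating the Bragg peak in a strip-detector profile with sub-pitch precision via a local parabolic fit, is given below; the toy profile, pitch handling, and fit window are illustrative assumptions, not the detector's actual calibration.

      import numpy as np

      # Hypothetical sDMG-style readout: per-channel signal along the beam axis
      # with a 0.2 mm pitch; values are invented to mimic a Bragg curve.
      pitch_mm = 0.2
      depth = np.arange(128) * pitch_mm
      signal = np.exp(-0.5*((depth - 18.0)/0.8)**2) + 0.3 + 0.02*depth  # toy profile

      # Locate the Bragg peak with a parabolic fit around the maximum channel to
      # obtain sub-pitch resolution, then report the peak depth.
      i = int(np.argmax(signal))
      window = slice(max(i - 3, 0), min(i + 4, len(depth)))
      a, b, c = np.polyfit(depth[window], signal[window], 2)
      peak_depth = -b / (2*a)
      print(f"Bragg peak at {peak_depth:.2f} mm along the detector axis")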

  12. Experimental Results from the Thermal Energy Storage-1 (TES-1) Flight Experiment

    NASA Technical Reports Server (NTRS)

    Wald, Lawrence W.; Tolbert, Carol; Jacqmin, David

    1995-01-01

    The Thermal Energy Storage-1 (TES-1) is a flight experiment that flew on the Space Shuttle Columbia (STS-62), in March 1994, as part of the OAST-2 mission. TES-1 is the first experiment in a four-experiment suite designed to provide data for understanding the long-duration microgravity behavior of thermal energy storage fluoride salts that undergo repeated melting and freezing. Such data have never been obtained before and have direct application for the development of space-based solar dynamic (SD) power systems. These power systems will store solar energy in a thermal energy salt such as lithium fluoride or calcium fluoride. The stored energy is extracted during the shade portion of the orbit. This enables the solar dynamic power system to provide constant electrical power over the entire orbit. Analytical computer codes have been developed for predicting the performance of a space-based solar dynamic power system. Experimental verification of the analytical predictions is needed prior to using the analytical results for future space power design applications. The four TES flight experiments will be used to obtain the needed experimental data. This paper will focus on the flight results from the first experiment, TES-1, in comparison to the predicted results from the Thermal Energy Storage Simulation (TESSIM) analytical computer code. The TES-1 conceptual development, hardware design, final development, and system verification testing were accomplished at the NASA Lewis Research Center (LeRC). TES-1 was developed under the In-Space Technology Experiment Program (IN-STEP), which sponsors NASA, industry, and university flight experiments designed to enable and enhance space flight technology. The IN-STEP Program is sponsored by the Office of Space Access and Technology (OSAT).

  13. The mental representation of living and nonliving things: differential weighting and interactivity of sensorial and non-sensorial features.

    PubMed

    Ventura, Paulo; Morais, José; Brito-Mendes, Carlos; Kolinsky, Régine

    2005-02-01

    Warrington and colleagues (Warrington & McCarthy, 1983, 1987; Warrington & Shallice, 1984) claimed that sensorial and functional-associative (FA) features are differentially important in determining the meaning of living things (LT) and nonliving things (NLT). The first aim of the present study was to evaluate this hypothesis through two different access tasks: feature generation (Experiment 1) and cued recall (Experiment 2). The results of both experiments provided consistent empirical support for Warrington and colleagues' assumption. The second aim of the present study was to test a new differential interactivity hypothesis that combines Warrington and colleagues' assumption with the notion of a higher number of intercorrelations, and hence a stronger connectivity, between sensorial and non-sensorial features for LTs than for NLTs. This hypothesis was motivated by previous reports of an uncrossed interaction between domain (LTs vs NLTs) and attribute type (sensorial vs FA) in, for example, a feature verification task (Laws, Humber, Ramsey, & McCarthy, 1995): while FA attributes are verified faster than sensorial attributes for NLTs, no difference is observed for LTs. We replicated and generalised this finding using several feature verification tasks on both written words and pictures (Experiment 3), including conditions aimed at minimising the intervention of priming biases and strategic or mnemonic processes (Experiment 4). The whole set of results suggests that both privileged relations between features and categories, and the differential importance of intercorrelations between features as a function of category, modulate access to semantic features.

  14. Expose : procedure and results of the joint experiment verification tests

    NASA Astrophysics Data System (ADS)

    Panitz, C.; Rettberg, P.; Horneck, G.; Rabbow, E.; Baglioni, P.

    The International Space Station will carry the EXPOSE facility, accommodated at the universal workplace URM-D located outside the Russian Service Module. The launch will be effected in 2005, and the facility is planned to stay in space for 1.5 years. The tray-like structure will accommodate 2 chemical and 6 biological PI-experiments or experiment systems of the ROSE (Response of Organisms to Space Environment) consortium. EXPOSE will support long-term in situ studies of microbes in artificial meteorites, as well as of microbial communities from special ecological niches, such as endolithic and evaporitic ecosystems. The either vented or sealed experiment pockets will be covered by an optical filter system to control the intensity and spectral range of solar UV irradiation. Control of sun exposure will be achieved by the use of individual shutters. To test the compatibility of the different biological systems and their adaptation to the opportunities and constraints of space conditions, a profound ground support program has been developed. The procedure and first results of these joint Experiment Verification Tests (EVT) will be presented. The results will be essential for the success of the EXPOSE mission and have been obtained in parallel with the development and construction of the final hardware design of the facility. The results of the mission will contribute to the understanding of the organic chemistry processes in space, the biological adaptation strategies to extreme conditions, e.g. on early Earth and Mars, and the distribution of life beyond its planet of origin.

  15. Lockheed L-1011 avionic flight control redundant systems

    NASA Technical Reports Server (NTRS)

    Throndsen, E. O.

    1976-01-01

    The Lockheed L-1011 automatic flight control systems - yaw stability augmentation and automatic landing - are described in terms of their redundancies. The reliability objectives for these systems are discussed and related to in-service experience. In general, the availability of the stability augmentation system is higher than the original design requirement, but is commensurate with early estimates. The in-service experience with automatic landing is not sufficient to provide verification of Category 3 automatic landing system estimated availability.

  16. Space Construction Automated Fabrication Experiment Definition Study (SCAFEDS), part 3. Volume 3: Requirements

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The performance, design and verification requirements for the Space Construction Automated Fabrication Experiment (SCAFE) are defined. The SCAFE program defines, develops, and demonstrates the techniques, processes, and equipment required for the automatic fabrication of structural elements in space and for the assembly of such elements into a large, lightweight structure. The program defines a large structural platform to be constructed in orbit using the space shuttle as a launch vehicle and construction base.

  17. Hyperplex-MRM: a hybrid multiple reaction monitoring method using mTRAQ/iTRAQ labeling for multiplex absolute quantification of human colorectal cancer biomarker.

    PubMed

    Yin, Hong-Rui; Zhang, Lei; Xie, Li-Qi; Huang, Li-Yong; Xu, Ye; Cai, San-Jun; Yang, Peng-Yuan; Lu, Hao-Jie

    2013-09-06

    Novel biomarker verification assays are urgently required to improve the efficiency of biomarker development. Benefitting from lower development costs, multiple reaction monitoring (MRM) has been used for biomarker verification as an alternative to immunoassay. However, in general MRM analysis, only one sample can be quantified in a single experiment, which restricts its application. Here, a Hyperplex-MRM quantification approach, which combines mTRAQ for absolute quantification and iTRAQ for relative quantification, was developed to increase the throughput of biomarker verification. In this strategy, equal amounts of internal standard peptides were labeled with mTRAQ reagents Δ0 and Δ8, respectively, as double references, while 4-plex iTRAQ reagents were used to label four different samples as an alternative to mTRAQ Δ4. From the MRM trace and MS/MS spectrum, total amounts and relative ratios of target proteins/peptides of four samples could be acquired simultaneously. Accordingly, absolute amounts of target proteins/peptides in four different samples could be obtained in a single run. In addition, double references were used to increase the reliability of the quantification results. Using this approach, three biomarker candidates, adenosylhomocysteinase (AHCY), cathepsin D (CTSD), and lysozyme C (LYZ), were successfully quantified in colorectal cancer (CRC) tissue specimens of different stages with high accuracy, sensitivity, and reproducibility. To summarize, we demonstrated a promising quantification method for high-throughput verification of biomarker candidates.
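
    The arithmetic of the two-level readout can be illustrated with a toy calculation: the MRM trace against the doubled mTRAQ internal standards gives the pooled absolute amount, and the iTRAQ reporter-ion ratios distribute it over the four samples. All numbers and the simple proportional model below are assumptions for illustration, not values or formulas from the paper.

      import numpy as np

      std_amount_fmol = 100.0                  # amount of each spiked internal standard
      area_analyte    = 8.4e5                  # MRM peak area of the pooled 4-plex analyte
      area_std        = np.array([4.1e5, 4.3e5])   # mTRAQ delta0 and delta8 references

      # Pooled absolute amount, referenced to the mean of the double standards.
      total_fmol = std_amount_fmol * area_analyte / area_std.mean()

      # Split the pooled amount across the four samples by iTRAQ reporter intensities.
      reporters = np.array([0.9e4, 1.3e4, 2.1e4, 1.7e4])   # iTRAQ 114/115/116/117
      per_sample_fmol = total_fmol * reporters / reporters.sum()
      print(np.round(per_sample_fmol, 1))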

  18. Experimental demonstration of an isotope-sensitive warhead verification technique using nuclear resonance fluorescence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vavrek, Jayson R.; Henderson, Brian S.; Danagoulian, Areg

    Future nuclear arms reduction efforts will require technologies to verify that warheads slated for dismantlement are authentic without revealing any sensitive weapons design information to international inspectors. Despite several decades of research, no technology has met these requirements simultaneously. Recent work by Kemp et al. [Kemp RS, Danagoulian A, Macdonald RR, Vavrek JR (2016) Proc Natl Acad Sci USA 113:8618–8623] has produced a novel physical cryptographic verification protocol that approaches this treaty verification problem by exploiting the isotope-specific nature of nuclear resonance fluorescence (NRF) measurements to verify the authenticity of a warhead. To protect sensitive information, the NRF signal from the warhead is convolved with that of an encryption foil that contains key warhead isotopes in amounts unknown to the inspector. The convolved spectrum from a candidate warhead is statistically compared against that from an authenticated template warhead to determine whether the candidate itself is authentic. Here in this paper we report on recent proof-of-concept warhead verification experiments conducted at the Massachusetts Institute of Technology. Using high-purity germanium (HPGe) detectors, we measured NRF spectra from the interrogation of proxy “genuine” and “hoax” objects by a 2.52 MeV endpoint bremsstrahlung beam. The observed differences in NRF intensities near 2.2 MeV indicate that the physical cryptographic protocol can distinguish between proxy genuine and hoax objects with high confidence in realistic measurement times.

  19. Forest Carbon Monitoring and Reporting for REDD+: What Future for Africa?

    PubMed

    Gizachew, Belachew; Duguma, Lalisa A

    2016-11-01

    A climate change mitigation mechanism for emissions reduction from reduced deforestation and forest degradation, plus forest conservation, sustainable management of forests, and enhancement of carbon stocks (REDD+), has received international political support in the climate change negotiations. The mechanism will require, among other things, an unprecedented technical capacity for monitoring, reporting and verification of carbon emissions from the forest sector. A functional monitoring, reporting and verification system requires inventories of forest area, carbon stocks and their changes, both for constructing the forest reference emissions level and for compiling the report on actual emissions; such inventories are essentially lacking in developing countries, particularly in Africa. The purpose of this essay is to contribute to a better understanding of the state and prospects of forest monitoring and reporting in the context of REDD+ in Africa. We argue that monitoring and reporting capacities in Africa fall short of the stringent requirements of the methodological guidance for monitoring, reporting and verification for REDD+, and this may weaken the prospects for successfully implementing REDD+ in the continent. We present the challenges and prospects in the national forest inventory, remote sensing and reporting infrastructures. North-South and South-South collaboration, as well as governments' own investments in monitoring, reporting and verification systems, could help Africa leapfrog in monitoring and reporting. These could be delivered through negotiations for the transfer of technology, technical capacities, and experiences that exist among developed countries that traditionally compile forest carbon reports in the context of the Kyoto protocol.

  20. Experimental demonstration of an isotope-sensitive warhead verification technique using nuclear resonance fluorescence

    DOE PAGES

    Vavrek, Jayson R.; Henderson, Brian S.; Danagoulian, Areg

    2018-04-10

    Future nuclear arms reduction efforts will require technologies to verify that warheads slated for dismantlement are authentic without revealing any sensitive weapons design information to international inspectors. Despite several decades of research, no technology has met these requirements simultaneously. Recent work by Kemp et al. [Kemp RS, Danagoulian A, Macdonald RR, Vavrek JR (2016) Proc Natl Acad Sci USA 113:8618–8623] has produced a novel physical cryptographic verification protocol that approaches this treaty verification problem by exploiting the isotope-specific nature of nuclear resonance fluorescence (NRF) measurements to verify the authenticity of a warhead. To protect sensitive information, the NRF signal from the warhead is convolved with that of an encryption foil that contains key warhead isotopes in amounts unknown to the inspector. The convolved spectrum from a candidate warhead is statistically compared against that from an authenticated template warhead to determine whether the candidate itself is authentic. Here in this paper we report on recent proof-of-concept warhead verification experiments conducted at the Massachusetts Institute of Technology. Using high-purity germanium (HPGe) detectors, we measured NRF spectra from the interrogation of proxy “genuine” and “hoax” objects by a 2.52 MeV endpoint bremsstrahlung beam. The observed differences in NRF intensities near 2.2 MeV indicate that the physical cryptographic protocol can distinguish between proxy genuine and hoax objects with high confidence in realistic measurement times.

  1. Probabilistic Sizing and Verification of Space Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Denaux, David; Ballhause, Dirk; Logut, Daniel; Lucarelli, Stefano; Coe, Graham; Laine, Benoit

    2012-07-01

    Sizing of ceramic parts is best optimised using a probabilistic approach which takes into account the preexisting flaw distribution in the ceramic part to compute a probability of failure of the part depending on the applied load, instead of a maximum allowable load as for a metallic part. This requires extensive knowledge of the material itself but also an accurate control of the manufacturing process. In the end, risk reduction approaches such as proof testing may be used to lower the final probability of failure of the part. Sizing and verification of ceramic space structures have been performed by Astrium for more than 15 years, both with Zerodur and SiC: Silex telescope structure, Seviri primary mirror, Herschel telescope, Formosat-2 instrument, and other ceramic structures flying today. Throughout this period of time, Astrium has investigated and developed experimental ceramic analysis tools based on the Weibull probabilistic approach. In the scope of the ESA/ESTEC study: “Mechanical Design and Verification Methodologies for Ceramic Structures”, which is to be concluded in the beginning of 2012, existing theories, technical state-of-the-art from international experts, and Astrium experience with probabilistic analysis tools have been synthesized into a comprehensive sizing and verification method for ceramics. Both classical deterministic and more optimised probabilistic methods are available, depending on the criticality of the item and on optimisation needs. The methodology, based on proven theory, has been successfully applied to demonstration cases and has shown its practical feasibility.
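
    The Weibull approach referred to above links an applied stress to a probability of failure through the flaw statistics of the ceramic. The following minimal Python sketch illustrates the idea only; it is not the Astrium/ESA methodology itself, and all parameter values are hypothetical.

      import numpy as np

      def weibull_failure_probability(sigma, sigma_0, m, volume_ratio=1.0):
          """Two-parameter Weibull estimate of the failure probability of a
          brittle (ceramic) part under a uniform tensile stress `sigma`.

          sigma_0      : characteristic strength (same units as sigma)
          m            : Weibull modulus (scatter of the flaw distribution)
          volume_ratio : stressed volume relative to the reference volume
          """
          return 1.0 - np.exp(-volume_ratio * (sigma / sigma_0) ** m)

      # Example: a part stressed to 60% of its characteristic strength
      print(weibull_failure_probability(sigma=120.0, sigma_0=200.0, m=10.0))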

  2. Methodology for Software Reliability Prediction. Volume 2.

    DTIC Science & Technology

    1987-11-01

    The overall acquisition program shall include the resources, schedule, management, structure, and controls necessary to ensure that specified AD... Independent Verification/Validation - Programming Team Structure - Educational Level of Team Members - Experience Level of Team Members - Methods Used... Prediction or Estimation Parameter Supported: Software Characteristics. 3. Objectives: Structured programming studies and Government ... procurement

  3. Case-Study of the High School Student's Family Values Formation

    ERIC Educational Resources Information Center

    Valeeva, Roza A.; Korolyeva, Natalya E.; Sakhapova, Farida Kh.

    2016-01-01

    The aim of the research is the theoretical justification and experimental verification of content, complex forms and methods to ensure effective development of the high school students' family values formation. 93 lyceum students from Kazan took part in the experiment. To study students' family values we have applied method of studying personality…

  4. Families of Functions and Functions of Proof

    ERIC Educational Resources Information Center

    Landman, Greisy Winicki

    2002-01-01

    This article describes an activity for secondary school students that may constitute an appropriate opportunity to discuss with them the idea of proof, particularly in an algebraic context. During the activity the students may experience and understand some of the roles played by proof in mathematics in addition to verification of truth:…

  5. Verification of Social Network Site Use Behavior of the University Physical Education Students

    ERIC Educational Resources Information Center

    Liu, Li-Wei; Chang, Chia-Ming; Huang, Hsiu-Chin; Chang, Yu-Liang

    2016-01-01

    This study aims to explore the relationships among performance expectancy, effort expectancy, social influence, facilitating condition, behavioral intention and use behavior of university physical education students in Taiwan. Moreover, it also intends to examine the moderating effects of gender, age, and experience on the UTAUT model. The targets…

  6. Attention and Implicit Memory in the Category-Verification and Lexical Decision Tasks

    ERIC Educational Resources Information Center

    Mulligan, Neil W.; Peterson, Daniel

    2008-01-01

    Prior research on implicit memory appeared to support 3 generalizations: Conceptual tests are affected by divided attention, perceptual tasks are affected by certain divided-attention manipulations, and all types of priming are affected by selective attention. These generalizations are challenged in experiments using the implicit tests of category…

  7. Bullying in School: Case Study of Prevention and Psycho-Pedagogical Correction

    ERIC Educational Resources Information Center

    Ribakova, Laysan A.; Valeeva, Roza A.; Merker, Natalia

    2016-01-01

    The purpose of the study was the theoretical justification and experimental verification of content, complex forms and methods to ensure effective prevention and psycho-pedagogical correction of bullying in school. 53 teenage students from Kazan took part in the experiment. A complex of diagnostic techniques for the detection of violence and…

  8. VERIFICATION AND USES OF THE ENVIRONMENTAL PROTECTION AGENCY (EPA) INDOOR AIR QUALITY MODEL

    EPA Science Inventory

    The paper describes a set of experiments used to verify an indoor air quality (IAQ) model for estimating the impact of various pollution sources on IAQ in a multiroom building. The model treats each room as a well-mixed chamber that contains pollution sources and sinks. The model a...

  9. Modeling the Water Balloon Slingshot

    ERIC Educational Resources Information Center

    Bousquet, Benjamin D.; Figura, Charles C.

    2013-01-01

    In the introductory physics courses at Wartburg College, we have been working to create a lab experience focused on the scientific process itself rather than verification of physical laws presented in the classroom or textbook. To this end, we have developed a number of open-ended modeling exercises suitable for a variety of learning environments,…

  10. Dosimetry investigation of MOSFET for clinical IMRT dose verification.

    PubMed

    Deshpande, Sudesh; Kumar, Rajesh; Ghadi, Yogesh; Neharu, R M; Kannan, V

    2013-06-01

    In IMRT, patient-specific dose verification is followed regularly at each centre. Simple and efficient dosimetry techniques play a very important role in routine clinical dosimetry QA. The MOSFET dosimeter offers several advantages over conventional dosimeters, such as its small detector size, immediate readout, immediate reuse, and multiple-point dose measurements. To use the MOSFET as a routine clinical dosimetry system for pre-treatment dose verification in IMRT, a comprehensive set of experiments was conducted to investigate its linearity, reproducibility, dose rate effect and angular dependence for a 6 MV x-ray beam. The MOSFETs show a linear response with a linearity coefficient of 0.992 for a dose range of 35 cGy to 427 cGy. The reproducibility of the MOSFET was measured over ten consecutive irradiations in the dose range of 35 cGy to 427 cGy and was found to be within 4% up to 70 cGy and within 1.4% above 70 cGy. The dose rate effect on the MOSFET was investigated in the dose rate range 100 MU/min to 600 MU/min; the response of the MOSFET varies from -1.7% to 2.1%. The angular responses of the MOSFETs were measured at 10 degree intervals from 90 to 270 degrees in an anticlockwise direction, normalized at gantry angle zero, and found to be in the range of 0.98 ± 0.014 to 1.01 ± 0.014. The MOSFETs were calibrated in a phantom which was later used for IMRT verification. The measured calibration coefficients were found to be 1 mV/cGy and 2.995 mV/cGy in standard and high sensitivity mode, respectively. The MOSFETs were used for pre-treatment dose verification in IMRT. Nine dosimeters were used for each patient to measure the dose in different planes. The average variation between calculated and measured dose at any location was within 3%. Dose verification using the MOSFET and IMRT phantom was found to be quick and efficient and well suited to a busy radiotherapy department.
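
    For readers unfamiliar with calibration-coefficient dosimetry, converting a MOSFET threshold-voltage shift to dose and comparing it with the planned dose is simple arithmetic. The sketch below is illustrative only: the voltage shift and planned dose are hypothetical numbers, and only the roughly 1 mV/cGy standard-sensitivity coefficient is taken from the abstract.

      def mosfet_dose_cGy(delta_mV, calibration_mV_per_cGy=1.0):
          """Convert a MOSFET threshold-voltage shift (mV) to dose (cGy)
          using a measured calibration coefficient."""
          return delta_mV / calibration_mV_per_cGy

      def percent_difference(measured, calculated):
          """Signed percentage difference of measured vs. calculated dose."""
          return 100.0 * (measured - calculated) / calculated

      # Hypothetical reading in standard-sensitivity mode (~1 mV/cGy)
      measured = mosfet_dose_cGy(delta_mV=182.0, calibration_mV_per_cGy=1.0)
      print(percent_difference(measured, calculated=180.0))  # expect within ~3%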

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    English, Shawn Allen; Nelson, Stacy Michelle; Briggs, Timothy

    Presented is a model verification and validation effort using low-velocity impact (LVI) of carbon fiber reinforced polymer laminate experiments. A flat cylindrical indenter impacts the laminate with enough energy to produce delamination, matrix cracks and fiber breaks. Included in the experimental efforts are ultrasonic scans of the damage for qualitative validation of the models. However, the primary quantitative metrics of validation are the force time history measured through the instrumented indenter and the initial and final velocities. The simulations, which are run on Sandia's Sierra finite element codes, consist of all physics and material parameters of importance as determined by a sensitivity analysis conducted on the LVI simulation. A novel orthotropic damage and failure constitutive model that is capable of predicting progressive composite damage and failure is described in detail, and material properties are measured, estimated from micromechanics, or optimized through calibration. A thorough verification and calibration to the accompanying experiments are presented. Special emphasis is given to the four-point bend experiment. For all simulations of interest, the mesh and material behavior is verified through extensive convergence studies. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution which is then compared to experimental output. The result is a quantifiable confidence in material characterization and model physics when simulating this phenomenon in structures of interest.

  12. [Optimization of vacuum belt drying process of Gardeniae Fructus in Reduning injection by Box-Behnken design-response surface methodology].

    PubMed

    Huang, Dao-sheng; Shi, Wei; Han, Lei; Sun, Ke; Chen, Guang-bo; Wu, Jian-xiong; Xu, Gui-hong; Bi, Yu-an; Wang, Zhen-zhong; Xiao, Wei

    2015-06-01

    To optimize the vacuum belt drying process conditions for Gardeniae Fructus extract from Reduning injection by Box-Behnken design-response surface methodology, a three-factor, three-level Box-Behnken experimental design was employed on the basis of single-factor experiments. With drying temperature, drying time and feeding speed as independent variables and the content of geniposide as the dependent variable, the experimental data were fitted to a second-order polynomial equation, establishing the mathematical relationship between the content of geniposide and the respective variables. With the experimental data analyzed by Design-Expert 8.0.6, the optimal drying parameters were as follows: drying temperature 98.5 degrees C, drying time 89 min, and feeding speed 99.8 r x min(-1). Three verification experiments were carried out under these conditions and the measured average content of geniposide was 564.108 mg x g(-1), close to the model prediction of 563.307 mg x g(-1). According to the verification tests, the Gardeniae Fructus belt drying process is steady and feasible, so single-factor experiments combined with response surface methodology (RSM) can be used to optimize the drying technology of Reduning injection Gardenia extract.
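
    The second-order polynomial model underlying a Box-Behnken design can be fitted by ordinary least squares. The Python sketch below shows the idea for three coded factors; the 15-run matrix is the standard three-factor Box-Behnken layout, but the response values are random placeholders rather than data from the study, and no attempt is made to reproduce its coefficients.

      import numpy as np
      from itertools import combinations

      def quadratic_design_matrix(X):
          """Second-order model terms for factors X (n_runs x k):
          intercept, linear, two-factor interaction, and quadratic columns."""
          n, k = X.shape
          cols = [np.ones(n)]
          cols += [X[:, i] for i in range(k)]
          cols += [X[:, i] * X[:, j] for i, j in combinations(range(k), 2)]
          cols += [X[:, i] ** 2 for i in range(k)]
          return np.column_stack(cols)

      # Coded levels (-1, 0, +1) of temperature, time, feeding speed
      X = np.array([[-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
                    [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
                    [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
                    [0, 0, 0], [0, 0, 0], [0, 0, 0]], dtype=float)
      y = np.random.default_rng(0).normal(550.0, 10.0, len(X))  # placeholder responses
      coeffs, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
      print(coeffs)  # intercept, linear, interaction, quadratic coefficients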

  13. Inattentive listening undermines self-verification in personal storytelling.

    PubMed

    Pasupathi, Monisha; Rich, Ben

    2005-08-01

    Two studies explore the narrative construction of self-perceptions in conversational storytelling among pairs of same-sex friends. Specifically, the studies examined how listener behavior can support or undermine attempts to self-verify in personal storytelling. In two studies (n=100 dyads), speakers told attentive, distracted, or disagreeable (Study 1 only) friends about a recent experience. Distracted, but not disagreeable, friends tended to undermine participants' attempts to verify their self-perception of being interested in an activity (Study 1) or their self-perception that an event was typical for them (Study 2). These results support the notion that friends can be an important source of influence on self-perceptions and, perhaps surprisingly, suggest that responsiveness from friends, rather than agreement per se, may be crucial for supporting self-verification processes.

  14. Verification of a three-dimensional FEM model for FBGs in PANDA fibers by transversal load experiments

    NASA Astrophysics Data System (ADS)

    Fischer, Bennet; Hopf, Barbara; Lindner, Markus; Koch, Alexander W.; Roths, Johannes

    2017-04-01

    A 3D FEM model of an FBG in a PANDA fiber with an extended fiber length of 25.4 mm is presented. Simulating long fiber lengths with limited computer power is achieved by using an iterative solver and by optimizing the FEM mesh. For verification purposes, the model is adapted to a configuration with transversal loads on the fiber. The 3D FEM model results correspond with experimental data and with the results of an additional 2D FEM plain strain model. In further studies, this 3D model shall be applied to more sophisticated situations, for example to study the temperature dependence of surface-glued or embedded FBGs in PANDA fibers that are used for strain-temperature decoupling.

  15. Shuttle-tethered satellite system definition study extension

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A system requirements definition and configuration study (Phase B) of the Tethered Satellite System (TSS) was conducted during the period 14 November 1977 to 27 February 1979. Subsequently a study extension was conducted during the period 13 June 1979 to 30 June 1980, for the purpose of refining the requirements identified during the main phase of the study, and studying in some detail the implications of accommodating various types of scientific experiments on the initial verification flight mission. An executive overview is given of the Tethered Satellite System definition developed during the study. The results of specific study tasks undertaken in the extension phase of the study are reported. Feasibility of the Tethered Satellite System has been established with reasonable confidence and the groundwork laid for proceeding with hardware design for the verification mission.

  16. Verification of Emergent Behaviors in Swarm-based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James

    2004-01-01

    The emergent properties of swarms make swarm-based missions powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of swarm-based missions. The Autonomous Nano-Technology Swarm (ANTS) mission is being used as an example and case study for swarm-based missions to experiment and test current formal methods with intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior. This paper introduces how intelligent swarm technology is being proposed for NASA missions, and gives the results of a comparison of several formal methods and approaches for specifying intelligent swarm-based systems and their effectiveness for predicting emergent behavior.

  17. Skylab materials processing facility experiment developer's report

    NASA Technical Reports Server (NTRS)

    Parks, P. G.

    1975-01-01

    The development of the Skylab M512 Materials Processing Facility is traced from the design of a portable, self-contained electron beam welding system for terrestrial applications to the highly complex experiment system ultimately developed for three Skylab missions. The M512 experiment facility was designed to support six in-space experiments intended to explore the advantages of manufacturing materials in the near-zero-gravity environment of Earth orbit. Detailed descriptions of the M512 facility and related experiment hardware are provided, with discussions of hardware verification and man-machine interfaces included. An analysis of the operation of the facility and experiments during the three Skylab missions is presented, including discussions of the hardware performance, anomalies, and data returned to earth.

  18. Additional confirmation of the validity of laboratory simulation of cloud radiances

    NASA Technical Reports Server (NTRS)

    Davis, J. M.; Cox, S. K.

    1986-01-01

    The results of a laboratory experiment are presented that provide additional verification of the methodology adopted for simulation of the radiances reflected from fields of optically thick clouds using the Cloud Field Optical Simulator (CFOS) at Colorado State University. The comparison of these data with their theoretically derived counterparts indicates that the crucial mechanism of cloud-to-cloud radiance field interaction is accurately simulated in the CFOS experiments and adds confidence to the manner in which the optical depth is scaled.

  19. Lay out, test verification and in orbit performance of HELIOS a temperature control system

    NASA Technical Reports Server (NTRS)

    Brungs, W.

    1975-01-01

    HELIOS temperature control system is described. The main design features and the impact of interactions between experiment, spacecraft system, and temperature control system requirements on the design are discussed. The major limitations of the thermal design regarding a closer sun approach are given and related to test experience and performance data obtained in orbit. Finally the validity of the test results achieved with prototype and flight spacecraft is evaluated by comparison between test data, orbit temperature predictions and flight data.

  20. Development Approaches Coupled with Verification and Validation Methodologies for Agent-Based Mission-Level Analytical Combat Simulations

    DTIC Science & Technology

    2004-03-01

    When applying experience to new situations, the process is very similar. Faced with a new situation, a human generally looks for ways in which... find the best course of action; the human would compare current goals to those faced in previous experiences and choose the path that...

  1. SecPop Version 4: Sector Population, Land Fraction and Economic Estimation Program: Users' Guide, Model Manual and Verification Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weber, Scott; Bixler, Nathan E.; McFadden, Katherine Letizia

    In 1973 the U.S. Environmental Protection Agency (EPA) developed SecPop to calculate population estimates to support a study on air quality. The Nuclear Regulatory Commission (NRC) adopted this program to support siting reviews for nuclear power plant construction and license applications. Currently SecPop is used to prepare site data input files for offsite consequence calculations with the MELCOR Accident Consequence Code System (MACCS). SecPop enables the use of site-specific population, land use, and economic data for a polar grid defined by the user. Updated versions of SecPop have been released to use U.S. decennial census population data. SECPOP90 was released in 1997 to use 1990 population and economic data. SECPOP2000 was released in 2003 to use 2000 population data and 1997 economic data. This report describes the current code version, SecPop version 4.3, which uses 2010 population data and both 2007 and 2012 economic data. It is also compatible with 2000 census and 2002 economic data. At the time of this writing, the current version of SecPop is 4.3.0, and that version is described herein. This report contains guidance for the installation and use of the code as well as a description of the theory, models, and algorithms involved. This report contains appendices which describe the development of the 2010 census file, 2007 county file, and 2012 county file. Finally, an appendix is included that describes the validation assessments performed.

  2. Microbial ureolysis in the seawater-catalysed urine phosphorus recovery system: Kinetic study and reactor verification.

    PubMed

    Tang, Wen-Tao; Dai, Ji; Liu, Rulong; Chen, Guang-Hao

    2015-12-15

    Our previous study has confirmed the feasibility of using seawater as an economical precipitant for urine phosphorus (P) precipitation. However, we still understand very little about ureolysis in the Seawater-based Urine Phosphorus Recovery (SUPR) system, despite it being a crucial step for urine P recovery. In this study, batch experiments were conducted to investigate the kinetics of microbial ureolysis in the seawater-urine system. Indigenous bacteria from urine and seawater exhibited relatively low ureolytic activity, but they adapted quickly to the urine-seawater mixture during batch cultivation. During cultivation, both the abundance and the specific ureolysis rate of the indigenous bacteria were greatly enhanced, as confirmed by a biomass-dependent Michaelis-Menten model. The period for full ureolysis decreased from 180 h to 2.5 h after four cycles of cultivation. Based on the successful cultivation, a lab-scale SUPR reactor was set up to verify the fast ureolysis and efficient P recovery in the SUPR system. Nearly complete urine P removal was achieved in the reactor in 6 h without adding any chemicals. Terminal Restriction Fragment Length Polymorphism (TRFLP) analysis revealed that the predominant groups of bacteria in the SUPR reactor likely originated from seawater rather than urine. Moreover, batch tests confirmed the high ureolysis rates and high phosphorus removal efficiency induced by cultivated bacteria in the SUPR reactor under seawater-to-urine mixing ratios ranging from 1:1 to 9:1. This study has proved that the enrichment of indigenous bacteria in the SUPR system can provide sufficient ureolytic activity for phosphate precipitation, thus providing an efficient and economical method for urine P recovery. Copyright © 2015 Elsevier Ltd. All rights reserved.
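
    A biomass-dependent Michaelis-Menten rate of the kind mentioned in the abstract can be integrated numerically to follow substrate (urea) depletion over time. The sketch below is a generic illustration with hypothetical parameter values and units; it is not the calibrated model from the study.

      import numpy as np

      def simulate_ureolysis(S0, X, v_max, K_m, dt=0.01, t_end=10.0):
          """Integrate the biomass-dependent Michaelis-Menten rate
              dS/dt = -v_max * X * S / (K_m + S)
          with forward Euler. All units are arbitrary and parameters hypothetical."""
          t = np.arange(0.0, t_end, dt)
          S = np.empty_like(t)
          S[0] = S0
          for i in range(1, len(t)):
              rate = v_max * X * S[i - 1] / (K_m + S[i - 1])
              S[i] = max(S[i - 1] - rate * dt, 0.0)
          return t, S

      t, S = simulate_ureolysis(S0=250.0, X=0.5, v_max=40.0, K_m=50.0)
      print(S[-1])  # remaining substrate at the end of the run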

  3. Verification of watershed vegetation restoration policies, arid China

    PubMed Central

    Zhang, Chengqi; Li, Yu

    2016-01-01

    Verification of restoration policies that have been implemented is of significance for simultaneously reducing global environmental risks while meeting economic development goals. This paper proposes a novel method, based on the idea of multiple time scales, to verify ecological restoration policies in the Shiyang River drainage basin, arid China. We integrated modern pollen transport characteristics of the entire basin and pollen records from 8 Holocene sedimentary sections, and quantitatively reconstructed the millennial-scale changes of watershed vegetation zones by defining a new pollen-precipitation index. Meanwhile, the Empirical Orthogonal Function method was used to quantitatively analyze spatial and temporal variations of the Normalized Difference Vegetation Index in summer (June to August) of 2000–2014. By contrasting the vegetation changes mainly controlled by millennial-scale natural ecological evolution with those under modern ecological restoration measures, we found that vegetation changes of the entire Shiyang River drainage basin are synchronous on both time scales, and that the current ecological restoration policies meet the requirements of long-term restoration objectives and show promising early results on ecological environmental restoration. Our findings present an innovative method to verify river ecological restoration policies, and also provide a scientific basis for proposing future emphases of ecological restoration strategies. PMID:27470948

  4. Verification of watershed vegetation restoration policies, arid China

    NASA Astrophysics Data System (ADS)

    Zhang, Chengqi; Li, Yu

    2016-07-01

    Verification of restoration policies that have been implemented is of significance for simultaneously reducing global environmental risks while meeting economic development goals. This paper proposes a novel method, based on the idea of multiple time scales, to verify ecological restoration policies in the Shiyang River drainage basin, arid China. We integrated modern pollen transport characteristics of the entire basin and pollen records from 8 Holocene sedimentary sections, and quantitatively reconstructed the millennial-scale changes of watershed vegetation zones by defining a new pollen-precipitation index. Meanwhile, the Empirical Orthogonal Function method was used to quantitatively analyze spatial and temporal variations of the Normalized Difference Vegetation Index in summer (June to August) of 2000-2014. By contrasting the vegetation changes mainly controlled by millennial-scale natural ecological evolution with those under modern ecological restoration measures, we found that vegetation changes of the entire Shiyang River drainage basin are synchronous on both time scales, and that the current ecological restoration policies meet the requirements of long-term restoration objectives and show promising early results on ecological environmental restoration. Our findings present an innovative method to verify river ecological restoration policies, and also provide a scientific basis for proposing future emphases of ecological restoration strategies.
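
    Empirical Orthogonal Function (EOF) analysis of a space-time field such as summer NDVI is typically computed from the singular value decomposition of the anomaly matrix. The following sketch illustrates the computation on random placeholder data; the grid size, record length and values are hypothetical and unrelated to the study's dataset.

      import numpy as np

      def eof_analysis(field, n_modes=3):
          """EOF decomposition of a space-time field (rows = time steps,
          columns = grid cells) via SVD of the anomalies."""
          anomalies = field - field.mean(axis=0)
          U, s, Vt = np.linalg.svd(anomalies, full_matrices=False)
          variance_fraction = s**2 / np.sum(s**2)
          pcs = U[:, :n_modes] * s[:n_modes]   # principal-component time series
          eofs = Vt[:n_modes]                  # spatial patterns
          return eofs, pcs, variance_fraction[:n_modes]

      # Hypothetical NDVI series: 15 summers x 500 grid cells
      ndvi = np.random.default_rng(1).random((15, 500))
      eofs, pcs, var = eof_analysis(ndvi)
      print(var)  # fraction of variance explained by the leading modes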

  5. Negotiating Parenthood: Experiences of Economic Hardship among Parents with Cognitive Difficulties

    ERIC Educational Resources Information Center

    Fernqvist, Stina

    2015-01-01

    People with cognitive difficulties often have scarce economic resources, and parents with cognitive difficulties are no exception. In this article, parents' experiences are put forth and discussed, for example, how does economic hardship affect family life? How do the parents experience support, what kind of strain does the scarce economy put on…

  6. GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...

  7. Quantitative reactive modeling and verification.

    PubMed

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  8. Computational solution verification and validation applied to a thermal model of a ruggedized instrumentation package

    DOE PAGES

    Scott, Sarah Nicole; Templeton, Jeremy Alan; Hough, Patricia Diane; ...

    2014-01-01

    This study details a methodology for quantification of errors and uncertainties of a finite element heat transfer model applied to a Ruggedized Instrumentation Package (RIP). The proposed verification and validation (V&V) process includes solution verification to examine errors associated with the code's solution techniques, and model validation to assess the model's predictive capability for quantities of interest. The model was subjected to mesh resolution and numerical parameter sensitivity studies to determine reasonable parameter values and to understand how they change the overall model response and performance criteria. To facilitate quantification of the uncertainty associated with the mesh, automatic meshing and mesh refining/coarsening algorithms were created and implemented on the complex geometry of the RIP. Automated software to vary model inputs was also developed to determine the solution's sensitivity to numerical and physical parameters. The model was compared with an experiment to demonstrate its accuracy and determine the importance of both modelled and unmodelled physics in quantifying the results' uncertainty. An emphasis is placed on automating the V&V process to enable uncertainty quantification within tight development schedules.
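
    One common way to build the kind of parameter-uncertainty ensemble described above is simple Monte Carlo perturbation of the nominal inputs. The sketch below is a generic illustration only; the parameter names, nominal values and uncertainty levels are hypothetical and are not taken from the RIP model or the tools used in the study.

      import numpy as np

      def sample_parameter_ensemble(nominal, rel_uncertainty, n_samples=100, seed=0):
          """Draw an ensemble of model-input dictionaries by perturbing each
          nominal parameter with an independent Gaussian relative uncertainty."""
          rng = np.random.default_rng(seed)
          ensemble = []
          for _ in range(n_samples):
              ensemble.append({name: value * (1.0 + rng.normal(0.0, rel_uncertainty[name]))
                               for name, value in nominal.items()})
          return ensemble

      # Hypothetical thermal-model inputs (names and values illustrative only)
      nominal = {"conductivity_W_mK": 15.0, "contact_resistance_K_W": 0.8, "emissivity": 0.85}
      rel_unc = {"conductivity_W_mK": 0.05, "contact_resistance_K_W": 0.20, "emissivity": 0.03}
      runs = sample_parameter_ensemble(nominal, rel_unc)
      print(len(runs), runs[0])  # each entry would drive one forward simulation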

  9. Verification on spray simulation of a pintle injector for liquid rocket engine

    NASA Astrophysics Data System (ADS)

    Son, Min; Yu, Kijeong; Radhakrishnan, Kanmaniraja; Shin, Bongchul; Koo, Jaye

    2016-02-01

    The pintle injector used for a liquid rocket engine is an injection system that has recently attracted renewed attention, known for its wide throttling capability with high efficiency. The pintle injector has many variations with complex inner structures due to its moving parts. In order to study the rotating flow near the injector tip, which was observed in a cold flow experiment using water and air, a numerical simulation was adopted and a verification of the numerical model was subsequently conducted. For the verification process, three types of experimental data, including velocity distributions of gas flows, spray angles and liquid distribution, were compared with simulated results. The numerical simulation was performed using a commercial simulation program with the Eulerian multiphase model and axisymmetric two-dimensional grids. The maximum and minimum gas velocities were within the acceptable range of agreement; however, the spray angles showed errors of up to 25% as the momentum ratios were increased. The spray density distributions were quantitatively measured and showed good agreement. As a result of this study, it was concluded that the simulation method was properly constructed to study specific flow characteristics of the pintle injector despite the limitations of two-dimensional and coarse grids.

  10. New generation of universal modeling for centrifugal compressors calculation

    NASA Astrophysics Data System (ADS)

    Galerkin, Y.; Drozdov, A.

    2015-08-01

    The Universal Modeling method has been in constant use since the mid-1990s. Presented below is the newest, sixth version of the method. The flow path configuration of 3D impellers is presented in detail. It is possible to optimize the meridian configuration, including hub/shroud curvatures, axial length, leading edge position, etc. The new vaned diffuser model includes a flow non-uniformity coefficient based on CFD calculations. The loss model was built from the results of 37 experiments with compressor stages of different flow rates and loading factors. One common set of empirical coefficients in the loss model guarantees the efficiency definition within an accuracy of 0.86% at the design point and 1.22% along the performance curve. For model verification, the performances of four multistage compressors with vaned and vaneless diffusers were calculated. Two of these compressors have quite unusual flow paths; the modeling results were quite satisfactory in spite of these peculiarities. One sample of the verification calculations is presented in the text. This sixth version of the developed computer program is already being applied successfully in design practice.

  11. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data.

    PubMed

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-07-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users.

  12. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data

    PubMed Central

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-01-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users. PMID:20501601

  13. Video-Based Fingerprint Verification

    PubMed Central

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-01-01

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, “inside similarity” and “outside similarity” are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low. PMID:24008283

  14. Collinear cluster tri-partition: Kinematics constraints and stability of collinearity

    NASA Astrophysics Data System (ADS)

    Holmvall, P.; Köster, U.; Heinz, A.; Nilsson, T.

    2017-01-01

    Background: A new mode of nuclear fission has been proposed by the FOBOS Collaboration, called collinear cluster tri-partition (CCT), and suggests that three heavy fission fragments can be emitted perfectly collinearly in low-energy fission. This claim is based on indirect observations via missing-energy events using the 2v2E method. This proposed CCT seems to be an extraordinary new aspect of nuclear fission. It is surprising that CCT escaped observation for so long given the relatively high reported yield of roughly 0.5% relative to binary fission. These claims call for an independent verification with a different experimental technique. Purpose: Verification experiments based on direct observation of CCT fragments with fission-fragment spectrometers require guidance with respect to the allowed kinetic-energy range, which we present in this paper. Furthermore, we discuss corresponding model calculations which, if CCT is found in such verification experiments, could indicate how the breakups proceed. Since CCT refers to collinear emission, we also study the intrinsic stability of collinearity. Methods: Three different decay models are used that together span the timescales of three-body fission. These models are used to calculate the possible kinetic-energy ranges of CCT fragments by varying fragment mass splits, excitation energies, neutron multiplicities, and scission-point configurations. Calculations are presented for the systems 235U(nth,f) and 252Cf(sf), and the fission fragments previously reported for CCT; namely, isotopes of the elements Ni, Si, Ca, and Sn. In addition, we use semiclassical trajectory calculations with a Monte Carlo method to study the intrinsic stability of collinearity. Results: CCT has a high net Q value but, in a sequential decay, the intermediate steps are energetically and geometrically unfavorable or even forbidden. Moreover, perfect collinearity is extremely unstable, and broken by the slightest perturbation. Conclusions: According to our results, the central fragment would be very difficult to detect due to its low kinetic energy, raising the question of why other 2v2E experiments could not detect a missing-mass signature corresponding to CCT. Considering the high kinetic energies of the outer fragments reported in our study, direct-observation experiments should be able to observe CCT. Furthermore, we find that a realization of CCT would require an unphysical fine tuning of the initial conditions. Finally, our stability calculations indicate that, due to the pronounced instability of the collinear configuration, a prolate scission configuration does not necessarily lead to collinear emission, nor does equatorial emission necessarily imply an oblate scission configuration. In conclusion, our results enable independent experimental verification and encourage further critical theoretical studies of CCT.
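
    The kinematic argument that the central fragment of a sequential collinear split ends up with very little kinetic energy follows from non-relativistic momentum balance. The Python sketch below illustrates it for a hypothetical 252Cf split into Sn, Ca and Ni fragments; the Q values are illustrative only and are not taken from the paper.

      import math

      def two_body_split(Q, m1, m2):
          """Non-relativistic two-body breakup of a parent at rest: momentum
          balance gives T1 = Q*m2/(m1+m2) and T2 = Q*m1/(m1+m2)."""
          return Q * m2 / (m1 + m2), Q * m1 / (m1 + m2)

      def speed(T, m):
          # consistent (MeV, u) units are used throughout; only ratios matter here
          return math.sqrt(2.0 * T / m)

      # Hypothetical sequential, perfectly collinear split 252Cf -> 132Sn + X(120),
      # followed by X(120) -> 72Ni + 48Ca.  Q values (MeV) are illustrative only.
      T_sn, T_x = two_body_split(Q=180.0, m1=132.0, m2=120.0)
      v_x = speed(T_x, 120.0)                    # intermediate fragment velocity
      T_ni_cm, T_ca_cm = two_body_split(Q=20.0, m1=72.0, m2=48.0)
      v_ni = v_x + speed(T_ni_cm, 72.0)          # outer fragment, boosted forward
      v_ca = v_x - speed(T_ca_cm, 48.0)          # central fragment, nearly at rest
      print("T(Sn) =", round(T_sn, 1), "MeV")
      print("T(Ni) =", round(0.5 * 72.0 * v_ni**2, 1), "MeV")
      print("T(Ca) =", round(0.5 * 48.0 * v_ca**2, 1), "MeV")  # small, hard to detect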

  15. 18 CFR 281.213 - Data Verification Committee.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of... § 281.213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification...

  16. 18 CFR 281.213 - Data Verification Committee.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of... § 281.213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification...

  17. The politics of verification and the control of nuclear tests, 1945-1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallagher, N.W.

    1990-01-01

    This dissertation addresses two questions: (1) why has agreement been reached on verification regimes to support some arms control accords but not others; and (2) what determines the extent to which verification arrangements promote stable cooperation. This study develops an alternative framework for analysis by examining the politics of verification at two levels. The logical politics of verification are shaped by the structure of the problem of evaluating cooperation under semi-anarchical conditions. The practical politics of verification are driven by players' attempts to use verification arguments to promote their desired security outcome. The historical material shows that agreements on verification regimes are reached when key domestic and international players desire an arms control accord and believe that workable verification will not have intolerable costs. Clearer understanding of how verification is itself a political problem, and how players manipulate it to promote other goals, is necessary if the politics of verification are to support rather than undermine the development of stable cooperation.

  18. Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers

    NASA Technical Reports Server (NTRS)

    Levitt, K. N.; Schwartz, R.; Hare, D.; Moore, J. S.; Melliar-Smith, P. M.; Shostak, R. E.; Boyer, R. S.; Green, M. W.; Elliott, W. D.

    1983-01-01

    A number of methodologies for verifying systems, and computer-based tools that assist users in verifying their systems, were developed. These tools were applied to verify in part the SIFT ultrareliable aircraft computer. Topics covered included: STP theorem prover; design verification of SIFT; high level language code verification; assembly language level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.

  19. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ....920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all the... verifications and analysis. It may also be necessary to limit the range of conditions under which the PEMS can... additional information or analysis to support your conclusions. (b) Overall verification. This paragraph (b...

  20. GENERIC VERIFICATION PROTOCOL FOR THE VERIFICATION OF PESTICIDE SPRAY DRIFT REDUCTION TECHNOLOGIES FOR ROW AND FIELD CROPS

    EPA Science Inventory

    This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...

  1. Methods of increasing efficiency and maintainability of pipeline systems

    NASA Astrophysics Data System (ADS)

    Ivanov, V. A.; Sokolov, S. M.; Ogudova, E. V.

    2018-05-01

    This study is dedicated to the issue of pipeline transportation system maintenance. The article identifies two classes of technical-and-economic indices, which are used to select an optimal pipeline transportation system structure. It then defines various system maintenance strategies and strategy selection criteria. These maintenance strategies, however, turn out to be insufficiently effective owing to non-optimal maintenance intervals. This problem could be solved by running an adaptive maintenance system, which includes a pipeline transportation system reliability improvement algorithm and, in particular, an equipment degradation computer model. In conclusion, three model-building approaches for determining the optimal duration between verification inspections of technical systems are considered.

  2. NASA/DOD Aerospace Knowledge Diffusion Research Project. XXIV - A general approach to measuring the value of aerospace information products and services

    NASA Technical Reports Server (NTRS)

    Brinberg, Herbert R.; Pinelli, Thomas E.

    1993-01-01

    This paper discusses the various approaches to measuring the value of information, first defining the meanings of information, economics of information, and value. It concludes that no general model of measuring the value of information is possible and that the usual approaches, such as cost/benefit equations, have very limited applications. It also concludes that in specific contexts with given goals for newly developed products and services or newly acquired information, there is a basis for its objective valuation. The axioms and inputs for such a model are described and directions for further verification and analysis are proposed.

  3. Resource Letter: GW-1: Global warming

    NASA Astrophysics Data System (ADS)

    Firor, John W.

    1994-06-01

    This Resource Letter provides a guide to the literature on the possibility of a human-induced climate change—a global warming. Journal articles and books are cited for the following topics: the Greenhouse Effect, sources of infrared-trapping gases, climate models and their uncertainties, verification of climate models, past climate changes, and economics, ethics, and politics of policy responses to climate change. [The letter E after an item indicates elementary level or material of general interest to persons becoming informed in the field. The letter I, for intermediate level, indicates material of somewhat more specialized nature, and the letter A indicates rather specialized or advanced material.]

  4. Advanced composite vertical fin for L-1011 aircraft

    NASA Technical Reports Server (NTRS)

    Jackson, A. C.

    1984-01-01

    The structural box of the L-1011 vertical fin was redesigned using advanced composite materials. The box was fabricated and ground tested to verify the structural integrity. This report summarizes the complete program, starting with the design and analysis and proceeding through the process development, ancillary test program, production readiness verification testing, fabrication of the full-scale fin boxes, and the full-scale ground testing. The program showed that advanced composites can economically and effectively be used in the design and fabrication of medium primary structures for commercial aircraft. Static-strength variability was demonstrated to be comparable to metal structures, and the long-term durability of advanced composite components was demonstrated.

  5. Does Economics Education Make Bad Citizens? The Effect of Economics Education in Japan

    ERIC Educational Resources Information Center

    Iida, Yoshio; Oda, Sobei H.

    2011-01-01

    Does studying economics discourage students' cooperative mind? Several surveys conducted in the United States have concluded that the answer is yes. The authors conducted a series of economic experiments and questionnaires to consider the question in Japan. The results of the prisoner's dilemma experiment and public goods questionnaires showed no…

  6. 30 CFR 250.913 - When must I resubmit Platform Verification Program plans?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... CONTINENTAL SHELF Platforms and Structures, Platform Verification Program. § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication...

  7. 30 CFR 250.909 - What is the Platform Verification Program?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Platforms and Structures, Platform Verification Program. § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms...

  8. Study of techniques for redundancy verification without disrupting systems, phases 1-3

    NASA Technical Reports Server (NTRS)

    1970-01-01

    The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.

  9. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  10. Decolorization of Acid Orange 7 by an electric field-assisted modified orifice plate hydrodynamic cavitation system: Optimization of operational parameters.

    PubMed

    Jung, Kyung-Won; Park, Dae-Seon; Hwang, Min-Jin; Ahn, Kyu-Hong

    2015-09-01

    In this study, the decolorization of Acid Orange 7 (AO-7) with intensified performance was obtained using hydrodynamic cavitation (HC) combined with an electric field (graphite electrodes). As a preliminary step, various HC systems were compared in terms of decolorization, and, among them, the electric field-assisted modified orifice plate HC (EFM-HC) system exhibited perfect decolorization performance within 40 min of reaction time. Interestingly, when H2O2 was injected into the EFM-HC system as an additional oxidant, the reactor performance gradually decreased as the dosing ratio increased; thus, the remaining experiments were performed without H2O2. Subsequently, an optimization process was conducted using response surface methodology with a Box-Behnken design. The inlet pressure, initial pH, applied voltage, and reaction time were chosen as operational key factors, while decolorization was selected as the response variable. The overall performance revealed that the selected parameters were either slightly interdependent, or had significant interactive effects on the decolorization. In the verification test, complete decolorization was observed under statistically optimized conditions. This study suggests that EFM-HC is a useful method for pretreatment of dye wastewater with positive economic and commercial benefits. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Using Chem-Wiki to Increase Student Collaboration through Online Lab Reporting

    ERIC Educational Resources Information Center

    Elliott, Edward W., III; Fraiman, Ana

    2010-01-01

    The nature of laboratory work has changed in the past decade. One example is a shift from working individually or in pairs on single traditional verification experiments to working collaboratively in larger groups in inquiry and research-based laboratories, over extended periods in and outside of the lab. In this increased era of collaboration, we…

  12. Resistivity Correction Factor for the Four-Probe Method: Experiment I

    NASA Astrophysics Data System (ADS)

    Yamashita, Masato; Yamaguchi, Shoji; Enjoji, Hideo

    1988-05-01

    Experimental verification of the theoretically derived resistivity correction factor (RCF) is presented. Resistivity and sheet resistance measurements by the four-probe method are made on three samples: isotropic graphite, ITO film and Au film. It is indicated that the RCF can correct the apparent variations of experimental data to yield reasonable resistivities and sheet resistances.
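
    For an in-line four-probe measurement on a laterally infinite thin film, the ideal sheet resistance is (pi/ln 2)·V/I, and finite sample geometry is accounted for by a correction factor. The sketch below shows this textbook relation only; the measured values are hypothetical, and the simple multiplicative factor is a stand-in for the paper's RCF, whose exact definition may differ.

      import math

      def sheet_resistance(voltage, current, correction_factor=1.0):
          """In-line four-probe sheet resistance (ohms per square).  The ideal
          factor pi/ln(2) holds for a laterally infinite thin film; finite
          geometry is represented here by a multiplicative correction factor."""
          return (math.pi / math.log(2.0)) * (voltage / current) * correction_factor

      def resistivity(voltage, current, thickness, correction_factor=1.0):
          """Bulk resistivity of a film of known thickness."""
          return sheet_resistance(voltage, current, correction_factor) * thickness

      # Hypothetical ITO-film measurement: 1 mA forced, 2.3 mV measured, 150 nm thick
      print(sheet_resistance(2.3e-3, 1.0e-3))      # ohms per square, ideal geometry
      print(resistivity(2.3e-3, 1.0e-3, 150e-9))   # ohm-metres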

  13. Design, Development and Delivery of Active Learning Tools in Software Verification & Validation Education

    ERIC Educational Resources Information Center

    Acharya, Sushil; Manohar, Priyadarshan Anant; Wu, Peter; Maxim, Bruce; Hansen, Mary

    2018-01-01

    Active learning tools are critical in imparting real world experiences to the students within a classroom environment. This is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains with little to no training. However, there is a well-recognized need for the…

  14. Verification of Cold Working and Interference Levels at Fastener Holes

    DTIC Science & Technology

    2009-02-01

    of the Residual Stress Field on the Fatigue Coupons; 3.3.3 Fractography of Fatigue Test Coupons... predictions to fatigue experiment results (none of the literature we reviewed described fractography of cracks propagating through residual stress... ensures continued safety, readiness, and controlled maintenance costs. These methods augment and enhance traditional safe-life and damage tolerance

  15. Symbology Verification Study.

    DTIC Science & Technology

    1983-01-01

    30 color symbols. This reduction in the seconds required to interpret and respond to a real-time situation display is critical to the pilot in combat...

  16. Online pretreatment verification of high-dose rate brachytherapy using an imaging panel

    NASA Astrophysics Data System (ADS)

    Fonseca, Gabriel P.; Podesta, Mark; Bellezzo, Murillo; Van den Bosch, Michiel R.; Lutgens, Ludy; Vanneste, Ben G. L.; Voncken, Robert; Van Limbergen, Evert J.; Reniers, Brigitte; Verhaegen, Frank

    2017-07-01

    Brachytherapy is employed to treat a wide variety of cancers. However, an accurate treatment verification method is currently not available. This study describes a pre-treatment verification system that uses an imaging panel (IP) to verify important aspects of the treatment plan. A detailed modelling of the IP was only possible with an extensive calibration performed using a robotic arm. Irradiations were performed with a high dose rate (HDR) 192Ir source within a water phantom. An empirical fit was applied to measure the distance between the source and the detector so that 3D Cartesian coordinates of the dwell positions can be obtained using a single panel. The IP acquires images at 7.14 frames per second to verify the dwell times, dwell positions and air kerma strength (Sk). A gynecological applicator was used to create a treatment plan that was registered with a CT image of the water phantom used during the experiments for verification purposes. Errors (shifts, exchanged connections and wrong dwell times) were simulated to test the proposed verification system. Cartesian source positions (panel measurement plane) have a standard deviation of about 0.02 cm. The measured distance between the source and the panel (z-coordinate) has a standard deviation of up to 0.16 cm and a maximum absolute error of ≈0.6 cm if the signal is close to the sensitivity limit of the panel. The average response of the panel is very linear with Sk; therefore, Sk measurements can be performed with relatively small errors. The measured dwell times show a maximum error of 0.2 s, which is consistent with the acquisition rate of the panel. All simulated errors were clearly identified by the proposed system. The use of IPs is not common in brachytherapy; however, it provides considerable advantages. It was demonstrated that the IP can accurately measure Sk, dwell times and dwell positions.

  17. Tolerance limits and methodologies for IMRT measurement-based verification QA: Recommendations of AAPM Task Group No. 218.

    PubMed

    Miften, Moyed; Olch, Arthur; Mihailidis, Dimitris; Moran, Jean; Pawlicki, Todd; Molineu, Andrea; Li, Harold; Wijesooriya, Krishni; Shi, Jie; Xia, Ping; Papanikolaou, Nikos; Low, Daniel A

    2018-04-01

    Patient-specific IMRT QA measurements are important components of processes designed to identify discrepancies between calculated and delivered radiation doses. Discrepancy tolerance limits are neither well defined nor consistently applied across centers. The AAPM TG-218 report provides a comprehensive review aimed at improving the understanding and consistency of these processes as well as recommendations for methodologies and tolerance limits in patient-specific IMRT QA. The performance of the dose difference/distance-to-agreement (DTA) and γ dose distribution comparison metrics is investigated. Measurement methods are reviewed and followed by a discussion of the pros and cons of each. Methodologies for absolute dose verification are discussed and new IMRT QA verification tools are presented. Literature on the expected or achievable agreement between measurements and calculations for different types of planning and delivery systems is reviewed and analyzed. Tests of vendor implementations of the γ verification algorithm employing benchmark cases are presented. Operational shortcomings that can reduce the γ tool accuracy and subsequent effectiveness for IMRT QA are described. Practical considerations including spatial resolution, normalization, dose threshold, and data interpretation are discussed. Published data on IMRT QA and the clinical experience of the group members are used to develop guidelines and recommendations on tolerance and action limits for IMRT QA. Steps to check failed IMRT QA plans are outlined. Recommendations on delivery methods, data interpretation, dose normalization, the use of γ analysis routines and choice of tolerance limits for IMRT QA are made with a focus on detecting differences between calculated and measured doses via the use of robust analysis methods and an in-depth understanding of IMRT verification metrics. The recommendations are intended to improve the IMRT QA process and establish consistent and comparable IMRT QA criteria among institutions. © 2018 American Association of Physicists in Medicine.
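
    For readers unfamiliar with the γ metric discussed in the report, the sketch below shows a brute-force 1D γ computation that combines dose difference and distance-to-agreement. It is a simplified illustration (global normalization, no low-dose threshold, no interpolation), not the TG-218 reference implementation, and the profiles and criteria are made-up examples.

      # Simplified 1D gamma-index sketch (global normalization, no interpolation).
      import numpy as np

      def gamma_1d(x, dose_eval, dose_ref, dd=0.03, dta=0.3):
          """x: positions [cm]; dose_eval/dose_ref: dose profiles on the same grid.
          dd: dose-difference criterion (fraction of max reference dose), e.g. 3%.
          dta: distance-to-agreement criterion [cm], e.g. 3 mm."""
          d_norm = dd * dose_ref.max()
          gam = np.empty_like(dose_ref)
          for i, (xr, dr) in enumerate(zip(x, dose_ref)):
              # Generalized distance to every evaluated point; gamma is its minimum.
              cap = np.sqrt(((x - xr) / dta) ** 2 + ((dose_eval - dr) / d_norm) ** 2)
              gam[i] = cap.min()
          return gam

      x = np.linspace(0, 10, 201)
      ref = np.exp(-((x - 5) / 2) ** 2)
      ev = np.exp(-((x - 5.1) / 2) ** 2) * 1.01       # small shift and scaling error
      gamma = gamma_1d(x, ev, ref)
      print("pass rate:", np.mean(gamma <= 1.0))      # fraction of points with gamma <= 1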

  18. Runtime Verification of Pacemaker Functionality Using Hierarchical Fuzzy Colored Petri-nets.

    PubMed

    Majma, Negar; Babamir, Seyed Morteza; Monadjemi, Amirhassan

    2017-02-01

    Today, implanted medical devices are increasingly used for many patients with diverse health problems. However, several runtime problems and errors are reported by the relevant organizations, some even resulting in patient death. One of those devices is the pacemaker. The pacemaker is a device that helps the patient regulate the heartbeat by connecting to the cardiac vessels. This device is directed by its software, so any failure in this software causes a serious malfunction. Therefore, this study aims to find a better way to monitor the device's software behavior in order to decrease the failure risk. Accordingly, we supervise the runtime function and status of the software. Software verification here means checking that the running software respects the limitations and needs of the system's users. In this paper, a method to verify the pacemaker software, based on the fuzzy function of the device, is presented. The functional limitations of the device are identified and expressed as fuzzy rules, and the device is then verified using a hierarchical Fuzzy Colored Petri-net (FCPN) built from these limits. Building on our experience from previous studies in using 1) Fuzzy Petri-nets (FPN) to verify insulin pumps, 2) Colored Petri-nets (CPN) to verify the pacemaker, and 3) a software agent with Petri-net-based knowledge to verify the pacemaker, in this paper the runtime behavior of the pacemaker software is examined with an HFCPN. This is a step forward compared to the earlier work. The HFCPN used here, compared to the FPN and CPN used in our previous studies, reduces the complexity. By presenting the Petri-net (PN) in a hierarchical form, the verification runtime decreased by 90.61% compared to the verification runtime in the earlier work. Since we need an inference engine in runtime verification, we used the HFCPN to enhance the performance of the inference engine.
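
    As a rough illustration of the kind of fuzzy-rule runtime check described above (not the authors' HFCPN, which encodes the rules as a hierarchical Petri-net), the sketch below evaluates a single hypothetical pacing-rate limit with triangular membership functions and returns a degree of violation. All membership functions and thresholds are made-up placeholders.

      # Hypothetical sketch of a fuzzy runtime check on a monitored pacing rate.
      # Membership functions and thresholds are illustrative, not from the paper.
      def tri(x, a, b, c):
          """Triangular membership function peaking at b over the interval [a, c]."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      def violation_degree(rate_bpm):
          """Fuzzy degree to which the pacing rate is outside the normal band."""
          too_low  = tri(rate_bpm, 20, 40, 60)     # membership in "too low"
          normal   = tri(rate_bpm, 50, 80, 120)    # membership in "normal"
          too_high = tri(rate_bpm, 110, 160, 200)  # membership in "too high"
          return max(too_low, too_high) * (1.0 - normal)

      for r in (45, 75, 130):
          print(r, round(violation_degree(r), 2))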

  19. Online pretreatment verification of high-dose rate brachytherapy using an imaging panel.

    PubMed

    Fonseca, Gabriel P; Podesta, Mark; Bellezzo, Murillo; Van den Bosch, Michiel R; Lutgens, Ludy; Vanneste, Ben G L; Voncken, Robert; Van Limbergen, Evert J; Reniers, Brigitte; Verhaegen, Frank

    2017-07-07

    Brachytherapy is employed to treat a wide variety of cancers. However, an accurate treatment verification method is currently not available. This study describes a pre-treatment verification system that uses an imaging panel (IP) to verify important aspects of the treatment plan. Detailed modelling of the IP was only possible with an extensive calibration performed using a robotic arm. Irradiations were performed with a high dose rate (HDR) 192Ir source within a water phantom. An empirical fit was applied to measure the distance between the source and the detector so that 3D Cartesian coordinates of the dwell positions can be obtained using a single panel. The IP acquires images at 7.14 fps to verify the dwell times, dwell positions and air kerma strength (Sk). A gynecological applicator was used to create a treatment plan that was registered with a CT image of the water phantom used during the experiments for verification purposes. Errors (shifts, exchanged connections and wrong dwell times) were simulated to test the proposed verification system. Cartesian source positions (panel measurement plane) have a standard deviation of about 0.02 cm. The measured distance between the source and the panel (z-coordinate) has a standard deviation of up to 0.16 cm and a maximum absolute error of  ≈0.6 cm if the signal is close to the sensitivity limit of the panel. The average response of the panel is very linear with Sk. Therefore, Sk measurements can be performed with relatively small errors. The measured dwell times show a maximum error of 0.2 s, which is consistent with the acquisition rate of the panel. All simulated errors were clearly identified by the proposed system. The use of IPs is not common in brachytherapy; however, it provides considerable advantages. It was demonstrated that the IP can accurately measure Sk, dwell times and dwell positions.

  20. PET/CT imaging for treatment verification after proton therapy: A study with plastic phantoms and metallic implants

    PubMed Central

    Parodi, Katia; Paganetti, Harald; Cascio, Ethan; Flanz, Jacob B.; Bonab, Ali A.; Alpert, Nathaniel M.; Lohmann, Kevin; Bortfeld, Thomas

    2008-01-01

    The feasibility of off-line positron emission tomography/computed tomography (PET/CT) for routine three-dimensional in-vivo treatment verification of proton radiation therapy is currently under investigation at Massachusetts General Hospital in Boston. In preparation for clinical trials, phantom experiments were carried out to investigate the sensitivity and accuracy of the method depending on irradiation and imaging parameters. Furthermore, they addressed the feasibility of PET/CT as a robust verification tool in the presence of metallic implants. These produce x-ray CT artifacts and fluence perturbations which may compromise the accuracy of treatment planning algorithms. Spread-out Bragg peak proton fields were delivered to different phantoms consisting of polymethylmethacrylate (PMMA), PMMA stacked with lung and bone equivalent materials, and PMMA with titanium rods to mimic implants in patients. PET data were acquired in list mode starting within 20 min after irradiation at a commercial lutetium-oxyorthosilicate (LSO)-based PET/CT scanner. The amount and spatial distribution of the measured activity could be well reproduced by calculations based on the GEANT4 and FLUKA Monte Carlo codes. This phantom study supports the potential of millimeter accuracy for range monitoring and lateral field position verification even after low therapeutic dose exposures of 2 Gy, despite the delay between irradiation and imaging. It also indicates the value of PET for treatment verification in the presence of metallic implants, demonstrating a higher sensitivity to fluence perturbations in comparison to a commercial analytical treatment planning system. Finally, it addresses the suitability of LSO-based PET detectors for hadron therapy monitoring. This unconventional application of PET involves count rates which are orders of magnitude lower than in diagnostic tracer imaging, i.e., the signal of interest is comparable to the noise originating from the intrinsic radioactivity of the detector itself. In addition to PET alone, PET/CT imaging provides accurate information on the position of the imaged object and may assess possible anatomical changes during fractionated radiotherapy in clinical applications. PMID:17388158

  1. A global wind resource atlas including high-resolution terrain effects

    NASA Astrophysics Data System (ADS)

    Hahmann, Andrea; Badger, Jake; Olsen, Bjarke; Davis, Neil; Larsen, Xiaoli; Badger, Merete

    2015-04-01

    Currently no accurate global wind resource dataset is available to fill the needs of policy makers and strategic energy planners. Evaluating wind resources directly from coarse-resolution reanalysis datasets underestimates the true wind energy resource, as the small-scale spatial variability of winds is missing. This missing variability can account for a large part of the local wind resource. Crucially, it is the windiest sites that suffer the largest wind resource errors: in simple terrain the windiest sites may be underestimated by 25%; in complex terrain the underestimate can be as large as 100%. The small-scale spatial variability of winds can be modelled using novel statistical methods and by application of established microscale models within WAsP developed at DTU Wind Energy. We present the framework for a single global methodology, which is relatively fast and economical to complete. The method employs reanalysis datasets, which are downscaled to high-resolution wind resource datasets via a so-called generalization step, and microscale modelling using WAsP. This method will create the first global wind atlas (GWA) that covers all land areas (except Antarctica) and a 30 km coastal zone over water. Verification of the GWA estimates will be done at carefully selected test regions, against verified estimates from mesoscale modelling and satellite synthetic aperture radar (SAR). This verification exercise will also help in the estimation of the uncertainty of the new wind climate dataset. Uncertainty will be assessed as a function of spatial aggregation. It is expected that the uncertainty at verification sites will be larger than that of dedicated assessments, but the uncertainty will be reduced at levels of aggregation appropriate for energy planning, and importantly much improved relative to what is used today. In this presentation we discuss the methodology used, which includes the generalization of wind climatologies, and the differences in local and spatially aggregated wind resources that result from using different reanalyses in the various verification regions. A prototype web interface for public access to the data will also be showcased.
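
    The underestimation from coarse-resolution data follows directly from the cubic dependence of wind power density on wind speed: averaging speeds over a grid cell before cubing loses the contribution of local variability. The sketch below illustrates this effect with made-up numbers; it is not part of the GWA generalization procedure itself.

      # Illustration (made-up numbers): wind power density is proportional to the
      # cube of wind speed, so averaging speed over a coarse cell underestimates it.
      import numpy as np

      rho = 1.225                                   # air density [kg/m^3]
      u_subgrid = np.array([4.0, 6.0, 8.0, 12.0])   # local mean speeds within one coarse cell [m/s]

      pd_true   = 0.5 * rho * np.mean(u_subgrid ** 3)   # average of local power densities
      pd_coarse = 0.5 * rho * np.mean(u_subgrid) ** 3   # power density of the cell-mean speed

      print(f"mean of local power densities: {pd_true:7.1f} W/m^2")
      print(f"coarse-resolution estimate:    {pd_coarse:7.1f} W/m^2")
      print(f"underestimation:               {100 * (1 - pd_coarse / pd_true):.0f} %")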

  2. Numerical verification of three point bending experiment of magnetorheological elastomer (MRE) in magnetic field

    NASA Astrophysics Data System (ADS)

    Miedzinska, Danuta; Boczkowska, Anna; Zubko, Konrad

    2010-07-01

    In this article a method of numerical verification of experimental results for magnetorheological elastomer (MRE) samples is presented. The samples were shaped into cylinders with a diameter of 8 mm and a height of 20 mm, with various carbonyl iron volume shares (1.5%, 11.5% and 33%). The diameter of the soft ferromagnetic particles ranged from 6 to 9 μm. During the experiment, initially bent samples were exposed to magnetic fields with intensity levels of 0.1 T, 0.3 T, 0.5 T, 0.7 T and 1 T. The reaction of the sample to the field action was measured as a displacement of the specimen. The numerical calculation was carried out with the MSC Patran/Marc computer code. For the purpose of the numerical analysis, an orthotropic material model was applied, with the material properties of the magnetorheological elastomer along the iron chains and of the pure elastomer along the other directions. The material properties were obtained from the experimental tests. During the numerical analysis, the initial mechanical load resulting from cylinder deflection was set. Then the equivalent external force, determined from analytical calculations of the intermolecular reaction within the iron chains in the specific magnetic field, was applied to the bent sample. The correspondence of this numerical model with the results of the experiment was verified. The similar results of the experiments and of both the theoretical and FEM analyses indicate that macroscopic modeling of the magnetorheological elastomer's mechanical properties as an orthotropic material delivers a sufficiently accurate description of the material's behavior.

  3. Development and verification of hardware for life science experiments in the Japanese Experiment Module "Kibo" on the International Space Station.

    PubMed

    Ishioka, Noriaki; Suzuki, Hiromi; Asashima, Makoto; Kamisaka, Seiichiro; Mogami, Yoshihiro; Ochiai, Toshimasa; Aizawa-Yano, Sachiko; Higashibata, Akira; Ando, Noboru; Nagase, Mutsumu; Ogawa, Shigeyuki; Shimazu, Toru; Fukui, Keiji; Fujimoto, Nobuyoshi

    2004-03-01

    Japan Aerospace Exploration Agency (JAXA) has developed a cell biology experiment facility (CBEF) and a clean bench (CB) as common hardware in which life science experiments in the Japanese Experiment Module (JEM, known as "Kibo") of the International Space Station (ISS) can be performed. The CBEF, a CO2 incubator with a turntable that provides variable gravity levels, is the basic hardware required to carry out biological experiments using microorganisms, cells, tissues, small animals, plants, etc. The CB provides a closed aseptic operation area for life science and biotechnology experiments in Kibo. A phase contrast and fluorescence microscope is installed inside the CB. The biological experiment units (BEU) are designed to run individual experiments using the CBEF and the CB. A plant experiment unit (PEU) and two cell experiment units (CEU type 1 and type 2) for the BEU have been developed.

  4. 30 CFR 250.909 - What is the Platform Verification Program?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 2 2010-07-01 2010-07-01 false What is the Platform Verification Program? 250... Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms; platforms of a new or unique design...

  5. Property-driven functional verification technique for high-speed vision system-on-chip processor

    NASA Astrophysics Data System (ADS)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This vision chip verification complexity is also related to the fact that, in most vision chip design cycles, extensive efforts are focused on how to optimize chip metrics such as performance, power, and area. Design functional verification is not explicitly considered at an earlier stage, at which the soundest decisions are made. In this paper, we propose a semi-automatic property-driven verification technique. The implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel-processing vision chips. Our experimental results show that the proposed technique can effectively improve the verification effort by up to 20% for a complex vision chip design while reducing the simulation and debugging overheads.

  6. Hydrologic data-verification management program plan

    USGS Publications Warehouse

    Alexander, C.W.

    1982-01-01

    Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept describing a master data-verification program using multiple special-purpose subroutines, and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
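
    The "screen file containing verification criteria" concept maps naturally onto simple computerized checks such as range and rate-of-change tests. The sketch below is a generic illustration of that idea, not the USGS/WATSTORE routines; the criteria values and data are placeholders.

      # Generic sketch of screen-file style checks on a time series of stage readings.
      # The criteria are placeholders, not actual USGS/WATSTORE screening thresholds.
      def screen(values, lo, hi, max_step):
          """Flag values outside [lo, hi] and jumps larger than max_step between samples."""
          flags = []
          for i, v in enumerate(values):
              if not (lo <= v <= hi):
                  flags.append((i, v, "out of range"))
              if i > 0 and abs(v - values[i - 1]) > max_step:
                  flags.append((i, v, "excessive rate of change"))
          return flags

      stage_ft = [2.1, 2.2, 2.3, 9.9, 2.4, 2.4, -0.5]
      for flag in screen(stage_ft, lo=0.0, hi=8.0, max_step=1.0):
          print(flag)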

  7. TET-1- A German Microsatellite for Technology On -Orbit Verification

    NASA Astrophysics Data System (ADS)

    Föckersperger, S.; Lattner, K.; Kaiser, C.; Eckert, S.; Bärwald, W.; Ritzmann, S.; Mühlbauer, P.; Turk, M.; Willemsen, P.

    2008-08-01

    Due to the high safety standards in the space industry, every new product must go through a verification process before qualifying for operation in a space system. Within the verification process the payload undergoes a series of tests which prove that it is in accordance with mission requirements in terms of function, reliability and safety. Important verification components are the qualification for use on the ground as well as On-Orbit Verification (OOV), i.e. proof that the product is suitable for use under actual space conditions (on-orbit). Here it is demonstrated that the product functions under conditions which cannot, or can only partially, be simulated on the ground. The OOV Program of the DLR serves to bridge the gap between a product tested and qualified on the ground and the utilization of the product in space. Due to regular and short-term availability of flight opportunities, industry and research facilities can verify their latest products under space conditions and demonstrate their reliability and marketability. The Technologie-Erprobungs-Träger TET (Technology Experiments Carrier) comprises the core elements of the OOV Program. A programmatic requirement of the OOV Program is that a satellite bus already verified in orbit be used in the first segment of the program. An analysis of suitable satellite buses showed that a realization of the TET satellite bus based on the BIRD satellite bus fulfilled the programmatic requirements best. Kayser-Threde was selected by DLR as Prime Contractor to perform the project together with its major subcontractors Astro- und Feinwerktechnik, Berlin, for the platform development and DLR-GSOC for the ground segment development. TET is now designed to be a modular and flexible micro-satellite for any orbit between 450 and 850 km altitude and inclination between 53° and SSO. With an overall mass of 120 kg, TET is able to accommodate experiments of up to 50 kg. A multipurpose payload supply system under Kayser-Threde responsibility provides the necessary interfaces to the experiments. The first TET mission is scheduled for mid-2010. TET will be launched as a piggy-back payload on any available launcher worldwide to reduce launch cost and provide maximum flexibility. Finally, TET will provide all services required by the experimenters for a one-year mission operation to perform a successful OOV mission with its technology experiments, leading to efficient access to space for German industry and institutions.

  8. [Pursuit of economic efficiency in the hospital laboratory--full automatic re-test system of clinical chemistry, hematology, immunology and cost control system].

    PubMed

    Chiba, M

    2000-10-01

    Further business improvement is required due to financial fluctuations and the influence of revisions to the medical treatment law. Therefore, new laboratories are needed. To achieve this in our hospital, economic efficiency is being pursued. The first issue is the use of space; the second issue is labor-saving. The third issue is the simplification of business procedures. There is individual quality control by the zonal verification method that we developed, as well as quality control of the batch method using control substances. The fourth issue is cost control. By controlling the delivery and use of reagents and materials, including control of the term of validity, we made an effort to eliminate defective stock. The fifth issue is responding to a recycling-oriented society. The disposal of laboratory garbage is a major issue. We controlled garbage that is generated unnecessarily. Furthermore, we are working to reduce reagent demand that exceeds specifications and the use of reagent containers.

  9. Same Landscape, Different Lens: Variations in Young People's Socio-Economic Experiences and Perceptions in Their Disadvantaged Working-Class Community

    ERIC Educational Resources Information Center

    Brann-Barrett, Mary Tanya

    2011-01-01

    In this paper, I compare socio-economic experiences and community perceptions expressed by socially and economically disadvantaged young people with those of university students living in the same post-industrial community. I consider markers of distinction among these young people in relation to their family and educational experiences. I also…

  10. Crewed Space Vehicle Battery Safety Requirements

    NASA Technical Reports Server (NTRS)

    Jeevarajan, Judith A.; Darcy, Eric C.

    2014-01-01

    This requirements document is applicable to all batteries on crewed spacecraft, including vehicle, payload, and crew equipment batteries. It defines the specific provisions required to design a battery that is safe for ground personnel and crew members to handle and/or operate during all applicable phases of crewed missions, safe for use in the enclosed environment of a crewed space vehicle, and safe for use in launch vehicles, as well as in unpressurized spaces adjacent to the habitable portion of a space vehicle. The required provisions encompass hazard controls, design evaluation, and verification. The extent of the hazard controls and verification required depends on the applicability and credibility of the hazard to the specific battery design and applicable missions under review. Evaluation of the design and verification program results shall be completed prior to certification for flight and ground operations. This requirements document is geared toward the designers of battery systems to be used in crewed vehicles, crew equipment, crew suits, or batteries to be used in crewed vehicle systems and payloads (or experiments). This requirements document also applies to ground handling and testing of flight batteries. Specific design and verification requirements for a battery are dependent upon the battery chemistry, capacity, complexity, charging, environment, and application. The variety of battery chemistries available, combined with the variety of battery-powered applications, results in each battery application having specific, unique requirements pertinent to the specific battery application. However, there are basic requirements for all battery designs and applications, which are listed in section 4. Section 5 includes a description of hazards and controls and also includes requirements.

  11. Analytical and Experimental Verification of a Flight Article for a Mach-8 Boundary-Layer Experiment

    NASA Technical Reports Server (NTRS)

    Richards, W. Lance; Monaghan, Richard C.

    1996-01-01

    Preparations for a boundary-layer transition experiment to be conducted on a future flight mission of the air-launched Pegasus(TM) rocket are underway. The experiment requires a flight-test article called a glove to be attached to the wing of the Mach-8 first-stage booster. A three-dimensional, nonlinear finite-element analysis has been performed and significant small-scale laboratory testing has been accomplished to ensure the glove design integrity and quality of the experiment. Reliance on both the analysis and experiment activities has been instrumental in the success of the flight-article design. Results obtained from the structural analysis and laboratory testing show that all glove components are well within the allowable thermal stress and deformation requirements to satisfy the experiment objectives.

  12. A physical zero-knowledge object-comparison system for nuclear warhead verification

    PubMed Central

    Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; d'Errico, Francesco

    2016-01-01

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications. PMID:27649477

  13. A physical zero-knowledge object-comparison system for nuclear warhead verification.

    PubMed

    Philippe, Sébastien; Goldston, Robert J; Glaser, Alexander; d'Errico, Francesco

    2016-09-20

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.

  14. A physical zero-knowledge object-comparison system for nuclear warhead verification

    NASA Astrophysics Data System (ADS)

    Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; D'Errico, Francesco

    2016-09-01

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.

  15. A physical zero-knowledge object-comparison system for nuclear warhead verification

    DOE PAGES

    Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; ...

    2016-09-20

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.

  16. Velocity-image model for online signature verification.

    PubMed

    Khan, Mohammad A U; Niazi, Muhammad Khalid Khan; Khan, Muhammad Aurangzeb

    2006-11-01

    In general, online signature capturing devices provide outputs in the form of shape and velocity signals. In the past, strokes have been extracted by tracking velocity signal minima. However, the resulting strokes are larger and more complicated in shape and thus make the subsequent job of generating a discriminative template difficult. We propose a new stroke-based algorithm that splits the velocity signal into various bands. Based on these bands, strokes are extracted which are smaller and simpler in nature. Training of our proposed system revealed that the low- and high-velocity bands of the signal are unstable, whereas the medium-velocity band can be used for discrimination purposes. Euclidean distances of strokes extracted on the basis of the medium-velocity band are used for verification purposes. The experiments conducted show an improvement in the discriminative capability of the proposed stroke-based system.
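
    The core of the approach (splitting the velocity signal into bands and comparing medium-band strokes by Euclidean distance) can be illustrated with a short sketch. The quantile-based band thresholds and the fixed-length resampling below are illustrative assumptions, not the authors' parameters, and the toy velocity signal stands in for real tablet data.

      # Illustrative sketch: band-split a velocity signal and compare medium-velocity
      # strokes with Euclidean distance. Thresholds and resampling are assumptions.
      import numpy as np

      def medium_band_strokes(v, lo_q=0.33, hi_q=0.66, n_points=32):
          """Return fixed-length segments of v whose samples fall in the medium band."""
          lo, hi = np.quantile(v, [lo_q, hi_q])
          mask = (v >= lo) & (v <= hi)
          strokes, start = [], None
          for i, m in enumerate(np.append(mask, False)):   # sentinel closes a trailing stroke
              if m and start is None:
                  start = i
              elif not m and start is not None:
                  seg = v[start:i]
                  if len(seg) >= 4:                        # ignore very short segments
                      xp = np.linspace(0, 1, len(seg))
                      strokes.append(np.interp(np.linspace(0, 1, n_points), xp, seg))
                  start = None
          return strokes

      def dissimilarity(v_ref, v_test):
          """Mean Euclidean distance between corresponding medium-band strokes."""
          pairs = list(zip(medium_band_strokes(v_ref), medium_band_strokes(v_test)))
          return float("inf") if not pairs else float(
              np.mean([np.linalg.norm(x - y) for x, y in pairs]))

      t = np.linspace(0, 1, 400)
      reference = np.abs(np.sin(8 * np.pi * t))            # toy pen-velocity signal
      genuine = reference + 0.02 * np.random.randn(400)
      print("dissimilarity:", round(dissimilarity(reference, genuine), 3))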

  17. Sensorimotor simulations underlie conceptual representations: modality-specific effects of prior activation.

    PubMed

    Pecher, Diane; Zeelenberg, René; Barsalou, Lawrence W

    2004-02-01

    According to the perceptual symbols theory (Barsalou, 1999), sensorimotor simulations underlie the representation of concepts. Simulations are componential in the sense that they vary with the context in which the concept is presented. In the present study, we investigated whether representations are affected by recent experiences with a concept. Concept names (e.g., APPLE) were presented twice in a property verification task with a different property on each occasion. The two properties were either from the same perceptual modality (e.g., green, shiny) or from different modalities (e.g., tart, shiny). All stimuli were words. There was a lag of several intervening trials between the first and second presentation. Verification times and error rates for the second presentation of the concept were higher if the properties were from different modalities than if they were from the same modality.

  18. A physical zero-knowledge object-comparison system for nuclear warhead verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.

  19. Collapse of Experimental Colloidal Aging using Record Dynamics

    NASA Astrophysics Data System (ADS)

    Robe, Dominic; Boettcher, Stefan; Sibani, Paolo; Yunker, Peter

    The theoretical framework of record dynamics (RD) posits that aging behavior in jammed systems is controlled by short, rare events involving activation of only a few degrees of freedom. RD predicts that dynamics in an aging system progress with the logarithm of t/tw. This prediction has been verified through new analysis of experimental data on an aging 2D colloidal system. MSD and persistence curves spanning three orders of magnitude in waiting time are collapsed. These predictions have also been found consistent with a number of experiments and simulations, but verification of the specific assumptions that RD makes about the underlying statistics of these rare events has been elusive. Here the observation of individual particles allows for the first time the direct verification of the assumptions about event rates and sizes. This work is supported by NSF Grant DMR-1207431.
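
    The collapse onto log(t/tw) amounts to replotting each waiting-time curve against that reduced variable. The short sketch below shows the transformation on synthetic data constructed to obey the RD prediction; it is a toy illustration, not the authors' analysis code, and the slope is an arbitrary placeholder.

      # Synthetic illustration of the record-dynamics collapse: MSD curves that grow
      # as log(t/tw) for different waiting times fall on one line versus log(t/tw).
      import numpy as np

      def synthetic_msd(t, tw, slope=0.05):
          """Toy MSD obeying the RD prediction MSD ~ slope * log(t / tw) for t > tw."""
          return slope * np.log(t / tw)

      for tw in (100.0, 1000.0, 10000.0):
          t = tw * np.logspace(0.05, 2, 5)          # observation times after waiting time tw
          x = np.log(t / tw)                        # reduced variable log(t/tw)
          msd = synthetic_msd(t, tw)
          print(f"tw={tw:8.0f}  MSD/log(t/tw) =", np.round(msd / x, 3))  # constant -> collapse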

  20. Cassini's Test Methodology for Flight Software Verification and Operations

    NASA Technical Reports Server (NTRS)

    Wang, Eric; Brown, Jay

    2007-01-01

    The Cassini spacecraft was launched on 15 October 1997 on a Titan IV-B launch vehicle. The spacecraft is comprised of various subsystems, including the Attitude and Articulation Control Subsystem (AACS). The AACS Flight Software (FSW) and its development has been an ongoing effort, from the design, development and finally operations. As planned, major modifications to certain FSW functions were designed, tested, verified and uploaded during the cruise phase of the mission. Each flight software upload involved extensive verification testing. A standardized FSW testing methodology was used to verify the integrity of the flight software. This paper summarizes the flight software testing methodology used for verifying FSW from pre-launch through the prime mission, with an emphasis on flight experience testing during the first 2.5 years of the prime mission (July 2004 through January 2007).

  1. Hypersonic CFD applications for the National Aero-Space Plane

    NASA Technical Reports Server (NTRS)

    Richardson, Pamela F.; Mcclinton, Charles R.; Bittner, Robert D.; Dilley, A. Douglas; Edwards, Kelvin W.

    1989-01-01

    Design and analysis of the NASP depend heavily upon developing the critical technology areas that cover the entire engineering design of the vehicle. These areas include materials, structures, propulsion systems, propellants, integration of airframe and propulsion systems, controls, subsystems, and aerodynamics. Currently, verification of many of the classical engineering tools relies heavily on computational fluid dynamics. Advances are being made in the development of CFD codes to accomplish nose-to-tail analyses for hypersonic aircraft. Additional details involving the partial development, analysis, verification, and application of the CFL3D code and the SPARK combustor code are discussed. A nonequilibrium version of CFL3D that is presently being developed and tested is also described. Examples of calculations for research hypersonic aircraft geometries are given, and comparisons with experimental data show good agreement.

  2. Formal semantics for a subset of VHDL and its use in analysis of the FTPP scoreboard circuit

    NASA Technical Reports Server (NTRS)

    Bickford, Mark

    1994-01-01

    In the first part of the report, we give a detailed description of an operational semantics for a large subset of VHDL, the VHSIC Hardware Description Language. The semantics is written in the functional language Caliban, similar to Haskell, used by the theorem prover Clio. We also describe a translator from VHDL into Caliban semantics and give some examples of its use. In the second part of the report, we describe our experience in using the VHDL semantics to try to verify a large VHDL design. We were not able to complete the verification due to certain complexities of VHDL which we discuss. We propose a VHDL verification method that addresses the problems we encountered but which builds on the operational semantics described in the first part of the report.

  3. Practicing universal design to actual hand tool design process.

    PubMed

    Lin, Kai-Chieh; Wu, Chih-Fu

    2015-09-01

    UD evaluation principles are difficult to implement in product design. This study proposes a methodology for implementing UD in the design process through user participation. The original UD principles and user experience are used to develop the evaluation items. Differences between product types were considered. Factor analysis and Quantification Theory Type I were used to eliminate evaluation items considered inappropriate and to examine the relationship between evaluation items and product design factors. Product design specifications were established for verification. The results showed that converting user evaluations into crucial design verification factors through a generalized evaluation scale based on product attributes, as well as applying the design factors in product design, can improve users' UD evaluation. The design process of this study is expected to contribute to user-centered UD application. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  4. Systematic Model-in-the-Loop Test of Embedded Control Systems

    NASA Astrophysics Data System (ADS)

    Krupp, Alexander; Müller, Wolfgang

    Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides a lack of methodical support for natural-language formalization, there does not exist a standardized and accepted means of formal property definition as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.

  5. Economic abuse in Lebanon: experiences and perceptions.

    PubMed

    Usta, Jinan; Makarem, Nisrine N; Habib, Rima R

    2013-03-01

    This article explores the experiences and perceptions of Lebanese women and men with economic abuse. Data were drawn from focus group discussions and face-to-face interviews with men, women and social workers. The findings reveal that Lebanese women experience many forms of economic abuse, including the withholding of earnings, restricted involvement in the labor force, and limited purchasing decisions. Inheritance laws and practices still favor men over women. Women tolerate economic abuse to avoid more serious forms of abuse and ensure family stability. Practical implications of the findings are presented.

  6. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  7. ON AN ALLEGED TRUTH/FALSITY ASYMMETRY IN CONTEXT SHIFTING EXPERIMENTS

    PubMed Central

    Hansen, Nat

    2012-01-01

    Keith DeRose has argued that context shifting experiments should be designed in a specific way in order to accommodate what he calls a ‘truth/falsity asymmetry’. I explain and critique DeRose's reasons for proposing this modification to contextualist methodology, drawing on recent experimental studies of DeRose's bank cases as well as experimental findings about the verification of affirmative and negative statements. While DeRose's arguments for his particular modification to contextualist methodology fail, the lesson of his proposal is that there is good reason to pay close attention to several subtle aspects of the design of context shifting experiments. PMID:25821248

  8. The Economic Domino Effect: A Phenomenological Study Exploring Community College Faculty's Lived Experiences during Financial Hard Times in Higher Education

    ERIC Educational Resources Information Center

    Taylor, Tridai A.

    2014-01-01

    This qualitative study explored the lived experiences of eight full-time community college faculty members who taught during the economic crisis of 2008. The study was guided by the central research question, "How do community college faculty members describe their lived experiences regarding the recent economic crisis of 2008 and its impact…

  9. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    NASA Astrophysics Data System (ADS)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
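
    The verification-plan structure described above (Verification Requirement, Success Criteria, Method, Level, Owner, with activities grouped into events) lends itself to a simple data model. The sketch below is an illustrative Python rendering of that structure; the class and field names follow the abstract, but this is not the LSST SysML/Enterprise Architect model itself, and the example values are invented.

      # Illustrative data model for the verification planning elements described above;
      # field names follow the abstract, but this is not the LSST SysML model itself.
      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class VerificationPlan:
          requirement_id: str
          verification_requirement: str
          success_criteria: str
          methods: List[str]            # e.g. "Inspection", "Analysis", "Demonstration", "Test"
          level: str                    # e.g. subsystem or system level
          owner: str

      @dataclass
      class VerificationEvent:
          name: str
          activities: List[VerificationPlan] = field(default_factory=list)  # run concurrently

      plan = VerificationPlan(
          requirement_id="REQ-0001",
          verification_requirement="Verify the image quality budget allocation",
          success_criteria="Measured PSF within the allocated budget",
          methods=["Test"], level="System", owner="Systems Engineering")
      event = VerificationEvent(name="Commissioning run A", activities=[plan])
      print(event)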

  10. A rotor-aerodynamics-based wind estimation method using a quadrotor

    NASA Astrophysics Data System (ADS)

    Song, Yao; Luo, Bing; Meng, Qing-Hao

    2018-02-01

    Attempts to estimate horizontal wind using a quadrotor are reviewed. Wind estimation is realized by utilizing the quadrotor's thrust change, which is caused by the wind's effect on the rotors. The basis of the wind estimation method is the aerodynamic formula for the rotor's thrust, which is verified and calibrated by experiments. A hardware-in-the-loop simulation (HILS) system was built as a testbed; its dynamic model and control structure are demonstrated. Verification experiments on the HILS system proved that the wind estimation method is effective.
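
    To make the inversion idea concrete, the sketch below uses an assumed calibrated thrust model in which thrust grows with rotor speed squared and the incident airflow adds a term proportional to rotor speed times airspeed; the model form and all coefficients are made-up assumptions for illustration only, not the aerodynamic formula verified in the paper.

      # Hedged sketch: recover a wind-induced airspeed from measured thrust using an
      # ASSUMED calibrated model  T = k1*omega**2 + k2*omega*v_air  (coefficients made up).
      def estimate_airspeed(thrust, omega, k1=2.0e-5, k2=1.5e-4):
          """Invert the assumed thrust model for the airspeed term v_air."""
          return (thrust - k1 * omega ** 2) / (k2 * omega)

      omega = 600.0                                      # rotor speed [rad/s]
      thrust_calm = 2.0e-5 * omega ** 2                  # thrust with no wind [N]
      thrust_wind = thrust_calm + 1.5e-4 * omega * 3.0   # same rotor speed, 3 m/s airflow
      print(estimate_airspeed(thrust_wind, omega))       # recovers ~3.0 m/s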

  11. Remote collection and analysis of witness reports on flash floods

    NASA Astrophysics Data System (ADS)

    Gourley, Jonathan; Erlingis, Jessica; Smith, Travis; Ortega, Kiel; Hong, Yang

    2010-05-01

    Typically, flash floods are studied ex post facto in response to a major impact event. A complement to field investigations is developing a detailed database of flash flood events, including minor events and null reports (i.e., where heavy rain occurred but there was no flash flooding), based on public survey questions conducted in near-real time. The Severe Hazards Analysis and Verification Experiment (SHAVE) has been in operation at the National Severe Storms Laboratory (NSSL) in Norman, OK, USA during the summers since 2006. The experiment employs undergraduate students to analyse real-time products from weather radars, target specific regions within the conterminous US, and poll public residences and businesses regarding the occurrence and severity of hail, wind, tornadoes, and now flash floods. In addition to providing a rich learning experience for students, SHAVE has been successful in creating high-resolution datasets of severe hazards used for algorithm and model verification. This talk describes the criteria used to initiate the flash flood survey, the specific questions asked and information entered to the database, and then provides an analysis of results for flash flood data collected during the summer of 2008. It is envisioned that specific details provided by the SHAVE flash flood observation database will complement databases collected by operational agencies and thus lead to better tools to predict the likelihood of flash floods and ultimately reduce their impacts on society.

  12. Self-verification motives at the collective level of self-definition.

    PubMed

    Chen, Serena; Chen, Karen Y; Shaw, Lindsay

    2004-01-01

    Three studies examined self-verification motives in relation to collective aspects of the self. Several moderators of collective self-verification were also examined--namely, the certainty with which collective self-views are held, the nature of one's ties to a source of self-verification, the salience of the collective self, and the importance of group identification. Evidence for collective self-verification emerged across all studies, particularly when collective self-views were held with high certainty (Studies 1 and 2), perceivers were somehow tied to the source of self-verification (Study 1), the collective self was salient (Study 2), and group identification was important (Study 3). To the authors' knowledge, these studies are the first to examine self-verification at the collective level of self-definition. The parallel and distinct ways in which self-verification processes may operate at different levels of self-definition are discussed.

  13. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  14. Resistivity Correction Factor for the Four-Probe Method: Experiment III

    NASA Astrophysics Data System (ADS)

    Yamashita, Masato; Nishii, Toshifumi; Kurihara, Hiroshi; Enjoji, Hideo; Iwata, Atsushi

    1990-04-01

    Experimental verification of the theoretically derived resistivity correction factor F is presented. Factor F is applied to a system consisting of a rectangular parallelepiped sample and a square four-probe array. Resistivity and sheet resistance measurements are made on isotropic graphites and crystalline ITO films. Factor F corrects the experimental data and leads to reasonable resistivity and sheet resistance values.
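
    For context, a square four-probe array on an ideal infinite thin sheet gives R_s = (2π/ln 2)·(V/I); the correction factor F then accounts for the finite rectangular-parallelepiped sample. The sketch below simply applies a user-supplied F to measured V and I and is illustrative only; the actual expression for F is derived in the paper and its companion experiments, and the numbers are made-up examples.

      # Illustrative use of a resistivity correction factor F for a square four-probe
      # array; F itself must come from the paper's derivation (F = 1 recovers the
      # infinite-thin-sheet result R_s = (2*pi/ln 2) * V/I).
      import math

      def sheet_resistance(voltage, current, F=1.0):
          """Sheet resistance [ohm/sq] from a square four-probe measurement."""
          return F * (2.0 * math.pi / math.log(2.0)) * voltage / current

      def resistivity(voltage, current, thickness_m, F=1.0):
          """Bulk resistivity [ohm*m] for a film of known thickness."""
          return sheet_resistance(voltage, current, F) * thickness_m

      print(sheet_resistance(1.0e-3, 10.0e-3))           # 1 mV at 10 mA -> ~0.906 ohm/sq
      print(resistivity(1.0e-3, 10.0e-3, 200e-9))        # the same reading for a 200 nm film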

  15. An Investigation of Judges' Behaviors within a Procedure for Setting Cut Scores for NOCTI Occupational Competency Examinations

    ERIC Educational Resources Information Center

    Walter, Richard A.

    2004-01-01

    Pennsylvania has maintained a nontraditional pathway for the certification of secondary-level vocational teachers since the 1920s. The key that opens the door to that pathway is the verification of subject mastery via: (1) documentation of a learning period in the occupation; (2) documentation of related paid work experience beyond the learning…

  16. Markov Chains For Testing Redundant Software

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Sjogren, Jon A.

    1990-01-01

    Preliminary design developed for validation experiment that addresses problems unique to assuring extremely high quality of multiple-version programs in process-control software. Approach takes into account inertia of controlled system in sense it takes more than one failure of control program to cause controlled system to fail. Verification procedure consists of two steps: experimentation (numerical simulation) and computation, with Markov model for each step.
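
    The two-step procedure (simulation to estimate per-step failure behavior, then computation over a Markov model) can be illustrated with a tiny chain in which the controlled system tolerates a single control-program failure but not two in a row, reflecting the system's inertia. The states and probabilities below are made-up placeholders, not the study's model.

      # Toy Markov model (made-up numbers): the controlled system fails only after
      # consecutive control-program failures, reflecting the system's inertia.
      import numpy as np

      # States: 0 = last step OK, 1 = last step failed, 2 = system failed (absorbing).
      p_fail = 1e-3                                    # per-step control-program failure probability
      P = np.array([[1 - p_fail, p_fail,     0.0],
                    [1 - p_fail, 0.0,        p_fail],  # a second failure in a row is fatal
                    [0.0,        0.0,        1.0]])

      state = np.array([1.0, 0.0, 0.0])                # start with the last step OK
      for _ in range(10_000):                          # evolve over 10,000 control steps
          state = state @ P
      print("probability of system failure:", state[2])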

  17. The Foundations of Einstein's Theory of Gravitation

    NASA Astrophysics Data System (ADS)

    Freundlich, Erwin; Brose, Henry L. (translator); Einstein, Albert (preface); Turner, H. H. (introduction)

    2011-06-01

    Introduction; 1. The special theory of relativity as a stepping-stone to the general theory of relativity; 2. Two fundamental postulates in the mathematical formulation of physical laws; 3. Concerning the fulfilment of the two postulates; 4. The difficulties in the principles of classical mechanics; 5. Einstein's theory of gravitation; 6. The verification of the new theory by actual experience; Appendix; Index.

  18. Using Meteorological Analogues for Reordering Postprocessed Precipitation Ensembles in Hydrological Forecasting

    NASA Astrophysics Data System (ADS)

    Bellier, Joseph; Bontron, Guillaume; Zin, Isabella

    2017-12-01

    Meteorological ensemble forecasts are nowadays widely used as input to hydrological models for probabilistic streamflow forecasting. These forcings are frequently biased and have to be statistically postprocessed, most of the time using univariate techniques that apply independently to individual locations, lead times and weather variables. Postprocessed ensemble forecasts therefore need to be reordered so as to reconstruct suitable multivariate dependence structures. The Schaake shuffle and ensemble copula coupling are the two most popular methods for this purpose. This paper proposes two adaptations of them that make use of meteorological analogues for reconstructing spatiotemporal dependence structures of precipitation forecasts. The performances of the original and adapted techniques are compared through a multistep verification experiment using real forecasts from the European Centre for Medium-Range Weather Forecasts. This experiment evaluates not only multivariate precipitation forecasts but also the corresponding streamflow forecasts derived from hydrological modeling. Results show that the relative performances of the different reordering methods vary depending on the verification step. In particular, the standard Schaake shuffle is found to perform poorly when evaluated on streamflow. This emphasizes the crucial role of the precipitation spatiotemporal dependence structure in hydrological ensemble forecasting.
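
    The standard Schaake shuffle reorders each marginal's postprocessed ensemble according to the rank order of a historical trajectory template, thereby imprinting the template's space-time dependence onto the forecast. The sketch below shows that reordering for one variable at several locations; it is the generic algorithm with made-up data, not the analogue-based adaptations proposed in the paper.

      # Generic Schaake shuffle for one variable at several locations: the postprocessed
      # ensemble at each location is reordered to follow the ranks of a historical template.
      import numpy as np

      def schaake_shuffle(ensemble, template):
          """ensemble, template: arrays of shape (n_members, n_locations).
          Returns the ensemble reordered so its rank structure matches the template's."""
          shuffled = np.empty_like(ensemble)
          for j in range(ensemble.shape[1]):
              order = np.argsort(np.argsort(template[:, j]))   # rank of each template member
              shuffled[:, j] = np.sort(ensemble[:, j])[order]  # sorted forecasts placed by rank
          return shuffled

      rng = np.random.default_rng(0)
      fcst = rng.gamma(2.0, 3.0, size=(10, 3))     # postprocessed precipitation members
      hist = rng.gamma(2.0, 3.0, size=(10, 3))     # historical trajectories (template)
      print(schaake_shuffle(fcst, hist).round(2))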

  19. Viability Study for an Unattended UF6 Cylinder Verification Station: Phase I Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Leon E.; Miller, Karen A.; Garner, James R.

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass and identification for all declared UF6 cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The US Support Program team consisted of Pacific Northwest National Laboratory (PNNL, lead), Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL) and Savannah River National Laboratory (SRNL). At the core of the viability study is a long-term field trial of a prototype UCVS system at a Westinghouse fuel fabrication facility. A key outcome of the study is a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: the Hybrid Enrichment Verification Array (HEVA) and the Passive Neutron Enrichment Meter (PNEM). This report provides context for the UCVS concept and the field trial: potential UCVS implementation concepts at an enrichment facility; an overview of the UCVS prototype design; and field trial objectives and activities. Field trial results and interpretation are presented, with a focus on the performance of PNEM and HEVA for the assay of over 200 "typical" Type 30B cylinders, and the viability of an "NDA Fingerprint" concept as a high-fidelity means to periodically verify that the contents of a given cylinder are consistent with previous scans. A modeling study, combined with field-measured instrument uncertainties, provides an assessment of the partial-defect sensitivity of HEVA and PNEM for both one-time assay and (repeated) NDA Fingerprint verification scenarios. The findings presented in this report represent a significant step forward in the community's understanding of the strengths and limitations of the PNEM and HEVA NDA methods, and the viability of the UCVS concept in front-end fuel cycle facilities. This experience will inform Phase II of the UCVS viability study, should the IAEA pursue it.

  20. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study, which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data for validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  1. PERFORMANCE VERIFICATION OF ANIMAL WATER TREATMENT TECHNOLOGIES THROUGH EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of in...

  2. National Centers for Environmental Prediction

    Science.gov Websites


  3. National Centers for Environmental Prediction

    Science.gov Websites


  4. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasing number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher level behavioral models.

  5. Laboratory Experiments for Undergraduate Instruction in Economics.

    ERIC Educational Resources Information Center

    Wells, Donald A.

    1991-01-01

    Describes the generation and use of experimental data in teaching economics. Includes a double oral auction experiment and a monopoly pricing experiment. Concludes that such experiments allow the instructor to see what the students have learned, how they reason, and what parts of the material have proved difficult. (DK)

  6. 42 CFR 457.380 - Eligibility verification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Eligibility verification. 457.380 Section 457.380... Requirements: Eligibility, Screening, Applications, and Enrollment § 457.380 Eligibility verification. (a) The... State may establish reasonable eligibility verification mechanisms to promote enrollment of eligible...

  7. PERFORMANCE VERIFICATION OF STORMWATER TREATMENT DEVICES UNDER EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program's goal is to further environmental protection by a...

  8. A Large-Scale Study of Fingerprint Matching Systems for Sensor Interoperability Problem

    PubMed Central

    Hussain, Muhammad; AboAlSamh, Hatim; AlZuair, Mansour

    2018-01-01

    The fingerprint is a commonly used biometric modality that is widely employed for authentication by law enforcement agencies and commercial applications. The designs of existing fingerprint matching methods are based on the hypothesis that the same sensor is used to capture fingerprints during enrollment and verification. Advances in fingerprint sensor technology have raised the question about the usability of current methods when different sensors are employed for enrollment and verification; this is a fingerprint sensor interoperability problem. To provide insight into this problem and assess the status of state-of-the-art matching methods to tackle this problem, we first analyze the characteristics of fingerprints captured with different sensors, which makes cross-sensor matching a challenging problem. We demonstrate the importance of fingerprint enhancement methods for cross-sensor matching. Finally, we conduct a comparative study of state-of-the-art fingerprint recognition methods and provide insight into their abilities to address this problem. We performed experiments using a public database (FingerPass) that contains nine datasets captured with different sensors. We analyzed the effects of different sensors and found that cross-sensor matching performance deteriorates when different sensors are used for enrollment and verification. In view of our analysis, we propose future research directions for this problem. PMID:29597286
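
    Cross-sensor matching performance of the kind reported above is commonly summarized by an equal error rate (EER), the operating point where the false accept and false reject rates coincide. The sketch below computes an approximate EER from synthetic genuine and impostor score distributions; the scores are invented and do not come from the FingerPass experiments.

        import numpy as np

        def equal_error_rate(genuine, impostor):
            """Approximate the EER from genuine and impostor similarity scores."""
            best_gap, eer = 2.0, None
            for t in np.sort(np.concatenate([genuine, impostor])):
                far = np.mean(impostor >= t)   # false accept rate at threshold t
                frr = np.mean(genuine < t)     # false reject rate at threshold t
                if abs(far - frr) < best_gap:
                    best_gap, eer = abs(far - frr), (far + frr) / 2.0
            return eer

        rng = np.random.default_rng(1)
        genuine = rng.normal(0.7, 0.1, 500)    # hypothetical same-finger scores
        impostor = rng.normal(0.4, 0.1, 500)   # hypothetical different-finger scores
        print(round(equal_error_rate(genuine, impostor), 3))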

  9. Status of BOUT fluid turbulence code: improvements and verification

    NASA Astrophysics Data System (ADS)

    Umansky, M. V.; Lodestro, L. L.; Xu, X. Q.

    2006-10-01

    BOUT is an electromagnetic fluid turbulence code for tokamak edge plasma [1]. BOUT performs time integration of reduced Braginskii plasma fluid equations, using spatial discretization in realistic geometry and employing a standard ODE integration package, PVODE. BOUT has been applied to several tokamak experiments, and in some cases the calculated spectra of turbulent fluctuations compared favorably to experimental data. On the other hand, the desire to better understand the code results and to gain more confidence in them motivated an investment of effort in rigorous verification of BOUT. In parallel with the testing, the code underwent substantial modification, mainly to improve its readability and the tractability of physical terms, with some algorithmic improvements as well. In the verification process, a series of linear and nonlinear test problems was applied to BOUT, targeting different subgroups of physical terms. The tests include reproducing basic electrostatic and electromagnetic plasma modes in simplified geometry, axisymmetric benchmarks against the 2D edge code UEDGE in real divertor geometry, and neutral fluid benchmarks against the hydrodynamic code LCPFCT. After completion of the testing, the new version of the code is being applied to actual tokamak edge turbulence problems, and the results will be presented. [1] X. Q. Xu et al., Contr. Plas. Phys., 36, 158 (1998). *Work performed for USDOE by Univ. Calif. LLNL under contract W-7405-ENG-48.
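
    Code verification campaigns like the one described here often quantify agreement with analytic or manufactured solutions through the observed order of accuracy obtained from errors on successively refined grids. The short sketch below shows only that calculation; the error values are invented and unrelated to BOUT.

        import numpy as np

        def observed_order(error_coarse, error_fine, refinement_ratio=2.0):
            """Observed order p assuming error ~ C * h**p:
            p = log(e_coarse / e_fine) / log(r)."""
            return np.log(error_coarse / error_fine) / np.log(refinement_ratio)

        # hypothetical L2 errors against a known solution on grids h and h/2
        print(round(observed_order(4.1e-3, 1.05e-3), 2))   # close to 2, i.e. second order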

  10. FIR signature verification system characterizing dynamics of handwriting features

    NASA Astrophysics Data System (ADS)

    Thumwarin, Pitak; Pernwong, Jitawat; Matsuura, Takenobu

    2013-12-01

    This paper proposes an online signature verification method based on a finite impulse response (FIR) system characterizing time-frequency characteristics of dynamic handwriting features. First, the barycenter determined from both the center point of the signature and two adjacent pen-point positions in the signing process, instead of a single pen-point position, is used to reduce the fluctuation of handwriting motion. In this paper, among the available dynamic handwriting features, motion pressure and area pressure are employed to investigate handwriting behavior. Thus, the stable dynamic handwriting features can be described by the relation between the time-frequency characteristics of the dynamic handwriting features. In this study, this relation can be represented by an FIR system with the wavelet coefficients of the dynamic handwriting features as both input and output of the system. The impulse response of the FIR system is used as the individual feature for a particular signature. In short, a signature can be verified by evaluating the difference between the impulse responses of the FIR systems for a reference signature and the signature to be verified. The signature verification experiments in this paper were conducted using the SUBCORPUS MCYT-100 signature database consisting of 5,000 signatures from 100 signers. The proposed method yielded an equal error rate (EER) of 3.21% on skilled forgeries.
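
    The core idea above, representing the relation between two dynamic handwriting features as an FIR system and comparing impulse responses, can be sketched with a simple least-squares fit. The signals below are synthetic stand-ins for the wavelet coefficients used in the paper, and the comparison by distance is only indicative.

        import numpy as np

        def fit_fir(x, y, n_taps=8):
            """Least-squares FIR coefficients h with y[k] ~ sum_i h[i] * x[k-i]."""
            rows = [x[k - n_taps + 1:k + 1][::-1] for k in range(n_taps - 1, len(x))]
            h, *_ = np.linalg.lstsq(np.array(rows), y[n_taps - 1:], rcond=None)
            return h

        rng = np.random.default_rng(2)
        x = rng.standard_normal(400)                      # e.g. motion-pressure coefficients
        true_h = np.array([0.5, 0.3, -0.2, 0.1, 0.05, 0.0, 0.0, 0.0])
        y = np.convolve(x, true_h)[:len(x)]               # e.g. area-pressure coefficients

        h_reference = fit_fir(x, y)                                 # enrolled signature model
        h_query = fit_fir(x + 0.05 * rng.standard_normal(400), y)   # questioned signature
        print(round(np.linalg.norm(h_reference - h_query), 4))      # small distance -> accept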

  11. A Large-Scale Study of Fingerprint Matching Systems for Sensor Interoperability Problem.

    PubMed

    AlShehri, Helala; Hussain, Muhammad; AboAlSamh, Hatim; AlZuair, Mansour

    2018-03-28

    The fingerprint is a commonly used biometric modality that is widely employed for authentication by law enforcement agencies and commercial applications. The designs of existing fingerprint matching methods are based on the hypothesis that the same sensor is used to capture fingerprints during enrollment and verification. Advances in fingerprint sensor technology have raised the question about the usability of current methods when different sensors are employed for enrollment and verification; this is a fingerprint sensor interoperability problem. To provide insight into this problem and assess the status of state-of-the-art matching methods to tackle this problem, we first analyze the characteristics of fingerprints captured with different sensors, which makes cross-sensor matching a challenging problem. We demonstrate the importance of fingerprint enhancement methods for cross-sensor matching. Finally, we conduct a comparative study of state-of-the-art fingerprint recognition methods and provide insight into their abilities to address this problem. We performed experiments using a public database (FingerPass) that contains nine datasets captured with different sensors. We analyzed the effects of different sensors and found that cross-sensor matching performance deteriorates when different sensors are used for enrollment and verification. In view of our analysis, we propose future research directions for this problem.

  12. Probabilistic Elastic Part Model: A Pose-Invariant Representation for Real-World Face Verification.

    PubMed

    Li, Haoxiang; Hua, Gang

    2018-04-01

    Pose variation remains a major challenge for real-world face recognition. We approach this problem through a probabilistic elastic part model. We extract local descriptors (e.g., LBP or SIFT) from densely sampled multi-scale image patches. By augmenting each descriptor with its location, a Gaussian mixture model (GMM) is trained to capture the spatial-appearance distribution of the face parts of all face images in the training corpus, namely the probabilistic elastic part (PEP) model. Each mixture component of the GMM is confined to be a spherical Gaussian to balance the influence of the appearance and the location terms, which naturally defines a part. Given one or multiple face images of the same subject, the PEP model builds its PEP representation by sequentially concatenating descriptors identified by each Gaussian component in a maximum likelihood sense. We further propose a joint Bayesian adaptation algorithm to adapt the universally trained GMM to better model the pose variations between the target pair of faces/face tracks, which consistently improves face verification accuracy. Our experiments show that we achieve state-of-the-art face verification accuracy with the proposed representations on the Labeled Faces in the Wild (LFW) dataset, the YouTube video face database, and the CMU MultiPIE dataset.

  13. Technical review of SRT-CMA-930058 revalidation studies of Mark 16 experiments: J70

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reed, R.L.

    1993-10-25

    This study is a reperformance of a set of MGBS-TGAL criticality safety code validation calculations previously reported by Clark. The reperformance was needed because the records of the previous calculations could not be located in current APG files and records. As noted by the author, preliminary attempts to reproduce the Clark results by direct modeling in MGBS and TGAL were unsuccessful. Consultation with Clark indicated that the MGBS-TGAL (EXPT) option within the KOKO system should be used to set up the MGBS and TGAL input data records. The results of the study indicate that the technique used by Clark has been established and that the technique is now documented for future use. File records of the calculations have also been established in APG files. The review was performed per QAP 11-14 of 1Q34. Since the reviewer was involved in developing the procedural technique used for this study, this review cannot be considered a fully independent review, but should be considered a verification that the document contains adequate information to allow a new user to perform similar calculations, a verification of the procedure by performing several calculations independently with identical results to the reported results, and a verification of the readability of the report.

  14. SCORPI and SCORPI-T: Neurophysiological experiments on animals in space

    NASA Astrophysics Data System (ADS)

    Serafini, L.; Ramacciotti, T.; Vigano, W.; Donati, A.; Porciani, M.; Zolesi, V.; Schulze-Varnholt, D.; Manieri, P.; El-Din Sallam, A.; Schmah, M.; Horn, E. R.

    2005-08-01

    The study of physiological adaptation to long-term space flights, with special consideration of the internal clock systems of scorpions, is the goal of the SCORPI and SCORPI-T experiments. SCORPI was selected for flight on the International Space Station (ISS) and will be mounted in the European facility BIOLAB, the ESA laboratory designed to support biological experiments on micro-organisms, cells, tissue cultures, small plants and small invertebrates. The SCORPI-T experiment, performed on the Russian FOTON-M2 satellite in May-June 2005, represents an important precursor for the success of the SCORPI experiment on BIOLAB. This paper outlines the main features of the hardware designed and developed to allow the analysis of critical aspects of experiment execution and the verification of experiment objectives. The capabilities of the hardware developed for SCORPI and SCORPI-T show its potential for use in future experiments of a similar type in space.

  15. Research on Equivalent Tests of Dynamics of On-orbit Soft Contact Technology Based on On-Orbit Experiment Data

    NASA Astrophysics Data System (ADS)

    Yang, F.; Dong, Z. H.; Ye, X.

    2018-05-01

    Space robots have become a very important means of on-orbit maintenance and support, and many countries are conducting in-depth research and experiments in this area. Because on-orbit operational attitudes are very complicated, they are difficult to model in a research laboratory. This paper builds a complete equivalent experiment framework according to the requirements of the proposed space soft-contact technology. It also carries out verification of the flexible multi-body dynamics parameters of the on-orbit soft-contact mechanism, combining on-orbit experiment data, the developed equivalent model of the soft-contact mechanism, and a flexible multi-body dynamics equivalent model based on the Kane equation. The experimental results confirm the correctness of the developed on-orbit soft-contact flexible multi-body dynamics model.

  16. Unmanned Vehicle Material Flammability Test

    NASA Technical Reports Server (NTRS)

    Urban, David L.; Ruff, Gary A.; Minster, Olivier; Toth, Balazs; Fernandez-Pello, A. Carlos; Tien, James S.; Torero, Jose L.; Cowlard, Adam J.; Legros, Guillaume; Eigenbrod, Christian; hide

    2012-01-01

    Microgravity fire behaviour remains poorly understood and a significant risk for spaceflight. An experiment is under development that will provide the first real opportunity to examine this issue, focussing on two objectives: a) Flame Spread. b) Material Flammability. This experiment has been shown to be feasible on both ESA's ATV and Orbital Science's Cygnus vehicles, with the Cygnus as the current baseline carrier. An international topical team has been formed to develop concepts for that experiment and support its implementation: a) Pressure Rise Prediction. b) Sample Material Selection. This experiment would be a landmark for spacecraft fire safety, with the data and subsequent analysis providing much needed verification of spacecraft fire safety protocols for the crews of future exploration vehicles and habitats.

  17. Environmental Technology Verification Report -- Baghouse filtration products, GE Energy QG061 filtration media ( tested May 2007)

    EPA Science Inventory

    EPA has created the Environmental Technology Verification Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The Air Pollution Control Technology Verification Center, a cente...

  18. 40 CFR 1066.240 - Torque transducer verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification. Verify torque-measurement systems by performing the verifications described in §§ 1066.270 and... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Torque transducer verification. 1066...

  19. Toward a More Effective Economic Principles Class: The Florida State University Experience.

    ERIC Educational Resources Information Center

    Tuckman, Barbara; Tuckman, Howard

    1975-01-01

    This special issue explores alternative approaches to teaching the college introductory economics course. Using insights gained from learning theory, suggestions from the Joint Council on Economic Education, and trial and error, several faculty members at the Florida State University experimented with various techniques and approaches designed to…

  20. NASA/DOD Aerospace Knowledge Diffusion Research Project. Paper 24: A general approach to measuring the value of aerospace information products and services

    NASA Technical Reports Server (NTRS)

    Brinberg, Herbert R.; Pinelli, Thomas E.

    1993-01-01

    This paper discusses the various approaches to measuring the value of information, first defining the meanings of information, economics of information, and value. It concludes that no general model of measuring the value of information is possible and that the usual approaches, such as cost/benefit equations, have very limited applications. It also concludes that in specific contexts with given goals for newly developed products and services or newly acquired information there is a basis for its objective valuation. The axioms and inputs for such a model are described and directions for further verification and analysis are proposed.

  1. The Role of the DOE Weapons Laboratories in a Changing National Security Environment: CNSS Papers No. 8, April 1988

    DOE R&D Accomplishments Database

    Hecker, S. S.

    1988-04-01

    The contributions of the Department of Energy (DOE) nuclear weapons laboratories to the nation's security are reviewed in testimony before the Subcommittee on Procurement and Military Nuclear Systems of the House Armed Services Committee. Also presented are contributions that technology will make in maintaining the strategic balance through deterrence, treaty verification, and a sound nuclear weapons complex as the nation prepares for significant arms control initiatives. The DOE nuclear weapons laboratories can contribute to the broader context of national security, one that recognizes that military strength can be maintained over the long term only if it is built upon the foundations of economic strength and energy security.

  2. Personal Identification by Keystroke Dynamics in Japanese Free Text Typing

    NASA Astrophysics Data System (ADS)

    Samura, Toshiharu; Nishimura, Haruhiko

    Biometrics is classified into verification and identification. Much research on keystroke dynamics has treated the verification of a fixed short password used for user login. In this research, we focus on identification and investigate several characteristics of keystroke dynamics in Japanese free text typing. We developed Web-based typing software to collect keystroke data over a local area network and performed experiments on a total of 112 subjects, from whom three groups by typing level were constructed: beginner level and above, normal level and above, and middle level and above. Using identification methods based on the weighted Euclidean distance and a neural network applied to the feature indexes extracted from Japanese texts, we evaluated identification performance for the three groups. As a result, high personal identification accuracy was confirmed for both methods, in proportion to the typing level of the group.
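
    As a rough illustration of the weighted Euclidean distance identification mentioned above, the sketch below assigns an unknown typing sample to the enrolled user whose feature template is closest, weighting each keystroke feature by its inverse variance. The feature vectors are invented and much smaller than the feature indexes used in the study.

        import numpy as np

        def identify(sample, templates, weights):
            """Return the enrolled user minimizing the weighted Euclidean distance."""
            distances = {user: np.sqrt(np.sum(weights * (sample - template) ** 2))
                         for user, template in templates.items()}
            return min(distances, key=distances.get), distances

        # hypothetical mean timing features (ms) per enrolled user
        templates = {"user_a": np.array([95.0, 180.0, 60.0]),
                     "user_b": np.array([120.0, 150.0, 75.0])}
        weights = 1.0 / np.array([8.0, 20.0, 5.0]) ** 2   # inverse-variance weights
        sample = np.array([98.0, 176.0, 61.0])
        print(identify(sample, templates, weights)[0])    # -> "user_a"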

  3. Signal existence verification (SEV) for GPS low received power signal detection using the time-frequency approach.

    PubMed

    Jan, Shau-Shiun; Sun, Chih-Cheng

    2010-01-01

    The detection of low received power of global positioning system (GPS) signals in the signal acquisition process is an important issue for GPS applications. Improving the miss-detection problem of low received power signal is crucial, especially for urban or indoor environments. This paper proposes a signal existence verification (SEV) process to detect and subsequently verify low received power GPS signals. The SEV process is based on the time-frequency representation of GPS signal, and it can capture the characteristic of GPS signal in the time-frequency plane to enhance the GPS signal acquisition performance. Several simulations and experiments are conducted to show the effectiveness of the proposed method for low received power signal detection. The contribution of this work is that the SEV process is an additional scheme to assist the GPS signal acquisition process in low received power signal detection, without changing the original signal acquisition or tracking algorithms.
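
    The time-frequency idea behind the SEV process can be illustrated, very loosely, with a spectrogram-based presence check: a sinusoid that is weak in the time domain is declared present only if some time-frequency cell stands well above the estimated noise floor. The signal parameters and threshold below are invented and unrelated to actual GPS C/A code acquisition.

        import numpy as np
        from scipy.signal import spectrogram

        fs = 4096.0
        t = np.arange(0, 1.0, 1.0 / fs)
        rng = np.random.default_rng(3)
        weak_tone = 0.7 * np.sin(2 * np.pi * 704.0 * t)   # below the noise level in the time domain
        x = weak_tone + rng.standard_normal(t.size)

        f, tt, Sxx = spectrogram(x, fs=fs, nperseg=256)
        noise_floor = np.median(Sxx)                      # robust noise-floor estimate
        peak_ratio = Sxx.max() / noise_floor
        print(peak_ratio > 15.0, round(peak_ratio, 1))    # ad hoc existence threshold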

  4. Verification of NASA Emergent Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time make it more difficult to design them and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study of swarm-based missions with which to experiment with and test current formal methods on intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.

  5. Competitive region orientation code for palmprint verification and identification

    NASA Astrophysics Data System (ADS)

    Tang, Wenliang

    2015-11-01

    Orientation features of the palmprint have been widely investigated in coding-based palmprint-recognition methods. Conventional orientation-based coding methods usually used discrete filters to extract the orientation feature of palmprint. However, in real operations, the orientations of the filter usually are not consistent with the lines of the palmprint. We thus propose a competitive region orientation-based coding method. Furthermore, an effective weighted balance scheme is proposed to improve the accuracy of the extracted region orientation. Compared with conventional methods, the region orientation of the palmprint extracted using the proposed method can precisely and robustly describe the orientation feature of the palmprint. Extensive experiments on the baseline PolyU and multispectral palmprint databases are performed and the results show that the proposed method achieves a promising performance in comparison to conventional state-of-the-art orientation-based coding methods in both palmprint verification and identification.
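
    Competitive coding schemes of the kind extended above typically filter the palm image with a small bank of oriented, Gabor-like filters and record, per pixel, the index of the orientation giving the strongest line response. The generic sketch below follows only that winner-take-all idea; it is not the authors' weighted region-orientation scheme, and the image is random stand-in data.

        import numpy as np
        from scipy.signal import fftconvolve

        def gabor_kernel(theta, size=17, sigma=3.0, wavelength=8.0):
            """Real, zero-mean Gabor filter oriented along angle theta (radians)."""
            half = size // 2
            y, x = np.mgrid[-half:half + 1, -half:half + 1]
            xr = x * np.cos(theta) + y * np.sin(theta)
            g = np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / wavelength)
            return g - g.mean()

        def competitive_code(image, n_orients=6):
            """Winner-take-all orientation index per pixel (most negative response
            marks a dark palm line aligned with that filter orientation)."""
            responses = [fftconvolve(image, gabor_kernel(k * np.pi / n_orients), mode="same")
                         for k in range(n_orients)]
            return np.argmin(np.stack(responses), axis=0)

        palm_roi = np.random.default_rng(4).random((64, 64))   # stand-in for a palmprint ROI
        code = competitive_code(palm_roi)
        print(code.shape, code.min(), code.max())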

  6. Plasma Model V&V of Collisionless Electrostatic Shock

    NASA Astrophysics Data System (ADS)

    Martin, Robert; Le, Hai; Bilyeu, David; Gildea, Stephen

    2014-10-01

    A simple 1D electrostatic collisionless shock was selected as an initial validation and verification test case for a new plasma modeling framework under development at the Air Force Research Laboratory's In-Space Propulsion branch (AFRL/RQRS). Cross verification between PIC, Vlasov, and Fluid plasma models within the framework along with expected theoretical results will be shown. The non-equilibrium velocity distributions (VDF) captured by PIC and Vlasov will be compared to each other and the assumed VDF of the fluid model at selected points. Validation against experimental data from the University of California, Los Angeles double-plasma device will also be presented along with current work in progress at AFRL/RQRS towards reproducing the experimental results using higher fidelity diagnostics to help elucidate differences between model results and between the models and original experiment. DISTRIBUTION A: Approved for public release; unlimited distribution; PA (Public Affairs) Clearance Number 14332.

  7. Development and tests on OREX vehicle thermal structure system

    NASA Astrophysics Data System (ADS)

    Yoshinaka, Toshinari; Morino, Yoshiki

    1992-08-01

    An overview of the thermal structure system development and tests for the Orbital Re-entry Experiment (OREX) vehicle, being developed as part of the H-2 Orbiting Plane (HOPE) program, is presented. The results of the study on the OREX vehicle thermal structure system and the concept of the system study are shown. The results of HOPE thermal structure system research were reflected in OREX by employing polyacrylonitrile tissues with conversion coating for the nose cap, carbon thermal protection system (TPS), and ceramic tile TPS for the structure. Test plans were established for material characterization and design verification, for flight validation of the C/C (carbon/carbon composite) nose cap and TPS, and for gap filler, arc wind tunnel, heat insulation, and adhesion quality verification tests. The environmental resistance of the C/C nose cap, C/C TPS, and ceramic tile TPS was verified, and prospects for their manufacture were obtained.

  8. Verification of Geosat sea surface topography in the Gulf Stream extension with surface drifting buoys and hydrographic measurements

    NASA Astrophysics Data System (ADS)

    Willebrand, J.; Käse, R. H.; Stammer, D.; Hinrichsen, H.-H.; Krauss, W.

    1990-03-01

    Altimeter data from Geosat have been analyzed in the Gulf Stream extension area. Horizontal maps of the sea surface height anomaly relative to an annual mean for various 17-day intervals were constructed using an objective mapping procedure. The mean sea level was approximated by the dynamic topography from climatological hydrographic data. Geostrophic surface velocities derived from the composite maps (mean plus anomaly) are significantly correlated with surface drifter velocities observed during an oceanographic experiment in the spring of 1987. The drifter velocities contain much energy on scales less than 100 km which are not resolved in the altimetric maps. It is shown that the composite sea surface height also agrees well with ground verification from hydrographic data along sections in a triangle between the Azores, Newfoundland, and Bermuda, except in regions of high mean gradients.

  9. In-orbit verification of small optical transponder (SOTA): evaluation of satellite-to-ground laser communication links

    NASA Astrophysics Data System (ADS)

    Takenaka, Hideki; Koyama, Yoshisada; Akioka, Maki; Kolev, Dimitar; Iwakiri, Naohiko; Kunimori, Hiroo; Carrasco-Casado, Alberto; Munemasa, Yasushi; Okamoto, Eiji; Toyoshima, Morio

    2016-03-01

    Research and development of space optical communications is conducted at the National Institute of Information and Communications Technology (NICT). NICT developed the Small Optical TrAnsponder (SOTA), which was embarked on a 50 kg-class satellite and launched into a low Earth orbit (LEO). Space-to-ground laser communication experiments have been conducted with the SOTA. Atmospheric turbulence causes signal fading and is an issue to be solved in satellite-to-ground laser communication links. Therefore, as error-correcting functions, a Reed-Solomon (RS) code and a Low-Density Generator Matrix (LDGM) code are implemented in the communication system onboard the SOTA. In this paper, we present the in-orbit verification results of SOTA, including the characteristics of these functions, the communication performance with the LDGM code over satellite-to-ground atmospheric paths, and the link budget analysis with a comparison between theoretical and experimental results.

  10. WEC3: Wave Energy Converter Code Comparison Project: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Combourieu, Adrien; Lawson, Michael; Babarit, Aurelien

    This paper describes the recently launched Wave Energy Converter Code Comparison (WEC3) project and presents preliminary results from this effort. The objectives of WEC3 are to verify and validate numerical modelling tools that have been developed specifically to simulate wave energy conversion devices and to inform the upcoming IEA OES Annex VI Ocean Energy Modelling Verification and Validation project. WEC3 is divided into two phases. Phase I consists of code-to-code verification and Phase II entails code-to-experiment validation. WEC3 focuses on mid-fidelity codes that simulate WECs using time-domain multibody dynamics methods to model device motions and hydrodynamic coefficients to model hydrodynamic forces. Consequently, high-fidelity numerical modelling tools, such as Navier-Stokes computational fluid dynamics simulation, and simple frequency-domain modelling tools were not included in the WEC3 project.

  11. Three years of operational experience from Schauinsland CTBT monitoring station.

    PubMed

    Zähringer, M; Bieringer, J; Schlosser, C

    2008-04-01

    Data from three years of operation of a low-level aerosol sampler and analyzer (RASA) at the Schauinsland monitoring station are reported. The system is part of the International Monitoring System (IMS) for verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The fully automatic system is capable of measuring aerosol-borne gamma emitters with high sensitivity and routinely quantifies 7Be and 212Pb. The system had a high level of data availability of 90% within the reporting period. A daily screening process rendered 66 tentative identifications of verification-relevant radionuclides since the system entered IMS operation in February 2004. Two of these were real events and were associated with a plausible source. The remaining 64 cases can consistently be explained by detector background and statistical phenomena. Inter-comparison with data from a weekly sampler operated at the same station shows instabilities of the calibration during the test phase and good agreement since certification of the system.

  12. An Integrated Environment for Efficient Formal Design and Verification

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.

  13. Gender verification: a term whose time has come and gone.

    PubMed

    Hercher, Laura

    2010-12-01

    The process of testing to determine gender in putatively female athletes was developed in order to prevent cheating, but has devolved instead into a clumsy mechanism for detecting disorders of sexual development (DSD's). In over thirty years of compulsory testing, individuals with DSD's have been stigmatized and some have been denied the right to compete, although frequently their condition provided no competitive advantage. More recent guidelines require testing only on a case-by-case basis; the South African runner Caster Semenya was the first major test of this policy, and her experience points to the need for a more sensitive and confidential process. In addition, her case dramatizes the inadequacy of the term "gender verification." Gender identity is a complex entity and resists simple classification. Sports authorities may set guidelines for who can compete, but they should refrain from taking on themselves the authority to decide who is and who is not a female.

  14. 48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...

  15. 48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...

  16. 48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...

  17. 48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...

  18. 48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...

  19. Joint ETV/NOWATECH verification protocol for the Sorbisense GSW40 passive sampler

    EPA Science Inventory

    Environmental technology verification (ETV) is an independent (third party) assessment of the performance of a technology or a product for a specified application, under defined conditions and quality assurance. This verification is a joint verification with the US EPA ETV schem...

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PERFORMANCE VERIFICATION OF THE W.L. GORE & ASSOCIATES GORE-SORBER SCREENING SURVEY

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  1. Multi-canister overpack project -- verification and validation, MCNP 4A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldmann, L.H.

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software requires that it be compiled specifically for the machine it is to be used on. Therefore, to facilitate the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison on the new output file and the old output file. Any differences between any of the files will cause a verification error. Due to the manner in which the verification is completed, a verification error does not necessarily indicate a problem. This indicates that a closer look at the output files is needed to determine the cause of the error.
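
    The installation check described above reduces to comparing freshly generated output files against stored reference outputs and flagging any difference for closer inspection. A generic sketch of such a comparison follows; the file names are hypothetical.

        import difflib
        from pathlib import Path

        def compare_outputs(new_path, reference_path):
            """Return a unified diff between a new output file and its reference.
            An empty diff means the verification run reproduced the stored output."""
            new_lines = Path(new_path).read_text().splitlines(keepends=True)
            ref_lines = Path(reference_path).read_text().splitlines(keepends=True)
            return list(difflib.unified_diff(ref_lines, new_lines,
                                             fromfile=str(reference_path),
                                             tofile=str(new_path)))

        # diff = compare_outputs("sample01.out.new", "sample01.out.ref")
        # print("verification error" if diff else "outputs identical")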

  2. Experimental Evaluation of Verification and Validation Tools on Martian Rover Software

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Venet, Arnaud; Visser, Willem

    2003-01-01

    To achieve its science objectives in deep space exploration, NASA has a need for science platform vehicles to autonomously make control decisions in a time frame that excludes intervention from Earth-based controllers. Round-trip light time is one significant factor motivating autonomy capability; another factor is the need to reduce ground support operations cost. An unsolved problem potentially impeding the adoption of autonomy capability is the verification and validation of such software systems, which exhibit far more behaviors (and hence distinct execution paths in the software) than is typical in current deep-space platforms. Hence the need for a study to benchmark advanced Verification and Validation (V&V) tools on representative autonomy software. The objective of the study was to assess the maturity of different technologies, to provide data indicative of potential synergies between them, and to identify gaps in the technologies with respect to the challenge of autonomy V&V. The study consisted of two parts: first, a set of relatively independent case studies of different tools on the same autonomy code; second, a carefully controlled experiment with human participants on a subset of these technologies. This paper describes the second part of the study. Overall, nearly four hundred hours of data on human use of three different advanced V&V tools were accumulated, with a control group that used conventional testing methods. The experiment simulated four independent V&V teams debugging three successive versions of an executive controller for a Martian Rover. Defects were carefully seeded into the three versions based on a profile of defects from CVS logs that occurred in the actual development of the executive controller. The rest of the document is structured as follows. In sections 2 and 3, we respectively describe the tools used in the study and the rover software that was analyzed. In section 4 the methodology for the experiment is described; this includes the code preparation, seeding of defects, participant training and experimental setup. Next we give a qualitative overview of how the experiment went from the point of view of each technology: model checking (section 5), static analysis (section 6), runtime analysis (section 7) and testing (section 8). The final section gives some preliminary quantitative results on how the tools compared.

  3. Benchmarking desktop and mobile handwriting across COTS devices: The e-BioSign biometric database

    PubMed Central

    Tolosana, Ruben; Vera-Rodriguez, Ruben; Fierrez, Julian; Morales, Aythami; Ortega-Garcia, Javier

    2017-01-01

    This paper describes the design, acquisition process and baseline evaluation of the new e-BioSign database, which includes dynamic signature and handwriting information. Data is acquired from 5 different COTS devices: three Wacom devices (STU-500, STU-530 and DTU-1031) specifically designed to capture dynamic signatures and handwriting, and two general purpose tablets (Samsung Galaxy Note 10.1 and Samsung ATIV 7). For the two Samsung tablets, data is collected using both a pen stylus and the finger in order to study the performance of signature verification in a mobile scenario. Data was collected in two sessions for 65 subjects, and includes dynamic information of the signature, the full name and alphanumeric sequences. Skilled forgeries were also performed for signatures and full names. We also report a benchmark evaluation based on e-BioSign for person verification under three different real scenarios: 1) intra-device, 2) inter-device, and 3) mixed writing-tool. We have experimented with the proposed benchmark using the main existing approaches for signature verification: feature- and time functions-based. As a result, new insights into the problem of signature biometrics in sensor-interoperable scenarios have been obtained, namely: the importance of specific methods for dealing with device interoperability, and the necessity of a deeper analysis on signatures acquired using the finger as the writing tool. This e-BioSign public database allows the research community to: 1) further analyse and develop signature verification systems in realistic scenarios, and 2) work towards a better understanding of the nature of human handwriting when captured using electronic COTS devices in realistic conditions. PMID:28475590

  4. Benchmarking desktop and mobile handwriting across COTS devices: The e-BioSign biometric database.

    PubMed

    Tolosana, Ruben; Vera-Rodriguez, Ruben; Fierrez, Julian; Morales, Aythami; Ortega-Garcia, Javier

    2017-01-01

    This paper describes the design, acquisition process and baseline evaluation of the new e-BioSign database, which includes dynamic signature and handwriting information. Data is acquired from 5 different COTS devices: three Wacom devices (STU-500, STU-530 and DTU-1031) specifically designed to capture dynamic signatures and handwriting, and two general purpose tablets (Samsung Galaxy Note 10.1 and Samsung ATIV 7). For the two Samsung tablets, data is collected using both a pen stylus and the finger in order to study the performance of signature verification in a mobile scenario. Data was collected in two sessions for 65 subjects, and includes dynamic information of the signature, the full name and alphanumeric sequences. Skilled forgeries were also performed for signatures and full names. We also report a benchmark evaluation based on e-BioSign for person verification under three different real scenarios: 1) intra-device, 2) inter-device, and 3) mixed writing-tool. We have experimented with the proposed benchmark using the main existing approaches for signature verification: feature- and time functions-based. As a result, new insights into the problem of signature biometrics in sensor-interoperable scenarios have been obtained, namely: the importance of specific methods for dealing with device interoperability, and the necessity of a deeper analysis on signatures acquired using the finger as the writing tool. This e-BioSign public database allows the research community to: 1) further analyse and develop signature verification systems in realistic scenarios, and 2) work towards a better understanding of the nature of human handwriting when captured using electronic COTS devices in realistic conditions.

  5. Energy- and time-resolved detection of prompt gamma-rays for proton range verification.

    PubMed

    Verburg, Joost M; Riley, Kent; Bortfeld, Thomas; Seco, Joao

    2013-10-21

    In this work, we present experimental results of a novel prompt gamma-ray detector for proton beam range verification. The detection system features an actively shielded cerium-doped lanthanum(III) bromide scintillator, coupled to a digital data acquisition system. The acquisition was synchronized to the cyclotron radio frequency to separate the prompt gamma-ray signals from the later-arriving neutron-induced background. We designed the detector to provide a high energy resolution and an effective reduction of background events, enabling discrete proton-induced prompt gamma lines to be resolved. Measuring discrete prompt gamma lines has several benefits for range verification. As the discrete energies correspond to specific nuclear transitions, the magnitudes of the different gamma lines have unique correlations with the proton energy and can be directly related to nuclear reaction cross sections. The quantification of discrete gamma lines also enables elemental analysis of tissue in the beam path, providing a better prediction of prompt gamma-ray yields. We present the results of experiments in which a water phantom was irradiated with proton pencil-beams in a clinical proton therapy gantry. A slit collimator was used to collimate the prompt gamma-rays, and measurements were performed at 27 positions along the path of proton beams with ranges of 9, 16 and 23 g cm(-2) in water. The magnitudes of discrete gamma lines at 4.44, 5.2 and 6.13 MeV were quantified. The prompt gamma lines were found to be clearly resolved in dimensions of energy and time, and had a reproducible correlation with the proton depth-dose curve. We conclude that the measurement of discrete prompt gamma-rays for in vivo range verification of clinical proton beams is feasible, and plan to further study methods and detector designs for clinical use.
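
    Quantifying a discrete prompt gamma line, as described above, is essentially a peak-fitting problem: fit a Gaussian line plus a background to the counts around the line energy and report the peak area. The spectrum below is simulated purely for illustration and does not reproduce the measured data.

        import numpy as np
        from scipy.optimize import curve_fit

        def peak_model(E, amplitude, centroid, sigma, background):
            """Gaussian line on a flat background."""
            return amplitude * np.exp(-0.5 * ((E - centroid) / sigma) ** 2) + background

        # simulated counts around a 4.44 MeV line (hypothetical values)
        rng = np.random.default_rng(5)
        E = np.linspace(4.2, 4.7, 120)                        # MeV
        counts = rng.poisson(peak_model(E, 900.0, 4.44, 0.03, 50.0)).astype(float)

        p0 = [counts.max(), 4.44, 0.05, counts.min()]         # initial guess
        popt, _ = curve_fit(peak_model, E, counts, p0=p0)
        amplitude, centroid, sigma, background = popt
        area = amplitude * sigma * np.sqrt(2.0 * np.pi)       # net counts in the line
        print(round(centroid, 3), round(area, 1))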

  6. Space Station automated systems testing/verification and the Galileo Orbiter fault protection design/verification

    NASA Technical Reports Server (NTRS)

    Landano, M. R.; Easter, R. W.

    1984-01-01

    Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements lead to a number of issues and uncertainties which require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.

  7. Verification of S&D Solutions for Network Communications and Devices

    NASA Astrophysics Data System (ADS)

    Rudolph, Carsten; Compagna, Luca; Carbone, Roberto; Muñoz, Antonio; Repp, Jürgen

    This chapter describes the tool-supported verification of S&D Solutions on the level of network communications and devices. First, the general goals and challenges of verification in the context of AmI systems are highlighted and the role of verification and validation within the SERENITY processes is explained. Then, SERENITY extensions to the SH VErification tool are explained using small examples. Finally, the applicability of existing verification tools is discussed in the context of the AVISPA toolset. The two different tools show that, for the security analysis of network and device S&D Patterns, relevant complementary approaches exist and can be used.

  8. Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk

    NASA Technical Reports Server (NTRS)

    Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    The projected impact of compositional verification research conducted by the National Aeronautics and Space Administration System-Wide Safety and Assurance Technologies on aviation safety risk was assessed. Software and compositional verification was described. Traditional verification techniques have two major problems: testing at the prototype stage, where error discovery can be quite costly, and the inability to test for all potential interactions, leaving some errors undetected until used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution to addressing increasingly larger and more complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, 1 research action, 5 operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferguson, D.W.; Tompkins, T.A.; Pratapas, J.M.

    The Coal Quality Impact Model (CQIM™) was used to evaluate the economic and performance impacts of gas co-firing at Mississippi Power Company's Plant Watson. One of the most important benefits of gas co-firing considered was the ability to burn lower quality, less expensive fuels. Four coals and petroleum coke were evaluated at 0, 5, 10, 20, and 30 percent gas co-firing. These fuels vary widely in their geographic source, heating value, moisture, volatile matter, and sulfur contents. Performance and economic evaluations were conducted at individual load points of 100, 75, 50, 40, 30, and 20 percent of full load. Additional analyses were made for seasonal load-demand curves and for an average annual load-demand curve. Operating cost in $/MWh, net plant heat rate in Btu/kWh, and break-even gas price in $/MBtu are presented as a function of load and percent gas co-firing. Results illustrate that with the Illinois Basin coal currently burned at Plant Watson, gas co-firing can be economically justified over a range of gas market prices on either an annual or seasonal basis. Other findings indicate that petroleum coke and South American coal co-fired with natural gas offer significant fuel cost savings and are attractive candidate fuels for combustion verification testing.
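
    The break-even gas price reported in such studies can be illustrated with simple fuel-cost arithmetic: it is the gas price at which the blended co-firing fuel cost per MWh equals the coal-only fuel cost per MWh. The numbers below are hypothetical and not taken from the CQIM analysis, which also accounts for effects such as cheaper coal grades, maintenance and emissions.

        def breakeven_gas_price(coal_price, heat_rate_coal, heat_rate_cofire, gas_fraction):
            """Gas price ($/MBtu) at which co-firing matches the coal-only fuel cost.

            coal_price       : delivered coal price, $/MBtu
            heat_rate_coal   : net plant heat rate on coal only, Btu/kWh
            heat_rate_cofire : net plant heat rate when co-firing, Btu/kWh
            gas_fraction     : fraction of co-fired heat input supplied by gas
            """
            # fuel cost ($/MWh) = heat rate (Btu/kWh) x fuel price ($/MBtu) / 1000
            coal_only_cost = heat_rate_coal * coal_price / 1000.0
            blended_price = coal_only_cost * 1000.0 / heat_rate_cofire
            return (blended_price - (1.0 - gas_fraction) * coal_price) / gas_fraction

        # hypothetical inputs: $2.10/MBtu coal, slight heat-rate change, 10% gas heat input
        print(round(breakeven_gas_price(2.10, 10200.0, 10150.0, 0.10), 2))   # ~2.20 $/MBtu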

  10. Natural-technological risk assessment and management

    NASA Astrophysics Data System (ADS)

    Burova, Valentina; Frolova, Nina

    2016-04-01

    EM-DAT statistics on human impacts and economic damages in the first half of 2015 are the highest since 2011: 41% of disasters were floods, responsible for 39% of the economic damage, and 7% of events were earthquakes, responsible for 59% of the total death toll. This suggests that disaster risk assessment and management still need to be improved and remain the principal issue in related national and international programs. The paper investigates risk assessment and management practice in the Russian Federation at different levels. A method is proposed to identify territories characterized by integrated natural-technological hazard. Maps of the Russian Federation zoned according to the integrated natural-technological hazard level are presented, as well as the procedure for updating the integrated hazard level taking into account the activity of separate processes. Special attention is paid to databases on the consequences of past natural and technological processes, which are used for verification of current hazard estimates. Examples of natural-technological risk zoning for the country and for some regional territories are presented. Different output risk indexes, both social and economic, are estimated taking into account the requirements of end users. In order to increase the safety of the population of the Russian Federation, trans-boundary hazards are also taken into account.

  11. 78 FR 27882 - VA Veteran-Owned Small Business (VOSB) Verification Guidelines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-13

    ... Verification Self-Assessment Tool that walks the veteran through the regulation and how it applies to the...) Verification Guidelines AGENCY: Department of Veterans Affairs. ACTION: Advanced notice of proposed rulemaking... regulations governing the Department of Veterans Affairs (VA) Veteran-Owned Small Business (VOSB) Verification...

  12. 78 FR 32010 - Pipeline Safety: Public Workshop on Integrity Verification Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-28

    .... PHMSA-2013-0119] Pipeline Safety: Public Workshop on Integrity Verification Process AGENCY: Pipeline and... announcing a public workshop to be held on the concept of ``Integrity Verification Process.'' The Integrity Verification Process shares similar characteristics with fitness for service processes. At this workshop, the...

  13. Interim Letter Report - Verification Survey of Partial Grid E9, David Witherspoon, Inc. 1630 Site Knoxville, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P.C. Weaver

    2008-06-12

    Conduct verification surveys of available grids at the DWI 1630 in Knoxville, Tennessee. A representative with the Independent Environmental Assessment and Verification (IEAV) team from ORISE conducted a verification survey of a partial area within Grid E9.

  14. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by the...

  15. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by the...

  16. 22 CFR 123.14 - Import certificate/delivery verification procedure.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... REGULATIONS LICENSES FOR THE EXPORT OF DEFENSE ARTICLES § 123.14 Import certificate/delivery verification procedure. (a) The Import Certificate/Delivery Verification Procedure is designed to assure that a commodity... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Import certificate/delivery verification...

  17. 22 CFR 123.14 - Import certificate/delivery verification procedure.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... REGULATIONS LICENSES FOR THE EXPORT OF DEFENSE ARTICLES § 123.14 Import certificate/delivery verification procedure. (a) The Import Certificate/Delivery Verification Procedure is designed to assure that a commodity... 22 Foreign Relations 1 2011-04-01 2011-04-01 false Import certificate/delivery verification...

  18. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 1 2013-10-01 2013-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  19. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  20. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  1. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 1 2012-10-01 2012-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  2. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  3. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    NASA Technical Reports Server (NTRS)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  4. An Efficient Location Verification Scheme for Static Wireless Sensor Networks.

    PubMed

    Kim, In-Hwan; Kim, Bo-Sung; Song, JooSeok

    2017-01-24

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimates in some situations. Location verification can be the solution in those situations or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process and requirements such as special hardware, a dedicated verifier, or a trusted third party, which cause additional communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces these overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between the location claimant and the verifier for location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, compared with other location verification schemes, for the verification of a single sensor. In addition, simulation results for the verification of the whole network show that MSRLV can detect over 90% of malicious sensors when sensors in the network have five or more neighbors.
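
    The following is a simplified, hypothetical illustration of a region-based location check in the spirit of the scheme described above; it is not the published MSRLV protocol, and all ranges and coordinates are invented for the example.

      # Illustrative sketch of a region-based location check (not the published MSRLV protocol):
      # a verifier that hears a claimant directly knows the claimant lies within radio range;
      # if the claimed coordinates fall outside the overlap of the two nodes' communication
      # disks, the claim is flagged as suspicious. All parameters are assumptions.

      from math import hypot

      def in_shared_region(claimed_xy, verifier_xy, claimant_range, verifier_range):
          """True if the claimed position lies in the overlap of both communication disks."""
          d = hypot(claimed_xy[0] - verifier_xy[0], claimed_xy[1] - verifier_xy[1])
          # The claimed point must be within the verifier's range, and the verifier must be
          # within the claimant's range, for direct two-way communication to be plausible.
          return d <= verifier_range and d <= claimant_range

      if __name__ == "__main__":
          print(in_shared_region((30.0, 40.0), (0.0, 0.0), claimant_range=60.0, verifier_range=55.0))  # True
          print(in_shared_region((80.0, 10.0), (0.0, 0.0), claimant_range=60.0, verifier_range=55.0))  # False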

  5. An Efficient Location Verification Scheme for Static Wireless Sensor Networks

    PubMed Central

    Kim, In-hwan; Kim, Bo-sung; Song, JooSeok

    2017-01-01

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimates in some situations. Location verification can be the solution in those situations or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process and requirements such as special hardware, a dedicated verifier, or a trusted third party, which cause additional communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces these overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between the location claimant and the verifier for location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, compared with other location verification schemes, for the verification of a single sensor. In addition, simulation results for the verification of the whole network show that MSRLV can detect over 90% of malicious sensors when sensors in the network have five or more neighbors. PMID:28125007

  6. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone, owing to the intensive and customized effort needed to verify each individual component model, and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.

  7. Towards the formal verification of the requirements and design of a processor interface unit

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.

    1993-01-01

    The formal verification of the design and partial requirements for a Processor Interface Unit (PIU) using the Higher Order Logic (HOL) theorem-proving system is described. The processor interface unit is a single-chip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. It provides the opportunity to investigate the specification and verification of a real-world subsystem within a commercially-developed fault-tolerant computer. An overview of the PIU verification effort is given. The actual HOL listings from the verification effort are documented in a companion NASA contractor report entitled 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit - HOL Listings', including the general-purpose HOL theories and definitions that support the PIU verification as well as the tactics used in the proofs.

  8. A Nonvolume Preserving Plasticity Theory with Applications to Powder Metallurgy

    NASA Technical Reports Server (NTRS)

    Cassenti, B. N.

    1983-01-01

    A plasticity theory has been developed to predict the mechanical response of powder metals during hot isostatic pressing. The theory parameters were obtained through an experimental program consisting of hydrostatic pressure tests, uniaxial compression tests, and uniaxial tension tests. A nonlinear finite element code was modified to include the theory, and the results of the modified code compared favorably with the results from a verification experiment.

  9. Source Physics Experiment: Research in Support of Verification and Nonproliferation

    DTIC Science & Technology

    2011-09-01

    ...designed to provide a carefully controlled seismic and strong motion data set from buried explosions at the Nevada National Security Site (NNSS). The...deposition partitioned into internal (heat and plastic strain) and kinetic (e.g., radiated seismic) energy, giving more confidence in predicted free...ample information to study dry and water-saturated fractures, local lithology and topography on the radiated seismic wavefield. Spallation on...

  10. Verification of Algebra Step Problems: A Chronometric Study of Human Problem Solving. Technical Report No. 253. Psychology and Education Series.

    ERIC Educational Resources Information Center

    Matthews, Paul G.; Atkinson, Richard C.

    This paper reports an experiment designed to test theoretical relations among fast problem solving, more complex and slower problem solving, and research concerning fundamental memory processes. Using a cathode ray tube, subjects were presented with propositions of the form "Y is in list X" which they memorized. In later testing they were asked to…

  11. Modeling the Water Balloon Slingshot

    NASA Astrophysics Data System (ADS)

    Bousquet, Benjamin D.; Figura, Charles C.

    2013-01-01

    In the introductory physics courses at Wartburg College, we have been working to create a lab experience focused on the scientific process itself rather than verification of physical laws presented in the classroom or textbook. To this end, we have developed a number of open-ended modeling exercises suitable for a variety of learning environments, from non-science major classes to algebra-based and calculus-based introductory physics classes.

  12. 7 CFR 272.8 - State income and eligibility verification system.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false State income and eligibility verification system. 272... PARTICIPATING STATE AGENCIES § 272.8 State income and eligibility verification system. (a) General. (1) State agencies may maintain and use an income and eligibility verification system (IEVS), as specified in this...

  13. 24 CFR 985.3 - Indicators, HUD verification methods and ratings.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Indicators, HUD verification..., HUD verification methods and ratings. This section states the performance indicators that are used to assess PHA Section 8 management. HUD will use the verification method identified for each indicator in...

  14. 78 FR 56268 - Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-12

    .... PHMSA-2013-0119] Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension... public workshop on ``Integrity Verification Process'' which took place on August 7, 2013. The notice also sought comments on the proposed ``Integrity Verification Process.'' In response to the comments received...

  15. 10 CFR 9.54 - Verification of identity of individuals making requests.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...

  16. 76 FR 50164 - Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-12

    ...-AQ06 Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing... correct certain portions of the Protocol Gas Verification Program and Minimum Competency Requirements for... final rule that amends the Agency's Protocol Gas Verification Program (PGVP) and the minimum competency...

  17. 30 CFR 227.601 - What are a State's responsibilities if it performs automated verification?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... performs automated verification? 227.601 Section 227.601 Mineral Resources MINERALS MANAGEMENT SERVICE... Perform Delegated Functions § 227.601 What are a State's responsibilities if it performs automated verification? To perform automated verification of production reports or royalty reports, you must: (a) Verify...

  18. 10 CFR 9.54 - Verification of identity of individuals making requests.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...

  19. 10 CFR 9.54 - Verification of identity of individuals making requests.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...

  20. 10 CFR 9.54 - Verification of identity of individuals making requests.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...

  1. Verification test report on a solar heating and hot water system

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Information is provided on the development, qualification, and acceptance verification of commercial solar heating and hot water systems and components. The verification covers the performance, the efficiencies, and the various methods used, such as similarity, analysis, inspection, and test, that are applicable to satisfying the verification requirements.

  2. 46 CFR 61.40-3 - Design verification testing.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Design verification testing. 61.40-3 Section 61.40-3... INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design verification testing. (a) Tests must verify that automated vital systems are designed, constructed, and operate in...

  3. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  4. Parental Health and Children's Economic Well-Being

    ERIC Educational Resources Information Center

    Wagmiller, Robert L., Jr.; Lennon, Mary Clare; Kuang, Li

    2008-01-01

    The life course perspective emphasizes that past economic experiences and stage in the life course influence a family's ability to cope with negative life events such as poor health. However, traditional analytic approaches are not well-suited to examine how the impact of negative life events differs based on a family's past economic experiences,…

  5. An experiment to evaluate liquid hydrogen storage in space

    NASA Technical Reports Server (NTRS)

    Eberhardt, R. N.; Fester, D. A.; Johns, W. A.; Marino, J. S.

    1981-01-01

    The design and verification of a Cryogenic Fluid Management Experiment for orbital operation on the Shuttle is described. The experiment will furnish engineering data to establish design criteria for storage and supply of cryogenic fluids, mainly hydrogen, for use in low gravity environments. The apparatus comprises an LAD (liquid acquisition device) and a TVS (thermodynamic vent system). The hydrogen will be either vented or forced out by injected helium and the flow rates will be monitored. The data will be compared with ground-based simulations to determine optimal flow rates for the pressurizing gas and the release of the cryogenic fluid. It is noted that tests on a one-g, one-third size LAD system are under way.

  6. Analyzing the security of an existing computer system

    NASA Technical Reports Server (NTRS)

    Bishop, M.

    1986-01-01

    Most work concerning secure computer systems has dealt with the design, verification, and implementation of provably secure computer systems, or has explored ways of making existing computer systems more secure. The problem of locating security holes in existing systems has received considerably less attention; methods generally rely on thought experiments as a critical step in the procedure. The difficulty is that such experiments require that a large amount of information be available in a format that makes correlating the details of various programs straightforward. This paper describes a method of providing such a basis for the thought experiment by writing a special manual for parts of the operating system, system programs, and library subroutines.

  7. A tool for hearing aid and cochlear implant users to judge the usability of cellular telephones in field conditions

    NASA Astrophysics Data System (ADS)

    Deer, Maria Soledad

    The auditory experience of using a hearing aid or a cochlear implant simultaneously with a cell phone is driven by a number of factors. These factors are: radiofrequency and baseband interference, speech intelligibility, sound quality, handset design, volume control and signal strength. The purpose of this study was to develop a tool to be used by hearing aid and cochlear implant users in retail stores as they try cell phones before buying them. This tool is meant to be an efficient, practical and systematic consumer selection tool that will capture and document information on all the domains that play a role in the auditory experience of using a cell phone with a hearing aid or cochlear implant. The development of this consumer tool involved three steps as follows: preparation, verification and measurement of success according to a predefined criterion. First, the consumer tool, consisting of a comparison chart and speech material, was prepared. Second, the consumer tool was evaluated by groups of subjects in a two-step verification process. Phase I was conducted in a controlled setting and it was followed by Phase II which took place in real world (field) conditions. In order to perform a systematic evaluation of the consumer tool two questionnaires were developed: one questionnaire for each phase. Both questionnaires involved five quantitative variables scored with the use of ratings scales. These ratings were averaged yielding an Overall Consumer Performance Score. A qualitative performance category corresponding to the Mean Opinion Score (MOS) was allocated to each final score within a scale ranging from 1 to 5 (where 5 = excellent and 1 = bad). Finally, the consumer tool development was determined to be successful if at least 80% of the participants in verification Phase II rated the comparison chart as excellent or good according to the qualitative MOS score. The results for verification Phase II (field conditions) indicated that the Overall Consumer Performance score for 92% of the subjects (11/12) was 3.7 and above corresponding to Good and Excellent MOS qualitative categories. It was concluded that this is a practical and efficient tool for hearing aid/cochlear implant users as they approach a cell phone selection process.

  8. Measuring the Economic Value of Pre-MBA Work Experience

    ERIC Educational Resources Information Center

    Yeaple, Ronald N.; Johnston, Mark W.; Whittingham, Keith L.

    2010-01-01

    Pre-MBA work experience is required for admission to many graduate schools of business. In the present study, MBA graduates with a wide range of pre-MBA work experience were surveyed to assess the economic value of such work experience. No evidence was found of a systematic financial advantage to students from working for several years before…

  9. Verification and Validation Studies for the LAVA CFD Solver

    NASA Technical Reports Server (NTRS)

    Moini-Yekta, Shayan; Barad, Michael F; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.

    2013-01-01

    The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of 2D Euler equations, 3D Navier-Stokes equations as well as turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.
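
    For readers unfamiliar with the Method of Manufactured Solutions mentioned above, the short sketch below applies MMS to a 1D Poisson model problem rather than the LAVA solver: an exact solution is chosen, the matching forcing term is derived, and the observed order of accuracy is computed from successive grid refinements. Everything in the example is an illustrative assumption.

      # A minimal Method of Manufactured Solutions (MMS) sketch on a 1D Poisson problem,
      # not the LAVA solver: pick u(x) = sin(pi x), derive the forcing f = -u''(x) analytically,
      # solve the discrete problem on successively refined grids, and report the observed order.
      import numpy as np

      def solve_poisson(n):
          """Solve -u'' = f on (0,1), u(0)=u(1)=0, with 2nd-order central differences."""
          x = np.linspace(0.0, 1.0, n + 1)
          h = 1.0 / n
          f = np.pi**2 * np.sin(np.pi * x[1:-1])          # manufactured forcing for u = sin(pi x)
          A = (np.diag(2.0 * np.ones(n - 1))
               - np.diag(np.ones(n - 2), 1)
               - np.diag(np.ones(n - 2), -1)) / h**2
          u = np.zeros(n + 1)
          u[1:-1] = np.linalg.solve(A, f)
          return x, u

      errors = []
      for n in (16, 32, 64):
          x, u = solve_poisson(n)
          errors.append(np.max(np.abs(u - np.sin(np.pi * x))))  # discretization error vs exact solution

      # Observed order of accuracy between successive refinements (should approach 2.0).
      for coarse, fine in zip(errors, errors[1:]):
          print(round(np.log2(coarse / fine), 2))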

  10. 37 CFR 201.29 - Access to, and confidentiality of, Statements of Account, Verification Auditor's Reports, and...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... confidentiality of, Statements of Account, Verification Auditor's Reports, and other verification information... GENERAL PROVISIONS § 201.29 Access to, and confidentiality of, Statements of Account, Verification Auditor... Account, including the Primary Auditor's Reports, filed under 17 U.S.C. 1003(c) and access to a Verifying...

  11. Formal Multilevel Hierarchical Verification of Synchronous MOS VLSI Circuits.

    DTIC Science & Technology

    1987-06-01

    ...depend on whether it is performing flat verification or hierarchical verification. The primary operations of Silica Pithecus when performing flat...signals never arise. The primary operation of Silica Pithecus when performing hierarchical verification is processing constraints to show they hold.

  12. 49 CFR 802.7 - Requests: How, where, and when presented; verification of identity of individuals making requests...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...; verification of identity of individuals making requests; accompanying persons; and procedures for... Procedures and Requirements § 802.7 Requests: How, where, and when presented; verification of identity of... which the record is contained. (d) Verification of identity of requester. (1) For written requests, the...

  13. 49 CFR 802.7 - Requests: How, where, and when presented; verification of identity of individuals making requests...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...; verification of identity of individuals making requests; accompanying persons; and procedures for... Procedures and Requirements § 802.7 Requests: How, where, and when presented; verification of identity of... which the record is contained. (d) Verification of identity of requester. (1) For written requests, the...

  14. 76 FR 54810 - Submission for Review: 3206-0215, Verification of Full-Time School Attendance, RI 25-49

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-02

    ... OFFICE OF PERSONNEL MANAGEMENT Submission for Review: 3206-0215, Verification of Full-Time School...) 3206-0215, Verification of Full-Time School Attendance. As required by the Paperwork Reduction Act of... or faxed to (202) 395-6974. SUPPLEMENTARY INFORMATION: RI 25-49, Verification of Full-Time School...

  15. 25 CFR 61.8 - Verification forms.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... using the last address of record. The verification form will be used to ascertain the previous enrollee... death. Name and/or address changes will only be made if the verification form is signed by an adult... 25 Indians 1 2010-04-01 2010-04-01 false Verification forms. 61.8 Section 61.8 Indians BUREAU OF...

  16. Development and verification of an agent-based model of opinion leadership.

    PubMed

    Anderson, Christine A; Titler, Marita G

    2014-09-27

    The use of opinion leaders is a strategy used to speed the process of translating research into practice. Much is still unknown about opinion leader attributes and activities and the context in which they are most effective. Agent-based modeling is a methodological tool that enables demonstration of the interactive and dynamic effects of individuals and their behaviors on other individuals in the environment. The purpose of this study was to develop and test an agent-based model of opinion leadership. The details of the design and verification of the model are presented. The agent-based model was developed by using a software development platform to translate an underlying conceptual model of opinion leadership into a computer model. Individual agent attributes (for example, motives and credibility) and behaviors (seeking or providing an opinion) were specified as variables in the model in the context of a fictitious patient care unit. The verification process was designed to test whether the agent-based model was capable of reproducing the conditions of the preliminary conceptual model. The verification methods included iterative programmatic testing ('debugging') and exploratory analysis of simulated data obtained from execution of the model. The simulation tests included a parameter sweep, in which the model input variables were adjusted systematically, followed by an individual time-series experiment. Statistical analysis of model output for the 288 possible simulation scenarios in the parameter sweep revealed that the agent-based model performed consistently with the relationships posited in the underlying model. Nurse opinion leaders act on the strength of their beliefs and, as a result, become an opinion resource for their uncertain colleagues, depending on their perceived credibility. Over time, some nurses consistently act as this type of resource and have the potential to emerge as opinion leaders in a context where uncertainty exists. The development and testing of agent-based models is an iterative process. The opinion leader model presented here provides a basic structure for continued model development, ongoing verification, and the establishment of validation procedures, including empirical data collection.
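
    The parameter-sweep style of verification described above can be illustrated with a toy agent-based model; the sketch below is hypothetical, with invented agent attributes and values, and is not the published model's specification.

      # Illustrative parameter-sweep verification of a toy agent-based model (not the published
      # opinion-leadership model): every combination of input settings is executed and a summary
      # output is recorded, so the modeller can check that outputs move in the direction the
      # underlying conceptual model predicts. All attribute names and values are assumptions.
      import itertools
      import random

      def run_model(n_agents, credibility, uncertainty, steps=50, seed=0):
          """Return the fraction of opinion requests answered by the single 'opinion leader'."""
          rng = random.Random(seed)
          answered_by_leader = total = 0
          for _ in range(steps):
              for _agent in range(n_agents):
                  if rng.random() < uncertainty:        # uncertain agents seek an opinion
                      total += 1
                      if rng.random() < credibility:    # credible leaders are consulted
                          answered_by_leader += 1
          return answered_by_leader / max(total, 1)

      sweep = itertools.product((20, 50), (0.3, 0.6, 0.9), (0.2, 0.5))   # 2 x 3 x 2 = 12 scenarios
      for n_agents, credibility, uncertainty in sweep:
          share = run_model(n_agents, credibility, uncertainty)
          print(f"agents={n_agents:2d} credibility={credibility:.1f} "
                f"uncertainty={uncertainty:.1f} -> leader share={share:.2f}")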

  17. Current status of verification practices in clinical biochemistry in Spain.

    PubMed

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of these verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
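
    As an illustration of the kind of autoverification rule surveyed above (verification limits combined with a delta check against the patient's previous result), the hypothetical sketch below auto-releases a result only when both checks pass; the analytes, units, and limits are invented for the example and are not those reported in the survey.

      # Illustrative autoverification rule (not the Spanish survey's criteria): a result is
      # auto-released only if it falls inside the verification limits for that analyte and the
      # change from the patient's previous value passes a simple delta check. All limits below
      # are invented for the example.
      VERIFICATION_LIMITS = {"glucose": (2.5, 25.0), "potassium": (2.5, 6.5)}   # mmol/L, assumed
      DELTA_LIMITS = {"glucose": 10.0, "potassium": 1.5}                        # max allowed change, assumed

      def autoverify(analyte, result, previous=None):
          low, high = VERIFICATION_LIMITS[analyte]
          if not (low <= result <= high):
              return False        # outside verification limits -> manual review
          if previous is not None and abs(result - previous) > DELTA_LIMITS[analyte]:
              return False        # delta check failed -> manual review
          return True             # release automatically

      print(autoverify("potassium", 4.1, previous=4.3))   # True: in range, small delta
      print(autoverify("potassium", 7.2))                 # False: above the upper verification limit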

  18. A rotorcraft flight database for validation of vision-based ranging algorithms

    NASA Technical Reports Server (NTRS)

    Smith, Phillip N.

    1992-01-01

    A helicopter flight test experiment was conducted at the NASA Ames Research Center to obtain a database consisting of video imagery and accurate measurements of camera motion, camera calibration parameters, and true range information. The database was developed to allow verification of monocular passive range estimation algorithms for use in the autonomous navigation of rotorcraft during low altitude flight. The helicopter flight experiment is briefly described. Four data sets representative of the different helicopter maneuvers and the visual scenery encountered during the flight test are presented. These data sets will be made available to researchers in the computer vision community.

  19. COBE ground segment gyro calibration

    NASA Technical Reports Server (NTRS)

    Freedman, I.; Kumar, V. K.; Rae, A.; Venkataraman, R.; Patt, F. S.; Wright, E. L.

    1991-01-01

    Discussed here is the calibration of the scale factors and rate biases for the Cosmic Background Explorer (COBE) spacecraft gyroscopes, with emphasis on the adaptation for COBE of an algorithm previously developed for the Solar Maximum Mission. The detailed choice of parameters, convergence, verification, and use of the algorithm in an environment where the reference attitudes are determined from Sun, Earth, and star observations (via the Diffuse Infrared Background Experiment, DIRBE) are considered. Results of some recent experiments are given. These include tests where the gyro rate data are corrected for the effect of the gyro baseplate temperature on the spacecraft electronics.
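
    A minimal sketch of the underlying estimation problem (not the COBE or Solar Maximum Mission algorithm) is shown below: a gyro scale factor and rate bias are fitted by linear least squares against reference rates, using synthetic data invented for the example.

      # A minimal sketch of gyro scale-factor and rate-bias estimation (not the COBE/SMM
      # algorithm): model the measured rate as measured = scale * true_rate + bias and fit
      # the two parameters by linear least squares against reference rates derived from
      # attitude observations. Data values here are synthetic.
      import numpy as np

      true_rate = np.linspace(-0.02, 0.02, 25)                     # reference rates, rad/s (assumed)
      rng = np.random.default_rng(1)
      measured = 1.003 * true_rate + 5e-4 + rng.normal(0, 1e-5, true_rate.size)  # synthetic gyro output

      A = np.column_stack([true_rate, np.ones_like(true_rate)])    # design matrix [rate, 1]
      (scale, bias), *_ = np.linalg.lstsq(A, measured, rcond=None)
      print(f"estimated scale factor = {scale:.4f}, rate bias = {bias:.2e} rad/s")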

  20. Using formal specification in the Guidance and Control Software (GCS) experiment. Formal design and verification technology for life critical systems

    NASA Technical Reports Server (NTRS)

    Weber, Doug; Jamsek, Damir

    1994-01-01

    The goal of this task was to investigate how formal methods could be incorporated into a software engineering process for flight-control systems under DO-178B and to demonstrate that process by developing a formal specification for NASA's Guidance and Controls Software (GCS) Experiment. GCS is software to control the descent of a spacecraft onto a planet's surface. The GCS example is simplified from a real example spacecraft, but exhibits the characteristics of realistic spacecraft control software. The formal specification is written in Larch.

  1. The Economics of Publishing and the Publishing of Economics.

    ERIC Educational Resources Information Center

    La Manna, Manfredi

    2003-01-01

    Explores the relationship between economics and scientific journal publishing. Topics include journal pricing in economics; market power exerted by the dominant commercial publisher in economics journal publishing; academic experiments to improve scholarly communication in economics; policies of the United Kingdom Competition Commission; and…

  2. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luke, S J

    2011-12-20

    This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms control verification approaches for nuclear weapons and components. Information barriers, when used with a measurement system, allow for the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has led the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of the biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether, in a biological weapons verification regime, the presence or absence of a weapon pathogen can be determined without revealing any information about possible sensitive or proprietary information contained in the genetic materials being declared under a verification regime. This study indicates that a verification regime could be constructed using a small number of pathogens that spans the range of known biological weapons agents. Since the number of possible pathogens is small, it is possible and prudent to treat these pathogens as analogies to attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime. To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of the genetic information associated with the various pathogens. In addition, it has been determined that a suitable information barrier could be applied to this technology when the verification regime has been defined. Finally, the report posits a path forward for additional development of information barriers in a biological weapons verification regime. This path forward has shown that a new analysis approach, coined Information Loss Analysis, might need to be pursued so that a numerical understanding of how information can be lost in specific measurement systems can be achieved.
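
    The core idea of an information barrier, determining an attribute without releasing the underlying data, can be caricatured in a few lines; the sketch below is purely conceptual, uses invented names, and does not represent any fielded system or the report's proposed design.

      # Conceptual sketch of an information barrier's pass/fail output (not any fielded system):
      # the sensitive measurement never leaves the function; only a binary attribute result does.
      def attribute_present(raw_measurement, reference_set):
          """Return only whether the measurement matches a declared attribute; raw data stays inside."""
          return raw_measurement in reference_set

      # Hypothetical usage: the inspector sees True/False, never the underlying signature.
      print(attribute_present("signature-A", {"signature-A", "signature-B"}))   # True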

  3. Verification of the modified model of the drying process of a polymer liquid film on a flat substrate by experiment (2): through more accurate experiment

    NASA Astrophysics Data System (ADS)

    Kagami, Hiroyuki

    2006-05-01

    We have proposed and refined a model of the drying process of a polymer solution coated on a flat substrate for flat polymer film fabrication, and have presented the results at Photomask Japan 2002, 2003, 2004 and elsewhere. Numerical simulation of the model qualitatively reproduces a typical thickness profile of the polymer film formed after drying, namely a profile in which the edge of the film is thicker and the region just inside the edge's bump is thinner. We then clarified the dependence of the distribution of polymer molecules on a flat substrate on various parameters, based on analysis of many numerical simulations. We subsequently performed several kinds of experiments to verify the modified model and reported the initial results at Photomask Japan 2005. Those initial results partly supported the modified model, but we could not observe the characteristic valley next to the edge's bump of the polymer film after drying, because the shape of the solution film coated on the substrate in that experiment differed from the one found in resist coating and drying processes and assumed in the modified model. In this study, we removed this difference between experiment and model and repeated the verification experiments, with the shape of the coated solution film matching the one assumed in the modified model and using molar concentration. As a result, some predictions were verified more strongly and some need to be examined again. That is, we could confirm, as in the previous experiment, that the smaller the average molecular weight of the Metoloses was, the larger the gradient of the thickness profile of the polymer thin film was. However, we could not observe a depression just inside the edge of the thin film in this improved experiment either. The fact that an aqueous solution rather than an organic solution was used in the experiment may account for the non-formation of the depression.

  4. Remote collection and analysis of witness reports on flash floods

    NASA Astrophysics Data System (ADS)

    Gourley, J. J.; Erlingis, J. M.; Smith, T. M.; Ortega, K. L.; Hong, Y.

    2010-11-01

    Typically, flash floods are studied ex post facto in response to a major impact event. A complement to field investigations is developing a detailed database of flash flood events, including minor events and null reports (i.e., where heavy rain occurred but there was no flash flooding), based on public survey questions conducted in near-real time. The Severe hazards analysis and verification experiment (SHAVE) has been in operation at the National Severe Storms Laboratory (NSSL) in Norman, OK, USA during the summers since 2006. The experiment employs undergraduate students to analyse real-time products from weather radars, target specific regions within the conterminous US, and poll public residences and businesses regarding the occurrence and severity of hail, wind, tornadoes, and now flash floods. In addition to providing a rich learning experience for students, SHAVE has also been successful in creating high-resolution datasets of severe hazards used for algorithm and model verification. This paper describes the criteria used to initiate the flash flood survey, the specific questions asked and information entered to the database, and then provides an analysis of results for flash flood data collected during the summer of 2008. It is envisioned that specific details provided by the SHAVE flash flood observation database will complement databases collected by operational agencies (i.e., US National Weather Service Storm Data reports) and thus lead to better tools to predict the likelihood of flash floods and ultimately reduce their impacts on society.

  5. Environmental Technology Verification: Supplement to Test/QA Plan for Biological and Aerosol Testing of General Ventilation Air Cleaners; Bioaerosol Inactivation Efficiency by HVAC In-Duct Ultraviolet Light Air Cleaners

    EPA Science Inventory

    The Air Pollution Control Technology Verification Center has selected general ventilation air cleaners as a technology area. The Generic Verification Protocol for Biological and Aerosol Testing of General Ventilation Air Cleaners is on the Environmental Technology Verification we...

  6. 49 CFR 40.135 - What does the MRO tell the employee at the beginning of the verification interview?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... beginning of the verification interview? 40.135 Section 40.135 Transportation Office of the Secretary of... verification interview? (a) As the MRO, you must tell the employee that the laboratory has determined that the... finding of adulteration or substitution. (b) You must explain the verification interview process to the...

  7. 40 CFR 1065.550 - Gas analyzer range verification and drift verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... with a CLD and the removed water is corrected based on measured CO2, CO, THC, and NOX concentrations... concentration subcomponents (e.g., THC and CH4 for NMHC) separately. For example, for NMHC measurements, perform drift verification on NMHC; do not verify THC and CH4 separately. (2) Drift verification requires two...

  8. 76 FR 44051 - Submission for Review: Verification of Who Is Getting Payments, RI 38-107 and RI 38-147

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-22

    .... SUPPLEMENTARY INFORMATION: RI 38-107, Verification of Who is Getting Payments, is designed for use by the... OFFICE OF PERSONNEL MANAGEMENT Submission for Review: Verification of Who Is Getting Payments, RI... currently approved information collection request (ICR) 3206-0197, Verification of Who is Getting Payments...

  9. Automated verification of flight software. User's manual

    NASA Technical Reports Server (NTRS)

    Saib, S. H.

    1982-01-01

    AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and the reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.

  10. Space transportation system payload interface verification

    NASA Technical Reports Server (NTRS)

    Everline, R. T.

    1977-01-01

    The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).

  11. Accessing defect dynamics using intense, nanosecond pulsed ion beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Persaud, A.; Barnard, J. J.; Guo, H.

    2015-06-18

    Gaining in-situ access to relaxation dynamics of radiation induced defects will lead to a better understanding of materials and is important for the verification of theoretical models and simulations. We show preliminary results from experiments at the new Neutralized Drift Compression Experiment (NDCX-II) at Lawrence Berkeley National Laboratory that will enable in-situ access to defect dynamics through pump-probe experiments. Here, the unique capabilities of the NDCX-II accelerator to generate intense, nanosecond pulsed ion beams are utilized. Preliminary data from channeling experiments using lithium and potassium ions and silicon membranes are shown. We compare these data to simulation results using Crystal TRIM. Furthermore, we discuss the improvements needed to bring the accelerator to higher performance levels and the new diagnostics tools that are being incorporated.

  12. Space Shuttle Program (SSP) Shock Test and Specification Experience for Reusable Flight Hardware Equipment

    NASA Technical Reports Server (NTRS)

    Larsen, Curtis E.

    2012-01-01

    As commercial companies are nearing a preliminary design review level of design maturity, several companies are identifying the process for qualifying their multi-use electrical and mechanical components for various shock environments, including pyrotechnic, mortar firing, and water impact. The experience in quantifying the environments consists primarily of recommendations from Military Standard-1540, Product Verification Requirement for Launch, Upper Stage, and Space Vehicles. Therefore, the NASA Engineering and Safety Center (NESC) formed a team of NASA shock experts to share the NASA experience with qualifying hardware for the Space Shuttle Program (SSP) and other applicable programs and projects. Several team teleconferences were held to discuss past experience and to share ideas of possible methods for qualifying components for multiple missions. This document contains the information compiled from the discussions.

  13. Experiences of giving and receiving care in traumatic brain injury: An integrative review.

    PubMed

    Kivunja, Stephen; River, Jo; Gullick, Janice

    2018-04-01

    To synthesise the literature on the experiences of giving or receiving care for traumatic brain injury for people with traumatic brain injury, their family members and nurses in hospital and rehabilitation settings. Traumatic brain injury represents a major source of physical, social and economic burden. In the hospital setting, people with traumatic brain injury feel excluded from decision-making processes and perceive impatient care. Families describe inadequate information and support for psychological distress. Nurses find the care of people with traumatic brain injury challenging particularly when experiencing heavy workloads. To date, a contemporary synthesis of the literature on people with traumatic brain injury, family and nurse experiences of traumatic brain injury care has not been conducted. Integrative literature review. A systematic search strategy guided by the PRISMA statement was conducted in CINAHL, PubMed, Proquest, EMBASE and Google Scholar. Whittemore and Knafl's (Journal of Advanced Nursing, 52, 2005, 546) integrative review framework guided data reduction, data display, data comparison and conclusion verification. Across the three participant categories (people with traumatic brain injury/family members/nurses) and sixteen subcategories, six cross-cutting themes emerged: seeking personhood, navigating challenging behaviour, valuing skills and competence, struggling with changed family responsibilities, maintaining productive partnerships and reflecting on workplace culture. Traumatic brain injury creates changes in physical, cognitive and emotional function that challenge known ways of being in the world for people. This alters relationship dynamics within families and requires a specific skill set among nurses. Recommendations include the following: (i) formal inclusion of people with traumatic brain injury and families in care planning, (ii) routine risk screening for falls and challenging behaviour to ensure that controls are based on accurate assessment, (iii) formal orientation and training for novice nurses in the management of challenging behaviour, (iv) professional case management to guide access to services and funding and (v) personal skill development to optimise family functioning. © 2018 John Wiley & Sons Ltd.

  14. Verification of the Microgravity Active Vibration Isolation System based on Parabolic Flight

    NASA Astrophysics Data System (ADS)

    Zhang, Yong-kang; Dong, Wen-bo; Liu, Wei; Li, Zong-feng; Lv, Shi-meng; Sang, Xiao-ru; Yang, Yang

    2017-12-01

    The Microgravity Active Vibration Isolation System (MAIS) is a device that reduces on-orbit vibration and provides a lower gravity level for certain scientific experiments. The MAIS consists of a stator and a floater; the stator is fixed on the spacecraft, and the floater is suspended by electromagnetic force so as to isolate it from the vibration of the stator. The system has 3 position sensors, 3 accelerometers, 8 Lorentz actuators, signal processing circuits, and a central controller running the operating software and control algorithms. For the experiments on parabolic flights, a laptop is added to MAIS for monitoring and operation, and a power module is added for electric power conversion. The principle of MAIS is as follows: the system samples the vibration acceleration of the floater from the accelerometers, measures the displacement between stator and floater from position sensitive detectors, and computes the Lorentz force current for each actuator so as to eliminate the vibration of the scientific payload while avoiding contact between the stator and the floater. This is a motion control technique in 6 degrees of freedom (6-DOF), and its function can only be verified in a microgravity environment. Thanks to DLR and Novespace, we had the opportunity to join the 27th DLR parabolic flight campaign and perform experiments to verify the 6-DOF control technique. The experimental results validate that the 6-DOF motion control technique is effective, and the vibration isolation performance matches what we expected based on theoretical analysis and simulation. MAIS is planned for use on Chinese manned spacecraft for many microgravity scientific experiments, and verification on parabolic flights is very important for its subsequent missions. Additionally, we tested some additional functions of the microgravity electromagnetic suspension, such as automatic capture and locking, and operation in fault mode. The parabolic flights produced much useful data for these experiments.
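
    The control principle described above can be caricatured for a single axis as acceleration feedback plus a weak centring term; the sketch below is illustrative only, with invented gains and limits, and is not the MAIS flight algorithm.

      # A single-axis toy of the control idea described above (not the MAIS flight algorithm):
      # acceleration feedback attenuates vibration transmitted to the floater while a weak
      # position term keeps it centred in its gap so stator and floater do not touch. Gains,
      # limits, and inputs are invented for illustration.
      def actuator_force(accel, displacement, k_accel=8.0, k_pos=50.0, f_max=2.0):
          """Return the commanded Lorentz force (N) for one axis, saturated at +/- f_max."""
          f = -k_accel * accel - k_pos * displacement   # damp acceleration, re-centre the floater
          return max(-f_max, min(f_max, f))

      # Example: floater accelerating toward the stator and already off-centre.
      print(actuator_force(accel=0.05, displacement=0.002))   # modest restoring force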

  15. The plan for the economic evaluation of the Public Service Communication Satellite system

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A total plan for the economic evaluation of the PSCS public service communication satellite program within domestic markets is presented. It extends from the present through the planning, performance and evaluation of economic experiments following the launch of the PSCS, and includes the consideration of how the results of these experiments impact the transfer from demonstration to operations. The implementation of this plan will provide NASA with information needed to understand and manage the economic and social impacts of the PSCS program.

  16. The plan for the economic evaluation of the public service communication satellite system

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A plan for the economic evaluation of the Public Service Communications Satellite (PSCS) within domestic markets is presented. It extends through the planning, performance and evaluation of economic experiments following the launch of the PSCS in 1982, and includes the consideration of how the results of these experiments impact the transfer from demonstration to operations. The implementation of this plan will provide information needed to understand and manage the economic and social impacts of the PSCS program.

  17. Loads and low frequency dynamics data base: Version 1.1 November 8, 1985. [Space Shuttles

    NASA Technical Reports Server (NTRS)

    Garba, J. A. (Editor)

    1985-01-01

    Structural design data for the Shuttle are presented in the form of a data base. The data can be used by designers of Shuttle experiments to assure compliance with Shuttle safety and structural verification requirements. A glossary of Shuttle design terminology is given, and the principal safety requirements of Shuttle are summarized. The Shuttle design data are given in the form of load factors.

  18. Teichmuller Space Resolution of the EPR Paradox

    NASA Astrophysics Data System (ADS)

    Winterberg, Friedwardt

    2013-04-01

    The mystery of Newton's action-at-a-distance law of gravity was resolved by Einstein with Riemann's non-Euclidean geometry, which permitted the explanation of the departure from Newton's law in the motion of Mercury. It is proposed here that the similarly mysterious non-local EPR-type quantum correlations may be explained by a Teichmuller space geometry below the Planck length, and an experiment for its verification is proposed.

  19. Interfering with the neutron spin

    NASA Astrophysics Data System (ADS)

    Wagh, Apoorva G.; Rakhecha, Veer Chand

    2004-07-01

    Charge neutrality, a spin 1/2 and an associated magnetic moment of the neutron make it an ideal probe of quantal spinor evolutions. Polarized neutron interferometry in magnetic field Hamiltonians has thus scored several firsts such as direct verification of Pauli anticommutation, experimental separation of geometric and dynamical phases and observation of non-cyclic amplitudes and phases. This paper provides a flavour of the physics learnt from such experiments.
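
    The spinor property underlying such experiments can be checked numerically in a few lines; the sketch below is an illustration, not the cited interferometry analysis, and shows that a 2π rotation reverses the sign of a spin-1/2 state while a 4π rotation restores it.

      # A small numerical check of the spinor property behind such experiments (an illustration,
      # not the cited interferometry analysis): rotating a spin-1/2 state by 2*pi with
      # U(theta) = exp(-i*theta*sigma_z/2) flips its sign, and only a 4*pi rotation restores it.
      import numpy as np

      sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)

      def rotation(theta):
          """SU(2) rotation about z by angle theta acting on a spinor."""
          return np.cos(theta / 2) * np.eye(2) - 1j * np.sin(theta / 2) * sigma_z

      spinor = np.array([1.0, 0.0], dtype=complex)
      print(np.allclose(rotation(2 * np.pi) @ spinor, -spinor))   # True: sign reversal after 2*pi
      print(np.allclose(rotation(4 * np.pi) @ spinor, spinor))    # True: identity after 4*pi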

  20. Formal Validation of Aerospace Software

    NASA Astrophysics Data System (ADS)

    Lesens, David; Moy, Yannick; Kanig, Johannes

    2013-08-01

    Any single error in critical software can have catastrophic consequences. Even though failures are usually not advertised, some software bugs have become famous, such as the error in the MIM-104 Patriot. For space systems, experience shows that software errors are a serious concern: more than half of all satellite failures from 2000 to 2003 involved software. To address this concern, this paper addresses the use of formal verification of software developed in Ada.
