Sample records for rewrite-based verification

  1. Specification and Verification of Web Applications in Rewriting Logic

    NASA Astrophysics Data System (ADS)

    Alpuente, María; Ballis, Demis; Romero, Daniel

    This paper presents a Rewriting Logic framework that formalizes the interactions between Web servers and Web browsers through a communicating protocol abstracting HTTP. The proposed framework includes a scripting language that is powerful enough to model the dynamics of complex Web applications by encompassing the main features of the most popular Web scripting languages (e.g. PHP, ASP, Java Servlets). We also provide a detailed characterization of browser actions (e.g. forward/backward navigation, page refresh, and new window/tab openings) via rewrite rules, and show how our models can be naturally model-checked by using the Linear Temporal Logic of Rewriting (LTLR), which is a Linear Temporal Logic specifically designed for model-checking rewrite theories. Our formalization is particularly suitable for verification purposes, since it allows one to perform in-depth analyses of many subtle aspects related to Web interaction. Finally, the framework has been completely implemented in Maude, and we report on some successful experiments that we conducted by using the Maude LTLR model-checker.
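
    The browser actions described above can be pictured as rewrite rules over a (back-stack, current-page, forward-stack) configuration. The following Python sketch is our toy rendering of that idea only, not the authors' Maude specification; all function names are illustrative.

```python
# Toy rendering of browser-action rewrite rules over a configuration
# (back_stack, current_page, forward_stack). Illustrative only; not the
# authors' Maude specification.

def navigate(state, url):
    """Loading a new page pushes the current one and clears the forward stack."""
    back, page, _fwd = state
    return (back + (page,), url, ())

def backward(state):
    """Back button: pop the back stack, push the current page forward."""
    back, page, fwd = state
    if not back:
        return state
    return (back[:-1], back[-1], (page,) + fwd)

def forward(state):
    """Forward button: inverse of backward."""
    back, page, fwd = state
    if not fwd:
        return state
    return (back + (page,), fwd[0], fwd[1:])

s = navigate(((), "home", ()), "news")   # (("home",), "news", ())
s = backward(s)                          # ((), "home", ("news",))
```

    Model-checking such a system amounts to exploring the states reachable under these rules and checking temporal properties along the way, which is what the LTLR model-checker automates.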

  2. Simulation and Verification of Synchronous Set Relations in Rewriting Logic

    NASA Technical Reports Server (NTRS)

    Rocha, Camilo; Munoz, Cesar A.

    2011-01-01

    This paper presents a mathematical foundation and a rewriting logic infrastructure for the execution and property verification of synchronous set relations. The mathematical foundation is given in the language of abstract set relations. The infrastructure consists of an order-sorted rewrite theory in Maude, a rewriting logic system, that enables the synchronous execution of a set relation provided by the user. By using the infrastructure, existing algorithm verification techniques already available in Maude for traditional asynchronous rewriting, such as reachability analysis and model checking, are automatically available to synchronous set rewriting. The use of the infrastructure is illustrated with an executable operational semantics of a simple synchronous language and the verification of temporal properties of a synchronous system.
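
    The synchronous-step idea can be sketched in a few lines: one step rewrites every element of the current set at once, in contrast to asynchronous rewriting, where a single redex is rewritten per step. This Python sketch is illustrative; the rule format and function names are ours, not Maude's.

```python
# Minimal sketch of a synchronous set-rewriting step: every element of
# the current set is rewritten simultaneously. Rule format is ours.

def sync_step(state, rules):
    """Apply to each element the first rule whose guard matches; elements
    with no matching rule are kept unchanged. All updates happen at once."""
    nxt = set()
    for x in state:
        for guard, action in rules:
            if guard(x):
                nxt.add(action(x))
                break
        else:
            nxt.add(x)
    return frozenset(nxt)

# Toy synchronous system: every odd number is incremented, evens halved.
rules = [
    (lambda n: n % 2 == 1, lambda n: n + 1),
    (lambda n: n % 2 == 0, lambda n: n // 2),
]

s = sync_step(frozenset({3, 8}), rules)   # both elements step together
```

    Because each synchronous step is a single transition on whole sets, off-the-shelf reachability analysis over `sync_step` behaves exactly as it would for ordinary one-step rewriting, which is the point the paper makes for Maude.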

  3. 77 FR 59581 - Personal Identity Verification, Release and Handling of Restricted Information, Protection of the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-28

    ... of a major NASA FAR Supplement (NFS) rewrite, any changes from the withdrawn rules that continue to... (NFS). Public comments were received on all three rules. However, circumstances at the time prevented... cancelled without further action. At this time, NASA is in process of a major NFS rewrite, and any changes...

  4. A Rewriting-Based Approach to Trace Analysis

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)

    2002-01-01

    We present a rewriting-based algorithm for efficiently evaluating future time Linear Temporal Logic (LTL) formulae on finite execution traces online. While the standard models of LTL are infinite traces, finite traces appear naturally when testing and/or monitoring real applications that only run for limited time periods. The presented algorithm is implemented in the Maude executable specification language and essentially consists of a set of equations establishing an executable semantics of LTL using a simple formula-transforming approach. The algorithm is further improved to build automata on-the-fly from formulae, using memoization. The result is a very efficient and small Maude program that can be used to monitor program executions. We furthermore present an alternative algorithm for synthesizing provably minimal observer finite state machines (or automata) from LTL formulae, which can be used to analyze execution traces without the need for a rewriting system, and can hence be used by observers written in conventional programming languages. The presented work is part of an ambitious runtime verification and monitoring project at NASA Ames, called PATHEXPLORER, and demonstrates that rewriting can be a tractable and attractive means for experimenting with and implementing program monitoring logics.
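
    The formula-transforming approach can be sketched as follows: each trace event rewrites the formula into the obligation that remains on the rest of the trace, and at the end of the trace the residual formula is evaluated. This is a simplified Python illustration of the idea, not the paper's Maude equations; all constructor and function names are ours.

```python
# Sketch of rewriting-based LTL monitoring on a finite trace. Formulas
# are nested tuples; an event is a set of atoms true at that step.

TRUE, FALSE = ("true",), ("false",)

def atom(p):
    return ("atom", p)

def conj(l, r):
    if FALSE in (l, r): return FALSE
    if l == TRUE: return r
    if r == TRUE: return l
    return ("and", l, r)

def disj(l, r):
    if TRUE in (l, r): return TRUE
    if l == FALSE: return r
    if r == FALSE: return l
    return ("or", l, r)

def progress(f, ev):
    """One rewrite step: transform f by consuming event ev."""
    op = f[0]
    if op in ("true", "false"): return f
    if op == "atom": return TRUE if f[1] in ev else FALSE
    if op == "and": return conj(progress(f[1], ev), progress(f[2], ev))
    if op == "or": return disj(progress(f[1], ev), progress(f[2], ev))
    if op == "next": return f[1]
    if op == "eventually":   # <>g  =  g \/ X <>g
        return disj(progress(f[1], ev), f)
    if op == "always":       # []g  =  g /\ X []g
        return conj(progress(f[1], ev), f)
    raise ValueError(op)

def accepts_empty(f):
    """Finite-trace verdict once the trace is exhausted."""
    op = f[0]
    if op == "true": return True
    if op == "and": return accepts_empty(f[1]) and accepts_empty(f[2])
    if op == "or": return accepts_empty(f[1]) or accepts_empty(f[2])
    return op == "always"    # [] holds vacuously; atoms, X, <> fail

def monitor(f, trace):
    for ev in trace:
        f = progress(f, ev)
    return accepts_empty(f)
```

    Memoizing `progress` on (formula, event) pairs is essentially the on-the-fly automaton construction the abstract mentions: each distinct residual formula becomes an automaton state.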

  5. META 2f: Probabilistic, Compositional, Multi-dimension Model-Based Verification (PROMISE)

    DTIC Science & Technology

    2011-10-01

    Equational Logic, Rewriting Logic, and Maude ................................................ 52  5.3  Results and Discussion...and its discrete transitions are left unchanged. However, the differential equations describing the continuous dynamics (in each mode) are replaced by...by replacing hard-to-analyze differential equations by discrete transitions. In principle, predicate and qualitative abstraction can be used on a

  6. An Overview of the Runtime Verification Tool Java PathExplorer

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)

    2002-01-01

    We present an overview of the Java PathExplorer runtime verification tool, in short referred to as JPAX. JPAX can monitor the execution of a Java program and check that it conforms to a set of user-provided properties formulated in temporal logic. JPAX can in addition analyze the program for concurrency errors such as deadlocks and data races. The concurrency analysis requires no user-provided specification. The tool facilitates automated instrumentation of a program's bytecode, which when executed will emit an event stream, the execution trace, to an observer. The observer dispatches the incoming event stream to a set of observer processes, each performing a specialized analysis, such as the temporal logic verification, the deadlock analysis, and the data race analysis. Temporal logic specifications can be formulated by the user in the Maude rewriting logic, where Maude is a high-speed rewriting system for equational logic, here extended with executable temporal logic. The Maude rewriting engine is then activated as an event-driven monitoring process. Alternatively, temporal specifications can be translated into efficient automata, which check the event stream. JPAX can be used during program testing to gain increased information about program executions, and can potentially furthermore be applied during operation to survey safety-critical systems.
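
    The data race analysis over an event stream can be illustrated with a lockset-style sketch in Python. This is our simplification of the classic Eraser-style lockset idea, not JPAX's actual implementation; the event format is an assumption.

```python
# Lockset-style race detection over an instrumentation event stream.
# Our simplification for illustration; not JPAX's implementation.

def lockset_analysis(events):
    """events: (thread, op, target) tuples, op in {'acquire','release','access'}.
    Returns the shared variables whose candidate lockset became empty."""
    held = {}        # thread -> set of locks currently held
    candidate = {}   # variable -> candidate protecting lockset
    races = set()
    for thread, op, target in events:
        locks = held.setdefault(thread, set())
        if op == "acquire":
            locks.add(target)
        elif op == "release":
            locks.discard(target)
        else:  # access to shared variable `target`
            cur = candidate.setdefault(target, set(locks))
            cur &= locks           # intersect with locks held right now
            if not cur:
                races.add(target)  # no lock consistently protects target
    return races
```

    A variable accessed under lock `L` by one thread and with no lock by another ends up with an empty candidate lockset and is flagged, regardless of whether the race manifested in this particular run.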

  7. Rewriting Logic Semantics of a Plan Execution Language

    NASA Technical Reports Server (NTRS)

    Dowek, Gilles; Munoz, Cesar A.; Rocha, Camilo

    2009-01-01

    The Plan Execution Interchange Language (PLEXIL) is a synchronous language developed by NASA to support autonomous spacecraft operations. In this paper, we propose a rewriting logic semantics of PLEXIL in Maude, a high-performance logical engine. The rewriting logic semantics is by itself a formal interpreter of the language and can be used as a semantic benchmark for the implementation of PLEXIL executives. The implementation in Maude has the additional benefit of making available to PLEXIL designers and developers all the formal analysis and verification tools provided by Maude. The formalization of the PLEXIL semantics in rewriting logic poses an interesting challenge due to the synchronous nature of the language and the prioritized rules defining its semantics. To overcome this difficulty, we propose a general procedure for simulating synchronous set relations in rewriting logic that is sound and, for deterministic relations, complete. We also report on the finding of two issues at the design level of the original PLEXIL semantics that were identified with the help of the executable specification in Maude.

  8. Endowing Hydrochromism to Fluorans via Bioinspired Alteration of Molecular Structures and Microenvironments and Expanding Their Potential for Rewritable Paper.

    PubMed

    Xi, Guan; Sheng, Lan; Zhang, Ivan; Du, Jiahui; Zhang, Ting; Chen, Qiaonan; Li, Guiying; Zhang, Ying; Song, Yue; Li, Jianhua; Zhang, Yu-Mo; Zhang, Sean Xiao-An

    2017-11-01

    Interest in and effort toward new materials for rewritable paper have increased dramatically because of their advantages for sustainable development and a better natural life cycle. Inspired by how nature works within living systems, herein we have used fluorans, as a concept verification, to endow originally acidochromic, basochromic, or photochromic molecules with broader properties, such as switchability by solvent, water, heat, electricity, stress, and other forces, via simple methods (i.e., variation of submolecular structures or microenvironments). The hydrochromic visual change and reversible behavior of selected molecules have been explored, and the primary mechanism at the atomic or subatomic level has been hypothesized. In addition, several newly demonstrated hydrochromic fluorans have been utilized for water-jet rewritable paper (WJRP), which exhibits great photostability, high hydrochromic contrast, and a fast response rate, and which can be reused at least 30 times without significant degradation. The water-jet prints have good resolution and various colors, and they remain legible after months or years. This improved performance is a major step toward practical applications of WJRP.

  9. Rewriting Modulo SMT and Open System Analysis

    NASA Technical Reports Server (NTRS)

    Rocha, Camilo; Meseguer, Jose; Munoz, Cesar

    2014-01-01

    This paper proposes rewriting modulo SMT, a new technique that combines the power of SMT solving, rewriting modulo theories, and model checking. Rewriting modulo SMT is ideally suited to model and analyze infinite-state open systems, i.e., systems that interact with a non-deterministic environment. Such systems exhibit both internal non-determinism, which is proper to the system, and external non-determinism, which is due to the environment. In a reflective formalism, such as rewriting logic, rewriting modulo SMT can be reduced to standard rewriting. Hence, rewriting modulo SMT naturally extends rewriting-based reachability analysis techniques, which are available for closed systems, to open systems. The proposed technique is illustrated with the formal analysis of: (i) a real-time system that is beyond the scope of timed-automata methods and (ii) automatic detection of reachability violations in a synchronous language developed to support autonomous spacecraft operations.

  10. Hydrochromic molecular switches for water-jet rewritable paper

    NASA Astrophysics Data System (ADS)

    Sheng, Lan; Li, Minjie; Zhu, Shaoyin; Li, Hao; Xi, Guan; Li, Yong-Gang; Wang, Yi; Li, Quanshun; Liang, Shaojun; Zhong, Ke; Zhang, Sean Xiao-An

    2014-01-01

    The days of rewritable paper are coming; printers of the future will use water-jet paper. Although several kinds of rewritable paper have been reported, their practical use is rare. Herein, a new rewritable paper for ink-free printing is proposed and demonstrated successfully by using water as the sole trigger to switch hydrochromic dyes on solid media. Water-jet prints with various colours are achieved with a commercial desktop printer based on these hydrochromic rewritable papers. The prints can be erased and rewritten dozens of times with no significant loss in colour quality. This rewritable paper is promising in that it can serve as an eco-friendly information display to meet the increasing global need for environmental protection.

  11. Hydrochromic molecular switches for water-jet rewritable paper.

    PubMed

    Sheng, Lan; Li, Minjie; Zhu, Shaoyin; Li, Hao; Xi, Guan; Li, Yong-Gang; Wang, Yi; Li, Quanshun; Liang, Shaojun; Zhong, Ke; Zhang, Sean Xiao-An

    2014-01-01

    The days of rewritable paper are coming; printers of the future will use water-jet paper. Although several kinds of rewritable paper have been reported, their practical use is rare. Herein, a new rewritable paper for ink-free printing is proposed and demonstrated successfully by using water as the sole trigger to switch hydrochromic dyes on solid media. Water-jet prints with various colours are achieved with a commercial desktop printer based on these hydrochromic rewritable papers. The prints can be erased and rewritten dozens of times with no significant loss in colour quality. This rewritable paper is promising in that it can serve as an eco-friendly information display to meet the increasing global need for environmental protection.

  12. Let the Rewriter Beware.

    ERIC Educational Resources Information Center

    Charrow, Veda R.

    Translating legal and bureaucratic language into plain, comprehensible English is not amenable to simple rules and procedures. Rewriting comprehensibly requires specialized knowledge about language and an awareness of a number of misconceptions and pitfalls. This paper discusses what not to do in rewriting, based upon rewritten documents presently…

  13. Rewriting Modulo SMT

    NASA Technical Reports Server (NTRS)

    Rocha, Camilo; Meseguer, Jose; Munoz, Cesar A.

    2013-01-01

    Combining symbolic techniques such as (i) SMT solving, (ii) rewriting modulo theories, and (iii) model checking can enable the analysis of infinite-state systems outside the scope of each such technique. This paper proposes rewriting modulo SMT as a new technique combining the powers of (i)-(iii) and ideally suited to model and analyze infinite-state open systems; that is, systems that interact with a non-deterministic environment. Such systems exhibit both internal non-determinism, due to the system, and external non-determinism, due to the environment. They are not amenable to finite-state model checking because they are typically infinite-state. By being reducible to standard rewriting using reflective techniques, rewriting modulo SMT can both naturally model and analyze open systems without requiring any changes to rewriting-based reachability analysis techniques for closed systems. This is illustrated by the analysis of a real-time system beyond the scope of timed-automata methods.
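
    The symbolic-state idea can be sketched as follows: a state is a control pattern plus a constraint on its variables, and a rule fires only if the accumulated constraint remains satisfiable. In this toy Python sketch the "solver" is a trivial interval check on a single integer input, standing in for a real SMT solver; all names and the rule format are illustrative.

```python
# Toy sketch of symbolic states in rewriting modulo SMT: one symbolic
# step covers infinitely many concrete environment inputs. The interval
# check stands in for an SMT satisfiability query.

from dataclasses import dataclass

@dataclass(frozen=True)
class SymState:
    loc: str     # control location of the open system
    lo: int      # constraint lo <= x <= hi on the environment input x
    hi: int

def step(state, rules):
    """Return successor symbolic states whose constraints are satisfiable."""
    out = []
    for src, dst, (glo, ghi) in rules:          # rule guard: glo <= x <= ghi
        if state.loc == src:
            lo, hi = max(state.lo, glo), min(state.hi, ghi)
            if lo <= hi:                        # "SMT check": guard /\ state sat?
                out.append(SymState(dst, lo, hi))
    return out

rules = [("idle", "run", (0, 50)), ("idle", "alarm", (100, 200))]
succs = step(SymState("idle", 0, 150), rules)
```

    Reachability analysis then explores these symbolic successors exactly as a rewriting engine explores concrete states, which is how the reduction to standard rewriting pays off.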

  14. Deciding Termination for Ancestor Match- Bounded String Rewriting Systems

    NASA Technical Reports Server (NTRS)

    Geser, Alfons; Hofbauer, Dieter; Waldmann, Johannes

    2005-01-01

    Termination of a string rewriting system can be characterized by termination on suitable recursively defined languages. This kind of termination criteria has been criticized for its lack of automation. In an earlier paper we have shown how to construct an automated termination criterion if the recursion is aligned with the rewrite relation. We have demonstrated the technique with Dershowitz's forward closure criterion. In this paper we show that a different approach is suitable when the recursion is aligned with the inverse of the rewrite relation. We apply this idea to Kurth's ancestor graphs and obtain ancestor match-bounded string rewriting systems. Termination is shown to be decidable for this class. The resulting method improves upon those based on match-boundedness or inverse match-boundedness.
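
    For readers unfamiliar with string rewriting, the objects under study can be pictured with a minimal one-step successor function in Python (the ancestor match-bound construction itself is well beyond a few lines; this only fixes the setting, and the names are ours).

```python
# One-step rewriting for a string rewriting system: every occurrence of
# a left-hand side may be replaced by the corresponding right-hand side.

def successors(s, rules):
    """All strings reachable from s in one rewrite step."""
    out = set()
    for lhs, rhs in rules:
        i = s.find(lhs)
        while i != -1:
            out.add(s[:i] + rhs + s[i + len(lhs):])
            i = s.find(lhs, i + 1)
    return out

# The system {ab -> ba} terminates: each step moves a b strictly leftward.
succ = successors("abab", [("ab", "ba")])
```

    Termination asks whether every chain of such steps is finite; the paper's contribution is that for ancestor match-bounded systems this question is decidable.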

  15. An Empirical Evaluation of Automated Theorem Provers in Software Certification

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd; Schumann, Johann

    2004-01-01

    We describe a system for the automated certification of safety properties of NASA software. The system uses Hoare-style program verification technology to generate proof obligations which are then processed by an automated first-order theorem prover (ATP). We discuss the unique requirements this application places on the ATPs, focusing on automation, proof checking, and usability. For full automation, however, the obligations must be aggressively preprocessed and simplified, and we demonstrate how the individual simplification stages, which are implemented by rewriting, influence the ability of the ATPs to solve the proof tasks. Our results are based on 13 certification experiments that led to more than 25,000 proof tasks which have each been attempted by Vampire, Spass, e-setheo, and Otter. The proofs found by Otter have been proof-checked by IVY.
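
    Rewrite-based simplification of proof obligations can be sketched as innermost rewriting to a normal form before the task is handed to a prover. The rules and the term representation below are our illustration, not the system's actual simplification stages.

```python
# Sketch of rewrite-based simplification: rewrite innermost-first to a
# normal form using a list of (matcher, rewriter) rules. Illustrative.

def simplify(term, rules):
    """Normalize a term given as nested tuples ("symbol", arg...)."""
    if not isinstance(term, tuple):
        return term
    # Simplify the arguments first (innermost strategy).
    term = (term[0],) + tuple(simplify(a, rules) for a in term[1:])
    for matches, rewrite in rules:
        if matches(term):
            return simplify(rewrite(term), rules)
    return term

rules = [
    # and(true, x) -> x   and   and(x, true) -> x
    (lambda t: t[0] == "and" and ("true",) in t[1:],
     lambda t: t[1] if t[2] == ("true",) else t[2]),
    # plus(x, zero) -> x
    (lambda t: t[0] == "plus" and t[2] == ("zero",),
     lambda t: t[1]),
]

g = simplify(("and", ("true",), ("plus", ("n",), ("zero",))), rules)
```

    Chaining several such stages, each discharging or shrinking obligations, is what makes the residual proof tasks tractable for the ATPs.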

  16. Using Automated Theorem Provers to Certify Auto-Generated Aerospace Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd; Schumann, Johann

    2004-01-01

    We describe a system for the automated certification of safety properties of NASA software. The system uses Hoare-style program verification technology to generate proof obligations which are then processed by an automated first-order theorem prover (ATP). For full automation, however, the obligations must be aggressively preprocessed and simplified. We describe the unique requirements this places on the ATP and demonstrate how the individual simplification stages, which are implemented by rewriting, influence the ability of the ATP to solve the proof tasks. Experiments on more than 25,000 tasks were carried out using Vampire, Spass, and e-setheo.

  17. Formal Compiler Implementation in a Logical Framework

    DTIC Science & Technology

    2003-04-29

    variable set [], we omit the brackets and use the simpler notation v. MetaPRL is a tactic-based prover that uses OCaml [20] as its meta-language. When a...rewrite is defined in MetaPRL, the framework creates an OCaml expression that can be used to apply the rewrite. Code to guide the application of...rewrites is written in OCaml, using a rich set of primitives provided by MetaPRL. MetaPRL automates the construction of most guidance code; we describe

  18. Scaffolded Writing and Rewriting in the Discipline: A Web-Based Reciprocal Peer Review System

    ERIC Educational Resources Information Center

    Cho, Kwangsu; Schunn, Christian D.

    2007-01-01

    This paper describes how SWoRD (scaffolded writing and rewriting in the discipline), a web-based reciprocal peer review system, supports writing practice, particularly for large content courses in which writing is considered critical but not feasibly included. To help students gain content knowledge as well as writing and reviewing skills, SWoRD…

  19. Rewritable and pH-Sensitive Micropatterns Based on Nanoparticle "Inks"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, D. W.; Lagzi, Istvan; Wesson, Paul J.

    2010-08-16

    Rewritable micropatterns based on nanoparticle “inks” are created in gel substrates by wet stamping. The colors of the patterns depend on pH, reflect the degree of nanoparticle aggregation, and can be written using acids and erased using bases. Micropatterns imprinted with salts are “permanent” but can change color upon pH changes; these patterns act as multiple-use pH sensors.

  20. On-Chip Fluorescence Switching System for Constructing a Rewritable Random Access Data Storage Device.

    PubMed

    Nguyen, Hoang Hiep; Park, Jeho; Hwang, Seungwoo; Kwon, Oh Seok; Lee, Chang-Soo; Shin, Yong-Beom; Ha, Tai Hwan; Kim, Moonil

    2018-01-10

    We report the development of an on-chip fluorescence switching system based on DNA strand displacement and DNA hybridization for the construction of a rewritable and randomly accessible data storage device. In this study, the feasibility and potential effectiveness of our proposed system were evaluated with a series of wet experiments involving 40 bits (5 bytes) of data encoding a 5-character text (KRIBB). Also, a flexible data-rewriting function was achieved by converting fluorescence signals between "ON" and "OFF" through DNA strand displacement and hybridization events. In addition, the proposed system was successfully validated on a microfluidic chip, which could further facilitate the encoding and decoding of data. To the best of our knowledge, this is the first report on the use of DNA hybridization and DNA strand displacement in the field of data storage devices. Taken together, our results demonstrate that DNA-based fluorescence switching could be applicable to constructing a rewritable and randomly accessible data storage device through controllable DNA manipulations.

  1. A Rewritable, Reprogrammable, Dual Light-Responsive Polymer Actuator.

    PubMed

    Gelebart, Anne Helene; Mulder, Dirk J; Vantomme, Ghislaine; Schenning, Albertus P H J; Broer, Dirk J

    2017-10-16

    We report on the fabrication of a rewritable and reprogrammable dual-photoresponsive liquid crystalline-based actuator containing an azomerocyanine dye that can be locally converted into the hydroxyazopyridinium form by acid treatment. Each dye absorbs at a different wavelength giving access to programmable actuators, the folding of which can be controlled by using different colors of light. The acidic patterning is reversible and allows the erasing and rewriting of patterns in the polymer film, giving access to reusable, adjustable soft actuators. © 2017 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  2. A Rewritable, Random-Access DNA-Based Storage System.

    PubMed

    Yazdi, S M Hossein Tabatabaei; Yuan, Yongbo; Ma, Jian; Zhao, Huimin; Milenkovic, Olgica

    2015-09-18

    We describe the first DNA-based storage architecture that enables random access to data blocks and rewriting of information stored at arbitrary locations within the blocks. The newly developed architecture overcomes drawbacks of existing read-only methods that require decoding the whole file in order to read one data fragment. Our system is based on new constrained coding techniques and accompanying DNA editing methods that ensure data reliability, specificity and sensitivity of access, and at the same time provide exceptionally high data storage capacity. As a proof of concept, we encoded parts of the Wikipedia pages of six universities in the USA, and selected and edited parts of the text written in DNA corresponding to three of these schools. The results suggest that DNA is a versatile media suitable for both ultrahigh density archival and rewritable storage applications.

  3. A Rewritable, Random-Access DNA-Based Storage System

    NASA Astrophysics Data System (ADS)

    Tabatabaei Yazdi, S. M. Hossein; Yuan, Yongbo; Ma, Jian; Zhao, Huimin; Milenkovic, Olgica

    2015-09-01

    We describe the first DNA-based storage architecture that enables random access to data blocks and rewriting of information stored at arbitrary locations within the blocks. The newly developed architecture overcomes drawbacks of existing read-only methods that require decoding the whole file in order to read one data fragment. Our system is based on new constrained coding techniques and accompanying DNA editing methods that ensure data reliability, specificity and sensitivity of access, and at the same time provide exceptionally high data storage capacity. As a proof of concept, we encoded parts of the Wikipedia pages of six universities in the USA, and selected and edited parts of the text written in DNA corresponding to three of these schools. The results suggest that DNA is a versatile media suitable for both ultrahigh density archival and rewritable storage applications.

  4. (Re)Writing Civics in the Digital Age: The Role of Social Media in Student (Dis)Engagement

    ERIC Educational Resources Information Center

    Portman Daley, Joannah

    2012-01-01

    (Re)Writing Civics in the Digital Age: The Role of Social Media in Student (Dis)Engagement addresses an important gap in the knowledge of civic rhetoric available in Rhetoric and Composition by using qualitative methods to explore the parameters of civic engagement through social media-based digital writing. With funding from URI's Office of…

  5. Monitoring Programs Using Rewriting

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Lan, Sonie (Technical Monitor)

    2001-01-01

    We present a rewriting algorithm for efficiently testing future time Linear Temporal Logic (LTL) formulae on finite execution traces. The standard models of LTL are infinite traces, reflecting the behavior of reactive and concurrent systems which conceptually may be continuously alive. In most past applications of LTL, theorem provers and model checkers have been used to formally prove that down-scaled models satisfy such LTL specifications. Our goal is instead to use LTL for up-scaled testing of real software applications, corresponding to analyzing the conformance of finite traces against LTL formulae. We first describe what it means for a finite trace to satisfy an LTL property and then suggest an optimized algorithm based on transforming LTL formulae. We use the Maude rewriting logic, which turns out to provide a good notation and an efficient rewriting engine for performing these experiments. The work constitutes part of the Java PathExplorer (JPAX) project, the purpose of which is to develop a flexible tool for monitoring Java program executions.

  6. Values Education through Aggadic Stories: The Didactic Rewriter as Interpreter

    ERIC Educational Resources Information Center

    Weinstein, Sara

    2016-01-01

    Didactic rewrites of aggadic stories are an important resource in values education. This study, geared primarily toward teachers involved in choosing curricular materials, investigates how the didactic rewriter actually becomes an interpreter, rather than a mere transmitter, of the original text. The personal values of the rewriters can influence…

  7. Rewriting in Advanced Composition.

    ERIC Educational Resources Information Center

    Stone, William B.

    A college English instructor made an informal comparison of rewriting habits of students in a freshman composition course and two advanced composition courses. Notes kept on student rewriting focused on this central question: given peer and instructor response to their papers and a choice as to what and how to rewrite, what will students decide to…

  8. Rewrite Systems, Pattern Matching, and Code Generation

    DTIC Science & Technology

    1988-06-09

    Transformations Quien a buen arbol se arrima, buena sombra le cobija ("He who leans on a good tree is covered by good shade") [Old Spanish Saying] Trees are hierarchical mathematical objects. Their...subtrees of a tree may match one or more rewrite rules. Traditional research in term rewrite systems is concerned with determining if a given system...be simulated by sets of rewrite rules. Non-local conditions are described in an awkward way since the only way to transmit information is indirectly

  9. Preparing a Sublanguage Grammar

    DTIC Science & Technology

    1991-10-31

    new BNFs based on the Navy CASREP data. Rewriting of BNF Options Rewriting Options for Subordinate Clause Strings Originally, SUB6 and SUB7 were "rare...the CASREP data, we found it necessary to derarify the expansions of SUB6 and SUB7 in CSSTG. An Example of Subordinate Clauses SUB6 expands to a CS6... SUB7 was derarified to account for the sentences in (7). (7) a. [Testa 4.1]: WHILE DIESEL WAS OPERATING WITH SAC DISENGAGED, SAC LO ALARM SOUNDED. b

  10. Modernization and optimization of a legacy open-source CFD code for high-performance computing architectures

    NASA Astrophysics Data System (ADS)

    Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; Kalinkin, Alexander A.

    2017-02-01

    Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which would necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is a culmination of several modernization efforts of the legacy code MFIX, an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to an 8.5x improvement at the selected kernel level with the first approach and up to a 50% improvement in total simulated time with the latter, for the demonstration cases and target HPC systems employed.

  11. TOMML: A Rule Language for Structured Data

    NASA Astrophysics Data System (ADS)

    Cirstea, Horatiu; Moreau, Pierre-Etienne; Reilles, Antoine

    We present the TOM language, which extends JAVA with the purpose of providing high-level constructs inspired by the rewriting community. TOM thus bridges the gap between a general-purpose language and high-level specifications based on rewriting. This approach was motivated by the promotion of rule-based techniques and their integration in large-scale applications. Powerful matching capabilities along with a rich strategy language are among TOM's strong features that make it easy to use and competitive with respect to other rule-based languages. TOM is thus a natural choice for querying and transforming structured data and in particular XML documents [1]. We present here its main XML-oriented features and illustrate its use on several examples.
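
    The kind of term pattern matching that a rule language like TOM compiles can be sketched in a few lines of Python: terms are ("symbol", arg...) tuples, and pattern variables are strings starting with "?". This is our illustration of the general technique, not TOM's syntax or implementation.

```python
# Minimal first-order term pattern matching. Terms are nested tuples
# ("symbol", arg...); pattern variables are strings starting with "?".

def match(pat, term, env=None):
    """Return a substitution making pat equal to term, or None."""
    env = dict(env or {})
    if isinstance(pat, str) and pat.startswith("?"):     # pattern variable
        if pat in env:
            return env if env[pat] == term else None     # non-linear pattern check
        env[pat] = term
        return env
    if (isinstance(pat, tuple) and isinstance(term, tuple)
            and pat[0] == term[0] and len(pat) == len(term)):
        for p, t in zip(pat[1:], term[1:]):
            env = match(p, t, env)
            if env is None:
                return None
        return env
    return env if pat == term else None                  # constants

sub = match(("plus", "?x", ("zero",)),
            ("plus", ("succ", ("zero",)), ("zero",)))
```

    A rewrite engine built on such matching tries each rule's left-hand side against (sub)terms and instantiates the right-hand side with the resulting substitution; TOM adds strategies to control where and in what order this happens.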

  12. 75 FR 5241 - General Services Administration Acquisition Regulation; Rewrite of Part 512, Acquisition of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-02

    ... Acquisition Regulation; Rewrite of Part 512, Acquisition of Commercial Items AGENCIES: Office of Acquisition... Administration (GSA) is amending the General Services Administration Acquisition Regulation (GSAR) to update the text addressing the acquisition of commercial items. This rule is a result of the GSAM Rewrite...

  13. Analytical learning and term-rewriting systems

    NASA Technical Reports Server (NTRS)

    Laird, Philip; Gamble, Evan

    1990-01-01

    Analytical learning is a set of machine learning techniques for revising the representation of a theory based on a small set of examples of that theory. When the representation of the theory is correct and complete but perhaps inefficient, an important objective of such analysis is to improve the computational efficiency of the representation. Several algorithms with this purpose have been suggested, most of which are closely tied to a first-order logical language and are variants of goal regression, such as the familiar explanation-based generalization (EBG) procedure. But because predicate calculus is a poor representation for some domains, these learning algorithms are extended to apply to other computational models. It is shown that the goal regression technique applies to a large family of programming languages, all based on a kind of term rewriting system. Included in this family are three language families of importance to artificial intelligence: logic programming, such as Prolog; lambda calculus, such as LISP; and combinator-based languages, such as FP. A new analytical learning algorithm, AL-2, is exhibited that learns from success but is otherwise quite different from EBG. These results suggest that term rewriting systems are a good framework for analytical learning research in general, and that further research should be directed toward developing new techniques.

  14. Rewritable phosphorescent paper by the control of competing kinetic and thermodynamic self-assembling events

    NASA Astrophysics Data System (ADS)

    Kishimura, Akihiro; Yamashita, Takashi; Yamaguchi, Kentaro; Aida, Takuzo

    2005-07-01

    Security inks have become of increasing importance. They are composed of invisible substances that produce printed images that cannot be photocopied and are readable only under special conditions. Here we report a novel photoluminescent ink for rewritable media that dichroically emits phosphorescence due to a structural bistability of the self-assembled luminophor. Long-lasting images have been developed by using conventional thermal printers, which are readable only on exposure to ultraviolet light, and more importantly, are thermally erasable for rewriting. Although thermally rewritable printing media have already been developed using visible dyes and cholesteric liquid crystals, security inks that allow rewriting of invisible printed images are unprecedented. We realized this unique feature by controlling kinetic and thermodynamic processes that compete with one another in the self-assembly of the luminophor. This strategy can provide an important step towards the next-generation security technology for information handling.

  15. Mirroring the Object of the Lesson: The Creative Process of Scriptural Rewriting as an Effective Practice for Teaching Sacred Texts

    ERIC Educational Resources Information Center

    Palmer, Carmen

    2018-01-01

    This paper introduces Rewritten Scripture and scriptural rewriting as a creative process that, when mirrored in a teaching exercise, may serve as an effective practice in teaching sacred texts. Observing changes made between scripture and its rewriting may allow readers to identify different contexts among these texts. Furthermore, the act of…

  16. Formal logic rewrite system bachelor in teaching mathematical informatics

    NASA Astrophysics Data System (ADS)

    Habiballa, Hashim; Jendryscik, Radek

    2017-07-01

    The article presents the capabilities of the formal logic rewrite system Bachelor for teaching theoretical computer science (mathematical informatics). Bachelor enables a constructivist approach to teaching and may therefore enhance the learning process in essential disciplines of theoretical informatics. It provides not only a detailed description of the formal rewrite process but can also demonstrate algorithmic principles for manipulating logic formulae.

  17. Printable and Rewritable Full Block Copolymer Structural Color.

    PubMed

    Kang, Han Sol; Lee, Jinseong; Cho, Suk Man; Park, Tae Hyun; Kim, Min Ju; Park, Chanho; Lee, Seung Won; Kim, Kang Lib; Ryu, Du Yeol; Huh, June; Thomas, Edwin L; Park, Cheolmin

    2017-08-01

    Structural colors (SCs) of photonic crystals (PCs) arise from selective constructive interference of incident light. Here, an ink-jet printable and rewritable block copolymer (BCP) SC display is demonstrated, which can be quickly written and erased over 50 times with resolution nearly equivalent to that obtained with a commercial office ink-jet printer. Moreover, the writing process employs an easily modified printer for position- and concentration-controlled deposition of a single, colorless, water-based ink containing a reversible crosslinking agent, ammonium persulfate. Deposition of the ink onto a self-assembled BCP PC film comprising a 1D stack of alternating layers enables differential swelling of the written BCP film and produces a full-colored SC display of characters and images. Furthermore, the information can be readily erased and the system can be reset by application of hydrogen bromide. Subsequently, new information can be rewritten, resulting in a chemically rewritable BCP SC display. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. A halochromic stimuli-responsive reversible fluorescence switching 3, 4, 9, 10-perylene tetracarboxylic acid dye for fabricating rewritable platform

    NASA Astrophysics Data System (ADS)

    Hariharan, P. S.; Pitchaimani, J.; Madhu, Vedichi; Anthony, Savarimuthu Philip

    2017-02-01

    3,4,9,10-perylene tetracarboxylic acid (PTCA), a strongly fluorescent, water-soluble dye with halochromic functionality, showed pH-dependent reversible fluorescence switching. The strong fluorescence of PTCA (Φf = 0.67) in basic medium was completely quenched upon acidification. The fluorescent PTCA was also transferred onto solid substrates (filter paper and a glass plate), where it likewise showed reversible off-on fluorescence switching by acid/base treatment and drying/water-vapour exposure. The reversible fluorescence switching of PTCA could be of potential interest for fabricating rewritable fluorescent media.

  19. Recovering from "amnesia" brought about by radiation. Verification of the "Over the air" (OTA) application software update mechanism On-Board Solar Orbiter's Energetic Particle Detector

    NASA Astrophysics Data System (ADS)

    Da Silva, Antonio; Sánchez Prieto, Sebastián; Rodriguez Polo, Oscar; Parra Espada, Pablo

    Computer memories are not supposed to forget, but they do. Because of the proximity to the Sun, from the Solar Orbiter boot software perspective it is mandatory to watch for permanent memory errors resulting from single-event latch-up (SEL) failures in application binaries stored in EEPROM and in their SDRAM deployment areas. In this situation, the last line of defense established by FDIR mechanisms is the capability of the boot software to provide an accurate report of the memory damage and to perform an application software update that avoids the harmed locations by flashing the EEPROM with a new binary. This paper describes the verification of the OTA EEPROM firmware update procedure of the boot software that will run in the Instrument Control Unit (ICU) of the Energetic Particle Detector (EPD) on board Solar Orbiter. Since the maximum number of rewrites on a real EEPROM is limited and permanent memory faults cannot be conveniently emulated in real hardware, the verification has been accomplished by using a LEON2 virtual platform (Leon2ViP) with fault-injection capabilities and real SpaceWire interfaces, developed by the Space Research Group (SRG) of the University of Alcalá. This way it is possible to run exactly the same target binary software as if it were run on the real ICU platform. Furthermore, this virtual hardware-in-the-loop (VHIL) approach makes it possible to communicate with Electrical Ground Support Equipment (EGSE) through real SpaceWire interfaces in an agile, controlled and deterministic environment.

  20. Modernization and optimization of a legacy open-source CFD code for high-performance computing architectures

    DOE PAGES

    Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; ...

    2017-03-20

    Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investment, such as rewriting in modern languages and adopting new data constructs, and would necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is the culmination of several modernization efforts on the legacy code MFIX, an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to an 8.5x improvement at the selected kernel level with the first approach, and up to a 50% improvement in total simulated time with the latter, for the demonstration cases and target HPC systems employed.

  2. Termination of String Rewriting Rules that have One Pair of Overlaps

    NASA Technical Reports Server (NTRS)

    Geser, Alfons; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    This paper presents a partial solution to the long standing open problem of termination of one-rule string rewriting. Overlaps between the two sides of the rule play a central role in existing termination criteria. We characterize termination of all one-rule string rewriting systems that have one such overlap at either end. This both completes a result of Kurth and generalizes a result of Shikishima-Tsuji et al.
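    The central notion of an overlap between the two sides of a one-rule system l -> r can be sketched directly; under one common formalization, an overlap at an end is a nonempty proper word that is simultaneously a suffix of one side and a prefix of the other. The rule below is invented for illustration, and the precise termination criterion is in the paper itself.

    ```python
    def overlaps(u, v):
        """Nonempty proper words that are both a suffix of u and a prefix of v."""
        return {u[-k:] for k in range(1, min(len(u), len(v)))
                if u[-k:] == v[:k]}

    # Illustrative one-rule system (invented): l -> r with l = "abc", r = "cab".
    l, r = "abc", "cab"
    print(overlaps(l, r))  # suffixes of l that are prefixes of r: {'c'}
    print(overlaps(r, l))  # suffixes of r that are prefixes of l: {'ab'}
    ```

    When exactly one such overlap exists at either end, the characterization described in the abstract decides termination of the rule; with no overlaps at all, no new redex can ever be created and termination is immediate.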

  3. Nitrogen-doped partially reduced graphene oxide rewritable nonvolatile memory.

    PubMed

    Seo, Sohyeon; Yoon, Yeoheung; Lee, Junghyun; Park, Younghun; Lee, Hyoyoung

    2013-04-23

    As memory materials, two-dimensional (2D) carbon materials such as graphene oxide (GO)-based materials have attracted attention due to a variety of advantageous attributes, including their solution-processability and their potential for highly scalable device fabrication for transistor-based memory and cross-bar memory arrays. In spite of this, the use of GO-based materials has been limited, primarily due to uncontrollable oxygen functional groups. To induce a stable memory effect from the ionic charges of the negatively charged carboxylic acid groups of partially reduced graphene oxide (PrGO), a positively charged pyridinium N, serving as a counterion to the negatively charged carboxylic acid, was carefully introduced into the PrGO framework. Partially reduced N-doped graphene oxide (PrGODMF) in dimethylformamide (DMF) behaved as a semiconducting nonvolatile memory material. Its optical energy band gap was 1.7-2.1 eV, and it contained an sp2 C═C framework with 45-50% oxygen-functionalized carbon density and 3% doped nitrogen atoms. In particular, the rewritable nonvolatile memory characteristics were dependent on the proportion of pyridinium N: as the proportion of pyridinium N atoms decreased, the PrGODMF film lost its memory behavior. Polarization of charged PrGODMF containing pyridinium N and carboxylic acid under an electric field produced N-doped PrGODMF memory effects that followed voltage-driven rewrite-read-erase-read processes.

  4. A Rewriting Logic Approach to Type Inference

    NASA Astrophysics Data System (ADS)

    Ellison, Chucky; Şerbănuţă, Traian Florin; Roşu, Grigore

    Meseguer and Roşu proposed rewriting logic semantics (RLS) as a programming language definitional framework that unifies operational and algebraic denotational semantics. RLS has already been used to define a series of didactic and real languages, but its benefits in connection with defining and reasoning about type systems have not been fully investigated. This paper shows how the same RLS style employed for giving formal definitions of languages can be used to define type systems. The same term-rewriting mechanism used to execute RLS language definitions can now be used to execute type systems, giving type checkers or type inferencers. The proposed approach is exemplified by defining the Hindley-Milner polymorphic type inferencer W as a rewrite logic theory and using this definition to obtain a type inferencer by executing it in a rewriting logic engine. The inferencer obtained this way compares favorably with other definitions or implementations of W. The performance of the executable definition is within an order of magnitude of that of highly optimized implementations of type inferencers, such as that of OCaml.
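    The idea of executing a type system with the same machinery that executes programs can be sketched outside Maude. The toy checker below "rewrites" expressions to their types for an invented monomorphic expression language; it is far simpler than the paper's polymorphic inferencer W, and every rule and name here is illustrative only.

    ```python
    # Toy illustration of "type systems as rewrite rules": expressions are
    # recursively reduced to type tags. Monomorphic checking only; the
    # expression language (tuples for "+" and "if") is invented.

    def type_of(e):
        if isinstance(e, bool):       # check bool before int: bool is a subtype of int in Python
            return "bool"
        if isinstance(e, int):
            return "int"
        op = e[0]
        if op == "+" and type_of(e[1]) == "int" and type_of(e[2]) == "int":
            return "int"              # rule: int + int ~> int
        if op == "if" and type_of(e[1]) == "bool":
            t_then, t_else = type_of(e[2]), type_of(e[3])
            if t_then == t_else:
                return t_then         # rule: if bool then t else t ~> t
        raise TypeError(f"ill-typed: {e!r}")

    print(type_of(("if", True, ("+", 1, 2), 0)))  # int
    ```

    In the RLS setting, the analogous rules live in a rewrite theory, so the same engine that runs a language definition also runs its type system; full Hindley-Milner inference additionally threads unification through the rules.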

  5. Match-bounded String Rewriting Systems

    NASA Technical Reports Server (NTRS)

    Geser, Alfons; Hofbauer, Dieter; Waldmann, Johannes

    2003-01-01

    We introduce a new class of automated proof methods for the termination of rewriting systems on strings. The basis of all these methods is to show that rewriting preserves regular languages. To this end, letters are annotated with natural numbers, called match heights. If the minimal height of all positions in a redex is h, then every position in the reduct will get height h+1. In a match-bounded system, match heights are globally bounded. Using recent results on deleting systems, we prove that rewriting by a match-bounded system preserves regular languages. Hence it is decidable whether a given rewriting system has a given match bound. We also provide a sufficient criterion for the absence of a match bound. The problem of the existence of a match bound is still open. Match-boundedness for all strings can be used as an automated criterion for termination, for match-bounded systems are terminating. This criterion can be strengthened by requiring match-boundedness only for a restricted set of strings, for instance the set of right-hand sides of forward closures.
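    The match-height annotation can be simulated directly under the convention stated in the abstract: each letter carries a height, and every letter written by a rewrite gets height one more than the minimal height in the matched redex. The one-rule system used below (aa -> aba) is an invented example, and the brute-force search is only an empirical probe, not the paper's decision procedure.

    ```python
    # Simulate match-height annotated rewriting for a one-rule string
    # rewriting system. Letters carry heights; rhs letters get height
    # 1 + min(heights in the matched redex).

    def step(annot, lhs, rhs):
        """Return all one-step successors of an annotated string."""
        letters = "".join(a for a, _ in annot)
        out = []
        i = letters.find(lhs)
        while i != -1:
            h = 1 + min(h for _, h in annot[i:i + len(lhs)])
            out.append(annot[:i] + [(c, h) for c in rhs] + annot[i + len(lhs):])
            i = letters.find(lhs, i + 1)
        return out

    def max_height_reachable(s, lhs, rhs, fuel=1000):
        """Explore derivations from s (all heights 0); report the largest height seen."""
        frontier, best = [[(c, 0) for c in s]], 0
        while frontier and fuel > 0:
            fuel -= 1
            annot = frontier.pop()
            best = max(best, max(h for _, h in annot))
            frontier.extend(step(annot, lhs, rhs))
        return best

    # Example system aa -> aba: all derivations from "aaa" stay at height <= 1,
    # consistent with the system being match-bounded (and hence terminating).
    print(max_height_reachable("aaa", "aa", "aba"))  # 1
    ```

    A global bound on these heights, over all start strings, is exactly the match bound whose existence the paper shows is decidable for any fixed candidate value.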

  6. SPARQL Query Re-writing Using Partonomy Based Transformation Rules

    NASA Astrophysics Data System (ADS)

    Jain, Prateek; Yeh, Peter Z.; Verma, Kunal; Henson, Cory A.; Sheth, Amit P.

    Often the information present in a spatial knowledge base is represented at a different level of granularity and abstraction than the query constraints. For querying ontologies containing spatial information, the precise relationships between spatial entities have to be specified in the basic graph pattern of a SPARQL query, which can result in long and complex queries. We present a novel approach to help users intuitively write SPARQL queries to query spatial data, rather than relying on knowledge of the ontology structure. Our framework re-writes queries, using transformation rules that exploit part-whole relations between geographical entities to address the mismatches between query constraints and the knowledge base. Our experiments were performed on completely third-party datasets and queries. Evaluations were performed on the Geonames dataset using questions from the National Geographic Bee serialized into SPARQL, and on the British Administrative Geography Ontology using questions from a popular trivia website. These experiments demonstrate high precision in retrieval of results and ease in writing queries.
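    The rewriting step can be sketched on triple patterns: a pattern that constrains location to a whole is relaxed so that membership in any part of that whole also matches. All predicate and entity names below (locatedIn, partOf*, UnitedKingdom) are invented stand-ins, not the paper's vocabulary, and the sketch handles a single such pattern.

    ```python
    # Sketch of partonomy-based query rewriting over SPARQL-style triple
    # patterns, represented as (subject, predicate, object) tuples.

    def rewrite(patterns):
        """Relax location-to-a-constant patterns via a part-whole relation."""
        out = []
        for s, p, o in patterns:
            if p == "locatedIn" and not o.startswith("?"):
                out.append((s, "locatedIn", "?part"))
                out.append(("?part", "partOf*", o))  # property path over partOf
            else:
                out.append((s, p, o))
        return out

    query = [("?city", "rdf:type", "City"),
             ("?city", "locatedIn", "UnitedKingdom")]
    for t in rewrite(query):
        print(t)
    ```

    The user writes the short query on the last two lines; the rewritten pattern additionally matches cities recorded as located in, say, a county that is part of the country, which is the granularity mismatch the framework addresses.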

  7. Rewritable Painting Realized from Ambient-Sensitive Fluorescence of ZnO Nanoparticles

    PubMed Central

    Liu, Kai-Kai; Shan, Chong-Xin; He, Gao-Hang; Wang, Ruo-Qiu; Dong, Lin; Shen, De-Zhen

    2017-01-01

    Paper, as one of the most important information carriers, has contributed greatly to the development and transmission of human civilization. Meanwhile, the production and use of paper pose a serious problem for environmentally sustainable development in modern society. Therefore, a simple and green route to rewritable painting on paper is urgently needed. Herein, a simple route to rewritable painting on copy paper is demonstrated using eco-friendly ZnO nanoparticles (NPs) as fluorescent ink, with vinegar and soda, both frequently used in the kitchen, as erasing and neutralizing agents. Words or patterns written using the ZnO NPs as ink can be erased by vinegar vapour within five seconds, and after a neutralizing process in soda vapour the paper can be used for writing again. It is worth noting that the resolution and precision of the patterns produced via this route degrade little after ten rewriting cycles, and the quality of the patterns fades little after storage for several months, which promises versatile potential applications of the rewriting route proposed in this paper. PMID:28169344

  8. Synthesis for Structure Rewriting Systems

    NASA Astrophysics Data System (ADS)

    Kaiser, Łukasz

    The description of a single state of a modelled system is often complex in practice, but few procedures for synthesis address this problem in depth. We study systems in which a state is described by an arbitrary finite structure, and changes of the state are represented by structure rewriting rules, a generalisation of term and graph rewriting. Both the environment and the controller are allowed to change the structure in this way, and the question we ask is how a strategy for the controller that ensures a given property can be synthesised.

  9. NSR&D FY17 Report: CartaBlanca Capability Enhancements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, Christopher Curtis; Dhakal, Tilak Raj; Zhang, Duan Zhong

    Over the last several years, particle technology in the CartaBlanca code has matured and has been successfully applied to a wide variety of physical problems. It has been shown that particle methods, especially Los Alamos's dual domain material point method, are capable of computing many problems involving complex physics and chemistry accompanied by large material deformations, where traditional finite element or Eulerian methods encounter significant difficulties. In FY17, the CartaBlanca code was enhanced with new physical models and numerical algorithms. We started out to compute penetration and HE safety problems. For most of the year we focused on TEPLA model improvement, testing against the sweeping wave experiment by Gray et al., because it was found that pore growth and material failure are essentially important for our tasks and needed to be understood to model the penetration and can experiments efficiently. We extended the TEPLA model from the point of view of ensemble phase averaging to include the effects of finite deformation. It is shown that the assumed pore growth model in TEPLA is actually an exact result from the theory. Along this line, we then generalized the model to include finite deformations to consider the nonlinear dynamics of large deformation. The interaction between the HE product gas and the solid metal is based on the multi-velocity formulation. Our preliminary numerical results suggest good agreement between the experiment and the simulations, pending further verification. To improve the parallel processing capabilities of the CartaBlanca code, we are actively working with the Next Generation Code (NGC) project to rewrite selected packages using C++. This work is expected to continue in the following years. This effort also makes the particle technology developed within the CartaBlanca project available to other parts of the laboratory. Working with the NGC project and rewriting some parts of the code has also given us an opportunity to improve our numerical implementations of the method and to take advantage of recent advances in numerical methods, such as multiscale algorithms.

  10. Erasure and formation of femtosecond laser-induced nanostructures

    NASA Astrophysics Data System (ADS)

    Zimmermann, F.; Plech, A.; Richter, Sören; Tünnermann, A.; Nolte, S.

    2015-03-01

    The local inscription of strong birefringence by ultrashort laser pulses facilitates the fabrication of manifold photonic devices, such as data storage devices. One intriguing feature of these nanograting-based data units is the ability to delete and rewrite nanograting voxels by changing the laser polarization orientation during inscription. However, up to now no comprehensive picture of this complex physical process exists. We therefore performed optical retardance measurements as well as microscopic analyses, such as small-angle X-ray scattering (SAXS) and scanning electron microscopy (SEM), to address this issue. Our results reveal that only a few laser pulses already lead to an erasure of nanometric pores, which is mapped by the total (X-ray) scattering volume as well as by the strong reduction of the initial form birefringence. Simultaneously, new nanostructures form, which arrange into individual grating planes with ongoing irradiation. However, since the rewrite process is not an ideal mechanism, some of the old sheets remain, which perturbs the quality of the new nanograting. When rewriting multiple times, the glass becomes even more porous due to repetitive annealing and quenching. This promotes the formation of new inhomogeneities and in turn leads to an increase in optical retardance.

  11. Rewriting the Ways of Globalising Education?

    ERIC Educational Resources Information Center

    Singh, Michael

    2002-01-01

    Recodes metaphors in Apple's "Educating the 'Right' Way: Markets, Standards, God and Inequality," as resources for rewriting ways of globalizing education. Suggests that the radical right is imposing its market-driven, evangelical, reductionist project on educational globalization. These efforts frame the work of real-world teachers…

  12. Learning by Restorying

    ERIC Educational Resources Information Center

    Slabon, Wayne A.; Richards, Randy L.; Dennen, Vanessa P.

    2014-01-01

    In this paper, we introduce restorying, a pedagogical approach based on social constructivism that employs successive iterations of rewriting and discussing personal, student-generated, domain-relevant stories to promote conceptual application, critical thinking, and ill-structured problem solving skills. Using a naturalistic, qualitative case…

  13. Proposal of New Rewritable Printing Media Using Electrophoresis and Confirmation of Its Mechanism

    NASA Astrophysics Data System (ADS)

    Hoshino, Yasushi; Ogura, Masahiro; Sano, Takayuki

    2004-10-01

    A new rewritable printing medium using electrophoresis and selective heating is proposed, to contribute to reducing paper consumption by printers. The mechanism is as follows: when a heated part of the rewritable medium melts, the white particles in that part are able to move by electrophoresis. The medium is initialized by heating its entire surface while a voltage is applied, and imaging is carried out by selective heating while a reversed-polarity voltage is applied. Using a mixture of carnauba wax and particles coated with titanium oxide (TiO2), the feasibility of the mechanism is confirmed.

  14. A flexible optically re-writable color liquid crystal display

    NASA Astrophysics Data System (ADS)

    Zhang, Yihong; Sun, Jiatong; Liu, Yang; Shang, Jianhua; Liu, Hao; Liu, Huashan; Gong, Xiaohui; Chigrinov, Vladimir; Kowk, Hoi Sing

    2018-03-01

    It is very difficult to make a liquid crystal display (LCD) that is flexible. However, for an optically re-writable LCD (ORWLCD), only the spacers and the substrates need to be flexible because the driving unit and the display unit are separate and there are no electronics in the display part of ORWLCD. In this paper, three flexible-spacer methods are proposed to achieve this goal. A cholesteric liquid crystal colored mirror with a polarizer behind it is used as the colored reflective backboard of an ORWLCD. Polyethersulfone substrates and flexible spacers are used to make the optically re-writable cell insensitive to mechanical force.

  15. A Female Interrogative Reader: The Adolescent Jane Austen Reads and Rewrites (His)tory.

    ERIC Educational Resources Information Center

    Reid-Walsh, Jacqueline

    1992-01-01

    Argues that Jane Austen's unpublished juvenile work "The History of England" has considerable relevance to twentieth-century high-school English classrooms. Notes that the work humorously shows the gender bias of traditional history texts because it is a "woman-centered" rewriting. (RS)

  16. 75 FR 48872 - General Services Administration Acquisition Regulation; Rewrite of GSAR Part 541, Acquisition of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-12

    ... Acquisition Regulation; Rewrite of GSAR Part 541, Acquisition of Utility Services AGENCIES: Office of Acquisition Policy, General Services Administration (GSA). ACTION: Final rule. SUMMARY: The General Services Administration (GSA) is amending the General Services Administration Acquisition Regulation (GSAR) to improve the...

  17. 78 FR 71041 - VA Compensation and Pension Regulation Rewrite Project

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-27

    ...The Department of Veterans Affairs (VA) proposes to reorganize and rewrite its compensation and pension regulations in a logical, claimant-focused, and user-friendly format. The intended effect of the proposed revisions is to assist claimants, beneficiaries, veterans' representatives, and VA personnel in locating and understanding these regulations.

  18. Rewriting of States' Standards on Social Studies Stirs Debate

    ERIC Educational Resources Information Center

    Robelen, Erik W.

    2010-01-01

    As debate continues around the development and adoption of common standards in English and mathematics, several states are independently wrestling with rewrites of standards in a content area largely absent from that national discussion--social studies--and encountering their own shares of controversy. Flash points in the social studies debates…

  19. Rewriting Citizenship? Civic Education in Costa Rica and Argentina

    ERIC Educational Resources Information Center

    Suarez, David F.

    2008-01-01

    To what degree are nations "rewriting" citizenship by expanding discussions of human rights, diversity and cultural pluralism in modern civic education, and what explains variation between countries? This study addresses these issues by analysing the intended content of civic education in Costa Rica and Argentina. Over time, civic…

  20. Kellogg Foundation Initiative: Rewriting the Way Foundations Do Business in Indian Country.

    ERIC Educational Resources Information Center

    Boyer, Paul

    2000-01-01

    Describes the multi-million dollar initiative announced by W. K. Kellogg Foundation in 1995 to support the Native American Higher Education Initiative, and how the Kellogg initiative deserves attention from the nation as a whole because it is attempting to fundamentally rewrite the way foundations do business with Indian communities. (VWC)

  1. Phase structure rewrite systems in information retrieval

    NASA Technical Reports Server (NTRS)

    Klingbiel, P. H.

    1985-01-01

    Operational-level automatic indexing requires an efficient means of normalizing natural language phrases. Subject switching requires an efficient means of translating one set of authorized terms to another. A phrase structure rewrite system called a Lexical Dictionary that performs these functions is explained. Background, operational use, other applications, and ongoing research are described.

  2. 76 FR 69296 - Proposed Models for Plant-Specific Adoption of Technical Specifications Task Force Traveler TSTF...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-08

    ... Adoption of Technical Specifications Task Force Traveler TSTF-500, Revision 2, ``DC Electrical Rewrite... Technical Specifications Task Force (TSTF) Traveler TSTF-500, Revision 2, ``DC Electrical Rewrite--Update to... Reactor Systems Engineer, Technical Specifications Branch, Mail Stop: O-7 C2A, Division of Inspection and...

  3. Emancipatory Narratives: Rewriting the Master Script in the School Curriculum.

    ERIC Educational Resources Information Center

    Swartz, Ellen

    1992-01-01

    Early efforts at multicultural education have largely been compensatory attempts to address inequities in cultural representation. What is needed is a rewriting of the entire master script of curriculum to eliminate implicit racism, classism, and sexism. Examples from U.S. history illustrate the scope of revision needed in education. (SLD)

  4. Re/Writing the Subject: A Contribution to Post-Structuralist Theory in Mathematics Education

    ERIC Educational Resources Information Center

    Roth, Wolff-Michael

    2012-01-01

    This text, occasioned by a critical reading of "Mathematics Education and Subjectivity" (Brown, "2011") and constituting a response to the book, aims at contributing to the building of (post-structuralist) theory in mathematics education. Its purpose was to re/write two major positions that "Mathematics Education and Subjectivity" articulates:…

  5. Sentence-Level Rewriting Detection

    ERIC Educational Resources Information Center

    Zhang, Fan; Litman, Diane

    2014-01-01

    Writers usually need iterations of revision and editing during their writing. To better understand the process of rewriting, we need to know what has changed between revisions. Prior work mainly focuses on detecting corrections within sentences, at the level of words or phrases. This paper proposes to detect revision changes at the…

  6. Questions from Afar: The Influence of Outsideness on Web-Based Conversation

    ERIC Educational Resources Information Center

    Deed, Craig; Edwards, Anthony; Gomez, Viviana

    2015-01-01

    This paper defines the metaphor of outsideness in relation to web-based interaction. Outsideness is conceived of as a key influence in online academic conversation. In particular, through the sharing of cultural perspectives, asking questions to resolve doubt, and collaborative writing and re-writing as a basis for shaping ideas through reasoning.…

  7. Incremental Query Rewriting with Resolution

    NASA Astrophysics Data System (ADS)

    Riazanov, Alexandre; Aragão, Marcelo A. T.

    We address the problem of semantic querying of relational databases (RDB) modulo knowledge bases using very expressive knowledge representation formalisms, such as full first-order logic or its various fragments. We propose to use a resolution-based first-order logic (FOL) reasoner for computing schematic answers to deductive queries, with the subsequent translation of these schematic answers to SQL queries which are evaluated using a conventional relational DBMS. We call our method incremental query rewriting, because an original semantic query is rewritten into a (potentially infinite) series of SQL queries. In this chapter, we outline the main idea of our technique - using abstractions of databases and constrained clauses for deriving schematic answers, and provide completeness and soundness proofs to justify the applicability of this technique to the case of resolution for FOL without equality. The proposed method can be directly used with regular RDBs, including legacy databases. Moreover, we propose it as a potential basis for an efficient Web-scale semantic search technology.
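    The final translation step, from a schematic answer (a conjunction of database literals sharing variables) to an SQL query, can be sketched as follows. The schema, table names, and literal encoding below are invented for illustration; real systems derive them from the RDB's actual schema and the reasoner's constrained clauses.

    ```python
    # Sketch of translating one schematic answer into SQL. A literal is
    # (table, [(column, term), ...]); terms starting with "?" are variables.

    def to_sql(literals, answer_var):
        tables, where, binding = [], [], {}
        for i, (table, args) in enumerate(literals):
            alias = f"t{i}"
            tables.append(f"{table} AS {alias}")
            for column, term in args:
                ref = f"{alias}.{column}"
                if term.startswith("?"):
                    if term in binding:                 # shared variable -> join condition
                        where.append(f"{ref} = {binding[term]}")
                    else:
                        binding[term] = ref
                else:                                   # constant -> selection condition
                    where.append(f"{ref} = '{term}'")
        return (f"SELECT {binding[answer_var]} FROM " + ", ".join(tables)
                + (" WHERE " + " AND ".join(where) if where else ""))

    # Schematic answer for "names of students taking cs101" (invented schema):
    # takes(?s, cs101) AND student(?s, ?n), answer variable ?n.
    schematic = [("takes",   [("student", "?s"), ("course", "cs101")]),
                 ("student", [("id", "?s"), ("name", "?n")])]
    print(to_sql(schematic, "?n"))
    ```

    "Incremental" rewriting then means the reasoner keeps deriving further schematic answers, each translated and evaluated as its own SQL query, so the (potentially infinite) series of queries is produced and consumed one at a time.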

  8. Financing Education for Children Affected by Conflict: Lessons from Save the Children's Rewrite the Future Campaign

    ERIC Educational Resources Information Center

    Dolan, Janice; Ndaruhutse, Susy

    2011-01-01

    In recent years, Save the Children, a non-governmental organization, prioritized education for children affected by conflict through its Rewrite the Future Campaign. By significantly scaling up the resources allocated to programmes in conflict-affected countries, the organization has grown its education programmes in these contexts. Thus it has…

  9. Students' Use of Languaging in Rewriting Events from "The Things They Carried"

    ERIC Educational Resources Information Center

    Beach, Richard

    2017-01-01

    This article describes high school students' responses to events in the novel, "The Things They Carried," leading to their collaborative rewriting to create their own narrative versions of these events. It draws on "enactivist" theory of languaging, an approach to language that focuses on its use as social actions to enact and…

  10. Michigan Judge Strikes Down Controversial University Policy on Racial Harassment.

    ERIC Educational Resources Information Center

    Black Issues in Higher Education, 1989

    1989-01-01

    A University of Michigan policy barred harassment or discrimination based on race, ethnicity, religion, sex, sexual orientation, creed, national origin, ancestry, age, marital status, handicapped, or Vietnam-veteran status. It was declared unconstitutional because it violated First Amendment rights. University officials may rewrite policy or…

  11. The diagonalization of cubic matrices

    NASA Astrophysics Data System (ADS)

    Cocolicchio, D.; Viggiano, M.

    2000-08-01

    This paper is devoted to analysing the problem of the diagonalization of cubic matrices. We extend the familiar algebraic approach which is based on the Cardano formulae. We rewrite the complex roots of the associated resolvent secular equation in terms of transcendental functions and we derive the diagonalizing matrix.

  12. Fiche pratique: En chantant; Recrire un scenario; Pour en decoudre avec le subjonctif; Cri d'alerte (Practical Ideas: Singing; Rewriting a Scenario; to Unravel the Subjunctive; Cry of Alarm).

    ERIC Educational Resources Information Center

    Delbende, Jean-Christophe; And Others

    1996-01-01

    Four ideas for French language classroom activities are described: an exercise in listening to popular songs; a film scenario rewriting exercise; a technique for teaching the subjunctive mood; and a paired or small-group activity to enhance understanding of advertising. (MSE)

  13. Word Processing with the Elementary School Student--A Teaching and Learning Experience for Both Teachers and Students.

    ERIC Educational Resources Information Center

    Jacoby, Adrienne

    Using word processing in the elementary school writing curriculum is advantageous for both students and teachers. Word processors motivate students to spend more time on task, encourage changes and rewriting, and eliminate concern for neatness and the tedium of writing (and rewriting) by hand. Teachers can see that students using the word…

  14. A SCIENCE PROGRAM FOR THE ELEMENTARY SCHOOLS OF LOWER MERION SCHOOL DISTRICT.

    ERIC Educational Resources Information Center

    Lower Merion Township School District, Ardmore, PA.

    After an evaluation made by the teachers of kindergarten through grade 6, the following areas of clarification, rewriting, or additions were indicated: the purpose and use of the science guide, evaluation of the units by grades, additional materials for the units, a rewriting of particular units, health units for grades 1 through 5, the use of…

  15. A Prediction Error-driven Retrieval Procedure for Destabilizing and Rewriting Maladaptive Reward Memories in Hazardous Drinkers

    PubMed Central

    Das, Ravi K.; Gale, Grace; Hennessy, Vanessa; Kamboj, Sunjeev K.

    2018-01-01

    Maladaptive reward memories (MRMs) can become unstable following retrieval under certain conditions, allowing their modification by subsequent new learning. However, robust (well-rehearsed) and chronologically old MRMs, such as those underlying substance use disorders, do not destabilize easily when retrieved. A key determinant of memory destabilization during retrieval is prediction error (PE). We describe a retrieval procedure for alcohol MRMs in hazardous drinkers that specifically aims to maximize the generation of PE and therefore the likelihood of MRM destabilization. The procedure requires explicitly generating the expectancy of alcohol consumption and then violating this expectancy (withholding alcohol) following the presentation of a brief set of prototypical alcohol cue images (retrieval + PE). Control procedures involve presenting the same cue images but allowing alcohol to be consumed, generating minimal PE (retrieval-no PE), or generating PE without retrieval of alcohol MRMs, by presenting orange juice cues (no retrieval + PE). Subsequently, we describe a multisensory disgust-based counterconditioning procedure to probe MRM destabilization by rewriting alcohol cue-reward associations prior to reconsolidation. This procedure pairs alcohol cues with images invoking pathogen disgust and an extremely bitter-tasting solution (denatonium benzoate), generating gustatory disgust. Following retrieval + PE, but not no retrieval + PE or retrieval-no PE, counterconditioning produces evidence of MRM rewriting, as indexed by lasting reductions in alcohol cue valuation, attentional capture, and alcohol craving. PMID:29364255

  16. Ancient DNA and the rewriting of human history: be sparing with Occam's razor.

    PubMed

    Haber, Marc; Mezzavilla, Massimo; Xue, Yali; Tyler-Smith, Chris

    2016-01-11

    Ancient DNA research is revealing a human history far more complex than that inferred from parsimonious models based on modern DNA. Here, we review some of the key events in the peopling of the world in the light of the findings of work on ancient DNA.

  17. A kilobyte rewritable atomic memory

    NASA Astrophysics Data System (ADS)

    Kalff, Floris; Rebergen, Marnix; Fahrenfort, Nora; Girovsky, Jan; Toskovic, Ranko; Lado, Jose; FernáNdez-Rossier, JoaquíN.; Otte, Sander

    The ability to manipulate individual atoms by means of scanning tunneling microscopy (STM) opens up opportunities for storage of digital data on the atomic scale. Recent achievements in this direction include data storage based on bits encoded in the charge state, the magnetic state, or the local presence of single atoms or atomic assemblies. However, a key challenge at this stage is the extension of such technologies into large-scale rewritable bit arrays. We demonstrate a digital atomic-scale memory of up to 1 kilobyte (8000 bits) using an array of individual surface vacancies in a chlorine terminated Cu(100) surface. The chlorine vacancies are found to be stable at temperatures up to 77 K. The memory, crafted using scanning tunneling microscopy at low temperature, can be read and re-written automatically by means of atomic-scale markers, and offers an areal density of 502 Terabits per square inch, outperforming state-of-the-art hard disk drives by three orders of magnitude.

  18. Testing Linear Temporal Logic Formulae on Finite Execution Traces

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Norvig, Peter (Technical Monitor)

    2001-01-01

    We present an algorithm for efficiently testing Linear Temporal Logic (LTL) formulae on finite execution traces. The standard models of LTL are infinite traces, reflecting the behavior of reactive and concurrent systems which conceptually may be continuously alive. In most past applications of LTL, theorem provers and model checkers have been used to formally prove that down-scaled models satisfy such LTL specifications. Our goal is instead to use LTL for up-scaled testing of real software applications. Such tests correspond to analyzing the conformance of finite traces against LTL formulae. We first describe what it means for a finite trace to satisfy an LTL property. We then suggest an optimized algorithm based on transforming LTL formulae. The work is done using the Maude rewriting system, which turns out to provide a perfect notation and an efficient rewriting engine for performing these experiments.
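    Testing by transforming formulae can be sketched as formula progression: each state of the trace rewrites the formula into the obligation that remains. This is a generic Python illustration, not the paper's Maude implementation, and the tuple encoding of formulae is an assumption of the sketch:

    ```python
    # Hedged sketch of LTL formula progression over a finite trace.
    # Formulas are nested tuples: ("ap", name), ("not", f), ("and", f, g),
    # ("or", f, g), ("X", f), ("U", f, g), plus the booleans True/False.

    def progress(f, state):
        """Rewrite formula f against one state (a set of atomic propositions)."""
        if f is True or f is False:
            return f
        op = f[0]
        if op == "ap":
            return f[1] in state
        if op == "not":
            r = progress(f[1], state)
            return (not r) if isinstance(r, bool) else ("not", r)
        if op == "and":
            l, r = progress(f[1], state), progress(f[2], state)
            if l is False or r is False: return False
            if l is True: return r
            if r is True: return l
            return ("and", l, r)
        if op == "or":
            l, r = progress(f[1], state), progress(f[2], state)
            if l is True or r is True: return True
            if l is False: return r
            if r is False: return l
            return ("or", l, r)
        if op == "X":
            return f[1]
        if op == "U":  # f U g  ==  g or (f and X(f U g))
            return progress(("or", f[2], ("and", f[1], ("X", f))), state)
        raise ValueError(op)

    def check(formula, trace):
        """Test an LTL formula on a finite trace (a list of sets of propositions)."""
        for state in trace:
            formula = progress(formula, state)
            if isinstance(formula, bool):
                return formula
        return formula is True  # a pending obligation at trace end fails

    # "eventually b" is expressed as True U b
    eventually_b = ("U", True, ("ap", "b"))
    ```

    For example, `check(eventually_b, [{"a"}, {"a"}, {"b"}])` succeeds, while the same formula fails on a trace where `b` never holds.
    
    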

  19. Naphthalene based AIE active stimuli-responsive material as rewritable media for temporary communication

    NASA Astrophysics Data System (ADS)

    Pannipara, Mehboobali; Al-Sehemi, Abdullah G.; Kalam, Abul; Asiri, Abdullah M.

    2017-10-01

    Organic molecules having extended π-conjugated moieties are useful for creating 'dynamic' functional materials by modulating the photophysical properties and molecular packing through non-covalent interactions. Herein, we report the photoluminescence properties of a luminogen, NBA, exhibiting aggregation-induced emission (AIE) characteristics, synthesized by a Knoevenagel condensation reaction between 2-hydroxy naphthaldehyde and malononitrile. NBA emits strongly upon aggregation and in the solid state with a large Stokes shift, whereas it is non-emissive in pure solvents. The aggregation-induced emission behavior of the compound was studied in a DMSO (good solvent)-water (poor solvent) mixture with the water fraction (fw) ranging from 0% to 98%. The AIE property of the luminogen was further exploited for fabricating rewritable fluorescent paper substrates with applications in security printing and data storage, where the written images or letters stored on the filter paper are invisible under normal light.

  20. Israel & Jordan: Paving a Path for the Future through Understanding the Peoples and Cultures of the Middle East. Fulbright-Hays Summer Seminars Abroad, 1998 (Israel and Jordan).

    ERIC Educational Resources Information Center

    Moore, Ilene

    This curriculum project on the cultures of the Middle Eastern countries of Israel and Jordan stresses the language arts and focuses on objectives for elementary-age students to attain. The project states that children will: locate, list, identify, label, demonstrate, research, organize, compose, conference, rewrite, proofread, rewrite again,…

  1. Rewriting and Paraphrasing Source Texts in Second Language Writing

    ERIC Educational Resources Information Center

    Shi, Ling

    2012-01-01

    The present study is based on interviews with 48 students and 27 instructors in a North American university and explores whether students and professors across faculties share the same views on the use of paraphrased, summarized, and translated texts in four examples of L2 student writing. Participants' comments centered on whether the paraphrases…

  2. Revisiting and Rewriting Early Career Encounters: Reconstructing One "Identity Defining" Moment

    ERIC Educational Resources Information Center

    Yoo, Joanne

    2011-01-01

    There has been much research conducted into the effects of early career experiences on future practice. The research indicates that early career academics are particularly susceptible to burnout, as they are still developing their professional knowledge base, and are therefore more reliant on their theoretical knowledge or idealism to interpret…

  3. Superhydrophobic Surface With Shape Memory Micro/Nanostructure and Its Application in Rewritable Chip for Droplet Storage.

    PubMed

    Lv, Tong; Cheng, Zhongjun; Zhang, Dongjie; Zhang, Enshuang; Zhao, Qianlong; Liu, Yuyan; Jiang, Lei

    2016-09-21

    Recently, superhydrophobic surfaces with tunable wettability have attracted much attention. Notably, almost all present smart behaviors rely on varying surface chemistry on a static micro/nanostructure; obtaining a surface with a dynamically tunable micro/nanostructure, especially one that can memorize and retain different micro/nanostructures and the related wettabilities, is still a challenge. Herein, by creating micro/nanostructured arrays on a shape memory polymer, we report a superhydrophobic surface that has shape memory ability in changing and recovering its hierarchical structures and related wettabilities. Meanwhile, the surface was successfully used in a rewritable functional chip for droplet storage by designing microstructure-dependent patterns, which breaks through the limitation of current research that structure patterns cannot be reprogrammed. This article advances a superhydrophobic surface with shape memory hierarchical structure and its application in a rewritable functional chip, which could inspire fresh ideas for the development of smart superhydrophobic surfaces.

  4. Termination Proofs for String Rewriting Systems via Inverse Match-Bounds

    NASA Technical Reports Server (NTRS)

    Butler, Ricky (Technical Monitor); Geser, Alfons; Hofbauer, Dieter; Waldmann, Johannes

    2004-01-01

    Annotating a letter by a number, one can record information about its history during a reduction. A string rewriting system is called match-bounded if there is a global upper bound to these numbers. In earlier papers we established match-boundedness as a strong sufficient criterion for both termination and preservation of regular languages. We show now that string rewriting systems whose inverses (left and right hand sides exchanged) are match-bounded also have exceptional properties, but slightly different ones. Inverse match-bounded systems effectively preserve context-free languages; their sets of normalized strings and their sets of immortal strings are effectively regular. These sets of strings can be used to decide the normalization, the termination, and the uniform termination problems of inverse match-bounded systems. We also show that the termination problem is decidable in linear time, and that a certain strong reachability problem is decidable, thus solving two open problems of McNaughton's.
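    A hedged toy of the annotation idea (not the paper's decision procedure): each letter carries a height, and the letters produced by a rewrite get a height one more than the smallest height among the matched letters. For the one-rule system {ab -> ba}, heights stay globally bounded, which witnesses match-boundedness and hence termination:

    ```python
    # Toy illustration of match-height annotation for string rewriting.
    # A string is a list of (letter, height) pairs; rewriting the first
    # occurrence of lhs gives every new letter of rhs the height
    # 1 + min(heights of the matched letters).

    def rewrite_once(annotated, lhs, rhs):
        """Apply the first match of lhs -> rhs, or return None if no match."""
        letters = [c for c, _ in annotated]
        for i in range(len(letters) - len(lhs) + 1):
            if letters[i:i + len(lhs)] == list(lhs):
                h = 1 + min(height for _, height in annotated[i:i + len(lhs)])
                return annotated[:i] + [(c, h) for c in rhs] + annotated[i + len(lhs):]
        return None

    state = [(c, 0) for c in "abb"]
    state = rewrite_once(state, "ab", "ba")
    # -> [('b', 1), ('a', 1), ('b', 0)]
    ```

    Repeating `rewrite_once` on any start string for {ab -> ba} eventually stops, and the heights that appear never exceed a fixed bound, which is the match-bound for this system.
    
    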

  5. Generic strategies for chemical space exploration.

    PubMed

    Andersen, Jakob L; Flamm, Christoph; Merkle, Daniel; Stadler, Peter F

    2014-01-01

    The chemical universe of molecules reachable from a set of start compounds by iterative application of a finite number of reactions is usually so vast that sophisticated and efficient exploration strategies are required to cope with the combinatorial complexity. A stringent analysis of (bio)chemical reaction networks, as approximations of these complex chemical spaces, forms the foundation for the understanding of functional relations in Chemistry and Biology. Graphs and graph rewriting are natural models for molecules and reactions. Borrowing the idea of partial evaluation from functional programming, we introduce partial applications of rewrite rules. A framework for the specification of exploration strategies in graph-rewriting systems is presented. Using key examples of complex reaction networks from carbohydrate chemistry, we demonstrate the feasibility of this high-level strategy framework. While being designed for chemical applications, the framework can also be used to emulate higher-level transformation models, as illustrated by a small puzzle game.

  6. From "Somatic Scandals" to "A Constant Potential for Violence"? The Culture of Dissection, Brain-Based Learning, and the Rewriting/Rewiring of "The Child"

    ERIC Educational Resources Information Center

    Baker, Bernadette

    2015-01-01

    Within educational research across Europe and the US, one of the most rapidly traveling discourses and highly funded pursuits of the moment is brain-based learning (BBL). BBL is an approach to curriculum and pedagogical decision-making that is located within the new field of educational neuroscience. In some strands of BBL research the structure…

  7. A Project to Rewrite and Restructure the Competitive Events for the Distributive Education Clubs of America, Texas Association. Final Report.

    ERIC Educational Resources Information Center

    Speary, William A.

    A project is reported which accomplished the following objectives: (1) Developed greater awareness among high school distributive education teacher-coordinators and State and area staff toward the competency based concept as applied to the Texas DECA (Distributive Education Clubs of America) Association's competitive events program, (2) identified…

  8. Empowering Students to Write and Re-Write: Standards-Based Strategies for Middle and High School Teachers

    ERIC Educational Resources Information Center

    Combs, Warren E.

    2009-01-01

    In this book, the author provides teachers with detailed strategies and lesson plans, along with real student writing samples. He describes effective routines of formative self-assessment, and shows teachers how to form a professional learning team with their colleagues using the 6-session professional learning guide. Contents include: (1)…

  9. The symbolic computation and automatic analysis of trajectories

    NASA Technical Reports Server (NTRS)

    Grossman, Robert

    1991-01-01

    Research was generally done on computation of trajectories of dynamical systems, especially control systems. Algorithms were further developed for rewriting expressions involving differential operators. The differential operators involved arise in the local analysis of nonlinear control systems. An initial design was completed of the system architecture for software to analyze nonlinear control systems using data base computing.

  10. Leave No City Behind: England/United States Dialogue on Urban Education Reform

    ERIC Educational Resources Information Center

    Hannaway, Jane; Murphy, Marilyn; Reed, Jodie

    2004-01-01

    Both the United States and England initiated ambitious standards-based education reform to eliminate large gaps between their highest and lowest achievers. England appears to be ahead, having started in 1988 with a national curriculum, tests, and performance tables. The United States' No Child Left Behind Act began rewriting state rules in 2002…

  11. Automatic micropropagation of plants--the vision-system: graph rewriting as pattern recognition

    NASA Astrophysics Data System (ADS)

    Schwanke, Joerg; Megnet, Roland; Jensch, Peter F.

    1993-03-01

    The automation of plant-micropropagation is necessary to produce high amounts of biomass. Plants have to be dissected on particular cutting-points. A vision-system is needed for the recognition of the cutting-points on the plants. With this background, this contribution is directed to the underlying formalism to determine cutting-points on abstract-plant models. We show the usefulness of pattern recognition by graph-rewriting along with some examples in this context.

  12. Introduction to the Natural Anticipator and the Artificial Anticipator

    NASA Astrophysics Data System (ADS)

    Dubois, Daniel M.

    2010-11-01

    This short communication introduces the concept of the anticipator, one who anticipates, in the framework of computing anticipatory systems. The definition of anticipation deals with the concept of a program. Indeed, the word program comes from "pro-gram", meaning "to write before" by anticipation, and means a plan for the programming of a mechanism, or a sequence of coded instructions that can be inserted into a mechanism, or a sequence of coded instructions, such as genes or behavioural responses, that is part of an organism. Any natural or artificial program is thus related to anticipatory rewriting systems, as shown in this paper. All the cells in the body, and the neurons in the brain, are programmed by the anticipatory genetic code, DNA, in a low-level language with four signs. The programs in computers are also computing anticipatory systems. It will be shown, on the one hand, that the genetic code DNA is a natural anticipator. As demonstrated by Nobel laureate McClintock [8], genomes are programmed. The fundamental program deals with the DNA genetic code. The properties of DNA consist in self-replication and self-modification. The self-replicating process leads to reproduction of the species, while the self-modifying process leads to new species, or evolution and adaptation in existing ones. The genetic code DNA keeps its instructions in memory in the DNA coding molecule. The genetic code DNA is a rewriting system, from the DNA coding molecule to the DNA template molecule. The DNA template molecule is a rewriting system to the messenger RNA molecule. The information is not destroyed during the execution of the rewriting program. On the other hand, it will be demonstrated that the Turing machine is an artificial anticipator. The Turing machine is a rewriting system. The head reads and writes, modifying the content of the tape. The information is destroyed during the execution of the program. This is an irreversible process. The input data are lost.
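    The DNA-template-to-messenger-RNA step mentioned above is a particularly simple rewriting system, and one where the template is preserved. A toy sketch using only the standard base-pairing rules (nothing from the paper itself):

    ```python
    # Transcription as string rewriting: each template-strand base is
    # rewritten to its RNA complement. The template itself is untouched,
    # illustrating that this rewriting does not destroy information.

    TRANSCRIBE = {"A": "U", "T": "A", "G": "C", "C": "G"}

    def transcribe(template):
        """Rewrite a DNA template strand into its messenger RNA."""
        return "".join(TRANSCRIBE[base] for base in template)

    mrna = transcribe("TACGGT")  # -> "AUGCCA"
    ```

    A Turing machine step, by contrast, overwrites the tape cell under the head, which is why the communication calls that rewriting irreversible.
    
    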

  13. Novel scenarios of early animal evolution--is it time to rewrite textbooks?

    PubMed

    Dohrmann, Martin; Wörheide, Gert

    2013-09-01

    Understanding how important phenotypic, developmental, and genomic features of animals originated and evolved is essential for many fields of biological research, but such understanding depends on robust hypotheses about the phylogenetic interrelationships of the higher taxa to which the studied species belong. Molecular approaches to phylogenetics have proven able to revolutionize our knowledge of organismal evolution. However, with respect to the deepest splits in the metazoan Tree of Life-the relationships between Bilateria and the four non-bilaterian phyla (Porifera, Placozoa, Ctenophora, and Cnidaria)-no consensus has been reached yet, since a number of different, often contradictory, hypotheses with sometimes spectacular implications have been proposed in recent years. Here, we review the recent literature on the topic and contrast it with more classical perceptions based on analyses of morphological characters. We conclude that the time is not yet ripe to rewrite zoological textbooks and advocate a conservative approach when it comes to developing scenarios of the early evolution of animals.

  14. A graph grammar approach to artificial life.

    PubMed

    Kniemeyer, Ole; Buck-Sorlin, Gerhard H; Kurth, Winfried

    2004-01-01

    We present the high-level language of relational growth grammars (RGGs) as a formalism designed for the specification of ALife models. RGGs can be seen as an extension of the well-known parametric Lindenmayer systems and contain rule-based, procedural, and object-oriented features. They are defined as rewriting systems operating on graphs with the edges coming from a set of user-defined relations, whereas the nodes can be associated with objects. We demonstrate their ability to represent genes, regulatory networks of metabolites, and morphologically structured organisms, as well as developmental aspects of these entities, in a common formal framework. Mutation, crossing over, selection, and the dynamics of a network of gene regulation can all be represented with simple graph rewriting rules. This is demonstrated in some detail on the classical example of Dawkins' biomorphs and the ABC model of flower morphogenesis; other applications are briefly sketched. An interactive program was implemented, enabling the execution of the formalism and the visualization of the results.
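    Since RGGs extend parametric Lindenmayer systems, the plain rewriting core they build on fits in a few lines. This is a generic L-system toy, not the RGG implementation:

    ```python
    # Minimal non-parametric L-system: all symbols are rewritten in
    # parallel at each step, unlike sequential string rewriting.

    def lsystem(axiom, rules, steps):
        """Apply the rewrite rules to every symbol simultaneously, `steps` times."""
        s = axiom
        for _ in range(steps):
            s = "".join(rules.get(c, c) for c in s)
        return s

    # Lindenmayer's classic algae system: A -> AB, B -> A
    grown = lsystem("A", {"A": "AB", "B": "A"}, 4)  # -> "ABAABABA"
    ```

    The string lengths grow as Fibonacci numbers (1, 2, 3, 5, 8, ...), the classic signature of this system.
    
    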

  15. Proof of Concept for the Rewrite Rule Machine: Interensemble Studies

    DTIC Science & Technology

    1994-02-23

    [Figure 1: Concurrent Rewriting of Fibonacci Expressions] …exploit a problem's parallelism at several levels. We call this property multigrain concurrency; it makes the RRM very well suited for solving not only homogeneous problems, but also complex, locally homogeneous but… interprocessor message passing over a network is not well suited to data parallelism. A key goal of the RRM is to combine the best of these two approaches in a

  16. Rewritable Optical Storage with a Spiropyran Doped Liquid Crystal Polymer Film.

    PubMed

    Petriashvili, Gia; De Santo, Maria Penelope; Devadze, Lali; Zurabishvili, Tsisana; Sepashvili, Nino; Gary, Ramla; Barberi, Riccardo

    2016-03-01

    Rewritable optical storage has been obtained in spiropyran doped liquid crystal polymer films. Pictures can be recorded on the films upon irradiation with UV light passing through a grayscale mask, and they can be rapidly erased using visible light. The films present improved photosensitivity and optical contrast, good resistance to photofatigue, and high spatial resolution. These photochromic films work as a multifunctional, dynamic photosensitive material with a real-time image recording feature. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Video Editing System

    NASA Technical Reports Server (NTRS)

    Schlecht, Leslie E.; Kutler, Paul (Technical Monitor)

    1998-01-01

    This is a proposal for a general-use system, based on the SGI IRIS workstation platform, for recording computer animation to videotape. In addition, this system would provide features for simple editing and enhancement. Described here are a list of requirements for the system and a proposed configuration, including the SGI VideoLab Integrator, VideoMedia VLAN animation controller, and the Pioneer rewritable laserdisc recorder.

  18. Addressable configurations of DNA nanostructures for rewritable memory

    PubMed Central

    Levchenko, Oksana; Patel, Dhruv S.; MacIsaac, Molly

    2017-01-01

    DNA serves as nature's information storage molecule, and has been the primary focus of engineered systems for biological computing and data storage. Here we combine recent efforts in DNA self-assembly and toehold-mediated strand displacement to develop a rewritable multi-bit DNA memory system. The system operates by encoding information in distinct and reversible conformations of a DNA nanoswitch and decoding by gel electrophoresis. We demonstrate a 5-bit system capable of writing, erasing, and rewriting binary representations of alphanumeric symbols, as well as compatibility with ‘OR’ and ‘AND’ logic operations. Our strategy is simple to implement, requiring only a single mixing step at room temperature for each operation and standard gel electrophoresis to read the data. We envision such systems could find use in covert product labeling and barcoding, as well as secure messaging and authentication when combined with previously developed encryption strategies. Ultimately, this type of memory has exciting potential in biomedical sciences as data storage can be coupled to sensing of biological molecules. PMID:28977499
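    The arithmetic behind a 5-bit alphanumeric encoding is easy to sketch. The concrete bit assignment below is an assumption chosen for illustration; the actual system encodes the bits in nanoswitch conformations read out by gel electrophoresis:

    ```python
    # Hypothetical 5-bit encoding of letters: 'A'..'Z' map to 1..26,
    # which fit in 5 bits (2**5 = 32 distinct patterns).

    def encode_letter(ch):
        """Return the 5-bit pattern for a letter, most significant bit first."""
        n = ord(ch.upper()) - ord("A") + 1
        return [(n >> i) & 1 for i in range(4, -1, -1)]

    def decode_letter(bits):
        """Invert encode_letter: recover the letter from its 5 bits."""
        n = sum(b << i for i, b in zip(range(4, -1, -1), bits))
        return chr(n + ord("A") - 1)

    bits = encode_letter("D")  # -> [0, 0, 1, 0, 0]
    ```

    Erasing and rewriting then amount to flipping individual bits, which in the DNA system corresponds to toggling nanoswitch conformations by strand displacement.
    
    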

  19. Rewriting magnetic phase change memory by laser heating

    NASA Astrophysics Data System (ADS)

    Timmerwilke, John; Liou, Sy-Hwang; Cheng, Shu Fan; Edelstein, Alan S.

    2016-04-01

    Magnetic phase change memory (MAG PCM) consists of bits with different magnetic permeability values. The bits are read by measuring their effect on a magnetic probe field. Previously, low-permeability crystalline bits had been written in high-permeability amorphous films of Metglas via laser heating. Here, data is presented showing that previously crystallized regions can first be vitrified and then crystallized again by applying short laser pulses with the appropriate power. Thus, MAG PCM is rewritable. Technical issues in processing the bits are discussed and results on thermal modeling are presented.

  20. An Algebraic Approach to the Study and Optimization of the Set of Rules of a Conditional Rewrite System

    NASA Astrophysics Data System (ADS)

    Makhortov, S. D.

    2018-03-01

    An algebraic system containing the semantics of a set of rules of the conditional equational theory (or the conditional term rewriting system) is introduced. The following basic questions are considered for the given model: existence of logical closure, structure of logical closure, possibility of equivalent transformations, and construction of logical reduction. The obtained results can be applied to the analysis and automatic optimization of the corresponding set of rules. The basis for the given research is the theory of lattices and binary relations.

  1. Programs Visualize Earth and Space for Interactive Education

    NASA Technical Reports Server (NTRS)

    2014-01-01

    Kevin Hussey and others at the Jet Propulsion Laboratory produced web applications to visualize all of the spacecraft in orbit around Earth and in the Solar System. Hussey worked with Milwaukee, Wisconsin-based The Elumenati to rewrite the programs, and after licensing them, the company started offering a version that can be viewed on spheres and dome theaters for schools, museums, science centers, and other institutions.

  2. Re-Writing Interpersonal Communication: A Portfolio-Based Curriculum for Process Pedagogy and Moving Theory into Practice

    ERIC Educational Resources Information Center

    Cunningham, Summer; Bartesaghi, Mariaelena; Bowman, Jim; Bender, Jennifer

    2017-01-01

    How does one create a class where the theoretical concepts emerge through classroom practice and engagement? This is the question that Mariaelena posed to herself when taking over the position of Director of the Interpersonal Communication course at the University of South Florida. In this essay we describe how we worked through a new way of…

  3. Addressable configurations of DNA nanostructures for rewritable memory.

    PubMed

    Chandrasekaran, Arun Richard; Levchenko, Oksana; Patel, Dhruv S; MacIsaac, Molly; Halvorsen, Ken

    2017-11-02

    DNA serves as nature's information storage molecule, and has been the primary focus of engineered systems for biological computing and data storage. Here we combine recent efforts in DNA self-assembly and toehold-mediated strand displacement to develop a rewritable multi-bit DNA memory system. The system operates by encoding information in distinct and reversible conformations of a DNA nanoswitch and decoding by gel electrophoresis. We demonstrate a 5-bit system capable of writing, erasing, and rewriting binary representations of alphanumeric symbols, as well as compatibility with 'OR' and 'AND' logic operations. Our strategy is simple to implement, requiring only a single mixing step at room temperature for each operation and standard gel electrophoresis to read the data. We envision such systems could find use in covert product labeling and barcoding, as well as secure messaging and authentication when combined with previously developed encryption strategies. Ultimately, this type of memory has exciting potential in biomedical sciences as data storage can be coupled to sensing of biological molecules. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  4. A Multi-Addressable Dyad with Switchable CMY Colors for Full-Color Rewritable Papers.

    PubMed

    Qin, Tianyou; Han, Jiaqi; Geng, Yue; Ju, Le; Sheng, Lan; Zhang, Sean Xiao-An

    2018-06-23

    Reversible multicolor displays on solid media using single-molecule pigments have been a long-awaited goal. Herein, a new and simple molecular dyad, which can undergo switchable CMY color changes both in solution and on a solid substrate upon exposure to light, water/acid, and nucleophiles, is designed and synthesized. The stimuli used in this work can be applied independently of each other, which is beneficial for color changes without mutual interference. As a comparison, mixtures of the two molecular switching motifs forming the basis of the dyad were also studied. The dyad greatly outperforms the corresponding mixed system with respect to reversible color-switching on the paper substrate. Its potential for full-color rewritable paper with excellent reversibility has been demonstrated. Legible multicolor prints, with high color contrast and resolution, good dispersion, and excellent reversibility, were achieved using common water-jet and light-based printers. This work provides a very promising approach for the further development of full-color switchable molecules, materials, and displays. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Model Based Verification of Cyber Range Event Environments

    DTIC Science & Technology

    2015-12-10

    Suresh K. Damodaran, MIT Lincoln Laboratory, 244 Wood St., Lexington, MA, USA. …apply model based verification to cyber range event environment configurations, allowing for the early detection of errors in event environment… Environment Representation (CCER) ontology. We also provide an overview of a methodology to specify verification rules and the corresponding error

  6. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    40 CFR § 1066.250 Base inertia verification (Vehicle-Testing Procedures, Dynamometer Specifications). (a) Overview. This section describes how to verify the dynamometer's base inertia. (b) Scope and frequency…

  7. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    40 CFR § 1066.250 Base inertia verification (Vehicle-Testing Procedures, Dynamometer Specifications). (a) Overview. This section describes how to verify the dynamometer's base inertia. (b) Scope and frequency…

  8. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Base inertia verification. 1066.250... CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.250 Base inertia verification. (a) Overview. This section describes how to verify the dynamometer's base inertia. (b) Scope and frequency...

  9. Audio-visual imposture

    NASA Astrophysics Data System (ADS)

    Karam, Walid; Mokbel, Chafic; Greige, Hanna; Chollet, Gerard

    2006-05-01

    A GMM based audio visual speaker verification system is described, and an Active Appearance Model with a linear speaker transformation system is used to evaluate the robustness of the verification. An Active Appearance Model (AAM) is used to automatically locate and track a speaker's face in a video recording. A Gaussian Mixture Model (GMM) based classifier (BECARS) is used for face verification. GMM training and testing is accomplished on DCT-based features extracted from the detected faces. On the audio side, speech features are extracted and used for speaker verification with the GMM based classifier. Fusion of both audio and video modalities for audio visual speaker verification is compared with face verification and speaker verification systems. To improve the robustness of the multimodal biometric identity verification system, an audio visual imposture system is envisioned. It consists of an automatic voice transformation technique that an impostor may use to assume the identity of an authorized client. Features of the transformed voice are then combined with the corresponding appearance features and fed into the GMM based system BECARS for training. An attempt is made to increase the acceptance rate of the impostor and to analyze the robustness of the verification system. Experiments are being conducted on the BANCA database, with a prospect of experimenting on the PDAtabase newly developed within the scope of the SecurePhone project.

  10. An Evaluation of Stereoscopic Digital Mammography for Earlier Detection of Breast Cancer and Reduced Rate of Recall

    DTIC Science & Technology

    2004-08-01

    on a pair of high-resolution LCD medical monitors. The change to the new workstation has required us to rewrite the software... In the original CRT-based system, the two images forming a stereo pair were displayed alternately on the same CRT face, at a high frame rate (120 Hz...then, separately, receive the stereo screening exam on the research GE digital mammography unit.

  11. Optical testing of aspheres based on photochromic computer-generated holograms

    NASA Astrophysics Data System (ADS)

    Pariani, Giorgio; Bianco, Andrea; Bertarelli, Chiara; Spanó, Paolo; Molinari, Emilio

    2010-07-01

    Aspherical optics are widely used in modern optical telescopes and instrumentation because of their ability to reduce aberrations with a simple optical system. Testing their optical quality through null interferometry is not trivial, as reference optics are not available. Computer-Generated Holograms (CGHs) are efficient devices that make it possible to generate a well-defined optical wavefront. We developed rewritable CGHs based on photochromic layers for the interferometric testing of aspheres. These photochromic holograms are cost-effective, and their production method does not require any post-exposure processing.

  12. A new order-theoretic characterisation of the polytime computable functions☆

    PubMed Central

    Avanzini, Martin; Eguchi, Naohi; Moser, Georg

    2015-01-01

    We propose a new order-theoretic characterisation of the class of polytime computable functions. To this avail we define the small polynomial path order (sPOP⁎ for short). This termination order entails a new syntactic method to analyse the innermost runtime complexity of term rewrite systems fully automatically: for any rewrite system compatible with sPOP⁎ that employs recursion up to depth d, the (innermost) runtime complexity is polynomially bounded of degree d. This bound is tight. Thus we obtain a direct correspondence between a syntactic (and easily verifiable) condition of a program and the asymptotic worst-case complexity of the program. PMID:26412933
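    The kind of term rewrite system whose innermost runtime such orders bound can be illustrated with a toy example. The encoding below (Peano addition, whose single recursive rule has recursion depth 1 and hence a linear rewrite-step count) is an assumption for illustration, not the paper's sPOP⁎ formalism:

```python
# Innermost rewriting of Peano addition, counting rewrite steps.
# Rules:  add(0, y) -> y ;  add(s(x), y) -> s(add(x, y))

ZERO = ('0',)

def s(t):                      # successor constructor
    return ('s', t)

def add(x, y):                 # addition term constructor
    return ('add', x, y)

def rewrite_innermost(term, steps):
    """Normalize a term innermost-first; steps[0] counts rewrite steps."""
    if term[0] == 'add':
        x = rewrite_innermost(term[1], steps)   # reduce arguments first
        y = rewrite_innermost(term[2], steps)
        steps[0] += 1                           # one rule application
        if x == ZERO:
            return y
        return s(rewrite_innermost(add(x[1], y), steps))
    return term                                 # '0' and 's' are constructors

def peano(n):
    t = ZERO
    for _ in range(n):
        t = s(t)
    return t

counter = [0]
result = rewrite_innermost(add(peano(3), peano(2)), counter)
# result is the Peano numeral for 5; the step count grows linearly
# in the first argument (degree-1 polynomial, matching depth-1 recursion).
```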

  13. Rapidly Responsive and Flexible Chiral Nematic Cellulose Nanocrystal Composites as Multifunctional Rewritable Photonic Papers with Eco-Friendly Inks.

    PubMed

    Wan, Hao; Li, Xiaofeng; Zhang, Liang; Li, Xiaopeng; Liu, Pengfei; Jiang, Zhiguo; Yu, Zhong-Zhen

    2018-02-14

    Rapidly responsive and flexible photonic papers are manufactured by coassembly of cellulose nanocrystals (CNCs) and waterborne polyurethane (WPU) latex to take full advantage of the chiral nematic structure of CNCs and the flexibility of the WPU elastomer. The resulting CNC/WPU composite papers exhibit not only tunable iridescent colors by adjusting the helical pitch size, but also instant optical responses to water and wet gas, ascribed to the easy chain movement of the elastomeric WPU, which does not restrict the fast water-absorption-induced swelling of CNCs. By choosing water or NaCl aqueous solutions as inks, the colorful patterns on the CNC/WPU photonic paper can be made temporary, durable, or even disguisable. In addition, the photonic paper is simultaneously rewritable for all three types of patterns, and the disguisable patterns, which are invisible at normal times and show up under stimuli, exhibit a quick reveal conversion just by exhaling on the paper. The rewritability, rapid responsiveness, easy fabrication, and eco-friendly nature of the inks make the flexible photonic paper/ink combination highly promising in sensors, displays, and photonic circuits.

  14. Research on key technology of the verification system of steel rule based on vision measurement

    NASA Astrophysics Data System (ADS)

    Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun

    2018-01-01

    The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, suffers from low precision and low efficiency. A machine vision based verification system for steel rules is designed with reference to JJG1-1999, Verification Regulation of Steel Rule [1]. What differentiates this system is its new pixel-equivalent calibration method and its decontamination of the steel rule surface. Experiments show that these two methods fully meet the requirements of the verification system. Measuring results strongly prove that these methods not only meet the precision of the verification regulation, but also improve the reliability and efficiency of the verification system.

  15. Systematic Model-in-the-Loop Test of Embedded Control Systems

    NASA Astrophysics Data System (ADS)

    Krupp, Alexander; Müller, Wolfgang

    Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides a lack of methodical support for natural language formalization, there does not exist a standardized and accepted means for formal property definition as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.

  16. Inference for Transition Network Grammars,

    DTIC Science & Technology

    1976-01-01

    If the arc is followed ... a language L(G) is said to be structurally complete if ... The power of an augmented transition network (ATN) is ... each rewriting rule ... Clearly, a context-sensitive grammar can be represented as a context-free grammar plus a set of transformational rules ... are the foundations of grammars of different complexities ... as a CFG (base) and a set of transformational rules. The CSL is obtained by applying ...

  17. Resistive switching effect in the planar structure of all-printed, flexible and rewritable memory device based on advanced 2D nanocomposite of graphene quantum dots and white graphene flakes

    NASA Astrophysics Data System (ADS)

    Muqeet Rehman, Muhammad; Uddin Siddiqui, Ghayas; Kim, Sowon; Choi, Kyung Hyun

    2017-08-01

    Pursuit of the most appropriate materials and fabrication methods is essential for developing a reliable, rewritable and flexible memory device. In this study, we have proposed an advanced 2D nanocomposite of white graphene (hBN) flakes embedded with graphene quantum dots (GQDs) as the functional layer of a flexible memory device owing to their unique electrical, chemical and mechanical properties. Unlike the typical sandwich type structure of a memory device, we developed a cost effective planar structure, to simplify device fabrication and prevent sneak current. The entire device fabrication was carried out using printing technology followed by encapsulation in an atomically thin layer of aluminum oxide (Al2O3) for protection against environmental humidity. The proposed memory device exhibited attractive bipolar switching characteristics of high switching ratio, large electrical endurance and enhanced lifetime, without any crosstalk between adjacent memory cells. The as-fabricated device showed excellent durability for several bending cycles at various bending diameters without any degradation in bistable resistive states. The memory mechanism was deduced to be conductive filamentary; this was validated by illustrating the temperature dependence of bistable resistive states. Our obtained results pave the way for the execution of promising 2D material based next generation flexible and non-volatile memory (NVM) applications.

  18. The rewritable effects of bonded magnet for large starting torque and high efficiency in the small power single-phase written pole motor

    NASA Astrophysics Data System (ADS)

    Choi, Jae-Hak; Lee, Sung-Ho

    2009-04-01

    This paper presents a single-phase written pole motor using a bonded ring magnet for small power home applications. The motor has an exciter pole structure inside the stator and hybrid characteristics of an induction motor and a permanent magnet motor. The design parameters and operating characteristics of the hybrid concept motor are investigated to increase starting torque and efficiency, which are most important for small power home applications. Larger starting torque and higher efficiency than those of a conventional induction motor could be obtained by exploiting the rewritable characteristics of the bonded magnet under starting and running conditions.

  19. Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers

    NASA Technical Reports Server (NTRS)

    Levitt, K. N.; Schwartz, R.; Hare, D.; Moore, J. S.; Melliar-Smith, P. M.; Shostak, R. E.; Boyer, R. S.; Green, M. W.; Elliott, W. D.

    1983-01-01

    A number of methodologies for verifying systems and computer based tools that assist users in verifying their systems were developed. These tools were applied to verify in part the SIFT ultrareliable aircraft computer. Topics covered included: STP theorem prover; design verification of SIFT; high level language code verification; assembly language level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.

  20. Adaptive implicit-explicit and parallel element-by-element iteration schemes

    NASA Technical Reports Server (NTRS)

    Tezduyar, T. E.; Liou, J.; Nguyen, T.; Poole, S.

    1989-01-01

    Adaptive implicit-explicit (AIE) and grouped element-by-element (GEBE) iteration schemes are presented for the finite element solution of large-scale problems in computational mechanics and physics. The AIE approach is based on the dynamic arrangement of the elements into differently treated groups. The GEBE procedure, which is a way of rewriting the EBE formulation to make its parallel processing potential and implementation more clear, is based on the static arrangement of the elements into groups with no inter-element coupling within each group. Various numerical tests performed demonstrate the savings in the CPU time and memory.
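    The static grouping at the heart of GEBE can be sketched in a few lines. The following greedy grouping is an illustration only (not the authors' implementation); the toy 1D mesh and the `group_elements` helper are assumptions for the example:

```python
# GEBE-style static grouping: partition elements so that no two elements
# in the same group share a node, removing inter-element coupling within
# a group so element-level operations in a group can run in parallel.

def group_elements(elements):
    """elements: list of node-index tuples; returns lists of element ids,
    each list having no shared node among its members (greedy coloring)."""
    groups = []                      # each entry: (nodes_used, element_ids)
    for eid, nodes in enumerate(elements):
        nodeset = set(nodes)
        for used, members in groups:
            if used.isdisjoint(nodeset):   # no coupling with this group
                used |= nodeset
                members.append(eid)
                break
        else:
            groups.append((nodeset, [eid]))
    return [members for _, members in groups]

# 1D chain of two-node elements: classic red-black style grouping emerges.
mesh = [(0, 1), (1, 2), (2, 3), (3, 4)]
print(group_elements(mesh))   # -> [[0, 2], [1, 3]]
```

Within each resulting group, the element contributions touch disjoint nodes, which is what makes the element-by-element updates trivially parallelizable.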

  1. New developments in optical phase-change memory

    NASA Astrophysics Data System (ADS)

    Ovshinsky, Stanford R.; Czubatyj, Wolodymyr

    2001-02-01

    Phase change technology has progressed from the original invention of Ovshinsky to become the leading choice for rewritable optical disks. ECD's early work in phase change materials and methods for operating in a direct overwrite fashion were crucial to the successes that have been achieved. Since the introduction of the first rewritable phase change products in 1991, the market has expanded from CD-RW into rewritable DVD with creative work going on worldwide. Phase change technology is ideally suited to address the continuous demand for increased storage capacity. First, laser beams can be focused to ever-smaller spot sizes using shorter wavelength lasers and higher performance optics. Blue lasers are now commercially viable and high numerical aperture and near field lenses have been demonstrated. Second, multilevel approaches can be used to increase capacity by a factor of three or more with concomitant increases in data transfer rate. In addition, ECD has decreased manufacturing costs through the use of innovative production technology. These factors combine to accelerate the widespread use of phase change technology. As in all our technologies, such as thin film photovoltaics, nickel metal hydride batteries, hydrogen storage systems, fuel cells, electrical memory, etc., we have invented the materials, the products, the production machines and the production processes for high rate, low-cost manufacture.

  2. The capability of lithography simulation based on MVM-SEM® system

    NASA Astrophysics Data System (ADS)

    Yoshikawa, Shingo; Fujii, Nobuaki; Kanno, Koichi; Imai, Hidemichi; Hayano, Katsuya; Miyashita, Hiroyuki; Shida, Soichi; Murakawa, Tsutomu; Kuribara, Masayuki; Matsumoto, Jun; Nakamura, Takayuki; Matsushita, Shohei; Hara, Daisuke; Pang, Linyong

    2015-10-01

    Lithography at the 1X nm technology node uses SMO-ILT, NTD, or other complex patterns. In mask defect inspection, defect verification therefore becomes more difficult, because many nuisance defects are detected in aggressive mask features. One key technology of mask manufacture is defect verification using an aerial image simulator or other printability simulation. AIMS™ technology correlates excellently with the wafer and is the standard tool for defect verification; however, it is impractical for verifying a hundred defects or more. We previously reported the capability of defect verification based on lithography simulation with a SEM system, whose architecture and software show excellent correlation for simple line-and-space patterns [1]. In this paper, we use a next-generation SEM system combined with a lithography simulation tool for SMO-ILT, NTD, and other complex-pattern lithography. Furthermore, we will use three-dimensional (3D) lithography simulation based on the Multi Vision Metrology SEM system. Finally, we will confirm the performance of the 2D and 3D lithography simulation based on the SEM system for photomask verification.

  3. Dynamic Information Management and Exchange for Command and Control Applications, Modelling and Enforcing Category-Based Access Control via Term Rewriting

    DTIC Science & Technology

    2015-03-01

    a hotel and a hospital. 2. Event handler for emergency policies (item 2 above): this has been implemented in two UG projects; one project developed a...Workshop on Logical and Semantic Frameworks, with Applications, Brasilia, Brazil, September 2014. Electronic Notes in Theoretical Computer Science (to...Brasilia, Brazil, September 2014, 2015. [3] S. Barker. The next 700 access control models or a unifying meta-model? In SACMAT 2009, 14th ACM Symposium on

  4. Experience with the CAIS

    NASA Technical Reports Server (NTRS)

    Tighe, Michael F.

    1986-01-01

    Intermetrics' experience is that the Ada package construct, which allows separation of specification and implementation, allows specification of a CAIS that is transportable across varying hardware and software bases. Additionally, the CAIS is an excellent basis for providing operating system functionality to Ada applications. By allowing the Byron APSE to be moved easily from system to system without requiring significant re-writes of underlying code, Ada and the CAIS provide portability as well as transparency to change at the application operating system interface level.

  5. DFM flow by using combination between design based metrology system and model based verification at sub-50nm memory device

    NASA Astrophysics Data System (ADS)

    Kim, Cheol-kyun; Kim, Jungchan; Choi, Jaeseung; Yang, Hyunjo; Yim, Donggyu; Kim, Jinwoong

    2007-03-01

    As the minimum transistor length shrinks, the variation and uniformity of transistor length seriously affect device performance, so the importance of optical proximity correction (OPC) and resolution enhancement technology (RET) cannot be overemphasized. However, the OPC process is regarded by some as a necessary evil for device performance. In fact, every group, including process and design, is interested in the whole-chip CD variation trend and CD uniformity, which represent the real wafer. Recently, design based metrology systems have become capable of detecting differences between the data base and the wafer SEM image, and are able to extract information on whole-chip CD variation. According to the results, OPC abnormalities were identified and design feedback items were also disclosed. Other approaches are pursued by EDA companies, such as model based OPC verification. Model based verification is done for the full chip area by using a well-calibrated model. The object of model based verification is the prediction of potential weak points on the wafer and fast feedback to OPC and design before reticle fabrication. In order to achieve a robust design and sufficient device margin, an appropriate combination of a design based metrology system and model based verification tools is very important. Therefore, we evaluated a design based metrology system and matched it with a model based verification system to find the optimum combination of the two. In our study, a huge amount of data from wafer results is classified and analyzed by statistical methods and sorted into OPC feedback and design feedback items. Additionally, a novel DFM flow is proposed using the combination of design based metrology and model based verification tools.

  6. A methodology for the rigorous verification of plasma simulation codes

    NASA Astrophysics Data System (ADS)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: the verification, a mathematical issue targeted at assessing that the physical model is correctly solved, and the validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on the verification, which in turn is composed of the code verification, targeted at assessing that a physical model is correctly implemented in a simulation code, and the solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
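    Richardson extrapolation, named above for solution verification, can be illustrated with a toy calculation. This sketch is not tied to GBS; the sampled values below are synthetic data from an assumed second-order scheme:

```python
# Richardson extrapolation: from solutions on three grids with refinement
# ratio r, estimate the observed order of convergence p and an extrapolated
# "exact" value, which together quantify the numerical error of a simulation.
import math

def richardson(f_coarse, f_mid, f_fine, r):
    p = math.log((f_coarse - f_mid) / (f_mid - f_fine)) / math.log(r)
    f_exact = f_fine + (f_fine - f_mid) / (r**p - 1)
    return p, f_exact

# synthetic data: f(h) = 1.0 + 0.5*h**2 sampled at h = 0.4, 0.2, 0.1
p, f_exact = richardson(1.0 + 0.5 * 0.16,   # h = 0.4
                        1.0 + 0.5 * 0.04,   # h = 0.2
                        1.0 + 0.5 * 0.01,   # h = 0.1
                        r=2)
# p recovers the scheme's order (2) and f_exact the h -> 0 limit (1.0)
```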

  7. Nonvolatile gate effect in a ferroelectric-semiconductor quantum well.

    PubMed

    Stolichnov, Igor; Colla, Enrico; Setter, Nava; Wojciechowski, Tomasz; Janik, Elzbieta; Karczewski, Grzegorz

    2006-12-15

    Field effect transistors with ferroelectric gates would make ideal rewritable nonvolatile memories were it not for the severe problems in integrating the ferroelectric oxide directly on the semiconductor channel. We propose a powerful way to avoid these problems using a gate material that is ferroelectric and semiconducting simultaneously. First, ferroelectricity in semiconductor (Cd,Zn)Te films is proven and studied using modified piezoforce scanning probe microscopy. Then, a rewritable field effect device is demonstrated by local poling of the (Cd,Zn)Te layer of a (Cd,Zn)Te/CdTe quantum well, provoking a reversible, nonvolatile change in the resistance of the 2D electron gas. The results point to a potential new family of nanoscale one-transistor memories.

  8. CD volume design and verification

    NASA Technical Reports Server (NTRS)

    Li, Y. P.; Hughes, J. S.

    1993-01-01

    In this paper, we describe a prototype for CD-ROM volume design and verification. This prototype allows users to create their own model of CD volumes by modifying a prototypical model. Rule-based verification of the test volumes can then be performed against the volume definition. This working prototype has proven the concept of model-driven, rule-based design and verification for large quantities of data. The model defined for the CD-ROM volumes becomes a data model as well as an executable specification.
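    Model-driven, rule-based verification of this kind can be sketched in miniature. The field names and rules below are hypothetical, invented for illustration rather than taken from the prototype:

```python
# A volume "model" as an executable specification: a dict mapping field
# names to rules (predicates). Verification checks a test volume against it.

VOLUME_MODEL = {
    'volume_id':  lambda v: isinstance(v, str) and len(v) <= 32,
    'block_size': lambda v: v in (512, 1024, 2048),
    'file_count': lambda v: isinstance(v, int) and v >= 0,
}

def verify(volume):
    """Return a list of rule violations; an empty list means conformance."""
    errors = []
    for field, rule in VOLUME_MODEL.items():
        if field not in volume:
            errors.append(f'missing field: {field}')
        elif not rule(volume[field]):
            errors.append(f'rule violated: {field}={volume[field]!r}')
    return errors

good = {'volume_id': 'TEST_VOL_01', 'block_size': 2048, 'file_count': 120}
bad  = {'volume_id': 'TEST_VOL_02', 'block_size': 4096}
print(verify(good))   # -> []
print(verify(bad))    # -> ['rule violated: block_size=4096', 'missing field: file_count']
```

Because the model is data rather than code, users can derive their own volume definitions by editing the prototypical model, which is the essence of the model-driven approach described above.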

  9. Understanding Patchy Landscape Dynamics: Towards a Landscape Language

    PubMed Central

    Gaucherel, Cédric; Boudon, Frédéric; Houet, Thomas; Castets, Mathieu; Godin, Christophe

    2012-01-01

    Patchy landscapes driven by human decisions and/or natural forces are still a challenge to be understood and modelled. No attempt has been made up to now to describe them by a coherent framework and to formalize landscape changing rules. Overcoming this lacuna was our first objective here, and this was largely based on the notion of Rewriting Systems, also called Formal Grammars. We used complicated scenarios of agricultural dynamics to model landscapes and to write their corresponding driving rule equations. Our second objective was to illustrate the relevance of this landscape language concept for landscape modelling through various grassland managements, with the final aim to assess their respective impacts on biological conservation. For this purpose, we made the assumptions that a higher grassland appearance frequency and higher land cover connectivity are favourable to species conservation. Ecological results revealed that dairy and beef livestock production systems are more favourable to wild species than is hog farming, although in different ways. Methodological results allowed us to efficiently model and formalize these landscape dynamics. This study demonstrates the applicability of the Rewriting System framework to the modelling of agricultural landscapes and, hopefully, to other patchy landscapes. The newly defined grammar is able to explain changes that are neither necessarily local nor Markovian, and opens a way to analytical modelling of landscape dynamics. PMID:23049935
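    The string-rewriting flavor of such Formal Grammars can be sketched with hypothetical land-cover rules; the labels and rules below are invented for illustration and are not the paper's grammar:

```python
# A landscape as a string of patch labels ('C' crops, 'G' grassland,
# 'F' forest), with rewriting rules that depend on a patch's neighbourhood
# rather than on the patch alone, i.e. changes that are not purely local.
import re

RULES = [
    (r'CG', 'CC'),   # crops expand into adjacent grassland
    (r'GF', 'FF'),   # forest reclaims adjacent grassland
]

def step(landscape):
    """Apply every rule once, left to right, over the whole string."""
    for pattern, replacement in RULES:
        landscape = re.sub(pattern, replacement, landscape)
    return landscape

land = 'CGGGF'
for _ in range(3):
    land = step(land)
# trajectory: 'CGGGF' -> 'CCGFF' -> 'CCCFF' -> 'CCCFF' (fixed point)
```

Iterating the rules yields a landscape trajectory whose fixed points and transient dynamics can then be analysed, which is the sense in which a rewriting system gives an analytical handle on landscape change.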

  10. Towards rewritable multilevel optical data storage in single nanocrystals.

    PubMed

    Riesen, Nicolas; Pan, Xuanzhao; Badek, Kate; Ruan, Yinlan; Monro, Tanya M; Zhao, Jiangbo; Ebendorff-Heidepriem, Heike; Riesen, Hans

    2018-04-30

    Novel approaches for digital data storage are imperative, as storage capacities are drastically being outpaced by the exponential growth in data generation. Optical data storage represents the most promising alternative to traditional magnetic and solid-state data storage. In this paper, a novel and energy efficient approach to optical data storage using rare-earth ion doped inorganic insulators is demonstrated. In particular, the nanocrystalline alkaline earth halide BaFCl:Sm is shown to provide great potential for multilevel optical data storage. Proof-of-concept demonstrations reveal for the first time that these phosphors could be used for rewritable, multilevel optical data storage on the physical dimensions of a single nanocrystal. Multilevel information storage is based on the very efficient and reversible conversion of Sm3+ to Sm2+ ions upon exposure to UV-C light. The stored information is then read out using confocal optics by employing the photoluminescence of the Sm2+ ions in the nanocrystals, with the signal strength depending on the UV-C fluence used during the write step. The latter serves as the mechanism for multilevel data storage in the individual nanocrystals, as demonstrated in this paper. This data storage platform has the potential to be extended to 2D and 3D memory for storage densities that could potentially approach petabyte/cm3 levels.

  11. Functionalized Graphitic Carbon Nitride for Metal-free, Flexible and Rewritable Nonvolatile Memory Device via Direct Laser-Writing

    NASA Astrophysics Data System (ADS)

    Zhao, Fei; Cheng, Huhu; Hu, Yue; Song, Long; Zhang, Zhipan; Jiang, Lan; Qu, Liangti

    2014-07-01

    Graphitic carbon nitride nanosheets (g-C3N4-NSs) have a layered structure similar to that of graphene nanosheets and present unusual physicochemical properties due to their s-triazine fragments, but their electronic and electrochemical applications are limited by their relatively poor conductivity. The current work provides the first example in which atomically thick g-C3N4-NSs serve as the active insulator layer, with tunable conductivity, for achieving high performance memory devices with electrical bistability. Unlike conventional memory diodes, the g-C3N4-NS based devices, combined with graphene layer electrodes, are flexible, metal-free, and low cost. The functionalized g-C3N4-NSs exhibit desirable dispersibility and dielectricity, which support the all-solution fabrication and high performance of the memory diodes. Moreover, the flexible memory diodes are conveniently fabricated through a fast laser-writing process on a graphene oxide/g-C3N4-NSs/graphene oxide thin film. The obtained devices not only have nonvolatile electrical bistability with great retention and endurance, but also show a rewritable memory effect with a reliable ON/OFF ratio of up to 10^5, which is the highest among all metal-free flexible memory diodes reported so far, and even higher than those of metal-containing devices.

  12. Reducing software security risk through an integrated approach research initiative model based verification of the Secure Socket Layer (SSL) Protocol

    NASA Technical Reports Server (NTRS)

    Powell, John D.

    2003-01-01

    This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk (RSSR) Through an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level, and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California Institute of Technology), where the research is being carried out.

  13. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    NASA Astrophysics Data System (ADS)

    Moon, Daesung; Chung, Yongwha; Pan, Sung Bum; Park, Jin-Won

    2009-12-01

    As VLSI technology has improved, smart cards employing 32-bit processors have been released, and more personal information, such as medical and financial data, can be stored in the card. Thus, it becomes important to protect the personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment, differing in how the modules of fingerprint verification are distributed between the smart card and the card reader. We first evaluate the number of instructions in each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms that guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.

  14. Simulation environment based on the Universal Verification Methodology

    NASA Astrophysics Data System (ADS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). Progress is measured by coverage monitors added to the simulation environment. In this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.

  15. A Phase-Based Approach to Satellite Constellation Analysis and Design

    DTIC Science & Technology

    1991-01-01

    and φp is a phase angle representing true anomaly, as measured from the line of nodes. For a spherical earth, the orbital parameters are related...Var OutDat : Array[1..2,1..90] of Real; { Output data for cost versus optimization parameter } F : Text; { Output file } Y, DY : Vec2; { Y is a point on...InitGraph(Gd, Gm, 'graph'); Assign(F, 'c:\matlab\OutDat'); Rewrite(F); With Common do With Target do With LoopParm do With Constellation do

  16. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions, which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
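As a loose illustration of the Horn-clause idea (an assumed toy encoding, not SeaHorn's actual intermediate language), the verification condition of a small loop can be phrased as Horn clauses and solved by least-fixpoint iteration over a finite domain.

```python
# Toy illustration: the loop
#     x = 0; while x < 10: x += 2
# yields the Horn clauses  Inv(0)  and  Inv(x) & x < 10 -> Inv(x + 2),
# and the safety property asks that  Inv(x) & x >= 10 -> x == 10.
# Over a small finite domain the clauses can be solved by fixpoint iteration.

def solve_inv(limit=10, step=2):
    inv = {0}                         # fact clause: Inv(0)
    changed = True
    while changed:                    # least-fixpoint iteration
        changed = False
        for x in list(inv):
            if x < limit and x + step not in inv:   # rule clause fires
                inv.add(x + step)
                changed = True
    return inv

inv = solve_inv()
# safety check encoded from the property clause
assert all(x == 10 for x in inv if x >= 10)
print(sorted(inv))   # [0, 2, 4, 6, 8, 10]
```

A real Horn-clause back end reasons symbolically over infinite domains, but the structure of the encoding (facts, rules, and a queried property) is the same.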

  17. 40 CFR 1065.550 - Gas analyzer range verification and drift verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... with a CLD and the removed water is corrected based on measured CO2, CO, THC, and NOX concentrations... concentration subcomponents (e.g., THC and CH4 for NMHC) separately. For example, for NMHC measurements, perform drift verification on NMHC; do not verify THC and CH4 separately. (2) Drift verification requires two...

  18. Verification of Autonomous Systems for Space Applications

    NASA Technical Reports Server (NTRS)

    Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.

    2006-01-01

    Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems, since they are mostly based on testing, which implies a very limited exploration of their behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.

  19. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    NASA Astrophysics Data System (ADS)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.

  20. Glove-based approach to online signature verification.

    PubMed

    Kamel, Nidal S; Sayeed, Shohel; Ellis, Grant A

    2008-06-01

    Utilizing the multiple degrees of freedom offered by the data glove for each finger and the hand, a novel online signature verification system using the Singular Value Decomposition (SVD) numerical tool for signature classification and verification is presented. The proposed technique uses the Singular Value Decomposition to find the r singular vectors that sense the maximal energy of the glove data matrix A, called the principal subspace, so that the effective dimensionality of A can be reduced. Having modeled the data glove signature through its r-principal subspace, signature authentication is performed by finding the angles between the different subspaces. A demonstration of the data glove is presented as an effective high-bandwidth data entry device for signature verification. This SVD-based signature verification technique is tested and its performance is shown to be able to recognize forged signatures with a false acceptance rate of less than 1.2%.
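The principal-subspace comparison described above can be sketched with NumPy. The rank r, the synthetic data, and the comparison itself are illustrative assumptions, not the authors' parameters.

```python
import numpy as np

# Sketch of the SVD-subspace idea: each signature is a glove-data matrix A
# (samples x sensors); its r leading right singular vectors span a principal
# subspace, and two signatures are compared via the principal angles between
# their subspaces.

def principal_subspace(A, r=2):
    # columns of the result capture the directions of maximal energy of A
    _, _, Vt = np.linalg.svd(A, full_matrices=False)
    return Vt[:r].T                       # shape (sensors, r), orthonormal

def subspace_angle(U1, U2):
    # smallest principal angle: singular values of U1^T U2 are the cosines
    s = np.linalg.svd(U1.T @ U2, compute_uv=False)
    return float(np.arccos(np.clip(s.max(), -1.0, 1.0)))

rng = np.random.default_rng(0)
signal = rng.normal(size=(100, 2)) @ rng.normal(size=(2, 6))   # rank-2 dynamics
base = signal + 0.05 * rng.normal(size=signal.shape)
genuine = signal + 0.05 * rng.normal(size=signal.shape)   # same underlying motion
forgery = rng.normal(size=(100, 6))                        # unrelated dynamics

U = principal_subspace(base)
print(subspace_angle(U, principal_subspace(genuine)))   # typically small
print(subspace_angle(U, principal_subspace(forgery)))   # typically larger
```

Authentication then reduces to thresholding the angle: a genuine repetition of the signature stays close to the enrolled subspace, while unrelated glove dynamics do not.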

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSDF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, SEPTITECH, INC. MODEL 400 SYSTEM - 02/04/WQPC-SWP

    EPA Science Inventory

    Verification testing of the SeptiTech Model 400 System was conducted over a twelve-month period at the Massachusetts Alternative Septic System Test Center (MASSTC), located at the Otis Air National Guard Base in Bourne, MA. Sanitary sewerage from the base residential housing was u...

  2. What Works for Me.

    ERIC Educational Resources Information Center

    Wolf, Lori; And Others

    1994-01-01

    Offers 11 classroom tips from teachers for a variety of activities, including fictional movie reviews, haiku writing, questions to develop student journals, handouts, rewriting stories, and a "dirty trick" to get better research topics. (SR)

  3. IMPROVING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATIONS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) began the Environmental Technology Verification (ETV) Program in 1995 as a means of working with the private sector to establish a market-based verification process available to all environmental technologies. Under EPA's Office of R...

  4. Gender Verification of Female Olympic Athletes.

    ERIC Educational Resources Information Center

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  5. An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices

    PubMed Central

    Li, Jingzhen; Liu, Yuhang; Nie, Zedong; Qin, Wenjian; Pang, Zengyao; Wang, Lei

    2017-01-01

    In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of a volunteer’s forearm is measured by a vector network analyzer (VNA). Specifically, in order to determine the chosen frequency for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 sample data. In addition, to achieve rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 sample data. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the chosen frequency for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and the naive Bayesian method (NBM), respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices. PMID:28075375
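A hedged sketch of what a threshold-adaptive template matcher over a weighted Euclidean distance might look like; the inverse-variance weighting and the threshold rule here are invented for illustration and are not the paper's exact formulation.

```python
import math

# Toy threshold-adaptive template matching (TATM-style) verification:
# enrollment builds a per-user template and an adaptive threshold; verification
# accepts a probe whose weighted Euclidean distance stays under the threshold.

def weighted_euclidean(x, template, weights):
    return math.sqrt(sum(w * (a - b) ** 2
                         for w, a, b in zip(weights, x, template)))

def enroll(samples):
    """Build a template (feature means) and weights (inverse variances)."""
    n, dim = len(samples), len(samples[0])
    mean = [sum(s[i] for s in samples) / n for i in range(dim)]
    var = [sum((s[i] - mean[i]) ** 2 for s in samples) / n for i in range(dim)]
    weights = [1.0 / (v + 1e-9) for v in var]   # stable features weigh more
    # adaptive threshold: worst genuine enrollment distance plus a margin
    dists = [weighted_euclidean(s, mean, weights) for s in samples]
    threshold = max(dists) * 1.2
    return mean, weights, threshold

def verify(x, mean, weights, threshold):
    return weighted_euclidean(x, mean, weights) <= threshold

enrolled = [[1.0, 2.0, 3.0], [1.1, 2.1, 2.9], [0.9, 1.9, 3.1]]
mean, w, th = enroll(enrolled)
print(verify([1.05, 2.0, 3.0], mean, w, th))   # prints True  (genuine-like probe)
print(verify([5.0, 9.0, 0.0], mean, w, th))    # prints False (impostor-like probe)
```

Because matching is a single distance computation against a stored template, this style of verifier is cheap enough for the wearable-device setting the abstract targets.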

  6. An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices.

    PubMed

    Li, Jingzhen; Liu, Yuhang; Nie, Zedong; Qin, Wenjian; Pang, Zengyao; Wang, Lei

    2017-01-10

    In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of a volunteer's forearm is measured by a vector network analyzer (VNA). Specifically, in order to determine the chosen frequency for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 sample data. In addition, to achieve rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 sample data. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the chosen frequency for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and the naive Bayesian method (NBM), respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices.

  7. Experiencing "Macbeth": From Text Rendering to Multicultural Performance.

    ERIC Educational Resources Information Center

    Reisin, Gail

    1993-01-01

    Shows how one teacher used innovative methods in teaching William Shakespeare's "Macbeth." Outlines student assignments including text renderings, rewriting a scene from the play, and creating a multicultural scrapbook for the play. (HB)

  8. Electronic paper rewrites the rulebook for displays

    NASA Astrophysics Data System (ADS)

    Graham-Rowe, Duncan

    2007-05-01

    Following years of development, electronic paper is now entering ebooks, mobile phones and signs, and, as Duncan Graham-Rowe reports, is starting to gain the market acceptance that it has long strived for.

  9. 50 CFR 600.507 - Recordkeeping.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... indelible ink, with corrections to be accomplished by lining out and rewriting, rather than erasure. (i) Alternative log formats. As an alternative to the use of the specific formats provided, a Nation may submit a...

  10. 50 CFR 600.507 - Recordkeeping.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... indelible ink, with corrections to be accomplished by lining out and rewriting, rather than erasure. (i) Alternative log formats. As an alternative to the use of the specific formats provided, a Nation may submit a...

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, F. R. MAHONEY & ASSOC., AMPHIDROME SYSTEM FOR SINGLE FAMILY HOMES - 02/05/WQPC-SWP

    EPA Science Inventory

    Verification testing of the F.R. Mahoney Amphidrome System was conducted over a twelve-month period at the Massachusetts Alternative Septic System Test Center (MASSTC), located at the Otis Air National Guard Base in Bourne, MA. Sanitary sewerage from the base residential housing w...

  12. Monitoring Java Programs with Java PathExplorer

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)

    2001-01-01

    We present recent work on the development of Java PathExplorer (JPAX), a tool for monitoring the execution of Java programs. JPAX can be used during program testing to gain increased information about program executions, and can potentially also be applied during operation to survey safety-critical systems. The tool facilitates automated instrumentation of a program's byte code, which will then emit events to an observer during execution. The observer checks the events against user-provided high-level requirement specifications, for example temporal logic formulae, and against lower-level error detection procedures, for example concurrency-related algorithms such as deadlock and data race detection. High-level requirement specifications, together with their underlying logics, are defined in the Maude rewriting logic, and can then either be checked directly using the Maude rewriting engine, or first be translated to efficient data structures and then checked in Java.
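The event-observer architecture described above can be sketched as a toy safety monitor. This is an illustrative Python sketch, not JPAX itself; the lock-discipline property and all names are invented for demonstration.

```python
# Toy runtime monitor: an instrumented program emits events to an observer,
# which checks a safety property online -- here, "a lock is never released by
# a thread that does not hold it, and never left held at the end of the run".

class LockDisciplineMonitor:
    def __init__(self):
        self.held = set()          # (thread, lock) pairs currently held
        self.violations = []

    def observe(self, event, thread, lock):
        if event == "acquire":
            self.held.add((thread, lock))
        elif event == "release":
            if (thread, lock) not in self.held:
                self.violations.append(f"{thread} released {lock} without holding it")
            else:
                self.held.discard((thread, lock))

    def finish(self):
        for thread, lock in sorted(self.held):
            self.violations.append(f"{thread} never released {lock}")
        return self.violations

mon = LockDisciplineMonitor()
trace = [("acquire", "T1", "L"), ("release", "T1", "L"), ("release", "T2", "L")]
for ev in trace:
    mon.observe(*ev)
print(mon.finish())   # ['T2 released L without holding it']
```

In the tool described by the abstract, the event stream comes from instrumented byte code and the properties from temporal-logic specifications, but the online observe-and-check structure is the same.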

  13. Partial Data Traces: Efficient Generation and Representation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, F; De Supinski, B R; McKee, S A

    2001-08-20

    Binary manipulation techniques are increasing in popularity. They support program transformations tailored toward certain program inputs, and these transformations have been shown to yield performance gains beyond the scope of static code optimizations without profile-directed feedback. They even deliver moderate gains in the presence of profile-guided optimizations. In addition, transformations can be performed on the entire executable, including library routines. This work focuses on program instrumentation, yet another application of binary manipulation. This paper reports preliminary results on generating partial data traces through dynamic binary rewriting. The contributions are threefold. First, a portable method for extracting precise data traces for partial executions of arbitrary applications is developed. Second, a set of hierarchical structures for compactly representing these accesses is developed. Third, an efficient online algorithm to detect regular accesses is introduced. The authors utilize dynamic binary rewriting to selectively collect partial address traces of regions within a program. This allows partial tracing of hot paths for only a short time during program execution, in contrast to static rewriting techniques that lack hot path detection and also lack facilities to limit the duration of data collection. Preliminary results show reductions of three orders of magnitude for inline instrumentation over a dual-process approach involving context switching. They also report constant-size representations for regular access patterns in nested loops. These efforts are part of a larger project to counter the increasing gap between processor and main memory speeds by means of software optimization and hardware enhancements.
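The idea of a compact representation for regular accesses can be illustrated by folding a constant-stride address stream into (base, stride, count) runs. This is an assumed formulation for demonstration, not the paper's actual hierarchical structures.

```python
# Online compression of an address trace: consecutive addresses that continue
# a constant stride extend the current run, so a regular nested-loop pattern
# needs constant space instead of one trace entry per access.

def compress_trace(addresses):
    runs = []                       # each run: [base, stride, count]
    for addr in addresses:
        if runs:
            base, stride, count = runs[-1]
            if count == 1:          # second element fixes the run's stride
                runs[-1] = [base, addr - base, 2]
                continue
            if addr == base + stride * count:   # address continues the stride
                runs[-1][2] += 1
                continue
        runs.append([addr, 0, 1])   # start a new run
    return [tuple(r) for r in runs]

# a regular pattern from a loop over a 4-byte array, then one irregular access
trace = [0, 4, 8, 12, 16, 1000]
print(compress_trace(trace))   # [(0, 4, 5), (1000, 0, 1)]
```

A detector like this can run online as instrumentation emits addresses, which is what makes constant-size representations of regular loop accesses feasible.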

  14. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data for validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  15. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the openSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.

  16. Functionalized Graphitic Carbon Nitride for Metal-free, Flexible and Rewritable Nonvolatile Memory Device via Direct Laser-Writing

    PubMed Central

    Zhao, Fei; Cheng, Huhu; Hu, Yue; Song, Long; Zhang, Zhipan; Jiang, Lan; Qu, Liangti

    2014-01-01

    Graphitic carbon nitride nanosheets (g-C3N4-NS) have a layered structure similar to that of graphene and present unusual physicochemical properties due to the s-triazine fragments. However, their electronic and electrochemical applications are limited by their relatively poor conductivity. The current work provides the first example that atomically thin g-C3N4-NSs are an ideal candidate as the active insulator layer, with tunable conductivity, for achieving high-performance memory devices with electrical bistability. Unlike conventional memory diodes, the g-C3N4-NS-based devices combined with graphene layer electrodes are flexible, metal-free and low cost. The functionalized g-C3N4-NSs exhibit desirable dispersibility and dielectricity, which support the all-solution fabrication and high performance of the memory diodes. Moreover, the flexible memory diodes are conveniently fabricated through a fast laser-writing process on a graphene oxide/g-C3N4-NS/graphene oxide thin film. The obtained devices not only have nonvolatile electrical bistability with great retention and endurance, but also show a rewritable memory effect with a reliable ON/OFF ratio of up to 10^5, which is the highest among all metal-free flexible memory diodes reported so far, and even higher than those of metal-containing devices. PMID:25073687

  17. Job Analysis: A Local Government's Experience.

    ERIC Educational Resources Information Center

    Urbanek, Steve J.

    1997-01-01

    A county personnel department undertook reclassification of all positions by collecting and using job analysis data to rewrite job descriptions. External pay equity and validated selection procedures resulted with only a modest increase in payroll costs. (SK)

  18. Authoring and verification of clinical guidelines: a model driven approach.

    PubMed

    Pérez, Beatriz; Porres, Ivan

    2010-08-01

    The goal of this research is to provide a framework to enable authoring and verification of clinical guidelines. The framework is part of a larger research project aimed at improving the representation, quality and application of clinical guidelines in daily clinical practice. The verification process of a guideline is based on (1) model checking techniques to verify guidelines against semantic errors and inconsistencies in their definition, combined with (2) Model Driven Development (MDD) techniques, which enable us to automatically process manually created guideline specifications and the temporal-logic statements to be checked against them, making the verification process faster and more cost-effective. In particular, we use UML statecharts to represent the dynamics of guidelines and, based on these manually defined guideline specifications, we use an MDD-based tool chain to automatically generate the input model of a model checker. The model checker takes the resulting model together with the specific guideline requirements and verifies whether the guideline fulfils those properties. The overall framework has been implemented as an Eclipse plug-in named GBDSSGenerator which, starting from the UML statechart representing a guideline, allows the verification of the guideline against specific requirements. Additionally, we have established a pattern-based approach for defining commonly occurring types of requirements in guidelines. We have successfully validated our overall approach by verifying properties in different clinical guidelines, resulting in the detection of some inconsistencies in their definition. The proposed framework allows (1) the authoring and (2) the verification of clinical guidelines against specific requirements defined on the basis of a set of property specification patterns, enabling non-experts to easily write formal specifications and thus easing the verification process.
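The model-checking step can be illustrated independently of the MDD tool chain: a minimal explicit-state reachability check over a toy statechart. The states and the property below are invented for demonstration and are not from the GBDSSGenerator tool.

```python
from collections import deque

# Explore every reachable state of a small guideline-like statechart and check
# a safety property in each one; return the first violating state found.

TRANSITIONS = {                     # hypothetical clinical-guideline statechart
    "assess":    ["treat_a", "treat_b"],
    "treat_a":   ["monitor"],
    "treat_b":   ["monitor"],
    "monitor":   ["assess", "discharge"],
    "discharge": [],
}

def check_safety(initial, transitions, is_safe):
    """BFS over the reachable state space; return first unsafe state, or None."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        if not is_safe(state):
            return state
        for nxt in transitions[state]:
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return None                     # property holds in every reachable state

# property: only "discharge" may be a dead end (a simple consistency check)
unsafe = check_safety("assess", TRANSITIONS,
                      lambda s: s == "discharge" or TRANSITIONS[s])
print(unsafe)   # None -> the property holds for this model
```

Real model checkers handle temporal-logic properties and far larger state spaces symbolically, but a counterexample-producing reachability search of this shape is the core mechanism.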

  19. A study of applications scribe frame data verifications using design rule check

    NASA Astrophysics Data System (ADS)

    Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki

    2013-06-01

    In semiconductor manufacturing, scribe frame data is generally generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables of scanner alignment, wafer inspection and customer-specified marks. At the end, we check that the scribe frame design conforms to the specifications of the alignment and inspection marks. Recently, in COT (customer-owned tooling) business or new technology development, there has been no effective verification method for the scribe frame data, and verification takes a lot of time. Therefore, we tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which is used in device verification. We show the scheme of scribe frame data verification using DRC that we tried to apply. First, verification rules are created based on the specifications of the scanner, inspection and others, and a mark library is also created for pattern matching. Next, DRC verification is performed on the scribe frame data; this DRC verification includes pattern matching using the mark library. As a result, our experiments demonstrated that, by use of pattern matching and DRC verification, our new method can yield speed improvements of more than 12 percent compared to conventional mark checks by visual inspection, and the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks. It is easy to maintain and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.

  20. Security Verification Techniques Applied to PatchLink COTS Software

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.; Powell, John D.; Bishop, Matt; Andrew, Chris; Jog, Sameer

    2006-01-01

    Verification of the security of software artifacts is a challenging task. An integrated approach that combines verification techniques can increase the confidence in the security of software artifacts. Such an approach has been developed by the Jet Propulsion Laboratory (JPL) and the University of California at Davis (UC Davis). Two security verification instruments were developed and then piloted on PatchLink's UNIX Agent, a Commercial-Off-The-Shelf (COTS) software product, to assess the value of the instruments and the approach. The two instruments are the Flexible Modeling Framework (FMF) -- a model-based verification instrument (JPL), and a Property-Based Tester (UC Davis). Security properties were formally specified for the COTS artifact and then verified using these instruments. The results were then reviewed to determine the effectiveness of the approach and the security of the COTS product.

  1. Modelling protein functional domains in signal transduction using Maude

    NASA Technical Reports Server (NTRS)

    Sriram, M. G.

    2003-01-01

    Modelling of protein-protein interactions in signal transduction is receiving increased attention in computational biology. This paper describes recent research in the application of Maude, a symbolic language founded on rewriting logic, to the modelling of functional domains within signalling proteins. Protein functional domains (PFDs) are a critical focus of modern signal transduction research. In general, Maude models can simulate biological signalling networks and produce specific testable hypotheses at various levels of abstraction. Developing symbolic models of signalling proteins containing functional domains is important because of the potential to generate analyses of complex signalling networks based on structure-function relationships.
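The rewriting-logic style of modelling can be imitated in miniature: the cell state as a multiset of protein terms, rewritten by rules until a normal form is reached. This is a Python sketch with invented signalling rules, not actual Maude syntax.

```python
from collections import Counter

# Toy term-rewriting model of a signalling cascade: a ligand activates a
# receptor, and the activated receptor phosphorylates kinases. The state is a
# multiset of terms; the first applicable rule fires until none applies.

RULES = [
    # (consumed terms, produced terms) -- hypothetical signalling steps
    (Counter({"ligand": 1, "receptor": 1}), Counter({"receptor*": 1})),
    (Counter({"receptor*": 1, "kinase": 1}), Counter({"receptor*": 1, "kinase*": 1})),
]

def rewrite(state, rules, max_steps=100):
    state = Counter(state)
    for _ in range(max_steps):
        for lhs, rhs in rules:
            if all(state[t] >= n for t, n in lhs.items()):
                state -= lhs            # consume the rule's left-hand side
                state += rhs            # produce its right-hand side
                break
        else:
            break                       # no rule applies: normal form reached
    return +state                       # drop zero counts

final = rewrite({"ligand": 1, "receptor": 1, "kinase": 2}, RULES)
print(dict(final))   # activated receptor plus two phosphorylated kinases
```

In Maude the same idea is expressed declaratively as conditional rewrite rules over terms, and analyses such as search and model checking then explore the rewrites; the simulation above shows only the forward-execution level of abstraction.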

  2. An upconverted photonic nonvolatile memory.

    PubMed

    Zhou, Ye; Han, Su-Ting; Chen, Xian; Wang, Feng; Tang, Yong-Bing; Roy, V A L

    2014-08-21

    Conventional flash memory devices are voltage driven and found to be unsafe for confidential data storage. To ensure the security of the stored data, there is a strong demand for developing novel nonvolatile memory technology for data encryption. Here we show a photonic flash memory device, based on upconversion nanocrystals, which is light driven with a particular narrow width of wavelength in addition to voltage bias. With the help of near-infrared light, we successfully manipulate the multilevel data storage of the flash memory device. These upconverted photonic flash memory devices exhibit high ON/OFF ratio, long retention time and excellent rewritable characteristics.

  3. Property-driven functional verification technique for high-speed vision system-on-chip processor

    NASA Astrophysics Data System (ADS)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. The complexity of vision chip verification is also related to the fact that in most vision chip design cycles, extensive effort is focused on optimizing chip metrics such as performance, power, and area, while design functional verification is not explicitly considered at the earlier stages, at which the soundest decisions are made. In this paper, we propose a semi-automatic property-driven verification technique in which the implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can improve verification efficiency by up to 20% for a complex vision chip design while reducing the simulation and debugging overheads.

  4. RTL validation methodology on high complexity wireless microcontroller using OVM technique for fast time to market

    NASA Astrophysics Data System (ADS)

    Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, AB

    2017-11-01

    Increased demand for Internet of Things (IoT) based applications has forced a move towards higher-complexity integrated circuits supporting SoC designs. This increase in complexity demands correspondingly sophisticated validation strategies, and has led researchers to devise various methodologies to overcome the problem, bringing about dynamic verification, formal verification and hybrid techniques. Moreover, it is very important to discover bugs early in the verification process of an SoC in order to reduce the time consumed and achieve fast time to market for the system. In this paper we therefore focus on a verification methodology that can be applied at the Register Transfer Level of an SoC based on the AMBA bus design. In addition, the Open Verification Methodology (OVM) offers an easier route in RTL validation, not as a replacement for the traditional method but as an effort towards fast time to market for the system. Thus, OVM is proposed in this paper as the verification method for larger designs, to avoid bottlenecks in the validation platform.

  5. Using Concept Space to Verify Hyponymy in Building a Hyponymy Lexicon

    NASA Astrophysics Data System (ADS)

    Liu, Lei; Zhang, Sen; Diao, Lu Hong; Yan, Shu Ying; Cao, Cun Gen

    Verification of hyponymy relations is a basic problem in knowledge acquisition. We present a method of hyponymy verification based on concept space. Firstly, we give the definition of the concept space of a group of candidate hyponymy relations. Secondly, we analyze the concept space and define a set of hyponymy features based on the space structure. Then we use these features to verify the candidate hyponymy relations. Experimental results show that the method can provide adequate verification of hyponymy.

  6. Knowledge Based Systems (KBS) Verification, Validation, Evaluation, and Testing (VVE&T) Bibliography: Topical Categorization

    DTIC Science & Technology

    2003-03-01

    Different?," Jour. of Experimental & Theoretical Artificial Intelligence, Special Issue on AI for Systems Validation and Verification, 12(4), 2000, pp...Hamilton, D., "Experiences in Improving the State of Practice in Verification and Validation of Knowledge-Based Systems," Workshop Notes of the AAAI...Unsuspected Power of the Standard Turing Test," Jour. of Experimental & Theoretical Artificial Intelligence, 12, 2000, pp. 331-340. [30] Gaschnig

  7. FORMED: Bringing Formal Methods to the Engineering Desktop

    DTIC Science & Technology

    2016-02-01

    integrates formal verification into software design and development by precisely defining semantics for a restricted subset of the Unified Modeling...input-output contract satisfaction and absence of null pointer dereferences. 15. SUBJECT TERMS Formal Methods, Software Verification, Model-Based...Domain specific languages (DSLs) drive both implementation and formal verification

  8. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT; ULTRASONIC AQUEOUS CLEANING SYSTEMS, SMART SONIC CORPORATION, SMART SONIC

    EPA Science Inventory

    This report is a product of the U.S. EPA's Environmental Technology Verification (ETV) Program and is focused on the Smart Sonic Ultrasonic Aqueous Cleaning Systems. The verification is based on three main objectives. (1) The Smart Sonic Aqueous Cleaning Systems, Model 2000 and...

  9. Reconfigurable optical assembly of nanostructures

    PubMed Central

    Montelongo, Yunuen; Yetisen, Ali K.; Butt, Haider; Yun, Seok-Hyun

    2016-01-01

    Arrangements of nanostructures in well-defined patterns are the basis of photonic crystals, metamaterials and holograms. Furthermore, rewritable optical materials can be achieved by dynamically manipulating nanoassemblies. Here we demonstrate a mechanism to configure plasmonic nanoparticles (NPs) in polymer media using nanosecond laser pulses. The mechanism relies on optical forces produced by the interference of laser beams, which allow NPs to migrate to lower-energy configurations. The resulting NP arrangements are stable without any external energy source, but erasable and rewritable by additional recording pulses. We demonstrate reconfigurable optical elements including multilayer Bragg diffraction gratings, volumetric photonic crystals and lenses, as well as dynamic holograms of three-dimensional virtual objects. We aim to expand the applications of optical forces, which have been mostly restricted to optical tweezers. Holographic assemblies of nanoparticles will allow a new generation of programmable composites for tunable metamaterials, data storage devices, sensors and displays. PMID:27337216

  10. Federal employees health benefits: payment of premiums for periods of leave without pay or insufficient pay. Final rule.

    PubMed

    2007-02-05

    The Office of Personnel Management (OPM) is issuing final regulations to rewrite certain sections of the Federal regulations in plain language. These final regulations require Federal agencies to provide employees entering leave without pay (LWOP) status, or whose pay is insufficient to cover their Federal Employees Health Benefits (FEHB) premium payments, written notice of their opportunity to continue their FEHB coverage. Employees who want to continue their enrollment must sign a form agreeing to pay their premiums directly to their agency on a current basis, or to incur a debt to be withheld from their future salary. The purpose of this final regulation is to rewrite the existing regulations to ensure that employees who are entering LWOP status, or whose pay is insufficient to pay their FEHB premiums, are fully informed when they decide whether or not to continue their FEHB coverage.

  11. A quantification of the effectiveness of EPID dosimetry and software-based plan verification systems in detecting incidents in radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bojechko, Casey; Phillps, Mark; Kalet, Alan

    Purpose: Complex treatments in radiation therapy require robust verification in order to prevent errors that can adversely affect the patient. For this purpose, the authors estimate the effectiveness of detecting errors with a “defense in depth” system composed of electronic portal imaging device (EPID) based dosimetry and a software-based system composed of rules-based and Bayesian network verifications. Methods: The authors analyzed incidents with a high potential severity score, scored as a 3 or 4 on a 4-point scale, recorded in an in-house voluntary incident reporting system, collected from February 2012 to August 2014. The incidents were categorized into different failure modes. The detectability, defined as the number of incidents that are detectable divided by the total number of incidents, was calculated for each failure mode. Results: In total, 343 incidents were used in this study. Of the incidents, 67% were related to photon external beam therapy (EBRT). The majority of the EBRT incidents were related to patient positioning, and only a small number of these could be detected by EPID dosimetry when performed prior to treatment (6%). A large fraction could be detected by in vivo dosimetry performed during the first fraction (74%). Rules-based and Bayesian network verifications were found to be complementary to EPID dosimetry, able to detect errors related to patient prescriptions and documentation, and errors unrelated to photon EBRT. Combining all of the verification steps together, 91% of all EBRT incidents could be detected. Conclusions: This study shows that the defense in depth system is potentially able to detect a large majority of incidents. The most effective EPID-based dosimetry verification is in vivo measurement during the first fraction, complemented by rules-based and Bayesian network plan checking.
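The detectability metric described above (detectable incidents divided by the total number of incidents, per failure mode) can be sketched as follows. The failure modes, incident counts, and check names are illustrative placeholders, not the study's data.

```python
# Sketch of the per-failure-mode detectability metric described above:
# detectability = (number of detectable incidents) / (total incidents).
# Failure modes and counts below are illustrative, not the study's data.
from collections import Counter

def detectability(incidents, is_detectable):
    """Fraction of incidents flagged as detectable, per failure mode."""
    totals = Counter(mode for mode, _ in incidents)
    detected = Counter(mode for mode, flags in incidents if is_detectable(flags))
    return {mode: detected[mode] / totals[mode] for mode in totals}

# Each incident: (failure_mode, set of verification layers that would catch it)
incidents = [
    ("positioning", {"in_vivo_epid"}),
    ("positioning", set()),
    ("prescription", {"bayesian_network", "rules_based"}),
    ("documentation", {"rules_based"}),
]

# An incident counts as detectable if any verification layer catches it
# (the "defense in depth" idea: layers are combined with a logical OR).
rates = detectability(incidents, lambda flags: len(flags) > 0)
```
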

  12. Method and computer product to increase accuracy of time-based software verification for sensor networks

    DOEpatents

    Foo Kune, Denis [Saint Paul, MN; Mahadevan, Karthikeyan [Mountain View, CA

    2011-01-25

    A recursive verification protocol that reduces the time variance due to network delays, by putting the subject node at most one hop from the verifier node, provides an efficient means to test wireless sensor nodes. Since the software signatures are time-based, recursive testing gives a much cleaner signal for positive verification of the software running on any one node in the sensor network. In this protocol, the main verifier checks its neighbor, who in turn checks its neighbor, continuing this process until all nodes have been verified. This ensures minimum time delays for the software verification. Should a node fail the test, the software verification downstream is halted until an alternative path (one not including the failed node) is found. Utilizing techniques well known in the art, having a node tested twice, or not at all, can be avoided.
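The chained one-hop scheme in this record can be sketched as a recursive walk over the network graph: each verified node vouches for its neighbors, a failed node halts verification downstream through it, and alternative paths reach the nodes behind it. The topology, node names, and pass/fail results below are illustrative assumptions, not the patented protocol.

```python
# Sketch of the chained one-hop verification protocol described above:
# the main verifier checks a neighbor, which checks its neighbor, and so on;
# a failed node halts verification downstream through it until an
# alternative path (not including the failed node) is found.

def verify_chain(topology, passes, start):
    """Recursive walk where each verified node vouches for its neighbors."""
    verified, failed = set(), set()

    def visit(node):
        if node in verified or node in failed:
            return                    # each node is tested at most once
        if not passes[node]:          # time-based signature check fails
            failed.add(node)          # halt downstream through this node
            return
        verified.add(node)
        for neighbor in topology[node]:
            visit(neighbor)           # other branches retry skipped nodes

    visit(start)
    return verified, failed

# Illustrative topology: D is reachable both through B and through C,
# so D is still verified when B fails (the alternative path via C).
topology = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
passes = {"A": True, "B": False, "C": True, "D": True}

verified, failed = verify_chain(topology, passes, "A")
```
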

  13. Post-OPC verification using a full-chip pattern-based simulation verification method

    NASA Astrophysics Data System (ADS)

    Hung, Chi-Yuan; Wang, Ching-Heng; Ma, Cliff; Zhang, Gary

    2005-11-01

    In this paper, we evaluated and investigated techniques for performing fast full-chip post-OPC verification using a commercial product platform. A number of databases from several technology nodes, i.e. 0.13um, 0.11um and 90nm, are used in the investigation. Although our OPC technology has proven robust in general, the variety of tape-outs with complicated design styles and technologies makes it difficult to develop a "complete or bullet-proof" OPC algorithm that covers every possible layout pattern. In the evaluation, among dozens of databases, errors were found in some OPC databases by model-based post-OPC checking; such errors could be costly in manufacturing - reticle, wafer process, and more importantly production delay. From this full-chip OPC database verification, we have learned that optimizing OPC models and recipes on a limited set of test chip designs may not provide sufficient coverage across the range of designs to be produced in the process, and that fatal errors (such as pinch or bridge), poor CD distribution, and process-sensitive patterns may still occur. As a result, more than one reticle tape-out cycle is not uncommon to prove models and recipes that approach the center of process for a range of designs. We therefore describe a full-chip pattern-based simulation verification flow that serves OPC model and recipe development as well as post-OPC verification after production release of the OPC. Lastly, we discuss how the new pattern-based verification tool differs from conventional edge-based tools and summarize the advantages of our new tool and methodology: 1) Accuracy: superior inspection algorithms, down to 1 nm accuracy with the new pattern-based approach; 2) High-speed performance: pattern-centric algorithms giving the best full-chip inspection efficiency; 3) Powerful analysis capability: flexible error distribution, grouping, interactive viewing, and hierarchical pattern extraction to narrow down to unique patterns/cells.

  14. Signature Verification Based on Handwritten Text Recognition

    NASA Astrophysics Data System (ADS)

    Viriri, Serestina; Tapamo, Jules-R.

    Signatures continue to be an important biometric trait because they remain widely used for authenticating the identity of human beings. This paper presents an efficient text-based directional signature recognition algorithm which verifies signatures even when they are composed of special unconstrained cursive characters that are superimposed and embellished. The algorithm extends the character-based signature verification technique. Experiments carried out on the GPDS signature database, and on an additional database created from signatures captured using the ePadInk tablet, show that the approach is effective and efficient, with a positive verification rate of 94.95%.

  15. A framework of multitemplate ensemble for fingerprint verification

    NASA Astrophysics Data System (ADS)

    Yin, Yilong; Ning, Yanbin; Ren, Chunxiao; Liu, Li

    2012-12-01

    How to improve the performance of an automatic fingerprint verification system (AFVS) is always a big challenge in the biometric verification field. Recently, it has become popular to improve AFVS performance using ensemble learning to fuse related fingerprint information. In this article, we propose a novel framework of fingerprint verification based on the multitemplate ensemble method. The framework consists of three stages. In the first stage, enrollment, we adopt an effective template selection method to select the fingerprints which best represent a finger; a polyhedron is then created from the matching results of the multiple template fingerprints, and a virtual centroid of the polyhedron is computed. In the second stage, verification, we measure the distance between the centroid of the polyhedron and a query image. In the final stage, a fusion rule is used to choose a proper distance from a distance set. Experimental results on the FVC2004 database demonstrate the effectiveness of the new framework in fingerprint verification. With a minutiae-based matching method, the average EER over the four FVC2004 databases drops from 10.85 to 0.88, and with a ridge-based matching method it decreases from 14.58 to 2.51.
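The three-stage scheme above (enroll templates, take a virtual centroid, verify by distance) can be sketched with match scores represented as points in a score space. The matcher, score vectors, and threshold here are illustrative assumptions, not the paper's actual method.

```python
# Sketch of the multitemplate ensemble framework described above.
# Templates and the query are represented by score vectors (e.g. match
# scores against a fixed reference set); values are illustrative.
import statistics

def enroll(template_scores):
    """Enrollment: the 'virtual centroid' of the template polyhedron,
    taken here as the coordinate-wise mean of the template score vectors."""
    dims = len(template_scores[0])
    return [statistics.mean(t[i] for t in template_scores) for i in range(dims)]

def verify(centroid, query_vector, threshold):
    """Verification: Euclidean distance from centroid to the query's
    score vector, accepted if within the threshold."""
    dist = sum((c - q) ** 2 for c, q in zip(centroid, query_vector)) ** 0.5
    return dist, dist <= threshold

# Enrollment stage: three selected templates for one finger.
templates = [[0.9, 0.8], [0.8, 0.9], [0.85, 0.85]]
centroid = enroll(templates)

# Verification stage: a genuine query lands close to the centroid.
dist, accepted = verify(centroid, [0.84, 0.86], threshold=0.1)
```
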

  16. Use of metaknowledge in the verification of knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Morell, Larry J.

    1989-01-01

    Knowledge-based systems are modeled as deductive systems. The model indicates that the two primary areas of concern in verification are demonstrating consistency and completeness. A system is inconsistent if it asserts something that is not true of the modeled domain. A system is incomplete if it lacks deductive capability. Two forms of consistency are discussed along with appropriate verification methods. Three forms of incompleteness are discussed. The use of metaknowledge, knowledge about knowledge, is explored in connection to each form of incompleteness.

  17. Developing and Evaluating Patient Education Materials.

    ERIC Educational Resources Information Center

    Monsivais, Diane; Reynolds, Audree

    2003-01-01

    Discusses the rationale for nurse involvement in the development of patient education materials. Presents guidelines for evaluating existing material, including print and web resources, for credibility and readability. Makes recommendations for rewriting material at an easier-to-read level. (SK)

  18. Inferiority is complex

    NASA Astrophysics Data System (ADS)

    Wade, Jess

    2017-07-01

    In Inferior: How Science Got Women Wrong and the New Research That's Rewriting the Story, author Angela Saini puts forward the idea that bad science has been used to endorse the cultural prejudice that women are both biologically and psychologically second rate to men.

  19. Exploring implementation practices in results-based financing: the case of the verification in Benin.

    PubMed

    Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier

    2017-03-14

    Results-based financing (RBF) has been introduced in many countries across Africa, and a growing literature is building around the assessment of its impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study explores the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting the RBF implementation in Benin. While our participant observation and operational collaboration with the project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with community-based organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative and community verification. Results show that the verification processes are complex, costly and time-consuming, and that in practice they end up differing from what was originally designed. We explore the consequences of this for the operation of the scheme and for its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delink effort and reward. Additionally, the limited integration of the verification activities of district teams with their routine tasks causes a further verticalization of the health system. Our results highlight the potential disconnect between the theory of change behind RBF and the scheme's actual implementation. The implications are relevant at the methodological level, stressing the importance of analyzing implementation processes to fully understand results, as well as at the operational level, pointing to the need to carefully adapt the design of RBF schemes (including verification and other key functions) to the context and to allow room to modify them iteratively during implementation. They also raise the question of whether the rationale for thorough and costly verification is justified, or whether adaptations are possible.

  20. Feasibility study on dosimetry verification of volumetric-modulated arc therapy-based total marrow irradiation.

    PubMed

    Liang, Yun; Kim, Gwe-Ya; Pawlicki, Todd; Mundt, Arno J; Mell, Loren K

    2013-03-04

    The purpose of this study was to develop dosimetry verification procedures for volumetric-modulated arc therapy (VMAT)-based total marrow irradiation (TMI). VMAT-based TMI plans were generated for three patients: one child and two adults. The planning target volume (PTV) was defined as the bony skeleton, from head to mid-femur, with a 3 mm margin. A planning strategy similar to published studies was adopted: the PTV was divided into head and neck, chest, and pelvic regions, with a separate plan for each, composed of 2-3 arcs/fields. Multiple isocenters were evenly distributed along the patient's axial direction. The focus of this study is to establish a dosimetry quality assurance procedure involving both two-dimensional (2D) and three-dimensional (3D) volumetric verification, which is desirable for a large PTV treated with multiple isocenters. The 2D dose verification was performed with film for gamma evaluation, and absolute point dose was measured with an ion chamber, with attention to hot/cold spots at the junctions between neighboring plans. The 3D volumetric dose verification used commercial dose reconstruction software to reconstruct dose from electronic portal imaging device (EPID) images. The gamma evaluation criteria in both 2D and 3D verification were 5% absolute point dose difference and 3 mm distance to agreement. With film dosimetry, the overall average gamma passing rate was 98.2% and the absolute dose difference was 3.9% in junction areas among the test patients; with volumetric portal dosimetry, the corresponding numbers were 90.7% and 2.4%. A dosimetry verification procedure involving both 2D and 3D was thus developed for VMAT-based TMI. The initial results are encouraging and warrant further investigation in clinical trials.
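The gamma criterion used in this study (5% dose difference, 3 mm distance to agreement) can be illustrated with a deliberately simplified 1D version: for each reference point, take the minimum over measured points of the combined dose/distance metric, and count points with gamma at most 1 as passing. The profiles below are illustrative, not the study's measurements.

```python
# Simplified 1D gamma evaluation with the study's criteria: 5% dose
# difference and 3 mm distance-to-agreement. Real gamma analysis is 2D/3D
# and interpolated; this sketch only shows the structure of the metric.

def gamma_1d(ref, meas, spacing_mm, dose_tol=0.05, dta_mm=3.0):
    """Per-point gamma: minimum over measured points of the combined
    dose-difference / distance metric; gamma <= 1 is a pass."""
    gammas = []
    for i, d_ref in enumerate(ref):
        best = float("inf")
        for j, d_meas in enumerate(meas):
            dd = (d_meas - d_ref) / dose_tol      # relative dose term
            dx = (j - i) * spacing_mm / dta_mm    # distance term
            best = min(best, (dd * dd + dx * dx) ** 0.5)
        gammas.append(best)
    return gammas

# Illustrative normalized dose profiles sampled at 1 mm spacing.
ref = [1.00, 0.98, 0.95, 0.90]
meas = [1.02, 0.99, 0.94, 0.91]
gammas = gamma_1d(ref, meas, spacing_mm=1.0)
passing_rate = sum(g <= 1.0 for g in gammas) / len(gammas)
```
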

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marleau, Peter; Brubaker, Erik; Deland, Sharon M.

    This report summarizes the discussion and conclusions reached during a tabletop exercise held at Sandia National Laboratories, Albuquerque on September 3, 2014 regarding a recently described approach for nuclear warhead verification based on the cryptographic concept of a zero-knowledge protocol (ZKP), presented in a recent paper authored by Glaser, Barak, and Goldston. A panel of Sandia National Laboratories researchers, whose expertise includes radiation instrumentation design and development, cryptography, and arms control verification implementation, jointly reviewed the paper and identified specific challenges to implementing the approach as well as some opportunities. It was noted that ZKP as used in cryptography is a useful model for the arms control verification problem, but the direct analogy to arms control breaks down quickly. The ZKP methodology for warhead verification fits within the general class of template-based verification techniques, where a reference measurement is used to confirm that a given object is like another object that has already been accepted as a warhead by some other means. This can be a powerful verification approach, but it requires independent means to trust the authenticity of the reference warhead - a standard that may be difficult to achieve, and one which the ZKP authors do not directly address. Despite some technical challenges, the concept of last-minute selection of the pre-loads and equipment could be a valuable component of a verification regime.

  2. Survey of Verification and Validation Techniques for Small Satellite Software Development

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.

  3. Simulation-Based Verification of Autonomous Controllers via Livingstone PathFinder

    NASA Technical Reports Server (NTRS)

    Lindsey, A. E.; Pecheur, Charles

    2004-01-01

    AI software is often used as a means for providing greater autonomy to automated systems, capable of coping with harsh and unpredictable environments. Due in part to the enormous space of possible situations that they aim to address, autonomous systems pose a serious challenge to traditional test-based verification approaches. Efficient verification approaches need to be perfected before these systems can reliably control critical applications. This publication describes Livingstone PathFinder (LPF), a verification tool for autonomous control software. LPF applies state space exploration algorithms to an instrumented testbed, consisting of the controller embedded in a simulated operating environment. Although LPF has focused on applications of NASA's Livingstone model-based diagnosis system, the architecture is modular and adaptable to other systems. This article presents different facets of LPF and experimental results from applying the software to a Livingstone model of the main propulsion feed subsystem for a prototype space vehicle.
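The core idea of the LPF approach, state space exploration of a controller embedded in a simulated environment, can be sketched as a breadth-first search that flags states violating a property. The toy state, actions, and safety property below are illustrative assumptions, not Livingstone's model.

```python
# Minimal sketch of state space exploration over a simulated testbed,
# in the spirit of the LPF approach described above: enumerate reachable
# controller/environment states and report the first property violation.
from collections import deque

def explore(initial, step, actions, violates):
    """BFS over reachable states; return the first violating state, or None."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if violates(state):
            return state
        for action in actions:
            nxt = step(state, action)   # simulated environment transition
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return None

# Toy testbed: a counter the controller may increment or reset;
# the safety property to check is "counter stays below 3".
bad = explore(0,
              step=lambda s, a: 0 if a == "reset" else s + 1,
              actions=["inc", "reset"],
              violates=lambda s: s >= 3)
```
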

  4. A Framework for Evidence-Based Licensure of Adaptive Autonomous Systems

    DTIC Science & Technology

    2016-03-01

    insights gleaned to DoD. The autonomy community has identified significant challenges associated with test, evaluation, verification and validation of...licensure as a test, evaluation, verification, and validation (TEVV) framework that can address these challenges. IDA found that traditional...language requirements to testable (preferably machine testable) specifications • Design of architectures that treat development and verification of

  5. Verification of an on line in vivo semiconductor dosimetry system for TBI with two TLD procedures.

    PubMed

    Sánchez-Doblado, F; Terrón, J A; Sánchez-Nieto, B; Arráns, R; Errazquin, L; Biggs, D; Lee, C; Núñez, L; Delgado, A; Muñiz, J L

    1995-01-01

    This work presents the verification of an on-line in vivo dosimetry system based on semiconductors. Software and hardware have been designed to convert the diode signal into absorbed dose. Final verification was made in the form of an intercomparison with two independent thermoluminescent (TLD) dosimetry systems, under TBI conditions.
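The signal-to-dose conversion this record describes typically amounts to multiplying the diode reading by a calibration factor and a chain of correction factors; the sketch below assumes that common form, with all numerical values purely illustrative rather than taken from the paper.

```python
# Sketch of converting an in vivo diode signal to absorbed dose:
# dose = signal x calibration factor x correction factors.
# All factor values below are illustrative, not commissioned data.

def diode_dose(signal_nC, cal_factor_cGy_per_nC, corrections=()):
    """Absorbed dose from a diode reading with multiplicative corrections
    (e.g. temperature, SSD, field size)."""
    dose = signal_nC * cal_factor_cGy_per_nC
    for c in corrections:
        dose *= c
    return dose

# 50 nC reading, 0.2 cGy/nC calibration, two hypothetical corrections.
dose = diode_dose(50.0, 0.2, corrections=(1.01, 0.99))
```
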

  6. Loglines. January - February 2012

    DTIC Science & Technology

    2012-02-01

    to that organization’s lineup since ’00, Lee said. Some of those were already used by commercial industry but simply needed to be brought into the...recently began updating and rewriting the Defense Logistics Acquisition Directive, the document that establishes DLA procedures relating to the

  7. Real Audiences and Contexts for LD Writers.

    ERIC Educational Resources Information Center

    Stires, Susan

    1983-01-01

    The process/conference model of writing instruction is described for intermediate-level learning disabled students. Students proceed through several stages of writing (rehearsal, drafting, revising, editing, and rewriting) during which they have conferences with the teacher and eventually publish their writing. (CL)

  8. Rewriting History.

    ERIC Educational Resources Information Center

    Ramirez, Catherine Clark

    1994-01-01

    Suggests that the telling of vivid stories can help engage elementary students' emotions and increase the chances of fostering an interest in Texas history. Suggests that incorporating elements of the process approach to writing can merge with social studies objectives in creating a curriculum for wisdom. (RS)

  9. The Schizophrenic Brain: Rewriting the Chapter.

    ERIC Educational Resources Information Center

    Greenberg, Joel

    1979-01-01

    Evidence of the last two decades indicates schizophrenic disorders are related to an imbalance of brain chemicals. Recent discovery made of association between chronic schizophrenia and variety of structural abnormalities. Included are frontal lobe reversal and occipital lobe reversal. Computer tomography scans and data presented. (SA)

  10. The Grammatical Universe and the Laws of Thermodynamics and Quantum Entanglement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marcer, Peter J.; Rowlands, Peter

    2010-11-24

    The universal nilpotent computational rewrite system (UNCRS) is shown to formalize an irreversible process of evolution in conformity with the First, Second and Third Laws of Thermodynamics, in terms of a single algebraic creation operator (ikE + ip + jm) which delivers the whole quantum mechanical language apparatus, where k, i, j are quaternion units and E, p, m are energy, momentum and rest mass. This nilpotent evolution describes 'a dynamic zero totality universe' in terms of its fermion states (each of which, by Pauli exclusion, is unique and nonzero), where, together with their boson interactions, these define physics at the fundamental level. (The UNCRS implies that the inseparability of objects and fields in the quantum universe is based on the fact that the only valid mathematical representations are all automorphisms of the universe itself, and that this is the mathematical meaning of quantum entanglement. It thus appears that the nilpotent fermion states are in fact what is called the splitting field in quantum mechanics of the Galois group which leads to the roots of the corresponding algebraic equation, and concerns in this case the alternating group of even permutations, which are themselves automorphisms.) In the nilpotent evolutionary process: (i) the Quantum Carnot Engine (QCE) extended model of thermodynamic irreversibility, consisting of a single heat bath of an ensemble of Standard Model elementary particles, retains a small amount of quantum coherence/entanglement, so as to constitute new emergent fermion states of matter; (ii) the metric (E^2 - p^2 - m^2) = 0 ensures the First Law of the conservation of energy operates at each nilpotent stage; and (iii) prior to each creation (and implied corresponding annihilation/conserve operation), E and m can be postulated to constitute dark energy and matter respectively. The natural language form of the rewrite grammar of the evolution consists of the well-known precepts of the Laws of Thermodynamics, formalized by the UNCRS regress, so as to become (as UNCRS rewrites already published at CASYS), firstly, the quantum laws of physics in the form of the generalized Dirac equation and, later at higher stages of QCE ensemble complexity, the laws of life in the form of Nature's (DNA/RNA genetic) code and subsequently those of intelligence and consciousness (Nature's rules).

  11. Isocenter verification for linac‐based stereotactic radiation therapy: review of principles and techniques

    PubMed Central

    Sabet, Mahsheed; O'Connor, Daryl J.; Greer, Peter B.

    2011-01-01

    There have been several manual, semi‐automatic and fully‐automatic methods proposed for verification of the position of mechanical isocenter as part of comprehensive quality assurance programs required for linear accelerator‐based stereotactic radiosurgery/radiotherapy (SRS/SRT) treatments. In this paper, a systematic review has been carried out to discuss the present methods for isocenter verification and compare their characteristics, to help physicists in making a decision on selection of their quality assurance routine. PACS numbers: 87.53.Ly, 87.56.Fc, 87.56.‐v PMID:22089022

  12. Current status of 3D EPID-based in vivo dosimetry in The Netherlands Cancer Institute

    NASA Astrophysics Data System (ADS)

    Mijnheer, B.; Olaciregui-Ruiz, I.; Rozendaal, R.; Spreeuw, H.; van Herk, M.; Mans, A.

    2015-01-01

    3D in vivo dose verification using a-Si EPIDs is performed routinely in our institution for almost all RT treatments. The EPID-based 3D dose distribution is reconstructed using a back-projection algorithm and compared with the planned dose distribution using 3D gamma evaluation. Dose-reconstruction and gamma-evaluation software runs automatically, and deviations outside the alert criteria are immediately available and investigated, in combination with inspection of cone-beam CT scans. The implementation of our 3D EPID-based in vivo dosimetry approach was able to replace pre-treatment verification for more than 90% of the patient treatments. Clinically relevant deviations could be detected for approximately 1 out of 300 patient treatments (IMRT and VMAT). Most of these errors were patient-related anatomical changes or deviations from the routine clinical procedure, and would not have been detected by pre-treatment verification. Moreover, 3D EPID-based in vivo dose verification is a fast and accurate tool to assure the safe delivery of RT treatments. It provides clinically more useful information and is less time-consuming than pre-treatment verification measurements. Automated 3D in vivo dosimetry is therefore a prerequisite for large-scale implementation of patient-specific quality assurance of RT treatments.

  13. Development of independent MU/treatment time verification algorithm for non-IMRT treatment planning: A clinical experience

    NASA Astrophysics Data System (ADS)

    Tatli, Hamza; Yucel, Derya; Yilmaz, Sercan; Fayda, Merdan

    2018-02-01

    The aim of this study is to develop an algorithm for independent MU/treatment time (TT) verification for non-IMRT treatment plans, as part of a QA program to ensure treatment delivery accuracy. Two radiotherapy delivery units and their treatment planning systems (TPS) were commissioned in Liv Hospital Radiation Medicine Center, Tbilisi, Georgia. Beam data were collected according to the vendors' collection guidelines and AAPM report recommendations, and processed in Microsoft Excel during in-house algorithm development. The algorithm is designed and optimized for calculating SSD and SAD treatment plans, based on the AAPM TG-114 dose calculation recommendations, and is coded and embedded in an MS Excel spreadsheet as a preliminary verification algorithm (VA). Treatment verification plans were created by the TPSs based on IAEA TRS-430 recommendations and also calculated by the VA; point measurements were collected with a solid water phantom and compared. The study showed that the in-house VA can be used for MU/TT verification of non-IMRT plans.
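An independent MU check in the spirit of AAPM TG-114 divides the prescribed dose by a product of tabulated beam factors. The sketch below assumes that general form; the factor names, values, and 5% tolerance are illustrative placeholders, not the study's commissioned beam data or its spreadsheet implementation.

```python
# Minimal sketch of an independent MU check in the spirit of AAPM TG-114:
# MU = prescribed dose / (reference dose rate x output factor x TPR x
# inverse-square correction). All values below are illustrative.

def independent_mu(dose_cGy, dose_rate_ref=1.0, output_factor=1.0,
                   tpr=1.0, inverse_square=1.0):
    """Monitor units for an SAD-type plan from tabulated factors."""
    return dose_cGy / (dose_rate_ref * output_factor * tpr * inverse_square)

# 200 cGy fraction; 1 cGy/MU at reference; hypothetical factors.
mu = independent_mu(200.0, dose_rate_ref=1.0, output_factor=0.98,
                    tpr=0.85, inverse_square=1.0)

# Compare with the TPS value and flag disagreement beyond a tolerance.
tps_mu = 245.0
percent_diff = 100.0 * (mu - tps_mu) / tps_mu
within_tolerance = abs(percent_diff) <= 5.0
```
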

  14. Resolution verification targets for airborne and spaceborne imaging systems at the Stennis Space Center

    NASA Astrophysics Data System (ADS)

    McKellip, Rodney; Yuan, Ding; Graham, William; Holland, Donald E.; Stone, David; Walser, William E.; Mao, Chengye

    1997-06-01

    The number of available spaceborne and airborne systems will dramatically increase over the next few years. A common systematic approach toward verification of these systems will become important for comparing the systems' operational performance. The Commercial Remote Sensing Program at the John C. Stennis Space Center (SSC) in Mississippi has developed design requirements for a remote sensing verification target range to provide a means to evaluate spatial, spectral, and radiometric performance of optical digital remote sensing systems. The verification target range consists of spatial, spectral, and radiometric targets painted on a 150- by 150-meter concrete pad located at SSC. The design criteria for this target range are based upon work over a smaller, prototypical target range at SSC during 1996. This paper outlines the purpose and design of the verification target range based upon an understanding of the systems to be evaluated as well as data analysis results from the prototypical target range.

  15. Aided generation of search interfaces to astronomical archives

    NASA Astrophysics Data System (ADS)

    Zorba, Sonia; Bignamini, Andrea; Cepparo, Francesco; Knapic, Cristina; Molinaro, Marco; Smareglia, Riccardo

    2016-07-01

    Astrophysical data provider organizations that host web-based interfaces for access to data resources must cope with changes in data management that imply partial rewrites of their web applications. To avoid doing this manually, a dynamically configurable Java EE web application was developed that sets itself up by reading the needed information from configuration files. The specification of what information the astronomical archive database exposes is managed using the TAP_SCHEMA schema from the IVOA TAP recommendation, which can be edited through a graphical interface. Once the configuration steps are done, the tool builds a WAR file for easy deployment of the application.
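    The configuration-driven approach can be illustrated with a toy sketch: column metadata in the style of TAP_SCHEMA (the column names and descriptions below are invented) drives the generation of search-form fields, so a schema change requires no hand-edited HTML:

```python
# Toy sketch of metadata-driven form generation; the column list stands in
# for what the real tool reads from TAP_SCHEMA / its configuration files.

COLUMNS = [
    {"name": "ra", "datatype": "double", "description": "Right ascension (deg)"},
    {"name": "dec", "datatype": "double", "description": "Declination (deg)"},
    {"name": "obj_name", "datatype": "char", "description": "Object name"},
]

def render_field(col):
    # Numeric columns get a number input, everything else a text input.
    kind = "number" if col["datatype"] in ("double", "int") else "text"
    return (f'<label>{col["description"]} '
            f'<input type="{kind}" name="{col["name"]}"></label>')

form_html = "\n".join(render_field(c) for c in COLUMNS)
```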

  16. A Comparison between Strand Spaces and Multiset Rewriting for Security Protocol Analysis

    DTIC Science & Technology

    2005-01-01

    directed labeled graph GL is a structure (S, −→, L, Λ) where (S, −→) is a directed graph, L is a set of labels, and Λ : S → L is a labeling function that...particular, for ν ∈ S and l ∈ L, we will write “ν = l” as an abbreviation of Λ(ν) = l. However, for ν1, ν2 ∈ S, expressions of the form “ν1 = ν2” shall...appeared in [4]. First-order formalisms were considered only several years later in the classical work of Berry and Boudol [2], whose state-based

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION: Reduction of Nitrogen in Domestic Wastewater from Individual Residential Homes. BioConcepts, Inc. ReCip® RTS ~ 500 System

    EPA Science Inventory

    Verification testing of the ReCip® RTS-500 System was conducted over a 12-month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located on Otis Air National Guard Base in Bourne, Massachusetts. A nine-week startup period preceded the verification test t...

  18. A robust method using propensity score stratification for correcting verification bias for binary tests

    PubMed Central

    He, Hua; McDermott, Michael P.

    2012-01-01

    Sensitivity and specificity are common measures of the accuracy of a diagnostic test. The usual estimators of these quantities are unbiased if data on the diagnostic test result and the true disease status are obtained from all subjects in an appropriately selected sample. In some studies, verification of the true disease status is performed only for a subset of subjects, possibly depending on the result of the diagnostic test and other characteristics of the subjects. Estimators of sensitivity and specificity based on this subset of subjects are typically biased; this is known as verification bias. Methods have been proposed to correct verification bias under the assumption that the missing data on disease status are missing at random (MAR), that is, the probability of missingness depends on the true (missing) disease status only through the test result and observed covariate information. When some of the covariates are continuous, or the number of covariates is relatively large, the existing methods require parametric models for the probability of disease or the probability of verification (given the test result and covariates), and hence are subject to model misspecification. We propose a new method for correcting verification bias based on the propensity score, defined as the predicted probability of verification given the test result and observed covariates. This is estimated separately for those with positive and negative test results. The new method classifies the verified sample into several subsamples that have homogeneous propensity scores and allows correction for verification bias. Simulation studies demonstrate that the new estimators are more robust to model misspecification than existing methods, but still perform well when the models for the probability of disease and probability of verification are correctly specified. PMID:21856650
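    The stratification idea can be sketched on fabricated data: verified subjects inside each propensity-score stratum supply a disease-prevalence estimate that, under MAR, is assumed to extend to the unverified subjects of that stratum. Everything below, including the record layout, is an illustration rather than the authors' estimator:

```python
from collections import defaultdict

def corrected_prevalence(records, n_strata=5):
    """Bias-corrected estimate of P(D+ | T+) by propensity-score
    stratification. records: (test, propensity, verified, disease)
    tuples, with disease set to None when unverified. Illustrative only."""
    strata = defaultdict(lambda: {"n": 0, "n_ver": 0, "n_dis": 0})
    for test, ps, verified, disease in records:
        if test != 1:                      # restrict to test-positives
            continue
        s = min(int(ps * n_strata), n_strata - 1)
        strata[s]["n"] += 1
        if verified:
            strata[s]["n_ver"] += 1
            strata[s]["n_dis"] += disease
    total = sum(v["n"] for v in strata.values())
    # Weight each stratum's verified prevalence by its full stratum size,
    # including the unverified subjects (the MAR assumption at work).
    return sum(v["n"] * v["n_dis"] / v["n_ver"]
               for v in strata.values()) / total
```

    The sensitivity estimate then follows by combining such corrected probabilities for positive and negative test results, as the paper does separately for the two groups.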

  19. Gaia challenging performances verification: combination of spacecraft models and test results

    NASA Astrophysics Data System (ADS)

    Ecale, Eric; Faye, Frédéric; Chassat, François

    2016-08-01

    To achieve the ambitious scientific objectives of the Gaia mission, extremely stringent performance requirements have been given to the spacecraft contractor (Airbus Defence and Space). For a set of those key-performance requirements (e.g. end-of-mission parallax, maximum detectable magnitude, maximum sky density or attitude control system stability), this paper describes how they are engineered during the whole spacecraft development process, with a focus on the end-to-end performance verification. As far as possible, performances are usually verified by end-to-end tests on ground (i.e. before launch). However, the challenging Gaia requirements are not verifiable by such a strategy, principally because no test facility exists to reproduce the expected flight conditions. The Gaia performance verification strategy is therefore based on a mix between analyses (based on spacecraft models) and tests (used to directly feed the models or to correlate them). Emphasis is placed on how to maximize the test contribution to performance verification while keeping the test feasible within an affordable effort. In particular, the paper highlights the contribution of the Gaia Payload Module Thermal Vacuum test to the performance verification before launch. Finally, an overview of the in-flight payload calibration and in-flight performance verification is provided.

  20. "How Do You Spell 'Caught'?"

    ERIC Educational Resources Information Center

    Broderick, Conne

    1995-01-01

    Discusses a strategy that provides a structure for students to do what adults do naturally: determine the correct spelling of a word by rewriting it until it looks right. Notes that the technique can be incorporated into the classroom as a regular editing technique. (RS)

  1. 76 FR 63640 - Public Housing Assessment System (PHAS): Proposed Physical Condition Interim Scoring Notice

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-13

    ... call-for-aid is a system designed to provide elderly residents the opportunity to call for help in the... open. This bars that are change also rewrites designed to open the Level 3 should open. If they...

  2. Stories and Storytelling in Extension Work

    ERIC Educational Resources Information Center

    Peters, Scott; Franz, Nancy K.

    2012-01-01

    Deep budget cuts, increased accountability, and the growth of anti-government and anti-intellectual sentiments place Extension systems in a defensive position. In response, we're engaging in organizational change exercises, restructuring, regionalizing, rewriting mission statements, and developing strategic plans. We're spending…

  3. That's a "Wrap."

    ERIC Educational Resources Information Center

    Gillespie, Patricia

    1995-01-01

    A secondary teacher in Hawaii's Kamehameha schools describes how she teaches her students about video and television production. Because it is a school for native Hawaiians, the program emphasizes cultural documentation. Through the program, students learn to research, interview, organize video footage, write, rewrite, and use technology. (SM)

  4. Integrity Verification for Multiple Data Copies in Cloud Storage Based on Spatiotemporal Chaos

    NASA Astrophysics Data System (ADS)

    Long, Min; Li, You; Peng, Fei

    Aiming to strike a balance between the security, efficiency and availability of data verification in cloud storage, a novel integrity verification scheme based on spatiotemporal chaos is proposed for multiple data copies. Spatiotemporal chaos is used for node calculation of the binary tree, and the location of the data in the cloud is verified. Meanwhile, dynamic operations can be made to the data. Furthermore, blind information is used to prevent a third-party auditor (TPA) from leaking the users’ data privacy in a public auditing process. Performance analysis and discussion indicate that the scheme is secure and efficient, that it supports dynamic operations and the integrity verification of multiple copies of data, and that it has great potential to be implemented in cloud storage services.
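    The binary-tree part of such a scheme can be sketched with a standard hash standing in for the paper's chaos-based node function (the spatiotemporal-chaos computation itself is not reproduced here):

```python
import hashlib

# Sketch of binary-tree integrity verification over data blocks. SHA-256
# is a stand-in: the paper derives node values from a spatiotemporal
# chaotic map instead of a conventional hash.

def node(left, right):
    return hashlib.sha256(left + right).digest()

def tree_root(blocks):
    level = [hashlib.sha256(b).digest() for b in blocks]
    while len(level) > 1:
        if len(level) % 2:               # duplicate last node on odd levels
            level.append(level[-1])
        level = [node(level[i], level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]
```

    A stored copy verifies if its recomputed root matches the root value retained by the verifier; any modified block changes the root.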

  5. The Verification-based Analysis of Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1996-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  6. UIVerify: A Web-Based Tool for Verification and Automatic Generation of User Interfaces

    NASA Technical Reports Server (NTRS)

    Shiffman, Smadar; Degani, Asaf; Heymann, Michael

    2004-01-01

    In this poster, we describe a web-based tool for verification and automatic generation of user interfaces. The verification component of the tool accepts as input a model of a machine and a model of its interface, and checks that the interface is adequate (correct). The generation component of the tool accepts a model of a given machine and the user's task, and then generates a correct and succinct interface. This write-up will demonstrate the usefulness of the tool by verifying the correctness of a user interface to a flight-control system. The poster will include two more examples of using the tool: verification of the interface to an espresso machine, and automatic generation of a succinct interface to a large hypothetical machine.

  7. Information verification and encryption based on phase retrieval with sparsity constraints and optical inference

    NASA Astrophysics Data System (ADS)

    Zhong, Shenlu; Li, Mengjiao; Tang, Xiajie; He, Weiqing; Wang, Xiaogang

    2017-01-01

    A novel optical information verification and encryption method is proposed based on an inference principle and phase retrieval with sparsity constraints. In this method, a target image is encrypted into two phase-only masks (POMs), which comprise sparse phase data used for verification. Both POMs need to be authenticated before being used for decryption. The target image can be optically reconstructed when the two authenticated POMs are Fourier transformed and convolved with the correct decryption key, which is also generated in the encryption process. No holographic scheme is involved in the proposed optical verification and encryption system, and there is no problem of information disclosure in the two authenticable POMs. Numerical simulation results demonstrate the validity and good performance of the proposed method.

  8. 78 FR 56266 - Consent Based Social Security Number Verification (CBSV) Service

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-12

    ... developed CBSV as a user- friendly, internet-based application with safeguards that protect the public's information. In addition to the benefit of providing high volume, centralized SSN verification services to users in a secure manner, CBSV provides us with cost and workload management benefits. New Information...

  9. Model-Based Building Verification in Aerial Photographs.

    DTIC Science & Technology

    1987-09-01

    Powers; Gordon E. Schacher, Chairman, Dean of Science and Engineering; Electrical and Computer Engineering...paper, we have proposed an experimental knowledge-based verification system; the organization for change detection is outlined. Knowledge rules and

  10. Formal Verification for a Next-Generation Space Shuttle

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy D.; Pecheur, Charles; Koga, Dennis (Technical Monitor)

    2002-01-01

    This paper discusses the verification and validation (V&V) of advanced software used for integrated vehicle health monitoring (IVHM), in the context of NASA's next-generation space shuttle. We survey the current V&V practice and standards used in selected NASA projects, review applicable formal verification techniques, and discuss their integration into existing development practice and standards. We also describe two verification tools, JMPL2SMV and Livingstone PathFinder, that can be used to thoroughly verify diagnosis applications that use model-based reasoning, such as the Livingstone system.

  11. Formal verification of an oral messages algorithm for interactive consistency

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1992-01-01

    The formal specification and verification of an algorithm for Interactive Consistency based on the Oral Messages algorithm for Byzantine Agreement is described. We compare our treatment with that of Bevier and Young, who presented a formal specification and verification for a very similar algorithm. Unlike Bevier and Young, who observed that 'the invariant maintained in the recursive subcases of the algorithm is significantly more complicated than is suggested by the published proof' and who found its formal verification 'a fairly difficult exercise in mechanical theorem proving,' our treatment is very close to the previously published analysis of the algorithm, and our formal specification and verification are straightforward. This example illustrates how delicate choices in the formulation of the problem can have significant impact on the readability of its formal specification and on the tractability of its formal verification.

  12. Elementary Particle Spectroscopy in Regular Solid Rewrite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trell, Erik

    The Nilpotent Universal Computer Rewrite System (NUCRS) has operationalized the radical ontological dilemma of Nothing at All versus Anything at All down to the ground recursive syntax and principal mathematical realisation of this categorical dichotomy as such and so governing all its sui generis modalities, leading to fulfilment of their individual terms and compass when the respective choice sequence operations are brought to closure. Focussing on the general grammar, NUCRS by pure logic and its algebraic notations hence bootstraps Quantum Mechanics, aware that it "is the likely keystone of a fundamental computational foundation" also for e.g. physics, molecular biology and neuroscience. The present work deals with classical geometry where morphology is the modality, and ventures that the ancient regular solids are its specific rewrite system, in effect extensively anticipating the detailed elementary particle spectroscopy, and further on to essential structures at large both over the inorganic and organic realms. The geodetic antipode to Nothing is extension, with natural eigenvector the endless straight line which when deployed according to the NUCRS as well as Plotelemeian topographic prescriptions forms a real three-dimensional eigenspace with cubical eigenelements where observed quark-skewed quantum-chromodynamical particle events self-generate as an Aristotelean phase transition between the straight and round extremes of absolute endlessness under the symmetry- and gauge-preserving, canonical coset decomposition SO(3)xO(5) of Lie algebra SU(3). The cubical eigenspace and eigenelements are the parental state and frame, and the other solids are a range of transition matrix elements and portions adapting to the spherical root vector symmetries and so reproducibly reproducing the elementary particle spectroscopy, including a modular, truncated octahedron nano-composition of the Electron which piecemeal enters into molecular structures or, compressed to each other, fuses into atomic honeycombs of periodic table signature.

  13. MR Imaging Based Treatment Planning for Radiotherapy of Prostate Cancer

    DTIC Science & Technology

    2007-02-01

    developed practical methods for heterogeneity correction for MRI-based dose calculations (Chen et al 2007). 6) We will use existing Monte Carlo ... Monte Carlo verification of IMRT dose distributions from a commercial treatment planning optimization system, Phys. Med. Biol., 45:2483-95 (2000) Ma...accuracy and consistency for MR-based IMRT treatment planning for prostate cancer. A short paper entitled “Monte Carlo dose verification of MR image based

  14. Mutation Testing for Effective Verification of Digital Components of Physical Systems

    NASA Astrophysics Data System (ADS)

    Kushik, N. G.; Evtushenko, N. V.; Torgaev, S. N.

    2015-12-01

    Digital components of modern physical systems are often designed applying circuitry solutions based on the field programmable gate array technology (FPGA). Such (embedded) digital components should be carefully tested. In this paper, an approach for the verification of digital physical system components based on mutation testing is proposed. The reference description of the behavior of a digital component in the hardware description language (HDL) is mutated by introducing into it the most probable errors and, unlike mutants in high-level programming languages, the corresponding test case is effectively derived based on a comparison of special scalable representations of the specification and the constructed mutant using various logic synthesis and verification systems.
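    The mutation idea can be shown in miniature with the HDL behavior modelled as a Boolean function; the mutant below injects a single dropped-term error, and a distinguishing test input is derived by exhaustive comparison. Real flows compare scalable representations with logic-synthesis and verification tools rather than enumerating inputs, so this is only a caricature of the approach:

```python
# Toy sketch of mutation-based test derivation for a combinational
# component: the "HDL" behavior is modelled as a Python function and the
# mutant is a copy with one injected error (a dropped product term).
from itertools import product

def spec(a, b, c):          # reference behavior: majority vote
    return (a and b) or (a and c) or (b and c)

def mutant(a, b, c):        # injected error: the (a and c) term dropped
    return (a and b) or (b and c)

def distinguishing_tests(ref, mut, n_inputs=3):
    """Input vectors on which the mutant's output differs from the spec."""
    return [bits for bits in product([0, 1], repeat=n_inputs)
            if bool(ref(*bits)) != bool(mut(*bits))]
```

    Any returned vector is a test case that detects this particular injected error.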

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL RESIDENTIAL HOMES, WATERLOO BIOFILTER® MODEL 4-BEDROOM (NSF 02/03/WQPC-SWP)

    EPA Science Inventory

    Verification testing of the Waterloo Biofilter Systems (WBS), Inc. Waterloo Biofilter® Model 4-Bedroom system was conducted over a thirteen month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at Otis Air National Guard Base in Bourne, Mas...

  16. Study of space shuttle orbiter system management computer function. Volume 2: Automated performance verification concepts

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The findings are presented of investigations on concepts and techniques in automated performance verification. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, AQUAPOINT, INC. BIOCLERE MODEL 16/12 - 02/02/WQPC-SWP

    EPA Science Inventory

    Verification testing of the Aquapoint, Inc. (AQP) BioclereTM Model 16/12 was conducted over a thirteen month period at the Massachusetts Alternative Septic System Test Center (MASSTC), located at Otis Air National Guard Base in Bourne, Massachusetts. Sanitary sewerage from the ba...

  18. Dynamic testing for shuttle design verification

    NASA Technical Reports Server (NTRS)

    Green, C. E.; Leadbetter, S. A.; Rheinfurth, M. H.

    1972-01-01

    Space shuttle design verification requires dynamic data from full scale structural component and assembly tests. Wind tunnel and other scaled model tests are also required early in the development program to support the analytical models used in design verification. Presented is a design philosophy based on mathematical modeling of the structural system strongly supported by a comprehensive test program; some of the types of required tests are outlined.

  19. Reading Educational Philosophies in "Freedom Writers"

    ERIC Educational Resources Information Center

    Choi, Jung-Ah

    2009-01-01

    The 2007 film "Freedom Writers" portrays the real-life experiences of Erin Gruwell, a teacher at an inner-city high school in Long Beach, California. This article discusses the educational theories underpinning Gruwell's pedagogical practice, as seen in "Freedom Writers", and identifies four themes--rewriting curriculum,…

  20. 78 FR 31879 - General Services Administration Acquisition Regulation (GSAR); Electronic Contracting Initiative...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-28

    ...); Electronic Contracting Initiative (ECI) AGENCY: Office of Acquisition Policy, General Services Administration..., Electronic Contracting Initiative, by any of the following methods: Regulations.gov : http://www.regulations... the rewrite of GSAR Part 538, Electronic Contracting Initiative (Modifications). On December 17, 2012...

  1. Revealing, Reinterpreting, Rewriting Mujeres

    ERIC Educational Resources Information Center

    Preuss, Cara Lynne; Saavedra, Cinthya M.

    2014-01-01

    This paper reanalyzed research previously conducted with Spanish-speaking childcare providers who participated in an educational literacy program. The women in the program were generally framed as the deficient other--illiterate, immigrant women. The authors used a critical framework and Chicana/Latina feminist methodologies, namely "pláticas…

  2. The Challenges and Rewards of Teaching Spanish in a Community College Prison Program

    ERIC Educational Resources Information Center

    Palomino, Erick Nava; Ragsdale, Lee

    2015-01-01

    Two authors describe how teaching Spanish in an Illinois prison led them to rewrite the examples used in a Spanish textbook and engage incarcerated students in novel ways in order to make up for the lack of conventional classroom resources.

  3. Individual Differences in Reprocessing of Text.

    ERIC Educational Resources Information Center

    Haenggi, Dieter; Perfetti, Charles A.

    1992-01-01

    Decoding, working memory, and domain-specific prior knowledge were studied as predictors of comprehension for 48 university undergraduate students after rewriting notes, rereading notes, or rereading a text. Working memory was most important for comprehension of text-implicit information, whereas knowledge was relatively more important for…

  4. ReWritable Data Storage on DVD by Using Phase Change Technology

    NASA Astrophysics Data System (ADS)

    Kleine, H.; Martin, F.; Kapeller, M.; Cord, B.; Ebinger, H.

    It is expected that within the next few years the VHS cassette will be replaced by rewritable Digital Versatile Discs (DVD) for home video recording. At the moment three different standards, DVD+RW, DVD-RW and DVD-RAM, exist, of which DVD+RW is expected to dominate the market in Europe and the United States. The disc holds 4.7 GB of computer data, equivalent to several hours of high-quality video content. At the heart of the disc is a thin-film layer stack with a special phase-change recording layer. By proper laser irradiation the disc can be overwritten up to 1000 times without noticeable quality loss. A shelf lifetime of 20-50 years is anticipated. With these characteristics the disc is well suited for consumer applications. The present article illuminates how a process engineer can control the disc recording sensitivity, the recording speed and the number of overwrite cycles through the design of the thin-film layer stack.

  5. Rewritable three-dimensional holographic data storage via optical forces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yetisen, Ali K., E-mail: ayetisen@mgh.harvard.edu; Harvard-MIT Division of Health Sciences and Technology, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139; Montelongo, Yunuen

    2016-08-08

    The development of nanostructures that can be reversibly arranged and assembled into 3D patterns may enable optical tunability. However, current dynamic recording materials such as photorefractive polymers cannot be used to store information permanently while also retaining configurability. Here, we describe the synthesis and optimization of a silver nanoparticle doped poly(2-hydroxyethyl methacrylate-co-methacrylic acid) recording medium for reversibly recording 3D holograms. We theoretically and experimentally demonstrate organizing nanoparticles into 3D assemblies in the recording medium using optical forces produced by the gradients of standing waves. The nanoparticles in the recording medium are organized by multiple nanosecond laser pulses to produce reconfigurable slanted multilayer structures. We demonstrate the capability of producing rewritable optical elements such as multilayer Bragg diffraction gratings, 1D photonic crystals, and 3D multiplexed optical gratings. We also show that 3D virtual holograms can be reversibly recorded. This recording strategy may have applications in reconfigurable optical elements, data storage devices, and dynamic holographic displays.

  6. Rewritable ghost floating gates by tunnelling triboelectrification for two-dimensional electronics

    PubMed Central

    Kim, Seongsu; Kim, Tae Yun; Lee, Kang Hyuck; Kim, Tae-Ho; Cimini, Francesco Arturo; Kim, Sung Kyun; Hinchet, Ronan; Kim, Sang-Woo; Falconi, Christian

    2017-01-01

    Gates can electrostatically control charges inside two-dimensional materials. However, integrating independent gates typically requires depositing and patterning suitable insulators and conductors. Moreover, after manufacturing, gates are unchangeable. Here we introduce tunnelling triboelectrification for localizing electric charges in very close proximity of two-dimensional materials. As representative materials, we use chemical vapour deposition graphene deposited on a SiO2/Si substrate. The triboelectric charges, generated by friction with a Pt-coated atomic force microscope tip and injected through defects, are trapped at the air–SiO2 interface underneath graphene and act as ghost floating gates. Tunnelling triboelectrification uniquely permits to create, modify and destroy p and n regions at will with the spatial resolution of atomic force microscopes. As a proof of concept, we draw rewritable p/n+ and p/p+ junctions with resolutions as small as 200 nm. Our results open the way to time-variant two-dimensional electronics where conductors, p and n regions can be defined on demand. PMID:28649986

  7. Rewritable ghost floating gates by tunnelling triboelectrification for two-dimensional electronics

    NASA Astrophysics Data System (ADS)

    Kim, Seongsu; Kim, Tae Yun; Lee, Kang Hyuck; Kim, Tae-Ho; Cimini, Francesco Arturo; Kim, Sung Kyun; Hinchet, Ronan; Kim, Sang-Woo; Falconi, Christian

    2017-06-01

    Gates can electrostatically control charges inside two-dimensional materials. However, integrating independent gates typically requires depositing and patterning suitable insulators and conductors. Moreover, after manufacturing, gates are unchangeable. Here we introduce tunnelling triboelectrification for localizing electric charges in very close proximity of two-dimensional materials. As representative materials, we use chemical vapour deposition graphene deposited on a SiO2/Si substrate. The triboelectric charges, generated by friction with a Pt-coated atomic force microscope tip and injected through defects, are trapped at the air-SiO2 interface underneath graphene and act as ghost floating gates. Tunnelling triboelectrification uniquely permits to create, modify and destroy p and n regions at will with the spatial resolution of atomic force microscopes. As a proof of concept, we draw rewritable p/n+ and p/p+ junctions with resolutions as small as 200 nm. Our results open the way to time-variant two-dimensional electronics where conductors, p and n regions can be defined on demand.

  8. SU-E-T-602: Patient-Specific Online Dose Verification Based On Transmission Detector Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thoelking, J; Yuvaraj, S; Jens, F

    Purpose: Intensity modulated radiotherapy requires a comprehensive quality assurance program in general and ideally independent verification of dose delivery. Since conventional 2D detector arrays allow only pre-treatment verification, there is a debate concerning the need for online dose verification. This study presents the clinical performance, including dosimetric plan verification in 2D as well as in 3D, and the error detection abilities of a new transmission detector (TD) for online dose verification of a 6 MV photon beam. Methods: To validate the dosimetric performance of the new device, dose reconstructions based on TD measurements were compared to a conventional pre-treatment verification method (reference) and the treatment planning system (TPS) for 18 IMRT and VMAT treatment plans. Furthermore, dose reconstruction inside the patient based on TD read-out was evaluated by comparing various dose volume indices and 3D gamma evaluations against independent dose computation and the TPS. To investigate the sensitivity of the new device, different types of systematic and random errors in leaf positions and linac output were introduced into IMRT treatment sequences. Results: The 2D gamma index evaluation of the transmission detector based dose reconstruction showed excellent agreement for all IMRT and VMAT plans compared to reference measurements (99.3±1.2)% and the TPS (99.1±0.7)%. Good agreement was also obtained for 3D dose reconstruction based on TD read-out compared to dose computation (mean gamma value of PTV = 0.27±0.04). Only a minimal dose underestimation within the target volume was observed when analyzing DVH indices (<1%). Positional errors in leaf banks larger than 1 mm and errors in linac output larger than 2% could be clearly identified with the TD.
Conclusion: Since the 2D and 3D evaluations for all IMRT and VMAT treatment plans were in excellent agreement with reference measurements and dose computation, the new TD is suitable for routine treatment plan verification. Funding Support, Disclosures, and Conflict of Interest: COIs: Frank Lohr: Elekta: research grant, travel grants, teaching honoraria; IBA: research grant, travel grants, teaching honoraria, advisory board; C-Rad: board honoraria, travel grants. Frederik Wenz: Elekta: research grant, teaching honoraria, consultant, advisory board; Zeiss: research grant, teaching honoraria, patent. Hansjoerg Wertz: Elekta: research grant, teaching honoraria; IBA: research grant.

  9. Online 3D EPID-based dose verification: Proof of concept.

    PubMed

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; van Herk, Marcel

    2016-07-01

    Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. 
The complete processing of a single portal frame, including dose verification, took 266 ± 11 ms on a dual octocore Intel Xeon E5-2630 CPU running at 2.40 GHz. The introduced delivery errors were detected after 5-10 s irradiation time. A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for two different kinds of gross delivery errors. Thus, online 3D dose verification has been technologically achieved.
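The dose comparison described above (mean dose in the target and nontarget volumes, near-maximum dose D2 in the nontarget volume) can be sketched with NumPy. This is an illustration, not the authors' clinical code: D2 is taken here as the 98th-percentile dose inside a region mask, and the 5% relative tolerance is a hypothetical choice.

```python
import numpy as np

def dose_metrics(dose, mask):
    """Mean dose and near-maximum dose D2 (dose to the hottest 2% of
    voxels, i.e. the 98th percentile) inside a region mask."""
    voxels = dose[mask]
    return voxels.mean(), np.percentile(voxels, 98)

def doses_agree(planned, reconstructed, mask, tol=0.05):
    """True when the mean dose and D2 of the reconstruction are within a
    relative tolerance of the plan (tol=0.05 is an assumed threshold)."""
    pm, p2 = dose_metrics(planned, mask)
    rm, r2 = dose_metrics(reconstructed, mask)
    return abs(rm - pm) / pm <= tol and abs(r2 - p2) / p2 <= tol
```

In an online setting, a check like `doses_agree` would run once per accumulated EPID frame, and a `False` result would trigger the linac halt.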

  10. Improved Hip-Based Individual Recognition Using Wearable Motion Recording Sensor

    NASA Astrophysics Data System (ADS)

    Gafurov, Davrondzhon; Bours, Patrick

In today's society the demand for reliable verification of a user's identity is increasing. Although biometric technologies based on fingerprint or iris can provide accurate and reliable recognition performance, they are inconvenient for periodic or frequent re-verification. In this paper we propose a hip-based user recognition method which can be suitable for implicit and periodic re-verification of identity. In our approach we use a wearable accelerometer sensor attached to the hip of the person, and the measured hip motion signal is then analysed for identity verification purposes. The main analysis steps consist of detecting gait cycles in the signal and matching two sets of detected gait cycles. Evaluating the approach on a hip data set consisting of 400 gait sequences (samples) from 100 subjects, we obtained an equal error rate (EER) of 7.5% and an identification rate at rank 1 of 81.4%. These numbers represent improvements of 37.5% and 11.2%, respectively, over a previous study using the same data set.
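The two analysis steps (gait-cycle detection and cycle matching) can be illustrated with a simple autocorrelation sketch. The lag bounds and normalization below are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def cycle_length(signal, min_lag=20, max_lag=200):
    """Estimate the dominant gait-cycle period (in samples) from the
    strongest autocorrelation peak inside assumed lag bounds."""
    s = signal - signal.mean()
    ac = np.correlate(s, s, mode="full")[len(s) - 1:]
    return int(np.argmax(ac[min_lag:max_lag]) + min_lag)

def cycle_distance(c1, c2):
    """Euclidean distance between two z-normalized gait cycles, as a
    stand-in for the paper's cycle-matching step."""
    c1 = (c1 - c1.mean()) / (c1.std() + 1e-12)
    c2 = (c2 - c2.mean()) / (c2.std() + 1e-12)
    return float(np.linalg.norm(c1 - c2))
```

A verification decision would then threshold the aggregate distance between the enrolled and the probe cycle sets.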

  11. An Efficient Location Verification Scheme for Static Wireless Sensor Networks.

    PubMed

    Kim, In-Hwan; Kim, Bo-Sung; Song, JooSeok

    2017-01-24

In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimations in some situations. Location verification can be the solution to those situations, or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process and their requirements, such as special hardware, a dedicated verifier and a trusted third party, which cause more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between location claimant and verifier for the location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with the other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect over 90% of malicious sensors when sensors in the network have five or more neighbors.

  12. An Efficient Location Verification Scheme for Static Wireless Sensor Networks

    PubMed Central

    Kim, In-hwan; Kim, Bo-sung; Song, JooSeok

    2017-01-01

In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimations in some situations. Location verification can be the solution to those situations, or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process and their requirements, such as special hardware, a dedicated verifier and a trusted third party, which cause more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between location claimant and verifier for the location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with the other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect over 90% of malicious sensors when sensors in the network have five or more neighbors. PMID:28125007

  13. Toward a theory of distributed word expert natural language parsing

    NASA Technical Reports Server (NTRS)

    Rieger, C.; Small, S.

    1981-01-01

    An approach to natural language meaning-based parsing in which the unit of linguistic knowledge is the word rather than the rewrite rule is described. In the word expert parser, knowledge about language is distributed across a population of procedural experts, each representing a word of the language, and each an expert at diagnosing that word's intended usage in context. The parser is structured around a coroutine control environment in which the generator-like word experts ask questions and exchange information in coming to collective agreement on sentence meaning. The word expert theory is advanced as a better cognitive model of human language expertise than the traditional rule-based approach. The technical discussion is organized around examples taken from the prototype LISP system which implements parts of the theory.

  14. Simulation-based MDP verification for leading-edge masks

    NASA Astrophysics Data System (ADS)

Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimura, Aki

    2017-07-01

For IC design starts below the 20nm technology node, the assist features on photomasks shrink well below 60nm, and the printed patterns of those features on masks written by VSB eBeam writers start to show a large deviation from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and adopted, such as rule-based Mask Process Correction (MPC), model-based MPC and eventually model-based MDP. The new MDP methods may place shot edges slightly differently from target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for those assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification, which became a necessity for OPC a decade ago, we see the same trend in MDP today. A simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities. To have a meaningful simulation-based mask check, a good mask process model is needed. The TrueModel® system is a field-tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask checks, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose-margin-related hotspots can also be detected by setting a correct detection threshold. In this paper, we will demonstrate GPU-acceleration for geometry processing, and give examples of mask check results and performance data. 
GPU-acceleration is necessary to make simulation-based mask MDP verification acceptable.

  15. Method for secure electronic voting system: face recognition based approach

    NASA Astrophysics Data System (ADS)

    Alim, M. Affan; Baig, Misbah M.; Mehboob, Shahzain; Naseem, Imran

    2017-06-01

In this paper, we propose a framework for a low-cost secure electronic voting system based on face recognition. Essentially, Local Binary Patterns (LBP) are used to characterize face features in texture format, followed by the chi-square distribution for image classification. Two parallel systems, based on smart phone and web applications, are developed for the face learning and verification modules. The proposed system has two tiers of security, using a person ID followed by face verification. A class-specific threshold is associated with each user to control the security level of face verification. Our system is evaluated on three standard databases and one real home-based database and achieves satisfactory recognition accuracies. Consequently, our proposed system provides a secure, hassle-free voting system that is less intrusive compared with other biometrics.
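A rough sketch of the LBP-plus-chi-square pipeline the abstract describes (the exact LBP variant and classifier in the paper are not specified here; the basic 3x3 operator below is an illustrative stand-in):

```python
import numpy as np

def lbp_image(gray):
    """Basic 3x3 LBP: compare each pixel to its 8 neighbours and pack
    the comparison bits into one 8-bit code per pixel."""
    h, w = gray.shape
    codes = np.zeros((h - 2, w - 2), dtype=np.uint8)
    center = gray[1:-1, 1:-1]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        neigh = gray[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        codes |= (neigh >= center).astype(np.uint8) << bit
    return codes

def chi_square(h1, h2, eps=1e-10):
    """Chi-square distance between two normalized LBP histograms."""
    return np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))
```

Verification would threshold the chi-square distance between the probe's LBP histogram and the enrolled user's, using that user's class-specific threshold.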

  16. Reachability analysis of real-time systems using time Petri nets.

    PubMed

    Wang, J; Deng, Y; Xu, G

    2000-01-01

Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification, because one cannot derive the end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using that technique. In this paper, we present a new reachability-based analysis technique for TPNs for timing property analysis and verification that effectively addresses this problem. Our technique is based on a concept called the clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class, based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing one. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.
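Clock-stamped state classes are beyond a short sketch, but the untimed reachability analysis the technique builds on can be illustrated as a breadth-first search over markings of a plain Petri net (the transition encoding below is an illustrative choice):

```python
from collections import deque

def reachable_markings(initial, transitions):
    """Breadth-first reachability over markings of an (untimed) Petri net.
    A marking is a tuple of per-place token counts; each transition is a
    (consume, produce) pair of such tuples."""
    seen = {initial}
    frontier = deque([initial])
    while frontier:
        m = frontier.popleft()
        for consume, produce in transitions:
            if all(t >= c for t, c in zip(m, consume)):  # transition enabled
                m2 = tuple(t - c + p for t, c, p in zip(m, consume, produce))
                if m2 not in seen:
                    seen.add(m2)
                    frontier.append(m2)
    return seen
```

A TPN analysis replaces each concrete marking here with a state class (marking plus firing-interval constraints), but the tree construction follows the same pattern.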

  17. Marker-based quantification of interfractional tumor position variation and the use of markers for setup verification in radiation therapy for esophageal cancer.

    PubMed

    Jin, Peng; van der Horst, Astrid; de Jong, Rianne; van Hooft, Jeanin E; Kamphuis, Martijn; van Wieringen, Niek; Machiels, Melanie; Bel, Arjan; Hulshof, Maarten C C M; Alderliesten, Tanja

    2015-12-01

    The aim of this study was to quantify interfractional esophageal tumor position variation using markers and investigate the use of markers for setup verification. Sixty-five markers placed in the tumor volumes of 24 esophageal cancer patients were identified in computed tomography (CT) and follow-up cone-beam CT. For each patient we calculated pairwise distances between markers over time to evaluate geometric tumor volume variation. We then quantified marker displacements relative to bony anatomy and estimated the variation of systematic (Σ) and random errors (σ). During bony anatomy-based setup verification, we visually inspected whether the markers were inside the planning target volume (PTV) and attempted marker-based registration. Minor time trends with substantial fluctuations in pairwise distances implied tissue deformation. Overall, Σ(σ) in the left-right/cranial-caudal/anterior-posterior direction was 2.9(2.4)/4.1(2.4)/2.2(1.8) mm; for the proximal stomach, it was 5.4(4.3)/4.9(3.2)/1.9(2.4) mm. After bony anatomy-based setup correction, all markers were inside the PTV. However, due to large tissue deformation, marker-based registration was not feasible. Generally, the interfractional position variation of esophageal tumors is more pronounced in the cranial-caudal direction and in the proximal stomach. Currently, marker-based setup verification is not feasible for clinical routine use, but markers can facilitate the setup verification by inspecting whether the PTV covers the tumor volume adequately. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
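The systematic (Σ) and random (σ) errors quoted above follow the usual population statistics for setup errors: Σ is the standard deviation of the per-patient mean displacements, and σ is the root-mean-square of the per-patient standard deviations. A minimal per-axis sketch (illustrative, not the authors' code):

```python
import numpy as np

def population_errors(displacements):
    """displacements: dict mapping patient -> array of per-fraction marker
    displacements along one axis (mm).
    Returns (Sigma, sigma): Sigma = SD of per-patient means (systematic),
    sigma = RMS of per-patient SDs (random)."""
    means = np.array([d.mean() for d in displacements.values()])
    sds = np.array([d.std(ddof=1) for d in displacements.values()])
    return means.std(ddof=1), np.sqrt((sds ** 2).mean())
```

Running this once per anatomical direction yields the Σ(σ) triplets reported in the abstract.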

  18. A Ten-Year Reflection

    ERIC Educational Resources Information Center

    Phillip, Cyndi

    2016-01-01

    Five initiatives launched during Cyndi Phillip's term as American Association of School Librarians (AASL) President (2006-2007) continue to have an impact on school librarians ten years later. They include the rewriting of AASL's learning standards, introduction of the SKILLS Act, the presentation of the Crystal Apple Award to Scholastic Library…

  19. Teaching about the French Revolution--A Play.

    ERIC Educational Resources Information Center

    Pezone, Michael

    2002-01-01

    Presents a play about the French Revolution, discussing how the play was used within a global history course. States that students read the play, work in groups to rewrite the play, and perform their version of the play. Includes key questions that are asked of the students. (CMK)

  20. Rewriting the Journal

    ERIC Educational Resources Information Center

    Fredette, Michelle

    2012-01-01

    With faculty balking at the price of academic journals, can other digital publishing options get traction? University libraries are no strangers to one of the most popular online alternatives, the open-access archive. These archives enable scholars to upload work--including drafts of articles that are published later in subscription journals--so…

  1. Upgrade of U.S. EPA's Experimental Stream Facility Supervisory Control and Data Acquisition System

    EPA Science Inventory

    The Supervisory control and data acquisition (SCADA) system for the U.S. EPA’s Experimental Stream Facility (ESF) was upgraded using Camile hardware and software in 2015. The upgrade added additional hardwired connections, new wireless capabilities, and included a complete rewrit...

  2. Development of an Executive Level Stock Fund Handbook

    DTIC Science & Technology

    1990-09-01

graphics assistance, thorough proof-reading of countless rewrites, and prayer support. I want to thank Heike, my wife to be, for her tremendous patience...

  3. Ottawa County Writing Process Model for PPO Assessments.

    ERIC Educational Resources Information Center

    Ottawa County Office of Education, OH.

    This guide outlines the writing procedures for English Composition Pupil Performance Objective (PPO) assessments and tests. Procedures for both students and teachers are included for the prewriting, first draft writing, and revising/rewriting sessions. A brief guide to evaluation procedures and intervention strategies is also provided. (MM)

  4. Do It Now!

    ERIC Educational Resources Information Center

    Conners, Keith J.

    1995-01-01

    One college teacher's approach to the problem of student procrastination in research paper writing has been to implement a liberal policy concerning deadlines that includes incentives for early submission of work, such as more extensive feedback and options for rewriting. The policy has had modest success and is appreciated by students for…

  5. Teaching Integer Operations Using Ring Theory

    ERIC Educational Resources Information Center

    Hirsch, Jenna

    2012-01-01

    A facility with signed numbers forms the basis for effective problem solving throughout developmental mathematics. Most developmental mathematics textbooks explain signed number operations using absolute value, a method that involves considering the problem in several cases (same sign, opposite sign), and in the case of subtraction, rewriting the…

  6. Offline signature verification using convolution Siamese network

    NASA Astrophysics Data System (ADS)

    Xing, Zi-Jian; Yin, Fei; Wu, Yi-Chao; Liu, Cheng-Lin

    2018-04-01

This paper presents an offline signature verification approach using a convolutional Siamese neural network. Unlike existing methods, which treat feature extraction and metric learning as two independent stages, we adopt a deep-learning-based framework that combines the two stages and can be trained end-to-end. The experimental results on two public offline databases (GPDSsynthetic and CEDAR) demonstrate the superiority of our method on the offline signature verification problem.
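A common training objective for such Siamese networks is the contrastive loss, which pulls genuine pairs together in embedding space and pushes forgeries beyond a margin. This is a standard choice for metric learning, not necessarily the exact objective used in this paper:

```python
def contrastive_loss(d, y, margin=1.0):
    """Contrastive loss for one signature pair.
    d: distance between the two embeddings produced by the shared network;
    y: 1 for a genuine pair, 0 for a forgery; margin is a hyperparameter."""
    return y * d ** 2 + (1 - y) * max(margin - d, 0.0) ** 2
```

At verification time, the learned distance itself is thresholded: a probe signature is accepted when its embedding lies close enough to the reference signatures.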

  7. QPF verification using different radar-based analyses: a case study

    NASA Astrophysics Data System (ADS)

    Moré, J.; Sairouni, A.; Rigo, T.; Bravo, M.; Mercader, J.

    2009-09-01

Verification of QPF in NWP models has always been challenging, not only for knowing which scores are better suited to quantify a particular skill of a model, but also for choosing the most appropriate methodology when comparing forecasts with observations. On the one hand, an objective verification technique can provide conclusions that are not in agreement with those obtained by the "eyeball" method. Consequently, QPF can provide valuable information to forecasters in spite of having poor scores. On the other hand, there are difficulties in knowing the "truth", so different results can be achieved depending on the procedures used to obtain the precipitation analysis. The aim of this study is to show the importance of combining different precipitation analyses and verification methodologies to obtain a better knowledge of the skills of a forecasting system. In particular, a short-range precipitation forecasting system based on MM5 at 12 km coupled with LAPS is studied in a local convective precipitation event that took place in the NE Iberian Peninsula on October 3rd, 2008. For this purpose, a variety of verification methods (dichotomous, recalibration, and object-oriented methods) are used to verify this case study. At the same time, different precipitation analyses, obtained by interpolating radar data using different techniques, are used in the verification process.
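The dichotomous verification mentioned above reduces to scores computed from a 2x2 contingency table of forecast/observed exceedances of a precipitation threshold. A minimal sketch of three standard scores (the particular scores used in the study are not listed in the abstract):

```python
def categorical_scores(hits, misses, false_alarms):
    """POD, FAR and CSI from a dichotomous (yes/no) contingency table."""
    pod = hits / (hits + misses)                 # probability of detection
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    csi = hits / (hits + misses + false_alarms)  # critical success index
    return pod, far, csi
```

Since the counts depend on which radar-based analysis supplies the "observed" field, the same forecast can score quite differently across analyses, which is the point of the study.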

  8. Bayesian Estimation of Combined Accuracy for Tests with Verification Bias

    PubMed Central

    Broemeling, Lyle D.

    2011-01-01

This presentation will emphasize the estimation of the combined accuracy of two or more tests when verification bias is present. Verification bias occurs when some of the subjects are not subjected to the gold standard. The approach is Bayesian, where the estimation of test accuracy is based on the posterior distribution of the relevant parameter. The accuracy of two combined binary tests is estimated employing either the "believe the positive" or "believe the negative" rule; then the true and false positive fractions for each rule are computed for the two tests. In order to perform the analysis, the missing-at-random assumption is imposed, and an interesting example is provided by estimating the combined accuracy of CT and MRI to diagnose lung cancer. The Bayesian approach is extended to two ordinal tests when verification bias is present, and the accuracy of the combined tests is based on the ROC area of the risk function. An example involving mammography with two readers with extreme verification bias illustrates the estimation of the combined test accuracy for ordinal tests. PMID:26859487
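The "believe the positive" (BP) and "believe the negative" (BN) rules have simple closed forms if the two tests are assumed conditionally independent given disease status (an illustrative simplification; the paper's Bayesian estimation works with posterior distributions of these fractions):

```python
def believe_positive(tpf1, fpf1, tpf2, fpf2):
    """BP rule: the combined test is positive if either test is positive.
    Inputs are the true/false positive fractions of the two tests."""
    tpf = 1 - (1 - tpf1) * (1 - tpf2)
    fpf = 1 - (1 - fpf1) * (1 - fpf2)
    return tpf, fpf

def believe_negative(tpf1, fpf1, tpf2, fpf2):
    """BN rule: the combined test is positive only if both tests are positive."""
    return tpf1 * tpf2, fpf1 * fpf2
```

BP raises sensitivity at the cost of specificity; BN does the opposite, which is why the appropriate rule depends on the clinical context.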

  9. Verification of road databases using multiple road models

    NASA Astrophysics Data System (ADS)

    Ziems, Marcel; Rottensteiner, Franz; Heipke, Christian

    2017-08-01

In this paper a new approach for automatic road database verification based on remote sensing images is presented. In contrast to existing methods, the applicability of the new approach is not restricted to specific road types, context areas or geographic regions. This is achieved by combining several state-of-the-art road detection and road verification approaches that work well under different circumstances. Each one serves as an independent module representing a unique road model and a specific processing strategy. All modules provide independent solutions for the verification problem of each road object stored in the database in the form of two probability distributions, the first one for the state of a database object (correct or incorrect), and the second one for the state of the underlying road model (applicable or not applicable). In accordance with the Dempster-Shafer Theory, both distributions are mapped to a new state space comprising the classes correct, incorrect and unknown. Statistical reasoning is applied to obtain the optimal state of a road object. A comparison with state-of-the-art road detection approaches using benchmark datasets shows that in general the proposed approach provides results with larger completeness. Additional experiments reveal that based on the proposed method a highly reliable semi-automatic approach for road database verification can be designed.
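The evidence-fusion step can be sketched with Dempster's rule of combination over the three-class frame {correct, incorrect, unknown}, treating "unknown" as the full frame of discernment. This is an illustrative simplification of the paper's mapping, not its implementation:

```python
def combine(m1, m2):
    """Dempster's rule for basic mass assignments over
    {'correct', 'incorrect', 'unknown'}, where 'unknown' stands for the
    whole frame (its intersection with any set is that set)."""
    def meet(a, b):
        if a == "unknown":
            return b
        if b == "unknown":
            return a
        return a if a == b else None  # None marks conflicting evidence

    out = {"correct": 0.0, "incorrect": 0.0, "unknown": 0.0}
    conflict = 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            c = meet(a, b)
            if c is None:
                conflict += wa * wb
            else:
                out[c] += wa * wb
    # normalize by the non-conflicting mass
    return {k: v / (1.0 - conflict) for k, v in out.items()}
```

Applying `combine` across all applicable modules accumulates agreement on "correct" or "incorrect", while modules whose road model does not apply contribute mass mainly to "unknown" and barely influence the result.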

  10. Environmental Technology Verification: Test Report of Mobile Source Selective Catalytic Reduction--Nett Technologies, Inc., BlueMAX 100 version A urea-based selective catalytic reduction technology

    EPA Science Inventory

    Nett Technologies’ BlueMAX 100 version A Urea-Based SCR System utilizes a zeolite catalyst coating on a cordierite honeycomb substrate for heavy-duty diesel nonroad engines for use with commercial ultra-low–sulfur diesel fuel. This environmental technology verification (ETV) repo...

  11. Face verification system for Android mobile devices using histogram based features

    NASA Astrophysics Data System (ADS)

    Sato, Sho; Kobayashi, Kazuhiro; Chen, Qiu

    2016-07-01

This paper proposes a face verification system that runs on Android mobile devices. In this system, a facial image is first captured by a built-in camera on the Android device, and then face detection is implemented using Haar-like features and the AdaBoost learning algorithm. The proposed system verifies the detected face using histogram-based features, which are generated as a binary Vector Quantization (VQ) histogram using DCT coefficients in low frequency domains, as well as an Improved Local Binary Pattern (Improved LBP) histogram in the spatial domain. Verification results with the different types of histogram-based features are first obtained separately and then combined by weighted averaging. We evaluate our proposed algorithm using the publicly available ORL database and facial images captured by an Android tablet.

  12. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    PubMed Central

    2009-01-01

    Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075

  13. YIP Formal Synthesis of Software-Based Control Protocols for Fractionated,Composable Autonomous Systems

    DTIC Science & Technology

    2016-07-08

Systems Using Automata Theory and Barrier Certificates. We developed a sound but incomplete method for the computational verification of specifications...method merges ideas from automata-based model checking with those from control theory including so-called barrier certificates and optimization-based... "Automata theory meets barrier certificates: Temporal logic verification of nonlinear systems," IEEE Transactions on Automatic Control, 2015. [J2] R

  14. Verification and Validation of KBS with Neural Network Components

    NASA Technical Reports Server (NTRS)

    Wen, Wu; Callahan, John

    1996-01-01

Artificial Neural Networks (ANN) play an important role in developing robust Knowledge Based Systems (KBS). The ANN-based components used in these systems learn to give appropriate predictions through training with correct input-output data patterns. Unlike a traditional KBS that depends on a rule database and a production engine, an ANN-based system mimics the decisions of an expert without explicitly formulating if-then type rules. In fact, ANNs demonstrate their superiority when such if-then rules are hard for a human expert to generate. Verification of a traditional knowledge based system is based on proof of consistency and completeness of the rule knowledge base and correctness of the production engine. These techniques, however, cannot be directly applied to ANN-based components. In this position paper, we propose a verification and validation procedure for KBS with ANN-based components. The essence of the procedure is to obtain an accurate system specification through incremental modification of the specifications using an ANN rule extraction algorithm.

  15. RF model of the distribution system as a communication channel, phase 2. Volume 1: Summary Report

    NASA Technical Reports Server (NTRS)

    Rustay, R. C.; Gajjar, J. T.; Rankin, R. W.; Wentz, R. C.; Wooding, R.

    1982-01-01

The design, implementation, and verification of a computerized model for predicting the steady-state sinusoidal response of radial (tree) configured distribution feeders were undertaken. That work demonstrated feasibility and validity based on verification measurements made on a limited-size portion of an actual live feeder. On that basis, a follow-on effort was conducted concerned with (1) extending the verification to a greater variety of situations and network sizes, (2) extending the model capabilities for reverse-direction propagation, (3) investigating parameter sensitivities, (4) improving transformer models, and (5) investigating procedures/fixes for ameliorating propagation trouble spots. Results are summarized.

  16. Rewriting Professional Development: Professional Learning Communities in an Urban Charter School

    ERIC Educational Resources Information Center

    Glasheen, Gregory J.

    2017-01-01

    This study challenges traditional professional development models, in which teachers are positioned as receptacles for knowledge and "best practices". This type of professional development devalues the local knowledge teachers possess, their theories of practice (Lytle & Cochran-Smith, 1994), and their ability to reflect on their…

  17. Trident Warrior Buoy Testing

    DTIC Science & Technology

    2013-09-30

the data to track swell events, accurately model swell refraction, and use the data to drive surf-forecasting and other nearshore models (e.g...Temperature (SST). • Addition of an 8GB Micro-SD card for on-board time series storage (can be unpopulated or disabled). • A complete rewrite of the

  18. Faculty Mothers

    ERIC Educational Resources Information Center

    Stockdell-Giesler, Anne; Ingalls, Rebecca

    2007-01-01

    This essay argues that it is time to rewrite the rhetoric of motherhood in higher education, and use American Association of University Professors (AAUP ) recommendations to help. The authors observe that although the AAUP and other groups have urged colleges and universities to strike a work-life balance, academic culture is slow to change, and…

  19. Defense.gov - Special Report: Veterans Employment

    Science.gov Websites

    . Veterans' employment case manager Angela Eberle helped Rivera rewrite his resume and translated his Obama launched the Veterans Employment Center, the first online one-stop shopping tool for veterans Jobs Troops, Vets Want 'Fair Shot' at Employment, Battaglia Says First Lady Asks Governors to Aid

  20. Pandora’s Box: Lethally-Armed Ground Robots in Operations in Iraq and Afghanistan

    DTIC Science & Technology

    2010-10-27

Others debate whether Isaac Asimov's famous Three Laws of Robotics (featured in his book I, Robot and the movie of the same name) could be applied in...http://www.tgdaily.com/hardware-features/43441-engineers-rewrite-asimovs-three-laws (accessed 7 September 2010).

  1. Apologizing in Italian and English.

    ERIC Educational Resources Information Center

    Lipson, Maxine

    1994-01-01

    Compared apology exchanges in Italy and the United States by having 10 Italian university students view American situation comedy television programs and rewrite particular conflict and apology exchanges in an Italian context. The status and role of the programs' participants affected the Italian students' choice of apology strategies more so than…

  2. Rewriting Writing in Higher Education: The Contested Spaces of Proofreading

    ERIC Educational Resources Information Center

    Turner, Joan

    2011-01-01

    This article reports on a research project on proofreading, prompted by its proliferation in contemporary higher education. The article is framed by an academic literacies perspective and develops the concept of "writtenness", which draws attention to both the underlying culturally and socially constructed values relating to the…

  3. The Body Disciplined: Rewriting Teaching Competence and the Doctrine of Reflection

    ERIC Educational Resources Information Center

    Erlandson, Peter

    2005-01-01

    Shortly after the publication of "The Reflective Practitioner" (1983) and the sequel "Educating the Reflective Practitioner" (1987) "reflection-in-action" became a major concept in teacher education. The concept has, however, been criticised on ontological/epistemological as well as practice oriented accounts (Van Manen, 1995; Newman, 1999;…

  4. Digitizing Craft: Creative Writing Studies and New Media--A Proposal

    ERIC Educational Resources Information Center

    Koehler, Adam

    2013-01-01

    This article identifies and examines a digital arm of creative writing studies and organizes that proposal into four categories through which to theorize the "craft" of creative production, each borrowed from Tim Mayers's "(Re)Writing Craft: Composition, Creative Writing, and the Future of English Studies": process, genre, author, and…

  5. Life without Scan-Tron: Tests as Thinking.

    ERIC Educational Resources Information Center

    Posner, Richard

    1987-01-01

    Claims that written tests are superior to objective, scan-tron tests in literature, composition, and vocabulary because they require students to think on paper. Describes the following types of in-class written tests and examines the advantages of each: literary essay, topical composition, imitation, brief answer, timed rewrites, and vocabulary…

  6. Guidelines for Writing (or Rewriting) Manuals for Instructional Software.

    ERIC Educational Resources Information Center

    Litchfield, Brenda C.

    1990-01-01

    Discusses the need for adequate student user manuals for computer software and presents guidelines to help teachers develop these manuals. The sections that a student manual should contain are outlined, including objectives, pretests and posttests for self-evaluation, and worksheets; and examples are given for further clarification. (LRW)

  7. The Outlook for Computer Professions: 1985 Rewrites the Program.

    ERIC Educational Resources Information Center

    Drake, Larry

    1986-01-01

The author states that graduates of junior college programs who learn COBOL will continue to find jobs, but employers will increasingly seek college graduates when filling positions for computer programmers and systems analysts. Areas of growth for computer applications (services, military, data communications, and artificial intelligence) are…

  8. On Grandmother Neurons and Grandfather Clocks

    ERIC Educational Resources Information Center

    Perkins, David

    2009-01-01

    What does contemporary neuroscience offer educational practice? The promise seems immense, as we come to understand better how the brain learns. However, critics caution that only a few concrete implications for practice have emerged, nowhere near a rewrite of the craft of teaching and learning. How then can we understand better the relationship…

  9. The Emergence of the American University: An International Perspective

    ERIC Educational Resources Information Center

    Nelson, Adam R.

    2005-01-01

    In 1979, fourteen years after publishing his landmark work, "The Emergence of the American University," Laurence R. Veysey wrote a forward-looking article for the "American Quarterly" titled "The Autonomy of American History Reconsidered." In his article, Veysey suggested that the time had come to rewrite American…

  10. Conversion of PCDP Dialogs.

    ERIC Educational Resources Information Center

    Bork, Alfred M.

    An introduction to the problems involved in conversion of computer dialogues from one computer language to another is presented. Conversion of individual dialogues by complete rewriting is straightforward, if tedious. To make a general conversion of a large group of heterogeneous dialogue material from one language to another at one step is more…

  11. Into Print: A Practical Guide to Writing, Illustrating, and Publishing.

    ERIC Educational Resources Information Center

    Hill, Mary; Cochran, Wendell

    Designed for writers of nonfiction, this publication provides practical suggestions for writing books and getting them published. The 25 chapters discuss the following topics: planning for book publication, keeping track of money and facts, getting started writing, rewriting, writing reviews of other books, typing and labeling the manuscript,…

  12. Nontraditional Forms of Assessment

    ERIC Educational Resources Information Center

    O'Neal, Brooke

    2015-01-01

    The author discusses the benefits of nontraditional assessments and shares how she has used them in her 5th-grade classroom in rural South Carolina. The examples include doing a Gallery Walk, writing poetry, integrating the arts into other subjects, having students reenact historical scenes, using social media, and rewriting popular songs with…

  13. Rewriting Dominant Narratives of the Academy: Women Faculty of Color and Identity Management

    ERIC Educational Resources Information Center

    Motha, Suhanthie; Varghese, Manka M.

    2018-01-01

    Drawing on Delgado and Yosso's "counterstory," Yosso's "community cultural wealth," and Alsup's "borderland discourses," the authors, who are women of color academics, use narratives from their lives to discuss the ways in which they draw on resources in managing and reconfiguring their multiple identities within the…

  14. An Approach to Revision and Evaluation of Student Writing.

    ERIC Educational Resources Information Center

    Duke, Charles R.

    An approach to evaluating student writing that emphasizes reformulation and deemphasizes grades teaches students that reworking their writing is a necessary and acceptable part of the writing process. Reformulation is divided into rewriting, revising, and editing. The instructor diagnoses student papers to determine significant problems on a…

  15. The BASIC Instructional Program: Conversion into MAINSAIL Language.

    ERIC Educational Resources Information Center

    Dageforde, Mary L.

    This report summarizes the rewriting of the BASIC Instructional Program (BIP) (a "hands-on laboratory" that teaches elementary programming in the BASIC language) from SAIL (a programming language available only on PDP-10 computers) into MAINSAIL (a language designed for portability on a broad class of computers). Four sections contain…

  16. 75 FR 48873 - Acquisition Regulation Rewrite

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-12

    ... Acquisition Regulation (DIAR). This action revises the DIAR, 48 CFR chapter 14, but does not impose any new..., 2010. No public comments were received. DOI has concluded that the interim rule should be adopted as a... DEPARTMENT OF THE INTERIOR Office of the Secretary 48 CFR Chapter 14 RIN 1093-AA11 Acquisition...

  17. 76 FR 54510 - Notice of Availability of Proposed Models for Plant-Specific Adoption of Technical Specifications...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-01

    ... Models for Plant-Specific Adoption of Technical Specifications Task Force Traveler TSTF-500, Revision 2... Specifications Task Force (TSTF) Traveler TSTF- 500, Revision 2, ``DC Electrical Rewrite--Update to TSTF-360....8.6, ``Battery Cell Parameters.'' Additionally, a new Administrative Controls program, titled...

  18. Optical Disc Technology for Information Management.

    ERIC Educational Resources Information Center

    Brumm, Eugenia K.

    1991-01-01

    This summary of the literature on document image processing from 1988-90 focuses on WORM (write once read many) technology and on rewritable (i.e., erasable) optical discs, and excludes CD-ROM. Highlights include vendors and products, standards, comparisons of storage media, software, legal issues, records management, indexing, and computer…

  19. Rewriting the Vietnam Narrative: Strategic Partnership Opportunities in Southeast Asia

    DTIC Science & Technology

    2013-03-01

    American foot soldiers. Fighting continued against the Cambodians, Mongols and even the Chinese again as they all invaded and were eventually...occupied it until the defeat of the Japanese Empire by the Allies in 1945 and the French re-occupied their former colonies. The French fought

  20. How New Zealand Consumers Respond to Plain English.

    ERIC Educational Resources Information Center

    Campbell, Nittaya

    1999-01-01

    Considers how New Zealand has seen a need for providing readily understandable business and government documents. Reports a psycholinguistic study testing the level of consumer comprehension of bank contracts, and the effect of using plain English to rewrite them. Finds that the most effective means of enhancing comprehension was that which…

  1. Resurrection Symphony: "El Sistema" as Ideology in Venezuela and Los Angeles

    ERIC Educational Resources Information Center

    Fink, Robert

    2016-01-01

    The explosive growth of Venezuela's "El Sistema" is rewriting the agenda of musical education in the West. Many commentators from the world of classical music react to the spectacle of dedicated young colonial musicians playing European masterworks as a kind of "miracle," accepting "Sistema" founder José Antonio…

  2. The Many Faces of Censorship.

    ERIC Educational Resources Information Center

    Peck, Richard

    1999-01-01

An author of 26 books for young adults writes about different forms of censorship, including rewriting history textbooks and parents who censor themselves by not staying in touch with their children. Citing Cormier's "The Chocolate War" and Golding's "Lord of the Flies" and making reference to the Columbine school murders, he illustrates the…

  3. Encouraging Civic Engagement through Extended Community Writing Projects: Rewriting the Curriculum

    ERIC Educational Resources Information Center

    Simmons, Michele

    2010-01-01

    Developing community writing projects that effectively benefit students, the community, and the goals of the writing program is a tricky task. Drawing from recent scholarship and the author's own challenges with integrating meaningful civic engagement into the professional writing classes at her university, she examines limitations of single…

  4. When Bad Things Happen to Good Superintendents: What Comes between Well-Meaning School Boards and Their Superintendents?

    ERIC Educational Resources Information Center

    Marlowe, John

    2001-01-01

    Major causes of "sudden-death syndrome" for superintendents include mismatched priorities, unseen problems, interference from special-interest groups, and misdirected disagreements. Beleaguered superintendents should be positive and proactive, act and speak as one person, understand agreements, seek help, listen, rewrite personnel…

  5. 75 FR 19827 - Acquisition Regulation Rewrite

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-15

    ... of the Secretary, telephone (202) 513-0747, fax (202) 219-4244, or e-mail [email protected]ios... rule we did not conduct or use a study, experiment, or survey requiring peer review under the Data... endorsement of a product, service or position which the contractor represents. 1403.570-2 Procedures. If a...

  6. Social Studies Research Papers: A Writing Process Approach.

    ERIC Educational Resources Information Center

    Gilstrap, Robert L.

    1987-01-01

    Describes a writing process approach to research papers which involves four steps: prewriting, composing, rewriting, and sharing. Illustrates the process using an intermediate grade level example but states that the process is appropriate at higher levels. Stresses that this approach is important because it integrates writing skills with social…

  7. Rewriting evolution--"been there, done that".

    PubMed

    Penny, David

    2013-01-01

    A recent paper by a science journalist in Nature shows major errors in understanding phylogenies, in this case of placental mammals. The underlying unrooted tree is probably correct, but the placement of the root just reflects a well-known error from the acceleration in the rate of evolution among some myomorph rodents.

  8. 12 CFR 1235.4 - Minimum requirements of a record retention program.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... appropriate to support administrative, business, external and internal audit functions, and litigation of the... for appropriate back-up and recovery of electronic records to ensure the same accuracy as the primary... records, preferably searchable, must be maintained on immutable, non-rewritable storage in a manner that...

  9. 12 CFR 1235.4 - Minimum requirements of a record retention program.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... appropriate to support administrative, business, external and internal audit functions, and litigation of the... for appropriate back-up and recovery of electronic records to ensure the same accuracy as the primary... records, preferably searchable, must be maintained on immutable, non-rewritable storage in a manner that...

  10. 12 CFR 1235.4 - Minimum requirements of a record retention program.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... appropriate to support administrative, business, external and internal audit functions, and litigation of the... for appropriate back-up and recovery of electronic records to ensure the same accuracy as the primary... records, preferably searchable, must be maintained on immutable, non-rewritable storage in a manner that...

  11. Being There: (Re)Making the Assessment Scene

    ERIC Educational Resources Information Center

    Gallagher, Chris W.

    2011-01-01

    I use Burkean analysis to show how neoliberalism undermines faculty assessment expertise and underwrites testing industry expertise in the current assessment scene. Contending that we cannot extricate ourselves from our limited agency in this scene until we abandon the familiar "stakeholder" theory of power, I propose a rewriting of the…

  12. Setting the Record Straight

    ERIC Educational Resources Information Center

    Ruffins, Paul

    2010-01-01

    Native Americans have long struggled to battle Hollywood stereotypes, correct the distorted "official" histories found in textbooks and museums and present their stories on their own terms. It is not surprising that a group of Native American scholars and activists is gearing up for an effort to rewrite their history to clarify the true…

  13. 77 FR 59790 - General Services Administration Acquisition Regulation (GSAR); Rewrite of Part 504...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-01

    ....'' GSAR Subpart 504.11, Central Contractor Registration (CCR) and GSAR 504.1103, Procedures, to add the... Sec. 504.1103 Procedures. Subpart 504.11--Central Contractor Registration 504.1103 Procedures. In addition to the requirements found in FAR 4.1103, prior to awarding a contractual instrument the...

  14. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC Robot arm and CASE backhoe validation and a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.

  15. Automated Vectorization of Decision-Based Algorithms

    NASA Technical Reports Server (NTRS)

    James, Mark

    2006-01-01

Virtually all existing vectorization algorithms are designed to only analyze the numeric properties of an algorithm and distribute those elements across multiple processors. This advances the state of the practice because it is the only known system, at the time of this reporting, that takes high-level statements and analyzes them for their decision properties and converts them to a form that allows them to automatically be executed in parallel. The software takes a high-level source program that describes a complex decision-based condition and rewrites it as a disjunctive set of component Boolean relations that can then be executed in parallel. This is important because parallel architectures are becoming more commonplace in conventional systems and they have always been present in NASA flight systems. This technology allows one to take existing condition-based code and automatically vectorize it so it naturally decomposes across parallel architectures.
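    The decomposition the abstract describes can be approximated with a small truth-table sketch: a Boolean decision condition is enumerated into its satisfying assignments (minterms), each of which could then be checked independently in parallel. This is a minimal illustration under assumed semantics, not the reported NASA software; the predicate and variable names are hypothetical.

    ```python
    from itertools import product

    def to_disjuncts(pred, names):
        """Enumerate the predicate's truth table and return the satisfying
        assignments (minterms); each disjunct can be evaluated independently,
        e.g. on a separate processor."""
        rows = product([False, True], repeat=len(names))
        return [dict(zip(names, vals)) for vals in rows
                if pred(**dict(zip(names, vals)))]

    # A small decision condition: (a or b) and c
    disjuncts = to_disjuncts(lambda a, b, c: (a or b) and c, ["a", "b", "c"])
    print(len(disjuncts))  # 3 satisfying assignments
    ```

    Any input satisfying the original condition satisfies exactly one minterm, so the disjuncts partition the work without changing the decision's outcome.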

  16. Concept document of the repository-based software engineering program: A constructive appraisal

    NASA Technical Reports Server (NTRS)

    1992-01-01

    A constructive appraisal of the Concept Document of the Repository-Based Software Engineering Program is provided. The Concept Document is designed to provide an overview of the Repository-Based Software Engineering (RBSE) Program. The Document should be brief and provide the context for reading subsequent requirements and product specifications. That is, all requirements to be developed should be traceable to the Concept Document. Applied Expertise's analysis of the Document was directed toward assuring that: (1) the Executive Summary provides a clear, concise, and comprehensive overview of the Concept (rewrite as necessary); (2) the sections of the Document make best use of the NASA 'Data Item Description' for concept documents; (3) the information contained in the Document provides a foundation for subsequent requirements; and (4) the document adequately: identifies the problem being addressed; articulates RBSE's specific role; specifies the unique aspects of the program; and identifies the nature and extent of the program's users.

  17. Evolution and coevolution of developmental programs

    NASA Astrophysics Data System (ADS)

    Jacob, Christian

    1999-09-01

The developmental processes of single organisms, such as growth and structure formation, can be described by parallel rewrite systems in the form of Lindenmayer systems, which also allow one to generate geometrical structures in 3D space using turtle interpretation. We present examples of L-systems for growth programs of plant-like structures. Evolution-based programming techniques are applied to design L-systems by Genetic L-system Programming (GLP), demonstrating how developmental programs for plants exhibiting specific morphogenetic properties can be interactively bred or automatically evolved. Finally, we demonstrate coevolutionary effects among plant populations consisting of different species, interacting with each other, competing for resources like sunlight and nutrients, and evolving successful reproduction strategies in their specific environments.
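    A parallel L-system rewrite of this kind is easy to sketch. The rules below are the textbook bracketed plant rules from the L-system literature, not necessarily those evolved by GLP (hypothetical example):

    ```python
    def lsystem(axiom, rules, steps):
        """Apply every production rule to every symbol in parallel,
        `steps` times; symbols without a rule are copied unchanged."""
        s = axiom
        for _ in range(steps):
            s = "".join(rules.get(c, c) for c in s)
        return s

    # Classic bracketed plant rules; in turtle interpretation, F draws a
    # segment, +/- turn, and [ ] push/pop the turtle state (branching).
    plant = {"X": "F+[[X]-X]-F[-FX]+X", "F": "FF"}
    print(lsystem("X", plant, 1))  # F+[[X]-X]-F[-FX]+X
    ```

    Feeding the resulting string to a turtle interpreter yields the familiar branching plant-like geometry; each extra step roughly doubles the drawn segments.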

  18. Primal-dual methods of shape sensitivity analysis for curvilinear cracks with nonpenetration

    NASA Astrophysics Data System (ADS)

    Kovtunenko, V. A.

    2006-10-01

Based on a level-set description of a crack moving with a given velocity, the problem of shape perturbation of the crack is considered. Nonpenetration conditions are imposed between opposite crack surfaces, which result in a constrained minimization problem describing equilibrium of a solid with the crack. We suggest a minimax formulation of the state problem, thus allowing curvilinear (nonplanar) cracks to be considered. Utilizing primal-dual methods of shape sensitivity analysis, we obtain the general formula for a shape derivative of the potential energy, which describes an energy-release rate for the curvilinear cracks. The conditions sufficient to rewrite it in the form of a path-independent integral (J-integral) are derived.

  19. Proving refinement transformations using extended denotational semantics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winter, V.L.; Boyle, J.M.

    1996-04-01

TAMPR is a fully automatic transformation system based on syntactic rewrites. Our approach in a correctness proof is to map the transformation into an axiomatized mathematical domain where formal (and automated) reasoning can be performed. This mapping is accomplished via an extended denotational semantic paradigm. In this approach, the abstract notion of a program state is distributed between an environment function and a store function. Such a distribution introduces properties that go beyond the abstract state that is being modeled. The reasoning framework needs to be aware of these properties in order to successfully complete a correctness proof. This paper discusses some of our experiences in proving the correctness of TAMPR transformations.
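    As a toy illustration of syntactic rewriting to a fixed point (the rules here are invented for illustration; TAMPR's actual transformations operate on parse trees, not raw strings):

    ```python
    def rewrite_fixpoint(term, rules, limit=1000):
        """Repeatedly apply string rewrite rules until no rule changes the
        term (a fixed point) or the step limit is hit."""
        for _ in range(limit):
            new = term
            for lhs, rhs in rules:
                new = new.replace(lhs, rhs)
            if new == term:
                return term
            term = new
        return term

    # Two simplification rules: additive and multiplicative identities.
    rules = [("+0", ""), ("*1", "")]
    print(rewrite_fixpoint("x+0*1+0", rules))  # x
    ```

    Proving such a system correct amounts to showing each rule preserves the term's meaning under the chosen semantics, which is what the paper's extended denotational framework supplies for TAMPR.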

  20. WE-DE-201-11: Sensitivity and Specificity of Verification Methods Based On Total Reference Air Kerma (TRAK) Or On User Provided Dose Points for Graphically Planned Skin HDR Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damato, A; Devlin, P; Bhagwat, M

Purpose: To investigate the sensitivity and specificity of a novel verification methodology for image-guided skin HDR brachytherapy plans using a TRAK-based reasonableness test, compared to a typical manual verification methodology. Methods: Two methodologies were used to flag treatment plans necessitating additional review due to a potential discrepancy of 3 mm between planned dose and clinical target in the skin. Manual verification was used to calculate the discrepancy between the average dose to points positioned at time of planning representative of the prescribed depth and the expected prescription dose. Automatic verification was used to calculate the discrepancy between TRAK of the clinical plan and its expected value, which was calculated using standard plans with varying curvatures, ranging from flat to cylindrically circumferential. A plan was flagged if a discrepancy >10% was observed. Sensitivity and specificity were calculated using as a criterion for true positive that >10% of plan dwells had a distance to prescription dose >1 mm different from prescription depth (3 mm + size of applicator). All HDR image-based skin brachytherapy plans treated at our institution in 2013 were analyzed. Results: 108 surface applicator plans to treat skin of the face, scalp, limbs, feet, hands or abdomen were analyzed. Median number of catheters was 19 (range, 4 to 71) and median number of dwells was 257 (range, 20 to 1100). Sensitivity/specificity were 57%/78% for manual and 70%/89% for automatic verification. Conclusion: A check based on expected TRAK value is feasible for irregularly shaped, image-guided skin HDR brachytherapy. This test yielded higher sensitivity and specificity than a test based on the identification of representative points, and can be implemented with a dedicated calculation code or with pre-calculated lookup tables of ideally shaped, uniform surface applicators.
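    The sensitivity/specificity figures quoted above come from tallying flagged plans against the ground-truth criterion; a minimal sketch with made-up data (the plan results below are hypothetical, not the study's):

    ```python
    def sensitivity_specificity(flagged, truth):
        """flagged: plan failed the >10% discrepancy check;
        truth: plan truly deviated per the dwell-distance criterion.
        Returns (sensitivity, specificity)."""
        tp = sum(1 for f, t in zip(flagged, truth) if f and t)
        fn = sum(1 for f, t in zip(flagged, truth) if not f and t)
        tn = sum(1 for f, t in zip(flagged, truth) if not f and not t)
        fp = sum(1 for f, t in zip(flagged, truth) if f and not t)
        return tp / (tp + fn), tn / (tn + fp)

    # Hypothetical verification results for 10 plans.
    flagged = [True, True, True, False, False, False, False, True, False, False]
    truth   = [True, True, False, True, False, False, False, False, False, False]
    sens, spec = sensitivity_specificity(flagged, truth)
    print(sens, spec)
    ```

    Sensitivity is the fraction of truly deviating plans the check catches; specificity is the fraction of acceptable plans it correctly leaves unflagged.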

  1. Verification of EPA's " Preliminary remediation goals for radionuclides" (PRG) electronic calculator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stagich, B. H.

The U.S. Environmental Protection Agency (EPA) requested an external, independent verification study of their “Preliminary Remediation Goals for Radionuclides” (PRG) electronic calculator. The calculator provides information on establishing PRGs for radionuclides at Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) sites with radioactive contamination (Verification Study Charge, Background). These risk-based PRGs set concentration limits using carcinogenic toxicity values under specific exposure conditions (PRG User’s Guide, Section 1). The purpose of this verification study is to ascertain that the computer code has no inherent numerical problems with obtaining solutions, as well as to ensure that the equations are programmed correctly.

  2. VERIFYING THE VOC CONTROL PERFORMANCE OF BIOREACTORS

    EPA Science Inventory

    The paper describes the verification testing approach used to collect high-quality, peer-reviewed data on the performance of bioreaction-based technologies for the control of volatile organic compounds (VOCs). The verification protocol that describes the approach for these tests ...

  3. Feasibility of biochemical verification in a web-based smoking cessation study.

    PubMed

    Cha, Sarah; Ganz, Ollie; Cohn, Amy M; Ehlke, Sarah J; Graham, Amanda L

    2017-10-01

Cogent arguments have been made against the need for biochemical verification in population-based studies with low-demand characteristics. Despite this fact, studies involving digital interventions (low-demand) are often required in peer review to report biochemically verified abstinence. To address this discrepancy, we examined the feasibility and costs of biochemical verification in a web-based study conducted with a national sample. Participants were 600 U.S. adult current smokers who registered on a web-based smoking cessation program and completed surveys at baseline and 3 months. Saliva sampling kits were sent to participants who reported 7-day abstinence at 3 months, and analyzed for cotinine. The response rate at 3 months was 41.2% (n=247): 93 participants reported 7-day abstinence (38%) and were mailed a saliva kit (71% returned). The discordance rate was 36.4%. Participants with discordant responses were more likely to report 3-month use of nicotine replacement therapy or e-cigarettes than those with concordant responses (79.2% vs. 45.2%, p=0.007). The total cost of saliva sampling was $8280 ($125/sample). Biochemical verification was both time- and cost-intensive, and yielded a relatively small number of samples due to low response rates and use of other nicotine products during the follow-up period. There was a high rate of discordance of self-reported abstinence and saliva testing. Costs for data collection may be prohibitive for studies with large sample sizes or limited budgets. Our findings echo previous statements that biochemical verification is not necessary in population-based studies, and add evidence specific to technology-based studies. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Comparison between In-house developed and Diamond commercial software for patient specific independent monitor unit calculation and verification with heterogeneity corrections.

    PubMed

    Kuppusamy, Vijayalakshmi; Nagarajan, Vivekanandan; Jeevanandam, Prakash; Murugan, Lavanya

    2016-02-01

The study aimed to compare two different monitor unit (MU) or dose verification software packages in volumetric modulated arc therapy (VMAT), using a modified Clarkson's integration technique for 6 MV photon beams. An in-house Excel spreadsheet-based monitor unit verification calculation (MUVC) program and PTW's DIAMOND secondary check software (SCS), version 6, were used as a secondary check to verify the MU or dose calculated by the treatment planning system (TPS). In this study 180 patients were grouped into 61 head and neck, 39 thorax and 80 pelvic sites. Verification plans were created using the PTW OCTAVIUS-4D phantom and also measured using the 729-chamber detector array, with the isocentre as the point of measurement for each field. In the analysis of 154 clinically approved VMAT plans with the isocentre at a region above -350 HU, using heterogeneity corrections, the in-house spreadsheet-based MUVC program and the Diamond SCS showed good agreement with the TPS. The overall percentage average deviations for all sites were (-0.93% ± 1.59%) for the in-house Excel spreadsheet-based MUVC program and (1.37% ± 2.72%) for the Diamond SCS. For 26 clinically approved VMAT plans with the isocentre at a region below -350 HU, both the in-house spreadsheet-based MUVC program and the Diamond SCS showed higher variations. It can be concluded that for patient-specific quality assurance (QA), the in-house Excel spreadsheet-based MUVC program and the Diamond SCS can be used as a simple and fast accompaniment to measurement-based verification for plans with the isocentre at a region above -350 HU. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
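    The percentage deviations above are simply the signed difference of the independent check from the TPS value; a minimal sketch (the tolerance value is illustrative, not taken from the paper):

    ```python
    def percent_deviation(tps_value, check_value):
        """Signed percentage deviation of the independent check
        (MU or dose) from the TPS-calculated value."""
        return 100.0 * (check_value - tps_value) / tps_value

    def within_tolerance(tps_value, check_value, tol=3.0):
        """Pass the secondary check if the deviation is inside ±tol percent
        (tol is a hypothetical action level)."""
        return abs(percent_deviation(tps_value, check_value)) <= tol

    print(percent_deviation(250.0, 245.0))  # -2.0
    print(within_tolerance(250.0, 245.0))   # True
    ```

    Averaging these signed deviations over a plan cohort gives site-level summaries of the form quoted in the abstract.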

  5. Acoustic-based proton range verification in heterogeneous tissue: simulation studies

    NASA Astrophysics Data System (ADS)

    Jones, Kevin C.; Nie, Wei; Chu, James C. H.; Turian, Julius V.; Kassaee, Alireza; Sehgal, Chandra M.; Avery, Stephen

    2018-01-01

    Acoustic-based proton range verification (protoacoustics) is a potential in vivo technique for determining the Bragg peak position. Previous measurements and simulations have been restricted to homogeneous water tanks. Here, a CT-based simulation method is proposed and applied to a liver and prostate case to model the effects of tissue heterogeneity on the protoacoustic amplitude and time-of-flight range verification accuracy. For the liver case, posterior irradiation with a single proton pencil beam was simulated for detectors placed on the skin. In the prostate case, a transrectal probe measured the protoacoustic pressure generated by irradiation with five separate anterior proton beams. After calculating the proton beam dose deposition, each CT voxel’s material properties were mapped based on Hounsfield Unit values, and thermoacoustically-generated acoustic wave propagation was simulated with the k-Wave MATLAB toolbox. By comparing the simulation results for the original liver CT to homogenized variants, the effects of heterogeneity were assessed. For the liver case, 1.4 cGy of dose at the Bragg peak generated 50 mPa of pressure (13 cm distal), a 2×  lower amplitude than simulated in a homogeneous water tank. Protoacoustic triangulation of the Bragg peak based on multiple detector measurements resulted in 0.4 mm accuracy for a δ-function proton pulse irradiation of the liver. For the prostate case, higher amplitudes are simulated (92-1004 mPa) for closer detectors (<8 cm). For four of the prostate beams, the protoacoustic range triangulation was accurate to  ⩽1.6 mm (δ-function proton pulse). Based on the results, application of protoacoustic range verification to heterogeneous tissue will result in decreased signal amplitudes relative to homogeneous water tank measurements, but accurate range verification is still expected to be possible.
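    Triangulating a source from several detectors' time-of-flight readings can be sketched as a linear least-squares problem. The speed of sound (1540 m/s, a common soft-tissue value) and the detector layout below are assumptions for illustration, not the paper's simulation setup:

    ```python
    import numpy as np

    def triangulate(detectors, tofs, c=1540.0):
        """Locate an acoustic source from time-of-flight measurements.
        Subtracting the first detector's range equation |x - d_i|^2 = r_i^2
        cancels the quadratic term, leaving a linear system A x = b."""
        d = np.asarray(detectors, float)
        r = c * np.asarray(tofs, float)
        A = 2.0 * (d[1:] - d[0])
        b = (np.sum(d[1:] ** 2, axis=1) - np.sum(d[0] ** 2)) - (r[1:] ** 2 - r[0] ** 2)
        x, *_ = np.linalg.lstsq(A, b, rcond=None)
        return x

    # Four detectors around a source at (0.03, 0.04) m (made-up geometry);
    # exact time-of-flights are synthesized from the known source position.
    dets = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (0.1, 0.1)]
    src = np.array([0.03, 0.04])
    tofs = [np.linalg.norm(src - np.array(p)) / 1540.0 for p in dets]
    print(triangulate(dets, tofs))  # ≈ [0.03 0.04]
    ```

    In heterogeneous tissue the effective sound speed along each path varies, which is one reason the abstract's CT-based simulation is needed to quantify the residual range error.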

  6. A Comparative Study of Two Azimuth Based Non Standard Location Methods

    DTIC Science & Technology

    2017-03-23

Rongsong Jih, U.S. Department of State / Arms Control, Verification, and Compliance Bureau, 2201 C Street, NW, Washington...cable. The so-called “Yin Zhong Xian” (“引中线” in Chinese) algorithm, hereafter the YZX method, is an Oriental version of the IPB-based procedure. It…

  7. Online 3D EPID-based dose verification: Proof of concept

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spreeuw, Hanno; Rozendaal, Roel, E-mail: r.rozenda

    Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of thismore » study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. 
The complete processing of a single portal frame, including dose verification, took 266 ± 11 ms on a dual octocore Intel Xeon E5-2630 CPU running at 2.40 GHz. The introduced delivery errors were detected after 5–10 s irradiation time. Conclusions: A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for two different kinds of gross delivery errors. Thus, online 3D dose verification has been technologically achieved.
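    A minimal sketch of the per-frame verification step described above, in Python; the 10% tolerance and the function and key names are illustrative assumptions, not the clinical thresholds or code of the authors:

```python
def check_frame(recon, planned, tol=0.10):
    """Compare a reconstructed 3D dose against the planned dose for one EPID
    frame. `recon` and `planned` hold per-volume statistics (Gy): mean dose in
    the target and nontarget volumes, and the near-maximum dose D2 in the
    nontarget volume. `tol` is an illustrative 10% relative tolerance."""
    def deviates(key):
        return abs(recon[key] - planned[key]) > tol * planned[key]

    errors = [k for k in ("target_mean", "nontarget_mean", "nontarget_d2")
              if deviates(k)]
    # In the online system, any deviation would trigger halting the linac.
    return {"halt_beam": bool(errors), "failed_metrics": errors}
```

In the real system this check must finish, together with 3D dose reconstruction, inside the 420 ms frame-acquisition window.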

  8. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform

    PubMed Central

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-01-01

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP) based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform’s mathematical model, taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument’s working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and type of reference distances can be created without using a physical gauge, thus optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform. PMID:27869722
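    A virtual distance reduces to the Euclidean distance between two measured points mapped into a common frame through the platform's mathematical model; this Python sketch uses hypothetical 4x4 homogeneous transforms in place of the published IMP model:

```python
import math

def transform(T, p):
    """Apply a 4x4 homogeneous transform (row-major nested lists) to a
    3D point, returning the point in the common global frame."""
    x, y, z = p
    v = (x, y, z, 1.0)
    return [sum(T[r][c] * v[c] for c in range(4)) for r in range(3)]

def virtual_distance(T_a, p_a, T_b, p_b):
    """Distance between two 'virtual points': sphere centres measured by the
    AACMM from two indexed platform positions, each mapped into the global
    frame through that position's transform (hypothetical interface)."""
    ga, gb = transform(T_a, p_a), transform(T_b, p_b)
    return math.dist(ga, gb)
```

Because the transforms come from the IMP model rather than from a physical artefact, any pair of measured points yields a usable reference distance.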

  10. Verification of performance specifications for a US Food and Drug Administration-approved molecular microbiology test: Clostridium difficile cytotoxin B using the Becton, Dickinson and Company GeneOhm Cdiff assay.

    PubMed

    Schlaberg, Robert; Mitchell, Michael J; Taggart, Edward W; She, Rosemary C

    2012-01-01

    US Food and Drug Administration (FDA)-approved diagnostic tests based on molecular genetic technologies are becoming available for an increasing number of microbial pathogens. Advances in technology and lower costs have moved molecular diagnostic tests formerly performed for research purposes only into much wider use in clinical microbiology laboratories. To provide an example of laboratory studies performed to verify the performance of an FDA-approved assay for the detection of Clostridium difficile cytotoxin B compared with the manufacturer's performance standards. We describe the process and protocols used by a laboratory for verification of an FDA-approved assay, assess data from the verification studies, and implement the assay after verification. Performance data from the verification studies conducted by the laboratory were consistent with the manufacturer's performance standards and the assay was implemented into the laboratory's test menu. Verification studies are required for FDA-approved diagnostic assays prior to use in patient care. Laboratories should develop a standardized approach to verification studies that can be adapted and applied to different types of assays. We describe the verification of an FDA-approved real-time polymerase chain reaction assay for the detection of a toxin gene in a bacterial pathogen.

  11. Deductive Evaluation: Implicit Code Verification With Low User Burden

    NASA Technical Reports Server (NTRS)

    Di Vito, Ben L.

    2016-01-01

    We describe a framework for symbolically evaluating C code using a deductive approach that discovers and proves program properties. The framework applies Floyd-Hoare verification principles in its treatment of loops, with a library of iteration schemes serving to derive loop invariants. During evaluation, theorem proving is performed on-the-fly, obviating the generation of verification conditions normally needed to establish loop properties. A PVS-based prototype is presented along with results for sample C functions.
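    The role of a loop invariant in Floyd-Hoare reasoning can be illustrated with a runtime check; the framework itself targets C and discharges such invariants deductively in PVS rather than testing them, so this Python sketch is only illustrative:

```python
def sum_upto(n):
    """Compute 0 + 1 + ... + n while asserting the Floyd-Hoare loop
    invariant total == i*(i-1)//2 — the kind of property an iteration
    scheme would supply and a prover would discharge once, deductively."""
    total, i = 0, 0
    while i <= n:
        assert total == i * (i - 1) // 2, "invariant violated"
        total += i
        i += 1
    # Invariant at exit (i == n + 1) yields the postcondition:
    assert total == n * (n + 1) // 2
    return total
```

In the deductive setting the invariant is proved to hold for all iterations, so no verification conditions need to be generated and checked per run.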

  12. Verification of the FtCayuga fault-tolerant microprocessor system. Volume 1: A case study in theorem prover-based verification

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam; Bickford, Mark

    1991-01-01

    The design and formal verification of a hardware system for a task that is an important component of a fault tolerant computer architecture for flight control systems is presented. The hardware system implements an algorithm for obtaining interactive consistency (Byzantine agreement) among four microprocessors as a special instruction on the processors. The property verified ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold. An assumption is made that the processors execute synchronously. For verification, the authors used a computer-aided hardware design verification tool, Spectool, and the theorem prover, Clio. A major contribution of the work is the demonstration of a significant fault tolerant hardware design that is mechanically verified by a theorem prover.
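    Interactive consistency among four synchronous processors with at most one faulty member can be sketched with the classic two-round oral-messages scheme; this generic Python simulation illustrates the property that was verified, not the FtCayuga hardware or its Clio proof:

```python
from collections import Counter

DEFAULT = "DEFAULT"

def om1(values, traitor=None):
    """Simulate interactive consistency among n >= 4 synchronous processors
    with at most one traitor, in the style of OM(1). `values` maps processor
    id -> private value; the traitor sends conflicting values. Returns each
    processor's decided vector; all loyal processors' vectors agree."""
    procs = sorted(values)
    # Round 1: each source i sends its value to every other processor j.
    direct = {(i, j): ("lie", j) if i == traitor else values[i]
              for i in procs for j in procs if i != j}
    # Round 2: each j relays what it heard from i to the remaining k.
    relay = {(i, j, k): ("lie2", k) if j == traitor else v
             for (i, j), v in direct.items()
             for k in procs if k not in (i, j)}
    decided = {}
    for k in procs:
        vec = {}
        for i in procs:
            if i == k:
                vec[i] = values[i]
                continue
            ballots = [direct[i, k]] + [relay[i, j, k]
                                        for j in procs if j not in (i, k)]
            val, cnt = Counter(ballots).most_common(1)[0]
            # Take a strict majority, else a fixed default value, so that
            # loyal processors agree even about the traitor's entry.
            vec[i] = val if 2 * cnt > len(ballots) else DEFAULT
        decided[k] = vec
    return decided
```

With four processors this tolerates exactly one fault, matching the verified four-microprocessor configuration.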

  13. GENERIC VERIFICATION PROTOCOL FOR AQUEOUS CLEANER RECYCLING TECHNOLOGIES

    EPA Science Inventory

    This generic verification protocol has been structured based on a format developed for ETV-MF projects. This document describes the intended approach and explains plans for testing with respect to areas such as test methodology, procedures, parameters, and instrumentation. Also ...

  14. Patient-specific IMRT verification using independent fluence-based dose calculation software: experimental benchmarking and initial clinical experience.

    PubMed

    Georg, Dietmar; Stock, Markus; Kroupa, Bernhard; Olofsson, Jörgen; Nyholm, Tufve; Ahnesjö, Anders; Karlsson, Mikael

    2007-08-21

    Experimental methods are commonly used for patient-specific intensity-modulated radiotherapy (IMRT) verification. The purpose of this study was to investigate the accuracy and performance of independent dose calculation software (denoted as 'MUV', for monitor unit verification) for patient-specific quality assurance (QA). Fifty-two patients receiving step-and-shoot IMRT were considered. IMRT plans were recalculated by the treatment planning systems (TPS) in a dedicated QA phantom, in which an experimental 1D and 2D verification (0.3 cm³ ionization chamber; films) was performed. Additionally, an independent dose calculation was performed. The fluence-based algorithm of MUV accounts for collimator transmission, rounded leaf ends, tongue-and-groove effect, backscatter to the monitor chamber and scatter from the flattening filter. The dose calculation utilizes a pencil beam model based on a beam quality index. DICOM RT files from patient plans, exported from the TPS, were directly used as patient-specific input data in MUV. For composite IMRT plans, average deviations in the high dose region between ionization chamber measurements and point dose calculations performed with the TPS and MUV were 1.6 ± 1.2% and 0.5 ± 1.1% (1 S.D.). The dose deviations between MUV and TPS slightly depended on the distance from the isocentre position. For individual intensity-modulated beams (total 367), an average deviation of 1.1 ± 2.9% was determined between calculations performed with the TPS and with MUV, with maximum deviations up to 14%. However, absolute dose deviations were mostly less than 3 cGy. Based on the current results, we aim to apply a confidence limit of 3% (with respect to the prescribed dose) or 6 cGy for routine IMRT verification. For off-axis points at distances larger than 5 cm and for low dose regions, we consider 5% dose deviation or 10 cGy acceptable. 
The time needed for an independent calculation compares very favourably with the net time for an experimental approach. The physical effects modelled in the dose calculation software MUV allow accurate dose calculations in individual verification points. Independent calculations may be used to replace experimental dose verification once the IMRT programme is mature.
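    The stated confidence limits can be written as a simple acceptance check; this Python sketch uses our own function and parameter names, since the MUV software's interface is not given in the abstract:

```python
def within_confidence_limit(dev_pct, dev_cgy, off_axis_cm=0.0, low_dose=False):
    """Apply the IMRT verification confidence limits from the study: a point
    passes if either the relative deviation (percent of prescribed dose) or
    the absolute deviation (cGy) is within tolerance. Off-axis points beyond
    5 cm and low-dose regions get the relaxed limits (5% or 10 cGy)."""
    relaxed = off_axis_cm > 5.0 or low_dose
    pct_limit, cgy_limit = (5.0, 10.0) if relaxed else (3.0, 6.0)
    return abs(dev_pct) <= pct_limit or abs(dev_cgy) <= cgy_limit
```

The either/or form matters: the 14% relative outliers reported for individual beams still pass when their absolute deviation stays below the cGy limit.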

  15. Methodology for the specification of communication activities within the framework of a multi-layered architecture: Toward the definition of a knowledge base

    NASA Astrophysics Data System (ADS)

    Amyay, Omar

    A method defined in terms of synthesis and verification steps is presented. The specification of communication services and protocols within a multilayered architecture of the Open Systems Interconnection (OSI) type is an essential issue in the design of computer networks. The aim is to obtain an operational specification of the protocol-service couple of a given layer. Planned synthesis and verification steps constitute a specification trajectory. The latter is based on the progressive integration of the 'initial data' constraints and on verification of the specification resulting from each synthesis step, through validity constraints that characterize an admissible solution. Two types of trajectory are proposed according to the style of the initial specification of the protocol-service couple: an operational type, reflecting the service-supplier viewpoint, and a knowledge-property-oriented type, reflecting the service viewpoint. Synthesis and verification activities were developed and formalized in terms of labeled transition systems, temporal logic and epistemic logic. The originality of the second specification trajectory and the use of epistemic logic are shown. An 'artificial intelligence' approach enables the definition of a conceptual model for a knowledge-base system implementing the proposed method. It is structured in three levels of representation: knowledge relating to the domain, the reasoning characterizing synthesis and verification activities, and the planning of the steps of a specification trajectory.

  16. Non-volatile memory based on the ferroelectric photovoltaic effect

    PubMed Central

    Guo, Rui; You, Lu; Zhou, Yang; Shiuh Lim, Zhi; Zou, Xi; Chen, Lang; Ramesh, R.; Wang, Junling

    2013-01-01

    The quest for a solid state universal memory with high-storage density, high read/write speed, random access and non-volatility has triggered intense research into new materials and novel device architectures. Though the non-volatile memory market is dominated by flash memory now, it has very low operation speed with ~10 μs programming and ~10 ms erasing time. Furthermore, it can only withstand ~10⁵ rewriting cycles, which prevents it from becoming the universal memory. Here we demonstrate that the significant photovoltaic effect of a ferroelectric material, such as BiFeO3 with a band gap in the visible range, can be used to sense the polarization direction non-destructively in a ferroelectric memory. A prototype 16-cell memory based on the cross-bar architecture has been prepared and tested, demonstrating the feasibility of this technique. PMID:23756366

  17. An ontology based trust verification of software license agreement

    NASA Astrophysics Data System (ADS)

    Lu, Wenhuan; Li, Xiaoqing; Gan, Zengqin; Wei, Jianguo

    2017-08-01

    When installing or downloading software, users are presented with a long document stating rights and obligations, which few have the patience to read or understand. This can make users distrust the software. In this paper, we propose an ontology based trust verification for software license agreements. First, this work proposes an ontology model for the domain of software license agreements. The domain ontology is constructed by the proposed methodology from copyright laws and 30 software license agreements. The License Ontology can act as part of a generalized copyright-law knowledge model, and also serves as a visualization of software licenses. Based on this ontology, a software-license-oriented text summarization approach is proposed, whose performance shows that it improves the accuracy of summarizing software licenses. Based on the summarization, the underlying purpose of the software license can be explicitly explored for trust verification.

  18. Biometrics based authentication scheme for session initiation protocol.

    PubMed

    Xie, Qi; Tang, Zhixiong

    2016-01-01

    Many two-factor challenge-response based authentication schemes for the session initiation protocol (SIP) have been proposed, but most of them are vulnerable to smart card stolen attacks and password guessing attacks. In this paper, we propose a novel three-factor SIP authentication scheme using biometrics, password and smart card, and utilize the pi calculus-based formal verification tool ProVerif to prove that the proposed protocol achieves security and authentication. Furthermore, our protocol is highly efficient when compared to other related protocols.

  19. Computing exponentially faster: implementing a non-deterministic universal Turing machine using DNA

    PubMed Central

    Currin, Andrew; Korovin, Konstantin; Ababi, Maria; Roper, Katherine; Kell, Douglas B.; Day, Philip J.

    2017-01-01

    The theory of computer science is based around universal Turing machines (UTMs): abstract machines able to execute all possible algorithms. Modern digital computers are physical embodiments of classical UTMs. For the most important class of problems in computer science, non-deterministic polynomial complete problems, non-deterministic UTMs (NUTMs) are theoretically exponentially faster than both classical UTMs and quantum mechanical UTMs (QUTMs). However, no attempt has previously been made to build an NUTM, and their construction has been regarded as impossible. Here, we demonstrate the first physical design of an NUTM. This design is based on Thue string rewriting systems, and thereby avoids the limitations of most previous DNA computing schemes: all the computation is local (simple edits to strings) so there is no need for communication, and there is no need to order operations. The design exploits DNA's ability to replicate to execute an exponential number of computational paths in P time. Each Thue rewriting step is embodied in a DNA edit implemented using a novel combination of polymerase chain reactions and site-directed mutagenesis. We demonstrate that the design works using both computational modelling and in vitro molecular biology experimentation: the design is thermodynamically favourable, microprogramming can be used to encode arbitrary Thue rules, all classes of Thue rule can be implemented, and rules can be implemented non-deterministically. In an NUTM, the resource limitation is space, which contrasts with classical UTMs and QUTMs where it is time. This fundamental difference enables an NUTM to trade space for time, which is significant for both theoretical computer science and physics. It is also of practical importance, for, to quote Richard Feynman, ‘there's plenty of room at the bottom’. 
This means that a desktop DNA NUTM could potentially utilize more processors than all the electronic computers in the world combined, and thereby outperform the world's current fastest supercomputer, while consuming a tiny fraction of its energy. PMID:28250099
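    A nondeterministic Thue computation applies every matching rewrite at every position; this minimal breadth-first sketch in Python only simulates sequentially the branching that the DNA design follows in parallel:

```python
from collections import deque

def thue_reachable(start, target, rules, max_steps=10):
    """Breadth-first exploration of a (semi-)Thue string rewriting system.
    `rules` is a list of (lhs, rhs) string rules; every applicable rule at
    every position spawns a branch, mimicking nondeterministic choice."""
    frontier, seen = deque([(start, 0)]), {start}
    while frontier:
        s, depth = frontier.popleft()
        if s == target:
            return True
        if depth == max_steps:
            continue
        for lhs, rhs in rules:
            i = s.find(lhs)
            while i != -1:
                t = s[:i] + rhs + s[i + len(lhs):]  # one local string edit
                if t not in seen:
                    seen.add(t)
                    frontier.append((t, depth + 1))
                i = s.find(lhs, i + 1)
    return False
```

Each branch is a purely local edit, which is why the scheme needs neither communication between branches nor any ordering of operations.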

  20. Rewriting "The Road to Nowhere": Place Pedagogies in Western Sydney

    ERIC Educational Resources Information Center

    Gannon, Susanne

    2009-01-01

    Negative representations of parts of our cities are endemic in the Australian media, where certain suburbs function as motifs for failure--past, present, and future. Indeed, as one journalist put it after invoking the "interchangeable" triumvirate of Sydney's Mount Druitt, Melbourne's West Heidelberg, and Brisbane's Inala, "geography is destiny"…

  1. Awakening to Opportunities in International Business: A Title VI-B Grant and Partnership Approach

    ERIC Educational Resources Information Center

    Haynes, Joan; Rutz, Rebecca

    2007-01-01

    Subsequent to Hurricane Katrina, Mississippi Gulf Coast businesses are rebuilding and rewriting rules. This catastrophic event has offered the coast a tremendous opportunity to write new rules, furthering the redevelopment and expansion of international trade through the ports of Gulfport and Pascagoula and the expansion of the tourism industry.

  2. Rewriting Writers Workshop: Creating Safe Spaces for Disruptive Stories

    ERIC Educational Resources Information Center

    Lewison, Mitzi; Heffernan, Lee

    2008-01-01

    This article explores a third-grade teacher's use of critical writing pedagogy to encourage students' exploration of issues that were important in their lives from personal as well as social perspectives. She used a particular version of critical writing pedagogy--social narrative writing--in which students read and discussed children's literature…

  3. "Bloodline Is All I Need": Defiant Indigeneity and Hawaiian Hip-Hop

    ERIC Educational Resources Information Center

    Teves, Stephanie Nohelani

    2011-01-01

    During the late twentieth century, Kanaka Maoli have struggled to push back against these representations, offering a rewriting of Hawaiian history, quite literally. Infused by Hawaiian nationalism and a growing library of works that investigate the naturalization of American colonialism in Hawai'i, innovative Kanaka Maoli representations in the…

  4. Foucault and Marxism: Rewriting the Theory of Historical Materialism

    ERIC Educational Resources Information Center

    Olssen, Mark

    2004-01-01

    This article explores the relationship of Foucault to Marxism. Although he was often critical of Marxism, Foucault's own approach bears striking parallels to Marxism, as a form of method, as an account of history, and as an analysis of social structure. Like Marxism, Foucault represents social practices as transitory and all knowledge and…

  5. From Competence in the Curriculum to Competence in Action

    ERIC Educational Resources Information Center

    Jonnaert, Philippe; Masciotra, Domenico; Barrette, Johanne; Morel, Denise; Mane, Yaya

    2007-01-01

    The article begins by drawing a distinction between the concepts of "curriculum" and "programme of study", and goes on to show that curriculum reform involves much more than simply rewriting programmes of study. The reforms that are presently sweeping across education systems throughout the world qualify, in many cases, as true paradigm…

  6. Spent Nuclear Fuel Project Configuration Management Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reilly, M.A.

    This document is a rewrite of the draft "C" that was agreed to "in principle" by SNF Project level 2 managers on EDT 609835, dated March 1995 (not released). The implementation process philosophy was changed in keeping with the ongoing reengineering of the WHC Controlled Manuals to achieve configuration management within the SNF Project.

  7. The Virtual Dream: Rewriting Stories of Loss and Grief

    ERIC Educational Resources Information Center

    Neimeyer, Robert A.; Torres, Carlos; Smith, Douglas C.

    2011-01-01

    In this article, the authors introduce the "virtual dream", a technique that entails writing a brief, spontaneous dreamlike story on themes of loss, using a flexible set of assigned elements of setting and characterization to scaffold the writing. After providing several examples of virtual dreams written by workshop participants, the authors…

  8. Rewriting the Book on Science Instruction

    ERIC Educational Resources Information Center

    Young, Betty

    2007-01-01

    Science testing, as mandated by the No Child Left Behind (NCLB) Act, has refocused attention on the quality of the K-8 science curriculum and instruction in many districts around the country. It has become clear that to improve quality, and meet NCLB requirements, elementary and middle schools must develop different teaching approaches and…

  9. 75 FR 32860 - General Services Administration Acquisition Regulation; GSAR Case 2008-G503, Rewrite of GSAR Part...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-10

    ... Government procurement. Dated: May 17, 2010. Rodney P. Lantier, Acting Senior Procurement Executive, Office... (Change 45) Docket 2008- 0007; Sequence 11] RIN 3090-AI71 General Services Administration Acquisition... CONTACT: For clarification of content, contact Beverly Cromer, Procurement Analyst, at (202) 501-1448. For...

  10. Rewriting My Autobiography: The Legal and Ethical Implications of Memory-Dampening Agents

    ERIC Educational Resources Information Center

    Aoki, Cynthia R. A.

    2008-01-01

    The formation and recall of memories are fundamental aspects of life and help preserve the complex collection of experiences that provide us with a sense of identity and autonomy. Scientists have recently started to investigate pharmacological agents that inhibit or "dampen" the strength of memory formation and recall. The development of…

  11. Technical Writing: Process and Product. Third Edition.

    ERIC Educational Resources Information Center

    Gerson, Sharon J.; Gerson, Steven M.

    This book guides students through the entire writing process--prewriting, writing, and rewriting--developing an easy-to-use, step-by-step technique for writing the types of documents they will encounter on the job. It engages students in the writing process and encourages hands-on application as well as discussions about ethics, audience…

  12. Unions Assail Teacher Ideas in NCLB Draft

    ERIC Educational Resources Information Center

    Klein, Alyson; Hoff, David J.

    2007-01-01

    This article reports on two national teachers' unions that have mounted a vigorous lobbying campaign to rewrite language linking teacher bonuses to student test scores and other incentive-pay provisions contained in a draft bill for reauthorizing the No Child Left Behind Act. Members of the National Education Association circulated in the halls of…

  13. Geometrical Simplification of the Dipole-Dipole Interaction Formula

    ERIC Educational Resources Information Center

    Kocbach, Ladislav; Lubbad, Suhail

    2010-01-01

    Many students meet dipole-dipole potential energy quite early on when they are taught electrostatics or magnetostatics and it is also a very popular formula, featured in encyclopedias. We show that by a simple rewriting of the formula it becomes apparent that, for example, by reorienting the two dipoles, their attraction can become exactly twice…
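    The rewriting referred to can be illustrated with the standard point-dipole interaction energy; this is the textbook angular form (electric-dipole notation), not necessarily the authors' exact expression:

```latex
U = \frac{1}{4\pi\varepsilon_0 r^{3}}
    \left[\mathbf{p}_1\cdot\mathbf{p}_2
          - 3(\mathbf{p}_1\cdot\hat{\mathbf{r}})(\mathbf{p}_2\cdot\hat{\mathbf{r}})\right]
  = -\frac{p_1 p_2}{4\pi\varepsilon_0 r^{3}}
    \left(2\cos\theta_1\cos\theta_2 - \sin\theta_1\sin\theta_2\cos\varphi\right)
```

In the angular form the factor of two is immediate: the collinear head-to-tail arrangement (θ₁ = θ₂ = 0) gives U = −2p₁p₂/(4πε₀r³), exactly twice the binding energy of the antiparallel side-by-side arrangement (θ₁ = θ₂ = π/2, φ = π).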

  14. New Leeway on Horizon under NCLB

    ERIC Educational Resources Information Center

    McNeil, Michele

    2011-01-01

    As the clock ticks toward President Barack Obama's back-to-school deadline for rewriting the No Child Left Behind Act, U.S. Secretary of Education Arne Duncan is preparing to grant states relief from key provisions of the federal school accountability law in exchange for what he calls "commitments to key reforms." The move comes as…

  15. Students Rewriting Gibbon, and Other Stories: Disciplinary History Writing

    ERIC Educational Resources Information Center

    Ricot, Richard

    2010-01-01

    The most successful historical arguments are expressed in a voice unmistakeably the author's own, yet this numbers among the most difficult skills to accomplish. In this article, I discuss a series of seminars which I ran in University College London's History Department in order to help undergraduate historians develop their authorial voice. Some…

  16. Optically reconfigurable patterning for control of the propagation characteristics of a planar waveguide

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Klittnick, A.; Clark, N. A.; Keller, P.

    2008-10-01

    We demonstrate an easily fabricated all-optical and freely reconfigurable method of controlling the propagating characteristics of the optic path within a planar waveguide with low insertion losses by employing the optical patterning of the refractive index of an erasable and rewriteable photosensitive liquid crystal polymer cladding layer.

  17. 77 FR 76446 - General Services Administration Acquisition Regulation (GSAR); GSAR Case 2006-G507; Rewrite of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-28

    ... 4596, January 26, 2009. There were 36 public comments received in response to the Advanced Notice of... series of new GSAR cases to modernize the Federal Supply Schedules (FSS) program. The new GSAR cases will..., General Services Administration (GSA). ACTION: Proposed rule; withdrawal. SUMMARY: The General Services...

  18. 7 CFR 762.149 - Liquidation.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... farmer-creditor mediation program; and (3) Not agree to any proposals to rewrite the terms of a... action will be pursued. If the lender does not pursue the recovery, the reason must be documented when an... recovery value of the security as defined in § 762.102. The appraisal requirement may be waived by the...

  19. 7 CFR 762.149 - Liquidation.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... farmer-creditor mediation program; and (3) Not agree to any proposals to rewrite the terms of a... action will be pursued. If the lender does not pursue the recovery, the reason must be documented when an... recovery value of the security as defined in § 762.102. The appraisal requirement may be waived by the...

  20. 7 CFR 762.149 - Liquidation.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... farmer-creditor mediation program; and (3) Not agree to any proposals to rewrite the terms of a... action will be pursued. If the lender does not pursue the recovery, the reason must be documented when an... recovery value of the security as defined in § 762.102. The appraisal requirement may be waived by the...

  1. Computational Nonlinear Morphology with Emphasis on Semitic Languages. Studies in Natural Language Processing.

    ERIC Educational Resources Information Center

    Kiraz, George Anton

    This book presents a tractable computational model that can cope with complex morphological operations, especially in Semitic languages, and less complex morphological systems present in Western languages. It outlines a new generalized regular rewrite rule system that uses multiple finite-state automata to cater to root-and-pattern morphology,…
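    The root-and-pattern operation at the heart of Semitic morphology can be sketched as a toy rewrite step in Python; this collapses the book's multi-tape finite-state machinery into a single interdigitation function, using the classic Arabic root k-t-b for illustration:

```python
def interdigitate(root, pattern):
    """Root-and-pattern morphology as a toy rewrite step: slots marked 'C'
    in the vocalic pattern are filled left-to-right with the root's
    consonants. A simplification of the multi-tape finite-state model."""
    out, consonants = [], iter(root)
    for ch in pattern:
        out.append(next(consonants) if ch == "C" else ch)
    return "".join(out)
```

For example, the root ktb combined with the pattern CaCaC yields katab 'wrote', and with CuCiC yields kutib 'was written'.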

  2. A Survey of Recent Advances in Optical and Multimedia Information Technologies.

    ERIC Educational Resources Information Center

    Jessop, Deborah

    1997-01-01

    Examines developments in multimedia technologies and in the World Wide Web. Discusses CD-recordable, CD-rewritable, cable modems, personal digital assistants, digital video discs, interactivity and virtual worlds, advertising on the Web, and Intranets and CD-ROM networks. Eight tables and figures show costs, download time, estimated sales, storage…

  3. School-Meals Makeover Stirs the Pot

    ERIC Educational Resources Information Center

    Shah, Nirvi

    2011-01-01

    Proposed new federal rules governing the meals served to school children across the country each weekday are causing a stir among food industry groups, cafeteria managers, parents, and students. The skirmish is over the U.S. Department of Agriculture's efforts, prompted by the recent passage of the Healthy, Hunger-Free Kids Act, to rewrite the…

  4. Teachers Reflect Standards in Basals

    ERIC Educational Resources Information Center

    Gewertz, Catherine

    2012-01-01

    Dozens of teachers and literacy specialists from across the country hunkered down in Baltimore at round tables, with laptops, pens, and paper, intent on rewriting the collections that wield tremendous influence over the way millions of U.S. children learn literacy skills: the big-name basal readers. Hailing from 18 school districts in 11 states,…

  5. Rescripting a Troubled Past: John Brown's Family and the Harpers Ferry Conspiracy.

    ERIC Educational Resources Information Center

    McGlone, Robert E.

    1989-01-01

    Uses autobiographical information constructed by John Brown's family in the aftermath of Harpers Ferry to illustrate the issue of rescripting of history. Points out that this nondeliberate rewriting of the past is a result of the personal need to refocus self-schema and the validation of false memories as authentic. (KO)

  6. Total Physical Response Storytelling: A Communicative Approach to Language Learning.

    ERIC Educational Resources Information Center

    Marsh, Valeri

    1998-01-01

    Describes total physical response storytelling, which provides the critical vehicle--storytelling--for utilizing and expanding vocabulary. High-interest stories contextualize the vocabulary, enabling students to hear and see a story and then to act out, revise, and rewrite. A brief outline of the sequence of steps for using TPR storytelling in…

  7. Rewriting Requirements for Design

    DTIC Science & Technology

    2002-11-06

    Lights 1.2.8. Window Lights 2. Behavior Hiding 2.1. Function Drivers 2.1.1. Malfunction Lights 2.1.2. Office Lights 2.2. Shared Services 2.2.1. Mode...4702, 1981. [6] P.C. Clements, Abstract Interface Specifications for the A-7E Shared Services Module, NRL Memorandum Report 4863, 1982. [7] D.L

  8. Rewriting a Discursive Practice: Atheist Adaptation of Coming Out Discourse

    ERIC Educational Resources Information Center

    Cloud, Doug

    2017-01-01

    "Coming out" is a powerful way for individuals to disclose, constitute, and perform membership in stigmatized identity categories. The practice has now spread far beyond its LGBTQ origins. In this essay, I examine how atheists and other secularists have taken up and adapted coming out discourse to meet their situational and rhetorical…

  9. Student and Instructor Use of Comments on Business Communication Papers.

    ERIC Educational Resources Information Center

    Winter, Janet K.; Neal, Joan C.; Waner, Karen K.

    1996-01-01

    Surveys college students regarding their use of instructor comments written on their papers. Finds all students tend to use comments; no significant correlations exist between students' ability levels and their propensity to review, understand, and use comments; students were likely to review comments if they had to rewrite assignments; and…

  10. Easy Ways to Promote Inquiry in a Laboratory Course: The Power of Student Questions

    ERIC Educational Resources Information Center

    Polacek, Kelly Myer; Keeling, Elena Levine

    2005-01-01

    To teach students to think like scientists, the authors modified their laboratory course to include regular opportunities for student practice of inquiry and the scientific process. Their techniques are simple; they can be implemented without rewriting lab manuals, require little additional grading beyond typical lab reports, and are applicable…

  11. Making Change Happen: 1998/99 Critical Issues Paper.

    ERIC Educational Resources Information Center

    Carter, Patricia; Alfred, Richard

    This monograph suggests that many of the fundamental aspects of the design and delivery of education are changing radically. At the same time, three forces are converging to destabilize educational institutions: a new generation of learners is rewriting the rules of service, technology is opening new frontiers, and a new breed of competitors is…

  12. Rewriting the Rules of Engagement: Elaborating a Model of District-Community Collaboration

    ERIC Educational Resources Information Center

    Ishimaru, Ann M.

    2014-01-01

    In this ethnographic case study, Ann M. Ishimaru examines how a collaboration emerged and evolved between a low-income Latino parent organizing group and the leadership of a rapidly changing school district. Using civic capacity and community organizing theories, Ishimaru seeks to understand the role of parents, goals, strategies, and change…

  13. Transformation of Traditional Vocabulary Exercises into Collaborative Writing Activity

    ERIC Educational Resources Information Center

    Zheng, Jian-feng

    2010-01-01

    In the reading course, especially the so-called intensive reading course or integrative English reading course, there are some vocabulary exercises which intend to consolidate the active vocabulary emerging in the reading passages. Mostly, these exercises are in the form of blank-filling or rewriting sentences with the words given. The problem…

  14. A More Intuitive Version of the Lorentz Velocity Addition Formula

    ERIC Educational Resources Information Center

    Devlin, John F.

    2009-01-01

    The Lorentz velocity addition formula for one-dimensional motion presents a number of problems for beginning students of special relativity. In this paper we suggest a simple rewrite of the formula that is easier for students to memorize and manipulate, and that is furthermore more intuitive in understanding the correction necessary when adding…
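
    The record does not reproduce the paper's exact rearrangement, but the idea is easy to illustrate. A minimal sketch (the multiplicative "ratio" form below is a well-known rearrangement, not necessarily the one the paper proposes): in units where c = 1, the quantity r(v) = (c - v)/(c + v) turns velocity addition into plain multiplication.

```python
# Toy check of one-dimensional Lorentz velocity addition, in units where c = 1.
# The multiplicative "ratio" form below is a well-known rearrangement and is
# shown for illustration; it is not necessarily the paper's exact rewrite.

def add_velocities(u, v, c=1.0):
    """Standard Lorentz addition: w = (u + v) / (1 + u*v/c^2)."""
    return (u + v) / (1.0 + u * v / c ** 2)

def ratio(v, c=1.0):
    """r(v) = (c - v)/(c + v); under composition, r(w) = r(u) * r(v)."""
    return (c - v) / (c + v)

u, v = 0.6, 0.7
w = add_velocities(u, v)                          # stays below c
assert abs(ratio(w) - ratio(u) * ratio(v)) < 1e-12
assert add_velocities(0.999, 0.999) < 1.0
print(w)
```

    Because r(v) lies in (0, 1) for 0 < v < c and a product of such ratios stays in (0, 1), the composed velocity can never reach c, which is the intuition the rearrangement buys.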

  15. Rewriting Evolution—“Been There, Done That”

    PubMed Central

    Penny, David

    2013-01-01

    A recent paper by a science journalist in Nature shows major errors in understanding phylogenies, in this case of placental mammals. The underlying unrooted tree is probably correct, but the placement of the root just reflects a well-known error from the acceleration in the rate of evolution among some myomorph rodents. PMID:23558594

  16. 76 FR 30842 - General Services Administration Acquisition Regulation; Rewrite of Part 570; Acquiring Leasehold...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-27

    ... Sustainable Buildings; to delete the dollar value of the simplified lease acquisition threshold and instead.... GSAR 570.117, Sustainable Requirements for Lease Acquisitions, is added to add a requirement for the contracting officer to include sustainable design requirements appropriate for the type of leasing action in...

  17. A Case of You: Remembering David Fowler

    ERIC Educational Resources Information Center

    Pimm, David

    2004-01-01

    The author has framed this brief appreciation of David Fowler in terms of influence; specifically, his influence as a teacher, both in person and through his writing (most of all his attempted rewriting of much of the history of Greek mathematics). The author will also make some second-order remarks about the influence of teachers.

  18. Telecommunication market research processing

    NASA Astrophysics Data System (ADS)

    Dupont, J. F.

    1983-06-01

    The data processing in two telecommunication market investigations is described. One of the studies concerns the office applications of communication and the other the experiences with a videotex terminal. Statistical factorial analysis was performed on a large mass of data. A comparison between utilization intentions and effective utilization is made. Extensive rewriting of statistical analysis computer programs was required.

  19. Rewriting Literacy: Culture and the Discourse of the Other. Critical Studies in Education and Culture Series.

    ERIC Educational Resources Information Center

    Mitchell, Candace, Ed.; Weiler, Kathleen, Ed.

    Sixteen chapters discuss the relationship among literacy, culture, and difference in education; restructuring school curricula to meet the needs of those traditionally excluded from education's dominant discourse; the social and cultural context of literacy; and literacy's highly political nature. After "Series Introduction: Literacy,…

  20. Verification of Java Programs using Symbolic Execution and Invariant Generation

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina; Visser, Willem

    2004-01-01

    Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g. boolean and numeric constraints, dynamically allocated structures and arrays) and it allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and it was used for the verification of several non-trivial Java programs.

  1. Implementation and verification of global optimization benchmark problems

    NASA Astrophysics Data System (ADS)

    Posypkin, Mikhail; Usov, Alexander

    2017-12-01

    The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the process of generating the value of a function and its gradient at a given point, and the interval estimates of a function and its gradient on a given box, from a single description. Based on this functionality, we have developed a collection of tests for automatic verification of the proposed benchmarks. The verification has shown that literary sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.

  2. Markov chain aggregation and its applications to combinatorial reaction networks.

    PubMed

    Ganguly, Arnab; Petrov, Tatjana; Koeppl, Heinz

    2014-09-01

    We consider a continuous-time Markov chain (CTMC) whose state space is partitioned into aggregates, and each aggregate is assigned a probability measure. A sufficient condition for defining a CTMC over the aggregates is presented as a variant of weak lumpability, which also characterizes when the measure over the original process can be recovered from that of the aggregated one. We show how the applicability of de-aggregation depends on the initial distribution. The application section illustrates how the developed theory aids in reducing CTMC models of biochemical systems, particularly in connection with protein-protein interactions. We assume that the model is written by a biologist in the form of site-graph rewrite rules. Site-graph rewrite rules compactly express that, often, only a local context of a protein (instead of a full molecular species) needs to be in a certain configuration in order to trigger a reaction event. This observation leads to suitable aggregate Markov chains with smaller state spaces, thereby providing a sufficient reduction in computational complexity. This is further exemplified in two case studies: simple unbounded polymerization and early EGFR/insulin crosstalk.
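
    The lumpability idea in this abstract can be sketched in a few lines. The toy below checks the classical ordinary lumpability condition, which is stronger than the weak lumpability variant the paper develops, and builds the aggregated generator when it holds; the 3-state generator and the partition are made up for illustration.

```python
# Toy illustration of CTMC aggregation via *ordinary* lumpability: every state
# in a block must have the same total rate into each other block, so the
# aggregated process is again a CTMC regardless of the initial distribution.
Q = [                        # generator matrix; each row sums to 0
    [-3.0,  2.0,  1.0],
    [ 1.0, -2.0,  1.0],      # states 0 and 1 both enter block {2} at rate 1
    [ 2.0,  2.0, -4.0],
]
partition = [[0, 1], [2]]    # aggregate states 0 and 1 into one block

def block_rate(Q, i, block):
    """Total rate from state i into the given block of states."""
    return sum(Q[i][j] for j in block)

def is_lumpable(Q, partition):
    for block in partition:
        for other in partition:
            if other is block:
                continue
            if len({block_rate(Q, i, other) for i in block}) > 1:
                return False     # two states in the block disagree
    return True

def lumped_generator(Q, partition):
    """Aggregated generator; valid only when is_lumpable(Q, partition) holds."""
    n = len(partition)
    G = [[block_rate(Q, blk[0], partition[b]) for b in range(n)]
         for blk in partition]
    for a in range(n):
        G[a][a] = -sum(G[a][b] for b in range(n) if b != a)
    return G

print(is_lumpable(Q, partition))       # True
print(lumped_generator(Q, partition))  # [[-1.0, 1.0], [4.0, -4.0]]
```

    The site-graph rule sets in the paper induce much larger chains, but the reduction principle is the same: states that the rules cannot distinguish are collapsed into one aggregate.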

  3. Electrical Conductance Tuning and Bistable Switching in Poly(N-vinylcarbazole)-Carbon Nanotube Composite Films.

    PubMed

    Liu, Gang; Ling, Qi-Dan; Teo, Eric Yeow Hwee; Zhu, Chun-Xiang; Chan, D Siu-Hung; Neoh, Koon-Gee; Kang, En-Tang

    2009-07-28

    By varying the carbon nanotube (CNT) content in poly(N-vinylcarbazole) (PVK) composite thin films, the electrical conductance behavior of an indium-tin oxide/PVK-CNT/aluminum (ITO/PVK-CNT/Al) sandwich structure can be tuned in a controlled manner. Distinctly different electrical conductance behaviors, such as (i) insulator behavior, (ii) bistable electrical conductance switching effects (write-once read-many-times (WORM) memory effect and rewritable memory effect), and (iii) conductor behavior, are discernible from the current density-voltage characteristics of the composite films. The turn-on voltage of the two bistable conductance switching devices decreases and the ON/OFF state current ratio of the WORM device increases with the increase in CNT content of the composite film. Both the WORM and rewritable devices are stable under a constant voltage stress or a continuous pulse voltage stress, with an ON/OFF state current ratio in excess of 10(3). The conductance switching effects of the composite films have been attributed to electron trapping in the CNTs of the electron-donating/hole-transporting PVK matrix.

  4. HDM/PASCAL Verification System User's Manual

    NASA Technical Reports Server (NTRS)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM: QUALITY AND MANAGEMENT PLAN FOR THE PILOT PERIOD (1995-2000)

    EPA Science Inventory

    Based upon the structure and specifications in ANSI/ASQC E4-1994, Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs, the Environmental Technology Verification (ETV) program Quality and Management Plan (QMP) f...

  6. An analysis of random projection for changeable and privacy-preserving biometric verification.

    PubMed

    Wang, Yongjin; Plataniotis, Konstantinos N

    2010-10-01

    Changeability and privacy protection are important factors for widespread deployment of biometrics-based verification systems. This paper presents a systematic analysis of a random-projection (RP)-based method for addressing these problems. The employed method transforms biometric data using a random matrix with each entry an independent and identically distributed Gaussian random variable. The similarity- and privacy-preserving properties, as well as the changeability of the biometric information in the transformed domain, are analyzed in detail. Specifically, RP on both high-dimensional image vectors and dimensionality-reduced feature vectors is discussed and compared. A vector translation method is proposed to improve the changeability of the generated templates. The feasibility of the introduced solution is well supported by detailed theoretical analyses. Extensive experimentation on a face-based biometric verification problem shows the effectiveness of the proposed method.
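
    The two key properties claimed for the transform, approximate distance preservation and changeability by re-drawing the matrix, can be sketched directly. This is a pure-Python toy; the dimensions, seeds, and synthetic "feature vectors" are illustrative, and a real system would use the paper's biometric feature extraction.

```python
# Sketch of random-projection (RP) template protection. Entries of the secret
# matrix are i.i.d. Gaussian scaled by 1/sqrt(k), so projecting a d-dim feature
# vector to k dims roughly preserves distances (Johnson-Lindenstrauss), while
# drawing a fresh matrix yields a new, revocable template.
import math
import random

def gaussian_matrix(k, d, seed):
    rng = random.Random(seed)
    return [[rng.gauss(0.0, 1.0) / math.sqrt(k) for _ in range(d)]
            for _ in range(k)]

def project(matrix, x):
    return [sum(r * xi for r, xi in zip(row, x)) for row in matrix]

def dist(a, b):
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

d, k = 256, 64
rng = random.Random(0)
x = [rng.gauss(0, 1) for _ in range(d)]   # stand-in for one user's features
y = [rng.gauss(0, 1) for _ in range(d)]   # stand-in for another user's features

R1 = gaussian_matrix(k, d, seed=1)
preserved = dist(project(R1, x), project(R1, y)) / dist(x, y)
print(round(preserved, 2))   # typically close to 1: distances roughly survive

# Changeability: a compromised template is revoked by re-enrolling with a
# fresh random matrix, which produces a different template from the same input.
R2 = gaussian_matrix(k, d, seed=2)
print(dist(project(R1, x), project(R2, x)) > 1e-6)
```

    Verification then runs in the projected domain, so the stored template reveals neither the original biometric nor, without the matrix, anything directly matchable against it.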

  7. A Scala DSL for RETE-Based Runtime Verification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2013-01-01

    Runtime verification (RV) consists in part of checking execution traces against formalized specifications. Several systems have emerged, most of which support specification notations based on state machines, regular expressions, temporal logic, or grammars. The field of Artificial Intelligence (AI) has for an even longer period of time studied rule-based production systems, which at a closer look appear to be relevant for RV, although seemingly focused on slightly different application domains, such as for example business processes and expert systems. The core algorithm in many of these systems is the Rete algorithm. We have implemented a Rete-based runtime verification system, named LogFire (originally intended for offline log analysis but also applicable to online analysis), as an internal DSL in the Scala programming language, using Scala's support for defining DSLs. This combination appears attractive from a practical point of view. Our contribution is in part conceptual in arguing that such rule-based frameworks originating from AI may be suited for RV.

  8. Commissioning and quality assurance of an integrated system for patient positioning and setup verification in particle therapy.

    PubMed

    Pella, A; Riboldi, M; Tagaste, B; Bianculli, D; Desplanques, M; Fontana, G; Cerveri, P; Seregni, M; Fattori, G; Orecchia, R; Baroni, G

    2014-08-01

    In an increasing number of clinical indications, radiotherapy with accelerated particles shows relevant advantages when compared with high-energy X-ray irradiation. However, due to the finite range of ions, particle therapy can be severely compromised by setup errors and geometric uncertainties. The purpose of this work is to describe the commissioning and the design of the quality assurance procedures for the patient positioning and setup verification systems at the Italian National Center for Oncological Hadrontherapy (CNAO). The accuracy of the systems installed at CNAO for patient positioning and setup verification has been assessed using a laser tracking device. The accuracy of calibration and of image-based setup verification relying on the in-room X-ray imaging system was also quantified. Quality assurance tests to check the integration among all patient setup systems were designed, and records of daily QA tests since the start of clinical operation (2011) are presented. The overall accuracy of the patient positioning system and of the patient verification system motion was proved to be below 0.5 mm under all the examined conditions, with median values below the 0.3 mm threshold. Image-based registration in phantom studies exhibited sub-millimetric accuracy in setup verification at both cranial and extra-cranial sites. The calibration residuals of the OTS were found to be consistent with expectations, with peak values below 0.3 mm. Quality assurance tests, performed daily before clinical operation, confirm adequate integration and sub-millimetric setup accuracy. Robotic patient positioning was successfully integrated with optical tracking and stereoscopic X-ray verification for patient setup in particle therapy. Sub-millimetric setup accuracy was achieved and consistently verified in daily clinical operation.

  9. Further Evidence in Support of the Universal Nilpotent Grammatical Computational Paradigm of Quantum Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marcer, Peter J.; Rowlands, Peter

    2010-12-22

    Further evidence is presented in favour of the computational paradigm, conceived and constructed by Rowlands and Diaz, as detailed in Rowlands' book Zero to Infinity (2007), and in particular the authors' paper 'The Grammatical Universe: the Laws of Thermodynamics and Quantum Entanglement'. The paradigm, which has isomorphic group and algebraic quantum mechanical language interpretations, not only predicts the well-established facts of quantum physics, the periodic table, chemistry / valence and of molecular biology, whose understanding it extends; it also provides an elegant, simple solution to the unresolved quantum measurement problem. In this fundamental paradigm, all the computational constructs / predictions that emerge, follow from the simple fact, that, as in quantum mechanics, the wave function is defined only up to an arbitrary fixed phase. This fixed phase provides a simple physical understanding of the quantum vacuum in quantum field theory, where only relative phases, known to be able to encode 3+1 relativistic space-time geometries, can be measured. It is the arbitrary fixed measurement standard, against which everything that follows is to be measured, even though the standard itself cannot be, since nothing exists against which to measure it. The standard, as an arbitrary fixed reference phase, functions as the holographic basis for a self-organized universal quantum process of emergent novel fermion states of matter where, following each emergence, the arbitrary standard is re-fixed anew so as to provide a complete history / holographic record or hologram of the current fixed past, advancing an unending irreversible evolution, such as is the evidence of our senses. 
The fermion states, in accord with the Pauli exclusion principle, each correspond to a unique nilpotent symbol in the infinite alphabet (which specifies the grammar in this nilpotent universal computational rewrite system (NUCRS) paradigm); and the alphabet, as Hill and Rowlands hypothesize on substantial evidence [26], includes that of the RNA / DNA genetic code and, as holographic phase encodings / holograms, the 4D geometries of all living systems as self-organised grammatical computational rewrite machines / machinery. Human brains, natural grammatical (written symbol) languages, 4D geometric self-awareness and a totally new emergent property of matter, human consciousness, can thus with some measure of confidence be postulated as further genetic consequences which follow from this self-organizing fundamental rewrite NUCRS construction. For it, like natural language, possesses a semantics and not just a syntax, where the initial symbol, i.e. the arbitrary fixed phase measurement standard, is able to function as the template for the blueprints of the emergent 4D relativistic real and virtual geometries to come, in a 'from the Self Creation to the creation of the human self' computational rewrite process evolution.

  10. Further Evidence in Support of the Universal Nilpotent Grammatical Computational Paradigm of Quantum Physics

    NASA Astrophysics Data System (ADS)

    Marcer, Peter J.; Rowlands, Peter

    2010-12-01

    Further evidence is presented in favour of the computational paradigm, conceived and constructed by Rowlands and Diaz, as detailed in Rowlands' book Zero to Infinity (2007) [2], and in particular the authors' paper `The Grammatical Universe: the Laws of Thermodynamics and Quantum Entanglement' [1]. The paradigm, which has isomorphic group and algebraic quantum mechanical language interpretations, not only predicts the well-established facts of quantum physics, the periodic table, chemistry / valence and of molecular biology, whose understanding it extends; it also provides an elegant, simple solution to the unresolved quantum measurement problem. In this fundamental paradigm, all the computational constructs / predictions that emerge, follow from the simple fact, that, as in quantum mechanics, the wave function is defined only up to an arbitrary fixed phase. This fixed phase provides a simple physical understanding of the quantum vacuum in quantum field theory, where only relative phases, known to be able to encode 3+1 relativistic space-time geometries, can be measured. It is the arbitrary fixed measurement standard, against which everything that follows is to be measured, even though the standard itself cannot be, since nothing exists against which to measure it. The standard, as an arbitrary fixed reference phase, functions as the holographic basis for a self-organized universal quantum process of emergent novel fermion states of matter where, following each emergence, the arbitrary standard is re-fixed anew so as to provide a complete history / holographic record or hologram of the current fixed past, advancing an unending irreversible evolution, such as is the evidence of our senses. 
The fermion states, in accord with the Pauli exclusion principle, each correspond to a unique nilpotent symbol in the infinite alphabet (which specifies the grammar in this nilpotent universal computational rewrite system (NUCRS) paradigm); and the alphabet, as Hill and Rowlands hypothesize on substantial evidence [26], includes that of the RNA / DNA genetic code and, as holographic phase encodings / holograms, the 4D geometries of all living systems as self-organised grammatical computational rewrite machines / machinery. Human brains, natural grammatical (written symbol) languages, 4D geometric self-awareness and a totally new emergent property of matter, human consciousness, can thus with some measure of confidence be postulated as further genetic consequences which follow from this self-organizing fundamental rewrite NUCRS construction. For it, like natural language, possesses a semantics and not just a syntax, where the initial symbol, i.e. the arbitrary fixed phase measurement standard, is able to function as the template for the blueprints of the emergent 4D relativistic real and virtual geometries to come, in a `from the Self Creation to the creation of the human self' computational rewrite process evolution.

  11. Computer Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pronskikh, V. S.

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance and have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation, generally applicable in practice and based on differences in epistemic strategies and scopes.

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: ANEST IWATA CORPORATION LPH400-LV HVLP SPRAY GUN

    EPA Science Inventory

    This Environmental Technology Verification report describes the characteristics of a paint spray gun. The research showed that the spray gun provided absolute and relative increases in transfer efficiency over the baseline and provided a reduction in the use of paint.

  13. IN PURSUIT OF AN INTERNATIONAL APPROACH TO QUALITY ASSURANCE FOR ENVIRONMENTAL TECHNOLOGY VERIFICATION

    EPA Science Inventory

    In the mid-1990's, the USEPA began the Environmental Technology Verification (ETV) Program in order to provide purchasers of environmental technology with independently acquired, quality-assured, test data, upon which to base their purchasing decisions. From the beginning, a str...

  14. 78 FR 27390 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-10

    ... programs voluntarily self-nominate their practice or healthcare system by completing a web-based nomination... CDC with a ranked list of nominees. Finalists will be asked to participate in a data verification process that includes verification of how information was obtained from electronic records, remote...

  15. Formal hardware verification of digital circuits

    NASA Technical Reports Server (NTRS)

    Joyce, J.; Seger, C.-J.

    1991-01-01

    The use of formal methods to verify the correctness of digital circuits is less constrained by the growing complexity of digital circuits than conventional methods based on exhaustive simulation. This paper briefly outlines three main approaches to formal hardware verification: symbolic simulation, state machine analysis, and theorem-proving.

  16. Microsatellite Imputation for parental verification from SNP across multiple Bos taurus and indicus breeds

    USDA-ARS?s Scientific Manuscript database

    Microsatellite markers (MS) have traditionally been used for parental verification and are still the international standard in spite of their higher cost, error rate, and turnaround time compared with Single Nucleotide Polymorphisms (SNP)-based assays. Despite domestic and international demands fro...

  17. Trajectory Based Behavior Analysis for User Verification

    NASA Astrophysics Data System (ADS)

    Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah

    Many of our activities on computers require a verification step for authorized access. The goal of verification is to tell apart the true account owner from intruders. We propose a general approach for user verification based on user trajectory inputs. The approach is labor-free for users and is likely to avoid possible copying or simulation by other non-authorized users or even automatic programs like bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian distributions in its transitions to describe the behavior in the trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with a manifold-learnt tuning for capturing the pairwise relationship. Based on the pairwise relationship, we plug in any effective classification or clustering method for the detection of unauthorized access. The method can also be applied to the task of recognition, predicting the trajectory type without a pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest who owns the trajectory if the input identity is not provided.
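
    A heavily simplified sketch of the modeling ingredient: the paper's dissimilarity measure and manifold-learnt tuning are not reproduced here; instead each user is reduced to a single Gaussian over trajectory step vectors, which already separates grossly different movement styles. All trajectories below are synthetic toys.

```python
# Model a user by a diagonal Gaussian over trajectory step vectors and score
# new trajectories by average log-likelihood (a crude stand-in for the paper's
# Markov chain with Gaussian transitions and its dissimilarity measure).
import math

def steps(traj):
    return [(x2 - x1, y2 - y1) for (x1, y1), (x2, y2) in zip(traj, traj[1:])]

def fit_gaussian(samples):
    n = len(samples)
    mean = [sum(s[i] for s in samples) / n for i in (0, 1)]
    var = [max(sum((s[i] - mean[i]) ** 2 for s in samples) / n, 1e-6)
           for i in (0, 1)]                   # floor avoids zero variance
    return mean, var

def avg_loglik(model, traj):
    mean, var = model
    ss = steps(traj)
    total = 0.0
    for s in ss:
        for i in (0, 1):
            total += (-0.5 * math.log(2 * math.pi * var[i])
                      - (s[i] - mean[i]) ** 2 / (2 * var[i]))
    return total / len(ss)

owner = [(t, 2 * t) for t in range(20)]             # steady diagonal motion
enrolled = fit_gaussian(steps(owner))
same_style = [(t + 0.1, 2 * t) for t in range(20)]  # shifted, same dynamics
intruder = [(t * t, 0) for t in range(20)]          # very different dynamics
print(avg_loglik(enrolled, same_style) > avg_loglik(enrolled, intruder))  # True
```

    A real verifier would threshold a score like this (or the paper's dissimilarity) to accept or reject the claimed identity.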

  18. Development of CFC-Free Cleaning Processes at the NASA White Sands Test Facility

    NASA Technical Reports Server (NTRS)

    Beeson, Harold; Kirsch, Mike; Hornung, Steven; Biesinger, Paul

    1995-01-01

    The NASA White Sands Test Facility (WSTF) is developing cleaning and verification processes to replace currently used chlorofluorocarbon-113- (CFC-113-) based processes. The processes being evaluated include both aqueous- and solvent-based techniques. The presentation will include the findings of investigations of aqueous cleaning and verification processes that are based on a draft of a proposed NASA Kennedy Space Center (KSC) cleaning procedure. Verification testing with known contaminants, such as hydraulic fluid and commonly used oils, established correlations between nonvolatile residue and CFC-113. Recoveries ranged from 35 to 60 percent of theoretical. WSTF is also investigating enhancements to aqueous sampling for organics and particulates. Although aqueous alternatives have been identified for several processes, a need still exists for nonaqueous solvent cleaning, such as the cleaning and cleanliness verification of gauges used for oxygen service. The cleaning effectiveness of tetrachloroethylene (PCE), trichloroethylene (TCE), ethanol, hydrochlorofluorocarbon-225 (HCFC-225), tert-butylmethylether, and n-Hexane was evaluated using aerospace gauges and precision instruments and then compared to the cleaning effectiveness of CFC-113. Solvents considered for use in oxygen systems were also tested for oxygen compatibility using high-pressure oxygen autoignition and liquid oxygen mechanical impact testing.

  19. The usefulness of Poynting's theorem in magnetic turbulence

    NASA Astrophysics Data System (ADS)

    Treumann, Rudolf A.; Baumjohann, Wolfgang

    2017-12-01

    We rewrite Poynting's theorem, already used in a previous publication Treumann and Baumjohann (2017a) to derive relations between the turbulent magnetic and electric power spectral densities, to make explicit where the mechanical contributions enter. We then make explicit use of the relativistic transformation of the turbulent electric fluctuations to obtain expressions which depend only on the magnetic and velocity fluctuations. Any electric fluctuations play just an intermediate role. Equations are constructed for the turbulent conductivity spectrum in Alfvénic and non-Alfvénic turbulence in extension of the results in the above citation. An observation-based discussion of their use in application to solar wind turbulence is given. The inertial range solar wind turbulence exhibits signs of chaos and self-organization.
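
    For reference, the standard local form of Poynting's theorem that such a rearrangement starts from (field energy density, Poynting flux, and the mechanical work term on the right, which is where the mechanical contributions enter) reads:

```latex
\frac{\partial}{\partial t}\left(\frac{\epsilon_0 E^2}{2}+\frac{B^2}{2\mu_0}\right)
+ \nabla\cdot\left(\frac{\mathbf{E}\times\mathbf{B}}{\mu_0}\right)
= -\,\mathbf{J}\cdot\mathbf{E}
```

    The authors' contribution, per the abstract, is to eliminate the electric fluctuations from relations of this kind via the relativistic field transformation, leaving expressions in the magnetic and velocity fluctuations only.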

  20. Experimental evaluation of fingerprint verification system based on double random phase encoding

    NASA Astrophysics Data System (ADS)

    Suzuki, Hiroyuki; Yamaguchi, Masahiro; Yachida, Masuyoshi; Ohyama, Nagaaki; Tashima, Hideaki; Obi, Takashi

    2006-03-01

    We proposed a smart card holder authentication system that combines fingerprint verification with PIN verification by applying a double random phase encoding scheme. In this system, the probability of accurately verifying an authorized individual decreases when the fingerprint is significantly shifted. In this paper, a review of the proposed system is presented and preprocessing for improving the false rejection rate is proposed. In the proposed method, the position difference between two fingerprint images is estimated by using an optimized template for core detection. When the estimated difference exceeds the permissible level, the user inputs the fingerprint again. The effectiveness of the proposed method is confirmed by a computational experiment; its results show that the false rejection rate is improved.

  1. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.
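
    The notion of a verification condition itself is easy to illustrate. The toy below computes the weakest precondition of a straight-line program by textual substitution; this is a deliberately crude stand-in for a real VC generator, and the paper's labeling machinery is not modeled.

```python
# Toy weakest-precondition calculator for straight-line assignments, showing
# where verification conditions come from. Real VC generators work on parsed
# formulas; plain string substitution is for illustration only.
def wp_assign(var, expr, post):
    # wp(var := expr, Q) = Q[var := expr]
    return post.replace(var, f"({expr})")

def wp_seq(stmts, post):
    for var, expr in reversed(stmts):
        post = wp_assign(var, expr, post)
    return post

# program: x := x + 1; y := x * 2   with postcondition  y > 2
rhs = wp_seq([("x", "x + 1"), ("y", "x * 2")], "y > 2")
print(rhs)  # ((x + 1) * 2) > 2
# The VC for precondition x >= 1 is then:  x >= 1  ->  ((x + 1) * 2) > 2
```

    Explaining why such a formula exists and what each conjunct is for is exactly the traceability gap the labeled Hoare rules in the paper address.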

  2. Eliminating Sexism: Rewriting the Scripts. An Informational Guide to Sex Stereotyping, Sex Bias and Sex Discrimination. Instructor's Manual.

    ERIC Educational Resources Information Center

    Etheridge, Rose M.; Rice, Eric

    Part of a series devoted to identifying and evaluating strategies which vocational education administrators and instructors can use at the secondary student, teacher, or administrator level to eliminate sex stereotyping and sex bias in vocational education programs, this manual provides teachers with instructional materials concerning the current…

  3. 75 FR 41093 - General Services Administration Acquisition Regulation; Rewrite of GSAR Part 516, Types of Contracts

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-15

    ...) will place all orders by EDI using computer-to-computer EDI. If computer-to-computer EDI is not possible, FAS will use an alternative EDI method allowing the Contractor to receive orders by facsimile transmission. Subject to the Contractor's agreement, other agencies may place orders by EDI. * * * * * (g) The...

  4. Heterotopia and Its Role in the Lived Experiences of Resettlement

    ERIC Educational Resources Information Center

    Rossetto, Marietta

    2006-01-01

    Place, as a metaphor, can be experienced in different ways, existing or created. If created, space can be Foucault's "placeless place", a utopia. A place that exists, however, can be a heterotopic space. A heterotopia is what we as individuals interpret it to be: it can be a space for reconstituting the self, rewriting the scripts of…

  5. The Effect of Instruction Method and Relearning on Dutch Spelling Performance of Third- through Fifth-Graders

    ERIC Educational Resources Information Center

    Bouwmeester, Samantha; Verkoeijen, Peter P. J. L.

    2011-01-01

    In this study, we compared two instruction methods on spelling performance: a rewriting instruction in which children repeatedly rewrote words and an ambiguous property instruction in which children deliberately practiced on a difficult word aspect. Moreover, we examined whether the testing effect applies to spelling performance. One hundred…

  6. TEACHING AND TRAINING WITH MOTION PICTURES (MAGNETIC SOUND).

    ERIC Educational Resources Information Center

    Bell and Howell Co., Lincolnwood, IL.

    THE PREPARATION OF A MAGNETIC-SOUND TRACK FOR 16 MM. MOTION PICTURE FILMS IS DESCRIBED. IN SCRIPT PREPARATION, THE SCRIPT SHOULD BE WRITTEN IN NARRATIVE FORM TO INCLUDE ALL SHOTS NEEDED AND TO SUPPLEMENT AND GIVE INFORMATION NOT IN THE FILM. LANGUAGE SHOULD BE KEPT SIMPLE, AND UNAVOIDABLE TECHNICAL TERMS SHOULD BE EXPLAINED. IN REWRITING THE…

  7. 75 FR 20271 - Oil and Gas and Sulphur Operations in the Outer Continental Shelf-Oil and Gas Production...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-19

    ... amending the regulations regarding oil and natural gas production requirements. This is a complete rewrite... flaring natural gas, to ensure appropriate development of these natural resources. The final rule eliminates most restrictions on production rates and clarifies limits on the amount of natural gas that can...

  8. Spinning New Tales from Traditional Texts: Donna Jo Napoli and the Rewriting of Fairy Tale.

    ERIC Educational Resources Information Center

    Crew, Hilary S.

    2002-01-01

Demonstrates how Donna Jo Napoli changes generic conventions and reworks discursive formations in order to retell traditional tales. Discusses the narrative strategies she uses in telling her stories, her representation of male and female characters in regard to gender and gendered relationships, and the way she renegotiates ideologies and value…

  9. Rewriting American Democracy: Language and Cultural (Dis)locations in Esmeralda Santiago and Julia Alvarez

    ERIC Educational Resources Information Center

    Schultermandl, Silvia

    2007-01-01

    This article talks about how two American authors of Latin-Caribbean descent, Esmeralda Santiago and Julia Alvarez, inscribe their native language into the discourse of American literature, contributing to a more diverse picture of what American culture is. Thus Alvarez's and Santiago's texts not only renegotiate ethnic immigrant experiences of…

  10. (Re)Writing "Feminism in Canada": Wikipedia in the Feminist Classroom

    ERIC Educational Resources Information Center

    Cattapan, Alana

    2012-01-01

    In the winter of 2012, the students in the author's fourth-year seminar, The Politics of the Canadian Women's Movement, undertook the project of editing, updating, and expanding a number of Wikipedia articles, including the page on "Feminism in Canada." Though it often serves as a first point of reference for research on Canadian…

  11. Silent Echoes: A Young Author Rewrites the Rules to Transitioning

    ERIC Educational Resources Information Center

    Stephens, Aaron Notarianni

    2008-01-01

    This article describes Sarah, a young woman with autism from Frederick, Maryland, who made a choice to forgo traditional employment options for people with disabilities and to pursue the seemingly improbable option of becoming an author. Becoming a successful writer can be a dubious prospect for people without disabilities. And yet with talent,…

  12. The Next Chapter: Supporting Literacy within ESEA

    ERIC Educational Resources Information Center

    Haynes, Mariana

    2015-01-01

Noting that 60 percent of both fourth and eighth graders currently struggle with reading, this report urges the U.S. Congress to focus on students' literacy development from early childhood through grade twelve as it works to rewrite the Elementary and Secondary Education Act (ESEA), currently known as the No Child Left Behind Act (NCLB). In…

  13. Writing Stories, Rewriting Identities: Using Journalism Education and Mobile Technologies to Empower Marginalized High School Students

    ERIC Educational Resources Information Center

    Cybart-Persenaire, Alena; Literat, Ioana

    2018-01-01

    This study examines the impact that producing a print newspaper using cell phones had on marginalized students in a high school journalism classroom. Analysis of data from participant observation, artifact analysis and student interviews revealed that a) students negotiated cell phone use for educational purposes, despite school bans on such…

  14. The Politics of Desire and Possibility in Urban Playwriting: (Re)reading and (Re)writing the Script

    ERIC Educational Resources Information Center

    Winn, Maisha T.

    2012-01-01

    In this article, the author analyses scripts written by incarcerated girls in playwriting and performance workshops conducted in regional youth detention centres and performed by formerly incarcerated girls in a programme called "Girl Time" in an urban American southeastern city. Through a close reading and analysis of characters, plots…

  15. The Culture of Witnessing: War Correspondents Rewriting the History of the Iraq War

    ERIC Educational Resources Information Center

    Mellor, Noha

    2012-01-01

    Building on Zelizer's framework of analyzing journalism and memory, this article aims to analyze Arab journalists' narratives of the Iraq War. Through scrutinizing four selected narratives, published by four pan-Arab journalists from three different transnational satellite channels (Abu Dhabi TV, Al Jazeera and Al Manar), I aim to show how their…

  16. Actively Experiencing Shakespeare: Students "Get on Their Feet" for "Henry IV, Part One."

    ERIC Educational Resources Information Center

    Meyer, Herbert M.; Thomsen, Lee

    1999-01-01

    Discusses how a literature and multimedia course for 11th and 12th graders used active-learning experiences to engage students with Shakespeare's "Henry IV, Part One." Describes how shouting Hal's soliloquy; constructing a chart of character relations; rewriting a scene in their own words; performing, filming, and critiquing a scene; and…

  17. Writing Experiment Manuals in Science Education: The Impact of Writing, Genre, and Audience

    ERIC Educational Resources Information Center

    Rijlaarsdam, Gert; Couzijn, Michel; Janssen, Tanja; Braaksma, Martine; Kieft, Marleen

    2006-01-01

In this study, Grade 9 students wrote experiment manuals for their peers describing a simple physics investigation to explore whether air takes up space. Peers executed these manuals and their processes were videotaped. In several experimental conditions, these videotapes were played back for authors. Then they had to rewrite the experiment manual.…

  18. Alumni Try to Rewrite History on College-Newspaper Web Sites

    ERIC Educational Resources Information Center

    Kolowich, Steve

    2009-01-01

    When Nickie Dobo wrote a column in 2003 for her college newspaper--"The Daily Collegian" at Pennsylvania State University--decrying the "hook-up culture" on the campus, she never expected it to resurface years later in an attack on her professional credibility. But that's what happened when Ms. Dobo, now a reporter for the…

  19. Rewriting Identity: Social Meanings of Literacy and "Re-Visions" of Self.

    ERIC Educational Resources Information Center

    Mahiri, Jabari; Godley, Amanda J.

    1998-01-01

    Studies the life story and perceptions of literacy of a highly literate Latina female college student who developed Carpal Tunnel Syndrome. Finds that the woman's ability to write had been the foundation on which her social meanings of literacy rested: her life story revealed how her identity was influenced by societal values for writing and…

  20. Re-Writing the Subject: Psychoanalytic Approaches to Creative Writing and Composition Pedagogy.

    ERIC Educational Resources Information Center

    Harris, Judith

    2001-01-01

    Suggests that the teaching of both composition and creative writing would benefit from focusing less exclusively on the writing process and products and more on the writing subject. Claims that focusing on the writing subject through the lens of psychoanalysis provides several potential benefits. Concludes psychoanalysis can be a filtrate for the…

  1. Shakespeare and the Cultural Capital Tension: Advancing Literacy in Rural Arkansas

    ERIC Educational Resources Information Center

    Jolliffe, David Alton

    2012-01-01

    As the author does his job, trying to sponsor and support reading and writing practices that will ideally enrich lives and communities throughout Arkansas, he is always tempted to rewrite the American Declaration of Independence so that its second paragraph begins this way: "We hold these truths to be self-evident, that all men are created…

  2. Cover-Copy-Compare and Spelling: One versus Three Repetitions

    ERIC Educational Resources Information Center

    Erion, Joel; Davenport, Cindy; Rodax, Nicole; Scholl, Bethany; Hardy, Jennifer

    2009-01-01

    Cover, copy, compare (CCC) has been used with success to improve spelling skills. This study adds to existing research by completing an analysis of the rewriting component of the intervention. The impact of varying the number of times a subject copied a word following an error was examined with four elementary age students. An adaptive alternating…

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, SWINE WASTE ELECTRIC POWER AND HEAT PRODUCTION--CAPSTONE 30KW MICROTURBINE SYSTEM

    EPA Science Inventory

    Under EPA’s Environmental Technology Verification program, which provides objective and scientific third party analysis of new technology that can benefit the environment, a combined heat and power system was evaluated based on the Capstone 30kW Microturbine developed by Cain Ind...

  4. Finding the Bio in Biobased Products: Electrophoretic Identification of Wheat Proteins in Processed Products

    USDA-ARS?s Scientific Manuscript database

    Verification of the bio-content in bio-based or green products identifies genuine products, exposes counterfeit copies, supports or refutes content claims and ensures consumer confidence. When the bio-content includes protein, elemental nitrogen analysis is insufficient for verification since non-pr...

  5. 76 FR 60112 - Consent Based Social Security Number Verification (CBSV) Service

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-28

    ... protect the public's information. In addition to the benefit of providing high volume, centralized SSN verification services to the business community in a secure manner, CBSV provides us with cost and workload management benefits. New Information: To use CBSV, interested parties must pay a one- time non-refundable...

  6. RF model of the distribution system as a communication channel, phase 2. Volume 2: Task reports

    NASA Technical Reports Server (NTRS)

    Rustay, R. C.; Gajjar, J. T.; Rankin, R. W.; Wentz, R. C.; Wooding, R.

    1982-01-01

    Based on the established feasibility of predicting, via a model, the propagation of Power Line Frequency on radial type distribution feeders, verification studies comparing model predictions against measurements were undertaken using more complicated feeder circuits and situations. Detailed accounts of the major tasks are presented. These include: (1) verification of model; (2) extension, implementation, and verification of perturbation theory; (3) parameter sensitivity; (4) transformer modeling; and (5) compensation of power distribution systems for enhancement of power line carrier communication reliability.

  7. Experimental preparation and verification of quantum money

    NASA Astrophysics Data System (ADS)

    Guan, Jian-Yu; Arrazola, Juan Miguel; Amiri, Ryan; Zhang, Weijun; Li, Hao; You, Lixing; Wang, Zhen; Zhang, Qiang; Pan, Jian-Wei

    2018-03-01

A quantum money scheme enables a trusted bank to provide untrusted users with verifiable quantum banknotes that cannot be forged. In this work, we report a proof-of-principle experimental demonstration of the preparation and verification of unforgeable quantum banknotes. We employ a security analysis that takes experimental imperfections fully into account. We measure a total of 3.6 × 10^6 states in one verification round, limiting the forging probability to 10^-7 based on the security analysis. Our results demonstrate the feasibility of preparing and verifying quantum banknotes using currently available experimental techniques.

  8. Formal Verification of Complex Systems based on SysML Functional Requirements

    DTIC Science & Technology

    2014-12-23

Hoda Mehrpouyan, Irem Y. Tumer, Chris Hoyle, Dimitra Giannakopoulou …requirements for design of complex engineered systems. The proposed approach combines a SysML modeling approach to document and structure safety requirements with … methods and tools to support the integration of safety into the design solution.

  9. Rule groupings: An approach towards verification of expert systems

    NASA Technical Reports Server (NTRS)

    Mehrotra, Mala

    1991-01-01

    Knowledge-based expert systems are playing an increasingly important role in NASA space and aircraft systems. However, many of NASA's software applications are life- or mission-critical and knowledge-based systems do not lend themselves to the traditional verification and validation techniques for highly reliable software. Rule-based systems lack the control abstractions found in procedural languages. Hence, it is difficult to verify or maintain such systems. Our goal is to automatically structure a rule-based system into a set of rule-groups having a well-defined interface to other rule-groups. Once a rule base is decomposed into such 'firewalled' units, studying the interactions between rules would become more tractable. Verification-aid tools can then be developed to test the behavior of each such rule-group. Furthermore, the interactions between rule-groups can be studied in a manner similar to integration testing. Such efforts will go a long way towards increasing our confidence in the expert-system software. Our research efforts address the feasibility of automating the identification of rule groups, in order to decompose the rule base into a number of meaningful units.
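The decomposition idea in this record can be illustrated with a toy sketch. The rule names, fact names, and the particular grouping criterion (connected components over shared working-memory facts) are all assumptions for illustration, not the actual criterion used in the NASA work:

```python
# Hypothetical sketch of rule grouping: treat each rule as a node, connect
# rules that reference a common working-memory fact, and take connected
# components (via union-find) as candidate "firewalled" rule-groups.
from collections import defaultdict

rules = {  # rule name -> facts it reads or writes (illustrative only)
    "r1": {"valve_open", "pressure"},
    "r2": {"pressure", "alarm"},
    "r3": {"temp", "fan_on"},
    "r4": {"fan_on", "power"},
}

def group_rules(rules):
    parent = {r: r for r in rules}

    def find(x):
        # Union-find with path halving.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    by_fact = defaultdict(list)
    for r, facts in rules.items():
        for f in facts:
            by_fact[f].append(r)
    for members in by_fact.values():
        for other in members[1:]:          # union all rules sharing a fact
            parent[find(other)] = find(members[0])
    groups = defaultdict(set)
    for r in rules:
        groups[find(r)].add(r)
    return sorted(map(sorted, groups.values()))

print(group_rules(rules))  # → [['r1', 'r2'], ['r3', 'r4']]
```

Facts referenced from more than one group would then form the inter-group interface to be exercised by integration-style testing.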

  10. Spot scanning proton therapy plan assessment: design and development of a dose verification application for use in routine clinical practice

    NASA Astrophysics Data System (ADS)

    Augustine, Kurt E.; Walsh, Timothy J.; Beltran, Chris J.; Stoker, Joshua B.; Mundy, Daniel W.; Parry, Mark D.; Bues, Martin; Fatyga, Mirek

    2016-04-01

The use of radiation therapy for the treatment of cancer has been carried out clinically since the late 1800s. Early on, however, it was discovered that a radiation dose sufficient to destroy cancer cells can also cause severe injury to surrounding healthy tissue. Radiation oncologists continually strive to find the perfect balance between a dose high enough to destroy the cancer and one that avoids damage to healthy organs. Spot scanning or "pencil beam" proton radiotherapy offers another option to improve on this. Unlike traditional photon therapy, proton beams stop in the target tissue, thus better sparing all organs beyond the targeted tumor. In addition, the beams are far narrower and thus can be more precisely "painted" onto the tumor, avoiding exposure to surrounding healthy tissue. To safely treat patients with proton beam radiotherapy, dose verification should be carried out for each plan prior to treatment. Proton dose verification systems are not currently commercially available, so the Department of Radiation Oncology at the Mayo Clinic developed its own, called DOSeCHECK, which offers two distinct dose simulation methods: GPU-based Monte Carlo and CPU-based analytical. The three major components of the system include the web-based user interface, the Linux-based dose verification simulation engines, and the supporting services and components. The architecture integrates multiple applications, libraries, platforms, programming languages, and communication protocols, and was successfully deployed in time for Mayo Clinic's first proton beam therapy patient. Having a simple, efficient application for dose verification greatly reduces staff workload and provides additional quality assurance, ultimately improving patient safety.

  11. SU-F-T-268: A Feasibility Study of Independent Dose Verification for Vero4DRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamashita, M; Kokubo, M; Institute of Biomedical Research and Innovation, Kobe, Hyogo

    2016-06-15

Purpose: Vero4DRT (Mitsubishi Heavy Industries Ltd.) has been released for a few years. The treatment planning system (TPS) of Vero4DRT is dedicated, so measurement is the only method of dose verification. There have been no reports of independent dose verification using a Clarkson-based algorithm for Vero4DRT. An independent dose verification software program for the general-purpose linac using a modified Clarkson-based algorithm was adapted for Vero4DRT. In this study, we evaluated the accuracy of the independent dose verification program and the feasibility of the secondary check for Vero4DRT. Methods: iPlan (Brainlab AG) was used as the TPS. Pencil Beam Convolution was used as the dose calculation algorithm for IMRT, and X-ray Voxel Monte Carlo was used for the others. Simple MU Analysis (SMU, Triangle Products, Japan) was used as the independent dose verification software program, in which CT-based dose calculation was performed using a modified Clarkson-based algorithm. In this study, 120 patients' treatment plans were collected at our institute. The treatments were performed using conventional irradiation for lung and prostate, SBRT for lung, and step-and-shoot IMRT for prostate. Dose was compared between the TPS and the SMU, and confidence limits (CLs, mean ± 2SD %) were compared to those from the general-purpose linac. Results: As for the CLs, the conventional irradiation (lung, prostate), SBRT (lung) and IMRT (prostate) show 2.2 ± 3.5% (CL of the general-purpose linac: 2.4 ± 5.3%), 1.1 ± 1.7% (−0.3 ± 2.0%), 4.8 ± 3.7% (5.4 ± 5.3%) and −0.5 ± 2.5% (−0.1 ± 3.6%), respectively. The CLs for Vero4DRT are similar to those for the general-purpose linac. Conclusion: The independent dose verification for the new linac is clinically available as a secondary check, and we performed the check with a tolerance level similar to that of the general-purpose linac. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
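As a minimal illustration of the reported statistic (not the SMU software itself), the confidence limit can be computed as the mean ± 2SD of per-plan percent dose differences between the TPS and the independent calculation; the dose values below are hypothetical:

```python
# Illustrative sketch: summarize per-plan percent dose differences between
# the TPS and an independent calculation as a confidence limit
# CL = mean ± 2SD (%), the quantity compared across techniques in the abstract.
import statistics

def percent_diff(tps_dose: float, indep_dose: float) -> float:
    """Percent deviation of the independent calculation from the TPS dose."""
    return 100.0 * (indep_dose - tps_dose) / tps_dose

def confidence_limit(diffs: list[float]) -> tuple[float, float]:
    """Return (mean, 2 * sample standard deviation) in percent."""
    return statistics.mean(diffs), 2.0 * statistics.stdev(diffs)

# Hypothetical per-plan doses (Gy) for a small cohort.
tps = [2.00, 2.00, 8.00, 2.00, 2.00]
smu = [2.04, 1.98, 8.24, 2.02, 1.96]
diffs = [percent_diff(t, s) for t, s in zip(tps, smu)]
mean, two_sd = confidence_limit(diffs)
print(f"CL = {mean:.1f} ± {two_sd:.1f} %")  # prints: CL = 0.6 ± 4.1 %
```

A plan whose difference falls outside such a band would be flagged for manual review in a secondary check.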

  12. Knowledge based system verification and validation as related to automation of space station subsystems: Rationale for a knowledge based system lifecycle

    NASA Technical Reports Server (NTRS)

    Richardson, Keith; Wong, Carla

    1988-01-01

The role of verification and validation (V and V) in software has been to support and strengthen the software lifecycle and to ensure that the resultant code meets the standards of the requirements documents. Knowledge Based System (KBS) V and V should serve the same role, but the KBS lifecycle is ill-defined. The rationale for a simple form of the KBS lifecycle is presented, including accommodations for certain critical ways in which KBS development differs from conventional software development.

  13. A formal approach to validation and verification for knowledge-based control systems

    NASA Technical Reports Server (NTRS)

    Castore, Glen

    1987-01-01

    As control systems become more complex in response to desires for greater system flexibility, performance and reliability, the promise is held out that artificial intelligence might provide the means for building such systems. An obstacle to the use of symbolic processing constructs in this domain is the need for verification and validation (V and V) of the systems. Techniques currently in use do not seem appropriate for knowledge-based software. An outline of a formal approach to V and V for knowledge-based control systems is presented.

  14. Options and Risk for Qualification of Electric Propulsion System

    NASA Technical Reports Server (NTRS)

    Bailey, Michelle; Daniel, Charles; Cook, Steve (Technical Monitor)

    2002-01-01

Electric propulsion vehicle systems encompass a wide range of propulsion alternatives, including solar and nuclear, which present unique circumstances for qualification. This paper will address the alternatives for qualification of electric propulsion spacecraft systems. The approach taken will be to address the considerations for qualification at the various levels of systems definition. Additionally, for each level of qualification, the system-level risk implications will be developed. The paper will also explore the implications of analysis versus test at various levels of systems definition, while retaining the objectives of a verification program. The limitations of terrestrial testing will be explored, along with the risk and implications of orbital demonstration testing. The paper will seek to develop a template for structuring a verification program based on cost, risk, and value return. A successful verification program should establish controls and define objectives of the verification compliance program. Finally, the paper will seek to address the political and programmatic factors which may impact options for system verification.

  15. [Uniqueness seeking behavior as a self-verification: an alternative approach to the study of uniqueness].

    PubMed

    Yamaoka, S

    1995-06-01

    Uniqueness theory explains that extremely high perceived similarity between self and others evokes negative emotional reactions and causes uniqueness seeking behavior. However, the theory conceptualizes similarity so ambiguously that it appears to suffer from low predictive validity. The purpose of the current article is to propose an alternative explanation of uniqueness seeking behavior. It posits that perceived uniqueness deprivation is a threat to self-concepts, and therefore causes self-verification behavior. Two levels of self verification are conceived: one based on personal categorization and the other on social categorization. The present approach regards uniqueness seeking behavior as the personal-level self verification. To test these propositions, a 2 (very high or moderate similarity information) x 2 (with or without outgroup information) x 2 (high or low need for uniqueness) between-subject factorial-design experiment was conducted with 95 university students. Results supported the self-verification approach, and were discussed in terms of effects of uniqueness deprivation, levels of self-categorization, and individual differences in need for uniqueness.

  16. WRAP-RIB antenna technology development

    NASA Technical Reports Server (NTRS)

    Freeland, R. E.; Garcia, N. F.; Iwamoto, H.

    1985-01-01

The wrap-rib deployable antenna concept development is based on a combination of hardware development and testing along with extensive supporting analysis. The proof-of-concept hardware models are large enough that they address the same basic problems of design, fabrication, assembly, and test as the full-scale systems, which were selected to be 100 meters at the beginning of the program. The hardware evaluation program consists of functional performance tests, design verification tests, and analytical model verification tests. Functional testing consists of kinematic deployment, mesh management, and verification of mechanical packaging efficiencies. Design verification consists of rib contour precision measurement, rib cross-section variation evaluation, rib materials characterization, and manufacturing imperfections assessment. Analytical model verification and refinement include mesh stiffness measurement, rib static and dynamic testing, mass measurement, and rib cross-section characterization. This concept was considered for a number of potential applications, including mobile communications, VLBI, and aircraft surveillance. In fact, baseline system configurations were developed by JPL, using the appropriate wrap-rib antenna, for all three classes of applications.

  17. A Secure Framework for Location Verification in Pervasive Computing

    NASA Astrophysics Data System (ADS)

    Liu, Dawei; Lee, Moon-Chuen; Wu, Dan

The relatively new pervasive computing paradigm has changed the way people use computing devices. For example, a person can use a mobile device to obtain its location information at any time and anywhere. There are several security issues concerning whether this information is reliable in a pervasive environment. For example, a malicious user may disable the localization system by broadcasting a forged location, and may impersonate other users by eavesdropping on their locations. In this paper, we address the verification of location information in a secure manner. We first present the design challenges for location verification, and then propose a two-layer framework, VerPer, for secure location verification in a pervasive computing environment. Real-world GPS-based wireless sensor network experiments confirm the effectiveness of the proposed framework.

  18. Land Ice Verification and Validation Kit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-07-15

To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community-prioritized tests, including configuration and parameter variations, bit-4-bit evaluation, and plots of tests where differences occur.

  19. Specification, Synthesis, and Verification of Software-based Control Protocols for Fault-Tolerant Space Systems

    DTIC Science & Technology

    2016-08-16

AFRL-RV-PS-TR-2016-0112. Air Force Research Laboratory, Space Vehicles Directorate (AFRL/RVSV), Kirtland AFB, NM 87117-5776.

  20. Formal Verification at System Level

    NASA Astrophysics Data System (ADS)

    Mazzini, S.; Puri, S.; Mari, F.; Melatti, I.; Tronci, E.

    2009-05-01

System Level Analysis calls for a language comprehensible to experts with different backgrounds and yet precise enough to support meaningful analyses. SysML is emerging as an effective balance between such conflicting goals. In this paper we outline some of the results obtained for SysML-based system-level functional formal verification in an ESA/ESTEC study, carried out in collaboration between INTECS and La Sapienza University of Rome. The study focuses on SysML-based system-level functional requirements techniques.

  1. Pilot Guidelines for Improving Instructional Materials Through the Process of Learner Verification and Revision.

    ERIC Educational Resources Information Center

    Educational Products Information Exchange Inst., Stony Brook, NY.

    Learner Verification and Revision (LVR) Process of Instructional Materials is an ongoing effort for the improvement of instructional materials based on systematic feedback from learners who have used the materials. This evaluation gives publishers a method of identifying instructional strengths and weaknesses of a product and provides an…

  2. ANDalyze Lead 100 Test Kit and AND1000 Fluorimeter Environmental Technology Verification Report and Statement

    EPA Science Inventory

    This report provides results for the verification testing of the Lead100/AND1000. The following is a description of the technology based on information provided by the vendor. The information provided below was not verified in this test. The ANDalyze Lead100/AND1000 was des...

  3. Imputation of microsatellite alleles from dense SNP genotypes for parentage verification across multiple Bos taurus and Bos indicus breeds

    USDA-ARS?s Scientific Manuscript database

    Microsatellite markers (MS) have traditionally been used for parental verification and are still the international standard in spite of their higher cost, error rate, and turnaround time compared with Single Nucleotide Polymorphisms (SNP) -based assays. Despite domestic and international demands fr...

  4. Integrating Model-Based Verification into Software Design Education

    ERIC Educational Resources Information Center

    Yilmaz, Levent; Wang, Shuo

    2005-01-01

    Proper design analysis is indispensable to assure quality and reduce emergent costs due to faulty software. Teaching proper design verification skills early during pedagogical development is crucial, as such analysis is the only tractable way of resolving software problems early when they are easy to fix. The premise of the presented strategy is…

  5. Verification in Referral-Based Crowdsourcing

    PubMed Central

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530
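    The winning MIT strategy in the Red Balloon Challenge, with which the optimal compensation scheme coincides, paid the finder of a balloon a reward and recursively halved rewards up the referral chain. A minimal sketch of that recursive-incentive payout (the bounty value and function name here are hypothetical, for illustration only):

```python
def split_contract_payments(bounty, chain_length):
    """Recursive-incentive payments along a referral chain.

    The finder receives bounty/2, their recruiter bounty/4, and so on;
    the unallocated remainder stays with the task owner, so the total
    paid out never exceeds the bounty.
    """
    return [bounty / 2 ** (i + 1) for i in range(chain_length)]

# finder first, then successive recruiters up the chain
payments = split_contract_payments(4000, 4)
assert sum(payments) < 4000  # payout is bounded by the bounty
```

    Because each level receives half of the level below it, the scheme rewards both finding and recruiting while keeping the total payout bounded regardless of chain length.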

  6. A Verification-Driven Approach to Traceability and Documentation for Auto-Generated Mathematical Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen W.; Fischer, Bernd

    2009-01-01

    Model-based development and automated code generation are increasingly used for production code in safety-critical applications, but since code generators are typically not qualified, the generated code must still be fully tested, reviewed, and certified. This is particularly arduous for mathematical and control engineering software which requires reviewers to trace subtle details of textbook formulas and algorithms to the code, and to match requirements (e.g., physical units or coordinate frames) not represented explicitly in models or code. Both tasks are complicated by the often opaque nature of auto-generated code. We address these problems by developing a verification-driven approach to traceability and documentation. We apply the AUTOCERT verification system to identify and then verify mathematical concepts in the code, based on a mathematical domain theory, and then use these verified traceability links between concepts, code, and verification conditions to construct a natural language report that provides a high-level structured argument explaining why and how the code uses the assumptions and complies with the requirements. We have applied our approach to generate review documents for several sub-systems of NASA's Project Constellation.

  7. A comparative verification of high resolution precipitation forecasts using model output statistics

    NASA Astrophysics Data System (ADS)

    van der Plas, Emiel; Schmeits, Maurice; Hooijman, Nicolien; Kok, Kees

    2017-04-01

    Verification of localized events such as precipitation has become even more challenging with the advent of high-resolution meso-scale numerical weather prediction (NWP). The realism of a forecast suggests that it should compare well against precipitation radar imagery with similar resolution, both spatially and temporally. Spatial verification methods solve some of the representativity issues that point verification gives rise to. In this study a verification strategy based on model output statistics is applied that aims to address both double penalty and resolution effects that are inherent to comparisons of NWP models with different resolutions. Using predictors based on spatial precipitation patterns around a set of stations, an extended logistic regression (ELR) equation is deduced, leading to a probability forecast distribution of precipitation for each NWP model, analysis and lead time. The ELR equations are derived for predictands based on areal calibrated radar precipitation and SYNOP observations. The aim is to extract maximum information from a series of precipitation forecasts, like a trained forecaster would. The method is applied to the non-hydrostatic model Harmonie (2.5 km resolution), Hirlam (11 km resolution) and the ECMWF model (16 km resolution), overall yielding similar Brier skill scores for the 3 post-processed models, but larger differences for individual lead times. Besides, the Fractions Skill Score is computed using the 3 deterministic forecasts, showing somewhat better skill for the Harmonie model. In other words, despite the realism of Harmonie precipitation forecasts, they only perform similarly or somewhat better than precipitation forecasts from the 2 lower resolution models, at least in the Netherlands.
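    Extended logistic regression, as used above, includes the precipitation threshold itself as a predictor, so the forecast probabilities for different thresholds come from one equation and are mutually consistent. A minimal sketch with hypothetical coefficients (the paper's actual predictors are spatial precipitation patterns around stations, and its coefficients are fit to radar and SYNOP data):

```python
import math

def elr_probability(x, q, a, b, c):
    """Extended logistic regression: P(precip >= q) as a function of a
    predictor x and the threshold q. Because q enters the linear
    predictor with a negative coefficient, probabilities decrease
    monotonically in q for any fixed x."""
    z = a + b * x + c * q
    return 1.0 / (1.0 + math.exp(-z))

# hypothetical coefficients for illustration only
a, b, c = -1.0, 0.8, -0.3
p_light = elr_probability(2.0, 0.1, a, b, c)  # P(precip >= 0.1 mm)
p_heavy = elr_probability(2.0, 5.0, a, b, c)  # P(precip >= 5 mm)
assert 0.0 < p_heavy < p_light < 1.0  # consistent across thresholds
```

    This threshold-as-predictor trick is what lets a single fitted equation produce a full probability distribution of precipitation amounts per model, analysis, and lead time.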

  8. A novel all-optical label processing based on multiple optical orthogonal codes sequences for optical packet switching networks

    NASA Astrophysics Data System (ADS)

    Zhang, Chongfu; Qiu, Kun; Xu, Bo; Ling, Yun

    2008-05-01

    This paper proposes an all-optical label processing scheme that uses the multiple optical orthogonal codes sequences (MOOCS)-based optical label for optical packet switching (OPS) (MOOCS-OPS) networks. In this scheme, each MOOCS is a permutation or combination of the multiple optical orthogonal codes (MOOC) selected from the multiple-groups optical orthogonal codes (MGOOC). Following a comparison of different optical label processing (OLP) schemes, the principles of MOOCS-OPS network are given and analyzed. Firstly, theoretical analyses are used to prove that MOOCS is able to greatly enlarge the number of available optical labels when compared to the previous single optical orthogonal code (SOOC) for OPS (SOOC-OPS) network. Then, the key units of the MOOCS-based optical label packets, including optical packet generation, optical label erasing, optical label extraction and optical label rewriting etc., are given and studied. These results are used to verify that the proposed MOOCS-OPS scheme is feasible.
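    The claim that MOOCS greatly enlarges the label space over a single optical orthogonal code is simple combinatorics: a sketch, assuming a label is a length-m ordered sequence (permutation) or unordered set (combination) of codes drawn from n available codes (the function name and the example numbers are illustrative, not the paper's parameters):

```python
from math import comb, perm  # Python 3.8+

def label_space(n_codes, seq_len):
    """Count distinct labels when a label is built from seq_len codes
    drawn without repetition from n_codes optical orthogonal codes."""
    return {"permutations": perm(n_codes, seq_len),
            "combinations": comb(n_codes, seq_len)}

# a single OOC label scheme yields only n_codes labels;
# sequences of codes enlarge the space combinatorially:
space = label_space(16, 3)
assert space["permutations"] == 16 * 15 * 14  # 3360 labels vs. 16
```

    Even short sequences therefore multiply the number of addressable labels, which is the enlargement the paper's theoretical analysis quantifies against the SOOC-OPS scheme.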

  9. Indiana Studies: Hoosier History, Government, and People. Unit II: Constitutional Crisis and Change.

    ERIC Educational Resources Information Center

    Barger, Harry D.; And Others

    The three chapters in Unit 2 of a six-unit series on Indiana state history designed to be taught in Indiana secondary schools chronicle the need for rewriting the Constitution of 1816, the events of the Constitutional Convention of 1850-51, and the details of the new constitution. Chapter 1 explains the reasons that Hoosiers wanted a new…

  10. Dismantling the Prison-House of Colonial History in a Selection of Michelle Cliff's Texts

    ERIC Educational Resources Information Center

    Labidi, Abid Larbi

    2016-01-01

    Most, if not all, writings by Jamaican writer Michelle Cliff are connected by a subterranean desire to re-write Afro-Caribbean history from new untold perspectives in reaction to the immense loss and/or distortions that marked the region's history for entire centuries. In this paper, I meticulously read four of Cliff's texts--"Abeng"…

  11. Contesting the Politics of Culture, Rewriting the Boundaries of Inclusion: Working for Social Justice with Muslim and Arab Communities.

    ERIC Educational Resources Information Center

    Abu El-Haj, Thea R.

    2002-01-01

    Recommends that educational anthropologists publicly attack the ideological purposes to which the concept of culture has been deployed following the September 11 attacks, noting the importance of supporting schools, communities, and the media in addressing the power and politics of race and religion in contemporary social and political contexts.…

  12. "Tech"nically Speaking: Social Technology Cyberbullying among Middle and High School Peers

    ERIC Educational Resources Information Center

    Weber, Nicole L.

    2012-01-01

    Being a teenager is not easy, but most of us live through it. Cyberbullying suicide victims will not have this luxury. Advancements in and access to social technologies (social networking sites, instant messaging systems, cell phone texting) are rewriting interaction patterns as they provide a majority of our nation's students with 24-hour-a-day,…

  13. Rewriting the Script: Multiple Modalities in a High School Humanities Classroom

    ERIC Educational Resources Information Center

    Block, Joshua

    2014-01-01

    In this article, Joshua Block states that his high school students are creators discovering how to express their ideas and emotions in multiple, complex ways. He teaches students who write their lives through words on pages as they fill journal after journal. There are others who constantly write and create in the form of tweets, photos, videos,…

  14. The New Taylorism: Hacking at the Philosophy of the University's End

    ERIC Educational Resources Information Center

    Goodman, Robin Truth

    2012-01-01

    This article looks at the critical writings of Mark C. Taylor. It suggests that Mark C. Taylor is rewriting a global imaginary devoid of the kind of citizenship that Henry Giroux claims as the basis for public education. Instead, Taylor wants to see the university take shape as profit-generating. According to Taylor, in lieu of learning to take…

  15. The American Indian Reader: History. Book Four of a Series in Educational Perspectives.

    ERIC Educational Resources Information Center

    Costo, Rupert; Henry, Jeannette, Ed.

    In an attempt to rewrite American history incorporating "long hidden facts" pertinent to the American Indian, this book endeavors to relate the "truth in history" and make "humanity see itself face to face without fear and in spite of the pangs of conscience". Each of 7 chapters addresses a specific aspect of American history relevant to the…

  16. 29 CFR 793.8 - “News editor.”

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 3 2013-07-01 2013-07-01 false “News editor.” 793.8 Section 793.8 Labor Regulations... Exemption § 793.8 “News editor.” A news editor is an employee who gathers, edits and rewrites the news. He may also select and prepare news items for broadcast and present the news on the air. An employee who...

  17. 29 CFR 793.8 - “News editor.”

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 3 2011-07-01 2011-07-01 false “News editor.” 793.8 Section 793.8 Labor Regulations... Exemption § 793.8 “News editor.” A news editor is an employee who gathers, edits and rewrites the news. He may also select and prepare news items for broadcast and present the news on the air. An employee who...

  18. 29 CFR 793.8 - “News editor.”

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 3 2012-07-01 2012-07-01 false “News editor.” 793.8 Section 793.8 Labor Regulations... Exemption § 793.8 “News editor.” A news editor is an employee who gathers, edits and rewrites the news. He may also select and prepare news items for broadcast and present the news on the air. An employee who...

  19. 29 CFR 793.8 - “News editor.”

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false “News editor.” 793.8 Section 793.8 Labor Regulations... Exemption § 793.8 “News editor.” A news editor is an employee who gathers, edits and rewrites the news. He may also select and prepare news items for broadcast and present the news on the air. An employee who...

  20. 29 CFR 793.8 - “News editor.”

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 3 2014-07-01 2014-07-01 false “News editor.” 793.8 Section 793.8 Labor Regulations... Exemption § 793.8 “News editor.” A news editor is an employee who gathers, edits and rewrites the news. He may also select and prepare news items for broadcast and present the news on the air. An employee who...

  1. INFOL for the CDC 6400 Information Storage and Retrieval System. Reference Manual.

    ERIC Educational Resources Information Center

    Mittman, B.; And Others

    INFOL for the CDC 6400 is a rewrite in FORTRAN IV of the CDC 3600/3800 INFOL (Information Oriented Language), a generalized information storage and retrieval system developed by the Control Data Corporation for the CDC 3600/3800 computer. With INFOL, selected pieces of information are extracted from a file and presented to the user quickly and…

  2. 76 FR 54268 - Self-Regulatory Organizations; Fixed Income Clearing Corporation; Notice of Filing of Proposed...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-31

    ... undertaking a rewrite of its internal software applications and operating systems to promote efficiency and... believes there is no need to provide participants with a choice of match mode because MBSD's system already attempts to find an exact match for trade input and, only if an exact match is not found, will the system...

  3. A National Education Standards Exit Strategy for States. WebMemo. No. 3437

    ERIC Educational Resources Information Center

    Burke, Lindsey M.

    2011-01-01

    The push for centralized control over what every child should learn has never had more momentum. The Obama Administration has pressured states to adopt the Common Core State Standards Initiative, conditioning more than $4 billion in Race to the Top grants on its adoption. The Administration's blueprint for the rewrite of No Child Left Behind also…

  4. A Newcomer Gains Power: An Analysis of the Role of Rhetorical Expertise.

    ERIC Educational Resources Information Center

    Katz, Susan M.

    1998-01-01

    Offers a case study describing how the rhetorical expertise of a young woman (at the lowest professional level in a male-dominated bureaucratic organization) gave her the power to revise the processes by which her organization did its work, to rewrite the job descriptions of the managers within the organization, and to create a unique role for…

  5. Rethinking the "L" Word in Higher Education. ASHE Higher Education Report, Volume 31, Number 6

    ERIC Educational Resources Information Center

    Kezar, Adrianna J., Ed.; Carducci, Rozana, Ed.; Contreras-McGavin, Melissa, Ed.

    2006-01-01

    Given the major changes that have occurred, it is important to examine the state of leadership research in higher education. This volume updates the 1989 Bensimon, Neumann, and Birnbaum volume. Rather than rewrite a book that nicely summarized research on leadership until the late 1980s, this book focuses on reviewing advances in paradigms,…

  6. Rewriting the Metabolic Blueprint: Advances in Pathway Diversification in Microorganisms

    PubMed Central

    Hossain, Gazi Sakir; Nadarajan, Saravanan Prabhu; Zhang, Lei; Ng, Tee-Kheang; Foo, Jee Loon; Ling, Hua; Choi, Won Jae; Chang, Matthew Wook

    2018-01-01

    Living organisms have evolved over millions of years to fine tune their metabolism to create efficient pathways for producing metabolites necessary for their survival. Advancement in the field of synthetic biology has enabled the exploitation of these metabolic pathways for the production of desired compounds by creating microbial cell factories through metabolic engineering, thus providing sustainable routes to obtain value-added chemicals. Following the past success in metabolic engineering, there is increasing interest in diversifying natural metabolic pathways to construct non-natural biosynthesis routes, thereby creating possibilities for producing novel valuable compounds that are non-natural or without elucidated biosynthesis pathways. Thus, the range of chemicals that can be produced by biological systems can be expanded to meet the demands of industries for compounds such as plastic precursors and new antibiotics, most of which can only be obtained through chemical synthesis currently. Herein, we review and discuss novel strategies that have been developed to rewrite natural metabolic blueprints in a bid to broaden the chemical repertoire achievable in microorganisms. This review aims to provide insights on recent approaches taken to open new avenues for achieving biochemical production that are beyond currently available inventions. PMID:29483901

  7. Rewriting the Metabolic Blueprint: Advances in Pathway Diversification in Microorganisms.

    PubMed

    Hossain, Gazi Sakir; Nadarajan, Saravanan Prabhu; Zhang, Lei; Ng, Tee-Kheang; Foo, Jee Loon; Ling, Hua; Choi, Won Jae; Chang, Matthew Wook

    2018-01-01

    Living organisms have evolved over millions of years to fine tune their metabolism to create efficient pathways for producing metabolites necessary for their survival. Advancement in the field of synthetic biology has enabled the exploitation of these metabolic pathways for the production of desired compounds by creating microbial cell factories through metabolic engineering, thus providing sustainable routes to obtain value-added chemicals. Following the past success in metabolic engineering, there is increasing interest in diversifying natural metabolic pathways to construct non-natural biosynthesis routes, thereby creating possibilities for producing novel valuable compounds that are non-natural or without elucidated biosynthesis pathways. Thus, the range of chemicals that can be produced by biological systems can be expanded to meet the demands of industries for compounds such as plastic precursors and new antibiotics, most of which can only be obtained through chemical synthesis currently. Herein, we review and discuss novel strategies that have been developed to rewrite natural metabolic blueprints in a bid to broaden the chemical repertoire achievable in microorganisms. This review aims to provide insights on recent approaches taken to open new avenues for achieving biochemical production that are beyond currently available inventions.

  8. Age verification cards fail to fully prevent minors from accessing tobacco products.

    PubMed

    Kanda, Hideyuki; Osaki, Yoneatsu; Ohida, Takashi; Kaneita, Yoshitaka; Munezawa, Takeshi

    2011-03-01

    Proper age verification can prevent minors from accessing tobacco products. For this reason, electronic locking devices based on a proof-of age system utilising cards were installed in almost every tobacco vending machine across Japan and Germany to restrict sales to minors. We aimed to clarify the associations between amount smoked by high school students and the usage of age verification cards by conducting a nationwide cross-sectional survey of students in Japan. This survey was conducted in 2008. We asked high school students, aged 13-18 years, in Japan about their smoking behaviour, where they purchase cigarettes, if or if not they have used age verification cards, and if yes, how they obtained this card. As the amount smoked increased, the prevalence of purchasing cigarettes from vending machines also rose for both males and females. The percentage of those with experience of using an age verification card was also higher among those who smoked more. Somebody outside of family was the top source of obtaining cards. Surprisingly, around 5% of males and females belonging to the group with highest smoking levels applied for cards themselves. Age verification cards cannot fully prevent minors from accessing tobacco products. These findings suggest that a total ban of tobacco vending machines, not an age verification system, is needed to prevent sales to minors.

  9. Formal verification of a set of memory management units

    NASA Technical Reports Server (NTRS)

    Schubert, E. Thomas; Levitt, K.; Cohen, Gerald C.

    1992-01-01

    This document describes the verification of a set of memory management units (MMU). The verification effort demonstrates the use of hierarchical decomposition and abstract theories. The MMUs can be organized into a complexity hierarchy. Each new level in the hierarchy adds a few significant features or modifications to the lower level MMU. The units described include: (1) a page check translation look-aside module (TLM); (2) a page check TLM with supervisor line; (3) a base bounds MMU; (4) a virtual address translation MMU; and (5) a virtual address translation MMU with memory resident segment table.
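    The base bounds MMU in the hierarchy above admits a very small functional model. The sketch below shows the translation relation a formal proof would verify; the names and error behavior are an illustrative assumption in Python, not the paper's actual HOL-style specification:

```python
def base_bounds_translate(vaddr, base, bound):
    """Base-bounds MMU model: a virtual address is valid iff it lies
    below the bound register; the physical address is base + vaddr.
    An out-of-bounds access raises instead of translating."""
    if vaddr >= bound:
        raise MemoryError("virtual address exceeds bound")
    return base + vaddr

# every valid address lands inside [base, base + bound)
assert base_bounds_translate(0x10, base=0x4000, bound=0x100) == 0x4010
```

    The more capable units in the hierarchy (TLM with supervisor line, virtual address translation) refine this same shape: each adds state or a lookup step while preserving the validity-check-then-translate structure that the hierarchical proofs exploit.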

  10. Towards Verification and Validation for Increased Autonomy

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra

    2017-01-01

    This presentation goes over the work we have performed over the last few years on verification and validation of the next-generation onboard collision avoidance system, ACAS X, for commercial aircraft. It describes our work on probabilistic verification and synthesis of the model that ACAS X is based on, and goes on to the validation of that model with respect to actual simulation and flight data. The presentation then moves on to identify the characteristics of ACAS X that are related to autonomy and to discuss the challenges that autonomy poses for V&V. All work presented has already been published.

  11. EOS-AM precision pointing verification

    NASA Technical Reports Server (NTRS)

    Throckmorton, A.; Braknis, E.; Bolek, J.

    1993-01-01

    The Earth Observing System (EOS) AM mission requires tight pointing knowledge to meet scientific objectives, in a spacecraft with low frequency flexible appendage modes. As the spacecraft controller reacts to various disturbance sources and as the inherent appendage modes are excited by this control action, verification of precision pointing knowledge becomes particularly challenging for the EOS-AM mission. As presently conceived, this verification includes a complementary set of multi-disciplinary analyses, hardware tests and real-time computer-in-the-loop simulations, followed by collection and analysis of hardware test and flight data and supported by a comprehensive database repository for validated program values.

  12. General Environmental Verification Specification

    NASA Technical Reports Server (NTRS)

    Milne, J. Scott, Jr.; Kaufman, Daniel S.

    2003-01-01

    The NASA Goddard Space Flight Center's General Environmental Verification Specification (GEVS) for STS and ELV Payloads, Subsystems, and Components is currently being revised based on lessons learned from GSFC engineering and flight assurance. The GEVS has been used by Goddard flight projects for the past 17 years as a baseline from which to tailor their environmental test programs. A summary of the requirements and updates are presented along with the rationale behind the changes. The major test areas covered by the GEVS include mechanical, thermal, and EMC, as well as more general requirements for planning, tracking of the verification programs.

  13. On verifying a high-level design. [cost and error analysis

    NASA Technical Reports Server (NTRS)

    Mathew, Ben; Wehbeh, Jalal A.; Saab, Daniel G.

    1993-01-01

    An overview of design verification techniques is presented, and some of the current research in high-level design verification is described. Formal hardware description languages that are capable of adequately expressing the design specifications have been developed, but some time will be required before they can have the expressive power needed to be used in real applications. Simulation-based approaches are more useful in finding errors in designs than they are in proving the correctness of a certain design. Hybrid approaches that combine simulation with other formal design verification techniques are argued to be the most promising over the short term.

  14. Integrated Disposal Facility FY 2016: ILAW Verification and Validation of the eSTOMP Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freedman, Vicky L.; Bacon, Diana H.; Fang, Yilin

    2016-05-13

    This document describes two sets of simulations carried out to further verify and validate the eSTOMP simulator. In this report, a distinction is made between verification and validation, and the focus is on verifying eSTOMP through a series of published benchmarks on cementitious wastes, and validating eSTOMP based on a lysimeter experiment for the glassified waste. These activities are carried out within the context of a scientific view of validation that asserts that models can only be invalidated, and that model validation (and verification) is a subjective assessment.

  15. Limitations in learning: How treatment verifications fail and what to do about it?

    PubMed

    Richardson, Susan; Thomadsen, Bruce

    The purposes of this study were to discuss why classic incident learning systems have been insufficient for patient safety improvement, to examine failures in treatment verification, and to provide context for the reasons behind these failures and the lessons that can be learned from them. Historically, incident learning in brachytherapy is performed via database mining, which might include reading event reports and incidents, followed by incorporating verification procedures to prevent similar incidents. A description of both classic event reporting databases and current incident learning and reporting systems is given. Real examples of treatment failures based on firsthand knowledge are presented to evaluate the effectiveness of verification; these failures are described and analyzed by outlining potential pitfalls and problems. Databases and incident learning systems can be limited in value and fail to provide enough detail for physicists seeking process improvement. Four examples of treatment verification failures experienced firsthand by experienced brachytherapy physicists are described, including both under-verification and over-verification of various treatment processes. Database mining is an insufficient method to effect substantial improvements in the practice of brachytherapy. New incident learning systems are still immature and being tested. Instead, a new method of shared learning and implementation of changes must be created.

  16. A Lightweight RFID Mutual Authentication Protocol Based on Physical Unclonable Function.

    PubMed

    Xu, He; Ding, Jie; Li, Peng; Zhu, Feng; Wang, Ruchuan

    2018-03-02

    With the fast development of the Internet of Things, Radio Frequency Identification (RFID) has been widely applied in many areas. Nevertheless, even as it brings convenience to daily life, RFID technology gradually exposes security problems. In particular, the appearance of a large number of fake and counterfeit goods has caused massive losses for both producers and customers, and the cloned tag is a serious security threat. If attackers acquire the complete information of a tag, they can obtain its unique identifier by technological means. In general, because a tag carries no extra identifier, it is difficult to distinguish an original tag from its clone. Once the legal tag data is obtained, attackers are able to clone the tag. Therefore, this paper presents an efficient RFID mutual verification protocol based on the Physical Unclonable Function (PUF) and lightweight cryptography to achieve efficient verification of a single tag. The protocol includes three processes: tag recognition, mutual verification and update. In tag recognition, the reader recognizes the tag; in mutual verification, the reader and tag verify the authenticity of each other; the update maintains the latest secret key for subsequent verifications. Analysis results show that this protocol strikes a good balance between performance and security.

  17. A Lightweight RFID Mutual Authentication Protocol Based on Physical Unclonable Function

    PubMed Central

    Ding, Jie; Zhu, Feng; Wang, Ruchuan

    2018-01-01

    With the fast development of the Internet of Things, Radio Frequency Identification (RFID) has been widely applied in many areas. Nevertheless, even as it brings convenience to daily life, RFID technology gradually exposes security problems. In particular, the appearance of a large number of fake and counterfeit goods has caused massive losses for both producers and customers, and the cloned tag is a serious security threat. If attackers acquire the complete information of a tag, they can obtain its unique identifier by technological means. In general, because a tag carries no extra identifier, it is difficult to distinguish an original tag from its clone. Once the legal tag data is obtained, attackers are able to clone the tag. Therefore, this paper presents an efficient RFID mutual verification protocol based on the Physical Unclonable Function (PUF) and lightweight cryptography to achieve efficient verification of a single tag. The protocol includes three processes: tag recognition, mutual verification and update. In tag recognition, the reader recognizes the tag; in mutual verification, the reader and tag verify the authenticity of each other; the update maintains the latest secret key for subsequent verifications. Analysis results show that this protocol strikes a good balance between performance and security. PMID:29498684
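    The three-process protocol described above can be sketched at a high level. In this illustrative model the PUF is simulated with a keyed hash and the handshake is collapsed to one pass each way; it is not the paper's construction (a real PUF response comes from device-unique physical variation, and the actual protocol also includes the key-update step):

```python
import hashlib
import os

def puf(device_secret, challenge):
    # Stand-in for a physical unclonable function: modeled here as a
    # keyed hash. A real PUF derives its response from device-unique
    # physical variation, which is what makes a tag unclonable.
    return hashlib.sha256(device_secret + challenge).digest()

# enrollment: the reader/server stores a challenge-response pair per tag
tag_secret = os.urandom(16)   # stands in for the tag's physical PUF
challenge = os.urandom(16)
stored_response = puf(tag_secret, challenge)

# mutual verification, collapsed to one pass each way
tag_response = puf(tag_secret, challenge)   # tag answers the challenge
assert tag_response == stored_response      # reader authenticates the tag
reader_proof = hashlib.sha256(stored_response + b"reader").digest()
tag_expected = hashlib.sha256(tag_response + b"reader").digest()
assert reader_proof == tag_expected         # tag authenticates the reader
```

    A cloned tag without the physical PUF cannot reproduce `stored_response`, which is the property the protocol relies on to separate originals from clones.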

  18. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    NASA Astrophysics Data System (ADS)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
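    The bit-for-bit evaluation that LIVVkit performs can be sketched in a few lines. This is an illustrative stand-in, not LIVVkit's actual API; the function name and report fields are hypothetical:

```python
def bit_for_bit(test_field, ref_field):
    """Bit-for-bit regression check in the spirit of LIVVkit: a test
    run passes only if every value matches the reference exactly;
    otherwise report how many values differ and by how much."""
    diffs = [t - r for t, r in zip(test_field, ref_field) if t != r]
    if not diffs:
        return {"match": True}
    return {"match": False,
            "n_differing": len(diffs),
            "max_abs_diff": max(abs(d) for d in diffs)}

ref = [1.0, 2.0, 3.0]
assert bit_for_bit(list(ref), ref) == {"match": True}
report = bit_for_bit([1.0, 2.25, 3.0], ref)  # one perturbed value
assert report["n_differing"] == 1
```

    Reporting where and by how much fields diverge, rather than a bare pass/fail, is what lets developers distinguish benign round-off from genuine regressions.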

  19. A calibration method for patient specific IMRT QA using a single therapy verification film

    PubMed Central

    Shukla, Arvind Kumar; Oinam, Arun S.; Kumar, Sanjeev; Sandhu, I.S.; Sharma, S.C.

    2013-01-01

    Aim The aim of the present study is to develop and verify a single-film calibration procedure for use in intensity-modulated radiation therapy (IMRT) quality assurance. Background Radiographic films have been regularly used in routine commissioning of treatment modalities and verification of treatment planning systems (TPS). Radiation dosimetry based on radiographic film can yield absolute two-dimensional dose distributions and is preferred for IMRT quality assurance. A single therapy verification film provides a quick and reliable method for IMRT verification. Materials and methods A single extended dose range (EDR2) film was used to generate the sensitometric curve relating film optical density to radiation dose. The EDR2 film was exposed with nine 6 cm × 6 cm fields of a 6 MV photon beam from a medical linear accelerator at 5-cm depth in a solid water phantom. The nine regions of the single film were exposed with radiation doses ranging from 10 to 362 cGy. The actual dose measurements inside the field regions were performed using a 0.6 cm3 ionization chamber. The exposed film was processed after irradiation and scanned using a VIDAR film scanner, and the optical density of each region was recorded. Ten IMRT plans of head and neck carcinoma, delivered with a dynamic IMRT technique, were used for verification and evaluated against the TPS-calculated dose distributions using the gamma index method. Results A sensitometric curve was generated from the single film exposed at nine field regions and used for quantitative dose verification of IMRT treatments. The radiation scatter factor was observed to decrease exponentially with increasing distance from the centre of each field region. The IMRT plans verified against the calibration curve using the gamma index method were found to be within the acceptance criteria.
    Conclusion The single-film method proved superior to the traditional calibration method and provides a fast daily film calibration for highly accurate IMRT verification. PMID:24416558
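For intuition, the gamma-index evaluation used in the study combines a dose-difference criterion with a distance-to-agreement (DTA) criterion, and a point passes when the combined metric is at most 1. Below is a minimal 1-D global gamma sketch in Python; the Gaussian profiles, 3%/3 mm tolerances, and grid are invented for illustration and are not the study's data or software.

```python
import numpy as np

def gamma_index(ref_dose, eval_dose, positions, dose_tol=0.03, dist_tol=3.0):
    """Global 1-D gamma index comparing an evaluated dose profile against a
    reference profile on the same grid.

    dose_tol: dose-difference criterion as a fraction of the reference maximum.
    dist_tol: distance-to-agreement criterion in mm.
    """
    dd_norm = dose_tol * ref_dose.max()          # absolute dose criterion
    gammas = np.empty_like(ref_dose, dtype=float)
    for i, (x_r, d_r) in enumerate(zip(positions, ref_dose)):
        # capital-Gamma against every evaluated point; gamma is the minimum
        dist2 = ((positions - x_r) / dist_tol) ** 2
        dose2 = ((eval_dose - d_r) / dd_norm) ** 2
        gammas[i] = np.sqrt((dist2 + dose2).min())
    return gammas

# Toy example: a slightly scaled and shifted Gaussian "measured" profile
x = np.linspace(-20, 20, 201)                    # positions in mm
ref = 200 * np.exp(-x**2 / 50)                   # reference dose in cGy
meas = 1.02 * 200 * np.exp(-(x - 0.5)**2 / 50)   # +2% dose, 0.5 mm shift
g = gamma_index(ref, meas, x)
pass_rate = (g <= 1).mean()                      # fraction of passing points
```

A small shift and dose scaling well inside 3%/3 mm yields a pass rate near 1.0.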

  20. SU-E-T-762: Toward Volume-Based Independent Dose Verification as Secondary Check

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tachibana, H; Tachibana, R

    2015-06-15

    Purpose: Lung SBRT planning has shifted to volume-prescription techniques. However, at the secondary check, agreement is still verified for point dose using independent dose verification. Volume dose verification is more strongly affected by inhomogeneity correction than the point dose verification currently used as the check. A feasibility study of volume dose verification was conducted for lung SBRT plans. Methods: Six SBRT plans were collected at our institute. Two dose distributions, with and without inhomogeneity correction, were generated using Adaptive Convolve (AC) in Pinnacle3. Simple MU Analysis (SMU, Triangle Product, Ishikawa, JP) was used as the independent dose verification software program, in which a modified Clarkson-based algorithm was implemented and the radiological path length was computed from CT images independently of the treatment planning system. Agreement in point dose and mean dose between the AC with/without the correction and the SMU was assessed. Results: In the point dose evaluation at the center of the GTV, comparison against the AC with inhomogeneity correction showed a systematic shift (4.5% ± 1.9%); on the other hand, there was good agreement of 0.2% ± 0.9% between the SMU and the AC without the correction. In the volume evaluation, there were significant differences in mean dose not only for the PTV (14.2% ± 5.1%) but also for the GTV (8.0% ± 5.1%) compared to the AC with the correction. Without the correction, the SMU showed good agreement for the GTV (1.5% ± 0.9%) as well as the PTV (0.9% ± 1.0%). Conclusion: Volume evaluation as a secondary check may be possible in homogeneous regions. However, volumes containing inhomogeneous media would show larger discrepancies. The dose calculation algorithm for independent verification needs to be modified to take the inhomogeneity correction into account.
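The radiological path length referred to above is essentially the line integral of relative electron density along a beam ray through the CT volume. The following is a minimal sketch assuming a hypothetical unit-spacing voxel grid and nearest-neighbour sampling; a production implementation would use an exact voxel-traversal (Siddon-style) ray tracer.

```python
import numpy as np

def radiological_path_length(density_grid, start, end, n_samples=256):
    """Approximate water-equivalent (radiological) path length along a ray.

    density_grid: 3-D array of relative electron densities (water = 1.0),
                  indexed on a unit-spacing voxel grid (toy geometry).
    start, end:   ray endpoints in voxel coordinates.
    """
    start, end = np.asarray(start, float), np.asarray(end, float)
    ts = np.linspace(0.0, 1.0, n_samples)
    points = start + ts[:, None] * (end - start)
    idx = np.clip(np.round(points).astype(int), 0,
                  np.array(density_grid.shape) - 1)
    densities = density_grid[idx[:, 0], idx[:, 1], idx[:, 2]]
    step = np.linalg.norm(end - start) / (n_samples - 1)
    return densities.sum() * step  # water-equivalent depth

# Homogeneous water phantom: radiological depth ~ geometric depth (40 voxels)
water = np.ones((50, 50, 50))
d_water = radiological_path_length(water, (25, 25, 0), (25, 25, 40))

# Insert a low-density, lung-like slab (0.3): radiological depth shrinks
lung = water.copy()
lung[:, :, 10:20] = 0.3
d_lung = radiological_path_length(lung, (25, 25, 0), (25, 25, 40))
```

This shrinkage of water-equivalent depth through low-density lung is exactly why inhomogeneity correction dominates the volume-dose comparison in the abstract.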

  1. SU-E-T-398: Feasibility of Automated Tools for Robustness Evaluation of Advanced Photon and Proton Techniques in Oropharyngeal Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, H; Liang, X; Kalbasi, A

    2014-06-01

    Purpose: Advanced radiotherapy (RT) techniques such as proton pencil beam scanning (PBS) and photon-based volumetric modulated arc therapy (VMAT) have dosimetric advantages in the treatment of head and neck malignancies. However, anatomic or alignment changes during treatment may limit the robustness of PBS and VMAT plans. We assess the feasibility of automated deformable registration tools for robustness evaluation in adaptive PBS and VMAT RT of oropharyngeal cancer (OPC). Methods: We treated 10 patients with bilateral OPC using advanced RT techniques and obtained verification CT scans with physician-reviewed target and OAR contours. We generated 3 advanced RT plans for each patient: a proton PBS plan using 2 posterior oblique fields (2F), a proton PBS plan using an additional third low-anterior field (3F), and a photon VMAT plan using 2 arcs (Arc). For each planning technique, we forward-calculated the initial (Ini) plans on the verification scans to create verification (V) plans. We extracted DVH indicators based on physician-generated contours for 2 target and 14 OAR structures to investigate the feasibility of two automated tools, contour propagation (CP) and dose deformation (DD), as surrogates for routine clinical plan robustness evaluation. For each verification scan, we compared DVH indicators of the V, CP, and DD plans in a head-to-head fashion using Student's t-test. Results: We performed 39 verification scans; each patient underwent 3 to 6 verification scans. We found no differences in doses to target or OAR structures between the V and CP, V and DD, and CP and DD plans across all patients (p > 0.05). Conclusions: The automated robustness evaluation tools, CP and DD, accurately predicted the dose distributions of the verification (V) plans based on physician-generated contours.
    These tools may be further developed as a potential robustness screening step in the workflow for adaptive treatment of OPC using advanced RT techniques, reducing the need for physician-generated contours.

  2. Development and verification of an agent-based model of opinion leadership.

    PubMed

    Anderson, Christine A; Titler, Marita G

    2014-09-27

    The use of opinion leaders is a strategy used to speed the process of translating research into practice. Much is still unknown about opinion leader attributes and activities and the contexts in which they are most effective. Agent-based modeling is a methodological tool that enables demonstration of the interactive and dynamic effects of individuals and their behaviors on other individuals in the environment. The purpose of this study was to develop and test an agent-based model of opinion leadership. The details of the design and verification of the model are presented. The agent-based model was developed by using a software development platform to translate an underlying conceptual model of opinion leadership into a computer model. Individual agent attributes (for example, motives and credibility) and behaviors (seeking or providing an opinion) were specified as variables in the model in the context of a fictitious patient care unit. The verification process was designed to test whether the agent-based model was capable of reproducing the conditions of the preliminary conceptual model. The verification methods included iterative programmatic testing ('debugging') and exploratory analysis of simulated data obtained from execution of the model. The simulation tests included a parameter sweep, in which the model input variables were adjusted systematically, followed by an individual time-series experiment. Statistical analysis of the model output for the 288 possible simulation scenarios in the parameter sweep revealed that the agent-based model performed consistently with the posited relationships in the underlying model. Nurse opinion leaders act on the strength of their beliefs and, as a result, become an opinion resource for their uncertain colleagues, depending on their perceived credibility. Over time, some nurses consistently act as this type of resource and have the potential to emerge as opinion leaders in a context where uncertainty exists.
The development and testing of agent-based models is an iterative process. The opinion leader model presented here provides a basic structure for continued model development, ongoing verification, and the establishment of validation procedures, including empirical data collection.
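The seek/provide dynamic described above can be illustrated with a few lines of agent-based simulation. This is a hypothetical toy model, not the authors' model: the agent names, attribute ranges, and consultation rule are all invented. It only shows the qualitative mechanism by which consultation counts concentrate on high-credibility agents over time.

```python
import random

class Nurse:
    def __init__(self, name, credibility, uncertainty):
        self.name = name
        self.credibility = credibility    # how believable the agent is
        self.uncertainty = uncertainty    # probability of seeking an opinion
        self.times_consulted = 0

def step(agents, rng):
    """One tick: each uncertain agent consults a colleague, chosen with
    probability proportional to perceived credibility."""
    weights = [a.credibility for a in agents]
    for agent in agents:
        if rng.random() < agent.uncertainty:
            adviser = rng.choices(agents, weights=weights)[0]
            if adviser is not agent:
                adviser.times_consulted += 1

rng = random.Random(42)
unit = [Nurse(f"n{i}", credibility=rng.uniform(0.1, 1.0),
              uncertainty=rng.uniform(0.2, 0.8)) for i in range(20)]
for _ in range(500):          # one scenario of a parameter-sweep-style run
    step(unit, rng)

# the emergent "opinion leader": the most-consulted agent
leader = max(unit, key=lambda a: a.times_consulted)
```

Sweeping the attribute ranges and re-running, as in the paper's 288-scenario parameter sweep, would turn this single time series into a verification experiment.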

  3. An investigation of the effects of relevant samples and a comparison of verification versus discovery based lab design

    NASA Astrophysics Data System (ADS)

    Rieben, James C., Jr.

    This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity of solutions experiment and an upper level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect of relevant (or "real world") samples on student learning and a verification-based lab design versus a discovery-based lab design. With the cellulose regeneration experiment, the effect of a discovery-based lab design vs. a verification-based lab design was the sole focus. Evaluation surveys consisting of six questions were used at three different times to assess student knowledge of experimental concepts. In the general chemistry laboratory portion of this study, four experimental variants were employed to investigate the effect of relevance and lab design on student learning. These variants consisted of a traditional (or verification) lab design, a traditional lab design using "real world" samples, a new lab design employing real world samples/situations using unknown samples, and the new lab design using real world samples/situations that were known to the student. Data used in this analysis were collected during the Fall 08, Winter 09, and Fall 09 terms. For the second part of this study a cellulose regeneration experiment was employed to investigate the effects of lab design. A demonstration creating regenerated cellulose "rayon" was modified and converted to an efficient and low-waste experiment. In the first variant students tested their products and verified a list of physical properties. In the second variant, students filled in a blank physical property chart with their own experimental results for the physical properties. 
Results from the conductivity experiment show significant student learning of the effects of concentration on conductivity and how to use conductivity to differentiate solution types with the use of real world samples. In the organic chemistry experiment, results suggest that the discovery-based design improved student retention of the chain length differentiation by physical properties relative to the verification-based design.

  4. From MetroII to Metronomy, Designing Contract-based Function-Architecture Co-simulation Framework for Timing Verification of Cyber-Physical Systems

    DTIC Science & Technology

    2015-03-13

    A. Lee. “A Programming Model for Time-Synchronized Distributed Real-Time Systems”. In: Proceedings of the Real-Time and Embedded Technology and Applications Symposium, 2007, pp. 259–268. …

  5. A Mechanism of Modeling and Verification for SaaS Customization Based on TLA

    NASA Astrophysics Data System (ADS)

    Luan, Shuai; Shi, Yuliang; Wang, Haiyang

    With the gradual maturation of SOA and the rapid development of the Internet, SaaS has become a popular software service mode. The customization actions of SaaS are usually subject to internal and external dependency relationships. This paper first introduces a method for modeling the customization process based on the Temporal Logic of Actions (TLA), and then proposes a verification algorithm to ensure that each customization step neither causes unpredictable effects on the system nor violates the rules defined by the SaaS provider.

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: UTC FUEL CELLS' PC25C POWER PLANT - GAS PROCESSING UNIT PERFORMANCE FOR ANAEROBIC DIGESTER GAS

    EPA Science Inventory

    Under EPA’s Environmental Technology Verification program, which provides objective and scientific third party analysis of new technology that can benefit the environment, a combined heat and power system based on the UTC Fuel Cell's PC25C Fuel Cell Power Plant was evaluated. The...

  7. "Expert" Verification of Classroom-Based Indicators of Teaching and Learning Effectiveness for Professional Renewable Certification.

    ERIC Educational Resources Information Center

    Naik, Nitin S.; And Others

    The results are provided of a statewide content verification survey of "expert" educators designed to verify indicators in the 1989-90 System for Teaching and Learning Assessment and Review (STAR) as reasonable expectations for beginning and/or experienced teachers (BETs) in Louisiana and as providing professional endorsement at the…

  8. RELAP-7 Software Verification and Validation Plan: Requirements Traceability Matrix (RTM) Part 1 – Physics and numerical methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, Yong Joon; Yoo, Jun Soo; Smith, Curtis Lee

    2015-09-01

    This INL plan comprehensively describes the Requirements Traceability Matrix (RTM) for the main physics and numerical methods of RELAP-7. The plan also describes the testing-based software verification and validation (SV&V) process: a set of specially designed software models used to test RELAP-7.

  9. 78 FR 69602 - Foreign Supplier Verification Programs for Importers of Food for Humans and Animals; Extension of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ... ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Food for... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration 21 CFR Part 1 [Docket No. FDA-2011-N-0143] RIN 0910-AG64 Foreign Supplier Verification Programs for Importers of Food for Humans and...

  10. A New "Moodle" Module Supporting Automatic Verification of VHDL-Based Assignments

    ERIC Educational Resources Information Center

    Gutierrez, Eladio; Trenas, Maria A.; Ramos, Julian; Corbera, Francisco; Romero, Sergio

    2010-01-01

    This work describes a new "Moodle" module developed to give support to the practical content of a basic computer organization course. This module goes beyond the mere hosting of resources and assignments. It makes use of an automatic checking and verification engine that works on the VHDL designs submitted by the students. The module automatically…

  11. Integrated Formal Analysis of Time-Triggered Ethernet

    NASA Technical Reports Server (NTRS)

    Dutertre, Bruno; Shankar, Natarajan; Owre, Sam

    2012-01-01

    We present new results related to the verification of the Time-Triggered Ethernet (TTE) clock synchronization protocol. This work extends previous model-checking-based verification of TTE. We identify a suboptimal design choice in a compression function used in clock synchronization, propose an improvement, and compare the original design with the improved definition using the SAL model checker.
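Clock-synchronization protocols of this kind compress the set of collected clock deviations into a single fault-tolerant correction value. The sketch below is a generic fault-tolerant midpoint, shown only to illustrate what a compression function does, namely bound the influence of up to f faulty readings; it is not the actual TTE compression function analyzed in the paper.

```python
def compress(deviations, f=1):
    """Fault-tolerant midpoint of collected clock deviations: discard the f
    largest and f smallest readings, then average the extremes of the rest.
    A generic sketch of a fault-tolerant average, not the TTE definition."""
    s = sorted(deviations)
    trimmed = s[f:len(s) - f] if len(s) > 2 * f else s
    return (trimmed[0] + trimmed[-1]) / 2
```

A single wildly faulty clock reading (e.g. 100 among readings near 0..2) cannot drag the correction value away from the honest readings.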

  12. Registration verification of SEA/AR fields. [Oregon, Texas, Montana, Nebraska, Washington, Colorado, Kansas, Oklahoma, and North Dakota

    NASA Technical Reports Server (NTRS)

    Austin, W. W.; Lautenschlager, L. (Principal Investigator)

    1981-01-01

    A method of field registration verification for 20 SEA/AR sites for the 1979 crop year is evaluated. Field delineations for the sites were entered into the data base, and their registration verified using single channel gray scale computer printout maps of LANDSAT data taken over the site.

  13. Description of a Website Resource for Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Smith, Brian R.; Huang, George P.

    2010-01-01

    The activities of the Turbulence Model Benchmarking Working Group, a subcommittee of the American Institute of Aeronautics and Astronautics (AIAA) Fluid Dynamics Technical Committee, are described. The group's main purpose is to establish a web-based repository for Reynolds-averaged Navier-Stokes turbulence model documentation, including verification and validation cases. This turbulence modeling resource has been established based on feedback from a survey on what is needed to achieve consistency and repeatability in turbulence model implementation and usage, and to document and disseminate information on new turbulence models or improvements to existing models. The various components of the website are described in detail: description of turbulence models, a turbulence model readiness rating system, verification cases, validation cases, validation databases, and turbulence manufactured solutions. An outline of the working group's future plans is also provided.

  14. Metasurfaces Based on Phase-Change Material as a Reconfigurable Platform for Multifunctional Devices

    PubMed Central

    Raeis-Hosseini, Niloufar; Rho, Junsuk

    2017-01-01

    Integration of phase-change materials (PCMs) into electrical/optical circuits has initiated extensive innovation for applications of metamaterials (MMs) including rewritable optical data storage, metasurfaces, and optoelectronic devices. PCMs have been studied deeply due to their reversible phase transition, high endurance, switching speed, and data retention. Germanium-antimony-tellurium (GST) is a PCM that has amorphous and crystalline phases with distinct properties, is bistable and nonvolatile, and undergoes a reliable and reproducible phase transition in response to an optical or electrical stimulus; GST may therefore have applications in tunable photonic devices and optoelectronic circuits. In this progress article, we outline recent studies of GST and discuss its advantages and possible applications in reconfigurable metadevices. We also discuss outlooks for integration of GST in active nanophotonic metadevices. PMID:28878196

  15. Two-point correlators revisited: fast and slow scales in multifield models of inflation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghersi, José T. Gálvez; Frolov, Andrei V., E-mail: joseg@sfu.ca, E-mail: frolov@sfu.ca

    2017-05-01

    We study the structure of two-point correlators of the inflationary field fluctuations in order to improve the accuracy and efficiency of existing methods for calculating primordial spectra. We present a description motivated by the separation of the fast and slow evolving components of the spectrum, based on the Cholesky decomposition of the field correlator matrix. Our purpose is to rewrite all the relevant equations of motion in terms of slowly varying quantities. This is important in order to include the contribution from high-frequency modes to the spectrum without degrading computational performance. The slow-roll approximation is not required to reproduce the main distinctive features in the power spectrum for each specific model of inflation.
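The key step described in the abstract, factoring the correlator matrix so that its dynamics can be rewritten in terms of the factor, can be illustrated with NumPy on a toy matrix (the matrix values below are invented for the example):

```python
import numpy as np

# Toy 3-field correlator matrix: symmetric and positive-definite,
# as any equal-time two-point correlator of real fields must be.
C = np.array([[4.0, 1.0, 0.5],
              [1.0, 3.0, 0.2],
              [0.5, 0.2, 2.0]])

# Cholesky factorization C = L @ L.T with L lower-triangular and
# positive on the diagonal.
L = np.linalg.cholesky(C)

# Evolving L instead of C keeps the correlator positive-definite by
# construction: any real lower-triangular L with positive diagonal
# reproduces a valid correlator matrix.
C_back = L @ L.T
```

This is why the factorized form is a convenient set of "slowly varying quantities": positivity constraints on the spectrum never have to be enforced by hand.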

  16. Fingerprint Identification Using SIFT-Based Minutia Descriptors and Improved All Descriptor-Pair Matching

    PubMed Central

    Zhou, Ru; Zhong, Dexing; Han, Jiuqiang

    2013-01-01

    The performance of conventional minutiae-based fingerprint authentication algorithms degrades significantly when dealing with low-quality fingerprints with many cuts or scratches. A similar degradation of the minutiae-based algorithms is observed when only small overlapping areas are available because of the narrow width of the sensors. Based on the detection of minutiae, Scale Invariant Feature Transformation (SIFT) descriptors are employed to fulfill verification tasks in the above difficult scenarios. However, the original SIFT algorithm is not suitable for fingerprints because of: (1) the similar patterns of parallel ridges; and (2) high computational resource consumption. To enhance the efficiency and effectiveness of the algorithm for fingerprint verification, we propose a SIFT-based Minutia Descriptor (SMD) that improves the SIFT algorithm through image processing, descriptor extraction, and matching. A two-step fast matcher, named improved All Descriptor-Pair Matching (iADM), is also proposed to perform 1:N verifications in real time. Fingerprint Identification using SMD and iADM (FISiA) achieved a significant improvement in accuracy on representative databases compared with the conventional minutiae-based method. The speed of FISiA also meets real-time requirements. PMID:23467056
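Descriptor-pair matching of the kind described above is commonly implemented as a nearest-neighbour search with Lowe's ratio test. The following is a generic sketch with synthetic 128-dimensional descriptors, not the paper's iADM algorithm: a match is kept only when the nearest neighbour is clearly closer than the second-nearest.

```python
import numpy as np

def match_descriptors(desc_a, desc_b, ratio=0.8):
    """Match each descriptor in desc_a to its nearest neighbour in desc_b,
    keeping only matches that pass the ratio test (nearest distance must be
    below ratio * second-nearest distance)."""
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        j, k = np.argsort(dists)[:2]        # nearest and second-nearest
        if dists[j] < ratio * dists[k]:
            matches.append((i, j))
    return matches

# Synthetic test: 50 SIFT-like descriptors and a slightly noisy copy
rng = np.random.default_rng(0)
base = rng.normal(size=(50, 128))
noisy = base + rng.normal(scale=0.05, size=base.shape)
matches = match_descriptors(base, noisy)
```

On real fingerprint data the parallel-ridge problem mentioned in the abstract makes many descriptors nearly identical, which is precisely when the plain ratio test starts rejecting true matches and a specialized descriptor such as SMD helps.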

  17. A new approach to hand-based authentication

    NASA Astrophysics Data System (ADS)

    Amayeh, G.; Bebis, G.; Erol, A.; Nicolescu, M.

    2007-04-01

    Hand-based authentication is a key biometric technology with a wide range of potential applications both in industry and government. Traditionally, hand-based authentication is performed by extracting information from the whole hand. To account for hand and finger motion, guidance pegs are employed to fix the position and orientation of the hand. In this paper, we consider a component-based approach to hand-based verification. Our objective is to investigate the discrimination power of different parts of the hand in order to develop a simpler, faster, and possibly more accurate and robust verification system. Specifically, we propose a new approach which decomposes the hand into different regions, corresponding to the fingers and the back of the palm, and performs verification using information from certain parts of the hand only. Our approach operates on 2D images acquired by placing the hand on a flat lighting table. Using a part-based representation of the hand allows the system to compensate for hand and finger motion without using any guidance pegs. To decompose the hand into different regions, we use a robust methodology based on morphological operators which does not require detecting any landmark points on the hand. To capture the geometry of the back of the palm and the fingers in sufficient detail, we employ high-order Zernike moments which are computed using an efficient methodology. The proposed approach has been evaluated on a database of 100 subjects with 10 images per subject, illustrating promising performance. Comparisons with related approaches using the whole hand for verification illustrate the superiority of the proposed approach. Moreover, qualitative comparisons with state-of-the-art approaches indicate that the proposed approach has comparable or better performance.
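The rotation invariance that motivates using Zernike moment magnitudes for hand parts can be demonstrated directly. The sketch below computes |Z_{n,m}| over the unit disk for a random stand-in image (not hand data, and not the paper's efficient computation scheme) and checks that a 90-degree rotation leaves the magnitude unchanged.

```python
import numpy as np
from math import factorial

def zernike_magnitude(img, n, m):
    """|Z_{n,m}| of a square grayscale image mapped onto the unit disk.
    The magnitude is rotation-invariant, which is what makes Zernike
    moments attractive for describing finger/palm geometry."""
    N = img.shape[0]
    ys, xs = np.mgrid[0:N, 0:N]
    x = (2.0 * xs - N + 1) / (N - 1)          # map pixel grid to [-1, 1]
    y = (2.0 * ys - N + 1) / (N - 1)
    r = np.hypot(x, y)
    theta = np.arctan2(y, x)
    mask = r <= 1.0                           # restrict to the unit disk
    am = abs(m)
    R = np.zeros_like(r)                      # radial polynomial R_{n,|m|}
    for k in range((n - am) // 2 + 1):
        c = ((-1) ** k * factorial(n - k)
             / (factorial(k)
                * factorial((n + am) // 2 - k)
                * factorial((n - am) // 2 - k)))
        R += c * r ** (n - 2 * k)
    V = R * np.exp(1j * m * theta)            # Zernike basis function
    Z = (n + 1) / np.pi * (img * np.conj(V))[mask].sum()
    return abs(Z)

rng = np.random.default_rng(3)
hand_img = rng.random((64, 64))               # stand-in for a hand region
z42 = zernike_magnitude(hand_img, 4, 2)
z42_rot = zernike_magnitude(np.rot90(hand_img), 4, 2)
```

Rotating the image only multiplies Z_{n,m} by a unit-modulus phase, so the magnitudes agree, which is why no guidance pegs are needed to fix orientation.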

  18. Performing Verification and Validation in Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  19. The FoReVer Methodology: A MBSE Framework for Formal Verification

    NASA Astrophysics Data System (ADS)

    Baracchi, Laura; Mazzini, Silvia; Cimatti, Alessandro; Tonetta, Stefano; Garcia, Gerald

    2013-08-01

    The need for a high level of confidence and operational integrity in critical space (software) systems is well recognized in the space industry and has so far been addressed through rigorous system and software development processes and stringent verification and validation regimes. The Model Based Space System Engineering (MBSSE) process derived in the System and Software Functional Requirement Techniques study (SSFRT) focused on the application of model-based engineering technologies to support the space system and software development processes, from mission-level requirements to software implementation, through model refinements and translations. In this paper we report on our work in the ESA-funded FoReVer project, where we aim at developing methodological, theoretical and technological support for a systematic approach to space avionics system development in phases 0/A/B/C. FoReVer enriches the MBSSE process with contract-based formal verification of properties, at different stages from system to software, through a step-wise refinement approach, with support for a Software Reference Architecture.

  20. Structural damage detection based on stochastic subspace identification and statistical pattern recognition: II. Experimental validation under varying temperature

    NASA Astrophysics Data System (ADS)

    Lin, Y. Q.; Ren, W. X.; Fang, S. E.

    2011-11-01

    Although most vibration-based damage detection methods achieve satisfactory verification on analytical or numerical structures, most of them encounter problems when applied to real-world structures under varying environments. Damage detection methods that directly extract damage features from periodically sampled dynamic time-history response measurements are desirable, but relevant research and field verification are still lacking. In this second part of a two-part paper, the robustness and performance of the statistics-based damage index using the forward innovation model by stochastic subspace identification of a vibrating structure, proposed in the first part, are investigated against two prestressed reinforced concrete (RC) beams tested in the laboratory and a full-scale RC arch bridge tested in the field under varying environments. Experimental verification is focused on temperature effects. It is demonstrated that the proposed statistics-based damage index is insensitive to temperature variations but sensitive to structural deterioration or state alteration. This makes it possible to detect structural damage in full-scale structures experiencing ambient excitations and varying environmental conditions.

  1. Inverse probability weighting estimation of the volume under the ROC surface in the presence of verification bias.

    PubMed

    Zhang, Ying; Alonzo, Todd A

    2016-11-01

    In diagnostic medicine, the volume under the receiver operating characteristic (ROC) surface (VUS) is a commonly used index to quantify the ability of a continuous diagnostic test to discriminate between three disease states. In practice, verification of the true disease status may be performed only for a subset of subjects under study since the verification procedure is invasive, risky, or expensive. The selection for disease examination might depend on the results of the diagnostic test and other clinical characteristics of the patients, which in turn can cause bias in estimates of the VUS. This bias is referred to as verification bias. Existing verification bias correction in three-way ROC analysis focuses on ordinal tests. We propose verification bias-correction methods to construct ROC surface and estimate the VUS for a continuous diagnostic test, based on inverse probability weighting. By applying U-statistics theory, we develop asymptotic properties for the estimator. A Jackknife estimator of variance is also derived. Extensive simulation studies are performed to evaluate the performance of the new estimators in terms of bias correction and variance. The proposed methods are used to assess the ability of a biomarker to accurately identify stages of Alzheimer's disease. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
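The inverse-probability-weighting idea above is that each verified subject is reweighted by the inverse of its verification probability, so the verified subsample behaves like the full sample. Below is a simplified brute-force sketch of the weighted U-statistic form; the data and verification probabilities are invented, and the paper's estimator and its variance theory are more elaborate.

```python
import numpy as np

def ipw_vus(scores, disease, verified, pi):
    """Inverse-probability-weighted estimate of the volume under the ROC
    surface (VUS), P(X_0 < X_1 < X_2) for three ordered disease states.
    Only verified subjects contribute; each is weighted by 1/pi, its
    probability of having been selected for disease verification."""
    scores = np.asarray(scores, float)
    disease = np.asarray(disease)
    verified = np.asarray(verified)
    w = verified / np.asarray(pi, float)
    groups = [np.where((verified == 1) & (disease == d))[0] for d in (0, 1, 2)]
    num = den = 0.0
    for i in groups[0]:
        for j in groups[1]:
            for k in groups[2]:
                wijk = w[i] * w[j] * w[k]
                den += wijk
                num += wijk * float(scores[i] < scores[j] < scores[k])
    return num / den

# Perfectly separating test: VUS should be exactly 1, no matter how the
# (here score-dependent) verification probabilities are chosen.
scores   = [1, 2, 3, 10, 11, 12, 20, 21, 22]
disease  = [0, 0, 0,  1,  1,  1,  2,  2,  2]
verified = [1, 1, 0,  1,  0,  1,  1,  1,  1]
pi       = [0.9, 0.9, 0.3, 0.8, 0.3, 0.8, 0.9, 0.9, 0.9]
vus = ipw_vus(scores, disease, verified, pi)
```

Without the 1/pi weights, score-dependent selection for verification would systematically distort the estimate; that distortion is the verification bias the paper corrects.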

  2. Proceedings of the Second NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar (Editor)

    2010-01-01

    This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision 
Diagrams; and Data-flow based Model Analysis.

  3. SU-E-T-49: A Multi-Institutional Study of Independent Dose Verification for IMRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baba, H; Tachibana, H; Kamima, T

    2015-06-15

    Purpose: AAPM TG114 does not cover independent verification for IMRT. We conducted a study of independent dose verification for IMRT at seven institutes to show its feasibility. Methods: 384 IMRT plans for the prostate and head-and-neck (HN) sites were collected from the institutes, where planning was performed using Eclipse and Pinnacle3 with the two techniques of step-and-shoot (S&S) and sliding window (SW). All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based; CT images were used to compute the radiological path length independently of the treatment planning system. An ion-chamber measurement in a water-equivalent slab phantom was performed to compare the doses computed using the TPS and the independent dose verification program. Additionally, the agreement in dose computed in patient CT images between the TPS and the SMU was assessed. The dose of the composite beams in each plan was evaluated. Results: The agreement between the measurement and the SMU was −2.3 ± 1.9% and −5.6 ± 3.6% for the prostate and HN sites, respectively. The agreement between the TPSs and the SMU was −2.1 ± 1.9% and −3.0 ± 3.7% for the prostate and HN sites, respectively. There was a negative systematic difference with a similar standard deviation, and the difference was larger for the HN site. The S&S technique showed a statistically significant difference from the SW technique, because the Clarkson-based method in the independent program cannot account for (and therefore underestimates) the dose under the MLC. Conclusion: Accuracy would improve if the Clarkson-based algorithm were modified for IMRT; the tolerance level would then be within 5%.

  4. Optical/digital identification/verification system based on digital watermarking technology

    NASA Astrophysics Data System (ADS)

    Herrigel, Alexander; Voloshynovskiy, Sviatoslav V.; Hrytskiv, Zenon D.

    2000-06-01

    This paper presents a new approach to the secure integrity verification of driver licenses, passports, and other analogue identification documents. The system embeds (and detects) the reference number of the identification document with DCT watermark technology in (and from) the photo of the identification document holder. During verification, the reference number is extracted and compared with the reference number printed on the identification document. The approach combines optical and digital image processing techniques. The detection system must be able to scan an analogue driver license or passport, convert the image of this document into a digital representation, and then apply the watermark verification algorithm to check the payload of the embedded watermark. If the payload of the watermark is identical to the printed visual reference number of the issuer, the verification is successful and the passport or driver license has not been modified. This approach constitutes a new class of application for watermark technology, which was originally targeted at the copyright protection of digital multimedia data. The presented approach substantially increases the security of the analogue identification documents used in many European countries.
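To make the embed/extract cycle concrete, here is a toy sign-based DCT embedding in NumPy. The coefficient positions, embedding strength, and sign rule are invented for illustration; the paper's DCT watermarking scheme is more sophisticated and must survive print-and-scan distortion, which this sketch does not attempt.

```python
import numpy as np

# Mid-band DCT positions that carry the payload bits (chosen arbitrarily)
MIDBAND = [(3, 4), (4, 3), (2, 5), (5, 2), (3, 5), (5, 3), (4, 4), (2, 6)]

def dct_matrix(n):
    """Orthonormal DCT-II basis matrix, so that coef = M @ img @ M.T."""
    j, k = np.meshgrid(np.arange(n), np.arange(n))
    m = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * j + 1) * k / (2 * n))
    m[0, :] = np.sqrt(1.0 / n)
    return m

def embed(img, bits, strength=8.0):
    """Force the sign of mid-band DCT coefficients to encode the bits
    (a toy sign-embedding scheme, not the paper's watermark algorithm)."""
    m = dct_matrix(img.shape[0])
    coef = m @ img @ m.T
    for b, (u, v) in zip(bits, MIDBAND):
        coef[u, v] = strength if b else -strength
    return m.T @ coef @ m            # inverse of an orthonormal transform

def extract(img, n_bits):
    """Recover the payload from the coefficient signs."""
    m = dct_matrix(img.shape[0])
    coef = m @ img @ m.T
    return [int(coef[u, v] > 0) for (u, v) in MIDBAND[:n_bits]]

rng = np.random.default_rng(1)
photo = rng.random((32, 32)) * 255.0     # stand-in for the owner photo
payload = [1, 0, 1, 1, 0, 0, 1, 0]       # e.g. bits of a reference number
recovered = extract(embed(photo, payload), len(payload))
```

In the verification scenario of the abstract, the recovered bits would then be compared against the reference number printed on the document.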

  5. Test and Verification Approach for the NASA Constellation Program

    NASA Technical Reports Server (NTRS)

    Strong, Edward

    2008-01-01

    This viewgraph presentation is a test and verification approach for the NASA Constellation Program. The contents include: 1) The Vision for Space Exploration: Foundations for Exploration; 2) Constellation Program Fleet of Vehicles; 3) Exploration Roadmap; 4) Constellation Vehicle Approximate Size Comparison; 5) Ares I Elements; 6) Orion Elements; 7) Ares V Elements; 8) Lunar Lander; 9) Map of Constellation content across NASA; 10) CxP T&V Implementation; 11) Challenges in CxP T&V Program; 12) T&V Strategic Emphasis and Key Tenets; 13) CxP T&V Mission & Vision; 14) Constellation Program Organization; 15) Test and Evaluation Organization; 16) CxP Requirements Flowdown; 17) CxP Model Based Systems Engineering Approach; 18) CxP Verification Planning Documents; 19) Environmental Testing; 20) Scope of CxP Verification; 21) CxP Verification - General Process Flow; 22) Avionics and Software Integrated Testing Approach; 23) A-3 Test Stand; 24) Space Power Facility; 25) MEIT and FEIT; 26) Flight Element Integrated Test (FEIT); 27) Multi-Element Integrated Testing (MEIT); 28) Flight Test Driving Principles; and 29) Constellation's Integrated Flight Test Strategy Low Earth Orbit Servicing Capability.

  6. Maude: A Wide Spectrum Language for Secure Active Networks

    DTIC Science & Technology

    2002-08-01

    AFRL-IF-RS-TR-2002-197 Final Technical Report August 2002 MAUDE: A WIDE SPECTRUM LANGUAGE FOR SECURE ACTIVE NETWORKS SRI...MAUDE: A WIDE SPECTRUM FORMAL LANGUAGE FOR SECURE ACTIVE NETWORKS 6. AUTHOR(S) Jose Meseguer and Carolyn Talcott 5. FUNDING NUMBERS C...specifications to address this challenge. We also show how, using the Maude rewriting logic language and tools, active network systems, languages, and

  7. Practical Techniques for Language Design and Prototyping

    DTIC Science & Technology

    2005-01-01

    Practical Techniques for Language Design and Prototyping Mark-Oliver Stehr1 and Carolyn L. Talcott2 1 University of Illinois at Urbana-Champaign...cs.stanford.edu Abstract. Global computing involves the interplay of a vast variety of languages, but practically useful foundations for language...framework, namely rewriting logic, that allows us to express (1) and (2) and, in addition, language aspects such as concurrency and non-determinism. We

  8. Magazine Editors and the Writing Process: An Analysis of How Editors Work with Staff and Free-Lance Writers.

    ERIC Educational Resources Information Center

    Schierhorn, Ann B.; Endres, Kathleen L.

    Editors of business and consumer magazines chosen by a random sample were asked in a mail survey what method they used in working with staff writers and free-lance writers. They were asked how they work with writers in the five stages of the writing process--idea, reporting, organizing, writing and rewriting. The first mailing to consumer…

  9. Thirteenth Annual "Brown" Lecture in Education Research: Public Education and the Social Contract--Restoring the Promise in an Age of Diversity and Division

    ERIC Educational Resources Information Center

    Tienda, Marta

    2017-01-01

    Building on the premise that closing achievement gaps is an economic imperative both to regain international educational supremacy and to maintain global economic competitiveness, I ask whether it is possible to rewrite the social contract so that education is a fundamental right--a statutory guarantee--that is both uniform across states and…

  10. Overview of Non-Volatile Testing and Screening Methods

    NASA Technical Reports Server (NTRS)

    Irom, Farokh

    2001-01-01

    Testing methods for memories and non-volatile memories have become increasingly sophisticated as they become denser and more complex. High frequency and faster rewrite times as well as smaller feature sizes have led to many testing challenges. This paper outlines several testing issues posed by novel memories and approaches to testing for radiation and reliability effects. We discuss methods for measurements of Total Ionizing Dose (TID).

  11. Levels of Understanding of L2 Literary Texts under Repeated Readings: Factors Contributing to Readers' Processing of Second Language Literature and Their Learning Outcomes.

    ERIC Educational Resources Information Center

    Carroli, Piera

    This study investigated college students' levels of understanding of texts and reading processes, noting how they changed through a cycle of individual reading and writing followed by classroom comparison of students' responses, text re-reading, and re-writing. The study, which followed 17 students of continuing Italian over 6 weeks, involved…

  12. Automated, Certified Program-rewriting for Software Security Enforcement

    DTIC Science & Technology

    2012-03-05

    VLC), pages 257-260, Oak Brook, Illinois, October 2010. [14] Aditi A. Patwardhan. Security-aware program visualization for analyzing in-lined...January 2010. [17] Meera Sridhar and Kevin W. Hamlen. Flexible in-lined reference monitor certification: Challenges and future directions. In...pages 55-60, Austin, Texas, January 2011. [18] Bhavani Thuraisingham and Kevin W. Hamlen. Challenges and future directions of software technology

  13. Active Camouflage for Infantry Headwear Applications

    DTIC Science & Technology

    2007-02-01

    incorporates a rewritable display medium. Military, academic, and commercial groups are aiming at developing OLEDs for full-color flexible displays...as shown in Figure 7. Figure 7: Organic LED Prototype shown on a Flexible surface (Kincade, 2004). OLEDs are self-luminous and do not require...brighter, more stable color displays. The OLED manufacturing process is much more amenable to retaining optimum performance on a flexible surface

  14. Transforming the Liberal Curriculum: Rewriting the Story of Sophie and Emile. Working Paper No. 156.

    ERIC Educational Resources Information Center

    Martin, Jane Roland

    The growing body of literature on transforming the curriculum through the study of women contains several strands which serve as evidence of the range of issues this body of work addresses. It is argued that the genderized ideal of the educated person is an inadequate guide to the education of either sex in the late twentieth century United…

  15. Assessing Requirements Quality through Requirements Coverage

    NASA Technical Reports Server (NTRS)

    Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt

    2008-01-01

    In model-based development, the development effort is centered around a formal description of the proposed software system: the model. This model is derived from high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has been largely determined through ad-hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements and existing model coverage metrics such as the Modified Condition and Decision Coverage (MC/DC) used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.
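The MC/DC criterion the abstract mentions requires each condition in a decision to be shown to independently affect the decision's outcome. A minimal sketch of checking that property over a test set follows; the example decision `(a and b) or c` and the test vectors are invented for illustration and are not from the paper.

```python
def decision(a, b, c):
    """Example decision under test (hypothetical)."""
    return (a and b) or c

def achieves_mcdc(tests, n_conditions, decision_fn):
    """For each condition index, report whether some pair of test vectors
    differs only in that condition while flipping the decision outcome
    (the 'independent effect' required by MC/DC)."""
    covered = []
    for i in range(n_conditions):
        found = any(
            t1[i] != t2[i]
            and all(t1[j] == t2[j] for j in range(n_conditions) if j != i)
            and decision_fn(*t1) != decision_fn(*t2)
            for t1 in tests for t2 in tests
        )
        covered.append(found)
    return covered

suite = [(True, True, False), (False, True, False),
         (True, False, False), (False, False, True), (False, False, False)]
```

For `n` conditions, MC/DC needs on the order of `n + 1` test cases rather than the `2**n` of exhaustive truth-table testing, which is why it is the metric of choice for highly critical avionics software.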

  16. Software Tools for Formal Specification and Verification of Distributed Real-Time Systems

    DTIC Science & Technology

    1994-07-29

    The goals of Phase 1 are to design in detail a toolkit environment based on formal methods for the specification and verification of distributed real-time systems and to evaluate the design. The evaluation of the design includes investigation of both the capability and potential usefulness of the toolkit environment and the feasibility of its implementation.

  17. Automatic Methods and Tools for the Verification of Real Time Systems

    DTIC Science & Technology

    1997-11-30

    We developed formal methods and tools for the verification of real-time systems. This was accomplished by extending techniques, based on automata...embedded real-time systems, we introduced hybrid automata, which equip traditional discrete automata with real-numbered clock variables and continuous...real-time systems, and we identified the exact boundary between decidability and undecidability of real-time reasoning.

  18. 2007 Beyond SBIR Phase II: Bringing Technology Edge to the Warfighter

    DTIC Science & Technology

    2007-08-23

    Systems Trade-Off Analysis and Optimization Verification and Validation On-Board Diagnostics and Self-Healing Security and Anti-Tampering Rapid...verification; Safety and reliability analysis of flight and mission critical systems On-Board Diagnostics and Self-Healing Model-based monitoring and...self-healing On-board diagnostics and self-healing; Autonomic computing; Network intrusion detection and prevention Anti-Tampering and Trust

  19. Review of waste package verification tests. Semiannual report, October 1982-March 1983

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soo, P.

    1983-08-01

    The current study is part of an ongoing task to specify tests that may be used to verify that engineered waste package/repository systems comply with NRC radionuclide containment and controlled release performance objectives. Work covered in this report analyzes verification tests for borosilicate glass waste forms and bentonite- and zeolite-based packing materials (discrete backfills). 76 references.

  20. Applying Formal Verification Techniques to Ambient Assisted Living Systems

    NASA Astrophysics Data System (ADS)

    Benghazi, Kawtar; Visitación Hurtado, María; Rodríguez, María Luisa; Noguera, Manuel

    This paper presents a verification approach based on timed traces semantics and MEDISTAM-RT [1] to check the fulfillment of non-functional requirements, such as timeliness and safety, and assure the correct functioning of the Ambient Assisted Living (AAL) systems. We validate this approach by its application to an Emergency Assistance System for monitoring people suffering from cardiac alteration with syncope.
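A timed-traces timeliness check of the kind this abstract describes can be illustrated with a minimal trace monitor. The event names, trace, and deadline values below are invented for the example; the paper's actual approach is based on MEDISTAM-RT, which this sketch does not reproduce.

```python
def check_timeliness(trace, deadlines):
    """Check a timed trace against trigger/response deadlines.

    trace: list of (event, timestamp) pairs, timestamps in seconds.
    deadlines: dict mapping (trigger, response) -> max allowed delay.
    Returns the list of violated (trigger, response) pairs.
    """
    times = dict(trace)  # last occurrence of each event wins (sketch)
    violations = []
    for (trigger, response), max_delay in deadlines.items():
        if trigger in times and response in times:
            if times[response] - times[trigger] > max_delay:
                violations.append((trigger, response))
    return violations

# Hypothetical emergency-assistance scenario: a syncope alarm must be
# raised within a bounded delay of detection.
trace = [("syncope_detected", 0.0), ("alarm_raised", 2.5)]
```

A real timed-traces semantics would quantify over all occurrences of each event rather than the last one; this sketch only shows the shape of a deadline check over one trace.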
