Managing Analysis Models in the Design Process
NASA Technical Reports Server (NTRS)
Briggs, Clark
2006-01-01
Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.
FORMED: Bringing Formal Methods to the Engineering Desktop
2016-02-01
Integrates formal verification into software design and development by precisely defining semantics for a restricted subset of the Unified Modeling Language ... input-output contract satisfaction and absence of null pointer dereferences. Subject terms: Formal Methods, Software Verification, Model-Based ... Domain-specific languages (DSLs) drive both implementation and formal verification.
2015-11-01
Table of contents fragment: 2.3.4 Input/Output Automata ... Various other modeling frameworks such as I/O Automata, Kahn Process Networks, Petri nets, Multi-dimensional SDF, etc. are also used for designing ... Framework comparison table fragment: ... formal, ideally suited to model DSP applications; (3) Petri Nets: graphical, formal, used for modeling distributed systems; (4) I/O Automata: both, formal ...
Proceedings of the First NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)
2009-01-01
Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.
Formal Assurance for Cognitive Architecture Based Autonomous Agent
NASA Technical Reports Server (NTRS)
Bhattacharyya, Siddhartha; Eskridge, Thomas; Neogi, Natasha; Carvalho, Marco
2017-01-01
Autonomous systems are designed and deployed in different modeling paradigms. These environments focus on specific concepts in designing the system. We focus our efforts on the use of cognitive architectures to design autonomous agents that collaborate with humans to accomplish tasks in a mission. Our research focuses on introducing formal assurance methods to verify the behavior of agents designed in Soar, by translating the agent to the formal verification environment Uppaal.
What can formal methods offer to digital flight control systems design
NASA Technical Reports Server (NTRS)
Good, Donald I.
1990-01-01
Formal methods research is beginning to produce methods that will enable mathematical modeling of the physical behavior of digital hardware and software systems. The development of these methods directly supports the NASA mission of increasing the scope and effectiveness of flight system modeling capabilities. The conventional, continuous mathematics that is used extensively in modeling flight systems is not adequate for accurate modeling of digital systems. Therefore, the current practice of digital flight control system design has not had the benefits of extensive mathematical modeling, which are common in other parts of flight system engineering. Formal methods research shows that by using discrete mathematics, very accurate modeling of digital systems is possible. These discrete modeling methods will bring the traditional benefits of modeling to digital hardware and software design. Sound reasoning about accurate mathematical models of flight control systems can be an important part of reducing the risk of unsafe flight control.
ADGS-2100 Adaptive Display and Guidance System Window Manager Analysis
NASA Technical Reports Server (NTRS)
Whalen, Mike W.; Innis, John D.; Miller, Steven P.; Wagner, Lucas G.
2006-01-01
Recent advances in modeling languages have made it feasible to formally specify and analyze the behavior of large system components. Synchronous data flow languages, such as Lustre, SCR, and RSML-e are particularly well suited to this task, and commercial versions of these tools such as SCADE and Simulink are growing in popularity among designers of safety critical systems, largely due to their ability to automatically generate code from the models. At the same time, advances in formal analysis tools have made it practical to formally verify important properties of these models to ensure that design defects are identified and corrected early in the lifecycle. This report describes how these tools have been applied to the ADGS-2100 Adaptive Display and Guidance Window Manager being developed by Rockwell Collins Inc. This work demonstrates how formal methods can be easily and cost-efficiently used to remove defects early in the design cycle.
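Illustrative aside: the kind of analysis described, exhaustively checking a property of a synchronous data-flow model, can be sketched in a few lines of Python. This toy is not SCADE, Simulink, or the ADGS-2100 model; the saturating counter, its bound, and the search depth are invented for illustration.

```python
# A minimal sketch: a synchronous data-flow node modeled as a Python step
# function, with an exhaustive bounded check of a safety property.

from itertools import product

def counter_step(state, reset):
    """One synchronous step: a saturating counter that reset clears."""
    return 0 if reset else min(state + 1, 3)

def check_invariant(depth=8):
    """Explore all Boolean input sequences up to `depth` steps and confirm
    the invariant: the counter never leaves the range [0, 3]."""
    for inputs in product([False, True], repeat=depth):
        state = 0
        for reset in inputs:
            state = counter_step(state, reset)
            assert 0 <= state <= 3, f"invariant violated on {inputs}"
    return True

print(check_invariant())  # True: property holds on all 2^8 input sequences
```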
An Ontology for State Analysis: Formalizing the Mapping to SysML
NASA Technical Reports Server (NTRS)
Wagner, David A.; Bennett, Matthew B.; Karban, Robert; Rouquette, Nicolas; Jenkins, Steven; Ingham, Michel
2012-01-01
State Analysis is a methodology developed over the last decade for architecting, designing and documenting complex control systems. Although it was originally conceived for designing robotic spacecraft, recent applications include the design of control systems for large ground-based telescopes. The European Southern Observatory (ESO) began a project to design the European Extremely Large Telescope (E-ELT), which will require coordinated control of over a thousand articulated mirror segments. The designers are using State Analysis as a methodology and the Systems Modeling Language (SysML) as a modeling and documentation language in this task. To effectively apply the State Analysis methodology in this context, it became necessary to provide ontological definitions of the concepts and relations in State Analysis, and to gain greater flexibility through a mapping of State Analysis into a practical extension of SysML. The ontology provides the formal basis for verifying compliance with State Analysis semantics, including architectural constraints. The SysML extension provides the practical basis for applying the State Analysis methodology with SysML tools. This paper will discuss the method used to develop these formalisms (the ontology), the formalisms themselves, the mapping to SysML, and the approach to using these formalisms to specify a control system and enforce architectural constraints in a SysML model.
Formal Analysis of BPMN Models Using Event-B
NASA Astrophysics Data System (ADS)
Bryans, Jeremy W.; Wei, Wei
The use of business process models has gone far beyond documentation purposes. In the development of business applications, they can play the role of an artifact on which high level properties can be verified and design errors can be revealed in an effort to reduce overhead at later software development and diagnosis stages. This paper demonstrates how formal verification may add value to the specification, design and development of business process models in an industrial setting. The analysis of these models is achieved via an algorithmic translation from the de-facto standard business process modeling language BPMN to Event-B, a widely used formal language supported by the Rodin platform which offers a range of simulation and verification technologies.
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Divito, Ben L.; Holloway, C. Michael
1994-01-01
In this paper the design and formal verification of the lower levels of the Reliable Computing Platform (RCP), a fault-tolerant computing system for digital flight control applications, are presented. The RCP uses NMR-style redundancy to mask faults and internal majority voting to flush the effects of transient faults. Two new layers of the RCP hierarchy are introduced: the Minimal Voting refinement (DA_minv) of the Distributed Asynchronous (DA) model and the Local Executive (LE) Model. Both the DA_minv model and the LE model are specified formally and have been verified using the Ehdm verification system. All specifications and proofs are available electronically via the Internet using anonymous FTP or World Wide Web (WWW) access.
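Illustrative aside: a minimal Python sketch of the two mechanisms this abstract names, NMR-style majority voting to mask faults and repeated voting to flush transients. The four-channel layout and frame values are invented; the RCP's actual voting and recovery logic is far more involved.

```python
# Majority voting over redundant channels, showing a transient fault being
# out-voted in one frame and flushed by the next frame's voted state.

from collections import Counter

def majority_vote(values):
    """Return the majority value among redundant channels (None if no majority)."""
    value, count = Counter(values).most_common(1)[0]
    return value if count > len(values) // 2 else None

# Four replicated channels; channel 2 suffers a transient upset in frame 1.
frames = [
    [10, 10, 10, 10],   # frame 0: all channels agree
    [11, 11, 99, 11],   # frame 1: transient fault on channel 2 is out-voted
    [12, 12, 12, 12],   # frame 2: channel re-synced from voted state, fault flushed
]
for i, frame in enumerate(frames):
    print(f"frame {i}: voted value = {majority_vote(frame)}")
```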
Learning Goal Orientation, Formal Mentoring, and Leadership Competence in HRD: A Conceptual Model
ERIC Educational Resources Information Center
Kim, Sooyoung
2007-01-01
Purpose: The purpose of this paper is to suggest a conceptual model of formal mentoring as a leadership development initiative including "learning goal orientation", "mentoring functions", and "leadership competencies" as key constructs of the model. Design/methodology/approach: Some empirical studies, though there are not many, will provide…
Modeling and Verification of Dependable Electronic Power System Architecture
NASA Astrophysics Data System (ADS)
Yuan, Ling; Fan, Ping; Zhang, Xiao-fang
The electronic power system can be viewed as a system composed of a set of concurrently interacting subsystems to generate, transmit, and distribute electric power. The complex interaction among subsystems makes the design of the electronic power system complicated. Furthermore, in order to guarantee the safe generation and distribution of electric power, fault tolerant mechanisms are incorporated in the system design to satisfy high reliability requirements. As a result, this incorporation makes the design of such systems more complicated. We propose a dependable electronic power system architecture, which can provide a generic framework to guide the development of electronic power systems and ease development complexity. In order to provide common idioms and patterns to the system designers, we formally model the electronic power system architecture using the PVS formal language. Based on the PVS model of this system architecture, we formally verify the fault tolerant properties of the system architecture using the PVS theorem prover, which can guarantee that the system architecture satisfies the high reliability requirements.
Systems engineering principles for the design of biomedical signal processing systems.
Faust, Oliver; Acharya U, Rajendra; Sputh, Bernhard H C; Min, Lim Choo
2011-06-01
Systems engineering aims to produce reliable systems which function according to specification. In this paper we follow a systems engineering approach to design a biomedical signal processing system. We discuss requirements capturing, specification definition, implementation and testing of a classification system. These steps are executed as formally as possible. The requirements, which motivate the system design, are based on diabetes research. The main requirement for the classification system is to be a reliable component of a machine which controls diabetes. Reliability is very important, because uncontrolled diabetes may lead to hyperglycaemia (raised blood sugar) and over a period of time may cause serious damage to many of the body's systems, especially the nerves and blood vessels. In a second step, these requirements are refined into a formal CSP‖B model. The formal model expresses the system functionality in a clear and semantically strong way. Subsequently, the proven system model was translated into an implementation. This implementation was tested with use cases and failure cases. Formal modeling and automated model checking gave us deep insight into the system functionality. This insight enabled us to create a reliable and trustworthy implementation. With extensive tests we established trust in the reliability of the implementation. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
ERIC Educational Resources Information Center
Calvo, Gilbert
Various educators from Latin and Central America and the Caribbean met to design and produce materials for teaching family life, human sexuality, community life, and environmental studies. They concluded that the materials should meet community standards; help prepare for future change; develop working models for designing effective teaching…
Reciprocal relations between cognitive neuroscience and formal cognitive models: opposites attract?
Forstmann, Birte U; Wagenmakers, Eric-Jan; Eichele, Tom; Brown, Scott; Serences, John T
2011-06-01
Cognitive neuroscientists study how the brain implements particular cognitive processes such as perception, learning, and decision-making. Traditional approaches in which experiments are designed to target a specific cognitive process have been supplemented by two recent innovations. First, formal cognitive models can decompose observed behavioral data into multiple latent cognitive processes, allowing brain measurements to be associated with a particular cognitive process more precisely and more confidently. Second, cognitive neuroscience can provide additional data to inform the development of formal cognitive models, providing greater constraint than behavioral data alone. We argue that these fields are mutually dependent; not only can models guide neuroscientific endeavors, but understanding neural mechanisms can provide key insights into formal models of cognition. Copyright © 2011 Elsevier Ltd. All rights reserved.
Fourth NASA Langley Formal Methods Workshop
NASA Technical Reports Server (NTRS)
Holloway, C. Michael (Compiler); Hayhurst, Kelly J. (Compiler)
1997-01-01
This publication consists of papers presented at NASA Langley Research Center's fourth workshop on the application of formal methods to the design and verification of life-critical systems. Topics considered include: proving properties of accidents; modeling and validating SAFER in VDM-SL; requirements analysis of real-time control systems using PVS; a tabular language for system design; and automated deductive verification of parallel systems. Also included is a fundamental hardware design in PVS.
A general U-block model-based design procedure for nonlinear polynomial control systems
NASA Astrophysics Data System (ADS)
Zhu, Q. M.; Zhao, D. Y.; Zhang, Jianhua
2016-10-01
The U-model concept (aimed at 'providing concise and applicable solutions for complex problems') and a corresponding basic U-control design algorithm originated in the first author's PhD thesis. The term U-model first appeared (not rigorously defined) in another journal paper by the first author, which established a framework for using linear polynomial control system design approaches to design nonlinear polynomial control systems (in brief, linear polynomial approaches → nonlinear polynomial plants). This paper presents the next milestone: using linear state-space approaches to design nonlinear polynomial control systems (in brief, linear state-space approaches → nonlinear polynomial plants). The overall aim of the study is to establish a framework, defined as the U-block model, which provides a generic prototype for using linear state-space-based approaches to design control systems for smooth nonlinear plants/processes described by polynomial models. To analyse feasibility and effectiveness, the sliding mode control design approach is selected as an exemplary case study. Numerical simulation studies provide a user-friendly, step-by-step procedure for readers and users interested in their own ad hoc applications. This is the first paper to present U-model-oriented control system design in a formal way and to study the associated properties and theorems; previous publications have mainly been algorithm-based studies and simulation demonstrations. In this sense, the paper can be treated as a landmark for U-model-based research, moving from the intuitive/heuristic stage to rigorous, formal and comprehensive studies.
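Illustrative aside: the core U-model idea, treating the plant's polynomial input-output map as a polynomial in the current control input and obtaining the control by root solving at each step, can be sketched as follows. The toy plant, the root-selection rule, and the setpoint are assumptions for illustration, not the authors' algorithm.

```python
# U-model-style control by per-step polynomial root resolution (sketch).

import numpy as np

def plant(u, y_prev):
    # Toy nonlinear polynomial plant: y = 0.5*y_prev + u + 0.2*u**2
    return 0.5 * y_prev + u + 0.2 * u**2

def u_model_control(y_desired, y_prev):
    """Solve 0.2*u^2 + u + (0.5*y_prev - y_desired) = 0 for u and pick the
    real root of smallest magnitude (an illustrative selection rule)."""
    roots = np.roots([0.2, 1.0, 0.5 * y_prev - y_desired])
    real = [r.real for r in roots if abs(r.imag) < 1e-9]
    return min(real, key=abs)

y, setpoint = 0.0, 1.0
for k in range(5):
    u = u_model_control(setpoint, y)
    y = plant(u, y)
    print(f"step {k}: u={u:+.4f}, y={y:.6f}")   # y tracks the setpoint exactly
```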
Design Strategy for a Formally Verified Reliable Computing Platform
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Caldwell, James L.; DiVito, Ben L.
1991-01-01
This paper presents a high-level design for a reliable computing platform for real-time control applications. The design tradeoffs and analyses related to the development of a formally verified reliable computing platform are discussed. The design strategy advocated in this paper requires the use of techniques that can be completely characterized mathematically, as opposed to more powerful or more flexible algorithms whose performance properties can only be analyzed by simulation and testing. The need for accurate reliability models that can be related to the behavior models is also stressed. Tradeoffs between reliability and voting complexity are explored. In particular, the transient recovery properties of the system are found to be fundamental to both the reliability analysis and the "correctness" models.
Formal Techniques for Synchronized Fault-Tolerant Systems
NASA Technical Reports Server (NTRS)
DiVito, Ben L.; Butler, Ricky W.
1992-01-01
We present the formal verification of synchronizing aspects of the Reliable Computing Platform (RCP), a fault-tolerant computing system for digital flight control applications. The RCP uses NMR-style redundancy to mask faults and internal majority voting to purge the effects of transient faults. The system design has been formally specified and verified using the EHDM verification system. Our formalization is based on an extended state machine model incorporating snapshots of local processors' clocks.
ERIC Educational Resources Information Center
Ward, Ted W.; Herzog, William A., Jr.
This document is part of a series dealing with nonformal education. Introductory information is included in document SO 008 058. The focus of this report is on the learning effectiveness of nonformal education. Chapter 1 compares effective learning in a formal and nonformal environment. Chapter 2 develops a systems model for designers of learning…
A Formal Methods Approach to the Analysis of Mode Confusion
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Miller, Steven P.; Potts, James N.; Carreno, Victor A.
2004-01-01
The goal of the new NASA Aviation Safety Program (AvSP) is to reduce the civil aviation fatal accident rate by 80% in ten years and 90% in twenty years. This program is being driven by the accident data, with a focus on the most recent history. Pilot error is the most commonly cited cause of fatal accidents (up to 70%) and obviously must be given major consideration in this program. While the greatest source of pilot error is the loss of situation awareness, mode confusion is increasingly becoming a major contributor as well. The January 30, 1995 issue of Aviation Week lists 184 incidents and accidents involving mode awareness, including the Bangalore A320 crash (2/14/90), the Strasbourg A320 crash (1/20/92), the Mulhouse-Habsheim A320 crash (6/26/88), and the Toulouse A330 crash (6/30/94). These incidents and accidents reveal that pilots sometimes become confused about what the cockpit automation is doing. Consequently, human factors research is an obvious investment area. However, even a cursory look at the accident data reveals that the mode confusion problem is much deeper than just training deficiencies and a lack of human-oriented design. This is readily acknowledged by human factors experts. It seems that further progress in human factors must come through a deeper scrutiny of the internals of the automation. It is in this arena that formal methods can contribute. Formal methods refers to the use of techniques from logic and discrete mathematics in the specification, design, and verification of computer systems, both hardware and software. The fundamental goal of formal methods is to capture requirements, designs and implementations in a mathematically based model that can be analyzed in a rigorous manner. Research in formal methods is aimed at automating this analysis as much as possible. By capturing the internal behavior of a flight deck in a rigorous and detailed formal model, the dark corners of a design can be analyzed. This paper will explore how formal models and analyses can be used to help eliminate mode confusion from flight deck designs and at the same time increase our confidence in the safety of the implementation. The paper is based upon interim results from a new project involving NASA Langley and Rockwell Collins in applying formal methods to a realistic business jet Flight Guidance System (FGS).
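Illustrative aside: one formal mode-confusion check amounts to searching the mode logic for "indirect" mode changes, transitions not triggered by a crew action. A hedged Python sketch over an invented toy autopilot (not the Rockwell Collins FGS):

```python
# Search a mode-logic transition system for mode changes driven by the
# automation itself rather than by pilot input; such transitions are
# classic candidates for mode-confusion review.

# (mode, event) -> new mode; events starting with "pilot_" are crew actions
transitions = {
    ("HDG", "pilot_select_nav"): "NAV_ARMED",
    ("NAV_ARMED", "beam_captured"): "NAV",   # automation changes mode itself
    ("NAV", "pilot_select_hdg"): "HDG",
    ("NAV", "beam_lost"): "HDG",             # another automation-driven change
}

indirect = [(m, e, n) for (m, e), n in transitions.items()
            if not e.startswith("pilot_") and n != m]
for mode, event, new in indirect:
    print(f"indirect mode change: {mode} --[{event}]--> {new}")
```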
Towards the formal specification of the requirements and design of a processor interface unit
NASA Technical Reports Server (NTRS)
Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.
1993-01-01
Work to formally specify the requirements and design of a Processor Interface Unit (PIU), a single-chip subsystem providing memory interface, bus interface, and additional support services for a commercial microprocessor within a fault-tolerant computer system, is described. This system, the Fault-Tolerant Embedded Processor (FTEP), is targeted towards applications in avionics and space requiring extremely high levels of mission reliability, extended maintenance-free operation, or both. The approaches that were developed for modeling the PIU requirements and for composition of the PIU subcomponents at high levels of abstraction are described. These approaches were used to specify and verify a nontrivial subset of the PIU behavior. The PIU specification in Higher Order Logic (HOL) is documented in a companion NASA contractor report entitled 'Towards the Formal Specification of the Requirements and Design of a Processor Interface Unit - HOL Listings.' The subsequent verification approach and HOL listings are documented in the NASA contractor reports entitled 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit' and 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit - HOL Listings.'
DRS: Derivational Reasoning System
NASA Technical Reports Server (NTRS)
Bose, Bhaskar
1995-01-01
The high reliability requirements for airborne systems require fault-tolerant architectures to address failures in the presence of physical faults, and the elimination of design flaws during the specification and validation phases of the design cycle. Although much progress has been made in developing methods to address physical faults, design flaws remain a serious problem. Formal methods provide a mathematical basis for removing design flaws from digital systems. DRS (Derivational Reasoning System) is a formal design tool based on advanced research in mathematical modeling and formal synthesis. The system implements a basic design algebra for synthesizing digital circuit descriptions from high-level functional specifications. DRS incorporates an executable specification language, a set of correctness-preserving transformations, a verification interface, and a logic synthesis interface, making it a powerful tool for realizing hardware from abstract specifications. DRS integrates recent advances in transformational reasoning, automated theorem proving and high-level CAD synthesis systems in order to provide enhanced reliability in designs with reduced time and cost.
48 CFR 252.211-7007 - Reporting of Government-Furnished Property.
Code of Federal Regulations, 2013 CFR
2013-10-01
... and groups of products; or (iii) Formal designations assigned to products by customer or supplier (such as model number or model type, design differentiation, or specific design series or configuration). “Part or identifying number (PIN)” means the identifier assigned by the original design activity, or by...
48 CFR 252.211-7007 - Reporting of Government-Furnished Property.
Code of Federal Regulations, 2012 CFR
2012-10-01
... and groups of products; or (iii) Formal designations assigned to products by customer or supplier (such as model number or model type, design differentiation, or specific design series or configuration). “Part or identifying number (PIN)” means the identifier assigned by the original design activity, or by...
48 CFR 252.211-7007 - Reporting of Government-Furnished Property.
Code of Federal Regulations, 2014 CFR
2014-10-01
... and groups of products; or (iii) Formal designations assigned to products by customer or supplier (such as model number or model type, design differentiation, or specific design series or configuration). “Part or identifying number (PIN)” means the identifier assigned by the original design activity, or by...
Software Design for Real-Time Systems on Parallel Computers: Formal Specifications.
1996-04-01
This research investigated the important issues related to the analysis and design of real-time systems targeted to parallel architectures. In particular, the software specification models for real-time systems on parallel architectures were evaluated. A survey of current formal methods for uniprocessor real-time system specification was conducted to determine their extensibility to specifying real-time systems on parallel architectures ...
The Latin American laws of correct nutrition: Review, unified interpretation, model and tools.
Chávez-Bosquez, Oscar; Pozos-Parra, Pilar
2016-03-01
The "Laws of Correct Nutrition": the Law of Quantity, the Law of Quality, the Law of Harmony and the Law of Adequacy, provide the basis of a proper diet, i.e. one that provides the body with the energy required and nutrients it needs for daily activities and maintenance of vital functions. For several decades, these Laws have been the basis of nourishing menus designed in Latin America; however, they are stated in a colloquial language, which leads to differences in interpretation and ambiguities for non-experts and even experts in the field. We present a review of the different interpretations of the Laws and describe a consensus. We represent concepts related to nourishing menu design employing the Unified Modeling Language (UML). We formalize the Laws using the Object Constraint Language (OCL). We design a nourishing menu for a particular user through enforcement of the Laws. We designed a domain model with the essential elements to plan a nourishing menu and we expressed the necessary constraints to make the model׳s behavior conform to the four Laws. We made a formal verification and validation of the model via USE (UML-based Specification Environment) and we developed a software prototype for menu design under the Laws. Diet planning is considered as an art but consideration should be given to the need for a set of strict rules to design adequate menus. Thus, we model the "Laws of Nutrition" as a formal basis and standard framework for this task. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.
Lindgren, Helena; Lundin-Olsson, Lillemor; Pohl, Petra; Sandlund, Marlene
2014-01-01
Five physiotherapists organised a user-centric design process for a knowledge-based support system for promoting exercise and preventing falls. The process integrated focus group studies with 17 older adults and prototyping. The transformation of informal medical and rehabilitation expertise and older adults' experiences into formal information and process models during the development was studied. As a tool they used ACKTUS, a development platform for knowledge-based applications. The process became agile and incremental, partly due to the diversity of expectations and preferences among both older adults and physiotherapists, and the participatory approach to design and development. In addition, there was a need to develop the knowledge content alongside the formal models and their presentations, which allowed the participants to test hands-on and evaluate the ideas, content and design. The resulting application is modular, extendable, flexible and adaptable to the individual end user. Moreover, the physiotherapists are able to modify the information and process models, and in this way further develop the application. The main constraint was found to be the lack of support for the initial phase of concept modelling, which led to a redesigned user interface and functionality of ACKTUS.
Verifying Hybrid Systems Modeled as Timed Automata: A Case Study
1997-03-01
Researchers have proposed many innovative formal methods for developing real-time systems [9]. Such methods can give system developers and customers greater confidence that real-time systems satisfy their requirements, especially their critical requirements. However, applying formal methods ... specifying and reasoning about real-time systems that is designed to address these challenging problems. Our approach is to build formal reasoning tools ...
Review of Estelle and LOTOS with respect to critical computer applications
NASA Technical Reports Server (NTRS)
Bown, Rodney L.
1991-01-01
Man-rated NASA space vehicles seem to represent a set of ultimate critical computer applications. These applications require a high degree of security, integrity, and safety. A variety of formal and/or precise modeling techniques are becoming available for the designer of critical systems. The design phase of the software engineering life cycle includes the modification of non-development components. A review of the Estelle and LOTOS formal description languages is presented. Details of the languages and a set of references are provided. The languages were used to formally describe some of the Open System Interconnect (OSI) protocols.
A Formal Model of Partitioning for Integrated Modular Avionics
NASA Technical Reports Server (NTRS)
DiVito, Ben L.
1998-01-01
The aviation industry is gradually moving toward the use of integrated modular avionics (IMA) for civilian transport aircraft. An important concern for IMA is ensuring that applications are safely partitioned so they cannot interfere with one another. We have investigated the problem of ensuring safe partitioning and logical non-interference among separate applications running on a shared Avionics Computer Resource (ACR). This research was performed in the context of ongoing standardization efforts, in particular, the work of RTCA committee SC-182 and the recently completed ARINC 653 application executive (APEX) interface standard. We have developed a formal model of partitioning suitable for evaluating the design of an ACR. The model draws from the mathematical modeling techniques developed by the computer security community. This report presents a formulation of partitioning requirements expressed first using conventional mathematical notation, then formalized using the language of SRI's Prototype Verification System (PVS). The approach is demonstrated on three candidate designs, each an abstraction of features found in real systems.
Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2004-01-01
A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
Quality, Organization Design, and Standards.
ERIC Educational Resources Information Center
Gardner, James F.
1992-01-01
This paper discusses regulations and voluntary standards within the context of organizational mission and management leadership for quality in human services. Regulation is seen as a measure of organizational formalism; formalism is seen as an attribute of the machine organization; and both are contrasted to organic control organization models and…
Optimized Temporal Monitors for SystemC
NASA Technical Reports Server (NTRS)
Tabakov, Deian; Rozier, Kristin Y.; Vardi, Moshe Y.
2012-01-01
SystemC is a modeling language built as an extension of C++. Its growing popularity and the increasing complexity of designs have motivated research efforts aimed at the verification of SystemC models using assertion-based verification (ABV), where the designer asserts properties that capture the design intent in a formal language such as PSL or SVA. The model then can be verified against the properties using runtime or formal verification techniques. In this paper we focus on automated generation of runtime monitors from temporal properties. Our focus is on minimizing runtime overhead, rather than monitor size or monitor-generation time. We identify four issues in monitor generation: state minimization, alphabet representation, alphabet minimization, and monitor encoding. We conduct extensive experimentation and identify a combination of settings that offers the best performance in terms of runtime overhead.
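Illustrative aside: the artifact such a tool generates, a runtime monitor encoded as an explicit automaton, can be sketched in Python for the PSL/LTL-style property "always (req implies eventually ack)". The two-state encoding and the finite-trace verdict are simplifications; the paper's monitors target SystemC and are heavily optimized.

```python
# A two-state runtime monitor for G(req -> F ack), checked over a finite trace.

WAIT, PENDING = 0, 1   # PENDING: a req has been seen and not yet acknowledged

def make_monitor():
    state = WAIT
    def step(req: bool, ack: bool):
        nonlocal state
        if state == WAIT and req and not ack:
            state = PENDING
        elif state == PENDING and ack:
            state = WAIT
        return state
    def at_end():
        # On a finite trace the property fails if a req is still unanswered.
        return state == WAIT
    return step, at_end

step, at_end = make_monitor()
trace = [(True, False), (False, False), (False, True)]  # req ... ack
for req, ack in trace:
    step(req, ack)
print("property holds on trace:", at_end())  # True
```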
Formal Verification of Complex Systems based on SysML Functional Requirements
2014-12-23
Mehrpouyan, Hoda; Tumer, Irem Y.; Hoyle, Chris; Giannakopoulou, Dimitra
... requirements for the design of complex engineered systems. The proposed approach combines a SysML modeling approach to document and structure safety requirements ... methods and tools to support the integration of safety into the design solution.
2017-04-17
Subject terms: Cyberphysical Systems, Formal Methods, Requirements Patterns, AADL, Assume Guarantee Reasoning Environment ... Rockwell Collins has been addressing these challenges by developing compositional reasoning methods that permit the verification of systems that exceed ...
Design of high reliability organizations in health care.
Carroll, J S; Rudolph, J W
2006-12-01
To improve safety performance, many healthcare organizations have sought to emulate high reliability organizations from industries such as nuclear power, chemical processing, and military operations. We outline high reliability design principles for healthcare organizations including both the formal structures and the informal practices that complement those structures. A stage model of organizational structures and practices, moving from local autonomy to formal controls to open inquiry to deep self-understanding, is used to illustrate typical challenges and design possibilities at each stage. We suggest how organizations can use the concepts and examples presented to increase their capacity to self-design for safety and reliability.
A systematic approach to embedded biomedical decision making.
Song, Zhe; Ji, Zhongkai; Ma, Jian-Guo; Sputh, Bernhard; Acharya, U Rajendra; Faust, Oliver
2012-11-01
Embedded decision making is a key feature of many biomedical systems. In most cases human life directly depends on correct decisions made by these systems; therefore they have to work reliably. This paper describes how we applied systems engineering principles to design a high-performance embedded classification system in a systematic and well-structured way. We introduce the structured design approach by discussing requirements capturing, specification refinement, implementation and testing. Thereby, we follow systems engineering principles and execute each of these processes as formally as possible. The requirements, which motivate the system design, describe an automated decision-making system for diagnostic support. These requirements are refined into the implementation of a support vector machine (SVM) algorithm which enables us to integrate automated decision making in embedded systems. With a formal model we establish functionality, stability and reliability of the system. Furthermore, we investigated different parallel processing configurations of this computationally complex algorithm. We found that, by adding SVM processes, an almost linear speedup is possible. Once we had established these system properties, we translated the formal model into an implementation. The resulting implementation was tested using XMOS processors with both normal and failure cases, to build up trust in the implementation. Finally, we demonstrated that our parallel implementation achieves the speedup predicted by the formal model. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
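Illustrative aside: a minimal sketch of the decision-making core described, an SVM classifier for diagnostic support, with scikit-learn standing in for the authors' formally derived XMOS implementation. The synthetic two-class data is purely illustrative.

```python
# Train and evaluate an SVM classifier on synthetic diagnostic data.

import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic feature vectors for two diagnostic classes (invented data)
healthy = rng.normal(loc=0.0, scale=1.0, size=(200, 4))
disease = rng.normal(loc=1.5, scale=1.0, size=(200, 4))
X = np.vstack([healthy, disease])
y = np.array([0] * 200 + [1] * 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```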
ERIC Educational Resources Information Center
Büker, Gundula; Schell-Straub, Sigrid
2017-01-01
Global learning facilitators from civil society organizations (CSOs) design and enrich educational processes in formal and non-formal educational settings. They need to be empowered through adequate training opportunities in global learning (GL) contexts. The project Facilitating Global Learning--Key Competences from Members of European CSOs (FGL)…
Developing Formal Correctness Properties from Natural Language Requirements
NASA Technical Reports Server (NTRS)
Nikora, Allen P.
2006-01-01
This viewgraph presentation reviews the rationale of the program to transform natural language specifications into formal notation; specifically, to automate the generation of Linear Temporal Logic (LTL) correctness properties from natural language temporal specifications. There are several reasons for this approach: (1) model-based techniques are becoming more widely accepted; (2) analytical verification techniques (e.g., model checking, theorem proving) are significantly more effective at detecting certain types of specification design errors (e.g., race conditions, deadlock) than manual inspection; (3) many requirements are still written in natural language, and the high learning curve for specification languages and associated tools, together with schedule and budget pressure on projects, reduces training opportunities for engineers; and (4) formulation of correctness properties for system models can be a difficult problem. This is relevant to NASA in that it would simplify the development of formal correctness properties, lead to more widespread use of model-based specification and design techniques, assist in earlier identification of defects, and reduce residual defect content for space mission software systems. The presentation also discusses potential applications, accomplishments, technology transfer potential and next steps.
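Illustrative aside: the pattern-matching step such a generator relies on can be sketched as a mapping from restricted natural language onto LTL property templates. The two templates below are standard specification patterns; the deliberately naive parsing is an assumption for illustration, not the project's actual grammar.

```python
# Map restricted natural language requirements to LTL via templates.

import re

TEMPLATES = [
    # "X shall always hold" -> G(X)
    (re.compile(r"^(?P<p>.+?) shall always hold$"), "G({p})"),
    # "after X, Y shall eventually hold" -> G(X -> F(Y))
    (re.compile(r"^after (?P<p>.+?), (?P<q>.+?) shall eventually hold$"),
     "G({p} -> F({q}))"),
]

def to_ltl(requirement: str) -> str:
    for pattern, template in TEMPLATES:
        m = pattern.match(requirement.strip().lower())
        if m:
            # Turn matched phrases into atomic proposition names
            return template.format(**{k: v.replace(" ", "_")
                                      for k, v in m.groupdict().items()})
    raise ValueError(f"no template matches: {requirement!r}")

print(to_ltl("after command sent, reply received shall eventually hold"))
# G(command_sent -> F(reply_received))
```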
Machine Learning-based Intelligent Formal Reasoning and Proving System
NASA Astrophysics Data System (ADS)
Chen, Shengqing; Huang, Xiaojian; Fang, Jiaze; Liang, Jia
2018-03-01
The reasoning system can be used in many fields, and improving reasoning efficiency is at the core of its design. By combining a formal description of proofs with a rule-matching algorithm and introducing a machine learning algorithm, the intelligent formal reasoning and verification system achieves high efficiency. The experimental results show that the system can verify the correctness of propositional logic reasoning and reuse propositional reasoning results, so as to obtain the implicit knowledge in the knowledge base and provide a basic reasoning model for the construction of intelligent systems.
Scaffolding Students' Development of Creative Design Skills: A Curriculum Reference Model
ERIC Educational Resources Information Center
Lee, Chien-Sing; Kolodner, Janet L.
2011-01-01
This paper provides a framework for promoting creative design capabilities in the context of achieving community goals pertaining to sustainable development among high school students. The framework can be used as a reference model to design formal or out-of-school curriculum units in any geographical region. This theme is chosen due to its…
State Event Models for the Formal Analysis of Human-Machine Interactions
NASA Technical Reports Server (NTRS)
Combefis, Sebastien; Giannakopoulou, Dimitra; Pecheur, Charles
2014-01-01
The work described in this paper was motivated by our experience with applying a framework for formal analysis of human-machine interactions (HMI) to a realistic model of an autopilot. The framework is built around a formally defined conformance relation called "full-control" between an actual system and the mental model according to which the system is operated. Systems are well-designed if they can be described by relatively simple, full-control, mental models for their human operators. For this reason, our framework supports automated generation of minimal full-control mental models for HMI systems, where both the system and the mental models are described as labelled transition systems (LTS). The autopilot that we analysed has been developed in the NASA Ames HMI prototyping tool ADEPT. In this paper, we describe how we extended the models that our HMI analysis framework handles to allow adequate representation of ADEPT models. We then provide a property-preserving reduction from these extended models to LTSs, to enable application of our LTS-based formal analysis algorithms. Finally, we briefly discuss the analyses we were able to perform on the autopilot model with our extended framework.
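Illustrative aside: a hedged Python sketch of a full-control style check, exploring the system and mental-model LTSs together and requiring that, in every reachable joint state, the operator's model allows exactly the commands the system accepts and at least the observations it can produce (one common formulation; the paper's definition is more general). The toy two-state machines are invented.

```python
# transitions: state -> {action: next_state}; commands vs observations by prefix
SYSTEM = {
    "off": {"cmd_power": "on"},
    "on":  {"cmd_power": "off", "obs_ready": "on"},
}
MODEL = {
    "Off": {"cmd_power": "On"},
    "On":  {"cmd_power": "Off", "obs_ready": "On"},
}

def commands(trans):     return {a for a in trans if a.startswith("cmd_")}
def observations(trans): return {a for a in trans if a.startswith("obs_")}

def full_control(system, model, s0, m0):
    seen, stack = set(), [(s0, m0)]
    while stack:
        s, m = stack.pop()
        if (s, m) in seen:
            continue
        seen.add((s, m))
        sys_t, mod_t = system[s], model[m]
        if commands(mod_t) != commands(sys_t):            # surprise command set
            return False, (s, m)
        if not observations(sys_t) <= observations(mod_t):  # unexplained output
            return False, (s, m)
        for a, s2 in sys_t.items():   # explore joint successors in lock-step
            stack.append((s2, mod_t[a]))
    return True, None

print(full_control(SYSTEM, MODEL, "off", "Off"))  # (True, None)
```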
Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
Information system modeling for biomedical imaging applications
NASA Astrophysics Data System (ADS)
Hoo, Kent S., Jr.; Wong, Stephen T. C.
1999-07-01
Information system modeling has historically been relegated to a low priority among the designers of information systems. Often there is a rush to design and implement hardware and software solutions after only the briefest assessment of the domain requirements. Although this process results in a rapid development cycle, the system usually does not satisfy the needs of the users, and the developers are forced to re-program certain aspects of the system. It would be much better to create an accurate model of the system based on the domain needs so that the implementation of the solution satisfies the needs of the users immediately. It would also be advantageous to build extensibility into the model so that updates to the system could be carried out in an organized fashion. The significance of this research is the development of a new formal framework for the construction of a multimedia medical information system. This formal framework is constructed using visual modeling, which provides a way of thinking about problems using models organized around real-world ideas. These models provide an abstract way to view complex problems, making them easier to understand. The formal framework is the result of an object-oriented analysis and design process that translates the system's requirements and functionality into software models. The usefulness of this information framework is demonstrated with two different applications in epilepsy research and care, i.e., surgical planning for epilepsy and decision threshold determination.
Symbolic discrete event system specification
NASA Technical Reports Server (NTRS)
Zeigler, Bernard P.; Chi, Sungdo
1992-01-01
Extending discrete event modeling formalisms to facilitate greater symbol manipulation capabilities is important to further their use in intelligent control and design of high autonomy systems. An extension to the DEVS formalism that facilitates symbolic expression of event times by extending the time base from the real numbers to the field of linear polynomials over the reals is defined. A simulation algorithm is developed to generate the branching trajectories resulting from the underlying nondeterminism. To efficiently manage symbolic constraints, a consistency checking algorithm for linear polynomial constraints based on feasibility checking algorithms borrowed from linear programming has been developed. The extended formalism offers a convenient means to conduct multiple, simultaneous explorations of model behaviors. Examples of application are given with concentration on fault model analysis.
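Illustrative aside: the consistency check described, keeping a simulation branch only if its accumulated linear constraints over symbolic event times are feasible, can be sketched with an off-the-shelf LP solver standing in for the custom feasibility checker. The constraint set below is invented.

```python
# Feasibility of linear constraints over symbolic event-time parameters,
# posed as a linear program with a zero objective.

from scipy.optimize import linprog

# Constraints over symbolic parameters (t1, t2), in the form A x <= b:
#   t1 - t2 <= -1   (event 1 fires at least 1 time unit before event 2)
#   t2      <= 10
#   -t1     <= 0    (t1 >= 0)
A = [[1, -1], [0, 1], [-1, 0]]
b = [-1, 10, 0]

res = linprog(c=[0, 0], A_ub=A, b_ub=b, bounds=[(None, None)] * 2)
print("branch consistent:", res.success)  # True: e.g. t1=0, t2=5 satisfies all
```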
ERIC Educational Resources Information Center
Raffo, Carlo; O'Connor, Justin; Lovatt, Andy; Banks, Mark
2000-01-01
Presents arguments supporting a social model of learning linked to situated learning and cultural capital. Critiques training methods used in cultural industries (arts, publishing, broadcasting, design, fashion, restaurants). Uses case study evidence to demonstrate the inadequacies of formal training in this sector. (Contains 49 references.) (SK)
Verification of NASA Emergent Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike
2004-01-01
NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study with which to experiment and test current formal methods on intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.
A High-Level Language for Modeling Algorithms and Their Properties
NASA Astrophysics Data System (ADS)
Akhtar, Sabina; Merz, Stephan; Quinson, Martin
Designers of concurrent and distributed algorithms usually express them using pseudo-code. In contrast, most verification techniques are based on more mathematically oriented formalisms such as state transition systems. This conceptual gap contributes to hinder the use of formal verification techniques. Leslie Lamport introduced PlusCal, a high-level algorithmic language that has the "look and feel" of pseudo-code, but is equipped with a precise semantics and includes a high-level expression language based on set theory. PlusCal models can be compiled to TLA+ and verified using the model checker TLC.
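Illustrative aside: what TLC does with a compiled PlusCal model, enumerate the reachable states and check an invariant in each, can be sketched in Python. The two-process mutual-exclusion toy and its entry guard are invented for illustration.

```python
# Explicit-state invariant checking by breadth-first reachability search.

from collections import deque

def next_states(state):
    """Each process cycles idle -> trying -> critical -> idle; entry to the
    critical section is guarded so only one process may be inside."""
    order = {"idle": "trying", "trying": "critical", "critical": "idle"}
    for i in range(2):
        nxt = order[state[i]]
        if nxt == "critical" and "critical" in state:
            continue  # someone else is already in the critical section
        yield tuple(nxt if j == i else state[j] for j in range(2))

def check(init, invariant):
    seen, queue = {init}, deque([init])
    while queue:
        s = queue.popleft()
        assert invariant(s), f"invariant violated in {s}"
        for t in next_states(s):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return len(seen)

mutual_exclusion = lambda s: s.count("critical") <= 1
print("states explored:", check(("idle", "idle"), mutual_exclusion))
```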
ERIC Educational Resources Information Center
Grey, Simon; Grey, David; Gordon, Neil; Purdy, Jon
2017-01-01
This paper offers an approach to designing game-based learning experiences inspired by the Mechanics-Dynamics-Aesthetics (MDA) model (Hunicke et al., 2004) and the elemental tetrad model (Schell, 2008) for game design. A case for game based learning as an active and social learning experience is presented including arguments from both teachers and…
The specification-based validation of reliable multicast protocol: Problem Report. M.S. Thesis
NASA Technical Reports Server (NTRS)
Wu, Yunqing
1995-01-01
Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems, and perform validation on the formal RMP specifications. The validation analysis helps identify some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.
Modeling and Analysis of Asynchronous Systems Using SAL and Hybrid SAL
NASA Technical Reports Server (NTRS)
Tiwari, Ashish; Dutertre, Bruno
2013-01-01
We present formal models and results of formal analysis of two different asynchronous systems. We first examine a mid-value select module that merges the signals coming from three different sensors that are each asynchronously sampling the same input signal. We then consider the phase locking protocol proposed by Daly, Hopkins, and McKenna. This protocol is designed to keep a set of non-faulty (asynchronous) clocks phase locked even in the presence of Byzantine-faulty clocks on the network. All models and verifications have been developed using the SAL model checking tools and the Hybrid SAL abstractor.
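Illustrative aside: the mid-value select function itself is simply the median of the three sensor samples, which bounds the effect of any single faulty channel. A minimal Python sketch (the real module's handling of asynchronous sample staleness is omitted):

```python
# Median-of-three selection: a single faulty channel can never drive the
# output outside the range spanned by the two healthy channels.

def mid_value_select(a: float, b: float, c: float) -> float:
    """Return the middle of three values (median)."""
    return sorted((a, b, c))[1]

# Property checked on an example: with channels a and b healthy, a faulty c
# cannot push the output outside [min(a, b), max(a, b)].
a, b, c = 4.9, 5.1, 250.0          # c is a hard-over fault
out = mid_value_select(a, b, c)
assert min(a, b) <= out <= max(a, b)
print(out)                          # 5.1
```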
Multiagent Modeling and Simulation in Human-Robot Mission Operations Work System Design
NASA Technical Reports Server (NTRS)
Sierhuis, Maarten; Clancey, William J.; Sims, Michael H.; Shafto, Michael (Technical Monitor)
2001-01-01
This paper describes a collaborative multiagent modeling and simulation approach for designing work systems. The Brahms environment is used to model mission operations for a semi-autonomous robot mission to the Moon at the work practice level. It shows the impact of human decision making on the activities and energy consumption of a robot. A collaborative work systems design methodology is described that allows informal models, created with users and stakeholders, to be used as input to the development of formal computational models.
HDL to verification logic translator
NASA Technical Reports Server (NTRS)
Gambles, J. W.; Windley, P. J.
1992-01-01
The increasingly high number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designers' confidence in the correctness of higher-level behavioral models.
Systematic Model-in-the-Loop Test of Embedded Control Systems
NASA Astrophysics Data System (ADS)
Krupp, Alexander; Müller, Wolfgang
Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural language formalization, there does not exist a standardized and accepted means for formal property definition as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed, based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.
Light UAV Support Ship (ASW) (LUSSA)
2011-08-01
...The Center for Innovation in Ship Design (CISD) used the Northrop Grumman Bat UAV (formerly known as the Swift Engineering Killer Bee KB4) to model launch, recovery, and...
Formal implementation of a performance evaluation model for the face recognition system.
Shin, Yong-Nyuo; Kim, Jason; Lee, Yong-Jun; Shin, Woochang; Choi, Jin-Young
2008-01-01
Due to its usability, practical applications, and lack of intrusiveness, face recognition technology, based on information derived from individuals' facial features, has been attracting considerable attention recently. Reported recognition rates of commercialized face recognition systems cannot be admitted as official recognition rates, as they are based on assumptions that are beneficial to the specific system and face database. Therefore, performance evaluation methods and tools are necessary to objectively measure the accuracy and performance of any face recognition system. In this paper, we propose and formalize a performance evaluation model for the biometric recognition system, implementing an evaluation tool for face recognition systems based on the proposed model. Furthermore, we performed evaluations objectively by providing guidelines for the design and implementation of a performance evaluation system, formalizing the performance test process.
Dependability modeling and assessment in UML-based software development.
Bernardi, Simona; Merseguer, José; Petriu, Dorina C
2012-01-01
Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results. PMID:22988428
Non-Lipschitz Dynamics Approach to Discrete Event Systems
NASA Technical Reports Server (NTRS)
Zak, M.; Meyers, R.
1995-01-01
This paper presents and discusses a mathematical formalism for simulation of discrete event dynamics (DED) - a special type of 'man-made' system designed to aid specific areas of information processing. A main objective is to demonstrate that the mathematical formalism for DED can be based upon the terminal model of Newtonian dynamics, which allows one to relax Lipschitz conditions at some discrete points.
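For intuition, a standard textbook example of such terminal dynamics (illustrative; not taken from the paper): the right-hand side below is non-Lipschitz at the equilibrium, so the equilibrium is reached in finite time rather than only asymptotically, which is what makes discrete, event-like transitions representable.

```latex
% Terminal attractor: non-Lipschitz at x = 0, finite-time convergence.
\dot{x} = -k\,x^{1/3}, \qquad k > 0.
% Separating variables and integrating from x(0) = x_0 > 0 gives
x(t)^{2/3} = x_0^{2/3} - \tfrac{2k}{3}\,t,
% so the equilibrium x = 0 is reached at the finite time
t_f = \frac{3\,x_0^{2/3}}{2k}.
```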
Multi-Attribute Tradespace Exploration in Space System Design
NASA Astrophysics Data System (ADS)
Ross, A. M.; Hastings, D. E.
2002-01-01
The complexity inherent in space systems necessarily requires intense expenditures of resources, both human and monetary. The high level of ambiguity present in the early design phases of these systems causes long, highly iterative, and costly design cycles. This paper looks at incorporating decision theory methods into the early design processes to streamline communication of wants and needs among stakeholders and between levels of design. Communication channeled through formal utility interviews and analysis enables engineers to better understand the key drivers for the system and allows a more thorough exploration of the design tradespace. Multi-Attribute Tradespace Exploration (MATE), an evolving process incorporating decision theory into model- and simulation-based design, has been applied to several space system case studies at MIT. Preliminary results indicate that this process can improve the quality of communication to more quickly resolve project ambiguity, and enable the engineer to discover better value designs for multiple stakeholders. MATE is also being integrated into a concurrent design environment to facilitate the transfer of knowledge about important design drivers into higher-fidelity design phases. Formal utility theory provides a mechanism to bridge the language barrier between experts of different backgrounds and differing needs (e.g., scientists, engineers, and managers). MATE with concurrent design couples decision makers more closely to the design and, most importantly, maintains their presence between formal reviews.
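As a minimal illustration of the utility aggregation at the heart of tradespace exploration (a simple additive form with made-up attributes and weights; MATE itself elicits utilities through formal stakeholder interviews, and may use other aggregation forms):

```python
# Single-attribute utilities map raw performance onto [0, 1].
def u_resolution(meters):      # lower is better
    return max(0.0, min(1.0, (10.0 - meters) / 9.0))

def u_revisit(hours):          # lower is better
    return max(0.0, min(1.0, (24.0 - hours) / 23.0))

WEIGHTS = {"resolution": 0.6, "revisit": 0.4}

def mau(design):
    """Multi-attribute utility as a weighted sum of attribute utilities."""
    return (WEIGHTS["resolution"] * u_resolution(design["resolution_m"])
            + WEIGHTS["revisit"] * u_revisit(design["revisit_h"]))

# Rank a toy tradespace of candidate designs by utility.
tradespace = [
    {"name": "A", "resolution_m": 2.0, "revisit_h": 12.0},
    {"name": "B", "resolution_m": 5.0, "revisit_h": 3.0},
]
for d in sorted(tradespace, key=mau, reverse=True):
    print(d["name"], round(mau(d), 3))
```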
Partridge, Chris; de Cesare, Sergio; Mitchell, Andrew; Odell, James
2018-01-01
Formalization is becoming more common in all stages of the development of information systems, as a better understanding of its benefits emerges. Classification systems are ubiquitous, nowhere more so than in domain modeling. The classification pattern that underlies these systems provides a good case study of the move toward formalization, in part because it illustrates some of the barriers to formalization, including the formal complexity of the pattern and the ontological issues surrounding the "one and the many." Powersets are a way of characterizing the (complex) formal structure of the classification pattern, and their formalization has been extensively studied in mathematics since Cantor's work in the late nineteenth century. One can use this formalization to develop a useful benchmark. There are various communities within information systems engineering (ISE) that are gradually working toward a formalization of the classification pattern. However, for most of these communities, this work is incomplete, in that they have not yet arrived at a solution with the expressiveness of the powerset benchmark. This contrasts with the early smooth adoption of powersets by other information systems communities to, for example, formalize relations. One way of understanding the varying rates of adoption is recognizing that the different communities have different historical baggage. Many conceptual modeling communities emerged from work done on database design, and this creates hurdles to the adoption of the high level of expressiveness of powersets. Another relevant factor is that these communities often feel, particularly in the case of domain modeling, a responsibility to explain the semantics of whatever formal structures they adopt. This paper aims to make sense of the formalization of the classification pattern in ISE, surveying its history through the literature, starting from the relevant theoretical works of the mathematical literature and gradually shifting focus to the ISE literature. The literature survey follows the evolution of ISE's understanding of how to formalize the classification pattern. The various proposals are assessed using the classical example of classification: the Linnaean taxonomy, formalized using powersets, as a benchmark for formal expressiveness. The broad conclusion of the survey is (1) that the ISE community is currently in the early stages of understanding how to formalize the classification pattern, particularly in the requirements for expressiveness exemplified by powersets, and (2) that there is an opportunity to intervene and speed up the process of adoption by clarifying this expressiveness. Given the central place that the classification pattern has in domain modeling, this intervention has the potential to lead to significant improvements.
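A hedged sketch of the powerset reading of classification (a toy taxonomy invented for illustration, not from the paper): a class is modeled as a set of instances, i.e., an element of the powerset of individuals, and a higher-order class as an element of the powerset of the powerset.

```python
from itertools import chain, combinations

# Toy individuals.
individuals = {"rex", "felix", "tweety"}

# A class (e.g., a species) is a set of individuals -- an element
# of the powerset P(individuals).
dog = frozenset({"rex"})
cat = frozenset({"felix"})
bird = frozenset({"tweety"})

# A second-order class (e.g., a genus or rank) is a set of classes --
# an element of P(P(individuals)).
mammal = frozenset({dog, cat})

def powerset(s):
    """All subsets of s, each as a frozenset."""
    s = list(s)
    return set(map(frozenset,
                   chain.from_iterable(combinations(s, r)
                                       for r in range(len(s) + 1))))

P = powerset(individuals)
assert dog in P and cat in P and bird in P
assert mammal in powerset(P)   # classification one level up
```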
Prototype design based on NX subdivision modeling application
NASA Astrophysics Data System (ADS)
Zhan, Xianghui; Li, Xiaoda
2018-04-01
Prototype design is an important part of product design: a quick and easy way to draw a three-dimensional product prototype. Informed by actual production, the prototype can be modified several times, yielding an efficient and reasonable design before the formal design begins. Subdivision modeling is a common method for modeling product prototypes; with it, a three-dimensional product prototype can be obtained quickly with simple operations. This paper discusses the operation method of subdivision modeling for geometry. Taking a vacuum cleaner as an example, the NX subdivision modeling functions are applied. Finally, future developments in subdivision modeling are forecast.
Software Formal Inspections Guidebook
NASA Technical Reports Server (NTRS)
1993-01-01
The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.
ERIC Educational Resources Information Center
Lin, Zhiang; Carley, Kathleen
How should organizations of intelligent agents be designed so that they exhibit high performance even during periods of stress? A formal model of organizational performance given a distributed decision-making environment in which agents encounter a radar detection task is presented. Using this model the performance of organizations with various…
Integrated Formal Analysis of Time-Triggered Ethernet
NASA Technical Reports Server (NTRS)
Dutertre, Bruno; Shankar, Natarajan; Owre, Sam
2012-01-01
We present new results related to the verification of the Time-Triggered Ethernet (TTE) clock synchronization protocol. This work extends previous verification of TTE based on model checking. We identify a suboptimal design choice in a compression function used in clock synchronization, and propose an improvement. We compare the original design and the improved definition using the SAL model checker.
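For flavor, here is a hedged sketch of a Welch–Lynch-style fault-tolerant midpoint, a common shape for clock-synchronization compression functions (illustrative only; TTE's actual compression function and the paper's proposed improvement differ in detail):

```python
def fault_tolerant_midpoint(readings, f):
    """Fuse remote clock readings, tolerating up to f faulty values.

    Sort the readings, discard the f smallest and f largest (which
    bounds the influence of f arbitrarily faulty clocks), and take
    the midpoint of the surviving extremes.
    """
    assert len(readings) > 2 * f, "need more than 2f readings"
    trimmed = sorted(readings)[f:len(readings) - f]
    return (trimmed[0] + trimmed[-1]) / 2.0

# Example: one Byzantine clock (999.0) among five, f = 1.
print(fault_tolerant_midpoint([10.0, 10.2, 9.9, 10.1, 999.0], f=1))
# -> 10.1
```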
Properties of a Formal Method for Prediction of Emergent Behaviors in Swarm-based Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James
2004-01-01
Autonomous intelligent swarms of satellites are being proposed for NASA missions that have complex behaviors and interactions. The emergent properties of swarms make these missions powerful, but at the same time more difficult to design and to assure that proper behaviors will emerge. This paper gives the results of research into formal methods techniques for verification and validation of NASA swarm-based missions. Multiple formal methods were evaluated to determine their effectiveness in modeling and assuring the behavior of swarms of spacecraft, using the NASA ANTS mission as an example of swarm intelligence. We give partial specifications of the ANTS mission using four selected methods, evaluate those methods, and identify the properties a formal method needs for effective specification and prediction of emergent behavior in swarm-based systems.
F-Nets and Software Cabling: Deriving a Formal Model and Language for Portable Parallel Programming
NASA Technical Reports Server (NTRS)
DiNucci, David C.; Saini, Subhash (Technical Monitor)
1998-01-01
Parallel programming is still based upon antiquated, sequence-based definitions of the terms "algorithm" and "computation", resulting in programs which are architecture-dependent and difficult to design and analyze. By focusing on obstacles inherent in existing practice, a more portable model is derived here, which is then formalized into a model called Soviets that utilizes a combination of imperative and functional styles. This formalization suggests more general notions of algorithm and computation, as well as insights into the meaning of structured programming in a parallel setting. To illustrate how these principles can be applied, a very-high-level, graphical, architecture-independent parallel language, called Software Cabling, is described, with many of the features normally expected from today's computer languages (e.g., data abstraction, data parallelism, and object-based programming constructs).
Formal reasoning about systems biology using theorem proving
Hasan, Osman; Siddique, Umair; Tahar, Sofiène
2017-01-01
Systems biology provides the basis to understand the behavioral properties of complex biological organisms at different levels of abstraction. Traditionally, the analysis of systems-biology-based models of various diseases has been carried out by paper-and-pencil proofs and simulations. However, these methods cannot provide an accurate analysis, which is a serious drawback for the safety-critical domain of human medicine. In order to overcome these limitations, we propose a framework to formally analyze biological networks and pathways. In particular, we formalize the notion of reaction kinetics in higher-order logic and formally verify some of the commonly used reaction-based models of biological networks using the HOL Light theorem prover. Furthermore, we have ported our earlier formalization of Zsyntax, i.e., a deductive language for reasoning about biological networks and pathways, from HOL4 to the HOL Light theorem prover to make it compatible with the above-mentioned formalization of reaction kinetics. To illustrate the usefulness of the proposed framework, we present the formal analysis of three case studies, i.e., the pathway leading to TP53 phosphorylation, the pathway leading to the death of cancer stem cells, and the tumor growth based on cancer stem cells, which is used for the prognosis and future drug designs to treat cancer patients. PMID:28671950
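As a concrete instance of the reaction kinetics being formalized (the standard mass-action law; the rate constant and concentrations below are made up for illustration):

```python
# Mass-action kinetics for the reaction A + B -> C:
#   d[C]/dt = k * [A] * [B]
# integrated here with a simple forward-Euler loop.
k, dt = 0.5, 0.01
A, B, C = 1.0, 0.8, 0.0
for _ in range(1000):            # integrate to t = 10
    rate = k * A * B
    A, B, C = A - rate * dt, B - rate * dt, C + rate * dt
print(round(A, 4), round(B, 4), round(C, 4))
```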
The formal verification of generic interpreters
NASA Technical Reports Server (NTRS)
Windley, P.; Levitt, K.; Cohen, G. C.
1991-01-01
Task assignment 3 of the design and validation of digital flight control systems suitable for fly-by-wire applications is studied; it is associated with formal verification of embedded systems. In particular, results are presented that provide a methodological approach to microprocessor verification. A hierarchical decomposition strategy for specifying microprocessors is also presented. A theory of generic interpreters is presented that can be used to model microprocessor behavior. The generic interpreter theory abstracts away the details of instruction functionality, leaving a general model of what an interpreter does.
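A minimal sketch of the generic-interpreter idea (hypothetical state representation and instruction set, invented for illustration): the fetch-decode-execute loop is fixed, while instruction semantics are supplied as a parameter, which is what lets a correctness argument be reused across instantiations.

```python
# Generic interpreter: the loop is generic; instruction meanings are
# a parameter (a map from opcode to a state transformer), mirroring
# how generic-interpreter theories abstract instruction functionality.
def run(program, semantics, state):
    while state["pc"] < len(program):
        opcode, arg = program[state["pc"]]
        state = semantics[opcode](state, arg)
    return state

# One possible instantiation: a two-instruction accumulator machine.
SEMANTICS = {
    "ADD": lambda s, n: {**s, "acc": s["acc"] + n, "pc": s["pc"] + 1},
    "JNZ": lambda s, t: {**s, "pc": t if s["acc"] != 0 else s["pc"] + 1},
}

final = run([("ADD", 5), ("ADD", -5), ("JNZ", 0)],
            SEMANTICS, {"acc": 0, "pc": 0})
print(final)   # {'acc': 0, 'pc': 3}
```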
Symbolic LTL Compilation for Model Checking: Extended Abstract
NASA Technical Reports Server (NTRS)
Rozier, Kristin Y.; Vardi, Moshe Y.
2007-01-01
In Linear Temporal Logic (LTL) model checking, we check LTL formulas representing desired behaviors against a formal model of the system designed to exhibit these behaviors. To accomplish this task, the LTL formulas must be translated into automata [21]. We focus on LTL compilation by investigating LTL satisfiability checking via a reduction to model checking. Having shown that symbolic LTL compilation algorithms are superior to explicit automata construction algorithms for this task [16], we concentrate here on seeking a better symbolic algorithm. We present experimental data comparing algorithmic variations such as normal forms, encoding methods, and variable ordering, and examine their effects on performance metrics including processing time and scalability. Safety-critical systems, such as air traffic control, life support systems, hazardous environment controls, and automotive control systems, pervade our daily lives, yet testing and simulation alone cannot adequately verify their reliability [3]. Model checking is a promising approach to formal verification for safety-critical systems which involves creating a formal mathematical model of the system and translating desired safety properties into a formal specification for this model. The complement of the specification is then checked against the system model. When the model does not satisfy the specification, model-checking tools accompany this negative answer with a counterexample, which points to an inconsistency between the system and the desired behaviors and aids debugging efforts.
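The satisfiability-to-model-checking reduction mentioned above fits in a few lines. A hedged sketch follows; `model_check` and `universal_model` are hypothetical stand-ins for a real symbolic model checker and a model that permits every behavior over the formula's atomic propositions.

```python
def ltl_satisfiable(phi, atomic_props, model_check, universal_model):
    """LTL satisfiability via reduction to model checking (sketch).

    phi is satisfiable iff the universal model (which exhibits every
    possible trace over atomic_props) does NOT satisfy the negation
    of phi: a counterexample trace for "not phi" is exactly a
    witness trace for phi.
    """
    m = universal_model(atomic_props)
    holds, counterexample = model_check(m, f"!({phi})")
    return (not holds), counterexample   # counterexample witnesses phi
```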
Designing an architectural style for Pervasive Healthcare systems.
Rafe, Vahid; Hajvali, Masoumeh
2013-04-01
Nowadays, Pervasive Healthcare (PH) systems are considered an important research area. These systems have a dynamic structure and configuration; therefore, an appropriate method for designing such systems is necessary. The Publish/Subscribe Architecture (pub/sub) is one of the convenient architectures for supporting such systems. PH systems are safety-critical; hence, errors can bring disastrous results. To prevent such problems, a powerful analytical tool is required, so using a proper formal language, like graph transformation systems, for developing these systems seems necessary. But even if software engineers use such high-level methodologies, errors may occur in the system under design. Hence, it should be checked automatically and formally whether the system model satisfies all requirements. In this paper, a dynamic architectural style for developing PH systems is presented. Then, the behavior of these systems is modeled and evaluated using the GROOVE toolset. The results of the analysis show its high reliability.
Report on the formal specification and partial verification of the VIPER microprocessor
NASA Technical Reports Server (NTRS)
Brock, Bishop; Hunt, Warren A., Jr.
1991-01-01
The formal specification and partial verification of the VIPER microprocessor is reviewed. The VIPER microprocessor was designed by RSRE, Malvern, England, for safety critical computing applications (e.g., aircraft, reactor control, medical instruments, armaments). The VIPER was carefully specified and partially verified in an attempt to provide a microprocessor with completely predictable operating characteristics. The specification of VIPER is divided into several levels of abstraction, from a gate-level description up to an instruction execution model. Although the consistency between certain levels was demonstrated with mechanically-assisted mathematical proof, the formal verification of VIPER was never completed.
LTSA Conformance Testing to Architectural Design of LMS Using Ontology
ERIC Educational Resources Information Center
Sengupta, Souvik; Dasgupta, Ranjan
2017-01-01
This paper proposes a new methodology for checking conformance of the software architectural design of Learning Management System (LMS) to Learning Technology System Architecture (LTSA). In our approach, the architectural designing of LMS follows the formal modeling style of Acme. An ontology is built to represent the LTSA rules and the software…
NASA Astrophysics Data System (ADS)
White, Irene; Lorenzi, Francesca
2016-12-01
Creativity has been emerging as a key concept in educational policies since the mid-1990s, with many Western countries restructuring their education systems to embrace innovative approaches likely to stimulate creative and critical thinking. But despite current intentions of putting more emphasis on creativity in education policies worldwide, there is still a relative dearth of viable models which capture the complexity of creativity and the conditions for its successful infusion into formal school environments. The push for creativity is in direct conflict with the results-driven, competitive, performance-oriented culture which continues to dominate formal education systems. The authors of this article argue that incorporating creativity into mainstream education is a complex task best tackled by taking a systematic and multifaceted approach. They present a multidimensional model designed to help educators in tackling the challenges of the promotion of creativity. Their model encompasses three distinct yet interrelated dimensions of a creative space: physical, social-emotional, and critical. The authors use the metaphor of space to refer to the interplay of the three identified dimensions. Drawing on confluence approaches to the theorisation of creativity, this paper exemplifies the development of a model against the background of a growing trend toward systems theories. The aim of the model is to be helpful in systematising creativity by offering parameters, derived from the evaluation of an example offered by a non-formal educational environment, for the development of creative environments within mainstream secondary schools.
A formally verified algorithm for interactive consistency under a hybrid fault model
NASA Technical Reports Server (NTRS)
Lincoln, Patrick; Rushby, John
1993-01-01
Consistent distribution of single-source data to replicated computing channels is a fundamental problem in fault-tolerant system design. The 'Oral Messages' (OM) algorithm solves this problem of Interactive Consistency (Byzantine Agreement) assuming that all faults are worst-case. Thambidurai and Park introduced a 'hybrid' fault model that distinguishes three fault modes: asymmetric (Byzantine), symmetric, and benign; they also exhibited, along with an informal 'proof of correctness', a modified version of OM. Unfortunately, their algorithm is flawed. The discipline of mechanically checked formal verification eventually enabled us to develop a correct algorithm for Interactive Consistency under the hybrid fault model. This algorithm withstands $a$ asymmetric, $s$ symmetric, and $b$ benign faults simultaneously, using $m+1$ rounds, provided $n > 2a + 2s + b + m$ and $m \geq a$. We present this algorithm, discuss its subtle points, and describe its formal specification and verification in PVS. We argue that formal verification systems such as PVS are now sufficiently effective that their application to fault-tolerance algorithms should be considered routine.
Using VCL as an Aspect-Oriented Approach to Requirements Modelling
NASA Astrophysics Data System (ADS)
Amálio, Nuno; Kelsen, Pierre; Ma, Qin; Glodt, Christian
Software systems are becoming larger and more complex. By tackling the modularisation of crosscutting concerns, aspect orientation draws attention to modularity as a means to address the problems of scalability, complexity and evolution in software systems development. Aspect-oriented modelling (AOM) applies aspect-orientation to the construction of models. Most existing AOM approaches are designed without a formal semantics, and use multi-view partial descriptions of behaviour. This paper presents an AOM approach based on the Visual Contract Language (VCL): a visual language for abstract and precise modelling, designed with a formal semantics, and comprising a novel approach to visual behavioural modelling based on design by contract where behavioural descriptions are total. By applying VCL to a large case study of a car-crash crisis management system, the paper demonstrates how modularity of VCL's constructs, at different levels of granularity, help to tackle complexity. In particular, it shows how VCL's package construct and its associated composition mechanisms are key in supporting separation of concerns, coarse-grained problem decomposition and aspect-orientation. The case study's modelling solution has a clear and well-defined modular structure; the backbone of this structure is a collection of packages encapsulating local solutions to concerns.
The approach in Lake Huron differs from the Lakewide Management Plans of the other Great Lakes: no formal binational designation of lakewide beneficial use impairments, nor extensive lakewide modeling of chemical loadings
From Informal Safety-Critical Requirements to Property-Driven Formal Validation
NASA Technical Reports Server (NTRS)
Cimatti, Alessandro; Roveri, Marco; Susi, Angelo; Tonetta, Stefano
2008-01-01
Most of the efforts in formal methods have historically been devoted to comparing a design against a set of requirements. The validation of the requirements themselves, however, has often been disregarded, and it can be considered a largely open problem, which poses several challenges. The first challenge is given by the fact that requirements are often written in natural language, and may thus contain a high degree of ambiguity. Despite the progress in Natural Language Processing techniques, the task of understanding a set of requirements cannot be automated, and must be carried out by domain experts, who are typically not familiar with formal languages. Furthermore, in order to retain a direct connection with the informal requirements, the formalization cannot follow standard model-based approaches. The second challenge lies in the formal validation of requirements. On one hand, it is not even clear which are the correctness criteria or the high-level properties that the requirements must fulfill. On the other hand, the expressivity of the language used in the formalization may go beyond the theoretical and/or practical capacity of state-of-the-art formal verification. In order to solve these issues, we propose a new methodology that comprises a chain of steps, each supported by a specific tool. The main steps are the following. First, the informal requirements are split into basic fragments, which are classified into categories, and dependency and generalization relationships among them are identified. Second, the fragments are modeled using a visual language such as UML. The UML diagrams are both syntactically restricted (in order to guarantee a formal semantics) and enriched with a highly controlled natural language (to allow for modeling static and temporal constraints). Third, an automatic formal analysis phase iterates over the modeled requirements, combining several complementary techniques: checking consistency; verifying whether the requirements entail some desirable properties; verifying whether the requirements are consistent with selected scenarios; diagnosing inconsistencies by identifying inconsistent cores; identifying vacuous requirements; constructing multiple explanations by enabling fault-tree analysis related to particular fault models; and verifying whether the specification is realizable.
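As a small illustration of the consistency-checking and inconsistent-core steps (a hedged sketch using the Z3 SMT solver's Python API on three made-up requirement fragments; the paper's actual tool chain is different):

```python
from z3 import Bool, Implies, Not, Solver, sat

# Three toy requirement fragments, already formalized as constraints.
pump_on, alarm = Bool("pump_on"), Bool("alarm")
s = Solver()
s.assert_and_track(Implies(pump_on, alarm), "R1: pump on implies alarm")
s.assert_and_track(Not(alarm), "R2: alarm never raised")
s.assert_and_track(pump_on, "R3: pump is on")

if s.check() == sat:
    print("requirements consistent; sample scenario:", s.model())
else:
    # The unsat core plays the role of an inconsistent core: a small
    # subset of requirements that cannot all hold together.
    print("inconsistent core:", s.unsat_core())
```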
Composing, Analyzing and Validating Software Models
NASA Astrophysics Data System (ADS)
Sheldon, Frederick T.
1998-10-01
This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Group). The principal work this summer has been to review and refine the agenda items carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.
Design criteria for extraction with chemical reaction and liquid membrane permeation
NASA Technical Reports Server (NTRS)
Bart, H. J.; Bauer, A.; Lorbach, D.; Marr, R.
1988-01-01
The design criteria for heterogeneous chemical reactions in liquid/liquid systems formally correspond to those of classical physical extraction. More complex models are presented which describe the material exchange at the individual droplets in an extraction with chemical reaction and in liquid membrane permeation.
Reciprocal Relations Between Cognitive Neuroscience and Cognitive Models: Opposites Attract?
Forstmann, Birte U.; Wagenmakers, Eric-Jan; Eichele, Tom; Brown, Scott; Serences, John T.
2012-01-01
Cognitive neuroscientists study how the brain implements particular cognitive processes such as perception, learning, and decision-making. Traditional approaches in which experiments are designed to target a specific cognitive process have been supplemented by two recent innovations. First, formal models of cognition can decompose observed behavioral data into multiple latent cognitive processes, allowing brain measurements to be associated with a particular cognitive process more precisely and more confidently. Second, cognitive neuroscience can provide additional data to inform the development of cognitive models, providing greater constraint than behavioral data alone. We argue that these fields are mutually dependent: not only can models guide neuroscientific endeavors, but understanding neural mechanisms can provide critical insights into formal models of cognition. PMID:21612972
On Design Mining: Coevolution and Surrogate Models.
Preen, Richard J; Bull, Larry
2017-01-01
Design mining is the use of computational intelligence techniques to iteratively search and model the attribute space of physical objects evaluated directly through rapid prototyping to meet given objectives. It enables the exploitation of novel materials and processes without formal models or complex simulation. In this article, we focus upon the coevolutionary nature of the design process when it is decomposed into concurrent sub-design-threads due to the overall complexity of the task. Using an abstract, tunable model of coevolution, we consider strategies to sample subthread designs for whole-system testing and how best to construct and use surrogate models within the coevolutionary scenario. Drawing on our findings, we then describe the effective design of an array of six heterogeneous vertical-axis wind turbines.
Job Design for Learning in Work Groups
ERIC Educational Resources Information Center
Lantz, Annika; Brav, Agneta
2007-01-01
Purpose--What is required of job design and production planning, if they are to result in a work group taking a self-starting approach and going beyond what is formally required of it? This paper aims to contribute to group research by testing a theoretical model of relations between job design on the one hand (captured as completeness, demand on…
An International Survey of Industrial Applications of Formal Methods. Volume 2. Case Studies
1993-09-30
...impact of the product on IBM revenues. Error rates were claimed to be below industrial average and errors were minimal to fix. Formal methods, as... critical applications. These include: "Software failures, particularly under first use, seem... project to add improved modelling capability... Design and Implementation: These products are being...
Towards a formal semantics for Ada 9X
NASA Technical Reports Server (NTRS)
Guaspari, David; McHugh, John; Polak, Wolfgang; Saaltink, Mark
1995-01-01
The Ada 9X language precision team was formed during the revisions of Ada 83, with the goal of analyzing the proposed design, identifying problems, and suggesting improvements, through the use of mathematical models. This report defines a framework for formally describing Ada 9X, based on Kahn's 'natural semantics', and applies the framework to portions of the language. The proposals for exceptions and optimization freedoms are also analyzed, using a different technique.
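For readers unfamiliar with the formalism: Kahn's natural semantics defines program meaning with big-step inference rules. A standard textbook example follows (statement sequencing in a small imperative language; illustrative, not taken from the Ada 9X framework):

```latex
% Big-step (natural semantics) rule for sequential composition:
% if s1 takes store \sigma to \sigma', and s2 takes \sigma' to \sigma'',
% then s1; s2 takes \sigma to \sigma''.
\frac{\langle s_1,\,\sigma\rangle \Downarrow \sigma'
      \qquad
      \langle s_2,\,\sigma'\rangle \Downarrow \sigma''}
     {\langle s_1;\, s_2,\,\sigma\rangle \Downarrow \sigma''}
```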
Toward a Formal Evaluation of Refactorings
NASA Technical Reports Server (NTRS)
Paul, John; Kuzmina, Nadya; Gamboa, Ruben; Caldwell, James
2008-01-01
Refactoring is a software development strategy that characteristically alters the syntactic structure of a program without changing its external behavior [2]. In this talk we present a methodology for extracting formal models from programs in order to evaluate how incremental refactorings affect the verifiability of their structural specifications. We envision that this same technique may be applicable to other types of properties such as those that concern the design and maintenance of safety-critical systems.
NASA Technical Reports Server (NTRS)
Richardson, David
2018-01-01
Model-Based Systems Engineering (MBSE) is the formalized application of modeling to support system requirements, design, analysis, verification, and validation activities, beginning in the conceptual design phase and continuing throughout development and later life-cycle phases. This presentation will discuss the value proposition that MBSE has for Systems Engineering, and the associated culture change needed to adopt it.
NASA Technical Reports Server (NTRS)
Moser, Louise; Melliar-Smith, Michael; Schwartz, Richard
1987-01-01
A SIFT reliable aircraft control computer system, designed to meet the ultrahigh reliability required for safety-critical flight control applications by use of processor replication and voting, was constructed by SRI and delivered to NASA Langley for evaluation in the AIRLAB. To increase confidence in the reliability projections for SIFT, produced by a Markov reliability model, SRI constructed a formal specification defining the meaning of reliability in the context of flight control. A further series of specifications defined, in increasing detail, the design of SIFT down to pre- and post-conditions on Pascal code procedures. Mechanically checked mathematical proofs were constructed to demonstrate that the more detailed design specifications for SIFT do indeed imply the formal reliability requirement. An additional specification defined some of the assumptions made about SIFT by the Markov model, and further proofs were constructed to show that these assumptions, as expressed by that specification, do indeed follow from the more detailed design specifications for SIFT. This report provides an outline of the methodology used for this hierarchical specification and proof, and describes the various specifications and proofs performed.
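As a toy illustration of the kind of quantity a reliability model of a replicated, voting architecture produces (the failure rate is made up; real SIFT Markov models additionally track repair, fault latency, and reconfiguration):

```python
import math

# Probability a single processor still works after t hours,
# assuming exponentially distributed failures with rate lam (per hour).
def r_single(lam, t):
    return math.exp(-lam * t)

# A triplex with majority voting survives while at least 2 of 3
# processors work: R = 3r^2 - 2r^3.
def r_triplex(lam, t):
    r = r_single(lam, t)
    return 3 * r**2 - 2 * r**3

lam = 1e-4                     # made-up failure rate: 1e-4 per hour
for t in (1.0, 10.0):
    print(t, r_single(lam, t), r_triplex(lam, t))
```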
Experiences applying Formal Approaches in the Development of Swarm-Based Space Exploration Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher A.; Hinchey, Michael G.; Truszkowski, Walter F.; Rash, James L.
2006-01-01
NASA is researching advanced technologies for future exploration missions using intelligent swarms of robotic vehicles. One of these missions is the Autonomous Nano Technology Swarm (ANTS) mission that will explore the asteroid belt using 1,000 cooperative autonomous spacecraft. The emergent properties of intelligent swarms make them a potentially powerful concept, but at the same time more difficult to design and to ensure that the proper behaviors will emerge. NASA is investigating formal methods and techniques for verification of such missions. The advantage of using formal methods is the ability to mathematically verify the behavior of a swarm, emergent or otherwise. Using the ANTS mission as a case study, we have evaluated multiple formal methods to determine their effectiveness in modeling and ensuring desired swarm behavior. This paper discusses the results of this evaluation and proposes an integrated formal method for ensuring correct behavior of future NASA intelligent swarms.
NASA Astrophysics Data System (ADS)
Farnsworth, Carolyn H.; Mayer, Victor J.
Intensive time-series designs for classroom investigations have been under development since 1975. Studies have been conducted to determine their feasibility (Mayer & Lewis, 1979), their potential for monitoring knowledge acquisition (Mayer & Kozlow, 1980), and the potential threat to validity of the frequency of testing inherent in the design (Mayer & Rojas, 1982). This study, an extension of those previous studies, is an attempt to determine the degree of discrimination the design allows in collecting data on achievement. It also serves as a replication of the Mayer and Kozlow study, an attempt to determine design validity for collecting achievement data. The investigator used her eighth-grade earth science students from a suburban Columbus (Ohio) junior high school. A multiple-group single intervention time-series design (Glass, Willson, & Gottman, 1975) was adapted to the collection of daily data on achievement in the topic of the intervention, a unit on plate tectonics. Single multiple-choice items were randomly assigned to each of three groups of students, identified on the basis of their ranking on a written test of cognitive level (Lawson, 1978). The top third, or those with formal cognitive tendencies, were compared on the basis of knowledge achievement and understanding achievement with the lowest third of the students, or those with concrete cognitive tendencies, to determine if the data collected in the design would discriminate between the two groups. Several studies (Goodstein & Howe, 1978; Lawson & Renner, 1975) indicated that students with formal cognitive tendencies should learn a formal concept such as plate tectonics with greater understanding than should students with concrete cognitive tendencies. Analyses used were a comparison of regression lines in each of the three study stages: baseline, intervention, and follow-up; t-tests of means of days summed across each stage; and a time-series analysis program. Statistically significant differences were found between the two groups both in slopes of regression lines (0.0001) and in t-tests (0.0005) on both knowledge and understanding levels of learning. These differences confirm the discrimination of the intensive time-series design in showing that it can distinguish differences in learning between students with formal cognitive tendencies and those with concrete cognitive tendencies. The time-series analysis model with a trend in the intervention was better than a model with no trend for both groups of students, in that it accounted for a greater amount of variance in the data from both knowledge and understanding levels of learning. This finding adds confidence in the validity of the design for obtaining achievement data. When the analysis model with trend was used on data from the group with formal cognitive tendencies, it accounted for a greater degree of variance than the same model applied to the data from the group with concrete cognitive tendencies. This more conservative analysis, therefore, gave results consistent with those from the more usual linear regression techniques and t-tests, further adding to the confidence in the discrimination of the design.
Real Time Fire Reconnaissance Satellite Monitoring System Failure Model
NASA Astrophysics Data System (ADS)
Nino Prieto, Omar Ariosto; Colmenares Guillen, Luis Enrique
2013-09-01
In this paper, the Real Time Fire Reconnaissance Satellite Monitoring System is presented. This architecture is a legacy of the Detection System for Real-Time Physical Variables, which is undergoing a patent process in Mexico. The design follows the Structured Analysis for Real Time (SA-RT) methodology [8], and the software is specified in the LACATRE (Langage d'aide à la Conception d'Application multitâche Temps Réel) Real Time formal language [9,10]. The system failure model is analyzed, and the proposal is based on AltaRica, a formal language for the design of critical systems and risk assessment. This formal architecture uses satellites as input sensors and was adapted from the original model, a design pattern for Real Time detection of physical variation whose task is to monitor events such as natural disasters and health-related conditions, as in the Real Time Diabetes Monitoring System, among others. Related work has been presented at the Mexican Space Agency (AEM) Creation and Consultation Forums (2010-2011) and at the international congress of the International Mexican Aerospace Science and Technology Society (SOMECYTA) held in San Luis Potosí, México (2012). This architecture will allow Real Time fire satellite monitoring, which will reduce the damage and danger caused by the fires that consume the forests and tropical forests of Mexico. This new proposal permits a system that impacts disaster prevention by combining national and international technologies and cooperation for the benefit of humankind.
Model-based engineering for medical-device software.
Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi
2010-01-01
This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. By using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for i) fast and efficient construction of executable device prototypes, ii) creation of a standard, reusable baseline software architecture for a particular device family, iii) formal verification of the design against safety requirements, and iv) creation of a safety framework that reduces verification costs for future versions of the device software.
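To make the "formal verification against safety requirements" step concrete, here is a hedged toy model (invented states, events, and safety property; not the authors' PCA pump model): a state machine whose reachable states are exhaustively checked against one safety requirement.

```python
# Toy PCA-pump model: the pump may only deliver a patient-requested
# bolus when not in lockout.  A state is (delivering, lockout).
def step(state, event):
    delivering, lockout = state
    if event == "request_bolus":
        return (not lockout, lockout)      # ignored during lockout
    if event == "bolus_done":
        return (False, True)               # start lockout interval
    if event == "lockout_expired":
        return (delivering, False)
    return state

# Exhaustive check of a safety property over all reachable states:
# the pump never delivers while in lockout.
EVENTS = ["request_bolus", "bolus_done", "lockout_expired"]
seen, frontier = set(), {(False, False)}
while frontier:
    state = frontier.pop()
    seen.add(state)
    assert not (state[0] and state[1]), f"unsafe state: {state}"
    for e in EVENTS:
        nxt = step(state, e)
        if nxt not in seen:
            frontier.add(nxt)
print(f"verified {len(seen)} reachable states")
```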
Formally verifying human–automation interaction as part of a system model: limitations and tradeoffs
Bass, Ellen J.
2011-01-01
Both the human factors engineering (HFE) and formal methods communities are concerned with improving the design of safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to perform formal verification of human–automation interaction with a programmable device. This effort utilizes a system architecture composed of independent models of the human mission, human task behavior, human-device interface, device automation, and operational environment. The goals of this architecture were to allow HFE practitioners to perform formal verifications of realistic systems that depend on human–automation interaction in a reasonable amount of time using representative models, intuitive modeling constructs, and decoupled models of system components that could be easily changed to support multiple analyses. This framework was instantiated using a patient-controlled analgesia pump in a two-phase process where models in each phase were verified using a common set of specifications. The first phase focused on the mission, human-device interface, and device automation, and included a simple, unconstrained human task behavior model. The second phase replaced the unconstrained task model with one representing normative pump programming behavior. Because models produced in the first phase were too large for the model checker to verify, a number of model revisions were undertaken that affected the goals of the effort. While the use of human task behavior models in the second phase helped mitigate model complexity, verification time increased. Additional modeling tools and technological developments are necessary for model checking to become a more usable technique for HFE. PMID:21572930
Applying Learning Theories and Instructional Design Models for Effective Instruction
ERIC Educational Resources Information Center
Khalil, Mohammed K.; Elkhider, Ihsan A.
2016-01-01
Faculty members in higher education are involved in many instructional design activities without formal training in learning theories and the science of instruction. Learning theories provide the foundation for the selection of instructional strategies and allow for reliable prediction of their effectiveness. To achieve effective learning…
MODIA: Vol. 4. The Resource Utilization Model. A Project AIR FORCE Report.
ERIC Educational Resources Information Center
Gallegos, Margaret
MODIA (Method of Designing Instructional Alternatives) was developed to help the Air Force manage resources for formal training by systematically and explicitly relating quantitative requirements for training resources to the details of course design and course operation during the planning stage. This report describes the Resource Utilization…
Verification of Emergent Behaviors in Swarm-based Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James
2004-01-01
The emergent properties of swarms make swarm-based missions powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of swarm-based missions. The Autonomous Nano-Technology Swarm (ANTS) mission is being used as an example and case study for swarm-based missions to experiment and test current formal methods with intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior. This paper introduces how intelligent swarm technology is being proposed for NASA missions, and gives the results of a comparison of several formal methods and approaches for specifying intelligent swarm-based systems and their effectiveness for predicting emergent behavior.
Short-Term Memory Scanning Viewed as Exemplar-Based Categorization
ERIC Educational Resources Information Center
Nosofsky, Robert M.; Little, Daniel R.; Donkin, Christopher; Fific, Mario
2011-01-01
Exemplar-similarity models such as the exemplar-based random walk (EBRW) model (Nosofsky & Palmeri, 1997b) were designed to provide a formal account of multidimensional classification choice probabilities and response times (RTs). At the same time, a recurring theme has been to use exemplar models to account for old-new item recognition and to…
An object-oriented approach for harmonization of multimedia markup languages
NASA Astrophysics Data System (ADS)
Chen, Yih-Feng; Kuo, May-Chen; Sun, Xiaoming; Kuo, C.-C. Jay
2003-12-01
An object-oriented methodology is proposed in this research to harmonize several different markup languages. First, we adopt the Unified Modelling Language (UML) as the data model to formalize the concept and process of harmonization between eXtensible Markup Language (XML) applications. Then, we design the Harmonization eXtensible Markup Language (HXML) based on the data model and formalize the transformation between the Document Type Definitions (DTDs) of the original XML applications and HXML. The transformation between instances is also discussed. We use the harmonization of SMIL and X3D as an example to demonstrate the proposed methodology, which can be generalized to various application domains.
ERIC Educational Resources Information Center
Prins, Gjalt T.; Bulte, Astrid M. W.; Pilot, Albert
2011-01-01
Science education should foster students' epistemological view on models and modelling consistent with formal epistemology in science and technology practices. This paper reports the application of a curriculum unit in the classroom using an authentic chemical practice, "Modelling drinking water treatment", as the context for learning.…
Extending AADL for Security Design Assurance of Cyber Physical Systems
2015-12-16
...a detailed system architecture design of a CPS can be analyzed using AADL to prevent such types of CWEs. We divided the work into two tasks... security modeling to CPSs, and develop a case study to show how formal modeling using AADL could be applied to a CPS to improve its security design... Examples of recent attacks against automobiles have been reported: a wireless device used by Progressive Insurance to gather information...
Zhou, Y; Murata, T; Defanti, T A
2000-01-01
Despite their attractive properties, networked virtual environments (net-VEs) are notoriously difficult to design, implement, and test due to the concurrency, real-time, and networking features in these systems. Net-VEs place high quality-of-service (QoS) requirements on the network to maintain natural and real-time interactions among users. The current practice for net-VE design is basically trial and error, empirical, and entirely lacking in formal methods. This paper proposes to apply a Petri net formal modeling technique to a net-VE, NICE (Narrative Immersive Constructionist/Collaborative Environment), predict the net-VE performance based on simulation, and improve the net-VE performance. NICE is essentially a network of collaborative virtual reality systems called CAVEs (CAVE Automatic Virtual Environment). First, we introduce extended fuzzy-timing Petri net (EFTN) modeling and analysis techniques. Then, we present EFTN models of the CAVE, NICE, and the transport-layer protocol used in NICE: the Transmission Control Protocol (TCP). We show the possibility analysis based on the EFTN model for the CAVE. Then, using these models and Design/CPN as the simulation tool, we conducted various simulations to study the real-time behavior, network effects, and performance (latencies and jitters) of NICE. Our simulation results are consistent with experimental data.
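For readers new to the formalism, a minimal Petri-net interpreter (a plain place/transition net without the fuzzy-timing extensions used in the paper; the toy net below is invented for illustration):

```python
# A Petri net as {transition: (input places, output places)};
# a marking maps each place to its token count.
NET = {
    "send":    ({"ready"}, {"in_transit"}),
    "receive": ({"in_transit"}, {"delivered"}),
}

def enabled(marking, t):
    """A transition is enabled when every input place holds a token."""
    return all(marking.get(p, 0) > 0 for p in NET[t][0])

def fire(marking, t):
    """Fire t: consume one token per input place, produce one per output."""
    assert enabled(marking, t), f"{t} not enabled"
    m = dict(marking)
    for p in NET[t][0]:
        m[p] -= 1
    for p in NET[t][1]:
        m[p] = m.get(p, 0) + 1
    return m

m = {"ready": 1}
for t in ("send", "receive"):
    m = fire(m, t)
print(m)   # {'ready': 0, 'in_transit': 0, 'delivered': 1}
```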
Patch models and their applications to multivehicle command and control.
Rao, Venkatesh G; D'Andrea, Raffaello
2007-06-01
We introduce patch models, a computational modeling formalism for multivehicle combat domains, based on spatiotemporal abstraction methods developed in the computer science community. The framework yields models that are expressive enough to accommodate nontrivial controlled vehicle dynamics while being within the representational capabilities of common artificial intelligence techniques used in the construction of autonomous systems. The framework allows several key design requirements of next-generation network-centric command and control systems, such as maintenance of shared situation awareness, to be achieved. Major features include support for multiple situation models at each decision node and rapid mission plan adaptation. We describe the formal specification of patch models and our prototype implementation, i.e., Patchworks. The capabilities of patch models are validated through a combat mission simulation in Patchworks, which involves two defending teams protecting a camp from an enemy attacking team.
ERIC Educational Resources Information Center
Oakes, G. L.; Felton, A. J.; Garner, K. B.
2006-01-01
The BSc in computer aided product design (CAPD) course at the University of Wolverhampton was conceived as a collaborative venture in 1989 between the School of Engineering and the School of Art and Design. The award was at the forefront of forging interdisciplinary collaboration at undergraduate level in the field of product design. It has…
Toward a Formal Model of the Design and Evolution of Software
1988-12-20
...It should have the flexibility to support a variety of design methodologies, be comprehensive enough to encompass the gamut of software lifecycle activities, and be precise enough to provide the...
Density matrix Monte Carlo modeling of quantum cascade lasers
NASA Astrophysics Data System (ADS)
Jirauschek, Christian
2017-10-01
By including elements of the density matrix formalism, the semiclassical ensemble Monte Carlo method for carrier transport is extended to incorporate incoherent tunneling, known to play an important role in quantum cascade lasers (QCLs). In particular, this effect dominates electron transport across thick injection barriers, which are frequently used in terahertz QCL designs. A self-consistent model for quantum mechanical dephasing is implemented, eliminating the need for empirical simulation parameters. Our modeling approach is validated against available experimental data for different types of terahertz QCL designs.
SOFTWARE SYSTEM DESIGN AND IMPLEMENTATION FOR ENVIRONMENTAL MODELING: A MOU WORKING GROUP
A workgroup was formed in conjunction with a formal Memorandum of Understanding (MOU) among six Federal Agencies to pursue collaborative research in technical areas related to environmental modeling. Among the primary objectives of the MOU are to 1) provide a mechanism for the c...
A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems.
Silva, Lenardo C; Almeida, Hyggo O; Perkusich, Angelo; Perkusich, Mirko
2015-10-30
Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage. PMID:26528982
1992-12-01
EDDA is an attempt to add mathematical formalism to SADT. Because it is based on SADT, it cannot easily represent any other design methodology. EDDA has two forms: G-EDDA, the standard graphical version of SADT, and S-EDDA, a textual language that partially represents the… • EDDA only supports the SADT methodology and is too limited in scope to be useful in our research. • SAMM lacks the semantic richness of…
NASA Technical Reports Server (NTRS)
Callender, E. D.; Farny, A. M.
1983-01-01
Problem Statement Language/Problem Statement Analyzer (PSL/PSA) applications, which were once a one-step process in which product system information was immediately translated into PSL statements, have in light of experience been shown to result in inconsistent representations. These shortcomings have prompted the development of an intermediate step, designated the Product System Information Model (PSIM), which provides a basis for the mutual understanding of customer terminology and the formal, conceptual representation of that product system in a PSA data base. The PSIM is initially captured as a paper diagram, followed by formal capture in the PSL/PSA data base.
UML activity diagrams in requirements specification of logic controllers
NASA Astrophysics Data System (ADS)
Grobelna, Iwona; Grobelny, Michał
2015-12-01
Logic controller specifications can be prepared using various techniques. One of them is the widely understood and user-friendly UML language and its activity diagrams. Using formal methods during the design phase increases the assurance that the implemented system meets the project requirements. In our approach we use the model checking technique to formally verify a specification against user-defined behavioral requirements. The properties are usually defined as temporal logic formulas. In the paper we propose to use UML activity diagrams in requirements definition and then to formalize them as temporal logic formulas. As a result, UML activity diagrams can be used both for logic controller specification and for requirements definition, which simplifies the specification and verification process.
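As an illustration of the verification step described above, a minimal explicit-state check of a temporal-logic-style safety property over an activity-diagram-like transition system might look as follows (the states and property are hypothetical, not taken from the paper):

```python
from collections import deque

# Toy controller model in the spirit of an activity diagram: each state maps
# to its successor states. All names here are hypothetical.
transitions = {
    "idle":     ["heating"],
    "heating":  ["at_temp", "overheat"],
    "at_temp":  ["idle"],
    "overheat": ["shutdown"],
    "shutdown": [],
}

def check_safety(initial, bad_states):
    """Explicit-state check of the LTL-style property G !bad:
    no state in bad_states is reachable from the initial state."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if state in bad_states:
            return False, state          # counterexample state found
        for nxt in transitions[state]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True, None

print(check_safety("idle", {"overheat"}))   # (False, 'overheat'): property violated
```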
Predictor-Based Model Reference Adaptive Control
NASA Technical Reports Server (NTRS)
Lavretsky, Eugene; Gadient, Ross; Gregory, Irene M.
2009-01-01
This paper is devoted to robust, Predictor-based Model Reference Adaptive Control (PMRAC) design. The proposed adaptive system is compared with the now-classical Model Reference Adaptive Control (MRAC) architecture. Simulation examples are presented. Numerical evidence indicates that the proposed PMRAC tracking architecture has better transient characteristics than MRAC. In this paper, we present a state-predictor-based direct adaptive tracking design methodology for multi-input dynamical systems with partially known dynamics. The efficiency of the design is demonstrated using the short-period dynamics of an aircraft. Formal proof of the reported PMRAC benefits constitutes future research and will be reported elsewhere.
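For readers unfamiliar with the baseline being compared against, a minimal scalar MRAC simulation is sketched below. The gains and plant are illustrative assumptions, and this is the classical architecture, not the paper's PMRAC design:

```python
# Classical scalar MRAC baseline (illustrative, not the paper's PMRAC).
# Plant:     xdot  = a*x + b*u, with a unknown to the controller
# Reference: xmdot = am*xm + bm*r, with am < 0
a, b = 1.0, 1.0                 # true (unknown) plant parameters
am, bm = -2.0, 2.0              # reference model
gamma_x, gamma_r = 10.0, 10.0   # adaptation gains (assumed values)

dt, T = 1e-3, 10.0
x = xm = 0.0
kx = kr = 0.0                   # adaptive feedback / feedforward gains
for k in range(int(T / dt)):
    t = k * dt
    r = 1.0 if int(t) % 2 == 0 else -1.0   # square-wave reference
    u = kx * x + kr * r
    e = x - xm                              # tracking error
    # Lyapunov-based adaptation laws (sign(b) = +1 here)
    kx -= gamma_x * e * x * dt
    kr -= gamma_r * e * r * dt
    x += (a * x + b * u) * dt
    xm += (am * xm + bm * r) * dt
print("final tracking error:", x - xm)
```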
Design and verification of distributed logic controllers with application of Petri nets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wiśniewski, Remigiusz; Grobelna, Iwona; Grobelny, Michał
2015-12-31
The paper deals with the design and verification of distributed logic controllers. The control system is initially modelled with Petri nets and formally verified against structural and behavioral properties with the application of temporal logic and the model checking technique. After that, it is decomposed into separate sequential automata that work concurrently. Each of them is re-verified, and if the validation is successful, the system can finally be implemented.
Design and verification of distributed logic controllers with application of Petri nets
NASA Astrophysics Data System (ADS)
Wiśniewski, Remigiusz; Grobelna, Iwona; Grobelny, Michał; Wiśniewska, Monika
2015-12-01
The paper deals with the design and verification of distributed logic controllers. The control system is initially modelled with Petri nets and formally verified against structural and behavioral properties with the application of temporal logic and the model checking technique. After that, it is decomposed into separate sequential automata that work concurrently. Each of them is re-verified, and if the validation is successful, the system can finally be implemented.
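A minimal sketch of the place/transition semantics underlying such Petri net controller models follows. The net below is a toy with assumed places and transitions, not one of the paper's controllers:

```python
# Minimal place/transition Petri net: a transition is enabled when every
# input place holds a token; firing moves tokens from inputs to outputs.
class PetriNet:
    def __init__(self, marking, transitions):
        self.marking = dict(marking)         # place -> token count
        self.transitions = transitions       # name -> (input places, output places)

    def enabled(self, name):
        ins, _ = self.transitions[name]
        return all(self.marking.get(p, 0) > 0 for p in ins)

    def fire(self, name):
        ins, outs = self.transitions[name]
        assert self.enabled(name), name
        for p in ins:
            self.marking[p] -= 1
        for p in outs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Two sequential steps synchronising through a shared place "ack".
net = PetriNet(
    {"req": 1, "ack": 0, "done": 0},
    {"grant": (["req"], ["ack"]), "finish": (["ack"], ["done"])},
)
net.fire("grant")
net.fire("finish")
print(net.marking)   # {'req': 0, 'ack': 0, 'done': 1}
```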
Bolton, Matthew L.; Bass, Ellen J.; Siminiceanu, Radu I.
2012-01-01
Breakdowns in complex systems often occur as a result of system elements interacting in unanticipated ways. In systems with human operators, human-automation interaction associated with both normative and erroneous human behavior can contribute to such failures. Model-driven design and analysis techniques provide engineers with formal methods tools and techniques capable of evaluating how human behavior can contribute to system failures. This paper presents a novel method for automatically generating task analytic models encompassing both normative and erroneous human behavior from normative task models. The generated erroneous behavior is capable of replicating Hollnagel's zero-order phenotypes of erroneous action for omissions, jumps, repetitions, and intrusions. Multiple phenotypical acts can occur in sequence, thus allowing for the generation of higher-order phenotypes. The task behavior model pattern capable of generating erroneous behavior can be integrated into a formal system model so that system safety properties can be formally verified with a model checker. This allows analysts to prove that a human-automation interactive system (as represented by the model) will or will not satisfy safety properties with both normative and generated erroneous human behavior. We present benchmarks related to the size of the state space and the verification time of models to show how the erroneous human behavior generation process scales. We demonstrate the method with a case study: the operation of a radiation therapy machine. A potential problem resulting from a generated erroneous human action is discovered. A design intervention is presented which prevents this problem from occurring. We discuss how our method could be used to evaluate larger applications and recommend future paths of development. PMID:23105914
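The zero-order phenotypes named above can be illustrated with a small generator that perturbs a normative action sequence. This is a simplified sketch with hypothetical action names; the paper's transformation operates on formal task models, not plain lists:

```python
import random

# Sketch of Hollnagel-style zero-order erroneous-action phenotypes generated
# from a normative action sequence (simplified; "jump" here just drops a prefix).
def erroneous_variants(plan, alphabet, rng):
    i = rng.randrange(len(plan))
    return {
        "omission":   plan[:i] + plan[i + 1:],                        # skip an action
        "repetition": plan[:i + 1] + plan[i:],                        # do an action twice
        "intrusion":  plan[:i] + [rng.choice(alphabet)] + plan[i:],   # insert a foreign action
        "jump":       plan[i:],                                       # jump ahead in the plan
    }

normative = ["select_mode", "enter_dose", "confirm", "fire_beam"]
rng = random.Random(0)
for kind, seq in erroneous_variants(normative, ["clear_display"], rng).items():
    print(f"{kind:10s} {seq}")
```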
Pereira, José N; Silva, Porfírio; Lima, Pedro U; Martinoli, Alcherio
2014-01-01
The work described is part of a long-term program of introducing institutional robotics, a novel framework for the coordination of robot teams that stems from institutional economics concepts. Under the framework, institutions are cumulative sets of persistent artificial modifications made to the environment or to the internal mechanisms of a subset of agents, thought to be functional for the collective order. In this article we introduce a formal model of institutional controllers based on Petri nets. We define executable Petri nets, an extension of Petri nets that takes into account robot actions and sensing, to design, program, and execute institutional controllers. We use a generalized stochastic Petri net view of the robot team controlled by the institutional controllers to model and analyze the stochastic performance of the resulting distributed robotic system. The ability of our formalism to replicate results obtained using other approaches is assessed through realistic simulations of up to 40 e-puck robots. In particular, we model a robot swarm and its institutional controller with the goal of maintaining wireless connectivity, and successfully compare our model predictions and simulation results with previously reported results obtained by using finite state automaton models and controllers.
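The generalized stochastic Petri net view can be illustrated with a race of exponentially distributed firing delays. This is a toy sketch with assumed rates and structure, not the article's connectivity-maintenance model:

```python
import random

# Generalized-stochastic-Petri-net flavoured simulation: every enabled
# transition draws an exponential delay; the race winner fires.
def simulate(marking, transitions, rates, t_end, rng):
    t = 0.0
    while t < t_end:
        enabled = [name for name, (ins, _) in transitions.items()
                   if all(marking.get(p, 0) > 0 for p in ins)]
        if not enabled:
            break                                   # dead marking
        delays = {n: rng.expovariate(rates[n]) for n in enabled}
        winner = min(delays, key=delays.get)
        t += delays[winner]
        ins, outs = transitions[winner]
        for p in ins:
            marking[p] -= 1
        for p in outs:
            marking[p] = marking.get(p, 0) + 1
    return marking

# Toy model: robots stochastically gaining and dropping a wireless link.
transitions = {"connect": (["lost"], ["linked"]), "drop": (["linked"], ["lost"])}
print(simulate({"lost": 3, "linked": 0}, transitions,
               {"connect": 2.0, "drop": 0.5}, 100.0, random.Random(1)))
```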
Minkoff, Kenneth; Cline, Christie A
2004-12-01
This article has described the CCISC model and the process of systemic implementation of co-occurring disorder service enhancements within the context of existing resources. Four projects were described as illustrations of current implementation activities. Clearly, there is a need for improved services for these individuals, and increasing recognition of the need for systemic change models that are effective and efficient. The CCISC model has been recognized by SAMHSA as a consensus best practice for system design, and initial efforts at implementation appear to be promising. The existing toolkit may permit a more formal process of data-driven evaluation of system, program, clinician, and client outcomes, to better measure the effectiveness of this approach. Some projects have begun such formal evaluation processes, but more work is needed, not only with individual projects, but also to develop opportunities for multi-system evaluation as more projects come on line.
Design and Analysis Techniques for Concurrent Blackboard Systems. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Mcmanus, John William
1992-01-01
Blackboard systems are a natural progression of knowledge-based systems into a more powerful problem solving technique. They provide a way for several highly specialized knowledge sources to cooperate to solve large, complex problems. Blackboard systems incorporate the concepts developed by rule-based and expert systems programmers and include the ability to add conventionally coded knowledge sources. The small and specialized knowledge sources are easier to develop and test, and can be hosted on hardware specifically suited to the task that they are solving. The Formal Model for Blackboard Systems was developed to provide a consistent method for describing a blackboard system. A set of blackboard system design tools has been developed and validated for implementing systems that are expressed using the Formal Model. The tools are used to test and refine a proposed blackboard system design before the design is implemented. My research has shown that the level of independence and specialization of the knowledge sources directly affects the performance of blackboard systems. Using the design, simulation, and analysis tools, I developed a concurrent object-oriented blackboard system that is faster, more efficient, and more powerful than existing systems. The use of the design and analysis tools provided the highly specialized and independent knowledge sources required for my concurrent blackboard system to achieve its design goals.
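The core blackboard control loop, independent knowledge sources opportunistically contributing to a shared data structure, can be sketched in a few lines. The knowledge sources below are illustrative, not the thesis implementation:

```python
# Minimal blackboard sketch: specialised knowledge sources watch a shared
# board and contribute when their preconditions hold; a simple controller
# cycles until no knowledge source can make further progress.
def ks_tokenize(board):
    if "text" in board and "tokens" not in board:
        board["tokens"] = board["text"].split()
        return True
    return False

def ks_count(board):
    if "tokens" in board and "count" not in board:
        board["count"] = len(board["tokens"])
        return True
    return False

def run(board, knowledge_sources):
    progress = True
    while progress:                  # fire any applicable KS until quiescence
        progress = any(ks(board) for ks in knowledge_sources)
    return board

print(run({"text": "blackboard systems cooperate"}, [ks_count, ks_tokenize]))
```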
NASA Astrophysics Data System (ADS)
Sloan, H.; Drantch, K.; Steenhuis, J.
2006-12-01
We present an NSF-funded collaborative formal-informal partnership for urban Earth science teacher preparation and professional development. This model brings together the American Museum of Natural History (AMNH) and Brooklyn College and Lehman College of the City University of New York (CUNY) to address science-impoverished classrooms that lack highly qualified teachers by focusing on Earth science teacher certification. Project design was based on identified needs in the local communities and schools, careful analysis of the content knowledge mastery required for Earth science teacher certification, and existing impediments to certification. The problem-based approach required partners to push policy envelopes and to invent new ways of articulating content and pedagogy at both intra- and inter-institutional levels. One key element of the project is the involvement of the local board of education, teachers, and administrators in initial design and ongoing assessment. Project components include formal Earth systems science courses, a summer institute primarily led and delivered by AMNH scientists through an informal series of lectures coupled to workshops led by AMNH educators, a mechanism for assigning course credit for informal experiences, development of new teaching approaches that include teacher action plans, and an external program of evaluation. The principal research strand of this project focuses on the resulting model for formal-informal teacher education partnership, the project's impact on participating teachers, policy issues surrounding the model and the changes required for its development and implementation, and its potential for Earth science education reform. As the grant-funded portion of the project draws to a close, we begin to analyze data collected over the past three years. Third-year findings of the project's external evaluation indicate that the problem-based approach has been highly successful, particularly in its impact on participating teachers. In addition to presenting these results, participating teachers from the 2004 and 2006 cohorts discuss their TRUST experiences and the subsequent impact the program has had on their respective Earth science teaching practices and professional lives.
Imran, Muhammad; Zafar, Nazir Ahmad
2012-01-01
Maintaining inter-actor connectivity is extremely crucial in mission-critical applications of Wireless Sensor and Actor Networks (WSANs), as actors have to quickly plan optimal coordinated responses to detected events. Failure of a critical actor partitions the inter-actor network into disjoint segments besides leaving a coverage hole, and thus hinders the network operation. This paper presents a Partitioning detection and Connectivity Restoration (PCR) algorithm to tolerate critical actor failure. As part of pre-failure planning, PCR determines critical/non-critical actors based on localized information and designates each critical node with an appropriate backup (preferably non-critical). The pre-designated backup detects the failure of its primary actor and initiates a post-failure recovery process that may involve coordinated multi-actor relocation. To prove its correctness, we construct a formal specification of PCR using Z notation: we model the WSAN topology as a dynamic graph and transform PCR into a corresponding formal specification, which is analyzed and validated using the Z/EVES tool. Moreover, we simulate the specification to quantitatively analyze the efficiency of PCR. Simulation results confirm the effectiveness of PCR and show that it outperforms contemporary schemes found in the literature.
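The pre-failure planning step, identifying actors whose failure would partition the network, amounts to finding cut vertices of the inter-actor graph. A brute-force sketch follows (illustrative topology; PCR itself works from localized information rather than global connectivity checks):

```python
# An actor is critical if removing it disconnects the remaining graph.
def connected(nodes, edges):
    if not nodes:
        return True
    seen, stack = set(), [next(iter(nodes))]
    while stack:
        n = stack.pop()
        if n in seen:
            continue
        seen.add(n)
        # push the other endpoint of every edge incident to n
        stack.extend(m for a, b in edges for m in (a, b)
                     if n in (a, b) and m != n and m in nodes)
    return seen == set(nodes)

def critical_actors(nodes, edges):
    crit = set()
    for n in nodes:
        rest = set(nodes) - {n}
        kept = [(a, b) for a, b in edges if a != n and b != n]
        if not connected(rest, kept):
            crit.add(n)
    return crit

nodes = {"A", "B", "C", "D"}
edges = [("A", "B"), ("B", "C"), ("C", "D")]   # a chain: B and C are cut vertices
print(critical_actors(nodes, edges))            # {'B', 'C'}
```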
Offshore safety case approach and formal safety assessment of ships.
Wang, J
2002-01-01
Tragic marine and offshore accidents have caused serious consequences, including loss of lives, loss of property, and damage to the environment. A proactive, risk-based "goal setting" regime is being introduced to the marine and offshore industries to increase the level of safety. To maximize marine and offshore safety, risks need to be modeled and safety-based decisions need to be made in a logical and confident way. Risk modeling and decision-making tools need to be developed and applied in a practical environment. This paper describes both the offshore safety case approach and the formal safety assessment of ships in detail, with particular reference to design aspects. The current practices and the latest developments in safety assessment in both the marine and offshore industries are described. The relationship between the offshore safety case approach and formal ship safety assessment is described and discussed. Three examples are used to demonstrate both the offshore safety case approach and formal ship safety assessment. A study of risk criteria in marine and offshore safety assessment is carried out, and recommendations on further work required are given. This paper gives safety engineers in the marine and offshore industries an overview of the offshore safety case approach and formal ship safety assessment. The significance of moving toward a risk-based "goal setting" regime is discussed.
Purposeful Design of Formal Laboratory Instruction as a Springboard to Research Participation
ERIC Educational Resources Information Center
Cartrette, David P.; Miller, Matthew L.
2013-01-01
An innovative first- and second-year laboratory course sequence is described. The goal of the instructional model is to introduce chemistry and biochemistry majors to the process of research participation earlier in their academic training. To achieve that goal, the instructional model incorporates significant hands-on experiences with chemical…
Model Adoption Exchange Payment System: Technical Specifications and User Instructions.
ERIC Educational Resources Information Center
Ambrosino, Robert J.
This user's manual, designed to meet the needs of adoption exchange administrators and program managers for a formal tool to assist them in the overall management and operation of their program, presents the Model Adoption Exchange Payment System (MAEPS), which was developed to improve the delivery of adoption exchange services throughout the…
Implementing Enrichment Clusters in a Multiage School: Perspectives from a Principal and Consultant
ERIC Educational Resources Information Center
Reed, Sally E.; Westberg, Karen L.
2003-01-01
Harriet Bishop (HB) Elementary School opened in 1996 with an articulated educational model developed collaboratively by the teachers, parents, and the administration. The model includes a mission, set of beliefs, and rationale for the instructional design. While nearly every school district or school has a formal mission, the statements…
Formal Darwinism, the individual-as-maximizing-agent analogy and bet-hedging
Grafen, A.
1999-01-01
The central argument of The origin of species was that mechanical processes (inheritance of features and the differential reproduction they cause) can give rise to the appearance of design. The 'mechanical processes' are now mathematically represented by the dynamic systems of population genetics, and the appearance of design by optimization and game theory in which the individual plays the part of the maximizing agent. Establishing a precise individual-as-maximizing-agent (IMA) analogy for a population-genetics system justifies optimization approaches, and so provides a modern formal representation of the core of Darwinism. It is a hitherto unnoticed implication of recent population-genetics models that, contrary to a decades-long consensus, an IMA analogy can be found in models with stochastic environments (subject to a convexity assumption), in which individuals maximize expected reproductive value. The key is that the total reproductive value of a species must be considered as constant, so therefore reproductive value should always be calculated in relative terms. This result removes a major obstacle from the theoretical challenge to find a unifying framework which establishes the IMA analogy for all of Darwinian biology, including as special cases inclusive fitness, evolutionarily stable strategies, evolutionary life-history theory, age-structured models and sex ratio theory. This would provide a formal, mathematical justification of fruitful and widespread but 'intentional' terms in evolutionary biology, such as 'selfish', 'altruism' and 'conflict'.
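The stochastic-environment result can be made concrete with a standard bet-hedging calculation. The payoff values below are assumed for illustration, not taken from Grafen's models:

```python
import math
import random

# Long-run growth tracks the geometric mean of per-generation fitness, so a
# variance-reducing "bet-hedging" strategy can beat a strategy with a higher
# arithmetic-mean fitness.
def long_run_growth(fitness_by_env, p_good=0.5, generations=100_000, seed=0):
    rng = random.Random(seed)
    total_log = sum(
        math.log(fitness_by_env["good" if rng.random() < p_good else "bad"])
        for _ in range(generations)
    )
    return math.exp(total_log / generations)   # realized geometric-mean fitness

specialist = {"good": 2.0, "bad": 0.4}   # arithmetic mean 1.2, high variance
bet_hedger = {"good": 1.2, "bad": 1.0}   # arithmetic mean 1.1, low variance
print(long_run_growth(specialist))   # ~0.89 < 1: declines in the long run
print(long_run_growth(bet_hedger))   # ~1.10 > 1: grows despite the lower mean
```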
Software Assurance: Five Essential Considerations for Acquisition Officials
2007-05-01
May 2007 www.stsc.hill.af.mil 17 2 • address security concerns in the software development life cycle ( SDLC )? • Are there formal software quality...What threat modeling process, if any, is used when designing the software ? What analysis, design, and construction tools are used by your software design...the-shelf (COTS), government off-the-shelf (GOTS), open- source, embedded, and legacy software . Attackers exploit unintentional vulnerabil- ities or
NASA Astrophysics Data System (ADS)
Cooke-Nieves, Natasha Anika
Science education research has consistently shown that elementary teachers have a low self-efficacy and background knowledge to teach science. When they teach science, there is a lack of field experiences and inquiry-based instruction at the elementary level due to limited resources, both material and pedagogical. This study focused on an analysis of a professional development (PD) model designed by the author known as the Collaborative Diagonal Learning Network (CDLN). The purpose of this study was to examine elementary school teacher participants pedagogical content knowledge related to their experiences in a CDLN model. The CDLN model taught formal and informal instruction using a science coach and an informal educational institution. Another purpose for this research included a theoretical analysis of the CDLN model to see if its design enabled teachers to expand their resource knowledge of available science education materials. The four-month-long study used qualitative data obtained during an in-service professional development program facilitated by a science coach and educators from a large natural history museum. Using case study as the research design, four elementary school teachers were asked to evaluate the effectiveness of their science coach and museum educator workshop sessions. During the duration of this study, semi-structured individual/group interviews and open-ended pre/post PD questionnaires were used. Other data sources included researcher field notes from lesson observations, museum field trips, audio-recorded workshop sessions, email correspondence, and teacher-created artifacts. The data were analyzed using a constructivist grounded theory approach. Themes that emerged included increased self-efficacy; increased pedagogical content knowledge; increased knowledge of museum education resources and access; creation of a professional learning community; and increased knowledge of science notebooking. Implications for formal and informal professional development in elementary science reform are offered. It is suggested that researchers investigate collaborative coaching through the lenses of organizational learning network theory, and develop professional learning communities with formal and informal educators; and that professional developers in city school systems and informal science institutions work in concert to produce more effective elementary teachers who not only love science but love teaching it.
A system approach to aircraft optimization
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, Jaroslaw
1991-01-01
Mutual couplings among the mathematical models of physical phenomena and parts of a system such as an aircraft complicate the design process because each contemplated design change may have a far reaching consequence throughout the system. Techniques are outlined for computing these influences as system design derivatives useful for both judgemental and formal optimization purposes. The techniques facilitate decomposition of the design process into smaller, more manageable tasks and they form a methodology that can easily fit into existing engineering organizations and incorporate their design tools.
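The notion of a system design derivative, the total sensitivity of a system-level output to a design variable through all inter-model couplings, can be sketched on a toy two-discipline system. The equations below are illustrative assumptions, not the paper's aircraft models:

```python
# Two coupled "disciplines": y1 depends on y2 and vice versa, so a change in
# the design variable x propagates through the whole system. The total
# derivative dF/dx is obtained here by re-converging the coupled system under
# a central finite-difference perturbation.
def solve_coupled(x, tol=1e-12):
    y1 = y2 = 0.0
    for _ in range(1000):                  # fixed-point iteration on the coupling
        y1_new = 0.5 * x + 0.3 * y2
        y2_new = 0.2 * x - 0.4 * y1
        if abs(y1_new - y1) + abs(y2_new - y2) < tol:
            break
        y1, y2 = y1_new, y2_new
    return y1, y2

def performance(x):
    y1, y2 = solve_coupled(x)
    return y1**2 + y2                      # system-level figure of merit

h = 1e-6
dF_dx = (performance(1.0 + h) - performance(1.0 - h)) / (2 * h)
print("system design derivative dF/dx ~", dF_dx)
```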
NASA Astrophysics Data System (ADS)
McPhee, J.; William, Y. W.
2005-12-01
This work presents a methodology for pumping test design based on the reliability requirements of a groundwater model. Reliability requirements take into consideration the application of the model results in groundwater management, expressed in this case as a multiobjective management model. The pumping test design is formulated as a mixed-integer nonlinear programming (MINLP) problem and solved using a combination of a genetic algorithm (GA) and gradient-based optimization. Bayesian decision theory provides a formal framework for assessing the influence of parameter uncertainty on the reliability of the proposed pumping test. The proposed methodology is useful for selecting a robust design that will outperform all other candidate designs under most potential 'true' states of the system.
Formalisms for user interface specification and design
NASA Technical Reports Server (NTRS)
Auernheimer, Brent J.
1989-01-01
The application of formal methods to the specification and design of human-computer interfaces is described. A broad outline of human-computer interface problems, a description of the field of cognitive engineering and two relevant research results, the appropriateness of formal specification techniques, and potential NASA application areas are described.
Safety Analysis of FMS/CTAS Interactions During Aircraft Arrivals
NASA Technical Reports Server (NTRS)
Leveson, Nancy G.
1998-01-01
This grant funded research on human-computer interaction design and analysis techniques, using future ATC environments as a testbed. The basic approach was to model the nominal behavior of both the automated and human procedures and then to apply safety analysis techniques to these models. Our previous modeling language, RSML, had been used to specify the system requirements for TCAS II for the FAA. Using the lessons learned from this experience, we designed a new modeling language that (among other things) incorporates features to assist in designing less error-prone human-computer interactions and interfaces and in detecting potential HCI problems, such as mode confusion. The new language, SpecTRM-RL, uses "intent" abstractions, based on Rasmussen's abstraction hierarchy, and includes both informal (English and graphical) specifications and formal, executable models for specifying various aspects of the system. One of the goals for our language was to highlight the system modes and mode changes to assist in identifying the potential for mode confusion. Three published papers resulted from this research. The first builds on the work of Degani on mode confusion to identify aspects of the system design that could lead to potential hazards. We defined and modeled modes differently than Degani and also defined design criteria for SpecTRM-RL models. Our design criteria include the Degani criteria but extend them to include more potential problems. In a second paper, Leveson and Palmer showed how the criteria for indirect mode transitions could be applied to a mode confusion problem found in several ASRS reports for the MD-88. In addition, we defined a visual task modeling language that can be used by system designers to model human-computer interaction. The visual models can be translated into SpecTRM-RL models, and then the SpecTRM-RL suite of analysis tools can be used to perform formal and informal safety analyses on the task model in isolation or integrated with the rest of the modeled system. We had hoped to be able to apply these modeling languages and analysis tools to a TAP air/ground trajectory negotiation scenario, but the development of the tools took more time than we anticipated.
Toward a Model-Based Approach to Flight System Fault Protection
NASA Technical Reports Server (NTRS)
Day, John; Murray, Alex; Meakin, Peter
2012-01-01
Fault Protection (FP) is a distinct and separate systems engineering sub-discipline that is concerned with the off-nominal behavior of a system. Flight system fault protection is an important part of the overall flight system systems engineering effort, with its own products and processes. As with other aspects of systems engineering, the FP domain is highly amenable to expression and management in models. However, while there are standards and guidelines for performing FP-related analyses, there are no standards or guidelines for formally relating the FP analyses to each other or to the system hardware and software design. As a result, the material generated for these analyses effectively creates separate models that are only loosely related to the system being designed. Approaches that enable modeling of FP concerns in the same model as the system hardware and software design allow the establishment of formal relationships, which has great potential for improving the efficiency, correctness, and verification of the implementation of flight system FP. This paper begins with an overview of the FP domain, then presents a SysML/UML model of the FP domain and the particular analyses that it contains, by way of showing a potential model-based approach to flight system fault protection and an exposition of the use of the FP models in FSW engineering. The analyses are small examples, inspired by current real-project examples of FP analyses.
NASA Technical Reports Server (NTRS)
Windley, P. J.
1991-01-01
In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.
Dynamic Gate Product and Artifact Generation from System Models
NASA Technical Reports Server (NTRS)
Jackson, Maddalena; Delp, Christopher; Bindschadler, Duane; Sarrel, Marc; Wollaeger, Ryan; Lam, Doris
2011-01-01
Model-Based Systems Engineering (MBSE) is gaining acceptance as a way to formalize systems engineering practice through the use of models. The traditional method of producing and managing a plethora of disjointed documents and presentations ("PowerPoint engineering") has proven both costly and limiting as a means to manage the complex and sophisticated specifications of modern space systems. We have developed a tool and method to produce sophisticated artifacts as views and by-products of integrated models, allowing us to minimize the practice of "PowerPoint engineering" on model-based projects and demonstrate the ability of MBSE to work within and supersede traditional engineering practices. This paper describes how we have created and successfully used model-based document generation techniques to extract paper artifacts from complex SysML and UML models in support of successful project reviews. Use of formal SysML and UML models for architecture and system design enables production of review documents, textual artifacts, and analyses that are consistent with one another and require virtually no labor-intensive maintenance across small-scale design changes and multiple authors. This effort thus enables approaches that focus more on rigorous engineering work and less on "PowerPoint engineering" and the production of paper-based documents or their "office-productivity" file equivalents.
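The document-generation idea, artifacts rendered as views of one integrated model, can be sketched in miniature. A toy Python dict stands in for the SysML repository here; real MBSE tooling queries the model directly:

```python
# Review artifacts are rendered as views of one integrated model, so the text
# stays consistent across documents and can be regenerated after any change.
model = {
    "name": "Sample Orbiter",
    "requirements": [
        {"id": "REQ-001", "text": "The orbiter shall downlink telemetry at 2 Mbps."},
        {"id": "REQ-002", "text": "The orbiter shall tolerate a single EDAC fault."},
    ],
    "components": ["C&DH", "EPS", "Telecom"],
}

def render_review_section(m):
    lines = [f"# {m['name']} Design Review", "", "## Requirements"]
    lines += [f"- {r['id']}: {r['text']}" for r in m["requirements"]]
    lines += ["", "## Components", ", ".join(m["components"])]
    return "\n".join(lines)

print(render_review_section(model))   # regenerate instead of hand-editing slides
```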
Minimization In Digital Design As A Meta-Planning Problem
NASA Astrophysics Data System (ADS)
Ho, William P. C.; Wu, Jung-Gen
1987-05-01
In our model-based expert system for automatic digital system design, we formalize the design process into three sub-processes: compiling high-level behavioral specifications into primitive behavioral operations, grouping primitive operations into behavioral functions, and grouping functions into modules. Consideration of design minimization explicitly controls decision-making in the last two sub-processes. Design minimization, a key task in the automatic design of digital systems, is complicated by the high degree of interaction among the time sequence and content of design decisions. In this paper, we present an AI approach which directly addresses these interactions and their consequences by modeling the minimization problem as a planning problem, and the management of design decision-making as a meta-planning problem.
The use of analytical models in human-computer interface design
NASA Technical Reports Server (NTRS)
Gugerty, Leo
1993-01-01
Recently, a large number of human-computer interface (HCI) researchers have investigated building analytical models of the user, which are often implemented as computer models. These models simulate the cognitive processes and task knowledge of the user in ways that allow a researcher or designer to estimate various aspects of an interface's usability, such as when user errors are likely to occur. This information can lead to design improvements. Analytical models can supplement design guidelines by providing designers rigorous ways of analyzing the information-processing requirements of specific tasks (i.e., task analysis). These models offer the potential of improving early designs and replacing some of the early phases of usability testing, thus reducing the cost of interface design. This paper describes some of the many analytical models that are currently being developed and evaluates the usefulness of analytical models for human-computer interface design. This paper will focus on computational, analytical models, such as the GOMS model, rather than less formal, verbal models, because the more exact predictions and task descriptions of computational models may be useful to designers. The paper also discusses some of the practical requirements for using analytical models in complex design organizations such as NASA.
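One widely used member of the GOMS family, the Keystroke-Level Model, predicts task time by summing standard operator times. The operator values below are the published KLM estimates; the task breakdown is an assumed example:

```python
# Keystroke-Level Model sketch: task time = sum of primitive operator times.
OPERATOR_SECONDS = {
    "K": 0.28,  # keystroke (average skilled typist)
    "P": 1.10,  # point with mouse
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def klm_estimate(ops):
    return sum(OPERATOR_SECONDS[op] for op in ops)

# Hypothetical task "save file under a new name":
# home to mouse, point to menu, click, think, type 8 characters, press Enter.
task = ["H", "P", "K", "M"] + ["K"] * 8 + ["K"]
print(f"predicted task time: {klm_estimate(task):.2f} s")
```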
Toward fidelity between specification and implementation
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd L.; Morrison, Jeff; Wu, Yunqing
1994-01-01
This paper describes the methods used to specify and implement a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally by two complementary teams using a combination of formal and informal techniques in an attempt to ensure the correctness of the protocol implementation. The first team, called the Design team, initially specified protocol requirements using a variant of SCR requirements tables and implemented a prototype solution. The second team, called the V&V team, developed a state model based on the requirements tables and derived test cases from these tables to exercise the implementation. In a series of iterative steps, the Design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation through testing. Test cases derived from state transition paths in the formal model formed the dialogue between teams during development and served as the vehicles for keeping the model and implementation in fidelity with each other. This paper describes our experiences in developing our process model, details of our approach, and some example problems found during the development of RMP.
Verification and validation of a reliable multicast protocol
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd L.
1995-01-01
This paper describes the methods used to specify and implement a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally by two complementary teams using a combination of formal and informal techniques in an attempt to ensure the correctness of the protocol implementation. The first team, called the Design team, initially specified protocol requirements using a variant of SCR requirements tables and implemented a prototype solution. The second team, called the V&V team, developed a state model based on the requirements tables and derived test cases from these tables to exercise the implementation. In a series of iterative steps, the Design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation through testing. Test cases derived from state transition paths in the formal model formed the dialogue between teams during development and served as the vehicles for keeping the model and implementation in fidelity with each other. This paper describes our experiences in developing our process model, details of our approach, and some example problems found during the development of RMP.
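Deriving test cases from state transition paths, as the V&V team did, can be sketched by enumerating simple paths through a state model. The protocol states below are hypothetical, not RMP's actual model:

```python
# Enumerate simple paths from the initial state; each path becomes one test
# scenario to exercise against the implementation. Loops are cut after one
# revisit so the set of paths stays finite.
def simple_paths(transitions, state, path=None):
    path = (path or []) + [state]
    succs = transitions.get(state, [])
    if not succs:
        yield path
        return
    for label, nxt in succs:
        if nxt in path:
            yield path + [nxt]
        else:
            yield from simple_paths(transitions, nxt, path)

protocol = {
    "closed":         [("join", "member")],
    "member":         [("send", "member_sending"), ("leave", "closed")],
    "member_sending": [("ack", "member")],
}
for p in simple_paths(protocol, "closed"):
    print(" -> ".join(p))          # each printed path is a candidate test case
```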
NASA Astrophysics Data System (ADS)
Fiorani, D.; Acierno, M.
2017-05-01
The aim of the present research is to develop an instrument able to adequately support the conservation process by means of a twofold approach, based on both a BIM environment and ontology formalisation. Although BIM has been successfully applied within the AEC (Architecture Engineering Construction) field, it has shown many drawbacks for architectural heritage. To cope with the unicity and, more generally, the complexity of ancient buildings, applications developed so far have adapted BIM poorly to conservation design, with unsatisfactory results (Dore, Murphy 2013; Carrara 2014). In order to combine achievements reached within AEC through the BIM environment (design control and management) with an appropriate, semantically enriched, and flexible knowledge representation, the presented model has at its core a knowledge base developed through information ontologies and oriented around the formalization and computability of all the knowledge necessary for the full comprehension of the architectural heritage object and its conservation. Such a knowledge representation is worked out upon conceptual categories defined above all within the scope of architectural criticism and conservation. The present paper aims at further extending the scope of conceptual modelling within cultural heritage conservation already formalized by the model. A special focus is placed on decay analysis and surface conservation projects.
Statechart Analysis with Symbolic PathFinder
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.
2012-01-01
We report here on our on-going work that addresses the automated analysis and test case generation for software systems modeled using multiple Statechart formalisms. The work is motivated by large programs such as NASA Exploration, that involve multiple systems that interact via safety-critical protocols and are designed with different Statechart variants. To verify these safety-critical systems, we have developed Polyglot, a framework for modeling and analysis of model-based software written using different Statechart formalisms. Polyglot uses a common intermediate representation with customizable Statechart semantics and leverages the analysis and test generation capabilities of the Symbolic PathFinder tool. Polyglot is used as follows: First, the structure of the Statechart model (expressed in Matlab Stateflow or Rational Rhapsody) is translated into a common intermediate representation (IR). The IR is then translated into Java code that represents the structure of the model. The semantics are provided as "pluggable" modules.
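The "pluggable semantics" idea can be illustrated by executing one statechart structure under two different semantic modules. The chart below is a toy, not Polyglot's intermediate representation:

```python
# One statechart structure, two semantic modules that disagree on how to
# resolve simultaneously enabled transitions.
chart = {
    "off":     [("power", "on"), ("power", "standby")],   # deliberately ambiguous
    "on":      [("power", "off")],
    "standby": [("power", "on")],
}

def step_first_match(state, event):
    # Semantics A: priority by declaration order (first matching transition wins).
    for ev, nxt in chart[state]:
        if ev == event:
            return nxt
    return state

def step_all_matches(state, event):
    # Semantics B: nondeterminism is an error to be reported, not resolved.
    targets = [nxt for ev, nxt in chart[state] if ev == event]
    if len(targets) > 1:
        raise ValueError(f"nondeterministic step from {state!r}: {targets}")
    return targets[0] if targets else state

print(step_first_match("off", "power"))   # 'on'
try:
    step_all_matches("off", "power")
except ValueError as exc:
    print(exc)                            # the two semantics disagree on this chart
```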
An Integrated Environment for Efficient Formal Design and Verification
NASA Technical Reports Server (NTRS)
1998-01-01
The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.
Formal Verification Toolkit for Requirements and Early Design Stages
NASA Technical Reports Server (NTRS)
Badger, Julia M.; Miller, Sheena Judson
2011-01-01
Efficient flight software development from natural language requirements needs an effective way to test designs earlier in the software design cycle. A method to automatically derive logical safety constraints and the design state space from natural language requirements is described. The constraints can then be checked using a logical consistency checker and also be used in a symbolic model checker to verify the early design of the system. This method was used to verify a hybrid control design for the suit ports on NASA Johnson Space Center's Space Exploration Vehicle against safety requirements.
Effect of mentoring on professional values in model C clinical nurse leader graduates.
Gazaway, Shena B; Anderson, Lori; Schumacher, Autumn; Alichnie, Chris
2018-04-19
Nursing graduates acquire their nursing values through professional socialization. Mentoring is a crucial support mechanism for these novice nurses, yet little is known about the model C clinical nurse leader graduate and the effects of mentoring. This investigation examined how mentoring affected the development of professional nursing values in model C clinical nurse leader graduates. A longitudinal design was used to survey model C clinical nurse leader graduates before and after graduation to determine how different types of mentoring relationships influenced professional values. Demographic surveys documented participant characteristics, and the Nurses Professional Values Scale - Revised (NPVS-R) assessed professional nursing values. Mean NPVS-R scores increased after graduation for the formally mentored participants, while NPVS-R scores decreased or remained unchanged for the other mentoring groups. However, no significant difference was found in NPVS-R scores over time (p = .092), nor was there a significant interaction between NPVS-R scores and type of mentoring relationship (p = .09). These results suggest that model C clinical nurse leader graduates who experience formal mentoring may develop professional nursing values more than their colleagues. Formal mentoring relationships are powerful and should be used to promote professional values for model C clinical nurse leader graduates.
Archetype modeling methodology.
Moner, David; Maldonado, José Alberto; Robles, Montserrat
2018-03-01
Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although many experiences of using archetypes have been reported in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to developing quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes a description of possible strategies for organizing the modeling process. The proposed methodology is inspired by existing best practices of CIM, software, and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can also serve as a reference for CIM development using any other formalism.
Petri net-based modelling of human-automation conflicts in aviation.
Pizziol, Sergio; Tessier, Catherine; Dehais, Frédéric
2014-01-01
Analyses of aviation safety reports reveal that human-machine conflicts induced by poor automation design are remarkable precursors of accidents. A review of different crew-automation conflict scenarios shows that they have a common denominator: the autopilot behaviour interferes with the pilot's goal regarding flight guidance via 'hidden' mode transitions. Considering both the human operator and the machine (i.e. the autopilot or the decision functions) as agents, we propose a Petri net model of those conflicting interactions, which allows them to be detected as deadlocks in the Petri net. In order to test our Petri net model, we designed an autoflight system that was formally analysed to detect conflicting situations. We identified three conflicting situations that were integrated into an experimental scenario in a flight simulator with 10 general aviation pilots. The results showed that the conflicts we had a priori identified as critical impacted the pilots' performance. Indeed, the first conflict remained unnoticed by eight participants and led to a potential collision with another aircraft. The second conflict was detected by all the participants, but three of them did not manage the situation correctly. The last conflict was also detected by all the participants but provoked a typical automation-surprise situation, as only one declared that he had understood the autopilot behaviour. These behavioural results are discussed in terms of workload and the number of fired 'hidden' transitions. Ultimately, this study reveals that formal and experimental approaches are complementary for identifying and assessing the criticality of human-automation conflicts. Practitioner Summary: We propose a Petri net model of human-automation conflicts. An experiment was conducted with general aviation pilots performing a scenario involving three conflicting situations to test the soundness of our formal approach. This study reveals that formal and experimental approaches are complementary for identifying and assessing the criticality of conflicts.
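Detecting conflicts as deadlocks, as described above, reduces to searching the reachability graph for markings in which no transition is enabled. A minimal sketch follows, using a toy control-authority net rather than the paper's autoflight model:

```python
# Explicit search of a Petri net's reachability graph for dead markings.
def enabled(marking, t):
    ins, _ = t
    return all(marking[p] > 0 for p in ins)

def fire(marking, t):
    ins, outs = t
    m = dict(marking)
    for p in ins:
        m[p] -= 1
    for p in outs:
        m[p] += 1
    return m

def find_deadlocks(m0, transitions):
    seen, stack, deadlocks = set(), [m0], []
    while stack:
        m = stack.pop()
        key = tuple(sorted(m.items()))
        if key in seen:
            continue
        seen.add(key)
        succs = [fire(m, t) for t in transitions if enabled(m, t)]
        if not succs:
            deadlocks.append(m)   # no enabled transition: a dead marking
        stack.extend(succs)
    return deadlocks

# Toy conflict: the pilot hands control back, but a 'hidden' mode grabs the
# authority token and never returns it.
m0 = {"free": 1, "pilot_has": 0, "ap_has": 0}
ts = [
    (["free"], ["pilot_has"]),    # pilot takes control...
    (["pilot_has"], ["free"]),    # ...and can hand it back
    (["free"], ["ap_has"]),       # hidden mode takes control, with no way back
]
print(find_deadlocks(m0, ts))     # [{'free': 0, 'pilot_has': 0, 'ap_has': 1}]
```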
Modeling Of Object- And Scene-Prototypes With Hierarchically Structured Classes
NASA Astrophysics Data System (ADS)
Ren, Z.; Jensch, P.; Ameling, W.
1989-03-01
The success of knowledge-based image analysis methodology and implementation tools depends largely on an appropriately and efficiently built model wherein the domain-specific context information about, and the inherent structure of, the observed image scene have been encoded. For identifying an object in an application environment, a computer vision system needs to know, firstly, the description of the object to be found in an image or in an image sequence and, secondly, the corresponding relationships between object descriptions within the image sequence. This paper presents models of image objects and scenes by means of hierarchically structured classes. Using the topovisual formalism of graphs and higraphs, we are currently studying principally the relational aspect and data abstraction of the modeling, in order to visualize the structural nature resident in image objects and scenes and to formalize their descriptions. The goal is to expose the structure of the image scene and the correspondence of image objects in the low-level image interpretation process. The object-based system design approach has been applied to build the model base. We utilize the object-oriented programming language C++ for designing, testing, and implementing the abstracted entity classes and the operation structures which have been modeled topovisually. The reference images used for modeling prototypes of objects and scenes are from industrial environments as well as medical applications.
Formal verification of software-based medical devices considering medical guidelines.
Daw, Zamira; Cleaveland, Rance; Vetter, Marcus
2014-01-01
Software-based devices have increasingly become an important part of several clinical scenarios. Due to their critical impact on human life, medical devices have very strict safety requirements. It is therefore necessary to apply verification methods to ensure that the safety requirements are met. Verification of software-based devices is commonly limited to the verification of their internal elements, without considering the interaction that these elements have with other devices or the application environment in which they are used. Medical guidelines define clinical procedures, which contain the information necessary to completely verify medical devices. The objective of this work was to incorporate medical guidelines into the verification process in order to increase the reliability of software-based medical devices. Medical devices are developed using the model-driven method Deterministic Models for Signal Processing of Embedded Systems (DMOSES). This method uses Unified Modeling Language (UML) models as a basis for the development of medical devices. The UML activity diagram is used to describe medical guidelines as workflows. The functionality of the medical devices is abstracted as a set of actions that is modeled within these workflows. In this paper, the UML models are verified using the UPPAAL model checker. For this purpose, a formalization approach for the UML models using timed automata (TA) is presented. A set of requirements is verified by the proposed approach for navigation-guided biopsy. This shows the capability of the approach to identify errors or optimization points both in the workflow and in the system design of the navigation device. In addition to the above, an open-source Eclipse plug-in was developed for the automated transformation of UML models into TA models that are automatically verified using UPPAAL. The proposed method enables developers to model medical devices and their clinical environment using clinical workflows as one UML diagram. Additionally, the system design can be formally verified automatically.
Saturn Radiation (SATRAD) Model
NASA Technical Reports Server (NTRS)
Garrett, H. B.; Ratliff, J. M.; Evans, R. W.
2005-01-01
The Saturnian radiation belts have not received as much attention as the Jovian radiation belts because they are not nearly as intense; the famous Saturnian particle rings tend to deplete the belts near where their peak would occur. As a result, there has not been a systematic development of engineering models of the Saturnian radiation environment for mission design. A primary exception is that of Divine (1990). That study used published data from several charged particle experiments aboard the Pioneer 11, Voyager 1, and Voyager 2 spacecraft during their flybys at Saturn to generate numerical models for the electron and proton radiation belts between 2.3 and 13 Saturn radii. The Divine Saturn radiation model described the electron distributions at energies between 0.04 and 10 MeV and the proton distributions at energies between 0.14 and 80 MeV. The model was intended to predict particle intensity, flux, and fluence for the Cassini orbiter. Divine carried out hand calculations using the model but never formally developed a computer program that could be used for general mission analyses. This report seeks to fill that void by formally developing a FORTRAN version of the model that can be used as a computer design tool for missions to Saturn that require estimates of the radiation environment around the planet. The results of that effort and the program listings are presented here, along with comparisons with the original estimates carried out by Divine. In addition, Pioneer and Voyager data were scanned in from the original references and compared with the FORTRAN model's predictions. The results were statistically analyzed in a manner consistent with Divine's approach to provide estimates of the ability of the model to reproduce the original data. Results of a formal review of the model by a panel of experts are also presented, and their recommendations for further tests, analyses, and extensions to the model are discussed.
When product designers use perceptually based color tools
NASA Astrophysics Data System (ADS)
Bender, Walter R.
1998-07-01
Palette synthesis and analysis tools have been built based upon a model of color experience. This model adjusts formal compositional elements such as hue, value, chroma, and their contrasts, as well as size and proportion. Clothing and household product designers were given these tools to give guidance to their selection of seasonal palettes for use in production of the private-label merchandise of a large retail chain. The designers chose base palettes. Accents to these palettes were generated with and without the aid of the color tools. These palettes are compared by using perceptual metrics and interviews. The results are presented.
When product designers use perceptually based color tools
NASA Astrophysics Data System (ADS)
Bender, Walter R.
2001-01-01
Palette synthesis and analysis tools have been built based upon a model of color experience. This model adjusts formal compositional elements such as hue, value, chroma, and their contrasts, as well as size and proportion. Clothing and household product designers were given these tools to guide their selection of seasonal palettes in the production of the private-label merchandise in a large retail chain. The designers chose base palettes. Accents to these palettes were generated with and without the aid of the color tools. These palettes are compared by using perceptual metrics and interviews. The results are presented.
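The kind of formal bookkeeping such tools automate, contrasts of hue, value, and chroma, can be roughly approximated with the standard library's HSV conversion. This is a crude stand-in: real palette tools work in perceptual spaces such as Munsell or CIELAB, and the colours below are arbitrary examples:

```python
import colorsys

# Approximate hue/value/chroma from RGB via HSV (a rough perceptual proxy).
def hvc(rgb):
    h, s, v = colorsys.rgb_to_hsv(*rgb)   # r, g, b in [0, 1]
    return h * 360.0, v, s                # hue (degrees), value, chroma-like saturation

def hue_contrast(rgb_a, rgb_b):
    ha, _, _ = hvc(rgb_a)
    hb, _, _ = hvc(rgb_b)
    d = abs(ha - hb) % 360.0
    return min(d, 360.0 - d)              # angular hue difference

base = (0.55, 0.35, 0.25)                 # a warm base-palette colour
accent = (0.20, 0.45, 0.60)               # a candidate accent
print("hue contrast (deg):", round(hue_contrast(base, accent), 1))
```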
A graph grammar approach to artificial life.
Kniemeyer, Ole; Buck-Sorlin, Gerhard H; Kurth, Winfried
2004-01-01
We present the high-level language of relational growth grammars (RGGs) as a formalism designed for the specification of ALife models. RGGs can be seen as an extension of the well-known parametric Lindenmayer systems and contain rule-based, procedural, and object-oriented features. They are defined as rewriting systems operating on graphs whose edges come from a set of user-defined relations and whose nodes can be associated with objects. We demonstrate their ability to represent genes, regulatory networks of metabolites, and morphologically structured organisms, as well as developmental aspects of these entities, in a common formal framework. Mutation, crossing over, selection, and the dynamics of a network of gene regulation can all be represented with simple graph rewriting rules. This is demonstrated in some detail on the classical example of Dawkins' biomorphs and the ABC model of flower morphogenesis; other applications are briefly sketched. An interactive program was implemented, enabling the execution of the formalism and the visualization of the results.
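The parametric-L-system core of RGGs can be sketched with a module-rewriting loop. This works on flat module lists with a toy rule set, so it omits the relational/graph features that distinguish RGGs from plain L-systems:

```python
# Parametric-L-system-style rewriting: each module (symbol, parameter) is
# replaced according to a rule keyed on its symbol; unmatched modules persist.
def rewrite(axiom, rules, steps):
    modules = list(axiom)
    for _ in range(steps):
        nxt = []
        for symbol, param in modules:
            nxt.extend(rules.get(symbol, lambda p: [(symbol, p)])(param))
        modules = nxt
    return modules

rules = {
    # A(x) -> A(x+1) B(x) : apical growth plus a lateral module
    "A": lambda x: [("A", x + 1), ("B", x)],
    # B(x) has no rule, so lateral modules persist unchanged
}
print(rewrite([("A", 0)], rules, 3))
# [('A', 3), ('B', 2), ('B', 1), ('B', 0)]
```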
Architecting the Human Space Flight Program with Systems Modeling Language (SysML)
NASA Technical Reports Server (NTRS)
Jackson, Maddalena M.; Fernandez, Michela Munoz; McVittie, Thomas I.; Sindiy, Oleg V.
2012-01-01
The next generation of missions in NASA's Human Space Flight program focuses on the development and deployment of highly complex systems (e.g., Orion Multi-Purpose Crew Vehicle, Space Launch System, 21st Century Ground System) that will enable astronauts to venture beyond low Earth orbit and explore the moon, near-Earth asteroids, and beyond. Architecting these highly complex system-of-systems requires formal systems engineering techniques for managing the evolution of the technical features in the information exchange domain (e.g., data exchanges, communication networks, ground software) and also, formal correlation of the technical architecture to stakeholders' programmatic concerns (e.g., budget, schedule, risk) and design development (e.g., assumptions, constraints, trades, tracking of unknowns). This paper will describe how the authors have applied System Modeling Language (SysML) to implement model-based systems engineering for managing the description of the End-to-End Information System (EEIS) architecture and associated development activities and ultimately enables stakeholders to understand, reason, and answer questions about the EEIS under design for proposed lunar Exploration Missions 1 and 2 (EM-1 and EM-2).
Formal methods in the design of Ada 1995
NASA Technical Reports Server (NTRS)
Guaspari, David
1995-01-01
Formal, mathematical methods are most useful when applied early in the design and implementation of a software system--that, at least, is the familiar refrain. I will report on a modest effort to apply formal methods at the earliest possible stage, namely, in the design of the Ada 95 programming language itself. This talk is an 'experience report' that provides brief case studies illustrating the kinds of problems we worked on, how we approached them, and the extent (if any) to which the results proved useful. It also derives some lessons and suggestions for those undertaking future projects of this kind. Ada 95 is the first revision of the standard for the Ada programming language. The revision began in 1988, when the Ada Joint Programming Office first asked the Ada Board to recommend a plan for revising the Ada standard. The first step in the revision was to solicit criticisms of Ada 83. A set of requirements for the new language standard, based on those criticisms, was published in 1990. A small design team, the Mapping Revision Team (MRT), became exclusively responsible for revising the language standard to satisfy those requirements. The MRT, from Intermetrics, is led by S. Tucker Taft. The work of the MRT was regularly subject to independent review and criticism by a committee of distinguished Reviewers and by several advisory teams--for example, the two User/Implementor teams, each consisting of an industrial user (attempting to make significant use of the new language on a realistic application) and a compiler vendor (undertaking, experimentally, to modify its current implementation in order to provide the necessary new features). One novel decision established the Language Precision Team (LPT), which investigated language proposals from a mathematical point of view. The LPT applied formal mathematical analysis to help improve the design of Ada 95 (e.g., by clarifying the language proposals) and to help promote its acceptance (e.g., by identifying a verifiable subset that would meet the needs of safety-critical applications). The first LPT project, which ran from the fall of 1990 until the end of 1992, produced studies of several language issues: optimization, sharing and storage, tasking and protected records, overload resolution, the floating point model, distribution, program errors, and object-oriented programming. The second LPT project, in 1994, formally modeled the dynamic semantics of a large part of the (almost) final language definition, looking especially for interactions between language features.
Modeling Structure-Function Relationships in Synthetic DNA Sequences using Attribute Grammars
Cai, Yizhi; Lux, Matthew W.; Adam, Laura; Peccoud, Jean
2009-01-01
Recognizing that certain biological functions can be associated with specific DNA sequences has led various fields of biology to adopt the notion of the genetic part. This concept provides a finer level of granularity than the traditional notion of the gene. However, a method of formally relating how a set of parts relates to a function has not yet emerged. Synthetic biology both demands such a formalism and provides an ideal setting for testing hypotheses about relationships between DNA sequences and phenotypes beyond the gene-centric methods used in genetics. Attribute grammars are used in computer science to translate the text of a program source code into the computational operations it represents. By associating attributes with parts, modifying the value of these attributes using rules that describe the structure of DNA sequences, and using a multi-pass compilation process, it is possible to translate DNA sequences into molecular interaction network models. These capabilities are illustrated by simple example grammars expressing how gene expression rates are dependent upon single or multiple parts. The translation process is validated by systematically generating, translating, and simulating the phenotype of all the sequences in the design space generated by a small library of genetic parts. Attribute grammars represent a flexible framework connecting parts with models of biological function. They will be instrumental for building mathematical models of libraries of genetic constructs synthesized to characterize the function of genetic parts. This formalism is also expected to provide a solid foundation for the development of computer assisted design applications for synthetic biology. PMID:19816554
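The attribute-synthesis idea described in this abstract can be illustrated in miniature. The following sketch is a hypothetical toy, not the paper's grammar or its compiler: parts carry attributes, and a pass over the part sequence synthesizes an expression-rate attribute for each coding sequence from the promoter and RBS attributes to its left.

```python
# Toy attribute-grammar-style translation of a DNA "part" sequence into
# gene-expression rates (attributes, part names, and the combining rule
# are all hypothetical).
PARTS = {
    "P_strong": {"kind": "promoter", "strength": 5.0},
    "P_weak":   {"kind": "promoter", "strength": 0.5},
    "RBS1":     {"kind": "rbs", "efficiency": 0.8},
    "gfp":      {"kind": "cds", "name": "GFP"},
}

def translate(sequence):
    """Synthesize an expression-rate attribute for each CDS from the
    promoter strength and RBS efficiency encountered upstream of it."""
    strength, efficiency, model = 0.0, 0.0, {}
    for part in sequence:
        attrs = PARTS[part]
        if attrs["kind"] == "promoter":
            strength = attrs["strength"]
        elif attrs["kind"] == "rbs":
            efficiency = attrs["efficiency"]
        elif attrs["kind"] == "cds":
            model[attrs["name"]] = strength * efficiency  # rate attribute
    return model

print(translate(["P_strong", "RBS1", "gfp"]))  # {'GFP': 4.0}
```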
Alves, Rui; Vilaprinyo, Ester; Hernández-Bermejo, Benito; Sorribas, Albert
2008-01-01
There is a renewed interest in obtaining a systemic understanding of metabolism, gene expression and signal transduction processes, driven by the recent research focus on Systems Biology. From a biotechnological point of view, such a systemic understanding of how a biological system is designed to work can facilitate the rational manipulation of specific pathways in different cell types to achieve specific goals. Due to the intrinsic complexity of biological systems, mathematical models are a central tool for understanding and predicting the integrative behavior of those systems. In particular, models are essential for the rational development of biotechnological applications and for understanding a system's design from an evolutionary point of view. Mathematical models can be obtained using many different strategies. In each case, their utility will depend upon the properties of the mathematical representation and on the possibility of obtaining meaningful parameters from available data. In practice, there are several issues at stake when one has to decide which mathematical model is more appropriate for the study of a given problem. First, one needs a model that can represent the aspects of the system one wishes to study. Second, one must choose a mathematical representation that allows an accurate analysis of the system with respect to different aspects of interest (for example, robustness of the system, dynamical behavior, optimization of the system with respect to some production goal, parameter value determination, etc.). Third, before choosing between alternative and equally appropriate mathematical representations for the system, one should compare representations with respect to ease of automation for model set-up, simulation, and analysis of results. Fourth, one should also consider how to facilitate model transference and re-usability by other researchers and for distinct purposes. Finally, one factor that is important for all four aspects is the regularity in the mathematical structure of the equations, because it facilitates computational manipulation. This regularity is a mark of kinetic representations based on approximation theory. The use of approximation theory to derive mathematical representations with regular structure for modeling purposes has a long tradition in science. In most applied fields, such as engineering and physics, those approximations are often required to obtain practical solutions to complex problems. In this paper we review some of the more popular mathematical representations that have been derived using approximation theory and are used for modeling in molecular systems biology. We will focus on formalisms that are theoretically supported by Taylor's theorem. These include the Power-law formalism, the recently proposed (log)linear and Lin-log formalisms, as well as some closely related alternatives. We will analyze the similarities and differences between these formalisms, discuss the advantages and limitations of each representation, and provide a tentative "road map" for their potential utilization for different problems.
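The Taylor-based derivation behind the power-law formalism can be stated compactly. As a hedged sketch (the notation is assumed here, not taken from the review): expand ln v to first order in the logarithms of the variables around an operating point and exponentiate.

```latex
% First-order Taylor expansion of \ln v in z_i = \ln X_i about an
% operating point (subscript 0 denotes evaluation there), followed by
% exponentiation, yields the power-law form with kinetic orders g_i:
\[
  v(X_1,\dots,X_n) \;\approx\; \alpha \prod_{i=1}^{n} X_i^{g_i},
  \qquad
  g_i = \left.\frac{\partial \ln v}{\partial \ln X_i}\right|_{0},
  \qquad
  \alpha = v_0 \prod_{i=1}^{n} \left(X_i^{0}\right)^{-g_i}.
\]
```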
An Approach to Verification and Validation of a Reliable Multicasting Protocol
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd L.
1994-01-01
This paper describes the process of implementing a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally using a combination of formal and informal techniques in an attempt to ensure the correctness of its implementation. Our development process involved three concurrent activities: (1) the initial construction and incremental enhancement of a formal state model of the protocol machine; (2) the initial coding and incremental enhancement of the implementation; and (3) model-based testing of iterative implementations of the protocol. These activities were carried out by two separate teams: a design team and a V&V team. The design team built the first version of RMP with limited functionality to handle only nominal requirements of data delivery. In a series of iterative steps, the design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation. This was done by generating test cases based on suspected errant or off-nominal behaviors predicted by the current model. If the execution of a test was different between the model and implementation, then the differences helped identify inconsistencies between the model and implementation. The dialogue between both teams drove the co-evolution of the model and implementation. Testing served as the vehicle for keeping the model and implementation in fidelity with each other. This paper describes (1) our experiences in developing our process model; and (2) three example problems found during the development of RMP.
An approach to verification and validation of a reliable multicasting protocol
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd L.
1995-01-01
This paper describes the process of implementing a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally using a combination of formal and informal techniques in an attempt to ensure the correctness of its implementation. Our development process involved three concurrent activities: (1) the initial construction and incremental enhancement of a formal state model of the protocol machine; (2) the initial coding and incremental enhancement of the implementation; and (3) model-based testing of iterative implementations of the protocol. These activities were carried out by two separate teams: a design team and a V&V team. The design team built the first version of RMP with limited functionality to handle only nominal requirements of data delivery. In a series of iterative steps, the design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation. This was done by generating test cases based on suspected errant or off-nominal behaviors predicted by the current model. If the execution of a test was different between the model and implementation, then the differences helped identify inconsistencies between the model and implementation. The dialogue between both teams drove the co-evolution of the model and implementation. Testing served as the vehicle for keeping the model and implementation in fidelity with each other. This paper describes (1) our experiences in developing our process model; and (2) three example problems found during the development of RMP.
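The model-based testing loop described in these two abstracts can be shown in miniature. The sketch below is hypothetical (toy states, events, and a deliberately seeded bug, not the actual RMP machine): it runs generated event sequences through a formal state model and an implementation in parallel and reports divergences, which is essentially how the generated test cases exposed inconsistencies between model and implementation.

```python
import itertools

# Formal state model of a toy "reliable delivery" machine:
# (state, event) -> next state. Hypothetical, not the RMP protocol model.
MODEL = {
    ("idle", "send"): "awaiting_ack",
    ("awaiting_ack", "ack"): "idle",
    ("awaiting_ack", "timeout"): "awaiting_ack",  # retransmit, stay put
}

class Implementation:
    """Toy implementation under test (deliberately buggy on timeout)."""
    def __init__(self):
        self.state = "idle"

    def step(self, event):
        if self.state == "idle" and event == "send":
            self.state = "awaiting_ack"
        elif self.state == "awaiting_ack" and event == "ack":
            self.state = "idle"
        elif self.state == "awaiting_ack" and event == "timeout":
            self.state = "idle"  # BUG: silently drops the pending message
        return self.state

def model_run(events):
    state, trace = "idle", []
    for e in events:
        state = MODEL.get((state, e), state)  # undefined events are ignored
        trace.append(state)
    return trace

def impl_run(events):
    impl = Implementation()
    return [impl.step(e) for e in events]

# Generate short event sequences and flag model/implementation divergence.
for seq in itertools.product(["send", "ack", "timeout"], repeat=3):
    if model_run(seq) != impl_run(seq):
        print("divergence on", seq, model_run(seq), impl_run(seq))
```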
A System for Natural Language Sentence Generation.
ERIC Educational Resources Information Center
Levison, Michael; Lessard, Gregory
1992-01-01
Describes the natural language computer program, "Vinci." Explains that using an attribute grammar formalism, Vinci can simulate components of several current linguistic theories. Considers the design of the system and its applications in linguistic modelling and second language acquisition research. Notes Vinci's uses in linguistics…
Savini, Giorgio; Pisano, Giampaolo; Ade, Peter A R
2006-12-10
We adopted an existing formalism and modified it to simulate, with high precision, the transmission, reflection, and absorption of multiple-plate birefringent devices as a function of frequency. To validate the model, we use it to compare the measured properties of an achromatic five-plate device with a broadband antireflection coating to expectations derived from the material optical constants and its geometric configuration. The half-wave plate presented here is observed to perform well, with a phase shift variation of < 2 degrees from the ideal 180 degrees over a bandwidth of Δν/ν ≈ 1 at millimeter wavelengths. This formalism represents a powerful design tool for birefringent polarization modulators and enables the optical properties of such devices to be specified with high accuracy.
A Novel Ontology Approach to Support Design for Reliability considering Environmental Effects
Sun, Bo; Li, Yu; Ye, Tianyuan
2015-01-01
Environmental effects are not considered sufficiently in product design, and the reliability problems they cause are very prominent. This paper proposes a method for applying an ontology approach in product design, so that knowledge of environmental effects can be reused during product reliability design and analysis. First, the relationship between environmental effects and product reliability is analyzed. Then an environmental effects ontology describing the domain knowledge of environmental effects is designed, and the related concepts are formally defined using the ontology approach. This model can be applied to organize environmental-effects knowledge for different environments. Finally, rubber seals used in the subhumid acid rain environment are taken as an example to illustrate application of the ontological model to reliability design and analysis. PMID:25821857
A novel ontology approach to support design for reliability considering environmental effects.
Sun, Bo; Li, Yu; Ye, Tianyuan; Ren, Yi
2015-01-01
Environmental effects are not considered sufficiently in product design, and the reliability problems they cause are very prominent. This paper proposes a method for applying an ontology approach in product design, so that knowledge of environmental effects can be reused during product reliability design and analysis. First, the relationship between environmental effects and product reliability is analyzed. Then an environmental effects ontology describing the domain knowledge of environmental effects is designed, and the related concepts are formally defined using the ontology approach. This model can be applied to organize environmental-effects knowledge for different environments. Finally, rubber seals used in the subhumid acid rain environment are taken as an example to illustrate application of the ontological model to reliability design and analysis.
Applying learning theories and instructional design models for effective instruction.
Khalil, Mohammed K; Elkhider, Ihsan A
2016-06-01
Faculty members in higher education are involved in many instructional design activities without formal training in learning theories and the science of instruction. Learning theories provide the foundation for the selection of instructional strategies and allow for reliable prediction of their effectiveness. To achieve effective learning outcomes, the science of instruction and instructional design models are used to guide the development of instructional design strategies that elicit appropriate cognitive processes. Here, the major learning theories are discussed and selected examples of instructional design models are explained. The main objective of this article is to present the science of learning and instruction as theoretical evidence for the design and delivery of instructional materials. In addition, this article provides a practical framework for implementing those theories in the classroom and laboratory. Copyright © 2016 The American Physiological Society.
Formally verifying Ada programs which use real number types
NASA Technical Reports Server (NTRS)
Sutherland, David
1986-01-01
Formal verification is applied to programs which use real number arithmetic operations (mathematical programs). Formal verification of a program P consists of creating a mathematical model of P, stating the desired properties of P in a formal logical language, and proving that the mathematical model has the desired properties using a formal proof calculus. The development and verification of the mathematical model are discussed.
On decentralized design: Rationale, dynamics, and effects on decision-making
NASA Astrophysics Data System (ADS)
Chanron, Vincent
The focus of this dissertation is the design of complex systems, including engineering systems such as cars, airplanes, and satellites. Companies who design these systems are under constant pressure to design better products that meet customer expectations, and competition forces them to develop them faster. One of the responses of the industry to these conflicting challenges has been the decentralization of design responsibilities. The current lack of understanding of the dynamics of decentralized design processes is the main motivation for this research, and places value on the descriptive base. It identifies the main reasons and the true benefits for companies to decentralize the design of their products. It also demonstrates the limitations of this approach by listing the relevant issues and problems created by the decentralization of decisions. Based on these observations, a game-theoretic approach to decentralized design is proposed to model the decisions made during the design process. The dynamics are modeled using mathematical formulations inspired by control theory. Building upon this formalism, the issue of convergence in decentralized design is analyzed: the equilibrium points of the design space are identified and convergent and divergent patterns are recognized. This rigorous investigation of the design process provides motivation and support for proposing new approaches to decentralized design problems. Two methods are developed, which aim at improving the design process in two ways: decreasing the product development time, and increasing the optimality of the final design. These methods draw on eigenstructure decomposition and set-based design, respectively. The value of the research detailed within this dissertation is in the proposed methods, which are built upon the sound mathematical formalism developed. The contribution of this work is twofold: rigorous investigation of the design process, and practical support to decision-making in decentralized environments.
Creation of system of computer-aided design for technological objects
NASA Astrophysics Data System (ADS)
Zubkova, T. M.; Tokareva, M. A.; Sultanov, N. Z.
2018-05-01
Due to the competition in the market for process equipment, its production should be flexible, allowing retuning to various product configurations, raw materials and productivity levels, depending on current market needs. This process is not possible without CAD (computer-aided design). The formation of CAD begins with planning. Synthesizing, analyzing, evaluating and converting operations, as well as visualization and decision-making operations, can be automated. Based on a formal description of the design procedures, the design route is constructed in the form of an oriented graph. The decomposition of the design process, represented by the formalized description of the design procedures, makes it possible to make an informed choice of the CAD component for the solution of the task. The object-oriented approach allows us to consider the CAD as an independent system whose properties are inherited from its components. The first step determines the range of tasks to be performed by the system and a set of components for their implementation. The second is the configuration of the selected components. The interaction between the selected components is carried out using the CALS standards. The chosen CAD/CAE-oriented approach allows creating a single model, which is stored in the database of the subject area. Each of the integration stages is implemented as a separate functional block. The transformation of the CAD model into the internal representation is realized by a block that searches for the geometric parameters of the technological machine, in which an XML model of the construction is obtained on the basis of the feature method from the theory of image recognition. The configuration of integrated components is divided into three consecutive steps: configuring tasks, components, and interfaces. The configuration of the components is realized using the theory of soft computing, via the Mamdani fuzzy inference algorithm.
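The component-configuration step above relies on Mamdani fuzzy inference. A minimal self-contained version of that algorithm is sketched below; the membership functions and two-rule base are illustrative, not the authors' configuration rules.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Discretized universe of discourse for the output variable.
y = np.linspace(0.0, 1.0, 201)

def mamdani(x):
    """Two-rule Mamdani inference for a crisp input x in [0, 1].
    Rule 1: IF x is LOW  THEN output is LOW
    Rule 2: IF x is HIGH THEN output is HIGH"""
    w_low = tri(x, -0.5, 0.0, 1.0)   # firing strength of rule 1
    w_high = tri(x, 0.0, 1.0, 1.5)   # firing strength of rule 2
    # Min implication clips each consequent; max aggregates across rules.
    agg = np.maximum(np.minimum(w_low, tri(y, -0.5, 0.0, 1.0)),
                     np.minimum(w_high, tri(y, 0.0, 1.0, 1.5)))
    return float((agg * y).sum() / agg.sum())  # centroid defuzzification

print(mamdani(0.2), mamdani(0.8))  # low input -> low output, and vice versa
```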
Transforming Aggregate Object-Oriented Formal Specifications to Code
1999-03-01
integration issues associated with a formal-based software transformation system, such as the source specification, the problem space architecture, design architecture ... design transforms, and target software transforms. Software is critical in today's Air Force, yet its specification, design, and development
1981-01-01
comparison of formal and informal design methodologies will show how we think they are converging. Lastly, I will describe our involvement with the DoD... computer security must begin with the design methodology, with the objective being provability. The idea of a formal evaluation and on-the-shelf... [Methodologies] Here we can compare the formal design methodologies with those used by informal practitioners like Control Data. Obviously, both processes
Toward a mathematical formalism of performance, task difficulty, and activation
NASA Technical Reports Server (NTRS)
Samaras, George M.
1988-01-01
The rudiments of a mathematical formalism for handling operational, physiological, and psychological concepts are developed for use by the man-machine system design engineer. The formalism provides a framework for developing a structured, systematic approach to the interface design problem, using existing mathematical tools, and simplifying the problem of telling a machine how to measure and use performance.
NASA Technical Reports Server (NTRS)
Lucas, S. H.; Scotti, S. J.
1989-01-01
The nonlinear mathematical programming method (formal optimization) has had many applications in engineering design. A figure illustrates the use of optimization techniques in the design process. The design process begins with the design problem, such as the classic example of the two-bar truss designed for minimum weight as seen in the leftmost part of the figure. If formal optimization is to be applied, the design problem must be recast in the form of an optimization problem consisting of an objective function, design variables, and constraint function relations. The middle part of the figure shows the two-bar truss design posed as an optimization problem. The total truss weight is the objective function, the tube diameter and truss height are design variables, with stress and Euler buckling considered as constraint function relations. Lastly, the designer develops or obtains analysis software containing a mathematical model of the object being optimized, and then interfaces the analysis routine with existing optimization software such as CONMIN, ADS, or NPSOL. This final stage of software development can be both tedious and error-prone. The Sizing and Optimization Language (SOL), a special-purpose computer language whose goal is to make the software implementation phase of optimum design easier and less error-prone, is presented.
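The two-bar truss formulation described above maps directly onto a modern optimizer. The sketch below illustrates the interfacing step that SOL was designed to ease; all material, geometry, and load values are hypothetical, and SciPy's SLSQP stands in for CONMIN/ADS/NPSOL.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative two-bar truss sizing (consistent but hypothetical units):
# design variables x = [d, H] -- tube diameter and truss height.
P, B, t = 1.5e5, 30.0, 0.1            # load, half-span, tube wall thickness
rho, E, sigma_y = 0.3, 3.0e7, 1.0e5   # density, modulus, allowable stress

def weight(x):                        # objective: total truss weight
    d, H = x
    return 2.0 * rho * np.pi * d * t * np.hypot(B, H)

def stress(x):                        # axial stress in each member
    d, H = x
    return P * np.hypot(B, H) / (2.0 * np.pi * d * t * H)

def g_yield(x):                       # sigma_y - sigma >= 0
    return sigma_y - stress(x)

def g_buckling(x):                    # Euler critical stress - sigma >= 0
    d, H = x
    sigma_cr = np.pi**2 * E * (d**2 + t**2) / (8.0 * (B**2 + H**2))
    return sigma_cr - stress(x)

res = minimize(weight, x0=[2.0, 30.0], method="SLSQP",
               bounds=[(0.5, 10.0), (5.0, 100.0)],
               constraints=[{"type": "ineq", "fun": g_yield},
                            {"type": "ineq", "fun": g_buckling}])
print(res.x, res.fun)                 # optimal [d, H] and minimum weight
```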
NASA Astrophysics Data System (ADS)
Nouri, N. M.; Mostafapour, K.; Kamran, M.
2018-02-01
In a closed water-tunnel circuit, multi-component strain gauge force and moment sensors (also known as balances) are generally used to measure the hydrodynamic forces and moments acting on scaled models. These balances are periodically calibrated by static loading, and their performance and accuracy depend significantly on the rig and the method of calibration. In this research, a new calibration rig was designed and constructed to calibrate multi-component internal strain gauge balances. The calibration rig has six degrees of freedom and six different component-loading structures that can be applied separately or synchronously. The system was designed around the applicability of formal experimental design techniques, using gravity for balance loading and for balance positioning and alignment relative to gravity. To evaluate the calibration rig, a six-component internal balance developed by Iran University of Science and Technology was calibrated using response surface methodology. According to the results, the calibration rig met all design criteria. The rig provides the means by which various formal experimental design techniques can be implemented, and its simplicity saves time and money in the design of experiments and in balance calibration while simultaneously increasing the accuracy of these activities.
ERIC Educational Resources Information Center
Pellas, Nikolaos
2017-01-01
The combination of Open Sim and Scratch4OS can be a worthwhile innovation for introductory programming courses, using a Community of Inquiry (CoI) model as a theoretical instructional design framework. This empirical study had a threefold purpose to present: (a) an instructional design framework for the beneficial formalization of a virtual…
Survey of Verification and Validation Techniques for Small Satellite Software Development
NASA Technical Reports Server (NTRS)
Jacklin, Stephen A.
2015-01-01
The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.
Design of environmental education module towards the needs of aboriginal community learning
NASA Astrophysics Data System (ADS)
Dasman, Siti Mariam; Yasin, Ruhizan Mohammad
2017-05-01
Non-formal education (NFE) refers to programs designed for personal and social education that help learners improve skills and competencies outside the formal educational curriculum. Issues related to the geography and environment of the Aboriginal communities, which differ from those of other communities, play an important role in determining the types and methods of education that should be made available to these minority groups. Thus, this concept paper addresses environmental education through the design and development of learning modules based on non-formal education for the learning of the Aboriginal community. The methods and techniques used in the design and construction of the modules are based on Design and Development Research (DDR) and on the instructional design model of Morrison, Kemp and Ross, which is more flexible and prioritizes the needs and characteristics of the learners who will be involved in the modules. The discussion relates to the development of a module suited to the learning needs of the community, and several recommendations are offered for the implementation of this approach. In conclusion, the Orang Asli community should be offered the same education as other communities, but it is important to adapt the learning techniques and approaches used in the education system so that they are accepted and meet the community's standards. The implication of this concept paper is to meet the needs of environmental education, covering several aspects of science and learning activities that use effective approaches such as play and the construction of one's own meaningful knowledge.
ERIC Educational Resources Information Center
Reynolds, Rebecca; Chiu, Ming Ming
2013-01-01
This paper explored informal (after-school) and formal (elective course in-school) learning contexts as contributors to middle-school student attitudinal changes in a guided discovery-based and blended e-learning program in which students designed web games and used social media and information resources for a full school year. Formality of the…
Formal functional test designs with a test representation language
NASA Technical Reports Server (NTRS)
Hops, J. M.
1993-01-01
The application of the category-partition method to the test design phase of hardware, software, or system test development is discussed. The method provides a formal framework for reducing the total number of possible test cases to a minimum logical subset for effective testing. An automatic tool and a formal language were developed to implement the method and produce the specification of test cases.
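The category-partition reduction is straightforward to mechanize. The toy sketch below (hypothetical categories and constraint; not the cited tool or its test representation language) enumerates the cross-product of partition choices and prunes infeasible test frames, which is exactly the "minimum logical subset" idea.

```python
from itertools import product

# Hypothetical category-partition specification for a "file copy" command:
# each category lists its partitions (choices).
categories = {
    "source":      ["exists", "missing", "no_read_permission"],
    "destination": ["exists", "missing"],
    "size":        ["empty", "small", "huge"],
}

def feasible(frame):
    """Constraints prune combinations that make no sense to test together:
    a missing or unreadable source has no meaningful size but 'empty'."""
    if frame["source"] != "exists" and frame["size"] != "empty":
        return False
    return True

names = list(categories)
frames = [dict(zip(names, choice)) for choice in product(*categories.values())]
tests = [f for f in frames if feasible(f)]
print(len(frames), "raw frames ->", len(tests), "test frames")
for t in tests:
    print(t)
```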
Reactive system verification case study: Fault-tolerant transputer communication
NASA Technical Reports Server (NTRS)
Crane, D. Francis; Hamory, Philip J.
1993-01-01
A reactive program is one which engages in an ongoing interaction with its environment. A system which is controlled by an embedded reactive program is called a reactive system. Examples of reactive systems are aircraft flight management systems, bank automatic teller machine (ATM) networks, airline reservation systems, and computer operating systems. Reactive systems are often naturally modeled (for logical design purposes) as a composition of autonomous processes which progress concurrently and which communicate to share information and/or to coordinate activities. Formal (i.e., mathematical) frameworks for system verification are tools used to increase the users' confidence that a system design satisfies its specification. A framework for reactive system verification includes formal languages for system modeling and for behavior specification and decision procedures and/or proof-systems for verifying that the system model satisfies the system specifications. Using the Ostroff framework for reactive system verification, an approach to achieving fault-tolerant communication between transputers was shown to be effective. The key components of the design, the decoupler processes, may be viewed as discrete-event-controllers introduced to constrain system behavior such that system specifications are satisfied. The Ostroff framework was also effective. The expressiveness of the modeling language permitted construction of a faithful model of the transputer network. The relevant specifications were readily expressed in the specification language. The set of decision procedures provided was adequate to verify the specifications of interest. The need for improved support for system behavior visualization is emphasized.
Formal specification and design techniques for wireless sensor and actuator networks.
Martínez, Diego; González, Apolinar; Blanes, Francisco; Aquino, Raúl; Simo, José; Crespo, Alfons
2011-01-01
A current trend in the development and implementation of industrial applications is to use wireless networks to communicate the system nodes, mainly to increase application flexibility, reliability and portability, as well as to reduce the implementation cost. However, the nondeterministic and concurrent behavior of distributed systems makes their analysis and design complex, often resulting in less than satisfactory performance in simulation and test bed scenarios, which is caused by using imprecise models to analyze, validate and design these systems. Moreover, there are some simulation platforms that do not support these models. This paper presents a design and validation method for Wireless Sensor and Actuator Networks (WSAN) which is supported on a minimal set of wireless components represented in Colored Petri Nets (CPN). In summary, the model presented allows users to verify the design properties and structural behavior of the system.
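To make the Petri-net idea concrete, here is a toy coloured-token game in the same spirit; the sensor-to-sink handshake, guards, and colours below are hypothetical, not the paper's WSAN component models. Running the token game until no transition is enabled exposes structural problems, such as the even-numbered token left stuck in the channel.

```python
import random

# Markings map places to lists of coloured tokens (colours = message ids).
marking = {"ready": [1, 2, 3], "channel": [], "delivered": []}

transitions = [
    # (name, input place, output place, guard on the token colour)
    ("send",    "ready",   "channel",   lambda msg: True),
    ("receive", "channel", "delivered", lambda msg: msg % 2 == 1),  # odd only
]

def enabled(t):
    _, pin, _, guard = t
    return [tok for tok in marking[pin] if guard(tok)]

def fire(t, tok):
    name, pin, pout, _ = t
    marking[pin].remove(tok)
    marking[pout].append(tok)
    print(f"fired {name}({tok}) -> {marking}")

# Play the token game until no transition is enabled.
while True:
    choices = [(t, tok) for t in transitions for tok in enabled(t)]
    if not choices:
        break
    fire(*random.choice(choices))
```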
Final Report - Regulatory Considerations for Adaptive Systems
NASA Technical Reports Server (NTRS)
Wilkinson, Chris; Lynch, Jonathan; Bharadwaj, Raj
2013-01-01
This report documents the findings of a preliminary research study into new approaches to the software design assurance of adaptive systems. We suggest a methodology to overcome the software validation and verification difficulties posed by the underlying assumption of non-adaptive software in the requirements-based testing verification methods in RTCA/DO-178B and C. An analysis of the relevant RTCA/DO-178B and C objectives is presented showing the reasons for the difficulties that arise in showing satisfaction of the objectives and suggested additional means by which they could be satisfied. We suggest that the software design assurance problem for adaptive systems is principally one of developing correct and complete high level requirements and system level constraints that define the necessary system functional and safety properties to assure the safe use of adaptive systems. We show how analytical techniques such as model based design, mathematical modeling and formal or formal-like methods can be used to both validate the high level functional and safety requirements, establish necessary constraints and provide the verification evidence for the satisfaction of requirements and constraints that supplements conventional testing. Finally the report identifies the follow-on research topics needed to implement this methodology.
Tools reference manual for a Requirements Specification Language (RSL), version 2.0
NASA Technical Reports Server (NTRS)
Fisher, Gene L.; Cohen, Gerald C.
1993-01-01
This report describes a general-purpose Requirements Specification Language, RSL. The purpose of RSL is to specify precisely the external structure of a mechanized system and to define requirements that the system must meet. A system can be comprised of a mixture of hardware, software, and human processing elements. RSL is a hybrid of features found in several popular requirements specification languages, such as SADT (Structured Analysis and Design Technique), PSL (Problem Statement Language), and RMF (Requirements Modeling Framework). While languages such as these have useful features for structuring a specification, they generally lack formality. To overcome the deficiencies of informal requirements languages, RSL has constructs for formal mathematical specification. These constructs are similar to those found in formal specification languages such as EHDM (Enhanced Hierarchical Development Methodology), Larch, and OBJ3.
Promoting Post-Formal Thinking in a U.S. History Survey Course: A Problem-Based Approach
ERIC Educational Resources Information Center
Wynn, Charles T.; Mosholder, Richard S.; Larsen, Carolee A.
2016-01-01
This article presents a problem-based learning (PBL) model for teaching a college U.S. history survey course (U.S. history since 1890) designed to promote postformal thinking skills and identify and explain thinking systems inherent in adult complex problem-solving. We also present the results of a study in which the outcomes of the PBL model were…
A formal protocol test procedure for the Survivable Adaptable Fiber Optic Embedded Network (SAFENET)
NASA Astrophysics Data System (ADS)
High, Wayne
1993-03-01
This thesis focuses upon a new method for verifying the correct operation of a complex, high speed fiber optic communication network. These networks are of growing importance to the military because of their increased connectivity, survivability, and reconfigurability. With the introduction of, and increased dependence on, sophisticated software and protocols, it is essential that their operation be correct. Because of the speed and complexity of fiber optic networks being designed today, they are becoming increasingly difficult to test. Previously, testing was accomplished by application of conformance test methods which had little connection with an implementation's specification. The major goal of conformance testing is to ensure that the implementation of a profile is consistent with its specification. Formal specification is needed to ensure that the implementation performs its intended operations while exhibiting desirable behaviors. The new conformance test method presented is based upon the System of Communicating Machines model, which uses a formal protocol specification to generate a test sequence. The major contribution of this thesis is the application of the System of Communicating Machines model to formal profile specifications of the Survivable Adaptable Fiber Optic Embedded Network (SAFENET) standard, which results in the derivation of test sequences for a SAFENET profile. The results of applying this new method to SAFENET's OSI and Lightweight profiles are presented.
The Use of a Citizen Leader Model for Teaching Strategic Leadership
ERIC Educational Resources Information Center
Langone, Christine A.
2004-01-01
Strategic leadership is perhaps the area where undergraduate students have the least experience. Therefore, a focus on developing these skills is critical for college-level leadership educators. Teaching strategic leadership requires that educators design programs that make explicit, direct, and formal links between theory and practical…
ERIC Educational Resources Information Center
Marcum-Dietrich, Nanette
2010-01-01
In the scientific community, the symposium is one formal structure of conversation. Scientists routinely hold symposiums to gather and talk about a common topic. To model this method of communication in the classroom, the author designed an activity in which students conduct their own science symposiums. This article presents the science symposium…
Evaluation of Formal Training Programmes in Greek Organisations
ERIC Educational Resources Information Center
Diamantidis, Anastasios D.; Chatzoglou, Prodromos D.
2012-01-01
Purpose: The purpose of the paper is to highlight the training factors that mostly affect trainees' perception of learning and training usefulness. Design/methodology/approach: A new research model is proposed exploring the relationships between a trainer's performance, training programme components, outcomes of the learning process and training…
Visualization of Learning Scenarios with UML4LD
ERIC Educational Resources Information Center
Laforcade, Pierre
2007-01-01
Present Educational Modelling Languages are used to formally specify abstract learning scenarios in a machine-interpretable format. Current tooling does not provide teachers/designers with some graphical facilities to help them in reusing existent scenarios. They need human-readable representations. This paper discusses the UML4LD experimental…
Overview of a Linguistic Theory of Design. AI Memo 383A.
ERIC Educational Resources Information Center
Miller, Mark L.; Goldstein, Ira P.
The SPADE theory, which uses linguistic formalisms to model the planning and debugging processes of computer programming, was simultaneously developed and tested in three separate contexts--computer uses in education, automatic programming (a traditional artificial intelligence arena), and protocol analysis (the domain of information processing…
Specification and Verification of Web Applications in Rewriting Logic
NASA Astrophysics Data System (ADS)
Alpuente, María; Ballis, Demis; Romero, Daniel
This paper presents a Rewriting Logic framework that formalizes the interactions between Web servers and Web browsers through a communicating protocol abstracting HTTP. The proposed framework includes a scripting language that is powerful enough to model the dynamics of complex Web applications by encompassing the main features of the most popular Web scripting languages (e.g. PHP, ASP, Java Servlets). We also provide a detailed characterization of browser actions (e.g. forward/backward navigation, page refresh, and new window/tab openings) via rewrite rules, and show how our models can be naturally model-checked by using the Linear Temporal Logic of Rewriting (LTLR), which is a Linear Temporal Logic specifically designed for model-checking rewrite theories. Our formalization is particularly suitable for verification purposes, since it allows one to perform in-depth analyses of many subtle aspects related to Web interaction. Finally, the framework has been completely implemented in Maude, and we report on some successful experiments that we conducted by using the Maude LTLR model-checker.
Scoping Planning Agents With Shared Models
NASA Technical Reports Server (NTRS)
Bedrax-Weiss, Tania; Frank, Jeremy D.; Jonsson, Ari K.; McGann, Conor
2003-01-01
In this paper we provide a formal framework to define the scope of planning agents based on a single declarative model. Having multiple agents share a single model provides numerous advantages that lead to reduced development costs and increased reliability of the system. We formally define planning in terms of extensions of an initial partial plan, and a set of flaws that make the plan unacceptable. A Flaw Filter (FF) allows us to identify those flaws relevant to an agent. Flaw filters motivate the Plan Identification Function (PIF), which specifies when an agent is ready to hand control to another agent for further work. PIFs define a set of plan extensions that can be generated from a model and a plan request. FFs and PIFs can be used to define the scope of agents without changing the model. We describe an implementation of PIFs and FFs within the context of EUROPA, a constraint-based planning architecture, and show how it can be used to easily design many different agents.
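The FF/PIF pairing can be sketched in a few lines; the flaw labels and agent scopes below are hypothetical, and EUROPA's actual interfaces are not shown in the abstract.

```python
# Flaws found in the current partial plan (hypothetical labels).
plan_flaws = {"open_goal:comm_window", "unbound:target_orbit", "threat:power_bus"}

def in_scope(flaw, scope):
    """Flaw Filter: keep only the flaws this agent is responsible for."""
    return any(flaw.startswith(tag) for tag in scope)

def plan_identified(flaws, scope):
    """Plan Identification Function: the agent may hand control to the
    next agent once no in-scope flaw remains."""
    return not any(in_scope(f, scope) for f in flaws)

nav_agent_scope = ("open_goal", "unbound")
print(plan_identified(plan_flaws, nav_agent_scope))            # False: work left
print(plan_identified({"threat:power_bus"}, nav_agent_scope))  # True: hand over
```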
ERIC Educational Resources Information Center
Almeida, Joana; Fantini, Alvino E.; Simões, Ana Raquel; Costa, Nilza
2016-01-01
This paper examines how the addition of intercultural interventions carried out throughout European credit-bearing exchange programmes can enhance sojourners' development of intercultural competencies, and it explores how both formal and non-formal pedagogical interventions may be designed and implemented. Such interventions were conducted at a…
Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation
NASA Technical Reports Server (NTRS)
Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael
2011-01-01
Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) is described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS including data assimilation, optimization and uncertainty estimation are supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.
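The accuracy-based portion of such a toolkit reduces to comparisons between simulated and observed series. A minimal stand-in is sketched below on synthetic data; this is not LVT's implementation, metrics list, or file handling.

```python
import numpy as np

def evaluate(model, obs):
    """A few accuracy metrics of the kind a land-model verification
    toolkit computes (illustrative subset only)."""
    err = model - obs
    return {
        "bias": float(err.mean()),
        "rmse": float(np.sqrt((err ** 2).mean())),
        "corr": float(np.corrcoef(model, obs)[0, 1]),
    }

rng = np.random.default_rng(0)
obs = rng.normal(0.25, 0.05, 365)          # e.g., daily soil moisture
model = obs + rng.normal(0.02, 0.03, 365)  # simulation with bias + noise
print(evaluate(model, obs))
```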
Formal Validation of Fault Management Design Solutions
NASA Technical Reports Server (NTRS)
Gibson, Corrina; Karban, Robert; Andolfato, Luigi; Day, John
2013-01-01
The work presented in this paper describes an approach used to develop SysML modeling patterns to express the behavior of fault protection, test the model's logic by performing fault injection simulations, and verify the fault protection system's logical design via model checking. A representative example, using a subset of the fault protection design for the Soil Moisture Active-Passive (SMAP) system, was modeled with SysML State Machines and JavaScript as Action Language. The SysML model captures interactions between relevant system components and system behavior abstractions (mode managers, error monitors, fault protection engine, and devices/switches). Development of a method to implement verifiable and lightweight executable fault protection models enables future missions to have access to larger fault test domains and verifiable design patterns. A tool-chain to transform the SysML model to jpf-Statechart compliant Java code and then verify the generated code via model checking was established. Conclusions and lessons learned from this work are also described, as well as potential avenues for further research and development.
Near-optimal experimental design for model selection in systems biology.
Busetto, Alberto Giovanni; Hauser, Alain; Krummenacher, Gabriel; Sunnåker, Mikael; Dimopoulos, Sotiris; Ong, Cheng Soon; Stelling, Jörg; Buhmann, Joachim M
2013-10-15
Biological systems are understood through iterations of modeling and experimentation. Not all experiments, however, are equally valuable for predictive modeling. This study introduces an efficient method for experimental design aimed at selecting dynamical models from data. Motivated by biological applications, the method enables the design of crucial experiments: it determines a highly informative selection of measurement readouts and time points. We demonstrate formal guarantees of design efficiency on the basis of previous results. By reducing our task to the setting of graphical models, we prove that the method finds a near-optimal design selection with a polynomial number of evaluations. Moreover, the method exhibits the best polynomial-complexity constant approximation factor, unless P = NP. We measure the performance of the method in comparison with established alternatives, such as ensemble non-centrality, on example models of different complexity. Efficient design accelerates the loop between modeling and experimentation: it enables the inference of complex mechanisms, such as those controlling central metabolic operation. Toolbox 'NearOED' available with source code under GPL on the Machine Learning Open Source Software Web site (mloss.org).
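Guarantees of this near-optimal flavor typically come from greedy maximization of a monotone submodular utility, which attains the classic (1 - 1/e) factor. The sketch below shows that selection pattern on toy data; it is offered in the spirit of the abstract, not as the paper's objective or the NearOED toolbox.

```python
# Candidate measurement readouts and the model-discriminating "features"
# each covers (hypothetical data).
candidates = {
    "readout_A": {1, 2, 3},
    "readout_B": {3, 4},
    "readout_C": {4, 5, 6, 7},
    "readout_D": {1, 7},
}

def coverage(selection):
    covered = set()
    for r in selection:
        covered |= candidates[r]
    return len(covered)

def greedy(k):
    """Pick k readouts, each step adding the largest marginal gain."""
    chosen = []
    for _ in range(k):
        best = max((r for r in candidates if r not in chosen),
                   key=lambda r: coverage(chosen + [r]))
        chosen.append(best)
    return chosen

design = greedy(2)
print(design, coverage(design))  # e.g., ['readout_C', 'readout_A'] covering 7
```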
A logical foundation for representation of clinical data.
Campbell, K E; Das, A K; Musen, M A
1994-01-01
OBJECTIVE: A general framework for representation of clinical data that provides a declarative semantics of terms and that allows developers to define explicitly the relationships among both terms and combinations of terms. DESIGN: Use of conceptual graphs as a standard representation of logic and of an existing standardized vocabulary, the Systematized Nomenclature of Medicine (SNOMED International), for lexical elements. Concepts such as time, anatomy, and uncertainty must be modeled explicitly in a way that allows relation of these foundational concepts to surface-level clinical descriptions in a uniform manner. RESULTS: The proposed framework was used to model a simple radiology report, which included temporal references. CONCLUSION: Formal logic provides a framework for formalizing the representation of medical concepts. Actual implementations will be required to evaluate the practicality of this approach. PMID:7719805
DOE Office of Scientific and Technical Information (OSTI.GOV)
Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.
Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems; a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.
Towards Time Automata and Multi-Agent Systems
NASA Technical Reports Server (NTRS)
Hutzler, G.; Klaudel, H.; Wang, D. Y.
2004-01-01
The design of reactive systems must comply with logical correctness (the system does what it is supposed to do) and timeliness (the system has to satisfy a set of temporal constraints) criteria. In this paper, we propose a global approach for the design of adaptive reactive systems, i.e., systems that dynamically adapt their architecture depending on the context. We use the timed automata formalism for the design of the agents' behavior. This allows evaluating beforehand the properties of the system (regarding logical correctness and timeliness), thanks to model-checking and simulation techniques. This model is enhanced with tools that we developed for the automatic generation of code, allowing to produce very quickly a running multi-agent prototype satisfying the properties of the model.
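A timed automaton's core ingredients (locations, a clock, guards, resets, invariants) fit in a short simulation sketch. The watchdog below is illustrative only and unrelated to the paper's agent models.

```python
class TimedAutomaton:
    """Toy timed automaton: a watchdog must receive "ping" within 5 time
    units of the last ping while in "monitoring", else it moves to "alarm"."""
    def __init__(self):
        self.loc, self.clock = "monitoring", 0.0

    def delay(self, d):
        self.clock += d
        if self.loc == "monitoring" and self.clock > 5.0:
            self.loc = "alarm"       # invariant violated: take the alarm edge

    def ping(self):
        if self.loc == "monitoring" and self.clock <= 5.0:
            self.clock = 0.0         # the ping edge resets the clock

ta = TimedAutomaton()
ta.delay(3.0); ta.ping(); ta.delay(4.0)
print(ta.loc)   # still "monitoring": the ping reset the clock in time
ta.delay(2.0)
print(ta.loc)   # "alarm": more than 5 units since the last ping
```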
Research in DRM architecture based on watermarking and PKI
NASA Astrophysics Data System (ADS)
Liu, Ligang; Chen, Xiaosu; Xiao, Dao-ju; Yi, Miao
2005-02-01
We analyze the virtues and disadvantages of present digital copyright protection systems and design a security protocol model for digital copyright protection that balances the validity of use, integrity, and transmission security of the digital media with the fairness of the trade. A detailed formal description of the protocol model is given, and the relationships among the entities involved in digital work copyright protection are analyzed. Analysis of the security and capability of the protocol model shows that it is strong in both security and practicability.
NASA Technical Reports Server (NTRS)
Denning, Peter J.
1991-01-01
The ongoing debate over the role of formalism and formal specifications in software features many speakers with diverse positions. Yet, in the end, they share the conviction that the requirements of a software system can be unambiguously specified, that acceptable software is a product demonstrably meeting the specifications, and that the design process can be carried out with little interaction between designers and users once the specification has been agreed to. This conviction is part of a larger paradigm prevalent in American management thinking, which holds that organizations are systems that can be precisely specified and optimized. This paradigm, which traces historically to the works of Frederick Taylor in the early 1900s, is no longer sufficient for organizations and software systems today. In the domain of software, a new paradigm, called user-centered design, overcomes the limitations of pure formalism. Pioneered in Scandinavia, user-centered design is spreading through Europe and is beginning to make its way into the U.S.
A Generalized Quantum-Inspired Decision Making Model for Intelligent Agent
Loo, Chu Kiong
2014-01-01
A novel decision-making model for intelligent agents using a quantum-inspired approach is proposed. A formal, generalized solution to the problem is given. Mathematically, the proposed model is capable of modeling higher-dimensional decision problems than previous research. Four experiments are conducted, and both the empirical results and the proposed model's results are given for each experiment. The experiments showed that the results of the proposed model agree with the empirical results perfectly. The proposed model provides a new direction for researchers addressing the cognitive basis of intelligent agent design. PMID:24778580
A Fractional Cartesian Composition Model for Semi-Spatial Comparative Visualization Design.
Kolesar, Ivan; Bruckner, Stefan; Viola, Ivan; Hauser, Helwig
2017-01-01
The study of spatial data ensembles leads to substantial visualization challenges in a variety of applications. In this paper, we present a model for comparative visualization that supports the design of corresponding ensemble visualization solutions by partial automation. We focus on applications where the user is interested in preserving selected spatial characteristics of the data as much as possible, even when many ensemble members should be jointly studied using comparative visualization. In our model, we separate the design challenge into a minimal set of user-specified parameters and an optimization component for the automatic configuration of the remaining design variables. We provide an illustrated formal description of our model and exemplify our approach in the context of several application examples from different domains in order to demonstrate its generality within the class of comparative visualization problems for spatial data ensembles.
Using stochastic activity networks to study the energy feasibility of automatic weather stations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cassano, Luca; Cesarini, Daniel; Avvenuti, Marco
Automatic Weather Stations (AWSs) are systems equipped with a number of environmental sensors and communication interfaces used to monitor harsh environments, such as glaciers and deserts. Designing such systems is challenging, since designers have to maximize the amount of sampled and transmitted data while considering the energy needs of the system, which, in most cases, is powered by rechargeable batteries and exploits energy harvesting, e.g., solar cells and wind turbines. To support designers of AWSs in the definition of the software tasks and of the hardware configuration of the AWS, we designed and implemented an energy-aware simulator of such systems. The simulator relies on the Stochastic Activity Networks (SANs) formalism and has been developed using the Möbius tool. In this paper we first show how we used the SAN formalism to model the various components of an AWS, we then report results from an experiment carried out to validate the simulator against a real-world AWS, and we finally show some examples of usage of the proposed simulator.
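The feasibility question the SAN model answers can be approximated with a plain Monte Carlo sketch. All capacities, rates, and duty cycles below are hypothetical, and this is a stand-in for, not a reproduction of, the Möbius/SAN model.

```python
import random

CAP_WH, DAYS, TRIALS = 20.0, 60, 2000

def harvest():                 # daily solar input; cloudy days yield less (Wh)
    return random.choice([12.0, 10.0, 2.0])

def load(day):
    base = 6.0                 # sensor sampling, Wh per day
    tx = 3.0 if day % 2 == 0 else 0.0   # transmit every other day
    return base + tx

def survives():
    battery = CAP_WH
    for day in range(DAYS):
        battery = min(CAP_WH, battery + harvest() - load(day))
        if battery <= 0.0:
            return False       # battery exhausted: station dies
    return True

ok = sum(survives() for _ in range(TRIALS))
print(f"estimated survival probability over {DAYS} days: {ok / TRIALS:.3f}")
```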
Specification and verification of gate-level VHDL models of synchronous and asynchronous circuits
NASA Technical Reports Server (NTRS)
Russinoff, David M.
1995-01-01
We present a mathematical definition of hardware description language (HDL) that admits a semantics-preserving translation to a subset of VHDL. Our HDL includes the basic VHDL propagation delay mechanisms and gate-level circuit descriptions. We also develop formal procedures for deriving and verifying concise behavioral specifications of combinational and sequential devices. The HDL and the specification procedures have been formally encoded in the computational logic of Boyer and Moore, which provides a LISP implementation as well as a facility for mechanical proof-checking. As an application, we design, specify, and verify a circuit that achieves asynchronous communication by means of the biphase mark protocol.
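The biphase mark encoding itself is simple to state in software, which complements the paper's gate-level treatment. A hedged sketch follows, using one common convention: a level transition at every bit-cell boundary, plus an extra mid-cell transition for a 1.

```python
def bmc_encode(bits, level=0):
    """Biphase-mark encode a bit list into half-cell signal levels."""
    out = []
    for b in bits:
        level ^= 1           # transition at the cell boundary
        out.append(level)
        if b == 1:
            level ^= 1       # extra mid-cell transition encodes a 1
        out.append(level)
    return out

def bmc_decode(halves):
    """Decode pairs of half-cells: differing halves -> 1, equal -> 0."""
    return [1 if halves[i] != halves[i + 1] else 0
            for i in range(0, len(halves), 2)]

msg = [1, 0, 1, 1, 0]
signal = bmc_encode(msg)
assert bmc_decode(signal) == msg
print(msg, "->", signal)
```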
Generating Alternative Proposals for the Louvre Using Procedural Modeling
NASA Astrophysics Data System (ADS)
Calogero, E.; Arnold, D.
2011-09-01
This paper presents the process of reconstructing two facade designs for the East wing of the Louvre using procedural modeling. The first proposal reconstructed is Louis Le Vau's 1662 scheme and the second is the 1668 design of the "petit conseil" that still stands today. The initial results presented show how such reconstructions may aid general and expert understanding of the two designs. It is claimed that by formalizing the facade description into a shape grammar in CityEngine, a systematized approach to a stylistic analysis is possible. It is also asserted that such an analysis is still best understood in the historical context of what is known about the contemporary design intentions of the building creators and commissioners.
NASA Technical Reports Server (NTRS)
Bolton, Matthew L.; Bass, Ellen J.
2009-01-01
Both the human factors engineering (HFE) and formal methods communities are concerned with finding and eliminating problems with safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to use model checking with HFE practices to perform formal verification of a human-interactive system. Despite the use of a seemingly simple target system, a patient controlled analgesia pump, the initial model proved to be difficult for the model checker to verify in a reasonable amount of time. This resulted in a number of model revisions that affected the HFE architectural, representativeness, and understandability goals of the effort. If formal methods are to meet the needs of the HFE community, additional modeling tools and technological developments are necessary.
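A minimal sketch of what explicit-state verification of such an interface looks like, on a hypothetical toy pump model (the paper's PCA pump model is far richer): breadth-first reachability returns a counterexample trace when a safety property fails.

```python
from collections import deque

# Toy states: (mode, dose). Property: dose is never edited while infusing.
def next_states(state):
    mode, dose = state
    if mode == "idle":
        yield ("programming", dose)
        yield ("infusing", dose)
    elif mode == "programming":
        yield ("programming", dose + 1)   # operator increments dose
        yield ("idle", dose)
    elif mode == "infusing":
        yield ("programming", dose)       # seeded bug: edit while infusing
        yield ("idle", 0)

def check(init, safe, bound=1000):
    seen, frontier = {init}, deque([(init, [init])])
    while frontier:
        state, path = frontier.popleft()
        for nxt in next_states(state):
            if not safe(state, nxt):
                return path + [nxt]       # counterexample trace
            if nxt not in seen and len(seen) < bound:
                seen.add(nxt)
                frontier.append((nxt, path + [nxt]))
    return None

bad = check(("idle", 0),
            lambda s, t: not (s[0] == "infusing" and t[0] == "programming"))
print("counterexample:", bad)
```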
Formal Specification and Design Techniques for Wireless Sensor and Actuator Networks
Martínez, Diego; González, Apolinar; Blanes, Francisco; Aquino, Raúl; Simo, José; Crespo, Alfons
2011-01-01
A current trend in the development and implementation of industrial applications is to use wireless networks to connect the system nodes, mainly to increase application flexibility, reliability and portability, as well as to reduce the implementation cost. However, the nondeterministic and concurrent behavior of distributed systems makes their analysis and design complex, often resulting in less than satisfactory performance in simulation and test bed scenarios; this is caused by using imprecise models to analyze, validate and design these systems. Moreover, some simulation platforms do not support these models. This paper presents a design and validation method for Wireless Sensor and Actuator Networks (WSAN) which is supported by a minimal set of wireless components represented in Colored Petri Nets (CPN). In summary, the model presented allows users to verify the design properties and structural behavior of the system. PMID:22344203
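For readers unfamiliar with CPNs, a minimal sketch of a colored-token transition with a guard (illustrative only; the paper's WSAN component library is more elaborate):

```python
from collections import Counter

# Places hold multisets of colored tokens (node id, sensor type).
places = {"ready": Counter({("node1", "temp"): 1, ("node2", "hum"): 1}),
          "sent":  Counter()}

def fire_send(node):
    """Transition 'send': move one token for `node` from ready to sent."""
    for token in list(places["ready"]):
        if token[0] == node and places["ready"][token] > 0:   # guard on color
            places["ready"][token] -= 1
            places["sent"][token] += 1
            return token
    return None                                               # not enabled

print(fire_send("node1"), dict(places["sent"]))
```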
Hypersonic Wind Tunnel Calibration Using the Modern Design of Experiments
NASA Technical Reports Server (NTRS)
Rhode, Matthew N.; DeLoach, Richard
2005-01-01
A calibration of a hypersonic wind tunnel has been conducted using formal experiment design techniques and response surface modeling. Data from a compact, highly efficient experiment were used to create a regression model of the pitot pressure as a function of the facility operating conditions as well as the longitudinal location within the test section. The new calibration utilized far fewer design points than prior experiments, but covered a wider range of the facility's operating envelope while revealing interactions between factors not captured in previous calibrations. A series of points chosen randomly within the design space was used to verify the accuracy of the response model. The development of the experiment design is discussed along with tactics used in the execution of the experiment to defend against systematic variation in the results. Trends in the data are illustrated, and comparisons are made to earlier findings.
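A minimal sketch of the response-surface idea, assuming synthetic data and two coded factors standing in for the facility conditions (not the tunnel's calibration data): fit a full second-order polynomial by least squares and read off main effects and interactions.

```python
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.uniform(-1, 1, 40)        # coded factor levels, e.g. reservoir pressure
x2 = rng.uniform(-1, 1, 40)        # e.g. axial station in the test section
y = 5 + 2*x1 - 0.5*x2 + 0.8*x1*x2 + 0.3*x1**2 + rng.normal(0, 0.05, 40)

# Design matrix for a full second-order response surface model.
X = np.column_stack([np.ones_like(x1), x1, x2, x1*x2, x1**2, x2**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("coefficients:", np.round(beta, 3))   # recovers the terms above
```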
Suka, Machi; Yamauchi, Takashi; Sugimori, Hiroki
2015-01-01
Objective: Encouraging help-seeking for mental illness is essential for prevention of suicide. This study examined the relationship between individual characteristics, neighbourhood contexts and help-seeking intentions for mental illness for the purpose of elucidating the role of neighbourhood in the help-seeking process. Design, setting and participants: A cross-sectional web-based survey was conducted among Japanese adults aged 20–59 years in June 2014. Eligible respondents who did not have a serious health condition were included in this study (n=3308). Main outcome measures: Participants were asked how likely they would be to seek help from someone close to them (informal help) and medical professionals (formal help), respectively, if they were suffering from serious mental illness. Path analysis with structural equation modelling was performed to represent plausible connections between individual characteristics, neighbourhood contexts, and informal and formal help-seeking intentions. Results: The acceptable fitting model indicated that those who had a tendency to consult about everyday affairs were significantly more likely to express an informal help-seeking intention that was directly associated with a formal help-seeking intention. Those living in a communicative neighbourhood, where neighbours say hello whenever they pass each other, were significantly more likely to express informal and formal help-seeking intentions. Those living in a supportive neighbourhood, where neighbours work together to solve neighbourhood problems, were significantly more likely to express an informal help-seeking intention. Adequate health literacy was directly associated with informal and formal help-seeking intentions, along with having an indirect effect on the formal help-seeking intention through developed positive perception of professional help. Conclusions: The results of this study bear out the hypothesis that neighbourhood context contributes to help-seeking intentions for mental illness. Living in a neighbourhood with a communicative atmosphere and having adequate health literacy were acknowledged as possible facilitating factors for informal and formal help-seeking for mental illness. PMID:26264273
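As a rough illustration of the path-analytic logic (not the paper's fitted model; the data and coefficients below are synthetic), the indirect effect of neighbourhood communicativeness on formal help-seeking can be computed as the product of two regression paths:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3308
neigh = rng.normal(size=n)                     # communicative neighbourhood
informal = 0.4 * neigh + rng.normal(size=n)    # informal help-seeking intention
formal = 0.5 * informal + 0.1 * neigh + rng.normal(size=n)

def ols(y, *xs):
    """Return slope coefficients (intercept dropped) from least squares."""
    X = np.column_stack([np.ones(len(y)), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

a = ols(informal, neigh)[0]                    # path: neigh -> informal
b, c = ols(formal, informal, neigh)            # paths: informal -> formal, direct
print(f"indirect effect a*b = {a*b:.3f}, direct effect c = {c:.3f}")
```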
Designing and Instructing Hybrid Open Learning Spaces Model to Support Lifelong Learning Engagement
ERIC Educational Resources Information Center
Crawford, Caroline M.
2016-01-01
With a focus upon open and lifelong learning understanding, the real world delineation between formalized higher education graduate school efforts and professional career position lines may be suggested as being blurred. This case study offers an analysis of one university instructor's efforts towards developing hybrid learning spaces that…
ERIC Educational Resources Information Center
Schultz, Louise, Ed.
The 31 papers in this proceedings cover social as well as technical issues, mathematical models and formal logic systems, applications descriptions, program design, cost analysis, and predictions. The papers are grouped into sessions including: privacy and information technology, describing documents, information dissemination systems, public and…
The Evolution of SCORM to Tin Can API: Implications for Instructional Design
ERIC Educational Resources Information Center
Lindert, Lisa; Su, Bude
2016-01-01
Integrating and documenting formal and informal learning experiences is challenging using the current Shareable Content Object Reference Model (SCORM) eLearning standard, which limits the media and data that are obtained from eLearning. In response to SCORM's limitations, corporate, military, and academic institutions have collaborated to develop…
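For context, a sketch of the kind of statement the Tin Can API (xAPI) records, which SCORM cannot express for informal learning; the actor, activity IRI, and endpoint mentioned below are hypothetical placeholders:

```python
import json

# A minimal xAPI "actor-verb-object" statement describing an
# informal learning experience outside any SCORM package.
statement = {
    "actor": {"mbox": "mailto:learner@example.org", "name": "Learner"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
             "display": {"en-US": "completed"}},
    "object": {"id": "http://example.org/activities/field-observation",
               "definition": {"name": {"en-US": "Informal field observation"}}},
}
print(json.dumps(statement, indent=2))
# A real client would POST this JSON to a Learning Record Store.
```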
Intelligent Tutoring Systems: Formalization as Automata and Interface Design Using Neural Networks
ERIC Educational Resources Information Center
Curilem, S. Gloria; Barbosa, Andrea R.; de Azevedo, Fernando M.
2007-01-01
This article proposes a mathematical model of Intelligent Tutoring Systems (ITS), based on observations of the behaviour of these systems. One of the most important problems of pedagogical software is to establish a common language between the knowledge areas involved in their development, basically pedagogical, computing and domain areas. A…
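A toy rendering of the automaton view of an ITS (states and transitions invented for illustration, not the article's formal model):

```python
# Tutoring states driven by learner events, as a finite automaton.
TRANSITIONS = {
    ("present", "correct"):     "advance",
    ("present", "incorrect"):   "remediate",
    ("remediate", "correct"):   "advance",
    ("remediate", "incorrect"): "remediate",
    ("advance", "next"):        "present",
}

def run(events, state="present"):
    for e in events:
        state = TRANSITIONS.get((state, e), state)
        print(f"event={e:9s} -> state={state}")
    return state

run(["incorrect", "incorrect", "correct", "next"])
```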
From ePortfolios to iPortfolios: The Find, Refine, Design, and Bind Model
ERIC Educational Resources Information Center
Foti, Sebastian; Ring, Gail L.
2008-01-01
During the past two decades, educational institutions around the world began formalizing the process of collecting student work as a means of showcasing student accomplishments and ultimately providing students a forum for reflecting on their accomplishments. In this article, the authors propose a redefinition of the electronic portfolio…
ERIC Educational Resources Information Center
Cretin, Shan; Shortell, Stephen M.; Keeler, Emmett B.
2004-01-01
The authors' dual-purpose evaluation assesses the effectiveness of formal collaboratives in stimulating organizational changes to improve chronic illness care (the chronic care model or CCM). Intervention and comparison sites are compared before and after introduction of the CCM. Multiple data sources are used to measure the degree of…
Supply Chain Development: Insights from Strategic Niche Management
ERIC Educational Resources Information Center
Caniels, Marjolein C. J.; Romijn, Henny A.
2008-01-01
Purpose: The purpose of this paper is to contribute to the study of supply chain design from the perspective of complex dynamic systems. Unlike extant studies that use formal simulation modelling and associated methodologies rooted in the physical sciences, it adopts a framework rooted in the social sciences, strategic niche management, which…
Reengineering Framework for Systems in Education
ERIC Educational Resources Information Center
Choquet, Christophe; Corbiere, Alain
2006-01-01
Specifications recently proposed as standards in the domain of Technology Enhanced Learning (TEL) question the designers of TEL systems on how to put them into practice. Recent studies in Model Driven Engineering have highlighted the need for a framework which could formalize the use of these specifications as well as enhance the quality of the…
Qualitative mechanism models and the rationalization of procedures
NASA Technical Reports Server (NTRS)
Farley, Arthur M.
1989-01-01
A qualitative, cluster-based approach to the representation of hydraulic systems is described and its potential for generating and explaining procedures is demonstrated. Many ideas are formalized and implemented as part of an interactive, computer-based system. The system allows for designing, displaying, and reasoning about hydraulic systems. The interactive system has an interface consisting of three windows: a design/control window, a cluster window, and a diagnosis/plan window. A qualitative mechanism model for the ORS (Orbital Refueling System) is presented to coordinate with ongoing research on this system being conducted at NASA Ames Research Center.
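A minimal sketch of qualitative value propagation over a component topology, loosely in the spirit of the cluster-based representation (components and values are illustrative, not the ORS model):

```python
# Qualitative flow values {"+", "0"} propagate from a pump through
# components; a closed valve blocks downstream flow.
topology = {"pump": ["valve1"], "valve1": ["tank"], "tank": []}
state = {"pump": "+", "valve1": "0", "tank": "0"}
valve_open = {"valve1": True}

def propagate():
    changed = True
    while changed:
        changed = False
        for comp, outs in topology.items():
            for out in outs:
                # flow passes a valve only when it is open
                blocked = not valve_open.get(comp, True)
                new = "0" if blocked else state[comp]
                if state[out] != new:
                    state[out] = new
                    changed = True

propagate()
print(state)   # {'pump': '+', 'valve1': '+', 'tank': '+'}
```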
Statechart-based design controllers for FPGA partial reconfiguration
NASA Astrophysics Data System (ADS)
Łabiak, Grzegorz; Wegrzyn, Marek; Rosado Muñoz, Alfredo
2015-09-01
Statechart diagrams and the UML technique can be a vital part of early conceptual modeling. At present there is not much support in hardware design methodologies for the reconfiguration features of reprogrammable devices. The authors try to bridge the gap between an imprecise UML model and a formal HDL description. The key concept in the authors' proposal is to describe the behavior of the digital controller by statechart diagrams and to map parts of the behavior onto reprogrammable logic by means of groups of states which form a sequential automaton. The whole process is illustrated by an example with experimental results.
How External Institutions Penetrate Schools through Formal and Informal Leaders
ERIC Educational Resources Information Center
Sun, Min; Frank, Kenneth A.; Penuel, William R.; Kim, Chong Min
2013-01-01
Purposes: This study investigates the role of formal and informal leaders in the diffusion of external reforms into schools and to teachers' practices. Formal leaders are designated by their roles in the formal organization of the school (e.g., principals, department chairs, and instructional coaches) and informal leaders refer to those who do not…
ERIC Educational Resources Information Center
Johnson, Christopher W.
1996-01-01
The development of safety-critical systems (aircraft cockpits and reactor control rooms) is qualitatively different from that of other interactive systems. These differences impose burdens on design teams that must ensure the development of human-machine interfaces. Analyzes strengths and weaknesses of formal methods for the design of user…
Beware the tail that wags the dog: informal and formal models in biology
Gunawardena, Jeremy
2014-01-01
Informal models have always been used in biology to guide thinking and devise experiments. In recent years, formal mathematical models have also been widely introduced. It is sometimes suggested that formal models are inherently superior to informal ones and that biology should develop along the lines of physics or economics by replacing the latter with the former. Here I suggest to the contrary that progress in biology requires a better integration of the formal with the informal. PMID:25368417
Impact of Jovian radiation environmental hazard on spacecraft and mission development design
NASA Technical Reports Server (NTRS)
Divita, E.
1972-01-01
The environmental impact on the TOPS 12L configuration is discussed. The activities in system environmental design and testing are described, and radiation design restraints based on the upper limit model are given. Range energy cutoffs in aluminum are also presented and the effective shielding thicknesses for electrons and protons of different energies are included. Design integration problems and radiation testing aspects are considered. Data are given for selecting the parts which should be tested in a formal test program, and the piece-part radiation thresholds are tabulated for electrons and protons.
An approach to verification and validation of a reliable multicasting protocol: Extended Abstract
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd L.
1995-01-01
This paper describes the process of implementing a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally using a combination of formal and informal techniques in an attempt to ensure the correctness of its implementation. Our development process involved three concurrent activities: (1) the initial construction and incremental enhancement of a formal state model of the protocol machine; (2) the initial coding and incremental enhancement of the implementation; and (3) model-based testing of iterative implementations of the protocol. These activities were carried out by two separate teams: a design team and a V&V team. The design team built the first version of RMP with limited functionality to handle only nominal requirements of data delivery. This initial version did not handle off-nominal cases such as network partitions or site failures. Meanwhile, the V&V team concurrently developed a formal model of the requirements using a variant of SCR-based state tables. Based on these requirements tables, the V&V team developed test cases to exercise the implementation. In a series of iterative steps, the design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation. This was done by generating test cases based on suspected errant or off-nominal behaviors predicted by the current model. If the execution of a test in the model and implementation agreed, then the test either found a potential problem or verified a required behavior. However, if the execution of a test was different in the model and implementation, then the differences helped identify inconsistencies between the model and implementation. In either case, the dialogue between both teams drove the co-evolution of the model and implementation. We have found that this interactive, iterative approach to development allows software designers to focus on delivery of nominal functionality while the V&V team can focus on analysis of off nominal cases. Testing serves as the vehicle for keeping the model and implementation in fidelity with each other. This paper describes (1) our experiences in developing our process model; and (2) three example problems found during the development of RMP. Although RMP has provided our research effort with a rich set of test cases, it also has practical applications within NASA. For example, RMP is being considered for use in the NASA EOSDIS project due to its significant performance benefits in applications that need to replicate large amounts of data to many network sites.
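A schematic of the model-vs-implementation testing loop described above, with a toy protocol fragment and a seeded bug standing in for RMP (all names and transitions are hypothetical):

```python
# The V&V team's state model as a transition table.
def model_step(state, event):
    table = {("closed", "join"): "member",
             ("member", "send"): "member",
             ("member", "leave"): "closed"}
    return table.get((state, event), state)

# The design team's implementation, with a seeded defect.
class Impl:
    def __init__(self):
        self.state = "closed"
    def step(self, event):
        if event == "join":
            self.state = "member"
        # BUG: 'leave' is silently ignored
        return self.state

def compare(test):
    """Run a test sequence on both and flag the first divergence."""
    m, impl = "closed", Impl()
    for e in test:
        m, i = model_step(m, e), impl.step(e)
        if m != i:
            return f"divergence after {e!r}: model={m} impl={i}"
    return "agreed"

print(compare(["join", "send", "leave"]))
```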
Automated Verification of Design Patterns with LePUS3
NASA Technical Reports Server (NTRS)
Nicholson, Jonathan; Gasparis, Epameinondas; Eden, Ammon H.; Kazman, Rick
2009-01-01
Specification and [visual] modelling languages are expected to combine strong abstraction mechanisms with rigour, scalability, and parsimony. LePUS3 is a visual, object-oriented design description language axiomatized in a decidable subset of first-order predicate logic. We demonstrate how LePUS3 is used to formally specify a structural design pattern and prove (verify) whether any Java 1.4 program satisfies that specification. We also show how LePUS3 specifications (charts) are composed and how they are verified fully automatically in the Two-Tier Programming Toolkit.
Proceedings of the Second NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Munoz, Cesar (Editor)
2010-01-01
This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.
Algorithmic Completeness and Complexity of Microprograms
Golunkov
1978-01-17
Kiev KIBERNETIKA in Russian No 3, May/Jun 77, pp 1-15, manuscript received 22 Dec 76. Only fragments of the abstract survive extraction; they mention a formal mathematical approach to designing computers (with computers themselves widely used in designing others) and a model involving capital, labor resources, and the funds of consumers, whose analysis indicates the present average complexity of production.
The evaluative imaging of mental models - Visual representations of complexity
NASA Technical Reports Server (NTRS)
Dede, Christopher
1989-01-01
The paper deals with some design issues involved in building a system that could visually represent the semantic structures of training materials and their underlying mental models. In particular, hypermedia-based semantic networks that instantiate classification problem solving strategies are thought to be a useful formalism for such representations; the complexity of these web structures can be best managed through visual depictions. It is also noted that a useful approach to implement in these hypermedia models would be some metrics of conceptual distance.
State of the art in pathology business process analysis, modeling, design and optimization.
Schrader, Thomas; Blobel, Bernd; García-Rojo, Marcial; Daniel, Christel; Słodkowska, Janina
2012-01-01
Modeling business processes in a pathology department is indispensable for analyzing and improving current workflows, for quality management and quality assurance, for integrating hardware and software components, and for education, training and communication between experts from different domains. The authors highlight three main processes in pathology: general diagnostic, cytology diagnostic, and autopsy. In this chapter, those processes are formally modeled and described in detail. Finally, specialized processes such as immunohistochemistry and frozen section are considered.
NASA Astrophysics Data System (ADS)
Jones, Chris D.; Arora, Vivek; Friedlingstein, Pierre; Bopp, Laurent; Brovkin, Victor; Dunne, John; Graven, Heather; Hoffman, Forrest; Ilyina, Tatiana; John, Jasmin G.; Jung, Martin; Kawamiya, Michio; Koven, Charlie; Pongratz, Julia; Raddatz, Thomas; Randerson, James T.; Zaehle, Sönke
2016-08-01
Coordinated experimental design and implementation has become a cornerstone of global climate modelling. Model Intercomparison Projects (MIPs) enable systematic and robust analysis of results across many models, by reducing the influence of ad hoc differences in model set-up or experimental boundary conditions. As it enters its 6th phase, the Coupled Model Intercomparison Project (CMIP6) has grown significantly in scope with the design and documentation of individual simulations delegated to individual climate science communities. The Coupled Climate-Carbon Cycle Model Intercomparison Project (C4MIP) takes responsibility for design, documentation, and analysis of carbon cycle feedbacks and interactions in climate simulations. These feedbacks are potentially large and play a leading-order contribution in determining the atmospheric composition in response to human emissions of CO2 and in the setting of emissions targets to stabilize climate or avoid dangerous climate change. For over a decade, C4MIP has coordinated coupled climate-carbon cycle simulations, and in this paper we describe the C4MIP simulations that will be formally part of CMIP6. While the climate-carbon cycle community has created this experimental design, the simulations also fit within the wider CMIP activity, conform to some common standards including documentation and diagnostic requests, and are designed to complement the CMIP core experiments known as the Diagnostic, Evaluation and Characterization of Klima (DECK). C4MIP has three key strands of scientific motivation and the requested simulations are designed to satisfy their needs: (1) pre-industrial and historical simulations (formally part of the common set of CMIP6 experiments) to enable model evaluation, (2) idealized coupled and partially coupled simulations with 1 % per year increases in CO2 to enable diagnosis of feedback strength and its components, (3) future scenario simulations to project how the Earth system will respond to anthropogenic activity over the 21st century and beyond. This paper documents in detail these simulations, explains their rationale and planned analysis, and describes how to set up and run the simulations. Particular attention is paid to boundary conditions, input data, and requested output diagnostics. It is important that modelling groups participating in C4MIP adhere as closely as possible to this experimental design.
Design Methods and Practices for Fault Prevention and Management in Spacecraft
NASA Technical Reports Server (NTRS)
Tumer, Irem Y.
2005-01-01
Integrated Systems Health Management (ISHM) is intended to become a critical capability for all space, lunar and planetary exploration vehicles and systems at NASA. Monitoring and managing the health state of diverse components, subsystems, and systems is a difficult task that will become more challenging when implemented for long-term, evolving deployments. A key technical challenge will be to ensure that the ISHM technologies are reliable, effective, and low cost, resulting in turn in safe, reliable, and affordable missions. To ensure safety and reliability, ISHM functionality, decisions and knowledge have to be incorporated into the product lifecycle as early as possible, and ISHM must be considered as an essential element of models developed and used in various stages during system design. During early-stage design, many decisions and tasks are still open, including sensor and measurement point selection, modeling and model-checking, diagnosis, signature and data fusion schemes, presenting the best opportunity to catch and prevent potential failures and anomalies in a cost-effective way. Using appropriate formal methods during early design, the design teams can systematically explore risks without committing to design decisions too early. However, the nature of ISHM knowledge and data is detailed, relying on high-fidelity, detailed models, whereas the earlier stages of the product lifecycle utilize low-fidelity, high-level models of systems and their functionality. We currently lack the tools and processes necessary for integrating ISHM into the vehicle system/subsystem design. As a result, most existing ISHM-like technologies are retrofits that were done after the system design was completed. It is very expensive, and sometimes futile, to retrofit a system health management capability into existing systems. Last-minute retrofits result in unreliable systems, ineffective solutions, and excessive costs (e.g., Space Shuttle TPS monitoring, which was considered only after 110 flights and the Columbia disaster). High false alarm or false negative rates due to substandard implementations hurt the credibility of the ISHM discipline. This paper presents an overview of the current state of ISHM design, and a review of formal design methods to make recommendations about possible approaches to enable the ISHM capabilities to be designed in at the system level, from the very beginning of the vehicle design process.
A Formalized Design Process for Bacterial Consortia That Perform Logic Computing
Sun, Rui; Xi, Jingyi; Wen, Dingqiao; Feng, Jingchen; Chen, Yiwei; Qin, Xiao; Ma, Yanrong; Luo, Wenhan; Deng, Linna; Lin, Hanchi; Yu, Ruofan; Ouyang, Qi
2013-01-01
The concept of microbial consortia is of great attractiveness in synthetic biology. Despite all its benefits, however, problems remain for large-scale multicellular gene circuits, for example, how to reliably design and distribute the circuits in microbial consortia with a limited number of well-behaved genetic modules and quorum-sensing wiring molecules. To address this problem, here we propose a formalized design process: (i) determine the basic logic units (AND, OR and NOT gates) based on mathematical and biological considerations; (ii) establish rules to search for and distribute the simplest logic design; (iii) assemble the assigned basic logic units in each logic operating cell; and (iv) fine-tune the circuiting interface between logic operators. We analyzed in silico gene circuits with inputs ranging from two to four, comparing our method with pre-existing ones. Results showed that this formalized design process is more feasible in terms of the number of cells required. Furthermore, as a proof of principle, an Escherichia coli consortium that performs the XOR function, a typical complex computing operation, was designed. The construction and characterization of logic operators is independent of “wiring” and provides predictive information for fine-tuning. This formalized design process provides guidance for the design of microbial consortia that perform distributed biological computation. PMID:23468999
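A sketch of step (ii), distributing a simplest logic design across cells, using XOR as in the paper's proof of principle (the gate assignment and molecule names m1/m2 are illustrative, not the authors' circuit):

```python
def NOT(a): return 1 - a
def AND(a, b): return a & b
def OR(a, b): return a | b

def consortium_xor(a, b):
    """XOR decomposed over three cells wired by signalling molecules."""
    signals = {}
    # cell 1 computes AND(a, NOT b), secreting molecule m1
    signals["m1"] = AND(a, NOT(b))
    # cell 2 computes AND(NOT a, b), secreting molecule m2
    signals["m2"] = AND(NOT(a), b)
    # cell 3 senses m1, m2 and computes OR -> reporter output
    return OR(signals["m1"], signals["m2"])

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", consortium_xor(a, b))   # truth table of XOR
```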
SSDOnt: An Ontology for Representing Single-Subject Design Studies.
Berges, Idoia; Bermúdez, Jesus; Illarramendi, Arantza
2018-02-01
Single-subject design is used in several areas such as education and biomedicine. However, no suitable formal vocabulary exists for annotating the detailed configuration and the results of this type of research study with the granularity needed to search for information about them. Consequently, the search for those study designs relies heavily on syntactical search over the abstract, keywords or full text of the publications about the study, which entails some limitations. Our objective was to present SSDOnt, a special-purpose ontology for describing and annotating single-subject design studies, so that complex questions can be asked about them afterwards. The ontology was developed following the NeOn methodology. Once the requirements of the ontology were defined, a formal model was described in a description logic and later implemented in the ontology language OWL 2 DL. We show how the ontology provides a reference model with suitable terminology for the annotation and searching of single-subject design studies and their main components, such as the phases, the intervention types, the outcomes and the results. Some mappings to terms of related ontologies have been established. As proof-of-concept, we show that classes in the ontology can be easily extended to annotate more precise information about specific interventions and outcomes, such as those related to autism. Moreover, we provide examples of some types of queries that can be posed to the ontology. SSDOnt has achieved the purpose of covering the descriptions of the domain of single-subject research studies.
Safety Verification of the Small Aircraft Transportation System Concept of Operations
NASA Technical Reports Server (NTRS)
Carreno, Victor; Munoz, Cesar
2005-01-01
A critical factor in the adoption of any new aeronautical technology or concept of operation is safety. Traditionally, safety is accomplished through a rigorous process that involves human factors, low and high fidelity simulations, and flight experiments. As this process is usually performed on final products or functional prototypes, concept modifications resulting from it are very expensive to implement. This paper describes an approach to system safety that can take place at early stages of a concept design. It is based on a set of mathematical techniques and tools known as formal methods. In contrast to testing and simulation, formal methods provide the capability of exhaustive state-exploration analysis. We present the safety analysis and verification performed for the Small Aircraft Transportation System (SATS) Concept of Operations (ConOps). The concept of operations is modeled using discrete and hybrid mathematical models. These models are then analyzed using formal methods. The objective of the analysis is to show, in a mathematical framework, that the concept of operations complies with a set of safety requirements. It is also shown that the ConOps has some desirable characteristics such as liveness and absence of deadlock. The analysis and verification are performed in the Prototype Verification System (PVS), which is a computer-based specification language and a theorem proving assistant.
Designing automation for human use: empirical studies and quantitative models.
Parasuraman, R
2000-07-01
An emerging knowledge base of human performance research can provide guidelines for designing automation that can be used effectively by human operators of complex systems. Which functions should be automated and to what extent in a given system? A model for types and levels of automation that provides a framework and an objective basis for making such choices is described. The human performance consequences of particular types and levels of automation constitute primary evaluative criteria for automation design when using the model. Four human performance areas are considered--mental workload, situation awareness, complacency and skill degradation. Secondary evaluative criteria include such factors as automation reliability, the risks of decision/action consequences and the ease of systems integration. In addition to this qualitative approach, quantitative models can inform design. Several computational and formal models of human interaction with automation that have been proposed by various researchers are reviewed. An important future research need is the integration of qualitative and quantitative approaches. Application of these models provides an objective basis for designing automation for effective human use.
The application of SSADM to modelling the logical structure of proteins.
Saldanha, J; Eccles, J
1991-10-01
A logical design that describes the overall structure of proteins, together with a more detailed design describing secondary and some supersecondary structures, has been constructed using the computer-aided software engineering (CASE) tool, Auto-mate. Auto-mate embodies the philosophy of the Structured Systems Analysis and Design Method (SSADM) which enables the logical design of computer systems. Our design will facilitate the building of large information systems, such as databases and knowledgebases in the field of protein structure, by the derivation of system requirements from our logical model prior to producing the final physical system. In addition, the study has highlighted the ease of employing SSADM as a formalism in which to conduct the transferral of concepts from an expert into a design for a knowledge-based system that can be implemented on a computer (the knowledge-engineering exercise). It has been demonstrated how SSADM techniques may be extended for the purpose of modelling the constituent Prolog rules. This facilitates the integration of the logical system design model with the derived knowledge-based system.
NASA Technical Reports Server (NTRS)
Rowe, Sidney E.
2010-01-01
In September 2007, the Engineering Directorate at the Marshall Space Flight Center (MSFC) created the Design System Focus Team (DSFT). MSFC was responsible for the in-house design and development of the Ares 1 Upper Stage and the Engineering Directorate was preparing to deploy a new electronic Configuration Management and Data Management System with the Design Data Management System (DDMS) based upon a Commercial Off The Shelf (COTS) Product Data Management (PDM) System. The DSFT was to establish standardized CAD practices and a new data life cycle for design data. Of special interest here, the design teams were to implement Model Based Definition (MBD) in support of the Upper Stage manufacturing contract. It is noted that this MBD does use partially dimensioned drawings for auxiliary information to the model. The design data lifecycle implemented several new release states to be used prior to formal release that allowed the models to move through a flow of progressive maturity. The DSFT identified some 17 Lessons Learned as outcomes of the standards development, pathfinder deployments and initial application to the Upper Stage design completion. Some of the high value examples are reviewed.
NASA Technical Reports Server (NTRS)
Lacovara, R. C.
1990-01-01
The notions, benefits, and drawbacks of numeric simulation are introduced. Two formal simulation languages, Simscript and Modsim, are introduced. The capabilities of each are discussed briefly, and then the two programs are compared. The use of simulation in the process of design engineering for the Control and Monitoring System (CMS) for Space Station Freedom is discussed. The application of the formal simulation languages to the CMS design is presented, and recommendations are made as to their use.
Soyyılmaz, Demet; Griffin, Laura M.; Martín, Miguel H.; Kucharský, Šimon; Peycheva, Ekaterina D.; Vaupotič, Nina; Edelsbrunner, Peter A.
2017-01-01
Scientific thinking is a predicate for scientific inquiry, and thus important to develop early in psychology students as potential future researchers. The present research is aimed at fathoming the contributions of formal and informal learning experiences to psychology students’ development of scientific thinking during their 1st-year of study. We hypothesize that informal experiences are relevant beyond formal experiences. First-year psychology student cohorts from various European countries will be assessed at the beginning and again at the end of the second semester. Assessments of scientific thinking will include scientific reasoning skills, the understanding of basic statistics concepts, and epistemic cognition. Formal learning experiences will include engagement in academic activities which are guided by university authorities. Informal learning experiences will include non-compulsory, self-guided learning experiences. Formal and informal experiences will be assessed with a newly developed survey. As dispositional predictors, students’ need for cognition and self-efficacy in psychological science will be assessed. In a structural equation model, students’ learning experiences and personal dispositions will be examined as predictors of their development of scientific thinking. Commonalities and differences in predictive weights across universities will be tested. The project is aimed at contributing information for designing university environments to optimize the development of students’ scientific thinking. PMID:28239363
Software Tools for Formal Specification and Verification of Distributed Real-Time Systems
1994-07-29
The goals of Phase 1 are to design in detail a toolkit environment based on formal methods for the specification and verification of distributed real-time systems and to evaluate the design. The evaluation of the design includes investigation of both the capability and potential usefulness of the toolkit environment and the feasibility of its implementation.
Formal design specification of a Processor Interface Unit
NASA Technical Reports Server (NTRS)
Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.
1992-01-01
This report describes work to formally specify the requirements and design of a processor interface unit (PIU), a single-chip subsystem providing memory interface, bus interface, and additional support services for a commercial microprocessor within a fault-tolerant computer system. This system, the Fault-Tolerant Embedded Processor (FTEP), is targeted towards applications in avionics and space requiring extremely high levels of mission reliability, extended maintenance-free operation, or both. The need for high-quality design assurance in such applications is an undisputed fact, given the disastrous consequences that even a single design flaw can produce. Thus, the further development and application of formal methods to fault-tolerant systems is of critical importance as these systems see increasing use in modern society.
Model Checking Verification and Validation at JPL and the NASA Fairmont IV&V Facility
NASA Technical Reports Server (NTRS)
Schneider, Frank; Easterbrook, Steve; Callahan, Jack; Montgomery, Todd
1999-01-01
We show how a technology transfer effort was carried out. The successful use of model checking on a pilot JPL flight project demonstrates the usefulness and the efficacy of the approach. The pilot project was used to model a complex spacecraft controller. Software design and implementation validation were carried out successfully. To suggest future applications we also show how the implementation validation step can be automated. The effort was followed by the formal introduction of the modeling technique as a part of the JPL Quality Assurance process.
Modeling the peak of emergence in systems: Design and katachi.
Cardier, Beth; Goranson, H T; Casas, Niccolo; Lundberg, Patric; Erioli, Alessio; Takaki, Ryuji; Nagy, Dénes; Ciavarra, Richard; Sanford, Larry D
2017-12-01
It is difficult to model emergence in biological systems using reductionist paradigms. A requirement for computational modeling is that individual entities can be recorded parametrically and related logically, but their transformation into whole systems cannot be captured this way. The problem stems from an inability to formally represent the implicit influences that inform emergent organization, such as context, shifts in causal agency or scale, and self-reference. This lack hampers biological systems modeling and its computational counterpart, indicating a need for new fundamental abstraction frameworks that support system-level characteristics. We develop an approach that formally captures these characteristics, focusing on the way they come together to enable transformation at the 'peak' of the emergent process. An example from virology is presented, in which two seemingly antagonistic systems - the herpes cold sore virus and its host - are capable of altering their basic biological objectives to achieve a new equilibrium. The usual barriers to modeling this process are overcome by incorporating mechanisms from practices centered on its emergent peak: design and katachi. In the Japanese science of form, katachi refers to the emergence of intrinsic structure from real situations, where an optimal balance between implicit influences is achieved. Design indicates how such optimization is guided by principles of flow. These practices leverage qualities of situated abstraction, which we understand through the intuitive method of physicist Kôdi Husimi. Early results indicate that this approach can capture the functional transformations of biological emergence, whilst being reasonably computable. Due to its geometric foundations and narrative-based extension to logic, the method will also generate speculative predictions. This research forms the foundations of a new biomedical modeling platform, which is discussed.
ERIC Educational Resources Information Center
Pying, How Shwu; Rashid, Abdullah Mat
2014-01-01
The focus of this study is to determine the relationship of teacher's teaching styles (expert, formal authority, personal model, delegation, and facilitator) towards student's interests in Integrated Living Skills (ILS) subjects. The research is designed to explore the common teaching styles applied by ILS teachers in schools and identify the…
Naval Shipyard Apprentice Program & Community-Technical College Linkages: A Model for Success.
ERIC Educational Resources Information Center
Cantor, Jeffrey A.
Each of the eight shipyards operated by the U.S. Navy administers a formal 4-year apprentice trades training program. The apprentice programs combine daily on-the-job training with classroom instruction in technical subjects related to work requirements, including shop math, chemistry, physics, and mechanical drafting. The programs are designed to…
MONTAGE: A Methodology for Designing Composable End-to-End Secure Distributed Systems
2012-08-01
Surviving fragments of the abstract indicate that the report derives five separation requirements (Loc Separation, Implicit Parameter Separation, Error Signaling Separation, Conf Separation, and Next Call Separation), describes a real cloud (with shared hypervisors and hardware) that satisfies these requirements, and develops a formal model and study of Loc Separation.
Formal Methods for Life-Critical Software
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Johnson, Sally C.
1993-01-01
The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.
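As a flavor of the tutorial's running example, a contract-style rendering of a phone book specification in Python (our own sketch under assumed operations, not the paper's formal notation):

```python
# The phone book is a finite partial map from names to numbers;
# pre- and post-conditions are checked at runtime with assertions.
class PhoneBook:
    def __init__(self):
        self.entries = {}            # invariant: a finite partial map

    def add(self, name, number):
        assert name not in self.entries, "pre: name must be fresh"
        old = dict(self.entries)
        self.entries[name] = number
        # post: only `name` was added; all other entries unchanged
        assert self.entries == {**old, name: number}

    def lookup(self, name):
        assert name in self.entries, "pre: name must be present"
        return self.entries[name]

pb = PhoneBook()
pb.add("alice", "555-0100")
print(pb.lookup("alice"))
```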
ERIC Educational Resources Information Center
Saif, Perveen; Reba, Amjad; ud Din, Jalal
2017-01-01
This study was designed to compare the subject knowledge of B.Ed graduates of formal and non-formal teacher education systems. The population of the study included all teachers from Girls High and Higher Secondary Schools both from private and public sectors from the district of Peshawar. Out of the total population, twenty schools were randomly…
NASA Astrophysics Data System (ADS)
Attia, Moez; Gueddana, Amor; Chatta, Rihab; Morand, Alain
2013-09-01
The work presented in this paper develops a new formalism for designing microdisk and microgear structures. The main objective is to study the influence of optical and geometric parameters on the resonance behavior of these structures. This study is conducted to choose a resonant structure with a high quality factor Q to be associated with a quantum dot to form a single-photon source. The new method aims to design resonant structures more simply, requiring less computing power than FDTD and Floquet-Bloch methods. The formalism is based on a simplified Fourier transform written in Toeplitz-matrix form. This formulation allows designing all kinds of resonant structures with any defect or modification. In another study we designed a quantum dot emitting a photon at 1550 nm in the fundamental mode, but the quantum dot also emits photons at other wavelengths. The purpose of associating the resonant structure with the quantum dot is to select the resonance of the photon at 1550 nm and to suppress all photons at other energies. The quantum dot studied in [1] is an InAs/GaAs quantum dot; we design a GaAs microdisk and microgear, compare the quality factors Q of these two structures, and conclude that the microgear is more appropriate for association with the quantum dot, increasing the probability P1 of obtaining a single-photon source at 1550 nm.
Formal development of a clock synchronization circuit
NASA Technical Reports Server (NTRS)
Miner, Paul S.
1995-01-01
This talk presents the latest stage in the formal development of a fault-tolerant clock synchronization circuit. The development spans from a high-level specification of the required properties to a circuit realizing the core function of the system. An abstract description of an algorithm has been verified to satisfy the high-level properties using the mechanical verification system EHDM. This abstract description is recast as a behavioral specification input to the Digital Design Derivation system (DDD) developed at Indiana University. DDD provides a formal design algebra for developing correct digital hardware. Using DDD as the principal design environment, a core circuit implementing the clock synchronization algorithm was developed. The design process consisted of standard DDD transformations augmented with an ad hoc refinement justified using the Prototype Verification System (PVS) from SRI International. Subsequent to the above development, Wilfredo Torres-Pomales discovered an area-efficient realization of the same function. Establishing correctness of this optimization requires reasoning in arithmetic, so a general verification is outside the domain of both DDD transformations and model-checking techniques. DDD represents digital hardware by systems of mutually recursive stream equations. A collection of PVS theories was developed to aid in reasoning about DDD-style streams. These theories include a combinator for defining streams that satisfy stream equations, and a means for proving stream equivalence by exhibiting a stream bisimulation. DDD was used to isolate the sub-system involved in Torres-Pomales' optimization. The equivalence between the original design and the optimized design was verified in PVS by exhibiting a suitable bisimulation. The verification depended upon type constraints on the input streams and made extensive use of the PVS type system. The dependent types in PVS provided a useful mechanism for defining an appropriate bisimulation.
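A rough Python analogue of DDD-style streams and a bounded bisimulation check (illustrative only; the PVS proof establishes equivalence for all steps, which bounded checking cannot):

```python
from itertools import islice

# Two stream realizations of the same counter specification.
def counter_a():
    n = 0
    while True:
        yield n
        n += 1

def counter_b():                      # an "optimized" realization
    n = 0
    while True:
        yield n
        n = n + 1

def bisimilar_upto(s, t, k=1000):
    """Check the two streams agree on their first k elements."""
    return all(x == y for x, y in islice(zip(s(), t()), k))

print(bisimilar_upto(counter_a, counter_b))   # True for the first k steps
```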
A Formal Methodology to Design and Deploy Dependable Wireless Sensor Networks
Testa, Alessandro; Cinque, Marcello; Coronato, Antonio; Augusto, Juan Carlos
2016-01-01
Wireless Sensor Networks (WSNs) are being increasingly adopted in critical applications, where verifying the correct operation of sensor nodes is a major concern. Undesired events may undermine the mission of the WSNs. Hence, their effects need to be properly assessed before deployment, to obtain a good level of expected performance, and during operation, in order to avoid dangerous unexpected results. In this paper, we propose a methodology that aims at assessing and improving the dependability level of WSNs by means of an event-based formal verification technique. The methodology includes a process to guide designers towards the realization of a dependable WSN and a tool (“ADVISES”) to simplify its adoption. The tool is applicable to homogeneous WSNs with static routing topologies. It allows the automatic generation of formal specifications used to check correctness properties and evaluate dependability metrics at design time and at runtime for WSNs where an acceptable percentage of faults can be defined. During runtime, we can check the behavior of the WSN against the results obtained at design time, and we can detect sudden and unexpected failures, in order to trigger recovery procedures. The effectiveness of the methodology is shown in the context of two case studies, as proof-of-concept, aiming to illustrate how the tool is helpful to drive design choices and to check the correctness properties of the WSN at runtime. Although the method scales up to very large WSNs, the applicability of the methodology may be compromised by the state space explosion of the reasoning model, which must be faced by partitioning large topologies into sub-topologies. PMID:28025568
Experiences Using Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1996-01-01
This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost-effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
Designing Curricular Experiences that Promote Young Adolescents' Cognitive Growth
ERIC Educational Resources Information Center
Brown, Dave F.; Canniff, Mary
2007-01-01
One of the most challenging daily experiences of teaching young adolescents is helping them transition from Piaget's concrete to the formal operational stage of cognitive development during the middle school years. Students who have reached formal operations can design and test hypotheses, engage in deductive reasoning, use flexible thinking,…
Applications of Ontology Design Patterns in Biomedical Ontologies
Mortensen, Jonathan M.; Horridge, Matthew; Musen, Mark A.; Noy, Natalya F.
2012-01-01
Ontology design patterns (ODPs) are a proposed solution to facilitate ontology development and to help users avoid some of the most frequent modeling mistakes. ODPs originate from similar approaches in software engineering, where software design patterns have become a critical aspect of software development. There is little empirical evidence for ODP prevalence or effectiveness thus far. In this work, we determine the use and applicability of ODPs in a case study of biomedical ontologies. We encoded ontology design patterns from two ODP catalogs and then searched for these patterns in a set of eight ontologies. We found five of the 69 patterns; two of the eight ontologies contained these patterns. While ontology design patterns provide a vehicle for formally capturing recurring models and best practices in ontology design, we show that their use in a case study of widely used biomedical ontologies is, to date, limited. PMID:23304337
Optimizing Experimental Design for Comparing Models of Brain Function
Daunizeau, Jean; Preuschoff, Kerstin; Friston, Karl; Stephan, Klaas
2011-01-01
This article presents the first attempt to formalize the optimization of experimental design with the aim of comparing models of brain function based on neuroimaging data. We demonstrate our approach in the context of Dynamic Causal Modelling (DCM), which relates experimental manipulations to observed network dynamics (via hidden neuronal states) and provides an inference framework for selecting among candidate models. Here, we show how to optimize the sensitivity of model selection by choosing among experimental designs according to their respective model selection accuracy. Using Bayesian decision theory, we (i) derive the Laplace-Chernoff risk for model selection, (ii) disclose its relationship with classical design optimality criteria and (iii) assess its sensitivity to basic modelling assumptions. We then evaluate the approach when identifying brain networks using DCM. Monte-Carlo simulations and empirical analyses of fMRI data from a simple bimanual motor task in humans serve to demonstrate the relationship between network identification and the optimal experimental design. For example, we show that deciding whether there is a feedback connection requires shorter epoch durations, relative to asking whether there is experimentally induced change in a connection that is known to be present. Finally, we discuss limitations and potential extensions of this work. PMID:22125485
ERIC Educational Resources Information Center
Wu, Qun; Wang, Yecheng
2015-01-01
The purpose of this study is to identify the occurrence of Sudden Moments of Inspiration (SMI) in the sketching process of industrial design through experiments, and to explain the effect of subconsciousness on SMI. The study comprised a pre-experiment and a formal experiment. In the formal experiment, nine undergraduates majoring in industrial design with same…
Software Validation via Model Animation
NASA Technical Reports Server (NTRS)
Dutle, Aaron M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Butler, Ricky W.
2015-01-01
This paper explores a new approach to validating software implementations that have been produced from formally-verified algorithms. Although visual inspection gives some confidence that the implementations faithfully reflect the formal models, it does not provide complete assurance that the software is correct. The proposed approach, which is based on animation of formal specifications, compares the outputs computed by the software implementations on a given suite of input values to the outputs computed by the formal models on the same inputs, and determines if they are equal up to a given tolerance. The approach is illustrated on a prototype air traffic management system that computes simple kinematic trajectories for aircraft. Proofs for the mathematical models of the system's algorithms are carried out in the Prototype Verification System (PVS). The animation tool PVSio is used to evaluate the formal models on a set of randomly generated test cases. Output values computed by PVSio are compared against output values computed by the actual software. This comparison improves the assurance that the translation from formal models to code is faithful and that, for example, floating point errors do not greatly affect correctness and safety properties.
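The comparison step is simple to sketch in Python (the trajectory functions, tolerance, and random test-case generator below are invented stand-ins; in the paper the reference outputs come from evaluating the PVS models with PVSio rather than from a second hand-written function):

```python
import math
import random

def model_output(t, v0, a):
    """Stand-in for the formal model evaluated by an animator (exact kinematics)."""
    return v0 * t + 0.5 * a * t * t

def software_output(t, v0, a):
    """Stand-in for the fielded implementation; algebraically equal,
    but floating-point rounding can differ."""
    return (v0 + 0.5 * a * t) * t

def validate(n_cases=10000, tol=1e-9, seed=7):
    """Compare implementation and model on random inputs, up to a tolerance."""
    rng = random.Random(seed)
    for _ in range(n_cases):
        t = rng.uniform(0.0, 3600.0)
        v0 = rng.uniform(-300.0, 300.0)
        a = rng.uniform(-10.0, 10.0)
        expected, actual = model_output(t, v0, a), software_output(t, v0, a)
        if not math.isclose(expected, actual, rel_tol=tol, abs_tol=tol):
            return False, (t, v0, a, expected, actual)
    return True, None

ok, counterexample = validate()
print(ok)  # True when the two agree within tolerance on every test case
```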
Dhar, Purbarun; Maganti, Lakshmi Sirisha; Harikrishnan, A R
2018-05-30
Electrorheological (ER) fluids are known to exhibit enhanced viscous effects under an electric field stimulus. The present article reports the hitherto unreported phenomenon of greatly enhanced thermal conductivity in such electro-active colloidal dispersions in the presence of an externally applied electric field. Typical ER fluids are synthesized employing dielectric fluids and nanoparticles, and experiments are performed employing an in-house designed setup. Greatly augmented thermal conductivity under the field's influence was observed. Enhanced thermal conduction along the fibril structures formed under the field effect is theorized to be the crux of the mechanism; the formation of fibril structures has been experimentally verified by microscopy. Based on classical models for ER fluids, a mathematical formalism has been developed to predict the propensity of chain formation and statistically feasible chain dynamics at given Mason numbers. Further, a thermal resistance network model is employed to computationally predict the enhanced thermal conduction across the fibrillary colloid microstructure. Good agreement between the mathematical model and the experimental observations is achieved. The dominant role of thermal conductivity over relative permittivity is shown by proposing a modified Hashin-Shtrikman (HS) formalism. The findings have implications for better physical understanding and design of ER fluids, from the points of view of both 'smart' viscoelastic and thermally active materials.
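For orientation, the classical Hashin-Shtrikman lower bound that such a modified formalism takes as its starting point can be written in its standard two-phase form (conventional symbols, not necessarily the paper's notation):

```latex
k_{\mathrm{eff}}^{\mathrm{HS-}} \;=\; k_m \;+\; \frac{\phi}{\dfrac{1}{k_p - k_m} + \dfrac{1 - \phi}{3\,k_m}}
```

Here k_m is the conductivity of the base (matrix) fluid, k_p that of the dispersed particles, and phi the particle volume fraction. Field-induced fibril chains are one mechanism by which an ER fluid's effective conductivity can climb well above this isotropic bound.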
Testing the effectiveness of family therapeutic assessment: a case study using a time-series design.
Smith, Justin D; Wolf, Nicole J; Handler, Leonard; Nash, Michael R
2009-11-01
We describe a family Therapeutic Assessment (TA) case study employing 2 assessors, 2 assessment rooms, and a video link. In the study, we employed a daily measures time-series design with a pretreatment baseline and follow-up period to examine the family TA treatment model. In addition to being an illustrative addition to a number of clinical reports suggesting the efficacy of family TA, this study is the first to apply a case-based time-series design to test whether family TA leads to clinical improvement and also illustrates when that improvement occurs. Results support the trajectory of change proposed by Finn (2007), the TA model's creator, who posits that benefits continue beyond the formal treatment itself.
Canonical formalism for modelling and control of rigid body dynamics.
Gurfil, P
2005-12-01
This paper develops a new paradigm for stabilization of rigid-body dynamics. The state-space model is formulated using canonical elements, known as the Serret-Andoyer (SA) variables, thus far scarcely used for engineering applications. The main feature of the SA formalism is the reduction of the dynamics via the underlying symmetry stemming from conservation of angular momentum and rotational kinetic energy. The controllability of the system model is examined using the notion of accessibility, and is shown to be accessible from all points. Based on the accessibility proof, two nonlinear asymptotic feedback stabilizers are developed: a damping feedback is designed based on the Jurdjevic-Quinn method, and a Hamiltonian controller is derived by using the Hamiltonian as a natural Lyapunov function for the closed-loop dynamics. It is shown that the Hamiltonian control is both passive and inverse optimal with respect to a meaningful performance index. The performance of the new controllers is examined and compared using simulations of realistic scenarios from the satellite attitude dynamics field.
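The damping-feedback construction mentioned in the abstract has a compact generic form, sketched here in textbook Jurdjevic-Quinn style rather than in the paper's Serret-Andoyer variables. For control-affine dynamics whose drift conserves the Hamiltonian H,

```latex
\dot{x} = f(x) + \sum_i g_i(x)\,u_i, \qquad u_i = -k_i\,L_{g_i}H(x), \quad k_i > 0,
```

the closed loop gives \dot{H} = L_f H + \sum_i (L_{g_i}H)\,u_i = -\sum_i k_i\,(L_{g_i}H)^2 \le 0, since L_f H = 0 along the unforced flow. H thus serves as a natural Lyapunov function, and asymptotic stability follows from an invariance argument.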
Modeling formalisms in Systems Biology
2011-01-01
Systems Biology has taken advantage of computational tools and high-throughput experimental data to model several biological processes. These include signaling, gene regulatory, and metabolic networks. However, most of these models are specific to each kind of network. Their interconnection demands a whole-cell modeling framework for a complete understanding of cellular systems. We describe the features required by an integrated framework for modeling, analyzing and simulating biological processes, and review several modeling formalisms that have been used in Systems Biology including Boolean networks, Bayesian networks, Petri nets, process algebras, constraint-based models, differential equations, rule-based models, interacting state machines, cellular automata, and agent-based models. We compare the features provided by different formalisms, and discuss recent approaches in the integration of these formalisms, as well as possible directions for the future. PMID:22141422
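One of the listed formalisms can be made concrete in a few lines. Below is a toy Boolean network in Python (three genes with invented update rules, purely illustrative of the formalism rather than of any real pathway):

```python
# A minimal Boolean network: genes are on/off, and each synchronous
# update rule reads the previous state of the whole network.
rules = {
    "A": lambda s: not s["C"],           # C represses A
    "B": lambda s: s["A"],               # A activates B
    "C": lambda s: s["A"] and s["B"],    # A and B jointly activate C
}

def step(state):
    """Synchronous update: every gene reads the old state."""
    return {gene: bool(rule(state)) for gene, rule in rules.items()}

state = {"A": True, "B": False, "C": False}
for t in range(5):
    print(t, state)
    state = step(state)   # the trajectory settles into a limit cycle
```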
Multi-Agent Modeling and Simulation Approach for Design and Analysis of MER Mission Operations
NASA Technical Reports Server (NTRS)
Seah, Chin; Sierhuis, Maarten; Clancey, William J.
2005-01-01
A space mission operations system is a complex network of human organizations, information and deep-space network systems, and spacecraft hardware. As in other organizations, one of the problems in mission operations is managing the relationship between the mission information systems and how people actually work (practices). Brahms, a multi-agent modeling and simulation tool, was used to model and simulate work practice on NASA's Mars Exploration Rover (MER) mission. The objective was to investigate the value of work practice modeling for mission operations design. From spring 2002 until winter 2003, a Brahms modeler participated in mission systems design sessions and operations testing for the MER mission held at the Jet Propulsion Laboratory (JPL) and observed how designers interacted with the Brahms tool. This paper discusses mission system designers' reactions to the simulation output during model validation and the presentation of generated work procedures. The project spurred JPL's interest in the Brahms model, but the model was never included as part of the formal mission design process; we discuss why this occurred. Subsequently, we used the MER model to develop a future mission operations concept. Team members were reluctant to use the MER model, even though it appeared to be highly relevant to their effort. We describe some of the tool issues we encountered.
Shoukourian, S K; Vasilyan, A M; Avagyan, A A; Shukurian, A K
1999-01-01
A formalized "top to bottom" design approach was described in [1] for distributed applications built on databases, which were considered as a medium between virtual and real user environments for a specific medical application. Merging different components within a unified distributed application posits new essential problems for software. Particularly protection tools, which are sufficient separately, become deficient during the integration due to specific additional links and relationships not considered formerly. E.g., it is impossible to protect a shared object in the virtual operating room using only DBMS protection tools, if the object is stored as a record in DB tables. The solution of the problem should be found only within the more general application framework. Appropriate tools are absent or unavailable. The present paper suggests a detailed outline of a design and testing toolset for access differentiation systems (ADS) in distributed medical applications which use databases. The appropriate formal model as well as tools for its mapping to a DMBS are suggested. Remote users connected via global networks are considered too.
NASA Technical Reports Server (NTRS)
Leveson, Nancy G.; Heimdahl, Mats P. E.; Reese, Jon Damon
1999-01-01
Previously, we defined a blackbox formal system modeling language called RSML (Requirements State Machine Language). The language was developed over several years while specifying the system requirements for a collision avoidance system for commercial passenger aircraft. During the language development, we received continual feedback and evaluation from FAA employees and industry representatives, which helped us to produce a specification language that is easily learned and used by application experts. Since the completion of the RSML project, we have continued our research on specification languages. This research is part of a larger effort to investigate the more general problem of providing tools to assist in developing embedded systems. Our latest experimental toolset is called SpecTRM (Specification Tools and Requirements Methodology), and the formal specification language is SpecTRM-RL (SpecTRM Requirements Language). This paper describes what we have learned from our use of RSML and how those lessons were applied to the design of SpecTRM-RL. We discuss our goals for SpecTRM-RL and the design features that support each of these goals.
NASA Technical Reports Server (NTRS)
1995-01-01
This guidebook, the second of a two-volume series, is intended to facilitate the transfer of formal methods to the avionics and aerospace community. The first volume concentrates on administrative and planning issues [NASA-95a], and the second volume focuses on the technical issues involved in applying formal methods to avionics and aerospace software systems. Hereafter, the term "guidebook" refers exclusively to the second volume of the series. The title of this second volume, A Practitioner's Companion, conveys its intent. The guidebook is written primarily for the nonexpert and requires little or no prior experience with formal methods techniques and tools. However, it does attempt to distill some of the more subtle ingredients in the productive application of formal methods. To the extent that it succeeds, those conversant with formal methods will also find the guidebook useful. The discussion is illustrated through the development of a realistic example, relevant fragments of which appear in each chapter. The guidebook focuses primarily on the use of formal methods for analysis of requirements and high-level design, the stages at which formal methods have been most productively applied. Although much of the discussion applies to low-level design and implementation, the guidebook does not discuss issues involved in the later life cycle application of formal methods.
Ehrlich, Matthias; Schüffny, René
2013-01-01
One of the major outcomes of neuroscientific research is models of Neural Network Structures (NNSs). Descriptions of these models usually consist of a non-standardized mixture of text, figures, and other means of visual information communication in print media. However, as neuroscience is an interdisciplinary domain by nature, a standardized way of consistently representing models of NNSs is required. While generic descriptions of such models in textual form have recently been developed, a formalized way of schematically expressing them does not exist to date. Hence, in this paper we present Neural Schematics as a concept, inspired by similar approaches from other disciplines, for a generic two-dimensional representation of said structures. After introducing NNSs in general, a set of current visualizations of models of NNSs is reviewed and analyzed for what information they convey and how their elements are rendered. This analysis then allows for the definition of general items and symbols to consistently represent these models as Neural Schematics on a two-dimensional plane. We illustrate the possibilities an agreed-upon standard can yield with sample diagrams transformed into Neural Schematics and an example application for the design and modeling of large-scale NNSs.
Software Tools for Formal Specification and Verification of Distributed Real-Time Systems.
1997-09-30
set of software tools for specification and verification of distributed real-time systems using formal methods. The task of this SBIR Phase II effort...to be used by designers of real-time systems for early detection of errors. The mathematical complexity of formal specification and verification has
5 CFR 2638.311 - Copies of published formal advisory opinions.
Code of Federal Regulations, 2010 CFR
2010-01-01
§ 2638.311 Copies of published formal advisory opinions. Each designated agency ethics official shall receive a copy of each published opinion. Copies will also be available to the public from the…
Conceptual Questions and Lack of Formal Reasoning: Are They Mutually Exclusive?
ERIC Educational Resources Information Center
Igaz, Csaba; Proksa, Miroslav
2012-01-01
Using specially designed conceptual question pairs, 9th grade students were tested on tasks (presented as experimental situations in pictorial form) that involved the control-of-variables scheme of formal reasoning. The question topics focused on three chemical contexts: chemistry in everyday life, chemistry without formal concepts, and…
Standards for detailed clinical models as the basis for medical data exchange and decision support.
Coyle, Joseph F; Mori, Angelo Rossi; Huff, Stanley M
2003-03-01
Detailed clinical models are necessary to exchange medical data between heterogeneous computer systems and to maintain consistency in a longitudinal electronic medical record system. At Intermountain Health Care (IHC), we have a history of designing detailed clinical models. The purpose of this paper is to share our experience and the lessons we have learned over the last 5 years. IHC's newest model is implemented using eXtensible Markup Language (XML) Schema as the formalism, and conforms to the Health Level Seven (HL7) version 3 data types. The centerpiece of the new strategy is the Clinical Event Model, which is a flexible name-value pair data structure that is tightly linked to a coded terminology. We describe IHC's third-generation strategy for representing and implementing detailed clinical models, and discuss the reasons for this design.
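The flexible name-value-pair structure is easy to picture with a small sketch (hypothetical content and codes; IHC's actual Clinical Event Model is expressed in XML Schema over HL7 version 3 data types, with names bound to a controlled terminology):

```python
# One clinical event as a name-value pair bound to coded terminology.
# The codes below are illustrative, not taken from the paper.
heart_rate_event = {
    "name": {"system": "LOINC", "code": "8867-4", "display": "Heart rate"},
    "value": 72,
    "units": {"system": "UCUM", "code": "/min"},
    "observed": "2003-03-01T09:30:00",
}

def is_valid(event):
    """Minimal structural check: every event needs a coded name and a value."""
    return "code" in event.get("name", {}) and "value" in event

print(is_valid(heart_rate_event))  # True
```

The point of such a design is that a new observation type needs a new terminology code, not a new schema element.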
Two-Step Formal Advertisement: An Examination.
1976-10-01
The purpose of this report is to examine the potential application of the Two-Step Formal Advertisement method of procurement. Emphasis is placed on… Two-Step formal advertising is a method of procurement designed to take advantage of negotiation flexibility and at the same time obtain the benefits of… formal advertising. It is used where the specifications are not sufficiently definite or may be too restrictive to permit full and free competition
Abstracting event-based control models for high autonomy systems
NASA Technical Reports Server (NTRS)
Luh, Cheng-Jye; Zeigler, Bernard P.
1993-01-01
A high autonomy system needs many models on which to base control, management, design, and other interventions. These models differ in level of abstraction and in formalism. Concepts and tools are needed to organize the models into a coherent whole. The paper deals with the abstraction processes for systematic derivation of related models for use in event-based control. The multifaceted modeling methodology is briefly reviewed. The morphism concepts needed for application to model abstraction are described. A theory for supporting the construction of DEVS models needed for event-based control is then presented. An implemented morphism on the basis of this theory is also described.
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Divito, Ben L.
1992-01-01
The design and formal verification of the Reliable Computing Platform (RCP), a fault tolerant computing system for digital flight control applications is presented. The RCP uses N-Multiply Redundant (NMR) style redundancy to mask faults and internal majority voting to flush the effects of transient faults. The system is formally specified and verified using the Ehdm verification system. A major goal of this work is to provide the system with significant capability to withstand the effects of High Intensity Radiated Fields (HIRF).
Working on the Boundaries: Philosophies and Practices of the Design Process
NASA Technical Reports Server (NTRS)
Ryan, R.; Blair, J.; Townsend, J.; Verderaime, V.
1996-01-01
While the systems engineering process is a formal program management technique and is contractually binding, the design process is the informal practice of achieving the design project requirements throughout all design phases of the systems engineering process. The design process and organization are system- and component-dependent. Informal reviews include technical information meetings and concurrent engineering sessions, while formal technical discipline reviews are conducted through the systems engineering process. This paper discusses and references major philosophical principles in the design process, identifies its role in interacting systems and discipline analyses and integration, and illustrates the process application in experienced aerostructural designs.
On verifying a high-level design [cost and error analysis]
NASA Technical Reports Server (NTRS)
Mathew, Ben; Wehbeh, Jalal A.; Saab, Daniel G.
1993-01-01
An overview of design verification techniques is presented, and some of the current research in high-level design verification is described. Formal hardware description languages capable of adequately expressing design specifications have been developed, but some time will be required before they reach the maturity needed for use in real applications. Simulation-based approaches are more useful in finding errors in designs than in proving the correctness of a design. Hybrid approaches that combine simulation with other formal design verification techniques are argued to be the most promising over the short term.
Feature-based component model for design of embedded systems
NASA Astrophysics Data System (ADS)
Zha, Xuan Fang; Sriram, Ram D.
2004-11-01
An embedded system is a hybrid of hardware and software, combining the flexibility of software with the real-time performance of hardware. Embedded systems can be considered assemblies of hardware and software components. An Open Embedded System Model (OESM) is currently being developed at NIST to provide a standard representation and exchange protocol for embedded systems and for system-level design, simulation, and testing information. This paper proposes an approach to representing an embedded system feature-based model in OESM, i.e., an Open Embedded System Feature Model (OESFM), addressing models of embedded system artifacts, embedded system components, embedded system features, and embedded system configuration/assembly. The approach provides an object-oriented UML (Unified Modeling Language) representation for the embedded system feature model and defines an extension to the NIST Core Product Model. The model provides a feature-based component framework allowing the designer to develop a virtual embedded system prototype by assembling virtual components. The framework not only provides a formal, precise model of the embedded system prototype but also offers the possibility of designing prototype variants whose members are derived by exchanging certain virtual components for ones with different features. A case study example is discussed to illustrate the embedded system model.
Dynamic Forms. Part 1: Functions
NASA Technical Reports Server (NTRS)
Meyer, George; Smith, G. Allan
1993-01-01
The formalism of dynamic forms is developed as a means of organizing and systematizing the design of control systems. The formalism allows the designer to easily compute derivatives, to various orders, of the large composite functions that occur in flight-control design. Such functions involve many function-of-a-function calls that may be nested to many levels. The component functions may be multiaxis and nonlinear, and they may include rotation transformations. A dynamic form is defined as a variable together with its time derivatives up to some fixed but arbitrary order. The variable may be a scalar, a vector, a matrix, a direction cosine matrix, Euler angles, or Euler parameters. Algorithms for standard elementary functions and operations on scalar dynamic forms are developed first. Vector and matrix operations and transformations between parameterizations of rotations are developed at the next level of the hierarchy. Commonly occurring algorithms in control-system design, including inversion of pure feedback systems, are developed at the third level. A large-angle, three-axis attitude servo and other examples are included to illustrate the effectiveness of the developed formalism. All algorithms were implemented in FORTRAN code. Practical experience shows that the proposed formalism may significantly improve the productivity of the design and coding process.
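The scalar core of the formalism is easy to sketch in Python (a hypothetical illustration, not the FORTRAN implementation described above): a dynamic form carries a value and its time derivatives to a fixed order, and the Leibniz rule propagates derivatives through products.

```python
from math import comb

class DynamicForm:
    """A scalar variable together with its time derivatives d[0..n]."""
    def __init__(self, derivs):
        self.d = list(derivs)  # d[k] holds the k-th time derivative

    def __add__(self, other):
        return DynamicForm(a + b for a, b in zip(self.d, other.d))

    def __mul__(self, other):
        # Leibniz rule: (fg)^(k) = sum_j C(k, j) f^(j) g^(k - j)
        n = min(len(self.d), len(other.d))
        return DynamicForm(
            sum(comb(k, j) * self.d[j] * other.d[k - j] for j in range(k + 1))
            for k in range(n)
        )

# x(t) = t^2 at t = 1: value 1, dx/dt = 2, d2x/dt2 = 2
x = DynamicForm([1.0, 2.0, 2.0])
y = x * x             # represents t^4 at t = 1
print(y.d)            # [1.0, 4.0, 12.0]
```

Elementary functions (sin, exp, powers) extend the same way via recurrences on the derivatives, which is what lets deeply nested function-of-a-function chains be differentiated mechanically.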
Moving formal methods into practice. Verifying the FTPP Scoreboard: Results, phase 1
NASA Technical Reports Server (NTRS)
Srivas, Mandayam; Bickford, Mark
1992-01-01
This report documents the Phase 1 results of an effort aimed at formally verifying a key hardware component, called Scoreboard, of a Fault-Tolerant Parallel Processor (FTPP) being built at Charles Stark Draper Laboratory (CSDL). The Scoreboard is part of the FTPP virtual bus that guarantees reliable communication between processors in the presence of Byzantine faults in the system. The Scoreboard implements a piece of control logic that approves and validates a message before it can be transmitted. The goal of Phase 1 was to lay the foundation of the Scoreboard verification. A formal specification of the functional requirements and a high-level hardware design for the Scoreboard were developed. The hardware design was based on a preliminary Scoreboard design developed at CSDL. A main correctness theorem, from which the functional requirements can be established as corollaries, was proved for the Scoreboard design. The goal of Phase 2 is to verify the final detailed design of Scoreboard. This task is being conducted as part of a NASA-sponsored effort to explore integration of formal methods in the development cycle of current fault-tolerant architectures being built in the aerospace industry.
NASA Technical Reports Server (NTRS)
Polhamus, E. C.
1979-01-01
An overview is presented of 32 formal papers and 7 open session papers. Topics covered include: (1) studies of configurations of practical interest; (2) mathematical modelling and supporting investigations of slender wings, bodies of revolution, and body-wing configurations; (3) design methods; and (4) air intakes.
ERIC Educational Resources Information Center
LePrevost, Catherine E.; Storm, Julia F.; Asuaje, Cesar R.; Cope, W. Gregory
2014-01-01
Migrant and seasonal farmworkers are typically Spanish-speaking, Latino immigrants with limited formal education and low literacy skills and, as such, are a vulnerable population. We describe the development of the "Pesticides and Farmworker Health Toolkit", a pesticide safety and health curriculum designed to communicate to farmworkers…
Integrating public risk perception into formal natural hazard risk assessment
NASA Astrophysics Data System (ADS)
Plattner, Th.; Plapp, T.; Hebel, B.
2006-06-01
An urgent need to take perception into account in risk assessment has been pointed out in the relevant literature, and its impact on individuals' risk-related behaviour is obvious. This study represents an effort to overcome the much-discussed question of whether risk perception is quantifiable at all by proposing a simple but applicable methodology. A novel approach is elaborated to obtain a more accurate and comprehensive quantification of risk than present formal risk evaluation practice provides. Consideration of the relevant factors enables an explicit quantification of individual risk perception and evaluation. The model integrates the effective individual risk r_eff and a weighted mean of relevant perception-affecting factors (PAF). The PAF cover voluntariness of risk-taking, individual reducibility of risk, knowledge and experience, endangerment, subjective damage rating, and subjective perception of recurrence frequency. The approach assigns an individual weight to each PAF to represent the magnitude of its impact. The quantification of these weights is target-group-dependent (e.g. experts, laypersons) and may be carried out using psychometric methods. The approach is subjected to a plausibility check using data from an expert workshop. A first model application is conducted on data from an empirical risk perception study in Western Germany, to derive PAF and weight quantifications and to confirm and evaluate the model's applicability and flexibility. The main field of application will be the quantification of risk perception by individuals in a formal and technical way, e.g. for risk communication purposes, illustrating the differing perspectives of experts and non-experts. For decision-making processes this model must be applied with caution, since it is by definition not designed to quantify risk acceptance or risk evaluation. The approach may well explain how risk perception differs, but not why it differs. The formal model generates only "snapshots" and considers neither the socio-cultural nor the historical context of risk perception, since it is a highly individualistic and non-contextual approach.
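The quantification can be sketched in a few lines of Python under stated assumptions: the abstract specifies only that perceived risk integrates r_eff with a weighted mean of the PAF, so the multiplicative combination, the [0, 1] scaling, and the weights below are illustrative choices, not the paper's calibration.

```python
def perceived_risk(r_eff, paf, weights):
    """Combine effective risk with a weighted mean of perception-affecting
    factors (voluntariness, reducibility, knowledge, endangerment,
    subjective damage, subjective recurrence), each scaled to [0, 1].
    The multiplicative combination is an assumption for illustration."""
    assert len(paf) == len(weights)
    w_mean = sum(w * p for w, p in zip(weights, paf)) / sum(weights)
    return r_eff * w_mean

# Hypothetical layperson scores and weights for the six PAF in the abstract:
paf = [0.8, 0.3, 0.4, 0.9, 0.7, 0.5]
weights = [2.0, 1.0, 1.0, 3.0, 2.0, 1.0]
print(perceived_risk(0.02, paf, weights))  # 0.0138
```

Target-group dependence then amounts to swapping in a different weight vector, e.g. one elicited psychometrically from experts.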
The Second NASA Formal Methods Workshop 1992
NASA Technical Reports Server (NTRS)
Johnson, Sally C. (Compiler); Holloway, C. Michael (Compiler); Butler, Ricky W. (Compiler)
1992-01-01
The primary goal of the workshop was to bring together formal methods researchers and aerospace industry engineers to investigate new opportunities for applying formal methods to aerospace problems. The first part of the workshop was tutorial in nature. The second part of the workshop explored the potential of formal methods to address current aerospace design and verification problems. The third part of the workshop involved on-line demonstrations of state-of-the-art formal verification tools. Also, a detailed survey was filled in by the attendees; the results of the survey are compiled.
Initiating Formal Requirements Specifications with Object-Oriented Models
NASA Technical Reports Server (NTRS)
Ampo, Yoko; Lutz, Robyn R.
1994-01-01
This paper reports results of an investigation into the suitability of object-oriented models as an initial step in developing formal specifications. The requirements for two critical system-level software modules were used as target applications. It was found that creating object-oriented diagrams prior to formally specifying the requirements enhanced the accuracy of the initial formal specifications and reduced the effort required to produce them. However, the formal specifications incorporated some information not found in the object-oriented diagrams, such as higher-level strategy or goals of the software.
Applicability domain: towards a more formal definition.
Hanser, T; Barber, C; Marchaland, J F; Werner, S
2016-11-01
In recent years the applicability domain (AD) of a prediction system has become an important concern in (Q)SAR modelling, especially in the context of human safety assessment. Today AD is an active research topic, and many methods have been designed to estimate the adequacy of a model and the confidence in its outcome for a given prediction task. Unfortunately, the wide spectrum of techniques developed for this purpose is based on various definitions of the concept of AD, often taking into account different types of information. This variety of methodologies confuses the end users and makes the comparison of the AD for different models almost impossible. In this article, we demonstrate that AD is not a monolithic concept and can be broken down into three well-defined sub-domains assessing confidence at the model, prediction and decision levels, respectively. By leveraging this separation of concerns we have an opportunity to clarify, formalize and extend the definition of AD. We propose a framework that captures this new vision with the aim to initiate a global effort to converge towards a common AD definition within the (Q)SAR community.
Formal Analysis of the Remote Agent Before and After Flight
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Lowry, Mike; Park, SeungJoon; Pecheur, Charles; Penix, John; Visser, Willem; White, Jon L.
2000-01-01
This paper describes two separate efforts that used the SPIN model checker to verify deep space autonomy flight software. The first effort occurred at the beginning of a spiral development process and found five concurrency errors early in the design cycle that the developers acknowledge would not have been found through testing. This effort required substantial manual modeling, involving both abstraction and translation from the prototype LISP code to the PROMELA language used by SPIN. This experience and others led to research to address the gap between formal methods tools and the development cycle used by software developers. The Java PathFinder tool, which directly translates from Java to PROMELA, was developed as part of this research, as were automatic abstraction tools. In 1999 the flight software flew on a space mission, and a deadlock occurred in a sibling subsystem to the one that was the focus of the first verification effort. A second, quick-response "cleanroom" verification effort found the concurrency error in a short amount of time. The error was isomorphic to one of the concurrency errors found during the first verification effort. The paper demonstrates that formal methods tools can find concurrency errors that indeed lead to loss of spacecraft functions, even for the complex software required for autonomy. Second, it describes progress in automatic translation and abstraction that will eventually enable formal methods tools to be inserted directly into the aerospace software development cycle.
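The class of error involved can be pictured with a toy explicit-state search in Python (a hand-rolled sketch; SPIN explores PROMELA models far more efficiently): two processes take two locks in opposite order, and a breadth-first search over the interleavings finds the classic hold-and-wait deadlock.

```python
from collections import deque

# Joint state: (pc1, pc2, lockA, lockB). Program-counter stages: 0 take
# first lock, 1 take second lock, 2 release both, 3 done. Locks hold owner.
def successors(state):
    pc1, pc2, a, b = state
    out = []
    if pc1 == 0 and a is None: out.append((1, pc2, 1, b))    # P1 takes A
    if pc1 == 1 and b is None: out.append((2, pc2, a, 1))    # P1 takes B
    if pc1 == 2:               out.append((3, pc2, None, None))
    if pc2 == 0 and b is None: out.append((pc1, 1, a, 2))    # P2 takes B
    if pc2 == 1 and a is None: out.append((pc1, 2, 2, b))    # P2 takes A
    if pc2 == 2:               out.append((pc1, 3, None, None))
    return out

def find_deadlock(init=(0, 0, None, None)):
    """BFS over all interleavings; return a stuck state, if any."""
    seen, frontier = {init}, deque([init])
    while frontier:
        s = frontier.popleft()
        nxt = successors(s)
        if not nxt and (s[0], s[1]) != (3, 3):  # stuck before both finished
            return s
        for t in nxt:
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return None

print(find_deadlock())  # (1, 1, 1, 2): each holds one lock, waits on the other
```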
Towards Using Reo for Compliance-Aware Business Process Modeling
NASA Astrophysics Data System (ADS)
Arbab, Farhad; Kokash, Natallia; Meng, Sun
Business process modeling and the implementation of process-supporting infrastructures are two challenging tasks that are not fully aligned. On the one hand, languages such as the Business Process Modeling Notation (BPMN) exist to capture business processes at the level of domain analysis. On the other hand, programming paradigms and technologies such as Service-Oriented Computing (SOC) and web services have emerged to simplify the development of the distributed web systems that underlie business processes. BPMN is the most recognized language for specifying process workflows at the early design steps. However, it is rather declarative and may lead to executable models that are incomplete or semantically erroneous. Therefore, an approach for expressing and analyzing BPMN models in a formal setting is required. In this paper we describe how BPMN diagrams can be represented by means of a semantically precise channel-based coordination language called Reo, which admits formal analysis using model checking and bisimulation techniques. Moreover, since additional requirements may come from various regulatory or legislative documents, we discuss the opportunities offered by Reo and its mathematical abstractions for expressing process-related constraints such as Quality of Service (QoS) or time-aware conditions on process states.
2012-01-01
Background: Globally, extending financial protection and equitable access to health services to those outside formal sector employment is a major challenge for achieving universal coverage. While some favour contributory schemes, others have embraced tax-funded health service cover for those outside the formal sector. This paper critically examines the issue of how to cover those outside the formal sector through the lens of stakeholder views on the proposed one-time premium payment (OTPP) policy in Ghana.

Discussion: Ghana in 2004 implemented a National Health Insurance Scheme, based on a contributory model where service benefits are restricted to those who contribute (with some groups exempted from contributing), as the policy direction for moving towards universal coverage. In 2008, the OTPP system was proposed as an alternative way of ensuring coverage for those outside formal sector employment. There are divergent stakeholder views with regard to the meaning of the one-time premium and how it will be financed and sustained. Our stakeholder interviews indicate that the underlying issue being debated is whether the current contributory NHIS model for those outside the formal employment sector should be maintained or whether services for this group should be tax funded. However, the advantages and disadvantages of these alternatives are not being explored in an explicit or systematic way and are obscured by the considerable confusion about the likely design of the OTPP policy. We attempt to contribute to the broader debate about how best to fund coverage for those outside the formal sector by unpacking some of these issues and pointing to the empirical evidence needed to shed further light on appropriate funding mechanisms for universal health systems.

Summary: The Ghanaian debate on OTPP is related to one of the most important challenges facing low- and middle-income countries seeking to achieve a universal health care system. It is critical that there is more extensive debate on the advantages and disadvantages of alternative funding mechanisms, supported by a solid evidence base, and with the policy objective of universal coverage providing the guiding light. PMID:23102454
Systems, methods and apparatus for pattern matching in procedure development and verification
NASA Technical Reports Server (NTRS)
Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor)
2011-01-01
Systems, methods and apparatus are provided through which, in some embodiments, a formal specification is pattern-matched from scenarios, the formal specification is analyzed, and flaws in the formal specification are corrected. The systems, methods and apparatus may include pattern-matching an equivalent formal model from an informal specification. Such a model can be analyzed for contradictions, conflicts, use of resources before the resources are available, competition for resources, and so forth. From such a formal model, an implementation can be automatically generated in a variety of notations. The approach can improve the resulting implementation, which, in some embodiments, is provably equivalent to the procedures described at the outset. This improves confidence that the system reflects the requirements, which in turn reduces system development time and the amount of testing required of a new system. Moreover, in some embodiments, two or more implementations can be "reversed" to appropriate formal models, the models can be combined, and the resulting combination checked for conflicts. The combined, error-free model can then be used to generate a new (single) implementation that combines the functionality of the original separate implementations, and may be more likely to be correct.
NASA Technical Reports Server (NTRS)
Bickford, Mark; Srivas, Mandayam
1991-01-01
Presented here is a formal specification and verification of a property of a quadruply redundant fault-tolerant microprocessor system design, together with a complete listing of the formal specification of the system and the correctness theorems that were proved. The system performs the task of obtaining interactive consistency among the processors using a special instruction on the processors. The design is based on an algorithm proposed by Pease, Shostak, and Lamport. The property was verified using a computer-aided design verification tool, Spectool, and the theorem prover Clio; it ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold. A major contribution of the work is the demonstration of a significant fault-tolerant hardware design that is mechanically verified by a theorem prover.
Methodologies for Quantitative Systems Pharmacology (QSP) Models: Design and Estimation.
Ribba, B; Grimm, H P; Agoram, B; Davies, M R; Gadkar, K; Niederer, S; van Riel, N; Timmis, J; van der Graaf, P H
2017-08-01
With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early Development to focus discussions on two critical methodological aspects of QSP model development: optimal structural granularity and parameter estimation. We here report in a perspective article a summary of presentations and discussions. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
Protection - Principles and practice.
NASA Technical Reports Server (NTRS)
Graham, G. S.; Denning, P. J.
1972-01-01
The protection mechanisms of computer systems control the access to objects, especially information objects. The principles of protection system design are formalized as a model (theory) of protection. Each process has a unique identification number which is attached by the system to each access attempted by the process. Details of system implementation are discussed, taking into account the storing of the access matrix, aspects of efficiency, and the selection of subjects and objects. Two systems which have protection features incorporating all the elements of the model are described.
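The access-matrix idea at the core of such models can be rendered as a toy Python sketch (illustrative subjects, objects, and rights; the paper's model also covers ownership, transfer of rights, and implementation questions such as how the matrix is stored efficiently):

```python
# Access matrix as a sparse mapping from (subject id, object) to rights.
ACCESS = {
    (101, "payroll.db"): {"read"},
    (102, "payroll.db"): {"read", "write"},
    (101, "printer"):    {"use"},
}

def check_access(subject_id, obj, right):
    """Reference-monitor check: the system attaches the process's unique
    identification number to every attempted access and consults the matrix."""
    return right in ACCESS.get((subject_id, obj), set())

print(check_access(101, "payroll.db", "write"))  # False: denied
print(check_access(102, "payroll.db", "write"))  # True: granted
```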
Formalizing nursing knowledge: from theories and models to ontologies.
Peace, Jane; Brennan, Patricia Flatley
2009-01-01
Knowledge representation in nursing is poised to address the depth of nursing knowledge about the specific phenomena of importance to nursing. Nursing theories and models may provide a starting point for making this knowledge explicit in representations. We combined knowledge building methods from nursing and ontology design methods from biomedical informatics to create a nursing representation of family health history. Our experience provides an example of how knowledge representations may be created to facilitate electronic support for nursing practice and knowledge development.
Bias and design in software specifications
NASA Technical Reports Server (NTRS)
Straub, Pablo A.; Zelkowitz, Marvin V.
1990-01-01
Implementation bias in a specification is an arbitrary constraint in the solution space. Presented here is a model of bias in software specifications. Bias is defined in terms of the specification process and a classification of the attributes of the software product. Our definition of bias provides insight into both the origin and the consequences of bias. It also shows that bias is relative and essentially unavoidable. Finally, we describe current work on defining a measure of bias, formalizing our model, and relating bias to software defects.
Crisis Management for Secondary Education: A Survey of Secondary Education Directors in Greece
ERIC Educational Resources Information Center
Savelides, Socrates; Mihiotis, Athanassios; Koutsoukis, Nikitas-Spiros
2015-01-01
Purpose: The Greek secondary education system lacks a formal crisis management system. The purpose of this paper is to address this problem as follows: elicit current crisis management practices, outline features for designing a formal crisis management system in Greece. Design/methodology/approach: The research is based on a survey conducted with…
Formal Verification of the Runway Safety Monitor
NASA Technical Reports Server (NTRS)
Siminiceanu, Radu; Ciardo, Gianfranco
2006-01-01
The Runway Safety Monitor (RSM) designed by Lockheed Martin is part of NASA's effort to reduce runway accidents. We developed a Petri net model of the RSM protocol and used the model checking functions of our tool SMART to investigate a number of safety properties in RSM. To mitigate the impact of state-space explosion, we built a highly discretized model of the system, obtained by partitioning the monitored runway zone into a grid of smaller volumes and by considering scenarios involving only two aircraft. The model also assumes that there are no communication failures, such as bad input from radar or lack of incoming data, thus it relies on a consistent view of reality by all participants. In spite of these simplifications, we were able to expose potential problems in the RSM conceptual design. Our findings were forwarded to the design engineers, who undertook corrective action. Additionally, the results stress the efficiency attained by the new model checking algorithms implemented in SMART, and demonstrate their applicability to real-world systems.
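The discretization strategy can be pictured with a small Python sketch (a hand-rolled reachability search, not the Petri-net model or the SMART tool): the monitored zone becomes a grid of cells, each of the two aircraft occupies one cell per step, and a breadth-first search over joint states asks whether a same-cell conflict state is reachable.

```python
from collections import deque
from itertools import product

W, H = 4, 3                                    # toy grid over the runway zone
MOVES = [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]

def steps(cell):
    x, y = cell
    return [(x + dx, y + dy) for dx, dy in MOVES
            if 0 <= x + dx < W and 0 <= y + dy < H]

def conflict_reachable(a0, b0):
    """BFS over joint (aircraft A, aircraft B) positions; A == B models a
    potential conflict. The joint space holds only (W*H)**2 = 144 states,
    which is what the grid discretization buys."""
    seen, frontier = {(a0, b0)}, deque([(a0, b0)])
    while frontier:
        a, b = frontier.popleft()
        if a == b:
            return True
        for s in product(steps(a), steps(b)):
            if s not in seen:
                seen.add(s)
                frontier.append(s)
    return False

print(conflict_reachable((0, 0), (3, 2)))  # True: motion here is unconstrained
```

The actual RSM model layers protocol behavior on top of such a state space and checks that an alert precedes any conflict state; the sketch shows only why discretization keeps the search tractable.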
Choi, Insook
2018-01-01
Sonification is an open-ended design task to construct sound informing a listener of data. Understanding application context is critical for shaping design requirements for data translation into sound. Sonification requires methodology to maintain reproducibility when data sources exhibit non-linear properties of self-organization and emergent behavior. This research formalizes interactive sonification in an extensible model to support reproducibility when data exhibits emergent behavior. In the absence of sonification theory, extensibility demonstrates relevant methods across case studies. The interactive sonification framework foregrounds three factors: reproducible system implementation for generating sonification; interactive mechanisms enhancing a listener's multisensory observations; and reproducible data from models that characterize emergent behavior. Supramodal attention research suggests interactive exploration with auditory feedback can generate context for recognizing irregular patterns and transient dynamics. The sonification framework provides circular causality as a signal pathway for modeling a listener interacting with emergent behavior. The extensible sonification model adopts a data acquisition pathway to formalize functional symmetry across three subsystems: Experimental Data Source, Sound Generation, and Guided Exploration. To differentiate time criticality and dimensionality of emerging dynamics, tuning functions are applied between subsystems to maintain scale and symmetry of concurrent processes and temporal dynamics. Tuning functions accommodate sonification design strategies that yield order parameter values to render emerging patterns discoverable as well as rehearsable, to reproduce desired instances for clinical listeners. Case studies are implemented with two computational models, Chua's circuit and Swarm Chemistry social agent simulation, generating data in real-time that exhibits emergent behavior. Heuristic Listening is introduced as an informal model of a listener's clinical attention to data sonification through multisensory interaction in a context of structured inquiry. Three methods are introduced to assess the proposed sonification framework: Listening Scenario classification, data flow Attunement, and Sonification Design Patterns to classify sound control. Case study implementations are assessed against these methods comparing levels of abstraction between experimental data and sound generation. Outcomes demonstrate the framework performance as a reference model for representing experimental implementations, also for identifying common sonification structures having different experimental implementations, identifying common functions implemented in different subsystems, and comparing impact of affordances across multiple implementations of listening scenarios. PMID:29755311
The Design of Model-Based Training Programs
NASA Technical Reports Server (NTRS)
Polson, Peter; Sherry, Lance; Feary, Michael; Palmer, Everett; Alkin, Marty; McCrobie, Dan; Kelley, Jerry; Rosekind, Mark (Technical Monitor)
1997-01-01
This paper proposes a model-based training program for the skills necessary to operate advanced avionics systems that incorporate advanced autopilots and flight management systems. The training model is based on a formalism, the operational procedure model, that represents the mission model, the rules, and the functions of a modern avionics system. This formalism has been defined such that it can be understood and shared by pilots, the avionics software, and design engineers. Each element of the software is defined in terms of its intent (What?), the rationale (Why?), and the resulting behavior (How?). The Advanced Computer Tutoring project at Carnegie Mellon University has developed a type of model-based, computer-aided instructional technology called cognitive tutors, and summarizes numerous studies showing that training to a specified level of competence can be achieved in one third the time of conventional classroom instruction. We are developing a similar model-based training program for the skills necessary to operate the avionics. The model underlying the instructional program, which simulates the effects of pilot entries and the behavior of the avionics, is based on the operational procedure model. Pilots are given a series of vertical flightpath management problems. Entries that result in violations, such as failure to make a crossing restriction or violating speed limits, produce error messages with instruction. At any time, the flightcrew can request suggestions on the appropriate set of actions. A similar and successful training program for basic skills with the FMS on the Boeing 737-300 was developed and evaluated. The results strongly support the claim that the training methodology can be adapted to the cockpit.
Validating EHR clinical models using ontology patterns.
Martínez-Costa, Catalina; Schulz, Stefan
2017-12-01
Clinical models are artefacts that specify how information is structured in electronic health records (EHRs). However, the makeup of clinical models is not guided by any formal constraint beyond a semantically vague information model. We address this gap by advocating ontology design patterns as a mechanism that makes the semantics of clinical models explicit. This paper demonstrates how ontology design patterns can validate existing clinical models using SHACL. Based on the Clinical Information Modelling Initiative (CIMI), we show how ontology patterns detect both modeling and terminology binding errors in CIMI models. SHACL, a W3C constraint language for the validation of RDF graphs, builds on the concept of "Shape", a description of data in terms of expected cardinalities, datatypes and other restrictions. SHACL, as opposed to OWL, subscribes to the Closed World Assumption (CWA) and is therefore more suitable for the validation of clinical models. We have demonstrated the feasibility of the approach by manually describing the correspondences between six CIMI clinical models represented in RDF and two SHACL ontology design patterns. Using a Java-based SHACL implementation, we found at least eleven modeling and binding errors within these CIMI models. This demonstrates the usefulness of ontology design patterns not only as a modeling tool but also as a tool for validation. Copyright © 2017 Elsevier Inc. All rights reserved.
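A minimal sketch of SHACL validation in Python, using the pySHACL package (the shape and data below are invented stand-ins, not the paper's CIMI models or ontology patterns): a shape demands exactly one terminology binding, and an instance missing it fails to conform.

```python
from rdflib import Graph
from pyshacl import validate

shapes_ttl = """
@prefix sh: <http://www.w3.org/ns/shacl#> .
@prefix ex: <http://example.org/> .

ex:ObservationShape a sh:NodeShape ;
    sh:targetClass ex:Observation ;
    sh:property [
        sh:path ex:hasCode ;
        sh:minCount 1 ;
        sh:maxCount 1
    ] .
"""

data_ttl = """
@prefix ex: <http://example.org/> .
ex:obs1 a ex:Observation .
"""

data_g = Graph().parse(data=data_ttl, format="turtle")
shapes_g = Graph().parse(data=shapes_ttl, format="turtle")

# ex:obs1 has no ex:hasCode, so the shape's minCount constraint fires.
conforms, _, report = validate(data_g, shacl_graph=shapes_g)
print(conforms)   # False
print(report)     # human-readable violation report
```

The closed-world flavor of the check is visible here: a missing triple is a violation, which is exactly the behavior the paper argues makes SHACL more suitable than OWL for validating clinical models.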
Fusing Quantitative Requirements Analysis with Model-based Systems Engineering
NASA Technical Reports Server (NTRS)
Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven
2006-01-01
A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.
Enhancing Formal E-Learning with Edutainment on Social Networks
ERIC Educational Resources Information Center
Labus, A.; Despotovic-Zrakic, M.; Radenkovic, B.; Bogdanovic, Z.; Radenkovic, M.
2015-01-01
This paper reports on the investigation of the possibilities of enhancing the formal e-learning process by harnessing the potential of informal game-based learning on social networks. The goal of the research is to improve the outcomes of the formal learning process through the design and implementation of an educational game on a social network…
42 CFR 411.380 - When CMS issues a formal advisory opinion.
Code of Federal Regulations, 2010 CFR
2010-10-01
Relationships Between Physicians and Entities Furnishing Designated Health Services — § 411.380 When CMS issues a formal advisory opinion. (a) CMS considers an advisory opinion to be issued once it has received payment…
IRDS prototyping with applications to the representation of EA/RA models
NASA Technical Reports Server (NTRS)
Lekkos, Anthony A.; Greenwood, Bruce
1988-01-01
The requirements and a system overview for the Information Resources Dictionary System (IRDS) are described, and a formal design specification for a scaled-down IRDS implementation compatible with the proposed FIPS IRDS standard is given. The major design objectives for this IRDS include a menu-driven user interface, implementation of basic IRDS operations, and PC compatibility. The IRDS was implemented using the Smalltalk/V object-oriented programming system on an AT&T 6300 personal computer running MS-DOS 3.1. The difficulties encountered in using Smalltalk are discussed.
NASA Technical Reports Server (NTRS)
Weber, Doug; Jamsek, Damir
1994-01-01
The goal of this task was to investigate how formal methods could be incorporated into a software engineering process for flight-control systems under DO-178B, and to demonstrate that process by developing a formal specification for NASA's Guidance and Controls Software (GCS) experiment. GCS is software that controls the descent of a spacecraft onto a planet's surface. The GCS example is simplified from a real spacecraft but exhibits the characteristics of realistic spacecraft control software. The formal specification is written in Larch.
On the design of script languages for neural simulation.
Brette, Romain
2012-01-01
In neural network simulators, models are specified in a language that is either simulator-specific or based on a general programming language (e.g., Python). There are also ongoing efforts to develop standardized languages, for example NeuroML. When designing these languages, efforts are often focused on expressivity, that is, on maximizing the number of model types that can be described and simulated. I argue that a complementary goal should be to minimize the cognitive effort required on the part of the user to use the language. I try to formalize this notion with the concept of "language entropy", and I propose a few practical guidelines to minimize the entropy of languages for neural simulation.
A provably-secure ECC-based authentication scheme for wireless sensor networks.
Nam, Junghyun; Kim, Moonseong; Paik, Juryon; Lee, Youngsook; Won, Dongho
2014-11-06
A smart-card-based user authentication scheme for wireless sensor networks (in short, a SUA-WSN scheme) is designed to restrict access to the sensor data only to users who are in possession of both a smart card and the corresponding password. While a significant number of SUA-WSN schemes have been suggested in recent years, their intended security properties lack formal definitions and proofs in a widely-accepted model. One consequence is that SUA-WSN schemes insecure against various attacks have proliferated. In this paper, we devise a security model for the analysis of SUA-WSN schemes by extending the widely-accepted model of Bellare, Pointcheval and Rogaway (2000). Our model provides formal definitions of authenticated key exchange and user anonymity while capturing side-channel attacks, as well as other common attacks. We also propose a new SUA-WSN scheme based on elliptic curve cryptography (ECC), and prove its security properties in our extended model. To the best of our knowledge, our proposed scheme is the first SUA-WSN scheme that provably achieves both authenticated key exchange and user anonymity. Our scheme is also computationally competitive with other ECC-based (non-provably secure) schemes. PMID:25384009
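As a hedged illustration of the elliptic-curve primitive such schemes rest on (not the authors' protocol, which additionally involves smart cards, passwords, and a BPR-style proof), the following Python sketch performs an ECDH key agreement with the cryptography library; the curve choice, labels, and derived-key parameters are illustrative assumptions.

    # Sketch: ECDH key agreement, the core ECC operation behind the key
    # exchange in SUA-WSN-style schemes. NOT the paper's protocol.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    user_priv = ec.generate_private_key(ec.SECP256R1())      # user side
    sensor_priv = ec.generate_private_key(ec.SECP256R1())    # sensor side

    # Each side combines its own private key with the peer's public key.
    secret_user = user_priv.exchange(ec.ECDH(), sensor_priv.public_key())
    secret_sensor = sensor_priv.exchange(ec.ECDH(), user_priv.public_key())
    assert secret_user == secret_sensor    # both derive the same secret

    # Derive a fixed-length session key from the raw shared secret.
    session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                       info=b"sua-wsn-session").derive(secret_user)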
Methods for design and evaluation of integrated hardware-software systems for concurrent computation
NASA Technical Reports Server (NTRS)
Pratt, T. W.
1985-01-01
Research activities and publications are briefly summarized. The major tasks reviewed are: (1) VAX implementation of the PISCES parallel programming environment; (2) Apollo workstation network implementation of the PISCES environment; (3) FLEX implementation of the PISCES environment; (4) a sparse matrix iterative solver in PISCES Fortran; (5) an image processing application of PISCES; and (6) a formal model of concurrent computation being developed.
Security and Privacy Preservation in Human-Involved Networks
NASA Astrophysics Data System (ADS)
Asher, Craig; Aumasson, Jean-Philippe; Phan, Raphael C.-W.
This paper discusses security within human-involved networks, with a focus on social networking services (SNS). We argue that more secure networks could be designed using semi-formal security models inspired by cryptography, as well as notions like that of a ceremony, which exploits human-specific abilities and psychology to assist in creating more secure protocols. We illustrate some of our ideas with the example of the SNS Facebook.
ERIC Educational Resources Information Center
Yankouskaya, Alla; Humphreys, Glyn W.; Rotshtein, Pia
2014-01-01
We examined relations between the processing of facial identity and emotion in own- and other-race faces, using a fully crossed design with participants from 3 different ethnicities. The benefits of redundant identity and emotion signals were evaluated and formally tested in relation to models of independent and coactive feature processing and…
ERIC Educational Resources Information Center
Makri, Katerina; Papanikolaou, Kyparisia; Tsakiri, Athanasia; Karkanis, Stavros
2014-01-01
As e-learning is evolving into a mainstream, widespread practice, adopted by higher education institutions worldwide, much effort is geared towards the articulation of models and strategies for implementing e-learning in formal education settings. In the field of pre-service teacher education, a rising challenge is to equip the "21st century…
An evaluation of the directed flow graph methodology
NASA Technical Reports Server (NTRS)
Snyder, W. E.; Rajala, S. A.
1984-01-01
The applicability of the Directed Graph Methodology (DGM) to the design and analysis of special purpose image and signal processing hardware was evaluated. A special purpose image processing system was designed and described using DGM. The design, suitable for very large scale integration (VLSI), implements a region labeling technique. Two computer chips were designed, both using metal-nitride-oxide-silicon (MNOS) technology, as well as a functional system utilizing those chips to perform real time region labeling. The system is described in terms of DGM primitives. As it is currently implemented, DGM is inappropriate for describing synchronous, tightly coupled, special purpose systems. The nature of the DGM formalism lends itself more readily to modeling networks of general purpose processors.
ERIC Educational Resources Information Center
Monk, John S.; And Others
A multiple-group, single-intervention intensive time-series design was used to examine the achievement of an abstract concept, plate tectonics, of students grouped on the basis of cognitive tendency. Two questions were addressed: (1) How do daily achievement patterns differ between formal and concrete cognitive tendency groups when learning an…
ERIC Educational Resources Information Center
Manurung, Sondang R.; Mihardi, Satria
2016-01-01
The purpose of this study was to determine the effectiveness of hypertext media-based kinematic learning and formal thinking ability in improving the conceptual understanding of prospective physics students. The research design used is the one-group pretest-posttest experimental design, carried out by taking 36 students from…
FORMAL MODELING, MONITORING, AND CONTROL OF EMERGENCE IN DISTRIBUTED CYBER PHYSICAL SYSTEMS
2018-02-23
University of Texas at Arlington, February 2018; final report covering April 2015 – April 2017. This project studied emergent behavior in distributed cyber-physical systems (DCPS). Emergent…
Experiences Using Lightweight Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1997-01-01
This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, formal methods enhanced the existing verification and validation processes, by testing key properties of the evolving requirements, and helping to identify weaknesses. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
A Tool for Requirements-Based Programming
NASA Technical Reports Server (NTRS)
Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis; Erickson, John
2005-01-01
Absent a general method for mathematically sound, automated transformation of customer requirements into a formal model of the desired system, developers must resort to either manual application of formal methods or to system testing (either manual or automated). While formal methods have afforded numerous successes, they present serious issues, e.g., costs to gear up to apply them (time, expensive staff), and scalability and reproducibility when standards in the field are not settled. The testing path cannot be walked to the ultimate goal, because exhaustive testing is infeasible for all but trivial systems. So system verification remains problematic. System or requirements validation is similarly problematic. The alternatives available today depend on either having a formal model or pursuing enough testing to enable the customer to be certain that system behavior meets requirements. The testing alternative for non-trivial systems always leaves some system behaviors unconfirmed and therefore is not the answer. To ensure that a formal model is equivalent to the customer's requirements necessitates that the customer somehow fully understands the formal model, which is not realistic. The predominant view that provably correct system development depends on having a formal model of the system leads to a desire for a mathematically sound method to automate the transformation of customer requirements into a formal model. Such a method, an augmentation of requirements-based programming, is briefly described in this paper, along with a prototype tool that supports it. The method and tool enable both requirements validation and system verification for the class of systems whose behavior can be described as scenarios. An application of the tool to a prototype automated ground control system for a NASA mission is presented.
Improving Learner Outcomes in Lifelong Education: Formal Pedagogies in Non-Formal Learning Contexts?
ERIC Educational Resources Information Center
Zepke, Nick; Leach, Linda
2006-01-01
This article explores how far research findings about successful pedagogies in formal post-school education might be used in non-formal learning contexts--settings where learning may not lead to formal qualifications. It does this by examining a learner outcomes model adapted from a synthesis of research into retention. The article first…
NASA Astrophysics Data System (ADS)
Abidi, Dhafer
TTEthernet is a deterministic network technology that makes enhancements to Layer 2 Quality-of-Service (QoS) for Ethernet. The components that implement its services enrich the Ethernet functionality with distributed fault-tolerant synchronization, robust temporal partitioning of bandwidth, and synchronous communication with fixed latency and low jitter. TTEthernet services can facilitate the design of scalable, robust, less complex distributed systems and architectures that are tolerant to faults. Simulation is nowadays an essential step in the critical systems design process and represents a valuable support for validation and performance evaluation. CoRE4INET is a project bringing together all TTEthernet simulation models currently available; it is based on extending the models of the OMNeT++ INET framework. Our objective is to study and simulate the TTEthernet protocol on a flight management subsystem (FMS). The idea is to use CoRE4INET to design the simulation model of the target system. The problem is that CoRE4INET does not offer a task scheduling tool for TTEthernet networks. To overcome this problem we propose an adaptation, for simulation purposes, of a task scheduling approach based on formal specification of network constraints. The use of the Yices solver allowed the translation of the formal specification into an executable program to generate the desired transmission plan. A case study finally allowed us to assess the impact of the arrangement of Time-Triggered frame offsets on the performance of each type of traffic in the system.
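The network constraints such a scheduler must formalize can be sketched in standard form (notation assumed here, not taken from the thesis): for two time-triggered frames i and j sharing a link, with offsets o_i, o_j, periods p_i, p_j, and transmission durations d_i, d_j, every pair of instances within the hyperperiod H = lcm(p_i, p_j) must be contention-free:

    \forall a \in \{0,\dots,H/p_i - 1\},\ \forall b \in \{0,\dots,H/p_j - 1\}:
    (o_i + a p_i + d_i \le o_j + b p_j) \;\lor\; (o_j + b p_j + d_j \le o_i + a p_i)

A solver such as Yices then searches for offsets satisfying all pairwise constraints, which is the transmission plan the abstract refers to.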
On the formalization and reuse of scientific research.
King, Ross D; Liakata, Maria; Lu, Chuan; Oliver, Stephen G; Soldatova, Larisa N
2011-10-07
The reuse of scientific knowledge obtained from one investigation in another investigation is basic to the advance of science. Scientific investigations should therefore be recorded in ways that promote the reuse of the knowledge they generate. The use of logical formalisms to describe scientific knowledge has potential advantages in facilitating such reuse. Here, we propose a formal framework for using logical formalisms to promote reuse. We demonstrate the utility of this framework by using it in a worked example from biology: demonstrating cycles of investigation formalization [F] and reuse [R] to generate new knowledge. We first used logic to formally describe a Robot scientist investigation into yeast (Saccharomyces cerevisiae) functional genomics [f(1)]. With Robot scientists, unlike human scientists, the production of comprehensive metadata about their investigations is a natural by-product of the way they work. We then demonstrated how this formalism enabled the reuse of the research in investigating yeast phenotypes [r(1) = R(f(1))]. This investigation found that the removal of non-essential enzymes generally resulted in enhanced growth. The phenotype investigation was then formally described using the same logical formalism as the functional genomics investigation [f(2) = F(r(1))]. We then demonstrated how this formalism enabled the reuse of the phenotype investigation to investigate yeast systems-biology modelling [r(2) = R(f(2))]. This investigation found that yeast flux-balance analysis models fail to predict the observed changes in growth. Finally, the systems biology investigation was formalized for reuse in future investigations [f(3) = F(r(2))]. These cycles of reuse are a model for the general reuse of scientific knowledge.
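The formalize/reuse cycle described here can be stated compactly (a sketch in the abstract's own notation): with F the formalization step and R the reuse step,

    f_1 = F(\text{initial investigation}), \qquad r_n = R(f_n), \qquad f_{n+1} = F(r_n),

so the three investigations above correspond to f_1, r_1 = R(f_1), f_2 = F(r_1), r_2 = R(f_2), and f_3 = F(r_2).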
Optimally designing games for behavioural research
Rafferty, Anna N.; Zaharia, Matei; Griffiths, Thomas L.
2014-01-01
Computer games can be motivating and engaging experiences that facilitate learning, leading to their increasing use in education and behavioural experiments. For these applications, it is often important to make inferences about the knowledge and cognitive processes of players based on their behaviour. However, designing games that provide useful behavioural data is a difficult task that typically requires significant trial and error. We address this issue by creating a new formal framework that extends optimal experiment design, used in statistics, to apply to game design. In this framework, we use Markov decision processes to model players' actions within a game, and then make inferences about the parameters of a cognitive model from these actions. Using a variety of concept learning games, we show that in practice, this method can predict which games will result in better estimates of the parameters of interest. The best games require only half as many players to attain the same level of precision. PMID:25002821
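A minimal Python sketch of the underlying experiment-design computation, under simplifying assumptions: candidate designs are scored by the expected posterior variance of a single parameter after one binary response, and the design with the smallest expected variance is preferred. The grid, response models, and design names are illustrative, not the paper's games or MDPs.

    import numpy as np

    theta = np.linspace(0.01, 0.99, 99)          # parameter grid
    prior = np.ones_like(theta) / theta.size     # uniform prior

    def expected_posterior_var(p_success):
        # p_success(theta): probability of a "success" response under the design.
        ps = p_success(theta)
        total = 0.0
        for lik in (ps, 1.0 - ps):               # response = 1, response = 0
            marginal = np.sum(prior * lik)
            post = prior * lik / marginal
            mean = np.sum(post * theta)
            total += marginal * np.sum(post * (theta - mean) ** 2)
        return total

    designs = {"weakly_informative": lambda t: 0.5 + 0.05 * (t - 0.5),
               "diagnostic": lambda t: t}        # hypothetical designs
    best = min(designs, key=lambda name: expected_posterior_var(designs[name]))
    print(best)                                  # -> "diagnostic"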
NASA Astrophysics Data System (ADS)
Macioł, Piotr; Regulski, Krzysztof
2016-08-01
We present a process of semantic meta-model development for data management in an adaptable multiscale modeling framework. The main problems in ontology design are discussed, and a solution achieved as a result of the research is presented. The main concepts concerning the application and data management background for multiscale modeling were derived from the AM3 approach—object-oriented Agile multiscale modeling methodology. The ontological description of multiscale models enables validation of semantic correctness of data interchange between submodels. We also present a possibility of using the ontological model as a supervisor in conjunction with a multiscale model controller and a knowledge base system. Multiscale modeling formal ontology (MMFO), designed for describing multiscale models' data and structures, is presented. A need for applying meta-ontology in the MMFO development process is discussed. Examples of MMFO application in describing thermo-mechanical treatment of metal alloys are discussed. Present and future applications of MMFO are described.
ERIC Educational Resources Information Center
Penning, Margaret J.
2002-01-01
Purpose: In response to concerns among policymakers and others that increases in the availability of publicly funded formal services will lead to reductions in self- and informal care, this study examines the relationship between the extent of formal in-home care received and levels of self- and informal care. Design and Methods: Two-stage least…
The Influence of Rural Location on Utilization of Formal Home Care: The Role of Medicaid
ERIC Educational Resources Information Center
McAuley, William J.; Spector, William D.; Van Nostrand, Joan; Shaffer, Tom
2004-01-01
Purpose: This research examines the impact of rural-urban residence on formal home-care utilization among older people and determines whether and how Medicaid coverage influences the association between rural-urban location and risk of formal home-care use. Design and Methods: We combined data from the 1998 consolidated file of the Medical…
On the adequacy of current empirical evaluations of formal models of categorization.
Wills, Andy J; Pothos, Emmanuel M
2012-01-01
Categorization is one of the fundamental building blocks of cognition, and the study of categorization is notable for the extent to which formal modeling has been a central and influential component of research. However, the field has seen a proliferation of noncomplementary models with little consensus on the relative adequacy of these accounts. Progress in assessing the relative adequacy of formal categorization models has, to date, been limited because (a) formal model comparisons are narrow in the number of models and phenomena considered and (b) models do not often clearly define their explanatory scope. Progress is further hampered by the practice of fitting models with arbitrarily variable parameters to each data set independently. Reviewing examples of good practice in the literature, we conclude that model comparisons are most fruitful when relative adequacy is assessed by comparing well-defined models on the basis of the number and proportion of irreversible, ordinal, penetrable successes (principles of minimal flexibility, breadth, good-enough precision, maximal simplicity, and psychological focus).
Detecting Mode Confusion Through Formal Modeling and Analysis
NASA Technical Reports Server (NTRS)
Miller, Steven P.; Potts, James N.
1999-01-01
Aircraft safety has improved steadily over the last few decades. While much of this improvement can be attributed to the introduction of advanced automation in the cockpit, the growing complexity of these systems also increases the potential for the pilots to become confused about what the automation is doing. This phenomenon, often referred to as mode confusion, has been involved in several accidents involving modern aircraft. This report describes an effort by Rockwell Collins and NASA Langley to identify potential sources of mode confusion through two complementary strategies. The first is to create a clear, executable model of the automation, connect it to a simulation of the flight deck, and use this combination to review the behavior of the automation and the man-machine interface with the designers, pilots, and experts in human factors. The second strategy is to conduct mathematical analyses of the model by translating it into a formal specification suitable for analysis with automated tools. The approach is illustrated by applying it to a hypothetical, but still realistic, example of the mode logic of a Flight Guidance System.
NASA Technical Reports Server (NTRS)
Zeigler, Bernard P.
1989-01-01
It is shown how systems can be advantageously represented as discrete-event models by using DEVS (discrete-event system specification), a set-theoretic formalism. Such DEVS models provide a basis for the design of event-based logic control. In this control paradigm, the controller expects to receive confirming sensor responses to its control commands within definite time windows determined by its DEVS model of the system under control. The event-based control paradigm is applied in advanced robotic and intelligent automation, showing how classical process control can be readily interfaced with rule-based symbolic reasoning systems.
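A hedged Python sketch of the event-based control idea: a DEVS-style atomic model whose time-advance function defines the window within which a confirming sensor response must arrive. The class shape and the two states are illustrative assumptions, not code from the report.

    class EventBasedController:
        # Minimal DEVS-flavoured atomic model: a state, a time-advance
        # function ta(), an internal transition (time window expired),
        # and an external transition (sensor event received).
        def __init__(self):
            self.state = "idle"

        def ta(self):                       # how long to remain in this state
            return {"idle": float("inf"), "awaiting_ack": 2.0}[self.state]

        def delta_int(self):                # window expired without response
            print("fault: no sensor confirmation within window")
            self.state = "idle"

        def delta_ext(self, event):         # sensor response arrived in time
            if self.state == "awaiting_ack" and event == "sensor_ack":
                self.state = "idle"

        def command(self):                  # issue control command, open window
            self.state = "awaiting_ack"

    ctrl = EventBasedController()
    ctrl.command()                          # expects sensor_ack within ta() = 2.0
    ctrl.delta_ext("sensor_ack")            # confirmation within the window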
Third NASA Langley Formal Methods Workshop
NASA Technical Reports Server (NTRS)
Holloway, C. Michael (Compiler)
1995-01-01
This publication constitutes the proceedings of NASA Langley Research Center's third workshop on the application of formal methods to the design and verification of life-critical systems. This workshop brought together formal methods researchers, industry engineers, and academicians to discuss the potential of NASA-sponsored formal methods and to investigate new opportunities for applying these methods to industry problems. Contained herein are copies of the material presented at the workshop, summaries of many of the presentations, a complete list of attendees, and a detailed summary of the Langley formal methods program. Much of this material is available electronically through the World-Wide Web via the following URL.
NASA Formal Methods Workshop, 1990
NASA Technical Reports Server (NTRS)
Butler, Ricky W. (Compiler)
1990-01-01
The workshop brought together researchers involved in the NASA formal methods research effort for detailed technical interchange and provided a mechanism for interaction with representatives from the FAA and the aerospace industry. The workshop also included speakers from industry to debrief the formal methods researchers on the current state of practice in flight critical system design, verification, and certification. The goals were: define and characterize the verification problem for ultra-reliable life critical flight control systems and the current state of practice in industry today; determine the proper role of formal methods in addressing these problems, and assess the state of the art and recent progress toward applying formal methods to this area.
ERIC Educational Resources Information Center
Laosa, Luis M.; And Others
As the second volume in a 4-volume evaluation report on the University of Massachusetts Non-Formal Education Project (UMass NFEP) in rural Ecuador, this volume details the evaluation design. Cited as basic to the evaluation design are questions which ask: (1) What kinds of effects (changes) can be observed? and (2) What are characteristics of the…
UTP and Temporal Logic Model Checking
NASA Astrophysics Data System (ADS)
Anderson, Hugh; Ciobanu, Gabriel; Freitas, Leo
In this paper we give an additional perspective on the formal verification of programs through temporal logic model checking, which uses Hoare and He's Unifying Theories of Programming (UTP). Our perspective emphasizes the use of UTP designs, an alphabetised relational calculus expressed as a pre/post condition pair of relations, to verify state or temporal assertions about programs. The temporal model checking relation is derived from a satisfaction relation between the model and its properties. The contribution of this paper is that it shows a UTP perspective on temporal logic model checking. The approach includes the notion of efficiency found in traditional model checkers, which reduce the state explosion problem through the use of efficient data structures.
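For reference, the UTP design construct mentioned above has a standard definition (a sketch; the paper's exact formulation may differ in detail): a design with assumption P and commitment Q is the alphabetised relation

    (P \vdash Q) \;\widehat{=}\; (ok \land P) \Rightarrow (ok' \land Q)

where ok and ok' record that the program has started and has terminated; model checking is then cast as establishing a satisfaction relation M \models \varphi between such a relational model M and a temporal property \varphi.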
MOS 2.0: Modeling the Next Revolutionary Mission Operations System
NASA Technical Reports Server (NTRS)
Delp, Christopher L.; Bindschadler, Duane; Wollaeger, Ryan; Carrion, Carlos; McCullar, Michelle; Jackson, Maddalena; Sarrel, Marc; Anderson, Louise; Lam, Doris
2011-01-01
Designed and implemented in the 1980s, the Advanced Multi-Mission Operations System (AMMOS) was a breakthrough for deep-space NASA missions, enabling significant reductions in the cost and risk of implementing ground systems. By designing a framework for use across multiple missions and adaptability to specific mission needs, AMMOS developers created a set of applications that have operated dozens of deep-space robotic missions over the past 30 years. We seek to leverage advances in technology and practice of architecting and systems engineering, using model-based approaches to update the AMMOS. We therefore revisit fundamental aspects of the AMMOS, resulting in a major update to the Mission Operations System (MOS): MOS 2.0. This update will ensure that the MOS can support an increasing range of mission types (such as orbiters, landers, rovers, penetrators and balloons), and that the operations systems for deep-space robotic missions can reap the benefits of an iterative multi-mission framework. This paper reports on the first phase of this major update. Here we describe the methods and formal semantics used to address MOS 2.0 architecture and some early results. Early benefits of this approach include improved stakeholder input and buy-in, the ability to articulate and focus effort on key, system-wide principles, and efficiency gains obtained by use of well-architected design patterns and the use of models to improve the quality of documentation and decrease the effort required to produce and maintain it. We find that such methods facilitate reasoning, simulation, and analysis of the system design in terms of design impacts, generation of products (e.g., project-review and software-delivery products), and use of formal process descriptions to enable goal-based operations. This initial phase yields a forward-looking and principled MOS 2.0 architectural vision, which considers both the mission-specific context and long-term system sustainability.
NASA Astrophysics Data System (ADS)
Vassiliev, Oleg N.; Grosshans, David R.; Mohan, Radhe
2017-10-01
We propose a new formalism for calculating parameters α and β of the linear-quadratic model of cell survival. This formalism, primarily intended for calculating relative biological effectiveness (RBE) for treatment planning in hadron therapy, is based on a recently proposed microdosimetric revision of the single-target multi-hit model. The main advantage of our formalism is that it reliably produces α and β that have correct general properties with respect to their dependence on physical properties of the beam, including the asymptotic behavior for very low and high linear energy transfer (LET) beams. For example, in the case of monoenergetic beams, our formalism predicts that, as a function of LET, (a) α has a maximum and (b) the α/β ratio increases monotonically with increasing LET. No prior models reviewed in this study predict both properties (a) and (b) correctly, and therefore, these prior models are valid only within a limited LET range. We first present our formalism in a general form, for polyenergetic beams. A significant new result in this general case is that parameter β is represented as an average over the joint distribution of energies E_1 and E_2 of two particles in the beam. This result is consistent with the role of the quadratic term in the linear-quadratic model. It accounts for the two-track mechanism of cell kill, in which two particles, one after another, damage the same site in the cell nucleus. We then present simplified versions of the formalism, and discuss predicted properties of α and β. Finally, to demonstrate consistency of our formalism with experimental data, we apply it to fit two sets of experimental data: (1) α for heavy ions, covering a broad range of LETs, and (2) β for protons. In both cases, good agreement is achieved.
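For context, the linear-quadratic model referred to throughout is the standard cell-survival relation (notation assumed):

    S(D) = \exp(-\alpha D - \beta D^2)

where S is the surviving fraction at dose D. The abstract's claims then read: as functions of LET, α should peak and α/β should rise monotonically, and β, arising from two-track action, is an average over the joint distribution of the energies (E_1, E_2) of particle pairs.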
NASA Technical Reports Server (NTRS)
Ghovanlou, A. H.; Gupta, J. N.; Henderson, R. G.
1977-01-01
The development of quantitative analytical procedures for relating scattered signals, measured by a remote sensor, was considered. The applications of a Monte Carlo simulation model for radiative transfer in turbid water are discussed. The model is designed to calculate the characteristics of the backscattered signal from an illuminated body of water as a function of the turbidity level, and the spectral properties of the suspended particulates. The optical properties of the environmental waters, necessary for model applications, were derived from available experimental data and/or calculated from Mie formalism. Results of applications of the model are presented.
Bruening, Rebecca A; Strazza, Karen; Nocera, Maryalice; Peek-Asa, Corinne; Casteel, Carri
2015-06-01
Small retail businesses experience high robbery and violent crime rates leading to injury and death. Workplace violence prevention programs (WVPP) based on Crime Prevention Through Environmental Design reduce this risk, but low small business participation limits their effectiveness. Recent dissemination models of occupational safety and health information recommend collaborating with an intermediary organization to engage small businesses. Qualitative interviews with 70 small business operators and 32 representatives of organizations with small business influence were conducted to identify factors and recommendations for improving dissemination of a WVPP. Both study groups recommended promoting WVPPs through personal contacts but differed on other promotion methods and the type of influential groups to target. Small business operators indicated few connections to formal business networks. Dissemination of WVPPs to small businesses may require models inclusive of influential individuals (e.g., respected business owners) as intermediaries to reach small businesses with few formal connections. © 2015 Wiley Periodicals, Inc.
Unified modeling language and design of a case-based retrieval system in medical imaging.
LeBozec, C.; Jaulent, M. C.; Zapletal, E.; Degoulet, P.
1998-01-01
One goal of artificial intelligence research into case-based reasoning (CBR) systems is to develop approaches for designing useful and practical interactive case-based environments. Explaining each step of the design of the case-base and of the retrieval process is critical for the application of case-based systems to the real world. We describe herein our approach to the design of IDEM--Images and Diagnosis from Examples in Medicine--a medical image case-based retrieval system for pathologists. Our approach is based on the expressiveness of an object-oriented modeling language standard: the Unified Modeling Language (UML). We created a set of diagrams in UML notation illustrating the steps of the CBR methodology we used. The key aspect of this approach was selecting the relevant objects of the system according to user requirements and making visualization of cases and of the components of the case retrieval process. Further evaluation of the expressiveness of the design document is required but UML seems to be a promising formalism, improving the communication between the developers and users. PMID:9929346
Enlisted Personnel Individualized Career System (EPICS) Test and Evaluation
1984-01-01
The EPICS program, which was developed using an integrated personnel systems approach (IPSA), delays formal school training until after personnel have… received shipboard on-job training complemented by job performance aids (JPAs). Early phases of the program, which involved developing the IPSA EPICS… detailed description of the conception and development of the EPICS IPSA model, the execution of the front-end job design analyses, JPA and instructional…
Formal Language Design in the Context of Domain Engineering
2000-03-28
There are only a few domain analysis methods that are well defined and used repeatedly in practice. These include feature-oriented domain analysis (FODA), organizational domain modeling (ODM), and domain-specific software… Feature-oriented domain analysis (FODA) is a domain analysis method being researched and applied by the SEI…
Ofili, Elizabeth O; Fair, Alecia; Norris, Keith; Verbalis, Joseph G; Poland, Russell; Bernard, Gordon; Stephens, David S; Dubinett, Steven M; Imperato-McGinley, Julianne; Dottin, Robert P; Pulley, Jill; West, Andrew; Brown, Arleen; Mellman, Thomas A
2013-12-01
Health disparities are an immense challenge to American society. Clinical and Translational Science Awards (CTSAs) housed within the National Center for Advancing Translational Science (NCATS) are designed to accelerate the translation of experimental findings into clinically meaningful practices and bring new therapies to the doorsteps of all patients. The Research Centers at Minority Institutions (RCMI) program at the National Institute on Minority Health and Health Disparities (NIMHD) is designed to build capacity for biomedical research and training at minority serving institutions. The CTSA created a mechanism fostering formal collaborations between research intensive universities and minority serving institutions (MSI) supported by the RCMI program. These consortium-level collaborations activate unique translational research approaches to reduce health disparities with credence to each academic institution's history and unique characteristics. Five formal partnerships between research intensive universities and MSI have formed as a result of the CTSA and RCMI programs. These partnerships present a multifocal approach: shifting cultural change and consciousness toward addressing health disparities, and training the next generation of minority scientists. This collaborative model is based on the respective strengths and contributions of the partnering institutions, allowing bidirectional interchange and leveraging NIH and institutional investments, providing measurable benchmarks toward the elimination of health disparities. © 2013 Wiley Periodicals, Inc. PMID:24119157
Dose rate calculations around 192Ir brachytherapy sources using a Sievert integration model
NASA Astrophysics Data System (ADS)
Karaiskos, P.; Angelopoulos, A.; Baras, P.; Rozaki-Mavrouli, H.; Sandilos, P.; Vlachos, L.; Sakelliou, L.
2000-02-01
The classical Sievert integral method is a valuable tool for dose rate calculations around brachytherapy sources, combining simplicity with reasonable computational times. However, its accuracy in predicting dose rate anisotropy around 192Ir brachytherapy sources has been repeatedly put into question. In this work, we used a primary and scatter separation technique to improve an existing modification of the Sievert integral (Williamson's isotropic scatter model) that determines dose rate anisotropy around commercially available 192Ir brachytherapy sources. The proposed Sievert formalism provides increased accuracy while maintaining the simplicity and computational time efficiency of the Sievert integral method. To describe transmission within the materials encountered, the formalism makes use of narrow beam attenuation coefficients which can be directly and easily calculated from the initially emitted 192Ir spectrum. The other numerical parameters required for its implementation, once calculated with the aid of our home-made Monte Carlo simulation code, can be used for any 192Ir source design. Calculations of dose rate and anisotropy functions with the proposed Sievert expression, around commonly used 192Ir high dose rate sources and other 192Ir elongated source designs, are in good agreement with corresponding accurate Monte Carlo results which have been reported by our group and other authors.
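For reference, the classical Sievert integral the paper builds on has the familiar textbook form (a sketch with assumed notation): for a line source of active length L and filter thickness t, viewed from a point at perpendicular distance y from the source axis,

    \dot{D}(y) \propto \frac{1}{L y} \int_{\theta_1}^{\theta_2} e^{-\mu t \sec\theta} \, d\theta

where μ is the filter attenuation coefficient and θ_1, θ_2 subtend the active length. The contribution described above is to separate primary and scatter components and to take μ as a narrow-beam coefficient computed from the initial 192Ir spectrum.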
NASA Technical Reports Server (NTRS)
Kershaw, John
1990-01-01
The VIPER project has so far produced a formal specification of a 32 bit RISC microprocessor, an implementation of that chip in radiation-hard SOS technology, a partial proof of correctness of the implementation which is still being extended, and a large body of supporting software. The time has now come to consider what has been achieved and what directions should be pursued in the future. The most obvious lesson from the VIPER project was the time and effort needed to use formal methods properly. Most of the problems arose in the interfaces between different formalisms, e.g., between the (informal) English description and the HOL spec, between the block-level spec in HOL and the equivalent in ELLA needed by the low-level CAD tools. These interfaces need to be made rigorous or (better) eliminated. VIPER 1A (the latest chip) is designed to operate in pairs, to give protection against breakdowns in service as well as design faults. We have come to regard redundancy and formal design methods as complementary, the one to guard against normal component failures and the other to provide insurance against the risk of the common-cause failures which bedevil reliability predictions. Any future VIPER chips will certainly need improved performance to keep up with increasingly demanding applications. We have a prototype design (not yet specified formally) which includes 32 and 64 bit multiply, instruction pre-fetch, more efficient interface timing, and a new instruction to allow a quick response to peripheral requests. Work is under way to specify this device in MIRANDA, and then to refine the spec into a block-level design by top-down transformations. When the refinement is complete, a relatively simple proof checker should be able to demonstrate its correctness. This paper is presented in viewgraph form.
Evolving cell models for systems and synthetic biology.
Cao, Hongqing; Romero-Campero, Francisco J; Heeb, Stephan; Cámara, Miguel; Krasnogor, Natalio
2010-03-01
This paper proposes a new methodology for the automated design of cell models for systems and synthetic biology. Our modelling framework is based on P systems, a discrete, stochastic and modular formal modelling language. The automated design of biological models comprising the optimization of the model structure and its stochastic kinetic constants is performed using an evolutionary algorithm. The evolutionary algorithm evolves model structures by combining different modules taken from a predefined module library, and then it fine-tunes the associated stochastic kinetic constants. We investigate four alternative objective functions for the fitness calculation within the evolutionary algorithm: (1) equally weighted sum method, (2) normalization method, (3) randomly weighted sum method, and (4) equally weighted product method. The effectiveness of the methodology is tested on four case studies of increasing complexity including negative and positive autoregulation as well as two gene networks implementing a pulse generator and a bandwidth detector. We provide a systematic analysis of the evolutionary algorithm's results as well as of the resulting evolved cell models.
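A minimal Python sketch of the four fitness-aggregation alternatives listed above, assuming each candidate model yields a vector of per-objective errors to be minimized; the normalization bounds and random weights are illustrative assumptions.

    import random

    def fitness(errors, method, bounds=None):
        # Aggregate per-objective errors into one score (lower is better).
        if method == "equal_sum":                 # (1) equally weighted sum
            return sum(errors) / len(errors)
        if method == "normalized_sum":            # (2) normalize, then sum
            scaled = [(e - lo) / (hi - lo) for e, (lo, hi) in zip(errors, bounds)]
            return sum(scaled) / len(scaled)
        if method == "random_sum":                # (3) randomly weighted sum
            w = [random.random() for _ in errors]
            return sum(wi * e for wi, e in zip(w, errors)) / sum(w)
        if method == "equal_product":             # (4) equally weighted product
            score = 1.0
            for e in errors:
                score *= e
            return score
        raise ValueError(method)

    print(fitness([0.2, 0.5, 0.1], "equal_sum"))        # 0.266...
    print(fitness([0.2, 0.5, 0.1], "equal_product"))    # 0.010...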
Nichols, J.D.; Runge, M.C.; Johnson, F.A.; Williams, B.K.; Schodde, Richard; Hannon, Susan; Scheiffarth, Gregor; Bairlein, Franz
2006-01-01
The history of North American waterfowl harvest management has been characterized by attempts to use population monitoring data to make informed harvest management decisions. Early attempts can be characterized as intuitive decision processes, and later efforts were guided increasingly by population models and associated predictions. In 1995, a formal adaptive management process was implemented, and annual decisions about duck harvest regulations in the United States are still based on this process. This formal decision process is designed to deal appropriately with the various forms of uncertainty that characterize management decisions: environmental uncertainty, structural uncertainty, partial controllability, and partial observability. The key components of the process are (1) objectives, (2) potential management actions, (3) model(s) of population response to management actions, (4) credibility measures for these models, and (5) a monitoring program. The operation of this iterative process is described, and a brief history of a decade of its use is presented. Future challenges range from social and political issues such as appropriate objectives and management actions, to technical issues such as multispecies management, geographic allocation of harvest, and incorporation of actions that include habitat acquisition and management.
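Component (4), the model credibility measures, is what makes the process adaptive: each year the weights are updated by comparing model predictions against the monitoring data. A hedged Python sketch of that Bayesian update, with hypothetical model names, predictions, and a Gaussian predictive likelihood as illustrative assumptions:

    import math

    def update_weights(weights, predictions, observed, sd):
        # One annual Bayes step: weight_i <- weight_i * likelihood_i / total.
        lik = {m: math.exp(-0.5 * ((observed - p) / sd) ** 2)
               for m, p in predictions.items()}
        post = {m: w * lik[m] for m, w in weights.items()}
        total = sum(post.values())
        return {m: v / total for m, v in post.items()}

    # Four competing population models, initially equally credible.
    weights = dict.fromkeys(["model_A", "model_B", "model_C", "model_D"], 0.25)
    predictions = {"model_A": 7.9, "model_B": 8.6,      # predicted population
                   "model_C": 7.1, "model_D": 8.2}      # sizes (hypothetical)
    weights = update_weights(weights, predictions, observed=8.4, sd=0.5)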
Modified Petri net model sensitivity to workload manipulations
NASA Technical Reports Server (NTRS)
White, S. A.; Mackinnon, D. P.; Lyman, J.
1986-01-01
Modified Petri Nets (MPNs) are investigated as a workload modeling tool. The results of an exploratory study of the sensitivity of MPNs to workload manipulations in a dual task are described. Petri nets have been used to represent systems with asynchronous, concurrent and parallel activities (Peterson, 1981). These characteristics led some researchers to suggest the use of Petri nets in workload modeling, where concurrent and parallel activities are common. Petri nets are represented by places and transitions. In the workload application, places represent operator activities and transitions represent events. MPNs have been used to formally represent task events and activities of a human operator in a man-machine system. Some descriptive applications demonstrate the usefulness of MPNs in the formal representation of systems. It is the general hypothesis herein that, in addition to descriptive applications, MPNs may be useful for workload estimation and prediction. Results are reported from the first of a series of experiments designed to develop and test an MPN system of workload estimation and prediction. This first experiment is a screening test of the MPN model's general sensitivity to changes in workload. Positive results from this experiment will justify the more complicated analyses and techniques necessary for developing a workload prediction system.
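A minimal Python sketch of the place/transition mechanics underlying such models (generic Petri-net firing, not the authors' MPN extensions): places hold tokens marking operator activities, and a transition representing a task event fires when all of its input places are marked.

    # Places -> token counts; transitions consume input tokens, produce outputs.
    marking = {"monitoring": 1, "stimulus": 1, "responding": 0}
    transitions = {
        "begin_response": {"in": ["monitoring", "stimulus"], "out": ["responding"]},
        "end_response":   {"in": ["responding"],             "out": ["monitoring"]},
    }

    def enabled(name):
        return all(marking[p] > 0 for p in transitions[name]["in"])

    def fire(name):
        assert enabled(name), name + " is not enabled"
        for p in transitions[name]["in"]:
            marking[p] -= 1
        for p in transitions[name]["out"]:
            marking[p] += 1

    fire("begin_response")   # operator shifts from monitoring to responding
    print(marking)           # {'monitoring': 0, 'stimulus': 0, 'responding': 1}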
Discrete State Change Model of Manufacturing Quality to Aid Assembly Process Design
NASA Astrophysics Data System (ADS)
Koga, Tsuyoshi; Aoyama, Kazuhiro
This paper proposes a representation model of the quality state change in an assembly process that can be used in a computer-aided process design system. In order to formalize the state change of the manufacturing quality in the assembly process, the functions, operations, and quality changes in the assembly process are represented as a network model that can simulate discrete events. This paper also develops a design method for the assembly process. The design method calculates the space of quality state change and outputs a better assembly process (better operations and better sequences) that can be used to obtain the intended quality state of the final product. A computational redesigning algorithm of the assembly process that considers the manufacturing quality is developed. The proposed method can be used to design an improved manufacturing process by simulating the quality state change. A prototype system for planning an assembly process is implemented and applied to the design of an auto-breaker assembly process. The result of the design example indicates that the proposed assembly process planning method outputs a better manufacturing scenario based on the simulation of the quality state change.
NASA Langley Research and Technology-Transfer Program in Formal Methods
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Caldwell, James L.; Carreno, Victor A.; Holloway, C. Michael; Miner, Paul S.; DiVito, Ben L.
1995-01-01
This paper presents an overview of the NASA Langley research program in formal methods. The major goals of this work are to make formal methods practical for use on life-critical systems, and to orchestrate the transfer of this technology to U.S. industry through use of carefully designed demonstration projects. Several direct technology transfer efforts have been initiated that apply formal methods to critical subsystems of real aerospace computer systems. The research team consists of five NASA civil servants and contractors from Odyssey Research Associates, SRI International, and VIGYAN Inc.
Towards the formal verification of the requirements and design of a processor interface unit
NASA Technical Reports Server (NTRS)
Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.
1993-01-01
The formal verification of the design and partial requirements for a Processor Interface Unit (PIU) using the Higher Order Logic (HOL) theorem-proving system is described. The processor interface unit is a single-chip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. It provides the opportunity to investigate the specification and verification of a real-world subsystem within a commercially-developed fault-tolerant computer. An overview of the PIU verification effort is given. The actual HOL listings from the verification effort are documented in a companion NASA contractor report entitled 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit - HOL Listings', including the general-purpose HOL theories and definitions that support the PIU verification as well as tactics used in the proofs.
NASA Technical Reports Server (NTRS)
Hale, Mark A.
1996-01-01
Computer applications for design have evolved rapidly over the past several decades, and significant payoffs are being achieved by organizations through reductions in design cycle times. These applications are overwhelmed by the requirements imposed during complex, open engineering systems design. Organizations are faced with a number of different methodologies, numerous legacy disciplinary tools, and a very large amount of data. Yet they are also faced with few interdisciplinary tools for design collaboration or methods for achieving the revolutionary product designs required to maintain a competitive advantage in the future. These organizations are looking for a software infrastructure that integrates current corporate design practices with newer simulation and solution techniques. Such an infrastructure must be robust to changes in both corporate needs and enabling technologies. In addition, this infrastructure must be user-friendly, modular and scalable. This need is the motivation for the research described in this dissertation. The research is focused on the development of an open computing infrastructure that facilitates product and process design. In addition, this research explicitly deals with human interactions during design through a model that focuses on the role of a designer as that of decision-maker. The research perspective here is taken from that of design as a discipline, with a focus on Decision-Based Design, Theory of Languages, Information Science, and Integration Technology. Given this background, a Model of IPPD is developed and implemented along the lines of a traditional experimental procedure, with the steps of establishing context, formalizing a theory, building an apparatus, conducting an experiment, reviewing results, and providing recommendations. Based on this Model, Design Processes and Specification can be explored in a structured and implementable architecture. An architecture for exploring design called DREAMS (Developing Robust Engineering Analysis Models and Specifications) has been developed that supports the activities of both meta-design and actual design execution. This is accomplished through a systematic process comprising the stages of Formulation, Translation, and Evaluation. During this process, elements from a Design Specification are integrated into Design Processes. In addition, a software infrastructure called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment) was developed; it represents a virtual apparatus in the Design Experiment conducted in this research. IMAGE is an innovative architecture because it explicitly supports design-related activities. This is accomplished through a GUI-driven and agent-based implementation of DREAMS. An HSCT design has been adopted from the Framework for Interdisciplinary Design Optimization (FIDO) and is implemented in IMAGE. This problem shows how Design Processes and Specification interact in a design system. In addition, the problem utilizes two different solution models concurrently: optimal and satisfying. The satisfying model allows for more design flexibility and allows a designer to maintain design freedom. As a result of following this experimental procedure, this infrastructure is an open system that is robust to changes in both corporate needs and computer technologies.
The development of this infrastructure leads to a number of significant intellectual contributions: 1) A new approach to implementing IPPD with the aid of a computer; 2) A formal Design Experiment; 3) A combined Process and Specification architecture that is language-based; 4) An infrastructure for exploring design; 5) An integration strategy for implementing computer resources; and 6) A seamless modeling language. The need for these contributions is emphasized by the demand by industry and government agencies for the development of these technologies.
Design and analysis of DNA strand displacement devices using probabilistic model checking
Lakin, Matthew R.; Parker, David; Cardelli, Luca; Kwiatkowska, Marta; Phillips, Andrew
2012-01-01
Designing correct, robust DNA devices is difficult because of the many possibilities for unwanted interference between molecules in the system. DNA strand displacement has been proposed as a design paradigm for DNA devices, and the DNA strand displacement (DSD) programming language has been developed as a means of formally programming and analysing these devices to check for unwanted interference. We demonstrate, for the first time, the use of probabilistic verification techniques to analyse the correctness, reliability and performance of DNA devices during the design phase. We use the probabilistic model checker prism, in combination with the DSD language, to design and debug DNA strand displacement components and to investigate their kinetics. We show how our techniques can be used to identify design flaws and to evaluate the merits of contrasting design decisions, even on devices comprising relatively few inputs. We then demonstrate the use of these components to construct a DNA strand displacement device for approximate majority voting. Finally, we discuss some of the challenges and possible directions for applying these methods to more complex designs. PMID:22219398
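The kinetic analyses described above treat a strand displacement system as a continuous-time Markov chain; prism computes properties of that chain exactly, but the chain itself can be sketched with a stochastic simulation. A hedged Python example for a single displacement reaction X + G -> Y + W, with the rate constant and molecule counts as illustrative assumptions:

    import random

    k = 1e-3                                   # stochastic rate constant (assumed)
    counts = {"X": 50, "G": 50, "Y": 0, "W": 0}
    t, t_end = 0.0, 500.0

    # Gillespie simulation of the CTMC for: X + G -> Y + W
    while t < t_end and counts["X"] > 0 and counts["G"] > 0:
        propensity = k * counts["X"] * counts["G"]
        t += random.expovariate(propensity)    # exponential waiting time
        for species, change in (("X", -1), ("G", -1), ("Y", 1), ("W", 1)):
            counts[species] += change

    print(t, counts)    # one sampled trajectory of the displacement kinetics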
Malin, Bradley A
2005-01-01
The incorporation of genomic data into personal medical records poses many challenges to patient privacy. In response, various systems for preserving patient privacy in shared genomic data have been developed and deployed. Although these systems de-identify the data by removing explicit identifiers (e.g., name, address, or Social Security number) and incorporate sound security design principles, they suffer from a lack of formal modeling of inferences learnable from shared data. This report evaluates the extent to which current protection systems are capable of withstanding a range of re-identification methods, including genotype-phenotype inferences, location-visit patterns, family structures, and dictionary attacks. For a comparative re-identification analysis, the systems are mapped to a common formalism. Although there is variation in susceptibility, each system is deficient in its protection capacity. The author discovers patterns of protection failure and discusses several of the reasons why these systems are susceptible. The analyses and discussion within provide guideposts for the development of next-generation protection methods amenable to formal proofs. PMID:15492030
NASA Astrophysics Data System (ADS)
Liu, Qiang; Chattopadhyay, Aditi
2000-06-01
Aeromechanical stability plays a critical role in helicopter design, and lead-lag damping is crucial to this design. In this paper, the use of segmented constrained damping layer (SCL) treatment and composite tailoring is investigated for improved rotor aeromechanical stability using a formal optimization technique. The principal load-carrying member in the rotor blade is represented by a composite box beam, of arbitrary thickness, with surface bonded SCLs. A comprehensive theory is used to model the smart box beam. A ground resonance analysis model and an air resonance analysis model are implemented for the rotor blade built around the composite box beam with SCLs. The Pitt-Peters dynamic inflow model is used in air resonance analysis under hover conditions. A hybrid optimization technique is used to investigate the optimum design of the composite box beam with surface bonded SCLs for improved damping characteristics. Parameters such as the stacking sequence of the composite laminates and the placement of SCLs are used as design variables. Detailed numerical studies are presented for aeromechanical stability analysis. It is shown that the optimum blade design yields a significant increase in rotor lead-lag regressive modal damping compared to the initial system.
Monitoring is not enough: on the need for a model-based approach to migratory bird management
Nichols, J.D.; Bonney, Rick; Pashley, David N.; Cooper, Robert; Niles, Larry
2000-01-01
Informed management requires information about system state and about effects of potential management actions on system state. Population monitoring can provide the needed information about system state, as well as information that can be used to investigate effects of management actions. Three methods for investigating effects of management on bird populations are (1) retrospective analysis, (2) formal experimentation and constrained-design studies, and (3) adaptive management. Retrospective analyses provide weak inferences, regardless of the quality of the monitoring data. The active use of monitoring data in experimental or constrained-design studies or in adaptive management is recommended. Under both approaches, learning occurs via the comparison of estimates from the monitoring program with predictions from competing management models.
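The "learning via comparison" step can be sketched concretely as a Bayesian weight update over competing management models. The following Python fragment assumes a Gaussian observation error on the monitoring estimate; the two models and all numbers are invented for illustration, not taken from the article.

    import math

    # Update model weights from how well each competing management model
    # predicted the monitored population estimate (Gaussian error assumed).
    def update_weights(weights, predictions, observed, sigma=50.0):
        like = [math.exp(-0.5 * ((observed - p) / sigma) ** 2) for p in predictions]
        total = sum(w * l for w, l in zip(weights, like))
        return [w * l / total for w, l in zip(weights, like)]

    weights = [0.5, 0.5]  # two competing management models, initially equal
    weights = update_weights(weights, predictions=[1200, 900], observed=950)
    print(weights)        # weight shifts toward the better-predicting model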
Landau's statistical mechanics for quasi-particle models
NASA Astrophysics Data System (ADS)
Bannur, Vishnu M.
2014-04-01
Landau's formalism of statistical mechanics [following L. D. Landau and E. M. Lifshitz, Statistical Physics (Pergamon Press, Oxford, 1980)] is applied to the quasi-particle model of quark-gluon plasma. Here, one starts from the expression for pressure and develops all thermodynamics. It is a general formalism and consistent with our earlier studies [V. M. Bannur, Phys. Lett. B647, 271 (2007)] based on Pathria's formalism [following R. K. Pathria, Statistical Mechanics (Butterworth-Heinemann, Oxford, 1977)]. In Pathria's formalism, one starts from the expression for energy density and develops thermodynamics. Both formalisms are consistent with thermodynamics and statistical mechanics. Under certain conditions, which are wrongly called the thermodynamic consistency relation, we recover other formalisms of quasi-particle systems, like that of M. I. Gorenstein and S. N. Yang, Phys. Rev. D52, 5206 (1995), widely studied in quark-gluon plasma.
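At zero chemical potential the two starting points are linked by standard thermodynamic identities (our summary, not equations quoted from the paper): starting from the pressure P(T),

    s(T) = \frac{dP}{dT}, \qquad \varepsilon(T) = T\,\frac{dP}{dT} - P(T),

while Pathria's route runs in the opposite direction, recovering P(T) by integrating the same relation from a given energy density \varepsilon(T).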
Revisiting the PLUMBER Experiments from a Process-Diagnostics Perspective
NASA Astrophysics Data System (ADS)
Nearing, G. S.; Ruddell, B. L.; Clark, M. P.; Nijssen, B.; Peters-Lidard, C. D.
2017-12-01
The PLUMBER benchmarking experiments [1] showed that some of the most sophisticated land models (CABLE, CH-TESSEL, COLA-SSiB, ISBA-SURFEX, JULES, Mosaic, Noah, ORCHIDEE) were outperformed - in simulations of half-hourly surface energy fluxes - by instantaneous, out-of-sample, and globally-stationary regressions with no state memory. One criticism of PLUMBER is that the benchmarking methodology was not derived formally, so that applying a similar methodology with different performance metrics can result in qualitatively different results. Another common criticism of model intercomparison projects in general is that they offer little insight into process-level deficiencies in the models, and therefore are of marginal value for helping to improve the models. We address both of these issues by proposing a formal benchmarking methodology that also yields a formal and quantitative method for process-level diagnostics. We apply this to the PLUMBER experiments to show that (1) the PLUMBER conclusions were generally correct - the models use only a fraction of the information available to them from met forcing data (<50% by our analysis), and (2) all of the land models investigated by PLUMBER have similar process-level error structures, and therefore together do not represent a meaningful sample of structural or epistemic uncertainty. We conclude by suggesting two ways to improve the experimental design of model intercomparison and/or model benchmarking studies like PLUMBER. First, PLUMBER did not report model parameter values, and it is necessary to know these values to separate parameter uncertainty from structural uncertainty. This is a first order requirement if we want to use intercomparison studies to provide feedback to model development. Second, technical documentation of land models is inadequate. Future model intercomparison projects should begin with a collaborative effort by model developers to document specific differences between model structures. This could be done in a reproducible way using a unified, process-flexible system like SUMMA [2]. [1] Best, M.J. et al. (2015) 'The plumbing of land surface models: benchmarking model performance', J. Hydrometeor. [2] Clark, M.P. et al. (2015) 'A unified approach for process-based hydrologic modeling: 1. Modeling concept', Water Resour. Res.
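One plausible way to operationalize "fraction of the available information used" is as a ratio of mutual informations between simulated and observed fluxes. The Python sketch below uses simple histogram binning on synthetic data; it illustrates the idea only and is not the authors' exact estimator.

    import numpy as np

    # Information-use ratio: I(model; obs) / I(benchmark; obs), estimated by
    # histogram binning. Variable names and data are placeholders.
    def mutual_information(x, y, bins=20):
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = pxy / pxy.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

    rng = np.random.default_rng(0)
    obs = rng.normal(size=5000)                       # observed flux (synthetic)
    model = obs + rng.normal(scale=1.0, size=5000)    # land-model output
    bench = obs + rng.normal(scale=0.3, size=5000)    # regression benchmark
    ratio = mutual_information(model, obs) / mutual_information(bench, obs)
    print(f"fraction of benchmark information captured: {ratio:.2f}")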
Assessing Requirements Quality through Requirements Coverage
NASA Technical Reports Server (NTRS)
Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt
2008-01-01
In model-based development, the development effort is centered around a formal description of the proposed software system: the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has been largely determined through ad-hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements and existing model coverage metrics such as the Modified Condition and Decision Coverage (MC/DC) used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.
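The MC/DC metric mentioned above can be made concrete with a toy decision. For each condition, MC/DC requires a pair of tests that differ only in that condition and flip the decision outcome, demonstrating the condition's independent effect. The decision function below is our own example, not one from the paper.

    from itertools import product

    def decision(a, b, c):
        return (a or b) and c        # toy three-condition decision

    # For each condition, search for an MC/DC independence pair.
    def mcdc_pairs(n_conditions=3):
        pairs = {}
        tests = list(product([False, True], repeat=n_conditions))
        for i in range(n_conditions):
            for t in tests:
                flipped = list(t)
                flipped[i] = not flipped[i]
                if decision(*t) != decision(*flipped):
                    pairs[i] = (t, tuple(flipped))
                    break
        return pairs

    for cond, (t1, t2) in mcdc_pairs().items():
        print(f"condition {cond}: {t1} vs {t2}")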
Phase-locked loops. [analog, hybrid, discrete and digital systems]
NASA Technical Reports Server (NTRS)
Gupta, S. C.
1974-01-01
The basic analysis and design procedures are described for the realization of analog phase-locked loops (APLL), hybrid phase-locked loops (HPLL), discrete phase-locked loops, and digital phase-locked loops (DPLL). Basic configurations are diagrammed, and performance curves are given. A discrete communications model is derived and developed. The use of the APLL as an optimum angle demodulator and the Kalman-Bucy approach to APLL design are discussed. The literature in the area of phase-locked loops is reviewed, and an extensive bibliography is given. Although the design of APLLs is fairly well documented, work on discrete, hybrid, and digital PLLs is scattered, and more will have to be done in the future to pinpoint the formal design of DPLLs.
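The common structure of these loops (phase detector, loop filter, controlled oscillator) can be sketched for the digital case. The Python fragment below implements a second-order software PLL with proportional and integral paths; the gains and frequencies are illustrative choices, not values from the report, and a real design would be analyzed for loop bandwidth and stability.

    import math

    # Second-order DPLL: mixer-style phase detector, PI loop filter, NCO.
    def dpll(f_in=0.05, f0=0.04, n=2000, kp=0.2, ki=0.005):
        theta = 0.0            # NCO phase (cycles)
        f_est = f0             # NCO frequency estimate, starts off-frequency
        err = 0.0
        for k in range(n):
            ref = math.sin(2 * math.pi * f_in * k)
            err = ref * math.cos(2 * math.pi * theta)  # phase detector
            f_est += ki * err                          # integral path
            theta += f_est + kp * err                  # NCO update
        return f_est, err

    f_locked, last_err = dpll()
    print(f"estimated frequency: {f_locked:.4f}, last error sample: {last_err:.3f}")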
Modeling users, context and devices for ambient assisted living environments.
Castillejo, Eduardo; Almeida, Aitor; López-de-Ipiña, Diego; Chen, Liming
2014-03-17
The participation of users within AAL environments is increasing thanks to the capabilities of current wearable devices. Furthermore, considering users' preferences, context conditions and devices' capabilities helps smart environments to personalize services and resources for them. Being aware of the different characteristics of the entities participating in these situations is vital for reaching the main goals of the corresponding systems efficiently. To collect different information from these entities, it is necessary to design several formal models which help designers to organize and give some meaning to the gathered data. In this paper, we analyze several literature solutions for modeling users, context and devices, considering different approaches in the Ambient Assisted Living domain. We also remark on different ongoing standardization efforts in this area, and discuss the techniques used, the characteristics modeled, and the advantages and drawbacks of each approach, to finally draw several conclusions about the reviewed works.
An Introduction to the BFS Method and Its Use to Model Binary NiAl Alloys
NASA Technical Reports Server (NTRS)
Bozzolo, Guillermo; Noebe, Ronald D.; Ferrante, J.; Amador, C.
1998-01-01
We introduce the Bozzolo-Ferrante-Smith (BFS) method for alloys as a computationally efficient tool for aiding in the process of alloy design. An intuitive description of the BFS method is provided, followed by a formal discussion of its implementation. The method is applied to the study of the defect structure of NiAl binary alloys. The groundwork is laid for a detailed progression to higher order NiAl-based alloys linking theoretical calculations and computer simulations based on the BFS method and experimental work validating each step of the alloy design process.
CHAMPION: Intelligent Hierarchical Reasoning Agents for Enhanced Decision Support
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hohimer, Ryan E.; Greitzer, Frank L.; Noonan, Christine F.
2011-11-15
We describe the design and development of an advanced reasoning framework employing semantic technologies, organized within a hierarchy of computational reasoning agents that interpret domain specific information. Designed based on an inspirational metaphor of the pattern recognition functions performed by the human neocortex, the CHAMPION reasoning framework represents a new computational modeling approach that derives invariant knowledge representations through memory-prediction belief propagation processes that are driven by formal ontological language specification and semantic technologies. The CHAMPION framework shows promise for enhancing complex decision making in diverse problem domains including cyber security, nonproliferation and energy consumption analysis.
Formal methods and digital systems validation for airborne systems
NASA Technical Reports Server (NTRS)
Rushby, John
1993-01-01
This report has been prepared to supplement a forthcoming chapter on formal methods in the FAA Digital Systems Validation Handbook. Its purpose is as follows: to outline the technical basis for formal methods in computer science; to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used on board aircraft; and to suggest factors for consideration when formal methods are offered in support of certification. These latter factors assume the context for software development and assurance described in RTCA document DO-178B, 'Software Considerations in Airborne Systems and Equipment Certification,' Dec. 1992.
A brief overview of NASA Langley's research program in formal methods
NASA Technical Reports Server (NTRS)
1992-01-01
An overview of NASA Langley's research program in formal methods is presented. The major goal of this work is to bring formal methods technology to a sufficiently mature level for use by the United States aerospace industry. Towards this goal, work is underway to design and formally verify a fault-tolerant computing platform suitable for advanced flight control applications. Also, several direct technology transfer efforts have been initiated that apply formal methods to critical subsystems of real aerospace computer systems. The research team consists of six NASA civil servants and contractors from Boeing Military Aircraft Company, Computational Logic Inc., Odyssey Research Associates, SRI International, University of California at Davis, and Vigyan Inc.
A rigorous approach to self-checking programming
NASA Technical Reports Server (NTRS)
Hua, Kien A.; Abraham, Jacob A.
1986-01-01
Self-checking programming is shown to be an effective concurrent error detection technique. The reliability of a self-checking program, however, relies on the quality of its assertion statements. A self-checking program written without formal guidelines could provide poor coverage of errors. A constructive technique for self-checking programming is presented. A Structured Program Design Language (SPDL) suitable for self-checking software development is defined. A set of formal rules was also developed that allows the transformation of SPDL designs into self-checking designs in a systematic manner.
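The core idea, assertions that check result invariants at run time, can be shown in a few lines. The routine and its assertions below are our own illustration of the transformation into a self-checking implementation; they are not taken from SPDL.

    from collections import Counter

    # A self-checking sort: executable assertions verify the output invariants
    # (ordering and permutation), giving concurrent error detection.
    def checked_sort(xs):
        ys = list(xs)
        ys.sort()   # stand-in for a hand-written, possibly faulty routine
        assert all(ys[i] <= ys[i + 1] for i in range(len(ys) - 1)), "order violated"
        assert Counter(ys) == Counter(xs), "output is not a permutation of input"
        return ys

    print(checked_sort([3, 1, 2]))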
Formal Methods for Verification and Validation of Partial Specifications: A Case Study
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Callahan, John
1997-01-01
This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors, without the burden of full proofs of correctness. We describe a case study of the use of partial formal models for V&V of the requirements for Fault Detection Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and it is the process of formalization, rather than the end product that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.
SynBioSS-aided design of synthetic biological constructs.
Kaznessis, Yiannis N
2011-01-01
We present walkthrough examples of using SynBioSS to design, model, and simulate synthetic gene regulatory networks. SynBioSS stands for Synthetic Biology Software Suite, a platform that is publicly available with Open Licenses at www.synbioss.org. An important aim of computational synthetic biology is the development of a mathematical modeling formalism that is applicable to a wide variety of simple synthetic biological constructs. SynBioSS-based modeling of biomolecular ensembles that interact away from the thermodynamic limit and not necessarily at steady state affords a theoretical framework that is generally applicable to known synthetic biological systems, such as bistable switches, AND gates, and oscillators. Here, we discuss how SynBioSS creates links between DNA sequences and targeted dynamic phenotypes of these simple systems. Copyright © 2011 Elsevier Inc. All rights reserved.
Capability maturity models for offshore organisational management.
Strutt, J E; Sharp, J V; Terry, E; Miles, R
2006-12-01
The goal setting regime imposed by the UK safety regulator has important implications for an organisation's ability to manage health and safety related risks. Existing approaches to safety assurance based on risk analysis and formal safety assessments are increasingly considered unlikely to create the step change improvement in safety to which the offshore industry aspires, and alternative approaches are being considered. One approach, which addresses the important issue of organisational behaviour and which can be applied at a very early stage of design, is the capability maturity model (CMM). The paper describes the development of a design safety capability maturity model, outlining the key processes considered necessary to safety achievement, the definition of maturity levels, and scoring methods. The paper discusses how CMM is related to regulatory mechanisms and risk-based decision making, together with the potential of CMM for environmental risk management.
GAMES II Project: a general architecture for medical knowledge-based systems.
Bruno, F; Kindler, H; Leaning, M; Moustakis, V; Scherrer, J R; Schreiber, G; Stefanelli, M
1994-10-01
GAMES II aims at developing a comprehensive and commercially viable methodology to avoid problems ordinarily occurring in KBS development. GAMES II methodology proposes to design a KBS starting from an epistemological model of medical reasoning (the Select and Test Model). The design is viewed as a process of adding symbol level information to the epistemological model. The architectural framework provided by GAMES II integrates the use of different formalisms and techniques providing a large set of tools. The user can select the most suitable one for representing a piece of knowledge after a careful analysis of its epistemological characteristics. Special attention is devoted to the tools dealing with knowledge acquisition (both manual and automatic). A panel of practicing physicians are assessing the medical value of such a framework and its related tools by using it in a practical application.
Interacting hadron resonance gas model in the K-matrix formalism
NASA Astrophysics Data System (ADS)
Dash, Ashutosh; Samanta, Subhasis; Mohanty, Bedangadas
2018-05-01
An extension of the hadron resonance gas (HRG) model is constructed to include interactions using the relativistic virial expansion of the partition function. The noninteracting part of the expansion contains all the stable baryons and mesons, and the interacting part contains all the higher-mass resonances which decay into two stable hadrons. The virial coefficients are related to the phase shifts, which are calculated using the K-matrix formalism in the present work. We have calculated various thermodynamic quantities like pressure, energy density, and entropy density of the system. A comparison of thermodynamic quantities with the noninteracting HRG model, calculated using the same number of hadrons, shows that the results of the above formalism are larger. A good agreement between the equation of state calculated in the K-matrix formalism and lattice QCD simulations is observed. Specifically, the lattice-QCD-calculated interaction measure is well described in our formalism. We have also calculated second-order fluctuations and correlations of conserved charges in the K-matrix formalism. We observe a good agreement of second-order fluctuations and the baryon-strangeness correlation with lattice data below the crossover temperature.
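Schematically, the link between phase shifts and the interacting virial coefficient is of Beth-Uhlenbeck type, with the phase shift supplied by a single-channel K-matrix resonance parametrization (our summary, with degeneracy and kinematic factors omitted; not the paper's exact expressions):

    \Delta b_2(T) \;\propto\; \frac{1}{\pi}\int_{M_{\rm th}}^{\infty} dM\,\frac{d\delta(M)}{dM}\,e^{-M/T},
    \qquad
    \tan\delta(M) \;=\; \frac{M_R\,\Gamma(M)}{M_R^{2}-M^{2}},

so that each resonance contributes through the energy derivative of its phase shift rather than as a zero-width state.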
Formal Specification of Information Systems Requirements.
ERIC Educational Resources Information Center
Kampfner, Roberto R.
1985-01-01
Presents a formal model for specification of logical requirements of computer-based information systems that incorporates structural and dynamic aspects based on two separate models: the Logical Information Processing Structure and the Logical Information Processing Network. The model's role in systems development is discussed. (MBR)
The equivalence of a human observer and an ideal observer in binary diagnostic tasks
NASA Astrophysics Data System (ADS)
He, Xin; Samuelson, Frank; Gallas, Brandon D.; Sahiner, Berkman; Myers, Kyle
2013-03-01
The Ideal Observer (IO) is "ideal" for given data populations. In the image perception process, as the raw images are degraded by factors such as display and eye optics, there is an equivalent IO (EIO). The EIO uses, as its data, the statistical information that exits the perception/cognitive degradations. We assume a human observer who has received sufficient training, e.g., a radiologist, and hypothesize that such a human observer can be modeled as if he or she were an EIO. To measure the likelihood ratio (LR) distributions of an EIO, we formalize experimental design principles that encourage rationality based on von Neumann and Morgenstern's (vNM) axioms. We present examples to show that many observer study design refinements, although motivated by empirical principles explicitly, implicitly encourage rationality. Our hypothesis is supported by a recent review paper on ROC curve convexity by Pesce, Metz, and Berbaum. We also provide additional evidence based on a collection of observer studies in medical imaging. EIO theory shows that the "sub-optimal" performance of a human observer can be mathematically formalized in the form of an IO, and measured through rationality encouragement.
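For reference, the decision rule underlying the IO and EIO constructions is textbook detection theory (not this paper's notation): the observer thresholds the likelihood ratio,

    \Lambda(x) = \frac{p(x\mid \text{signal})}{p(x\mid \text{noise})},
    \qquad \text{decide signal iff } \Lambda(x) > t,

and sweeping the threshold t traces out the ROC curve. Any observer whose decisions are based on a likelihood ratio has a convex (proper) ROC, which is why empirically observed ROC convexity lends support to the EIO hypothesis.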
Modeling Cyber Conflicts Using an Extended Petri Net Formalism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zakrzewska, Anita N; Ferragut, Erik M
2011-01-01
When threatened by automated attacks, critical systems that require human-controlled responses have difficulty making optimal responses and adapting protections in real-time and may therefore be overwhelmed. Consequently, experts have called for the development of automatic real-time reaction capabilities. However, a technical gap exists in the modeling and analysis of cyber conflicts to automatically understand the repercussions of responses. There is a need for modeling cyber assets that accounts for concurrent behavior, incomplete information, and payoff functions. We address this need by extending the Petri net formalism to allow real-time cyber conflicts to be modeled in a way that is expressive and concise. This formalism includes transitions controlled by players as well as firing rates attached to transitions. This allows us to model both player actions and factors that are beyond the control of players in real-time. We show that our formalism is able to represent situational awareness, concurrent actions, incomplete information and objective functions. These factors make it well-suited to modeling cyber conflicts in a way that allows for useful analysis. MITRE has compiled the Common Attack Pattern Enumeration and Classification (CAPEC), an extensive list of cyber attacks at various levels of abstraction. CAPEC includes factors such as attack prerequisites, possible countermeasures, and attack goals. These elements are vital to understanding cyber attacks and to generating the corresponding real-time responses. We demonstrate that the formalism can be used to extract precise models of cyber attacks from CAPEC. Several case studies show that our Petri net formalism is more expressive than other models, such as attack graphs, for modeling cyber conflicts and that it is amenable to exploring cyber strategies.
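The two transition types, player-controlled and rate-driven, can be sketched in a few dozen lines. The Python fragment below models a toy attack-response net of our own devising (not a CAPEC-derived model): timed transitions race with exponential firing rates, and a defender-controlled transition may be fired as a choice.

    import random

    # Toy extended Petri net: timed transitions carry firing rates; the
    # "patch" transition is player-controlled. All names and rates are invented.
    places = {"recon_done": 1, "exploit_ready": 0, "host_owned": 0, "patched": 0}
    timed = {  # name: (inputs, outputs, rate)
        "weaponize": ({"recon_done": 1}, {"exploit_ready": 1}, 0.5),
        "exploit":   ({"exploit_ready": 1}, {"host_owned": 1}, 0.2),
    }
    patch = ({"exploit_ready": 1}, {"patched": 1})   # defender action

    def enabled(inputs):
        return all(places[p] >= n for p, n in inputs.items())

    def fire(inputs, outputs):
        for p, n in inputs.items():
            places[p] -= n
        for p, n in outputs.items():
            places[p] += n

    t = 0.0
    while t < 20.0:
        live = {n: tr for n, tr in timed.items() if enabled(tr[0])}
        if not live:
            break
        rates = [tr[2] for tr in live.values()]
        t += random.expovariate(sum(rates))          # race between timed firings
        name = random.choices(list(live), weights=rates)[0]
        fire(live[name][0], live[name][1])
        if enabled(patch[0]) and random.random() < 0.5:
            fire(*patch)                             # defender chooses to act
    print(places, f"t={t:.1f}")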
Comprehensive Aspectual UML approach to support AspectJ.
Magableh, Aws; Shukur, Zarina; Ali, Noorazean Mohd
2014-01-01
The Unified Modeling Language (UML) is the most popular and widely used Object-Oriented modelling language in the IT industry. This study investigates the extent to which UML can be expanded to model crosscutting concerns (Aspects) in support of AspectJ. Through a comprehensive literature review, we identify and extensively examine all the available Aspect-Oriented UML modelling approaches and find that the existing Aspect-Oriented Design Modelling approaches using UML cannot be considered to provide a framework for a comprehensive Aspectual UML modelling approach, and also that there is a lack of adequate Aspect-Oriented tool support. This study also proposes a set of Aspectual UML semantic rules and attempts to generate AspectJ pseudocode from UML diagrams. The proposed Aspectual UML modelling approach is formally evaluated using a focus group to test six hypotheses regarding performance; a "good design" criteria-based evaluation to assess the quality of the design; and an AspectJ-based evaluation as a reference measurement-based evaluation. The results of the focus group evaluation confirm all the hypotheses put forward regarding the proposed approach. The proposed approach provides a comprehensive set of Aspectual UML structural and behavioral diagrams, which are designed and implemented based on a comprehensive and detailed set of AspectJ programming constructs.
The ADVANCE project : formal evaluation of the targeted deployment. Volume 2
DOT National Transportation Integrated Search
1997-01-01
This document reports on the formal evaluation of the targeted (limited but highly focused) deployment of the Advanced Driver and Vehicle Advisory Navigation ConcEpt (ADVANCE), an in-vehicle advanced traveler information system designed to provide sh...
NASA software specification and evaluation system design, part 1
NASA Technical Reports Server (NTRS)
1976-01-01
Research to develop methods for reducing the effort expended in software development and verification is reported. The development of a formal software requirements methodology, a formal specifications language, a programming language, a language preprocessor, and code analysis tools is discussed.
32 CFR 644.377 - Formal revocation of public land withdrawals and reservations.
Code of Federal Regulations, 2011 CFR
2011-07-01
... domain, the BLM Land Office will transmit to the DE a draft of public land order (PLO) designed to formally revoke the order or reservation which withdrew or reserved the land. The DE will review the draft...
32 CFR 644.377 - Formal revocation of public land withdrawals and reservations.
Code of Federal Regulations, 2010 CFR
2010-07-01
... domain, the BLM Land Office will transmit to the DE a draft of public land order (PLO) designed to formally revoke the order or reservation which withdrew or reserved the land. The DE will review the draft...
Learning design in healthcare education.
Ellaway, Rachel; Dalziel, James; Dalziel, Bronwen
2008-01-01
Emerging from ongoing work into educational modelling languages, learning design principles and the IMS Learning Design framework provide formal ways to annotate and record educational activities. Once educational activities have been encoded they can be played, replayed, adopted, shared, and analysed, thereby reifying much that is otherwise lost in face-to-face teaching. The use of learning design tools, including the free and open source LAMS system (www.lamsfoundation.org), allows practitioners to experiment with learning design approaches in their own teaching, both in terms of creating and encoding their own designs and in playing, adapting and analysing designs from other teachers, either from within or outside a particular field or subject area. This paper reviews the key issues associated with designing for learning in the context of healthcare education, some of the themes and approaches already in development or use, and the implications of this approach for the practice and theory of healthcare education.
Ding, Hansheng; Wang, Changying; Xie, Chunyan; Yang, Yitong; Jin, Chunlin
2017-01-01
The need for formal care among the elderly population has been increasing due to their greater longevity and the evolution of family structure. We examined the determinants of the use and expenses of formal care among in-home elderly adults in Shanghai. A two-part model based on the data from the Shanghai Long-Term Care Needs Assessment Questionnaire was applied. A total of 8428 participants responded in 2014 and 7100 were followed up in 2015. The determinants of the probability of using formal care were analyzed in the first part of the model and the determinants of formal care expenses were analyzed in the second part. Demographic indicators, living arrangements, physical health status, and care type in 2014 were selected as independent variables. We found that individuals of older age; women; those with higher Activities of Daily Living (ADL) scores; those without a spouse; those with higher income; those suffering from stroke, dementia, lower limb fracture, or advanced tumor; and those with previous experience of formal and informal care were more likely to receive formal care in 2015. Furthermore, age, income and the formal care fee in 2014 were significant predictors of formal care expenses in 2015. Taken together, the results showed that formal care provision in Shanghai was not determined by ADL scores, but was instead more related to income. This implied an inappropriate distribution of formal care among the elderly population in Shanghai. Additionally, it appeared difficult for the elderly to quit formal care once they had begun to use it. These results highlighted the importance of assessing the need for formal care, and suggested that the government offer guidance on formal care use for the elderly. PMID:28448628
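To make the two-part structure concrete, the following Python sketch fits part 1 (a logistic regression for whether any formal care is used) and part 2 (a regression of log expenses among users only) on synthetic data. The covariates are placeholders, not the Shanghai survey variables, and all coefficients are invented.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 2000
    age = rng.uniform(65, 95, n)
    log_inc = np.log(rng.lognormal(9, 0.5, n))
    X = sm.add_constant(np.column_stack([age, log_inc]))

    p_use = 1 / (1 + np.exp(-(0.05 * (age - 80) + 0.4 * (log_inc - 9))))
    use = rng.binomial(1, p_use)                     # part 1 outcome
    expense = np.where(use == 1,
                       np.exp(3 + 0.02 * age + 0.3 * log_inc
                              + rng.normal(0, 0.3, n)), 0.0)

    part1 = sm.Logit(use, X).fit(disp=0)             # probability of any use
    users = use == 1
    part2 = sm.OLS(np.log(expense[users]), X[users]).fit()  # expenses given use
    print(part1.params)
    print(part2.params)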
On Abstractions and Simplifications in the Design of Human-Automation Interfaces
NASA Technical Reports Server (NTRS)
Heymann, Michael; Degani, Asaf; Shafto, Michael; Meyer, George; Clancy, Daniel (Technical Monitor)
2001-01-01
This report addresses the design of human-automation interaction from a formal perspective that focuses on the information content of the interface, rather than the design of the graphical user interface. It also addresses the issue of the information provided to the user (e.g., user-manuals, training material, and all other resources). In this report, we propose a formal procedure for generating interfaces and user-manuals. The procedure is guided by two criteria: First, the interface must be correct, i.e., with the given interface the user must be able to perform the specified tasks correctly. Second, the interface should be as succinct as possible. The report discusses the underlying concepts and the formal methods for this approach. Several examples are used to illustrate the procedure. The algorithm for constructing interfaces can be automated, and a preliminary software system for its implementation has been developed.
On Abstractions and Simplifications in the Design of Human-Automation Interfaces
NASA Technical Reports Server (NTRS)
Heymann, Michael; Degani, Asaf; Clancy, Daniel (Technical Monitor)
2002-01-01
This report addresses the design of human-automation interaction from a formal perspective that focuses on the information content of the interface, rather than the design of the graphical user interface. It also addresses the issue of the information provided to the user (e.g., user-manuals, training material, and all other resources). In this report, we propose a formal procedure for generating interfaces and user-manuals. The procedure is guided by two criteria: First, the interface must be correct, that is, with the given interface the user will be able to perform the specified tasks correctly. Second, the interface should be succinct. The report discusses the underlying concepts and the formal methods for this approach. Two examples are used to illustrate the procedure. The algorithm for constructing interfaces can be automated, and a preliminary software system for its implementation has been developed.
Aerodynamic Optimization of Rocket Control Surface Geometry Using Cartesian Methods and CAD Geometry
NASA Technical Reports Server (NTRS)
Nelson, Andrea; Aftosmis, Michael J.; Nemec, Marian; Pulliam, Thomas H.
2004-01-01
Aerodynamic design is an iterative process involving geometry manipulation and complex computational analysis subject to physical constraints and aerodynamic objectives. A design cycle consists of first establishing the performance of a baseline design, which is usually created with low-fidelity engineering tools, and then progressively optimizing the design to maximize its performance. Optimization techniques have evolved from relying exclusively on designer intuition and insight in traditional trial and error methods, to sophisticated local and global search methods. Recent attempts at automating the search through a large design space with formal optimization methods include both database driven and direct evaluation schemes. Databases are being used in conjunction with surrogate and neural network models as a basis on which to run optimization algorithms. Optimization algorithms are also being driven by the direct evaluation of objectives and constraints using high-fidelity simulations. Surrogate methods use data points obtained from simulations, and possibly gradients evaluated at the data points, to create mathematical approximations of a database. Neural network models work in a similar fashion, using a number of high-fidelity database calculations as training iterations to create a database model. Optimal designs are obtained by coupling an optimization algorithm to the database model. Evaluation of the current best design then yields a new local optimum and/or increases the fidelity of the approximation model for the next iteration. Surrogate methods have also been developed that iterate on the selection of data points to decrease the uncertainty of the approximation model prior to searching for an optimal design. The database approximation models for each of these cases, however, become computationally expensive with increasing dimensionality. Thus the method of using optimization algorithms to search a database model becomes problematic as the number of design variables is increased.
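The database-driven surrogate loop described above reduces, in its simplest form, to fit, minimize, and reevaluate. The one-dimensional Python sketch below uses a quadratic response surface in place of a neural network or RBF model, and a cheap analytic function stands in for the high-fidelity CFD evaluation; all details are illustrative assumptions.

    import numpy as np

    def truth(x):
        return (x - 0.3) ** 2 + 0.1 * np.sin(8 * x)   # stand-in for CFD

    xs = list(np.linspace(0.0, 1.0, 5))               # initial database
    for _ in range(5):
        ys = [truth(x) for x in xs]
        coeffs = np.polyfit(xs, ys, deg=2)            # quadratic surrogate fit
        grid = np.linspace(0.0, 1.0, 401)
        x_new = float(grid[np.argmin(np.polyval(coeffs, grid))])
        xs.append(x_new)                              # enrich database, refit
    print(f"surrogate optimum after refinement: x = {xs[-1]:.3f}")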
Using the structure of social networks to map inter-agency relationships in public health services.
West, Robert M; House, Allan O; Keen, Justin; Ward, Vicky L
2015-11-01
This article investigates network governance in the context of health and wellbeing services in England, focussing on relationships between managers in a range of services. There are three aims, namely to investigate, (i) the configurations of networks, (ii) the stability of network relationships over time and, (iii) the balance between formal and informal ties that underpin inter-agency relationships. Latent position cluster network models were used to characterise relationships. Managers were asked two questions, both designed to characterise informal relationships. The resulting networks differed substantially from one another in membership. Managers described networks of relationships that spanned organisational boundaries, and that changed substantially over time. The findings suggest that inter-agency co-ordination depends more on informal than on formal relationships. Copyright © 2015 Elsevier Ltd. All rights reserved.
Properties of a Formal Method to Model Emergence in Swarm-Based Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy; Truszkowski, Walt; Rash, James; Hinchey, Mike
2004-01-01
Future space missions will require cooperation between multiple satellites and/or rovers. Developers are proposing intelligent autonomous swarms for these missions, but swarm-based systems are difficult or impossible to test with current techniques. This viewgraph presentation examines the use of formal methods in testing swarm-based systems. The potential usefulness of formal methods in modeling the ANTS asteroid encounter mission is also examined.
Matsuoka, Yukiko; Ghosh, Samik; Kitano, Hiroaki
2009-01-01
The discovery by design paradigm driving research in synthetic biology entails the engineering of de novo biological constructs with well-characterized input–output behaviours and interfaces. The construction of biological circuits requires iterative phases of design, simulation and assembly, leading to the fabrication of a biological device. In order to represent engineered models in a consistent visual format and further simulating them in silico, standardization of representation and model formalism is imperative. In this article, we review different efforts for standardization, particularly standards for graphical visualization and simulation/annotation schemata adopted in systems biology. We identify the importance of integrating the different standardization efforts and provide insights into potential avenues for developing a common framework for model visualization, simulation and sharing across various tools. We envision that such a synergistic approach would lead to the development of global, standardized schemata in biology, empowering deeper understanding of molecular mechanisms as well as engineering of novel biological systems. PMID:19493898
Design of Astrometric Mission (JASMINE) by Applying Model Driven System Engineering
NASA Astrophysics Data System (ADS)
Yamada, Y.; Miyashita, H.; Nakamura, H.; Suenaga, K.; Kamiyoshi, S.; Tsuiki, A.
2010-12-01
We are planning a space astrometry satellite mission named JASMINE. The target accuracy of parallaxes in JASMINE observations is 10 micro arc seconds, which corresponds to a 1 nm scale on the focal plane. It is very hard to measure 1 nm scale deformation of the focal plane. Consequently, we need to add the deformation to the observation equations when estimating stellar astrometric parameters, which requires considering many factors such as instrument models and observation data analysis. Because the observation equations become more complex, we may relax the stability requirements on the hardware; nevertheless, we then require more samplings because each estimation is less rigid. This mission thus imposes a number of trade-offs among the engineering choices, from which the optimal design must be selected out of a number of candidates. In order to efficiently support such decisions, we apply Model Driven Systems Engineering (MDSE), which improves the efficiency of the engineering by revealing and formalizing requirements, specifications, and designs to find a good balance among various trade-offs.
VBA: A Probabilistic Treatment of Nonlinear Models for Neurobiological and Behavioural Data
Daunizeau, Jean; Adam, Vincent; Rigoux, Lionel
2014-01-01
This work is in line with an on-going effort tending toward a computational (quantitative and refutable) understanding of human neuro-cognitive processes. Many sophisticated models for behavioural and neurobiological data have flourished during the past decade. Most of these models are partly unspecified (i.e. they have unknown parameters) and nonlinear. This makes them difficult to pair with a formal statistical data analysis framework. In turn, this compromises the reproducibility of model-based empirical studies. This work exposes a software toolbox that provides generic, efficient and robust probabilistic solutions to the three problems of model-based analysis of empirical data: (i) data simulation, (ii) parameter estimation/model selection, and (iii) experimental design optimization. PMID:24465198
Numerical modeling of the atmosphere with an isentropic vertical coordinate
NASA Technical Reports Server (NTRS)
Hsu, Yueh-Jiuan G.; Arakawa, Akio
1990-01-01
A theta-coordinate model simulating the nonlinear evolution of a baroclinic wave is presented. In the model, vertical discretization maintains important integral constraints such as conservation of the angular momentum and total energy. A massless-layer approach is used in the treatment of the intersections of coordinate surfaces with the lower boundary. This formally eliminates the intersection problem, but raises other computational problems. Horizontal discretization of the continuity and momentum equations in the model are designed to overcome these problems. Selected results from a 10-day integration with the 25-layer, beta-plane version of the model are presented. It is concluded that the model can simulate the nonlinear evolution of a baroclinic wave and associated dynamical processes without major computational difficulties.
Theory and ontology for sharing temporal knowledge
NASA Technical Reports Server (NTRS)
Loganantharaj, Rasiah
1996-01-01
Using current technology, the sharing or re-using of knowledge-bases is very difficult, if not impossible. ARPA has correctly recognized the problem and funded a knowledge sharing initiative. One of the outcomes of this project is a formal language called Knowledge Interchange Format (KIF) for representing knowledge that could be translated into other languages. Capturing and representing design knowledge and reasoning with them have become very important for NASA, which is a pioneer in the innovative design of unique products. For upgrading an existing design for changing technology, needs, or requirements, it is essential to understand the design rationale, design choices, options and other relevant information associated with the design. Capturing such information and presenting it in the appropriate form is part of the ongoing Design Knowledge Capture project of NASA. The behavior of an object and various other aspects related to time are captured by the appropriate temporal knowledge. The captured design knowledge will be represented in such a way that the various groups at NASA who are interested in various aspects of the design cycle should be able to access and use the design knowledge effectively. To facilitate knowledge sharing among these groups, one has to develop a very well defined ontology. An ontology is a specification of a conceptualization. In the literature several specific domains were studied and some well defined ontologies were developed for such domains. However, very little or no work has been done in the area of representing temporal knowledge to facilitate sharing. During the ASEE summer program, I have investigated several temporal models and have proposed a theory of time that is flexible enough to accommodate time elements such as points and intervals, and is capable of handling qualitative and quantitative temporal constraints. I have also proposed a primitive temporal ontology from which other relevant temporal ontologies can be built. I have investigated various issues of sharing knowledge and have proposed a formal framework for modeling the concept of knowledge sharing. This work may be implemented and tested in the software environment supplied by Knowledge Based System, Inc.
An Evaluation of the Preceptor Model versus the Formal Teaching Model.
ERIC Educational Resources Information Center
Shamian, Judith; Lemieux, Suzanne
1984-01-01
This study evaluated the effectiveness of two teaching methods to determine which is more effective in enhancing the knowledge base of participating nurses: the preceptor model embodies decentralized instruction by a member of the nursing staff, and the formal teaching model uses centralized teaching by the inservice education department. (JOW)
Developing Formal Object-oriented Requirements Specifications: A Model, Tool and Technique.
ERIC Educational Resources Information Center
Jackson, Robert B.; And Others
1995-01-01
Presents a formal object-oriented specification model (OSS) for computer software system development that is supported by a tool that automatically generates a prototype from an object-oriented analysis model (OSA) instance, lets the user examine the prototype, and permits the user to refine the OSA model instance to generate a requirements…
Structuring Formal Control Systems Specifications for Reuse: Surviving Hardware Changes
NASA Technical Reports Server (NTRS)
Thompson, Jeffrey M.; Heimdahl, Mats P. E.; Erickson, Debra M.
2000-01-01
Formal capture and analysis of the required behavior of control systems have many advantages. For instance, they encourage rigorous requirements analysis, the required behavior is unambiguously defined, and we can assure that various safety properties are satisfied. Formal modeling is, however, a costly and time consuming process, and if one could reuse the formal models over a family of products, significant cost savings would be realized. In an ongoing project we are investigating how to structure state-based models to achieve a high level of reusability within product families. In this paper we discuss a high-level structure of requirements models that achieves reusability of the desired control behavior across varying hardware platforms in a product family. The structuring approach is demonstrated through a case study in the mobile robotics domain where the desired robot behavior is reused on two diverse platforms: one commercial mobile platform and one built in-house. We use our language RSML-e to capture the control behavior for reuse and our tool NIMBUS to demonstrate how the formal specification can be validated and used as a prototype on the two platforms.
NASA Astrophysics Data System (ADS)
Kasim, N.; Zainal Abidin, N. A.; Zainal, R.; Sarpin, N.; Rahim, M. H. I. Abd; Saikah, M.
2017-11-01
Implementation of Building Information Modelling (BIM) is expected to bring improvement to the current practices of the Malaysian construction industry. In the design phase, there is a lack of a ready pool of skilled workers who are able to develop a BIM strategic plan and effectively utilise it. These gaps create barriers to achieving best BIM practices in the Malaysian construction industry, specifically in the design phase. Therefore, the objectives of this research are to investigate the current practices of BIM implementation in the design phase as well as the best-practice factors of BIM implementation in the design phase. A qualitative research approach is carried out through semi-structured interviews with designers from different organisations which adopt BIM in the design phase. The data collection is analysed using the content analysis method. The findings identify best-practice factors for BIM implementation in the design phase, such as incentives for BIM training, a formal approach to monitoring automated Level of Detailing (LOD), running virtual meetings, and improving the Industry Foundation Classes (IFC). These best-practice factors lead to improvements in the design phase of project development, which subsequently improves the implementation of BIM in the design phase of the Malaysian construction industry.
Large co-axial pulse tube preliminary results
NASA Astrophysics Data System (ADS)
Emery, N.; Caughley, A.; Meier, J.; Nation, M.; Tanchon, J.; Trollier, T.; Ravex, A.
2014-01-01
We report that Callaghan Innovation, formerly known as Industrial Research Ltd (IRL), has designed and built the largest of its three high-frequency single-stage co-axial pulse tubes, closely coupled to a metal diaphragm pressure wave generator (PWG). The previous pulse tube achieved 110 W of cooling power @ 77 K, with an electrical input power of 3.1 kW from a 90 cc swept volume PWG. The pulse tubes have all been tuned to operate at 50 Hz, with a mean helium working pressure of 2.5 MPa. Sage pulse tube simulation software was used to model the latest pulse tube and predicted 280 W of cooling power @ 77 K. The nominal 250 W cryocooler was designed to be an intermediate step in up-scaling pulse tube technology for our 1000 cc swept-volume PWG, to provide liquefaction of gases and cooling for HTS applications. Details of the modeling, design, development and preliminary experimental results are discussed.
Single-Vector Calibration of Wind-Tunnel Force Balances
NASA Technical Reports Server (NTRS)
Parker, P. A.; DeLoach, R.
2003-01-01
An improved method of calibrating a wind-tunnel force balance involves the use of a unique load application system integrated with formal experimental design methodology. The Single-Vector Force Balance Calibration System (SVS) overcomes the productivity and accuracy limitations of prior calibration methods. A force balance is a complex structural spring element instrumented with strain gauges for measuring three orthogonal components of aerodynamic force (normal, axial, and side force) and three orthogonal components of aerodynamic torque (rolling, pitching, and yawing moments). Force balances remain the state-of-the-art instrument that provides these measurements on a scale model of an aircraft during wind tunnel testing. Ideally, each electrical channel of the balance would respond only to its respective component of load, and it would have no response to other components of load. This is not entirely possible even though balance designs are optimized to minimize these undesirable interaction effects. Ultimately, a calibration experiment is performed to obtain the necessary data to generate a mathematical model and determine the force measurement accuracy. In order to set the independent variables of applied load for the calibration experiment, a high-precision mechanical system is required. Manual deadweight systems have been in use at Langley Research Center (LaRC) since the 1940s. These simple methodologies produce high confidence results, but the process is mechanically complex and labor-intensive, requiring three to four weeks to complete. Over the past decade, automated balance calibration systems have been developed. In general, these systems were designed to automate the tedious manual calibration process, resulting in an even more complex system which degrades load application quality. The current calibration approach relies on a one-factor-at-a-time (OFAT) methodology, where each independent variable is incremented individually throughout its full-scale range, while all other variables are held at a constant magnitude. This OFAT approach has been widely accepted because of its inherent simplicity and intuitive appeal to the balance engineer. LaRC has been conducting research in a "modern design of experiments" (MDOE) approach to force balance calibration. Formal experimental design techniques provide an integrated view of the entire calibration process, covering all three major aspects of an experiment: the design of the experiment, the execution of the experiment, and the statistical analysis of the data. In order to overcome the weaknesses in the available mechanical systems and to apply formal experimental techniques, a new mechanical system was required. The SVS enables the complete calibration of a six-component force balance with a series of single force vectors.
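The regression step at the heart of any such calibration, OFAT or MDOE, can be sketched briefly. Given applied load vectors and measured bridge outputs, a sensitivity matrix is fit by least squares; real balance calibrations also include quadratic and cross-product terms and randomized run orders, which this Python fragment (with invented numbers) omits.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 120
    L = rng.uniform(-1, 1, (n, 6))                         # applied loads (normalized)
    C_true = np.eye(6) + 0.02 * rng.normal(size=(6, 6))    # small interactions
    R = L @ C_true + 1e-4 * rng.normal(size=(n, 6))        # bridge outputs + noise

    C_fit, *_ = np.linalg.lstsq(L, R, rcond=None)          # fitted sensitivity matrix
    back = R @ np.linalg.inv(C_fit)                        # loads recovered from outputs
    print("max load residual:", float(np.max(np.abs(back - L))))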
Code of Federal Regulations, 2010 CFR
2010-07-01
... overhaul; and (2) An analysis of the cost to implement the overhaul within a year versus a proposed... be based on a formal comprehensive appraisal or a series of formal appraisals of the functional...
Litzelman, Debra K; Cottingham, Ann H
2007-04-01
There is growing recognition in the medical community that being a good doctor requires more than strong scientific knowledge and excellent clinical skills. Many key qualities are essential to providing comprehensive care, including the abilities to communicate effectively with patients and colleagues, act in a professional manner, cultivate an awareness of one's own values and prejudices, and provide care with an understanding of the cultural and spiritual dimensions of patients' lives. To ensure that Indiana University School of Medicine (IUSM) graduates demonstrate this range of abilities, IUSM has undertaken a substantial transformation of both its formal curriculum and learning environment (informal curriculum). The authors provide an overview of IUSM's two-part initiative to develop and implement a competency-based formal curriculum that requires students to demonstrate proficiency in nine core competencies and to create simultaneously an informal curriculum that models and supports the moral, professional, and humane values expressed in the formal curriculum. The authors describe the institutional and curricular transformations that have enabled and furthered the new IUSM curricular goals: changes in education administration; education implementation, assessment, and curricular design; admissions procedures; performance tracking; and the development of an electronic infrastructure to facilitate the expanded curriculum. The authors address the cost of reform and the results of two progress reviews. Specific case examples illustrate the interweaving of the formal competency curriculum through the students' four years of training, as well as techniques that are being used to positively influence the IUSM informal curriculum.
From non-trivial geometries to power spectra and vice versa
NASA Astrophysics Data System (ADS)
Brooker, D. J.; Tsamis, N. C.; Woodard, R. P.
2018-04-01
We review a recent formalism which derives the functional forms of the primordial tensor and scalar power spectra of scalar potential inflationary models. The formalism incorporates the case of geometries with a non-constant first slow-roll parameter. Analytic expressions for the power spectra are given that explicitly display the dependence on the geometric properties of the background. Moreover, we present the full algorithm for using our formalism to reconstruct the model from the observed power spectra. Our techniques are applied to models possessing "features" in their potential, with excellent agreement.
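For orientation, the leading-order slow-roll expressions that such formalisms refine for a non-constant first slow-roll parameter \epsilon are the standard ones (quoted here for context, not the paper's improved forms):

    \Delta^{2}_{\mathcal{R}}(k) \simeq \frac{1}{8\pi^{2}}\,\frac{H^{2}}{\epsilon\,M_{\rm Pl}^{2}}\bigg|_{k=aH},
    \qquad
    \Delta^{2}_{h}(k) \simeq \frac{2}{\pi^{2}}\,\frac{H^{2}}{M_{\rm Pl}^{2}}\bigg|_{k=aH},

with both sides evaluated at horizon crossing k = aH.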
Formal verification of AI software
NASA Technical Reports Server (NTRS)
Rushby, John; Whitehurst, R. Alan
1989-01-01
The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.
The Need for, and the Role of the Toxicological Chemist in the Design of Safer Chemicals.
DeVito, Stephen C
2018-02-01
During the past several decades, there has been an ever increasing emphasis for designers of new commercial (nonpharmaceutical) chemicals to include considerations of the potential impacts a planned chemical may have on human health and the environment as part of the design of the chemical, and to design chemicals such that they possess the desired use efficacy while minimizing threats to human health and the environment. Achievement of this goal would be facilitated by the availability of individuals specifically and formally trained to design such chemicals. Medicinal chemists are specifically trained to design and develop safe and clinically efficacious pharmaceutical substances. No such formally trained science hybrid exists for the design of safer commercial (nonpharmaceutical) chemicals. This article describes the need for and role of the "toxicological chemist," an individual who is formally trained in synthetic organic chemistry, biochemistry, physiology, toxicology, environmental science, and in the relationships between structure and commercial use efficacy, structure and toxicity, structure and environmental fate and effects, and global hazard, and trained to integrate this knowledge to design safer commercially efficacious chemicals. Using examples, this article illustrates the role of the toxicological chemist in designing commercially efficacious, safer chemical candidates. Published by Oxford University Press on behalf of the Society of Toxicology 2017. This work is written by a US Government employee and is in the public domain in the US.
Evaluating a Control System Architecture Based on a Formally Derived AOCS Model
NASA Astrophysics Data System (ADS)
Ilic, Dubravka; Latvala, Timo; Varpaaniemi, Kimmo; Vaisanen, Pauli; Troubitsyna, Elena; Laibinis, Linas
2010-08-01
Attitude & Orbit Control System (AOCS) refers to a wider class of control systems which are used to determine and control the attitude of the spacecraft while in orbit, based on the information obtained from various sensors. In this paper, we propose an approach to evaluate a typical (yet somewhat simplified) AOCS architecture using formal development - based on the Event-B method. As a starting point, an Ada specification of the AOCS is translated into a formal specification and further refined to incorporate all the details of its original source code specification. This way we are able not only to evaluate the Ada specification by expressing and verifying specific system properties in our formal models, but also to determine how well the chosen modelling framework copes with the level of detail required for an actual implementation and code generation from the derived models.
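Event-B has its own mathematical notation and tool support; purely as a loose Python analogue of the guarded-event-plus-invariant style of model refined in such work, here is a sketch with invented AOCS-flavoured names and numbers:

```python
# Loose Python analogue of an Event-B style machine: state variables,
# guarded events (guard + action), and an invariant re-checked after every
# event. The AOCS flavour and all numbers are invented for illustration.

state = {"mode": "SAFE", "attitude_error": 10.0}

def invariant(s):
    # e.g. fine-pointing mode is only permitted with a small attitude error
    return s["mode"] != "FINE" or s["attitude_error"] < 1.0

def ev_coarse_control(s):
    if s["mode"] == "SAFE":              # guard
        s["attitude_error"] *= 0.05      # action
        s["mode"] = "COARSE"

def ev_enter_fine(s):
    if s["mode"] == "COARSE" and s["attitude_error"] < 1.0:
        s["mode"] = "FINE"

for event in (ev_coarse_control, ev_enter_fine):
    event(state)
    assert invariant(state), "invariant violated"
print(state)   # -> {'mode': 'FINE', 'attitude_error': 0.5}
```

In Event-B proper, the invariant-preservation obligations are discharged by proof rather than by runtime assertions, and refinement adds detail step by step down to an implementable model.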
A Formal Approach to Requirements-Based Programming
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
No significant general-purpose method is currently available to mechanically transform system requirements into a provably equivalent model. The widespread use of such a method represents a necessary step toward high-dependability system engineering for numerous application domains. Current tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that the formal models cannot be proven to be equivalent to the requirements. We offer a method for mechanically transforming requirements into a provably equivalent formal model that can be used as the basis for code generation and other transformations. This method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. Finally, we describe further application areas we are investigating for use of the approach.
Improved formalism for precision Higgs coupling fits
NASA Astrophysics Data System (ADS)
Barklow, Tim; Fujii, Keisuke; Jung, Sunghoon; Karl, Robert; List, Jenny; Ogawa, Tomohisa; Peskin, Michael E.; Tian, Junping
2018-03-01
Future e+e- colliders give the promise of model-independent determinations of the couplings of the Higgs boson. In this paper, we present an improved formalism for extracting Higgs boson couplings from e+e- data, based on the effective field theory description of corrections to the Standard Model. We apply this formalism to give projections of Higgs coupling accuracies for stages of the International Linear Collider and for other proposed e+e- colliders.
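To make the flavour of such a coupling fit concrete, here is a schematic chi-square fit of coupling scale factors to invented signal-strength measurements; it illustrates generic fit machinery only, not the paper's EFT-based formalism:

```python
import numpy as np

# Schematic chi-square fit of Higgs coupling scale factors kappa_i from
# made-up signal-strength "measurements" mu_i ~ kappa_i^2. Channels,
# central values, and uncertainties are all invented.

channels = {"h->bb": (1.02, 0.01), "h->WW": (0.99, 0.02)}   # (mu, sigma)

def chi2(kappas):
    return sum(((k * k - mu) / sig) ** 2
               for k, (mu, sig) in zip(kappas, channels.values()))

# crude grid scan standing in for a real minimizer
grid = np.linspace(0.9, 1.1, 201)
best = min((chi2((kb, kw)), kb, kw) for kb in grid for kw in grid)
print(f"best-fit kappa_b={best[1]:.3f}, kappa_W={best[2]:.3f}, chi2={best[0]:.3g}")
```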
NASA Technical Reports Server (NTRS)
Boulet, C.; Ma, Qiancheng; Tipping, R. H.
2015-01-01
Starting from the refined Robert-Bonamy formalism [Q. Ma, C. Boulet, and R. H. Tipping, J. Chem. Phys. 139, 034305 (2013)], we propose here an extension of line mixing studies to infrared absorptions of linear polyatomic molecules having stretching and bending modes. The present formalism does not neglect the internal degrees of freedom of the perturbing molecules, contrary to the energy corrected sudden (ECS) modeling, and enables one to calculate the whole relaxation matrix starting from the potential energy surface. Meanwhile, similar to the ECS modeling, the present formalism properly accounts for roles played by all the internal angular momenta in the coupling process, including the vibrational angular momentum. The formalism has been applied to the important case of CO2 broadened by N2. Applications to two kinds of vibrational bands (Σ → Σ and Σ → Π) have shown that the present results are in good agreement with both experimental data and results derived from the ECS model.
QSAR DataBank - an approach for the digital organization and archiving of QSAR model information
2014-01-01
Background Research efforts in the field of descriptive and predictive Quantitative Structure-Activity Relationships or Quantitative Structure-Property Relationships produce around one thousand scientific publications annually. All the materials and results are mainly communicated using printed media. Printed media in their present form have obvious limitations when it comes to effectively representing mathematical models, including complex and non-linear ones, and large bodies of associated numerical chemical data. They do not support secondary information extraction or reuse efforts, while in silico studies pose additional requirements for accessibility, transparency and reproducibility of the research. This gap can and should be bridged by introducing domain-specific digital data exchange standards and tools. The current publication presents a formal specification of the quantitative structure-activity relationship data organization and archival format called the QSAR DataBank (QsarDB for shorter, or QDB for shortest). Results The article describes the QsarDB data schema, which formalizes QSAR concepts (objects and relationships between them), and the QsarDB data format, which formalizes their presentation for computer systems. The utility and benefits of QsarDB have been thoroughly tested by solving everyday QSAR and predictive modeling problems, with examples in the field of predictive toxicology, and can be applied to a wide variety of other endpoints. The work is accompanied by an open-source reference implementation and tools. Conclusions The proposed open data, open source, and open standards design is open to public and proprietary extensions on many levels. Selected use cases exemplify the benefits of the proposed QsarDB data format. General ideas for future development are discussed. PMID:24910716
Directly executable formal models of middleware for MANET and Cloud Networking and Computing
NASA Astrophysics Data System (ADS)
Pashchenko, D. V.; Sadeq Jaafar, Mustafa; Zinkin, S. A.; Trokoz, D. A.; Pashchenko, T. U.; Sinev, M. P.
2016-04-01
The article considers some “directly executable” formal models that are suitable for the specification of computing and networking in the cloud environment and in other networks similar to wireless mobile ad hoc networks (MANETs). These models can be easily programmed and implemented on computer networks.
A Model-Driven Approach to Teaching Concurrency
ERIC Educational Resources Information Center
Carro, Manuel; Herranz, Angel; Marino, Julio
2013-01-01
We present an undergraduate course on concurrent programming where formal models are used in different stages of the learning process. The main practical difference with other approaches lies in the fact that the ability to develop correct concurrent software relies on a systematic transformation of formal models of inter-process interaction (so…
Applying Automated Theorem Proving to Computer Security
2008-03-01
CS96]”. Violations of policy can also be specified in this model. La Padula [Pad90] discusses a domain-independent formal model which implements a...Science Laboratory, SRI International, Menlo Park, CA, September 1999. Pad90. L.J. La Padula. Formal modeling in a generalized framework for access
Reconfigurable Hardware Adapts to Changing Mission Demands
NASA Technical Reports Server (NTRS)
2003-01-01
A new class of computing architectures and processing systems, which use reconfigurable hardware, is creating a revolutionary approach to implementing future spacecraft systems. With the increasing complexity of electronic components, engineers must design next-generation spacecraft systems with new technologies in both hardware and software. Derivation Systems, Inc., of Carlsbad, California, has been working through NASA's Small Business Innovation Research (SBIR) program to develop key technologies in reconfigurable computing and Intellectual Property (IP) soft cores. Founded in 1993, Derivation Systems has received several SBIR contracts from NASA's Langley Research Center and the U.S. Department of Defense Air Force Research Laboratories in support of its mission to develop hardware and software for high-assurance systems. Through these contracts, Derivation Systems began developing leading-edge technology in formal verification, embedded Java, and reconfigurable computing for its PF3100, Derivational Reasoning System (DRS), FormalCORE IP, FormalCORE PCI/32, FormalCORE DES, and LavaCORE Configurable Java Processor, which are designed for greater flexibility and security on all space missions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marchisio, Mario Andrea, E-mail: marchisio@hit.edu.cn
Published in 2008, Parts & Pools represents one of the first attempts to conceptualize the modular design of bacterial synthetic gene circuits with Standard Biological Parts (DNA segments) and Pools of molecules referred to as common signal carriers (e.g., RNA polymerases and ribosomes). The original framework for modeling bacterial components and designing prokaryotic circuits evolved over the following years and led, first, to the development of an algorithm for the automatic design of Boolean gene circuits. This is a remarkable achievement, since gene digital circuits have a broad range of applications, from biosensors for health and environmental care to computational devices. More recently, Parts & Pools was extended to give a proper formal description of eukaryotic biological circuit components. This was made possible by employing a rule-based modeling approach, a technique that permits a faithful calculation of all the species and reactions involved in complex systems such as eukaryotic cells and compartments. In this way, Parts & Pools is currently suitable for the visual and modular design of synthetic gene circuits in yeast and mammalian cells too.
Hard and Soft Constraints in Reliability-Based Design Optimization
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Giesy, Daniel P.; Kenny, Sean P.
2006-01-01
This paper proposes a framework for the analysis and design optimization of models subject to parametric uncertainty where design requirements in the form of inequality constraints are present. Emphasis is given to uncertainty models prescribed by norm-bounded perturbations from a nominal parameter value and by sets of componentwise bounded uncertain variables. These models, which often arise in engineering problems, allow for sharp mathematical manipulation. Constraints can be implemented in the hard sense, i.e., constraints must be satisfied for all parameter realizations in the uncertainty model, or in the soft sense, i.e., constraints can be violated by some realizations of the uncertain parameter. In regard to hard constraints, the methodology makes it possible (i) to determine whether a hard constraint can be satisfied for a given uncertainty model and constraint structure, (ii) to generate conclusive, formally verifiable reliability assessments that allow for unprejudiced comparisons of competing design alternatives, and (iii) to identify the critical combination of uncertain parameters leading to constraint violations. In regard to soft constraints, the methodology allows the designer (i) to use probabilistic uncertainty models, (ii) to calculate upper bounds on the probability of constraint violation, and (iii) to efficiently estimate failure probabilities via a hybrid method. This method integrates the upper bounds, for which closed-form expressions are derived, with conditional sampling. In addition, an ℓ∞ formulation for the efficient manipulation of hyper-rectangular sets is also proposed.
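A minimal sketch of the hard/soft distinction for a toy constraint g(p) ≤ 0 over a componentwise-bounded (box) uncertainty model; the function and bounds are invented, and the paper's closed-form bounds and hybrid sampling are not reproduced:

```python
import numpy as np

# Toy requirement g(p) <= 0 with two uncertain parameters in a box
# (componentwise bounds); both the function and the bounds are invented.
def g(p):
    return p[0] ** 2 + p[1] - 0.5

lo, hi = np.array([-0.5, -0.5]), np.array([0.5, 0.5])

# Hard sense: g <= 0 must hold for *every* p in the box. For this toy case
# the worst point is a corner (|p0| and p1 maximal), so one evaluation decides.
print("hard constraint satisfied:", g(np.array([0.5, 0.5])) <= 0)   # False here

# Soft sense: estimate the probability of violation under a uniform model.
rng = np.random.default_rng(0)
samples = rng.uniform(lo, hi, size=(100_000, 2))
p_fail = np.mean(samples[:, 0] ** 2 + samples[:, 1] - 0.5 > 0)
print("estimated violation probability:", p_fail)   # analytically 1/12 here
```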
Linear quadratic servo control of a reusable rocket engine
NASA Technical Reports Server (NTRS)
Musgrave, Jeffrey L.
1991-01-01
The paper deals with the development of a design method for a servo component in the frequency domain using singular values and its application to a reusable rocket engine. A general methodology used to design a class of linear multivariable controllers for intelligent control systems is presented. Focus is placed on performance and robustness characteristics, and an estimator design performed in the framework of the Kalman-filter formalism with emphasis on using a sensor set different from the commanded values is discussed. It is noted that loop transfer recovery modifies the nominal plant noise intensities in order to obtain the desired degree of robustness to uncertainty reflected at the plant input. Simulation results demonstrating the performance of the linear design on a nonlinear engine model over all power levels during mainstage operation are discussed.
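For the regulator part of such a design, a minimal continuous-time LQR computation looks as follows; this is a generic sketch with a toy plant, not the engine model or the frequency-domain loop shaping of the paper:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Generic continuous-time LQR gain computation (illustrative only).
A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])     # toy plant dynamics
B = np.array([[0.0], [1.0]])
Q = np.diag([10.0, 1.0])         # state weighting
R = np.array([[0.1]])            # control weighting

P = solve_continuous_are(A, B, Q, R)   # algebraic Riccati equation
K = np.linalg.solve(R, B.T @ P)        # optimal gain, u = -K x
print("LQR gain K:", K)
```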
Adolescent thinking à la Piaget: The formal stage.
Dulit, E
1972-12-01
Two of the formal-stage experiments of Piaget and Inhelder, selected largely for their closeness to the concepts defining the stage, were replicated with groups of average and gifted adolescents. This report describes the relevant Piagetian concepts (formal stage, concrete stage) in context, gives the methods and findings of this study, and concludes with a section discussing implications and making some reformulations which generally support but significantly qualify some of the central themes of the Piaget-Inhelder work. Fully developed formal-stage thinking emerges as far from commonplace among normal or average adolescents (by marked contrast with the impression created by the Piaget-Inhelder text, which chooses to report no middle or older adolescents who function at less than fully formal levels). In this respect, the formal stage differs appreciably from the earlier Piagetian stages, and early adolescence emerges as the age for which a "single path" model of cognitive development becomes seriously inadequate and a more complex model becomes essential. Formal-stage thinking seems best conceptualized, like most other aspects of psychological maturity, as a potentiality only partially attained by most and fully attained only by some.
Formal verification of an avionics microprocessor
NASA Technical Reports Server (NTRS)
Srivas, Mandayam K.; Miller, Steven P.
1995-01-01
Formal specification combined with mechanical verification is a promising approach for achieving the extremely high levels of assurance required of safety-critical digital systems. However, many questions remain regarding their use in practice: Can these techniques scale up to industrial systems, where are they likely to be useful, and how should industry go about incorporating them into practice? This report discusses a project undertaken to answer some of these questions, the formal verification of the AAMP5 microprocessor. This project consisted of formally specifying in the PVS language a Rockwell proprietary microprocessor at both the instruction-set and register-transfer levels and using the PVS theorem prover to show that the microcode correctly implemented the instruction-level specification for a representative subset of instructions. Notable aspects of this project include the use of a formal specification language by practicing hardware and software engineers, the integration of traditional inspections with formal specifications, and the use of a mechanical theorem prover to verify a portion of a commercial, pipelined microprocessor that was not explicitly designed for formal verification.
Experience report: Using formal methods for requirements analysis of critical spacecraft software
NASA Technical Reports Server (NTRS)
Lutz, Robyn R.; Ampo, Yoko
1994-01-01
Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.
Towards Formal Implementation of PUS Standard
NASA Astrophysics Data System (ADS)
Ilić, D.
2009-05-01
As an effort to promote the reuse of on-board and ground systems, ESA developed a standard for packet telemetry and telecommand: PUS. It defines a set of standard service models with the corresponding structures of the associated telemetry and telecommand packets. Various missions can then choose to implement those standard PUS services that best conform to their specific requirements. In this paper we propose a formal development (based on the Event-B method) of reusable service patterns, which can be instantiated for a concrete application. Our formal models allow us to formally express and verify specific service properties, including validation of various telecommand and telemetry packet structures.
Provable Transient Recovery for Frame-Based, Fault-Tolerant Computing Systems
NASA Technical Reports Server (NTRS)
DiVito, Ben L.; Butler, Ricky W.
1992-01-01
We present a formal verification of the transient fault recovery aspects of the Reliable Computing Platform (RCP), a fault-tolerant computing system architecture for digital flight control applications. The RCP uses NMR-style redundancy to mask faults and internal majority voting to purge the effects of transient faults. The system design has been formally specified and verified using the EHDM verification system. Our formalization accommodates a wide variety of voting schemes for purging the effects of transients.
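The voting idea can be sketched in a few lines; this is an illustrative majority vote, not the formally verified RCP design:

```python
from collections import Counter

# Minimal sketch of NMR-style majority voting used to mask a faulty replica;
# feeding the voted value back into all replicas is what purges the state
# corruption left behind by a transient fault.

def vote(replica_outputs):
    """Return the majority value among redundant computations."""
    value, count = Counter(replica_outputs).most_common(1)[0]
    if count <= len(replica_outputs) // 2:
        raise RuntimeError("no majority: too many simultaneous faults")
    return value

# one replica suffers a transient fault; the vote masks it
print(vote([42, 42, 7]))   # -> 42
```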
Formal System Verification for Trustworthy Embedded Systems
2011-04-19
microkernel basis. We had previously achieved code-level formal verification of the seL4 microkernel [3]. In the present project, over 12 months with 0.6 FTE...project, we designed and implemented a secure network access device (SAC) on top of the verified seL4 microkernel. The device allows a trusted front...Engelhardt, Rafal Kolanski, Michael Norrish, Thomas Sewell, Harvey Tuch, and Simon Winwood. seL4: Formal verification of an OS kernel. CACM, 53(6):107
Experimental Evaluation of a Planning Language Suitable for Formal Verification
NASA Technical Reports Server (NTRS)
Butler, Rick W.; Munoz, Cesar A.; Siminiceanu, Radu I.
2008-01-01
The marriage of model checking and planning faces two seemingly diverging alternatives: the need for a planning language expressive enough to capture the complexity of real-life applications, as opposed to a language simple, yet robust enough to be amenable to exhaustive verification and validation techniques. In an attempt to reconcile these differences, we have designed an abstract plan description language, ANMLite, inspired by the Action Notation Modeling Language (ANML) [17]. We present the basic concepts of the ANMLite language as well as an automatic translator from ANMLite to the model checker SAL (Symbolic Analysis Laboratory) [7]. We discuss various aspects of specifying a plan in terms of constraints and explore the implications of choosing a robust logic behind the specification of constraints, rather than simply proposing a new planning language. Additionally, we provide an initial assessment of the efficiency of model checking in searching for solutions to planning problems. To this end, we design a basic test benchmark and study the scalability of the generated SAL models in terms of plan complexity.
A Reference Architecture for Space Information Management
NASA Technical Reports Server (NTRS)
Mattmann, Chris A.; Crichton, Daniel J.; Hughes, J. Steven; Ramirez, Paul M.; Berrios, Daniel C.
2006-01-01
We describe a reference architecture for space information management systems that elegantly overcomes the rigid design of common information systems in many domains. The reference architecture consists of a set of flexible, reusable, independent models and software components that function in unison, but remain separately managed entities. The main guiding principle of the reference architecture is to separate the various models of information (e.g., data, metadata, etc.) from implemented system code, allowing each to evolve independently. System modularity, systems interoperability, and dynamic evolution of information system components are the primary benefits of the design of the architecture. The architecture requires the use of information models that are substantially more advanced than those used by the vast majority of information systems. These models are more expressive and can be more easily modularized, distributed and maintained than simpler models, e.g., configuration files and data dictionaries. Our current work focuses on formalizing the architecture within a CCSDS Green Book and evaluating the architecture within the context of the C3I initiative.
2009-01-01
Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NuSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075
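The underlying verification idea, exhaustively exploring a finite model and checking a temporal property, can be sketched on an invented two-gene Boolean network; the real tool chain uses GNA with NuSMV/CADP rather than hand-rolled code:

```python
# Toy flavour of formal verification for a qualitative regulatory network:
# exhaustively explore a Boolean model and check a reachability property.
# The network and property are invented for illustration (not the E. coli
# carbon starvation model analysed in the paper).

def step(state):
    a, b = state               # two genes: a follows b, b represses on a
    return (b, int(not a))

def reachable(init):
    seen, frontier = set(), [init]
    while frontier:
        s = frontier.pop()
        if s not in seen:
            seen.add(s)
            frontier.append(step(s))
    return seen

# property "EF (a and not b)": some reachable state has gene a on, gene b off
print(any(s == (1, 0) for s in reachable((0, 0))))   # -> True
```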
Modeling of Biometric Identification System Using the Colored Petri Nets
NASA Astrophysics Data System (ADS)
Petrosyan, G. R.; Ter-Vardanyan, L. A.; Gaboutchian, A. V.
2015-05-01
In this paper we present a model of a biometric identification system transformed into Petri Nets. Petri Nets, as a graphical and mathematical tool, provide a uniform environment for modelling, formal analysis, and design of discrete event systems. The main objective of this paper is to introduce the fundamental concepts of Petri Nets to researchers and practitioners who work on modelling and analysis of biometric identification systems, as well as to those who may potentially become involved in these areas. In addition, the paper introduces high-level Petri Nets in the form of Colored Petri Nets (CPN). The Colored Petri Net model describes the identification process much more simply.
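A minimal Petri-net interpreter conveys the token-game semantics; the toy net below is invented, and Colored Petri Nets additionally attach typed ("colored") data to tokens:

```python
# Minimal Petri-net interpreter: places hold token counts, and a transition
# fires when all its input places carry enough tokens. The tiny net below
# (capture -> match) is an invented stand-in for the paper's CPN model.

marking = {"captured": 1, "template_db": 1, "matched": 0}

transitions = {
    "match": ({"captured": 1, "template_db": 1},   # tokens consumed
              {"matched": 1, "template_db": 1}),   # tokens produced
}

def fire(name):
    consume, produce = transitions[name]
    if all(marking[p] >= n for p, n in consume.items()):
        for p, n in consume.items():
            marking[p] -= n
        for p, n in produce.items():
            marking[p] += n
        return True
    return False

print(fire("match"), marking)
# -> True {'captured': 0, 'template_db': 1, 'matched': 1}
```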
Statistical mechanics of shell models for two-dimensional turbulence
NASA Astrophysics Data System (ADS)
Aurell, E.; Boffetta, G.; Crisanti, A.; Frick, P.; Paladin, G.; Vulpiani, A.
1994-12-01
We study shell models that conserve the analogs of energy and enstrophy and hence are designed to mimic fluid turbulence in two dimensions (2D). The main result is that the observed state is well described as a formal statistical equilibrium, closely analogous to the approach to two-dimensional ideal hydrodynamics of Onsager [Nuovo Cimento Suppl. 6, 279 (1949)], Hopf [J. Rat. Mech. Anal. 1, 87 (1952)], and Lee [Q. Appl. Math. 10, 69 (1952)]. In the presence of forcing and dissipation we observe a forward flux of enstrophy and a backward flux of energy. These fluxes can be understood as mean diffusive drifts from a source to two sinks in a system which is close to local equilibrium with Lagrange multipliers ("shell temperatures") changing slowly with scale. This is clear evidence that the simplest shell models are not adequate to reproduce the main features of two-dimensional turbulence. The dimensional predictions on the power spectra from a supposed forward cascade of enstrophy and from one branch of the formal statistical equilibrium coincide in these shell models, in contrast to the corresponding predictions for the Navier-Stokes and Euler equations in 2D. This coincidence has previously led to the mistaken conclusion that shell models exhibit a forward cascade of enstrophy. We also study the dynamical properties of the models and the growth of perturbations.
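A generic GOY-type shell-model integration can be sketched as follows; the interaction coefficients determine which quadratic invariants are conserved, and the values below are placeholders rather than the 2D-like (energy- and enstrophy-conserving) choice analysed in the paper:

```python
import numpy as np

# Generic GOY-type shell model: complex velocities u_n on logarithmically
# spaced shells k_n = k0 * lam**n, quadratic nearest-shell interactions,
# and viscous damping. The coefficients (a, b, c) are placeholders.

N, lam, k0, nu = 16, 2.0, 1.0, 1e-6
k = k0 * lam ** np.arange(N)
a, b, c = 1.0, -0.5, -0.5                      # placeholder coefficients

def rhs(u):
    up = np.concatenate([u, [0.0, 0.0]])       # boundary shells are zero
    du = np.zeros(N, dtype=complex)
    for n in range(N):
        t1 = up[n + 1] * up[n + 2]
        t2 = up[n - 1] * up[n + 1] if n >= 1 else 0.0
        t3 = up[n - 2] * up[n - 1] if n >= 2 else 0.0
        nonlinear = a * t1 + b / lam * t2 + c / lam ** 2 * t3
        du[n] = 1j * k[n] * np.conj(nonlinear) - nu * k[n] ** 2 * u[n]
    return du

rng = np.random.default_rng(1)
u = 1e-3 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
for _ in range(1000):                          # crude forward-Euler steps
    u = u + 1e-4 * rhs(u)
print("energy analogue:", np.sum(np.abs(u) ** 2))
```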
Magrane, Diane; Helitzer, Deborah; Morahan, Page; Chang, Shine; Gleason, Katharine; Cardinali, Gina; Wu, Chih-Chieh
2012-12-01
Surprisingly little research is available to explain the well-documented organizational and societal influences on persistent inequities in advancement of women faculty. The Systems of Career Influences Model is a framework for exploring factors influencing women's progression to advanced academic rank, executive positions, and informal leadership roles in academic medicine. The model situates faculty as agents within a complex adaptive system consisting of a trajectory of career advancement with opportunities for formal professional development programming; a dynamic system of influences of organizational policies, practices, and culture; and a dynamic system of individual choices and decisions. These systems of influence may promote or inhibit career advancement. Within this system, women weigh competing influences to make career advancement decisions, and leaders of academic health centers prioritize limited resources to support the school's mission. The Systems of Career Influences Model proved useful to identify key research questions. We used the model to probe how research in academic career development might be applied to content and methods of formal professional development programs. We generated a series of questions and hypotheses about how professional development programs might influence professional development of health science faculty members. Using the model as a guide, we developed a study using a quantitative and qualitative design. These analyses should provide insight into what works in recruiting and supporting productive men and women faculty in academic medical centers.
Analyses of Public Utility Building - Students Designs, Aimed at their Energy Efficiency Improvement
NASA Astrophysics Data System (ADS)
Wołoszyn, Marek Adam
2017-10-01
Public utility buildings are formally, structurally and functionally complex entities. Frequently, the process of their design involves the retroactive reconsideration of energy engineering issues once a building concept has already been completed. At that stage, minor formal corrections are made along with the design of the external layer of the building in order to satisfy applicable standards. Architecture students do the same when designing assigned public utility buildings. In order to demonstrate energy-related defects of building designs developed by students, analyses were conducted. The completed designs of public utility buildings were examined with regard to the energy efficiency of the solutions they feature through the application of the programs Ecotect and Vasari; in the case of simpler analyses, ArchiCAD program extensions were sufficient.
Palm: Easing the Burden of Analytical Performance Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tallent, Nathan R.; Hoisie, Adolfy
2014-06-01
Analytical (predictive) application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult because they must be both accurate and concise. To ease the burden of performance modeling, we developed Palm, a modeling tool that combines top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. To express insight, Palm defines a source code modeling annotation language. By coordinating models and source code, Palm's models are "first-class" and reproducible. Unlike prior work, Palm formally links models, functions, and measurements. As a result, Palm (a) uses functions to either abstract or express complexity; (b) generates hierarchical models (representing an application's static and dynamic structure); and (c) automatically incorporates measurements to focus attention, represent constant behavior, and validate models. We discuss generating models for three different applications.
Ontology for assessment studies of human-computer-interaction in surgery.
Machno, Andrej; Jannin, Pierre; Dameron, Olivier; Korb, Werner; Scheuermann, Gerik; Meixensberger, Jürgen
2015-02-01
New technologies improve modern medicine, but may result in unwanted consequences. Some occur due to inadequate human-computer interaction (HCI). To assess these consequences, an investigation model was developed to facilitate the planning, implementation and documentation of studies of HCI in surgery. The investigation model was formalized in the Unified Modeling Language and implemented as an ontology. Four different top-level ontologies were compared: Object-Centered High-level Reference, Basic Formal Ontology, General Formal Ontology (GFO) and Descriptive Ontology for Linguistic and Cognitive Engineering, according to the three major requirements of the investigation model: the domain-specific view, the experimental scenario and the representation of fundamental relations. Furthermore, this article emphasizes the distinction between "information model" and "model of meaning" and shows the advantages of implementing the model in an ontology rather than in a database. The results of the comparison show that GFO fits the defined requirements adequately: the domain-specific view and the fundamental relations can be implemented directly, and only the representation of the experimental scenario requires minor extensions. The other candidates require wide-ranging extensions concerning at least one of the major implementation requirements. Therefore, GFO was selected to realize an appropriate implementation of the developed investigation model. The ensuing development considered the concrete implementation of further model aspects and entities: sub-domains, space and time, processes, properties, relations and functions. The investigation model and its ontological implementation provide a modular guideline for study planning, implementation and documentation within the area of HCI research in surgery. This guideline helps to navigate through the whole study process in the form of a standard or good clinical practice, based on the foundational frameworks involved. Furthermore, it allows researchers to acquire a structured description of the assessment methods applied within a certain surgical domain and to use this information in their own study designs or to compare different studies. The investigation model and the corresponding ontology can further be used to create new knowledge bases for HCI assessment in surgery.
Surrogate models for efficient stability analysis of brake systems
NASA Astrophysics Data System (ADS)
Nechak, Lyes; Gillot, Frédéric; Besset, Sébastien; Sinou, Jean-Jacques
2015-07-01
This study assesses the capacity of global sensitivity analysis, combined with the kriging formalism, to support robust stability analysis of brake systems, which is too costly when performed with the classical complex eigenvalue analysis (CEA) based on finite element models (FEMs). By considering a simplified brake system, global sensitivity analysis is first shown to be very helpful for understanding the effects of design parameters on the brake system's stability. This is enabled by the so-called Sobol indices, which discriminate design parameters with respect to their influence on the stability. Consequently, only the uncertainty of influential parameters is taken into account in the following step, namely surrogate modelling based on kriging. The latter is then demonstrated to be an interesting alternative to FEMs since it allows, at lower cost, an accurate estimation of the system's proportions of instability corresponding to the influential parameters.
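The kriging step can be sketched with scikit-learn's Gaussian process regressor, using a cheap stand-in for the expensive eigenvalue analysis; the function, parameter range, and instability criterion below are invented:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Kriging (Gaussian-process) surrogate of an expensive stability measure,
# here a cheap stand-in function of one design parameter (e.g. a friction
# coefficient); the real use case replaces f with CEA on an FEM.

def f(mu):                      # stand-in for max real part of eigenvalues
    return np.sin(5 * mu) + 0.5 * mu

X = np.linspace(0.0, 1.0, 12)[:, None]          # a few expensive "FEM runs"
y = f(X[:, 0])

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2)).fit(X, y)

# the cheap surrogate is then sampled heavily to estimate the proportion
# of the uncertain-parameter range that is unstable (prediction > 0)
Xs = np.linspace(0.0, 1.0, 10_000)[:, None]
print("estimated unstable fraction:", np.mean(gp.predict(Xs) > 0))
```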
Recent Developments: PKI Square Dish for the Soleras Project
NASA Technical Reports Server (NTRS)
Rogers, W. E.
1984-01-01
The Square Dish solar collectors are subjected to rigorous design attention regarding corrosion at the site and certification of the collector structure. The microprocessor controls and tracking mechanisms are improved in the areas of fail-safe operation, durability, and low parasitic power requirements. Prototype testing demonstrates a performance efficiency of approximately 72% at a 730 F outlet temperature. Studies are conducted that include developing formal engineering design studies, developing formal engineering design drawings and fabrication details, establishing subcontracts for fabrication of major components, and developing a rigorous quality control system. The improved design is more cost-effective to produce, and the extensive manuals developed for assembly and operation/maintenance result in faster field assembly and ease of operation.
Recent developments: PKI square dish for the Soleras Project
NASA Astrophysics Data System (ADS)
Rogers, W. E.
1984-03-01
The Square Dish solar collectors are subjected to rigorous design attention regarding corrosion at the site and certification of the collector structure. The microprocessor controls and tracking mechanisms are improved in the areas of fail-safe operation, durability, and low parasitic power requirements. Prototype testing demonstrates a performance efficiency of approximately 72% at a 730 F outlet temperature. Studies are conducted that include developing formal engineering design studies, developing formal engineering design drawings and fabrication details, establishing subcontracts for fabrication of major components, and developing a rigorous quality control system. The improved design is more cost-effective to produce, and the extensive manuals developed for assembly and operation/maintenance result in faster field assembly and ease of operation.
Case Study of 'Engineering Peer Meetings' in JPL's ST-6 Project
NASA Technical Reports Server (NTRS)
Chao, Lawrence P.; Tumer, Irem
2004-01-01
This design process error-proofing case study describes a design review practice implemented by a project manager at NASA Jet Propulsion Laboratory. There are many types of reviews at NASA: required and not, formalized and informal, programmatic and technical. Standing formal project reviews such as the Preliminary Design Review (PDR) and Critical Design Review (CDR) are a required part of every project and mission development. However, the engineering peer reviews that support teams' technical work on such projects are often informal, ad hoc, and inconsistent across the organization. This case study discusses issues and innovations identified by a project manager at JPL and implemented in 'engineering peer meetings' for his group.
Case Study of "Engineering Peer Meetings" in JPL's ST-6 Project
NASA Technical Reports Server (NTRS)
Tumer, Irem Y.; Chao, Lawrence P.
2003-01-01
This design process error-proofing case study describes a design review practice implemented by a project manager at NASA Jet Propulsion Laboratory. There are many types of reviews at NASA: required and not, formalized and informal, programmatic and technical. Standing formal project reviews such as the Preliminary Design Review (PDR) and Critical Design Review (CDR) are a required part of every project and mission development. However, the engineering peer reviews that support teams' technical work on such projects are often informal, ad hoc, and inconsistent across the organization. This case study discusses issues and innovations identified by a project manager at JPL and implemented in "engineering peer meetings" for his group.
Bierer, S Beth; Dannefer, Elaine F; Tetzlaff, John E
2015-09-01
Remediation in the era of competency-based assessment demands a model that empowers students to improve performance. To examine a remediation model where students, rather than faculty, develop remedial plans to improve performance. Private medical school, 177 medical students. A promotion committee uses student-generated portfolios and faculty referrals to identify struggling students, and has them develop formal remediation plans with personal reflections, improvement strategies, and performance evidence. Students submit reports to document progress until formally released from remediation by the promotion committee. Participants included 177 students from six classes (2009-2014). Twenty-six were placed in remediation, with more referrals occurring during Years 1 or 2 (n = 20, 76%). Unprofessional behavior represented the most common reason for referral in Years 3-5. Remedial students did not differ from classmates (n = 151) on baseline characteristics (age, gender, US citizenship, MCAT) or willingness to recommend their medical school to future students (p > 0.05). Two remedial students did not graduate and three did not pass USMLE licensure exams on the first attempt. Most remedial students (92%) generated appropriate plans to address performance deficits. Students can successfully design remedial interventions. This learner-driven remediation model promotes greater autonomy and reinforces self-regulated learning.
A Formalization of HIPAA for a Medical Messaging System
NASA Astrophysics Data System (ADS)
Lam, Peifung E.; Mitchell, John C.; Sundaram, Sharada
The complexity of regulations in healthcare, financial services, and other industries makes it difficult for enterprises to design and deploy effective compliance systems. We believe that in some applications, it may be practical to support compliance by using formalized portions of applicable laws to regulate business processes that use information systems. In order to explore this possibility, we use a stratified fragment of Prolog with limited use of negation to formalize a portion of the US Health Insurance Portability and Accountability Act (HIPAA). As part of our study, we also explore the deployment of our formalization in a prototype hospital Web portal messaging system.
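The flavour of such a formalization can be sketched with a few Python rules using a negation-as-failure-style check; the rules below are invented for illustration and are not HIPAA text or the paper's Prolog fragment:

```python
# Toy rule-based permission check in the spirit of a stratified-Prolog
# policy formalization: positive facts, a derived permission, and one
# negated condition evaluated by absence of a fact (negation as failure).
# All predicates and rules here are invented, not HIPAA provisions.

facts = {("role", "alice", "physician"),
         ("treats", "alice", "bob")}

def permitted(sender, about, purpose):
    is_physician = ("role", sender, "physician") in facts
    treats = ("treats", sender, about) in facts
    opted_out = ("opt_out", about) in facts        # negation as failure
    return is_physician and treats and purpose == "treatment" and not opted_out

print(permitted("alice", "bob", "treatment"))   # -> True
facts.add(("opt_out", "bob"))
print(permitted("alice", "bob", "treatment"))   # -> False
```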
An overview of very high level software design methods
NASA Technical Reports Server (NTRS)
Asdjodi, Maryam; Hooper, James W.
1988-01-01
Very high level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications that include some semantic information of the system into machine-executable form. Very high level design methods range from general domain-independent methods to approaches implementable for specific applications or domains. Applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming and other methods, different approaches to higher-level software design are being developed. Although a given approach does not always fall exactly into any specific class, this paper provides a classification of very high level design methods, including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths, and feasibility for future expansion toward automatic development of software systems.
Improved formalism for precision Higgs coupling fits
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barklow, Tim; Fujii, Keisuke; Jung, Sunghoon
Future e+e- colliders give the promise of model-independent determinations of the couplings of the Higgs boson. In this paper, we present an improved formalism for extracting Higgs boson couplings from e+e- data, based on the effective field theory description of corrections to the Standard Model. Lastly, we apply this formalism to give projections of Higgs coupling accuracies for stages of the International Linear Collider and for other proposed e+e- colliders.
Improved formalism for precision Higgs coupling fits
Barklow, Tim; Fujii, Keisuke; Jung, Sunghoon; ...
2018-03-20
Future e+e- colliders give the promise of model-independent determinations of the couplings of the Higgs boson. In this paper, we present an improved formalism for extracting Higgs boson couplings from e+e- data, based on the effective field theory description of corrections to the Standard Model. Lastly, we apply this formalism to give projections of Higgs coupling accuracies for stages of the International Linear Collider and for other proposed e+e- colliders.
A theoretical framework for improving education in geriatric medicine.
Boreham, N C
1983-01-01
Alternative concepts of learning include a formal system, in which part of the medical curriculum is designated as that for geriatric medicine; a non-formal system, including conferences, lectures and broadcasts available to both medical students and physicians; and thirdly an informal system, in which doctors learn medicine through their experience of practising the profession. While most of the emphasis in medical schools would seem to be on the formal system, it is essential that medical educators, if they wish their students in later life to maintain high levels of self-initiated learning, use all three strategies. The structure of a system of formal teaching for geriatric medicine is examined. An important objective is attitude change, and it is in achieving this that geriatricians must be particularly involved in non-formal and informal systems.
δM formalism and anisotropic chaotic inflation power spectrum
NASA Astrophysics Data System (ADS)
Talebian-Ashkezari, A.; Ahmadi, N.
2018-05-01
A new analytical approach to linear perturbations in anisotropic inflation has been introduced in [A. Talebian-Ashkezari, N. Ahmadi and A.A. Abolhasani, JCAP 03 (2018) 001] under the name of the δM formalism. In this paper we apply this approach to a model of anisotropic inflation driven by a scalar field coupled to the kinetic term of a vector field with a U(1) symmetry. The δM formalism provides an efficient way of computing the tensor-tensor, tensor-scalar and scalar-scalar 2-point correlations that are needed for the analysis of the observational features of an anisotropic model on the CMB. A comparison between δM results and the tedious calculations of the in-in formalism shows the aptitude of the δM formalism for calculating accurate two-point correlation functions between the physical modes of the system.
THE CIDOC CRM GAME: A Serious Game Approach to Ontology Learning
NASA Astrophysics Data System (ADS)
Guillem, A.; Bruseker, G.
2017-08-01
Formal ontologies such as the CIDOC CRM (Conceptual Reference Model) form part of the central strategy for the medium- and long-term integration of cultural heritage data to allow for its greater valorization and dissemination. Despite this, uptake of CIDOC CRM at the ground level of Cultural Heritage (CH) practice is limited. Part of the reason behind this lack of uptake lies in the fact that ontologies are considered too complicated and abstract for application in real-life scenarios. This paper presents the rationale behind and the design of a CIDOC CRM game, the intent of which is to provide a learning mechanism that allows learners of wide backgrounds and interests to approach CIDOC CRM in a hands-on and interactive fashion. The CIDOC CRM game consists of decks of cards and game boards that allow players to engage with the concepts of a formal ontology in relation to real data in an entertaining and informative way. It is argued that the CIDOC CRM game can form an important part of introducing the basic elements of formal ontology and this standard to a wider audience, in order to aid wider understanding and adoption of the same.
Compositional and enumerative designs for medical language representation.
Rassinoux, A. M.; Miller, R. A.; Baud, R. H.; Scherrer, J. R.
1997-01-01
Medical language is in essence highly compositional, allowing complex information to be expressed from more elementary pieces. Embedding the expressive power of medical language into formal systems of representation is recognized in the medical informatics community as a key step towards sharing such information among medical record, decision support, and information retrieval systems. Accordingly, such representation requires managing both the expressiveness of the formalism and its computational tractability, while coping with the level of detail expected by clinical applications. These desiderata can be supported by enumerative as well as compositional approaches, as argued in this paper. These principles have been applied in recasting a frame-based system for general medical findings developed during the 1980s. The new system captures the precise meaning of a subset of over 1500 medical terms for general internal medicine identified from the Quick Medical Reference (QMR) lexicon. In order to evaluate the adequacy of this formal structure in reflecting the deep meaning of the QMR findings, a validation process was implemented. It consists of automatically rebuilding the semantic representation of the QMR findings by analyzing them through the RECIT natural language analyzer, whose semantic components have been adjusted to this frame-based model for the understanding task. PMID:9357700
Compositional and enumerative designs for medical language representation.
Rassinoux, A M; Miller, R A; Baud, R H; Scherrer, J R
1997-01-01
Medical language is in essence highly compositional, allowing complex information to be expressed from more elementary pieces. Embedding the expressive power of medical language into formal systems of representation is recognized in the medical informatics community as a key step towards sharing such information among medical record, decision support, and information retrieval systems. Accordingly, such representation requires managing both the expressiveness of the formalism and its computational tractability, while coping with the level of detail expected by clinical applications. These desiderata can be supported by enumerative as well as compositional approaches, as argued in this paper. These principles have been applied in recasting a frame-based system for general medical findings developed during the 1980s. The new system captures the precise meaning of a subset of over 1500 medical terms for general internal medicine identified from the Quick Medical Reference (QMR) lexicon. In order to evaluate the adequacy of this formal structure in reflecting the deep meaning of the QMR findings, a validation process was implemented. It consists of automatically rebuilding the semantic representation of the QMR findings by analyzing them through the RECIT natural language analyzer, whose semantic components have been adjusted to this frame-based model for the understanding task.
ERIC Educational Resources Information Center
Kalechofsky, Robert
This research paper proposes several mathematical models which help clarify Piaget's theory of cognition on the concrete and formal operational stages. Some modified lattice models were used for the concrete stage and a combined Boolean Algebra and group theory model was used for the formal stage. The researcher used experiments cited in the…
Enhanced semantic interoperability by profiling health informatics standards.
López, Diego M; Blobel, Bernd
2009-01-01
Several standards applied to the healthcare domain support semantic interoperability. These standards are far from being completely adopted in health information system development, however. The objective of this paper is to provide a method, and suggest the necessary tooling, for reusing standard health information models, thereby supporting the development of semantically interoperable systems and components. The approach is based on the definition of UML Profiles. UML profiling is a formal modeling mechanism for specializing reference meta-models in such a way that it is possible to adapt those meta-models to specific platforms or domains. A health information model can be considered such a meta-model. The first step of the introduced method identifies the standard health information models and the tasks in the software development process in which healthcare information models can be reused. Then, the selected information model is formalized as a UML Profile. That Profile is finally applied to system models, annotating them with the semantics of the information model. The approach is supported by Eclipse-based UML modeling tools. The method is integrated into a comprehensive framework for health information systems development, and the feasibility of the approach is demonstrated in the analysis, design, and implementation of a public health surveillance system, reusing HL7 RIM and DIM specifications. The paper describes a method and the necessary tooling for reusing standard healthcare information models. UML offers several advantages, such as tooling support, graphical notation, exchangeability, extensibility, and semi-automatic code generation. The approach presented is also applicable for harmonizing different standard specifications.
NASA Technical Reports Server (NTRS)
Manning, Robert M.
1986-01-01
A rain attenuation prediction model is described for use in calculating satellite communication link availability for any specific location in the world that is characterized by an extended record of rainfall. Such a formalism is necessary for the accurate assessment of availability predictions in the case of the small user-terminal concept of the Advanced Communication Technology Satellite (ACTS) Project. The model employs the theory of extreme value statistics to generate the necessary statistical rain-rate parameters from rain data in the form compiled by the National Weather Service. These location-dependent rain statistics are then applied to a rain attenuation model to obtain a yearly prediction of the occurrence of attenuation on any satellite link at that location. The predictions of this model are compared to those of the Crane Two-Component Rain Model and to some empirical data, and are found to be very good. The model is then used to calculate rain attenuation statistics at 59 locations in the United States (including Alaska and Hawaii) for the 20 GHz downlinks and 30 GHz uplinks of the proposed ACTS system. The flexibility of this modeling formalism is such that it allows a complete and unified treatment of the temporal aspects of rain attenuation, leading to the design of an optimum stochastic power control algorithm whose purpose is to efficiently counter such rain fades on a satellite link.
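The extreme-value step might look like the following sketch, which fits a Gumbel distribution to invented annual-maximum rain rates; the mapping from rain rate to link attenuation is model-specific and omitted here:

```python
import numpy as np
from scipy.stats import gumbel_r

# Sketch of the extreme-value step: fit a Gumbel distribution to annual
# maximum rain rates and read off the rate exceeded with 1% probability.
# The annual maxima below are invented numbers, not NWS data.

annual_max_rain = np.array([62, 75, 58, 90, 71, 66, 83, 69, 77, 95.0])  # mm/h
loc, scale = gumbel_r.fit(annual_max_rain)
r_1pct = gumbel_r.ppf(0.99, loc=loc, scale=scale)
print(f"rain rate exceeded in 1% of years: {r_1pct:.1f} mm/h")
```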
Problem-based learning: effects on student’s scientific reasoning skills in science
NASA Astrophysics Data System (ADS)
Wulandari, F. E.; Shofiyah, N.
2018-04-01
This research aimed to develop an instructional package for problem-based learning to enhance students' scientific reasoning from the concrete to the formal reasoning level. The instructional package was developed using the Dick and Carey model. The package consisted of a lesson plan, a handout, a student worksheet, and a scientific reasoning test. It was tried out on 4th-semester science education students of Universitas Muhammadiyah Sidoarjo using a one-group pre-test post-test design. Data on scientific reasoning skills were collected with the test. The findings showed that the developed instructional package reflecting problem-based learning was feasible to implement in the classroom. Furthermore, through problem-based learning, students mastered formal scientific reasoning skills in terms of functionality and proportional reasoning, control of variables, and theoretical reasoning.
A Semantic Approach for Geospatial Information Extraction from Unstructured Documents
NASA Astrophysics Data System (ADS)
Sallaberry, Christian; Gaio, Mauro; Lesbegueries, Julien; Loustau, Pierre
Local cultural heritage document collections are characterized by their content, which is strongly attached to a territory and its land history (i.e., geographical references). Our contribution aims at making the content retrieval process more efficient whenever a query includes geographic criteria. We propose a core model for a formal representation of geographic information. It takes into account characteristics of different modes of expression, such as written language, captures of drawings, maps, photographs, etc. We have developed a prototype that fully implements geographic information extraction (IE) and geographic information retrieval (IR) processes. All PIV prototype processing resources are designed as Web Services. We propose a geographic IE process based on semantic treatment as a supplement to classical IE approaches. We implement geographic IR by using intersection computing algorithms that seek out any intersection between formal geocoded representations of geographic information in a user query and similar representations in document collection indexes.
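At its simplest, the geographic IR step reduces to footprint intersection tests; here is a minimal bounding-box sketch with invented footprints (the PIV prototype uses richer formal geocoded representations):

```python
# Minimal sketch of the intersection test behind geographic IR: the query
# footprint and each indexed document footprint are reduced to bounding
# boxes; a document matches when its box overlaps the query box.

def intersects(a, b):
    """Axis-aligned boxes given as (min_lon, min_lat, max_lon, max_lat)."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

index = {"doc1": (-0.8, 43.2, -0.5, 43.4),   # invented footprints
         "doc2": (2.1, 48.7, 2.5, 49.0)}
query = (-0.7, 43.0, -0.4, 43.3)

print([d for d, box in index.items() if intersects(query, box)])  # ['doc1']
```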
Prayer and Adolescence: Can Formal Instruction Make a Difference?
ERIC Educational Resources Information Center
Sigel, Deena
2016-01-01
This design experiment in prayer education for Jewish students was motivated by a current educational concern: educating for spirituality and religious practice. Educators are tasked with formally nurturing spirituality (Wright 2001). It is known that attitude to religious observance may change during adolescence (Hyde 1963), thus attitude to…
Hearing-Impaired Formal Inservice Program.
ERIC Educational Resources Information Center
Northeast Regional Media Center for the Deaf, Amherst, MA.
The HI-FI (Hearing-Impaired Formal Inservice) Program is described as a set of inservice materials targeted for workshops of regular classroom teachers and other school personnel concerned with school district and classroom management of hearing impaired (HI) children. An introductory section focuses on the design of the program materials,…
Educating Pediatric Residents about Developmental and Social-Emotional Health
ERIC Educational Resources Information Center
Bauer, Sarah C.; Smith, Peter J.; Chien, Alyna T.; Berry, Anita D.; Msall, Michael
2009-01-01
Enhancing Developmentally Oriented Primary Care (EDOPC) is a formal didactic curriculum based on Healthy Steps materials that is designed to improve practicing pediatricians' knowledge and confidence in developmental screening within the medical home. We modified the EDOPC program to provide a formal curriculum to pediatric residents serving…
Preschool Children Learn about Causal Structure from Conditional Interventions
ERIC Educational Resources Information Center
Schulz, Laura E.; Gopnik, Alison; Glymour, Clark
2007-01-01
The conditional intervention principle is a formal principle that relates patterns of interventions and outcomes to causal structure. It is a central assumption of experimental design and the causal Bayes net formalism. Two studies suggest that preschoolers can use the conditional intervention principle to distinguish causal chains, common cause…
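The principle can be illustrated with a toy simulation: intervening on B changes C under a chain A -> B -> C but not under a common cause B <- A -> C (invented model, not the studies' stimuli):

```python
import random

# Toy illustration of the conditional intervention principle: compare
# P(C | do(B=1)) under two candidate structures over binary variables.

def sample(structure, do_b=None):
    a = random.random() < 0.5
    b = a if do_b is None else do_b          # intervention severs A's influence on B
    c = b if structure == "chain" else a     # chain: C follows B; common cause: C follows A
    return c

for structure in ("chain", "common_cause"):
    effect = sum(sample(structure, do_b=True) for _ in range(10_000)) / 10_000
    print(structure, "P(C | do(B=1)) ~", round(effect, 2))
# chain ~ 1.0, common_cause ~ 0.5: the intervention outcome reveals the structure
```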
Informal and Formal Learning of General Practitioners
ERIC Educational Resources Information Center
Spaan, Nadia Roos; Dekker, Anne R. J.; van der Velden, Alike W.; de Groot, Esther
2016-01-01
Purpose: The purpose of this study is to understand the influence of formal learning from a web-based training and informal (workplace) learning afterwards on the behaviour of general practitioners (GPs) with respect to prescription of antibiotics. Design/methodology/approach: To obtain insight in various learning processes, semi-structured…
ERIC Educational Resources Information Center
Arons, A. B.
1976-01-01
Describes special factors and procedures which are utilized in an introductory physical science course for nonscience majors. It is designed to enable students who are at a concrete or transitional stage to attain the formal operational level of development. (Author/SL)
Operant Learning, Cognitive Development, and Job Aids.
ERIC Educational Resources Information Center
Harmon, N. Paul; King, David R.
1979-01-01
Examines the relationship between learning and development in the most general terms, discusses the developmental distinction between concrete and formal operational thought as manifested in adult behavior, and considers the implications of the concrete-formal dichotomy for the design and use of job aids. Notes and a bibliography are provided.…
Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B
NASA Technical Reports Server (NTRS)
Yeganefard, Sanaz; Butler, Michael; Rezazadeh, Abdolbaghi
2010-01-01
Recently a set of guidelines, or cookbook, has been developed for modelling and refinement of control problems in Event-B. The Event-B formal method is used for system-level modelling by defining the states of a system and events which act on these states. It also supports refinement of models. This cookbook is intended to systematize the process of modelling and refining a control problem by distinguishing environment, controller and command phenomena. Our main objective in this paper is to investigate and evaluate the usefulness and effectiveness of the cookbook by following it throughout the formal modelling of a cruise control system found in cars. The outcomes identify the benefits of the cookbook and provide guidance to its future users.
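Event-B has its own notation for machines and events; as a loose analogue only, the sketch below mimics the guard/action structure of events acting on state and the cookbook's separation of environment and controller phenomena. All variable and event names are hypothetical, not taken from the cookbook's cruise control development.

```python
# Loose Python analogue of Event-B style events: each event has a guard
# over the state and an action that updates it. Names are hypothetical.
state = {"cruise_on": False, "speed": 0.0, "target": None}

def guard_engage(s):                  # controller phenomenon
    return not s["cruise_on"]

def action_engage(s):
    s["cruise_on"] = True

def guard_set_target(s):              # command phenomenon
    return s["cruise_on"]

def action_set_target(s, v):
    s["target"] = v

def step_env(s, delta):               # environment phenomenon: speed drifts
    s["speed"] += delta

def fire(guard, action, s, *args):
    # An event may fire only when its guard holds, as in Event-B.
    if guard(s):
        action(s, *args)
        return True
    return False

fire(guard_engage, action_engage, state)
fire(guard_set_target, action_set_target, state, 90.0)
step_env(state, -2.5)
print(state)
```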
37 CFR 1.152 - Design drawings.
Code of Federal Regulations, 2012 CFR
2012-07-01
... are not permitted in a design drawing. Photographs and ink drawings are not permitted to be combined as formal drawings in one application. Photographs submitted in lieu of ink drawings in design patent...
37 CFR 1.152 - Design drawings.
Code of Federal Regulations, 2014 CFR
2014-07-01
... are not permitted in a design drawing. Photographs and ink drawings are not permitted to be combined as formal drawings in one application. Photographs submitted in lieu of ink drawings in design patent...
Boman, Inga-Lill; Persson, Ann-Christine; Bartfai, Aniko
2016-03-07
The project Smart Assisted Living involving Informal careGivers++ (SALIG) intends to develop an ICT-based device for persons with cognitive impairment, combined with remote support possibilities for significant others and formal caregivers. This paper presents the identification of the target groups' needs and requirements for such a device and the evaluation of the first mock-up, demonstrated on a tablet. The inclusive design method, which involves end-users in the design process, was chosen. First, a scoping review was conducted to examine the target group's need for an ICT-based device and to gather recommendations regarding its design and functionalities. To capture the users' requirements for the design and functionalities of the device, three targeted focus groups were conducted. Based on the findings from the publications and the focus groups, a user requirement specification was developed. A design concept and a first mock-up were then developed in an iterative process. The mock-up was evaluated through interviews with persons with cognitive impairment, health care professionals and significant others. Data were analysed using content analysis. Several useful recommendations on the design and functionalities of the SALIG device for persons with cognitive impairment were identified. The main benefit of the mock-up was that it was a single device with a set of functionalities installed on a tablet and designed for persons with cognitive impairment. An additional benefit was that it could be used remotely by significant others and formal caregivers. The SALIG device has the potential to facilitate everyday life for persons with cognitive impairment and their significant others, and the work situation for formal caregivers. The results may provide guidance in the development of different types of technologies for the target population and for people with diverse disabilities. Further work will focus on developing a prototype to be empirically tested by persons with cognitive impairment, their significant others and formal caregivers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yavari, M., E-mail: yavari@iaukashan.ac.ir
2016-06-15
We generalize the results of Nesterenko [13, 14] and Gogilidze and Surovtsev [15] for DNA structures. Using the generalized Hamiltonian formalism, we investigate solutions of the equilibrium shape equations for the linear free energy model.
Das, Ashok Kumar; Odelu, Vanga; Goswami, Adrijit
2015-09-01
The telecare medicine information system (TMIS) helps patients obtain health monitoring at home and access medical services over the Internet or mobile networks. Recently, Amin and Biswas presented a smart card based user authentication and key agreement security protocol for TMIS using a cryptographic one-way hash function and a biohashing function, and claimed that their scheme is secure against all possible attacks. Though their scheme is efficient due to its use of a one-way hash function, we show that it has several security pitfalls and design flaws: (1) it fails to protect against privileged-insider attack, (2) it fails to protect against strong replay attack, (3) it fails to protect against strong man-in-the-middle attack, (4) it has a design flaw in the user registration phase, (5) it has a design flaw in the login phase, (6) it has a design flaw in the password change phase, (7) it lacks a biometric update phase, and (8) it has flaws in its formal security analysis. To withstand these security pitfalls and design flaws, we propose a secure and robust user authenticated key agreement scheme for the hierarchical multi-server environment suitable for TMIS, using a cryptographic one-way hash function and a fuzzy extractor. Through rigorous security analysis, including formal security analysis using the widely-accepted Burrows-Abadi-Needham (BAN) logic, formal security analysis under the random oracle model, and informal security analysis, we show that our scheme is secure against possible known attacks. Furthermore, we simulate our scheme using the most widely accepted and used Automated Validation of Internet Security Protocols and Applications (AVISPA) tool; the simulation results show that our scheme is also secure. Our scheme is more efficient in computation and communication than the Amin-Biswas scheme and other related schemes, and supports extra functionality features. As a result, our scheme is very appropriate for practical applications in TMIS.
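For readers unfamiliar with the building block both schemes share, here is a minimal, hypothetical challenge-response fragment using a one-way hash; it is neither the Amin-Biswas protocol nor the scheme proposed here, and all identifiers and the message layout are invented.

```python
# Illustrative challenge-response fragment built from a one-way hash,
# the basic primitive both schemes rely on. NOT the Amin-Biswas protocol
# nor the authors' fix; identifiers and message layout are invented.
import hashlib, hmac, os

def h(*parts: bytes) -> bytes:
    return hashlib.sha256(b"|".join(parts)).digest()

shared_key = os.urandom(32)            # established at registration

# Server issues a fresh nonce as a challenge (guards against replay).
nonce = os.urandom(16)

# User (smart card) proves knowledge of the key without revealing it.
user_id = b"patient-007"
response = h(user_id, nonce, shared_key)

# Server recomputes and compares in constant time.
ok = hmac.compare_digest(response, h(user_id, nonce, shared_key))
print("authenticated:", ok)
```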
NASA Technical Reports Server (NTRS)
Ciardo, Gianfranco
2004-01-01
The Runway Safety Monitor (RSM) designed by Lockheed Martin is part of NASA's effort to reduce aviation accidents. We developed a Petri net model of the RSM protocol and used the model checking functions of our tool SMART to investigate a number of safety properties in RSM. To mitigate the impact of state-space explosion, we built a highly discretized model of the system, obtained by partitioning the monitored runway zone into a grid of smaller volumes and by considering scenarios involving only two aircraft. The model also assumes that there are no communication failures, such as bad input from radar or lack of incoming data, thus it relies on a consistent view of reality by all participants. In spite of these simplifications, we were able to expose potential problems in the RSM conceptual design. Our findings were forwarded to the design engineers, who undertook corrective action. Additionally, the results stress the efficiency attained by the new model checking algorithms implemented in SMART, and demonstrate their applicability to real-world systems. Attempts to verify RSM with NuSMV and SPIN have failed due to excessive memory consumption.
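The flavor of such an explicit-state analysis can be conveyed by a toy reachability check (not the SMART Petri net model itself): two aircraft on a small discretized grid, with breadth-first search flagging reachable states that violate separation. The grid size, moves, and safety property are invented for illustration.

```python
# Toy explicit-state safety check in the spirit of the RSM analysis.
from collections import deque

N = 4                                   # 4x4 grid of runway-zone volumes
MOVES = [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]

def neighbours(state):
    (ax, ay), (bx, by) = state
    for dax, day in MOVES:
        for dbx, dby in MOVES:
            na, nb = (ax + dax, ay + day), (bx + dbx, by + dby)
            if all(0 <= c < N for c in (*na, *nb)):
                yield (na, nb)

def unsafe(state):                      # both aircraft in the same volume
    return state[0] == state[1]

init = ((0, 0), (3, 3))
seen, frontier, violations = {init}, deque([init]), 0
while frontier:
    s = frontier.popleft()
    if unsafe(s):
        violations += 1
        continue                        # don't explore past a violation
    for t in neighbours(s):
        if t not in seen:
            seen.add(t)
            frontier.append(t)

print(f"{len(seen)} states explored, {violations} unsafe states reachable")
```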
Formal Verification of a Power Controller Using the Real-Time Model Checker UPPAAL
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Larsen, Kim Guldstrand; Skou, Arne
1999-01-01
A real-time system for power-down control in audio/video components is modeled and verified using the real-time model checker UPPAAL. The system is supposed to reside in an audio/video component and control (read from and write to) links to neighbor audio/video components such as TV, VCR and remote-control. In particular, the system is responsible for the powering up and down of the component in between the arrival of data, and in order to do so in a safe way without loss of data, it is essential that no link interrupts are lost. Hence, a component system is a multitasking system with hard real-time requirements, and we present techniques for modeling time consumption in such a multitasked, prioritized system. The work has been carried out in a collaboration between Aalborg University and the audio/video company B&O. By modeling the system, 3 design errors were identified and corrected, and the following verification confirmed the validity of the design but also revealed the necessity for an upper limit of the interrupt frequency. The resulting design has been implemented and it is going to be incorporated as part of a new product line.
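The role of the interrupt-frequency bound can be illustrated with a simple discrete-time calculation; UPPAAL itself reasons over continuous-time timed automata rather than this simulation, and the numbers below are invented.

```python
# Discrete-time illustration of the verified constraint: if link interrupts
# can arrive closer together than the handler's worst-case busy time, an
# interrupt is lost.
def interrupts_lost(arrivals, busy_time):
    busy_until, lost = 0, 0
    for t in arrivals:
        if t < busy_until:        # handler still busy: interrupt dropped
            lost += 1
        else:
            busy_until = t + busy_time
    return lost

fast = range(0, 100, 2)          # period 2 < busy time 3: losses occur
slow = range(0, 100, 5)          # period 5 >= busy time 3: none lost
print(interrupts_lost(fast, 3))  # > 0
print(interrupts_lost(slow, 3))  # 0
```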
A formal approach to the analysis of clinical computer-interpretable guideline modeling languages.
Grando, M Adela; Glasspool, David; Fox, John
2012-01-01
To develop proof strategies to formally study the expressiveness of workflow-based languages, and to investigate their applicability to clinical computer-interpretable guideline (CIG) modeling languages. We propose two strategies for studying the expressiveness of workflow-based languages based on a standard set of workflow patterns expressed as Petri nets (PNs) and on notions of congruence and bisimilarity from process calculus. Proof that a PN-based pattern P can be expressed in a language L can be carried out semi-automatically. Proof that a language L cannot provide the behavior specified by a PN-based pattern P requires proof by exhaustion based on analysis of cases and cannot be performed automatically. The proof strategies are generic, but we exemplify their use with a particular CIG modeling language, PROforma. To illustrate the method we evaluate the expressiveness of PROforma against three standard workflow patterns and compare our results with a previous similar but informal comparison. We show that the two proof strategies are effective in evaluating a CIG modeling language against standard workflow patterns. We find that using the proposed formal techniques we obtain different results from a comparable previously published but less formal study. We discuss the utility of these analyses as the basis for principled extensions to CIG modeling languages. Additionally, we explain how the same proof strategies can be reused to prove the satisfaction of patterns expressed in the declarative language CIGDec. The proof strategies we propose are useful tools for analysing the expressiveness of CIG modeling languages. This study provides good evidence of the benefits of applying formal methods of proof over semi-formal ones.
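For readers unfamiliar with the pattern formalism, here is a minimal Petri net interpreter expressing the basic "sequence" workflow pattern used as a yardstick in such comparisons; the data layout is illustrative, not the paper's encoding.

```python
# A minimal Petri net interpreter, enough for the "sequence" pattern:
# taskB becomes enabled only after taskA has fired.
marking = {"p0": 1, "p1": 0, "p2": 0}
transitions = {
    "taskA": (["p0"], ["p1"]),   # consumes p0, produces p1
    "taskB": (["p1"], ["p2"]),
}

def enabled(t):
    pre, _ = transitions[t]
    return all(marking[p] > 0 for p in pre)

def fire(t):
    assert enabled(t), f"{t} not enabled"
    pre, post = transitions[t]
    for p in pre:
        marking[p] -= 1
    for p in post:
        marking[p] += 1

fire("taskA")
fire("taskB")
print(marking)       # {'p0': 0, 'p1': 0, 'p2': 1}
```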
The nature of advanced reasoning and science instruction
NASA Astrophysics Data System (ADS)
Lawson, Anton E.
Although the development of reasoning is recognized as an important goal of science instruction, its nature remains somewhat of a mystery. This article discusses two key questions: Does formal thought constitute a structured whole? And what role does propositional logic play in advanced reasoning? Aspects of a model of advanced reasoning are presented in which hypothesis generation and testing are viewed as central processes in intellectual development. It is argued that a number of important advanced reasoning schemata are linked by these processes and should be made a part of science instruction designed to improve students' reasoning abilities.

Concerning students' development and use of formal reasoning, Linn (1982) calls for research into practical issues such as the roles of task-specific knowledge and individual differences in performance, roles not emphasized by Piaget in his theory and research. From a science teacher's point of view, this is good advice. Accordingly, this article will expand upon some of the issues raised by Linn in a discussion of the nature of advanced reasoning that attempts to reconcile the apparent contradiction between students' differential use of advanced reasoning schemata in varying contexts and the notion of a general stage of formal thought. Two key questions will be discussed: Does formal thought constitute a structured whole? And what role does propositional logic play in advanced reasoning? The underlying assumption of the present discussion is that, among other things, science instruction should concern itself with the improvement of students' reasoning abilities (cf. Arons, 1976; Arons & Karplus, 1976; Bady, 1979; Bauman, 1976; Educational Policies Commission, 1966; Herron, 1978; Karplus, 1979; Kohlberg & Mayer, 1972; Moshman & Thompson, 1981; Lawson, 1979; Levine & Linn, 1977; Pallrand, 1977; Renner & Lawson, 1973; Sayre & Ball, 1975; Schneider & Renner, 1980; Wollman, 1978). The questions are of interest because to date they lack clear answers, yet clear answers are necessary if we hope to design effective instruction in reasoning.
A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover
NASA Technical Reports Server (NTRS)
Richards, Dominic; Lester, David
2010-01-01
Bluespec SystemVerilog (BSV) is a Hardware Description Language based on the guarded action model of concurrency. It has an elegant semantics, which makes it well suited for formal reasoning. To date, a number of BSV designs have been verified with hand proofs, but little work has been conducted on the application of automated reasoning. We present a prototype shallow embedding of BSV in the PVS theorem prover. Our embedding is compatible with the PVS model checker, which can automatically prove an important class of theorems, and can also be used in conjunction with the powerful proof strategies of PVS to verify a broader class of properties than can be achieved with model checking alone.
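To give a feel for the guarded-action model BSV is built on, the sketch below runs named rules, each a guard plus an atomic action, under a simple one-rule-per-step scheduler. This mimics the concurrency model only; the paper's actual embedding maps BSV into PVS theories, not into anything like this.

```python
# Sketch of the guarded-action model underlying BSV: named rules with a
# guard and an atomic action, fired one at a time by a scheduler.
regs = {"count": 0, "done": False}

rules = {
    "increment": (lambda s: not s["done"] and s["count"] < 3,
                  lambda s: s.update(count=s["count"] + 1)),
    "finish":    (lambda s: s["count"] >= 3 and not s["done"],
                  lambda s: s.update(done=True)),
}

def step(s):
    for name, (guard, action) in rules.items():
        if guard(s):
            action(s)          # atomic: one rule fires per step
            return name
    return None                # no rule enabled: system quiescent

while (fired := step(regs)):
    print(fired, regs)
```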
Architecture, Design, Implementation
2003-05-01
The terms architecture, design, and implementation are typically used informally in partitioning software specifications into three coarse strata of…we formalize the Intension and the Locality criteria, which imply that the distinction between architecture, design, and implementation is…
The Archival Photograph and Its Meaning: Formalisms for Modeling Images
ERIC Educational Resources Information Center
Benson, Allen C.
2009-01-01
This article explores ontological principles and their potential applications in the formal description of archival photographs. Current archival descriptive practices are reviewed and the larger question is addressed: do archivists who are engaged in describing photographs need a more formalized system of representation, or do existing encoding…
Thermodynamically Feasible Kinetic Models of Reaction Networks
Ederer, Michael; Gilles, Ernst Dieter
2007-01-01
The dynamics of biological reaction networks are strongly constrained by thermodynamics. A holistic understanding of their behavior and regulation requires mathematical models that observe these constraints. However, kinetic models may easily violate the constraints imposed by the principle of detailed balance, if no special care is taken. Detailed balance demands that in thermodynamic equilibrium all fluxes vanish. We introduce a thermodynamic-kinetic modeling (TKM) formalism that adapts the concepts of potentials and forces from irreversible thermodynamics to kinetic modeling. In the proposed formalism, the thermokinetic potential of a compound is proportional to its concentration. The proportionality factor is a compound-specific parameter called capacity. The thermokinetic force of a reaction is a function of the potentials. Every reaction has a resistance that is the ratio of thermokinetic force and reaction rate. For mass-action type kinetics, the resistances are constant. Since it relies on the thermodynamic concepts of potentials and forces, the TKM formalism structurally observes detailed balance for all values of capacities and resistances. Thus, it provides an easy way to formulate physically feasible kinetic models of biological reaction networks. The TKM formalism is useful for modeling large biological networks that are subject to many detailed balance relations. PMID:17208985
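The stated relations can be restated in symbols; the notation below is ours, chosen by analogy with electrical capacitance, and is not necessarily the paper's:

```latex
\begin{align*}
  \xi_i &= \frac{c_i}{C_i}
    && \text{thermokinetic potential of compound $i$, with capacity $C_i$}\\
  F_j &= F_j(\xi_1,\dots,\xi_n)
    && \text{thermokinetic force of reaction $j$}\\
  R_j &= \frac{F_j}{v_j}
    && \text{resistance: force-to-rate ratio, constant for mass action}
\end{align*}
```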
Maternal employment and childhood overweight in low- and middle-income countries.
Oddo, Vanessa M; Mueller, Noel T; Pollack, Keshia M; Surkan, Pamela J; Bleich, Sara N; Jones-Smith, Jessica C
2017-10-01
To investigate the association between maternal employment and childhood overweight in low- and middle-income countries (LMIC). Design/Setting: We utilized cross-sectional data from forty-five Demographic and Health Surveys from 2010 to 2016 (n = 268 763). Mothers were categorized as formally employed, informally employed or non-employed. We used country-specific logistic regression models to investigate the association between maternal employment and childhood overweight (BMI Z-score > 2) and assessed heterogeneity in the association by maternal education with the inclusion of an interaction term. We used meta-analysis to pool the associations across countries. Sensitivity analyses included modelling BMI Z-score and normal weight (weight-for-age Z-score ≥ -2 to < 2) as outcomes. Participants included children 0-5 years old and their mothers (aged 18-49 years). In most countries, neither formal nor informal employment was associated with childhood overweight. However, children of employed mothers, compared with children of non-employed mothers, had higher BMI Z-score and higher odds of normal weight. In countries where the association varied by education, children of formally employed women with high education, compared with children of non-employed women with high education, had higher odds of overweight (pooled OR=1·2; 95 % CI 1·0, 1·4). We find no clear association between employment and child overweight. However, maternal employment is associated with a modestly higher BMI Z-score and normal weight, suggesting that employment is currently associated with beneficial effects on children's weight status in most LMIC.
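The pooled estimate reported above is the product of standard inverse-variance meta-analysis over the country-specific models; the sketch below shows that mechanic with invented country-level odds ratios and confidence intervals, not the survey estimates.

```python
# Fixed-effect inverse-variance pooling of country-specific odds ratios,
# the standard mechanic behind a pooled OR. Example ORs/CIs are invented.
import math

def pooled_or(estimates):
    # estimates: list of (OR, lower95, upper95) per country
    num = den = 0.0
    for or_, lo, hi in estimates:
        log_or = math.log(or_)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from 95% CI
        w = 1 / se**2                                    # inverse-variance weight
        num += w * log_or
        den += w
    se_pooled = math.sqrt(1 / den)
    ci = (math.exp(num / den - 1.96 * se_pooled),
          math.exp(num / den + 1.96 * se_pooled))
    return math.exp(num / den), ci

print(pooled_or([(1.3, 1.0, 1.7), (1.1, 0.9, 1.4), (1.2, 0.9, 1.6)]))
```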
Do we face a fourth paradigm shift in medicine--algorithms in education?
Eitel, F; Kanz, K G; Hortig, E; Tesche, A
2000-08-01
Medicine has evolved toward rationalization since the Enlightenment, favouring quantitative measures. Now, a paradigm shift toward control through formalization can be observed in health care whose structures and processes are subjected to increasing standardization. However, educational reforms and curricula do not yet adequately respond to this shift. The aim of this article is to describe innovative approaches in medical education for adapting to these changes. The study design is a descriptive case report relying on a literature review and on a reform project's evaluation. Concept mapping is used to graphically represent relationships among concepts, i.e. defined terms from educational literature. Definitions of 'concept map', 'guideline' and 'algorithm' are presented. A prototypical algorithm for organizational decision making in the project's instructional design is shown. Evaluation results of intrinsic learning motivation are demonstrated: intrinsic learning motivation depends upon students' perception of their competence exhibiting path coefficients varying from 0.42 to 0.51. Perception of competence varies with the type of learning environment. An innovative educational format, called 'evidence-based learning (EBL)' is deduced from these findings and described here. Effects of formalization consist of structuring decision making about implementation of different learning environments or about minimizing variance in teaching or learning. Unintended effects of formalization such as implementation problems and bureaucracy are discussed. Formalized tools for designing medical education are available. Specific instructional designs influence students' learning motivation. Concept maps are suitable for controlling educational quality, thus enabling the paradigm shift in medical education.
On the Adequacy of Current Empirical Evaluations of Formal Models of Categorization
ERIC Educational Resources Information Center
Wills, Andy J.; Pothos, Emmanuel M.
2012-01-01
Categorization is one of the fundamental building blocks of cognition, and the study of categorization is notable for the extent to which formal modeling has been a central and influential component of research. However, the field has seen a proliferation of noncomplementary models with little consensus on the relative adequacy of these accounts.…
Reasoning with Conditionals: A Test of Formal Models of Four Theories
ERIC Educational Resources Information Center
Oberauer, Klaus
2006-01-01
The four dominant theories of reasoning from conditionals are translated into formal models: The theory of mental models (Johnson-Laird, P. N., & Byrne, R. M. J. (2002). Conditionals: a theory of meaning, pragmatics, and inference. "Psychological Review," 109, 646-678), the suppositional theory (Evans, J. S. B. T., & Over, D. E. (2004). "If."…
ERIC Educational Resources Information Center
Grabinska, Teresa; Zielinska, Dorota
2010-01-01
The authors examine language from the perspective of models of empirical sciences, a discipline that studies the relationship between reality, models, and formalisms. Such a perspective allows one to notice that linguistics approached within the classical framework shares a number of problems with other experimental sciences studied initially…
Managing Risk in Mobile Applications with Formal Security Policies
2013-04-01
Alternatively, Breaux and Powers (2009) found the Business Process Modeling Notation (BPMN), a declarative language for describing business processes, to be…the Business Process Execution Language (BPEL), preferred as the candidate formal semantics for BPMN, only works for limited classes of BPMN models
A Formal Valuation Framework for Emotions and Their Control.
Huys, Quentin J M; Renz, Daniel
2017-09-15
Computational psychiatry aims to apply mathematical and computational techniques to help improve psychiatric care. To achieve this, the phenomena under scrutiny should be within the scope of formal methods. As emotions play an important role across many psychiatric disorders, such computational methods must encompass emotions. Here, we consider formal valuation accounts of emotions. We focus on the fact that the flexibility of emotional responses and the nature of appraisals suggest the need for a model-based valuation framework for emotions. However, resource limitations make plain model-based valuation impossible and require metareasoning strategies to apportion cognitive resources adaptively. We argue that emotions may implement such metareasoning approximations by restricting the range of behaviors and states considered. We consider the processes that guide the deployment of the approximations, discerning between innate, model-free, heuristic, and model-based controllers. A formal valuation and metareasoning framework may thus provide a principled approach to examining emotions.
Different Strokes for Different Folks: Visual Presentation Design between Disciplines
Gomez, Steven R.; Jianu, Radu; Ziemkiewicz, Caroline; Guo, Hua; Laidlaw, David H.
2015-01-01
We present an ethnographic study of design differences in visual presentations between academic disciplines. Characterizing design conventions between users and data domains is an important step in developing hypotheses, tools, and design guidelines for information visualization. In this paper, disciplines are compared at a coarse scale between four groups of fields: social, natural, and formal sciences; and the humanities. Two commonplace presentation types were analyzed: electronic slideshows and whiteboard “chalk talks”. We found design differences in slideshows using two methods – coding and comparing manually-selected features, like charts and diagrams, and an image-based analysis using PCA called eigenslides. In whiteboard talks with controlled topics, we observed design behaviors, including using representations and formalisms from a participant’s own discipline, that suggest authors might benefit from novel assistive tools for designing presentations. Based on these findings, we discuss opportunities for visualization ethnography and human-centered authoring tools for visual information. PMID:26357149
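The "eigenslides" analysis is an eigenfaces-style PCA over flattened slide images; the sketch below shows that mechanic with random arrays standing in for screenshots, since the paper's exact pipeline is not reproduced here.

```python
# Eigenfaces-style PCA over flattened slide images ("eigenslides"):
# each screenshot becomes a pixel vector, and the leading principal
# components capture recurring layout. Random data stands in for slides.
import numpy as np

rng = np.random.default_rng(0)
slides = rng.random((40, 32 * 24))      # 40 slides, 32x24 grayscale pixels

centered = slides - slides.mean(axis=0)
# SVD of the centered data matrix gives the principal axes in Vt.
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

k = 5
eigenslides = Vt[:k]                    # top-k layout components
coords = centered @ eigenslides.T       # per-slide coordinates for comparison
explained = (S[:k] ** 2).sum() / (S ** 2).sum()
print(f"top {k} components explain {explained:.0%} of variance")
```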
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hills, Richard G.; Maniaci, David Charles; Naughton, Jonathan W.
2015-09-01
A Verification and Validation (V&V) framework is presented for the development and execution of coordinated modeling and experimental programs to assess the predictive capability of computational models of complex systems through focused, well-structured, and formal processes. The elements of the framework are based on established V&V methodology developed by various organizations including the Department of Energy, the National Aeronautics and Space Administration, the American Institute of Aeronautics and Astronautics, and the American Society of Mechanical Engineers. Four main topics are addressed: 1) program planning based on expert elicitation of the modeling physics requirements, 2) experimental design for model assessment, 3) uncertainty quantification for experimental observations and computational model simulations, and 4) assessment of the model predictive capability. The audience for this document includes program planners, modelers, experimentalists, V&V specialists, and customers of the modeling results.
The New Italian Seismic Hazard Model
NASA Astrophysics Data System (ADS)
Marzocchi, W.; Meletti, C.; Albarello, D.; D'Amico, V.; Luzi, L.; Martinelli, F.; Pace, B.; Pignone, M.; Rovida, A.; Visini, F.
2017-12-01
In 2015 the Seismic Hazard Center (Centro Pericolosità Sismica - CPS) of the National Institute of Geophysics and Volcanology was commissioned to coordinate the national scientific community in elaborating a new reference seismic hazard model, mainly aimed at updating the seismic building code. The CPS designed a roadmap for releasing, within three years, a significantly renewed PSHA model, with regard both to the updated input elements and to the strategies to be followed. The main requirements of the model were discussed in meetings with the earthquake engineering experts who will later participate in the revision of the building code. The activities were organized in six tasks: program coordination, input data, seismicity models, ground motion predictive equations (GMPEs), computation and rendering, and testing. The input data task has been selecting the most up-to-date information on seismicity (historical and instrumental), seismogenic faults, and deformation (from both seismicity and geodetic data). The seismicity models have been elaborated in terms of classic source areas, fault sources, and gridded seismicity, based on different approaches. The GMPEs task has selected the most recent models, accounting for their tectonic suitability and forecasting performance. The testing phase has been planned to design statistical procedures for testing, against the available data, the whole seismic hazard model as well as single components such as the seismicity models and the GMPEs. In this talk we show some preliminary results, summarize the overall strategy for building the new Italian PSHA model, and discuss in detail the important novelties that we put forward. Specifically, we adopt a new formal probabilistic framework to interpret the outcomes of the model and to test it meaningfully; this requires a proper definition and characterization of both aleatory variability and epistemic uncertainty, which we accomplish through an ensemble modeling strategy. We use a weighting scheme for the different components of the PSHA model built through three independent steps: a formal experts' elicitation, the outcomes of the testing phase, and the correlation between the outcomes. Finally, we explore through different techniques the influence of the declustering procedure on seismic hazard.
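The ensemble strategy amounts to a weighted combination of component hazard curves; the toy version below uses invented curves and weights, not the Italian model's actual components.

```python
# Toy version of an ensemble hazard model: component hazard curves
# (annual exceedance probability vs. PGA) combined with weights derived
# from elicitation and testing. All numbers are invented.
import numpy as np

pga = np.array([0.1, 0.2, 0.3, 0.4])          # peak ground acceleration, g
curves = np.array([
    [0.050, 0.020, 0.008, 0.003],             # seismicity model A
    [0.060, 0.025, 0.010, 0.004],             # seismicity model B
    [0.040, 0.015, 0.006, 0.002],             # fault-based model
])
weights = np.array([0.4, 0.35, 0.25])          # elicitation + testing scores

ensemble = weights @ curves                    # weighted mean exceedance
print(dict(zip(pga.tolist(), ensemble.round(4).tolist())))
```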
Some Implications of a Behavioral Analysis of Verbal Behavior for Logic and Mathematics
2013-01-01
The evident power and utility of the formal models of logic and mathematics pose a puzzle: Although such models are instances of verbal behavior, they are also essentialistic. But behavioral terms, and indeed all products of selection contingencies, are intrinsically variable and in this respect appear to be incommensurate with essentialism. A distinctive feature of verbal contingencies resolves this puzzle: The control of behavior by the nonverbal environment is often mediated by the verbal behavior of others, and behavior under control of verbal stimuli is blind to the intrinsic variability of the stimulating environment. Thus, words and sentences serve as filters of variability and thereby facilitate essentialistic model building and the formal structures of logic, mathematics, and science. Autoclitic frames, verbal chains interrupted by interchangeable variable terms, are ubiquitous in verbal behavior. Variable terms can be substituted in such frames almost without limit, a feature fundamental to formal models. Consequently, our fluency with autoclitic frames fosters generalization to formal models, which in turn permit deduction and other kinds of logical and mathematical inference. PMID:28018038
Comprehensive Aspectual UML Approach to Support AspectJ
Magableh, Aws; Shukur, Zarina; Mohd. Ali, Noorazean
2014-01-01
Unified Modeling Language (UML) is the most popular and widely used Object-Oriented modelling language in the IT industry. This study investigates the extent to which UML can be expanded to model crosscutting concerns (Aspects) in support of AspectJ. Through a comprehensive literature review, we identify and extensively examine all the available Aspect-Oriented UML modelling approaches, and find that the existing approaches neither provide a framework for a comprehensive Aspectual UML modelling approach nor offer adequate Aspect-Oriented tool support. This study also proposes a set of Aspectual UML semantic rules and attempts to generate AspectJ pseudocode from UML diagrams. The proposed Aspectual UML modelling approach is formally evaluated using a focus group to test six hypotheses regarding performance; a "good design" criteria-based evaluation to assess the quality of the design; and an AspectJ-based evaluation as a reference measurement-based evaluation. The results of the focus group evaluation confirm all the hypotheses put forward regarding the proposed approach. The proposed approach provides a comprehensive set of Aspectual UML structural and behavioral diagrams, designed and implemented based on a comprehensive and detailed set of AspectJ programming constructs. PMID:25136656
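For readers who have not seen aspect weaving, the following Python decorator is only a loose analogue of AspectJ around-advice applied at a join point; it does not reproduce the AspectJ code the approach generates from Aspectual UML.

```python
# Loose Python analogue of an AspectJ "around" advice weaving a logging
# concern into a business method.
import functools

def logging_aspect(fn):
    @functools.wraps(fn)
    def advice(*args, **kwargs):          # around advice
        print(f"before {fn.__name__}")
        result = fn(*args, **kwargs)      # proceed()
        print(f"after {fn.__name__} -> {result}")
        return result
    return advice

@logging_aspect                           # "weaving" at the join point
def transfer(amount):
    return f"transferred {amount}"

transfer(100)
```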
A Rule-Based Modeling for the Description of Flexible and Self-healing Business Processes
NASA Astrophysics Data System (ADS)
Boukhebouze, Mohamed; Amghar, Youssef; Benharkat, Aïcha-Nabila; Maamar, Zakaria
In this paper we discuss the importance of ensuring that business processes are robust and agile at the same time. To this end, we consider reviewing the way business processes are managed. For instance, we consider offering a flexible way to model processes so that changes in regulations are handled through self-healing mechanisms; such changes may raise exceptions at run-time if not properly reflected in the processes. We therefore propose a new rule-based model that adopts ECA rules and is built upon formal tools. The business logic of a process can be summarized as a set of rules that implement an organization's policies. Each business rule is formalized using our ECAPE formalism (Event-Condition-Action-Post condition-post Event). This formalism allows translating a process into a graph of rules that is analyzed in terms of reliability and flexibility.
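A minimal engine in the spirit of the ECAPE rule structure might look as follows; the event names and the self-healing chain are invented for illustration, not taken from the paper.

```python
# Minimal engine in the spirit of ECAPE (Event-Condition-Action-
# Post condition-post Event): an event triggers a rule whose action runs
# only if the condition holds; a post-condition is checked afterwards and
# a post-event is emitted, allowing self-healing chains.
rules = []

def rule(event):
    def register(fn):
        rules.append((event, fn))
        return fn
    return register

def emit(event, ctx):
    for ev, fn in rules:
        if ev == event:
            fn(ctx)

@rule("order_received")
def check_stock(ctx):
    if ctx["stock"] >= ctx["qty"]:            # condition
        ctx["stock"] -= ctx["qty"]            # action
        assert ctx["stock"] >= 0              # post-condition
        emit("order_confirmed", ctx)          # post-event
    else:
        emit("order_rejected", ctx)           # exception path: self-healing

@rule("order_rejected")
def reorder(ctx):
    print("restocking triggered for", ctx["qty"])

@rule("order_confirmed")
def notify(ctx):
    print("confirmed; stock now", ctx["stock"])

emit("order_received", {"stock": 5, "qty": 2})
```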
Development of structured ICD-10 and its application to computer-assisted ICD coding.
Imai, Takeshi; Kajino, Masayuki; Sato, Megumi; Ohe, Kazuhiko
2010-01-01
This paper presents: (1) a framework for the formal representation of ICD10, which functions as a bridge between ontological information and natural language expressions; and (2) a methodology for using formally described ICD10 in computer-assisted ICD coding. First, we analyzed and structured the meanings of categories in 15 chapters of ICD10. We then expanded the structured ICD10 (S-ICD10) by adding subordinate concepts and labels derived from Japanese Standard Disease Names. The information model for the formal representation was refined repeatedly; the resulting model includes 74 types of semantic links. We also developed an ICD coding module based on S-ICD10 and a 'Coding Principle,' which achieved high accuracy (>70%) for four chapters. These results not only demonstrate the basic feasibility of our coding framework but might also inform the development of the information model for the formal description framework in the ICD11 revision.
Breastfeeding and maternal employment: results from three national nutritional surveys in Mexico.
Rivera-Pasquel, Marta; Escobar-Zaragoza, Leticia; González de Cosío, Teresita
2015-05-01
To evaluate the association between maternal employment and breastfeeding (both duration and status) in Mexican mothers using data from three national health and nutrition surveys conducted in 1999, 2006 and 2012. We analyzed data from the 1999 National Nutrition Survey and the 2006 and 2012 National Health and Nutrition Surveys (NNS-1999, NHNS-2006 and NHNS-2012) on 5,385 mothers aged 12-49 years with infants under 1 year. Multivariate logistic regression models were used to analyze the association between breastfeeding and maternal employment, adjusted for maternal and infant socio-demographic covariates. Maternal formal employment was negatively associated with breastfeeding in Mexican mothers with infants under 1 year. Formally employed mothers were 20% less likely to breastfeed than non-formally employed mothers and 27% less likely than unemployed mothers. The difference in median breastfeeding duration between formally employed and unemployed mothers was 5.7 months for NNS-1999, 4.7 months for NHNS-2006 and 6.7 months for NHNS-2012 (p < 0.05). In NHNS-2006 and NHNS-2012, health care access was associated with longer breastfeeding duration. Maternal employment has been negatively associated with breastfeeding in Mexican mothers of infants under 1 year for at least the last 15 years. For Mexicans involved in policy design, implementation or modification, these data offer robust evidence of this negative association and can be used confidently as a basis for conceiving more just legislation for working lactating women.
Learning Conditions for Non-Formal and Informal Workplace Learning
ERIC Educational Resources Information Center
Kyndt, Eva; Dochy, Filip; Nijs, Hanne
2009-01-01
Purpose: The purpose of this research paper is to investigate the presence of learning conditions for non-formal and informal workplace learning in relation to the characteristics of the employee and the organisation he or she works for. Design/methodology/approach: The questionnaire developed by Clauwaert and Van Bree on learning conditions was…
Situationally Embodied Curriculum: Relating Formalisms and Contexts
ERIC Educational Resources Information Center
Barab, Sasha; Zuiker, Steve; Warren, Scott; Hickey, Dan; Ingram-Goble, Adam; Kwon, Eun-Ju; Kouper, Inna; Herring, Susan C.
2007-01-01
This study describes an example of design-based research in which we make theoretical improvements in our understanding, in part based on empirical work, and use these to revise our curriculum and, simultaneously, our evolving theory of the relations between contexts and disciplinary formalisms. Prior to this study, we completed a first cycle of…
Dressed to Present: Ratings of Classroom Presentations Vary with Attire
ERIC Educational Resources Information Center
Gurung, Regan A. R.; Kempen, Laura; Klemm, Kayla; Senn, Rebecca; Wysocki, Rosie
2014-01-01
This study investigates the effects of formality of dress on ratings of classroom presentations. Participants (N = 65, 66% women) from a Midwestern university in the United States rated three female students giving a presentation designed for a health psychology class in one of four outfits: casual, party, business casual, or business formal.…
An Ill-Structured PBL-Based Microprocessor Course without Formal Laboratory
ERIC Educational Resources Information Center
Kim, Jungkuk
2012-01-01
This paper introduces a problem-based learning (PBL) microprocessor application course designed according to the following strategies: 1) hands-on training without having a formal laboratory, and 2) intense student-centered cooperative learning through an ill-structured problem. PBL was adopted as the core educational technique of the course to…
Education in Emergencies: The Gender Implications. Advocacy Brief
ERIC Educational Resources Information Center
Kirk, Jackie
2006-01-01
"Education in emergencies" refers to a broad range of education activities, both formal and non-formal, which are life saving and sustaining. They are critical for children, youth and their families in times of crisis. Programmes for emergency education are often designed according to a specific environmental context and sometimes as temporary…
Formal Method of Description Supporting Portfolio Assessment
ERIC Educational Resources Information Center
Morimoto, Yasuhiko; Ueno, Maomi; Kikukawa, Isao; Yokoyama, Setsuo; Miyadera, Youzou
2006-01-01
Teachers need to assess learner portfolios in the field of education. However, they need support in the process of designing and practicing what kind of portfolios are to be assessed. To solve the problem, a formal method of describing the relations between the lesson forms and portfolios that need to be collected and the relations between…
Recognising Women's Skill. EAE647 Non-Formal Learning.
ERIC Educational Resources Information Center
Cox, Eva; Leonard, Helen
The material in this monograph is part of the study materials for the one-semester distance education unit, Non-Formal Learning, in the Open Campus Program at Deakin University (Australia). It is designed to raise issues relating to skill definition. "Choosing a Worker or How Good Are Your Job Descriptions?" explores why interpersonal or…
Teaching about Hazardous and Toxic Materials. Teaching Activities in Environmental Education Series.
ERIC Educational Resources Information Center
Disinger, John F.; Lisowski, Marylin
Designed to assist practitioners of both formal and non-formal settings, this 18th volume of the ERIC Clearinghouse for Science, Mathematics, and Environmental Education's Teaching Activities in Environmental Education series specifically focuses on the theme of hazardous and toxic materials. Initially, basic environmental concepts that deal with…
Using Informal Inferential Reasoning to Develop Formal Concepts: Analyzing an Activity
ERIC Educational Resources Information Center
Weinberg, Aaron; Wiesner, Emilie; Pfaff, Thomas J.
2010-01-01
Inferential reasoning is a central component of statistics. Researchers have suggested that students should develop an informal understanding of the ideas that underlie inference before learning the concepts formally. This paper presents a hands-on activity that is designed to help students in an introductory statistics course draw informal…
Peer Review of a Formal Verification/Design Proof Methodology
NASA Technical Reports Server (NTRS)
1983-01-01
The role of formal verification techniques in system validation was examined. The value and the state of the art of performance proving for fault-tolerant computers were assessed. The investigation, development, and evaluation of performance proving tools were reviewed, and the technical issues related to proof methodologies are examined and summarized.