Experiences Using Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1996-01-01
This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost-effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
Experiences Using Lightweight Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1997-01-01
This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, formal methods enhanced the existing verification and validation processes, by testing key properties of the evolving requirements, and helping to identify weaknesses. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
NASA Technical Reports Server (NTRS)
1995-01-01
This guidebook, the second of a two-volume series, is intended to facilitate the transfer of formal methods to the avionics and aerospace community. The first volume concentrates on administrative and planning issues [NASA-95a], and the second volume focuses on the technical issues involved in applying formal methods to avionics and aerospace software systems. Hereafter, the term "guidebook" refers exclusively to the second volume of the series. The title of this second volume, A Practitioner's Companion, conveys its intent. The guidebook is written primarily for the nonexpert and requires little or no prior experience with formal methods techniques and tools. However, it does attempt to distill some of the more subtle ingredients in the productive application of formal methods. To the extent that it succeeds, those conversant with formal methods will also find the guidebook useful. The discussion is illustrated through the development of a realistic example, relevant fragments of which appear in each chapter. The guidebook focuses primarily on the use of formal methods for analysis of requirements and high-level design, the stages at which formal methods have been most productively applied. Although much of the discussion applies to low-level design and implementation, the guidebook does not discuss issues involved in the later life cycle application of formal methods.
Mending the Gap, An Effort to Aid the Transfer of Formal Methods Technology
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly
2009-01-01
Formal methods can be applied to many of the development and verification activities required for civil avionics software. RTCA/DO-178B, Software Considerations in Airborne Systems and Equipment Certification, gives a brief description of using formal methods as an alternate method of compliance with the objectives of that standard. Despite this, the avionics industry at large has been hesitant to adopt formal methods, and few developers have actually used formal methods for certification credit. Why is this so, given the volume of evidence of the benefits of formal methods? This presentation will explore some of the challenges to using formal methods in a certification context and describe the effort by the Formal Methods Subgroup of RTCA SC-205/EUROCAE WG-71 to develop guidance to make the use of formal methods a recognized approach.
Formalizing Space Shuttle Software Requirements
NASA Technical Reports Server (NTRS)
Crow, Judith; DiVito, Ben L.
1996-01-01
This paper describes two case studies in which requirements for new flight-software subsystems on NASA's Space Shuttle were analyzed, one using standard formal specification techniques, the other using state exploration. These applications serve to illustrate three main theses: (1) formal methods can complement conventional requirements analysis processes effectively, (2) formal methods confer benefits regardless of how extensively they are adopted and applied, and (3) formal methods are most effective when they are judiciously tailored to the application.
Formal Methods for Life-Critical Software
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Johnson, Sally C.
1993-01-01
The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.
The Second NASA Formal Methods Workshop 1992
NASA Technical Reports Server (NTRS)
Johnson, Sally C. (Compiler); Holloway, C. Michael (Compiler); Butler, Ricky W. (Compiler)
1992-01-01
The primary goal of the workshop was to bring together formal methods researchers and aerospace industry engineers to investigate new opportunities for applying formal methods to aerospace problems. The first part of the workshop was tutorial in nature. The second part of the workshop explored the potential of formal methods to address current aerospace design and verification problems. The third part of the workshop involved on-line demonstrations of state-of-the-art formal verification tools. Also, a detailed survey was filled in by the attendees; the results of the survey are compiled.
Third NASA Langley Formal Methods Workshop
NASA Technical Reports Server (NTRS)
Holloway, C. Michael (Compiler)
1995-01-01
This publication constitutes the proceedings of NASA Langley Research Center's third workshop on the application of formal methods to the design and verification of life-critical systems. This workshop brought together formal methods researchers, industry engineers, and academicians to discuss the potential of NASA-sponsored formal methods and to investigate new opportunities for applying these methods to industry problems. Contained herein are copies of the material presented at the workshop, summaries of many of the presentations, a complete list of attendees, and a detailed summary of the Langley formal methods program. Much of this material is available electronically through the World-Wide Web via the following URL.
NASA Formal Methods Workshop, 1990
NASA Technical Reports Server (NTRS)
Butler, Ricky W. (Compiler)
1990-01-01
The workshop brought together researchers involved in the NASA formal methods research effort for detailed technical interchange and provided a mechanism for interaction with representatives from the FAA and the aerospace industry. The workshop also included speakers from industry to debrief the formal methods researchers on the current state of practice in flight critical system design, verification, and certification. The goals were: to define and characterize the verification problem for ultra-reliable life-critical flight control systems and the current state of practice in industry today; to determine the proper role of formal methods in addressing these problems; and to assess the state of the art and recent progress toward applying formal methods to this area.
Formal methods and digital systems validation for airborne systems
NASA Technical Reports Server (NTRS)
Rushby, John
1993-01-01
This report has been prepared to supplement a forthcoming chapter on formal methods in the FAA Digital Systems Validation Handbook. Its purpose is as follows: to outline the technical basis for formal methods in computer science; to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used on board aircraft; and to suggest factors for consideration when formal methods are offered in support of certification. These latter factors assume the context for software development and assurance described in RTCA document DO-178B, 'Software Considerations in Airborne Systems and Equipment Certification,' Dec. 1992.
NASA Langley Research and Technology-Transfer Program in Formal Methods
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Caldwell, James L.; Carreno, Victor A.; Holloway, C. Michael; Miner, Paul S.; DiVito, Ben L.
1995-01-01
This paper presents an overview of NASA Langley research program in formal methods. The major goals of this work are to make formal methods practical for use on life critical systems, and to orchestrate the transfer of this technology to U.S. industry through use of carefully designed demonstration projects. Several direct technology transfer efforts have been initiated that apply formal methods to critical subsystems of real aerospace computer systems. The research team consists of five NASA civil servants and contractors from Odyssey Research Associates, SRI International, and VIGYAN Inc.
Verifying Hybrid Systems Modeled as Timed Automata: A Case Study
1997-03-01
Introduction Researchers have proposed many innovative formal methods for developing real-time systems [9]. Such methods can give system developers and...customers greater confidence that real-time systems satisfy their requirements, especially their critical requirements. However, applying formal methods...specifying and reasoning about real-time systems that is designed to address these challenging problems. Our approach is to build formal reasoning tools
Formal Assurance Certifiable Tooling Strategy Final Report
NASA Technical Reports Server (NTRS)
Bush, Eric; Oglesby, David; Bhatt, Devesh; Murugesan, Anitha; Engstrom, Eric; Mueller, Joe; Pelican, Michael
2017-01-01
This is the Final Report of a research project to investigate issues and provide guidance for the qualification of formal methods tools under the DO-330 qualification process. It consisted of three major subtasks spread over two years: 1) an assessment of theoretical soundness issues that may affect qualification for three categories of formal methods tools, 2) a case study simulating the DO-330 qualification of two actual tool sets, and 3) an investigation of risk mitigation strategies that might be applied to chains of such formal methods tools in order to increase confidence in their certification of airborne software.
Properties of a Formal Method for Prediction of Emergent Behaviors in Swarm-based Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James
2004-01-01
Autonomous intelligent swarms of satellites are being proposed for NASA missions that have complex behaviors and interactions. The emergent properties of swarms make these missions powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. This paper gives the results of research into formal methods techniques for verification and validation of NASA swarm-based missions. Multiple formal methods were evaluated to determine their effectiveness in modeling and assuring the behavior of swarms of spacecraft. The NASA ANTS mission was used as an example of swarm intelligence to which to apply the formal methods. The paper presents partial specifications of the ANTS mission in four selected methods, evaluates those methods, and identifies the properties a formal method needs for effective specification and prediction of emergent behavior in swarm-based systems.
A brief overview of NASA Langley's research program in formal methods
NASA Technical Reports Server (NTRS)
1992-01-01
An overview of NASA Langley's research program in formal methods is presented. The major goal of this work is to bring formal methods technology to a sufficiently mature level for use by the United States aerospace industry. Towards this goal, work is underway to design and formally verify a fault-tolerant computing platform suitable for advanced flight control applications. Also, several direct technology transfer efforts have been initiated that apply formal methods to critical subsystems of real aerospace computer systems. The research team consists of six NASA civil servants and contractors from Boeing Military Aircraft Company, Computational Logic Inc., Odyssey Research Associates, SRI International, University of California at Davis, and Vigyan Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Messud, J.; Dinh, P. M.; Suraud, Eric
2009-10-15
We propose a simplification of the time-dependent self-interaction correction (TD-SIC) method using two sets of orbitals, applying the optimized effective potential (OEP) method. The resulting scheme is called time-dependent 'generalized SIC-OEP'. A straightforward approximation, using the spatial localization of one set of orbitals, leads to the 'generalized SIC-Slater' formalism. We show that it represents a great improvement compared to the traditional SIC-Slater and Krieger-Li-Iafrate formalisms.
NASA Astrophysics Data System (ADS)
Messud, J.; Dinh, P. M.; Reinhard, P.-G.; Suraud, Eric
2009-10-01
We propose a simplification of the time-dependent self-interaction correction (TD-SIC) method using two sets of orbitals, applying the optimized effective potential (OEP) method. The resulting scheme is called time-dependent “generalized SIC-OEP.” A straightforward approximation, using the spatial localization of one set of orbitals, leads to the “generalized SIC-Slater” formalism. We show that it represents a great improvement compared to the traditional SIC-Slater and Krieger-Li-Iafrate formalisms.
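For orientation, the self-interaction correction at issue goes back to the Perdew-Zunger form, which removes each occupied orbital's spurious self-Hartree and self-exchange-correlation energy. A standard expression, given here as background and not quoted from the paper:

```latex
E_{\mathrm{xc}}^{\mathrm{SIC}} \;=\; E_{\mathrm{xc}}[n_{\uparrow},n_{\downarrow}]
\;-\; \sum_{i}^{\mathrm{occ}} \Big( E_{\mathrm{H}}[n_i] + E_{\mathrm{xc}}[n_i,0] \Big),
\qquad n_i = |\varphi_i|^2 .
```

The correction makes the potential orbital-dependent; OEP recasts it as a single common local potential, and the two orbital sets mentioned in the abstract serve to keep that recasting tractable in the time-dependent case.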
Formal Solutions for Polarized Radiative Transfer. II. High-order Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Janett, Gioele; Steiner, Oskar; Belluzzi, Luca, E-mail: gioele.janett@irsol.ch
When integrating the radiative transfer equation for polarized light, the necessity of high-order numerical methods is well known. In fact, well-performing high-order formal solvers enable higher accuracy and the use of coarser spatial grids. Aiming to provide a clear comparison between formal solvers, this work presents different high-order numerical schemes and applies the systematic analysis proposed by Janett et al., emphasizing their advantages and drawbacks in terms of order of accuracy, stability, and computational cost.
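As background, the equation these formal solvers integrate is the polarized radiative transfer equation for the Stokes vector, written here in a common notation that is assumed rather than quoted from the paper:

```latex
\frac{d\mathbf{I}}{ds} \;=\; -\,\mathbf{K}\,\mathbf{I} \;+\; \boldsymbol{\varepsilon},
\qquad \mathbf{I} = (I,\,Q,\,U,\,V)^{\mathsf{T}},
```

where K is the 4x4 propagation matrix and ε the emission vector. A formal solver advances I along the ray on a discrete grid, and its order of accuracy is set by how K and ε are approximated between grid points.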
Applying formal methods and object-oriented analysis to existing flight software
NASA Technical Reports Server (NTRS)
Cheng, Betty H. C.; Auernheimer, Brent
1993-01-01
Correctness is paramount for safety-critical software control systems. Critical software failures in medical radiation treatment, communications, and defense are familiar to the public. The significant quantity of software malfunctions regularly reported to the software engineering community, the laws concerning liability, and a recent NRC Aeronautics and Space Engineering Board report additionally motivate the use of error-reducing and defect detection software development techniques. The benefits of formal methods in requirements driven software development ('forward engineering') are well documented. One advantage of rigorously engineering software is that formal notations are precise, verifiable, and facilitate automated processing. This paper describes the application of formal methods to reverse engineering, where formal specifications are developed for a portion of the shuttle on-orbit digital autopilot (DAP). Three objectives of the project were to: demonstrate the use of formal methods on a shuttle application, facilitate the incorporation and validation of new requirements for the system, and verify the safety-critical properties to be exhibited by the software.
Experiences applying Formal Approaches in the Development of Swarm-Based Space Exploration Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher A.; Hinchey, Michael G.; Truszkowski, Walter F.; Rash, James L.
2006-01-01
NASA is researching advanced technologies for future exploration missions using intelligent swarms of robotic vehicles. One of these missions is the Autonomous Nano Technology Swarm (ANTS) mission that will explore the asteroid belt using 1,000 cooperative autonomous spacecraft. The emergent properties of intelligent swarms make them a potentially powerful concept, but at the same time more difficult to design and to ensure that the proper behaviors will emerge. NASA is investigating formal methods and techniques for verification of such missions. The advantage of using formal methods is the ability to mathematically verify the behavior of a swarm, emergent or otherwise. Using the ANTS mission as a case study, we have evaluated multiple formal methods to determine their effectiveness in modeling and ensuring desired swarm behavior. This paper discusses the results of this evaluation and proposes an integrated formal method for ensuring correct behavior of future NASA intelligent swarms.
Formal methods and their role in digital systems validation for airborne systems
NASA Technical Reports Server (NTRS)
Rushby, John
1995-01-01
This report is based on one prepared as a chapter for the FAA Digital Systems Validation Handbook (a guide to assist FAA certification specialists with advanced technology issues). Its purpose is to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used in critical applications; and to suggest factors for consideration when formal methods are offered in support of certification. The presentation concentrates on the rationale for formal methods and on their contribution to assurance for critical applications within a context such as that provided by DO-178B (the guidelines for software used on board civil aircraft); it is intended as an introduction for those to whom these topics are new.
New method of contour image processing based on the formalism of spiral light beams
NASA Astrophysics Data System (ADS)
Volostnikov, Vladimir G.; Kishkin, S. A.; Kotova, S. P.
2013-07-01
The possibility of applying the mathematical formalism of spiral light beams to the problems of contour image recognition is theoretically studied. The advantages and disadvantages of the proposed approach are evaluated; the results of numerical modelling are presented.
NASA Astrophysics Data System (ADS)
Kalthoff, Mona; Keim, Frederik; Krull, Holger; Uhrig, Götz S.
2017-05-01
The density matrix formalism and the equation of motion approach are two semi-analytical methods that can be used to compute the non-equilibrium dynamics of correlated systems. While for a bilinear Hamiltonian both formalisms yield the exact result, for any non-bilinear Hamiltonian a truncation is necessary. Due to the fact that the commonly used truncation schemes differ for these two methods, the accuracy of the obtained results depends significantly on the chosen approach. In this paper, both formalisms are applied to the quantum Rabi model. This allows us to compare the approximate results and the exact dynamics of the system and enables us to discuss the accuracy of the approximations as well as the advantages and the disadvantages of both methods. It is shown to which extent the results fulfill physical requirements for the observables and which properties of the methods lead to unphysical results.
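For reference, the quantum Rabi model used as the test bed couples a single bosonic mode to a two-level system; in one common convention (symbols assumed here, not taken from the paper):

```latex
H \;=\; \hbar\omega\, a^{\dagger}a \;+\; \frac{\hbar\Omega}{2}\,\sigma_{z}
\;+\; \hbar g\,\sigma_{x}\left(a + a^{\dagger}\right).
```

The product coupling σ_x(a + a†) is the non-bilinear term: it prevents the hierarchy of equations from closing, which is exactly why both formalisms must truncate and can therefore disagree.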
Systematic Model-in-the-Loop Test of Embedded Control Systems
NASA Astrophysics Data System (ADS)
Krupp, Alexander; Müller, Wolfgang
Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural language formalization, there does not exist a standardized and accepted means for formal property definition as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.
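As a rough illustration of the classification-tree idea (not the authors' tool): each input is partitioned into disjoint equivalence classes, and abstract test cases are combinations of one class per classification. A minimal sketch, with all names invented for illustration:

```python
from itertools import product

# Hypothetical classifications for an embedded controller's input space:
# each classification partitions one input into disjoint equivalence classes.
classifications = {
    "speed": ["zero", "low", "high", "over_limit"],
    "brake_pedal": ["released", "partial", "full"],
    "sensor_status": ["ok", "degraded", "failed"],
}

# Naive combinatorial selection: one class per classification. Real
# classification-tree tools prune this space with dependency rules and
# coverage criteria instead of enumerating the full cartesian product.
test_cases = [dict(zip(classifications, combo))
              for combo in product(*classifications.values())]

print(len(test_cases))   # 4 * 3 * 3 = 36 abstract test cases
print(test_cases[0])     # {'speed': 'zero', 'brake_pedal': 'released', ...}
```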
Model Checking JAVA Programs Using Java Pathfinder
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Pressburger, Thomas
2000-01-01
This paper describes a translator called JAVA PATHFINDER from JAVA to PROMELA, the "programming language" of the SPIN model checker. The purpose is to establish a framework for verification and debugging of JAVA programs based on model checking. This work should be seen as part of a broader attempt to make formal methods applicable "in the loop" of programming within NASA's areas such as space, aviation, and robotics. Our main goal is to create automated formal methods such that programmers themselves can apply these in their daily work (in the loop) without the need for specialists to manually reformulate a program into a different notation in order to analyze the program. This work is a continuation of an effort to formally verify, using SPIN, a multi-threaded operating system programmed in Lisp for the Deep-Space 1 spacecraft, and of previous work in applying existing model checkers and theorem provers to real applications.
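To make the model-checking idea concrete: SPIN-style tools exhaustively explore the interleaved state space of a model and check a property in every reachable state. A toy sketch of explicit-state reachability checking follows; it is illustrative only, not JPF's or SPIN's actual algorithm or API:

```python
from collections import deque

# Toy model: two threads each perform read-increment-write on a shared
# counter. A state is (pc0, pc1, tmp0, tmp1, counter); pc values:
# 0 = about to read, 1 = about to write, 2 = done.

def successors(state):
    pcs, tmps, counter = list(state[:2]), list(state[2:4]), state[4]
    for t in (0, 1):
        if pcs[t] == 0:                      # read the shared counter
            p, m = pcs[:], tmps[:]
            p[t], m[t] = 1, counter
            yield (*p, *m, counter)
        elif pcs[t] == 1:                    # write back tmp + 1
            p = pcs[:]
            p[t] = 2
            yield (*p, *tmps, tmps[t] + 1)

def check(init, invariant):
    """Exhaustive BFS over all interleavings, SPIN-style."""
    seen, frontier = {init}, deque([init])
    while frontier:
        s = frontier.popleft()
        if not invariant(s):
            return s                         # counterexample state
        for nxt in successors(s):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return None

# Invariant: once both threads are done, the counter must equal 2.
bad = check((0, 0, 0, 0, 0),
            lambda s: not (s[0] == 2 and s[1] == 2) or s[4] == 2)
print(bad)   # finds the classic lost-update state, e.g. (2, 2, 0, 0, 1)
```

A translator like the one described maps program constructs to such a transition system so the exhaustive search can be done on the model rather than the source program.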
A Formal Construction of Term Classes. Technical Report No. TR73-18.
ERIC Educational Resources Information Center
Yu, Clement T.
The computational complexity of a formal process for the construction of term classes for information retrieval is examined. While the process is proven to be difficult computationally, heuristic methods are applied. Experimental results are obtained to illustrate the maximum possible improvement in system performance of retrieval using the formal…
NASA Astrophysics Data System (ADS)
Geslin, Pierre-Antoine; Gatti, Riccardo; Devincre, Benoit; Rodney, David
2017-11-01
We propose a framework to study thermally-activated processes in dislocation glide. This approach is based on an implementation of the nudged elastic band method in a nodal mesoscale dislocation dynamics formalism. Special care is paid to develop a variational formulation to ensure convergence to well-defined minimum energy paths. We also propose a methodology to rigorously parametrize the model on atomistic data, including elastic, core and stacking fault contributions. To assess the validity of the model, we investigate the homogeneous nucleation of partial dislocation loops in aluminum, recovering the activation energies and loop shapes obtained with atomistic calculations and extending these calculations to lower applied stresses. The present method is also applied to heterogeneous nucleation on spherical inclusions.
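For orientation, the generic nudged elastic band force on image i of the chain combines the projected true force with a tangential spring force (textbook NEB, not the paper's specific variational formulation):

```latex
\mathbf{F}_{i} \;=\; -\,\nabla E(\mathbf{x}_{i})\big|_{\perp}
\;+\; k\left(\,\lvert \mathbf{x}_{i+1}-\mathbf{x}_{i}\rvert
- \lvert \mathbf{x}_{i}-\mathbf{x}_{i-1}\rvert\,\right)\hat{\boldsymbol{\tau}}_{i},
```

where the true gradient acts only perpendicular to the local tangent τ̂_i and the springs act only along it; at convergence the chain traces the minimum energy path, and the highest-energy image gives the activation energy.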
Proceedings of the First NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)
2009-01-01
Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
The manual application of formal methods in system specification has produced successes, but in the end, despite any claims and assertions by practitioners, there is no provable relationship between a manually derived system specification or formal model and the customer's original requirements. Complex parallel and distributed systems present the worst-case implications of today's dearth of viable approaches for achieving system dependability. No avenue other than formal methods constitutes a serious contender for resolving the problem, and so recognition of requirements-based programming has come at a critical juncture. We describe a new, NASA-developed automated requirements-based programming method that can be applied to certain classes of systems, including complex parallel and distributed systems, to achieve a high degree of dependability.
Thin-film limit formalism applied to surface defect absorption.
Holovský, Jakub; Ballif, Christophe
2014-12-15
The thin-film limit is derived by a nonconventional approach, and equations for transmittance, reflectance and absorptance are presented in a highly versatile and accurate form. In the thin-film limit the optical properties do not depend on the absorption coefficient, thickness and refractive index individually, but only on their product. We show that this formalism is applicable to the problem of an ultrathin defective layer, e.g., on top of a layer of amorphous silicon. We develop a new method of direct evaluation of the surface defective layer and the bulk defects. Applying this method to amorphous silicon on glass, we show that the surface defective layer differs from bulk amorphous silicon in terms of light soaking.
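As a concrete instance of the product dependence, consider a generic thin-film-limit result for the absorptance of a film of index n₁, absorption coefficient α and thickness d, between an ambient of index n₀ and a substrate of index n₂ at normal incidence. This is a textbook-style sketch under those assumptions, not the paper's exact equations:

```latex
A \;\approx\; \frac{4\,n_0}{(n_0+n_2)^2}\; n_1\,\alpha\, d ,
\qquad \alpha d \ll 1 ,
```

so only the product n₁αd enters, which is why an ultrathin defective surface layer can be characterized without resolving its thickness and absorption separately.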
A Tool for Requirements-Based Programming
NASA Technical Reports Server (NTRS)
Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis; Erickson, John
2005-01-01
Absent a general method for mathematically sound, automated transformation of customer requirements into a formal model of the desired system, developers must resort to either manual application of formal methods or to system testing (either manual or automated). While formal methods have afforded numerous successes, they present serious issues, e.g., costs to gear up to apply them (time, expensive staff), and scalability and reproducibility when standards in the field are not settled. The testing path cannot be walked to the ultimate goal, because exhaustive testing is infeasible for all but trivial systems. So system verification remains problematic. System or requirements validation is similarly problematic. The alternatives available today depend on either having a formal model or pursuing enough testing to enable the customer to be certain that system behavior meets requirements. The testing alternative for non-trivial systems always leaves some system behaviors unconfirmed and therefore is not the answer. To ensure that a formal model is equivalent to the customer's requirements necessitates that the customer somehow fully understands the formal model, which is not realistic. The predominant view that provably correct system development depends on having a formal model of the system leads to a desire for a mathematically sound method to automate the transformation of customer requirements into a formal model. Such a method, an augmentation of requirements-based programming, will be briefly described in this paper, and a prototype tool to support it will be described. The method and tool enable both requirements validation and system verification for the class of systems whose behavior can be described as scenarios. An application of the tool to a prototype automated ground control system for a NASA mission is presented.
NASA Astrophysics Data System (ADS)
Nurhidayati, I.; Suparmi, A.; Cari, C.
2018-03-01
The Schrödinger equation has been extended by applying the minimal length formalism to a trigonometric potential. The wave function and energy spectra, used to describe the behavior of a subatomic particle, were obtained by using the hypergeometric method. The results show that the energy increases with both the minimal length parameter and the potential parameter. The energies were calculated numerically using MATLAB.
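For context, the minimal length formalism usually starts from a deformed commutator of the Kempf-Mangano-Mann type; the conventions below are assumed as background, not quoted from the paper:

```latex
[\hat{x},\hat{p}] \;=\; i\hbar\left(1+\beta \hat{p}^{2}\right)
\;\;\Longrightarrow\;\;
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left(1+\beta\,(\Delta p)^{2}\right),
```

which implies a smallest resolvable length of order ħ√β; expanding the kinetic term to first order in β turns the Schrödinger equation into a deformed equation that remains solvable for potentials such as the trigonometric one used here.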
Keldysh formalism for multiple parallel worlds
NASA Astrophysics Data System (ADS)
Ansari, M.; Nazarov, Y. V.
2016-03-01
We present a compact and self-contained review of the recently developed Keldysh formalism for multiple parallel worlds. The formalism has been applied to consistent quantum evaluation of the flows of informational quantities, in particular, to the evaluation of Renyi and Shannon entropy flows. We start with the formulation of the standard and extended Keldysh techniques in a single world in a form convenient for our presentation. We explain the use of Keldysh contours encompassing multiple parallel worlds. In the end, we briefly summarize the concrete results obtained with the method.
Memory sparing, fast scattering formalism for rigorous diffraction modeling
NASA Astrophysics Data System (ADS)
Iff, W.; Kämpfe, T.; Jourlin, Y.; Tishchenko, A. V.
2017-07-01
The basics and algorithmic steps of a novel scattering formalism suited for memory sparing and fast electromagnetic calculations are presented. The formalism, called ‘S-vector algorithm’ (by analogy with the known scattering-matrix algorithm), allows the calculation of the collective scattering spectra of individual layered micro-structured scattering objects. A rigorous method of linear complexity is applied to model the scattering at individual layers; here the generalized source method (GSM) resorting to Fourier harmonics as basis functions is used as one possible method of linear complexity. The concatenation of the individual scattering events can be achieved sequentially or in parallel, both having pros and cons. The present development will largely concentrate on a consecutive approach based on the multiple reflection series. The latter will be reformulated into an implicit formalism which will be associated with an iterative solver, resulting in improved convergence. The examples will first refer to 1D grating diffraction for the sake of simplicity and intelligibility, with a final 2D application example.
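The reformulation mentioned at the end follows a standard numerical pattern: instead of summing the multiple reflection series term by term, one solves its implicit fixed-point form with an iterative solver. Schematically, in a generic linear-algebra sketch that does not reproduce the paper's notation:

```latex
\mathbf{s} \;=\; \sum_{k=0}^{\infty} \mathbf{M}^{k}\mathbf{s}_{0}
\;\;\longleftrightarrow\;\;
\left(\mathbf{I}-\mathbf{M}\right)\mathbf{s} \;=\; \mathbf{s}_{0},
```

where the explicit series converges only when the spectral radius of M is below one, while the implicit system can be handed to an iterative (e.g., Krylov-type) solver with more robust convergence.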
Egidi, Franco; Sun, Shichao; Goings, Joshua J; Scalmani, Giovanni; Frisch, Michael J; Li, Xiaosong
2017-06-13
We present a linear response formalism for the description of the electronic excitations of a noncollinear reference defined via Kohn-Sham spin density functional methods. A set of auxiliary variables, defined using the density and noncollinear magnetization density vector, allows the generalization of spin density functional kernels commonly used in collinear DFT to noncollinear cases, including local density, GGA, meta-GGA and hybrid functionals. Working equations and derivations of functional second derivatives with respect to the noncollinear density, required in the linear response noncollinear TDDFT formalism, are presented in this work. This formalism takes all components of the spin magnetization into account independent of the type of reference state (open or closed shell). As a result, the method introduced here is able to afford a nonzero local xc torque on the spin magnetization while still satisfying the zero-torque theorem globally. The formalism is applied to a few test cases using the variational exact-two-component reference including spin-orbit coupling to illustrate the capabilities of the method.
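A common way to realize such auxiliary variables is to feed a collinear functional the locally diagonalized densities built from the total density n and the magnetization magnitude |m|; this standard device is shown as an illustration rather than the paper's exact definitions:

```latex
n_{\pm} \;=\; \tfrac{1}{2}\left(\,n \pm \lvert\mathbf{m}\rvert\,\right),
```

so that E_xc[n₊, n₋] is evaluated along the local spin quantization axis; the second derivatives with respect to the density and magnetization then supply the noncollinear kernel required in linear response.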
Practical Weak-lensing Shear Measurement with Metacalibration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sheldon, Erin S.; Huff, Eric M.
2017-05-20
Metacalibration is a recently introduced method to accurately measure weak gravitational lensing shear using only the available imaging data, without need for prior information about galaxy properties or calibration from simulations. The method involves distorting the image with a small known shear, and calculating the response of a shear estimator to that applied shear. The method was shown to be accurate in moderate-sized simulations with galaxy images that had relatively high signal-to-noise ratios, and without significant selection effects. In this work we introduce a formalism to correct for both shear response and selection biases. We also observe that for images with relatively low signal-to-noise ratios, the correlated noise that arises during the metacalibration process results in significant bias, for which we develop a simple empirical correction. To test this formalism, we created large image simulations based on both parametric models and real galaxy images, including tests with realistic point-spread functions. We varied the point-spread function ellipticity at the five-percent level. In each simulation we applied a small few-percent shear to the galaxy images. We introduced additional challenges that arise in real data, such as detection thresholds, stellar contamination, and missing data. We applied cuts on the measured galaxy properties to induce significant selection effects. Using our formalism, we recovered the input shear with an accuracy better than a part in a thousand in all cases.
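The response calculation at the heart of metacalibration can be sketched in a few lines: shear the observed image by ±Δγ, remeasure the ellipticity, and form a finite-difference response that calibrates the ensemble shear estimate. This is a schematic, diagonal-response sketch; `apply_shear` and `measure_ellipticity` are placeholder functions, not a real pipeline's API:

```python
import numpy as np

DG = 0.01  # small artificial shear applied during metacalibration

def shear_response(image, apply_shear, measure_ellipticity):
    """Finite-difference response R_j = d e_j / d g_j (diagonal sketch)."""
    R = np.empty(2)
    for j in range(2):                 # two shear components g1, g2
        e_plus = measure_ellipticity(apply_shear(image, +DG, component=j))
        e_minus = measure_ellipticity(apply_shear(image, -DG, component=j))
        R[j] = (e_plus[j] - e_minus[j]) / (2.0 * DG)
    return R

def calibrated_shear(ellipticities, responses):
    """Ensemble estimate <g> ~ <e> / <R>, averaging before dividing."""
    return np.mean(ellipticities, axis=0) / np.mean(responses, axis=0)
```

The full method uses a 2x2 response matrix and adds selection-response terms obtained by applying the same catalog cuts to the sheared remeasurements.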
Can Regulatory Bodies Expect Efficient Help from Formal Methods?
NASA Technical Reports Server (NTRS)
Lopez Ruiz, Eduardo R.; Lemoine, Michel
2010-01-01
In the context of EDEMOI (a French national project that proposed the use of semiformal and formal methods to infer the consistency and robustness of aeronautical regulations through the analysis of faithfully representative models), a methodology has been suggested and applied to different safety- and security-related aeronautical regulations. This paper summarizes the preliminary results of this experience by stating what the methodology's expected benefits were, from a scientific point of view, and what its useful benefits are, from a regulatory body's point of view.
Results of a Formal Methods Demonstration Project
NASA Technical Reports Server (NTRS)
Kelly, J.; Covington, R.; Hamilton, D.
1994-01-01
This paper describes the results of a cooperative study conducted by a team of researchers in formal methods at three NASA Centers to demonstrate FM techniques and to tailor them to critical NASA software systems. This pilot project applied FM to an existing critical software subsystem, the Shuttle's Jet Select subsystem (Phase I of an ongoing study). The present study shows that FM can be used successfully to uncover hidden issues in a highly critical and mature Functional Subsystem Software Requirements (FSSR) specification which are very difficult to discover by traditional means.
Algebraic Algorithm Design and Local Search
1996-12-01
method for performing algorithm design that is more purely algebraic than that of KIDS. This method is then applied to local search. Local search is a...synthesis. Our approach was to follow KIDS in spirit, but to adopt a pure algebraic formalism, supported by Kestrel’s SPECWARE environment (79), that...design was developed that is more purely algebraic than that of KIDS. This method was then applied to local search. A general theory of local search was
Formal Methods for Automated Diagnosis of Autosub 6000
NASA Technical Reports Server (NTRS)
Ernits, Juhan; Dearden, Richard; Pebody, Miles
2009-01-01
This is a progress report on applying formal methods in the context of building an automated diagnosis and recovery system for Autosub 6000, an Autonomous Underwater Vehicle (AUV). The diagnosis task involves building abstract models of the control system of the AUV. The diagnosis engine is based on Livingstone 2, a model-based diagnoser originally built for aerospace applications. Large parts of the diagnosis model can be built without concrete knowledge about each mission, but actual mission scripts and configuration parameters that carry important information for diagnosis are changed for every mission. Thus we use formal methods for generating the mission control part of the diagnosis model automatically from the mission script and perform a number of invariant checks to validate the configuration. After the diagnosis model is augmented with the generated mission control component model, it needs to be validated using verification techniques.
R-matrix theory, formal Casimirs and the periodic Toda lattice
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morosi, C.; Pizzocchero, L.
The nonunitary r-matrix theory and the associated bi- and triHamiltonian schemes are considered. The language of Poisson pencils and of their formal Casimirs is applied in this framework to characterize the biHamiltonian chains of integrals of motion, pointing out the role of the Schur polynomials in these constructions. This formalism is subsequently applied to the periodic Toda lattice. Some different algebraic settings and Lax formulations proposed in the literature for this system are analyzed in detail, and their full equivalence is exploited. In particular, the equivalence between the loop algebra approach and the method of differential-difference operators is illustrated; moreover, two alternative Lax formulations are considered, and appropriate reduction algorithms are found in both cases, allowing us to derive the multiHamiltonian formalism from r-matrix theory. The systems of integrals for the periodic Toda lattice known after Flaschka and Hénon, and their functional relations, are recovered through systematic application of the previously outlined schemes.
Continuum Level Density of a Coupled-Channel System in the Complex Scaling Method
NASA Astrophysics Data System (ADS)
Suzuki, R.; Kruppa, A. T.; Giraud, B. G.; Katō, K.
2008-06-01
We study the continuum level density (CLD) in the formalism of the complex scaling method (CSM) for coupled-channel systems. We apply the formalism to the ^4He = [^3H + p] + [^3He + n] coupled-channel cluster model, where there are resonances at low energy. Numerical calculations of the CLD in the CSM with a finite number of L^2 basis functions are consistent with the exact result calculated from the S-matrix by solving the coupled-channel equations. We also study channel densities. In this framework, the extended completeness relation (ECR) plays an important role.
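For reference, the continuum level density is conventionally defined as the difference between the full and free level densities and is tied to the scattering matrix; these are standard definitions, assumed here rather than quoted:

```latex
\Delta(E) \;=\; -\frac{1}{\pi}\,\mathrm{Im}\,\mathrm{Tr}\!\left[\frac{1}{E^{+}-H}-\frac{1}{E^{+}-H_{0}}\right]
\;=\; \frac{1}{2\pi i}\,\frac{d}{dE}\,\ln\det S(E),
```

with E⁺ = E + i0. In the CSM the trace is evaluated over the complex-rotated spectrum, so resonance poles contribute smooth Lorentzian-like terms to Δ(E) even with a finite basis.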
A Formal Valuation Framework for Emotions and Their Control.
Huys, Quentin J M; Renz, Daniel
2017-09-15
Computational psychiatry aims to apply mathematical and computational techniques to help improve psychiatric care. To achieve this, the phenomena under scrutiny should be within the scope of formal methods. As emotions play an important role across many psychiatric disorders, such computational methods must encompass emotions. Here, we consider formal valuation accounts of emotions. We focus on the fact that the flexibility of emotional responses and the nature of appraisals suggest the need for a model-based valuation framework for emotions. However, resource limitations make plain model-based valuation impossible and require metareasoning strategies to apportion cognitive resources adaptively. We argue that emotions may implement such metareasoning approximations by restricting the range of behaviors and states considered. We consider the processes that guide the deployment of the approximations, discerning between innate, model-free, heuristic, and model-based controllers. A formal valuation and metareasoning framework may thus provide a principled approach to examining emotions.
On the effect of boundary layer growth on the stability of compressible flows
NASA Technical Reports Server (NTRS)
El-Hady, N. M.
1981-01-01
The method of multiple scales is used to describe a formally correct method, based on nonparallel linear stability theory, that examines the two- and three-dimensional stability of compressible boundary layer flows. The method is applied to the supersonic flat-plate boundary layer at Mach number 4.5. The theoretical growth rates are in good agreement with experimental results. The method is also applied to the infinite-span swept wing transonic boundary layer with suction to evaluate the effect of the nonparallel flow on the development of crossflow disturbances.
An overview of very high level software design methods
NASA Technical Reports Server (NTRS)
Asdjodi, Maryam; Hooper, James W.
1988-01-01
Very High Level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications that include some semantic information of the system into machine executable form. Very high level design methods range from general domain-independent methods to approaches implementable for specific applications or domains. By applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming, and other methods, different approaches for higher level software design are being developed. Though one finds that a given approach does not always fall exactly in any specific class, this paper provides a classification for very high level design methods including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths, and feasibility for future expansion toward automatic development of software systems.
On the Adequacy of Bayesian Evaluations of Categorization Models: Reply to Vanpaemel and Lee (2012)
ERIC Educational Resources Information Center
Wills, Andy J.; Pothos, Emmanuel M.
2012-01-01
Vanpaemel and Lee (2012) argued, and we agree, that the comparison of formal models can be facilitated by Bayesian methods. However, Bayesian methods neither precede nor supplant our proposals (Wills & Pothos, 2012), as Bayesian methods can be applied both to our proposals and to their polar opposites. Furthermore, the use of Bayesian methods to…
An Ontology for State Analysis: Formalizing the Mapping to SysML
NASA Technical Reports Server (NTRS)
Wagner, David A.; Bennett, Matthew B.; Karban, Robert; Rouquette, Nicolas; Jenkins, Steven; Ingham, Michel
2012-01-01
State Analysis is a methodology developed over the last decade for architecting, designing and documenting complex control systems. Although it was originally conceived for designing robotic spacecraft, recent applications include the design of control systems for large ground-based telescopes. The European Southern Observatory (ESO) began a project to design the European Extremely Large Telescope (E-ELT), which will require coordinated control of over a thousand articulated mirror segments. The designers are using State Analysis as a methodology and the Systems Modeling Language (SysML) as a modeling and documentation language in this task. To effectively apply the State Analysis methodology in this context it became necessary to provide ontological definitions of the concepts and relations in State Analysis and greater flexibility through a mapping of State Analysis into a practical extension of SysML. The ontology provides the formal basis for verifying compliance with State Analysis semantics including architectural constraints. The SysML extension provides the practical basis for applying the State Analysis methodology with SysML tools. This paper will discuss the method used to develop these formalisms (the ontology), the formalisms themselves, the mapping to SysML and approach to using these formalisms to specify a control system and enforce architectural constraints in a SysML model.
A formal protocol test procedure for the Survivable Adaptable Fiber Optic Embedded Network (SAFENET)
NASA Astrophysics Data System (ADS)
High, Wayne
1993-03-01
This thesis focuses upon a new method for verifying the correct operation of a complex, high speed fiber optic communication network. These networks are of growing importance to the military because of their increased connectivity, survivability, and reconfigurability. With the introduction and increased dependence on sophisticated software and protocols, it is essential that their operation be correct. Because of the speed and complexity of fiber optic networks being designed today, they are becoming increasingly difficult to test. Previously, testing was accomplished by application of conformance test methods which had little connection with an implementation's specification. The major goal of conformance testing is to ensure that the implementation of a profile is consistent with its specification. Formal specification is needed to ensure that the implementation performs its intended operations while exhibiting desirable behaviors. The new conformance test method presented is based upon the System of Communicating Machine model which uses a formal protocol specification to generate a test sequence. The major contribution of this thesis is the application of the System of Communicating Machine model to formal profile specifications of the Survivable Adaptable Fiber Optic Embedded Network (SAFENET) standard which results in the derivation of test sequences for a SAFENET profile. The results applying this new method to SAFENET's OSI and Lightweight profiles are presented.
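To illustrate the flavor of specification-based test derivation with a generic finite-state-machine sketch (not the thesis's System of Communicating Machines model): given a protocol's transition table, one can derive a sequence that exercises every specified transition and checks observed outputs against the specification. All states, events, and outputs below are invented placeholders, not SAFENET's actual profile:

```python
# Transition-coverage test derivation for a toy protocol state machine.
SPEC = {
    ("idle",     "connect_req"): ("wait_ack", "send_connect"),
    ("wait_ack", "ack"):         ("open",     None),
    ("open",     "data"):        ("open",     "deliver_data"),
    ("open",     "disconnect"):  ("idle",     "send_disc"),
}

def transition_tour(spec, start="idle"):
    """Greedy walk covering every specified transition at least once.
    A real tool computes an optimal tour; this sketch resets when stuck."""
    remaining, state, seq = dict(spec), start, []
    while remaining:
        step = next(((s, e) for (s, e) in remaining if s == state), None)
        if step is None:
            if state == start:      # leftover transitions unreachable: stop
                break
            seq.append(("reset", None))
            state = start
            continue
        next_state, expected_output = remaining.pop(step)
        seq.append((step[1], expected_output))   # (event, expected output)
        state = next_state
    return seq

for event, expected in transition_tour(SPEC):
    print(event, "->", expected)
```

Conformance testing then applies each event to the implementation under test and compares its outputs and state changes with the expected values derived from the formal specification.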
SS-HORSE method for studying resonances
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blokhintsev, L. D.; Mazur, A. I.; Mazur, I. A., E-mail: 008043@pnu.edu.ru
A new method for analyzing resonance states based on the Harmonic-Oscillator Representation of Scattering Equations (HORSE) formalism and analytic properties of partial-wave scattering amplitudes is proposed. The method is tested by applying it to the model problem of neutral-particle scattering and can be used to study resonance states on the basis of microscopic calculations performed within various versions of the shell model.
Clive, Derrick L J; Fletcher, Stephen P; Liu, Dazhan
2004-05-14
An indirect method is described for effecting radical cyclization onto a benzene ring. Cross-conjugated dienones 6, which are readily prepared from phenols, undergo radical cyclization (6 --> 7 --> 8), and the products (8) are easily aromatized. The method has been applied to the synthesis of ent-nocardione A (21).
On the simulation of indistinguishable fermions in the many-body Wigner formalism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sellier, J.M., E-mail: jeanmichel.sellier@gmail.com; Dimov, I.
2015-01-01
The simulation of quantum systems consisting of interacting, indistinguishable fermions is a formidable mathematical problem which poses serious numerical challenges. Many sophisticated methods addressing this problem are available which are based on the many-body Schrödinger formalism. Recently a Monte Carlo technique for the resolution of the many-body Wigner equation has been introduced and successfully applied to the simulation of distinguishable, spinless particles. This numerical approach presents several advantages over other methods. Indeed, it is based on an intuitive formalism in which quantum systems are described in terms of a quasi-distribution function, and it is highly scalable due to its Monte Carlo nature. In this work, we extend the many-body Wigner Monte Carlo method to the simulation of indistinguishable fermions. To this end, we first show how fermions are incorporated into the Wigner formalism. Then we demonstrate that the Pauli exclusion principle is intrinsic to the formalism. As a matter of fact, a numerical simulation of two strongly interacting fermions (electrons) is performed which clearly shows the appearance of a Fermi (or exchange-correlation) hole in the phase-space, a clear signature of the presence of the Pauli principle. To conclude, we simulate 4, 8 and 16 non-interacting fermions, isolated in a closed box, and show that, as the number of fermions increases, we gradually recover the Fermi-Dirac statistics, a clear proof of the reliability of our proposed method for the treatment of indistinguishable particles.
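For orientation, the quasi-distribution at the core of this formalism is the Wigner function, the Weyl transform of the wave function; the standard single-particle definition is shown below, and the many-body version promotes x and p to 3N-dimensional vectors:

```latex
f_{W}(x,p,t) \;=\; \frac{1}{\pi\hbar}\int_{-\infty}^{\infty} ds\;
e^{-2ips/\hbar}\,\psi^{*}(x+s,t)\,\psi(x-s,t),
```

and the Monte Carlo method samples the nonlocal potential term of the corresponding Wigner evolution equation stochastically.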
Su, Hongling; Li, Shengtai
2016-02-03
In this study, we propose two new energy/dissipation-preserving Birkhoffian multi-symplectic methods (Birkhoffian and Birkhoffian box) for Maxwell's equations with dissipation terms. After investigating the non-autonomous and autonomous Birkhoffian formalism for Maxwell's equations with dissipation terms, we first apply a novel generating functional theory to the non-autonomous Birkhoffian formalism to propose our Birkhoffian scheme, and then implement a central box method to the autonomous Birkhoffian formalism to derive the Birkhoffian box scheme. We have obtained four formal local conservation laws and three formal energy global conservation laws. We have also proved that both of our derived schemes preserve the discrete version of the global/local conservation laws. Furthermore, the stability, dissipation and dispersion relations are also investigated for the schemes. Theoretical analysis shows that the schemes are unconditionally stable, dissipation-preserving for Maxwell's equations in a perfectly matched layer (PML) medium and have second order accuracy in both time and space. Numerical experiments for problems with exact theoretical results are given to demonstrate that the Birkhoffian multi-symplectic schemes are much more accurate in preserving energy than both the exponential finite-difference time-domain (FDTD) method and traditional Hamiltonian scheme. Finally, we also solve the electromagnetic pulse (EMP) propagation problem and the numerical results show that the Birkhoffian scheme recovers the magnitude of the current source and reaction history very well even after long time propagation.
Chen, Stephanie C; Kim, Scott Y H
2016-01-01
Background/Aims: Standard of care pragmatic clinical trials (SCPCTs) that compare treatments already in use could improve care and reduce cost, but there is considerable debate about the research risks of SCPCTs and how to apply informed consent regulations to such trials. We sought to develop a framework integrating the insights from opposing sides of the debate.
Methods: We developed a formal risk-benefit analysis framework for SCPCTs and then applied it to key provisions of the U.S. federal regulations.
Results: Our formal framework for SCPCT risk-benefit analysis takes into account three key considerations: the ex ante estimates of risks and benefits of the treatments to be compared in a SCPCT, the allocation ratios of treatments inside and outside a SCPCT, and the significance of some participants receiving a different treatment inside a SCPCT than outside the trial. The framework provides practical guidance on how the research ethics regulations on informed consent should be applied to SCPCTs.
Conclusions: Our proposed formal model makes explicit the relationship between the concepts used by opposing sides of the debate about the research risks of SCPCTs and can be used to clarify the implications for informed consent.
PMID: 27365010
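As a toy rendering of the three considerations (purely illustrative notation, not the authors' model): let p_t^in and p_t^out be the probabilities of receiving treatment t inside and outside the trial, and E[H_t] the ex ante expected harm of t; the incremental research risk of enrollment can then be written as

```latex
\Delta R \;=\; \sum_{t} \left(p_{t}^{\mathrm{in}} - p_{t}^{\mathrm{out}}\right)\mathbb{E}\!\left[H_{t}\right],
```

which vanishes when trial allocation mirrors usual care and grows with both the allocation shift and the ex ante harm difference between the compared treatments.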
Deriving Safety Cases from Automatically Constructed Proofs
NASA Technical Reports Server (NTRS)
Basir, Nurlida; Denney, Ewen; Fischer, Bernd
2009-01-01
Formal proofs provide detailed justification for the validity of claims and are widely used in formal software development methods. However, they are often complex and difficult to understand, because the formalism in which they are constructed and encoded is usually machine-oriented, and they may also be based on assumptions that are not justified. This causes concerns about the trustworthiness of using formal proofs as arguments in safety-critical applications. Here, we present an approach to develop safety cases that correspond to formal proofs found by automated theorem provers and reveal the underlying argumentation structure and top-level assumptions. We concentrate on natural deduction style proofs, which are closer to human reasoning than resolution proofs, and show how to construct the safety cases by covering the natural deduction proof tree with corresponding safety case fragments. We also abstract away logical book-keeping steps, which reduces the size of the constructed safety cases. We show how the approach can be applied to the proofs found by the Muscadet prover.
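A minimal sketch of the core idea (hypothetical rule names and data model, not the authors' implementation): walk the natural-deduction proof tree, promote the premises of book-keeping steps upward, and emit one safety-case goal per remaining inference.

    from dataclasses import dataclass, field
    from typing import List

    BOOKKEEPING = {"and-intro", "and-elim", "rewrite"}  # assumed book-keeping rules

    @dataclass
    class ProofNode:
        conclusion: str                       # formula established by this step
        rule: str                             # inference rule used
        premises: List["ProofNode"] = field(default_factory=list)

    @dataclass
    class SafetyGoal:
        claim: str                            # goal text shown in the safety case
        strategy: str                         # argument strategy (the inference rule)
        subgoals: List["SafetyGoal"] = field(default_factory=list)

    def real_premises(node: ProofNode) -> List[ProofNode]:
        # Abstract away book-keeping steps by promoting their premises upward.
        out = []
        for p in node.premises:
            out.extend(real_premises(p)) if p.rule in BOOKKEEPING else out.append(p)
        return out

    def to_safety_case(node: ProofNode) -> SafetyGoal:
        return SafetyGoal(claim=node.conclusion,
                          strategy=f"argument by {node.rule}",
                          subgoals=[to_safety_case(p) for p in real_premises(node)])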
Formal methods demonstration project for space applications
NASA Technical Reports Server (NTRS)
Divito, Ben L.
1995-01-01
The Space Shuttle program is cooperating in a pilot project to apply formal methods to live requirements analysis activities. As one of the larger ongoing shuttle Change Requests (CR's), the Global Positioning System (GPS) CR involves a significant upgrade to the Shuttle's navigation capability. Shuttles are to be outfitted with GPS receivers and the primary avionics software will be enhanced to accept GPS-provided positions and integrate them into navigation calculations. Prior to implementing the CR, requirements analysts at Loral Space Information Systems, the Shuttle software contractor, must scrutinize the CR to identify and resolve any requirements issues. We describe an ongoing task of the Formal Methods Demonstration Project for Space Applications whose goal is to find an effective way to use formal methods in the GPS CR requirements analysis phase. This phase is currently under way and a small team from NASA Langley, ViGYAN Inc. and Loral is now engaged in this task. Background on the GPS CR is provided and an overview of the hardware/software architecture is presented. We outline the approach being taken to formalize the requirements, only a subset of which is being attempted. The approach features the use of the PVS specification language to model 'principal functions', which are major units of Shuttle software. Conventional state machine techniques form the basis of our approach. Given this background, we present interim results based on a snapshot of work in progress. Samples of requirements specifications rendered in PVS are offered as illustrations. We walk through a specification sketch for the principal function known as GPS Receiver State processing. Results to date are summarized and feedback from Loral requirements analysts is highlighted. Preliminary data are shown comparing issues detected by the formal methods team versus those detected using existing requirements analysis methods. We conclude by discussing our plan to complete the remaining activities of this task.
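For flavor, the conventional state-machine style mentioned above can be sketched as a transition table; the states and events below are invented for illustration and are not the actual GPS Receiver State processing requirements.

    # Hedged sketch of a "principal function" as a total transition function.
    TRANSITIONS = {
        ("INIT", "receiver_power_on"): "ACQUIRING",
        ("ACQUIRING", "position_fix_valid"): "TRACKING",
        ("TRACKING", "position_fix_lost"): "ACQUIRING",
        ("TRACKING", "receiver_fault"): "FAILED",
    }

    def step(state: str, event: str) -> str:
        # Unlisted (state, event) pairs leave the state unchanged, mirroring
        # a total transition function in a PVS-style specification.
        return TRANSITIONS.get((state, event), state)

    assert step("INIT", "receiver_power_on") == "ACQUIRING"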
Integrating Security into the Curriculum
1998-12-01
predicate calculus, discrete math, and finite-state machine theory. In addition to applying standard mathematical foundations to constructing hardware and... models, specifications, and the use of formal methods for verification and covert channel analysis. The means for analysis is based on discrete math, information
ADGS-2100 Adaptive Display and Guidance System Window Manager Analysis
NASA Technical Reports Server (NTRS)
Whalen, Mike W.; Innis, John D.; Miller, Steven P.; Wagner, Lucas G.
2006-01-01
Recent advances in modeling languages have made it feasible to formally specify and analyze the behavior of large system components. Synchronous data flow languages, such as Lustre, SCR, and RSML-e are particularly well suited to this task, and commercial versions of these tools such as SCADE and Simulink are growing in popularity among designers of safety critical systems, largely due to their ability to automatically generate code from the models. At the same time, advances in formal analysis tools have made it practical to formally verify important properties of these models to ensure that design defects are identified and corrected early in the lifecycle. This report describes how these tools have been applied to the ADGS-2100 Adaptive Display and Guidance Window Manager being developed by Rockwell Collins Inc. This work demonstrates how formal methods can be easily and cost-efficiently used to remove defects early in the design cycle.
Why formal learning theory matters for cognitive science.
Fulop, Sean; Chater, Nick
2013-01-01
This article reviews a number of different areas in the foundations of formal learning theory. After outlining the general framework for formal models of learning, the Bayesian approach to learning is summarized. This leads to a discussion of Solomonoff's Universal Prior Distribution for Bayesian learning. Gold's model of identification in the limit is also outlined. We next discuss a number of aspects of learning theory raised in contributed papers, related to both computational and representational complexity. The article concludes with a description of how semi-supervised learning can be applied to the study of cognitive learning models. Throughout this overview, the specific points raised by our contributing authors are connected to the models and methods under review.
Application of the Extended Completeness Relation to the Absorbing Boundary Condition
NASA Astrophysics Data System (ADS)
Iwasaki, Masataka; Otani, Reiji; Ito, Makoto
The strength function of the linear response to an external field is calculated in the formalism of the absorbing boundary condition (ABC). The dipole excitation of a schematic two-body system is treated in the present study. The extended completeness relation, which is assumed by analogy with the formulation in the complex scaling method (CSM), is applied to the calculation of the strength function. The calculation of the strength function is successful in the present formalism; hence, the extended completeness relation seems to work well in the ABC formalism. The contributions from the resonance and the non-resonant continuum are also analyzed according to the decomposition of the energy levels in the extended completeness relation.
A Formal Methods Approach to the Analysis of Mode Confusion
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Miller, Steven P.; Potts, James N.; Carreno, Victor A.
2004-01-01
The goal of the new NASA Aviation Safety Program (AvSP) is to reduce the civil aviation fatal accident rate by 80% in ten years and 90% in twenty years. This program is being driven by the accident data with a focus on the most recent history. Pilot error is the most commonly cited cause for fatal accidents (up to 70%) and obviously must be given major consideration in this program. While the greatest source of pilot error is the loss of situation awareness, mode confusion is increasingly becoming a major contributor as well. The January 30, 1995 issue of Aviation Week lists 184 incidents and accidents involving mode awareness, including the Bangalore A320 crash 2/14/90, the Strasbourg A320 crash 1/20/92, the Mulhouse-Habsheim A320 crash 6/26/88, and the Toulouse A330 crash 6/30/94. These incidents and accidents reveal that pilots sometimes become confused about what the cockpit automation is doing. Consequently, human factors research is an obvious investment area. However, even a cursory look at the accident data reveals that the mode confusion problem is much deeper than just training deficiencies and a lack of human-oriented design. This is readily acknowledged by human factors experts. It seems that further progress in human factors must come through a deeper scrutiny of the internals of the automation. It is in this arena that formal methods can contribute. Formal methods refers to the use of techniques from logic and discrete mathematics in the specification, design, and verification of computer systems, both hardware and software. The fundamental goal of formal methods is to capture requirements, designs and implementations in a mathematically based model that can be analyzed in a rigorous manner. Research in formal methods is aimed at automating this analysis as much as possible. By capturing the internal behavior of a flight deck in a rigorous and detailed formal model, the dark corners of a design can be analyzed. This paper will explore how formal models and analyses can be used to help eliminate mode confusion from flight deck designs and at the same time increase our confidence in the safety of the implementation. The paper is based upon interim results from a new project involving NASA Langley and Rockwell Collins in applying formal methods to a realistic business jet Flight Guidance System (FGS).
Bio-Inspired Genetic Algorithms with Formalized Crossover Operators for Robotic Applications.
Zhang, Jie; Kang, Man; Li, Xiaojuan; Liu, Geng-Yang
2017-01-01
Genetic algorithms are widely adopted to solve optimization problems in robotic applications. In such safety-critical systems, it is vitally important to formally prove correctness when genetic algorithms are applied. This paper focuses on formal modeling of crossover operations, which are among the most important operations in genetic algorithms. Specifically, we formalize crossover operations for the first time with higher-order logic based on HOL4, which is easy to deploy thanks to its user-friendly programming environment. With correctness-guaranteed formalized crossover operations, we can safely apply them in robotic applications. We implement our technique to solve a path planning problem using a genetic algorithm with our formalized crossover operations, and the results show the effectiveness of our technique.
NASA Astrophysics Data System (ADS)
Matsubara, Takahiko
2003-02-01
We formulate a general method for perturbative evaluations of statistics of smoothed cosmic fields and provide useful formulae for application of the perturbation theory to various statistics. This formalism is an extensive generalization of the method used by Matsubara, who derived a weakly nonlinear formula of the genus statistic in a three-dimensional density field. After describing the general method, we apply the formalism to a series of statistics, including genus statistics, level-crossing statistics, Minkowski functionals, and a density extrema statistic, regardless of the dimensions in which each statistic is defined. The relation between the Minkowski functionals and other geometrical statistics is clarified. These statistics can be applied to several cosmic fields, including the three-dimensional density field, three-dimensional velocity field, two-dimensional projected density field, and so forth. The results are detailed for the second-order theory of the formalism. The effect of bias is discussed. The statistics of smoothed cosmic fields as functions of the threshold rescaled by volume fraction are discussed in the framework of second-order perturbation theory. In CDM-like models, their functional deviations from linear predictions plotted against the rescaled threshold are generally much smaller than those plotted against the direct threshold. There is still a slight meatball shift against the rescaled threshold, which is characterized by asymmetry in the depths of the troughs in the genus curve. A theory-motivated asymmetry factor in the genus curve is proposed.
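For orientation, the Gaussian (linear-theory) genus curve that such second-order formulae correct is the standard result

    G(\nu) = A\,(1 - \nu^{2})\, e^{-\nu^{2}/2},

where \nu is the (possibly volume-fraction-rescaled) threshold and the amplitude A depends on the second moments of the smoothed field; weak nonlinearity then appears as small, threshold-dependent deviations from this curve.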
A Framework for Modeling Human-Machine Interactions
NASA Technical Reports Server (NTRS)
Shafto, Michael G.; Rosekind, Mark R. (Technical Monitor)
1996-01-01
Modern automated flight-control systems employ a variety of different behaviors, or modes, for managing the flight. While developments in cockpit automation have resulted in workload reduction and economical advantages, they have also given rise to an ill-defined class of human-machine problems, sometimes referred to as 'automation surprises'. Our interest in applying formal methods for describing human-computer interaction stems from our ongoing research on cockpit automation. In this area of aeronautical human factors, there is much concern about how flight crews interact with automated flight-control systems, so that the likelihood of making errors, in particular mode-errors, is minimized and the consequences of such errors are contained. The goal of the ongoing research on formal methods in this context is: (1) to develop a framework for describing human interaction with control systems; (2) to formally categorize such automation surprises; and (3) to develop tests for identification of these categories early in the specification phase of a new human-machine system.
NASA Astrophysics Data System (ADS)
Chatterjee, Saikat; Koopmans, Léon V. E.
2018-02-01
In the last decade, the detection of individual massive dark matter sub-haloes has been possible using potential correction formalism in strong gravitational lens imaging. Here, we propose a statistical formalism to relate strong gravitational lens surface brightness anomalies to the lens potential fluctuations arising from dark matter distribution in the lens galaxy. We consider these fluctuations as a Gaussian random field in addition to the unperturbed smooth lens model. This is very similar to weak lensing formalism and we show that in this way we can measure the power spectrum of these perturbations to the potential. We test the method by applying it to simulated mock lenses of different geometries and by performing an MCMC analysis of the theoretical power spectra. This method can measure density fluctuations in early type galaxies on scales of 1-10 kpc at typical rms levels of a per cent, using a single lens system observed with the Hubble Space Telescope with typical signal-to-noise ratios obtained in a single orbit.
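Schematically (our paraphrase of the potential-correction picture, with notation assumed), a small potential perturbation \delta\psi displaces the lens mapping and perturbs the lensed surface brightness at first order as

    \delta I(\boldsymbol{\theta}) \simeq -\,\nabla \delta\psi(\boldsymbol{\theta}) \cdot \nabla I_{0}(\boldsymbol{\theta}),

so that, with \delta\psi modeled as a Gaussian random field, the two-point statistics of the brightness residuals constrain the power spectrum P_{\delta\psi}(k) of the potential fluctuations.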
A new method based on the Butler-Volmer formalism to evaluate voltammetric cation and anion sensors.
Cano, Manuel; Rodríguez-Amaro, Rafael; Fernández Romero, Antonio J
2008-12-11
A new method based on the Butler-Volmer formalism is applied to assess the capability of two voltammetric ion sensors based on polypyrrole films: PPy/DBS and PPy/ClO4 modified electrodes were studied as voltammetric cation and anion sensors, respectively. Plots of the reversible potential versus the logarithm of electrolyte concentration provided positive calibration slopes for PPy/DBS and negative ones for PPy/ClO4, as expected both from the proposed method and from the one based on the Nernst equation. The slope expressions deduced from Butler-Volmer include the electron-transfer coefficient, which allows slope values different from the ideal Nernstian value to be explained. Both polymeric films exhibited a degree of ion-selectivity when immersed in mixed-analyte solutions. Selectivity coefficients for the two proposed voltammetric cation and anion sensors were obtained by several experimental methods, including the separated solution method (SSM) and the matched potential method (MPM). The K values acquired by the different methods were very close for both polymeric sensors.
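For reference (a hedged sketch of the slope expressions, not the paper's exact derivation), the ideal Nernstian calibration slope at 25 °C is

    \frac{\partial E_{\mathrm{rev}}}{\partial \log_{10} c} = \pm\,\frac{2.303\,RT}{nF} \approx \pm\,\frac{59.2}{n}\ \mathrm{mV\,decade^{-1}},

whereas a Butler-Volmer-based treatment plausibly yields slopes of the form \pm\,2.303\,RT/(\alpha n F), so an electron-transfer coefficient \alpha \neq 1 accounts for non-Nernstian slope values.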
Formal methods technology transfer: Some lessons learned
NASA Technical Reports Server (NTRS)
Hamilton, David
1992-01-01
IBM has a long history in the application of formal methods to software development and verification. There have been many successes in the development of methods, tools and training to support formal methods. And formal methods have been very successful on several projects. However, the use of formal methods has not been as widespread as hoped. This presentation summarizes several approaches that have been taken to encourage more widespread use of formal methods, and discusses the results so far. The basic problem is one of technology transfer, which is a very difficult problem. It is even more difficult for formal methods. General problems of technology transfer, especially the transfer of formal methods technology, are also discussed. Finally, some prospects for the future are mentioned.
Zhou, Yirong; Breit, Bernhard
2017-12-22
An unprecedented asymmetric N-H functionalization of quinazolinones with allenes and allylic carbonates was successfully achieved by rhodium catalysis with the assistance of chiral bidentate diphosphine ligands. The high efficiency and practicality of this method was demonstrated by a low catalyst loading of 1 mol % as well as excellent chemo-, regio-, and enantioselectivities with broad functional group compatibility. Furthermore, this newly developed strategy was applied as key step in the first enantioselective formal total synthesis of (-)-chaetominine.
Siddique, Juned; Harel, Ofer; Crespi, Catherine M.; Hedeker, Donald
2014-01-01
The true missing data mechanism is never known in practice. We present a method for generating multiple imputations for binary variables that formally incorporates missing data mechanism uncertainty. Imputations are generated from a distribution of imputation models rather than a single model, with the distribution reflecting subjective notions of missing data mechanism uncertainty. Parameter estimates and standard errors are obtained using rules for nested multiple imputation. Using simulation, we investigate the impact of missing data mechanism uncertainty on post-imputation inferences and show that incorporating this uncertainty can increase the coverage of parameter estimates. We apply our method to a longitudinal smoking cessation trial where nonignorably missing data were a concern. Our method provides a simple approach for formalizing subjective notions regarding nonresponse and can be implemented using existing imputation software. PMID:24634315
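A minimal sketch of the idea, assuming a logistic imputation model and a normal prior on a nonignorability (log-odds) offset; the prior standard deviation and model form are illustrative, not the authors' specification:

    import numpy as np

    rng = np.random.default_rng(0)

    def impute_binary(p_mar, n_imp=20, delta_sd=0.5):
        """p_mar: MAR-fitted P(Y=1 | X) for the cases with missing Y."""
        p_mar = np.asarray(p_mar, dtype=float)
        draws = []
        for _ in range(n_imp):
            delta = rng.normal(0.0, delta_sd)          # mechanism-uncertainty draw
            logit = np.log(p_mar / (1 - p_mar)) + delta
            draws.append(rng.binomial(1, 1 / (1 + np.exp(-logit))))
        return draws                                    # analyze with nested-MI rules

    imputations = impute_binary([0.2, 0.5, 0.9], n_imp=5)

Each imputation draws its own offset, so the spread across imputations carries the stated mechanism uncertainty into the post-imputation standard errors.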
ERIC Educational Resources Information Center
Rienties, Bart; Héliot, YingFei
2018-01-01
While interdisciplinary courses are regarded as a promising method for students to learn and apply knowledge from other disciplines, there is limited empirical evidence available whether interdisciplinary courses can effectively "create" interdisciplinary students. In this innovative quasi-experimental study amongst 377 Master's…
Lessons Learned from Client Projects in an Undergraduate Project Management Course
ERIC Educational Resources Information Center
Pollard, Carol E.
2012-01-01
This work proposes that a subtle combination of three learning methods offering "just in time" project management knowledge, coupled with hands-on project management experience can be particularly effective in producing project management students with employable skills. Students were required to apply formal project management knowledge to gain…
Concepts of formal concept analysis
NASA Astrophysics Data System (ADS)
Žáček, Martin; Homola, Dan; Miarka, Rostislav
2017-07-01
The aim of this article is to apply Formal Concept Analysis to the concept of the world. Formal Concept Analysis (FCA), as a methodology of data analysis, information management and knowledge representation, has the potential to be applied to a variety of linguistic problems. FCA is a mathematical theory of concepts and concept hierarchies that reflects an understanding of what a concept is. Formal Concept Analysis explicitly formalizes the extension and intension of a concept, and their mutual relationships. A distinguishing feature of FCA is an inherent integration of three components of conceptual processing of data and knowledge, namely, the discovery of and reasoning with concepts in data, the discovery of and reasoning with dependencies in data, and the visualization of data, concepts, and dependencies with folding/unfolding capabilities.
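A minimal sketch of the core FCA construction (toy context invented for illustration): enumerate all formal concepts (extent, intent) of a binary context by closing attribute sets.

    from itertools import combinations

    objects = ["sparrow", "penguin", "bat"]
    attributes = ["flies", "bird", "mammal"]
    incidence = {("sparrow", "flies"), ("sparrow", "bird"),
                 ("penguin", "bird"),
                 ("bat", "flies"), ("bat", "mammal")}

    def extent(atts):    # objects having every attribute in atts
        return frozenset(o for o in objects
                         if all((o, a) in incidence for a in atts))

    def intent(objs):    # attributes shared by every object in objs
        return frozenset(a for a in attributes
                         if all((o, a) in incidence for o in objs))

    concepts = set()
    for r in range(len(attributes) + 1):
        for atts in combinations(attributes, r):
            e = extent(atts)
            concepts.add((e, intent(e)))    # the closure is a formal concept

    for e, i in sorted(concepts, key=lambda c: -len(c[0])):
        print(sorted(e), sorted(i))

Ordering the concepts by extent inclusion yields the concept lattice, the hierarchy the abstract refers to.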
Sub-grid scale models for discontinuous Galerkin methods based on the Mori-Zwanzig formalism
NASA Astrophysics Data System (ADS)
Parish, Eric; Duraisamy, Karthik
2017-11-01
The optimal prediction framework of Chorin et al., which is a reformulation of the Mori-Zwanzig (M-Z) formalism of non-equilibrium statistical mechanics, provides a framework for the development of mathematically derived closure models. The M-Z formalism provides a methodology to reformulate a high-dimensional Markovian dynamical system as a lower-dimensional, non-Markovian (non-local) system. In this lower-dimensional system, the effects of the unresolved scales on the resolved scales are non-local and appear as a convolution integral. The non-Markovian system is an exact statement of the original dynamics and is used as a starting point for model development. In this work, we investigate the development of M-Z-based closure models within the context of the Variational Multiscale Method (VMS). The method relies on a decomposition of the solution space into two orthogonal subspaces. The impact of the unresolved subspace on the resolved subspace is shown to be non-local in time and is modeled through the M-Z formalism. The models are applied to hierarchical discontinuous Galerkin discretizations. Commonalities between the M-Z closures and conventional flux schemes are explored. This work was supported in part by AFOSR under the project ''LES Modeling of Non-local effects using Statistical Coarse-graining'' with Dr. Jean-Luc Cambier as the technical monitor.
Expert2OWL: A Methodology for Pattern-Based Ontology Development.
Tahar, Kais; Xu, Jie; Herre, Heinrich
2017-01-01
The formalization of expert knowledge enables a broad spectrum of applications employing ontologies as underlying technology. These include eLearning, Semantic Web and expert systems. However, the manual construction of such ontologies is time-consuming and thus expensive. Moreover, experts are often unfamiliar with the syntax and semantics of formal ontology languages such as OWL and usually have no experience in developing formal ontologies. To overcome these barriers, we developed a new method and tool, called Expert2OWL that provides efficient features to support the construction of OWL ontologies using GFO (General Formal Ontology) as a top-level ontology. This method allows a close and effective collaboration between ontologists and domain experts. Essentially, this tool integrates Excel spreadsheets as part of a pattern-based ontology development and refinement process. Expert2OWL enables us to expedite the development process and modularize the resulting ontologies. We applied this method in the field of Chinese Herbal Medicine (CHM) and used Expert2OWL to automatically generate an accurate Chinese Herbology ontology (CHO). The expressivity of CHO was tested and evaluated using ontology query languages SPARQL and DL. CHO shows promising results and can generate answers to important scientific questions such as which Chinese herbal formulas contain which substances, which substances treat which diseases, and which ones are the most frequently used in CHM.
Formal methods in the design of Ada 1995
NASA Technical Reports Server (NTRS)
Guaspari, David
1995-01-01
Formal, mathematical methods are most useful when applied early in the design and implementation of a software system--that, at least, is the familiar refrain. I will report on a modest effort to apply formal methods at the earliest possible stage, namely, in the design of the Ada 95 programming language itself. This talk is an 'experience report' that provides brief case studies illustrating the kinds of problems we worked on, how we approached them, and the extent (if any) to which the results proved useful. It also derives some lessons and suggestions for those undertaking future projects of this kind. Ada 95 is the first revision of the standard for the Ada programming language. The revision began in 1988, when the Ada Joint Programming Office first asked the Ada Board to recommend a plan for revising the Ada standard. The first step in the revision was to solicit criticisms of Ada 83. A set of requirements for the new language standard, based on those criticisms, was published in 1990. A small design team, the Mapping Revision Team (MRT), became exclusively responsible for revising the language standard to satisfy those requirements. The MRT, from Intermetrics, is led by S. Tucker Taft. The work of the MRT was regularly subject to independent review and criticism by a committee of distinguished Reviewers and by several advisory teams--for example, the two User/Implementor teams, each consisting of an industrial user (attempting to make significant use of the new language on a realistic application) and a compiler vendor (undertaking, experimentally, to modify its current implementation in order to provide the necessary new features). One novel decision established the Language Precision Team (LPT), which investigated language proposals from a mathematical point of view. The LPT applied formal mathematical analysis to help improve the design of Ada 95 (e.g., by clarifying the language proposals) and to help promote its acceptance (e.g., by identifying a verifiable subset that would meet the needs of safety-critical applications). The first LPT project, which ran from the fall of 1990 until the end of 1992, produced studies of several language issues: optimization, sharing and storage, tasking and protected records, overload resolution, the floating point model, distribution, program errors, and object-oriented programming. The second LPT project, in 1994, formally modeled the dynamic semantics of a large part of the (almost) final language definition, looking especially for interactions between language features.
Formal Methods Tool Qualification
NASA Technical Reports Server (NTRS)
Wagner, Lucas G.; Cofer, Darren; Slind, Konrad; Tinelli, Cesare; Mebsout, Alain
2017-01-01
Formal methods tools have been shown to be effective at finding defects in safety-critical digital systems including avionics systems. The publication of DO-178C and the accompanying formal methods supplement DO-333 allows applicants to obtain certification credit for the use of formal methods without providing justification for them as an alternative method. This project conducted an extensive study of existing formal methods tools, identifying obstacles to their qualification and proposing mitigations for those obstacles. Further, it interprets the qualification guidance for existing formal methods tools and provides case study examples for open source tools. This project also investigates the feasibility of verifying formal methods tools by generating proof certificates which capture proof of the formal methods tool's claim, which can be checked by an independent, proof certificate checking tool. Finally, the project investigates the feasibility of qualifying this proof certificate checker, in the DO-330 framework, in lieu of qualifying the model checker itself.
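A toy illustration of the proof-certificate idea (the certificate format is invented; real tools emit far richer certificates): the checker recomputes each resolution step, so only this small program, rather than the prover itself, needs qualification-level trust.

    def resolve(c1: frozenset, c2: frozenset, pivot: str) -> frozenset:
        # Resolve two clauses on a pivot literal appearing with opposite signs.
        neg = pivot[1:] if pivot.startswith("~") else "~" + pivot
        assert pivot in c1 and neg in c2, "pivot must appear with opposite signs"
        return (c1 - {pivot}) | (c2 - {neg})

    def check(axioms, steps):
        """steps: list of (idx1, idx2, pivot, claimed_resolvent)."""
        clauses = [frozenset(c) for c in axioms]
        for i, j, pivot, claimed in steps:
            r = resolve(clauses[i], clauses[j], pivot)
            assert r == frozenset(claimed), f"bad certificate step: {claimed}"
            clauses.append(r)
        return frozenset() in clauses      # empty clause: refutation checked

    # Tiny refutation of {p} and {~p}:
    print(check([{"p"}, {"~p"}], [(0, 1, "p", set())]))   # True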
Symplectic Quantization of a Vector-Tensor Gauge Theory with Topological Coupling
NASA Astrophysics Data System (ADS)
Barcelos-Neto, J.; Silva, M. B. D.
We use the symplectic formalism to quantize a gauge theory in which vector and tensor fields are coupled in a topological way. This is an example of a reducible theory, and a procedure similar to the ghosts-of-ghosts of the BFV method is applied, but in terms of Lagrange multipliers. Our final results are in agreement with the ones found in the literature by using the Dirac method.
Maximum entropy formalism for the analytic continuation of matrix-valued Green's functions
NASA Astrophysics Data System (ADS)
Kraberger, Gernot J.; Triebl, Robert; Zingl, Manuel; Aichhorn, Markus
2017-10-01
We present a generalization of the maximum entropy method to the analytic continuation of matrix-valued Green's functions. To treat off-diagonal elements correctly based on Bayesian probability theory, the entropy term has to be extended for spectral functions that are possibly negative in some frequency ranges. In that way, all matrix elements of the Green's function matrix can be analytically continued; we introduce a computationally cheap element-wise method for this purpose. However, this method cannot ensure important constraints on the mathematical properties of the resulting spectral functions, namely positive semidefiniteness and Hermiticity. To improve on this, we present a full matrix formalism, where all matrix elements are treated simultaneously. We show the capabilities of these methods using insulating and metallic dynamical mean-field theory (DMFT) Green's functions as test cases. Finally, we apply the methods to realistic material calculations for LaTiO3, where off-diagonal matrix elements in the Green's function appear due to the distorted crystal structure.
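The underlying inversion problem and objective can be summarized as (standard MaxEnt notation; the paper's contribution modifies the entropy for possibly negative off-diagonal spectra):

    G_{ab}(i\omega_n) = \int d\omega \, \frac{A_{ab}(\omega)}{i\omega_n - \omega},
    \qquad
    \hat{A} = \operatorname*{arg\,max}_{A} \left[ \alpha S[A] - \tfrac{1}{2}\,\chi^{2}[A] \right],

where \chi^{2}[A] is the misfit to the Matsubara data and, for diagonal elements, S[A] = \int d\omega \left[ A - D - A \ln(A/D) \right] with default model D; off-diagonal elements require an entropy that admits negative A(\omega), which is the extension introduced here.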
BRST quantization of cosmological perturbations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Armendariz-Picon, Cristian; Şengör, Gizem
2016-11-08
BRST quantization is an elegant and powerful method to quantize theories with local symmetries. In this article we study the Hamiltonian BRST quantization of cosmological perturbations in a universe dominated by a scalar field, along with the closely related quantization method of Dirac. We describe how both formalisms apply to perturbations in a time-dependent background, and how expectation values of gauge-invariant operators can be calculated in the in-in formalism. Our analysis focuses mostly on the free theory. By appropriate canonical transformations we simplify and diagonalize the free Hamiltonian. BRST quantization in derivative gauges allows us to dramatically simplify the structure of the propagators, whereas Dirac quantization, which amounts to quantization in synchronous gauge, dispenses with the need to introduce ghosts and preserves the locality of the gauge-fixed action.
NASA Astrophysics Data System (ADS)
Chiron, L.; Oger, G.; de Leffe, M.; Le Touzé, D.
2018-02-01
While smoothed-particle hydrodynamics (SPH) simulations are usually performed using uniform particle distributions, local particle refinement techniques have been developed to concentrate fine spatial resolutions in identified areas of interest. Although the formalism of this method is relatively easy to implement, its robustness at coarse/fine interfaces can be problematic. Analysis performed in [16] shows that the radius of refined particles should be greater than half the radius of unrefined particles to ensure robustness. In this article, the basics of an Adaptive Particle Refinement (APR) technique, inspired by AMR in mesh-based methods, are presented. This approach ensures robustness with alleviated constraints. Simulations applying the new formalism proposed achieve accuracy comparable to fully refined spatial resolutions, together with robustness, low CPU times and maintained parallel efficiency.
Power Block Geometry Applied to the Building of Power Electronics Converters
ERIC Educational Resources Information Center
dos Santos, E. C., Jr.; da Silva, E. R. C.
2013-01-01
This paper proposes a new methodology, Power Block Geometry (PBG), for the presentation of power electronics topologies that process ac voltage. PBG's strategy uses formal methods based on a geometrical representation with particular rules and defines a universe with axioms and conjectures to establish a formation law. It allows power…
A Real-time Evaluation of Human-based Approaches to Safety Testing: What We Can Do Now (TDS)
Despite ever-increasing efforts in early safety assessment in all industries, there are still many chemicals that prove toxic in humans. While greater use of human in vitro test methods may serve to reduce this problem, the formal validation process applied to such tests represen...
Blending and nudging in fluid dynamics: some simple observations
NASA Astrophysics Data System (ADS)
Germano, M.
2017-10-01
Blending and nudging methods have been recently applied in fluid dynamics, particularly regarding the assimilation of experimental data into the computations. In the paper we formally derive the differential equation associated to blending and compare it to the standard nudging equation. Some simple considerations related to these techniques and their mutual relations are exposed.
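A back-of-the-envelope version of the comparison (our sketch, with notation assumed): nudging adds a relaxation term to the model,

    \frac{du}{dt} = F(u) - \lambda\,(u - u_{\mathrm{obs}}),

while blending periodically resets the state, u^{+} = (1-\alpha)\,u^{-} + \alpha\,u_{\mathrm{obs}} every \Delta t. For small \alpha, one blending step changes u by -\alpha\,(u - u_{\mathrm{obs}}), so over many steps blending acts, to leading order, like nudging with an effective coefficient \lambda \approx \alpha / \Delta t.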
Formally verifying Ada programs which use real number types
NASA Technical Reports Server (NTRS)
Sutherland, David
1986-01-01
Formal verification is applied to programs which use real number arithmetic operations (mathematical programs). Formal verification of a program P consists of creating a mathematical model of P, stating the desired properties of P in a formal logical language, and proving that the mathematical model has the desired properties using a formal proof calculus. The development and verification of the mathematical model are discussed.
Matching biomedical ontologies based on formal concept analysis.
Zhao, Mengyi; Zhang, Songmao; Li, Weizhuo; Chen, Guowei
2018-03-19
The goal of ontology matching is to identify correspondences between entities from different yet overlapping ontologies so as to facilitate semantic integration, reuse and interoperability. As a well-developed mathematical model for analyzing individuals and structuring concepts, Formal Concept Analysis (FCA) has been applied to ontology matching (OM) tasks since the beginning of OM research, whereas the ontological knowledge exploited in FCA-based methods is limited. This motivates the study in this paper, i.e., to empower FCA with as much ontological knowledge as possible for identifying mappings across ontologies. We propose a method based on Formal Concept Analysis to identify and validate mappings across ontologies, including one-to-one mappings, complex mappings and correspondences between object properties. Our method, called FCA-Map, incrementally generates a total of five types of formal contexts and extracts mappings from the lattices derived. First, the token-based formal context describes how class names, labels and synonyms share lexical tokens, leading to lexical mappings (anchors) across ontologies. Second, the relation-based formal context describes how classes are in taxonomic, partonomic and disjoint relationships with the anchors, leading to positive and negative structural evidence for validating the lexical matching. Third, the positive relation-based context can be used to discover structural mappings. Afterwards, the property-based formal context describes how object properties are used in axioms to connect anchor classes across ontologies, leading to property mappings. Last, the restriction-based formal context describes co-occurrence of classes across ontologies in anonymous ancestors of anchors, from which extended structural mappings and complex mappings can be identified. Evaluation on the Anatomy, the Large Biomedical Ontologies, and the Disease and Phenotype track of the 2016 Ontology Alignment Evaluation Initiative campaign demonstrates the effectiveness of FCA-Map and its competitiveness with the top-ranked systems. FCA-Map can achieve a better balance between precision and recall for large-scale domain ontologies through constructing multiple FCA structures, whereas it performs unsatisfactorily for smaller-sized ontologies with fewer lexical and semantic expressions. Compared with other FCA-based OM systems, the study in this paper is more comprehensive as an attempt to push the envelope of the Formal Concept Analysis formalism in ontology matching tasks. Five types of formal contexts are constructed incrementally, and their derived concept lattices are used to cluster the commonalities among classes at the lexical and structural levels, respectively. Experiments on large, real-world domain ontologies show promising results and reveal the power of FCA.
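A minimal sketch of the first, token-based context (class labels invented; the real system also builds the four relation-, property- and restriction-based contexts described above):

    # Hedged sketch: lexical anchors are cross-ontology class pairs whose
    # token sets (intents in the token-based formal context) coincide.
    onto1 = {"O1:HeartValve": {"heart", "valve"}, "O1:Aorta": {"aorta"}}
    onto2 = {"O2:ValveOfHeart": {"valve", "heart"},
             "O2:AorticVessel": {"aorta", "vessel"}}

    anchors = [(c1, c2)
               for c1, t1 in onto1.items()
               for c2, t2 in onto2.items()
               if t1 == t2]                  # identical token intents -> anchor
    print(anchors)                           # [('O1:HeartValve', 'O2:ValveOfHeart')]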
A new mathematical approach for shock-wave solution in a dusty plasma
DOE Office of Scientific and Technical Information (OSTI.GOV)
Das, G.C.; Dwivedi, C.B.; Talukdar, M.
1997-12-01
The problem of the nonlinear Burger equation in a plasma contaminated with heavy dust grains has been revisited. As discussed earlier [C. B. Dwivedi and B. P. Pandey, Phys. Plasmas 2, 9 (1995)], the Burger equation originates from dust charge fluctuation dynamics. A new alternative mathematical approach based on a simple traveling wave formalism has been applied to find the solution of the derived Burger equation, and the method recovers the known shock-wave solution. This technique, although having its own limitations, successfully predicts the salient features of the weak shock-wave structure in a dusty plasma with dust charge fluctuation dynamics. It is emphasized that this approach of the traveling wave formalism is applied here for the first time to solve the nonlinear wave equation in plasmas.
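For concreteness, the textbook traveling-wave reduction that such an approach recovers for the Burgers equation u_t + u\,u_x = \nu\,u_{xx} follows from setting u = U(\xi) with \xi = x - ct and integrating once:

    U(\xi) = c - a \tanh\!\left(\frac{a\,\xi}{2\nu}\right),
    \qquad
    c = \frac{u_{-} + u_{+}}{2}, \quad a = \frac{u_{-} - u_{+}}{2},

a monotone shock profile connecting the upstream and downstream states u_{\mp}; in the dusty-plasma setting, the charge-fluctuation physics enters through the effective dissipation coefficient \nu.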
δ M formalism: a new approach to cosmological perturbation theory in anisotropic inflation
NASA Astrophysics Data System (ADS)
Talebian-Ashkezari, A.; Ahmadi, N.; Abolhasani, A. A.
2018-03-01
We study the evolution of the metric perturbations in a Bianchi background in the long-wavelength limit. By applying the gradient expansion to the equations of motion, we exhibit a generalized "separate universe" approach to cosmological perturbation theory. Having found this consistent separate-universe picture, we introduce the δ M formalism for calculating the evolution of the linear tensor perturbations in anisotropic inflation models, in almost the same way that the so-called δ N formula is applied to the super-horizon dynamics of the curvature perturbations. Like its twin formula δ N, this new method can substantially reduce the amount of calculation related to the evolution of tensor modes. However, it is not as general as δ N: it is a "perturbative" formula and solves the shear only to linear order. In other words, it is restricted to the weak-shear limit.
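For comparison, the δ N formula that δ M is modeled on reads (standard notation):

    \zeta(t_{f}, \mathbf{x}) = \delta N = N_{,\phi}\,\delta\phi + \tfrac{1}{2}\, N_{,\phi\phi}\,\delta\phi^{2} + \cdots,

where N(\phi) counts the number of e-folds from an initial spatially flat slice to a final uniform-density slice; δ M plays the analogous bookkeeping role for the linear tensor (shear) modes in the anisotropic background.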
Quasipolynomial generalization of Lotka-Volterra mappings
NASA Astrophysics Data System (ADS)
Hernández-Bermejo, Benito; Brenig, Léon
2002-07-01
In recent years, it has been shown that Lotka-Volterra mappings constitute a valuable tool from both the theoretical and the applied points of view, with developments in very diverse fields such as physics, population dynamics, chemistry and economics. The purpose of this work is to demonstrate that many of the most important ideas and algebraic methods that constitute the basis of the quasipolynomial formalism (originally conceived for the analysis of ordinary differential equations) can be extended into the mapping domain. The extension of the formalism into the discrete-time context is remarkable insofar as the quasipolynomial methodology had never been shown to be applicable beyond the differential case. It will be demonstrated that Lotka-Volterra mappings play a central role in the quasipolynomial formalism for the discrete-time case. Moreover, the extension of the formalism into the discrete-time domain allows a significant generalization of Lotka-Volterra mappings as well as a whole transfer of algebraic methods into the discrete-time context. The result is a novel and more general conceptual framework for the understanding of Lotka-Volterra mappings as well as a new range of possibilities that become open not only for the theoretical analysis of Lotka-Volterra mappings and their generalizations, but also for the development of new applications.
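For reference, the discrete-time Lotka-Volterra mapping usually considered in this context has the exponential form (a hedged sketch; notation ours):

    x_i(n+1) = x_i(n)\, \exp\!\Big( \lambda_i + \sum_j A_{ij}\, x_j(n) \Big),

which reduces to the familiar Lotka-Volterra flow in the small-time-step limit and serves as the canonical form into which more general quasipolynomial mappings can be recast.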
Developing a Treatment Planning Software Based on TG-43U1 Formalism for Cs-137 LDR Brachytherapy.
Sina, Sedigheh; Faghihi, Reza; Soleimani Meigooni, Ali; Siavashpour, Zahra; Mosleh-Shirazi, Mohammad Amin
2013-08-01
The old Treatment Planning Systems (TPSs) used for intracavitary brachytherapy with the Cs-137 Selectron source utilize traditional dose calculation methods, considering each source as a point source. Using such methods introduces significant errors in dose estimation. As of 1995, TG-43 is used as the main dose calculation formalism in TPSs. The purpose of this study is to design and establish treatment planning software for the Cs-137 Selectron brachytherapy source, based on the TG-43U1 formalism and applying the effects of the applicator and dummy spacers. The two software packages used for treatment planning of Cs-137 sources in Iran (STPS and PLATO) are based on old formalisms. The purpose of this work is to establish and develop a TPS for the Selectron source based on the TG-43 formalism. In this planning system, the dosimetry parameters of each pellet at different positions inside the applicators were obtained with the MCNP4c code. The dose distribution around every combination of active and inactive pellets was then obtained by summing the doses. The accuracy of this algorithm was checked by comparing its results for special combinations of active and inactive pellets with MC simulations. Finally, the uncertainty of the old dose calculation formalism was investigated by comparing the results of the STPS and PLATO packages with those obtained by the new algorithm. For a typical arrangement of 10 active pellets in the applicator, the percentage difference between doses obtained by the new algorithm at 1 cm distance from the tip of the applicator and those obtained by the old formalisms is about 30%, while the difference between the results of MCNP and the new algorithm is less than 5%. According to the results, the old dosimetry formalisms overestimate the dose, especially towards the applicator's tip, while the TG-43U1-based software performs the calculations more accurately.
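The TG-43U1 dose-rate equation implemented per pellet has the standard form

    \dot{D}(r,\theta) = S_{K}\,\Lambda\, \frac{G_{L}(r,\theta)}{G_{L}(r_{0},\theta_{0})}\, g_{L}(r)\, F(r,\theta),

with air-kerma strength S_K, dose-rate constant \Lambda, line-source geometry function G_L, radial dose function g_L, anisotropy function F, and reference point (r_0, \theta_0) = (1 cm, \pi/2); the total dose for a pellet train is then the sum of such single-pellet contributions over the active pellets.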
NASA Technical Reports Server (NTRS)
Boulet, Christian; Ma, Qiancheng; Thibault, Franck
2014-01-01
A symmetrized version of the recently developed refined Robert-Bonamy formalism [Q. Ma, C. Boulet, and R. H. Tipping, J. Chem. Phys. 139, 034305 (2013)] is proposed. This model takes into account line coupling effects and hence allows the calculation of the off-diagonal elements of the relaxation matrix, without neglecting the rotational structure of the perturbing molecule. The formalism is applied to the isotropic Raman spectra of autoperturbed N2 for which a benchmark quantum relaxation matrix has recently been proposed. The consequences of the classical path approximation are carefully analyzed. Methods correcting for effects of inelasticity are considered. While in the right direction, these corrections appear to be too crude to provide off diagonal elements which would yield, via the sum rule, diagonal elements in good agreement with the quantum results. In order to overcome this difficulty, a re-normalization procedure is applied, which ensures that the off-diagonal elements do lead to the exact quantum diagonal elements. The agreement between the (re-normalized) semi-classical and quantum relaxation matrices is excellent, at least for the Raman spectra of N2, opening the way to the analysis of more complex molecular systems.
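The re-normalization is tied to the standard relaxation-matrix sum rule, schematically (our notation):

    \sum_{k} d_{k}\, W_{kj} = 0
    \quad\Longrightarrow\quad
    W_{jj} = -\,\frac{1}{d_{j}} \sum_{k \neq j} d_{k}\, W_{kj},

where the d_k are the line amplitudes; enforcing this constraint makes the semi-classical off-diagonal elements reproduce the exact quantum diagonal elements.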
Hamilton-Jacobi formalism to warm inflationary scenario
NASA Astrophysics Data System (ADS)
Sayar, K.; Mohammadi, A.; Akhtari, L.; Saaidi, Kh.
2017-01-01
The Hamilton-Jacobi formalism, as a powerful method, is utilized to reconsider the warm inflationary scenario, where the scalar field, as the main component driving inflation, interacts with other fields. Separating the context into strong and weak dissipative regimes, the analysis is carried out for two popular functional forms of Γ. Applying the slow-roll approximation, the required perturbation parameters are extracted and, by comparison with the latest Planck data, the free parameters are constrained. The possibility of producing an acceptable inflation is studied; the results show that for all cases the model successfully reproduces the amplitude of the scalar perturbation, the scalar spectral index, its running, and the tensor-to-scalar ratio.
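In the Hamilton-Jacobi picture the Hubble rate is treated as a function of the field, H = H(\phi), and the warm-inflation background equations take the standard form (quoted here as a hedged orientation):

    \dot{\phi} = -\,\frac{2 M_{p}^{2}\, H'(\phi)}{1 + Q},
    \qquad
    Q \equiv \frac{\Gamma}{3H},

so the strong (Q \gg 1) and weak (Q \ll 1) dissipative regimes simply rescale the field velocity, and the slow-roll parameters are built from derivatives of H(\phi).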
Tree-oriented interactive processing with an application to theorem-proving, appendix E
NASA Technical Reports Server (NTRS)
Hammerslag, David; Kamin, Samuel N.; Campbell, Roy H.
1985-01-01
The concept of unstructured structure editing is described, along with ted, an editor for unstructured trees. Ted is used to manipulate hierarchies of information in an unrestricted manner. The tool was implemented and applied to the problem of organizing formal proofs. As a proof management tool, it maintains the validity of a proof and its constituent lemmas independently of the methods used to validate the proof. It includes an adaptable interface which may be used to invoke theorem provers and other aids to proof construction. Using ted, a user may construct, maintain, and verify formal proofs using a variety of theorem provers, proof checkers, and formatters.
A discriminatory function for prediction of protein-DNA interactions based on alpha shape modeling.
Zhou, Weiqiang; Yan, Hong
2010-10-15
Protein-DNA interaction is of significant importance in many biological processes. However, the underlying principle of the molecular recognition process is still largely unknown. As more high-resolution 3D structures of protein-DNA complexes become available, the surface characteristics of the complex become an important research topic. In our work, we apply an alpha shape model to represent the surface structure of the protein-DNA complex and develop an interface-atom curvature-dependent conditional probability discriminatory function for the prediction of protein-DNA interaction. The interface-atom curvature-dependent formalism captures atomic interaction details better than the atomic distance-based method. The proposed method provides good performance in discriminating the native structures from the docking decoy sets, and outperforms the distance-dependent formalism in terms of the z-score. Computer experiment results show that the curvature-dependent formalism with the optimal parameters can achieve a native z-score of -8.17 in discriminating the native structure from the highest surface-complementarity scored decoy set and a native z-score of -7.38 in discriminating the native structure from the lowest RMSD decoy set. The interface-atom curvature-dependent formalism can also be used to predict the apo version of DNA-binding proteins. These results suggest that the interface-atom curvature-dependent formalism has a good prediction capability for protein-DNA interactions. The code and data sets are available for download at http://www.hy8.com/bioinformatics.htm. Contact: kenandzhou@hotmail.com.
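A hedged sketch of a curvature-binned, knowledge-based pair score in the spirit described (bin edges, smoothing, and sign convention are our assumptions, not the paper's parameters):

    import math
    from collections import Counter

    EDGES = [-0.5, 0.0, 0.5]            # assumed curvature bin edges

    def cbin(kappa):
        return sum(kappa > e for e in EDGES)

    def score(pairs, observed: Counter, reference: Counter, v: int = 1000):
        """pairs: (protein_atom_type, dna_atom_type, kappa_p, kappa_d) tuples.
        observed/reference: pair counts from native complexes / decoys.
        v: assumed vocabulary size for add-one smoothing."""
        n_obs, n_ref = sum(observed.values()), sum(reference.values())
        s = 0.0
        for ap, ad, kp, kd in pairs:
            key = (ap, ad, cbin(kp), cbin(kd))
            p_obs = (observed[key] + 1) / (n_obs + v)
            p_ref = (reference[key] + 1) / (n_ref + v)
            s -= math.log(p_obs / p_ref)   # more negative = more native-like
        return s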
Guidance for Using Formal Methods in a Certification Context
NASA Technical Reports Server (NTRS)
Brown, Duncan; Delseny, Herve; Hayhurst, Kelly; Wiels, Virginie
2010-01-01
This paper discusses some of the challenges to using formal methods in a certification context and describes the effort by the Formal Methods Subgroup of RTCA SC-205/EUROCAE WG-71 to propose guidance to make the use of formal methods a recognized approach. This guidance, expected to take the form of a Formal Methods Technical Supplement to DO-178C/ED-12C, is described, including the activities that are needed when using formal methods, new or modified objectives with respect to the core DO-178C/ED-12C document, and evidence needed for meeting those objectives.
ERIC Educational Resources Information Center
Jerez Gomez, Maximo J.
Divided into two areas of emphasis, this paper explores the potential of non-formal education in developing countries and non-formal education as it relates to the Dominican Republic. The first section presents background material on non-formal education and discusses types of programs being applied in a number of countries throughout the world.…
Applying Chomsky's Linguistic Methodology to the Clinical Interpretation of Symbolic Play.
ERIC Educational Resources Information Center
Ariel, Shlomo
This paper summarizes how Chomsky's methodological principles of linguistics may be applied to the clinical interpretation of children's play. Based on Chomsky's derivation of a "universal grammar" (the set of essential, formal, and substantive traits of any human language), a number of hypothesized formal universals of…
Collective coordinates and constrained hamiltonian systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dayi, O.F.
1992-07-01
A general method of incorporating collective coordinates (transformation of fields into an overcomplete basis) with constrained Hamiltonian systems is given, where the original phase space variables and collective coordinates can be bosonic and/or fermionic. This method is illustrated by applying it to the SU(2) Yang-Mills-Higgs theory, and its BFV-BRST quantization is discussed. Moreover, this formalism is used to give a systematic way of converting second class constraints into effectively first class ones, by considering second class constraints as first class constraints and gauge fixing conditions. This approach is applied to the massive superparticle, the Proca Lagrangian, and some topological quantum field theories.
The VATES-Diamond as a Verifier's Best Friend
NASA Astrophysics Data System (ADS)
Glesner, Sabine; Bartels, Björn; Göthel, Thomas; Kleine, Moritz
Within a model-based software engineering process it needs to be ensured that properties of abstract specifications are preserved by transformations down to executable code. This is even more important in the area of safety-critical real-time systems where additionally non-functional properties are crucial. In the VATES project, we develop formal methods for the construction and verification of embedded systems. We follow a novel approach that allows us to formally relate abstract process algebraic specifications to their implementation in a compiler intermediate representation. The idea is to extract a low-level process algebraic description from the intermediate code and to formally relate it to previously developed abstract specifications. We apply this approach to a case study from the area of real-time operating systems and show that this approach has the potential to seamlessly integrate modeling, implementation, transformation and verification stages of embedded system development.
NASA Astrophysics Data System (ADS)
Bellver, Fernando Gimeno; Garratón, Manuel Caravaca; Soto Meca, Antonio; López, Juan Antonio Vera; Guirao, Juan L. G.; Fernández-Martínez, Manuel
In this paper, we explore the chaotic behavior of resistively and capacitively shunted Josephson junctions via the so-called Network Simulation Method. This numerical approach establishes a formal equivalence between physical transport processes and electrical networks, and hence it can be applied to deal efficiently with a wide range of differential systems. The generality underlying that electrical equivalence allows circuit theory to be applied to several scientific and technological problems. In this work, the Fast Fourier Transform has been applied for chaos detection purposes, and the calculations have been carried out in PSpice, an electrical circuit simulation package. Overall, this numerical approach makes it possible to solve Josephson differential models quickly. An empirical application regarding the study of the Josephson model completes the paper.
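A minimal re-creation of the workflow in Python rather than PSpice (parameter values are illustrative): integrate the dimensionless RCSJ junction equation and inspect the FFT of the voltage, where a broadband spectrum signals chaos.

    import numpy as np
    from scipy.integrate import solve_ivp

    # beta_c * phi'' + phi' + sin(phi) = i_dc + i_ac * sin(Omega * tau)
    beta_c, i_dc, i_ac, Omega = 25.0, 0.3, 0.7, 0.6   # assumed junction/drive values

    def rcsj(tau, y):
        phi, v = y                                     # v = dphi/dtau (junction voltage)
        return [v, (i_dc + i_ac * np.sin(Omega * tau) - v - np.sin(phi)) / beta_c]

    tau = np.linspace(0, 2000, 2**15)
    sol = solve_ivp(rcsj, (tau[0], tau[-1]), [0.0, 0.0], t_eval=tau, rtol=1e-8)

    v = sol.y[1][len(tau) // 2:]                       # discard the transient
    spectrum = np.abs(np.fft.rfft(v - v.mean()))
    freqs = np.fft.rfftfreq(v.size, d=tau[1] - tau[0])
    # A broadband spectrum (no dominant discrete lines) indicates chaos.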
A formal approach to the analysis of clinical computer-interpretable guideline modeling languages.
Grando, M Adela; Glasspool, David; Fox, John
2012-01-01
To develop proof strategies to formally study the expressiveness of workflow-based languages, and to investigate their applicability to clinical computer-interpretable guideline (CIG) modeling languages. We propose two strategies for studying the expressiveness of workflow-based languages based on a standard set of workflow patterns expressed as Petri nets (PNs) and notions of congruence and bisimilarity from process calculus. Proof that a PN-based pattern P can be expressed in a language L can be carried out semi-automatically. Proof that a language L cannot provide the behavior specified by a PN P requires proof by exhaustion based on analysis of cases and cannot be performed automatically. The proof strategies are generic, but we exemplify their use with a particular CIG modeling language, PROforma. To illustrate the method we evaluate the expressiveness of PROforma against three standard workflow patterns and compare our results with a previous similar but informal comparison. We show that the two proof strategies are effective in evaluating a CIG modeling language against standard workflow patterns. We find that using the proposed formal techniques we obtain different results from a comparable previously published but less formal study. We discuss the utility of these analyses as the basis for principled extensions to CIG modeling languages. Additionally we explain how the same proof strategies can be reused to prove the satisfaction of patterns expressed in the declarative language CIGDec. The proof strategies we propose are useful tools for analysing the expressiveness of CIG modeling languages. This study provides good evidence of the benefits of applying formal methods of proof over semi-formal ones.
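A small sketch of the Petri-net machinery underlying such patterns (toy net, invented for illustration; "exclusive choice" is one of the standard workflow patterns referred to):

    from collections import Counter

    class PetriNet:
        def __init__(self, pre, post, marking):
            self.pre, self.post = pre, post        # transition -> Counter of places
            self.m = Counter(marking)

        def enabled(self, t):
            return all(self.m[p] >= n for p, n in self.pre[t].items())

        def fire(self, t):
            assert self.enabled(t), f"{t} not enabled"
            self.m -= self.pre[t]
            self.m += self.post[t]

    # Exclusive choice: a token in p0 enables both t1 and t2, but firing
    # either transition consumes the shared token and disables the other.
    net = PetriNet(pre={"t1": Counter({"p0": 1}), "t2": Counter({"p0": 1})},
                   post={"t1": Counter({"p1": 1}), "t2": Counter({"p2": 1})},
                   marking={"p0": 1})
    net.fire("t1")
    print(net.enabled("t2"))   # False: the choice was exclusive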
Formal verification of AI software
NASA Technical Reports Server (NTRS)
Rushby, John; Whitehurst, R. Alan
1989-01-01
The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.
NASA Astrophysics Data System (ADS)
Bommier, Véronique
2017-11-01
Context. In previous papers of this series, we presented a formalism able to account for both the statistical equilibrium of a multilevel atom and coherent and incoherent scattering (partial redistribution). Aims: This paper provides theoretical expressions of the redistribution function for the two-term atom. This redistribution function includes both coherent (RII) and incoherent (RIII) scattering contributions with their branching ratios. Methods: The expressions were derived by applying the formalism outlined above. The statistical equilibrium equation for the atomic density matrix is first formally solved in the case of the two-term atom with unpolarized and infinitely sharp lower levels. Then the redistribution function is derived by substituting this solution into the expression for the emissivity. Results: Expressions are provided for both magnetic and non-magnetic cases. Atomic fine structure is taken into account. Expressions are also provided separately for zero and non-zero hyperfine structure. Conclusions: Redistribution functions are widely used in radiative transfer codes. In our formulation, collisional transitions between Zeeman sublevels within an atomic level (the depolarizing collisions effect) are taken into account when possible (i.e., in the non-magnetic case). However, the need for a formal solution of the statistical equilibrium as a preliminary step prevents us from taking into account collisional transfers between the levels of the upper term. Accounting for these collisional transfers could be done via a numerical solution of the statistical equilibrium equation system.
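In scalar form, redistribution functions of this kind are conventionally written as a branching between the coherent and incoherent limits (schematic; the paper derives the full polarized, multi-term version):

    R(\nu', \nu) = \gamma\, R_{\mathrm{II}}(\nu', \nu) + (1 - \gamma)\, R_{\mathrm{III}}(\nu', \nu),

with a branching ratio \gamma set by the radiative and collisional rates, e.g. \gamma = \Gamma_{R} / (\Gamma_{R} + \Gamma_{I} + \Gamma_{E}) in common notation.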
BRST Quantization of the Proca Model Based on the BFT and the BFV Formalism
NASA Astrophysics Data System (ADS)
Kim, Yong-Wan; Park, Mu-In; Park, Young-Jai; Yoon, Sean J.
The BRST quantization of the Abelian Proca model is performed using the Batalin-Fradkin-Tyutin and the Batalin-Fradkin-Vilkovisky formalisms. First, the BFT Hamiltonian method is applied in order to systematically convert the second class constraint system of the model into an effectively first class one by introducing new fields. In finding the involutive Hamiltonian we adopt a new approach which is simpler than the usual one. We also show that in our model the Dirac brackets of the phase space variables in the original second class constraint system are exactly the same as the Poisson brackets of the corresponding modified fields in the extended phase space, owing to the linear character of the constraints, in contrast with the Dirac or Faddeev-Jackiw formalisms. Then, following the BFV formalism, we show that the resulting Lagrangian preserving BRST symmetry in the standard local gauge fixing procedure naturally includes the Stückelberg scalar related to the explicit gauge symmetry breaking caused by the mass term. We also analyze the nonstandard nonlocal gauge fixing procedure.
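Concretely, the Proca model's second-class pair and its Stückelberg-extended Lagrangian are (standard results, quoted here for orientation):

    \Omega_{1} = \pi^{0} \approx 0,
    \qquad
    \Omega_{2} = \partial_{i}\pi^{i} + m^{2} A^{0} \approx 0,

    \mathcal{L}_{\mathrm{St}} = -\tfrac{1}{4} F_{\mu\nu} F^{\mu\nu}
        + \tfrac{m^{2}}{2} \Big( A_{\mu} + \tfrac{1}{m}\, \partial_{\mu}\theta \Big)^{2},

where the new scalar \theta restores the gauge symmetry A_\mu \to A_\mu + \partial_\mu \Lambda, \theta \to \theta - m\Lambda that the mass term had broken.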
NASA Astrophysics Data System (ADS)
Randler, Christoph; Kummer, Barbara; Wilhelm, Christian
2012-06-01
The aim of this study was to assess the outcome of a zoo visit in terms of learning and retention of knowledge concerning the adaptations and behavior of vertebrate species. The work was based on the concept of implementing zoo visits as an out-of-school setting for formal, curriculum-based learning. Our theoretical framework centers on self-determination theory; we therefore used a group-based, hands-on learning environment. To address these questions, we used a treatment-control design (BACI) with different treatments and a control group. Pre-, post- and retention tests were applied. All treatments led to a substantial increase in learning and knowledge retention compared to the control group. Immediately after the zoo visit, the zoo-guide tour produced the highest scores, while after a delay of 6 weeks, the learner-centered environment combined with a teacher-guided summary scored best. We suggest incorporating the zoo as an out-of-school environment into formal school learning, and we propose different methods to improve learning in zoo settings.
Generalized Bondi-Sachs equations for characteristic formalism of numerical relativity
NASA Astrophysics Data System (ADS)
Cao, Zhoujian; He, Xiaokai
2013-11-01
The Cauchy formalism of numerical relativity has been successfully applied to simulate various dynamical spacetimes without any symmetry assumption. But how to set a mathematically consistent and physically realistic boundary condition is still an open problem for the Cauchy formalism. In addition, numerical truncation error and the finite-region ambiguity affect the accuracy of gravitational waveform calculation. Regarding the finite-region ambiguity, the characteristic extraction method helps much, but it does not solve all of the above issues. Beyond these problems, computational efficiency is another concern for the Cauchy formalism. Although the characteristic formalism of numerical relativity suffers from the difficulty of caustics in the inner near zone, it has advantages with respect to all of the issues listed above. Cauchy-characteristic matching (CCM) is a possible way to exploit these advantages and to treat the inner caustics at the same time, but CCM has difficulty treating the gauge difference between the Cauchy part and the characteristic part. We propose generalized Bondi-Sachs equations for the characteristic formalism, aimed at Cauchy-characteristic matching. Our proposal admits the same numerical evolution scheme for both the Cauchy part and the characteristic part, and the generalized Bondi-Sachs equations retain one adjustable gauge freedom that can be used to match the gauge used in the Cauchy part. These equations can therefore make the Cauchy part and the characteristic part share a consistent gauge condition, giving a possible new starting point for Cauchy-characteristic matching.
Informal Theory: The Ignored Link in Theory-to-Practice
ERIC Educational Resources Information Center
Love, Patrick
2012-01-01
Applying theory to practice in student affairs is dominated by the assumption that formal theory is directly applied to practice. Among the problems with this assumption is that many practitioners believe they must choose between their lived experiences and formal theory, and that graduate students are taught that their experience "does not…
NASA Astrophysics Data System (ADS)
Song, Linze; Shi, Qiang
2017-02-01
We present a theoretical approach to study nonequilibrium quantum heat transport in molecular junctions described by a spin-boson type model. Based on the Feynman-Vernon path integral influence functional formalism, expressions for the average value and high-order moments of the heat current operators are derived, which are further obtained directly from the auxiliary density operators (ADOs) in the hierarchical equations of motion (HEOM) method. Distribution of the heat current is then derived from the high-order moments. As the HEOM method is nonperturbative and capable of treating non-Markovian system-environment interactions, the method can be applied to various problems of nonequilibrium quantum heat transport beyond the weak coupling regime.
NASA Technical Reports Server (NTRS)
Puliafito, E.; Bevilacqua, R.; Olivero, J.; Degenhardt, W.
1992-01-01
The formal retrieval error analysis of Rodgers (1990) allows the quantitative determination of such retrieval properties as measurement error sensitivity, resolution, and inversion bias. This technique was applied to five numerical inversion techniques and two nonlinear iterative techniques used for the retrieval of middle atmospheric constituent concentrations from limb-scanning millimeter-wave spectroscopic measurements. It is found that the iterative methods have better vertical resolution, but are slightly more sensitive to measurement error than the constrained matrix methods. The iterative methods converge to the exact solution, whereas two of the matrix methods under consideration have an explicit constraint: the sensitivity of the solution to the a priori profile. Tradeoffs among these retrieval characteristics are presented.
Gender and ergonomics: a case study on the 'non-formal' work of women nurses.
Salerno, Silvana; Livigni, Lucilla; Magrini, Andrea; Talamanca, Irene Figà
2012-01-01
Women's work activities are often characterised by 'non-formal actions' (such as giving support). Gender differences in ergonomics may be due to this peculiarity. We applied the method of organisational congruencies (MOC) to ascertain the 'non-formal' portion of the work of nurses employed in three hospital units (haematology, emergency room and general medicine) during the three work shifts in a major university hospital in Rome, Italy. We recorded a total of 802 technical actions performed by nine nurses in 72 h of work. Twenty-six percent of the actions in direct patient care were communicative actions (mainly giving psychological support) while providing physical care. These 'double actions' are often not considered to be a formal part of the job by hospital management. In our case study, the 'non-formal' work of nurses (psychological support) is mainly represented by double actions performed while taking physical care of the patients. The dual task paradigm in gender-oriented research is discussed in terms of its implications for prevention in occupational health. The main purpose of the study was to assess all the formal and non-formal activities of women in the nursing work setting. Offering psychological support to patients is often not considered to be a formal part of the job. Our case study found that nurses receive no explicit guidelines on this activity and no time is assigned to perform it. In measuring the burden of providing psychological support to patients, we found that this is often done while nurses are performing tasks of physical care for the patients (double actions). The article discusses the significance of the non-formal psychological workload of women nurses through double actions from the ergonomic point of view.
The value of a year's general education for reducing the symptoms of dementia.
Brent, Robert J
2018-01-01
We present a method for estimating the benefits of years of education for reducing dementia symptoms, based on the cost savings that would accrue from continuing independent living rather than relying on formal or informal carers. Our method involves three steps: first, taking a year of education and seeing how much this lowers dementia; second, using this dementia reduction to estimate how much independent living is affected; and third, applying the change in caregiving costs associated with the independent living change. We apply the method to a National Alzheimer's Coordinating Center sample of 17,239 participants at 32 US Alzheimer's disease centres between September 2005 and May 2015.
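To make the three-step benefit chain concrete, here is a toy calculation in Python. Every coefficient (the symptom-score slope, the independent-living slope, the annual care cost) is invented for illustration and is not taken from the paper, which estimates these quantities from the NACC data.

```python
# Hypothetical illustration of the three-step benefit chain described above.
# All coefficients are made up for illustration; the paper estimates them
# from the National Alzheimer's Coordinating Center sample.

def education_benefit(extra_years_education: float) -> float:
    """Return the annual caregiving-cost saving (USD) for extra education years."""
    # Step 1: one extra year of education lowers the dementia symptom score.
    symptom_reduction = 0.10 * extra_years_education          # hypothetical slope
    # Step 2: a lower symptom score raises the probability of independent living.
    independent_living_gain = 0.25 * symptom_reduction        # hypothetical slope
    # Step 3: price that gain at the annual cost of formal/informal care.
    annual_care_cost = 50_000.0                               # hypothetical USD
    return independent_living_gain * annual_care_cost

print(f"Benefit of one extra year: ${education_benefit(1.0):,.0f} per year")
```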
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ritboon, Atirach, E-mail: atirach.3.14@gmail.com; Department of Physics, Faculty of Science, Prince of Songkla University, Hat Yai 90112; Daengngam, Chalongrat, E-mail: chalongrat.d@psu.ac.th
2016-08-15
Białynicki-Birula introduced a photon wave function similar to the matter wave function that satisfies the Schrödinger equation. Its second-quantized form can be applied to investigate nonlinear optics at a nearly full quantum level. In this paper, we applied the photon wave function formalism to analyze both linear optical processes in the well-known Mach–Zehnder interferometer and nonlinear optical processes for sum-frequency generation in a dispersive and lossless medium. Results from the photon wave function formalism agree with the well-established Maxwell treatments and with existing experimental verifications.
A Multiobjective Approach Applied to the Protein Structure Prediction Problem
2002-03-07
...like a low energy search landscape. 2.1.1 Symbolic/Formalized Problem Domain Description: every computer representable problem can also be embodied... method [60]. 3.4 Energy Minimization Methods: the energy landscape algorithms are based on the idea that a protein's final resting conformation is... in our GA used to search the PSP problem energy landscape. 3.5.1 Simple GA: the main routine in a sGA, after encoding the problem, builds a...
NASA Astrophysics Data System (ADS)
Nielsen, N. K.; Quaade, U. J.
1995-07-01
The physical phase space of the relativistic top, as defined by Hansson and Regge, is expressed in terms of canonical coordinates of the Poincaré group manifold. The system is described in the Hamiltonian formalism by the mass-shell condition and constraints that reduce the number of spin degrees of freedom. The constraints are second class and are modified into a set of first class constraints by adding combinations of gauge-fixing functions. The Batalin-Fradkin-Vilkovisky method is then applied to quantize the system in the path integral formalism in Hamiltonian form. It is finally shown that different gauge choices produce different equivalent forms of the constraints.
Bubbles in extended inflation and multi-production of universes
NASA Astrophysics Data System (ADS)
Sakai, Nobuyuki; Maeda, Kei-ichi
Developing the thin-wall method of Israel, we present a formalism to investigate bubble dynamics in generalized Einstein theories. We derive the equations of motion for a bubble, finding that the space-time inside a bubble is always inhomogeneous. Applying this formalism to extended inflation, we find the following two results: (1) any true vacuum bubble expands, contrary to the results of Goldwirth and Zaglauer, who claim that initially created bubbles later collapse; we show that their initial conditions for collapsing bubbles are physically inconsistent. (2) Concerning the global space-time structure of the Universe in extended inflation, we show that wormholes are produced as in old inflation, resulting in the multi-production of universes.
An Introduction to the BFS Method and Its Use to Model Binary NiAl Alloys
NASA Technical Reports Server (NTRS)
Bozzolo, Guillermo; Noebe, Ronald D.; Ferrante, J.; Amador, C.
1998-01-01
We introduce the Bozzolo-Ferrante-Smith (BFS) method for alloys as a computationally efficient tool for aiding in the process of alloy design. An intuitive description of the BFS method is provided, followed by a formal discussion of its implementation. The method is applied to the study of the defect structure of NiAl binary alloys. The groundwork is laid for a detailed progression to higher order NiAl-based alloys linking theoretical calculations and computer simulations based on the BFS method and experimental work validating each step of the alloy design process.
Using formal methods for content validation of medical procedure documents.
Cota, Érika; Ribeiro, Leila; Bezerra, Jonas Santos; Costa, Andrei; da Silva, Rosiana Estefane; Cota, Gláucia
2017-08-01
We propose the use of a formal approach to support content validation of a standard operating procedure (SOP) for a therapeutic intervention. Such an approach provides a useful tool to identify ambiguities, omissions and inconsistencies, and improves the applicability and efficacy of documents in health settings. We apply and evaluate a methodology originally proposed for the verification of software specification documents to a specific SOP. The verification methodology uses the graph formalism to model the document. Semi-automatic analysis identifies possible problems in the model and in the original document. The verification is an iterative process that identifies possible faults in the original text to be revised by its authors and/or specialists. The proposed method was able to identify 23 possible issues in the original document (ambiguities, omissions, redundant information, and inaccuracies, among others). The formal verification process aided the specialists to consider a wider range of usage scenarios and to identify which instructions form the kernel of the proposed SOP and which ones represent additional or required knowledge that is mandatory for the correct application of the medical document. By using the proposed verification process, a simpler and yet more complete SOP could be produced. As a consequence, during the validation process the experts received a more mature document and could focus on the technical aspects of the procedure itself.
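As a rough illustration of graph-based document checking, the sketch below models an SOP as a directed graph and flags unreachable steps and unlabeled branches. The step names and the two heuristic checks are assumptions, far simpler than the semi-automatic analysis the authors describe.

```python
# A minimal sketch of graph-based SOP checks, loosely inspired by the
# approach above. Step names and the specific checks are illustrative.
import networkx as nx

sop = nx.DiGraph()
sop.add_edges_from([
    ("start", "prepare dose"),
    ("prepare dose", "check allergy"),
    ("check allergy", "administer"),
    ("check allergy", "suspend treatment"),   # branch: ambiguous if unlabeled
    ("administer", "monitor"),
])
sop.add_node("document reaction")             # mentioned in text, never linked

unreachable = [n for n in sop.nodes if n != "start"
               and not nx.has_path(sop, "start", n)]
ambiguous = [n for n in sop.nodes if sop.out_degree(n) > 1]
print("possible omissions (unreachable steps):", unreachable)
print("possible ambiguities (unlabeled branches):", ambiguous)
```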
Improving basic math skills through integrated dynamic representation strategies.
González-Castro, Paloma; Cueli, Marisol; Cabeza, Lourdes; Álvarez-García, David; Rodríguez, Celestino
2014-01-01
In this paper, we analyze the effectiveness of the Integrated Dynamic Representation strategy (IDR) for developing basic math skills. The study involved 72 students, aged between 6 and 8 years. We compared the development of informal basic skills (numbers, comparison, informal calculation, and informal concepts) and formal basic skills (conventionalisms, number facts, formal calculus, and formal concepts) in an experimental group (n = 35), where we applied the IDR strategy, and in a control group (n = 37), in order to identify the impact of the procedure. The experimental group improved significantly in all variables except for number facts and formal calculus. It can therefore be concluded that IDR favors the development of the skills more closely related to applied mathematics than of those related to automatic mathematics and mental arithmetic.
ERIC Educational Resources Information Center
Thiem, Alrik
2017-01-01
The search for necessary and sufficient causes of some outcome of interest, referred to as "configurational comparative research," has long been one of the main preoccupations of evaluation scholars and practitioners. However, only the last three decades have witnessed the evolution of a set of formal methods that are sufficiently…
Fermionic Tunneling Effect and Hawking Radiation in a Non Commutative FRW Universe
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bouhalouf, H.; Aissaoui, H.; Mebarki, N.
2010-10-31
The formalism of a non commutative gauge gravity is applied to an FRW universe and the corresponding modified metric, vierbein and spin connection components are obtained. Moreover, using the Hamilton-Jacobi method and as a pure space-time deformation effect, the NCG Hawking radiation via a fermionic tunneling transition through the dynamical NCG horizon is also studied.
Recent trends related to the use of formal methods in software engineering
NASA Technical Reports Server (NTRS)
Prehn, Soren
1986-01-01
An account is given of some recent developments and trends related to the development and use of formal methods in software engineering. The focus is on ongoing activities in Europe, since there seems to be a notable difference in attitude towards industrial usage of formal methods in Europe and in the U.S. A more detailed account is given of the currently most widespread formal method in Europe: the Vienna Development Method. Finally, the use of Ada is discussed in relation to the application of formal methods, and the potential for constructing Ada-specific tools based on that method is considered.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boulet, Christian, E-mail: Christian.boulet@u-psud.fr; Ma, Qiancheng; Thibault, Franck
A symmetrized version of the recently developed refined Robert-Bonamy formalism [Q. Ma, C. Boulet, and R. H. Tipping, J. Chem. Phys. 139, 034305 (2013)] is proposed. This model takes into account line coupling effects and hence allows the calculation of the off-diagonal elements of the relaxation matrix, without neglecting the rotational structure of the perturbing molecule. The formalism is applied to the isotropic Raman spectra of autoperturbed N₂, for which a benchmark quantum relaxation matrix has recently been proposed. The consequences of the classical path approximation are carefully analyzed. Methods correcting for effects of inelasticity are considered. While in the right direction, these corrections appear to be too crude to provide off-diagonal elements which would yield, via the sum rule, diagonal elements in good agreement with the quantum results. In order to overcome this difficulty, a re-normalization procedure is applied, which ensures that the off-diagonal elements do lead to the exact quantum diagonal elements. The agreement between the (re-normalized) semi-classical and quantum relaxation matrices is excellent, at least for the Raman spectra of N₂, opening the way to the analysis of more complex molecular systems.
Master Logic Diagram: method for hazard and initiating event identification in process plants.
Papazoglou, I A; Aneziris, O N
2003-02-28
Master Logic Diagram (MLD), a method for identifying events initiating accidents in chemical installations, is presented. MLD is a logic diagram that resembles a fault tree but without the formal mathematical properties of the latter. MLD starts with a Top Event "Loss of Containment" and decomposes it into simpler contributing events. A generic MLD has been developed which may be applied to all chemical installations storing toxic and/or flammable substances. The method is exemplified through its application to an ammonia storage facility.
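A minimal sketch of the decomposition idea follows, with invented event names: the top event is broken into contributing events, and the leaves are read off as candidate initiating events.

```python
# A toy sketch of Master Logic Diagram decomposition: the top event is
# broken into simpler contributing events, with the leaves treated as
# candidate initiating events. Event names are invented for illustration.
mld = {
    "Loss of Containment": {
        "Release from tank": {
            "Tank rupture": {},
            "Overfilling": {},
        },
        "Release from piping": {
            "Pipe corrosion": {},
            "Flange leak": {},
        },
    }
}

def initiating_events(tree: dict) -> list[str]:
    events = []
    for event, sub in tree.items():
        if not sub:                  # a leaf: no further decomposition
            events.append(event)
        else:
            events.extend(initiating_events(sub))
    return events

print(initiating_events(mld))
```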
The CanMEDS role of Collaborator: How is it taught and assessed according to faculty and residents?
Berger, Elizabeth; Chan, Ming-Ka; Kuper, Ayelet; Albert, Mathieu; Jenkins, Deirdre; Harrison, Megan; Harris, Ilene
2012-01-01
OBJECTIVE: To explore the perspectives of paediatric residents and faculty regarding how the Collaborator role is taught and assessed. METHODS: Using a constructivist grounded theory approach, focus groups at four Canadian universities were conducted. Data were analyzed iteratively for emergent themes. RESULTS: Residents reported learning about collaboration through faculty role modelling but did not perceive that it was part of the formal curriculum. Faculty reported that they were not trained in how to effectively model this role. Both groups reported a need for training in conflict management, particularly as it applies to intraprofessional (physician-to-physician) relationships. Finally, the participants asserted that current methods to assess residents on their performance as collaborators are suboptimal. CONCLUSIONS: The Collaborator role should be a formal part of the residency curriculum. Residents need to be better educated with regard to managing conflict and handling intraprofessional relationships. Finally, innovative methods of assessing residents on this non-medical expert role need to be created. PMID:24294063
Whatever Happened to Formal Methods for Security?
Voas, J; Schaffer, K
2016-08-01
We asked 7 experts 7 questions to find out what has occurred recently in terms of applying formal methods (FM) to security-centric, cyber problems. We are continually reminded of the 1996 paper by Tony Hoare "How did Software Get So Reliable Without Proof?" [1] In that vein, how did we get so insecure with proof? Given daily press announcements concerning new malware, data breaches, and privacy loss, is FM still relevant or was it ever? Our experts answered with unique personal insights. We were curious as to whether this successful methodology in "safety-critical" has succeeded as well for today's "build it, hack it, patch it" mindset. Our experts were John McLean (Naval Research Labs), Paul Black (National Institute of Standards and Technology), Karl Levitt (University of California at Davis), Joseph Williams (CloudEconomist.Com), Connie Heitmeyer (Naval Research Labs), Eugene Spafford (Purdue University), and Joseph Kiniry (Galois, Inc.). The questions and responses follow.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hartmann, Anja, E-mail: hartmann@ipk-gatersleben.de; Schreiber, Falk; Martin-Luther-University Halle-Wittenberg, Halle
The characterization of biological systems with respect to their behavior and functionality based on versatile biochemical interactions is a major challenge. To understand these complex mechanisms at the systems level, modeling approaches are investigated. Different modeling formalisms allow metabolic models to be analyzed depending on the question to be solved, the biochemical knowledge, and the availability of experimental data. Here, we describe a method for an integrative analysis of the structure and dynamics represented by qualitative and quantitative metabolic models. Using various formalisms, the metabolic model is analyzed from different perspectives. Determined structural and dynamic properties are visualized in the context of the metabolic model. Interaction techniques allow the exploration and visual analysis, thereby leading to a broader understanding of the behavior and functionality of the underlying biological system. The System Biology Metabolic Model Framework (SBM² Framework) implements the developed method and, as an example, is applied for the integrative analysis of the crop plant potato.
Wave excitation at Lindblad resonances using the method of multiple scales
NASA Astrophysics Data System (ADS)
Horák, Jiří
2017-12-01
In this note, the method of multiple scales is adapted to the problem of excitation of non-axisymmetric acoustic waves in a vertically integrated disk by tidal gravitational fields. We derive a formula describing the waveform of the excited wave that is uniformly valid in the whole disk as long as only a single Lindblad resonance is present. Our formalism is subsequently applied to two classical problems: trapped p-mode oscillations in relativistic accretion disks and the excitation of waves in infinite disks.
Specifying and Verifying Organizational Security Properties in First-Order Logic
NASA Astrophysics Data System (ADS)
Brandt, Christoph; Otten, Jens; Kreitz, Christoph; Bibel, Wolfgang
In certain critical cases the data flow between business departments in banking organizations has to respect security policies known as Chinese Wall or Bell-La Padula. We show that these policies can be represented by formal requirements and constraints in first-order logic. By additionally providing a formal model for the flow of data between business departments, we demonstrate how security policies can be applied to a concrete organizational setting and checked with a first-order theorem prover. Our approach can be applied without requiring deep formal expertise and therefore promises high usability in a business context.
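For readers unfamiliar with the Chinese Wall policy mentioned above, the following toy check captures its core rule in executable form. The firm names and conflict classes are invented, and the paper itself encodes such policies in first-order logic and checks them with a theorem prover rather than ad hoc code.

```python
# A small executable rendering of the Chinese Wall idea: once an agent has
# read data from one firm in a conflict-of-interest class, reads from
# competing firms in the same class are denied. Firms and classes invented.
conflict_classes = {"bank_A": "banking", "bank_B": "banking", "oil_C": "energy"}
history: dict[str, set[str]] = {}    # agent -> firms already accessed

def may_access(agent: str, firm: str) -> bool:
    cls = conflict_classes[firm]
    for seen in history.get(agent, set()):
        if conflict_classes[seen] == cls and seen != firm:
            return False             # competing firm in the same class
    return True

def access(agent: str, firm: str) -> bool:
    if may_access(agent, firm):
        history.setdefault(agent, set()).add(firm)
        return True
    return False

print(access("alice", "bank_A"))   # True
print(access("alice", "oil_C"))    # True  (different conflict class)
print(access("alice", "bank_B"))   # False (conflicts with bank_A)
```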
Ab initio method for calculating total cross sections
NASA Technical Reports Server (NTRS)
Bhatia, A. K.; Schneider, B. I.; Temkin, A.
1993-01-01
A method for calculating total cross sections without formally including nonelastic channels is presented. The idea is to use a one-channel T-matrix variational principle with a complex correlation function. The derived T matrix is therefore not unitary. Elastic scattering is calculated from |T|², but total scattering is derived from the imaginary part of T using the optical theorem. The method is applied to the spherically symmetric model of electron-hydrogen scattering. No spurious structure arises; results for σ(el) and σ(total) are in excellent agreement with the calculations of Callaway and Oza (1984). The method has wide potential applicability.
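A worked single-partial-wave example may clarify the bookkeeping: elastic scattering goes with |T|², while the total cross section follows from Im T via the optical theorem, so a non-unitary T yields a nonelastic remainder. The wavenumber and the complex phase shift below are illustrative assumptions, not values from the paper.

```python
# Single s-wave example: a complex phase shift models absorption, making
# T non-unitary; the gap between sigma_tot and sigma_el is the nonelastic
# part. Numbers are illustrative.
import numpy as np

k = 1.0                                  # wavenumber (arbitrary units)
l = 0                                    # s-wave
delta = 0.7 + 0.2j                       # complex phase shift => non-unitary T
T = np.exp(1j * delta) * np.sin(delta)

sigma_el = 4 * np.pi / k**2 * (2 * l + 1) * abs(T) ** 2
sigma_tot = 4 * np.pi / k**2 * (2 * l + 1) * T.imag   # optical theorem

print(f"elastic: {sigma_el:.4f}  total: {sigma_tot:.4f}  "
      f"nonelastic: {sigma_tot - sigma_el:.4f}")
```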
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koch, D.; Fertitta, E.; Paulus, B.
Due to the importance of both static and dynamical correlation in bond formation, low-dimensional beryllium systems constitute interesting case studies for testing correlation methods. Aiming to describe the whole dissociation curve of extended Be systems, we chose to apply the method of increments (MoI) in its multireference (MR) formalism. To gain insight into the main characteristics of the wave function, we started by focusing on the description of small Be chains using standard quantum chemical methods. In a next step we applied the MoI to larger beryllium systems, starting from the Be₆ ring. The complete active space formalism was employed and the results were used as reference for local MR calculations of the whole dissociation curve. Although this is a well-established approach for systems with limited multireference character, its application to the description of whole dissociation curves requires further testing. Following a discussion of the role of the basis set, the method was finally applied to larger rings and extrapolated to an infinite chain.
Formal Operations and Learning Style Predict Success in Statistics and Computer Science Courses.
ERIC Educational Resources Information Center
Hudak, Mary A.; Anderson, David E.
1990-01-01
Studies 94 undergraduate students in introductory statistics and computer science courses. Applies the Formal Operations Reasoning Test (FORT) and Kolb's Learning Style Inventory (LSI). Finds that substantial numbers of students have not achieved the formal operations level of cognitive maturity. Emphasizes the need to examine students' learning styles and…
Critical Analysis on Open Source LMSs Using FCA
ERIC Educational Resources Information Center
Sumangali, K.; Kumar, Ch. Aswani
2013-01-01
The objective of this paper is to apply Formal Concept Analysis (FCA) to identify the best open source Learning Management System (LMS) for an E-learning environment. FCA is a mathematical framework that represents knowledge derived from a formal context. In constructing the formal context, LMSs are treated as objects and their features as…
The Formal Elements Art Therapy Scale: A Measurement System for Global Variables in Art
ERIC Educational Resources Information Center
Gantt, Linda M.
2009-01-01
The Formal Elements Art Therapy Scale (FEATS) is a measurement system for applying numbers to global variables in two-dimensional art (drawing and painting). While it was originally developed for use with the single-picture assessment ("Draw a person picking an apple from a tree" [PPAT]), researchers can also apply many of the 14 scales of the…
The value of a year’s general education for reducing the symptoms of dementia
Brent, Robert J.
2017-01-01
We present a method for estimating the benefits of years of education for reducing dementia symptoms, based on the cost savings that would accrue from continuing independent living rather than relying on formal or informal carers. Our method involves three steps: first, taking a year of education and seeing how much this lowers dementia; second, using this dementia reduction to estimate how much independent living is affected; and third, applying the change in caregiving costs associated with the independent living change. We apply the method to a National Alzheimer’s Coordinating Center sample of 17,239 participants at 32 US Alzheimer’s disease centres between September 2005 and May 2015. PMID:29743729
Experience report: Using formal methods for requirements analysis of critical spacecraft software
NASA Technical Reports Server (NTRS)
Lutz, Robyn R.; Ampo, Yoko
1994-01-01
Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.
Applying dynamic Bayesian networks to perturbed gene expression data.
Dojer, Norbert; Gambin, Anna; Mizera, Andrzej; Wilczyński, Bartek; Tiuryn, Jerzy
2006-05-08
A central goal of molecular biology is to understand the regulatory mechanisms of gene transcription and protein synthesis. Because of their solid basis in statistics, which allows the stochastic aspects of gene expression and noisy measurements to be dealt with in a natural way, Bayesian networks appear attractive in the field of inferring gene interaction structure from microarray experiment data. However, the basic formalism has some disadvantages, e.g. it is sometimes hard to distinguish between the origin and the target of an interaction. Two kinds of microarray experiments yield data particularly rich in information regarding the direction of interactions: time series and perturbation experiments. In order to handle them correctly, the basic formalism must be modified. For example, dynamic Bayesian networks (DBN) apply to time series microarray data. To our knowledge the DBN technique has not been applied in the context of perturbation experiments. We extend the framework of dynamic Bayesian networks in order to incorporate perturbations. Moreover, an exact algorithm for inferring an optimal network is proposed and a discretization method specialized for time series data from perturbation experiments is introduced. We apply our procedure to realistic simulated data. The results are compared with those obtained by standard DBN learning techniques. Moreover, the advantages of using an exact learning algorithm instead of heuristic methods are analyzed. We show that the quality of inferred networks dramatically improves when using data from perturbation experiments. We also conclude that the exact algorithm should be used when it is possible, i.e. when the considered set of genes is small enough.
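The following sketch illustrates the perturbation idea in its simplest form: transitions in which the target gene was externally clamped carry no information about its regulators and are excluded when counting evidence. The data, gene names, and counting scheme are placeholders, not the authors' exact algorithm.

```python
# A minimal sketch: when scoring a candidate parent set for gene G in a
# dynamic Bayesian network, skip transitions where G was experimentally
# perturbed (knocked out / clamped). Data and layout are invented.
from collections import Counter

# Binary expression time series: rows are time points, columns are genes.
series = [
    {"A": 1, "B": 0, "G": 0},
    {"A": 1, "B": 1, "G": 1},
    {"A": 0, "B": 1, "G": 1},
    {"A": 0, "B": 0, "G": 0},
]
perturbed_at = {2}   # at t=2 gene G was externally clamped

def transition_counts(child, parents):
    counts = Counter()
    for t in range(1, len(series)):
        if t in perturbed_at:        # child's value was forced, skip
            continue
        key = tuple(series[t - 1][p] for p in parents), series[t][child]
        counts[key] += 1
    return counts

print(transition_counts("G", ["A", "B"]))
```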
Generalized radiation-field quantization method and the Petermann excess-noise factor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheng, Y.-J.; Siegman, A.E.; E.L. Ginzton Laboratory, Stanford University, Stanford, California 94305
2003-10-01
We propose a generalized radiation-field quantization formalism, where quantization does not have to be referenced to a set of power-orthogonal eigenmodes as conventionally required. This formalism can be used to directly quantize the true system eigenmodes, which can be non-power-orthogonal due to the open nature of the system or the gain/loss medium involved in the system. We apply this generalized field quantization to the laser linewidth problem, in particular, lasers with non-power-orthogonal oscillation modes, and derive the excess-noise factor in a fully quantum-mechanical framework. We also show that, despite the excess-noise factor for oscillating modes, the total spatially averaged decay rate for the laser atoms remains unchanged.
Feng, Liang-Wen; Ren, Hai; Xiong, Hu; Wang, Pan; Wang, Lijia; Tang, Yong
2017-03-06
A ligand-promoted catalytic [4+2] annulation reaction using indole derivatives and donor-acceptor (D-A) cyclobutanes is reported, thus providing an efficient and atom-economical access to versatile cyclohexa-fused indolines with excellent levels of diastereoselectivity and a broad substrate scope. In the presence of a chiral SaBOX ligand, excellent enantioselectivity was realized with up to 94 % ee. This novel synthetic method is applied as a general protocol for the total synthesis of (±)-akuammicine and the formal total synthesis of (±)-strychnine from the same common-core scaffold.
Hierarchical coarse-graining transform.
Pancaldi, Vera; King, Peter R; Christensen, Kim
2009-03-01
We present a hierarchical transform that can be applied to Laplace-like differential equations such as Darcy's equation for single-phase flow in a porous medium. A finite-difference discretization scheme is used to set the equation in the form of an eigenvalue problem. Within the formalism suggested, the pressure field is decomposed into an average value and fluctuations of different kinds and at different scales. The application of the transform to the equation allows us to calculate the unknown pressure with a varying level of detail. A procedure is suggested to localize important features in the pressure field based only on the fine-scale permeability, and hence we develop a form of adaptive coarse graining. The formalism and method are described and demonstrated using two synthetic toy problems.
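For orientation, a bare-bones finite-difference discretization of 1D single-phase Darcy flow is sketched below; the permeability field and boundary pressures are invented, and the paper's transform then operates on the eigenstructure of a system of this kind.

```python
# Finite-difference discretization of 1D single-phase Darcy flow,
# -d/dx(K dp/dx) = 0, with fixed pressures at both ends. The random
# lognormal permeability field is illustrative only.
import numpy as np

n = 50
rng = np.random.default_rng(0)
K = rng.lognormal(mean=0.0, sigma=1.0, size=n + 1)   # permeability on cell faces

A = np.zeros((n, n))
b = np.zeros(n)
p_left, p_right = 1.0, 0.0
for i in range(n):
    kl, kr = K[i], K[i + 1]
    A[i, i] = kl + kr
    if i > 0:
        A[i, i - 1] = -kl
    else:
        b[i] += kl * p_left          # Dirichlet boundary folded into RHS
    if i < n - 1:
        A[i, i + 1] = -kr
    else:
        b[i] += kr * p_right

pressure = np.linalg.solve(A, b)
print(pressure[:5])
```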
Modelling and temporal performances evaluation of networked control systems using (max, +) algebra
NASA Astrophysics Data System (ADS)
Ammour, R.; Amari, S.
2015-01-01
In this paper, we address the problem of temporal performance evaluation of producer/consumer networked control systems. The aim is to develop a formal method for evaluating the response time of this type of control system. Our approach consists of modelling, using Petri net classes, the behaviour of the whole architecture, including the switches that support the multicast communications used by this protocol. The (max, +) algebra formalism is then exploited to obtain analytical formulas for the response time and its maximal and minimal bounds. The main novelty is that our approach takes into account all delays experienced at the different stages of networked automation systems. Finally, we show how to apply the obtained results through an example of a networked control system.
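A small example of the (max, +) machinery may help: completion dates evolve by the max-plus recursion x(k+1) = A ⊗ x(k), where ⊗ replaces multiplication by addition and summation by maximization. The two-stage delay matrix below is an invented stand-in for the Petri-net-derived models in the paper.

```python
# Max-plus linear recursion x(k+1) = A (x) x(k), the algebraic core of the
# response-time evaluation sketched above. Delays are invented; -inf is
# the max-plus "zero" (no connection).
import numpy as np

NEG_INF = -np.inf
A = np.array([[2.0, NEG_INF],      # A[i][j] = delay from stage j to stage i
              [3.0, 1.0]])

def maxplus_matvec(A, x):
    # (A (x) x)_i = max_j (A[i, j] + x[j])
    return np.max(A + x[None, :], axis=1)

x = np.array([0.0, 0.0])           # initial completion dates
for k in range(5):
    x = maxplus_matvec(A, x)
    print(f"cycle {k + 1}: completion dates {x}")
```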
Bibliometrics for Social Validation.
Hicks, Daniel J
2016-01-01
This paper introduces a bibliometric, citation network-based method for assessing the social validation of novel research, and applies this method to the development of high-throughput toxicology research at the US Environmental Protection Agency. Social validation refers to the acceptance of novel research methods by a relevant scientific community; it is formally independent of the technical validation of methods, and is frequently studied in history, philosophy, and social studies of science using qualitative methods. The quantitative methods introduced here find that high-throughput toxicology methods are spread throughout a large and well-connected research community, which suggests high social validation. Further assessment of social validation involving mixed qualitative and quantitative methods is discussed in the conclusion.
Bibliometrics for Social Validation
2016-01-01
This paper introduces a bibliometric, citation network-based method for assessing the social validation of novel research, and applies this method to the development of high-throughput toxicology research at the US Environmental Protection Agency. Social validation refers to the acceptance of novel research methods by a relevant scientific community; it is formally independent of the technical validation of methods, and is frequently studied in history, philosophy, and social studies of science using qualitative methods. The quantitative methods introduced here find that high-throughput toxicology methods are spread throughout a large and well-connected research community, which suggests high social validation. Further assessment of social validation involving mixed qualitative and quantitative methods is discussed in the conclusion. PMID:28005974
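A toy version of the citation-network idea, with invented paper labels: mark the papers that apply the novel method and test how connected the community built on them is. The real study uses actual citation data and richer network statistics.

```python
# Build a small citation network, identify papers that cite the "method"
# papers, and check whether that community forms one connected group.
# Graph and labels are invented for illustration.
import networkx as nx

cites = nx.DiGraph([("p1", "m1"), ("p2", "m1"), ("p2", "m2"),
                    ("p3", "m2"), ("p4", "p1"), ("p5", "p3")])
method_papers = {"m1", "m2"}

users = {src for src, dst in cites.edges if dst in method_papers}
community = cites.to_undirected().subgraph(users | method_papers)
print("papers building on the method:", sorted(users))
print("well-connected:", nx.is_connected(community))
```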
Ten Commandments of Formal Methods...Ten Years Later
NASA Technical Reports Server (NTRS)
Bowen, Jonathan P.; Hinchey, Michael G.
2006-01-01
More than a decade ago, in "Ten Commandments of Formal Methods," we offered practical guidelines for projects that sought to use formal methods. Over the years, the article, which was based on our knowledge of successful industrial projects, has been widely cited and has generated much positive feedback. However, despite this apparent enthusiasm, formal methods use has not greatly increased, and some of the same attitudes about the infeasibility of adopting them persist. Formal methodists believe that introducing greater rigor will improve the software development process and yield software with better structure, greater maintainability, and fewer errors.
Authenticity in Learning for the Twenty-First Century: Bridging the Formal and the Informal
ERIC Educational Resources Information Center
Hung, David; Lee, Shu-Shing; Lim, Kenneth Y. T.
2012-01-01
The paper attempts to bridge informal and formal learning by leveraging on affordance structures associated with informal environments to help learners develop social, cognitive, and metacognitive dispositions that can be applied to learning in classrooms. Most studies focus on either learning in formal or informal contexts, but this study seeks…
Formalized Epistemology, Logic, and Grammar
NASA Astrophysics Data System (ADS)
Bitbol, Michel
The task of a formal epistemology is defined. It appears that a formal epistemology must be a generalization of "logic" in the sense of Wittgenstein's Tractatus. The generalization is required because, whereas logic presupposes a strict relation between activity and language, this relation may be broken in some domains of experimental enquiry (e.g., in microscopic physics). However, a formal epistemology should also retain a major feature of Wittgenstein's "logic": It must not be a discourse about scientific knowledge, but rather a way of making manifest the structures usually implicit in knowledge-gaining activity. This strategy is applied to the formalism of quantum mechanics.
Selecting essential information for biosurveillance--a multi-criteria decision analysis.
Generous, Nicholas; Margevicius, Kristen J; Taylor-McCabe, Kirsten J; Brown, Mac; Daniel, W Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina
2014-01-01
The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method for applying formal scientific decision-theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate the identification of "essential information" for use in biosurveillance systems or processes, and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate its utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system.
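As a hint of how Multi-Attribute Utility Theory turns criteria into a ranking, the sketch below scores hypothetical data streams with a weighted additive utility. The streams, criteria, weights, and scores are all invented; the framework in the paper elicits these from domain experts.

```python
# Weighted additive multi-attribute utility: each candidate data stream is
# scored against weighted criteria and ranked. All values are invented.
criteria_weights = {"timeliness": 0.4, "coverage": 0.35, "cost": 0.25}

# Normalized utility of each stream on each criterion (1.0 = best).
streams = {
    "ED chief complaints": {"timeliness": 0.9, "coverage": 0.6, "cost": 0.7},
    "lab confirmations":   {"timeliness": 0.3, "coverage": 0.9, "cost": 0.5},
    "news scraping":       {"timeliness": 0.8, "coverage": 0.4, "cost": 0.9},
}

def utility(scores: dict) -> float:
    return sum(criteria_weights[c] * s for c, s in scores.items())

for name, scores in sorted(streams.items(), key=lambda kv: -utility(kv[1])):
    print(f"{name:22s} {utility(scores):.3f}")
```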
Survey of Verification and Validation Techniques for Small Satellite Software Development
NASA Technical Reports Server (NTRS)
Jacklin, Stephen A.
2015-01-01
The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.
FORMED: Bringing Formal Methods to the Engineering Desktop
2016-02-01
...integrates formal verification into software design and development by precisely defining semantics for a restricted subset of the Unified Modeling... input-output contract satisfaction and absence of null pointer dereferences. Subject terms: Formal Methods, Software Verification, Model-Based... Domain specific languages (DSLs) drive both implementation and formal verification
Development and Evaluation of an Ontology for Guiding Appropriate Antibiotic Prescribing
Furuya, E. Yoko; Kuperman, Gilad J.; Cimino, James J.; Bakken, Suzanne
2011-01-01
Objectives To develop and apply formal ontology creation methods to the domain of antimicrobial prescribing and to formally evaluate the resulting ontology through intrinsic and extrinsic evaluation studies. Methods We extended existing ontology development methods to create the ontology and implemented the ontology using Protégé-OWL. Correctness of the ontology was assessed using a set of ontology design principles and domain expert review via the laddering technique. We created three artifacts to support the extrinsic evaluation (set of prescribing rules, alerts and an ontology-driven alert module, and a patient database) and evaluated the usefulness of the ontology for performing knowledge management tasks to maintain the ontology and for generating alerts to guide antibiotic prescribing. Results The ontology includes 199 classes, 10 properties, and 1,636 description logic restrictions. Twenty-three Semantic Web Rule Language rules were written to generate three prescribing alerts: 1) antibiotic-microorganism mismatch alert; 2) medication-allergy alert; and 3) non-recommended empiric antibiotic therapy alert. The evaluation studies confirmed the correctness of the ontology, usefulness of the ontology for representing and maintaining antimicrobial treatment knowledge rules, and usefulness of the ontology for generating alerts to provide feedback to clinicians during antibiotic prescribing. Conclusions This study contributes to the understanding of ontology development and evaluation methods and addresses one knowledge gap related to using ontologies as a clinical decision support system component—a need for formal ontology evaluation methods to measure their quality from the perspective of their intrinsic characteristics and their usefulness for specific tasks. PMID:22019377
ERIC Educational Resources Information Center
Ribera, Tony
2012-01-01
Student affairs professionals have been called to apply pedagogical methods to promote student learning in the out-of-class setting and show evidence of their contributions to student learning. To fulfill their professional responsibilities, practitioners should enter the student affairs profession with a basic understanding of ways to gather,…
Formal Language Design in the Context of Domain Engineering
2000-03-28
Related work: feature oriented domain analysis (FODA); organizational domain modeling (ODM); domain-specific software... However, there are only a few that are well defined and used repeatedly in practice. These include: feature oriented domain analysis (FODA), organizational... Feature oriented domain analysis (FODA) is a domain analysis method being researched and applied by the SEI
Forecasts of non-Gaussian parameter spaces using Box-Cox transformations
NASA Astrophysics Data System (ADS)
Joachimi, B.; Taylor, A. N.
2011-09-01
Forecasts of statistical constraints on model parameters using the Fisher matrix abound in many fields of astrophysics. The Fisher matrix formalism involves the assumption of Gaussianity in parameter space and hence fails to predict complex features of posterior probability distributions. Combining the standard Fisher matrix with Box-Cox transformations, we propose a novel method that accurately predicts arbitrary posterior shapes. The Box-Cox transformations are applied to parameter space to render it approximately multivariate Gaussian, and the Fisher matrix calculation is performed on the transformed parameters. We demonstrate that, after the Box-Cox parameters have been determined from an initial likelihood evaluation, the method correctly predicts changes in the posterior when varying various parameters of the experimental setup and the data analysis, at marginally higher computational cost than a standard Fisher matrix calculation. We apply the Box-Cox-Fisher formalism to forecast cosmological parameter constraints by future weak gravitational lensing surveys. The characteristic non-linear degeneracy between the matter density parameter and the normalization of matter density fluctuations is reproduced for several cases, and the capability of breaking this degeneracy with weak-lensing three-point statistics is investigated. Possible applications of Box-Cox transformations of posterior distributions are discussed, including the prospects for performing statistical data analysis steps in the transformed Gaussianized parameter space.
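A minimal sketch of the Gaussianization step, under the assumption of a lognormal mock posterior standing in for a real likelihood evaluation: estimate the Box-Cox parameter, fit a Gaussian in transformed space, and map an interval back.

```python
# Gaussianize a skewed 1D "posterior" with a Box-Cox transform, fit a
# Gaussian (the Fisher-like step) in transformed space, and map the 95%
# interval back. The lognormal samples are an illustrative assumption.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
samples = rng.lognormal(mean=0.0, sigma=0.5, size=10_000)  # skewed posterior

transformed, lam = stats.boxcox(samples)       # lambda by maximum likelihood
mu, sig = transformed.mean(), transformed.std()

# Gaussian interval in transformed space, mapped back through the inverse:
lo, hi = mu - 1.96 * sig, mu + 1.96 * sig
inv = lambda y: (lam * y + 1.0) ** (1.0 / lam) if lam != 0 else np.exp(y)
print(f"lambda = {lam:.3f}, 95% interval = [{inv(lo):.3f}, {inv(hi):.3f}]")
```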
Two-Point Turbulence Closure Applied to Variable Resolution Modeling
NASA Technical Reports Server (NTRS)
Girimaji, Sharath S.; Rubinstein, Robert
2011-01-01
Variable resolution methods have become frontline CFD tools, but in order to take full advantage of this promising new technology, more formal theoretical development is desirable. Two general classes of variable resolution methods can be identified: hybrid or zonal methods in which RANS and LES models are solved in different flow regions, and bridging or seamless models which interpolate smoothly between RANS and LES. This paper considers the formulation of bridging methods using methods of two-point closure theory. The fundamental problem is to derive a subgrid two-equation model. We compare and reconcile two different approaches to this goal: the Partially Integrated Transport Model, and the Partially Averaged Navier-Stokes method.
A design automation framework for computational bioenergetics in biological networks.
Angione, Claudio; Costanza, Jole; Carapezza, Giovanni; Lió, Pietro; Nicosia, Giuseppe
2013-10-01
The bioenergetic activity of mitochondria can be thoroughly investigated by using computational methods. In particular, in our work we focus on ATP and NADH, namely the metabolites representing the production of energy in the cell. We develop a computational framework to perform an exhaustive investigation at the level of species, reactions, genes and metabolic pathways. The framework integrates several methods implementing the state-of-the-art algorithms for many-objective optimization, sensitivity, and identifiability analysis applied to biological systems. We use this computational framework to analyze three case studies related to the human mitochondria and the algal metabolism of Chlamydomonas reinhardtii, formally described with algebraic differential equations or flux balance analysis. Integrating the results of our framework applied to interacting organelles would provide a general-purpose method for assessing the production of energy in a biological network.
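To make the flux-balance side concrete, here is a self-contained toy: maximize ATP flux subject to steady-state stoichiometry and flux bounds, via linear programming. The three-reaction network is invented and far smaller than the genome-scale models the framework targets.

```python
# Toy flux balance analysis: maximize ATP-producing flux subject to
# S v = 0 (steady state) and bounds on each flux. Network invented.
import numpy as np
from scipy.optimize import linprog

# Columns: uptake, conversion, atp_synthesis; rows: metabolites M1, M2.
S = np.array([[ 1.0, -1.0,  0.0],     # M1 produced by uptake, then consumed
              [ 0.0,  1.0, -1.0]])    # M2 produced, consumed by ATP synthesis
bounds = [(0, 10), (0, 10), (0, 10)]  # flux bounds
c = np.array([0.0, 0.0, -1.0])        # linprog minimizes, so negate ATP flux

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("optimal fluxes:", res.x, "max ATP flux:", -res.fun)
```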
Use of an auxiliary basis set to describe the polarization in the fragment molecular orbital method
NASA Astrophysics Data System (ADS)
Fedorov, Dmitri G.; Kitaura, Kazuo
2014-03-01
We developed a dual basis approach within the fragment molecular orbital formalism enabling efficient and accurate use of large basis sets. The method was tested on water clusters and polypeptides and applied to perform geometry optimization of chignolin (PDB: 1UAO) in solution at the level of DFT/6-31++G**, obtaining a structure in agreement with experiment (RMSD of 0.4526 Å). The polarization in polypeptides is discussed with a comparison of the α-helix and β-strand.
Formal Methods Case Studies for DO-333
NASA Technical Reports Server (NTRS)
Cofer, Darren; Miller, Steven P.
2014-01-01
RTCA DO-333, Formal Methods Supplement to DO-178C and DO-278A provides guidance for software developers wishing to use formal methods in the certification of airborne systems and air traffic management systems. The supplement identifies the modifications and additions to DO-178C and DO-278A objectives, activities, and software life cycle data that should be addressed when formal methods are used as part of the software development process. This report presents three case studies describing the use of different classes of formal methods to satisfy certification objectives for a common avionics example - a dual-channel Flight Guidance System. The three case studies illustrate the use of theorem proving, model checking, and abstract interpretation. The material presented is not intended to represent a complete certification effort. Rather, the purpose is to illustrate how formal methods can be used in a realistic avionics software development project, with a focus on the evidence produced that could be used to satisfy the verification objectives found in Section 6 of DO-178C.
Graduate nurse internship program: a formalized orientation program.
Phillips, Tracy; Hall, Mellisa
2014-01-01
The graduate nurse internship program was developed on the basis of Watson's Human Caring Theory. In this article, the authors discuss how an orientation program was formalized into an internship program and how the theory was applied.
Giorgi, R; Gouvernet, J; Dufour, J; Degoulet, P; Laugier, R; Quilichini, F; Fieschi, M
2001-01-01
To present the method used to elaborate and formalize current scientific knowledge in order to provide physicians with tools, available on the Internet, that enable them to evaluate individual patient risk and give personalized preventive recommendations or early screening measures. The approach suggested in this article is in line with medical procedures based on levels of evidence (Evidence-based Medicine). A cyclical process for developing recommendations allows us to quickly incorporate current scientific information. At each phase, the analysis is reevaluated by experts in the field collaborating on the project. The information is formalized through the use of levels of evidence and grades of recommendations. The GLIF model is used to implement recommendations for clinical practice guidelines. The most current scientific evidence is incorporated in a cyclical process that includes several steps: critical analysis according to the Evidence-based Medicine method; identification of predictive factors; setting up risk levels; identification of prevention measures; and elaboration of personalized recommendations. The information technology implementation of the clinical practice guideline enables physicians to quickly obtain personalized information for their patients. Cases of colorectal prevention illustrate our approach. Integration of current scientific knowledge is an important process. The delay between the moment new information arrives and the moment the practitioner applies it is thus reduced.
Charge noise in quantum dot qubits: beyond the Markovian approximation.
NASA Astrophysics Data System (ADS)
Yang, Yuan-Chi; Friesen, Mark; Coppersmith, S. N.
Charge noise is a limiting factor in the performance of semiconductor quantum dot qubits, including both spin and charge qubits. In this work, we develop an analytical formalism for treating semiclassical noise beyond the Markovian approximation, which allows us to investigate noise models relevant for quantum dots, such as 1/f noise. We apply our methods to both charge qubits and quantum dot hybrid qubits, and study the effects of charge noise on single-qubit rotations in these systems. The formalism is also directly applicable to the case of strong microwave driving, for which the rotating wave approximation breaks down. This work was supported in part by ARO (W911NF-12-0607) and ONR (N00014-15-1-0029), and the University of Wisconsin-Madison.
Equilibrium Free Energies from Nonequilibrium Metadynamics
NASA Astrophysics Data System (ADS)
Bussi, Giovanni; Laio, Alessandro; Parrinello, Michele
2006-03-01
In this Letter we propose a new formalism to map history-dependent metadynamics into a Markovian process. We apply this formalism to model Langevin dynamics and determine the equilibrium distribution of a collection of simulations. We demonstrate that the reconstructed free energy is an unbiased estimate of the underlying free energy and analytically derive an expression for the error. The present results can be applied to other history-dependent stochastic processes, such as Wang-Landau sampling.
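The mechanics can be seen in a minimal 1D Langevin metadynamics run: Gaussians deposited along the trajectory gradually fill the wells of a double-well potential, and the negated bias estimates the free energy. All parameters (well depth, hill size, temperature) are illustrative assumptions; this plain history-dependent scheme is the input to, not a demonstration of, the Letter's Markovian mapping.

```python
# Minimal 1D overdamped-Langevin metadynamics on a double-well potential.
# Deposited Gaussian hills bias the dynamics; -sum(hills) estimates F(x).
import numpy as np

rng = np.random.default_rng(2)
U = lambda x: (x**2 - 1.0) ** 2                 # double-well potential
dUdx = lambda x: 4.0 * x * (x**2 - 1.0)

centers, w, sigma = [], 0.02, 0.2               # deposited hills
def bias_force(x):
    if not centers:
        return 0.0
    d = x - np.array(centers)
    # force from the bias: -dV/dx for V = sum of Gaussians
    return np.sum(w * d / sigma**2 * np.exp(-d**2 / (2 * sigma**2)))

x, dt, kT, gamma = -1.0, 1e-3, 0.1, 1.0
for step in range(50_000):
    noise = np.sqrt(2 * kT * dt / gamma) * rng.standard_normal()
    x += dt / gamma * (-dUdx(x) + bias_force(x)) + noise
    if step % 200 == 0:
        centers.append(x)                       # deposit a new hill

grid = np.linspace(-1.5, 1.5, 7)
d = grid[:, None] - np.array(centers)[None, :]
free_energy = -np.sum(w * np.exp(-d**2 / (2 * sigma**2)), axis=1)
print(np.round(free_energy - free_energy.min(), 3))
```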
NASA Astrophysics Data System (ADS)
Tkacz, J.; Bukowiec, A.; Doligalski, M.
2017-08-01
The paper presents a method for the modeling and implementation of concurrent controllers. Concurrent controllers are specified by Petri nets. The Petri nets are then decomposed using a symbolic deduction method of analysis; formal methods such as a sequent calculus system, together with elements of Thelen's algorithm, are used here. As a result, linked state machines (LSMs) are obtained. Each FSM is implemented using methods of structural decomposition during the logic synthesis process. The method of multiple encoding of microinstructions has been applied, which leads to a decreased number of Boolean functions realized by the combinational part of the FSM. The additional decoder can be implemented with the use of memory blocks.
Applying Formal Methods to NASA Projects: Transition from Research to Practice
NASA Technical Reports Server (NTRS)
Othon, Bill
2009-01-01
NASA project managers attempt to manage risk by relying on mature, well-understood process and technology when designing spacecraft. In the case of crewed systems, the margin for error is even tighter and leads to risk aversion. But as we look to future missions to the Moon and Mars, the complexity of the systems will increase as the spacecraft and crew work together with less reliance on Earth-based support. NASA will be forced to look for new ways to do business. Formal methods technologies can help NASA develop complex but cost effective spacecraft in many domains, including requirements and design, software development and inspection, and verification and validation of vehicle subsystems. To realize these gains, the technologies must be matured and field-tested so that they are proven when needed. During this discussion, current activities used to evaluate FM technologies for Orion spacecraft design will be reviewed. Also, suggestions will be made to demonstrate value to current designers, and mature the technology for eventual use in safety-critical NASA missions.
Whatever Happened to Formal Methods for Security?
Voas, J.; Schaffer, K.
2016-01-01
We asked 7 experts 7 questions to find out what has occurred recently in terms of applying formal methods (FM) to security-centric, cyber problems. We are continually reminded of the 1996 paper by Tony Hoare “How did Software Get So Reliable Without Proof?” [1] In that vein, how did we get so insecure with proof? Given daily press announcements concerning new malware, data breaches, and privacy loss, is FM still relevant or was it ever? Our experts answered with unique personal insights. We were curious as to whether this successful methodology in “safety-critical” has succeeded as well for today’s “build it, hack it, patch it” mindset. Our experts were John McLean (Naval Research Labs), Paul Black (National Institute of Standards and Technology), Karl Levitt (University of California at Davis), Joseph Williams (CloudEconomist.Com), Connie Heitmeyer (Naval Research Labs), Eugene Spafford (Purdue University), and Joseph Kiniry (Galois, Inc.). The questions and responses follow. PMID:27890940
NASA Technical Reports Server (NTRS)
Hynes, Charles S.; Hardy, Gordon H.; Sherry, Lance
2007-01-01
Volume I of this report presents a new method for synthesizing hybrid systems directly from design requirements, and applies the method to the design of a hybrid system for longitudinal control of transport aircraft. The resulting system satisfies general requirements for safety and effectiveness specified a priori, enabling formal validation to be achieved. Volume II contains seven appendices intended to make the report accessible to readers with backgrounds in human factors, flight dynamics and control, and formal logic. Major design goals are (1) system design integrity based on proof of correctness at the design level, (2) significant simplification and cost reduction in system development and certification, and (3) improved operational efficiency, with significant alleviation of human-factors problems encountered by pilots in current transport aircraft. This report provides for the first time a firm technical basis for criteria governing design and certification of avionic systems for transport aircraft. It should be of primary interest to designers of next-generation avionic systems.
NASA Technical Reports Server (NTRS)
Hynes, Charles S.; Hardy, Gordon H.; Sherry, Lance
2007-01-01
Volume I of this report presents a new method for synthesizing hybrid systems directly from design requirements, and applies the method to the design of a hybrid system for longitudinal control of transport aircraft. The resulting system satisfies general requirements for safety and effectiveness specified a priori, enabling formal validation to be achieved. Volume II contains seven appendices intended to make the report accessible to readers with backgrounds in human factors, flight dynamics and control, and formal logic. Major design goals are (1) system design integrity based on proof of correctness at the design level, (2) significant simplification and cost reduction in system development and certification, and (3) improved operational efficiency, with significant alleviation of human-factors problems encountered by pilots in current transport aircraft. This report provides for the first time a firm technical basis for criteria governing design and certification of avionic systems for transport aircraft. It should be of primary interest to designers of next-generation avionic systems.
Karman, Tijs; van der Avoird, Ad; Groenenboom, Gerrit C
2015-02-28
We discuss three quantum mechanical formalisms for calculating collision-induced absorption spectra. First, we revisit the established theory of collision-induced absorption, assuming distinguishable molecules which interact isotropically. Then, the theory is rederived incorporating exchange effects between indistinguishable molecules. It is shown that the spectrum can no longer be written as an incoherent sum of the contributions of the different spherical components of the dipole moment. Finally, we derive an efficient method to include the effects of anisotropic interactions in the computation of the absorption spectrum. This method calculates the dipole coupling on-the-fly, which allows for the uncoupled treatment of the initial and final states without the explicit reconstruction of the many-component wave functions. The three formalisms are applied to the collision-induced rotation-translation spectra of hydrogen molecules in the far-infrared. Good agreement with experimental data is obtained. Significant effects of anisotropic interactions are observed in the far wing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brabec, Jiri; van Dam, Hubertus JJ; Pittner, Jiri
2012-03-28
The recently proposed Universal State-Selective (USS) corrections [K. Kowalski, J. Chem. Phys. 134, 194107 (2011)] to approximate Multi-Reference Coupled Cluster (MRCC) energies can be commonly applied to any type of MRCC theory based on the Jeziorski-Monkhorst [B. Jeziorski, H.J. Monkhorst, Phys. Rev. A 24, 1668 (1981)] exponential Ansatz. In this letter we report on the performance of a simple USS correction to the Brillouin-Wigner MRCC (BW-MRCC) formalism employing single and double excitations (BW-MRCCSD). It is shown that the resulting formalism (USS-BW-MRCCSD), which uses the manifold of single and double excitations to construct the correction, can be related to a posteriori corrections utilized in routine BW-MRCCSD calculations. In several benchmark calculations we compare the results of the USS-BW-MRCCSD method with results of the BW-MRCCSD approach employing a posteriori corrections and with results obtained with the Full Configuration Interaction (FCI) method.
NASA Astrophysics Data System (ADS)
Collart, T. G.; Stacey, W. M.
2015-11-01
Several methods are presented for extending the traditional analytic "circular" representation of flux-surface aligned curvilinear coordinate systems to more accurately describe equilibrium plasma geometry and magnetic fields in DIII-D. The formalism originally presented by Miller is extended to include different poloidal variations in the upper and lower hemispheres. A coordinate system based on separate Fourier expansions of major radius and vertical position greatly improves accuracy in edge plasma structure representation. Scale factors and basis vectors for a system formed by expanding the circular model minor radius can be represented using linear combinations of Fourier basis functions. A general method for coordinate system orthogonalization is presented and applied to all curvilinear models. A formalism for the magnetic field structure in these curvilinear models is presented, and the resulting magnetic field predictions are compared against calculations performed in a Cartesian system using an experimentally based EFIT prediction for the Grad-Shafranov equilibrium. Supported by: US DOE under DE-FG02-00ER54538.
NASA Astrophysics Data System (ADS)
Shi, Lin; Xu, Ke; Wang, Lin-Wang
2015-05-01
Nonradiative carrier recombination is of great applied and fundamental importance, but the correct ab initio approaches for calculating it remain inconclusive. Here we used five different approximations to calculate the nonradiative carrier recombination of two complex defect structures, GaP:(Zn_Ga-O_P) and GaN:(Zn_Ga-V_N), and compared the results with experiments. In order to apply the different multiphonon-assisted electron transition formalisms, we calculated the electron-phonon coupling constants by ab initio density functional theory for all phonon modes. Comparing the different methods, the capture coefficients calculated by the static coupling theory are 4.30×10^-8 and 1.46×10^-7 cm^3/s for GaP:(Zn_Ga-O_P) and GaN:(Zn_Ga-V_N), in good agreement with the experimental results of (4 +2/-1)×10^-8 and 3.0×10^-7 cm^3/s, respectively. We also provide arguments for why the static coupling theory should be used to calculate the nonradiative decays of semiconductors.
QCD evolution of the Sivers function
NASA Astrophysics Data System (ADS)
Aybat, S. M.; Collins, J. C.; Qiu, J. W.; Rogers, T. C.
2012-02-01
We extend the Collins-Soper-Sterman (CSS) formalism to apply it to the spin dependence governed by the Sivers function. We use it to give a correct numerical QCD evolution of existing fixed-scale fits of the Sivers function. With the aid of approximations useful for the nonperturbative region, we present the results as parametrizations of a Gaussian form in transverse-momentum space, rather than in the Fourier conjugate transverse coordinate space normally used in the CSS formalism. They are specifically valid at small transverse momentum. Since evolution has been applied, our results can be used to make predictions for Drell-Yan and semi-inclusive deep inelastic scattering at energies different from those where the original fits were made. Our evolved functions are of a form that they can be used in the same parton-model factorization formulas as used in the original fits, but now with a predicted scale dependence in the fit parameters. We also present a method by which our evolved functions can be corrected to allow for twist-3 contributions at large parton transverse momentum.
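As a generic illustration of the Gaussian form in transverse-momentum space referred to above (a schematic ansatz; the actual fitted functions and widths are those given in the paper):

$$
f(x,k_T;Q)\;\propto\;f(x;Q)\,\frac{e^{-k_T^{2}/\langle k_T^{2}\rangle(Q)}}{\pi\,\langle k_T^{2}\rangle(Q)},
$$

with the predicted scale dependence entering through the evolution of the width ⟨k_T²⟩(Q).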
Nonequilibrium Green's function method for quantum thermal transport
NASA Astrophysics Data System (ADS)
Wang, Jian-Sheng; Agarwalla, Bijay Kumar; Li, Huanan; Thingna, Juzar
2014-12-01
This review deals with the nonequilibrium Green's function (NEGF) method applied to the problems of energy transport due to atomic vibrations (phonons), primarily for small junction systems. We present a pedagogical introduction to the subject, deriving some of the well-known results such as the Landauer-like formula for heat current in ballistic systems. The main aim of the review is to build the machinery of the method so that it can be applied to other situations, which are not directly treated here. In addition to the above, we consider a number of applications of NEGF, not in routine model system calculations, but in a few new aspects showing the power and usefulness of the formalism. In particular, we discuss the problems of multiple leads, coupled left-right-lead system, and system without a center. We also apply the method to the problem of full counting statistics. In the case of nonlinear systems, we make general comments on the thermal expansion effect, phonon relaxation time, and a certain class of mean-field approximations. Lastly, we examine the relationship between NEGF, reduced density matrix, and master equation approaches to thermal transport.
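For reference, the Landauer-like formula mentioned above for the ballistic phonon heat current takes the standard form

$$
J=\int_{0}^{\infty}\frac{d\omega}{2\pi}\,\hbar\omega\,\mathcal{T}(\omega)\,\big[f_L(\omega)-f_R(\omega)\big],\qquad f_{\alpha}(\omega)=\frac{1}{e^{\hbar\omega/k_BT_{\alpha}}-1},
$$

where T(ω) is the phonon transmission and f_α are the Bose-Einstein occupations of the left and right leads.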
Proceedings of the Second NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Munoz, Cesar (Editor)
2010-01-01
This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.
Safety programmes in the Egyptian construction industry.
Hassanein, Amr A G; Hanna, Ragaa S
2007-12-01
This study is aimed at exploring the nature of the safety programmes applied by large-size contractors operating in Egypt. Results revealed that safety programmes applied by those contractors were less formal than the programmes applied by their American counterparts. Only three contractors out of the surveyed sample had accident records broken down by projects, provided workers with formal safety orientation, and trained safety personnel on first-aid. The study recommended that reforms to the scheme of the employers' contribution to social insurance are necessary. This is meant to serve as a strong incentive for safety management.
Two Paradoxes in Linear Regression Analysis.
Feng, Ge; Peng, Jing; Tu, Dongke; Zheng, Julia Z; Feng, Changyong
2016-12-25
Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection.
Why are Formal Methods Not Used More Widely?
NASA Technical Reports Server (NTRS)
Knight, John C.; DeJong, Colleen L.; Gibble, Matthew S.; Nakano, Luis G.
1997-01-01
Despite extensive development over many years and significant demonstrated benefits, formal methods remain poorly accepted by industrial practitioners. Many reasons have been suggested for this situation, such as claims that they extend the development cycle, that they require difficult mathematics, that inadequate tools exist, and that they are incompatible with other software packages. There is little empirical evidence that any of these reasons is valid. The research presented here addresses the question of why formal methods are not used more widely. The approach used was to develop a formal specification for a safety-critical application using several specification notations and assess the results in a comprehensive evaluation framework. The results of the experiment suggest that there remain many impediments to the routine use of formal methods.
Enhanced semantic interoperability by profiling health informatics standards.
López, Diego M; Blobel, Bernd
2009-01-01
Several standards applied to the healthcare domain support semantic interoperability. These standards are far from being completely adopted in health information system development, however. The objective of this paper is to provide a method and suggest the necessary tooling for reusing standard health information models, thereby supporting the development of semantically interoperable systems and components. The approach is based on the definition of UML Profiles. UML profiling is a formal modeling mechanism to specialize reference meta-models in such a way that it is possible to adapt those meta-models to specific platforms or domains. A health information model can be considered as such a meta-model. The first step of the introduced method identifies the standard health information models and the tasks in the software development process in which healthcare information models can be reused. Then, the selected information model is formalized as a UML Profile. That Profile is finally applied to system models, annotating them with the semantics of the information model. The approach is supported by Eclipse-based UML modeling tools. The method is integrated into a comprehensive framework for health information systems development, and the feasibility of the approach is demonstrated in the analysis, design, and implementation of a public health surveillance system, reusing HL7 RIM and DIMs specifications. The paper describes a method and the necessary tooling for reusing standard healthcare information models. UML offers several advantages, such as tooling support, graphical notation, exchangeability, extensibility, semi-automatic code generation, etc. The approach presented is also applicable for harmonizing different standard specifications.
Inverse problems in quantum chemistry
NASA Astrophysics Data System (ADS)
Karwowski, Jacek
Inverse problems constitute a branch of applied mathematics with well-developed methodology and formalism. A broad family of tasks met in theoretical physics, in civil and mechanical engineering, as well as in various branches of medical and biological sciences has been formulated as specific implementations of the general theory of inverse problems. In this article, it is pointed out that a number of approaches met in quantum chemistry can (and should) be classified as inverse problems. Consequently, the methodology used in these approaches may be enriched by applying ideas and theorems developed within the general field of inverse problems. Several examples, including the RKR method for the construction of potential energy curves, determining parameter values in semiempirical methods, and finding external potentials for which the pertinent Schrödinger equation is exactly solvable, are discussed in detail.
Development and evaluation of an ontology for guiding appropriate antibiotic prescribing.
Bright, Tiffani J; Yoko Furuya, E; Kuperman, Gilad J; Cimino, James J; Bakken, Suzanne
2012-02-01
To develop and apply formal ontology creation methods to the domain of antimicrobial prescribing and to formally evaluate the resulting ontology through intrinsic and extrinsic evaluation studies. We extended existing ontology development methods to create the ontology and implemented the ontology using Protégé-OWL. Correctness of the ontology was assessed using a set of ontology design principles and domain expert review via the laddering technique. We created three artifacts to support the extrinsic evaluation (set of prescribing rules, alerts and an ontology-driven alert module, and a patient database) and evaluated the usefulness of the ontology for performing knowledge management tasks to maintain the ontology and for generating alerts to guide antibiotic prescribing. The ontology includes 199 classes, 10 properties, and 1636 description logic restrictions. Twenty-three Semantic Web Rule Language rules were written to generate three prescribing alerts: (1) antibiotic-microorganism mismatch alert; (2) medication-allergy alert; and (3) non-recommended empiric antibiotic therapy alert. The evaluation studies confirmed the correctness of the ontology, the usefulness of the ontology for representing and maintaining antimicrobial treatment knowledge rules, and the usefulness of the ontology for generating alerts to provide feedback to clinicians during antibiotic prescribing. This study contributes to the understanding of ontology development and evaluation methods and addresses one knowledge gap related to using ontologies as a clinical decision support system component: a need for formal ontology evaluation methods to measure their quality from the perspective of their intrinsic characteristics and their usefulness for specific tasks. Copyright © 2011 Elsevier Inc. All rights reserved.
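A schematic example of the kind of Semantic Web Rule Language rule described above (the predicate names here are hypothetical, not those of the published ontology):

$$
\mathit{Patient}(?p)\;\wedge\;\mathit{hasAllergy}(?p,\,?m)\;\wedge\;\mathit{hasMedicationOrder}(?p,\,?m)\;\rightarrow\;\mathit{MedicationAllergyAlert}(?p)
$$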
Continuous-time random-walk model for financial distributions
NASA Astrophysics Data System (ADS)
Masoliver, Jaume; Montero, Miquel; Weiss, George H.
2003-02-01
We apply the formalism of the continuous-time random walk to the study of financial data. The entire distribution of prices can be obtained once two auxiliary densities are known: the probability density for the pausing time between successive jumps and the corresponding probability density for the magnitude of a jump. We have applied the formalism to data on the U.S. dollar-Deutsche mark futures exchange, finding good agreement between theory and the observed data.
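A minimal Monte Carlo sketch of the continuous-time random walk described above, under the illustrative assumptions of exponential pausing times and Gaussian jump magnitudes (the paper instead infers these two densities from the market data):

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_ctrw(t_max, mean_wait=1.0, jump_std=0.01, n_paths=10_000):
    """Return terminal increments X(t_max) for an ensemble of CTRW paths."""
    X = np.zeros(n_paths)
    t = np.zeros(n_paths)
    active = np.ones(n_paths, dtype=bool)
    while active.any():
        # Draw the pausing time until the next jump for each active path.
        t[active] += rng.exponential(mean_wait, active.sum())
        finished = active & (t >= t_max)   # next jump falls beyond the horizon
        active &= ~finished
        # Paths still inside the horizon make a Gaussian jump.
        X[active] += rng.normal(0.0, jump_std, active.sum())
    return X

increments = simulate_ctrw(t_max=100.0)
print(increments.mean(), increments.std())
```

Fitting the two densities to empirical tick data, rather than assuming them, is the step that gives the distribution of prices in the paper.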
Probabilities for time-dependent properties in classical and quantum mechanics
NASA Astrophysics Data System (ADS)
Losada, Marcelo; Vanni, Leonardo; Laura, Roberto
2013-05-01
We present a formalism which allows one to define probabilities for expressions that involve properties at different times for classical and quantum systems and we study its lattice structure. The formalism is based on the notion of time translation of properties. In the quantum case, the properties involved should satisfy compatibility conditions in order to obtain well-defined probabilities. The formalism is applied to describe the double-slit experiment.
Trust and Trustworthiness in Human-Robot Interaction: A Formal Conceptualization
2016-05-11
Report AFRL-AFOSR-VA-TR-2016-0198; Alan Wagner, Georgia Tech Applied Research (period of performance ending 03/31/2016). The effort evaluated algorithms for characterizing trust during interactions between a robot and a human, and employed strategies for repairing trust during emergencies.
Weaving a Formal Methods Education with Problem-Based Learning
NASA Astrophysics Data System (ADS)
Gibson, J. Paul
The idea of weaving formal methods through computing (or software engineering) degrees is not a new one. However, there has been little success in developing and implementing such a curriculum. Formal methods continue to be taught as stand-alone modules and students, in general, fail to see how fundamental these methods are to the engineering of software. A major problem is one of motivation — how can the students be expected to enthusiastically embrace a challenging subject when the learning benefits, beyond passing an exam and achieving curriculum credits, are not clear? Problem-based learning has gradually moved from being an innovative pedagogical technique, commonly used to better motivate students, to being widely adopted in the teaching of many different disciplines, including computer science and software engineering. Our experience shows that a good problem can be re-used throughout a student's academic life. In fact, the best computing problems can be used with children (young and old), undergraduates and postgraduates. In this paper we present a process for weaving formal methods through a University curriculum that is founded on the application of problem-based learning and a library of good software engineering problems, where students learn about formal methods without sitting a traditional formal methods module. The process of constructing good problems and integrating them into the curriculum is shown to be analogous to the process of engineering software. This approach is not intended to replace more traditional formal methods modules: it will better prepare students for such specialised modules and ensure that all students have an understanding and appreciation for formal methods even if they do not go on to specialise in them.
A Variational Method in Out-of-Equilibrium Physical Systems
Pinheiro, Mario J.
2013-01-01
We propose a new variational principle for out-of-equilibrium dynamic systems, fundamentally based on the method of Lagrange multipliers applied to the total entropy of an ensemble of particles. However, we use the fundamental equation of thermodynamics in differential form, considering U and S as 0-forms. We obtain a set of two first-order differential equations that reveal the same formal symplectic structure shared by classical mechanics, fluid mechanics and thermodynamics. From this approach a topological torsion current emerges, where A_j and ω_k denote the components of the vector potential (gravitational and/or electromagnetic) and ω denotes the angular velocity of the accelerated frame. We derive a special form of the Umov-Poynting theorem for rotating gravito-electromagnetic systems. The variational method is then applied to clarify the working mechanism of particular devices. PMID:24316718
NASA Astrophysics Data System (ADS)
Miller, T. N.; Brumbaugh, E. J.; Barker, M.; Ly, V.; Schick, R.; Rogers, L.
2015-12-01
The NASA DEVELOP National Program conducts over eighty Earth science projects every year. Each project applies NASA Earth observations to impact decision-making related to a local or regional community concern. Small, interdisciplinary teams create a methodology to address the specific issue, and then pass on the results to partner organizations, as well as providing them with instruction to continue using remote sensing for future decisions. Many different methods are used by individual teams, and the program as a whole, to communicate results and research accomplishments to decision-makers, stakeholders, alumni, and the general public. These methods vary in scope from formal publications to more informal venues, such as social media. This presentation will highlight the communication techniques used by the DEVELOP program. Audiences, strategies, and outlets will be discussed, including a newsletter, microjournal, video contest, and several others.
Does Choice of Multicriteria Method Matter? An Experiment in Water Resources Planning
NASA Astrophysics Data System (ADS)
Hobbs, Benjamin F.; Chankong, Vira; Hamadeh, Wael; Stakhiv, Eugene Z.
1992-07-01
Many multiple criteria decision making methods have been proposed and applied to water planning. Their purpose is to provide information on tradeoffs among objectives and to help users articulate value judgments in a systematic, coherent, and documentable manner. The wide variety of available techniques confuses potential users, causing inappropriate matching of methods with problems. Experiments in which water planners apply more than one multicriteria procedure to realistic problems can help dispel this confusion by testing method appropriateness, ease of use, and validity. We summarize one such experiment where U.S. Army Corps of Engineers personnel used several methods to screen urban water supply plans. The methods evaluated include goal programming, ELECTRE I, additive value functions, multiplicative utility functions, and three techniques for choosing weights (direct rating, indifference tradeoff, and the analytic hierarchy process). Among the conclusions we reach are the following. First, experienced planners generally prefer simpler, more transparent methods. Additive value functions are favored. Yet none of the methods is endorsed by a majority of the participants; many preferred to use no formal method at all. Second, there is strong evidence that rating, the most commonly applied weight selection method, is likely to lead to weights that fail to represent the tradeoffs that users are willing to make among criteria. Finally, we show that decisions can be as sensitive, or more sensitive, to the method used as to which person applies it. Therefore, if who chooses is important, then so too is how a choice is made.
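A minimal sketch of the additive-value-function scoring favored by the participants; the plan names, criterion scores, and weights below are hypothetical:

```python
# Criterion scores v_i(plan) are assumed already scaled to [0, 1].
plans = {
    "Plan A": {"cost": 0.6, "reliability": 0.9, "environment": 0.4},
    "Plan B": {"cost": 0.8, "reliability": 0.5, "environment": 0.7},
}
weights = {"cost": 0.5, "reliability": 0.3, "environment": 0.2}

def additive_value(scores, weights):
    # V(plan) = sum_i w_i * v_i(plan)
    return sum(weights[c] * v for c, v in scores.items())

ranked = sorted(plans, key=lambda p: additive_value(plans[p], weights), reverse=True)
print(ranked)  # plans in descending order of overall value
```

The experiment's caution applies here: the ranking can hinge on how the weights were elicited, not only on who supplies them.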
NASA Technical Reports Server (NTRS)
Bhatia, A. K.
2012-01-01
The P-wave hybrid theory of electron-hydrogen elastic scattering [Phys. Rev. A 85, 052708 (2012)] is applied to P-wave scattering from the He+ ion. In this method, both short-range and long-range correlations are included in the Schroedinger equation at the same time, by using a combination of a modified method of polarized orbitals and the optical potential formalism. The short-range correlation functions are of the Hylleraas type. It is found that the phase shifts are not significantly affected by the modification of the target function by a method similar to the method of polarized orbitals, and they are close to the phase shifts calculated earlier by Bhatia [Phys. Rev. A 69, 032714 (2004)]. This indicates that the correlation function is general enough to include the target distortion (polarization) in the presence of the incident electron. The important fact is that in the present calculation only a 20-term correlation function is needed in the wave function, compared to the 220-term wave function required in the above-mentioned calculation. Results for the phase shifts, obtained in the present hybrid formalism, are rigorous lower bounds to the exact phase shifts. The lowest P-wave resonances in the He atom and the hydrogen ion have been calculated and compared with the results obtained using the Feshbach projection-operator formalism [Phys. Rev. A 11, 2018 (1975)]. It is concluded that accurate resonance parameters can be obtained by the present method, which has the advantage of including corrections due to neighboring resonances, bound states, and the continuum in which these resonances are embedded.
Selecting Essential Information for Biosurveillance—A Multi-Criteria Decision Analysis
Generous, Nicholas; Margevicius, Kristen J.; Taylor-McCabe, Kirsten J.; Brown, Mac; Daniel, W. Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina
2014-01-01
The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method to apply formal scientific decision-theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of "essential information" for use in biosurveillance systems or processes, and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system. PMID:24489748
NASA Technical Reports Server (NTRS)
Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor); Erickson, John D. (Inventor); Gracinin, Denis (Inventor)
2010-01-01
Systems, methods and apparatus are provided through which in some embodiments an informal specification is translated without human intervention into a formal specification. In some embodiments the formal specification is a process-based specification. In some embodiments, the formal specification is translated into a high-level computer programming language which is further compiled into a set of executable computer instructions.
Ten Commandments Revisited: A Ten-Year Perspective on the Industrial Application of Formal Methods
NASA Technical Reports Server (NTRS)
Bowen, Jonathan P.; Hinchey, Michael G.
2005-01-01
Ten years ago, our 1995 paper Ten Commandments of Formal Methods suggested some guidelines to help ensure the success of a formal methods project. It proposed ten important requirements (or "commandments") for formal developers to consider and follow, based on our knowledge of several industrial application success stories, most of which have been reported in more detail in two books. The paper was surprisingly popular, is still widely referenced, and used as required reading in a number of formal methods courses. However, not all have agreed with some of our commandments, feeling that they may not be valid in the long-term. We re-examine the original commandments ten years on, and consider their validity in the light of a further decade of industrial best practice and experiences.
NASA Technical Reports Server (NTRS)
Bolton, Matthew L.; Bass, Ellen J.
2009-01-01
Both the human factors engineering (HFE) and formal methods communities are concerned with finding and eliminating problems with safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to use model checking with HFE practices to perform formal verification of a human-interactive system. Despite the use of a seemingly simple target system, a patient controlled analgesia pump, the initial model proved to be difficult for the model checker to verify in a reasonable amount of time. This resulted in a number of model revisions that affected the HFE architectural, representativeness, and understandability goals of the effort. If formal methods are to meet the needs of the HFE community, additional modeling tools and technological developments are necessary.
Formal Safety Certification of Aerospace Software
NASA Technical Reports Server (NTRS)
Denney, Ewen; Fischer, Bernd
2005-01-01
In principle, formal methods offer many advantages for aerospace software development: they can help to achieve ultra-high reliability, and they can be used to provide evidence of the reliability claims which can then be subjected to external scrutiny. However, despite years of research and many advances in the underlying formalisms of specification, semantics, and logic, formal methods are not much used in practice. In our opinion this is related to three major shortcomings. First, the application of formal methods is still expensive because they are labor- and knowledge-intensive. Second, they are difficult to scale up to complex systems because they are based on deep mathematical insights about the behavior of the systems (i.e., they rely on the "heroic proof"). Third, the proofs can be difficult to interpret, and typically stand in isolation from the original code. In this paper, we describe a tool for formally demonstrating safety-relevant aspects of aerospace software, which largely circumvents these problems. We focus on safety properties because it has been observed that safety violations such as out-of-bounds memory accesses or use of uninitialized variables constitute the majority of the errors found in the aerospace domain. In our approach, safety means that the program will not violate a set of rules that can range from simple memory-access rules to high-level flight rules. These different safety properties are formalized as different safety policies in Hoare logic, which are then used by a verification condition generator along with the code and logical annotations in order to derive formal safety conditions; these are then proven using an automated theorem prover. Our certification system is currently integrated into a model-based code generation toolset that generates the annotations together with the code. However, this automated formal certification technology is not exclusively constrained to our code generator and could, in principle, also be integrated with other code generators such as Real-Time Workshop, or even applied to legacy code. Our approach circumvents the historical problems with formal methods by increasing the degree of automation on all levels. The restriction to safety policies (as opposed to arbitrary functional behavior) results in simpler proof problems that can generally be solved by fully automatic theorem provers. An automated linking mechanism between the safety conditions and the code provides some of the traceability mandated by process standards such as DO-178B. An automated explanation mechanism uses semantic markup added by the verification condition generator to produce natural-language explanations of the safety conditions, and thus supports their interpretation in relation to the code. An automatically generated certification browser lets users inspect the generated code along with the safety conditions (including textual explanations), and uses hyperlinks to automate tracing between the two levels. Here, the explanations reflect the logical structure of the safety obligation, but the mechanism can in principle be customized using different sets of domain concepts. The interface also provides some limited control over the certification process itself.
Our long-term goal is a seamless integration of certification, code generation, and manual coding that results in a "certified pipeline" in which specifications are automatically transformed into executable code, together with the supporting artifacts necessary for achieving and demonstrating the high level of assurance needed in the aerospace domain.
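To illustrate the kind of proof obligation involved (a toy example, not drawn from the toolset itself): under the memory-safety policy, each array access a[i] reachable under a path condition P yields a verification condition requiring the index to be in bounds,

$$
P\;\Longrightarrow\;0\le i\;\wedge\;i<\mathrm{length}(a),
$$

which the automated theorem prover then discharges.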
Renormalization group methods for the Reynolds stress transport equations
NASA Technical Reports Server (NTRS)
Rubinstein, R.
1992-01-01
The Yakhot-Orszag renormalization group is used to analyze the pressure gradient-velocity correlation and return to isotropy terms in the Reynolds stress transport equations. The perturbation series for the relevant correlations, evaluated to lowest order in the epsilon-expansion of the Yakhot-Orszag theory, are infinite series in tensor product powers of the mean velocity gradient and its transpose. Formal lowest order Pade approximations to the sums of these series produce a rapid pressure strain model of the form proposed by Launder, Reece, and Rodi, and a return to isotropy model of the form proposed by Rotta. In both cases, the model constants are computed theoretically. The predicted Reynolds stress ratios in simple shear flows are evaluated and compared with experimental data. The possibility is discussed of deriving higher order nonlinear models by approximating the sums more accurately. The Yakhot-Orszag renormalization group provides a systematic procedure for deriving turbulence models. Typical applications have included theoretical derivation of the universal constants of isotropic turbulence theory, such as the Kolmogorov constant, and derivation of two equation models, again with theoretically computed constants and low Reynolds number forms of the equations. Recent work has applied this formalism to Reynolds stress modeling, previously in the form of a nonlinear eddy viscosity representation of the Reynolds stresses, which can be used to model the simplest normal stress effects. The present work attempts to apply the Yakhot-Orszag formalism to Reynolds stress transport modeling.
NASA Technical Reports Server (NTRS)
Farrell, W. M.; Hurley, D. M.; Esposito, V. J.; Mclain, J. L.; Zimmerman, M. I.
2017-01-01
We present a new formalism to describe the outgassing of hydrogen initially implanted by solar wind protons into exposed soils on airless bodies. The formalism applies a statistical mechanics approach similar to that applied recently to molecular adsorption onto activated surfaces. The key element enabling this formalism is the recognition that the interatomic potential between the implanted H and regolith-residing oxides is not of singular value but possesses a distribution of trapped energy values at a given temperature, F(U,T). All subsequent derivations of the outward diffusion and H retention rely on the specific properties of this distribution. We find that solar wind hydrogen can be retained if there are sites in the implantation layer with activation energy values exceeding 0.5 eV. We examine in particular the dependence of H retention using characteristic energy values found previously for irradiated silica and mature lunar samples. We also apply the formalism to two cases that differ from typical solar wind implantation at the Moon. First, we test a case of implantation in magnetic anomaly regions, where significantly lower-energy ions of solar wind origin are expected to be incident on the surface. In magnetic anomalies, H retention is found to be reduced due to the reduced ion flux and shallower depth of implantation. Second, we also apply the model to Phobos, where the surface temperature range is not as extreme as on the Moon. We find that H atom retention in this second case is higher than in the lunar case due to the reduced thermal extremes (which reduce outgassing).
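Schematically (our notation, following the abstract rather than the paper's detailed derivation), first-order thermally activated release from traps of energy U with attempt frequency ν, averaged over the trapping-energy distribution F(U,T), gives the retained population

$$
\frac{dN_U}{dt}=-\nu\,e^{-U/k_BT}\,N_U,\qquad N(t)=N_0\int dU\,F(U,T)\,e^{-\nu t\,e^{-U/k_BT}},
$$

so sites with U above roughly 0.5 eV release hydrogen only very slowly at typical lunar surface temperatures.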
Improved formalism for precision Higgs coupling fits
NASA Astrophysics Data System (ADS)
Barklow, Tim; Fujii, Keisuke; Jung, Sunghoon; Karl, Robert; List, Jenny; Ogawa, Tomohisa; Peskin, Michael E.; Tian, Junping
2018-03-01
Future e+e- colliders give the promise of model-independent determinations of the couplings of the Higgs boson. In this paper, we present an improved formalism for extracting Higgs boson couplings from e+e- data, based on the effective field theory description of corrections to the Standard Model. We apply this formalism to give projections of Higgs coupling accuracies for stages of the International Linear Collider and for other proposed e+e- colliders.
Nuclear deformation in the laboratory frame
NASA Astrophysics Data System (ADS)
Gilbreth, C. N.; Alhassid, Y.; Bertsch, G. F.
2018-01-01
We develop a formalism for calculating the distribution of the axial quadrupole operator in the laboratory frame within the rotationally invariant framework of the configuration-interaction shell model. The calculation is carried out using a finite-temperature auxiliary-field quantum Monte Carlo method. We apply this formalism to isotope chains of even-mass samarium and neodymium nuclei and show that the quadrupole distribution provides a model-independent signature of nuclear deformation. Two technical advances are described that greatly facilitate the calculations. The first is to exploit the rotational invariance of the underlying Hamiltonian to reduce the statistical fluctuations in the Monte Carlo calculations. The second is to determine quadrupole invariants from the distribution of the axial quadrupole operator in the laboratory frame. This allows us to extract effective values of the intrinsic quadrupole shape parameters without invoking an intrinsic frame or a mean-field approximation.
Huang, Shih-Wei; Chi, Wen-Chou; Yen, Chia-Feng; Chang, Kwang-Hwa; Liao, Hua-Fang; Escorpizo, Reuben; Chang, Feng-Hang; Liou, Tsan-Hon
2017-01-01
Background: The WHO Disability Assessment Schedule 2.0 (WHODAS 2.0) is a feasible tool for assessing functional disability and analysing the risk of institutionalisation among elderly patients with dementia. However, data on the effect of education on disability status in patients with dementia are lacking. The aim of this large-scale, population-based study was to analyse the effect of education on the disability status of elderly Taiwanese patients with dementia by using WHODAS 2.0. Methods: From the Taiwan Data Bank of Persons with Disability, we enrolled 7698 disabled elderly (older than 65 years) patients diagnosed with dementia between July 2012 and January 2014. According to their education status, we categorised these patients into groups with and without formal education (3849 patients each). We controlled for the demographic variables through propensity score matching. The standardised scores of these patients in the six domains of WHODAS 2.0 were evaluated by certified interviewers. Student's t-test was used for comparing the WHODAS 2.0 scores of patients with dementia in the two aforementioned groups. Poisson regression was applied for analysing the association among all the investigated variables. Results: Patients with formal education had lower disability status in the domains of getting along and social participation than did patients without formal education. Poisson regression revealed that standardised scores in all domains of WHODAS 2.0, except self-care, were associated with education status. Conclusions: This study revealed lower disability status in the WHODAS 2.0 domains of getting along and social participation for patients with dementia with formal education compared with those without formal education. For patients with disability and dementia without formal education, community interventions targeting social participation should be implemented to maintain better social interaction ability. PMID:28473510
NASA Astrophysics Data System (ADS)
Tichý, Vladimír; Hudec, René; Němcová, Šárka
2016-06-01
The algorithm presented is intended mainly for lobster eye optics. This type of optics (and some similar types) allows for a simplification of the classical ray-tracing procedure, which requires a great many rays to simulate. The method presented simulates only a few rays; it is therefore extremely efficient. Moreover, to simplify the equations, a specific mathematical formalism is used. Only a few simple equations are involved, so the program code can be correspondingly simple. The paper also outlines how to apply the method to some other reflective optical systems.
Path Finding on High-Dimensional Free Energy Landscapes
NASA Astrophysics Data System (ADS)
Díaz Leines, Grisell; Ensing, Bernd
2012-07-01
We present a method for determining the average transition path and the free energy along this path in the space of selected collective variables. The formalism is based upon a history-dependent bias along a flexible path variable within the metadynamics framework but with a trivial scaling of the cost with the number of collective variables. Controlling the sampling of the orthogonal modes recovers the average path and the minimum free energy path as the limiting cases. The method is applied to resolve the path and the free energy of a conformational transition in alanine dipeptide.
Symmetries of SU(2) Skyrmion in Hamiltonian and Lagrangian Approaches
NASA Astrophysics Data System (ADS)
Hong, Soon-Tae; Kim, Yong-Wan; Park, Young-Jai
We apply the Batalin-Fradkin-Tyutin (BFT) method to the SU(2) Skyrmion to study the full symmetry structure of the model at the first-class Hamiltonian level. On the other hand, we also analyze the symmetry structure of the action having the WZ term, which corresponds to this Hamiltonian, in the framework of the Lagrangian approach. Furthermore, following the BFV formalism we derive the BRST-invariant gauge-fixed Lagrangian from the above extended action.
1978-01-17
approach to designing computers: formal mathematical methods were applied and computers themselves began to be widely used in designing other... capital, labor resources and the funds of consumers. Analysis of the model indicates that at the present time the average complexity of production of... ALGORITHMIC COMPLETENESS AND COMPLEXITY OF MICROPROGRAMS. Kiev KIBERNETIKA in Russian No 3, May/Jun 77 pp 1-15, manuscript received 22 Dec 76. GOLUNKOV
Two Paradoxes in Linear Regression Analysis
FENG, Ge; PENG, Jing; TU, Dongke; ZHENG, Julia Z.; FENG, Changyong
2016-01-01
Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection. PMID:28638214
2015-01-13
applying formal methods to systems software, e.g., IronClad [16] and seL4 [19], promise that this vision is not a fool's errand after all. In this... kernel seL4 [19] is fully verified for functional correctness and it runs with other deprivileged services. However, the verification process used... portion, which is non-trivial for theorem proving-based approaches. In our COSS example, adding the trusted network logging extensions to seL4 will
Trost, Barry M; Debien, Laurent
2016-01-01
Diorganocuprate(I) reagents derived from lithiated heterocycles and CuCN react with enantioenriched secondary propargyl bromides to give the corresponding propargylated heterocycles. While propargyl electrophiles typically undergo SN2' displacement, this transformation represents the first example of the reaction of hard carbanions with propargyl electrophiles in an SN2 fashion and occurs with excellent levels of stereoinversion. The new method was applied to the formal synthesis of (+)-frondosin B.
NASA Astrophysics Data System (ADS)
Kjærgaard, Thomas; Baudin, Pablo; Bykov, Dmytro; Eriksen, Janus Juul; Ettenhuber, Patrick; Kristensen, Kasper; Larkin, Jeff; Liakh, Dmitry; Pawłowski, Filip; Vose, Aaron; Wang, Yang Min; Jørgensen, Poul
2017-03-01
We present a scalable cross-platform hybrid MPI/OpenMP/OpenACC implementation of the Divide-Expand-Consolidate (DEC) formalism with portable performance on heterogeneous HPC architectures. The Divide-Expand-Consolidate formalism is designed to reduce the steep computational scaling of conventional many-body methods employed in electronic structure theory to linear scaling, while providing a simple mechanism for controlling the error introduced by this approximation. Our massively parallel implementation of this general scheme has three levels of parallelism, being a hybrid of the loosely coupled task-based parallelization approach and the conventional MPI+X programming model, where X is either OpenMP or OpenACC. We demonstrate strong and weak scalability of this implementation on heterogeneous HPC systems, namely on the GPU-based Cray XK7 Titan supercomputer at the Oak Ridge National Laboratory. Using the "resolution of the identity second-order Møller-Plesset perturbation theory" (RI-MP2) as the physical model for simulating correlated electron motion, the linear-scaling DEC implementation is applied to 1-aza-adamantane-trione (AAT) supramolecular wires containing up to 40 monomers (2440 atoms, 6800 correlated electrons, 24 440 basis functions and 91 280 auxiliary functions). This represents the largest molecular system treated at the MP2 level of theory, demonstrating an efficient removal of the scaling wall pertinent to conventional quantum many-body methods.
Two-Step Formal Advertisement: An Examination.
1976-10-01
The purpose of this report is to examine the potential application of the Two-Step Formal Advertisement method of procurement. Emphasis is placed on... Step formal advertising is a method of procurement designed to take advantage of negotiation flexibility and at the same time obtain the benefits of... formal advertising. It is used where the specifications are not sufficiently definite or may be too restrictive to permit full and free competition
Formal Methods for Verification and Validation of Partial Specifications: A Case Study
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Callahan, John
1997-01-01
This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors, without the burden of full proofs of correctness. We describe a case study of the use of partial formal models for V&V of the requirements for Fault Detection Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and it is the process of formalization, rather than the end product that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.
Vector-Vector Scattering on the Lattice
NASA Astrophysics Data System (ADS)
Romero-López, Fernando; Urbach, Carsten; Rusetsky, Akaki
2018-03-01
In this work we present an extension of the Lüscher formalism to include the interaction of particles with spin, focusing on the scattering of two vector particles. The derived formalism will be applied to scalar QED in the Higgs phase, where the U(1) gauge boson acquires mass.
Formal hardware verification of digital circuits
NASA Technical Reports Server (NTRS)
Joyce, J.; Seger, C.-J.
1991-01-01
The use of formal methods to verify the correctness of digital circuits is less constrained by the growing complexity of digital circuits than conventional methods based on exhaustive simulation. This paper briefly outlines three main approaches to formal hardware verification: symbolic simulation, state machine analysis, and theorem-proving.
Landau's statistical mechanics for quasi-particle models
NASA Astrophysics Data System (ADS)
Bannur, Vishnu M.
2014-04-01
Landau's formalism of statistical mechanics [following L. D. Landau and E. M. Lifshitz, Statistical Physics (Pergamon Press, Oxford, 1980)] is applied to the quasi-particle model of quark-gluon plasma. Here, one starts from the expression for pressure and develops all of thermodynamics. It is a general formalism and is consistent with our earlier studies [V. M. Bannur, Phys. Lett. B647, 271 (2007)] based on Pathria's formalism [following R. K. Pathria, Statistical Mechanics (Butterworth-Heinemann, Oxford, 1977)]. In Pathria's formalism, one starts from the expression for energy density and develops thermodynamics. Both formalisms are consistent with thermodynamics and statistical mechanics. Under certain conditions, wrongly called the thermodynamic consistency relation, we recover other quasi-particle formalisms, such as that of M. I. Gorenstein and S. N. Yang, Phys. Rev. D52, 5206 (1995), widely studied for quark-gluon plasma.
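In such a pressure-based formulation, the remaining thermodynamic functions follow from P(T) alone; at zero chemical potential, for example,

$$
s(T)=\frac{\partial P}{\partial T},\qquad \varepsilon(T)=T\frac{\partial P}{\partial T}-P.
$$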
Improved formalism for precision Higgs coupling fits
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barklow, Tim; Fujii, Keisuke; Jung, Sunghoon
Future e+e- colliders give the promise of model-independent determinations of the couplings of the Higgs boson. In this paper, we present an improved formalism for extracting Higgs boson couplings from e+e- data, based on the effective field theory description of corrections to the Standard Model. Lastly, we apply this formalism to give projections of Higgs coupling accuracies for stages of the International Linear Collider and for other proposed e+e- colliders.
Improved formalism for precision Higgs coupling fits
Barklow, Tim; Fujii, Keisuke; Jung, Sunghoon; ...
2018-03-20
Future e+e- colliders give the promise of model-independent determinations of the couplings of the Higgs boson. In this paper, we present an improved formalism for extracting Higgs boson couplings from e+e- data, based on the effective field theory description of corrections to the Standard Model. Lastly, we apply this formalism to give projections of Higgs coupling accuracies for stages of the International Linear Collider and for other proposed e+e- colliders.
Formal functional test designs with a test representation language
NASA Technical Reports Server (NTRS)
Hops, J. M.
1993-01-01
The application of the category-partition method to the test design phase of hardware, software, or system test development is discussed. The method provides a formal framework for reducing the total number of possible test cases to a minimum logical subset for effective testing. An automatic tool and a formal language were developed to implement the method and produce the specification of test cases.
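A minimal sketch of the category-partition idea: enumerate the cross product of category choices and prune infeasible frames with constraints. The categories, choices, and constraint below are hypothetical, not those of the report's test specification.

```python
from itertools import product

categories = {
    "input_size":  ["empty", "single", "many"],
    "value_range": ["in_bounds", "out_of_bounds"],
    "mode":        ["nominal", "degraded"],
}

def frames(categories, constraint=lambda f: True):
    """Enumerate test frames as the cross product of category choices,
    keeping only combinations the constraint deems feasible."""
    names = list(categories)
    for combo in product(*categories.values()):
        frame = dict(zip(names, combo))
        if constraint(frame):
            yield frame

# Example constraint: an empty input has no meaningful out-of-bounds value.
feasible = list(frames(categories,
                       lambda f: not (f["input_size"] == "empty"
                                      and f["value_range"] == "out_of_bounds")))
print(len(feasible))  # reduced from 12 to 10 frames
```

The constraints are what reduce the total number of possible test cases to the minimal logical subset the method aims for.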
NASA Technical Reports Server (NTRS)
Bhatia, Anand K.
2008-01-01
Applications of the hybrid theory to the scattering of electrons from He+ and Li++ and resonances in these systems, A. K. Bhatia, NASA/Goddard Space Flight Center. The hybrid theory of electron-hydrogen elastic scattering [1] is applied to the S-wave scattering of electrons from He+ and Li++. In this method, both short-range and long-range correlations are included in the Schrodinger equation at the same time. Phase shifts obtained in this calculation have rigorous lower bounds to the exact phase shifts, and they are compared with those obtained using the Feshbach projection-operator formalism [2], the close-coupling approach [3], and the Harris-Nesbet method [4]. The agreement among all the calculations is very good. These systems have doubly-excited or Feshbach resonances embedded in the continuum. The resonance parameters for the lowest 1S resonances in He and Li+ are calculated and compared with the results obtained using the Feshbach projection-operator formalism [5,6]. It is concluded that accurate resonance parameters can be obtained by the present method, which has the advantage of including corrections due to neighboring resonances and the continuum in which these resonances are embedded.
Applying Automated Theorem Proving to Computer Security
2008-03-01
CS96]”. Violations of policy can also be specified in this model. La Padula [Pad90] discusses a domain-independent formal model which implements a… Science Laboratory, SRI International, Menlo Park, CA, September 1999. Pad90. L.J. La Padula. Formal modeling in a generalized framework for access…
Formal and Applied Counseling in Israel
ERIC Educational Resources Information Center
Israelashvili, Moshe; Wegman-Rozi, Orit
2012-01-01
Living in Israel is intensive and demanding but also meaningful and exciting. This article addresses the gap between the narrowly defined formal status of counseling in Israel and the widespread occurrence of counseling in various settings. It is argued that several recent changes, especially in the definition of treatment, along with the…
Piagetian Research as Applied to Teaching Science to Secondary and College Students.
ERIC Educational Resources Information Center
Gabel, Dorothy L.
1979-01-01
Piaget's formal operational stage is related to the teaching of science by focusing on the development of paper and pencil tests for determining students' cognitive level of development and on procedures for helping concrete operational students improve achievement and become more formal in their thinking. (JMF)
A Review of Auditing Methods Applied to the Content of Controlled Biomedical Terminologies
Zhu, Xinxin; Fan, Jung-Wei; Baorto, David M.; Weng, Chunhua; Cimino, James J.
2012-01-01
Although controlled biomedical terminologies have been with us for centuries, it is only in the last couple of decades that close attention has been paid to the quality of these terminologies. The result of this attention has been the development of auditing methods that apply formal methods to assessing whether terminologies are complete and accurate. We have performed an extensive literature review to identify published descriptions of these methods and have created a framework for characterizing them. The framework considers manual, systematic and heuristic methods that use knowledge (within or external to the terminology) to measure quality factors of different aspects of the terminology content (terms, semantic classification, and semantic relationships). The quality factors examined included concept orientation, consistency, non-redundancy, soundness and comprehensive coverage. We reviewed 130 studies that were retrieved by a keyword search of publications in PubMed, and present our assessment of how they fit into our framework. We also identify which terminologies have been audited with the methods and provide examples to illustrate each part of the framework. PMID:19285571
Heuristics structure and pervade formal risk assessment.
MacGillivray, Brian H
2014-04-01
Lay perceptions of risk appear rooted more in heuristics than in reason. A major concern of the risk regulation literature is that such "error-strewn" perceptions may be replicated in policy, as governments respond to the (mis)fears of the citizenry. This has led many to advocate a relatively technocratic approach to regulating risk, characterized by high reliance on formal risk and cost-benefit analysis. However, through two studies of chemicals regulation, we show that the formal assessment of risk is pervaded by its own set of heuristics. These include rules to categorize potential threats, define what constitutes valid data, guide causal inference, and select and apply formal models. Some of these heuristics lay claim to theoretical or empirical justifications, others are more back-of-the-envelope calculations, while still others purport not to reflect some truth but simply to constrain discretion or perform a desk-clearing function. These heuristics can be understood as a way of authenticating or formalizing risk assessment as a scientific practice, representing a series of rules for bounding problems, collecting data, and interpreting evidence (a methodology). Heuristics are indispensable elements of induction. And so they are not problematic per se, but they can become so when treated as laws rather than as contingent and provisional rules. Pitfalls include the potential for systematic error, masking uncertainties, strategic manipulation, and entrenchment. Our central claim is that by studying the rules of risk assessment qua rules, we develop a novel representation of the methods, conventions, and biases of the prior art. © 2013 Society for Risk Analysis.
Baji, Petra; Pavlova, Milena; Gulácsi, László; Farkas, Miklós; Groot, Wim
2014-11-01
We examine the willingness of health care consumers to pay formal fees for health care use and how this willingness to pay is associated with past informal payments. We use data from a survey carried out in Hungary in 2010 among a representative sample of 1,037 respondents. The contingent valuation method is used to elicit the willingness to pay official charges for health care services covered by the social health insurance if certain quality attributes (regarding the health care facility, access to the services and health care personnel) are guaranteed. A bivariate probit model is applied to examine the relationship between willingness to pay and past informal payments. We find that 66% of the respondents are willing to pay formal fees for specialist examinations and 56% are willing to pay for planned hospitalizations if these services are provided with certain quality and access attributes. The act of making past informal payments for health care services is positively associated with the willingness to pay formal charges. The probability that a respondent is willing to pay official charges for health care services is 22 percentage points higher for specialist examinations and 45 percentage points higher for hospitalization if the respondent paid informally during the last 12 months. The introduction of formal fees should be accompanied by adequate service provision to assure acceptance of the fees. Furthermore, our results suggest that the problem of informal patient payments may remain even after the implementation of user fees.
Methodological developments in US state-level Genuine Progress Indicators: toward GPI 2.0
Bagstad, Kenneth J.; Berik, Günseli; Gaddis, Erica J. Brown
2014-01-01
The Genuine Progress Indicator (GPI) has emerged as an important monetary measure of economic well-being. Unlike mainstream economic indicators, primarily Gross Domestic Product (GDP), the GPI accounts for both the benefits and costs of economic production across diverse economic, social, and environmental domains in a more comprehensive manner. Recently, the GPI has gained traction in subnational policy in the United States, with GPI studies being conducted in a number of states and with their formal adoption by several state governments. As the GPI is applied in different locations, new methods are developed, different data sources are available, and new issues of policy relevance are addressed using its component indicators. This has led to a divergence in methods, reducing comparability between studies and yielding results that are of varying methodological sophistication. In this study, we review the “state of the art” in recent US state-level GPI studies, focusing on those from Hawaii, Maryland, Ohio, Utah, and Vermont. Through adoption of a consistent approach, these and future GPI studies could utilize a framework that supports more uniform, comparable, and accurate measurements of progress. We also identify longer-term issues, particularly related to treatment of nonrenewable resource depletion, government spending, income inequality, and ecosystem services. As these issues are successfully addressed and disseminated, a “GPI 2.0” will emerge that better measures economic well-being and has greater accuracy and policy relevance than past GPI measurements. As the GPI expands further into mainstream policy analysis, a more formal process by which methods could be updated, standardized, and applied is needed.
Zhang, Yong; Otani, Akihito; Maginn, Edward J
2015-08-11
Equilibrium molecular dynamics is often used in conjunction with a Green-Kubo integral of the pressure tensor autocorrelation function to compute the shear viscosity of fluids. This approach is computationally expensive and is subject to a large amount of variability because the plateau region of the Green-Kubo integral is difficult to identify unambiguously. Here, we propose a time decomposition approach for computing the shear viscosity using the Green-Kubo formalism. Instead of one long trajectory, multiple independent trajectories are run and the Green-Kubo relation is applied to each trajectory. The averaged running integral as a function of time is fit to a double-exponential function with a weighting function derived from the standard deviation of the running integrals. Such a weighting function minimizes the uncertainty of the estimated shear viscosity and provides an objective means of estimating the viscosity. While the formal Green-Kubo integral requires an integration to infinite time, we suggest an integration cutoff time t_cut, which can be determined by the relative values of the running integral and the corresponding standard deviation. This approach for computing the shear viscosity can be easily automated and used in computational screening studies where human judgment and intervention in the data analysis are impractical. The method has been applied to the calculation of the shear viscosity of a relatively low-viscosity liquid, ethanol, and a relatively high-viscosity ionic liquid, 1-n-butyl-3-methylimidazolium bis(trifluoromethane-sulfonyl)imide ([BMIM][Tf2N]), over a range of temperatures. These test cases show that the method is robust and yields reproducible and reliable shear viscosity values.
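A minimal numerical sketch of the time-decomposition approach (synthetic running integrals; the double-exponential form and standard-deviation weighting follow the description above, but the constants, cutoff rule, and data are invented for illustration):

    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(0)
    t = np.linspace(0.01, 10.0, 500)

    # Stand-ins for Green-Kubo running integrals from independent trajectories.
    plateau = 1.0 * (1 - np.exp(-t / 2.0))
    runs = np.array([plateau + np.cumsum(rng.normal(0, 0.01, t.size))
                     for _ in range(20)])

    mean = runs.mean(axis=0)
    std = runs.std(axis=0, ddof=1)      # grows with time, as described above

    def double_exp(t, A, alpha, tau1, tau2):
        return A * (alpha * (1 - np.exp(-t / tau1))
                    + (1 - alpha) * (1 - np.exp(-t / tau2)))

    # Weight by the standard deviation so the noisy long-time tail does not
    # dominate, and fit only up to an illustrative cutoff time t_cut.
    cut = std < 0.4 * std.max()
    p, _ = curve_fit(double_exp, t[cut], mean[cut], sigma=std[cut],
                     p0=[1.0, 0.5, 1.0, 5.0], maxfev=20000)
    print("estimated plateau value:", p[0])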
NASA Technical Reports Server (NTRS)
Jamsek, Damir A.
1993-01-01
A brief example of the use of formal methods techniques in the specification of a software system is presented. The report is part of a larger effort targeted at defining a formal methods pilot project for NASA. One possible application domain that may be used to demonstrate the effective use of formal methods techniques within the NASA environment is presented. It is not intended to provide a tutorial on either formal methods techniques or the application being addressed. It should, however, provide an indication that the application being considered is suitable for formal methods by showing how such a task may be started. The particular system being addressed is the Structured File Services (SFS), which is a part of the Data Storage and Retrieval Subsystem (DSAR), which in turn is part of the Data Management System (DMS) onboard Space Station Freedom. This is a software system that is currently under development for NASA. An informal mathematical development is presented. Section 3 contains the same development using Penelope (23), an Ada specification and verification system. The complete text of the English version Software Requirements Specification (SRS) is reproduced in Appendix A.
Using Formal Methods to Assist in the Requirements Analysis of the Space Shuttle GPS Change Request
NASA Technical Reports Server (NTRS)
DiVito, Ben L.; Roberts, Larry W.
1996-01-01
We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CR's) were selected as promising targets to demonstrate the utility of formal methods in this application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this report. Carried out in parallel with the Shuttle program's conventional requirements analysis process was a limited form of analysis based on formalized requirements. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During the formal methods-based analysis, numerous requirements issues were discovered and submitted as official issues through the normal requirements inspection process. Shuttle analysts felt that many of these issues were uncovered earlier than would have occurred with conventional methods. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.
New QCD sum rules based on canonical commutation relations
NASA Astrophysics Data System (ADS)
Hayata, Tomoya
2012-04-01
A new derivation of QCD sum rules based on canonical commutators is developed. It is a simple and straightforward generalization of the Thomas-Reiche-Kuhn sum rule, built on the Kugo-Ojima operator formalism of a non-abelian gauge theory and a suitable subtraction of UV divergences. By applying the method to the vector and axial-vector currents in QCD, the exact Weinberg sum rules are examined. Vector current sum rules and new fractional power sum rules are also discussed.
NASA Astrophysics Data System (ADS)
Wang, S. M.; Michel, N.; Nazarewicz, W.; Xu, F. R.
2017-10-01
Background: Weakly bound and unbound nuclear states appearing around particle thresholds are prototypical open quantum systems. Theories of such states must take into account configuration mixing effects in the presence of strong coupling to the particle continuum space. Purpose: To describe structure and decays of three-body systems, we developed a Gamow coupled-channel (GCC) approach in Jacobi coordinates by employing the complex-momentum formalism. We benchmarked the complex-energy Gamow shell model (GSM) against the new framework. Methods: The GCC formalism is expressed in Jacobi coordinates, so that the center-of-mass motion is automatically eliminated. To solve the coupled-channel equations, we use hyperspherical harmonics to describe the angular wave functions while the radial wave functions are expanded in the Berggren ensemble, which includes bound, scattering, and Gamow states. Results: We show that the GCC method is both accurate and robust. Its results for energies, decay widths, and nucleon-nucleon angular correlations are in good agreement with the GSM results. Conclusions: We have demonstrated that a three-body GSM formalism explicitly constructed in the cluster-orbital shell model coordinates provides results similar to those with a GCC framework expressed in Jacobi coordinates, provided that a large configuration space is employed. Our calculations for A =6 systems and 26O show that nucleon-nucleon angular correlations are sensitive to the valence-neutron interaction. The new GCC technique has many attractive features when applied to bound and unbound states of three-body systems: it is precise, is efficient, and can be extended by introducing a microscopic model of the core.
Smarr formula for Lovelock black holes: A Lagrangian approach
NASA Astrophysics Data System (ADS)
Liberati, Stefano; Pacilio, Costantino
2016-04-01
The mass formula for black holes can be formally expressed in terms of a Noether charge surface integral plus a suitable volume integral, for any gravitational theory. The integrals can be constructed as an application of Wald's formalism. We apply this formalism to compute the mass and the Smarr formula for static Lovelock black holes. Finally, we propose a new prescription for Wald's entropy in the case of Lovelock black holes, which takes into account topological contributions to the entropy functional.
Coarse-grained hydrodynamics from correlation functions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmer, Bruce
This paper describes a formalism for using correlation functions between different grid cells as the basis for determining coarse-grained hydrodynamic equations for modeling the behavior of mesoscopic fluid systems. Configurations from a molecular dynamics simulation are projected onto basis functions representing grid cells in a continuum hydrodynamic simulation. Equilibrium correlation functions between different grid cells are evaluated from the molecular simulation and used to determine the evolution operator for the coarse-grained hydrodynamic system. The formalism is applied to some simple hydrodynamic cases to determine the feasibility of applying this to realistic nanoscale systems.
NASA Technical Reports Server (NTRS)
Shin, Jong-Yeob; Belcastro, Christine
2008-01-01
Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. As a part of the validation process, this paper describes an analysis method for determining a reliable flight regime in the flight envelope, within which an integrated resilient control system can achieve the desired performance of tracking command signals and detecting additive faults in the presence of parameter uncertainty and unmodeled dynamics. To calculate a reliable flight regime, a structured singular value analysis method is applied to analyze the closed-loop system over the entire flight envelope. To use the structured singular value analysis method, a linear fractional transform (LFT) model of a transport aircraft's longitudinal dynamics is developed over the flight envelope by using a preliminary LFT modeling software tool developed at the NASA Langley Research Center, which utilizes a matrix-based computational approach. The developed LFT model can capture the original nonlinear dynamics over the flight envelope with the Δ block, which contains the key varying parameters (angle of attack and velocity) and real parameter uncertainties (aerodynamic coefficient uncertainty and moment of inertia uncertainty). Using the developed LFT model and a formal robustness analysis method, a reliable flight regime is calculated for a transport aircraft closed-loop system.
A new statistical method for characterizing the atmospheres of extrasolar planets
NASA Astrophysics Data System (ADS)
Henderson, Cassandra S.; Skemer, Andrew J.; Morley, Caroline V.; Fortney, Jonathan J.
2017-10-01
By detecting light from extrasolar planets, we can measure their compositions and bulk physical properties. The technologies used to make these measurements are still in their infancy, and a lack of self-consistency suggests that previous observations have underestimated their systematic errors. We demonstrate a statistical method, newly applied to exoplanet characterization, which uses a Bayesian formalism to account for underestimated error bars. We use this method to compare photometry of a substellar companion, GJ 758b, with custom atmospheric models. Our method produces a probability distribution of atmospheric model parameters including temperature, gravity, cloud model (f_sed) and chemical abundance for GJ 758b. This distribution is less sensitive to highly variant data and appropriately reflects a greater uncertainty on parameter fits.
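A common formalism for this kind of error-bar inflation is to add a fractional jitter term to each quoted variance and let the data constrain it; the sketch below shows such a log-likelihood in Python (the functional form and the names model_flux and f are illustrative assumptions, not necessarily the authors' exact likelihood):

    import numpy as np

    def log_likelihood(theta, y, yerr, model_flux):
        # theta[:-1]: atmospheric parameters; theta[-1]: inflation factor f.
        f = theta[-1]
        mu = model_flux(theta[:-1])        # model photometry for these parameters
        s2 = yerr**2 + (f * mu)**2         # quoted variance plus inflation term
        return -0.5 * np.sum((y - mu)**2 / s2 + np.log(2 * np.pi * s2))

Marginalizing over f down-weights bands whose quoted uncertainties are too optimistic, which is how underestimated systematics broaden the posterior.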
Properties of a Formal Method to Model Emergence in Swarm-Based Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy; Truszkowski, Walt; Rash, James; Hinchey, Mike
2004-01-01
Future space missions will require cooperation between multiple satellites and/or rovers. Developers are proposing intelligent autonomous swarms for these missions, but swarm-based systems are difficult or impossible to test with current techniques. This viewgraph presentation examines the use of formal methods in testing swarm-based systems. The potential usefulness of formal methods in modeling the ANTS asteroid encounter mission is also examined.
Formal Methods of V&V of Partial Specifications: An Experience Report
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Callahan, John
1997-01-01
This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors without the burden of full proofs of correctness. We describe an experiment in applying the SCR method to test consistency properties of a partial model of requirements for Fault Detection, Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and that it is the process of formalization, rather than the end product, that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.
NASA Astrophysics Data System (ADS)
Nagai, Tetsuro
2017-01-01
Replica-exchange molecular dynamics (REMD) has demonstrated its efficiency by combining trajectories of a wide range of temperatures. As an extension of the method, the author formalizes the mass-manipulating replica-exchange molecular dynamics (MMREMD) method that allows for arbitrary mass scaling with respect to temperature and individual particles. The formalism enables the versatile application of mass-scaling approaches to the REMD method. The key change introduced in the novel formalism is the generalized rules for the velocity and momentum scaling after accepted replica-exchange attempts. As an application of this general formalism, the refinement of the viscosity-REMD (V-REMD) method [P. H. Nguyen,
Proceedings of the Sixth NASA Langley Formal Methods (LFM) Workshop
NASA Technical Reports Server (NTRS)
Rozier, Kristin Yvonne (Editor)
2008-01-01
Today's verification techniques are hard-pressed to scale with the ever-increasing complexity of safety critical systems. Within the field of aeronautics alone, we find the need for verification of algorithms for separation assurance, air traffic control, auto-pilot, Unmanned Aerial Vehicles (UAVs), adaptive avionics, automated decision authority, and much more. Recent advances in formal methods have made verifying more of these problems realistic. Thus we need to continually re-assess what we can solve now and identify the next barriers to overcome. Only through an exchange of ideas between theoreticians and practitioners from academia to industry can we extend formal methods for the verification of ever more challenging problem domains. This volume contains the extended abstracts of the talks presented at LFM 2008: The Sixth NASA Langley Formal Methods Workshop held on April 30 - May 2, 2008 in Newport News, Virginia, USA. The topics of interest that were listed in the call for abstracts were: advances in formal verification techniques; formal models of distributed computing; planning and scheduling; automated air traffic management; fault tolerance; hybrid systems/hybrid automata; embedded systems; safety critical applications; safety cases; accident/safety analysis.
Using Data-Driven Model-Brain Mappings to Constrain Formal Models of Cognition
Borst, Jelmer P.; Nijboer, Menno; Taatgen, Niels A.; van Rijn, Hedderik; Anderson, John R.
2015-01-01
In this paper we propose a method to create data-driven mappings from components of cognitive models to brain regions. Cognitive models are notoriously hard to evaluate, especially based on behavioral measures alone. Neuroimaging data can provide additional constraints, but this requires a mapping from model components to brain regions. Although such mappings can be based on the experience of the modeler or on a reading of the literature, a formal method is preferred to prevent researcher-based biases. In this paper we used model-based fMRI analysis to create a data-driven model-brain mapping for five modules of the ACT-R cognitive architecture. We then validated this mapping by applying it to two new datasets with associated models. The new mapping was at least as powerful as an existing mapping that was based on the literature, and indicated where the models were supported by the data and where they have to be improved. We conclude that data-driven model-brain mappings can provide strong constraints on cognitive models, and that model-based fMRI is a suitable way to create such mappings. PMID:25747601
NASA Astrophysics Data System (ADS)
Zhou, Chenyi; Guo, Hong
2017-01-01
We report a diagrammatic method to solve the general problem of calculating configurationally averaged Green's function correlators that appear in quantum transport theory for nanostructures containing disorder. The theory treats both equilibrium and nonequilibrium quantum statistics on an equal footing. Since random impurity scattering is a problem that cannot be solved exactly in a perturbative approach, we combine our diagrammatic method with the coherent potential approximation (CPA) so that a reliable closed-form solution can be obtained. Our theory not only ensures the internal consistency of the diagrams derived at different levels of the correlators but also satisfies a set of Ward-like identities that corroborate the conserving consistency of transport calculations within the formalism. The theory is applied to calculate the quantum transport properties such as average ac conductance and transmission moments of a disordered tight-binding model, and results are numerically verified to high precision by comparing to the exact solutions obtained from enumerating all possible disorder configurations. Our formalism can be employed to predict transport properties of a wide variety of physical systems where disorder scattering is important.
Grammar-based Automatic 3D Model Reconstruction from Terrestrial Laser Scanning Data
NASA Astrophysics Data System (ADS)
Yu, Q.; Helmholz, P.; Belton, D.; West, G.
2014-04-01
The automatic reconstruction of 3D buildings has been an important research topic in recent years. In this paper, a novel method is proposed to automatically reconstruct 3D building models from segmented data based on pre-defined formal grammar and rules. Such segmented data can be extracted, e.g., from terrestrial or mobile laser scanning devices. Two steps are considered in detail. The first step is to transform the segmented data into 3D shapes, for instance using the DXF (Drawing Exchange Format) format, which is a CAD file format used for data interchange between AutoCAD and other programs. Second, we develop a formal grammar to describe the building model structure and integrate the pre-defined grammars into the reconstruction process. Depending on the segmented data, the selected grammar and rules are applied to drive the reconstruction process in an automatic manner. Compared with other existing approaches, our proposed method allows model reconstruction directly from 3D shapes and takes the whole building into account.
NASA Astrophysics Data System (ADS)
Galley, Chad R.; Rothstein, Ira Z.
2017-05-01
We utilize the dynamical renormalization group formalism to calculate the real space trajectory of a compact binary inspiral for long times via a systematic resummation of secularly growing terms. This method generates closed form solutions without orbit averaging, and the accuracy can be systematically improved. The expansion parameter is v^5 ν Ω (t − t_0), where t_0 is the initial time, t is the time elapsed, and Ω and v are the angular orbital frequency and initial speed, respectively; ν is the binary's symmetric mass ratio. We demonstrate how to apply the renormalization group method to resum solutions beyond leading order in two ways. First, we calculate the second-order corrections of the leading radiation reaction force, which involves highly nontrivial checks of the formalism (i.e., its renormalizability). Second, we show how to systematically include post-Newtonian corrections to the radiation reaction force. By avoiding orbit averaging, we gain predictive power and eliminate ambiguities in the initial conditions. Finally, we discuss how this methodology can be used to find analytic solutions to the spin equations of motion that are valid over long times.
Revisiting the radio interferometer measurement equation. I. A full-sky Jones formalism
NASA Astrophysics Data System (ADS)
Smirnov, O. M.
2011-03-01
Context. Since its formulation by Hamaker et al., the radio interferometer measurement equation (RIME) has provided a rigorous mathematical basis for the development of novel calibration methods and techniques, including various approaches to the problem of direction-dependent effects (DDEs). However, acceptance of the RIME in the radio astronomical community at large has been slow, which is partially due to the limited availability of software to exploit its power, and the sparsity of practical results. This needs to change urgently. Aims: This series of papers aims to place recent developments in the treatment of DDEs into one RIME-based mathematical framework, and to demonstrate the ease with which the various effects can be described and understood. It also aims to show the benefits of a RIME-based approach to calibration. Methods: Paper I re-derives the RIME from first principles, extends the formalism to the full-sky case, and incorporates DDEs. Paper II then uses the formalism to describe self-calibration, both with a full RIME, and with the approximate equations of older software packages, and shows how this is affected by DDEs. It also gives an overview of real-life DDEs and proposed methods of dealing with them. Finally, in Paper III some of these methods are exercised to achieve an extremely high-dynamic range calibration of WSRT observations of 3C 147 at 21 cm, with full treatment of DDEs. Results: The RIME formalism is extended to the full-sky case (Paper I), and is shown to be an elegant way of describing calibration and DDEs (Paper II). Applying this to WSRT data (Paper III) results in a noise-limited image of the field around 3C 147 with a very high dynamic range (1.6 million), and none of the off-axis artifacts that plague regular selfcal. The resulting differential gain solutions contain significant information on DDEs and errors in the sky model. Conclusions: The RIME is a powerful formalism for describing radio interferometry, and underpins the development of novel calibration methods, in particular those dealing with DDEs. One of these is the differential gains approach used for the 3C 147 reduction. Differential gains can eliminate DDE-related artifacts, and provide information for iterative improvements of sky models. Perhaps most importantly, sources as faint as 2 mJy have been shown to yield meaningful differential gain solutions, and thus can be used as potential calibration beacons in other DDE-related schemes.
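For reference, the algebraic core of the RIME can be stated compactly (a sketch following Hamaker et al.'s 2x2 Jones notation; the full-sky generalization is the subject of Paper I):

    % Visibility of a single source measured by antennas p and q:
    %   each J is a chain of 2x2 Jones matrices along one signal path.
    V_{pq} = J_p \, B \, J_q^{H},
    \qquad J_p = J_{p,n} J_{p,n-1} \cdots J_{p,1}

    % Multiple discrete sources, with direction-independent gains G and
    % direction-dependent terms E factored out per source s:
    V_{pq} = G_p \Big( \sum_s E_{sp} \, B_s \, E_{sq}^{H} \Big) G_q^{H}

Here B is the source brightness matrix and ^H denotes the conjugate transpose.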
Genway, Sam; Garrahan, Juan P; Lesanovsky, Igor; Armour, Andrew D
2012-05-01
Recent progress in the study of dynamical phase transitions has been made with a large-deviation approach to study trajectories of stochastic jumps using a thermodynamic formalism. We study this method applied to an open quantum system consisting of a superconducting single-electron transistor, near the Josephson quasiparticle resonance, coupled to a resonator. We find that the dynamical behavior shown in rare trajectories can be rich even when the mean dynamical activity is small, and thus the formalism gives insights into the form of fluctuations. The structure of the dynamical phase diagram found from the quantum-jump trajectories of the resonator is studied, and we see that sharp transitions in the dynamical activity may be related to the appearance and disappearance of bistabilities in the state of the resonator as system parameters are changed. We also demonstrate that for a fast resonator, the trajectories of quasiparticles are similar to the resonator trajectories.
A medical ontology for intelligent web-based skin lesions image retrieval.
Maragoudakis, Manolis; Maglogiannis, Ilias
2011-06-01
Researchers have applied increasing efforts towards providing formal computational frameworks to consolidate the plethora of concepts and relations used in the medical domain. In the domain of skin related diseases, the variability of semantic features contained within digital skin images is a major barrier to the medical understanding of the symptoms and development of early skin cancers. The desideratum of making these standards machine-readable has led to their formalization in ontologies. In this work, in an attempt to enhance an existing Core Ontology for skin lesion images, hand-coded from image features, high quality images were analyzed by an autonomous ontology creation engine. We show that by exploiting agglomerative clustering methods with distance criteria upon the existing ontological structure, the original domain model could be enhanced with new instances, attributes and even relations, thus allowing for better classification and retrieval of skin lesion categories from the web.
Executable Architecture Research at Old Dominion University
NASA Technical Reports Server (NTRS)
Tolk, Andreas; Shuman, Edwin A.; Garcia, Johnny J.
2011-01-01
Executable architectures allow the evaluation of system architectures not only regarding their static, but also their dynamic behavior. However, the systems engineering community does not agree on a common formal specification of executable architectures. Closing this gap and identifying the necessary elements of an executable architecture, a modeling language, and a modeling formalism is the topic of ongoing PhD research. In addition, systems are generally defined and applied in an operational context to provide capabilities and enable missions. To maximize the benefits of executable architectures, a second PhD effort introduces the idea of creating an executable context in addition to the executable architecture. The results move the validation of architectures from the current information domain into the knowledge domain and improve the reliability of such validation efforts. The paper presents research and results of both doctoral research efforts and puts them into a common context of state-of-the-art systems engineering methods supporting more agility.
A synthesis of studies of access point density as a risk factor for road accidents.
Elvik, Rune
2017-10-01
Studies of the relationship between access point density (number of access points, or driveways, per kilometre of road) and accident frequency or rate (number of accidents per unit of exposure) have consistently found that accident rate increases when access point density increases. This paper presents a formal synthesis of the findings of these studies. It was found that the addition of one access point per kilometre of road is associated with an increase of 4% in the expected number of accidents, controlling for traffic volume. Although studies consistently indicate an increase in accident rate as access point density increases, the size of the increase varies substantially between studies. In addition to reviewing studies of access point density as a risk factor, the paper discusses some issues related to formally synthesising regression coefficients by applying the inverse-variance method of meta-analysis. Copyright © 2017 Elsevier Ltd. All rights reserved.
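The inverse-variance method referred to above weights each study's regression coefficient by the reciprocal of its squared standard error; a minimal sketch with made-up numbers (not the paper's data):

    import numpy as np

    # Per-study coefficients for "accidents per added access point",
    # on the log scale, with their standard errors (illustrative values).
    b = np.array([0.035, 0.052, 0.041])
    se = np.array([0.010, 0.020, 0.015])

    w = 1.0 / se**2                       # inverse-variance weights
    pooled = np.sum(w * b) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    print(f"pooled multiplier per added access point: {np.exp(pooled):.3f}")

With these illustrative inputs the pooled estimate is about 1.04, i.e. the 4% increase per access point quoted above.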
Security Hardened Cyber Components for Nuclear Power Plants: Phase I SBIR Final Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Franusich, Michael D.
SpiralGen, Inc. built a proof-of-concept toolkit for enhancing the cyber security of nuclear power plants and other critical infrastructure with high-assurance instrumentation and control code. The toolkit is based on technology from the DARPA High-Assurance Cyber Military Systems (HACMS) program, which has focused on applying the science of formal methods to the formidable set of problems involved in securing cyber physical systems. The primary challenges beyond HACMS in developing this toolkit were to make the new technology usable by control system engineers and compatible with the regulatory and commercial constraints of the nuclear power industry. The toolkit, packaged as a Simulink add-on, allows a system designer to assemble a high-assurance component from formally specified and proven blocks and generate provably correct control and monitor code for that subsystem.
NASA Astrophysics Data System (ADS)
Kurvaeva, L. V.; Gavrilova, I. V.; Mahmutova, M. V.; Chichilanova, S. A.; Povituhin, S. A.
2018-05-01
The choice of educational digital content according to education goals (descriptors formed from competences, labor functions, etc.) becomes an important practical task because of the variety of educational online systems available to learners in formal and informal IT education. Ontologies can form a basis for building the knowledge bases at the center of intelligent support systems for IT specialist training. The paper describes a technology for creating the ontological model and analyzes the structure and content of its basic data. The interrelation between the knowledge of the considered subject and IT education is also considered. The knowledge base is applied to tasks of educational and methodical support of educational programs in higher and additional professional education and corporate training; to creating certification and testing systems for students and practicing experts; and to forming individual trajectories of training and career development.
An Integrated Environment for Efficient Formal Design and Verification
NASA Technical Reports Server (NTRS)
1998-01-01
The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.
Optimization of permanent breast seed implant dosimetry incorporating tissue heterogeneity
NASA Astrophysics Data System (ADS)
Mashouf, Shahram
Seed brachytherapy is currently used for adjuvant radiotherapy of early stage prostate and breast cancer patients. The current standard for calculation of dose around brachytherapy sources is based on the AAPM TG43 formalism, which generates the dose in a homogeneous water medium. Recently, AAPM Task Group No. 186 (TG186) emphasized the importance of accounting for heterogeneities. In this work we introduce an analytical dose calculation algorithm in heterogeneous media using CT images. The advantages over other methods are computational efficiency and the ease of integration into clinical use. An Inhomogeneity Correction Factor (ICF) is introduced as the ratio of absorbed dose in tissue to that in water medium. ICF is a function of tissue properties and independent of the source structure. The ICF is extracted using CT images, and the absorbed dose in tissue can then be calculated by multiplying the dose calculated by the TG43 formalism by the ICF. To evaluate the methodology, we compared our results with Monte Carlo simulations as well as experiments in phantoms with known density and atomic compositions. The dose distributions obtained by applying the ICF to the TG43 protocol agreed very well with those of Monte Carlo simulations and experiments in all phantoms. In all cases, the mean relative error was reduced by at least a factor of two when the ICF was applied to the TG43 protocol. In conclusion, we have developed a new analytical dose calculation method, which enables personalized dose calculations in heterogeneous media using CT images. The methodology offers several advantages including the use of the standard TG43 formalism, fast calculation time and extraction of the ICF parameters directly from Hounsfield Units. The methodology was implemented into our clinical treatment planning system, where a cohort of 140 patients was processed to study the clinical benefits of a heterogeneity corrected dose.
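The correction itself is a voxel-wise multiplication; a minimal sketch (the mapping from Hounsfield units to ICF below is a placeholder, not the paper's published model):

    import numpy as np

    def icf_from_hu(hu):
        # Placeholder: unity in water-like tissue, illustrative deviations
        # elsewhere; a real implementation uses the paper's HU -> ICF model.
        return np.where(hu > 100, 0.95, np.where(hu < -300, 1.10, 1.0))

    dose_tg43 = np.full((64, 64, 64), 2.0)      # TG43 dose grid in water (Gy)
    hu = np.zeros((64, 64, 64))                 # CT volume (Hounsfield units)
    dose_tissue = dose_tg43 * icf_from_hu(hu)   # D_tissue = D_TG43 * ICF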
Formal and Informal Registration as Marketing Tools: Do They Produce "Trapped" Executives?
ERIC Educational Resources Information Center
Apple, L. Eugene
1993-01-01
A marketing concept was applied to college registration procedures in an experiment, focusing on degree of "escalation" of effort of students who had failed twice to register in desired courses, type of registration used (formal or informal) on each of three tries, and student characteristics (time until graduation, major, gender). (MSE)
Applying the Formal Elements Art Therapy Scale (FEATS) to Adults in an Asian Population
ERIC Educational Resources Information Center
Nan, Joshua Kin-man; Hinz, Lisa D.
2012-01-01
Assessment is the foundation for conceptualizing effective interventions. Due to their nonverbal nature, art therapy assessments have an advantage over traditional verbal assessments in some populations and potentially across cultures. This pilot study provides preliminary reliability data to support the cross-cultural use of the Formal Elements…
Counting Strategies and Semantic Analysis as Applied to Class Inclusion. Report No. 61.
ERIC Educational Resources Information Center
Wilkinson, Alexander
This investigation examined strategic and semantic aspects of the answers given by preschool children to class inclusion problems. The Piagetian logical formalism for class inclusion was contrasted with a new, problem processing formalism in three experiments. In experiment 1, it was found that 48 nursery school subjects nearly always performed…
Integrating reasoning and clinical archetypes using OWL ontologies and SWRL rules.
Lezcano, Leonardo; Sicilia, Miguel-Angel; Rodríguez-Solano, Carlos
2011-04-01
Semantic interoperability is essential to facilitate the computerized support for alerts, workflow management and evidence-based healthcare across heterogeneous electronic health record (EHR) systems. Clinical archetypes, which are formal definitions of specific clinical concepts defined as specializations of a generic reference (information) model, provide a mechanism to express data structures in a shared and interoperable way. However, currently available archetype languages do not provide direct support for mapping to formal ontologies and then exploiting reasoning on clinical knowledge, which are key ingredients of full semantic interoperability, as stated in the SemanticHEALTH report [1]. This paper reports on an approach to translate definitions expressed in the openEHR Archetype Definition Language (ADL) to a formal representation expressed using the Ontology Web Language (OWL). The formal representations are then integrated with rules expressed with Semantic Web Rule Language (SWRL) expressions, providing an approach to apply the SWRL rules to concrete instances of clinical data. Sharing the knowledge expressed in the form of rules is consistent with the philosophy of open sharing, encouraged by archetypes. Our approach also allows the reuse of formal knowledge, expressed through ontologies, and extends reuse to propositions of declarative knowledge, such as those encoded in clinical guidelines. This paper describes the ADL-to-OWL translation approach, describes the techniques to map archetypes to formal ontologies, and demonstrates how rules can be applied to the resulting representation. We provide examples taken from a patient safety alerting system to illustrate our approach. Copyright © 2010 Elsevier Inc. All rights reserved.
Software Formal Inspections Guidebook
NASA Technical Reports Server (NTRS)
1993-01-01
The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.
Royle, J. Andrew; Chandler, Richard B.; Yackulic, Charles; Nichols, James D.
2012-01-01
1. Understanding the factors affecting species occurrence is a pre-eminent focus of applied ecological research. However, direct information about species occurrence is lacking for many species. Instead, researchers sometimes have to rely on so-called presence-only data (i.e. when no direct information about absences is available), which often results from opportunistic, unstructured sampling. MAXENT is a widely used software program designed to model and map species distribution using presence-only data. 2. We provide a critical review of MAXENT as applied to species distribution modelling and discuss how it can lead to inferential errors. A chief concern is that MAXENT produces a number of poorly defined indices that are not directly related to the actual parameter of interest – the probability of occurrence (ψ). This focus on an index was motivated by the belief that it is not possible to estimate ψ from presence-only data; however, we demonstrate that ψ is identifiable using conventional likelihood methods under the assumptions of random sampling and constant probability of species detection. 3. The model is implemented in a convenient R package which we use to apply the model to simulated data and data from the North American Breeding Bird Survey. We demonstrate that MAXENT produces extreme under-predictions when compared to estimates produced by logistic regression which uses the full (presence/absence) data set. We note that MAXENT predictions are extremely sensitive to specification of the background prevalence, which is not objectively estimated using the MAXENT method. 4. As with MAXENT, formal model-based inference requires a random sample of presence locations. Many presence-only data sets, such as those based on museum records and herbarium collections, may not satisfy this assumption. However, when sampling is random, we believe that inference should be based on formal methods that facilitate inference about interpretable ecological quantities instead of vaguely defined indices.
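The contrast with full presence/absence data can be seen in a small simulation (synthetic covariate and occupancy model; this illustrates the logistic-regression benchmark mentioned above, not the presence-only likelihood itself):

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    x = rng.normal(size=(5000, 1))
    psi = 1 / (1 + np.exp(-(-1.0 + 1.5 * x[:, 0])))   # true occupancy probability
    y = rng.binomial(1, psi)                          # full presence/absence data

    fit = LogisticRegression(C=1e6).fit(x, y)         # near-unpenalized fit
    print("true intercept, slope: -1.0, 1.5")
    print("estimated:", round(fit.intercept_[0], 2), round(fit.coef_[0, 0], 2))

With a random sample, the occupancy probability ψ is estimated directly; an index-producing method must instead be anchored by an assumed background prevalence.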
A Formal Approach to Requirements-Based Programming
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
No significant general-purpose method is currently available to mechanically transform system requirements into a provably equivalent model. The widespread use of such a method represents a necessary step toward high-dependability system engineering for numerous application domains. Current tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that the formal models cannot be proven to be equivalent to the requirements. We offer a method for mechanically transforming requirements into a provably equivalent formal model that can be used as the basis for code generation and other transformations. This method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. Finally, we describe further application areas we are investigating for use of the approach.
Applications of Formal Methods to Specification and Safety of Avionics Software
NASA Technical Reports Server (NTRS)
Hoover, D. N.; Guaspari, David; Humenn, Polar
1996-01-01
This report treats several topics in applications of formal methods to avionics software development. Most of these topics concern decision tables, an orderly, easy-to-understand format for formally specifying complex choices among alternative courses of action. The topics relating to decision tables include: generalizations of decision tables that are more concise and support the use of decision tables in a refinement-based formal software development process; a formalism for systems of decision tables with behaviors; an exposition of Parnas tables for users of decision tables; and test coverage criteria and decision tables. We outline features of a revised version of ORA's decision table tool, Tablewise, which will support many of the new ideas described in this report. We also survey formal safety analysis of specifications and software.
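Decision tables translate almost directly into executable checks, which is one reason they suit refinement-based development; a toy evaluator in Python (illustrative rules, not the avionics tables of the report):

    # Each row maps a condition tuple to an action; "*" means "don't care".
    RULES = [
        # (mode,     alarm, sensor_ok) -> action
        (("cruise",  False, True),  "maintain"),
        (("cruise",  True,  "*"),   "alert_crew"),
        (("landing", "*",   False), "switch_to_backup"),
        (("landing", False, True),  "continue_approach"),
    ]

    def decide(mode, alarm, sensor_ok):
        values = (mode, alarm, sensor_ok)
        for cond, action in RULES:
            if all(c == "*" or c == v for c, v in zip(cond, values)):
                return action
        # Incompleteness is exactly what formal analysis of tables detects.
        raise ValueError("no rule matches: table is incomplete")

    print(decide("landing", True, False))   # -> switch_to_backup

Formal analysis of such tables checks completeness (some rule always fires) and consistency (overlapping rules agree), which is where the test-coverage criteria mentioned above come in.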
The potential for increased power from combining P-values testing the same hypothesis.
Ganju, Jitendra; Julie Ma, Guoguang
2017-02-01
The conventional approach to hypothesis testing for formal inference is to prespecify a single test statistic thought to be optimal. However, we usually have more than one test statistic in mind for testing the null hypothesis of no treatment effect, but we do not know which one is the most powerful. Rather than relying on a single p-value, p-values from multiple prespecified test statistics can be combined for inference. Combining functions include Fisher's combination test and the minimum p-value. Using randomization-based tests, the increase in power can be remarkable when compared with a single test and Simes's method. The versatility of the method is that it also applies when the number of covariates exceeds the number of observations. The increase in power is large enough to prefer combined p-values over a single p-value. The limitation is that the method does not provide an unbiased estimator of the treatment effect and does not apply to situations when the model includes a treatment by covariate interaction.
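A sketch of the combination tests in a randomization framework (two-sample setting with synthetic data; the choice of the two component tests is illustrative):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    a = rng.normal(0.4, 1.0, 40)     # treatment arm (synthetic)
    b = rng.normal(0.0, 1.0, 40)     # control arm (synthetic)

    def pvals(a, b):
        return np.array([stats.ttest_ind(a, b).pvalue,
                         stats.mannwhitneyu(a, b).pvalue])

    fisher = lambda p: -2 * np.log(p).sum()   # Fisher's combination statistic
    obs = pvals(a, b)

    pool, stat_f, stat_m = np.concatenate([a, b]), [], []
    for _ in range(2000):                     # re-randomize treatment labels
        rng.shuffle(pool)
        p = pvals(pool[:40], pool[40:])
        stat_f.append(fisher(p))
        stat_m.append(p.min())

    print("Fisher combination p:", np.mean(np.array(stat_f) >= fisher(obs)))
    print("min-p combination p:", np.mean(np.array(stat_m) <= obs.min()))

Because the randomization distribution is computed for the combined statistic itself, no multiplicity correction such as Simes's is needed.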
Geometrical optics approach in liquid crystal films with three-dimensional director variations.
Panasyuk, G; Kelly, J; Gartland, E C; Allender, D W
2003-04-01
A formal geometrical optics approach (GOA) to the optics of nematic liquid crystals whose optic axis (director) varies in more than one dimension is described. The GOA is applied to the propagation of light through liquid crystal films whose director varies in three spatial dimensions. As an example, the GOA is applied to the calculation of light transmittance for the case of a liquid crystal cell which exhibits the homeotropic to multidomainlike transition (HMD cell). Properties of the GOA solution are explored, and comparison with the Jones calculus solution is also made. For variations on a smaller scale, where the Jones calculus breaks down, the GOA provides a fast, accurate method for calculating light transmittance. The results of light transmittance calculations for the HMD cell based on the director patterns provided by two methods, direct computer calculation and a previously developed simplified model, are in good agreement.
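For the uniform limit against which such calculations are checked, the Jones calculus reduces to 2x2 matrix products; a textbook sketch (a single uniform retarder between crossed polarizers, not the HMD director field):

    import numpy as np

    def polarizer(theta):
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c * c, c * s], [c * s, s * s]], dtype=complex)

    def retarder(delta, theta):
        # Uniform birefringent slab: retardation delta, optic axis at theta.
        c, s = np.cos(theta), np.sin(theta)
        R = np.array([[c, -s], [s, c]])
        W = np.diag([np.exp(-1j * delta / 2), np.exp(1j * delta / 2)])
        return R @ W @ R.T

    E_in = polarizer(0.0) @ np.array([1.0, 0.0], dtype=complex)
    E_out = polarizer(np.pi / 2) @ retarder(np.pi, np.pi / 4) @ E_in
    print("transmittance:", np.sum(np.abs(E_out) ** 2))   # half-wave plate: 1.0

The GOA generalizes this ray-by-ray picture to directors varying in all three dimensions, where a single global Jones product is no longer adequate.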
Kjaergaard, Thomas; Baudin, Pablo; Bykov, Dmytro; ...
2016-11-16
Here, we present a scalable cross-platform hybrid MPI/OpenMP/OpenACC implementation of the Divide–Expand–Consolidate (DEC) formalism with portable performance on heterogeneous HPC architectures. The Divide–Expand–Consolidate formalism is designed to reduce the steep computational scaling of conventional many-body methods employed in electronic structure theory to linear scaling, while providing a simple mechanism for controlling the error introduced by this approximation. Our massively parallel implementation of this general scheme has three levels of parallelism, being a hybrid of the loosely coupled task-based parallelization approach and the conventional MPI+X programming model, where X is either OpenMP or OpenACC. We demonstrate strong and weak scalability of this implementation on heterogeneous HPC systems, namely on the GPU-based Cray XK7 Titan supercomputer at the Oak Ridge National Laboratory. Using the "resolution of the identity second-order Møller–Plesset perturbation theory" (RI-MP2) as the physical model for simulating correlated electron motion, the linear-scaling DEC implementation is applied to 1-aza-adamantane-trione (AAT) supramolecular wires containing up to 40 monomers (2440 atoms, 6800 correlated electrons, 24 440 basis functions and 91 280 auxiliary functions). This represents the largest molecular system treated at the MP2 level of theory, demonstrating an efficient removal of the scaling wall pertinent to conventional quantum many-body methods.
Bridging the gap between formal and experience-based knowledge for context-aware laparoscopy.
Katić, Darko; Schuck, Jürgen; Wekerle, Anna-Laura; Kenngott, Hannes; Müller-Stich, Beat Peter; Dillmann, Rüdiger; Speidel, Stefanie
2016-06-01
Computer assistance is increasingly common in surgery. However, the amount of information is bound to overload the processing abilities of surgeons. We propose methods to recognize the current phase of a surgery for context-aware information filtering. The purpose is to select the most suitable subset of information for surgical situations which require special assistance. We combine formal knowledge, represented by an ontology, and experience-based knowledge, represented by training samples, to recognize phases. For this purpose, we have developed two different methods. Firstly, we use formal knowledge about possible phase transitions to create a composition of random forests. Secondly, we propose a method based on cultural optimization to infer formal rules from experience to recognize phases. The proposed methods are compared with a purely formal knowledge-based approach using rules and a purely experience-based one using regular random forests. The comparative evaluation on laparoscopic pancreas resections and adrenalectomies employs a consistent set of quality criteria on clean and noisy input. The rule-based approaches proved best with noise-free data. The random forest-based ones were more robust in the presence of noise. Formal and experience-based knowledge can be successfully combined for robust phase recognition.
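One simple way to combine the two knowledge sources is to mask a classifier's phase probabilities by the transitions the formal model permits; a sketch with synthetic data (the phase names, features, and transition map are invented for illustration):

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    PHASES = ["prepare", "dissect", "resect", "close"]
    # Formal knowledge: from each phase, stay or advance to the next one.
    ALLOWED = {p: {p, PHASES[min(i + 1, len(PHASES) - 1)]}
               for i, p in enumerate(PHASES)}

    rng = np.random.default_rng(3)
    X = rng.normal(size=(400, 6))              # instrument/sensor features
    y = rng.integers(0, len(PHASES), 400)      # annotated phases (synthetic)
    clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

    def next_phase(x, current):
        proba = clf.predict_proba(x.reshape(1, -1))[0]
        mask = np.array([PHASES[i] in ALLOWED[current] for i in clf.classes_])
        proba = np.where(mask, proba, 0.0)     # forbid impossible transitions
        return PHASES[clf.classes_[int(np.argmax(proba))]]

    print(next_phase(rng.normal(size=6), "dissect"))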
A Natural Language Interface Concordant with a Knowledge Base.
Han, Yong-Jin; Park, Seong-Bae; Park, Se-Young
2016-01-01
The discordance between expressions interpretable by a natural language interface (NLI) system and those answerable by a knowledge base is a critical problem in the field of NLIs. In order to solve this discordance problem, this paper proposes a method to translate natural language questions into formal queries that can be generated from a graph-based knowledge base. The proposed method considers a subgraph of a knowledge base as a formal query. Thus, all formal queries corresponding to a concept or a predicate in the knowledge base can be generated prior to query time, and all possible natural language expressions corresponding to each formal query can also be collected in advance. A natural language expression has a one-to-one mapping with a formal query. Hence, a natural language question is translated into a formal query by matching the question with the most appropriate natural language expression. If the confidence of this matching is not sufficiently high, the proposed method rejects the question and does not answer it. Multipredicate queries are processed by regarding them as a set of collected expressions. The experimental results show that the proposed method thoroughly handles answerable questions from the knowledge base and rejects unanswerable ones effectively.
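The matching step can be sketched with a generic string-similarity scorer (toy expressions and queries; the paper's knowledge base, expression collection, and scoring model differ):

    import difflib

    # One-to-one mapping from natural language expressions to formal queries.
    EXPRESSIONS = {
        "what drugs treat <disease>":     "SELECT ?d WHERE { ?d :treats <disease> }",
        "what are symptoms of <disease>": "SELECT ?s WHERE { <disease> :hasSymptom ?s }",
    }

    def translate(question, threshold=0.75):
        score, best = max(
            (difflib.SequenceMatcher(None, question, e).ratio(), e)
            for e in EXPRESSIONS)
        if score < threshold:
            return None                  # reject: likely unanswerable
        return EXPRESSIONS[best]

    print(translate("what drugs treat <disease>"))    # matched query
    print(translate("who directed this movie"))       # None (rejected)

The rejection threshold is what keeps the interface from answering questions the knowledge base cannot support.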
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stinis, Panagiotis
We present a comparative study of two methods for the reduction of the dimensionality of a system of ordinary differential equations that exhibits time-scale separation. Both methods lead to a reduced system of stochastic differential equations. The novel feature of these methods is that they allow the use, in the reduced system, of higher order terms in the resolved variables. The first method, proposed by Majda, Timofeyev and Vanden-Eijnden, is based on an asymptotic strategy developed by Kurtz. The second method is a short-memory approximation of the Mori-Zwanzig projection formalism of irreversible statistical mechanics, as proposed by Chorin, Hald and Kupferman. We present conditions under which the reduced models arising from the two methods should have similar predictive ability. We apply the two methods to test cases that satisfy these conditions. The form of the reduced models and the numerical simulations show that the two methods have similar predictive ability as expected.
Verification of Emergent Behaviors in Swarm-based Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James
2004-01-01
The emergent properties of swarms make swarm-based missions powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of swarm-based missions. The Autonomous Nano-Technology Swarm (ANTS) mission is being used as an example and case study for swarm-based missions to experiment and test current formal methods with intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior. This paper introduces how intelligent swarm technology is being proposed for NASA missions, and gives the results of a comparison of several formal methods and approaches for specifying intelligent swarm-based systems and their effectiveness for predicting emergent behavior.
δ M formalism and anisotropic chaotic inflation power spectrum
NASA Astrophysics Data System (ADS)
Talebian-Ashkezari, A.; Ahmadi, N.
2018-05-01
A new analytical approach to linear perturbations in anisotropic inflation has been introduced in [A. Talebian-Ashkezari, N. Ahmadi and A.A. Abolhasani, JCAP 03 (2018) 001] under the name of δ M formalism. In this paper we apply the mentioned approach to a model of anisotropic inflation driven by a scalar field, coupled to the kinetic term of a vector field with a U(1) symmetry. The δ M formalism provides an efficient way of computing tensor-tensor, tensor-scalar as well as scalar-scalar 2-point correlations that are needed for the analysis of the observational features of an anisotropic model on the CMB. A comparison between δ M results and the tedious calculations using in-in formalism shows the aptitude of the δ M formalism in calculating accurate two point correlation functions between physical modes of the system.
From non-trivial geometries to power spectra and vice versa
NASA Astrophysics Data System (ADS)
Brooker, D. J.; Tsamis, N. C.; Woodard, R. P.
2018-04-01
We review a recent formalism which derives the functional forms of the primordial—tensor and scalar—power spectra of scalar potential inflationary models. The formalism incorporates the case of geometries with non-constant first slow-roll parameter. Analytic expressions for the power spectra are given that explicitly display the dependence on the geometric properties of the background. Moreover, we present the full algorithm for using our formalism, to reconstruct the model from the observed power spectra. Our techniques are applied to models possessing "features" in their potential with excellent agreement.
Medical Named Entity Recognition for Indonesian Language Using Word Representations
NASA Astrophysics Data System (ADS)
Rahman, Arief
2018-03-01
Nowadays, Named Entity Recognition (NER) system is used in medical texts to obtain important medical information, like diseases, symptoms, and drugs. While most NER systems are applied to formal medical texts, informal ones like those from social media (also called semi-formal texts) are starting to get recognition as a gold mine for medical information. We propose a theoretical Named Entity Recognition (NER) model for semi-formal medical texts in our medical knowledge management system by comparing two kinds of word representations: cluster-based word representation and distributed representation.
Fluctuations of tunneling currents in photonic and polaritonic systems
NASA Astrophysics Data System (ADS)
Mantsevich, V. N.; Glazov, M. M.
2018-04-01
Here we develop the nonequilibrium Green's function formalism to analyze the fluctuation spectra of boson tunneling currents. The approach allows us to calculate the noise spectra in both equilibrium and nonequilibrium conditions. The proposed general formalism is applied to several important realizations of boson transport, including tunneling transport between two reservoirs and the case where the boson current flows through an intermediate region between the reservoirs. The developed theory can be applied to the analysis of current noise in waveguides, coupled optical resonators, quantum microcavities, etc., where the tunneling of photons, exciton-polaritons, or excitons can be realized.
Closing the Gap between Formalism and Application--PBL and Mathematical Skills in Engineering
ERIC Educational Resources Information Center
Christensen, Ole Ravn
2008-01-01
A common problem in learning mathematics concerns the gap between, on the one hand, doing the formalisms and calculations of abstract mathematics and, on the other hand, applying these in a specific contextualized setting, for example the engineering world. The skills acquired through problem-based learning (PBL), in the special model used at…
ERIC Educational Resources Information Center
Mabingo, Alfdaniels
2015-01-01
Dances from African communities are gradually getting incorporated into formal education at pre-tertiary and tertiary levels in the United States. Whereas strides have been made to embrace this artistic and cultural diversity, the instructional methodologies that are applied in teaching these dances are commonly founded on Western pedagogic canons…
NASA Astrophysics Data System (ADS)
Zheng, Mingfang; He, Cunfu; Lu, Yan; Wu, Bin
2018-01-01
We present a numerical method to solve the phase dispersion curves in general anisotropic plates. This approach involves an exact solution to the problem in the form of a Legendre polynomial expansion of multiple integrals, which we substitute into the state-vector formalism. In order to improve the efficiency of the proposed method, we make a special effort to demonstrate the analytical methodology. Furthermore, we analyze the algebraic symmetries of the matrices in the state-vector formalism for anisotropic plates. The basic feature of the proposed method is the expansion of field quantities by Legendre polynomials. The Legendre polynomial expansion avoids the transcendental dispersion equation, which can only be solved numerically. The state-vector formalism combined with the Legendre polynomial expansion distinguishes adjacent dispersion modes clearly, even when the modes are very close. We then illustrate the theoretical solutions of the dispersion curves by this method for isotropic and anisotropic plates. Finally, we compare the proposed method with the global matrix method (GMM), which shows excellent agreement.
A general method for the inclusion of radiation chemistry in astrochemical models.
Shingledecker, Christopher N; Herbst, Eric
2018-02-21
In this paper, we propose a general formalism that allows for the estimation of radiolysis decomposition pathways and rate coefficients suitable for use in astrochemical models, with a focus on solid phase chemistry. Such a theory can help increase the connection between laboratory astrophysics experiments and astrochemical models by providing a means for modelers to incorporate radiation chemistry into chemical networks. The general method proposed here is targeted particularly at the majority of species now included in chemical networks for which little radiochemical data exist; however, the method can also be used as a starting point for considering better studied species. We here apply our theory to the irradiation of H2O ice and compare the results with previous experimental data.
Formulation of the relativistic moment implicit particle-in-cell method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noguchi, Koichi; Tronci, Cesare; Zuccaro, Gianluca
2007-04-15
A new formulation is presented for the implicit moment method applied to the time-dependent relativistic Vlasov-Maxwell system. The new approach is based on a specific formulation of the implicit moment method that allows us to retain the same formalism that is valid in the classical case despite the formidable complication introduced by the nonlinear nature of the relativistic equations of motion. To demonstrate the validity of the new formulation, an implicit finite difference algorithm is developed to solve Maxwell's equations and the equations of motion. A number of benchmark problems are run: two-stream instability, ion acoustic wave damping, Weibel instability, and Poynting flux acceleration. The numerical results are all in agreement with analytical solutions.
Smiga, Szymon; Fabiano, Eduardo
2017-11-15
We have developed a simplified coupled cluster (SCC) methodology, using the basic idea of scaled MP2 methods. The scheme has been applied to the coupled cluster double equations and implemented in three different non-iterative variants. This new method (especially the SCCD[3] variant, which utilizes a spin-resolved formalism) has been found to be very efficient and to yield an accurate approximation of the reference CCD results for both total and interaction energies of different atoms and molecules. Furthermore, we demonstrate that the equations determining the scaling coefficients for the SCCD[3] approach can generate non-empirical SCS-MP2 scaling coefficients which are in good agreement with previous theoretical investigations.
IEEE/NASA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation
NASA Technical Reports Server (NTRS)
Margaria, Tiziana (Editor); Steffen, Bernhard (Editor); Hichey, Michael G.
2005-01-01
This volume contains the Preliminary Proceedings of the 2005 IEEE ISoLA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation, with a special track on the theme of Formal Methods in Human and Robotic Space Exploration. The workshop was held on 23-24 September 2005 at the Loyola College Graduate Center, Columbia, MD, USA. The idea behind the Workshop arose from the experience and feedback of ISoLA 2004, the 1st International Symposium on Leveraging Applications of Formal Methods held in Paphos (Cyprus) last October-November. ISoLA 2004 served the need of providing a forum for developers, users, and researchers to discuss issues related to the adoption and use of rigorous tools and methods for the specification, analysis, verification, certification, construction, test, and maintenance of systems from the point of view of their different application domains.
Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2004-01-01
A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
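As a rough illustration of the restricted-natural-language idea (a toy only: the authors' method targets provable equivalence, which this sketch does not attempt), scenarios can be parsed into a finite transition relation that downstream tools could check or compile:

```python
import re

# Scenarios in a restricted natural language (invented examples).
scenarios = [
    "if idle and start_command then set mode to active",
    "if active and fault_detected then set mode to safe_hold",
    "if safe_hold and reset_command then set mode to idle",
]

pattern = re.compile(r"if (\w+) and (\w+) then set mode to (\w+)")
transitions = {}
for s in scenarios:
    state, event, target = pattern.fullmatch(s).groups()
    if (state, event) in transitions:
        raise ValueError(f"non-deterministic requirement: {s}")
    transitions[(state, event)] = target

# The derived model can be executed, checked for determinism and
# reachability, or handed to a code generator.
mode = "idle"
for event in ["start_command", "fault_detected", "reset_command"]:
    mode = transitions[(mode, event)]
print(mode)   # -> idle
```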
NASA Astrophysics Data System (ADS)
Bhatia, A. K.
2012-09-01
The P-wave hybrid theory of electron-hydrogen elastic scattering [Bhatia, Phys. Rev. A 85, 052708 (2012)] is applied to the P-wave scattering from the He ion. In this method, both short-range and long-range correlations are included in the Schrödinger equation at the same time, by using a combination of a modified method of polarized orbitals and the optical potential formalism. The short-range-correlation functions are of Hylleraas type. It is found that the phase shifts are not significantly affected by the modification of the target function by a method similar to the method of polarized orbitals, and they are close to the phase shifts calculated earlier by Bhatia [Phys. Rev. A 69, 032714 (2004)]. This indicates that the correlation function is general enough to include the target distortion (polarization) in the presence of the incident electron. The important fact is that in the present calculation, to obtain similar results, only a 20-term correlation function is needed in the wave function, compared to the 220-term wave function required in the above-mentioned calculation. Results for the phase shifts, obtained in the present hybrid formalism, are rigorous lower bounds to the exact phase shifts. The lowest P-wave resonances in the He atom and the hydrogen ion have also been calculated and compared with the results obtained using the Feshbach projection operator formalism [Bhatia and Temkin, Phys. Rev. A 11, 2018 (1975)] and also with the results of other calculations. It is concluded that accurate resonance parameters can be obtained by the present method, which has the advantage of including corrections due to neighboring resonances, bound states, and the continuum in which these resonances are embedded.
A Novel Ontology Approach to Support Design for Reliability considering Environmental Effects
Sun, Bo; Li, Yu; Ye, Tianyuan
2015-01-01
Environmental effects are not considered sufficiently in product design. Reliability problems caused by environmental effects are very prominent. This paper proposes a method to apply ontology approach in product design. During product reliability design and analysis, environmental effects knowledge reusing is achieved. First, the relationship of environmental effects and product reliability is analyzed. Then environmental effects ontology to describe environmental effects domain knowledge is designed. Related concepts of environmental effects are formally defined by using the ontology approach. This model can be applied to arrange environmental effects knowledge in different environments. Finally, rubber seals used in the subhumid acid rain environment are taken as an example to illustrate ontological model application on reliability design and analysis. PMID:25821857
Systems, methods and apparatus for verification of knowledge-based systems
NASA Technical Reports Server (NTRS)
Rash, James L. (Inventor); Gracinin, Denis (Inventor); Erickson, John D. (Inventor); Rouff, Christopher A. (Inventor); Hinchey, Michael G. (Inventor)
2010-01-01
Systems, methods and apparatus are provided through which, in some embodiments, domain knowledge is translated into a knowledge-based system. In some embodiments, a formal specification is derived from the rules of a knowledge-based system, the formal specification is analyzed, and flaws in the formal specification are used to identify and correct errors in the domain knowledge, from which a corrected knowledge-based system is then translated.
ERIC Educational Resources Information Center
Jacob, Bridgette L.
2013-01-01
The difficulties introductory statistics students have with formal statistical inference are well known in the field of statistics education. "Informal" statistical inference has been studied as a means to introduce inferential reasoning well before and without the formalities of formal statistical inference. This mixed methods study…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Omar, A; Marteinsdottir, M; Kadesjo, N
Purpose: To provide a general formalism for determination of occupational eye lens dose based on the response of an active personal dosimeter (APD) worn at chest level above the radiation protection apron. Methods: The formalism consists of three factors: (1) an APD conversion factor converting the reading at chest level (APDchest) to the corresponding personal dose equivalent at eye level, (2) a dose conversion factor transferring the measured dose quantity, Hp(10), into a dose quantity relevant for the eye lens dose, and (3) a correction factor accounting for differences in exposure of the eye(s) compared to the exposure at chest level (e.g., due to protective lead glasses). The different factors were investigated and evaluated based on phantom and clinical measurements performed in an x-ray angiography suite for interventional cardiology. Results: The eye lens dose can be conservatively estimated by assigning to each factor entering the formalism an appropriate numerical value that in most circumstances overestimates the dose. Doing so, the eye lens dose to the primary operator and assisting staff was estimated in this work as D_eye,primary = 2.0 APDchest and D_eye,assisting = 1.0 APDchest, respectively. The annual eye lens dose to three nurses and one cardiologist was estimated to be 2, 2, 2, and 13 mSv (Hp(0.07)), respectively, using a TLD dosimeter worn at eye level. In comparison, using the formalism and APDchest measurements, the respective doses were 2, 2, 2, and 16 mSv (Hp(3)). Conclusion: The formalism outlined in this work can be used to estimate the occupational eye lens dose from the response of an APD worn on the chest. The formalism is general and could be applied to other types of dosimeters as well. However, the numerical values of the different factors may differ from those obtained with the APDs used in this work due to differences in dosimeter properties.
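Schematically, the formalism multiplies the three factors into the APD reading; the factor symbols below are illustrative labels rather than the abstract's own notation:

```latex
% Schematic three-factor eye lens dose estimate (labels illustrative):
%   f_conv : chest-to-eye conversion of the APD reading
%   f_dose : transfer from Hp(10) to an eye lens quantity such as Hp(3)
%   f_corr : correction for eye vs. chest exposure (e.g., lead glasses)
D_{\mathrm{eye}} \approx f_{\mathrm{conv}} \, f_{\mathrm{dose}} \, f_{\mathrm{corr}} \, \mathrm{APD}_{\mathrm{chest}}
```

With the conservative values quoted above, the product collapses to 2.0 APDchest for the primary operator and 1.0 APDchest for assisting staff.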
Why Engineers Should Consider Formal Methods
NASA Technical Reports Server (NTRS)
Holloway, C. Michael
1997-01-01
This paper presents a logical analysis of a typical argument favoring the use of formal methods for software development, and suggests an alternative argument that is simpler and stronger than the typical one.
NASA Astrophysics Data System (ADS)
Arsenault, Louis-Francois; Neuberg, Richard; Hannah, Lauren A.; Millis, Andrew J.
We present a machine learning-based statistical regression approach to the inversion of Fredholm integrals of the first kind by studying an important example for the quantum materials community, the analytical continuation problem of quantum many-body physics. It involves reconstructing the frequency dependence of physical excitation spectra from data obtained at specific points in the complex frequency plane. The approach provides a natural regularization in cases where the inverse of the Fredholm kernel is ill-conditioned and yields robust error metrics. The stability of the forward problem permits the construction of a large database of input-output pairs. Machine learning methods applied to this database generate approximate solutions which are projected onto the subspace of functions satisfying relevant constraints. We show that for low input noise the method performs as well as or better than Maximum Entropy (MaxEnt) under standard error metrics, and is substantially more robust to noise. We expect the methodology to be similarly effective for any problem involving a formally ill-conditioned inversion, provided that the forward problem can be efficiently solved. AJM was supported by the Office of Science of the U.S. Department of Energy under Subcontract No. 3F-3138 and LFA by the Columbia University IDS-ROADS project, UR009033-05, which also provided partial support to RN and LH.
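A minimal sketch of the database-plus-regression idea on a toy Laplace-type kernel (the kernel, spectra, regularizer, and sizes are illustrative stand-ins, not the paper's setup):

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Toy Fredholm integral of the first kind: g(x) = \int K(x, y) f(y) dy,
# with a badly conditioned Laplace-type kernel standing in for the
# analytic continuation kernel.
ny, nx = 64, 32
y = np.linspace(0.0, 5.0, ny)
x = np.linspace(0.1, 3.0, nx)
K = np.exp(-np.outer(x, y)) * (y[1] - y[0])          # shape (nx, ny)

def random_spectrum():
    """Random smooth non-negative normalized 'spectrum' f(y)."""
    c = rng.uniform(0.5, 4.5, 3); s = rng.uniform(0.2, 0.8, 3)
    w = rng.dirichlet(np.ones(3))
    f = sum(wi * np.exp(-0.5 * ((y - ci) / si) ** 2) / si
            for wi, ci, si in zip(w, c, s))
    return f / np.trapz(f, y)

# The stable forward problem builds the database of input-output pairs.
F = np.array([random_spectrum() for _ in range(5000)])
G = F @ K.T + 1e-4 * rng.standard_normal((5000, nx))

# Learn the regularized inverse map g -> f.
model = Ridge(alpha=1e-3).fit(G, F)

f_true = random_spectrum()
g_obs = K @ f_true + 1e-4 * rng.standard_normal(nx)
f_est = model.predict(g_obs[None, :])[0]
f_est = np.clip(f_est, 0.0, None)    # project onto the constraint set:
f_est /= np.trapz(f_est, y)          # non-negativity and normalization
print("relative L2 error:",
      np.linalg.norm(f_est - f_true) / np.linalg.norm(f_true))
```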
Evidence Arguments for Using Formal Methods in Software Certification
NASA Technical Reports Server (NTRS)
Denney, Ewen W.; Pai, Ganesh
2013-01-01
We describe a generic approach for automatically integrating the output generated from a formal method/tool into a software safety assurance case, as an evidence argument, by (a) encoding the underlying reasoning as a safety case pattern, and (b) instantiating it using the data produced from the method/tool. We believe this approach not only improves the trustworthiness of the evidence generated from a formal method/tool, by explicitly presenting the reasoning and mechanisms underlying its genesis, but also provides a way to gauge the suitability of the evidence in the context of the wider assurance case. We illustrate our work by application to a real example, an unmanned aircraft system, where we invoke a formal code analysis tool from its autopilot software safety case, automatically transform the verification output into an evidence argument, and then integrate it into the former.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plimak, L.I., E-mail: Lev.Plimak@mbi-berlin.de; Olsen, M.K.
2014-12-15
In this work we present the formal background used to develop the methods used in earlier works to extend the truncated Wigner representation of quantum and atom optics in order to address multi-time problems. Analogs of Wick’s theorem for the Weyl ordering are verified. Using the Bose–Hubbard chain as an example, we show how these may be applied to constructing a mapping of the system in question to phase space. Regularisation issues and the reordering problem for the Heisenberg operators are addressed.
On singlet s-wave electron-hydrogen scattering.
NASA Technical Reports Server (NTRS)
Madan, R. N.
1973-01-01
Discussion of various zeroth-order approximations to s-wave scattering of electrons by hydrogen atoms below the first excitation threshold. The formalism previously developed by the author (1967, 1968) is applied to Feshbach operators to derive integro-differential equations, with the optical-potential set equal to zero, for the singlet and triplet cases. Phase shifts of s-wave scattering are computed in the zeroth-order approximation of the Feshbach operator method and in the static-exchange approximation. It is found that the convergence of numerical computations is faster in the former approximation than in the latter.
Quantum mechanics of a constrained particle on an ellipsoid: Bein formalism and Geometric momentum
DOE Office of Scientific and Technical Information (OSTI.GOV)
Panahi, H., E-mail: t-panahi@guilan.ac.ir; Jahangiri, L., E-mail: laleh.jahangiry@yahoo.com
2016-09-15
In this work we apply the Dirac method in order to obtain the classical relations for a particle on an ellipsoid. We also determine the quantum mechanical form of these relations by using Dirac quantization. Then, by considering the canonical commutation relations between the position and momentum operators in terms of curved coordinates, we propose suitable representations for the momentum operator that satisfy the obtained commutators between position and momentum in Euclidean space. We find that our representations for the momentum operator are the same as the geometric one.
BRST Formalism in Self-Dual Chern-Simons Theory with Matter Fields
NASA Astrophysics Data System (ADS)
Dai, Jialiang; Fan, Engui
2018-04-01
We apply the BRST method to the self-dual Chern-Simons gauge theory with matter fields; the generators of the symmetries of the system form an elegant Lie algebra structure under the operation of the Poisson bracket. We discuss four different cases: abelian, nonabelian, relativistic, and nonrelativistic situations, and extend the system to the whole phase space including ghost fields. In addition, we obtain the BRST charge of the field system and check the nilpotency of the BRST transformation, which plays an important role in, for example, topological quantum field theory and string theory.
IRONSIDES: DNS With No Single Packet Denial of Service or Remote Code Execution Vulnerabilities
2012-02-27
[Only fragments of this record survive extraction: a DNS server feature-comparison table (caching, DNSSEC, TSIG, IPv6, wildcard support, interface) and partial references, including the Proceedings of the 2007 IEEE Aerospace Conference; C. Heitmeyer, M. Archer, E. Leonard and J. McLean, "Applying formal methods to a certifiably secure..." (2003); DNSSEC - The DNS Security Extensions, http://www.dnssec.net/; and S. Conchon, E. Contejean and J. Kanig, "Ergo: A theorem prover...".]
Least-squares analysis of the Mueller matrix.
Reimer, Michael; Yevick, David
2006-08-15
In a single-mode fiber excited by light with a fixed polarization state, the output polarizations obtained at two different optical frequencies are related by a Mueller matrix. We examine least-squares procedures for estimating this matrix from repeated measurements of the output Stokes vector for a random set of input polarization states. We then apply these methods to the determination of polarization mode dispersion and polarization-dependent loss in an optical fiber. We find that a relatively simple formalism leads to results that are comparable with those of far more involved techniques.
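The estimation step reduces to ordinary linear least squares. A minimal numpy sketch, with noisy synthetic measurements and a generic 4x4 matrix standing in for a physical Mueller matrix (the subsequent PMD/PDL extraction is not shown):

```python
import numpy as np

rng = np.random.default_rng(1)

def random_stokes(n):
    """Fully polarized, unit-intensity Stokes vectors (I, Q, U, V)."""
    v = rng.standard_normal((n, 3))
    v /= np.linalg.norm(v, axis=1, keepdims=True)
    return np.hstack([np.ones((n, 1)), v])

M_true = rng.standard_normal((4, 4))   # stand-in for the Mueller matrix

S_in = random_stokes(200)                                 # random inputs
S_out = S_in @ M_true.T + 0.01 * rng.standard_normal((200, 4))  # noisy

# Least squares: solve S_in @ M.T ~= S_out for all columns at once.
M_T, *_ = np.linalg.lstsq(S_in, S_out, rcond=None)
M_est = M_T.T
print("max |M_est - M_true| =", np.abs(M_est - M_true).max())
```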
Investigation of the effect of scattering centers on low dimensional nanowire channel
NASA Astrophysics Data System (ADS)
Cariappa, K. S.; Shukla, Raja; Sarkar, Niladri
2018-05-01
In this work, we studied the effect of scattering centers on the electron density profiles of a one dimensional Nanowire channel. Density Matrix Formalism is used for calculating the local electron densities at room temperature. Various scattering centers have been simulated in the channel. The nearest neighbor tight binding method is applied to construct the Hamiltonian of nanoscale devices. We invoke scattering centers by adding local scattering potentials to the Hamiltonian. This analysis could give an insight into the understanding and utilization of defects for device engineering.
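A minimal sketch of the workflow under stated assumptions (1-D chain, invented hopping and scattering parameters, not the authors' code): build the nearest-neighbor tight-binding Hamiltonian, add local scattering potentials to the diagonal, and read the local electron density off the equilibrium density matrix:

```python
import numpy as np

N, t = 100, 1.0                       # sites and hopping (toy units)
H = -t * (np.diag(np.ones(N - 1), 1) + np.diag(np.ones(N - 1), -1))

# Scattering centers enter as local potentials on the diagonal.
for site, U in [(30, 0.8), (55, -0.5), (70, 1.2)]:   # invented values
    H[site, site] += U

# Equilibrium density matrix rho = f(H) at temperature kT and chemical
# potential mu; the local electron density is its diagonal.
kT, mu = 0.025, -1.5
E, V = np.linalg.eigh(H)
occ = 1.0 / (1.0 + np.exp((E - mu) / kT))    # Fermi-Dirac occupations
rho = (V * occ) @ V.T
local_density = np.diag(rho)
print(local_density[25:35])   # density perturbed near the scatterer at 30
```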
Deformation quantization of fermi fields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galaviz, I.; Garcia-Compean, H.; Departamento de Fisica, Centro de Investigacion y de Estudios Avanzados del IPN, P.O. Box 14-740, 07000 Mexico, D.F.
2008-04-15
Deformation quantization for any Grassmann scalar free field is described via the Weyl-Wigner-Moyal formalism. The Stratonovich-Weyl quantizer, the Moyal *-product and the Wigner functional are obtained by extending the formalism proposed recently in [I. Galaviz, H. Garcia-Compean, M. Przanowski, F.J. Turrubiates, Weyl-Wigner-Moyal Formalism for Fermi Classical Systems, arXiv:hep-th/0612245] to the fermionic systems of infinite number of degrees of freedom. In particular, this formalism is applied to quantize the Dirac free field. It is observed that the use of suitable oscillator variables facilitates considerably the procedure. The Stratonovich-Weyl quantizer, the Moyal *-product, the Wigner functional, the normal ordering operator, and finally, the Dirac propagator have been found with the use of these variables.
Ding, Hansheng; Wang, Changying; Xie, Chunyan; Yang, Yitong; Jin, Chunlin
2017-01-01
The need for formal care among the elderly population has been increasing due to their greater longevity and the evolution of family structure. We examined the determinants of the use and expenses of formal care among in-home elderly adults in Shanghai. A two-part model based on the data from the Shanghai Long-Term Care Needs Assessment Questionnaire was applied. A total of 8428 participants responded in 2014 and 7100 were followed up in 2015. The determinants of the probability of using formal care were analyzed in the first part of the model and the determinants of formal care expenses were analyzed in the second part. Demographic indicators, living arrangements, physical health status, and care type in 2014 were selected as independent variables. We found that individuals of older age; women; those with higher Activities of Daily Living (ADL) scores; those without a spouse; those with higher income; those suffering from stroke, dementia, lower limb fracture, or advanced tumor; and those with previous experience of formal and informal care were more likely to receive formal care in 2015. Furthermore, age, income, and the formal care fee in 2014 were significant predictors of formal care expenses in 2015. Taken together, the results showed that formal care provision in Shanghai was not determined by ADL scores, but was instead more related to income. This implied an inappropriate distribution of formal care among the elderly population in Shanghai. Additionally, it appeared difficult for the elderly to quit formal care once they had begun to use it. These results highlighted the importance of assessing the need for formal care, and suggested that the government offer guidance on formal care use for the elderly. PMID:28448628
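A two-part model of this kind can be sketched on synthetic data as follows (the variables and coefficients are invented placeholders; the actual analysis used the Shanghai assessment data):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "age": rng.integers(65, 95, n),
    "female": rng.integers(0, 2, n),
    "adl_score": rng.integers(0, 24, n),
    "income": rng.lognormal(8.0, 0.5, n),
})
X = sm.add_constant(df)

# Synthetic outcomes: whether formal care is used, and spending if so.
p = 1 / (1 + np.exp(-(-6 + 0.05 * df.age + 0.3 * df.female
                      + 0.00005 * df.income)))
use = rng.binomial(1, p)
expense = np.where(use == 1,
                   np.exp(4 + 0.0001 * df.income + rng.normal(0, 0.3, n)),
                   0.0)

# Part 1: probability of using any formal care (logistic regression).
part1 = sm.Logit(use, X).fit(disp=0)
# Part 2: log expenses among users only (OLS).
users = expense > 0
part2 = sm.OLS(np.log(expense[users]), X[users]).fit()
print(part1.params, part2.params, sep="\n\n")
```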
Formal methods for dependable real-time systems
NASA Technical Reports Server (NTRS)
Rushby, John
1993-01-01
The motivation for using formal methods to specify and reason about real-time properties is outlined, and approaches that were proposed and used are sketched. The formal verifications of clock synchronization algorithms are cited as showing that mechanically supported reasoning about complex real-time behavior is feasible. However, there has been a significant increase in the effectiveness of verification systems since those verifications were performed, and it is to be expected that verifications of comparable difficulty will become fairly routine. The current challenge lies in developing perspicuous and economical approaches to the formalization and specification of real-time properties.
Petri Nets as Modeling Tool for Emergent Agents
NASA Technical Reports Server (NTRS)
Bergman, Marto
2004-01-01
Emergent agents, those agents whose local interactions can cause unexpected global results, require a method of modeling that is both dynamic and structured. Petri Nets, a modeling tool developed for dynamic discrete event systems of mainly functional agents, provide this and have the benefit of being an established tool. We present the details of the modeling method here and discuss how to implement its use for modeling agent-based systems. Petri Nets have been used extensively in the modeling of functional agents, those agents who have defined purposes and whose actions should result in a known outcome. However, emergent agents, those agents who have a defined structure but whose interactions cause outcomes that are unpredictable, have not yet found a modeling style that suits them. A problem with formally modeling emergent agents is that any formal modeling style usually expects to show the results of a problem, and the results of problems studied using emergent agents are not apparent from the initial construction. However, the study of emergent agents still requires a method to analyze the agents themselves, and to allow sensible conversation about the differences and similarities between types of emergent agents. We attempt to correct this problem by applying Petri Nets to the characterization of emergent agents. In doing so, the emergent properties of these agents can be highlighted, and conversation about the nature and compatibility of the differing methods of agent creation can begin.
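A minimal Petri net interpreter makes the mechanics concrete (a generic sketch with invented places and transitions, not the paper's agent models):

```python
from dataclasses import dataclass, field

@dataclass
class PetriNet:
    marking: dict                    # place -> token count
    transitions: dict = field(default_factory=dict)  # name -> (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Two toy agents contending for a shared resource.
net = PetriNet(
    marking={"idle_A": 1, "idle_B": 1, "resource": 1},
    transitions={
        "A_acquires": (["idle_A", "resource"], ["busy_A"]),
        "B_acquires": (["idle_B", "resource"], ["busy_B"]),
        "A_releases": (["busy_A"], ["idle_A", "resource"]),
    },
)
net.fire("A_acquires")
print(net.enabled("B_acquires"))   # False: mutual exclusion emerges
```

Global properties such as the mutual exclusion above are stated nowhere in the net; they emerge from the firing rule, which is the sense in which such models can support conversation about emergent behavior.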
Multisymplectic unified formalism for Einstein-Hilbert gravity
NASA Astrophysics Data System (ADS)
Gaset, Jordi; Román-Roy, Narciso
2018-03-01
We present a covariant multisymplectic formulation for the Einstein-Hilbert model of general relativity. As it is described by a second-order singular Lagrangian, this is a gauge field theory with constraints. The use of the unified Lagrangian-Hamiltonian formalism is particularly interesting when it is applied to these kinds of theories, since it simplifies the treatment of them, in particular, the implementation of the constraint algorithm, the retrieval of the Lagrangian description, and the construction of the covariant Hamiltonian formalism. In order to apply this algorithm to the covariant field equations, they must be written in a suitable geometrical way, which consists of using integrable distributions, represented by multivector fields of a certain type. We apply all these tools to the Einstein-Hilbert model without and with energy-matter sources. We obtain and explain the geometrical and physical meaning of the Lagrangian constraints and we construct the multimomentum (covariant) Hamiltonian formalisms in both cases. As a consequence of the gauge freedom and the constraint algorithm, we see how this model is equivalent to a first-order regular theory, without gauge freedom. In the case of the presence of energy-matter sources, we show how some relevant geometrical and physical characteristics of the theory depend on the type of source. In all the cases, we obtain explicitly multivector fields which are solutions to the gravitational field equations. Finally, a brief study of symmetries and conservation laws is done in this context.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.
Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems; a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.
Super-sample covariance approximations and partial sky coverage
NASA Astrophysics Data System (ADS)
Lacasa, Fabien; Lima, Marcos; Aguena, Michel
2018-04-01
Super-sample covariance (SSC) is the dominant source of statistical error on large scale structure (LSS) observables for both current and future galaxy surveys. In this work, we concentrate on the SSC of cluster counts, also known as sample variance, which is particularly useful for the self-calibration of the cluster observable-mass relation; our approach can similarly be applied to other observables, such as galaxy clustering and lensing shear. We first examined the accuracy of two analytical approximations proposed in the literature for the flat sky limit, finding that they are accurate at the 15% and 30-35% level, respectively, for covariances of counts in the same redshift bin. We then developed a harmonic expansion formalism that allows for the prediction of SSC in an arbitrary survey mask geometry, such as the large sky areas of current and future surveys. We show analytically and numerically that this formalism recovers the full sky and flat sky limits present in the literature. We then present an efficient numerical implementation of the formalism, which allows fast and easy runs of covariance predictions when the survey mask is modified. We applied our method to a mask that is broadly similar to the Dark Energy Survey footprint, finding a non-negligible negative cross-z covariance, i.e., redshift bins are anti-correlated. We also examined the case of data removal from holes due to, for example, bright stars, quality cuts, or systematic removals, and find that this does not have noticeable effects on the structure of the SSC matrix, only rescaling its amplitude by the effective survey area. These advances enable analytical covariances of LSS observables to be computed for current and future galaxy surveys, which cover large areas of the sky where the flat sky approximation fails.
Integrating Science and Engineering to Implement Evidence-Based Practices in Health Care Settings.
Wu, Shinyi; Duan, Naihua; Wisdom, Jennifer P; Kravitz, Richard L; Owen, Richard R; Sullivan, J Greer; Wu, Albert W; Di Capua, Paul; Hoagwood, Kimberly Eaton
2015-09-01
Integrating two distinct and complementary paradigms, science and engineering, may produce more effective outcomes for the implementation of evidence-based practices in health care settings. Science formalizes and tests innovations, whereas engineering customizes and optimizes how the innovation is applied, tailoring it to accommodate local conditions. Together they may accelerate the creation of an evidence-based healthcare system that works effectively in specific health care settings. We give examples of applying engineering methods for better quality, more efficient, and safer implementation of clinical practices, medical devices, and health services systems. A specific example was applying systems engineering design that orchestrated people, process, data, decision-making, and communication through a technology application to implement evidence-based depression care among low-income patients with diabetes. We recommend that leading journals recognize the fundamental role of engineering in implementation research, to improve understanding of design elements that create a better fit between program elements and local context.
Feynman’s clock, a new variational principle, and parallel-in-time quantum dynamics
McClean, Jarrod R.; Parkhill, John A.; Aspuru-Guzik, Alán
2013-01-01
We introduce a discrete-time variational principle inspired by the quantum clock originally proposed by Feynman and use it to write down quantum evolution as a ground-state eigenvalue problem. The construction allows one to apply ground-state quantum many-body theory to quantum dynamics, extending the reach of many highly developed tools from this fertile research area. Moreover, this formalism naturally leads to an algorithm to parallelize quantum simulation over time. We draw an explicit connection between previously known time-dependent variational principles and the time-embedded variational principle presented. Sample calculations are presented, applying the idea to a hydrogen molecule and the spin degrees of freedom of a model inorganic compound, demonstrating the parallel speedup of our method as well as its flexibility in applying ground-state methodologies. Finally, we take advantage of the unique perspective of this variational principle to examine the error of basis approximations in quantum dynamics. PMID:24062428
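The construction is small enough to sketch for one qubit. This toy (the step size, step count, and penalty strength are invented) builds the clock Hamiltonian, pins the initial time slice, and checks that the ground state reproduces the exact evolution:

```python
import numpy as np
from scipy.linalg import expm

T = 8                                   # time steps
d, nt = 2, T + 1                        # system and clock dimensions
H_sys = np.array([[0.0, 1.0], [1.0, 0.0]])   # Pauli-X generator
U = expm(-1j * 0.3 * H_sys)             # one propagation step
psi0 = np.array([1.0, 0.0], dtype=complex)

def proj(t, s):
    """Clock-register operator |t><s|."""
    P = np.zeros((nt, nt), dtype=complex)
    P[t, s] = 1.0
    return P

I = np.eye(d)
C = np.zeros((d * nt, d * nt), dtype=complex)
for t in range(1, nt):                  # Feynman-Kitaev clock terms
    C += 0.5 * (np.kron(I, proj(t - 1, t - 1)) + np.kron(I, proj(t, t))
                - np.kron(U, proj(t, t - 1))
                - np.kron(U.conj().T, proj(t - 1, t)))
# Penalty pinning the t = 0 slice to the initial state |psi0>.
C += np.kron(I - np.outer(psi0, psi0.conj()), proj(0, 0))

E, V = np.linalg.eigh(C)
ground = V[:, 0].reshape(d, nt)         # columns are the time slices
exact = np.linalg.matrix_power(U, 3) @ psi0
slice3 = ground[:, 3] / np.linalg.norm(ground[:, 3])
print(f"ground energy = {E[0]:.2e}, "
      f"overlap with exact step 3 = {abs(np.vdot(exact, slice3)):.6f}")
```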
The quantization of the chiral Schwinger model based on the BFT-BFV formalism
NASA Astrophysics Data System (ADS)
Kim, Won T.; Kim, Yong-Wan; Park, Mu-In; Park, Young-Jai; Yoon, Sean J.
1997-03-01
We apply the newly improved Batalin-Fradkin-Tyutin (BFT) Hamiltonian method to the chiral Schwinger model in the case of the regularization ambiguity a>1. We show that one can systematically construct the first class constraints by the BFT Hamiltonian method, and also show that the well-known Dirac brackets of the original phase space variables are exactly the Poisson brackets of the corresponding modified fields in the extended phase space. Furthermore, we show that the first class Hamiltonian is simply obtained by replacing the original fields in the canonical Hamiltonian by these modified fields. Performing the momentum integrations, we obtain the corresponding first class Lagrangian in the configuration space.
Prototype design based on NX subdivision modeling application
NASA Astrophysics Data System (ADS)
Zhan, Xianghui; Li, Xiaoda
2018-04-01
Prototype design is an important part of product design: a quick and easy way to draw a three-dimensional product prototype. Combined with actual production, the prototype can be modified several times, resulting in a highly efficient and reasonable design before the formal design. Subdivision modeling is a common method of modeling product prototypes. Through Subdivision modeling, people can obtain a three-dimensional model of a product prototype in a short time with a simple operation. This paper discusses the operation method of Subdivision modeling for geometry. Taking a vacuum cleaner as an example, the NX Subdivision modeling functions are applied. Finally, the development of Subdivision modeling is forecast.
Gapless Spin-Liquid Ground State in the S =1 /2 Kagome Antiferromagnet
NASA Astrophysics Data System (ADS)
Liao, H. J.; Xie, Z. Y.; Chen, J.; Liu, Z. Y.; Xie, H. D.; Huang, R. Z.; Normand, B.; Xiang, T.
2017-03-01
The defining problem in frustrated quantum magnetism, the ground state of the nearest-neighbor S =1 /2 antiferromagnetic Heisenberg model on the kagome lattice, has defied all theoretical and numerical methods employed to date. We apply the formalism of tensor-network states, specifically the method of projected entangled simplex states, which combines infinite system size with a correct accounting for multipartite entanglement. By studying the ground-state energy, the finite magnetic order appearing at finite tensor bond dimensions, and the effects of a next-nearest-neighbor coupling, we demonstrate that the ground state is a gapless spin liquid. We discuss the comparison with other numerical studies and the physical interpretation of this result.
The Measurement Process in the Generalized Contexts Formalism for Quantum Histories
NASA Astrophysics Data System (ADS)
Losada, Marcelo; Vanni, Leonardo; Laura, Roberto
2016-02-01
In the interpretations of quantum mechanics involving quantum histories there is no collapse postulate, and the measurement is considered as a quantum interaction between the measured system and the measurement instrument. For two consecutive non-ideal measurements on the same system, we prove that both pointer indications at the end of each measurement are compatible properties in our generalized context formalism for quantum histories. Immediately after the first measurement, an effective state for the measured system is deduced from the formalism, generalizing the state that would be obtained by applying the state collapse postulate.
Unified theory for inhomogeneous thermoelectric generators and coolers including multistage devices.
Gerstenmaier, York Christian; Wachutka, Gerhard
2012-11-01
A novel generalized Lagrange multiplier method for functional optimization with inclusion of subsidiary conditions is presented and applied to the optimization of material distributions in thermoelectric converters. Multistaged devices are considered within the same formalism by inclusion of position-dependent electric current in the legs leading to a modified thermoelectric equation. Previous analytical solutions for maximized efficiencies for generators and coolers obtained by Sherman [J. Appl. Phys. 31, 1 (1960)], Snyder [Phys. Rev. B 86, 045202 (2012)], and Seifert et al. [Phys. Status Solidi A 207, 760 (2010)] by a method of local optimization of reduced efficiencies are recovered by independent proof. The outstanding maximization problems for generated electric power and cooling power can be solved swiftly numerically by solution of a differential equation-system obtained within the new formalism. As far as suitable materials are available, the inhomogeneous TE converters can have increased performance by use of purely temperature-dependent material properties in the thermoelectric legs or by use of purely spatial variation of material properties or by a combination of both. It turns out that the optimization domain is larger for the second kind of device which can, thus, outperform the first kind of device.
NASA Astrophysics Data System (ADS)
Nouri, N. M.; Mostafapour, K.; Kamran, M.
2018-02-01
In a closed water-tunnel circuit, multi-component strain gauge force and moment sensors (also known as balances) are generally used to measure hydrodynamic forces and moments acting on scaled models. These balances are periodically calibrated by static loading, and their performance and accuracy depend significantly on the rig and the method of calibration. In this research, a new calibration rig was designed and constructed to calibrate multi-component internal strain gauge balances. The calibration rig has six degrees of freedom and six different component-loading structures that can be applied separately and synchronously. The system was designed based on the applicability of formal experimental design techniques, using gravity for balance loading and for balance positioning and alignment relative to gravity. To evaluate the calibration rig, a six-component internal balance developed by Iran University of Science and Technology was calibrated using response surface methodology. According to the results, the calibration rig met all design criteria. This rig provides the means by which various methods of formal experimental design techniques can be implemented. The simplicity of the rig saves time and money in the design of experiments and in balance calibration while simultaneously increasing the accuracy of these activities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mohr, Stephan; Masella, Michel; Ratcliff, Laura E.
We present, within Kohn-Sham Density Functional Theory calculations, a quantitative method to identify and assess the partitioning of a large quantum mechanical system into fragments. We then introduce a simple and efficient formalism (which can be written as a generalization of other well-known population analyses) to extract, from first principles, electrostatic multipoles for these fragments. The corresponding fragment multipoles can in this way be seen as reliable (pseudo-) observables. By applying our formalism within the code BigDFT, we show that the usage of a minimal set of in-situ optimized basis functions is of utmost importance for having at the same time a proper fragment definition and an accurate description of the electronic structure. With this approach it becomes possible to simplify the modeling of environmental fragments by a set of multipoles, without notable loss of precision in the description of the active quantum mechanical region. Furthermore, this leads to a considerable reduction of the degrees of freedom by an effective coarse-graining approach, eventually also paving the way towards efficient QM/QM and QM/MM methods coupling together different levels of accuracy.
Off-energy-shell p-p scattering at sub-Coulomb energies via the Trojan horse method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tumino, A.; Dipartimento di Metodologie Fisiche e Chimiche per l'Ingegneria, Universita di Catania, Catania; Universita Kore di Enna, Enna
2008-12-15
Two-proton scattering at sub-Coulomb energies has been measured indirectly via the Trojan horse method applied to the p + d → p + p + n reaction to investigate off-energy-shell effects for scattering processes. The three-body experiment was performed at 5 and 4.7 MeV, corresponding to a p-p relative energy ranging from 80 to 670 keV. The free p-p cross section exhibits a deep minimum right within this relative energy region due to Coulomb plus nuclear destructive interference. No minimum occurs instead in the Trojan horse p-p cross section, which was extracted by employing a simple plane-wave impulse approximation. A detailed formalism was developed to build up the expression of the theoretical half-off-shell p-p cross section. Its behavior agrees with the Trojan horse data and in turn formally fits the n-n, n-p, and nuclear p-p cross sections, given the fact that in its expression the Coulomb amplitude is negligible with respect to the nuclear one. These results confirm the Trojan horse suppression of the Coulomb amplitude for scattering due to the off-shell character of the process.
Multi-Attribute Tradespace Exploration in Space System Design
NASA Astrophysics Data System (ADS)
Ross, A. M.; Hastings, D. E.
2002-01-01
The complexity inherent in space systems necessarily requires intense expenditures of resources, both human and monetary. The high level of ambiguity present in the early design phases of these systems causes long, highly iterative, and costly design cycles. This paper looks at incorporating decision theory methods into the early design processes to streamline communication of wants and needs among stakeholders and between levels of design. Communication channeled through formal utility interviews and analysis enables engineers to better understand the key drivers for the system and allows a more thorough exploration of the design tradespace. Multi-Attribute Tradespace Exploration (MATE), an evolving process incorporating decision theory into model- and simulation-based design, has been applied to several space system case studies at MIT. Preliminary results indicate that this process can improve the quality of communication to more quickly resolve project ambiguity, and enable the engineer to discover better value designs for multiple stakeholders. MATE is also being integrated into a concurrent design environment to facilitate the transfer of knowledge about important drivers into higher-fidelity design phases. Formal utility theory provides a mechanism to bridge the language barrier between experts of different backgrounds and differing needs (e.g. scientists, engineers, managers, etc.). MATE with concurrent design couples decision makers more closely to the design, and most importantly, maintains their presence between formal reviews.
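The core loop of a tradespace exploration is easy to sketch; the design variables, attribute utilities, weights, and cost model below are invented for illustration and carry none of the fidelity of a real MATE study:

```python
import itertools

design_space = {                      # enumerable design variables (toy)
    "orbit_alt_km": [400, 800, 1200],
    "aperture_m": [0.3, 0.5, 1.0],
    "n_satellites": [1, 3, 6],
}
weights = {"coverage": 0.4, "resolution": 0.35, "availability": 0.25}

def attributes(d):
    """Map a design vector to single-attribute utilities in [0, 1]."""
    return {
        "coverage": min(1.0, d["n_satellites"] * d["orbit_alt_km"] / 4000),
        "resolution": min(1.0, d["aperture_m"] * 800 / d["orbit_alt_km"]),
        "availability": min(1.0, 0.5 + 0.1 * d["n_satellites"]),
    }

def cost(d):                          # notional cost model
    return 50 * d["n_satellites"] * (1 + d["aperture_m"]) + 0.05 * d["orbit_alt_km"]

def utility(d):                       # linear-additive multi-attribute utility
    a = attributes(d)
    return sum(weights[k] * a[k] for k in weights)

tradespace = [dict(zip(design_space, v))
              for v in itertools.product(*design_space.values())]
for d in sorted(tradespace, key=utility, reverse=True)[:3]:
    print(f"u={utility(d):.2f}  cost={cost(d):6.1f}  {d}")
```

In a real study the single-attribute utilities come from formal stakeholder interviews rather than closed-form guesses, and designs are examined on the full utility-cost plane rather than ranked by utility alone.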
Applying Evidence-Based Medicine in Telehealth: An Interactive Pattern Recognition Approximation
Fernández-Llatas, Carlos; Meneu, Teresa; Traver, Vicente; Benedi, José-Miguel
2013-01-01
Born in the early nineteen-nineties, evidence-based medicine (EBM) is a paradigm intended to promote the integration of biomedical evidence into the physicians' daily practice. This paradigm requires the continuous study of diseases to provide the best scientific knowledge for closely supporting physicians in their diagnoses and treatments. Within this paradigm, health experts usually create and publish clinical guidelines, which provide holistic guidance for the care of a given disease. The creation of these clinical guidelines requires hard iterative processes in which each iteration represents scientific progress in the knowledge of the disease. To perform this guidance through telehealth, the use of formal clinical guidelines will allow the building of care processes that can be interpreted and executed directly by computers. In addition, the formalization of clinical guidelines makes it possible to build automatic methods, using pattern recognition techniques, to estimate the proper models, as well as the mathematical models for optimizing the iterative cycle for the continuous improvement of the guidelines. However, to ensure the efficiency of the system, it is necessary to build a probabilistic model of the problem. In this paper, an interactive pattern recognition approach to support professionals in evidence-based medicine is formalized. PMID:24185841
NASA Astrophysics Data System (ADS)
Leigh, Nathan W. C.; Wegsman, Shalma
2018-05-01
We present a formalism for constructing schematic diagrams to depict chaotic three-body interactions in Newtonian gravity. This is done by decomposing each interaction into a series of discrete transformations in energy- and angular momentum-space. Each time a transformation is applied, the system changes state as the particles re-distribute their energy and angular momenta. These diagrams have the virtue of containing all of the quantitative information needed to fully characterize most bound or unbound interactions through time and space, including the total duration of the interaction, and the initial and final stable states in addition to every intervening temporary meta-stable state. As shown via an illustrative example for the bound case, prolonged excursions of one of the particles, which by far dominate the computational cost of the simulations, are reduced to a single discrete transformation in energy- and angular momentum-space, thereby potentially mitigating any computational expense. We further generalize our formalism to sequences of (unbound) three-body interactions, as occur in dense stellar environments during binary hardening. Finally, we provide a method for dynamically evolving entire populations of binaries via three-body scattering interactions, using a purely analytic formalism. In principle, the techniques presented here are adaptable to other three-body problems that conserve energy and angular momentum.
Verification of NASA Emergent Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike
2004-01-01
NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study of swarm-based missions with which to experiment and test current formal methods with intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.
2017-04-17
Keywords: Cyberphysical Systems, Formal Methods, Requirements Patterns, AADL, Assume Guarantee Reasoning Environment. [Only a fragment of the report body survives extraction:] Rockwell Collins has been addressing these challenges by developing compositional reasoning methods that permit the verification of systems that exceed...
Dominant partition method. [based on a wave function formalism
NASA Technical Reports Server (NTRS)
Dixon, R. M.; Redish, E. F.
1979-01-01
By use of the L'Huillier, Redish, and Tandy (LRT) wave function formalism, a partially connected method, the dominant partition method (DPM), is developed for obtaining few-body reductions of the many-body problem in the LRT and Bencze, Redish, and Sloan (BRS) formalisms. The DPM maps the many-body problem to a fewer-body one by using the criterion that the truncated formalism must be such that consistency with the full Schroedinger equation is preserved. The DPM is based on a class of new forms for the irreducible cluster potential, which is introduced in the LRT formalism. Connectivity is maintained with respect to all partitions containing a given partition, which is referred to as the dominant partition. Degrees of freedom corresponding to the breakup of one or more of the clusters of the dominant partition are treated in a disconnected manner. This approach for simplifying the complicated BRS equations is appropriate for physical problems where a few-body reaction mechanism prevails.
Uncertain sightings and the extinction of the Ivory-billed Woodpecker.
Solow, Andrew; Smith, Woollcott; Burgman, Mark; Rout, Tracy; Wintle, Brendan; Roberts, David
2012-02-01
The extinction of a species can be inferred from a record of its sightings. Existing methods for doing so assume that all sightings in the record are valid. Often, however, there are sightings of uncertain validity. To date, uncertain sightings have been treated in an ad hoc way, either excluded from the record or included as if they were certain. We developed a Bayesian method that formally accounts for such uncertain sightings. The method assumes that valid and invalid sightings follow independent Poisson processes and uses noninformative prior distributions for the rate of valid sightings and for a measure of the quality of uncertain sightings. We applied the method to a recently published record of sightings of the Ivory-billed Woodpecker (Campephilus principalis). This record covers the period 1897-2010 and contains 39 sightings classified as certain and 29 classified as uncertain. The Bayes factor in favor of extinction was 4.03, which constitutes substantial support for extinction. The posterior distribution of the time of extinction has 3 main modes in 1944, 1952, and 1988. The method can be applied to sighting records of other purportedly extinct species.
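As a toy illustration of the underlying model (hypothetical sighting years and fixed rates; the paper instead integrates the rates out under noninformative priors and reports a Bayes factor), one can scan a piecewise Poisson log-likelihood over candidate extinction times:

```python
# Toy version of the idea, not the paper's actual Bayesian machinery:
# certain sightings come from a valid process that stops at extinction time tau;
# uncertain sightings mix valid (before tau) and invalid (any time) processes.
import numpy as np

certain = np.array([1900, 1910, 1924, 1932, 1944])   # hypothetical years
uncertain = np.array([1950, 1968, 1988, 2004])       # hypothetical years
t0, t1 = 1897, 2010                                  # record period

def log_lik(tau, lam_c=0.05, lam_uv=0.02, lam_ui=0.03):
    """Log-likelihood of the record given extinction at time tau (rates fixed)."""
    if np.any(certain > tau):     # a certain sighting after tau is impossible
        return -np.inf
    ll = len(certain) * np.log(lam_c) - lam_c * (tau - t0)
    n_before = np.sum(uncertain <= tau)
    n_after = len(uncertain) - n_before
    ll += n_before * np.log(lam_uv + lam_ui) - lam_uv * (tau - t0)
    ll += n_after * np.log(lam_ui) - lam_ui * (t1 - t0)
    return ll

taus = np.arange(1944, 2011)
lls = np.array([log_lik(t) for t in taus])
print("most likely extinction year (this toy):", taus[np.argmax(lls)])
```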
Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
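A heavily simplified sketch of the idea (an assumed toy encoding, not the authors' method): scenarios written as alternating state/event sequences are folded mechanically into a single finite-state model, with conflicts between scenarios surfacing during the fold.

```python
# Assumed toy rendition of scenarios-to-model transformation: each scenario is
# an alternating list [state, event, state, event, state, ...]; folding them
# into one transition table detects inconsistencies between scenarios.
scenarios = [
    ["idle", "arm", "armed", "launch", "ascent"],
    ["idle", "arm", "armed", "abort", "idle"],
]

def build_fsm(scenarios):
    delta = {}  # (state, event) -> next state
    for sc in scenarios:
        states, events = sc[0::2], sc[1::2]
        for s, e, s_next in zip(states, events, states[1:]):
            if delta.setdefault((s, e), s_next) != s_next:
                raise ValueError(f"scenarios conflict on ({s}, {e})")
    return delta

fsm = build_fsm(scenarios)
print(fsm[("armed", "launch")])  # -> 'ascent'; the table seeds code generation
```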
Stability of flat spacetime in quantum gravity
NASA Astrophysics Data System (ADS)
Jordan, R. D.
1987-12-01
In a previous paper, a modified effective-action formalism was developed which produces equations satisfied by the expectation value of the field, rather than the usual in-out average. Here this formalism is applied to a quantized scalar field in a background which is a small perturbation from Minkowski spacetime. The one-loop effective field equation describes the back reaction of created particles on the gravitational field, and is calculated in this paper to linear order in the perturbation. In this way we rederive an equation first found by Horowitz using completely different methods. This equation possesses exponentially growing solutions, so we confirm Horowitz's conclusion that flat spacetime is unstable in this approximation to the theory. The new derivation shows that the field equation is just as useful as the one-loop approximation to the in-out equation, contrary to earlier arguments. However, the instability suggests that the one-loop approximation cannot be trusted for gravity. These results are compared with the corresponding situation in QED and QCD.
The theory of stochastic cosmological lensing
NASA Astrophysics Data System (ADS)
Fleury, Pierre; Larena, Julien; Uzan, Jean-Philippe
2015-11-01
On the scale of the light beams subtended by small sources, e.g. supernovae, matter cannot be accurately described as a fluid, which questions the applicability of standard cosmic lensing to those cases. In this article, we propose a new formalism to deal with small-scale lensing as a diffusion process: the Sachs and Jacobi equations governing the propagation of narrow light beams are treated as Langevin equations. We derive the associated Fokker-Planck-Kolmogorov equations, and use them to deduce general analytical results on the mean and dispersion of the angular distance. This formalism is applied to random Einstein-Straus Swiss-cheese models, allowing us to: (1) show an explicit example of the involved calculations; (2) check the validity of the method against both ray-tracing simulations and direct numerical integration of the Langevin equation. As a byproduct, we obtain a post-Kantowski-Dyer-Roeder approximation, accounting for the effect of tidal distortions on the angular distance, in excellent agreement with numerical results. Besides, the dispersion of the angular distance is correctly reproduced in some regimes.
NASA Astrophysics Data System (ADS)
Samlan, C. T.; Naik, Dinesh N.; Viswanathan, Nirmal K.
2016-09-01
Discovered in 1813, the conoscopic interference pattern observed due to light propagating through a crystal, kept between crossed polarizers, shows isochromates and isogyres, respectively containing information about the dynamic and geometric phase acquired by the beam. We propose and demonstrate a closed-fringe Fourier analysis method to disentangle the isogyres from the isochromates, leading us to the azimuthally varying geometric phase and its manifestation as isogyres. This azimuthally varying geometric phase is shown to be the underlying mechanism for the spin-to-orbital angular momentum conversion observed in a diverging optical field propagating through a z-cut uniaxial crystal. We extend the formalism to study the optical activity mediated uniaxial-to-biaxial transformation due to a weak transverse electric field applied across the crystal. Closely associated with the phase and polarization singularities of the optical field, the formalism enables us to understand crystal optics in a new way, paving the way to anticipate several emerging phenomena.
Black holes and black strings of N = 2, d = 5 supergravity in the H-FGK formalism
NASA Astrophysics Data System (ADS)
Meessen, Patrick; Ortín, Tomás; Perz, Jan; Shahbazi, C. S.
2012-09-01
We study general classes and properties of extremal and non-extremal static black-hole solutions of N = 2, d = 5 supergravity coupled to vector multiplets, using the recently proposed H-FGK formalism, which we also extend to static black strings. We explain how to determine the integration constants and physical parameters of the black-hole and black-string solutions. We derive some model-independent statements, including the transformation of non-extremal flow equations to the form of those for the extremal flow. We apply our methods to the construction of example solutions (among others, a new extremal string solution of heterotic string theory on K3 × S1). In the cases where we have calculated it explicitly, the product of the areas of the inner and outer horizons of a non-extremal solution coincides with the square of the moduli-independent area of the horizon of the extremal solution with the same charges.
Stevenson, Gareth P; Baker, Ruth E; Kennedy, Gareth F; Bond, Alan M; Gavaghan, David J; Gillow, Kathryn
2013-02-14
The potential-dependences of the rate constants associated with heterogeneous electron transfer predicted by the empirically based Butler-Volmer and fundamentally based Marcus-Hush formalisms are well documented for dc cyclic voltammetry. However, the differences are often subtle, so, presumably on the basis of simplicity, the Butler-Volmer method is generally employed in theoretical-experimental comparisons. In this study, the ability of Large Amplitude Fourier Transform AC Cyclic Voltammetry to distinguish the difference in behaviour predicted by the two formalisms has been investigated. The focus of this investigation is on the difference in the profiles of the first to sixth harmonics, which are readily accessible when a large amplitude of the applied ac potential is employed. In particular, it is demonstrated that systematic analysis of the higher order harmonic responses in suitable kinetic regimes allows predicted deviations of Marcus-Hush from Butler-Volmer behaviour to be established from a single experiment, under conditions where the background charging current is minimal.
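The contrast the two formalisms predict can be reproduced in a few lines (a hedged sketch in dimensionless units with arbitrary prefactors; the Marcus-Hush curve here is the standard Marcus-Hush-Chidsey integral, not the paper's full ac-voltammetry simulation):

```python
# Sketch (assumed parameters): potential dependence of the reductive rate
# constant in the Butler-Volmer (BV) and Marcus-Hush-Chidsey (MHC) pictures.
# Energies are in units of kT; prefactors are arbitrary, so only shapes matter.
import numpy as np

def k_bv(eta, alpha=0.5):
    return np.exp(-alpha * eta)          # BV: pure exponential in overpotential

def k_mhc(eta, lam=20.0):
    # MHC: Gaussian Marcus factor weighted by the Fermi-Dirac occupation of
    # metal electron levels x (in kT), integrated on a uniform grid.
    x = np.linspace(-200.0, 200.0, 20001)
    fermi = 0.5 * (1.0 - np.tanh(x / 2.0))        # 1/(1+exp(x)), overflow-safe
    integrand = np.exp(-(x - lam - eta) ** 2 / (4 * lam)) * fermi
    return integrand.sum() * (x[1] - x[0])

for eta in np.linspace(-30, 30, 7):
    print(f"eta={eta:+6.1f}  BV={k_bv(eta):.3e}  MHC={k_mhc(eta):.3e}")
# BV grows without bound as eta -> -inf; MHC saturates, the kind of
# kinetic signature that higher ac harmonics can resolve.
```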
Primordial Black Holes from First Principles (Overview)
NASA Astrophysics Data System (ADS)
Lam, Casey; Bloomfield, Jolyon; Moss, Zander; Russell, Megan; Face, Stephen; Guth, Alan
2017-01-01
Given a power spectrum from inflation, our goal is to calculate, from first principles, the number density and mass spectrum of primordial black holes that form in the early universe. Previously, these have been calculated using the Press-Schechter formalism and some demonstrably dubious rules of thumb regarding predictions of black hole collapse. Instead, we use Monte Carlo integration methods to sample field configurations from a power spectrum, combined with numerical relativity simulations, to obtain a more accurate picture of primordial black hole formation. We demonstrate how this can be applied for both Gaussian perturbations and the more interesting (for primordial black holes) theory of hybrid inflation. One of the tools that we employ is a variant of the BBKS formalism for computing the statistics of density peaks in the early universe. We discuss the issue of overcounting due to subpeaks that can arise from this approach (the "cloud-in-cloud" problem).
Systems, methods and apparatus for pattern matching in procedure development and verification
NASA Technical Reports Server (NTRS)
Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor)
2011-01-01
Systems, methods and apparatus are provided through which, in some embodiments, a formal specification is pattern-matched from scenarios, the formal specification is analyzed, and flaws in the formal specification are corrected. The systems, methods and apparatus may include pattern-matching an equivalent formal model from an informal specification. Such a model can be analyzed for contradictions, conflicts, use of resources before the resources are available, competition for resources, and so forth. From such a formal model, an implementation can be automatically generated in a variety of notations. The approach can improve the resulting implementation, which, in some embodiments, is provably equivalent to the procedures described at the outset; this improves confidence that the system reflects the requirements, which in turn reduces system development time and the amount of testing required of a new system. Moreover, in some embodiments, two or more implementations can be "reversed" to appropriate formal models, the models can be combined, and the resulting combination checked for conflicts. Then the combined, error-free model can be used to generate a new (single) implementation that combines the functionality of the original separate implementations, and may be more likely to be correct.
Lagrangian methods in the analysis of nonlinear wave interactions in plasma
NASA Technical Reports Server (NTRS)
Galloway, J. J.
1972-01-01
An averaged-Lagrangian method is developed for obtaining the equations which describe the nonlinear interactions of the wave (oscillatory) and background (nonoscillatory) components which comprise a continuous medium. The method applies to monochromatic waves in any continuous medium that can be described by a Lagrangian density, but is demonstrated in the context of plasma physics. The theory is presented in a more general and unified form by way of a new averaged-Lagrangian formalism which simplifies the perturbation ordering procedure. Earlier theory is extended to deal with a medium distributed in velocity space and to account for the interaction of the background with the waves. The analytic steps are systematized, so as to maximize calculational efficiency. An assessment of the applicability and limitations of the method shows that it has some definite advantages over other approaches in efficiency and versatility.
Formal verification of software-based medical devices considering medical guidelines.
Daw, Zamira; Cleaveland, Rance; Vetter, Marcus
2014-01-01
Software-based devices have increasingly become an important part of several clinical scenarios. Due to their critical impact on human life, medical devices have very strict safety requirements. It is therefore necessary to apply verification methods to ensure that the safety requirements are met. Verification of software-based devices is commonly limited to the verification of their internal elements, without considering the interaction that these elements have with other devices or with the application environment in which they are used. Medical guidelines define clinical procedures, which contain the necessary information to completely verify medical devices. The objective of this work was to incorporate medical guidelines into the verification process in order to increase the reliability of software-based medical devices. Medical devices are developed using the model-driven method Deterministic Models for Signal Processing of Embedded Systems (DMOSES). This method uses Unified Modeling Language (UML) models as a basis for the development of medical devices. The UML activity diagram is used to describe medical guidelines as workflows, and the functionality of the medical devices is abstracted as a set of actions modeled within these workflows. In this paper, the UML models are verified using the UPPAAL model checker. For this purpose, a formalization approach for the UML models using timed automata (TA) is presented. A set of requirements is verified by the proposed approach for navigation-guided biopsy, showing the capability of identifying errors and optimization points both in the workflow and in the system design of the navigation device. In addition, an open-source Eclipse plug-in was developed for the automated transformation of UML models into TA models that are automatically verified using UPPAAL. The proposed method enables developers to model medical devices and their clinical environment, using clinical workflows, as one UML diagram; additionally, the system design can be formally verified automatically.
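A drastically simplified, untimed rendition of the verification step (an assumed toy workflow; the paper works with UPPAAL on timed automata) shows the shape of the check: exhaustively explore the combined workflow/device state space and assert a safety property in every reachable state.

```python
# Assumed toy: the clinical workflow and device form one transition system;
# a breadth-first reachability search checks a safety property over all states.
from collections import deque

# state = (workflow_step, tracking_on, needle_advancing)
def successors(state):
    step, tracking, needle = state
    succ = []
    if step == "plan":
        succ.append(("register", tracking, needle))
    elif step == "register":
        succ.append(("navigate", True, needle))    # registration enables tracking
    elif step == "navigate":
        succ.append(("navigate", tracking, True))  # advance the needle
        succ.append(("done", False, False))
    return succ

def unsafe(state):
    _, tracking, needle = state
    return needle and not tracking   # advancing without tracking is unsafe

start = ("plan", False, False)
seen, queue = {start}, deque([start])
while queue:
    s = queue.popleft()
    assert not unsafe(s), f"safety violation reachable: {s}"
    for t in successors(s):
        if t not in seen:
            seen.add(t)
            queue.append(t)
print(f"verified {len(seen)} reachable states; property holds")
```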
Design of environmental education module towards the needs of aboriginal community learning
NASA Astrophysics Data System (ADS)
Dasman, Siti Mariam; Yasin, Ruhizan Mohammad
2017-05-01
Non-formal education (NFE) refers to programs designed for the personal and social education of learners, to improve their skills and competencies outside the formal educational curriculum. Issues related to the geography and environment of the Aboriginal community, which differ from those of other communities, play an important role in determining the types and methods of education that should be made available to this minority group. This concept paper therefore addresses environmental education through the design and development of learning modules, based on non-formal education, for the learning of the Aboriginal community. The design and construction of the modules follow the Design and Development Research (DDR) approach, built on the instructional design model of Morrison, Kemp and Ross, which is flexible and prioritizes the needs and characteristics of the learners who will be involved in the future learning modules. The discussion covers module development suited to the learning needs of the community, and several recommendations are made for implementing this approach. In conclusion, the Orang Asli community should be offered the same education as other communities, but it is important to adapt the learning techniques and approaches used in the education system to their standards. The implication of this concept paper is that meeting the community's environmental education needs involves aspects of science and learning activities that use effective approaches, such as play and the construction of their own meanings.
Numerical method for N electrons bound to a polar quantum dot with a Coulomb impurity
NASA Astrophysics Data System (ADS)
Yau, J. K.; Lee, C. M.
2003-03-01
A numerical method is proposed to treat the Frohlich Hamiltonian of N electrons bound to a polar quantum dot with a Coulomb impurity, by direct diagonalization and without transformation to the center-of-mass coordinate frame. As an example to demonstrate the formalism of this method, the low-lying spectra of three interacting electrons bound to an on-center Coulomb impurity, both for acceptor and donor, are calculated and analyzed in a polar quantum dot under a perpendicular magnetic field. Taking the polaron effect into account, the physical meaning of the phonon-induced terms of the Hamiltonian, both the self-squared terms and the cross terms, is discussed. The calculation can also be applied to systems containing particles with opposite charges, such as excitons.
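The generic skeleton of such a direct-diagonalization calculation looks as follows (an assumed one-electron, one-dimensional toy with a softened impurity; the paper's Hamiltonian is the N-electron Frohlich Hamiltonian with phonon terms, which is far richer):

```python
# Generic direct-diagonalization skeleton (assumed toy, dimensionless units):
# one electron on a 1D grid with parabolic dot confinement plus an attractive
# Coulomb-like impurity, softened to avoid the on-site singularity.
import numpy as np

n, L = 800, 40.0                    # grid points, box size
x = np.linspace(-L / 2, L / 2, n)
dx = x[1] - x[0]

V = 0.5 * 0.1 * x**2 - 1.0 / np.sqrt(x**2 + 0.5)   # confinement + impurity

# kinetic energy -(1/2) d^2/dx^2 by second-order finite differences
H = np.diag(V + 1.0 / dx**2)
H += np.diag(-0.5 / dx**2 * np.ones(n - 1), 1)
H += np.diag(-0.5 / dx**2 * np.ones(n - 1), -1)

E, psi = np.linalg.eigh(H)          # low-lying spectrum comes out sorted
print("lowest levels:", E[:4])
```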
A hierarchical transition state search algorithm
NASA Astrophysics Data System (ADS)
del Campo, Jorge M.; Köster, Andreas M.
2008-07-01
A hierarchical transition state search algorithm is developed and its implementation in the density functional theory program deMon2k is described. This search algorithm combines the double-ended saddle interpolation method with local uphill trust region optimization. A new formalism for the incorporation of the distance constraint in the saddle interpolation method is derived. The similarities between the constrained optimizations in the local trust region method and the saddle interpolation are highlighted. The saddle interpolation and local uphill trust region optimizations are validated on a test set of 28 representative reactions. The hierarchical transition state search algorithm is applied to an intramolecular Diels-Alder reaction with several internal rotors, which makes automatic transition state search rather challenging. The obtained reaction mechanism is discussed in the context of the experimentally observed product distribution.
NASA Langley's Formal Methods Research in Support of the Next Generation Air Transportation System
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Munoz, Cesar A.
2008-01-01
This talk will provide a brief introduction to the formal methods developed at NASA Langley and the National Institute of Aerospace (NIA) for air traffic management applications. NASA Langley's formal methods research supports the Interagency Joint Planning and Development Office (JPDO) effort to define and develop the 2025 Next Generation Air Transportation System (NGATS). The JPDO was created by the passage of the Vision 100 Century of Aviation Reauthorization Act in December 2003. The NGATS vision calls for a major transformation of the nation's air transportation system that will enable growth to 3 times the traffic of the current system. The transformation will require an unprecedented level of safety-critical automation used in complex procedural operations based on 4-dimensional (4D) trajectories that enable dynamic reconfiguration of airspace scalable to geographic and temporal demand. The goal of our formal methods research is to provide verification methods that can be used to ensure the safety of the NGATS system. Our work has focused on the safety assessment of concepts of operation and fundamental algorithms for conflict detection and resolution (CD&R) and self-spacing in the terminal area. Formal analysis of a concept of operations is a novel area of application of formal methods: here one must establish that a system concept involving aircraft, pilots, and ground resources is safe. The formal analysis of algorithms is a more traditional endeavor. However, the formal analysis of ATM algorithms involves reasoning about the interaction of algorithmic logic and aircraft trajectories defined over an airspace. These trajectories are described using 2D and 3D vectors and are often constrained by trigonometric relations; thus, in many cases it has been necessary to bring the full power of an advanced theorem prover to bear. The verification challenge is to establish that the safety-critical algorithms produce valid solutions that are guaranteed to maintain separation under all possible scenarios. Current research has assumed perfect knowledge of the location of other aircraft in the vicinity, so absolute guarantees are possible, but increasingly we are relaxing the assumptions to allow incomplete, inaccurate, and/or faulty information from communication sources.
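For flavor, the kind of predicate at the heart of such CD&R algorithms can be stated in a few lines (the textbook closest-point-of-approach construction with assumed parameter values, not NASA Langley's verified algorithms):

```python
# Sketch (assumed parameters): two aircraft in the horizontal plane are "in
# conflict" within lookahead T if their predicted separation drops below D.
import numpy as np

def horizontal_conflict(p_rel, v_rel, D=5.0, T=300.0):
    """p_rel: relative position (nmi); v_rel: relative velocity (nmi/s)."""
    a = np.dot(v_rel, v_rel)
    if a == 0.0:                          # identical velocities: gap is constant
        return np.dot(p_rel, p_rel) < D * D
    t_cpa = -np.dot(p_rel, v_rel) / a     # time of closest approach
    t_cpa = min(max(t_cpa, 0.0), T)       # clamp into the lookahead window
    closest = p_rel + t_cpa * v_rel
    return np.dot(closest, closest) < D * D

p = np.array([40.0, 3.0])      # 40 nmi ahead, 3 nmi lateral offset
v = np.array([-0.25, 0.0])     # closing at 0.25 nmi/s (900 kt relative)
print(horizontal_conflict(p, v))  # True: predicted miss distance 3 nmi < 5 nmi
```

Proving that a resolution maneuver keeps this predicate false under all scenarios is the part that calls for a theorem prover rather than testing.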
Caricato, Marco
2013-07-28
The calculation of vertical electronic transition energies of molecular systems in solution with accurate quantum mechanical methods requires the use of approximate and yet reliable models to describe the effect of the solvent on the electronic structure of the solute. The polarizable continuum model (PCM) of solvation represents a computationally efficient way to describe this effect, especially when combined with coupled cluster (CC) methods. Two formalisms are available to compute transition energies within the PCM framework: State-Specific (SS) and Linear-Response (LR). The former provides a more complete account of the solute-solvent polarization in the excited states, while the latter is computationally very efficient (i.e., comparable to gas phase) and transition properties are well defined. In this work, I review the theory for the two formalisms within CC theory with a focus on their computational requirements, and present the first implementation of the LR-PCM formalism with the coupled cluster singles and doubles method (CCSD). Transition energies computed with LR- and SS-CCSD-PCM are presented, as well as a comparison between solvation models in the LR approach. The numerical results show that the two formalisms provide different absolute values of transition energy, but similar relative solvatochromic shifts (from nonpolar to polar solvents). The LR formalism may then be used to explore the solvent effect on multiple states and evaluate transition probabilities, while the SS formalism may be used to refine the description of specific states and for the exploration of excited state potential energy surfaces of solvated systems.
Chromotomography for a rotating-prism instrument using backprojection, then filtering.
Deming, Ross W
2006-08-01
A simple closed-form solution is derived for reconstructing a 3D spatial-chromatic image cube from a set of chromatically dispersed 2D image frames. The algorithm is tailored for a particular instrument in which the dispersion element is a matching set of mechanically rotated direct vision prisms positioned between a lens and a focal plane array. By using a linear operator formalism to derive the Tikhonov-regularized pseudoinverse operator, it is found that the unique minimum-norm solution is obtained by applying the adjoint operator, followed by 1D filtering with respect to the chromatic variable. Thus the filtering and backprojection (adjoint) steps are applied in reverse order relative to an existing method. Computational efficiency is provided by use of the fast Fourier transform in the filtering step.
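The algebra behind applying the steps in this order can be checked numerically with the push-through identity for the Tikhonov-regularized pseudoinverse (a generic demonstration with a random operator, standing in for the paper's instrument model):

```python
# Generic check (assumed random operator): the Tikhonov pseudoinverse can be
# applied as "backproject (adjoint), then filter" or "filter, then backproject";
# the two orderings are algebraically identical.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((60, 40))   # forward operator: image cube -> frames
y = rng.standard_normal(60)         # stack of measured frames
eps = 1e-2                          # Tikhonov regularization

# adjoint first, then a filter in object space ...
x1 = np.linalg.solve(A.T @ A + eps * np.eye(40), A.T @ y)
# ... equals a filter in data space followed by the adjoint
x2 = A.T @ np.linalg.solve(A @ A.T + eps * np.eye(60), y)
print(np.allclose(x1, x2))          # True
```

In the instrument geometry of the paper, the object-space filter reduces to a 1D filter along the chromatic variable, which is what makes the adjoint-then-filter ordering attractive.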
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mashouf, S; Lai, P; Karotki, A
2014-06-01
Purpose: Seed brachytherapy is currently used for adjuvant radiotherapy of early stage prostate and breast cancer patients. The current standard for calculation of dose surrounding the brachytherapy seeds is based on the American Association of Physicists in Medicine Task Group No. 43 (TG-43) formalism, which generates the dose in a homogeneous water medium. Recently, AAPM Task Group No. 186 emphasized the importance of accounting for tissue heterogeneities. This can be done using Monte Carlo (MC) methods, but it requires knowing the source structure and tissue atomic composition accurately. In this work we describe an efficient analytical dose inhomogeneity correction algorithm, implemented on the MIM Symphony treatment planning platform, to calculate dose distributions in heterogeneous media. Methods: An Inhomogeneity Correction Factor (ICF) is introduced as the ratio of absorbed dose in tissue to that in water medium. ICF is a function of tissue properties and independent of source structure. The ICF is extracted using CT images, and the absorbed dose in tissue can then be calculated by multiplying the dose as calculated by the TG-43 formalism by the ICF. To evaluate the methodology, we compared our results with Monte Carlo simulations as well as experiments in phantoms with known density and atomic compositions. Results: The dose distributions obtained by applying the ICF to the TG-43 protocol agreed very well with those of Monte Carlo simulations as well as experiments in all phantoms. In all cases, the mean relative error was reduced by at least 50% when the ICF correction factor was applied to the TG-43 protocol. Conclusion: We have developed a new analytical dose calculation method which enables personalized dose calculations in heterogeneous media. The advantages over stochastic methods are computational efficiency and the ease of integration into clinical settings, as detailed source structure and tissue segmentation are not needed. University of Toronto, Natural Sciences and Engineering Research Council of Canada.
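The correction step itself is a voxel-wise multiplication, as in the following sketch (assumed shapes and ICF values; the CT-derived ICF model is the paper's contribution and is not reproduced here):

```python
# Minimal sketch (assumed values): heterogeneity-corrected dose as the TG-43
# water dose multiplied voxel-by-voxel by an inhomogeneity correction factor.
import numpy as np

def tg43_point_source(r_cm, S_k=1.0, Lam=0.965):
    """Toy TG-43 point-source dose rate ~ S_k * Lambda / r^2 (g(r), F omitted)."""
    return S_k * Lam / np.maximum(r_cm, 0.05) ** 2

# voxel grid around a seed at the origin (cm)
z, y, x = np.mgrid[-2:2:41j, -2:2:41j, -2:2:41j]
r = np.sqrt(x**2 + y**2 + z**2)
dose_water = tg43_point_source(r)

icf = np.ones_like(r)            # ICF = 1 reproduces TG-43 in water
icf[z > 1.0] = 0.9               # hypothetical low-density tissue region
dose_tissue = dose_water * icf   # heterogeneity-corrected dose
print(dose_tissue.shape, float(dose_tissue.max()))
```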
Lfm2000: Fifth NASA Langley Formal Methods Workshop
NASA Technical Reports Server (NTRS)
Holloway, C. Michael (Compiler)
2000-01-01
This is the proceedings of Lfm2000: Fifth NASA Langley Formal Methods Workshop. The workshop was held June 13-15, 2000, in Williamsburg, Virginia.
A direct method for unfolding the resolution function from measurements of neutron induced reactions
NASA Astrophysics Data System (ADS)
Žugec, P.; Colonna, N.; Sabate-Gilarte, M.; Vlachoudis, V.; Massimi, C.; Lerendegui-Marco, J.; Stamatopoulos, A.; Bacak, M.; Warren, S. G.; n TOF Collaboration
2017-12-01
The paper explores the numerical stability and the computational efficiency of a direct method for unfolding the resolution function from the measurements of the neutron induced reactions. A detailed resolution function formalism is laid out, followed by an overview of challenges present in a practical implementation of the method. A special matrix storage scheme is developed in order to facilitate both the memory management of the resolution function matrix, and to increase the computational efficiency of the matrix multiplication and decomposition procedures. Due to its admirable computational properties, a Cholesky decomposition is at the heart of the unfolding procedure. With the smallest but necessary modification of the matrix to be decomposed, the method is successfully applied to a system of size 10^5 × 10^5. However, the amplification of the uncertainties during the direct inversion procedures limits the applicability of the method to high-precision measurements of neutron induced reactions.
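The core numerical step can be sketched at toy scale (assumed Gaussian resolution and yield; the paper works with matrices of order 10^5 and a dedicated storage scheme):

```python
# Toy unfolding (assumed shapes): smear a "true" yield with a symmetric
# positive-definite Gaussian resolution matrix, then recover it by Cholesky
# decomposition with a tiny diagonal term added so the factorization is robust.
import numpy as np

n = 200
E = np.linspace(0.0, 1.0, n)
dE = E[1] - E[0]
sigma = 0.01                                   # detector resolution width
R = np.exp(-(E[:, None] - E[None, :]) ** 2 / (2 * sigma**2))
R *= dE / (sigma * np.sqrt(2 * np.pi))         # unit-area Gaussian smearing

y_true = np.exp(-(E - 0.4) ** 2 / (2 * 0.02**2))   # smooth "true" yield
y_meas = R @ y_true                                 # what the detector records

A = R + 1e-12 * np.eye(n)        # the "smallest but necessary" modification
L = np.linalg.cholesky(A)        # Cholesky at the heart of the unfolding
z = np.linalg.solve(L, y_meas)   # forward substitution
y_unf = np.linalg.solve(L.T, z)  # back substitution
print("max unfolding error:", np.max(np.abs(y_unf - y_true)))
# tiny here; real counting noise is amplified by the inversion, as the paper notes
```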
Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing
2017-01-01
Formal techniques have been devoted to analyzing whether network protocol specifications violate security policies; however, these methods cannot detect vulnerabilities in the implementations of the network protocols themselves. Symbolic execution can be used to analyze the paths of the network protocol implementations, but for stateful network protocols, it is difficult to reach the deep states of the protocol. This paper proposes a novel model-guided approach to detect vulnerabilities in network protocol implementations. Our method first abstracts a finite state machine (FSM) model, then utilizes the model to guide the symbolic execution. This approach achieves high coverage of both the code and the protocol states. The proposed method is implemented and applied to test numerous real-world network protocol implementations. The experimental results indicate that the proposed method is more effective than traditional fuzzing methods such as SPIKE at detecting vulnerabilities in the deep states of network protocol implementations.
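A toy rendition of the model-guided idea (an assumed protocol and state set; real engines drive symbolic execution of binaries, not dictionary lookups): the abstracted FSM enumerates message sequences that steer testing into deep protocol states instead of fuzzing blindly from the initial state.

```python
# Assumed toy FSM for a TCP-like handshake; each enumerated message sequence
# seeds one guided test run that reaches a deep protocol state.
FSM = {
    ("CLOSED", "SYN"): "SYN_RCVD",
    ("SYN_RCVD", "ACK"): "ESTABLISHED",
    ("ESTABLISHED", "DATA"): "ESTABLISHED",
    ("ESTABLISHED", "FIN"): "CLOSE_WAIT",
}

def paths_to_depth(fsm, start, depth):
    """Enumerate message sequences reaching states up to a given depth."""
    frontier = [(start, [])]
    for _ in range(depth):
        nxt = []
        for state, path in frontier:
            for (s, msg), s2 in fsm.items():
                if s == state:
                    yield path + [msg], s2
                    nxt.append((s2, path + [msg]))
        frontier = nxt

for msgs, state in paths_to_depth(FSM, "CLOSED", 3):
    print(state, "<-", msgs)
```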
The Hierarchical Structure of Formal Operational Tasks.
ERIC Educational Resources Information Center
Bart, William M.; Mertens, Donna M.
1979-01-01
The hierarchical structure of the formal operational period of Piaget's theory of cognitive development was explored through the application of ordering theoretical methods to a set of data that systematically utilized the various formal operational schemes. Results suggested a common structure underlying task performance. (Author/BH)
Ma, Q; Boulet, C
2016-06-14
The Robert-Bonamy formalism has been commonly used to calculate half-widths and shifts of spectral lines for decades. This formalism is based on several approximations. Among them, two have not been fully addressed: the isolated line approximation and the neglect of coupling between the translational and internal motions. Recently, we have shown that the isolated line approximation is not necessary in developing semi-classical line shape theories. Based on this progress, we have been able to develop a new formalism that enables not only to reduce uncertainties on calculated half-widths and shifts, but also to model line mixing effects on spectra starting from the knowledge of the intermolecular potential. In our previous studies, the new formalism had been applied to linear and asymmetric-top molecules. In the present study, the method has been extended to symmetric-top molecules with inversion symmetry. As expected, the inversion splitting induces a complete failure of the isolated line approximation. We have calculated the complex relaxation matrices of self-broadened NH3. The half-widths and shifts in the ν1 and the pure rotational bands are reported in the present paper. When compared with measurements, the calculated half-widths match the experimental data very well, since the inapplicable isolated line approximation has been removed. With respect to the shifts, only qualitative results are obtained and discussed. Calculated off-diagonal elements of the relaxation matrix and a comparison with the observed line mixing effects are reported in the companion paper (Paper II).
Peer support for CKD patients and carers: overcoming barriers and facilitating access.
Taylor, Francesca; Gutteridge, Robin; Willis, Carol
2016-06-01
Peer support is valued by its users. Nevertheless, there is initial low take-up of formal peer support programmes among patients with chronic kidney disease (CKD), with fewer patients participating than expressing an interest. There is little evidence on reasons for low participation levels. Few studies have examined the perspectives of carers. To explore with CKD patients and carers their needs, wants and expectations from formal peer support and examine how barriers to participation may be overcome. Qualitative interviews with a sample of 26 CKD stage five patients and carers. Principles of Grounded Theory were applied to data coding and analysis. Six NHS Hospital Trusts. Whilst informal peer support might occur naturally and is welcomed, a range of emotional and practical barriers inhibit take-up of more formalized support. Receptivity varies across time and the disease trajectory and is associated with emotional readiness; patients and carers needing to overcome complex psychological hurdles such as acknowledging support needs. Practical barriers include limited understanding of peer support. An attractive peer relationship is felt to involve reciprocity based on sharing experiences and both giving and receiving support. Establishing rapport is linked with development of reciprocity. There is potential to facilitate active uptake of formal peer support by addressing the identified barriers. Our study suggests several facilitation methods, brought together in a conceptual model, including clinician promotion of peer support as an intervention suitable for anyone with CKD and their carers, and opportunity for choice of peer supporter. © 2015 The Authors Health Expectations Published by John Wiley & Sons Ltd.
Smith, Kyle K G; Poulsen, Jens Aage; Nyman, Gunnar; Cunsolo, Alessandro; Rossky, Peter J
2015-06-28
We apply the Feynman-Kleinert Quasi-Classical Wigner (FK-QCW) method developed in our previous work [Smith et al., J. Chem. Phys. 142, 244112 (2015)] for the determination of the dynamic structure factor of liquid para-hydrogen and ortho-deuterium at state points of (T = 20.0 K, n = 21.24 nm(-3)) and (T = 23.0 K, n = 24.61 nm(-3)), respectively. When applied to this challenging system, it is shown that this new FK-QCW method consistently reproduces the experimental dynamic structure factor reported by Smith et al. [J. Chem. Phys. 140, 034501 (2014)] for all momentum transfers considered. This shows that FK-QCW provides a substantial improvement over the Feynman-Kleinert linearized path-integral method, in which purely classical dynamics are used. Furthermore, for small momentum transfers, it is shown that FK-QCW provides nearly the same results as ring-polymer molecular dynamics (RPMD), thus suggesting that FK-QCW provides a potentially more appealing algorithm than RPMD since it is not formally limited to correlation functions involving linear operators.
NASA Technical Reports Server (NTRS)
Broderick, Ron
1997-01-01
The ultimate goal of this report was to integrate the powerful tools of artificial intelligence into the traditional process of software development. To maintain the US aerospace competitive advantage, traditional aerospace and software engineers need to more easily incorporate the technology of artificial intelligence into the advanced aerospace systems being designed today. The future goal was to transition artificial intelligence from an emerging technology to a standard technology that is considered early in the life cycle process to develop state-of-the-art aircraft automation systems. This report addressed the future goal in two ways. First, it provided a matrix that identified typical aircraft automation applications conducive to various artificial intelligence methods. The purpose of this matrix was to provide top-level guidance to managers contemplating the possible use of artificial intelligence in the development of aircraft automation. Second, the report provided a methodology to formally evaluate neural networks as part of the traditional process of software development. The matrix was developed by organizing the discipline of artificial intelligence into the following six methods: logical, object representation-based, distributed, uncertainty management, temporal and neurocomputing. Next, a study of existing aircraft automation applications that have been conducive to artificial intelligence implementation resulted in the following five categories: pilot-vehicle interface, system status and diagnosis, situation assessment, automatic flight planning, and aircraft flight control. The resulting matrix provided management guidance to understand artificial intelligence as it applied to aircraft automation. The approach taken to develop a methodology to formally evaluate neural networks as part of the software engineering life cycle was to start with the existing software quality assurance standards and to change these standards to include neural network development. The changes were to include evaluation tools that can be applied to neural networks at each phase of the software engineering life cycle. The result was a formal evaluation approach to increase the product quality of systems that use neural networks for their implementation.
A Formal Semantics for the WS-BPEL Recovery Framework
NASA Astrophysics Data System (ADS)
Dragoni, Nicola; Mazzara, Manuel
While current studies on Web services composition are mostly focused, from the technical viewpoint, on standards and protocols, this work investigates the adoption of formal methods for dependable composition. The Web Services Business Process Execution Language (WS-BPEL), an OASIS standard widely adopted both in academic and industrial environments, is considered as a touchstone for concrete composition languages, and an analysis of its ambiguous Recovery Framework specification is offered. In order to show the use of formal methods, a precise and unambiguous description of its (simplified) mechanisms is provided by means of a conservative extension of the pi-calculus. This is intended as a well-known case study providing methodological arguments for the adoption of formal methods in software specification. Verification is not the main topic of the paper, but some hints are given.
NASA Astrophysics Data System (ADS)
Batalin, Igor; Marnelius, Robert
1998-02-01
A general field-antifield BV formalism for antisymplectic first class constraints is proposed. It is as general as the corresponding symplectic BFV-BRST formulation and it is demonstrated to be consistent with a previously proposed formalism for antisymplectic second class constraints through a generalized conversion to corresponding first class constraints. Thereby the basic concept of gauge symmetry is extended to apply to quite a new class of gauge theories potentially possible to exist.
Cycle-expansion method for the Lyapunov exponent, susceptibility, and higher moments.
Charbonneau, Patrick; Li, Yue Cathy; Pfister, Henry D; Yaida, Sho
2017-09-01
Lyapunov exponents characterize the chaotic nature of dynamical systems by quantifying the growth rate of uncertainty associated with the imperfect measurement of initial conditions. Finite-time estimates of the exponent, however, experience fluctuations due to both the initial condition and the stochastic nature of the dynamical path. The scale of these fluctuations is governed by the Lyapunov susceptibility, the finiteness of which typically provides a sufficient condition for the law of large numbers to apply. Here, we obtain a formally exact expression for this susceptibility in terms of the Ruelle dynamical ζ function for one-dimensional systems. We further show that, for systems governed by sequences of random matrices, the cycle expansion of the ζ function enables systematic computations of the Lyapunov susceptibility and of its higher-moment generalizations. The method is here applied to a class of dynamical models that maps to static disordered spin chains with interactions stretching over a varying distance and is tested against Monte Carlo simulations.
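As a baseline for what the cycle expansion computes more systematically, the Lyapunov exponent and susceptibility of a random-matrix product can be estimated by direct Monte Carlo (an assumed toy ensemble, not the paper's spin-chain models):

```python
# Monte Carlo baseline (assumed toy ensemble): maximal Lyapunov exponent of a
# product of random 2x2 matrices, with finite-time fluctuations giving the
# susceptibility via chi ~ n * Var(lambda_n).
import numpy as np

rng = np.random.default_rng(1)

def finite_time_lyapunov(n_steps):
    v = np.array([1.0, 0.0])
    total = 0.0
    for _ in range(n_steps):
        M = np.array([[1.0, rng.normal(0, 0.5)],
                      [rng.normal(0, 0.5), 1.0]])
        v = M @ v
        norm = np.linalg.norm(v)
        total += np.log(norm)     # accumulate growth, then renormalize
        v /= norm
    return total / n_steps

n = 2000
samples = np.array([finite_time_lyapunov(n) for _ in range(200)])
print(f"lambda ~ {samples.mean():.4f}, susceptibility ~ {n * samples.var():.4f}")
```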
A study and evaluation of image analysis techniques applied to remotely sensed data
NASA Technical Reports Server (NTRS)
Atkinson, R. J.; Dasarathy, B. V.; Lybanon, M.; Ramapriyan, H. K.
1976-01-01
An analysis of phenomena causing nonlinearities in the transformation from Landsat multispectral scanner coordinates to ground coordinates is presented. Experimental results comparing rms errors at ground control points indicated a slight improvement when a nonlinear (8-parameter) transformation was used instead of an affine (6-parameter) transformation. Using a preliminary ground truth map of a test site in Alabama covering the Mobile Bay area and six Landsat images of the same scene, several classification methods were assessed. A methodology was developed for automatic change detection using classification/cluster maps. A coding scheme was employed for generation of change depiction maps indicating specific types of changes. Inter- and intraseasonal data of the Mobile Bay test area were compared to illustrate the method. A beginning was made in the study of data compression by applying a Karhunen-Loeve transform technique to a small section of the test data set. The second part of the report provides a formal documentation of the several programs developed for the analysis and assessments presented.
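The Karhunen-Loeve (principal component) compression step mentioned above reduces to a small eigenvalue computation, sketched here on assumed synthetic six-band data:

```python
# Karhunen-Loeve transform sketch (assumed synthetic data): project pixels of a
# six-band image onto the leading eigenvectors of their covariance matrix,
# keeping most of the variance in fewer channels.
import numpy as np

rng = np.random.default_rng(3)
pixels = rng.standard_normal((10000, 6)) @ rng.standard_normal((6, 6))

mean = pixels.mean(axis=0)
cov = np.cov(pixels - mean, rowvar=False)
w, V = np.linalg.eigh(cov)                 # ascending eigenvalues
order = np.argsort(w)[::-1]
w, V = w[order], V[:, order]

k = 3                                      # keep 3 of 6 channels
compressed = (pixels - mean) @ V[:, :k]
restored = compressed @ V[:, :k].T + mean
print("variance kept:", w[:k].sum() / w.sum())
print("rms reconstruction error:", np.sqrt(np.mean((restored - pixels) ** 2)))
```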
Automatically Grading Customer Confidence in a Formal Specification.
ERIC Educational Resources Information Center
Shukur, Zarina; Burke, Edmund; Foxley, Eric
1999-01-01
Describes an automatic grading system for a formal methods computer science course that is able to evaluate a formal specification written in the Z language. Quality is measured by considering first, specification correctness (syntax, semantics, and satisfaction of customer requirements), and second, specification maintainability (comparison of…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Junkel, G. C.; Gunderson, M. A.; Hooper, C. F.
Recently, there has been growing experimental evidence for redshifts in line spectra from highly ionized, high-Z radiators immersed in hot, dense plasmas [O. Renner, J. Quant. Spectrosc. Radiat. Transf. 58, 851 (1997); C. F. Hooper, in Strongly Coupled Coulomb Systems (Plenum, New York, 1998); N. C. Woolsey, J. Quant. Spectrosc. Radiat. Transf. 65, 573 (2000); A. Saemann, Phys. Rev. Lett. 82, 4843 (1999)]. A full Coulomb, multielectron formalism of line broadening due to perturbation by plasma electrons will be presented. A red line shift and asymmetries arise naturally from employing a full Coulomb expression for the perturber-radiator interaction, rather than applying the dipole approximation. This formalism can now be applied to arbitrary multielectron radiating ions.
Formalizing New Navigation Requirements for NASA's Space Shuttle
NASA Technical Reports Server (NTRS)
DiVito, Ben L.
1996-01-01
We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CRs) were selected as promising targets to demonstrate the utility of formal methods in this demanding application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this industrial usage report. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During a limited analysis conducted on the formal specifications, numerous requirements issues were discovered. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.
NASA Technical Reports Server (NTRS)
Boulet, C.; Ma, Qiancheng; Tipping, R. H.
2015-01-01
Starting from the refined Robert-Bonamy formalism [Q. Ma, C. Boulet, and R. H. Tipping, J. Chem. Phys. 139, 034305 (2013)], we propose here an extension of line mixing studies to infrared absorptions of linear polyatomic molecules having stretching and bending modes. The present formalism does not neglect the internal degrees of freedom of the perturbing molecules, contrary to the energy corrected sudden (ECS) modeling, and enables one to calculate the whole relaxation matrix starting from the potential energy surface. Meanwhile, similar to the ECS modeling, the present formalism properly accounts for roles played by all the internal angular momenta in the coupling process, including the vibrational angular momentum. The formalism has been applied to the important case of CO2 broadened by N2. Applications to two kinds of vibrational bands (σ → σ and σ → π) have shown that the present results are in good agreement with both experimental data and results derived from the ECS model.
Job Search Methods: Consequences for Gender-based Earnings Inequality.
ERIC Educational Resources Information Center
Huffman, Matt L.; Torres, Lisa
2001-01-01
Data from adults in Atlanta, Boston, and Los Angeles (n=1,942) who searched for work using formal (ads, agencies) or informal (networks) methods indicated that type of method used did not contribute to the gender gap in earnings. Results do not support formal job search as a way to reduce gender inequality. (Contains 55 references.) (SK)
Frequency-resolved Monte Carlo.
López Carreño, Juan Camilo; Del Valle, Elena; Laussy, Fabrice P
2018-05-03
We adapt the Quantum Monte Carlo method to the cascaded formalism of quantum optics, allowing us to simulate the emission of photons of known energy. Statistical processing of the photon clicks thus collected agrees with the theory of frequency-resolved photon correlations, extending the range of applications based on correlations of photons of prescribed energy, in particular those of a photon-counting character. We apply the technique to autocorrelations of photon streams from a two-level system under coherent and incoherent pumping, including the Mollow triplet regime where we demonstrate the direct manifestation of leapfrog processes in producing an increased rate of two-photon emission events.
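The standard Monte Carlo wave-function (quantum-jump) loop underlying such simulations is short; the sketch below (assumed parameters) handles a coherently driven two-level emitter, while the paper's contribution, the cascaded extension that also resolves the frequency of each click, is not reproduced here.

```python
# Standard Monte Carlo wave-function sketch (assumed parameters): a resonantly
# driven two-level emitter; each recorded jump time is one photon detection.
import numpy as np

rng = np.random.default_rng(2)
gamma, omega = 1.0, 3.0                   # decay rate, Rabi frequency
dt, t_max = 1e-3, 50.0

sm = np.array([[0, 1], [0, 0]], complex)  # sigma_minus: |e> -> |g>
H = 0.5 * omega * (sm + sm.conj().T)      # resonant drive, rotating frame
H_eff = H - 0.5j * gamma * (sm.conj().T @ sm)
U = np.eye(2) - 1j * H_eff * dt           # first-order non-unitary step

psi = np.array([1.0, 0.0], complex)       # start in the ground state |g>
clicks = []
for step in range(int(t_max / dt)):
    p_jump = gamma * dt * abs(psi[1]) ** 2     # jump probability this step
    if rng.random() < p_jump:
        psi = np.array([1.0, 0.0], complex)    # photon emitted: collapse to |g>
        clicks.append(step * dt)
    else:
        psi = U @ psi
        psi /= np.linalg.norm(psi)             # renormalize the no-jump branch
print(f"{len(clicks)} photon clicks; mean rate {len(clicks) / t_max:.3f}")
```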
A Test Generation Framework for Distributed Fault-Tolerant Algorithms
NASA Technical Reports Server (NTRS)
Goodloe, Alwyn; Bushnell, David; Miner, Paul; Pasareanu, Corina S.
2009-01-01
Heavyweight formal methods such as theorem proving have been successfully applied to the analysis of safety critical fault-tolerant systems. Typically, the models and proofs performed during such analysis do not inform the testing process of actual implementations. We propose a framework for generating test vectors from specifications written in the Prototype Verification System (PVS). The methodology uses a translator to produce a Java prototype from a PVS specification. Symbolic (Java) PathFinder is then employed to generate a collection of test cases. A small example is employed to illustrate how the framework can be used in practice.
Dynamical analysis on f(R, G) cosmology
NASA Astrophysics Data System (ADS)
Santos da Costa, S.; Roig, F. V.; Alcaniz, J. S.; Capozziello, S.; De Laurentis, M.; Benetti, M.
2018-04-01
We use a dynamical system approach to study the cosmological viability of f(R, G) gravity theories. The method consists of formulating the evolution equations as an autonomous system of ordinary differential equations, using suitable variables. The formalism is applied to a class of models in which f(R, G) ∝ R^n G^{1-n}, and its solutions and corresponding stability are analysed in detail. New accelerating solutions that can be attractors in the phase space are found. We also find that this class of models does not exhibit a matter-dominated epoch, a solution which is inconsistent with current cosmological observations.
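The recipe is generic and easy to state in code (an assumed toy planar system standing in for the f(R, G) cosmological equations): find the critical points of x' = f(x) and classify them by the eigenvalues of the Jacobian.

```python
# Generic dynamical-systems recipe (assumed toy system, not the paper's
# cosmological equations): locate critical points and classify stability.
import numpy as np
from scipy.optimize import fsolve

def flow(x):
    x1, x2 = x
    return np.array([x1 * (1 - x1) - x1 * x2, x2 * (x1 - 0.5)])

def jacobian(x, h=1e-6):
    J = np.zeros((2, 2))
    for j in range(2):
        e = np.zeros(2); e[j] = h
        J[:, j] = (flow(x + e) - flow(x - e)) / (2 * h)  # central differences
    return J

for guess in [(0.0, 0.0), (1.0, 0.0), (0.5, 0.5)]:
    cp = fsolve(flow, guess)
    eig = np.linalg.eigvals(jacobian(cp))
    kind = "attractor" if np.all(eig.real < 0) else "not an attractor"
    print(np.round(cp, 3), np.round(eig, 3), kind)
```

In the cosmological setting the attractors correspond to late-time regimes (e.g. accelerated expansion), and the absence of a suitable saddle signals the missing matter-dominated epoch.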
Lectures on the scattering of light. [by dielectric sphere
NASA Technical Reports Server (NTRS)
Saxon, D. S.
1974-01-01
The exact (Mie) theory for the scattering of a plane wave by a dielectric sphere is presented. Since this infinite series solution is computationally impractical for large spheres, another formulation is given in terms of an integral equation valid for a bounded, but otherwise general array of scatterers. This equation is applied to the scattering by a single sphere, and several methods are suggested for approximating the scattering cross section in closed form. A tensor scattering matrix is introduced, in terms of which some general scattering theorems are derived. The application of the formalism to multiple scattering is briefly considered.
A linguistic geometry for space applications
NASA Technical Reports Server (NTRS)
Stilman, Boris
1994-01-01
We develop a formal theory, the so-called Linguistic Geometry, in order to discover the inner properties of human expert heuristics that were successful in a certain class of complex control systems, and to apply them to different systems. This research relies on the formalization of the search heuristics of highly skilled human experts, which allow for the decomposition of a complex system into a hierarchy of subsystems, and thus solve intractable problems by reducing the search. The hierarchy of subsystems is represented as a hierarchy of formal attribute languages. This paper includes a formal survey of Linguistic Geometry and a new example of the solution of an optimization problem for space robotic vehicles. This example includes the actual generation of the hierarchy of languages and some details of trajectory generation, and demonstrates the drastic reduction of search in comparison with conventional search algorithms.
Asymptotic symmetries and geometry on the boundary in the first order formalism
NASA Astrophysics Data System (ADS)
Korovin, Yegor
2018-03-01
Proper understanding of the geometry on the boundary of a spacetime is a critical step on the way to extending holography to spaces with non-AdS asymptotics. In general the boundary cannot be described in terms of Riemannian geometry, and the first order formalism is more appropriate, as we show. We analyze the asymptotic symmetries in the first order formalism for large classes of theories on AdS, Lifshitz or flat space. In all cases the asymptotic symmetry algebra is realized on the first order variables as a gauged symmetry algebra. The first order formalism geometrizes and simplifies the analysis. We apply our framework to the issue of scale versus conformal invariance in AdS/CFT and obtain a new perspective on the structure of asymptotic expansions for AdS and flat spaces.
Teif, Vladimir B
2007-01-01
The transfer matrix methodology is proposed as a systematic tool for the statistical-mechanical description of DNA-protein-drug binding involved in gene regulation. We show that a genetic system of several cis-regulatory modules is calculable using this method, considering explicitly the site-overlapping, competitive, cooperative binding of regulatory proteins, their multilayer assembly and DNA looping. In the methodological section, the matrix models are solved for the basic types of short- and long-range interactions between DNA-bound proteins, drugs and nucleosomes. We apply the matrix method to gene regulation at the O(R) operator of phage lambda. The transfer matrix formalism allowed the description of the lambda-switch at a single-nucleotide resolution, taking into account the effects of a range of inter-protein distances. Our calculations confirm previously established roles of the contact CI-Cro-RNAP interactions. Concerning long-range interactions, we show that while the DNA loop between the O(R) and O(L) operators is important at the lysogenic CI concentrations, the interference between the adjacent promoters P(R) and P(RM) becomes more important at small CI concentrations. A large change in the expression pattern may arise in this regime due to anticooperative interactions between DNA-bound RNA polymerases. The applicability of the matrix method to more complex systems is discussed.
Teif, Vladimir B.
2007-01-01
The transfer matrix methodology is proposed as a systematic tool for the statistical–mechanical description of DNA–protein–drug binding involved in gene regulation. We show that a genetic system of several cis-regulatory modules is calculable using this method, considering explicitly the site-overlapping, competitive, cooperative binding of regulatory proteins, their multilayer assembly and DNA looping. In the methodological section, the matrix models are solved for the basic types of short- and long-range interactions between DNA-bound proteins, drugs and nucleosomes. We apply the matrix method to gene regulation at the OR operator of phage λ. The transfer matrix formalism allowed the description of the λ-switch at a single-nucleotide resolution, taking into account the effects of a range of inter-protein distances. Our calculations confirm previously established roles of the contact CI–Cro–RNAP interactions. Concerning long-range interactions, we show that while the DNA loop between the OR and OL operators is important at the lysogenic CI concentrations, the interference between the adjacent promoters PR and PRM becomes more important at small CI concentrations. A large change in the expression pattern may arise in this regime due to anticooperative interactions between DNA-bound RNA polymerases. The applicability of the matrix method to more complex systems is discussed. PMID:17526526
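The core of the transfer matrix trick is easy to show in a minimal 1D setting: two states per DNA site (empty/bound) and a nearest-neighbour cooperativity factor. This sketch omits the paper's multi-protein, multilayer and looping machinery; binding constant Kc and cooperativity w are illustrative.

```python
import numpy as np

def partition(N, Kc, w):
    """Partition function of an N-site lattice. Site states: 0 = empty
    (weight 1), 1 = bound (weight Kc); w = nearest-neighbour cooperativity."""
    T = np.array([[1.0, Kc],
                  [1.0, w * Kc]])    # T[s_prev, s_next] carries the new site's weight
    v = np.array([1.0, Kc])          # statistical weight of the first site
    for _ in range(N - 1):
        v = v @ T
    return v.sum()

def occupancy(N, Kc, w, eps=1e-6):
    """Mean fractional occupancy via (d ln Z / d ln Kc) / N."""
    lo, hi = partition(N, Kc, w), partition(N, Kc * (1 + eps), w)
    return (np.log(hi) - np.log(lo)) / np.log(1 + eps) / N

for w in (0.1, 1.0, 10.0):           # anti-cooperative, neutral, cooperative
    print(w, round(occupancy(100, 0.5, w), 3))
```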
Deep first formal concept search.
Zhang, Tao; Li, Hui; Hong, Wenxue; Yuan, Xiamei; Wei, Xinyu
2014-01-01
The calculation of formal concepts is a very important part of the theory of formal concept analysis (FCA); however, within the framework of FCA, computing all formal concepts is the main challenge because of its exponential complexity and the difficulty of visualizing the calculation process. Building on the basic idea of depth-first search, this paper presents a visualization algorithm based on the attribute topology of the formal context. Constrained by the calculation rules, all concepts are obtained by a global, visualized search for formal concepts over the topology, degenerated with fixed start and end points, without repetition or omission. This method makes the calculation of formal concepts precise and easy to operate, and reflects the integrity of the algorithm, making it suitable for visualization analysis.
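For reference, a standard baseline (not the attribute-topology algorithm of the paper) enumerates all formal concepts by closing intersections of object intents; every intent of a context arises this way.

```python
def formal_concepts(context):
    """context: dict object -> frozenset of attributes.
    Returns all (extent, intent) pairs of the formal context."""
    objects = list(context)
    all_attrs = frozenset().union(*context.values())
    # Every intent is an intersection of object intents (plus the full set).
    intents, frontier = {all_attrs}, {all_attrs}
    while frontier:
        nxt = set()
        for it in frontier:
            for g in objects:
                cand = it & context[g]
                if cand not in intents:
                    intents.add(cand)
                    nxt.add(cand)
        frontier = nxt
    concepts = []
    for it in sorted(intents, key=len):
        extent = frozenset(g for g in objects if it <= context[g])
        concepts.append((extent, it))
    return concepts

ctx = {"g1": frozenset("ab"), "g2": frozenset("bc"), "g3": frozenset("abc")}
for ext, itn in formal_concepts(ctx):
    print(sorted(ext), sorted(itn))
```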
Software Tools for Formal Specification and Verification of Distributed Real-Time Systems.
1997-09-30
set of software tools for specification and verification of distributed real-time systems using formal methods. The task of this SBIR Phase II effort...to be used by designers of real-time systems for early detection of errors. The mathematical complexity of formal specification and verification has
Formalisms for user interface specification and design
NASA Technical Reports Server (NTRS)
Auernheimer, Brent J.
1989-01-01
The application of formal methods to the specification and design of human-computer interfaces is described. A broad outline of human-computer interface problems, a description of the field of cognitive engineering and two relevant research results, the appropriateness of formal specification techniques, and potential NASA application areas are described.
A STUDY OF FORMALLY ADVERTISED PROCUREMENT
As a method of procuring goods and services, formally advertised procurement offers a number of advantages. These include the prevention of fraud and...two-thirds of all contracts are let in these cases. This is done by examining over 2,300 contracts let under formal advertising procedures. A measure of
Geometry and Formal Linguistics.
ERIC Educational Resources Information Center
Huff, George A.
This paper presents a method of encoding geometric line-drawings in a way which allows sets of such drawings to be interpreted as formal languages. A characterization of certain geometric predicates in terms of their properties as languages is obtained, and techniques usually associated with generative grammars and formal automata are then applied…
Jiang, Qijun; Bregt, Arnold K; Kooistra, Lammert
2018-04-01
Environmental sensing data provide crucial information for environment-related decision-making. Formal data are provided by official environmental institutes. Beyond those, however, there is a growing body of so-called informal sensing data contributed by citizens using low-cost sensors. How good are these informal data, and how might they be applied alongside formal environmental sensing data? Could both types of sensing data be gainfully integrated? This paper presents the results of an online survey investigating how citizen science communities, environmental institutes and their networks perceive formal and informal environmental sensing data. The results show that citizens and experts had different views of formal and informal environmental sensing data, particularly on measurement frequency and the informational value of the data. However, there was agreement, too, for example on the accuracy of formal environmental sensing data. Furthermore, both agreed that the integration of formal and informal environmental sensing data offered potential for improvements in several respects, particularly spatial coverage, data quantity and measurement frequency. Interestingly, the accuracy of informal environmental sensing data was largely unknown to both experts and citizens. This suggests the need for further investigation of informal environmental sensing data and of the potential for its effective integration with formal environmental sensing data, if hurdles like standardisation can be overcome.
A methodology for commonality analysis, with applications to selected space station systems
NASA Technical Reports Server (NTRS)
Thomas, Lawrence Dale
1989-01-01
The application of commonality in a system represents an attempt to reduce costs by reducing the number of unique components. A formal method for conducting commonality analysis has not been established. In this dissertation, commonality analysis is characterized as a partitioning problem. The cost impacts of commonality are quantified in an objective function, and the solution is that partition which minimizes this objective function. Clustering techniques are used to approximate a solution, and sufficient conditions are developed which can be used to verify the optimality of the solution. This method for commonality analysis is general in scope. It may be applied to the various types of commonality analysis required in the conceptual, preliminary, and detail design phases of the system development cycle.
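The partitioning idea can be sketched with a toy objective: a fixed cost per unique design plus a penalty for forcing dissimilar components into one standardized group, minimized by greedy agglomerative merging. This stands in for the clustering step only; the dissertation's optimality-verification conditions are not reproduced, and all costs below are made up for illustration.

```python
import numpy as np
from itertools import combinations

# Each row is a required component, described by design attributes.
parts = np.array([[1.0, 2.0], [1.1, 2.1], [5.0, 1.0], [5.2, 0.9], [9.0, 9.0]])
UNIQUE_COST = 1.0   # fixed cost of carrying one more unique design
PENALTY = 0.5       # cost per unit attribute mismatch within a group

def cost(partition):
    total = UNIQUE_COST * len(partition)
    for group in partition:
        centre = parts[group].mean(axis=0)
        total += PENALTY * np.abs(parts[group] - centre).sum()
    return total

# Greedy agglomeration: merge the pair of groups that lowers the objective most.
partition = [[i] for i in range(len(parts))]
while True:
    best = min(((cost([g for k, g in enumerate(partition) if k not in (i, j)]
                      + [partition[i] + partition[j]]), i, j)
                for i, j in combinations(range(len(partition)), 2)),
               default=None)
    if best is None or best[0] >= cost(partition):
        break
    _, i, j = best
    partition = ([g for k, g in enumerate(partition) if k not in (i, j)]
                 + [partition[i] + partition[j]])
print(partition, round(cost(partition), 2))
```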
Logic-Based Models for the Analysis of Cell Signaling Networks
2010-01-01
Computational models are increasingly used to analyze the operation of complex biochemical networks, including those involved in cell signaling networks. Here we review recent advances in applying logic-based modeling to mammalian cell biology. Logic-based models represent biomolecular networks in a simple and intuitive manner without describing the detailed biochemistry of each interaction. A brief description of several logic-based modeling methods is followed by six case studies that demonstrate biological questions recently addressed using logic-based models and point to potential advances in model formalisms and training procedures that promise to enhance the utility of logic-based methods for studying the relationship between environmental inputs and phenotypic or signaling state outputs of complex signaling networks. PMID:20225868
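The simplest logic-based formalism is a synchronous Boolean network: each node is on/off and is updated by a logic rule over its inputs, with no biochemical detail. The three-node motif below is hypothetical, chosen only to show how a negative feedback loop yields an oscillation.

```python
# Minimal synchronous Boolean network for a hypothetical signalling motif:
# ligand activates receptor; receptor activates kinase unless the
# phosphatase (itself switched on by the kinase) is active.
rules = {
    "receptor":    lambda s: s["ligand"],
    "kinase":      lambda s: s["receptor"] and not s["phosphatase"],
    "phosphatase": lambda s: s["kinase"],
    "ligand":      lambda s: s["ligand"],          # held fixed as an input
}

def step(state):
    return {node: bool(f(state)) for node, f in rules.items()}

state = {"ligand": True, "receptor": False, "kinase": False, "phosphatase": False}
for t in range(6):
    print(t, state)
    state = step(state)   # the negative feedback produces a kinase oscillation
```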
Computing diffusivities from particle models out of equilibrium
NASA Astrophysics Data System (ADS)
Embacher, Peter; Dirr, Nicolas; Zimmer, Johannes; Reina, Celia
2018-04-01
A new method is proposed to numerically extract the diffusivity of a (typically nonlinear) diffusion equation from underlying stochastic particle systems. The proposed strategy requires the system to be in local equilibrium and have Gaussian fluctuations but it is otherwise allowed to undergo arbitrary out-of-equilibrium evolutions. This could be potentially relevant for particle data obtained from experimental applications. The key idea underlying the method is that finite, yet large, particle systems formally obey stochastic partial differential equations of gradient flow type satisfying a fluctuation-dissipation relation. The strategy is here applied to three classic particle models, namely independent random walkers, a zero-range process and a symmetric simple exclusion process in one space dimension, to allow the comparison with analytic solutions.
Grooms receives 2011 Donald L. Turcotte Award
NASA Astrophysics Data System (ADS)
2012-02-01
Ian Grooms has been awarded the AGU Donald L. Turcotte Award, given annually to recent Ph.D. recipients for outstanding dissertation research that contributes directly to the field of nonlinear geophysics. Grooms's thesis is entitled “Asymptotic and numerical methods for rapidly rotating buoyant flow.” He presented an invited talk and was formally presented with the award at the 2011 AGU Fall Meeting, held 5-9 December in San Francisco, Calif. Grooms received his B.S. in mathematics from the College of William and Mary, Williamsburg, Va., in 2005. He received a Ph.D. in applied mathematics in 2011 under the supervision of Keith Julien at the University of Colorado at Boulder. His research interests include asymptotic and numerical methods for multiscale problems in geophysical fluid dynamics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tabacchi, G; Hutter, J; Mundy, C
2005-04-07
A combined linear response--frozen electron density model has been implemented in a molecular dynamics scheme derived from an extended Lagrangian formalism. This approach is based on a partition of the electronic charge distribution into a frozen region described by Kim-Gordon theory, and a response contribution determined by the instantaneous ionic configuration of the system. The method is free from empirical pair-potentials and the parameterization protocol involves only calculations on properly chosen subsystems. They apply this method to a series of alkali halides in different physical phases and are able to reproduce experimental structural and thermodynamic properties with an accuracy comparable to Kohn-Sham density functional calculations.
Automated Decomposition of Model-based Learning Problems
NASA Technical Reports Server (NTRS)
Williams, Brian C.; Millar, Bill
1996-01-01
A new generation of sensor-rich, massively distributed autonomous systems is being developed that has the potential for unprecedented performance, such as smart buildings, reconfigurable factories, adaptive traffic systems and remote earth ecosystem monitoring. To achieve high performance these massive systems will need to accurately model themselves and their environment from sensor information. Accomplishing this on a grand scale requires automating the art of large-scale modeling. This paper presents a formalization of decompositional model-based learning (DML), a method developed by observing a modeler's expertise at decomposing large scale model estimation tasks. The method exploits a striking analogy between learning and consistency-based diagnosis. Moriarty, an implementation of DML, has been applied to thermal modeling of a smart building, demonstrating a significant improvement in learning rate.
NASA Astrophysics Data System (ADS)
Kaplan, Melike; Hosseini, Kamyar; Samadani, Farzan; Raza, Nauman
2018-07-01
A wide range of problems in different fields of the applied sciences, especially non-linear optics, is described by non-linear Schrödinger's equations (NLSEs). In the present paper, a specific type of NLSE, the cubic-quintic non-linear Schrödinger's equation including an anti-cubic term, has been studied. The generalized Kudryashov method, along with a symbolic computation package, has been employed to carry out this objective. As a consequence, a series of optical soliton solutions have formally been retrieved. It is corroborated that the generalized form of the Kudryashov method is a direct, effectual, and reliable technique for dealing with various types of non-linear Schrödinger's equations.
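The kind of closed-form verification such methods produce can be reproduced symbolically. The check below is not the generalized Kudryashov machinery and uses the plain cubic NLSE (not the paper's cubic-quintic anti-cubic equation): it confirms that the classic bright-soliton ansatz is an exact solution.

```python
import sympy as sp

x, t = sp.symbols("x t", real=True)
psi = sp.sech(x) * sp.exp(sp.I * t)   # classic bright-soliton ansatz
# Cubic NLSE: i psi_t + psi_xx + 2 |psi|^2 psi = 0, with |psi|^2 = psi*conj(psi).
residual = (sp.I * sp.diff(psi, t) + sp.diff(psi, x, 2)
            + 2 * psi * sp.conjugate(psi) * psi)
print(sp.simplify(residual))   # prints 0: the ansatz solves the equation exactly
```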
Stopping power of dense plasmas: The collisional method and limitations of the dielectric formalism.
Clauser, C F; Arista, N R
2018-02-01
We present a study of the stopping power of plasmas using two main approaches: the collisional (scattering theory) and the dielectric formalisms. In the former case, we use a semiclassical method based on quantum scattering theory. In the latter case, we use the full description given by the extension of the Lindhard dielectric function for plasmas of all degeneracies. We compare these two theories and show that the dielectric formalism has limitations when it is used for slow heavy ions or atoms in dense plasmas. We present a study of these limitations and show the regimes where the dielectric formalism can be used, with appropriate corrections to include the usual quantum and classical limits. On the other hand, the semiclassical method shows the correct behavior for all plasma conditions and projectile velocity and charge. We consider different models for the ion charge distributions, including bare and dressed ions as well as neutral atoms.
ERIC Educational Resources Information Center
Crestani, Fabio; Dominich, Sandor; Lalmas, Mounia; van Rijsbergen, Cornelis Joost
2003-01-01
Discusses the importance of research on the use of mathematical, logical, and formal methods in information retrieval to help enhance retrieval effectiveness and clarify underlying concepts of information retrieval. Highlights include logic; probability; spaces; and future research needs. (Author/LRW)
First enantiocontrolled formal synthesis of (+)-neovibsanin B, a neurotrophic diterpenoid.
Esumi, Tomoyuki; Mori, Takehiro; Zhao, Ming; Toyota, Masao; Fukuyama, Yoshiyasu
2010-02-19
An enantiocontrolled formal synthesis of (+)-neovibsanin B has been achieved by a sequence that applies an asymmetric 1,4-addition of (H(2)C=CH)(2)Cu(CN)Li(2) to trisubstituted alpha,beta-carboxylic acid derivative 1 to induce the chirality at the C-11 all-carbon quaternary center. Together with a modified Negishi cyclic carbopalladation-carbonylative esterification tandem reaction for constructing the A-ring, the synthesis was completed.
Towards a formal semantics for Ada 9X
NASA Technical Reports Server (NTRS)
Guaspari, David; McHugh, John; Polak, Wolfgang; Saaltink, Mark
1995-01-01
The Ada 9X language precision team was formed during the revisions of Ada 83, with the goal of analyzing the proposed design, identifying problems, and suggesting improvements, through the use of mathematical models. This report defines a framework for formally describing Ada 9X, based on Kahn's 'natural semantics', and applies the framework to portions of the language. The proposals for exceptions and optimization freedoms are also analyzed, using a different technique.
Applying A Formal Language of Command and Control For Interoperability Between Systems
2008-05-21
Initially, it must be determined which type of grammar is to be used. The Chomsky hierarchy specifies that grammars can be Type 0 (unrestricted...future research. 2. Development of Formal Grammars In his book "Syntactic Structures" [5], published in 1957, Noam Chomsky answered the question...be finite because recursion is allowed. 2.2 Types of Grammars Chomsky defines four types of grammar. They are ordered within what is
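For concreteness, a Type-2 (context-free) grammar in the Chomsky hierarchy can be written as a set of rewrite rules and exercised by random derivation; the command-language vocabulary below is a made-up illustration, not the grammar from the report.

```python
import random

# A Type-2 (context-free) grammar for simple command sentences;
# uppercase symbols are nonterminals, lowercase strings are terminals.
grammar = {
    "S":  [["NP", "VP"]],
    "NP": [["unit"], ["sector"]],
    "VP": [["V", "NP"], ["V"]],
    "V":  [["advance"], ["hold"], ["engage"]],
}

def derive(symbol, rng):
    if symbol not in grammar:                 # terminal symbol
        return [symbol]
    production = rng.choice(grammar[symbol])  # pick one rewrite rule
    return [tok for part in production for tok in derive(part, rng)]

rng = random.Random(1)
for _ in range(3):
    print(" ".join(derive("S", rng)))
```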
Work reservoirs in thermodynamics
NASA Astrophysics Data System (ADS)
Anacleto, Joaquim
2010-05-01
We stress the usefulness of the work reservoir in the formalism of thermodynamics, in particular in the context of the first law. To elucidate its usefulness, the formalism is then applied to the Joule expansion and other peculiar and instructive experimental situations, clarifying the concepts of configuration and dissipative work. The ideas and discussions presented in this study are primarily intended for undergraduate students, but they might also be useful to graduate students, researchers and teachers.
Marino, Ricardo; Majumdar, Satya N; Schehr, Grégory; Vivo, Pierpaolo
2016-09-01
Let P_β^(V)(N_I) be the probability that an N×N β-ensemble of random matrices with confining potential V(x) has N_I eigenvalues inside an interval I=[a,b] on the real line. We introduce a general formalism, based on the Coulomb gas technique and the resolvent method, to compute analytically P_β^(V)(N_I) for large N. We show that this probability scales for large N as P_β^(V)(N_I) ≈ exp[-βN²ψ^(V)(N_I/N)], where β is the Dyson index of the ensemble. The rate function ψ^(V)(k_I), independent of β, is computed in terms of single integrals that can be easily evaluated numerically. The general formalism is then applied to the classical β-Gaussian (I=[-L,L]), β-Wishart (I=[1,L]), and β-Cauchy (I=[-L,L]) ensembles. Expanding the rate function around its minimum, we find that generically the number variance var(N_I) exhibits a nonmonotonic behavior as a function of the size of the interval, with a maximum that can be precisely characterized. These analytical results, corroborated by numerical simulations, provide the full counting statistics of many systems where random matrix models apply. In particular, we present results for the full counting statistics of zero-temperature one-dimensional spinless fermions in a harmonic trap.
Feedback control for unsteady flow and its application to the stochastic Burgers equation
NASA Technical Reports Server (NTRS)
Choi, Haecheon; Temam, Roger; Moin, Parviz; Kim, John
1993-01-01
The study applies mathematical methods of control theory to the problem of control of fluid flow, with the long-range objective of developing effective methods for the control of turbulent flows. Model problems are employed, through the formalism and language of control theory, to show how to cast the problem of controlling turbulence as a problem in optimal control theory. Methods of the calculus of variations, through the adjoint state and gradient algorithms, are used to present a suboptimal control and feedback procedure for stationary and time-dependent problems. Two types of controls are investigated: distributed and boundary controls. Several cases of both controls are numerically simulated to investigate the performance of the control algorithm. Most cases considered show significant reductions of the costs to be minimized. The dependence of the control algorithm on the time-discretization method is discussed.
From metadynamics to dynamics.
Tiwary, Pratyush; Parrinello, Michele
2013-12-06
Metadynamics is a commonly used and successful enhanced sampling method. By the introduction of a history dependent bias which depends on a restricted number of collective variables it can explore complex free energy surfaces characterized by several metastable states separated by large free energy barriers. Here we extend its scope by introducing a simple yet powerful method for calculating the rates of transition between different metastable states. The method does not rely on a previous knowledge of the transition states or reaction coordinates, as long as collective variables are known that can distinguish between the various stable minima in free energy space. We demonstrate that our method recovers the correct escape rates out of these stable states and also preserves the correct sequence of state-to-state transitions, with minimal extra computational effort needed over ordinary metadynamics. We apply the formalism to three different problems and in each case find excellent agreement with the results of long unbiased molecular dynamics runs.
NASA Astrophysics Data System (ADS)
Tiwary, Pratyush; Parrinello, Michele
2013-12-01
Metadynamics is a commonly used and successful enhanced sampling method. By the introduction of a history dependent bias which depends on a restricted number of collective variables it can explore complex free energy surfaces characterized by several metastable states separated by large free energy barriers. Here we extend its scope by introducing a simple yet powerful method for calculating the rates of transition between different metastable states. The method does not rely on a previous knowledge of the transition states or reaction coordinates, as long as collective variables are known that can distinguish between the various stable minima in free energy space. We demonstrate that our method recovers the correct escape rates out of these stable states and also preserves the correct sequence of state-to-state transitions, with minimal extra computational effort needed over ordinary metadynamics. We apply the formalism to three different problems and in each case find excellent agreement with the results of long unbiased molecular dynamics runs.
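The rate-recovery step amounts to rescaling biased simulation time by the exponential of the instantaneous bias, t* = Σ_i dt·exp(β·V_bias(s_i, t_i)). A minimal sketch, assuming the bias values sampled along a trajectory are already available (the numbers below are hypothetical):

```python
import numpy as np

def accelerated_time(bias_kj_mol, dt_ps, temperature_k=300.0):
    """Rescale metadynamics time to physical time via the acceleration
    factor: t* = sum_i dt * exp(beta * V_bias(s_i, t_i))."""
    kB = 0.0083144621                       # kJ/(mol K)
    beta = 1.0 / (kB * temperature_k)
    return dt_ps * np.sum(np.exp(beta * np.asarray(bias_kj_mol)))

# Hypothetical bias values (kJ/mol) recorded every 1 ps until escape:
bias = np.array([0.0, 2.0, 5.0, 8.0, 10.0, 12.0])
print(f"escape after ~{accelerated_time(bias, dt_ps=1.0):.1f} ps of physical time")
```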
Ontology driven modeling for the knowledge of genetic susceptibility to disease.
Lin, Yu; Sakamoto, Norihiro
2009-05-12
To enable machine-aided exploration of the relationships between genetic factors and complex diseases, a well-structured conceptual framework of the background knowledge is needed. However, because of the complexity of determining a genetic susceptibility factor, there has been no formalization of the knowledge of genetic susceptibility to disease, which makes interoperability between systems impossible. Thus, the ontology modeling language OWL was used for formalization in this paper. After introducing the Semantic Web and the OWL language promoted by W3C, we applied text mining technology combined with competency questions to specify the classes of the ontology. Then, an N-ary pattern was adopted to describe the relationships among these defined classes. Based on the former work of OGSF-DM (Ontology of Genetic Susceptibility Factors to Diabetes Mellitus), we formalized the definitions of "Genetic Susceptibility", "Genetic Susceptibility Factor" and other classes using the OWL-DL modeling language, and a reasoner automatically performed the classification of the class "Genetic Susceptibility Factor". Ontology-driven modeling is thus used to formalize the knowledge of genetic susceptibility to complex diseases. More importantly, once a class has been completely formalized in an ontology, OWL reasoning can automatically compute its classification, in our case for the class "Genetic Susceptibility Factor". As more types of genetic susceptibility factors are obtained from laboratory research, our ontology will need continual refinement, and many new classes must be taken into account to harmonize with the existing ontologies. Applying the ontologies in Semantic Web development remains future work.
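The N-ary pattern itself is language-independent: rather than a direct factor-to-disease link, the susceptibility assertion is reified as an object carrying its own participants. A plain-Python mock (OWL itself would be written in RDF/OWL syntax; all class and attribute names here are hypothetical illustrations, not the OGSF-DM vocabulary):

```python
from dataclasses import dataclass, field

# N-ary relation pattern: the assertion is reified as an individual that
# links factor, disease, population and supporting evidence.
@dataclass(frozen=True)
class GeneticFactor:
    symbol: str                      # e.g. a gene or variant identifier

@dataclass(frozen=True)
class Disease:
    name: str

@dataclass
class GeneticSusceptibility:         # the reified N-ary relation individual
    factor: GeneticFactor
    disease: Disease
    population: str
    evidence: list = field(default_factory=list)

s = GeneticSusceptibility(GeneticFactor("TCF7L2"), Disease("type 2 diabetes"),
                          population="European", evidence=["GWAS study X"])
print(s.factor.symbol, "->", s.disease.name, f"({s.population})")
```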
Yu, Alexander C; Cimino, James J
2011-04-01
Most existing controlled terminologies can be characterized as collections of terms, wherein the terms are arranged in a simple list or organized in a hierarchy. These kinds of terminologies are considered useful for standardizing terms and encoding data and are currently used in many existing information systems. However, they suffer from a number of limitations that make data reuse difficult. Relatively recently, it has been proposed that formal ontological methods can be applied to some of the problems of terminological design. Biomedical ontologies organize concepts (embodiments of knowledge about biomedical reality) whereas terminologies organize terms (what is used to code patient data at a certain point in time, based on the particular terminology version). However, the application of these methods to existing terminologies is not straightforward. The use of these terminologies is firmly entrenched in many systems, and what might seem to be a simple option of replacing these terminologies is not possible. Moreover, these terminologies evolve over time in order to suit the needs of users. Any methodology must therefore take these constraints into consideration, hence the need for formal methods of managing changes. Along these lines, we have developed a formal representation of the concept-term relation, around which we have also developed a methodology for management of terminology changes. The objective of this study was to determine whether our methodology would result in improved retrieval of data. Design: comparison of two methods for retrieving data encoded with terms from the International Classification of Diseases (ICD-9-CM), based on their recall when retrieving data for ICD-9-CM terms whose codes had changed but which had retained their original meaning (code change). Measurements: recall and interclass correlation coefficient. Results: statistically significant differences were detected (p<0.05) with the McNemar test for two terms whose codes had changed. Furthermore, when all the cases are combined in an overall category, our method also performs statistically significantly better (p<0.05). Our study shows that an ontology-based ICD-9-CM data retrieval method that takes into account the effects of terminology changes performs better on recall than one that does not in the retrieval of data for terms whose codes had changed but which retained their original meaning.
Yu, Alexander C.; Cimino, James J.
2012-01-01
Objective Most existing controlled terminologies can be characterized as collections of terms, wherein the terms are arranged in a simple list or organized in a hierarchy. These kinds of terminologies are considered useful for standardizing terms and encoding data and are currently used in many existing information systems. However, they suffer from a number of limitations that make data reuse difficult. Relatively recently, it has been proposed that formal ontological methods can be applied to some of the problems of terminological design. Biomedical ontologies organize concepts (embodiments of knowledge about biomedical reality) whereas terminologies organize terms (what is used to code patient data at a certain point in time, based on the particular terminology version). However, the application of these methods to existing terminologies is not straightforward. The use of these terminologies is firmly entrenched in many systems, and what might seem to be a simple option of replacing these terminologies is not possible. Moreover, these terminologies evolve over time in order to suit the needs of users. Any methodology must therefore take these constraints into consideration, hence the need for formal methods of managing changes. Along these lines, we have developed a formal representation of the concept-term relation, around which we have also developed a methodology for management of terminology changes. The objective of this study was to determine whether our methodology would result in improved retrieval of data. Design Comparison of two methods for retrieving data encoded with terms from the International Classification of Diseases (ICD-9-CM), based on their recall when retrieving data for ICD-9-CM terms whose codes had changed but which had retained their original meaning (code change). Measurements Recall and interclass correlation coefficient. Results Statistically significant differences were detected (p<0.05) with the McNemar test for two terms whose codes had changed. Furthermore, when all the cases are combined in an overall category, our method also performs statistically significantly better (p < 0.05). Conclusion Our study shows that an ontology-based ICD-9-CM data retrieval method that takes into account the effects of terminology changes performs better on recall than one that does not in the retrieval of data for terms whose codes had changed but which retained their original meaning. PMID:21262390
Formal Methods in Air Traffic Management: The Case of Unmanned Aircraft Systems
NASA Technical Reports Server (NTRS)
Munoz, Cesar A.
2015-01-01
As the technological and operational capabilities of unmanned aircraft systems (UAS) continue to grow, so too does the need to introduce these systems into civil airspace. Unmanned Aircraft Systems Integration in the National Airspace System is a NASA research project that addresses the integration of civil UAS into non-segregated airspace operations. One of the major challenges of this integration is the lack of an onboard pilot to comply with the legal requirement that pilots see and avoid other aircraft. The need to provide an equivalent to this requirement for UAS has motivated the development of a detect and avoid (DAA) capability to provide the appropriate situational awareness and maneuver guidance in avoiding and remaining well clear of traffic aircraft. Formal methods have played a fundamental role in the development of this capability. This talk reports on the formal methods work conducted under NASA's Safe Autonomous System Operations project in support of the development of DAA for UAS. This work includes specification of low-level and high-level functional requirements, formal verification of algorithms, and rigorous validation of software implementations. The talk also discusses technical challenges in formal methods research in the context of the development and safety analysis of advanced air traffic management concepts.
Experimental design and statistical methods for improved hit detection in high-throughput screening.
Malo, Nathalie; Hanley, James A; Carlile, Graeme; Liu, Jing; Pelletier, Jerry; Thomas, David; Nadon, Robert
2010-09-01
Identification of active compounds in high-throughput screening (HTS) contexts can be substantially improved by applying classical experimental design and statistical inference principles to all phases of HTS studies. The authors present both experimental and simulated data to illustrate how true-positive rates can be maximized without increasing false-positive rates by the following analytical process. First, the use of robust data preprocessing methods reduces unwanted variation by removing row, column, and plate biases. Second, replicate measurements allow estimation of the magnitude of the remaining random error and the use of formal statistical models to benchmark putative hits relative to what is expected by chance. Receiver Operating Characteristic (ROC) analyses revealed superior power for data preprocessed by a trimmed-mean polish method combined with the RVM t-test, particularly for small- to moderate-sized biological hits.
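A minimal sketch of the two-step logic on one synthetic plate: remove row/column biases, then benchmark each well against a null distribution. Ordinary median polish and a simple z-score benchmark stand in here for the authors' trimmed-mean polish and RVM t-test; the plate data are simulated.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
plate = rng.normal(0.0, 1.0, size=(8, 12))      # one 96-well plate of scores
plate += np.linspace(0, 1, 12)                  # inject a column bias
plate[2, 5] += 4.0                              # one true hit

# Median polish: iteratively remove row and column effects.
resid = plate.copy()
for _ in range(10):
    resid -= np.median(resid, axis=1, keepdims=True)   # row effects
    resid -= np.median(resid, axis=0, keepdims=True)   # column effects

# With replicate plates one would t-test each well; here each residual is
# benchmarked against the plate's empirical null distribution.
z = (resid - resid.mean()) / resid.std(ddof=1)
pvals = 2 * stats.norm.sf(np.abs(z))
print("flagged wells:", np.argwhere(pvals < 1e-3))     # should flag (2, 5)
```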
Study of multiband disordered systems using the typical medium dynamical cluster approximation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Yi; Terletska, Hanna; Moore, C.
We generalize the typical medium dynamical cluster approximation to multiband disordered systems. Using our extended formalism, we perform a systematic study of the nonlocal correlation effects induced by disorder on the density of states and the mobility edge of the three-dimensional two-band Anderson model. We include interband and intraband hopping and an intraband disorder potential. Our results are consistent with those obtained by the transfer matrix and the kernel polynomial methods. We also apply the method to KxFe2-ySe2 with Fe vacancies. Despite the strong vacancy disorder and anisotropy, we find the material is not an Anderson insulator. Moreover, our results demonstrate the application of the typical medium dynamical cluster approximation method to study Anderson localization in real materials.
Study of multiband disordered systems using the typical medium dynamical cluster approximation
Zhang, Yi; Terletska, Hanna; Moore, C.; ...
2015-11-06
We generalize the typical medium dynamical cluster approximation to multiband disordered systems. Using our extended formalism, we perform a systematic study of the nonlocal correlation effects induced by disorder on the density of states and the mobility edge of the three-dimensional two-band Anderson model. We include interband and intraband hopping and an intraband disorder potential. Our results are consistent with those obtained by the transfer matrix and the kernel polynomial methods. We also apply the method to KxFe2-ySe2 with Fe vacancies. Despite the strong vacancy disorder and anisotropy, we find the material is not an Anderson insulator. Moreover, our results demonstrate the application of the typical medium dynamical cluster approximation method to study Anderson localization in real materials.
A method to stabilize linear systems using eigenvalue gradient information
NASA Technical Reports Server (NTRS)
Wieseman, C. D.
1985-01-01
Formal optimization methods and eigenvalue gradient information are used to develop a stabilizing control law for a closed-loop linear system that is initially unstable. The method was originally formulated using direct, constrained optimization methods, with the constraints being the real parts of the eigenvalues. However, because of problems in trying to achieve stabilizing control laws, the problem was reformulated to be solved differently. The method described uses the Davidon-Fletcher-Powell minimization technique to solve an indirect, constrained minimization problem in which the performance index is the Kreisselmeier-Steinhauser function of the real parts of all the eigenvalues. The method is applied successfully to solve two different problems: the determination of a fourth-order control law that stabilizes a single-input single-output active flutter suppression system, and the determination of a second-order control law for a multi-input multi-output lateral-directional flight control system. Various sets of design variables and initial starting points were chosen to show the robustness of the method.
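The idea can be sketched on a toy plant: aggregate the real parts of the closed-loop eigenvalues with the Kreisselmeier-Steinhauser (KS) envelope and push it negative with a quasi-Newton minimizer. This sketch assumes full state feedback and uses scipy's BFGS (a descendant of Davidon-Fletcher-Powell); the plant matrices are illustrative, not from the report.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative unstable plant xdot = A x + B u with state feedback u = -K x.
A = np.array([[0.0, 1.0],
              [2.0, -0.5]])          # one eigenvalue in the right half plane
B = np.array([[0.0], [1.0]])
RHO = 20.0                           # KS sharpness parameter

def ks_objective(k):
    """KS envelope of the closed-loop eigenvalue real parts, plus a small
    gain penalty that keeps the minimization bounded."""
    g = np.linalg.eigvals(A - B @ k.reshape(1, 2)).real
    ks = g.max() + np.log(np.exp(RHO * (g - g.max())).sum()) / RHO
    return ks + 1e-3 * k @ k

res = minimize(ks_objective, x0=np.zeros(2), method="BFGS")
K = res.x.reshape(1, 2)
print("gain:", K.round(2), "closed-loop eigs:",
      np.linalg.eigvals(A - B @ K).round(2))   # both real parts negative
```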
Frequency Domain Ultrasound Waveform Tomography: Breast Imaging Using a Ring Transducer
Sandhu, G Y; Li, C; Roy, O; Schmidt, S; Duric, N
2016-01-01
Application of the frequency domain acoustic wave equation on data acquired from ultrasound tomography scans is shown to yield high resolution sound speed images on the order of the wavelength of the highest reconstructed frequency. Using a signal bandwidth of 0.4–1 MHz and an average sound speed of 1500 m/s, the resolution is approximately 1.5 mm. The quantitative sound speed values and morphology provided by these images have the potential to inform diagnosis and classification of breast disease. In this study, we present the formalism, practical application, and in vivo results of waveform tomography applied to breast data gathered by two different ultrasound tomography scanners that utilize ring transducers. The formalism includes a review of frequency domain modeling of the wave equation using finite difference operators as well as a review of the gradient descent method for the iterative reconstruction scheme. It is shown that the practical application of waveform tomography requires an accurate starting model, careful data processing, and a method to gradually incorporate higher frequency information into the sound speed reconstruction. Following these steps resulted in high resolution quantitative sound speed images of the breast. These images show marked improvement relative to commonly used ray tomography reconstruction methods. The robustness of the method is demonstrated by obtaining similar results from two different ultrasound tomography devices. We also compare our method to MRI to demonstrate concordant findings. The clinical data used in this work was obtained from a HIPAA compliant clinical study (IRB 040912M1F). PMID:26110909
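The forward-modeling kernel of the method is a frequency-domain Helmholtz solve, repeated inside the gradient-descent inversion. A 1D toy version (finite differences, crude damped boundary layers, illustrative water/inclusion sound speeds) conveys the structure without the 2D ring geometry:

```python
import numpy as np

def helmholtz_1d(c, freq, src_idx, dx=0.2e-3):
    """Solve (omega^2/c^2 + d^2/dx^2) u = -delta(x - x_s) on a 1D grid
    with simple damped layers at both ends to absorb outgoing waves."""
    n = len(c)
    omega = 2 * np.pi * freq
    k2 = (omega / c) ** 2 + 0j
    taper = np.zeros(n)
    taper[:20] = np.linspace(1, 0, 20)
    taper[-20:] = np.linspace(0, 1, 20)
    k2 *= 1 + 0.3j * taper                    # absorbing edge layers
    A = np.zeros((n, n), dtype=complex)
    idx = np.arange(n)
    A[idx, idx] = k2 - 2 / dx**2
    A[idx[:-1], idx[:-1] + 1] = 1 / dx**2
    A[idx[1:], idx[1:] - 1] = 1 / dx**2
    rhs = np.zeros(n, dtype=complex)
    rhs[src_idx] = -1.0 / dx
    return np.linalg.solve(A, rhs)

c = np.full(400, 1500.0)      # water background, m/s
c[200:260] = 1550.0           # faster inclusion, as in breast sound speed maps
u = helmholtz_1d(c, freq=0.5e6, src_idx=50)
print(abs(u).max())
```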
Working the College System: Six Strategies for Building a Personal Powerbase
ERIC Educational Resources Information Center
Simplicio, Joseph S. C.
2008-01-01
Within each college system there are prescribed formalized methods for accomplishing tasks and achieving established goals. To truly understand how a college, or any large organization functions, it is vital to understand the basis of the formal structure. Those individuals who understand formal systems within a college can use this knowledge to…
An Educational Development Tool Based on Principles of Formal Ontology
ERIC Educational Resources Information Center
Guzzi, Rodolfo; Scarpanti, Stefano; Ballista, Giovanni; Di Nicolantonio, Walter
2005-01-01
Computer science provides virtual laboratories, places where one can merge real experiments with the formalism of algorithms and mathematics and where, with the advent of multimedia, sounds and movies can also be added. In this paper we present a method, based on principles of formal ontology, that allows one to develop interactive educational…
Teaching Basic Quantum Mechanics in Secondary School Using Concepts of Feynman Path Integrals Method
ERIC Educational Resources Information Center
Fanaro, Maria de los Angeles; Otero, Maria Rita; Arlego, Marcelo
2012-01-01
This paper discusses the teaching of basic quantum mechanics in high school. Rather than following the usual formalism, our approach is based on Feynman's path integral method. Our presentation makes use of simulation software and avoids sophisticated mathematical formalism. (Contains 3 figures.)
On the Need for Practical Formal Methods
1998-01-01
additional research and engineering that is needed to make the current set of formal methods more practical. To illustrate the ideas, I present several exam...either a good violin or a highly talented violinist. Light-weight techniques offer software developers good violins. A user need not be a talented
A Vector Representation for Thermodynamic Relationships
ERIC Educational Resources Information Center
Pogliani, Lionello
2006-01-01
The existing vector formalism method for thermodynamic relationships maintains tractability and uses accessible mathematics, which can be seen as a diverting and entertaining step into the mathematical formalism of thermodynamics and as an elementary application of matrix algebra. The method is based on ideas and operations apt to improve the…
NASA Astrophysics Data System (ADS)
Hilditch, David; Harms, Enno; Bugner, Marcus; Rüter, Hannes; Brügmann, Bernd
2018-03-01
A long-standing problem in numerical relativity is the satisfactory treatment of future null-infinity. We propose an approach for the evolution of hyperboloidal initial data in which the outer boundary of the computational domain is placed at infinity. The main idea is to apply the ‘dual foliation’ formalism in combination with hyperboloidal coordinates and the generalized harmonic gauge formulation. The strength of the present approach is that, following the ideas of Zenginoğlu, a hyperboloidal layer can be naturally attached to a central region using standard coordinates of numerical relativity applications. Employing a generalization of the standard hyperboloidal slices, developed by Calabrese et al, we find that all formally singular terms take a trivial limit as we head to null-infinity. A byproduct is a numerical approach for hyperboloidal evolution of nonlinear wave equations violating the null-condition. The height-function method, used often for fixed background spacetimes, is generalized in such a way that the slices can be dynamically ‘waggled’ to maintain the desired outgoing coordinate lightspeed precisely. This is achieved by dynamically solving the eikonal equation. As a first numerical test of the new approach we solve the 3D flat space scalar wave equation. The simulations, performed with the pseudospectral bamps code, show that outgoing waves are cleanly absorbed at null-infinity and that errors converge away rapidly as resolution is increased.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-05
... Certification Regarding Eligibility To Apply for Worker Adjustment Assistance In accordance with Section 223 of... Certification of Eligibility to Apply for Worker Adjustment Assistance on January 21, 2010, applicable to workers of Tata Technologies Incorporated, a subsidiary of TATA Technologies Limited, Novi, Michigan. The...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-23
... Certification Regarding Eligibility To Apply for Worker Adjustment Assistance In accordance with section 223 of... Certification of Eligibility to Apply for Worker Adjustment Assistance on January 21, 2010, applicable to workers of Tata Technologies Incorporated, a subsidiary of TATA Technologies Limited, Novi, Michigan. The...
Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM
ERIC Educational Resources Information Center
Warner, Rebecca M.
2007-01-01
This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…
Analysis of Pan-European attitudes to the eradication and control of bovine viral diarrhoea.
Heffernan, C; Misturelli, F; Nielsen, L; Gunn, G J; Yu, J
2009-02-07
At present, national-level policies concerning the eradication and control of bovine viral diarrhoea (BVD) differ widely across Europe. Some Scandinavian countries have enacted strong regulatory frameworks to eradicate the disease, whereas other countries have few formal policies. To examine these differences, the attitudes of stakeholders and policy makers in 17 European countries were investigated. A web-based questionnaire was sent to policy makers, government and private sector veterinarians, and representatives of farmers' organisations. In total, 131 individuals responded to the questionnaire and their responses were analysed by applying a method used in sociolinguistics: frame analysis. The results showed that the different attitudes of countries that applied compulsory or voluntary frameworks were associated with different views about the attribution or blame for BVD and the roles ascribed to farmers and other stakeholders in its eradication and control.
NASA Technical Reports Server (NTRS)
Weber, Doug; Jamsek, Damir
1994-01-01
The goal of this task was to investigate how formal methods could be incorporated into a software engineering process for flight-control systems under DO-178B and to demonstrate that process by developing a formal specification for NASA's Guidance and Controls Software (GCS) Experiment. GCS is software that controls the descent of a spacecraft onto a planet's surface. The GCS example is simplified from a real spacecraft application, but exhibits the characteristics of realistic spacecraft control software. The formal specification is written in Larch.
Harris, Don; Stanton, Neville A; Starr, Alison
2015-01-01
Function Allocation methods are important for the appropriate allocation of tasks between humans and automated systems. It is proposed that Operational Event Sequence Diagrams (OESDs) provide a simple yet rigorous basis upon which allocation of work can be assessed. This is illustrated with respect to a design concept for a passenger aircraft flown by just a single pilot where the objective is to replace or supplement functions normally undertaken by the second pilot with advanced automation. A scenario-based analysis (take off) was used in which there would normally be considerable demands and interactions with the second pilot. The OESD analyses indicate those tasks that would be suitable for allocation to automated assistance on the flight deck and those tasks that are now redundant in this new configuration (something that other formal Function Allocation approaches cannot identify). Furthermore, OESDs are demonstrated to be an easy to apply and flexible approach to the allocation of function in prospective systems. OESDs provide a simple yet rigorous basis upon which allocation of work can be assessed. The technique can deal with the flexible, dynamic allocation of work and the deletion of functions no longer required. This is illustrated using a novel design concept for a single-crew commercial aircraft.
NASA Astrophysics Data System (ADS)
Schneider, Peter; Sluse, Dominique
2013-11-01
The light travel time differences in strong gravitational lensing systems allow an independent determination of the Hubble constant. This method has been successfully applied to several lens systems. The formally most precise measurements are, however, in tension with the recent determination of H0 from the Planck satellite for a spatially flat six-parameter ΛCDM cosmology. We reconsider the uncertainties of the method, concerning the mass profile of the lens galaxies, and show that the formal precision relies on the assumption that the mass profile is a perfect power law. Simple analytical arguments and numerical experiments reveal that mass-sheet-like transformations yield significant freedom in choosing the mass profile, even when exquisite Einstein rings are observed. Furthermore, the characterization of the environment of the lens does not break this degeneracy, which is not physically linked to extrinsic convergence. We present an illustrative example where the multiple imaging properties of a composite (baryons + dark matter) lens can be extremely well reproduced by a power-law model having the same velocity dispersion, but with predictions for the Hubble constant that deviate by ~20%. Hence we conclude that the impact of degeneracies between parametrized models has been underestimated in current H0 measurements from lensing and needs to be carefully reconsidered.
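The mass-sheet degeneracy at the heart of this argument can be stated in two lines:

```latex
% Mass-sheet transformation (MST) of the lens convergence profile:
\kappa_\lambda(\theta) = \lambda\,\kappa(\theta) + (1-\lambda)
% It leaves all image positions and flux ratios invariant, but rescales
% the predicted time delays, \Delta t \to \lambda\,\Delta t. Fitting a
% fixed observed delay therefore rescales the inferred Hubble constant:
H_0 \;\to\; \lambda\, H_0
```

Two profiles related by an MST thus fit the same imaging data while yielding H0 estimates that differ by the factor λ, which is why the environment characterization alone cannot break the degeneracy.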
Sanz-Sanz, Cristina; Aguado, Alfredo; Roncero, Octavio; Naumkin, Fedor
2016-01-01
Analytical derivatives and non-adiabatic coupling matrix elements are derived for Hn+ systems (n=3, 4 and 5). The method uses a generalized Hellmann-Feynman theorem applied to a multi-state description based on diatomics-in-molecules (for H3+) or triatomics-in-molecules (for H4+ and H5+) formalisms, corrected with a permutationally invariant many-body term to get high accuracy. The analytical non-adiabatic coupling matrix elements are compared with ab initio calculations performed at multi-reference configuration interaction level. These magnitudes are used to calculate H2(v′=0,j′=0)+H2+(v,j=0) collisions, to determine the effect of electronic transitions using a molecular dynamics method with electronic transitions. Cross sections for several initial vibrational states of H2+ are calculated and compared with the available experimental data, yielding an excellent agreement. The effect of vibrational excitation of H2+ reactant, and its relation with non-adiabatic processes are discussed. Also, the behavior at low collisional energies, in the 1 meV-0.1 eV interval, of interest in astrophysical environments, are discussed in terms of the long range behaviour of the interaction potential which is properly described within the TRIM formalism. PMID:26696058
Computational logic: its origins and applications.
Paulson, Lawrence C
2018-02-01
Computational logic is the use of computers to establish facts in a logical formalism. Originating in nineteenth century attempts to understand the nature of mathematical reasoning, the subject now comprises a wide variety of formalisms, techniques and technologies. One strand of work follows the 'logic for computable functions (LCF) approach' pioneered by Robin Milner, where proofs can be constructed interactively or with the help of users' code (which does not compromise correctness). A refinement of LCF, called Isabelle, retains these advantages while providing flexibility in the choice of logical formalism and much stronger automation. The main application of these techniques has been to prove the correctness of hardware and software systems, but increasingly researchers have been applying them to mathematics itself.
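A tiny example conveys the flavor of LCF-style machine checking: the proof below (Lean 4, a system in the same interactive-prover family as Isabelle) is accepted only if the kernel can certify every step, so user code cannot compromise correctness.

```lean
-- A machine-checked fact: commutativity of addition on Nat,
-- proved by induction on the second argument.
theorem add_comm' (a b : Nat) : a + b = b + a := by
  induction b with
  | zero => simp
  | succ n ih => simp [Nat.add_succ, Nat.succ_add, ih]
```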
Euclidean bridge to the relativistic constituent quark model
NASA Astrophysics Data System (ADS)
Hobbs, T. J.; Alberg, Mary; Miller, Gerald A.
2017-03-01
Background: Knowledge of nucleon structure is today ever more of a precision science, with heightened theoretical and experimental activity expected in coming years. At the same time, a persistent gap lingers between theoretical approaches grounded in Euclidean methods (e.g., lattice QCD, Dyson-Schwinger equations [DSEs]) as opposed to traditional Minkowski field theories (such as light-front constituent quark models). Purpose: Seeking to bridge these complementary world views, we explore the potential of a Euclidean constituent quark model (ECQM). This formalism enables us to study the gluonic dressing of the quark-level axial-vector vertex, which we undertake as a test of the framework. Method: To access its indispensable elements with a minimum of inessential detail, we develop our ECQM using the simplified quark + scalar diquark picture of the nucleon. We construct a hyperspherical formalism involving polynomial expansions of diquark propagators to marry our ECQM with the results of Bethe-Salpeter equation (BSE) analyses, and constrain model parameters by fitting electromagnetic form factor data. Results: From this formalism, we define and compute a new quantity—the Euclidean density function (EDF)—an object that characterizes the nucleon's various charge distributions as functions of the quark's Euclidean momentum. Applying this technology and incorporating information from BSE analyses, we find the quenched dressing effect on the proton's axial-singlet charge to be small in magnitude and consistent with zero, while use of recent determinations of unquenched BSEs results in a large suppression. Conclusions: The quark + scalar diquark ECQM is a step toward a realistic quark model in Euclidean space, and needs additional refinements. The substantial effect we obtain for the impact on the axial-singlet charge of the unquenched dressed vertex compared to the quenched demands further investigation.
Two-dimensional PSF prediction of multiple-reflection optical systems with rough surfaces
NASA Astrophysics Data System (ADS)
Tayabaly, Kashmira; Spiga, Daniele; Sironi, Giorgia; Pareschi, Giovanni; Lavagna, Michele
2016-09-01
The focusing accuracy of reflective optical systems, usually expressed in terms of the Point Spread Function (PSF), is chiefly determined by two factors: the deviation of the mirror shape from the nominal design and the surface finishing. While the effects of the former are usually well described by geometrical optics, the latter is diffractive/interferential in nature and determined by a distribution of defects that covers several decades in lateral scale (from a few millimeters to a few microns). Clearly, reducing the level of scattered light is crucial to improving the focusing of the collected radiation, particularly for astronomical telescopes that aim to detect faint light signals from our Universe. Telescopes are typically arranged in a multiple-reflection configuration, and the behavior of the multiply-scattered radiation becomes difficult to predict and control. It is also difficult to disentangle the effect of surface scattering from the PSF degradation caused by the shape deformation of the optical elements. This paper presents a simple and unifying method for evaluating the contribution of optical surface defects to the two-dimensional PSF of a multiple-reflection system, regardless of the classification of a spectral range as "geometry" or "roughness". This method, entirely based on the Huygens-Fresnel principle in the far-field approximation, was already applied to grazing-incidence X-ray mirrors and experimentally validated for a single-reflection system, accounting for the real surface topography of the optics. In this work we extend the formalism to a double-reflection system and introduce real microroughness data. The formalism is applied to a fully characterized MAGIC-I panel mirror, allowing us to predict the PSF, and validated against real measurements of the double-reflection ASTRI telescope, a prototype of the CTA SST telescope.
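A single-reflection, far-field version of the computation fits in a few lines: the PSF is the squared modulus of the Fourier transform of the pupil carrying the surface-error phase. The figure-error and roughness amplitudes below are made-up illustrative values, and the multiple-reflection chaining of the paper is omitted.

```python
import numpy as np

lam = 500e-9                          # observing wavelength [m]
n = 512
x = np.linspace(-0.5, 0.5, n)         # pupil coordinates for a 1 m pupil
X, Y = np.meshgrid(x, x)
pupil = ((X**2 + Y**2) <= 0.25).astype(float)

rng = np.random.default_rng(1)
figure = 20e-9 * np.cos(2 * np.pi * X)        # low-order figure error (20 nm ripple)
rough = 2e-9 * rng.standard_normal((n, n))    # ~2 nm rms microroughness
wfe = 2.0 * (figure + rough)                  # reflection doubles the path error

# Huygens-Fresnel in the far field: PSF = |FT of the aberrated pupil|^2.
field = pupil * np.exp(2j * np.pi * wfe / lam)
psf = np.abs(np.fft.fftshift(np.fft.fft2(field)))**2
psf_ideal = np.abs(np.fft.fft2(pupil))**2
print("Strehl ratio ~", round(psf.max() / psf_ideal.max(), 3))
```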
Radiation torque on nonspherical particles in the transition matrix formalism
NASA Astrophysics Data System (ADS)
Borghese, Ferdinando; Denti, Paolo; Saija, Rosalba; Iatì, Maria A.
2006-10-01
The torque exerted by radiation on small particles is recognized to have considerable relevance, e.g., for the dynamics of cosmic dust grains and for the manipulation of micro- and nanoparticles under controlled conditions. In the present paper we derive, in the transition matrix formalism, the radiation torque applied by a plane polarized wave on nonspherical particles. In the case of circularly polarized waves impinging on spherical particles, our equations reproduce the findings of Marston and Crichton [Phys. Rev. A 30, 2508-2516 (1984)]. Our equations were applied to calculate the torque on a few model particles shaped as aggregates of identical spheres, both axially symmetric and lacking any symmetry, and the conditions for the stability of the induced rotational motion are discussed.
Formal Requirements-Based Programming for Complex Systems
NASA Technical Reports Server (NTRS)
Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis
2005-01-01
Computer science as a field has not yet produced a general method to mechanically transform complex computer system requirements into a provably equivalent implementation. Such a method would be one major step towards dealing with complexity in computing, yet it remains the elusive holy grail of system development. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that such tools and methods leave unfilled is that the formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of complex systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations. While other techniques are available, this method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. We illustrate the application of the method to an example procedure from the Hubble Robotic Servicing Mission currently under study and preliminary formulation at NASA Goddard Space Flight Center.
NASA Astrophysics Data System (ADS)
Freni, Gabriele; Mannina, Giorgio
In urban drainage modelling, uncertainty analysis is of undoubted necessity. However, uncertainty analysis in urban water-quality modelling is still in its infancy and only a few studies have been carried out. Several methodological aspects therefore still need to be investigated and clarified, especially regarding water quality modelling. The use of the Bayesian approach for uncertainty analysis has been stimulated by its rigorous theoretical framework and by the possibility of evaluating the impact of new knowledge on the modelling predictions. Nevertheless, the Bayesian approach relies on some restrictive hypotheses that are not present in less formal methods like the Generalised Likelihood Uncertainty Estimation (GLUE). One crucial point in the application of the Bayesian method is the formulation of a likelihood function, which is conditioned by the hypotheses made regarding the model residuals. Statistical transformations, such as the Box-Cox equation, are generally used to ensure the homoscedasticity of residuals. However, this practice may affect the reliability of the analysis, leading to incorrect uncertainty estimates. The present paper aims to explore the influence of the Box-Cox equation for environmental water quality models. To this end, five cases were considered, one of which was the "real" residual distribution (i.e., drawn from available data). The analysis was applied to the Nocella experimental catchment (Italy), an agricultural and semi-urbanised basin where two sewer systems, two wastewater treatment plants and a river reach were monitored during both dry and wet weather periods. The results show that the uncertainty estimation is greatly affected by the residual transformation, and a wrong assumption may also affect the evaluation of model uncertainty. The less formal methods always overestimate modelling uncertainty with respect to the Bayesian method, but this effect is reduced when a wrong assumption is made regarding the residual distribution. If the residuals are not normally distributed, uncertainty is over-estimated when the Box-Cox transformation is not applied or when a non-calibrated transformation parameter is used.
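A minimal sketch of the transformation step under discussion, on synthetic residuals; scipy.stats.boxcox estimates the transformation parameter lambda by maximum likelihood (Box-Cox requires positive data).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
residuals = rng.lognormal(mean=0.0, sigma=0.8, size=500)   # skewed, positive (synthetic)

transformed, lam = stats.boxcox(residuals)   # y = (x**lam - 1)/lam for lam != 0, log(x) otherwise
print(f"estimated lambda = {lam:.3f}")
# A Gaussian-residual likelihood is then built on `transformed`, not on the raw residuals;
# fixing lam a priori instead of calibrating it is the "non-calibrated parameter" case above.
```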
What can formal methods offer to digital flight control systems design
NASA Technical Reports Server (NTRS)
Good, Donald I.
1990-01-01
Formal methods research is beginning to produce methods that will enable mathematical modeling of the physical behavior of digital hardware and software systems. The development of these methods directly supports the NASA mission of increasing the scope and effectiveness of flight system modeling capabilities. The conventional, continuous mathematics that is used extensively in modeling flight systems is not adequate for accurate modeling of digital systems. Therefore, the current practice of digital flight control system design has not had the benefits of the extensive mathematical modeling that is common in other parts of flight system engineering. Formal methods research shows that by using discrete mathematics, very accurate modeling of digital systems is possible. These discrete modeling methods will bring the traditional benefits of modeling to digital hardware and software design. Sound reasoning about accurate mathematical models of flight control systems can be an important part of reducing the risk of unsafe flight control.
Will-Nordtvedt PPN formalism applied to renormalization group extensions of general relativity
NASA Astrophysics Data System (ADS)
Toniato, Júnior D.; Rodrigues, Davi C.; de Almeida, Álefe O. F.; Bertini, Nicolas
2017-09-01
We apply the full Will-Nordtvedt version of the parametrized post-Newtonian (PPN) formalism to a class of general relativity extensions that are based on nontrivial renormalization group (RG) effects at large scales. We focus on a class of models in which the gravitational coupling constant G is correlated with the Newtonian potential. A previous PPN analysis considered a specific realization of the RG effects, and only within the Eddington-Robertson-Schiff version of the PPN formalism, which is a less complete and robust PPN formulation. Here we find stronger, more precise bounds, with fewer assumptions. We also consider the external potential effect (EPE), an effect intrinsic to this framework that depends on the system environment (it has some qualitative similarities to the screening mechanisms of modified gravity theories). We find a single particular RG realization that is not affected by the EPE. Some physical systems have been pointed out as candidates for measuring the possible RG effects in gravity at large scales; for any of them the Solar System bounds need to be considered.
Uncertainty principle in loop quantum cosmology by Moyal formalism
NASA Astrophysics Data System (ADS)
Perlov, Leonid
2018-03-01
In this paper, we derive the uncertainty principle for the loop quantum cosmology homogeneous and isotropic Friedmann-Lemaître-Robertson-Walker model with the holonomy-flux algebra. The uncertainty principle relates the variable c, with the meaning of a connection, and μ, with the meaning of the physical cell volume to the power 2/3, i.e., v^{2/3}, or a plaquette area. Since μ and c are not operators but rather random variables, the Robertson derivation of the uncertainty principle, which works for Hermitian operators, cannot be used. Instead we use the Wigner-Moyal-Groenewold phase-space formalism, which was originally applied to the Heisenberg algebra of quantum mechanics; it can be derived from both canonical and path-integral quantum mechanics, as can the uncertainty principle. In this paper, we apply it to the holonomy-flux algebra in the case of homogeneous and isotropic space. Another result is an expression for the Wigner function on the space of cylindrical wave functions defined on R_b in the c variables rather than in the dual-space μ variables.
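For orientation, the standard Heisenberg-algebra Wigner transform on which the Wigner-Moyal-Groenewold formalism is built (the paper's contribution is its analogue for the holonomy-flux algebra):

\[
W(x,p) = \frac{1}{\pi\hbar}\int_{-\infty}^{\infty} \psi^{*}(x+y)\,\psi(x-y)\,e^{2ipy/\hbar}\,dy .
\]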
Integrating Science and Engineering to Implement Evidence-Based Practices in Health Care Settings
Wu, Shinyi; Duan, Naihua; Wisdom, Jennifer P.; Kravitz, Richard L.; Owen, Richard R.; Sullivan, Greer; Wu, Albert W.; Di Capua, Paul; Hoagwood, Kimberly Eaton
2015-01-01
Integrating two distinct and complementary paradigms, science and engineering, may produce more effective outcomes for the implementation of evidence-based practices in health care settings. Science formalizes and tests innovations, whereas engineering customizes and optimizes how the innovation is applied, tailoring it to accommodate local conditions. Together they may accelerate the creation of an evidence-based healthcare system that works effectively in specific health care settings. We give examples of applying engineering methods for better-quality, more efficient, and safer implementation of clinical practices, medical devices, and health services systems. A specific example was applying systems engineering design that orchestrated people, process, data, decision-making, and communication through a technology application to implement evidence-based depression care among low-income patients with diabetes. We recommend that leading journals recognize the fundamental role of engineering in implementation research, to improve understanding of design elements that create a better fit between program elements and local context. PMID:25217100
Xue, Mianqiang; Kendall, Alissa; Xu, Zhenming; Schoenung, Julie M
2015-01-20
Due to economic and societal reasons, informal activities including open burning, backyard recycling, and landfill are still the prevailing methods used for electronic waste treatment in developing countries. Great efforts have been made, especially in China, to promote formal approaches for electronic waste management by enacting laws, developing green recycling technologies, initiating pilot programs, etc. The formal recycling process can, however, engender environmental impact and resource consumption, although information on the environmental loads and resource consumption is currently limited. To quantitatively assess the environmental impact of the processes in a formal printed wiring board (PWB) recycling chain, life cycle assessment (LCA) was applied to a formal recycling chain that includes the steps from waste liberation through materials refining. The metal leaching in the refining stage was identified as a critical process, posing most of the environmental impact in the recycling chain. Global warming potential was the most significant environmental impact category after normalization and weighting, followed by fossil abiotic depletion potential, and marine aquatic eco-toxicity potential. Scenario modeling results showed that variations in the power source and chemical reagents consumption had the greatest influence on the environmental performance. The environmental impact from transportation used for PWB collection was also evaluated. The results were further compared to conventional primary metals production processes, highlighting the environmental benefit of metal recycling from waste PWBs. Optimizing the collection mode, increasing the precious metals recovery efficiency in the beneficiation stage and decreasing the chemical reagents consumption in the refining stage by effective materials liberation and separation are proposed as potential improvement strategies to make the recycling chain more environmentally friendly. The LCA results provide environmental information for the improvement of future integrated technologies and electronic waste management.
Smoothing of Gaussian quantum dynamics for force detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Zhishen; Sarovar, Mohan
Building on recent work by Gammelmark et al., we develop a formalism for prediction and retrodiction of Gaussian quantum systems undergoing continuous measurements. We apply the resulting formalism to study the advantage of incorporating a full measurement record and retrodiction for impulselike force detection and accelerometry. We find that retrodiction can only increase accuracy in a limited parameter regime, but that the reduction in estimation noise it yields results in better detection of impulselike forces.
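A classical-analogue sketch: for linear Gaussian systems, prediction corresponds to Kalman filtering and retrodiction over the full measurement record to Rauch-Tung-Striebel smoothing. The model below is hypothetical and stands in for the continuously monitored system.

```python
import numpy as np

def kalman_filter(y, A, C, Q, R, m0, P0):
    """Forward pass: filtered means/covariances given observations y (prediction)."""
    m, P = m0, P0
    ms, Ps, preds = [], [], []
    for yk in y:
        mp, Pp = A @ m, A @ P @ A.T + Q            # predict
        S = C @ Pp @ C.T + R                       # innovation covariance
        K = Pp @ C.T @ np.linalg.inv(S)            # Kalman gain
        m = mp + K @ (yk - C @ mp)                 # update
        P = (np.eye(len(m0)) - K @ C) @ Pp
        ms.append(m); Ps.append(P); preds.append((mp, Pp))
    return ms, Ps, preds

def rts_smoother(ms, Ps, preds, A):
    """Backward pass (retrodiction): condition each state on the full record."""
    sm, sP = ms[-1], Ps[-1]
    out = [(sm, sP)]
    for k in range(len(ms) - 2, -1, -1):
        mp, Pp = preds[k + 1]
        G = Ps[k] @ A.T @ np.linalg.inv(Pp)        # smoother gain
        sm = ms[k] + G @ (sm - mp)
        sP = Ps[k] + G @ (sP - Pp) @ G.T
        out.append((sm, sP))
    return out[::-1]

# Toy constant-velocity model with position measurements (all parameters hypothetical).
A = np.array([[1.0, 1.0], [0.0, 1.0]]); C = np.array([[1.0, 0.0]])
Q = 0.01 * np.eye(2); R = np.array([[0.1]])
y = [np.array([float(t)]) for t in range(20)]
ms, Ps, preds = kalman_filter(y, A, C, Q, R, np.zeros(2), np.eye(2))
smoothed = rts_smoother(ms, Ps, preds, A)
```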
Gati, Wafa; Rammah, Mohamed M; Rammah, Mohamed B; Evano, Gwilherm
2012-01-01
We have developed a general synthesis of polysubstituted 1,4-dihydropyridines and pyridines based on a highly regioselective lithiation/6-endo-dig intramolecular carbolithiation from readily available N-allyl-ynamides. This reaction, which has been successfully applied to the formal synthesis of the anti-dyskinesia agent sarizotan, further extends the use of ynamides in organic synthesis and further demonstrates the synthetic efficiency of carbometallation reactions.
Smoothing of Gaussian quantum dynamics for force detection
Huang, Zhishen; Sarovar, Mohan
2018-04-10
Building on recent work by Gammelmark et al., we develop a formalism for prediction and retrodiction of Gaussian quantum systems undergoing continuous measurements. We apply the resulting formalism to study the advantage of incorporating a full measurement record and retrodiction for impulselike force detection and accelerometry. We find that retrodiction can only increase accuracy in a limited parameter regime, but that the reduction in estimation noise it yields results in better detection of impulselike forces.
A Model-Driven Development Method for Management Information Systems
NASA Astrophysics Data System (ADS)
Mizuno, Tomoki; Matsumoto, Keinosuke; Mori, Naoki
Traditionally, a Management Information System (MIS) has been developed without formal methods. With informal methods, the MIS is developed over its lifecycle without any models, which causes problems such as unreliable system design specifications. To overcome these problems, a model theory approach was proposed, based on the idea that a system can be modeled by automata and set theory. However, it is very difficult to generate automata of the system to be developed right from the start. On the other hand, model-driven development can respond flexibly to changes in business logic or implementation technologies; in model-driven development, a system is modeled using a modeling language such as UML. This paper proposes a new development method for management information systems that applies model-driven development to a component of the model theory approach. An experiment showed an effort reduction of more than 30%.
A regularized vortex-particle mesh method for large eddy simulation
NASA Astrophysics Data System (ADS)
Spietz, H. J.; Walther, J. H.; Hejlesen, M. M.
2017-11-01
We present recent developments of the remeshed vortex particle-mesh method for simulating incompressible fluid flow. The method relies on a parallel higher-order FFT-based solver for the Poisson equation. Arbitrarily high order is achieved through regularization of singular Green's function solutions to the Poisson equation, and recently we have derived novel high-order solutions for a mixture of open and periodic domains. With this approach the simulated variables may formally be viewed as the approximate solution to the filtered Navier-Stokes equations; hence we use the method for large eddy simulation by including a dynamic subfilter-scale model based on test filters compatible with the aforementioned regularization functions. Furthermore, the subfilter-scale model uses Lagrangian averaging, a natural candidate in light of the Lagrangian nature of vortex particle methods. A multiresolution variation of the method is applied to simulate the benchmark problem of the flow past a square cylinder at Re = 22000 and the obtained results are compared to results from the literature.
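A minimal sketch of the FFT-based Poisson solve at the core of such particle-mesh methods, here for a fully periodic 2D domain with spectral accuracy; the vorticity field is assumed given on the mesh.

```python
import numpy as np

def poisson_fft_periodic(omega, L=1.0):
    """Solve laplacian(psi) = -omega on a periodic square of side L (spectral accuracy)."""
    n = omega.shape[0]
    k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                       # avoid division by zero for the mean mode
    psi_hat = np.fft.fft2(omega) / k2    # -k^2 psi_hat = -omega_hat  =>  psi_hat = omega_hat / k^2
    psi_hat[0, 0] = 0.0                  # fix the arbitrary mean of psi
    return np.real(np.fft.ifft2(psi_hat))
```

The paper's regularized Green's-function approach generalizes this spectral division to mixed open/periodic domains and higher-order smoothing kernels.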
NASA Astrophysics Data System (ADS)
Ayu Nurul Handayani, Hemas; Waspada, Indra
2018-05-01
Non-formal Early Childhood Education (non-formal ECE) serves children under 4 years old. In the District of Banyumas, non-formal ECE is monitored by the District Government of Banyumas with the help of Sanggar Kegiatan Belajar (SKB) Purwokerto, one of the organizers of non-formal education. The government has a program for extending ECE to all villages in Indonesia, but the locations for constructing ECE schools in the coming years have not yet been decided. To support that program, a decision support system was built to recommend villages for constructing ECE buildings. The data are projected using Brown's Double Exponential Smoothing method, and the Preference Ranking Organization Method for Enrichment Evaluation (Promethee) is used to generate a priority order. The system produces a map visualization colored according to the priority level of each sub-district and village. The system was tested with black-box testing, Promethee testing, and usability testing. The results showed that the system functionality and the Promethee algorithm worked properly, and that users were satisfied.
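A minimal sketch of Brown's double exponential smoothing as used for the projection step; the series and smoothing constant are hypothetical.

```python
def brown_des_forecast(series, alpha, horizon):
    """Brown's double exponential smoothing: forecast `horizon` steps ahead."""
    s1 = s2 = series[0]                        # common initialization
    for x in series[1:]:
        s1 = alpha * x + (1 - alpha) * s1      # first smoothing
        s2 = alpha * s1 + (1 - alpha) * s2     # second smoothing
    level = 2 * s1 - s2
    trend = alpha / (1 - alpha) * (s1 - s2)
    return level + trend * horizon

children = [120, 131, 127, 140, 149, 155]      # hypothetical yearly counts per village
print(brown_des_forecast(children, alpha=0.4, horizon=2))
```

The projected values per village would then feed Promethee's pairwise preference ranking.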
Generalizability of the Ordering among Five Formal Reasoning Tasks by an Ordering-Theoretic Method.
ERIC Educational Resources Information Center
Bart, William M.; And Others
1979-01-01
Five Inhelder-Piaget formal operations tasks were analyzed to determine the extent that the formal operational skills they assess were ordered into a stable hierarchy generalizable across samples of subjects. Subjects were 34 collegiate gymnasts (19 males, 15 females), and 22 students (1 male, 21 females) from a university nursing program.…
ERIC Educational Resources Information Center
Goldratt, Miri; Cohen, Eric H.
2016-01-01
This article explores encounters between formal, informal, and non-formal education and the role of mentor-educators in creating values education in which such encounters take place. Mixed-methods research was conducted in Israeli public schools participating in the Personal Education Model, which combines educational modes. Ethnographic and…
Visualizing, Approximating, and Understanding Black-Hole Binaries
NASA Astrophysics Data System (ADS)
Nichols, David A.
Numerical-relativity simulations of black-hole binaries and advancements in gravitational-wave detectors now make it possible to learn more about the collisions of compact astrophysical bodies. To be able to infer more about the dynamical behavior of these objects requires a fuller analysis of the connection between the dynamics of pairs of black holes and their emitted gravitational waves. The chapters of this thesis describe three approaches to learn more about the relationship between the dynamics of black-hole binaries and their gravitational waves: modeling momentum flow in binaries with the Landau-Lifshitz formalism, approximating binary dynamics near the time of merger with post-Newtonian and black-hole-perturbation theories, and visualizing spacetime curvature with tidal tendexes and frame-drag vortexes. In Chapters 2--4, my collaborators and I present a method to quantify the flow of momentum in black-hole binaries using the Landau-Lifshitz formalism. Chapter 2 reviews an intuitive version of the formalism in the first-post-Newtonian approximation that bears a strong resemblance to Maxwell's theory of electromagnetism. Chapter 3 applies this approximation to relate the simultaneous bobbing motion of rotating black holes in the superkick configuration---equal-mass black holes with their spins anti-aligned and in the orbital plane---to the flow of momentum in the spacetime, prior to the black holes' merger. Chapter 4 then uses the Landau-Lifshitz formalism to explain the dynamics of a head-on merger of spinning black holes, whose spins are anti-aligned and transverse to the infalling motion. Before they merge, the black holes move with a large, transverse, velocity, which we can explain using the post-Newtonian approximation; as the holes merge and form a single black hole, we can use the Landau-Lifshitz formalism without any approximations to connect the slowing of the final black hole to its absorbing momentum density during the merger. In Chapters 5--7, we discuss using analytical approximations, such as post-Newtonian and black-hole-perturbation theories, to gain further understanding into how gravitational waves are generated by black-hole binaries. Chapter 5 presents a way of combining post-Newtonian and black-hole-perturbation theories---which we call the hybrid method---for head-on mergers of black holes. It was able to produce gravitational waveforms and gravitational recoils that agreed well with comparable results from numerical-relativity simulations. Chapter 6 discusses a development of the hybrid model to include a radiation-reaction force, which is better suited for studying inspiralling black-hole binaries. The gravitational waveform from the hybrid method for inspiralling mergers agreed qualitatively with that from numerical-relativity simulations; when applied to the superkick configuration, it gave a simplified picture of the formation of the large black-hole kick. Chapter 7 describes an approximate method of calculating the frequencies of the ringdown gravitational waveforms of rotating black holes (quasinormal modes). The method generalizes a geometric interpretation of black-hole quasinormal modes and explains a degeneracy in the spectrum of these modes. In Chapters 8--11, we describe a new way of visualizing spacetime curvature using tools called tidal tendexes and frame-drag vortexes. 
This relies upon a time-space split of spacetime, which allows one to break the vacuum Riemann curvature tensor into electric and magnetic parts (symmetric, trace-free tensors that have simple physical interpretations). The regions where the eigenvalues of these tensors are large form the tendexes and vortexes of a spacetime, and the integral curves of their eigenvectors are its tendex and vortex lines, for the electric and magnetic parts, respectively. Chapter 8 provides an overview of these visualization tools and presents initial results from numerical-relativity simulations. Chapter 9 uses topological properties of vortex and tendex lines to classify properties of gravitational waves far from a source. Chapter 10 describes the formalism in more detail, and discusses the vortexes and tendexes of multipolar spacetimes in linearized gravity about flat space. The chapter helps to explain how near-zone vortexes and tendexes become gravitational waves far from a weakly gravitating, time-varying source. Chapter 11 is a detailed investigation of the vortexes and tendexes of stationary and perturbed black holes. It develops insight into how perturbations of (strongly gravitating) black holes extend from near the horizon to become gravitational waves.
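For reference, the time-space split referred to decomposes the vacuum Riemann (Weyl) tensor relative to an observer with unit timelike normal; in one common convention (signs vary across references),

\[
\mathcal{E}_{ij} = C_{\hat{0}i\hat{0}j}, \qquad
\mathcal{B}_{ij} = \tfrac{1}{2}\,\epsilon_{i}{}^{pq}\, C_{pqj\hat{0}} ,
\]

both symmetric and trace-free in vacuum; tendex lines are the integral curves of the eigenvectors of the electric part and vortex lines those of the magnetic part.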
Beitsch, Leslie M; Kronstadt, Jessica; Robin, Nathalie; Leep, Carolyn
The Public Health Accreditation Board (PHAB) is now in its 10th year, making it an ideal time to study the impact of PHAB accreditation on local health departments (LHDs). Objective: to examine whether applying for PHAB accreditation affects perceptions and activities regarding quality improvement (QI) and performance management (PM) within LHDs. Design: data from the National Association of County & City Health Officials' (NACCHO) 2010, 2013, and 2016 National Profile of Local Health Departments and associated QI modules were linked to PHAB-applicant data collected in e-PHAB, in a cross-sectional and longitudinal approach examining self-reported QI/PM activities. Participants: LHDs responding to the NACCHO Profile questionnaires and QI modules in 2010, 2013, and 2016. Main outcome measures: implementation of a formal QI program within the agency, number of formal QI projects in the past year, presence of elements indicating formal QI program implementation, and changes over time by accreditation status as of June 2017. Results: accredited and in-process LHDs showed greater gains over time in all of the outcome measures than LHDs not registered in e-PHAB. Logistic regression controlling for population served and governance type found accredited LHDs more likely to report formal QI programs agency-wide (odds ratio [OR] = 27.0; P < .001) and to have implemented 6 to 8 elements of formal QI (OR = 27.0; P < .001) in 2016, compared with nonaccreditation-seeking LHDs. Between 2013 and 2016, LHDs that responded to both survey waves and were registered in e-PHAB or accredited were significantly more likely than nonaccreditation-seeking LHDs to report an increase in overall level of QI implementation (OR = 4.89; P = .006) and an increase in the number of elements of formal QI (OR = 16.1; P < .001). Conclusions: LHDs accredited by June 2017 and those in process reported more formal QI activities and showed greater improvements in QI/PM implementation over time than LHDs not undertaking accreditation. PHAB accreditation appears to influence QI/PM uptake. As health departments contemplate whether to apply for accreditation, the potential for developing a more robust QI/PM system should be taken into account.
Formal Method of Description Supporting Portfolio Assessment
ERIC Educational Resources Information Center
Morimoto, Yasuhiko; Ueno, Maomi; Kikukawa, Isao; Yokoyama, Setsuo; Miyadera, Youzou
2006-01-01
Teachers need to assess learner portfolios in the field of education. However, they need support in the process of designing and practicing what kind of portfolios are to be assessed. To solve the problem, a formal method of describing the relations between the lesson forms and portfolios that need to be collected and the relations between…
Picture grammars in classification and semantic interpretation of 3D coronary vessels visualisations
NASA Astrophysics Data System (ADS)
Ogiela, M. R.; Tadeusiewicz, R.; Trzupek, M.
2009-09-01
The work presents a new opportunity for making semantic descriptions and analyses of medical structures, especially CT spatial reconstructions of coronary vessels, with the use of AI graph-based linguistic formalisms. The paper discusses ways of applying methods of computational intelligence to the development of a syntactic-semantic description of spatial visualisations of the heart's coronary vessels. Such descriptions may be used both for smart ordering of images during archiving and for semantic searches in medical multimedia databases. The presented methodology can furthermore be used to attain other goals related to computer-assisted semantic interpretation of selected elements and/or the entire 3D structure of the coronary vascular tree. These goals are achieved through the use of graph-based image formalisms, built on grammars generating IE graphs, that allow the discovery and automatic semantic interpretation of irregularities visualised in images obtained during diagnostic examinations of the heart muscle. The 3D reconstructions of biological objects used in this work are visualisations obtained from helical CT scans, yet the method itself may also be applied to other modalities of 3D medical image acquisition. The obtained semantic information makes it possible to describe the structure in terms of the semantics of the various morphological forms of the visualised vessels, from the point of view of coronary circulation and the blood supply of the heart muscle. The analysis thus allows fast and largely automated interpretation of the semantics of various morphological changes in the coronary vascular tree, and in particular makes it possible to detect those stenoses in the lumen of the vessels that can cause a critical decrease in blood supply to extensive or especially important fragments of the heart muscle.
NASA Astrophysics Data System (ADS)
Fitzpatrick, Matthew R. C.; Kennett, Malcolm P.
2018-05-01
We develop a formalism that allows the study of correlations in space and time in both the superfluid and Mott insulating phases of the Bose-Hubbard model. Specifically, we obtain a two-particle irreducible effective action within the contour-time formalism that accommodates both equilibrium and out-of-equilibrium phenomena. We derive equations of motion for both the superfluid order parameter and the two-point correlation functions. To assess the accuracy of this formalism, we study the equilibrium solution of the equations of motion and compare our results to existing strong-coupling methods as well as to exact methods where possible. We discuss applications of this formalism to out-of-equilibrium situations.
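For reference, the standard Bose-Hubbard Hamiltonian underlying the analysis (hopping J, on-site interaction U, chemical potential mu):

\[
H = -J \sum_{\langle i,j \rangle} \left( b_i^{\dagger} b_j + \mathrm{h.c.} \right)
  + \frac{U}{2} \sum_i n_i (n_i - 1) - \mu \sum_i n_i .
\]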
A hierarchical model for probabilistic independent component analysis of multi-subject fMRI studies
Tang, Li
2014-01-01
An important goal in fMRI studies is to decompose the observed series of brain images to identify and characterize underlying brain functional networks. Independent component analysis (ICA) has been shown to be a powerful computational tool for this purpose. Classic ICA has been successfully applied to single-subject fMRI data. The extension of ICA to group inferences in neuroimaging studies, however, is challenging due to the unavailability of a pre-specified group design matrix. Existing group ICA methods generally concatenate observed fMRI data across subjects on the temporal domain and then decompose multi-subject data in a similar manner to single-subject ICA. Their major limitation is that they ignore between-subject variability in the spatial distributions of brain functional networks. In this paper, we propose a new hierarchical probabilistic group ICA method to formally model subject-specific effects in both the temporal and spatial domains when decomposing multi-subject fMRI data. The proposed method provides model-based estimation of brain functional networks at both the population and subject levels. An important advantage of the hierarchical model is that it provides a formal statistical framework for investigating similarities and differences in brain functional networks across subjects, e.g., subjects with mental disorders or neurodegenerative diseases such as Parkinson's as compared with normal subjects. We develop an EM algorithm for model estimation in which both the E-step and M-step have explicit forms. We compare the performance of the proposed hierarchical model with that of two popular group ICA methods via simulation studies. We illustrate our method with an application to an fMRI study of Zen meditation. PMID:24033125
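A minimal single-subject baseline of the kind the hierarchical model generalizes, sketched with scikit-learn's FastICA on synthetic data (all sizes hypothetical):

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)
time, voxels, q = 200, 1000, 3
S_true = rng.laplace(size=(q, voxels))             # spatial source maps (non-Gaussian)
A_true = rng.standard_normal((time, q))            # temporal mixing matrix
X = A_true @ S_true + 0.1 * rng.standard_normal((time, voxels))

ica = FastICA(n_components=q, random_state=0)
S_est = ica.fit_transform(X.T).T                   # estimated spatial components
A_est = ica.mixing_                                # estimated time courses
```

The paper's contribution is to place subject-specific random effects on both S and A across a group of subjects, estimated by EM.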
Tunneling method for Hawking radiation in the Nariai case
NASA Astrophysics Data System (ADS)
Belgiorno, F.; Cacciatori, S. L.; Dalla Piazza, F.
2017-08-01
We revisit the tunneling picture for the Hawking effect in light of the charged Nariai manifold: this general relativistic solution, which displays two horizons, offers the bonus that exact solutions of the field equations are known. We first revisit the tunneling ansatz in the framework of particle creation in external fields à la Nikishov, which corroborates the interpretation of the semiclassical emission rate Γ_{emission} as the conditional probability rate for the creation of a pair of particles from the vacuum. Particle creation associated with the Hawking effect on the Nariai manifold is then calculated in two ways. On the one hand, we apply the Hamilton-Jacobi formalism for tunneling to a charged scalar field on the given background. On the other hand, the exact solutions of the Klein-Gordon equation on the Nariai manifold, and their analytic properties on the extended manifold, allow a direct computation of the flux of particles leaving the horizon; as a consequence, we obtain a further corroboration of the semiclassical tunneling picture from the side of the S-matrix formalism.
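For reference, in the Hamilton-Jacobi tunneling approach the semiclassical emission rate follows from the imaginary part of the classical action of the emitted quantum,

\[
\Gamma_{\mathrm{emission}} \propto e^{-2\,\mathrm{Im}\,S/\hbar},
\qquad
\mathrm{Im}\,S = \mathrm{Im} \int p_r \, dr ,
\]

which reproduces a thermal spectrum at the horizon temperature.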
NASA Astrophysics Data System (ADS)
Todoran, D.; Todoran, R.; Anitas, E. M.; Szakacs, Zs.
2017-12-01
This paper presents results concerning the optical and electrical properties of natural galena mineral and of the interface layer formed between it and a potassium ethyl xanthate solution. The experimental method was differential optical reflectance spectroscopy over the UV-Vis/NIR spectral domain, with computations based on the Kramers-Kronig formalism. Spectral dependencies of the electron loss functions, determined from reflectance data obtained from the polished mineral surface, display van Hove singularities, leading to the determination of its valence band gap and electron plasma energy. Time-dependent measurement of the spectral dispersion of the relative reflectance of the film formed at the interface, using the same computational formalism, leads to the dynamical determination of the spectral variation of its optical and electrical properties. We computed the behavior of the dielectric constant (dielectric permittivity), the dielectric loss function, the refractive index and extinction coefficient, the effective valence number, and the electron loss functions. The measurements tend to stabilize when the dynamic adsorption-desorption equilibrium is reached at the interface.
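For reference, the Kramers-Kronig step referred to: the reflectance phase is recovered from the measured normal-incidence reflectance R by a principal-value integral, after which the complex refractive index follows from the Fresnel relation,

\[
\theta(\omega) = -\frac{\omega}{\pi}\,\mathcal{P}\!\int_0^{\infty}
\frac{\ln R(\omega')}{\omega'^2 - \omega^2}\, d\omega' ,
\qquad
\tilde{n}(\omega) = \frac{1 - \sqrt{R}\,e^{i\theta}}{1 + \sqrt{R}\,e^{i\theta}} ,
\]

from which the permittivity, loss functions, and related quantities follow.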
Multiscale time-dependent density functional theory: Demonstration for plasmons.
Jiang, Jiajian; Abi Mansour, Andrew; Ortoleva, Peter J
2017-08-07
Plasmon properties are of significant interest in pure and applied nanoscience. While time-dependent density functional theory (TDDFT) can be used to study plasmons, it becomes impractical for elucidating the effect of size, geometric arrangement, and dimensionality in complex nanosystems. In this study, a new multiscale formalism that addresses this challenge is proposed. This formalism is based on Trotter factorization and the explicit introduction of a coarse-grained (CG) structure function constructed as the Weierstrass transform of the electron wavefunction. This CG structure function is shown to vary on a time scale much longer than that of the latter. A multiscale propagator that coevolves both the CG structure function and the electron wavefunction is shown to bring substantial efficiency over classical propagators used in TDDFT. This efficiency follows from the enhanced numerical stability of the multiscale method and the consequence of larger time steps that can be used in a discrete time evolution. The multiscale algorithm is demonstrated for plasmons in a group of interacting sodium nanoparticles (15-240 atoms), and it achieves improved efficiency over TDDFT without significant loss of accuracy or space-time resolution.
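For reference, the (unit-width) Weierstrass transform used to construct the coarse-grained structure function is Gaussian smoothing of the wavefunction,

\[
W[\psi](x) = \frac{1}{\sqrt{4\pi}} \int_{-\infty}^{\infty}
e^{-(x-y)^2/4}\, \psi(y)\, dy ;
\]

in practice a chosen smoothing length sets the coarse-graining scale.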
Unmanned Aircraft Systems in the National Airspace System: A Formal Methods Perspective
NASA Technical Reports Server (NTRS)
Munoz, Cesar A.; Dutle, Aaron; Narkawicz, Anthony; Upchurch, Jason
2016-01-01
As the technological and operational capabilities of unmanned aircraft systems (UAS) have grown, so too have international efforts to integrate UAS into civil airspace. However, one of the major concerns that must be addressed in realizing this integration is that of safety. For example, UAS lack an on-board pilot to comply with the legal requirement that pilots see and avoid other aircraft. This requirement has motivated the development of a detect and avoid (DAA) capability for UAS that provides situational awareness and maneuver guidance to UAS operators to aid them in avoiding and remaining well clear of other aircraft in the airspace. The NASA Langley Research Center Formal Methods group has played a fundamental role in the development of this capability. This article gives a selected survey of the formal methods work conducted in support of the development of a DAA concept for UAS. This work includes specification of low-level and high-level functional requirements, formal verification of algorithms, and rigorous validation of software implementations.
Chen, Stephanie C; Kim, Scott Yh
2016-12-01
Standard of care pragmatic clinical trials that compare treatments already in use could improve care and reduce costs, but there is considerable debate about the research risks of standard of care pragmatic clinical trials and how to apply informed consent regulations to such trials. We sought to develop a framework integrating the insights from opposing sides of the debate. We developed a formal risk-benefit analysis framework for standard of care pragmatic clinical trials and then applied it to key provisions of the US federal regulations. Our formal framework for standard of care pragmatic clinical trial risk-benefit analysis takes into account three key considerations: the ex ante estimates of risks and benefits of the treatments to be compared in a standard of care pragmatic clinical trial, the allocation ratios of treatments inside and outside such a trial, and the significance of some participants receiving a different treatment inside a trial than outside the trial. The framework provides practical guidance on how the research ethics regulations on informed consent should be applied to standard of care pragmatic clinical trials. Our proposed formal model makes explicit the relationship between the concepts used by opposing sides of the debate about the research risks of standard of care pragmatic clinical trials and can be used to clarify the implications for informed consent. © The Author(s) 2016.
Rare behavior of growth processes via umbrella sampling of trajectories
NASA Astrophysics Data System (ADS)
Klymko, Katherine; Geissler, Phillip L.; Garrahan, Juan P.; Whitelam, Stephen
2018-03-01
We compute probability distributions of trajectory observables for reversible and irreversible growth processes. These results reveal a correspondence between reversible and irreversible processes, at particular points in parameter space, in terms of their typical and atypical trajectories. Thus key features of growth processes can be insensitive to the precise form of the rate constants used to generate them, recalling the insensitivity to microscopic details of certain equilibrium behavior. We obtained these results using a sampling method, inspired by the "s -ensemble" large-deviation formalism, that amounts to umbrella sampling in trajectory space. The method is a simple variant of existing approaches, and applies to ensembles of trajectories controlled by the total number of events. It can be used to determine large-deviation rate functions for trajectory observables in or out of equilibrium.
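For reference, the "s-ensemble" idea the sampling method is inspired by: trajectories are reweighted by a field s conjugate to a trajectory observable K (here the total number of events), tilting the trajectory distribution toward its rare tails,

\[
P_s(K) = \frac{P(K)\, e^{-sK}}{\sum_{K'} P(K')\, e^{-sK'}} ,
\]

and large-deviation rate functions follow from the Legendre transform of the associated scaled cumulant generating function.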
Probing New Long-Range Interactions by Isotope Shift Spectroscopy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berengut, Julian C.; Budker, Dmitry; Delaunay, Cédric
We explore a method to probe new long- and intermediate-range interactions using precision atomic isotope shift spectroscopy. We develop a formalism to interpret linear King plots as bounds on new physics with minimal theory inputs. We focus only on bounding the new physics contributions that can be calculated independently of the standard model nuclear effects. We apply our method to existing Ca+ data and project its sensitivity to conjectured new bosons with spin-independent couplings to the electron and the neutron using narrow transitions in other atoms and ions, specifically, Sr and Yb. Future measurements are expected to improve the relative precision by 5 orders of magnitude, and they can potentially lead to an unprecedented sensitivity for bosons within the 0.3 to 10 MeV mass range.
Probing New Long-Range Interactions by Isotope Shift Spectroscopy.
Berengut, Julian C; Budker, Dmitry; Delaunay, Cédric; Flambaum, Victor V; Frugiuele, Claudia; Fuchs, Elina; Grojean, Christophe; Harnik, Roni; Ozeri, Roee; Perez, Gilad; Soreq, Yotam
2018-03-02
We explore a method to probe new long- and intermediate-range interactions using precision atomic isotope shift spectroscopy. We develop a formalism to interpret linear King plots as bounds on new physics with minimal theory inputs. We focus only on bounding the new physics contributions that can be calculated independently of the standard model nuclear effects. We apply our method to existing Ca^{+} data and project its sensitivity to conjectured new bosons with spin-independent couplings to the electron and the neutron using narrow transitions in other atoms and ions, specifically, Sr and Yb. Future measurements are expected to improve the relative precision by 5 orders of magnitude, and they can potentially lead to an unprecedented sensitivity for bosons within the 0.3 to 10 MeV mass range.
Probing New Long-Range Interactions by Isotope Shift Spectroscopy
Berengut, Julian C.; Budker, Dmitry; Delaunay, Cédric; ...
2018-02-26
We explore a method to probe new long- and intermediate-range interactions using precision atomic isotope shift spectroscopy. We develop a formalism to interpret linear King plots as bounds on new physics with minimal theory inputs. We focus only on bounding the new physics contributions that can be calculated independently of the standard model nuclear effects. We apply our method to existing Ca+ data and project its sensitivity to conjectured new bosons with spin-independent couplings to the electron and the neutron using narrow transitions in other atoms and ions, specifically, Sr and Yb. Future measurements are expected to improve the relative precision by 5 orders of magnitude, and they can potentially lead to an unprecedented sensitivity for bosons within the 0.3 to 10 MeV mass range.
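For reference, the King-plot linearity the bounds are built on, in commonly used notation: with modified isotope shifts mν_i ≡ ν_i^{AA'}/μ^{AA'}, where μ^{AA'} = 1/m_A − 1/m_{A'}, two transitions satisfy, at leading order in the standard model factorization,

\[
m\nu_2^{AA'} = K_{21} + F_{21}\, m\nu_1^{AA'} ,
\]

so a new boson coupling to electrons and neutrons generically appears as a nonlinearity of this plot.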
On the theory of Carriers' Electrostatic Interaction near an Interface
NASA Astrophysics Data System (ADS)
Waters, Michael; Hashemi, Hossein; Kieffer, John
2015-03-01
Heterojunction interfaces are common in non-traditional photovoltaic device designs, such as those based on small molecules, polymers, and perovskites. We have examined a number of effects of the heterojunction interface region on carrier/exciton energetics using a mixture of semi-classical and quantum electrostatic methods, ab initio methods, and statistical mechanics. Our theoretical analysis has yielded several useful relationships and numerical recipes that should be considered in device design regardless of the particular materials system. As a demonstration, we highlight these formalisms as applied to carriers and polaron pairs near a C60/subphthalocyanine interface. On the regularly ordered areas of the heterojunction, the effect of the interface is a significant set of corrections to the carrier energies, which in turn directly affect device performance.
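A textbook semi-classical starting point for such interface corrections, assuming a point charge q in medium 1 at distance d from a planar interface between dielectrics ε1 (host) and ε2, is the image-charge energy

\[
U(d) = \frac{q^2}{16\pi\varepsilon_0\,\varepsilon_1\, d}\,
\frac{\varepsilon_1 - \varepsilon_2}{\varepsilon_1 + \varepsilon_2} ,
\]

which is attractive (U < 0) when ε2 > ε1, pulling carriers toward the higher-permittivity side of the junction.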
NASA Astrophysics Data System (ADS)
Bog, Tino; Zander, Nils; Kollmannsberger, Stefan; Rank, Ernst
2018-04-01
The finite cell method (FCM) is a fictitious domain approach that greatly simplifies simulations involving complex structures. Recently, the FCM has been applied to contact problems. The current study continues in this field by extending the concept of weakly enforced boundary conditions to inequality constraints for frictionless contact. Furthermore, it formalizes an approach that automatically recovers high-order contact surfaces of (implicitly defined) embedded geometries by means of an extended Marching Cubes algorithm. To further improve the accuracy of the discretization, irregularities at the boundary of contact zones are treated with multi-level hp-refinements. Numerical results and a systematic study of h-, p- and hp-refinements show that the FCM can efficiently provide accurate results for problems involving contact.
Formal methods for test case generation
NASA Technical Reports Server (NTRS)
Rushby, John (Inventor); De Moura, Leonardo Mendonga (Inventor); Hamon, Gregoire (Inventor)
2011-01-01
The invention relates to the use of model checkers to generate efficient test sets for hardware and software systems. The method provides for extending existing tests to reach new coverage targets; searching *to* some or all of the uncovered targets in parallel; searching in parallel *from* some or all of the states reached in previous tests; and slicing the model relative to the current set of coverage targets. The invention provides efficient test case generation and test set formation. Deep regions of the state space can be reached within allotted time and memory. The approach has been applied to the use of the model checkers of SRI's SAL system and to model-based designs developed in Stateflow. Stateflow models achieving complete state and transition coverage in a single test case are reported.
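A minimal sketch of the core idea on a toy explicit-state model: an uncovered coverage target is posed as a reachability query, and the witness path the model checker returns is itself the test case. Here plain BFS stands in for the model checker; the model is hypothetical.

```python
from collections import deque

def test_case_for_target(initial, actions, step, is_target):
    """BFS over an explicit-state model; returns an action sequence reaching the target."""
    frontier = deque([(initial, [])])
    seen = {initial}
    while frontier:
        state, trace = frontier.popleft()
        if is_target(state):
            return trace                       # the witness path *is* the test case
        for a in actions:
            nxt = step(state, a)
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, trace + [a]))
    return None                                # target unreachable

# Toy counter model: cover the state where the counter reaches 3.
trace = test_case_for_target(0, ["inc", "dec"],
                             lambda s, a: s + (1 if a == "inc" else -1),
                             lambda s: s == 3)
print(trace)   # ['inc', 'inc', 'inc']
```

Extending an existing test, as described above, amounts to starting such searches from states already reached by previous traces rather than from the initial state.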
A Linguistic Truth-Valued Temporal Reasoning Formalism and Its Implementation
NASA Astrophysics Data System (ADS)
Lu, Zhirui; Liu, Jun; Augusto, Juan C.; Wang, Hui
Temporality and uncertainty are important features of many real world systems. Solving problems in such systems requires the use of formal mechanism such as logic systems, statistical methods or other reasoning and decision-making methods. In this paper, we propose a linguistic truth-valued temporal reasoning formalism to enable the management of both features concurrently using a linguistic truth valued logic and a temporal logic. We also provide a backward reasoning algorithm which allows the answering of user queries. A simple but realistic scenario in a smart home application is used to illustrate our work.
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
Statistical inference for tumor growth inhibition T/C ratio.
Wu, Jianrong
2010-09-01
The tumor growth inhibition T/C ratio is commonly used to quantify treatment effects in drug screening tumor xenograft experiments. The T/C ratio is converted to an antitumor activity rating using an arbitrary cutoff point and often without any formal statistical inference. Here, we applied a nonparametric bootstrap method and a small sample likelihood ratio statistic to make a statistical inference of the T/C ratio, including both hypothesis testing and a confidence interval estimate. Furthermore, sample size and power are also discussed for statistical design of tumor xenograft experiments. Tumor xenograft data from an actual experiment were analyzed to illustrate the application.
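A minimal sketch of the nonparametric bootstrap interval for the T/C ratio, with hypothetical tumor-volume data:

```python
import numpy as np

rng = np.random.default_rng(3)
treated = np.array([0.42, 0.55, 0.38, 0.61, 0.47, 0.52])   # hypothetical volumes
control = np.array([1.10, 0.95, 1.32, 1.05, 1.21, 0.99])

def tc_ratio(t, c):
    return t.mean() / c.mean()

boot = np.array([
    tc_ratio(rng.choice(treated, size=treated.size, replace=True),
             rng.choice(control, size=control.size, replace=True))
    for _ in range(10_000)
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"T/C = {tc_ratio(treated, control):.3f}, 95% percentile CI = ({lo:.3f}, {hi:.3f})")
```

An activity rating based on the interval, rather than on the point estimate alone, carries the formal inference the abstract calls for.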
Theory of molecular rate processes in the presence of intense laser radiation
NASA Technical Reports Server (NTRS)
George, T. F.; Zimmerman, I. H.; Devries, P. L.; Yuan, J.-M.; Lam, K.-S.; Bellum, J. C.; Lee, H.-W.; Slutsky, M. S.; Lin, J.-T.
1979-01-01
The present paper deals with the influence of intense laser radiation on gas-phase molecular rate processes. Representations of the radiation field, the particle system, and the interaction involving these two entities are discussed from a general rather than abstract point of view. The theoretical methods applied are outlined, and the formalism employed is illustrated by application to a variety of specific processes. Quantum mechanical and semiclassical treatments of representative atom-atom and atom-diatom collision processes in the presence of a field are examined, and examples of bound-continuum processes and heterogeneous catalysis are discussed within the framework of both quantum-mechanical and semiclassical theories.
NASA Astrophysics Data System (ADS)
Ojima, Izumi
1981-11-01
"Thermo field dynamics," allowing the Feynman diagram method to be applied to real-time causal Green's functions at finite temperatures ( not temperature Green's functions with imaginary times) expressed in the form of "vacuum" expectation values, is reconsidered in light of its connection with the algebraic formulation of statical machanics based upon the KMS condition. On the basis of so-obtained general basic formulae, the formalism is extended to the case of gauge theories, where the subsidiary condition specifying physical states, the notion of observables, and the structure of the physical subspace at finite temperatures are clarified.
The quantization of the chiral Schwinger model based on the BFT-BFV formalism II
NASA Astrophysics Data System (ADS)
Park, Mu-In; Park, Young-Jai; Yoon, Sean J.
1998-12-01
We apply an improved version of the Batalin-Fradkin-Tyutin Hamiltonian method to the a = 1 chiral Schwinger model, which is much more nontrivial than the a > 1 one. Furthermore, through the path integral quantization, we newly resolve the problem of the nontrivial delta-function as well as that of the unwanted Fourier parameter in the measure. As a result, we explicitly obtain the fully gauge invariant partition function, which includes a new type of Wess-Zumino term irrelevant to the gauge symmetry as well as the usual WZ action.
Electromagnetic processes in nucleus-nucleus collisions relating to space radiation research
NASA Technical Reports Server (NTRS)
Norbury, John W.
1992-01-01
Most of the papers within this report deal with electromagnetic processes in nucleus-nucleus collisions which are of concern in the space radiation program. In particular, the removal of one and two nucleons via both electromagnetic and strong interaction processes has been extensively investigated. The theory of relativistic Coulomb fission has also been developed. Several papers on quark models also appear. Finally, note that the theoretical methods developed in this work have been directly applied to the task of radiation protection of astronauts. This has been done by parameterizing the theoretical formalism in such a fashion that it can be used in cosmic ray transport codes.
Ritchie, R.H.; Sakakura, A.Y.
1956-01-01
The formal solutions of problems involving transient heat conduction in infinite, internally bounded cylindrical solids may be obtained by the Laplace transform method. Asymptotic series representing the solutions for large values of time are given in terms of functions related to the derivatives of the reciprocal gamma function. The results are applied to the case of the internally bounded infinite cylindrical medium with (a) the boundary held at constant temperature, (b) constant heat flow over the boundary, and (c) the "radiation" boundary condition. A problem in the flow of gas through a porous medium is considered in detail.
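For reference, the transform step involved: with diffusivity alpha, the radially symmetric heat equation maps under the Laplace transform to a modified Bessel equation whose bounded exterior solution is

\[
\frac{\partial T}{\partial t} = \alpha\,\nabla^2 T
\;\;\xrightarrow{\ \mathcal{L}\ }\;\;
\frac{d^2 \bar{T}}{dr^2} + \frac{1}{r}\frac{d\bar{T}}{dr} - \frac{s}{\alpha}\,\bar{T} = 0 ,
\qquad
\bar{T}(r,s) = A(s)\, K_0\!\left( r\sqrt{s/\alpha} \right),
\]

with A(s) fixed by the inner boundary condition and the large-time asymptotics obtained from the small-s expansion.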
Deductive Evaluation: Formal Code Analysis With Low User Burden
NASA Technical Reports Server (NTRS)
Di Vito, Ben L.
2016-01-01
We describe a framework for symbolically evaluating iterative C code using a deductive approach that automatically discovers and proves program properties. Although verification is not performed, the method can infer detailed program behavior. Software engineering work flows could be enhanced by this type of analysis. Floyd-Hoare verification principles are applied to synthesize loop invariants, using a library of iteration-specific deductive knowledge. When needed, theorem proving is interleaved with evaluation and performed on the fly. Evaluation results take the form of inferred expressions and type constraints for values of program variables. An implementation using PVS (Prototype Verification System) is presented along with results for sample C functions.
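A minimal illustration, in the Floyd-Hoare spirit described, of checking a candidate loop invariant mechanically: initialization, preservation, and the postcondition on exit are tested exhaustively on small inputs (a stand-in for on-the-fly theorem proving; the loop and invariant are hypothetical).

```python
def check_invariant(n_max=50):
    """Loop: i, s = 0, 0; while i < n: s += i; i += 1.  Candidate invariant: s == i*(i-1)//2."""
    inv = lambda i, s: s == i * (i - 1) // 2
    for n in range(n_max):
        i, s = 0, 0
        assert inv(i, s)                  # invariant holds on entry
        while i < n:
            s, i = s + i, i + 1
            assert inv(i, s)              # invariant preserved by the loop body
        assert s == n * (n - 1) // 2      # invariant + exit condition gives postcondition
    return True

print(check_invariant())
```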
An analytical approach to gravitational lensing by an ensemble of axisymmetric lenses
NASA Technical Reports Server (NTRS)
Lee, Man Hoi; Spergel, David N.
1990-01-01
The problem of gravitational lensing by an ensemble of identical axisymmetric lenses randomly distributed on a single lens plane is considered and a formal expression is derived for the joint probability density of finding shear and convergence at a random point on the plane. The amplification probability for a source can be accurately estimated from the distribution in shear and convergence. This method is applied to two cases: lensing by an ensemble of point masses and by an ensemble of objects with Gaussian surface mass density. There is no convergence for point masses whereas shear is negligible for wide Gaussian lenses.
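For reference, the link between the joint shear-convergence density and amplification: at a point on the lens plane with convergence kappa and shear gamma, the magnification is

\[
\mu = \frac{1}{\left| (1-\kappa)^2 - |\gamma|^2 \right|} ,
\]

so the amplification probability follows directly from the joint distribution of (kappa, gamma).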
Eiber, Calvin D; Dokos, Socrates; Lovell, Nigel H; Suaning, Gregg J
2017-05-01
The capacity to quickly and accurately simulate extracellular stimulation of neurons is essential to the design of next-generation neural prostheses. Existing platforms for simulating neurons are largely based on finite-difference techniques; due to the complex geometries involved, the more powerful spectral or differential quadrature techniques cannot be applied directly. This paper presents a mathematical basis for the application of a spectral element method to the problem of simulating the extracellular stimulation of retinal neurons, which is readily extensible to neural fibers of any kind. The activating function formalism is extended to arbitrary neuron geometries, and a segmentation method to guarantee an appropriate choice of collocation points is presented. Differential quadrature may then be applied to efficiently solve the resulting cable equations. The capacity for this model to simulate action potentials propagating through branching structures and to predict minimum extracellular stimulation thresholds for individual neurons is demonstrated. The presented model is validated against published values for extracellular stimulation threshold and conduction velocity for realistic physiological parameter values. This model suggests that convoluted axon geometries are more readily activated by extracellular stimulation than linear axon geometries, which may have ramifications for the design of neural prostheses.
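For reference, the activating-function formalism being extended: for a straight fiber, the linear cable equation driven by the extracellular potential V_e reads

\[
\lambda^2 \frac{\partial^2 V_m}{\partial x^2} - \tau \frac{\partial V_m}{\partial t} - V_m
= -\lambda^2 \frac{\partial^2 V_e}{\partial x^2} ,
\]

where the right-hand side defines the activating function f(x) proportional to the second spatial derivative of V_e; the paper generalizes this driving term to arbitrary neuron geometries and solves the resulting cable equations by differential quadrature.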
Usability evaluation techniques in mobile commerce applications: A systematic review
NASA Astrophysics Data System (ADS)
Hussain, Azham; Mkpojiogu, Emmanuel O. C.
2016-08-01
A number of studies concern the usability of mobile commerce (m-commerce) applications and related areas, but they provide little knowledge about the usability techniques used in empirical usability evaluations of m-commerce applications. This paper therefore identifies the usability techniques most frequently used in usability evaluation for m-commerce applications. To achieve this objective, a systematic literature review was employed. Sixty-seven papers on usability evaluation for m-commerce and related areas were retrieved, and the twenty-one most relevant studies were selected for review in order to extract the appropriate information. The results show that heuristic evaluation, formal tests and think-aloud methods are the most commonly used methods for m-commerce applications, in comparison to cognitive walkthrough and informal test methods. Moreover, controlled experiments were the most common study design (33.3% of the studies), while case studies accounted for 14.28%. The results provide usability practitioners and the research community with additional knowledge of the current state and use of usability techniques in m-commerce applications.
ERIC Educational Resources Information Center
Ugwu, Chinwe U.
2015-01-01
The National Commission for Mass Literacy, Adult and Non-Formal Education (NMEC) is the Federal Statutory Agency set up to co-ordinate all aspects of Non-Formal Education in Nigeria whether offered by government agencies or non-governmental organisations. This study looked at the existing Capacity Building Programme, the delivery methods, impact…
ERIC Educational Resources Information Center
Penning, Margaret J.
2002-01-01
Purpose: In response to concerns among policymakers and others that increases in the availability of publicly funded formal services will lead to reductions in self- and informal care, this study examines the relationship between the extent of formal in-home care received and levels of self- and informal care. Design and Methods: Two-stage least…
The Influence of Rural Location on Utilization of Formal Home Care: The Role of Medicaid
ERIC Educational Resources Information Center
McAuley, William J.; Spector, William D.; Van Nostrand, Joan; Shaffer, Tom
2004-01-01
Purpose: This research examines the impact of rural-urban residence on formal home-care utilization among older people and determines whether and how Medicaid coverage influences the association between, rural-urban location and risk of formal home-care use. Design and Methods: We combined data from the 1998 consolidated file of the Medical…
GRADSPMHD: A parallel MHD code based on the SPH formalism
NASA Astrophysics Data System (ADS)
Vanaverbeke, S.; Keppens, R.; Poedts, S.
2014-03-01
We present GRADSPMHD, a completely Lagrangian parallel magnetohydrodynamics code based on the SPH formalism. The implementation of the equations of SPMHD in the "GRAD-h" formalism assembles known results, including the derivation of the discretized MHD equations from a variational principle, the inclusion of time-dependent artificial viscosity, resistivity and conductivity terms, as well as the inclusion of a mixed hyperbolic/parabolic correction scheme for satisfying the ∇·B constraint on the magnetic field. The code uses a tree-based formalism for neighbor finding and can optionally use the tree code for computing the self-gravity of the plasma. The structure of the code closely follows the framework of our parallel GRADSPH FORTRAN 90 code, which we added previously to the CPC program library. We demonstrate the capabilities of GRADSPMHD by running 1, 2, and 3 dimensional standard benchmark tests and we find good agreement with previous work done by other researchers. The code is also applied to the problem of simulating the magnetorotational instability in 2.5D shearing box tests as well as in global simulations of magnetized accretion disks. We find good agreement with available results on this subject in the literature. Finally, we discuss the performance of the code on a parallel supercomputer with distributed memory architecture.
Catalogue identifier: AERP_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERP_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 620503
No. of bytes in distributed program, including test data, etc.: 19837671
Distribution format: tar.gz
Programming language: FORTRAN 90/MPI
Computer: HPC cluster
Operating system: Unix
Has the code been vectorized or parallelized?: Yes, parallelized using MPI
RAM: ~30 MB for a Sedov test including 15625 particles on a single CPU
Classification: 12
Nature of problem: Evolution of a plasma in the ideal MHD approximation
Solution method: The equations of magnetohydrodynamics are solved using the SPH method
Running time: The test provided takes approximately 20 min using 4 processors
Gati, Wafa; Rammah, Mohamed M; Rammah, Mohamed B
2012-01-01
We have developed a general synthesis of polysubstituted 1,4-dihydropyridines and pyridines based on a highly regioselective lithiation/6-endo-dig intramolecular carbolithiation from readily available N-allyl-ynamides. This reaction, which has been successfully applied to the formal synthesis of the anti-dyskinesia agent sarizotan, further extends the use of ynamides in organic synthesis and further demonstrates the synthetic efficiency of carbometallation reactions. PMID:23365632
NASA Astrophysics Data System (ADS)
Güémez, J.; Fiolhais, M.
2018-05-01
We apply the four-vector formalism of special relativity to describe various interaction processes of photons with a solar sail, in two cases: when the sail’s surface is a perfect mirror, and when it is a body coated with a totally absorbing material. We stress the pedagogical value of implementing simultaneously both the linear momentum and the energy conservation in a covariant fashion, as our formalism inherently does. It also allows for a straightforward change of the description of a certain process in different inertial reference frames.
Topological vertex formalism with O5-plane
NASA Astrophysics Data System (ADS)
Kim, Sung-Soo; Yagi, Futoshi
2018-01-01
We propose a new topological vertex formalism for a type IIB (p, q) 5-brane web with an O5-plane. We apply our proposal to five-dimensional N = 1 Sp(1) gauge theory with N_f = 0, 1, 8 flavors to compute the topological string partition functions and check the agreement with the known results. Especially for the N_f = 8 case, which corresponds to E-string theory on a circle, we obtain a new, yet simple, expression of the partition function as a sum over two Young diagrams.
Trumpet slices in Kerr spacetimes.
Dennison, Kenneth A; Baumgarte, Thomas W; Montero, Pedro J
2014-12-31
We introduce a new time-independent family of analytical coordinate systems for the Kerr spacetime representing rotating black holes. We also propose a (2+1)+1 formalism for the characterization of trumpet geometries. Applying this formalism to our new family of coordinate systems we identify, for the first time, analytical and stationary trumpet slices for general rotating black holes, even for charged black holes in the presence of a cosmological constant. We present results for metric functions in this slicing and analyze the geometry of the rotating trumpet surface.
Computational logic: its origins and applications
2018-01-01
Computational logic is the use of computers to establish facts in a logical formalism. Originating in nineteenth century attempts to understand the nature of mathematical reasoning, the subject now comprises a wide variety of formalisms, techniques and technologies. One strand of work follows the ‘logic for computable functions (LCF) approach’ pioneered by Robin Milner, where proofs can be constructed interactively or with the help of users’ code (which does not compromise correctness). A refinement of LCF, called Isabelle, retains these advantages while providing flexibility in the choice of logical formalism and much stronger automation. The main application of these techniques has been to prove the correctness of hardware and software systems, but increasingly researchers have been applying them to mathematics itself. PMID:29507522
ERIC Educational Resources Information Center
Johnson, Christopher W.
1996-01-01
The development of safety-critical systems (aircraft cockpits and reactor control rooms) is qualitatively different from that of other interactive systems. These differences impose burdens on design teams that must ensure the development of human-machine interfaces. Analyzes strengths and weaknesses of formal methods for the design of user…
Villareal, Oscar D; Rodriguez, Roberto A; Yu, Lili; Wambo, Thierry O
2016-08-20
Molecular dynamics simulations employing all-atom force fields have become a reliable way to study binding interactions quantitatively for a wide range of systems. In this work, we employ two recently developed methods for the calculation of dissociation constants K_D between gold nanoparticles (AuNPs) of different sizes in a near-physiological environment through the potential of mean force (PMF) formalism: the method of geometrical restraints developed by Woo et al. and formalized by Gumbart et al., and the method of hybrid Steered Molecular Dynamics (hSMD). Obtaining identical results (within the margin of error) from both approaches on the negatively charged Au18(SR)14 NP, functionalized by the negatively charged 4-mercapto-benzoate (pMBA) ligand, we draw parallels between their energetic and entropic interactions. By applying the hSMD method on Au102(SR)44 and Au144(SR)60, both of them near-spherical in shape and functionalized by pMBA, we study the effects of size and shape on the binding interactions. Au18 binds weakly with K_D = 13 mM as a result of two opposing effects: its large surface curvature hindering the formation of salt bridges, and its large ligand density on preferential orientations favoring their formation. On the other hand, Au102 binds more strongly with K_D = 30 μM and Au144 binds the strongest with K_D = 3.2 nM.
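For context, converting a PMF-derived dissociation constant into a binding free energy uses only standard thermodynamics (not anything specific to this paper), with standard concentration C° = 1 M:

    \Delta G^{\circ}_{\mathrm{bind}} = RT\,\ln\!\left(\frac{K_D}{C^{\circ}}\right),

so the reported K_D values of 13 mM, 30 μM, and 3.2 nM correspond to roughly -2.6, -6.2, and -11.6 kcal/mol at 298 K.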
Discontinuous Galerkin finite element methods for radiative transfer in spherical symmetry
NASA Astrophysics Data System (ADS)
Kitzmann, D.; Bolte, J.; Patzer, A. B. C.
2016-11-01
The discontinuous Galerkin finite element method (DG-FEM) is successfully applied to treat a broad variety of transport problems numerically. In this work, we use the full capacity of the DG-FEM to solve the radiative transfer equation in spherical symmetry. We present a discontinuous Galerkin method to directly solve the spherically symmetric radiative transfer equation as a two-dimensional problem. The transport equation in spherical atmospheres is more complicated than in the plane-parallel case owing to the appearance of an additional derivative with respect to the polar angle. The DG-FEM formalism allows for the exact integration of arbitrarily complex scattering phase functions, independent of the angular mesh resolution. We show that the discontinuous Galerkin method is able to describe accurately the radiative transfer in extended atmospheres and to capture discontinuities or complex scattering behaviour which might be present in the solution of certain radiative transfer tasks and can, therefore, cause severe numerical problems for other radiative transfer solution methods.
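For reference, the spherically symmetric radiative transfer equation the paper discretizes has the standard textbook form (written here with a generic source term; the paper's notation may differ):

    \mu\,\frac{\partial I(r,\mu)}{\partial r} + \frac{1-\mu^{2}}{r}\,\frac{\partial I(r,\mu)}{\partial \mu} = -\chi(r)\,I(r,\mu) + \eta(r,\mu),

where the polar-angle derivative in the second term is exactly the complication absent from the plane-parallel case, and treating (r, μ) jointly is what makes the problem two-dimensional for the DG-FEM.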
DOE Office of Scientific and Technical Information (OSTI.GOV)
Viennot, David
We show that the holonomy of a connection defined on a principal composite bundle is related by a non-Abelian Stokes theorem to the composition of the holonomies associated with the connections of the component bundles of the composite. We apply this formalism to describe the non-Abelian geometric phase (when the geometric phase generator does not commute with the dynamical phase generator). We find then an assumption to obtain a new kind of separation between the dynamical and the geometric phases. We also apply this formalism to the gauge theory of gravity in the presence of a Dirac spinor field in order to decompose the holonomy of the Lorentz connection into holonomies of the linear connection and of the Cartan connection.
Thermal quantum time-correlation functions from classical-like dynamics
NASA Astrophysics Data System (ADS)
Hele, Timothy J. H.
2017-07-01
Thermal quantum time-correlation functions are of fundamental importance in quantum dynamics, allowing experimentally measurable properties such as reaction rates, diffusion constants and vibrational spectra to be computed from first principles. Since the exact quantum solution scales exponentially with system size, there has been considerable effort in formulating reliable linear-scaling methods involving exact quantum statistics and approximate quantum dynamics modelled with classical-like trajectories. Here, we review recent progress in the field with the development of methods including centroid molecular dynamics, ring polymer molecular dynamics (RPMD) and thermostatted RPMD (TRPMD). We show how these methods have recently been obtained from 'Matsubara dynamics', a form of semiclassical dynamics which conserves the quantum Boltzmann distribution. We also apply the Matsubara formalism to reaction rate theory, rederiving t → 0+ quantum transition-state theory (QTST) and showing that Matsubara-TST, like RPMD-TST, is equivalent to QTST. We end by surveying areas for future progress.
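The central object that all of these trajectory-based methods approximate is the Kubo-transformed thermal time-correlation function, quoted here in its standard form for orientation:

    \tilde{C}_{AB}(t) = \frac{1}{\beta Z}\int_{0}^{\beta} d\lambda\; \mathrm{Tr}\!\left[e^{-(\beta-\lambda)\hat{H}}\,\hat{A}\,e^{-\lambda\hat{H}}\,e^{i\hat{H}t/\hbar}\,\hat{B}\,e^{-i\hat{H}t/\hbar}\right].

Its t = 0 value is exactly the quantum statistical average, which is why methods that conserve the quantum Boltzmann distribution, such as Matsubara dynamics, are a natural starting point.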
The LS-STAG immersed boundary/cut-cell method for non-Newtonian flows in 3D extruded geometries
NASA Astrophysics Data System (ADS)
Nikfarjam, F.; Cheny, Y.; Botella, O.
2018-05-01
The LS-STAG method is an immersed boundary/cut-cell method for viscous incompressible flows based on the staggered MAC arrangement for Cartesian grids, where the irregular boundary is sharply represented by its level-set function, resulting in a significant gain in computer resources (wall time, memory usage) compared to commercial body-fitted CFD codes. The 2D version of the LS-STAG method is now well-established (Cheny and Botella, 2010), and this paper presents its extension to 3D geometries with translational symmetry in the z direction (hereinafter called 3D extruded configurations). This intermediate step towards the fully 3D implementation can be applied to a wide variety of canonical flows and will be regarded as the keystone for the full 3D solver, since both discretization and implementation issues on distributed memory machines are tackled at this stage of development. The LS-STAG method is then applied to various Newtonian and non-Newtonian flows in 3D extruded geometries (axisymmetric pipe, circular cylinder, duct with an abrupt expansion) for which benchmark results and experimental data are available. The purposes of these investigations are (a) to investigate the formal order of accuracy of the LS-STAG method, (b) to assess the versatility of the method for flow applications at various regimes (Newtonian and shear-thinning fluids, steady and unsteady laminar to turbulent flows), and (c) to compare its performance with well-established numerical methods (body-fitted and immersed boundary methods).
Wang, Jun; Zhang, Mengya; Li, Shulan; He, Bingshu
2018-07-01
The occurrence of pharmaceuticals in the natural environment has now been frequently reported around the world. As biologically active compounds specially designed to be effective even at very low concentrations, pharmaceuticals in the environment can have adverse impacts on the health of human beings and other non-targeted organisms through long-term exposure. To minimize pharmaceutical pollution from the perspective of drug administration, a new concept called eco-pharmacovigilance (EPV) has been proposed as a kind of pharmacovigilance (PV) for the environment. However, as a new and comprehensive science, EPV does not yet have sophisticated methods in practice or a formalized implementation model. Since EPV is a special kind of PV, it is feasible to draw on the experience of PV as a possible and reasonable starting point for EPV. In this paper, we discussed the common methods and activities used in PV, including spontaneous reporting, intensive monitoring and database studies, and their potential applicability to the environment. We concluded that these common methods in PV could be adapted and applied to EPV, but that organizational, technical and financial support for the EPV system is still needed. Copyright © 2018 Elsevier B.V. All rights reserved.
The NIFTy way of Bayesian signal inference
NASA Astrophysics Data System (ADS)
Selig, Marco
2014-12-01
We introduce NIFTy, "Numerical Information Field Theory", a software package for the development of Bayesian signal inference algorithms that operate independently from any underlying spatial grid and its resolution. A large number of Bayesian and Maximum Entropy methods for 1D signal reconstruction, 2D imaging, as well as 3D tomography, appear formally similar, but one often finds individualized implementations that are neither flexible nor easily transferable. Signal inference in the framework of NIFTy can be done in an abstract way, such that algorithms prototyped in 1D can be applied to real-world problems in higher-dimensional settings. NIFTy is a versatile library that is applicable, and has already been applied, in 1D, 2D, 3D and spherical settings. A recent application is the D3PO algorithm, targeting the non-trivial task of denoising, deconvolving, and decomposing photon observations in high-energy astronomy.
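The flavor of such grid-independent Bayesian inference can be conveyed by its simplest instance, the Wiener filter. The sketch below is a plain NumPy toy with made-up covariances; it is illustrative only and deliberately does not use the NIFTy API:

    import numpy as np

    # Toy Wiener filter: data d = R s + n with signal prior s ~ N(0, S)
    # and noise n ~ N(0, N); posterior mean m = (S^-1 + R^T N^-1 R)^-1 R^T N^-1 d.
    npix = 64
    S = np.diag(1.0 / (1.0 + np.arange(npix)) ** 2)  # assumed falling signal power
    N = 0.1 * np.eye(npix)                           # white noise covariance
    R = np.eye(npix)                                 # trivial instrument response
    rng = np.random.default_rng(0)
    s_true = rng.multivariate_normal(np.zeros(npix), S)
    d = R @ s_true + rng.multivariate_normal(np.zeros(npix), N)
    D_inv = np.linalg.inv(S) + R.T @ np.linalg.inv(N) @ R
    m = np.linalg.solve(D_inv, R.T @ np.linalg.inv(N) @ d)  # reconstruction

In NIFTy the signal, response and covariances are abstract operator objects, so the same estimator prototyped on a 1D array carries over to 2D, 3D or spherical domains, which is the portability the abstract emphasizes.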
Shenvi, Neil; van Aggelen, Helen; Yang, Yang; Yang, Weitao; Schwerdtfeger, Christine; Mazziotti, David
2013-08-07
Tensor hypercontraction is a method that allows the representation of a high-rank tensor as a product of lower-rank tensors. In this paper, we show how tensor hypercontraction can be applied to both the electron repulsion integral tensor and the two-particle excitation amplitudes used in the parametric 2-electron reduced density matrix (p2RDM) algorithm. Because only O(r) auxiliary functions are needed in both of these approximations, our overall algorithm can be shown to scale as O(r^4), where r is the number of single-particle basis functions. We apply our algorithm to several small molecules, hydrogen chains, and alkanes to demonstrate its low formal scaling and practical utility. Provided we use enough auxiliary functions, we obtain accuracy similar to that of the standard p2RDM algorithm, somewhere between that of CCSD and CCSD(T).
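The core factorization is easy to state concretely: tensor hypercontraction writes the rank-4 electron repulsion integral tensor as products of rank-2 factors. A minimal NumPy sketch with random stand-in factors follows (hypothetical data; a real implementation would fit X and Z to the actual integrals rather than sampling them):

    import numpy as np

    # THC form: ERI_pqrs ~ sum_PQ X_p^P X_q^P Z_PQ X_r^Q X_s^Q
    nbasis, naux = 10, 30                      # basis size r, auxiliary points
    rng = np.random.default_rng(1)
    X = rng.standard_normal((nbasis, naux))    # collocation matrix X_p^P
    Z = rng.standard_normal((naux, naux))      # core matrix Z_PQ
    # Materializing the full tensor costs O(r^4) storage; the point of THC is
    # that contractions can instead be carried out on the smaller factors.
    eri_thc = np.einsum('pP,qP,PQ,rQ,sQ->pqrs', X, X, Z, X, X, optimize=True)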
Koenderink, Jan J; van Doorn, Andrea J; Wagemans, Johan
2011-01-01
Depth is the feeling of remoteness, or separateness, that accompanies awareness in human modalities like vision and audition. In specific cases depths can be graded on an ordinal scale, or even measured quantitatively on an interval scale. In the case of pictorial vision this is complicated by the fact that human observers often appear to apply mental transformations that involve depths in distinct visual directions. This implies that a comparison of empirically determined depths between observers involves pictorial space as an integral entity, whereas comparing pictorial depths as such is meaningless. We describe the formal structure of pictorial space purely in the phenomenological domain, without taking recourse to the theories of optics, which properly apply to physical space, a distinct ontological domain. We introduce a number of general ways to design and implement methods of geodesy in pictorial space, and discuss some basic problems associated with such measurements. We deal mainly with conceptual issues.
Marae o te Rangi, Temples of the Heavens: Explorations in Polynesian Archaeoastronomy
NASA Astrophysics Data System (ADS)
Kirch, Patrick V.
2015-08-01
It is well established that the ancient Polynesians possessed sophisticated knowledge of astronomy, applying their understanding of the movements of heavenly bodies among other things to long-distance navigation and to their calendrical systems. Nonetheless, Polynesian archaeologists have been reluctant to apply the methods of archaeoastronomy to the interpretation of prehistoric monumental sites, especially temples (marae and heiau). This presentation draws upon examples from the Mangareva and Hawaiian archipelagoes to demonstrate that Polynesian ritual architecture frequently exhibits regular patterns of orientation, suggesting that these temples were aligned with particular astronomical phenomena, such as solstice, equinox, and Pleiades rising positions. The argument is advanced that Polynesian temples were not only places of offering and sacrifice to the gods, but also locations for formal astronomical observation. In part, such observation was presumably crucial to keeping the Polynesian lunar calendar synchronized with the solar year.
NASA Astrophysics Data System (ADS)
Galliano, Frédéric
2018-05-01
This article presents a new dust spectral energy distribution (SED) model, named HerBIE, aimed at eliminating the noise-induced correlations and large scatter obtained when performing least-squares fits. The originality of this code is to apply the hierarchical Bayesian approach to full dust models, including realistic optical properties, stochastic heating, and the mixing of physical conditions in the observed regions. We test the performances of our model by applying it to synthetic observations. We explore the impact on the recovered parameters of several effects: signal-to-noise ratio, SED shape, sample size, the presence of intrinsic correlations, the wavelength coverage, and the use of different SED model components. We show that this method is very efficient: the recovered parameters are consistently distributed around their true values. We do not find any clear bias, even for the most degenerate parameters, or with extreme signal-to-noise ratios.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drzymala, R. E., E-mail: drzymala@wustl.edu; Alvarez, P. E.; Bednarz, G.
2015-11-15
Purpose: Absorbed dose calibration for gamma stereotactic radiosurgery is challenging due to the unique geometric conditions, dosimetry characteristics, and nonstandard field size of these devices. Members of the American Association of Physicists in Medicine (AAPM) Task Group 178 on Gamma Stereotactic Radiosurgery Dosimetry and Quality Assurance have participated in a round-robin exchange of calibrated measurement instrumentation and phantoms exploring two approved and two proposed calibration protocols or formalisms on ten gamma radiosurgery units. The objectives of this study were to benchmark and compare new formalisms to existing calibration methods, while maintaining traceability to U.S. primary dosimetry calibration laboratory standards. Methods: Nine institutions made measurements using ten gamma stereotactic radiosurgery units in three different 160 mm diameter spherical phantoms [acrylonitrile butadiene styrene (ABS) plastic, Solid Water, and liquid water] and in air using a positioning jig. Two calibrated miniature ionization chambers and one calibrated electrometer were circulated for all measurements. Reference dose-rates at the phantom center were determined using the well-established AAPM TG-21 or TG-51 dose calibration protocols and using two proposed dose calibration protocols/formalisms: an in-air protocol and a formalism proposed by the International Atomic Energy Agency (IAEA) working group for small and nonstandard radiation fields. Each institution’s results were normalized to the dose-rate determined at that institution using the TG-21 protocol in the ABS phantom. Results: Percentages of dose-rates within 1.5% of the reference dose-rate (TG-21 + ABS phantom) for the eight chamber-protocol-phantom combinations were the following: 88% for TG-21, 70% for TG-51, 93% for the new IAEA nonstandard-field formalism, and 65% for the new in-air protocol. Averages and standard deviations for dose-rates over all measurements relative to the TG-21 + ABS dose-rate were 0.999 ± 0.009 (TG-21), 0.991 ± 0.013 (TG-51), 1.000 ± 0.009 (IAEA), and 1.009 ± 0.012 (in-air). There were no statistically significant differences (i.e., p > 0.05) between the two ionization chambers for the TG-21 protocol applied to all dosimetry phantoms. The mean results using the TG-51 protocol were notably lower than those for the other dosimetry protocols, with a standard deviation 2–3 times larger. The in-air protocol was not statistically different from TG-21 for the A16 chamber in the liquid water or ABS phantoms (p = 0.300 and p = 0.135) but was statistically different from TG-21 for the PTW chamber in all phantoms (p = 0.006 for Solid Water, 0.014 for liquid water, and 0.020 for ABS). Results of IAEA formalism were statistically different from TG-21 results only for the combination of the A16 chamber with the liquid water phantom (p = 0.017). In the latter case, dose-rates measured with the two protocols differed by only 0.4%. For other phantom-ionization-chamber combinations, the new IAEA formalism was not statistically different from TG-21. Conclusions: Although further investigation is needed to validate the new protocols for other ionization chambers, these results can serve as a reference to quantitatively compare different calibration protocols and ionization chambers if a particular method is chosen by a professional society to serve as a standardized calibration protocol.
Changes in formal sex education: 1995-2002.
Lindberg, Laura Duberstein; Santelli, John S; Singh, Susheela
2006-12-01
Although comprehensive sex education is broadly supported by health professionals, funding for abstinence-only education has increased. Using data from the 1995 National Survey of Adolescent Males, the 1995 National Survey of Family Growth (NSFG) and the 2002 NSFG, changes in male and female adolescents' reports of the sex education they have received from formal sources were examined. Life-table methods were used to measure the timing of instruction, and t tests were used to assess changes over time. From 1995 to 2002, reports of formal instruction about birth control methods declined among both genders (males, from 81% to 66%; females, from 87% to 70%). This, combined with increases in reports of abstinence education among males (from 74% to 83%), resulted in a lower proportion of teenagers overall receiving formal instruction about both abstinence and birth control methods (males, 65% to 59%; females, 84% to 65%), and a higher proportion receiving instruction only about abstinence (males, 9% to 24%; females, 8% to 21%). Teenagers in 2002 had received abstinence education about two years earlier (median age, 11.4 for males, 11.8 for females) than they had received birth control instruction (median age, 13.5 for both males and females). Among sexually experienced adolescents, 62% of females and 54% of males had received instruction about birth control methods prior to first sex. A substantial retreat from formal instruction about birth control methods has left increasing proportions of adolescents receiving only abstinence education. Efforts are needed to expand teenagers' access to medically accurate and comprehensive reproductive health information.
USDA-ARS?s Scientific Manuscript database
The ultimate goal of applied research of phosphorus (P) transfer from agricultural fields to surface waters should arguably be to develop and apply mathematical models. There are two primary reasons for this assertion: 1) models formalize our understanding of P transfer and force us to test that und...
ERIC Educational Resources Information Center
Rienties, Bart; Hosein, Anesa
2015-01-01
How and with whom academics develop and maintain formal and informal networks for reflecting on their teaching practice has received limited attention even though academic development (AD) programmes have become an almost ubiquitous feature of higher education. The primary goal of this mixed-method study is to unpack how 114 academics in an AD…
Samuvel, K; Ramachandran, K
2015-07-05
This study examined the effects of the combination of starting materials on the properties of solid-state reacted BaTiO3 using two different types of BaCO3 and TiO2. In addition, the effect of mechanochemical activation by high energy milling and the Ba/Ti molar ratio on the reaction temperature, particle size and tetragonality were investigated. The TiO2 phase and size plays a major role in increasing the reaction temperature and particle size. With the optimum selection of starting materials and processing conditions, BaTiO3 with a particle size <200 nm (Scherrer's formula) and a tetragonality c/a of approximately 1.007 was obtained. Broadband dielectric spectroscopy is applied to investigate the electrical properties of disordered perovskite-like ceramics in a wide temperature range. From the X-ray diffraction analysis it was found that the newly obtained BaTi0.5Fe0.5O3 ceramics consist of two chemically different phases. The electric modulus M∗ formalism used in the analysis enabled us to distinguish and separate the relaxation processes, dominated by marked conductivity in the ε∗(ω) representation. Interfacial effects on the dielectric properties of the samples have been understood by Cole-Cole plots in complex impedance and modulus formalism. Modulus formalism has identified the effects of both grain and grain boundary microstructure on the dielectric properties, particularly in solid state routed samples. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Samuvel, K.; Ramachandran, K.
2015-07-01
This study examined the effects of the combination of starting materials on the properties of solid-state reacted BaTiO3 using two different types of BaCO3 and TiO2. In addition, the effect of mechanochemical activation by high energy milling and the Ba/Ti molar ratio on the reaction temperature, particle size and tetragonality were investigated. The TiO2 phase and size plays a major role in increasing the reaction temperature and particle size. With the optimum selection of starting materials and processing conditions, BaTiO3 with a particle size <200 nm (Scherrer's formula) and a tetragonality c/a of approximately 1.007 was obtained. Broadband dielectric spectroscopy is applied to investigate the electrical properties of disordered perovskite-like ceramics in a wide temperature range. From the X-ray diffraction analysis it was found that the newly obtained BaTi0.5Fe0.5O3 ceramics consist of two chemically different phases. The electric modulus M∗ formalism used in the analysis enabled us to distinguish and separate the relaxation processes, dominated by marked conductivity in the ε∗(ω) representation. Interfacial effects on the dielectric properties of the samples have been understood by Cole-Cole plots in complex impedance and modulus formalism. Modulus formalism has identified the effects of both grain and grain boundary microstructure on the dielectric properties, particularly in solid state routed samples.
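The modulus analysis used in both records above is, at bottom, a change of representation: the complex electric modulus is the reciprocal of the complex permittivity, M* = 1/ε*, and relaxation processes masked by conductivity in ε*(ω) show up as peaks in M''(ω). A short sketch with an assumed Debye-type permittivity (hypothetical parameters, purely for illustration):

    import numpy as np

    # Electric modulus from complex permittivity: M* = 1/eps* = M' + i M''.
    omega = np.logspace(1, 7, 200)            # angular frequency (rad/s)
    eps_inf, d_eps, tau = 10.0, 990.0, 1e-4   # assumed Debye parameters
    eps = eps_inf + d_eps / (1.0 + 1j * omega * tau)
    M = 1.0 / eps                             # complex electric modulus
    M1, M2 = M.real, M.imag                   # peaks in M2 mark relaxations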
Description logic-based methods for auditing frame-based medical terminological systems.
Cornet, Ronald; Abu-Hanna, Ameen
2005-07-01
Medical terminological systems (TSs) play an increasingly important role in health care by supporting recording, retrieval and analysis of patient information. As the size and complexity of TSs are growing, the need arises for means to audit them, i.e. verify and maintain (logical) consistency and (semantic) correctness of their contents. This is not only important for the management of TSs but also for providing their users with confidence about the reliability of their contents. Formal methods have the potential to play an important role in the audit of TSs, although there are few empirical studies to assess the benefits of using these methods. In this paper we propose a method based on description logics (DLs) for the audit of TSs. This method is based on the migration of the medical TS from a frame-based representation to a DL-based one. Our method is characterized by a process in which initially stringent assumptions are made about concept definitions. The assumptions allow the detection of concepts and relations that might comprise a source of logical inconsistency. If the assumptions hold then definitions are to be altered to eliminate the inconsistency, otherwise the assumptions are revised. In order to demonstrate the utility of the approach in a real-world case study we audit a TS in the intensive care domain and discuss decisions pertaining to building DL-based representations. This case study demonstrates that certain types of inconsistencies can indeed be detected by applying the method to a medical terminological system. The added value of the method described in this paper is that it provides a means to evaluate the compliance to a number of common modeling principles in a formal manner. The proposed method reveals potential modeling inconsistencies, helping to audit and (if possible) improve the medical TS. In this way, it contributes to providing confidence in the contents of the terminological system.
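To make the role of the "stringent assumptions" concrete, consider a hypothetical description logic fragment (invented for illustration, not taken from the audited terminology). If the audit assumes that sibling concepts are pairwise disjoint,

    Bacterial ⊓ Viral ⊑ ⊥,

then a migrated concept that ends up classified under both,

    BacterialViralPneumonia ⊑ Bacterial ⊓ Viral,

is unsatisfiable and is flagged by the DL reasoner; the auditor then either corrects the concept definition or revises the disjointness assumption, which is exactly the iterative process the method describes.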
Definition and determination of the triplet-triplet energy transfer reaction coordinate.
Zapata, Felipe; Marazzi, Marco; Castaño, Obis; Acuña, A Ulises; Frutos, Luis Manuel
2014-01-21
A definition of the triplet-triplet energy transfer reaction coordinate within the very weak electronic coupling limit is proposed, and a novel theoretical formalism is developed for its quantitative determination in terms of internal coordinates. The present formalism permits (i) the separation of donor and acceptor contributions to the reaction coordinate, (ii) the identification of the intrinsic role of donor and acceptor in the triplet energy transfer process, and (iii) the quantification of the effect of every internal coordinate on the transfer process. This formalism is general and can be applied to classical as well as to nonvertical triplet energy transfer processes. The utility of the novel formalism is demonstrated here by its application to the paradigm of nonvertical triplet-triplet energy transfer involving cis-stilbene as acceptor molecule. In this way the effect of each internal molecular coordinate in promoting the transfer rate, from triplet donors in the low- and high-energy limit, could be analyzed in detail.
Connecting different TMD factorization formalisms in QCD
Collins, John; Rogers, Ted C.
2017-09-11
In the original Collins-Soper-Sterman (CSS) presentation of the results of transverse-momentum-dependent (TMD) factorization for the Drell-Yan process, results for perturbative coefficients can be obtained from calculations for collinear factorization. Here we show how to use these results, plus known results for the quark form factor, to obtain coefficients for TMD factorization in more recent formulations, e.g., that due to Collins, and apply them to known results at order α_s^2 and α_s^3. We also show that the "non-perturbative" functions as obtained from fits to data are equal in the two schemes. We compile the higher-order perturbative inputs needed for the updated CSS scheme by appealing to results obtained in a variety of different formalisms. In addition, we derive the connection between both versions of the CSS formalism and several formalisms based in soft-collinear effective theory (SCET). As a result, our work uses some important new results for factorization for the quark form factor, which we derive.
Connecting different TMD factorization formalisms in QCD
NASA Astrophysics Data System (ADS)
Collins, John; Rogers, Ted C.
2017-09-01
In the original Collins-Soper-Sterman (CSS) presentation of the results of transverse-momentum-dependent (TMD) factorization for the Drell-Yan process, results for perturbative coefficients can be obtained from calculations for collinear factorization. Here we show how to use these results, plus known results for the quark form factor, to obtain coefficients for TMD factorization in more recent formulations, e.g., that due to Collins, and apply them to known results at order α_s^2 and α_s^3. We also show that the "nonperturbative" functions as obtained from fits to data are equal in the two schemes. We compile the higher-order perturbative inputs needed for the updated CSS scheme by appealing to results obtained in a variety of different formalisms. In addition, we derive the connection between both versions of the CSS formalism and several formalisms based in soft-collinear effective theory (SCET). Our work uses some important new results for factorization for the quark form factor, which we derive.
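For orientation, the object being matched across schemes in these records is the TMD factorization formula for Drell-Yan, shown schematically (scheme-dependent details suppressed):

    \frac{d\sigma}{d^2\mathbf{q}_T\,dQ\,dy} \propto \int \frac{d^2\mathbf{b}}{(2\pi)^2}\, e^{i\mathbf{q}_T\cdot\mathbf{b}}\; \tilde{f}_{q/A}(x_A, b_T;\mu,\zeta_A)\,\tilde{f}_{\bar{q}/B}(x_B, b_T;\mu,\zeta_B) + Y(q_T, Q),

where the perturbative coefficients and the nonperturbative large-b_T behavior of the TMD functions \tilde{f} are the scheme-dependent inputs whose equality across formulations the paper establishes.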
Definition and determination of the triplet-triplet energy transfer reaction coordinate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zapata, Felipe; Marazzi, Marco; Castaño, Obis
2014-01-21
A definition of the triplet-triplet energy transfer reaction coordinate within the very weak electronic coupling limit is proposed, and a novel theoretical formalism is developed for its quantitative determination in terms of internal coordinates. The present formalism permits (i) the separation of donor and acceptor contributions to the reaction coordinate, (ii) the identification of the intrinsic role of donor and acceptor in the triplet energy transfer process, and (iii) the quantification of the effect of every internal coordinate on the transfer process. This formalism is general and can be applied to classical as well as to nonvertical triplet energy transfer processes. The utility of the novel formalism is demonstrated here by its application to the paradigm of nonvertical triplet-triplet energy transfer involving cis-stilbene as acceptor molecule. In this way the effect of each internal molecular coordinate in promoting the transfer rate, from triplet donors in the low- and high-energy limit, could be analyzed in detail.
User Interface Technology for Formal Specification Development
NASA Technical Reports Server (NTRS)
Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Lum, Henry, Jr. (Technical Monitor)
1994-01-01
Formal specification development and modification are an essential component of the knowledge-based software life cycle. User interface technology is needed to empower end-users to create their own formal specifications. This paper describes the advanced user interface for AMPHION, a knowledge-based software engineering system that targets scientific subroutine libraries. AMPHION is a generic, domain-independent architecture that is specialized to an application domain through a declarative domain theory. Formal specification development and reuse is made accessible to end-users through an intuitive graphical interface that provides semantic guidance in creating diagrams denoting formal specifications in an application domain. The diagrams also serve to document the specifications. Automatic deductive program synthesis ensures that end-user specifications are correctly implemented. The tables that drive AMPHION's user interface are automatically compiled from a domain theory; portions of the interface can be customized by the end-user. The user interface facilitates formal specification development by hiding syntactic details, such as logical notation. It also turns some of the barriers for end-user specification development associated with strongly typed formal languages into active sources of guidance, without restricting advanced users. The interface is especially suited for specification modification. AMPHION has been applied to the domain of solar system kinematics through the development of a declarative domain theory. Testing over six months with planetary scientists indicates that AMPHION's interactive specification acquisition paradigm enables users to develop, modify, and reuse specifications at least an order of magnitude more rapidly than manual program development.
Bishop, Felicity L
2015-02-01
To outline some of the challenges of mixed methods research and illustrate how they can be addressed in health psychology research. This study critically reflects on the author's previously published mixed methods research and discusses the philosophical and technical challenges of mixed methods, grounding the discussion in a brief review of methodological literature. Mixed methods research is characterized as having philosophical and technical challenges; the former can be addressed by drawing on pragmatism, the latter by considering formal mixed methods research designs proposed in a number of design typologies. There are important differences among the design typologies, which provide diverse examples of designs that health psychologists can adapt for their own mixed methods research. There are also similarities; in particular, many typologies explicitly orient to the technical challenges of deciding on the respective timing of qualitative and quantitative methods and the relative emphasis placed on each method. Characteristics, strengths, and limitations of different sequential and concurrent designs are identified by reviewing five mixed methods projects, each conducted for a different purpose. Adapting formal mixed methods designs can help health psychologists address the technical challenges of mixed methods research and identify the approach that best fits the research questions and purpose. This does not obviate the need to address the philosophical challenges of mixing qualitative and quantitative methods.
Statement of contribution. What is already known on this subject? Mixed methods research poses philosophical and technical challenges. Pragmatism is a popular approach to the philosophical challenges, while diverse typologies of mixed methods designs can help address the technical challenges. Examples of mixed methods research can be hard to locate when component studies from mixed methods projects are published separately. What does this study add? Critical reflections on the author's previously published mixed methods research illustrate how a range of different mixed methods designs can be adapted and applied to address health psychology research questions. The philosophical and technical challenges of mixed methods research should be considered together and in relation to the broader purpose of the research. © 2014 The British Psychological Society.
A Benchmark for Comparing Different Approaches for Specifying and Verifying Real-Time Systems
1993-01-01
To be considered correct or useful, real-time systems must deliver results within specified time intervals, either without exception or with high probability. Recently, a large number of formal methods have been invented for specifying and verifying real-time systems. It has been suggested that these formal methods need to be tested out on actual real-time systems. Such testing will allow the scalability of the methods to be assessed and also…
Model Checking Artificial Intelligence Based Planners: Even the Best Laid Plans Must Be Verified
NASA Technical Reports Server (NTRS)
Smith, Margaret H.; Holzmann, Gerard J.; Cucullu, Gordon C., III; Smith, Benjamin D.
2005-01-01
Automated planning systems (APS) are gaining acceptance for use on NASA missions, as evidenced by APS flown on missions such as Orbiter and Deep Space 1, both of which were commanded by onboard planning systems. The planning system takes high-level goals and expands them onboard into a detailed plan of action that the spacecraft executes. The system must be verified to ensure that the automatically generated plans achieve the goals as expected and do not generate actions that would harm the spacecraft or mission. These systems are typically tested using empirical methods. Formal methods, such as model checking, offer exhaustive or measurable test coverage, which leads to much greater confidence in correctness. This paper describes a formal method based on the SPIN model checker. This method guarantees that possible plans meet certain desirable properties. We express the input model in Promela, the language of SPIN, and express the properties of desirable plans formally.
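The essence of the approach, exhaustive exploration of the reachable state space against a property, can be illustrated with a toy explicit-state safety check. This Python sketch is in the spirit of SPIN's reachability analysis but is not SPIN itself; real models would be written in Promela and checked by the tool:

    from collections import deque

    def check_safety(initial, successors, is_bad):
        """BFS over the state graph; returns a counterexample path or None."""
        frontier = deque([(initial, [initial])])
        seen = {initial}
        while frontier:
            state, path = frontier.popleft()
            if is_bad(state):
                return path                  # trace violating the property
            for nxt in successors(state):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, path + [nxt]))
        return None                          # property holds on all reachable states

    # Hypothetical example: two counters must never both reach 2.
    succ = lambda s: [(min(s[0] + 1, 2), s[1]), (s[0], min(s[1] + 1, 2))]
    print(check_safety((0, 0), succ, lambda s: s == (2, 2)))

SPIN makes the same idea scale with techniques such as partial-order reduction and bitstate hashing, and it checks temporal-logic properties rather than a single bad-state predicate.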
Legaz-García, María Del Carmen; Dentler, Kathrin; Fernández-Breis, Jesualdo Tomás; Cornet, Ronald
2017-01-01
ArchMS is a framework that represents clinical information and knowledge using ontologies in OWL, which facilitates semantic interoperability and thereby the exploitation and secondary use of clinical data. However, it does not yet support the automated assessment of quality of care. CLIF is a stepwise method to formalize quality indicators. The method has been implemented in the CLIF tool which supports its users in generating computable queries based on a patient data model which can be based on archetypes. To enable the automated computation of quality indicators using ontologies and archetypes, we tested whether ArchMS and the CLIF tool can be integrated. We successfully automated the process of generating SPARQL queries from quality indicators that have been formalized with CLIF and integrated them into ArchMS. Hence, ontologies and archetypes can be combined for the execution of formalized quality indicators.
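A minimal sketch of the final step, executing a formalized indicator as a SPARQL query over an RDF patient graph, is given below; the file name, namespace and properties are hypothetical stand-ins, not the actual ArchMS or CLIF vocabulary:

    from rdflib import Graph

    # Load an (assumed) RDF export of patient records and run an indicator query.
    g = Graph()
    g.parse("patients.ttl", format="turtle")

    query = """
    PREFIX ex: <http://example.org/clinical#>
    SELECT (COUNT(?p) AS ?numerator) WHERE {
        ?p a ex:DiabetesPatient ;
           ex:hasMeasurement [ a ex:HbA1c ; ex:value ?v ] .
        FILTER(?v < 7.0)
    }
    """
    for row in g.query(query):
        print("Patients meeting the indicator:", row.numerator)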
Formal Foundations for Hierarchical Safety Cases
NASA Technical Reports Server (NTRS)
Denney, Ewen; Pai, Ganesh; Whiteside, Iain
2015-01-01
Safety cases are increasingly being required in many safety-critical domains to assure, using structured argumentation and evidence, that a system is acceptably safe. However, comprehensive system-wide safety arguments present appreciable challenges to develop, understand, evaluate, and manage, partly due to the volume of information that they aggregate, such as the results of hazard analysis, requirements analysis, testing, formal verification, and other engineering activities. Previously, we have proposed hierarchical safety cases, hicases, to aid the comprehension of safety case argument structures. In this paper, we build on a formal notion of safety case to formalise the use of hierarchy as a structuring technique, and show that hicases satisfy several desirable properties. Our aim is to provide a formal, theoretical foundation for safety cases. In particular, we believe that tools for high assurance systems should be granted similar assurance to the systems to which they are applied. To this end, we formally specify and prove the correctness of key operations for constructing and managing hicases, which gives the specification for implementing hicases in AdvoCATE, our toolset for safety case automation. We motivate and explain the theory with the help of a simple running example, extracted from a real safety case and developed using AdvoCATE.
NASA Astrophysics Data System (ADS)
Xu, Feng; Davis, Anthony B.; Diner, David J.
2016-11-01
A Markov chain formalism is developed for computing the transport of polarized radiation according to Generalized Radiative Transfer (GRT) theory, which was developed recently to account for unresolved random fluctuations of scattering particle density and can also be applied to unresolved spectral variability of gaseous absorption as an improvement over the standard correlated-k method. Using a Gamma distribution to describe the probability density function of the extinction or absorption coefficient, a shape parameter a is introduced that quantifies the variability, defined as the square of the mean extinction or absorption coefficient divided by its variance. It controls the decay rate of a power-law transmission that replaces the usual exponential Beer-Lambert-Bouguer law. Exponential transmission, hence classic RT, is recovered when a → ∞. The new approach is verified to high accuracy against numerical benchmark results obtained with a custom Monte Carlo method. For a < ∞, angular reciprocity is violated to a degree that increases with the spatial variability, as observed for finite portions of real-world cloudy scenes. While the degree of linear polarization in liquid water cloudbows, supernumerary bows, and glories is affected by spatial heterogeneity, the positions in scattering angle of these features are relatively unchanged. As a result, a single-scattering model based on the assumption of subpixel homogeneity can still be used to derive droplet size distributions from polarimetric measurements of extended stratocumulus clouds.
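The power-law transmission mentioned in this record follows from averaging the Beer-Lambert-Bouguer law over the assumed Gamma distribution of the extinction coefficient; this is the standard GRT result, reproduced here for clarity. If k is Gamma-distributed with mean ⟨k⟩ and shape a = ⟨k⟩²/Var(k), then

    T_a(\tau) = \left\langle e^{-k\tau/\langle k\rangle} \right\rangle = \left(1 + \frac{\tau}{a}\right)^{-a} \longrightarrow e^{-\tau} \quad (a \to \infty),

so stronger unresolved variability (smaller a) gives a slower-than-exponential decay of transmission, which is precisely the behavior the Markov chain formalism propagates through multiple scattering.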