Science.gov

Sample records for knowledge base verification

  1. Knowledge-based system verification and validation

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1990-01-01

    The objective of this task is to develop and evaluate a methodology for verification and validation (V&V) of knowledge-based systems (KBS) for space station applications with high reliability requirements. The approach consists of three interrelated tasks. The first task is to evaluate the effectiveness of various validation methods for space station applications. The second task is to recommend requirements for KBS V&V for Space Station Freedom (SSF). The third task is to recommend modifications to the SSF to support the development of KBS using effective software engineering and validation techniques. To accomplish the first task, three complementary techniques will be evaluated: (1) Sensitivity Analysis (Worcester Polytechnic Institute); (2) Formal Verification of Safety Properties (SRI International); and (3) Consistency and Completeness Checking (Lockheed AI Center). During FY89 and FY90, each contractor will independently demonstrate the use of his technique on the fault detection, isolation, and reconfiguration (FDIR) KBS for the manned maneuvering unit (MMU), a rule-based system implemented in LISP. During FY91, the application of each of the techniques to other knowledge representations and KBS architectures will be addressed. After evaluation of the results of the first task and examination of Space Station Freedom V&V requirements for conventional software, a comprehensive KBS V&V methodology will be developed and documented. Development of highly reliable KBS's cannot be accomplished without effective software engineering methods. Using the results of current in-house research to develop and assess software engineering methods for KBS's, as well as assessment of techniques being developed elsewhere, an effective software engineering methodology for space station KBS's will be developed, and modification of the SSF to support these tools and methods will be addressed.

  2. Verification of knowledge bases based on containment checking

    SciTech Connect

    Levy, A.Y.; Rousset, M.C.

    1996-12-31

    Building complex knowledge-based applications requires encoding large amounts of domain knowledge. After acquiring knowledge from domain experts, much of the effort in building a knowledge base goes into verifying that the knowledge is encoded correctly. We consider the problem of verifying hybrid knowledge bases that contain both Horn rules and a terminology in a description logic. Our approach to the verification problem is based on showing a close relationship to the problem of query containment. Our first contribution, based on this relationship, is a thorough analysis of the decidability and complexity of the verification problem for knowledge bases containing recursive rules and the interpreted predicates =, ≤, < and ≠. Second, we show that important new classes of constraints on correct inputs and outputs can be expressed in a hybrid setting, in which a description logic class hierarchy is also considered, and we present the first complete algorithm for verifying such hybrid knowledge bases.
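
    As an aside on the containment reduction described above, a minimal sketch in Python (assuming plain conjunctive queries with no recursion, interpreted predicates, or description-logic component, unlike the record) tests containment by brute-force search for a containment mapping; the query encoding is an illustrative assumption.

    ```python
    from itertools import product

    def is_var(t):
        # convention assumed here: variables are strings starting with an uppercase letter
        return isinstance(t, str) and t[:1].isupper()

    def contained_in(q1, q2):
        """True if q1 is contained in q2 (every answer of q1 is an answer of q2) for
        plain conjunctive queries, via brute-force search for a containment mapping
        from q2 to q1.  A query is (head_args, body); each body atom is (pred, args)."""
        head1, body1 = q1
        head2, body2 = q2
        atoms1 = {(p, tuple(a)) for p, a in body1}
        vars2 = sorted({t for _, args in body2 for t in args if is_var(t)} |
                       {t for t in head2 if is_var(t)})
        terms1 = sorted({t for _, args in body1 for t in args} | set(head1))
        for image in product(terms1, repeat=len(vars2)):
            h = dict(zip(vars2, image))
            subst = lambda t: h.get(t, t)
            if tuple(map(subst, head2)) != tuple(head1):
                continue
            if all((p, tuple(map(subst, args))) in atoms1 for p, args in body2):
                return True
        return False

    # q1(X) :- edge(X, Y), edge(Y, Z)   is contained in   q2(X) :- edge(X, W)
    q1 = (("X",), [("edge", ("X", "Y")), ("edge", ("Y", "Z"))])
    q2 = (("X",), [("edge", ("X", "W"))])
    print(contained_in(q1, q2))  # True
    ```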

  3. Verification of Legal Knowledge-base with Conflictive Concept

    NASA Astrophysics Data System (ADS)

    Hagiwara, Shingo; Tojo, Satoshi

    In this paper, we propose a verification methodology for large-scale legal knowledge. With a revision of a legal code, we are forced to revise other affected code as well to keep the law consistent. Thus, our task is to revise the affected area properly and to investigate its adequacy. In this study, we extend the notion of inconsistency beyond ordinary logical inconsistency to include conceptual conflicts. We obtain these conflicts from taxonomy data, and thus we can avoid tedious manual declarations of opposing terms. In the verification process, we adopt extended disjunctive logic programming (EDLP) to tolerate multiple consequences for a given set of antecedents. In addition, we employ abductive logic programming (ALP), regarding the situations to which the rules are applied as premises. Also, we restrict a legal knowledge base to an acyclic program to avoid circular definitions and to justify the relevance of verdicts; therefore, detecting cyclic parts of legal knowledge is one of our objectives. The system is composed of two subsystems: a preprocessor implemented in Ruby to facilitate string manipulation, and a verifier implemented in Prolog to perform the logical inference. We also employ an XML format in the system to retain readability. In this study, we verify actual ordinance code of Toyama prefecture and show the experimental results.
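
    The acyclicity requirement mentioned in this record can be illustrated with a small sketch (not the authors' Ruby/Prolog system): a depth-first search over an assumed predicate dependency graph that reports a cyclic chain of definitions. The predicate names below are hypothetical.

    ```python
    def find_cycle(rules):
        """Detect a cycle in the predicate dependency graph of a rule base.
        `rules` maps each defined predicate to the set of predicates in its body.
        Returns a list of predicates forming a cycle, or None if the program is acyclic."""
        WHITE, GREY, BLACK = 0, 1, 2
        color = {p: WHITE for p in rules}
        stack = []

        def dfs(p):
            color[p] = GREY
            stack.append(p)
            for q in rules.get(p, ()):
                if color.get(q, WHITE) == GREY:        # back edge: q is on the current path
                    return stack[stack.index(q):] + [q]
                if color.get(q, WHITE) == WHITE and q in rules:
                    found = dfs(q)
                    if found:
                        return found
            stack.pop()
            color[p] = BLACK
            return None

        for p in rules:
            if color[p] == WHITE:
                found = dfs(p)
                if found:
                    return found
        return None

    # Hypothetical ordinance predicates: 'permit_required' is (indirectly) defined
    # in terms of itself, which the verifier should report as a circular definition.
    rules = {
        "permit_required": {"restricted_area", "exception_applies"},
        "exception_applies": {"permit_required"},
        "restricted_area": set(),
    }
    print(find_cycle(rules))  # e.g. ['permit_required', 'exception_applies', 'permit_required']
    ```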

  4. Intelligent Tools for Planning Knowledge base Development and Verification

    NASA Technical Reports Server (NTRS)

    Chien, Steve A.

    1996-01-01

    A key obstacle hampering fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must be able to compare favorably in terms of software lifecycle costs to other means of automation such as scripts or rule-based expert systems.

  5. Knowledge Based Systems (KBS) Verification, Validation, Evaluation, and Testing (VVE&T) Bibliography: Topical Categorization

    DTIC Science & Technology

    2003-03-01

    Different?," Jour. of Experimental & Theoretical Artificial Intelligence, Special Issue on AI for Systems Validation and Verification, 12(4), 2000, pp... Hamilton, D., "Experiences in Improving the State of Practice in Verification and Validation of Knowledge-Based Systems," Workshop Notes of the AAAI... Unsuspected Power of the Standard Turing Test," Jour. of Experimental & Theoretical Artificial Intelligence, 12, 2000, pp. 331-340. [30] Gaschnig

  6. Use of metaknowledge in the verification of knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Morell, Larry J.

    1989-01-01

    Knowledge-based systems are modeled as deductive systems. The model indicates that the two primary areas of concern in verification are demonstrating consistency and completeness. A system is inconsistent if it asserts something that is not true of the modeled domain. A system is incomplete if it lacks deductive capability. Two forms of consistency are discussed along with appropriate verification methods. Three forms of incompleteness are discussed. The use of metaknowledge, knowledge about knowledge, is explored in connection to each form of incompleteness.
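
    A minimal sketch of the consistency check described here, assuming a propositional Horn rule base and a 'not_' naming convention invented for this illustration: forward-chain to the deductive closure and report any literal that is derivable together with its negation.

    ```python
    def forward_closure(facts, rules):
        """Deductive closure of propositional Horn rules.
        `rules` is a list of (premises, conclusion) pairs; literals are strings,
        with the prefix 'not_' marking a negated conclusion."""
        derived = set(facts)
        changed = True
        while changed:
            changed = False
            for premises, conclusion in rules:
                if set(premises) <= derived and conclusion not in derived:
                    derived.add(conclusion)
                    changed = True
        return derived

    def inconsistencies(derived):
        """Pairs of derivable literals that contradict each other."""
        return [(p, "not_" + p) for p in derived if "not_" + p in derived]

    # Hypothetical rule base: both 'alarm' and 'not_alarm' become derivable, so the
    # system asserts something that cannot be true of the modeled domain.
    rules = [
        (["sensor_high"], "alarm"),
        (["sensor_high", "maintenance_mode"], "not_alarm"),
    ]
    closure = forward_closure({"sensor_high", "maintenance_mode"}, rules)
    print(inconsistencies(closure))  # [('alarm', 'not_alarm')]
    ```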

  7. Ontology-Based Empirical Knowledge Verification for Professional Virtual Community

    ERIC Educational Resources Information Center

    Chen, Yuh-Jen

    2011-01-01

    A professional virtual community provides an interactive platform for enterprise experts to create and share their empirical knowledge cooperatively, and the platform contains a tremendous amount of hidden empirical knowledge that knowledge experts have preserved in the discussion process. Therefore, enterprise knowledge management highly…

  8. Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris

    2010-01-01

    Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.

  9. Systems, methods and apparatus for verification of knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Rash, James L. (Inventor); Erickson, John D. (Inventor); Gracanin, Denis (Inventor); Rouff, Christopher A. (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments, domain knowledge is translated into a knowledge-based system. In some embodiments, a formal specification is derived from rules of a knowledge-based system, the formal specification is analyzed, and flaws in the formal specification are used to identify and correct errors in the domain knowledge, from which a knowledge-based system is translated.

  10. Offline signature verification and skilled forgery detection using HMM and sum graph features with ANN and knowledge based classifier

    NASA Astrophysics Data System (ADS)

    Mehta, Mohit; Choudhary, Vijay; Das, Rupam; Khan, Ilyas

    2010-02-01

    Signature verification is one of the most widely researched areas in document analysis and signature biometrics. Various methodologies have been proposed in this area for accurate signature verification and forgery detection. In this paper we propose a unique two-stage model for detecting skilled forgery in signatures by combining two feature types, namely sum graph features and an HMM model of signature generation, and classifying them with a knowledge-based classifier and a probabilistic neural network. We propose the technique of using the HMM as a feature rather than as a classifier, the role in which it is widely used in signature recognition. Results show a higher false rejection rate than false acceptance rate. The system detects forgeries with an accuracy of 80% and verifies genuine signatures with 91% accuracy. The two-stage model can be used in realistic signature biometric applications such as banking, where the authenticity of a signature must be verified before processing documents such as checks.

  11. Enhanced high-level Petri nets with multiple colors for knowledge verification/validation of rule-based expert systems.

    PubMed

    Wu, C H; Lee, S J

    1997-01-01

    Exploring the properties of rule-based expert systems through Petri net models has received a lot of attention. Traditional Petri nets provide a straightforward but inadequate method for knowledge verification/validation of rule-based expert systems. We propose an enhanced high-level Petri net model in which variables and negative information can be represented and processed properly. Rule inference is modeled exactly, and some important aspects of rule-based systems (RBSs), such as conservation of facts, refraction, and the closed-world assumption, are considered in this model. With the coloring scheme proposed in this paper, the tasks involved in checking the logic structure and output correctness of an RBS are formally investigated. We focus on the detection of redundancy, conflicts, cycles, unnecessary conditions, dead ends, and unreachable goals in an RBS. These knowledge verification/validation (KVV) tasks are formulated as the reachability problem, and improper knowledge can be detected by solving a set of equations with respect to multiple colors. The complexity of our method is discussed and a comparison of our model with other Petri net models is presented.
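
    A much-simplified propositional analogue of two of these checks (unreachable goals and dead ends), sketched without the Petri-net machinery or coloring scheme of the record; the rule base and fact names are hypothetical.

    ```python
    def analyse_rule_base(inputs, goals, rules):
        """Reachability-style KVV checks on a propositional rule base: goals that can
        never be derived from legal inputs, and conclusions that are neither goals nor
        used by any other rule.  `rules` is a list of (premises, conclusion) pairs."""
        reachable = set(inputs)
        changed = True
        while changed:                      # forward reachability from the input facts
            changed = False
            for premises, conclusion in rules:
                if set(premises) <= reachable and conclusion not in reachable:
                    reachable.add(conclusion)
                    changed = True

        used_as_premise = {p for premises, _ in rules for p in premises}
        unreachable_goals = [g for g in goals if g not in reachable]
        dead_ends = [c for _, c in rules if c not in goals and c not in used_as_premise]
        return unreachable_goals, dead_ends

    # Hypothetical fragment: 'shutdown' can never fire because 'leak_confirmed' is not
    # derivable from the inputs, and 'log_event' leads nowhere.
    rules = [
        (["pressure_high"], "open_relief_valve"),
        (["pressure_high"], "log_event"),
        (["leak_confirmed"], "shutdown"),
    ]
    print(analyse_rule_base(inputs={"pressure_high"},
                            goals={"open_relief_valve", "shutdown"},
                            rules=rules))
    # (['shutdown'], ['log_event'])
    ```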

  12. Static and Completion Analysis for Planning Knowledge Base Development and Verification

    NASA Technical Reports Server (NTRS)

    Chien, Steve A.

    1996-01-01

    A key obstacle hampering fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must be able to compare favorably in terms of software lifecycle costs to other means of automation such as scripts or rule-based expert systems.

  13. Experiences in improving the state of the practice in verification and validation of knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Culbert, Chris; French, Scott W.; Hamilton, David

    1994-01-01

    Knowledge-based systems (KBS's) are in general use in a wide variety of domains, both commercial and government. As reliance on these types of systems grows, the need to assess their quality and validity reaches critical importance. As with any software, the reliability of a KBS can be directly attributed to the application of disciplined programming and testing practices throughout the development life-cycle. However, there are some essential differences between conventional software and KBSs, both in construction and use. The identification of these differences affects the verification and validation (V&V) process and the development of techniques to handle them. The recognition of these differences is the basis of considerable on-going research in this field. For the past three years IBM (Federal Systems Company - Houston) and the Software Technology Branch (STB) of NASA/Johnson Space Center have been working to improve the 'state of the practice' in V&V of knowledge-based systems. This work was motivated by the need to maintain NASA's ability to produce high quality software while taking advantage of new KBS technology. To date, the primary accomplishment has been the development and teaching of a four-day workshop on KBS V&V. With the hope of improving the impact of these workshops, we also worked directly with NASA KBS projects to employ concepts taught in the workshop. This paper describes two projects that were part of this effort. In addition to describing each project, this paper describes problems encountered and solutions proposed in each case, with particular emphasis on implications for transferring KBS V&V technology beyond the NASA domain.

  14. Guidelines for the verification and validation of expert system software and conventional software: Evaluation of knowledge base certification methods. Volume 4

    SciTech Connect

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-03-01

    This report presents the results of the Knowledge Base Certification activity of the expert systems verification and validation (V&V) guideline development project which is jointly funded by the US Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. This activity is concerned with the development and testing of various methods for assuring the quality of knowledge bases. The testing procedure used was a behavioral experiment, the first known such evaluation of any type of V&V activity. The value of such experimentation is its capability to provide empirical evidence for -- or against -- the effectiveness of plausible methods in helping people find problems in knowledge bases. The three-day experiment included 20 participants from three nuclear utilities, the Nuclear Regulatory Commission's Technical Training Center, the University of Maryland, EG&G Idaho, and SAIC. The study used two real nuclear expert systems: a boiling water reactor emergency operating procedures tracking system and a pressurized water reactor safety assessment system. Ten participants were assigned to each of the expert systems. All participants were trained in and then used a sequence of four different V&V methods selected as being the best and most appropriate for study on the basis of prior evaluation activities. These methods either involved the analysis and tracing of requirements to elements in the knowledge base (requirements grouping and requirements tracing) or else involved direct inspection of the knowledge base for various kinds of errors. Half of the subjects within each system group used the best manual variant of the V&V methods (the control group), while the other half were supported by the results of applying real or simulated automated tools to the knowledge bases (the experimental group).

  15. Wavelet Features Based Fingerprint Verification

    NASA Astrophysics Data System (ADS)

    Bagadi, Shweta U.; Thalange, Asha V.; Jain, Giridhar P.

    2010-11-01

    In this work, we present an automatic fingerprint identification system based on Level 3 features. Systems based only on minutiae features do not perform well for poor quality images. In practice, we often encounter extremely dry or wet fingerprint images with cuts, warts, etc. Due to such fingerprints, minutiae-based systems show poor performance for real-time authentication applications. To alleviate the problem of poor quality fingerprints and to improve the overall performance of the system, this paper proposes fingerprint verification based on wavelet statistical features and co-occurrence matrix features. The features include mean, standard deviation, energy, entropy, contrast, local homogeneity, cluster shade, cluster prominence, and information measure of correlation. In this method, matching can be done between the input image and the stored template without exhaustive search using the extracted features. The wavelet-transform-based approach is better than the existing minutiae-based method, takes less response time, and is hence suitable for on-line verification with high accuracy.
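
    A rough sketch of the wavelet statistical features described above, assuming a one-level Haar decomposition as the wavelet and omitting the co-occurrence-matrix features; the exact feature set and the matching distance are illustrative choices, not the authors' configuration.

    ```python
    import numpy as np

    def haar_subbands(img):
        """One-level 2-D Haar decomposition (approximation plus horizontal, vertical
        and diagonal detail sub-bands) in plain NumPy; img must have even dimensions."""
        a = (img[0::2, :] + img[1::2, :]) / 2.0   # row averages
        d = (img[0::2, :] - img[1::2, :]) / 2.0   # row differences
        LL = (a[:, 0::2] + a[:, 1::2]) / 2.0
        LH = (a[:, 0::2] - a[:, 1::2]) / 2.0
        HL = (d[:, 0::2] + d[:, 1::2]) / 2.0
        HH = (d[:, 0::2] - d[:, 1::2]) / 2.0
        return LL, LH, HL, HH

    def wavelet_stats(img):
        """Per-sub-band mean, standard deviation, energy and entropy."""
        feats = []
        for band in haar_subbands(img.astype(float)):
            e = band ** 2
            p = e / (e.sum() + 1e-12)              # normalised energy distribution
            entropy = -(p * np.log2(p + 1e-12)).sum()
            feats += [band.mean(), band.std(), e.mean(), entropy]
        return np.array(feats)

    # Matching can then be a simple distance between feature vectors, avoiding an
    # exhaustive minutiae search.
    rng = np.random.default_rng(0)
    probe, template = rng.random((128, 128)), rng.random((128, 128))
    print(np.linalg.norm(wavelet_stats(probe) - wavelet_stats(template)))
    ```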

  16. Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS) Code Verification and Validation Data Standards and Requirements: Fluid Dynamics Version 1.0

    SciTech Connect

    Greg Weirs; Hyung Lee

    2011-09-01

    V&V and UQ are the primary means to assess the accuracy and reliability of M&S and, hence, to establish confidence in M&S. Though other industries are establishing standards and requirements for the performance of V&V and UQ, at present, the nuclear industry has not established such standards or requirements. However, the nuclear industry is beginning to recognize that such standards are needed and that the resources needed to support V&V and UQ will be very significant. In fact, no single organization has sufficient resources or expertise required to organize, conduct and maintain a comprehensive V&V and UQ program. What is needed is a systematic and standardized approach to establish and provide V&V and UQ resources at a national or even international level, with a consortium of partners from government, academia and industry. Specifically, what is needed is a structured and cost-effective knowledge base that collects, evaluates and stores verification and validation data, and shows how it can be used to perform V&V and UQ, leveraging collaboration and sharing of resources to support existing engineering and licensing procedures as well as science-based V&V and UQ processes. The Nuclear Energy Knowledge base for Advanced Modeling and Simulation (NE-KAMS) is being developed at the Idaho National Laboratory in conjunction with Bettis Laboratory, Sandia National Laboratories, Argonne National Laboratory, Utah State University and others with the objective of establishing a comprehensive and web-accessible knowledge base to provide V&V and UQ resources for M&S for nuclear reactor design, analysis and licensing. The knowledge base will serve as an important resource for technical exchange and collaboration that will enable credible and reliable computational models and simulations for application to nuclear power. NE-KAMS will serve as a valuable resource for the nuclear industry, academia, the national laboratories, the U.S. Nuclear Regulatory Commission (NRC) and

  17. Empirical Analysis and Refinement of Expert System Knowledge Bases.

    DTIC Science & Technology

    1987-11-30

    Knowledge base refinement is the modification of an existing expert system knowledge base with the goals of localizing specific weaknesses in a... expert system techniques for knowledge acquisition, knowledge base refinement, maintenance, and verification....on the related problems of knowledge base acquisition, maintenance, verification, and learning from experience. The SEEK system was the first expert

  18. Video-based fingerprint verification.

    PubMed

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-09-04

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, "inside similarity" and "outside similarity" are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low.

  19. Video-Based Fingerprint Verification

    PubMed Central

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-01-01

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, “inside similarity” and “outside similarity” are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low. PMID:24008283
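
    The equal error rate reported in these two records can be computed from genuine and impostor match scores as in the sketch below; the toy score distributions stand in for the combined inside/outside similarities.

    ```python
    import numpy as np

    def far_frr(genuine, impostor, threshold):
        """False acceptance and false rejection rates at one threshold,
        assuming higher scores mean 'same finger'."""
        far = np.mean(np.asarray(impostor) >= threshold)
        frr = np.mean(np.asarray(genuine) < threshold)
        return far, frr

    def equal_error_rate(genuine, impostor):
        """Sweep candidate thresholds and return the point where FAR and FRR meet."""
        thresholds = np.unique(np.concatenate([genuine, impostor]))
        rates = [far_frr(genuine, impostor, t) for t in thresholds]
        far, frr = min(rates, key=lambda r: abs(r[0] - r[1]))
        return (far + frr) / 2.0

    rng = np.random.default_rng(1)
    genuine = rng.normal(0.8, 0.1, 1000)    # matching-video scores (toy)
    impostor = rng.normal(0.5, 0.1, 1000)   # non-matching-video scores (toy)
    print(equal_error_rate(genuine, impostor))
    ```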

  1. A Zero Knowledge Protocol For Nuclear Warhead Verification

    SciTech Connect

    Glaser, Alexander; Goldston, Robert J.

    2014-03-14

    The verification of nuclear warheads for arms control faces a paradox: International inspectors must gain high confidence in the authenticity of submitted items while learning nothing about them. Conventional inspection systems featuring 'information barriers', designed to hide measurements stored in electronic systems, are at risk of tampering and snooping. Here we show the viability of a fundamentally new approach to nuclear warhead verification that incorporates a zero-knowledge protocol, designed such that sensitive information is never measured and so does not need to be hidden. We interrogate submitted items with energetic neutrons, making, in effect, differential measurements of neutron transmission and emission. Calculations of diversion scenarios show that a high degree of discrimination can be achieved while revealing zero information. Timely demonstration of the viability of such an approach could be critical for the next round of arms-control negotiations, which will likely require verification of individual warheads, rather than whole delivery systems.

  2. Knowledge-Based Image Analysis.

    DTIC Science & Technology

    1981-04-01

    Report ETL-0258, "Knowledge-Based Image Analysis," by George C. Stockman, Barbara A. Lambird, David Lavine, and Laveen N. Kanal. Index terms: extraction, verification, region classification, pattern recognition, image analysis. (Scanned report form; abstract not legible.)

  3. Tools for Knowledge Acquisition and Verification in Medicine

    PubMed Central

    Mars, Nicolass J. I.; Miller, Perry L.

    1986-01-01

    Expert systems require large amounts of domain knowledge for non-trivial problem solving. Experience in developing such systems has shown that the process of acquiring domain knowledge and of determining whether the knowledge is consistent, complete, and correct is a major obstacle to wide introduction of knowledge-based systems. We discuss various tools which have been developed to assist in these two processes. These tools bring additional knowledge to bear, or provide better interfaces between a knowledge engineer and the knowledge-based system. The discussion of existing tools is preceded by a framework for classifying them, based on the additional knowledge used.

  4. A zero-knowledge protocol for nuclear warhead verification

    NASA Astrophysics Data System (ADS)

    Glaser, Alexander; Barak, Boaz; Goldston, Robert J.

    2014-06-01

    The verification of nuclear warheads for arms control involves a paradox: international inspectors will have to gain high confidence in the authenticity of submitted items while learning nothing about them. Proposed inspection systems featuring `information barriers', designed to hide measurements stored in electronic systems, are at risk of tampering and snooping. Here we show the viability of a fundamentally new approach to nuclear warhead verification that incorporates a zero-knowledge protocol, which is designed in such a way that sensitive information is never measured and so does not need to be hidden. We interrogate submitted items with energetic neutrons, making, in effect, differential measurements of both neutron transmission and emission. Calculations for scenarios in which material is diverted from a test object show that a high degree of discrimination can be achieved while revealing zero information. Our ideas for a physical zero-knowledge system could have applications beyond the context of nuclear disarmament. The proposed technique suggests a way to perform comparisons or computations on personal or confidential data without measuring the data in the first place.

  5. A zero-knowledge protocol for nuclear warhead verification.

    PubMed

    Glaser, Alexander; Barak, Boaz; Goldston, Robert J

    2014-06-26

    The verification of nuclear warheads for arms control involves a paradox: international inspectors will have to gain high confidence in the authenticity of submitted items while learning nothing about them. Proposed inspection systems featuring 'information barriers', designed to hide measurements stored in electronic systems, are at risk of tampering and snooping. Here we show the viability of a fundamentally new approach to nuclear warhead verification that incorporates a zero-knowledge protocol, which is designed in such a way that sensitive information is never measured and so does not need to be hidden. We interrogate submitted items with energetic neutrons, making, in effect, differential measurements of both neutron transmission and emission. Calculations for scenarios in which material is diverted from a test object show that a high degree of discrimination can be achieved while revealing zero information. Our ideas for a physical zero-knowledge system could have applications beyond the context of nuclear disarmament. The proposed technique suggests a way to perform comparisons or computations on personal or confidential data without measuring the data in the first place.
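
    A toy numerical illustration of the differential-measurement idea behind these records, leaving out the neutron physics entirely: if the inspector preloads each detector with the complement of the counts expected from a genuine item, an authentic item always yields the same flat total and reveals nothing, while a diverted item produces a detectable deviation. All numbers below are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    N = 10_000                                    # target total count per detector position
    template = rng.integers(2_000, 8_000, 16)     # counts expected from a genuine item (invented)
    preload = N - template                        # inspector's secret complement

    def inspect(item_counts):
        """Only preload + item counts is ever read out, never the item counts alone."""
        return preload + item_counts

    genuine = rng.poisson(template)                        # authentic item, with counting noise
    diverted = rng.poisson((template * 0.85).astype(int))  # toy scenario: 15% of material removed

    print(inspect(template))                       # exactly N at every position: zero information
    print(np.abs(inspect(genuine) - N).max())      # small, purely statistical deviation
    print(np.abs(inspect(diverted) - N).max())     # large, detectable deviation
    ```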

  6. Dynamic knowledge validation and verification for CBR teledermatology system.

    PubMed

    Ou, Monica H; West, Geoff A W; Lazarescu, Mihai; Clay, Chris

    2007-01-01

    Case-based reasoning has been of great importance in the development of many decision support applications. However, relatively little effort has gone into investigating how new knowledge can be validated. Knowledge validation is important in dealing with imperfect data collected over time, because inconsistencies in data do occur and adversely affect the performance of a diagnostic system. This paper consists of two parts. First, it describes methods that enable the domain expert, who may not be familiar with machine learning, to interactively validate the knowledge base of a Web-based teledermatology system. The validation techniques involve decision tree classification and formal concept analysis. Second, it describes techniques to discover unusual relationships hidden in the dataset for building and updating a comprehensive knowledge base, because the diagnostic performance of the system is highly dependent on the content thereof. Therefore, in order to classify different kinds of diseases, it is desirable to have a knowledge base that covers common as well as uncommon diagnoses. Evaluation results show that the knowledge validation techniques are effective in keeping the knowledge base consistent, and that the query refinement techniques are useful in improving the comprehensiveness of the case base.
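
    One plausible way a classifier can support this kind of knowledge validation, sketched with scikit-learn rather than the authors' system: train a decision tree on the case base and flag cases whose stored diagnosis disagrees with the learned structure as candidates for expert review. The case data below are synthetic.

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    # Toy case base: rows are cases (hypothetical binary findings), y is the stored diagnosis.
    rng = np.random.default_rng(0)
    X = rng.integers(0, 2, size=(200, 6))
    y = (X[:, 0] & X[:, 1]).astype(int)            # underlying "true" rule for the toy data
    y[:5] = 1 - y[:5]                              # inject five inconsistent cases

    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
    suspects = np.flatnonzero(tree.predict(X) != y)   # cases contradicting the learned structure
    print("cases flagged for expert review:", suspects)
    ```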

  7. Knowledge-Based Aircraft Automation: Managers Guide on the use of Artificial Intelligence for Aircraft Automation and Verification and Validation Approach for a Neural-Based Flight Controller

    NASA Technical Reports Server (NTRS)

    Broderick, Ron

    1997-01-01

    The ultimate goal of this report was to integrate the powerful tools of artificial intelligence into the traditional process of software development. To maintain the US aerospace competitive advantage, traditional aerospace and software engineers need to more easily incorporate the technology of artificial intelligence into the advanced aerospace systems being designed today. The future goal was to transition artificial intelligence from an emerging technology to a standard technology that is considered early in the life cycle process to develop state-of-the-art aircraft automation systems. This report addressed the future goal in two ways. First, it provided a matrix that identified typical aircraft automation applications conducive to various artificial intelligence methods. The purpose of this matrix was to provide top-level guidance to managers contemplating the possible use of artificial intelligence in the development of aircraft automation. Second, the report provided a methodology to formally evaluate neural networks as part of the traditional process of software development. The matrix was developed by organizing the discipline of artificial intelligence into the following six methods: logical, object representation-based, distributed, uncertainty management, temporal and neurocomputing. Next, a study of existing aircraft automation applications that have been conducive to artificial intelligence implementation resulted in the following five categories: pilot-vehicle interface, system status and diagnosis, situation assessment, automatic flight planning, and aircraft flight control. The resulting matrix provided management guidance to understand artificial intelligence as it applied to aircraft automation. The approach taken to develop a methodology to formally evaluate neural networks as part of the software engineering life cycle was to start with the existing software quality assurance standards and to change these standards to include neural network

  8. An ontology based trust verification of software license agreement

    NASA Astrophysics Data System (ADS)

    Lu, Wenhuan; Li, Xiaoqing; Gan, Zengqin; Wei, Jianguo

    2017-08-01

    When we install or download software, a large document is displayed stating rights and obligations, which many users do not have the patience to read or understand. This may make users distrust the software. In this paper, we propose an ontology-based trust verification for software license agreements. First, this work proposes an ontology model for the domain of software license agreements. The domain ontology is constructed with the proposed methodology from copyright laws and 30 software license agreements. The license ontology can act as part of a generalized copyright-law knowledge model and can also serve as a visualization of software licenses. Based on this ontology, a software-license-oriented text summarization approach is proposed; its performance shows that it can improve the accuracy of software license summarization. Based on the summarization, the underlying purpose of the software license can be explicitly explored for trust verification.

  9. Trajectory Based Behavior Analysis for User Verification

    NASA Astrophysics Data System (ADS)

    Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah

    Many of our activities on computers need a verification step for authorized access. The goal of verification is to tell apart the true account owner from intruders. We propose a general approach for user verification based on user trajectory inputs. The approach is labor-free for users and is likely to resist copying or simulation by non-authorized users or even automatic programs like bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian distributions in its transitions to describe the behavior in the trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with a manifold-learned tuning for capturing the pairwise relationship. Based on the pairwise relationship, we can plug in any effective classification or clustering method for the detection of unauthorized access. The method can also be applied to the task of recognition, predicting the trajectory type without a pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest who owns the trajectory if the input identity is not provided.
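
    A minimal sketch of a Markov chain with Gaussian transition densities for trajectory scoring, assuming grid-cell states and step-vector observations; the dissimilarity measure and manifold-learned tuning of the record are not reproduced, and the trajectories below are synthetic.

    ```python
    import numpy as np

    def states_and_steps(traj, grid=4):
        cells = np.floor(traj[:-1] * grid).clip(0, grid - 1).astype(int)
        return cells[:, 0] * grid + cells[:, 1], np.diff(traj, axis=0)

    def fit_model(traj, grid=4):
        """Per-state Gaussian over the displacement to the next trajectory point."""
        states, steps = states_and_steps(traj, grid)
        return {s: (steps[states == s].mean(axis=0), steps[states == s].std(axis=0) + 1e-3)
                for s in np.unique(states)}

    def avg_log_likelihood(traj, model, grid=4):
        states, steps = states_and_steps(traj, grid)
        ll = 0.0
        for s, d in zip(states, steps):
            mu, sigma = model.get(s, (np.zeros(2), np.ones(2)))
            ll += -0.5 * np.sum(((d - mu) / sigma) ** 2 + np.log(2 * np.pi * sigma ** 2))
        return ll / len(steps)

    # Synthetic traces in the unit square: the owner moves slowly along a ring, the
    # intruder traverses the same ring three times as fast with noisier motion.
    t = np.linspace(0, 2 * np.pi, 300)
    rng = np.random.default_rng(0)
    owner = np.column_stack([0.5 + 0.35 * np.cos(t), 0.5 + 0.35 * np.sin(t)]) + rng.normal(0, 0.005, (300, 2))
    intruder = np.column_stack([0.5 + 0.35 * np.cos(3 * t), 0.5 + 0.35 * np.sin(3 * t)]) + rng.normal(0, 0.02, (300, 2))

    model = fit_model(owner)
    print(avg_log_likelihood(owner, model), avg_log_likelihood(intruder, model))  # owner scores higher
    ```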

  10. Rule-Based Runtime Verification

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Goldberg, Allen; Havelund, Klaus; Sen, Koushik

    2003-01-01

    We present a rule-based framework for defining and implementing finite trace monitoring logics, including future and past time temporal logic, extended regular expressions, real-time logics, interval logics, forms of quantified temporal logics, and so on. Our logic, EAGLE, is implemented as a Java library and involves novel techniques for rule definition, manipulation and execution. Monitoring is done on a state-by-state basis, without storing the execution trace.
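
    EAGLE itself is a Java library; as a language-neutral illustration of state-by-state monitoring that keeps only summary state rather than the execution trace, here is a small monitor for a past-time property over an invented resource-locking trace.

    ```python
    class LockMonitor:
        """Monitors the past-time property 'a resource is only released if it was
        previously acquired and not yet released', one event at a time."""

        def __init__(self):
            self.held = set()          # summary state: currently held resources
            self.violations = []

        def step(self, index, event, resource):
            if event == "acquire":
                self.held.add(resource)
            elif event == "release":
                if resource not in self.held:
                    self.violations.append((index, resource))
                self.held.discard(resource)

    monitor = LockMonitor()
    trace = [("acquire", "db"), ("release", "db"), ("release", "file")]  # last event violates
    for i, (event, resource) in enumerate(trace):
        monitor.step(i, event, resource)
    print(monitor.violations)   # [(2, 'file')]
    ```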

  11. Verification in referral-based crowdsourcing.

    PubMed

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through "referral-based crowdsourcing": the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge.

  12. Verification in Referral-Based Crowdsourcing

    PubMed Central

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  13. Introduction to knowledge base

    SciTech Connect

    Ohsuga, S.

    1986-01-01

    This work provides a broad range of easy-to-understand information on basic knowledge base concepts and basic element technology for the building of a knowledge base system. It also discusses various languages and networks for development of knowledge base systems. It describes applications of knowledge base utilization methodology and prospects for the future in such areas as pattern recognition, natural language processing, expert systems, and CAD/CAM.

  14. Cooperative Knowledge Bases.

    DTIC Science & Technology

    1988-02-01

    intelligent knowledge bases. The present state of our system for concurrent evaluation of a knowledge base of logic clauses using static allocation... de Kleer, J., An assumption-based TMS, Artificial Intelligence, Vol. 28, No. 2, 1986. [Doyle 79] Doyle, J., A truth maintenance system, Artificial

  15. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    Title 40, Protection of Environment (2012-07-01 edition): Environmental Protection Agency (Continued), Air Pollution Controls, Vehicle-Testing Procedures, Dynamometer Specifications, § 1066.250 Base inertia verification. (a...

  16. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    Title 40, Protection of Environment (2014-07-01 edition): Environmental Protection Agency (Continued), Air Pollution Controls, Vehicle-Testing Procedures, Dynamometer Specifications, § 1066.250 Base inertia verification. (a...

  17. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    Title 40, Protection of Environment (2013-07-01 edition): Environmental Protection Agency (Continued), Air Pollution Controls, Vehicle-Testing Procedures, Dynamometer Specifications, § 1066.250 Base inertia verification. (a...

  18. A New Approach to Nuclear Warhead Verification Using a Zero-Knowledge Protocol

    SciTech Connect

    Glaser, Alexander

    2012-05-16

    Warhead verification systems proposed to date fundamentally rely on the use of information barriers to prevent the release of classified design information. Measurements with information barriers significantly increase the complexity of inspection systems, make their certification and authentication difficult, and may reduce the overall confidence in the verifiability of future arms-control agreements. This talk presents a proof-of-concept of a new approach to nuclear warhead verification that minimizes the role of information barriers from the outset and envisions instead an inspection system that a priori avoids leakage of sensitive information using a so-called zero-knowledge protocol. The proposed inspection system is based on the template-matching approach and relies on active interrogation of a test object with 14-MeV neutrons. The viability of the method is examined with MCNP Monte Carlo neutron transport calculations modeling the experimental setup, an investigation of different diversion scenarios, and an analysis of the simulated data showing that it does not contain information about the properties of the inspected object.

  19. Pattern-based full-chip process verification

    NASA Astrophysics Data System (ADS)

    Ying, Changsheng; Kwon, Yongjun; Fornari, Paul; Perçin, Gökhan; Liu, Anwei

    2014-03-01

    This paper discusses a novel pattern-based standalone process verification technique that meets current and future needs of semiconductor manufacturing for memory and logic devices. Choosing the right process verification technique is essential to bridge the discrepancy between the intended and the printed pattern. As the industry moves to very low-k1 patterning solutions at each technology node, process verification is becoming a nightmare for lithography engineers, with challenges such as the large number of possible verification defects and defect disposition. In low-k1 lithography, demand for full-chip process verification is increasing. Full-chip process verification is applied after the optical proximity correction (OPC) step. The current challenges in process verification are the large number of defects reported, disposition difficulties, long defect review times, and the lack of feedback to OPC. The technique presented here is based on pattern-based verification, where each reported defect is classified in terms of patterns and these patterns are saved to a database. This database is later used for screening incoming new designs prior to the OPC step.
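
    A sketch of this flow under assumptions made purely for illustration (rasterised layout clips as patterns, an exact byte signature as the database key): build a defect-pattern database from earlier verification results, then screen a new design by looking up every window in that database.

    ```python
    import numpy as np

    WINDOW = 8   # pattern clip size around a defect, in layout-grid pixels (assumed)

    def clip(layout, x, y, w=WINDOW):
        """Local layout pattern around a reported defect location."""
        return layout[y:y + w, x:x + w]

    def signature(pattern):
        """Hashable form of a pattern clip (rotation/mirror handling omitted)."""
        return pattern.astype(np.uint8).tobytes()

    rng = np.random.default_rng(0)
    layout = (rng.random((256, 256)) > 0.5).astype(np.uint8)   # stand-in for a rasterised design

    # 1) Build the defect-pattern database from verification results on earlier designs.
    reported_defects = [(40, 40), (120, 200)]                  # hypothetical (x, y) defect sites
    defect_db = {signature(clip(layout, x, y)) for x, y in reported_defects}

    # 2) Screen an incoming design prior to OPC by checking every window against the database.
    new_layout = layout.copy()
    hits = [(x, y)
            for y in range(new_layout.shape[0] - WINDOW)
            for x in range(new_layout.shape[1] - WINDOW)
            if signature(clip(new_layout, x, y)) in defect_db]
    print(len(hits), "known-bad pattern occurrences found")
    ```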

  20. NES++: number system for encryption based privacy preserving speaker verification

    NASA Astrophysics Data System (ADS)

    Xu, Lei; Feng, Tao; Zhao, Xi; Shi, Weidong

    2014-05-01

    As speech-based operation becomes a main hands-free interaction solution between humans and mobile devices (i.e., smartphones, Google Glass), privacy-preserving speaker verification receives much attention nowadays. Privacy-preserving speaker verification can be achieved in many different ways, such as fuzzy vaults and encryption. Encryption-based solutions are promising, as cryptography is based on solid mathematical foundations and the security properties can be easily analyzed in a well-established framework. Most current asymmetric encryption schemes work on finite algebraic structures, such as finite groups and finite fields. However, the encryption scheme for privacy-preserving speaker verification must handle floating point numbers. This gap must be filled to make the overall scheme practical. In this paper, we propose a number system that meets the requirements of both speaker verification and the encryption scheme used in the process. It also supports the additive homomorphic property of Paillier's encryption, which is crucial for privacy-preserving speaker verification. As asymmetric encryption is expensive, we propose a method of packing several numbers into one plaintext, which greatly reduces the computation overhead. To evaluate the performance of this method, we implement Paillier's encryption scheme over the proposed number system and the packing technique. Our findings show that the proposed solution can bridge the gap between speaker verification and the encryption scheme very well, and the packing technique improves the overall performance. Furthermore, our solution is a building block of encryption-based privacy-preserving speaker verification, and the privacy protection and accuracy rate are not affected.
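
    The packing idea can be sketched independently of the cryptography: fixed-point encode the values and pack several of them into one large integer with guard bits, so that a single addition of packed plaintexts adds every slot at once, which is the same effect Paillier's additive homomorphism provides on ciphertexts of such packed integers. The bit widths below are assumptions, and this is not the NES++ construction itself.

    ```python
    FRAC_BITS = 16          # fixed-point fraction bits (assumed)
    SLOT_BITS = 48          # bits per packed slot, with headroom so sums cannot overflow

    def encode(x):
        """Non-negative float -> fixed-point integer."""
        return int(round(x * (1 << FRAC_BITS)))

    def decode(v):
        return v / (1 << FRAC_BITS)

    def pack(values):
        """Pack several fixed-point numbers into one big integer (one plaintext)."""
        packed = 0
        for i, x in enumerate(values):
            packed |= encode(x) << (i * SLOT_BITS)
        return packed

    def unpack(packed, n):
        mask = (1 << SLOT_BITS) - 1
        return [decode((packed >> (i * SLOT_BITS)) & mask) for i in range(n)]

    # Adding two packed plaintexts adds every slot independently.
    a = [0.25, 3.5, 1.125, 0.75]     # e.g. per-component feature sums (toy values)
    b = [1.0, 0.5, 2.0, 0.25]
    print(unpack(pack(a) + pack(b), 4))   # [1.25, 4.0, 3.125, 1.0]
    ```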

  1. Universal Verification Methodology Based Register Test Automation Flow.

    PubMed

    Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu

    2016-05-01

    In today's SoC designs, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task. Therefore, we need an efficient way to perform verification with less effort in a shorter time. In this work, we suggest a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard methodology, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate the models in a test-bench environment because it requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe the register specification in IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template which is translated to an IP-XACT description, from which register models can be easily generated using commercial tools. On the other hand, we also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use the register model without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor, etc.) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time when verifying the functionality of registers.
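
    A sketch of the spreadsheet-to-IP-XACT translation step, with assumed CSV column names and a deliberately minimal subset of IP-XACT elements; a real flow would emit the full component and memory-map hierarchy expected by the commercial generators.

    ```python
    import csv, io, textwrap
    import xml.etree.ElementTree as ET

    SPEC = textwrap.dedent("""\
        name,offset,width,access,reset
        CTRL,0x00,32,read-write,0x00000000
        STATUS,0x04,32,read-only,0x00000001
    """)

    def spec_to_ipxact(csv_text, block_name="regs"):
        """Translate a spreadsheet-style register spec (exported as CSV) into a
        minimal IP-XACT-like <register> listing."""
        ns = "http://www.spiritconsortium.org/XMLSchema/SPIRIT/1685-2009"
        block = ET.Element("{%s}addressBlock" % ns)
        ET.SubElement(block, "{%s}name" % ns).text = block_name
        for row in csv.DictReader(io.StringIO(csv_text)):
            reg = ET.SubElement(block, "{%s}register" % ns)
            ET.SubElement(reg, "{%s}name" % ns).text = row["name"]
            ET.SubElement(reg, "{%s}addressOffset" % ns).text = row["offset"]
            ET.SubElement(reg, "{%s}size" % ns).text = row["width"]
            ET.SubElement(reg, "{%s}access" % ns).text = row["access"]
            reset = ET.SubElement(reg, "{%s}reset" % ns)
            ET.SubElement(reset, "{%s}value" % ns).text = row["reset"]
        return ET.tostring(block, encoding="unicode")

    print(spec_to_ipxact(SPEC))
    ```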

  2. Medical Knowledge Bases.

    ERIC Educational Resources Information Center

    Miller, Randolph A.; Giuse, Nunzia B.

    1991-01-01

    Few commonly available, successful computer-based tools exist in medical informatics. Faculty expertise can be included in computer-based medical information systems. Computers allow dynamic recombination of knowledge to answer questions unanswerable with print textbooks. Such systems can also create stronger ties between academic and clinical…

  3. Biometric verification based on grip-pattern recognition

    NASA Astrophysics Data System (ADS)

    Veldhuis, Raymond N.; Bazen, Asker M.; Kauffman, Joost A.; Hartel, Pieter

    2004-06-01

    This paper describes the design, implementation and evaluation of a user-verification system for a smart gun, which is based on grip-pattern recognition. An existing pressure sensor consisting of an array of 44 × 44 piezoresistive elements is used to measure the grip pattern. An interface has been developed to acquire pressure images from the sensor. The values of the pixels in the pressure-pattern images are used as inputs for a verification algorithm, which is currently implemented in software on a PC. The verification algorithm is based on a likelihood-ratio classifier for Gaussian probability densities. First results indicate that it is feasible to use grip-pattern recognition for biometric verification.
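
    A minimal sketch of a likelihood-ratio classifier for Gaussian probability densities of the kind named above, assuming diagonal covariances; the enrolment grips and background population below are synthetic stand-ins for pressure images.

    ```python
    import numpy as np

    def log_gauss(x, mean, var):
        """Log of a diagonal-covariance Gaussian density, up to an additive constant."""
        return -0.5 * np.sum((x - mean) ** 2 / var + np.log(var))

    def llr_score(x, user_model, background_model):
        """Likelihood-ratio verification score: log p(x | user) - log p(x | background)."""
        return log_gauss(x, *user_model) - log_gauss(x, *background_model)

    rng = np.random.default_rng(0)
    D = 44 * 44                                      # one value per piezoresistive element
    enrol = rng.normal(0.6, 0.1, (20, D))            # toy enrolment grips of the owner
    world = rng.normal(0.4, 0.2, (200, D))           # toy background population

    user_model = (enrol.mean(0), enrol.var(0) + 1e-6)
    background_model = (world.mean(0), world.var(0) + 1e-6)

    genuine_try = rng.normal(0.6, 0.1, D)
    impostor_try = rng.normal(0.4, 0.2, D)
    print(llr_score(genuine_try, user_model, background_model) > 0)    # True: accept
    print(llr_score(impostor_try, user_model, background_model) > 0)   # False: reject
    ```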

  4. Monitoring Knowledge Base (MKB)

    EPA Pesticide Factsheets

    The Monitoring Knowledge Base (MKB) is a compilation of emissions measurement and monitoring techniques associated with air pollution control devices, industrial process descriptions, and permitting techniques, including flexible permit development. Using MKB, one can gain a comprehensive understanding of emissions sources, control devices, and monitoring techniques, enabling one to determine appropriate permit terms and conditions.

  5. Knowledge-Based Abstracting.

    ERIC Educational Resources Information Center

    Black, William J.

    1990-01-01

    Discussion of automatic abstracting of technical papers focuses on a knowledge-based method that uses two sets of rules. Topics discussed include anaphora; text structure and discourse; abstracting techniques, including the keyword method and the indicator phrase method; and tools for text skimming. (27 references) (LRW)

  6. Optical secure image verification system based on ghost imaging

    NASA Astrophysics Data System (ADS)

    Wu, Jingjing; Haobogedewude, Buyinggaridi; Liu, Zhengjun; Liu, Shutian

    2017-09-01

    Ghost imaging can perform Fourier-space filtering by tailoring the configuration. We propose a novel optical secure image verification system based on this theory with the help of phase-matched filtering. In the verification process, the system key and the ID card, which contain the information of the correct image and the information to be verified, are placed in the reference and test paths, respectively. We demonstrate that the ghost imaging configuration can perform an incoherent correlation between the system key and the ID card. A correct verification manifests itself as a correlation peak in the ghost image. The primary image and the image to be verified are encrypted and encoded into pure phase masks beforehand for security. Multi-image secure verification can also be implemented in the proposed system.

  7. Sensor-fusion-based biometric identity verification

    SciTech Connect

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W.; Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L.

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm.

  8. Simulation environment based on the Universal Verification Methodology

    NASA Astrophysics Data System (ADS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, introduces briefly UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.

  9. Prototype-Based Discriminative Feature Learning for Kinship Verification.

    PubMed

    Yan, Haibin; Lu, Jiwen; Zhou, Xiuzhuang

    2015-11-01

    In this paper, we propose a new prototype-based discriminative feature learning (PDFL) method for kinship verification. Unlike most previous kinship verification methods which employ low-level hand-crafted descriptors such as local binary pattern and Gabor features for face representation, this paper aims to learn discriminative mid-level features to better characterize the kin relation of face images for kinship verification. To achieve this, we construct a set of face samples with unlabeled kin relation from the labeled face in the wild dataset as the reference set. Then, each sample in the training face kinship dataset is represented as a mid-level feature vector, where each entry is the corresponding decision value from one support vector machine hyperplane. Subsequently, we formulate an optimization function by minimizing the intraclass samples (with a kin relation) and maximizing the neighboring interclass samples (without a kin relation) with the mid-level features. To better use multiple low-level features for mid-level feature learning, we further propose a multiview PDFL method to learn multiple mid-level features to improve the verification performance. Experimental results on four publicly available kinship datasets show the superior performance of the proposed methods over both the state-of-the-art kinship verification methods and human ability in our kinship verification task.
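
    One reading of the mid-level feature construction, sketched with scikit-learn rather than the authors' code: train one linear SVM per prototype drawn from the unlabeled reference set and re-describe each face by its decision values under those hyperplanes. Sizes and data below are synthetic.

    ```python
    import numpy as np
    from sklearn.svm import LinearSVC

    def exemplar_hyperplanes(reference, n_prototypes=32, seed=0):
        """One linear SVM per prototype, separating that reference sample from the rest."""
        rng = np.random.default_rng(seed)
        models = []
        for i in rng.choice(len(reference), n_prototypes, replace=False):
            y = np.zeros(len(reference), dtype=int)
            y[i] = 1
            models.append(LinearSVC(C=1.0, max_iter=5000).fit(reference, y))
        return models

    def mid_level(features, models):
        """Mid-level descriptor: decision value of each sample under every hyperplane."""
        return np.column_stack([m.decision_function(features) for m in models])

    rng = np.random.default_rng(1)
    reference = rng.normal(0, 1, (200, 64))     # unlabeled reference faces (low-level descriptors)
    faces = rng.normal(0, 1, (10, 64))          # kinship-pair faces to re-describe
    print(mid_level(faces, exemplar_hyperplanes(reference)).shape)   # (10, 32)
    ```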

  10. Verification strategies for fluid-based plasma simulation models

    NASA Astrophysics Data System (ADS)

    Mahadevan, Shankar

    2012-10-01

    Verification is an essential aspect of computational code development for models based on partial differential equations. However, verification of plasma models is often conducted internally by authors of these programs and not openly discussed. Several professional research bodies including the IEEE, AIAA, ASME and others have formulated standards for verification and validation (V&V) of computational software. This work focuses on verification, defined succinctly as determining whether the mathematical model is solved correctly. As plasma fluid models share several aspects with the Navier-Stokes equations used in Computational Fluid Dynamics (CFD), the CFD verification process is used as a guide. Steps in the verification process: consistency checks, examination of iterative, spatial and temporal convergence, and comparison with exact solutions, are described with examples from plasma modeling. The Method of Manufactured Solutions (MMS), which has been used to verify complex systems of PDEs in solid and fluid mechanics, is introduced. An example of the application of MMS to a self-consistent plasma fluid model using the local mean energy approximation is presented. The strengths and weaknesses of the techniques presented in this work are discussed.
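
    A compact example of the Method of Manufactured Solutions mentioned above, applied to a 1-D Poisson problem rather than a plasma fluid model: choose u(x) = sin(pi x), derive the source term analytically, and confirm that the observed order of accuracy approaches the formal second order under grid refinement.

    ```python
    import numpy as np

    def solve_poisson(n, source):
        """Second-order finite-difference solve of -u'' = f on (0, 1) with u(0) = u(1) = 0."""
        h = 1.0 / n
        x = np.linspace(0.0, 1.0, n + 1)
        A = (np.diag(2.0 * np.ones(n - 1))
             - np.diag(np.ones(n - 2), 1)
             - np.diag(np.ones(n - 2), -1)) / h ** 2
        u = np.zeros(n + 1)
        u[1:-1] = np.linalg.solve(A, source(x[1:-1]))
        return x, u

    # Manufactured solution u(x) = sin(pi x); the source -u'' = pi^2 sin(pi x) is
    # derived analytically and fed to the code being verified.
    u_exact = lambda x: np.sin(np.pi * x)
    f = lambda x: np.pi ** 2 * np.sin(np.pi * x)

    errors = {}
    for n in (16, 32, 64):
        x, u = solve_poisson(n, f)
        errors[n] = np.max(np.abs(u - u_exact(x)))

    # Observed order of accuracy between successive grids; should approach 2.
    print(errors, [np.log2(errors[n] / errors[2 * n]) for n in (16, 32)])
    ```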

  12. Knowledge Based Text Generation

    DTIC Science & Technology

    1989-08-01

    knowledge base as well as communicate the reasoning behind a particular diagnosis. This is discussed more thoroughly in subsequent sections. On the other...explanation. Weiner proposed that a statement can be justified by offering reasons, supporting examples, and implausible alternatives, except for the statement...These justification techniques are realized in his system by four predicates: statement, reason, example and alternative. Connectives such as and/or

  13. Developing sub-domain verification methods based on GIS tools

    NASA Astrophysics Data System (ADS)

    Smith, J. A.; Foley, T. A.; Raby, J. W.

    2014-12-01

    The meteorological community makes extensive use of the Model Evaluation Tools (MET) developed by National Center for Atmospheric Research for numerical weather prediction model verification through grid-to-point, grid-to-grid and object-based domain level analyses. MET Grid-Stat has been used to perform grid-to-grid neighborhood verification to account for the uncertainty inherent in high resolution forecasting, and MET Method for Object-based Diagnostic Evaluation (MODE) has been used to develop techniques for object-based spatial verification of high resolution forecast grids for continuous meteorological variables. High resolution modeling requires more focused spatial and temporal verification over parts of the domain. With a Geographical Information System (GIS), researchers can now consider terrain type/slope and land use effects and other spatial and temporal variables as explanatory metrics in model assessments. GIS techniques, when coupled with high resolution point and gridded observations sets, allow location-based approaches that permit discovery of spatial and temporal scales where models do not sufficiently resolve the desired phenomena. In this paper we discuss our initial GIS approach to verify WRF-ARW with a one-kilometer horizontal resolution inner domain centered over Southern California. Southern California contains a mixture of urban, sub-urban, agricultural and mountainous terrain types along with a rich array of observational data with which to illustrate our ability to conduct sub-domain verification.
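
    Neighborhood-style verification of the kind performed with MET Grid-Stat can be illustrated with the Fractions Skill Score, a common neighborhood metric; the sketch below uses synthetic fields and is a simplified stand-in for MET, not its actual implementation.

      import numpy as np
      from scipy.ndimage import uniform_filter

      def fractions_skill_score(forecast, observed, threshold, window):
          """Neighborhood verification: compare event fractions in a window x window box."""
          f_event = (forecast >= threshold).astype(float)
          o_event = (observed >= threshold).astype(float)
          f_frac = uniform_filter(f_event, size=window, mode="constant")
          o_frac = uniform_filter(o_event, size=window, mode="constant")
          mse = np.mean((f_frac - o_frac) ** 2)
          mse_ref = np.mean(f_frac ** 2) + np.mean(o_frac ** 2)
          return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan

      rng = np.random.default_rng(1)
      obs = rng.gamma(2.0, 2.0, size=(200, 200))      # synthetic precipitation field
      fcst = np.roll(obs, shift=5, axis=1)            # forecast displaced by 5 grid points
      for w in (1, 5, 11, 21):                        # skill improves as the neighborhood grows
          print(f"window {w:2d}: FSS = {fractions_skill_score(fcst, obs, 8.0, w):.3f}")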

  14. Post-OPC verification using a full-chip pattern-based simulation verification method

    NASA Astrophysics Data System (ADS)

    Hung, Chi-Yuan; Wang, Ching-Heng; Ma, Cliff; Zhang, Gary

    2005-11-01

    In this paper, we evaluated and investigated techniques for performing fast full-chip post-OPC verification using a commercial product platform. A number of databases from several technology nodes, i.e. 0.13um, 0.11um and 90nm, are used in the investigation. Although our OPC technology has proven robust for most cases, the variety of tape-outs with complicated design styles and technologies makes it difficult to develop a "complete or bullet-proof" OPC algorithm that covers every possible layout pattern. In the evaluation, among dozens of databases, errors were found in some OPC databases by model-based post-OPC checking; such errors could cost significantly in manufacturing - reticle, wafer process, and more importantly production delay. From such full-chip OPC database verification, we have learned that optimizing OPC models and recipes on a limited set of test chip designs may not provide sufficient coverage across the range of designs to be produced in the process, and that fatal errors (such as pinch or bridge), poor CD distribution and process-sensitive patterns may still occur. As a result, more than one reticle tape-out cycle is not uncommon to prove models and recipes that approach the center of process for a range of designs. We therefore describe a full-chip pattern-based simulation verification flow that serves both OPC model and recipe development as well as post-OPC verification after production release of the OPC. Lastly, we discuss the differentiation of the new pattern-based and conventional edge-based verification tools and summarize the advantages of our new tool and methodology: 1) accuracy: superior inspection algorithms, down to 1nm accuracy with the new "pattern based" approach; 2) high speed performance: pattern-centric algorithms to give best full-chip inspection efficiency; 3) powerful analysis capability: flexible error distribution, grouping, interactive viewing and hierarchical pattern extraction to narrow

  15. Palmprint Based Verification System Using SURF Features

    NASA Astrophysics Data System (ADS)

    Srinivas, Badrinath G.; Gupta, Phalguni

    This paper describes the design and development of a prototype of a robust biometric verification system. The system uses features of the human hand extracted with the Speeded Up Robust Features (SURF) operator. The hand image is acquired using a low-cost scanner. The extracted palmprint region is robust to hand translation and rotation on the scanner. The system is tested on the IITK database of 200 images and the PolyU database of 7751 images, and is found to be robust with respect to translation and rotation. It has a FAR of 0.02%, an FRR of 0.01% and an accuracy of 99.98%, and can be a suitable system for civilian applications and high-security environments.
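
    A hedged sketch of SURF-style keypoint matching for verification with OpenCV is shown below. SURF is only available in opencv-contrib builds with the non-free modules enabled, so the sketch falls back to ORB; the synthetic "palm" images are stand-ins for real scanner output, and the match count is only a crude verification score.

      import cv2
      import numpy as np

      def synthetic_palm(seed, shift=0):
          """Stand-in for a scanned hand image; a real system would load scanner output."""
          rng = np.random.default_rng(seed)
          img = np.full((256, 256), 210, np.uint8)
          for _ in range(40):                              # crude texture mimicking palm lines
              (x1, y1), (x2, y2) = rng.integers(20, 236, (2, 2))
              cv2.line(img, (int(x1) + shift, int(y1)), (int(x2) + shift, int(y2)),
                       int(rng.integers(0, 120)), 2)
          return img

      def detector():
          try:
              return cv2.xfeatures2d.SURF_create(hessianThreshold=400)   # needs opencv-contrib nonfree
          except AttributeError:
              return cv2.ORB_create(nfeatures=500)                        # fallback if SURF unavailable

      det = detector()
      _, d_enrolled = det.detectAndCompute(synthetic_palm(seed=1), None)
      _, d_probe    = det.detectAndCompute(synthetic_palm(seed=1, shift=3), None)   # translated copy

      norm = cv2.NORM_HAMMING if d_enrolled.dtype == np.uint8 else cv2.NORM_L2
      good = 0
      for pair in cv2.BFMatcher(norm).knnMatch(d_enrolled, d_probe, k=2):
          if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:   # Lowe ratio test
              good += 1
      print("good matches (verification score):", good)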

  16. Biometric Subject Verification Based on Electrocardiographic Signals

    NASA Technical Reports Server (NTRS)

    Dusan, Sorin V. (Inventor); Jorgensen, Charles C. (Inventor)

    2014-01-01

    A method of authenticating or declining to authenticate an asserted identity of a candidate-person. In an enrollment phase, a reference PQRST heart action graph is provided or constructed from information obtained from a plurality of graphs that resemble each other for a known reference person, using a first graph comparison metric. In a verification phase, a candidate-person asserts his/her identity and presents a plurality of his/her heart cycle graphs. If a sufficient number of the candidate-person's measured graphs resemble each other, a representative composite graph is constructed from the candidate-person's graphs and is compared with a composite reference graph, for the person whose identity is asserted, using a second graph comparison metric. When the second metric value lies in a selected range, the candidate-person's assertion of identity is accepted.
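
    The comparison step can be illustrated numerically as below; this is a toy sketch using synthetic beats and a zero-lag normalized correlation, not the graph-comparison metrics or thresholds of the patented method.

      import numpy as np

      def composite(beats):
          """Average a set of aligned heart-cycle samples into a representative graph."""
          return np.mean(np.asarray(beats), axis=0)

      def similarity(g1, g2):
          """Normalized cross-correlation at zero lag as a simple graph-comparison metric."""
          g1 = (g1 - g1.mean()) / g1.std()
          g2 = (g2 - g2.mean()) / g2.std()
          return float(np.mean(g1 * g2))

      rng = np.random.default_rng(2)
      t = np.linspace(0, 1, 200)
      template = np.exp(-((t - 0.5) ** 2) / 0.002)               # crude stand-in for a PQRST cycle
      reference = composite([template + 0.05 * rng.normal(size=t.size) for _ in range(8)])
      candidate = composite([template + 0.05 * rng.normal(size=t.size) for _ in range(8)])

      ACCEPT_THRESHOLD = 0.9                                      # illustrative, not a calibrated range
      score = similarity(reference, candidate)
      print("similarity:", score, "accept identity:", score > ACCEPT_THRESHOLD)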

  17. The Verification-based Analysis of Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1996-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  18. Log-Gabor filters for image-based vehicle verification.

    PubMed

    Arróspide, Jon; Salgado, Luis

    2013-06-01

    Vehicle detection based on image analysis has attracted increasing attention in recent years due to its low cost, flexibility, and potential toward collision avoidance. In particular, vehicle verification is especially challenging on account of the heterogeneity of vehicles in color, size, pose, etc. Image-based vehicle verification is usually addressed as a supervised classification problem. Specifically, descriptors using Gabor filters have been reported to show good performance in this task. However, Gabor functions have a number of drawbacks relating to their frequency response. The main contribution of this paper is the proposal and evaluation of a new descriptor based on the alternative family of log-Gabor functions for vehicle verification, as opposed to existing Gabor filter-based descriptors. These filters are theoretically superior to Gabor filters as they can better represent the frequency properties of natural images. As a second contribution, and in contrast to existing approaches, which transfer the standard configuration of filters used for other applications to the vehicle classification task, an in-depth analysis of the required filter configuration by both Gabor and log-Gabor descriptors for this particular application is performed for fair comparison. The extensive experiments conducted in this paper confirm that the proposed log-Gabor descriptor significantly outperforms the standard Gabor filter for image-based vehicle verification.
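
    A minimal construction of the radial log-Gabor transfer function discussed above is sketched below in NumPy; the centre frequency and bandwidth ratio are illustrative, and a full descriptor would combine several scales and orientations before classification.

      import numpy as np

      def log_gabor_radial(shape, f0=0.1, sigma_ratio=0.55):
          """Radial log-Gabor transfer function G(f) = exp(-(ln(f/f0))^2 / (2 ln(sigma_ratio)^2)).

          Unlike a Gabor filter, it has no DC component and a long tail towards high frequencies.
          """
          rows, cols = shape
          fy = np.fft.fftfreq(rows)[:, None]
          fx = np.fft.fftfreq(cols)[None, :]
          radius = np.sqrt(fx ** 2 + fy ** 2)
          radius[0, 0] = 1.0                       # avoid log(0); DC is zeroed below
          g = np.exp(-(np.log(radius / f0) ** 2) / (2 * np.log(sigma_ratio) ** 2))
          g[0, 0] = 0.0                            # zero DC response
          return g

      def log_gabor_response(image, **kwargs):
          """Filter an image in the frequency domain and return the magnitude response."""
          g = log_gabor_radial(image.shape, **kwargs)
          return np.abs(np.fft.ifft2(np.fft.fft2(image) * g))

      patch = np.random.default_rng(3).normal(size=(64, 64))   # stand-in for a vehicle image patch
      feature = log_gabor_response(patch).ravel()               # descriptor fed to a classifier
      print(feature.shape)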

  19. Wavelet-based verification of the quantitative precipitation forecast

    NASA Astrophysics Data System (ADS)

    Yano, Jun-Ichi; Jakubiak, Bogumil

    2016-06-01

    This paper explores the use of wavelets for spatial verification of quantitative precipitation forecasts (QPF), and especially the capacity of wavelets to provide both localization and scale information. Two 24-h forecast experiments using the two versions of the Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS) on 22 August 2010 over Poland are used to illustrate the method. Strong spatial localizations and associated intermittency of the precipitation field make verification of QPF difficult using standard statistical methods. The wavelet becomes an attractive alternative, because it is specifically designed to extract spatially localized features. The wavelet modes are characterized by two indices, for scale and localization. Thus, these indices can simply be employed for characterizing the performance of QPF in scale and localization without any further elaboration or tunable parameters. Furthermore, spatially localized features can be extracted in wavelet space in a relatively straightforward manner with only a weak dependence on a threshold. Such a feature may be considered an advantage of the wavelet-based method over more conventional "object"-oriented verification methods, as the latter tend to exhibit strong threshold sensitivities. The present paper also points out limits of the so-called "scale separation" methods based on wavelets. Our study demonstrates how these wavelet-based QPF verifications can be performed straightforwardly. Possibilities for further developments of the wavelet-based methods, especially towards the goal of identifying a weak physical process contributing to forecast error, are also pointed out.
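
    The scale-and-localization idea can be sketched with PyWavelets as below: the forecast error field is decomposed and its energy reported per wavelet scale. The fields are synthetic and the Haar basis is an arbitrary choice, not the wavelet or data used in the paper.

      import numpy as np
      import pywt

      def scale_wise_error(forecast, observed, wavelet="haar", level=3):
          """Decompose the forecast error field and report its energy per wavelet scale."""
          err = forecast - observed
          coeffs = pywt.wavedec2(err, wavelet, level=level)
          energies = {"approximation (largest scale)": float(np.sum(coeffs[0] ** 2))}
          for band, detail in enumerate(coeffs[1:], start=1):     # bands run coarse to fine
              energies[f"detail band {band} (coarse to fine)"] = float(
                  sum(np.sum(d ** 2) for d in detail))
          return energies

      rng = np.random.default_rng(4)
      obs = rng.gamma(2.0, 2.0, size=(128, 128))                  # synthetic precipitation analysis
      fcst = obs + rng.normal(scale=0.5, size=obs.shape) + 0.5    # noisy, biased forecast
      for scale, energy in scale_wise_error(fcst, obs).items():
          print(f"{scale:>32s}: {energy:10.1f}")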

  20. A wavelet-based approach to face verification/recognition

    NASA Astrophysics Data System (ADS)

    Jassim, Sabah; Sellahewa, Harin

    2005-10-01

    Face verification/recognition is a tough challenge in comparison to identification based on other biometrics such as iris or fingerprints. Yet, due to its unobtrusive nature, the face is naturally suitable for security-related applications. The face verification process relies on feature extraction from face images. Current schemes are either geometric-based or template-based. In the latter, the face image is statistically analysed to obtain a set of feature vectors that best describe it. Performance of a face verification system is affected by image variations due to illumination, pose, occlusion, expressions and scale. This paper extends our recent work on face verification for constrained platforms, where the feature vector of a face image is the coefficients in the wavelet-transformed LL-subbands at depth 3 or more. It was demonstrated that the wavelet-only feature vector scheme has performance comparable to sophisticated state-of-the-art schemes when tested on two benchmark databases (ORL and BANCA). The significance of those results stems from the fact that the size of the k-th LL-subband is 1/4^k of the original image size. Here, we investigate the use of wavelet coefficients in various subbands at level 3 or 4 using various wavelet filters. We compare the performance of the wavelet-based scheme for different filters at different subbands with a number of state-of-the-art face verification/recognition schemes on two benchmark databases, namely ORL and the control section of BANCA. We demonstrate that our schemes have performance comparable to (or better than) the best performing other schemes.

  1. A physical zero-knowledge object-comparison system for nuclear warhead verification

    NASA Astrophysics Data System (ADS)

    Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; D'Errico, Francesco

    2016-09-01

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.

  2. Combining Task Execution and Background Knowledge for the Verification of Medical Guidelines

    NASA Astrophysics Data System (ADS)

    Hommersom, Arjen; Groot, Perry; Lucas, Peter; Balser, Michael; Schmitt, Jonathan

    The use of a medical guideline can be seen as the execution of computational tasks, sequentially or in parallel, in the face of patient data. It has been shown that many such guidelines can be represented as a 'network of tasks', i.e., as a number of steps that have a specific function or goal. To investigate the quality of such guidelines we propose a formalization of criteria for good practice medicine with which a guideline should comply. We use this theory in conjunction with medical background knowledge to verify the quality of a guideline dealing with diabetes mellitus type 2 using the interactive theorem prover KIV. Verification using task execution and background knowledge is a novel approach to quality checking of medical guidelines.

  3. A physical zero-knowledge object-comparison system for nuclear warhead verification

    PubMed Central

    Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; d'Errico, Francesco

    2016-01-01

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications. PMID:27649477

  4. A physical zero-knowledge object-comparison system for nuclear warhead verification

    SciTech Connect

    Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; d’Errico, Francesco

    2016-09-20

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.

  5. A physical zero-knowledge object-comparison system for nuclear warhead verification.

    PubMed

    Philippe, Sébastien; Goldston, Robert J; Glaser, Alexander; d'Errico, Francesco

    2016-09-20

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.

  6. A physical zero-knowledge object-comparison system for nuclear warhead verification

    DOE PAGES

    Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; ...

    2016-09-20

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.

  7. Mobile robot knowledge base

    NASA Astrophysics Data System (ADS)

    Heath Pastore, Tracy; Barnes, Mitchell; Hallman, Rory

    2005-05-01

    Robot technology is developing at a rapid rate for both commercial and Department of Defense (DOD) applications. As a result, the task of managing both technology and experience information is growing. In the not-too-distant past, tracking development efforts of robot platforms, subsystems and components was not too difficult, expensive, or time consuming. To do the same today is a significant undertaking. The Mobile Robot Knowledge Base (MRKB) provides the robotics community with a web-accessible, centralized resource for sharing information, experience, and technology to more efficiently and effectively meet the needs of the robot system user. The resource includes searchable information on robot components, subsystems, mission payloads, platforms, and DOD robotics programs. In addition, the MRKB website provides a forum for technology and information transfer within the DOD robotics community and an interface for the Robotic Systems Pool (RSP). The RSP manages a collection of small teleoperated and semi-autonomous robotic platforms, available for loan to DOD and other qualified entities. The objective is to put robots in the hands of users and use the test data and fielding experience to improve robot systems.

  8. Foundation: Transforming data bases into knowledge bases

    NASA Technical Reports Server (NTRS)

    Purves, R. B.; Carnes, James R.; Cutts, Dannie E.

    1987-01-01

    One approach to transforming information stored in relational data bases into knowledge based representations and back again is described. This system, called Foundation, allows knowledge bases to take advantage of vast amounts of pre-existing data. A benefit of this approach is inspection, and even population, of data bases through an intelligent knowledge-based front-end.

  9. Supporting the design of translational clinical studies through the generation and verification of conceptual knowledge-anchored hypotheses.

    PubMed

    Payne, Philip R O; Borlawsky, Tara B; Kwok, Alan; Greaves, Andrew W

    2008-11-06

    The ability to generate hypotheses based upon the contents of large-scale, heterogeneous data sets is critical to the design of translational clinical studies. In previous reports, we have described the application of a conceptual knowledge engineering technique, known as constructive induction (CI), in order to satisfy such needs. However, one of the major limitations of this method is the need to engage multiple subject matter experts to verify potential hypotheses generated using CI. In this manuscript, we describe an alternative verification technique that leverages published biomedical literature abstracts. Our report is framed in the context of an ongoing project to generate hypotheses related to the contents of a translational research data repository maintained by the CLL Research Consortium. Such hypotheses are intended to inform the design of prospective clinical studies that can elucidate the relationships that may exist between biomarkers and patient phenotypes.

  10. Wavelet-based face verification for constrained platforms

    NASA Astrophysics Data System (ADS)

    Sellahewa, Harin; Jassim, Sabah A.

    2005-03-01

    Human identification based on facial images is one of the most challenging tasks in comparison to identification based on other biometric features such as fingerprints, palm prints or iris. Facial recognition is the most natural and suitable method of identification for security-related applications. This paper is concerned with wavelet-based schemes for efficient face verification suitable for implementation on devices that are constrained in memory size and computational power, such as PDAs and smartcards. Besides minimal storage requirements, we should apply as few pre-processing procedures as possible, which are often needed to deal with variation in recording conditions. We propose the LL coefficients of wavelet-transformed face images as the feature vectors for face verification, and compare their performance with that of PCA applied in the LL-subband at levels 3, 4 and 5. We also compare the performance of various versions of our scheme with those of well-established PCA face verification schemes on the BANCA database as well as the ORL database. In many cases, the wavelet-only feature vector scheme has the best performance while maintaining efficiency and requiring minimal pre-processing steps. The significance of these results is their efficiency and suitability for platforms of constrained computational power and storage capacity (e.g. smartcards). Moreover, working at or beyond the level 3 LL-subband results in robustness against high-rate compression and noise interference.
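
    The LL-subband feature idea is easy to sketch with PyWavelets, as below; the synthetic images, depth-3 Haar decomposition, and cosine-similarity matching are illustrative assumptions rather than the exact scheme evaluated on ORL and BANCA.

      import numpy as np
      import pywt

      def ll_subband_feature(image, wavelet="haar", depth=3):
          """Feature vector = coefficients of the depth-3 LL subband (1/4**depth of the image size)."""
          ll = image
          for _ in range(depth):
              ll, _ = pywt.dwt2(ll, wavelet)    # keep only the approximation (LL) band each time
          return ll.ravel()

      def cosine_score(a, b):
          return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

      rng = np.random.default_rng(5)
      enrolled = rng.normal(size=(64, 64))                    # stand-in for an enrolled face image
      probe_same = enrolled + 0.1 * rng.normal(size=(64, 64))
      probe_diff = rng.normal(size=(64, 64))

      template = ll_subband_feature(enrolled)
      print("genuine score :", cosine_score(template, ll_subband_feature(probe_same)))
      print("impostor score:", cosine_score(template, ll_subband_feature(probe_diff)))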

  11. Common Criteria Based Security Scenario Verification

    NASA Astrophysics Data System (ADS)

    Ohnishi, Atsushi

    Software is required to comply with the laws and standards of software security. However, stakeholders with less concern regarding security can neither describe the behaviour of the system with regard to security nor validate the system’s behaviour when the security function conflicts with usability. Scenarios or use-case specifications are common in requirements elicitation and are useful to analyze the usability of the system from a behavioural point of view. In this paper, the authors propose both (1) a scenario language based on a simple case grammar and (2) a method to verify a scenario with rules based on security evaluation criteria.

  12. Retinal Verification Using a Feature Points-Based Biometric Pattern

    NASA Astrophysics Data System (ADS)

    Ortega, M.; Penedo, M. G.; Rouco, J.; Barreira, N.; Carreira, M. J.

    2009-12-01

    Biometrics refers to identity verification of individuals based on some physiological or behavioural characteristics. The typical authentication process of a person consists in extracting a biometric pattern of him/her and matching it with the stored pattern for the authorised user, obtaining a similarity value between patterns. In this work an efficient method for person authentication is presented. The biometric pattern of the system is a set of feature points representing landmarks in the retinal vessel tree. The pattern extraction and matching are described, and a deep analysis of the performance of similarity metrics is presented for the biometric system. A database with samples of retina images taken from users at different moments in time is used, thus simulating a hard and realistic verification environment. Even in this scenario, the system allows a wide confidence band to be established for the metric threshold where no errors are obtained for the training and test sets.

  13. Tolerance-based process proximity correction (PPC) verification methodology

    NASA Astrophysics Data System (ADS)

    Hashimoto, Kohji; Fujise, Hiroharu; Nojima, Shigeki; Ito, Takeshi; Ikeda, Takahiro

    2004-08-01

    Tolerance-based process proximity correction (PPC) verification methodology is proposed for "hot spot management" in LSI fabrication process flow. This methodology verifies the PPC accuracy with the features of actual processed wafers/masks and target features in CAD data including CD tolerance around hot spots. The CD tolerance in CAD data is decided according to device characteristics, process integration, CD budget, and so on, and is used for the judgment criteria of the PPC accuracy. After the verifications, the actions in the manufacturing are decided. This methodology is demonstrated for the 65nm-node CMOS local metal at three representative hot spots extracted by lithography simulation, and the results yielded useful information for the manufacturing.

  14. Approaches to the verification of rule-based expert systems

    NASA Technical Reports Server (NTRS)

    Culbert, Chris; Riley, Gary; Savely, Robert T.

    1987-01-01

    Expert systems are a highly useful spinoff of artificial intelligence research. One major stumbling block to extended use of expert systems is the lack of well-defined verification and validation (V and V) methodologies. Since expert systems are computer programs, the definitions of verification and validation from conventional software are applicable. The primary difficulty with expert systems is the use of development methodologies which do not support effective V and V. If proper techniques are used to document requirements, V and V of rule-based expert systems is possible, and may be easier than with conventional code. For NASA applications, the flight technique panels used in previous programs should provide an excellent way to verify the rules used in expert systems. There are, however, some inherent differences in expert systems that will affect V and V considerations.

  15. Geothermal Resource Verification for Air Force Bases,

    DTIC Science & Technology

    1981-06-01

    Geothermal Resources ... Overview ... Hydrothermal Convection Systems...exploration techniques. In some regions where thick sedimentary occur usable hydrothermal resources may be encountered in deep wells where there is no...effort to locate a low to moderate temperature hydrothermal regime suitable for space heating at Hill Air Force Base, Utah, was recently completed and

  16. A DVE Time Management Simulation and Verification Platform Based on Causality Consistency Middleware

    NASA Astrophysics Data System (ADS)

    Zhou, Hangjun; Zhang, Wei; Peng, Yuxing; Li, Sikun

    During the course of designing a time management algorithm for DVEs, researchers are often distracted and made inefficient by having to implement the trivial but fundamental details of simulation and verification. A platform that already realizes these details is therefore desirable. However, to our knowledge this has not been achieved in any published work. In this paper, we are the first to design and realize a DVE time management simulation and verification platform providing exactly the same interfaces as those defined by the HLA Interface Specification. Moreover, our platform is based on a newly designed causality consistency middleware and offers a comparison of three kinds of time management services: CO, RO and TSO. The experimental results show that the implementation of the platform incurs only a small overhead, and that its efficiency allows researchers to focus solely on improving their algorithm designs.

  17. Performing Verification and Validation in Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  18. Performing Verification and Validation in Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  19. Non-Invertible Transforms for Image-Based Verification

    SciTech Connect

    White, Timothy A.; Robinson, Sean M.; Jarman, Kenneth D.; Miller, Erin A.; Seifert, Allen; McDonald, Benjamin S.; Pitts, W. Karl; Misner, Alex C.

    2011-07-20

    Imaging may play a unique role in verifying the presence and distribution of warhead components in warhead counting and dismantlement settings where image information content can distinguish among shapes, forms, and material composition of items. However, a major issue with imaging is the high level of intrusiveness, and in particular, the possible need to store sensitive comparison images in the inspection system that would violate information barrier (IB) principles. Reducing images via transformations or feature extraction can produce image features (e.g. attributes) for verification, but with enough prior information about structure the reduced information itself may be sufficient to deduce sensitive details of the original image. Further reducing resolution of the transformed image information is an option, but too much reduction destroys the quality of the attribute. We study the possibility of a one-way transform that allows storage of non-sensitive reference information and analysis to enable comparison of transformed images within IB constraints. In particular, we consider the degree to which images can be reconstructed from image intensity histograms depending on the number of pixel intensity bins and the degree of frequency data quantization, as well as assumed knowledge of configuration of objects in the images. We also explore the concept of a 'perceptual hash' as a class of transforms that may enable verification with provable non-invertibility, leading to an effective one-way transform that preserves the nature of the image feature data without revealing sufficient information to reconstruct the original image.
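
    Two of the reductions discussed above can be sketched in a few lines of NumPy: a coarsely binned and quantized intensity histogram used as an attribute, and an average-hash style thresholding of a downsampled image as a perceptual-hash-like transform. Both are generic illustrations, not the transforms analyzed in the paper.

      import numpy as np

      def intensity_histogram(image, bins=8, quantize=16):
          """Reduced attribute: coarsely binned, coarsely quantized intensity histogram."""
          hist, _ = np.histogram(image, bins=bins, range=(0.0, 1.0))
          return (hist // quantize) * quantize          # frequency quantization discards detail

      def average_hash(image, size=8):
          """Perceptual-hash style reduction: threshold a downsampled image at its mean."""
          h, w = image.shape
          small = image[: h - h % size, : w - w % size]
          small = small.reshape(size, h // size, size, w // size).mean(axis=(1, 3))
          return (small > small.mean()).astype(np.uint8).ravel()

      rng = np.random.default_rng(6)
      item_a = rng.random((64, 64))                        # stand-in radiograph of a reference item
      item_b = item_a + 0.02 * rng.normal(size=(64, 64))   # candidate item, nearly identical

      print("quantized histograms equal:",
            np.array_equal(intensity_histogram(item_a), intensity_histogram(item_b)))
      print("hash Hamming distance:", int(np.sum(average_hash(item_a) != average_hash(item_b))))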

  20. A Scala DSL for RETE-Based Runtime Verification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2013-01-01

    Runtime verification (RV) consists in part of checking execution traces against formalized specifications. Several systems have emerged, most of which support specification notations based on state machines, regular expressions, temporal logic, or grammars. The field of Artificial Intelligence (AI) has for an even longer period of time studied rule-based production systems, which at a closer look appear to be relevant for RV, although seemingly focused on slightly different application domains, such as for example business processes and expert systems. The core algorithm in many of these systems is the Rete algorithm. We have implemented a Rete-based runtime verification system, named LogFire (originally intended for offline log analysis but also applicable to online analysis), as an internal DSL in the Scala programming language, using Scala's support for defining DSLs. This combination appears attractive from a practical point of view. Our contribution is in part conceptual in arguing that such rule-based frameworks originating from AI may be suited for RV.
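
    The rule-based flavour of such a monitor can be conveyed with the toy Python sketch below: it checks an acquire/release discipline over an event trace using a simple working memory of facts. It is not LogFire, not Scala, and it re-evaluates rules naively rather than using the Rete algorithm.

      # A naive rule-based trace monitor (illustration only; LogFire is a Scala DSL built on Rete).
      facts = set()          # working memory of currently "open" obligations
      errors = []

      def on_event(event, arg):
          """Rules: every 'acquire r' must eventually be followed by 'release r', and
          a resource must not be released if it was never acquired."""
          if event == "acquire":
              facts.add(("holds", arg))
          elif event == "release":
              if ("holds", arg) in facts:
                  facts.discard(("holds", arg))
              else:
                  errors.append(f"release of {arg} without acquire")

      trace = [("acquire", "r1"), ("acquire", "r2"), ("release", "r1"), ("release", "r3")]
      for ev, arg in trace:
          on_event(ev, arg)

      # At end of trace, any remaining 'holds' fact violates the eventually-release rule.
      errors += [f"{r} acquired but never released" for (_, r) in facts]
      print(errors)    # ['release of r3 without acquire', 'r2 acquired but never released']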

  1. A Scala DSL for RETE-Based Runtime Verification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2013-01-01

    Runtime verification (RV) consists in part of checking execution traces against formalized specifications. Several systems have emerged, most of which support specification notations based on state machines, regular expressions, temporal logic, or grammars. The field of Artificial Intelligence (AI) has for an even longer period of time studied rule-based production systems, which at a closer look appear to be relevant for RV, although seemingly focused on slightly different application domains, such as for example business processes and expert systems. The core algorithm in many of these systems is the Rete algorithm. We have implemented a Rete-based runtime verification system, named LogFire (originally intended for offline log analysis but also applicable to online analysis), as an internal DSL in the Scala programming language, using Scala's support for defining DSLs. This combination appears attractive from a practical point of view. Our contribution is in part conceptual in arguing that such rule-based frameworks originating from AI may be suited for RV.

  2. Knowledge-based nursing diagnosis

    NASA Astrophysics Data System (ADS)

    Roy, Claudette; Hay, D. Robert

    1991-03-01

    Nursing diagnosis is an integral part of the nursing process and determines the interventions leading to outcomes for which the nurse is accountable. Diagnosis under the time constraints of modern nursing can benefit from a computer assist. A knowledge-based engineering approach was developed to address these problems. A number of problems were addressed during system design to make the system practical, extending beyond the capture of knowledge. The issues involved in implementing a professional knowledge base in a clinical setting are discussed. System functions, structure, interfaces, health care environment, and terminology and taxonomy are discussed. An integrated system concept from assessment through intervention and evaluation is outlined.

  3. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    NASA Astrophysics Data System (ADS)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

    How can digital evidence be captured and preserved securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected at the crime scene has vital importance. On one side, it is a very challenging task for forensics professionals to collect it without any loss or damage. On the other, there is the second problem of providing integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, no previous work proposes a systematic model with a holistic view that addresses all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping as well as the latest technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.
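
    The cryptographic core of such a scheme can be sketched with the Python cryptography package as below: the evidence digest is bound to a capture time and signed with the collection device's key. A real deployment would rely on a PKI certificate chain and a trusted time-stamping authority, which this sketch omits; the payload and key handling are illustrative only.

      import hashlib
      from datetime import datetime, timezone
      from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
      from cryptography.exceptions import InvalidSignature

      # Key pair of the evidence-collection device (in a PKI setting this key would be
      # certified; certificate handling is omitted in this sketch).
      device_key = Ed25519PrivateKey.generate()
      device_pub = device_key.public_key()

      evidence = b"raw bytes of an acquired disk image or photograph"   # hypothetical payload

      # Bind the evidence digest to a capture time and sign the bundle.
      digest = hashlib.sha256(evidence).hexdigest()
      captured_at = datetime.now(timezone.utc).isoformat()
      record = f"{digest}|{captured_at}".encode()
      signature = device_key.sign(record)

      # Later, an examiner re-computes the digest and verifies integrity and origin.
      check = f"{hashlib.sha256(evidence).hexdigest()}|{captured_at}".encode()
      try:
          device_pub.verify(signature, check)
          print("evidence integrity and origin verified")
      except InvalidSignature:
          print("evidence has been altered or was not signed by this device")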

  4. Knowledge based question answering

    SciTech Connect

    Pazzani, M.J.; Engelman, C.

    1983-01-01

    The natural language database query system incorporated in the Knobs Interactive Planning System comprises a dictionary-driven parser, APE-II, and a script interpreter which yield a conceptual dependency as a representation of the meaning of user input. A conceptualisation pattern-matching production system then determines and executes a procedure for extracting the desired information from the database. In contrast to syntax-driven question-answering systems, e.g. those based on ATN parsers, APE-II is driven bottom-up by expectations associated with word meanings. The goals of this approach include utilising similar representations for questions with similar meanings but widely varying surface structures, developing a powerful mechanism for the disambiguation of words with multiple meanings and the determination of pronoun referents, answering questions which require inferences to be understood, and interpreting ellipses and ungrammatical statements. The Knobs demonstration system is an experimental expert system for air force mission planning applications. 16 refs.

  5. Verification of Emergent Behaviors in Swarm-based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James

    2004-01-01

    The emergent properties of swarms make swarm-based missions powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of swarm-based missions. The Autonomous Nano-Technology Swarm (ANTS) mission is being used as an example and case study for swarm-based missions to experiment and test current formal methods with intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior. This paper introduces how intelligent swarm technology is being proposed for NASA missions, and gives the results of a comparison of several formal methods and approaches for specifying intelligent swarm-based systems and their effectiveness for predicting emergent behavior.

  6. Crowd-Sourced Help with Emergent Knowledge for Optimized Formal Verification (CHEKOFV)

    DTIC Science & Technology

    2016-03-01

    software. Leveraging human pattern recognition skills, the CSFV games provide formal verification proofs that a machine analyzing the code cannot. The SRI...effectiveness and reduce the cost to verify code. Subject terms: formal software verification, crowd-sourcing, games, cyber security, human-machine systems...abstract interpretation and machine learning

  7. SystemVerilog-Based Verification Environment Employing Multiple Inheritance of SystemC

    NASA Astrophysics Data System (ADS)

    You, Myoung-Keun; Song, Gi-Yong

    In this paper, we describe a verification environment based on a constrained-random layered testbench using SystemVerilog OOP. As the SystemVerilog OOP technique does not allow multiple inheritance, we adopt SystemC to design the components of the verification environment that employ multiple inheritance. The SystemC design unit is then linked to the SystemVerilog-based verification environment using SystemVerilog DPI and a ModelSim macro. Employing the multiple inheritance of SystemC makes the design phase of the verification environment simple and easy through source code reusability, without the corruption that can arise from multi-level single inheritance.

  8. Verification issues for rule-based expert systems

    NASA Technical Reports Server (NTRS)

    Culbert, Chris; Riley, Gary; Savely, Robert T.

    1987-01-01

    Verification and validation of expert systems is very important for the future success of this technology. Software will never be used in non-trivial applications unless the program developers can assure both users and managers that the software is reliable and generally free from error. Therefore, verification and validation of expert systems must be done. The primary hindrance to effective verification and validation is the use of methodologies which do not produce testable requirements. An extension of the flight technique panels used in previous NASA programs should provide both documented requirements and very high levels of verification for expert systems.

  9. An Economical Framework for Verification of Swarm-Based Algorithms Using Small, Autonomous Robots

    DTIC Science & Technology

    2006-09-01

    NAWCWD TP 8630: An Economical Framework for Verification of Swarm-Based Algorithms Using Small, Autonomous Robots, by James Bobinchak, Eric Ford, Rodney Heil, and Duane Schwartzwald

  10. Reusing Design Knowledge Based on Design Cases and Knowledge Map

    ERIC Educational Resources Information Center

    Yang, Cheng; Liu, Zheng; Wang, Haobai; Shen, Jiaoqi

    2013-01-01

    Design knowledge was reused for innovative design work to support designers with product design knowledge and help designers who lack rich experiences to improve their design capacity and efficiency. First, based on the ontological model of product design knowledge constructed by taxonomy, implicit and explicit knowledge was extracted from some…

  11. Knowledge Acquisition Using Linguistic-Based Knowledge Analysis

    Treesearch

    Daniel L. Schmoldt

    1998-01-01

    Most knowledge-based system development efforts include acquiring knowledge from one or more sources. Difficulties associated with this knowledge acquisition task are readily acknowledged by most researchers. While a variety of knowledge acquisition methods have been reported, little has been done to organize those different methods and to suggest how to apply them...

  12. Reusing Design Knowledge Based on Design Cases and Knowledge Map

    ERIC Educational Resources Information Center

    Yang, Cheng; Liu, Zheng; Wang, Haobai; Shen, Jiaoqi

    2013-01-01

    Design knowledge was reused for innovative design work to support designers with product design knowledge and help designers who lack rich experiences to improve their design capacity and efficiency. First, based on the ontological model of product design knowledge constructed by taxonomy, implicit and explicit knowledge was extracted from some…

  13. Reliability-Based Decision Fusion in Multimodal Biometric Verification Systems

    NASA Astrophysics Data System (ADS)

    Kryszczuk, Krzysztof; Richiardi, Jonas; Prodanov, Plamen; Drygajlo, Andrzej

    2007-12-01

    We present a methodology of reliability estimation in the multimodal biometric verification scenario. Reliability estimation has shown to be an efficient and accurate way of predicting and correcting erroneous classification decisions in both unimodal (speech, face, online signature) and multimodal (speech and face) systems. While the initial research results indicate the high potential of the proposed methodology, the performance of the reliability estimation in a multimodal setting has not been sufficiently studied or evaluated. In this paper, we demonstrate the advantages of using the unimodal reliability information in order to perform an efficient biometric fusion of two modalities. We further show the presented method to be superior to state-of-the-art multimodal decision-level fusion schemes. The experimental evaluation presented in this paper is based on the popular benchmarking bimodal BANCA database.
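
    A toy version of reliability-weighted decision fusion is sketched below; the scores, reliability estimates, and acceptance threshold are hypothetical, and the paper's actual reliability estimators are not reproduced.

      import numpy as np

      def fuse(scores, reliabilities):
          """Weight each unimodal match score by the estimated reliability of its classifier."""
          w = np.asarray(reliabilities, dtype=float)
          w = w / w.sum()
          return float(np.dot(w, scores))

      # Hypothetical match scores in [0, 1] from a face and a speech verifier for one claim.
      face_score, speech_score = 0.42, 0.81
      # Hypothetical reliability estimates (e.g. from signal quality or classifier margin):
      # the face image is poorly lit, so its decision is deemed less reliable.
      face_rel, speech_rel = 0.3, 0.9

      fused = fuse([face_score, speech_score], [face_rel, speech_rel])
      print("fused score:", round(fused, 3), "accept:", fused > 0.5)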

  14. Model Based Verification of Cyber Range Event Environments

    DTIC Science & Technology

    2015-12-10

    Environment Representation (CCER) ontology. We also provide an overview of a methodology to specify verification rules and the corresponding error...lack models for their configuration. Our main contributions in this paper are the following. First, we have developed a configuration ontology...the CCER ontology, and its current component ontologies. Section 3 describes the connections between diagrams, verification rules, and error messages

  15. Model Based Verification of Cyber Range Event Environments

    DTIC Science & Technology

    2015-11-13

    Environment Representation (CCER) ontology. We also provide an overview of a methodology to specify verification rules and the corresponding error...which lack models for their configuration. Our main contributions in this paper are the following. First, we have developed a configuration ontology...the CCER ontology, and its current component ontologies. Section 3 describes the connections between diagrams, verification rules, and error

  16. The data dictionary: A view into the CTBT knowledge base

    SciTech Connect

    Shepherd, E.R.; Keyser, R.G.; Armstrong, H.M.

    1997-08-01

    The data dictionary for the Comprehensive Test Ban Treaty (CTBT) knowledge base provides a comprehensive, current catalog of the projected contents of the knowledge base. It is written from a data definition view of the knowledge base and therefore organizes information in a fashion that allows logical storage within the computer. The data dictionary introduces two organization categories of data: the datatype, which is a broad, high-level category of data, and the dataset, which is a specific instance of a datatype. The knowledge base, and thus the data dictionary, consist of a fixed, relatively small number of datatypes, but new datasets are expected to be added on a regular basis. The data dictionary is a tangible result of the design effort for the knowledge base and is intended to be used by anyone who accesses the knowledge base for any purpose, such as populating the knowledge base with data, or accessing the data for use with automatic data processing (ADP) routines, or browsing through the data for verification purposes. For these two reasons, it is important to discuss the development of the data dictionary as well as to describe its contents to better understand its usefulness; that is the purpose of this paper.

  17. Population Education: A Knowledge Base.

    ERIC Educational Resources Information Center

    Jacobson, Willard J.

    To aid junior high and high school educators and curriculum planners as they develop population education programs, the book provides an overview of the population education knowledge base. In addition, it suggests learning activities, discussion questions, and background information which can be integrated into courses dealing with population,…

  18. Population Education: A Knowledge Base.

    ERIC Educational Resources Information Center

    Jacobson, Willard J.

    To aid junior high and high school educators and curriculum planners as they develop population education programs, the book provides an overview of the population education knowledge base. In addition, it suggests learning activities, discussion questions, and background information which can be integrated into courses dealing with population,…

  19. Epistemology of knowledge based simulation

    SciTech Connect

    Reddy, R.

    1987-04-01

    Combining artificial intelligence concepts with traditional simulation methodologies yields a powerful design support tool known as knowledge-based simulation. This approach turns a descriptive simulation tool into a prescriptive tool, one which recommends specific goals. Much work in the area of general goal processing and explanation of recommendations remains to be done.

  20. Retina verification system based on biometric graph matching.

    PubMed

    Lajevardi, Seyed Mehdi; Arakala, Arathi; Davis, Stephen A; Horadam, Kathy J

    2013-09-01

    This paper presents an automatic retina verification framework based on the biometric graph matching (BGM) algorithm. The retinal vasculature is extracted using a family of matched filters in the frequency domain and morphological operators. Then, retinal templates are defined as formal spatial graphs derived from the retinal vasculature. The BGM algorithm, a noisy graph matching algorithm, robust to translation, non-linear distortion, and small rotations, is used to compare retinal templates. The BGM algorithm uses graph topology to define three distance measures between a pair of graphs, two of which are new. A support vector machine (SVM) classifier is used to distinguish between genuine and imposter comparisons. Using single as well as multiple graph measures, the classifier achieves complete separation on a training set of images from the VARIA database (60% of the data), equaling the state-of-the-art for retina verification. Because the available data set is small, kernel density estimation (KDE) of the genuine and imposter score distributions of the training set are used to measure performance of the BGM algorithm. In the one dimensional case, the KDE model is validated with the testing set. A 0 EER on testing shows that the KDE model is a good fit for the empirical distribution. For the multiple graph measures, a novel combination of the SVM boundary and the KDE model is used to obtain a fair comparison with the KDE model for the single measure. A clear benefit in using multiple graph measures over a single measure to distinguish genuine and imposter comparisons is demonstrated by a drop in theoretical error of between 60% and more than two orders of magnitude.
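
    The two statistical ingredients named above, an SVM over several distance measures and kernel density estimation of genuine/imposter score distributions, can be sketched as follows with scikit-learn and SciPy; the distance values are synthetic stand-ins for BGM graph distances.

      import numpy as np
      from scipy.stats import gaussian_kde
      from sklearn.svm import SVC

      rng = np.random.default_rng(7)

      # Synthetic stand-ins for three graph-distance measures per comparison (lower = more similar).
      genuine  = rng.normal(loc=[0.2, 0.3, 0.25], scale=0.05, size=(100, 3))
      imposter = rng.normal(loc=[0.6, 0.7, 0.65], scale=0.08, size=(100, 3))
      X = np.vstack([genuine, imposter])
      y = np.array([1] * 100 + [0] * 100)

      clf = SVC(kernel="linear").fit(X, y)
      print("training accuracy:", clf.score(X, y))          # separability of the two classes

      # KDE models of a single distance measure for genuine and imposter comparisons.
      kde_gen, kde_imp = gaussian_kde(genuine[:, 0]), gaussian_kde(imposter[:, 0])
      threshold = 0.42                                        # illustrative operating point
      far = kde_imp.integrate_box_1d(-np.inf, threshold)      # imposters accepted (distance below threshold)
      frr = kde_gen.integrate_box_1d(threshold, np.inf)       # genuines rejected
      print(f"KDE estimate at threshold {threshold}: FAR={far:.4f}, FRR={frr:.4f}")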

  1. 76 FR 60112 - Consent Based Social Security Number Verification (CBSV) Service

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-28

    ... verifications to third party requesters based on consent. The CBSV process provides the business community and... private businesses and other requesters who obtain a valid, signed consent form from the Social Security... addition to the benefit of providing high volume, centralized SSN verification services to the...

  2. Automated knowledge-base refinement

    NASA Technical Reports Server (NTRS)

    Mooney, Raymond J.

    1994-01-01

    Over the last several years, we have developed several systems for automatically refining incomplete and incorrect knowledge bases. These systems are given an imperfect rule base and a set of training examples and minimally modify the knowledge base to make it consistent with the examples. One of our most recent systems, FORTE, revises first-order Horn-clause knowledge bases. This system can be viewed as automatically debugging Prolog programs based on examples of correct and incorrect I/O pairs. In fact, we have already used the system to debug simple Prolog programs written by students in a programming language course. FORTE has also been used to automatically induce and revise qualitative models of several continuous dynamic devices from qualitative behavior traces. For example, it has been used to induce and revise a qualitative model of a portion of the Reaction Control System (RCS) of the NASA Space Shuttle. By fitting a correct model of this portion of the RCS to simulated qualitative data from a faulty system, FORTE was also able to correctly diagnose simple faults in this system.

  3. Reducing scan angle using adaptive prior knowledge for a limited-angle intrafraction verification (LIVE) system for conformal arc radiotherapy

    NASA Astrophysics Data System (ADS)

    Zhang, Yawei; Yin, Fang-Fang; Zhang, You; Ren, Lei

    2017-05-01

    The purpose of this study is to develop an adaptive prior knowledge guided image estimation technique to reduce the scan angle needed in the limited-angle intrafraction verification (LIVE) system for 4D-CBCT reconstruction. The LIVE system has been previously developed to reconstruct 4D volumetric images on-the-fly during arc treatment for intrafraction target verification and dose calculation. In this study, we developed an adaptive constrained free-form deformation reconstruction technique in LIVE to further reduce the scanning angle needed to reconstruct the 4D-CBCT images for faster intrafraction verification. This technique uses free-form deformation with energy minimization to deform prior images to estimate 4D-CBCT based on kV-MV projections acquired over an extremely limited angle (orthogonal 3°) during the treatment. Note that the prior images are adaptively updated using the latest CBCT images reconstructed by LIVE during treatment, to utilize the continuity of the respiratory motion. The 4D digital extended-cardiac-torso (XCAT) phantom and a CIRS 008A dynamic thoracic phantom were used to evaluate the effectiveness of this technique. The reconstruction accuracy of the technique was evaluated by calculating both the center-of-mass shift (COMS) and the 3D volume percentage difference (VPD) of the tumor between the reconstructed images and the true on-board images. The performance of the technique was also assessed with varied breathing signals against scanning angle, lesion size, lesion location, projection sampling interval, and scanning direction. In the XCAT study, using orthogonal-view 3° kV and portal MV projections, this technique achieved an average tumor COMS/VPD of 0.4 ± 0.1 mm/5.5 ± 2.2%, 0.6 ± 0.3 mm/7.2 ± 2.8%, 0.5 ± 0.2 mm/7.1 ± 2.6%, 0.6 ± 0.2 mm/8.3 ± 2.4%, for baseline drift, amplitude variation, phase shift, and patient breathing signal variation

  4. Knowledge-based media adaptation

    NASA Astrophysics Data System (ADS)

    Leopold, Klaus; Jannach, Dietmar; Hellwagner, Hermann

    2004-10-01

    This paper introduces the principal approach and describes the basic architecture and current implementation of the knowledge-based multimedia adaptation framework we are currently developing. The framework can be used in Universal Multimedia Access scenarios, where multimedia content has to be adapted to specific usage environment parameters (network and client device capabilities, user preferences). Using knowledge-based techniques (state-space planning), the framework automatically computes an adaptation plan, i.e., a sequence of media conversion operations, to transform the multimedia resources to meet the client's requirements or constraints. The system takes as input standards-compliant descriptions of the content (using MPEG-7 metadata) and of the target usage environment (using MPEG-21 Digital Item Adaptation metadata) to derive start and goal states for the planning process, respectively. Furthermore, declarative descriptions of the conversion operations (such as available via software library functions) enable existing adaptation algorithms to be invoked without requiring programming effort. A running example in the paper illustrates the descriptors and techniques employed by the knowledge-based media adaptation system.
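
    The planning step can be illustrated with a tiny breadth-first state-space search over declaratively described conversion operations, as below; the operation set, property names, and values are hypothetical and merely stand in for the framework's planner and its MPEG-7/MPEG-21 inputs.

      from collections import deque

      # Declarative descriptions of conversion operations: preconditions and effects on
      # a state of content properties (names and values are illustrative only).
      OPERATIONS = {
          "transcode_to_h264": ({"codec": "mpeg2"}, {"codec": "h264"}),
          "downscale_to_qvga": ({"resolution": "vga"}, {"resolution": "qvga"}),
          "reduce_bitrate":    ({"codec": "h264"}, {"bitrate": "low"}),
      }

      def applicable(state, pre):
          return all(state.get(k) == v for k, v in pre.items())

      def plan(start, goal):
          """Breadth-first search for a sequence of operations reaching the goal state."""
          queue = deque([(start, [])])
          seen = {tuple(sorted(start.items()))}
          while queue:
              state, steps = queue.popleft()
              if all(state.get(k) == v for k, v in goal.items()):
                  return steps
              for name, (pre, effect) in OPERATIONS.items():
                  if applicable(state, pre):
                      new_state = {**state, **effect}
                      key = tuple(sorted(new_state.items()))
                      if key not in seen:
                          seen.add(key)
                          queue.append((new_state, steps + [name]))
          return None

      content = {"codec": "mpeg2", "resolution": "vga", "bitrate": "high"}   # content description
      device  = {"codec": "h264", "resolution": "qvga", "bitrate": "low"}    # usage-environment constraints
      print(plan(content, device))   # e.g. ['transcode_to_h264', 'downscale_to_qvga', 'reduce_bitrate']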

  5. Cleaning Verification Monitor Technique Based on Infrared Optical Methods

    DTIC Science & Technology

    2004-10-01

    Cleaning Verification Techniques.” Real-time methods to provide both qualitative and quantitative assessments of surface cleanliness are needed for a...detection VCPI method offer a wide range of complementary capabilities in real-time surface cleanliness verification. Introduction Currently...also has great potential to reduce or eliminate premature failures of surface coatings caused by a lack of surface cleanliness. Additional

  6. Verification of the Forecast Errors Based on Ensemble Spread

    NASA Astrophysics Data System (ADS)

    Vannitsem, S.; Van Schaeybroeck, B.

    2014-12-01

    The use of ensemble prediction systems allows for an uncertainty estimation of the forecast. Most end users do not require all the information contained in an ensemble and prefer the use of a single uncertainty measure, the ensemble spread, which serves to forecast the forecast error. It is, however, unclear how the quality of these error forecasts can best be assessed based on spread and forecast error only. Spread-error verification is intricate for two reasons: first, for each probabilistic forecast only one observation is available, and second, the spread is not meant to provide an exact prediction of the error. Despite these facts, several advances were recently made, all based on traditional deterministic verification of the error forecast. In particular, Grimit and Mass (2007) and Hopson (2014) considered in detail the strengths and weaknesses of the spread-error correlation, while Christensen et al (2014) developed a proper-score extension of the mean squared error. However, due to the strong variance of the error given a certain spread, the error forecast should preferably be considered probabilistic in nature. In the present work, different probabilistic error models are proposed depending on the spread-error metrics used. Most of these models allow for the discrimination of a perfect forecast from an imperfect one, independent of the underlying ensemble distribution. The new spread-error scores are tested on the ensemble prediction system of the European Centre for Medium-Range Weather Forecasts (ECMWF) over Europe and Africa. References: Christensen, H. M., Moroz, I. M. and Palmer, T. N., 2014: Evaluation of ensemble forecast uncertainty using a new proper score: application to medium-range and seasonal forecasts. In press, Quarterly Journal of the Royal Meteorological Society. Grimit, E. P., and C. F. Mass, 2007: Measuring the ensemble spread-error relationship with a probabilistic approach: Stochastic ensemble results. Mon. Wea. Rev., 135, 203
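
    As an illustration of the kind of spread-error diagnostics discussed above (not the authors' new probabilistic scores), the following sketch computes the ensemble spread, the error of the ensemble mean, their correlation, and a crude reliability check on synthetic data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_cases, n_members = 500, 20

    # Synthetic truth and ensemble: per-case uncertainty varies, error scales with it on average.
    sigma = rng.uniform(0.5, 2.0, size=n_cases)                       # "true" per-case uncertainty
    truth = rng.normal(0.0, sigma)                                    # verifying observation
    ensemble = rng.normal(0.0, sigma[:, None], size=(n_cases, n_members))

    ens_mean = ensemble.mean(axis=1)
    spread = ensemble.std(axis=1, ddof=1)        # the single uncertainty measure given to users
    abs_error = np.abs(ens_mean - truth)         # the forecast error the spread should predict

    # Traditional deterministic diagnostic: spread-error correlation (cf. Grimit and Mass 2007).
    corr = np.corrcoef(spread, abs_error)[0, 1]
    print(f"spread-error correlation: {corr:.2f}")

    # A crude probabilistic view: frequency with which the error stays below k * spread,
    # which for a reliable ensemble should be roughly consistent with a Gaussian error model.
    for k in (1.0, 2.0):
        print(f"P(|error| < {k} * spread) = {(abs_error < k * spread).mean():.2f}")
    ```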

  7. Verification and Planning Based on Coinductive Logic Programming

    NASA Technical Reports Server (NTRS)

    Bansal, Ajay; Min, Richard; Simon, Luke; Mallya, Ajay; Gupta, Gopal

    2008-01-01

    Coinduction is a powerful technique for reasoning about unfounded sets, unbounded structures, infinite automata, and interactive computations [6]. Where induction corresponds to least fixed point semantics, coinduction corresponds to greatest fixed point semantics. Recently coinduction has been incorporated into logic programming and an elegant operational semantics developed for it [11, 12]. This operational semantics is the greatest fixed point counterpart of SLD resolution (SLD resolution imparts operational semantics to least fixed point based computations) and is termed co-SLD resolution. In co-SLD resolution, a predicate goal p(t) succeeds if it unifies with one of its ancestor calls. In addition, rational infinite terms are allowed as arguments of predicates. Infinite terms are represented as solutions to unification equations and the occurs check is omitted during the unification process. Coinductive Logic Programming (Co-LP) and Co-SLD resolution can be used to elegantly perform model checking and planning. A combined SLD and Co-SLD resolution based LP system forms the common basis for planning, scheduling, verification, model checking, and constraint solving [9, 4]. This is achieved by amalgamating SLD resolution, co-SLD resolution, and constraint logic programming [13] in a single logic programming system. Given that parallelism in logic programs can be implicitly exploited [8], complex, compute-intensive applications (planning, scheduling, model checking, etc.) can be executed in parallel on multi-core machines. Parallel execution can result in speed-ups as well as in larger instances of the problems being solved. In the remainder we elaborate on (i) how planning can be elegantly and efficiently performed under real-time constraints, (ii) how real-time systems can be elegantly and efficiently model-checked, as well as (iii) how hybrid systems can be verified in a combined system with both co-SLD and SLD resolution. Implementations of co-SLD resolution
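
    A toy illustration of the coinductive success rule described above (a goal succeeds if it unifies with an ancestor call), restricted to ground atoms so that unification reduces to equality; this sketches only the idea, not co-SLD resolution with rational infinite terms.

    ```python
    # Program clauses: head atom -> list of alternative bodies (ground, propositional-style).
    # Example: a property "p" defined coinductively by the cyclic clause p :- q, p.
    PROGRAM = {
        "p": [["q", "p"]],
        "q": [[]],          # q is a fact
    }

    def co_solve(goal, ancestors=()):
        """Greatest-fixed-point flavoured solver: succeed if the goal equals an ancestor
        (coinductive hypothesis rule) or if some clause body succeeds recursively."""
        if goal in ancestors:          # co-SLD rule: goal "unifies" with an ancestor call
            return True
        for body in PROGRAM.get(goal, []):
            if all(co_solve(sub, ancestors + (goal,)) for sub in body):
                return True
        return False

    print(co_solve("p"))   # True: p succeeds coinductively via the cycle p :- q, p
    print(co_solve("r"))   # False: no clause and no ancestor
    ```

    Under ordinary (inductive) SLD resolution the goal p would loop forever on this program; the ancestor check is what gives the greatest-fixed-point reading.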

  8. Dynamic Signature Verification System Based on One Real Signature.

    PubMed

    Diaz, Moises; Fischer, Andreas; Ferrer, Miguel A; Plamondon, Rejean

    2016-12-06

    The dynamic signature is a biometric trait widely used and accepted for verifying a person's identity. Current automatic signature-based biometric systems typically require five, ten, or even more specimens of a person's signature to learn intrapersonal variability sufficient to provide an accurate verification of the individual's identity. To mitigate this drawback, this paper proposes a procedure for training with only a single reference signature. Our strategy consists of duplicating the given signature a number of times and training an automatic signature verifier with each of the resulting signatures. The duplication scheme is based on a sigma lognormal decomposition of the reference signature. Two methods are presented to create human-like duplicated signatures: the first varies the strokes' lognormal parameters (stroke-wise) whereas the second modifies their virtual target points (target-wise). A challenging benchmark, assessed with multiple state-of-the-art automatic signature verifiers and multiple databases, proves the robustness of the system. Experimental results suggest that our system, with a single reference signature, is capable of achieving a similar performance to standard verifiers trained with up to five signature specimens.
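
    A rough sketch of the stroke-wise duplication idea, assuming a lognormal stroke speed profile v(t) = D / (sigma * sqrt(2*pi) * (t - t0)) * exp(-(ln(t - t0) - mu)^2 / (2*sigma^2)); the parameter values and the amount of jitter below are illustrative, not the authors' calibrated settings.

    ```python
    import math
    import random

    def lognormal_speed(t, D, t0, mu, sigma):
        """Lognormal speed profile of a single stroke (zero before stroke onset t0)."""
        if t <= t0:
            return 0.0
        x = math.log(t - t0)
        return D / (sigma * math.sqrt(2 * math.pi) * (t - t0)) * math.exp(-(x - mu) ** 2 / (2 * sigma ** 2))

    def duplicate_stroke(params, jitter=0.05, rng=random.Random(0)):
        """Stroke-wise duplication: perturb each stroke parameter by a small relative amount."""
        return {k: v * (1.0 + rng.uniform(-jitter, jitter)) for k, v in params.items()}

    # One (hypothetical) stroke extracted from the single reference signature.
    reference_stroke = {"D": 12.0, "t0": 0.05, "mu": -1.6, "sigma": 0.3}

    # Generate human-like duplicates and sample their speed profiles.
    for i in range(3):
        dup = duplicate_stroke(reference_stroke)
        v_peak = max(lognormal_speed(t / 100.0, **dup) for t in range(1, 100))
        print(f"duplicate {i}: params={dup}, peak speed ~ {v_peak:.2f}")
    ```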

  9. Ground-based visual inspection for CTBT verification

    SciTech Connect

    Hawkins, W.; Wohletz, K.

    1997-11-01

    Ground-based visual inspection will play an essential role in On-Site Inspection (OSI) for Comprehensive Test Ban Treaty (CTBT) verification. Although seismic and remote sensing techniques are the best understood and most developed methods for detection of evasive testing of nuclear weapons, visual inspection will greatly augment the certainty and detail of understanding provided by these more traditional methods. Not only can ground-based visual inspection offer effective documentation in cases of suspected nuclear testing, but it also can provide accurate source location and testing media properties necessary for detailed analysis of seismic records. For testing in violation of the CTBT, an offending state may attempt to conceal the test, which most likely will be achieved by underground burial. While such concealment may not prevent seismic detection, evidence of test deployment, location, and yield can be disguised. In this light, if a suspicious event is detected by seismic or other remote methods, visual inspection of the event area is necessary to document any evidence that might support a claim of nuclear testing and provide data needed to further interpret seismic records and guide further investigations. However, the methods for visual inspection are not widely known nor appreciated, and experience is presently limited. Visual inspection can be achieved by simple, non-intrusive means, primarily geological in nature, and it is the purpose of this report to describe the considerations, procedures, and equipment required to field such an inspection. The inspections will be carried out by inspectors from members of the CTBT Organization.

  10. Finger vein verification system based on sparse representation.

    PubMed

    Xin, Yang; Liu, Zhi; Zhang, Haixia; Zhang, Hong

    2012-09-01

    Finger vein verification is a promising biometric pattern for personal identification in terms of security and convenience. The recognition performance of this technology heavily relies on the quality of finger vein images and on the recognition algorithm. To achieve efficient recognition performance, a special finger vein imaging device is developed, and a finger vein recognition method based on sparse representation is proposed. The motivation for the proposed method is that finger vein images exhibit a sparse property. In the proposed system, the regions of interest (ROIs) in the finger vein images are segmented and enhanced. Sparse representation and sparsity preserving projection on ROIs are performed to obtain the features. Finally, the features are measured for recognition. An equal error rate of 0.017% was achieved based on the finger vein image database, which contains images that were captured by using the near-IR imaging device that was developed in this study. The experimental results demonstrate that the proposed method is faster and more robust than previous methods.
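
    A simplified sketch of sparse-representation-based classification in the spirit of the method above: a test ROI feature vector is coded over a dictionary of training vectors (here with a greedy orthogonal matching pursuit rather than the paper's exact solver), and the claimed identity is accepted if its class residual is smallest and below a threshold. Feature dimensions, data, and the threshold are invented for illustration.

    ```python
    import numpy as np

    def omp(D, y, n_nonzero=10):
        """Greedy orthogonal matching pursuit: sparse code x with D @ x ~= y."""
        residual, support = y.copy(), []
        x = np.zeros(D.shape[1])
        for _ in range(n_nonzero):
            support.append(int(np.argmax(np.abs(D.T @ residual))))
            sub = D[:, support]
            coef, *_ = np.linalg.lstsq(sub, y, rcond=None)
            residual = y - sub @ coef
        x[support] = coef
        return x

    def verify(D, labels, y, claimed, threshold=0.6):
        """Accept the claimed identity if its per-class reconstruction residual is the
        smallest among all classes and below the threshold."""
        x = omp(D, y)
        residuals = {}
        for c in set(labels):
            mask = np.array([l == c for l in labels])
            residuals[c] = np.linalg.norm(y - D[:, mask] @ x[mask])
        best = min(residuals, key=residuals.get)
        return best == claimed and residuals[claimed] < threshold

    # Toy data: 64-dimensional features, 5 subjects x 6 training images each.
    rng = np.random.default_rng(1)
    labels = [s for s in range(5) for _ in range(6)]
    D = rng.normal(size=(64, len(labels)))
    D /= np.linalg.norm(D, axis=0)                      # unit-norm dictionary atoms
    probe = D[:, 3] + 0.05 * rng.normal(size=64)        # noisy sample of subject 0
    print(verify(D, labels, probe, claimed=0))
    ```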

  11. Minimum classification error-based weighted support vector machine kernels for speaker verification.

    PubMed

    Suh, Youngjoo; Kim, Hoirin

    2013-04-01

    Support vector machines (SVMs) have been proved to be an effective approach to speaker verification. An appropriate selection of the kernel function is a key issue in SVM-based classification. In this letter, a new SVM-based speaker verification method utilizing weighted kernels in the Gaussian mixture model supervector space is proposed. The weighted kernels are derived by using the discriminative training approach, which minimizes speaker verification errors. Experiments performed on the NIST 2008 speaker recognition evaluation task showed that the proposed approach provides substantially improved performance over the baseline kernel-based method.

  12. Knowledge based jet engine diagnostics

    NASA Technical Reports Server (NTRS)

    Jellison, Timothy G.; Dehoff, Ronald L.

    1987-01-01

    A fielded expert system automates equipment fault isolation and recommends corrective maintenance action for Air Force jet engines. The knowledge based diagnostics tool was developed as an expert system interface to the Comprehensive Engine Management System, Increment IV (CEMS IV), the standard Air Force base level maintenance decision support system. XMAM (trademark), the Expert Maintenance Tool, automates procedures for troubleshooting equipment faults, provides a facility for interactive user training, and fits within a diagnostics information feedback loop to improve the troubleshooting and equipment maintenance processes. The application of expert diagnostics to the Air Force A-10A aircraft TF-34 engine equipped with the Turbine Engine Monitoring System (TEMS) is presented.

  13. Cooperating knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Feigenbaum, Edward A.; Buchanan, Bruce G.

    1988-01-01

    This final report covers work performed under Contract NCC2-220 between NASA Ames Research Center and the Knowledge Systems Laboratory, Stanford University. The period of research was from March 1, 1987 to February 29, 1988. Topics covered were as follows: (1) concurrent architectures for knowledge-based systems; (2) methods for the solution of geometric constraint satisfaction problems, and (3) reasoning under uncertainty. The research in concurrent architectures was co-funded by DARPA, as part of that agency's Strategic Computing Program. The research has been in progress since 1985, under DARPA and NASA sponsorship. The research in geometric constraint satisfaction has been done in the context of a particular application, that of determining the 3-D structure of complex protein molecules, using the constraints inferred from NMR measurements.

  14. Model-Based Verification and Validation of Spacecraft Avionics

    NASA Technical Reports Server (NTRS)

    Khan, M. Omair; Sievers, Michael; Standley, Shaun

    2012-01-01

    Verification and Validation (V&V) at JPL is traditionally performed on flight or flight-like hardware running flight software. For some time, the complexity of avionics has increased exponentially while the time allocated for system integration and associated V&V testing has remained fixed. There is an increasing need to perform comprehensive system level V&V using modeling and simulation, and to use scarce hardware testing time to validate models; the norm for thermal and structural V&V for some time. Our approach extends model-based V&V to electronics and software through functional and structural models implemented in SysML. We develop component models of electronics and software that are validated by comparison with test results from actual equipment. The models are then simulated enabling a more complete set of test cases than possible on flight hardware. SysML simulations provide access and control of internal nodes that may not be available in physical systems. This is particularly helpful in testing fault protection behaviors when injecting faults is either not possible or potentially damaging to the hardware. We can also model both hardware and software behaviors in SysML, which allows us to simulate hardware and software interactions. With an integrated model and simulation capability we can evaluate the hardware and software interactions and identify problems sooner. The primary missing piece is validating SysML model correctness against hardware; this experiment demonstrated such an approach is possible.

  15. Building validation tools for knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Stachowitz, R. A.; Chang, C. L.; Stock, T. S.; Combs, J. B.

    1987-01-01

    The Expert Systems Validation Associate (EVA), a validation system under development at the Lockheed Artificial Intelligence Center for more than a year, provides a wide range of validation tools to check the correctness, consistency and completeness of a knowledge-based system. A declarative meta-language (higher-order language) is used to create a generic version of EVA to validate applications written in arbitrary expert system shells. The architecture and functionality of EVA are presented. The functionality includes Structure Check, Logic Check, Extended Structure Check (using semantic information), Extended Logic Check, Semantic Check, Omission Check, Rule Refinement, Control Check, Test Case Generation, Error Localization, and Behavior Verification.
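
    A small sketch of the flavour of such structure and logic checks on a rule base (illustrative only; EVA's actual checks and meta-language are far richer): detecting rules with contradictory premises, redundant duplicate rules, and conclusions that are never used and never designated as outputs.

    ```python
    # Rules are (premises, conclusion) with literals like "x" or "not x" (illustrative format).
    RULES = [
        ({"valve_open", "not valve_open"}, "alarm"),      # contradictory premises (logic check)
        ({"pressure_high"}, "open_relief"),
        ({"pressure_high"}, "open_relief"),               # duplicate rule (redundancy check)
        ({"temp_high"}, "log_event"),                     # conclusion never used, not an output
    ]
    OUTPUTS = {"alarm", "open_relief"}

    def negate(lit):
        return lit[4:] if lit.startswith("not ") else "not " + lit

    def check(rules, outputs):
        issues, seen = [], set()
        used_premises = {p for prem, _ in rules for p in prem}
        for i, (prem, concl) in enumerate(rules):
            if any(negate(p) in prem for p in prem):
                issues.append(f"rule {i}: premises are contradictory")
            key = (frozenset(prem), concl)
            if key in seen:
                issues.append(f"rule {i}: duplicate of an earlier rule")
            seen.add(key)
            if concl not in outputs and concl not in used_premises:
                issues.append(f"rule {i}: conclusion '{concl}' is never used (dead end)")
        return issues

    for issue in check(RULES, OUTPUTS):
        print(issue)
    ```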

  16. Operational verification of a blow out preventer utilizing fiber Bragg grating based strain gauges

    NASA Astrophysics Data System (ADS)

    Turner, Alan L.; Loustau, Philippe; Thibodeau, Dan

    2015-05-01

    Ultra-deep water BOP (Blowout Preventer) operation poses numerous challenges in obtaining accurate knowledge of current system integrity and component condition; a salient example is the difficulty of verifying closure of the pipe and shearing rams during and after well control events. Ascertaining the integrity of these functions is currently based on a manual volume measurement performed with a stop watch. Advances in sensor technology now permit more accurate methods of BOP condition monitoring. Fiber optic sensing technology, and particularly fiber optic strain gauges, have evolved to a point where we can derive a good representation of what is happening inside a BOP by installing sensors on the outside shell. Function signatures can be baselined to establish thresholds that indicate successful function activation. Based on this knowledge base, signal variation over time can then be utilized to assess degradation of these functions and subsequent failure to function. Monitoring the BOP from the outside has the advantage of gathering data through a system that can be interfaced with risk-based integrity management software and/or a smart monitoring system that analyzes BOP control redundancies without the requirement of interfacing with OEM control systems. The paper will present the results of ongoing work on a fully instrumented 13-½" 10,000 psi pipe ram. Instrumentation includes commonly used pressure transducers, accelerometers, flow meters, and optical strain gauges. Correlation will be presented between flow, pressure, acceleration signatures and the fiber optic strain gauge's response as it relates to functional verification and component-level degradation trending.

  17. CommonKADS models for knowledge-based planning

    SciTech Connect

    Kingston, J.; Shadbolt, N.; Tate, A.

    1996-12-31

    The CommonKADS methodology is a collection of structured methods for building knowledge-based systems. A key component of CommonKADS is the library of generic inference models which can be applied to tasks of specified types. These generic models can either be used as frameworks for knowledge acquisition, or to verify the completeness of models developed by analysis of the domain. However, the generic models for some task types, such as knowledge-based planning, are not well-developed. Since knowledge-based planning is an important commercial application of Artificial Intelligence, there is a clear need for the development of generic models for planning tasks. Many of the generic models which currently exist have been derived from modelling of existing AI systems. These models have the strength of proven applicability. There are a number of well-known and well-tried AI planning systems in existence; one of the best known is the Open Planning Architecture (O-Plan). This paper describes the development of a CommonKADS generic inference model for knowledge-based planning tasks, based on the capabilities of the O-Plan system. The paper also describes the verification of this model in the context of a real-life planning task: the assignment and management of Royal Air Force Search and Rescue operations.

  18. Fuzzy-logic-based safety verification framework for nuclear power plants.

    PubMed

    Rastogi, Achint; Gabbar, Hossam A

    2013-06-01

    This article presents a practical implementation of a safety verification framework for nuclear power plants (NPPs) based on fuzzy logic where hazard scenarios are identified in view of safety and control limits in different plant process values. Risk is estimated quantitatively and compared with safety limits in real time so that safety verification can be achieved. Fuzzy logic is used to define safety rules that map hazard condition with required safety protection in view of risk estimate. Case studies are analyzed from NPP to realize the proposed real-time safety verification framework. An automated system is developed to demonstrate the safety limit for different hazard scenarios. © 2012 Society for Risk Analysis.
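
    A hand-rolled sketch of the fuzzy-logic idea described above (membership functions, rule weights, and limits are invented for illustration, not taken from the article): a normalised plant process value is fuzzified against safety limits, and simple rules map the hazard condition to a risk estimate that is compared with a safety threshold in real time.

    ```python
    def tri(x, a, b, c):
        """Triangular membership function with peak at b and support [a, c]."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def verify_safety(pressure, safety_limit=0.7):
        """Fuzzy rules mapping a (normalised) process value to a risk estimate in [0, 1]."""
        low    = tri(pressure, -0.1, 0.0, 0.5)
        normal = tri(pressure,  0.2, 0.5, 0.8)
        high   = tri(pressure,  0.5, 1.0, 1.1)

        # Rule base: each hazard condition implies a risk level (weights are illustrative).
        rules = [(low, 0.1), (normal, 0.3), (high, 0.9)]
        num = sum(mu * risk for mu, risk in rules)
        den = sum(mu for mu, _ in rules) or 1.0
        risk = num / den                       # weighted-average defuzzification

        return risk, ("SAFE" if risk < safety_limit else "PROTECTIVE ACTION REQUIRED")

    for p in (0.4, 0.75, 0.95):                # normalised pressure readings
        print(p, verify_safety(p))
    ```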

  19. A knowledge base architecture for distributed knowledge agents

    NASA Technical Reports Server (NTRS)

    Riedesel, Joel; Walls, Bryan

    1990-01-01

    A tuple space based object oriented model for knowledge base representation and interpretation is presented. An architecture for managing distributed knowledge agents is then implemented within the model. The general model is based upon a database implementation of a tuple space. Objects are then defined as an additional layer upon the database. The tuple space may or may not be distributed depending upon the database implementation. A language for representing knowledge and inference strategy is defined whose implementation takes advantage of the tuple space. The general model may then be instantiated in many different forms, each of which may be a distinct knowledge agent. Knowledge agents may communicate using tuple space mechanisms as in the LINDA model as well as using more well known message passing mechanisms. An implementation of the model is presented describing strategies used to keep inference tractable without giving up expressivity. An example applied to a power management and distribution network for Space Station Freedom is given.
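
    A minimal in-process sketch of the tuple space operations such agents could communicate through (LINDA-style out/rd/in with pattern matching); this is a toy illustration, not the database-backed, object-layered implementation the abstract describes.

    ```python
    class TupleSpace:
        """Toy LINDA-style tuple space: None in a pattern acts as a wildcard."""

        def __init__(self):
            self._tuples = []

        def out(self, *tup):                       # write a tuple into the space
            self._tuples.append(tuple(tup))

        def _match(self, pattern, tup):
            return len(pattern) == len(tup) and all(
                p is None or p == v for p, v in zip(pattern, tup))

        def rd(self, *pattern):                    # read (non-destructive)
            return next((t for t in self._tuples if self._match(pattern, t)), None)

        def in_(self, *pattern):                   # take (destructive read)
            t = self.rd(*pattern)
            if t is not None:
                self._tuples.remove(t)
            return t

    # Two hypothetical knowledge agents coordinating through the space.
    space = TupleSpace()
    space.out("fault", "bus_A", "overcurrent")     # monitoring agent posts a fault
    task = space.in_("fault", None, None)          # diagnosis agent takes the next fault
    print(task)                                    # ('fault', 'bus_A', 'overcurrent')
    ```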

  20. Knowledge Base Editor (SharpKBE)

    NASA Technical Reports Server (NTRS)

    Tikidjian, Raffi; James, Mark; Mackey, Ryan

    2007-01-01

    The SharpKBE software provides a graphical user interface environment for domain experts to build and manage knowledge base systems. Knowledge bases can be exported/translated to various target languages automatically, including customizable target languages.

  1. Isocenter verification for linac-based stereotactic radiation therapy: review of principles and techniques.

    PubMed

    Rowshanfarzad, Pejman; Sabet, Mahsheed; O'Connor, Daryl J; Greer, Peter B

    2011-11-15

    There have been several manual, semi-automatic and fully-automatic methods proposed for verification of the position of mechanical isocenter as part of comprehensive quality assurance programs required for linear accelerator-based stereotactic radiosurgery/radiotherapy (SRS/SRT) treatments. In this paper, a systematic review has been carried out to discuss the present methods for isocenter verification and compare their characteristics, to help physicists in making a decision on selection of their quality assurance routine.

  2. Knowledge-Based Inferences Are Not General

    ERIC Educational Resources Information Center

    Shears, Connie; Chiarello, Christine

    2004-01-01

    Although knowledge-based inferences (Graesser, Singer, & Trabasso, 1994) depend on general knowledge, there may be differences across knowledge areas in how they support these processes. This study explored processing differences between 2 areas of knowledge (physical cause-effect vs. goals and planning) to establish (a) that each would support…

  3. Method and computer product to increase accuracy of time-based software verification for sensor networks

    DOEpatents

    Foo Kune, Denis [Saint Paul, MN]; Mahadevan, Karthikeyan [Mountain View, CA]

    2011-01-25

    A recursive verification protocol to reduce the time variance due to delays in the network by putting the subject node at most one hop from the verifier node provides for an efficient manner to test wireless sensor nodes. Since the software signatures are time based, recursive testing will give a much cleaner signal for positive verification of the software running on any one node in the sensor network. In this protocol, the main verifier checks its neighbor, who in turn checks its neighbor, and continuing this process until all nodes have been verified. This ensures minimum time delays for the software verification. Should a node fail the test, the software verification downstream is halted until an alternative path (one not including the failed node) is found. Utilizing techniques well known in the art, having a node tested twice, or not at all, can be avoided.
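
    A rough simulation of the recursive, one-hop verification chain described in the abstract above (the network topology, the stubbed signature check, and the retry strategy are simplified placeholders): each verified node checks a neighbour, the chain halts downstream of a failure, and an alternative path avoiding the failed node is then attempted.

    ```python
    # Undirected sensor network as an adjacency list (hypothetical topology).
    NETWORK = {
        "verifier": ["n1", "n2"],
        "n1": ["verifier", "n3"],
        "n2": ["verifier", "n3"],
        "n3": ["n1", "n2", "n4"],
        "n4": ["n3"],
    }
    EXPECTED_SIGNATURE = "abc123"
    ACTUAL = {"n1": "abc123", "n2": "abc123", "n3": "tampered", "n4": "abc123"}

    def check_one_hop(node):
        """Time-based signature check of a node at most one hop away (stubbed here)."""
        return ACTUAL.get(node) == EXPECTED_SIGNATURE

    def verify_chain(start="verifier", excluded=frozenset()):
        """Each verified node recursively verifies its unvisited neighbours."""
        verified, failed = [], []

        def visit(node):
            for nbr in NETWORK[node]:
                if nbr in verified or nbr in failed or nbr in excluded or nbr == "verifier":
                    continue
                if check_one_hop(nbr):
                    verified.append(nbr)
                    visit(nbr)              # the neighbour becomes the next verifier
                else:
                    failed.append(nbr)      # halt the chain downstream of this node

        visit(start)
        return verified, failed

    ok, bad = verify_chain()
    print("verified:", ok, "failed:", bad)
    if bad:                                  # retry remaining nodes along paths avoiding failures
        ok2, _ = verify_chain(excluded=frozenset(bad))
        print("verified via alternative paths:", ok2)
    ```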

  4. A DICOM-RT-based toolbox for the evaluation and verification of radiotherapy plans

    NASA Astrophysics Data System (ADS)

    Spezi, E.; Lewis, D. G.; Smith, C. W.

    2002-12-01

    The verification of radiotherapy plans is an essential step in the treatment planning process. This is especially important for highly conformal and IMRT plans which produce non-intuitive fluence maps and complex 3D dose distributions. In this work we present a DICOM (Digital Imaging and Communication in Medicine) based toolbox, developed for the evaluation and the verification of radiotherapy treatment plans. The toolbox offers the possibility of importing treatment plans generated with different calculation algorithms and/or different optimization engines and evaluating dose distributions on an independent platform. Furthermore the radiotherapy set-up can be exported to the BEAM Monte Carlo code system for dose verification. This can be done by simulating the irradiation of the patient CT dataset or the irradiation of a software-generated water phantom. We show the application of some of the functions implemented in this toolbox for the evaluation and verification of an IMRT treatment of the head and neck region.

  5. A DICOM-RT-based toolbox for the evaluation and verification of radiotherapy plans.

    PubMed

    Spezi, E; Lewis, D G; Smith, C W

    2002-12-07

    The verification of radiotherapy plans is an essential step in the treatment planning process. This is especially important for highly conformal and IMRT plans which produce non-intuitive fluence maps and complex 3D dose distributions. In this work we present a DICOM (Digital Imaging and Communication in Medicine) based toolbox, developed for the evaluation and the verification of radiotherapy treatment plans. The toolbox offers the possibility of importing treatment plans generated with different calculation algorithms and/or different optimization engines and evaluating dose distributions on an independent platform. Furthermore the radiotherapy set-up can be exported to the BEAM Monte Carlo code system for dose verification. This can be done by simulating the irradiation of the patient CT dataset or the irradiation of a software-generated water phantom. We show the application of some of the functions implemented in this toolbox for the evaluation and verification of an IMRT treatment of the head and neck region.

  6. A Discussion of Knowledge Based Design

    NASA Technical Reports Server (NTRS)

    Wood, Richard M.; Bauer, Steven X. S.

    1999-01-01

    A discussion of knowledge and Knowledge-Based design as related to the design of aircraft is presented. The paper discusses the perceived problem with existing design studies and introduces the concepts of design and knowledge for a Knowledge-Based design system. A review of several Knowledge-Based design activities is provided. A Virtual Reality, Knowledge-Based system is proposed and reviewed. The feasibility of Virtual Reality to improve the efficiency and effectiveness of aerodynamic and multidisciplinary design, evaluation, and analysis of aircraft through the coupling of virtual reality technology and a Knowledge-Based design system is also reviewed. The final section of the paper discusses future directions for design and the role of Knowledge-Based design.

  7. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    NASA Astrophysics Data System (ADS)

    Moon, Daesung; Chung, Yongwha; Pan, Sung Bum; Park, Jin-Won

    2009-12-01

    As VLSI technology has been improved, a smart card employing 32-bit processors has been released, and more personal information such as medical, financial data can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.

  8. Action-based verification of RTCP-nets with CADP

    NASA Astrophysics Data System (ADS)

    Biernacki, Jerzy; Biernacka, Agnieszka; Szpyrka, Marcin

    2015-12-01

    The paper presents an algorithm for translating coverability graphs of RTCP-nets (real-time coloured Petri nets) into the Aldebaran format. The approach provides the possibility of automatic RTCP-nets verification using model checking techniques provided by the CADP toolbox. An actual fire alarm control panel system has been modelled and several of its crucial properties have been verified to demonstrate the usability of the approach.
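
    A sketch of the core of such a translation, assuming the coverability graph is already available as labelled edges and that the target is the textual Aldebaran (.aut) format used by CADP, which (to the best of my knowledge) starts with a des (initial-state, nr-of-transitions, nr-of-states) header followed by one (from, "label", to) line per transition; the fire-alarm fragment below is invented.

    ```python
    def to_aldebaran(initial, edges):
        """edges: iterable of (source_state, action_label, target_state) tuples."""
        edges = list(edges)
        states = {s for src, _, dst in edges for s in (src, dst)} | {initial}
        index = {s: i for i, s in enumerate(sorted(states, key=lambda s: s != initial))}
        lines = [f"des ({index[initial]}, {len(edges)}, {len(states)})"]
        lines += [f'({index[src]}, "{label}", {index[dst]})' for src, label, dst in edges]
        return "\n".join(lines)

    # Hypothetical coverability-graph fragment of a fire-alarm model (states are markings).
    edges = [
        ("m0", "detectSmoke", "m1"),
        ("m1", "raiseAlarm",  "m2"),
        ("m1", "reset",       "m0"),
        ("m2", "reset",       "m0"),
    ]
    print(to_aldebaran("m0", edges))
    ```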

  9. 76 FR 3859 - Trade Acknowledgment and Verification of Security-Based Swap Transactions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-21

    ... COMMISSION 17 CFR Part 240 RIN 3235-AK91 Trade Acknowledgment and Verification of Security-Based Swap... security-based swap dealers and major security-based swap participants to provide trade acknowledgments and to verify those trade acknowledgments in security-based swap transactions. DATES: Comments should be...

  10. Nonlinear knowledge-based classification.

    PubMed

    Mangasarian, Olvi L; Wild, Edward W

    2008-10-01

    In this brief, prior knowledge over general nonlinear sets is incorporated into nonlinear kernel classification problems as linear constraints in a linear program. These linear constraints are imposed at arbitrary points, not necessarily where the prior knowledge is given. The key tool in this incorporation is a theorem of the alternative for convex functions that converts nonlinear prior knowledge implications into linear inequalities without the need to kernelize these implications. Effectiveness of the proposed formulation is demonstrated on publicly available classification data sets, including a cancer prognosis data set. Nonlinear kernel classifiers for these data sets exhibit marked improvements upon the introduction of nonlinear prior knowledge compared to nonlinear kernel classifiers that do not utilize such knowledge.

  11. Exploring implementation practices in results-based financing: the case of the verification in Benin.

    PubMed

    Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier

    2017-03-14

    Results-based financing (RBF) has been introduced in many countries across Africa and a growing literature is building around the assessment of their impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study aims at exploring the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting the RBF implementation in Benin. While our participant observation and operational collaboration with the project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with Community-based Organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative and community verification. Results show that the verification processes are complex, costly and time-consuming, and in practice they end up differing from what was originally designed. We explore the consequences of this on the operation of the scheme and on its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delink effort and reward. Additionally, the limited integration of the verification activities of district teams with their routine tasks

  12. Knowledge-based pitch detection

    NASA Astrophysics Data System (ADS)

    Dove, W. P.

    1986-06-01

    Many problems in signal processing involve a mixture of numerical and symbolic knowledge. Examples of problems of this sort include the recognition of speech and the analysis of images. This thesis focuses on the problem of employing a mixture of symbolic and numerical knowledge within a single system, through the development of a system directed at a modified pitch detection problem. For this thesis, the conventional pitch detection problem was modified by providing a phonetic transcript and sex/age information as input to the system, in addition to the acoustic waveform. The Pitch Detector's Assistant (PDA) system that was developed is an interactive facility for evaluating ways of approaching this problem. The PDA system allows the user to interrupt processing at any point, change either input data, derived data, or problem knowledge and continue execution.

  13. Knowledge-based assistance in costing the space station DMS

    NASA Technical Reports Server (NTRS)

    Henson, Troy; Rone, Kyle

    1988-01-01

    The Software Cost Engineering (SCE) methodology developed over the last two decades at IBM Systems Integration Division (SID) in Houston is utilized to cost the NASA Space Station Data Management System (DMS). An ongoing project to capture this methodology, which is built on a foundation of experiences and lessons learned, has resulted in the development of an internal-use-only, PC-based prototype that integrates algorithmic tools with knowledge-based decision support assistants. This prototype Software Cost Engineering Automation Tool (SCEAT) is being employed to assist in the DMS costing exercises. At the same time, DMS costing serves as a forcing function and provides a platform for the continuing, iterative development, calibration, and validation and verification of SCEAT. The data that forms the cost engineering database is derived from more than 15 years of development of NASA Space Shuttle software, ranging from low criticality, low complexity support tools to highly complex and highly critical onboard software.

  14. 78 FR 56266 - Consent Based Social Security Number Verification (CBSV) Service

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-12

    ... Security number (SSN) verification services to enrolled private businesses, State and local government... to third party requesters based on consent. The CBSV process provides users with a consent-based SSN... management benefits. New Information: To use CBSV, interested parties must pay a one-time...

  15. Knowledge-Based Search Tactics.

    ERIC Educational Resources Information Center

    Shute, Steven J.; Smith, Philip J.

    1993-01-01

    Describes an empirical study that was conducted to examine the performance of expert search intermediaries from Chemical Abstracts Service. Highlights include subject-independent and subject-dependent expertise; a model of the use of subject-specific knowledge; and implications for computerized intermediary systems and for training human…

  17. Common Sense about Uncommon Knowledge: The Knowledge Bases for Diversity.

    ERIC Educational Resources Information Center

    Smith, G. Pritchy

    This book explains knowledge bases for teaching diverse student populations. An introduction displays one first-year teacher's experiences with diverse students in a high school classroom in San Angelo, Texas in 1961. The 15 chapters are: (1) "Toward Defining Culturally Responsible and Responsive Teacher Education"; (2) "Knowledge…

  18. Simulation-Based Verification of Autonomous Controllers via Livingstone PathFinder

    NASA Technical Reports Server (NTRS)

    Lindsey, A. E.; Pecheur, Charles

    2004-01-01

    AI software is often used as a means for providing greater autonomy to automated systems, capable of coping with harsh and unpredictable environments. Due in part to the enormous space of possible situations that they aim to address, autonomous systems pose a serious challenge to traditional test-based verification approaches. Efficient verification approaches need to be perfected before these systems can reliably control critical applications. This publication describes Livingstone PathFinder (LPF), a verification tool for autonomous control software. LPF applies state space exploration algorithms to an instrumented testbed, consisting of the controller embedded in a simulated operating environment. Although LPF has focused on NASA's Livingstone model-based diagnosis system applications, the architecture is modular and adaptable to other systems. This article presents different facets of LPF and experimental results from applying the software to a Livingstone model of the main propulsion feed subsystem for a prototype space vehicle.

  19. Evaluation of an electrocardiograph-based PICC tip verification system.

    PubMed

    Oliver, Gemma; Jones, Matt

    Performing a chest x-ray after insertion of a peripherally inserted central catheter (PICC) is recognised as the gold standard for checking that the tip of the catheter is correctly positioned in the lower third of the superior vena cava at the right atrial junction; however, numerous problems are associated with this practice. A recent technological advancement has been developed that utilises changes in a patient's electrocardiograph (ECG) recorded from the tip of the PICC as a more reliable method. This evaluation discusses how a vascular access team in a large acute NHS Trust safely and successfully incorporated the use of ECG guidance technology for verification of PICC tip placement into their practice.

  20. Persistent Data/Knowledge Base

    DTIC Science & Technology

    1991-06-01

    This research was supported by the Rome Air...queries are processed and rules are inferenced in PDKB. Chapter 8 discusses future research topics and directions for PDKB. They include reasoning...existing knowledge, and forward chaining will be triggered. A detailed query and inferencing processing design is beyond the scope of this research

  1. Online signature verification and recognition: an approach based on symbolic representation.

    PubMed

    Guru, D S; Prakash, H N

    2009-06-01

    In this paper, we propose a new method of representing on-line signatures by interval-valued symbolic features. Global features of on-line signatures are used to form interval-valued feature vectors. Methods for signature verification and recognition based on the symbolic representation are also proposed. We exploit the notion of a writer-dependent threshold and introduce the concept of a feature-dependent threshold to achieve a significant reduction in equal error rate. Several experiments are conducted to demonstrate the ability of the proposed scheme in discriminating the genuine signatures from the forgeries. We investigate the feasibility of the proposed representation scheme for signature verification and also signature recognition using all 16500 signatures from 330 individuals of the MCYT bimodal biometric database. Further, extensive experiments are conducted to evaluate the performance of the proposed methods by projecting features onto Eigenspace and Fisherspace. Unlike other existing signature verification methods, the proposed method is simple and efficient. The results of the experiments reveal that the proposed scheme outperforms several other existing verification methods including the state-of-the-art method for signature verification.
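
    A condensed sketch of the interval-valued symbolic representation idea: each global feature of a writer is stored as an interval around its training mean (here mean ± k·std), and a test signature is accepted when enough of its features fall inside the writer's intervals. The feature names, k, and the acceptance count are illustrative, not the paper's tuned values.

    ```python
    import statistics

    def build_reference(training_signatures, k=2.0):
        """training_signatures: list of dicts of global features -> one interval per feature."""
        reference = {}
        for feat in training_signatures[0]:
            values = [sig[feat] for sig in training_signatures]
            mu, sd = statistics.mean(values), statistics.stdev(values)
            reference[feat] = (mu - k * sd, mu + k * sd)
        return reference

    def verify(reference, test_signature, min_accepted=2):
        """Feature-dependent check: count how many features land inside their interval."""
        hits = sum(lo <= test_signature[f] <= hi for f, (lo, hi) in reference.items())
        return hits >= min_accepted, hits

    # Toy global features (e.g. duration in s, total path length, mean pen pressure).
    training = [
        {"duration": 2.1, "path_len": 310.0, "pressure": 0.62},
        {"duration": 2.3, "path_len": 325.0, "pressure": 0.65},
        {"duration": 2.0, "path_len": 300.0, "pressure": 0.60},
    ]
    ref = build_reference(training)
    genuine = {"duration": 2.2, "path_len": 318.0, "pressure": 0.63}
    forgery = {"duration": 3.4, "path_len": 420.0, "pressure": 0.40}
    print("genuine:", verify(ref, genuine))
    print("forgery:", verify(ref, forgery))
    ```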

  2. An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices

    PubMed Central

    Li, Jingzhen; Liu, Yuhang; Nie, Zedong; Qin, Wenjian; Pang, Zengyao; Wang, Lei

    2017-01-01

    In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of volunteer’s forearm is measured by vector network analyzer (VNA). Specifically, in order to determine the chosen frequency for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 sample data. In addition, to achieve the rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 sample data. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the chosen frequency for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and naive Bayesian method (NBM) classification, respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices. PMID:28075375

  3. An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices.

    PubMed

    Li, Jingzhen; Liu, Yuhang; Nie, Zedong; Qin, Wenjian; Pang, Zengyao; Wang, Lei

    2017-01-10

    In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of volunteer's forearm is measured by vector network analyzer (VNA). Specifically, in order to determine the chosen frequency for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 sample data. In addition, to achieve the rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 sample data. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the chosen frequency for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and naive Bayesian method (NBM) classification, respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices.
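
    A simplified sketch of the threshold-adaptive template matching idea based on a weighted Euclidean distance (the weights, the adaptation rule, and the synthetic S21-like data are invented for illustration; the paper's actual TATM may differ): the per-user threshold is adapted from the distances among that user's own enrolment samples.

    ```python
    import numpy as np

    def weighted_euclidean(a, b, w):
        return float(np.sqrt(np.sum(w * (a - b) ** 2)))

    def enroll(samples, margin=1.5):
        """Build a template, per-feature weights, and an adaptive threshold from
        a user's enrolment samples (rows = feature vectors of 21 samples each)."""
        samples = np.asarray(samples, dtype=float)
        template = samples.mean(axis=0)
        weights = 1.0 / (samples.var(axis=0) + 1e-6)          # stable features weigh more
        intra = [weighted_euclidean(s, template, weights) for s in samples]
        threshold = margin * max(intra)                       # threshold adapts to the user
        return template, weights, threshold

    def verify(probe, template, weights, threshold):
        return weighted_euclidean(np.asarray(probe, float), template, weights) <= threshold

    rng = np.random.default_rng(2)
    true_profile = rng.normal(0.0, 1.0, size=21)              # 21 sample data per group, as above
    enrolment = true_profile + 0.05 * rng.normal(size=(30, 21))
    template, weights, thr = enroll(enrolment)

    genuine = true_profile + 0.05 * rng.normal(size=21)
    impostor = rng.normal(0.0, 1.0, size=21)
    print("genuine accepted:", verify(genuine, template, weights, thr))
    print("impostor accepted:", verify(impostor, template, weights, thr))
    ```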

  4. Feasibility study on dosimetry verification of volumetric-modulated arc therapy-based total marrow irradiation.

    PubMed

    Liang, Yun; Kim, Gwe-Ya; Pawlicki, Todd; Mundt, Arno J; Mell, Loren K

    2013-03-04

    The purpose of this study was to develop dosimetry verification procedures for volumetric-modulated arc therapy (VMAT)-based total marrow irradiation (TMI). The VMAT based TMI plans were generated for three patients: one child and two adults. The planning target volume (PTV) was defined as bony skeleton, from head to mid-femur, with a 3 mm margin. The plan strategy similar to published studies was adopted. The PTV was divided into head and neck, chest, and pelvic regions, with separate plans each of which is composed of 2-3 arcs/fields. Multiple isocenters were evenly distributed along the patient's axial direction. The focus of this study is to establish a dosimetry quality assurance procedure involving both two-dimensional (2D) and three-dimensional (3D) volumetric verifications, which is desirable for a large PTV treated with multiple isocenters. The 2D dose verification was performed with film for gamma evaluation and absolute point dose was measured with ion chamber, with attention to the junction between neighboring plans regarding hot/cold spots. The 3D volumetric dose verification used commercial dose reconstruction software to reconstruct dose from electronic portal imaging devices (EPID) images. The gamma evaluation criteria in both 2D and 3D verification were 5% absolute point dose difference and 3 mm of distance to agreement. With film dosimetry, the overall average gamma passing rate was 98.2% and absolute dose difference was 3.9% in junction areas among the test patients; with volumetric portal dosimetry, the corresponding numbers were 90.7% and 2.4%. A dosimetry verification procedure involving both 2D and 3D was developed for VMAT-based TMI. The initial results are encouraging and warrant further investigation in clinical trials.
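
    For context on the kind of comparison reported above, here is a brute-force sketch of a global 2D gamma evaluation using dose-difference and distance-to-agreement criteria like the study's 5%/3 mm; the grid spacing, synthetic doses, and low-dose cutoff are illustrative, and real QA software applies additional normalisation and thresholding.

    ```python
    import numpy as np

    def gamma_pass_rate(reference, measured, spacing_mm, dd=0.05, dta_mm=3.0, cutoff=0.10):
        """Global gamma: for each reference point above the low-dose cutoff, search all
        measured points for min sqrt((dose_diff/DD)^2 + (distance/DTA)^2); pass if <= 1."""
        ny, nx = reference.shape
        yy, xx = np.meshgrid(np.arange(ny) * spacing_mm, np.arange(nx) * spacing_mm, indexing="ij")
        dmax = reference.max()
        gammas = []
        for iy in range(ny):
            for ix in range(nx):
                ref_dose = reference[iy, ix]
                if ref_dose < cutoff * dmax:
                    continue                                   # skip the low-dose region
                dist2 = (yy - yy[iy, ix]) ** 2 + (xx - xx[iy, ix]) ** 2
                dose2 = ((measured - ref_dose) / (dd * dmax)) ** 2
                gammas.append(np.sqrt(np.min(dist2 / dta_mm**2 + dose2)))
        return 100.0 * np.mean(np.array(gammas) <= 1.0)

    # Synthetic planar doses on a 2 mm grid: measured = reference with a 1-pixel shift and noise.
    rng = np.random.default_rng(3)
    ref = np.exp(-((np.arange(40)[:, None] - 20) ** 2 + (np.arange(40)[None, :] - 20) ** 2) / 120.0)
    meas = np.roll(ref, 1, axis=1) * (1 + 0.01 * rng.normal(size=ref.shape))
    print(f"gamma pass rate: {gamma_pass_rate(ref, meas, spacing_mm=2.0):.1f}%")
    ```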

  5. UIVerify: A Web-Based Tool for Verification and Automatic Generation of User Interfaces

    NASA Technical Reports Server (NTRS)

    Shiffman, Smadar; Degani, Asaf; Heymann, Michael

    2004-01-01

    In this poster, we describe a web-based tool for verification and automatic generation of user interfaces. The verification component of the tool accepts as input a model of a machine and a model of its interface, and checks that the interface is adequate (correct). The generation component of the tool accepts a model of a given machine and the user's task, and then generates a correct and succinct interface. This write-up will demonstrate the usefulness of the tool by verifying the correctness of a user interface to a flight-control system. The poster will include two more examples of using the tool: verification of the interface to an espresso machine, and automatic generation of a succinct interface to a large hypothetical machine.

  7. Integrity Verification for Multiple Data Copies in Cloud Storage Based on Spatiotemporal Chaos

    NASA Astrophysics Data System (ADS)

    Long, Min; Li, You; Peng, Fei

    Aiming to strike a balance between the security, efficiency and availability of data verification in cloud storage, a novel integrity verification scheme based on spatiotemporal chaos is proposed for multiple data copies. Spatiotemporal chaos is implemented for node calculation of the binary tree, and the location of the data in the cloud is verified. Meanwhile, dynamic operations can be performed on the data. Furthermore, blind information is used to prevent a third-party auditor (TPA) from leaking the users’ data privacy in a public auditing process. Performance analysis and discussion indicate that it is secure and efficient, and it supports dynamic operation and the integrity verification of multiple copies of data. It has a great potential to be implemented in cloud storage services.
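
    A toy sketch of the binary-tree verification idea, with a simple logistic map standing in for the chaotic node calculation (the paper's actual spatiotemporal-chaos construction, blinding, and TPA protocol are not reproduced): the data owner keeps only the root value, and possession of all copies is checked by recomputing it.

    ```python
    import hashlib

    def logistic_iterates(seed_bytes, n=32, r=3.99):
        """Derive a node value by iterating the logistic map from a digest-derived seed."""
        x = (int.from_bytes(seed_bytes[:8], "big") % 10**6 + 1) / (10**6 + 2)   # x in (0, 1)
        for _ in range(n):
            x = r * x * (1 - x)
        return f"{x:.15f}".encode()

    def node_value(left, right):
        return logistic_iterates(hashlib.sha256(left + right).digest())

    def tree_root(blocks):
        """Binary tree over the blocks of all data copies; returns the root value."""
        level = [logistic_iterates(hashlib.sha256(b).digest()) for b in blocks]
        while len(level) > 1:
            if len(level) % 2:
                level.append(level[-1])                 # duplicate the last node if odd
            level = [node_value(level[i], level[i + 1]) for i in range(0, len(level), 2)]
        return level[0]

    # Owner side: compute and keep the root for two copies of a 4-block file.
    copy1 = [b"block-0", b"block-1", b"block-2", b"block-3"]
    copy2 = [b + b"#copy2" for b in copy1]              # differentiated replica
    owner_root = tree_root(copy1 + copy2)

    # Verification: intact copies match; a copy with a corrupted block does not.
    print(tree_root(copy1 + copy2) == owner_root)                    # True
    tampered = copy1[:2] + [b"corrupted"] + copy1[3:]
    print(tree_root(tampered + copy2) == owner_root)                 # False (with overwhelming probability)
    ```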

  8. Information verification and encryption based on phase retrieval with sparsity constraints and optical inference

    NASA Astrophysics Data System (ADS)

    Zhong, Shenlu; Li, Mengjiao; Tang, Xiajie; He, Weiqing; Wang, Xiaogang

    2017-01-01

    A novel optical information verification and encryption method is proposed based on inference principle and phase retrieval with sparsity constraints. In this method, a target image is encrypted into two phase-only masks (POMs), which comprise sparse phase data used for verification. Both of the two POMs need to be authenticated before being applied for decrypting. The target image can be optically reconstructed when the two authenticated POMs are Fourier transformed and convolved by the correct decryption key, which is also generated in encryption process. No holographic scheme is involved in the proposed optical verification and encryption system and there is also no problem of information disclosure in the two authenticable POMs. Numerical simulation results demonstrate the validity and good performance of this new proposed method.

  9. Reducing software security risk through an integrated approach research initiative model based verification of the Secure Socket Layer (SSL) Protocol

    NASA Technical Reports Server (NTRS)

    Powell, John D.

    2003-01-01

    This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk (RSSR) Through an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California Institute of Technology) where the research is being carried out.

  11. In-orbit verification of optical inter-satellite communication links based on homodyne BPSK

    NASA Astrophysics Data System (ADS)

    Smutny, Berry; Lange, Robert; Kämpfner, Hartmut; Dallmann, Daniel; Mühlnikel, Gerd; Reinhardt, Martin; Saucke, Karen; Sterr, Uwe; Wandernoth, Bernhard; Czichy, Reinhard

    2008-02-01

    Laser communication terminals based on homodyne BPSK are under in-orbit verification in LEO-to-ground and duplex LEO-LEO 5.65 Gbps links. With the LEO-to-ground link, beacon-less acquisition has been verified as a reliable and quick acquisition procedure, with acquisition times of less than one minute.

  12. Refining Automatically Extracted Knowledge Bases Using Crowdsourcing

    PubMed Central

    Xian, Xuefeng; Cui, Zhiming

    2017-01-01

    Machine-constructed knowledge bases often contain noisy and inaccurate facts. There exists significant work in developing automated algorithms for knowledge base refinement. Automated approaches improve the quality of knowledge bases but are far from perfect. In this paper, we leverage crowdsourcing to improve the quality of automatically extracted knowledge bases. As human labelling is costly, an important research challenge is how we can use limited human resources to maximize the quality improvement for a knowledge base. To address this problem, we first introduce a concept of semantic constraints that can be used to detect potential errors and do inference among candidate facts. Then, based on semantic constraints, we propose rank-based and graph-based algorithms for crowdsourced knowledge refining, which judiciously select the most beneficial candidate facts to conduct crowdsourcing and prune unnecessary questions. Our experiments show that our method improves the quality of knowledge bases significantly and outperforms state-of-the-art automatic methods under a reasonable crowdsourcing cost. PMID:28588611
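
    A small sketch of the rank-based selection idea under a mutual-exclusion semantic constraint (the facts, scores, and benefit heuristic are invented for illustration): candidate facts are ranked by how uncertain they are and how many conflicting candidates a crowd answer would also help resolve, and only the top-k are sent to the crowd.

    ```python
    from collections import defaultdict

    # Candidate facts: (subject, relation, object) with extractor confidence in [0, 1].
    CANDIDATES = {
        ("obama", "born_in", "honolulu"): 0.55,
        ("obama", "born_in", "kenya"):    0.40,   # conflicts under functional 'born_in'
        ("paris", "capital_of", "france"): 0.97,
        ("rome",  "capital_of", "spain"):  0.52,
        ("rome",  "capital_of", "italy"):  0.58,
    }
    FUNCTIONAL_RELATIONS = {"born_in", "capital_of"}      # at most one object per subject

    def benefit(fact, conf, groups):
        uncertainty = 1.0 - abs(conf - 0.5) * 2            # highest when confidence is near 0.5
        s, r, _ = fact
        conflicts = len(groups[(s, r)]) - 1 if r in FUNCTIONAL_RELATIONS else 0
        return uncertainty + 0.5 * conflicts               # resolving a conflict also aids inference

    def select_for_crowd(candidates, k=3):
        groups = defaultdict(list)
        for (s, r, o) in candidates:
            groups[(s, r)].append(o)
        ranked = sorted(candidates, key=lambda f: benefit(f, candidates[f], groups), reverse=True)
        return ranked[:k]

    for fact in select_for_crowd(CANDIDATES):
        print(fact)    # high-uncertainty, conflict-resolving facts go to the crowd first
    ```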

  13. Utilizing Data and Knowledge Mining for Probabilistic Knowledge Bases

    DTIC Science & Technology

    1996-12-01

    everything. My dear wife Tamara deserves an award simply for tolerating me these last 18 months. This is the second time she has been with me during a degree...and the flexibility of its knowledge representation scheme is an inverse one. In order to implement a realistic, real-world application, both of these ...information from some text- based source, such as an on-line encyclopedia or an Internet web page. Most often, these systems are highly focused and specialize

  14. Verification Benchmarks to Assess the Implementation of Computational Fluid Dynamics Based Hemolysis Prediction Models.

    PubMed

    Hariharan, Prasanna; D'Souza, Gavin; Horner, Marc; Malinauskas, Richard A; Myers, Matthew R

    2015-09-01

    As part of an ongoing effort to develop verification and validation (V&V) standards for using computational fluid dynamics (CFD) in the evaluation of medical devices, we have developed idealized flow-based verification benchmarks to assess the implementation of commonly cited power-law based hemolysis models in CFD. The verification process ensures that all governing equations are solved correctly and the model is free of user and numerical errors. To perform verification for power-law based hemolysis modeling, analytical solutions for the Eulerian power-law blood damage model (which estimates hemolysis index (HI) as a function of shear stress and exposure time) were obtained for Couette and inclined Couette flow models, and for Newtonian and non-Newtonian pipe flow models. Subsequently, CFD simulations of fluid flow and HI were performed using Eulerian and three different Lagrangian-based hemolysis models and compared with the analytical solutions. For all the geometries, the blood damage results from the Eulerian-based CFD simulations matched the Eulerian analytical solutions within ∼1%, which indicates successful implementation of the Eulerian hemolysis model. Agreement between the Lagrangian and Eulerian models depended upon the choice of the hemolysis power-law constants. For the commonly used values of power-law constants (α = 1.9-2.42 and β = 0.65-0.80), in the absence of flow acceleration, most of the Lagrangian models matched the Eulerian results within 5%. In the presence of flow acceleration (inclined Couette flow), moderate differences (∼10%) were observed between the Lagrangian and Eulerian models. This difference increased to greater than 100% as the beta exponent decreased. These simplified flow problems can be used as standard benchmarks for verifying the implementation of blood damage predictive models in commercial and open-source CFD codes. The current study only used the power-law model as an illustrative example to emphasize the need
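
    A minimal verification-style check in the spirit of these benchmarks, using the power-law damage model HI = C · τ^α · t^β with α and β chosen inside the ranges quoted above (α on shear stress, β on exposure time, matching the abstract's convention) and an illustrative C: for steady Couette flow the shear stress is uniform, so a Lagrangian time integration along a particle path can be compared against the closed-form value.

    ```python
    import numpy as np

    # Illustrative power-law constants (exponents within the ranges quoted above).
    C, ALPHA, BETA = 3.62e-7, 2.416, 0.785   # HI = C * tau**ALPHA * t**BETA  (tau in Pa, t in s)

    def hi_analytical(tau, t_exposure):
        return C * tau**ALPHA * t_exposure**BETA

    def hi_lagrangian(tau_along_path, dt):
        """Time-derivative (linear damage accumulation) form integrated along a particle path:
        dHI/dt = C * BETA * t**(BETA - 1) * tau(t)**ALPHA, evaluated with simple time stepping."""
        hi, t = 0.0, 0.0
        for tau in tau_along_path:
            t += dt
            hi += C * BETA * t**(BETA - 1.0) * tau**ALPHA * dt
        return hi

    # Couette flow: uniform shear stress, so the two estimates should agree closely.
    tau_wall, t_exp, n_steps = 150.0, 0.2, 20000          # Pa, s, steps
    dt = t_exp / n_steps
    numerical = hi_lagrangian(np.full(n_steps, tau_wall), dt)
    analytical = hi_analytical(tau_wall, t_exp)
    print(f"analytical HI = {analytical:.3e}, Lagrangian HI = {numerical:.3e}, "
          f"difference = {100 * abs(numerical - analytical) / analytical:.2f}%")
    ```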

  15. SU-E-T-762: Toward Volume-Based Independent Dose Verification as Secondary Check

    SciTech Connect

    Tachibana, H; Tachibana, R

    2015-06-15

    Purpose: Lung SBRT planning has shifted to the volume prescription technique. However, point dose agreement is still verified using independent dose verification at the secondary check. Volume dose verification is more affected by the inhomogeneity correction than the point dose verification currently used as the check. A feasibility study of volume dose verification was conducted for lung SBRT plans. Methods: Six SBRT plans were collected in our institute. Two dose distributions, with and without inhomogeneity correction, were generated using Adaptive Convolve (AC) in Pinnacle3. Simple MU Analysis (SMU, Triangle Product, Ishikawa, JP) was used as the independent dose verification software program, in which a modified Clarkson-based algorithm was implemented and the radiological path length was computed from the CT images independently of the treatment planning system. The agreement in point dose and mean dose between the AC with/without the correction and the SMU was assessed. Results: In the point dose evaluation for the center of the GTV, the difference showed a systematic shift (4.5% ± 1.9%) in comparison with the AC with the inhomogeneity correction; on the other hand, there was good agreement of 0.2 ± 0.9% between the SMU and the AC without the correction. In the volume evaluation, there were significant differences in mean dose not only for the PTV (14.2 ± 5.1%) but also for the GTV (8.0 ± 5.1%) compared to the AC with the correction. Without the correction, the SMU showed good agreement for the GTV (1.5 ± 0.9%) as well as the PTV (0.9% ± 1.0%). Conclusion: Volume evaluation for the secondary check may be possible in homogeneous regions. However, volumes including inhomogeneous media would produce larger discrepancies. The dose calculation algorithm for independent verification needs to be modified to take the inhomogeneity correction into account.

  16. An investigation of the effects of relevant samples and a comparison of verification versus discovery based lab design

    NASA Astrophysics Data System (ADS)

    Rieben, James C., Jr.

    This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity of solutions experiment and an upper level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect of relevant (or "real world") samples on student learning and a verification-based lab design versus a discovery-based lab design. With the cellulose regeneration experiment, the effect of a discovery-based lab design vs. a verification-based lab design was the sole focus. Evaluation surveys consisting of six questions were used at three different times to assess student knowledge of experimental concepts. In the general chemistry laboratory portion of this study, four experimental variants were employed to investigate the effect of relevance and lab design on student learning. These variants consisted of a traditional (or verification) lab design, a traditional lab design using "real world" samples, a new lab design employing real world samples/situations using unknown samples, and the new lab design using real world samples/situations that were known to the student. Data used in this analysis were collected during the Fall 08, Winter 09, and Fall 09 terms. For the second part of this study a cellulose regeneration experiment was employed to investigate the effects of lab design. A demonstration creating regenerated cellulose "rayon" was modified and converted to an efficient and low-waste experiment. In the first variant students tested their products and verified a list of physical properties. In the second variant, students filled in a blank physical property chart with their own experimental results for the physical properties. Results from the conductivity experiment show significant student learning of the effects of concentration on conductivity and how to use conductivity to differentiate solution types with the

  17. Knowledge Based Systems and Metacognition in Radar

    NASA Astrophysics Data System (ADS)

    Capraro, Gerard T.; Wicks, Michael C.

    An airborne ground looking radar sensor's performance may be enhanced by selecting algorithms adaptively as the environment changes. A short description of an airborne intelligent radar system (AIRS) is presented with a description of the knowledge based filter and detection portions. A second level of artificial intelligence (AI) processing is presented that monitors, tests, and learns how to improve and control the first level. This approach is based upon metacognition, a way forward for developing knowledge based systems.

  18. Logic Programming and Knowledge Base Maintenance.

    DTIC Science & Technology

    1983-11-01

    fallen into three classes: 1) expression of various quasi-intelligent expert systems tasks; 2) development of basic knowledge base systems; and 3...exploration of reasoning systems for maintenance of knowledge bases. We discuss each of these below. 1) Expression of expert systems tasks. We have coded...and run in metaProlog a diagnostic assistant based on the Oak Ridge spills expert of [Rosie]. This experiment demonstrated the usefulness and

  19. Knowledge-Based Replanning System.

    DTIC Science & Technology

    1987-05-01

    The mid-1970s was a bad time for theoretical linguistics; the standard theory of Noam Chomsky had begun to lose ground to its various critics, but...researchers built good parsers that had at least some structural component based on linguistic theory. Also, CD-based parsers began to show their...few adjectives and adverbs appear in the APE-II dictionary; they are all but impossible to define adequately in terms of the theory

  20. Image-Based Verification Algorithms for Arms Control

    SciTech Connect

    Robinson, Sean M.; Jarman, Kenneth D.; Seifert, Allen; McDonald, Benjamin S.; Misner, Alex C.; White, Timothy A.; Miller, Erin A.; Pitts, W. Karl

    2011-06-09

    PNNL is developing and evaluating radiographic image analysis techniques (active and passive) for verifying sensitive objects in a material control or warhead counting regime in which sensitive information may be acquired and processed behind an information barrier. Since sensitive image information cannot be present outside the information barrier, these techniques are necessary to extract features from the full images and reduce them to relevant parameters (attributes) of the inspected items. This evaluation can be done behind the information barrier, allowing for reporting and storage of non-sensitive attributes only. Several advances have been made to radiographic object verification algorithms, in the areas of spectral imaging for passive detectors and estimation of material density in active radiographic images. Both of these advances are pertinent in an arms control context. While passive radiographic images produced by previous work may be evaluated for the presence of emissive objects, approaches which leverage the spectroscopic potential of the detectors allow a much greater discrimination of SNM from background and other sources. Spectral passive imaging approaches to warhead discrimination and counting include specific materials and geometric arrangement localization, as well as “spectral difference” metrics which group regions with similar spectra together. These approaches may improve resolution for discrimination between materials in addition to locating SNM within surrounding shielding and/or structural elements. Previous work by our group has developed the capability to discern material density and composition in radiographic images by examining the edge transition characteristics of objects. The material construction of an object can be investigated in this way. In a weapons counting or discrimination context, unknown occultation of objects of interest, as well as additional elements of warhead construction, construction materials of varying

  1. SU-E-T-602: Patient-Specific Online Dose Verification Based On Transmission Detector Measurements

    SciTech Connect

    Thoelking, J; Yuvaraj, S; Jens, F; Lohr, F; Wenz, F; Wertz, H; Wertz, H

    2015-06-15

    Purpose: Intensity modulated radiotherapy requires a comprehensive quality assurance program in general and ideally independent verification of dose delivery. Since conventional 2D detector arrays allow only pre-treatment verification, there is a debate concerning the need for online dose verification. This study presents the clinical performance, including dosimetric plan verification in 2D as well as in 3D, and the error detection abilities of a new transmission detector (TD) for online dose verification of a 6 MV photon beam. Methods: To validate the dosimetric performance of the new device, dose reconstructions based on TD measurements were compared to a conventional pre-treatment verification method (reference) and the treatment planning system (TPS) for 18 IMRT and VMAT treatment plans. Furthermore, dose reconstruction inside the patient based on the TD read-out was evaluated by comparing various dose volume indices and 3D gamma evaluations against independent dose computation and the TPS. To investigate the sensitivity of the new device, different types of systematic and random errors in leaf positions and linac output were introduced into IMRT treatment sequences. Results: The 2D gamma index evaluation of the transmission detector based dose reconstruction showed excellent agreement for all IMRT and VMAT plans compared to reference measurements (99.3±1.2)% and the TPS (99.1±0.7)%. Good agreement was also obtained for 3D dose reconstruction based on the TD read-out compared to dose computation (mean gamma value of PTV = 0.27±0.04). Only a minimal dose underestimation within the target volume was observed when analyzing DVH indices (<1%). Positional errors in leaf banks larger than 1 mm and errors in linac output larger than 2% could clearly be identified with the TD. Conclusion: Since 2D and 3D evaluations for all IMRT and VMAT treatment plans were in excellent agreement with reference measurements and dose computation, the new TD is suitable to qualify for routine treatment plan
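
    The 2D gamma evaluation referenced above compares a reconstructed dose to a reference dose using combined dose-difference and distance-to-agreement criteria. The sketch below is a simplified 1D, globally normalized version for illustration only; it is not the software used in the study, and the tolerance values are assumptions.

```python
import numpy as np

def gamma_1d(ref_dose, eval_dose, x_mm, dose_tol=0.03, dist_tol_mm=1.0):
    """Simplified global 1D gamma index; dose_tol is a fraction of the maximum reference dose."""
    ref_dose, eval_dose, x_mm = map(np.asarray, (ref_dose, eval_dose, x_mm))
    d_norm = dose_tol * ref_dose.max()
    gammas = np.empty(ref_dose.shape)
    for i, (xr, dr) in enumerate(zip(x_mm, ref_dose)):
        dose_term = (eval_dose - dr) / d_norm          # dose difference, normalized
        dist_term = (x_mm - xr) / dist_tol_mm          # distance to agreement, normalized
        gammas[i] = np.sqrt(dose_term**2 + dist_term**2).min()
    return gammas

# Pass rate = fraction of reference points with gamma <= 1 (the kind of figure quoted above).
g = gamma_1d([1.0, 2.0, 1.0], [1.02, 1.97, 1.05], [0.0, 1.0, 2.0])
print((g <= 1).mean())
```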

  2. A Knowledge Base for FIA Data Uses

    Treesearch

    Victor A. Rudis

    2005-01-01

    Knowledge management provides a way to capture the collective wisdom of an organization, facilitate organizational learning, and foster opportunities for improvement. This paper describes a knowledge base compiled from uses of field observations made by the U.S. Department of Agriculture Forest Service, Forest Inventory and Analysis program and a citation database of...

  3. Updating knowledge bases with disjunctive information

    SciTech Connect

    Zhang, Yan; Foo, Norman Y.

    1996-12-31

    It is well known that the minimal change principle was widely used in knowledge base updates. However, recent research has shown that conventional minimal change methods, e.g. the PMA, are generally problematic for updating knowledge bases with disjunctive information. In this paper, we propose two different approaches to deal with this problem - one is called the minimal change with exceptions (MCE), the other is called the minimal change with maximal disjunctive inclusions (MCD). The first method is syntax-based, while the second is model-theoretic. We show that these two approaches are equivalent for propositional knowledge base updates, and the second method is also appropriate for first order knowledge base updates. We then prove that our new update approaches still satisfy the standard Katsuno and Mendelzon's update postulates.

  4. Knowledge-Based Learning: Integration of Deductive and Inductive Learning for Knowledge Base Completion.

    ERIC Educational Resources Information Center

    Whitehall, Bradley Lane

    In constructing a knowledge-based system, the knowledge engineer must convert rules of thumb provided by the domain expert and previously solved examples into a working system. Research in machine learning has produced algorithms that create rules for knowledge-based systems, but these algorithms require either many examples or a complete domain…

  6. The Coming of Knowledge-Based Business.

    ERIC Educational Resources Information Center

    Davis, Stan; Botkin, Jim

    1994-01-01

    Economic growth will come from knowledge-based businesses whose "smart" products filter and interpret information. Businesses will come to think of themselves as educators and their customers as learners. (SK)

  8. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    DOE PAGES

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; ...

    2017-03-23

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  9. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    NASA Astrophysics Data System (ADS)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
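
    As a generic illustration of the bit-for-bit evaluation step mentioned in the two records above, a regression comparison between a test field and a reference field might look like the sketch below. This is not LIVVkit's actual API; the function name and report fields are invented for the example.

```python
import numpy as np

def bit_for_bit(test: np.ndarray, ref: np.ndarray) -> dict:
    """Return a small report comparing a test model field against a reference field."""
    same_shape = test.shape == ref.shape
    identical = same_shape and np.array_equal(test, ref)
    max_abs_diff = float(np.max(np.abs(test - ref))) if same_shape else float("nan")
    return {"bit_for_bit": identical, "max_abs_diff": max_abs_diff}

# Toy comparison of two small fields; a real toolkit would loop over many variables and runs.
print(bit_for_bit(np.array([1.0, 2.0, 3.0]), np.array([1.0, 2.0, 3.0])))
```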

  10. Methodology for testing and validating knowledge bases

    NASA Technical Reports Server (NTRS)

    Krishnamurthy, C.; Padalkar, S.; Sztipanovits, J.; Purves, B. R.

    1987-01-01

    A test and validation toolset developed for artificial intelligence programs is described. The basic premises of this method are: (1) knowledge bases have a strongly declarative character and represent mostly structural information about different domains, (2) the conditions for integrity, consistency, and correctness can be transformed into structural properties of knowledge bases, and (3) structural information and structural properties can be uniformly represented by graphs and checked by graph algorithms. The interactive test and validation environment has been implemented on a SUN workstation.
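
    One structural check in the spirit of premise (3) above is detecting circular definitions in a rule dependency graph. The sketch below is a minimal, hypothetical illustration of such a graph check; the rule names and edge list are invented and do not come from the toolset described.

```python
from collections import defaultdict

def find_cycle(edges):
    """Return one dependency cycle as a list of nodes, or None if the graph is acyclic."""
    graph = defaultdict(list)
    for a, b in edges:
        graph[a].append(b)
    WHITE, GRAY, BLACK = 0, 1, 2
    color = defaultdict(int)

    def dfs(node, path):
        color[node] = GRAY
        for nxt in graph[node]:
            if color[nxt] == GRAY:               # back edge -> circular definition
                return path + [node, nxt]
            if color[nxt] == WHITE:
                found = dfs(nxt, path + [node])
                if found:
                    return found
        color[node] = BLACK
        return None

    for n in list(graph):
        if color[n] == WHITE:
            cycle = dfs(n, [])
            if cycle:
                return cycle
    return None

# "rule1 depends on rule2", "rule2 depends on fact_a and on rule1" -> circular definition flagged.
print(find_cycle([("rule1", "rule2"), ("rule2", "fact_a"), ("rule2", "rule1")]))
```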

  11. Knowledge based programming environments: A perspective

    NASA Technical Reports Server (NTRS)

    Amin, Ashok T.

    1988-01-01

    Programming environments is an area of recent origin and refers to an integrated set of tools, such as a program library, text editor, compiler, and debugger, in support of program development. Understanding of programs and programming has led to automated techniques for program development. Knowledge-based programming systems using program transformations offer significant impact on future program development methodologies. A review of recent developments in the area of knowledge-based programming environments, from the perspective of software engineering, is presented.

  12. Distributed, cooperating knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt

    1991-01-01

    Some current research in the development and application of distributed, cooperating knowledge-based systems technology is addressed. The focus of the current research is the spacecraft ground operations environment. The underlying hypothesis is that, because of the increasing size, complexity, and cost of planned systems, conventional procedural approaches to the architecture of automated systems will give way to a more comprehensive knowledge-based approach. A hallmark of these future systems will be the integration of multiple knowledge-based agents which understand the operational goals of the system and cooperate with each other and the humans in the loop to attain the goals. The current work includes the development of a reference model for knowledge-base management, the development of a formal model of cooperating knowledge-based agents, the use of testbed for prototyping and evaluating various knowledge-based concepts, and beginning work on the establishment of an object-oriented model of an intelligent end-to-end (spacecraft to user) system. An introductory discussion of these activities is presented, the major concepts and principles being investigated are highlighted, and their potential use in other application domains is indicated.

  13. An Insulating Glass Knowledge Base

    SciTech Connect

    Michael L. Doll; Gerald Hendrickson; Gerard Lagos; Russell Pylkki; Chris Christensen; Charlie Cureija

    2005-08-01

    This report will discuss issues relevant to Insulating Glass (IG) durability performance by presenting the observations and developed conclusions in a logical sequential format. This concluding effort discusses Phase II activities and focuses on beginning to quantify IG durability issues while continuing the approach presented in the Phase I activities (Appendix 1), which discuss a qualitative assessment of durability issues. Phase II developed a focus around two specific IG design classes previously presented in Phase I of this project. The typical box spacer and thermoplastic spacer designs, including their Failure Modes and Effects Analysis (FMEA) and fault tree diagrams, were chosen to address two currently used IG design options with varying components and failure modes. The system failures occur due to failures of components or their interfaces. Efforts to begin quantifying the durability issues focused on the development and delivery of an included computer-based IG durability simulation program. The effort to deliver the foundation for a comprehensive IG durability simulation tool is necessary to address advancements needed to meet current and future building envelope energy performance goals. This need is based upon the current lack of IG field failure data and the lengthy field observation time necessary for this data collection. Ultimately, the simulation program is intended to be used by designers throughout the current and future industry supply chain. Its use is intended to advance IG durability as expectations grow around energy conservation and with the growth of embedded technologies as required to meet energy needs. In addition, the tool has the immediate benefit of providing insight for research and improvement prioritization. Included in the simulation model presentation are elements and/or methods to address IG materials, design, process, quality, induced stress (environmental and other factors), validation, etc. In addition, acquired data

  14. On-Board Imager-based MammoSite treatment verification.

    PubMed

    Wojcicka, Jadwiga; Yankelevich, Rafael; Iorio, Stephen; Tinger, Alfred

    2007-11-01

    Contemporary radiation oncology departments are often lacking a conventional simulator due to common use of virtual simulation and recent implementation of image guided radiation therapy. A protocol based on MammoSite method was developed using CT based planning, a Source Position Simulator (SPS) with a Simulator Wire and a linear accelerator based On-Board Imager (OBI) for daily verification. After MammoSite balloon implantation, the patient undergoes a CT study. The images are evaluated for tissue conformance, balloon symmetry, and balloon surface to skin distance according to the departmental procedure. Prior to the CT study the SPS is attached to the transfer tube that in turn is attached to the balloon catheter. The length from the indexer to the first dwell position is measured using the simulator wire with X-ray markers. After the CT study is performed, the data set is sent to the Varian Eclipse treatment planning system (TPS) and to the Nucletron PLATO brachytherapy planning system. The reference digitally reconstructed radiographs (DRRs) of anterior and lateral setup fields are created using Eclipse TPS and are immediately available on the OBI console via the Varian Vision integrated system. The source dwell position coinciding with the balloon center is identified in the CT dataset, followed by the offset calculation, catheter reconstruction, dose points placement and dwell time calculation. OBI fluoroscopy images are acquired and marked as initial. Prior to each treatment fraction balloon diameter and symmetry are evaluated using OBI fluoroscopy and tools available on the OBI console. Acquired images are compared with reference DRRs and/or initial OBI images. The whole process from initial evaluation to daily verification is filmless and does not undermine the precision of the procedure. This verification time does not exceed 10 min. The balloon diameter correlates well (within 1 mm) between initial CT and OBI verification images. The balloon symmetry is

  15. The importance of knowledge-based technology.

    PubMed

    Cipriano, Pamela F

    2012-01-01

    Nurse executives are responsible for a workforce that can provide safer and more efficient care in a complex sociotechnical environment. National quality priorities rely on technologies to provide data collection, share information, and leverage analytic capabilities to interpret findings and inform approaches to care that will achieve better outcomes. As a key steward for quality, the nurse executive exercises leadership to provide the infrastructure to build and manage nursing knowledge and instill accountability for following evidence-based practices. These actions contribute to a learning health system where new knowledge is captured as a by-product of care delivery enabled by knowledge-based electronic systems. The learning health system also relies on rigorous scientific evidence embedded into practice at the point of care. The nurse executive optimizes use of knowledge-based technologies, integrated throughout the organization, that have the capacity to help transform health care.

  16. Patient Dependency Knowledge-Based Systems.

    PubMed

    Soliman, F

    1998-10-01

    The ability of Patient Dependency Systems to provide information for staffing decisions and budgetary development has been demonstrated. In addition, they have become powerful tools in modern hospital management. This growing interest in Patient Dependency Systems has renewed calls for their automation. As advances in Information Technology and in particular Knowledge-Based Engineering reach new heights, hospitals can no longer afford to ignore the potential benefits obtainable from developing and implementing Patient Dependency Knowledge-Based Systems. Experience has shown that the vast majority of decisions and rules used in the Patient Dependency method are too complex to capture in the form of a traditional programming language. Furthermore, the conventional Patient Dependency Information System automates the simple and rigid bookkeeping functions. On the other hand Knowledge-Based Systems automate complex decision making and judgmental processes and therefore are the appropriate technology for automating the Patient Dependency method. In this paper a new technique to automate Patient Dependency Systems using knowledge processing is presented. In this approach all Patient Dependency factors have been translated into a set of Decision Rules suitable for use in a Knowledge-Based System. The system is capable of providing the decision-maker with a number of scenarios and their possible outcomes. This paper also presents the development of Patient Dependency Knowledge-Based Systems, which can be used in allocating and evaluating resources and nursing staff in hospitals on the basis of patients' needs.
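
    A hedged sketch of how such dependency factors might be encoded as decision rules in a knowledge-based system follows; the factor names, thresholds, and categories are invented for illustration and are not taken from the paper.

```python
# Hypothetical decision rules mapping patient-dependency factors to a dependency category.
def dependency_category(factors: dict) -> str:
    if factors.get("mobility") == "bedridden" or factors.get("feeding") == "total_assist":
        return "high dependency"
    if factors.get("observations_per_shift", 0) > 4:
        return "moderate dependency"
    return "low dependency"

# Example assessment for one (fictional) patient record.
print(dependency_category({"mobility": "independent", "observations_per_shift": 6}))
```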

  17. Feasibility of biochemical verification in a web-based smoking cessation study.

    PubMed

    Cha, Sarah; Ganz, Ollie; Cohn, Amy M; Ehlke, Sarah J; Graham, Amanda L

    2017-10-01

    Cogent arguments have been made against the need for biochemical verification in population-based studies with low-demand characteristics. Despite this fact, studies involving digital interventions (low-demand) are often required in peer review to report biochemically verified abstinence. To address this discrepancy, we examined the feasibility and costs of biochemical verification in a web-based study conducted with a national sample. Participants were 600 U.S. adult current smokers who registered on a web-based smoking cessation program and completed surveys at baseline and 3 months. Saliva sampling kits were sent to participants who reported 7-day abstinence at 3 months, and analyzed for cotinine. The response rate at 3 months was 41.2% (n=247): 93 participants reported 7-day abstinence (38%) and were mailed a saliva kit (71% returned). The discordance rate was 36.4%. Participants with discordant responses were more likely to report 3-month use of nicotine replacement therapy or e-cigarettes than those with concordant responses (79.2% vs. 45.2%, p=0.007). The total cost of saliva sampling was $8280 ($125/sample). Biochemical verification was both time- and cost-intensive, and yielded a relatively small number of samples due to low response rates and use of other nicotine products during the follow-up period. There was a high rate of discordance between self-reported abstinence and saliva testing. Costs for data collection may be prohibitive for studies with large sample sizes or limited budgets. Our findings echo previous statements that biochemical verification is not necessary in population-based studies, and add evidence specific to technology-based studies. Copyright © 2017 Elsevier Ltd. All rights reserved.
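
    For a rough sense of scale, the total cost quoted above can be cross-checked from the figures in the abstract itself (a back-of-the-envelope calculation; small rounding differences against the reported $8280 are expected):

```python
# Cost cross-check using only numbers quoted in the abstract:
# 93 kits mailed, 71% returned, $125 per analyzed sample.
kits_mailed = 93
samples_returned = round(0.71 * kits_mailed)   # about 66 samples
estimated_cost = samples_returned * 125        # about $8,250 vs. the reported $8,280
print(samples_returned, estimated_cost)
```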

  18. First clinical application of a prompt gamma based in vivo proton range verification system.

    PubMed

    Richter, Christian; Pausch, Guntram; Barczyk, Steffen; Priegnitz, Marlen; Keitz, Isabell; Thiele, Julia; Smeets, Julien; Stappen, Francois Vander; Bombelli, Luca; Fiorini, Carlo; Hotoiu, Lucian; Perali, Irene; Prieels, Damien; Enghardt, Wolfgang; Baumann, Michael

    2016-02-01

    To improve precision of particle therapy, in vivo range verification is highly desirable. Methods based on prompt gamma rays emitted during treatment seem promising but have not yet been applied clinically. Here we report on the worldwide first clinical application of prompt gamma imaging (PGI) based range verification. A prototype of a knife-edge shaped slit camera was used to measure the prompt gamma ray depth distribution during a proton treatment of a head and neck tumor for seven consecutive fractions. Inter-fractional variations of the prompt gamma profile were evaluated. For three fractions, in-room control CTs were acquired and evaluated for dose relevant changes. The measurement of PGI profiles during proton treatment was successful. Based on the PGI information, inter-fractional global range variations were in the range of ±2 mm for all evaluated fractions. This is in agreement with the control CT evaluation showing negligible range variations of about 1.5mm. For the first time, range verification based on prompt gamma imaging was applied for a clinical proton treatment. With the translation from basic physics experiments into clinical operation, the potential to improve the precision of particle therapy with this technique has increased considerably. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  19. Face verification system for Android mobile devices using histogram based features

    NASA Astrophysics Data System (ADS)

    Sato, Sho; Kobayashi, Kazuhiro; Chen, Qiu

    2016-07-01

    This paper proposes a face verification system that runs on Android mobile devices. In this system, a facial image is first captured by a built-in camera on the Android device, and then face detection is implemented using Haar-like features and the AdaBoost learning algorithm. The proposed system verifies the detected face using histogram-based features, which are generated by a binary Vector Quantization (VQ) histogram using DCT coefficients in low frequency domains, as well as an Improved Local Binary Pattern (Improved LBP) histogram in the spatial domain. Verification results with different types of histogram-based features are first obtained separately and then combined by weighted averaging. We evaluate our proposed algorithm by using the publicly available ORL database and facial images captured by an Android tablet.
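
    A minimal sketch of the weighted score-level fusion described above is given below; the chi-square histogram similarity and the equal weights are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def chi2_similarity(h1, h2, eps=1e-12):
    """Similarity in (0, 1] derived from the chi-square distance between two histograms."""
    h1, h2 = np.asarray(h1, float), np.asarray(h2, float)
    return 1.0 / (1.0 + 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps)))

def fused_score(dct_probe, dct_gallery, lbp_probe, lbp_gallery, w_dct=0.5, w_lbp=0.5):
    """Weighted average of the two histogram-based similarity scores."""
    return (w_dct * chi2_similarity(dct_probe, dct_gallery)
            + w_lbp * chi2_similarity(lbp_probe, lbp_gallery))

# Toy histograms standing in for the VQ/DCT and Improved LBP features; accept if above a tuned threshold.
print(fused_score([5, 3, 2], [4, 4, 2], [10, 0, 6], [9, 1, 6]))
```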

  20. Bridging the gap: simulations meet knowledge bases

    NASA Astrophysics Data System (ADS)

    King, Gary W.; Morrison, Clayton T.; Westbrook, David L.; Cohen, Paul R.

    2003-09-01

    Tapir and Krill are declarative languages for specifying actions and agents, respectively, that can be executed in simulation. As such, they bridge the gap between strictly declarative knowledge bases and strictly executable code. Tapir and Krill components can be combined to produce models of activity which can answer questions about mechanisms and processes using conventional inference methods and simulation. Tapir was used in DARPA's Rapid Knowledge Formation (RKF) project to construct models of military tactics from the Army Field Manual FM3-90. These were then used to build Courses of Actions (COAs) which could be critiqued by declarative reasoning or via Monte Carlo simulation. Tapir and Krill can be read and written by non-knowledge engineers making it an excellent vehicle for Subject Matter Experts to build and critique knowledge bases.

  1. A knowledge based approach to VLSI CAD

    NASA Astrophysics Data System (ADS)

    Steinberg, L. I.; Mitchell, T. M.

    1983-09-01

    Artificial Intelligence (AI) techniques offer one possible avenue toward new CAD tools to handle the complexities of VLSI. This paper summarizes the experience of the Rutgers AI/VLSI group in exploring applications of AI to VLSI design over the past few years. In particular, it summarizes our experience in developing REDESIGN, a knowledge-based system for providing interactive aid in the functional redesign of digital circuits. Given a desired change to the function of a circuit, REDESIGN combines rule-based knowledge of design tactics with its ability to analyze signal propagation through circuits, in order to (1) help the user focus on an appropriate portion of the circuit to redesign, (2) suggest local redesign alternatives, and (3) determine side effects of possible redesigns. We also summarize our more recent research toward constructing a knowledge-based system for VLSI design and a system for chip debugging, both based on extending the techniques used by the REDESIGN system.

  2. Knowledge-based Autonomous Test Engineer (KATE)

    NASA Technical Reports Server (NTRS)

    Parrish, Carrie L.; Brown, Barbara L.

    1991-01-01

    Mathematical models of system components have long been used to allow simulators to predict system behavior to various stimuli. Recent efforts to monitor, diagnose, and control real-time systems using component models have experienced similar success. NASA Kennedy is continuing the development of a tool for implementing real-time knowledge-based diagnostic and control systems called KATE (Knowledge based Autonomous Test Engineer). KATE is a model-based reasoning shell designed to provide autonomous control, monitoring, fault detection, and diagnostics for complex engineering systems by applying its reasoning techniques to an exchangeable quantitative model describing the structure and function of the various system components and their systemic behavior.
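
    As an illustration of the model-based reasoning idea behind KATE (compare measurements against a quantitative model's predictions and flag discrepancies), consider the hypothetical sketch below; the sensor names, predicted values, and tolerances are invented and do not reflect KATE's actual implementation.

```python
# Hedged sketch: flag sensors whose measured values deviate from model predictions
# by more than a per-sensor tolerance, as a stand-in for model-based fault detection.
def detect_faults(measured: dict, predicted: dict, tol: dict) -> list:
    return [name for name, m in measured.items()
            if abs(m - predicted.get(name, m)) > tol.get(name, float("inf"))]

print(detect_faults({"tank_pressure": 31.0, "valve_temp": 288.0},
                    {"tank_pressure": 30.0, "valve_temp": 300.0},
                    {"tank_pressure": 2.0, "valve_temp": 5.0}))
```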

  3. Geothermal-resource verification for Air Force bases

    SciTech Connect

    Grant, P.R. Jr.

    1981-06-01

    This report summarizes the various types of geothermal energy, reviews some legal uncertainties of the resource, and then describes a methodology to evaluate geothermal resources for applications at US Air Force bases. Estimates suggest that exploration costs will be $50,000 to $300,000, which, if favorable, would lead to drilling a $500,000 exploration well. Successful identification and development of a geothermal resource could provide for all of a base's fixed-system needs with an inexpensive, renewable energy source.

  4. Formal verification of software-based medical devices considering medical guidelines.

    PubMed

    Daw, Zamira; Cleaveland, Rance; Vetter, Marcus

    2014-01-01

    Software-based devices have increasingly become an important part of several clinical scenarios. Due to their critical impact on human life, medical devices have very strict safety requirements. It is therefore necessary to apply verification methods to ensure that the safety requirements are met. Verification of software-based devices is commonly limited to the verification of their internal elements without considering the interaction that these elements have with other devices as well as the application environment in which they are used. Medical guidelines define clinical procedures, which contain the necessary information to completely verify medical devices. The objective of this work was to incorporate medical guidelines into the verification process in order to increase the reliability of the software-based medical devices. Medical devices are developed using the model-driven method deterministic models for signal processing of embedded systems (DMOSES). This method uses unified modeling language (UML) models as a basis for the development of medical devices. The UML activity diagram is used to describe medical guidelines as workflows. The functionality of the medical devices is abstracted as a set of actions that is modeled within these workflows. In this paper, the UML models are verified using the UPPAAL model-checker. For this purpose, a formalization approach for the UML models using timed automaton (TA) is presented. A set of requirements is verified by the proposed approach for the navigation-guided biopsy. This shows the capability for identifying errors or optimization points both in the workflow and in the system design of the navigation device. In addition to the above, an open source eclipse plug-in was developed for the automated transformation of UML models into TA models that are automatically verified using UPPAAL. The proposed method enables developers to model medical devices and their clinical environment using clinical workflows as one

  5. Geophysical System Verification (GSV): A Physics-Based Alternative to Geophysical Prove-Outs for Munitions Response. Addendum

    DTIC Science & Technology

    2015-09-24

    Final Report, July 2009. Cited reference: Technical Center for Explosive Safety (USATCES). 2000. "Study of Ammunition Dud and Low Order Detonation Rates." SFIM-AEC-ET-CR-200049. July 2000.

  6. Geophysical System Verification (GSV): A Physics-Based Alternative to Geophysical Prove-Outs for Munitions Response

    DTIC Science & Technology

    2015-09-24

    Final Report, July 2009. Cited reference: Technical Center for Explosive Safety (USATCES). 2000. "Study of Ammunition Dud and Low Order Detonation Rates." SFIM-AEC-ET-CR-200049. July 2000.

  7. Knowledge-based scheduling of arrival aircraft

    NASA Technical Reports Server (NTRS)

    Krzeczowski, K.; Davis, T.; Erzberger, H.; Lev-Ram, I.; Bergh, C.

    1995-01-01

    A knowledge-based method for scheduling arrival aircraft in the terminal area has been implemented and tested in real-time simulation. The scheduling system automatically sequences, assigns landing times, and assigns runways to arrival aircraft by utilizing continuous updates of aircraft radar data and controller inputs. The scheduling algorithm is driven by a knowledge base which was obtained in over two thousand hours of controller-in-the-loop real-time simulation. The knowledge base contains a series of hierarchical 'rules' and decision logic that examines both performance criteria, such as delay reduction, and workload reduction criteria, such as conflict avoidance. The objective of the algorithms is to devise an efficient plan to land the aircraft in a manner acceptable to the air traffic controllers. This paper will describe the scheduling algorithms, give examples of their use, and present data regarding their potential benefits to the air traffic system.

  8. Knowledge-based diagnosis for aerospace systems

    NASA Technical Reports Server (NTRS)

    Atkinson, David J.

    1988-01-01

    The need for automated diagnosis in aerospace systems and the approach of using knowledge-based systems are examined. Research issues in knowledge-based diagnosis which are important for aerospace applications are treated along with a review of recent relevant research developments in Artificial Intelligence. The design and operation of some existing knowledge-based diagnosis systems are described. The systems described and compared include the LES expert system for liquid oxygen loading at NASA Kennedy Space Center, the FAITH diagnosis system developed at the Jet Propulsion Laboratory, the PES procedural expert system developed at SRI International, the CSRL approach developed at Ohio State University, the StarPlan system developed by Ford Aerospace, the IDM integrated diagnostic model, and the DRAPhys diagnostic system developed at NASA Langley Research Center.

  9. The Application of GeoRSC Based on Domestic Satellite in Field Remote Sensing Anomaly Verification

    NASA Astrophysics Data System (ADS)

    Gao, Ting; Yang, Min; Han, Haihui; Li, Jianqiang; Yi, Huan

    2016-11-01

    GeoRSC is a digital remote sensing survey system based on domestic satellites. Using this system, we carried out a field application test of remote sensing anomaly verification in the Nachitai area of Qinghai. The field test checked the system installation, the stability of system operation, the efficiency of reading and displaying remote sensing images and vector data, the security of the data management system, and the accuracy of BeiDou navigation. The test data indicate that the hardware and software system can satisfy the requirements of remote sensing anomaly verification work in the field, streamline the workflow of remote sensing surveys, and improve work efficiency. During the experiment we also found some shortcomings of the system and, drawing on the practical work, give suggestions for improvement.

  10. Multimodal human verification using stereo-based 3D information, IR, and speech

    NASA Astrophysics Data System (ADS)

    Park, Changhan

    2007-04-01

    In this paper, we propose a personal verification method using 3D face information, infrared (IR) imagery, and speech to improve on the rates of single-biometric authentication. The false acceptance rate (FAR) and false rejection rate (FRR) have been a fundamental bottleneck of real-time personal verification. The proposed method uses principal component analysis (PCA) for face recognition and a hidden Markov model (HMM) for speech recognition, based on a stereo acquisition system with IR imagery. The 3D face information captures the face's depth and distance using the stereo system. The proposed system consists of eye detection, facial pose direction estimation, and PCA modules. An IR image of the human face presents its unique heat signature and can be used for recognition; here, IR images are used only to decide whether a human face is present. Fuzzy logic is then used for the final decision of personal verification. Based on experimental results, the proposed system can reduce the FAR, which shows that the proposed method overcomes the limitations of a single-biometric system and provides stable personal authentication in real time.

  11. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform

    PubMed Central

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-01-01

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP) based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform’s mathematical model taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument’s working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and types of reference distances could be created without the need of using a physical gauge, therefore optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform. PMID:27869722

  12. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform.

    PubMed

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-11-18

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP) based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform's mathematical model taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument's working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and types of reference distances could be created without the need of using a physical gauge, therefore optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform.

  13. Collaboration-based medical knowledge recommendation.

    PubMed

    Huang, Zhengxing; Lu, Xudong; Duan, Huilong; Zhao, Chenhui

    2012-05-01

    Clinicians rely on a large amount of medical knowledge when performing clinical work. In clinical environment, clinical organizations must exploit effective methods of seeking and recommending appropriate medical knowledge in order to help clinicians perform their work. Aiming at supporting medical knowledge search more accurately and realistically, this paper proposes a collaboration-based medical knowledge recommendation approach. In particular, the proposed approach generates clinician trust profile based on the measure of trust factors implicitly from clinicians' past rating behaviors on knowledge items. And then the generated clinician trust profile is incorporated into collaborative filtering techniques to improve the quality of medical knowledge recommendation, to solve the information-overload problem by suggesting knowledge items of interest to clinicians. Two case studies are conducted at Zhejiang Huzhou Central Hospital of China. One case study is about the drug recommendation hold in the endocrinology department of the hospital. The experimental dataset records 16 clinicians' drug prescribing tracks in six months. This case study shows a proof-of-concept of the proposed approach. The other case study addresses the problem of radiological computed tomography (CT)-scan report recommendation. In particular, 30 pieces of CT-scan examinational reports about cerebral hemorrhage patients are collected from electronic medical record systems of the hospital, and are evaluated and rated by 19 radiologists of the radiology department and 7 clinicians of the neurology department, respectively. This case study provides some confidence the proposed approach will scale up. The experimental results show that the proposed approach performs well in recommending medical knowledge items of interest to clinicians, which indicates that the proposed approach is feasible in clinical practice. Copyright © 2011 Elsevier B.V. All rights reserved.
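
    A minimal sketch of the trust-weighted collaborative filtering idea described above follows; the prediction rule, clinician names, and trust values are simplified assumptions, not the paper's exact algorithm.

```python
# Hedged sketch: predict a clinician's rating for a knowledge item as the
# trust-weighted average of colleagues' ratings on that item.
def predict_rating(item_ratings: dict, trust: dict) -> float:
    """item_ratings: {clinician: rating}; trust: {clinician: trust weight in [0, 1]}."""
    num = sum(trust.get(c, 0.0) * r for c, r in item_ratings.items())
    den = sum(trust.get(c, 0.0) for c in item_ratings)
    return num / den if den > 0 else float("nan")

# Fictional example: two colleagues rated a drug-guidance item 4.0 and 2.0.
print(predict_rating({"dr_a": 4.0, "dr_b": 2.0}, {"dr_a": 0.9, "dr_b": 0.3}))
```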

  14. Verification of Gamma Knife extend system based fractionated treatment planning using EBT2 film

    SciTech Connect

    Natanasabapathi, Gopishankar; Bisht, Raj Kishor

    2013-12-15

    Purpose: This paper presents EBT2 film verification of fractionated treatment planning with the Gamma Knife (GK) extend system, a relocatable frame system for multiple-fraction or serial multiple-session radiosurgery.Methods: A human head shaped phantom simulated the verification process for fractionated Gamma Knife treatment. Phantom preparation for Extend Frame based treatment planning involved creating a dental impression, fitting the phantom to the frame system, and acquiring a stereotactic computed tomography (CT) scan. A CT scan (Siemens, Emotion 6) of the phantom was obtained with following parameters: Tube voltage—110 kV, tube current—280 mA, pixel size—0.5 × 0.5 and 1 mm slice thickness. A treatment plan with two 8 mm collimator shots and three sectors blocking in each shot was made. Dose prescription of 4 Gy at 100% was delivered for the first fraction out of the two fractions planned. Gafchromic EBT2 film (ISP Wayne, NJ) was used as 2D verification dosimeter in this process. Films were cut and placed inside the film insert of the phantom for treatment dose delivery. Meanwhile a set of films from the same batch were exposed from 0 to 12 Gy doses for calibration purposes. An EPSON (Expression 10000 XL) scanner was used for scanning the exposed films in transparency mode. Scanned films were analyzed with inhouse written MATLAB codes.Results: Gamma index analysis of film measurement in comparison with TPS calculated dose resulted in high pass rates >90% for tolerance criteria of 1%/1 mm. The isodose overlay and linear dose profiles of film measured and computed dose distribution on sagittal and coronal plane were in close agreement.Conclusions: Through this study, the authors propose treatment verification QA method for Extend frame based fractionated Gamma Knife radiosurgery using EBT2 film.

  15. Knowledge-based flow field zoning

    NASA Technical Reports Server (NTRS)

    Andrews, Alison E.

    1988-01-01

    Automating flow field zoning in two dimensions is an important step towards easing the three-dimensional grid generation bottleneck in computational fluid dynamics. A knowledge-based approach works well, but certain aspects of flow field zoning make the use of such an approach challenging. A knowledge-based flow field zoner, called EZGrid, was implemented and tested on representative two-dimensional aerodynamic configurations. Results are shown which illustrate the way in which EZGrid incorporates the effects of physics, shape description, position, and user bias in flow field zoning.

  16. A knowledge-based approach to design

    NASA Astrophysics Data System (ADS)

    Mitchell, T. M.; Steinberg, L. I.; Shulman, J. S.

    1985-09-01

    The potential advantages of knowledge-based methods for computer-aided design are examined, and the organization of VEXED, a knowledge-based system for VLSI design, is described in detail. In particular, attention is given to the principles underlying the design of VEXED and several issues that have arisen from implementing and experimenting with the prototype system. The issues discussed include questions regarding the grainsize of rules, the possibility of learning new rules automatically, and issues related to constraint propagation and management.

  17. Knowledge-based commodity distribution planning

    NASA Technical Reports Server (NTRS)

    Saks, Victor; Johnson, Ivan

    1994-01-01

    This paper presents an overview of a Decision Support System (DSS) that incorporates Knowledge-Based (KB) and commercial off the shelf (COTS) technology components. The Knowledge-Based Logistics Planning Shell (KBLPS) is a state-of-the-art DSS with an interactive map-oriented graphics user interface and powerful underlying planning algorithms. KBLPS was designed and implemented to support skilled Army logisticians to prepare and evaluate logistics plans rapidly, in order to support corps-level battle scenarios. KBLPS represents a substantial advance in graphical interactive planning tools, with the inclusion of intelligent planning algorithms that provide a powerful adjunct to the planning skills of commodity distribution planners.

  18. BEAM-BASED SEXTUPOLE POLARITY VERIFICATION IN THE RHIC

    SciTech Connect

    LUO,Y.; SATOGATA, T.; CAMERON, P.; DELLAPENNA, A.; TRBOJEVIC, D.

    2007-06-25

    This article presents a beam-based method to check RHIC arc sextupole polarities using local horizontal orbit three-bumps at injection energy. We use 11 bumps in each arc, each covering two SFs (focusing sextupoles) and one SD (defocusing sextupole). If there are no wrong sextupole polarities, the tune shifts from bump to bump and the tune shift patterns from arc to arc should be similar. Wrong sextupole polarities can be easily identified from mismatched signs or amplitudes of tune shifts from bump to bump and/or from arc to arc. Tune shifts in both planes during this study were tracked with a high-resolution base-band tunemeter (BBQ) system. This method was successfully used for the sextupole polarity check in the RHIC Blue and Yellow rings in the RHIC 2006 and 2007 runs.
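
    The physical basis of the method can be sketched with a standard feed-down argument (not quoted from the abstract; sign conventions vary with the machine model): a sextupole of integrated strength k2L traversed at a horizontal orbit offset Δx acts like a quadrupole of strength (k2L)Δx, so the bump-induced tune shifts carry the sign of the sextupole polarity.

```latex
% Hedged sketch: feed-down tune shifts from a closed-orbit bump \Delta x at a sextupole
% of integrated strength k_2 L, with beta functions \beta_x, \beta_y at its location.
\Delta Q_x \approx +\frac{\beta_x\,(k_2 L)\,\Delta x}{4\pi}, \qquad
\Delta Q_y \approx -\frac{\beta_y\,(k_2 L)\,\Delta x}{4\pi}
```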

  19. Finger-vein verification based on multi-features fusion.

    PubMed

    Qin, Huafeng; Qin, Lan; Xue, Lian; He, Xiping; Yu, Chengbo; Liang, Xinyuan

    2013-11-05

    This paper presents a new scheme to improve the performance of finger-vein identification systems. Firstly, a vein pattern extraction method to extract the finger-vein shape and orientation features is proposed. Secondly, to accommodate the potential local and global variations at the same time, a region-based matching scheme is investigated by employing the Scale Invariant Feature Transform (SIFT) matching method. Finally, the finger-vein shape, orientation and SIFT features are combined to further enhance the performance. The experimental results on databases of 426 and 170 fingers demonstrate the consistent superiority of the proposed approach.

  20. Respiratory gating with EPID-based verification: the MDACC experience

    NASA Astrophysics Data System (ADS)

    Briere, Tina Marie; Beddar, Sam; Balter, Peter; Murthy, Ravi; Gupta, Sanjay; Nelson, Christopher; Starkschall, George; Gillin, Michael T.; Krishnan, Sunil

    2009-06-01

    We have investigated the feasibility and accuracy of using a combination of internal and external fiducials for respiratory-gated image-guided radiotherapy of liver tumors after screening for suitable patients using a mock treatment. Five patients were enrolled in the study. Radio-opaque fiducials implanted adjacent to the liver tumor were used for daily online positioning using either electronic portal or kV images. Patient eligibility was assessed by determining the degree of correlation between the external and internal fiducials as analyzed during a mock treatment. Treatment delivery was based on the modification of conventional amplitude-based gating. Finally, the accuracy of respiratory-gated treatment using an external fiducial was verified offline using the cine mode of an electronic portal imaging device. For all patients, interfractional contribution to the random error was 2.0 mm in the supero-inferior direction, which is the dominant direction of motion due to respiration, while the interfractional contribution to the systematic error was 0.9 mm. The intrafractional contribution to the random error was 1.0 mm. One of the significant advantages to this technique is improved patient set-up using implanted fiducials and gated imaging. Daily assessment of images acquired during treatment verifies the accuracy of the delivered treatment and uncovers problems in patient set-up.

  1. DFM based on layout restriction and process window verification for sub-60nm memory devices

    NASA Astrophysics Data System (ADS)

    Choi, Soo-Han; Jung, Dai-Hyun; Hong, Ji-Suk; Choi, Joon-Ho; Yoo, Moon-Hyun; Kong, Jeong-Taek

    2007-05-01

    The adoption of model-based OPC and RET no longer guarantees enough process margin in low-k1 lithography, because potential patterning defects caused by layout-induced hot spots reduce the common process window. The introduction of litho-friendly layout has faced practical limitations because of designers' limited knowledge of lithography and its impact on the layout. In this paper, we develop a novel method based on restricted design rules (RDR) and process window verification (PWV) to remove layout-related process hot spots during physical layout design. Since RDR consists of simple design rules familiar to designers and PWV is implemented in the layout editor environment, the proposed method is easy to apply in the current design flow. Since memory core layout is designed with typical and repeated patterns, restricting the layout through design rule enforcement is effective for removing hot spots in the core area. We develop a systematic RDR extraction method by designing test patterns that represent repeated memory core patterns using a simple pattern matching technique. 1-dimensional (1D, simple line and space) and 1.5-dimensional (1.5D, complicated line and space) test patterns are analyzed to take printability into account. The 2-dimensional (2D) test patterns, split by contact pad size, are designed to consider the overlap margin between related layers. After removing the hot spots with RDR violations on the unit cell with an auto-fixer, PWV is applied to detect the random hot spots located in the peripheral area. By analyzing the CD difference between measurement and simulation according to variation of resist cutting plane and focus, an optical model having physical meaning is generated. The resist model, which uses focus exposure matrix (FEM) data within the process margin of the memory cell, can represent the photo process variations accurately. By implementing the proposed method based on RDR and PWV, the depth of focus (DOF) of a sub-60nm memory device is improved

  2. Knowledge-based machine indexing from natural language text: Knowledge base design, development, and maintenance

    NASA Technical Reports Server (NTRS)

    Genuardi, Michael T.

    1993-01-01

    One strategy for machine-aided indexing (MAI) is to provide a concept-level analysis of the textual elements of documents or document abstracts. In such systems, natural-language phrases are analyzed in order to identify and classify concepts related to a particular subject domain. The overall performance of these MAI systems is largely dependent on the quality and comprehensiveness of their knowledge bases. These knowledge bases function to (1) define the relations between a controlled indexing vocabulary and natural language expressions; (2) provide a simple mechanism for disambiguation and the determination of relevancy; and (3) allow the extension of concept-hierarchical structure to all elements of the knowledge file. After a brief description of the NASA Machine-Aided Indexing system, concerns related to the development and maintenance of MAI knowledge bases are discussed. Particular emphasis is given to statistically-based text analysis tools designed to aid the knowledge base developer. One such tool, the Knowledge Base Building (KBB) program, presents the domain expert with a well-filtered list of synonyms and conceptually-related phrases for each thesaurus concept. Another tool, the Knowledge Base Maintenance (KBM) program, functions to identify areas of the knowledge base affected by changes in the conceptual domain (for example, the addition of a new thesaurus term). An alternate use of the KBM as an aid in thesaurus construction is also discussed.
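
    The statistical filtering performed by a tool such as KBB can be pictured as ranking candidate phrases by how often they co-occur with a thesaurus concept; the Python sketch below is a toy illustration with made-up abstracts and a simple counting rule, not the NASA MAI implementation.

        from collections import Counter

        # Rank candidate phrases by co-occurrence with a thesaurus concept across
        # abstracts. The abstracts, candidates, and scoring rule are illustrative.
        concept = "remote sensing"
        candidates = ["satellite imagery", "earth observation", "wind tunnel"]
        abstracts = [
            "remote sensing of vegetation using satellite imagery",
            "earth observation and remote sensing for land cover mapping",
            "wind tunnel testing of a scaled aircraft model",
            "satellite imagery archives support remote sensing research",
        ]

        scores = Counter()
        for text in abstracts:
            if concept in text:
                for phrase in candidates:
                    if phrase in text:
                        scores[phrase] += 1

        for phrase, score in scores.most_common():
            print(f"{phrase}: co-occurs with '{concept}' in {score} abstract(s)")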

  3. Secure voice-based authentication for mobile devices: vaulted voice verification

    NASA Astrophysics Data System (ADS)

    Johnson, R. C.; Scheirer, Walter J.; Boult, Terrance E.

    2013-05-01

    As the use of biometrics becomes more wide-spread, the privacy concerns that stem from the use of biometrics are becoming more apparent. As the usage of mobile devices grows, so does the desire to implement biometric identification into such devices. A large majority of mobile devices being used are mobile phones. While work is being done to implement different types of biometrics into mobile phones, such as photo based biometrics, voice is a more natural choice. The idea of voice as a biometric identifier has been around a long time. One of the major concerns with using voice as an identifier is the instability of voice. We have developed a protocol that addresses those instabilities and preserves privacy. This paper describes a novel protocol that allows a user to authenticate using voice on a mobile/remote device without compromising their privacy. We first discuss the Vaulted Verification protocol, which has recently been introduced in research literature, and then describe its limitations. We then introduce a novel adaptation and extension of the Vaulted Verification protocol to voice, dubbed Vaulted Voice Verification (V3). Following that we show a performance evaluation and then conclude with a discussion of security and future work.

  4. IGENPRO knowledge-based digital system for process transient diagnostics and management

    SciTech Connect

    Morman, J.A.; Reifman, J.; Wei, T.Y.C.

    1997-12-31

    Verification and validation issues have been perceived as important factors in the large scale deployment of knowledge-based digital systems for plant transient diagnostics and management. Research and development (R&D) is being performed on the IGENPRO package to resolve knowledge base issues. The IGENPRO approach is to structure the knowledge bases on generic thermal-hydraulic (T-H) first principles and not use the conventional event-basis structure. This allows for generic comprehensive knowledge, relatively small knowledge bases and above all the possibility of T-H system/plant independence. To demonstrate concept feasibility the knowledge structure has been implemented in the diagnostic module PRODIAG. Promising laboratory testing results have been obtained using data from the full scope Braidwood PWR operator training simulator. This knowledge structure is now being implemented in the transient management module PROMANA to treat unanticipated events and the PROTREN module is being developed to process actual plant data. Achievement of the IGENPRO R&D goals should contribute to the acceptance of knowledge-based digital systems for transient diagnostics and management.

  5. A methodology for model-based development and automated verification of software for aerospace systems

    NASA Astrophysics Data System (ADS)

    Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.

    Today's software for aerospace systems typically is very complex. This is due to the increasing number of features as well as the high demand for safety, reliability, and quality. This complexity also leads to significantly higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking are becoming more popular. The latter examine the whole state space and, consequently, result in full test coverage. Nevertheless, despite the obvious advantages, this technique is still rarely used for the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive work flow for model-based software development as well as automated verification in compliance with the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use the tool SCADE Suite (Esterel Technologies), an integrated design environment that covers all the requirements for our methodology. The SCADE Suite is well established in avionics and defense, rail transportation, energy and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus. In particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.

  6. Knowledge Base Refinement by Monitoring Abstract Control Knowledge.

    ERIC Educational Resources Information Center

    Wilkins, D. C.; And Others

    Arguing that an explicit representation of the problem-solving method of an expert system shell as abstract control knowledge provides a powerful foundation for learning, this paper describes the abstract control knowledge of the Heracles expert system shell for heuristic classification problems, and describes how the Odysseus apprenticeship…

  7. Viewing Knowledge Bases as Qualitative Models.

    ERIC Educational Resources Information Center

    Clancey, William J.

    The concept of a qualitative model provides a unifying perspective for understanding how expert systems differ from conventional programs. Knowledge bases contain qualitative models of systems in the world, that is, primarily non-numeric descriptions that provide a basis for explaining and predicting behavior and formulating action plans. The…

  8. Improving the Knowledge Base in Teacher Education.

    ERIC Educational Resources Information Center

    Rockler, Michael J.

    Education in the United States for most of the last 50 years has built its knowledge base on a single dominating foundation--behavioral psychology. This paper analyzes the history of behaviorism. Syntheses are presented of the theories of Ivan P. Pavlov, J. B. Watson, and B. F. Skinner, all of whom contributed to the body of works on behaviorism.…

  9. The knowledge-based software assistant

    NASA Technical Reports Server (NTRS)

    Benner, Kevin M.; White, Douglas A.

    1987-01-01

    Where the Knowledge Based Software Assistant (KBSA) is now, four years after the initial report, is discussed. Also described is what the Rome Air Development Center expects at the end of the first contract iteration. What the second and third contract iterations will look like are characterized.

  11. The adverse outcome pathway knowledge base

    EPA Science Inventory

    The rapid advancement of the Adverse Outcome Pathway (AOP) framework has been paralleled by the development of tools to store, analyse, and explore AOPs. The AOP Knowledge Base (AOP-KB) project has brought three independently developed platforms (Effectopedia, AOP-Wiki, and AOP-X...

  12. Knowledge-Based Instructional Gaming: GEO.

    ERIC Educational Resources Information Center

    Duchastel, Philip

    1989-01-01

    Describes the design and development of an instructional game, GEO, in which the user learns elements of Canadian geography. The use of knowledge-based artificial intelligence techniques is discussed, the use of HyperCard in the design of GEO is explained, and future directions are suggested. (15 references) (Author/LRW)

  13. Constructing Knowledge Bases: A Promising Instructional Tool.

    ERIC Educational Resources Information Center

    Trollip, Stanley R.; Lippert, Renate C.

    1987-01-01

    Argues that construction of knowledge bases is an instructional tool that encourages students' critical thinking in problem solving situations through metacognitive experiences. A study is described in which college students created expert systems to test the effectiveness of this method of instruction, and benefits for students and teachers are…

  16. HDM/PASCAL Verification System User's Manual

    NASA Technical Reports Server (NTRS)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  18. An Ebola virus-centered knowledge base

    PubMed Central

    Kamdar, Maulik R.; Dumontier, Michel

    2015-01-01

    Ebola virus (EBOV), of the family Filoviridae viruses, is a NIAID category A, lethal human pathogen. It is responsible for causing Ebola virus disease (EVD) that is a severe hemorrhagic fever and has a cumulative death rate of 41% in the ongoing epidemic in West Africa. There is an ever-increasing need to consolidate and make available all the knowledge that we possess on EBOV, even if it is conflicting or incomplete. This would enable biomedical researchers to understand the molecular mechanisms underlying this disease and help develop tools for efficient diagnosis and effective treatment. In this article, we present our approach for the development of an Ebola virus-centered Knowledge Base (Ebola-KB) using Linked Data and Semantic Web Technologies. We retrieve and aggregate knowledge from several open data sources, web services and biomedical ontologies. This knowledge is transformed to RDF, linked to the Bio2RDF datasets and made available through a SPARQL 1.1 Endpoint. Ebola-KB can also be explored using an interactive Dashboard visualizing the different perspectives of this integrated knowledge. We showcase how different competency questions, asked by domain users researching the druggability of EBOV, can be formulated as SPARQL Queries or answered using the Ebola-KB Dashboard. Database URL: http://ebola.semanticscience.org. PMID:26055098

  19. An Ebola virus-centered knowledge base.

    PubMed

    Kamdar, Maulik R; Dumontier, Michel

    2015-01-01

    Ebola virus (EBOV), of the family Filoviridae viruses, is a NIAID category A, lethal human pathogen. It is responsible for causing Ebola virus disease (EVD) that is a severe hemorrhagic fever and has a cumulative death rate of 41% in the ongoing epidemic in West Africa. There is an ever-increasing need to consolidate and make available all the knowledge that we possess on EBOV, even if it is conflicting or incomplete. This would enable biomedical researchers to understand the molecular mechanisms underlying this disease and help develop tools for efficient diagnosis and effective treatment. In this article, we present our approach for the development of an Ebola virus-centered Knowledge Base (Ebola-KB) using Linked Data and Semantic Web Technologies. We retrieve and aggregate knowledge from several open data sources, web services and biomedical ontologies. This knowledge is transformed to RDF, linked to the Bio2RDF datasets and made available through a SPARQL 1.1 Endpoint. Ebola-KB can also be explored using an interactive Dashboard visualizing the different perspectives of this integrated knowledge. We showcase how different competency questions, asked by domain users researching the druggability of EBOV, can be formulated as SPARQL Queries or answered using the Ebola-KB Dashboard.

  20. Development and verification of an agent-based model of opinion leadership.

    PubMed

    Anderson, Christine A; Titler, Marita G

    2014-09-27

    The use of opinion leaders is a strategy used to speed the process of translating research into practice. Much is still unknown about opinion leader attributes and activities and the context in which they are most effective. Agent-based modeling is a methodological tool that enables demonstration of the interactive and dynamic effects of individuals and their behaviors on other individuals in the environment. The purpose of this study was to develop and test an agent-based model of opinion leadership. The details of the design and verification of the model are presented. The agent-based model was developed by using a software development platform to translate an underlying conceptual model of opinion leadership into a computer model. Individual agent attributes (for example, motives and credibility) and behaviors (seeking or providing an opinion) were specified as variables in the model in the context of a fictitious patient care unit. The verification process was designed to test whether or not the agent-based model was capable of reproducing the conditions of the preliminary conceptual model. The verification methods included iterative programmatic testing ('debugging') and exploratory analysis of simulated data obtained from execution of the model. The simulation tests included a parameter sweep, in which the model input variables were adjusted systematically, followed by an individual time series experiment. Statistical analysis of model output for the 288 possible simulation scenarios in the parameter sweep revealed that the agent-based model performed consistently with the posited relationships in the underlying model. Nurse opinion leaders act on the strength of their beliefs and, as a result, become an opinion resource for their uncertain colleagues, depending on their perceived credibility. Over time, some nurses consistently act as this type of resource and have the potential to emerge as opinion leaders in a context where uncertainty exists. The
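
    The following Python sketch illustrates, under assumed attribute names and update rules, the kind of opinion-seeking dynamics the abstract describes: uncertain agents consult their most credible colleague, and frequently consulted agents emerge as opinion resources. It is a toy model, not the authors' simulation.

        import random

        class Nurse:
            def __init__(self, ident, credibility, uncertainty):
                self.ident = ident
                self.credibility = credibility      # perceived credibility, 0..1
                self.uncertainty = uncertainty      # probability of seeking an opinion
                self.times_consulted = 0

        def step(unit):
            for nurse in unit:
                if random.random() < nurse.uncertainty:
                    # Seek the most credible colleague (excluding oneself).
                    resource = max((n for n in unit if n is not nurse),
                                   key=lambda n: n.credibility)
                    resource.times_consulted += 1

        random.seed(0)
        unit = [Nurse(i, random.random(), random.random()) for i in range(20)]
        for _ in range(100):                     # one simulated time series
            step(unit)
        leader = max(unit, key=lambda n: n.times_consulted)
        print(f"emergent opinion resource: nurse {leader.ident}, consulted {leader.times_consulted} times")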

  1. Case-Based Tutoring from a Medical Knowledge Base

    PubMed Central

    Chin, Homer L.

    1988-01-01

    The past decade has seen the emergence of programs that make use of large knowledge bases to assist physicians in diagnosis within the general field of internal medicine. One such program, Internist-I, contains knowledge about over 600 diseases, covering a significant proportion of internal medicine. This paper describes the process of converting a subset of this knowledge base--in the area of cardiovascular diseases--into a probabilistic format, and the use of this resulting knowledge base to teach medical diagnostic knowledge. The system (called KBSimulator--for Knowledge-Based patient Simulator) generates simulated patient cases and uses these cases as a focal point from which to teach medical knowledge. It interacts with the student in a mixed-initiative fashion, presenting patients for the student to diagnose, and allowing the student to obtain further information on his/her own initiative in the context of that patient case. The system scores the student, and uses these scores to form a rudimentary model of the student. This resulting model of the student is then used to direct the generation of subsequent patient cases. This project demonstrates the feasibility of building an intelligent, flexible instructional system that uses a knowledge base constructed primarily for medical diagnosis.

  2. Satellite Contamination and Materials Outgassing Knowledge base

    NASA Technical Reports Server (NTRS)

    Minor, Jody L.; Kauffman, William J. (Technical Monitor)

    2001-01-01

    Satellite contamination continues to be a design problem that engineers must take into account when developing new satellites. To help with this issue, NASA's Space Environments and Effects (SEE) Program funded the development of the Satellite Contamination and Materials Outgassing Knowledge base. This engineering tool brings together in one location information about the outgassing properties of aerospace materials based upon ground-testing data, the effects of outgassing that has been observed during flight and measurements of the contamination environment by on-orbit instruments. The knowledge base contains information using the ASTM Standard E- 1559 and also consolidates data from missions using quartz-crystal microbalances (QCM's). The data contained in the knowledge base was shared with NASA by government agencies and industry in the US and international space agencies as well. The term 'knowledgebase' was used because so much information and capability was brought together in one comprehensive engineering design tool. It is the SEE Program's intent to continually add additional material contamination data as it becomes available - creating a dynamic tool whose value to the user is ever increasing. The SEE Program firmly believes that NASA, and ultimately the entire contamination user community, will greatly benefit from this new engineering tool and highly encourages the community to not only use the tool but add data to it as well.

  3. Presentation planning using an integrated knowledge base

    NASA Technical Reports Server (NTRS)

    Arens, Yigal; Miller, Lawrence; Sondheimer, Norman

    1988-01-01

    A description is given of user interface research aimed at bringing together multiple input and output modes in a way that handles mixed mode input (commands, menus, forms, natural language), interacts with a diverse collection of underlying software utilities in a uniform way, and presents the results through a combination of output modes including natural language text, maps, charts and graphs. The system, Integrated Interfaces, derives much of its ability to interact uniformly with the user and the underlying services and to build its presentations, from the information present in a central knowledge base. This knowledge base integrates models of the application domain (Navy ships in the Pacific region, in the current demonstration version); the structure of visual displays and their graphical features; the underlying services (data bases and expert systems); and interface functions. The emphasis is on a presentation planner that uses the knowledge base to produce multi-modal output. There has been a flurry of recent work in user interface management systems. (Several recent examples are listed in the references). Existing work is characterized by an attempt to relieve the software designer of the burden of handcrafting an interface for each application. The work has generally focused on intelligently handling input. This paper deals with the other end of the pipeline - presentations.

  4. XML-Based SHINE Knowledge Base Interchange Language

    NASA Technical Reports Server (NTRS)

    James, Mark; Mackey, Ryan; Tikidjian, Raffi

    2008-01-01

    The SHINE Knowledge Base Interchange Language software has been designed to more efficiently send new knowledge bases to spacecraft that have been embedded with the Spacecraft Health Inference Engine (SHINE) tool. The intention of the behavioral model is to capture most of the information generally associated with a spacecraft functional model, while specifically addressing the needs of execution within SHINE and Livingstone. As such, it has some constructs that are based on one or the other.

  5. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment.

    PubMed

    Fuangrod, Todsaporn; Woodruff, Henry C; van Uytven, Eric; McCurdy, Boyd M C; Kuncic, Zdenka; O'Connor, Daryl J; Greer, Peter B

    2013-09-01

    To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system operation in real-time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.
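
    The cumulative signal check can be pictured as comparing running totals of predicted and measured frame signals and flagging the first large divergence; the Python sketch below is a toy illustration with synthetic frames and a hypothetical 10% tolerance, not the clinical criteria used in the study.

        import numpy as np

        def cumulative_signal_check(predicted_frames, measured_frames, tolerance=0.10):
            # Compare running totals of predicted and measured frame signals.
            cum_pred = np.cumsum([f.sum() for f in predicted_frames])
            cum_meas = np.cumsum([f.sum() for f in measured_frames])
            for i, (p, m) in enumerate(zip(cum_pred, cum_meas)):
                if p > 0 and abs(m - p) / p > tolerance:
                    return i                      # first frame exceeding the tolerance
            return None                           # delivery consistent with prediction

        rng = np.random.default_rng(1)
        predicted = [rng.random((8, 8)) for _ in range(50)]
        measured = [f * 1.02 for f in predicted]              # small, acceptable deviation
        measured[30:] = [f * 0.5 for f in predicted[30:]]     # simulated gross under-delivery
        print("error first flagged at frame:", cumulative_signal_check(predicted, measured))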

  6. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment

    SciTech Connect

    Fuangrod, Todsaporn; Woodruff, Henry C.; O’Connor, Daryl J.; Uytven, Eric van; McCurdy, Boyd M. C.; Kuncic, Zdenka; Greer, Peter B.

    2013-09-15

    Purpose: To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. Methods: The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system operation in real-time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. Results: The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). Conclusions: A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.

  7. Knowledge-Based Query Construction Using the CDSS Knowledge Base for Efficient Evidence Retrieval

    PubMed Central

    Afzal, Muhammad; Hussain, Maqbool; Ali, Taqdir; Hussain, Jamil; Khan, Wajahat Ali; Lee, Sungyoung; Kang, Byeong Ho

    2015-01-01

    Finding appropriate evidence to support clinical practices is always challenging, and the construction of a query to retrieve such evidence is a fundamental step. Typically, evidence is found using manual or semi-automatic methods, which are time-consuming and sometimes make it difficult to construct knowledge-based complex queries. To overcome the difficulty in constructing knowledge-based complex queries, we utilized the knowledge base (KB) of the clinical decision support system (CDSS), which has the potential to provide sufficient contextual information. To automatically construct knowledge-based complex queries, we designed methods to parse rule structure in KB of CDSS in order to determine an executable path and extract the terms by parsing the control structures and logic connectives used in the logic. The automatically constructed knowledge-based complex queries were executed on the PubMed search service to evaluate the results on the reduction of retrieved citations with high relevance. The average number of citations was reduced from 56,249 citations to 330 citations with the knowledge-based query construction approach, and relevance increased from 1 term to 6 terms on average. The ability to automatically retrieve relevant evidence maximizes efficiency for clinicians in terms of time, based on feedback collected from clinicians. This approach is generally useful in evidence-based medicine, especially in ambient assisted living environments where automation is highly important. PMID:26343669
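
    The core idea of extracting query terms from a CDSS rule can be sketched as pulling quoted phrases and attribute names out of the rule's condition part and joining them into a Boolean query; the Python example below uses a hypothetical rule format and is not the parser described in the paper.

        import re

        # Hypothetical rule text; the real KB rule structure differs.
        rule = ("IF diagnosis = 'type 2 diabetes' AND hba1c > 7.5 "
                "AND therapy = 'metformin' THEN recommend 'add sulfonylurea'")

        def extract_terms(rule_text):
            # Pull quoted phrases and bare attribute names out of the condition part.
            condition = rule_text.split("THEN")[0]
            phrases = re.findall(r"'([^']+)'", condition)
            attributes = re.findall(r"(\w+)\s*[=<>]", condition)
            return phrases + attributes

        def build_query(terms):
            return " AND ".join(f'"{t}"' for t in terms)

        print(build_query(extract_terms(rule)))
        # "type 2 diabetes" AND "metformin" AND "diagnosis" AND "hba1c" AND "therapy"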

  8. Safety Verification of a Fault Tolerant Reconfigurable Autonomous Goal-Based Robotic Control System

    NASA Technical Reports Server (NTRS)

    Braman, Julia M. B.; Murray, Richard M; Wagner, David A.

    2007-01-01

    Fault tolerance and safety verification of control systems are essential for the success of autonomous robotic systems. A control architecture called Mission Data System (MDS), developed at the Jet Propulsion Laboratory, takes a goal-based control approach. In this paper, a method for converting goal network control programs into linear hybrid systems is developed. The linear hybrid system can then be verified for safety in the presence of failures using existing symbolic model checkers. An example task is simulated in MDS and successfully verified using HyTech, a symbolic model checking software for linear hybrid systems.

  9. The potential of agent-based modelling for verification of people trajectories based on smartphone sensor data

    NASA Astrophysics Data System (ADS)

    Hillen, F.; Höfle, B.; Ehlers, M.; Reinartz, P.

    2014-02-01

    In this paper the potential of smartphone sensor data for verification of people trajectories derived from airborne remote sensing data is investigated and discussed based on simulated test recordings in the city of Osnabrueck, Germany. For this purpose, the airborne imagery is simulated by images taken from a high building with a typical single lens reflex camera. The smartphone data required for the analysis of the potential is simultaneously recorded by test persons on the ground. In a second step, the quality of the smartphone sensor data is evaluated with regard to its integration into simulation and modelling approaches. In this context we studied the potential of the agent-based modelling technique for the verification of people trajectories.

  10. Knowledge Base Management for Model Management Systems.

    DTIC Science & Technology

    1983-06-01

    interfaces as they relate to aspects of model base management. The focus of this study is to identify some organizations of knowledge about models...Vertical thinking is loosely related to systemic thinking, where one idea establishes a logical foundation upon which to construct the next idea...thinking is somewhat associated with creative thinking, and the idea of pattern matching from one circumstance to another. Mintzberg [Ref. 4] has related

  11. Clips as a knowledge based language

    NASA Technical Reports Server (NTRS)

    Harrington, James B.

    1987-01-01

    CLIPS is a language for writing expert systems applications on a personal or small computer. Here, the CLIPS programming language is described and compared to three other artificial intelligence (AI) languages (LISP, Prolog, and OPS5) with regard to the processing they provide for the implementation of a knowledge based system (KBS). A discussion is given on how CLIPS would be used in a control system.

  12. Verification of the two-dimensional hydrodynamic model based on remote sensing

    NASA Astrophysics Data System (ADS)

    Sazonov, Alexey; Mikhailukova, Polina; Krylenko, Inna; Frolova, Natalya; Kireeva, Mariya

    2016-04-01

    Mathematical modeling methods are used increasingly to evaluate possible damage, identify potential flood zones, and assess the influence of individual factors affecting a river during the passage of a flood. Calculations were performed by means of the domestic software package «STREAM-2D», which is based on the numerical solution of the two-dimensional St. Venant equations. One of the major challenges in mathematical modeling is the verification of the model. This is usually done using data on water levels from hydrological stations: the smaller the difference between the actual level and the simulated one, the better the quality of the model. Data from hydrological stations are not always available, so alternative sources of verification, such as remote sensing, are increasingly used. The aim of this work is to develop a method for verifying a hydrodynamic model based on a comparison of the actual flooded area, determined by automated satellite image interpretation for different imaging systems, with the flooded area obtained from the model. The study areas are the Lena River, the North Dvina River, and the Amur River near Blagoveshchensk. We used satellite images acquired by optical and radar sensors: SPOT-5/HRG, Resurs-F, and Radarsat-2. Flooded areas were calculated using unsupervised classification (ISODATA and K-means) for the optical images and segmentation for Radarsat-2. Knowing the flow rate and the water level at a given date for the upper and lower limits of the model, respectively, it is possible to calculate the flooded area by means of the STREAM-2D program and GIS technology. All the existing vector layers with the boundaries of flooding are included in a GIS project for flood area calculation. This study was supported by the Russian Science Foundation, project no. 14-17-00155.
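
    One way to quantify the agreement between a modelled flood extent and a satellite-derived one is a cell-wise comparison of the two masks; the Python sketch below computes a critical success index (intersection over union) on synthetic masks, which is an illustrative choice of metric rather than the procedure used in the study.

        import numpy as np

        def flood_agreement(model_mask, satellite_mask):
            model = model_mask.astype(bool)
            sat = satellite_mask.astype(bool)
            hits = np.logical_and(model, sat).sum()
            union = np.logical_or(model, sat).sum()
            return {
                "model_area_cells": int(model.sum()),
                "satellite_area_cells": int(sat.sum()),
                "critical_success_index": hits / union if union else float("nan"),
            }

        rng = np.random.default_rng(42)
        satellite = rng.random((200, 200)) > 0.6          # stand-in classified flood mask
        model = satellite.copy()
        model[:20, :] = False                             # model misses part of the floodplain
        print(flood_agreement(model, satellite))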

  13. 4D offline PET-based treatment verification in scanned ion beam therapy: a phantom study

    NASA Astrophysics Data System (ADS)

    Kurz, Christopher; Bauer, Julia; Unholtz, Daniel; Richter, Daniel; Stützer, Kristin; Bert, Christoph; Parodi, Katia

    2015-08-01

    At the Heidelberg Ion-Beam Therapy Center, patient irradiation with scanned proton and carbon ion beams is verified by offline positron emission tomography (PET) imaging: the {β+} -activity measured within the patient is compared to a prediction calculated on the basis of the treatment planning data in order to identify potential delivery errors. Currently, this monitoring technique is limited to the treatment of static target structures. However, intra-fractional organ motion imposes considerable additional challenges to scanned ion beam radiotherapy. In this work, the feasibility and potential of time-resolved (4D) offline PET-based treatment verification with a commercial full-ring PET/CT (x-ray computed tomography) device are investigated for the first time, based on an experimental campaign with moving phantoms. Motion was monitored during the gated beam delivery as well as the subsequent PET acquisition and was taken into account in the corresponding 4D Monte-Carlo simulations and data evaluation. Under the given experimental conditions, millimeter agreement between the prediction and measurement was found. Dosimetric consequences due to the phantom motion could be reliably identified. The agreement between PET measurement and prediction in the presence of motion was found to be similar as in static reference measurements, thus demonstrating the potential of 4D PET-based treatment verification for future clinical applications.

  14. 4D offline PET-based treatment verification in scanned ion beam therapy: a phantom study.

    PubMed

    Kurz, Christopher; Bauer, Julia; Unholtz, Daniel; Richter, Daniel; Stützer, Kristin; Bert, Christoph; Parodi, Katia

    2015-08-21

    At the Heidelberg Ion-Beam Therapy Center, patient irradiation with scanned proton and carbon ion beams is verified by offline positron emission tomography (PET) imaging: the β+-activity measured within the patient is compared to a prediction calculated on the basis of the treatment planning data in order to identify potential delivery errors. Currently, this monitoring technique is limited to the treatment of static target structures. However, intra-fractional organ motion imposes considerable additional challenges to scanned ion beam radiotherapy. In this work, the feasibility and potential of time-resolved (4D) offline PET-based treatment verification with a commercial full-ring PET/CT (x-ray computed tomography) device are investigated for the first time, based on an experimental campaign with moving phantoms. Motion was monitored during the gated beam delivery as well as the subsequent PET acquisition and was taken into account in the corresponding 4D Monte-Carlo simulations and data evaluation. Under the given experimental conditions, millimeter agreement between the prediction and measurement was found. Dosimetric consequences due to the phantom motion could be reliably identified. The agreement between PET measurement and prediction in the presence of motion was found to be similar as in static reference measurements, thus demonstrating the potential of 4D PET-based treatment verification for future clinical applications.

  15. Modeling Guru: Knowledge Base for NASA Modelers

    NASA Astrophysics Data System (ADS)

    Seablom, M. S.; Wojcik, G. S.; van Aartsen, B. H.

    2009-05-01

    Modeling Guru is an on-line knowledge-sharing resource for anyone involved with or interested in NASA's scientific models or High End Computing (HEC) systems. Developed and maintained by NASA's Software Integration and Visualization Office (SIVO) and the NASA Center for Computational Sciences (NCCS), Modeling Guru's combined forums and knowledge base for research and collaboration is becoming a repository for the accumulated expertise of NASA's scientific modeling and HEC communities. All NASA modelers and associates are encouraged to participate and provide knowledge about the models and systems so that other users may benefit from their experience. Modeling Guru is divided into a hierarchy of communities, each with its own set of forums and knowledge base documents. Current modeling communities include those for space science, land and atmospheric dynamics, atmospheric chemistry, and oceanography. In addition, there are communities focused on NCCS systems, HEC tools and libraries, and programming and scripting languages. Anyone may view most of the content on Modeling Guru (available at http://modelingguru.nasa.gov/), but you must log in to post messages and subscribe to community postings. The site offers a full range of "Web 2.0" features, including discussion forums, "wiki" document generation, document uploading, RSS feeds, search tools, blogs, email notification, and "breadcrumb" links. A discussion (a.k.a. forum "thread") is used to post comments, solicit feedback, or ask questions. If marked as a question, SIVO will monitor the thread, and normally respond within a day. Discussions can include embedded images, tables, and formatting through the use of the Rich Text Editor. Also, the user can add "Tags" to their thread to facilitate later searches. The "knowledge base" is comprised of documents that are used to capture and share expertise with others. The default "wiki" document lets users edit within the browser so others can easily collaborate on the

  16. Software verification and testing

    NASA Technical Reports Server (NTRS)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  17. Knowledge based imaging for terrain analysis

    NASA Technical Reports Server (NTRS)

    Holben, Rick; Westrom, George; Rossman, David; Kurrasch, Ellie

    1992-01-01

    A planetary rover will have various vision based requirements for navigation, terrain characterization, and geological sample analysis. In this paper we describe a knowledge-based controller and sensor development system for terrain analysis. The sensor system consists of a laser ranger and a CCD camera. The controller, under the input of high-level commands, performs such functions as multisensor data gathering, data quality monitoring, and automatic extraction of sample images meeting various criteria. In addition to large scale terrain analysis, the system's ability to extract useful geological information from rock samples is illustrated. Image and data compression strategies are also discussed in light of the requirements of earth bound investigators.

  18. Knowledge-based systems in Japan

    NASA Technical Reports Server (NTRS)

    Feigenbaum, Edward; Engelmore, Robert S.; Friedland, Peter E.; Johnson, Bruce B.; Nii, H. Penny; Schorr, Herbert; Shrobe, Howard

    1994-01-01

    This report summarizes a study of the state-of-the-art in knowledge-based systems technology in Japan, organized by the Japanese Technology Evaluation Center (JTEC) under the sponsorship of the National Science Foundation and the Advanced Research Projects Agency. The panel visited 19 Japanese sites in March 1992. Based on these site visits plus other interactions with Japanese organizations, both before and after the site visits, the panel prepared a draft final report. JTEC sent the draft to the host organizations for their review. The final report was published in May 1993.

  19. A knowledge based software engineering environment testbed

    NASA Technical Reports Server (NTRS)

    Gill, C.; Reedy, A.; Baker, L.

    1985-01-01

    The Carnegie Group Incorporated and Boeing Computer Services Company are developing a testbed which will provide a framework for integrating conventional software engineering tools with Artifical Intelligence (AI) tools to promote automation and productivity. The emphasis is on the transfer of AI technology to the software development process. Experiments relate to AI issues such as scaling up, inference, and knowledge representation. In its first year, the project has created a model of software development by representing software activities; developed a module representation formalism to specify the behavior and structure of software objects; integrated the model with the formalism to identify shared representation and inheritance mechanisms; demonstrated object programming by writing procedures and applying them to software objects; used data-directed and goal-directed reasoning to, respectively, infer the cause of bugs and evaluate the appropriateness of a configuration; and demonstrated knowledge-based graphics. Future plans include introduction of knowledge-based systems for rapid prototyping or rescheduling; natural language interfaces; blackboard architecture; and distributed processing

  20. Explanation-based knowledge acquisition of electronics

    NASA Astrophysics Data System (ADS)

    Kieras, David E.

    1992-08-01

    This is the final report in a project that examined how knowledge of practical electronics could be acquired from materials similar to that appearing in electronics training textbooks, from both an artificial intelligence perspective and an experimental psychology perspective. Practical electronics training materials present a series of basic circuits accompanied by an explanation of how the circuit performs the desired function. More complex circuits are then explained in terms of these basic circuits. This material thus presents schema knowledge for individual circuit types in the form of explanations of circuit behavior. Learning from such material would thus consist of first instantiating any applicable schemas, and then constructing a new schema based on the circuit structure and behavior described in the explanation. If the basic structure of the material is an effective approach to learning, learning about a new circuit should be easier if the relevant schemas are available than not. This result was obtained for both an artificial intelligence system that used standard explanation-based learning mechanisms and with human learners in a laboratory setting, but the benefits of already having the relevant schemas were not large in these materials. The close examination of learning in this domain, and the structure of knowledge, should be useful to future cognitive analyses of training in technical domains.

  1. Automated Fictional Ideation via Knowledge Base Manipulation.

    PubMed

    Llano, Maria Teresa; Colton, Simon; Hepworth, Rose; Gow, Jeremy

    The invention of fictional ideas (ideation) is often a central process in the creative production of artefacts such as poems, music and paintings, but has barely been studied in the computational creativity community. We present here a general approach to automated fictional ideation that works by manipulating facts specified in knowledge bases. More specifically, we specify a number of constructions which, by altering and combining facts from a knowledge base, result in the generation of fictions. Moreover, we present an instantiation of these constructions through the use of ConceptNet, a database of common sense knowledge. In order to evaluate the success of these constructions, we present a curation analysis that calculates the proportion of ideas which pass a typicality judgement. We further evaluate the output of this approach through a crowd-sourcing experiment in which participants were asked to rank ideas. We found a positive correlation between the participants' rankings and a chaining inference technique that automatically assesses the value of the fictions generated through our approach. We believe that these results show that this approach constitutes a firm basis for automated fictional ideation with evaluative capacity.
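
    One of the simplest constructions of this kind can be sketched as swapping the object of one fact with the object of another; the Python example below uses a handful of made-up ConceptNet-style facts and is only a toy stand-in for the constructions and the ConceptNet instantiation described in the paper.

        import itertools

        # Toy facts standing in for a ConceptNet-style knowledge base.
        facts = [
            ("dog", "CapableOf", "bark"),
            ("whale", "AtLocation", "ocean"),
            ("violin", "UsedFor", "making music"),
        ]

        def swapped_fictions(kb):
            for (s1, r1, o1), (_, _, o2) in itertools.permutations(kb, 2):
                if o1 != o2:
                    yield (s1, r1, o2)            # e.g. ("dog", "CapableOf", "ocean")

        for subject, relation, obj in swapped_fictions(facts):
            print(f"What if a {subject} had the property {relation}({obj})?")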

  2. Advances in knowledge-based software engineering

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt

    1991-01-01

    The underlying hypothesis of this work is that a rigorous and comprehensive software reuse methodology can bring about a more effective and efficient utilization of constrained resources in the development of large-scale software systems by both government and industry. It is also believed that correct use of this type of software engineering methodology can significantly contribute to the higher levels of reliability that will be required of future operational systems. An overview and discussion of current research in the development and application of two systems that support a rigorous reuse paradigm are presented: the Knowledge-Based Software Engineering Environment (KBSEE) and the Knowledge Acquisition for the Preservation of Tradeoffs and Underlying Rationales (KAPTUR) systems. Emphasis is on a presentation of operational scenarios which highlight the major functional capabilities of the two systems.

  3. Compilation for critically constrained knowledge bases

    SciTech Connect

    Schrag, R.

    1996-12-31

    We show that many "critically constrained" Random 3SAT knowledge bases (KBs) can be compiled into disjunctive normal form easily by using a variant of the "Davis-Putnam" proof procedure. From these compiled KBs we can answer all queries about entailment of conjunctive normal formulas, also easily - compared to a "brute-force" approach to approximate knowledge compilation into unit clauses for the same KBs. We exploit this fact to develop an aggressive hybrid approach which attempts to compile a KB exactly until a given resource limit is reached, then falls back to approximate compilation into unit clauses. The resulting approach handles all of the critically constrained Random 3SAT KBs with average savings of an order of magnitude over the brute-force approach.
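
    The idea can be illustrated on a tiny propositional KB: a Davis-Putnam/DPLL-style split enumerates satisfying partial assignments (a DNF of the KB), and clause entailment is then answered by checking that every compiled term forces some literal of the query clause. The Python sketch below is a toy illustration, not the authors' compiler.

        def compile_dnf(clauses, assignment=None):
            assignment = dict(assignment or {})
            # Simplify: drop satisfied clauses, prune falsified literals.
            simplified = []
            for clause in clauses:
                if any(assignment.get(abs(l)) == (l > 0) for l in clause):
                    continue
                rest = [l for l in clause if abs(l) not in assignment]
                if not rest:
                    return []                      # clause falsified: dead branch
                simplified.append(rest)
            if not simplified:
                return [assignment]                # all clauses satisfied: one DNF term
            var = abs(simplified[0][0])            # split on a variable (no heuristics)
            terms = []
            for value in (True, False):
                terms += compile_dnf(simplified, {**assignment, var: value})
            return terms

        def entails_clause(dnf_terms, clause):
            # KB |= clause iff every compiled term already forces some literal of it.
            return all(any(term.get(abs(l)) == (l > 0) for l in clause) for term in dnf_terms)

        kb = [[1, 2], [-1, 3], [-2, 3]]            # (a or b), (~a or c), (~b or c)
        dnf = compile_dnf(kb)
        print(entails_clause(dnf, [3]))            # c follows: True
        print(entails_clause(dnf, [1]))            # a does not follow: False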

  4. Bidirectional mereological reasoning in anatomical knowledge bases.

    PubMed Central

    Schulz, S.

    2001-01-01

    Mereological relationships--relationships between parts and wholes--are essential for ontological engineering in the anatomical domain. We propose a knowledge engineering approach that emulates mereological reasoning by taxonomic reasoning based on SEP triplets, a special data structure for the encoding of part-whole relations, which is fully embedded in the formal framework of standard description logics. We extend the SEP formalism in order to account not only for the part-of but also for the has-part relation, both being considered transitive in our domain. Furthermore we analyze the distinction between the ontological primitives singletons, collections and mass concepts in the anatomy domain and sketch how reasoning about these kinds of concepts can be accounted for in a knowledge representation language, using the extended SEP formalism. PMID:11825258
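
    The relation being emulated, transitive part-of (and its inverse has-part), can be illustrated with a small toy hierarchy; the Python sketch below shows the transitive closure itself, not the SEP-triplet description-logic encoding proposed in the paper.

        # Toy anatomical part-of hierarchy (illustrative, not a real ontology).
        PART_OF = {
            "left ventricle": "heart",
            "heart": "mediastinum",
            "mediastinum": "thorax",
        }

        def all_wholes(part):
            """Return every structure the given part is (transitively) part of."""
            wholes = []
            current = part
            while current in PART_OF:
                current = PART_OF[current]
                wholes.append(current)
            return wholes

        def has_part(whole, part):
            return whole in all_wholes(part)

        print(all_wholes("left ventricle"))          # ['heart', 'mediastinum', 'thorax']
        print(has_part("thorax", "left ventricle"))  # True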

  5. Building a Knowledge Base for the Knowledge Worker System

    DTIC Science & Technology

    1992-08-01

    ...available to other related work groups through the Corps of Engineers Automation Plan (CEAP) environment. This extra avenue for connectivity will improve...review of literature pertaining to task analysis, decision making, and human-computer interfaces revealed that no classification system for knowledge

  6. Knowledge Base Refinement by Monitoring Abstract Control Knowledge. Revision 1.

    DTIC Science & Technology

    1987-08-01

    Wilkins, D. C.; Clancey, W. J.; Buchanan, B. G. Department of Computer Science, Stanford University, Stanford, CA 94305. Approved for public release; distribution unlimited.

  7. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  8. MRI-based treatment planning and dose delivery verification for intraocular melanoma brachytherapy.

    PubMed

    Zoberi, Jacqueline Esthappan; Garcia-Ramirez, Jose; Hedrick, Samantha; Rodriguez, Vivian; Bertelsman, Carol G; Mackey, Stacie; Hu, Yanle; Gach, H Michael; Rao, P Kumar; Grigsby, Perry W

    2017-08-14

    Episcleral plaque brachytherapy (EPB) planning is conventionally based on approximations of the implant geometry with no volumetric imaging following plaque implantation. We have developed an MRI-based technique for EPB treatment planning and dose delivery verification based on the actual patient-specific geometry. MR images of 6 patients, prescribed 85 Gy over 96 hours from Collaborative Ocular Melanoma Study-based EPB, were acquired before and after implantation. Preimplant and postimplant scans were used to generate "preplans" and "postplans", respectively. In the preplans, a digital plaque model was positioned relative to the tumor, sclera, and nerve. In the postplans, the same plaque model was positioned based on the imaged plaque. Plaque position, point doses, percentage of tumor volume receiving 85 Gy (V100), and dose to 100% of tumor volume (Dmin) were compared between preplans and postplans. All isodose plans were computed using TG-43 formalism with no heterogeneity corrections. Shifts and tilts of the plaque ranged from 1.4 to 8.6 mm and 1.0 to 3.8 mm, respectively. V100 was ≥97% for 4 patients. Dmin for preplans and postplans ranged from 83 to 118 Gy and 45 to 110 Gy, respectively. Point doses for tumor apex and base were all found to decrease from the preimplant to the postimplant plan, with mean differences of 16.7 ± 8.6% and 30.5 ± 11.3%, respectively. By implementing MRI for EPB, we eliminate reliance on approximations of the eye and tumor shape and the assumption of idealized plaque placement. With MRI, one can perform preimplant as well as postimplant imaging, facilitating EPB treatment planning based on the actual patient-specific geometry and dose-delivery verification based on the imaged plaque position. Copyright © 2017 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
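
    The plan-quality metrics quoted above (V100 and Dmin) can be computed directly from a dose grid and a tumor mask; the Python sketch below does this on synthetic dose values, whereas the study's doses come from a TG-43 calculation.

        import numpy as np

        def v100_and_dmin(dose_gy, tumor_mask, prescription_gy=85.0):
            tumor_dose = dose_gy[tumor_mask]
            v100 = 100.0 * np.count_nonzero(tumor_dose >= prescription_gy) / tumor_dose.size
            dmin = float(tumor_dose.min())
            return v100, dmin

        rng = np.random.default_rng(7)
        dose = rng.normal(loc=100.0, scale=15.0, size=(40, 40, 40))   # synthetic dose grid (Gy)
        tumor = np.zeros_like(dose, dtype=bool)
        tumor[15:25, 15:25, 15:25] = True
        v100, dmin = v100_and_dmin(dose, tumor)
        print(f"V100 = {v100:.1f}%, Dmin = {dmin:.1f} Gy")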

  9. Personal knowledge management based on social software

    NASA Astrophysics Data System (ADS)

    Zhao, Chengling; Cao, Jianxia; Guo, Xinhua

    The emergence of information technology has provided powerful support for personal knowledge management, making it more convenient and feasible. This paper summarizes personal knowledge management as well as social software, analyzes the characteristics of applying social software to personal knowledge management from the perspective of explicit and tacit knowledge, and finally presents a model for applying social software to personal knowledge management.

  10. A Collaborative Environment for Knowledge Base Development

    NASA Astrophysics Data System (ADS)

    Li, W.; Yang, C.; Raskin, R.; Nebert, D. D.; Wu, H.

    2009-12-01

    A Knowledge Base (KB) is an essential component for capturing, structuring, and defining the meaning of domain knowledge. It is important for enabling the sharing and interoperability of scientific data and services in a smart manner, and it is the foundation for much of the research in the semantic field, such as semantic reasoning and ranking. In collaboration with ESIP, GMU is developing an online interface and supporting infrastructure to allow semantic registration of datasets and other web resources. The semantic descriptions of data, services, and scientific content will be collected and transformed into the KB. As a case study, the harvesting of web map services from Nordic mapping agencies to build a virtual Arctic spatial data infrastructure is used as the domain example. To automate the process, a controlled vocabulary of certain subjects, such as solid water, is created to filter existing data and service repositories and obtain a collection of closely related documents. Latent semantic indexing is then used to analyze the semantic relationships among concepts that appear in the service documents. Finally, the semantic structure found in plain text is mapped and automatically populated into the presentation of knowledge in the KB.
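
    The latent semantic indexing step can be sketched as a TF-IDF matrix followed by a truncated SVD, with documents that fall close together in the reduced space treated as candidates for the same concept; the Python example below uses scikit-learn and toy service descriptions as assumptions, not the authors' pipeline.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.decomposition import TruncatedSVD
        from sklearn.metrics.pairwise import cosine_similarity

        # Toy web map service descriptions (illustrative only).
        docs = [
            "sea ice extent map service for the arctic ocean",
            "snow cover and glacier mapping web map service",
            "land cover classification for boreal forests",
            "river ice and snow melt monitoring layers",
        ]

        tfidf = TfidfVectorizer(stop_words="english").fit_transform(docs)
        lsi = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

        # Documents close in the latent space are candidates for the same
        # "solid water" concept cluster in the knowledge base.
        sims = cosine_similarity(lsi)
        print(sims.round(2))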

  11. Irrelevance Reasoning in Knowledge Based Systems

    NASA Technical Reports Server (NTRS)

    Levy, A. Y.

    1993-01-01

    This dissertation considers the problem of reasoning about irrelevance of knowledge in a principled and efficient manner. Specifically, it is concerned with two key problems: (1) developing algorithms for automatically deciding what parts of a knowledge base are irrelevant to a query and (2) the utility of relevance reasoning. The dissertation describes a novel tool, the query-tree, for reasoning about irrelevance. Based on the query-tree, we develop several algorithms for deciding what formulas are irrelevant to a query. Our general framework sheds new light on the problem of detecting independence of queries from updates. We present new results that significantly extend previous work in this area. The framework also provides a setting in which to investigate the connection between the notion of irrelevance and the creation of abstractions. We propose a new approach to research on reasoning with abstractions, in which we investigate the properties of an abstraction by considering the irrelevance claims on which it is based. We demonstrate the potential of the approach for the cases of abstraction of predicates and projection of predicate arguments. Finally, we describe an application of relevance reasoning to the domain of modeling physical devices.

  12. Knowledge-based representations of risk beliefs.

    PubMed

    Tonn, B E; Travis, C B; Goeltz, R T; Phillippi, R H

    1990-03-01

    Beliefs about risks associated with two risk agents, AIDS and toxic waste, are modeled using knowledge-based methods and elicited from subjects via interactive computer technology. A concept net is developed to organize subject responses concerning the consequences of the risk agents. It is found that death and adverse personal emotional and sociological consequences are most associated with AIDS. Toxic waste is most associated with environmental problems. These consequence profiles are quite dissimilar, although past work in risk perception would have judged the risk agents as being quite similar. Subjects frequently used causal semantics to represent their beliefs and "% of time" instead of "probability" to represent likelihoods. The news media is the most prevalent source of risk information although experiences of acquaintances appear more credible. The results suggest that "broadly based risk" communication may be ineffective because people differ in their conceptual representation of risk beliefs. In general, the knowledge-based approach to risk perception representation has great potential to increase our understanding of important risk topics.

  13. Knowledge-based public health situation awareness

    NASA Astrophysics Data System (ADS)

    Mirhaji, Parsa; Zhang, Jiajie; Srinivasan, Arunkumar; Richesson, Rachel L.; Smith, Jack W.

    2004-09-01

    There have been numerous efforts to create comprehensive databases from multiple sources to monitor the dynamics of public health and, most specifically, to detect potential threats of bioterrorism before widespread dissemination. But there is little evidence for the assertion that these systems are timely and dependable, or that they can reliably distinguish man-made from natural incidents. One must evaluate the value of so-called 'syndromic surveillance systems' along with the costs involved in the design, development, implementation and maintenance of such systems and the costs involved in investigating the inevitable false alarms. In this article we introduce a new perspective on the problem domain with a shift in paradigm from 'surveillance' toward 'awareness'. As we conceptualize a rather different approach to tackle the problem, we introduce a different methodology in the application of information science, computer science, cognitive science and human-computer interaction concepts to the design and development of so-called 'public health situation awareness systems'. We share some of our design and implementation concepts for the prototype system that is under development in the Center for Biosecurity and Public Health Informatics Research at the University of Texas Health Science Center at Houston. The system is based on a knowledgebase containing ontologies, with different layers of abstraction and from multiple domains, that provide the context for information integration, knowledge discovery, interactive data mining, information visualization, information sharing and communications. The modular design of the knowledgebase and its knowledge representation formalism enables incremental evolution of the system from a partial system to a comprehensive knowledgebase of 'public health situation awareness' as it acquires new knowledge through interactions with domain experts or automatic discovery of new knowledge.

  14. Adaptive Knowledge Management of Project-Based Learning

    ERIC Educational Resources Information Center

    Tilchin, Oleg; Kittany, Mohamed

    2016-01-01

    The goal of an approach to Adaptive Knowledge Management (AKM) of project-based learning (PBL) is to intensify subject study through guiding, inducing, and facilitating the development of knowledge, accountability skills, and collaborative skills of students. Knowledge development is attained by knowledge acquisition, knowledge sharing, and knowledge…

  15. Range verification of passively scattered proton beams based on prompt gamma time patterns

    NASA Astrophysics Data System (ADS)

    Testa, Mauro; Min, Chul Hee; Verburg, Joost M.; Schümann, Jan; Lu, Hsiao-Ming; Paganetti, Harald

    2014-07-01

    We propose a proton range verification technique for passive scattering proton therapy systems where spread out Bragg peak (SOBP) fields are produced with rotating range modulator wheels. The technique is based on the correlation of time patterns of the prompt gamma ray emission with the range of protons delivering the SOBP. The main feature of the technique is the ability to verify the proton range with a single point of measurement and a simple detector configuration. We performed four-dimensional (time-dependent) Monte Carlo simulations using TOPAS to show the validity and accuracy of the technique. First, we validated the hadronic models used in TOPAS by comparing simulations and prompt gamma spectrometry measurements published in the literature. Second, prompt gamma simulations for proton range verification were performed for the case of a water phantom and a prostate cancer patient. In the water phantom, the proton range was determined with 2 mm accuracy with a full ring detector configuration for a dose of ~2.5 cGy. For the prostate cancer patient, 4 mm accuracy on range determination was achieved for a dose of ~15 cGy. The results presented in this paper are encouraging in view of a potential clinical application of the technique.
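
    The core idea, matching a measured prompt-gamma time profile against simulated profiles for candidate ranges, can be sketched as follows. The Gaussian toy profiles and the normalized-correlation score are illustrative assumptions, not the authors' TOPAS-based implementation.

      # Hedged illustration of the core idea (not the authors' implementation): compare a
      # measured prompt-gamma time profile against simulated reference profiles for
      # candidate proton ranges and report the best-matching range.
      import numpy as np

      def best_matching_range(measured, simulated_by_range):
          """simulated_by_range: dict mapping range [mm] -> time profile (same binning)."""
          def norm(p):
              p = np.asarray(p, dtype=float)
              return (p - p.mean()) / (p.std() + 1e-12)
          m = norm(measured)
          scores = {r: float(np.dot(m, norm(p)) / m.size) for r, p in simulated_by_range.items()}
          return max(scores, key=scores.get), scores

      # Toy profiles: Gaussian time peaks whose position shifts with range.
      t = np.linspace(0.0, 100.0, 200)                      # ns, arbitrary binning
      make = lambda centre: np.exp(-0.5 * ((t - centre) / 5.0) ** 2)
      library = {150 + 2 * k: make(40 + 0.5 * k) for k in range(10)}   # ranges in mm
      measured = make(42.5) + 0.05 * np.random.default_rng(1).normal(size=t.size)

      best, _ = best_matching_range(measured, library)
      print("estimated range:", best, "mm")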

  16. Application of knowledge-based vision to closed-loop control of the injection molding process

    NASA Astrophysics Data System (ADS)

    Marsh, Robert; Stamp, R. J.; Hill, T. M.

    1997-10-01

    An investigation is under way to develop a control system for an industrial process which uses a vision system as a sensor. The research is aimed at the improvement of product quality in commercial injection molding systems. A significant enhancement has been achieved in the level of application of visually based inspection techniques to component quality. The aim of the research has been the investigation, and employment, of inspection methods that use knowledge-based machine vision. The application of such techniques in this context is comprehensive, extending from object-oriented analysis, design and programming of the inspection program, to the application of rule-based reasoning to image interpretation, vision system diagnostics, component diagnostics and molding machine control. In this way, knowledge handling methods are exploited wherever they prove to be beneficial. The vision knowledge base contains information on the procedures required to achieve successful identification of component surface defects. A collection of image processing and pattern recognition algorithms is applied selectively. Once inspection of the component has been performed, defects are related to the process variables which affect the quality of the component, and another knowledge base is used to effect a control action at the molding machine. Feedback from other machine sensors is also used to direct the control procedure. Results from the knowledge-based vision inspection system are encouraging. They indicate that rapid and effective fault detection and analysis is feasible, as is the verification of system integrity.
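
    A defect-to-control-action rule base of the kind described above might look like the following toy Python sketch; the defect names and parameter adjustments are hypothetical, not the rules used in the cited system.

      # Illustrative rule-based mapping from detected surface defects to injection-molding
      # control actions (hypothetical rules, for the sake of the sketch only).
      DEFECT_RULES = {
          "short_shot":  [("injection_pressure", "+5%"), ("melt_temperature", "+5 C")],
          "flash":       [("clamp_force", "+10%"), ("injection_pressure", "-5%")],
          "sink_mark":   [("holding_pressure", "+5%"), ("cooling_time", "+2 s")],
          "burn_mark":   [("injection_speed", "-10%"), ("vent_check", "inspect")],
      }

      def control_actions(detected_defects):
          """Return the list of (process variable, adjustment) pairs fired by the rules."""
          actions = []
          for defect in detected_defects:
              actions.extend(DEFECT_RULES.get(defect, []))
          return actions

      print(control_actions(["short_shot", "sink_mark"]))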

  17. Magnetic nanoparticles-based extraction and verification of nucleic acids from different sources.

    PubMed

    Ma, Chao; Li, Chuanyan; Wang, Fang; Ma, Ningning; Li, Xiaolong; Li, Zhiyang; Deng, Yan; Wang, Zhifei; Xi, Zhijiang; Tang, Yongjun; Hel, Nongyue

    2013-04-01

    In many molecular biology and genetic technology studies, the amount of available DNA can be one of the important criteria for selecting samples from different sources. Compared with genomic DNA methods using organic solvents or other traditional commercial kits, the method based on magnetic nanoparticles (MNPs) and adsorption technology has many remarkable advantages: it is time-saving and cost-effective, it avoids laborious centrifugation or precipitation steps, and, more importantly, it has great potential and is especially suitable for automated DNA extraction and up-scaling. In this paper, the extraction efficiency of genomic nucleic acids based on magnetic nanoparticles from four different sources, including bacteria, yeast, human blood and virus samples, is compared and verified. After measurement and verification of the extracted genomic nucleic acids, it was shown that all the genomic nucleic acids extracted using the MNPs method can be of high yield and are suitable for subsequent molecular biology steps.

  18. A knowledge based expert system for turbomachinery

    SciTech Connect

    Kubiak, J.A.; Rivera-Grijalva, J.J.

    1994-12-31

    This paper describes the development of an expert system for the identification of turbomachinery faults caused by an increase in the vibrations of the rotor-bearing system. The organization of the expert system is based on known vibration patterns constructed from four distinctive vibration parameters: amplitude, frequency, phase angle and waveform. Each of the parameters can lead to a different set of candidate faults. When all four sets of possible faults are overlapped during the selection process, which is carried out automatically, the number of possible faults is reduced to one or two. All the possible causes of increased vibration can be detected with a high degree of probability. The system is used off-line and can be installed on-line with a monitoring system. The knowledge levels and the separate bases are built into the system.
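
    The overlap of the four candidate fault sets reduces to a simple set intersection, as in the following illustrative sketch; the fault names are generic examples, not the system's actual knowledge base.

      # Minimal sketch of the overlap step described above: each vibration parameter
      # proposes a candidate fault set, and intersecting the four sets narrows the
      # diagnosis to one or two faults.
      candidates = {
          "amplitude":   {"unbalance", "misalignment", "oil_whirl", "looseness"},
          "frequency":   {"unbalance", "oil_whirl", "blade_pass"},
          "phase_angle": {"unbalance", "misalignment"},
          "waveform":    {"unbalance", "oil_whirl"},
      }

      diagnosis = set.intersection(*candidates.values())
      print("most probable fault(s):", diagnosis)   # -> {'unbalance'}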

  19. Applying knowledge compilation techniques to model-based reasoning

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    Researchers in the area of knowledge compilation are developing general purpose techniques for improving the efficiency of knowledge-based systems. In this article, an attempt is made to define knowledge compilation, to characterize several classes of knowledge compilation techniques, and to illustrate how some of these techniques can be applied to improve the performance of model-based reasoning systems.

  20. Knowledge-based analysis of microarrays for the discovery of transcriptional regulation relationships

    PubMed Central

    2010-01-01

    Background The large amount of high-throughput genomic data has facilitated the discovery of the regulatory relationships between transcription factors and their target genes. While early methods for discovery of transcriptional regulation relationships from microarray data often focused on the high-throughput experimental data alone, more recent approaches have explored the integration of external knowledge bases of gene interactions. Results In this work, we develop an algorithm that provides improved performance in the prediction of transcriptional regulatory relationships by supplementing the analysis of microarray data with a new method of integrating information from an existing knowledge base. Using a well-known dataset of yeast microarrays and the Yeast Proteome Database, a comprehensive collection of known information of yeast genes, we show that knowledge-based predictions demonstrate better sensitivity and specificity in inferring new transcriptional interactions than predictions from microarray data alone. We also show that comprehensive, direct and high-quality knowledge bases provide better prediction performance. Comparison of our results with ChIP-chip data and growth fitness data suggests that our predicted genome-wide regulatory pairs in yeast are reasonable candidates for follow-up biological verification. Conclusion High quality, comprehensive, and direct knowledge bases, when combined with appropriate bioinformatic algorithms, can significantly improve the discovery of gene regulatory relationships from high throughput gene expression data. PMID:20122245

  1. Knowledge-based engineering of a PLC controlled telescope

    NASA Astrophysics Data System (ADS)

    Pessemier, Wim; Raskin, Gert; Saey, Philippe; Van Winckel, Hans; Deconinck, Geert

    2016-08-01

    As the new control system of the Mercator Telescope is being finalized, we can review some technologies and design methodologies that are advantageous, despite their relative uncommonness in astronomical instrumentation. Particular for the Mercator Telescope is that it is controlled by a single high-end soft-PLC (Programmable Logic Controller). Using off-the-shelf components only, our distributed embedded system controls all subsystems of the telescope such as the pneumatic primary mirror support, the hydrostatic bearing, the telescope axes, the dome, the safety system, and so on. We show how real-time application logic can be written conveniently in typical PLC languages (IEC 61131-3) and in C++ (to implement the pointing kernel) using the commercial TwinCAT 3 programming environment. This software processes the inputs and outputs of the distributed system in real-time via an observatory-wide EtherCAT network, which is synchronized with high precision to an IEEE 1588 (PTP, Precision Time Protocol) time reference clock. Taking full advantage of the ability of soft-PLCs to run both real-time and non real-time software, the same device also hosts the most important user interfaces (HMIs or Human Machine Interfaces) and communication servers (OPC UA for process data, FTP for XML configuration data, and VNC for remote control). To manage the complexity of the system and to streamline the development process, we show how most of the software, electronics and systems engineering aspects of the control system have been modeled as a set of scripts written in a Domain Specific Language (DSL). When executed, these scripts populate a Knowledge Base (KB) which can be queried to retrieve specific information. By feeding the results of those queries to a template system, we were able to generate very detailed "browsable" web-based documentation about the system, but also PLC software code, Python client code, model verification reports, etc. The aim of this paper is to
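
    The query-then-template generation workflow can be illustrated with a toy Python sketch; the knowledge-base entries, query helper and documentation template below are hypothetical stand-ins for the DSL-populated KB and template system described above, not the Mercator Telescope toolchain.

      # A toy sketch of the query-then-template idea: a knowledge base of I/O signals is
      # queried and the result is fed to a template to generate documentation lines.
      from string import Template

      knowledge_base = [
          {"name": "M1_support_pressure", "type": "REAL", "unit": "bar", "subsystem": "primary mirror"},
          {"name": "dome_azimuth",        "type": "REAL", "unit": "deg", "subsystem": "dome"},
          {"name": "axis_alt_enabled",    "type": "BOOL", "unit": "-",   "subsystem": "telescope axes"},
      ]

      doc_line = Template("- $name ($type, $unit): signal of the $subsystem subsystem")

      def query(kb, **criteria):
          """Return all KB entries whose fields match the given criteria."""
          return [e for e in kb if all(e.get(k) == v for k, v in criteria.items())]

      for entry in query(knowledge_base, type="REAL"):
          print(doc_line.substitute(entry))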

  2. M&V Guidelines: Measurement and Verification for Performance-Based Contracts Version 4.0

    SciTech Connect

    2015-11-02

    Document outlines the Federal Energy Management Program's standard procedures and guidelines for measurement and verification (M&V) for federal energy managers, procurement officials, and energy service providers.

  3. Interfaces for knowledge-base builders control knowledge and application-specific procedures

    SciTech Connect

    Hirsch, P.; Katke, W.; Meier, M.; Snyder, S.; Stillman, R.

    1986-01-01

    Expert System Environment/VM is an expert system shell-a general-purpose system for constructing and executing expert system applications. An application expert has both factual knowledge about an application and knowledge about how that factual knowledge should be organized and processed. In addition, many applications require application-dependent procedures to access databases or to do specialized processing. An important and novel part of Expert System Environment/VM is the technique used to allow the expert or knowledge-base builder to enter the control knowledge and to interface with application-dependent procedures. This paper discusses these high-level interfaces for the knowledge-base builder.

  4. An object-based methodology for knowledge representation in SGML

    SciTech Connect

    Kelsey, R.L.; Hartley, R.T.; Webster, R.B.

    1997-11-01

    An object-based methodology for knowledge representation and its Standard Generalized Markup Language (SGML) implementation is presented. The methodology includes class, perspective domain, and event constructs for representing knowledge within an object paradigm. The perspective construct allows for representation of knowledge from multiple and varying viewpoints. The event construct allows actual use of knowledge to be represented. The SGML implementation of the methodology facilitates usability, structured, yet flexible knowledge design, and sharing and reuse of knowledge class libraries.

  5. Research on simulation and verification system of satellite remote sensing camera video processor based on dual-FPGA

    NASA Astrophysics Data System (ADS)

    Ma, Fei; Liu, Qi; Cui, Xuenan

    2014-09-01

    To satisfy the needs for testing the video processor of satellite remote sensing cameras, a design is provided for a simulation and verification system for the satellite remote sensing camera video processor based on dual FPGAs. The correctness of the video processor FPGA logic can be verified even without CCD signals or an analog-to-digital converter. Two Xilinx Virtex FPGAs are adopted to form the central unit, and the logic for A/D digital data generation and data processing is developed in VHDL. The RS-232 interface is used to receive commands from the host computer, and different types of data are generated and output depending on the commands. Experimental results show that the simulation and verification system is flexible and works well. The simulation and verification system meets the requirements for testing the video processors of several different types of satellite remote sensing cameras.

  6. Strategies for Knowledge-Based Image Interpretation.

    DTIC Science & Technology

    1982-05-01

    hypothesis formation and hypothesis verification. Certain assumptions have been made in the experiments. The system assumes a camera position that is ... object. ... The strategy did not work well. One problem was the basic inability to label any region with great accuracy. Another was the ...

  7. A Framework for Performing Verification and Validation in Reuse Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1997-01-01

    Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission- critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  8. Truth in Complex Adaptive Systems Models Should BE Based on Proof by Constructive Verification

    NASA Astrophysics Data System (ADS)

    Shipworth, David

    It is argued that the truth status of emergent properties of complex adaptive systems models should be based on an epistemology of proof by constructive verification and therefore on the ontological axioms of a non-realist logical system such as constructivism or intuitionism. `Emergent' properties of complex adaptive systems (CAS) models create particular epistemological and ontological challenges. These challenges bear directly on current debates in the philosophy of mathematics and in theoretical computer science. CAS research, with its emphasis on computer simulation, is heavily reliant on models which explore the entailments of Formal Axiomatic Systems (FAS). The incompleteness results of Gödel, the incomputability results of Turing, and the Algorithmic Information Theory results of Chaitin, undermine a realist (platonic) truth model of emergent properties. These same findings support the hegemony of epistemology over ontology and point to alternative truth models such as intuitionism, constructivism and quasi-empiricism.

  9. NASDA knowledge-based network planning system

    NASA Technical Reports Server (NTRS)

    Yamaya, K.; Fujiwara, M.; Kosugi, S.; Yambe, M.; Ohmori, M.

    1993-01-01

    One of the SODS (space operation and data system) sub-systems, NP (network planning), was the first expert system used by NASDA (National Space Development Agency of Japan) for tracking and control of satellites. The major responsibilities of the NP system are: first, the allocation of network and satellite control resources and, second, the generation of the network operation plan data (NOP) used in automated control of the stations and control center facilities. Until now, the first task, network resource scheduling, was done by network operators. The NP system automatically generates schedules using its knowledge base, which contains information on satellite orbits, station availability, which computer is dedicated to which satellite, and how many stations must be available for a particular satellite pass or a certain time period. The NP system is introduced.

  10. DeepDive: Declarative Knowledge Base Construction.

    PubMed

    De Sa, Christopher; Ratner, Alex; Ré, Christopher; Shin, Jaeho; Wang, Feiran; Wu, Sen; Zhang, Ce

    2016-03-01

    The dark data extraction or knowledge base construction (KBC) problem is to populate a SQL database with information from unstructured data sources including emails, webpages, and pdf reports. KBC is a long-standing problem in industry and research that encompasses problems of data extraction, cleaning, and integration. We describe DeepDive, a system that combines database and machine learning ideas to help develop KBC systems. The key idea in DeepDive is that statistical inference and machine learning are key tools to attack classical data problems in extraction, cleaning, and integration in a unified and more effective manner. DeepDive programs are declarative in that one cannot write probabilistic inference algorithms; instead, one interacts by defining features or rules about the domain. A key reason for this design choice is to enable domain experts to build their own KBC systems. We present the applications, abstractions, and techniques of DeepDive employed to accelerate construction of KBC systems.

  11. THRIVE: threshold homomorphic encryption based secure and privacy preserving biometric verification system

    NASA Astrophysics Data System (ADS)

    Karabat, Cagatay; Kiraz, Mehmet Sabir; Erdogan, Hakan; Savas, Erkay

    2015-12-01

    In this paper, we introduce a new biometric verification and template protection system which we call THRIVE. The system includes novel enrollment and authentication protocols based on threshold homomorphic encryption where a private key is shared between a user and a verifier. In the THRIVE system, only encrypted binary biometric templates are stored in a database and verification is performed via homomorphically randomized templates, thus, original templates are never revealed during authentication. Due to the underlying threshold homomorphic encryption scheme, a malicious database owner cannot perform full decryption on encrypted templates of the users in the database. In addition, security of the THRIVE system is enhanced using a two-factor authentication scheme involving user's private key and biometric data. Using simulation-based techniques, the proposed system is proven secure in the malicious model. The proposed system is suitable for applications where the user does not want to reveal her biometrics to the verifier in plain form, but needs to prove her identity by using biometrics. The system can be used with any biometric modality where a feature extraction method yields a fixed size binary template and a query template is verified when its Hamming distance to the database template is less than a threshold. The overall connection time for the proposed THRIVE system is estimated to be 336 ms on average for 256-bit biometric templates on a desktop PC running with quad core 3.2 GHz CPUs at 10 Mbit/s up/down link connection speed. Consequently, the proposed system can be efficiently used in real-life applications.
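
    The plaintext decision rule underlying the verification step (Hamming distance below a threshold on fixed-size binary templates) can be sketched as follows. The sketch deliberately omits the threshold homomorphic encryption and two-factor elements that THRIVE adds, and the template size and threshold are illustrative.

      # Simplified illustration of the decision rule only (plaintext, without encryption):
      # a query binary template is accepted when its Hamming distance to the enrolled
      # template is below a threshold.
      import numpy as np

      def verify(enrolled_bits, query_bits, threshold):
          """Both templates are fixed-size binary vectors (e.g. 256 bits)."""
          hamming = int(np.count_nonzero(enrolled_bits != query_bits))
          return hamming < threshold, hamming

      rng = np.random.default_rng(7)
      enrolled = rng.integers(0, 2, size=256, dtype=np.uint8)
      genuine = enrolled.copy()
      genuine[rng.choice(256, size=20, replace=False)] ^= 1   # 20 flipped bits (noise)
      impostor = rng.integers(0, 2, size=256, dtype=np.uint8)

      print(verify(enrolled, genuine, threshold=60))    # expected: (True, 20)
      print(verify(enrolled, impostor, threshold=60))   # expected: roughly 128 differing bits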

  12. Knowledge-based systems and NASA's software support environment

    NASA Technical Reports Server (NTRS)

    Dugan, Tim; Carmody, Cora; Lennington, Kent; Nelson, Bob

    1990-01-01

    A proposed role for knowledge-based systems within NASA's Software Support Environment (SSE) is described. The SSE is chartered to support all software development for the Space Station Freedom Program (SSFP). This includes support for development of knowledge-based systems and the integration of these systems with conventional software systems. In addition to the support of development of knowledge-based systems, various software development functions provided by the SSE will utilize knowledge-based systems technology.

  13. An objective weather-regime-based verification of WRF-RTFDDA forecasts over the eastern Mediterranean

    NASA Astrophysics Data System (ADS)

    Rostkier-Edelstein, Dorita; Liu, Yubao; Pan, Linlin; Sheu, Rong-Shyang

    2014-05-01

    Numerical weather prediction in the eastern Mediterranean is very challenging because of the region's unique geography, which includes strong land-sea contrast, complex topography, highly varied vegetation, and a mosaic of urban and desert areas. This geographic heterogeneity often results in complex and dramatically different mesoscale and microscale flows under different synoptic situations. WRF-RTFDDA (Weather Research and Forecasting - Real-time Four-Dimensional Data Assimilation and forecasting system) is a WRF-based multi-scale 4-dimensional weather analysis and prediction system. It effectively assimilates diverse types of direct, retrieved and non-direct observations available at irregular times and locations using a hybrid Newtonian relaxation and 3DVar data assimilation procedure to initiate regional weather forecasts. The hybrid data assimilation and forecasting system has been implemented in a triple-nested WRF configuration with 30, 10, and 3.3 km horizontal grid spacing over the eastern Mediterranean. Analysis and forecasts have been run for a one-year period, covering four seasons that include a wide variety of synoptic weather regimes. Objective verification is conducted to study the model performance under different weather regimes. The Alpert et al. (2001) weather-regime classification method is adopted to classify the synoptic weather into 19 classes according to daily surface synoptic flows that include cyclones, highs and troughs. The aim of this paper is to investigate the model skill under different synoptic weather regimes. Objective verification statistics, including Bias, RMSE and MAE of the main weather variables, are calculated by comparing the model data with soundings and surface observations for each weather regime. Preliminary examination of the verification scores shows significant differences in model forecast accuracy under different weather situations. The RMSE of 24 h forecasts of 2-m temperature varies from 1.6 C to 2.3 C among
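
    The regime-stratified scores named above (Bias, MAE, RMSE) can be computed with a short Python sketch such as the following, where the forecast/observation pairs and regime labels are synthetic placeholders.

      # Illustrative computation of verification scores grouped by weather-regime class,
      # for paired forecast/observation samples.
      import numpy as np

      def scores_by_regime(forecast, observed, regime):
          """forecast, observed: 1-D arrays of matched values; regime: integer class labels."""
          forecast, observed, regime = map(np.asarray, (forecast, observed, regime))
          out = {}
          for r in np.unique(regime):
              err = forecast[regime == r] - observed[regime == r]
              out[int(r)] = {
                  "bias": float(err.mean()),
                  "mae":  float(np.abs(err).mean()),
                  "rmse": float(np.sqrt((err ** 2).mean())),
              }
          return out

      # Toy example: 2-m temperature pairs under three synoptic regimes.
      rng = np.random.default_rng(3)
      obs = rng.normal(20.0, 5.0, size=300)
      fcst = obs + rng.normal(0.5, 2.0, size=300)          # warm-biased forecast
      regimes = rng.integers(1, 4, size=300)
      for r, s in scores_by_regime(fcst, obs, regimes).items():
          print(r, s)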

  14. Automated knowledge base development from CAD/CAE databases

    NASA Technical Reports Server (NTRS)

    Wright, R. Glenn; Blanchard, Mary

    1988-01-01

    Knowledge base development requires a substantial investment in time, money, and resources in order to capture the knowledge and information necessary for anything other than trivial applications. This paper addresses a means to integrate the design and knowledge base development process through automated knowledge base development from CAD/CAE databases and files. Benefits of this approach include the development of a more efficient means of knowledge engineering, resulting in the timely creation of large knowledge based systems that are inherently free of error.

  15. Knowledge data base system for twins study.

    PubMed

    Reina, S; Miozza, F

    1994-01-01

    The medical research on twins, carried out at the Gregor Mendel Institute for Medical Genetics and Twin Study in Rome over the past four decades, has resulted in a vast collection of clinical paper records. A challenge was presented by the need for a more secure method of storage to preserve this enormously valuable historical and scientific patrimony and to render its contents more easily accessible for research purposes. We met the challenge by planning and developing the computerization of this material. New concepts, currently being explored in biomedical informatics, were applied to build a Knowledge Data Base System, using a fourth-generation language (SQL). This architecturally innovative computer system enables its users to manipulate data supplied, rather than just simply storing it. Based on heuristic relational criteria between variables and parameters, the system is employed to solve problems of sibling design analysis typically arising from twins' records, but is also equipped to meet future data base requirements. Another feature of the system is its users' ability to pull off data in the form of regular automated reports, which are distributed through a Local Area Network (LAN). Through a Bulletin Board System (BBS) and modem, any scientist (outside as well as within the Institute) is thus able to access data and exchange scientific information.

  16. A quantification of the effectiveness of EPID dosimetry and software-based plan verification systems in detecting incidents in radiotherapy

    SciTech Connect

    Bojechko, Casey; Phillps, Mark; Kalet, Alan; Ford, Eric C.

    2015-09-15

    Purpose: Complex treatments in radiation therapy require robust verification in order to prevent errors that can adversely affect the patient. For this purpose, the authors estimate the effectiveness of detecting errors with a “defense in depth” system composed of electronic portal imaging device (EPID) based dosimetry and a software-based system composed of rules-based and Bayesian network verifications. Methods: The authors analyzed incidents with a high potential severity score, scored as a 3 or 4 on a 4 point scale, recorded in an in-house voluntary incident reporting system, collected from February 2012 to August 2014. The incidents were categorized into different failure modes. The detectability, defined as the number of incidents that are detectable divided by the total number of incidents, was calculated for each failure mode. Results: In total, 343 incidents were used in this study. Of the incidents, 67% were related to photon external beam therapy (EBRT). The majority of the EBRT incidents were related to patient positioning, and only a small number of these could be detected by EPID dosimetry when performed prior to treatment (6%). A large fraction could be detected by in vivo dosimetry performed during the first fraction (74%). Rules-based and Bayesian network verifications were found to be complementary to EPID dosimetry, able to detect errors related to patient prescriptions and documentation, and errors unrelated to photon EBRT. Combining all of the verification steps together, 91% of all EBRT incidents could be detected. Conclusions: This study shows that the defense in depth system is potentially able to detect a large majority of incidents. The most effective EPID-based dosimetry verification is in vivo measurements during the first fraction and is complemented by rules-based and Bayesian network plan checking.
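
    The detectability metric defined above is straightforward to compute per failure mode, as in this minimal sketch with hypothetical counts (not the study's actual incident data).

      # Minimal sketch of the detectability metric: for each failure mode,
      # detectability = detectable incidents / total incidents (hypothetical counts).
      incidents = {                          # failure mode -> (detectable, total)
          "patient positioning":        (74, 100),
          "prescription/documentation": (38, 40),
          "non-photon-EBRT":            (25, 30),
      }

      for mode, (detectable, total) in incidents.items():
          print(f"{mode}: detectability = {detectable / total:.0%}")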

  17. A quantification of the effectiveness of EPID dosimetry and software-based plan verification systems in detecting incidents in radiotherapy.

    PubMed

    Bojechko, Casey; Phillps, Mark; Kalet, Alan; Ford, Eric C

    2015-09-01

    Complex treatments in radiation therapy require robust verification in order to prevent errors that can adversely affect the patient. For this purpose, the authors estimate the effectiveness of detecting errors with a "defense in depth" system composed of electronic portal imaging device (EPID) based dosimetry and a software-based system composed of rules-based and Bayesian network verifications. The authors analyzed incidents with a high potential severity score, scored as a 3 or 4 on a 4 point scale, recorded in an in-house voluntary incident reporting system, collected from February 2012 to August 2014. The incidents were categorized into different failure modes. The detectability, defined as the number of incidents that are detectable divided by the total number of incidents, was calculated for each failure mode. In total, 343 incidents were used in this study. Of the incidents, 67% were related to photon external beam therapy (EBRT). The majority of the EBRT incidents were related to patient positioning, and only a small number of these could be detected by EPID dosimetry when performed prior to treatment (6%). A large fraction could be detected by in vivo dosimetry performed during the first fraction (74%). Rules-based and Bayesian network verifications were found to be complementary to EPID dosimetry, able to detect errors related to patient prescriptions and documentation, and errors unrelated to photon EBRT. Combining all of the verification steps together, 91% of all EBRT incidents could be detected. This study shows that the defense in depth system is potentially able to detect a large majority of incidents. The most effective EPID-based dosimetry verification is in vivo measurements during the first fraction and is complemented by rules-based and Bayesian network plan checking.

  18. Advanced software development workstation. Knowledge base design: Design of knowledge base for flight planning application

    NASA Technical Reports Server (NTRS)

    Izygon, Michel E.

    1992-01-01

    The development process of the knowledge base for the generation of Test Libraries for Mission Operations Computer (MOC) Command Support focused on a series of information-gathering interviews. These knowledge capture sessions are supporting the development of a prototype for evaluating the capabilities of INTUIT on such an application. The prototype includes functions related to POCC (Payload Operation Control Center) processing. It prompts the end-users for input through a series of panels and then generates the Meds associated with the initialization and the update of hazardous command tables for a POCC Processing TLIB.

  19. Tools for Assembling and Managing Scalable Knowledge Bases

    DTIC Science & Technology

    2003-02-01

    1.1 Knowledge Translation ... areas of the knowledge base and ontology construction process and are outlined in more detail below. 1.1 Knowledge Translation: As mentioned above ... during KB merging operations. 2.2 The Translation Problem (Figure 2: The knowledge translation problem). The general problem we set out to solve is

  20. Scenario based seismic hazard assessment and its application to the seismic verification of relevant buildings

    NASA Astrophysics Data System (ADS)

    Romanelli, Fabio; Vaccari, Franco; Altin, Giorgio; Panza, Giuliano

    2016-04-01

    The procedure we developed, and applied to a few relevant cases, leads to the seismic verification of a building by: a) use of a scenario-based neodeterministic approach (NDSHA) for the calculation of the seismic input, and b) control of the numerical modeling of an existing building, using free-vibration measurements of the real structure. The key point of this approach is the close collaboration between the seismologist and the civil engineer, from the definition of the seismic input to the monitoring of the response of the building in the calculation phase. The vibrometry study allows the engineer to adjust the computational model in the direction suggested by the experimental result of a physical measurement. Once the model has been calibrated by vibrometric analysis, one can select in the design spectrum the proper range of periods of interest for the structure. Then, the realistic values of spectral acceleration, which include the appropriate amplification obtained through the modeling of a "scenario" input to be applied to the final model, can be selected. Generally, but not necessarily, the "scenario" spectra lead to higher accelerations than those deduced by taking the spectra from the national codes (i.e. NTC 2008, for Italy). The task of the verifying engineer is to act so that the outcome of the verification is conservative and realistic. We show some examples of the application of the procedure to relevant buildings (e.g. schools) of the Trieste Province. The adoption of the scenario input has, in most of the cases, increased the number of critical elements that have to be taken into account in the design of reinforcements. However, the higher cost associated with the increase of elements to reinforce is reasonable, especially considering the important reduction of the risk level.

  1. Verification & Validation of High-Order Short-Characteristics-Based Deterministic Transport Methodology on Unstructured Grids

    SciTech Connect

    Azmy, Yousry; Wang, Yaqi

    2013-12-20

    The research team has developed a practical, high-order, discrete-ordinates, short characteristics neutron transport code for three-dimensional configurations represented on unstructured tetrahedral grids that can be used for realistic reactor physics applications at both the assembly and core levels. This project will perform a comprehensive verification and validation of this new computational tool against both a continuous-energy Monte Carlo simulation (e.g. MCNP) and experimentally measured data, an essential prerequisite for its deployment in reactor core modeling. Verification is divided into three phases. The team will first conduct spatial mesh and expansion order refinement studies to monitor convergence of the numerical solution to reference solutions. This is quantified by convergence rates that are based on integral error norms computed from the cell-by-cell difference between the code’s numerical solution and its reference counterpart. The latter is either analytic or very fine- mesh numerical solutions from independent computational tools. For the second phase, the team will create a suite of code-independent benchmark configurations to enable testing the theoretical order of accuracy of any particular discretization of the discrete ordinates approximation of the transport equation. For each tested case (i.e. mesh and spatial approximation order), researchers will execute the code and compare the resulting numerical solution to the exact solution on a per cell basis to determine the distribution of the numerical error. The final activity comprises a comparison to continuous-energy Monte Carlo solutions for zero-power critical configuration measurements at Idaho National Laboratory’s Advanced Test Reactor (ATR). Results of this comparison will allow the investigators to distinguish between modeling errors and the above-listed discretization errors introduced by the deterministic method, and to separate the sources of uncertainty.
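
    The convergence-rate quantification described for the first verification phase can be illustrated by the standard observed-order formula p = ln(e_coarse/e_fine) / ln(h_coarse/h_fine); the error norms in the sketch below are hypothetical values, not results from the code under verification.

      # Illustrative calculation of an observed convergence rate from integral error norms
      # on two successively refined meshes.
      import math

      def observed_order(h_coarse, e_coarse, h_fine, e_fine):
          """Observed order of accuracy from error norms at two mesh sizes."""
          return math.log(e_coarse / e_fine) / math.log(h_coarse / h_fine)

      # Hypothetical L2 error norms against a reference solution.
      print(observed_order(h_coarse=0.2, e_coarse=4.0e-3, h_fine=0.1, e_fine=1.0e-3))  # ~2.0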

  2. Forecast bias analysis using object-based verification of regional WRF summertime convective forecasts

    NASA Astrophysics Data System (ADS)

    Starzec, Mariusz

    Forecast verification remains a crucial component of improving model forecasts, but it is still a challenge to perform. An objective method is developed to verify simulated reflectivity against radar reflectivity at a 1 km altitude utilizing the Method for Object-based Diagnostic Evaluation (MODE) Tool. Comparing the reflectivity field allows for an instantaneous view of what is occurring in simulations without any averaging that may occur when analyzing fields such as accumulated precipitation. The objective method is applied to high-resolution 3 km and 1 km local convective WRF summertime forecasts in the Northern Plains region. The bulk verification statistics reveal that the forecasts generate too many objects, over-forecast the areal coverage of convection, and over-intensify convection. No noteworthy increase in skill is found when increasing to 1 km resolution; instead, it leads to a significant over-forecasting of small cells. A sensitivity study is performed to investigate the forecast biases found, by varying the cloud droplet concentration, microphysical scheme, and horizontal resolution on a case day containing weakly forced convection mostly below the freezing level. Changing the cloud droplet concentration has a strong impact on the object-count and area biases. Increasing droplet counts to observed values generates a forecast that more closely resembles the observations in terms of area and object counts, but leads to not enough rain being generated. Changing the microphysical scheme produces the most pronounced effects on object counts and intensity, which is attributed to differences in autoconversion formulations. Coarsening the resolution from 3 km to 9 km leads to a decrease in skill, showing that 3 km simulations are more effective for convective forecasts. Increasing the resolution to 1 km amplifies the object-count bias, and is found to not be worth the additional computational expense.
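
    Two of the bulk biases discussed above (object count and areal coverage) reduce to simple ratios over MODE-style object lists, as in the following illustrative sketch with made-up object areas.

      # Simple illustration of two bulk biases computed from MODE-style object lists
      # (hypothetical object areas in km^2).
      forecast_objects = [120.0, 85.0, 40.0, 22.0, 15.0, 9.0]
      observed_objects = [150.0, 90.0, 30.0, 12.0]

      count_bias = len(forecast_objects) / len(observed_objects)
      area_bias = sum(forecast_objects) / sum(observed_objects)
      print(f"object-count bias = {count_bias:.2f}, areal-coverage bias = {area_bias:.2f}")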

  3. Reuse: A knowledge-based approach

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui

    1992-01-01

    This paper describes our research in automating the reuse process through the use of application domain models. Application domain models are explicit formal representations of the application knowledge necessary to understand, specify, and generate application programs. Furthermore, they provide a unified repository for the operational structure, rules, policies, and constraints of a specific application area. In our approach, domain models are expressed in terms of a transaction-based meta-modeling language. This paper has described in detail the creation and maintenance of hierarchical structures. These structures are created through a process that includes reverse engineering of data models with supplementary enhancement from application experts. Source code is also reverse engineered but is not a major source of domain model instantiation at this time. In the second phase of the software synthesis process, program specifications are interactively synthesized from an instantiated domain model. These specifications are currently integrated into a manual programming process but will eventually be used to derive executable code with mechanically assisted transformations. This research is performed within the context of programming-in-the-large types of systems. Although our goals are ambitious, we are implementing the synthesis system in an incremental manner through which we can realize tangible results. The client/server architecture is capable of supporting 16 simultaneous X/Motif users and tens of thousands of attributes and classes. Domain models have been partially synthesized from five different application areas. As additional domain models are synthesized and additional knowledge is gathered, we will inevitably add to and modify our representation. However, our current experience indicates that it will scale and expand to meet our modeling needs.

  4. Model-based verification and validation of the SMAP uplink processes

    NASA Astrophysics Data System (ADS)

    Khan, M. O.; Dubos, G. F.; Tirona, J.; Standley, S.

    Model-Based Systems Engineering (MBSE) is being used increasingly within the spacecraft design community because of its benefits when compared to document-based approaches. As the complexity of projects expands dramatically with continually increasing computational power and technology infusion, the time and effort needed for verification and validation (V&V) increases geometrically. Using simulation to perform design validation with system-level models earlier in the life cycle stands to bridge the gap between design of the system (based on system-level requirements) and verifying those requirements/validating the system as a whole. This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based development efforts.

  5. Proposing a Knowledge Base for Teaching Academic Content to English Language Learners: Disciplinary Linguistic Knowledge

    ERIC Educational Resources Information Center

    Turkan, Sultan; De Oliveira, Luciana C.; Lee, Okhee; Phelps, Geoffrey

    2014-01-01

    Background/Context: The current research on teacher knowledge and teacher accountability falls short on information about what teacher knowledge base could guide preparation and accountability of the mainstream teachers for meeting the academic needs of ELLs. Most recently, research on specialized knowledge for teaching has offered ways to…

  6. The Relationship between Agriculture Knowledge Bases for Teaching and Sources of Knowledge

    ERIC Educational Resources Information Center

    Rice, Amber H.; Kitchel, Tracy

    2015-01-01

    The purpose of this study was to describe the agriculture knowledge bases for teaching of agriculture teachers and to see if a relationship existed between years of teaching experience, sources of knowledge, and development of pedagogical content knowledge (PCK), using quantitative methods. A model of PCK from mathematics was utilized as a…

  7. Performance evaluation of wavelet-based face verification on a PDA recorded database

    NASA Astrophysics Data System (ADS)

    Sellahewa, Harin; Jassim, Sabah A.

    2006-05-01

    The rise of international terrorism and the rapid increase in fraud and identity theft have added urgency to the task of developing biometric-based person identification as a reliable alternative to conventional authentication methods. Human identification based on face images is a tough challenge in comparison to identification based on fingerprints or iris recognition. Yet, due to its unobtrusive nature, face recognition is the preferred method of identification for security-related applications. The success of such systems will depend on the support of massive infrastructures. Current mobile communication devices (3G smart phones) and PDAs are equipped with a camera which can capture both still and streaming video clips and a touch-sensitive display panel. Besides convenience, such devices provide an adequate secure infrastructure for sensitive and financial transactions, by protecting against fraud and repudiation while ensuring accountability. Biometric authentication systems for mobile devices would have obvious advantages in conflict scenarios when communication from beyond enemy lines is essential to save soldier and civilian lives. In areas of conflict or disaster the luxury of fixed infrastructure is not available or destroyed. In this paper, we present a wavelet-based face verification scheme that has been specifically designed and implemented on a currently available PDA. We report on its performance on the benchmark audio-visual BANCA database and on a newly developed PDA-recorded audio-visual database that includes indoor and outdoor recordings.

  8. [Precision Nursing: Individual-Based Knowledge Translation].

    PubMed

    Chiang, Li-Chi; Yeh, Mei-Ling; Su, Sui-Lung

    2016-12-01

    U.S. President Obama announced a new era of precision medicine in the Precision Medicine Initiative (PMI). This initiative aims to accelerate the progress of personalized medicine in light of individual requirements for prevention and treatment in order to improve the state of individual and public health. The recent and dramatic development of large-scale biologic databases (such as the human genome sequence), powerful methods for characterizing patients (such as genomics, microbiome, diverse biomarkers, and even pharmacogenomics), and computational tools for analyzing big data are maximizing the potential benefits of precision medicine. Nursing science should follow and keep pace with this trend in order to develop empirical knowledge and expertise in the area of personalized nursing care. Nursing scientists must encourage, examine, and put into practice innovative research on precision nursing in order to provide evidence-based guidance to clinical practice. The applications in personalized precision nursing care include: explanations of personalized information such as the results of genetic testing; patient advocacy and support; anticipation of results and treatment; ongoing chronic monitoring; and support for shared decision-making throughout the disease trajectory. Further, attention must focus on the family and the ethical implications of taking a personalized approach to care. Nurses will need to embrace the paradigm shift to precision nursing and work collaboratively across disciplines to provide the optimal personalized care to patients. If realized, the full potential of precision nursing will provide the best chance for good health for all.

  9. Knowledge-based reusable software synthesis system

    NASA Technical Reports Server (NTRS)

    Donaldson, Cammie

    1989-01-01

    The Eli system, a knowledge-based reusable software synthesis system, is being developed for NASA Langley under a Phase 2 SBIR contract. Named after Eli Whitney, the inventor of interchangeable parts, Eli assists engineers of large-scale software systems in reusing components while they are composing their software specifications or designs. Eli will identify reuse potential, search for components, select component variants, and synthesize components into the developer's specifications. The Eli project began as a Phase 1 SBIR to define a reusable software synthesis methodology that integrates reusability into the top-down development process and to develop an approach for an expert system to promote and accomplish reuse. The objectives of the Eli Phase 2 work are to integrate advanced technologies to automate the development of reusable components within the context of large system developments, to integrate with user development methodologies without significant changes in method or learning of special languages, and to make reuse the easiest operation to perform. Eli will try to address a number of reuse problems including developing software with reusable components, managing reusable components, identifying reusable components, and transitioning reuse technology. Eli is both a library facility for classifying, storing, and retrieving reusable components and a design environment that emphasizes, encourages, and supports reuse.

  10. IGENPRO knowledge-based operator support system.

    SciTech Connect

    Morman, J. A.

    1998-07-01

    Research and development is being performed on the knowledge-based IGENPRO operator support package for plant transient diagnostics and management to provide operator assistance during off-normal plant transient conditions. A generic thermal-hydraulic (T-H) first-principles approach is being implemented using automated reasoning, artificial neural networks and fuzzy logic to produce a generic T-H system-independent/plant-independent package. The IGENPRO package has a modular structure composed of three modules: the transient trend analysis module PROTREN, the process diagnostics module PRODIAG and the process management module PROMANA. Cooperative research and development work has focused on the PRODIAG diagnostic module of the IGENPRO package and the operator training matrix of transients used at the Braidwood Pressurized Water Reactor station. Promising simulator testing results with PRODIAG have been obtained for the Braidwood Chemical and Volume Control System (CVCS), and the Component Cooling Water System. Initial CVCS test results have also been obtained for the PROTREN module. The PROMANA effort also involves the CVCS. Future work will be focused on the long-term, slow and mild degradation transients where diagnoses of incipient T-H component failure prior to forced outage events is required. This will enhance the capability of the IGENPRO system as a predictive maintenance tool for plant staff and operator support.

  11. DeepDive: Declarative Knowledge Base Construction

    PubMed Central

    De Sa, Christopher; Ratner, Alex; Ré, Christopher; Shin, Jaeho; Wang, Feiran; Wu, Sen; Zhang, Ce

    2016-01-01

    The dark data extraction or knowledge base construction (KBC) problem is to populate a SQL database with information from unstructured data sources including emails, webpages, and pdf reports. KBC is a long-standing problem in industry and research that encompasses problems of data extraction, cleaning, and integration. We describe DeepDive, a system that combines database and machine learning ideas to help develop KBC systems. The key idea in DeepDive is that statistical inference and machine learning are key tools to attack classical data problems in extraction, cleaning, and integration in a unified and more effective manner. DeepDive programs are declarative in that one cannot write probabilistic inference algorithms; instead, one interacts by defining features or rules about the domain. A key reason for this design choice is to enable domain experts to build their own KBC systems. We present the applications, abstractions, and techniques of DeepDive employed to accelerate construction of KBC systems. PMID:28344371

  12. Radiochromic film based transit dosimetry for verification of dose delivery with intensity modulated radiotherapy

    SciTech Connect

    Chung, Kwangzoo; Lee, Kiho; Shin, Dongho; Kyung Lim, Young; Byeong Lee, Se; Yoon, Myonggeun; Son, Jaeman; Yong Park, Sung

    2013-02-15

    Purpose: To evaluate the transit dose based patient specific quality assurance (QA) of intensity modulated radiation therapy (IMRT) for verification of the accuracy of dose delivered to the patient. Methods: Five IMRT plans were selected and utilized to irradiate a homogeneous plastic water phantom and an inhomogeneous anthropomorphic phantom. The transit dose distribution was measured with radiochromic film and was compared with the computed dose map on the same plane using a gamma index with a 3% dose and a 3 mm distance-to-dose agreement tolerance limit. Results: While the average gamma index for comparisons of dose distributions was less than one for 98.9% of all pixels from the transit dose with the homogeneous phantom, the passing rate was reduced to 95.0% for the transit dose with the inhomogeneous phantom. Transit doses due to a 5 mm setup error may cause up to a 50% failure rate of the gamma index. Conclusions: Transit dose based IMRT QA may be superior to the traditional QA method since the former can show whether the inhomogeneity correction algorithm from TPS is accurate. In addition, transit dose based IMRT QA can be used to verify the accuracy of the dose delivered to the patient during treatment by revealing significant increases in the failure rate of the gamma index resulting from errors in patient positioning during treatment.
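
    A simplified global gamma-index calculation (3% dose / 3 mm DTA, same grid, no interpolation) conveys the comparison described above; the sketch below is an illustrative approximation of the analysis, not the authors' film-analysis software, and the toy dose maps are synthetic.

      # Simplified global gamma-index sketch for comparing a measured transit dose map
      # with the computed one on the same grid.
      import numpy as np

      def gamma_pass_rate(reference, evaluated, spacing_mm, dose_tol=0.03, dta_mm=3.0,
                          low_dose_cut=0.10):
          ref = np.asarray(reference, float)
          ev = np.asarray(evaluated, float)
          norm = ref.max()
          yy, xx = np.meshgrid(np.arange(ref.shape[0]), np.arange(ref.shape[1]), indexing="ij")
          mask = ref > low_dose_cut * norm          # evaluate only clinically relevant doses
          gammas = []
          for i, j in zip(*np.nonzero(mask)):
              dist2 = ((yy - i) * spacing_mm) ** 2 + ((xx - j) * spacing_mm) ** 2
              dose2 = ((ev - ref[i, j]) / (dose_tol * norm)) ** 2
              gammas.append(np.sqrt(dist2 / dta_mm ** 2 + dose2).min())
          gammas = np.array(gammas)
          return 100.0 * np.count_nonzero(gammas <= 1.0) / gammas.size

      # Toy maps: the evaluated map is the reference shifted by one pixel (2 mm).
      y, x = np.mgrid[0:40, 0:40]
      ref = np.exp(-((x - 20) ** 2 + (y - 20) ** 2) / 60.0)
      ev = np.exp(-((x - 21) ** 2 + (y - 20) ** 2) / 60.0)
      print(f"gamma passing rate: {gamma_pass_rate(ref, ev, spacing_mm=2.0):.1f}%")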

  13. Case-based reasoning: The marriage of knowledge base and data base

    NASA Technical Reports Server (NTRS)

    Pulaski, Kirt; Casadaban, Cyprian

    1988-01-01

    The coupling of data and knowledge has a synergistic effect when building an intelligent data base. The goal is to integrate the data and knowledge almost to the point of indistinguishability, permitting them to be used interchangeably. Examples given in this paper suggest that Case-Based Reasoning is a more integrated way to link data and knowledge than pure rule-based reasoning.

  14. [The testing and verification for interconnect faults based on cluster FPGA configuration].

    PubMed

    Duan, Cheng-Hu; Jia, Jian-Ge

    2005-05-01

    We have developed a hierarchical approach to define a set of FPGA configurations to solve the interconnect testing problem. This technique enables the detection, testing and verification of bridging faults involving intracluster interconnect and extracluster interconnect to be done easily.

  15. A framework for knowledge acquisition, representation and problem-solving in knowledge-based planning

    NASA Astrophysics Data System (ADS)

    Martinez-Bermudez, Iliana

    This research addresses the problem of developing planning knowledge-based applications. In particular, it is concerned with the problems of knowledge acquisition and representation---the issues that remain an impediment to the development of large-scale, knowledge-based planning applications. This work aims to develop a model of planning problem solving that facilitates expert knowledge elicitation and also supports effective problem solving. Achieving this goal requires determining the types of knowledge used by planning experts, the structure of this knowledge, and the problem-solving process that results in the plan. While answering these questions it became clear that the knowledge structure, as well as the process of problem solving, largely depends on the knowledge available to the expert. This dissertation proposes classification of planning problems based on their use of expert knowledge. Such classification can help in the selection of the appropriate planning method when dealing with a specific planning problem. The research concentrates on one of the identified classes of planning problems that can be characterized by well-defined and well-structured problem-solving knowledge. To achieve a more complete knowledge representation architecture for such problems, this work employs the task-specific approach to problem solving. The result of this endeavor is a task-specific methodology that allows the representation and use of planning knowledge in a structural, consistent manner specific to the domain of the application. The shell for building a knowledge-based planning application was created as a proof of concept for the methodology described in this dissertation. This shell enabled the development of a system for manufacturing planning---COMPLAN. COMPLAN encompasses knowledge related to four generic techniques used in composite material manufacturing and, given the description of the composite part, creates a family of plans capable of producing it.

  16. Computer-aided diagnosis of mammographic masses using geometric verification-based image retrieval

    NASA Astrophysics Data System (ADS)

    Li, Qingliang; Shi, Weili; Yang, Huamin; Zhang, Huimao; Li, Guoxin; Chen, Tao; Mori, Kensaku; Jiang, Zhengang

    2017-03-01

    Computer-Aided Diagnosis of masses in mammograms is an important indicator of breast cancer. The use of retrieval systems in breast examination is increasing gradually. In this respect, methods that exploit the vocabulary tree framework and the inverted file for mammographic mass retrieval have been shown to achieve high accuracy and excellent scalability. However, they treat the features in each image only as visual words and ignore the spatial configuration of the features, which greatly affects retrieval performance. To overcome this drawback, we introduce a geometric verification method for mammographic mass retrieval. First, we obtain corresponding matched features based on the vocabulary tree framework and the inverted file. We then capture the local similarity of deformations in local regions by constructing circular regions around the corresponding pairs. Meanwhile, we segment each circle to express the geometric relationship of the local matches in the area and generate a strict spatial encoding. Finally, we judge whether the matched features are correct by verifying that all spatial encodings satisfy geometric consistency. Experiments show the promising results of our approach.

  17. Video-based cargo fire verification system with fuzzy inference engine for commercial aircraft

    NASA Astrophysics Data System (ADS)

    Sadok, Mokhtar; Zakrzewski, Radek; Zeliff, Bob

    2005-02-01

    Conventional smoke detection systems currently installed onboard aircraft are often subject to high rates of false alarms. Under current procedures, whenever an alarm is issued the pilot is obliged to release fire extinguishers and to divert to the nearest airport. Aircraft diversions are costly and dangerous in some situations. A reliable detection system that minimizes the false-alarm rate and allows continuous monitoring of cargo compartments is highly desirable. A video-based system has recently been developed by Goodrich Corporation to address this problem. The Cargo Fire Verification System (CFVS) is a multi-camera system designed to provide live streaming video to the cockpit crew and to perform hotspot, fire, and smoke detection in aircraft cargo bays. In addition to video frames, the CFVS uses other sensor readings to discriminate between genuine events such as fire or smoke and nuisance alarms such as fog or dust. A Mamdani-type fuzzy inference engine is developed to provide approximate reasoning for decision making. In one implementation, Gaussian membership functions for frame intensity-based features, relative humidity, and temperature are constructed using experimental data to form the system inference engine. The CFVS performed better than conventional aircraft smoke detectors in all standardized tests.
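
    A Mamdani-type fuzzy inference step of the kind mentioned above can be sketched in a few lines. The membership-function parameters, rule set, and variable names below are illustrative assumptions only, not the CFVS implementation.

```python
"""Toy Mamdani-style fuzzy inference sketch with Gaussian membership functions
for frame intensity, relative humidity and temperature (assumed parameters)."""
import numpy as np

def gauss(x, mean, sigma):
    return np.exp(-0.5 * ((x - mean) / sigma) ** 2)

def fire_likelihood(intensity, humidity, temperature):
    """Return a crisp fire-likelihood score in [0, 1] via centroid defuzzification."""
    # Fuzzify the inputs (assumed membership parameters)
    hot_frame = gauss(intensity, mean=200.0, sigma=30.0)    # bright/hot pixels
    dry_air   = gauss(humidity,  mean=10.0, sigma=15.0)     # low relative humidity
    high_temp = gauss(temperature, mean=60.0, sigma=15.0)   # elevated cargo temperature
    foggy_air = gauss(humidity,  mean=90.0, sigma=10.0)     # near-saturated air -> nuisance

    # Rule strengths (min as the fuzzy AND)
    fire_rule     = min(hot_frame, dry_air, high_temp)      # genuine event
    nuisance_rule = min(hot_frame, foggy_air)               # fog/dust false alarm

    # Mamdani aggregation and centroid defuzzification over the output universe
    y = np.linspace(0.0, 1.0, 101)
    fire_set     = np.minimum(fire_rule,     gauss(y, 0.9, 0.15))  # "alarm" output set
    nuisance_set = np.minimum(nuisance_rule, gauss(y, 0.1, 0.15))  # "no alarm" output set
    aggregated = np.maximum(fire_set, nuisance_set)
    if aggregated.sum() == 0.0:
        return 0.0
    return float((y * aggregated).sum() / aggregated.sum())

print(fire_likelihood(intensity=210.0, humidity=12.0, temperature=65.0))  # fire-like
print(fire_likelihood(intensity=210.0, humidity=92.0, temperature=25.0))  # nuisance-like
```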

  18. Experimental Verification of a Vehicle Localization based on Moving Horizon Estimation Integrating LRS and Odometry

    NASA Astrophysics Data System (ADS)

    Sakaeta, Kuniyuki; Nonaka, Kenichiro; Sekiguchi, Kazuma

    2016-09-01

    Localization is an important function for robots to complete various tasks. For localization, both internal and external sensors are generally used. The odometry is widely used as the method based on internal sensors, but it suffers from cumulative errors. In the method using the laser range sensor (LRS), which is a kind of external sensor, the estimation accuracy is affected by the number of available measurement data. In our previous study, we applied moving horizon estimation (MHE) to vehicle localization to integrate the LRS measurement data and the odometry information, where their relative weightings are balanced adaptively according to the number of available LRS measurement data. In this paper, the effectiveness of the proposed localization method is verified through both numerical simulations and experiments using a 1/10 scale vehicle. The verification is conducted in situations where the vehicle position cannot be uniquely localized along a certain direction using the LRS measurement data alone. We achieve accurate localization even in such a situation by integrating the odometry and LRS based on MHE. We also show the superiority of the method through comparisons with a method using an extended Kalman filter (EKF).
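
    The abstract's idea of balancing odometry against a varying number of LRS returns over a horizon can be illustrated with a much-simplified moving horizon estimator. The 2D point model, landmark-range measurement, and weighting rule below are assumptions for illustration, not the authors' vehicle model.

```python
"""Simplified moving-horizon estimation sketch blending odometry and range
(LRS) measurements, with the LRS weight scaled by the number of returns."""
import numpy as np
from scipy.optimize import least_squares

def mhe_estimate(x0, odo, lrs_obs, landmarks, w_odo=1.0, w_lrs_per_point=0.5):
    """
    x0        : (2,) prior position at the start of the horizon
    odo       : (N,2) odometry displacement per step over the horizon
    lrs_obs   : list of length N; element k is an (Mk,) array of measured ranges
    landmarks : (L,2) known landmark positions (first Mk used per step)
    Returns the (N+1,2) estimated trajectory over the horizon.
    """
    N = len(odo)

    def residuals(flat):
        traj = flat.reshape(N + 1, 2)
        res = [w_odo * (traj[0] - x0)]                            # arrival-cost term
        for k in range(N):
            res.append(w_odo * (traj[k + 1] - traj[k] - odo[k]))  # odometry residual
            ranges = lrs_obs[k]
            if len(ranges):                                       # weight adapts to data count
                w = w_lrs_per_point * np.sqrt(len(ranges))
                pred = np.linalg.norm(landmarks[: len(ranges)] - traj[k + 1], axis=1)
                res.append(w * (pred - ranges))
        return np.concatenate(res)

    guess = np.vstack([x0, x0 + np.cumsum(odo, axis=0)])
    sol = least_squares(residuals, guess.ravel())
    return sol.x.reshape(N + 1, 2)

# toy usage: 3 steps of odometry, one landmark range per step
landmarks = np.array([[5.0, 0.0], [0.0, 5.0]])
odo = np.array([[1.0, 0.0]] * 3)
truth = np.vstack([[0.0, 0.0], np.cumsum(odo, axis=0)])
lrs = [np.linalg.norm(landmarks[:1] - p, axis=1) for p in truth[1:]]
print(mhe_estimate(np.zeros(2), odo, lrs, landmarks))
```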

  19. A Knowledge-Based System Developer for aerospace applications

    NASA Technical Reports Server (NTRS)

    Shi, George Z.; Wu, Kewei; Fensky, Connie S.; Lo, Ching F.

    1993-01-01

    A prototype Knowledge-Based System Developer (KBSD) has been developed for aerospace applications by utilizing artificial intelligence technology. The KBSD directly acquires knowledge from domain experts through a graphical interface and then builds expert systems from that knowledge. This raises the state of the art of knowledge acquisition/expert system technology to a new level by lessening the need for skilled knowledge engineers. The feasibility, applicability, and efficiency of the proposed concept were established, making a continuation to develop the prototype into a full-scale, general-purpose knowledge-based system developer justifiable. The KBSD has great commercial potential. It will provide a marketable software shell which alleviates the need for knowledge engineers and increases productivity in the workplace. The KBSD will therefore make knowledge-based systems available to a large portion of industry.

  1. Creating a knowledge base of biological research papers

    SciTech Connect

    Hafner, C.D.; Baclawski, K.; Futrelle, R.P.; Fridman, N.

    1994-12-31

    Intelligent text-oriented tools for representing and searching the biological research literature are being developed, which combine object-oriented databases with artificial intelligence techniques to create a richly structured knowledge base of Materials and Methods sections of biological research papers. A knowledge model of experimental processes, biological and chemical substances, and analytical techniques is described, based on the representation techniques of taxonomic semantic nets and knowledge frames. Two approaches to populating the knowledge base with the contents of biological research papers are described: natural language processing and an interactive knowledge definition tool.

  2. System Engineering for the NNSA Knowledge Base

    NASA Astrophysics Data System (ADS)

    Young, C.; Ballard, S.; Hipp, J.

    2006-05-01

    To improve ground-based nuclear explosion monitoring capability, GNEM R&E (Ground-based Nuclear Explosion Monitoring Research & Engineering) researchers at the national laboratories have collected an extensive set of raw data products. These raw data are used to develop higher level products (e.g. 2D and 3D travel time models) to better characterize the Earth at regional scales. The processed products and selected portions of the raw data are stored in an archiving and access system known as the NNSA (National Nuclear Security Administration) Knowledge Base (KB), which is engineered to meet the requirements of operational monitoring authorities. At its core, the KB is a data archive, and the effectiveness of the KB is ultimately determined by the quality of the data content, but access to that content is completely controlled by the information system in which that content is embedded. Developing this system has been the task of Sandia National Laboratories (SNL), and in this paper we discuss some of the significant challenges we have faced and the solutions we have engineered. One of the biggest system challenges with raw data has been integrating database content from the various sources to yield an overall KB product that is comprehensive, thorough and validated, yet minimizes the amount of disk storage required. Researchers at different facilities often use the same data to develop their products, and this redundancy must be removed in the delivered KB, ideally without requiring any additional effort on the part of the researchers. Further, related data content must be grouped together for KB user convenience. Initially SNL used whatever tools were already available for these tasks, and did the other tasks manually. The ever-growing volume of KB data to be merged, as well as a need for more control of merging utilities, led SNL to develop our own java software package, consisting of a low-level database utility library upon which we have built several

  3. Complementary technologies for verification of excess plutonium

    SciTech Connect

    Langner, , D.G.; Nicholas, N.J.; Ensslin, N.; Fearey, B.L.; Mitchell, D.J.; Marlow, K.W.; Luke, S.J.; Gosnell, T.B.

    1998-12-31

    Three complementary measurement technologies have been identified as candidates for use in the verification of excess plutonium of weapons origin. These technologies: high-resolution gamma-ray spectroscopy, neutron multiplicity counting, and low-resolution gamma-ray spectroscopy, are mature, robust technologies. The high-resolution gamma-ray system, Pu-600, uses the 630-670 keV region of the emitted gamma-ray spectrum to determine the ratio of ²⁴⁰Pu to ²³⁹Pu. It is useful in verifying the presence of plutonium and the presence of weapons-grade plutonium. Neutron multiplicity counting is well suited for verifying that the plutonium is of a safeguardable quantity and is weapons-quality material, as opposed to residue or waste. In addition, multiplicity counting can independently verify the presence of plutonium by virtue of a measured neutron self-multiplication and can detect the presence of non-plutonium neutron sources. The low-resolution gamma-ray spectroscopic technique is a template method that can provide continuity of knowledge that an item that enters a verification regime remains under the regime. In the initial verification of an item, multiple regions of the measured low-resolution spectrum form a unique, gamma-radiation-based template for the item that can be used for comparison in subsequent verifications. In this paper the authors discuss these technologies as they relate to the different attributes that could be used in a verification regime.

  4. SU-C-207A-04: Accuracy of Acoustic-Based Proton Range Verification in Water

    SciTech Connect

    Jones, KC; Sehgal, CM; Avery, S; Vander Stappen, F

    2016-06-15

    Purpose: To determine the accuracy and dose required for acoustic-based proton range verification (protoacoustics) in water. Methods: Proton pulses with 17 µs FWHM and instantaneous currents of 480 nA (5.6 × 10⁷ protons/pulse, 8.9 cGy/pulse) were generated by a clinical, hospital-based cyclotron at the University of Pennsylvania. The protoacoustic signal generated in a water phantom by the 190 MeV proton pulses was measured with a hydrophone placed at multiple known positions surrounding the dose deposition. The background random noise was measured. The protoacoustic signal was simulated to compare to the experiments. Results: The maximum protoacoustic signal amplitude at 5 cm distance was 5.2 mPa per 1 × 10⁷ protons (1.6 cGy at the Bragg peak). The background random noise of the measurement was 27 mPa. Comparison between simulation and experiment indicates that the hydrophone introduced a delay of 2.4 µs. For acoustic data collected with a signal-to-noise ratio (SNR) of 21, deconvolution of the protoacoustic signal with the proton pulse provided the most precise time-of-flight range measurement (standard deviation of 2.0 mm), but a systematic error (−4.5 mm) was observed. Conclusion: Based on water phantom measurements at a clinical hospital-based cyclotron, protoacoustics is a potential technique for measuring the proton Bragg peak range with 2.0 mm standard deviation. Simultaneous use of multiple detectors is expected to reduce the standard deviation, but calibration is required to remove systematic error. Based on the measured background noise and protoacoustic amplitude, a SNR of 5.3 is projected for a deposited dose of 2 Gy.
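
    The deconvolution-based time-of-flight estimate described in the abstract can be sketched as follows. The sampling rate, pulse model, Wiener-style regularization, and all names are illustrative assumptions, not the experimental analysis code.

```python
"""Time-of-flight range sketch for protoacoustics: deconvolve the measured
hydrophone trace with the proton pulse shape, then convert arrival time to
distance. Pulse shape, noise level and names are assumed for illustration."""
import numpy as np

FS = 10e6            # sampling rate [Hz] (assumed)
C_WATER = 1480.0     # speed of sound in water [m/s]

def wiener_deconvolve(signal, kernel, noise_level=1e-2):
    n = len(signal)
    K = np.fft.rfft(kernel, n)
    S = np.fft.rfft(signal, n)
    H = np.conj(K) / (np.abs(K) ** 2 + noise_level)   # regularized inverse filter
    return np.fft.irfft(S * H, n)

def estimate_range(trace, proton_pulse, fs=FS, c=C_WATER):
    impulse = wiener_deconvolve(trace, proton_pulse)
    t_arrival = np.argmax(np.abs(impulse)) / fs        # time of flight [s]
    return c * t_arrival                               # source-to-detector distance [m]

# toy example: 17 us FWHM Gaussian proton pulse, detector 5 cm from the source
t = np.arange(0, 200e-6, 1 / FS)
sigma = 17e-6 / 2.355                                  # FWHM -> sigma
pulse = np.exp(-0.5 * ((t - 40e-6) / sigma) ** 2)
tof_true = 0.05 / C_WATER
trace = np.roll(pulse, int(tof_true * FS)) + np.random.normal(0, 0.01, len(t))
print(f"estimated range: {estimate_range(trace, pulse) * 100:.1f} cm")
```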

  5. An object-based methodology for knowledge representation

    SciTech Connect

    Kelsey, R.L.; Hartley, R.T.; Webster, R.B.

    1997-11-01

    An object-based methodology for knowledge representation is presented. The constructs and notation of the methodology are described and illustrated with examples. The "blocks world," a classic artificial intelligence problem, is used to illustrate some of the features of the methodology, including perspectives and events. Representing knowledge with perspectives can enrich the detail of the knowledge and facilitate potential lines of reasoning. Events allow example uses of the knowledge to be represented along with the contained knowledge. Other features include the extensibility and maintainability of knowledge represented in the methodology.

  6. Project-Based Learning and the Limits of Corporate Knowledge.

    ERIC Educational Resources Information Center

    Rhodes, Carl; Garrick, John

    2003-01-01

    Analysis of management discourses, especially project-based learning and knowledge management, indicates that such terms as human capital, working knowledge, and knowledge assets construe managerial workers as cogito-economic subjects. Although workplace learning should develop economically related capabilities, such discourses imply that these…

  7. Knowledge Sharing in an American Multinational Company Based in Malaysia

    ERIC Educational Resources Information Center

    Ling, Chen Wai; Sandhu, Manjit S.; Jain, Kamal Kishore

    2009-01-01

    Purpose: This paper seeks to examine the views of executives working in an American based multinational company (MNC) about knowledge sharing, barriers to knowledge sharing, and strategies to promote knowledge sharing. Design/methodology/approach: This study was carried out in phases. In the first phase, a topology of organizational mechanisms for…

  10. Knowledge-Based Aid: A Four Agency Comparative Study

    ERIC Educational Resources Information Center

    McGrath, Simon; King, Kenneth

    2004-01-01

    Part of the response of many development cooperation agencies to the challenges of globalisation, ICTs and the knowledge economy is to emphasise the importance of knowledge for development. This paper looks at the discourses and practices of "knowledge-based aid" through an exploration of four agencies: the World Bank, DFID, Sida and JICA. It…

  12. Model based correction of placement error in EBL and its verification

    NASA Astrophysics Data System (ADS)

    Babin, Sergey; Borisov, Sergey; Militsin, Vladimir; Komagata, Tadashi; Wakatsuki, Tetsuro

    2016-05-01

    In maskmaking, the main source of error contributing to placement error is charging. DISPLACE software corrects the placement error for any layout, based on a physical model. The charge of a photomask and multiple discharge mechanisms are simulated to find the charge distribution over the mask. The beam deflection is calculated for each location on the mask, creating data for the placement correction. The software considers the mask layout, EBL system setup, resist, and writing order, as well as other factors such as fogging and proximity effect correction. The output of the software is the data for placement correction. One important step is the calibration of the physical model. A test layout on a single calibration mask was used for calibration. The extracted model parameters were used to verify the correction. As an ultimate test of the correction, a sophisticated layout, very different from the calibration mask, was used for the verification. The placement correction results were predicted by DISPLACE. A good correlation of the measured and predicted values of the correction confirmed the high accuracy of the charging placement error correction.

  13. Verification measurements of the Karoo Array timing system: a laser radar based time transfer system

    NASA Astrophysics Data System (ADS)

    Siebrits, R.; Bauermeister, E.; Gamatham, R.; Adams, G.; Malan, J. A.; Burger, J. P.; Kapp, F.; Gibbon, T.; Kriel, H.; Abbott, T.

    2016-02-01

    An optical fiber based laser radar time transfer system has been developed for the 64-dish MeerKAT radio interferometer telescope project to provide accurate atomic time to the receivers of the telescope system. This time transfer system is called the Karoo Array Timing System (KATS). Calibration of the time transfer system is essential to ensure that time is accurately transferred to the digitisers that form part of the receivers. Frequency domain reflectometry via vector network analysers is also used to verify measurements taken using time interval counters. This paper details the progress made in the verification measurements of the system, in order to ensure that time, accurate to within a few nanoseconds of Coordinated Universal Time (UTC), is available at the point where radio signals from astronomical sources are received. This capability enables world-class transient and timing studies with a compact radio interferometer, which has inherent advantages over large single-dish radio telescopes in observing the transient sky.

  14. Enhanced spacer-is-dielectric (sid) decomposition flow with model-based verification

    NASA Astrophysics Data System (ADS)

    Du, Yuelin; Song, Hua; Shiely, James; Wong, Martin D. F.

    2013-03-01

    Self-aligned double patterning (SADP) lithography is a leading candidate for 14nm node lower-metal layer fabrication. Besides the intrinsic overlay-tolerance capability, the accurate spacer width and uniformity control enables such technology to fabricate very narrow and dense patterns. Spacer-is-dielectric (SID) is the most popular flavor of SADP with higher flexibility in design. In the SID process, due to uniform spacer deposition, the spacer shape gets rounded at convex mandrel corners, and disregarding the corner rounding issue during SID decomposition may result in severe residue artifacts on device patterns. Previously, SADP decomposition was merely verified by Boolean operations on the decomposed layers, where the residue artifacts are not even identifiable. This paper proposes a model-based verification method for SID decomposition to identify the artifacts caused by spacer corner rounding. Then targeting residue artifact removal, an enhanced SID decomposition flow is introduced. Simulation results show that residue artifacts are removed effectively through the enhanced SID decomposition strategy.

  15. Measuring Software Test Verification for Complex Workpieces based on Virtual Gear Measuring Instrument

    NASA Astrophysics Data System (ADS)

    Yin, Peili; Wang, Jianhua; Lu, Chunxia

    2017-08-01

    Validity and correctness test verification of measuring software has been a thorny issue hindering the development of the Gear Measuring Instrument (GMI). The main reason is that the software itself is difficult to separate from the rest of the measurement system for independent evaluation. This paper presents a Virtual Gear Measuring Instrument (VGMI) to independently validate the measuring software. The triangular patch model with accurately controlled precision was taken as the virtual workpiece and a universal collision detection model was established. The whole-process simulation of workpiece measurement is implemented by the VGMI replacing the GMI, and the measuring software is tested in the proposed virtual environment. Taking the involute profile measurement procedure as an example, the validity of the software is evaluated based on the simulation results; meanwhile, experiments using the same measuring software are carried out on the involute master in a GMI. The experiment results indicate consistency of the tooth profile deviation and calibration results, thus verifying the accuracy of the gear measuring system, which includes the measurement procedures. It is shown that the VGMI presented can be applied in the validation of measuring software, providing an ideal new platform for testing complex workpiece-measuring software without calibrated artifacts.

  16. Fusion of hand vein, iris and fingerprint for person identity verification based on Bayesian theory

    NASA Astrophysics Data System (ADS)

    Li, Xiuyan; Liu, Tiegen; Deng, Shichao; Wang, Yunxin

    2009-11-01

    Biometric identification is an important guarantee for social security. In recent years, with social and economic development, greater accuracy and safety in identification have been required. Person identity verification systems that use a single biometric exhibit inherent limitations in accuracy, user acceptance, and universality. The limitations of unimodal biometric systems can be overcome by using multimodal biometric systems, which combine the conclusions made by a number of unrelated biometric indicators. Addressing the limitations of unimodal biometric identification, a recognition algorithm for multimodal biometric fusion based on hand vein, iris and fingerprint is proposed. To verify person identity, the hand vein images, iris images and fingerprint images were first preprocessed. The region of interest (ROI) of the hand vein image was obtained and filtered to reduce image noise. Multiresolution analysis theory was utilized to extract the texture information of the hand vein. The iris image was preprocessed through iris localization, eyelid detection, image normalization and image enhancement, and then the feature code of the iris was extracted from the detail images obtained using the wavelet transform. The texture feature information representing the fingerprint pattern was extracted after filtering and image enhancement. The Bayesian theorem was employed to realize the fusion at the matching score level and the fused recognition result was finally obtained. The experimental results showed that the recognition performance of the proposed fusion method was obviously higher than that of single-biometric recognition algorithms, verifying the effectiveness of the proposed method for biometrics.
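
    A minimal sketch of score-level Bayesian fusion for three matchers is given below. The Gaussian score models, their parameters, and the conditional-independence assumption are illustrative choices, not values from the paper.

```python
"""Minimal score-level Bayesian fusion sketch for three biometric matchers
(hand vein, iris, fingerprint). All distribution parameters are assumed."""
import numpy as np
from scipy.stats import norm

# (mean, std) of matching scores under genuine / impostor hypotheses (assumed)
MODELS = {
    "hand_vein":   {"genuine": (0.80, 0.10), "impostor": (0.40, 0.12)},
    "iris":        {"genuine": (0.85, 0.08), "impostor": (0.35, 0.10)},
    "fingerprint": {"genuine": (0.78, 0.12), "impostor": (0.45, 0.12)},
}

def fuse(scores, prior_genuine=0.5):
    """scores: dict modality -> matching score. Returns P(genuine | scores)."""
    log_lr = 0.0
    for modality, s in scores.items():
        g_mu, g_sd = MODELS[modality]["genuine"]
        i_mu, i_sd = MODELS[modality]["impostor"]
        # naive-Bayes assumption: modalities conditionally independent
        log_lr += norm.logpdf(s, g_mu, g_sd) - norm.logpdf(s, i_mu, i_sd)
    prior_odds = prior_genuine / (1.0 - prior_genuine)
    posterior_odds = prior_odds * np.exp(log_lr)
    return posterior_odds / (1.0 + posterior_odds)

print(fuse({"hand_vein": 0.82, "iris": 0.80, "fingerprint": 0.70}))  # likely genuine
print(fuse({"hand_vein": 0.45, "iris": 0.40, "fingerprint": 0.50}))  # likely impostor
```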

  17. Modal analysis based equivalent circuit model and its verification for a single cMUT cell

    NASA Astrophysics Data System (ADS)

    Mao, S. P.; Rottenberg, X.; Rochus, V.; Czarnecki, P.; Helin, P.; Severi, S.; Nauwelaers, B.; Tilmans, H. A. C.

    2017-03-01

    This paper presents the lumped equivalent circuit model, and its verification, of both the transmission and reception properties of a single-cell capacitive micromachined ultrasonic transducer (cMUT) operating in the non-collapse, small-signal region. The derivation of this equivalent circuit model is based on modal analysis techniques; harmonic modes are included using the mode superposition method, and thus a wide frequency-range response of the cMUT cell can be simulated by our equivalent circuit model. The importance of the cross-modal coupling between different eigenmodes of a cMUT cell is discussed for the first time. In this paper, the development of this model is illustrated only for a single circular cMUT cell under a uniform excitation. Extension of this model and corresponding results under a more generalized excitation will be presented in our upcoming publication (Mao et al 2016 Proc. IEEE Int. Ultrasonics Symp.). This model is verified by both finite element method (FEM) simulation and experimental characterization. Results predicted by our model are in good agreement with the FEM simulation results, and this holds for a single cMUT cell operated in either transmission or reception. Results obtained from the model also agree well with the experimental results for the cMUT cell. This equivalent circuit model provides an easy and precise way to rapidly predict the behavior of cMUT cells.

  18. An Evaluation of Knowledge Base Systems for Large OWL Datasets

    DTIC Science & Technology

    2004-01-01

    An Evaluation of Knowledge Base Systems for Large OWL Datasets. Yuanbo Guo, Zhengxiang Pan, and Jeff Heflin, Computer Science & Engineering ...present our work on evaluating knowledge base systems with respect to use in large OWL applications. To this end, we have developed the Lehigh University Benchmark (LUBM). The benchmark is intended to evaluate knowledge base systems with respect to extensional queries over a large dataset that

  19. Utilizing knowledge-base semantics in graph-based algorithms

    SciTech Connect

    Darwiche, A.

    1996-12-31

    Graph-based algorithms convert a knowledge base with a graph structure into one with a tree structure (a join-tree) and then apply tree-inference on the result. Nodes in the join-tree are cliques of variables and tree-inference is exponential in w*, the size of the maximal clique in the join-tree. A central property of join-trees that validates tree-inference is the running-intersection property: the intersection of any two cliques must belong to every clique on the path between them. We present two key results in connection to graph-based algorithms. First, we show that the running-intersection property, although sufficient, is not necessary for validating tree-inference. We present a weaker property for this purpose, called running-interaction, that depends on non-structural (semantical) properties of a knowledge base. We also present a linear algorithm that may reduce w* of a join-tree, possibly destroying its running-intersection property, while maintaining its running-interaction property and, hence, its validity for tree-inference. Second, we develop a simple algorithm for generating trees satisfying the running-interaction property. The algorithm bypasses triangulation (the standard technique for constructing join-trees) and does not construct a join-tree first. We show that the proposed algorithm may in some cases generate trees that are more efficient than those generated by modifying a join-tree.
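
    The running-intersection property discussed above is straightforward to check on a candidate join-tree; a small sketch (assuming networkx and a toy clique set) follows.

```python
"""Sketch of a running-intersection check: for every variable, the cliques
containing it must induce a connected subtree of the join-tree."""
import networkx as nx

def has_running_intersection(tree, cliques):
    """
    tree    : networkx.Graph whose nodes are clique ids
    cliques : dict clique id -> set of variables
    """
    variables = set().union(*cliques.values())
    for v in variables:
        holders = [c for c, vars_ in cliques.items() if v in vars_]
        if len(holders) > 1 and not nx.is_connected(tree.subgraph(holders)):
            return False
    return True

# example: cliques ABC - BCD - DE arranged in a path
cliques = {0: {"A", "B", "C"}, 1: {"B", "C", "D"}, 2: {"D", "E"}}
tree = nx.Graph([(0, 1), (1, 2)])
print(has_running_intersection(tree, cliques))        # True

bad_tree = nx.Graph([(0, 2), (2, 1)])                 # B,C now separated by clique {D,E}
print(has_running_intersection(bad_tree, cliques))    # False
```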

  20. Weather, knowledge base and life-style

    NASA Astrophysics Data System (ADS)

    Bohle, Martin

    2015-04-01

    Why main-stream curiosity for earth-science topics, and thus appraise these topics as being of public interest? Namely, to influence how humankind's activities intersect the geosphere. How to main-stream that curiosity for earth-science topics? Namely, by weaving diverse concerns into common threads drawing on a wide range of perspectives: be it the beauty or particularity of ordinary or special phenomena, evaluating hazards for or from mundane environments, or connecting scholarly investigation with the concerns of citizens at large; using traditional or modern media, arts or story-telling for the threading. Three examples: First, "weather": weather is a topic of primordial interest for most people: weather impacts on humans' lives, be it for settlement, food, mobility, hunting, fishing, or battle. It is the single earth-science topic that went "prime-time": since the early 1950s, when the broadcasting of weather forecasts started, meteorologists have presented their work to the public daily. Second, "knowledge base": earth-sciences are relevant for modern societies' economies and value setting: earth-sciences provide insights into the evolution of life-bearing planets, the functioning of Earth's systems, and the impact of humankind's activities on biogeochemical systems on Earth. These insights bear on the production of goods, living conditions and individual well-being. Third, "life-style": citizens' urban culture prejudices their experiential connections: earth-science-related phenomena are witnessed rarely, even most weather phenomena. In the past, traditional rural communities mediated their rich experiences through earth-centric story-telling. In the course of the global urbanisation process this culture has given place to society-centric story-telling. Only recently has anthropogenic global change triggered discussions on geoengineering, hazard mitigation and demographics, which, interwoven with arts, linguistics and cultural histories, offer a rich narrative

  1. Patient-specific IMRT verification using independent fluence-based dose calculation software: experimental benchmarking and initial clinical experience

    NASA Astrophysics Data System (ADS)

    Georg, Dietmar; Stock, Markus; Kroupa, Bernhard; Olofsson, Jörgen; Nyholm, Tufve; Ahnesjö, Anders; Karlsson, Mikael

    2007-08-01

    Experimental methods are commonly used for patient-specific intensity-modulated radiotherapy (IMRT) verification. The purpose of this study was to investigate the accuracy and performance of independent dose calculation software (denoted as 'MUV' (monitor unit verification)) for patient-specific quality assurance (QA). 52 patients receiving step-and-shoot IMRT were considered. IMRT plans were recalculated by the treatment planning systems (TPS) in a dedicated QA phantom, in which an experimental 1D and 2D verification (0.3 cm3 ionization chamber; films) was performed. Additionally, an independent dose calculation was performed. The fluence-based algorithm of MUV accounts for collimator transmission, rounded leaf ends, tongue-and-groove effect, backscatter to the monitor chamber and scatter from the flattening filter. The dose calculation utilizes a pencil beam model based on a beam quality index. DICOM RT files from patient plans, exported from the TPS, were directly used as patient-specific input data in MUV. For composite IMRT plans, average deviations in the high dose region between ionization chamber measurements and point dose calculations performed with the TPS and MUV were 1.6 ± 1.2% and 0.5 ± 1.1% (1 S.D.). The dose deviations between MUV and TPS slightly depended on the distance from the isocentre position. For individual intensity-modulated beams (total 367), an average deviation of 1.1 ± 2.9% was determined between calculations performed with the TPS and with MUV, with maximum deviations up to 14%. However, absolute dose deviations were mostly less than 3 cGy. Based on the current results, we aim to apply a confidence limit of 3% (with respect to the prescribed dose) or 6 cGy for routine IMRT verification. For off-axis points at distances larger than 5 cm and for low dose regions, we consider 5% dose deviation or 10 cGy acceptable. The time needed for an independent calculation compares very favourably with the net time for an experimental approach
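
    The confidence limits proposed in the abstract translate directly into a simple acceptance check; a sketch follows, with function and variable names assumed for illustration.

```python
"""Sketch of a tolerance check based on the study's suggested confidence
limits: 3% of the prescribed dose or 6 cGy at standard points, relaxed to
5% / 10 cGy for off-axis (>5 cm) or low-dose points. Names are illustrative."""

def imrt_point_check(d_independent, d_reference, d_prescribed,
                     off_axis_cm=0.0, low_dose=False):
    """Return (passes, deviation_percent, deviation_cGy)."""
    dev_cgy = abs(d_independent - d_reference)
    dev_pct = 100.0 * dev_cgy / d_prescribed
    if off_axis_cm > 5.0 or low_dose:
        limit_pct, limit_cgy = 5.0, 10.0
    else:
        limit_pct, limit_cgy = 3.0, 6.0
    # accepted if within either the percentage or the absolute-dose limit
    passes = dev_pct <= limit_pct or dev_cgy <= limit_cgy
    return passes, dev_pct, dev_cgy

# example: 2.5 cGy difference on a 200 cGy fraction at the isocentre
print(imrt_point_check(d_independent=201.0, d_reference=198.5, d_prescribed=200.0))
```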

  2. Benchmarking of Prelicensure Nursing Students' Evidence-Based Practice Knowledge.

    PubMed

    Cosme, Sheryl; Milner, Kerry A; Wonder, Amy

    2017-06-19

    Evidence-based practice (EBP) knowledge among prelicensure nursing students was measured before, immediately following, and 1 year after completion of an EBP course using a relatively new instrument. There was a significant increase in EBP knowledge immediately following the course, and knowledge was sustained 1 year later. Results enabled faculty to gauge the effectiveness of the EBP course within the curriculum to prepare students with the knowledge needed to enact EBP in practice.

  3. Zero-Knowledge Proof Based Node Authentication

    DTIC Science & Technology

    2009-05-01

    Subject terms: Airborne Network Protocol, Zero Knowledge Proof, Graph Isomorphism. ...we developed the basic guidelines for this selection; our results are inconclusive and require additional experiments. ...This report is the result of contracted fundamental research deemed exempt

  4. Prior knowledge-based approach for associating ...

    EPA Pesticide Factsheets

    Evaluating the potential human health and/or ecological risks associated with exposures to complex chemical mixtures in the ambient environment is one of the central challenges of chemical safety assessment and environmental protection. There is a need for approaches that can help to integrate chemical monitoring and bio-effects data to evaluate risks associated with chemicals present in the environment. We used prior knowledge about chemical-gene interactions to develop a knowledge assembly model for detected chemicals at five locations near two wastewater treatment plants. The assembly model was used to generate hypotheses about the biological impacts of the chemicals at each location. The hypotheses were tested using empirical hepatic gene expression data from fathead minnows exposed for 12 d at each location. Empirical gene expression data was also mapped to the assembly models to statistically evaluate the likelihood of a chemical contributing to the observed biological responses. The prior knowledge approach was able reasonably hypothesize the biological impacts at one site but not the other. Chemicals most likely contributing to the observed biological responses were identified at each location. Despite limitations to the approach, knowledge assembly models have strong potential for associating chemical occurrence with potential biological effects and providing a foundation for hypothesis generation to guide research and/or monitoring efforts relat

  6. Knowledge-Based Software Development Tools

    DTIC Science & Technology

    1993-09-01

    inference, finite differencing, and data structure selection are discussed. A detailed case study is presented that shows how these systems could cooperate...theories that were codified in the language. In particular, encoded in CHI were theories for generating data structure implementations [...], which ...problems, and the finite-differencing program optimization technique. Implementation knowledge for data structure generation and performance estimation

  7. Active Data/Knowledge Base Dictionary

    DTIC Science & Technology

    1991-09-01

    maintenance of integrity assertions using redundant aggregate. In Proceedings of the 6th Int'l Conf. on Very Large Databases, pages 126-136, Alfonso ...Minker. Logic and Databases. Plenum Press, New York, 1978. [GM83] Hector Garcia-Molina. Using semantic knowledge for transaction processing in a dis

  8. Identity Verification Systems as a Critical Infrastructure

    DTIC Science & Technology

    2012-03-01

    fraudulent credit card scanners, stolen purse or wallet, or phone and Internet scams. The FTC also reports that identity thieves steal information in...alternative to knowledge- and token-based verification. Fingerprints, retinal scans, facial recognition software, and DNA provide technically and...utilizing these systems. U.S.-VISIT was intended to automate the entry and exit process for foreign travelers. Biometric fingerprint scanners and

  9. Dust forecast over North Africa: verification with satellite and ground based observations

    NASA Astrophysics Data System (ADS)

    Singh, Aditi; Kumar, Sumit; George, John P.

    2016-05-01

    Arid regions of North Africa are considered one of the major dust sources. The present study focuses on the forecast of aerosol optical depth (AOD) of dust over different regions of North Africa. The NCMRWF Unified Model (NCUM) produces dust AOD forecasts at different wavelengths with lead times up to 240 hr, based on 00 UTC initial conditions. Model forecasts of dust AOD at 550 nm, up to 72 hr lead time and based on different initial conditions, are verified against satellite and ground-based observations of total AOD during May-June 2014, with the assumption that except for dust, the presence of all other aerosol types is negligible. Location-specific and geographical distributions of the dust AOD forecast are verified against Aerosol Robotic Network (AERONET) station observations of total and coarse-mode AOD. Moderate Resolution Imaging Spectroradiometer (MODIS) dark target and deep blue merged level 3 total aerosol optical depth (AOD) at 550 nm and Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) retrieved dust AOD at 532 nm are also used for verification. CALIOP dust AOD was obtained by vertical integration of the aerosol extinction coefficient at 532 nm from the aerosol profile level 2 products. It is found that at all the selected AERONET stations, the trend in dust AODs is well predicted by NCUM up to three days in advance. Good correlation, with consistently low bias (~ +/-0.06) and RMSE (~ 0.2) values, is found between model forecasts and point measurements of AERONET, except over one location, Cinzana (Mali). The model forecast consistently overestimated the dust AOD compared to CALIOP dust AOD, with a bias of 0.25 and RMSE of 0.40.
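
    The bias, RMSE, and correlation statistics used for this verification can be computed with a few lines of Python; the sketch below uses toy AOD values and assumes forecasts and observations have already been matched in time.

```python
"""Minimal bias / RMSE / correlation verification sketch for forecast AOD
against station observations. Arrays hold toy values for illustration."""
import numpy as np

def bias_rmse_corr(forecast, observed):
    forecast, observed = np.asarray(forecast, float), np.asarray(observed, float)
    diff = forecast - observed
    bias = diff.mean()
    rmse = np.sqrt((diff ** 2).mean())
    corr = np.corrcoef(forecast, observed)[0, 1]
    return bias, rmse, corr

# toy hourly AOD pairs at one station
obs = np.array([0.35, 0.42, 0.50, 0.61, 0.55, 0.47])
fct = np.array([0.40, 0.45, 0.58, 0.66, 0.60, 0.50])
bias, rmse, corr = bias_rmse_corr(fct, obs)
print(f"bias={bias:+.3f}  RMSE={rmse:.3f}  r={corr:.2f}")
```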

  10. Speaker verification based on the fusion of speech acoustics and inverted articulatory signals

    PubMed Central

    Li, Ming; Kim, Jangwon; Lammert, Adam; Ghosh, Prasanta Kumar; Ramanarayanan, Vikram; Narayanan, Shrikanth

    2016-01-01

    We propose a practical, feature-level and score-level fusion approach by combining acoustic and estimated articulatory information for both text independent and text dependent speaker verification. From a practical point of view, we study how to improve speaker verification performance by combining dynamic articulatory information with the conventional acoustic features. On text independent speaker verification, we find that concatenating articulatory features obtained from measured speech production data with conventional Mel-frequency cepstral coefficients (MFCCs) improves the performance dramatically. However, since directly measuring articulatory data is not feasible in many real world applications, we also experiment with estimated articulatory features obtained through acoustic-to-articulatory inversion. We explore both feature level and score level fusion methods and find that the overall system performance is significantly enhanced even with estimated articulatory features. Such a performance boost could be due to the inter-speaker variation information embedded in the estimated articulatory features. Since the dynamics of articulation contain important information, we included inverted articulatory trajectories in text dependent speaker verification. We demonstrate that the articulatory constraints introduced by inverted articulatory features help to reject wrong password trials and improve the performance after score level fusion. We evaluate the proposed methods on the X-ray Microbeam database and the RSR 2015 database, respectively, for the aforementioned two tasks. Experimental results show that we achieve more than 15% relative equal error rate reduction for both speaker verification tasks. PMID:28496292
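
    A minimal sketch of weighted score-level fusion and equal error rate (EER) computation of the kind reported above is given below; the score distributions and fusion weight are synthetic assumptions, not the paper's data.

```python
"""Sketch of score-level fusion and equal-error-rate (EER) computation for a
speaker-verification experiment. Scores and the fusion weight are synthetic."""
import numpy as np

def equal_error_rate(genuine_scores, impostor_scores):
    thresholds = np.sort(np.concatenate([genuine_scores, impostor_scores]))
    best = (1.0, None)
    for t in thresholds:
        frr = np.mean(genuine_scores < t)    # genuine trials rejected
        far = np.mean(impostor_scores >= t)  # impostor trials accepted
        gap = abs(far - frr)
        if gap < best[0]:
            best = (gap, 0.5 * (far + frr))
    return best[1]

rng = np.random.default_rng(0)
# per-trial scores from the acoustic and articulatory subsystems (toy data)
gen_acoustic, gen_artic = rng.normal(2.0, 1.0, 500), rng.normal(1.5, 1.0, 500)
imp_acoustic, imp_artic = rng.normal(0.0, 1.0, 500), rng.normal(0.0, 1.0, 500)

w = 0.7                                       # fusion weight (assumed)
gen_fused = w * gen_acoustic + (1 - w) * gen_artic
imp_fused = w * imp_acoustic + (1 - w) * imp_artic

print(f"acoustic-only EER: {equal_error_rate(gen_acoustic, imp_acoustic):.3f}")
print(f"fused EER:         {equal_error_rate(gen_fused, imp_fused):.3f}")
```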

  11. Film based verification of calculation algorithms used for brachytherapy planning-getting ready for upcoming challenges of MBDCA.

    PubMed

    Zwierzchowski, Grzegorz; Bielęda, Grzegorz; Skowronek, Janusz; Mazur, Magdalena

    2016-08-01

    A well-known defect of TG-43 based algorithms used in brachytherapy is the lack of information about interaction cross-sections, which are determined not only by electron density but also by atomic number. The TG-186 recommendations, with the use of model-based dose calculation algorithms (MBDCA), accurate tissue segmentation, and the structures' elemental composition, continue to create difficulties in brachytherapy dosimetry. For the clinical use of new algorithms, it is necessary to introduce reliable and repeatable methods of treatment planning system (TPS) verification. The aim of this study is the verification of the calculation algorithm used in the TPS for shielded vaginal applicators, as well as developing verification procedures for current and further use, based on the film dosimetry method. Calibration data were collected by separately irradiating 14 sheets of Gafchromic(®) EBT films with doses from 0.25 Gy to 8.0 Gy using an HDR ¹⁹²Ir source. Standard vaginal cylinders of three diameters were used in the water phantom. Measurements were performed without any shields and with three shield combinations. Gamma analyses were performed using the VeriSoft(®) package. The calibration curve was determined to be a third-degree polynomial. For all used diameters of the unshielded cylinder and for all shield combinations, gamma analyses were performed and showed that over 90% of analyzed points meet the gamma criteria (3%, 3 mm). The gamma analysis showed good agreement between dose distributions calculated using the TPS and measured by Gafchromic films, thus showing the viability of using film dosimetry in brachytherapy.
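
    The third-degree polynomial calibration step can be sketched as follows; the film response values are toy numbers standing in for measured net optical densities, not the authors' calibration data.

```python
"""Sketch of the film-dosimetry calibration step: fit a third-degree polynomial
to (net optical density, delivered dose) pairs and use it to convert measured
film response to dose. Response values are toy numbers."""
import numpy as np

# delivered doses [Gy] for the calibration sheets (0.25 - 8.0 Gy, as in the study)
doses = np.array([0.25, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
# corresponding net optical density of the EBT film (toy, monotonic values)
net_od = np.array([0.03, 0.06, 0.11, 0.16, 0.20, 0.28, 0.35, 0.41, 0.46, 0.51, 0.55])

# third-degree polynomial dose = f(netOD)
coeffs = np.polyfit(net_od, doses, deg=3)
calibration = np.poly1d(coeffs)

def film_to_dose(measured_od):
    """Convert measured net optical density to dose [Gy] via the calibration curve."""
    return calibration(measured_od)

print(f"netOD 0.25 -> {film_to_dose(0.25):.2f} Gy")
```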

  12. Automated knowledge acquisition for second generation knowledge base systems: A conceptual analysis and taxonomy

    SciTech Connect

    Williams, K.E.; Kotnour, T.

    1991-12-31

    In this paper, we present a conceptual analysis of knowledge-base development methodologies. The purpose of this research is to help overcome the high cost and lack of efficiency in developing knowledge base representations for artificial intelligence applications. To accomplish this purpose, we analyzed the available methodologies and developed a knowledge-base development methodology taxonomy. We review manual, machine-aided, and machine-learning methodologies. A set of developed characteristics allows description and comparison among the methodologies. We present the results of this conceptual analysis of methodologies and recommendations for development of more efficient and effective tools.

  14. Approximate Degrees of Similarity between a User's Knowledge and the Tutorial Systems' Knowledge Base

    ERIC Educational Resources Information Center

    Mogharreban, Namdar

    2004-01-01

    A typical tutorial system functions by means of interaction between four components: the expert knowledge base component, the inference engine component, the learner's knowledge component and the user interface component. In typical tutorial systems the interaction and the sequence of presentation as well as the mode of evaluation are…

  15. Construction of Expert Knowledge Monitoring and Assessment System Based on Integral Method of Knowledge Evaluation

    ERIC Educational Resources Information Center

    Golovachyova, Viktoriya N.; Menlibekova, Gulbakhyt Zh.; Abayeva, Nella F.; Ten, Tatyana L.; Kogaya, Galina D.

    2016-01-01

    Using computer-based monitoring systems that rely on tests could be the most effective way of knowledge evaluation. The problem of objective knowledge assessment by means of testing takes on a new dimension in the context of new paradigms in education. The analysis of the existing test methods enabled us to conclude that tests with selected…

  16. Students' Refinement of Knowledge during the Development of Knowledge Bases for Expert Systems.

    ERIC Educational Resources Information Center

    Lippert, Renate; Finley, Fred

    The refinement of the cognitive knowledge base was studied through exploration of the transition from novice to expert and the use of an instructional strategy called novice knowledge engineering. Six college freshmen, who were enrolled in an honors physics course, used an expert system to create questions, decisions, rules, and explanations…

  17. Advancing the hydrogen safety knowledge base

    SciTech Connect

    Weiner, S. C.

    2014-08-29

    The International Energy Agency's Hydrogen Implementing Agreement (IEA HIA) was established in 1977 to pursue collaborative hydrogen research and development and information exchange among its member countries. Information and knowledge dissemination is a key aspect of the work within IEA HIA tasks, and case studies, technical reports and presentations/publications often result from the collaborative efforts. The work conducted in hydrogen safety under Task 31 and its predecessor, Task 19, can positively impact the objectives of national programs even in cases for which a specific task report is not published. As a result, the interactions within Task 31 illustrate how technology information and knowledge exchange among participating hydrogen safety experts serve the objectives intended by the IEA HIA.

  19. Fundamentals of Knowledge-Based Techniques

    DTIC Science & Technology

    2006-09-01

    Predicate logic knowledge representation models what are known as facts within a domain and represents these facts so that an inference engine or... [Figure 3. Relational DBMS Model of Facts and Causes] ...Semantic Nets: Originally, semantic nets were developed for the purpose of modeling the English language (7), for a computer to understand. A method

  20. Online Handwritten Signature Verification Using Neural Network Classifier Based on Principal Component Analysis

    PubMed Central

    Iranmanesh, Vahab; Ahmad, Sharifah Mumtazah Syed; Adnan, Wan Azizun Wan; Arigbabu, Olasimbo Ayodeji; Malallah, Fahad Layth

    2014-01-01

    One of the main difficulties in designing an online signature verification (OSV) system is to find the most distinctive features with high discriminating capabilities for the verification, particularly with regard to the high variability which is inherent in genuine handwritten signatures, coupled with the possibility of skilled forgeries having close resemblance to the original counterparts. In this paper, we propose a systematic approach to online signature verification through the use of a multilayer perceptron (MLP) on a subset of principal component analysis (PCA) features. The proposed approach illustrates a feature selection technique on the usually discarded information from the PCA computation, which can be significant in attaining reduced error rates. The experiment is performed using 4000 signature samples from the SIGMA database, which yielded a false acceptance rate (FAR) of 7.4% and a false rejection rate (FRR) of 6.4%. PMID:25133227

  1. Online handwritten signature verification using neural network classifier based on principal component analysis.

    PubMed

    Iranmanesh, Vahab; Ahmad, Sharifah Mumtazah Syed; Adnan, Wan Azizun Wan; Yussof, Salman; Arigbabu, Olasimbo Ayodeji; Malallah, Fahad Layth

    2014-01-01

    One of the main difficulties in designing an online signature verification (OSV) system is to find the most distinctive features with high discriminating capabilities for the verification, particularly with regard to the high variability which is inherent in genuine handwritten signatures, coupled with the possibility of skilled forgeries having close resemblance to the original counterparts. In this paper, we propose a systematic approach to online signature verification through the use of a multilayer perceptron (MLP) on a subset of principal component analysis (PCA) features. The proposed approach illustrates a feature selection technique on the usually discarded information from the PCA computation, which can be significant in attaining reduced error rates. The experiment is performed using 4000 signature samples from the SIGMA database, which yielded a false acceptance rate (FAR) of 7.4% and a false rejection rate (FRR) of 6.4%.
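
    A minimal sketch of the PCA-plus-MLP verification pipeline with FAR/FRR evaluation is shown below, using synthetic feature vectors in place of the SIGMA signature features; all dimensions and hyperparameters are illustrative assumptions.

```python
"""Minimal PCA + multilayer perceptron verification sketch with FAR/FRR
computed on a held-out set. Synthetic data stands in for signature features."""
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# toy feature vectors: label 1 = genuine signature, 0 = forgery/impostor
X_gen = rng.normal(0.5, 1.0, size=(400, 60))
X_imp = rng.normal(-0.5, 1.0, size=(400, 60))
X = np.vstack([X_gen, X_imp])
y = np.concatenate([np.ones(400), np.zeros(400)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

pca = PCA(n_components=20).fit(X_tr)               # keep a subset of components
mlp = MLPClassifier(hidden_layer_sizes=(30,), max_iter=1000, random_state=0)
mlp.fit(pca.transform(X_tr), y_tr)

pred = mlp.predict(pca.transform(X_te))
far = np.mean(pred[y_te == 0] == 1)                # impostors accepted
frr = np.mean(pred[y_te == 1] == 0)                # genuine users rejected
print(f"FAR={far:.3f}  FRR={frr:.3f}")
```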

  2. Experimental evaluation of fingerprint verification system based on double random phase encoding

    NASA Astrophysics Data System (ADS)

    Suzuki, Hiroyuki; Yamaguchi, Masahiro; Yachida, Masuyoshi; Ohyama, Nagaaki; Tashima, Hideaki; Obi, Takashi

    2006-03-01

    We proposed a smart card holder authentication system that combines fingerprint verification with PIN verification by applying a double random phase encoding scheme. In this system, the probability of accurate verification of an authorized individual is reduced when the fingerprint is shifted significantly. In this paper, a review of the proposed system is presented and preprocessing for improving the false rejection rate is proposed. In the proposed method, the position difference between two fingerprint images is estimated by using an optimized template for core detection. When the estimated difference exceeds the permissible level, the user inputs the fingerprint again. The effectiveness of the proposed method is confirmed by a computational experiment; its results show that the false rejection rate is improved.
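
    The double random phase encoding scheme itself can be sketched compactly with two random phase masks and Fourier transforms; the image size, correlation test, and threshold below are illustrative assumptions, not the proposed system's parameters.

```python
"""Toy double random phase encoding (DRPE) sketch: an image is encrypted with
two random phase masks (spatial and Fourier domain); verification correlates
the decrypted result with the enrolled template."""
import numpy as np

rng = np.random.default_rng(42)
N = 64
phase1 = np.exp(1j * 2 * np.pi * rng.random((N, N)))   # input-plane mask (the "key")
phase2 = np.exp(1j * 2 * np.pi * rng.random((N, N)))   # Fourier-plane mask

def encrypt(image):
    return np.fft.ifft2(np.fft.fft2(image * phase1) * phase2)

def decrypt(cipher):
    return np.fft.ifft2(np.fft.fft2(cipher) * np.conj(phase2)) * np.conj(phase1)

def verify(cipher, template, threshold=0.9):
    recovered = np.abs(decrypt(cipher))
    c = np.corrcoef(recovered.ravel(), template.ravel())[0, 1]
    return c >= threshold, c

# enrol a synthetic "fingerprint" image and verify it
template = rng.random((N, N))
cipher = encrypt(template)
print(verify(cipher, template))                       # (True, ~1.0)
print(verify(encrypt(rng.random((N, N))), template))  # different finger -> low correlation
```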

  3. EHR based Genetic Testing Knowledge Base (iGTKB) Development.

    PubMed

    Zhu, Qian; Liu, Hongfang; Chute, Christopher G; Ferber, Matthew

    2015-01-01

    The gap between a large and growing number of genetic tests and a suboptimal clinical workflow for incorporating these tests into regular clinical practice poses barriers to effective reliance on advanced genetic technologies to improve the quality of healthcare. A promising solution to fill this gap is to develop an intelligent genetic test recommendation system that not only can provide a comprehensive view of genetic tests as education resources, but also can recommend the most appropriate genetic tests to patients based on clinical evidence. In this study, we developed an EHR based Genetic Testing Knowledge Base for Individualized Medicine (iGTKB). We extracted genetic testing information and patient medical records from EHR systems at Mayo Clinic. Clinical features have been semi-automatically annotated from the clinical notes by applying a Natural Language Processing (NLP) tool, the MedTagger suite. To prioritize clinical features for each genetic test, we compared odds ratios across four population groups. Genetic tests, genetic disorders and clinical features with their odds ratios have been used to establish iGTKB, which is to be integrated into the Genetic Testing Ontology (GTO). Overall, five genetic tests were operated with a sample size greater than 100 in 2013 at Mayo Clinic. A total of 1,450 patients who were tested by one of the five genetic tests were selected. We assembled 243 clinical features from the Human Phenotype Ontology (HPO) for these five genetic tests. There are 60 clinical features with at least one mention in the clinical notes of patients taking the tests. Twenty-eight clinical features with a high odds ratio (greater than 1) were selected as dominant features and deposited into iGTKB with their associated information about genetic tests and genetic disorders. In this study, we developed an EHR based genetic testing knowledge base, iGTKB. iGTKB will be integrated into the GTO by providing relevant clinical evidence, and ultimately to
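
    The odds-ratio screen for dominant clinical features can be sketched as below; the 2x2 counts and the 0.5 continuity correction are illustrative assumptions, not Mayo Clinic data.

```python
"""Sketch of an odds-ratio screen for dominant clinical features: from 2x2
counts of (feature mentioned / not mentioned) x (tested / comparison
population), compute the odds ratio and keep features with OR > 1."""

def odds_ratio(mentioned_tested, not_mentioned_tested,
               mentioned_comparison, not_mentioned_comparison):
    # add 0.5 to each cell to avoid division by zero for sparse features
    a, b = mentioned_tested + 0.5, not_mentioned_tested + 0.5
    c, d = mentioned_comparison + 0.5, not_mentioned_comparison + 0.5
    return (a / b) / (c / d)

features = {
    # feature: (a, b, c, d) toy counts
    "muscle weakness": (40, 60, 10, 190),
    "headache":        (12, 88, 25, 175),
}
dominant = {f: odds_ratio(*cells) for f, cells in features.items()
            if odds_ratio(*cells) > 1.0}
print(dominant)   # features retained for the knowledge base
```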

  4. 3D EPID based dosimetry for pre-treatment verification of VMAT - methods and challenges

    NASA Astrophysics Data System (ADS)

    Greer, P. B.

    2013-06-01

    This article presents an overview of pre-treatment verification of volumetric modulated arc therapy (VMAT) with electronic portal imaging devices (EPIDs). Challenges to VMAT verification with EPIDs are discussed including EPID sag/flex during rotation, acquisition using cine-mode imaging, image artefacts during VMAT and determining the gantry angle for each image. The major methods that have been proposed to verify VMAT with EPIDs are introduced including those using or adapting commercial software systems and non-commercial implementations. Both two-dimensional and three-dimensional methods are reviewed.

  5. Knowledge-Based Entrepreneurship in a Boundless Research System

    ERIC Educational Resources Information Center

    Dell'Anno, Davide

    2008-01-01

    International entrepreneurship and knowledge-based entrepreneurship have recently generated considerable academic and non-academic attention. This paper explores the "new" field of knowledge-based entrepreneurship in a boundless research system. Cultural barriers to the development of business opportunities by researchers persist in some academic…

  7. 7 CFR 983.67 - Random verification audits.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing..., ARIZONA, AND NEW MEXICO Reports, Books and Records § 983.67 Random verification audits. (a) All handlers... the committee, based on information from the industry or knowledge of possible violations, may...

  8. Verification of sectoral cloud motion based direct normal irradiance nowcasting from satellite imagery

    NASA Astrophysics Data System (ADS)

    Schroedter-Homscheidt, Marion; Gesell, Gerhard

    2016-05-01

    The successful integration of solar electricity from photovoltaics or concentrating solar power plants into the existing electricity supply requires an electricity production forecast for 48 hours, while any improved surface irradiance forecast over the upcoming hours is relevant for optimized operation of the power plant. While numerical weather prediction has been widely assessed and is in commercial use, short-term nowcasting is still a major field of development. The European Commission's FP7 DNICast project focuses especially on this task, and this paper reports on parts of the DNICast results. A nowcasting scheme based on Meteosat Second Generation cloud imagery and cloud movement tracking has been developed for Southern Spain as part of a solar production forecasting tool (CSP-FoSyS). It avoids the well-known but not entirely satisfactory standard cloud motion vector approach by using a sectoral approach and asking at which time any cloud structure will affect the power plant. It distinguishes between thin cirrus clouds and other clouds, which typically occur at different heights in the atmosphere and move in different directions. Their optical properties are also very different, especially for the calculation of direct normal irradiance as required by concentrating solar power plants. Results for Southern Spain show a positive impact of up to 8 hours depending on the time of day and an RMSD reduction of up to 10% in hourly DNI irradiation compared to day-ahead forecasts. This paper presents the verification of this scheme at other locations in Europe and Northern Africa (BSRN and EnerMENA stations) with different cloud conditions. Especially for Jordan and Tunisia, the most relevant countries for CSP in this station list, we also find a positive impact of up to 8 hours.
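
    A minimal verification computation in the spirit of this comparison (synthetic numbers, not DNICast or BSRN data) is sketched below: the RMSD of the nowcast and of a day-ahead forecast against station measurements, and the relative reduction.

```python
# A minimal verification sketch (synthetic values, not DNICast data): compare
# hourly DNI irradiation from a nowcast and from a day-ahead forecast against
# station measurements via RMSD, and report the relative improvement.
import numpy as np

def rmsd(pred, obs):
    return float(np.sqrt(np.mean((np.asarray(pred) - np.asarray(obs)) ** 2)))

obs       = [650, 700, 720, 400, 150]   # hypothetical hourly DNI (Wh/m^2)
nowcast   = [630, 690, 700, 430, 170]
day_ahead = [600, 760, 650, 300, 220]

impr = 1.0 - rmsd(nowcast, obs) / rmsd(day_ahead, obs)
print(f"RMSD reduction vs. day-ahead forecast: {impr:.1%}")
```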

  9. First Experience With Real-Time EPID-Based Delivery Verification During IMRT and VMAT Sessions.

    PubMed

    Woodruff, Henry C; Fuangrod, Todsaporn; Van Uytven, Eric; McCurdy, Boyd M C; van Beek, Timothy; Bhatia, Shashank; Greer, Peter B

    2015-11-01

    Gantry-mounted megavoltage electronic portal imaging devices (EPIDs) have become ubiquitous on linear accelerators. WatchDog is a novel application of EPIDs, in which the image frames acquired during treatment are used to monitor treatment delivery in real time. We report on the preliminary use of WatchDog in a prospective study of cancer patients undergoing intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT) and identify the challenges of clinical adoption. At the time of submission, 28 cancer patients (head and neck, pelvis, and prostate) undergoing fractionated external beam radiation therapy (24 IMRT, 4 VMAT) had ≥1 treatment fraction verified in real time (131 fractions or 881 fields). EPID images acquired continuously during treatment were synchronized and compared with model-generated transit EPID images within a frame time (∼0.1 s). A χ comparison was performed on cumulative frames to gauge the overall delivery quality, and the resulting pass rates were reported graphically during treatment delivery. Every frame acquired (500-1500 per fraction) was saved for postprocessing and analysis. The system reported real-time cumulative-frame χ pass rates (mean ± standard deviation) of 91.1% ± 11.5% (83.6% ± 13.2%) for 4%, 4 mm (3%, 3 mm) criteria, evaluated globally over the integrated image. A real-time EPID-based radiation delivery verification system for IMRT and VMAT has been demonstrated that aims to prevent major mistreatments in radiation therapy. Copyright © 2015 Elsevier Inc. All rights reserved.
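
    For illustration only, the sketch below computes a per-pixel pass rate between a predicted and a measured cumulative frame using a global dose-difference criterion; the clinical system's χ evaluation also includes a distance-to-agreement component (the 4 mm part of the 4%, 4 mm criterion), which is omitted here.

```python
# Simplified illustration only: a per-pixel pass-rate check between a measured
# cumulative EPID frame and a model-predicted frame using a global dose-difference
# criterion. The full chi evaluation adds a distance-to-agreement term (4 mm),
# which this sketch deliberately leaves out.
import numpy as np

def pass_rate(measured, predicted, percent=4.0, low_dose_cutoff=0.1):
    measured = np.asarray(measured, float)
    predicted = np.asarray(predicted, float)
    norm = predicted.max()                      # "global" normalization
    roi = predicted > low_dose_cutoff * norm    # ignore very low-dose pixels
    diff = np.abs(measured - predicted) / norm
    return float(np.mean(diff[roi] <= percent / 100.0))

rng = np.random.default_rng(2)
pred = rng.random((256, 256))                   # stand-in predicted transit image
meas = pred + rng.normal(scale=0.02, size=pred.shape)
print(f"pass rate: {pass_rate(meas, pred):.1%}")
```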

  10. Contour-based kernel modeling and verification for E-Beam lithography

    NASA Astrophysics Data System (ADS)

    You, Jan-Wen; Chen, Cheng-Hung; Chien, Tsung-Chih; Shin, Jaw-Jung; Lin, Shy-Jay; Lin, Burn J.

    2015-03-01

    In e-beam lithography, the double or multiple Gaussian kernels used to describe electron scattering behavior have been discussed extensively for critical dimensions (CDs) larger than the e-beam blur size. However, in e-beam direct write on wafer, CDs are close to the beam blur size because of requirements in both resolution and throughput. This situation gives rise to a severe iso-dense CD bias, so an accurate modeling kernel is required to achieve a larger common process window. In this paper we present contour-based kernel modeling and verification for e-beam lithography. The edge contours of CD-SEM images of a contact hole array pattern with duty ratio splits are used in this Gaussian kernel modeling study. A two-step optimization sequence is proposed to improve fitting efficiency and robustness. In the first step, roundness is the primary and most effective index in the corner region, which is sensitive for determining the beam blur size. The next step minimizes the deviation of the through-pitch proximity effect by adjusting the ratio of electron backscattering to electron forward scattering. The more accurate cost index, edge placement error, is applied in this subsequent optimization step, with the beam blur size constrained to the value extracted in the previous step. The optimum modeling kernel parameters are obtained from the lowest cost deviation between the simulated contours and the CD-SEM extracted edge contours after the optimization iterations. For an early study of the proximity impact on future EBDW systems, the exposure experiment is performed on an EBM-8000 mask writer to build the modeling kernel. The prediction accuracy of the optimum modeling kernel on 60-nm features with different pattern densities is also verified experimentally to be within 1.5 nm.
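
    A hedged sketch of the kernel-fitting idea is shown below: it fits the standard two-Gaussian proximity kernel (forward-scatter width α, back-scatter width β, back-to-forward ratio η) to synthetic radial dose samples with a single curve fit, rather than the paper's two-step roundness-then-ratio optimization on CD-SEM contours.

```python
# Hedged sketch (synthetic data, not the EBM-8000 experiment): fit a standard
# two-Gaussian e-beam proximity kernel, with forward-scatter width alpha,
# back-scatter width beta, and back-to-forward ratio eta, to radial dose samples.
import numpy as np
from scipy.optimize import curve_fit

def double_gaussian(r, alpha, beta, eta):
    return (1.0 / (np.pi * (1.0 + eta))) * (
        np.exp(-(r / alpha) ** 2) / alpha**2
        + eta * np.exp(-(r / beta) ** 2) / beta**2
    )

r = np.linspace(0.0, 2000.0, 200)                       # radius in nm
true = double_gaussian(r, 30.0, 800.0, 0.6)             # hypothetical ground truth
noisy = true * (1.0 + 0.02 * np.random.default_rng(3).normal(size=r.size))

popt, _ = curve_fit(double_gaussian, r, noisy, p0=(20.0, 500.0, 0.5))
alpha, beta, eta = popt
print(f"alpha={alpha:.1f} nm, beta={beta:.1f} nm, eta={eta:.2f}")
```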

  11. Verification of the FtCayuga fault-tolerant microprocessor system. Volume 1: A case study in theorem prover-based verification

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam; Bickford, Mark

    1991-01-01

    The design and formal verification of a hardware system for a task that is an important component of a fault tolerant computer architecture for flight control systems is presented. The hardware system implements an algorithm for obtaining interactive consistency (Byzantine agreement) among four microprocessors as a special instruction on the processors. The property verified ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold. An assumption is made that the processors execute synchronously. For verification, the authors used a computer-aided hardware design verification tool, Spectool, and the theorem prover Clio. A major contribution of the work is the demonstration of a significant fault tolerant hardware design that is mechanically verified by a theorem prover.

  12. Improved knowledge diffusion model based on the collaboration hypernetwork

    NASA Astrophysics Data System (ADS)

    Wang, Jiang-Pan; Guo, Qiang; Yang, Guang-Yong; Liu, Jian-Guo

    2015-06-01

    The process of absorbing knowledge has become an essential element for innovation in firms and for adapting to changes in the competitive environment. In this paper, we present an improved knowledge diffusion hypernetwork (IKDH) model based on the idea that knowledge spreads from the target node to all its neighbors in terms of the hyperedge and knowledge stock. We apply the average knowledge stock V(t), the variance σ²(t), and the variance coefficient c(t) to evaluate the performance of knowledge diffusion. Analyzing how the performance of knowledge diffusion depends on the diffusion mode, the selection of highly knowledgeable nodes, the hypernetwork size, and the hypernetwork structure, the results show that the diffusion speed of the IKDH model is 3.64 times faster than that of the traditional knowledge diffusion (TKDH) model. Moreover, knowledge diffuses three times faster when "expert" nodes are selected randomly than when large-hyperdegree nodes are selected as "expert" nodes. Furthermore, either a closer network structure or a smaller network size results in faster knowledge diffusion.
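
    The three evaluation quantities can be computed as in the toy sketch below; the diffusion step is a simple stand-in rather than the IKDH dynamics, and c(t) is taken here as σ(t)/V(t).

```python
# A minimal sketch (toy dynamics, not the IKDH model itself) of the three
# evaluation quantities used in the paper: average knowledge stock V(t),
# variance sigma^2(t), and variance coefficient c(t) = sigma(t) / V(t).
import numpy as np

rng = np.random.default_rng(4)
knowledge = rng.random(100)                 # initial knowledge stock of 100 nodes

def metrics(k):
    v = k.mean()
    var = k.var()
    return v, var, np.sqrt(var) / v

for t in range(5):
    # toy diffusion step: every node moves a little toward a random partner's stock
    partners = rng.integers(0, knowledge.size, size=knowledge.size)
    knowledge += 0.2 * np.maximum(knowledge[partners] - knowledge, 0.0)
    V, var, c = metrics(knowledge)
    print(f"t={t}  V(t)={V:.3f}  sigma^2(t)={var:.4f}  c(t)={c:.3f}")
```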

  13. A New "Moodle" Module Supporting Automatic Verification of VHDL-Based Assignments

    ERIC Educational Resources Information Center

    Gutierrez, Eladio; Trenas, Maria A.; Ramos, Julian; Corbera, Francisco; Romero, Sergio

    2010-01-01

    This work describes a new "Moodle" module developed to give support to the practical content of a basic computer organization course. This module goes beyond the mere hosting of resources and assignments. It makes use of an automatic checking and verification engine that works on the VHDL designs submitted by the students. The module automatically…

  14. A New "Moodle" Module Supporting Automatic Verification of VHDL-Based Assignments

    ERIC Educational Resources Information Center

    Gutierrez, Eladio; Trenas, Maria A.; Ramos, Julian; Corbera, Francisco; Romero, Sergio

    2010-01-01

    This work describes a new "Moodle" module developed to give support to the practical content of a basic computer organization course. This module goes beyond the mere hosting of resources and assignments. It makes use of an automatic checking and verification engine that works on the VHDL designs submitted by the students. The module automatically…

  15. IMAGE-BASED VERIFICATION: SOME ADVANTAGES, CHALLENGES, AND ALGORITHM-DRIVEN REQUIREMENTS

    SciTech Connect

    Seifert, Allen; McDonald, Benjamin S.; Jarman, Kenneth D.; Robinson, Sean M.; Misner, Alex C.; Miller, Erin A.; White, Timothy A.; Pitts, William K.

    2011-06-10

    Imaging technologies may be particularly useful techniques for supporting monitoring and verification of deployed and stockpiled nuclear weapons and dismantlement components. However, protecting the sensitive design information requires processing the image behind an information barrier and reporting only non-sensitive attributes related to the image. Reducing images to attributes may destroy some sensitive information, but the challenge remains. For example, reducing the measurement to an attribute such as defined shape and X-ray transmission of an edge might reveal sensitive information relating to shape, size, and material composition. If enough additional information is available to analyze with the attribute, it may still be possible to extract sensitive design information. In spite of these difficulties, the implementation of new treaty requirements may demand imaging technology as an option. Two fundamental questions are raised: what (minimal) information is needed from imaging to enable verification, and what imaging technologies are appropriate? PNNL is currently developing a suite of image analysis algorithms to define and extract attributes from images for dismantlement and warhead verification and counting scenarios. In this talk, we discuss imaging requirements from the perspective of algorithms operating behind information barriers, and review imaging technologies and their potential advantages for verification. Companion talks will concentrate on the technical aspects of the algorithms.

  16. "Expert" Verification of Classroom-Based Indicators of Teaching and Learning Effectiveness for Professional Renewable Certification.

    ERIC Educational Resources Information Center

    Naik, Nitin S.; And Others

    Results are provided from a statewide content verification survey of "expert" educators designed to verify indicators in the 1989-90 System for Teaching and Learning Assessment and Review (STAR) as reasonable expectations for beginning and/or experienced teachers (BETs) in Louisiana and as providing professional endorsement at the…

  17. Optical/digital identification/verification system based on digital watermarking technology

    NASA Astrophysics Data System (ADS)

    Herrigel, Alexander; Voloshynovskiy, Sviatoslav V.; Hrytskiv, Zenon D.

    2000-06-01

    This paper presents a new approach for the secure integrity verification of driver licenses, passports, and other analogue identification documents. The system embeds (detects) the reference number of the identification document in (from) the photo of the document holder using DCT watermark technology. During verification the reference number is extracted and compared with the reference number printed in the identification document. The approach combines optical and digital image processing techniques. The detection system must be able to scan an analogue driver license or passport, convert the image of this document into a digital representation, and then apply the watermark verification algorithm to check the payload of the embedded watermark. If the payload of the watermark is identical to the printed visual reference number of the issuer, the verification is successful and the passport or driver license has not been modified. This approach constitutes a new class of application for watermark technology, which was originally targeted at the copyright protection of digital multimedia data. The presented approach substantially increases the security of the analogue identification documents used in many European countries.
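
    A highly simplified sketch of the embed/extract idea follows; the coefficient positions, embedding strength, and bit payload are illustrative assumptions and this is not the product's actual DCT watermarking algorithm.

```python
# Highly simplified illustration of the idea (not the described system's
# algorithm): embed the document's reference number as bits in mid-frequency
# DCT coefficients of the holder photo, then extract and compare at verification.
import numpy as np
from scipy.fft import dctn, idctn

STRENGTH = 8.0
# hypothetical mid-frequency coefficient positions used as embedding sites
SITES = [(8, 9), (9, 8), (10, 11), (11, 10), (12, 13), (13, 12), (14, 15), (15, 14)]

def embed(photo, bits):
    C = dctn(photo, norm="ortho")
    for (u, v), b in zip(SITES, bits):
        C[u, v] = STRENGTH if b else -STRENGTH
    return idctn(C, norm="ortho")

def extract(photo):
    C = dctn(photo, norm="ortho")
    return [1 if C[u, v] > 0 else 0 for (u, v) in SITES]

rng = np.random.default_rng(5)
photo = rng.random((64, 64)) * 255
reference_bits = [1, 0, 1, 1, 0, 0, 1, 0]       # encodes a hypothetical reference number

scanned = embed(photo, reference_bits)          # stands in for the scanned document photo
print("verified:", extract(scanned) == reference_bits)
```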

  18. Analysis-Based Verification: A Programmer-Oriented Approach to the Assurance of Mechanical Program Properties

    DTIC Science & Technology

    2010-05-27

    …the key idea is that there is a collection of theory and engineering that makes the verification of highly focused, programmer-expressed design intent (e.g., …) … to manually check…

  19. Structural and Network-based Methods for Knowledge-Based Systems

    DTIC Science & Technology

    2011-12-01

    Structural and Network-based Methods for Knowledge-based Systems: a dissertation submitted to the Graduate School of Northwestern University. … In recent years, there has been…

  20. Design of a knowledge-based report generator

    SciTech Connect

    Kukich, K.

    1983-01-01

    Knowledge-based report generation is a technique for automatically generating natural language reports from computer databases. It is so named because it applies knowledge-based expert systems software to the problem of text generation. The first application of the technique, a system for generating natural language stock reports from a daily stock quotes database, is partially implemented. Three fundamental principles of the technique are its use of domain-specific semantic and linguistic knowledge, its use of macro-level semantic and linguistic constructs (such as whole messages, a phrasal lexicon, and a sentence-combining grammar), and its production system approach to knowledge representation. 14 references.

  1. Design and Mechanical Evaluation of a Capacitive Sensor-Based Indexed Platform for Verification of Portable Coordinate Measuring Instruments

    PubMed Central

    Avila, Agustín Brau; Mazo, Jorge Santolaria; Martín, Juan José Aguilar

    2014-01-01

    During the last years, the use of Portable Coordinate Measuring Machines (PCMMs) in industry has increased considerably, mostly due to their flexibility for accomplishing in-line measuring tasks as well as their reduced costs and operational advantages as compared to traditional coordinate measuring machines (CMMs). However, their operation has a significant drawback derived from the techniques applied in the verification and optimization procedures of their kinematic parameters. These techniques are based on the capture of data with the measuring instrument from a calibrated gauge object, fixed successively in various positions so that most of the instrument measuring volume is covered, which results in time-consuming, tedious and expensive verification procedures. In this work the mechanical design of an indexed metrology platform (IMP) is presented. The aim of the IMP is to increase the final accuracy and to radically simplify the calibration, identification and verification of geometrical parameter procedures of PCMMs. The IMP allows us to fix the calibrated gauge object and move the measuring instrument in such a way that it is possible to cover most of the instrument working volume, reducing the time and operator fatigue to carry out these types of procedures. PMID:24451458

  2. Knowledge sources for evidence-based practice in rheumatology nursing.

    PubMed

    Neher, Margit; Ståhl, Christian; Ellström, Per-Erik; Nilsen, Per

    2015-12-01

    As rheumatology nursing develops and extends, knowledge about current use of knowledge in rheumatology nursing practice may guide discussions about future knowledge needs. To explore what perceptions rheumatology nurses have about their knowledge sources and about what knowledge they use in their practice, 12 nurses working in specialist rheumatology were interviewed using a semi-structured interview guide. The data were analyzed using conventional qualitative content analysis. The analysis yielded four types of knowledge sources in clinical practice: interaction with others in the workplace, contacts outside the workplace, written materials, and previous knowledge and experience. Colleagues, and physicians in particular, were important for informal learning in daily rheumatology practice. Evidence from the medical arena was accessed through medical specialists, while nursing research was used less. Facilitating informal learning and continuing formal education is proposed as a way toward a more evidence-based practice in extended roles. © The Author(s) 2014.

  3. Towards Modeling False Memory With Computational Knowledge Bases.

    PubMed

    Li, Justin; Kohanyi, Emma

    2017-01-01

    One challenge to creating realistic cognitive models of memory is the inability to account for the vast common-sense knowledge of human participants. Large computational knowledge bases such as WordNet and DBpedia may offer a solution to this problem but may pose other challenges. This paper explores some of these difficulties through a semantic network spreading activation model of the Deese-Roediger-McDermott false memory task. In three experiments, we show that these knowledge bases only capture a subset of human associations, while irrelevant information introduces noise and makes efficient modeling difficult. We conclude that the contents of these knowledge bases must be augmented and, more important, that the algorithms must be refined and optimized, before large knowledge bases can be widely used for cognitive modeling. Copyright © 2016 Cognitive Science Society, Inc.
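
    A small spreading-activation sketch over a hand-built word-association network is shown below; the associations are illustrative stand-ins for WordNet/DBpedia links, and the decay parameter is arbitrary.

```python
# A small spreading-activation sketch over a hand-built semantic network
# (the paper uses WordNet/DBpedia; the associations below are illustrative only).
import collections

edges = {
    "bed":    ["rest", "tired", "dream", "pillow"],
    "rest":   ["sleep"], "tired": ["sleep"], "dream": ["sleep"], "pillow": ["sleep"],
}

def spread(sources, steps=2, decay=0.5):
    activation = collections.defaultdict(float)
    for s in sources:
        activation[s] = 1.0
    for _ in range(steps):
        new = collections.defaultdict(float, activation)
        for node, act in activation.items():
            for nbr in edges.get(node, []):
                new[nbr] += decay * act      # pass a decayed share of activation
        activation = new
    return dict(activation)

act = spread(["bed", "rest", "tired", "dream"])
# "sleep" (the critical lure in a DRM list) accumulates activation without being studied
print(sorted(act.items(), key=lambda kv: -kv[1])[:3])
```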

  4. Integration of textual guideline documents with formal guideline knowledge bases.

    PubMed

    Shankar, R D; Tu, S W; Martins, S B; Fagan, L M; Goldstein, M K; Musen, M A

    2001-01-01

    Numerous approaches have been proposed to integrate the text of guideline documents with guideline-based care systems. Current approaches range from serving marked up guideline text documents to generating advisories using complex guideline knowledge bases. These approaches have integration problems mainly because they tend to rigidly link the knowledge base with text. We are developing a bridge approach that uses an information retrieval technology. The new approach facilitates a versatile decision-support system by using flexible links between the formal structures of the knowledge base and the natural language style of the guideline text.

  5. KAT: A Flexible XML-based Knowledge Authoring Environment

    PubMed Central

    Hulse, Nathan C.; Rocha, Roberto A.; Del Fiol, Guilherme; Bradshaw, Richard L.; Hanna, Timothy P.; Roemer, Lorrie K.

    2005-01-01

    As part of an enterprise effort to develop new clinical information systems at Intermountain Health Care, the authors have built a knowledge authoring tool that facilitates the development and refinement of medical knowledge content. At present, users of the application can compose order sets and an assortment of other structured clinical knowledge documents based on XML schemas. The flexible nature of the application allows the immediate authoring of new types of documents once an appropriate XML schema and accompanying Web form have been developed and stored in a shared repository. The need for a knowledge acquisition tool stems largely from the desire for medical practitioners to be able to write their own content for use within clinical applications. We hypothesize that medical knowledge content for clinical use can be successfully created and maintained through XML-based document frameworks containing structured and coded knowledge. PMID:15802477

  6. Effective domain-dependent reuse in medical knowledge bases.

    PubMed

    Dojat, M; Pachet, F

    1995-12-01

    Knowledge reuse is now a critical issue for most developers of medical knowledge-based systems. As a rule, reuse is addressed from an ambitious, knowledge-engineering perspective that focuses on reusable general-purpose knowledge modules, concepts, and methods. However, such a general goal fails to take into account the specific aspects of medical practice. From the point of view of the knowledge engineer, whose goal is to capture the specific features and intricacies of a given domain, this approach addresses the wrong level of generality. In this paper, we adopt a more pragmatic viewpoint, introducing the less ambitious goal of "domain-dependent limited reuse" and suggesting effective means of achieving it in practice. In a knowledge representation framework combining objects and production rules, we propose three mechanisms emerging from the combination of object-oriented programming and rule-based programming. We show that these mechanisms contribute to achieving limited reuse and to introducing useful limited variations in medical expertise.

  7. EHR based Genetic Testing Knowledge Base (iGTKB) Development

    PubMed Central

    2015-01-01

    Background: The gap between a large and growing number of genetic tests and a suboptimal clinical workflow for incorporating these tests into regular clinical practice poses barriers to effective reliance on advanced genetic technologies to improve the quality of healthcare. A promising solution to fill this gap is to develop an intelligent genetic test recommendation system that not only can provide a comprehensive view of genetic tests as education resources, but also can recommend the most appropriate genetic tests to patients based on clinical evidence. In this study, we developed an EHR based Genetic Testing Knowledge Base for Individualized Medicine (iGTKB). Methods: We extracted genetic testing information and patient medical records from EHR systems at Mayo Clinic. Clinical features were semi-automatically annotated from the clinical notes by applying a Natural Language Processing (NLP) tool, the MedTagger suite. To prioritize clinical features for each genetic test, we compared odds ratios across four population groups. Genetic tests, genetic disorders and clinical features with their odds ratios were used to establish iGTKB, which is to be integrated into the Genetic Testing Ontology (GTO). Results: Overall, five genetic tests were performed with a sample size greater than 100 in 2013 at Mayo Clinic. A total of 1,450 patients who were tested by one of the five genetic tests were selected. We assembled 243 clinical features from the Human Phenotype Ontology (HPO) for these five genetic tests. There are 60 clinical features with at least one mention in the clinical notes of patients taking the test. Twenty-eight clinical features with a high odds ratio (greater than 1) were selected as dominant features and deposited into iGTKB with their associated information about genetic tests and genetic disorders. Conclusions: In this study, we developed an EHR based genetic testing knowledge base, iGTKB. iGTKB will be integrated into the GTO by providing relevant…

  8. AOP Knowledge Base/Wiki Tool Set

    EPA Science Inventory

    Utilizing ToxCast Data and Lifestage Physiologically-Based Pharmacokinetic (PBPK) models to Drive Adverse Outcome Pathways (AOPs)-Based Margin of Exposures (ABME) to Chemicals. Hisham A. El-Masri1, Nicole C. Klienstreur2, Linda Adams1, Tamara Tal1, Stephanie Padilla1, Kristin Is...

  10. Design of an opt-electronic knowledge-based system

    NASA Astrophysics Data System (ADS)

    Shen, Xuan-Jing; Qian, Qing-Ji; Liu, Ping-Ping

    2006-01-01

    In this paper, based on an analysis of the features of knowledge-based systems and optical computing, a scheme for an opto-electronic hybrid knowledge-based system (OEHKBS) model and its supporting hardware is presented. The OEHKBS adopts a matrix-based knowledge representation and inference and learning algorithms suitable for optical parallel processing. Finally, the paper analyses the performance of the OEHKBS, which reduces the time complexity of solving the maze problem to O(n).

  11. The process for integrating the NNSA knowledge base.

    SciTech Connect

    Wilkening, Lisa K.; Carr, Dorthe Bame; Young, Christopher John; Hampton, Jeff; Martinez, Elaine

    2009-03-01

    From 2002 through 2006, the Ground Based Nuclear Explosion Monitoring Research & Engineering (GNEMRE) program at Sandia National Laboratories defined and modified a process for merging different types of integrated research products (IRPs) from various researchers into a cohesive, well-organized collection known as the NNSA Knowledge Base, to support operational treaty monitoring. This process includes defining the KB structure, systematically and logically aggregating IRPs into a complete set, and verifying and validating that the integrated Knowledge Base works as expected.

  12. Apprenticeship Learning Techniques for Knowledge Based Systems

    DTIC Science & Technology

    1988-12-01

    …domain, such as medicine. The Odysseus explanation-based learning program constructs explanations of problem-solving actions in the domain of medical… theories and empirical methods so as to allow construction of an explanation. The Odysseus learning program provides the first demonstration of using the… The Odysseus explanation-based learning program is presented, which constructs explanations of human problem-solving actions in the domain of medical diagnosis…

  13. A specialized framework for medical diagnostic knowledge-based systems.

    PubMed

    Lanzola, G; Stefanelli, M

    1992-08-01

    For a knowledge-based system (KBS) to exhibit intelligent behavior, it must be endowed with knowledge enabling it to represent the expert's strategies. The elicitation task is inherently difficult for strategic knowledge, because strategy is often tacit, and, even when it has been made explicit, it is not an easy task to describe it in a form which may be directly translated and implemented into a program. This paper describes a Specialized Framework for Medical Diagnostic Knowledge-Based Systems that can help an expert in the process of building KBSs in a medical domain. The framework is based on an epistemological model of diagnostic reasoning which has proven to be helpful in describing the diagnostic process in terms of the tasks of which it is composed. It allows a straightforward modeling of diagnostic reasoning at the knowledge level by the domain expert, thus helping to convey domain-dependent strategies into the target KBS.

  14. A specialized framework for Medical Diagnostic Knowledge Based Systems.

    PubMed

    Lanzola, G; Stefanelli, M

    1991-01-01

    For a knowledge-based system (KBS) to exhibit intelligent behavior, it must be endowed with knowledge able to represent the expert's strategies, in addition to domain knowledge. The elicitation task is inherently difficult for strategic knowledge, because strategy is often tacit, and, even when it has been made explicit, it is not an easy task to describe it in a form that may be directly translated and implemented into a program. This paper describes a Specialized Framework for Medical Diagnostic Knowledge Based Systems able to help an expert in the process of building KBSs in a medical domain. The framework is based on an epistemological model of diagnostic reasoning which has proved to be helpful in describing the diagnostic process in terms of the tasks of which it is composed.

  15. Using Knowledge-Based Systems to Support Learning of Organizational Knowledge: A Case Study

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.; Nash, Rebecca L.; Phan, Tu-Anh T.; Bailey, Teresa R.

    2003-01-01

    This paper describes the deployment of a knowledge system to support learning of organizational knowledge at the Jet Propulsion Laboratory (JPL), a US national research laboratory whose mission is planetary exploration and to 'do what no one has done before.' Data collected over 19 weeks of operation were used to assess system performance with respect to design considerations, participation, effectiveness of communication mechanisms, and individual-based learning. These results are discussed in the context of organizational learning research and implications for practice.

  17. New superfamily members identified for Schiff-base enzymes based on verification of catalytically essential residues.

    PubMed

    Choi, Kyung H; Lai, Vicky; Foster, Christine E; Morris, Aaron J; Tolan, Dean R; Allen, Karen N

    2006-07-18

    Enzymes that utilize a Schiff-base intermediate formed with their substrates and that share the same alpha/beta barrel fold comprise a mechanistically diverse superfamily defined in the SCOP database as the class I aldolase family. The family includes the "classical" aldolases fructose-1,6-(bis)phosphate (FBP) aldolase, transaldolase, and 2-keto-3-deoxy-6-phosphogluconate aldolase. Moreover, the N-acetylneuraminate lyase family has been included in the class I aldolase family on the basis of similar Schiff-base chemistry and fold. Herein, we generate primary sequence identities based on structural alignment that support the homology and reveal additional mechanistic similarities beyond the common use of a lysine for Schiff-base formation. The structural and mechanistic correspondence comprises the use of a catalytic dyad, wherein a general acid/base residue (Glu, Tyr, or His) involved in Schiff-base chemistry is stationed on beta-strand 5 of the alpha/beta barrel. The role of the acid/base residue was probed by site-directed mutagenesis and steady-state and pre-steady-state kinetics on a representative member of this family, FBP aldolase. The kinetic results are consistent with the participation of this conserved residue or position in the protonation of the carbinolamine intermediate and dehydration of the Schiff base in FBP aldolase and, by analogy, the class I aldolase family.

  18. A knowledge-based decision support system for payload scheduling

    NASA Technical Reports Server (NTRS)

    Tyagi, Rajesh; Tseng, Fan T.

    1988-01-01

    This paper presents the development of a prototype Knowledge-based Decision Support System, currently under development, for scheduling payloads/experiments on space station missions. The DSS is being built on Symbolics, a Lisp machine, using KEE, a commercial knowledge engineering tool.

  19. Grey Documentation as a Knowledge Base in Social Work.

    ERIC Educational Resources Information Center

    Berman, Yitzhak

    1994-01-01

    Defines grey documentation as documents issued informally and not available through normal channels and discusses the role that grey documentation can play in the social work knowledge base. Topics addressed include grey documentation and science; social work and the empirical approach in knowledge development; and dissemination of grey…

  20. Learning Science-Based Fitness Knowledge in Constructivist Physical Education

    ERIC Educational Resources Information Center

    Sun, Haichun; Chen, Ang; Zhu, Xihe; Ennis, Catherine D.

    2012-01-01

    Teaching fitness-related knowledge has become critical in developing children's healthful living behavior. The purpose of this study was to examine the effects of a science-based, constructivist physical education curriculum on learning fitness knowledge critical to healthful living in elementary school students. The schools (N = 30) were randomly…

  1. Developing Learning Progression-Based Teacher Knowledge Measures

    ERIC Educational Resources Information Center

    Jin, Hui; Shin, HyoJeong; Johnson, Michele E.; Kim, JinHo; Anderson, Charles W.

    2015-01-01

    This study developed learning progression-based measures of science teachers' content knowledge (CK) and pedagogical content knowledge (PCK). The measures focus on an important topic in secondary science curriculum using scientific reasoning (i.e., tracing matter, tracing energy, and connecting scales) to explain plants gaining weight and…

  2. Knowledge-Based Hierarchies: Using Organizations to Understand the Economy

    ERIC Educational Resources Information Center

    Garicano, Luis; Rossi-Hansberg, Esteban

    2015-01-01

    Incorporating the decision of how to organize the acquisition, use, and communication of knowledge into economic models is essential to understand a wide variety of economic phenomena. We survey the literature that has used knowledge-based hierarchies to study issues such as the evolution of wage inequality, the growth and productivity of firms,…

  6. Conventional and Knowledge-Based Information Retrieval with Prolog.

    ERIC Educational Resources Information Center

    Leigh, William; Paz, Noemi

    1988-01-01

    Describes the use of PROLOG to program knowledge-based information retrieval systems, in which the knowledge contained in a document is translated into machine processable logic. Several examples of the resulting search process, and the program rules supporting the process, are given. (10 references) (CLB)

  7. Principals for Our Changing Schools: The Knowledge and Skill Base.

    ERIC Educational Resources Information Center

    Thomson, Scott, D., Ed.; And Others

    This publication describes a new knowledge and skill base for use in a principal preparation program. These essential skills and knowledge encompass 21 "domains," which were defined in "Principals for Our Changing Schools: Preparation and Certification" (1990). The 21 domains, organized under 4 broad themes, blend the…

  8. Knowledge-based system V and V in the Space Station Freedom program

    NASA Technical Reports Server (NTRS)

    Kelley, Keith; Hamilton, David; Culbert, Chris

    1992-01-01

    Knowledge Based Systems (KBS's) are expected to be heavily used in the Space Station Freedom Program (SSFP). Although SSFP Verification and Validation (V&V) requirements are based on the latest state-of-the-practice in software engineering technology, they may be insufficient for KBS's; it is widely stated that there are differences in both approach and execution between KBS V&V and conventional software V&V. In order to better understand this issue, we surveyed and/or interviewed developers from sixty expert system projects to understand the differences and difficulties in KBS V&V. We have used these survey results to analyze the SSFP V&V requirements for conventional software in order to determine which specific requirements are inappropriate for KBS V&V and why they are inappropriate. Further work will result in a set of recommendations that can be used either as guidelines for applying conventional software V&V requirements to KBS's or as modifications to extend the existing SSFP conventional software V&V requirements to include KBS requirements. The results of this work are significant to many projects, in addition to SSFP, which will involve KBS's.

  9. Knowledge modeling of coal mining equipments based on ontology

    NASA Astrophysics Data System (ADS)

    Zhang, Baolong; Wang, Xiangqian; Li, Huizong; Jiang, Miaomiao

    2017-06-01

    The problems of information redundancy and sharing are universal in coal mining equipment management. In order to improve the efficiency with which knowledge about coal mining equipment is used, this paper proposes a new method of knowledge modeling based on ontology. On the basis of analyzing the structures and internal relations of coal mining equipment knowledge, and taking OWL as the ontology construction language, the ontology model of coal mining equipment knowledge is built with the help of the Protégé 4.3 software tools. This knowledge description method lays the foundation for highly effective knowledge management and sharing, which is significant for improving the production management level of coal mining enterprises.
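
    A minimal sketch of the same modelling idea, expressed in Python with rdflib rather than in Protégé, is shown below; the class, property, and individual names are illustrative assumptions, not the authors' schema.

```python
# A minimal sketch of an OWL-style equipment ontology built with rdflib
# (the paper itself builds the ontology in OWL with Protege 4.3); all
# class, property, and individual names below are illustrative only.
from rdflib import Graph, Namespace, Literal, RDF, RDFS
from rdflib.namespace import OWL

CME = Namespace("http://example.org/coal-mining-equipment#")
g = Graph()
g.bind("cme", CME)

g.add((CME.Equipment, RDF.type, OWL.Class))
g.add((CME.Shearer, RDF.type, OWL.Class))
g.add((CME.Shearer, RDFS.subClassOf, CME.Equipment))
g.add((CME.hasRatedPower, RDF.type, OWL.DatatypeProperty))

g.add((CME.MG300, RDF.type, CME.Shearer))                 # an individual piece of equipment
g.add((CME.MG300, CME.hasRatedPower, Literal(300)))       # kW, illustrative value

print(g.serialize(format="turtle"))
```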

  10. A relational data-knowledge base system and its potential in developing a distributed data-knowledge system

    NASA Technical Reports Server (NTRS)

    Rahimian, Eric N.; Graves, Sara J.

    1988-01-01

    A new approach used in constructing a relational data-knowledge base system is described. The relational database is well suited for distribution due to its property of allowing data fragmentation and fragmentation transparency. An example is formulated of a simple relational data-knowledge base which may be generalized for use in developing a relational distributed data-knowledge base system. The efficiency and ease of application of such a data-knowledge base management system is briefly discussed. Also discussed are the potentials of the developed model for sharing the data-knowledge base as well as the possible areas of difficulty in implementing the relational data-knowledge base management system.

  12. Model based verification of the Secure Socket Layer (SSL) Protocol for NASA systems

    NASA Technical Reports Server (NTRS)

    Powell, John D.; Gilliam, David

    2004-01-01

    The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers formal verification of information technology (IT), through the creation of a Software Security Assessment Instrument (SSAI), to address software security risks.

  13. Knowledge 'Translation' as social learning: negotiating the uptake of research-based knowledge in practice.

    PubMed

    Salter, K L; Kothari, A

    2016-02-29

    Knowledge translation and evidence-based practice have relied on research derived from clinical trials, which are considered to be methodologically rigorous. The result is practice recommendations based on a narrow view of evidence. We discuss how, within a practice environment, in fact individuals adopt and apply new evidence derived from multiple sources through ongoing, iterative learning cycles. The discussion is presented in four sections. After elaborating on the multiple forms of evidence used in practice, in section 2 we argue that the practitioner derives contextualized knowledge through reflective practice. Then, in section 3, the focus shifts from the individual to the team with consideration of social learning and theories of practice. In section 4 we discuss the implications of integrative and negotiated knowledge exchange and generation within the practice environment. Namely, how can we promote the use of research within a team-based, contextualized knowledge environment? We suggest support for: 1) collaborative learning environments for active learning and reflection, 2) engaged scholarship approaches so that practice can inform research in a collaborative manner and 3) leveraging authoritative opinion leaders for their clinical expertise during the shared negotiation of knowledge and research. Our approach also points to implications for studying evidence-informed practice: the identification of practice change (as an outcome) ought to be supplemented with understandings of how and when social negotiation processes occur to achieve integrated knowledge. This article discusses practice knowledge as dependent on the practice context and on social learning processes, and suggests how research knowledge uptake might be supported from this vantage point.

  14. Knowledge-based control and case-based diagnosis based upon empirical knowledge and fuzzy logic for the SBR plant.

    PubMed

    Bae, H; Seo, H Y; Kim, S; Kim, Y

    2006-01-01

    Because biological wastewater treatment plants (WWTPs) involve long time delays and various disturbances, skilled operators generally control the plant manually based on empirical knowledge, and they usually diagnose the plant using similar cases experienced in the past. For effective management of the plant, system automation has to be accomplished based upon operating recipes. This paper introduces automatic control and diagnosis based upon the operator's knowledge. Fuzzy logic was employed to design this knowledge-based controller because fuzzy logic can convert linguistic information into rules. The controller can manage the influent and external carbon while taking the loading rate into account. The input of the controller is not the loading rate but the dissolved oxygen (DO) lag-time, which is strongly related to the loading rate. This approach can replace an expensive sensor, which measures the loading rate and ammonia concentration in the reactor, with a cheaper DO sensor. The proposed controller can assure optimal operation and prevent the over-feeding problem. Case-based diagnosis was achieved by analyzing profile patterns collected in the past. A new test profile was diagnosed by comparing it with template patterns containing normal and abnormal cases. The proposed control and diagnostic system will guarantee the effective and stable operation of WWTPs.
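
    The knowledge-based control idea can be sketched as below; the membership functions, rule outputs, and units are invented for illustration and do not reproduce the plant's calibrated rule base.

```python
# A hedged sketch of the fuzzy, knowledge-based control idea (membership
# functions and rules are illustrative, not the plant's calibrated rule base):
# the DO lag-time stands in for the loading rate and drives the carbon dose.
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def carbon_dose(do_lag_min):
    # fuzzify the DO lag-time (minutes)
    short = tri(do_lag_min, 0, 10, 25)     # short lag ~ high loading rate
    medium = tri(do_lag_min, 15, 30, 45)
    long_ = tri(do_lag_min, 35, 60, 90)    # long lag ~ low loading rate
    # rule outputs with weighted-average defuzzification; doses are illustrative
    rules = [(short, 120.0), (medium, 60.0), (long_, 10.0)]
    total = sum(w for w, _ in rules)
    return sum(w * d for w, d in rules) / total if total else 0.0

for lag in (8, 30, 70):
    print(f"DO lag {lag} min -> external carbon {carbon_dose(lag):.0f} (arbitrary units)")
```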

  15. Comparing contents of a knowledge base to traditional information sources.

    PubMed Central

    Giuse, N. B.; Giuse, D. A.; Bankowitz, R. A.; Miller, R. A.

    1993-01-01

    Physicians rely on the medical literature as a major source of medical knowledge and data. The medical literature, however, is continually evolving and represents different sources at different levels of coverage and detail. The recent development of computerized medical knowledge bases has added a new form of information that can potentially be used to address the practicing physician's information needs. To understand how the information from various sources differs, we compared the description of a disease found in the QMR knowledge base to those found in two general internal medicine textbooks and two specialized nephrology textbooks. The study shows both differences in coverage and differences in the level of detail. Textbooks contain information about pathophysiology and therapy that is not present in the diagnostic knowledge base. The knowledge base contains a more detailed description of the associated findings, more quantitative information, and a greater number of references to peer-reviewed medical articles. The study demonstrates that computerized knowledge bases, if properly constructed, may be able to provide clinicians with a useful new source of medical knowledge that is complementary to existing sources. PMID:8130550

  16. Film based verification of calculation algorithms used for brachytherapy planning-getting ready for upcoming challenges of MBDCA

    PubMed Central

    Bielęda, Grzegorz; Skowronek, Janusz; Mazur, Magdalena

    2016-01-01

    Purpose: A well-known defect of TG-43 based algorithms used in brachytherapy is the lack of information about interaction cross-sections, which are determined not only by electron density but also by atomic number. The TG-186 recommendations, with the use of model-based dose calculation algorithms (MBDCA), accurate tissue segmentation, and the structures' elemental composition, continue to create difficulties in brachytherapy dosimetry. For the clinical use of new algorithms, it is necessary to introduce reliable and repeatable methods of treatment planning system (TPS) verification. The aim of this study is the verification of the calculation algorithm used in the TPS for shielded vaginal applicators, as well as the development of verification procedures for current and further use, based on the film dosimetry method. Material and methods: Calibration data were collected by separately irradiating 14 sheets of Gafchromic® EBT film with doses from 0.25 Gy to 8.0 Gy using an HDR 192Ir source. Standard vaginal cylinders of three diameters were used in the water phantom. Measurements were performed without any shields and with three shield combinations. Gamma analyses were performed using the VeriSoft® package. Results: The calibration curve was determined to be a third-degree polynomial. For all cylinder diameters used, both unshielded and with all shield combinations, gamma analysis showed that over 90% of the analyzed points meet the gamma criteria (3%, 3 mm). Conclusions: Gamma analysis showed good agreement between dose distributions calculated using the TPS and measured with Gafchromic films, demonstrating the viability of film dosimetry in brachytherapy. PMID:27648087
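
    The calibration step can be sketched as follows; the film response values are invented, not the measured EBT data.

```python
# A minimal sketch of the calibration step (numbers are illustrative, not the
# measured EBT data): fit a third-degree polynomial mapping net film response
# to dose, then convert a measured response to dose.
import numpy as np

dose = np.array([0.25, 0.5, 1.0, 2.0, 4.0, 6.0, 8.0])          # Gy
net_od = np.array([0.03, 0.06, 0.11, 0.20, 0.34, 0.44, 0.52])  # hypothetical net optical density

coeffs = np.polyfit(net_od, dose, deg=3)         # dose as a 3rd-degree polynomial of response
calib = np.poly1d(coeffs)

measured_response = 0.27
print(f"estimated dose: {calib(measured_response):.2f} Gy")
```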

  17. Knowledge-Based System Analysis and Control

    DTIC Science & Technology

    1989-09-30

    …for use as a training tool (given considerable enlargement of its present circuit data base and problem repertoire), because it can provide step-by… from the slow and expensive process of training personnel in complex professional specialties. Tech Control began to emerge as a skill area ripe for… for any purpose but offline training. In late FY87 and early FY88, planning was therefore begun for a new expert system which would have no air gap…

  18. Enhancing acronym/abbreviation knowledge bases with semantic information.

    PubMed

    Torii, Manabu; Liu, Hongfang

    2007-10-11

    In the biomedical domain, a terminology knowledge base that associates acronyms/abbreviations (denoted as SFs) with their definitions (denoted as LFs) is highly needed. For the construction of such a terminology knowledge base, we investigate the feasibility of building a system that automatically assigns semantic categories to LFs extracted from text. Given a collection of pairs (SF, LF) derived from text, we i) assess the coverage of LFs and (SF, LF) pairs in the UMLS and justify the need for a semantic category assignment system; and ii) automatically derive name phrases annotated with semantic categories and construct a system using machine learning. Utilizing ADAM, an existing collection of (SF, LF) pairs extracted from MEDLINE, our system achieved an f-measure of 87% when assigning eight UMLS-based semantic groups to LFs. The system has been incorporated into a web interface which integrates SF knowledge from multiple SF knowledge bases. Web site: http://gauss.dbb.georgetown.edu/liblab/SFThesurus.
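
    A hedged sketch of the category-assignment idea is given below, using a bag-of-words classifier on toy long forms with toy group labels rather than the ADAM/UMLS training data.

```python
# Hedged sketch of the category-assignment idea (toy long forms and labels,
# not the ADAM/UMLS training data): a bag-of-words classifier that assigns a
# UMLS-style semantic group to an acronym's long form.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

long_forms = [
    "magnetic resonance imaging", "computed tomography",
    "tumor necrosis factor", "epidermal growth factor receptor",
    "congestive heart failure", "chronic obstructive pulmonary disease",
]
groups = ["Procedures", "Procedures", "Chemicals & Drugs",
          "Chemicals & Drugs", "Disorders", "Disorders"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
model.fit(long_forms, groups)
print(model.predict(["positron emission tomography", "acute renal failure"]))
```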

  19. The Integration Process for Incorporating Nuclear Explosion Monitoring Research Results into the National Nuclear Security Administration Knowledge Base

    SciTech Connect

    GALLEGOS, DAVID P.; CARR, DORTHE B.; HERRINGTON, PRESTON B.; HARRIS, JAMES M.; EDWARDS, C.L.; TAYLOR, STEVEN R.; WOGMAN, NED A.; ANDERSON, DALE N.; CASEY, LESLIE A.

    2002-09-01

    The process of developing the National Nuclear Security Administration (NNSA) Knowledge Base (KB) must result in high-quality Information Products in order to support activities for monitoring nuclear explosions consistent with United States treaty and testing moratoria monitoring missions. The validation, verification, and management of the Information Products are critical to successful scientific integration and hence will enable high-quality deliveries to be made to the United States National Data Center (USNDC) at the Air Force Technical Applications Center (AFTAC). As an Information Product passes through the steps necessary to become part of a delivery to AFTAC, domain experts (including technical KB Working Groups that comprise NNSA and DOE laboratory staff and the customer) will provide coordination and validation, where validation is the determination of relevance and scientific quality. Verification is the check for completeness and correctness, and will be performed by both the Knowledge Base Integrator and the Scientific Integrator, with support from the Contributor, providing two levels of testing to assure content integrity and performance. The Information Products and their contained data sets will be systematically tracked through the integration portion of their life cycle. The integration process, based on lessons learned during its initial implementations, is presented in this report.

  20. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the openSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.

  1. Knowledge based systems: From process control to policy analysis

    SciTech Connect

    Marinuzzi, J.G.

    1993-06-01

    Los Alamos has been pursuing the use of Knowledge Based Systems for many years. These systems are currently being used to support projects that range across many production and operations areas. By investing time and money in people and equipment, Los Alamos has developed one of the strongest knowledge based systems capabilities within the DOE. Staff of Los Alamos' Mechanical & Electronic Engineering Division are using these knowledge systems to increase capability, productivity and competitiveness in areas of manufacturing quality control, robotics, process control, plant design and management decision support. This paper describes some of these projects and associated technical program approaches, accomplishments, benefits and future goals.

  2. Knowledge based systems: From process control to policy analysis

    SciTech Connect

    Marinuzzi, J.G.

    1993-01-01

    Los Alamos has been pursuing the use of Knowledge Based Systems for many years. These systems are currently being used to support projects that range across many production and operations areas. By investing time and money in people and equipment, Los Alamos has developed one of the strongest knowledge based systems capabilities within the DOE. Staff of Los Alamos' Mechanical & Electronic Engineering Division are using these knowledge systems to increase capability, productivity and competitiveness in areas of manufacturing quality control, robotics, process control, plant design and management decision support. This paper describes some of these projects and associated technical program approaches, accomplishments, benefits and future goals.

  3. Online Knowledge-Based Model for Big Data Topic Extraction

    PubMed Central

    Khan, Muhammad Taimoor; Durrani, Mehr; Khalid, Shehzad; Aziz, Furqan

    2016-01-01

    Lifelong machine learning (LML) models learn with experience, maintaining a knowledge base without user intervention. Unlike traditional single-domain models, they can easily scale up to explore big data. The existing LML models have high data dependency, consume more resources, and do not support streaming data. This paper proposes an online LML model (OAMC) to support streaming data with reduced data dependency. By engineering the knowledge base and introducing new knowledge features, the learning pattern of the model is improved for data arriving in pieces. OAMC improves accuracy, measured as topic coherence, by 7% for streaming data while halving the processing cost. PMID:27195004

  4. From Data to Knowledge: GEOSS experience and the GEOSS Knowledge Base contribution to the GCI

    NASA Astrophysics Data System (ADS)

    Santoro, M.; Nativi, S.; Mazzetti, P., Sr.; Plag, H. P.

    2016-12-01

    According to systems theory, data is raw: it simply exists and has no significance beyond its existence, while information is data that has been given meaning by way of relational connection. The appropriate collection of information, such that it contributes to understanding, is a process of knowledge creation. The Global Earth Observation System of Systems (GEOSS) developed by the Group on Earth Observations (GEO) is a set of coordinated, independent Earth observation, information and processing systems that interact and provide access to diverse information for a broad range of users in both public and private sectors. GEOSS links these systems to strengthen the monitoring of the state of the Earth. In the past ten years, the development of GEOSS has taught several lessons dealing with the need to move from (open) data to information and knowledge sharing. Advanced user-focused services require moving from a data-driven framework to a knowledge sharing platform. Such a platform needs to manage information and knowledge, in addition to the datasets linked to them. For this purpose, GEO has launched a specific task called "GEOSS Knowledge Base", which deals with resources such as user requirements, Sustainable Development Goals (SDGs), observation and processing ontologies, publications, guidelines, best practices, business processes/algorithms, and definitions of advanced concepts such as Essential Variables (EVs), indicators, and strategic goals. In turn, information and knowledge (e.g. guidelines, best practices, user requirements, business processes, algorithms, etc.) can be used to generate additional information and knowledge from shared datasets. To fully utilize and leverage the GEOSS Knowledge Base, the current GEOSS Common Infrastructure (GCI) model will be extended and advanced to consider important concepts and implementation artifacts, such as data processing services and environmental/economic models as well as EVs, Primary Indicators, and SDGs. The new GCI model

  5. 4D ML reconstruction as a tool for volumetric PET-based treatment verification in ion beam radiotherapy

    SciTech Connect

    De Bernardi, E.; Ricotti, R.; Riboldi, M.; Baroni, G.; Parodi, K.; Gianoli, C.

    2016-02-15

    Purpose: An innovative strategy to improve the sensitivity of positron emission tomography (PET)-based treatment verification in ion beam radiotherapy is proposed. Methods: Low counting statistics PET images acquired during or shortly after the treatment (Measured PET) and a Monte Carlo estimate of the same PET images derived from the treatment plan (Expected PET) are considered as two frames of a 4D dataset. A 4D maximum likelihood reconstruction strategy was adapted to iteratively estimate the annihilation events distribution in a reference frame and the deformation motion fields that map it in the Expected PET and Measured PET frames. The outputs generated by the proposed strategy are as follows: (1) an estimate of the Measured PET with an image quality comparable to the Expected PET and (2) an estimate of the motion field mapping Expected PET to Measured PET. The details of the algorithm are presented and the strategy is preliminarily tested on analytically simulated datasets. Results: The algorithm demonstrates (1) robustness against noise, even in the worst conditions where 1.5 × 10⁴ true coincidences and a random fraction of 73% are simulated; (2) proper sensitivity to different kinds and grades of mismatches ranging between 1 and 10 mm; (3) robustness against bias due to incorrect washout modeling in the Monte Carlo simulation up to 1/3 of the original signal amplitude; and (4) an ability to describe the mismatch even in the presence of complex annihilation distributions such as those induced by two perpendicular superimposed ion fields. Conclusions: The promising results obtained in this work suggest the applicability of the method as a quantification tool for PET-based treatment verification in ion beam radiotherapy. An extensive assessment of the proposed strategy on real treatment verification data is planned.
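
    The 4D strategy above extends maximum likelihood (ML) reconstruction with deformation fields linking the Expected PET and Measured PET frames. The authors' 4D algorithm is not reproduced here; the sketch below shows only the standard ML-EM update that such methods build on, applied to a toy system matrix with entirely hypothetical numbers.

      # Toy MLEM reconstruction sketch (standard update only; not the 4D algorithm itself).
      import numpy as np

      rng = np.random.default_rng(0)
      n_pix, n_det = 8, 12
      A = rng.random((n_det, n_pix))          # hypothetical system matrix (detector x pixel)
      true_activity = rng.random(n_pix) * 10.0
      y = rng.poisson(A @ true_activity)      # simulated noisy measurements

      lam = np.ones(n_pix)                    # initial activity estimate
      sens = A.T @ np.ones(n_det)             # sensitivity image
      for _ in range(50):
          proj = A @ lam
          ratio = np.divide(y, proj, out=np.zeros_like(proj, dtype=float), where=proj > 0)
          lam = lam / sens * (A.T @ ratio)    # multiplicative ML-EM update

      print(np.round(lam, 2))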

  6. Big data analytics in immunology: a knowledge-based approach.

    PubMed

    Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into an analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we enabled streamlining of normally time-consuming processes of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow.

  7. Automated extraction of knowledge for model-based diagnostics

    NASA Technical Reports Server (NTRS)

    Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.

    1990-01-01

    The concept of accessing computer aided design (CAD) design databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques as well as an internal database of component descriptions to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data from the drawn system. The AKG system is capable of defining knowledge bases in formats required by various model-based reasoning tools.

  8. The latent structure of secure base script knowledge.

    PubMed

    Waters, Theodore E A; Fraley, R Chris; Groh, Ashley M; Steele, Ryan D; Vaughn, Brian E; Bost, Kelly K; Veríssimo, Manuela; Coppola, Gabrielle; Roisman, Glenn I

    2015-06-01

    There is increasing evidence that attachment representations abstracted from childhood experiences with primary caregivers are organized as a cognitive script describing secure base use and support (i.e., the secure base script). To date, however, the latent structure of secure base script knowledge has gone unexamined, despite the fact that such basic information about the factor structure and distributional properties of these individual differences has important conceptual implications for our understanding of how representations of early experience are organized and generalized, as well as methodological significance in relation to maximizing statistical power and precision. In this study, we report factor and taxometric analyses that examined the latent structure of secure base script knowledge in 2 large samples. Results suggested that variation in secure base script knowledge, as measured by both the adolescent (N = 674) and adult (N = 714) versions of the Attachment Script Assessment, is generalized across relationships and continuously distributed.

  9. 3D VMAT Verification Based on Monte Carlo Log File Simulation with Experimental Feedback from Film Dosimetry.

    PubMed

    Barbeiro, A R; Ureba, A; Baeza, J A; Linares, R; Perucha, M; Jiménez-Ortega, E; Velázquez, S; Mateos, J C; Leal, A

    2016-01-01

    A model based on a specific phantom, called QuAArC, has been designed for the evaluation of planning and verification systems of complex radiotherapy treatments, such as volumetric modulated arc therapy (VMAT). This model uses the high accuracy provided by the Monte Carlo (MC) simulation of log files and allows experimental feedback from the high spatial resolution of films hosted in QuAArC. This cylindrical phantom was specifically designed to host films rolled at different radial distances, able to take into account the entrance fluence and the 3D dose distribution. Ionization chamber measurements are also included in the feedback process for absolute dose considerations. In this way, automated MC simulation of treatment log files is implemented to calculate the actual delivery geometries, while the monitor units are experimentally adjusted to reconstruct the dose-volume histogram (DVH) on the patient CT. Prostate and head and neck clinical cases, previously planned with Monaco and Pinnacle treatment planning systems and verified with two different commercial systems (Delta4 and COMPASS), were selected in order to test the operational feasibility of the proposed model. The proper operation of the feedback procedure was proved by the high agreement achieved between reconstructed dose distributions and the film measurements (global gamma passing rates > 90% for the 2%/2 mm criteria). The necessary discretization level of the log file for dose calculation and the potential mismatching between calculated control points and the detection grid in the verification process were discussed. Besides the effect of the dose calculation accuracy of the analytic algorithm implemented in treatment planning systems for a dynamic technique, the importance of the detection density level and its location in the VMAT-specific phantom for obtaining a more reliable DVH in the patient CT was also discussed. The proposed model also showed enough robustness and efficiency to be considered as a pre
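
    The agreement figure quoted above is a global gamma passing rate with a 2%/2 mm criterion. As a rough illustration of how such a rate can be computed, the sketch below implements a brute-force one-dimensional global gamma index on hypothetical dose profiles; it is not the evaluation code used in the study.

      # Rough 1D global gamma index sketch (2%/2 mm), on hypothetical dose profiles.
      import numpy as np

      def gamma_passing_rate(x_ref, d_ref, x_eval, d_eval, dose_crit=0.02, dist_crit=2.0):
          """Global gamma: dose differences are normalized to the maximum reference dose."""
          d_norm = dose_crit * d_ref.max()
          passes = []
          for xr, dr in zip(x_ref, d_ref):
              dist2 = ((x_eval - xr) / dist_crit) ** 2
              dose2 = ((d_eval - dr) / d_norm) ** 2
              gamma = np.sqrt(np.min(dist2 + dose2))
              passes.append(gamma <= 1.0)
          return np.mean(passes)

      x = np.linspace(0.0, 100.0, 201)                              # positions in mm
      reference = 100.0 * np.exp(-((x - 50.0) / 20.0) ** 2)         # e.g. film measurement
      evaluated = 1.01 * 100.0 * np.exp(-((x - 50.5) / 20.0) ** 2)  # e.g. reconstructed dose

      print(f"passing rate: {100 * gamma_passing_rate(x, reference, x, evaluated):.1f}%")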

  10. 3D VMAT Verification Based on Monte Carlo Log File Simulation with Experimental Feedback from Film Dosimetry

    PubMed Central

    Barbeiro, A. R.; Ureba, A.; Baeza, J. A.; Linares, R.; Perucha, M.; Jiménez-Ortega, E.; Velázquez, S.; Mateos, J. C.

    2016-01-01

    A model based on a specific phantom, called QuAArC, has been designed for the evaluation of planning and verification systems of complex radiotherapy treatments, such as volumetric modulated arc therapy (VMAT). This model uses the high accuracy provided by the Monte Carlo (MC) simulation of log files and allows experimental feedback from the high spatial resolution of films hosted in QuAArC. This cylindrical phantom was specifically designed to host films rolled at different radial distances, able to take into account the entrance fluence and the 3D dose distribution. Ionization chamber measurements are also included in the feedback process for absolute dose considerations. In this way, automated MC simulation of treatment log files is implemented to calculate the actual delivery geometries, while the monitor units are experimentally adjusted to reconstruct the dose-volume histogram (DVH) on the patient CT. Prostate and head and neck clinical cases, previously planned with Monaco and Pinnacle treatment planning systems and verified with two different commercial systems (Delta4 and COMPASS), were selected in order to test the operational feasibility of the proposed model. The proper operation of the feedback procedure was proved by the high agreement achieved between reconstructed dose distributions and the film measurements (global gamma passing rates > 90% for the 2%/2 mm criteria). The necessary discretization level of the log file for dose calculation and the potential mismatching between calculated control points and the detection grid in the verification process were discussed. Besides the effect of the dose calculation accuracy of the analytic algorithm implemented in treatment planning systems for a dynamic technique, the importance of the detection density level and its location in the VMAT-specific phantom for obtaining a more reliable DVH in the patient CT was also discussed. The proposed model also showed enough robustness and efficiency to be considered as a pre

  11. WE-D-BRA-04: Online 3D EPID-Based Dose Verification for Optimum Patient Safety

    SciTech Connect

    Spreeuw, H; Rozendaal, R; Olaciregui-Ruiz, I; Mans, A; Mijnheer, B; Herk, M van; Gonzalez, P

    2015-06-15

    Purpose: To develop an online 3D dose verification tool based on EPID transit dosimetry to ensure optimum patient safety in radiotherapy treatments. Methods: A new software package was developed which processes EPID portal images online using a back-projection algorithm for the 3D dose reconstruction. The package processes portal images faster than the acquisition rate of the portal imager (∼ 2.5 fps). After a portal image is acquired, the software searches for "hot spots" in the reconstructed 3D dose distribution. In this study, a hot spot is defined as a 4 cm³ cube in which the average cumulative reconstructed dose exceeds the average total planned dose by at least 20% and 50 cGy. If a hot spot is detected, an alert is generated, resulting in a linac halt. The software has been tested by irradiating an Alderson phantom after introducing various types of serious delivery errors. Results: In our first experiment the Alderson phantom was irradiated with two arcs from a 6 MV VMAT H&N treatment having a large leaf position error or a large monitor unit error. For both arcs and both errors the linac was halted before dose delivery was completed. When no error was introduced, the linac was not halted. The complete processing of a single portal frame, including hot spot detection, takes about 220 ms on a dual hexacore Intel Xeon X5650 CPU at 2.66 GHz. Conclusion: A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for various kinds of gross delivery errors. The detection of hot spots was proven to be effective for the timely detection of these errors. Current work is focused on hot spot detection criteria for various treatment sites and the introduction of a clinical pilot program with online verification of hypo-fractionated (lung) treatments.
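
    The hot-spot criterion stated above (a 4 cm³ cube whose average cumulative reconstructed dose exceeds the average planned dose by at least 20% and 50 cGy) maps naturally onto a moving-average comparison. The sketch below is a simplified illustration on hypothetical dose grids and an assumed voxel size; it is not the clinical software described in the record.

      # Simplified hot-spot check: a 4 cm^3 cube whose mean reconstructed dose exceeds
      # the mean planned dose by >= 20% and >= 50 cGy (doses in cGy, hypothetical grids).
      import numpy as np
      from scipy.ndimage import uniform_filter

      voxel_mm = 2.5                                   # assumed isotropic voxel size
      cube_side_mm = (4.0 * 1000.0) ** (1.0 / 3.0)     # 4 cm^3 -> ~15.9 mm cube side
      size_vox = max(1, int(round(cube_side_mm / voxel_mm)))

      planned = np.full((40, 40, 40), 200.0)           # hypothetical planned dose, cGy
      reconstructed = planned.copy()
      reconstructed[18:25, 18:25, 18:25] = 260.0       # injected delivery error

      mean_plan = uniform_filter(planned, size=size_vox)       # cube-averaged planned dose
      mean_reco = uniform_filter(reconstructed, size=size_vox) # cube-averaged reconstructed dose

      hot = (mean_reco >= 1.2 * mean_plan) & (mean_reco - mean_plan >= 50.0)
      if hot.any():
          print(f"hot spot detected at {int(hot.sum())} cube positions -> halt the linac")
      else:
          print("no hot spot detected")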

  12. In Search of Museum Professional Knowledge Base: Mapping the Professional Knowledge Debate onto Museum Work

    ERIC Educational Resources Information Center

    Tlili, Anwar

    2016-01-01

    Museum professionalism remains an unexplored area in museum studies, particularly with regard to what is arguably the core generic question of a "sui generis" professional knowledge base, and its necessary and sufficient conditions. The need to examine this question becomes all the more important with the increasing expansion of the…

  13. Sociopathic Knowledge Bases: Correct Knowledge Can Be Harmful Even Given Unlimited Computation

    DTIC Science & Technology

    1989-08-01

    [Garbled report-form text from the DTIC record.] Monitoring organization: University of Illinois, Urbana, IL 61801. August 1989; submitted for publication to the Artificial Intelligence Journal. The report, "Sociopathic Knowledge Bases: Correct Knowledge Can Be Harmful Even Given Unlimited Computation," concerns reasoning under uncertainty, which has been widely investigated in artificial intelligence, with probabilistic approaches of particular relevance.

  14. Hidden Knowledge: Working-Class Capacity in the "Knowledge-Based Economy"

    ERIC Educational Resources Information Center

    Livingstone, David W.; Sawchuck, Peter H.

    2005-01-01

    The research reported in this paper attempts to document the actual learning practices of working-class people in the context of the much heralded "knowledge-based economy." Our primary thesis is that working-class peoples' indigenous learning capacities have been denied, suppressed, degraded or diverted within most capitalist schooling,…

  15. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Deney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.

  16. Voice verification upgrade

    NASA Astrophysics Data System (ADS)

    Davis, R. L.; Sinnamon, J. T.; Cox, D. L.

    1982-06-01

    The contractor had two major objectives. The first was to build, test, and deliver to the government an entry control system using speaker verification (voice authentication) as the mechanism for verifying the user's claimed identity. This system included a physical mantrap, with an integral weight scale to prevent more than one user from gaining access with one verification (tailgating). The speaker verification part of the entry control system contained all the updates and embellishments to the algorithm that was developed earlier for the BISS (Base and Installation Security System) system under contract with the Electronic Systems Division of the USAF. These updates were tested prior to and during the contract on an operational system used at Texas Instruments in Dallas, Texas, for controlling entry to the Corporate Information Center (CIC).

  17. Improving structural similarity based virtual screening using background knowledge

    PubMed Central

    2013-01-01

    Background Virtual screening in the form of similarity rankings is often applied in the early drug discovery process to rank and prioritize compounds from a database. This similarity ranking can be achieved with structural similarity measures. However, their general nature can lead to insufficient performance in some application cases. In this paper, we provide a link between ranking-based virtual screening and fragment-based data mining methods. The inclusion of binding-relevant background knowledge into a structural similarity measure improves the quality of the similarity rankings. This background knowledge in the form of binding-relevant substructures can either be derived by hand selection or by automated fragment-based data mining methods. Results In virtual screening experiments, we show that our approach clearly improves enrichment factors with both of its variants: the extension of the structural similarity measure with background knowledge in the form of a hand-selected relevant substructure, or the extension of the similarity measure with background knowledge derived with data mining methods. Conclusion Our study shows that adding binding-relevant background knowledge can lead to significantly improved similarity rankings in virtual screening and that even basic data mining approaches can lead to competitive results, making hand-selection of the background knowledge less crucial. This is especially important in drug discovery and development projects where no receptor structure is available or, more frequently, no verified binding mode is known, and mostly ligand-based approaches can be applied to generate hit compounds. PMID:24341870
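
    One way to read the extension of a structural similarity measure with background knowledge is as a weighted combination of a generic fingerprint similarity and a term rewarding shared binding-relevant substructures. The sketch below uses abstract feature sets and a hypothetical weighting scheme; the paper's actual similarity measure and mined fragments are not given in the record.

      # Sketch: structural similarity extended with binding-relevant background knowledge.
      # Molecules are abstract feature sets; fragments and the weight are hypothetical.

      def tanimoto(a: set, b: set) -> float:
          return len(a & b) / len(a | b) if (a | b) else 0.0

      def similarity_with_background(a: set, b: set, relevant: set, weight: float = 0.3) -> float:
          """Blend generic similarity with the shared fraction of binding-relevant fragments."""
          base = tanimoto(a, b)
          shared_relevant = len(a & b & relevant) / len(relevant) if relevant else 0.0
          return (1.0 - weight) * base + weight * shared_relevant

      query = {"f1", "f2", "f3", "f9"}
      database = {
          "cmpd_A": {"f1", "f2", "f7"},
          "cmpd_B": {"f3", "f9", "f8"},        # shares the binding-relevant fragments
          "cmpd_C": {"f1", "f2", "f3", "f4"},
      }
      relevant_fragments = {"f3", "f9"}         # e.g. hand-selected or mined substructures

      ranking = sorted(database,
                       key=lambda m: similarity_with_background(query, database[m], relevant_fragments),
                       reverse=True)
      print(ranking)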

  18. Extensible knowledge-based architecture for segmenting CT data

    NASA Astrophysics Data System (ADS)

    Brown, Matthew S.; McNitt-Gray, Michael F.; Goldin, Jonathan G.; Aberle, Denise R.

    1998-06-01

    A knowledge-based system has been developed for segmenting computed tomography (CT) images. Its modular architecture includes an anatomical model, image processing engine, inference engine and blackboard. The model contains a priori knowledge of size, shape, X-ray attenuation and relative position of anatomical structures. This knowledge is used to constrain low-level segmentation routines. Model-derived constraints and segmented image objects are both transformed into a common feature space and posted on the blackboard. The inference engine then matches image to model objects, based on the constraints. The transformation to feature space allows the knowledge and image data representations to be independent. Thus a high-level model can be used, with data being stored in a frame-based semantic network. This modularity and explicit representation of knowledge allows for straightforward system extension. We initially demonstrate an application to lung segmentation in thoracic CT, with subsequent extension of the knowledge-base to include tumors within the lung fields. The anatomical model was later augmented to include basic brain anatomy including the skull and blood vessels, to allow automatic segmentation of vascular structures in CT angiograms for 3D rendering and visualization.

  19. Assessment of ground-based atmospheric observations for verification of greenhouse gas emissions from an urban region

    PubMed Central

    McKain, Kathryn; Wofsy, Steven C.; Nehrkorn, Thomas; Eluszkiewicz, Janusz; Ehleringer, James R.; Stephens, Britton B.

    2012-01-01

    International agreements to limit greenhouse gas emissions require verification to ensure that they are effective and fair. Verification based on direct observation of atmospheric greenhouse gas concentrations will be necessary to demonstrate that estimated emission reductions have been actualized in the atmosphere. Here we assess the capability of ground-based observations and a high-resolution (1.3 km) mesoscale atmospheric transport model to determine a change in greenhouse gas emissions over time from a metropolitan region. We test the method with observations from a network of CO2 surface monitors in Salt Lake City. Many features of the CO2 data were simulated with excellent fidelity, although data-model mismatches occurred on hourly timescales due to inadequate simulation of shallow circulations and the precise timing of boundary-layer stratification and destratification. Using two optimization procedures, monthly regional fluxes were constrained to sufficient precision to detect an increase or decrease in emissions of approximately 15% at the 95% confidence level. We argue that integrated column measurements of the urban dome of CO2 from the ground and/or space are less sensitive than surface point measurements to the redistribution of emitted CO2 by small-scale processes and thus may allow for more precise trend detection of emissions from urban regions. PMID:22611187

  20. Assessment of ground-based atmospheric observations for verification of greenhouse gas emissions from an urban region

    NASA Astrophysics Data System (ADS)

    McKain, Kathryn; Wofsy, Steven C.; Nehrkorn, Thomas; Eluszkiewicz, Janusz; Ehleringer, James R.; Stephens, Britton B.

    2012-05-01

    International agreements to limit greenhouse gas emissions require verification to ensure that they are effective and fair. Verification based on direct observation of atmospheric greenhouse gas concentrations will be necessary to demonstrate that estimated emission reductions have been actualized in the atmosphere. Here we assess the capability of ground-based observations and a high-resolution (1.3 km) mesoscale atmospheric transport model to determine a change in greenhouse gas emissions over time from a metropolitan region. We test the method with observations from a network of CO2 surface monitors in Salt Lake City. Many features of the CO2 data were simulated with excellent fidelity, although data-model mismatches occurred on hourly timescales due to inadequate simulation of shallow circulations and the precise timing of boundary-layer stratification and destratification. Using two optimization procedures, monthly regional fluxes were constrained to sufficient precision to detect an increase or decrease in emissions of approximately 15% at the 95% confidence level. We argue that integrated column measurements of the urban dome of CO2 from the ground and/or space are less sensitive than surface point measurements to the redistribution of emitted CO2 by small-scale processes and thus may allow for more precise trend detection of emissions from urban regions.

  1. General Environmental Verification Specification

    NASA Technical Reports Server (NTRS)

    Milne, J. Scott, Jr.; Kaufman, Daniel S.

    2003-01-01

    The NASA Goddard Space Flight Center's General Environmental Verification Specification (GEVS) for STS and ELV Payloads, Subsystems, and Components is currently being revised based on lessons learned from GSFC engineering and flight assurance. The GEVS has been used by Goddard flight projects for the past 17 years as a baseline from which to tailor their environmental test programs. A summary of the requirements and updates is presented along with the rationale behind the changes. The major test areas covered by the GEVS include mechanical, thermal, and EMC, as well as more general requirements for planning and tracking of the verification programs.

  2. A Natural Language Interface Concordant with a Knowledge Base.

    PubMed

    Han, Yong-Jin; Park, Seong-Bae; Park, Se-Young

    2016-01-01

    The discordance between expressions interpretable by a natural language interface (NLI) system and those answerable by a knowledge base is a critical problem in the field of NLIs. In order to solve this discordance problem, this paper proposes a method to translate natural language questions into formal queries that can be generated from a graph-based knowledge base. The proposed method considers a subgraph of a knowledge base as a formal query. Thus, all formal queries corresponding to a concept or a predicate in the knowledge base can be generated prior to query time and all possible natural language expressions corresponding to each formal query can also be collected in advance. A natural language expression has a one-to-one mapping with a formal query. Hence, a natural language question is translated into a formal query by matching the question with the most appropriate natural language expression. If the confidence of this matching is not sufficiently high the proposed method rejects the question and does not answer it. Multipredicate queries are processed by regarding them as a set of collected expressions. The experimental results show that the proposed method thoroughly handles answerable questions from the knowledge base and rejects unanswerable ones effectively.
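
    The translation step described above matches a question against pre-collected natural language expressions, each mapped one-to-one to a formal subgraph query, and rejects low-confidence matches. A minimal sketch of that matching-with-rejection step follows; the expressions, queries, similarity measure, and threshold are all hypothetical stand-ins, since the record does not specify them.

      # Minimal sketch: match a question to a canonical expression, each of which maps
      # one-to-one to a formal (subgraph) query; reject if the match confidence is low.
      # Expressions, queries, and the threshold are hypothetical.
      from difflib import SequenceMatcher

      expression_to_query = {
          "who directed movie X": "(movie X) -[directedBy]-> (?person)",
          "when was movie X released": "(movie X) -[releaseDate]-> (?date)",
          "which actors starred in movie X": "(movie X) -[starring]-> (?person)",
      }

      def translate(question: str, threshold: float = 0.75):
          scored = [(SequenceMatcher(None, question.lower(), expr).ratio(), expr)
                    for expr in expression_to_query]
          best_score, best_expr = max(scored)
          if best_score < threshold:
              return None   # reject: the knowledge base cannot answer this reliably
          return expression_to_query[best_expr]

      print(translate("Who directed movie X?"))         # returns a formal query
      print(translate("What is the plot of movie X?"))  # returns None (rejected)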

  3. A Natural Language Interface Concordant with a Knowledge Base

    PubMed Central

    Han, Yong-Jin; Park, Seong-Bae; Park, Se-Young

    2016-01-01

    The discordance between expressions interpretable by a natural language interface (NLI) system and those answerable by a knowledge base is a critical problem in the field of NLIs. In order to solve this discordance problem, this paper proposes a method to translate natural language questions into formal queries that can be generated from a graph-based knowledge base. The proposed method considers a subgraph of a knowledge base as a formal query. Thus, all formal queries corresponding to a concept or a predicate in the knowledge base can be generated prior to query time and all possible natural language expressions corresponding to each formal query can also be collected in advance. A natural language expression has a one-to-one mapping with a formal query. Hence, a natural language question is translated into a formal query by matching the question with the most appropriate natural language expression. If the confidence of this matching is not sufficiently high the proposed method rejects the question and does not answer it. Multipredicate queries are processed by regarding them as a set of collected expressions. The experimental results show that the proposed method thoroughly handles answerable questions from the knowledge base and rejects unanswerable ones effectively. PMID:26904105

  4. Evaluation of database technologies for the CTBT Knowledge Base prototype

    SciTech Connect

    Keyser, R.; Shepard-Dombroski, E.; Baur, D.; Hipp, J.; Moore, S.; Young, C.; Chael, E.

    1996-11-01

    This document examines a number of different software technologies in the rapidly changing field of database management systems, evaluates these systems in light of the expected needs of the Comprehensive Test Ban Treaty (CTBT) Knowledge Base, and makes some recommendations for the initial prototypes of the Knowledge Base. The Knowledge Base requirements are examined and then used as criteria for evaluation of the database management options. A mock-up of the data expected in the Knowledge Base is used as a basis for examining how four different database technologies deal with the problems of storing and retrieving the data. Based on these requirements and the results of the evaluation, the recommendation is that the Illustra database be considered for the initial prototype of the Knowledge Base. Illustra offers a unique blend of performance, flexibility, and features that will aid in the implementation of the prototype. At the same time, Illustra provides a high level of compatibility with the hardware and software environments present at the US NDC (National Data Center) and the PIDC (Prototype International Data Center).

  5. Knowledge-based fault diagnosis system for refuse collection vehicle

    SciTech Connect

    Tan, CheeFai; Juffrizal, K.; Khalil, S. N.; Nidzamuddin, M. Y.

    2015-05-15

    The refuse collection vehicle is manufactured by a local vehicle body manufacturer. Currently, the company supplies six models of the waste compactor truck to the local authority as well as to waste management companies. The company faces difficulty acquiring knowledge from its expert when the expert is absent. To solve this problem, the knowledge from the expert can be stored in an expert system, which is able to provide the necessary support to the company when the expert is not available. The implementation of the process and tool can thereby be standardized and made more accurate. The knowledge input to the expert system is based on design guidelines and the expert's experience. This project highlights another application of the knowledge-based system (KBS) approach, in troubleshooting of the refuse collection vehicle production process. The main aim of the research is to develop a novel expert fault diagnosis system framework for the refuse collection vehicle.
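
    At its simplest, a knowledge-based fault diagnosis system of the kind described reduces to rules that map observed symptoms to suggested actions. The sketch below is a generic illustration with hypothetical symptoms and faults for a waste compactor truck; it does not reproduce the company's design guidelines or the expert's knowledge.

      # Generic rule-based diagnosis sketch; symptoms and faults are hypothetical examples.
      RULES = [
          ({"compactor blade slow", "hydraulic oil low"}, "top up and check the hydraulic oil supply"),
          ({"compactor blade slow", "pump noise"}, "inspect the hydraulic pump for wear"),
          ({"tailgate will not close", "control lever unresponsive"}, "check the control valve and wiring"),
      ]

      def diagnose(observed: set) -> list:
          """Return the suggested action of every rule whose conditions are all observed."""
          return [action for conditions, action in RULES if conditions <= observed]

      observed_symptoms = {"compactor blade slow", "hydraulic oil low", "pump noise"}
      for suggestion in diagnose(observed_symptoms) or ["no matching rule - refer to the expert"]:
          print(suggestion)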

  6. Managing Project Landscapes in Knowledge-Based Enterprises

    NASA Astrophysics Data System (ADS)

    Stantchev, Vladimir; Franke, Marc Roman

    Knowledge-based enterprises are typically conducting a large number of research and development projects simultaneously. This is a particularly challenging task in complex and diverse project landscapes. Project Portfolio Management (PPM) can be a viable framework for knowledge and innovation management in such landscapes. A standardized process with defined functions such as project data repository, project assessment, selection, reporting, and portfolio reevaluation can serve as a starting point. In this work we discuss the benefits a multidimensional evaluation framework can provide for knowledge-based enterprises. Furthermore, we describe a knowledge and learning strategy and process in the context of PPM and evaluate their practical applicability at different stages of the PPM process.

  7. Knowledge-based fault diagnosis system for refuse collection vehicle

    NASA Astrophysics Data System (ADS)

    Tan, CheeFai; Juffrizal, K.; Khalil, S. N.; Nidzamuddin, M. Y.

    2015-05-01

    The refuse collection vehicle is manufactured by a local vehicle body manufacturer. Currently, the company supplies six models of the waste compactor truck to the local authority as well as to waste management companies. The company faces difficulty acquiring knowledge from its expert when the expert is absent. To solve this problem, the knowledge from the expert can be stored in an expert system, which is able to provide the necessary support to the company when the expert is not available. The implementation of the process and tool can thereby be standardized and made more accurate. The knowledge input to the expert system is based on design guidelines and the expert's experience. This project highlights another application of the knowledge-based system (KBS) approach, in troubleshooting of the refuse collection vehicle production process. The main aim of the research is to develop a novel expert fault diagnosis system framework for the refuse collection vehicle.

  8. Validation and verification of expert systems

    NASA Technical Reports Server (NTRS)

    Gilstrap, Lewey

    1991-01-01

    Validation and verification (V&V) are procedures used to evaluate system structure or behavior with respect to a set of requirements. Although expert systems are often developed as a series of prototypes without requirements, it is not possible to perform V&V on any system for which requirements have not been prepared. In addition, there are special problems associated with the evaluation of expert systems that do not arise in the evaluation of conventional systems, such as verification of the completeness and accuracy of the knowledge base. The criticality of most NASA missions makes it important to be able to certify the performance of the expert systems used to support these missions. Recommendations for the most appropriate method for integrating V&V into the Expert System Development Methodology (ESDM) and suggestions for the most suitable approaches for each stage of ESDM development are presented.
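
    Verification of the completeness and accuracy of a knowledge base, mentioned above, is often approached with mechanical checks such as detecting contradictory rules or conditions that nothing in the system can ever establish. The sketch below runs two such checks on a toy rule base; it is a generic illustration and not part of the ESDM recommendations.

      # Two simple knowledge-base verification checks on a toy rule set:
      #   1. contradiction: identical conditions lead to a fact and its negation
      #   2. incompleteness: a condition that no rule concludes and no input provides
      # Negation is encoded by the "not " prefix convention used below.
      RULES = [
          (frozenset({"pressure_high", "valve_closed"}), "alarm_on"),
          (frozenset({"pressure_high", "valve_closed"}), "not alarm_on"),   # contradicts rule 1
          (frozenset({"sensor_fault"}), "maintenance_required"),
      ]
      INPUT_FACTS = {"pressure_high", "valve_closed"}   # externally observable facts

      def find_contradictions(rules):
          by_conditions = {}
          for conds, concl in rules:
              by_conditions.setdefault(conds, set()).add(concl)
          return {conds: concls for conds, concls in by_conditions.items()
                  if any(("not " + c) in concls for c in concls)}

      def find_unsupported_conditions(rules, inputs):
          concluded = {concl for _, concl in rules}
          used = set().union(*(conds for conds, _ in rules))
          return used - concluded - inputs

      print("contradictions:", find_contradictions(RULES))
      print("unsupported conditions:", find_unsupported_conditions(RULES, INPUT_FACTS))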

  9. SU-E-T-505: CT-Based Independent Dose Verification for RapidArc Plan as a Secondary Check

    SciTech Connect

    Tachibana, H; Baba, H; Kamima, T; Takahashi, R

    2014-06-01

    Purpose: To design and develop a CT-based independent dose verification for RapidArc plans and to show the effectiveness of inhomogeneity correction in the secondary check for such plans. Methods: To independently compute the radiological path from the body surface to the reference point and the equivalent field sizes from the multiple MLC aperture shapes in the RapidArc MLC sequences, DICOM files of the CT image, structure set and RapidArc plan were imported into our in-house software. The radiological path was computed using a three-dimensional CT array for each segment. The multiple MLC aperture shapes were used to compute the tissue maximum ratio and phantom scatter factor using the Clarkson method. In this study, two RapidArc plans for oropharynx cancer were used to compare the doses from the CT-based calculation and from a water-equivalent phantom calculation using the contoured body structure against the dose from a treatment planning system (TPS). Results: The comparison for one plan shows good agreement for both calculations (within 1%). In the other case, however, the CT-based calculation shows better agreement than the water-equivalent phantom calculation (CT-based: -2.8% vs. water-based: -3.8%), because there were multiple structures along the beam paths and the radiological path lengths in the CT-based calculation and the water-homogeneous phantom calculation differed considerably. Conclusion: RapidArc treatments are performed at many sites (from head and chest to abdomen and pelvis), which include inhomogeneous media. Therefore, a more reliable CT-based calculation may be used as a secondary check for the independent verification.
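
    The independent check above hinges on the radiological (water-equivalent) path from the body surface to the reference point, computed from the three-dimensional CT array. A bare-bones sketch of that computation is given below, assuming a relative electron density grid, isotropic voxels, nearest-neighbour sampling, and a ray that stays inside the grid; the in-house software's actual algorithm is not described in the abstract.

      # Bare-bones radiological (water-equivalent) path length through a density grid.
      # Assumes a relative-electron-density array, isotropic voxels, nearest-neighbour
      # sampling, and a ray that stays inside the grid. Hypothetical numbers throughout.
      import numpy as np

      def radiological_path(density, start_mm, end_mm, voxel_mm, step_mm=0.5):
          start, end = np.asarray(start_mm, float), np.asarray(end_mm, float)
          length = np.linalg.norm(end - start)
          n_steps = max(1, int(length / step_mm))
          wepl = 0.0
          for t in np.linspace(0.0, 1.0, n_steps, endpoint=False):
              point = start + t * (end - start)
              idx = tuple(np.clip((point / voxel_mm).astype(int), 0, np.array(density.shape) - 1))
              wepl += density[idx] * (length / n_steps)
          return wepl   # water-equivalent path length in mm

      density = np.ones((100, 100, 100))        # water-equivalent body
      density[40:60, 40:60, 40:60] = 0.25       # hypothetical low-density (lung-like) region

      surface_point = np.array([150.0, 150.0, 0.0])      # mm
      reference_point = np.array([150.0, 150.0, 150.0])  # mm
      print(radiological_path(density, surface_point, reference_point, voxel_mm=3.0))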

  10. Knowledge discovery based on experiential learning corporate culture management

    NASA Astrophysics Data System (ADS)

    Tu, Kai-Jan

    2014-10-01

    A good corporate culture based on humanistic theory can make an enterprise's management very effective, so that all of the enterprise's members have strong cohesion and centripetal force. With an experiential learning model, the enterprise can establish a corporate culture with an enthusiastic learning spirit, gain innovation ability and a positive knowledge growth effect, and meet fierce global marketing competition. A case study of Trend's corporate culture offers evidence for the industry knowledge growth rate equation as a contribution to experiential learning corporate culture management.

  11. A Knowledge-Based Approach to Language Production

    DTIC Science & Technology

    1985-08-01

    [Garbled dissertation front-matter and abstract text from the DTIC record.] A Knowledge-Based Approach to Language Production, by Paul Schafran Jacobs, A.B. (Harvard University) 1981, S.M. (Harvard University) 1981; Graduate Division of the University of California, Berkeley. The recoverable abstract fragment concerns systemic grammars (morphological, lexical, syntactic, and functional knowledge), where the value of a feature may be a literal, a special symbol, or a composite.

  12. Arranging ISO 13606 archetypes into a knowledge base.

    PubMed

    Kopanitsa, Georgy

    2014-01-01

    To enable the efficient reuse of standard-based medical data we propose to develop a higher-level information model that will complement the archetype model of ISO 13606. This model will make use of the relationships that are specified in UML to connect medical archetypes into a knowledge base within a repository. UML connectors were analyzed for their ability to be applied in the implementation of a higher-level model that will establish relationships between archetypes. An information model was developed using XML Schema notation. The model allows linking different archetypes of one repository into a knowledge base. Presently it supports several relationships and will be advanced in the future.

  13. "Chromosome": a knowledge-based system for the chromosome classification.

    PubMed

    Ramstein, G; Bernadet, M

    1993-01-01

    Chromosome, a knowledge-based analysis system, has been designed for the classification of human chromosomes. Its aim is to perform an optimal classification by driving a tool box containing the procedures of image processing, pattern recognition and classification. This paper presents the general architecture of Chromosome, based on a multiagent system generator. The image processing tool box is described, from the metaphase enhancement to the fine classification. Emphasis is then put on the knowledge base intended for chromosome recognition. The global classification process is also presented, showing how Chromosome proceeds to classify a given chromosome. Finally, we discuss further extensions of the system for karyotype building.

  14. Acrylonitrile Butadiene Styrene (ABS) plastic based low cost tissue equivalent phantom for verification dosimetry in IMRT.

    PubMed

    Kumar, Rajesh; Sharma, S D; Deshpande, Sudesh; Ghadi, Yogesh; Shaiju, V S; Amols, H I; Mayya, Y S

    2009-12-17

    A novel IMRT phantom was designed and fabricated using Acrylonitrile Butadiene Styrene (ABS) plastic. Physical properties of ABS plastic related to radiation interaction and dosimetry were compared with commonly available phantom materials for dose measurements in radiotherapy. The ABS IMRT phantom has provisions to hold various types of detectors such as ion chambers, radiographic/radiochromic films, TLDs, MOSFETs, and gel dosimeters. The measurements related to pre-treatment dose verification in IMRT of carcinoma of the prostate were carried out using ABS and Scanditronics-Wellhoffer RW3 IMRT phantoms for five different cases. Point dose data were acquired using an ionization chamber and TLD discs, while Gafchromic EBT and radiographic EDR2 films were used for generating 2-D dose distributions. Treatment planning system (TPS) calculated and measured doses in the ABS plastic and RW3 IMRT phantoms were in agreement within +/-2%. The dose values at a point in a given patient acquired using the ABS and RW3 phantoms were found comparable within 1%. Fluence maps and dose distributions of these patients generated by the TPS and measured in the ABS IMRT phantom were also found comparable both numerically and spatially. This study indicates that the ABS plastic IMRT phantom is a tissue equivalent phantom and that dosimetrically it is similar to solid/plastic water IMRT phantoms. Though this material is demonstrated here for IMRT dose verification, it can also be used as a tissue equivalent phantom material for other dosimetry purposes in radiotherapy.

  15. Rapid verification of candidate serological biomarkers using gel-based, label-free multiple reaction monitoring.

    PubMed

    Tang, Hsin-Yao; Beer, Lynn A; Barnhart, Kurt T; Speicher, David W

    2011-09-02

    Stable isotope dilution-multiple reaction monitoring-mass spectrometry (SID-MRM-MS) has emerged as a promising platform for verification of serological candidate biomarkers. However, cost and time needed to synthesize and evaluate stable isotope peptides, optimize spike-in assays, and generate standard curves quickly becomes unattractive when testing many candidate biomarkers. In this study, we demonstrate that label-free multiplexed MRM-MS coupled with major protein depletion and 1D gel separation is a time-efficient, cost-effective initial biomarker verification strategy requiring less than 100 μL of serum. Furthermore, SDS gel fractionation can resolve different molecular weight forms of targeted proteins with potential diagnostic value. Because fractionation is at the protein level, consistency of peptide quantitation profiles across fractions permits rapid detection of quantitation problems for specific peptides from a given protein. Despite the lack of internal standards, the entire workflow can be highly reproducible, and long-term reproducibility of relative protein abundance can be obtained using different mass spectrometers and LC methods with external reference standards. Quantitation down to ~200 pg/mL could be achieved using this workflow. Hence, the label-free GeLC-MRM workflow enables rapid, sensitive, and economical initial screening of large numbers of candidate biomarkers prior to setting up SID-MRM assays or immunoassays for the most promising candidate biomarkers.

  16. Development and validation of MCNPX-based Monte Carlo treatment plan verification system

    PubMed Central

    Jabbari, Iraj; Monadi, Shahram

    2015-01-01

    A Monte Carlo treatment plan verification (MCTPV) system was developed for clinical treatment plan verification (TPV), especially for conformal and intensity-modulated radiotherapy (IMRT) plans. In the MCTPV, the MCNPX code was used for particle transport through the accelerator head and the patient body. MCTPV has an interface with the TiGRT planning system and reads the information needed for the Monte Carlo calculation, transferred in the digital imaging and communications in medicine-radiation therapy (DICOM-RT) format. In MCTPV several methods were applied in order to reduce the simulation time. The relative dose distribution of a clinical prostate conformal plan calculated by the MCTPV was compared with that of the TiGRT planning system. The results showed correct implementation of the beam configurations and patient information in this system. For quantitative evaluation of MCTPV, a two-dimensional (2D) diode array (MapCHECK2) and gamma index analysis were used. The gamma passing rate (3%/3 mm) of an IMRT plan was found to be 98.5% for total beams. Also, comparison of the measured and Monte Carlo calculated doses at several points inside an inhomogeneous phantom for 6- and 18-MV photon beams showed a good agreement (within 1.5%). The accuracy and timing results of MCTPV showed that MCTPV could be used very efficiently for additional assessment of complicated plans such as IMRT plans. PMID:26170554

  17. Development and validation of MCNPX-based Monte Carlo treatment plan verification system.

    PubMed

    Jabbari, Iraj; Monadi, Shahram

    2015-01-01

    A Monte Carlo treatment plan verification (MCTPV) system was developed for clinical treatment plan verification (TPV), especially for conformal and intensity-modulated radiotherapy (IMRT) plans. In the MCTPV, the MCNPX code was used for particle transport through the accelerator head and the patient body. MCTPV has an interface with the TiGRT planning system and reads the information needed for the Monte Carlo calculation, transferred in the digital imaging and communications in medicine-radiation therapy (DICOM-RT) format. In MCTPV several methods were applied in order to reduce the simulation time. The relative dose distribution of a clinical prostate conformal plan calculated by the MCTPV was compared with that of the TiGRT planning system. The results showed correct implementation of the beam configurations and patient information in this system. For quantitative evaluation of MCTPV, a two-dimensional (2D) diode array (MapCHECK2) and gamma index analysis were used. The gamma passing rate (3%/3 mm) of an IMRT plan was found to be 98.5% for total beams. Also, comparison of the measured and Monte Carlo calculated doses at several points inside an inhomogeneous phantom for 6- and 18-MV photon beams showed a good agreement (within 1.5%). The accuracy and timing results of MCTPV showed that MCTPV could be used very efficiently for additional assessment of complicated plans such as IMRT plans.

  18. New analysis tools and processes for mask repair verification and defect disposition based on AIMS images

    NASA Astrophysics Data System (ADS)

    Richter, Rigo; Poortinga, Eric; Scheruebl, Thomas

    2009-10-01

    Using AIMS™ to qualify repairs of defects on photomasks is an industry standard. AIMS™ images match the lithographic imaging performance without the need for wafer prints. Utilization of this capability by photomask manufacturers has risen due to the increased complexity of layouts incorporating RET and phase shift technologies. Tighter specifications by end-users have pushed AIMS™ analysis to now include CD performance results in addition to the traditional intensity performance results. Discussed is a new Repair Verification system for automated analysis of AIMS™ images. Newly designed user interfaces and algorithms guide users through predefined analysis routines so as to minimize errors. There are two main routines discussed, one allowing multiple reference sites along with a test/defect site within a single image of repeating features. The second routine compares a test/defect measurement image with a reference measurement image. Three evaluation methods possible with the compared images are discussed in the context of providing thorough analysis capability. This paper highlights new functionality for AIMS™ analysis. Using structured analysis processes and innovative analysis tools leads to a highly efficient and more reliable result reporting of repair verification analysis.

  19. Knowledge.

    ERIC Educational Resources Information Center

    Online-Offline, 1999

    1999-01-01

    This theme issue on knowledge includes annotated listings of Web sites, CD-ROMs and computer software, videos, books, and additional resources that deal with knowledge and differences between how animals and humans learn. Sidebars discuss animal intelligence, learning proper behavior, and getting news from the Internet. (LRW)

  1. Dynamic Strategic Planning in a Professional Knowledge-Based Organization

    ERIC Educational Resources Information Center

    Olivarius, Niels de Fine; Kousgaard, Marius Brostrom; Reventlow, Susanne; Quelle, Dan Grevelund; Tulinius, Charlotte

    2010-01-01

    Professional, knowledge-based institutions have a particular form of organization and culture that makes special demands on the strategic planning supervised by research administrators and managers. A model for dynamic strategic planning based on a pragmatic utilization of the multitude of strategy models was used in a small university-affiliated…

  2. Malaysia Transitions toward a Knowledge-Based Economy

    ERIC Educational Resources Information Center

    Mustapha, Ramlee; Abdullah, Abu

    2004-01-01

    The emergence of a knowledge-based economy (k-economy) has spawned a "new" notion of workplace literacy, changing the relationship between employers and employees. The traditional covenant where employees expect a stable or lifelong employment will no longer apply. The retention of employees will most probably be based on their skills…

  3. Category vs. Object Knowledge in Category-Based Induction

    ERIC Educational Resources Information Center

    Murphy, Gregory L.; Ross, Brian H.

    2010-01-01

    In one form of category-based induction, people make predictions about unknown properties of objects. There is a tension between predictions made based on the object's specific features (e.g., objects above a certain size tend not to fly) and those made by reference to category-level knowledge (e.g., birds fly). Seven experiments with artificial…

  4. Developing and Assessing Teachers' Knowledge of Game-Based Learning

    ERIC Educational Resources Information Center

    Shah, Mamta; Foster, Aroutis

    2015-01-01

    Research focusing on the development and assessment of teacher knowledge in game-based learning is in its infancy. A mixed-methods study was undertaken to educate pre-service teachers in game-based learning using the Game Network Analysis (GaNA) framework. Fourteen pre-service teachers completed a methods course, which prepared them in game…

  7. Knowledge management: An abstraction of knowledge base and database management systems

    NASA Technical Reports Server (NTRS)

    Riedesel, Joel D.

    1990-01-01

    Artificial intelligence application requirements demand powerful representation capabilities as well as efficiency for real-time domains. Many tools exist, the most prevalent being expert system tools such as ART, KEE, OPS5, and CLIPS. Other tools just emerging from the research environment are truth maintenance systems for representing non-monotonic knowledge, constraint systems, object-oriented programming, and qualitative reasoning. Unfortunately, as many knowledge engineers have experienced, simply applying a tool to an application requires a large amount of effort to bend the application to fit, and much supporting work is needed to make the tool integrate effectively. A Knowledge Management Design System (KNOMAD) is described, which is a collection of tools built in layers. The layered architecture provides two major benefits: the ability to flexibly apply only those tools that are necessary for an application, and the ability to keep overhead, and thus inefficiency, to a minimum. KNOMAD is designed to manage many knowledge bases in a distributed environment, providing maximum flexibility and expressivity to the knowledge engineer while also providing support for efficiency.

  8. A Simple Visual Ethanol Biosensor Based on Alcohol Oxidase Immobilized onto Polyaniline Film for Halal Verification of Fermented Beverage Samples

    PubMed Central

    Kuswandi, Bambang; Irmawati, Titi; Hidayat, Moch Amrun; Jayus; Ahmad, Musa

    2014-01-01

    A simple visual ethanol biosensor based on alcohol oxidase (AOX) immobilised onto polyaniline (PANI) film for halal verification of fermented beverage samples is described. This biosensor responds to ethanol via a colour change from green to blue, due to the enzymatic reaction of ethanol that produces acetaldehyde and hydrogen peroxide, when the latter oxidizes the PANI film. The procedure to obtain this biosensor consists of the immobilization of AOX onto PANI film by adsorption. For the immobilisation, an AOX solution is deposited on the PANI film and left at room temperature until dried (30 min). The biosensor was constructed as a dip stick for visual and simple use. The colour changes of the films have been scanned and analysed using image analysis software (i.e., ImageJ) to study the characteristics of the biosensor's response toward ethanol. The biosensor has a linear response in an ethanol concentration range of 0.01%–0.8%, with a correlation coefficient (r) of 0.996. The detection limit of the biosensor was 0.001%, with reproducibility (RSD) of 1.6% and a lifetime of up to seven weeks when stored at 4 °C. The biosensor provides accurate results for ethanol determination in fermented drinks and was in good agreement with the standard method (gas chromatography) results. Thus, the biosensor could be used as a simple visual method for ethanol determination in fermented beverage samples that can be useful to the Muslim community for halal verification. PMID:24473284
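
    The biosensor's response is read out as a colour change quantified by image analysis and related linearly to ethanol concentration over 0.01%–0.8%. The sketch below fits and inverts such a linear calibration on hypothetical intensity readings; the actual ImageJ measurements and calibration coefficients are not given in the record.

      # Hypothetical linear calibration of colour intensity vs. ethanol concentration
      # (the biosensor is reported linear over 0.01%-0.8%); all readings below are made up.
      import numpy as np

      ethanol_pct = np.array([0.01, 0.05, 0.1, 0.2, 0.4, 0.8])
      blue_intensity = np.array([12.0, 19.0, 28.0, 46.0, 83.0, 155.0])   # from scanned films

      slope, intercept = np.polyfit(ethanol_pct, blue_intensity, 1)
      r = np.corrcoef(ethanol_pct, blue_intensity)[0, 1]
      print(f"calibration: I = {slope:.1f} * C + {intercept:.1f}  (r = {r:.3f})")

      def ethanol_from_intensity(intensity: float) -> float:
          """Invert the calibration; only valid inside the linear range."""
          return (intensity - intercept) / slope

      print(f"sample at intensity 60 -> {ethanol_from_intensity(60.0):.3f} % ethanol")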

  9. A simple visual ethanol biosensor based on alcohol oxidase immobilized onto polyaniline film for halal verification of fermented beverage samples.

    PubMed

    Kuswandi, Bambang; Irmawati, Titi; Hidayat, Moch Amrun; Jayus; Ahmad, Musa

    2014-01-27

    A simple visual ethanol biosensor based on alcohol oxidase (AOX) immobilised onto polyaniline (PANI) film for halal verification of fermented beverage samples is described. This biosensor responds to ethanol via a colour change from green to blue, due to the enzymatic reaction of ethanol that produces acetaldehyde and hydrogen peroxide, when the latter oxidizes the PANI film. The procedure to obtain this biosensor consists of the immobilization of AOX onto PANI film by adsorption. For the immobilisation, an AOX solution is deposited on the PANI film and left at room temperature until dried (30 min). The biosensor was constructed as a dip stick for visual and simple use. The colour changes of the films have been scanned and analysed using image analysis software (i.e., ImageJ) to study the characteristics of the biosensor's response toward ethanol. The biosensor has a linear response in an ethanol concentration range of 0.01%-0.8%, with a correlation coefficient (r) of 0.996. The detection limit of the biosensor was 0.001%, with reproducibility (RSD) of 1.6% and a lifetime of up to seven weeks when stored at 4 °C. The biosensor provides accurate results for ethanol determination in fermented drinks, and its results were in good agreement with those of the standard method (gas chromatography). Thus, the biosensor could be used as a simple visual method for ethanol determination in fermented beverage samples that can be useful to the Muslim community for halal verification.

  10. Cognitive Bias in Systems Verification

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: Patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: Economics, Politics, Intelligence, Marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. Worth considering at key points in the process.

  11. KBGIS-2: A knowledge-based geographic information system

    NASA Technical Reports Server (NTRS)

    Smith, T.; Peuquet, D.; Menon, S.; Agarwal, P.

    1986-01-01

    The architecture and working of a recently implemented knowledge-based geographic information system (KBGIS-2) that was designed to satisfy several general criteria for the geographic information system are described. The system has four major functions that include query-answering, learning, and editing. The main query finds constrained locations for spatial objects that are describable in a predicate-calculus based spatial objects language. The main search procedures include a family of constraint-satisfaction procedures that use a spatial object knowledge base to search efficiently for complex spatial objects in large, multilayered spatial data bases. These data bases are represented in quadtree form. The search strategy is designed to reduce the computational cost of search in the average case. The learning capabilities of the system include the addition of new locations of complex spatial objects to the knowledge base as queries are answered, and the ability to learn inductively definitions of new spatial objects from examples. The new definitions are added to the knowledge base by the system. The system is currently performing all its designated tasks successfully, although currently implemented on inadequate hardware. Future reports will detail the performance characteristics of the system, and various new extensions are planned in order to enhance the power of KBGIS-2.
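
    The quadtree representation mentioned above is what lets the constraint-satisfaction search skip large uniform regions instead of visiting individual cells. The following is a minimal region-quadtree sketch, not KBGIS-2 code: it assumes a square binary raster whose side is a power of two and shows a windowed count query that prunes uniform subtrees.

    ```python
    class QuadNode:
        """Region quadtree node over a 2**k x 2**k binary raster."""
        def __init__(self, grid, x, y, size):
            self.x, self.y, self.size = x, y, size
            vals = {grid[i][j] for i in range(y, y + size) for j in range(x, x + size)}
            if len(vals) == 1:                 # uniform block: store as a leaf
                self.value, self.children = vals.pop(), None
            else:                              # mixed block: split into quadrants
                h = size // 2
                self.value = None
                self.children = [QuadNode(grid, x + dx, y + dy, h)
                                 for dy in (0, h) for dx in (0, h)]

        def count(self, qx, qy, qw, qh):
            """Count 1-cells inside the query window, pruning uniform subtrees."""
            x0, y0 = max(self.x, qx), max(self.y, qy)
            x1 = min(self.x + self.size, qx + qw)
            y1 = min(self.y + self.size, qy + qh)
            if x0 >= x1 or y0 >= y1:
                return 0                        # no overlap with this block
            if self.children is None:
                return self.value * (x1 - x0) * (y1 - y0)
            return sum(c.count(qx, qy, qw, qh) for c in self.children)


    grid = [[1 if (i < 4 and j < 4) else 0 for j in range(8)] for i in range(8)]
    tree = QuadNode(grid, 0, 0, 8)
    print(tree.count(2, 2, 4, 4))   # occupied cells in the 4x4 window at (2, 2) -> 4
    ```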

  12. KBGIS-II: A knowledge-based geographic information system

    NASA Technical Reports Server (NTRS)

    Smith, Terence; Peuquet, Donna; Menon, Sudhakar; Agarwal, Pankaj

    1986-01-01

    The architecture and working of a recently implemented Knowledge-Based Geographic Information System (KBGIS-II), designed to satisfy several general criteria for the GIS, is described. The system has four major functions including query-answering, learning and editing. The main query finds constrained locations for spatial objects that are describable in a predicate-calculus based spatial object language. The main search procedures include a family of constraint-satisfaction procedures that use a spatial object knowledge base to search efficiently for complex spatial objects in large, multilayered spatial data bases. These data bases are represented in quadtree form. The search strategy is designed to reduce the computational cost of search in the average case. The learning capabilities of the system include the addition of new locations of complex spatial objects to the knowledge base as queries are answered, and the ability to learn inductively definitions of new spatial objects from examples. The new definitions are added to the knowledge base by the system. The system is performing all its designated tasks successfully. Future reports will relate performance characteristics of the system.

  13. Verification and validation benchmarks.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
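
    The manufactured-solution style of code verification benchmark recommended above can be illustrated with a very small example. The sketch below is generic rather than drawn from the paper: it manufactures u(x) = sin(πx) for the 1-D Poisson problem -u'' = f, derives the matching source term analytically, and checks that a second-order finite-difference solver converges at the expected rate under grid refinement.

    ```python
    import numpy as np

    def solve_poisson(n):
        """Second-order finite differences for -u'' = f on (0,1), u(0)=u(1)=0."""
        h = 1.0 / (n + 1)
        x = np.linspace(h, 1.0 - h, n)
        f = np.pi ** 2 * np.sin(np.pi * x)          # manufactured source term
        # Tridiagonal matrix for -u'' with Dirichlet boundary conditions
        A = (np.diag(np.full(n, 2.0)) - np.diag(np.ones(n - 1), 1)
             - np.diag(np.ones(n - 1), -1)) / h ** 2
        u = np.linalg.solve(A, f)
        exact = np.sin(np.pi * x)                    # manufactured solution
        return np.max(np.abs(u - exact))

    errors = [solve_poisson(n) for n in (16, 32, 64, 128)]
    rates = [np.log2(errors[i] / errors[i + 1]) for i in range(len(errors) - 1)]
    print("max errors:", [f"{e:.2e}" for e in errors])
    print("observed convergence rates (should approach 2):", [f"{r:.2f}" for r in rates])
    ```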

  14. Desperately seeking data: knowledge base-database links.

    PubMed Central

    Hripcsak, G.; Johnson, S. B.; Clayton, P. D.

    1993-01-01

    Linking a knowledge-based system (KBS) to a clinical database is a difficult task, but critical if such systems are to achieve widespread use. The Columbia-Presbyterian Medical Center's clinical event monitor provides alerts, interpretations, research screening, and quality assurance functions for the center. Its knowledge base consists of Arden Syntax Medical Logic Modules (MLMs). The knowledge base was analyzed in order to quantify the use and impact of KBS-database links. The MLM data slot, which contains the definition of these links, had almost as many statements (5.8 vs. 8.8, ns with p = 0.15) and more tokens (122 vs. 76, p = 0.037) than the logic slot, which contains the actual medical knowledge. The data slot underwent about twice as many modifications over time as the logic slot (3.0 vs. 1.6 modifications/version, p = 0.010). Database queries and updates accounted for 97.2% of the MLM's total elapsed execution time. Thus, KBS-database links consume substantial resources in an MLM knowledge base, in terms of coding, maintenance, and performance. PMID:8130552

  15. Fast 3D dosimetric verifications based on an electronic portal imaging device using a GPU calculation engine.

    PubMed

    Zhu, Jinhan; Chen, Lixin; Chen, Along; Luo, Guangwen; Deng, Xiaowu; Liu, Xiaowei

    2015-04-11

    The aim was to use a graphics processing unit (GPU) calculation engine to implement a fast 3D pre-treatment dosimetric verification procedure based on an electronic portal imaging device (EPID). The GPU algorithm includes the deconvolution and convolution method for the fluence-map calculations, the collapsed-cone convolution/superposition (CCCS) algorithm for the 3D dose calculations and the 3D gamma evaluation calculations. The results of the GPU-based CCCS algorithm were compared to those of Monte Carlo simulations. The planned and EPID-based reconstructed dose distributions in overridden-to-water phantoms and the original patients were compared for 6 MV and 10 MV photon beams in intensity-modulated radiation therapy (IMRT) treatment plans based on dose differences and gamma analysis. The total single-field dose computation time was less than 8 s, and the gamma evaluation for a 0.1-cm grid resolution was completed in approximately 1 s. The results of the GPU-based CCCS algorithm exhibited good agreement with those of the Monte Carlo simulations. The gamma analysis indicated good agreement between the planned and reconstructed dose distributions for the treatment plans. For the target volume, the differences in the mean dose were less than 1.8%, and the differences in the maximum dose were less than 2.5%. For the critical organs, minor differences were observed between the reconstructed and planned doses. The GPU calculation engine was used to boost the speed of 3D dose and gamma evaluation calculations, thus offering the possibility of true real-time 3D dosimetric verification.
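
    The gamma evaluation referred to above combines a dose-difference criterion with a distance-to-agreement (DTA) criterion; a point passes when the combined index is at most 1. The sketch below is not the GPU implementation described in the paper: it is a brute-force 2D global gamma calculation on synthetic dose grids, with the conventional 3%/3 mm criteria chosen as illustrative defaults.

    ```python
    import numpy as np

    def gamma_index(dose_ref, dose_eval, spacing_mm, dd_pct=3.0, dta_mm=3.0):
        """Brute-force global 2D gamma evaluation (Low et al. style criteria)."""
        dd = dd_pct / 100.0 * dose_ref.max()          # dose-difference criterion
        ny, nx = dose_ref.shape
        yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
        gamma = np.zeros_like(dose_ref)
        for i in range(ny):
            for j in range(nx):
                dist2 = ((yy - i) ** 2 + (xx - j) ** 2) * spacing_mm ** 2
                dose2 = (dose_eval - dose_ref[i, j]) ** 2
                gamma[i, j] = np.sqrt(np.min(dist2 / dta_mm ** 2 + dose2 / dd ** 2))
        return gamma

    # Synthetic example: a smooth reference dose and a slightly perturbed evaluation.
    y, x = np.mgrid[0:40, 0:40]
    ref = 2.0 * np.exp(-((x - 20) ** 2 + (y - 20) ** 2) / 200.0)
    ev = 1.02 * ref                                    # 2% uniform scaling error
    g = gamma_index(ref, ev, spacing_mm=1.0)
    print(f"gamma pass rate (gamma <= 1): {100.0 * np.mean(g <= 1.0):.1f}%")
    ```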

  16. Integrating knowledge-based techniques into well-test interpretation

    SciTech Connect

    Harrison, I.W.; Fraser, J.L.

    1995-04-01

    The goal of the Spirit Project was to develop a prototype of next-generation well-test-interpretation (WTI) software that would include knowledge-based decision support for the WTI model selection task. This paper describes how Spirit makes use of several different types of information (pressure, seismic, petrophysical, geological, and engineering) to support the user in identifying the most appropriate WTI model. Spirit's knowledge-based approach to type-curve matching is to generate several different feasible interpretations by making assumptions about the possible presence of both wellbore storage and late-time boundary effects. Spirit fuses information from type-curve matching and other data sources by use of a knowledge-based decision model developed in collaboration with a WTI expert. The sponsors of the work have judged the resulting prototype system a success.

  17. Assessing an AI knowledge-base for asymptomatic liver diseases.

    PubMed

    Babic, A; Mathiesen, U; Hedin, K; Bodemar, G; Wigertz, O

    1998-01-01

    Discovering previously unseen knowledge from clinical data is important in the field of asymptomatic liver diseases. Avoiding liver biopsy, which is used as the ultimate confirmation of diagnosis, by making the decision based on relevant laboratory findings alone would be an essential form of support. The system, based on Quinlan's ID3 algorithm, was simple and efficient in extracting the sought knowledge. Basic principles of applying AI systems are therefore described and complemented with a medical evaluation. Some of the diagnostic rules were found to be useful as decision algorithms, i.e., they could be directly applied in clinical work and made part of the knowledge base of the Liver Guide, an automated decision support system.
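
    ID3 grows a decision tree by repeatedly splitting on the attribute with the highest information gain (the largest reduction in class entropy). The sketch below shows that core calculation on a toy table of made-up laboratory findings; the attribute names and values are illustrative assumptions, not clinical data from the study.

    ```python
    import math
    from collections import Counter

    def entropy(labels):
        """Shannon entropy of a list of class labels."""
        total = len(labels)
        return -sum((n / total) * math.log2(n / total) for n in Counter(labels).values())

    def information_gain(rows, labels, attr_index):
        """Entropy reduction obtained by splitting on one attribute (ID3 criterion)."""
        base = entropy(labels)
        groups = {}
        for row, label in zip(rows, labels):
            groups.setdefault(row[attr_index], []).append(label)
        remainder = sum(len(g) / len(labels) * entropy(g) for g in groups.values())
        return base - remainder

    # Toy laboratory findings (attribute values are illustrative, not clinical data):
    # columns: [ALT level, AST level]; label: 'disease' / 'healthy'
    rows = [("high", "high"), ("high", "normal"), ("normal", "normal"),
            ("normal", "high"), ("normal", "normal"), ("high", "high")]
    labels = ["disease", "disease", "healthy", "healthy", "healthy", "disease"]

    for idx, name in enumerate(["ALT", "AST"]):
        print(f"information gain of {name}: {information_gain(rows, labels, idx):.3f}")
    ```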

  18. A Knowledge-Based Approach To Planning And Scheduling

    NASA Astrophysics Data System (ADS)

    Gilmore, John F.; Williams, D. Lamont; Thornton, Sheila

    1989-03-01

    Analyses of the shop scheduling domain indicate the objective of scheduling is the determination and satisfaction of a large number of diverse constraints. Many researchers have explored the possibilities of scheduling with the assistance of dispatching rules, algorithms, heuristics and knowledge-based systems. This paper describes the development of an experimental knowledge-based planning and scheduling system which marries traditional planning and scheduling algorithms with a knowledge-based problem solving methodology in an integrated blackboard architecture. This system embodies scheduling methods and techniques which attempt to minimize one or a combination of scheduling parameters including completion time, average completion time, lateness, tardiness, and flow time. Preliminary results utilizing a test case factory involved in part production are presented.
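
    The scheduling objectives listed above have standard definitions: for a job with completion time C, due date d and release time r, lateness is C - d, tardiness is max(0, C - d), and flow time is C - r. A minimal sketch, using an invented three-job sequence rather than the paper's test factory, is given below.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Job:
        name: str
        processing_time: float
        due_date: float
        release_time: float = 0.0

    def schedule_metrics(jobs):
        """Evaluate a single-machine sequence: completion, lateness, tardiness, flow."""
        t, rows = 0.0, []
        for job in jobs:                      # jobs processed in the given order
            t = max(t, job.release_time) + job.processing_time
            completion = t
            lateness = completion - job.due_date
            tardiness = max(0.0, lateness)
            flow_time = completion - job.release_time
            rows.append((job.name, completion, lateness, tardiness, flow_time))
        return rows

    # Illustrative sequence (not data from the paper)
    jobs = [Job("A", 3, 4), Job("B", 2, 9), Job("C", 4, 7)]
    rows = schedule_metrics(jobs)
    for name, c, l, t, f in rows:
        print(f"{name}: completion={c}, lateness={l}, tardiness={t}, flow={f}")
    print("average completion time:", sum(r[1] for r in rows) / len(rows))
    ```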

  19. The browser prototype for the CTBT knowledge base

    SciTech Connect

    Armstrong, H.M.; Keyser, R.G.

    1997-07-02

    As part of the United States Department of Energy's (DOE) Comprehensive Test Ban Treaty (CTBT) research and development effort, a Knowledge Base is being developed. This Knowledge Base will store the regional geophysical research results as well as geographic contextual information and make this information available to the Automated Data Processing (ADP) routines as well as the human analysts involved in CTBT monitoring. This paper focuses on the initial development of a browser prototype to be used to interactively examine the contents of the CTBT Knowledge Base. The browser prototype is intended to be a research tool to experiment with different ways to display and integrate the datasets. An initial prototype version has been developed using Environmental Systems Research Institute's (ESRI) ARC/INFO Geographic Information System (GIS) product. The conceptual requirements, design, initial implementation, current status, and future work plans are discussed. 4 refs., 2 figs.

  20. TVS: An Environment For Building Knowledge-Based Vision Systems

    NASA Astrophysics Data System (ADS)

    Weymouth, Terry E.; Amini, Amir A.; Tehrani, Saeid

    1989-03-01

    Advances in the field of knowledge-guided computer vision require the development of large scale projects and experimentation with them. One factor which impedes such development is the lack of software environments which combine standard image processing and graphics abilities with the ability to perform symbolic processing. In this paper, we describe a software environment that assists in the development of knowledge-based computer vision projects. We have built, upon Common LISP and C, a software development environment which combines standard image processing tools and a standard blackboard-based system, with the flexibility of the LISP programming environment. This environment has been used to develop research projects in knowledge-based computer vision and dynamic vision for robot navigation.

  1. Learning Science-Based Fitness Knowledge in Constructivist Physical Education

    PubMed Central

    Sun, Haichun; Chen, Ang; Zhu, Xihe; Ennis, Catherine D.

    2015-01-01

    Teaching fitness-related knowledge has become critical in developing children’s healthful living behavior. The purpose of this study was to examine the effects of a science-based, constructivist physical education curriculum on learning fitness knowledge critical to healthful living in elementary school students. The schools (N = 30) were randomly selected from one of the largest school districts in the United States and randomly assigned to treatment curriculum and control conditions. Students in third, fourth, and fifth grade (N = 5,717) were pre- and posttested on a standardized knowledge test on exercise principles and benefits in cardiorespiratory health, muscular capacity, and healthful nutrition and body flexibility. The results indicated that children in the treatment curriculum condition learned at a faster rate than their counterparts in the control condition. The results suggest that the constructivist curriculum is capable of inducing superior knowledge gain in third-, fourth-, and fifth-grade children. PMID:26269659

  2. Semantics-based plausible reasoning to extend the knowledge coverage of medical knowledge bases for improved clinical decision support.

    PubMed

    Mohammadhassanzadeh, Hossein; Van Woensel, William; Abidi, Samina Raza; Abidi, Syed Sibte Raza

    2017-01-01

    Capturing complete medical knowledge is challenging-often due to incomplete patient Electronic Health Records (EHR), but also because of valuable, tacit medical knowledge hidden away in physicians' experiences. To extend the coverage of incomplete medical knowledge-based systems beyond their deductive closure, and thus enhance their decision-support capabilities, we argue that innovative, multi-strategy reasoning approaches should be applied. In particular, plausible reasoning mechanisms apply patterns from human thought processes, such as generalization, similarity and interpolation, based on attributional, hierarchical, and relational knowledge. Plausible reasoning mechanisms include inductive reasoning, which generalizes the commonalities among the data to induce new rules, and analogical reasoning, which is guided by data similarities to infer new facts. By further leveraging rich, biomedical Semantic Web ontologies to represent medical knowledge, both known and tentative, we increase the accuracy and expressivity of plausible reasoning, and cope with issues such as data heterogeneity, inconsistency and interoperability. In this paper, we present a Semantic Web-based, multi-strategy reasoning approach, which integrates deductive and plausible reasoning and exploits Semantic Web technology to solve complex clinical decision support queries. We evaluated our system using a real-world medical dataset of patients with hepatitis, from which we randomly removed different percentages of data (5%, 10%, 15%, and 20%) to reflect scenarios with increasing amounts of incomplete medical knowledge. To increase the reliability of the results, we generated 5 independent datasets for each percentage of missing values, which resulted in 20 experimental datasets (in addition to the original dataset). The results show that plausibly inferred knowledge extends the coverage of the knowledge base by, on average, 2%, 7%, 12%, and 16% for datasets with, respectively, 5%, 10%, 15%, and

  3. A knowledge-based system for prototypical reasoning

    NASA Astrophysics Data System (ADS)

    Lieto, Antonio; Minieri, Andrea; Piana, Alberto; Radicioni, Daniele P.

    2015-04-01

    In this work we present a knowledge-based system equipped with a hybrid, cognitively inspired architecture for the representation of conceptual information. The proposed system aims at extending the classical representational and reasoning capabilities of the ontology-based frameworks towards the realm of the prototype theory. It is based on a hybrid knowledge base, composed of a classical symbolic component (grounded on a formal ontology) with a typicality based one (grounded on the conceptual spaces framework). The resulting system attempts to reconcile the heterogeneous approach to the concepts in Cognitive Science with the dual process theories of reasoning and rationality. The system has been experimentally assessed in a conceptual categorisation task where common sense linguistic descriptions were given in input, and the corresponding target concepts had to be identified. The results show that the proposed solution substantially extends the representational and reasoning 'conceptual' capabilities of standard ontology-based systems.

  4. Document Retrieval Using A Fuzzy Knowledge-Based System

    NASA Astrophysics Data System (ADS)

    Subramanian, Viswanath; Biswas, Gautam; Bezdek, James C.

    1986-03-01

    This paper presents the design and development of a prototype document retrieval system using a knowledge-based systems approach. Both the domain-specific knowledge base and the inferencing schemes are based on a fuzzy set theoretic framework. A query in natural language represents a request to retrieve a relevant subset of documents from a document base. Such a query, which can include both fuzzy terms and fuzzy relational operators, is converted into an unambiguous intermediate form by a natural language interface. Concepts that describe domain topics and the relationships between concepts, such as the synonym relation and the implication relation between a general concept and more specific concepts, have been captured in a knowledge base. The knowledge base enables the system to emulate the reasoning process followed by an expert, such as a librarian, in understanding and reformulating user queries. The retrieval mechanism processes the query in two steps. First it produces a pruned list of documents pertinent to the query. Second, it uses an evidence combination scheme to compute a degree of support between the query and individual documents produced in step one. The front-end component of the system then presents a set of document citations to the user in ranked order as an answer to the information request.
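
    A toy version of this kind of fuzzy retrieval can make the two-step process concrete. The sketch below is not the paper's system: the thesaurus strengths and document membership degrees are invented, query expansion follows the synonym/implication idea only loosely, and the degree of support is combined with simple min/max operators.

    ```python
    # Minimal fuzzy retrieval sketch (illustrative, not the paper's system).
    documents = {
        "doc1": {"expert systems": 0.9, "knowledge representation": 0.6},
        "doc2": {"databases": 0.8, "information retrieval": 0.7},
        "doc3": {"information retrieval": 0.9, "expert systems": 0.3},
    }

    # concept -> related concepts with implication strength (hypothetical values)
    thesaurus = {
        "artificial intelligence": {"expert systems": 0.8, "knowledge representation": 0.7},
    }

    def expand(query_concepts):
        """Add related concepts from the thesaurus, keeping the strongest strength."""
        expanded = {c: 1.0 for c in query_concepts}
        for c in query_concepts:
            for related, strength in thesaurus.get(c, {}).items():
                expanded[related] = max(expanded.get(related, 0.0), strength)
        return expanded

    def degree_of_support(doc_index, expanded_query):
        """Fuzzy evidence combination: max over concepts of min(query, document)."""
        return max((min(q, doc_index.get(c, 0.0)) for c, q in expanded_query.items()),
                   default=0.0)

    query = expand(["artificial intelligence", "information retrieval"])
    ranked = sorted(documents, key=lambda d: degree_of_support(documents[d], query),
                    reverse=True)
    for d in ranked:
        print(d, round(degree_of_support(documents[d], query), 2))
    ```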

  5. Frame-based knowledge representation for processing planning

    NASA Astrophysics Data System (ADS)

    Lindsay, K. J.

    An Expert System is being developed to perform generative process planning for individual parts fabricated from extruded and sheet metal materials, and for bonded metal assemblies. The system employs a frame-based knowledge representation structure and production rules to generate detailed fabrication and processing instructions. The system is being developed using the InterLISP-D language, commercially available expert system development software and a dedicated LISP machine. The paper describes the knowledge-based representation and reasoning techniques applied within the system and pertinent development issues.
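
    A frame couples named slots with values and lets more specific frames inherit defaults from more general ones. The sketch below is a minimal Python rendering of that idea, not the InterLISP-D system described above; the part/sheet-metal hierarchy and slot names are hypothetical.

    ```python
    class Frame:
        """Minimal frame: named slots with values, defaults inherited from parents."""
        def __init__(self, name, parent=None, **slots):
            self.name, self.parent, self.slots = name, parent, dict(slots)

        def get(self, slot):
            """Look up a slot, falling back to ancestor frames (default inheritance)."""
            frame = self
            while frame is not None:
                if slot in frame.slots:
                    return frame.slots[slot]
                frame = frame.parent
            raise KeyError(f"{self.name} has no slot '{slot}'")

    # Hypothetical process-planning frames, loosely modelled on the domain above
    part = Frame("part", material="aluminium", process=["cut", "deburr"])
    sheet_part = Frame("sheet-metal-part", parent=part,
                       process=["shear", "bend", "deburr"])
    bracket = Frame("bracket-123", parent=sheet_part, thickness_mm=1.6)

    print(bracket.get("material"))      # inherited default -> 'aluminium'
    print(bracket.get("process"))       # nearest override -> ['shear', 'bend', 'deburr']
    print(bracket.get("thickness_mm"))  # local slot -> 1.6
    ```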

  6. Verification of Geometric Model-Based Plant Phenotyping Methods for Studies of Xerophytic Plants

    PubMed Central

    Drapikowski, Paweł; Kazimierczak-Grygiel, Ewa; Korecki, Dominik; Wiland-Szymańska, Justyna

    2016-01-01

    This paper presents the results of verification of certain non-contact measurement methods of plant scanning to estimate morphological parameters such as length, width, area, volume of leaves and/or stems on the basis of computer models. The best results in reproducing the shape of scanned objects up to 50 cm in height were obtained with the structured-light DAVID Laserscanner. The optimal triangle mesh resolution for scanned surfaces was determined with the measurement error taken into account. The research suggests that measuring morphological parameters from computer models can supplement or even replace phenotyping with classic methods. Calculating precise values of area and volume makes determination of the S/V (surface/volume) ratio for cacti and other succulents possible, whereas for classic methods the result is an approximation only. In addition, the possibility of scanning and measuring plant species which differ in morphology was investigated. PMID:27355949
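
    The S/V ratio discussed above follows directly from a closed triangle mesh: the surface area is the sum of triangle areas, and the enclosed volume can be accumulated from signed tetrahedron volumes (the divergence theorem). The sketch below applies these generic formulas to a unit cube standing in for a scanned plant model; it is not the software used in the study.

    ```python
    import numpy as np

    def surface_area_and_volume(vertices, faces):
        """Surface area and enclosed volume of a closed, consistently oriented
        triangle mesh (signed tetrahedron volumes via the divergence theorem)."""
        v = np.asarray(vertices, dtype=float)
        area = volume = 0.0
        for i, j, k in faces:
            a, b, c = v[i], v[j], v[k]
            area += 0.5 * np.linalg.norm(np.cross(b - a, c - a))
            volume += np.dot(a, np.cross(b, c)) / 6.0
        return area, abs(volume)

    # Unit cube as a stand-in for a scanned plant model (12 triangles)
    verts = [(0,0,0),(1,0,0),(1,1,0),(0,1,0),(0,0,1),(1,0,1),(1,1,1),(0,1,1)]
    faces = [(0,2,1),(0,3,2),(4,5,6),(4,6,7),(0,1,5),(0,5,4),
             (1,2,6),(1,6,5),(2,3,7),(2,7,6),(3,0,4),(3,4,7)]
    area, vol = surface_area_and_volume(verts, faces)
    print(f"area={area:.2f}, volume={vol:.2f}, S/V ratio={area / vol:.2f}")
    ```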

  7. Verification of Geometric Model-Based Plant Phenotyping Methods for Studies of Xerophytic Plants.

    PubMed

    Drapikowski, Paweł; Kazimierczak-Grygiel, Ewa; Korecki, Dominik; Wiland-Szymańska, Justyna

    2016-06-27

    This paper presents the results of verification of certain non-contact measurement methods of plant scanning to estimate morphological parameters such as length, width, area, volume of leaves and/or stems on the basis of computer models. The best results in reproducing the shape of scanned objects up to 50 cm in height were obtained with the structured-light DAVID Laserscanner. The optimal triangle mesh resolution for scanned surfaces was determined with the measurement error taken into account. The research suggests that measuring morphological parameters from computer models can supplement or even replace phenotyping with classic methods. Calculating precise values of area and volume makes determination of the S/V (surface/volume) ratio for cacti and other succulents possible, whereas for classic methods the result is an approximation only. In addition, the possibility of scanning and measuring plant species which differ in morphology was investigated.

  8. A Tool for Verification and Validation of Neural Network Based Adaptive Controllers for High Assurance Systems

    NASA Technical Reports Server (NTRS)

    Gupta, Pramod; Schumann, Johann

    2004-01-01

    High reliability of mission- and safety-critical software systems has been identified by NASA as a high-priority technology challenge. We present an approach for the performance analysis of a neural network (NN) in an advanced adaptive control system. This problem is important in the context of safety-critical applications that require certification, such as flight software in aircraft. We have developed a tool to measure the performance of the NN during operation by calculating a confidence interval (error bar) around the NN's output. Our tool can be used during pre-deployment verification as well as monitoring the network performance during operation. The tool has been implemented in Simulink and simulation results on a F-15 aircraft are presented.
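
    The paper computes a confidence interval (error bar) around the network output; its specific method is not reproduced here. As a generic stand-in, the sketch below uses disagreement across a small bootstrap ensemble of fitted models to produce a run-time error bar. The data, the polynomial model family, and the idea of flagging large bars against a threshold are all illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic training data for a 1-D regression surrogate of the network
    x = rng.uniform(-1.0, 1.0, 200)
    y = np.sin(2.0 * x) + rng.normal(0.0, 0.1, x.size)

    # Train a small ensemble on bootstrap resamples (cubic polynomial fits
    # standing in for independently trained networks).
    ensemble = []
    for _ in range(20):
        idx = rng.integers(0, x.size, x.size)
        ensemble.append(np.polyfit(x[idx], y[idx], deg=3))

    def predict_with_error_bar(x_new):
        """Mean prediction and a 2-sigma error bar from ensemble disagreement."""
        preds = np.array([np.polyval(coeffs, x_new) for coeffs in ensemble])
        return preds.mean(), 2.0 * preds.std()

    mean, bar = predict_with_error_bar(0.3)
    print(f"output = {mean:.3f} +/- {bar:.3f}")
    # A monitor could flag the adaptive controller whenever the bar exceeds a
    # pre-set threshold, analogous to the pre-deployment/online checks above.
    ```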

  9. ECG-based PICC tip verification system: an evaluation 5 years on.

    PubMed

    Oliver, Gemma; Jones, Matt

    2016-10-27

    In 2011, the vascular access team at East Kent Hospitals University NHS Foundation Trust safely and successfully incorporated the use of electrocardiogram (ECG) guidance technology for verification of peripherally inserted central catheters (PICC) tip placement into their practice. This study, 5 years on, compared the strengths and limitations of using this ECG method with the previous gold-standard of post-procedural chest X-ray. The study was undertaken using an embedded case study approach, and the cost, accuracy and efficiency of both systems were evaluated and compared. Using ECG to confirm PICC tip position was found to be cheaper, quicker and more accurate than post-procedural chest X-ray.

  10. Perspective optical-electronic technologies for persons identification and verification on the bases of the fingerprints

    NASA Astrophysics Data System (ADS)

    Perju, Veacheslav L.; Casasent, David P.; Perju, Veacheslav V.; Saranciuc, Dorin I.

    2005-02-01

    Results are presented from investigations of correlation-based recognition of fingerprint images under different distortions: scale change, angular orientation change, reduction of the image surface, and the influence of noise. The possibilities of person identification and verification are examined. A method of fingerprint semi-spectrum recognition and a method of space-dependent fingerprint recognition are proposed and investigated. The structures of special-purpose mono-channel and multi-channel optical-electronic systems are presented, and the computing processes in these systems are described for the realization of the different fingerprint recognition algorithms: "FSR-1", "FSR-2", "FSDR-1", "FSDR-2", and "FICR". Results of the system investigations are also presented: fingerprint recognition time, system productivity at the fingerprint comparison step, and system prices.

  11. Problem-Oriented Corporate Knowledge Base Models on the Case-Based Reasoning Approach Basis

    NASA Astrophysics Data System (ADS)

    Gluhih, I. N.; Akhmadulin, R. K.

    2017-07-01

    One promising direction for improving the efficiency of production processes and enterprise management is the creation and use of corporate knowledge bases. The article proposes a concept of problem-oriented corporate knowledge bases (PO CKB), in which knowledge is arranged around possible problem situations and serves as a tool for making and implementing decisions in such situations. For knowledge representation in a PO CKB, a case-based reasoning approach is adopted. Under this approach, the content of a case as a knowledge base component has been defined; based on a situation tree, a PO CKB knowledge model has been developed in which knowledge about typical situations, as well as specific examples of situations and solutions, is represented. A generalized structural chart of a problem-oriented corporate knowledge base and possible modes of its operation are suggested. The resulting models allow corporate knowledge bases to be created and used for supporting decision making and implementation, training, staff skill upgrading, and analysis of the decisions taken. The universal interpretation of the terms “situation” and “solution” adopted in this work allows the suggested models to be used to develop problem-oriented corporate knowledge bases in different subject domains. It is suggested that the developed models be used to build corporate knowledge bases for enterprises that operate engineering systems and networks at large production facilities.
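
    In a case-based reasoning arrangement like the one proposed above, each case pairs a described problem situation with the solution that worked, and new situations are handled by retrieving the most similar stored case. The sketch below is a minimal illustration under that reading; the situation attributes and maintenance scenarios are hypothetical, not taken from the article.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Case:
        """A PO CKB-style case: a described problem situation plus its solution."""
        situation: dict      # attribute -> value describing the problem situation
        solution: str

    def similarity(query, situation):
        """Fraction of query attributes matched by the stored situation."""
        matches = sum(1 for k, v in query.items() if situation.get(k) == v)
        return matches / len(query)

    def retrieve(case_base, query, top_n=1):
        """Return the most similar stored cases for a new problem situation."""
        return sorted(case_base, key=lambda c: similarity(query, c.situation),
                      reverse=True)[:top_n]

    # Hypothetical cases for an engineering-network operator (illustrative only)
    case_base = [
        Case({"system": "pump", "symptom": "vibration", "load": "high"},
             "rebalance impeller and inspect bearings"),
        Case({"system": "pump", "symptom": "low pressure", "load": "normal"},
             "check suction line for blockage"),
        Case({"system": "compressor", "symptom": "vibration", "load": "high"},
             "inspect mounting and couplings"),
    ]

    new_situation = {"system": "pump", "symptom": "vibration", "load": "normal"}
    best = retrieve(case_base, new_situation)[0]
    print("suggested solution:", best.solution)
    ```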

  12. A knowledge-based information system for monitoring drug levels.

    PubMed

    Wiener, F; Groth, T; Mortimer, O; Hallquist, I; Rane, A

    1989-06-01

    The expert system shell SMR has been enhanced to include information system routines for designing data screens and providing facilities for data entry, storage, retrieval, queries and descriptive statistics. The data for inference making is abstracted from the data base record and inserted into a data array to which the knowledge base is applied to derive the appropriate advice and comments. The enhanced system has been used to develop an intelligent information system for monitoring serum drug levels which includes evaluation of temporal changes and production of specialized printed reports. The module for digoxin has been fully developed and validated. To demonstrate the extension to other drugs a module for phenytoin was constructed with only a rudimentary knowledge base. Data from the request forms together with the S-digoxin results are entered into the data base by the department secretary. The day's results are then reviewed by the clinical pharmacologist. For each case, previous results may be displayed and are taken into account by the system in the decision process. The knowledge base is applied to the data to formulate an evaluative comment on the report returned to the requestor. The report includes a semi-graphic presentation of the current and previous results and either the system's interpretation or one entered by the pharmacologist if he does not agree with it. The pharmacologist's comment is also recorded in the data base for future retrieval, analysis and possible updating of the knowledge base. The system is now undergoing testing and evaluation under routine operations in the clinical pharmacology service. It is a prototype for other applications in both laboratory and clinical medicine currently under development at Uppsala University Hospital. This system may thus provide a vehicle for a more intensive penetration of knowledge-based systems in practical medical applications.

  13. The Latent Structure of Secure Base Script Knowledge

    PubMed Central

    Waters, Theodore E. A.; Fraley, R. Chris; Groh, Ashley M.; Steele, Ryan D.; Vaughn, Brian E.; Bost, Kelly K.; Veríssimo, Manuela; Coppola, Gabrielle; Roisman, Glenn I.

    2015-01-01

    There is increasing evidence that attachment representations abstracted from childhood experiences with primary caregivers are organized as a cognitive script describing secure base use and support (i.e., the secure base script). To date, however, the latent structure of secure base script knowledge has gone unexamined—this despite the fact that such basic information about the factor structure and distributional properties of these individual differences has important conceptual implications for our understanding of how representations of early experience are organized and generalized, as well as methodological significance in relation to maximizing statistical power and precision. In this study, we report factor and taxometric analyses that examined the latent structure of secure base script knowledge in two large samples. Results suggested that variation in secure base script knowledge—as measured by both the adolescent (N = 674) and adult (N = 714) versions of the Attachment Script Assessment—is generalized across relationships and continuously distributed. PMID:25775111

  14. MetaShare: Enabling Knowledge-Based Data Management

    NASA Astrophysics Data System (ADS)

    Pennington, D. D.; Salayandia, L.; Gates, A.; Osuna, F.

    2013-12-01

    MetaShare is a free and open source knowledge-based system for supporting data management planning, now required by some agencies and publishers. MetaShare supports users as they describe the types of data they will collect, expected standards, and expected policies for sharing. MetaShare's semantic model captures relationships between disciplines, tools, data types, data formats, and metadata standards. As the user plans their data management activities, MetaShare recommends choices based on practices and decisions from a community that has used the system for similar purposes, and extends the knowledge base to capture new relationships. The MetaShare knowledge base is being seeded with information for geoscience and environmental science domains, and is currently undergoing testing at the University of Texas at El Paso. Through time and usage, it is expected to grow to support a variety of research domains, enabling community-based learning of data management practices. Knowledge of a user's choices during the planning phase can be used to support other tasks in the data life cycle, e.g., collecting, disseminating, and archiving data. A key barrier to scientific data sharing is the lack of sufficient metadata that provides context under which data were collected. The next phase of MetaShare development will automatically generate data collection instruments with embedded metadata and semantic annotations based on the information provided during the planning phase. While not comprehensive, this metadata will be sufficient for discovery and will enable users to focus on more detailed descriptions of their data. Details are available at: Salayandia, L., Pennington, D., Gates, A., and Osuna, F. (accepted). MetaShare: From data management plans to knowledge base systems. AAAI Fall Symposium Series Workshop on Discovery Informatics, November 15-17, 2013, Arlington, VA.

  15. A knowledge-based multiple-sequence alignment algorithm.

    PubMed

    Nguyen, Ken D; Pan, Yi

    2013-01-01

    A common and cost-effective mechanism to identify the functionalities, structures, or relationships between species is multiple-sequence alignment, in which DNA/RNA/protein sequences are arranged and aligned so that similarities between sequences are clustered together. Correctly identifying and aligning these biological sequence similarities helps with everything from unwinding the mystery of species evolution to drug design. We present our knowledge-based multiple sequence alignment (KB-MSA) technique that utilizes existing knowledge databases such as SWISSPROT, GENBANK, or HOMSTRAD to provide a more realistic and reliable sequence alignment. We also provide a modified version of this algorithm (CB-MSA) that utilizes the sequence consistency information when sequence knowledge databases are not available. Our benchmark tests on BAliBASE, PREFAB, HOMSTRAD, and SABMARK references show accuracy improvements of up to 10 percent on twilight data sets against many leading alignment tools such as ISPALIGN, PADT, CLUSTALW, MAFFT, PROBCONS, and T-COFFEE.
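
    The knowledge-guided scoring of KB-MSA is not reproduced here, but the dynamic-programming core that alignment tools build on can be shown compactly. The sketch below is the classical Needleman-Wunsch global alignment of two short sequences, with arbitrary match/mismatch/gap scores chosen for illustration.

    ```python
    def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-2):
        """Global pairwise alignment by dynamic programming (Needleman-Wunsch)."""
        n, m = len(a), len(b)
        # score[i][j] = best score aligning a[:i] with b[:j]
        score = [[0] * (m + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            score[i][0] = i * gap
        for j in range(1, m + 1):
            score[0][j] = j * gap
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
        # Traceback to recover one optimal alignment
        out_a, out_b, i, j = [], [], n, m
        while i > 0 or j > 0:
            if i > 0 and j > 0 and score[i][j] == score[i - 1][j - 1] + (
                    match if a[i - 1] == b[j - 1] else mismatch):
                out_a.append(a[i - 1]); out_b.append(b[j - 1]); i -= 1; j -= 1
            elif i > 0 and score[i][j] == score[i - 1][j] + gap:
                out_a.append(a[i - 1]); out_b.append("-"); i -= 1
            else:
                out_a.append("-"); out_b.append(b[j - 1]); j -= 1
        return "".join(reversed(out_a)), "".join(reversed(out_b)), score[n][m]

    print(needleman_wunsch("GATTACA", "GCATGCU"))
    ```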

  16. Monte Carlo based verification of a beam model used in a treatment planning system

    NASA Astrophysics Data System (ADS)

    Wieslander, E.; Knöös, T.

    2008-02-01

    Modern treatment planning systems (TPSs) usually separate the dose modelling into a beam modelling phase, describing the beam exiting the accelerator, followed by a subsequent dose calculation in the patient. The aim of this work is to use the Monte Carlo code system EGSnrc to study the modelling of head scatter as well as the transmission through multi-leaf collimator (MLC) and diaphragms in the beam model used in a commercial TPS (MasterPlan, Nucletron B.V.). An Elekta Precise linear accelerator equipped with an MLC has been modelled in BEAMnrc, based on available information from the vendor regarding the material and geometry of the treatment head. The collimation in the MLC direction consists of leafs which are complemented with a backup diaphragm. The characteristics of the electron beam, i.e., energy and spot size, impinging on the target have been tuned to match measured data. Phase spaces from simulations of the treatment head are used to extract the scatter from, e.g., the flattening filter and the collimating structures. Similar data for the source models used in the TPS are extracted from the treatment planning system, thus a comprehensive analysis is possible. Simulations in a water phantom, with DOSXYZnrc, are also used to study the modelling of the MLC and the diaphragms by the TPS. The results from this study will be helpful to understand the limitations of the model in the TPS and provide knowledge for further improvements of the TPS source modelling.

  17. Knowledge-based GIS techniques applied to geological engineering

    USGS Publications Warehouse

    Usery, E. Lynn; Altheide, Phyllis; Deister, Robin R.P.; Barr, David J.

    1988-01-01

    A knowledge-based geographic information system (KBGIS) approach which requires development of a rule base for both GIS processing and for the geological engineering application has been implemented. The rule bases are implemented in the Goldworks expert system development shell interfaced to the Earth Resources Data Analysis System (ERDAS) raster-based GIS for input and output. GIS analysis procedures including recoding, intersection, and union are controlled by the rule base, and the geological engineering map product is generated by the expert system. The KBGIS has been used to generate a geological engineering map of Creve Coeur, Missouri.

  18. Elicitation of neurological knowledge with argument-based machine learning.

    PubMed

    Groznik, Vida; Guid, Matej; Sadikov, Aleksander; Možina, Martin; Georgiev, Dejan; Kragelj, Veronika; Ribarič, Samo; Pirtošek, Zvezdan; Bratko, Ivan

    2013-02-01

    The paper describes the use of expert's knowledge in practice and the efficiency of a recently developed technique called argument-based machine learning (ABML) in the knowledge elicitation process. We are developing a neurological decision support system to help the neurologists differentiate between three types of tremors: Parkinsonian, essential, and mixed tremor (comorbidity). The system is intended to act as a second opinion for the neurologists, and most importantly to help them reduce the number of patients in the "gray area" that require a very costly further examination (DaTSCAN). We strive to elicit comprehensible and medically meaningful knowledge in such a way that it does not come at the cost of diagnostic accuracy. To alleviate the difficult problem of knowledge elicitation from data and domain experts, we used ABML. ABML guides the expert to explain critical special cases which cannot be handled automatically by machine learning. This very efficiently reduces the expert's workload, and combines expert's knowledge with learning data. 122 patients were enrolled into the study. The classification accuracy of the final model was 91%. Equally important, the initial and the final models were also evaluated for their comprehensibility by the neurologists. All 13 rules of the final model were deemed as appropriate to be able to support its decisions with good explanations. The paper demonstrates ABML's advantage in combining machine learning and expert knowledge. The accuracy of the system is very high with respect to the current state-of-the-art in clinical practice, and the system's knowledge base is assessed to be very consistent from a medical point of view. This opens up the possibility to use the system also as a teaching tool. Copyright © 2012 Elsevier B.V. All rights reserved.

  19. Apprenticeship learning techniques for knowledge-based systems

    SciTech Connect

    Wilkins, D.C.

    1987-01-01

    This thesis describes apprenticeship learning techniques for automation of the transfer of expertise. Apprenticeship learning is a form of learning by watching, in which learning occurs as a byproduct of building explanations of human problem-solving actions. Because apprenticeship is the most powerful method that human experts use to refine and debug their expertise in knowledge-intensive domains such as medicine, it is natural to give such capabilities to an expert system. The major accomplishment in this thesis is showing how an explicit representation of the strategy knowledge to solve a general problem class, such as diagnosis, can provide a basis for learning the knowledge that is specific to a particular domain, such as medicine. The Odysseus learning program provides the first demonstration of using the same technique for transfer of expertise to and from an expert system knowledge base. Another major focus of this thesis is the limitations of apprenticeship learning. It is shown that extant techniques for reasoning under uncertainty for expert systems lead to a sociopathic knowledge base.

  20. Knowledge-based simulation using object-oriented programming

    NASA Technical Reports Server (NTRS)

    Sidoran, Karen M.

    1993-01-01

    Simulations have become a powerful mechanism for understanding and modeling complex phenomena. Their results have had substantial impact on a broad range of decisions in the military, government, and industry. Because of this, new techniques are continually being explored and developed to make them even more useful, understandable, extendable, and efficient. One such area of research is the application of the knowledge-based methods of artificial intelligence (AI) to the computer simulation field. The goal of knowledge-based simulation is to facilitate building simulations of greatly increased power and comprehensibility by making use of deeper knowledge about the behavior of the simulated world. One technique for representing and manipulating knowledge that has been enhanced by the AI community is object-oriented programming. Using this technique, the entities of a discrete-event simulation can be viewed as objects in an object-oriented formulation. Knowledge can be factual (i.e., attributes of an entity) or behavioral (i.e., how the entity is to behave in certain circumstances). Rome Laboratory's Advanced Simulation Environment (RASE) was developed as a research vehicle to provide an enhanced simulation development environment for building more intelligent, interactive, flexible, and realistic simulations. This capability will support current and future battle management research and provide a test of the object-oriented paradigm for use in large scale military applications.
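
    In the object-oriented formulation described above, an entity's factual knowledge maps naturally onto attributes and its behavioural knowledge onto methods, with a discrete-event engine dispatching time-stamped events. The sketch below is a generic illustration of that pattern, not RASE; the entity, its attributes and the event times are invented.

    ```python
    import heapq

    class Tank:
        """Simulation entity: factual knowledge as attributes, behaviour as methods."""
        def __init__(self, name, speed_kmh):
            self.name = name              # factual knowledge (attributes)
            self.speed_kmh = speed_kmh
            self.position_km = 0.0

        def move(self, sim, hours):       # behavioural knowledge (methods)
            self.position_km += self.speed_kmh * hours
            sim.schedule(sim.now + hours, self.report)   # report when the move ends

        def report(self, sim):
            print(f"t={sim.now:4.1f}h {self.name} at {self.position_km:.1f} km")

    class Simulation:
        """Minimal discrete-event engine driven by a time-ordered event queue."""
        def __init__(self):
            self.now, self._queue, self._seq = 0.0, [], 0

        def schedule(self, time, action):
            heapq.heappush(self._queue, (time, self._seq, action))
            self._seq += 1                # tie-breaker keeps ordering stable

        def run(self):
            while self._queue:
                self.now, _, action = heapq.heappop(self._queue)
                action(self)

    sim = Simulation()
    tank = Tank("Red-1", speed_kmh=40.0)
    sim.schedule(0.0, lambda s: tank.move(s, hours=2.0))
    sim.schedule(2.5, lambda s: tank.move(s, hours=1.5))
    sim.run()
    ```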

  1. Knowledge base image classification using P-trees

    NASA Astrophysics Data System (ADS)

    Seetha, M.; Ravi, G.

    2010-02-01

    Image classification is the process of assigning classes to the pixels in remotely sensed images and is important for GIS applications, since the classified image is much easier to incorporate than the original unclassified image. To resolve misclassification in traditional parametric classifiers such as the Maximum Likelihood Classifier, a neural network classifier is implemented using the back propagation algorithm. Extra spectral and spatial knowledge acquired from ancillary information is required to improve the accuracy and remove spectral confusion. To build the knowledge base automatically, this paper explores a non-parametric decision tree classifier to extract knowledge from the spatial data in the form of classification rules. A new method is proposed using a data structure called the Peano Count Tree (P-tree) for decision tree classification. The Peano Count Tree is a spatial data organization that provides a lossless compressed representation of a spatial data set and facilitates more efficient classification than other data mining techniques. The accuracy is assessed using overall accuracy, user's accuracy, and producer's accuracy for the image classification methods of Maximum Likelihood Classification, neural network classification using back propagation, knowledge base classification, post-classification, and the P-tree classifier. The results reveal that the knowledge extracted from the decision tree classifier and the P-tree data structure in the proposed approach removes the problem of spectral confusion to a great extent. It is ascertained that the P-tree classifier surpasses the other classification techniques.
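
    The sketch below gives a simplified reading of the Peano Count Tree idea: each node stores the count of 1-bits in its quadrant, and quadrants that are entirely 0 or entirely 1 are kept as compressed leaves instead of being subdivided. It is an illustration of the concept only, not the authors' implementation, and the 8x8 bit-band is invented.

    ```python
    class PTree:
        """Simplified Peano Count Tree: per-quadrant counts of 1-bits, with pure
        (all-0 or all-1) quadrants stored as leaves instead of being subdivided."""
        def __init__(self, bits, r, c, size):
            self.count = sum(bits[i][j] for i in range(r, r + size)
                                         for j in range(c, c + size))
            pure = self.count in (0, size * size)
            if pure or size == 1:
                self.children = None                   # compressed pure quadrant
            else:
                h = size // 2
                self.children = [PTree(bits, r + dr, c + dc, h)
                                 for dr in (0, h) for dc in (0, h)]

        def depth(self):
            return 1 if self.children is None else 1 + max(ch.depth() for ch in self.children)

    # A binary bit-band of a tiny 8x8 image (e.g. one bit of one spectral band)
    bits = []
    for i in range(8):
        if i < 2:
            bits.append([1] * 8)              # fully-on rows
        elif i < 4:
            bits.append([1] * 4 + [0] * 4)    # mixed rows
        else:
            bits.append([0] * 8)              # fully-off rows

    tree = PTree(bits, 0, 0, 8)
    print("root 1-bit count:", tree.count)    # 2*8 + 2*4 = 24
    print("tree depth:", tree.depth())        # shallow, thanks to pure-quadrant compression
    ```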

  2. After the Crash: Research-Based Theater for Knowledge Transfer

    ERIC Educational Resources Information Center

    Colantonio, Angela; Kontos, Pia C.; Gilbert, Julie E.; Rossiter, Kate; Gray, Julia; Keightley, Michelle L.

    2008-01-01

    Introduction: The aim of this project was to develop and evaluate a research-based dramatic production for the purpose of transferring knowledge about traumatic brain injury (TBI) to health care professionals, managers, and decision makers. Methods: Using results drawn from six focus group discussions with key stakeholders (consumers, informal…

  3. Ada as an implementation language for knowledge based systems

    NASA Technical Reports Server (NTRS)

    Rochowiak, Daniel

    1990-01-01

    Debates about the selection of programming languages often produce cultural collisions that are not easily resolved. This is especially true in the case of Ada and knowledge based programming. The construction of programming tools provides a desirable alternative for resolving the conflict.

  4. Towards an Intelligent Planning Knowledge Base Development Environment

    NASA Technical Reports Server (NTRS)

    Chien, S.

    1994-01-01

    This abstract describes work in developing knowledge base editing and debugging tools for the Multimission VICAR Planner (MVP) system. MVP uses artificial intelligence planning techniques to automatically construct executable complex image processing procedures (using models of the smaller constituent image processing requests made to the JPL Multimission Image Processing Laboratory).

  5. SCU at TREC 2014 Knowledge Base Acceleration Track

    DTIC Science & Technology

    2014-11-01

    SCU at TREC 2014 Knowledge Base Acceleration Track. Hung Nguyen, Yi Fang. Department of Computer Engineering, Santa Clara University, 500 El Camino Real, Santa Clara, CA 95053.

  6. Development of a Knowledge Base for Incorporating Technology into Courses

    ERIC Educational Resources Information Center

    Rath, Logan

    2013-01-01

    This article discusses a project resulting from the request of a group of faculty at The College at Brockport to create a website for best practices in teaching and technology. The project evolved into a knowledge base powered by WordPress. Installation and configuration of WordPress resulted in the creation of custom taxonomies and post types,…

  7. Value Creation in the Knowledge-Based Economy

    ERIC Educational Resources Information Center

    Liu, Fang-Chun

    2013-01-01

    Effective investment strategies help companies form dynamic core organizational capabilities allowing them to adapt and survive in today's rapidly changing knowledge-based economy. This dissertation investigates three valuation issues that challenge managers with respect to developing business-critical investment strategies that can have…

  8. Language-Based Prior Knowledge and Transition to Mathematics

    ERIC Educational Resources Information Center

    Dogan-Dunlap, Hamide; Torres, Cristina; Chen, Fan

    2005-01-01

    The paper provides a college mathematics student's concept maps, definitions, and essays to support the thesis that language-based prior knowledge can influence students' cognitive processes of mathematical concepts. A group of intermediate algebra students who displayed terms mainly from the spoken language on the first and the second concept…

  9. Enhancing Acronym/Abbreviation Knowledge Bases with Semantic Information

    PubMed Central

    Torii, Manabu; Liu, Hongfang

    2007-01-01

    Objective: In the biomedical domain, a terminology knowledge base that associates acronyms/abbreviations (denoted as SFs) with their definitions (denoted as LFs) is highly needed. For the construction of such a terminology knowledge base, we investigate the feasibility of building a system that automatically assigns semantic categories to LFs extracted from text. Methods: Given a collection of pairs (SF,LF) derived from text, we i) assess the coverage of LFs and pairs (SF,LF) in the UMLS and justify the need for a semantic category assignment system; and ii) automatically derive name phrases annotated with semantic categories and construct a system using machine learning. Results: Utilizing ADAM, an existing collection of (SF,LF) pairs extracted from MEDLINE, our system achieved an f-measure of 87% when assigning eight UMLS-based semantic groups to LFs. The system has been incorporated into a web interface which integrates SF knowledge from multiple SF knowledge bases. Web site: http://gauss.dbb.georgetown.edu/liblab/SFThesurus. PMID:18693933

  10. Knowledge Based Engineering for Spatial Database Management and Use

    NASA Technical Reports Server (NTRS)

    Peuquet, D. (Principal Investigator)

    1984-01-01

    The use of artificial intelligence techniques that are applicable to Geographic Information Systems (GIS) is examined. Questions involving the performance and modification of the database structure, the definition of spectra in quadtree structures and their use in search heuristics, extension of the knowledge base, and learning algorithm concepts are investigated.

  11. Toffler's Powershift: Creating New Knowledge Bases in Higher Education.

    ERIC Educational Resources Information Center

    Powers, Patrick James

    This paper examines the creation of new knowledge bases in higher education in light of the ideas of Alvin Toffler, whose trilogy "Future Shock" (1970), "The Third Wave" (1980), and "Powershift" (1990) focus on the processes, directions, and control of change, respectively. It discusses the increasingly important role…

  12. Spinning Fantasy: Themes, Structure, and the Knowledge Base.

    ERIC Educational Resources Information Center

    Lucariello, Joan

    1987-01-01

    Investigated the influence of the child's knowledge base on symbolic play in terms of event schemas. Pretend play of 10 mother-child (ages 24 to 29 months) dyads was observed in novel and free play contexts. Play was examined for thematic content, self-other relations, substitute/imaginary objects, action integration, and planfulness. (Author/BN)

  13. Designing a Knowledge Base for Automatic Book Classification.

    ERIC Educational Resources Information Center

    Kim, Jeong-Hyen; Lee, Kyung-Ho

    2002-01-01

    Reports on the design of a knowledge base for an automatic classification in the library science field by using the facet classification principles of colon classification. Discusses inputting titles or key words into the computer to create class numbers through automatic subject recognition and processing title key words. (Author/LRW)

  14. PLAN-IT - Knowledge-based mission sequencing

    NASA Technical Reports Server (NTRS)

    Biefeld, Eric W.

    1987-01-01

    PLAN-IT (Plan-Integrated Timelines), a knowledge-based approach to assist in mission sequencing, is discussed. PLAN-IT uses a large set of scheduling techniques known as strategies to develop and maintain a mission sequence. The approach implemented by PLAN-IT and the current applications of PLAN-IT for sequencing at NASA are reported.

  15. CACTUS: Command and Control Training Using Knowledge-Based Simulations

    ERIC Educational Resources Information Center

    Hartley, Roger; Ravenscroft, Andrew; Williams, R. J.

    2008-01-01

    The CACTUS project was concerned with command and control training of large incidents where public order may be at risk, such as large demonstrations and marches. The training requirements and objectives of the project are first summarized justifying the use of knowledge-based computer methods to support and extend conventional training…

  16. Planning and Implementing a High Performance Knowledge Base.

    ERIC Educational Resources Information Center

    Cortez, Edwin M.

    1999-01-01

    Discusses the conceptual framework for developing a rapid-prototype high-performance knowledge base for the four mission agencies of the United States Department of Agriculture and their university partners. Describes the background of the project and methods used for establishing the requirements; examines issues and problems surrounding semantic…

  17. The Pedagogical Knowledge Base of Four TESOL Teachers

    ERIC Educational Resources Information Center

    Mullock, Barbara

    2006-01-01

    Many researchers have called for a broadening of the theoretical base of language teacher development programs to include gathering information not only on what teachers do in the classroom, but also on what they know, and how this knowledge is transferred to their teaching behavior, especially as they gain more experience in the classroom.…