Science.gov

Sample records for lcg mcdb-a knowledgebase

  1. LCG MCDB—a knowledgebase of Monte-Carlo simulated events

    NASA Astrophysics Data System (ADS)

    Belov, S.; Dudko, L.; Galkin, E.; Gusev, A.; Pokorski, W.; Sherstnev, A.

    2008-02-01

    In this paper we report on the LCG Monte-Carlo Data Base (MCDB) and the software which has been developed to operate MCDB. The main purpose of the LCG MCDB project is to provide a storage and documentation system for sophisticated event samples simulated for the LHC Collaborations by experts. In many cases, modern Monte-Carlo simulation of physical processes requires expert knowledge of Monte-Carlo generators or a significant amount of CPU time to produce the events. MCDB is a knowledgebase mainly dedicated to accumulating simulated events of this type. The main motivation behind LCG MCDB is to make sophisticated MC event samples available to various physics groups. All the data in MCDB is accessible in several convenient ways. LCG MCDB is being developed within the CERN LCG Application Area Simulation project. Program summary: Program title: LCG Monte-Carlo Data Base. Catalogue identifier: ADZX_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZX_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public Licence. No. of lines in distributed program, including test data, etc.: 30 129. No. of bytes in distributed program, including test data, etc.: 216 943. Distribution format: tar.gz. Programming language: Perl. Computer: CPU: Intel Pentium 4, RAM: 1 Gb, HDD: 100 Gb. Operating system: Scientific Linux CERN 3/4. RAM: 1 073 741 824 bytes (1 Gb). Classification: 9. External routines: perl >= 5.8.5; Perl modules DBD-mysql >= 2.9004, File::Basename, GD::SecurityImage, GD::SecurityImage::AC, Linux::Statistics, XML::LibXML > 1.6, XML::SAX, XML::NamespaceSupport; Apache HTTP Server >= 2.0.59; mod_auth_external >= 2.2.9; edg-utils-system RPM package; gd >= 2.0.28; rpm package CASTOR-client >= 2.1.2-4; arc-server (optional). Nature of problem: Often, different groups of experimentalists prepare similar samples of particle collision events or turn to the same group of authors of Monte-Carlo (MC
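
    The program summary lists XML::LibXML among the dependencies, so sample documentation is evidently stored as XML records. A minimal Python sketch of how a client might parse such a record follows; the element and attribute names are hypothetical, not the real MCDB schema.

      # Minimal sketch: parsing a hypothetical MCDB-style XML sample description.
      # The record layout below is illustrative only, not the actual MCDB schema.
      import xml.etree.ElementTree as ET

      record = """
      <sample id="1234">
        <title>ttbar + jets, 14 TeV</title>
        <generator name="MadGraph" version="4.1"/>
        <process>pp -&gt; t tbar</process>
        <events number="100000"/>
        <file url="castor:/castor/cern.ch/mcdb/1234/events.lhe"/>
      </sample>
      """

      root = ET.fromstring(record)
      print("Sample:", root.get("id"), "-", root.findtext("title"))
      print("Generator:", root.find("generator").get("name"),
            root.find("generator").get("version"))
      print("Events:", root.find("events").get("number"))
      for f in root.findall("file"):
          print("Location:", f.get("url"))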

  2. LcgCAF: CDF access method to LCG resources

    NASA Astrophysics Data System (ADS)

    Compostella, Gabriele; Bauce, Matteo; Pagan Griso, Simone; Lucchesi, Donatella; Sgaravatto, Massimo; Cecchi, Marco

    2011-12-01

    Up to early 2011, the CDF collaboration has collected more than 8 fb-1 of data from pbar-p collisions at a center-of-mass energy of 1.96 TeV delivered by the Tevatron collider at Fermilab. Second-generation physics measurements, like precision determinations of top-quark properties or searches for the Standard Model Higgs boson, require increasing computing power for data analysis and event simulation. Instead of expanding its set of dedicated Condor-based analysis farms, CDF moved to Grid resources. While in the context of OSG this transition was performed using Condor glideins, keeping the CDF custom middleware software almost intact, in LCG a complete rewrite of the experiment's submission and monitoring tools was realized, taking full advantage of the features offered by the gLite Workload Management System (WMS). This led to the development of a new computing facility called LcgCAF that CDF collaborators are using to exploit Grid resources in Europe in a transparent way. Given the opportunistic usage of the available resources, it is of crucial importance for CDF to maximize job efficiency from submission to output retrieval. This work describes how an experimental resubmission feature implemented in the WMS was tested in LcgCAF with the aim of lowering the overall execution time of a typical CDF job.
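
    The key idea here, automatic resubmission of failed jobs to reduce overall execution time, can be illustrated with a short Python sketch. The submit_job and wait_for_job functions are placeholders standing in for the real submission layer; they are not gLite WMS APIs.

      # Minimal sketch of automatic job resubmission with a retry cap.
      # submit_job / wait_for_job are placeholders for a real submission layer.
      import random

      random.seed(0)  # deterministic behaviour for this sketch

      def submit_job(payload):
          return {"payload": payload, "id": random.randint(1, 10**6)}

      def wait_for_job(job):
          # Pretend ~30% of attempts fail for transient Grid reasons.
          return "DONE" if random.random() > 0.3 else "ABORTED"

      def run_with_resubmission(payload, max_retries=3):
          for attempt in range(1, max_retries + 1):
              job = submit_job(payload)
              status = wait_for_job(job)
              print(f"attempt {attempt}: job {job['id']} -> {status}")
              if status == "DONE":
                  return job
          raise RuntimeError(f"job failed after {max_retries} attempts")

      run_with_resubmission("cdf_ntuple_analysis.tgz")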

  3. WHALE, a management tool for Tier-2 LCG sites

    NASA Astrophysics Data System (ADS)

    Barone, L. M.; Organtini, G.; Talamo, I. G.

    2012-12-01

    The LCG (Worldwide LHC Computing Grid) is a grid-based, hierarchically distributed computing facility, composed of more than 140 computing centers organized in 4 tiers by size and offer of services. Every site, although independent in many technical choices, has to provide services with a well-defined set of interfaces. For this reason, different LCG sites frequently need to manage very similar situations, like job behaviour on the batch system, dataset transfers between sites, operating system and experiment software installation and configuration, and monitoring of services. In this context we created WHALE (WHALE Handles Administration in an LCG Environment), a software tool currently used at the T2_IT_Rome site, an LCG Tier-2 for the CMS experiment. WHALE is a generic, site-independent tool written in Python: it allows administrators to interact in a uniform and coherent way with several subsystems using a high-level syntax which hides the specific commands. The architecture of WHALE is based on the plugin concept and on the possibility of connecting the output of one plugin to the input of the next, in a pipe-like system, giving the administrator the possibility of building complex functions by combining simpler ones. The core of WHALE just handles the plugin orchestration, while even the basic functions (e.g. the WHALE activity logging) are performed by plugins, giving the capability to tune and possibly modify every component of the system. WHALE already provides many plugins useful for an LCG site and some more for a Tier-2 of the CMS experiment, especially in the fields of job management, dataset transfer and analysis of performance results and availability tests (e.g. Nagios tests, SAM tests). Thanks to its architecture and the provided plugins, WHALE makes it easy to perform tasks that, even if logically simple, are technically complex or tedious, such as closing all the worker nodes with a job-failure rate greater than a given threshold. Finally, thanks to the
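
    The pipe-like chaining of plugins described above can be sketched in a few lines of Python. This is only an illustration of the design idea, using the abstract's own example of closing worker nodes above a failure-rate threshold; the plugin names and data are invented, not the real WHALE code.

      # Minimal sketch of a WHALE-style plugin pipeline: the output of each
      # plugin feeds the input of the next. Plugin names are illustrative.
      class Plugin:
          def run(self, data):
              raise NotImplementedError

      class ListWorkerNodes(Plugin):
          def run(self, _):
              # Would normally query the batch system; here a static sample.
              return [{"node": "wn-01", "failure_rate": 0.02},
                      {"node": "wn-07", "failure_rate": 0.35}]

      class FilterByFailureRate(Plugin):
          def __init__(self, threshold):
              self.threshold = threshold
          def run(self, nodes):
              return [n for n in nodes if n["failure_rate"] > self.threshold]

      class CloseNodes(Plugin):
          def run(self, nodes):
              for n in nodes:
                  print("closing", n["node"])   # would call the batch system
              return nodes

      def pipeline(plugins, data=None):
          for p in plugins:
              data = p.run(data)
          return data

      pipeline([ListWorkerNodes(), FilterByFailureRate(0.2), CloseNodes()])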

  4. The Knowledgebase Kibbutz

    ERIC Educational Resources Information Center

    Singer, Ross

    2008-01-01

    As libraries' collections increasingly go digital, so too does their dependence on knowledgebases to access and maintain these electronic holdings. Somewhat different from other library-based knowledge management systems (catalogs, institutional repositories, etc.), the data found in the knowledgebases of link resolvers or electronic resource…

  5. Space Environmental Effects Knowledgebase

    NASA Technical Reports Server (NTRS)

    Wood, B. E.

    2007-01-01

    This report describes the results of an NRA-funded program entitled Space Environmental Effects Knowledgebase that received funding through a NASA NRA (NRA8-31) and was monitored by personnel in the NASA Space Environmental Effects (SEE) Program. The NASA project number was 02029. The Satellite Contamination and Materials Outgassing Knowledgebase (SCMOK) was created as a part of the earlier NRA8-20. One of the previous tasks, and part of the previously developed knowledgebase, was to accumulate data from facilities using QCMs to measure outgassing data for satellite materials. The main objective of the current program was to increase the number of material outgassing datasets from 250 up to approximately 500. As part of this effort, a round-robin series of materials outgassing measurements was also executed that allowed comparison of the results for the same materials tested in 10 different test facilities. Other program tasks included obtaining datasets or information packages for 1) optical effects of contaminants on optical surfaces, thermal radiators, and sensor systems and 2) space environmental effects data, and incorporating these data into the already existing NASA/SEE Knowledgebase.

  6. The Reactome pathway Knowledgebase

    PubMed Central

    Fabregat, Antonio; Sidiropoulos, Konstantinos; Garapati, Phani; Gillespie, Marc; Hausmann, Kerstin; Haw, Robin; Jassal, Bijay; Jupe, Steven; Korninger, Florian; McKay, Sheldon; Matthews, Lisa; May, Bruce; Milacic, Marija; Rothfels, Karen; Shamovsky, Veronica; Webber, Marissa; Weiser, Joel; Williams, Mark; Wu, Guanming; Stein, Lincoln; Hermjakob, Henning; D'Eustachio, Peter

    2016-01-01

    The Reactome Knowledgebase (www.reactome.org) provides molecular details of signal transduction, transport, DNA replication, metabolism and other cellular processes as an ordered network of molecular transformations—an extended version of a classic metabolic map, in a single consistent data model. Reactome functions both as an archive of biological processes and as a tool for discovering unexpected functional relationships in data such as gene expression pattern surveys or somatic mutation catalogues from tumour cells. Over the last two years we redeveloped major components of the Reactome web interface to improve usability, responsiveness and data visualization. A new pathway diagram viewer provides a faster, clearer interface and smooth zooming from the entire reaction network to the details of individual reactions. Tool performance for analysis of user datasets has been substantially improved, now generating detailed results for genome-wide expression datasets within seconds. The analysis module can now be accessed through a RESTful interface, facilitating its inclusion in third-party applications. A new overview module allows the visualization of analysis results on a genome-wide Reactome pathway hierarchy using a single screen page. The search interface now provides auto-completion as well as a faceted search to narrow result lists efficiently. PMID:26656494

  7. ECOTOX knowledgebase: Search features and customized reports

    EPA Science Inventory

    The ECOTOXicology knowledgebase (ECOTOX) is a comprehensive, publicly available knowledgebase developed and maintained by ORD/NHEERL. It provides environmental toxicity data on aquatic life, terrestrial plants and wildlife. ECOTOX has the capability to refine and filter search...

  8. Knowledge-Based Image Analysis.

    DTIC Science & Technology

    1981-04-01

    Knowledge-Based Image Analysis. Report ETL-0258. Authors: George C. Stockman, Barbara A. Lambird, David Lavine, Laveen N. Kanal. Keywords: extraction, verification, region classification, pattern recognition, image analysis.

  9. Knowledge-based media adaptation

    NASA Astrophysics Data System (ADS)

    Leopold, Klaus; Jannach, Dietmar; Hellwagner, Hermann

    2004-10-01

    This paper introduces the principal approach and describes the basic architecture and current implementation of the knowledge-based multimedia adaptation framework we are currently developing. The framework can be used in Universal Multimedia Access scenarios, where multimedia content has to be adapted to specific usage environment parameters (network and client device capabilities, user preferences). Using knowledge-based techniques (state-space planning), the framework automatically computes an adaptation plan, i.e., a sequence of media conversion operations, to transform the multimedia resources to meet the client's requirements or constraints. The system takes as input standards-compliant descriptions of the content (using MPEG-7 metadata) and of the target usage environment (using MPEG-21 Digital Item Adaptation metadata) to derive start and goal states for the planning process, respectively. Furthermore, declarative descriptions of the conversion operations (such as available via software library functions) enable existing adaptation algorithms to be invoked without requiring programming effort. A running example in the paper illustrates the descriptors and techniques employed by the knowledge-based media adaptation system.
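
    The planning step described above, finding a sequence of conversion operations that turns the source description into one meeting the target constraints, can be illustrated with a toy breadth-first state-space planner. The states and operations below are invented for illustration and are not the framework's actual MPEG-7/MPEG-21 descriptors.

      # Toy state-space planner: find a sequence of media conversion operations
      # that transforms the source description into one satisfying the goal.
      from collections import deque

      operations = {
          "transcode_to_h263": lambda s: {**s, "codec": "h263"},
          "downscale_qcif":    lambda s: {**s, "resolution": "qcif"},
          "drop_audio":        lambda s: {**s, "audio": False},
      }

      def satisfies(state, goal):
          return all(state.get(k) == v for k, v in goal.items())

      def plan(start, goal):
          queue, seen = deque([(start, [])]), set()
          while queue:
              state, steps = queue.popleft()
              if satisfies(state, goal):
                  return steps
              key = tuple(sorted(state.items()))
              if key in seen:
                  continue
              seen.add(key)
              for name, op in operations.items():
                  queue.append((op(state), steps + [name]))
          return None

      start = {"codec": "mpeg2", "resolution": "cif", "audio": True}
      goal = {"codec": "h263", "resolution": "qcif"}
      print(plan(start, goal))   # e.g. ['transcode_to_h263', 'downscale_qcif']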

  10. Knowledge-based nursing diagnosis

    NASA Astrophysics Data System (ADS)

    Roy, Claudette; Hay, D. Robert

    1991-03-01

    Nursing diagnosis is an integral part of the nursing process and determines the interventions leading to outcomes for which the nurse is accountable. Diagnoses under the time constraints of modern nursing can benefit from a computer assist. A knowledge-based engineering approach was developed to address these problems. A number of issues extending beyond the capture of knowledge had to be addressed during system design to make the system practical. The issues involved in implementing a professional knowledge base in a clinical setting are discussed. System functions, structure, interfaces, the health care environment, and terminology and taxonomy are discussed. An integrated system concept from assessment through intervention and evaluation is outlined.

  11. Cooperating knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Feigenbaum, Edward A.; Buchanan, Bruce G.

    1988-01-01

    This final report covers work performed under Contract NCC2-220 between NASA Ames Research Center and the Knowledge Systems Laboratory, Stanford University. The period of research was from March 1, 1987 to February 29, 1988. Topics covered were as follows: (1) concurrent architectures for knowledge-based systems; (2) methods for the solution of geometric constraint satisfaction problems, and (3) reasoning under uncertainty. The research in concurrent architectures was co-funded by DARPA, as part of that agency's Strategic Computing Program. The research has been in progress since 1985, under DARPA and NASA sponsorship. The research in geometric constraint satisfaction has been done in the context of a particular application, that of determining the 3-D structure of complex protein molecules, using the constraints inferred from NMR measurements.

  12. Protective Effects of the Launch/Entry Suit (LES) and the Liquid Cooling Garment(LCG) During Re-entry and Landing After Spaceflight

    NASA Technical Reports Server (NTRS)

    Perez, Sondra A.; Charles, John B.; Fortner, G. William; Hurst, Victor, IV; Meck, Janice V.

    2002-01-01

    Heart rate and arterial pressure were measured during shuttle re-entry, landing and initial standing in crewmembers with and without inflated anti-g suits and with and without liquid cooling garments (LCG). Preflight, three measurements were obtained seated, then standing. Prior to and during re-entry, arterial pressure and heart rate were measured every five minutes until wheels stop (WS). Then crewmembers initiated three seated and three standing measurements. In subjects without inflated anti-g suits, SBP and DBP were significantly lower during preflight standing (P = 0.006; P = 0.001, respectively) and at touchdown (TD) (P = 0.001; P = 0.003, respectively); standing SBP was significantly lower after WS. Non-LCG users developed significantly higher heart rates during re-entry (P = 0.029, maxG; P = 0.05, TD; P = 0.02, post-WS seated; P = 0.01, post-WS standing) than LCG users. Our data suggest that the anti-g suit is effective, but the combined anti-g suit with LCG is more effective.

  13. The Coming of Knowledge-Based Business.

    ERIC Educational Resources Information Center

    Davis, Stan; Botkin, Jim

    1994-01-01

    Economic growth will come from knowledge-based businesses whose "smart" products filter and interpret information. Businesses will come to think of themselves as educators and their customers as learners. (SK)

  14. Distributed, cooperating knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt

    1991-01-01

    Some current research in the development and application of distributed, cooperating knowledge-based systems technology is addressed. The focus of the current research is the spacecraft ground operations environment. The underlying hypothesis is that, because of the increasing size, complexity, and cost of planned systems, conventional procedural approaches to the architecture of automated systems will give way to a more comprehensive knowledge-based approach. A hallmark of these future systems will be the integration of multiple knowledge-based agents which understand the operational goals of the system and cooperate with each other and the humans in the loop to attain the goals. The current work includes the development of a reference model for knowledge-base management, the development of a formal model of cooperating knowledge-based agents, the use of a testbed for prototyping and evaluating various knowledge-based concepts, and beginning work on the establishment of an object-oriented model of an intelligent end-to-end (spacecraft to user) system. An introductory discussion of these activities is presented, the major concepts and principles being investigated are highlighted, and their potential use in other application domains is indicated.

  15. Knowledge-based systems and NASA's software support environment

    NASA Technical Reports Server (NTRS)

    Dugan, Tim; Carmody, Cora; Lennington, Kent; Nelson, Bob

    1990-01-01

    A proposed role for knowledge-based systems within NASA's Software Support Environment (SSE) is described. The SSE is chartered to support all software development for the Space Station Freedom Program (SSFP). This includes support for development of knowledge-based systems and the integration of these systems with conventional software systems. In addition to the support of development of knowledge-based systems, various software development functions provided by the SSE will utilize knowledge-based systems technology.

  16. Knowledge-based diagnosis for aerospace systems

    NASA Technical Reports Server (NTRS)

    Atkinson, David J.

    1988-01-01

    The need for automated diagnosis in aerospace systems and the approach of using knowledge-based systems are examined. Research issues in knowledge-based diagnosis which are important for aerospace applications are treated along with a review of recent relevant research developments in Artificial Intelligence. The design and operation of some existing knowledge-based diagnosis systems are described. The systems described and compared include the LES expert system for liquid oxygen loading at NASA Kennedy Space Center, the FAITH diagnosis system developed at the Jet Propulsion Laboratory, the PES procedural expert system developed at SRI International, the CSRL approach developed at Ohio State University, the StarPlan system developed by Ford Aerospace, the IDM integrated diagnostic model, and the DRAPhys diagnostic system developed at NASA Langley Research Center.

  17. Knowledge-based commodity distribution planning

    NASA Technical Reports Server (NTRS)

    Saks, Victor; Johnson, Ivan

    1994-01-01

    This paper presents an overview of a Decision Support System (DSS) that incorporates Knowledge-Based (KB) and commercial off the shelf (COTS) technology components. The Knowledge-Based Logistics Planning Shell (KBLPS) is a state-of-the-art DSS with an interactive map-oriented graphics user interface and powerful underlying planning algorithms. KBLPS was designed and implemented to support skilled Army logisticians to prepare and evaluate logistics plans rapidly, in order to support corps-level battle scenarios. KBLPS represents a substantial advance in graphical interactive planning tools, with the inclusion of intelligent planning algorithms that provide a powerful adjunct to the planning skills of commodity distribution planners.

  18. Knowledge-Based Instructional Gaming: GEO.

    ERIC Educational Resources Information Center

    Duchastel, Philip

    1989-01-01

    Describes the design and development of an instructional game, GEO, in which the user learns elements of Canadian geography. The use of knowledge-based artificial intelligence techniques is discussed, the use of HyperCard in the design of GEO is explained, and future directions are suggested. (15 references) (Author/LRW)

  19. Analysis of Unit-Level Changes in Operations with Increased SPP Wind from EPRI/LCG Balancing Study

    SciTech Connect

    Hadley, Stanton W

    2012-01-01

    Wind power development in the United States is outpacing previous estimates for many regions, particularly those with good wind resources. The pace of wind power deployment may soon outstrip regional capabilities to provide transmission and integration services to achieve the most economic power system operation. Conversely, regions such as the Southeastern United States do not have good wind resources and will have difficulty meeting proposed federal Renewable Portfolio Standards with local supply. There is a growing need to explore innovative solutions for collaborating between regions to achieve the least cost solution for meeting such a renewable energy mandate. The Department of Energy funded the project 'Integrating Midwest Wind Energy into Southeast Electricity Markets' to be led by EPRI in coordination with the main authorities for the regions: SPP, Entergy, TVA, Southern Company and OPC. EPRI utilized several subcontractors for the project including LCG, the developers of the model UPLAN. The study aims to evaluate the operating cost benefits of coordination of scheduling and balancing for Southwest Power Pool (SPP) wind transfers to Southeastern Electric Reliability Council (SERC) Balancing Authorities (BAs). The primary objective of this project is to analyze the benefits of regional cooperation for integrating mid-western wind energy into southeast electricity markets. Scenarios were defined, modeled and investigated to address production variability and uncertainty and the associated balancing of large quantities of wind power in SPP and delivery to energy markets in the southern regions of the SERC. DOE funded Oak Ridge National Laboratory to provide additional support to the project, including a review of results and any side analysis that may provide additional insight. This report is a unit-by-unit analysis of changes in operations due to the different scenarios used in the overall study. It focuses on the change in capacity factors and the number

  20. The importance of knowledge-based technology.

    PubMed

    Cipriano, Pamela F

    2012-01-01

    Nurse executives are responsible for a workforce that can provide safer and more efficient care in a complex sociotechnical environment. National quality priorities rely on technologies to provide data collection, share information, and leverage analytic capabilities to interpret findings and inform approaches to care that will achieve better outcomes. As a key steward for quality, the nurse executive exercises leadership to provide the infrastructure to build and manage nursing knowledge and instill accountability for following evidence-based practices. These actions contribute to a learning health system where new knowledge is captured as a by-product of care delivery enabled by knowledge-based electronic systems. The learning health system also relies on rigorous scientific evidence embedded into practice at the point of care. The nurse executive optimizes use of knowledge-based technologies, integrated throughout the organization, that have the capacity to help transform health care.

  1. Knowledge-Based Entrepreneurship in a Boundless Research System

    ERIC Educational Resources Information Center

    Dell'Anno, Davide

    2008-01-01

    International entrepreneurship and knowledge-based entrepreneurship have recently generated considerable academic and non-academic attention. This paper explores the "new" field of knowledge-based entrepreneurship in a boundless research system. Cultural barriers to the development of business opportunities by researchers persist in some academic…

  2. Knowledge-based public health situation awareness

    NASA Astrophysics Data System (ADS)

    Mirhaji, Parsa; Zhang, Jiajie; Srinivasan, Arunkumar; Richesson, Rachel L.; Smith, Jack W.

    2004-09-01

    There have been numerous efforts to create comprehensive databases from multiple sources to monitor the dynamics of public health and, most specifically, to detect potential threats of bioterrorism before widespread dissemination. But there is little evidence that these systems are timely and dependable, or that they can reliably distinguish man-made from natural incidents. One must weigh the value of so-called 'syndromic surveillance systems' against the costs involved in the design, development, implementation and maintenance of such systems and the costs involved in the investigation of the inevitable false alarms. In this article we introduce a new perspective on the problem domain with a shift in paradigm from 'surveillance' toward 'awareness'. As we conceptualize a rather different approach to tackle the problem, we introduce a different methodology in the application of information science, computer science, cognitive science and human-computer interaction concepts to the design and development of so-called 'public health situation awareness systems'. We share some of our design and implementation concepts for the prototype system that is under development in the Center for Biosecurity and Public Health Informatics Research at the University of Texas Health Science Center at Houston. The system is based on a knowledgebase containing ontologies, with different layers of abstraction, from multiple domains, that provide the context for information integration, knowledge discovery, interactive data mining, information visualization, information sharing and communications. The modular design of the knowledgebase and its knowledge representation formalism enables incremental evolution of the system from a partial system to a comprehensive knowledgebase of 'public health situation awareness' as it acquires new knowledge through interactions with domain experts or automatic discovery of new knowledge.

  3. Knowledge-based systems in Japan

    NASA Technical Reports Server (NTRS)

    Feigenbaum, Edward; Engelmore, Robert S.; Friedland, Peter E.; Johnson, Bruce B.; Nii, H. Penny; Schorr, Herbert; Shrobe, Howard

    1994-01-01

    This report summarizes a study of the state-of-the-art in knowledge-based systems technology in Japan, organized by the Japanese Technology Evaluation Center (JTEC) under the sponsorship of the National Science Foundation and the Advanced Research Projects Agency. The panel visited 19 Japanese sites in March 1992. Based on these site visits plus other interactions with Japanese organizations, both before and after the site visits, the panel prepared a draft final report. JTEC sent the draft to the host organizations for their review. The final report was published in May 1993.

  4. Systems Biology Knowledgebase (GSC8 Meeting)

    ScienceCinema

    Cottingham, Robert W. [ORNL]

    2016-07-12

    The Genomic Standards Consortium was formed in September 2005. It is an international, open-membership working body which promotes standardization in the description of genomes and the exchange and integration of genomic data. The 2009 meeting was an activity of a five-year "Research Coordination Network" grant from the National Science Foundation and was organized and held at the DOE Joint Genome Institute with organizational support provided by the JGI and by the University of California - San Diego. Robert W. Cottingham of Oak Ridge National Laboratory discusses the DOE KnowledgeBase at the Genomic Standards Consortium's 8th meeting at the DOE JGI in Walnut Creek, Calif. on Sept. 9, 2009.

  5. An Introduction to the Heliophysics Event Knowledgebase

    NASA Astrophysics Data System (ADS)

    Hurlburt, Neal E.; Cheung, M.; Schrijver, C.; Chang, L.; Freeland, S.; Green, S.; Heck, C.; Jaffey, A.; Kobashi, A.; Schiff, D.; Serafin, J.; Seguin, R.; Slater, G.; Somani, A.; Timmons, R.

    2010-05-01

    The immense volume of data generated by the suite of instruments on SDO requires new tools for efficiently identifying and accessing data that are most relevant to research investigations. We have developed the Heliophysics Events Knowledgebase (HEK) to fill this need. The system developed to support the HEK combines automated datamining using feature detection methods; high-performance visualization systems for data markup; and web-services and clients for searching the resulting metadata, reviewing results and efficient access to the data. We will review these components and present examples of their use with SDO data.

  6. Knowledge-based Autonomous Test Engineer (KATE)

    NASA Technical Reports Server (NTRS)

    Parrish, Carrie L.; Brown, Barbara L.

    1991-01-01

    Mathematical models of system components have long been used to allow simulators to predict system behavior to various stimuli. Recent efforts to monitor, diagnose, and control real-time systems using component models have experienced similar success. NASA Kennedy is continuing the development of a tool for implementing real-time knowledge-based diagnostic and control systems called KATE (Knowledge based Autonomous Test Engineer). KATE is a model-based reasoning shell designed to provide autonomous control, monitoring, fault detection, and diagnostics for complex engineering systems by applying its reasoning techniques to an exchangeable quantitative model describing the structure and function of the various system components and their systemic behavior.
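
    Model-based fault detection of the kind described above can be illustrated by comparing a component model's prediction with the measured value. The valve/flow example, model constant and tolerance below are purely illustrative assumptions, not KATE's actual models.

      # Minimal sketch of model-based fault detection: compare a quantitative
      # component model's prediction with the measurement and flag discrepancies.
      def predicted_flow(valve_open, supply_pressure, k=0.8):
          # Simple quantitative component model (illustrative constant k).
          return k * supply_pressure if valve_open else 0.0

      def check(valve_open, supply_pressure, measured_flow, tol=0.1):
          expected = predicted_flow(valve_open, supply_pressure)
          if abs(expected - measured_flow) > tol:
              return f"FAULT: expected {expected:.2f}, measured {measured_flow:.2f}"
          return "nominal"

      print(check(valve_open=True, supply_pressure=2.0, measured_flow=1.62))  # nominal
      print(check(valve_open=True, supply_pressure=2.0, measured_flow=0.05))  # possible stuck valve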

  7. Bioenergy Science Center KnowledgeBase

    DOE Data Explorer

    Syed, M. H.; Karpinets, T. V.; Parang, M.; Leuze, M. R.; Park, B. H.; Hyatt, D.; Brown, S. D.; Moulton, S.; Galloway, M. D.; Uberbacher, E. C.

    The challenge of converting cellulosic biomass to sugars is the dominant obstacle to cost-effective production of biofuels in quantities significant enough to displace U.S. consumption of fossil transportation fuels. The BioEnergy Science Center (BESC) tackles this challenge of biomass recalcitrance by closely linking (1) plant research to make cell walls easier to deconstruct, and (2) microbial research to develop multi-talented biocatalysts tailor-made to produce biofuels in a single step. [from the 2011 BESC factsheet] The BioEnergy Science Center (BESC) is a multi-institutional, multidisciplinary research (biological, chemical, physical and computational sciences, mathematics and engineering) organization focused on the fundamental understanding and elimination of biomass recalcitrance. The BESC Knowledgebase and its associated tools are a discovery platform for bioenergy research. It consists of a collection of metadata, data, and computational tools for data analysis, integration, comparison and visualization for plants and microbes in the center. The BESC Knowledgebase (KB) and BESC Laboratory Information Management System (LIMS) enable bioenergy researchers to perform systemic research. [http://bobcat.ornl.gov/besc/index.jsp]

  8. Advances in knowledge-based software engineering

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt

    1991-01-01

    The underlying hypothesis of this work is that a rigorous and comprehensive software reuse methodology can bring about a more effective and efficient utilization of constrained resources in the development of large-scale software systems by both government and industry. It is also believed that correct use of this type of software engineering methodology can significantly contribute to the higher levels of reliability that will be required of future operational systems. An overview and discussion of current research in the development and application of two systems that support a rigorous reuse paradigm are presented: the Knowledge-Based Software Engineering Environment (KBSEE) and the Knowledge Acquisition for the Preservation of Tradeoffs and Underlying Rationales (KAPTUR) systems. Emphasis is on a presentation of operational scenarios which highlight the major functional capabilities of the two systems.

  9. Knowledge-based scheduling of arrival aircraft

    NASA Technical Reports Server (NTRS)

    Krzeczowski, K.; Davis, T.; Erzberger, H.; Lev-Ram, I.; Bergh, C.

    1995-01-01

    A knowledge-based method for scheduling arrival aircraft in the terminal area has been implemented and tested in real-time simulation. The scheduling system automatically sequences, assigns landing times, and assigns runways to arrival aircraft by utilizing continuous updates of aircraft radar data and controller inputs. The scheduling algorithm is driven by a knowledge base which was obtained in over two thousand hours of controller-in-the-loop real-time simulation. The knowledge base contains a series of hierarchical 'rules' and decision logic that examines both performance criteria, such as delay reduction, and workload reduction criteria, such as conflict avoidance. The objective of the algorithms is to devise an efficient plan to land the aircraft in a manner acceptable to the air traffic controllers. This paper will describe the scheduling algorithms, give examples of their use, and present data regarding their potential benefits to the air traffic system.
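
    A toy illustration of the rule-driven sequencing idea follows: order aircraft by estimated arrival time (delay reduction) while enforcing a minimum runway separation (conflict avoidance). The flights, times and separation value are invented for illustration and are not taken from the actual knowledge base.

      # Toy sketch of rule-based arrival sequencing: order aircraft by estimated
      # time of arrival, then enforce a minimum separation on a single runway.
      MIN_SEPARATION = 90  # seconds between landings (assumed value)

      arrivals = [("AAL12", 300), ("UAL87", 330), ("DAL55", 480)]  # (flight, ETA in s)

      def schedule(arrivals, min_sep=MIN_SEPARATION):
          plan, last_landing = [], None
          for flight, eta in sorted(arrivals, key=lambda a: a[1]):
              slot = eta if last_landing is None else max(eta, last_landing + min_sep)
              plan.append((flight, slot, slot - eta))  # assigned time and delay
              last_landing = slot
          return plan

      for flight, slot, delay in schedule(arrivals):
          print(f"{flight}: land at t={slot}s (delay {delay}s)")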

  10. Cildb: a knowledgebase for centrosomes and cilia.

    PubMed

    Arnaiz, Olivier; Malinowska, Agata; Klotz, Catherine; Sperling, Linda; Dadlez, Michal; Koll, France; Cohen, Jean

    2009-01-01

    Ciliopathies, pleiotropic diseases provoked by defects in the structure or function of cilia or flagella, reflect the multiple roles of cilia during development, in stem cells, in somatic organs and germ cells. High throughput studies have revealed several hundred proteins that are involved in the composition, function or biogenesis of cilia. The corresponding genes are potential candidates for orphan ciliopathies. To study ciliary genes, model organisms are used in which particular questions on motility, sensory or developmental functions can be approached by genetics. In the course of high throughput studies of cilia in Paramecium tetraurelia, we were confronted with the problem of comparing our results with those obtained in other model organisms. We therefore developed a novel knowledgebase, Cildb, that integrates ciliary data from heterogeneous sources. Cildb links orthology relationships among 18 species to high throughput ciliary studies, and to OMIM data on human hereditary diseases. The web interface of Cildb comprises three tools, BioMart for complex queries, BLAST for sequence homology searches and GBrowse for browsing the human genome in relation to OMIM information for human diseases. Cildb can be used for interspecies comparisons, building candidate ciliary proteomes in any species, or identifying candidate ciliopathy genes. Database URL: http://cildb.cgm.cnrs-gif.fr.

  11. Knowledge-based reusable software synthesis system

    NASA Technical Reports Server (NTRS)

    Donaldson, Cammie

    1989-01-01

    The Eli system, a knowledge-based reusable software synthesis system, is being developed for NASA Langley under a Phase 2 SBIR contract. Named after Eli Whitney, the inventor of interchangeable parts, Eli assists engineers of large-scale software systems in reusing components while they are composing their software specifications or designs. Eli will identify reuse potential, search for components, select component variants, and synthesize components into the developer's specifications. The Eli project began as a Phase 1 SBIR to define a reusable software synthesis methodology that integrates reusability into the top-down development process and to develop an approach for an expert system to promote and accomplish reuse. The objectives of the Eli Phase 2 work are to integrate advanced technologies to automate the development of reusable components within the context of large system developments, to integrate with user development methodologies without significant changes in method or learning of special languages, and to make reuse the easiest operation to perform. Eli will try to address a number of reuse problems including developing software with reusable components, managing reusable components, identifying reusable components, and transitioning reuse technology. Eli is both a library facility for classifying, storing, and retrieving reusable components and a design environment that emphasizes, encourages, and supports reuse.

  12. IGENPRO knowledge-based operator support system.

    SciTech Connect

    Morman, J. A.

    1998-07-01

    Research and development is being performed on the knowledge-based IGENPRO operator support package for plant transient diagnostics and management to provide operator assistance during off-normal plant transient conditions. A generic thermal-hydraulic (T-H) first-principles approach is being implemented using automated reasoning, artificial neural networks and fuzzy logic to produce a generic T-H system-independent/plant-independent package. The IGENPRO package has a modular structure composed of three modules: the transient trend analysis module PROTREN, the process diagnostics module PRODIAG and the process management module PROMANA. Cooperative research and development work has focused on the PRODIAG diagnostic module of the IGENPRO package and the operator training matrix of transients used at the Braidwood Pressurized Water Reactor station. Promising simulator testing results with PRODIAG have been obtained for the Braidwood Chemical and Volume Control System (CVCS), and the Component Cooling Water System. Initial CVCS test results have also been obtained for the PROTREN module. The PROMANA effort also involves the CVCS. Future work will be focused on the long-term, slow and mild degradation transients where diagnoses of incipient T-H component failure prior to forced outage events is required. This will enhance the capability of the IGENPRO system as a predictive maintenance tool for plant staff and operator support.

  13. UniProt: the universal protein knowledgebase

    PubMed Central

    2017-01-01

    The UniProt knowledgebase is a large resource of protein sequences and associated detailed annotation. The database contains over 60 million sequences, of which over half a million sequences have been curated by experts who critically review experimental and predicted data for each protein. The remainder are automatically annotated based on rule systems that rely on the expert curated knowledge. Since our last update in 2014, we have more than doubled the number of reference proteomes to 5631, giving a greater coverage of taxonomic diversity. We implemented a pipeline to remove redundant highly similar proteomes that were causing excessive redundancy in UniProt. The initial run of this pipeline reduced the number of sequences in UniProt by 47 million. For our users interested in the accessory proteomes, we have made available sets of pan proteome sequences that cover the diversity of sequences for each species that is found in its strains and sub-strains. To help interpretation of genomic variants, we provide tracks of detailed protein information for the major genome browsers. We provide a SPARQL endpoint that allows complex queries of the more than 22 billion triples of data in UniProt (http://sparql.uniprot.org/). UniProt resources can be accessed via the website at http://www.uniprot.org/. PMID:27899622
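
    The abstract mentions a public SPARQL endpoint; the sketch below shows one way to query it from Python, assuming the endpoint follows the standard SPARQL HTTP protocol (the /sparql path and the example query are assumptions made for illustration).

      # Minimal sketch of querying the UniProt SPARQL endpoint noted above,
      # assuming the standard SPARQL protocol (query parameter + JSON results).
      import requests

      ENDPOINT = "https://sparql.uniprot.org/sparql"  # path assumed

      query = """
      PREFIX up: <http://purl.uniprot.org/core/>
      SELECT ?protein WHERE { ?protein a up:Protein } LIMIT 5
      """

      resp = requests.get(
          ENDPOINT,
          params={"query": query},
          headers={"Accept": "application/sparql-results+json"},
          timeout=30,
      )
      resp.raise_for_status()
      for row in resp.json()["results"]["bindings"]:
          print(row["protein"]["value"])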

  14. A Knowledge-Based System Developer for aerospace applications

    NASA Technical Reports Server (NTRS)

    Shi, George Z.; Wu, Kewei; Fensky, Connie S.; Lo, Ching F.

    1993-01-01

    A prototype Knowledge-Based System Developer (KBSD) has been developed for aerospace applications by utilizing artificial intelligence technology. The KBSD directly acquires knowledge from domain experts through a graphical interface and then builds expert systems from that knowledge. This raises the state of the art of knowledge acquisition/expert system technology to a new level by lessening the need for skilled knowledge engineers. The feasibility, applicability, and efficiency of the proposed concept were established, justifying a continuation that would develop the prototype into a full-scale, general-purpose knowledge-based system developer. The KBSD has great commercial potential. It will provide a marketable software shell which alleviates the need for knowledge engineers and increases productivity in the workplace. The KBSD will therefore make knowledge-based systems available to a large portion of industry.

  15. Integrating knowledge-based techniques into well-test interpretation

    SciTech Connect

    Harrison, I.W.; Fraser, J.L.

    1995-04-01

    The goal of the Spirit Project was to develop a prototype of next-generation well-test-interpretation (WTI) software that would include knowledge-based decision support for the WTI model selection task. This paper describes how Spirit makes use of several different types of information (pressure, seismic, petrophysical, geological, and engineering) to support the user in identifying the most appropriate WTI model. Spirit's knowledge-based approach to type-curve matching is to generate several different feasible interpretations by making assumptions about the possible presence of both wellbore storage and late-time boundary effects. Spirit fuses information from type-curve matching and other data sources by use of a knowledge-based decision model developed in collaboration with a WTI expert. The sponsors of the work have judged the resulting prototype system a success.

  16. The Knowledge-Based Economy and E-Learning: Critical Considerations for Workplace Democracy

    ERIC Educational Resources Information Center

    Remtulla, Karim A.

    2007-01-01

    The ideological shift by nation-states to "a knowledge-based economy" (also referred to as "knowledge-based society") is causing changes in the workplace. Brought about by the forces of globalisation and technological innovation, the ideologies of the "knowledge-based economy" are not limited to influencing the…

  17. Design of a knowledge-based report generator

    SciTech Connect

    Kukich, K.

    1983-01-01

    Knowledge-based report generation is a technique for automatically generating natural language reports from computer databases. It is so named because it applies knowledge-based expert systems software to the problem of text generation. The first application of the technique, a system for generating natural language stock reports from a daily stock quotes database, is partially implemented. Three fundamental principles of the technique are its use of domain-specific semantic and linguistic knowledge, its use of macro-level semantic and linguistic constructs (such as whole messages, a phrasal lexicon, and a sentence-combining grammar), and its production system approach to knowledge representation. 14 references.
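
    The macro-level constructs mentioned above (whole messages and a phrasal lexicon) can be illustrated with a tiny generator that maps stock-quote facts to canned phrases. The phrases, thresholds and data are invented for illustration; this is not the system described in the abstract.

      # Tiny illustration of a phrasal-lexicon approach to report generation:
      # classify a daily stock observation into a message type, then realize
      # the message with a canned phrase. Phrases and thresholds are illustrative.
      PHRASES = {
          "big_rise":   "{name} surged {change:+.2f} points to close at {close:.2f}",
          "small_rise": "{name} edged up {change:+.2f} to {close:.2f}",
          "fall":       "{name} slipped {change:+.2f} to {close:.2f}",
      }

      def classify(change):
          if change > 5:
              return "big_rise"
          if change > 0:
              return "small_rise"
          return "fall"

      def report(quote):
          return PHRASES[classify(quote["change"])].format(**quote)

      print(report({"name": "Acme Corp", "change": 6.3, "close": 118.40}))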

  18. Online Knowledge-Based Model for Big Data Topic Extraction

    PubMed Central

    Khan, Muhammad Taimoor; Durrani, Mehr; Khalid, Shehzad; Aziz, Furqan

    2016-01-01

    Lifelong machine learning (LML) models learn with experience, maintaining a knowledge-base without user intervention. Unlike traditional single-domain models, they can easily scale up to explore big data. Existing LML models have high data dependency, consume more resources, and do not support streaming data. This paper proposes an online LML model (OAMC) to support streaming data with reduced data dependency. By engineering the knowledge-base and introducing new knowledge features, the learning pattern of the model is improved for data arriving in pieces. OAMC improves accuracy, measured as topic coherence, by 7% for streaming data while reducing the processing cost by half. PMID:27195004

  19. PLAN-IT - Knowledge-based mission sequencing

    NASA Technical Reports Server (NTRS)

    Biefeld, Eric W.

    1987-01-01

    PLAN-IT (Plan-Integrated Timelines), a knowledge-based approach to assist in mission sequencing, is discussed. PLAN-IT uses a large set of scheduling techniques known as strategies to develop and maintain a mission sequence. The approach implemented by PLAN-IT and the current applications of PLAN-IT for sequencing at NASA are reported.

  20. Malaysia Transitions toward a Knowledge-Based Economy

    ERIC Educational Resources Information Center

    Mustapha, Ramlee; Abdullah, Abu

    2004-01-01

    The emergence of a knowledge-based economy (k-economy) has spawned a "new" notion of workplace literacy, changing the relationship between employers and employees. The traditional covenant where employees expect a stable or lifelong employment will no longer apply. The retention of employees will most probably be based on their skills…

  1. Value Creation in the Knowledge-Based Economy

    ERIC Educational Resources Information Center

    Liu, Fang-Chun

    2013-01-01

    Effective investment strategies help companies form dynamic core organizational capabilities allowing them to adapt and survive in today's rapidly changing knowledge-based economy. This dissertation investigates three valuation issues that challenge managers with respect to developing business-critical investment strategies that can have…

  2. Big data analytics in immunology: a knowledge-based approach.

    PubMed

    Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we enabled streamlining of normally time-consuming processes of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow.

  3. Dynamic Strategic Planning in a Professional Knowledge-Based Organization

    ERIC Educational Resources Information Center

    Olivarius, Niels de Fine; Kousgaard, Marius Brostrom; Reventlow, Susanne; Quelle, Dan Grevelund; Tulinius, Charlotte

    2010-01-01

    Professional, knowledge-based institutions have a particular form of organization and culture that makes special demands on the strategic planning supervised by research administrators and managers. A model for dynamic strategic planning based on a pragmatic utilization of the multitude of strategy models was used in a small university-affiliated…

  4. Knowledge-Based Hierarchies: Using Organizations to Understand the Economy

    ERIC Educational Resources Information Center

    Garicano, Luis; Rossi-Hansberg, Esteban

    2015-01-01

    Incorporating the decision of how to organize the acquisition, use, and communication of knowledge into economic models is essential to understand a wide variety of economic phenomena. We survey the literature that has used knowledge-based hierarchies to study issues such as the evolution of wage inequality, the growth and productivity of firms,…

  5. CACTUS: Command and Control Training Using Knowledge-Based Simulations

    ERIC Educational Resources Information Center

    Hartley, Roger; Ravenscroft, Andrew; Williams, R. J.

    2008-01-01

    The CACTUS project was concerned with command and control training of large incidents where public order may be at risk, such as large demonstrations and marches. The training requirements and objectives of the project are first summarized justifying the use of knowledge-based computer methods to support and extend conventional training…

  6. A knowledge-based decision support system for payload scheduling

    NASA Technical Reports Server (NTRS)

    Tyagi, Rajesh; Tseng, Fan T.

    1988-01-01

    This paper presents the development of a prototype Knowledge-based Decision Support System, currently under development, for scheduling payloads/experiments on space station missions. The DSS is being built on Symbolics, a Lisp machine, using KEE, a commercial knowledge engineering tool.

  7. Conventional and Knowledge-Based Information Retrieval with Prolog.

    ERIC Educational Resources Information Center

    Leigh, William; Paz, Noemi

    1988-01-01

    Describes the use of PROLOG to program knowledge-based information retrieval systems, in which the knowledge contained in a document is translated into machine processable logic. Several examples of the resulting search process, and the program rules supporting the process, are given. (10 references) (CLB)

  8. Knowledge-Based Aid: A Four Agency Comparative Study

    ERIC Educational Resources Information Center

    McGrath, Simon; King, Kenneth

    2004-01-01

    Part of the response of many development cooperation agencies to the challenges of globalisation, ICTs and the knowledge economy is to emphasise the importance of knowledge for development. This paper looks at the discourses and practices of "knowledge-based aid" through an exploration of four agencies: the World Bank, DFID, Sida and JICA. It…

  9. Expansion of the Gene Ontology knowledgebase and resources

    PubMed Central

    2017-01-01

    The Gene Ontology (GO) is a comprehensive resource of computable knowledge regarding the functions of genes and gene products. As such, it is extensively used by the biomedical research community for the analysis of -omics and related data. Our continued focus is on improving the quality and utility of the GO resources, and we welcome and encourage input from researchers in all areas of biology. In this update, we summarize the current contents of the GO knowledgebase, and present several new features and improvements that have been made to the ontology, the annotations and the tools. Among the highlights are 1) developments that facilitate access to, and application of, the GO knowledgebase, and 2) extensions to the resource as well as increasing support for descriptions of causal models of biological systems and network biology. To learn more, visit http://geneontology.org/. PMID:27899567

  10. Knowledge-based zonal grid generation for computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Andrews, Alison E.

    1988-01-01

    Automation of flow field zoning in two dimensions is an important step towards reducing the difficulty of three-dimensional grid generation in computational fluid dynamics. Using a knowledge-based approach makes sense, but problems arise which are caused by aspects of zoning involving perception, lack of expert consensus, and design processes. These obstacles are overcome by means of a simple shape and configuration language, a tunable zoning archetype, and a method of assembling plans from selected, predefined subplans. A demonstration system for knowledge-based two-dimensional flow field zoning has been successfully implemented and tested on representative aerodynamic configurations. The results show that this approach can produce flow field zonings that are acceptable to experts with differing evaluation criteria.

  11. Managing Project Landscapes in Knowledge-Based Enterprises

    NASA Astrophysics Data System (ADS)

    Stantchev, Vladimir; Franke, Marc Roman

    Knowledge-based enterprises are typically conducting a large number of research and development projects simultaneously. This is a particularly challenging task in complex and diverse project landscapes. Project Portfolio Management (PPM) can be a viable framework for knowledge and innovation management in such landscapes. A standardized process with defined functions such as project data repository, project assessment, selection, reporting, and portfolio reevaluation can serve as a starting point. In this work we discuss the benefits a multidimensional evaluation framework can provide for knowledge-based enterprises. Furthermore, we describe a knowledge and learning strategy and process in the context of PPM and evaluate their practical applicability at different stages of the PPM process.

  12. A knowledgebase system to enhance scientific discovery: Telemakus

    PubMed Central

    Fuller, Sherrilynne S; Revere, Debra; Bugni, Paul F; Martin, George M

    2004-01-01

    Background: With the rapid expansion of scientific research, the ability to effectively find or integrate new domain knowledge in the sciences is proving increasingly difficult. Efforts to improve and speed up scientific discovery are being explored on a number of fronts. However, much of this work is based on traditional search and retrieval approaches and the bibliographic citation presentation format remains unchanged. Methods: Case study. Results: The Telemakus KnowledgeBase System provides flexible new tools for creating knowledgebases to facilitate retrieval and review of scientific research reports. In formalizing the representation of the research methods and results of scientific reports, Telemakus offers a potential strategy to enhance the scientific discovery process. While other research has demonstrated that aggregating and analyzing research findings across domains augments knowledge discovery, the Telemakus system is unique in combining document surrogates with interactive concept maps of linked relationships across groups of research reports. Conclusion: Based on how scientists conduct research and read the literature, the Telemakus KnowledgeBase System brings together three innovations in analyzing, displaying and summarizing research reports across a domain: (1) research report schema, a document surrogate of extracted research methods and findings presented in a consistent and structured schema format which mimics the research process itself and provides a high-level surrogate to facilitate searching and rapid review of retrieved documents; (2) research findings, used to index the documents, allowing searchers to request, for example, research studies which have studied the relationship between neoplasms and vitamin E; and (3) visual exploration interface of linked relationships for interactive querying of research findings across the knowledgebase and graphical displays of what is known as well as, through gaps in the map, what is yet to be tested
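
    The research report schema and findings index described above can be sketched as a simple data structure; the field names and sample records below are hypothetical, not the actual Telemakus schema.

      # Minimal sketch of a Telemakus-style document surrogate plus an index of
      # research findings (concept pairs). Field names and data are hypothetical.
      from dataclasses import dataclass, field
      from collections import defaultdict

      @dataclass
      class ReportSurrogate:
          title: str
          organism: str
          methods: str
          findings: list = field(default_factory=list)  # (concept_a, concept_b) pairs

      reports = [
          ReportSurrogate("Dietary study A", "mouse", "cohort",
                          findings=[("vitamin E", "neoplasms")]),
          ReportSurrogate("Antioxidant trial B", "rat", "RCT",
                          findings=[("vitamin E", "lifespan")]),
      ]

      # Index documents by the relationships they report, mirroring the idea of
      # using research findings to index the collection.
      index = defaultdict(list)
      for r in reports:
          for pair in r.findings:
              index[frozenset(pair)].append(r.title)

      print(index[frozenset(("neoplasms", "vitamin E"))])  # -> ['Dietary study A']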

  13. Knowledge-Based Decision Support in Department of Defense Acquisitions

    DTIC Science & Technology

    2010-09-01

    2005) reviewed and analyzed the National Aeronautics and Space Administration (NASA) project management policies and compared them to the GAO’s best...practices on knowledge-based decision making. The study was primarily focused on the Goddard Space Flight Center, the Jet Propulsion Lab, Johnson ...Space Center, and Marshall Space Flight Center. During its investigation, the GAO found NASA deficient in key criteria and decision reviews to fully

  14. Knowledge-Based Production Management: Approaches, Results and Prospects

    DTIC Science & Technology

    1991-12-01

    In this paper we provide an overview of research in the field of knowledge-based production management. We begin by examining the important sources...of decision-making difficulty in practical production management domains, discussing the requirements implied by each with respect to the development...of effective production management tools, and identifying the general opportunities in this regard provided by AI-based technology. We then categorize

  15. Knowledge-based processing for aircraft flight control

    NASA Technical Reports Server (NTRS)

    Painter, John H.; Glass, Emily; Economides, Gregory; Russell, Paul

    1994-01-01

    This Contractor Report documents research in Intelligent Control using knowledge-based processing in a manner dual to methods found in the classic stochastic decision, estimation, and control discipline. Such knowledge-based control has also been called Declarative and Hybrid. Software architectures were sought, employing the parallelism inherent in modern object-oriented modeling and programming. The viewpoint adopted was that Intelligent Control employs a class of domain-specific software architectures having features common over a broad variety of implementations, such as management of aircraft flight, power distribution, etc. As much attention was paid to software engineering issues as to artificial intelligence and control issues. This research considered that particular processing methods from the stochastic and knowledge-based worlds are duals, that is, similar in a broad context. They provide architectural design concepts which serve as bridges between the disparate disciplines of decision, estimation, control, and artificial intelligence. This research was applied to the control of a subsonic transport aircraft in the airport terminal area.

  16. Systems, methods and apparatus for verification of knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Rash, James L. (Inventor); Erickson, John D. (Inventor); Gracinin, Denis (Inventor); Rouff, Christopher A. (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments, domain knowledge is translated into a knowledge-based system. In some embodiments, a formal specification is derived from rules of a knowledge-based system, the formal specification is analyzed, and flaws in the formal specification are used to identify and correct errors in the domain knowledge, from which a knowledge-based system is translated.

  17. Comparison of LISP and MUMPS as implementation languages for knowledge-based systems

    SciTech Connect

    Curtis, A.C.

    1984-01-01

    Major components of knowledge-based systems are summarized, along with the programming language features generally useful in their implementation. LISP and MUMPS are briefly described and compared as vehicles for building knowledge-based systems. The paper concludes with suggestions for extensions to MUMPS which might increase its usefulness in artificial intelligence applications without affecting the essential nature of the language. 8 references.

  18. Construction of dynamic stochastic simulation models using knowledge-based techniques

    NASA Technical Reports Server (NTRS)

    Williams, M. Douglas; Shiva, Sajjan G.

    1990-01-01

    Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).

  19. Knowledge-Based Learning: Integration of Deductive and Inductive Learning for Knowledge Base Completion.

    ERIC Educational Resources Information Center

    Whitehall, Bradley Lane

    In constructing a knowledge-based system, the knowledge engineer must convert rules of thumb provided by the domain expert and previously solved examples into a working system. Research in machine learning has produced algorithms that create rules for knowledge-based systems, but these algorithms require either many examples or a complete domain…

  20. Knowledge-based system for automatic MBR control.

    PubMed

    Comas, J; Meabe, E; Sancho, L; Ferrero, G; Sipma, J; Monclús, H; Rodriguez-Roda, I

    2010-01-01

    MBR technology is currently challenging traditional wastewater treatment systems and is increasingly selected for WWTP upgrading. MBR systems are typically constructed on a smaller footprint and provide superior treated water quality. However, the main drawback of MBR technology is that membrane permeability declines during filtration due to membrane fouling, which is largely responsible for the high aeration requirements of an MBR needed to counteract this fouling phenomenon. Because the mechanisms of membrane fouling are complex and still not fully understood, it is neither possible to describe its development clearly by means of a deterministic model, nor to control it with a purely mathematical law. Consequently, the majority of MBR applications are controlled in an "open-loop" way, i.e. with predefined and fixed air scour and filtration/relaxation or backwashing cycles, and with scheduled inline or offline chemical cleaning as a preventive measure, without taking into account the real needs of membrane cleaning based on its filtration performance. However, existing theoretical and empirical knowledge about potential cause-effect relations between a number of factors (influent characteristics, biomass characteristics and operational conditions) and MBR operation can be used to build a knowledge-based decision support system (KB-DSS) for the automatic control of MBRs. This KB-DSS contains a knowledge-based control module which, based on real-time comparison of the current permeability trend with "reference trends", aims at optimizing operation and energy costs and decreasing fouling rates. In practice, the proposed automatic control system regulates the set points of the key operational variables controlled in MBR systems (permeate flux, relaxation and backwash times, backwash flows and times, aeration flow rates, chemical cleaning frequency, waste sludge flow rate and recycle flow rates) and identifies their optimal values. This paper describes the concepts and the 3-level architecture
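
    A minimal sketch of the kind of rule described above, comparing the current permeability trend with a reference trend and adjusting cleaning-related set points, is given below in Python. The set-point names, thresholds and units are assumptions for illustration, not the actual KB-DSS rules.

    ```python
    # Hypothetical sketch of knowledge-based set-point adjustment for an MBR,
    # in the spirit of the KB-DSS described above (names and thresholds assumed).

    def permeability_slope(series, dt_hours=1.0):
        """Average permeability change per hour over the observation window."""
        return (series[-1] - series[0]) / (dt_hours * (len(series) - 1))

    def adjust_setpoints(current_trend, reference_trend, setpoints):
        """Compare the measured fouling rate with a reference trend and
        intensify or relax cleaning-related set points accordingly."""
        measured = permeability_slope(current_trend)
        reference = permeability_slope(reference_trend)
        new = dict(setpoints)
        if measured < 1.5 * reference:            # permeability falling faster than expected
            new["relaxation_time_s"] += 30        # clean more aggressively
            new["air_scour_flow_m3h"] *= 1.1
        elif measured > 0.5 * reference:          # fouling slower than expected
            new["air_scour_flow_m3h"] *= 0.95     # save aeration energy
        return new

    setpoints = {"relaxation_time_s": 60, "air_scour_flow_m3h": 10.0}
    current = [210, 206, 201, 195]      # permeability readings, declining
    reference = [210, 208, 206, 204]
    print(adjust_setpoints(current, reference, setpoints))
    ```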

  1. "Chromosome": a knowledge-based system for the chromosome classification.

    PubMed

    Ramstein, G; Bernadet, M

    1993-01-01

    Chromosome, a knowledge-based analysis system, has been designed for the classification of human chromosomes. Its aim is to perform an optimal classification by driving a toolbox containing the procedures of image processing, pattern recognition and classification. This paper presents the general architecture of Chromosome, based on a multiagent system generator. The image processing toolbox is described, from the metaphase enhancement to the fine classification. Emphasis is then put on the knowledge base intended for chromosome recognition. The global classification process is also presented, showing how Chromosome proceeds to classify a given chromosome. Finally, we discuss further extensions of the system for karyotype building.

  2. Knowledge-based GIS techniques applied to geological engineering

    USGS Publications Warehouse

    Usery, E. Lynn; Altheide, Phyllis; Deister, Robin R.P.; Barr, David J.

    1988-01-01

    A knowledge-based geographic information system (KBGIS) approach, which requires development of a rule base both for GIS processing and for the geological engineering application, has been implemented. The rule bases are implemented in the Goldworks expert system development shell interfaced to the Earth Resources Data Analysis System (ERDAS) raster-based GIS for input and output. GIS analysis procedures including recoding, intersection, and union are controlled by the rule base, and the geological engineering map product is generated by the expert system. The KBGIS has been used to generate a geological engineering map of Creve Coeur, Missouri.

  3. Knowledge-based machine vision systems for space station automation

    NASA Technical Reports Server (NTRS)

    Ranganath, Heggere S.; Chipman, Laure J.

    1989-01-01

    Computer vision techniques which have the potential for use on the space station and related applications are assessed. A knowledge-based vision system (expert vision system) and the development of a demonstration system for it are described. This system implements some of the capabilities that would be necessary in a machine vision system for the robot arm of the laboratory module in the space station. A Perceptics 9200e image processor, on a host VAXstation, was used to develop the demonstration system. In order to use realistic test images, photographs of actual space shuttle simulator panels were used. The system's capabilities of scene identification and scene matching are discussed.

  4. Building validation tools for knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Stachowitz, R. A.; Chang, C. L.; Stock, T. S.; Combs, J. B.

    1987-01-01

    The Expert Systems Validation Associate (EVA), a validation system under development at the Lockheed Artificial Intelligence Center for more than a year, provides a wide range of validation tools to check the correctness, consistency and completeness of a knowledge-based system. A declarative meta-language (higher-order language), is used to create a generic version of EVA to validate applications written in arbitrary expert system shells. The architecture and functionality of EVA are presented. The functionality includes Structure Check, Logic Check, Extended Structure Check (using semantic information), Extended Logic Check, Semantic Check, Omission Check, Rule Refinement, Control Check, Test Case Generation, Error Localization, and Behavior Verification.

  5. An Introduction to the Heliophysics Event Knowledgebase for SDO

    NASA Astrophysics Data System (ADS)

    Hurlburt, Neal; Schrijver, Carolus; Cheung, Mark

    The immense volume of data generated by the suite of instruments on SDO requires new tools for efficiently identifying and accessing the data most relevant to research investigations. We have developed the Heliophysics Events Knowledgebase (HEK) to fill this need. The system developed in support of the HEK combines automated data mining using feature detection methods; high-performance visualization systems for data markup; and web services and clients for searching the resulting metadata, reviewing results and accessing the data efficiently. We will review these components and present examples of their use with SDO data.

  6. Knowledge-based vision and simple visual machines.

    PubMed Central

    Cliff, D; Noble, J

    1997-01-01

    The vast majority of work in machine vision emphasizes the representation of perceived objects and events: it is these internal representations that incorporate the 'knowledge' in knowledge-based vision or form the 'models' in model-based vision. In this paper, we discuss simple machine vision systems developed by artificial evolution rather than traditional engineering design techniques, and note that the task of identifying internal representations within such systems is made difficult by the lack of an operational definition of representation at the causal mechanistic level. Consequently, we question the nature and indeed the existence of representations posited to be used within natural vision systems (i.e. animals). We conclude that representations argued for on a priori grounds by external observers of a particular vision system may well be illusory, and are at best place-holders for yet-to-be-identified causal mechanistic interactions. That is, applying the knowledge-based vision approach in the understanding of evolved systems (machines or animals) may well lead to theories and models that are internally consistent, computationally plausible, and entirely wrong. PMID:9304684

  7. Hospital nurses' use of knowledge-based information resources.

    PubMed

    Tannery, Nancy Hrinya; Wessel, Charles B; Epstein, Barbara A; Gadd, Cynthia S

    2007-01-01

    The purpose of this study was to evaluate the information-seeking practices of nurses before and after access to a library's electronic collection of information resources. This is a pre/post intervention study of nurses at a rural community hospital. The hospital contracted with an academic health sciences library for access to a collection of online knowledge-based resources. Self-report surveys were used to obtain information about nurses' computer use and how they locate and access information to answer questions related to their patient care activities. In 2001, self-report surveys were sent to the hospital's 573 nurses during implementation of access to online resources with a post-implementation survey sent 1 year later. At the initiation of access to the library's electronic resources, nurses turned to colleagues and print textbooks or journals to satisfy their information needs. After 1 year of access, 20% of the nurses had begun to use the library's electronic resources. The study outcome suggests ready access to knowledge-based electronic information resources can lead to changes in behavior among some nurses.

  8. A proven knowledge-based approach to prioritizing process information

    NASA Technical Reports Server (NTRS)

    Corsberg, Daniel R.

    1991-01-01

    Many space-related processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect is rapid analysis of the changing process information. During a disturbance, this task can overwhelm humans as well as computers. Humans deal with this by applying heuristics in determining significant information. A simple, knowledge-based approach to prioritizing information is described. The approach models those heuristics that humans would use in similar circumstances. The approach described has received two patents and was implemented in the Alarm Filtering System (AFS) at the Idaho National Engineering Laboratory (INEL). AFS was first developed for application in a nuclear reactor control room. It has since been used in chemical processing applications, where it has had a significant impact on control room environments. The approach uses knowledge-based heuristics to analyze data from process instrumentation and respond to that data according to knowledge encapsulated in objects and rules. While AFS cannot perform the complete diagnosis and control task, it has proven to be extremely effective at filtering and prioritizing information. AFS was used for over two years as a first level of analysis for human diagnosticians. Given the approach's proven track record in a wide variety of practical applications, it should be useful in both ground- and space-based systems.
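
    As a rough illustration of the heuristic prioritization described above (an assumed rule set, not the patented AFS logic), one simple heuristic is to deprioritize alarms that are expected consequences of an already-active root-cause alarm:

    ```python
    # Hypothetical sketch of heuristic alarm prioritization: alarms that are
    # expected consequences of an active upstream alarm are pushed to the end
    # of the list (assumed rules, not the actual AFS implementation).

    CONSEQUENCE_OF = {
        "low_steam_pressure": "feed_pump_trip",   # pump trip explains low pressure
        "low_drum_level": "feed_pump_trip",
    }

    def prioritize(active_alarms):
        """Return alarms with likely root causes first and explained consequences last."""
        active = set(active_alarms)
        roots, consequences = [], []
        for alarm in active_alarms:
            cause = CONSEQUENCE_OF.get(alarm)
            if cause in active:
                consequences.append(alarm)        # explained by an active root cause
            else:
                roots.append(alarm)
        return roots + consequences

    print(prioritize(["low_steam_pressure", "feed_pump_trip", "low_drum_level"]))
    # -> ['feed_pump_trip', 'low_steam_pressure', 'low_drum_level']
    ```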

  9. Network fingerprint: a knowledge-based characterization of biomedical networks

    PubMed Central

    Cui, Xiuliang; He, Haochen; He, Fuchu; Wang, Shengqi; Li, Fei; Bo, Xiaochen

    2015-01-01

    It can be difficult for biomedical researchers to understand complex molecular networks due to their unfamiliarity with the mathematical concepts employed. To represent molecular networks with clear meanings and familiar forms for biomedical researchers, we introduce a knowledge-based computational framework to decipher biomedical networks by making systematic comparisons to well-studied “basic networks”. A biomedical network is characterized as a spectrum-like vector called “network fingerprint”, which contains similarities to basic networks. This knowledge-based multidimensional characterization provides a more intuitive way to decipher molecular networks, especially for large-scale network comparisons and clustering analyses. As an example, we extracted network fingerprints of 44 disease networks in the Kyoto Encyclopedia of Genes and Genomes (KEGG) database. The comparisons among the network fingerprints of disease networks revealed informative disease-disease and disease-signaling pathway associations, illustrating that the network fingerprinting framework will lead to new approaches for better understanding of biomedical networks. PMID:26307246

  10. Automated seeding of specialised wiki knowledgebases with BioKb

    PubMed Central

    2009-01-01

    Background Wiki technology has become a ubiquitous mechanism for dissemination of information, and places strong emphasis on collaboration. We aimed to leverage wiki technology to allow small groups of researchers to collaborate around a specific domain, for example a biological pathway. Automatically gathered seed data could be modified by the group and enriched with domain specific information. Results We describe a software system, BioKb, implemented as a plugin for the TWiki engine, and designed to facilitate construction of a field-specific wiki containing collaborative and automatically generated content. Features of this system include: query of publicly available resources such as KEGG, iHOP and MeSH, to generate 'seed' content for topics; simple definition of structure for topics of different types via an administration page; and interactive incorporation of relevant PubMed references. An exemplar is shown for the use of this system, in the creation of the RAASWiki knowledgebase on the renin-angiotensin-aldosterone system (RAAS). RAASWiki has been seeded with data by use of BioKb, and will be the subject of ongoing development into an extensive knowledgebase on the RAAS. Conclusion The BioKb system is available from http://www.bioinf.mvm.ed.ac.uk/twiki/bin/view/TWiki/BioKbPlugin as a plugin for the TWiki engine. PMID:19758431

  11. Extensible knowledge-based architecture for segmenting CT data

    NASA Astrophysics Data System (ADS)

    Brown, Matthew S.; McNitt-Gray, Michael F.; Goldin, Jonathan G.; Aberle, Denise R.

    1998-06-01

    A knowledge-based system has been developed for segmenting computed tomography (CT) images. Its modular architecture includes an anatomical model, image processing engine, inference engine and blackboard. The model contains a priori knowledge of size, shape, X-ray attenuation and relative position of anatomical structures. This knowledge is used to constrain low-level segmentation routines. Model-derived constraints and segmented image objects are both transformed into a common feature space and posted on the blackboard. The inference engine then matches image to model objects, based on the constraints. The transformation to feature space allows the knowledge and image data representations to be independent. Thus a high-level model can be used, with data being stored in a frame-based semantic network. This modularity and explicit representation of knowledge allows for straightforward system extension. We initially demonstrate an application to lung segmentation in thoracic CT, with subsequent extension of the knowledge-base to include tumors within the lung fields. The anatomical model was later augmented to include basic brain anatomy including the skull and blood vessels, to allow automatic segmentation of vascular structures in CT angiograms for 3D rendering and visualization.
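
    A minimal sketch of the constraint-matching step described above, in which segmented image objects are matched against model-derived feature-space constraints, might look as follows; the anatomical constraints and feature values are invented for illustration.

    ```python
    # Hypothetical sketch of matching segmented CT regions to anatomical model
    # objects via feature-space constraints (sizes and attenuation ranges are
    # invented, not taken from the cited system).

    MODEL = {
        "lung":  {"area_mm2": (5_000, 40_000), "attenuation_hu": (-900, -500)},
        "tumor": {"area_mm2": (50, 5_000),     "attenuation_hu": (-100, 100)},
    }

    def satisfies(value, bounds):
        low, high = bounds
        return low <= value <= high

    def label_region(region):
        """Return the model objects whose constraints the segmented region satisfies."""
        return [
            name for name, constraints in MODEL.items()
            if satisfies(region["area_mm2"], constraints["area_mm2"])
            and satisfies(region["attenuation_hu"], constraints["attenuation_hu"])
        ]

    print(label_region({"area_mm2": 22_000, "attenuation_hu": -750}))  # -> ['lung']
    ```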

  12. Knowledge-based simulation using object-oriented programming

    NASA Technical Reports Server (NTRS)

    Sidoran, Karen M.

    1993-01-01

    Simulations have become a powerful mechanism for understanding and modeling complex phenomena. Their results have had substantial impact on a broad range of decisions in the military, government, and industry. Because of this, new techniques are continually being explored and developed to make them even more useful, understandable, extendable, and efficient. One such area of research is the application of the knowledge-based methods of artificial intelligence (AI) to the computer simulation field. The goal of knowledge-based simulation is to facilitate building simulations of greatly increased power and comprehensibility by making use of deeper knowledge about the behavior of the simulated world. One technique for representing and manipulating knowledge that has been enhanced by the AI community is object-oriented programming. Using this technique, the entities of a discrete-event simulation can be viewed as objects in an object-oriented formulation. Knowledge can be factual (i.e., attributes of an entity) or behavioral (i.e., how the entity is to behave in certain circumstances). Rome Laboratory's Advanced Simulation Environment (RASE) was developed as a research vehicle to provide an enhanced simulation development environment for building more intelligent, interactive, flexible, and realistic simulations. This capability will support current and future battle management research and provide a test of the object-oriented paradigm for use in large scale military applications.

  13. Literature classification for semi-automated updating of biological knowledgebases

    PubMed Central

    2013-01-01

    Background As the output of biological assays increase in resolution and volume, the body of specialized biological data, such as functional annotations of gene and protein sequences, enables extraction of higher-level knowledge needed for practical application in bioinformatics. Whereas common types of biological data, such as sequence data, are extensively stored in biological databases, functional annotations, such as immunological epitopes, are found primarily in semi-structured formats or free text embedded in primary scientific literature. Results We defined and applied a machine learning approach for literature classification to support updating of TANTIGEN, a knowledgebase of tumor T-cell antigens. Abstracts from PubMed were downloaded and classified as either "relevant" or "irrelevant" for database update. Training and five-fold cross-validation of a k-NN classifier on 310 abstracts yielded classification accuracy of 0.95, thus showing significant value in support of data extraction from the literature. Conclusion We here propose a conceptual framework for semi-automated extraction of epitope data embedded in scientific literature using principles from text mining and machine learning. The addition of such data will aid in the transition of biological databases to knowledgebases. PMID:24564403
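
    The described workflow, classifying abstracts as relevant or irrelevant with a k-NN classifier under five-fold cross-validation, can be sketched with standard tooling as below. The feature representation, the value of k and the toy corpus are assumptions; the abstract does not specify them.

    ```python
    # Sketch of relevant/irrelevant abstract classification with k-NN and
    # five-fold cross-validation (TF-IDF features, k=5 and the toy corpus are
    # assumptions, not details from the cited study).
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_score

    abstracts = (
        ["tumor antigen epitope recognized by T cells"] * 5   # stand-ins for relevant abstracts
        + ["unrelated plant genome sequencing report"] * 5    # stand-ins for irrelevant abstracts
    )
    labels = [1] * 5 + [0] * 5                                # 1 = relevant, 0 = irrelevant

    model = make_pipeline(TfidfVectorizer(stop_words="english"),
                          KNeighborsClassifier(n_neighbors=5))
    scores = cross_val_score(model, abstracts, labels, cv=5)
    print("mean accuracy:", scores.mean())
    ```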

  14. Developing a Physiologically-Based Pharmacokinetic Model Knowledgebase in Support of Provisional Model Construction

    PubMed Central

    Grulke, Christopher M.; Chang, Daniel T.; Brooks, Raina D.; Leonard, Jeremy A.; Phillips, Martin B.; Hypes, Ethan D.; Fair, Matthew J.; Tornero-Velez, Rogelio; Johnson, Jeffre; Dary, Curtis C.; Tan, Yu-Mei

    2016-01-01

    Developing physiologically-based pharmacokinetic (PBPK) models for chemicals can be resource-intensive, as neither chemical-specific parameters nor in vivo pharmacokinetic data are easily available for model construction. Previously developed, well-parameterized, and thoroughly-vetted models can be a great resource for the construction of models pertaining to new chemicals. A PBPK knowledgebase was compiled and developed from existing PBPK-related articles and used to develop new models. From 2,039 PBPK-related articles published between 1977 and 2013, 307 unique chemicals were identified for use as the basis of our knowledgebase. Keywords related to species, gender, developmental stages, and organs were analyzed from the articles within the PBPK knowledgebase. A correlation matrix of the 307 chemicals in the PBPK knowledgebase was calculated based on pharmacokinetic-relevant molecular descriptors. Chemicals in the PBPK knowledgebase were ranked based on their correlation toward ethylbenzene and gefitinib. Next, multiple chemicals were selected to represent exact matches, close analogues, or non-analogues of the target case study chemicals. Parameters, equations, or experimental data relevant to existing models for these chemicals and their analogues were used to construct new models, and model predictions were compared to observed values. This compiled knowledgebase provides a chemical structure-based approach for identifying PBPK models relevant to other chemical entities. Using suitable correlation metrics, we demonstrated that models of chemical analogues in the PBPK knowledgebase can guide the construction of PBPK models for other chemicals. PMID:26871706
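
    A minimal sketch of the analogue-ranking step described above, ranking knowledgebase chemicals by correlation of pharmacokinetic-relevant molecular descriptors with a target chemical, is given below; the descriptor set and all values are invented for illustration.

    ```python
    # Hypothetical sketch of ranking knowledgebase chemicals by correlation of
    # molecular descriptors with a target chemical (descriptors and values invented).
    import numpy as np

    DESCRIPTORS = ["log_kow", "molecular_weight", "water_solubility", "vapor_pressure"]

    knowledgebase = {
        "chemical_A": np.array([3.1, 106.2, 0.17, 9.5]),
        "chemical_B": np.array([3.2, 120.2, 0.10, 3.2]),
        "chemical_C": np.array([-0.8, 446.9, 25.0, 0.0]),
    }
    target = np.array([3.15, 106.2, 0.16, 9.6])   # hypothetical target chemical

    def rank_analogues(target_vector, kb):
        """Rank knowledgebase chemicals by Pearson correlation with the target's descriptors."""
        scores = {name: np.corrcoef(target_vector, vec)[0, 1] for name, vec in kb.items()}
        return sorted(scores.items(), key=lambda item: item[1], reverse=True)

    for name, score in rank_analogues(target, knowledgebase):
        print(f"{name}: r = {score:.3f}")
    ```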

  15. Knowledge-based systems: how will they affect manufacturing in the 80's

    SciTech Connect

    King, M.S.; Brooks, S.L.; Schaefer, R.M.

    1985-04-01

    Knowledge-based or "expert" systems have been in various stages of development and use for a long time in the academic world. Some of these systems have come out of the lab in recent years in the fields of medicine, geology, and computer system design. The use of knowledge-based systems in conjunction with manufacturing process planning and the emerging CAD/CAM/CAE technologies promises significant increases in engineering productivity. This paper's focus is on areas in manufacturing where knowledge-based systems could most benefit the engineer and industry. 13 refs., 3 figs.

  16. A real-time multiprocessor system for knowledge-based target-tracking

    NASA Astrophysics Data System (ADS)

    Irwin, P. D. S.; Farson, S. A.; Wilkinson, A. J.

    1989-12-01

    A real-time processing architecture for implementation of knowledge-based algorithms employed in infrared-image interpretation is described. Three stages of image interpretation (image segmentation, feature extraction, and feature examination by a knowledge-based system) are outlined. Dedicated hardware for the image segmentation and feature extraction are covered, along with a multitransputer architecture for implementation of data-dependent processes. Emphasis is placed on implementation of the description, frame-hypothesis, and slot-filling algorithms. An optimal algorithm for scheduling various tasks involved in implementing the rule set of the knowledge-based system is presented.

  17. Installing a Local Copy of the Reactome Web Site and Knowledgebase.

    PubMed

    McKay, Sheldon J; Weiser, Joel

    2015-06-19

    The Reactome project builds, maintains, and publishes a knowledgebase of biological pathways. The information in the knowledgebase is gathered from the experts in the field, peer reviewed and edited by Reactome editorial staff, and then published to the Reactome Web site, http://www.reactome.org. The Reactome software is open source and builds on top of other open-source or freely available software. Reactome data and code can be freely downloaded in its entirety and the Web site installed locally. This allows for more flexible interrogation of the data and also makes it possible to add one's own information to the knowledgebase.

  18. A knowledge-based agent prototype for Chinese address geocoding

    NASA Astrophysics Data System (ADS)

    Wei, Ran; Zhang, Xuehu; Ding, Linfang; Ma, Haoming; Li, Qi

    2009-10-01

    Chinese address geocoding is a difficult problem to deal with due to intrinsic complexities in Chinese address systems and a lack of standards in address assignment and usage. In order to improve existing address geocoding algorithms, a spatial knowledge-based agent prototype aimed at validating address geocoding results is built to determine the spatial accuracy as well as the matching confidence. A portion of the human knowledge used to judge the spatial closeness of two addresses is represented via first-order logic, and the corresponding algorithms are implemented in the Prolog language. Preliminary tests conducted using address-matching results in the Beijing area showed that the prototype can successfully assess the spatial closeness between the matched address and the query address with 97% accuracy.

  19. Detection of infrastructure manipulation with knowledge-based video surveillance

    NASA Astrophysics Data System (ADS)

    Muench, David; Hilsenbeck, Barbara; Kieritz, Hilke; Becker, Stefan; Grosselfinger, Ann-Kristin; Huebner, Wolfgang; Arens, Michael

    2016-10-01

    We are living in a world dependent on sophisticated technical infrastructure. Malicious manipulation of such critical infrastructure poses an enormous threat to all its users. Thus, running a critical infrastructure requires special attention, to log planned maintenance and to detect suspicious events. Towards this end, we present a knowledge-based surveillance approach capable of logging visually observable events in such an environment. The video surveillance modules are based on appearance-based person detection, which is further used to modulate the outcome of generic processing steps such as change detection or skin detection. A relation between the expected scene behavior and the underlying basic video surveillance modules is established. It will be shown that this combination already provides sufficient expressiveness to describe various everyday situations in indoor video surveillance. The whole approach is qualitatively and quantitatively evaluated on a prototypical scenario in a server room.

  20. Knowledge-based fault diagnosis system for refuse collection vehicle

    NASA Astrophysics Data System (ADS)

    Tan, CheeFai; Juffrizal, K.; Khalil, S. N.; Nidzamuddin, M. Y.

    2015-05-01

    The refuse collection vehicle is manufactured by a local vehicle body manufacturer. Currently, the company supplies six models of the waste compactor truck to the local authority as well as to waste management companies. The company faces difficulty in acquiring knowledge from the expert when the expert is absent. To solve this problem, the expert's knowledge can be stored in an expert system, which is able to provide the necessary support to the company when the expert is not available. The implemented process and tool can then be standardized and made more accurate. The knowledge input to the expert system is based on design guidelines and the expert's experience. This project highlights another application of the knowledge-based system (KBS) approach, in troubleshooting the refuse collection vehicle production process. The main aim of the research is to develop a novel expert fault diagnosis system framework for the refuse collection vehicle.

  1. Knowledge-Based Framework: its specification and new related discussions

    NASA Astrophysics Data System (ADS)

    Rodrigues, Douglas; Zaniolo, Rodrigo R.; Branco, Kalinka R. L. J. C.

    2015-09-01

    Unmanned Aerial Vehicles are a common application of critical embedded systems. The heterogeneity prevalent in these vehicles in terms of avionics services is particularly relevant to the elaboration of multi-application missions. Moreover, this heterogeneity in UAV services is often manifested in the form of characteristics such as reliability, security and performance. Different service implementations typically offer different guarantees in terms of these characteristics and of the associated costs. In particular, we explore the notion of Service-Oriented Architecture (SOA) in the context of UAVs as safety-critical embedded systems for the composition of services to fulfil application-specified performance and dependability guarantees. We therefore propose a framework for the deployment of these services and their variants, called the Knowledge-Based Framework for Dynamically Changing Applications (KBF), and specify its services module, discussing the related issues.

  2. The Knowledge-Based Software Assistant: Beyond CASE

    NASA Technical Reports Server (NTRS)

    Carozzoni, Joseph A.

    1993-01-01

    This paper will outline the similarities and differences between two paradigms of software development. Both support the whole software life cycle and provide automation for most of the software development process, but have different approaches. The CASE approach is based on a set of tools linked by a central data repository. This tool-based approach is data driven and views software development as a series of sequential steps, each resulting in a product. The Knowledge-Based Software Assistant (KBSA) approach, a radical departure from existing software development practices, is knowledge driven and centers around a formalized software development process. KBSA views software development as an incremental, iterative, and evolutionary process with development occurring at the specification level.

  3. Autonomous Cryogenics Loading Operations Simulation Software: Knowledgebase Autonomous Test Engineer

    NASA Technical Reports Server (NTRS)

    Wehner, Walter S., Jr.

    2013-01-01

    Working on the ACLO (Autonomous Cryogenics Loading Operations) project, I have had the opportunity to add functionality to the physics simulation software known as KATE (Knowledgebase Autonomous Test Engineer), to create a new application allowing WYSIWYG (what-you-see-is-what-you-get) creation of KATE schematic files, and to begin a preliminary design and implementation of a new subsystem that will provide vision services on the IHM (Integrated Health Management) bus. The functionality I added to KATE over the past few months includes a dynamic visual representation of the fluid height in a pipe, based on the number of gallons of fluid in the pipe, and an implementation of the IHM bus connection within KATE. I also fixed a broken feature in the system called the Browser Display, implemented many bug fixes and made changes to the GUI (Graphical User Interface).

  4. Assessing an AI knowledge-base for asymptomatic liver diseases.

    PubMed

    Babic, A; Mathiesen, U; Hedin, K; Bodemar, G; Wigertz, O

    1998-01-01

    Discovering previously unseen knowledge from clinical data is important in the field of asymptomatic liver diseases. Avoiding liver biopsy, which is used as the ultimate confirmation of diagnosis, by making the decision based on relevant laboratory findings alone would be an essential form of support. The system, based on Quinlan's ID3 algorithm, was simple and efficient in extracting the sought knowledge. The basic principles of applying such AI systems are therefore described and complemented with a medical evaluation. Some of the diagnostic rules were found to be useful as decision algorithms, i.e. they could be directly applied in clinical work and made part of the knowledge base of the Liver Guide, an automated decision support system.
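
    A rough sketch of this kind of rule induction from laboratory findings is shown below, using an entropy-criterion decision tree as a stand-in for Quinlan's ID3; the feature names and toy data are invented for illustration.

    ```python
    # Sketch of ID3-style rule induction from laboratory findings, using an
    # entropy-based decision tree as a stand-in for ID3. Feature names and the
    # toy data are assumptions for illustration only.
    from sklearn.tree import DecisionTreeClassifier, export_text

    features = ["ALT", "AST", "GGT"]            # hypothetical laboratory findings
    X = [[25, 22, 30], [80, 70, 120], [30, 28, 200], [90, 85, 40], [20, 18, 25], [70, 90, 150]]
    y = ["no_biopsy", "biopsy", "biopsy", "biopsy", "no_biopsy", "biopsy"]

    tree = DecisionTreeClassifier(criterion="entropy", max_depth=2).fit(X, y)
    print(export_text(tree, feature_names=features))   # human-readable diagnostic rules
    ```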

  5. Knowledge-based fault diagnosis system for refuse collection vehicle

    SciTech Connect

    Tan, CheeFai; Juffrizal, K.; Khalil, S. N.; Nidzamuddin, M. Y.

    2015-05-15

    The refuse collection vehicle is manufactured by a local vehicle body manufacturer. Currently, the company supplies six models of the waste compactor truck to the local authority as well as to waste management companies. The company faces difficulty in acquiring knowledge from the expert when the expert is absent. To solve this problem, the expert's knowledge can be stored in an expert system, which is able to provide the necessary support to the company when the expert is not available. The implemented process and tool can then be standardized and made more accurate. The knowledge input to the expert system is based on design guidelines and the expert's experience. This project highlights another application of the knowledge-based system (KBS) approach, in troubleshooting the refuse collection vehicle production process. The main aim of the research is to develop a novel expert fault diagnosis system framework for the refuse collection vehicle.

  6. Knowledge-based system for flight information management. Thesis

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell R.

    1990-01-01

    The use of knowledge-based system (KBS) architectures to manage information on the primary flight display (PFD) of commercial aircraft is described. The PFD information management strategy used tailored the information on the PFD to the tasks the pilot performed. The KBS design and implementation of the task-tailored PFD information management application is described. The knowledge acquisition and subsequent system design of a flight-phase-detection KBS is also described. The flight-phase output of this KBS was used as input to the task-tailored PFD information management KBS. The implementation and integration of this KBS with existing aircraft systems and the other KBS is described. The flight tests are examined of both KBS's, collectively called the Task-Tailored Flight Information Manager (TTFIM), which verified their implementation and integration, and validated the software engineering advantages of the KBS approach in an operational environment.

  7. A knowledge-based system for prototypical reasoning

    NASA Astrophysics Data System (ADS)

    Lieto, Antonio; Minieri, Andrea; Piana, Alberto; Radicioni, Daniele P.

    2015-04-01

    In this work we present a knowledge-based system equipped with a hybrid, cognitively inspired architecture for the representation of conceptual information. The proposed system aims at extending the classical representational and reasoning capabilities of ontology-based frameworks towards the realm of prototype theory. It is based on a hybrid knowledge base, composed of a classical symbolic component (grounded in a formal ontology) and a typicality-based one (grounded in the conceptual spaces framework). The resulting system attempts to reconcile the heterogeneous approach to concepts in Cognitive Science with the dual process theories of reasoning and rationality. The system has been experimentally assessed in a conceptual categorisation task in which common-sense linguistic descriptions were given as input, and the corresponding target concepts had to be identified. The results show that the proposed solution substantially extends the representational and reasoning 'conceptual' capabilities of standard ontology-based systems.

  8. Knowledge-based architecture for airborne mine and minefield detection

    NASA Astrophysics Data System (ADS)

    Agarwal, Sanjeev; Menon, Deepak; Swonger, C. W.

    2004-09-01

    One of the primary lessons learned from airborne mid-wave infrared (MWIR) based mine and minefield detection research and development over the last few years has been the fact that no single algorithm or static detection architecture is able to meet mine and minefield detection performance specifications. This is true not only because of the highly varied environmental and operational conditions under which an airborne sensor is expected to perform but also due to the highly data dependent nature of sensors and algorithms employed for detection. Attempts to make the algorithms themselves more robust to varying operating conditions have only been partially successful. In this paper, we present a knowledge-based architecture to tackle this challenging problem. The detailed algorithm architecture is discussed for such a mine/minefield detection system, with a description of each functional block and data interface. This dynamic and knowledge-driven architecture will provide more robust mine and minefield detection for a highly multi-modal operating environment. The acquisition of the knowledge for this system is predominantly data driven, incorporating not only the analysis of historical airborne mine and minefield imagery data collection, but also other "all source data" that may be available such as terrain information and time of day. This "all source data" is extremely important and embodies causal information that drives the detection performance. This information is not being used by current detection architectures. Data analysis for knowledge acquisition will facilitate better understanding of the factors that affect the detection performance and will provide insight into areas for improvement for both sensors and algorithms. Important aspects of this knowledge-based architecture, its motivations and the potential gains from its implementation are discussed, and some preliminary results are presented.

  9. Risk Management of New Microelectronics for NASA: Radiation Knowledge-base

    NASA Technical Reports Server (NTRS)

    LaBel, Kenneth A.

    2004-01-01

    Contents include the following: NASA missions and their implications for reliability and radiation constraints; approach to the insertion of new technologies; technology knowledge-base development; technology model/tool development and validation; summary comments.

  10. A study of knowledge-based systems for the Space Station

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Swietek, Gregg; Bullock, Bruce

    1989-01-01

    A rapid turnaround study on the potential uses of knowledge-based systems for Space Station Freedom was conducted from October 1987 through January 1988. Participants included both NASA personnel and experienced industrial knowledge engineers. Major results of the study included five recommended systems for the Baseline Configuration of the Space Station, an analysis of sensor hooks and scars, and a proposed plan for evolutionary growth of knowledge-based systems on the Space Station.

  11. Validation of highly reliable, real-time knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1988-01-01

    Knowledge-based systems have the potential to greatly increase the capabilities of future aircraft and spacecraft and to significantly reduce support manpower needed for the space station and other space missions. However, a credible validation methodology must be developed before knowledge-based systems can be used for life- or mission-critical applications. Experience with conventional software has shown that the use of good software engineering techniques and static analysis tools can greatly reduce the time needed for testing and simulation of a system. Since exhaustive testing is infeasible, reliability must be built into the software during the design and implementation phases. Unfortunately, many of the software engineering techniques and tools used for conventional software are of little use in the development of knowledge-based systems. Therefore, research at Langley is focused on developing a set of guidelines, methods, and prototype validation tools for building highly reliable, knowledge-based systems. The use of a comprehensive methodology for building highly reliable, knowledge-based systems should significantly decrease the time needed for testing and simulation. A proven record of delivering reliable systems at the beginning of the highly visible testing and simulation phases is crucial to the acceptance of knowledge-based systems in critical applications.

  12. Knowledge-based inference engine for online video dissemination

    NASA Astrophysics Data System (ADS)

    Zhou, Wensheng; Kuo, C.-C. Jay

    2000-10-01

    To facilitate easy access to rich multimedia information over the Internet, we develop a knowledge-based classification system that supports automatic indexing and filtering based on semantic concepts for the dissemination of on-line real-time media. Automatic segmentation, annotation and summarization of media for fast information browsing and updating are achieved at the same time. In the proposed system, a real-time scene-change detection proxy performs an initial video structuring process by splitting a video clip into scenes. Motion and visual features are extracted in real time for every detected scene by using online feature extraction proxies. Higher-level semantics are then derived through a joint use of the low-level features and inference rules in the knowledge base. Inference rules are derived through a supervised learning process based on representative samples. On-line media filtering based on semantic concepts becomes possible by using the proposed video inference engine. Video streams are either blocked or sent to certain channels depending on whether or not the video stream matches the user's profile. The proposed system is extensively evaluated by applying the engine to video of basketball games.
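
    As a rough illustration of the inference step described above, mapping low-level scene features to semantic concepts and filtering against a user profile, consider the following sketch; the rules, features and profile are invented and are not the paper's learned rules.

    ```python
    # Hypothetical sketch of rule-based semantic labelling of detected scenes
    # followed by profile-based filtering (rules, features and profile invented).

    RULES = [
        # (semantic concept, predicate over low-level scene features)
        ("fast_break", lambda f: f["motion_energy"] > 0.7 and f["dominant_color"] == "court"),
        ("crowd_shot", lambda f: f["motion_energy"] < 0.3 and f["face_count"] > 20),
    ]

    def infer_concepts(scene_features):
        return [concept for concept, rule in RULES if rule(scene_features)]

    def dispatch(scene_features, user_profile):
        """Send the scene to the user only if an inferred concept matches the profile."""
        concepts = infer_concepts(scene_features)
        return "send" if set(concepts) & set(user_profile["interests"]) else "block"

    scene = {"motion_energy": 0.85, "dominant_color": "court", "face_count": 4}
    print(dispatch(scene, {"interests": ["fast_break"]}))   # -> send
    ```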

  13. KBGIS-II: A knowledge-based geographic information system

    NASA Technical Reports Server (NTRS)

    Smith, Terence; Peuquet, Donna; Menon, Sudhakar; Agarwal, Pankaj

    1986-01-01

    The architecture and working of a recently implemented Knowledge-Based Geographic Information System (KBGIS-II), designed to satisfy several general criteria for the GIS, is described. The system has four major functions including query-answering, learning and editing. The main query finds constrained locations for spatial objects that are describable in a predicate-calculus based spatial object language. The main search procedures include a family of constraint-satisfaction procedures that use a spatial object knowledge base to search efficiently for complex spatial objects in large, multilayered spatial data bases. These data bases are represented in quadtree form. The search strategy is designed to reduce the computational cost of search in the average case. The learning capabilities of the system include the addition of new locations of complex spatial objects to the knowledge base as queries are answered, and the ability to learn inductively definitions of new spatial objects from examples. The new definitions are added to the knowledge base by the system. The system is performing all its designated tasks successfully. Future reports will relate performance characteristics of the system.

  14. KBGIS-2: A knowledge-based geographic information system

    NASA Technical Reports Server (NTRS)

    Smith, T.; Peuquet, D.; Menon, S.; Agarwal, P.

    1986-01-01

    The architecture and working of a recently implemented knowledge-based geographic information system (KBGIS-2) that was designed to satisfy several general criteria for the geographic information system are described. The system has four major functions that include query-answering, learning, and editing. The main query finds constrained locations for spatial objects that are describable in a predicate-calculus based spatial objects language. The main search procedures include a family of constraint-satisfaction procedures that use a spatial object knowledge base to search efficiently for complex spatial objects in large, multilayered spatial data bases. These data bases are represented in quadtree form. The search strategy is designed to reduce the computational cost of search in the average case. The learning capabilities of the system include the addition of new locations of complex spatial objects to the knowledge base as queries are answered, and the ability to learn inductively definitions of new spatial objects from examples. The new definitions are added to the knowledge base by the system. The system is currently performing all its designated tasks successfully, although currently implemented on inadequate hardware. Future reports will detail the performance characteristics of the system, and various new extensions are planned in order to enhance the power of KBGIS-2.

  15. Selection of Construction Methods: A Knowledge-Based Approach

    PubMed Central

    Skibniewski, Miroslaw

    2013-01-01

    The appropriate selection of construction methods to be used during the execution of a construction project is a major determinant of high productivity, but sometimes this selection process is performed without the care and the systematic approach it deserves, with negative consequences. This paper proposes a knowledge management approach that will enable the intelligent use of corporate experience and information and help to improve the selection of construction methods for a project. A knowledge-based system to support this decision-making process is then proposed and described. To define and design the system, semistructured interviews were conducted within three construction companies with the purpose of studying the way the method-selection process is carried out in practice and the knowledge associated with it. A prototype of a Construction Methods Knowledge System (CMKS) was developed and then validated with construction industry professionals. In conclusion, the CMKS was perceived as a valuable tool for construction method selection, helping companies to generate a corporate memory on this issue and reducing both the reliance on individual knowledge and the subjectivity of the decision-making process. The benefits provided by the system favor better performance of construction projects. PMID:24453925

  16. Compiling knowledge-based systems from KEE to Ada

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    The dominant technology for developing AI applications is to work in a multi-mechanism, integrated, knowledge-based system (KBS) development environment. Unfortunately, systems developed in such environments are inappropriate for delivering many applications - most importantly, they carry the baggage of the entire Lisp environment and are not written in conventional languages. One resolution of this problem would be to compile applications from complex environments to conventional languages. Described here are the first efforts to develop a system for compiling KBS developed in KEE to Ada (trademark). This system is called KATYDID, for KEE/Ada Translation Yields Development Into Delivery. KATYDID includes early prototypes of a run-time KEE core (object-structure) library module for Ada, and translation mechanisms for knowledge structures, rules, and Lisp code to Ada. Using these tools, part of a simple expert system was compiled (not quite automatically) to run in a purely Ada environment. This experience has given us various insights into Ada as an artificial intelligence programming language, potential solutions to some of the engineering difficulties encountered in early work, and inspiration on future system development.

  17. Knowledge-based graphical interfaces for presenting technical information

    NASA Technical Reports Server (NTRS)

    Feiner, Steven

    1988-01-01

    Designing effective presentations of technical information is extremely difficult and time-consuming. Moreover, the combination of increasing task complexity and declining job skills makes the need for high-quality technical presentations especially urgent. We believe that this need can ultimately be met through the development of knowledge-based graphical interfaces that can design and present technical information. Since much material is most naturally communicated through pictures, our work has stressed the importance of well-designed graphics, concentrating on generating pictures and laying out displays containing them. We describe APEX, a testbed picture generation system that creates sequences of pictures that depict the performance of simple actions in a world of 3D objects. Our system supports rules for determining automatically the objects to be shown in a picture, the style and level of detail with which they should be rendered, the method by which the action itself should be indicated, and the picture's camera specification. We then describe work on GRIDS, an experimental display layout system that addresses some of the problems in designing displays containing these pictures, determining the position and size of the material to be presented.

  18. Designing the Cloud-based DOE Systems Biology Knowledgebase

    SciTech Connect

    Lansing, Carina S.; Liu, Yan; Yin, Jian; Corrigan, Abigail L.; Guillen, Zoe C.; Kleese van Dam, Kerstin; Gorton, Ian

    2011-09-01

    Systems Biology research, even more than many other scientific domains, is becoming increasingly data-intensive. Not only have advances in experimental and computational technologies led to an exponential increase in scientific data volumes and their complexity, but increasingly such databases themselves are providing the basis for new scientific discoveries. To engage effectively with these community resources, integrated analysis, synthesis and simulation software is needed, regularly supported by scientific workflows. In order to provide a more collaborative, community-driven research environment for this heterogeneous setting, the Department of Energy (DOE) has decided to develop a federated, cloud-based cyberinfrastructure - the Systems Biology Knowledgebase (Kbase). Pacific Northwest National Laboratory (PNNL), with its long tradition in data-intensive science, led two of the five initial pilot projects, which focused on defining and testing the basic federated cloud-based system architecture and developing a prototype implementation. Community-wide accessibility of biological data and the capability to integrate and analyze these data within their changing research context were seen as key technical functionalities the Kbase needed to enable. In this paper we describe the results of our investigations into the design of a cloud-based federated infrastructure for: (1) semantics-driven data discovery, access and integration; (2) data annotation, publication and sharing; (3) workflow-enabled data analysis; and (4) project-based collaborative working. We describe our approach, exemplary use cases and our prototype implementation, which demonstrates the feasibility of this approach.

  19. Plant Protein Annotation in the UniProt Knowledgebase1

    PubMed Central

    Schneider, Michel; Bairoch, Amos; Wu, Cathy H.; Apweiler, Rolf

    2005-01-01

    The Swiss-Prot, TrEMBL, Protein Information Resource (PIR), and DNA Data Bank of Japan (DDBJ) protein database activities have united to form the Universal Protein Resource (UniProt) Consortium. UniProt presents three database layers: the UniProt Archive, the UniProt Knowledgebase (UniProtKB), and the UniProt Reference Clusters. The UniProtKB consists of two sections: UniProtKB/Swiss-Prot (fully manually curated entries) and UniProtKB/TrEMBL (automated annotation, classification and extensive cross-references). New releases are published fortnightly. A specific Plant Proteome Annotation Program (http://www.expasy.org/sprot/ppap/) was initiated to cope with the increasing amount of data produced by the complete sequencing of plant genomes. Through UniProt, our aim is to provide the scientific community with a single, centralized, authoritative resource for protein sequences and functional information that will allow the plant community to fully explore and utilize the wealth of information available for both plant and nonplant model organisms. PMID:15888679

  20. ISPE: A knowledge-based system for fluidization studies

    SciTech Connect

    Reddy, S.

    1991-01-01

    Chemical engineers use mathematical simulators to design, model, optimize and refine various engineering plants/processes. This procedure requires the following steps: (1) preparing an input data file in the format required by the target simulator; (2) executing the simulation; and (3) analyzing the results of the simulation to determine whether all specified goals are satisfied. If the goals are not met, the input data file must be modified and the simulation repeated. This multistep process is continued until satisfactory results are obtained. This research was undertaken to develop a knowledge-based system, IPSE (Intelligent Process Simulation Environment), that can enhance the productivity of chemical engineers/modelers by serving as an intelligent assistant to perform a variety of tasks related to process simulation. ASPEN, a simulator widely used by the US Department of Energy (DOE) at the Morgantown Energy Technology Center (METC), was selected as the target process simulator in the project. IPSE, written in the C language, was developed using a number of knowledge-based programming paradigms: object-oriented knowledge representation that uses inheritance and methods, rule-based inferencing (including processing and propagation of probabilistic information) and data-driven programming using demons. It was implemented using the knowledge-based environment LASER. The relationship of IPSE with the user, ASPEN, LASER and the C language is shown in Figure 1.
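
    The three-step prepare-execute-analyze loop described above can be sketched generically as below; the stand-in simulator, goal test and revision strategy are invented, and the real system's interaction with ASPEN is not modelled.

    ```python
    # Generic sketch of the iterative simulate-evaluate-revise loop described
    # above (toy stand-ins only; the real system drives the ASPEN simulator).

    def run_until_goals_met(initial_inputs, run_simulation, goals_satisfied,
                            revise_inputs, max_iterations=20):
        """Repeat: run the simulator on the inputs, check goals, revise inputs."""
        inputs = initial_inputs
        for _ in range(max_iterations):
            results = run_simulation(inputs)          # step 2: execute the simulation
            if goals_satisfied(results):              # step 3: analyse the results
                return inputs, results
            inputs = revise_inputs(inputs, results)   # modify the input data and repeat
        raise RuntimeError("goals not met within the iteration budget")

    # Toy stand-ins: tune a single parameter until the simulated yield reaches 0.9.
    result = run_until_goals_met(
        {"temperature": 300.0},
        run_simulation=lambda p: {"yield": min(1.0, p["temperature"] / 400.0)},
        goals_satisfied=lambda r: r["yield"] >= 0.9,
        revise_inputs=lambda p, r: {"temperature": p["temperature"] + 20.0},
    )
    print(result)
    ```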

  1. Document Retrieval Using A Fuzzy Knowledge-Based System

    NASA Astrophysics Data System (ADS)

    Subramanian, Viswanath; Biswas, Gautam; Bezdek, James C.

    1986-03-01

    This paper presents the design and development of a prototype document retrieval system using a knowledge-based systems approach. Both the domain-specific knowledge base and the inferencing schemes are based on a fuzzy set theoretic framework. A query in natural language represents a request to retrieve a relevant subset of documents from a document base. Such a query, which can include both fuzzy terms and fuzzy relational operators, is converted into an unambiguous intermediate form by a natural language interface. Concepts that describe domain topics and the relationships between concepts, such as the synonym relation and the implication relation between a general concept and more specific concepts, have been captured in a knowledge base. The knowledge base enables the system to emulate the reasoning process followed by an expert, such as a librarian, in understanding and reformulating user queries. The retrieval mechanism processes the query in two steps. First it produces a pruned list of documents pertinent to the query. Second, it uses an evidence combination scheme to compute a degree of support between the query and individual documents produced in step one. The front-end component of the system then presents a set of document citations to the user in ranked order as an answer to the information request.
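
    A minimal sketch of the two-step retrieval described above, pruning documents that mention no query concept and then combining per-concept membership degrees into a degree of support, is given below; the mean operator and the toy index are assumptions, since the abstract does not give the exact evidence-combination scheme.

    ```python
    # Sketch of two-step fuzzy retrieval: prune documents indexed by none of the
    # query concepts, then combine per-concept membership degrees into a degree
    # of support (mean used here as an assumed evidence-combination operator).

    documents = {
        "doc1": {"retrieval": 0.9, "fuzzy logic": 0.7},
        "doc2": {"retrieval": 0.4},
        "doc3": {"expert systems": 0.8},
    }

    def retrieve(query_concepts, docs):
        # Step 1: prune documents that mention none of the query concepts.
        pruned = {d: m for d, m in docs.items() if any(c in m for c in query_concepts)}
        # Step 2: degree of support = mean of per-concept memberships (0 if absent).
        support = {d: sum(m.get(c, 0.0) for c in query_concepts) / len(query_concepts)
                   for d, m in pruned.items()}
        return sorted(support.items(), key=lambda item: item[1], reverse=True)

    print(retrieve(["retrieval", "fuzzy logic"], documents))   # doc1 ranked above doc2
    ```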

  2. Hyperincursion and the Globalization of the Knowledge-Based Economy

    NASA Astrophysics Data System (ADS)

    Leydesdorff, Loet

    2006-06-01

    In biological systems, the capacity of anticipation—that is, entertaining a model of the system within the system—can be considered as naturally given. Human languages enable psychological systems to construct and exchange mental models of themselves and their environments reflexively, that is, to provide meaning to events. At the level of the social system, expectations can further be codified. When these codifications are functionally differentiated—as between market mechanisms and scientific research programs—the potential asynchronicity in the update among the subsystems provides room for a second anticipatory mechanism at the level of the transversal information exchange among differently codified meaning-processing subsystems. Interactions between the two different anticipatory mechanisms (the transversal one and the one along the time axis in each subsystem) may lead to co-evolutions and the stabilization of expectations along trajectories. The wider horizon of knowledgeable expectations can be expected to meta-stabilize and also globalize a previously stabilized configuration of expectations against the axis of time. While stabilization can be considered a consequence of interaction and aggregation among incursive formulations of the logistic equation, globalization can be modeled using the hyperincursive formulation of this equation. The knowledge-based subdynamic at the global level which thus emerges enables historical agents to inform the reconstruction of previous states and to co-construct future states of the social system, for example, in a techno-economic co-evolution.
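
    For reference, the recursive, incursive and hyperincursive formulations of the logistic equation alluded to above are commonly written as follows (after Dubois); this is only a reminder of the standard forms, and the exact parameterization used in the article itself may differ.

        % Classical, incursive, and hyperincursive logistic maps (after Dubois);
        % the exact formulation used in the paper may differ.
        \begin{align}
          x_{t+1} &= a\,x_{t}\,(1 - x_{t})       && \text{(recursive)} \\
          x_{t+1} &= a\,x_{t}\,(1 - x_{t+1})     && \text{(incursive)} \\
          x_{t+1} &= a\,x_{t+1}\,(1 - x_{t+1})   && \text{(hyperincursive)}
        \end{align}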

  3. Verification of Legal Knowledge-base with Conflictive Concept

    NASA Astrophysics Data System (ADS)

    Hagiwara, Shingo; Tojo, Satoshi

    In this paper, we propose a verification methodology for large-scale legal knowledge. With a revision of a legal code, we are forced to revise other affected code as well to keep the law consistent. Thus, our task is to revise the affected area properly and to investigate its adequacy. In this study, we extend the notion of inconsistency beyond ordinary logical inconsistency to include conceptual conflicts. We obtain these conflicts from taxonomy data, and thus we can avoid tedious manual declarations of opposing words. In the verification process, we adopt extended disjunctive logic programming (EDLP) to tolerate multiple consequences for a given set of antecedents. In addition, we employ abductive logic programming (ALP), regarding the situations to which the rules are applied as premises. Also, we restrict the legal knowledge base to an acyclic program to avoid circular definitions and to justify the relevance of verdicts; therefore, detecting cyclic parts of the legal knowledge is one of our objectives. The system is composed of two subsystems: a preprocessor implemented in Ruby to facilitate string manipulation, and a verifier implemented in Prolog to perform the logical inference. We also employ the XML format in the system to retain readability. In this study, we verify the actual code of ordinances of Toyama prefecture and present the experimental results.
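
    Since detecting cyclic definitions is named as one of the objectives above, the sketch below shows an acyclicity check on a rule dependency graph. It is a generic depth-first cycle search in Python with made-up predicates; the actual system performs this check over Prolog-encoded legal rules.

        # Acyclicity check on a rule dependency graph (made-up predicates; the real
        # system works on Prolog-encoded legal rules).
        def find_cycle(rules):
            """rules: dict mapping a defined predicate to the predicates its body uses.
            Returns one cyclic path if the program is not acyclic, else None."""
            WHITE, GREY, BLACK = 0, 1, 2
            color = {p: WHITE for p in rules}
            path = []

            def dfs(p):
                color[p] = GREY
                path.append(p)
                for q in rules.get(p, ()):
                    if color.get(q, WHITE) == GREY:            # back edge: cycle found
                        return path[path.index(q):] + [q]
                    if color.get(q, WHITE) == WHITE and q in rules:
                        cycle = dfs(q)
                        if cycle:
                            return cycle
                path.pop()
                color[p] = BLACK
                return None

            for p in rules:
                if color[p] == WHITE:
                    cycle = dfs(p)
                    if cycle:
                        return cycle
            return None

        # "permit" is (indirectly) defined in terms of itself:
        rules = {"permit": ["license"], "license": ["permit"], "fee": ["license"]}
        print(find_cycle(rules))   # ['permit', 'license', 'permit']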

  4. Knowledge-based control of an adaptive interface

    NASA Technical Reports Server (NTRS)

    Lachman, Roy

    1989-01-01

    The analysis, development strategy, and preliminary design for an intelligent, adaptive interface is reported. The design philosophy couples knowledge-based system technology with standard human factors approaches to interface development for computer workstations. An expert system has been designed to drive the interface for application software. The intelligent interface will be linked to application packages, one at a time, that are planned for multiple-application workstations aboard Space Station Freedom. Current requirements call for most Space Station activities to be conducted at the workstation consoles. One set of activities will consist of standard data management services (DMS). DMS software includes text processing, spreadsheets, data base management, etc. Text processing was selected for the first intelligent interface prototype because text-processing software can be developed initially as fully functional but limited with a small set of commands. The program's complexity then can be increased incrementally. The intelligent interface includes the operator's behavior and three types of instructions to the underlying application software are included in the rule base. A conventional expert-system inference engine searches the data base for antecedents to rules and sends the consequents of fired rules as commands to the underlying software. Plans for putting the expert system on top of a second application, a database management system, will be carried out following behavioral research on the first application. The intelligent interface design is suitable for use with ground-based workstations now common in government, industrial, and educational organizations.

  5. Selection of construction methods: a knowledge-based approach.

    PubMed

    Ferrada, Ximena; Serpell, Alfredo; Skibniewski, Miroslaw

    2013-01-01

    The appropriate selection of construction methods to be used during the execution of a construction project is a major determinant of high productivity, but sometimes this selection process is performed without the care and the systematic approach that it deserves, bringing negative consequences. This paper proposes a knowledge management approach that will enable the intelligent use of corporate experience and information and help to improve the selection of construction methods for a project. Then a knowledge-based system to support this decision-making process is proposed and described. To define and design the system, semistructured interviews were conducted within three construction companies with the purpose of studying the way that the method' selection process is carried out in practice and the knowledge associated with it. A prototype of a Construction Methods Knowledge System (CMKS) was developed and then validated with construction industry professionals. As a conclusion, the CMKS was perceived as a valuable tool for construction methods' selection, by helping companies to generate a corporate memory on this issue, reducing the reliance on individual knowledge and also the subjectivity of the decision-making process. The described benefits as provided by the system favor a better performance of construction projects.

  6. Matching sensors to missions using a knowledge-based approach

    NASA Astrophysics Data System (ADS)

    Preece, Alun; Gomez, Mario; de Mel, Geeth; Vasconcelos, Wamberto; Sleeman, Derek; Colley, Stuart; Pearson, Gavin; Pham, Tien; La Porta, Thomas

    2008-04-01

    Making decisions on how best to utilise limited intelligence, surveillance and reconnaisance (ISR) resources is a key issue in mission planning. This requires judgements about which kinds of available sensors are more or less appropriate for specific ISR tasks in a mission. A methodological approach to addressing this kind of decision problem in the military context is the Missions and Means Framework (MMF), which provides a structured way to analyse a mission in terms of tasks, and assess the effectiveness of various means for accomplishing those tasks. Moreover, the problem can be defined as knowledge-based matchmaking: matching the ISR requirements of tasks to the ISR-providing capabilities of available sensors. In this paper we show how the MMF can be represented formally as an ontology (that is, a specification of a conceptualisation); we also represent knowledge about ISR requirements and sensors, and then use automated reasoning to solve the matchmaking problem. We adopt the Semantic Web approach and the Web Ontology Language (OWL), allowing us to import elements of existing sensor knowledge bases. Our core ontologies use the description logic subset of OWL, providing efficient reasoning. We describe a prototype tool as a proof-of-concept for our approach. We discuss the various kinds of possible sensor-mission matches, both exact and inexact, and how the tool helps mission planners consider alternative choices of sensors.
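
    The matchmaking idea can be conveyed with a deliberately simplified sketch in which plain Python sets stand in for the OWL/description-logic capability descriptions used by the authors; the sensor names and capability labels are hypothetical.

        # Simplified capability matchmaking: plain sets stand in for the OWL-based
        # reasoning; sensor names and capability labels are hypothetical.
        SENSORS = {
            "acoustic_array":  {"acoustic"},
            "video_camera":    {"imagery", "daytime"},
            "infrared_camera": {"imagery", "night"},
        }

        def match(task_requirements, sensors=SENSORS):
            """Return (exact, inexact) matches: sensors covering all vs. only some requirements."""
            exact, inexact = [], []
            for name, capabilities in sensors.items():
                covered = task_requirements & capabilities
                if covered == task_requirements:
                    exact.append(name)
                elif covered:
                    inexact.append((name, sorted(task_requirements - capabilities)))
            return exact, inexact

        exact, inexact = match({"imagery", "night"})
        print("exact:", exact)       # ['infrared_camera']
        print("inexact:", inexact)   # [('video_camera', ['night'])] with missing capabilities listed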

  7. Framework Support For Knowledge-Based Software Development

    NASA Astrophysics Data System (ADS)

    Huseth, Steve

    1988-03-01

    The advent of personal engineering workstations has brought substantial information processing power to the individual programmer. Advanced tools and environment capabilities supporting the software lifecycle are just beginning to become generally available. However, many of these tools are addressing only part of the software development problem by focusing on rapid construction of self-contained programs by a small group of talented engineers. Additional capabilities are required to support the development of large programming systems where a high degree of coordination and communication is required among large numbers of software engineers, hardware engineers, and managers. A major player in realizing these capabilities is the framework supporting the software development environment. In this paper we discuss our research toward a Knowledge-Based Software Assistant (KBSA) framework. We propose the development of an advanced framework containing a distributed knowledge base that can support the data representation needs of tools, provide environmental support for the formalization and control of the software development process, and offer a highly interactive and consistent user interface.

  8. Knowledge-based approach to video content classification

    NASA Astrophysics Data System (ADS)

    Chen, Yu; Wong, Edward K.

    2001-01-01

    A framework for video content classification using a knowledge-based approach is herein proposed. This approach is motivated by the fact that videos are rich in semantic content, which can best be interpreted and analyzed by human experts. We demonstrate the concept by implementing a prototype video classification system using the rule-based programming language CLIPS 6.05. Knowledge for video classification is encoded as a set of rules in the rule base. The left-hand sides of rules contain high-level and low-level features, while the right-hand sides of rules contain intermediate results or conclusions. Our current implementation includes features computed from motion, color, and text extracted from video frames. Our current rule set allows us to classify input video into one of five classes: news, weather reporting, commercial, basketball and football. We use MYCIN's inexact reasoning method to combine evidence and to handle the uncertainties in the features and in the classification results. We obtained good results in a preliminary experiment, demonstrating the validity of the proposed approach.
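
    MYCIN's certainty-factor combination mentioned above can be summarized with the following sketch. It is written in Python purely for illustration (the system itself is implemented in CLIPS), and the rule confidences are invented.

        # MYCIN-style combination of certainty factors (Python stand-in for the CLIPS
        # implementation; rule confidences are invented).
        def combine_cf(cf1, cf2):
            """Combine two certainty factors in [-1, 1] using MYCIN's scheme."""
            if cf1 >= 0 and cf2 >= 0:
                return cf1 + cf2 * (1 - cf1)
            if cf1 < 0 and cf2 < 0:
                return cf1 + cf2 * (1 + cf1)
            return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

        # Three rules supporting the class "basketball" with different confidences,
        # e.g. from motion, color, and text cues:
        cf = 0.0
        for evidence in (0.6, 0.4, -0.2):
            cf = combine_cf(cf, evidence)
        print(f"combined certainty for 'basketball': {cf:.2f}")   # 0.70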

  9. Knowledge-based approach to video content classification

    NASA Astrophysics Data System (ADS)

    Chen, Yu; Wong, Edward K.

    2000-12-01

    A framework for video content classification using a knowledge-based approach is herein proposed. This approach is motivated by the fact that videos are rich in semantic content, which can best be interpreted and analyzed by human experts. We demonstrate the concept by implementing a prototype video classification system using the rule-based programming language CLIPS 6.05. Knowledge for video classification is encoded as a set of rules in the rule base. The left-hand sides of rules contain high-level and low-level features, while the right-hand sides of rules contain intermediate results or conclusions. Our current implementation includes features computed from motion, color, and text extracted from video frames. Our current rule set allows us to classify input video into one of five classes: news, weather reporting, commercial, basketball and football. We use MYCIN's inexact reasoning method to combine evidence and to handle the uncertainties in the features and in the classification results. We obtained good results in a preliminary experiment, demonstrating the validity of the proposed approach.

  10. A knowledge-based system design/information tool

    NASA Technical Reports Server (NTRS)

    Allen, James G.; Sikora, Scott E.

    1990-01-01

    The objective of this effort was to develop a Knowledge Capture System (KCS) for the Integrated Test Facility (ITF) at the Dryden Flight Research Facility (DFRF). The DFRF is a NASA Ames Research Center (ARC) facility. This system was used to capture the design and implementation information for NASA's high angle-of-attack research vehicle (HARV), a modified F/A-18A. In particular, the KCS was used to capture specific characteristics of the design of the HARV fly-by-wire (FBW) flight control system (FCS). The KCS utilizes artificial intelligence (AI) knowledge-based system (KBS) technology. The KCS enables the user to capture the following characteristics of automated systems: the system design; the hardware (H/W) design and implementation; the software (S/W) design and implementation; and the utilities (electrical and hydraulic) design and implementation. A generic version of the KCS was developed which can be used to capture the design information for any automated system. The deliverable items for this project consist of the prototype generic KCS and an application, which captures selected design characteristics of the HARV FCS.

  11. Knowledge-Based Query Construction Using the CDSS Knowledge Base for Efficient Evidence Retrieval

    PubMed Central

    Afzal, Muhammad; Hussain, Maqbool; Ali, Taqdir; Hussain, Jamil; Khan, Wajahat Ali; Lee, Sungyoung; Kang, Byeong Ho

    2015-01-01

    Finding appropriate evidence to support clinical practices is always challenging, and the construction of a query to retrieve such evidence is a fundamental step. Typically, evidence is found using manual or semi-automatic methods, which are time-consuming and sometimes make it difficult to construct knowledge-based complex queries. To overcome the difficulty in constructing knowledge-based complex queries, we utilized the knowledge base (KB) of the clinical decision support system (CDSS), which has the potential to provide sufficient contextual information. To automatically construct knowledge-based complex queries, we designed methods to parse the rule structure in the KB of the CDSS in order to determine an executable path and to extract the terms by parsing the control structures and logical connectives used in the logic. The automatically constructed knowledge-based complex queries were executed on the PubMed search service to evaluate how far they reduced the number of retrieved citations while keeping relevance high. With the knowledge-based query construction approach, the average number of citations was reduced from 56,249 to 330, and the average number of query terms increased from 1 to 6. Based on feedback collected from clinicians, the ability to automatically retrieve relevant evidence saves them considerable time. This approach is generally useful in evidence-based medicine, especially in ambient assisted living environments where automation is highly important. PMID:26343669
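
    A minimal sketch of the query-construction step is given below: terms extracted from a CDSS rule are grouped into a boolean PubMed-style query string. The term lists and the grouping are hypothetical; the actual parsing of rule control structures and logical connectives is more involved.

        # Building a boolean PubMed-style query string from terms extracted out of a
        # CDSS rule (term lists and grouping are hypothetical).
        def build_query(condition_terms, intervention_terms):
            block = lambda terms: "(" + " OR ".join(f'"{t}"[Title/Abstract]' for t in terms) + ")"
            return " AND ".join(block(t) for t in (condition_terms, intervention_terms) if t)

        query = build_query(
            condition_terms=["head and neck cancer", "oral cavity cancer"],
            intervention_terms=["radiotherapy", "chemoradiation"],
        )
        print(query)
        # ("head and neck cancer"[Title/Abstract] OR ...) AND ("radiotherapy"[Title/Abstract] OR ...)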

  12. Autonomous Cryogenic Load Operations: Knowledge-Based Autonomous Test Engineer

    NASA Technical Reports Server (NTRS)

    Schrading, J. Nicolas

    2013-01-01

    The Knowledge-Based Autonomous Test Engineer (KATE) program has a long history at KSC. Now a part of the Autonomous Cryogenic Load Operations (ACLO) mission, this software system has been sporadically developed over the past 20 years. Originally designed to provide health and status monitoring for a simple water-based fluid system, it was proven to be a capable autonomous test engineer for determining sources of failure in the system. As part of a new goal to provide this same anomaly-detection capability for a complicated cryogenic fluid system, software engineers, physicists, interns and KATE experts are working to upgrade the software capabilities and graphical user interface. Much progress was made during this effort to improve KATE. A display of the entire cryogenic system's graph, with nodes for components and edges for their connections, was added to the KATE software. A searching functionality was added to the new graph display, so that users could easily center their screen on specific components. The GUI was also modified so that it displayed information relevant to the new project goals. In addition, work began on adding new pneumatic and electronic subsystems into the KATE knowledge base, so that it could provide health and status monitoring for those systems. Finally, many fixes for bugs, memory leaks, and memory errors were implemented and the system was moved into a state in which it could be presented to stakeholders. Overall, the KATE system was improved and necessary additional features were added so that a presentation of the program and its functionality in the next few months would be a success.

  13. A knowledge-based information system for monitoring drug levels.

    PubMed

    Wiener, F; Groth, T; Mortimer, O; Hallquist, I; Rane, A

    1989-06-01

    The expert system shell SMR has been enhanced to include information system routines for designing data screens and providing facilities for data entry, storage, retrieval, queries and descriptive statistics. The data for inference making is abstracted from the data base record and inserted into a data array to which the knowledge base is applied to derive the appropriate advice and comments. The enhanced system has been used to develop an intelligent information system for monitoring serum drug levels which includes evaluation of temporal changes and production of specialized printed reports. The module for digoxin has been fully developed and validated. To demonstrate the extension to other drugs a module for phenytoin was constructed with only a rudimentary knowledge base. Data from the request forms together with the S-digoxin results are entered into the data base by the department secretary. The day's results are then reviewed by the clinical pharmacologist. For each case, previous results may be displayed and are taken into account by the system in the decision process. The knowledge base is applied to the data to formulate an evaluative comment on the report returned to the requestor. The report includes a semi-graphic presentation of the current and previous results and either the system's interpretation or one entered by the pharmacologist if he does not agree with it. The pharmacologist's comment is also recorded in the data base for future retrieval, analysis and possible updating of the knowledge base. The system is now undergoing testing and evaluation under routine operations in the clinical pharmacology service. It is a prototype for other applications in both laboratory and clinical medicine currently under development at Uppsala University Hospital. This system may thus provide a vehicle for a more intensive penetration of knowledge-based systems in practical medical applications.

  14. Minimizing proteome redundancy in the UniProt Knowledgebase

    PubMed Central

    Bursteinas, Borisas; Britto, Ramona; Bely, Benoit; Auchincloss, Andrea; Rivoire, Catherine; Redaschi, Nicole; O'Donovan, Claire; Martin, Maria Jesus

    2016-01-01

    Advances in high-throughput sequencing have led to an unprecedented growth in genome sequences being submitted to biological databases. In particular, the sequencing of large numbers of nearly identical bacterial genomes during infection outbreaks and for other large-scale studies has resulted in a high level of redundancy in nucleotide databases and consequently in the UniProt Knowledgebase (UniProtKB). Redundancy negatively impacts on database searches by causing slower searches, an increase in statistical bias and cumbersome result analysis. The redundancy combined with the large data volume increases the computational costs for most reuses of UniProtKB data. All of this poses challenges for effective discovery in this wealth of data. With the continuing development of sequencing technologies, it is clear that finding ways to minimize redundancy is crucial to maintaining UniProt's essential contribution to data interpretation by our users. We have developed a methodology to identify and remove highly redundant proteomes from UniProtKB. The procedure identifies redundant proteomes by performing pairwise alignments of sets of sequences for pairs of proteomes and subsequently, applies graph theory to find dominating sets that provide a set of non-redundant proteomes with a minimal loss of information. This method was implemented for bacteria in mid-2015, resulting in a removal of 50 million proteins in UniProtKB. With every new release, this procedure is used to filter new incoming proteomes, resulting in a more scalable and scientifically valuable growth of UniProtKB. Database URL: http://www.uniprot.org/proteomes/ PMID:28025334
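
    The dominating-set idea described above can be illustrated with a small greedy sketch. Here the redundancy graph is given directly and the strain names are invented; the production pipeline derives the graph from pairwise sequence alignments of proteome pairs.

        # Greedy dominating-set selection of representative proteomes. The redundancy
        # graph is given directly here (invented strains); the production pipeline
        # derives it from pairwise sequence alignments of proteome pairs.
        def representative_proteomes(graph):
            """graph: dict proteome -> set of highly similar proteomes. Returns a dominating set."""
            uncovered = set(graph)
            chosen = []
            while uncovered:
                # pick the proteome that covers the most still-uncovered proteomes
                best = max(graph, key=lambda p: len(({p} | graph[p]) & uncovered))
                chosen.append(best)
                uncovered -= {best} | graph[best]
            return chosen

        graph = {
            "strainA": {"strainB", "strainC"},   # A is nearly identical to B and C
            "strainB": {"strainA", "strainC"},
            "strainC": {"strainA", "strainB"},
            "strainD": set(),                    # no redundant relatives
        }
        print(representative_proteomes(graph))   # ['strainA', 'strainD']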

  15. Presidential Helicopter Acquisition: Program Established Knowledge-Based Business Case and Entered System Development with Plans for Managing Challenges

    DTIC Science & Technology

    2015-04-14

    Presidential Helicopter Acquisition: Program Established Knowledge-Based Business Case and Entered System Development with Plans for Managing Challenges. ...progress by establishing a knowledge-based business case for entry into system development that included an approved cost, schedule and performance

  16. Applying knowledge-based methods to design and implement an air quality workshop

    NASA Astrophysics Data System (ADS)

    Schmoldt, Daniel L.; Peterson, David L.

    1991-09-01

    In response to protection needs in class I wilderness areas, forest land managers of the USDA Forest Service must provide input to regulatory agencies regarding air pollutant impacts on air quality-related values. Regional workshops have been convened for land managers and scientists to discuss the aspects and extent of wilderness protection needs. Previous experience with a national workshop indicated that a document summarizing workshop discussions will have little operational utility. An alternative is to create a knowledge-based analytical system, in addition to the document, to aid land managers in assessing effects of air pollutants on wilderness. Knowledge-based methods were used to design and conduct regional workshops in the western United States. Extracting knowledge from a large number of workshop participants required careful planning of workshop discussions. Knowledge elicitation methods helped with this task. This knowledge-based approach appears to be effective for focusing group discussions and collecting knowledge from large groups of specialists.

  17. Installing a Local Copy of the Reactome Web Site and Knowledgebase

    PubMed Central

    McKay, Sheldon J; Weiser, Joel

    2015-01-01

    The Reactome project builds, maintains, and publishes a knowledgebase of biological pathways. The information in the knowledgebase is gathered from the experts in the field, peer reviewed, and edited by Reactome editorial staff and then published to the Reactome Web site, http://www.reactome.org (see UNIT 8.7; Croft et al., 2013). The Reactome software is open source and builds on top of other open-source or freely available software. Reactome data and code can be freely downloaded in its entirety and the Web site installed locally. This allows for more flexible interrogation of the data and also makes it possible to add one’s own information to the knowledgebase. PMID:26087747

  18. A knowledge-based approach to automated flow-field zoning for computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Vogel, Alison Andrews

    1989-01-01

    An automated three-dimensional zonal grid generation capability for computational fluid dynamics is shown through the development of a demonstration computer program capable of automatically zoning the flow field of representative two-dimensional (2-D) aerodynamic configurations. The applicability of a knowledge-based programming approach to the domain of flow-field zoning is examined. Several aspects of flow-field zoning make the application of knowledge-based techniques challenging: the need for perceptual information, the role of individual bias in the design and evaluation of zonings, and the fact that the zoning process is modeled as a constructive, design-type task (for which there are relatively few examples of successful knowledge-based systems in any domain). Engineering solutions to the problems arising from these aspects are developed, and a demonstration system is implemented which can design, generate, and output flow-field zonings for representative 2-D aerodynamic configurations.

  19. PRAIS: Distributed, real-time knowledge-based systems made easy

    NASA Technical Reports Server (NTRS)

    Goldstein, David G.

    1990-01-01

    This paper discusses an architecture for real-time, distributed (parallel) knowledge-based systems called the Parallel Real-time Artificial Intelligence System (PRAIS). PRAIS strives for transparently parallelizing production (rule-based) systems, even when under real-time constraints. PRAIS accomplishes these goals by incorporating a dynamic task scheduler, operating system extensions for fact handling, and message-passing among multiple copies of CLIPS executing on a virtual blackboard. This distributed knowledge-based system tool uses the portability of CLIPS and common message-passing protocols to operate over a heterogeneous network of processors.

  20. Multisensor detection and tracking of tactical ballistic missiles using knowledge-based state estimation

    NASA Astrophysics Data System (ADS)

    Woods, Edward; Queeney, Tom

    1994-06-01

    Westinghouse has developed and demonstrated a system that performs multisensor detection and tracking of tactical ballistic missiles (TBM). Under a USAF High Gear Program, we developed knowledge-based techniques to discriminate TBM targets from ground clutter, air breathing targets, and false alarms. Upon track initiation the optimal estimate of the target's launch point, impact point and instantaneous position was computed by fusing returns from noncollocated multiple sensors. The system also distinguishes different missile types during the boost phase and forms multiple hypotheses to account for measurement and knowledge base uncertainties. This paper outlines the salient features of the knowledge-based processing of the multisensor data.

  1. Knowledge-based engineering of a PLC controlled telescope

    NASA Astrophysics Data System (ADS)

    Pessemier, Wim; Raskin, Gert; Saey, Philippe; Van Winckel, Hans; Deconinck, Geert

    2016-08-01

    demonstrate the added value that technologies such as soft-PLCs and DSL-scripts and design methodologies such as knowledge-based engineering can bring to astronomical instrumentation.

  2. A national knowledge-based crop recognition in Mediterranean environment

    NASA Astrophysics Data System (ADS)

    Cohen, Yafit; Shoshany, Maxim

    2002-08-01

    Population growth, urban expansion, land degradation, civil strife and war may place plant natural resources for food and agriculture at risk. Crop and yield monitoring is basic information necessary for wise management of these resources. Satellite remote sensing techniques have proven to be cost-effective in widespread agricultural lands in Africa, America, Europe and Australia. However, they have had limited success in Mediterranean regions that are characterized by a high rate of spatio-temporal ecological heterogeneity and high fragmentation of farming lands. An integrative knowledge-based approach is needed for this purpose, which combines imagery and geographical data within the framework of an intelligent recognition system. This paper describes the development of such a crop recognition methodology and its application to an area that comprises approximately 40% of the cropland in Israel. This area contains eight crop types that represent 70% of Israeli agricultural production. Multi-date Landsat TM images representing seasonal vegetation cover variations were converted to normalized difference vegetation index (NDVI) layers. Field boundaries were delineated by merging Landsat data with SPOT-panchromatic images. Crop recognition was then achieved in two-phases, by clustering multi-temporal NDVI layers using unsupervised classification, and then applying 'split-and-merge' rules to these clusters. These rules were formalized through comprehensive learning of relationships between crop types, imagery properties (spectral and NDVI) and auxiliary data including agricultural knowledge, precipitation and soil types. Assessment of the recognition results using ground data from the Israeli Agriculture Ministry indicated an average recognition accuracy exceeding 85% which accounts for both omission and commission errors. The two-phase strategy implemented in this study is apparently successful for heterogeneous regions. This is due to the fact that it allows
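
    The NDVI layers used in the first phase are computed per pixel as (NIR - Red) / (NIR + Red); a minimal sketch with synthetic reflectance values is shown below (the real inputs are the red and near-infrared Landsat TM bands).

        import numpy as np

        # NDVI layer from red and near-infrared reflectance (synthetic values here;
        # real inputs would be the corresponding Landsat TM bands).
        def ndvi(nir, red, eps=1e-6):
            """Normalized difference vegetation index: (NIR - Red) / (NIR + Red)."""
            nir = np.asarray(nir, dtype=float)
            red = np.asarray(red, dtype=float)
            return (nir - red) / (nir + red + eps)

        red = [[0.10, 0.30], [0.05, 0.25]]
        nir = [[0.50, 0.35], [0.45, 0.20]]
        print(np.round(ndvi(nir, red), 2))
        # Multi-date NDVI layers like this are then clustered (unsupervised) and the
        # clusters split or merged using the crop-knowledge rules described above.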

  3. Improving Student Teachers' Knowledge-Base in Language Education through Critical Reading

    ERIC Educational Resources Information Center

    Mulumba, Mathias Bwanika

    2016-01-01

    The emergence of the digital era is redefining education and the pedagogical processes in an unpredictable manner. In the midst of the increased availability of print and online resources, the twenty-first century language teacher educator expects her (or his) student teachers to be reading beings if they are to improve their knowledge-base in…

  4. The Spread of Contingent Work in the Knowledge-Based Economy

    ERIC Educational Resources Information Center

    Szabo, Katalin; Negyesi, Aron

    2005-01-01

    Permanent employment, typical of industrial societies and bolstered by numerous social guaranties, has been declining in the past 2 decades. There has been a steady expansion of various forms of contingent work. The decomposition of traditional work is a logical consequence of the characteristic patterns of the knowledge-based economy. According…

  5. Small Knowledge-Based Systems in Education and Training: Something New Under the Sun.

    ERIC Educational Resources Information Center

    Wilson, Brent G.; Welsh, Jack R.

    1986-01-01

    Discusses artificial intelligence, robotics, natural language processing, and expert or knowledge-based systems research; examines two large expert systems, MYCIN and XCON; and reviews the resources required to build large expert systems and affordable smaller systems (intelligent job aids) for training. Expert system vendors and products are…

  6. A knowledge-based object recognition system for applications in the space station

    NASA Technical Reports Server (NTRS)

    Dhawan, Atam P.

    1988-01-01

    A knowledge-based three-dimensional (3D) object recognition system is being developed. The system uses primitive-based hierarchical relational and structural matching for the recognition of 3D objects in the two-dimensional (2D) image for interpretation of the 3D scene. At present, the pre-processing, low-level preliminary segmentation, rule-based segmentation, and feature extraction have been completed. The data structure of the primitive viewing knowledge-base (PVKB) is also complete. Algorithms and programs based on attribute-tree matching for decomposing the segmented data into valid primitives were developed. Frame-based structural and relational descriptions of some objects were created and stored in a knowledge-base. This knowledge-base of frame-based descriptions was developed on the MICROVAX-AI microcomputer in a LISP environment. Both a simulated 3D scene of simple non-overlapping objects and real camera images of low-complexity 3D objects have been successfully interpreted.

  7. End-user oriented language to develop knowledge-based expert systems

    SciTech Connect

    Ueno, H.

    1983-01-01

    A description is given of the COMEX (compact knowledge based expert system) expert system language for application-domain users who want to develop a knowledge-based expert system by themselves. The COMEX system was written in FORTRAN and works on a microcomputer. COMEX is being used in several application domains such as medicine, education, and industry. 7 references.

  8. Universities and the Knowledge-Based Economy: Perceptions from a Developing Country

    ERIC Educational Resources Information Center

    Bano, Shah; Taylor, John

    2015-01-01

    This paper considers the role of universities in the creation of a knowledge-based economy (KBE) in a developing country, Pakistan. Some developing countries have moved quickly to develop a KBE, but progress in Pakistan is much slower. Higher education plays a crucial role as part of the triple helix model for innovation. Based on the perceptions…

  9. Case-based reasoning for space applications: Utilization of prior experience in knowledge-based systems

    NASA Technical Reports Server (NTRS)

    King, James A.

    1987-01-01

    The goal is to explain Case-Based Reasoning as a vehicle to establish knowledge-based systems based on experimental reasoning for possible space applications. This goal will be accomplished through an examination of reasoning based on prior experience in a sample domain, and also through a presentation of proposed space applications which could utilize Case-Based Reasoning techniques.

  10. GUIDON-WATCH: A Graphic Interface for Viewing a Knowledge-Based System. Technical Report #14.

    ERIC Educational Resources Information Center

    Richer, Mark H.; Clancey, William J.

    This paper describes GUIDON-WATCH, a graphic interface that uses multiple windows and a mouse to allow a student to browse a knowledge base and view reasoning processes during diagnostic problem solving. The GUIDON project at Stanford University is investigating how knowledge-based systems can provide the basis for teaching programs, and this…

  11. Learning Spaces: An ICT-Enabled Model of Future Learning in the Knowledge-Based Society

    ERIC Educational Resources Information Center

    Punie, Yves

    2007-01-01

    This article presents elements of a future vision of learning in the knowledge-based society which is enabled by ICT. It is not only based on extrapolations from trends and drivers that are shaping learning in Europe but also consists of a holistic attempt to envisage and anticipate future learning needs and requirements in the KBS. The…

  12. Learning and Innovation in the Knowledge-Based Economy: Beyond Clusters and Qualifications

    ERIC Educational Resources Information Center

    James, Laura; Guile, David; Unwin, Lorna

    2013-01-01

    For over a decade policy-makers have claimed that advanced industrial societies should develop a knowledge-based economy (KBE) in response to economic globalisation and the transfer of manufacturing jobs to lower cost countries. In the UK, this vision shaped New Labour's policies for vocational education and training (VET), higher education and…

  13. The Knowledge-Based Reasoning of Physical Education Teachers: A Comparison between Groups with Different Expertise

    ERIC Educational Resources Information Center

    Reuker, Sabine

    2017-01-01

    The study addresses professional vision, including the abilities of selective attention and knowledge-based reasoning. This article focuses on the latter ability. Groups with different sport-specific and pedagogical expertise (n = 60) were compared according to their observation and interpretation of sport activities in a four-field design. The…

  14. New knowledge-based genetic algorithm for excavator boom structural optimization

    NASA Astrophysics Data System (ADS)

    Hua, Haiyan; Lin, Shuwen

    2014-03-01

    Because existing genetic algorithms make insufficient use of knowledge to guide the complex optimal search, they fail to solve the excavator boom structural optimization problem effectively. To improve optimization efficiency and quality, a new knowledge-based real-coded genetic algorithm is proposed. A dual evolution mechanism combining knowledge evolution with a genetic algorithm is established to extract, handle and utilize the shallow and deep implicit constraint knowledge to guide the optimal search of the genetic algorithm cyclically. Based on this dual evolution mechanism, knowledge evolution and population evolution can be connected by knowledge influence operators to improve the configurability of knowledge and genetic operators. New knowledge-based selection, crossover and mutation operators are then proposed to integrate the optimal process knowledge and domain culture to guide the excavator boom structural optimization. Eight test algorithms, which include different genetic operators, are taken as examples to solve the structural optimization of a medium-sized excavator boom. A comparison of the optimization results shows that the algorithm including all the new knowledge-based genetic operators improves the evolutionary rate and search ability more markedly than the other test algorithms, which demonstrates the effectiveness of knowledge in guiding the optimal search. The proposed knowledge-based genetic algorithm, combining multi-level knowledge evolution with numerical optimization, provides a new effective method for solving complex engineering optimization problems.
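
    The flavor of a knowledge-influenced operator can be conveyed by the toy sketch below, in which mutation is biased toward knowledge-derived sub-ranges of each design variable. The bounds, preferred intervals and probabilities are invented; the paper's dual evolution mechanism and operators are considerably more elaborate.

        import random

        # Toy knowledge-influenced mutation for a real-coded GA: extracted "knowledge"
        # (a preferred sub-range per design variable) biases where mutations land.
        # Bounds, intervals and probabilities are invented.
        BOUNDS    = [(0.0, 10.0)] * 3
        PREFERRED = [(2.0, 4.0), (1.0, 3.0), (6.0, 9.0)]   # knowledge-derived promising ranges

        def knowledge_based_mutation(individual, rate=0.3, bias=0.8):
            child = list(individual)
            for i, (lo, hi) in enumerate(BOUNDS):
                if random.random() < rate:
                    p_lo, p_hi = PREFERRED[i]
                    if random.random() < bias:
                        child[i] = random.uniform(p_lo, p_hi)   # exploit the knowledge
                    else:
                        child[i] = random.uniform(lo, hi)       # keep some unbiased exploration
            return child

        random.seed(1)
        print(knowledge_based_mutation([5.0, 5.0, 5.0]))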

  15. Knowledge-based extraction of adverse drug events from biomedical text

    PubMed Central

    2014-01-01

    Background Many biomedical relation extraction systems are machine-learning based and have to be trained on large annotated corpora that are expensive and cumbersome to construct. We developed a knowledge-based relation extraction system that requires minimal training data, and applied the system for the extraction of adverse drug events from biomedical text. The system consists of a concept recognition module that identifies drugs and adverse effects in sentences, and a knowledge-base module that establishes whether a relation exists between the recognized concepts. The knowledge base was filled with information from the Unified Medical Language System. The performance of the system was evaluated on the ADE corpus, consisting of 1644 abstracts with manually annotated adverse drug events. Fifty abstracts were used for training, the remaining abstracts were used for testing. Results The knowledge-based system obtained an F-score of 50.5%, which was 34.4 percentage points better than the co-occurrence baseline. Increasing the training set to 400 abstracts improved the F-score to 54.3%. When the system was compared with a machine-learning system, jSRE, on a subset of the sentences in the ADE corpus, our knowledge-based system achieved an F-score that is 7 percentage points higher than the F-score of jSRE trained on 50 abstracts, and still 2 percentage points higher than jSRE trained on 90% of the corpus. Conclusion A knowledge-based approach can be successfully used to extract adverse drug events from biomedical text without need for a large training set. Whether use of a knowledge base is equally advantageous for other biomedical relation-extraction tasks remains to be investigated. PMID:24593054
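
    The decision step and the F-score evaluation can be sketched as follows. The drug and effect dictionaries and the miniature knowledge base are hypothetical stand-ins; the actual system draws its concepts and relations from the Unified Medical Language System.

        # Knowledge-based relation decision plus F-score evaluation. The dictionaries
        # and the miniature knowledge base are hypothetical stand-ins for the UMLS.
        KB_RELATIONS = {("warfarin", "bleeding"), ("ibuprofen", "gastric ulcer")}
        DRUGS   = {"warfarin", "ibuprofen", "aspirin"}
        EFFECTS = {"bleeding", "gastric ulcer", "headache"}

        def extract(sentence):
            text = sentence.lower()
            drugs   = {d for d in DRUGS if d in text}
            effects = {e for e in EFFECTS if e in text}
            # assert a relation only if the pair is supported by the knowledge base
            return {(d, e) for d in drugs for e in effects if (d, e) in KB_RELATIONS}

        def f_score(predicted, gold):
            tp = len(predicted & gold)
            precision = tp / len(predicted) if predicted else 0.0
            recall    = tp / len(gold) if gold else 0.0
            return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

        pred = extract("Warfarin was associated with bleeding and headache.")
        print(pred, f_score(pred, gold={("warfarin", "bleeding")}))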

  16. A NASA/RAE cooperation in the development of a real-time knowledge-based autopilot

    NASA Technical Reports Server (NTRS)

    Daysh, Colin; Corbin, Malcolm; Butler, Geoff; Duke, Eugene L.; Belle, Steven D.; Brumbaugh, Randal W.

    1991-01-01

    As part of a US/UK cooperative aeronautical research program, a joint activity between the NASA Dryden Flight Research Facility and the Royal Aerospace Establishment on knowledge-based systems was established. This joint activity is concerned with tools and techniques for the implementation and validation of real-time knowledge-based systems. The proposed next stage of this research is described, in which some of the problems of implementing and validating a knowledge-based autopilot for a generic high-performance aircraft are investigated.

  17. A community effort towards a knowledge-base and mathematical model of the human pathogen Salmonella Typhimurium LT2

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Metabolic reconstructions (MRs) are common denominators in systems biology and represent biochemical, genetic, and genomic (BiGG) knowledge-bases for target organisms by capturing currently available information in a consistent, structured manner. Salmonella enterica subspecies I serovar Typhimurium...

  18. Knowledge-Based Reinforcement Learning for Data Mining

    NASA Astrophysics Data System (ADS)

    Kudenko, Daniel; Grzes, Marek

    experts have developed heuristics that help them in planning and scheduling resources in their work place. However, this domain knowledge is often rough and incomplete. When the domain knowledge is used directly by an automated expert system, the solutions are often sub-optimal, due to the incompleteness of the knowledge, the uncertainty of environments, and the possibility of encountering unexpected situations. RL, on the other hand, can overcome the weaknesses of the heuristic domain knowledge and produce optimal solutions. In the talk we propose two techniques, which represent first steps in the area of knowledge-based RL (KBRL). The first technique [1] uses high-level STRIPS operator knowledge in reward shaping to focus the search for the optimal policy. Empirical results show that the plan-based reward shaping approach outperforms other RL techniques, including alternative manual and MDP-based reward shaping when it is used in its basic form. We showed that MDP-based reward shaping may fail, and successful experiments with STRIPS-based shaping suggest modifications which can overcome the encountered problems. The STRIPS-based method we propose allows the same domain knowledge to be expressed in a different way, and the domain expert can choose whether to define an MDP or a STRIPS planning task. We also evaluated the robustness of the proposed STRIPS-based technique to errors in the plan knowledge. In case STRIPS knowledge is not available, we propose a second technique [2] that shapes the reward with hierarchical tile coding. Where the Q-function is represented with low-level tile coding, a V-function with coarser tile coding can be learned in parallel and used to approximate the potential for ground states. In the context of data mining, our KBRL approaches can also be used for any data collection task where the acquisition of data may incur considerable cost. In addition, observing the data collection agent in specific scenarios may lead to new insights into optimal data
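
    Plan-based reward shaping of the kind described above is usually realized as potential-based shaping, F(s, s') = gamma * phi(s') - phi(s), with the potential phi derived from progress along the high-level plan. The sketch below assumes a made-up three-step plan and is only illustrative of the general scheme, not of the authors' implementation.

        # Potential-based (plan-based) reward shaping: the potential of a state is how
        # far along the high-level plan it is; the shaping term is
        # F(s, s') = gamma * phi(s') - phi(s). Plan steps and rewards are invented.
        PLAN = ["collect_sample", "analyse_sample", "report_result"]

        def phi(state):
            """Potential: index of the furthest plan step already achieved in `state`."""
            achieved = [i + 1 for i, step in enumerate(PLAN) if step in state]
            return float(max(achieved, default=0))

        def shaped_reward(reward, state, next_state, gamma=0.99):
            return reward + gamma * phi(next_state) - phi(state)

        s, s_next = {"collect_sample"}, {"collect_sample", "analyse_sample"}
        print(shaped_reward(0.0, s, s_next))   # positive bonus for progressing along the plan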

  19. Knowledge-based vision for space station object motion detection, recognition, and tracking

    NASA Technical Reports Server (NTRS)

    Symosek, P.; Panda, D.; Yalamanchili, S.; Wehner, W., III

    1987-01-01

    Computer vision, especially color image analysis and understanding, has much to offer in the area of the automation of Space Station tasks such as construction, satellite servicing, rendezvous and proximity operations, inspection, experiment monitoring, data management and training. Knowledge-based techniques improve the performance of vision algorithms for unstructured environments because of their ability to deal with imprecise a priori information or inaccurately estimated feature data and still produce useful results. Conventional techniques using statistical and purely model-based approaches lack flexibility in dealing with the variabilities anticipated in the unstructured viewing environment of space. Algorithms developed under NASA sponsorship for Space Station applications to demonstrate the value of a hypothesized architecture for a Video Image Processor (VIP) are presented. Approaches to the enhancement of the performance of these algorithms with knowledge-based techniques and the potential for deployment of highly-parallel multi-processor systems for these algorithms are discussed.

  20. Delivering spacecraft control centers with embedded knowledge-based systems: The methodology issue

    NASA Astrophysics Data System (ADS)

    Ayache, S.; Haziza, M.; Cayrac, D.

    1994-11-01

    Matra Marconi Space (MMS) occupies a leading place in Europe in the domain of satellite and space data processing systems. The maturity of the knowledge-based systems (KBS) technology, the theoretical and practical experience acquired in the development of prototype, pre-operational and operational applications, make it possible today to consider the wide operational deployment of KBS's in space applications. In this perspective, MMS has to prepare the introduction of the new methods and support tools that will form the basis of the development of such systems. This paper introduces elements of the MMS methodology initiatives in the domain and the main rationale that motivated the approach. These initiatives develop along two main axes: knowledge engineering methods and tools, and a hybrid method approach for coexisting knowledge-based and conventional developments.

  1. A knowledge-based framework for image enhancement in aviation security.

    PubMed

    Singh, Maneesha; Singh, Sameer; Partridge, Derek

    2004-12-01

    The main aim of this paper is to present a knowledge-based framework for automatically selecting the best image enhancement algorithm, from several available, on a per-image basis in the context of X-ray images of airport luggage. The approach detailed involves a system that learns to map image features that represent an image's viewability to one or more chosen enhancement algorithms. Viewability measures have been developed to provide an automatic check on the quality of the enhanced image, i.e., is it really enhanced? The choice is based on ground-truth information generated by human X-ray screening experts. For a new image, such a system predicts the best-suited enhancement algorithm. Our research details the various characteristics of the knowledge-based system and shows extensive results on real images.
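
    The per-image selection idea can be sketched as a simple learned mapping from viewability features to the enhancement algorithm preferred by screeners on similar images. The nearest-neighbour rule, feature names and training pairs below are assumptions for illustration, not the authors' learner.

        # Per-image selection sketch: map viewability features of a new image to the
        # enhancement algorithm experts preferred for similar images. The feature names,
        # training pairs and nearest-neighbour rule are assumptions for illustration.
        TRAINING = [
            # (contrast, edge_density, noise)  ->  algorithm preferred by screeners
            ((0.2, 0.1, 0.6), "histogram_equalisation"),
            ((0.7, 0.5, 0.1), "unsharp_masking"),
            ((0.4, 0.2, 0.8), "median_then_stretch"),
        ]

        def select_enhancement(features):
            dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
            return min(TRAINING, key=lambda pair: dist(features, pair[0]))[1]

        print(select_enhancement((0.25, 0.15, 0.55)))   # -> histogram_equalisation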

  2. Design and implementation of knowledge-based framework for ground objects recognition in remote sensing images

    NASA Astrophysics Data System (ADS)

    Chen, Shaobin; Ding, Mingyue; Cai, Chao; Fu, Xiaowei; Sun, Yue; Chen, Duo

    2009-10-01

    The advance of image processing makes knowledge-based automatic image interpretation much more realistic than ever. In the domain of remote sensing image processing, the introduction of knowledge enhances the confidence of recognition of typical ground objects. There are mainly two approaches to employ knowledge: the first one is scattering knowledge in concrete program and relevant knowledge of ground objects are fixed by programming; the second is systematically storing knowledge in knowledge base to offer a unified instruction for each object recognition procedure. In this paper, a knowledge-based framework for ground objects recognition in remote sensing image is proposed. This framework takes the second means for using knowledge with a hierarchical architecture. The recognition of typical airport demonstrated the feasibility of the proposed framework.

  3. Delivering spacecraft control centers with embedded knowledge-based systems: The methodology issue

    NASA Technical Reports Server (NTRS)

    Ayache, S.; Haziza, M.; Cayrac, D.

    1994-01-01

    Matra Marconi Space (MMS) occupies a leading place in Europe in the domain of satellite and space data processing systems. The maturity of the knowledge-based systems (KBS) technology, the theoretical and practical experience acquired in the development of prototype, pre-operational and operational applications, make it possible today to consider the wide operational deployment of KBS's in space applications. In this perspective, MMS has to prepare the introduction of the new methods and support tools that will form the basis of the development of such systems. This paper introduces elements of the MMS methodology initiatives in the domain and the main rationale that motivated the approach. These initiatives develop along two main axes: knowledge engineering methods and tools, and a hybrid method approach for coexisting knowledge-based and conventional developments.

  4. Increasing levels of assistance in refinement of knowledge-based retrieval systems

    NASA Technical Reports Server (NTRS)

    Baudin, Catherine; Kedar, Smadar; Pell, Barney

    1994-01-01

    The task of incrementally acquiring and refining the knowledge and algorithms of a knowledge-based system in order to improve its performance over time is discussed. In particular, the design of DE-KART, a tool whose goal is to provide increasing levels of assistance in acquiring and refining indexing and retrieval knowledge for a knowledge-based retrieval system, is presented. DE-KART starts with knowledge that was entered manually, and increases its level of assistance in acquiring and refining that knowledge, both in terms of the increased level of automation in interacting with users, and in terms of the increased generality of the knowledge. DE-KART is at the intersection of machine learning and knowledge acquisition: it is a first step towards a system which moves along a continuum from interactive knowledge acquisition to increasingly automated machine learning as it acquires more knowledge and experience.

  5. Virtual Center for Renal Support: Definition of a Novel Knowledge-Based Telemedicine System

    DTIC Science & Technology

    2007-11-02

    second part, the formal definition of the novel Virtual Center for Renal Support (VCRS) is done. Design of VCRS relies on a model-based system...supervision of therapies. Keywords – Remote healthcare, telemedicine, ESRD, peritoneal dialysis, hemodialysis, ESRD costs, knowledge-based assistance...patients was 25.689 (745 pmp) [3], but 40% of prevalent ESRD patients had a functioning graft, 55% were in hemodialysis therapy and the rest were

  6. Knowledge-based and integrated monitoring and diagnosis in autonomous power systems

    NASA Technical Reports Server (NTRS)

    Momoh, J. A.; Zhang, Z. Z.

    1990-01-01

    A new technique of knowledge-based and integrated monitoring and diagnosis (KBIMD) to deal with abnormalities and incipient or potential failures in autonomous power systems is presented. The KBIMD conception is discussed as a new function of autonomous power system automation. Available diagnostic modelling, system structure, principles and strategies are suggested. In order to verify the feasibility of the KBIMD, a preliminary prototype expert system is designed to simulate the KBIMD function in a main electric network of the autonomous power system.

  7. Strategic Concept of Competition Model in Knowledge-Based Logistics in Machinebuilding

    NASA Astrophysics Data System (ADS)

    Medvedeva, O. V.

    2015-09-01

    The competitive labor market needs serious change, and machinebuilding is one of its main problem domains. The current drive to promote competition for human capital calls for modernization. It is therefore necessary to develop a strategy for the social and economic promotion of competition under the conditions of a knowledge-based economy, particularly in machinebuilding. The paper demonstrates this necessity and discusses the basic difficulties such a strategy faces in machinebuilding.

  8. Knowledge-based approach to fault diagnosis and control in distributed process environments

    NASA Astrophysics Data System (ADS)

    Chung, Kwangsue; Tou, Julius T.

    1991-03-01

    This paper presents a new design approach to knowledge-based decision support systems for fault diagnosis and control for quality assurance and productivity improvement in automated manufacturing environments. Based on the observed manifestations, the knowledge-based diagnostic system hypothesizes a set of the most plausible disorders by mimicking the reasoning process of a human diagnostician. The data integration technique is designed to generate error-free hierarchical category files. A novel approach to diagnostic problem solving has been proposed by integrating the PADIKS (Pattern-Directed Knowledge-Based System) concept and the symbolic model of diagnostic reasoning based on the categorical causal model. The combination of symbolic causal reasoning and pattern-directed reasoning produces a highly efficient diagnostic procedure and generates a more realistic expert behavior. In addition, three distinctive constraints are designed to further reduce the computational complexity and to eliminate non-plausible hypotheses involved in the multiple disorders problem. The proposed diagnostic mechanism, which consists of three different levels of reasoning operations, significantly reduces the computational complexity in the diagnostic problem with uncertainty by systematically shrinking the hypotheses space. This approach is applied to the test and inspection data collected from a PCB manufacturing operation.

  9. A knowledge-based flight status monitor for real-time application in digital avionics systems

    NASA Technical Reports Server (NTRS)

    Duke, E. L.; Disbrow, J. D.; Butler, G. F.

    1989-01-01

    The Dryden Flight Research Facility of the National Aeronautics and Space Administration (NASA) Ames Research Center (Ames-Dryden) is the principal NASA facility for the flight testing and evaluation of new and complex avionics systems. To aid in the interpretation of system health and status data, a knowledge-based flight status monitor was designed. The monitor was designed to use fault indicators from the onboard system which are telemetered to the ground and processed by a rule-based model of the aircraft failure management system to give timely advice and recommendations in the mission control room. One of the important constraints on the flight status monitor is the need to operate in real time, and to pursue this aspect, a joint research activity between NASA Ames-Dryden and the Royal Aerospace Establishment (RAE) on real-time knowledge-based systems was established. Under this agreement, the original LISP knowledge base for the flight status monitor was reimplemented using the intelligent knowledge-based system toolkit, MUSE, which was developed under RAE sponsorship. Details of the flight status monitor and the MUSE implementation are presented.

  10. Response time satisfaction in a real-time knowledge-based system

    SciTech Connect

    Frank, D. ); Friesen, D.; Williams, G. . Dept. of Computer Science)

    1990-08-01

    Response to interrupts within a certain time frame is an important issue for all software operating in a real-time environment. A knowledge-based system (KBS) is no exception. Prior work on real-time knowledge-based systems either concentrated on improving the performance of the KBS in order to meet these constraints or focused on producing a better solution as more time was allowed. However, a problem with much of the latter research was that it required inference-time costs to be hardcoded into the different branches of reasoning. This limited the type of reasoning possible and the size of the KBS. Furthermore, performing the analysis required to derive those numbers is very difficult in knowledge-based systems. This research explored a model for overcoming these drawbacks. It is based on integrating conventional programming techniques used to control task processing with knowledge-based techniques used to actually produce task results. The C-Language Integrated Production System (CLIPS) was used for the inference engine in the KBS; using CLIPS for the inference engine simplified the rapid context switching required. Thus, the KBS could respond in a timely manner while maintaining the fullest spectrum of KBS functionality.

  11. Can Croatia Join Europe as Competitive Knowledge-based Society by 2010?

    PubMed Central

    Petrovečki, Mladen; Paar, Vladimir; Primorac, Dragan

    2006-01-01

    The 21st century has brought important changes in the paradigms of economic development, one of them being a shift toward recognizing knowledge and information as the most important factors of today. The European Union (EU) has been working hard to become the most competitive knowledge-based society in the world, and Croatia, an EU candidate country, has been faced with a similar task. To establish itself as one of the best knowledge-based country in the Eastern European region over the next four years, Croatia realized it has to create an education and science system correspondent with European standards and sensitive to labor market needs. For that purpose, the Croatian Ministry of Science, Education, and Sports (MSES) has created and started implementing a complex strategy, consisting of the following key components: the reform of education system in accordance with the Bologna Declaration; stimulation of scientific production by supporting national and international research projects; reversing the “brain drain” into “brain gain” and strengthening the links between science and technology; and informatization of the whole education and science system. In this comprehensive report, we describe the implementation of these measures, whose coordination with the EU goals presents a challenge, as well as an opportunity for Croatia to become a knowledge-based society by 2010. PMID:17167853

  12. A new collaborative knowledge-based approach for wireless sensor networks.

    PubMed

    Canada-Bago, Joaquin; Fernandez-Prieto, Jose Angel; Gadeo-Martos, Manuel Angel; Velasco, Juan Ramón

    2010-01-01

    This work presents a new approach for collaboration among sensors in Wireless Sensor Networks. These networks are composed of a large number of sensor nodes with constrained resources: limited computational capability, memory, power sources, etc. Nowadays, there is a growing interest in the integration of Soft Computing technologies into Wireless Sensor Networks. However, little attention has been paid to integrating Fuzzy Rule-Based Systems into collaborative Wireless Sensor Networks. The objective of this work is to design a collaborative knowledge-based network, in which each sensor executes an adapted Fuzzy Rule-Based System, which presents significant advantages such as: experts can define interpretable knowledge with uncertainty and imprecision, collaborative knowledge can be separated from control or modeling knowledge and the collaborative approach may support neighbor sensor failures and communication errors. As a real-world application of this approach, we demonstrate a collaborative modeling system for pests, in which an alarm about the development of olive tree fly is inferred. The results show that knowledge-based sensors are suitable for a wide range of applications and that the behavior of a knowledge-based sensor may be modified by inferences and knowledge of neighbor sensors in order to obtain a more accurate and reliable output.
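    For illustration, here is a minimal sketch of the kind of fuzzy rule evaluation a resource-constrained sensor node could run; the membership functions, rule consequents and the temperature/humidity inputs are invented and are not the authors' olive-fly model.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fly_risk(temperature_c, humidity_pct):
    """Toy Mamdani-style rules: IF temp is warm AND humidity is high THEN risk is high."""
    warm = tri(temperature_c, 20, 27, 34)
    high_humidity = tri(humidity_pct, 60, 80, 100)
    cool = tri(temperature_c, 5, 15, 22)
    # Rule strengths (min for AND); output is a weighted average of the rule consequents.
    r_high = min(warm, high_humidity)      # consequent: risk 1.0
    r_low = cool                           # consequent: risk 0.1
    total = r_high + r_low
    return 0.0 if total == 0 else (r_high * 1.0 + r_low * 0.1) / total

print(fly_risk(26.0, 85.0))
```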

  13. 3DSwap: curated knowledgebase of proteins involved in 3D domain swapping.

    PubMed

    Shameer, Khader; Shingate, Prashant N; Manjunath, S C P; Karthika, M; Pugalenthi, Ganesan; Sowdhamini, Ramanathan

    2011-01-01

    Three-dimensional domain swapping is a unique protein structural phenomenon where two or more protein chains in a protein oligomer share a common structural segment between individual chains. This phenomenon is observed in an array of protein structures in oligomeric conformation. Protein structures in swapped conformations perform diverse functional roles and are also associated with deposition diseases in humans. We have performed in-depth literature curation and structural bioinformatics analyses to develop an integrated knowledgebase of proteins involved in 3D domain swapping. The hallmark of 3D domain swapping is the presence of distinct structural segments such as the hinge and swapped regions. We have curated the literature to delineate the boundaries of these regions. In addition, we have defined several new concepts like 'secondary major interface' to represent the interface properties arising as a result of 3D domain swapping, and a new quantitative measure for the 'extent of swapping' in structures. The catalog of proteins reported in the 3DSwap knowledgebase has been generated using an integrated structural bioinformatics workflow of database searches, literature curation, structure visualization and sequence-structure-function analyses. The current version of the 3DSwap knowledgebase reports 293 protein structures; the analysis of such a compendium of protein structures will further the understanding of the molecular factors driving 3D domain swapping.

  14. Using CLIPS in the domain of knowledge-based massively parallel programming

    NASA Technical Reports Server (NTRS)

    Dvorak, Jiri J.

    1994-01-01

    The Program Development Environment (PDE) is a tool for massively parallel programming of distributed-memory architectures. Adopting a knowledge-based approach, the PDE eliminates the complexity introduced by parallel hardware with distributed memory and offers complete transparency with respect to parallelism exploitation. The knowledge-based part of the PDE is realized in CLIPS. Its principal task is to find an efficient parallel realization of the application specified by the user in a comfortable, abstract, domain-oriented formalism. A large collection of fine-grain parallel algorithmic skeletons, represented as COOL objects in a tree hierarchy, contains the algorithmic knowledge. A hybrid knowledge base with rule modules and procedural parts, encoding expertise about the application domain, parallel programming, software engineering, and parallel hardware, enables a high degree of automation in the software development process. In this paper, important aspects of the implementation of the PDE using CLIPS and COOL are shown, including the embedding of CLIPS with the C++-based parts of the PDE. The appropriateness of the chosen approach and of the CLIPS language for knowledge-based software engineering is discussed.

  15. Feasibility of using a knowledge-based system concept for in-flight primary flight display research

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell R.

    1991-01-01

    A study was conducted to determine the feasibility of using knowledge-based systems architectures for inflight research of primary flight display information management issues. The feasibility relied on the ability to integrate knowledge-based systems with existing onboard aircraft systems. And, given the hardware and software platforms available, the feasibility also depended on the ability to use interpreted LISP software with the real time operation of the primary flight display. In addition to evaluating these feasibility issues, the study determined whether the software engineering advantages of knowledge-based systems found for this application in the earlier workstation study extended to the inflight research environment. To study these issues, two integrated knowledge-based systems were designed to control the primary flight display according to pre-existing specifications of an ongoing primary flight display information management research effort. These two systems were implemented to assess the feasibility and software engineering issues listed. Flight test results were successful in showing the feasibility of using knowledge-based systems inflight with actual aircraft data.

  16. The International Conference on Human Resources Development Strategies in the Knowledge-Based Society [Proceedings] (Seoul, South Korea, August 29, 2001).

    ERIC Educational Resources Information Center

    Korea Research Inst. for Vocational Education and Training, Seoul.

    This document contains the following seven papers, all in both English and Korean, from a conference on human resources development and school-to-work transitions in the knowledge-based society: "The U.S. Experience as a Knowledge-based Economy in Transition and Its Impact on Industrial and Employment Structures" (Eric Im); "Changes…

  17. Evaluating Social and National Education Textbooks Based on the Criteria of Knowledge-Based Economy from the Perspectives of Elementary Teachers in Jordan

    ERIC Educational Resources Information Center

    Al-Edwan, Zaid Suleiman; Hamaidi, Diala Abdul Hadi

    2011-01-01

    Knowledge-based economy is a new implemented trend in the field of education in Jordan. The ministry of education in Jordan attempts to implement this trend's philosophy in its textbooks. This study examined the extent to which the (1st-3rd grade) social and national textbooks reflect knowledge-based economy criteria from the perspective of…

  18. A framework for knowledge acquisition, representation and problem-solving in knowledge-based planning

    NASA Astrophysics Data System (ADS)

    Martinez-Bermudez, Iliana

    This research addresses the problem of developing knowledge-based planning applications. In particular, it is concerned with the problems of knowledge acquisition and representation, issues that remain an impediment to the development of large-scale, knowledge-based planning applications. This work aims to develop a model of planning problem solving that facilitates expert knowledge elicitation and also supports effective problem solving. Achieving this goal requires determining the types of knowledge used by planning experts, the structure of this knowledge, and the problem-solving process that results in the plan. While answering these questions it became clear that the knowledge structure, as well as the process of problem solving, largely depends on the knowledge available to the expert. This dissertation proposes a classification of planning problems based on their use of expert knowledge. Such a classification can help in selecting the appropriate planning method when dealing with a specific planning problem. The research concentrates on one of the identified classes of planning problems, characterized by well-defined and well-structured problem-solving knowledge. To achieve a more complete knowledge representation architecture for such problems, this work employs the task-specific approach to problem solving. The result of this endeavor is a task-specific methodology that allows the representation and use of planning knowledge in a structured, consistent manner specific to the domain of the application. A shell for building knowledge-based planning applications was created as a proof of concept for the methodology described in this dissertation. This shell enabled the development of a system for manufacturing planning, COMPLAN. COMPLAN encompasses knowledge related to four generic techniques used in composite material manufacturing and, given the description of the composite part, creates a family of plans capable of producing it.

  19. Knowledge-based and statistically modeled relationships between residential moisture damage and occupant reported health symptoms

    NASA Astrophysics Data System (ADS)

    Haverinen, Ulla; Vahteristo, Mikko; Moschandreas, Demetrios; Nevalainen, Aino; Husman, Tuula; Pekkanen, Juha

    This study continues to develop a quantitative indicator of moisture-damage-induced exposure in relation to occupant health in residential buildings. Earlier, we developed a knowledge-based model that links moisture damage variables with health symptoms. This paper presents a statistical model in an effort to improve the knowledge-based model, and formulates a third, simplified model that combines aspects of both models. The database used includes detailed information on moisture damage from 164 houses and health questionnaire data from the occupants. Models were formulated using generalized linear model procedures, with 10 moisture damage variables as possible covariates and a respiratory health symptom score as the dependent variable. An 80% random sample of the residences was used for the formulation of models and the remaining 20% were used to evaluate them. Risk ratios (RR) for the respiratory health symptom score among the 80% sample were between 1.32 (1.12-1.55) and 1.48 (1.19-1.83), calculated per 10-point increase in the index. For the 20% sample, RRs were between 1.71 (1.13-2.58) and 2.34 (1.69-3.23), respectively. Deviance values in relation to degrees of freedom were between 2.00 and 2.12 (80% sample) and between 1.50 and 1.81 (20% sample). The models developed can be simulated as continuous variables, and they were all significantly associated with the symptom score, the association being verified with a subset of the database not employed in the model formulation. We concluded that the performance of all models was similar. Therefore, based on the knowledge-based and statistical models, we were able to construct a simple model that can be used in estimating the severity of moisture damage.
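    The reported risk ratios per 10-point index increase follow from the log-link generalized linear model formulation; a generic sketch of that relation (symbols are placeholders, not the authors' exact covariate set) is:

```latex
% Log-link GLM relating the respiratory symptom score Y to a moisture-damage index X
% (generic form; the published models include several damage covariates):
\[
  \log \mathrm{E}[\,Y \mid X\,] = \beta_0 + \beta_1 X ,
  \qquad
  \mathrm{RR}(\Delta X = 10) = \frac{\mathrm{E}[\,Y \mid X + 10\,]}{\mathrm{E}[\,Y \mid X\,]} = e^{10\,\beta_1} .
\]
```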

  20. SU-E-J-71: Spatially Preserving Prior Knowledge-Based Treatment Planning

    SciTech Connect

    Wang, H; Xing, L

    2015-06-15

    Purpose: Prior knowledge-based treatment planning is impeded by the use of a single dose volume histogram (DVH) curve. Critical spatial information is lost when the dose distribution is collapsed into a histogram. Even similar patients possess geometric variations that become inaccessible in the form of a single DVH. We propose a simple prior knowledge-based planning scheme that extracts features from a prior dose distribution while still preserving the spatial information. Methods: A prior patient plan is not used as a mere starting point for a new patient; rather, stopping criteria are constructed from it. Each structure from the prior patient is partitioned into multiple shells. For instance, the PTV is partitioned into an inner, middle, and outer shell. Prior dose statistics are then extracted for each shell and translated into the appropriate Dmin and Dmax parameters for the new patient. Results: The partitioned dose information from a prior case was applied to 14 2-D prostate cases. Using the prior case yielded final DVHs comparable to manual planning, even though the DVH for the prior case differed from the DVHs for the 14 cases. Using a single DVH for the entire organ was also tested for comparison but showed much poorer performance. Different ways of translating the prior dose statistics into parameters for the new patient were also tested. Conclusion: Prior knowledge-based treatment planning needs to salvage the spatial information without transforming patients on a voxel-to-voxel basis. An efficient balance between the anatomy and dose domains is gained by partitioning the organs into multiple shells. The use of prior knowledge not only serves as a starting point for a new case; the information extracted from the partitioned shells is also translated into stopping criteria for the optimization problem at hand.
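    A minimal sketch of the shell idea, assuming the structure mask and dose grid are available as NumPy arrays; the depth-based three-shell split and the extracted statistics are illustrative rather than the authors' exact procedure.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def shell_dose_stats(mask, dose, n_shells=3):
    """Partition a structure into shells by depth from its boundary and extract dose statistics."""
    depth = distance_transform_edt(mask)                  # voxel depth inside the structure
    inside = mask.astype(bool)
    edges = np.quantile(depth[inside], np.linspace(0, 1, n_shells + 1))
    stats = []
    for i in range(n_shells):
        upper = depth <= edges[i + 1] if i == n_shells - 1 else depth < edges[i + 1]
        shell = inside & (depth >= edges[i]) & upper
        d = dose[shell]
        stats.append({"Dmin": float(d.min()), "Dmax": float(d.max()), "Dmean": float(d.mean())})
    return stats   # these statistics could be turned into Dmin/Dmax objectives for a new patient

# Toy 2-D example: a disc-shaped "PTV" with a radially decreasing dose.
yy, xx = np.mgrid[:64, :64]
mask = (xx - 32) ** 2 + (yy - 32) ** 2 < 20 ** 2
dose = 70.0 - 0.5 * np.hypot(xx - 32, yy - 32)
print(shell_dose_stats(mask, dose))
```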

  1. Knowledge-based approach for generating target system specifications from a domain model

    NASA Technical Reports Server (NTRS)

    Gomaa, Hassan; Kerschberg, Larry; Sugumaran, Vijayan

    1992-01-01

    Several institutions in industry and academia are pursuing research efforts in domain modeling to address unresolved issues in software reuse. To demonstrate the concepts of domain modeling and software reuse, a prototype software engineering environment is being developed at George Mason University to support the creation of domain models and the generation of target system specifications. This prototype environment, which is application domain independent, consists of an integrated set of commercial off-the-shelf software tools and custom-developed software tools. This paper describes the knowledge-based tool that was developed as part of the environment to generate target system specifications from a domain model.

  2. Application of flight systems methodologies to the validation of knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Duke, Eugene L.

    1988-01-01

    Flight and mission-critical systems are verified, qualified for flight, and validated using well-known and well-established techniques. These techniques define the validation methodology used for such systems. In order to verify, qualify, and validate knowledge-based systems (KBS's), the methodology used for conventional systems must be addressed, and the applicability and limitations of that methodology to KBS's must be identified. An outline of how this approach to the validation of KBS's is being developed and used is presented.

  3. Application of flight systems methodologies to the validation of knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Duke, Eugene L.

    1988-01-01

    Flight and mission-critical systems are verified, qualified for flight, and validated using well-known and well-established techniques. These techniques define the validation methodology used for such systems. In order to verify, qualify, and validate knowledge-based systems (KBS's), the methodology used for conventional systems must be addressed, and the applicability and limitations of that methodology to KBS's must be identified. The author presents an outline of how this approach to the validation of KBS's is being developed and used at the Dryden Flight Research Facility of the NASA Ames Research Center.

  4. canSAR: an updated cancer research and drug discovery knowledgebase.

    PubMed

    Tym, Joseph E; Mitsopoulos, Costas; Coker, Elizabeth A; Razaz, Parisa; Schierz, Amanda C; Antolin, Albert A; Al-Lazikani, Bissan

    2016-01-04

    canSAR (http://cansar.icr.ac.uk) is a publicly available, multidisciplinary, cancer-focused knowledgebase developed to support cancer translational research and drug discovery. canSAR integrates genomic, protein, pharmacological, drug and chemical data with structural biology, protein networks and druggability data. canSAR is widely used to rapidly access information and help interpret experimental data in a translational and drug discovery context. Here we describe major enhancements to canSAR including new data, improved search and browsing capabilities, new disease and cancer cell line summaries and new and enhanced batch analysis tools.

  5. Facilitating superior chronic disease management through a knowledge-based systems development model.

    PubMed

    Wickramasinghe, Nilmini S; Goldberg, Steve

    2008-01-01

    To date, the adoption and diffusion of technology-enabled solutions to deliver better healthcare have been slow. There are many reasons for this. One of the most significant is that the existing methodologies normally used for Information and Communications Technology (ICT) implementations in general tend to be less successful in a healthcare context. This paper describes a knowledge-based adaptive-mapping-to-realisation methodology for moving from idea to realisation rapidly and without compromising rigour, so that success ensues. It is discussed in connection with trying to implement superior ICT-enabled approaches to facilitate superior Chronic Disease Management (CDM).

  6. Docking into knowledge-based potential fields: a comparative evaluation of DrugScore.

    PubMed

    Sotriffer, Christoph A; Gohlke, Holger; Klebe, Gerhard

    2002-05-09

    A new application of DrugScore is reported in which the knowledge-based pair potentials serve as objective function in docking optimizations. The Lamarckian genetic algorithm of AutoDock is used to search for favorable ligand binding modes guided by DrugScore grids as representations of the protein binding site. The approach is found to be successful in many cases where DrugScore-based re-ranking of already docked ligand conformations does not yield satisfactory results. Compared to the AutoDock scoring function, DrugScore yields slightly superior results in flexible docking.
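    As a rough sketch of docking into precomputed potential grids (DrugScore itself uses its own pair potentials with one grid per protein atom type; a single synthetic grid is used here for brevity), a ligand pose can be scored by interpolating grid values at the atom positions; all values below are invented.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical precomputed potential grid over the binding site (lower = more favourable).
grid_axes = [np.linspace(0.0, 10.0, 21)] * 3          # 0.5 A spacing in x, y and z
potential = np.random.default_rng(0).normal(size=(21, 21, 21))
score_field = RegularGridInterpolator(grid_axes, potential, bounds_error=False, fill_value=5.0)

def score_pose(atom_xyz):
    """Sum interpolated grid values over all ligand atoms (usable as a docking objective)."""
    return float(np.sum(score_field(atom_xyz)))

pose = np.array([[5.0, 5.0, 5.0], [5.5, 4.5, 5.2], [6.1, 5.3, 4.8]])
print(score_pose(pose))
```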

  7. Using Unified Modeling Language for Conceptual Modelling of Knowledge-Based Systems

    NASA Astrophysics Data System (ADS)

    Abdullah, Mohd Syazwan; Benest, Ian; Paige, Richard; Kimble, Chris

    This paper discusses extending the Unified Modelling Language by means of a profile for modelling knowledge-based system in the context of Model Driven Architecture (MDA) framework. The profile is implemented using the eXecutable Modelling Framework (XMF) Mosaic tool. A case study from the health care domain demonstrates the practical use of this profile; with the prototype implemented in Java Expert System Shell (Jess). The paper also discusses the possible mapping of the profile elements to the platform specific model (PSM) of Jess and provides some discussion on the Production Rule Representation (PRR) standardisation work.

  8. Enroute flight-path planning - Cooperative performance of flight crews and knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Smith, Philip J.; Mccoy, Elaine; Layton, Chuck; Galdes, Deb

    1989-01-01

    Interface design issues associated with the introduction of knowledge-based systems into the cockpit are discussed. Such issues include not only questions about display and control design, they also include deeper system design issues such as questions about the alternative roles and responsibilities of the flight crew and the computer system. In addition, the feasibility of using enroute flight path planning as a context for exploring such research questions is considered. In particular, the development of a prototyping shell that allows rapid design and study of alternative interfaces and system designs is discussed.

  9. Generating MEDLINE search strategies using a librarian knowledge-based system.

    PubMed Central

    Peng, P.; Aguirre, A.; Johnson, S. B.; Cimino, J. J.

    1993-01-01

    We describe a librarian knowledge-based system that generates a search strategy from a query representation based on a user's information need. Together with the natural language parser AQUA, the system functions as a human/computer interface, which translates a user query from free text into a BRS Onsite search formulation, for searching the MEDLINE bibliographic database. In the system, conceptual graphs are used to represent the user's information need. The UMLS Metathesaurus and Semantic Net are used as the key knowledge sources in building the knowledge base. PMID:8130544

  10. A Knowledge-Based System For The Recognition Of Roads On SPOT Satellite Images

    NASA Astrophysics Data System (ADS)

    van Cleynenbreugel, J.; Suetens, Paul; Fierens, F.; Wambacq, Patrick; Oosterlinck, Andre J.

    1989-09-01

    Due to the resolution of current satellite imagery (e.g. SPOT), the extraction of roads and linear networks from satellite data has become a feasible - although labour-intensive - task for a human expert. This interpretation problem relies on structural image recognition as well as on expertise in combining data sources external to the image data (e.g. topography, landcover classification). In this paper different knowledge sources employed by human interpreters are discussed. Ways to implement these sources using current knowledge-based tools are suggested. A practical case study of knowledge integration is described.

  11. Recognition Of Partially Occluded Workpieces By A Knowledge-Based System

    NASA Astrophysics Data System (ADS)

    Serpico, S. B.; Vernazza, G.; Dellepiane, S.; Angela, P.

    1987-01-01

    A knowledge-based system is presented that is oriented toward partially occluded 2-D workpiece recognition in TV camera images. The generalized Hough transform is employed to extract elementary edge patterns. Intrinsic and relational information regarding elementary patterns is computed and then stored inside a net of frames. A similar net of frames is employed for workpiece model representation, for an easy matching with the previous net. A set of production rules provide the heuristics to find hints for locating focus-of-attention regions, while other production rules specify modalities for applying a hypothesis-generation-and-test process. Experimental results on a set of 20 workpieces are reported.

  12. Use of metaknowledge in the verification of knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Morell, Larry J.

    1989-01-01

    Knowledge-based systems are modeled as deductive systems. The model indicates that the two primary areas of concern in verification are demonstrating consistency and completeness. A system is inconsistent if it asserts something that is not true of the modeled domain. A system is incomplete if it lacks deductive capability. Two forms of consistency are discussed along with appropriate verification methods. Three forms of incompleteness are discussed. The use of metaknowledge, knowledge about knowledge, is explored in connection to each form of incompleteness.

  13. Knowledge-based immunosuppressive therapy for kidney transplant patients--from theoretical model to clinical integration.

    PubMed

    Seeling, Walter; Plischke, Max; de Bruin, Jeroen S; Schuh, Christian

    2015-01-01

    Immunosuppressive therapy is a risky necessity after a patient has received a kidney transplant. To reduce risks, a knowledge-based system was developed that determines the right dosage of the immunosuppressive agent Tacrolimus. A theoretical model to classify medication blood levels, as well as medication adaptations, was created using data from almost 500 patients and over 13,000 examinations. This model was then translated into an Arden Syntax knowledge base and integrated directly into the hospital information system of the Vienna General Hospital. In this paper we give an overview of the construction and integration of such a system.
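    A toy sketch of the kind of classification rule such a knowledge base might encode, written in Python rather than Arden Syntax; the target range and advice strings are invented and this is not clinical guidance.

```python
def classify_trough(level_ng_ml, target_low=5.0, target_high=10.0):
    """Toy rule: classify a Tacrolimus trough level against a hypothetical target range."""
    if level_ng_ml < target_low:
        return "below target", "consider increasing dose"
    if level_ng_ml > target_high:
        return "above target", "consider decreasing dose"
    return "within target", "keep current dose"

print(classify_trough(12.3))
```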

  14. Requirements for an on-line knowledge-based anatomy information system.

    PubMed Central

    Brinkley, J. F.; Rosse, C.

    1998-01-01

    User feedback from the Digital Anatomist Web-based anatomy atlases, together with over 20 years of anatomy teaching experience, was used to formulate the requirements and system design for a next-generation anatomy information system. The main difference between this system and current image-based approaches is that it is knowledge-based. A foundational model of anatomy is accessed by an intelligent agent that uses its knowledge about the available anatomy resources and the user types to generate customized interfaces. Current usage statistics suggest that even partial implementation of this design will be of great practical value for both clinical and educational needs. PMID:9929347

  15. Knowledge-based goal-driven approach for information extraction and decision making for target recognition

    NASA Astrophysics Data System (ADS)

    Wilson, Roderick D.; Wilson, Anitra C.

    1996-06-01

    This paper presents a novel goal-driven approach for designing a knowledge-based system for information extraction and decision making for target recognition. The underlying goal-driven model uses a goal frame tree schema for target organization, a hybrid rule-based, pattern-directed formalism for target structural encoding, and a goal-driven inferential control strategy. The knowledge base consists of three basic structures for the organization and control of target information: goals, target parameters, and an object-rulebase. Goal frames represent target recognition tasks as goals and subgoals in the knowledge base. Target parameters represent characteristic attributes of targets that are encoded as information atoms. Information atoms may have one or more assigned values and are used for information extraction. The object-rulebase consists of pattern/action assertional implications that describe the logical relationships existing between target parameter values. A goal realization process formulates symbolic pattern expressions whose atomic values map to target parameters contained a priori in a hierarchical database of target state information. Symbolic pattern expressions are created via the application of a novel goal-driven inference strategy that logically prunes an AND/OR tree constructed from the object-rulebase. Similarity analysis is performed via pattern matching of query symbolic patterns against a priori instantiated target parameters.
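    For illustration, a compact sketch of goal-driven (backward-chaining) evaluation over an AND/OR rule structure; the rules, facts and target names are hypothetical and far simpler than the paper's frame-based scheme.

```python
# Each goal is proved if any rule for it (OR) has all of its subgoals proved (AND).
RULES = {
    "target_is_tank": [["has_tracks", "has_turret"], ["matches_tank_template"]],
}
FACTS = {"has_tracks", "has_turret"}

def prove(goal, rules, facts, seen=frozenset()):
    """Backward-chain: a goal holds if it is a known fact or some rule's subgoals all hold."""
    if goal in facts:
        return True
    if goal in seen:                      # avoid infinite recursion on cyclic rules
        return False
    return any(all(prove(g, rules, facts, seen | {goal}) for g in body)
               for body in rules.get(goal, []))

print(prove("target_is_tank", RULES, FACTS))
```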

  16. Dealing with difficult deformations: construction of a knowledge-based deformation atlas

    NASA Astrophysics Data System (ADS)

    Thorup, S. S.; Darvann, T. A.; Hermann, N. V.; Larsen, P.; Ólafsdóttir, H.; Paulsen, R. R.; Kane, A. A.; Govier, D.; Lo, L.-J.; Kreiborg, S.; Larsen, R.

    2010-03-01

    Twenty-three Taiwanese infants with unilateral cleft lip and palate (UCLP) were CT-scanned before lip repair at the age of 3 months, and again after lip repair at the age of 12 months. In order to evaluate the surgical result, detailed point correspondence between pre- and post-surgical images was needed. We have previously demonstrated that non-rigid registration using B-splines is able to provide automated determination of point correspondences in populations of infants without cleft lip. However, this type of registration fails when applied to the task of determining the complex deformation from before to after lip closure in infants with UCLP. The purpose of the present work was to show that use of prior information about typical deformations due to lip closure, through the construction of a knowledge-based atlas of deformations, could overcome the problem. Initially, mean volumes (atlases) for the pre- and post-surgical populations, respectively, were automatically constructed by non-rigid registration. An expert placed corresponding landmarks in the cleft area in the two atlases; this provided prior information used to build a knowledge-based deformation atlas. We model the change from pre- to post-surgery using thin-plate spline warping. The registration results are convincing and represent a first move towards an automatic registration method for dealing with difficult deformations due to this type of surgery.
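    A minimal sketch of landmark-driven thin-plate-spline warping using SciPy's RBFInterpolator; the landmark coordinates are invented, and the real atlases are 3-D volumes rather than these 2-D points.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Corresponding landmarks placed in the pre-surgical (source) and post-surgical (target) atlas.
src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.4, 0.5]])
dst = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.55, 0.45]])

# Thin-plate-spline mapping from source space to target space.
warp = RBFInterpolator(src, dst, kernel='thin_plate_spline')

# Warp an arbitrary set of points (e.g. a surface mesh) with the learned deformation.
points = np.array([[0.5, 0.5], [0.2, 0.8]])
print(warp(points))
```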

  17. Knowledge-based approach to multiple-transaction processing and distributed data-base design

    SciTech Connect

    Park, J.T.

    1987-01-01

    The collective processing of multiple transactions in a data-base system has recently received renewed attention due to its capability of improving the overall performance of a data-base system and its applicability to the design of knowledge-based expert systems and extensible data-base systems. This dissertation consists of two parts. The first part presents a new knowledge-based approach to the problems of processing multiple concurrent queries and distributing replicated data objects for further improvement of the overall system performance. The second part deals with distributed database design, i.e., designing horizontal fragments using a semantic knowledge, and allocating data in a distributed environment. The semantic knowledge on data such as functional dependencies and semantic-data-integrity constraints are newly exploited for the identification of subset relationships between intermediate results of query executions involving joins, such that the (intermediate) results of queries can be utilized for the efficient processing of other queries. The expertise on the collective processing of multiple transactions is embodied into the rules of a rule-based expert system, MTP (Multiple Transaction Processor). In the second part, MTP is applied for the determination of horizontal fragments exploiting the semantic knowledge. Heuristics for allocating data in local area networks are developed.

  18. A knowledge-based control system for air-scour optimisation in membrane bioreactors.

    PubMed

    Ferrero, G; Monclús, H; Sancho, L; Garrido, J M; Comas, J; Rodríguez-Roda, I

    2011-01-01

    Although membrane bioreactor (MBR) technology is still a growing sector, its progressive implementation all over the world, together with great technical achievements, has allowed it to reach a degree of maturity comparable to that of other, more conventional wastewater treatment technologies. With current energy requirements of around 0.6-1.1 kWh/m3 of treated wastewater and investment costs similar to those of conventional treatment plants, the main market niche for MBRs is areas with very restrictive discharge limits, where treatment plants have to be compact or where water reuse is necessary. Operational costs are higher than for conventional treatments; consequently there is still a need and scope for energy saving and optimisation. This paper presents the development of a knowledge-based decision support system (DSS) for the integrated operation and remote control of the biological and physical (filtration and backwashing or relaxation) processes in MBRs. The core of the DSS is a knowledge-based control module for air-scour consumption automation and energy consumption minimisation.

  19. A knowledge-based approach to automatic detection of the spinal cord in CT images.

    PubMed

    Archip, Neculai; Erard, Pierre-Jean; Egmont-Petersen, Michael; Haefliger, Jean-Marie; Germond, Jean-Francois

    2002-12-01

    Accurate planning of radiation therapy entails the definition of treatment volumes and a clear delimitation of normal tissue whose unnecessary exposure should be prevented. The spinal cord is a radiosensitive organ, which should be precisely identified because an overexposure to radiation may lead to undesired complications for the patient, such as neuronal dysfunction or paralysis. In this paper, a knowledge-based approach to identifying the spinal cord in computed tomography images of the thorax is presented. The approach relies on a knowledge base which consists of a so-called anatomical structures map (ASM) and a task-oriented architecture called the plan solver. The ASM contains a frame-like knowledge representation of the macro-anatomy in the human thorax. The plan solver is responsible for determining the position, orientation and size of the structures of interest to radiation therapy. The plan solver relies on a number of image processing operators. Some are so-called atomic (e.g., thresholding and snakes) whereas others are composite. The whole system has been implemented on a standard PC. Experiments performed on the image material from 23 patients show that the approach results in a reliable recognition of the spinal cord (92% accuracy) and the spinal canal (85% accuracy). The lamina is more problematic to locate correctly (72% accuracy). The position of the outer thorax is always determined correctly.

  20. Knowledge-based algorithm for satellite image classification of urban wetlands

    NASA Astrophysics Data System (ADS)

    Xu, Xiaofan; Ji, Wei

    2014-10-01

    It has been a challenge to accurately detect urban wetlands with remotely sensed data by means of pixel-based image classification. This technical difficulty results mainly from inadequate spatial resolutions of satellite imagery, spectral similarities between urban wetlands and adjacent land covers, and spatial complexity of wetlands in human transformed, heterogeneous urban landscapes. To address this issue, an image classification approach has been developed to improve the mapping accuracy of urban wetlands by integrating the pixel-based classification with a knowledge-based algorithm. The algorithm includes a set of decision rules of identifying wetland cover in relation to their elevation, spatial adjacencies, habitat conditions, hydro-geomorphological characteristics, and relevant geo-statistics. ERDAS Imagine software was used to develop the knowledge base and implement the classification. The study area is the metropolitan region of Kansas City, USA. SPOT satellite images of 1992, 2008, and 2010 were classified into four classes - wetland, farmland, built-up land, and forestland. The results suggest that the knowledge-based image classification approach can enhance urban wetland detection capabilities and classification accuracies with remotely sensed satellite imagery.
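    A small sketch of how post-classification decision rules of this kind might be applied on top of a per-pixel map; the class codes, elevation threshold, ancillary water layer and adjacency rule are invented, not the study's actual knowledge base.

```python
import numpy as np
from scipy.ndimage import binary_dilation

# Hypothetical class codes from the pixel-based classification.
WETLAND, FARMLAND, BUILTUP, FOREST = 1, 2, 3, 4

def refine_wetlands(classes, elevation, water_mask, max_wetland_elev=280.0):
    """Apply simple knowledge-based rules on top of a per-pixel class map."""
    refined = classes.copy()
    near_water = binary_dilation(water_mask, iterations=3)   # adjacency to an ancillary hydrology layer
    # Rule 1: "wetland" pixels far from water and above the elevation limit are re-labelled farmland.
    refined[(classes == WETLAND) & ~near_water & (elevation > max_wetland_elev)] = FARMLAND
    # Rule 2: low-lying farmland adjacent to water is re-labelled wetland.
    refined[(classes == FARMLAND) & near_water & (elevation <= max_wetland_elev)] = WETLAND
    return refined

rng = np.random.default_rng(1)
classes = rng.integers(1, 5, size=(50, 50))
elevation = rng.uniform(250.0, 320.0, size=(50, 50))
water_mask = np.zeros((50, 50), dtype=bool)
water_mask[20:25, :] = True                                  # a hypothetical river
print(np.count_nonzero(refine_wetlands(classes, elevation, water_mask) != classes), "pixels re-labelled")
```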

  1. Integrated knowledge-based tools for documenting and monitoring damages to built heritage

    NASA Astrophysics Data System (ADS)

    Cacciotti, R.

    2015-08-01

    Advances in information technologies applied across the most diverse fields of science mark a breakthrough in the accessibility and processing of data for both expert and non-expert users. There is now an increasingly significant research effort in domains, such as cultural heritage protection, in which knowledge mapping and sharing constitute critical prerequisites for accomplishing complex professional tasks. The aim of this paper is to outline the main results and outputs of the MONDIS research project. This project focusses on the development of integrated knowledge-based tools grounded on an ontological representation of the field of heritage conservation. The goal is to overcome the limitations of earlier databases through the application of modern semantic technologies able to integrate, organize and process useful information concerning damages to built heritage objects. In particular, MONDIS addresses the need to support a diverse range of stakeholders (e.g. administrators, owners and professionals) in documenting and monitoring damages to historical constructions and in finding related remedies. The paper concentrates on the presentation of the following integrated knowledge-based components developed within the project: (I) MONDIS mobile application (plus desktop version), (II) MONDIS record explorer, (III) Ontomind profiles, (IV) knowledge matrix and (V) terminology editor. An example of a practical application of the MONDIS integrated system is also provided and discussed.

  2. Analytical and knowledge-based redundancy for fault diagnosis in process plants

    SciTech Connect

    Fathi, Z.; Ramirez, W.F.; Korbicz, J.

    1993-01-01

    The increasing complexity of process plants and their reliability have necessitated the development of more powerful methods for detecting and diagnosing process abnormalities. Among the underlying strategies, analytical redundancy and knowledge-based system techniques offer viable solutions. In this work, the authors consider the adaptive inclusion of analytical redundancy models (state and parameter estimation modules) in the diagnostic reasoning loop of a knowledge-based system. This helps overcome the difficulties associated with each category. The design method is a new layered knowledge base that houses compiled/qualitative knowledge in the high levels and process-general estimation knowledge in the low levels of a hierarchical knowledge structure. The compiled knowledge is used to narrow the diagnostic search space and provide an effective way of employing estimation modules. The estimation-based methods that resort to fundamental analysis provide the rationale for a qualitatively-guided reasoning process. The overall structure of the fault detection and isolation system based on the combined strategy is discussed focusing on the model-based redundancy methods which create the low levels of the hierarchical knowledge base. The system has been implemented using the condensate-feedwater subsystem of a coal-fired power plant. Due to the highly nonlinear and mixed-mode nature of the power plant dynamics, the modified extended Kalman filter is used in designing local detection filters.

  3. Predicting Mycobacterium tuberculosis Complex Clades Using Knowledge-Based Bayesian Networks

    PubMed Central

    Bennett, Kristin P.

    2014-01-01

    We develop a novel approach for incorporating expert rules into Bayesian networks for classification of Mycobacterium tuberculosis complex (MTBC) clades. The proposed knowledge-based Bayesian network (KBBN) treats sets of expert rules as prior distributions on the classes. Unlike prior knowledge-based support vector machine approaches which require rules expressed as polyhedral sets, KBBN directly incorporates the rules without any modification. KBBN uses data to refine rule-based classifiers when the rule set is incomplete or ambiguous. We develop a predictive KBBN model for 69 MTBC clades found in the SITVIT international collection. We validate the approach using two testbeds that model knowledge of the MTBC obtained from two different experts and large DNA fingerprint databases to predict MTBC genetic clades and sublineages. These models represent strains of MTBC using high-throughput biomarkers called spacer oligonucleotide types (spoligotypes), since these are routinely gathered from MTBC isolates of tuberculosis (TB) patients. Results show that incorporating rules into problems can drastically increase classification accuracy if data alone are insufficient. The SITVIT KBBN is publicly available for use on the World Wide Web. PMID:24864238
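    A toy sketch of the central idea, treating expert rules as a prior over classes and combining it with a data-driven likelihood via Bayes' rule; the clades listed, the spoligotype encoding and every number are invented for illustration.

```python
import numpy as np

CLADES = ["LAM", "Beijing", "Haarlem"]

def rule_prior(spoligotype):
    """Expert rules expressed as a prior over clades (toy example, not the SITVIT rule set)."""
    prior = np.ones(len(CLADES))
    if spoligotype[33:36] == "000":        # hypothetical rule: missing spacers 34-36 favours Beijing
        prior[CLADES.index("Beijing")] += 3.0
    return prior / prior.sum()

def posterior(spoligotype, likelihood):
    """Combine the rule-based prior with a data-driven likelihood P(data | clade)."""
    post = rule_prior(spoligotype) * likelihood
    return post / post.sum()

spolig = "1" * 33 + "000" + "1" * 7        # 43-spacer pattern with spacers 34-36 absent
likelihood = np.array([0.2, 0.5, 0.3])     # e.g. from a Bayesian network trained on fingerprints
print(dict(zip(CLADES, np.round(posterior(spolig, likelihood), 3))))
```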

  4. A knowledge-based machine vision system for space station automation

    NASA Technical Reports Server (NTRS)

    Chipman, Laure J.; Ranganath, H. S.

    1989-01-01

    A simple knowledge-based approach to the recognition of objects in man-made scenes is being developed. Specifically, the system under development is a proposed enhancement to a robot arm for use in the space station laboratory module. The system will take a request from a user to find a specific object, and locate that object by using its camera input and information from a knowledge base describing the scene layout and attributes of the object types included in the scene. In order to use realistic test images in developing the system, researchers are using photographs of actual NASA simulator panels, which provide similar types of scenes to those expected in the space station environment. Figure 1 shows one of these photographs. In traditional approaches to image analysis, the image is transformed step by step into a symbolic representation of the scene. Often the first steps of the transformation are done without any reference to knowledge of the scene or objects. Segmentation of an image into regions generally produces a counterintuitive result in which regions do not correspond to objects in the image. After segmentation, a merging procedure attempts to group regions into meaningful units that will more nearly correspond to objects. Here, researchers avoid segmenting the image as a whole, and instead use a knowledge-directed approach to locate objects in the scene. The knowledge-based approach to scene analysis is described and the categories of knowledge used in the system are discussed.

  5. An architecture for integrating distributed and cooperating knowledge-based Air Force decision aids

    NASA Technical Reports Server (NTRS)

    Nugent, Richard O.; Tucker, Richard W.

    1988-01-01

    MITRE has been developing a Knowledge-Based Battle Management Testbed for evaluating the viability of integrating independently-developed knowledge-based decision aids in the Air Force tactical domain. The primary goal for the testbed architecture is to permit a new system to be added to a testbed with little change to the system's software. Each system that connects to the testbed network declares that it can provide a number of services to other systems. When a system wants to use another system's service, it does not address the server system by name, but instead transmits a request to the testbed network asking for a particular service to be performed. A key component of the testbed architecture is a common database which uses a relational database management system (RDBMS). The RDBMS provides a database update notification service to requesting systems. Normally, each system is expected to monitor data relations of interest to it. Alternatively, a system may broadcast an announcement message to inform other systems that an event of potential interest has occurred. Current research is aimed at dealing with issues resulting from integration efforts, such as dealing with potential mismatches of each system's assumptions about the common database, decentralizing network control, and coordinating multiple agents.

  6. Transformative Pedagogy, Leadership and School Organisation for the Twenty-First-Century Knowledge-Based Economy: The Case of Singapore

    ERIC Educational Resources Information Center

    Dimmock, Clive; Goh, Jonathan W. P.

    2011-01-01

    Singapore has a high performing school system; its students top international tests in maths and science. Yet while the Singapore government cherishes its world class "brand", it realises that in a globally competitive world, its schools need to prepare students for the twenty-first-century knowledge-based economy (KBE). Accordingly,…

  7. A Comparative Analysis of New Governance Instruments in the Transnational Educational Space: A Shift to Knowledge-Based Instruments?

    ERIC Educational Resources Information Center

    Ioannidou, Alexandra

    2007-01-01

    In recent years, the ongoing development towards a knowledge-based society--associated with globalization, an aging population, new technologies and organizational changes--has led to a more intensive analysis of education and learning throughout life with regard to quantitative, qualitative and financial aspects. In this framework, education…

  8. Development of the Knowledge-Based Standard for the Written Certification Examination of the American Board of Anesthesiology.

    ERIC Educational Resources Information Center

    Slogoff, Stephen; And Others

    1992-01-01

    Application of a knowledge-based standard in evaluating a written certification examination developed by the American Board of Anesthesiology established a standard of 57 percent correct over two years' examinations. This process is recommended for developing mastery-based (rather than normative-based) success criteria for evaluation of medical…

  9. Going beyond information management: using the Comprehensive Accreditation Manual for Hospitals to promote knowledge-based information services.

    PubMed

    Schardt, C M

    1998-10-01

    In 1987, the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) initiated the Agenda for Change, a major revision in the evaluation process for hospitals. An essential component of that change was to shift the emphasis away from standards for individual departments to standards for hospital-wide functions. In recent years, hospital librarians have focused their energy and attention on complying with the standards for the "Management of Information" chapter, specifically the IM.9 section on knowledge-based information. However, the JCAHO has listed the health sciences librarian and library services as having responsibilities in six other chapters within the Comprehensive Accreditation Manual for Hospitals. These chapters can have a major impact on the services of the hospital library for two reasons: (1) they are being read by hospital leaders and other professionals in the organization, and (2) they articulate specific ways to apply knowledge-based information services to the major functions within the hospital. These chapters are "Education"; "Improving Organizational Performance"; "Leadership"; "Management of Human Resources"; "Management of the Environment of Care"; and "Surveillance, Prevention, and Control of Infection." The standards that these chapters promote present specific opportunities for hospital librarians to apply knowledge-based information resources and service to hospital-wide functions. This article reviews these chapters and discusses the standards that relate to knowledge-based information.

  10. On the importance of the distance measures used to train and test knowledge-based potentials for proteins.

    PubMed

    Carlsen, Martin; Koehl, Patrice; Røgen, Peter

    2014-01-01

    Knowledge-based potentials are energy functions derived from the analysis of databases of protein structures and sequences. They can be divided into two classes. Potentials from the first class are based on a direct conversion of the distributions of some geometric properties observed in native protein structures into energy values, while potentials from the second class are trained to mimic quantitatively the geometric differences between incorrectly folded models and native structures. In this paper, we focus on the relationship between energy and geometry when training the second class of knowledge-based potentials. We assume that the difference in energy between a decoy structure and the corresponding native structure is linearly related to the distance between the two structures. We trained two distance-based knowledge-based potentials accordingly, one based on all inter-residue distances (PPD), while the other had the set of all distances filtered to reflect consistency in an ensemble of decoys (PPE). We tested four types of metric to characterize the distance between the decoy and the native structure, two based on extrinsic geometry (RMSD and GDT-TS*), and two based on intrinsic geometry (Q* and MT). The corresponding eight potentials were tested on a large collection of decoy sets. We found that it is usually better to train a potential using an intrinsic distance measure. We also found that PPE outperforms PPD, emphasizing the benefits of capturing consistent information in an ensemble. The relevance of these results for the design of knowledge-based potentials is discussed.
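    A minimal sketch of the training idea under the stated linearity assumption: fit potential weights so that the decoy-native energy gap scales with the chosen decoy-native distance; the features, decoys and distances below are synthetic, not the PPD/PPE construction itself.

```python
import numpy as np

rng = np.random.default_rng(0)
n_decoys, n_features = 200, 30

# Synthetic feature differences f(decoy) - f(native), e.g. binned inter-residue distance counts,
# and synthetic decoy-native distances that depend roughly linearly on those features.
dF = rng.normal(size=(n_decoys, n_features))
hidden_w = rng.normal(size=n_features)
dist = np.clip(2.0 + 0.5 * (dF @ hidden_w) + 0.1 * rng.normal(size=n_decoys), 0.05, None)

# Fit potential weights so that the energy gap E(decoy) - E(native) tracks the distance linearly.
X = np.hstack([dF, np.ones((n_decoys, 1))])            # extra column allows a constant offset
coef, *_ = np.linalg.lstsq(X, dist, rcond=None)
fitted_gap = X @ coef

print("correlation between fitted energy gap and decoy-native distance:",
      round(float(np.corrcoef(fitted_gap, dist)[0, 1]), 3))
```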

  11. Young People's Management of the Transition from Education to Employment in the Knowledge-Based Sector in Shanghai

    ERIC Educational Resources Information Center

    Wang, Qi; Lowe, John

    2011-01-01

    This paper reports on a study of the transition from university to work by students/employees in the complex and rapidly changing socio-economic context of contemporary Shanghai. It aims at understanding how highly educated young people perceive the nature and mode of operation of the newly emerging labour market for knowledge-based jobs, and how…

  12. Delivering Electronic Information in a Knowledge-Based Democracy. Summary of Proceedings (Washington, DC, July 14, 1993).

    ERIC Educational Resources Information Center

    Library of Congress, Washington, DC.

    The Library of Congress hosted a 1-day conference, "Delivering Electronic Information in a Knowledge-Based Democracy" to explore the public policy framework essential to creating electronic information resources and making them broadly available. Participants from a variety of sectors contributed to wide-ranging discussions on issues…

  13. Enhancing Student Learning in Knowledge-Based Courses: Integrating Team-Based Learning in Mass Communication Theory Classes

    ERIC Educational Resources Information Center

    Han, Gang; Newell, Jay

    2014-01-01

    This study explores the adoption of the team-based learning (TBL) method in knowledge-based and theory-oriented journalism and mass communication (J&MC) courses. It first reviews the origin and concept of TBL, the relevant theories, and then introduces the TBL method and implementation, including procedures and assessments, employed in an…

  14. Signaling network of dendritic cells in response to pathogens: a community-input supported knowledgebase

    PubMed Central

    2010-01-01

    Background: Dendritic cells are antigen-presenting cells that play an essential role in linking the innate and adaptive immune systems. Much research has focused on the signaling pathways triggered upon infection of dendritic cells by various pathogens. The high level of activity in the field makes it desirable to have a pathway-based resource to access the information in the literature. Current pathway diagrams lack either comprehensiveness or an open-access editorial interface. Hence, there is a need for a dependable, expertly curated knowledgebase that integrates this information into a map of signaling networks. Description: We have built a detailed diagram of the dendritic cell signaling network, with the goal of providing researchers with a valuable resource and a facile method for community input. Network construction has relied on comprehensive review of the literature and regular updates. The diagram includes detailed depictions of pathways activated downstream of different pathogen recognition receptors such as Toll-like receptors, retinoic acid-inducible gene-I-like receptors, C-type lectin receptors and nucleotide-binding oligomerization domain-like receptors. Initially assembled using CellDesigner software, it provides an annotated graphical representation of interactions stored in Systems Biology Mark-up Language. The network, which comprises 249 nodes and 213 edges, has been web-published through the Biological Pathway Publisher software suite. Nodes are annotated with PubMed references and gene-related information, and linked to a public wiki, providing a discussion forum for updates and corrections. To gain more insight into regulatory patterns of dendritic cell signaling, we analyzed the network using graph-theory methods: bifan, feedforward and multi-input convergence motifs were enriched. This emphasis on activating control mechanisms is consonant with a network that subserves persistent and coordinated responses to pathogen detection
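    For illustration, a short sketch of counting feedforward-loop motifs (A regulates B and C, and B regulates C) in a directed signaling graph with NetworkX; the toy edges are invented and do not reproduce the published network.

```python
import networkx as nx

# Toy directed signaling graph (nodes could be receptors, adaptors, transcription factors).
G = nx.DiGraph([("TLR4", "MyD88"), ("MyD88", "NFkB"), ("TLR4", "NFkB"),
                ("TLR4", "TRIF"), ("TRIF", "IRF3")])

def feedforward_loops(g):
    """Return all (a, b, c) triples with edges a->b, b->c and a->c."""
    loops = []
    for a, b in g.edges():
        for c in g.successors(b):
            if c != a and g.has_edge(a, c):
                loops.append((a, b, c))
    return loops

print(feedforward_loops(G))   # [('TLR4', 'MyD88', 'NFkB')]
```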

  15. The neXtProt knowledgebase on human proteins: 2017 update

    PubMed Central

    Gaudet, Pascale; Michel, Pierre-André; Zahn-Zabal, Monique; Britan, Aurore; Cusin, Isabelle; Domagalski, Marcin; Duek, Paula D.; Gateau, Alain; Gleizes, Anne; Hinard, Valérie; Rech de Laval, Valentine; Lin, JinJin; Nikitin, Frederic; Schaeffer, Mathieu; Teixeira, Daniel; Lane, Lydie; Bairoch, Amos

    2017-01-01

    The neXtProt human protein knowledgebase (https://www.nextprot.org) continues to add new content and tools, with a focus on proteomics and genetic variation data. neXtProt now has proteomics data for over 85% of the human proteins, as well as new tools tailored to the proteomics community. Moreover, the neXtProt release 2016-08-25 includes over 8000 phenotypic observations for over 4000 variations in a number of genes involved in hereditary cancers and channelopathies. These changes are presented in the current neXtProt update. All of the neXtProt data are available via our user interface and FTP site. We also provide an API access and a SPARQL endpoint for more technical applications. PMID:27899619
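    A minimal sketch of querying a SPARQL endpoint over HTTP from Python; the endpoint URL is an assumption (the article announces a SPARQL endpoint, but the exact address should be checked at nextprot.org), and the query is a generic triple pattern rather than a neXtProt-specific example.

```python
import requests

# Assumed endpoint address; verify the current URL on the neXtProt website before use.
ENDPOINT = "https://sparql.nextprot.org"

query = """
SELECT ?s ?p ?o
WHERE { ?s ?p ?o }
LIMIT 5
"""

# Standard SPARQL protocol: pass the query as a parameter and ask for JSON results.
resp = requests.get(ENDPOINT,
                    params={"query": query},
                    headers={"Accept": "application/sparql-results+json"},
                    timeout=30)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["s"]["value"], row["p"]["value"], row["o"]["value"])
```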

  16. Knowledge-Based Analysis And Understanding Of 3D Medical Images

    NASA Astrophysics Data System (ADS)

    Dhawan, Atam P.; Juvvadi, Sridhar

    1988-06-01

    The anatomical three-dimensional (3D) medical imaging modalities, such as X-ray CT and MRI, have been well recognized in the diagnostic radiology for several years while the nuclear medicine modalities, such as PET, have just started making a strong impact through functional imaging. Though PET images provide the functional information about the human organs, they are hard to interpret because of the lack of anatomical information. Our objective is to develop a knowledge-based biomedical image analysis system which can interpret the anatomical images (such as CT). The anatomical information thus obtained can then be used in analyzing PET images of the same patient. This will not only help in interpreting PET images but it will also provide a means of studying the correlation between the anatomical and functional imaging. This paper presents the preliminary results of the knowledge based biomedical image analysis system for interpreting CT images of the chest.

  17. Knowledge-Based Parallel Performance Technology for Scientific Application Competitiveness Final Report

    SciTech Connect

    Malony, Allen D; Shende, Sameer

    2011-08-15

    The primary goal of the University of Oregon's DOE "competitiveness" project was to create performance technology that embodies and supports knowledge of performance data, analysis, and diagnosis in parallel performance problem solving. The target of our development activities was the TAU Performance System and the technology accomplishments reported in this and prior reports have all been incorporated in the TAU open software distribution. In addition, the project has been committed to maintaining strong interactions with the DOE SciDAC Performance Engineering Research Institute (PERI) and Center for Technology for Advanced Scientific Component Software (TASCS). This collaboration has proved valuable for translation of our knowledge-based performance techniques to parallel application development and performance engineering practice. Our outreach has also extended to the DOE Advanced CompuTational Software (ACTS) collection and project. Throughout the project we have participated in the PERI and TASCS meetings, as well as the ACTS annual workshops.

  18. GARN: Sampling RNA 3D Structure Space with Game Theory and Knowledge-Based Scoring Strategies.

    PubMed

    Boudard, Mélanie; Bernauer, Julie; Barth, Dominique; Cohen, Johanne; Denise, Alain

    2015-01-01

    Cellular processes involve large numbers of RNA molecules. The functions of these RNA molecules and their binding to molecular machines are highly dependent on their 3D structures. One of the key challenges in RNA structure prediction and modeling is predicting the spatial arrangement of the various structural elements of RNA. As RNA folding is generally hierarchical, methods involving coarse-grained models hold great promise for this purpose. We present here a novel coarse-grained method for sampling, based on game theory and knowledge-based potentials. This strategy, GARN (Game Algorithm for RNa sampling), is often much faster than previously described techniques and generates large sets of solutions closely resembling the native structure. GARN is thus a suitable starting point for the molecular modeling of large RNAs, particularly those with experimental constraints. GARN is available from: http://garn.lri.fr/.

  19. Using fuzzy logic to integrate neural networks and knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Yen, John

    1991-01-01

    Outlined here is a novel hybrid architecture that uses fuzzy logic to integrate neural networks and knowledge-based systems. The author's approach offers important synergistic benefits to neural nets, approximate reasoning, and symbolic processing. Fuzzy inference rules extend symbolic systems with approximate reasoning capabilities, which are used for integrating and interpreting the outputs of neural networks. The symbolic system captures meta-level information about neural networks and defines its interaction with neural networks through a set of control tasks. Fuzzy action rules provide a robust mechanism for recognizing the situations in which neural networks require certain control actions. The neural nets, on the other hand, offer flexible classification and adaptive learning capabilities, which are crucial for dynamic and noisy environments. By combining neural nets and symbolic systems at their system levels through the use of fuzzy logic, the author's approach alleviates current difficulties in reconciling differences between low-level data processing mechanisms of neural nets and artificial intelligence systems.
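
    As a rough illustration of the integration idea (not the author's implementation), the sketch below shows fuzzy action rules mediating between a neural classifier and a symbolic control layer; the membership functions, thresholds and action names are invented for the example.

      # Minimal sketch: fuzzy rules deciding how a symbolic layer treats a neural
      # network output. All names and thresholds are illustrative, not from the paper.

      def mu_low(x):       # fuzzy membership: "confidence is low"
          return max(0.0, min(1.0, (0.6 - x) / 0.6))

      def mu_high(x):      # fuzzy membership: "confidence is high"
          return max(0.0, min(1.0, (x - 0.4) / 0.6))

      def control_action(nn_confidence):
          """Fuzzy action rules selecting a control task for the symbolic system."""
          accept = mu_high(nn_confidence)    # IF confidence high THEN accept the NN result
          defer  = mu_low(nn_confidence)     # IF confidence low  THEN defer to symbolic reasoning
          return "accept" if accept >= defer else "defer_to_rules"

      print(control_action(0.85))  # -> accept
      print(control_action(0.30))  # -> defer_to_rules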

  20. Collaborative development of knowledge-based support systems: a case study.

    PubMed

    Lindgren, Helena; Winnberg, Patrik J; Yan, Chunli

    2012-01-01

    We investigate a user-driven collaborative knowledge engineering and interaction design process. The outcome is a knowledge-based support application tailored to physicians in the local dementia care community. The activity is organized as part of a collaborative effort between different organizations to develop their local clinical practice. Six local practitioners used the generic decision-support prototype system DMSS-R, developed for the dementia domain, over a period of time and participated in evaluations and re-design. Two additional local domain experts and a domain expert external to the local community modeled the content and design of DMSS-R by using the modeling system ACKTUS. Obstacles and success factors that arise when end-users design their own tools are identified and interpreted using a proposed framework for improving care through the use of clinical guidelines. The results are discussed.

  1. Building organisational cyber resilience: A strategic knowledge-based view of cyber security management.

    PubMed

    Ferdinand, Jason

    The concept of cyber resilience has emerged in recent years in response to the recognition that cyber security is more than just risk management. Cyber resilience is the goal of organisations, institutions and governments across the world and yet the emerging literature is somewhat fragmented due to the lack of a common approach to the subject. This limits the possibility of effective collaboration across public, private and governmental actors in their efforts to build and maintain cyber resilience. In response to this limitation, and to calls for a more strategically focused approach, this paper offers a knowledge-based view of cyber security management that explains how an organisation can build, assess, and maintain cyber resilience.

  2. Expert Knowledge-Based Automatic Sleep Stage Determination by Multi-Valued Decision Making Method

    NASA Astrophysics Data System (ADS)

    Wang, Bei; Sugi, Takenao; Kawana, Fusae; Wang, Xingyu; Nakamura, Masatoshi

    In this study, an expert knowledge-based automatic sleep stage determination system based on a multi-valued decision making method is developed. Visual inspection by a qualified clinician is adopted to obtain the expert knowledge database. The expert knowledge database consists of probability density functions of parameters for the various sleep stages. Sleep stages are then determined automatically according to the conditional probability. In total, four subjects participated. The automatic sleep stage determination results showed close agreement with visual inspection for the stages of wakefulness, REM (rapid eye movement) sleep, light sleep and deep sleep. The constructed expert knowledge database reflects the distributions of the characteristic parameters and can adapt to the variable sleep data encountered in hospitals. The developed automatic determination technique based on expert knowledge of visual inspection can serve as an assistant tool enabling further inspection of sleep disorder cases in clinical practice.
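
    A minimal sketch of the decision step, assuming Gaussian parameter densities and invented numbers in place of the expert knowledge database: each candidate stage is scored by the product of its parameter likelihoods (times a prior), and the most probable stage is selected.

      # Illustrative sketch (not the authors' code): choose the sleep stage whose
      # expert-derived parameter densities give the highest conditional probability.
      import math

      def gaussian_pdf(x, mean, std):
          return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

      # Hypothetical knowledge database: per-stage (mean, std) for two parameters.
      KNOWLEDGE = {
          "awake":       {"alpha_power": (0.7, 0.15), "eye_movement": (0.6, 0.2)},
          "rem":         {"alpha_power": (0.3, 0.15), "eye_movement": (0.8, 0.15)},
          "light_sleep": {"alpha_power": (0.4, 0.2),  "eye_movement": (0.2, 0.15)},
          "deep_sleep":  {"alpha_power": (0.1, 0.1),  "eye_movement": (0.05, 0.05)},
      }

      def determine_stage(features, prior=None):
          prior = prior or {s: 1.0 / len(KNOWLEDGE) for s in KNOWLEDGE}
          scores = {}
          for stage, params in KNOWLEDGE.items():
              likelihood = 1.0
              for name, value in features.items():
                  mean, std = params[name]
                  likelihood *= gaussian_pdf(value, mean, std)
              scores[stage] = prior[stage] * likelihood
          total = sum(scores.values())
          return max(scores, key=scores.get), {s: v / total for s, v in scores.items()}

      stage, posterior = determine_stage({"alpha_power": 0.65, "eye_movement": 0.55})
      print(stage)  # -> awake (for these illustrative numbers)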

  3. Knowledge-based reasoning in the Paladin tactical decision generation system

    NASA Technical Reports Server (NTRS)

    Chappell, Alan R.

    1993-01-01

    A real-time tactical decision generation system for air combat engagements, Paladin, has been developed. A pilot's job in air combat includes tasks that are largely symbolic. These symbolic tasks are generally performed through the application of experience and training (i.e. knowledge) gathered over years of flying a fighter aircraft. Two such tasks, situation assessment and throttle control, are identified and broken out in Paladin to be handled by specialized knowledge-based systems. Knowledge pertaining to these tasks is encoded into rule-bases to provide the foundation for decisions. Paladin uses a custom-built inference engine and a partitioned rule-base structure to produce these symbolic results in real time. This paper provides an overview of knowledge-based reasoning systems as a subset of rule-based systems. The knowledge used by Paladin in generating results, as well as the system design for real-time execution, is discussed.

  4. Knowledge-Based Manufacturing and Structural Design for a High Speed Civil Transport

    NASA Technical Reports Server (NTRS)

    Marx, William J.; Mavris, Dimitri N.; Schrage, Daniel P.

    1994-01-01

    The aerospace industry is currently addressing the problem of integrating manufacturing and design. To address the difficulties associated with using many conventional procedural techniques and algorithms, one feasible way to integrate the two concepts is through the development of an appropriate Knowledge-Based System (KBS). The authors present their reasons for selecting a KBS to integrate design and manufacturing. A methodology for an aircraft producibility assessment is proposed, utilizing a KBS for manufacturing process selection, that addresses both the procedural and heuristic aspects of designing and manufacturing a High Speed Civil Transport (HSCT) wing. A cost model is discussed that would allow system-level trades utilizing information describing the material characteristics as well as the manufacturing process selections. Statements of future work conclude the paper.

  5. Knowledge-based approach to de novo design using reaction vectors.

    PubMed

    Patel, Hina; Bodkin, Michael J; Chen, Beining; Gillet, Valerie J

    2009-05-01

    A knowledge-based approach to the de novo design of synthetically feasible molecules is described. The method is based on reaction vectors which represent the structural changes that take place at the reaction center along with the environment in which the reaction occurs. The reaction vectors are derived automatically from a database of reactions which is not restricted by size or reaction complexity. A structure generation algorithm has been developed whereby reaction vectors can be applied to previously unseen starting materials in order to suggest novel syntheses. The approach has been implemented in KNIME and is validated by reproducing known synthetic routes. We then present applications of the method in different drug design scenarios including lead optimization and library enumeration. The method offers great potential for capturing and using the growing body of data on reactions that is becoming available through electronic laboratory notebooks.
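
    A much-simplified cartoon of the reaction-vector idea, using only fragment counts rather than the richer reaction-centre descriptors described in the abstract; the fragments and the example "reaction" are hypothetical.

      # Simplified illustration: a reaction vector as the change in fragment counts
      # between reactant and product, applied to an unseen starting material.
      from collections import Counter

      def reaction_vector(reactant_frags, product_frags):
          """Return (fragments added, fragments removed) by the reaction."""
          added = Counter(product_frags) - Counter(reactant_frags)
          removed = Counter(reactant_frags) - Counter(product_frags)
          return added, removed

      def apply_vector(starting_material_frags, vector):
          added, removed = vector
          sm = Counter(starting_material_frags)
          if any(sm[f] < n for f, n in removed.items()):
              return None  # reaction not applicable: required substructure missing
          return sm - removed + added

      # Hypothetical example: an esterification-like step encoded as {OH, COOH} -> {COO}
      vec = reaction_vector(["OH", "COOH"], ["COO"])
      print(apply_vector(["OH", "COOH", "phenyl"], vec))  # Counter({'phenyl': 1, 'COO': 1})
      print(apply_vector(["NH2", "phenyl"], vec))         # None (vector not applicable)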

  6. The neXtProt knowledgebase on human proteins: current status

    PubMed Central

    Gaudet, Pascale; Michel, Pierre-André; Zahn-Zabal, Monique; Cusin, Isabelle; Duek, Paula D.; Evalet, Olivier; Gateau, Alain; Gleizes, Anne; Pereira, Mario; Teixeira, Daniel; Zhang, Ying; Lane, Lydie; Bairoch, Amos

    2015-01-01

    neXtProt (http://www.nextprot.org) is a human protein-centric knowledgebase developed at the SIB Swiss Institute of Bioinformatics. Focused solely on human proteins, neXtProt aims to provide a state of the art resource for the representation of human biology by capturing a wide range of data, precise annotations, fully traceable data provenance and a web interface which enables researchers to find and view information in a comprehensive manner. Since the introductory neXtProt publication, significant advances have been made on three main aspects: the representation of proteomics data, an extended representation of human variants and the development of an advanced search capability built around semantic technologies. These changes are presented in the current neXtProt update. PMID:25593349

  7. Knowledge-based potentials in bioinformatics: From a physicist’s viewpoint

    NASA Astrophysics Data System (ADS)

    Zheng, Wei-Mou

    2015-12-01

    Biological raw data are growing exponentially, providing a large amount of information on what life is. It is believed that potential functions and the rules governing protein behaviors can be revealed from analysis on known native structures of proteins. Many knowledge-based potentials for proteins have been proposed. Contrary to most existing review articles which mainly describe technical details and applications of various potential models, the main foci for the discussion here are ideas and concepts involving the construction of potentials, including the relation between free energy and energy, the additivity of potentials of mean force and some key issues in potential construction. Sequence analysis is briefly viewed from an energetic viewpoint. Project supported in part by the National Natural Science Foundation of China (Grant Nos. 11175224 and 11121403).
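
    The standard construction behind many such potentials is the inverse Boltzmann relation, which converts observed structural statistics into an effective potential of mean force; a generic textbook form (not a formula quoted from this particular review) is

      W(r) = -k_B T \, \ln\!\left[ \frac{P_{\mathrm{obs}}(r)}{P_{\mathrm{ref}}(r)} \right],
      \qquad
      E_{\mathrm{total}} \approx \sum_{i<j} W_{ij}(r_{ij}),

    where P_obs is the distance distribution seen in native structures, P_ref a reference-state distribution, and the second relation expresses the additivity assumption for potentials of mean force discussed in the review.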

  8. A knowledge-based expert system for managing underground coal mines in the US

    SciTech Connect

    Grayson, R.L.; Yuan, S.; Dean, J.M.; Reddy, N.P. )

    1990-07-01

    Research by the U.S. Bureau of Mines (BOM) on the reasons why some mines are more productive than others has revealed the importance of good mine management practices. The Mine Management Support System is being developed, under the cosponsorship of the BOM and the West Virginia Energy and Water Research Center, as a knowledge-based expert system for better management of underground coal mines. Concentrating on capturing the complex body of knowledge needed to enhance efficient management of a mine, it will encompass information and preferred rules on work scheduling, work practices, regulations impinging on the accomplishment of work, responses to operating problems, and the labor-management work agreement. In this paper different components of the mine system, modeled using an object-oriented layering technique, will be displayed graphically to aid in coordinating work plans, and to present locations of equipment, supplies, and proposed subsystem components.

  9. Analysis of knowledge-based expert systems as tools for construction design

    NASA Astrophysics Data System (ADS)

    Cole, Arthur N.

    1991-03-01

    Because construction costs are continuously rising, Congress mandated that those within the respective branches of military service who are responsible for planning and executing construction programs develop policies and procedures that ensure that individual projects are designed, bid, and constructed as rapidly as possible. This requires an approach that demands maximum efficiency from the design process. Reviews are necessary to ensure that designs meet all requirements, but the reviews themselves must be conducted in the least amount of time so as to preclude delays. Design tools that increase efficiency include knowledge-based expert systems: interactive computer programs that incorporate judgement, experience, rules of thumb, and other expertise so as to provide knowledgeable advice about a specific domain. They mimic the thought process employed by a human expert in solving a problem.

  10. Structural topology design of container ship based on knowledge-based engineering and level set method

    NASA Astrophysics Data System (ADS)

    Cui, Jin-ju; Wang, De-yu; Shi, Qi-qi

    2015-06-01

    Knowledge-Based Engineering (KBE) is introduced into ship structural design in this paper. From the KBE implementation, design solutions for both the Rules Design Method (RDM) and the Interpolation Design Method (IDM) are generated, and the corresponding Finite Element (FE) models are built. Topological design of the longitudinal structures is studied, where a Gaussian Process (GP) is employed to build the surrogate model for FE analysis. Multi-objective optimization methods based on the Pareto front are used to reduce the design tank weight and outer surface area simultaneously. Additionally, an enhanced Level Set Method (LSM), which employs an implicit algorithm, is applied to the topological design of a typical bracket plate that is used extensively in ship structures. Two different sets of boundary conditions are considered. The proposed methods show satisfactory efficiency and accuracy.
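
    A minimal sketch of the multi-objective bookkeeping only: extracting the Pareto front (non-dominated designs) for two objectives to be minimised, with fixed illustrative numbers standing in for surrogate-model predictions of tank weight and outer surface area.

      # Pareto-front extraction for two minimisation objectives (weight, area).
      # Candidate designs and objective values are illustrative.

      def dominates(a, b):
          """a dominates b if it is no worse in both objectives and better in at least one."""
          return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

      def pareto_front(designs):
          """designs: list of (name, (weight, area)) pairs."""
          return [d for d in designs
                  if not any(dominates(o[1], d[1]) for o in designs if o is not d)]

      candidates = [("A", (120.0, 45.0)), ("B", (110.0, 50.0)),
                    ("C", (130.0, 44.0)), ("D", (125.0, 46.0))]
      for name, objs in pareto_front(candidates):
          print(name, objs)   # A, B and C survive; D is dominated by A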

  11. Geomorphological feature extraction from a digital elevation model through fuzzy knowledge-based classification

    NASA Astrophysics Data System (ADS)

    Argialas, Demetre P.; Tzotsos, Angelos

    2003-03-01

    The objective of this research was the investigation of advanced image analysis methods for geomorphological mapping. The methods employed included multiresolution segmentation of the GTOPO30 Digital Elevation Model (DEM) and fuzzy knowledge-based classification of the segmented DEM into three geomorphological classes: mountain ranges, piedmonts and basins. The study area was a segment of the Basin and Range Physiographic Province in Nevada, USA. The implementation was made in eCognition. In particular, the segmentation of GTOPO30 resulted in primitive objects. The knowledge-based classification of the primitive objects, based on their elevation and shape parameters, resulted in the extraction of the geomorphological features. The resulting boundaries compared satisfactorily with those reported in previous studies. It is concluded that geomorphological feature extraction can be carried out through fuzzy knowledge-based classification as implemented in eCognition.
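
    A toy sketch of fuzzy knowledge-based classification of segment objects; the attributes (mean elevation and slope), membership functions and thresholds are invented for the example and are not the rules used in the study.

      # Fuzzy classification of DEM segments into basin / piedmont / mountain range.

      def tri(x, a, b, c):
          """Triangular fuzzy membership function with support (a, c) and peak at b."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x < b else (c - x) / (c - b)

      def classify(mean_elevation_m, mean_slope_deg):
          memberships = {
              "basin":          min(tri(mean_elevation_m, -1, 800, 1500),   tri(mean_slope_deg, -1, 1, 4)),
              "piedmont":       min(tri(mean_elevation_m, 800, 1500, 2200), tri(mean_slope_deg, 1, 4, 10)),
              "mountain range": min(tri(mean_elevation_m, 1500, 2500, 4500), tri(mean_slope_deg, 4, 15, 60)),
          }
          return max(memberships, key=memberships.get), memberships

      label, mu = classify(mean_elevation_m=1300, mean_slope_deg=3)
      print(label)   # -> piedmont for these illustrative values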

  12. Optimization of knowledge-based systems and expert system building tools

    NASA Technical Reports Server (NTRS)

    Yasuda, Phyllis; Mckellar, Donald

    1993-01-01

    The objectives of the NASA-AMES Cooperative Agreement were to investigate, develop, and evaluate, via test cases, the system parameters and processing algorithms that constrain the overall performance of the Information Sciences Division's Artificial Intelligence Research Facility. Written reports covering various aspects of the grant were submitted to the co-investigators for the grant. Research studies concentrated on the field of artificial intelligence knowledge-based systems technology. Activities included the following areas: (1) AI training classes; (2) merging optical and digital processing; (3) science experiment remote coaching; (4) SSF data management system tests; (5) computer integrated documentation project; (6) conservation of design knowledge project; (7) project management calendar and reporting system; (8) automation and robotics technology assessment; (9) advanced computer architectures and operating systems; and (10) honors program.

  13. FTDD973: A multimedia knowledge-based system and methodology for operator training and diagnostics

    NASA Technical Reports Server (NTRS)

    Hekmatpour, Amir; Brown, Gary; Brault, Randy; Bowen, Greg

    1993-01-01

    FTDD973 (973 Fabricator Training, Documentation, and Diagnostics) is an interactive multimedia knowledge-based system and methodology for computer-aided training and certification of operators, as well as tool and process diagnostics, in IBM's CMOS SGP fabrication line (building 973). FTDD973 is an example of what can be achieved with modern multimedia workstations. Knowledge-based systems, hypertext, hypergraphics, high resolution images, audio, motion video, and animation are technologies that in synergy can be far more useful than each by itself. FTDD973's modular and object-oriented architecture is also an example of how improvements in software engineering are finally making it possible to combine many software modules into one application. FTDD973 is developed in ExperMedia/2, an OS/2 multimedia expert system shell for domain experts.

  14. A Knowledge-based Evolution Algorithm approach to political districting problem

    NASA Astrophysics Data System (ADS)

    Chou, Chung-I.

    2011-01-01

    The political districting problem is to study how to partition a comparatively large zone into many smaller electoral districts. In our previous work, we mapped this political problem onto a q-state Potts model system using statistical physics methods. The political constraints (such as contiguity, population equality, etc.) are transformed into an energy function with interactions between sites or external fields acting on the system. Several optimization algorithms, such as the simulated annealing method and genetic algorithms, have been applied to this problem. In this report, we show how to apply the Knowledge-based Evolution Algorithm (KEA) to the problem. Our test objects include two real cities (Taipei and Kaohsiung) and simulated cities. The results show that KEA reaches the same minimum found by the other methods in each test case.
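
    A generic energy function of this type, written only to illustrate the mapping (the coefficients and exact penalty terms used in the paper may differ), is

      E(\{\sigma\}) = -J \sum_{\langle i,j \rangle} \delta_{\sigma_i \sigma_j}
                      + \lambda \sum_{k=1}^{q} \left( P_k - \bar{P} \right)^2,
      \qquad \sigma_i \in \{1, \dots, q\},

    where the first sum over neighbouring sites rewards contiguous, compact districts and the second penalizes deviations of each district population P_k from the mean population \bar{P}.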

  15. Cyanobacterial KnowledgeBase (CKB), a Compendium of Cyanobacterial Genomes and Proteomes

    PubMed Central

    Mohandass, Shylajanaciyar; Varadharaj, Sangeetha; Thilagar, Sivasudha; Abdul Kareem, Kaleel Ahamed; Dharmar, Prabaharan; Gopalakrishnan, Subramanian; Lakshmanan, Uma

    2015-01-01

    Cyanobacterial KnowledgeBase (CKB) is a free access database that contains the genomic and proteomic information of 74 fully sequenced cyanobacterial genomes belonging to seven orders. The database also contains tools for sequence analysis. The Species report and the gene report provide details about each species and gene (including sequence features and gene ontology annotations) respectively. The database also includes cyanoBLAST, an advanced tool that facilitates comparative analysis, among cyanobacterial genomes and genomes of E. coli (prokaryote) and Arabidopsis (eukaryote). The database is developed and maintained by the Sub-Distributed Informatics Centre (sponsored by the Department of Biotechnology, Govt. of India) of the National Facility for Marine Cyanobacteria, a facility dedicated to marine cyanobacterial research. CKB is freely available at http://nfmc.res.in/ckb/index.html. PMID:26305368

  16. KIPSE1: A Knowledge-based Interactive Problem Solving Environment for data estimation and pattern classification

    NASA Technical Reports Server (NTRS)

    Han, Chia Yung; Wan, Liqun; Wee, William G.

    1990-01-01

    A knowledge-based interactive problem solving environment called KIPSE1 is presented. KIPSE1 is a system built on a commercial expert system shell, the KEE system. This environment gives the user the capability to carry out exploratory data analysis and pattern classification tasks. A good solution often consists of a sequence of steps with a set of methods used at each step. In KIPSE1, a solution is represented in the form of a decision tree, and each node of the solution tree represents a partial solution to the problem. Many methodologies are provided to the user at each node, so that the user can interactively select the method and data sets to test and subsequently examine the results. In addition, users can make decisions at various stages of problem solving to subdivide the problem into smaller subproblems, so that a large problem can be handled and a better solution can be found.

  17. A Knowledge-Based System for the Computer Assisted Diagnosis of Endoscopic Images

    NASA Astrophysics Data System (ADS)

    Kage, Andreas; Münzenmayer, Christian; Wittenberg, Thomas

    Owing to current demographic developments, the use of Computer-Assisted Diagnosis (CAD) systems is becoming a more important part of clinical workflows and clinical decision making. Because changes in the mucosa of the esophagus can indicate the first stage of cancerous developments, there is great interest in detecting and correctly diagnosing any such lesion. We present a knowledge-based system which is able to support a physician with the interpretation and diagnosis of endoscopic images of the esophagus. Our system is designed to support the physician directly during the examination of the patient, thus providing diagnostic assistance at the point of care (POC). For an interactively marked region in an endoscopic image of interest, the system provides a diagnostic suggestion based on an annotated reference image database. Furthermore, using relevance feedback mechanisms, the results can be refined interactively.

  18. Knowledge-based automatic recognition technology of radome from infrared images

    NASA Astrophysics Data System (ADS)

    Wang, Xiao-jian; Ma, Ling; Fang, Xiao; Chen, Lei; Lu, Hong-bin

    2009-07-01

    In this paper, a knowledge-based automatic target recognition (ATR) technique for detecting radomes in infrared images is studied. The circular appearance of the radome is used as the characteristic that distinguishes it from the background and enables target recognition. To address the low contrast typical of infrared images, a brightness transformation is first applied to enhance the contrast of the original image. Because the background outline is statistically dominated by vertical and horizontal directions, a revised Sobel operator oriented at 45° and 135° is adopted for edge detection, so that background noise is effectively suppressed. To reduce the error rate of recognition from a single frame, the consistency of recognition results across successive frames is checked. The performance of the algorithm is tested on actual infrared radome images; the correct recognition rate is around 90%, which shows that the technique is feasible.
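
    A sketch of a 45°/135° Sobel-style edge detector of the kind described, using standard diagonal kernel weights; this is an illustration, not the authors' exact revised operator.

      # Diagonal (45° and 135°) Sobel-like edge detection on a synthetic image.
      import numpy as np
      from scipy.ndimage import convolve

      K45 = np.array([[ 0,  1,  2],
                      [-1,  0,  1],
                      [-2, -1,  0]], dtype=float)   # responds to edges along the 45° diagonal
      K135 = np.array([[-2, -1,  0],
                       [-1,  0,  1],
                       [ 0,  1,  2]], dtype=float)  # responds to edges along the 135° diagonal

      def diagonal_edges(image):
          g45 = convolve(image.astype(float), K45)
          g135 = convolve(image.astype(float), K135)
          return np.hypot(g45, g135)   # combined diagonal edge magnitude

      img = np.triu(np.ones((64, 64))) * 255.0      # synthetic image with a diagonal step edge
      edges = diagonal_edges(img)
      print(edges.max() > 0)   # True: the diagonal boundary produces a strong response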

  19. A knowledge-based imaging informatics approach to managing patients treated with proton beam therapy

    NASA Astrophysics Data System (ADS)

    Liu, B. J.; Huang, H. K.; Law, M.; Le, Anh; Documet, Jorge; Gertych, Arek

    2007-03-01

    Last year we presented work on an imaging informatics approach towards developing quantitative knowledge and tools based on standardized DICOM-RT objects for Image-Guided Radiation Therapy. In this paper, we have extended this methodology to perform knowledge-based medical imaging informatics research on specific clinical scenarios where brain tumor patients are treated with Proton Beam Therapy (PT). PT utilizes energized charged particles, protons, to deliver dose to the target region. Protons are energized to specific velocities which determine where they will deposit maximum energy within the body to destroy cancerous cells. Treatment Planning is similar in workflow to traditional Radiation Therapy methods such as Intensity-Modulated Radiation Therapy (IMRT) which utilizes a priori knowledge to drive the treatment plan in an inverse manner. In March 2006, two new RT Objects were drafted in a DICOM-RT Supplement 102 specifically for Ion Therapy which includes Proton Therapy. The standardization of DICOM-RT-ION objects and the development of a knowledge base as well as decision-support tools that can be add-on features to the ePR DICOM-RT system were researched. We have developed a methodology to perform knowledge-based medical imaging informatics research on specific clinical scenarios. This methodology can be used to extend to Proton Therapy and the development of future clinical decision-making scenarios during the course of the patient's treatment that utilize "inverse treatment planning". In this paper, we present the initial steps toward extending this methodology for PT and lay the foundation for development of future decision-support tools tailored to cancer patients treated with PT. By integrating decision-support knowledge and tools designed to assist in the decision-making process, a new and improved "knowledge-enhanced treatment planning" approach can be realized.

  20. Knowledge-based assistance for science visualization and analysis using large distributed databases

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Jacobson, Allan S.; Doyle, Richard J.; Collins, Donald J.

    1992-01-01

    Within this decade, the growth in complexity of exploratory data analysis and the sheer volume of space data require new and innovative approaches to support science investigators in achieving their research objectives. To date, there have been numerous efforts addressing the individual issues involved in inter-disciplinary, multi-instrument investigations. However, while successful on a small scale, these efforts have not proven to be open and scalable. This proposal addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within this proposal is the integration of three automation technologies, namely, knowledge-based expert systems, science visualization and science data management. This integration is based on the concept called the Data Hub. With the Data Hub concept, NASA will be able to apply a more complete solution to all nodes of a distributed system. Both computation nodes and interactive nodes will be able to effectively and efficiently use the data services (access, retrieval, update, etc.) within a distributed, interdisciplinary information system in a uniform and standard way. This will allow the science investigators to concentrate on their scientific endeavors, rather than involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis will be on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to publishable scientific results. In addition, the proposed work includes all the required end-to-end components and interfaces to demonstrate the completed concept.

  1. Knowledge-based assistance for science visualization and analysis using large distributed databases

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Jacobson, Allan S.; Doyle, Richard J.; Collins, Donald J.

    1993-01-01

    Within this decade, the growth in complexity of exploratory data analysis and the sheer volume of space data require new and innovative approaches to support science investigators in achieving their research objectives. To date, there have been numerous efforts addressing the individual issues involved in inter-disciplinary, multi-instrument investigations. However, while successful on a small scale, these efforts have not proven to be open and scalable. This proposal addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within this proposal is the integration of three automation technologies, namely, knowledge-based expert systems, science visualization and science data management. This integration is based on a concept called the DataHub. With the DataHub concept, NASA will be able to apply a more complete solution to all nodes of a distributed system. Both computation nodes and interactive nodes will be able to effectively and efficiently use the data services (access, retrieval, update, etc.) within a distributed, interdisciplinary information system in a uniform and standard way. This will allow the science investigators to concentrate on their scientific endeavors, rather than involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis will be on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to publishable scientific results. In addition, the proposed work includes all the required end-to-end components and interfaces to demonstrate the completed concept.

  2. Comparative development of knowledge-based bioeconomy in the European Union and Turkey.

    PubMed

    Celikkanat Ozan, Didem; Baran, Yusuf

    2014-09-01

    Biotechnology, defined as the technological application that uses biological systems and living organisms, or their derivatives, to create or modify diverse products or processes, is widely used for healthcare, agricultural and environmental applications. The continuity of industrial applications of biotechnology has enabled the rise and development of the bioeconomy concept. The bioeconomy, encompassing all applications of biotechnology, is defined as the translation of knowledge from the life sciences into new, sustainable, environmentally friendly and competitive products. Advanced research and eco-efficient processes within the scope of the bioeconomy promise a healthier and more sustainable life. The knowledge-based bioeconomy, with its economic, social and environmental potential, has already been brought onto the research agendas of European Union (EU) countries. The aim of this study is to summarize the development of the knowledge-based bioeconomy in EU countries and to evaluate Turkey's current situation in comparison. EU-funded biotechnology research projects under FP6 and FP7 and nationally funded biotechnology projects under The Scientific and Technological Research Council of Turkey (TUBITAK) Academic Research Funding Program Directorate (ARDEB) and Technology and Innovation Funding Programs Directorate (TEYDEB) were examined. In the context of this study, the main research areas and subfields that have been funded, the budget spent, the number of projects funded since 2003 both nationally and EU-wide, and the gaps and overlapping topics were analyzed. Based on the results, detailed suggestions for Turkey are proposed. The research results are expected to serve as a roadmap for coordinating the stakeholders of the bioeconomy and integrating Turkish Research Areas into European Research Areas.

  3. Integrating a modern knowledge-based system architecture with a legacy VA database: the ATHENA and EON projects at Stanford.

    PubMed

    Advani, A; Tu, S; O'Connor, M; Coleman, R; Goldstein, M K; Musen, M

    1999-01-01

    We present a methodology and database mediator tool for integrating modern knowledge-based systems, such as the Stanford EON architecture for automated guideline-based decision-support, with legacy databases, such as the Veterans Health Information Systems & Technology Architecture (VISTA) systems, which are used nation-wide. Specifically, we discuss designs for database integration in ATHENA, a system for hypertension care based on EON, at the VA Palo Alto Health Care System. We describe a new database mediator that affords the EON system both physical and logical data independence from the legacy VA database. We found that to achieve our design goals, the mediator requires two separate mapping levels and must itself involve a knowledge-based component.
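
    One possible reading of the two mapping levels, sketched with entirely hypothetical table, column and code names (this is not the ATHENA/VISTA schema): a physical mapping from legacy tables and columns to canonical data elements, and a logical mapping from legacy codes to the concepts the guideline layer expects.

      # Two-level mediator sketch: physical schema mapping + logical terminology mapping.
      PHYSICAL_MAP = {          # level 1: where each data element lives in the legacy schema
          "systolic_bp":  ("VITALS", "SYS_BP"),
          "diastolic_bp": ("VITALS", "DIA_BP"),
          "drug_code":    ("MEDS",   "RX_CODE"),
      }

      LOGICAL_MAP = {           # level 2: what the legacy values mean to the guideline model
          "drug_code": {"HCTZ25": "thiazide_diuretic", "LIS10": "ace_inhibitor"},
      }

      def mediate(legacy_rows, concept):
          """Resolve a guideline-level concept against legacy rows keyed by (table, column)."""
          table, column = PHYSICAL_MAP[concept]
          raw = legacy_rows[(table, column)]
          return LOGICAL_MAP.get(concept, {}).get(raw, raw)

      rows = {("VITALS", "SYS_BP"): 152, ("VITALS", "DIA_BP"): 94, ("MEDS", "RX_CODE"): "HCTZ25"}
      print(mediate(rows, "systolic_bp"))  # 152
      print(mediate(rows, "drug_code"))    # thiazide_diuretic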

  4. The role of textual semantic constraints in knowledge-based inference generation during reading comprehension: A computational approach.

    PubMed

    Yeari, Menahem; van den Broek, Paul

    2015-01-01

    The present research adopted a computational approach to explore the extent to which the semantic content of texts constrains the activation of knowledge-based inferences. Specifically, we examined whether textual semantic constraints (TSC) can explain (1) the activation of predictive inferences, (2) the activation of bridging inferences and (3) the higher prevalence of the activation of bridging inferences compared to predictive inferences. To examine these hypotheses, we computed the strength of semantic associations between texts and probe items as presented to human readers in previous behavioural studies, using the Latent Semantic Analysis (LSA) algorithm. We tested whether stronger semantic associations are observed for inferred items compared to control items. Our results show that in 15 out of 17 planned comparisons, the computed strength of semantic associations successfully simulated the activation of inferences. These findings suggest that TSC play a central role in the activation of knowledge-based inferences.
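
    A rough stand-in for the kind of computation described, using a toy corpus and TF-IDF plus truncated SVD in place of a full LSA space: the association between a text and a probe item is scored by the cosine similarity of their reduced vectors.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.decomposition import TruncatedSVD
      from sklearn.metrics.pairwise import cosine_similarity

      corpus = [
          "the climber slipped on the icy ledge near the summit",
          "rescuers carried the injured climber down the mountain",
          "the chef seasoned the soup and stirred the pot",
      ]
      vec = TfidfVectorizer()
      X = vec.fit_transform(corpus)                          # term-document matrix
      lsa = TruncatedSVD(n_components=2, random_state=0).fit(X)

      def association(text, probe):
          """Cosine similarity of text and probe in the reduced semantic space."""
          a, b = lsa.transform(vec.transform([text, probe]))
          return float(cosine_similarity([a], [b])[0, 0])

      story = "the climber slipped on the icy ledge near the summit"
      for probe in ["mountain", "soup"]:                     # a related and an unrelated probe
          print(probe, round(association(story, probe), 3))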

  5. CIViC is a community knowledgebase for expert crowdsourcing the clinical interpretation of variants in cancer

    PubMed Central

    Griffith, Malachi; Spies, Nicholas C; Krysiak, Kilannin; McMichael, Joshua F; Coffman, Adam C; Danos, Arpad M; Ainscough, Benjamin J; Ramirez, Cody A; Rieke, Damian T; Kujan, Lynzey; Barnell, Erica K; Wagner, Alex H; Skidmore, Zachary L; Wollam, Amber; Liu, Connor J; Jones, Martin R; Bilski, Rachel L; Lesurf, Robert; Feng, Yan-Yang; Shah, Nakul M; Bonakdar, Melika; Trani, Lee; Matlock, Matthew; Ramu, Avinash; Campbell, Katie M; Spies, Gregory C; Graubert, Aaron P; Gangavarapu, Karthik; Eldred, James M; Larson, David E; Walker, Jason R; Good, Benjamin M; Wu, Chunlei; Su, Andrew I; Dienstmann, Rodrigo; Margolin, Adam A; Tamborero, David; Lopez-Bigas, Nuria; Jones, Steven J M; Bose, Ron; Spencer, David H; Wartman, Lukas D; Wilson, Richard K; Mardis, Elaine R; Griffith, Obi L

    2017-01-01

    CIViC is an expert-crowdsourced knowledgebase for Clinical Interpretation of Variants in Cancer describing the therapeutic, prognostic, diagnostic and predisposing relevance of inherited and somatic variants of all types. CIViC is committed to open-source code, open-access content, public application programming interfaces (APIs) and provenance of supporting evidence to allow for the transparent creation of current and accurate variant interpretations for use in cancer precision medicine. PMID:28138153

  6. Knowledge-based personalized search engine for the Web-based Human Musculoskeletal System Resources (HMSR) in biomechanics.

    PubMed

    Dao, Tien Tuan; Hoang, Tuan Nha; Ta, Xuan Hien; Tho, Marie Christine Ho Ba

    2013-02-01

    Human musculoskeletal system resources are valuable for learning and medical purposes. Internet-based information from conventional search engines such as Google or Yahoo cannot respond to the need for useful, accurate, reliable and good-quality human musculoskeletal resources related to medical processes, pathological knowledge and practical expertise. In the present work, an advanced knowledge-based personalized search engine was developed. Our search engine is based on a client-server, multi-layer, multi-agent architecture and the principle of semantic web services, and dynamically acquires accurate and reliable HMSR information through a semantic processing and visualization approach. A security-enhanced mechanism was applied to protect the medical information. A multi-agent crawler was implemented to develop a content-based database of HMSR information. A new semantic-based PageRank score with related mathematical formulas was also defined and implemented. As a result, semantic web service descriptions are presented in OWL, WSDL and OWL-S formats. Operational scenarios with related web-based interfaces for personal computers and mobile devices are presented and analyzed. A functional comparison between our knowledge-based search engine, a conventional search engine and a semantic search engine shows the originality and the robustness of our knowledge-based personalized search engine. In fact, our knowledge-based personalized search engine allows different users, such as orthopedic patients and experts, healthcare system managers or medical students, to access remotely useful, accurate, reliable and good-quality HMSR information for their learning and medical purposes.

  7. Analysis of Defects in Trouser Manufacturing: Development of a Knowledge-Based Framework. Volume 1. Final Technical Report

    DTIC Science & Technology

    1992-02-28

    apparel manufacturing. Two knowledge-based software systems -- FDAS (Fabric Defects Analysis System) and SDAS (Sewing Defects Analysis System) -- have been...sewing, finishing and packing departments of an apparel plant producing denim trousers. Based on the visual description of the defect in the fabric...type, orientation and mode of repetition of the defect), FDAS identifies the defect and suggests possible causes and remedies. Apparel Quality Control

  8. Evaluation of a Knowledge-Based Planning Solution for Head and Neck Cancer

    SciTech Connect

    Tol, Jim P. Delaney, Alexander R.; Dahele, Max; Slotman, Ben J.; Verbakel, Wilko F.A.R.

    2015-03-01

    Purpose: Automated and knowledge-based planning techniques aim to reduce variations in plan quality. RapidPlan uses a library consisting of different patient plans to make a model that can predict achievable dose-volume histograms (DVHs) for new patients and uses those models for setting optimization objectives. We benchmarked RapidPlan versus clinical plans for 2 patient groups, using 3 different libraries. Methods and Materials: Volumetric modulated arc therapy plans of 60 recent head and neck cancer patients that included sparing of the salivary glands, swallowing muscles, and oral cavity were evenly divided between 2 models, Model_30A and Model_30B, and were combined in a third model, Model_60. Knowledge-based plans were created for 2 evaluation groups: evaluation group 1 (EG1), consisting of 15 recent patients, and evaluation group 2 (EG2), consisting of 15 older patients in whom only the salivary glands were spared. RapidPlan results were compared with clinical plans (CP) for boost and/or elective planning target volume homogeneity index, using HI_B/HI_E = 100 × (D2% − D98%)/D50%, and mean dose to composite salivary glands, swallowing muscles, and oral cavity (D_sal, D_swal, and D_oc, respectively). Results: For EG1, RapidPlan improved HI_B and HI_E values compared with CP by 1.0% to 1.3% and 1.0% to 0.6%, respectively. Comparable D_sal and D_swal values were seen in Model_30A, Model_30B, and Model_60, decreasing by an average of 0.1, 1.0, and 0.8 Gy and 4.8, 3.7, and 4.4 Gy, respectively. However, differences were noted between individual organs at risk (OARs), with Model_30B increasing D_oc by 0.1, 3.2, and 2.8 Gy compared with CP, Model_30A, and Model_60. Plan quality was less consistent when the patient was flagged as an outlier. For EG2, RapidPlan decreased D_sal by 4.1 to 4.9 Gy on average, whereas HI_B and HI_E decreased by 1.1% to

  9. Design of Composite Structures Using Knowledge-Based and Case Based Reasoning

    NASA Technical Reports Server (NTRS)

    Lambright, Jonathan Paul

    1996-01-01

    A method of using knowledge-based and case-based reasoning to assist designers during conceptual design tasks of composite structures was proposed. The cooperative use of heuristics, procedural knowledge, and previous similar design cases suggests a potential reduction in design cycle time and ultimately product lead time. The hypothesis of this work is that the design process of composite structures can be improved by using Case-Based Reasoning (CBR) and Knowledge-Based (KB) reasoning in the early design stages. The technique of using knowledge-based and case-based reasoning facilitates the gathering of disparate information into one location that is easily and readily available. The method suggests that the inclusion of downstream life-cycle issues into the conceptual design phase reduces the potential for defective and sub-optimal composite structures. Three industry experts were interviewed extensively. The experts provided design rules, previous design cases, and test problems. A Knowledge-Based Reasoning system was developed using the CLIPS (C Language Integrated Production System) environment, and a Case-Based Reasoning system was developed using the Design Memory Utility For Sharing Experiences (MUSE) environment. A Design Characteristic State (DCS) was used to document the design specifications, constraints, and problem areas using attribute-value pair relationships. The DCS provided consistent design information between the knowledge base and case base. Results indicated that the use of knowledge-based and case-based reasoning provided a robust design environment for composite structures. The knowledge base provided design guidance from well defined rules and procedural knowledge. The case base provided suggestions on design and manufacturing techniques based on previous similar designs and warnings of potential problems and pitfalls. The case base complemented the knowledge base and extended the problem solving capability beyond the existence of

  10. Knowledge-based prediction of three-dimensional dose distributions for external beam radiotherapy

    SciTech Connect

    Shiraishi, Satomi; Moore, Kevin L.

    2016-01-15

    Purpose: To demonstrate knowledge-based 3D dose prediction for external beam radiotherapy. Methods: Using previously treated plans as training data, an artificial neural network (ANN) was trained to predict a dose matrix based on patient-specific geometric and planning parameters, such as the closest distance (r) to the planning target volume (PTV) and organs at risk (OARs). Twenty-three prostate and 43 stereotactic radiosurgery/radiotherapy (SRS/SRT) cases with at least one nearby OAR were studied. All were planned with volumetric-modulated arc therapy to prescription doses of 81 Gy for prostate and 12–30 Gy for SRS. Using these clinically approved plans, ANNs were trained to predict the dose matrix and the predictive accuracy was evaluated using the dose difference between the clinical plan and prediction, δD = D_clin − D_pred. The mean (⟨δD_r⟩), standard deviation (σ_δD_r), and their interquartile range (IQR) for the training plans were evaluated at a 2–3 mm interval from the PTV boundary (r_PTV) to assess prediction bias and precision. Initially, unfiltered models, trained using all plans in the cohorts, were created for each treatment site. These models predict approximately the average quality of OAR sparing. Emphasizing a subset of plans that exhibited better-than-average OAR sparing during training, refined models were created to predict high-quality rectum sparing for prostate and brainstem sparing for SRS. Using the refined model, potentially suboptimal plans were identified where the model predicted that further sparing of the OARs was achievable. Replans were performed to test whether the OAR sparing could be improved as predicted by the model. Results: The refined models demonstrated highly accurate dose distribution prediction. For prostate cases, the average prediction bias for all voxels irrespective of organ delineation ranged from −1% to 0% with a maximum IQR of 3% over r_PTV ∈ [−6, 30] mm. The
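
    A schematic sketch of the workflow only, on synthetic data with a scikit-learn MLP standing in for the ANN described: learn voxel dose as a function of simple geometric features, then evaluate the prediction error δD = D_clin − D_pred. The feature definitions and dose model are invented for the illustration.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(0)
      r_ptv = rng.uniform(-6, 30, size=2000)            # signed distance to the PTV boundary (mm)
      r_oar = rng.uniform(0, 40, size=2000)             # distance to the nearest OAR (mm)
      X = np.column_stack([r_ptv, r_oar])
      # toy "clinical" dose: prescription-level inside the PTV, falling off outside, plus noise
      D_clin = 80.0 * np.exp(-np.clip(r_ptv, 0, None) / 10.0) + rng.normal(0, 1.5, size=2000)

      ann = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0).fit(X, D_clin)
      D_pred = ann.predict(X)
      dD = D_clin - D_pred
      print(f"mean dD = {dD.mean():.2f} Gy, sd = {dD.std():.2f} Gy")   # bias and precision of the prediction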

  11. A Knowledge-Based Approach to Improving and Homogenizing Intensity Modulated Radiation Therapy Planning Quality Among Treatment Centers: An Example Application to Prostate Cancer Planning

    SciTech Connect

    Good, David; Lo, Joseph; Lee, W. Robert; Wu, Q. Jackie; Yin, Fang-Fang; Das, Shiva K.

    2013-09-01

    Purpose: Intensity modulated radiation therapy (IMRT) treatment planning can have wide variation among different treatment centers. We propose a system to leverage the IMRT planning experience of larger institutions to automatically create high-quality plans for outside clinics. We explore feasibility by generating plans for patient datasets from an outside institution by adapting plans from our institution. Methods and Materials: A knowledge database was created from 132 IMRT treatment plans for prostate cancer at our institution. The outside institution, a community hospital, provided the datasets for 55 prostate cancer cases, including their original treatment plans. For each “query” case from the outside institution, a similar “match” case was identified in the knowledge database, and the match case’s plan parameters were then adapted and optimized to the query case by use of a semiautomated approach that required no expert planning knowledge. The plans generated with this knowledge-based approach were compared with the original treatment plans at several dose cutpoints. Results: Compared with the original plan, the knowledge-based plan had a significantly more homogeneous dose to the planning target volume and a significantly lower maximum dose. The volumes of the rectum, bladder, and femoral heads above all cutpoints were nominally lower for the knowledge-based plan; the reductions were significantly lower for the rectum. In 40% of cases, the knowledge-based plan had overall superior (lower) dose–volume histograms for rectum and bladder; in 54% of cases, the comparison was equivocal; in 6% of cases, the knowledge-based plan was inferior for both bladder and rectum. Conclusions: Knowledge-based planning was superior or equivalent to the original plan in 95% of cases. The knowledge-based approach shows promise for homogenizing plan quality by transferring planning expertise from more experienced to less experienced institutions.

  12. Initial Validation of a Knowledge-Based Measure of Social Information Processing and Anger Management

    PubMed Central

    Cassano, Michael; MacEvoy, Julie Paquette; Costigan, Tracy

    2010-01-01

    Over the past fifteen years many schools have utilized aggression prevention programs. Despite these apparent advances, many programs are not examined systematically to determine the areas in which they are most effective. One reason for this is that many programs, especially those in urban under-resourced areas, do not utilize outcome measures that are sensitive to the needs of ethnic minority students. The current study illustrates how a new knowledge-based measure of social information processing and anger management techniques was designed through a partnership-based process to ensure that it would be sensitive to the needs of urban, predominately African American youngsters, while also having broad potential applicability for use as an outcome assessment tool for aggression prevention programs focusing upon social information processing. The new measure was found to have strong psychometric properties within a sample of urban predominately African American youth, as item analyses suggested that almost all items discriminate well between more and less knowledgeable individuals, that the test-retest reliability of the measure is strong, and that the measure appears to be sensitive to treatment changes over time. In addition, the overall score of this new measure is moderately associated with attributions of hostility on two measures (negative correlations) and demonstrates a low to moderate negative association with peer and teacher report measures of overt and relational aggression. More research is needed to determine the measure's utility outside of the urban school context. PMID:20449645

  13. Human Disease Insight: An integrated knowledge-based platform for disease-gene-drug information.

    PubMed

    Tasleem, Munazzah; Ishrat, Romana; Islam, Asimul; Ahmad, Faizan; Hassan, Md Imtaiyaz

    2016-01-01

    The scope of the Human Disease Insight (HDI) database is not limited to researchers or physicians as it also provides basic information to non-professionals and creates disease awareness, thereby reducing the chances of patient suffering due to ignorance. HDI is a knowledge-based resource providing information on human diseases to both scientists and the general public. Here, our mission is to provide a comprehensive human disease database containing most of the available useful information, with extensive cross-referencing. HDI is a knowledge management system that acts as a central hub to access information about human diseases and associated drugs and genes. In addition, HDI contains well-classified bioinformatics tools with helpful descriptions. These integrated bioinformatics tools enable researchers to annotate disease-specific genes and perform protein analysis, search for biomarkers and identify potential vaccine candidates. Eventually, these tools will facilitate the analysis of disease-associated data. The HDI provides two types of search capabilities and includes provisions for downloading, uploading and searching disease/gene/drug-related information. The logistical design of the HDI allows for regular updating. The database is designed to work best with Mozilla Firefox and Google Chrome and is freely accessible at http://humandiseaseinsight.com.

  14. Knowledge-based analysis of microarrays for the discovery of transcriptional regulation relationships

    PubMed Central

    2010-01-01

    Background: The large amount of high-throughput genomic data has facilitated the discovery of the regulatory relationships between transcription factors and their target genes. While early methods for discovery of transcriptional regulation relationships from microarray data often focused on the high-throughput experimental data alone, more recent approaches have explored the integration of external knowledge bases of gene interactions. Results: In this work, we develop an algorithm that provides improved performance in the prediction of transcriptional regulatory relationships by supplementing the analysis of microarray data with a new method of integrating information from an existing knowledge base. Using a well-known dataset of yeast microarrays and the Yeast Proteome Database, a comprehensive collection of known information of yeast genes, we show that knowledge-based predictions demonstrate better sensitivity and specificity in inferring new transcriptional interactions than predictions from microarray data alone. We also show that comprehensive, direct and high-quality knowledge bases provide better prediction performance. Comparison of our results with ChIP-chip data and growth fitness data suggests that our predicted genome-wide regulatory pairs in yeast are reasonable candidates for follow-up biological verification. Conclusion: High quality, comprehensive, and direct knowledge bases, when combined with appropriate bioinformatic algorithms, can significantly improve the discovery of gene regulatory relationships from high throughput gene expression data. PMID:20122245

  15. HIVed, a knowledgebase for differentially expressed human genes and proteins during HIV infection, replication and latency

    PubMed Central

    Li, Chen; Ramarathinam, Sri H.; Revote, Jerico; Khoury, Georges; Song, Jiangning; Purcell, Anthony W.

    2017-01-01

    Measuring the altered gene expression level and identifying differentially expressed genes/proteins during HIV infection, replication and latency is fundamental for broadening our understanding of the mechanisms of HIV infection and T-cell dysfunction. Such studies are crucial for developing effective strategies for virus eradication from the body. Inspired by the availability and enrichment of gene expression data during HIV infection, replication and latency, in this study, we proposed a novel compendium termed HIVed (HIV expression database; http://hivlatency.erc.monash.edu/) that harbours comprehensive functional annotations of proteins, whose genes have been shown to be dysregulated during HIV infection, replication and latency using different experimental designs and measurements. We manually curated a variety of third-party databases for structural and functional annotations of the protein entries in HIVed. With the goal of benefiting HIV related research, we collected a number of biological annotations for all the entries in HIVed besides their expression profile, including basic protein information, Gene Ontology terms, secondary structure, HIV-1 interaction and pathway information. We hope this comprehensive protein-centric knowledgebase can bridge the gap between the understanding of differentially expressed genes and the functions of their protein products, facilitating the generation of novel hypotheses and treatment strategies to fight against the HIV pandemic. PMID:28358052

  16. Knowledge-based deformable surface model with application to segmentation of brain structures in MRI

    NASA Astrophysics Data System (ADS)

    Ghanei, Amir; Soltanian-Zadeh, Hamid; Elisevich, Kost; Fessler, Jeffrey A.

    2001-07-01

    We have developed a knowledge-based deformable surface for the segmentation of medical images. This work has been done in the context of segmentation of the hippocampus from brain MRI, because of its challenge and clinical importance. The model has a polyhedral discrete structure and is initialized automatically by analyzing the brain MRI slice by slice and finding a few landmark features in each slice using an expert system. The expert system decides on the presence of the hippocampus and its general location in each slice. The landmarks found are connected together by a triangulation method to generate a closed initial surface. The surface then deforms under defined internal and external force terms to generate an accurate and reproducible boundary for the hippocampus. The anterior and posterior (AP) limits of the hippocampus are estimated by automatic analysis of the location of the brain stem and some of the features extracted in the initialization process. These data are combined with a priori knowledge using Bayes' method to estimate a probability density function (pdf) for the length of the structure in the sagittal direction. The hippocampus AP limits are found by optimizing this pdf. The model is tested on real clinical data and the results show very good model performance.
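
    Written generically, the Bayes combination described amounts to forming a posterior for the sagittal length L from the feature-derived likelihood and the a priori knowledge, and optimizing it (this is a generic statement of Bayes' rule, not a formula reproduced from the paper):

      p(L \mid \text{features}) \propto p(\text{features} \mid L)\, p(L),
      \qquad
      \hat{L} = \arg\max_{L} \; p(L \mid \text{features}),

    with the AP limits then placed so that the segmented structure spans the optimized length \hat{L}.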

  17. Online Mendelian Inheritance in Man (OMIM), a knowledgebase of human genes and genetic disorders.

    PubMed

    Hamosh, Ada; Scott, Alan F; Amberger, Joanna S; Bocchini, Carol A; McKusick, Victor A

    2005-01-01

    Online Mendelian Inheritance in Man (OMIM) is a comprehensive, authoritative and timely knowledgebase of human genes and genetic disorders compiled to support human genetics research and education and the practice of clinical genetics. Started by Dr Victor A. McKusick as the definitive reference Mendelian Inheritance in Man, OMIM (http://www.ncbi.nlm.nih.gov/omim/) is now distributed electronically by the National Center for Biotechnology Information, where it is integrated with the Entrez suite of databases. Derived from the biomedical literature, OMIM is written and edited at Johns Hopkins University with input from scientists and physicians around the world. Each OMIM entry has a full-text summary of a genetically determined phenotype and/or gene and has numerous links to other genetic databases such as DNA and protein sequence, PubMed references, general and locus-specific mutation databases, HUGO nomenclature, MapViewer, GeneTests, patient support groups and many others. OMIM is an easy and straightforward portal to the burgeoning information in human genetics.

  18. A methodology for evaluating potential KBS (Knowledge-Based Systems) applications

    SciTech Connect

    Melton, R.B.; DeVaney, D.M.; Whiting, M.A.; Laufmann, S.C.

    1989-06-01

    It is often difficult to assess how well Knowledge-Based Systems (KBS) techniques and paradigms may be applied to automating various tasks. This report describes the approach and organization of an assessment procedure that involves two levels of analysis. Level One can be performed by individuals with little technical expertise relative to KBS development, while Level Two is intended to be used by experienced KBS developers. The two levels review four groups of issues: goals, appropriateness, resources, and non-technical considerations. The criteria that are important at each step in the assessment are identified. A qualitative methodology for scoring the task relative to the assessment criteria is provided to allow analysts to make better informed decisions with regard to the potential effectiveness of applying KBS technology. In addition to this documentation, the assessment methodology has been implemented for personal computer use, using the HyperCard software on a Macintosh computer. This interactive mode facilitates small-group analysis of potential KBS applications and permits a non-sequential appraisal with provisions for automated note-keeping and question scoring. The results provide a useful tool for assessing the feasibility of using KBS techniques in performing tasks in support of treaty verification or IC functions. 13 refs., 3 figs.

  19. Self-revealing software: a method for producing understandable knowledge-based systems

    SciTech Connect

    Paul, J.

    1988-01-01

    System self-explanation is critical for the construction, utility, acceptance, and maintenance of complex, knowledge-based software. This dissertation presents a new methodology and implementation techniques that enable software systems to explain their knowledge and reasoning, i.e., to become self-revealing. These systems are capable of self-analysis: they introspect about their own knowledge and behavior. This internal awareness, coupled with intelligent communication ability, provides the critical resources necessary for more understandable, self-explaining systems. The theory addresses the spectrum of explanation goals and is applicable to complex and unstructured domains and to general control structures. The method, called REVEAL, represents the culmination of research and experimentation with new explanation techniques conducted as part of the development of a legal expert system, SAL (System for Asbestos Litigation). SAL adheres to the design philosophy of REVEAL and uses many of the associated techniques. Throughout the dissertation, the theoretical concepts are demonstrated by examples of their implementation in SAL.

  20. Computing gene expression data with a knowledge-based gene clustering approach.

    PubMed

    Rosa, Bruce A; Oh, Sookyung; Montgomery, Beronda L; Chen, Jin; Qin, Wensheng

    2010-01-01

    Computational analysis methods for gene expression data gathered in microarray experiments can be used to identify the functions of previously unstudied genes. While obtaining the expression data is not a difficult task, interpreting and extracting the information from the datasets is challenging. In this study, a knowledge-based approach, which identifies and saves important functional genes before filtering based on variability and fold-change differences, was utilized to study light regulation. Two clustering methods were used to cluster the filtered datasets, and clusters containing a key light-regulatory gene were located. The genes common to both of these clusters were identified, and the genes in the common cluster were ranked based on their coexpression with the key gene. This process was repeated for 11 key genes in 3 treatment combinations. The initial filtering method reduced the dataset size from 22,814 probes to an average of 1134 genes, and the resulting common cluster lists contained an average of only 14 genes. These common cluster lists achieved higher gene enrichment scores than either individual clustering method. In addition, the filtering method increased the proportion of light-responsive genes in the dataset from 1.8% to 15.2%, and the cluster lists increased this proportion to 18.4%. The relatively short length of these common cluster lists compared to gene groups generated through typical clustering methods or coexpression networks narrows the search for novel functional genes while increasing the likelihood that they are biologically relevant.
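
    A minimal sketch of the intersect-and-rank idea described above, assuming a log-expression matrix already in memory: two clusterings are computed, genes sharing a cluster with a key gene under both methods are kept, and the survivors are ranked by correlation with the key gene. The clustering algorithms, thresholds, and parameter values are illustrative stand-ins, not the study's exact choices.

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.cluster import AgglomerativeClustering, KMeans

def common_cluster(expr, genes, key_gene, n_clusters=20, min_var=0.5):
    """expr: (n_genes, n_conditions) log-expression matrix; genes: list of IDs.
    Keep the key gene plus sufficiently variable genes, cluster twice, and return
    the genes that share a cluster with the key gene under both methods, ranked
    by coexpression with it."""
    expr = np.asarray(expr, dtype=float)
    keep = (expr.var(axis=1) >= min_var) | (np.array(genes) == key_gene)
    expr = expr[keep]
    genes = [g for g, k in zip(genes, keep) if k]
    idx = genes.index(key_gene)

    labels_a = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(expr)
    labels_b = AgglomerativeClustering(n_clusters=n_clusters).fit_predict(expr)

    in_both = [g for g, a, b in zip(genes, labels_a, labels_b)
               if a == labels_a[idx] and b == labels_b[idx] and g != key_gene]
    return sorted(in_both,
                  key=lambda g: -abs(pearsonr(expr[genes.index(g)], expr[idx])[0]))
```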

  1. Development of a Knowledgebase to Integrate, Analyze, Distribute, and Visualize Microbial Community Systems Biology Data

    SciTech Connect

    Banfield, Jillian

    2015-01-15

    We have developed a flexible knowledgebase system, ggKbase (http://gg.berkeley.edu), to enable effective data analysis and knowledge generation from samples from which metagenomic and other ‘omics’ data are obtained. Within ggKbase, data can be interpreted, integrated and linked to other databases and services. Sequence information from complex metagenomic samples can be quickly and effectively resolved into genomes, and biologically meaningful investigations of an organism’s metabolic potential can then be conducted. Critical features make analyses efficient, allowing analysis of hundreds of genomes at a time. The system is being used to support research in multiple DOE-relevant systems, including the LBNL SFA subsurface science biogeochemical cycling research at Rifle, Colorado. ggKbase is supporting the research of a rapidly growing group of users. It has enabled studies of carbon cycling in acid mine drainage ecosystems, of biologically mediated transformations in deep subsurface biomes sampled from mines and the north slope of Alaska, of the human microbiome, and of laboratory bioreactor-based remediation.

  2. ISPE: A knowledge-based system for fluidization studies. 1990 Annual report

    SciTech Connect

    Reddy, S.

    1991-01-01

    Chemical engineers use mathematical simulators to design, model, optimize and refine various engineering plants/processes. This procedure requires the following steps: (1) preparation of an input data file according to the format required by the target simulator; (2) executing the simulation; and (3) analyzing the results of the simulation to determine if all "specified goals" are satisfied. If the goals are not met, the input data file must be modified and the simulation repeated. This multistep process is continued until satisfactory results are obtained, as sketched below. This research was undertaken to develop a knowledge-based system, IPSE (Intelligent Process Simulation Environment), that can enhance the productivity of chemical engineers/modelers by serving as an intelligent assistant to perform a variety of tasks related to process simulation. ASPEN, a simulator widely used by the US Department of Energy (DOE) at the Morgantown Energy Technology Center (METC), was selected as the target process simulator in the project. IPSE, written in the C language, was developed using a number of knowledge-based programming paradigms: object-oriented knowledge representation that uses inheritance and methods, rule-based inferencing (including processing and propagation of probabilistic information) and data-driven programming using demons. It was implemented using the knowledge-based environment LASER. The relationship of IPSE with the user, ASPEN, LASER and the C language is shown in Figure 1.
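
    The simulate-analyze-refine control loop described above can be summarized in a few lines. The function names (run_simulator, goals_met, adjust) are placeholders standing in for the simulator-specific steps; this is a generic sketch, not code from IPSE or ASPEN.

```python
def refine_until_satisfied(params, run_simulator, goals_met, adjust, max_iter=20):
    """Repeatedly run the simulator and adjust its inputs until all goals are met.
    run_simulator, goals_met, and adjust are caller-supplied placeholders for the
    simulator-specific steps (execution, analysis, and input modification)."""
    for iteration in range(max_iter):
        results = run_simulator(params)          # step 2: execute the simulation
        if goals_met(results):                   # step 3: analyze the results
            return params, results, iteration
        params = adjust(params, results)         # modify the input data and repeat
    raise RuntimeError("Goals not satisfied within the iteration limit")
```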

  3. Knowledge-based image understanding and classification system for medical image databases

    NASA Astrophysics Data System (ADS)

    Luo, Hui; Gaborski, Roger S.; Acharya, Raj S.

    2002-05-01

    With the advent of Computed Radiography (CR) and Digital Radiography (DR), image understanding and classification in medical image databases have attracted considerable attention. In this paper, we propose a knowledge-based image understanding and classification system for medical image databases. An object-oriented knowledge model has been introduced, and the content features of medical images are hierarchically matched against the related knowledge models; the input image is classified by finding the best-matching model. The implementation of the system includes three stages. The first stage focuses on matching the coarse pattern of the model class and has three steps: image preprocessing, feature extraction, and neural network classification. Once the coarse shape classification is done, a small set of plausible model candidates is then employed for a detailed match in the second stage. The outputs of this match indicate which models are likely to be present in the processed images. Finally, an evaluation strategy is used to further confirm the results. The performance of the system has been tested on different types of digital radiographs, including pelvis, ankle, and elbow. The experimental results suggest that the system prototype is applicable and robust, and the accuracy of the system is near 70% on our image databases.

  4. A knowledge-based, two-step procedure for extracting channel networks from noisy DEM data

    NASA Astrophysics Data System (ADS)

    Smith, Terence R.; Zhan, Cixiang; Gao, Peng

    We present a new procedure for extracting channel networks from noisy DEM data. The procedure is a knowledge-based, two-step procedure employing both local and nonlocal information. In particular, we employ a model of an ideal drainage network as a source of constraints that must be satisfied by the output of the procedure. We embed these constraints as part of the network extraction procedure. In a first step, the procedure employs the facet model of Haralick to extract valley information from digital images. The constraints employed at this stage relate to conditions indicating reliable valley pixels. In a second step, the procedure applies knowledge of drainage networks to integrate the reliable valley points into a network of single-pixel-wide lines. This network satisfies the constraints imposed by viewing a drainage network as a binary tree in which the channel segments have a one-pixel width. The procedure performs well on DEM data in the example investigated. The overall worst-case performance of the procedure is O(N log N), but the most computationally intensive step in the procedure is easily parallelized. Hence the procedure is a good candidate for automation.

  5. The Application of Integrated Knowledge-based Systems for the Biomedical Risk Assessment Intelligent Network (BRAIN)

    NASA Technical Reports Server (NTRS)

    Loftin, Karin C.; Ly, Bebe; Webster, Laurie; Verlander, James; Taylor, Gerald R.; Riley, Gary; Culbert, Chris; Holden, Tina; Rudisill, Marianne

    1993-01-01

    One of NASA's goals for long duration space flight is to maintain acceptable levels of crew health, safety, and performance. One way of meeting this goal is through the Biomedical Risk Assessment Intelligent Network (BRAIN), an integrated network of both human and computer elements. The BRAIN will function as an advisor to flight surgeons by assessing the risk of in-flight biomedical problems and recommending appropriate countermeasures. This paper describes the joint effort among various NASA elements to develop BRAIN and an Infectious Disease Risk Assessment (IDRA) prototype. The implementation of this effort addresses the technological aspects of the following: (1) knowledge acquisition; (2) integration of IDRA components; (3) use of expert systems to automate the biomedical prediction process; (4) development of a user-friendly interface; and (5) integration of the IDRA prototype and Exercise Countermeasures Intelligent System (ExerCISys). Because the C Language, CLIPS (the C Language Integrated Production System), and the X-Window System were portable and easily integrated, they were chosen as the tools for the initial IDRA prototype. The feasibility was tested by developing an IDRA prototype that predicts the individual risk of influenza. The application of knowledge-based systems to risk assessment is of great market value to the medical technology industry.

  6. SIGMA: A Knowledge-Based Simulation Tool Applied to Ecosystem Modeling

    NASA Technical Reports Server (NTRS)

    Dungan, Jennifer L.; Keller, Richard; Lawless, James G. (Technical Monitor)

    1994-01-01

    The need for better technology to facilitate building, sharing and reusing models is generally recognized within the ecosystem modeling community. The Scientists' Intelligent Graphical Modelling Assistant (SIGMA) creates an environment for model building, sharing and reuse which provides an alternative to more conventional approaches, which too often yield poorly documented, awkwardly structured model code. The SIGMA interface presents the user a list of model quantities which can be selected for computation. Equations to calculate the model quantities may be chosen from an existing library of ecosystem modeling equations, or built using a specialized equation editor. Inputs for the equations may be supplied by data or by calculation from other equations. Each variable and equation is expressed using ecological terminology and scientific units, and is documented with explanatory descriptions and optional literature citations. Automatic scientific unit conversion is supported and only physically-consistent equations are accepted by the system. The system uses knowledge-based semantic conditions to decide which equations in its library make sense to apply in a given situation, and supplies these to the user for selection. The equations and variables are graphically represented as a flow diagram which provides a complete summary of the model. Forest-BGC, a stand-level model that simulates photosynthesis and evapo-transpiration for conifer canopies, was originally implemented in Fortran and subsequently re-implemented using SIGMA. The SIGMA version reproduces daily results and also provides a knowledge base which greatly facilitates inspection, modification and extension of Forest-BGC.

  7. Ab initio protein structure assembly using continuous structure fragments and optimized knowledge-based force field.

    PubMed

    Xu, Dong; Zhang, Yang

    2012-07-01

    Ab initio protein folding is one of the major unsolved problems in computational biology owing to the difficulties in force field design and conformational search. We developed a novel program, QUARK, for template-free protein structure prediction. Query sequences are first broken into fragments of 1-20 residues, where multiple fragment structures are retrieved at each position from unrelated experimental structures. Full-length structure models are then assembled from fragments using replica-exchange Monte Carlo simulations, which are guided by a composite knowledge-based force field. A number of novel energy terms and Monte Carlo movements are introduced, and their particular contributions to enhancing the efficiency of both the force field and the search engine are analyzed in detail. The QUARK prediction procedure is described and tested on the structure modeling of 145 nonhomologous proteins. Although no global templates are used and all fragments from experimental structures with a template modeling score >0.5 are excluded, QUARK can successfully construct 3D models of correct folds in one-third of cases for short proteins of up to 100 residues. In the ninth community-wide Critical Assessment of protein Structure Prediction experiment, the QUARK server outperformed the second- and third-best servers by 18 and 47%, respectively, based on the cumulative Z-score of global distance test-total scores in the FM category. Although ab initio protein folding remains a significant challenge, these data demonstrate new progress toward the solution of the most important problem in the field.
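
    For readers unfamiliar with the search strategy mentioned above, the core of replica-exchange Monte Carlo is the swap test between replicas held at different temperatures. The sketch below shows only that generic acceptance criterion (with energies expressed in units where kB = 1); it is not QUARK's implementation.

```python
import math
import random

def attempt_swap(energy_i, energy_j, temp_i, temp_j):
    """Metropolis criterion for exchanging the configurations held by two
    replicas at temperatures temp_i and temp_j (energies in units where kB = 1).
    Returns True if the swap should be accepted."""
    delta = (1.0 / temp_i - 1.0 / temp_j) * (energy_j - energy_i)
    return delta <= 0.0 or random.random() < math.exp(-delta)

# Example: a lower-energy configuration found at the hotter temperature is
# always swapped down to the colder replica.
print(attempt_swap(energy_i=-80.0, energy_j=-120.0, temp_i=1.0, temp_j=2.0))  # True
```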

  8. Capturing district nursing through a knowledge-based electronic caseload analysis tool (eCAT).

    PubMed

    Kane, Kay

    2014-03-01

    The Electronic Caseload Analysis Tool (eCAT) is a knowledge-based software tool to assist the caseload analysis process. The tool provides a wide range of graphical reports, along with an integrated clinical advisor, to assist district nurses, team leaders, operational and strategic managers with caseload analysis by describing, comparing and benchmarking district nursing practice in the context of population need, staff resources, and service structure. District nurses and clinical lead nurses in Northern Ireland developed the tool, along with academic colleagues from the University of Ulster, working in partnership with a leading software company. The aim was to use the eCAT tool to identify the nursing need of local populations, along with the variances in district nursing practice, and match the workforce accordingly. This article reviews the literature, describes the eCAT solution and discusses the impact of eCAT on nursing practice, staff allocation, service delivery and workforce planning, using fictitious exemplars and a post-implementation evaluation from the trusts.

  9. The fault monitoring and diagnosis knowledge-based system for space power systems: AMPERES, phase 1

    NASA Technical Reports Server (NTRS)

    Lee, S. C.

    1989-01-01

    The objective is to develop a real-time fault monitoring and diagnosis knowledge-based system (KBS) for space power systems which can save costly operational manpower and can achieve more reliable space power system operation. The proposed KBS was developed using the Autonomously Managed Power System (AMPS) test facility currently installed at NASA Marshall Space Flight Center (MSFC), but the basic approach taken for this project could be applicable to other space power systems. The proposed KBS is entitled Autonomously Managed Power-System Extendible Real-time Expert System (AMPERES). In Phase 1 the emphasis was put on the design of the overall KBS, the identification of the basic research required, the initial performance of the research, and the development of a prototype KBS. In Phase 2, emphasis is put on the completion of the research initiated in Phase 1 and the enhancement of the prototype KBS developed in Phase 1. This enhancement is intended to achieve a working real-time KBS integrated with the NASA space power system test facilities. Three major research areas were identified and progress was made in each area. These areas are real-time data acquisition and its supporting data structure; sensor value validation; and development of an inference scheme for effective fault monitoring and diagnosis, and its supporting knowledge representation scheme.

  10. Diagnosis by integrating model-based reasoning with knowledge-based reasoning

    NASA Technical Reports Server (NTRS)

    Bylander, Tom

    1988-01-01

    Our research investigates how observations can be categorized by integrating a qualitative physical model with experiential knowledge. Our domain is diagnosis of pathologic gait in humans, in which the observations are the gait motions, muscle activity during gait, and physical exam data, and the diagnostic hypotheses are the potential muscle weaknesses, muscle mistimings, and joint restrictions. Patients with underlying neurological disorders typically have several malfunctions. Among the problems that need to be faced are: the ambiguity of the observations, the ambiguity of the qualitative physical model, correspondence of the observations and hypotheses to the qualitative physical model, the inherent uncertainty of experiential knowledge, and the combinatorics involved in forming composite hypotheses. Our system divides the work so that the knowledge-based reasoning suggests which hypotheses appear more likely than others, the qualitative physical model is used to determine which hypotheses explain which observations, and another process combines these functionalities to construct a composite hypothesis based on explanatory power and plausibility. We speculate that the reasoning architecture of our system is generally applicable to complex domains in which a less-than-perfect physical model and less-than-perfect experiential knowledge need to be combined to perform diagnosis.

  11. DBD-Hunter: a knowledge-based method for the prediction of DNA-protein interactions.

    PubMed

    Gao, Mu; Skolnick, Jeffrey

    2008-07-01

    The structures of DNA-protein complexes have illuminated the diversity of DNA-protein binding mechanisms shown by different protein families. This lack of generality could pose a great challenge for predicting DNA-protein interactions. To address this issue, we have developed a knowledge-based method, DNA-binding Domain Hunter (DBD-Hunter), for identifying DNA-binding proteins and associated binding sites. The method combines structural comparison and the evaluation of a statistical potential, which we derive to describe interactions between DNA base pairs and protein residues. We demonstrate that DBD-Hunter is an accurate method for predicting DNA-binding function of proteins, and that DNA-binding protein residues can be reliably inferred from the corresponding templates if identified. In benchmark tests on approximately 4000 proteins, our method achieved an accuracy of 98% and a precision of 84%, which significantly outperforms three previous methods. We further validate the method on DNA-binding protein structures determined in DNA-free (apo) state. We show that the accuracy of our method is only slightly affected on apo-structures compared to the performance on holo-structures cocrystallized with DNA. Finally, we apply the method to approximately 1700 structural genomics targets and predict that 37 targets with previously unknown function are likely to be DNA-binding proteins. DBD-Hunter is freely available at http://cssb.biology.gatech.edu/skolnick/webservice/DBD-Hunter/.
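
    The statistical potential mentioned above is, in the general case, built by comparing observed contact counts with a reference (expected) distribution in an inverse-Boltzmann fashion. The sketch below shows that generic construction and a simple interface-scoring loop; it is not the DBD-Hunter potential itself, and the kT value, pseudocount, and array layout are assumptions.

```python
import numpy as np

def statistical_potential(observed, expected, kT=0.59, pseudo=1e-6):
    """Inverse-Boltzmann pair potential (roughly kcal/mol near 300 K) from
    observed vs. reference contact counts, e.g. a (n_base_types, n_residue_types)
    matrix of DNA base-protein residue contacts."""
    observed = np.asarray(observed, dtype=float) + pseudo
    expected = np.asarray(expected, dtype=float) + pseudo
    return -kT * np.log(observed / expected)

def score_interface(contacts, potential):
    """Sum the potential over the (base_type_index, residue_type_index) contacts
    found at a candidate protein-DNA interface; lower scores are more favorable."""
    return sum(potential[b, r] for b, r in contacts)
```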

  12. Data mining and intelligent queries in a knowledge-based multimedia medical database system

    NASA Astrophysics Data System (ADS)

    Zhang, Shuhua; Coleman, John D.

    2000-04-01

    Multimedia medical databases have accumulated large quantities of data and information about patients and their medical conditions. Patterns and relationships within this data could provide new knowledge for making better medical decisions. Unfortunately, few technologies have been developed and applied to discover and use this hidden knowledge. We are currently developing a next generation knowledge-based multimedia medical database, named MedBase, with advanced behaviors for data analysis and data fusion. As part of this R&D effort, a knowledge-rich data model is constructed to incorporate data mining techniques/tools to assist the building of medical knowledge bases, and to facilitate intelligent answering of users' investigative and knowledge queries in the database. Techniques such as data generalization, classification, clustering, semantic structures, and concept hierarchies, are used to acquire and represent both symbolic and spatial knowledge implicit in the database. With the availability of semantic structures, concept hierarchies and generalized knowledge, queries may be posed and answered at multiple levels of abstraction. In this article we provide a general description of the approaches and efforts undertaken so far in the MedBase project.

  13. Ada and knowledge-based systems: A prototype combining the best of both worlds

    NASA Technical Reports Server (NTRS)

    Brauer, David C.

    1986-01-01

    A software architecture is described which facilitates the construction of distributed expert systems using Ada and selected knowledge based systems. This architecture was utilized in the development of a Knowledge-based Maintenance Expert System (KNOMES) prototype for the Space Station Mobile Service Center (MSC). The KNOMES prototype monitors a simulated data stream from MSC sensors and built-in test equipment. It detects anomalies in the data and performs diagnosis to determine the cause. The software architecture which supports the KNOMES prototype allows for the monitoring and diagnosis tasks to be performed concurrently. The basic concept of this software architecture is named ACTOR (Ada Cognitive Task ORganization Scheme). An individual ACTOR is a modular software unit which contains both standard data processing and artificial intelligence components. A generic ACTOR module contains Ada packages for communicating with other ACTORs and accessing various data sources. The knowledge based component of an ACTOR determines the role it will play in a system. In this prototype, an ACTOR will monitor the MSC data stream.

  14. VIP: A knowledge-based design aid for the engineering of space systems

    NASA Technical Reports Server (NTRS)

    Lewis, Steven M.; Bellman, Kirstie L.

    1990-01-01

    The Vehicles Implementation Project (VIP), a knowledge-based design aid for the engineering of space systems is described. VIP combines qualitative knowledge in the form of rules, quantitative knowledge in the form of equations, and other mathematical modeling tools. The system allows users rapidly to develop and experiment with models of spacecraft system designs. As information becomes available to the system, appropriate equations are solved symbolically and the results are displayed. Users may browse through the system, observing dependencies and the effects of altering specific parameters. The system can also suggest approaches to the derivation of specific parameter values. In addition to providing a tool for the development of specific designs, VIP aims at increasing the user's understanding of the design process. Users may rapidly examine the sensitivity of a given parameter to others in the system and perform tradeoffs or optimizations of specific parameters. A second major goal of VIP is to integrate the existing corporate knowledge base of models and rules into a central, symbolic form.

  15. Sensorimotor representation and knowledge-based reasoning for spatial exploration and localisation.

    PubMed

    Zetzsche, C; Wolter, J; Schill, K

    2008-12-01

    We investigate a hybrid system for autonomous exploration and navigation, and implement it in a virtual mobile agent, which operates in virtual spatial environments. The system is based on several distinguishing properties. The representation is not map-like, but based on sensorimotor features, i.e. on combinations of sensory features and motor actions. The system has a hybrid architecture, which integrates a bottom-up processing of sensorimotor features with a top-down, knowledge-based reasoning strategy. This strategy selects the optimal motor action in each step according to the principle of maximum information gain. Two sensorimotor levels with different behavioural granularity are implemented, a macro-level, which controls the movements of the agent in space, and a micro-level, which controls its eye movements. At each level, the same type of hybrid architecture and the same principle of information gain are used for sensorimotor control. The localisation performance of the system is tested with large sets of virtual rooms containing different mixtures of unique and non-unique objects. The results demonstrate that the system efficiently performs those exploratory motor actions that yield a maximum amount of information about the current environment. Localisation is typically achieved within a few steps. Furthermore, the computational complexity of the underlying computations is limited, and the system is robust with respect to minor variations in the spatial environments.
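
    The action-selection rule described above, choosing the motor action with maximum expected information gain, can be written compactly for a discrete belief over hypotheses and a discrete observation model. The sketch below is a generic formulation under those assumptions; the predict() interface and variable names are illustrative, not the authors' implementation.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def best_action(belief, actions, predict):
    """Choose the action with maximal expected information gain.
    belief: probability vector over candidate hypotheses (e.g. locations).
    predict(action, hypothesis_index): distribution over discrete observations."""
    belief = np.asarray(belief, dtype=float)
    h_now = entropy(belief)
    gains = {}
    for a in actions:
        # P(observation | hypothesis, action) for every hypothesis, stacked row-wise.
        obs_dists = np.array([predict(a, h) for h in range(len(belief))])
        p_obs = belief @ obs_dists                        # marginal P(observation | action)
        expected_posterior_entropy = 0.0
        for o, po in enumerate(p_obs):
            if po > 0:
                posterior = belief * obs_dists[:, o] / po  # Bayes update
                expected_posterior_entropy += po * entropy(posterior)
        gains[a] = h_now - expected_posterior_entropy
    return max(gains, key=gains.get)
```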

  16. Knowledge-based automated technique for measuring total lung volume from CT

    NASA Astrophysics Data System (ADS)

    Brown, Matthew S.; McNitt-Gray, Michael F.; Mankovich, Nicholas J.; Goldin, Jonathan G.; Aberle, Denise R.

    1996-04-01

    A robust, automated technique has been developed for estimating total lung volumes from chest computed tomography (CT) images. The technique includes a method for segmenting major chest anatomy. A knowledge-based approach automates the calculation of separate volumes of the whole thorax, lungs, and central tracheo-bronchial tree from volumetric CT data sets. A simple, explicit 3D model describes properties such as shape, topology and X-ray attenuation, of the relevant anatomy, which constrain the segmentation of these anatomic structures. Total lung volume is estimated as the sum of the right and left lungs and excludes the central airways. The method requires no operator intervention. In preliminary testing, the system was applied to image data from two healthy subjects and four patients with emphysema who underwent both helical CT and pulmonary function tests. To obtain single breath-hold scans, the healthy subjects were scanned with a collimation of 5 mm and a pitch of 1.5, while the emphysema patients were scanned with collimation of 10 mm at a pitch of 2.0. CT data were reconstructed as contiguous image sets. Automatically calculated volumes were consistent with body plethysmography results (< 10% difference).
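
    At its simplest, the volume computation described above reduces to counting lung voxels and multiplying by the voxel volume. The sketch below is a deliberately simplified stand-in for the paper's model-driven segmentation: it thresholds air-like voxels, removes a supplied central-airway mask, and keeps the two largest connected components. The HU thresholds are typical values, not the paper's constraints.

```python
import numpy as np
from scipy import ndimage

def total_lung_volume(ct_hu, voxel_volume_ml, airway_mask):
    """Estimate total lung volume (ml) from a CT volume in Hounsfield units.
    Air-like voxels are selected, a supplied central-airway mask is removed, and
    only the two largest connected components (right and left lung) are kept."""
    air_like = (ct_hu > -1000) & (ct_hu < -400)    # typical lung HU range (assumed)
    lung = air_like & ~airway_mask
    labels, n = ndimage.label(lung)
    if n > 2:
        sizes = ndimage.sum(lung, labels, index=range(1, n + 1))
        keep = np.argsort(sizes)[-2:] + 1          # labels of the two largest regions
        lung = np.isin(labels, keep)
    return lung.sum() * voxel_volume_ml
```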

  17. Knowledge-based factor analysis of multidimensional nuclear medicine image sequences

    NASA Astrophysics Data System (ADS)

    Yap, Jeffrey T.; Chen, Chin-Tu; Cooper, Malcolm; Treffert, Jon D.

    1994-05-01

    We have developed a knowledge-based approach to analyzing dynamic nuclear medicine data sets using factor analysis. Prior knowledge is used as constraints to produce factor images and their associated time functions which are physically and physiologically realistic. These methods have been applied to both planar and tomographic image sequences acquired using various single-photon emitting and positron emitting radiotracers. Computer-simulated data, non-human primate studies, and human clinical studies have been used to develop and evaluate the methodology. The organ systems studied include the kidneys, heart, brain, liver, and bone. The factors generated represent various isolated aspects of physiologic function, such as tissue perfusion and clearance. In some clinical studies, the factors have indicated the potential to isolate diseased tissue from normally functioning tissue. In addition, the factor analysis of data acquired using newly developed radioligands has shown the ability to differentiate the specific binding of the radioligand to the targeted receptors from the non-specific binding. This suggests the potential use of factor analysis in the development and evaluation of radiolabeled compounds as well as in the investigation of specific receptor systems and their role in diagnosing disease.
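
    One common way to impose the kind of physical realism described above is a non-negativity-constrained factorization of the dynamic study into factor images and time-activity curves. The sketch below uses NMF as a stand-in; the authors' method applies additional prior-knowledge constraints not reproduced here, and the frame layout and factor count are assumptions.

```python
import numpy as np
from sklearn.decomposition import NMF

def factor_images(frames, n_factors=3):
    """frames: (n_timepoints, n_x, n_y) dynamic study with non-negative counts.
    Returns factor images and their time-activity curves from a
    non-negativity-constrained factorization."""
    t, nx, ny = frames.shape
    X = frames.reshape(t, nx * ny)
    model = NMF(n_components=n_factors, init="nndsvda", max_iter=500, random_state=0)
    time_curves = model.fit_transform(X)                    # (t, n_factors)
    factor_imgs = model.components_.reshape(n_factors, nx, ny)
    return factor_imgs, time_curves
```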

  18. Chemogenomics knowledgebased polypharmacology analyses of drug abuse related G-protein coupled receptors and their ligands

    PubMed Central

    Xie, Xiang-Qun; Wang, Lirong; Liu, Haibin; Ouyang, Qin; Fang, Cheng; Su, Weiwei

    2013-01-01

    Drug abuse (DA) and addiction is a complex illness, broadly viewed as a neurobiological impairment with genetic and environmental factors that influence its development and manifestation. Abused substances can disrupt the activity of neurons by interacting with many proteins, particularly G-protein coupled receptors (GPCRs). A few medicines that target the central nervous system (CNS) can also modulate DA-related proteins, such as GPCRs, which can act in conjunction with the controlled psychoactive substance(s) and increase side effects. To fully explore the molecular interaction networks that underlie DA and to effectively modulate the GPCRs in these networks with small molecules for DA treatment, we built a drug-abuse domain-specific chemogenomics knowledgebase (DA-KB) to centralize the reported chemogenomics research information related to DA and CNS disorders in an effort to benefit researchers across a broad range of disciplines. We then focus on the analysis of GPCRs, as many of them are closely related to DA. Their distribution in human tissues was also analyzed for the study of side effects caused by abused drugs. We further implement our computational algorithms/tools to explore DA targets, DA mechanisms and pathways involved in polydrug addiction and to explore polypharmacological effects of the GPCR ligands. Finally, the polypharmacology effects of GPCR-targeted medicines for DA treatment were investigated, and such effects can be exploited for the development of drugs with polypharmacophores for DA intervention. The chemogenomics database and the analysis tools will help us better understand the mechanisms of drug abuse and facilitate the design of new medications for systems pharmacotherapy of DA. PMID:24567719

  19. A Knowledge-Based Weighting Framework to Boost the Power of Genome-Wide Association Studies

    PubMed Central

    Li, Miao-Xin; Sham, Pak C.; Cherny, Stacey S.; Song, You-Qiang

    2010-01-01

    Background We are moving to second-wave analysis of genome-wide association studies (GWAS), characterized by comprehensive bioinformatical and statistical evaluation of genetic associations. Existing biological knowledge is very valuable for GWAS and may help improve their detection power, particularly for disease susceptibility loci of moderate effect size. However, a challenging question is how to utilize available resources, which are very heterogeneous, to quantitatively evaluate the statistical significance of associations. Methodology/Principal Findings We present a novel knowledge-based weighting framework to boost the power of GWAS and strengthen their explorative performance for follow-up replication and deep sequencing. Built upon diverse integrated biological knowledge, this framework directly models both the prior functional information and the association significances emerging from GWAS to optimally highlight single nucleotide polymorphisms (SNPs) for subsequent replication. In theoretical calculation and computer simulation, it shows great potential to achieve over 15% additional power to identify an association signal of moderate strength, or to reach similar power with hundreds fewer whole-genome subjects. In a case study on late-onset Alzheimer disease (LOAD), as a proof of principle, it highlighted some genes that showed positive association with LOAD in previous independent studies, as well as two important LOAD-related pathways. These genes and pathways could otherwise be ignored because the SNPs involved have only moderate association significance. Conclusions/Significance With a user-friendly implementation in an open-source Java package, this powerful framework will provide an important complementary solution to identify more true susceptibility loci with modest or even small effect size in current GWAS for complex diseases. PMID:21217833
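
    A common way to turn prior biological knowledge into a power boost, and a reasonable mental model for the framework above, is weighted multiple-testing adjustment: each SNP's p-value is divided by a prior weight whose average is one, which preserves the usual family-wise error guarantee of a Bonferroni-style correction. The sketch below is that generic weighted-Bonferroni idea, not the paper's specific Java implementation; how annotations become weights is left to the caller.

```python
import numpy as np

def weighted_adjusted_pvalues(pvalues, prior_weights):
    """Weighted-Bonferroni style re-weighting: divide each SNP's p-value by a
    positive prior weight, with weights rescaled to average 1 so the standard
    family-wise error guarantee is preserved."""
    p = np.asarray(pvalues, dtype=float)
    w = np.asarray(prior_weights, dtype=float)
    w = w * len(w) / w.sum()                 # rescale so that mean(w) == 1
    return np.minimum(p / w, 1.0)

# Hypothetical example: the second SNP sits in a pathway flagged by prior knowledge.
print(weighted_adjusted_pvalues([1e-6, 5e-7, 2e-3], [0.5, 3.0, 0.5]))
```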

  20. A knowledge-based imaging informatics approach for managing proton beam therapy of cancer patients.

    PubMed

    Liu, Brent J

    2007-08-01

    The need for a unified patient-oriented information system to handle complex proton therapy (PT) imaging and informatics data during the course of patient treatment is becoming steadily apparent due to the ever increasing demands for better diagnostic treatment planning and more accurate information. Currently, this information is scattered throughout each of the different treatment and information systems in the oncology department. Furthermore, the lack of organization with standardized methods makes it difficult and time-consuming to navigate through the maze of data, resulting in challenges during patient treatment planning. We present a methodology to develop this electronic patient record (ePR) system based on DICOM standards and perform knowledge-based medical imaging informatics research on specific clinical scenarios where patients are treated with PT. Treatment planning is similar in workflow to traditional radiation therapy (RT) methods such as intensity-modulated radiation therapy (IMRT), which utilizes a priori knowledge to drive the treatment plan in an inverse manner. In March 2006, two new RT objects were drafted in a DICOM-RT Supplement 102 specifically for ion therapy, which includes PT. The standardization of DICOM-RT-ION objects and the development of a knowledge base as well as decision-support tools that can be add-on features to the ePR DICOM-RT system were researched. This methodology can be used to extend to PT and the development of future clinical decision-making scenarios during the course of the patient's treatment that utilize "inverse treatment planning." We present the initial steps of this imaging and informatics methodology for PT and lay the foundation for development of future decision-support tools tailored to cancer patients treated with PT. By integrating decision-support knowledge and tools designed to assist in the decision-making process, a new and improved "knowledge-enhanced treatment planning" approach can be realized.

  1. Transferability of Empirical Potentials and the Knowledgebase of Interatomic Models (KIM)

    NASA Astrophysics Data System (ADS)

    Karls, Daniel S.

    Empirical potentials have proven to be an indispensable tool in understanding complex material behavior at the atomic scale due to their unrivaled computational efficiency. However, as they are currently used in the materials community, the realization of their full utility is stifled by a number of implementational difficulties. An emerging project specifically aimed at addressing these problems is the Knowledgebase of Interatomic Models (KIM). The primary purpose of KIM is to serve as an open-source, publicly accessible repository of standardized implementations of empirical potentials (Models), simulation codes which use them to compute material properties (Tests), and first-principles/experimental data corresponding to these properties (Reference Data). Aside from eliminating the redundant expenditure of scientific resources and the irreproducibility of results computed using empirical potentials, a unique benefit offered by KIM is the ability to gain a further understanding of a Model's transferability, i.e. its ability to make accurate predictions for material properties which it was not fitted to reproduce. In the present work, we begin by surveying the various classes of mathematical representations of atomic environments which are used to define empirical potentials. We then proceed to offer a broad characterization of empirical potentials in the context of machine learning which reveals three distinct categories with which any potential may be associated. Combining one of the aforementioned representations of atomic environments with a suitable regression technique, we define the Regression Algorithm for Transferability Estimation (RATE), which permits a quantitative estimation of the transferability of an arbitrary potential. Finally, we demonstrate the application of RATE on a specific training set consisting of bulk structures, clusters, surfaces, and nanostructures of silicon. A specific analysis of the underlying quantities inferred by RATE which are

  2. MetRxn: a knowledgebase of metabolites and reactions spanning metabolic models and databases

    PubMed Central

    2012-01-01

    Background Increasingly, metabolite and reaction information is organized in the form of genome-scale metabolic reconstructions that describe the reaction stoichiometry, directionality, and gene to protein to reaction associations. A key bottleneck in the pace of reconstruction of new, high-quality metabolic models is the inability to directly make use of metabolite/reaction information from biological databases or other models due to incompatibilities in content representation (i.e., metabolites with multiple names across databases and models), stoichiometric errors such as elemental or charge imbalances, and incomplete atomistic detail (e.g., use of generic R-group or non-explicit specification of stereo-specificity). Description MetRxn is a knowledgebase that includes standardized metabolite and reaction descriptions by integrating information from BRENDA, KEGG, MetaCyc, Reactome.org and 44 metabolic models into a single unified data set. All metabolite entries have matched synonyms, resolved protonation states, and are linked to unique structures. All reaction entries are elementally and charge balanced. This is accomplished through the use of a workflow of lexicographic, phonetic, and structural comparison algorithms. MetRxn allows for the download of standardized versions of existing genome-scale metabolic models and the use of metabolic information for the rapid reconstruction of new ones. Conclusions The standardization in description allows for the direct comparison of the metabolite and reaction content between metabolic models and databases and the exhaustive prospecting of pathways for biotechnological production. This ever-growing dataset currently consists of over 76,000 metabolites participating in more than 72,000 reactions (including unresolved entries). MetRxn is hosted on a web-based platform that uses relational database models (MySQL). PMID:22233419
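
    One of the standardization checks described above, elemental balancing of reactions, is easy to illustrate. The sketch below parses simple molecular formulas (no nested groups or charges, unlike the full MetRxn workflow) and verifies that element counts match across a reaction; it is an illustration of the idea, not MetRxn code.

```python
import re
from collections import Counter

def parse_formula(formula):
    """Very small parser for flat formulas like 'C6H12O6' (no nested groups)."""
    counts = Counter()
    for element, num in re.findall(r"([A-Z][a-z]?)(\d*)", formula):
        counts[element] += int(num) if num else 1
    return counts

def is_elementally_balanced(substrates, products):
    """substrates/products: lists of (stoichiometric_coefficient, formula)."""
    def total(side):
        tot = Counter()
        for coeff, formula in side:
            for element, n in parse_formula(formula).items():
                tot[element] += coeff * n
        return tot
    return total(substrates) == total(products)

# Example: glucose + 6 O2 -> 6 CO2 + 6 H2O is elementally balanced.
print(is_elementally_balanced([(1, "C6H12O6"), (6, "O2")],
                              [(6, "CO2"), (6, "H2O")]))
```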

  3. Knowledge-Based Methods To Train and Optimize Virtual Screening Ensembles

    PubMed Central

    2016-01-01

    Ensemble docking can be a successful virtual screening technique that addresses the innate conformational heterogeneity of macromolecular drug targets. Yet, lacking a method to identify a subset of conformational states that effectively segregates active and inactive small molecules, ensemble docking may result in the recommendation of a large number of false positives. Here, three knowledge-based methods that construct structural ensembles for virtual screening are presented. Each method selects ensembles by optimizing an objective function calculated using the receiver operating characteristic (ROC) curve: either the area under the ROC curve (AUC) or a ROC enrichment factor (EF). As the number of receptor conformations, N, becomes large, the methods differ in their asymptotic scaling. Given a set of small molecules with known activities and a collection of target conformations, the most resource-intensive method is guaranteed to find the optimal ensemble but scales as O(2^N). A recursive approximation to the optimal solution scales as O(N^2), and a more severe approximation leads to a faster method that scales linearly, O(N). The techniques are generally applicable to any system, and we demonstrate their effectiveness on the androgen nuclear hormone receptor (AR), cyclin-dependent kinase 2 (CDK2), and the peroxisome proliferator-activated receptor δ (PPAR-δ) drug targets. Conformations that consisted of a crystal structure and molecular dynamics simulation cluster centroids were used to form AR and CDK2 ensembles. Multiple available crystal structures were used to form PPAR-δ ensembles. For each target, we show that the three methods perform similarly to one another on both the training and test sets. PMID:27097522
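
    The sketch below illustrates the general flavor of the cheaper approximations: a greedy forward selection of receptor conformations that maximizes the ROC AUC of the ensemble score, taken as the best docking score over the selected conformations. It is a generic sketch, not the paper's exact recursive or linear algorithm; the scoring convention (lower is better) and the maximum ensemble size are assumptions.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def greedy_ensemble(scores, labels, max_size=5):
    """Greedy forward selection of receptor conformations.
    scores: (n_conformations, n_ligands) docking scores, lower = better;
    labels: 1 for actives, 0 for decoys. A ligand's ensemble score is its best
    (minimum) score over the selected conformations."""
    scores = np.asarray(scores, dtype=float)
    chosen, best_auc = [], -np.inf
    while len(chosen) < max_size:
        best_candidate = None
        for i in range(scores.shape[0]):
            if i in chosen:
                continue
            ensemble_score = np.min(scores[chosen + [i]], axis=0)
            auc = roc_auc_score(labels, -ensemble_score)   # negate: lower = more active
            if auc > best_auc:
                best_auc, best_candidate = auc, i
        if best_candidate is None:                         # no conformation improves the AUC
            break
        chosen.append(best_candidate)
    return chosen, best_auc
```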

  4. A knowledge-based approach to estimating the magnitude and spatial patterns of potential threats to soil biodiversity.

    PubMed

    Orgiazzi, Alberto; Panagos, Panos; Yigini, Yusuf; Dunbar, Martha B; Gardi, Ciro; Montanarella, Luca; Ballabio, Cristiano

    2016-03-01

    Because of the increasing pressures exerted on soil, below-ground life is under threat. Knowledge-based rankings of potential threats to different components of soil biodiversity were developed in order to assess the spatial distribution of threats on a European scale. A list of 13 potential threats to soil biodiversity was proposed to experts with different backgrounds in order to assess their potential impact on three major components of soil biodiversity: soil microorganisms, fauna, and biological functions. This approach allowed us to obtain knowledge-based rankings of threats. These classifications formed the basis for the development of indices through an additive aggregation model that, along with ad-hoc proxies for each pressure, allowed us to preliminarily assess the spatial patterns of potential threats. Intensive exploitation was identified as the highest pressure. In contrast, the use of genetically modified organisms in agriculture was considered the threat with the least potential. The potential impact of climate change showed the highest uncertainty. Fourteen of the 27 countries considered have more than 40% of their soils with moderate-high to high potential risk for all three components of soil biodiversity. Arable soils are the most exposed to pressures. Soils within the boreal biogeographic region showed the lowest risk potential. The majority of soils at risk are outside the boundaries of protected areas. First maps of risks to three components of soil biodiversity based on the current scientific knowledge were developed. Despite the intrinsic limits of knowledge-based assessments, a remarkable potential risk to soil biodiversity was observed. Guidelines to preliminarily identify and circumscribe soils potentially at risk are provided. This approach may be used in future research to assess threat at both local and global scale and identify areas of possible risk and, subsequently, design appropriate strategies for monitoring and protection of soil

  5. Development of Design-a-Trial, a knowledge-based critiquing system for authors of clinical trial protocols.

    PubMed

    Wyatt, J C; Altman, D G; Heathfield, H A; Pantin, C F

    1994-06-01

    Many published clinical trials are poorly designed, suggesting that the protocol was incomplete, disorganised or contained errors. This fact, together with doctors' limited statistical skills and the shortage of medical statisticians, prompted us to develop a knowledge-based aid, Design-a-Trial, for authors of clinical trial protocols. This interviews a physician, prompts them with suitable design options, comments on the statistical rigour and feasibility of their proposed design and generates a 6-page draft protocol document. This paper outlines the process used to develop Design-a-Trial, presents preliminary evaluation results, and discusses lessons we learned which may apply to the development of other medical decision aids.

  6. Personal profile of medical students selected through a knowledge-based exam only: are we missing suitable students?

    PubMed Central

    Abbiati, Milena; Baroffio, Anne; Gerbase, Margaret W.

    2016-01-01

    Introduction A consistent body of literature highlights the importance of a broader approach to selecting medical school candidates, assessing both cognitive capacity and individual characteristics. However, selection in a great number of medical schools worldwide is still based on knowledge exams, a procedure that might neglect students with the personal characteristics needed for future medical practice. We investigated whether the personal profile of students selected through a knowledge-based exam differed from that of students not selected. Methods Students applying for medical school (N=311) completed questionnaires assessing motivations for becoming a doctor, learning approaches, personality traits, empathy, and coping styles. Selection was based on the results of MCQ tests. Principal component analysis was used to draw a profile of the students. Differences between selected and non-selected students were examined by multivariate ANOVAs, and their impact on selection by logistic regression analysis. Results Students demonstrating a profile of diligence, with higher conscientiousness, a deep learning approach, and task-focused coping, were more frequently selected (p=0.01). Other personal characteristics such as motivation, sociability, and empathy did not significantly differ between selected and non-selected students. Conclusion Selection through a knowledge-based exam privileged diligent students. It neither advantaged nor precluded candidates with a more humane profile. PMID:27079886

  7. Development of the Knowledge-based & Empirical Combined Scoring Algorithm (KECSA) to Score Protein-Ligand Interactions

    PubMed Central

    Zheng, Zheng

    2013-01-01

    We describe a novel knowledge-based protein-ligand scoring function that employs a new definition for the reference state, allowing us to relate a statistical potential to a Lennard-Jones (LJ) potential. In this way, the LJ potential parameters were generated from protein-ligand complex structural data contained in the PDB. Forty-nine types of atomic pairwise interactions were derived using this method, which we call the knowledge-based and empirical combined scoring algorithm (KECSA). Two validation benchmarks were introduced to test the performance of KECSA. The first validation benchmark included two test sets that address the training-set dependence and the enthalpy/entropy treatment of KECSA. The second validation benchmark suite included two large-scale and five small-scale test sets to compare the reproducibility of KECSA with respect to two empirical scoring functions previously developed in our laboratory (LISA and LISA+), as well as to other well-known scoring methods. Validation results illustrate that KECSA shows improved performance in all test sets when compared with other scoring methods, especially in its ability to minimize the RMSE. LISA and LISA+ displayed similar performance using the correlation coefficient and Kendall τ as the metric of quality for some of the small test sets. Further pathways for improvement are discussed which would make KECSA more sensitive to subtle changes in ligand structure. PMID:23560465
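
    The core relationship in the abstract, turning pair statistics into a potential and expressing it in LJ form, can be sketched generically: an inverse-Boltzmann potential of mean force is computed from observed versus reference distance histograms, and LJ parameters are then fitted to it. This is an illustration of the general idea, not KECSA's reference state; the kT value, initial guesses, and choice of fitting routine are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def pmf_from_counts(r_centers, counts, reference, kT=0.59, pseudo=1e-6):
    """Inverse-Boltzmann potential of mean force (roughly kcal/mol near 300 K)
    from observed vs. reference pair-distance histograms for one atom-type pair."""
    g = (np.asarray(counts, float) + pseudo) / (np.asarray(reference, float) + pseudo)
    return -kT * np.log(g)

def lennard_jones(r, epsilon, sigma):
    return 4.0 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)

def fit_lj(r_centers, pmf):
    """Least-squares fit of LJ parameters to the statistical potential."""
    p0 = [0.2, 3.5]                      # initial guesses: kcal/mol, angstrom (assumed)
    (epsilon, sigma), _ = curve_fit(lennard_jones, r_centers, pmf, p0=p0)
    return epsilon, sigma
```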

  8. A community effort towards a knowledge-base and mathematical model of the human pathogen Salmonella Typhimurium LT2

    SciTech Connect

    Thiele, Ines; Hyduke, Daniel R.; Steeb, Benjamin; Fankam, Guy; Allen, Douglas K.; Bazzani, Susanna; Charusanti, Pep; Chen, Feng-Chi; Fleming, Ronan MT; Hsiung, Chao A.; De Keersmaecker, Sigrid CJ; Liao, Yu-Chieh; Marchal, Kathleen; Mo, Monica L.; Özdemir, Emre; Raghunathan, Anu; Reed, Jennifer L.; Shin, Sook-Il; Sigurbjörnsdóttir, Sara; Steinmann, Jonas; Sudarsan, Suresh; Swainston, Neil; Thijs, Inge M.; Zengler, Karsten; Palsson, Bernhard O.; Adkins, Joshua N.; Bumann, Dirk

    2011-01-01

    Metabolic reconstructions (MRs) are common denominators in systems biology and represent biochemical, genetic, and genomic (BiGG) knowledge-bases for target organisms by capturing currently available information in a consistent, structured manner. Salmonella enterica subspecies I serovar Typhimurium is a human pathogen that causes various diseases, and its increasing antibiotic resistance poses a public health problem. Here, we describe a community-driven effort in which more than 20 experts in S. Typhimurium biology and systems biology collaborated to reconcile and expand the S. Typhimurium BiGG knowledge-base. The consensus MR was obtained starting from two independently developed MRs for S. Typhimurium. Key results of this reconstruction jamboree include i) the development and implementation of a community-based workflow for MR annotation and reconciliation; ii) the incorporation of thermodynamic information; and iii) the use of the consensus MR to identify potential multi-target drug therapy approaches. Taken together, with the growing number of parallel MRs, a structured, community-driven approach will be necessary to maximize quality while increasing the adoption of MRs in experimental design and interpretation.
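
    The standard computation performed with such a reconstruction is flux balance analysis: maximize the flux through an objective reaction subject to steady-state mass balance and flux bounds. The sketch below shows that generic linear program, not code from the jamboree itself; the stoichiometric matrix, bounds, and objective index are supplied by the caller.

```python
import numpy as np
from scipy.optimize import linprog

def flux_balance_analysis(S, lower, upper, objective_index):
    """Maximize flux through one objective reaction subject to steady state
    (S v = 0) and flux bounds. S: (n_metabolites, n_reactions) stoichiometric
    matrix; lower/upper: per-reaction flux bounds."""
    S = np.asarray(S, dtype=float)
    c = np.zeros(S.shape[1])
    c[objective_index] = -1.0                    # linprog minimizes, so negate
    res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]),
                  bounds=list(zip(lower, upper)), method="highs")
    return res.x, -res.fun                       # optimal fluxes, objective value
```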

  9. Development of an intelligent interface for adding spatial objects to a knowledge-based geographic information system

    NASA Technical Reports Server (NTRS)

    Campbell, William J.; Goettsche, Craig

    1989-01-01

    Earth scientists lack adequate tools for quantifying complex relationships between existing data layers and for studying and modeling the dynamic interactions of these data layers. There is a need for an earth systems tool to manipulate multi-layered, heterogeneous data sets that are spatially indexed, such as sensor imagery and maps, easily and intelligently in a single system. The system can access and manipulate data from multiple sensor sources, from maps, and from a learned object hierarchy using an advanced knowledge-based geographic information system. A prototype Knowledge-Based Geographic Information System (KBGIS) was recently constructed. Many of the system internals are well developed, but the system lacks an adequate user interface. A methodology is described for developing an intelligent user interface and extending KBGIS to interconnect with existing NASA systems, such as imagery from the Land Analysis System (LAS), atmospheric data in Common Data Format (CDF), and visualization of complex data with the National Space Science Data Center Graphics System. This would allow NASA to quickly explore the utility of such a system, given the ability to transfer data in and out of KBGIS easily. The use and maintenance of the object hierarchies as polymorphic data types brings to data management a whole new set of problems and issues, few of which have been explored above the prototype level.

  10. MO-FG-303-03: Demonstration of Universal Knowledge-Based 3D Dose Prediction

    SciTech Connect

    Shiraishi, S; Moore, K L

    2015-06-15

    Purpose: To demonstrate a knowledge-based 3D dose prediction methodology that can accurately predict achievable radiotherapy dose distributions. Methods: Using previously treated plans as input, an artificial neural network (ANN) was trained to predict 3D dose distributions based on 14 patient-specific anatomical parameters, including the distance (r) to the planning target volume (PTV) boundary, organ-at-risk (OAR) boundary distances, and angular position (θ,φ). 23 prostate and 49 stereotactic radiosurgery (SRS) cases with ≥1 nearby OARs were studied. All were planned with volumetric-modulated arc therapy (VMAT) to prescription doses of 81 Gy for prostate and 12-30 Gy for SRS. Site-specific ANNs were trained using all 23 prostate plans and a randomly selected subset of 24 plans for the SRS model. The remaining 25 SRS plans were used to validate the model. To quantify predictive accuracy, the dose difference between the clinical plan and the prediction was calculated on a voxel-by-voxel basis, δD(r,θ,φ) = Dclin(r,θ,φ) − Dpred(r,θ,φ). Grouping voxels by boundary distance, the mean <δD_r> = (1/N) Σ_{θ,φ} δD(r,θ,φ) and inter-quartile range (IQR) quantified the accuracy of this method for deriving DVH estimations. The standard deviation (σ) of δD quantified the 3D dose prediction error on a voxel-by-voxel basis. Results: The ANNs were highly accurate in predictive ability for both prostate and SRS plans. For prostate, <δD_r> ranged from −0.8% to +0.6% (max IQR=3.8%) over r=0-32 mm, while 3D dose prediction accuracy averaged from σ=5-8% across the same range. For SRS, from r=0-34 mm the training set <δD_r> ranged from −3.7% to +1.5% (max IQR=4.4%), while the validation set <δD_r> ranged from −2.2% to +5.8% (max IQR=5.3%). 3D dose prediction accuracy averaged σ=2.5% for the training set and σ=4.0% over the same interval. Conclusion: The study demonstrates this technique's ability to predict achievable 3D dose distributions for VMAT SRS and prostate. Future
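
    A minimal sketch of the reported accuracy metrics: voxel-wise dose differences are grouped by distance to the PTV boundary, and the mean, IQR, and standard deviation are computed per distance bin. The normalization to percent of prescription and the bin width are assumptions.

```python
import numpy as np

def dose_difference_stats(d_clin, d_pred, boundary_dist, bin_mm=2.0):
    """Voxel-wise dose differences grouped by distance to the PTV boundary.
    Doses are assumed to be normalized to the prescription, so deltaD is in %.
    Returns {bin_start_mm: (mean, IQR, standard deviation)}."""
    delta = 100.0 * (np.asarray(d_clin, float) - np.asarray(d_pred, float))
    bins = np.floor(np.asarray(boundary_dist, float) / bin_mm).astype(int)
    stats = {}
    for b in np.unique(bins):
        vals = delta[bins == b]
        q1, q3 = np.percentile(vals, [25, 75])
        stats[b * bin_mm] = (vals.mean(), q3 - q1, vals.std())
    return stats
```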

  11. Architecture for Knowledge-Based and Federated Search of Online Clinical Evidence

    PubMed Central

    Walther, Martin; Nguyen, Ken; Lovell, Nigel H

    2005-01-01

    Background It is increasingly difficult for clinicians to keep up-to-date with the rapidly growing biomedical literature. Online evidence retrieval methods are now seen as a core tool to support evidence-based health practice. However, standard search engine technology is not designed to manage the many different types of evidence sources that are available or to handle the very different information needs of various clinical groups, who often work in widely different settings. Objectives The objectives of this paper are (1) to describe the design considerations and system architecture of a wrapper-mediator approach to federated search system design, including the use of knowledge-based, meta-search filters, and (2) to analyze the implications of system design choices on performance measurements. Methods A trial was performed to evaluate the technical performance of a federated evidence retrieval system, which provided access to eight distinct online resources, including e-journals, PubMed, and electronic guidelines. The Quick Clinical system architecture utilized a universal query language to reformulate queries internally and utilized meta-search filters to optimize search strategies across resources. We recruited 227 family physicians from across Australia who used the system to retrieve evidence in a routine clinical setting over a 4-week period. The total search time for a query was recorded, along with the duration of individual queries sent to different online resources. Results Clinicians performed 1662 searches over the trial. The average search duration was 4.9 ± 3.2 s (N = 1662 searches). Mean search duration to the individual sources was between 0.05 s and 4.55 s. Average system time (ie, system overhead) was 0.12 s. Conclusions The relatively small system overhead compared to the average time it takes to perform a search for an individual source shows that the system achieves a good trade-off between performance and reliability. Furthermore, despite

  12. Knowledge-based prediction of plan quality metrics in intracranial stereotactic radiosurgery

    SciTech Connect

    Shiraishi, Satomi; Moore, Kevin L.; Tan, Jun; Olsen, Lindsey A.

    2015-02-15

    Purpose: The objective of this work was to develop a comprehensive knowledge-based methodology for predicting achievable dose-volume histograms (DVHs) and highly precise DVH-based quality metrics (QMs) in stereotactic radiosurgery/radiotherapy (SRS/SRT) plans. Accurate QM estimation can identify suboptimal treatment plans and provide target optimization objectives to standardize and improve treatment planning. Methods: Correlating observed dose as it relates to the geometric relationship of organs-at-risk (OARs) to planning target volumes (PTVs) yields mathematical models to predict achievable DVHs. In SRS, DVH-based QMs such as brain V10Gy (volume receiving 10 Gy or more), gradient measure (GM), and conformity index (CI) are used to evaluate plan quality. This study encompasses 223 linear accelerator-based SRS/SRT treatment plans (SRS plans) using volumetric-modulated arc therapy (VMAT), representing 95% of the institution's VMAT radiosurgery load from the past four and a half years. Unfiltered models that use all available plans for the model training were built for each category, with a stratification scheme based on target and OAR characteristics determined through the initial modeling process. Model predictive accuracy is measured by the mean and standard deviation of the difference between clinical and predicted QMs, δQM = QM_clin − QM_pred, and a coefficient of determination, R². For categories with a large number of plans, refined models are constructed by automatic elimination of suspected suboptimal plans from the training set. Using the refined model as a presumed achievable standard, potentially suboptimal plans are identified. Predictions of QM improvement are validated via standardized replanning of 20 suspected suboptimal plans based on dosimetric predictions. The significance of the QM improvement is evaluated using the Wilcoxon signed rank test. Results: The most accurate predictions are obtained when plans are
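
    The plan-quality metrics named above have widely used (though not universal) definitions that can be computed directly from a dose grid: an RTOG-style conformity index, a gradient measure based on equivalent-sphere radii of the prescription and half-prescription isodose volumes, and V10Gy. The sketch below uses those common definitions as assumptions; local conventions may differ from the paper's.

```python
import numpy as np

def sphere_radius_cm(volume_cc):
    """Radius (cm) of a sphere with the given volume (cc)."""
    return (3.0 * volume_cc / (4.0 * np.pi)) ** (1.0 / 3.0)

def srs_quality_metrics(dose_gy, target_mask, voxel_cc, rx_dose_gy):
    """Plan-quality metrics from a 3D dose grid (common but not universal
    definitions): RTOG-style CI = prescription isodose volume / target volume,
    GM = equivalent-sphere radius of the half-prescription isodose volume minus
    that of the prescription isodose volume, and brain V10Gy in cc."""
    v_rx = np.count_nonzero(dose_gy >= rx_dose_gy) * voxel_cc
    v_half = np.count_nonzero(dose_gy >= 0.5 * rx_dose_gy) * voxel_cc
    v_10gy = np.count_nonzero(dose_gy >= 10.0) * voxel_cc
    target_cc = np.count_nonzero(target_mask) * voxel_cc
    return {"V10Gy_cc": v_10gy,
            "CI": v_rx / target_cc,
            "GM_cm": sphere_radius_cm(v_half) - sphere_radius_cm(v_rx)}
```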

  13. Reflexive Professionalism as a Second Generation of Evidence-Based Practice: Some Considerations on the Special Issue "What Works? Modernizing the Knowledge-Base of Social Work"

    ERIC Educational Resources Information Center

    Otto, Hans-Uwe; Polutta, Andreas; Ziegler, Holger

    2009-01-01

    This article refers sympathetically to the thoughtful debates and positions in the "Research on Social Work Practice" ("RSWP"; Special Issue, July, 2008 issue) on "What Works? Modernizing the Knowledge-Base of Social Work." It highlights the need for empirical efficacy and effectiveness research in social work and…

  14. An Emerging Knowledge-Based Economy in China? Indicators from OECD Databases. OECD Science, Technology and Industry Working Papers, 2004/4

    ERIC Educational Resources Information Center

    Criscuolo, Chiara; Martin, Ralf

    2004-01-01

    The main objective of this Working Paper is to show a set of indicators on the knowledge-based economy for China, mainly compiled from databases within EAS, although data from databases maintained by other parts of the OECD are included as well. These indicators are put in context by comparison with data for the United States, Japan and the EU (or…

  15. Considering Human Capital Theory in Assessment and Training: Mapping the Gap between Current Skills and the Needs of a Knowledge-Based Economy in Northeast Iowa

    ERIC Educational Resources Information Center

    Mihm-Herold, Wendy

    2010-01-01

    In light of the current economic downturn, thousands of Iowans are unemployed and this is the ideal time to build the skills of the workforce to compete in the knowledge-based economy so businesses and entrepreneurs can compete in a global economy. A tool for assessing the skills and knowledge of dislocated workers and students as well as…

  16. New Learning Models for the New Knowledge-Based Economy: Professional and Local-Personal Networks as a Source of Knowledge Development in the Multimedia Sector.

    ERIC Educational Resources Information Center

    Tremblay, Diane-Gabrielle

    The role of professional and local-personal networks as a source of knowledge development in the new knowledge-based economy was examined in a 15-month study that focuses on people working in the multimedia industry in Montreal, Quebec. The study focused on the modes of exchange and learning, collaborative work, and management and development of…

  17. Creating a Knowledge-Based Economy in the United Arab Emirates: Realising the Unfulfilled Potential of Women in the Science, Technology and Engineering Fields

    ERIC Educational Resources Information Center

    Aswad, Noor Ghazal; Vidican, Georgeta; Samulewicz, Diana

    2011-01-01

    As the United Arab Emirates (UAE) moves towards a knowledge-based economy, maximising the participation of the national workforce, especially women, in the transformation process is crucial. Using survey methods and semi-structured interviews, this paper examines the factors that influence women's decisions regarding their degree programme and…

  18. Photon Optimizer (PO) prevails over Progressive Resolution Optimizer (PRO) for VMAT planning with or without knowledge-based solution.

    PubMed

    Jiang, Fan; Wu, Hao; Yue, Haizhen; Jia, Fei; Zhang, Yibao

    2017-03-01

    The enhanced dosimetric performance of knowledge-based volumetric modulated arc therapy (VMAT) planning might derive jointly from the patient-specific optimization objectives, as estimated by the RapidPlan model, and from the potentially improved Photon Optimizer (PO) algorithm relative to the previous Progressive Resolution Optimizer (PRO) engine. As PO is mandatory for RapidPlan estimation but optional for conventional manual planning, comparing the two optimizers may provide practical guidelines for algorithm selection, because knowledge-based planning may not replace the current method completely in the short run. Using a previously validated dose-volume histogram (DVH) estimation model which can produce clinically acceptable plans automatically for rectal cancer patients without interactive manual adjustment, this study reoptimized 30 historically approved plans (referred to as clinical plans, which were created manually with PRO) with the RapidPlan solution (PO plans). Then the PRO algorithm was utilized to optimize the plans again using the same dose-volume constraints as the PO plans, where the line objectives were converted into a series of point objectives automatically (PRO plans). On the basis of comparable target dose coverage, the combined application of the new objectives and the PO algorithm significantly reduced the organs-at-risk (OAR) exposure by 23.49-32.72% relative to the clinical plans. These discrepancies were largely preserved after substituting PRO for PO, indicating the dosimetric improvements were mostly attributable to the refined objectives. Therefore, Eclipse users of earlier versions may instantly benefit from adopting the model-generated objectives from other RapidPlan-equipped centers, even with the PRO algorithm. However, the additional contribution of PO relative to PRO accounted for 1.54-3.74%, suggesting PO should be selected with priority whenever available, with or without the RapidPlan solution as a purchasable package. Significantly

  19. Computational and human observer image quality evaluation of low dose, knowledge-based CT iterative reconstruction

    PubMed Central

    Eck, Brendan L.; Fahmi, Rachid; Brown, Kevin M.; Zabic, Stanislav; Raihani, Nilgoun; Miao, Jun; Wilson, David L.

    2015-01-01

    Purpose: Aims in this study are to (1) develop a computational model observer which reliably tracks the detectability of human observers in low dose computed tomography (CT) images reconstructed with knowledge-based iterative reconstruction (IMR™, Philips Healthcare) and filtered back projection (FBP) across a range of independent variables, (2) use the model to evaluate detectability trends across reconstructions and make predictions of human observer detectability, and (3) perform human observer studies based on model predictions to demonstrate applications of the model in CT imaging. Methods: Detectability (d′) was evaluated in phantom studies across a range of conditions. Images were generated using a numerical CT simulator. Trained observers performed 4-alternative forced choice (4-AFC) experiments across dose (1.3, 2.7, 4.0 mGy), pin size (4, 6, 8 mm), contrast (0.3%, 0.5%, 1.0%), and reconstruction (FBP, IMR), at fixed display window. A five-channel Laguerre–Gauss channelized Hotelling observer (CHO) was developed with internal noise added to the decision variable and/or to channel outputs, creating six different internal noise models. Semianalytic internal noise computation was tested against Monte Carlo and used to accelerate internal noise parameter optimization. Model parameters were estimated from all experiments at once using maximum likelihood on the probability correct, PC. Akaike information criterion (AIC) was used to compare models of different orders. The best model was selected according to AIC and used to predict detectability in blended FBP-IMR images, analyze trends in IMR detectability improvements, and predict dose savings with IMR. Predicted dose savings were compared against 4-AFC study results using physical CT phantom images. Results: Detection in IMR was greater than FBP in all tested conditions. The CHO with internal noise proportional to channel output standard deviations, Model-k4, showed the best trade-off between fit and
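    To make the model-observer machinery concrete, here is a rough numpy sketch of a Laguerre-Gauss channelized Hotelling observer with internal noise added to the channel outputs in proportion to their standard deviation (loosely in the spirit of the "Model-k4" variant named above). The channel width, simulated images, and noise constant are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def laguerre_gauss_channels(n_pix, n_channels=5, a=10.0):
    """Rotationally symmetric Laguerre-Gauss channel profiles on an n_pix x n_pix grid
    (the channel width 'a', in pixels, is an assumed value)."""
    y, x = np.mgrid[:n_pix, :n_pix] - (n_pix - 1) / 2.0
    u = 2.0 * np.pi * (x ** 2 + y ** 2) / a ** 2
    chans = []
    for j in range(n_channels):
        Lj = np.polynomial.laguerre.Laguerre.basis(j)(u)        # j-th Laguerre polynomial
        chans.append(np.sqrt(2.0) / a * np.exp(-u / 2.0) * Lj)
    return np.stack([c.ravel() for c in chans], axis=1)         # (n_pix**2, n_channels)

def cho_dprime(signal_imgs, noise_imgs, channels, internal_noise_k=0.0, rng=None):
    """Channelized Hotelling observer detectability d' with optional internal noise
    proportional to the channel-output standard deviation."""
    rng = rng if rng is not None else np.random.default_rng(0)
    vs = signal_imgs.reshape(len(signal_imgs), -1) @ channels   # channel outputs, signal present
    vn = noise_imgs.reshape(len(noise_imgs), -1) @ channels     # channel outputs, signal absent
    if internal_noise_k > 0:
        sd = internal_noise_k * vn.std(axis=0)
        vs = vs + rng.normal(0.0, sd, vs.shape)
        vn = vn + rng.normal(0.0, sd, vn.shape)
    dv = vs.mean(axis=0) - vn.mean(axis=0)
    S = 0.5 * (np.cov(vs, rowvar=False) + np.cov(vn, rowvar=False))
    return float(np.sqrt(dv @ np.linalg.solve(S, dv)))

# Tiny demo with simulated noisy backgrounds and a faint Gaussian "pin" (illustrative only)
rng = np.random.default_rng(1)
n = 32
U = laguerre_gauss_channels(n)
background = rng.normal(0.0, 1.0, (200, n, n))
pin = 0.3 * np.exp(-((np.mgrid[:n, :n] - n / 2.0) ** 2).sum(axis=0) / 20.0)
print(round(cho_dprime(background + pin, background, U, internal_noise_k=0.5), 2))
```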

  20. Intelligent personal navigator supported by knowledge-based systems for estimating dead reckoning navigation parameters

    NASA Astrophysics Data System (ADS)

    Moafipoor, Shahram

    Personal navigators (PN) have been studied for about a decade in different fields and applications, such as safety and rescue operations, security and emergency services, and police and military applications. The common goal of all these applications is to provide precise and reliable position, velocity, and heading information of each individual in various environments. In the PN system developed in this dissertation, the underlying assumption is that the system does not require pre-existing infrastructure to enable pedestrian navigation. To facilitate this capability, a multisensor system concept, based on the Global Positioning System (GPS), inertial navigation, barometer, magnetometer, and a human pedometry model has been developed. An important aspect of this design is to use the human body as navigation sensor to facilitate Dead Reckoning (DR) navigation in GPS-challenged environments. The system is designed predominantly for outdoor environments, where occasional loss of GPS lock may happen; however, testing and performance demonstration have been extended to indoor environments. DR navigation is based on a relative-measurement approach, with the key idea of integrating the incremental motion information in the form of step direction (SD) and step length (SL) over time. The foundation of the intelligent navigation system concept proposed here rests in exploiting the human locomotion pattern, as well as change of locomotion in varying environments. In this context, the term intelligent navigation represents the transition from the conventional point-to-point DR to dynamic navigation using the knowledge about the mechanism of the moving person. This approach increasingly relies on integrating knowledge-based systems (KBS) and artificial intelligence (AI) methodologies, including artificial neural networks (ANN) and fuzzy logic (FL). In addition, a general framework of the quality control for the real-time validation of the DR processing is proposed, based on a
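    A minimal sketch of the dead-reckoning core described above: integrating per-step step-length (SL) and step-direction (SD) estimates into a 2-D position track. The step values and the east/north heading convention are assumptions for illustration; the actual system fuses GPS, inertial, barometer, magnetometer, and pedometry information and validates the result with a quality-control framework.

```python
import math

def dead_reckon(start_xy, steps):
    """Integrate per-step (step_length_m, heading_deg) pairs into a 2-D track.

    Heading is assumed to be measured clockwise from north, as a compass would report it.
    """
    x, y = start_xy
    track = [(x, y)]
    for step_length, heading_deg in steps:
        h = math.radians(heading_deg)
        x += step_length * math.sin(h)   # east component
        y += step_length * math.cos(h)   # north component
        track.append((x, y))
    return track

# Hypothetical pedometry output: (SL in metres, SD in degrees)
print(dead_reckon((0.0, 0.0), [(0.7, 45), (0.7, 45), (0.8, 90)]))
```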

  1. An intelligent, knowledge-based multiple criteria decision making advisor for systems design

    NASA Astrophysics Data System (ADS)

    Li, Yongchang

    of an appropriate decision making method. Furthermore, some DMs may use only one or two specific methods which they are familiar with or trust, without realizing that these may be inappropriate for certain classes of problems, thus yielding erroneous results. These issues reveal that, in order to ensure a good decision, a suitable decision method should be chosen before the decision making process proceeds. The first part of this dissertation proposes an MCDM process supported by an intelligent, knowledge-based advisor system referred to as the Multi-Criteria Interactive Decision-Making Advisor and Synthesis process (MIDAS), which is able to facilitate the selection of the most appropriate decision making method and which provides insight to the user for fulfilling different preferences. The second part of this dissertation presents an autonomous decision making advisor which is capable of dealing with ever-evolving real-time information and making autonomous decisions under uncertain conditions. The advisor encompasses a Markov Decision Process (MDP) formulation which takes uncertainty into account when determining the best action for each system state. (Abstract shortened by UMI.)

  2. Computational and human observer image quality evaluation of low dose, knowledge-based CT iterative reconstruction

    SciTech Connect

    Eck, Brendan L.; Fahmi, Rachid; Miao, Jun; Brown, Kevin M.; Zabic, Stanislav; Raihani, Nilgoun; Wilson, David L.

    2015-10-15

    Purpose: Aims in this study are to (1) develop a computational model observer which reliably tracks the detectability of human observers in low dose computed tomography (CT) images reconstructed with knowledge-based iterative reconstruction (IMR™, Philips Healthcare) and filtered back projection (FBP) across a range of independent variables, (2) use the model to evaluate detectability trends across reconstructions and make predictions of human observer detectability, and (3) perform human observer studies based on model predictions to demonstrate applications of the model in CT imaging. Methods: Detectability (d′) was evaluated in phantom studies across a range of conditions. Images were generated using a numerical CT simulator. Trained observers performed 4-alternative forced choice (4-AFC) experiments across dose (1.3, 2.7, 4.0 mGy), pin size (4, 6, 8 mm), contrast (0.3%, 0.5%, 1.0%), and reconstruction (FBP, IMR), at fixed display window. A five-channel Laguerre–Gauss channelized Hotelling observer (CHO) was developed with internal noise added to the decision variable and/or to channel outputs, creating six different internal noise models. Semianalytic internal noise computation was tested against Monte Carlo and used to accelerate internal noise parameter optimization. Model parameters were estimated from all experiments at once using maximum likelihood on the probability correct, P{sub C}. Akaike information criterion (AIC) was used to compare models of different orders. The best model was selected according to AIC and used to predict detectability in blended FBP-IMR images, analyze trends in IMR detectability improvements, and predict dose savings with IMR. Predicted dose savings were compared against 4-AFC study results using physical CT phantom images. Results: Detection in IMR was greater than FBP in all tested conditions. The CHO with internal noise proportional to channel output standard deviations, Model-k4, showed the best trade-off between fit

  3. Ontology Language to Support Description of Experiment Control System Semantics, Collaborative Knowledge-Base Design and Ontology Reuse

    SciTech Connect

    Vardan Gyurjyan, D Abbott, G Heyes, E Jastrzembski, B Moffit, C Timmer, E Wolin

    2009-10-01

    In this paper we discuss the control domain specific ontology that is built on top of the domain-neutral Resource Definition Framework (RDF). Specifically, we will discuss the relevant set of ontology concepts along with the relationships among them in order to describe experiment control components and generic event-based state machines. Control Oriented Ontology Language (COOL) is a meta-data modeling language that provides generic means for representation of physics experiment control processes and components, and their relationships, rules and axioms. It provides a semantic reference frame that is useful for automating the communication of information for configuration, deployment and operation. COOL has been successfully used to develop a complete and dynamic knowledge-base for experiment control systems, developed using the AFECS framework.

  4. Prediction of Slot Shape and Slot Size for Improving the Performance of Microstrip Antennas Using Knowledge-Based Neural Networks

    PubMed Central

    De, Asok

    2014-01-01

    In the last decade, artificial neural networks have become very popular techniques for computing different performance parameters of microstrip antennas. The proposed work illustrates a knowledge-based neural networks model for predicting the appropriate shape and accurate size of the slot introduced on the radiating patch for achieving desired level of resonance, gain, directivity, antenna efficiency, and radiation efficiency for dual-frequency operation. By incorporating prior knowledge in neural model, the number of required training patterns is drastically reduced. Further, the neural model incorporated with prior knowledge can be used for predicting response in extrapolation region beyond the training patterns region. For validation, a prototype is also fabricated and its performance parameters are measured. A very good agreement is attained between measured, simulated, and predicted results. PMID:27382616

  5. An approach to knowledge engineering to support knowledge-based simulation of payload ground processing at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Mcmanus, Shawn; Mcdaniel, Michael

    1989-01-01

    Planning for processing payloads was always difficult and time-consuming. With the advent of Space Station Freedom and its capability to support a myriad of complex payloads, the planning to support this ground processing maze involves thousands of man-hours of often tedious data manipulation. To provide the capability to analyze various processing schedules, an object oriented knowledge-based simulation environment called the Advanced Generic Accomodations Planning Environment (AGAPE) is being developed. Having nearly completed the baseline system, the emphasis in this paper is directed toward rule definition and its relation to model development and simulation. The focus is specifically on the methodologies implemented during knowledge acquisition, analysis, and representation within the AGAPE rule structure. A model is provided to illustrate the concepts presented. The approach demonstrates a framework for AGAPE rule development to assist expert system development.

  6. PCOSKB: A KnowledgeBase on genes, diseases, ontology terms and biochemical pathways associated with PolyCystic Ovary Syndrome.

    PubMed

    Joseph, Shaini; Barai, Ram Shankar; Bhujbalrao, Rasika; Idicula-Thomas, Susan

    2016-01-04

    Polycystic ovary syndrome (PCOS) is one of the major causes of female subfertility worldwide and ≈ 7-10% of women in reproductive age are affected by it. The affected individuals exhibit varying types and levels of comorbid conditions, along with the classical PCOS symptoms. Extensive studies on PCOS across diverse ethnic populations have resulted in a plethora of information on dysregulated genes, gene polymorphisms and diseases linked to PCOS. However, efforts have not been taken to collate and link these data. Our group, for the first time, has compiled PCOS-related information available through scientific literature; cross-linked it with molecular, biochemical and clinical databases and presented it as a user-friendly, web-based online knowledgebase for the benefit of the scientific and clinical community. Manually curated information on associated genes, single nucleotide polymorphisms, diseases, gene ontology terms and pathways along with supporting reference literature has been collated and included in PCOSKB (http://pcoskb.bicnirrh.res.in).

  7. The pan-genome: towards a knowledge-based discovery of novel targets for vaccines and antibacterials.

    PubMed

    Muzzi, Alessandro; Masignani, Vega; Rappuoli, Rino

    2007-06-01

    During the past decade, sequencing of the entire genome of pathogenic bacteria has become a widely used practice in microbiology research. More recently, sequence data from multiple isolates of a single pathogen have provided new insights into the microevolution of a species as well as helping researchers to decipher its virulence mechanisms. The comparison of multiple strains of a single species has resulted in the definition of the species pan-genome, as a measure of the total gene repertoire that can pertain to a given microorganism. This concept can be exploited not only to study the diversity of a species, but also, as we discuss here, to provide the opportunity to use a knowledge-based approach for the development of novel vaccine candidates and new-generation targets for antimicrobials.

  8. STRUTEX: A prototype knowledge-based system for initially configuring a structure to support point loads in two dimensions

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; Feyock, Stefan; Sobieszczanski-Sobieski, Jaroslaw

    1988-01-01

    The purpose of this research effort is to investigate the benefits that might be derived from applying artificial intelligence tools in the area of conceptual design. Therefore, the emphasis is on the artificial intelligence aspects of conceptual design rather than structural and optimization aspects. A prototype knowledge-based system, called STRUTEX, was developed to initially configure a structure to support point loads in two dimensions. This system combines numerical and symbolic processing by the computer with interactive problem solving aided by the vision of the user by integrating a knowledge base interface and inference engine, a data base interface, and graphics while keeping the knowledge base and data base files separate. The system writes a file which can be input into a structural synthesis system, which combines structural analysis and optimization.

  9. STRUTEX: A prototype knowledge-based system for initially configuring a structure to support point loads in two dimensions

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; Sobieszczanski-Sobieski, Jaroslaw

    1989-01-01

    Only recently have engineers begun making use of Artificial Intelligence (AI) tools in the area of conceptual design. To continue filling this void in the design process, a prototype knowledge-based system, called STRUTEX has been developed to initially configure a structure to support point loads in two dimensions. This prototype was developed for testing the application of AI tools to conceptual design as opposed to being a testbed for new methods for improving structural analysis and optimization. This system combines numerical and symbolic processing by the computer with interactive problem solving aided by the vision of the user. How the system is constructed to interact with the user is described. Of special interest is the information flow between the knowledge base and the data base under control of the algorithmic main program. Examples of computed and refined structures are presented during the explanation of the system.

  10. SU-F-BRA-13: Knowledge-Based Treatment Planning for Prostate LDR Brachytherapy Based On Principle Component Analysis

    SciTech Connect

    Roper, J; Bradshaw, B; Godette, K; Schreibmann, E; Chanyavanich, V

    2015-06-15

    Purpose: To create a knowledge-based algorithm for prostate LDR brachytherapy treatment planning that standardizes plan quality using seed arrangements tailored to individual physician preferences while being fast enough for real-time planning. Methods: A dataset of 130 prior cases was compiled for a physician with an active prostate seed implant practice. Ten cases were randomly selected to test the algorithm. Contours from the 120 library cases were registered to a common reference frame. Contour variations were characterized on a point-by-point basis using principal component analysis (PCA). A test case was converted to PCA vectors using the same process and then compared with each library case using a Mahalanobis distance to evaluate similarity. Rank-ordered PCA scores were used to select the best-matched library case. The seed arrangement was extracted from the best-matched case and used as a starting point for planning the test case. Computational time was recorded. Any subsequent modifications that required input from a treatment planner to achieve an acceptable plan were recorded. Results: The computational time required to register contours from a test case and evaluate PCA similarity across the library was approximately 10 s. Five of the ten test cases did not require any seed additions, deletions, or moves to obtain an acceptable plan. The remaining five test cases required on average 4.2 seed modifications. The time to complete manual plan modifications was less than 30 s in all cases. Conclusion: A knowledge-based treatment planning algorithm was developed for prostate LDR brachytherapy based on principal component analysis. Initial results suggest that this approach can be used to quickly create treatment plans that require few, if any, modifications by the treatment planner. In general, test case plans have seed arrangements which are very similar to prior cases, and thus are inherently tailored to physician preferences.
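    The matching step described above can be sketched roughly as follows (assumed, simplified code, not the authors' implementation): library contour feature vectors are projected onto a few principal components, the test case is projected with the same components, and the library case with the smallest Mahalanobis distance in PCA-score space is selected as the planning template.

```python
import numpy as np

def best_matched_case(library, query, n_components=5):
    """Project contour feature vectors with PCA and pick the library case whose
    PCA scores are closest to the query under the Mahalanobis distance."""
    mean = library.mean(axis=0)
    Xc = library - mean
    _, s, vt = np.linalg.svd(Xc, full_matrices=False)  # principal directions via SVD
    comps = vt[:n_components]                          # (k, n_features)
    lib_scores = Xc @ comps.T                          # (n_cases, k)
    q_scores = (query - mean) @ comps.T                # (k,)
    cov_inv = np.linalg.inv(np.cov(lib_scores, rowvar=False))
    dists = [float(np.sqrt((q_scores - ls) @ cov_inv @ (q_scores - ls))) for ls in lib_scores]
    return int(np.argmin(dists)), dists

# Hypothetical data: 120 library cases x 200 sampled contour coordinates, one test case
rng = np.random.default_rng(42)
library = rng.normal(size=(120, 200))
query = rng.normal(size=200)
idx, dists = best_matched_case(library, query)
print("best-matched library case:", idx)
```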

  11. Experiences in improving the state of the practice in verification and validation of knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Culbert, Chris; French, Scott W.; Hamilton, David

    1994-01-01

    Knowledge-based systems (KBS's) are in general use in a wide variety of domains, both commercial and government. As reliance on these types of systems grows, the need to assess their quality and validity reaches critical importance. As with any software, the reliability of a KBS can be directly attributed to the application of disciplined programming and testing practices throughout the development life-cycle. However, there are some essential differences between conventional software and KBSs, both in construction and use. The identification of these differences affect the verification and validation (V&V) process and the development of techniques to handle them. The recognition of these differences is the basis of considerable on-going research in this field. For the past three years IBM (Federal Systems Company - Houston) and the Software Technology Branch (STB) of NASA/Johnson Space Center have been working to improve the 'state of the practice' in V&V of Knowledge-based systems. This work was motivated by the need to maintain NASA's ability to produce high quality software while taking advantage of new KBS technology. To date, the primary accomplishment has been the development and teaching of a four-day workshop on KBS V&V. With the hope of improving the impact of these workshops, we also worked directly with NASA KBS projects to employ concepts taught in the workshop. This paper describes two projects that were part of this effort. In addition to describing each project, this paper describes problems encountered and solutions proposed in each case, with particular emphasis on implications for transferring KBS V&V technology beyond the NASA domain.

  12. An iterative knowledge-based scoring function to predict protein-ligand interactions: II. Validation of the scoring function.

    PubMed

    Huang, Sheng-You; Zou, Xiaoqin

    2006-11-30

    We have developed an iterative knowledge-based scoring function (ITScore) to describe protein-ligand interactions. Here, we assess ITScore through extensive tests on native structure identification, binding affinity prediction, and virtual database screening. Specifically, ITScore was first applied to a test set of 100 protein-ligand complexes constructed by Wang et al. (J Med Chem 2003, 46, 2287), and compared with 14 other scoring functions. The results show that ITScore yielded a high success rate of 82% in identifying native-like binding modes under the criterion of RMSD ≤ 2 Å for each top-ranked ligand conformation. The success rate increased to 98% if the top five conformations were considered for each ligand. In the case of binding affinity prediction, ITScore also obtained a good correlation for this test set (R = 0.65). Next, ITScore was used to predict binding affinities of a second diverse test set of 77 protein-ligand complexes prepared by Muegge and Martin (J Med Chem 1999, 42, 791), and compared with four other widely used knowledge-based scoring functions. ITScore yielded a high correlation of R² = 0.65 (or R = 0.81) in the affinity prediction. Finally, enrichment tests were performed with ITScore against four target proteins using the compound databases constructed by Jacobsson et al. (J Med Chem 2003, 46, 5781). The results were compared with those of eight other scoring functions. ITScore yielded high enrichments in all four database screening tests. ITScore can be easily combined with existing docking programs for use in structure-based drug design.
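    As a small illustration of the success-rate criterion used in the pose-identification test (a ligand counts as a success if a near-native pose with RMSD ≤ 2 Å appears among the top-N score-ranked conformations), here is a hedged Python sketch with invented RMSD values; it is not part of ITScore itself.

```python
def pose_success_rate(rmsd_by_ligand, top_n=1, cutoff=2.0):
    """Fraction of ligands with at least one near-native pose (RMSD <= cutoff, in angstroms)
    among the top_n score-ranked conformations."""
    hits = sum(1 for rmsds in rmsd_by_ligand if any(r <= cutoff for r in rmsds[:top_n]))
    return hits / len(rmsd_by_ligand)

# Hypothetical RMSDs (angstrom) of score-ranked poses for three ligands
ranked_rmsds = [[1.2, 3.4, 5.0], [2.8, 1.9, 4.1], [0.9, 1.1, 6.3]]
print(pose_success_rate(ranked_rmsds, top_n=1))   # 2/3: the second ligand fails at rank 1
print(pose_success_rate(ranked_rmsds, top_n=5))   # 3/3 once the top five poses are allowed
```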

  13. Applications of artificial intelligence 1993: Knowledge-based systems in aerospace and industry; Proceedings of the Meeting, Orlando, FL, Apr. 13-15, 1993

    NASA Technical Reports Server (NTRS)

    Fayyad, Usama M. (Editor); Uthurusamy, Ramasamy (Editor)

    1993-01-01

    The present volume on applications of artificial intelligence with regard to knowledge-based systems in aerospace and industry discusses machine learning and clustering, expert systems and optimization techniques, monitoring and diagnosis, and automated design and expert systems. Attention is given to the integration of AI reasoning systems and hardware description languages, case-based reasoning, knowledge retrieval and training systems, and scheduling and planning. Topics addressed include the preprocessing of remotely sensed data for efficient analysis and classification, autonomous agents as air combat simulation adversaries, intelligent data presentation for real-time spacecraft monitoring, and an integrated reasoner for diagnosis in satellite control. Also discussed are a knowledge-based system for the design of heat exchangers, reuse of design information for model-based diagnosis, automatic compilation of expert systems, and a case-based approach to handling aircraft malfunctions.

  14. Proceedings of the Conference on Knowledge-Based Software Assistant (5th) Held in Liverpool, New York on 24-28 September 1990

    DTIC Science & Technology

    1991-03-01

    Case Tool (PMCT) introduces an embedded, automated, knowledge-based approach to requirements traceability. Quality improvements can be made in the software... Traditional mappings generated by the bookkeeping/documentation approach provide no insight into the quality principles of Requirements Engineering... in the past. Pragmatically, the quality in general of a DGSS directly relates to the extent to which its implementation approaches the ideal in the

  15. XML-based data model and architecture for a knowledge-based grid-enabled problem-solving environment for high-throughput biological imaging.

    PubMed

    Ahmed, Wamiq M; Lenz, Dominik; Liu, Jia; Paul Robinson, J; Ghafoor, Arif

    2008-03-01

    High-throughput biological imaging uses automated imaging devices to collect a large number of microscopic images for analysis of biological systems and validation of scientific hypotheses. Efficient manipulation of these datasets for knowledge discovery requires high-performance computational resources, efficient storage, and automated tools for extracting and sharing such knowledge among different research sites. Newly emerging grid technologies provide powerful means for exploiting the full potential of these imaging techniques. Efficient utilization of grid resources requires the development of knowledge-based tools and services that combine domain knowledge with analysis algorithms. In this paper, we first investigate how grid infrastructure can facilitate high-throughput biological imaging research, and present an architecture for providing knowledge-based grid services for this field. We identify two levels of knowledge-based services. The first level provides tools for extracting spatiotemporal knowledge from image sets and the second level provides high-level knowledge management and reasoning services. We then present cellular imaging markup language, an extensible markup language-based language for modeling of biological images and representation of spatiotemporal knowledge. This scheme can be used for spatiotemporal event composition, matching, and automated knowledge extraction and representation for large biological imaging datasets. We demonstrate the expressive power of this formalism by means of different examples and extensive experimental results.

  16. Object recognition in brain CT-scans: Knowledge-based fusion of data from multiple feature extractors

    SciTech Connect

    Li, H.; Deklerck, R.; Cuyper, B. De; Nyssen, E.; Cornelis, J.; Hermanus, A.

    1995-06-01

    This paper describes a knowledge-based image interpretation system for the segmentation and labeling of a series of 2-D brain X-ray CT-scans, parallel to the orbito-metal plane. The system combines the image primitive information produced by different low level vision techniques in order to improve the reliability of the segmentation and the image interpretation. It is implemented in a blackboard environment that is holding various types of prior information and which controls the interpretation process. The scoring model is applied for the fusion of information derived from three types of image primitives (points, edges, and regions). A model, containing both analogical and propositional knowledge on the brain objects, is used to direct the interpretation process. The linguistic variables, introduced to describe the propositional features of the brain model, are defined by fuzzy membership functions. Constraint functions are applied to evaluate the plausibility of the mapping between image primitives and brain model data objects. Procedural knowledge has been integrated into different knowledge sources. Experimental results illustrate the reliability and robustness of the system against small variations in slice orientation and interpatient variability in the images.

  17. Data acquisition for a real time fault monitoring and diagnosis knowledge-based system for space power system

    NASA Technical Reports Server (NTRS)

    Wilhite, Larry D.; Lee, S. C.; Lollar, Louis F.

    1989-01-01

    The design and implementation of the real-time data acquisition and processing system employed in the AMPERES project is described, including effective data structures for efficient storage and flexible manipulation of the data by the knowledge-based system (KBS), the interprocess communication mechanism required between the data acquisition system and the KBS, and the appropriate data acquisition protocols for collecting data from the sensors. Sensor data are categorized as critical or noncritical data on the basis of the inherent frequencies of the signals and the diagnostic requirements reflected in their values. The critical data set contains 30 analog values and 42 digital values and is collected every 10 ms. The noncritical data set contains 240 analog values and is collected every second. The collected critical and noncritical data are stored in separate circular buffers. Buffers are created in shared memory to enable other processes, i.e., the fault monitoring and diagnosis process and the user interface process, to freely access the data sets.

  18. APL: An angle probability list to improve knowledge-based metaheuristics for the three-dimensional protein structure prediction.

    PubMed

    Borguesan, Bruno; Barbachan e Silva, Mariel; Grisci, Bruno; Inostroza-Ponta, Mario; Dorn, Márcio

    2015-12-01

    Tertiary protein structure prediction is one of the most challenging problems in structural bioinformatics. Despite the advances in algorithm development and computational strategies, predicting the folded structure of a protein only from its amino acid sequence remains an unsolved problem. We present a new computational approach to predict the native-like three-dimensional structure of proteins. Conformational preferences of amino acid residues and secondary structure information were obtained from protein templates stored in the Protein Data Bank and represented as an Angle Probability List. Two knowledge-based prediction methods based on Genetic Algorithms and Particle Swarm Optimization were developed using this information. The proposed method has been tested with twenty-six case studies selected to validate our approach with different classes of proteins and folding patterns. Stereochemical and structural analyses were performed for each predicted three-dimensional structure. The results achieved suggest that the Angle Probability List can improve the effectiveness of metaheuristics used to predict the three-dimensional structure of protein molecules by reducing the conformational search space.
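    A hedged sketch of how an Angle Probability List might be built and sampled (the binning, residue class, and angle values are assumptions for illustration, not the published implementation): observed (φ, ψ) pairs for one residue/secondary-structure class are histogrammed into a probability grid, and a metaheuristic can then draw torsion angles in proportion to those probabilities instead of uniformly.

```python
import numpy as np

def build_apl(observed_phi_psi, bins=36):
    """Histogram observed (phi, psi) pairs for one residue/secondary-structure class
    into a normalized Angle Probability List on a bins x bins grid over [-180, 180)."""
    hist, phi_edges, psi_edges = np.histogram2d(
        observed_phi_psi[:, 0], observed_phi_psi[:, 1],
        bins=bins, range=[[-180, 180], [-180, 180]])
    return hist / hist.sum(), phi_edges, psi_edges

def sample_apl(probs, phi_edges, psi_edges, rng):
    """Draw one (phi, psi) pair proportionally to the APL cell probabilities."""
    flat = rng.choice(probs.size, p=probs.ravel())
    i, j = np.unravel_index(flat, probs.shape)
    phi = rng.uniform(phi_edges[i], phi_edges[i + 1])
    psi = rng.uniform(psi_edges[j], psi_edges[j + 1])
    return phi, psi

rng = np.random.default_rng(1)
# Hypothetical observed torsion angles for, e.g., alanine residues in helices
obs = np.column_stack([rng.normal(-63, 8, 5000), rng.normal(-43, 8, 5000)])
apl, pe, se = build_apl(obs)
print(sample_apl(apl, pe, se, rng))
```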

  19. Knowledge-based expert systems and a proof-of-concept case study for multiple sequence alignment construction and analysis.

    PubMed

    Aniba, Mohamed Radhouene; Siguenza, Sophie; Friedrich, Anne; Plewniak, Frédéric; Poch, Olivier; Marchler-Bauer, Aron; Thompson, Julie Dawn

    2009-01-01

    The traditional approach to bioinformatics analyses relies on independent task-specific services and applications, using different input and output formats, often idiosyncratic, and frequently not designed to inter-operate. In general, such analyses were performed by experts who manually verified the results obtained at each step in the process. Today, the amount of bioinformatics information continuously being produced means that handling the various applications used to study this information presents a major data management and analysis challenge to researchers. It is now impossible to manually analyse all this information and new approaches are needed that are capable of processing the large-scale heterogeneous data in order to extract the pertinent information. We review the recent use of integrated expert systems aimed at providing more efficient knowledge extraction for bioinformatics research. A general methodology for building knowledge-based expert systems is described, focusing on the unstructured information management architecture, UIMA, which provides facilities for both data and process management. A case study involving a multiple alignment expert system prototype called AlexSys is also presented.

  20. PCOSKB: A KnowledgeBase on genes, diseases, ontology terms and biochemical pathways associated with PolyCystic Ovary Syndrome

    PubMed Central

    Joseph, Shaini; Barai, Ram Shankar; Bhujbalrao, Rasika; Idicula-Thomas, Susan

    2016-01-01

    Polycystic ovary syndrome (PCOS) is one of the major causes of female subfertility worldwide and ≈7–10% of women in reproductive age are affected by it. The affected individuals exhibit varying types and levels of comorbid conditions, along with the classical PCOS symptoms. Extensive studies on PCOS across diverse ethnic populations have resulted in a plethora of information on dysregulated genes, gene polymorphisms and diseases linked to PCOS. However, efforts have not been taken to collate and link these data. Our group, for the first time, has compiled PCOS-related information available through scientific literature; cross-linked it with molecular, biochemical and clinical databases and presented it as a user-friendly, web-based online knowledgebase for the benefit of the scientific and clinical community. Manually curated information on associated genes, single nucleotide polymorphisms, diseases, gene ontology terms and pathways along with supporting reference literature has been collated and included in PCOSKB (http://pcoskb.bicnirrh.res.in). PMID:26578565

  1. Rapid Design of Knowledge-Based Scoring Potentials for Enrichment of Near-Native Geometries in Protein-Protein Docking

    PubMed Central

    Sasse, Alexander; de Vries, Sjoerd J.; Schindler, Christina E. M.; de Beauchêne, Isaure Chauvot

    2017-01-01

    Protein-protein docking protocols aim to predict the structures of protein-protein complexes based on the structure of individual partners. Docking protocols usually include several steps of sampling, clustering, refinement and re-scoring. The scoring step is one of the bottlenecks in the performance of many state-of-the-art protocols. The performance of scoring functions depends on the quality of the generated structures and its coupling to the sampling algorithm. A tool kit, GRADSCOPT (GRid Accelerated Directly SCoring OPTimizing), was designed to allow rapid development and optimization of different knowledge-based scoring potentials for specific objectives in protein-protein docking. Different atomistic and coarse-grained potentials can be created by a grid-accelerated directly scoring dependent Monte-Carlo annealing or by a linear regression optimization. We demonstrate that the scoring functions generated by our approach are similar to or even outperform state-of-the-art scoring functions for predicting near-native solutions. Of additional importance, we find that potentials specifically trained to identify the native bound complex perform rather poorly on identifying acceptable or medium quality (near-native) solutions. In contrast, atomistic long-range contact potentials can increase the average fraction of near-native poses by up to a factor 2.5 in the best scored 1% decoys (compared to existing scoring), emphasizing the need of specific docking potentials for different steps in the docking protocol. PMID:28118389

  2. Development of a Knowledge-based Application Utilizing Ontologies for the Continuing Site-specific JJ1017 Master Maintenance.

    PubMed

    Kobayashi, Tatsuaki; Tsuji, Shintaro; Yagahara, Ayako; Tanikawa, Takumi; Umeda, Tokuo

    2015-07-01

    The purpose of this study was to develop the JJ1017 Knowledge-based Application (JKA) to support the continuing maintenance of a site-specific JJ1017 master defined by the JJ1017 guideline as a standard radiologic procedure master for medical information systems that are being adopted by some medical facilities in Japan. The method consisted of the following three steps: (1) construction of the JJ1017 Ontology (JJOnt) as a knowledge base using the Hozo (an environment for building/using ontologies); (2) development of modules (operation, I/O, graph modules) that are required to continue the maintenance of a site-specific JJ1017 master; and (3) unit testing of the JKA that consists of the JJOnt and the modules. As a result, the number of classes included in the JJOnt was 21,697. Within the radiologic procedure classes included in the above, the ratio of a JJ1017 master code for an external beam radiotherapy was the highest (51%). In unit testing of the JKA, we checked the main operations (e.g., keyword search of a JJ1017 master code/code meaning, editing the description of classes, etc.). The JJOnt is a knowledge base for implementing features that medical technologists find necessary in medical information systems. To enable medical technologists to exchange/retrieve semantically accurate information while using medical information systems in the future, we expect the JKA to support the maintenance and improvement of the site-specific JJ1017 master.

  3. Integrating knowledge-based multi-criteria evaluation techniques with GIS for landfill site selection: A case study using AHP

    NASA Astrophysics Data System (ADS)

    Fagbohun, B. J.; Aladejana, O. O.

    2016-09-01

    A major challenge in most growing urban areas of developing countries without a pre-existing land use plan is the sustainable and efficient management of solid waste. Siting a landfill is a complicated task because of several environmental regulations. This challenge creates the need to develop efficient strategies for the selection of proper waste disposal sites in accordance with all existing environmental regulations. This paper presents a knowledge-based multi-criteria decision analysis using GIS for the selection of a suitable landfill site in Ado-Ekiti, Nigeria. In order to identify suitable landfill sites, seven factors - land use/cover, geology, river, soil, slope, lineament and roads - were taken into consideration. Each factor was classified and ranked based on prior knowledge about the area and existing guidelines. Weights for each factor were determined through pair-wise comparison using Saaty's 9-point scale and AHP. The integration of factors according to their weights using weighted index overlay analysis revealed that 39.23 km² within the area was suitable for siting a landfill. The resulting suitable area was classified as high suitability covering 6.47 km² (16.49%), moderate suitability 25.48 km² (64.95%), and low suitability 7.28 km² (18.56%) based on their overall weights.
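    A minimal sketch of the AHP weighting and weighted index overlay steps described above (the pairwise judgments, the 3-factor example, and the tiny factor rasters are hypothetical; the study used seven factors): weights are taken from the principal eigenvector of a Saaty pairwise-comparison matrix, checked with the consistency ratio, and applied as a weighted sum over reclassified factor layers.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from a Saaty pairwise-comparison matrix: the principal
    eigenvector, normalized to sum to 1."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

def consistency_ratio(pairwise, weights, ri=0.58):
    """Saaty consistency ratio; ri=0.58 is the random index for a 3x3 matrix."""
    n = len(weights)
    lam_max = float(np.real(np.max(np.linalg.eigvals(pairwise))))
    ci = (lam_max - n) / (n - 1)
    return ci / ri

# Hypothetical 3-factor example (e.g., land use/cover, slope, distance to river)
A = np.array([[1, 3, 5],
              [1 / 3, 1, 3],
              [1 / 5, 1 / 3, 1]])
w = ahp_weights(A)
print("weights:", np.round(w, 3), "CR:", round(consistency_ratio(A, w), 3))

# Weighted index overlay: factor rasters scored 1-5, combined with the AHP weights
rng = np.random.default_rng(0)
factors = rng.integers(1, 6, (3, 4, 4))          # three 4x4 reclassified layers
suitability = np.tensordot(w, factors, axes=1)   # weighted sum per grid cell
print(suitability)
```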

  4. Structure for a knowledge-based system to estimate Soviet tactics in the air-land battle. Master's thesis

    SciTech Connect

    Fletcher, A.M.

    1988-03-01

    The purpose of this thesis was to build a prototype decision aid that can use knowledge about Soviet military doctrine and tactics to infer when, where, and how the Soviet Army plans to attack NATO defenses given intelligence data about Soviet (Red) military units, terrain data, and the positions of the NATO (Blue) defenses. Issues are raised that must be resolved before such a decision aid, which is part of the Rapid Application of Air Power concept, can become operational. First examined is the need to shorten the C2 decision cycle in order for the ATOC staff to keep pace with the tempo of modern warfare. The Rapid Application of Air Power is a concept that includes automating various steps in the decision cycle to allow air power to be applied proactively to stop Soviet forces before they obtain critical objectives. A structure is presented for automating the second step in the decision cycle, assessing and clarifying the situation, through a knowledge-based decision aid for interpreting intelligence data from the perspective of Soviet (Red) doctrine and estimating future Red tactical objectives and maneuvers.

  5. Towards knowledge-based systems in clinical practice: development of an integrated clinical information and knowledge management support system.

    PubMed

    Kalogeropoulos, Dimitris A; Carson, Ewart R; Collinson, Paul O

    2003-09-01

    Given that clinicians presented with identical clinical information will act in different ways, there is a need to introduce into routine clinical practice methods and tools to support the scientific homogeneity and accountability of healthcare decisions and actions. The benefits expected from such action include an overall reduction in cost, improved quality of care, and greater patient and public satisfaction. Computer-based medical data processing has yielded methods and tools for managing the task away from the hospital management level and closer to the desired disease and patient management level. To this end, advanced applications of information and disease process modelling technologies have already demonstrated an ability to significantly augment clinical decision making as a by-product. The widespread acceptance of evidence-based medicine as the basis of cost-conscious and quality-accountable clinical practice supports this claim. Electronic libraries are one step towards bringing this key healthcare quality-control environment online. Nonetheless, to date, the underlying information and knowledge management technologies have failed to be integrated into any form of pragmatic or marketable online and real-time clinical decision making tool. One of the main obstacles that needs to be overcome is the development of systems that treat both information and knowledge as clinical objects with the same modelling requirements. This paper describes the development of such a system in the form of an intelligent clinical information management system: a system which, at the most fundamental level of clinical decision support, facilitates the organised acquisition of clinical information and knowledge and provides a test-bed for the development and evaluation of knowledge-based decision support functions.

  6. Knowledge-Based Personal Health System to empower outpatients of diabetes mellitus by means of P4 Medicine.

    PubMed

    Bresó, Adrián; Sáez, Carlos; Vicente, Javier; Larrinaga, Félix; Robles, Montserrat; García-Gómez, Juan Miguel

    2015-01-01

    Diabetes Mellitus (DM) affects hundreds of millions of people worldwide and imposes a large economic burden on healthcare systems. We present a web-based patient empowerment system (PHSP4) that ensures continuous monitoring and assessment of the health state of patients with DM (type I and II). PHSP4 is a Knowledge-Based Personal Health System (PHS) which follows the trend of P4 Medicine (Personalized, Predictive, Preventive, and Participative). It provides messages to outpatients and clinicians about the achievement of objectives, follow-up, and treatments adjusted to the patient's condition. Additionally, it calculates a four-component risk vector for the pathologies associated with DM: nephropathy, diabetic retinopathy, diabetic foot, and cardiovascular event. The core of the system is a Rule-Based System whose Knowledge Base is composed of a set of rules implementing the recommendations of the American Diabetes Association (ADA) (American Diabetes Association: http://www.diabetes.org/ ) clinical guideline. PHSP4 is designed to be standardized and to facilitate interoperability by means of terminologies (SNOMED-CT [The International Health Terminology Standards Development Organization: http://www.ihtsdo.org/snomed-ct/ ] and UCUM [The Unified Code for Units of Measure: http://unitsofmeasure.org/ ]) and standardized clinical documents (HL7 CDA R2 [Health Level Seven International: http://www.hl7.org/index.cfm ]) for managing the Electronic Health Record (EHR). We have evaluated the functionality of the system and user acceptance using simulated and real data and a questionnaire based on the Technology Acceptance Model (TAM) methodology. Finally, the results show the reliability of the system and its high acceptance among clinicians.

  7. AlzPlatform: An Alzheimer’s Disease Domain-Specific Chemogenomics Knowledgebase for Polypharmacology and Target Identification Research

    PubMed Central

    2015-01-01

    Alzheimer’s disease (AD) is one of the most complicated progressive neurodegeneration diseases that involve many genes, proteins, and their complex interactions. No effective medicines or treatments are available yet to stop or reverse the progression of the disease due to its polygenic nature. To facilitate discovery of new AD drugs and better understand the AD neurosignaling pathways involved, we have constructed an Alzheimer’s disease domain-specific chemogenomics knowledgebase, AlzPlatform (www.cbligand.org/AD/) with cloud computing and sourcing functions. AlzPlatform is implemented with powerful computational algorithms, including our established TargetHunter, HTDocking, and BBB Predictor for target identification and polypharmacology analysis for AD research. The platform has assembled various AD-related chemogenomics data records, including 928 genes and 320 proteins related to AD, 194 AD drugs approved or in clinical trials, and 405 188 chemicals associated with 1 023 137 records of reported bioactivities from 38 284 corresponding bioassays and 10 050 references. Furthermore, we have demonstrated the application of the AlzPlatform in three case studies for identification of multitargets and polypharmacology analysis of FDA-approved drugs and also for screening and prediction of new AD active small chemical molecules and potential novel AD drug targets by our established TargetHunter and/or HTDocking programs. The predictions were confirmed by reported bioactivity data and our in vitro experimental validation. Overall, AlzPlatform will enrich our knowledge for AD target identification, drug discovery, and polypharmacology analyses and, also, facilitate the chemogenomics data sharing and information exchange/communications in aid of new anti-AD drug discovery and development. PMID:24597646

  8. Assessing side-chain perturbations of the protein backbone: a knowledge-based classification of residue Ramachandran space.

    PubMed

    Dahl, David B; Bohannan, Zach; Mo, Qianxing; Vannucci, Marina; Tsai, Jerry

    2008-05-02

    Grouping the 20 residues is a classic strategy to discover ordered patterns and insights about the fundamental nature of proteins, their structure, and how they fold. Usually, this categorization is based on the biophysical and/or structural properties of a residue's side-chain group. We extend this approach to understand the effects of side chains on backbone conformation and to perform a knowledge-based classification of amino acids by comparing their backbone phi, psi distributions in different types of secondary structure. At this finer, more specific resolution, torsion angle data are often sparse and discontinuous (especially for nonhelical classes) even though a comprehensive set of protein structures is used. To ensure the precision of Ramachandran plot comparisons, we applied a rigorous Bayesian density estimation method that produces continuous estimates of the backbone phi, psi distributions. Based on this statistical modeling, a robust hierarchical clustering was performed using a divergence score to measure the similarity between plots. There were seven general groups based on the clusters from the complete Ramachandran data: nonpolar/beta-branched (Ile and Val), AsX (Asn and Asp), long (Met, Gln, Arg, Glu, Lys, and Leu), aromatic (Phe, Tyr, His, and Cys), small (Ala and Ser), bulky (Thr and Trp), and, lastly, the singletons of Gly and Pro. At the level of secondary structure (helix, sheet, turn, and coil), these groups remain somewhat consistent, although there are a few significant variations. Besides the expected uniqueness of the Gly and Pro distributions, the nonpolar/beta-branched and AsX clusters were very consistent across all types of secondary structure. Effectively, this consistency across the secondary structure classes implies that side-chain steric effects strongly influence a residue's backbone torsion angle conformation. These results help to explain the plasticity of amino acid substitutions on protein structure and should help in
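    The comparison-and-clustering step can be illustrated with a rough sketch (assumed code, using the Jensen-Shannon divergence as a generic stand-in for the divergence score used in the paper, and random densities in place of the Bayesian estimates): each residue's discretized φ, ψ density is compared pairwise, and the resulting distance matrix is fed to average-linkage hierarchical clustering.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def js_divergence(p, q, eps=1e-12):
    """Symmetric Jensen-Shannon divergence between two discretized phi/psi densities."""
    p = p.ravel() + eps; p = p / p.sum()
    q = q.ravel() + eps; q = q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log(a / b))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Hypothetical 36x36 Ramachandran density estimates, one per residue type
rng = np.random.default_rng(0)
densities = {aa: rng.dirichlet(np.ones(36 * 36)).reshape(36, 36)
             for aa in ["ALA", "SER", "ILE", "VAL", "GLY"]}

names = list(densities)
D = np.zeros((len(names), len(names)))
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        D[i, j] = D[j, i] = js_divergence(densities[names[i]], densities[names[j]])

Z = linkage(squareform(D), method="average")   # average-linkage hierarchical clustering
print(dict(zip(names, fcluster(Z, t=3, criterion="maxclust"))))
```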

  9. Materials Characterization at Utah State University: Facilities and Knowledge-base of Electronic Properties of Materials Applicable to Spacecraft Charging

    NASA Technical Reports Server (NTRS)

    Dennison, J. R.; Thomson, C. D.; Kite, J.; Zavyalov, V.; Corbridge, Jodie

    2004-01-01

    In an effort to improve the reliability and versatility of spacecraft charging models designed to assist spacecraft designers in accommodating and mitigating the harmful effects of charging on spacecraft, the NASA Space Environments and Effects (SEE) Program has funded development of facilities at Utah State University for the measurement of the electronic properties of both conducting and insulating spacecraft materials. We present here an overview of our instrumentation and capabilities, which are particularly well suited to study electron emission as related to spacecraft charging. These measurements include electron-induced secondary and backscattered yields, spectra, and angular resolved measurements as a function of incident energy, species and angle, plus investigations of ion-induced electron yields, photoelectron yields, sample charging and dielectric breakdown. Extensive surface science characterization capabilities are also available to fully characterize the samples in situ. Our measurements for a wide array of conducting and insulating spacecraft materials have been incorporated into the SEE Charge Collector Knowledge-base as a Database of Electronic Properties of Materials Applicable to Spacecraft Charging. This Database provides an extensive compilation of electronic properties, together with parameterization of these properties in a format that can be easily used with existing spacecraft charging engineering tools and with next generation plasma, charging, and radiation models. Tabulated properties in the Database include: electron-induced secondary electron yield, backscattered yield and emitted electron spectra; He, Ar and Xe ion-induced electron yields and emitted electron spectra; photoyield and solar emittance spectra; and materials characterization including reflectivity, dielectric constant, resistivity, arcing, optical microscopy images, scanning electron micrographs, scanning tunneling microscopy images, and Auger electron spectra. Further

  10. Design and Implementation of Hydrologic Process Knowledge-base Ontology: A case study for the Infiltration Process

    NASA Astrophysics Data System (ADS)

    Elag, M.; Goodall, J. L.

    2013-12-01

    Hydrologic modeling often requires the re-use and integration of models from different disciplines to simulate complex environmental systems. Component-based modeling introduces a flexible approach for integrating physical-based processes across disciplinary boundaries. Several hydrologic-related modeling communities have adopted the component-based approach for simulating complex physical systems by integrating model components across disciplinary boundaries in a workflow. However, it is not always straightforward to create these interdisciplinary models due to the lack of sufficient knowledge about a hydrologic process. This shortcoming is a result of using informal methods for organizing and sharing information about a hydrologic process. A knowledge-based ontology provides such standards and is considered the ideal approach for overcoming this challenge. The aims of this research are to present the methodology used in analyzing the basic hydrologic domain in order to identify hydrologic processes, the ontology itself, and how the proposed ontology is integrated with the Water Resources Component (WRC) ontology. The proposed ontology standardizes the definitions of a hydrologic process, the relationships between hydrologic processes, and their associated scientific equations. The objective of the proposed Hydrologic Process (HP) Ontology is to advance the idea of creating a unified knowledge framework for components' metadata by introducing a domain-level ontology for hydrologic processes. The HP ontology is a step toward an explicit and robust domain knowledge framework that can be evolved through the contribution of domain users. Analysis of the hydrologic domain is accomplished using the Formal Concept Approach (FCA), in which the infiltration process, an important hydrologic process, is examined. Two infiltration methods, the Green-Ampt and Philip's methods, were used to demonstrate the implementation of information in the HP ontology. Furthermore, a SPARQL

  11. Sbexpert users guide (version 1.0): A knowledge-based decision-support system for spruce beetle management. Forest Service general technical report

    SciTech Connect

    Reynolds, K.M.; Holsten, E.H.; Werner, R.A.

    1995-03-01

    SBexpert version 1.0 is a knowledge-based decision-support system for spruce beetle management developed for use in Microsoft Windows. The user's guide provides detailed instructions on the use of all SBexpert features. SBexpert has four main subprograms: introduction, analysis, textbook, and literature. The introduction is the first of the five subtopics in the SBexpert help system. The analysis topic, the main analytical component of SBexpert, is an advisory system for spruce beetle management that provides recommendations for reducing spruce beetle hazard and risk to spruce stands. The textbook and literature topics provide complementary decision support for the analysis.

  12. Can knowledge-based N management produce more staple grain with lower greenhouse gas emission and reactive nitrogen pollution? A meta-analysis.

    PubMed

    Xia, Longlong; Lam, Shu Kee; Chen, Deli; Wang, Jinyang; Tang, Quan; Yan, Xiaoyuan

    2016-08-10

    Knowledge-based nitrogen (N) management, which is designed for better synchronization of crop N demand with N supply, is critical for global food security and environmental sustainability. Yet, a comprehensive assessment of how these N management practices affect food production, greenhouse gas (GHG) emission, and N pollution in China is lacking. We compiled the results of 376 studies (1166 observations) to evaluate the overall effects of seven knowledge-based N management practices on crop productivity, nitrous oxide (N2O) emission, and major reactive N (Nr) losses (ammonia, NH3; N leaching and runoff) for staple grain (rice, wheat, and corn) production in China. These practices included the application of controlled-release N fertilizer, nitrification inhibitor (NI) and urease inhibitor (UI), higher splitting frequency of fertilizer N application, lower basal N fertilizer (BF) proportion, deep placement of N fertilizer, and optimal N rate based on soil N testing. Our results showed that, compared to traditional N management, these knowledge-based N practices significantly increased grain yields by 1.3-10.0%, which is attributed to higher aboveground N uptake (5.1-12.1%) and N use efficiency in grain (8.0-48.2%). Moreover, these N management practices overall reduced GHG emission and Nr losses: by 5.4-39.8% for N2O emission, 30.7-61.5% for NH3 emission (except for NI application), 13.6-37.3% for N leaching, and 15.5-45.0% for N runoff. The use of NI increased NH3 emission by 27.5% (9.0-56.0%), which deserves extra attention. The cost-benefit analysis indicated that the yield profit of these N management practices exceeded the corresponding input cost, resulting in a significant increase in net economic benefit of 2.9-12.6%. These results suggest that knowledge-based N management practices can be considered an effective way to ensure food security and improve environmental sustainability while increasing economic returns.

  13. Discovery of novel inhibitors of Aurora kinases with indazole scaffold: In silico fragment-based and knowledge-based drug design.

    PubMed

    Chang, Chun-Feng; Lin, Wen-Hsing; Ke, Yi-Yu; Lin, Yih-Shyan; Wang, Wen-Chieh; Chen, Chun-Hwa; Kuo, Po-Chu; Hsu, John T A; Uang, Biing-Jiun; Hsieh, Hsing-Pang

    2016-11-29

    Aurora kinases have emerged as important anticancer targets, and several inhibitors have advanced into clinical study. Herein, we identified novel indazole derivatives as potent Aurora kinase inhibitors by utilizing an in silico fragment-based approach and knowledge-based drug design. After intensive hit-to-lead optimization, compounds 17 (dual Aurora A and B), 21 (Aurora B selective) and 30 (Aurora A selective) possessed the indazole privileged scaffold with different substituents, which provide subtype kinase selectivity. Computational modeling helped in understanding that isoform selectivity could be achieved by targeting specific residues in the Aurora kinase binding pocket, in particular Arg220, Thr217 or Glu177.

  14. Development and evaluation of a clinical model for lung cancer patients using stereotactic body radiotherapy (SBRT) within a knowledge-based algorithm for treatment planning.

    PubMed

    Snyder, Karen Chin; Kim, Jinkoo; Reding, Anne; Fraser, Corey; Gordon, James; Ajlouni, Munther; Movsas, Benjamin; Chetty, Indrin J

    2016-11-08

    The purpose of this study was to describe the development of a clinical model for lung cancer patients treated with stereotactic body radiotherapy (SBRT) within a knowledge-based algorithm for treatment planning, and to evaluate the model performance and applicability to different planning techniques, tumor locations, and beam arrangements. 105 SBRT plans for lung cancer patients previously treated at our institution were included in the development of the knowledge-based model (KBM). The KBM was trained with a combination of IMRT, VMAT, and 3D CRT techniques. Model performance was validated with 25 cases, for both IMRT and VMAT. The full KBM encompassed lesions located centrally vs. peripherally (43:62), upper vs. lower (62:43), and anterior vs. posterior (60:45). Four separate sub-KBMs were created based on tumor location. Results were compared with the full KBM to evaluate its robustness. Beam templates were used in conjunction with the optimizer to evaluate the model's ability to handle suboptimal beam placements. Dose differences to organs-at-risk (OAR) were evaluated between the plans generated by each KBM. Knowledge-based plans (KBPs) were comparable to clinical plans with respect to target conformity and OAR doses. The KBPs resulted in a lower maximum spinal cord dose by 1.0 ± 1.6 Gy compared to clinical plans, p = 0.007. Sub-KBMs split according to tumor location did not produce significantly better DVH estimates compared to the full KBM. For central lesions, compared to the full KBM, the peripheral sub-KBM resulted in lower dose to 0.035 cc and 5 cc of the esophagus, both by 0.4 ± 0.8 Gy, p = 0.025. For all lesions, compared to the full KBM, the posterior sub-KBM resulted in higher dose to 0.035 cc, 0.35 cc, and 1.2 cc of the spinal cord by 0.2 ± 0.4 Gy, p = 0.01. Plans using template beam arrangements met target and OAR criteria, with an increase noted in maximum heart dose (1.2 ± 2.2 Gy, p = 0.01) and GI (0.2 ± 0.4, p = 0.01) for the nine-field plans relative to the KBPs.

  15. UniProtKB/Swiss-Prot, the Manually Annotated Section of the UniProt KnowledgeBase: How to Use the Entry View.

    PubMed

    Boutet, Emmanuel; Lieberherr, Damien; Tognolli, Michael; Schneider, Michel; Bansal, Parit; Bridge, Alan J; Poux, Sylvain; Bougueleret, Lydie; Xenarios, Ioannis

    2016-01-01

    The Universal Protein Resource (UniProt, http://www.uniprot.org) consortium is an initiative of the SIB Swiss Institute of Bioinformatics (SIB), the European Bioinformatics Institute (EBI) and the Protein Information Resource (PIR) to provide the scientific community with a central resource for protein sequences and functional information. The UniProt consortium maintains the UniProt KnowledgeBase (UniProtKB), updated every 4 weeks, and several supplementary databases including the UniProt Reference Clusters (UniRef) and the UniProt Archive (UniParc). The Swiss-Prot section of the UniProt KnowledgeBase (UniProtKB/Swiss-Prot) contains publicly available, expert manually annotated protein sequences obtained from a broad spectrum of organisms. Plant protein entries are produced in the frame of the Plant Proteome Annotation Program (PPAP), with an emphasis on characterized proteins of Arabidopsis thaliana and Oryza sativa. High-level annotations provided by UniProtKB/Swiss-Prot are widely used to predict annotation of newly available proteins through automatic pipelines. The purpose of this chapter is to present a guided tour of a UniProtKB/Swiss-Prot entry. We will also present some of the tools and databases that are linked to each entry.

  16. TH-A-9A-08: Knowledge-Based Quality Control of Clinical Stereotactic Radiosurgery Treatment Plans

    SciTech Connect

    Shiraishi, S; Moore, K L; Tan, J; Olsen, L

    2014-06-15

    Purpose: To develop a quality control tool to reduce stereotactic radiosurgery (SRS) planning variability using models that predict achievable plan quality metrics (QMs) based on individual patient anatomy. Methods: Using a knowledge-based methodology that quantitatively correlates anatomical geometric features to resultant organ-at-risk (OAR) dosimetry, we developed models for predicting achievable OAR dose-volume histograms (DVHs) by training with a cohort of previously treated SRS patients. The DVH-based QMs used in this work are the gradient measure, GM = (3/(4π))^(1/3)·[V50%^(1/3) − V100%^(1/3)], and V10Gy of normal brain. As GM quantifies the total rate of dose fall-off around the planning target volume (PTV), all voxels inside the patient's body contour were treated as OAR for DVH prediction. 35 previously treated SRS plans from our institution were collected; all were planned with non-coplanar volumetric-modulated arc therapy to prescription doses of 12–25 Gy. Of the 35-patient cohort, 15 were used for model training and 20 for model validation. Accuracies of the predictions were quantified by the mean and the standard deviation of the difference between clinical and predicted QMs, δQM = QM-clin − QM-pred. Results: Best agreement between predicted and clinical QMs was obtained when models were built separately for V-PTV<2.5cc and V-PTV>2.5cc. Eight patients trained the V-PTV<2.5cc model and seven patients trained the V-PTV>2.5cc model. The mean and the standard deviation of δGM were 0.3±0.4mm for the training sets and −0.1±0.6mm for the validation sets, demonstrating highly accurate GM predictions. V10Gy predictions were also highly accurate, with δV10Gy = 0.8±0.7cc for the training sets and δV10Gy = 0.7±1.4cc for the validation sets. Conclusion: The accuracy of the models in predicting two key SRS quality metrics highlights the potential of this technique for quality control for SRS treatments. Future investigations will seek to determine
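
    The gradient measure above is a simple function of two volumes; a minimal sketch of the arithmetic in Python (illustrative input volumes only, not values from this study) is:

        import math

        def gradient_measure(v50_cc, v100_cc):
            """GM = (3/(4*pi))^(1/3) * (V50%^(1/3) - V100%^(1/3)).

            v50_cc and v100_cc are the volumes (in cc) receiving 50% and 100% of the
            prescription dose; the result is the equivalent-sphere radius difference,
            returned in mm (cube roots of cc are in cm, so multiply by 10).
            """
            factor = (3.0 / (4.0 * math.pi)) ** (1.0 / 3.0)
            return 10.0 * factor * (v50_cc ** (1.0 / 3.0) - v100_cc ** (1.0 / 3.0))

        # Hypothetical example: V50% = 12 cc, V100% = 2 cc gives a GM of roughly 6.4 mm.
        print(round(gradient_measure(12.0, 2.0), 2))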

  17. The use of knowledge-based Genetic Algorithm for starting time optimisation in a lot-bucket MRP

    NASA Astrophysics Data System (ADS)

    Ridwan, Muhammad; Purnomo, Andi

    2016-01-01

    In production planning, Material Requirement Planning (MRP) is usually developed based on a time-bucket system, in which a period (bucket) in the MRP represents time, usually a week. MRP has been successfully implemented in Make To Stock (MTS) manufacturing, where production activity must be started before customer demand is received. However, to be implemented successfully in Make To Order (MTO) manufacturing, the conventional MRP must be modified to bring it in line with the real situation. In MTO manufacturing, the delivery schedule to the customers is defined strictly and must be fulfilled in order to increase customer satisfaction. On the other hand, the company prefers to keep a constant number of workers; hence the production lot size should be constant as well. Since a bucket in the conventional MRP system represents time, usually a week, a strict delivery schedule cannot be accommodated. Fortunately, there is a modified time-bucket MRP system, called the lot-bucket MRP system, proposed by Casimir in 1999. In the lot-bucket MRP system, a bucket represents a lot, and the lot size is preferably constant. The time to finish each lot can vary depending on the lot's due date. The starting time of each lot must be determined so that every lot has a reasonable production time. So far there has been no formal method to determine the optimum starting times in the lot-bucket MRP system. Trial and error is usually used, but it sometimes leaves several lots with very short production times, making the lot-bucket MRP infeasible to execute. This paper presents the use of a Genetic Algorithm (GA) for optimisation of starting times in a lot-bucket MRP system. Even though the GA is well known as a powerful search algorithm, improvement is still required to increase its chance of finding the optimum solution in a shorter time. A knowledge-based system has been embedded in the proposed GA as this improvement, and it is proven that the
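
    As a rough illustration of the idea only (the paper's knowledge-based rules are not described here, and the due dates, minimum window and fitness form below are assumed), a plain genetic algorithm over the vector of lot starting times might look like:

        import random

        # Hypothetical data: due dates (in days) for five constant-size lots.
        DUE = [10, 14, 18, 25, 30]
        MIN_WINDOW = 2            # assumed minimum production time per lot (days)
        POP, GENS, MUT = 40, 200, 0.2

        def fitness(starts):
            """Higher is better: prefer evenly sized production windows and
            heavily penalize lots whose window is shorter than MIN_WINDOW."""
            windows = [d - s for s, d in zip(starts, DUE)]
            penalty = sum(1000 for w in windows if w < MIN_WINDOW)
            return -((max(windows) - min(windows)) + penalty)

        def random_individual():
            return [random.uniform(0, d - MIN_WINDOW) for d in DUE]

        def crossover(a, b):
            return [random.choice(pair) for pair in zip(a, b)]

        def mutate(ind):
            return [s + random.gauss(0, 1) if random.random() < MUT else s for s in ind]

        random.seed(0)
        population = [random_individual() for _ in range(POP)]
        for _ in range(GENS):
            population.sort(key=fitness, reverse=True)
            parents = population[:POP // 2]      # simple truncation selection
            children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                        for _ in range(POP - len(parents))]
            population = parents + children

        print([round(s, 1) for s in max(population, key=fitness)])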

  18. A dosimetric evaluation of knowledge-based VMAT planning with simultaneous integrated boosting for rectal cancer patients.

    PubMed

    Wu, Hao; Jiang, Fan; Yue, Haizhen; Li, Sha; Zhang, Yibao

    2016-11-08

    RapidPlan, a commercial knowledge-based optimizer, has been tested on head and neck, lung, esophageal, breast, liver, and prostate cancer patients. To appraise its performance on VMAT planning with simultaneous integrated boosting (SIB) for rectal cancer, this study configured a DVH (dose-volume histogram) estimation model consisting of 80 best-effort manual cases of this type. Using the model-generated objectives, the MLC (multileaf collimator) sequences of another 70 clinically approved plans were reoptimized, while the remaining parameters, such as field geometry and photon energy, were maintained. Dosimetric outcomes were assessed by comparing homogeneity index (HI), conformal index (CI), hot spots (volumes receiving over 107% of the prescribed dose, V107%), mean dose and dose to 50% of the volume of the femoral head (Dmean_FH and D50%_FH) and urinary bladder (Dmean_UB and D50%_UB), and the mean DVH plots. Paired-samples t-test or Wilcoxon signed-rank test suggested that comparable CI were achieved by RapidPlan (0.99 ± 0.04 for PTVboost, and 1.03 ± 0.02 for PTV) and original plans (1.00 ± 0.05 for PTVboost and 1.03 ± 0.02 for PTV), respectively (p > 0.05). Slightly improved HI of the planning target volume (PTVboost) and PTV were observed in the RapidPlan cases (0.05 ± 0.01 for PTVboost, and 0.26 ± 0.01 for PTV) compared with the original plans (0.06 ± 0.01 for PTVboost and 0.26 ± 0.01 for PTV), p < 0.05. More cases with positive V107% were found in the original group (18 plans) than in the RapidPlan group (none). RapidPlan significantly reduced D50%_FH (by 1.53 Gy / 9.86%, from 15.52 ± 2.17 to 13.99 ± 1.16 Gy), Dmean_FH (by 1.29 Gy / 7.78%, from 16.59 ± 2.07 to 15.30 ± 0.70 Gy), D50%_UB (by 4.93 Gy / 17.50%, from 28.17 ± 3.07 to 23.24 ± 2.13 Gy), and Dmean_UB (by 3.94 Gy / 13.43%, from 29.34 ± 2.34 to 25.40 ± 1.36 Gy), respectively. The more concentrated distribution of RapidPlan data points indicated an enhanced consistency of plan quality.

  19. Producing Qualified Graduates and Assuring Education Quality in the Knowledge-Based Society: Roles and Issues of Graduate Education. Report of the International Workshop on Graduate Education, 2009. RIHE International Seminar Reports. No.14

    ERIC Educational Resources Information Center

    Research Institute for Higher Education, Hiroshima University (NJ3), 2010

    2010-01-01

    Through being specially funded by the Ministry of Education and Science in 2008, the Research Institute for Higher Education (RIHE) in Hiroshima University has been able to implement a new research project on the reform of higher education in the knowledge-based society of the 21st century. Thus RIHE hosted the second International Workshop on…

  20. The ECOTOX Knowledgebase

    EPA Science Inventory

    The U.S. EPA's ECOTOX database is the largest compilation of ecotoxicity data, providing information on the adverse effects of single chemical stressors to ecologically relevant aquatic and terrestrial species. The database includes 614,090 test records abstracted from over 27,000...

  1. Knowledge-Based Simulation

    DTIC Science & Technology

    1989-07-01

    as LDS (Waterman and Peterson, 1981), TATR (Callero, Waterman, and Kipps, 1984), and SAL (Paul, Waterman, and Peterson, 1986). Finally, RAND's...modeling technique developed at RAND for use in judgment research (Veit, Callero, and Rose, 1984). (This methodology produces an algebraic model showing...forces typically focus on defensive operations (Veit, Rose, and Callero, 1981). Another aspect that differentiates Soviet and Western views on initiative

  2. A dosimetric evaluation of knowledge-based VMAT planning with simultaneous integrated boosting for rectal cancer patients.

    PubMed

    Wu, Hao; Jiang, Fan; Yue, Haizhen; Li, Sha; Zhang, Yibao

    2016-11-01

    RapidPlan, a commercial knowledge-based optimizer, has been tested on head and neck, lung, esophageal, breast, liver, and prostate cancer patients. To appraise its performance on VMAT planning with simultaneous integrated boosting (SIB) for rectal cancer, this study configured a DVH (dose-volume histogram) estimation model consisting of 80 best-effort manual cases of this type. Using the model-generated objectives, the MLC (multileaf collimator) sequences of another 70 clinically approved plans were reoptimized, while the remaining parameters, such as field geometry and photon energy, were maintained. Dosimetric outcomes were assessed by comparing homogeneity index (HI), conformal index (CI), hot spots (volumes receiving over 107% of the prescribed dose, V107%), mean dose and dose to 50% of the volume of the femoral head (Dmean_FH and D50%_FH) and urinary bladder (Dmean_UB and D50%_UB), and the mean DVH plots. Paired-samples t-test or Wilcoxon signed-rank test suggested that comparable CI were achieved by RapidPlan (0.99 ± 0.04 for PTVboost, and 1.03 ± 0.02 for PTV) and original plans (1.00 ± 0.05 for PTVboost and 1.03 ± 0.02 for PTV), respectively (p > 0.05). Slightly improved HI of the planning target volume (PTVboost) and PTV were observed in the RapidPlan cases (0.05 ± 0.01 for PTVboost, and 0.26 ± 0.01 for PTV) compared with the original plans (0.06 ± 0.01 for PTVboost and 0.26 ± 0.01 for PTV), p < 0.05. More cases with positive V107% were found in the original group (18 plans) than in the RapidPlan group (none). RapidPlan significantly reduced D50%_FH (by 1.53 Gy/9.86% from 15.52 ± 2.17 to 13.99 ± 1.16 Gy), Dmean_FH (by 1.29 Gy/7.78% from 16.59±2.07 to 15.30±0.70 Gy), D50%_UB (by 4.93 Gy/17.50% from 28.17±3.07 to 23.24±2.13 Gy), and Dmean_UB (by 3.94 Gy/13.43% from 29.34±2.34 to 25.40±1.36 Gy), respectively. The more concentrated distribution of RapidPlan data points indicated an enhanced consistency of plan quality.

  3. [The SIAARTI consensus document on the management of patients with end-stage chronic organ failure. From evidence-based medicine to knowledge-based medicine].

    PubMed

    Bertolini, Guido

    2014-01-01

    The management of patients with end-stage chronic organ failure is an increasingly important topic, since extraordinary medical and technological advances have significantly reduced mortality and improved quality of life, with prolonged survival in end-stage diseases. What should the plan of care for these patients be? Who should bear the responsibility for care? With what targets? These are crucial questions, to which modern medicine should provide convincing answers. The authors of the document explicitly resisted the temptation to draw up guidelines, showing that it is possible to customize medical intervention for the individual patient while keeping it tightly linked to the available knowledge. This is the most relevant aspect of the document: it goes beyond the classical concept of evidence-based medicine, choosing instead to refer to the more dynamic knowledge-based medicine approach.

  4. Creating a knowledge-based economy in the United Arab Emirates: realising the unfulfilled potential of women in the science, technology and engineering fields

    NASA Astrophysics Data System (ADS)

    Ghazal Aswad, Noor; Vidican, Georgeta; Samulewicz, Diana

    2011-12-01

    As the United Arab Emirates (UAE) moves towards a knowledge-based economy, maximising the participation of the national workforce, especially women, in the transformation process is crucial. Using survey methods and semi-structured interviews, this paper examines the factors that influence women's decisions regarding their degree programme and their attitudes towards science, technology and engineering (STE). The findings point to the importance of adapting mainstream policies to the local context and the need to better understand the effect of culture and society on the individual and the economy. There is a need to increase interest in STE by raising awareness of what the fields entail, potential careers and their suitability with existing cultural beliefs. Also suggested is the need to overcome negative stereotypes of engineering, implement initiatives for further family involvement at the higher education level, as well as the need to ensure a greater availability of STE university programmes across the UAE.

  5. Development and evaluation of a clinical model for lung cancer patients using stereotactic body radiotherapy (SBRT) within a knowledge-based algorithm for treatment planning.

    PubMed

    Chin Snyder, Karen; Kim, Jinkoo; Reding, Anne; Fraser, Corey; Gordon, James; Ajlouni, Munther; Movsas, Benjamin; Chetty, Indrin J

    2016-11-01

    The purpose of this study was to describe the development of a clinical model for lung cancer patients treated with stereotactic body radiotherapy (SBRT) within a knowledge-based algorithm for treatment planning, and to evaluate the model performance and applicability to different planning techniques, tumor locations, and beam arrangements. 105 SBRT plans for lung cancer patients previously treated at our institution were included in the development of the knowledge-based model (KBM). The KBM was trained with a combination of IMRT, VMAT, and 3D CRT techniques. Model performance was validated with 25 cases, for both IMRT and VMAT. The full KBM encompassed lesions located centrally vs. peripherally (43:62), upper vs. lower (62:43), and anterior vs. posterior (60:45). Four separate sub-KBMs were created based on tumor location. Results were compared with the full KBM to evaluate its robustness. Beam templates were used in conjunction with the optimizer to evaluate the model's ability to handle suboptimal beam placements. Dose differences to organs-at-risk (OAR) were evaluated between the plans generated by each KBM. Knowledge-based plans (KBPs) were comparable to clinical plans with respect to target conformity and OAR doses. The KBPs resulted in a lower maximum spinal cord dose by 1.0±1.6Gy compared to clinical plans, p=0.007. Sub-KBMs split according to tumor location did not produce significantly better DVH estimates compared to the full KBM. For central lesions, compared to the full KBM, the peripheral sub-KBM resulted in lower dose to 0.035 cc and 5 cc of the esophagus, both by 0.4Gy±0.8Gy, p=0.025. For all lesions, compared to the full KBM, the posterior sub-KBM resulted in higher dose to 0.035 cc, 0.35 cc, and 1.2 cc of the spinal cord by 0.2±0.4Gy, p=0.01. Plans using template beam arrangements met target and OAR criteria, with an increase noted in maximum heart dose (1.2±2.2Gy, p=0.01) and GI (0.2±0.4, p=0.01) for the nine-field plans relative to KBPs

  6. Neural network pattern recognition of photoacoustic FTIR spectra and knowledge-based techniques for detection of mycotoxigenic fungi in food grains.

    PubMed

    Gordon, S H; Wheeler, B C; Schudy, R B; Wicklow, D T; Greene, R V

    1998-02-01

    Fourier transform infrared photoacoustic spectroscopy (FTIR-PAS), a highly sensitive probe of the surfaces of solid substrates, is used to detect toxigenic fungal contamination in corn. Kernels of corn infected with mycotoxigenic fungi, such as Aspergillus flavus, display FTIR-PAS spectra that differ significantly from spectra of uninfected kernels. Photoacoustic infrared spectral features were identified, and an artificial neural network was trained to distinguish contaminated from uncontaminated corn by pattern recognition. Work is in progress to integrate epidemiological information about cereal crop fungal disease into the pattern recognition program to produce a more knowledge-based, and hence more reliable and specific, technique. A model of a hierarchically organized expert system is proposed, using epidemiological factors such as corn variety, plant stress and susceptibility to infection, geographic location, weather, insect vectors, and handling and storage conditions, in addition to the analytical data, to predict A. flavus and other kinds of toxigenic fungal contamination that might be present in food grains.
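
    The pattern-recognition step is a standard supervised classification of labelled spectra; a minimal scikit-learn sketch on synthetic stand-in spectra (real FTIR-PAS data would replace the random arrays, and the network size is an arbitrary choice) could be:

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(0)

        # Synthetic stand-in for FTIR-PAS spectra: 200 "kernels" x 400 wavenumber bins.
        # Infected spectra get a small extra absorption band; real data would replace this.
        n, bins = 200, 400
        labels = rng.integers(0, 2, n)                      # 0 = uninfected, 1 = infected
        spectra = rng.normal(0.0, 1.0, (n, bins))
        band = np.exp(-0.5 * ((np.arange(bins) - 150) / 10.0) ** 2)
        spectra[labels == 1] += 2.0 * band

        X_train, X_test, y_train, y_test = train_test_split(
            spectra, labels, test_size=0.3, random_state=0)

        clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
        clf.fit(X_train, y_train)
        print("held-out accuracy:", clf.score(X_test, y_test))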

  7. DES-ncRNA: A knowledgebase for exploring information about human micro and long noncoding RNAs based on literature-mining.

    PubMed

    Salhi, Adil; Essack, Magbubah; Alam, Tanvir; Bajic, Vladan P; Ma, Lina; Radovanovic, Aleksandar; Marchand, Benoit; Schmeier, Sebastian; Zhang, Zhang; Bajic, Vladimir B

    2017-04-07

    Noncoding RNAs (ncRNAs), particularly microRNAs (miRNAs) and long ncRNAs (lncRNAs), are important players in diseases and are emerging as novel drug targets. Thus, unraveling the relationships between ncRNAs and other biomedical entities in cells is critical for better understanding ncRNA roles, which may eventually help develop their use in medicine. To support ncRNA research and facilitate retrieval of relevant information regarding miRNAs and lncRNAs from the plethora of published ncRNA-related research, we developed DES-ncRNA (www.cbrc.kaust.edu.sa/des_ncrna). DES-ncRNA is a knowledgebase containing text- and data-mined information from public scientific literature and other public resources. Exploration of mined information is enabled through terms and pairs of terms from 19 topic-specific dictionaries including, for example, antibiotics, toxins, drugs, enzymes, mutations, pathways, human genes and proteins, drug indications and side effects, and diseases. DES-ncRNA contains approximately 878,000 associations of terms from these dictionaries, of which 36,222 involve miRNAs and 5,373 involve lncRNAs. We provide users with several ways to explore information regarding ncRNAs, including controlled generation of association networks as well as hypothesis generation. We show an example of how DES-ncRNA can aid research on Alzheimer's disease and suggest a potential therapeutic role for Fasudil. DES-ncRNA is a powerful tool that can be used on its own or as a complement to existing resources to support research on human ncRNAs. To our knowledge, this is the only knowledgebase dedicated to human miRNAs and lncRNAs derived primarily through literature mining that enables exploration of a broad spectrum of associated biomedical entities, not paralleled by any other resource.

  8. Reification of abstract concepts to improve comprehension using interactive virtual environments and a knowledge-based design: a renal physiology model.

    PubMed

    Alverson, Dale C; Saiki, Stanley M; Caudell, Thomas P; Goldsmith, Timothy; Stevens, Susan; Saland, Linda; Colleran, Kathleen; Brandt, John; Danielson, Lee; Cerilli, Lisa; Harris, Alexis; Gregory, Martin C; Stewart, Randall; Norenberg, Jeffery; Shuster, George; Panaoitis; Holten, James; Vergera, Victor M; Sherstyuk, Andrei; Kihmm, Kathleen; Lui, Jack; Wang, Kin Lik

    2006-01-01

    Several abstract concepts in medical education are difficult to teach and comprehend. In order to address this challenge, we have been applying the approach of reification of abstract concepts using interactive virtual environments and a knowledge-based design. Reification is the process of making abstract concepts and events, beyond the realm of direct human experience, concrete and accessible to teachers and learners. Entering virtual worlds and simulations not otherwise easily accessible provides an opportunity to create, study, and evaluate the emergence of knowledge and comprehension from the direct interaction of learners with otherwise complex abstract ideas and principles by bringing them to life. Using a knowledge-based design process and appropriate subject matter experts, knowledge structure methods are applied in order to prioritize, characterize important relationships, and create a concept map that can be integrated into the reified models that are subsequently developed. Applying these principles, our interdisciplinary team has been developing a reified model of the nephron, into which important physiologic functions can be integrated, rendered in a three-dimensional virtual environment built with Flatland, a virtual environment development software tool, within which learners can interact using off-the-shelf hardware. The nephron model can be driven dynamically by a rules-based artificial intelligence engine, applying the rules and concepts developed in conjunction with the subject matter experts. In the future, the nephron model can be used to interactively demonstrate a number of physiologic principles or a variety of pathological processes that may be difficult to teach and understand. In addition, this approach to reification can be applied to a host of other physiologic and pathological concepts in other systems. These methods will require further evaluation to determine their impact and role in learning.

  9. A catalog of putative adverse outcome pathways (AOPs) that will enhance the utility of ToxCast high throughput screening data for hazard identification, delivered via a putative AOP knowledgebase and a ToxCast assay annotation file that can be linked with the iCSS dashboard.

    EPA Science Inventory

    A number of putative AOPs for several distinct MIEs of thyroid disruption have been formulated for amphibian metamorphosis and fish swim bladder inflation. These have been entered into the AOP knowledgebase on the OECD WIKI.

  10. An innovative approach to addressing childhood obesity: a knowledge-based infrastructure for supporting multi-stakeholder partnership decision-making in Quebec, Canada.

    PubMed

    Addy, Nii Antiaye; Shaban-Nejad, Arash; Buckeridge, David L; Dubé, Laurette

    2015-01-23

    Multi-stakeholder partnerships (MSPs) have become a widespread means for deploying policies in a whole of society strategy to address the complex problem of childhood obesity. However, decision-making in MSPs is fraught with challenges, as decision-makers are faced with complexity, and have to reconcile disparate conceptualizations of knowledge across multiple sectors with diverse sets of indicators and data. These challenges can be addressed by supporting MSPs with innovative tools for obtaining, organizing and using data to inform decision-making. The purpose of this paper is to describe and analyze the development of a knowledge-based infrastructure to support MSP decision-making processes. The paper emerged from a study to define specifications for a knowledge-based infrastructure to provide decision support for community-level MSPs in the Canadian province of Quebec. As part of the study, a process assessment was conducted to understand the needs of communities as they collect, organize, and analyze data to make decisions about their priorities. The result of this process is a "portrait", which is an epidemiological profile of health and nutrition in their community. Portraits inform strategic planning and development of interventions, and are used to assess the impact of interventions. Our key findings indicate ambiguities and disagreement among MSP decision-makers regarding causal relationships between actions and outcomes, and the relevant data needed for making decisions. MSP decision-makers expressed a desire for easy-to-use tools that facilitate the collection, organization, synthesis, and analysis of data, to enable decision-making in a timely manner. Findings inform conceptual modeling and ontological analysis to capture the domain knowledge and specify relationships between actions and outcomes. This modeling and analysis provide the foundation for an ontology, encoded using OWL 2 Web Ontology Language. The ontology is developed to provide semantic

  11. MO-F-CAMPUS-T-04: Development and Evaluation of a Knowledge-Based Model for Treatment Planning of Lung Cancer Patients Using Stereotactic Body Radiotherapy (SBRT)

    SciTech Connect

    Snyder, K; Kim, J; Reding, A; Fraser, C; Lu, S; Gordon, J; Ajlouni, M; Movsas, B; Chetty, I

    2015-06-15

    Purpose: To describe the development of a knowledge-based treatment planning model for lung cancer patients treated with SBRT, and to evaluate the model performance and applicability to different planning techniques and tumor locations. Methods: 105 lung SBRT plans previously treated at our institution were included in the development of the model using Varian’s RapidPlan DVH estimation algorithm. The model was trained with a combination of IMRT, VMAT, and 3D–CRT techniques. Tumor locations encompassed lesions located centrally vs peripherally (43:62), upper vs lower (62:43), and anterior vs posterior lobes (60:45). The model performance was validated with 25 cases independent of the training set, for both IMRT and VMAT. Model generated plans were created with only one optimization and no planner intervention. The original, general model was also divided into four separate models according to tumor location. The model was also applied using different beam templates to further improve workflow. Dose differences to targets and organs-at-risk were evaluated. Results: IMRT and VMAT RapidPlan generated plans were comparable to clinical plans with respect to target coverage and several OARs. Spinal cord dose was lowered in the model-based plans by 1Gy compared to the clinical plans, p=0.008. Splitting the model according to tumor location resulted in insignificant differences in DVH estimation. The peripheral model decreased esophagus dose to the central lesions by 0.5Gy compared to the original model, p=0.025, and the posterior model increased dose to the spinal cord by 1Gy compared to the anterior model, p=0.001. All template beam plans met OAR criteria, with 1Gy increases noted in maximum heart dose for the 9-field plans, p=0.04. Conclusion: A RapidPlan knowledge-based model for lung SBRT produces comparable results to clinical plans, with increased consistency and greater efficiency. The model encompasses both IMRT and VMAT techniques, differing tumor locations

  12. An Innovative Approach to Addressing Childhood Obesity: A Knowledge-Based Infrastructure for Supporting Multi-Stakeholder Partnership Decision-Making in Quebec, Canada

    PubMed Central

    Addy, Nii Antiaye; Shaban-Nejad, Arash; Buckeridge, David L.; Dubé, Laurette

    2015-01-01

    Multi-stakeholder partnerships (MSPs) have become a widespread means for deploying policies in a whole of society strategy to address the complex problem of childhood obesity. However, decision-making in MSPs is fraught with challenges, as decision-makers are faced with complexity, and have to reconcile disparate conceptualizations of knowledge across multiple sectors with diverse sets of indicators and data. These challenges can be addressed by supporting MSPs with innovative tools for obtaining, organizing and using data to inform decision-making. The purpose of this paper is to describe and analyze the development of a knowledge-based infrastructure to support MSP decision-making processes. The paper emerged from a study to define specifications for a knowledge-based infrastructure to provide decision support for community-level MSPs in the Canadian province of Quebec. As part of the study, a process assessment was conducted to understand the needs of communities as they collect, organize, and analyze data to make decisions about their priorities. The result of this process is a “portrait”, which is an epidemiological profile of health and nutrition in their community. Portraits inform strategic planning and development of interventions, and are used to assess the impact of interventions. Our key findings indicate ambiguities and disagreement among MSP decision-makers regarding causal relationships between actions and outcomes, and the relevant data needed for making decisions. MSP decision-makers expressed a desire for easy-to-use tools that facilitate the collection, organization, synthesis, and analysis of data, to enable decision-making in a timely manner. Findings inform conceptual modeling and ontological analysis to capture the domain knowledge and specify relationships between actions and outcomes. This modeling and analysis provide the foundation for an ontology, encoded using OWL 2 Web Ontology Language. The ontology is developed to provide

  13. FluKB: A Knowledge-Based System for Influenza Vaccine Target Discovery and Analysis of the Immunological Properties of Influenza Viruses

    PubMed Central

    Simon, Christian; Kudahl, Ulrich J.; Sun, Jing; Olsen, Lars Rønn; Zhang, Guang Lan; Reinherz, Ellis L.; Brusic, Vladimir

    2015-01-01

    FluKB is a knowledge-based system focusing on data and analytical tools for influenza vaccine discovery. The main goal of FluKB is to provide access to curated influenza sequence and epitope data and to enhance the analysis of influenza sequence diversity and of the targets of immune responses. FluKB consists of more than 400,000 influenza protein sequences, known epitope data (357 verified T-cell epitopes, 685 HLA binders, and 16 naturally processed MHC ligands), and a collection of 28 influenza antibodies and their structurally defined B-cell epitopes. FluKB was built using a modular framework allowing the implementation of analytical workflows and includes standard search tools, such as keyword search and sequence similarity queries, as well as advanced tools for the analysis of sequence variability. The advanced analytical tools for vaccine discovery include visual mapping of T- and B-cell vaccine targets and assessment of neutralizing antibody coverage. FluKB supports the discovery of vaccine targets and the analysis of viral diversity and its implications for vaccine discovery, as well as potential T-cell breadth and antibody cross-neutralization involving multiple strains. FluKB is representative of a new generation of databases that integrates data, analytical tools, and analytical workflows, enabling comprehensive analysis and automatic generation of analysis reports. PMID:26504853

  14. User's guide for the KBERT 1.0 code: For the knowledge-based estimation of hazards of radioactive material releases from DOE nuclear facilities

    SciTech Connect

    Browitt, D.S.; Washington, K.E.; Powers, D.A.

    1995-07-01

    The possibility of worker exposure to radioactive materials during accidents at nuclear facilities is a principal concern of the DOE. The KBERT software has been developed at Sandia National Laboratories under DOE support to address this issue by assisting in the estimation of risks posed by accidents at chemical and nuclear facilities. KBERT is an acronym for Knowledge-Based system for Estimating hazards of Radioactive material release Transients. The current prototype version of KBERT focuses on calculation of doses and consequences to in-facility workers due to accidental releases of radioactivity. This report gives detailed instructions on how a user who is familiar with the design, layout and potential hazards of a facility can use KBERT to assess the risks to workers in that facility. KBERT is a tool that allows a user to simulate possible accidents and observe the predicted consequences. Potential applications of KBERT include the evaluation of the efficacy of evacuation practices, worker shielding, personal protection equipment and the containment of hazardous materials.

  15. Prediction and validation of protein-protein interactors from genome-wide DNA-binding data using a knowledge-based machine-learning approach.

    PubMed

    Waardenberg, Ashley J; Homan, Bernou; Mohamed, Stephanie; Harvey, Richard P; Bouveret, Romaric

    2016-09-01

    The ability to accurately predict the DNA targets and interacting cofactors of transcriptional regulators from genome-wide data can significantly advance our understanding of gene regulatory networks. NKX2-5 is a homeodomain transcription factor that sits high in the cardiac gene regulatory network and is essential for normal heart development. We previously identified genomic targets for NKX2-5 in mouse HL-1 atrial cardiomyocytes using DNA-adenine methyltransferase identification (DamID). Here, we apply machine learning algorithms and propose a knowledge-based feature selection method for predicting NKX2-5 protein : protein interactions based on motif grammar in genome-wide DNA-binding data. We assessed model performance using leave-one-out cross-validation and a completely independent DamID experiment performed with replicates. In addition to identifying previously described NKX2-5-interacting proteins, including GATA, HAND and TBX family members, a number of novel interactors were identified, with direct protein : protein interactions between NKX2-5 and retinoid X receptor (RXR), paired-related homeobox (PRRX) and Ikaros zinc fingers (IKZF) validated using the yeast two-hybrid assay. We also found that the interaction of RXRα with NKX2-5 mutations found in congenital heart disease (Q187H, R189G and R190H) was altered. These findings highlight an intuitive approach to accessing protein-protein interaction information of transcription factors in DNA-binding experiments.
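
    The leave-one-out evaluation mentioned above can be sketched as follows; the feature matrix here is a random stand-in for motif-grammar features (not the DamID-derived data) and the classifier choice is an assumption:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import LeaveOneOut, cross_val_score

        rng = np.random.default_rng(0)

        # Toy stand-in: rows = bound regions, columns = co-occurring motif counts,
        # label = whether a protein:protein interaction is observed.
        X = rng.poisson(1.0, size=(60, 12))
        y = rng.integers(0, 2, 60)

        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
        print("leave-one-out accuracy:", scores.mean())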

  16. Development and Assessment of a Geographic Knowledge-Based Model for Mapping Suitable Areas for Rift Valley Fever Transmission in Eastern Africa

    PubMed Central

    Tran, Annelise; Trevennec, Carlène; Lutwama, Julius; Sserugga, Joseph; Gély, Marie; Pittiglio, Claudia; Pinto, Julio; Chevalier, Véronique

    2016-01-01

    Rift Valley fever (RVF), a mosquito-borne disease affecting ruminants and humans, is one of the most important viral zoonoses in Africa. The objective of the present study was to develop a geographic knowledge-based method to map the areas suitable for RVF amplification and RVF spread in four East African countries, namely, Kenya, Tanzania, Uganda and Ethiopia, and to assess the predictive accuracy of the model using livestock outbreak data from Kenya and Tanzania. Risk factors and their relative importance regarding RVF amplification and spread were identified from a literature review. A numerical weight was calculated for each risk factor using an analytical hierarchy process. The corresponding geographic data were collected, standardized and combined based on a weighted linear combination to produce maps of the suitability for RVF transmission. The accuracy of the resulting maps was assessed using RVF outbreak locations in livestock reported in Kenya and Tanzania between 1998 and 2012 and ROC curve analysis. Our results confirmed the capacity of the geographic information system-based multi-criteria evaluation method to synthesize available scientific knowledge and to accurately map (AUC = 0.786; 95% CI [0.730–0.842]) the spatial heterogeneity of RVF suitability in East Africa. This approach provides users with a straightforward and easy way to update the maps according to data availability or the further development of scientific knowledge. PMID:27631374
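
    The mapping step, standardized risk-factor layers combined with AHP-derived weights into a suitability surface and then checked against outbreak points with a ROC curve, can be sketched as below (toy rasters, invented weights and fake outbreak labels, not the study's data):

        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(1)

        # Toy risk-factor rasters on a 100x100 grid, each already standardized to [0, 1].
        rainfall = rng.random((100, 100))
        ndvi = rng.random((100, 100))
        livestock_density = rng.random((100, 100))

        # Hypothetical AHP-derived weights (must sum to 1); the study derives its own.
        weights = {"rainfall": 0.5, "ndvi": 0.3, "livestock": 0.2}

        suitability = (weights["rainfall"] * rainfall
                       + weights["ndvi"] * ndvi
                       + weights["livestock"] * livestock_density)

        # Fake presence/absence labels, only to illustrate the ROC validation step.
        labels = (suitability + rng.normal(0, 0.2, suitability.shape) > 0.6).astype(int)
        print("AUC:", round(roc_auc_score(labels.ravel(), suitability.ravel()), 3))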

  17. Implementation of a knowledge-based methodology in a decision support system for the design of suitable wastewater treatment process flow diagrams.

    PubMed

    Garrido-Baserba, Manel; Reif, Rubén; Hernández, Francesc; Poch, Manel

    2012-12-15

    In light of rapid global change, the demand for wastewater treatment is increasing rapidly and will continue to do so in the near future. Wastewater management is a complex puzzle for which the proper pieces must be combined to achieve the desired solution, requiring the simultaneous consideration of technical, economic, social and environmental issues. In this context, a knowledge-based methodology (KBM) for the conceptual design of wastewater treatment plant (WWTP) process flow diagrams (PFDs) and its application to two scenarios are presented in this paper. The core of the KBM is composed of two knowledge bases (KBs). The first, a specification knowledge base (S-KB), summarizes the main features of the different treatment technologies: pollutant removal efficiency, operational costs and technical reliability. The second, a compatibility knowledge base (C-KB), contains information about the different interactions amongst the treatment technologies and determines their degree of compatibility. The proposed methodology is based on a decision hierarchy that uses the information contained in both KBs to generate all possible WWTP configurations, screening and selecting appropriate configurations based on user-specified requirements and scenario characteristics. The design of the most adequate treatment train for small and medium-sized wastewater treatment plants (2,000 and 50,000 p.e., respectively) under different restrictions (space constraints, operational simplicity and cost optimization) is used as an example to show the usefulness of the KBM.
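
    The generate-then-screen idea can be illustrated with a toy specification KB and compatibility KB (all technologies, efficiencies, costs and the incompatible pair below are invented, and the combined-removal and screening rules are assumptions, not the published KBM):

        from itertools import product
        from math import prod

        # Toy S-KB: candidate technologies per stage as (removal efficiency, relative cost).
        S_KB = {
            "primary":   {"settler": (0.35, 1.0), "fine_screen": (0.20, 0.6)},
            "secondary": {"activated_sludge": (0.90, 3.0), "trickling_filter": (0.80, 2.0)},
            "tertiary":  {"sand_filter": (0.95, 3.8), "constructed_wetland": (0.92, 2.5)},
        }
        # Toy C-KB: pairs of technologies that must not appear in the same train.
        C_KB = {("fine_screen", "activated_sludge")}

        def compatible(train):
            return not any((a, b) in C_KB or (b, a) in C_KB
                           for i, a in enumerate(train) for b in train[i + 1:])

        required_removal, budget = 0.98, 8.0
        candidates = []
        for train in product(*(S_KB[stage] for stage in S_KB)):
            effs = [S_KB[s][t][0] for s, t in zip(S_KB, train)]
            removal = 1 - prod(1 - e for e in effs)     # combined removal across stages
            cost = sum(S_KB[s][t][1] for s, t in zip(S_KB, train))
            if compatible(train) and removal >= required_removal and cost <= budget:
                candidates.append((train, round(cost, 1)))

        for train, cost in sorted(candidates, key=lambda x: x[1]):
            print(train, "relative cost:", cost)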

  18. Distance learning on the Web supported by Javascript: a critical appraisal with examples from clay mineralogy and knowledge-based tests

    NASA Astrophysics Data System (ADS)

    Krumm, S.; Thum, I.

    1998-08-01

    The hypertext mark-up language (HTML) is used to create hypertext documents on the World-Wide Web (WWW), which is built on a client/server model. In this paper we discuss the enhancement of HTML documents with JavaScript, a script language understood by most common browsers. JavaScript is considered an easy means of bringing interactivity and answer checking to educational Web pages. It is faster to learn than a programming language like Perl and has the advantage of high portability between different operating systems. Because all actions are performed on the client side, it reduces net traffic and pages can be used off-line. Educational usage, including tests and operations in future distance learning, is outlined. Examples of JavaScript-supported documents are given using clay mineralogy and knowledge-based tests as examples. A critical review of this relatively new technology reveals some compatibility problems, but these seem to be offset by the possibility of making Web pages more attractive.

  19. Conformational temperature-dependent behavior of a histone H2AX: a coarse-grained Monte Carlo approach via knowledge-based interaction potentials.

    PubMed

    Fritsche, Miriam; Pandey, Ras B; Farmer, Barry L; Heermann, Dieter W

    2012-01-01

    Histone proteins are not only important due to their vital role in cellular processes such as DNA compaction, replication and repair but also show intriguing structural properties that might be exploited for bioengineering purposes such as the development of nano-materials. Given their biological and technological implications, it is interesting to investigate the structural properties of proteins as a function of temperature. In this work, we study the spatial response dynamics of the histone H2AX, consisting of 143 residues, with a coarse-grained bond fluctuation model over a broad range of normalized temperatures. A knowledge-based interaction matrix is used as input for the residue-residue Lennard-Jones potential. We find a variety of equilibrium structures including global globular configurations at low normalized temperature (T* = 0.014), combinations of segmental globules and elongated chains (T* = 0.016, 0.017), predominantly elongated chains (T* = 0.019, 0.020), as well as universal SAW conformations at high normalized temperature (T* ≥ 0.023). The radius of gyration of the protein exhibits a non-monotonic temperature dependence with a maximum at a characteristic temperature (T(c)* = 0.019), where a crossover occurs from a positive (stretching at T* ≤ T(c)*) to a negative (contraction at T* ≥ T(c)*) thermal response on increasing T*.
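
    In the same spirit (but not the lattice bond-fluctuation algorithm of the paper, and with arbitrary reduced units and a random stand-in for the knowledge-based interaction matrix), a minimal off-lattice Metropolis Monte Carlo sketch for a coarse-grained chain with per-pair Lennard-Jones contacts and a radius-of-gyration readout is:

        import numpy as np

        rng = np.random.default_rng(0)
        N, T, STEPS = 20, 0.5, 5000          # beads, reduced temperature, MC moves
        EPS = rng.uniform(0.2, 1.0, (N, N))  # stand-in for a knowledge-based contact matrix
        EPS = (EPS + EPS.T) / 2
        SIGMA, BOND = 1.0, 1.2

        def energy(pos):
            """Pairwise Lennard-Jones between non-bonded beads (j >= i + 2)."""
            e = 0.0
            for i in range(N):
                for j in range(i + 2, N):
                    r = np.linalg.norm(pos[i] - pos[j])
                    e += 4 * EPS[i, j] * ((SIGMA / r) ** 12 - (SIGMA / r) ** 6)
            return e

        pos = np.cumsum(rng.normal(0, 0.05, (N, 3)) + [BOND, 0.0, 0.0], axis=0)  # extended start
        E = energy(pos)
        for _ in range(STEPS):
            i = rng.integers(N)
            trial = pos.copy()
            trial[i] += rng.normal(0, 0.1, 3)
            # crude chain-connectivity constraint: keep bonds near their nominal length
            if not all(abs(np.linalg.norm(trial[k] - trial[k - 1]) - BOND) < 0.4
                       for k in (i, i + 1) if 0 < k < N):
                continue
            dE = energy(trial) - E
            if dE < 0 or rng.random() < np.exp(-dE / T):   # Metropolis acceptance
                pos, E = trial, E + dE

        rg = np.sqrt(((pos - pos.mean(axis=0)) ** 2).sum(axis=1).mean())
        print("radius of gyration (reduced units):", round(float(rg), 2))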

  20. Individual 3D region-of-interest atlas of the human brain: knowledge-based class image analysis for extraction of anatomical objects

    NASA Astrophysics Data System (ADS)

    Wagenknecht, Gudrun; Kaiser, Hans-Juergen; Sabri, Osama; Buell, Udalrich

    2000-06-01

    After neural network-based classification of tissue types, the second step of atlas extraction is knowledge-based class image analysis to get anatomically meaningful objects. Basic algorithms are region growing, mathematical morphology operations, and template matching. A special algorithm was designed for each object. The class label of each voxel and the knowledge about the relative position of anatomical objects to each other and to the sagittal midplane of the brain can be utilized for object extraction. User interaction is only necessary to define starting, mid- and end planes for most object extractions and to determine the number of iterations for erosion and dilation operations. Extraction can be done for the following anatomical brain regions: cerebrum; cerebral hemispheres; cerebellum; brain stem; white matter (e.g., centrum semiovale); gray matter [cortex, frontal, parietal, occipital, temporal lobes, cingulum, insula, basal ganglia (nuclei caudati, putamen, thalami)]. For atlas-based quantification of functional data, anatomical objects can be convolved with the point spread function of functional data to take into account the different resolutions of morphological and functional modalities. This method allows individual atlas extraction from MRI image data of a patient without the need of warping individual data to an anatomical or statistical MRI brain atlas.
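
    One of the named primitives, morphological erosion/dilation used to cut thin spurious connections between class-image objects before labelling them, can be sketched with scipy.ndimage on a toy binary mask (the shapes and iteration counts are arbitrary choices for illustration):

        import numpy as np
        from scipy import ndimage

        # Toy class image: two "tissue" blobs joined by a thin spurious bridge,
        # as can remain after voxel-wise classification.
        mask = np.zeros((64, 64), dtype=bool)
        mask[10:30, 10:30] = True
        mask[34:54, 34:54] = True
        mask[29:35, 29:35] = True            # thin bridge connecting the two objects

        # Erosion removes the bridge, labelling separates the objects, and dilation
        # restricted to the original mask approximately restores their extent.
        eroded = ndimage.binary_erosion(mask, iterations=3)
        labels, n_objects = ndimage.label(eroded)
        restored = ndimage.binary_dilation(eroded, iterations=3) & mask
        print("objects after opening:", n_objects, "| voxels kept:", int(restored.sum()))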

  1. Validation of an enhanced knowledge-based method for segmentation and quantitative analysis of intrathoracic airway trees from three-dimensional CT images

    SciTech Connect

    Sonka, M.; Park, W.; Hoffman, E.A.

    1995-12-31

    Accurate assessment of airway physiology, evaluated in terms of geometric changes, is critically dependent upon the accurate imaging and image segmentation of the three-dimensional airway tree structure. The authors have previously reported a knowledge-based method for three-dimensional airway tree segmentation from high resolution CT (HRCT) images. Here, they report a substantially improved version of the method. In the current implementation, the method consists of several stages. First, the lung borders are automatically determined in the three-dimensional set of HRCT data. The primary airway tree is semi-automatically identified. In the next stage, potential airways are determined in individual CT slices using a rule-based system that uses contextual information and a priori knowledge about pulmonary anatomy. Using three-dimensional connectivity properties of the pulmonary airway tree, the three-dimensional tree is constructed from the set of adjacent slices. The method's performance and accuracy were assessed in five 3D HRCT canine images. Computer-identified airways matched 226/258 observer-defined airways (87.6%); the computer method failed to detect the airways in the remaining 32 locations. By visual assessment of rendered airway trees, the experienced observers judged the computer-detected airway trees as highly realistic.
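
    The anatomical rules themselves are specific to this work, but the underlying primitive, growing a connected airway region from a seed under an intensity criterion, can be sketched as follows (6-connectivity and the HU window are assumptions, and the volume is synthetic):

        import numpy as np
        from collections import deque

        def region_grow_3d(volume, seed, low, high):
            """Grow a 6-connected region from `seed`, accepting voxels whose
            intensity lies in [low, high] (e.g. the air-like HU range of an airway)."""
            mask = np.zeros(volume.shape, dtype=bool)
            queue = deque([seed])
            offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
            while queue:
                z, y, x = queue.popleft()
                if mask[z, y, x] or not (low <= volume[z, y, x] <= high):
                    continue
                mask[z, y, x] = True
                for dz, dy, dx in offsets:
                    nz, ny, nx = z + dz, y + dy, x + dx
                    if (0 <= nz < volume.shape[0] and 0 <= ny < volume.shape[1]
                            and 0 <= nx < volume.shape[2] and not mask[nz, ny, nx]):
                        queue.append((nz, ny, nx))
            return mask

        # Toy "CT" volume: soft tissue around -50 HU with a dark tube (about -950 HU).
        vol = np.full((40, 64, 64), -50.0)
        vol[:, 30:34, 30:34] = -950.0
        airway = region_grow_3d(vol, seed=(0, 31, 31), low=-1100, high=-900)
        print("segmented voxels:", int(airway.sum()))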

  2. Prediction and validation of protein–protein interactors from genome-wide DNA-binding data using a knowledge-based machine-learning approach

    PubMed Central

    Homan, Bernou; Mohamed, Stephanie; Harvey, Richard P.; Bouveret, Romaric

    2016-01-01

    The ability to accurately predict the DNA targets and interacting cofactors of transcriptional regulators from genome-wide data can significantly advance our understanding of gene regulatory networks. NKX2-5 is a homeodomain transcription factor that sits high in the cardiac gene regulatory network and is essential for normal heart development. We previously identified genomic targets for NKX2-5 in mouse HL-1 atrial cardiomyocytes using DNA-adenine methyltransferase identification (DamID). Here, we apply machine learning algorithms and propose a knowledge-based feature selection method for predicting NKX2-5 protein : protein interactions based on motif grammar in genome-wide DNA-binding data. We assessed model performance using leave-one-out cross-validation and a completely independent DamID experiment performed with replicates. In addition to identifying previously described NKX2-5-interacting proteins, including GATA, HAND and TBX family members, a number of novel interactors were identified, with direct protein : protein interactions between NKX2-5 and retinoid X receptor (RXR), paired-related homeobox (PRRX) and Ikaros zinc fingers (IKZF) validated using the yeast two-hybrid assay. We also found that the interaction of RXRα with NKX2-5 mutations found in congenital heart disease (Q187H, R189G and R190H) was altered. These findings highlight an intuitive approach to accessing protein–protein interaction information of transcription factors in DNA-binding experiments. PMID:27683156

  3. Numerical, Analytical, Experimental Study of Fluid Dynamic Forces in Seals. Volume 1; Executive Summary and Description of Knowledge-Based System

    NASA Technical Reports Server (NTRS)

    Liang, Anita D. (Technical Monitor); Shapiro, Wilbur; Aggarwal, Bharat

    2004-01-01

    The objectives of the program were to develop computational fluid dynamics (CFD) codes and simpler industrial codes for analyzing and designing advanced seals for air-breathing and space propulsion engines. The CFD code SCISEAL is capable of producing full three-dimensional flow field information for a variety of cylindrical configurations. An implicit multidomain capability allows the division of complex flow domains to allow optimum use of computational cells. SCISEAL also has the unique capability to produce cross-coupled stiffness and damping coefficients for rotordynamic computations. The industrial codes consist of a series of separate stand-alone modules designed for expeditious parametric analyses and optimization of a wide variety of cylindrical and face seals. Coupled through a Knowledge-Based System (KBS) that provides a user-friendly Graphical User Interface (GUI), the industrial codes are PC-based, using the OS/2 operating system. These codes were designed to treat film seals where a clearance exists between the rotating and stationary components. Leakage is inhibited by surface roughness, small but stiff clearance films, and viscous pumping devices. The codes have proved to be a valuable resource for seal development for future air-breathing and space propulsion engines.

  4. Identification of a candidate single-nucleotide polymorphism related to chemotherapeutic response through a combination of knowledge-based algorithm and hypothesis-free genomic data.

    PubMed

    Takahashi, Hiro; Kaniwa, Nahoko; Saito, Yoshiro; Sai, Kimie; Hamaguchi, Tetsuya; Shirao, Kuniaki; Shimada, Yasuhiro; Matsumura, Yasuhiro; Ohtsu, Atsushi; Yoshino, Takayuki; Takahashi, Anna; Odaka, Yoko; Okuyama, Misuzu; Sawada, Jun-ichi; Sakamoto, Hiromi; Yoshida, Teruhiko

    2013-12-01

    Inter-individual variations in drug responses among patients are known to cause serious problems in medicine. Genome-wide association studies (GWAS) are powerful for examining single-nucleotide polymorphisms (SNPs) and their relationships with drug response variations. However, no significant SNP has been identified using GWAS due to multiple testing problems. Therefore, in the present study we propose a combination method consisting of a knowledge-based algorithm, two stages of screening, and a permutation test for identifying SNPs. We applied this method to a genome-wide pharmacogenomics study for which 109,365 SNPs had been genotyped using the Illumina Human-1 BeadChip for 119 gastric cancer patients treated with fluoropyrimidine. We identified rs2293347 in the epidermal growth factor receptor gene (EGFR) as a candidate SNP related to chemotherapeutic response. The p value for rs2293347 was 2.19 × 10^(-5) by Fisher's exact test and 0.00360 by the permutation test (corrected for multiple testing). Additionally, rs2293347 was clearly superior to clinical parameters, showing a sensitivity of 55.0% and a specificity of 94.4% in the evaluation using multiple regression models. Recent studies have shown that combination chemotherapy of fluoropyrimidine and EGFR-targeting agents is effective for gastric cancer patients highly expressing EGFR. These results suggest that rs2293347 is a potential predictive factor for selecting chemotherapies, such as fluoropyrimidine alone or combination chemotherapies.
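
    The final testing step can be illustrated with a small sketch: Fisher's exact test per screened SNP, then a label-permutation null in which the minimum p-value per permutation supplies the multiple-testing-corrected p-value (toy genotype/response data, not the study's):

        import numpy as np
        from scipy.stats import fisher_exact

        rng = np.random.default_rng(0)

        # Toy data: 119 patients (as in the study), 5 candidate SNPs after screening,
        # binary carrier status per SNP and binary chemotherapeutic response.
        n_patients, n_snps = 119, 5
        geno = rng.integers(0, 2, (n_patients, n_snps))
        resp = rng.integers(0, 2, n_patients)

        def fisher_p(g, r):
            table = [[np.sum((g == 1) & (r == 1)), np.sum((g == 1) & (r == 0))],
                     [np.sum((g == 0) & (r == 1)), np.sum((g == 0) & (r == 0))]]
            return fisher_exact(table)[1]

        observed = np.array([fisher_p(geno[:, j], resp) for j in range(n_snps)])

        # Permutation test: shuffle response labels and keep the minimum p-value per
        # permutation, so the corrected p-value accounts for testing all SNPs at once.
        n_perm = 1000
        min_null = np.array([min(fisher_p(geno[:, j], rng.permutation(resp))
                                 for j in range(n_snps))
                             for _ in range(n_perm)])
        corrected = [(min_null <= p).mean() for p in observed]
        print("raw p-values:      ", np.round(observed, 3))
        print("corrected p-values:", np.round(corrected, 3))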

  5. Knowledge engineering for adverse drug event prevention: on the design and development of a uniform, contextualized and sustainable knowledge-based framework.

    PubMed

    Koutkias, Vassilis; Kilintzis, Vassilis; Stalidis, George; Lazou, Katerina; Niès, Julie; Durand-Texte, Ludovic; McNair, Peter; Beuscart, Régis; Maglaveras, Nicos

    2012-06-01

    The primary aim of this work was the development of a uniform, contextualized and sustainable knowledge-based framework to support adverse drug event (ADE) prevention via Clinical Decision Support Systems (CDSSs). In this regard, the employed methodology involved first the systematic analysis and formalization of the knowledge sources elaborated in the scope of this work, through which an application-specific knowledge model has been defined. The entire framework architecture was then specified and implemented by adopting Computer Interpretable Guidelines (CIGs) as the knowledge engineering formalism for its construction. The framework integrates diverse and dynamic knowledge sources in the form of rule-based ADE signals, all under a uniform Knowledge Base (KB) structure, according to the defined knowledge model. Equally important, it employs the means to contextualize the encapsulated knowledge, in order to provide appropriate support considering the specific local environment (hospital, medical department, language, etc.), as well as the mechanisms for knowledge querying, inference, sharing, and management. In this paper, we present in detail the establishment of the proposed knowledge framework, describing the employed methodology and the results obtained as regards implementation, performance and validation, which highlight its applicability and value in medication safety.

  6. SU-E-P-58: Dosimetric Study of Conventional Intensity-Modulated Radiotherapy and Knowledge-Based Radiation Therapy for Postoperation of Cervix Cancer

    SciTech Connect

    Ma, C; Yin, Y

    2015-06-15

    Purpose: To compare the dosimetric differences in the target volume and organs at risk (OARs) between conventional intensity-modulated radiotherapy (C-IMRT) and knowledge-based radiation therapy (KBRT) plans for cervix cancer. Methods: 39 patients with cervical cancer after surgery were randomly selected; 20 patient plans were used to create the model and the other 19 cases were used for comparative evaluation. All plans were designed in the Eclipse system. The prescription dose was 30.6 Gy in 17 fractions, with OAR doses satisfying the clinical requirements. A paired t test was used to evaluate differences in the dose-volume histograms (DVH). Results: Compared to the C-IMRT plans, the KBRT plans achieved similar target dose coverage; D98, D95, D2, HI and CI showed no difference (P≥0.05). The doses to the rectum, bladder and femoral heads showed no significant differences (P≥0.05). The time needed to design a treatment plan was significantly reduced. Conclusion: This study shows that for postoperative radiotherapy of the cervix, KBRT plans can achieve similar target and OAR doses with a shorter planning time.
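
    A minimal sketch of the statistical comparison described above, assuming per-patient DVH metrics for both plan types have already been extracted; the values and metric dictionaries below are synthetic stand-ins, not data from the study.

    ```python
    # Paired comparison of DVH metrics between C-IMRT and KBRT plans for the same
    # patients, using a paired t test. Metric names follow the abstract; numbers are synthetic.
    import numpy as np
    from scipy.stats import ttest_rel

    rng = np.random.default_rng(1)
    n_patients = 19
    metrics = ["D98", "D95", "D2", "HI", "CI"]

    cimrt = {m: rng.normal(100, 2, n_patients) for m in metrics}    # hypothetical values
    kbrt = {m: cimrt[m] + rng.normal(0, 1, n_patients) for m in metrics}

    for m in metrics:
        t, p = ttest_rel(cimrt[m], kbrt[m])
        verdict = "no significant difference" if p >= 0.05 else "significant difference"
        print(f"{m}: t = {t:+.2f}, p = {p:.3f} -> {verdict}")
    ```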

  7. Iterative Knowledge-Based Scoring Functions Derived from Rigid and Flexible Decoy Structures: Evaluation with the 2013 and 2014 CSAR Benchmarks.

    PubMed

    Yan, Chengfei; Grinter, Sam Z; Merideth, Benjamin Ryan; Ma, Zhiwei; Zou, Xiaoqin

    2016-06-27

    In this study, we developed two iterative knowledge-based scoring functions, ITScore_pdbbind(rigid) and ITScore_pdbbind(flex), using rigid decoy structures and flexible decoy structures, respectively, that were generated from the protein-ligand complexes in the refined set of PDBbind 2012. These two scoring functions were evaluated using the 2013 and 2014 CSAR benchmarks. The results were compared with the results of two other scoring functions, the Vina scoring function and ITScore, the scoring function that we previously developed from rigid decoy structures for a smaller set of protein-ligand complexes. A graph-based method was developed to evaluate the root-mean-square deviation between two conformations of the same ligand with different atom names and orders due to different file preparations, and the program is freely available. Our study showed that the two new scoring functions developed from the larger training set yielded significantly improved performance in binding mode predictions. For binding affinity predictions, all four scoring functions showed protein-dependent performance. We suggest the development of protein-family-dependent scoring functions for accurate binding affinity prediction.
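
    The graph-based RMSD idea mentioned above can be sketched as follows: treat each conformation as a molecular graph whose nodes carry element labels, enumerate element-preserving graph isomorphisms, and report the smallest RMSD over the admissible atom correspondences. This is an illustration of the general technique, not the authors' released program, and it assumes both poses already sit in the same coordinate frame (no superposition is applied).

    ```python
    # Sketch: minimum RMSD between two conformations of the same ligand whose atoms are
    # named/ordered differently, by matching atoms through bond topology and element type.
    import numpy as np
    import networkx as nx
    from networkx.algorithms.isomorphism import GraphMatcher, categorical_node_match

    def ligand_graph(elements, bonds, coords):
        g = nx.Graph()
        for i, (el, xyz) in enumerate(zip(elements, coords)):
            g.add_node(i, element=el, xyz=np.asarray(xyz, dtype=float))
        g.add_edges_from(bonds)
        return g

    def graph_rmsd(g1, g2):
        """Minimum in-place RMSD over all element-preserving graph isomorphisms."""
        matcher = GraphMatcher(g1, g2, node_match=categorical_node_match("element", None))
        best = None
        for mapping in matcher.isomorphisms_iter():
            a = np.array([g1.nodes[i]["xyz"] for i in mapping])
            b = np.array([g2.nodes[j]["xyz"] for j in mapping.values()])
            rmsd = np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1)))
            best = rmsd if best is None else min(best, rmsd)
        return best

    # Toy example: the same 3-atom fragment with atoms listed in a different order.
    g_a = ligand_graph(["C", "O", "N"], [(0, 1), (0, 2)], [(0, 0, 0), (1.2, 0, 0), (-1.3, 0, 0)])
    g_b = ligand_graph(["N", "C", "O"], [(1, 2), (1, 0)], [(-1.3, 0, 0), (0, 0, 0), (1.2, 0, 0)])
    print(graph_rmsd(g_a, g_b))   # 0.0 once the correct atom correspondence is found
    ```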

  8. The peptide agonist-binding site of the glucagon-like peptide-1 (GLP-1) receptor based on site-directed mutagenesis and knowledge-based modelling.

    PubMed

    Dods, Rachel L; Donnelly, Dan

    2015-11-23

    Glucagon-like peptide-1 (7-36)amide (GLP-1) plays a central role in regulating blood sugar levels and its receptor, GLP-1R, is a target for anti-diabetic agents such as the peptide agonist drugs exenatide and liraglutide. In order to understand the molecular nature of the peptide-receptor interaction, we used site-directed mutagenesis and pharmacological profiling to highlight nine sites as being important for peptide agonist binding and/or activation. Using a knowledge-based approach, we constructed a 3D model of agonist-bound GLP-1R, basing the conformation of the N-terminal region on that of the receptor-bound NMR structure of the related peptide pituitary adenylate cyclase-activating polypeptide (PACAP21). The relative position of the extracellular domain to the transmembrane (TM) domain, as well as the molecular details of the agonist-binding site itself, were found to be different from the model that was published alongside the crystal structure of the TM domain of the glucagon receptor, but were nevertheless more compatible with published mutagenesis data. Furthermore, the NMR-determined structure of a high-potency cyclic conformationally-constrained 11-residue analogue of GLP-1 was also docked into the receptor-binding site. Despite having a different main chain conformation to that seen in the PACAP21 structure, four conserved residues (equivalent to His-7, Glu-9, Ser-14 and Asp-15 in GLP-1) could be structurally aligned and made similar interactions with the receptor as their equivalents in the GLP-1-docked model, suggesting the basis of a pharmacophore for GLP-1R peptide agonists. In this way, the model not only explains current mutagenesis and molecular pharmacological data but also provides a basis for further experimental design.

  9. Academic Performance on First-Year Medical School Exams: How Well Does It Predict Later Performance on Knowledge-Based and Clinical Assessments?

    PubMed

    Krupat, Edward; Pelletier, Stephen R; Dienstag, Jules L

    2017-01-01

    The number of appearances in the bottom quartile on 1st-year medical school exams was used to represent the extent to which students were having academic difficulties. Medical educators have long expressed a desire to have indicators of medical student performance that have strong predictive validity. Predictors traditionally used fell into 4 general categories: demographic (e.g., gender), other background factors (e.g., college major), performance/aptitude (e.g., medical college admission test scores), and noncognitive factors (e.g., curiosity). These factors, however, have an inconsistent record of predicting student performance. In comparison to traditional predictive factors, we sought to determine the extent to which academic performance in the 1st year of medical school, as measured by examination performance in the bottom quartile of the class in 7 required courses, predicted later performance on a variety of assessments, both knowledge based (e.g., United States Medical Licensing Examination Step 1 and Step 2 CK) and clinical skills based (e.g., clerkship grades and objective structured clinical exam performance). Of all predictors measured, number of appearances in the bottom quartile in Year 1 was the most strongly related to performance on knowledge-based assessments, as well as clinically related outcomes, and, for each outcome, bottom-quartile performance accounted for additional variance beyond that of the traditional predictors. Low academic performance in the 1st year of medical school is a meaningful risk factor with both predictive validity and predictive utility for low performance later in medical school. The question remains as to how we can incorporate this indicator into a system of formative assessment that effectively addresses the challenges of medical students once they have been identified.

  10. A knowledge-based molecular screen uncovers a broad-spectrum OsSWEET14 resistance allele to bacterial blight from wild rice.

    PubMed

    Hutin, Mathilde; Sabot, François; Ghesquière, Alain; Koebnik, Ralf; Szurek, Boris

    2015-11-01

    Transcription activator-like (TAL) effectors are type III-delivered transcription factors that enhance the virulence of plant pathogenic Xanthomonas species through the activation of host susceptibility (S) genes. TAL effectors recognize their DNA target(s) via a partially degenerate code, whereby modular repeats in the TAL effector bind to nucleotide sequences in the host promoter. Although this knowledge has greatly facilitated our power to identify new S genes, it can also be easily used to screen plant genomes for variations in TAL effector target sequences and to predict for loss-of-function gene candidates in silico. In a proof-of-principle experiment, we screened a germplasm of 169 rice accessions for polymorphism in the promoter of the major bacterial blight susceptibility S gene OsSWEET14, which encodes a sugar transporter targeted by numerous strains of Xanthomonas oryzae pv. oryzae. We identified a single allele with a deletion of 18 bp overlapping with the binding sites targeted by several TAL effectors known to activate the gene. We show that this allele, which we call xa41(t), confers resistance against half of the tested Xoo strains, representative of various geographic origins and genetic lineages, highlighting the selective pressure on the pathogen to accommodate OsSWEET14 polymorphism, and reciprocally the apparent limited possibilities for the host to create variability at this particular S gene. Analysis of xa41(t) conservation across the Oryza genus enabled us to hypothesize scenarios as to its evolutionary history, prior to and during domestication. Our findings demonstrate that resistance through TAL effector-dependent loss of S-gene expression can be greatly fostered upon knowledge-based molecular screening of a large collection of host plants.

  11. Setting up a large set of protein-ligand PDB complexes for the development and validation of knowledge-based docking algorithms

    PubMed Central

    Diago, Luis A; Morell, Persy; Aguilera, Longendri; Moreno, Ernesto

    2007-01-01

    Background The number of algorithms available to predict ligand-protein interactions is large and ever-increasing. The number of test cases used to validate these methods is usually small and problem dependent. Recently, several databases have been released for further understanding of protein-ligand interactions, having the Protein Data Bank as backend support. Nevertheless, it appears to be difficult to test docking methods on a large variety of complexes. In this paper we report the development of a new database of protein-ligand complexes tailored for testing of docking algorithms. Methods Using a new definition of molecular contact, small ligands contained in the 2005 PDB edition were identified and processed. The database was enriched in molecular properties. In particular, an automated typing of ligand atoms was performed. A filtering procedure was applied to select a non-redundant dataset of complexes. Data mining was performed to obtain information on the frequencies of different types of atomic contacts. Docking simulations were run with the program DOCK. Results We compiled a large database of small ligand-protein complexes, enriched with different calculated properties, that currently contains more than 6000 non-redundant structures. As an example to demonstrate the value of the new database, we derived a new set of chemical matching rules to be used in the context of the program DOCK, based on contact frequencies between ligand atoms and points representing the protein surface, and proved their enhanced efficiency with respect to the default set of rules included in that program. Conclusion The new database constitutes a valuable resource for the development of knowledge-based docking algorithms and for testing docking programs on large sets of protein-ligand complexes. The new chemical matching rules proposed in this work significantly increase the success rate in DOCKing simulations. The database developed in this work is available at . PMID:17718923
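
    As a rough illustration of the kind of contact statistics described above, the sketch below counts how often each ligand atom type falls within a fixed cutoff of each protein atom type across a set of complexes. The 4.5 Å cutoff, the atom typing and the input format are assumptions for illustration, not the database's actual contact definition (which uses points representing the protein surface).

    ```python
    # Sketch of accumulating ligand/protein contact frequencies over many complexes.
    import numpy as np
    from collections import Counter

    CUTOFF = 4.5  # Angstrom; hypothetical contact threshold

    def contact_counts(lig_types, lig_xyz, prot_types, prot_xyz):
        """Count (ligand type, protein type) pairs closer than CUTOFF."""
        counts = Counter()
        lig_xyz = np.asarray(lig_xyz, float)
        prot_xyz = np.asarray(prot_xyz, float)
        d = np.linalg.norm(lig_xyz[:, None, :] - prot_xyz[None, :, :], axis=-1)
        for i, j in zip(*np.where(d < CUTOFF)):
            counts[(lig_types[i], prot_types[j])] += 1
        return counts

    # Accumulate over many complexes, then normalise into frequencies.
    total = Counter()
    for lig_types, lig_xyz, prot_types, prot_xyz in []:   # iterate your parsed complexes here
        total += contact_counts(lig_types, lig_xyz, prot_types, prot_xyz)

    n = sum(total.values()) or 1
    frequencies = {pair: c / n for pair, c in total.items()}
    print(frequencies)
    ```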

  12. Determining the impact on the professional learning of graduates of a science and pedagogical content knowledge-based graduate degree program

    NASA Astrophysics Data System (ADS)

    Mike, Alyson Mary

    This study examined the professional learning of participants in a science and pedagogical content knowledge-based graduate degree program, specifically the Master of Science in Science Education (MSSE) at Montana State University. The program's blended learning model includes distance learning coursework and laboratory, field and seminar experiences. Three-quarters of the faculty are scientists. The study sought to identify program components that contribute to a graduate course of study that is coherent, has academic rigor, and contributes to educator's professional growth and learning. The study examined the program from three perspectives: recommendations for teachers' professional learning through professional development, components of a quality graduate program, and a framework for distance learning. No large-scale studies on comprehensive models of teacher professional learning leading to change in practice have been conducted in the United States. The literature on teachers' professional learning is small. Beginning with a comprehensive review of the literature, this study sought to identify components of professional learning through professional development for teachers. The MSSE professional learning survey was designed for students and faculty, and 349 students and 24 faculty responded. The student survey explored how course experiences fostered professional learning. Open-ended responses on the student survey provided insight regarding specific program experiences influencing key categories of professional learning. A parallel faculty survey was designed to elicit faculty perspectives on the extent to which their courses fostered science content knowledge and other aspects of professional learning. Case study data and portfolios from MSSE students were used to provide deeper insights into the influential aspects of the program. The study provided evidence of significant professional learning among science teacher participants. This growth occurred in

  13. Distributed Knowledge-Based Systems

    DTIC Science & Technology

    1989-03-15

    For example, patients with cerebral palsy, a disease affecting motor control, typically have several muscles that function improperly in different...even motor ) phenomena, but the framework is meant to cover complex cognitive phenomena as well. Eliminative materialism in philosophy, Gibsonian...linear analogue architectures [21], have been developed. These models typically deal with motor or perceptual phenomena; neural networks that capture a

  14. Knowledge-Based Replanning System.

    DTIC Science & Technology

    1987-05-01

    implementation and the line of reasoning to explain why a specific approach was taken. Section 3 discusses the basic components of KRS and, in some cases ...itportability, maintainability, and extensibility all suffer. In the case of APE-II,this truism has been born out. Not only is the parser completely...is expected to be an AIRCRAFT in which case it is the OBJECT of FLY or is expected to be a BIRD in which case it is both the ACTOR and the OBJECT of

  15. Automated knowledge-base refinement

    NASA Technical Reports Server (NTRS)

    Mooney, Raymond J.

    1994-01-01

    Over the last several years, we have developed several systems for automatically refining incomplete and incorrect knowledge bases. These systems are given an imperfect rule base and a set of training examples and minimally modify the knowledge base to make it consistent with the examples. One of our most recent systems, FORTE, revises first-order Horn-clause knowledge bases. This system can be viewed as automatically debugging Prolog programs based on examples of correct and incorrect I/O pairs. In fact, we have already used the system to debug simple Prolog programs written by students in a programming language course. FORTE has also been used to automatically induce and revise qualitative models of several continuous dynamic devices from qualitative behavior traces. For example, it has been used to induce and revise a qualitative model of a portion of the Reaction Control System (RCS) of the NASA Space Shuttle. By fitting a correct model of this portion of the RCS to simulated qualitative data from a faulty system, FORTE was also able to correctly diagnose simple faults in this system.

  16. Knowledge-based pitch detection

    NASA Astrophysics Data System (ADS)

    Dove, W. P.

    1986-06-01

    Many problems in signal processing involve a mixture of numerical and symbolic knowledge. Examples of problems of this sort include the recognition of speech and the analysis of images. This thesis focuses on the problem of employing a mixture of symbolic and numerical knowledge within a single system, through the development of a system directed at a modified pitch detection problem. For this thesis, the conventional pitch detection problem was modified by providing a phonetic transcript and sex/age information as input to the system, in addition to the acoustic waveform. The Pitch Detector's Assistant (PDA) system that was developed is an interactive facility for evaluating ways of approaching this problem. The PDA system allows the user to interrupt processing at any point, change input data, derived data, or problem knowledge, and continue execution.

  17. Knowledge-Based Search Tactics.

    ERIC Educational Resources Information Center

    Shute, Steven J.; Smith, Philip J.

    1993-01-01

    Describes an empirical study that was conducted to examine the performance of expert search intermediaries from Chemical Abstracts Service. Highlights include subject-independent and subject-dependent expertise; a model of the use of subject-specific knowledge; and implications for computerized intermediary systems and for training human…

  18. Knowledge-Based Pitch Detection.

    DTIC Science & Technology

    1986-06-01

    tin to the nose and the other passage going past the tongue to the mouth. The velum can be used to close off the passage through the nose (the nasal...tract), so only a single passage extends from the glottis. The tongue , lips and jaw can be used ra NOSTRIL MOUTH - 4 ’ ŖNGUAGU Figure 2.1 X-ray and...is called aspiration (as in the sound wh"). When the constriction is made with the tongue lips or teeth (as in the earlier examples), then the sound

  19. Reuse: A knowledge-based approach

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui

    1992-01-01

    This paper describes our research in automating the reuse process through the use of application domain models. Application domain models are explicit formal representations of the application knowledge necessary to understand, specify, and generate application programs. Furthermore, they provide a unified repository for the operational structure, rules, policies, and constraints of a specific application area. In our approach, domain models are expressed in terms of a transaction-based meta-modeling language. This paper has described in detail the creation and maintenance of hierarchical structures. These structures are created through a process that includes reverse engineering of data models with supplementary enhancement from application experts. Source code is also reverse engineered but is not a major source of domain model instantiation at this time. In the second phase of the software synthesis process, program specifications are interactively synthesized from an instantiated domain model. These specifications are currently integrated into a manual programming process but will eventually be used to derive executable code with mechanically assisted transformations. This research is performed within the context of programming-in-the-large types of systems. Although our goals are ambitious, we are implementing the synthesis system in an incremental manner through which we can realize tangible results. The client/server architecture is capable of supporting 16 simultaneous X/Motif users and tens of thousands of attributes and classes. Domain models have been partially synthesized from five different application areas. As additional domain models are synthesized and additional knowledge is gathered, we will inevitably add to and modify our representation. However, our current experience indicates that it will scale and expand to meet our modeling needs.

  20. PROUST: Knowledge-Based Program Understanding.

    DTIC Science & Technology

    1983-08-01

    undeniably yes. If anything, PROUST is the minimum that is required! The basis for this conclusion is twofold: 1. In Artificial Intelligence research...Role of Plans in Intellegent Teaching Systems. In Brown, J. S. and Sleeman, D. (editors), Intellegent Tutoring Systems. New York. 1981. [8] Goldstein, I...95, 1978. (12] Rich, C. A Formal Representation for Plans in the Programmer’s Apprentice. In Proc. of the Seventh Int. Joint Conf. on Artificial

  1. NASDA knowledge-based network planning system

    NASA Technical Reports Server (NTRS)

    Yamaya, K.; Fujiwara, M.; Kosugi, S.; Yambe, M.; Ohmori, M.

    1993-01-01

    One of the SODS (space operation and data system) sub-systems, NP (network planning), was the first expert system used by NASDA (national space development agency of Japan) for tracking and control of satellites. The major responsibilities of the NP system are: first, the allocation of network and satellite control resources and, second, the generation of the network operation plan data (NOP) used in automated control of the stations and control center facilities. Until now, the first task, network resource scheduling, was done by network operators. The NP system automatically generates schedules using its knowledge base, which contains information on satellite orbits, station availability, which computer is dedicated to which satellite, and how many stations must be available for a particular satellite pass or a certain time period. The NP system is introduced.

  2. Knowledge-Based System Analysis and Control

    DTIC Science & Technology

    1989-09-30

    for use as a training tool (given considerable enlargement of its present circuit data base and problem repertoire), because it can provide step-by...from the slow and expensive process of training personnel in complex professional specialties. Tech Control began to emerge as a skill area ripe for...for any purpose but offline training . In late FY87 and early FY88, planning was therefore begun for a new expert system which would have no air gap

  3. Fundamentals of Knowledge-Based Techniques

    DTIC Science & Technology

    2006-09-01

    Predicate logic knowledge representation models what are known as facts within a domain and represents these facts so that an inference engine or... [Figure 3: Relational DBMS Model of Facts and Causes.] Semantic Nets: Originally, semantic nets were developed for the purpose of modeling the English language (7), for a computer to understand. A method

  4. Reactome: a knowledgebase of biological pathways

    PubMed Central

    Joshi-Tope, G.; Gillespie, M.; Vastrik, I.; D'Eustachio, P.; Schmidt, E.; de Bono, B.; Jassal, B.; Gopinath, G.R.; Wu, G.R.; Matthews, L.; Lewis, S.; Birney, E.; Stein, L.

    2005-01-01

    Reactome, located at http://www.reactome.org, is a curated, peer-reviewed resource of human biological processes. Given the genetic makeup of an organism, the complete set of possible reactions constitutes its reactome. The basic unit of the Reactome database is a reaction; reactions are then grouped into causal chains to form pathways. The Reactome data model allows us to represent many diverse processes in the human system, including the pathways of intermediary metabolism, regulatory pathways, and signal transduction, and high-level processes, such as the cell cycle. Reactome provides a qualitative framework, on which quantitative data can be superimposed. Tools have been developed to facilitate custom data entry and annotation by expert biologists, and to allow visualization and exploration of the finished dataset as an interactive process map. Although our primary curational domain is pathways from Homo sapiens, we regularly create electronic projections of human pathways onto other organisms via putative orthologs, thus making Reactome relevant to model organism research communities. The database is publicly available under open source terms, which allows both its content and its software infrastructure to be freely used and redistributed. PMID:15608231
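
    A minimal sketch of the data model described above, in which the reaction is the basic unit and reactions are grouped into causal chains to form pathways. The class and field names are hypothetical illustrations, not Reactome's actual schema.

    ```python
    # Toy data model: reactions with inputs, outputs and causal predecessors, grouped into a pathway.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Reaction:
        name: str
        inputs: List[str]                                           # physical entities consumed
        outputs: List[str]                                          # physical entities produced
        preceding: List["Reaction"] = field(default_factory=list)   # causal predecessors

    @dataclass
    class Pathway:
        name: str
        events: List[Reaction] = field(default_factory=list)

        def add(self, reaction: Reaction) -> None:
            self.events.append(reaction)

    # Example: two causally chained reactions grouped into one pathway.
    r1 = Reaction("hexokinase", ["glucose", "ATP"], ["glucose-6-phosphate", "ADP"])
    r2 = Reaction("phosphoglucose isomerase", ["glucose-6-phosphate"], ["fructose-6-phosphate"],
                  preceding=[r1])
    glycolysis = Pathway("glycolysis (fragment)")
    glycolysis.add(r1)
    glycolysis.add(r2)
    print([e.name for e in glycolysis.events])
    ```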

  5. Uncertainty Models for Knowledge-Based Systems

    DTIC Science & Technology

    1991-08-01

    D. V. (1982). Improving judgment by reconciling incoherence. The behavioral and brain Sciences, 4, 317-370. (26] Carnap , R. (1958). Introduction to...Symbolic Logic and its Applications. Dover, N. Y. References 597 [271 Carnap , R. (1959). The Logical Syntax of Language. Littlefield, Adam and Co...Paterson, New Jersey. [28] Carnap , R. (1960). Meaning and Necessity, a Study in Semantic and Model Logic. Phoenix Books, Univ. of Chicago. [29] Carrega

  6. Knowledge-Based Software Development Tools

    DTIC Science & Technology

    1993-09-01

    inference, finite differencing, and data structure selection are discussed. A detailed case study is presented that shows how these systems could cooperate...theories that were codified in the language. In particular, encoded in CHI were tireories for: generating data structure implementations 𔄃], which mil...problems, and the finite-differencing program optinmizatica technique Implementation knowledge for data structure generatmin and performance estimatim

  7. Knowledge-based systems for power management

    NASA Technical Reports Server (NTRS)

    Lollar, L. F.

    1992-01-01

    NASA-Marshall's Electrical Power Branch has undertaken the development of expert systems in support of further advancements in electrical power system automation. Attention is given to the features (1) of the Fault Recovery and Management Expert System, (2) a resource scheduler or Master of Automated Expert Scheduling Through Resource Orchestration, and (3) an adaptive load-priority manager, or Load Priority List Management System. The characteristics of an advisory battery manager for the Hubble Space Telescope, designated the 'nickel-hydrogen expert system', are also noted.

  8. Prior knowledge-based approach for associating ...

    EPA Pesticide Factsheets

    Evaluating the potential human health and/or ecological risks associated with exposures to complex chemical mixtures in the ambient environment is one of the central challenges of chemical safety assessment and environmental protection. There is a need for approaches that can help to integrate chemical monitoring and bio-effects data to evaluate risks associated with chemicals present in the environment. We used prior knowledge about chemical-gene interactions to develop a knowledge assembly model for detected chemicals at five locations near two wastewater treatment plants. The assembly model was used to generate hypotheses about the biological impacts of the chemicals at each location. The hypotheses were tested using empirical hepatic gene expression data from fathead minnows exposed for 12 d at each location. Empirical gene expression data were also mapped to the assembly models to statistically evaluate the likelihood of a chemical contributing to the observed biological responses. The prior knowledge approach was able to reasonably hypothesize the biological impacts at one site but not the other. Chemicals most likely contributing to the observed biological responses were identified at each location. Despite limitations to the approach, knowledge assembly models have strong potential for associating chemical occurrence with potential biological effects and providing a foundation for hypothesis generation to guide research and/or monitoring efforts relat

  9. Knowledge-based flow field zoning

    NASA Technical Reports Server (NTRS)

    Andrews, Alison E.

    1988-01-01

    Automating flow field zoning in two dimensions is an important step towards easing the three-dimensional grid generation bottleneck in computational fluid dynamics. A knowledge-based approach works well, but certain aspects of flow field zoning make the use of such an approach challenging. A knowledge-based flow field zoner, called EZGrid, was implemented and tested on representative two-dimensional aerodynamic configurations. Results are shown which illustrate the way in which EZGrid incorporates the effects of physics, shape description, position, and user bias in flow field zoning.

  10. Strategies for Knowledge-Based Image Interpretation.

    DTIC Science & Technology

    1982-05-01

    hypothesis formation and hypothesis verification Certain assumptions have beer ) ’ide in the experiments. Fhe system assumes a camera posltIOn that is...object. PACE! IF, The strategy did not work well One problemn was the basic irability to label any region with (ireat accuracy. Another was the v...c h..r J fe. ’efe PAifll - A. No iep Lamber . .. ’ lin 1 . "A Fra ne o k . Rep resent thu L-.IWuLedqv, I r T he P Lj ar ci -( un ltir Vi.sio~n, l". W

  11. Knowledge-based operation and management of communications systems

    NASA Technical Reports Server (NTRS)

    Heggestad, Harold M.

    1988-01-01

    Expert systems techniques are being applied in operation and control of the Defense Communications System (DCS), which has the mission of providing reliable worldwide voice, data and message services for U.S. forces and commands. Thousands of personnel operate DCS facilities, and many of their functions match the classical expert system scenario: complex, skill-intensive environments with a full spectrum of problems in training and retention, cost containment, modernization, and so on. Two of these functions are: (1) fault isolation and restoral of dedicated circuits at Tech Control Centers, and (2) network management for the Defense Switched Network (the modernized dial-up voice system currently replacing AUTOVON). An expert system for the first of these is deployed for evaluation purposes at Andrews Air Force Base, and plans are being made for procurement of operational systems. In the second area, knowledge obtained with a sophisticated simulator is being embedded in an expert system. The background, design and status of both projects are described.

  12. A knowledge-based system for controlling automobile traffic

    NASA Technical Reports Server (NTRS)

    Maravas, Alexander; Stengel, Robert F.

    1994-01-01

    Transportation network capacity variations arising from accidents, roadway maintenance activity, and special events, as well as fluctuations in commuters' travel demands, complicate traffic management. Artificial intelligence concepts and expert systems can be useful in framing policies for incident detection, congestion anticipation, and optimal traffic management. This paper examines the applicability of intelligent route guidance and control as decision aids for traffic management. Basic requirements for managing traffic are reviewed, concepts for studying traffic flow are introduced, and mathematical models of traffic flow are examined. Measures for quantifying transportation network performance levels are chosen, and surveillance and control strategies are evaluated. It can be concluded that automated decision support holds great promise for aiding the efficient flow of automobile traffic over limited-access roadways, bridges, and tunnels.
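
    For readers unfamiliar with the mathematical models referred to above, the sketch below shows one of the simplest macroscopic traffic-flow relations, the Greenshields linear speed-density law. The abstract examines traffic-flow models in general, so this particular model and its parameter values are illustrative assumptions only.

    ```python
    # Greenshields relation: v = v_f * (1 - k / k_j), flow q = k * v,
    # with capacity reached at half the jam density. Parameter values are assumed.
    import numpy as np

    V_FREE = 100.0     # free-flow speed, km/h (assumed)
    K_JAM = 120.0      # jam density, veh/km (assumed)

    def speed(density):
        return V_FREE * (1.0 - density / K_JAM)

    def flow(density):
        return density * speed(density)

    for k in np.linspace(0.0, K_JAM, 7):
        print(f"density {k:6.1f} veh/km -> speed {speed(k):6.1f} km/h, flow {flow(k):7.1f} veh/h")
    # Capacity occurs at k = K_JAM / 2 = 60 veh/km: q_max = V_FREE * K_JAM / 4 = 3000 veh/h.
    ```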

  13. Knowledge-Based Extensible Natural Language Interface Technology Program

    DTIC Science & Technology

    1989-11-30

    np,v.)](]JI * 1 (sempred,rulelO] II pnp <-TREE--) [semresult,[ [specificity(np,v..)],[v..4] ], : : [sempred,rule7], [modifier,-0474], - [pmod,f...Dialogue Between User and Itself". 3. Herb Chapman completed an Independent Study project in August 1988 in conjunction with this effort. This...Grosz, B.J. & Sidner, C.L. 1985. Discourse Structure and the Proper Treatment of Interruptions. Proceedings of IJCAI-85, pp. 832-839. I [Grosz86

  14. Dynamic reasoning in a knowledge-based system

    NASA Technical Reports Server (NTRS)

    Rao, Anand S.; Foo, Norman Y.

    1988-01-01

    Any space-based system, whether it is a robot arm assembling parts in space or an onboard system monitoring the space station, has to react to changes which cannot be foreseen. As a result, apart from having domain-specific knowledge as in current expert systems, a space-based AI system should also have general principles of change. This paper presents a modal logic which can not only represent change but also reason with it. Three primitive operations, expansion, contraction and revision, are introduced, and axioms which specify how the knowledge base should change when the external world changes are also specified. Accordingly, the notion of dynamic reasoning is introduced, which, unlike existing forms of reasoning, provides general principles of change. Dynamic reasoning is based on two main principles, namely minimize change and maximize coherence. A possible-world semantics which incorporates the above two principles is also discussed. The paper concludes by discussing how the dynamic reasoning system can be used to specify actions and hence form an integral part of an autonomous reasoning and planning system.
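
    The three primitive operations named above can be illustrated on a toy knowledge base of propositional literals, with revision built from contraction and expansion so that change is minimized. This is a set-theoretic sketch only; it does not reproduce the paper's modal logic or its possible-world semantics.

    ```python
    # Toy illustration of expansion, contraction and revision on a set of literals.
    # Revision follows the Levi identity: contract the negation, then expand.

    def negate(p: str) -> str:
        return p[1:] if p.startswith("~") else "~" + p

    def expand(kb: frozenset, p: str) -> frozenset:
        """Add p without checking consistency."""
        return kb | {p}

    def contract(kb: frozenset, p: str) -> frozenset:
        """Remove p (and nothing else) so that p is no longer believed."""
        return kb - {p}

    def revise(kb: frozenset, p: str) -> frozenset:
        """Incorporate p while restoring consistency: contract ~p, then expand by p."""
        return expand(contract(kb, negate(p)), p)

    kb = frozenset({"arm_free", "~docked"})
    kb = revise(kb, "docked")        # the world changed: the arm is now docked
    print(sorted(kb))                # ['arm_free', 'docked']
    ```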

  15. Purifying biopharmaceuticals: knowledge-based chromatographic process development.

    PubMed

    Hanke, Alexander T; Ottens, Marcel

    2014-04-01

    The purification of biopharmaceuticals is commonly considered the bottleneck of the manufacturing process. Increasing product diversity, along with growing regulatory and economic constraints raise the need to adopt new rational, systematic, and generally applicable process development strategies. Liquid chromatography is the key step in most purification processes and a well-understood unit operation, yet this understanding is still rarely effectively utilized during process development. Knowledge of the composition of the mixture, the molecular properties of the solutes and how they interact with the resins are required to rationalise the design choices. Here, we provide an overview of the advances in the determination and measurement of these properties and interactions, and outline their use throughout the different stages of downstream process development.

  16. KNOBS (Knowledge-Based System) - The Final Report (1982).

    DTIC Science & Technology

    1986-08-01

    Arbitrary decisions in Conceptual Dependency...variable that needed to be bound. The pattern matcher needed to bind all these variables while looking for a single clause in the list of currently...in fact the value for the given slot. If the frame has been bound to a constant, and the value is a variable, INFER will bind the variable to the

  17. Towards a knowledge-based correction of iron chlorosis.

    PubMed

    Abadía, Javier; Vázquez, Saúl; Rellán-Álvarez, Rubén; El-Jendoubi, Hamdi; Abadía, Anunciación; Alvarez-Fernández, Ana; López-Millán, Ana Flor

    2011-05-01

    Iron (Fe) deficiency-induced chlorosis is a major nutritional disorder in crops growing in calcareous soils. Iron deficiency in fruit tree crops causes chlorosis, decreases in vegetative growth and marked fruit yield and quality losses. Therefore, Fe fertilizers, either applied to the soil or delivered to the foliage, are used every year to control Fe deficiency in these crops. On the other hand, a substantial body of knowledge is available on the fundamentals of Fe uptake, long and short distance Fe transport and subcellular Fe allocation in plants. Most of this basic knowledge, however, applies only to Fe deficiency, with studies involving Fe fertilization (i.e., with Fe-deficient plants resupplied with Fe) being still scarce. This paper reviews recent developments in Fe-fertilizer research and the state-of-the-art of the knowledge on Fe acquisition, transport and utilization in plants. Also, the effects of Fe-fertilization on the plant responses to Fe deficiency are reviewed. Agronomical Fe-fertilization practices should benefit from the basic knowledge on plant Fe homeostasis already available; this should be considered as a long-term goal that can optimize fertilizer inputs, reduce grower's costs and minimize the environmental impact of fertilization.

  18. Software Acquisition Manager’s Knowledge-Based Expert System.

    DTIC Science & Technology

    1982-06-30

    System is obtained by the formula FS = RI + FSAE, where FS is the Full System MM, RI is the Release 1 MM, and FSAE is the Full System Additional Effort MM, with...lines of code (SLOC) for Release 1 (RI) and the Full System Additional Effort (FSAE) and shows the results of the computation (SLOC PF) rounded to...manpower: Release 1 (RI) is 39.4 MM for SLOC. Full System Additional Effort (FSAE) is 27.0 MM for SLOC. The results above and in Figure 10 for the

  19. A model for a knowledge-based system's life cycle

    NASA Technical Reports Server (NTRS)

    Kiss, Peter A.

    1990-01-01

    The American Institute of Aeronautics and Astronautics has initiated a Committee on Standards for Artificial Intelligence. Presented here are the initial efforts of one of the working groups of that committee. The purpose here is to present a candidate model for the development life cycle of Knowledge Based Systems (KBS). The intent is for the model to be used by the Aerospace Community and eventually be evolved into a standard. The model is rooted in the evolutionary model, borrows from the spiral model, and is embedded in the standard Waterfall model for software development. Its intent is to satisfy the development of both stand-alone and embedded KBSs. The phases of the life cycle are detailed, as are the review points that constitute the key milestones throughout the development process. The applicability and strengths of the model are discussed along with areas needing further development and refinement by the aerospace community.

  20. Dynamic Enforcement of Knowledge-based Security Policies

    DTIC Science & Technology

    2011-04-05

    probabilistic computation based on sampling. I. INTRODUCTION Facebook, Twitter, Flickr , and other successful on-line ser- vices enable users to easily...advertisement was for CM Photographics , and targets offers for wedding photography packages at women between the ages of 24 and 30 who list in their

  1. Knowledge-based data analysis comes of age.

    PubMed

    Ochs, Michael F

    2010-01-01

    The emergence of high-throughput technologies for measuring biological systems has introduced problems for data interpretation that must be addressed for proper inference. First, analysis techniques need to be matched to the biological system, reflecting in their mathematical structure the underlying behavior being studied. When this is not done, mathematical techniques will generate answers, but the values and reliability estimates may not accurately reflect the biology. Second, analysis approaches must address the vast excess in variables measured (e.g. transcript levels of genes) over the number of samples (e.g. tumors, time points), known as the 'large-p, small-n' problem. In large-p, small-n paradigms, standard statistical techniques generally fail, and computational learning algorithms are prone to overfit the data. Here we review the emergence of techniques that match mathematical structure to the biology, the use of integrated data and prior knowledge to guide statistical analysis, and the recent emergence of analysis approaches utilizing simple biological models. We show that novel biological insights have been gained using these techniques.
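
    The 'large-p, small-n' failure mode described above can be demonstrated in a few lines: with far more noise features than samples, an essentially unregularized classifier fits the training data perfectly while its cross-validated accuracy stays at chance. The data and model below are synthetic illustrations.

    ```python
    # Toy demonstration of the large-p, small-n problem: 20 samples, 1000 random
    # features, labels carrying no real signal.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(42)
    X = rng.normal(size=(20, 1000))      # p >> n, pure noise
    y = np.array([0, 1] * 10)            # balanced labels, unrelated to X

    # Regularization is effectively disabled via a very large C.
    clf = LogisticRegression(C=1e6, max_iter=5000)
    clf.fit(X, y)
    print("training accuracy:", clf.score(X, y))                                   # typically 1.0 (overfit)
    print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())    # typically near 0.5
    ```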

  2. Satellite Contamination and Materials Outgassing Knowledgebase - An Interactive Database Reference

    NASA Technical Reports Server (NTRS)

    Green, D. B.; Burns, Dewitt (Technical Monitor)

    2001-01-01

    The goal of this program is to collect at one site much of the knowledge accumulated about the outgassing properties of aerospace materials based on ground testing, the effects of this outgassing observed on spacecraft in flight, and the broader contamination environment measured by instruments on-orbit. We believe that this Web site will help move contamination a step forward, away from anecdotal folklore toward engineering discipline. Our hope is that once operational, this site will form a nucleus for information exchange, that users will not only take information from our knowledge base, but also provide new information from ground testing and space missions, expanding and increasing the value of this site to all. We urge Government and industry users to endorse this approach that will reduce redundant testing, reduce unnecessary delays, permit uniform comparisons, and permit informed decisions.

  3. A Knowledge-Base for Rehabilitation of Airfield Concrete Pavements

    DTIC Science & Technology

    1991-01-01

    ...architecture. Let's assume a group of students in a classroom are going to solve a crossword puzzle that is drawn on a Blackboard. Although the clues to...German. Figure 2-12 shows how the crossword puzzle problem might be represented using a Blackboard architecture. The teacher will allow either the

  4. Utilitarian Model of Measuring Confidence within Knowledge-Based Societies

    ERIC Educational Resources Information Center

    Jack, Brady Michael; Hung, Kuan-Ming; Liu, Chia Ju; Chiu, Houn Lin

    2009-01-01

    This paper introduces a utilitarian confidence testing statistic called Risk Inclination Model (RIM) which indexes all possible confidence wagering combinations within the confines of a defined symmetrically point-balanced test environment. This paper presents the theoretical underpinnings, a formal derivation, a hypothetical application, and…

  5. Knowledge-Based Coalition Planning and Operations for Medical Applications

    DTIC Science & Technology

    2002-04-01

    This paper will investigate the characteristics of women who are diagnosed with cervical cancer , features of cancer tissue on X-ray/MRI/PET images...of cervical cancer at Peter MacCallum Cancer Institute (Melbourne). These techniques will require an advanced software platform to store and retrieve

  6. Enabling a systems biology knowledgebase with gaggle and firegoose

    SciTech Connect

    Baliga, Nitin S.

    2014-12-12

    The overall goal of this project was to extend the existing Gaggle and Firegoose systems to develop an open-source technology that runs over the web and links desktop applications with many databases and software applications. This technology would enable researchers to incorporate workflows for data analysis that can be executed from this interface to other online applications. The four specific aims were to (1) provide one-click mapping of genes, proteins, and complexes across databases and species; (2) enable multiple simultaneous workflows; (3) expand sophisticated data analysis for online resources; and (4) enhance open-source development of the Gaggle-Firegoose infrastructure. Gaggle is an open-source Java software system that integrates existing bioinformatics programs and data sources into a user-friendly, extensible environment to allow interactive exploration, visualization, and analysis of systems biology data. Firegoose is an extension to the Mozilla Firefox web browser that enables data transfer between websites and desktop tools including Gaggle. In the last phase of this funding period, we have made substantial progress on the development and application of the Gaggle integration framework. We implemented the workspace in the Network Portal. Users can capture data from Firegoose and save them to the workspace. Users can create workflows to start multiple software components programmatically and pass data between them. Results of analysis can be saved to the cloud so that they can be easily restored on any machine. We also developed the Gaggle Chrome Goose, a plugin for the Google Chrome browser, in tandem with an opencpu server in the Amazon EC2 cloud. This allows users to interactively perform data analysis on a single web page using the R packages deployed on the opencpu server. The cloud-based framework facilitates collaboration between researchers from multiple organizations. We have made a number of enhancements to the cmonkey2 application to enable and improve the integration within different environments, and we have created a new tools pipeline for generating EGRIN2 models in a largely automated way.

  7. Application of Knowledge-Based Techniques to Tracking Function

    DTIC Science & Technology

    2006-09-01

    C. Musso, N. Oudjane, F. Legland, “Improving regularised particle filters”, in [38]. [40] C. Hue, J.-P. Le Cadre, P. Perez , “Sequential Monte Carlo...John Wiley & Sons, New York, January 1990. [50] D. Halls, J. Llinas, “Handbook of multisensor data fusion”, CRC, 2001. [51] Y. Salama , R. Senne

  8. Utilizing knowledge-base semantics in graph-based algorithms

    SciTech Connect

    Darwiche, A.

    1996-12-31

    Graph-based algorithms convert a knowledge base with a graph structure into one with a tree structure (a join-tree) and then apply tree-inference on the result. Nodes in the join-tree are cliques of variables and tree-inference is exponential in w*, the size of the maximal clique in the join-tree. A central property of join-trees that validates tree-inference is the running-intersection property: the intersection of any two cliques must belong to every clique on the path between them. We present two key results in connection to graph-based algorithms. First, we show that the running-intersection property, although sufficient, is not necessary for validating tree-inference. We present a weaker property for this purpose, called running-interaction, that depends on non-structural (semantical) properties of a knowledge base. We also present a linear algorithm that may reduce w* of a join-tree, possibly destroying its running-intersection property, while maintaining its running-interaction property and, hence, its validity for tree-inference. Second, we develop a simple algorithm for generating trees satisfying the running-interaction property. The algorithm bypasses triangulation (the standard technique for constructing join-trees) and does not construct a join-tree first. We show that the proposed algorithm may in some cases generate trees that are more efficient than those generated by modifying a join-tree.
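
    A small sketch of the running-intersection property discussed above: for every pair of cliques in the join-tree, the variables they share must appear in every clique on the tree path between them. The graph representation and the example cliques below are illustrative.

    ```python
    # Check the running-intersection property on a candidate join tree.
    # Nodes are clique identifiers; the node attribute 'vars' holds the clique's variables.
    import networkx as nx
    from itertools import combinations

    def has_running_intersection(tree: nx.Graph) -> bool:
        for a, b in combinations(tree.nodes, 2):
            shared = tree.nodes[a]["vars"] & tree.nodes[b]["vars"]
            if not shared:
                continue
            path = nx.shortest_path(tree, a, b)      # unique path in a tree
            if not all(shared <= tree.nodes[c]["vars"] for c in path):
                return False
        return True

    # Example join tree over cliques {A,B}, {B,C}, {C,D}: the property holds.
    t = nx.Graph()
    t.add_node(0, vars={"A", "B"})
    t.add_node(1, vars={"B", "C"})
    t.add_node(2, vars={"C", "D"})
    t.add_edges_from([(0, 1), (1, 2)])
    print(has_running_intersection(t))   # True

    # Adding D to clique 0 breaks the property: D appears in cliques 0 and 2,
    # but not in clique 1 on the path between them.
    t.nodes[0]["vars"].add("D")
    print(has_running_intersection(t))   # False
    ```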

  9. Knowledge-based optical coatings design and manufacturing

    NASA Astrophysics Data System (ADS)

    Guenther, Karl H.; Gonzalez, Avelino J.; Yoo, Hoi J.

    1990-12-01

    The theory of thin film optics is well developed for the spectral analysis of a given optical coating. The inverse synthesis - designing an optical coating for a certain spectral performance - is more complicated. Usually a multitude of theoretical designs is feasible because most design problems are over-determined with the number of layers possible with three variables each (n, k, t). The expertise of a good thin film designer comes in at this point with a mostly intuitive selection of certain designs based on previous experience and current manufacturing capabilities. Manufacturing a designed coating poses yet another subset of multiple solutions, as thin film deposition technology has evolved over the years with a vast variety of different processes. The abundance of published literature may often be more confusing than helpful to the practicing thin film engineer, even if he has time and opportunity to read it. The choice of the right process is also severely limited by the given manufacturing hardware and cost considerations which may not easily allow for the adoption of a new manufacturing approach, even if it promises to be better technically (it ought to be also cheaper). On the user end of the thin film coating business, the typical optical designer or engineer who needs an optical coating may have limited or no knowledge at all about the theoretical and manufacturing criteria for the optimum selection of what he needs. This can be sensed frequently by overly tight tolerances and requirements for optical performance which sometimes stretch the limits of mother nature. We introduce here a knowledge-based system (KBS) intended to assist expert designers and manufacturers in their task of maximizing results and minimizing errors, trial runs, and unproductive time. It will help the experts to manipulate parameters which are largely determined through heuristic reasoning by employing artificial intelligence techniques. In a later stage, the KBS will include a module allowing the layman user of coatings to make the right choice.
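
    The forward "spectral analysis" direction mentioned above is a standard calculation; the sketch below evaluates normal-incidence reflectance of a non-absorbing layer stack with the characteristic-matrix method. The quarter-wave MgF2-on-glass example and its refractive indices are illustrative assumptions, and the hard inverse problem (design synthesis) is not addressed here.

    ```python
    # Normal-incidence reflectance of a thin-film stack via the characteristic-matrix method.
    import numpy as np

    def reflectance(n_inc, layers, n_sub, wavelength):
        """layers: list of (refractive_index, physical_thickness) pairs, outermost first."""
        m = np.eye(2, dtype=complex)
        for n_j, d_j in layers:
            delta = 2.0 * np.pi * n_j * d_j / wavelength      # phase thickness of the layer
            m_j = np.array([[np.cos(delta), 1j * np.sin(delta) / n_j],
                            [1j * n_j * np.sin(delta), np.cos(delta)]])
            m = m @ m_j
        b, c = m @ np.array([1.0, n_sub])
        y = c / b                                             # input optical admittance of the stack
        r = (n_inc - y) / (n_inc + y)
        return float(np.abs(r) ** 2)

    wl = 550e-9
    mgf2 = (1.38, wl / (4 * 1.38))                            # quarter-wave antireflection layer
    print(f"bare glass: R = {reflectance(1.0, [], 1.52, wl):.3%}")      # about 4.3%
    print(f"with MgF2:  R = {reflectance(1.0, [mgf2], 1.52, wl):.3%}")  # about 1.3%
    ```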

  10. CACTUS: Command and Control Training Using Knowledge-Based Simulations.

    ERIC Educational Resources Information Center

    Hartley, J. R.; And Others

    1992-01-01

    Describes a computer simulation, CACTUS, that was developed in the United Kingdom to help police with command and control training for large crowd control incidents. Use of the simulation for pre-event planning and decision making is discussed, debriefing is described, and the role of the trainer is considered. (LRW)

  11. A Study of Knowledge-Based Systems for Photo Interpretation.

    DTIC Science & Technology

    1980-06-01

    OIL (15] CAI Electronics SOPHIE (10] Medicine GUIDON [14] Learning Chemistry Meta-DENDRAL (i] Agriculture INDUCE [19] Mathematics AM [40] Intelligent...16 6. Computer-Aided Instruction: GUIDON Three types of traditional computer-aided instruction (CAI) are often distinguished: frame-oriented drill-and...systems have an obvious contribution to make to CAI. The GUIDON system developed by Clancey at Stanford exploits the MYCIN knowledge base about

  12. Confidence Testing for Knowledge-Based Global Communities

    ERIC Educational Resources Information Center

    Jack, Brady Michael; Liu, Chia-Ju; Chiu, Houn-Lin; Shymansky, James A.

    2009-01-01

    This proposal advocates the position that the use of confidence wagering (CW) during testing can predict the accuracy of a student's test answer selection during between-subject assessments. Data revealed female students were more favorable to taking risks when making CW and less inclined toward risk aversion than their male counterparts. Student…

  13. Knowledge-based model building of proteins: concepts and examples.

    PubMed Central

    Bajorath, J.; Stenkamp, R.; Aruffo, A.

    1993-01-01

    We describe how to build protein models from structural templates. Methods to identify structural similarities between proteins in cases of significant, moderate to low, or virtually absent sequence similarity are discussed. The detection and evaluation of structural relationships is emphasized as a central aspect of protein modeling, distinct from the more technical aspects of model building. Computational techniques to generate and complement comparative protein models are also reviewed. Two examples, P-selectin and gp39, are presented to illustrate the derivation of protein model structures and their use in experimental studies. PMID:7505680

  14. Knowledge-Based Economies and Education: A Grand Canyon Analogy

    ERIC Educational Resources Information Center

    Mahy, Colleen; Krimmel, Tyler

    2008-01-01

    Expeditions inspire people to reach beyond themselves. Today, post-secondary education requires as much planning as any expedition. However, there has been a trend that has seen just over half of all high school students in Ontario going on to post-secondary education. While some people have barely noticed this statistic, the OECD has released a…

  15. Knowledge-based modelling of historical surfaces using lidar data

    NASA Astrophysics Data System (ADS)

    Höfler, Veit; Wessollek, Christine; Karrasch, Pierre

    2016-10-01

    Currently in archaeological studies digital elevation models are mainly used, especially in terms of shaded reliefs, for the prospection of archaeological sites. Hesse (2010) provides a supporting software tool for the determination of local relief models during prospection using LiDAR scans. Furthermore, the search for relics from WW2 is also a focus of his research. In James et al. (2006) the determined contour lines were used to reconstruct locations of archaeological artefacts such as buildings. This study goes further and presents an innovative workflow for determining historical high-resolution terrain surfaces using recent high-resolution terrain models and sedimentological expert knowledge. Based on archaeological field studies (Franconian Saale near Bad Neustadt in Germany), the sedimentological analyses show that archaeologically interesting horizons and geomorphological expert knowledge, in combination with particle size analyses (Koehn, DIN ISO 11277), are useful components for reconstructing surfaces of the early Middle Ages. Furthermore, the paper traces how it is possible to use additional information (extracted from a recent digital terrain model) to support the process of determining historical surfaces. Conceptually, this research is based on the methodology of geomorphometry and geostatistics. The basic idea is that the working procedure is split according to the different input data: one branch tracks the quantitative data and the other processes the qualitative data. The quantitative data are thus available first for further processing and are later combined with the qualitative data to convert them to historical heights. In the final stage of the workflow, all gathered information is stored in a large data matrix for spatial interpolation using the geostatistical method of kriging. Besides the historical surface, the algorithm also provides a first estimate of the accuracy of the modelling. The presented workflow is characterized by high flexibility and the ability to include newly available data in the process at any time.
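
    The spatial-interpolation step mentioned at the end of the workflow can be sketched as ordinary kriging of scattered height estimates. The exponential variogram, its parameters and the toy sample points below are assumptions for illustration; in practice the variogram would be fitted to the data, for example with a library such as PyKrige or gstat.

    ```python
    # Ordinary kriging of scattered historical-height estimates onto a prediction point.
    import numpy as np

    def variogram(h, nugget=0.0, sill=1.0, rng_=300.0):
        """Exponential semivariogram model (parameters assumed, not fitted)."""
        return nugget + sill * (1.0 - np.exp(-h / rng_))

    def ordinary_kriging(xy, z, target):
        """Predict the value at `target` from samples (xy, z) via ordinary kriging."""
        n = len(z)
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
        # Kriging system: [gamma(d_ij) 1; 1 0] [w; mu] = [gamma(d_i0); 1]
        a = np.ones((n + 1, n + 1))
        a[:n, :n] = variogram(d)
        a[n, n] = 0.0
        b = np.ones(n + 1)
        b[:n] = variogram(np.linalg.norm(xy - target, axis=1))
        w = np.linalg.solve(a, b)
        estimate = float(w[:n] @ z)
        kriging_variance = float(b @ w)          # estimation variance at the target
        return estimate, kriging_variance

    # Toy data: four reconstructed heights (m a.s.l.) around a prediction point.
    xy = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
    z = np.array([211.2, 211.8, 210.9, 211.5])
    print(ordinary_kriging(xy, z, np.array([50.0, 50.0])))
    ```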

  16. An application of knowledge-based systems to satellite control

    NASA Astrophysics Data System (ADS)

    Skiffington, B.; Carrig, J.; Kornell, J.

    This paper describes an expert system prototype which approaches some issues of satellite command and control. The task of the prototype system is to assist a spacecraft controller in maneuvering a geosynchronous satellite for the purpose of maintaining an accurate spacecraft pointing angle, i.e., station keeping. From an expert system's point of view, two features of the system are notable. First, a tool for automated knowledge acquisition was employed. Because the domain experts were in Maryland while the AI experts were in California, a means to automate knowledge acquisition was required. Second, the system involves a blend of simulation and expert systems technology distributed between a DEC VAX computer and a LISP machine (a special purpose AI computer). This kind of distribution is a plausible model for potential real-world installations.

  17. Knowledge-based topographic feature extraction in medical images

    NASA Astrophysics Data System (ADS)

    Qian, JianZhong; Khair, Mohammad M.

    1995-08-01

    Diagnostic medical imaging often contains variations of patient anatomies, camera mispositioning, or other imperfect imaging conditions. These variations contribute to uncertainty about shapes and boundaries of objects in images. As a result, image features such as traditional edges sometimes may not be identified reliably and completely. We describe a knowledge-based system that is able to reason about such uncertainties and use partial and locally ambiguous information to infer the shapes and locations of objects in an image. The system uses directional topographic features (DTFs), such as ridges and valleys, labeled from the underlying intensity surface, to correlate with the intrinsic anatomical information. By using domain-specific knowledge, the reasoning system can deduce significant anatomical landmarks based upon these DTFs, and can cope with uncertainties and fill in missing information. A succession of levels of representation for visual information and an active process of uncertain reasoning about this visual information are employed to reliably achieve the goal of image analysis. These landmarks can then be used in localization of anatomy of interest, image registration, or other clinical processing. The successful application of this system to a large set of planar cardiac images from nuclear medicine studies has demonstrated its efficiency and accuracy.
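
    A minimal sketch of how ridge- and valley-like directional topographic features can be labeled from an intensity surface, using the eigenvalues of the smoothed Hessian. The scale, threshold and labelling convention are illustrative assumptions, and the knowledge-based reasoning layer described in the abstract is not reproduced.

    ```python
    # Label ridge-like and valley-like pixels from the eigenvalues of the smoothed Hessian.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def ridge_valley_labels(image, sigma=2.0, thresh=0.5):
        im = image.astype(float)
        # Second derivatives of the Gaussian-smoothed intensity surface.
        ixx = gaussian_filter(im, sigma, order=(0, 2))
        iyy = gaussian_filter(im, sigma, order=(2, 0))
        ixy = gaussian_filter(im, sigma, order=(1, 1))
        # Principal curvatures = eigenvalues of the 2x2 Hessian at each pixel.
        tr = ixx + iyy
        det = ixx * iyy - ixy ** 2
        disc = np.sqrt(np.maximum(tr ** 2 / 4.0 - det, 0.0))
        k1, k2 = tr / 2.0 + disc, tr / 2.0 - disc
        labels = np.zeros(im.shape, dtype=np.int8)
        labels[k2 < -thresh] = 1        # ridge-like (bright, elongated structure)
        labels[k1 > thresh] = -1        # valley-like (dark, elongated structure)
        return labels

    # Toy example: a bright horizontal line on a dark background is labeled as a ridge.
    img = np.zeros((64, 64))
    img[32, :] = 100.0
    print(np.unique(ridge_valley_labels(img)))
    ```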

  18. A special purpose knowledge-based face localization method

    NASA Astrophysics Data System (ADS)

    Hassanat, Ahmad; Jassim, Sabah

    2008-04-01

    This paper is concerned with face localization for a visual speech recognition (VSR) system. Face detection and localization have received a great deal of attention in the last few years, because they are an essential pre-processing step in many techniques that handle or deal with faces (e.g. age, face, gender, race and visual speech recognition). We shall present an efficient method for localizing human faces in video images captured on constrained mobile devices, under a wide variation in lighting conditions. We use a multiphase method that may include all or some of the following steps: image pre-processing, followed by a special-purpose edge detection, then an image refinement step. The output image is passed through a discrete wavelet decomposition procedure, and the computed LL sub-band at a certain level is transformed into a binary image that is scanned with a special template to select a number of possible candidate locations. Finally, we fuse the scores from the wavelet step with scores determined by color information for each candidate location and employ a form of fuzzy logic to distinguish face from non-face locations. We shall present the results of a large number of experiments to demonstrate that the proposed face localization method is efficient and achieves a high level of accuracy, outperforming existing general-purpose face detection methods.
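    As a rough illustration of the wavelet step described above (not the authors' exact pipeline), the sketch below extracts the LL sub-band with a discrete wavelet transform, binarizes it, scans it with a simple rectangular template, and fuses the template score with a skin-color score; the threshold, template size, and fusion rule are all assumptions.

      import numpy as np
      import pywt

      def candidate_face_locations(gray, skin_mask, level=2, template=(8, 8), top_k=5):
          """Score candidate face windows on the LL sub-band of a wavelet decomposition.

          gray      : 2-D grayscale image
          skin_mask : 2-D array in [0, 1], close to 1 where the pixel color looks like skin
          """
          # Keep only the approximation (LL) coefficients at the requested level.
          ll = pywt.wavedec2(gray.astype(float), "haar", level=level)[0]
          binary = (ll > ll.mean()).astype(float)      # crude binarization of the LL band

          th, tw = template
          scale = 2 ** level                           # one LL pixel covers scale x scale image pixels
          candidates = []
          for r in range(binary.shape[0] - th + 1):
              for c in range(binary.shape[1] - tw + 1):
                  wavelet_score = binary[r:r + th, c:c + tw].mean()
                  win = skin_mask[r * scale:(r + th) * scale, c * scale:(c + tw) * scale]
                  color_score = float(win.mean()) if win.size else 0.0
                  fused = min(wavelet_score, color_score)   # simple fuzzy "AND" (t-norm)
                  candidates.append((fused, r * scale, c * scale))
          # Highest fused score first: the most face-like candidate windows.
          return sorted(candidates, reverse=True)[:top_k]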

  19. Knowledge-based image bandwidth compression and enhancement

    NASA Astrophysics Data System (ADS)

    Saghri, John A.; Tescher, Andrew G.

    1987-01-01

    Techniques for incorporating a priori knowledge in the digital coding and bandwidth compression of image data are described and demonstrated. An algorithm for identifying and highlighting thin lines and point objects prior to coding is presented, and the precoding enhancement of a slightly smoothed version of the image is shown to be more effective than enhancement of the original image. Also considered are readjustment of the local distortion parameter and variable-block-size coding. The line-segment criteria employed in the classification are listed in a table, and sample images demonstrating the effectiveness of the enhancement techniques are presented.

  20. A Knowledge-Based Representation Scheme for Environmental Science Models

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Dungan, Jennifer L.; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    One of the primary methods available for studying environmental phenomena is the construction and analysis of computational models. We have been studying how artificial intelligence techniques can be applied to assist in the development and use of environmental science models within the context of NASA-sponsored activities. We have identified several high-utility areas as potential targets for research and development: model development; data visualization, analysis, and interpretation; model publishing and reuse, training and education; and framing, posing, and answering questions. Central to progress on any of the above areas is a representation for environmental models that contains a great deal more information than is present in a traditional software implementation. In particular, a traditional software implementation is devoid of any semantic information that connects the code with the environmental context that forms the background for the modeling activity. Before we can build AI systems to assist in model development and usage, we must develop a representation for environmental models that adequately describes a model's semantics and explicitly represents the relationship between the code and the modeling task at hand. We have developed one such representation in conjunction with our work on the SIGMA (Scientists' Intelligent Graphical Modeling Assistant) environment. The key feature of the representation is that it provides a semantic grounding for the symbols in a set of modeling equations by linking those symbols to an explicit representation of the underlying environmental scenario.

  1. The AI Bus architecture for distributed knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Schultz, Roger D.; Stobie, Iain

    1991-01-01

    The AI Bus architecture is a layered, distributed, object-oriented framework developed to support the requirement of advanced technology programs for an order-of-magnitude improvement in software costs. The consequent need for highly autonomous computer systems, adaptable to new technology advances over a long lifespan, led to the design of an open architecture and toolbox for building large-scale, robust, production-quality systems. The AI Bus accommodates a mix of knowledge-based and conventional components running on heterogeneous, distributed real-world and testbed environments. The concepts and design of the AI Bus architecture are described, along with its current implementation status as a Unix C++ library of reusable objects. Each high-level semiautonomous agent process consists of a number of knowledge sources together with inter-agent communication mechanisms based on shared blackboards and message-passing acquaintances. Standard interfaces and protocols are followed for combining and validating subsystems. Dynamic probes, or demons, provide an event-driven means of giving active objects shared access to resources and to each other without violating their security.
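    The blackboard-plus-demons pattern described above can be illustrated with a very small sketch (in Python here, rather than the C++ of the actual library); the class and method names are invented for illustration only.

      from collections import defaultdict

      class Blackboard:
          """Shared store that knowledge sources read and write."""
          def __init__(self):
              self.entries = {}
              self.watchers = defaultdict(list)   # demon-like callbacks keyed by entry name

          def post(self, key, value):
              self.entries[key] = value
              for demon in self.watchers[key]:    # event-driven notification
                  demon(key, value)

          def watch(self, key, demon):
              self.watchers[key].append(demon)

      class KnowledgeSource:
          """A knowledge source reacts to blackboard changes and posts new facts."""
          def __init__(self, name, trigger_key, rule):
              self.name, self.rule = name, rule
              self.trigger_key = trigger_key

          def attach(self, board):
              board.watch(self.trigger_key, lambda k, v: self.rule(board, v))

      # Example: one knowledge source derives an alert from a posted sensor reading.
      board = Blackboard()
      ks = KnowledgeSource("overheat-check", "temperature",
                           lambda b, t: b.post("alert", "overheat") if t > 90 else None)
      ks.attach(board)
      board.post("temperature", 95)   # triggers the knowledge source, which posts an alert
      print(board.entries["alert"])   # -> "overheat"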

  2. Compiling knowledge-based systems specified in KEE to Ada

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Feldman, Roy D.

    1991-01-01

    The first year of the PrKAda project is recounted. The primary goal was to develop a system for delivering Artificial Intelligence applications developed in the ProKappa system in a pure-Ada environment. The following areas are discussed: the ProKappa core and ProTalk programming language; the current status of the implementation; the limitations and restrictions of the current system; and the development of Ada-language message handlers in the ProKappa environment.

  3. Entity Bases: Large-Scale Knowledgebases for Intelligence Data

    DTIC Science & Technology

    2009-02-01

    and then copy the first shelter’s name into Google Maps to get its full address and geocode. She would paste the resulting information into the...retrieving the matching addresses and geocodes. In some cases the shelter name may be ambiguous and might return multiple answers: here CopyCat would...significance. This work was carried out jointly by USC ISI, Fetch Technologies, and, in a separately funded effort, the University of Pennsylvania

  4. Impact of knowledge-based software engineering on aerospace systems

    NASA Technical Reports Server (NTRS)

    Peyton, Liem; Gersh, Mark A.; Swietek, Gregg

    1991-01-01

    The emergence of knowledge engineering as a software technology will dramatically alter the use of software by expanding application areas across a wide spectrum of industries. The engineering and management of large aerospace software systems could benefit from a knowledge engineering approach. An understanding of this technology can potentially make significant improvements to the current practice of software engineering, and provide new insights into future development and support practices.

  5. Designer: A Knowledge-Based Graphic Design Assistant.

    ERIC Educational Resources Information Center

    Weitzman, Louis

    This report describes Designer, an interactive tool for assisting with the design of two-dimensional graphic interfaces for instructional systems. The system, which consists of a color graphics interface to a mathematical simulation, provides enhancements to the Graphics Editor component of Steamer (a computer-based training system designed to aid…

  6. Fuzzy logic controllers: A knowledge-based system perspective

    NASA Technical Reports Server (NTRS)

    Bonissone, Piero P.

    1993-01-01

    Over the last few years we have seen an increasing number of applications of Fuzzy Logic Controllers. These applications range from the development of auto-focus cameras, to the control of subway trains, cranes, automobile subsystems (automatic transmissions), domestic appliances, and various consumer electronic products. In summary, we consider a Fuzzy Logic Controller to be a high level language with its local semantics, interpreter, and compiler, which enables us to quickly synthesize non-linear controllers for dynamic systems.
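    To make the "high-level language" view concrete, the sketch below shows a minimal single-input fuzzy controller with triangular membership functions, a small rule base, and weighted-average defuzzification; the linguistic terms and rule values are invented for illustration and are not from the article.

      import numpy as np

      def tri(x, a, b, c):
          """Triangular membership function with peak at b and support [a, c]."""
          return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

      def fuzzy_controller(error):
          """Map a temperature error onto a heater command via three linguistic rules."""
          # Fuzzification: degree to which the error is negative / zero / positive.
          mu = {
              "negative": tri(error, -10.0, -5.0, 0.0),
              "zero":     tri(error,  -5.0,  0.0, 5.0),
              "positive": tri(error,   0.0,  5.0, 10.0),
          }
          # Rule base: IF error IS <term> THEN command IS <value> (singleton consequents).
          rules = {"negative": 100.0, "zero": 50.0, "positive": 0.0}

          # Defuzzification: weighted average of the singleton outputs.
          num = sum(mu[t] * rules[t] for t in rules)
          den = sum(mu[t] for t in rules)
          return num / den if den > 0 else 50.0

      print(fuzzy_controller(-2.5))   # a command between "full power" and "half power"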

  7. Knowledge-based processing for aircraft flight control

    NASA Technical Reports Server (NTRS)

    Painter, John H.

    1991-01-01

    The purpose is to develop algorithms and architectures for embedding artificial intelligence in aircraft guidance and control systems. With the approach adopted, AI-computing is used to create an outer guidance loop for driving the usual aircraft autopilot. That is, a symbolic processor monitors the operation and performance of the aircraft. Then, based on rules and other stored knowledge, commands are automatically formulated for driving the autopilot so as to accomplish desired flight operations. The focus is on developing a software system which can respond to linguistic instructions, input in a standard format, so as to formulate a sequence of simple commands to the autopilot. The instructions might be a fairly complex flight clearance, input either manually or by data-link. Emphasis is on a software system which responds much like a pilot would, employing not only precise computations, but, also, knowledge which is less precise, but more like common-sense. The approach is based on prior work to develop a generic 'shell' architecture for an AI-processor, which may be tailored to many applications by describing the application in appropriate processor data bases (libraries). Such descriptions include numerical models of the aircraft and flight control system, as well as symbolic (linguistic) descriptions of flight operations, rules, and tactics.

  8. Designer: A Knowledge-Based Graphic Design Assistant.

    DTIC Science & Technology

    1986-07-01

    1983; Ching, 1979; Dondis, 1973; Hurlburt, 1977; Marcus, 1986; Reilly & Roach, 1984; Sherwood, 1981; Taylor, 1960; Wong, 1972). Unfortunately, the...information (Dondis, 1973). These visual techniques represent a vocabulary in which to describe the design. These techniques in conjunction with the...Artificial Intelligence, 28, 197-224. Dondis, D. A. (1973). A primer of visual literacy. Cambridge, MA: MIT Press. Glenn, B. (1986). Descriptor: A model for

  9. Digitizing legacy documents: A knowledge-base preservation project

    SciTech Connect

    Anderson, E.; Atkinson, R.; Crego, C.; Slisz, J.; Tompson, S.

    1998-09-01

    As more library customers and staff throughout the world come to rely upon rapid electronic access to fulltext documents, there is increasing demand to also make older documents electronically accessible. Illinois State Library grant funds allowed us to purchase hardware and software necessary to answer this demand. We created a production system to scan our legacy documents, convert them into Portable Document Format (PDF), save them to a server for World Wide Web access, and write them to CD discs for distribution.

  10. Knowledge-based control for robot self-localization

    NASA Technical Reports Server (NTRS)

    Bennett, Bonnie Kathleen Holte

    1993-01-01

    Autonomous robot systems are being proposed for a variety of missions including the Mars rover/sample return mission. Prior to any other mission objectives being met, an autonomous robot must be able to determine its own location. This will be especially challenging because location sensors like GPS, which are available on Earth, will not be useful, nor will INS sensors because their drift is too large. Another approach to self-localization is required. In this paper, we describe a novel approach to localization by applying a problem solving methodology. The term 'problem solving' implies a computational technique based on logical representational and control steps. In this research, these steps are derived from observing experts solving localization problems. The objective is not specifically to simulate human expertise but rather to apply its techniques where appropriate for computational systems. In doing this, we describe a model for solving the problem and a system built on that model, called localization control and logic expert (LOCALE), which is a demonstration of concept for the approach and the model. The results of this work represent the first successful solution to high-level control aspects of the localization problem.

  11. Design of a knowledge-based welding advisor

    SciTech Connect

    Kleban, S.D.

    1996-06-01

    Expert system implementation can take numerous forms, ranging from traditional declarative rule-based systems with if-then syntax to imperative programming languages that capture expertise in procedural code. The artificial intelligence community generally thinks of expert systems as rules or rule-bases and an inference engine to process the knowledge. The welding advisor developed at Sandia National Laboratories and described in this paper deviates from this by codifying expertise using object representation and methods. Objects allow computer scientists to model the world as humans perceive it, giving us a very natural way to encode expert knowledge. The design of the welding advisor, which generates and evaluates solutions, will be compared and contrasted to a traditional rule-based system.

  12. Apprenticeship learning techniques for knowledge-based systems

    SciTech Connect

    Wilkins, D.C.

    1987-01-01

    This thesis describes apprenticeship learning techniques for automation of the transfer of expertise. Apprenticeship learning is a form of learning by watching, in which learning occurs as a byproduct of building explanations of human problem-solving actions. Apprenticeship is the most powerful method that human experts use to refine and debug their expertise in knowledge-intensive domains such as medicine; this motivates giving such capabilities to an expert system. The major accomplishment of this thesis is showing how an explicit representation of the strategy knowledge for solving a general problem class, such as diagnosis, can provide a basis for learning the knowledge that is specific to a particular domain, such as medicine. The Odysseus learning program provides the first demonstration of using the same technique to transfer expertise to and from an expert system knowledge base. Another major focus of this thesis is the limitations of apprenticeship learning. It is shown that extant techniques for reasoning under uncertainty in expert systems lead to a sociopathic knowledge base.

  13. Thermal Performance Testing of EMU and CSAFE Liquid Cooling Garments

    NASA Technical Reports Server (NTRS)

    Rhodes, Richard; Bue, Grant; Meginnis, Ian; Hakam, Mary; Radford, Tamara

    2013-01-01

    Future exploration missions require the development of a new liquid cooling garment (LCG) to support the next generation extravehicular activity (EVA) suit system. The new LCG must offer greater system reliability, optimal thermal performance as required by mission directive, and meet other design requirements including improved tactile comfort. To advance the development of a future LCG, a thermal performance test was conducted to evaluate: (1) the comparable thermal performance of the EMU LCG and the CSAFE developed engineering evaluation unit (EEU) LCG, (2) the effect of the thermal comfort undergarment (TCU) on the EMU LCG tactile and thermal comfort, and (3) the performance of a torso or upper body only LCG shirt to evaluate a proposed auxiliary loop. To evaluate the thermal performance of each configuration, a metabolic test was conducted using the Demonstrator Spacesuit to create a relevant test environment. Three (3) male test subjects of similar height and weight walked on a treadmill at various speeds to produce three different metabolic loads - resting (300-600 BTU/hr), walking at a slow pace (1200 BTU/hr), and walking at a brisk pace (2200 BTU/hr). Each subject participated in five tests - two wearing the CSAFE full LCG, one wearing the EMU LCG without TCUs, one wearing the EMU LCG with TCUs, and one with the CSAFE shirt-only. During the test, performance data for the breathing air and cooling water systems and subject specific data was collected to define the thermal performance of the configurations. The test results show that the CSAFE EEU LCG and EMU LCG with TCU had comparable performance. The testing also showed that an auxiliary loop LCG, sized similarly to the shirt-only configuration, should provide adequate cooling for contingency scenarios. Finally, the testing showed that the TCU did not significantly hinder LCG heat transfer, and may prove to be acceptable for future suit use with additional analysis and testing.

  14. Theory of mind selectively predicts preschoolers' knowledge-based selective word learning.

    PubMed

    Brosseau-Liard, Patricia; Penney, Danielle; Poulin-Dubois, Diane

    2015-11-01

    Children can selectively attend to various attributes of a model, such as past accuracy or physical strength, to guide their social learning. There is a debate regarding whether a relation exists between theory-of-mind skills and selective learning. We hypothesized that high performance on theory-of-mind tasks would predict preference for learning new words from accurate informants (an epistemic attribute), but not from physically strong informants (a non-epistemic attribute). Three- and 4-year-olds (N = 65) completed two selective learning tasks, and their theory-of-mind abilities were assessed. As expected, performance on a theory-of-mind battery predicted children's preference to learn from more accurate informants but not from physically stronger informants. Results thus suggest that preschoolers with more advanced theory of mind have a better understanding of knowledge and apply that understanding to guide their selection of informants. This work has important implications for research on children's developing social cognition and early learning.

  15. Generic supervisor: A knowledge-based tool for control of space station on-board systems

    NASA Technical Reports Server (NTRS)

    Carnes, J. R.; Nelson, R.

    1988-01-01

    The concept of a generic module for management of onboard systems grew out of the structured analysis effort for the Space Station software. Hierarchical specification of subsystems software revealed that nontrivial supervisory elements are required at all levels. The number of supervisors (and the associated software) required to implement hierarchical control over onboard functions comprises a large portion of the Space Station software. Thus, a generic knowledge-based supervisory module significantly reduces the amount of software developed. This module, the Generic Supervisor, depends on its knowledge of control to provide direction for subordinates and feedback to superiors within a specific subsystem area. The Generic Supervisor provides an adaptable and maintainable control system. A portion of the Space Station Environmental Control and Life Support System (ECLSS) was implemented as a hierarchy of supervisors. This prototype implementation demonstrates the feasibility of a generic knowledge-based supervisor and its ability to meet complex mission requirements.

  16. Knowledge-Based Inferences across the Hemispheres: Domain Makes a Difference

    ERIC Educational Resources Information Center

    Shears, Connie; Hawkins, Amanda; Varner, Andria; Lewis, Lindsey; Heatley, Jennifer; Twachtmann, Lisa

    2008-01-01

    Language comprehension occurs when the left-hemisphere (LH) and the right-hemisphere (RH) share information derived from discourse [Beeman, M. J., Bowden, E. M., & Gernsbacher, M. A. (2000). Right and left hemisphere cooperation for drawing predictive and coherence inferences during normal story comprehension. "Brain and Language, 71", 310-336].…

  17. A Knowledge-Based Approach to Retrieving Teaching Materials for Context-Aware Learning

    ERIC Educational Resources Information Center

    Shih, Wen-Chung; Tseng, Shian-Shyong

    2009-01-01

    With the rapid development of wireless communication and sensor technologies, ubiquitous learning has become a promising solution to educational problems. In context-aware ubiquitous learning environments, it is required that learning content is retrieved according to environmental contexts, such as learners' location. Also, a learning content…

  18. Knowledge-based changes to health systems: the Thai experience in policy development.

    PubMed Central

    Tangcharoensathien, Viroj; Wibulpholprasert, Suwit; Nitayaramphong, Sanguan

    2004-01-01

    Over the past two decades the government in Thailand has adopted an incremental approach to extending health-care coverage to the population. It first offered coverage to government employees and their dependents, and then introduced a scheme under which low-income people were exempt from charges for health care. This scheme was later extended to include elderly people, children younger than 12 years of age and disabled people. A voluntary public insurance scheme was implemented to cover those who could afford to pay for their own care. Private sector employees were covered by the Social Health Insurance scheme, which was implemented in 1991. Despite these efforts, 30% of the population remained uninsured in 2001. In October of that year, the new government decided to embark on a programme to provide universal health-care coverage. This paper describes how research into health systems and health policy contributed to the move towards universal coverage. Data on health systems financing and functioning had been gathered before and after the founding of the Health Systems Research Institute in early 1990. In 1991, a contract capitation model had been used to launch the Social Health Insurance scheme. The advantages of using a capitation model are that it contains costs and provides an acceptable quality of service as opposed to the cost escalation and inefficiency that occur under fee-for-service reimbursement models, such as the one used to provide medical benefits to civil servants. An analysis of the implementation of universal coverage found that politics moved universal coverage onto the policy agenda during the general election campaign in January 2001. The capacity for research on health systems and policy to generate evidence guided the development of the policy and the design of the system at a later stage. Because the reformists who sought to bring about universal coverage (who were mostly civil servants in the Ministry of Public Health and members of nongovernmental organizations) were able to bridge the gap between researchers and politicians, an evidence-based political decision was made. Additionally, the media played a part in shaping the societal consensus on universal coverage. PMID:15643796

  19. Guidon-Watch: A Graphic Interface for Viewing a Knowledge-Based System.

    DTIC Science & Technology

    1985-08-01

    studies reported here were supported (in part) by: The Office of Naval Research Personnel and Training Research Programs, Psychological Sciences Division...INTERLISP-D. The black and white display screen is 1024 pixels wide by 808 pixels high, which provides approximately 75 pixels per inch resolution...reasons to make a clean separation between the user interface and the rest of a software system (Zdybel, et al., 1981, Smith, et al., 1984, Ciccarelli

  20. Mental Models and Problem Solving with a Knowledge-Based Expert System,

    DTIC Science & Technology

    1985-10-01

    cc 79 30) Percent of the peripheral WBC’s which are immature forms, ** 12 31) Is Pt321 a compromised host (e.g. alcoholic, sickle-cell disease...34 Naval Research Laboratory Mr. Herb Marks Code 7592 Naval Surface Weapons Center Computer Sciences & Systems NSWC/DL Washington, D. C. 20375 Code N

  1. Knowledge-based reasoning to annotate noncoding RNA using multi-agent system.

    PubMed

    Arruda, Wosley C; Souza, Daniel S; Ralha, Célia G; Walter, Maria Emilia M T; Raiol, Tainá; Brigido, Marcelo M; Stadler, Peter F

    2015-12-01

    Noncoding RNAs (ncRNAs) have been the focus of intense research over the last few years. Since the characteristics and signals of ncRNAs are not entirely known, researchers use different computational tools together with their biological knowledge to predict putative ncRNAs. In this context, this work presents ncRNA-Agents, a multi-agent system to annotate ncRNAs based on the output of different tools, using inference rules to simulate biologists' reasoning. Experiments with data from the fungus Saccharomyces cerevisiae allowed us to measure the performance of ncRNA-Agents, which showed better sensitivity when compared to Infernal, a widely used tool for annotating ncRNAs. In addition, data from the fungi Schizosaccharomyces pombe and Paracoccidioides brasiliensis yielded novel putative ncRNAs, demonstrating the usefulness of our approach. NcRNA-Agents can be found at: http://www.biomol.unb.br/ncrna-agents.

  2. Deception Detection in Expert Source Information Through Bayesian Knowledge-Bases

    DTIC Science & Technology

    2008-02-04

    intelligence and have implemented deception detection algorithms using probabilistic, intelligent, multi-agent systems. We have also conducted numerous...Bayesian Knowledge Bases," Data and Knowledge Engineering 64, 218-241, 2008. Yuan, Xiuqing, "Deception Detection in Multi-Agent System and War

  3. Remote Sensing Terminology in a Global and Knowledge-Based World

    NASA Astrophysics Data System (ADS)

    Kancheva, Rumiana

    The paper is devoted to terminology issues related to all aspects of remote sensing research and applications. Terminology is the basis for better understanding among people. It is crucial to keep up with the latest developments and novelties of the terminology in advanced technology fields such as aerospace science and industry. This is especially true in remote sensing and geoinformatics, which develop rapidly and have ever-extending applications in various domains of science and human activities. Remote sensing terminology issues are directly relevant to the contemporary worldwide policies on information accessibility, dissemination and utilization of research results in support of solutions to global environmental challenges and sustainable development goals. Remote sensing and spatial information technologies are an integral part of the international strategies for cooperation in scientific, research and application areas, with a particular accent on environmental monitoring, ecological problems, natural resources management, climate modelling, weather forecasts, disaster mitigation and the many other uses to which remote sensing data can be put. Remote sensing researchers, professionals, students and decision makers of different countries and nationalities should be able to fully understand, interpret and translate into their native language any term, definition or acronym found in papers, books, proceedings, specifications, documentation, etc. The importance of the correct use, precise definition and unification of remote sensing terms concerns not only people working in this field but also experts in a variety of disciplines who handle remote sensing data and information products. In this paper, we draw attention to the specifics, peculiarities and recent needs of compiling specialized dictionaries in the area of remote sensing, focusing on Earth observations and the integration of remote sensing with other geoinformation technologies such as photogrammetry, geodesy, GIS, etc. Our belief is that the elaboration of bilingual and multilingual dictionaries and glossaries in this expanding, technically advanced and promising field of human expertise is of great practical importance. The work on an English-Bulgarian Dictionary of Remote Sensing Terms is described, including considerations on its scope, structure, information content, selection of terms, etc. The vision builds upon previous national and international experience and makes use of ongoing activities on the subject. Any interest in cooperation and in initiating such collaborative projects is welcome and highly appreciated.

  4. MYCIN: A Knowledge-Based Consultation Program for Infectious Disease Diagnosis.

    ERIC Educational Resources Information Center

    van Melle, William

    1978-01-01

    Describes the structure of MYCIN, a computer-based consultation system designed to assist physicians in the diagnosis of and therapy selection for patients with bacterial infections, and an explanation system which can answer simple English questions in order to justify its advice or educate the user. (Author/VT)

  5. The discovery of bioisoster compound for plumbagin using the knowledge-based rational method

    NASA Astrophysics Data System (ADS)

    Jeong, Seo Hee; Choi, Jung Sup; Ko, Young Kwan; Kang, Nam Sook

    2015-04-01

    Arabidopsis thaliana 7-Keto-8-AminoPelargonic Acid Synthase (AtKAPAS) is a crucial herbicide target, and AtKAPAS inhibitors are widely available in the agrochemical market. The herbicide plumbagin is known as a potent inhibitor for AtKAPAS but it is extremely toxic. In this study, we identified the metabolic site of plumbagin and also performed a similarity-based library analysis using 2D fingerprints and a docking study. Four compounds as virtual hits were derived from plumbagin. Treatment of Digitaria ciliaris with compound 2, one of four hit compounds, stunted the growth of leaves and the leaf tissue was desiccated or burned within three days. Thus, we expect that compound 2 will be developed as a new herbicide and additionally our strategy will provide helpful information for optimizing lead compounds.
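    The 2D-fingerprint similarity screen mentioned above can be sketched with RDKit as follows. This is a generic illustration under the assumption of Morgan fingerprints and Tanimoto similarity, not the authors' exact protocol; the library SMILES entries are placeholders, and the query SMILES is an assumed structure for plumbagin.

      from rdkit import Chem, DataStructs
      from rdkit.Chem import AllChem

      # Assumed query: plumbagin (5-hydroxy-2-methyl-1,4-naphthoquinone).
      query_smiles = "CC1=CC(=O)c2cccc(O)c2C1=O"
      library_smiles = {                      # placeholder library entries
          "candidate_1": "CC1=CC(=O)c2ccccc2C1=O",
          "candidate_2": "OC1=CC(=O)c2cccc(O)c2C1=O",
      }

      def morgan_fp(smiles, radius=2, n_bits=2048):
          mol = Chem.MolFromSmiles(smiles)
          return AllChem.GetMorganFingerprintAsBitVect(mol, radius, nBits=n_bits)

      query_fp = morgan_fp(query_smiles)
      # Rank library compounds by Tanimoto similarity to the query fingerprint.
      ranked = sorted(
          ((DataStructs.TanimotoSimilarity(query_fp, morgan_fp(smi)), name)
           for name, smi in library_smiles.items()),
          reverse=True,
      )
      for score, name in ranked:
          print(f"{name}: Tanimoto = {score:.2f}")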

  6. Sensor explication: knowledge-based robotic plan execution through logical objects.

    PubMed

    Budenske, J; Gini, M

    1997-01-01

    Complex robot tasks are usually described as high level goals, with no details on how to achieve them. However, details must be provided to generate primitive commands to control a real robot. A sensor explication concept that makes details explicit from general commands is presented. We show how the transformation from high-level goals to primitive commands can be performed at execution time and we propose an architecture based on reconfigurable objects that contain domain knowledge and knowledge about the sensors and actuators available. Our approach is based on two premises: 1) plan execution is an information gathering process where determining what information is relevant is a great part of the process; and 2) plan execution requires that many details are made explicit. We show how our approach is used in solving the task of moving a robot to and through an unknown, and possibly narrow, doorway; where sonic range data is used to find the doorway, walls, and obstacles. We illustrate the difficulty of such a task using data from a large number of experiments we conducted with a real mobile robot. The laboratory results illustrate how the proper application of knowledge in the integration and utilization of sensors and actuators increases the robustness of plan execution.

  7. Knowledge-Based, Central Nervous System (CNS) Lead Selection and Lead Optimization for CNS Drug Discovery

    PubMed Central

    2011-01-01

    The central nervous system (CNS) is the major area that is affected by aging. Alzheimer’s disease (AD), Parkinson’s disease (PD), brain cancer, and stroke are the CNS diseases that will cost trillions of dollars for their treatment. Achievement of appropriate blood–brain barrier (BBB) penetration is often considered a significant hurdle in the CNS drug discovery process. On the other hand, BBB penetration may be a liability for many of the non-CNS drug targets, and a clear understanding of the physicochemical and structural differences between CNS and non-CNS drugs may assist both research areas. Because of the numerous and challenging issues in CNS drug discovery and the low success rates, pharmaceutical companies are beginning to deprioritize their drug discovery efforts in the CNS arena. Prompted by these challenges and to aid in the design of high-quality, efficacious CNS compounds, we analyzed the physicochemical property and the chemical structural profiles of 317 CNS and 626 non-CNS oral drugs. The conclusions derived provide an ideal property profile for lead selection and the property modification strategy during the lead optimization process. A list of substructural units that may be useful for CNS drug design was also provided here. A classification tree was also developed to differentiate between CNS drugs and non-CNS oral drugs. The combined analysis provided the following guidelines for designing high-quality CNS drugs: (i) topological molecular polar surface area of <76 Å2 (25–60 Å2), (ii) at least one (one or two, including one aliphatic amine) nitrogen, (iii) fewer than seven (two to four) linear chains outside of rings, (iv) fewer than three (zero or one) polar hydrogen atoms, (v) volume of 740–970 Å3, (vi) solvent accessible surface area of 460–580 Å2, and (vii) positive QikProp parameter CNS. The ranges within parentheses may be used during lead optimization. One violation to this proposed profile may be acceptable. The chemoinformatics approaches for graphically analyzing multiple properties efficiently are presented. PMID:22267984
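    The property profile listed above translates naturally into a simple rule check. The sketch below counts violations of the proposed ranges for a compound whose descriptors have already been computed elsewhere (e.g. with a chemoinformatics package); the descriptor dictionary keys are invented for illustration, and the "at most one violation" rule follows the guideline in the abstract.

      def cns_profile_violations(desc):
          """Count violations of the proposed CNS-drug property profile.

          `desc` is a dict of precomputed descriptors (hypothetical key names).
          """
          checks = [
              desc["tpsa"] < 76,            # topological PSA below 76 A^2
              desc["n_nitrogen"] >= 1,      # at least one nitrogen
              desc["n_chains"] < 7,         # fewer than seven linear chains outside rings
              desc["n_polar_h"] < 3,        # fewer than three polar hydrogens
              740 <= desc["volume"] <= 970, # molecular volume window (A^3)
              460 <= desc["sasa"] <= 580,   # solvent-accessible surface area window (A^2)
              desc["qikprop_cns"] > 0,      # positive QikProp CNS parameter
          ]
          return sum(not ok for ok in checks)

      candidate = {"tpsa": 48.0, "n_nitrogen": 2, "n_chains": 3, "n_polar_h": 1,
                   "volume": 905.0, "sasa": 530.0, "qikprop_cns": 1}
      # The abstract suggests one violation of the profile may still be acceptable.
      print("CNS-like" if cns_profile_violations(candidate) <= 1 else "unlikely CNS")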

  8. Knowledge-Based Vision Techniques for the Autonomous Land Vehicle Program

    DTIC Science & Technology

    1991-10-01

    Supercomputing, Santa Clara, CA, April 30-May 5, 1989). Langdon, J.H., J. Bruckner, and H.H. Baker, "Pedal Mechanics and Bipedalism in Early Hominids ...34 in Origines de la Bipedie chez les Hominides, Editors Y. Coppens and B. Senat, Paris, France, June 1990. Quam, L.H. and T.M. Strat, "SRI Image

  9. Multi-Case Knowledge-Based IMRT Treatment Planning in Head and Neck Cancer

    NASA Astrophysics Data System (ADS)

    Grzetic, Shelby Mariah

    Head and neck cancer (HNC) IMRT treatment planning is a challenging process that relies heavily on the planner's experience. Previously, we used the single, best match from a library of manually planned cases to semi-automatically generate IMRT plans for a new patient. The current multi-case Knowledge Based Radiation Therapy (MC-KBRT) study utilized different matching cases for each of six individual organs-at-risk (OARs), then combined those six cases to create the new treatment plan. From a database of 103 patient plans created by experienced planners, MC-KBRT plans were created for 40 (17 unilateral and 23 bilateral) HNC "query" patients. For each case, 2D beam's-eye-view images were used to find similar geometric "match" patients separately for each of 6 OARs. Dose distributions for each OAR from the 6 matching cases were combined and then warped to suit the query case's geometry. The dose-volume constraints were used to create the new query treatment plan without the need for human decision-making throughout the IMRT optimization. The optimized MC-KBRT plans were compared against the clinically approved plans and Version 1 (previous KBRT using only one matching case with dose warping) using the dose metrics: mean, median, and maximum (brainstem and cord+5mm) doses. Compared to Version 1, MC-KBRT had no significant reduction of the dose to any of the OARs in either unilateral or bilateral cases. Compared to the manually planned unilateral cases, there was significant reduction of the oral cavity mean/median dose (>2Gy) at the expense of the contralateral parotid. Compared to the manually planned bilateral cases, reduction of dose was significant in the ipsilateral parotid, larynx, and oral cavity (>3Gy mean/median) while maintaining PTV coverage. MC-KBRT planning in head and neck cancer generates IMRT plans with better dose sparing than manually created plans. MC-KBRT using multiple case matches does not show significant dose reduction compared to using a single match case with dose warping.

  10. Fast QRS Detection with an Optimized Knowledge-Based Method: Evaluation on 11 Standard ECG Databases

    PubMed Central

    Elgendi, Mohamed

    2013-01-01

    Current state-of-the-art automatic QRS detection methods show high robustness and almost negligible error rates. In return, these methods are usually based on machine-learning approaches that require substantial computational resources. However, simple and fast methods can also achieve high detection rates. There is a need to develop numerically efficient algorithms to accommodate the new trend towards battery-driven ECG devices and to analyze long-term recorded signals in a time-efficient manner. A typical QRS detection method has been reduced to a basic approach consisting of two moving averages that are calibrated by a knowledge base using only two parameters. In contrast to high-accuracy methods, the proposed method can be easily implemented in a digital filter design. PMID:24066054
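    The two-moving-average idea can be sketched as follows: a wide average tracks the beat-level baseline, a narrow one tracks QRS-width energy, and samples where the narrow average exceeds the wide one mark blocks of interest. The band-pass corner frequencies, window lengths, and offset below are assumptions for illustration, not the calibrated knowledge-base values from the paper.

      import numpy as np
      from scipy.signal import butter, filtfilt

      def qrs_blocks(ecg, fs, qrs_win=0.097, beat_win=0.611, beta=0.08):
          """Return a boolean mask of candidate QRS regions in a single-lead ECG."""
          # Band-pass filter to emphasise the QRS frequency band, then square.
          b, a = butter(3, [8.0, 20.0], btype="band", fs=fs)
          energy = filtfilt(b, a, ecg) ** 2

          def moving_average(x, seconds):
              n = max(1, int(round(seconds * fs)))
              return np.convolve(x, np.ones(n) / n, mode="same")

          ma_qrs = moving_average(energy, qrs_win)    # narrow window, roughly one QRS duration
          ma_beat = moving_average(energy, beat_win)  # wide window, roughly one heartbeat
          threshold = ma_beat + beta * energy.mean()  # small data-driven offset
          return ma_qrs > threshold                   # blocks of interest

      # Usage on a synthetic signal; peaks can be taken as the maxima inside each block.
      fs = 360
      t = np.arange(0, 10, 1 / fs)
      ecg = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.randn(t.size)
      mask = qrs_blocks(ecg, fs)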

  11. A Knowledge-Based Strategy for the Automated Support to Network Management Tasks

    NASA Astrophysics Data System (ADS)

    Abar, Sameera; Kinoshita, Tetsuo

    This paper presents a domain-ontology-driven, multi-agent-based scheme for representing the knowledge of a communication network management system. In the proposed knowledge-intensive framework, the static domain-related concepts are articulated as the domain knowledge ontology. The experiential knowledge for managing the network is represented as fault-case reasoning models and is explicitly encoded, as heuristic production-type rules, in the core knowledge of the multi-agent middleware layer. This task-oriented management expertise manipulates the domain content and structure during diagnostic sessions. The agents' rules, along with the embedded generic Java-based problem-solving algorithms and run-time log information, perform the automated management tasks. As a proof of concept, an experimental network system has been implemented in our laboratory and several test-bed scenarios have been deployed. Experimental results confirm a marked reduction in the management overhead of the network administrator, compared to manual network management techniques, in terms of the time taken and effort expended during a particular fault-diagnosis session. Validation of the reusability/modifiability aspects of our system illustrates the flexible manipulation of the knowledge fragments within diverse application contexts. The proposed approach can be regarded as one of the pioneering steps towards representing network knowledge via a reusable domain ontology and intelligent agents for automated network management support systems.

  12. Knowledge-Based Transformational Synthesis of Efficient Structures for Concurrent Computation.

    DTIC Science & Technology

    1985-09-30

    An identity for the enumeration’s operation is selected. This can be artificial, a special null value that is checked for. Fourth, an ordering for...34797), December 1979 [Coo72] D. C. Cooper, "Theorem Proving in Arithmetic Without Multiplication," Machine Intelligence #7, 1972, pp. 91-99 ICRi81

  13. Design consideration in constructing high performance embedded Knowledge-Based Systems (KBS)

    NASA Technical Reports Server (NTRS)

    Dalton, Shelly D.; Daley, Philip C.

    1988-01-01

    As the hardware trends for artificial intelligence (AI) involve more and more complexity, the process of optimizing the computer system design for a particular problem will also increase in complexity. Space applications of knowledge-based systems (KBS) will often require an ability to perform both numerically intensive vector computations and real-time symbolic computations. Although parallel machines can theoretically achieve the speeds necessary for most of these problems, if the application itself is not highly parallel, the machine's power cannot be utilized. A scheme is presented which will provide the computer systems engineer with a tool for analyzing machines with various configurations of array, symbolic, scalar, and multiprocessors. High-speed networks and interconnections make customized, distributed, intelligent systems feasible for the application of AI in space. The method presented can be used to optimize such AI system configurations and to make comparisons between existing computer systems. It is an open question whether or not, for a given mission requirement, a suitable computer system design can be constructed for any amount of money.

  14. Pursuing Innovation: Benchmarking Milwaukee's Transition to a Knowledge-Based Economy. Metro Milwaukee Innovation Index 2010

    ERIC Educational Resources Information Center

    Million, Laura; Dickman, Anneliese; Henken, Rob

    2010-01-01

    While the Milwaukee region's economic base is rooted in its manufacturing history, many believe that the region's future prosperity will be tied to its ability to successfully transition its economy into one that is based on knowledge and innovation. Indeed, fostering innovation has become the call to action for business and political leaders…

  15. A Knowledge-based Learning and Testing System for Medical Education.

    ERIC Educational Resources Information Center

    MacDonald, Siobhan

    The traditional medical curriculum and internships must be supplemented by standardized teaching modalities, such as computer-assisted instruction using patient simulators. A patient simulator is defined as a representation of a clinical situation in which an individual conducts the diagnosis and management of a patient. Advantages include…

  16. Excellence in the Knowledge-Based Economy: From Scientific to Research Excellence

    ERIC Educational Resources Information Center

    Sørensen, Mads P.; Bloch, Carter; Young, Mitchell

    2016-01-01

    In 2013, the European Union (EU) unveiled its new "Composite Indicator for Scientific and Technological Research Excellence." This is not an isolated occurrence; policy-based interest in excellence is growing all over the world. The heightened focus on excellence and, in particular, attempts to define it through quantitative indicators…

  17. Knowledge-Based Logistics Planning and Its Application in Manufacturing and Strategic Planning

    DTIC Science & Technology

    1990-01-01

    Buffalo Computer Science Department 226 Bell Hall Buffalo NY 14260 DL-13 Dr Sargur N. Srihari SUNY/Buffalo Computer Science Department 226 Bell...West Balcones Center Drive Austin, Texas 78759 Dr Benjamin Kuipers University of Texas at Austin Department of Computer Sciences T. S. Painter Hall...94301 Henry Kautz AT&T Bell Labs 600 Mountain Ave, Room SC-402A Murray Hill, NJ 07974 Amy L. Lansky AI Center SRI International 353 Ravenswood Ave Menlo

  18. Knowledge-Based Intelligent Software Support of Cellular Adaptation to Microgravity Investigations

    NASA Technical Reports Server (NTRS)

    Groleau, Nick; Grymes, Rosalind A.; Alizadeh, Babak; Friedland, Peter (Technical Monitor)

    1994-01-01

    One of the most significant new opportunities that the Space Station affords cell biologists is the ability to do long-term cultivation of cells in the space environment. This facility is essential for investigations that are primarily focused on effects requiring a longer timeline of observation than that provided by the STS (Space Transportation System) platform. Such work requires both very strong laboratory skills to properly and quickly interact with the hardware hosting the culture and deep knowledge of the cell biology domain in order to optimally react to unanticipated scientific developments. Such work can be enabled by advanced automation techniques that have recently been used in the STS-based Spacelab, and that are being readied for the Space Station. In this paper, we describe the adaptation of PI-in-a-Box, the first interactive space science assistant system, to the study of the effects of space flight on cell cycle progression and proliferation.

  19. Exploring Architecture Options for a Federated, Cloud-based System Biology Knowledgebase

    SciTech Connect

    Gorton, Ian; Liu, Yan; Yin, Jian

    2010-12-02

    This paper evaluates various cloud computing technologies and resources for building a system biology knowledgebase. The system will host a huge amount of data and contain a flexible set of workflows to operate on these data. It will enable system biologists to share their data and algorithms so that research results can be reproduced, shared, and reused across the system biology community.

  20. AKSED: adaptive knowledge-based system for event detection using collaborative unmanned aerial vehicles

    NASA Astrophysics Data System (ADS)

    Wang, X. Sean; Lee, Byung Suk; Sadjadi, Firooz

    2006-05-01

    Advances in sensor technology and image processing have made it possible to equip unmanned aerial vehicles (UAVs) with economical, high-resolution, energy-efficient sensors. Despite the improvements, current UAVs lack autonomous and collaborative operation capabilities, due to limited bandwidth and limited on-board image processing abilities. The situation, however, is changing. In the next generation of UAVs, much image processing can be carried out onboard and the communication bandwidth problem will improve. More importantly, with more processing power, collaborative operations among a team of autonomous UAVs can provide more intelligent event detection capabilities. In this paper, we present ideas for developing a system enabling target recognition through collaborative operations of autonomous UAVs. UAVs are configured in three stages: manufacturing, mission planning, and deployment. Different sets of information are needed at different stages, and the resulting outcome is an optimized event detection code deployed onto a UAV. The envisioned system architecture and the contemplated methodology, together with problems to be addressed, are presented.

  1. ToxPlorerTM: A Comprehensive Knowledgebase of Toxicity Pathways Using Ontology-driven Information Extraction

    EPA Science Inventory

    Realizing the potential of pathway-based toxicity testing requires a fresh look at how we describe phenomena leading to adverse effects in vivo, how we assess them in vitro and how we extrapolate them in silico across chemicals, doses and species. We developed the ToxPlorer™ fram...

  2. The Contribution of University Business Incubators to New Knowledge-based Ventures: Evidence from Italy.

    ERIC Educational Resources Information Center

    Grimaldi, Rosa; Grandi, Alessandro

    2001-01-01

    University business incubators give businesses access to labs and equipment, scientific-technical knowledge, networks, and reputation. A study of incubators in Italy shows they do not resolve inadequate funding or lack of management and financial skills. However, the networking capacity can offset these problems. (Contains 25 notes/references.)…

  3. Developing a Physiologically-Based Pharmacokinetic Model Knowledgebase in Support of Provisional Model Construction - poster

    EPA Science Inventory

    Building new physiologically based pharmacokinetic (PBPK) models requires a lot of data, such as chemical-specific parameters and in vivo pharmacokinetic data. Previously developed, well-parameterized, and thoroughly vetted models can be a great resource for supporting the constr...

  4. Knowledge-Based System Analysis and Control Defense Switched Network Task Areas

    DTIC Science & Technology

    1989-09-30

    NM Operation 40 Figure A.3 NMES/LARS Demo Live Data 42 Figure A.4 ACOC NM Operator Training Mode 44 Figure A.5 NMES/LARS Demo (Simulated Data) 47...implementation of a training system to develop operator skills in the use of the current manual Network Management Support Systems, as well as execution of...together with a user’s guide and documentation that would permit it to be used for training purposes. The FY89 SOW called for continuing functional

  5. Knowledge-based identification of sleep stages based on two forehead electroencephalogram channels.

    PubMed

    Huang, Chih-Sheng; Lin, Chun-Ling; Ko, Li-Wei; Liu, Shen-Yi; Su, Tung-Ping; Lin, Chin-Teng

    2014-01-01

    Sleep quality is important, especially given the considerable number of sleep-related pathologies. The distribution of sleep stages is a highly effective and objective way of quantifying sleep quality. As a standard multi-channel recording used in the study of sleep, polysomnography (PSG) is a widely used diagnostic scheme in sleep medicine. However, the standard process of the sleep clinical test, including PSG recording and manual scoring, is complex, uncomfortable, and time-consuming. This process is difficult to implement when taking whole PSG measurements at home for general healthcare purposes. This work presents a novel sleep stage classification system based on features from the two forehead EEG channels FP1 and FP2. By recording EEG from the forehead, where there is no hair, the proposed system can monitor physiological changes during sleep in a more practical way than previous systems. Through a headband or self-adhesive technology, the necessary sensors can be applied easily by users at home. Analysis results demonstrate that the classification performance of the proposed system overcomes individual differences between participants when automatically classifying sleep stages. Additionally, the proposed sleep stage classification system can identify kernel sleep features extracted from forehead EEG, which are closely related to sleep clinicians' expert knowledge. The forehead EEG features are classified into five sleep stages using the relevance vector machine. In a leave-one-subject-out cross validation analysis, we found our system to correctly classify five sleep stages at an average accuracy of 76.7 ± 4.0 (SD) % [average kappa 0.68 ± 0.06 (SD)]. Importantly, the proposed sleep stage classification system using forehead EEG features is a viable alternative for measuring EEG signals at home easily and conveniently to evaluate sleep quality reliably, ultimately improving public healthcare.
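    The leave-one-subject-out evaluation reported above can be reproduced in outline with scikit-learn's grouped cross-validation, as sketched below; a generic classifier stands in for the relevance vector machine (which scikit-learn does not ship), and the feature matrix, labels, and subject groups are placeholders.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.metrics import accuracy_score, cohen_kappa_score
      from sklearn.model_selection import LeaveOneGroupOut

      # Placeholder data: rows are 30-s epochs, columns are forehead-EEG features,
      # y holds the five sleep-stage labels, groups identifies the subject of each epoch.
      rng = np.random.default_rng(0)
      X = rng.normal(size=(600, 12))
      y = rng.integers(0, 5, size=600)
      groups = np.repeat(np.arange(10), 60)

      clf = RandomForestClassifier(n_estimators=200, random_state=0)  # stand-in for the RVM
      accs, kappas = [], []
      for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
          clf.fit(X[train_idx], y[train_idx])
          pred = clf.predict(X[test_idx])
          accs.append(accuracy_score(y[test_idx], pred))
          kappas.append(cohen_kappa_score(y[test_idx], pred))

      print(f"accuracy {np.mean(accs):.3f} +/- {np.std(accs):.3f}, "
            f"kappa {np.mean(kappas):.3f} +/- {np.std(kappas):.3f}")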

  6. Knowledge-Based Strategies in Canadian Workplaces: Is There a Role for Continuing Education?

    ERIC Educational Resources Information Center

    Willment, Jo-Anne

    2004-01-01

    A faculty researcher and six graduate students from the Master of Continuing Education program at the University of Calgary completed a small study of knowledge practices within government, postsecondary, and corporate workplaces across Canada. Interview results include an overview of findings and three narrative descriptions. Analysis produced a…

  7. Reducing a Knowledge-Base Search Space When Data Are Missing

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    This software addresses the problem of how to efficiently execute a knowledge base in the presence of missing data. Computationally, this is an exponentially expensive operation that without heuristics generates a search space of 1 + 2^n possible scenarios, where n is the number of rules in the knowledge base. Even for a knowledge base of the most modest size, say 16 rules, it would produce 65,537 possible scenarios. The purpose of this software is to reduce the complexity of this operation to a more manageable size. The problem that this system solves is to develop an automated approach that can reason in the presence of missing data. This is a meta-reasoning capability that repeatedly calls a diagnostic engine/model to provide prognoses and prognosis tracking. In the big picture, the scenario generator takes as its input the current state of a system, including probabilistic information from Data Forecasting. Using model-based reasoning techniques, it returns an ordered list of fault scenarios that could be generated from the current state, i.e., the plausible future failure modes of the system as it presently stands. The scenario generator models a Potential Fault Scenario (PFS) as a black box, the input of which is a set of states tagged with priorities and the output of which is one or more potential fault scenarios tagged by a confidence factor. The results from the system are used by a model-based diagnostician to predict the future health of the monitored system.
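    The scenario count above is easy to check with a short calculation; the pruning step is only indicated schematically here, since the heuristics themselves are not described in the abstract.

      def scenario_count(n_rules: int) -> int:
          """Worst-case number of scenarios when any subset of rules may be affected
          by missing data: one nominal scenario plus 2**n subsets."""
          return 1 + 2 ** n_rules

      assert scenario_count(16) == 65_537   # the modest 16-rule knowledge base in the text

      # Schematic pruning: keep only scenarios whose prior plausibility clears a cutoff.
      # (The actual heuristics used by the scenario generator are not specified here.)
      def prune(scenarios, plausibility, cutoff=0.01):
          return [s for s in scenarios if plausibility(s) >= cutoff]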

  8. The Adjustment of National Education Systems to a Knowledge-Based Economy: A New Approach

    ERIC Educational Resources Information Center

    Nelson, Moira

    2010-01-01

    In his article "Globalisation, the Learning Society, and Comparative Education", Peter Jarvis recommends lifelong learning in the period of globalisation as a topic ripe for scholarly research. In particular, he argues for the examination of the extent of lifelong learning around the world and its relation to different levels of…

  9. PromAn: an integrated knowledge-based web server dedicated to promoter analysis

    PubMed Central

    Lardenois, Aurélie; Chalmel, Frédéric; Bianchetti, Laurent; Sahel, José-Alain; Léveillard, Thierry; Poch, Olivier

    2006-01-01

    PromAn is a modular web-based tool dedicated to promoter analysis that integrates distinct complementary databases, methods and programs. PromAn provides automatic analysis of a genomic region with minimal prior knowledge of the genomic sequence. Prediction programs and experimental databases are combined to locate the transcription start site (TSS) and the promoter region within a large genomic input sequence. Transcription factor binding sites (TFBSs) can be predicted using several public databases and user-defined motifs. Also, a phylogenetic footprinting strategy, combining multiple alignment of large genomic sequences and assignment of various scores reflecting the evolutionary selection pressure, allows for evaluation and ranking of TFBS predictions. PromAn results can be displayed in an interactive graphical user interface, PromAnGUI. It integrates all of this information to highlight active promoter regions, to identify among the huge number of TFBS predictions those which are the most likely to be potentially functional and to facilitate user refined analysis. Such an integrative approach is essential in the face of a growing number of tools dedicated to promoter analysis in order to propose hypotheses to direct further experimental validations. PromAn is publicly available at . PMID:16845074

  10. A knowledge-based approach to arterial stiffness estimation using the digital volume pulse.

    PubMed

    Jang, Dae-Geun; Farooq, Umar; Park, Seung-Hun; Goh, Choong-Won; Hahn, Minsoo

    2012-08-01

    We have developed a knowledge-based approach for arterial stiffness estimation. The proposed approach reliably estimates arterial stiffness based on analysis of the age- and heart-rate-normalized reflected wave arrival time. It reduces cost, space, technical expertise, specialized equipment and complexity, and increases usability compared to recently researched noninvasive arterial stiffness estimators. The proposed method consists of two main stages: pulse feature extraction and linear regression analysis. The approach extracts the pulse features and establishes a linear prediction equation. When evaluated against pulse wave velocity (PWV)-based arterial stiffness estimators, the proposed methodology yielded error rates of 8.36% for men and 9.52% for women. With such low error rates and increased benefits, the proposed approach could be usefully applied as a low-cost and effective solution for ubiquitous and home healthcare environments.
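    The two-stage idea, extract a pulse feature and fit a linear prediction equation, can be sketched as below; the feature (reflected-wave arrival time normalized by age and heart rate) follows the abstract, but the fitting code, variable names, and sample values are illustrative assumptions rather than the paper's calibrated model.

      import numpy as np

      # Hypothetical training data: normalized reflected-wave arrival time from the
      # digital volume pulse, and a reference arterial-stiffness index (e.g. PWV-based).
      arrival_time_norm = np.array([0.21, 0.25, 0.28, 0.31, 0.35, 0.38])
      stiffness_ref = np.array([9.8, 9.1, 8.4, 7.9, 7.1, 6.6])

      # Stage 2: fit the linear prediction equation  stiffness ~ a * feature + b.
      A = np.column_stack([arrival_time_norm, np.ones_like(arrival_time_norm)])
      (a, b), *_ = np.linalg.lstsq(A, stiffness_ref, rcond=None)

      def estimate_stiffness(feature):
          """Apply the fitted linear prediction equation to a new pulse feature."""
          return a * feature + b

      pred = estimate_stiffness(arrival_time_norm)
      error_rate = np.mean(np.abs(pred - stiffness_ref) / stiffness_ref) * 100
      print(f"fit: stiffness = {a:.2f} * feature + {b:.2f}, mean error {error_rate:.1f}%")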

  11. An Object-Oriented Signal Processing Environment: The Knowledge-Based Signal Processing Package.

    DTIC Science & Technology

    1984-10-01

    languages have shortcomings that make them inadequate for building and maintaining large software systems [Barstow, Chapter 25]. John Backus states [Backus...M.I.T. by John McCarthy in the early 60's [McCarthy] is best described by the term functional or applicative language [Bia'i] Programming in a functional...underlying implementation mechanisms SEQ-SETQ (&REST ARGS) The version of the Lisp form "setq" that can be used for naming sequences. An even number of

  12. An evidence-based knowledgebase of metastasis suppressors to identify key pathways relevant to cancer metastasis.

    PubMed

    Zhao, Min; Li, Zhe; Qu, Hong

    2015-10-21

    Metastasis suppressor genes (MS genes) are genes that play important roles in inhibiting the process of cancer metastasis without preventing growth of the primary tumor. Identification of these genes and understanding of their functions are critical for investigation of cancer metastasis. Recent studies on cancer metastasis have identified many new susceptibility MS genes. However, a comprehensive illustration of the diverse cellular processes regulated by metastasis suppressors during the metastasis cascade is lacking. Thus, the relationship between MS genes and cancer risk is still unclear. To unveil the cellular complexity of MS genes, we have constructed MSGene (http://MSGene.bioinfo-minzhao.org/), the first literature-based gene resource for exploring human MS genes. In total, we manually curated 194 experimentally verified MS genes and mapped them to 1448 homologous genes from 17 model species. Follow-up functional analyses associated the 194 human MS genes with epithelium/tissue morphogenesis and epithelial cell proliferation. In addition, pathway analysis highlights the prominent role of MS genes in the activation of platelets and the coagulation system in the tumor metastatic cascade. Moreover, the global mutation pattern of MS genes across multiple cancers may reveal common cancer metastasis mechanisms. All these results illustrate the importance of MSGene to our understanding of cell development and cancer metastasis.

  13. Knowledge-based fuzzy system for diagnosis and control of an integrated biological wastewater treatment process.

    PubMed

    Pires, O C; Palma, C; Costa, J C; Moita, I; Alves, M M; Ferreira, E C

    2006-01-01

    A supervisory expert system based on fuzzy logic rules was developed for diagnosis and control of a laboratory-scale plant comprising anaerobic digestion and anoxic/aerobic modules for combined high rate biological N and C removal. The design and implementation of a computational environment in LabVIEW for data acquisition, plant operation and distributed equipment control is described. A step increase in ammonia concentration from 20 to 60 mg N/L was applied during a trial period of 73 h. Recycle flow rate from the aerobic to the anoxic module and bypass flow rate from the influent directly to the anoxic reactor were the output variables of the fuzzy system. They were automatically changed (from 34 to 111 L/day and from 8 to 13 L/day, respectively) when new plant conditions were recognised by the expert system. Denitrification efficiency higher than 85% was achieved 30 h after the disturbance and 15 h after the system response at an HRT as low as 1.5 h. Nitrification efficiency gradually increased from 12 to 50% at an HRT of 3 h. The system proved to react properly in order to set adequate operating conditions that led to timely and efficient recovery of N and C removal rates.
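
    As a rough illustration of the kind of fuzzy supervisory rule described above, the sketch below maps an influent ammonia level to a recycle flow set-point using triangular membership functions and a weighted-average defuzzification. The membership breakpoints are invented for illustration; only the 34 and 111 L/day flow levels are taken from the abstract, and this is not the authors' actual rule base.

    ```python
    # Minimal fuzzy-rule sketch: LOW ammonia -> low recycle flow,
    # HIGH ammonia -> high recycle flow, blended by membership degree.
    def tri(x, a, b, c):
        """Triangular membership function with feet at a and c, peak at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def recycle_flow(ammonia_mg_l):
        """Map influent ammonia (mg N/L) to a recycle flow set-point (L/day)."""
        low = tri(ammonia_mg_l, 0, 20, 40)
        high = tri(ammonia_mg_l, 30, 60, 90)
        total = low + high
        # Weighted-average (centre-of-gravity style) defuzzification over the
        # two rule outputs, 34 and 111 L/day.
        return (low * 34 + high * 111) / total if total else 34.0

    print(recycle_flow(20), recycle_flow(60))   # 34.0 at baseline, 111.0 after the step
    ```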

  14. From Cues to Nudge: A Knowledge-Based Framework for Surveillance of Healthcare-Associated Infections.

    PubMed

    Shaban-Nejad, Arash; Mamiya, Hiroshi; Riazanov, Alexandre; Forster, Alan J; Baker, Christopher J O; Tamblyn, Robyn; Buckeridge, David L

    2016-01-01

    We propose an integrated semantic web framework consisting of formal ontologies, web services, a reasoner and a rule engine that together recommend an appropriate level of patient care based on the defined semantic rules and guidelines. The classification of healthcare-associated infections within the HAIKU (Hospital Acquired Infections - Knowledge in Use) framework enables hospitals to consistently follow the standards along with their routine clinical practice and diagnosis coding, to improve quality of care and patient safety. The HAI ontology (HAIO) groups thousands of codes into a consistent hierarchy of concepts, along with relationships and axioms to capture knowledge on hospital-associated infections and complications, with a focus on the four major types: surgical site infections (SSIs), catheter-associated urinary tract infections (CAUTI), hospital-acquired pneumonia, and bloodstream infections. By employing statistical inference in our study, we use a set of heuristics to define rule axioms that improve SSI case detection. We also demonstrate how the occurrence of an SSI is identified using semantic e-triggers. The e-triggers will be used to improve our risk assessment of post-operative surgical site infections (SSIs) for patients undergoing certain types of surgery (e.g., coronary artery bypass graft (CABG) surgery).

  15. Theory of mind selectively predicts preschoolers’ knowledge-based selective word learning

    PubMed Central

    Brosseau-Liard, Patricia; Penney, Danielle; Poulin-Dubois, Diane

    2015-01-01

    Children can selectively attend to various attributes of a model, such as past accuracy or physical strength, to guide their social learning. There is a debate regarding whether a relation exists between theory-of-mind skills and selective learning. We hypothesized that high performance on theory-of-mind tasks would predict preference for learning new words from accurate informants (an epistemic attribute), but not from physically strong informants (a non-epistemic attribute). Three- and 4-year-olds (N = 65) completed two selective learning tasks, and their theory of mind abilities were assessed. As expected, performance on a theory-of-mind battery predicted children’s preference to learn from more accurate informants but not from physically stronger informants. Results thus suggest that preschoolers with more advanced theory of mind have a better understanding of knowledge and apply that understanding to guide their selection of informants. This work has important implications for research on children’s developing social cognition and early learning. PMID:26211504

  16. Knowledge-based decision support for Space Station assembly sequence planning

    NASA Technical Reports Server (NTRS)

    1991-01-01

    A complete Personal Analysis Assistant (PAA) for Space Station Freedom (SSF) assembly sequence planning consists of three software components: the system infrastructure, intra-flight value added, and inter-flight value added. The system infrastructure is the substrate on which software elements providing inter-flight and intra-flight value-added functionality are built. It provides the capability for building representations of assembly sequence plans and specification of constraints and analysis options. Intra-flight value-added provides functionality that will, given the manifest for each flight, define cargo elements, place them in the National Space Transportation System (NSTS) cargo bay, compute performance measure values, and identify violated constraints. Inter-flight value-added provides functionality that will, given major milestone dates and capability requirements, determine the number and dates of required flights and develop a manifest for each flight. The current project is Phase 1 of a projected two phase program and delivers the system infrastructure. Intra- and inter-flight value-added were to be developed in Phase 2, which has not been funded. Based on experience derived from hundreds of projects conducted over the past seven years, ISX developed an Intelligent Systems Engineering (ISE) methodology that combines the methods of systems engineering and knowledge engineering to meet the special systems development requirements posed by intelligent systems, systems that blend artificial intelligence and other advanced technologies with more conventional computing technologies. The ISE methodology defines a phased program process that begins with an application assessment designed to provide a preliminary determination of the relative technical risks and payoffs associated with a potential application, and then moves through requirements analysis, system design, and development.

  17. SLS-PLAN-IT: A knowledge-based blackboard scheduling system for Spacelab life sciences missions

    NASA Technical Reports Server (NTRS)

    Kao, Cheng-Yan; Lee, Seok-Hua

    1992-01-01

    The primary scheduling tool in use during the Spacelab Life Science (SLS-1) planning phase was the operations research (OR) based, tabular-form Experiment Scheduling System (ESS) developed by NASA Marshall. PLAN-IT is an artificial intelligence based interactive graphic timeline editor for ESS developed by JPL. The PLAN-IT software was enhanced for use in the scheduling of Spacelab experiments to support the SLS missions. The enhanced software, the SLS-PLAN-IT System, was used to support the real-time reactive scheduling task during the SLS-1 mission. SLS-PLAN-IT is a frame-based blackboard scheduling shell which, from scheduling input, creates resource-requiring event-duration objects and resource-usage duration objects. The blackboard structure keeps track of the effects of the event-duration objects on the resource-usage objects. Various scheduling heuristics are coded in procedural form and can be invoked at any time at the user's request. The system architecture is described along with what has been learned from the SLS-PLAN-IT project.
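
    A toy sketch of the blackboard idea described above is given below: event-duration objects post their resource demands and the blackboard accumulates usage per time slot so that heuristics can inspect violations. All class names, resources and limits are assumptions made for illustration, not the SLS-PLAN-IT implementation.

    ```python
    # Toy blackboard: events post resource demands; the board tracks usage.
    from collections import defaultdict

    class EventDuration:
        def __init__(self, name, start, end, demands):
            self.name, self.start, self.end = name, start, end
            self.demands = demands                     # e.g. {"power_kw": 2.0, "crew": 1}

    class Blackboard:
        def __init__(self):
            # time slot -> resource name -> cumulative load
            self.usage = defaultdict(lambda: defaultdict(float))

        def post(self, event):
            for t in range(event.start, event.end):
                for resource, amount in event.demands.items():
                    self.usage[t][resource] += amount

        def violations(self, limits):
            """Return (time, resource) pairs whose cumulative load exceeds its limit."""
            return [(t, r) for t, row in self.usage.items()
                    for r, load in row.items() if load > limits.get(r, float("inf"))]

    bb = Blackboard()
    bb.post(EventDuration("echo_scan", 0, 3, {"power_kw": 2.0, "crew": 1}))
    bb.post(EventDuration("centrifuge", 1, 4, {"power_kw": 3.5, "crew": 1}))
    print(bb.violations({"power_kw": 5.0, "crew": 1}))  # over-subscribed slots
    ```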

  18. Towards sustainable infrastructure management: knowledge-based service-oriented computing framework for visual analytics

    NASA Astrophysics Data System (ADS)

    Vatcha, Rashna; Lee, Seok-Won; Murty, Ajeet; Tolone, William; Wang, Xiaoyu; Dou, Wenwen; Chang, Remco; Ribarsky, William; Liu, Wanqiu; Chen, Shen-en; Hauser, Edd

    2009-05-01

    Infrastructure management (and its associated processes) is complex to understand and perform, and it is therefore hard to make efficient, effective and informed decisions about it. It is a multi-faceted operation that requires robust data fusion, visualization and decision making. In order to protect and build sustainable critical assets, we present our on-going multi-disciplinary large-scale project that establishes the Integrated Remote Sensing and Visualization (IRSV) system with a focus on supporting bridge structure inspection and management. This project involves specific expertise from civil engineers, computer scientists, geographers, and real-world practitioners from industry, local and federal government agencies. IRSV is being designed to accommodate the essential needs of the following aspects: 1) Better understanding and enforcement of the complex inspection process, bridging the gap between evidence gathering and decision making through the implementation of an ontological knowledge engineering system; 2) Aggregation, representation and fusion of complex multi-layered heterogeneous data (i.e. infrared imaging, aerial photos, ground-mounted LIDAR, etc.) with domain application knowledge to support a machine-understandable recommendation system; 3) Robust visualization techniques with large-scale analytical and interactive visualizations that support users' decision making; and 4) Integration of these needs through a flexible Service-Oriented Architecture (SOA) framework to compose and provide services on demand. IRSV is expected to serve as a management and data visualization tool for construction deliverable assurance and infrastructure monitoring, both periodically (annually, monthly, even daily if needed) and after extreme events.

  19. A Very Large Area Network (VLAN) knowledge-base applied to space communication problems

    NASA Technical Reports Server (NTRS)

    Zander, Carol S.

    1988-01-01

    This paper first describes a hierarchical model for very large area networks (VLAN). Space communication problems whose solution could profit by the model are discussed and then an enhanced version of this model incorporating the knowledge needed for the missile detection-destruction problem is presented. A satellite network or VLAN is a network which includes at least one satellite. Due to the complexity, a compromise between fully centralized and fully distributed network management has been adopted. Network nodes are assigned to a physically localized group, called a partition. Partitions consist of groups of cell nodes with one cell node acting as the organizer or master, called the Group Master (GM). Coordinating the group masters is a Partition Master (PM). Knowledge is also distributed hierarchically existing in at least two nodes. Each satellite node has a back-up earth node. Knowledge must be distributed in such a way so as to minimize information loss when a node fails. Thus the model is hierarchical both physically and informationally.
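
    The sketch below gives a toy illustration of the hierarchical knowledge distribution described above: each knowledge item is stored on its owning node and mirrored to a designated backup (a satellite node's backup being an earth node), so that a single node failure loses no information. The classes, node names and replication policy are illustrative assumptions, not the paper's design.

    ```python
    # Toy hierarchical redundancy: knowledge is written to the owner and its backup.
    class Node:
        def __init__(self, name, kind, backup=None):
            self.name, self.kind, self.backup = name, kind, backup
            self.knowledge = {}

        def store(self, key, value):
            self.knowledge[key] = value
            if self.backup is not None:          # replicate down the hierarchy
                self.backup.knowledge[key] = value

    earth_gm = Node("GM-earth-1", "earth")                    # group master on the ground
    sat = Node("sat-7", "satellite", backup=earth_gm)         # satellite cell node

    sat.store("track/obj-42", {"lat": 51.2, "lon": -0.4})

    # If sat-7 fails, the group master's backup copy is still available.
    print(earth_gm.knowledge["track/obj-42"])
    ```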

  20. The application of integrated knowledge-based systems for the Biomedical Risk Assessment Intelligent Network (BRAIN)

    NASA Technical Reports Server (NTRS)

    Loftin, Karin C.; Ly, Bebe; Webster, Laurie; Verlander, James; Taylor, Gerald R.; Riley, Gary; Culbert, Chris

    1992-01-01

    One of NASA's goals for long duration space flight is to maintain acceptable levels of crew health, safety, and performance. One way of meeting this goal is through BRAIN, an integrated network of both human and computer elements. BRAIN will function as an advisor to mission managers by assessing the risk of inflight biomedical problems and recommending appropriate countermeasures. Described here is a joint effort among various NASA elements to develop BRAIN and the Infectious Disease Risk Assessment (IDRA) prototype. The implementation of this effort addresses the technological aspects of knowledge acquisition, integration of IDRA components, the use of expert systems to automate the biomedical prediction process, development of a user-friendly interface, and integration of the IDRA and ExerCISys systems. Because the C language, CLIPS and the X Window System are portable and easily integrated, they were chosen as the tools for the initial IDRA prototype.

  1. A knowledge-based design framework for airplane conceptual and preliminary design

    NASA Astrophysics Data System (ADS)

    Anemaat, Wilhelmus A. J.

    The goal of the work described herein is to develop the second generation of Advanced Aircraft Analysis (AAA) into an object-oriented structure which can be used in different environments. One such environment is the third generation of AAA with its own user interface; the other environment, with the same AAA methods (i.e. the knowledge), is the AAA-AML program. AAA-AML automates the initial airplane design process using current AAA methods in combination with AMRaven methodologies for dependency tracking and knowledge management, using the TechnoSoft Adaptive Modeling Language (AML). This will lead to the following benefits: (1) Reduced design time: computer aided design methods can reduce design and development time and replace tedious hand calculations. (2) Better product through improved design: more alternative designs can be evaluated in the same time span, which can lead to improved quality. (3) Reduced design cost: due to less training and fewer calculation errors, substantial savings in design time and related cost can be obtained. (4) Improved efficiency: the design engineer can avoid technically correct but irrelevant calculations on incomplete or out-of-sync information, particularly if the process enables robust geometry earlier. Although numerous advancements in knowledge-based design have been developed for detailed design, currently no such integrated knowledge-based conceptual and preliminary airplane design system exists. The third-generation AAA methods have been tested over a ten-year period on many different airplane designs and demonstrate significant time savings. The AAA-AML system will be exercised and tested using 27 existing airplanes ranging from single engine propeller aircraft, business jets, airliners and UAVs to fighters. Data for the varied sizing methods will be compared with AAA results to validate these methods. One new design, a Light Sport Aircraft (LSA), will be developed as an exercise to use the tool for designing a new airplane. Using these tools shows an improvement in efficiency over using separate programs, due to the automatic recalculation with any change of input data. The direct visual feedback of 3D geometry in the AAA-AML leads to quicker resolution of problems than conventional methods.

  2. Sensitivity analysis of land unit suitability for conservation using a knowledge-based system.

    PubMed

    Humphries, Hope C; Bourgeron, Patrick S; Reynolds, Keith M

    2010-08-01

    The availability of spatially continuous data layers can have a strong impact on selection of land units for conservation purposes. The suitability of ecological conditions for sustaining the targets of conservation is an important consideration in evaluating candidate conservation sites. We constructed two fuzzy logic-based knowledge bases to determine the conservation suitability of land units in the interior Columbia River basin using NetWeaver software in the Ecosystem Management Decision Support application framework. Our objective was to assess the sensitivity of suitability ratings, derived from evaluating the knowledge bases, to fuzzy logic function parameters and to the removal of data layers (land use condition, road density, disturbance regime change index, vegetation change index, land unit size, cover type size, and cover type change index). The amount and geographic distribution of suitable land polygons was most strongly altered by the removal of land use condition, road density, and land polygon size. Removal of land use condition changed suitability primarily on private or intensively-used public land. Removal of either road density or land polygon size most strongly affected suitability on higher-elevation US Forest Service land containing small-area biophysical environments. Data layers with the greatest influence differed in rank between the two knowledge bases. Our results reinforce the importance of including both biophysical and socio-economic attributes to determine the suitability of land units for conservation. The sensitivity tests provided information about knowledge base structuring and parameterization as well as prioritization for future data needs.

  3. Knowledge-Based Natural Language Understanding: A AAAI-87 Survey Talk

    DTIC Science & Technology

    1987-01-01

    textbooks on AI, but if you're not familiar with the concept, I'll run through it very briefly. Scripts are designed to encode stereotypic event...balloon script. Here is our stereotypic event knowledge about balloons: They start out in an uninflated state. They get inflated in one of two... stereotypic manners, they get tied, and then they die a natural death in one of three ways (see figure 2). This is event

  4. Knowledge-Based Natural Language Understanding: A AAAI-87 Survey Talk

    DTIC Science & Technology

    1991-01-01

    stereotypic event sequences. This is mundane knowledge about some standard scenario for which a common linguistic community shares knowledge. So, for...a balloon script. Here is our stereotypic event knowledge about balloons: They start out in an uninflated state. They get inflated in one of two... stereotypic manners, they get tied, and then they die a natural death in one of three ways (see figure 2). This is event

  5. Technical Opinions Regarding Knowledge-Based Integrated Information Systems Engineering. Volume 8.

    DTIC Science & Technology

    1987-12-01

    794-7696 Michael Stonebraker, Department of Computer Science, U.C. Berkeley, 571 Evans Hall, Berkeley, CA 94702, (415) 642-5799 (or 1024) Paul Thompson...on Object-Oriented Systems, IEEE Computer Society, 1985. [MANO86a] F. Manola and J. Orenstein, "Toward a General Spatial Data Model for an Object

  6. Knowledge-based program to assist in the design of machine vision systems

    NASA Astrophysics Data System (ADS)

    Batchelor, Bruce G.

    1998-10-01

    There exists a serious bottleneck in the process of designing Machine Vision Systems. This is so severe that the long-claimed flexibility of this technology will never be realized, unless there is a significant increase in the capacity of present-day vision system design teams. One possible way to improve matters is to provide appropriate design tools that will amplify the efforts of engineers who lack the necessary educational background. This article describes a major extension to an existing program, called the Lighting Advisor, which is able to search a pictorial database, looking for keywords chosen by the user. The revised program bases its advice on a description of the object to be inspected and the working environment. The objective of this research is to reduce the skill level needed to operate the program, so that an industrial engineer, with little or no special training in Machine Vision, can receive appropriate and relevant advice, relating to a range of tasks in the design of industrial vision systems.

  7. Knowledge-based load leveling and task allocation in human-machine systems

    NASA Technical Reports Server (NTRS)

    Chignell, M. H.; Hancock, P. A.

    1986-01-01

    Conventional human-machine systems use task allocation policies which are based on the premise of a flexible human operator. This individual is most often required to compensate for and augment the capabilities of the machine. The development of artificial intelligence and improved technologies have allowed for a wider range of task allocation strategies. In response to these issues a Knowledge Based Adaptive Mechanism (KBAM) is proposed for assigning tasks to human and machine in real time, using a load leveling policy. This mechanism employs an online workload assessment and compensation system which is responsive to variations in load through an intelligent interface. This interface consists of a loading strategy reasoner which has access to information about the current status of the human-machine system as well as a database of admissible human/machine loading strategies. Difficulties standing in the way of successful implementation of the load leveling strategy are examined.

  8. A knowledge-based system design/information tool for aircraft flight control systems

    NASA Technical Reports Server (NTRS)

    Mackall, Dale A.; Allen, James G.

    1989-01-01

    Research aircraft have become increasingly dependent on advanced control systems to accomplish program goals. These aircraft are integrating multiple disciplines to improve performance and satisfy research objectives. This integration is being accomplished through electronic control systems. Because of the number of systems involved and the variety of engineering disciplines, systems design methods and information management have become essential to program success. The primary objective of the system design/information tool for aircraft flight control system is to help transfer flight control system design knowledge to the flight test community. By providing all of the design information and covering multiple disciplines in a structured, graphical manner, flight control systems can more easily be understood by the test engineers. This will provide the engineers with the information needed to thoroughly ground test the system and thereby reduce the likelihood of serious design errors surfacing in flight. The secondary objective is to apply structured design techniques to all of the design domains. By using the techniques in the top level system design down through the detailed hardware and software designs, it is hoped that fewer design anomalies will result. The flight test experiences of three highly complex, integrated aircraft programs are reviewed: the X-29 forward-swept wing, the advanced fighter technology integration (AFTI) F-16, and the highly maneuverable aircraft technology (HiMAT) program. Significant operating anomalies and the design errors which cause them, are examined to help identify what functions a system design/information tool should provide to assist designers in avoiding errors.

  9. Knowledge-based machine indexing from natural language text: Knowledge base design, development, and maintenance

    NASA Technical Reports Server (NTRS)

    Genuardi, Michael T.

    1993-01-01

    One strategy for machine-aided indexing (MAI) is to provide a concept-level analysis of the textual elements of documents or document abstracts. In such systems, natural-language phrases are analyzed in order to identify and classify concepts related to a particular subject domain. The overall performance of these MAI systems is largely dependent on the quality and comprehensiveness of their knowledge bases. These knowledge bases function to (1) define the relations between a controlled indexing vocabulary and natural language expressions; (2) provide a simple mechanism for disambiguation and the determination of relevancy; and (3) allow the extension of concept-hierarchical structure to all elements of the knowledge file. After a brief description of the NASA Machine-Aided Indexing system, concerns related to the development and maintenance of MAI knowledge bases are discussed. Particular emphasis is given to statistically-based text analysis tools designed to aid the knowledge base developer. One such tool, the Knowledge Base Building (KBB) program, presents the domain expert with a well-filtered list of synonyms and conceptually-related phrases for each thesaurus concept. Another tool, the Knowledge Base Maintenance (KBM) program, functions to identify areas of the knowledge base affected by changes in the conceptual domain (for example, the addition of a new thesaurus term). An alternate use of the KBM as an aid in thesaurus construction is also discussed.

  10. Adaptation of a Knowledge-Based Decision-Support System in the Tactical Environment.

    DTIC Science & Technology

    1981-12-01

    response. One approach to accomplish this is the incorporation of artificial intelligence into the tactical decision-support system (TDSS). Although most...our solution. A very critical, perhaps the most critical, shortcoming of current systems is the difficulty encountered in adapting the TDSS to changing...different. A TDSS, to be truly responsive to the needs of the tactical commander, must be able to adapt to these changes rapidly. Current systems

  11. Chemogenomics knowledgebase and systems pharmacology for hallucinogen target identification-Salvinorin A as a case study.

    PubMed

    Xu, Xiaomeng; Ma, Shifan; Feng, Zhiwei; Hu, Guanxing; Wang, Lirong; Xie, Xiang-Qun

    2016-11-01

    Drug abuse is a serious problem worldwide. Recently, hallucinogens have been reported as a potential preventative and auxiliary therapy for substance abuse. However, the use of hallucinogens as a drug abuse treatment has potential risks, as the fundamental mechanisms of hallucinogens are not clear. So far, no scientific database has been available to support research on the mechanisms of hallucinogens. We constructed a hallucinogen-specific chemogenomics database by collecting chemicals, protein targets and pathways closely related to hallucinogens. This information, together with our established computational chemogenomics tools, such as TargetHunter and HTDocking, provides a one-step solution for the mechanism study of hallucinogens. We chose salvinorin A, a potent hallucinogen extracted from the plant Salvia divinorum, as an example to demonstrate the usability of our platform. With the help of the HTDocking program, we predicted four novel targets for salvinorin A, including muscarinic acetylcholine receptor 2, cannabinoid receptor 1, cannabinoid receptor 2 and dopamine receptor 2. We looked into the interactions between salvinorin A and the predicted targets. The binding modes, poses and docking scores indicate that salvinorin A may interact with some of these predicted targets. Overall, our database enriches the information available for systems pharmacological analysis, target identification and drug discovery for hallucinogens.

  12. The Digital Anatomist Distributed Framework and Its Applications to Knowledge-based Medical Imaging

    PubMed Central

    Brinkley, James F.; Rosse, Cornelius

    1997-01-01

    Abstract The domain of medical imaging is anatomy. Therefore, anatomic knowledge should be a rational basis for organizing and analyzing images. The goals of the Digital Anatomist Program at the University of Washington include the development of an anatomically based software framework for organizing, analyzing, visualizing and utilizing biomedical information. The framework is based on representations for both spatial and symbolic anatomic knowledge, and is being implemented in a distributed architecture in which multiple client programs on the Internet are used to update and access an expanding set of anatomical information resources. The development of this framework is driven by several practical applications, including symbolic anatomic reasoning, knowledge based image segmentation, anatomy information retrieval, and functional brain mapping. Since each of these areas involves many difficult image processing issues, our research strategy is an evolutionary one, in which applications are developed somewhat independently, and partial solutions are integrated in a piecemeal fashion, using the network as the substrate. This approach assumes that networks of interacting components can synergistically work together to solve problems larger than either could solve on its own. Each of the individual projects is described, along with evaluations that show that the individual components are solving the problems they were designed for, and are beginning to interact with each other in a synergistic manner. We argue that this synergy will increase, not only within our own group, but also among groups as the Internet matures, and that an anatomic knowledge base will be a useful means for fostering these interactions. PMID:9147337

  13. A knowledge-based weighting approach to ligand-based virtual screening.

    PubMed

    Stiefl, Nikolaus; Zaliani, Andrea

    2006-01-01

    On the basis of the recently introduced reduced graph concept of ErG (extending reduced graphs), a straightforward weighting approach to include additional (e.g., structural or SAR) knowledge into similarity searching procedures for virtual screening (wErG) is proposed. This simple procedure is exemplified with three data sets, for which interaction patterns available from X-ray structures of native or peptidomimetic ligands with their target protein are used to significantly improve retrieval rates of known actives from the MDL Drug Report database. The results are compared to those of other virtual screening techniques such as Daylight fingerprints, FTrees, UNITY, and various FlexX docking protocols. Here, it is shown that wErG exhibits a very good and stable performance independent of the target structure. On the basis of this (and the fact that ErG retrieves structurally more dissimilar compounds due to its potential to perform scaffold-hopping), the combination of wErG and FlexX is successfully explored. Overall, wErG is not only an easily applicable weighting procedure that efficiently identifies actives in large data sets but it is also straightforward to understand for both medicinal and computational chemists and can, therefore, be driven by several aspects of project-related knowledge (e.g., X-ray, NMR, SAR, and site-directed mutagenesis) in a very early stage of the hit identification process.
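
    A rough sketch of the kind of knowledge-based weighting this describes is given below: pharmacophore-type count vectors from a reduced graph are compared with a Tanimoto-style score in which features known (for example from X-ray interaction patterns) to contact the target are up-weighted. The feature set, counts and weights are purely illustrative; this is not the published wErG scoring function.

    ```python
    # Weighted Tanimoto-style similarity over pharmacophore-type count vectors.
    import numpy as np

    def weighted_tanimoto(query, candidate, weights):
        q = np.asarray(query, dtype=float) * weights
        c = np.asarray(candidate, dtype=float) * weights
        num = float(np.dot(q, c))
        den = float(np.dot(q, q) + np.dot(c, c) - num)
        return num / den if den else 0.0

    # Counts of (donor, acceptor, aromatic, hydrophobic) features in a reduced graph.
    query     = [2, 3, 1, 4]
    candidate = [1, 3, 1, 5]
    weights   = np.array([2.0, 2.0, 1.0, 1.0])  # up-weight features assumed to match the X-ray contacts
    print(weighted_tanimoto(query, candidate, weights))
    ```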

  14. Knowledge-based geographic information systems (KBGIS): New analytic and data management tools

    USGS Publications Warehouse

    Albert, T.M.

    1988-01-01

    In its simplest form, a geographic information system (GIS) may be viewed as a data base management system in which most of the data are spatially indexed, and upon which sets of procedures operate to answer queries about spatial entities represented in the data base. Utilization of artificial intelligence (AI) techniques can enhance greatly the capabilities of a GIS, particularly in handling very large, diverse data bases involved in the earth sciences. A KBGIS has been developed by the U.S. Geological Survey which incorporates AI techniques such as learning, expert systems, new data representation, and more. The system, which will be developed further and applied, is a prototype of the next generation of GIS's, an intelligent GIS, as well as an example of a general-purpose intelligent data handling system. The paper provides a description of KBGIS and its application, as well as the AI techniques involved. ?? 1988 International Association for Mathematical Geology.

  15. REGene: a literature-based knowledgebase of animal regeneration that bridge tissue regeneration and cancer

    PubMed Central

    Zhao, Min; Rotgans, Bronwyn; Wang, Tianfang; Cummins, S. F.

    2016-01-01

    Regeneration is a common phenomenon across multiple animal phyla. Regeneration-related genes (REGs) are critical for fundamental cellular processes such as proliferation and differentiation. Identification of REGs and elucidating their functions may help to further develop effective treatment strategies in regenerative medicine. So far, REGs have been largely identified by small-scale experimental studies and a comprehensive characterization of the diverse biological processes regulated by REGs is lacking. Therefore, there is an ever-growing need to integrate REGs at the genomics, epigenetics, and transcriptome level to provide a reference list of REGs for regeneration and regenerative medicine research. Towards achieving this, we developed the first literature-based database called REGene (REgeneration Gene database). In the current release, REGene contains 948 human (929 protein-coding and 19 non-coding genes) and 8445 homologous genes curated from gene ontology and extensive literature examination. Additionally, the REGene database provides detailed annotations for each REG, including: gene expression, methylation sites, upstream transcription factors, and protein-protein interactions. An analysis of the collected REGs reveals strong links to a variety of cancers in terms of genetic mutation, protein domains, and cellular pathways. We have prepared a web interface to share these regeneration genes, supported by refined browsing and searching functions at http://REGene.bioinfo-minzhao.org/. PMID:26975833

  16. Application of knowledge-based network processing to automated gas chromatography data interpretation

    SciTech Connect

    Levis, A.P.; Timpany, R.G.; Klotter, D.A.

    1995-10-01

    A method of translating a two-way table of qualified symptom/cause relationships into a four layer Expert Network for diagnosis of machine or sample preparation failure for Gas Chromatography is presented. This method has proven to successfully capture an expert's ability to predict causes of failure in a Gas Chromatograph based on a small set of symptoms, derived from a chromatogram, in spite of poorly defined category delineations and definitions. In addition, the resulting network possesses the advantages inherent in most neural networks: the ability to function correctly in the presence of missing or uncertain inputs and the ability to improve performance through data-based training procedures. Acquisition of knowledge from the domain experts produced a group of imprecise cause-to-symptom relationships. These are reproduced as parallel pathways composed of Symptom-Filter-Combination-Cause node chains in the network representation. Each symptom signal is passed through a Filter node to determine if the signal should be interpreted as positive or negative evidence and then modified according to the relationship established by the domain experts. The signals from several processed symptoms are then combined in the Combination node(s) for a given cause. The resulting value is passed to the Cause node and the highest valued Cause node is then selected as the most probable cause of failure.
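
    The Symptom-Filter-Combination-Cause chain described above could be prototyped along the lines of the sketch below. The symptom names, signs and weights are invented; only the structure (filtered evidence summed per cause, with the highest-scoring cause reported) follows the description.

    ```python
    # Toy four-layer expert network: symptoms feed signed "filter" weights that are
    # summed per cause; the strongest cause node wins.
    RULES = {
        "column_contamination": [("baseline_drift", +1.0), ("ghost_peaks", +0.8)],
        "septum_leak":          [("retention_shift", +0.6), ("peak_area_loss", +1.0)],
        "sample_prep_error":    [("peak_area_loss", +0.7), ("ghost_peaks", -0.5)],
    }

    def diagnose(symptoms):
        """symptoms: dict of symptom name -> strength in [0, 1]; missing means unknown."""
        scores = {}
        for cause, links in RULES.items():
            # filter node: signed weight; combination node: sum of filtered evidence
            scores[cause] = sum(sign * symptoms.get(name, 0.0) for name, sign in links)
        return max(scores, key=scores.get), scores

    best, scores = diagnose({"baseline_drift": 0.9, "ghost_peaks": 0.4})
    print(best, scores)
    ```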

  17. Knowledge-based analysis of microarray gene expression data by using support vector machines

    SciTech Connect

    William Grundy; Manuel Ares, Jr.; David Haussler

    2001-06-18

    The authors introduce a method of functionally classifying genes by using gene expression data from DNA microarray hybridization experiments. The method is based on the theory of support vector machines (SVMs). SVMs are considered a supervised computer learning method because they exploit prior knowledge of gene function to identify unknown genes of similar function from expression data. SVMs avoid several problems associated with unsupervised clustering methods, such as hierarchical clustering and self-organizing maps. SVMs have many mathematical features that make them attractive for gene expression analysis, including their flexibility in choosing a similarity function, sparseness of solution when dealing with large data sets, the ability to handle large feature spaces, and the ability to identify outliers. They test several SVMs that use different similarity metrics, as well as some other supervised learning methods, and find that the SVMs best identify sets of genes with a common function using expression data. Finally, they use SVMs to predict functional roles for uncharacterized yeast ORFs based on their expression data.
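
    As a hedged illustration of the kind of supervised classification described above, the sketch below trains an RBF-kernel SVM on synthetic expression profiles with scikit-learn. The original work used real microarray data and purpose-built kernel functions, so the data shapes, kernel choice and parameters here are placeholders.

    ```python
    # SVM classification of genes into a functional class from expression profiles.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_genes, n_conditions = 200, 80
    X = rng.normal(size=(n_genes, n_conditions))     # synthetic expression profiles
    y = rng.integers(0, 2, size=n_genes)             # 1 = member of the functional class
    X[y == 1, :10] += 1.0                            # give the class a weak expression signature

    clf = SVC(kernel="rbf", C=1.0, gamma="scale", class_weight="balanced")
    print(cross_val_score(clf, X, y, cv=5).mean())   # cross-validated accuracy
    ```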

  18. A knowledge-based tool for multilevel decomposition of a complex design problem

    NASA Technical Reports Server (NTRS)

    Rogers, James L.

    1989-01-01

    Although much work has been done in applying artificial intelligence (AI) tools and techniques to problems in different engineering disciplines, only recently has the application of these tools begun to spread to the decomposition of complex design problems. A new tool based on AI techniques has been developed to implement a decomposition scheme suitable for multilevel optimization and display of data in an N x N matrix format.

  19. NRPS-PKS: a knowledge-based resource for analysis of NRPS/PKS megasynthases.

    PubMed

    Ansari, Mohd Zeeshan; Yadav, Gitanjali; Gokhale, Rajesh S; Mohanty, Debasisa

    2004-07-01

    NRPS-PKS is web-based software for analysing large multi-enzymatic, multi-domain megasynthases that are involved in the biosynthesis of pharmaceutically important natural products such as cyclosporin, rifamycin and erythromycin. NRPS-PKS has been developed based on a comprehensive analysis of the sequence and structural features of several experimentally characterized biosynthetic gene clusters. The results of these analyses have been organized as four integrated searchable databases for elucidating domain organization and substrate specificity of nonribosomal peptide synthetases and three types of polyketide synthases. These databases work as the backend of NRPS-PKS and provide the knowledge base for predicting domain organization and substrate specificity of uncharacterized NRPS/PKS clusters. Benchmarking on a large set of biosynthetic gene clusters has demonstrated that, apart from correct identification of NRPS and PKS domains, NRPS-PKS can also predict specificities of adenylation and acyltransferase domains with reasonably high accuracy. These features of NRPS-PKS make it a valuable resource for identification of natural products biosynthesized by NRPS/PKS gene clusters found in newly sequenced genomes. The training and test sets of gene clusters included in NRPS-PKS correlate information on 307 open reading frames, 2223 functional protein domains, 68 starter/extender precursors and their specific recognition motifs, and also the chemical structure of 101 natural products from four different families. NRPS-PKS is a unique resource which provides a user-friendly interface for correlating chemical structures of natural products with the domains and modules in the corresponding nonribosomal peptide synthetases or polyketide synthases. It also provides guidelines for domain/module swapping as well as site-directed mutagenesis experiments to engineer biosynthesis of novel natural products. NRPS-PKS can be accessed at http://www.nii.res.in/nrps-pks.html.

  20. Knowledge-Based Logistics Planning: Its Application in Manufacturing and Strategic Planning

    DTIC Science & Technology

    1991-09-01

    achieves local consistency between groups of variables via the elimination of incompatible values [Montanari 74, Mackworth 77, Davis 87]. The generality...variable vertices are compatible with each other. A satisfiability specification vertex groups constraints into sets of type AND, OR, or XOR. An XOR... group of agents each of which has (a) limited knowledge of the environment, (b) limited knowledge of the constraints and requirements of other agents

  1. Expert knowledge-based assessment of farming practices for different biotic indicators using fuzzy logic.

    PubMed

    Sattler, Claudia; Stachow, Ulrich; Berger, Gert

    2012-03-01

    The study presented here describes a modeling approach for the ex-ante assessment of farming practices with respect to their risk for several single-species biodiversity indicators. The approach is based on fuzzy-logic techniques and, thus, is tolerant to the inclusion of sources of uncertain knowledge, such as expert judgment into the assessment. The result of the assessment is a so-called Index of Suitability (IS) for the five selected biotic indicators calculated per farming practice. Results of IS values are presented for the comparison of crops and for the comparison of several production alternatives per crop (e.g., organic vs. integrated farming, mineral vs. organic fertilization, and reduced vs. plow tillage). Altogether, the modeled results show that the different farming practices can greatly differ in terms of their suitability for the different biotic indicators and that the farmer has a certain scope of flexibility in opting for a farming practice that is more in favor of biodiversity conservation. Thus, the approach is apt to identify farming practices that contribute to biodiversity conservation and, moreover, enables the identification of farming practices that are suitable with respect to more than one biotic indicator.

  2. KUPSnet: Knowledge-based Ubiquitous and Persistent Sensor Network Testbed for Threat Assessment

    DTIC Science & Technology

    2010-09-16

    distribution of interferometer baselines”, Astronomy and Astrophysics Supplement Ser., vol. 15, pp. 417-426, 1974. [12] R. J. -M. Cramer, R. A. Scholtz and M...Grant CNS-0721515, CNS-0831902 and CCF-0956438. The research of X. Cheng was supported in part by the NSF CAREER award CNS-0347674 and NSF under...in 2006 and joined NSF again as a part-time program director in April 2008. She received the NSF CAREER Award in 2004. Sherwood W. Samn was born in Los

  3. Application of knowledge-based vision to closed-loop control of the injection molding process

    NASA Astrophysics Data System (ADS)

    Marsh, Robert; Stamp, R. J.; Hill, T. M.

    1997-10-01

    An investigation is under way to develop a control system for an industrial process which uses a vision system as a sensor. The research is aimed at the improvement of product quality in commercial injection molding systems. A significant enhancement has been achieved in the level of application of visually based inspection techniques to component quality. The aim of the research has been the investigation, and employment, of inspection methods that use knowledge-based machine vision. The application of such techniques in this context is comprehensive, extending from object-oriented analysis, design and programming of the inspection program, to the application of rule-based reasoning to image interpretation, vision system diagnostics, component diagnostics and molding machine control. In this way, knowledge handling methods are exploited wherever they prove to be beneficial. The vision knowledge base contains information on the procedures required to achieve successful identification of component surface defects. A collection of image processing and pattern recognition algorithms is applied selectively. Once inspection of the component has been performed, defects are related to process variables which affect the quality of the component, and another knowledge base is used to effect a control action at the molding machine. Feedback from other machine sensors is also used to direct the control procedure. Results from the knowledge-based vision inspection system are encouraging. They indicate that rapid and effective fault detection and analysis is feasible, as is the verification of system integrity.

  4. Meta-Modeling: A Knowledge-Based Approach to Facilitating Model Construction and Reuse

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Dungan, Jennifer L.

    1997-01-01

    In this paper, we introduce a new modeling approach called meta-modeling and illustrate its practical applicability to the construction of physically-based ecosystem process models. As a critical adjunct to modeling codes, meta-modeling requires explicit specification of certain background information related to the construction and conceptual underpinnings of a model. This information formalizes the heretofore tacit relationship between the mathematical modeling code and the underlying real-world phenomena being investigated, and gives insight into the process by which the model was constructed. We show how the explicit availability of such information can make models more understandable and reusable and less subject to misinterpretation. In particular, background information enables potential users to better interpret an implemented ecosystem model without direct assistance from the model author. Additionally, we show how the discipline involved in specifying background information leads to improved management of model complexity and fewer implementation errors. We illustrate the meta-modeling approach in the context of the Scientists' Intelligent Graphical Modeling Assistant (SIGMA), a new model construction environment. As the user constructs a model using SIGMA, the system adds appropriate background information that ties the executable model to the underlying physical phenomena under investigation. Not only does this information improve the understandability of the final model, it also serves to reduce the overall time and programming expertise necessary to initially build and subsequently modify models. Furthermore, SIGMA's use of background knowledge helps eliminate coding errors resulting from scientific and dimensional inconsistencies that are otherwise difficult to avoid when building complex models. As a demonstration of SIGMA's utility, the system was used to reimplement and extend a well-known forest ecosystem dynamics model: Forest-BGC.
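
    The dimensional-consistency checking mentioned above can be illustrated with a tiny sketch in which every model quantity carries exponents over SI base units and additions of mismatched dimensions are rejected. The representation and error handling here are assumptions for illustration, not SIGMA's mechanism.

    ```python
    # Toy dimensional bookkeeping: a quantity is a value plus exponents over
    # (metre, kilogram, second); '+' requires matching dimensions, '*' adds them.
    class Quantity:
        def __init__(self, value, dims):
            self.value, self.dims = value, tuple(dims)

        def __add__(self, other):
            if self.dims != other.dims:
                raise ValueError(f"dimension mismatch: {self.dims} vs {other.dims}")
            return Quantity(self.value + other.value, self.dims)

        def __mul__(self, other):
            return Quantity(self.value * other.value,
                            tuple(a + b for a, b in zip(self.dims, other.dims)))

    biomass = Quantity(12.0, (0, 1, 0))        # kg
    flux    = Quantity(0.3,  (0, 1, -1))       # kg/s
    dt      = Quantity(86400.0, (0, 0, 1))     # s
    biomass = biomass + flux * dt              # OK: kg + kg
    # biomass + flux                           # would raise ValueError
    print(biomass.value, biomass.dims)
    ```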

  5. A Knowledge-Based Approach to Describe and Adapt Learning Objects

    ERIC Educational Resources Information Center

    Bouzeghoub, Amel; Defude, Bruno; Duitama, John Freddy; Lecocq, Claire

    2006-01-01

    Our claim is that semantic metadata are required to allow a real reusing and assembling of learning objects. Our system is based on three models used to describe the domain, learners, and learning objects. The learning object model is inspired from knowledge representation proposals. A learning object can be reused directly or can be combined with…

  6. Membrane transporters in a human genome-scale metabolic knowledgebase and their implications for disease

    PubMed Central

    Sahoo, Swagatika; Aurich, Maike K.; Jonsson, Jon J.; Thiele, Ines

    2014-01-01

    Membrane transporters enable efficient cellular metabolism, aid in nutrient sensing, and have been associated with various diseases, such as obesity and cancer. Genome-scale metabolic network reconstructions capture genomic, physiological, and biochemical knowledge of a target organism, along with a detailed representation of the cellular metabolite transport mechanisms. Since the first reconstruction of human metabolism, Recon 1, published in 2007, progress has been made in the field of metabolite transport. Recently, we published an updated reconstruction, Recon 2, which significantly improved the metabolic coverage and functionality. Human metabolic reconstructions have been used to investigate the role of metabolism in disease and to predict biomarkers and drug targets. Given the importance of cellular transport systems in understanding human metabolism in health and disease, we analyzed the coverage of transport systems for various metabolite classes in Recon 2. We will review the current knowledge on transporters (i.e., their preferred substrates, transport mechanisms, metabolic relevance, and disease association for each metabolite class). We will assess missing coverage and propose modifications and additions through a transport module that is functional when combined with Recon 2. This information will be valuable for further refinements. These data will also provide starting points for further experiments by highlighting areas of incomplete knowledge. This review represents the first comprehensive overview of the transporters involved in central metabolism and their transport mechanisms, thus serving as a compendium of metabolite transporters specific for human metabolic reconstructions. PMID:24653705

  7. Shootout-89, A Comparative Evaluation of Knowledge-based Systems That Forecast Severe Weather.

    NASA Astrophysics Data System (ADS)

    Moninger, W. R.; Lusk, C.; Roberts, W. F.; Bullas, J.; de Lorenzis, B.; McLeod, J. C.; Ellison, E.; Flueck, J.; Lampru, P. D.; Young, K. C.; Weaver, J.; Philips, R. S.; Shaw, R.; Stewart, T. R.; Zubrick, S. M.

    1991-09-01

    During the summer of 1989, the Forecast Systems Laboratory of the National Oceanic and Atmospheric Administration sponsored an evaluation of artificial-intelligence-based systems that forecast severe convective storms. The evaluation experiment, called Shootout-89, took place in Boulder, Colorado, and focused on storms over the northeastern Colorado foothills and plains. Six systems participated in Shootout-89: three traditional expert systems, a hybrid system including a linear model augmented by a small expert system, an analogue-based system, and a system developed using methods from the cognitive science/judgment analysis tradition. Each day of the exercise, the systems generated 2-9-h forecasts of the probabilities of occurrence of nonsignificant weather, significant weather, and severe weather in each of four regions in northeastern Colorado. A verification coordinator working at the Denver Weather Service Forecast Office gathered ground-truth data from a network of observers. The systems were evaluated on several measures of forecast skill, on timeliness, on ease of learning, and on ease of use. They were generally easy to operate; however, they required substantially different levels of meteorological expertise on the part of their users, reflecting the various operational environments for which they had been designed. The systems varied in their statistical behavior, but on this difficult forecast problem, they generally showed a skill approximately equal to that of persistence forecasts and climatological forecasts.

  8. Automatic Line Network Extraction from Aerial Imagery of Urban Areas through Knowledge-Based Image Analysis.

    DTIC Science & Technology

    1988-01-19

    approach for the analysis of aerial images. In this approach image analysis is performed at three levels of abstraction, namely iconic or low-level image analysis, symbolic or medium-level image analysis, and semantic or high-level image analysis. Domain dependent knowledge about prototypical urban

  9. Knowledge-based probabilistic representations of branching ratios in chemical networks: The case of dissociative recombinations

    SciTech Connect

    Plessis, Sylvain; Carrasco, Nathalie; Pernot, Pascal

    2010-10-07

    Experimental data about branching ratios for the products of dissociative recombination of polyatomic ions are presently the unique information source available to modelers of natural or laboratory chemical plasmas. Yet, because of limitations in the measurement techniques, data for many ions are incomplete. In particular, the repartition of hydrogen atoms among the fragments of hydrocarbons ions is often not available. A consequence is that proper implementation of dissociative recombination processes in chemical models is difficult, and many models ignore invaluable data. We propose a novel probabilistic approach based on Dirichlet-type distributions, enabling modelers to fully account for the available information. As an application, we consider the production rate of radicals through dissociative recombination in an ionospheric chemistry model of Titan, the largest moon of Saturn. We show how the complete scheme of dissociative recombination products derived with our method dramatically affects these rates in comparison with the simplistic H-loss mechanism implemented by default in all recent models.
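
    A minimal sketch of the Dirichlet-type representation described above is given below: incomplete branching-ratio information for a dissociative recombination is encoded as a Dirichlet prior over the product channels, and sampling from it propagates the uncertainty into channel-specific production rates. The channel list, pseudo-counts and total rate are invented placeholders, not values from the Titan model.

    ```python
    # Sample branching ratios from a Dirichlet prior and propagate them into
    # per-channel dissociative recombination rates with uncertainties.
    import numpy as np

    rng = np.random.default_rng(1)

    channels = ["CH3 + H", "CH2 + H2", "CH + H2 + H"]
    # Pseudo-counts: larger values encode better-measured branching ratios;
    # small, near-equal values encode "hydrogen repartition unknown".
    alpha = np.array([6.0, 3.0, 1.0])

    total_rate = 7.0e-7                              # cm^3 s^-1, hypothetical total rate
    samples = rng.dirichlet(alpha, size=10000)       # branching-ratio samples (sum to 1)
    channel_rates = samples * total_rate             # per-channel rate samples

    for name, mean, std in zip(channels, channel_rates.mean(0), channel_rates.std(0)):
        print(f"{name:15s} {mean:.2e} +/- {std:.1e} cm^3 s^-1")
    ```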

  10. Knowledge-based image processing for on-off type DNA microarray

    NASA Astrophysics Data System (ADS)

    Kim, Jong D.; Kim, Seo K.; Cho, Jeong S.; Kim, Jongwon

    2002-06-01

    This paper addresses the image processing technique for discriminating whether the probes are hybridized with target DNA in the Human Papilloma Virus (HPV) DNA Chip designed for genotyping HPV. In addition to the probes, the HPV DNA chip has markers that always react with the sample DNA. The positions of probe-dots in the final scanned image are fixed relative to the marker-dot locations, with a small variation according to the accuracy of the dotter and the scanner. The probes are duplicated 4 times for diagnostic stability. Prior knowledge, such as the relative marker distances and the probe duplication information, is integrated into the template matching technique with the normalized correlation measure. Results show that employing both kinds of prior knowledge amounts to simply averaging the template matching measures over the positions of the markers and probes. The proposed scheme yields stable marker location and probe classification.
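
    A small sketch of the classification step this describes is shown below: the normalized correlation of a spot template is computed at each expected probe position (given relative to the markers) and averaged over the four replicate dots before thresholding. The grid geometry, template and threshold are assumptions for illustration.

    ```python
    # Normalized-correlation template matching averaged over replicate probe dots.
    import numpy as np

    def normalized_correlation(patch, template):
        p = patch - patch.mean()
        t = template - template.mean()
        denom = np.linalg.norm(p) * np.linalg.norm(t)
        return float(np.dot(p.ravel(), t.ravel()) / denom) if denom else 0.0

    def probe_present(image, replicate_centers, template, threshold=0.5):
        """Average the correlation over the replicate dot positions and threshold it.

        Assumes every patch lies fully inside the image; positions would in
        practice be derived from the detected marker locations.
        """
        h, w = template.shape
        scores = []
        for cy, cx in replicate_centers:             # e.g. four replicate dots
            patch = image[cy - h // 2: cy - h // 2 + h, cx - w // 2: cx - w // 2 + w]
            scores.append(normalized_correlation(patch, template))
        mean_score = float(np.mean(scores))
        return mean_score > threshold, mean_score
    ```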

  11. Preparing Oral Examinations of Mathematical Domains with the Help of a Knowledge-Based Dialogue System.

    ERIC Educational Resources Information Center

    Schmidt, Peter

    A conception of discussing mathematical material in the domain of calculus is outlined. Applications include university students working on their knowledge and preparing for their oral examinations by using the dialogue system. The conception is based upon three pillars. One central pillar is a knowledge base containing the collections of…

  12. A knowledge-based approach to identification and adaptation in dynamical systems control

    NASA Technical Reports Server (NTRS)

    Glass, B. J.; Wong, C. M.

    1988-01-01

    Artificial intelligence techniques are applied to the problems of model form and parameter identification of large-scale dynamic systems. The object-oriented knowledge representation is discussed in the context of causal modeling and qualitative reasoning. Structured sets of rules are used for implementing qualitative component simulations, for catching qualitative discrepancies and quantitative bound violations, and for making reconfiguration and control decisions that affect the physical system. These decisions are executed by backward-chaining through a knowledge base of control action tasks. This approach was implemented for two examples: a triple quadrupole mass spectrometer and a two-phase thermal testbed. Results of tests with both of these systems demonstrate that the software replicates some or most of the functionality of a human operator, thereby reducing the need for a human-in-the-loop in the lower levels of control of these complex systems.
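
    The backward-chaining step mentioned above can be illustrated with a toy prover: a goal (here, a control action) succeeds if it is an observed fact or if all of the preconditions listed for it in the rule base can themselves be proven. The rules and facts below are invented for illustration and are unrelated to the mass spectrometer or thermal testbed applications.

    ```python
    # Minimal backward chainer over a goal -> preconditions rule base.
    RULES = {
        "reconfigure_pump": ["flow_low", "pump_healthy"],
        "flow_low":         ["flow_below_setpoint"],
        "pump_healthy":     ["no_overcurrent", "temperature_nominal"],
    }

    def prove(goal, facts, rules=RULES):
        """True if the goal is an observed fact or all its preconditions can be proven."""
        if goal in facts:
            return True
        if goal not in rules:
            return False
        return all(prove(sub, facts, rules) for sub in rules[goal])

    observed = {"flow_below_setpoint", "no_overcurrent", "temperature_nominal"}
    print(prove("reconfigure_pump", observed))   # -> True
    ```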

  13. AlexSys: a knowledge-based expert system for multiple sequence alignment construction and analysis.

    PubMed

    Aniba, Mohamed Radhouene; Poch, Olivier; Marchler-Bauer, Aron; Thompson, Julie Dawn

    2010-10-01

    Multiple sequence alignment (MSA) is a cornerstone of modern molecular biology and represents a unique means of investigating the patterns of conservation and diversity in complex biological systems. Many different algorithms have been developed to construct MSAs, but previous studies have shown that no single aligner consistently outperforms the rest. This has led to the development of a number of 'meta-methods' that systematically run several aligners and merge the output into one single solution. Although these methods generally produce more accurate alignments, they are inefficient because all the aligners need to be run first and the choice of the best solution is made a posteriori. Here, we describe the development of a new expert system, AlexSys, for the multiple alignment of protein sequences. AlexSys incorporates an intelligent inference engine to automatically select an appropriate aligner a priori, depending only on the nature of the input sequences. The inference engine was trained on a large set of reference multiple alignments, using a novel machine learning approach. Applying AlexSys to a test set of 178 alignments, we show that the expert system represents a good compromise between alignment quality and running time, making it suitable for high throughput projects. AlexSys is freely available from http://alnitak.u-strasbg.fr/∼aniba/alexsys.
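
    The a priori selection idea described above can be caricatured as a small classifier over easily computed properties of the input sequence set, trained on reference alignments for which the best-performing aligner is known. The features, aligner labels and toy training data below are placeholders, not AlexSys' actual inference engine or feature set.

    ```python
    # Choose an aligner from simple sequence-set features using a trained classifier.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    # features: [number of sequences, mean length, mean pairwise % identity]
    X_train = np.array([[10, 200, 60], [80, 450, 25], [15, 120, 85], [200, 300, 30]])
    y_train = np.array(["mafft", "clustalo", "muscle", "clustalo"])  # best aligner per case

    selector = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)

    def choose_aligner(n_seqs, mean_len, mean_identity):
        return selector.predict([[n_seqs, mean_len, mean_identity]])[0]

    print(choose_aligner(50, 400, 28))
    ```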

  14. Knowledge-Based Integrated Information Systems Development Methodologies Plan. Volume 2.

    DTIC Science & Technology

    1987-12-01

    would include ENALIM, IDEF1, EER, IDEF1X, etc.). 3. Development Procedure - A collection of decisions and actions organized into steps and phases which...model may, for instance, easily have the visual appearance of an IDEF1X model without being an IDEF1X model. Actually, some tools on the market do...architect of IDEF1. It is thus to IDEF1X that we will turn our attentions after the completion of our formalization of IDEF1. The process for formalizing

  15. Knowledge-Based Integrated Information Systems Engineering: Highlights and Bibliography. Volume 1.

    DTIC Science & Technology

    1987-12-01

    ...intelligently to queries like "How many bombers can be made available to strike Target T within x hours," which require analysis of many factors. • Diverse...[fragment of a table comparing the data models and capabilities of the ADDS, IISS, and IMDAS systems]

  16. KOJAK Group Finder: Scalable Group Detection via Integrated Knowledge-Based and Statistical Reasoning

    DTIC Science & Technology

    2006-09-01

    STELLA and PowerLoom. These modules communicate with a knowledge base using KIF and standard relational database systems using either standard...this approach is only feasible for relatively small and static datasets. For large and dynamically changing datasets sitting on some corporate database

  17. Bionic Modeling of Knowledge-Based Guidance in Automated Underwater Vehicles.

    DTIC Science & Technology

    1987-06-24

    DIFFERENCE TONES While the speed of the owl and other birds has not been adequately documented, Konishi (1973b) reports the owl's flight in captivity...simulation. There are other bionic forms that are also worth considering in this regard, e.g., the marine pinnipeds, sharks, etc., which have directional hearing

  18. Towards the knowledge-based design of universal influenza epitope ensemble vaccines

    PubMed Central

    Sheikh, Qamar M.; Gatherer, Derek; Reche, Pedro A.; Flower, Darren R.

    2016-01-01

    Motivation: Influenza A viral heterogeneity remains a significant threat due to unpredictable antigenic drift in seasonal influenza and antigenic shifts caused by the emergence of novel subtypes. Annual review of multivalent influenza vaccines targets strains of influenza A and B likely to be predominant in future influenza seasons. This does not induce broad, cross-protective immunity against emergent subtypes. Better strategies are needed to prevent future pandemics. Cross-protection can be achieved by activating CD8+ and CD4+ T cells against highly conserved regions of the influenza genome. We combine available experimental data with informatics-based immunological predictions to help design vaccines potentially able to induce cross-protective T cells against multiple influenza subtypes. Results: To exemplify our approach we designed two epitope ensemble vaccines comprising highly conserved and experimentally verified immunogenic influenza A epitopes as putative non-seasonal influenza vaccines; one specifically targets the US population and the other is a universal vaccine. The USA-specific vaccine comprised 6 CD8+ T cell epitopes (GILGFVFTL, FMYSDFHFI, GMDPRMCSL, SVKEKDMTK, FYIQMCTEL, DTVNRTHQY) and 3 CD4+ epitopes (KGILGFVFTLTVPSE, EYIMKGVYINTALLN, ILGFVFTLTVPSERG). The universal vaccine comprised 8 CD8+ epitopes (FMYSDFHFI, GILGFVFTL, ILRGSVAHK, FYIQMCTEL, ILKGKFQTA, YYLEKANKI, VSDGGPNLY, YSHGTGTGY) and the same 3 CD4+ epitopes. Our USA-specific vaccine has a population protection coverage (PPC, the portion of the population potentially responsive to one or more component epitopes of the vaccine) of over 96% and 95% coverage of observed influenza subtypes. The universal vaccine has a PPC of over 97% and 88% coverage of observed subtypes. Availability and Implementation: http://imed.med.ucm.es/Tools/episopt.html. Contact: d.r.flower@aston.ac.uk PMID:27402904
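
    A population-protection-coverage figure of this kind can be estimated from HLA allele frequencies and the alleles restricted by the vaccine epitopes. The simplified Python sketch below assumes Hardy-Weinberg genotype frequencies and independent loci; the allele frequencies shown are placeholders, not the paper's data (the authors' calculations use the EPISOPT tool at the URL listed above).

    def population_coverage(covered_allele_freqs_by_locus):
        """For each locus, the chance that a two-allele genotype misses every
        covered allele is (1 - p)**2, where p is the summed frequency of the
        covered alleles; overall coverage is 1 minus the product of the
        per-locus miss rates."""
        miss = 1.0
        for freqs in covered_allele_freqs_by_locus.values():
            p = min(sum(freqs), 1.0)
            miss *= (1.0 - p) ** 2
        return 1.0 - miss

    # Placeholder frequencies of alleles restricted by some epitope (illustrative):
    covered = {
        "HLA-A": [0.25, 0.14, 0.09],
        "HLA-B": [0.11, 0.08],
    }
    print(f"PPC ≈ {population_coverage(covered):.1%}")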

  19. ADAPT: A knowledge-based synthesis tool for digital signal processing system design

    SciTech Connect

    Cooley, E.S.

    1988-01-01

    A computer aided synthesis tool for expansion, compression, and filtration of digital images is described. ADAPT, the Autonomous Digital Array Programming Tool, uses an extensive design knowledge base to synthesize a digital signal processing (DSP) system. Input to ADAPT can be either a behavioral description in English, or a block level specification via Petri Nets. The output from ADAPT comprises code to implement the DSP system on an array of processors. ADAPT is constructed using C, Prolog, and X Windows on a SUN 3/280 workstation. ADAPT knowledge encompasses DSP component information and the design algorithms and heuristics of a competent DSP designer. The knowledge is used to form queries for design capture, to generate design constraints from the user's responses, and to examine the design constraints. These constraints direct the search for possible DSP components and target architectures. Constraints are also used for partitioning the target systems into less complex subsystems. The subsystems correspond to architectural building blocks of the DSP design. These subsystems inherit design constraints and DSP characteristics from their parent blocks. Thus, a DSP subsystem or parent block, as designed by ADAPT, must meet the user's design constraints. Design solutions are sought by searching the Components section of the design knowledge base. Component behavior which matches or is similar to that required by the DSP subsystems is sought. Each match, which corresponds to a design alternative, is evaluated in terms of its behavior. When a design is sufficiently close to the behavior required by the user, detailed mathematical simulations may be performed to accurately determine exact behavior.
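
    The constraint-driven component search outlined above can be illustrated with a small Python sketch: candidate DSP components are filtered against the constraints a subsystem inherits from its parent block, and the surviving design alternatives are ranked. The component records, constraint fields, and values are invented for illustration and do not come from ADAPT's knowledge base.

    from dataclasses import dataclass

    @dataclass
    class Component:
        name: str
        function: str              # e.g. "fir_filter", "fft"
        max_sample_rate_hz: float
        latency_cycles: int

    def satisfies(component, constraints):
        """A component is a candidate if it provides the required behaviour
        and stays within the inherited numeric bounds."""
        return (component.function == constraints["function"]
                and component.max_sample_rate_hz >= constraints["sample_rate_hz"]
                and component.latency_cycles <= constraints["max_latency_cycles"])

    def search_components(library, constraints):
        """Return design alternatives, tightest latency first."""
        matches = [c for c in library if satisfies(c, constraints)]
        return sorted(matches, key=lambda c: c.latency_cycles)

    # Hypothetical component knowledge base and inherited subsystem constraints:
    library = [
        Component("FIR-A", "fir_filter", 10e6, 32),
        Component("FIR-B", "fir_filter", 50e6, 64),
        Component("FFT-1", "fft", 20e6, 128),
    ]
    constraints = {"function": "fir_filter", "sample_rate_hz": 5e6,
                   "max_latency_cycles": 100}
    for c in search_components(library, constraints):
        print(c.name)              # FIR-A, then FIR-B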

  20. Use of knowledge-based assistant in marketing of bulk fertilizer storage plants

    NASA Astrophysics Data System (ADS)

    Franklin, Reynold; Burns, Daniel T.; Pyle, Daniel W.

    2001-10-01

    Bulk fertilizer storage companies have undergone dramatic changes, including expansions and unification with various companies. This has resulted in the need for large, state-of-the-art fertilizer storage plants. Various factors, such as the proper layout of the facility, dry fertilizer storage, liquid fertilizer storage, bulk material handling equipment, automation of equipment, etc., play a vital role in the development of storage plants. Planning for such a facility requires vast amounts of time and collaboration between the planning engineers, building contractor, equipment manufacturer, equipment automation specialists, and other suppliers. Specializing exclusively in the development of fertilizer storage plants, Stueve Construction Co. has long felt the need for a tool that will aid in the proposal study and cost analysis of the various entities. This paper describes a feasibility study of developing an expert system incorporating the expertise of the various agents playing a pivotal role in the development of a bulk fertilizer storage plant. Apart from its use in developing a proposal study, its potential use as a marketing tool for the various agencies is also discussed.