Science.gov

Sample records for generation knowledge base

  1. Design of a knowledge-based report generator

    SciTech Connect

    Kukich, K.

    1983-01-01

    Knowledge-based report generation is a technique for automatically generating natural language reports from computer databases. It is so named because it applies knowledge-based expert systems software to the problem of text generation. The first application of the technique, a system for generating natural language stock reports from a daily stock quotes database, is partially implemented. Three fundamental principles of the technique are its use of domain-specific semantic and linguistic knowledge, its use of macro-level semantic and linguistic constructs (such as whole messages, a phrasal lexicon, and a sentence-combining grammar), and its production system approach to knowledge representation. 14 references.
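The macro-level constructs named above (whole-message templates, a phrasal lexicon, production rules) can be illustrated with a toy sketch; all names and data here are invented for illustration and are not taken from Kukich's stock-report system:

```python
# Toy sketch of knowledge-based report generation: production rules select a
# whole-message template, and a phrasal lexicon supplies domain phrasing.
# All names and values are illustrative, not from the original system.

PHRASAL_LEXICON = {
    "rose": ["climbed", "advanced", "gained ground"],
    "fell": ["slipped", "declined", "lost ground"],
}

def report(index_name, open_value, close_value):
    change = close_value - open_value
    # Production rules: condition -> whole-message template
    if change > 0:
        verb = PHRASAL_LEXICON["rose"][0]
    elif change < 0:
        verb = PHRASAL_LEXICON["fell"][0]
    else:
        return f"The {index_name} was unchanged at {close_value:.2f}."
    return (f"The {index_name} {verb} {abs(change):.2f} points, "
            f"closing at {close_value:.2f}.")

print(report("industrial average", 1083.61, 1095.40))
```

A fuller system would combine several such messages with a sentence-combining grammar; this sketch shows only the rule-to-template step.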

  2. Knowledge-based zonal grid generation for computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Andrews, Alison E.

    1988-01-01

    Automation of flow field zoning in two dimensions is an important step towards reducing the difficulty of three-dimensional grid generation in computational fluid dynamics. Using a knowledge-based approach makes sense, but problems arise which are caused by aspects of zoning involving perception, lack of expert consensus, and design processes. These obstacles are overcome by means of a simple shape and configuration language, a tunable zoning archetype, and a method of assembling plans from selected, predefined subplans. A demonstration system for knowledge-based two-dimensional flow field zoning has been successfully implemented and tested on representative aerodynamic configurations. The results show that this approach can produce flow field zonings that are acceptable to experts with differing evaluation criteria.

  3. Automated knowledge generation

    NASA Technical Reports Server (NTRS)

    Myler, Harley R.; Gonzalez, Avelino J.

    1988-01-01

    The general objectives of the NASA/UCF Automated Knowledge Generation Project were the development of an intelligent software system that could access CAD design databases, interpret them, and generate a diagnostic knowledge base in the form of a system model. The initial area of concentration is the diagnosis of the process control system using the Knowledge-based Autonomous Test Engineer (KATE) diagnostic system. A secondary objective was the study of general problems of automated knowledge generation. A prototype, based on the object-oriented language Flavors, was developed.

  4. Incorporating Feature-Based Annotations into Automatically Generated Knowledge Representations

    NASA Astrophysics Data System (ADS)

    Lumb, L. I.; Lederman, J. I.; Aldridge, K. D.

    2006-12-01

    Earth Science Markup Language (ESML) is efficient and effective in representing scientific data in an XML-based formalism. However, features of the data being represented are not accounted for in ESML. Such features might derive from events (e.g., a gap in data collection due to instrument servicing), identifications (e.g., a scientifically interesting area/volume in an image), or some other source. In order to account for features in an ESML context, we consider them from the perspective of annotation, i.e., the addition of information to existing documents without changing the originals. Although it is possible to extend ESML to incorporate feature-based annotations internally (e.g., by extending the XML schema for ESML), there are a number of complicating factors that we identify. Rather than pursuing the ESML-extension approach, we focus on an external representation for feature-based annotations via XML Pointer Language (XPointer). In previous work (Lumb & Aldridge, HPCS 2006, IEEE, doi:10.1109/HPCS.2006.26), we have shown that it is possible to extract relationships from ESML-based representations, and capture the results in the Resource Description Framework (RDF). Thus we explore and report on this same requirement for XPointer-based annotations of ESML representations. As in our past efforts, the Global Geodynamics Project (GGP) allows us to illustrate this approach for introducing annotations into automatically generated knowledge representations with a real-world example.

  5. Automated knowledge acquisition for second generation knowledge base systems: A conceptual analysis and taxonomy

    SciTech Connect

    Williams, K.E.; Kotnour, T.

    1991-12-31

    In this paper, we present a conceptual analysis of knowledge-base development methodologies. The purpose of this research is to help overcome the high cost and lack of efficiency in developing knowledge base representations for artificial intelligence applications. To accomplish this purpose, we analyzed the available methodologies and developed a knowledge-base development methodology taxonomy. We review manual, machine-aided, and machine-learning methodologies. A set of developed characteristics allows description and comparison among the methodologies. We present the results of this conceptual analysis of methodologies and recommendations for development of more efficient and effective tools.

  7. Knowledge-based reasoning in the Paladin tactical decision generation system

    NASA Technical Reports Server (NTRS)

    Chappell, Alan R.

    1993-01-01

    A real-time tactical decision generation system for air combat engagements, Paladin, has been developed. A pilot's job in air combat includes tasks that are largely symbolic. These symbolic tasks are generally performed through the application of experience and training (i.e., knowledge) gathered over years of flying a fighter aircraft. Two such tasks, situation assessment and throttle control, are identified and broken out in Paladin to be handled by specialized knowledge-based systems. Knowledge pertaining to these tasks is encoded into rule bases to provide the foundation for decisions. Paladin uses a custom-built inference engine and a partitioned rule-base structure to produce these symbolic results in real time. This paper provides an overview of knowledge-based reasoning systems as a subset of rule-based systems. The knowledge used by Paladin in generating results, as well as the system design for real-time execution, is discussed.

  8. Motion Recognition and Modifying Motion Generation for Imitation Robot Based on Motion Knowledge Formation

    NASA Astrophysics Data System (ADS)

    Okuzawa, Yuki; Kato, Shohei; Kanoh, Masayoshi; Itoh, Hidenori

    A knowledge-based approach to imitation learning of motion generation for humanoid robots and an imitative motion generation system based on motion knowledge learning and modification are described. The system has three parts: recognizing, learning, and modifying parts. The first part recognizes an instructed motion, distinguishing it against the motion knowledge database by means of a continuous hidden Markov model. When the motion is recognized as unfamiliar, the second part learns it using locally weighted regression and acquires knowledge of the motion. When a robot recognizes the instructed motion as familiar, or judges that its acquired knowledge is applicable to the motion generation, the third part imitates the instructed motion by modifying a learned motion. This paper reports performance results for the imitation of several radio gymnastics motions.
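The locally weighted regression step used by the learning part can be sketched as follows; this is a generic LWR implementation with a Gaussian kernel, not the authors' code, and all variable names and data are illustrative:

```python
import math

def lwr_predict(xs, ys, x_query, bandwidth=0.5):
    """Locally weighted linear regression at a single query point:
    fit a weighted line where weights decay with distance from x_query."""
    w = [math.exp(-((x - x_query) ** 2) / (2 * bandwidth ** 2)) for x in xs]
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, xs)) / sw
    ybar = sum(wi * yi for wi, yi in zip(w, ys)) / sw
    num = sum(wi * (xi - xbar) * (yi - ybar) for wi, xi, yi in zip(w, xs, ys))
    den = sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, xs))
    slope = num / den if den else 0.0
    return ybar + slope * (x_query - xbar)

# On noiseless linear data the local fit reproduces the line exactly.
xs = [i / 10 for i in range(21)]
ys = [2 * x + 1 for x in xs]
print(round(lwr_predict(xs, ys, 1.0), 3))
```

In a motion-learning setting, `xs` would be time stamps and `ys` joint angles, with one local model per query time.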

  9. A Different Approach to the Generation of Patient Management Problems from a Knowledge-Based System

    PubMed Central

    Barriga, Rosa Maria

    1988-01-01

    Several strategies are proposed for generating Patient Management Problems from a Knowledge Base while avoiding inconsistencies in the results. These strategies are based on a different Knowledge Base structure and on the use of case introductions that describe the patient attributes which are not disease-dependent. This methodology proved effective in a recent pilot test and is on its way to implementation as part of an educational program at the CWRU School of Medicine.

  10. Generating MEDLINE search strategies using a librarian knowledge-based system.

    PubMed Central

    Peng, P.; Aguirre, A.; Johnson, S. B.; Cimino, J. J.

    1993-01-01

    We describe a librarian knowledge-based system that generates a search strategy from a query representation based on a user's information need. Together with the natural language parser AQUA, the system functions as a human/computer interface, which translates a user query from free text into a BRS Onsite search formulation, for searching the MEDLINE bibliographic database. In the system, conceptual graphs are used to represent the user's information need. The UMLS Metathesaurus and Semantic Net are used as the key knowledge sources in building the knowledge base. PMID:8130544

  11. Applying Knowledge to Generate Action: A Community-Based Knowledge Translation Framework

    ERIC Educational Resources Information Center

    Campbell, Barbara

    2010-01-01

    Introduction: Practical strategies are needed to translate research knowledge between researchers and users into action. For effective translation to occur, researchers and users should partner during the research process, recognizing the impact that knowledge, when translated into practice, will have on those most affected by that research.…

  12. Detecting knowledge base inconsistencies using automated generation of text and examples

    SciTech Connect

    Mittal, V.O.; Moore, J.D.

    1996-12-31

    Verifying the fidelity of domain representation in large knowledge bases (KBs) is a difficult problem: domain experts are typically not experts in knowledge representation languages, and as knowledge bases grow more complex, visual inspection of the various terms and their abstract definitions, their interrelationships and the limiting, boundary cases becomes much harder. This paper presents an approach to help verify and refine abstract term definitions in knowledge bases. It assumes that it is easier for a domain expert to determine the correctness of individual concrete examples than it is to verify and correct all the ramifications of an abstract, intensional specification. To this end, our approach presents the user with an interface in which abstract terms in the KB are described using examples and natural language generated from the underlying domain representation. Problems in the KB are therefore manifested as problems in the generated description. The user can then highlight specific examples or parts of the explanation that seem problematic. The system reasons about the underlying domain model by using the discourse plan generated for the description. This paper briefly describes the working of the system and illustrates three possible types of problem manifestations using an example of a specification of floating-point numbers in Lisp.

  13. Bedside, classroom and bench: collaborative strategies to generate evidence-based knowledge for nursing practice.

    PubMed

    Weaver, Charlotte A; Warren, Judith J; Delaney, Connie

    2005-12-01

    The rise of evidence-based practice (EBP) as a standard for care delivery is rapidly emerging as a global phenomenon that is transcending political, economic and geographic boundaries. Evidence-based nursing (EBN) addresses the growing body of nursing knowledge supported by different levels of evidence for best practices in nursing care. Across all health care, including nursing, we face the challenge of how to most effectively close the gap between what is known and what is practiced. There is extensive literature on the barriers and difficulties of translating research findings into practical application. While the literature refers to this challenge as the "Bench to Bedside" lag, this paper presents three collaborative strategies that aim to minimize this gap. The Bedside strategy proposes to use the data generated from care delivery and captured in the massive data repositories of electronic health record (EHR) systems as empirical evidence that can be analysed to discover and then inform best practice. In the Classroom strategy, we present a description of how evidence-based nursing knowledge is taught in a baccalaureate nursing program. Finally, the Bench strategy describes applied informatics in converting paper-based EBN protocols into the workflow of clinical information systems. Protocols are translated into reference and executable knowledge with the goal of placing the latest scientific knowledge at the fingertips of front line clinicians. In all three strategies, information technology (IT) is presented as the underlying tool that makes this rapid translation of nursing knowledge into practice and education feasible.

  14. Knowledge-based approach for generating target system specifications from a domain model

    NASA Technical Reports Server (NTRS)

    Gomaa, Hassan; Kerschberg, Larry; Sugumaran, Vijayan

    1992-01-01

    Several institutions in industry and academia are pursuing research efforts in domain modeling to address unresolved issues in software reuse. To demonstrate the concepts of domain modeling and software reuse, a prototype software engineering environment is being developed at George Mason University to support the creation of domain models and the generation of target system specifications. This prototype environment, which is application domain independent, consists of an integrated set of commercial off-the-shelf software tools and custom-developed software tools. This paper describes the knowledge-based tool that was developed as part of the environment to generate target system specifications from a domain model.

  15. Generating Topic Headings during Reading of Screen-Based Text Facilitates Learning of Structural Knowledge and Impairs Learning of Lower-Level Knowledge

    ERIC Educational Resources Information Center

    Clariana, Roy B.; Marker, Anthony W.

    2007-01-01

    This investigation considers the effects of learner-generated headings on memory. Participants (N = 63) completed a computer-based lesson with or without learner-generated text topic headings. Posttests included a cued recall test of factual knowledge and a sorting task measure of structural knowledge. A significant disordinal interaction was…

  16. Scaling up explanation generation: Large-scale knowledge bases and empirical studies

    SciTech Connect

    Lester, J.C.; Porter, B.W.

    1996-12-31

    To explain complex phenomena, an explanation system must be able to select information from a formal representation of domain knowledge, organize the selected information into multisentential discourse plans, and realize the discourse plans in text. Although recent years have witnessed significant progress in the development of sophisticated computational mechanisms for explanation, empirical results have been limited. This paper reports on a seven year effort to empirically study explanation generation from semantically rich, large-scale knowledge bases. We first describe Knight, a robust explanation system that constructs multi-sentential and multi-paragraph explanations from the Biology Knowledge Base, a large-scale knowledge base in the domain of botanical anatomy, physiology, and development. We then introduce the Two Panel evaluation methodology and describe how Knight's performance was assessed with this methodology in the most extensive empirical evaluation conducted on an explanation system. In this evaluation, Knight scored within "half a grade" of domain experts, and its performance exceeded that of one of the domain experts.

  17. Development of a standardized knowledge base to generate individualized medication plans automatically with drug administration recommendations

    PubMed Central

    Send, Alexander F J; Al-Ayyash, Adel; Schecher, Sabrina; Rudofsky, Gottfried; Klein, Ulrike; Schaier, Matthias; Pruszydlo, Markus G; Witticke, Diana; Lohmann, Kristina; Kaltschmidt, Jens; Haefeli, Walter E; Seidling, Hanna M

    2013-01-01

    Aims We aimed to develop a generic knowledge base with drug administration recommendations which allows the generation of a dynamic and comprehensive medication plan, and to evaluate its comprehensibility and potential benefit in a qualitative pilot study with patients and physicians. Methods Based on a literature search and previously published medication plans, a prototype was developed and iteratively refined through qualitative evaluation (interviews with patients and focus group discussions with physicians). To develop the recommendations for safe administration of specific drugs, we screened the summary of product characteristics (SmPC) of different exemplary brands, allocated the generated advice to groups of brands potentially requiring the same advice, and reviewed these allocations regarding applicability and appropriateness of the recommendations. Results For the recommendations, 411 SmPCs of 140 different active ingredients were screened, covering all available galenic formulations, routes of administration except infusions, and administration devices. Finally, 515 distinct administration recommendations were included in the database. In 926 different generic groups, 29 879 allocations of brands to general advice, food advice, indications, step-by-step instructions, or combinations thereof were made. Of these, 27 216 of the preselected allocations (91.1%) were confirmed as appropriate. In total, one third of the German drug market was labelled with information. Conclusions Generic grouping of brands according to their active ingredient and other drug characteristics, and allocation of standardized administration recommendations, is feasible for a large drug market and can be integrated in a medication plan. PMID:24007451

  18. Foundations on Generation of Relationships Between Classes Based on Initial Business Knowledge

    NASA Astrophysics Data System (ADS)

    Nikiforova, Oksana; Pavlova, Natalya

    This chapter focuses on the development of the main component of the platform independent model (PIM) of Model Driven Architecture, i.e., the class diagram defined in the Unified Modeling Language (UML), which has the details necessary for transformation into a platform specific model (PSM). It is important to formulate core principles for developing a well-structured class diagram at a conceptual level, using knowledge of the problem domain, which consists of two interrelated models of system aspects: business processes and concept presentation. Definition of the relationships between classes is important for PSM generation; therefore, research on how they can be defined is performed. The hypothesis that it is possible to derive a class structure from initial business information is adduced. Information about the problem domain is presented in the form of a two-hemisphere model that describes two interrelated parts of the most important aspects of a system, namely business process and concept models. These models serve as the source model for deriving the class diagram. The capability for class diagram generation based on the two-hemisphere model is represented by a collection of graph transformations and illustrated with examples, where the definition of different kinds of relationships (namely aggregation, dependency, generalization) is displayed.

  19. Knowledge Base for Automatic Generation of Online IMS LD Compliant Course Structures

    ERIC Educational Resources Information Center

    Pacurar, Ecaterina Giacomini; Trigano, Philippe; Alupoaie, Sorin

    2006-01-01

    Our article presents a pedagogical scenarios-based web application that allows the automatic generation and development of pedagogical websites. These pedagogical scenarios are represented in the IMS Learning Design standard. Our application is a web portal helping teachers to dynamically generate web course structures, to edit pedagogical content…

  20. Knowledge-based design of generate-and-patch problem solvers that solve global resource assignment problems

    NASA Technical Reports Server (NTRS)

    Voigt, Kerstin

    1992-01-01

    We present MENDER, a knowledge-based system that implements software design techniques specialized to automatically compile generate-and-patch problem solvers for global resource assignment problems. We provide empirical evidence of the superior performance of generate-and-patch over generate-and-test, even with constrained generation, for a global constraint in the domain of '2D-floorplanning'. For a second constraint in '2D-floorplanning' we show that even when it is possible to incorporate the constraint into a constrained generator, a generate-and-patch problem solver may satisfy the constraint more rapidly. We also briefly summarize how an extended version of our system applies to a constraint in the domain of 'multiprocessor scheduling'.
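The contrast between the two strategies can be illustrated on a toy global-sum constraint; this is purely illustrative and is not MENDER's domain, representation, or algorithm:

```python
import itertools

TARGET = 10  # global resource constraint: slot values must sum to TARGET

def generate_and_test(domain, n):
    """Enumerate all candidates; return the first satisfying the constraint."""
    for cand in itertools.product(domain, repeat=n):
        if sum(cand) == TARGET:
            return list(cand)

def generate_and_patch(domain, n):
    """Generate one cheap candidate, then repair slots toward the constraint."""
    cand = [domain[0]] * n
    i = 0
    while sum(cand) != TARGET and i < n:
        # patch slot i as far toward the target as its domain allows
        need = TARGET - (sum(cand) - cand[i])
        cand[i] = max((d for d in domain if d <= need), default=cand[i])
        i += 1
    return cand if sum(cand) == TARGET else None

domain = list(range(1, 7))
print(generate_and_test(domain, 3), generate_and_patch(domain, 3))
```

Generate-and-test may visit exponentially many candidates before success; generate-and-patch touches each slot at most once here, which mirrors the performance argument in the abstract.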

  1. Generating executable knowledge for evidence-based medicine using natural language and semantic processing.

    PubMed

    Borlawsky, Tara; Friedman, Carol; Lussier, Yves A

    2006-01-01

    With an increase in the prevalence of patients having multiple medical conditions, along with the increasing number of medical information sources, an intelligent approach is required to integrate the answers to physicians' patient-related questions into clinical practice in the shortest, most specific way possible. Cochrane Scientific Reviews are currently considered to be the "gold standard" for evidence-based medicine (EBM), because of their well-defined systematic approach to assessing the available medical information. In order to develop semantic approaches for enabling the reuse of these Reviews, a system for producing executable knowledge was designed using a natural language processing (NLP) system we developed (BioMedLEE), and semantic processing techniques. Though BioMedLEE was not designed for or trained on the Cochrane Reviews, this study shows that disease, therapy and drug concepts can be extracted and correlated with an overall recall of 80.3%, coding precision of 94.1%, and concept-concept relationship precision of 87.3%.

  2. A knowledge generation model via the hypernetwork.

    PubMed

    Liu, Jian-Guo; Yang, Guang-Yong; Hu, Zhao-Long

    2014-01-01

    The influence of the statistical properties of the network on the knowledge diffusion has been extensively studied. However, the structure evolution and the knowledge generation processes are always integrated simultaneously. By introducing the Cobb-Douglas production function and treating the knowledge growth as a cooperative production of knowledge, in this paper, we present two knowledge-generation dynamic evolving models based on different evolving mechanisms. The first model, named "HDPH model," adopts the hyperedge growth and the hyperdegree preferential attachment mechanisms. The second model, named "KSPH model," adopts the hyperedge growth and the knowledge stock preferential attachment mechanisms. We investigate the effect of the parameters (α,β) on the total knowledge stock of the two models. The hyperdegree distribution of the HDPH model can be theoretically analyzed by the mean-field theory. The analytic result indicates that the hyperdegree distribution of the HDPH model obeys the power-law distribution and the exponent is γ = 2 + 1/m. Furthermore, we present the distributions of the knowledge stock for different parameters (α,β). The findings indicate that our proposed models could be helpful for deeply understanding the scientific research cooperation.
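A toy simulation of hyperdegree preferential attachment conveys the HDPH growth mechanism; this is a simplified stand-in for the model in the abstract, and all parameter values are illustrative:

```python
import random

def hdph_simulate(steps, m=2, seed=7):
    """Toy HDPH-style growth: each step adds a new node plus one hyperedge
    joining it to m existing nodes chosen preferentially by hyperdegree."""
    random.seed(seed)
    hyperdeg = {i: 1 for i in range(m + 1)}   # seed hyperedge over m+1 nodes
    for _ in range(steps):
        new = len(hyperdeg)
        nodes, weights = zip(*hyperdeg.items())
        chosen = set()
        while len(chosen) < m:                 # preferential, without repeats
            chosen.add(random.choices(nodes, weights=weights)[0])
        for v in chosen:
            hyperdeg[v] += 1
        hyperdeg[new] = 1                      # the new node joins the hyperedge
    return hyperdeg

deg = hdph_simulate(500)
# Each added hyperedge raises the total hyperdegree by m + 1.
print(sum(deg.values()), max(deg.values()) > min(deg.values()))
```

With many steps, early nodes accumulate high hyperdegree, the qualitative signature of the power-law distribution derived by mean-field analysis in the paper.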

  3. Computer based interpretation of infrared spectra-structure of the knowledge-base, automatic rule generation and interpretation

    NASA Astrophysics Data System (ADS)

    Ehrentreich, F.; Dietze, U.; Meyer, U.; Abbas, S.; Schulz, H.

    1995-04-01

    A main task within the SpecInfo project is to develop interpretation tools that can handle many more of the complicated, more specific spectrum-structure correlations. In the first step, the empirical knowledge about the assignment of structural groups and their characteristic IR bands was collected from the literature and represented in a computer-readable, well-structured form. Vague, verbal rules are managed by the introduction of linguistic variables. The next step was the development of automatic rule-generating procedures. We combined and extended the IDIOTS algorithm with Blaffert's set-theoretic algorithm. The procedures were successfully applied to the SpecInfo database. The realization of the preceding items is a prerequisite for improving the computerized structure elucidation procedure.

  4. Knowledge Based Text Generation

    DTIC Science & Technology

    1989-08-01


  5. Cooperative Knowledge Bases.

    DTIC Science & Technology

    1988-02-01

    …intelligent knowledge bases. The present state of our system for concurrent evaluation of a knowledge base of logic clauses using static allocation…

  6. Knowledge and vision engines: a new generation of image understanding systems combining computational intelligence methods and model-based knowledge representation and reasoning

    NASA Astrophysics Data System (ADS)

    Kuvychko, Igor

    2000-10-01

    Vision is part of a larger informational system that converts visual information into knowledge structures. These structures drive the vision process, resolving ambiguity and uncertainty via feedback, and provide image understanding, that is, an interpretation of visual information in terms of such knowledge models. The solution to image understanding problems is suggested in the form of active multilevel hierarchical networks represented dually as discrete and continuous structures. Computational intelligence methods transform images into model-based knowledge representation. The Certainty Dimension converts attractors in neural networks into fuzzy sets, preserving input-output relationships. Symbols naturally emerge in such networks. Symbolic Space is a dual structure that combines a closed distributed space, split by a set of fuzzy regions, with a discrete set of symbols equivalent to the cores of the regions, represented as points in the Certainty Dimension. Model Space carries knowledge in the form of links and relations between the symbols, and supports graph, diagrammatic and topological operations. Composition of spaces works similarly to M. Minsky's frames and agents or Gerard Edelman's maps of maps, combining machine learning, classification and analogy together with induction, deduction and other methods of higher-level model-based reasoning. Based on such principles, an image understanding system can convert images into knowledge models, effectively resolving uncertainty and ambiguity via feedback projections, and does not require supercomputers.

  7. Generational Differences in Knowledge Markets

    DTIC Science & Technology

    2010-03-01

    by retirement, current projections for retiring workers are potentially alarming. According to a report by the United States Government…Morrison, 2006). Yet, despite this dearth of skilled labor, employers are reported to be even more concerned about losing organizational knowledge…concluded, the United States Government Accountability Office (GAO) (2002) published a report identifying several failed missions that simply repeated past

  8. The role of textual semantic constraints in knowledge-based inference generation during reading comprehension: A computational approach.

    PubMed

    Yeari, Menahem; van den Broek, Paul

    2015-01-01

    The present research adopted a computational approach to explore the extent to which the semantic content of texts constrains the activation of knowledge-based inferences. Specifically, we examined whether textual semantic constraints (TSC) can explain (1) the activation of predictive inferences, (2) the activation of bridging inferences and (3) the higher prevalence of the activation of bridging inferences compared to predictive inferences. To examine these hypotheses, we computed the strength of semantic associations between texts and probe items as presented to human readers in previous behavioural studies, using the Latent Semantic Analysis (LSA) algorithm. We tested whether stronger semantic associations are observed for inferred items compared to control items. Our results show that in 15 out of 17 planned comparisons, the computed strength of semantic associations successfully simulated the activation of inferences. These findings suggest that TSC play a central role in the activation of knowledge-based inferences.
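The core comparison in such studies, stronger semantic association between a text and inferred probe items than between the text and control items, can be sketched with a plain bag-of-words cosine similarity. This is a stand-in for LSA, which additionally projects vectors into an SVD-reduced semantic space; all probe texts below are invented:

```python
import math
from collections import Counter

def cosine(u, v):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(u[t] * v.get(t, 0) for t in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def bag(text):
    """Bag-of-words term-count vector."""
    return Counter(text.lower().split())

text = bag("the vase fell off the shelf and shattered on the floor")
inferred = bag("broken glass pieces floor")   # knowledge-based inference probe
control = bag("quarterly interest rates")     # unrelated control probe

print(cosine(text, inferred) > cosine(text, control))
```

Under the TSC hypothesis, the inferred probe should score higher than the control, as it does here via the shared term; LSA's reduced space would also capture associations between words that never co-occur literally.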

  9. Geospatial Standards and the Knowledge Generation Lifecycle

    NASA Technical Reports Server (NTRS)

    Khalsa, Siri Jodha S.; Ramachandran, Rahul

    2014-01-01

    Standards play an essential role at each stage in the sequence of processes by which knowledge is generated from geoscience observations, simulations and analysis. This paper provides an introduction to the field of informatics and the knowledge generation lifecycle in the context of the geosciences. In addition, we discuss how the newly formed Earth Science Informatics Technical Committee is helping to advance the application of standards and best practices to make data and data systems more usable and interoperable.

  10. Medical Knowledge Bases.

    ERIC Educational Resources Information Center

    Miller, Randolph A.; Giuse, Nunzia B.

    1991-01-01

    Few commonly available, successful computer-based tools exist in medical informatics. Faculty expertise can be included in computer-based medical information systems. Computers allow dynamic recombination of knowledge to answer questions unanswerable with print textbooks. Such systems can also create stronger ties between academic and clinical…

  11. Health Knowledge Among the Millennial Generation

    PubMed Central

    Lloyd, Tom; Shaffer, Michele L.; Christy, Stetter; Widome, Mark D.; Repke, John; Weitekamp, Michael R.; Eslinger, Paul J.; Bargainnier, Sandra S.; Paul, Ian M.

    2013-01-01

    The Millennial Generation, also known as Generation Y, is the demographic cohort following Generation X, and is generally regarded to be composed of those individuals born between 1980 and 2000. They are the first to grow up in an environment where health-related information is widely available via the Internet, TV and other electronic media, yet we know very little about the scope of their health knowledge. This study was undertaken to quantify two domains of clinically relevant health knowledge: factual content and the ability to solve health-related questions (application) in nine clinically related medical areas. Study subjects correctly answered, on average, 75% of health application questions but only 54% of health content questions. Since students were better able to correctly answer questions dealing with applications than those on factual content, contemporary US high school students may not use traditional hierarchical learning models in the acquisition of their health knowledge. PMID:25170479

  12. Generating tsunami risk knowledge at community level as a base for planning and implementation of risk reduction strategies

    NASA Astrophysics Data System (ADS)

    Wegscheider, S.; Post, J.; Zosseder, K.; Mück, M.; Strunz, G.; Riedlinger, T.; Muhari, A.; Anwar, H. Z.

    2011-02-01

    More than 4 million Indonesians live in tsunami-prone areas along the southern and western coasts of Sumatra, Java and Bali. Although a Tsunami Early Warning Center in Jakarta now exists, installed after the devastating 2004 tsunami, it is essential to develop tsunami risk knowledge within the exposed communities as a basis for tsunami disaster management. These communities need to implement risk reduction strategies to mitigate potential consequences. The major aims of this paper are to present a risk assessment methodology which (1) identifies areas of high tsunami risk in terms of potential loss of life, (2) bridges the gaps between research and practical application, and (3) can be implemented at community level. High risk areas have a great need for action to improve people's response capabilities towards a disaster, thus reducing the risk. The methodology developed here is based on a GIS approach and combines hazard probability, hazard intensity, population density and people's response capability to assess the risk. Within the framework of the GITEWS (German-Indonesian Tsunami Early Warning System) project, the methodology was applied to three pilot areas, one of which is southern Bali. Bali's tourism is concentrated for a great part in the communities of Kuta, Legian and Seminyak. Here alone, about 20 000 people live in high and very high tsunami risk areas. The development of risk reduction strategies is therefore of significant interest. A risk map produced for the study area in Bali can be used for local planning activities and the development of risk reduction strategies.
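    The risk assessment described here combines hazard probability, hazard intensity, population density and people's response capability per grid cell. As a rough illustration of how such factors might be combined, here is a minimal sketch; the formula, weights and class thresholds are invented for illustration and are not the GITEWS values:

```python
# Hypothetical sketch of a cell-wise tsunami risk index combining the four
# factors named in the abstract. All numbers and thresholds are illustrative
# assumptions, not the calibrated GITEWS methodology.

def risk_index(hazard_probability, hazard_intensity,
               population_density, response_capability):
    """Return a risk score in [0, 1] for one grid cell.

    hazard_probability  -- probability of inundation, 0..1
    hazard_intensity    -- normalized impact (e.g. flow depth), 0..1
    population_density  -- normalized density, 0..1
    response_capability -- evacuation capability, 0..1 (1 = best)
    """
    hazard = hazard_probability * hazard_intensity
    vulnerability = population_density * (1.0 - response_capability)
    return hazard * vulnerability

def risk_class(score):
    """Map a continuous score onto discrete classes as a risk map would."""
    if score >= 0.5:
        return "very high"
    if score >= 0.25:
        return "high"
    if score >= 0.05:
        return "moderate"
    return "low"

cell = risk_index(hazard_probability=0.8, hazard_intensity=0.9,
                  population_density=0.9, response_capability=0.2)
print(risk_class(cell))
```

    In the actual methodology each factor would come from GIS layers and the classification would follow the project's calibrated thresholds.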

  13. Generating tsunami risk knowledge at community level as a base for planning and implementation of risk reduction strategies

    NASA Astrophysics Data System (ADS)

    Wegscheider, Stephanie; Post, Joachim; Mück, Matthias; Zosseder, Kai; Muhari, Abdul; Anwar, Herryal Z.; Gebert, Niklas; Strunz, Günter; Riedlinger, Torsten

    2010-05-01

    More than 4 million Indonesians live in tsunami-prone areas on the southern and western coasts of Sumatra, Java and Bali. Depending on the location of the tsunamigenic earthquake, in many cases the time to reach a tsunami-safe area is as short as 15 or 20 minutes. To increase the chances of a successful evacuation, comprehensive and thorough planning and preparation are necessary. For this purpose, detailed knowledge of potential hazard impact and safe areas, exposed elements such as people, critical facilities and lifelines, and deficiencies in response capabilities and evacuation routes is crucial. The major aims of this paper are (i) to assess and quantify people's response capabilities and (ii) to identify high-risk areas which have a great need for action to improve response capabilities and thus reduce the risk. The major factor influencing people's ability to evacuate successfully is time. The estimated time of arrival of a tsunami at the coast, which determines the overall time available for evacuation after a tsunami is triggered, can be derived by analyzing modeled tsunami scenarios for a respective area. In most cases, however, this available time frame is diminished by other time components, including the time until natural or technical warning signs are received and the time until reaction follows a warning (understanding the warning and deciding to take appropriate action). For the time to receive a warning, we assume that the early warning centre is able to fulfil the Indonesian presidential decree to issue a warning within 5 minutes. Reaction time is difficult to quantify, as intrinsic human factors such as educational level, beliefs, tsunami knowledge and experience play a role here. Although we are aware of the great importance of this factor and of the need to minimize reaction time, it is not considered in this paper. Quantifying the needed evacuation time is based on a GIS approach. This approach is relatively simple and enables local

  14. Reflexive Professionalism as a Second Generation of Evidence-Based Practice: Some Considerations on the Special Issue "What Works? Modernizing the Knowledge-Base of Social Work"

    ERIC Educational Resources Information Center

    Otto, Hans-Uwe; Polutta, Andreas; Ziegler, Holger

    2009-01-01

    This article refers sympathetically to the thoughtful debates and positions in the "Research on Social Work Practice" ("RSWP"; Special Issue, July, 2008 issue) on "What Works? Modernizing the Knowledge-Base of Social Work." It highlights the need for empirical efficacy and effectiveness research in social work and…

  15. Mobile robot knowledge base

    NASA Astrophysics Data System (ADS)

    Heath Pastore, Tracy; Barnes, Mitchell; Hallman, Rory

    2005-05-01

    Robot technology is developing at a rapid rate for both commercial and Department of Defense (DOD) applications. As a result, the task of managing both technology and experience information is growing. In the not-too-distant past, tracking development efforts of robot platforms, subsystems and components was not too difficult, expensive, or time consuming. To do the same today is a significant undertaking. The Mobile Robot Knowledge Base (MRKB) provides the robotics community with a web-accessible, centralized resource for sharing information, experience, and technology to more efficiently and effectively meet the needs of the robot system user. The resource includes searchable information on robot components, subsystems, mission payloads, platforms, and DOD robotics programs. In addition, the MRKB website provides a forum for technology and information transfer within the DOD robotics community and an interface for the Robotic Systems Pool (RSP). The RSP manages a collection of small teleoperated and semi-autonomous robotic platforms, available for loan to DOD and other qualified entities. The objective is to put robots in the hands of users and use the test data and fielding experience to improve robot systems.

  16. Foundation: Transforming data bases into knowledge bases

    NASA Technical Reports Server (NTRS)

    Purves, R. B.; Carnes, James R.; Cutts, Dannie E.

    1987-01-01

    One approach to transforming information stored in relational data bases into knowledge based representations and back again is described. This system, called Foundation, allows knowledge bases to take advantage of vast amounts of pre-existing data. A benefit of this approach is inspection, and even population, of data bases through an intelligent knowledge-based front-end.

  17. Need to Knowledge (NtK) Model: an evidence-based framework for generating technological innovations with socio-economic impacts

    PubMed Central

    2013-01-01

    Background: Traditional government policies suggest that upstream investment in scientific research is necessary and sufficient to generate technological innovations. The expected downstream beneficial socio-economic impacts are presumed to occur through non-government market mechanisms. However, there is little quantitative evidence for such a direct and formulaic relationship between public investment at the input end and marketplace benefits at the impact end. Instead, the literature demonstrates that the technological innovation process involves a complex interaction between multiple sectors, methods, and stakeholders.

    Discussion: The authors theorize that accomplishing the full process of technological innovation in a deliberate and systematic manner requires an operational-level model encompassing three underlying methods, each designed to generate knowledge outputs in different states: scientific research generates conceptual discoveries; engineering development generates prototype inventions; and industrial production generates commercial innovations. Given the critical roles of engineering and business, the entire innovation process should continuously consider the practical requirements and constraints of the commercial marketplace. The Need to Knowledge (NtK) Model encompasses the activities required to successfully generate innovations, along with associated strategies for effectively communicating knowledge outputs in all three states to the various stakeholders involved. It is intentionally grounded in evidence drawn from academic analysis to facilitate objective and quantitative scrutiny, and industry best practices to enable practical application.

    Summary: The Need to Knowledge (NtK) Model offers a practical, market-oriented approach that avoids the gaps, constraints and inefficiencies inherent in undirected activities and disconnected sectors. The NtK Model is a means to realizing increased returns on public investments in those science and technology

  18. Knowledge-Based Image Analysis.

    DTIC Science & Technology

    1981-04-01

    ETL-0258. Knowledge-Based Image Analysis. George C. Stockman, Barbara A. Lambird, David Lavine, Laveen N. Kanal. Keywords: extraction, verification, region classification, pattern recognition, image analysis.

  19. Knowledge-based nursing diagnosis

    NASA Astrophysics Data System (ADS)

    Roy, Claudette; Hay, D. Robert

    1991-03-01

    Nursing diagnosis is an integral part of the nursing process and determines the interventions leading to outcomes for which the nurse is accountable. Diagnoses under the time constraints of modern nursing can benefit from computer assistance. A knowledge-based engineering approach was developed to address these problems. A number of problems were addressed during system design to make the system practical; these extended beyond the capture of knowledge. The issues involved in implementing a professional knowledge base in a clinical setting are discussed. System functions, structure, interfaces, the health care environment, and terminology and taxonomy are discussed. An integrated system concept from assessment through intervention and evaluation is outlined.

  20. Knowledge based question answering

    SciTech Connect

    Pazzani, M.J.; Engelman, C.

    1983-01-01

    The natural language database query system incorporated in the Knobs Interactive Planning System comprises a dictionary-driven parser, APE-II, and a script interpreter, which yield a conceptual dependency as a representation of the meaning of user input. A conceptualisation pattern-matching production system then determines and executes a procedure for extracting the desired information from the database. In contrast to syntax-driven question-answering systems, e.g. those based on ATN parsers, APE-II is driven bottom-up by expectations associated with word meanings. The goals of this approach include utilising similar representations for questions with similar meanings but widely varying surface structures, developing a powerful mechanism for the disambiguation of words with multiple meanings and the determination of pronoun referents, answering questions which require inferences to be understood, and interpreting ellipses and ungrammatical statements. The Knobs demonstration system is an experimental expert system for Air Force mission planning applications. 16 refs.

  1. Augmenting a database knowledge representation for natural language generation

    SciTech Connect

    McCoy, K.F.

    1982-01-01

    The knowledge representation is an important factor in natural language generation since it limits the semantic capabilities of the generation system. This paper identifies several information types in a knowledge representation that can be used to generate meaningful responses to questions about database structure. Creating such a knowledge representation, however, is a long and tedious process. A system is presented which uses the contents of the database to form part of this knowledge representation automatically. It employs three types of world knowledge axioms to ensure that the representation formed is meaningful and contains salient information. 7 references.

  2. Reusing Design Knowledge Based on Design Cases and Knowledge Map

    ERIC Educational Resources Information Center

    Yang, Cheng; Liu, Zheng; Wang, Haobai; Shen, Jiaoqi

    2013-01-01

    Design knowledge was reused for innovative design work to support designers with product design knowledge and help designers who lack rich experiences to improve their design capacity and efficiency. First, based on the ontological model of product design knowledge constructed by taxonomy, implicit and explicit knowledge was extracted from some…

  3. Knowledge-based flow field zoning

    NASA Technical Reports Server (NTRS)

    Andrews, Alison E.

    1988-01-01

    Automation of flow field zoning in two dimensions is an important step towards easing the three-dimensional grid generation bottleneck in computational fluid dynamics. A knowledge-based approach works well, but certain aspects of flow field zoning make the use of such an approach challenging. A knowledge-based flow field zoner, called EZGrid, was implemented and tested on representative two-dimensional aerodynamic configurations. Results are shown which illustrate the way in which EZGrid incorporates the effects of physics, shape description, position, and user bias in a flow field zoning.

  4. Population Education: A Knowledge Base.

    ERIC Educational Resources Information Center

    Jacobson, Willard J.

    To aid junior high and high school educators and curriculum planners as they develop population education programs, the book provides an overview of the population education knowledge base. In addition, it suggests learning activities, discussion questions, and background information which can be integrated into courses dealing with population,…

  5. Epistemology of knowledge based simulation

    SciTech Connect

    Reddy, R.

    1987-04-01

    Combining artificial intelligence concepts with traditional simulation methodologies yields a powerful design support tool known as knowledge-based simulation. This approach turns a descriptive simulation tool into a prescriptive tool, one which recommends specific goals. Much work in the area of general goal processing and explanation of recommendations remains to be done.

  6. Case-Based Tutoring from a Medical Knowledge Base

    PubMed Central

    Chin, Homer L.

    1988-01-01

    The past decade has seen the emergence of programs that make use of large knowledge bases to assist physicians in diagnosis within the general field of internal medicine. One such program, Internist-I, contains knowledge about over 600 diseases, covering a significant proportion of internal medicine. This paper describes the process of converting a subset of this knowledge base--in the area of cardiovascular diseases--into a probabilistic format, and the use of this resulting knowledge base to teach medical diagnostic knowledge. The system (called KBSimulator--for Knowledge-Based patient Simulator) generates simulated patient cases and uses these cases as a focal point from which to teach medical knowledge. It interacts with the student in a mixed-initiative fashion, presenting patients for the student to diagnose, and allowing the student to obtain further information on his/her own initiative in the context of that patient case. The system scores the student, and uses these scores to form a rudimentary model of the student. This resulting model of the student is then used to direct the generation of subsequent patient cases. This project demonstrates the feasibility of building an intelligent, flexible instructional system that uses a knowledge base constructed primarily for medical diagnosis.
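    The case-generation idea described here, sampling a simulated patient's findings from a probabilistic reformulation of a diagnostic knowledge base, can be sketched in a few lines; the diseases, findings and probabilities below are invented stand-ins, not Internist-I content:

```python
# Sketch of probabilistic patient-case generation: each disease profile gives
# P(finding present | disease), and a simulated case is drawn by sampling each
# finding independently. All clinical content here is illustrative only.
import random

DISEASE_PROFILES = {
    "myocardial_infarction": {"chest_pain": 0.9, "dyspnea": 0.6, "syncope": 0.1},
    "aortic_stenosis":       {"chest_pain": 0.4, "dyspnea": 0.5, "syncope": 0.4},
}

def simulate_patient(disease, rng=random):
    """Return the set of findings present in one simulated case."""
    profile = DISEASE_PROFILES[disease]
    return {f for f, p in profile.items() if rng.random() < p}

rng = random.Random(0)  # fixed seed for a reproducible example case
case = simulate_patient("myocardial_infarction", rng)
print(sorted(case))
```

    A tutoring loop in the spirit of the abstract would present such a sampled case to the student, score the diagnosis, and bias later case selection toward the student's weak areas.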

  7. Generative and Item-Specific Knowledge of Language

    ERIC Educational Resources Information Center

    Morgan, Emily Ida Popper

    2016-01-01

    The ability to generate novel utterances compositionally using generative knowledge is a hallmark property of human language. At the same time, languages contain non-compositional or idiosyncratic items, such as irregular verbs, idioms, etc. This dissertation asks how and why language achieves a balance between these two systems--generative and…

  8. Automated knowledge-base refinement

    NASA Technical Reports Server (NTRS)

    Mooney, Raymond J.

    1994-01-01

    Over the last several years, we have developed several systems for automatically refining incomplete and incorrect knowledge bases. These systems are given an imperfect rule base and a set of training examples and minimally modify the knowledge base to make it consistent with the examples. One of our most recent systems, FORTE, revises first-order Horn-clause knowledge bases. This system can be viewed as automatically debugging Prolog programs based on examples of correct and incorrect I/O pairs. In fact, we have already used the system to debug simple Prolog programs written by students in a programming language course. FORTE has also been used to automatically induce and revise qualitative models of several continuous dynamic devices from qualitative behavior traces. For example, it has been used to induce and revise a qualitative model of a portion of the Reaction Control System (RCS) of the NASA Space Shuttle. By fitting a correct model of this portion of the RCS to simulated qualitative data from a faulty system, FORTE was also able to correctly diagnose simple faults in this system.

  9. Knowledge-based media adaptation

    NASA Astrophysics Data System (ADS)

    Leopold, Klaus; Jannach, Dietmar; Hellwagner, Hermann

    2004-10-01

    This paper introduces the principal approach and describes the basic architecture and current implementation of the knowledge-based multimedia adaptation framework we are currently developing. The framework can be used in Universal Multimedia Access scenarios, where multimedia content has to be adapted to specific usage environment parameters (network and client device capabilities, user preferences). Using knowledge-based techniques (state-space planning), the framework automatically computes an adaptation plan, i.e., a sequence of media conversion operations, to transform the multimedia resources to meet the client's requirements or constraints. The system takes as input standards-compliant descriptions of the content (using MPEG-7 metadata) and of the target usage environment (using MPEG-21 Digital Item Adaptation metadata) to derive start and goal states for the planning process, respectively. Furthermore, declarative descriptions of the conversion operations (such as available via software library functions) enable existing adaptation algorithms to be invoked without requiring programming effort. A running example in the paper illustrates the descriptors and techniques employed by the knowledge-based media adaptation system.
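    The planning step described above, computing a sequence of media conversion operations from a start state (the content description) to a goal state (the usage-environment constraints), can be sketched with a toy forward state-space planner; the operation names, preconditions and state keys are illustrative assumptions, not MPEG-7/MPEG-21 descriptors:

```python
# Minimal forward state-space planner: breadth-first search over media states,
# where each conversion operation has preconditions and effects. Operation
# names and state keys are invented for illustration.
from collections import deque

# operation name -> (preconditions, effects)
OPERATIONS = {
    "transcode_to_h264": ({"codec": "mpeg2"}, {"codec": "h264"}),
    "downscale_to_sd":   ({"resolution": "hd"}, {"resolution": "sd"}),
    "extract_audio":     ({"modality": "av"}, {"modality": "audio"}),
}

def applicable(state, preconds):
    return all(state.get(k) == v for k, v in preconds.items())

def plan(start, goal):
    """Return a list of operation names transforming start into goal, or None."""
    frontier = deque([(start, [])])
    seen = {tuple(sorted(start.items()))}
    while frontier:
        state, steps = frontier.popleft()
        if all(state.get(k) == v for k, v in goal.items()):
            return steps
        for name, (pre, eff) in OPERATIONS.items():
            if applicable(state, pre):
                nxt = {**state, **eff}
                key = tuple(sorted(nxt.items()))
                if key not in seen:
                    seen.add(key)
                    frontier.append((nxt, steps + [name]))
    return None

start = {"codec": "mpeg2", "resolution": "hd", "modality": "av"}
goal = {"codec": "h264", "resolution": "sd"}
print(plan(start, goal))
```

    A real adaptation framework would derive the start and goal states from the MPEG-7 and MPEG-21 metadata and bind each plan step to an actual conversion library call.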

  10. Drawing on Dynamic Local Knowledge through Student-Generated Photography

    ERIC Educational Resources Information Center

    Coles-Ritchie, Marilee; Monson, Bayley; Moses, Catherine

    2015-01-01

    In this research, the authors explored how teachers using student-generated photography draw on local knowledge. The study draws on the framework of funds of knowledge to highlight the assets marginalized students bring to the classroom and the need for culturally relevant pedagogy to address the needs of a diverse public school population. The…

  11. Knowledge based jet engine diagnostics

    NASA Technical Reports Server (NTRS)

    Jellison, Timothy G.; Dehoff, Ronald L.

    1987-01-01

    A fielded expert system automates equipment fault isolation and recommends corrective maintenance action for Air Force jet engines. The knowledge based diagnostics tool was developed as an expert system interface to the Comprehensive Engine Management System, Increment IV (CEMS IV), the standard Air Force base level maintenance decision support system. XMAM (trademark), the Expert Maintenance Tool, automates procedures for troubleshooting equipment faults, provides a facility for interactive user training, and fits within a diagnostics information feedback loop to improve the troubleshooting and equipment maintenance processes. The application of expert diagnostics to the Air Force A-10A aircraft TF-34 engine equipped with the Turbine Engine Monitoring System (TEMS) is presented.

  12. Cooperating knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Feigenbaum, Edward A.; Buchanan, Bruce G.

    1988-01-01

    This final report covers work performed under Contract NCC2-220 between NASA Ames Research Center and the Knowledge Systems Laboratory, Stanford University. The period of research was from March 1, 1987 to February 29, 1988. Topics covered were as follows: (1) concurrent architectures for knowledge-based systems; (2) methods for the solution of geometric constraint satisfaction problems, and (3) reasoning under uncertainty. The research in concurrent architectures was co-funded by DARPA, as part of that agency's Strategic Computing Program. The research has been in progress since 1985, under DARPA and NASA sponsorship. The research in geometric constraint satisfaction has been done in the context of a particular application, that of determining the 3-D structure of complex protein molecules, using the constraints inferred from NMR measurements.

  13. Sustaining knowledge in the neutron generator community and benchmarking study.

    SciTech Connect

    Barrentine, Tameka C.; Kennedy, Bryan C.; Saba, Anthony W.; Turgeon, Jennifer L.; Schneider, Julia Teresa; Stubblefield, William Anthony; Baldonado, Esther

    2008-03-01

    In 2004, the Responsive Neutron Generator Product Deployment department embarked upon a partnership with the Systems Engineering and Analysis knowledge management (KM) team to develop knowledge management systems for the neutron generator (NG) community. This partnership continues today. The most recent challenge was to improve the current KM system (KMS) development approach by identifying a process that will allow staff members to capture knowledge as they learn it. This 'as-you-go' approach will lead to a sustainable KM process for the NG community. This paper presents a historical overview of NG KMSs, as well as research conducted to move toward sustainable KM.

  14. Advanced software development workstation. Knowledge base design: Design of knowledge base for flight planning application

    NASA Technical Reports Server (NTRS)

    Izygon, Michel E.

    1992-01-01

    The development process of the knowledge base for the generation of Test Libraries for Mission Operations Computer (MOC) Command Support focused on a series of information-gathering interviews. These knowledge capture sessions are supporting the development of a prototype for evaluating the capabilities of INTUIT on such an application. The prototype includes functions related to POCC (Payload Operation Control Center) processing. It prompts the end-users for input through a series of panels and then generates the Meds associated with the initialization and the update of hazardous command tables for a POCC Processing TLIB.

  15. Acquisition, representation and rule generation for procedural knowledge

    NASA Technical Reports Server (NTRS)

    Ortiz, Chris; Saito, Tim; Mithal, Sachin; Loftin, R. Bowen

    1991-01-01

    Current research into the design and continuing development of a system for the acquisition of procedural knowledge, its representation in useful forms, and proposed methods for automated C Language Integrated Production System (CLIPS) rule generation is discussed. The Task Analysis and Rule Generation Tool (TARGET) is intended to permit experts, individually or collectively, to visually describe and refine procedural tasks. The system is designed to represent the acquired knowledge in the form of graphical objects with the capacity for generating production rules in CLIPS. The generated rules can then be integrated into applications such as NASA's Intelligent Computer Aided Training (ICAT) architecture. Also described are proposed methods for use in translating the graphical and intermediate knowledge representations into CLIPS rules.

  16. Generating pedagogical content knowledge in teacher education students

    NASA Astrophysics Data System (ADS)

    van den Berg, Ed

    2015-09-01

    Some pre-service teaching activities can contribute much to the learning of pedagogical content knowledge (PCK) and subsequent teaching as these activities are generating PCK within the pre-service teacher’s own classroom. Three examples are described: preparing exhibitions of science experiments, assessing preconceptions, and teaching using embedded formative assessment in which assessment leads teaching and almost inevitably results in the development of PCK. Evidence for the effectiveness of the methods is based on the author’s experience in teacher education programmes in different countries, but will need to be confirmed by research. This is a modified version of the author’s keynote lecture on teacher education at the World Conference on Physics Education, 1-6 July 2012, Istanbul, Turkey.

  17. Next Generation Agricultural System Data, Models and Knowledge Products: Introduction

    NASA Technical Reports Server (NTRS)

    Antle, John M.; Jones, James W.; Rosenzweig, Cynthia E.

    2016-01-01

    Agricultural system models have become important tools to provide predictive and assessment capability to a growing array of decision-makers in the private and public sectors. Despite ongoing research and model improvements, many of the agricultural models today are direct descendants of research investments initially made 30-40 years ago, and many of the major advances in data, information and communication technology (ICT) of the past decade have not been fully exploited. The purpose of this Special Issue of Agricultural Systems is to lay the foundation for the next generation of agricultural systems data, models and knowledge products. The Special Issue is based on a 'NextGen' study led by the Agricultural Model Intercomparison and Improvement Project (AgMIP) with support from the Bill and Melinda Gates Foundation.

  18. A knowledge base architecture for distributed knowledge agents

    NASA Technical Reports Server (NTRS)

    Riedesel, Joel; Walls, Bryan

    1990-01-01

    A tuple space based object oriented model for knowledge base representation and interpretation is presented. An architecture for managing distributed knowledge agents is then implemented within the model. The general model is based upon a database implementation of a tuple space. Objects are then defined as an additional layer upon the database. The tuple space may or may not be distributed depending upon the database implementation. A language for representing knowledge and inference strategy is defined whose implementation takes advantage of the tuple space. The general model may then be instantiated in many different forms, each of which may be a distinct knowledge agent. Knowledge agents may communicate using tuple space mechanisms as in the LINDA model as well as using more well known message passing mechanisms. An implementation of the model is presented describing strategies used to keep inference tractable without giving up expressivity. An example applied to a power management and distribution network for Space Station Freedom is given.
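    A tuple-space substrate of the kind described above can be sketched in a few lines; the operation names follow the LINDA convention (out to deposit, rd to read non-destructively, in to withdraw, spelled inp here because in is a Python keyword), while everything else is an illustrative assumption rather than the paper's implementation:

```python
# Minimal Linda-style tuple space sketch: agents communicate by depositing,
# reading, and withdrawing tuples from a shared space. Thread-safe via a
# condition variable so blocking reads wake when a matching tuple arrives.
import threading

class TupleSpace:
    def __init__(self):
        self._tuples = []
        self._cond = threading.Condition()

    def out(self, tup):
        """Deposit a tuple into the space."""
        with self._cond:
            self._tuples.append(tup)
            self._cond.notify_all()

    def _match(self, pattern):
        # None fields in the pattern act as wildcards.
        for tup in self._tuples:
            if len(tup) == len(pattern) and all(
                    p is None or p == t for p, t in zip(pattern, tup)):
                return tup
        return None

    def rd(self, pattern):
        """Read a matching tuple without removing it (blocks until found)."""
        with self._cond:
            while (tup := self._match(pattern)) is None:
                self._cond.wait()
            return tup

    def inp(self, pattern):
        """Withdraw a matching tuple from the space (blocks until found)."""
        with self._cond:
            while (tup := self._match(pattern)) is None:
                self._cond.wait()
            self._tuples.remove(tup)
            return tup

space = TupleSpace()
space.out(("sensor", "bus_a", 112.5))
print(space.rd(("sensor", "bus_a", None)))  # non-destructive read
print(space.inp(("sensor", None, None)))    # destructive read
```

    In the architecture described, each knowledge agent would be one client of such a space, with the tuple layer itself backed by a database rather than an in-memory list.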

  19. AIS Data Base Generation.

    DTIC Science & Technology

    1981-04-01

    ...the area of natural language understanding by computer come from the fields of Computational Linguistics and Artificial Intelligence (AI). In the last ... blocks of the data base. Another interesting approach to automated database generation is the one taken by the Heuristic Programming Project at ... operating system. OSI's approach to automated database generation ...

  20. Automated Fictional Ideation via Knowledge Base Manipulation.

    PubMed

    Llano, Maria Teresa; Colton, Simon; Hepworth, Rose; Gow, Jeremy

    The invention of fictional ideas (ideation) is often a central process in the creative production of artefacts such as poems, music and paintings, but has barely been studied in the computational creativity community. We present here a general approach to automated fictional ideation that works by manipulating facts specified in knowledge bases. More specifically, we specify a number of constructions which, by altering and combining facts from a knowledge base, result in the generation of fictions. Moreover, we present an instantiation of these constructions through the use of ConceptNet, a database of common sense knowledge. In order to evaluate the success of these constructions, we present a curation analysis that calculates the proportion of ideas which pass a typicality judgement. We further evaluate the output of this approach through a crowd-sourcing experiment in which participants were asked to rank ideas. We found a positive correlation between the participants' rankings and a chaining inference technique that automatically assesses the value of the fictions generated through our approach. We believe that these results show that this approach constitutes a firm basis for automated fictional ideation with evaluative capacity.
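    One such construction, altering and recombining facts drawn from a common-sense knowledge base, can be sketched as follows; the mini knowledge base and the capability-swapping construction are invented stand-ins for ConceptNet content and the paper's actual constructions:

```python
# Sketch of a fictional-ideation construction: generate fictions by giving one
# concept another concept's capability, using (subject, relation, object)
# facts in the style of a common-sense knowledge base. All facts and the
# rendering template are illustrative, not ConceptNet data.
import itertools

FACTS = [
    ("bird", "CapableOf", "fly"),
    ("fish", "CapableOf", "swim"),
    ("dog", "CapableOf", "bark"),
]

def swap_construction(facts):
    """Yield fictional ideas by swapping capabilities between concepts."""
    for (s1, r1, o1), (_, r2, o2) in itertools.permutations(facts, 2):
        if r1 == r2 == "CapableOf" and o1 != o2:
            yield f"What if a {s1} could {o2}?"

ideas = list(swap_construction(FACTS))
for idea in ideas:
    print(idea)
```

    A curation step as in the paper would then filter these candidates with a typicality judgement, keeping only the ideas atypical enough to count as fictions.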

  1. Knowledge-Based Entrepreneurship in a Boundless Research System

    ERIC Educational Resources Information Center

    Dell'Anno, Davide

    2008-01-01

    International entrepreneurship and knowledge-based entrepreneurship have recently generated considerable academic and non-academic attention. This paper explores the "new" field of knowledge-based entrepreneurship in a boundless research system. Cultural barriers to the development of business opportunities by researchers persist in some academic…

  2. Knowledge Base Editor (SharpKBE)

    NASA Technical Reports Server (NTRS)

    Tikidjian, Raffi; James, Mark; Mackey, Ryan

    2007-01-01

    The SharpKBE software provides a graphical user interface environment for domain experts to build and manage knowledge base systems. Knowledge bases can be exported/translated to various target languages automatically, including customizable target languages.

  3. Critical Analysis of Textbooks: Knowledge-Generating Logics and the Emerging Image of "Global Economic Contexts"

    ERIC Educational Resources Information Center

    Thoma, Michael

    2017-01-01

    This paper presents an approach to the critical analysis of textbook knowledge, which, working from a discourse theory perspective (based on the work of Foucault), refers to the performative nature of language. The critical potential of the approach derives from an analysis of knowledge-generating logics, which produce particular images of reality…

  4. A Discussion of Knowledge Based Design

    NASA Technical Reports Server (NTRS)

    Wood, Richard M.; Bauer, Steven X. S.

    1999-01-01

    A discussion of knowledge and Knowledge-Based design as related to the design of aircraft is presented. The paper discusses the perceived problem with existing design studies and introduces the concepts of design and knowledge for a Knowledge-Based design system. A review of several Knowledge-Based design activities is provided. A Virtual Reality, Knowledge-Based system is proposed and reviewed. The feasibility of Virtual Reality to improve the efficiency and effectiveness of aerodynamic and multidisciplinary design, evaluation, and analysis of aircraft through the coupling of virtual reality technology and a Knowledge-Based design system is also reviewed. The final section of the paper discusses future directions for design and the role of Knowledge-Based design.

  5. Generating Pedagogical Content Knowledge in Teacher Education Students

    ERIC Educational Resources Information Center

    van den Berg, Ed

    2015-01-01

    Some pre-service teaching activities can contribute much to the learning of pedagogical content knowledge (PCK) and subsequent teaching as these activities are "generating" PCK within the pre-service teacher's own classroom. Three examples are described: preparing exhibitions of science experiments, assessing preconceptions, and teaching…

  6. NASDA knowledge-based network planning system

    NASA Technical Reports Server (NTRS)

    Yamaya, K.; Fujiwara, M.; Kosugi, S.; Yambe, M.; Ohmori, M.

    1993-01-01

One of the SODS (space operation and data system) sub-systems, NP (network planning) was the first expert system used by NASDA (National Space Development Agency of Japan) for the tracking and control of satellites. The major responsibilities of the NP system are: first, the allocation of network and satellite control resources and, second, the generation of the network operation plan (NOP) data used in automated control of the stations and control center facilities. Until now, the first task of network resource scheduling was done by network operators. The NP system automatically generates schedules using its knowledge base, which contains information on satellite orbits, station availability, which computer is dedicated to which satellite, and how many stations must be available for a particular satellite pass or a certain time period. The NP system is introduced.
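The kind of scheduling knowledge the abstract enumerates (station availability, computer dedication, minimum station counts) can be sketched as a simple rule-based filter over candidate allocations. This is an illustrative sketch only; all names, record formats, and rules are invented, not taken from the NP system.

```python
# Hypothetical sketch of knowledge-based pass scheduling: each candidate
# station is filtered by rules from the knowledge base, and a pass is
# infeasible when too few stations survive the filters.

def schedule_passes(satellite, passes, stations, kb):
    """Return a station allocation per pass, or None if any pass is infeasible."""
    plan = {}
    for p in passes:
        candidates = [
            s for s in stations
            if p["window"] in s["available"]                 # station availability
            and satellite in kb["dedicated"][s["computer"]]  # computer dedication
        ]
        need = kb["min_stations"].get(satellite, 1)          # required station count
        if len(candidates) < need:
            return None                                      # infeasible pass
        plan[p["id"]] = [s["name"] for s in candidates[:need]]
    return plan

kb = {
    "dedicated": {"comp-A": {"SAT-1"}, "comp-B": {"SAT-1", "SAT-2"}},
    "min_stations": {"SAT-1": 1},
}
stations = [
    {"name": "station-1", "computer": "comp-A", "available": {"w1", "w2"}},
    {"name": "station-2", "computer": "comp-B", "available": {"w2"}},
]
plan = schedule_passes("SAT-1", [{"id": "pass-1", "window": "w2"}], stations, kb)
```

In a real system the knowledge base would also encode orbit geometry and time periods; the point here is only that declarative rules replace the manual allocation the operators previously performed.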

  7. Modeling Guru: Knowledge Base for NASA Modelers

    NASA Astrophysics Data System (ADS)

    Seablom, M. S.; Wojcik, G. S.; van Aartsen, B. H.

    2009-05-01

Modeling Guru is an on-line knowledge-sharing resource for anyone involved with or interested in NASA's scientific models or High End Computing (HEC) systems. Developed and maintained by NASA's Software Integration and Visualization Office (SIVO) and the NASA Center for Computational Sciences (NCCS), Modeling Guru's combined forums and knowledge base for research and collaboration is becoming a repository for the accumulated expertise of NASA's scientific modeling and HEC communities. All NASA modelers and associates are encouraged to participate and provide knowledge about the models and systems so that other users may benefit from their experience. Modeling Guru is divided into a hierarchy of communities, each with its own set of forums and knowledge base documents. Current modeling communities include those for space science, land and atmospheric dynamics, atmospheric chemistry, and oceanography. In addition, there are communities focused on NCCS systems, HEC tools and libraries, and programming and scripting languages. Anyone may view most of the content on Modeling Guru (available at http://modelingguru.nasa.gov/), but you must log in to post messages and subscribe to community postings. The site offers a full range of "Web 2.0" features, including discussion forums, "wiki" document generation, document uploading, RSS feeds, search tools, blogs, email notification, and "breadcrumb" links. A discussion (a.k.a. forum "thread") is used to post comments, solicit feedback, or ask questions. If marked as a question, SIVO will monitor the thread, and normally respond within a day. Discussions can include embedded images, tables, and formatting through the use of the Rich Text Editor. Also, the user can add "Tags" to their thread to facilitate later searches. The "knowledge base" is comprised of documents that are used to capture and share expertise with others. The default "wiki" document lets users edit within the browser so others can easily collaborate on the

  8. The experimenters' regress reconsidered: Replication, tacit knowledge, and the dynamics of knowledge generation.

    PubMed

    Feest, Uljana

    2016-08-01

    This paper revisits the debate between Harry Collins and Allan Franklin, concerning the experimenters' regress. Focusing my attention on a case study from recent psychology (regarding experimental evidence for the existence of a Mozart Effect), I argue that Franklin is right to highlight the role of epistemological strategies in scientific practice, but that his account does not sufficiently appreciate Collins's point about the importance of tacit knowledge in experimental practice. In turn, Collins rightly highlights the epistemic uncertainty (and skepticism) surrounding much experimental research. However, I will argue that his analysis of tacit knowledge fails to elucidate the reasons why scientists often are (and should be) skeptical of other researchers' experimental results. I will present an analysis of tacit knowledge in experimental research that not only answers to this desideratum, but also shows how such skepticism can in fact be a vital enabling factor for the dynamic processes of experimental knowledge generation.

  9. Knowledge-based pitch detection

    NASA Astrophysics Data System (ADS)

    Dove, W. P.

    1986-06-01

    Many problems in signal processing involve a mixture of numerical and symbolic knowledge. Examples of problems of this sort include the recognition of speech and the analysis of images. This thesis focuses on the problem of employing a mixture of symbolic and numerical knowledge within a single system, through the development of a system directed at a modified pitch detection problem. For this thesis, the conventional pitch detection problem was modified by providing a phonetic transcript and sex/age information as input to the system, in addition to the acoustic waveform. The Pitch Detector's Assistant (PDA) system that was developed is an interactive facility for evaluating ways of approaching this problem. The PDA system allows the user to interrupt processing at any point, change either input data, derived data, or problem knowledge and continue execution.

  10. Common Sense about Uncommon Knowledge: The Knowledge Bases for Diversity.

    ERIC Educational Resources Information Center

    Smith, G. Pritchy

    This book explains knowledge bases for teaching diverse student populations. An introduction displays one first-year teacher's experiences with diverse students in a high school classroom in San Angelo, Texas in 1961. The 15 chapters are: (1) "Toward Defining Culturally Responsible and Responsive Teacher Education"; (2) "Knowledge…

  11. Knowledge-Based Search Tactics.

    ERIC Educational Resources Information Center

    Shute, Steven J.; Smith, Philip J.

    1993-01-01

    Describes an empirical study that was conducted to examine the performance of expert search intermediaries from Chemical Abstracts Service. Highlights include subject-independent and subject-dependent expertise; a model of the use of subject-specific knowledge; and implications for computerized intermediary systems and for training human…

  12. Automated extraction of knowledge for model-based diagnostics

    NASA Technical Reports Server (NTRS)

    Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.

    1990-01-01

The concept of accessing computer aided design (CAD) databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques as well as an internal database of component descriptions to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data from the drawn system. The AKG system is capable of defining knowledge bases in formats required by various model-based reasoning tools.
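The label-and-connectivity idea above can be made concrete with a small sketch: given connection tuples as a CAD database might export them, build frame-like structures for each component. The record format, slot names, and component database are illustrative assumptions, not the AKG implementation.

```python
# Hedged sketch: turning CAD label/connectivity records into frame-like
# structures, loosely in the spirit of the AKG approach described above.

def build_frames(connections, component_db):
    """connections: (from_label, port, to_label) tuples from the drawing;
    component_db maps a label to its component type description."""
    frames = {}
    for src, port, dst in connections:
        for label in (src, dst):
            frames.setdefault(label, {
                "type": component_db.get(label, "unknown"),  # component slot
                "connected_to": [],                          # connectivity slot
            })
        frames[src]["connected_to"].append((port, dst))
    return frames

db = {"V1": "valve", "P1": "pump", "T1": "tank"}
frames = build_frames([("P1", "out", "V1"), ("V1", "out", "T1")], db)
```

A model-based reasoner can then traverse the `connected_to` slots to propagate behavior through the modeled system.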

  13. Persistent Data/Knowledge Base

    DTIC Science & Technology

    1991-06-01

This research was supported by the Rome Air...queries are processed and rules are inferenced in PDKB. Chapter 8 discusses future research topics and directions for PDKB. They include reasoning...existing knowledge, and forward chaining will be triggered. A detailed query and inferencing processing design is beyond the scope of this research

  14. Introducing T-shaped managers. Knowledge management's next generation.

    PubMed

    Hansen, M T; von Oetinger, B

    2001-03-01

    Most companies do a poor job of capitalizing on the wealth of expertise scattered across their organizations. That's because they tend to rely on centralized knowledge-management systems and technologies. But such systems are really only good at distributing explicit knowledge, the kind that can be captured and codified for general use. They're not very good at transferring implicit knowledge, the kind needed to generate new insights and creative ways of tackling business problems or opportunities. The authors suggest another approach, something they call T-shaped management, which requires executives to share knowledge freely across their organization (the horizontal part of the "T"), while remaining fiercely committed to their individual business unit's performance (the vertical part). A few companies are starting to use this approach, and one--BP Amoco--has been especially successful. From BP's experience, the authors have gleaned five ways that T-shaped managers help companies capitalize on their inherent knowledge. They increase efficiency by transferring best practices. They improve the quality of decision making companywide. They grow revenues through shared expertise. They develop new business opportunities through the cross-pollination of ideas. And they make bold strategic moves possible by delivering well-coordinated implementation. All that takes time, and BP's managers have had to learn how to balance that time against the attention they must pay to their own units. The authors suggest, however, that it's worth the effort to find such a balance to more fully realize the immense value of the knowledge lying idle within so many companies.

  15. Utilizing knowledge-base semantics in graph-based algorithms

    SciTech Connect

    Darwiche, A.

    1996-12-31

    Graph-based algorithms convert a knowledge base with a graph structure into one with a tree structure (a join-tree) and then apply tree-inference on the result. Nodes in the join-tree are cliques of variables and tree-inference is exponential in w*, the size of the maximal clique in the join-tree. A central property of join-trees that validates tree-inference is the running-intersection property: the intersection of any two cliques must belong to every clique on the path between them. We present two key results in connection to graph-based algorithms. First, we show that the running-intersection property, although sufficient, is not necessary for validating tree-inference. We present a weaker property for this purpose, called running-interaction, that depends on non-structural (semantical) properties of a knowledge base. We also present a linear algorithm that may reduce w* of a join-tree, possibly destroying its running-intersection property, while maintaining its running-interaction property and, hence, its validity for tree-inference. Second, we develop a simple algorithm for generating trees satisfying the running-interaction property. The algorithm bypasses triangulation (the standard technique for constructing join-trees) and does not construct a join-tree first. We show that the proposed algorithm may in some cases generate trees that are more efficient than those generated by modifying a join-tree.
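The running-intersection property defined in the abstract is easy to check directly: for every pair of cliques, their intersection must be contained in each clique on the tree path between them. The following sketch verifies that property; the data structures are illustrative and not from the paper.

```python
# Sketch: verifying the running-intersection property on a join-tree.
# `tree` is an adjacency map over clique ids; `cliques` maps id -> variable set.

def tree_path(tree, a, b, parent=None):
    """Return the list of nodes on the unique tree path from a to b."""
    if a == b:
        return [a]
    for nbr in tree[a]:
        if nbr != parent:
            rest = tree_path(tree, nbr, b, a)
            if rest:
                return [a] + rest
    return []  # b not reachable through this branch

def has_running_intersection(tree, cliques):
    ids = list(cliques)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            common = cliques[a] & cliques[b]
            # The intersection must be a subset of every clique on the path.
            if not all(common <= cliques[c] for c in tree_path(tree, a, b)):
                return False
    return True

# A valid join-tree: {A,B} - {B,C} - {C,D}
cliques = {1: {"A", "B"}, 2: {"B", "C"}, 3: {"C", "D"}}
tree = {1: [2], 2: [1, 3], 3: [2]}
ok = has_running_intersection(tree, cliques)
```

The paper's point is that this structural check is sufficient but not necessary: its weaker running-interaction property additionally consults the semantics of the knowledge base, which a purely structural test like this cannot see.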

  16. Games for Learning: Which Template Generates Social Construction of Knowledge?

    ERIC Educational Resources Information Center

    Garcia, Francisco A.

    2015-01-01

The purpose of this study was to discover how three-person teams use game templates (trivia, role-play, or scavenger hunt) to socially construct knowledge. The researcher designed an experimental Internet-based database to facilitate teams creating each game. Teams consisted of teachers, students, hobbyists, and business owners who shared similar…

  17. [Challenges for knowledge generation in environmental health: an ecosystemic approach].

    PubMed

    Weihs, Marla; Mertens, Frédéric

    2013-05-01

This article examines opportunities and limitations regarding knowledge generation in the field of environmental health. The contention is that understanding the complexity of factors that determine the health of humans and ecosystems requires a redefinition of the traditional distribution of roles and responsibilities in scientific research. These research practices involve inter- and transdisciplinary approaches and the application of an ecosystemic approach (ecohealth). Challenges and opportunities associated with the application of inter- and transdisciplinarity in environmental health problems are discussed and illustrated by two case studies that use an ecohealth approach: a project on the contamination and exposure to mercury in the Brazilian Amazon, and another on the urban transmission of echinococcosis in Nepal. In the conclusion, the potential benefits of using an ecohealth approach in overcoming the limitations of unidisciplinary practices and in taking advantage of local knowledge and participation are stressed.

  18. Utilizing Data and Knowledge Mining for Probabilistic Knowledge Bases

    DTIC Science & Technology

    1996-12-01

...and the flexibility of its knowledge representation scheme is an inverse one. In order to implement a realistic, real-world application, both of these ...information from some text-based source, such as an on-line encyclopedia or an Internet web page. Most often, these systems are highly focused and specialize

  19. Ontology-Based Multiple Choice Question Generation

    PubMed Central

    Al-Yahya, Maha

    2014-01-01

    With recent advancements in Semantic Web technologies, a new trend in MCQ item generation has emerged through the use of ontologies. Ontologies are knowledge representation structures that formally describe entities in a domain and their relationships, thus enabling automated inference and reasoning. Ontology-based MCQ item generation is still in its infancy, but substantial research efforts are being made in the field. However, the applicability of these models for use in an educational setting has not been thoroughly evaluated. In this paper, we present an experimental evaluation of an ontology-based MCQ item generation system known as OntoQue. The evaluation was conducted using two different domain ontologies. The findings of this study show that ontology-based MCQ generation systems produce satisfactory MCQ items to a certain extent. However, the evaluation also revealed a number of shortcomings with current ontology-based MCQ item generation systems with regard to the educational significance of an automatically constructed MCQ item, the knowledge level it addresses, and its language structure. Furthermore, for the task to be successful in producing high-quality MCQ items for learning assessments, this study suggests a novel, holistic view that incorporates learning content, learning objectives, lexical knowledge, and scenarios into a single cohesive framework. PMID:24982937
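A minimal version of the ontology-based MCQ idea described above can be sketched from triples: the correct answer comes from the queried subject's property, and distractors are drawn from other individuals filling the same predicate. This is an invented illustration, not the OntoQue system; the triples and stem template are assumptions.

```python
# Illustrative sketch of ontology-based MCQ item generation: the key is
# the object of the queried triple, and distractors are other objects of
# the same predicate (so they are plausible but wrong).

def make_mcq(triples, subject, predicate):
    correct = next(o for s, p, o in triples if s == subject and p == predicate)
    distractors = {o for s, p, o in triples if p == predicate and o != correct}
    prop = predicate.replace("has_", "").replace("_", " ")
    return {
        "stem": f"What is the {prop} of {subject}?",
        "key": correct,
        "options": sorted({correct} | distractors),  # shuffled in a real system
    }

triples = [
    ("France", "has_capital", "Paris"),
    ("Italy", "has_capital", "Rome"),
    ("Spain", "has_capital", "Madrid"),
]
item = make_mcq(triples, "France", "has_capital")
```

The shortcomings the evaluation found (educational significance, knowledge level, language structure) are visible even here: the stem is syntactically templated and tests only factual recall.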

  20. The Role of Domain Knowledge in Creative Generation

    ERIC Educational Resources Information Center

    Ward, Thomas B.

    2008-01-01

    Previous studies have shown that a predominant tendency in creative generation tasks is to base new ideas on well-known, specific instances of previous ideas (e.g., basing ideas for imaginary aliens on dogs, cats or bears). However, a substantial minority of individuals has been shown to adopt more abstract approaches to the task and to develop…

  1. Knowledge Based Systems and Metacognition in Radar

    NASA Astrophysics Data System (ADS)

    Capraro, Gerard T.; Wicks, Michael C.

An airborne, ground-looking radar sensor's performance may be enhanced by selecting algorithms adaptively as the environment changes. A short description of an airborne intelligent radar system (AIRS) is presented, with a description of the knowledge-based filter and detection portions. A second level of artificial intelligence (AI) processing is presented that monitors, tests, and learns how to improve and control the first level. This approach is based upon metacognition, a way forward for developing knowledge-based systems.

  2. Frame-based knowledge representation for processing planning

    NASA Astrophysics Data System (ADS)

    Lindsay, K. J.

    An Expert System is being developed to perform generative process planning for individual parts fabricated from extruded and sheet metal materials, and for bonded metal assemblies. The system employs a frame-based knowledge representation structure and production rules to generate detailed fabrication and processing instructions. The system is being developed using the InterLISP-D language, commercially available expert system development software and a dedicated LISP machine. The paper describes the knowledge-based representation and reasoning techniques applied within the system and pertinent development issues.
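The frame-plus-production-rule pattern the abstract describes can be sketched compactly: a part is a frame of slots, and each production rule maps a condition on those slots to a fabrication step. The part frame, slot names, and rules below are invented for illustration; the actual system was written in InterLISP-D.

```python
# Hedged sketch of generative process planning with a frame (dict of
# slots) and production rules (condition -> fabrication step), in the
# spirit of the system described above.

part = {"material": "sheet_aluminum", "features": ["hole", "bend"]}

RULES = [
    (lambda p: p["material"] == "sheet_aluminum", "shear blank to size"),
    (lambda p: "hole" in p["features"], "punch holes"),
    (lambda p: "bend" in p["features"], "brake-form bends"),
    (lambda p: p["material"].startswith("sheet"), "deburr edges"),
]

def generate_plan(part_frame):
    """Fire every rule whose condition matches, in order, to build the plan."""
    return [step for cond, step in RULES if cond(part_frame)]

plan = generate_plan(part)
```

A production-rule engine of any real size would also handle rule ordering and conflict resolution; a fixed rule list keeps the sketch minimal.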

  3. The Knowledge Building Paradigm: A Model of Learning for Net Generation Students

    ERIC Educational Resources Information Center

    Philip, Donald

    2005-01-01

    In this article Donald Philip describes Knowledge Building, a pedagogy based on the way research organizations function. The global economy, Philip argues, is driving a shift from older, industrial models to the model of the business as a learning organization. The cognitive patterns of today's Net Generation students, formed by lifetime exposure…

  4. Socially Relevant Knowledge Based Telemedicine

    DTIC Science & Technology

    2010-10-01

Advanced Cardiac Life Support training, we have developed a virtual world platform to enable training of geographically disparate teams on ACLS...training. Coupling haptic devices with the virtual world, we have enabled a multi-sensorial platform for team training. The initial experiment shows the...cognitive task as well as psychomotor task by providing an interactive platform to users to perform the tasks. There are many team based

  5. Knowledge-Based Learning: Integration of Deductive and Inductive Learning for Knowledge Base Completion.

    ERIC Educational Resources Information Center

    Whitehall, Bradley Lane

    In constructing a knowledge-based system, the knowledge engineer must convert rules of thumb provided by the domain expert and previously solved examples into a working system. Research in machine learning has produced algorithms that create rules for knowledge-based systems, but these algorithms require either many examples or a complete domain…

  6. Knowledge-Based Software Development Tools

    DTIC Science & Technology

    1993-09-01

inference, finite differencing, and data structure selection are discussed. A detailed case study is presented that shows how these systems could cooperate...theories that were codified in the language. In particular, encoded in CHI were theories for: generating data structure implementations, which mil...problems, and the finite-differencing program optimization technique. Implementation knowledge for data structure generation and performance estimation

  7. The Coming of Knowledge-Based Business.

    ERIC Educational Resources Information Center

    Davis, Stan; Botkin, Jim

    1994-01-01

    Economic growth will come from knowledge-based businesses whose "smart" products filter and interpret information. Businesses will come to think of themselves as educators and their customers as learners. (SK)

  8. How Is Knowledge Generated About Memory Encoding Strategy Effectiveness?

    PubMed Central

    Hertzog, Christopher; Price, Jodi; Dunlosky, John

    2008-01-01

    This study evaluated how people learn about encoding strategy effectiveness in an associative memory task. Individuals studied two lists of paired associates under instructions to use either a normatively effective strategy (interactive imagery) or a normatively ineffective strategy (rote repetition) for each pair. Questionnaire ratings of imagery effectiveness increased and ratings of repetition effectiveness decreased after task experience, demonstrating new knowledge about strategy effectiveness. Cued recall confidence judgments, measuring confidence in recall accuracy, were almost perfectly correlated with actual recall and strongly correlated with postdictions – estimates of recall for each strategy. A structural regression model revealed that postdictions mediated both changes in second-list predictions and changes in strategy effectiveness ratings, implicating accurate performance estimates based on item-level monitoring as the key to updating strategy knowledge. PMID:19043596

  9. Methodology for testing and validating knowledge bases

    NASA Technical Reports Server (NTRS)

    Krishnamurthy, C.; Padalkar, S.; Sztipanovits, J.; Purves, B. R.

    1987-01-01

A test and validation toolset developed for artificial intelligence programs is described. The basic premises of this method are: (1) knowledge bases have a strongly declarative character and represent mostly structural information about different domains, (2) the conditions for integrity, consistency, and correctness can be transformed into structural properties of knowledge bases, and (3) structural information and structural properties can be uniformly represented by graphs and checked by graph algorithms. The interactive test and validation environment has been implemented on a SUN workstation.

  10. Knowledge based programming environments: A perspective

    NASA Technical Reports Server (NTRS)

    Amin, Ashok T.

    1988-01-01

Programming environments are an area of recent origin; the term refers to an integrated set of tools, such as a program library, text editor, compiler, and debugger, in support of program development. Understanding of programs and programming has led to automated techniques for program development. Knowledge-based programming systems using program transformations offer significant impact on future program development methodologies. A review of recent developments in the area of knowledge-based programming environments, from the perspective of software engineering, is presented.

  11. Distributed, cooperating knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt

    1991-01-01

Some current research in the development and application of distributed, cooperating knowledge-based systems technology is addressed. The focus of the current research is the spacecraft ground operations environment. The underlying hypothesis is that, because of the increasing size, complexity, and cost of planned systems, conventional procedural approaches to the architecture of automated systems will give way to a more comprehensive knowledge-based approach. A hallmark of these future systems will be the integration of multiple knowledge-based agents which understand the operational goals of the system and cooperate with each other and the humans in the loop to attain the goals. The current work includes the development of a reference model for knowledge-base management, the development of a formal model of cooperating knowledge-based agents, the use of a testbed for prototyping and evaluating various knowledge-based concepts, and beginning work on the establishment of an object-oriented model of an intelligent end-to-end (spacecraft to user) system. An introductory discussion of these activities is presented, the major concepts and principles being investigated are highlighted, and their potential use in other application domains is indicated.

  12. An Insulating Glass Knowledge Base

    SciTech Connect

    Michael L. Doll; Gerald Hendrickson; Gerard Lagos; Russell Pylkki; Chris Christensen; Charlie Cureija

    2005-08-01

This report discusses issues relevant to Insulating Glass (IG) durability performance by presenting the observations and developed conclusions in a logical sequential format. This concluding effort discusses Phase II activities and focuses on beginning to quantify IG durability issues while continuing the approach presented in the Phase I activities (Appendix 1), which discuss a qualitative assessment of durability issues. Phase II developed a focus around two specific IG design classes previously presented in Phase I of this project. The typical box spacer and thermoplastic spacer designs, including their Failure Modes and Effects Analysis (FMEA) and Fault Tree diagrams, were chosen to address two currently used IG design options with varying components and failure modes. The system failures occur due to failures of components or their interfaces. Efforts to begin quantifying the durability issues focused on the development and delivery of an included computer-based IG durability simulation program. The focus/effort to deliver the foundation for a comprehensive IG durability simulation tool is necessary to address advancements needed to meet current and future building envelope energy performance goals. This need is based upon the current lack of IG field failure data and the lengthy field observation time necessary for this data collection. Ultimately, the simulation program is intended to be used by designers throughout the current and future industry supply chain. Its use is intended to advance IG durability as expectations grow around energy conservation and with the growth of embedded technologies as required to meet energy needs. In addition, the tool has the immediate benefit of providing insight for research and improvement prioritization. Included in the simulation model presentation are elements and/or methods to address IG materials, design, process, quality, induced stress (environmental and other factors), validation, etc.
In addition, acquired data

  13. The importance of knowledge-based technology.

    PubMed

    Cipriano, Pamela F

    2012-01-01

    Nurse executives are responsible for a workforce that can provide safer and more efficient care in a complex sociotechnical environment. National quality priorities rely on technologies to provide data collection, share information, and leverage analytic capabilities to interpret findings and inform approaches to care that will achieve better outcomes. As a key steward for quality, the nurse executive exercises leadership to provide the infrastructure to build and manage nursing knowledge and instill accountability for following evidence-based practices. These actions contribute to a learning health system where new knowledge is captured as a by-product of care delivery enabled by knowledge-based electronic systems. The learning health system also relies on rigorous scientific evidence embedded into practice at the point of care. The nurse executive optimizes use of knowledge-based technologies, integrated throughout the organization, that have the capacity to help transform health care.

  14. Reuse: A knowledge-based approach

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui

    1992-01-01

    This paper describes our research in automating the reuse process through the use of application domain models. Application domain models are explicit formal representations of the application knowledge necessary to understand, specify, and generate application programs. Furthermore, they provide a unified repository for the operational structure, rules, policies, and constraints of a specific application area. In our approach, domain models are expressed in terms of a transaction-based meta-modeling language. This paper has described in detail the creation and maintenance of hierarchical structures. These structures are created through a process that includes reverse engineering of data models with supplementary enhancement from application experts. Source code is also reverse engineered but is not a major source of domain model instantiation at this time. In the second phase of the software synthesis process, program specifications are interactively synthesized from an instantiated domain model. These specifications are currently integrated into a manual programming process but will eventually be used to derive executable code with mechanically assisted transformations. This research is performed within the context of programming-in-the-large types of systems. Although our goals are ambitious, we are implementing the synthesis system in an incremental manner through which we can realize tangible results. The client/server architecture is capable of supporting 16 simultaneous X/Motif users and tens of thousands of attributes and classes. Domain models have been partially synthesized from five different application areas. As additional domain models are synthesized and additional knowledge is gathered, we will inevitably add to and modify our representation. However, our current experience indicates that it will scale and expand to meet our modeling needs.

  15. Integrating knowledge-based techniques into well-test interpretation

    SciTech Connect

    Harrison, I.W.; Fraser, J.L.

    1995-04-01

The goal of the Spirit Project was to develop a prototype of next-generation well-test-interpretation (WTI) software that would include knowledge-based decision support for the WTI model selection task. This paper describes how Spirit makes use of several different types of information (pressure, seismic, petrophysical, geological, and engineering) to support the user in identifying the most appropriate WTI model. Spirit's knowledge-based approach to type-curve matching is to generate several different feasible interpretations by making assumptions about the possible presence of both wellbore storage and late-time boundary effects. Spirit fuses information from type-curve matching and other data sources by use of a knowledge-based decision model developed in collaboration with a WTI expert. The sponsors of the work have judged the resulting prototype system a success.

  16. Web-Based Learning as a Tool of Knowledge Continuity

    ERIC Educational Resources Information Center

    Jaaman, Saiful Hafizah; Ahmad, Rokiah Rozita; Rambely, Azmin Sham

    2013-01-01

The explosion of information in a borderless world has prompted lecturers to move forward together with technological innovation and the growth of knowledge in performing their responsibility to educate the young generations to be able to stand above the crowd on the global scene. Teaching and learning through a web-based learning platform is a…

  17. Conversation Intention Perception based on Knowledge Base

    DTIC Science & Technology

    2014-05-01

2005. 4. Holger Kunz and Thorsten Schaaf. General and specific formalization approach for a balanced scorecard: An expert system with application in...consult in China. Health care systems and the modern health infrastructure play an essential role in recent years [4]. However, self-management for...large question and answer archives. In Proceedings of the 14th ACM international conference on Information and knowledge management, pages 84-90. ACM

  18. "Chromosome": a knowledge-based system for the chromosome classification.

    PubMed

    Ramstein, G; Bernadet, M

    1993-01-01

Chromosome, a knowledge-based analysis system, has been designed for the classification of human chromosomes. Its aim is to perform an optimal classification by driving a tool box containing procedures for image processing, pattern recognition, and classification. This paper presents the general architecture of Chromosome, based on a multiagent system generator. The image processing tool box is described, from metaphase enhancement to fine classification. Emphasis is then put on the knowledge base intended for chromosome recognition. The global classification process is also presented, showing how Chromosome proceeds to classify a given chromosome. Finally, we discuss further extensions of the system for karyotype building.

  19. A Natural Language Interface Concordant with a Knowledge Base.

    PubMed

    Han, Yong-Jin; Park, Seong-Bae; Park, Se-Young

    2016-01-01

The discordance between expressions interpretable by a natural language interface (NLI) system and those answerable by a knowledge base is a critical problem in the field of NLIs. In order to solve this discordance problem, this paper proposes a method to translate natural language questions into formal queries that can be generated from a graph-based knowledge base. The proposed method considers a subgraph of a knowledge base as a formal query. Thus, all formal queries corresponding to a concept or a predicate in the knowledge base can be generated prior to query time, and all possible natural language expressions corresponding to each formal query can also be collected in advance. A natural language expression has a one-to-one mapping with a formal query. Hence, a natural language question is translated into a formal query by matching the question with the most appropriate natural language expression. If the confidence of this matching is not sufficiently high, the proposed method rejects the question and does not answer it. Multipredicate queries are processed by regarding them as a set of collected expressions. The experimental results show that the proposed method thoroughly handles answerable questions from the knowledge base and rejects unanswerable ones effectively.
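The matching-with-rejection step described above can be sketched in a few lines. This is a minimal illustration, not the authors' system: the expression-to-query table, the string-similarity measure, and the 0.75 threshold are all assumptions.

```python
from difflib import SequenceMatcher

# Hypothetical expression-to-query table: each collected natural language
# expression maps one-to-one to a formal (subgraph) query. Both sides are
# invented examples, not the paper's knowledge base.
EXPRESSION_TO_QUERY = {
    "who directed {film}": "SELECT ?p WHERE { {film} :directedBy ?p }",
    "when was {film} released": "SELECT ?d WHERE { {film} :releaseDate ?d }",
}

def translate(question, threshold=0.75):
    """Match a question against the collected expressions; reject it
    (return None) when the best match is below the confidence threshold."""
    best_expr, best_score = None, 0.0
    for expr in EXPRESSION_TO_QUERY:
        score = SequenceMatcher(None, question.lower(), expr).ratio()
        if score > best_score:
            best_expr, best_score = expr, score
    if best_score < threshold:
        return None  # unanswerable: reject rather than guess
    return EXPRESSION_TO_QUERY[best_expr]

print(translate("who directed {film}"))        # close match -> formal query
print(translate("list all chemical elements"))  # no good match -> None
```

The rejection branch mirrors the paper's key design choice: an NLI over a fixed knowledge base is more useful when it declines unanswerable questions than when it guesses.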

  20. Bridging the gap: simulations meet knowledge bases

    NASA Astrophysics Data System (ADS)

    King, Gary W.; Morrison, Clayton T.; Westbrook, David L.; Cohen, Paul R.

    2003-09-01

    Tapir and Krill are declarative languages for specifying actions and agents, respectively, that can be executed in simulation. As such, they bridge the gap between strictly declarative knowledge bases and strictly executable code. Tapir and Krill components can be combined to produce models of activity which can answer questions about mechanisms and processes using conventional inference methods and simulation. Tapir was used in DARPA's Rapid Knowledge Formation (RKF) project to construct models of military tactics from the Army Field Manual FM3-90. These were then used to build Courses of Actions (COAs) which could be critiqued by declarative reasoning or via Monte Carlo simulation. Tapir and Krill can be read and written by non-knowledge engineers making it an excellent vehicle for Subject Matter Experts to build and critique knowledge bases.

  1. Knowledge-based Autonomous Test Engineer (KATE)

    NASA Technical Reports Server (NTRS)

    Parrish, Carrie L.; Brown, Barbara L.

    1991-01-01

    Mathematical models of system components have long been used to allow simulators to predict system behavior to various stimuli. Recent efforts to monitor, diagnose, and control real-time systems using component models have experienced similar success. NASA Kennedy is continuing the development of a tool for implementing real-time knowledge-based diagnostic and control systems called KATE (Knowledge based Autonomous Test Engineer). KATE is a model-based reasoning shell designed to provide autonomous control, monitoring, fault detection, and diagnostics for complex engineering systems by applying its reasoning techniques to an exchangeable quantitative model describing the structure and function of the various system components and their systemic behavior.

  2. Building a Knowledge Base for Teacher Education: An Experience in K-8 Mathematics Teacher Preparation

    ERIC Educational Resources Information Center

    Hiebert, James; Morris, Anne K.

    2009-01-01

    Consistent with the theme of this issue, we describe the details of one continuing effort to build knowledge for teacher education. We argue that building a useful knowledge base requires attention to the processes used to generate, record, and vet knowledge. By using 4 features of knowledge-building systems we identified in the introductory…

  3. Prior knowledge-based approach for associating ...

    EPA Pesticide Factsheets

Evaluating the potential human health and/or ecological risks associated with exposures to complex chemical mixtures in the ambient environment is one of the central challenges of chemical safety assessment and environmental protection. There is a need for approaches that can help to integrate chemical monitoring and bio-effects data to evaluate risks associated with chemicals present in the environment. We used prior knowledge about chemical-gene interactions to develop a knowledge assembly model for detected chemicals at five locations near two wastewater treatment plants. The assembly model was used to generate hypotheses about the biological impacts of the chemicals at each location. The hypotheses were tested using empirical hepatic gene expression data from fathead minnows exposed for 12 d at each location. Empirical gene expression data was also mapped to the assembly models to statistically evaluate the likelihood of a chemical contributing to the observed biological responses. The prior knowledge approach was able to reasonably hypothesize the biological impacts at one site but not the other. Chemicals most likely contributing to the observed biological responses were identified at each location. Despite limitations to the approach, knowledge assembly models have strong potential for associating chemical occurrence with potential biological effects and providing a foundation for hypothesis generation to guide research and/or monitoring efforts relat
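The hypothesis-generation step lends itself to a small sketch: assemble prior chemical-gene interactions for the detected chemicals, then score the overlap with empirically responsive genes. The chemicals, genes, and scoring below are invented for illustration, not the study's data.

```python
# Hedged sketch: build a site-level hypothesis (genes expected to respond)
# from prior chemical-gene interactions, then compare it with genes that
# actually responded. All names and values are illustrative assumptions.

PRIOR = {  # chemical -> genes it is known to interact with (assumed)
    "atrazine": {"cyp1a1", "vtg"},
    "bisphenol A": {"vtg", "esr1"},
}

def hypothesis(detected_chemicals):
    """Union of genes implicated by any detected chemical."""
    genes = set()
    for chem in detected_chemicals:
        genes |= PRIOR.get(chem, set())
    return genes

def overlap_score(predicted, observed):
    """Fraction of predicted genes that were actually responsive."""
    return len(predicted & observed) / len(predicted) if predicted else 0.0

pred = hypothesis(["atrazine", "bisphenol A"])
obs = {"vtg", "esr1", "mt"}  # responsive genes in exposed minnows (assumed)
print(round(overlap_score(pred, obs), 2))
```

A high overlap supports the assembled hypothesis for a site; a low one, as the abstract notes for one location, signals chemicals or pathways missing from the prior knowledge.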

  4. Knowledge-based diagnosis for aerospace systems

    NASA Technical Reports Server (NTRS)

    Atkinson, David J.

    1988-01-01

    The need for automated diagnosis in aerospace systems and the approach of using knowledge-based systems are examined. Research issues in knowledge-based diagnosis which are important for aerospace applications are treated along with a review of recent relevant research developments in Artificial Intelligence. The design and operation of some existing knowledge-based diagnosis systems are described. The systems described and compared include the LES expert system for liquid oxygen loading at NASA Kennedy Space Center, the FAITH diagnosis system developed at the Jet Propulsion Laboratory, the PES procedural expert system developed at SRI International, the CSRL approach developed at Ohio State University, the StarPlan system developed by Ford Aerospace, the IDM integrated diagnostic model, and the DRAPhys diagnostic system developed at NASA Langley Research Center.

  5. Knowledge-based scheduling of arrival aircraft

    NASA Technical Reports Server (NTRS)

    Krzeczowski, K.; Davis, T.; Erzberger, H.; Lev-Ram, I.; Bergh, C.

    1995-01-01

A knowledge-based method for scheduling arrival aircraft in the terminal area has been implemented and tested in real-time simulation. The scheduling system automatically sequences, assigns landing times, and assigns runways to arrival aircraft by utilizing continuous updates of aircraft radar data and controller inputs. The scheduling algorithm is driven by a knowledge base obtained in over two thousand hours of controller-in-the-loop real-time simulation. The knowledge base contains a series of hierarchical 'rules' and decision logic that examines both performance criteria, such as delay reduction, and workload reduction criteria, such as conflict avoidance. The objective of the algorithms is to devise an efficient plan to land the aircraft in a manner acceptable to the air traffic controllers. This paper will describe the scheduling algorithms, give examples of their use, and present data regarding their potential benefits to the air traffic system.
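A greatly simplified version of the core scheduling idea (sequence by arrival estimate, pick the runway that minimizes delay, enforce separation) can be sketched as follows. The separation value, runway names, and the greedy rule are assumptions standing in for the knowledge base's hierarchical rules.

```python
# Hedged sketch of arrival scheduling: land flights in ETA order while
# enforcing a minimum same-runway separation. The constants are invented.

SEPARATION = 90  # seconds between landings on the same runway (assumed)

def schedule(arrivals, runways=("27L", "27R")):
    """arrivals: list of (flight, eta_seconds). Returns (flight, runway, time)."""
    last = {r: -SEPARATION for r in runways}  # last landing time per runway
    plan = []
    for flight, eta in sorted(arrivals, key=lambda x: x[1]):
        # Pick the runway allowing the earliest landing (delay reduction),
        # a stand-in for the knowledge base's rules and decision logic.
        rwy = min(runways, key=lambda rw: max(eta, last[rw] + SEPARATION))
        t = max(eta, last[rwy] + SEPARATION)
        last[rwy] = t
        plan.append((flight, rwy, t))
    return plan

print(schedule([("AA1", 0), ("UA2", 30), ("DL3", 40)]))
```

The real system layers controller-workload criteria such as conflict avoidance on top of this delay-only objective; the sketch shows only the skeleton those rules refine.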

  6. Photography-based image generator

    NASA Astrophysics Data System (ADS)

    Dalton, Nicholas M.; Deering, Charles S.

    1989-09-01

    A two-channel Photography Based Image Generator system was developed to drive the Helmet Mounted Laser Projector at the Naval Training System Center at Orlando, Florida. This projector is a two-channel system that displays a wide field-of-view color image with a high-resolution inset to efficiently match the pilot's visual capability. The image generator is a derivative of the LTV-developed visual system installed in the A-7E Weapon System Trainer at NAS Cecil Field. The Photography Based Image Generator is based on patented LTV technology for high resolution, multi-channel, real world visual simulation. Special provisions were developed for driving the NTSC-developed and patented Helmet Mounted Laser Projector. These include a special 1023-line raster format, an electronic image blending technique, spherical lens mapping for dome projection, a special computer interface for head/eye tracking and flight parameters, special software, and a number of data bases. Good gaze angle tracking is critical to the use of the NTSC projector in a flight simulation environment. The Photography Based Image Generator provides superior dynamic response by performing a relatively simple perspective transformation on stored, high-detail photography instead of generating this detail by "brute force" computer image generation methods. With this approach, high detail can be displayed and updated at the television field rate (60 Hz).

  7. Knowledge-based commodity distribution planning

    NASA Technical Reports Server (NTRS)

    Saks, Victor; Johnson, Ivan

    1994-01-01

    This paper presents an overview of a Decision Support System (DSS) that incorporates Knowledge-Based (KB) and commercial off the shelf (COTS) technology components. The Knowledge-Based Logistics Planning Shell (KBLPS) is a state-of-the-art DSS with an interactive map-oriented graphics user interface and powerful underlying planning algorithms. KBLPS was designed and implemented to support skilled Army logisticians to prepare and evaluate logistics plans rapidly, in order to support corps-level battle scenarios. KBLPS represents a substantial advance in graphical interactive planning tools, with the inclusion of intelligent planning algorithms that provide a powerful adjunct to the planning skills of commodity distribution planners.

  8. Knowledge Base Refinement by Monitoring Abstract Control Knowledge.

    ERIC Educational Resources Information Center

    Wilkins, D. C.; And Others

    Arguing that an explicit representation of the problem-solving method of an expert system shell as abstract control knowledge provides a powerful foundation for learning, this paper describes the abstract control knowledge of the Heracles expert system shell for heuristic classification problems, and describes how the Odysseus apprenticeship…

  9. Knowledge Management and Professional Work: A Communication Perspective on the Knowledge-Based Organization.

    ERIC Educational Resources Information Center

    Heaton, Lorna; Taylor, James R.

    2002-01-01

    Challenges two common assumptions in the literature on Knowledge Management: that knowledge originates in the individual and that once made explicit, subsequent interpretation of the representations of knowledge in symbolic form is unproblematic. Argues that the key to understanding the generation and sharing of knowledge is the role of text as…

  10. Knowledge-based machine indexing from natural language text: Knowledge base design, development, and maintenance

    NASA Technical Reports Server (NTRS)

    Genuardi, Michael T.

    1993-01-01

    One strategy for machine-aided indexing (MAI) is to provide a concept-level analysis of the textual elements of documents or document abstracts. In such systems, natural-language phrases are analyzed in order to identify and classify concepts related to a particular subject domain. The overall performance of these MAI systems is largely dependent on the quality and comprehensiveness of their knowledge bases. These knowledge bases function to (1) define the relations between a controlled indexing vocabulary and natural language expressions; (2) provide a simple mechanism for disambiguation and the determination of relevancy; and (3) allow the extension of concept-hierarchical structure to all elements of the knowledge file. After a brief description of the NASA Machine-Aided Indexing system, concerns related to the development and maintenance of MAI knowledge bases are discussed. Particular emphasis is given to statistically-based text analysis tools designed to aid the knowledge base developer. One such tool, the Knowledge Base Building (KBB) program, presents the domain expert with a well-filtered list of synonyms and conceptually-related phrases for each thesaurus concept. Another tool, the Knowledge Base Maintenance (KBM) program, functions to identify areas of the knowledge base affected by changes in the conceptual domain (for example, the addition of a new thesaurus term). An alternate use of the KBM as an aid in thesaurus construction is also discussed.

  11. Improving structural similarity based virtual screening using background knowledge

    PubMed Central

    2013-01-01

Background: Virtual screening in the form of similarity rankings is often applied in the early drug discovery process to rank and prioritize compounds from a database. This similarity ranking can be achieved with structural similarity measures. However, their general nature can lead to insufficient performance in some application cases. In this paper, we provide a link between ranking-based virtual screening and fragment-based data mining methods. The inclusion of binding-relevant background knowledge into a structural similarity measure improves the quality of the similarity rankings. This background knowledge in the form of binding relevant substructures can either be derived by hand selection or by automated fragment-based data mining methods. Results: In virtual screening experiments we show that our approach clearly improves enrichment factors with both applied variants of our approach: the extension of the structural similarity measure with background knowledge in the form of a hand-selected relevant substructure or the extension of the similarity measure with background knowledge derived with data mining methods. Conclusion: Our study shows that adding binding relevant background knowledge can lead to significantly improved similarity rankings in virtual screening and that even basic data mining approaches can lead to competitive results making hand-selection of the background knowledge less crucial. This is especially important in drug discovery and development projects where no receptor structure is available or more frequently no verified binding mode is known and mostly ligand based approaches can be applied to generate hit compounds. PMID:24341870
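The idea of boosting a structural similarity score with binding-relevant background knowledge can be sketched with set-based fingerprints. The fragment names, the Tanimoto measure over sets, and the additive bonus are illustrative assumptions, not the paper's exact method.

```python
# Hedged sketch: Tanimoto similarity over substructure fingerprints
# (sets of fragment names), extended with background knowledge in the
# form of one binding-relevant fragment that earns a score bonus.

def tanimoto(a, b):
    """Standard Tanimoto/Jaccard similarity over two fragment sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def weighted_similarity(query, candidate, relevant, bonus=0.5):
    base = tanimoto(query, candidate)
    # Boost candidates that share the binding-relevant substructure.
    if relevant in query and relevant in candidate:
        base = min(1.0, base + bonus)
    return base

query = {"benzene", "amide", "sulfonyl"}
hits = {
    "cmpd1": {"benzene", "amide"},           # shares the relevant fragment
    "cmpd2": {"benzene", "ether", "ester"},  # similar overall but lacks it
}
ranked = sorted(hits, key=lambda c: weighted_similarity(query, hits[c], "amide"),
                reverse=True)
print(ranked)
```

The "amide" fragment here plays the role of a hand-selected relevant substructure; in the mining variant it would instead come from an automated fragment-discovery step.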

  12. Knowledge-Based Instructional Gaming: GEO.

    ERIC Educational Resources Information Center

    Duchastel, Philip

    1989-01-01

    Describes the design and development of an instructional game, GEO, in which the user learns elements of Canadian geography. The use of knowledge-based artificial intelligence techniques is discussed, the use of HyperCard in the design of GEO is explained, and future directions are suggested. (15 references) (Author/LRW)

  13. The adverse outcome pathway knowledge base

    EPA Science Inventory

    The rapid advancement of the Adverse Outcome Pathway (AOP) framework has been paralleled by the development of tools to store, analyse, and explore AOPs. The AOP Knowledge Base (AOP-KB) project has brought three independently developed platforms (Effectopedia, AOP-Wiki, and AOP-X...

  14. Improving the Knowledge Base in Teacher Education.

    ERIC Educational Resources Information Center

    Rockler, Michael J.

    Education in the United States for most of the last 50 years has built its knowledge base on a single dominating foundation--behavioral psychology. This paper analyzes the history of behaviorism. Syntheses are presented of the theories of Ivan P. Pavlov, J. B. Watson, and B. F. Skinner, all of whom contributed to the body of works on behaviorism.…

  15. Constructing Knowledge Bases: A Promising Instructional Tool.

    ERIC Educational Resources Information Center

    Trollip, Stanley R.; Lippert, Renate C.

    1987-01-01

    Argues that construction of knowledge bases is an instructional tool that encourages students' critical thinking in problem solving situations through metacognitive experiences. A study is described in which college students created expert systems to test the effectiveness of this method of instruction, and benefits for students and teachers are…

  16. An Ontology-Based Conceptual Model For Accumulating And Reusing Knowledge In A DMAIC Process

    NASA Astrophysics Data System (ADS)

    Nguyen, ThanhDat; Kifor, Claudiu Vasile

    2015-09-01

DMAIC (Define, Measure, Analyze, Improve, and Control) is an important process used to enhance the quality of processes based on knowledge. However, it is difficult to access DMAIC knowledge. Conventional approaches meet a problem arising from structuring and reusing DMAIC knowledge, mainly because DMAIC knowledge is not represented and organized systematically. In this article, we overcome this problem with a conceptual model that combines the DMAIC process, knowledge management, and ontology engineering. The main idea of our model is to utilize ontologies to represent the knowledge generated by each of the DMAIC phases. We build five different knowledge bases for storing all knowledge of the DMAIC phases with the support of necessary tools and appropriate techniques from the Information Technology area. Consequently, these knowledge bases make knowledge available to experts, managers, and web users during or after DMAIC execution in order to share and reuse existing knowledge.

  17. Knowledge-based GIS techniques applied to geological engineering

    USGS Publications Warehouse

    Usery, E. Lynn; Altheide, Phyllis; Deister, Robin R.P.; Barr, David J.

    1988-01-01

A knowledge-based geographic information system (KBGIS) approach which requires development of a rule base for both GIS processing and for the geological engineering application has been implemented. The rule bases are implemented in the Goldworks expert system development shell interfaced to the Earth Resources Data Analysis System (ERDAS) raster-based GIS for input and output. GIS analysis procedures including recoding, intersection, and union are controlled by the rule base, and the geological engineering map product is generated by the expert system. The KBGIS has been used to generate a geological engineering map of Creve Coeur, Missouri.
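The flavor of rule-driven GIS overlay can be sketched as a tiny intersection of two rasters with a per-cell suitability rule. The classes and rules below are invented for illustration and are not the Goldworks/ERDAS rule base.

```python
# Hedged sketch: intersect two 2x2 rasters (slope class, soil class) and
# let a small rule base assign an engineering-suitability class per cell.
# Class codes and rules are illustrative assumptions.

slope = [[1, 1], [2, 3]]   # 1=gentle, 2=moderate, 3=steep
soil  = [[1, 2], [2, 2]]   # 1=stable, 2=expansive clay

def classify(sl, so):
    """Rule base: steep slopes dominate; expansive soils need mitigation."""
    if sl == 3:
        return "unsuitable"   # steep slopes fail regardless of soil
    if so == 2:
        return "conditional"  # expansive soils require mitigation
    return "suitable"

result = [[classify(slope[r][c], soil[r][c]) for c in range(2)]
          for r in range(2)]
print(result)
```

In the actual system the recode/intersect/union steps run in the raster GIS and the expert system shell fires the rules; the sketch collapses both into one pass.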

  18. An Ebola virus-centered knowledge base.

    PubMed

    Kamdar, Maulik R; Dumontier, Michel

    2015-01-01

    Ebola virus (EBOV), of the family Filoviridae viruses, is a NIAID category A, lethal human pathogen. It is responsible for causing Ebola virus disease (EVD) that is a severe hemorrhagic fever and has a cumulative death rate of 41% in the ongoing epidemic in West Africa. There is an ever-increasing need to consolidate and make available all the knowledge that we possess on EBOV, even if it is conflicting or incomplete. This would enable biomedical researchers to understand the molecular mechanisms underlying this disease and help develop tools for efficient diagnosis and effective treatment. In this article, we present our approach for the development of an Ebola virus-centered Knowledge Base (Ebola-KB) using Linked Data and Semantic Web Technologies. We retrieve and aggregate knowledge from several open data sources, web services and biomedical ontologies. This knowledge is transformed to RDF, linked to the Bio2RDF datasets and made available through a SPARQL 1.1 Endpoint. Ebola-KB can also be explored using an interactive Dashboard visualizing the different perspectives of this integrated knowledge. We showcase how different competency questions, asked by domain users researching the druggability of EBOV, can be formulated as SPARQL Queries or answered using the Ebola-KB Dashboard.

  19. An Ebola virus-centered knowledge base

    PubMed Central

    Kamdar, Maulik R.; Dumontier, Michel

    2015-01-01

    Ebola virus (EBOV), of the family Filoviridae viruses, is a NIAID category A, lethal human pathogen. It is responsible for causing Ebola virus disease (EVD) that is a severe hemorrhagic fever and has a cumulative death rate of 41% in the ongoing epidemic in West Africa. There is an ever-increasing need to consolidate and make available all the knowledge that we possess on EBOV, even if it is conflicting or incomplete. This would enable biomedical researchers to understand the molecular mechanisms underlying this disease and help develop tools for efficient diagnosis and effective treatment. In this article, we present our approach for the development of an Ebola virus-centered Knowledge Base (Ebola-KB) using Linked Data and Semantic Web Technologies. We retrieve and aggregate knowledge from several open data sources, web services and biomedical ontologies. This knowledge is transformed to RDF, linked to the Bio2RDF datasets and made available through a SPARQL 1.1 Endpoint. Ebola-KB can also be explored using an interactive Dashboard visualizing the different perspectives of this integrated knowledge. We showcase how different competency questions, asked by domain users researching the druggability of EBOV, can be formulated as SPARQL Queries or answered using the Ebola-KB Dashboard. Database URL: http://ebola.semanticscience.org. PMID:26055098

  20. Knowledge Generation, Organization Dissemination and Utilization for Rural Development.

    ERIC Educational Resources Information Center

    Beal, George M.

    A "communication system" paradigm for dissemination of appropriate knowledge, information, and technology needed for effective rural development is briefly described. The paradigm describes six categories of interrelated functions, activities, and processes: (1) scientific knowledge production by carrying out basic and applied research; (2)…

  1. Generative Knowledge Interviewing: A Method for Knowledge Transfer and Talent Management at the University of Michigan

    ERIC Educational Resources Information Center

    Peet, Melissa R.; Walsh, Katherine; Sober, Robin; Rawak, Christine S.

    2010-01-01

    Experts and leaders within most fields possess knowledge that is largely tacit and unconscious in nature. The leaders of most organizations do not "know what they know" and cannot share their knowledge with others. The loss of this essential knowledge is of major concern to organizations. This study tested an innovative method of tacit…

  2. Satellite Contamination and Materials Outgassing Knowledge base

    NASA Technical Reports Server (NTRS)

    Minor, Jody L.; Kauffman, William J. (Technical Monitor)

    2001-01-01

Satellite contamination continues to be a design problem that engineers must take into account when developing new satellites. To help with this issue, NASA's Space Environments and Effects (SEE) Program funded the development of the Satellite Contamination and Materials Outgassing Knowledge base. This engineering tool brings together in one location information about the outgassing properties of aerospace materials based upon ground-testing data, the effects of outgassing that has been observed during flight and measurements of the contamination environment by on-orbit instruments. The knowledge base contains information using the ASTM Standard E-1559 and also consolidates data from missions using quartz-crystal microbalances (QCMs). The data contained in the knowledge base was shared with NASA by government agencies and industry in the US and international space agencies as well. The term 'knowledgebase' was used because so much information and capability was brought together in one comprehensive engineering design tool. It is the SEE Program's intent to continually add additional material contamination data as it becomes available - creating a dynamic tool whose value to the user is ever increasing. The SEE Program firmly believes that NASA, and ultimately the entire contamination user community, will greatly benefit from this new engineering tool and highly encourages the community to not only use the tool but add data to it as well.

  3. Presentation planning using an integrated knowledge base

    NASA Technical Reports Server (NTRS)

    Arens, Yigal; Miller, Lawrence; Sondheimer, Norman

    1988-01-01

    A description is given of user interface research aimed at bringing together multiple input and output modes in a way that handles mixed mode input (commands, menus, forms, natural language), interacts with a diverse collection of underlying software utilities in a uniform way, and presents the results through a combination of output modes including natural language text, maps, charts and graphs. The system, Integrated Interfaces, derives much of its ability to interact uniformly with the user and the underlying services and to build its presentations, from the information present in a central knowledge base. This knowledge base integrates models of the application domain (Navy ships in the Pacific region, in the current demonstration version); the structure of visual displays and their graphical features; the underlying services (data bases and expert systems); and interface functions. The emphasis is on a presentation planner that uses the knowledge base to produce multi-modal output. There has been a flurry of recent work in user interface management systems. (Several recent examples are listed in the references). Existing work is characterized by an attempt to relieve the software designer of the burden of handcrafting an interface for each application. The work has generally focused on intelligently handling input. This paper deals with the other end of the pipeline - presentations.

  4. Knowledge-Based Query Construction Using the CDSS Knowledge Base for Efficient Evidence Retrieval

    PubMed Central

    Afzal, Muhammad; Hussain, Maqbool; Ali, Taqdir; Hussain, Jamil; Khan, Wajahat Ali; Lee, Sungyoung; Kang, Byeong Ho

    2015-01-01

    Finding appropriate evidence to support clinical practices is always challenging, and the construction of a query to retrieve such evidence is a fundamental step. Typically, evidence is found using manual or semi-automatic methods, which are time-consuming and sometimes make it difficult to construct knowledge-based complex queries. To overcome the difficulty in constructing knowledge-based complex queries, we utilized the knowledge base (KB) of the clinical decision support system (CDSS), which has the potential to provide sufficient contextual information. To automatically construct knowledge-based complex queries, we designed methods to parse rule structure in KB of CDSS in order to determine an executable path and extract the terms by parsing the control structures and logic connectives used in the logic. The automatically constructed knowledge-based complex queries were executed on the PubMed search service to evaluate the results on the reduction of retrieved citations with high relevance. The average number of citations was reduced from 56,249 citations to 330 citations with the knowledge-based query construction approach, and relevance increased from 1 term to 6 terms on average. The ability to automatically retrieve relevant evidence maximizes efficiency for clinicians in terms of time, based on feedback collected from clinicians. This approach is generally useful in evidence-based medicine, especially in ambient assisted living environments where automation is highly important. PMID:26343669
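The rule-parsing and query-construction idea can be sketched on a toy rule. The rule structure, terms, and PubMed field tags below are assumptions, not the CDSS KB format used by the authors.

```python
# Hedged sketch: walk a (toy) CDSS rule, collect the concept terms along
# one executable path, and join them into a boolean PubMed-style query.

rule = {  # invented rule structure, not the paper's KB format
    "if": {"all": ["type 2 diabetes", "metformin"]},
    "then": {"recommend": "HbA1c monitoring"},
}

def extract_terms(rule):
    """Collect terms from the condition path and the recommendation."""
    terms = list(rule["if"]["all"])
    terms.append(rule["then"]["recommend"])
    return terms

def build_query(terms):
    """Join terms into a boolean query with a PubMed-style field tag."""
    return " AND ".join(f'"{t}"[All Fields]' for t in terms)

q = build_query(extract_terms(rule))
print(q)
```

Conjoining the contextual terms from the rule is what narrows retrieval: each added term restricts the citation set, which is how the reported reduction from tens of thousands of citations to a few hundred is achieved.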

  5. XML-Based SHINE Knowledge Base Interchange Language

    NASA Technical Reports Server (NTRS)

    James, Mark; Mackey, Ryan; Tikidjian, Raffi

    2008-01-01

    The SHINE Knowledge Base Interchange Language software has been designed to more efficiently send new knowledge bases to spacecraft that have been embedded with the Spacecraft Health Inference Engine (SHINE) tool. The intention of the behavioral model is to capture most of the information generally associated with a spacecraft functional model, while specifically addressing the needs of execution within SHINE and Livingstone. As such, it has some constructs that are based on one or the other.

  6. Knowledge Base Management for Model Management Systems.

    DTIC Science & Technology

    1983-06-01

interfaces as they relate to aspects of model base management. The focus of this study is to identify some organizations of knowledge about models… Vertical thinking is loosely related to systemic thinking, where one idea establishes a logical foundation upon which to construct the next idea… thinking is somewhat associated with creative thinking, and the idea of pattern matching from one circumstance to another. Mintzberg [Ref. 4] has related

  7. Clips as a knowledge based language

    NASA Technical Reports Server (NTRS)

    Harrington, James B.

    1987-01-01

    CLIPS is a language for writing expert systems applications on a personal or small computer. Here, the CLIPS programming language is described and compared to three other artificial intelligence (AI) languages (LISP, Prolog, and OPS5) with regard to the processing they provide for the implementation of a knowledge based system (KBS). A discussion is given on how CLIPS would be used in a control system.

  8. Wavelet-Based Grid Generation

    NASA Technical Reports Server (NTRS)

    Jameson, Leland

    1996-01-01

    Wavelets can provide a basis set in which the basis functions are constructed by dilating and translating a fixed function known as the mother wavelet. The mother wavelet can be seen as a high pass filter in the frequency domain. The process of dilating and expanding this high-pass filter can be seen as altering the frequency range that is 'passed' or detected. The process of translation moves this high-pass filter throughout the domain, thereby providing a mechanism to detect the frequencies or scales of information at every location. This is exactly the type of information that is needed for effective grid generation. This paper provides motivation to use wavelets for grid generation in addition to providing the final product: source code for wavelet-based grid generation.
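The core mechanism described above (wavelet detail coefficients flagging where fine grid spacing is needed) can be sketched with one level of the Haar transform. The sample data and refinement threshold are illustrative, not the paper's source code.

```python
# Hedged sketch: one level of the Haar wavelet transform over sampled
# function values. Large detail coefficients mark locations of sharp
# variation, exactly where a wavelet-based grid generator would refine.
import math

def haar_details(samples):
    """Detail coefficient for each adjacent pair: (a - b) / sqrt(2)."""
    return [(samples[2*i] - samples[2*i+1]) / math.sqrt(2)
            for i in range(len(samples) // 2)]

# A step-like profile: flat, then a jump, then flat again.
f = [0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
d = haar_details(f)
refine = [i for i, c in enumerate(d) if abs(c) > 0.1]
print(refine)  # only the pair straddling the jump needs refinement
```

This is the "high-pass filter translated across the domain" of the abstract: smooth regions yield near-zero details and keep a coarse grid, while the jump produces a large coefficient and triggers local refinement.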

  9. Knowledge-based systems in Japan

    NASA Technical Reports Server (NTRS)

    Feigenbaum, Edward; Engelmore, Robert S.; Friedland, Peter E.; Johnson, Bruce B.; Nii, H. Penny; Schorr, Herbert; Shrobe, Howard

    1994-01-01

    This report summarizes a study of the state-of-the-art in knowledge-based systems technology in Japan, organized by the Japanese Technology Evaluation Center (JTEC) under the sponsorship of the National Science Foundation and the Advanced Research Projects Agency. The panel visited 19 Japanese sites in March 1992. Based on these site visits plus other interactions with Japanese organizations, both before and after the site visits, the panel prepared a draft final report. JTEC sent the draft to the host organizations for their review. The final report was published in May 1993.

  10. Knowledge Base Refinement by Monitoring Abstract Control Knowledge. Revision 1.

    DTIC Science & Technology

    1987-08-01

Wilkins, D. C.; Clancey, W. J.; and Buchanan, B. G. Department of Computer Science, Stanford University, Stanford, CA 94305. Approved for public release; distribution unlimited. ONR Technical Report.

  11. Explanation-based knowledge acquisition of electronics

    NASA Astrophysics Data System (ADS)

    Kieras, David E.

    1992-08-01

    This is the final report of a project that examined how knowledge of practical electronics could be acquired from materials similar to those appearing in electronics training textbooks, from both an artificial intelligence perspective and an experimental psychology perspective. Practical electronics training materials present a series of basic circuits, each accompanied by an explanation of how the circuit performs the desired function. More complex circuits are then explained in terms of these basic circuits. This material thus presents schema knowledge for individual circuit types in the form of explanations of circuit behavior. Learning from such material would consist of first instantiating any applicable schemas, and then constructing a new schema based on the circuit structure and behavior described in the explanation. If the basic structure of the material is an effective approach to learning, learning about a new circuit should be easier when the relevant schemas are available than when they are not. This result was obtained both for an artificial intelligence system that used standard explanation-based learning mechanisms and for human learners in a laboratory setting, although the benefits of already having the relevant schemas were not large with these materials. The close examination of learning in this domain, and of the structure of its knowledge, should be useful to future cognitive analyses of training in technical domains.

  12. Advances in knowledge-based software engineering

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt

    1991-01-01

    The underlying hypothesis of this work is that a rigorous and comprehensive software reuse methodology can bring about a more effective and efficient utilization of constrained resources in the development of large-scale software systems by both government and industry. It is also believed that correct use of this type of software engineering methodology can significantly contribute to the higher levels of reliability that will be required of future operational systems. An overview and discussion of current research in the development and application of two systems that support a rigorous reuse paradigm are presented: the Knowledge-Based Software Engineering Environment (KBSEE) and the Knowledge Acquisition for the Preservation of Tradeoffs and Underlying Rationales (KAPTUR) systems. Emphasis is on a presentation of operational scenarios which highlight the major functional capabilities of the two systems.

  13. Bidirectional mereological reasoning in anatomical knowledge bases.

    PubMed Central

    Schulz, S.

    2001-01-01

    Mereological relationships--relationships between parts and wholes--are essential for ontological engineering in the anatomical domain. We propose a knowledge engineering approach that emulates mereological reasoning by taxonomic reasoning based on SEP triplets, a special data structure for the encoding of part-whole relations, which is fully embedded in the formal framework of standard description logics. We extend the SEP formalism in order to account not only for the part-of but also for the has-part relation, both being considered transitive in our domain. Furthermore we analyze the distinction between the ontological primitives singletons, collections and mass concepts in the anatomy domain and sketch how reasoning about these kinds of concepts can be accounted for in a knowledge representation language, using the extended SEP formalism. PMID:11825258
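    The bidirectional reasoning described can be illustrated outside the description-logic setting with a toy sketch (this is not the SEP-triplet encoding itself, and the anatomy chain is invented): part-of is stored as direct assertions, its transitive closure answers "what contains X?", and has-part is derived as the inverse relation.

    ```python
    # Direct part-of assertions (toy anatomy example).
    PART_OF = {
        "cell nucleus": "cell",
        "cell": "myocardium",
        "myocardium": "heart wall",
        "heart wall": "heart",
    }

    def all_wholes(entity):
        """Transitive part-of: every whole that contains the entity."""
        wholes = set()
        while entity in PART_OF:
            entity = PART_OF[entity]
            wholes.add(entity)
        return wholes

    def all_parts(entity):
        """Transitive has-part, derived as the inverse of part-of."""
        return {p for p in PART_OF if entity in all_wholes(p)}
    ```

    In the SEP approach this closure is not computed procedurally but emulated by the taxonomic (subsumption) reasoning of a standard description-logic classifier over the triplet structures.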

  14. Generative Adolescent Mathematical Learners: The Fabrication of Knowledge

    ERIC Educational Resources Information Center

    Lawler, Brian R.

    2008-01-01

    This dissertation is embedded in a deconstruction of the field of Mathematics Education in order to reconstitute the mathematics student as a generative mathematical learner. The purpose of the dissertation is to understand how generative adolescent mathematical learners (GAMLs) maneuver through their mathematics courses while maintaining such a…

  15. Examining the "Whole Child" to Generate Usable Knowledge

    ERIC Educational Resources Information Center

    Rappolt-Schlichtmann, Gabrielle; Ayoub, Catherine C.; Gravel, Jenna W.

    2009-01-01

    Despite the promise of scientific knowledge contributing to issues facing vulnerable children, families, and communities, typical approaches to research have made applications challenging. While contemporary theories of human development offer appropriate complexity, research has mostly failed to address dynamic developmental processes. Research…

  16. How Causal Knowledge Affects Classification: A Generative Theory of Categorization

    ERIC Educational Resources Information Center

    Rehder, Bob; Kim, ShinWoo

    2006-01-01

    Several theories have been proposed regarding how causal relations among features of objects affect how those objects are classified. The assumptions of these theories were tested in 3 experiments that manipulated the causal knowledge associated with novel categories. There were 3 results. The 1st was a multiple cause effect in which a feature's…

  17. Design of a knowledge-based welding advisor

    SciTech Connect

    Kleban, S.D.

    1996-06-01

    Expert system implementation can take numerous forms, ranging from traditional declarative rule-based systems with if-then syntax to imperative programming languages that capture expertise in procedural code. The artificial intelligence community generally thinks of expert systems as rules or rule-bases and an inference engine to process the knowledge. The welding advisor developed at Sandia National Laboratories and described in this paper deviates from this by codifying expertise using object representation and methods. Objects allow computer scientists to model the world as humans perceive it, giving us a very natural way to encode expert knowledge. The design of the welding advisor, which generates and evaluates solutions, will be compared and contrasted to a traditional rule-based system.

  18. A Collaborative Environment for Knowledge Base Development

    NASA Astrophysics Data System (ADS)

    Li, W.; Yang, C.; Raskin, R.; Nebert, D. D.; Wu, H.

    2009-12-01

    A Knowledge Base (KB) is an essential component for capturing, structuring, and defining the meanings of domain knowledge. It is important for enabling the sharing and interoperability of scientific data and services in a smart manner, and it is the foundation for most research in the semantic field, such as semantic reasoning and ranking. In collaboration with ESIP, GMU is developing an online interface and supporting infrastructure to allow semantic registration of datasets and other web resources. The semantic descriptions of data, services, and scientific content will be collected and transformed into the KB. As a case study, the harvest of web map services published by Nordic mapping agencies to build a virtual Arctic spatial data infrastructure will be used as the domain example. To automate the process, a controlled vocabulary for certain subjects, such as solid water, is created and used to filter existing data and service repositories to obtain a collection of closely related documents. Latent semantic indexing is then utilized to analyze semantic relationships among the concepts that appear in the service documents. Finally, the semantic structure found in the plain text is mapped and automatically populated into the specific representation of knowledge in the KB.
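    The controlled-vocabulary filtering step can be sketched as follows (an illustration, not the GMU implementation; the latent-semantic-indexing stage is omitted, and the vocabulary terms and sample service descriptions are invented):

    ```python
    # Controlled vocabulary for one subject, e.g. "solid water" (illustrative).
    VOCAB = {"snow", "ice", "glacier", "firn"}

    def matches(doc, vocab, min_hits=1):
        """True if the document mentions at least min_hits vocabulary terms."""
        words = set(doc.lower().split())
        return len(words & vocab) >= min_hits

    def filter_repository(docs, vocab):
        """Keep only service descriptions related to the subject."""
        return [d for d in docs if matches(d, vocab)]

    docs = [
        "WMS layer: glacier extent and snow cover for Svalbard",
        "WMS layer: cadastral parcels for Oslo",
        "WFS feature: sea ice concentration, Arctic Ocean",
    ]
    related = filter_repository(docs, VOCAB)  # keeps the two cryosphere layers
    ```

    The resulting collection of closely related documents is what the later semantic analysis would operate on.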

  19. Adaptive Knowledge Management of Project-Based Learning

    ERIC Educational Resources Information Center

    Tilchin, Oleg; Kittany, Mohamed

    2016-01-01

    The goal of an approach to Adaptive Knowledge Management (AKM) of project-based learning (PBL) is to intensify subject study through guiding, inducing, and facilitating the development of knowledge, accountability skills, and collaborative skills of students. Knowledge development is attained by knowledge acquisition, knowledge sharing, and knowledge…

  20. Knowledge-based public health situation awareness

    NASA Astrophysics Data System (ADS)

    Mirhaji, Parsa; Zhang, Jiajie; Srinivasan, Arunkumar; Richesson, Rachel L.; Smith, Jack W.

    2004-09-01

    There have been numerous efforts to create comprehensive databases from multiple sources to monitor the dynamics of public health, and most specifically to detect potential threats of bioterrorism before widespread dissemination. But there is little evidence for the assertion that these systems are timely and dependable, or that they can reliably distinguish man-made from natural incidents. One must weigh the value of so-called 'syndromic surveillance systems' against the costs involved in the design, development, implementation, and maintenance of such systems and in the investigation of the inevitable false alarms. In this article we introduce a new perspective on the problem domain, with a shift in paradigm from 'surveillance' toward 'awareness'. As we conceptualize a rather different approach to tackle the problem, we introduce a different methodology for applying concepts from information science, computer science, cognitive science, and human-computer interaction in the design and development of so-called 'public health situation awareness systems'. We share some of our design and implementation concepts for the prototype system under development in the Center for Biosecurity and Public Health Informatics Research at the University of Texas Health Science Center at Houston. The system is based on a knowledgebase containing ontologies, with different layers of abstraction, from multiple domains, that provide the context for information integration, knowledge discovery, interactive data mining, information visualization, information sharing, and communications. The modular design of the knowledgebase and its knowledge representation formalism enable incremental evolution of the system from a partial system to a comprehensive knowledgebase of 'public health situation awareness' as it acquires new knowledge through interactions with domain experts or automatic discovery of new knowledge.

  1. Irrelevance Reasoning in Knowledge Based Systems

    NASA Technical Reports Server (NTRS)

    Levy, A. Y.

    1993-01-01

    This dissertation considers the problem of reasoning about irrelevance of knowledge in a principled and efficient manner. Specifically, it is concerned with two key problems: (1) developing algorithms for automatically deciding what parts of a knowledge base are irrelevant to a query and (2) the utility of relevance reasoning. The dissertation describes a novel tool, the query-tree, for reasoning about irrelevance. Based on the query-tree, we develop several algorithms for deciding what formulas are irrelevant to a query. Our general framework sheds new light on the problem of detecting independence of queries from updates. We present new results that significantly extend previous work in this area. The framework also provides a setting in which to investigate the connection between the notion of irrelevance and the creation of abstractions. We propose a new approach to research on reasoning with abstractions, in which we investigate the properties of an abstraction by considering the irrelevance claims on which it is based. We demonstrate the potential of the approach for the cases of abstraction of predicates and projection of predicate arguments. Finally, we describe an application of relevance reasoning to the domain of modeling physical devices.

  2. Building a knowledge based economy in Russia using guided entrepreneurship

    NASA Astrophysics Data System (ADS)

    Reznik, Boris N.; Daniels, Marc; Ichim, Thomas E.; Reznik, David L.

    2005-06-01

    Despite advanced scientific and technological (S&T) expertise, the Russian economy is presently based upon manufacturing and raw material exports. Currently, governmental incentives are attempting to leverage the existing scientific infrastructure through the concept of building a Knowledge Based Economy. However, socio-economic changes do not occur solely by decree, but by alteration of approach to the market. Here we describe the "Guided Entrepreneurship" plan, a series of steps needed for generation of an army of entrepreneurs, which initiate a chain reaction of S&T-driven growth. The situation in Russia is placed in the framework of other areas where Guided Entrepreneurship has been successful.

  3. Applying knowledge compilation techniques to model-based reasoning

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    Researchers in the area of knowledge compilation are developing general purpose techniques for improving the efficiency of knowledge-based systems. In this article, an attempt is made to define knowledge compilation, to characterize several classes of knowledge compilation techniques, and to illustrate how some of these techniques can be applied to improve the performance of model-based reasoning systems.

  4. Towards a New Generation of Agricultural System Data, Models and Knowledge Products: Design and Improvement

    NASA Technical Reports Server (NTRS)

    Antle, John M.; Basso, Bruno; Conant, Richard T.; Godfray, H. Charles J.; Jones, James W.; Herrero, Mario; Howitt, Richard E.; Keating, Brian A.; Munoz-Carpena, Rafael; Rosenzweig, Cynthia

    2016-01-01

    This paper presents ideas for a new generation of agricultural system models that could meet the needs of a growing community of end-users exemplified by a set of Use Cases. We envision new data, models and knowledge products that could accelerate the innovation process that is needed to achieve the goal of achieving sustainable local, regional and global food security. We identify desirable features for models, and describe some of the potential advances that we envisage for model components and their integration. We propose an implementation strategy that would link a "pre-competitive" space for model development to a "competitive space" for knowledge product development and through private-public partnerships for new data infrastructure. Specific model improvements would be based on further testing and evaluation of existing models, the development and testing of modular model components and integration, and linkages of model integration platforms to new data management and visualization tools.

  5. Selection of Construction Methods: A Knowledge-Based Approach

    PubMed Central

    Skibniewski, Miroslaw

    2013-01-01

    The appropriate selection of construction methods to be used during the execution of a construction project is a major determinant of high productivity, but sometimes this selection process is performed without the care and the systematic approach that it deserves, bringing negative consequences. This paper proposes a knowledge management approach that will enable the intelligent use of corporate experience and information and help to improve the selection of construction methods for a project. A knowledge-based system to support this decision-making process is then proposed and described. To define and design the system, semistructured interviews were conducted within three construction companies with the purpose of studying the way that the method selection process is carried out in practice and the knowledge associated with it. A prototype of a Construction Methods Knowledge System (CMKS) was developed and then validated with construction industry professionals. As a conclusion, the CMKS was perceived as a valuable tool for construction method selection, helping companies to generate a corporate memory on this issue and reducing both the reliance on individual knowledge and the subjectivity of the decision-making process. The benefits provided by the system favor a better performance of construction projects. PMID:24453925

  6. Selection of construction methods: a knowledge-based approach.

    PubMed

    Ferrada, Ximena; Serpell, Alfredo; Skibniewski, Miroslaw

    2013-01-01

    The appropriate selection of construction methods to be used during the execution of a construction project is a major determinant of high productivity, but sometimes this selection process is performed without the care and the systematic approach that it deserves, bringing negative consequences. This paper proposes a knowledge management approach that will enable the intelligent use of corporate experience and information and help to improve the selection of construction methods for a project. A knowledge-based system to support this decision-making process is then proposed and described. To define and design the system, semistructured interviews were conducted within three construction companies with the purpose of studying the way that the method selection process is carried out in practice and the knowledge associated with it. A prototype of a Construction Methods Knowledge System (CMKS) was developed and then validated with construction industry professionals. As a conclusion, the CMKS was perceived as a valuable tool for construction method selection, helping companies to generate a corporate memory on this issue and reducing both the reliance on individual knowledge and the subjectivity of the decision-making process. The benefits provided by the system favor a better performance of construction projects.

  7. Model Based Analysis and Test Generation for Flight Software

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  8. Sustaining knowledge in the neutron generator community and benchmarking study. Phase II.

    SciTech Connect

    Huff, Tameka B.; Stubblefield, William Anthony; Cole, Benjamin Holland, II; Baldonado, Esther

    2010-08-01

    This report documents the second phase of work under the Sustainable Knowledge Management (SKM) project for the Neutron Generator organization at Sandia National Laboratories. Previous work under this project is documented in SAND2008-1777, Sustaining Knowledge in the Neutron Generator Community and Benchmarking Study. Knowledge management (KM) systems are necessary to preserve critical knowledge within organizations. A successful KM program should focus on people and the process for sharing, capturing, and applying knowledge. The Neutron Generator organization is developing KM systems to ensure knowledge is not lost. A benchmarking study involving site visits to outside industry plus additional resource research was conducted during this phase of the SKM project. The findings presented in this report are recommendations for making an SKM program successful. The recommendations are activities that promote sharing, capturing, and applying knowledge. The benchmarking effort, including the site visits to Toyota and Halliburton, provided valuable information on how the SEA KM team could incorporate a KM solution for not just the neutron generators (NG) community but the entire laboratory. The laboratory needs a KM program that allows members of the workforce to access, share, analyze, manage, and apply knowledge. KM activities, such as communities of practice (COP) and sharing best practices, provide a solution towards creating an enabling environment for KM. As more and more people leave organizations through retirement and job transfer, the need to preserve knowledge is essential. Creating an environment for the effective use of knowledge is vital to achieving the laboratory's mission.

  9. An object-based methodology for knowledge representation in SGML

    SciTech Connect

    Kelsey, R.L.; Hartley, R.T.; Webster, R.B.

    1997-11-01

    An object-based methodology for knowledge representation and its Standard Generalized Markup Language (SGML) implementation is presented. The methodology includes class, perspective domain, and event constructs for representing knowledge within an object paradigm. The perspective construct allows for representation of knowledge from multiple and varying viewpoints. The event construct allows actual use of knowledge to be represented. The SGML implementation of the methodology facilitates usability, structured, yet flexible knowledge design, and sharing and reuse of knowledge class libraries.

  10. Knowledge-based systems and NASA's software support environment

    NASA Technical Reports Server (NTRS)

    Dugan, Tim; Carmody, Cora; Lennington, Kent; Nelson, Bob

    1990-01-01

    A proposed role for knowledge-based systems within NASA's Software Support Environment (SSE) is described. The SSE is chartered to support all software development for the Space Station Freedom Program (SSFP). This includes support for development of knowledge-based systems and the integration of these systems with conventional software systems. In addition to the support of development of knowledge-based systems, various software development functions provided by the SSE will utilize knowledge-based systems technology.

  11. Automated knowledge base development from CAD/CAE databases

    NASA Technical Reports Server (NTRS)

    Wright, R. Glenn; Blanchard, Mary

    1988-01-01

    Knowledge base development requires a substantial investment in time, money, and resources in order to capture the knowledge and information necessary for anything other than trivial applications. This paper addresses a means to integrate the design and knowledge base development process through automated knowledge base development from CAD/CAE databases and files. Benefits of this approach include the development of a more efficient means of knowledge engineering, resulting in the timely creation of large knowledge based systems that are inherently free of error.

  12. Approach for ontological modeling of database schema for the generation of semantic knowledge on the web

    NASA Astrophysics Data System (ADS)

    Rozeva, Anna

    2015-11-01

    Currently there is a large quantity of content on web pages that is generated from relational databases. Conceptual domain models provide for the integration of heterogeneous content on the semantic level. The use of an ontology as the conceptual model of a relational data source makes the data available to web agents and services and provides for the employment of ontological techniques for data access, navigation, and reasoning. Achieving interoperability between relational databases and ontologies enriches the web with semantic knowledge. The establishment of a semantic database conceptual model based on ontology facilitates the development of data-integration systems that use the ontology as a unified global view. An approach for the generation of an ontologically based conceptual model is presented. The ontology representing the database schema is obtained by matching schema elements to ontology concepts. An algorithm for the matching process is designed. An infrastructure for the inclusion of mediation between database and ontology, bridging legacy data with formal semantic meaning, is presented. The knowledge modeling approach is implemented on a sample database.
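    The table-to-concept matching idea can be sketched as follows (an illustrative reading, not the paper's algorithm): each table maps to a class, plain columns to datatype properties, and foreign-key columns to object properties between classes. The sample schema and naming convention are assumptions.

    ```python
    def schema_to_ontology(schema):
        """Map {table: {column: foreign_table_or_None}} to ontology elements."""
        onto = {"classes": [], "datatype_properties": [], "object_properties": []}
        for table, cols in schema.items():
            cls = table.capitalize()
            onto["classes"].append(cls)
            for col, ref in cols.items():
                if ref:  # foreign key -> object property linking two classes
                    onto["object_properties"].append(
                        (f"has{ref.capitalize()}", cls, ref.capitalize()))
                else:    # plain column -> datatype property of the class
                    onto["datatype_properties"].append((col, cls))
        return onto

    schema = {
        "author": {"name": None, "born": None},
        "book":   {"title": None, "author_id": "author"},
    }
    onto = schema_to_ontology(schema)
    ```

    A mediation layer of the kind the paper describes would then rewrite ontology-level queries against `Book`/`hasAuthor` back into SQL over the legacy tables.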

  13. Building validation tools for knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Stachowitz, R. A.; Chang, C. L.; Stock, T. S.; Combs, J. B.

    1987-01-01

    The Expert Systems Validation Associate (EVA), a validation system under development at the Lockheed Artificial Intelligence Center for more than a year, provides a wide range of validation tools to check the correctness, consistency and completeness of a knowledge-based system. A declarative meta-language (higher-order language), is used to create a generic version of EVA to validate applications written in arbitrary expert system shells. The architecture and functionality of EVA are presented. The functionality includes Structure Check, Logic Check, Extended Structure Check (using semantic information), Extended Logic Check, Semantic Check, Omission Check, Rule Refinement, Control Check, Test Case Generation, Error Localization, and Behavior Verification.
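    Two of the checks named above can be illustrated with a toy rule base (the rule format is invented; EVA's actual shells and meta-language are not shown): a logic-style check for rules with identical premises but contradictory conclusions, and an omission-style "dead end" check for conclusions that are neither goals nor usable as premises of other rules.

    ```python
    # Toy rule base: premises are sets of facts, conclusions are single facts.
    rules = [
        {"if": {"fever", "rash"}, "then": "measles"},
        {"if": {"fever", "rash"}, "then": "not measles"},   # conflicts with rule 1
        {"if": {"cough"}, "then": "bronchitis"},
    ]
    goals = {"measles", "bronchitis"}

    def contradiction_check(rules):
        """Pairs of rules with identical premises and opposite conclusions."""
        pairs = []
        for i, a in enumerate(rules):
            for b in rules[i + 1:]:
                if a["if"] == b["if"] and b["then"] == "not " + a["then"]:
                    pairs.append((a["then"], b["then"]))
        return pairs

    def dead_end_check(rules, goals):
        """Conclusions that are neither goals nor premises of any rule."""
        premises = set().union(*(r["if"] for r in rules))
        return [r["then"] for r in rules
                if r["then"] not in goals and r["then"] not in premises]

    conflicts = contradiction_check(rules)
    dead_ends = dead_end_check(rules, goals)
    ```

    EVA's Extended Structure and Semantic Checks go further by using semantic information about the domain; this sketch covers only the purely syntactic cases.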

  14. Tools for Assembling and Managing Scalable Knowledge Bases

    DTIC Science & Technology

    2003-02-01

    [No abstract is available for this record; the entry consists of fragments of the report's table of contents and introduction. Recoverable topics: knowledge translation during knowledge-base and ontology construction, and the "knowledge translation problem" arising during knowledge-base merging operations.]

  15. The Relationship between Agriculture Knowledge Bases for Teaching and Sources of Knowledge

    ERIC Educational Resources Information Center

    Rice, Amber H.; Kitchel, Tracy

    2015-01-01

    The purpose of this study was to describe the agriculture knowledge bases for teaching of agriculture teachers and to see if a relationship existed between years of teaching experience, sources of knowledge, and development of pedagogical content knowledge (PCK), using quantitative methods. A model of PCK from mathematics was utilized as a…

  16. Proposing a Knowledge Base for Teaching Academic Content to English Language Learners: Disciplinary Linguistic Knowledge

    ERIC Educational Resources Information Center

    Turkan, Sultan; De Oliveira, Luciana C.; Lee, Okhee; Phelps, Geoffrey

    2014-01-01

    Background/Context: The current research on teacher knowledge and teacher accountability falls short on information about what teacher knowledge base could guide preparation and accountability of the mainstream teachers for meeting the academic needs of ELLs. Most recently, research on specialized knowledge for teaching has offered ways to…

  17. Linguistic Knowledge and Reasoning for Error Diagnosis and Feedback Generation.

    ERIC Educational Resources Information Center

    Delmonte, Rodolfo

    2003-01-01

    Presents four sets of natural language processing-based exercises for which error correction and feedback are produced by means of a rich database in which linguistic information is encoded either at the lexical or the grammatical level. (Author/VWL)

  18. Processing large sensor data sets for safeguards : the knowledge generation system.

    SciTech Connect

    Thomas, Maikel A.; Smartt, Heidi Anne; Matthews, Robert F.

    2012-04-01

    Modern nuclear facilities, such as reprocessing plants, present inspectors with significant challenges due in part to the sheer amount of equipment that must be safeguarded. The Sandia-developed and patented Knowledge Generation system was designed to automatically analyze large amounts of safeguards data to identify anomalous events of interest by comparing sensor readings with those expected from a process of interest and operator declarations. This paper describes a demonstration of the Knowledge Generation system using simulated accountability tank sensor data to represent part of a reprocessing plant. The demonstration indicated that Knowledge Generation has the potential to address several problems critical to the future of safeguards. It could be extended to facilitate remote inspections and trigger random inspections. Knowledge Generation could analyze data to establish trust hierarchies, to facilitate safeguards use of operator-owned sensors.

  19. The Effects of Domain Knowledge and Instructional Manipulation on Creative Idea Generation

    ERIC Educational Resources Information Center

    Hao, Ning

    2010-01-01

    The experiment was designed to explore the effects of domain knowledge, instructional manipulation, and the interaction between them on creative idea generation. Three groups of participants who respectively possessed the domain knowledge of biology, sports, or neither were asked to finish two tasks: imagining an extraterrestrial animal and…

  20. Route Generation for a Synthetic Character (BOT) Using a Partial or Incomplete Knowledge Route Generation Algorithm in UT2004 Virtual Environment

    NASA Technical Reports Server (NTRS)

    Hanold, Gregg T.; Hanold, David T.

    2010-01-01

    This paper presents a new Route Generation Algorithm that accurately and realistically represents human route planning and navigation for Military Operations in Urban Terrain (MOUT). The accuracy of this algorithm in representing human behavior is measured using the Unreal Tournament(Trademark) 2004 (UT2004) Game Engine to provide the simulation environment in which the differences between the routes taken by the human player and those of a Synthetic Agent (BOT) executing the A-star algorithm and the new Route Generation Algorithm can be compared. The new Route Generation Algorithm computes the BOT route based on partial or incomplete knowledge received from the UT2004 game engine during game play. To allow BOT navigation to occur continuously throughout the game play with incomplete knowledge of the terrain, a spatial network model of the UT2004 MOUT terrain is captured and stored in an Oracle 11g Spatial Data Object (SDO). The SDO allows a partial data query to be executed to generate continuous route updates based on the terrain knowledge and the stored dynamic BOT, player, and environmental parameters returned by the query. The partial data query permits the dynamic adjustment of the planned routes by the Route Generation Algorithm based on the current state of the environment during a simulation. The dynamic nature of this algorithm allows the BOT to more accurately mimic the routes taken by a human under the same conditions, thereby improving the realism of the BOT in a MOUT simulation environment.
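    The replanning loop described above can be sketched as follows (a hedged approximation: the grid, sensing model, and function names are invented, and the spatial database query is stood in for by a `sense` callback). The agent plans optimistically with A*, assuming unsensed cells are passable, takes one step, and replans as obstacles are revealed:

    ```python
    import heapq

    def astar(start, goal, passable):
        """A* over a 4-connected grid of passable cells (Manhattan heuristic)."""
        h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
        frontier, seen = [(h(start), 0, start, [start])], set()
        while frontier:
            _, g, node, path = heapq.heappop(frontier)
            if node == goal:
                return path
            if node in seen:
                continue
            seen.add(node)
            x, y = node
            for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if nxt in passable and nxt not in seen:
                    heapq.heappush(frontier, (g + 1 + h(nxt), g + 1, nxt, path + [nxt]))
        return None  # unreachable given current knowledge

    def navigate(start, goal, cells, sense):
        """Replan from each position as new terrain knowledge arrives."""
        blocked, pos, route = set(), start, [start]
        while pos != goal:
            blocked |= sense(pos)                  # partial knowledge arrives here
            plan = astar(pos, goal, cells - blocked)
            if plan is None:
                return None
            pos = plan[1]                          # take one step, then replan
            route.append(pos)
        return route

    # 3x3 area with an unseen wall; obstacles are sensed only when adjacent.
    cells = {(x, y) for x in range(3) for y in range(3)}
    walls = {(1, 0), (1, 1)}
    sense = lambda p: {w for w in walls
                       if abs(w[0] - p[0]) + abs(w[1] - p[1]) <= 1}
    route = navigate((0, 0), (2, 0), cells, sense)
    ```

    Because knowledge arrives incrementally, the executed route detours around the wall only after sensing it, which is the kind of human-like adjustment the paper contrasts with a BOT that plans once over complete terrain knowledge.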

  1. [Trends on generation and reproduction of knowledge about economic evaluation and health].

    PubMed

    Arredondo, A; Parada, I

    2001-08-01

    This paper identifies the trends and recent progress in the generation and reproduction of knowledge on health economic evaluation. Analysis is organized along nine public health action fields, namely: health determinants and predictors, economic value of health, healthcare demand, healthcare supply, microeconomic evaluation of healthcare, healthcare market balance, evaluation of policy instruments, general evaluation of the health system, and healthcare planning, regulation and supervision. Each action field is defined to place the reader in the proper setting and level of analysis. In addition, thematic research topics developed in each action field are proposed and discussed. The generation and reproduction of knowledge on the different action fields was based on the review of the bibliographic databases MEDLINE and LILACS for the 1992-2000 period. Results lead to the conclusion that development and application of economic evaluation of healthcare has been uneven across different countries and that there is a growing increase of applications starting in 1994, the year of initiation of healthcare reform in Latin America.

  2. Generating a mortality model from a pediatric ICU (PICU) database utilizing knowledge discovery.

    PubMed Central

    Kennedy, Curtis E.; Aoki, Noriaki

    2002-01-01

    Current models for predicting outcomes are limited by biases inherent in a priori hypothesis generation. Knowledge discovery algorithms generate models directly from databases, minimizing such limitations. Our objective was to generate a mortality model from a PICU database utilizing knowledge discovery techniques. The database contained 5067 records with 192 clinically relevant variables. It was randomly split into training (75%) and validation (25%) groups. We used decision tree induction to generate a mortality model from the training data, and validated its performance on the validation data. The original PRISM algorithm was used for comparison. The decision tree model contained 25 variables and predicted 53/88 deaths; 29 correctly (Sens:33%, Spec:98%, PPV:54%). PRISM predicted 27/88 deaths correctly (Sens:30%, Spec:98%, PPV:51%). Performance difference between models was not significant. We conclude that knowledge discovery algorithms can generate a mortality model from a PICU database, helping establish validity of such tools in the clinical medical domain. PMID:12463850
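
    The reported figures can be reproduced with simple confusion-matrix arithmetic. A sketch, assuming the validation set is 25% of the 5067 records (about 1267); the abstract's 54% PPV appears to be truncated rather than rounded:

```python
# Confusion-matrix arithmetic behind the reported decision-tree figures.
predicted_deaths = 53      # deaths predicted by the tree
true_positives = 29        # of those, correct
actual_deaths = 88         # deaths in the validation set

validation_n = round(5067 * 0.25)                    # ~1267 records (assumed)
false_positives = predicted_deaths - true_positives  # 24
survivors = validation_n - actual_deaths             # 1179
true_negatives = survivors - false_positives         # 1155

sensitivity = true_positives / actual_deaths   # ~0.33
specificity = true_negatives / survivors       # ~0.98
ppv = true_positives / predicted_deaths        # ~0.547, reported as 54%

assert round(sensitivity, 2) == 0.33
assert round(specificity, 2) == 0.98
assert round(ppv, 2) == 0.55
```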

  3. Knowledge-based reusable software synthesis system

    NASA Technical Reports Server (NTRS)

    Donaldson, Cammie

    1989-01-01

    The Eli system, a knowledge-based reusable software synthesis system, is being developed for NASA Langley under a Phase 2 SBIR contract. Named after Eli Whitney, the inventor of interchangeable parts, Eli assists engineers of large-scale software systems in reusing components while they are composing their software specifications or designs. Eli will identify reuse potential, search for components, select component variants, and synthesize components into the developer's specifications. The Eli project began as a Phase 1 SBIR to define a reusable software synthesis methodology that integrates reusability into the top-down development process and to develop an approach for an expert system to promote and accomplish reuse. The objectives of the Eli Phase 2 work are to integrate advanced technologies to automate the development of reusable components within the context of large system developments, to integrate with user development methodologies without significant changes in method or learning of special languages, and to make reuse the easiest operation to perform. Eli will try to address a number of reuse problems including developing software with reusable components, managing reusable components, identifying reusable components, and transitioning reuse technology. Eli is both a library facility for classifying, storing, and retrieving reusable components and a design environment that emphasizes, encourages, and supports reuse.

  4. IGENPRO knowledge-based operator support system.

    SciTech Connect

    Morman, J. A.

    1998-07-01

    Research and development is being performed on the knowledge-based IGENPRO operator support package for plant transient diagnostics and management to provide operator assistance during off-normal plant transient conditions. A generic thermal-hydraulic (T-H) first-principles approach is being implemented using automated reasoning, artificial neural networks and fuzzy logic to produce a generic T-H system-independent/plant-independent package. The IGENPRO package has a modular structure composed of three modules: the transient trend analysis module PROTREN, the process diagnostics module PRODIAG and the process management module PROMANA. Cooperative research and development work has focused on the PRODIAG diagnostic module of the IGENPRO package and the operator training matrix of transients used at the Braidwood Pressurized Water Reactor station. Promising simulator testing results with PRODIAG have been obtained for the Braidwood Chemical and Volume Control System (CVCS) and the Component Cooling Water System. Initial CVCS test results have also been obtained for the PROTREN module. The PROMANA effort also involves the CVCS. Future work will focus on long-term, slow and mild degradation transients, where diagnoses of incipient T-H component failure prior to forced outage events are required. This will enhance the capability of the IGENPRO system as a predictive maintenance tool for plant staff and operator support.

  5. DeepDive: Declarative Knowledge Base Construction

    PubMed Central

    De Sa, Christopher; Ratner, Alex; Ré, Christopher; Shin, Jaeho; Wang, Feiran; Wu, Sen; Zhang, Ce

    2016-01-01

    The dark data extraction or knowledge base construction (KBC) problem is to populate a SQL database with information from unstructured data sources including emails, webpages, and pdf reports. KBC is a long-standing problem in industry and research that encompasses problems of data extraction, cleaning, and integration. We describe DeepDive, a system that combines database and machine learning ideas to help develop KBC systems. The key idea in DeepDive is that statistical inference and machine learning are key tools to attack classical data problems in extraction, cleaning, and integration in a unified and more effective manner. DeepDive programs are declarative in that one cannot write probabilistic inference algorithms; instead, one interacts by defining features or rules about the domain. A key reason for this design choice is to enable domain experts to build their own KBC systems. We present the applications, abstractions, and techniques of DeepDive employed to accelerate construction of KBC systems. PMID:28344371
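
    The declarative style DeepDive describes, where users write features or rules and a generic inference layer does the scoring, can be caricatured in a few lines. The rules, weights, and domain below are invented for the sketch and are not DeepDive's actual language; in DeepDive, programs are written in a Datalog-like syntax and weights are learned rather than fixed:

```python
import math
import re

# User-written, declarative side: rules that emit features for a
# candidate "born-in" fact found in a sentence.
RULES = [
    ("mentions_born", lambda s: "born in" in s),
    ("mentions_city", lambda s: bool(re.search(r"\b(Paris|London|Rome)\b", s))),
]

# Generic inference side: a fixed logistic model over whichever
# features fire (illustrative weights).
WEIGHTS = {"mentions_born": 2.0, "mentions_city": 1.5}
BIAS = -2.5

def score(sentence):
    """Probability that the sentence supports a birthplace fact."""
    z = BIAS + sum(WEIGHTS[name] for name, rule in RULES if rule(sentence))
    return 1 / (1 + math.exp(-z))

assert score("Marie Curie was born in Paris.") > 0.5
assert score("The committee met on Tuesday.") < 0.5
```

    The point of the design choice mirrored here is that the rule author never touches the inference code; adding domain knowledge means adding a rule, not editing an algorithm.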

  6. [Precision Nursing: Individual-Based Knowledge Translation].

    PubMed

    Chiang, Li-Chi; Yeh, Mei-Ling; Su, Sui-Lung

    2016-12-01

    U.S. President Obama announced a new era of precision medicine in the Precision Medicine Initiative (PMI). This initiative aims to accelerate the progress of personalized medicine in light of individual requirements for prevention and treatment in order to improve the state of individual and public health. The recent and dramatic development of large-scale biologic databases (such as the human genome sequence), powerful methods for characterizing patients (such as genomics, microbiome, diverse biomarkers, and even pharmacogenomics), and computational tools for analyzing big data are maximizing the potential benefits of precision medicine. Nursing science should follow and keep pace with this trend in order to develop empirical knowledge and expertise in the area of personalized nursing care. Nursing scientists must encourage, examine, and put into practice innovative research on precision nursing in order to provide evidence-based guidance to clinical practice. The applications in personalized precision nursing care include: explanations of personalized information such as the results of genetic testing; patient advocacy and support; anticipation of results and treatment; ongoing chronic monitoring; and support for shared decision-making throughout the disease trajectory. Further, attention must focus on the family and the ethical implications of taking a personalized approach to care. Nurses will need to embrace the paradigm shift to precision nursing and work collaboratively across disciplines to provide the optimal personalized care to patients. If realized, the full potential of precision nursing will provide the best chance for good health for all.

  7. Organizational culture and knowledge management in the electric power generation industry

    NASA Astrophysics Data System (ADS)

    Mayfield, Robert D.

    Scarcity of knowledge and expertise is a challenge in the electric power generation industry. Today's most pervasive knowledge issues result from employee turnover and the constant movement of employees from project to project inside organizations. To address this scarcity, organizations must enable employees to capture, transfer, and use mission-critical explicit and tacit knowledge. The purpose of this qualitative grounded theory research was to examine the relationship between and among organizations within the electric power generation industry developing knowledge management processes designed to retain, share, and use the industry, institutional, and technical knowledge upon which the organizations depend. The research findings show that knowledge management is a business problem within the domain of information systems and management. The risks associated with losing mission-critical knowledge can be measured using metrics on employee retention, recruitment, productivity, training and benchmarking. Certain enablers must be in place in order to engage people, encourage cooperation, create a knowledge-sharing culture, and, ultimately, change behavior. The research revealed the following change enablers that support knowledge management strategies: (a) training - blended learning, (b) communities of practice, (c) cross-functional teams, (d) rewards and recognition programs, (e) active senior management support, (f) communication and awareness, (g) succession planning, and (h) team organizational culture.

  8. Joint Knowledge Generation Between Climate Science and Infrastructure Engineering

    NASA Astrophysics Data System (ADS)

    Stoner, A. M. K.; Hayhoe, K.; Jacobs, J. M.

    2015-12-01

    Over the past decade the engineering community has become increasingly aware of the need to incorporate climate projections into the planning and design of sensitive infrastructure. However, this is a task that is easier said than done. This presentation will discuss some of the successes and hurdles experienced over the past year, from a climate scientist's perspective, working with engineers in infrastructure research and applied engineering through the Infrastructure & Climate Network (ICNet). Engineers rely on strict building codes and ordinances, and can be the subject of lawsuits if those codes are not followed. Matters are further complicated by the uncertainty inherent to climate projections, which includes short-term natural variability as well as the influence of scientific uncertainty and even human behavior on the rate and magnitude of change. Climate scientists typically address uncertainty by creating projections based on multiple models following different future scenarios. This uncertainty is difficult to incorporate into engineering projects, however, because engineers cannot build two different bridges, one allowing for a lower amount of change and another for a higher amount. More often than not there is a considerable difference between the costs of two such designs, which means that available funds often become the deciding factor. Discussions of climate science are often well received by engineers who work in infrastructure research; going a step further and implementing it in applied engineering projects, however, can be challenging. This presentation will discuss some of the challenges and opportunities inherent to collaborations between climate scientists and transportation engineers, drawing from a range of studies including truck weight restrictions on roads during the spring thaw and bridge deck performance under environmental forcings.

  9. Establishing a national knowledge translation and generation network in kidney disease: the CAnadian KidNey KNowledge TraNslation and GEneration NeTwork.

    PubMed

    Manns, Braden; Barrett, Brendan; Evans, Michael; Garg, Amit; Hemmelgarn, Brenda; Kappel, Joanne; Klarenbach, Scott; Madore, Francois; Parfrey, Patrick; Samuel, Susan; Soroka, Steven; Suri, Rita; Tonelli, Marcello; Wald, Ron; Walsh, Michael; Zappitelli, Michael

    2014-01-01

    Patients with chronic kidney disease (CKD) do not always receive care consistent with guidelines, in part due to complexities in CKD management, lack of randomized trial data to inform care, and a failure to disseminate best practice. At a 2007 conference of key Canadian stakeholders in kidney disease, attendees noted that the impact of Canadian Society of Nephrology (CSN) guidelines was attenuated given limited formal linkages between the CSN Clinical Practice Guidelines Group, kidney researchers, decision makers and knowledge users, and that further knowledge was required to guide care in patients with kidney disease. The idea for the Canadian Kidney Knowledge Translation and Generation Network (CANN-NET) developed from this meeting. CANN-NET is a pan-Canadian network established in partnership with CSN, the Kidney Foundation of Canada and other professional societies to improve the care and outcomes of patients with and at risk for kidney disease. The initial priority areas for knowledge translation include improving optimal timing of dialysis initiation, and increasing the appropriate use of home dialysis. Given the urgent need for new knowledge, CANN-NET has also brought together a national group of experienced Canadian researchers to address knowledge gaps by encouraging and supporting multicentre randomized trials in priority areas, including management of cardiovascular disease in patients with kidney failure.

  10. A framework for knowledge acquisition, representation and problem-solving in knowledge-based planning

    NASA Astrophysics Data System (ADS)

    Martinez-Bermudez, Iliana

    This research addresses the problem of developing planning knowledge-based applications. In particular, it is concerned with the problems of knowledge acquisition and representation---the issues that remain an impediment to the development of large-scale, knowledge-based planning applications. This work aims to develop a model of planning problem solving that facilitates expert knowledge elicitation and also supports effective problem solving. Achieving this goal requires determining the types of knowledge used by planning experts, the structure of this knowledge, and the problem-solving process that results in the plan. While answering these questions, it became clear that the knowledge structure, as well as the process of problem solving, largely depends on the knowledge available to the expert. This dissertation proposes a classification of planning problems based on their use of expert knowledge. Such a classification can help in the selection of the appropriate planning method when dealing with a specific planning problem. The research concentrates on one of the identified classes of planning problems, characterized by well-defined and well-structured problem-solving knowledge. To achieve a more complete knowledge representation architecture for such problems, this work employs the task-specific approach to problem solving. The result of this endeavor is a task-specific methodology that allows the representation and use of planning knowledge in a structured, consistent manner specific to the domain of the application. A shell for building knowledge-based planning applications was created as a proof of concept for the methodology described in this dissertation. This shell enabled the development of a system for manufacturing planning---COMPLAN. COMPLAN encompasses knowledge related to four generic techniques used in composite material manufacturing and, given the description of a composite part, creates a family of plans capable of producing it.

  11. Case-based reasoning: The marriage of knowledge base and data base

    NASA Technical Reports Server (NTRS)

    Pulaski, Kirt; Casadaban, Cyprian

    1988-01-01

    The coupling of data and knowledge has a synergistic effect when building an intelligent data base. The goal is to integrate the data and knowledge almost to the point of indistinguishability, permitting them to be used interchangeably. Examples given in this paper suggest that Case-Based Reasoning is a more integrated way to link data and knowledge than pure rule-based reasoning.
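
    The point that cases serve simultaneously as data records and as reasoning knowledge can be sketched with a toy retrieve-and-reuse loop. The troubleshooting domain, attributes, and similarity measure are invented for illustration:

```python
# Minimal case-based reasoning sketch: stored cases double as database
# rows (the feature dicts) and as problem-solving knowledge (retrieval
# by similarity reuses the nearest case's solution).
CASES = [
    ({"fault": "no_power", "led": "off"}, "check power supply"),
    ({"fault": "no_power", "led": "blinking"}, "reseat memory"),
    ({"fault": "overheat", "led": "on"}, "clean fan"),
]

def similarity(a, b):
    """Count of matching attribute values (a crude similarity measure)."""
    return sum(1 for k in a if b.get(k) == a[k])

def solve(problem):
    """Retrieve the most similar stored case and reuse its solution."""
    features, solution = max(CASES, key=lambda c: similarity(problem, c[0]))
    return solution

assert solve({"fault": "no_power", "led": "blinking"}) == "reseat memory"
assert solve({"fault": "overheat", "led": "on"}) == "clean fan"
```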

  12. A Knowledge-Based System Developer for aerospace applications

    NASA Technical Reports Server (NTRS)

    Shi, George Z.; Wu, Kewei; Fensky, Connie S.; Lo, Ching F.

    1993-01-01

    A prototype Knowledge-Based System Developer (KBSD) has been developed for aerospace applications by utilizing artificial intelligence technology. The KBSD directly acquires knowledge from domain experts through a graphical interface, then builds expert systems from that knowledge. This raises the state of the art of knowledge acquisition/expert system technology to a new level by lessening the need for skilled knowledge engineers. The feasibility, applicability, and efficiency of the proposed concept were established, justifying a continuation that would develop the prototype into a full-scale, general-purpose knowledge-based system developer. The KBSD has great commercial potential. It will provide a marketable software shell which alleviates the need for knowledge engineers and increases productivity in the workplace. The KBSD will therefore make knowledge-based systems available to a large portion of industry.

  13. Creating a knowledge base of biological research papers

    SciTech Connect

    Hafner, C.D.; Baclawski, K.; Futrelle, R.P.; Fridman, N.

    1994-12-31

    Intelligent text-oriented tools for representing and searching the biological research literature are being developed, which combine object-oriented databases with artificial intelligence techniques to create a richly structured knowledge base of Materials and Methods sections of biological research papers. A knowledge model of experimental processes, biological and chemical substances, and analytical techniques is described, based on the representation techniques of taxonomic semantic nets and knowledge frames. Two approaches to populating the knowledge base with the contents of biological research papers are described: natural language processing and an interactive knowledge definition tool.
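
    The two representation techniques the abstract names, taxonomic semantic nets and knowledge frames, combine naturally: slot lookups inherit along the is-a chain. A minimal sketch; the taxonomy and slot names are invented examples, not the paper's actual schema:

```python
# Taxonomic semantic net: child concept -> parent concept (is-a links).
ISA = {
    "gel_electrophoresis": "analytical_technique",
    "pcr": "amplification_technique",
    "amplification_technique": "experimental_process",
    "analytical_technique": "experimental_process",
}

# Knowledge frames: concept -> slot/value pairs.
FRAMES = {
    "experimental_process": {"has_inputs": True},
    "analytical_technique": {"produces": "measurement"},
    "gel_electrophoresis": {"medium": "agarose"},
}

def is_a(concept, ancestor):
    """Walk the is-a chain upward looking for the ancestor."""
    while concept is not None:
        if concept == ancestor:
            return True
        concept = ISA.get(concept)
    return False

def slot(concept, name):
    """Look up a slot value, inheriting along the is-a chain."""
    while concept is not None:
        if name in FRAMES.get(concept, {}):
            return FRAMES[concept][name]
        concept = ISA.get(concept)
    return None

assert is_a("gel_electrophoresis", "experimental_process")
assert slot("gel_electrophoresis", "produces") == "measurement"  # inherited
assert slot("pcr", "has_inputs") is True                         # inherited
```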

  14. Knowledge Generation

    SciTech Connect

    BRABSON,JOHN M.; DELAND,SHARON M.

    2000-11-02

    Unattended monitoring systems are being studied as a means of reducing both the cost and intrusiveness of present nuclear safeguards approaches. Such systems present the classic information overload problem to anyone trying to interpret the resulting data not only because of the sheer quantity of data but also because of the problems inherent in trying to correlate information from more than one source. As a consequence, analysis efforts to date have mostly concentrated on checking thresholds or diagnosing failures. Clearly more sophisticated analysis techniques are required to enable automated verification of expected activities level concepts in order to make automated judgments about safety, sensor system integrity, sensor data quality, diversion, and accountancy.

  15. System Engineering for the NNSA Knowledge Base

    NASA Astrophysics Data System (ADS)

    Young, C.; Ballard, S.; Hipp, J.

    2006-05-01

    To improve ground-based nuclear explosion monitoring capability, GNEM R&E (Ground-based Nuclear Explosion Monitoring Research & Engineering) researchers at the national laboratories have collected an extensive set of raw data products. These raw data are used to develop higher-level products (e.g. 2D and 3D travel time models) to better characterize the Earth at regional scales. The processed products and selected portions of the raw data are stored in an archiving and access system known as the NNSA (National Nuclear Security Administration) Knowledge Base (KB), which is engineered to meet the requirements of operational monitoring authorities. At its core, the KB is a data archive, and the effectiveness of the KB is ultimately determined by the quality of the data content, but access to that content is completely controlled by the information system in which that content is embedded. Developing this system has been the task of Sandia National Laboratories (SNL), and in this paper we discuss some of the significant challenges we have faced and the solutions we have engineered. One of the biggest system challenges with raw data has been integrating database content from the various sources to yield an overall KB product that is comprehensive, thorough and validated, yet minimizes the amount of disk storage required. Researchers at different facilities often use the same data to develop their products, and this redundancy must be removed in the delivered KB, ideally without requiring any additional effort on the part of the researchers. Further, related data content must be grouped together for KB user convenience. Initially SNL used whatever tools were already available for these tasks and performed the remaining tasks manually. The ever-growing volume of KB data to be merged, as well as a need for more control of merging utilities, led SNL to develop our own Java software package, consisting of a low-level database utility library upon which we have built several

  16. Agent-Based Knowledge Discovery for Modeling and Simulation

    SciTech Connect

    Haack, Jereme N.; Cowell, Andrew J.; Marshall, Eric J.; Fligg, Alan K.; Gregory, Michelle L.; McGrath, Liam R.

    2009-09-15

    This paper describes an approach to using agent technology to extend the automated discovery mechanism of the Knowledge Encapsulation Framework (KEF). KEF is a suite of tools to enable the linking of knowledge inputs (relevant, domain-specific evidence) to modeling and simulation projects, as well as other domains that require an effective collaborative workspace for knowledge-based tasks. This framework can be used to capture evidence (e.g., trusted material such as journal articles and government reports), discover new evidence (covering both trusted and social media), enable discussions surrounding domain-specific topics and provide automatically generated semantic annotations for improved corpus investigation. The current KEF implementation is presented within a semantic wiki environment, providing a simple but powerful collaborative space for team members to review, annotate, discuss and align evidence with their modeling frameworks. The novelty in this approach lies in the combination of automatically tagged and user-vetted resources, which increases user trust in the environment, leading to ease of adoption for the collaborative environment.

  17. An object-based methodology for knowledge representation

    SciTech Connect

    Kelsey, R.L.; Hartley, R.T.; Webster, R.B.

    1997-11-01

    An object-based methodology for knowledge representation is presented. The constructs and notation of the methodology are described and illustrated with examples. The "blocks world," a classic artificial intelligence problem, is used to illustrate some of the features of the methodology, including perspectives and events. Representing knowledge with perspectives can enrich the detail of the knowledge and facilitate potential lines of reasoning. Events allow example uses of the knowledge to be represented along with the contained knowledge. Other features include the extensibility and maintainability of knowledge represented in the methodology.
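
    The two features highlighted in the abstract, perspectives and events, can be illustrated with a small blocks-world object. All names here are illustrative stand-ins, not the paper's actual notation:

```python
class Block:
    """A blocks-world object carrying multiple perspectives and events."""

    def __init__(self, name, color, weight):
        self.name = name
        # Two perspectives on the same block: what a vision system sees
        # versus what a gripper planner needs to know.
        self.perspectives = {
            "visual": {"color": color},
            "manipulation": {"weight": weight, "graspable": weight < 5.0},
        }
        self.events = []  # example uses of the knowledge, stored with it

    def view(self, perspective):
        return self.perspectives[perspective]

    def record_event(self, action, target):
        self.events.append((action, target))

a = Block("A", color="red", weight=1.2)
a.record_event("stack", "B")

assert a.view("visual") == {"color": "red"}
assert a.view("manipulation")["graspable"] is True
assert a.events == [("stack", "B")]
```

    Each perspective enriches the same underlying object with a different line of reasoning, and the event list keeps worked examples alongside the knowledge itself, as the abstract describes.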

  18. Project-Based Learning and the Limits of Corporate Knowledge.

    ERIC Educational Resources Information Center

    Rhodes, Carl; Garrick, John

    2003-01-01

    Analysis of management discourses, especially project-based learning and knowledge management, indicates that such terms as human capital, working knowledge, and knowledge assets construe managerial workers as cogito-economic subjects. Although workplace learning should develop economically related capabilities, such discourses imply that these…

  19. Knowledge Sharing in an American Multinational Company Based in Malaysia

    ERIC Educational Resources Information Center

    Ling, Chen Wai; Sandhu, Manjit S.; Jain, Kamal Kishore

    2009-01-01

    Purpose: This paper seeks to examine the views of executives working in an American based multinational company (MNC) about knowledge sharing, barriers to knowledge sharing, and strategies to promote knowledge sharing. Design/methodology/approach: This study was carried out in phases. In the first phase, a topology of organizational mechanisms for…

  20. Knowledge-Based Aid: A Four Agency Comparative Study

    ERIC Educational Resources Information Center

    McGrath, Simon; King, Kenneth

    2004-01-01

    Part of the response of many development cooperation agencies to the challenges of globalisation, ICTs and the knowledge economy is to emphasise the importance of knowledge for development. This paper looks at the discourses and practices of ''knowledge-based aid'' through an exploration of four agencies: the World Bank, DFID, Sida and JICA. It…

  1. From knowledge generation to knowledge archive. A general strategy using TOPS-MODE with DEREK to formulate new alerts for skin sensitization.

    PubMed

    Estrada, Ernesto; Patlewicz, Grace; Gutierrez, Yaquelin

    2004-01-01

    A general strategy for knowledge flow concerning skin sensitization based on the combined use of TOPS-MODE and DEREK expert system is proposed. TOPS-MODE is used as a knowledge generator, while DEREK represents the knowledge archive. A TOPS-MODE classification model allows the identification of structural fragments and groups responsible for strong/moderate skin sensitization. These structural contributions are sorted, analyzed, and graphically displayed in an appropriate way allowing the identification of several structural alerts for skin sensitization. Nine structural alerts already implemented in DEREK are identified using this strategy. They comprise, among others, alkyl halides, aldehydes, alpha,beta-unsaturated compounds, aromatic amines, phenols, hydroquinone, isothiazolinone, and alkyl sulfonates. Four new hypotheses are generated using TOPS-MODE structural contributions to skin sensitization, which are not recognized as structural alerts by DEREK. They include the reduction of aromatic nitro groups and epoxidation reaction of double bonds as metabolic activation steps that can lead to reactive haptens which can trigger the skin sensitization mechanism. Another new alert is based on 1,2,5-thiadiazole-1,1-dioxide for which we have identified a possible mechanism explaining its strong skin sensitization profile. It is based on the existence of a tautomeric equilibrium and further reaction with nucleophiles, which are both supported by experimental evidence. Finally, we have identified a possible new mechanism for the skin sensitization of nonreactive compounds, which involves the formation of noncovalent complexes with proteins in a processing- and metabolism-independent way.
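
    The archiving side of the strategy, screening a structure against a list of alerts, can be caricatured as a lookup. Real systems such as DEREK match substructures chemically (e.g. via SMARTS patterns), not by string containment; the alert list and SMILES strings below are invented and the matching is deliberately crude:

```python
# Toy structural-alert screen: flag a molecule when its SMILES string
# contains a fragment associated with skin sensitization. Substring
# matching is a stand-in for real substructure search.
ALERTS = {
    "aldehyde": "C=O",              # crude: also matches ketones/acids
    "epoxide": "C1OC1",
    "aromatic_nitro": "[N+](=O)[O-]",
}

def screen(smiles):
    """Return the sorted names of all alerts whose fragment appears."""
    return sorted(name for name, frag in ALERTS.items() if frag in smiles)

assert screen("CCC=O") == ["aldehyde"]
assert screen("CCO") == []
```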

  2. Bermuda Triangle or three to tango: generation Y, e-health and knowledge management.

    PubMed

    Yee, Kwang Chien

    2007-01-01

    Generation Y workers are slowly gathering critical mass in the healthcare sector. The sustainability of future healthcare is highly dependent on this group of workers. This generation of workers loves technology and thrives in stimulating environments. They have a great thirst for life-experience and therefore move from one working environment to another. The healthcare system has a hierarchical operational, information and knowledge structure, which unfortunately might not be the ideal ground on which to integrate generation Y. The challenges ahead present a fantastic opportunity for electronic health implementation and knowledge management to flourish. Generation Y workers, however, have very different expectations of technology utilisation, technology design and knowledge presentation. This paper will argue that a clear understanding of this group of workers is essential for researchers in health informatics and knowledge management in order to provide an integrated socio-technical solution for this group of future workers. The sustainability of a quality healthcare system will depend upon the integration of generation Y, health informatics and knowledge management strategies in a re-invented healthcare system.

  3. Weather, knowledge base and life-style

    NASA Astrophysics Data System (ADS)

    Bohle, Martin

    2015-04-01

    Why main-stream curiosity for earth-science topics, thus appraising these topics as of public interest? Namely, to influence the practices by which humankind's activities intersect the geosphere. How to main-stream that curiosity? Namely, by weaving diverse concerns into common threads drawing on a wide range of perspectives: be it the beauty or particularity of ordinary or special phenomena, evaluating hazards for or from mundane environments, or connecting scholarly investigation with the concerns of citizens at large; threading these through traditional or modern media, arts or story-telling. Three examples. First, "weather": weather is a topic of primordial interest for most people, impacting human lives in settlement, food, mobility, hunting, fishing and battle. It is the single earth-science topic that went "prime-time": since the early 1950s, weather forecasts have been broadcast and meteorologists have presented their work to the public daily. Second, "knowledge base": earth-sciences are relevant to modern societies' economies and value setting; they provide insights into the evolution of life-bearing planets, the functioning of Earth's systems, and the impact of humankind's activities on biogeochemical systems on Earth. These insights bear on the production of goods, living conditions and individual well-being. Third, "life-style": citizens' urban culture prejudices their experiential connections; earth-science phenomena, even most weather phenomena, are witnessed rarely. In the past, traditional rural communities mediated their rich experiences through earth-centric story-telling. In the course of the global urbanisation process this culture has given place to society-centric story-telling. Only recently has anthropogenic global change triggered discussions on geoengineering, hazard mitigation and demographics which, interwoven with arts, linguistics and cultural histories, offer a rich narrative

  4. Zero-Knowledge Proof Based Node Authentication

    DTIC Science & Technology

    2009-05-01

    SUBJECT TERMS: Airborne Network Protocol, Zero Knowledge Proof, Graph Isomorphism. Although we developed the basic guidelines for this selection, our results are inconclusive and require additional experiments.
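
    The technique named in the record's subject terms, zero-knowledge proof via graph isomorphism, follows a classic commit-challenge-respond pattern that can be sketched directly. This is a toy simulation of an honest prover and verifier; a real protocol commits cryptographically to H rather than revealing it in the clear:

```python
import random

def apply_perm(perm, graph):
    """Relabel a graph's vertices; a graph is a set of frozenset edges."""
    return {frozenset(perm[v] for v in edge) for edge in graph}

def zk_round(g1, g2, sigma, rng):
    """One round of the graph-isomorphism zero-knowledge protocol.

    The prover knows sigma with apply_perm(sigma, g1) == g2 and proves
    this without revealing sigma itself.
    """
    nodes = list(sigma)  # vertex labels of g1
    # Prover commits to H, a random relabelling of g1.
    pi = dict(zip(nodes, rng.sample(nodes, len(nodes))))
    h = apply_perm(pi, g1)
    # Verifier's challenge: show H ~ g1 (b=0) or H ~ g2 (b=1).
    b = rng.randrange(2)
    if b == 0:
        return apply_perm(pi, g1) == h  # prover reveals pi
    inv_sigma = {v: k for k, v in sigma.items()}      # g2 label -> g1 label
    rho = {x: pi[inv_sigma[x]] for x in inv_sigma}    # pi composed with sigma^-1
    return apply_perm(rho, g2) == h

g1 = {frozenset(e) for e in [(1, 2), (2, 3), (3, 1), (3, 4)]}
sigma = {1: "b", 2: "c", 3: "a", 4: "d"}
g2 = apply_perm(sigma, g1)  # an isomorphic copy the prover can explain

rng = random.Random(0)
assert all(zk_round(g1, g2, sigma, rng) for _ in range(20))
```

    Each round reveals either pi or pi composed with sigma inverse, never both, so a transcript leaks nothing about sigma; a cheating prover who does not know sigma fails each round with probability one half, so repeated rounds drive the cheating probability toward zero.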

  5. Active Data/Knowledge Base Dictionary

    DTIC Science & Technology

    1991-09-01

    maintenance of integrity asser- tions using redundant aggregate. In Proceedings of the 6th Int’l Conf. on Very Large Databases, pages 126-136, Alfonso ...Minker. Logic and Databases. Plenum Press, New York, 1978. [GM83] Hector Garcia- Molina . Using semantic knowledge for transaction processing in a dis

  6. Construction of Expert Knowledge Monitoring and Assessment System Based on Integral Method of Knowledge Evaluation

    ERIC Educational Resources Information Center

    Golovachyova, Viktoriya N.; Menlibekova, Gulbakhyt Zh.; Abayeva, Nella F.; Ten, Tatyana L.; Kogaya, Galina D.

    2016-01-01

    Using computer-based monitoring systems that rely on tests could be the most effective way of knowledge evaluation. The problem of objective knowledge assessment by means of testing takes on a new dimension in the context of new paradigms in education. The analysis of the existing test methods enabled us to conclude that tests with selected…

  7. Students' Refinement of Knowledge during the Development of Knowledge Bases for Expert Systems.

    ERIC Educational Resources Information Center

    Lippert, Renate; Finley, Fred

    The refinement of the cognitive knowledge base was studied through exploration of the transition from novice to expert and the use of an instructional strategy called novice knowledge engineering. Six college freshmen, who were enrolled in an honors physics course, used an expert system to create questions, decisions, rules, and explanations…

  8. A knowledge based system for scientific data visualization

    NASA Technical Reports Server (NTRS)

    Senay, Hikmet; Ignatius, Eve

    1992-01-01

    A knowledge-based system, called visualization tool assistant (VISTA), which was developed to assist scientists in the design of scientific data visualization techniques, is described. The system derives its knowledge from several sources which provide information about data characteristics, visualization primitives, and effective visual perception. The design methodology employed by the system is based on a sequence of transformations which decomposes a data set into a set of data partitions, maps this set of partitions to visualization primitives, and combines these primitives into a composite visualization technique design. Although the primary function of the system is to generate an effective visualization technique design for a given data set using principles of visual perception, the system also allows users to interactively modify the design and renders the resulting image using a variety of rendering algorithms. The current version of the system primarily supports visualization techniques applicable in the earth and space sciences, although it may easily be extended to include techniques useful in other disciplines such as computational fluid dynamics, finite-element analysis, and medical imaging.
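    The transformation sequence this abstract describes (data set → partitions → primitives → composite design) can be sketched as a small rule-based pipeline. The field types, primitive names, and mapping rules below are invented for illustration and are not VISTA's actual knowledge base.

```python
# Illustrative sketch of a VISTA-style design pipeline: partition a data set
# description by field, map each partition to a visualization primitive via a
# rule table, then combine the mappings into one composite design.
# All field types, primitives, and rules here are hypothetical examples.

PRIMITIVE_RULES = {
    ("quantitative", "2d_grid"): "pseudocolor_map",
    ("quantitative", "points"): "scatter_size",
    ("ordinal", "points"): "color_sequence",
    ("nominal", "points"): "glyph_shape",
}

def partition(dataset):
    """Split a data-set description into per-field partitions."""
    return [
        {"field": name, "dtype": dtype, "layout": layout}
        for name, (dtype, layout) in dataset.items()
    ]

def map_to_primitive(part):
    """Map one partition to a visualization primitive via the rule table."""
    key = (part["dtype"], part["layout"])
    return {"field": part["field"], "primitive": PRIMITIVE_RULES[key]}

def compose(dataset):
    """Full pipeline: partition -> map -> composite design."""
    return [map_to_primitive(p) for p in partition(dataset)]

design = compose({
    "temperature": ("quantitative", "2d_grid"),
    "station_type": ("nominal", "points"),
})
print(design)
```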

  9. Advancing the hydrogen safety knowledge base

    SciTech Connect

    Weiner, S. C.

    2014-08-29

    The International Energy Agency's Hydrogen Implementing Agreement (IEA HIA) was established in 1977 to pursue collaborative hydrogen research and development and information exchange among its member countries. Information and knowledge dissemination is a key aspect of the work within IEA HIA tasks, and case studies, technical reports and presentations/publications often result from the collaborative efforts. The work conducted in hydrogen safety under Task 31 and its predecessor, Task 19, can positively impact the objectives of national programs even in cases for which a specific task report is not published. As a result, the interactions within Task 31 illustrate how technology information and knowledge exchange among participating hydrogen safety experts serve the objectives intended by the IEA HIA.

  10. Advancing the hydrogen safety knowledge base

    DOE PAGES

    Weiner, S. C.

    2014-08-29

    The International Energy Agency's Hydrogen Implementing Agreement (IEA HIA) was established in 1977 to pursue collaborative hydrogen research and development and information exchange among its member countries. Information and knowledge dissemination is a key aspect of the work within IEA HIA tasks, and case studies, technical reports and presentations/publications often result from the collaborative efforts. The work conducted in hydrogen safety under Task 31 and its predecessor, Task 19, can positively impact the objectives of national programs even in cases for which a specific task report is not published. As a result, the interactions within Task 31 illustrate how technology information and knowledge exchange among participating hydrogen safety experts serve the objectives intended by the IEA HIA.

  11. Fundamentals of Knowledge-Based Techniques

    DTIC Science & Technology

    2006-09-01

    Predicate logic knowledge representation models what are known as facts within a domain and represents these facts so that an inference engine or... [Figure 3, "Relational DBMS Model of Facts and Causes," shows Target and RxPower relations over antenna nodes.] ...Semantic Nets: Originally, semantic nets were developed for the purpose of modeling the English language (7), for a computer to understand. A method

  12. Requirements for an on-line knowledge-based anatomy information system.

    PubMed Central

    Brinkley, J. F.; Rosse, C.

    1998-01-01

    User feedback from the Digital Anatomist Web-based anatomy atlases, together with over 20 years of anatomy teaching experience, was used to formulate the requirements and system design for a next-generation anatomy information system. The main distinction of this system from current image-based approaches is that it is knowledge-based. A foundational model of anatomy is accessed by an intelligent agent that uses its knowledge about the available anatomy resources and the user types to generate customized interfaces. Current usage statistics suggest that even partial implementation of this design will be of great practical value for both clinical and educational needs. PMID:9929347

  13. Improved knowledge diffusion model based on the collaboration hypernetwork

    NASA Astrophysics Data System (ADS)

    Wang, Jiang-Pan; Guo, Qiang; Yang, Guang-Yong; Liu, Jian-Guo

    2015-06-01

    The process of absorbing knowledge has become an essential element of innovation in firms and of adapting to changes in the competitive environment. In this paper, we present an improved knowledge diffusion hypernetwork (IKDH) model based on the idea that knowledge will spread from the target node to all its neighbors in terms of the hyperedge and knowledge stock. We apply the average knowledge stock V(t), the variance σ²(t), and the variance coefficient c(t) to evaluate the performance of knowledge diffusion. By analyzing different knowledge diffusion ways, selection ways of the highly knowledgeable nodes, hypernetwork sizes, and hypernetwork structures for the performance of knowledge diffusion, results show that the diffusion speed of the IKDH model is 3.64 times faster than that of the traditional knowledge diffusion (TKDH) model. Besides, it is three times faster to diffuse knowledge by randomly selecting "expert" nodes than by selecting large-hyperdegree nodes as "expert" nodes. Furthermore, either a closer network structure or a smaller network size results in faster knowledge diffusion.
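    The hyperedge-based spreading this abstract describes can be illustrated with a toy simulation: at each step a node shares knowledge with every neighbor that shares a hyperedge with it. The absorption rule (a fraction alpha of the stock difference) and all numbers below are illustrative assumptions, not the IKDH model's exact equations.

```python
import random
import statistics

# Toy sketch of hyperedge-based knowledge diffusion: a randomly chosen node
# shares with all neighbors in its hyperedges; neighbors with lower stock
# absorb a fraction of the difference. The alpha rule is an assumption.

def diffuse(hyperedges, stock, steps, alpha=0.5, seed=1):
    rng = random.Random(seed)
    stock = dict(stock)
    nodes = sorted(stock)
    for _ in range(steps):
        target = rng.choice(nodes)
        # Every node sharing a hyperedge with the target is a neighbor.
        neighbors = {n for e in hyperedges if target in e for n in e} - {target}
        for n in neighbors:
            if stock[n] < stock[target]:
                stock[n] += alpha * (stock[target] - stock[n])
    return stock

def metrics(stock):
    """Average stock V(t), variance sigma^2(t), variance coefficient c(t)."""
    v = statistics.mean(stock.values())
    var = statistics.pvariance(stock.values())
    return v, var, (var ** 0.5) / v

edges = [{"a", "b", "c"}, {"c", "d"}]
final = diffuse(edges, {"a": 10.0, "b": 1.0, "c": 1.0, "d": 1.0}, steps=50)
print(metrics(final))
```

    Running more steps drives the variance coefficient c(t) toward zero, which is how the abstract's metrics register a completed diffusion.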

  14. How To Manage the Emerging Generational Divide in the Contemporary Knowledge-Rich Workplace.

    ERIC Educational Resources Information Center

    Novicevic, Milorad M.; Buckley, M. Ronald

    2001-01-01

    Addresses the manager's dilemmas and options in resolving emerging latent intergenerational conflict in the contemporary knowledge-rich workplace. Topics include a theoretical framework for generational divide management; the polarization in task requirements; social and environmental factors; differences in employee needs and expectations; and…

  15. Towards a Reconceptualisation of "Word" for High Frequency Word Generation in Word Knowledge Studies

    ERIC Educational Resources Information Center

    Sibanda, Jabulani; Baxen, Jean

    2014-01-01

    The present paper derives from a PhD study investigating the nexus between Grade 4 textbook vocabulary demands and Grade 3 isiXhosa-speaking learners' knowledge of that vocabulary to enable them to read to learn in Grade 4. The paper challenges the efficacy of the four current definitions of "word" for generating high frequency words…

  16. Data Mining in Finance: Using Counterfactuals To Generate Knowledge from Organizational Information Systems.

    ERIC Educational Resources Information Center

    Dhar, Vasant

    1998-01-01

    Shows how counterfactuals and machine learning methods can be used to guide exploration of large databases that addresses some of the fundamental problems that organizations face in learning from data. Discusses data mining, particularly in the financial arena; generating useful knowledge from data; and the evaluation of counterfactuals. (LRW)

  17. "Comments on Greenhow, Robelia, and Hughes": Technologies that Facilitate Generating Knowledge and Possibly Wisdom

    ERIC Educational Resources Information Center

    Dede, Chris

    2009-01-01

    Greenhow, Robelia, and Hughes (2009) argue that Web 2.0 media are well suited to enhancing the education research community's purpose of generating and sharing knowledge. The author of this comment article first articulates how a research infrastructure with capabilities for communal bookmarking, photo and video sharing, social networking, wikis,…

  18. Knowledge sources for evidence-based practice in rheumatology nursing.

    PubMed

    Neher, Margit; Ståhl, Christian; Ellström, Per-Erik; Nilsen, Per

    2015-12-01

    As rheumatology nursing develops and extends, knowledge about current use of knowledge in rheumatology nursing practice may guide discussions about future knowledge needs. To explore what perceptions rheumatology nurses have about their knowledge sources and about what knowledge they use in their practice, 12 nurses working in specialist rheumatology were interviewed using a semi-structured interview guide. The data were analyzed using conventional qualitative content analysis. The analysis yielded four types of knowledge sources in clinical practice: interaction with others in the workplace, contacts outside the workplace, written materials, and previous knowledge and experience. Colleagues, and physicians in particular, were important for informal learning in daily rheumatology practice. Evidence from the medical arena was accessed through medical specialists, while nursing research was used less. Facilitating informal learning and continuing formal education is proposed as a way toward a more evidence-based practice in extended roles.

  19. D and D knowledge management information tool - a web based system developed to share D and D knowledge worldwide

    SciTech Connect

    Lagos, L.; Upadhyay, H.; Shoffner, P.

    2013-07-01

    Deactivation and decommissioning (D and D) work is a high risk and technically challenging enterprise within the U.S. Department of Energy complex. During the past three decades, the DOE's Office of Environmental Management has been in charge of carrying out one of the largest environmental restoration efforts in the world: the cleanup of the Manhattan Project legacy. In today's corporate world, worker experiences and knowledge that have developed over time represent a valuable corporate asset. The ever-dynamic workplace, coupled with an aging workforce, presents corporations with the ongoing challenge of preserving work-related experiences and knowledge for cross-generational knowledge transfer to the future workforce [5]. To prevent the D and D knowledge base and expertise from being lost over time, the DOE and the Applied Research Center at Florida International University (FIU) have developed the web-based Knowledge Management Information Tool (KM-IT) to capture and maintain this valuable information in a universally available and easily accessible and usable system. The D and D KM-IT was developed in collaboration with DOE Headquarters (HQ), the Energy Facility Contractors Group (EFCOG), and the ALARA [as low as reasonably achievable] Centers at Savannah River Sites to preserve the D and D information generated and collected by the D and D community. This is an open secured system that can be accessed from https://www.dndkm.org over the web and through mobile devices at https://m.dndkm.org. This knowledge system serves as a centralized repository and provides a common interface for D and D-related activities. It also improves efficiency by reducing the need to rediscover knowledge and promotes the reuse of existing knowledge. It is a community-driven system that facilitates the gathering, analyzing, storing, and sharing of knowledge and information within the D and D community. It assists the DOE D and D community in identifying potential solutions to their

  20. KAT: A Flexible XML-based Knowledge Authoring Environment

    PubMed Central

    Hulse, Nathan C.; Rocha, Roberto A.; Del Fiol, Guilherme; Bradshaw, Richard L.; Hanna, Timothy P.; Roemer, Lorrie K.

    2005-01-01

    As part of an enterprise effort to develop new clinical information systems at Intermountain Health Care, the authors have built a knowledge authoring tool that facilitates the development and refinement of medical knowledge content. At present, users of the application can compose order sets and an assortment of other structured clinical knowledge documents based on XML schemas. The flexible nature of the application allows the immediate authoring of new types of documents once an appropriate XML schema and accompanying Web form have been developed and stored in a shared repository. The need for a knowledge acquisition tool stems largely from the desire for medical practitioners to be able to write their own content for use within clinical applications. We hypothesize that medical knowledge content for clinical use can be successfully created and maintained through XML-based document frameworks containing structured and coded knowledge. PMID:15802477
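    The schema-driven authoring idea in this abstract can be sketched in a few lines: a document "type" is declared as a list of required elements, and authored XML is checked against it. KAT uses full XML Schemas and Web forms; the `<orderSet>` structure, element names, and required-element list below are simplified, hypothetical stand-ins.

```python
import xml.etree.ElementTree as ET

# Sketch of schema-driven knowledge authoring in the spirit of KAT: build a
# structured order-set document, then verify required elements are present.
# The orderSet layout and the required-element spec are illustrative only.

ORDER_SET_SPEC = ["title", "author", "orders"]   # required child elements

def author_order_set(title, author, orders):
    root = ET.Element("orderSet")
    ET.SubElement(root, "title").text = title
    ET.SubElement(root, "author").text = author
    orders_el = ET.SubElement(root, "orders")
    for o in orders:
        ET.SubElement(orders_el, "order").text = o
    return root

def validate(root, spec):
    """Return the list of required elements missing from the document."""
    return [tag for tag in spec if root.find(tag) is None]

doc = author_order_set("Admission orders", "A. Author", ["CBC", "ECG"])
print(validate(doc, ORDER_SET_SPEC))  # [] -> all required elements present
```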

  1. AOP Knowledge Base/Wiki Tool Set

    EPA Science Inventory

    Utilizing ToxCast Data and Lifestage Physiologically-Based Pharmacokinetic (PBPK) models to Drive Adverse Outcome Pathways (AOPs)-Based Margin of Exposures (ABME) to Chemicals. Hisham A. El-Masri1, Nicole C. Klienstreur2, Linda Adams1, Tamara Tal1, Stephanie Padilla1, Kristin Is...

  2. Caregiving Antecedents of Secure Base Script Knowledge: A Comparative Analysis of Young Adult Attachment Representations

    ERIC Educational Resources Information Center

    Steele, Ryan D.; Waters, Theodore E. A.; Bost, Kelly K.; Vaughn, Brian E.; Truitt, Warren; Waters, Harriet S.; Booth-LaForce, Cathryn; Roisman, Glenn I.

    2014-01-01

    Based on a subsample (N = 673) of the NICHD Study of Early Child Care and Youth Development (SECCYD) cohort, this article reports data from a follow-up assessment at age 18 years on the antecedents of "secure base script knowledge", as reflected in the ability to generate narratives in which attachment-related difficulties are…

  3. Using Knowledge-Based Systems to Support Learning of Organizational Knowledge: A Case Study

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.; Nash, Rebecca L.; Phan, Tu-Anh T.; Bailey, Teresa R.

    2003-01-01

    This paper describes the deployment of a knowledge system to support learning of organizational knowledge at the Jet Propulsion Laboratory (JPL), a US national research laboratory whose mission is planetary exploration and to 'do what no one has done before.' Data collected over 19 weeks of operation were used to assess system performance with respect to design considerations, participation, effectiveness of communication mechanisms, and individual-based learning. These results are discussed in the context of organizational learning research and implications for practice.

  4. The 2004 knowledge base parametric grid data software suite.

    SciTech Connect

    Wilkening, Lisa K.; Simons, Randall W.; Ballard, Sandy; Jensen, Lee A.; Chang, Marcus C.; Hipp, James Richard

    2004-08-01

    One of the most important types of data in the National Nuclear Security Administration (NNSA) Ground-Based Nuclear Explosion Monitoring Research and Engineering (GNEM R&E) Knowledge Base (KB) is parametric grid (PG) data. PG data can be used to improve signal detection, signal association, and event discrimination, but so far their greatest use has been for improving event location by providing ground-truth-based corrections to travel-time base models. In this presentation we discuss the latest versions of the complete suite of Knowledge Base PG tools developed by NNSA to create, access, manage, and view PG data. The primary PG population tool is the Knowledge Base calibration integration tool (KBCIT). KBCIT is an interactive computer application to produce interpolated calibration-based information that can be used to improve monitoring performance by improving precision of model predictions and by providing proper characterizations of uncertainty. It is used to analyze raw data and produce kriged correction surfaces that can be included in the Knowledge Base. KBCIT not only produces the surfaces but also records all steps in the analysis for later review and possible revision. New features in KBCIT include a new variogram autofit algorithm; the storage of database identifiers with a surface; the ability to merge surfaces; and improved surface-smoothing algorithms. The Parametric Grid Library (PGL) provides the interface to access the data and models stored in a PGL file database. The PGL represents the core software library used by all the GNEM R&E tools that read or write PGL data (e.g., KBCIT and LocOO). The library provides data representations and software models to support accurate and efficient seismic phase association and event location. Recent improvements include conversion of the flat-file database (FDB) to an Oracle database representation; automatic access of station/phase tagged models from the FDB during location; modification of the core
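    KBCIT produces kriged correction surfaces from ground-truth data. As a much simpler stand-in for interpolating a correction surface from scattered calibration points, the sketch below uses inverse-distance weighting rather than kriging; the station locations, correction values, and power parameter are invented for illustration.

```python
# Simplified stand-in for correction-surface interpolation: estimate a
# travel-time correction at a query point from scattered ground-truth
# corrections using inverse-distance weighting (not kriging).

def idw(points, values, query, power=2.0):
    """Interpolate a value at `query` from (x, y) sample points."""
    num = den = 0.0
    for (x, y), v in zip(points, values):
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0.0:
            return v          # exact hit on a ground-truth point
        w = d2 ** (-power / 2.0)
        num += w * v
        den += w
    return num / den

# Hypothetical ground-truth corrections (seconds) at four stations.
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
vals = [0.2, 0.4, 0.6, 0.8]
print(idw(pts, vals, (0.5, 0.5)))  # symmetric point: mean of the corrections
```

    Kriging improves on this by fitting a variogram to characterize spatial correlation and by providing an uncertainty estimate at each query point, which is what KBCIT's variogram autofit supplies.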

  5. EHR based Genetic Testing Knowledge Base (iGTKB) Development

    PubMed Central

    2015-01-01

    Background The gap between a large growing number of genetic tests and a suboptimal clinical workflow of incorporating these tests into regular clinical practice poses barriers to effective reliance on advanced genetic technologies to improve quality of healthcare. A promising solution to fill this gap is to develop an intelligent genetic test recommendation system that not only can provide a comprehensive view of genetic tests as education resources, but also can recommend the most appropriate genetic tests to patients based on clinical evidence. In this study, we developed an EHR based Genetic Testing Knowledge Base for Individualized Medicine (iGTKB). Methods We extracted genetic testing information and patient medical records from EHR systems at Mayo Clinic. Clinical features have been semi-automatically annotated from the clinical notes by applying a Natural Language Processing (NLP) tool, the MedTagger suite. To prioritize clinical features for each genetic test, we compared odds ratios across four population groups. Genetic tests, genetic disorders and clinical features with their odds ratios have been applied to establish iGTKB, which is to be integrated into the Genetic Testing Ontology (GTO). Results Overall, five genetic tests were performed with sample sizes greater than 100 at Mayo Clinic in 2013. A total of 1,450 patients who were tested by one of the five genetic tests were selected. We assembled 243 clinical features from the Human Phenotype Ontology (HPO) for these five genetic tests. There are 60 clinical features with at least one mention in clinical notes of patients taking the test. Twenty-eight clinical features with high odds ratios (greater than 1) have been selected as dominant features and deposited into iGTKB with their associated information about genetic tests and genetic disorders. Conclusions In this study, we developed an EHR based genetic testing knowledge base, iGTKB. iGTKB will be integrated into the GTO by providing relevant

  6. The process for integrating the NNSA knowledge base.

    SciTech Connect

    Wilkening, Lisa K.; Carr, Dorthe Bame; Young, Christopher John; Hampton, Jeff; Martinez, Elaine

    2009-03-01

    From 2002 through 2006, the Ground Based Nuclear Explosion Monitoring Research & Engineering (GNEMRE) program at Sandia National Laboratories defined and modified a process for merging different types of integrated research products (IRPs) from various researchers into a cohesive, well-organized collection known as the NNSA Knowledge Base, to support operational treaty monitoring. This process includes defining the KB structure, systematically and logically aggregating IRPs into a complete set, and verifying and validating that the integrated Knowledge Base works as expected.

  7. Knowledge-based graphical interfaces for presenting technical information

    NASA Technical Reports Server (NTRS)

    Feiner, Steven

    1988-01-01

    Designing effective presentations of technical information is extremely difficult and time-consuming. Moreover, the combination of increasing task complexity and declining job skills makes the need for high-quality technical presentations especially urgent. We believe that this need can ultimately be met through the development of knowledge-based graphical interfaces that can design and present technical information. Since much material is most naturally communicated through pictures, our work has stressed the importance of well-designed graphics, concentrating on generating pictures and laying out displays containing them. We describe APEX, a testbed picture generation system that creates sequences of pictures that depict the performance of simple actions in a world of 3D objects. Our system supports rules for determining automatically the objects to be shown in a picture, the style and level of detail with which they should be rendered, the method by which the action itself should be indicated, and the picture's camera specification. We then describe work on GRIDS, an experimental display layout system that addresses some of the problems in designing displays containing these pictures, determining the position and size of the material to be presented.

  8. Analyzing Data Generated Through Deliberative Dialogue: Bringing Knowledge Translation Into Qualitative Analysis.

    PubMed

    Plamondon, Katrina M; Bottorff, Joan L; Cole, Donald C

    2015-11-01

    Deliberative dialogue (DD) is a knowledge translation strategy that can serve to generate rich data and bridge health research with action. An intriguing alternative to other modes of generating data, the purposeful and evidence-informed conversations characteristic of DD generate data inclusive of collective interpretations. These data are thus dialogic, presenting complex challenges for qualitative analysis. In this article, we discuss the nature of data generated through DD, orienting ourselves toward a theoretically grounded approach to analysis. We offer an integrated framework for analysis, balancing analytical strategies of categorizing and connecting with the use of empathetic and suspicious interpretive lenses. In this framework, data generation and analysis occur in concert, alongside engaging participants and synthesizing evidence. An example of application is provided, demonstrating nuances of the framework. We conclude with reflections on the strengths and limitations of the framework, suggesting how it may be relevant in other qualitative health approaches.

  9. Apprenticeship Learning Techniques for Knowledge Based Systems

    DTIC Science & Technology

    1988-12-01

    domain, such as medicine. The Odysseus explanation-based learning program constructs explanations of problem-solving actions in the domain of medical...theories and empirical methods so as to allow construction of an explanation. The Odysseus learning program provides the first demonstration of using the... Odysseus explanation-based learning program is presented, which constructs explanations of human problem-solving actions in the domain of medical di

  10. Evolution of co-management: role of knowledge generation, bridging organizations and social learning.

    PubMed

    Berkes, Fikret

    2009-04-01

    Over a period of some 20 years, different aspects of co-management (the sharing of power and responsibility between the government and local resource users) have come to the forefront. The paper focuses on a selection of these: knowledge generation, bridging organizations, social learning, and the emergence of adaptive co-management. Co-management can be considered a knowledge partnership. Different levels of organization, from local to international, have comparative advantages in the generation and mobilization of knowledge acquired at different scales. Bridging organizations provide a forum for the interaction of these different kinds of knowledge, and the coordination of other tasks that enable co-operation: accessing resources, bringing together different actors, building trust, resolving conflict, and networking. Social learning is one of these tasks, essential both for the co-operation of partners and an outcome of the co-operation of partners. It occurs most efficiently through joint problem solving and reflection within learning networks. Through successive rounds of learning and problem solving, learning networks can incorporate new knowledge to deal with problems at increasingly larger scales, with the result that maturing co-management arrangements become adaptive co-management in time.

  11. Learning Science-Based Fitness Knowledge in Constructivist Physical Education

    ERIC Educational Resources Information Center

    Sun, Haichun; Chen, Ang; Zhu, Xihe; Ennis, Catherine D.

    2012-01-01

    Teaching fitness-related knowledge has become critical in developing children's healthful living behavior. The purpose of this study was to examine the effects of a science-based, constructivist physical education curriculum on learning fitness knowledge critical to healthful living in elementary school students. The schools (N = 30) were randomly…

  12. Knowledge-Based Hierarchies: Using Organizations to Understand the Economy

    ERIC Educational Resources Information Center

    Garicano, Luis; Rossi-Hansberg, Esteban

    2015-01-01

    Incorporating the decision of how to organize the acquisition, use, and communication of knowledge into economic models is essential to understand a wide variety of economic phenomena. We survey the literature that has used knowledge-based hierarchies to study issues such as the evolution of wage inequality, the growth and productivity of firms,…

  13. Developing Learning Progression-Based Teacher Knowledge Measures

    ERIC Educational Resources Information Center

    Jin, Hui; Shin, HyoJeong; Johnson, Michele E.; Kim, JinHo; Anderson, Charles W.

    2015-01-01

    This study developed learning progression-based measures of science teachers' content knowledge (CK) and pedagogical content knowledge (PCK). The measures focus on an important topic in secondary science curriculum using scientific reasoning (i.e., tracing matter, tracing energy, and connecting scales) to explain plants gaining weight and…

  14. Grey Documentation as a Knowledge Base in Social Work.

    ERIC Educational Resources Information Center

    Berman, Yitzhak

    1994-01-01

    Defines grey documentation as documents issued informally and not available through normal channels and discusses the role that grey documentation can play in the social work knowledge base. Topics addressed include grey documentation and science; social work and the empirical approach in knowledge development; and dissemination of grey…

  15. A knowledge-based decision support system for payload scheduling

    NASA Technical Reports Server (NTRS)

    Tyagi, Rajesh; Tseng, Fan T.

    1988-01-01

    This paper presents the development of a prototype Knowledge-based Decision Support System, currently under development, for scheduling payloads/experiments on space station missions. The DSS is being built on Symbolics, a Lisp machine, using KEE, a commercial knowledge engineering tool.

  16. Conventional and Knowledge-Based Information Retrieval with Prolog.

    ERIC Educational Resources Information Center

    Leigh, William; Paz, Noemi

    1988-01-01

    Describes the use of PROLOG to program knowledge-based information retrieval systems, in which the knowledge contained in a document is translated into machine processable logic. Several examples of the resulting search process, and the program rules supporting the process, are given. (10 references) (CLB)
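    The abstract's core idea, translating document knowledge into machine-processable logic and inferring over it during retrieval, can be mimicked outside Prolog. The sketch below is a hypothetical Python stand-in for Prolog-style facts and rules (the documents, relations, and the single rule are invented), not the systems the article describes.

```python
# Sketch of knowledge-based retrieval in the spirit of the Prolog systems
# described above: document contents become facts, and an inference rule
# derives new index terms before matching a query. Facts and the rule are
# invented for illustration.

facts = {
    ("doc1", "mentions", "expert_system"),
    ("doc2", "mentions", "database"),
}

def apply_rules(facts):
    """Hypothetical rule: whatever mentions 'expert_system' is about 'ai'."""
    derived = set(facts)
    for doc, rel, term in facts:
        if rel == "mentions" and term == "expert_system":
            derived.add((doc, "about", "ai"))
    return derived

def retrieve(facts, topic):
    """Return documents whose derived facts link them to `topic`."""
    kb = apply_rules(facts)
    return sorted({doc for doc, rel, term in kb if term == topic})

print(retrieve(facts, "ai"))  # doc1 matches only via the inference rule
```

    A conventional keyword search over the same facts would miss doc1 for the query "ai"; the derived fact is what makes the retrieval knowledge-based.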

  17. A relational data-knowledge base system and its potential in developing a distributed data-knowledge system

    NASA Technical Reports Server (NTRS)

    Rahimian, Eric N.; Graves, Sara J.

    1988-01-01

    A new approach used in constructing a relational data knowledge base system is described. The relational database is well suited for distribution due to its property of allowing data fragmentation and fragmentation transparency. An example is formulated of a simple relational data knowledge base which may be generalized for use in developing a relational distributed data knowledge base system. The efficiency and ease of application of such a data knowledge base management system is briefly discussed. Also discussed are the potentials of the developed model for sharing the data knowledge base as well as the possible areas of difficulty in implementing the relational data knowledge base management system.

  18. Knowledge-Based System Analysis and Control

    DTIC Science & Technology

    1989-09-30

    for use as a training tool (given considerable enlargement of its present circuit data base and problem repertoire), because it can provide step-by-... from the slow and expensive process of training personnel in complex professional specialties. Tech Control began to emerge as a skill area ripe for... for any purpose but offline training. In late FY87 and early FY88, planning was therefore begun for a new expert system which would have no air gap

  19. Supervised Learning Based Hypothesis Generation from Biomedical Literature

    PubMed Central

    Sang, Shengtian; Yang, Zhihao; Li, Zongyao; Lin, Hongfei

    2015-01-01

    Nowadays, the amount of biomedical literature is growing at an explosive speed, and much useful knowledge remains undiscovered in it. Researchers can form biomedical hypotheses by mining this literature. In this paper, we propose a supervised learning based approach to generate hypotheses from biomedical literature. This approach splits the traditional processing of hypothesis generation with the classic ABC model into an AB model and a BC model, which are constructed with supervised learning methods. Compared with concept co-occurrence and grammar engineering-based approaches like SemRep, machine learning based models can usually achieve better performance in information extraction (IE) from texts. Then, by combining the two models, the approach reconstructs the ABC model and generates biomedical hypotheses from literature. The experimental results on the three classic Swanson hypotheses show that our approach outperforms the SemRep system. PMID:26380291
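    The paper replaces Swanson's co-occurrence ABC model with supervised AB/BC classifiers; for orientation, the sketch below shows only the classic co-occurrence baseline: propose an A-C hypothesis when A and C share an intermediate B term but never co-occur directly. The toy "literature" echoes Swanson's fish-oil example but is invented here.

```python
# Classic co-occurrence ABC model (the baseline the paper improves on):
# A-C links are hypothesized through shared intermediate B terms when A and
# C never appear in the same document. The toy document set is illustrative.

docs = [
    {"fish_oil", "blood_viscosity"},          # A co-occurs with B
    {"blood_viscosity", "raynauds_disease"},  # B co-occurs with C
    {"fish_oil", "platelets"},
]

def abc_hypotheses(docs, a):
    # Terms seen together with A anywhere (the candidate B terms).
    cooccur = {t for d in docs if a in d for t in d} - {a}
    hypotheses = set()
    for b in cooccur:
        for d in docs:
            if b in d and a not in d:
                # C terms reached through B but never seen with A directly.
                hypotheses |= {c for c in d if c != b and c not in cooccur}
    return sorted(hypotheses)

print(abc_hypotheses(docs, "fish_oil"))  # proposes the raynauds_disease link
```

    The paper's contribution is to replace the raw co-occurrence tests in both hops with learned AB and BC classifiers, which filter out the spurious links this baseline inevitably proposes.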

  20. Online Knowledge-Based Model for Big Data Topic Extraction

    PubMed Central

    Khan, Muhammad Taimoor; Durrani, Mehr; Khalid, Shehzad; Aziz, Furqan

    2016-01-01

    Lifelong machine learning (LML) models learn with experience, maintaining a knowledge base without user intervention. Unlike traditional single-domain models, they can easily scale up to explore big data. The existing LML models have high data dependency, consume more resources, and do not support streaming data. This paper proposes an online LML model (OAMC) to support streaming data with reduced data dependency. By engineering the knowledge base and introducing new knowledge features, the model's learning pattern is improved for data arriving in pieces. OAMC improves accuracy, measured as topic coherence, by 7% for streaming data while reducing the processing cost by half. PMID:27195004

  1. Knowledge based systems: From process control to policy analysis

    SciTech Connect

    Marinuzzi, J.G.

    1993-06-01

    Los Alamos has been pursuing the use of Knowledge Based Systems for many years. These systems are currently being used to support projects that range across many production and operations areas. By investing time and money in people and equipment, Los Alamos has developed one of the strongest knowledge based systems capabilities within the DOE. Staff of Los Alamos' Mechanical & Electronic Engineering Division are using these knowledge systems to increase capability, productivity and competitiveness in areas of manufacturing quality control, robotics, process control, plant design and management decision support. This paper describes some of these projects and associated technical program approaches, accomplishments, benefits and future goals.

  3. Knowledge.

    ERIC Educational Resources Information Center

    Online-Offline, 1999

    1999-01-01

    This theme issue on knowledge includes annotated listings of Web sites, CD-ROMs and computer software, videos, books, and additional resources that deal with knowledge and differences between how animals and humans learn. Sidebars discuss animal intelligence, learning proper behavior, and getting news from the Internet. (LRW)

  4. [Medical practice and clinical research: keys to generate knowledge and improve care].

    PubMed

    Martínez Castuera-Gómez, Carla; Talavera, Juan O

    2013-01-01

    The increased quality in medical care may be accomplished immediately if clinical research is integrated into daily clinical practice. The generation of medical knowledge involves four steps: an unanswered question arising from clinical practice, the critical analysis of the specialized literature, the development of a research protocol, and, finally, the publication of outcomes. Decision making and continuous training thus become part of an effective strategy for improving medical care.

  5. Data mining and intelligent queries in a knowledge-based multimedia medical database system

    NASA Astrophysics Data System (ADS)

    Zhang, Shuhua; Coleman, John D.

    2000-04-01

    Multimedia medical databases have accumulated large quantities of data and information about patients and their medical conditions. Patterns and relationships within this data could provide new knowledge for making better medical decisions. Unfortunately, few technologies have been developed and applied to discover and use this hidden knowledge. We are currently developing a next generation knowledge-based multimedia medical database, named MedBase, with advanced behaviors for data analysis and data fusion. As part of this R&D effort, a knowledge-rich data model is constructed to incorporate data mining techniques/tools to assist the building of medical knowledge bases, and to facilitate intelligent answering of users' investigative and knowledge queries in the database. Techniques such as data generalization, classification, clustering, semantic structures, and concept hierarchies, are used to acquire and represent both symbolic and spatial knowledge implicit in the database. With the availability of semantic structures, concept hierarchies and generalized knowledge, queries may be posed and answered at multiple levels of abstraction. In this article we provide a general description of the approaches and efforts undertaken so far in the MedBase project.
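    The data-generalization idea described above, answering queries at multiple levels of abstraction by rolling records up a concept hierarchy, can be illustrated with a toy sketch. The hierarchy and record values below are hypothetical, not MedBase's actual model.

```python
# Hypothetical concept hierarchy: specific finding -> parent concept.
HIERARCHY = {
    "ductal carcinoma": "carcinoma",
    "lobular carcinoma": "carcinoma",
    "carcinoma": "neoplasm",
    "fibroadenoma": "benign tumor",
    "benign tumor": "neoplasm",
}

def generalize(term, level):
    """Walk `level` steps up the concept hierarchy."""
    for _ in range(level):
        term = HIERARCHY.get(term, term)
    return term

def count_by_concept(records, level):
    """Answer a count query at the chosen abstraction level."""
    counts = {}
    for diagnosis in records:
        concept = generalize(diagnosis, level)
        counts[concept] = counts.get(concept, 0) + 1
    return counts

records = ["ductal carcinoma", "lobular carcinoma", "fibroadenoma"]
print(count_by_concept(records, 1))  # {'carcinoma': 2, 'benign tumor': 1}
print(count_by_concept(records, 2))  # {'neoplasm': 3}
```

    The same query posed at different levels yields progressively more generalized answers, which is the behavior the abstract describes for investigative queries.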

  6. Big data analytics in immunology: a knowledge-based approach.

    PubMed

    Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into an analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we enabled streamlining of the normally time-consuming process of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow.

  7. Supporting the design of translational clinical studies through the generation and verification of conceptual knowledge-anchored hypotheses.

    PubMed

    Payne, Philip R O; Borlawsky, Tara B; Kwok, Alan; Greaves, Andrew W

    2008-11-06

    The ability to generate hypotheses based upon the contents of large-scale, heterogeneous data sets is critical to the design of translational clinical studies. In previous reports, we have described the application of a conceptual knowledge engineering technique, known as constructive induction (CI), in order to satisfy such needs. However, one of the major limitations of this method is the need to engage multiple subject matter experts to verify potential hypotheses generated using CI. In this manuscript, we describe an alternative verification technique that leverages published biomedical literature abstracts. Our report is framed in the context of an ongoing project to generate hypotheses related to the contents of a translational research data repository maintained by the CLL Research Consortium. Such hypotheses are intended to inform the design of prospective clinical studies that can elucidate the relationships that may exist between biomarkers and patient phenotypes.

  8. A knowledge-based approach to automated flow-field zoning for computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Vogel, Alison Andrews

    1989-01-01

    An automated three-dimensional zonal grid generation capability for computational fluid dynamics is shown through the development of a demonstration computer program capable of automatically zoning the flow field of representative two-dimensional (2-D) aerodynamic configurations. The applicability of a knowledge-based programming approach to the domain of flow-field zoning is examined. Several aspects of flow-field zoning make the application of knowledge-based techniques challenging: the need for perceptual information, the role of individual bias in the design and evaluation of zonings, and the fact that the zoning process is modeled as a constructive, design-type task (for which there are relatively few examples of successful knowledge-based systems in any domain). Engineering solutions to the problems arising from these aspects are developed, and a demonstration system is implemented which can design, generate, and output flow-field zonings for representative 2-D aerodynamic configurations.

  9. Ultrafast quantum random number generation based on quantum phase fluctuations.

    PubMed

    Xu, Feihu; Qi, Bing; Ma, Xiongfeng; Xu, He; Zheng, Haoxuan; Lo, Hoi-Kwong

    2012-05-21

    A quantum random number generator (QRNG) can generate true randomness by exploiting the fundamental indeterminism of quantum mechanics. Most approaches to QRNG employ single-photon detection technologies and are limited in speed. Here, we experimentally demonstrate an ultrafast QRNG at a rate over 6 Gbits/s based on the quantum phase fluctuations of a laser operating near threshold. Moreover, we consider a potential adversary who has partial knowledge of the raw data and discuss how one can rigorously remove such partial knowledge with postprocessing. We quantify the quantum randomness through min-entropy by modeling our system, and employ two randomness extractors, Trevisan's extractor and Toeplitz hashing, to distill the randomness in an information-theoretically provable way. The simplicity and high speed of our experimental setup show the feasibility of a robust, low-cost, high-speed QRNG.
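    Toeplitz hashing, one of the two extractors named in the abstract, is straightforward to sketch: the raw bit vector is multiplied over GF(2) by a random Toeplitz matrix, which is fully determined by n + m - 1 seed bits because every diagonal is constant. This is an illustrative implementation, not the authors' code; in practice the output length must be chosen from the measured min-entropy via the leftover hash lemma.

```python
import numpy as np

def toeplitz_extract(raw_bits, seed_bits, out_len):
    """Toeplitz-hashing extractor: multiply raw bits by a seeded
    Toeplitz matrix over GF(2) to distill near-uniform output bits."""
    n = len(raw_bits)
    assert len(seed_bits) == n + out_len - 1, "seed fixes every diagonal"
    i = np.arange(out_len)[:, None]
    j = np.arange(n)[None, :]
    T = seed_bits[i - j + n - 1]   # entry (i, j) depends only on i - j
    return T.dot(raw_bits) % 2     # matrix-vector product mod 2

rng = np.random.default_rng(1)
raw = rng.integers(0, 2, size=64)            # raw, partially random bits
seed = rng.integers(0, 2, size=64 + 16 - 1)  # uniformly random seed
key = toeplitz_extract(raw, seed, out_len=16)
print(key.tolist())
```

    Compressing 64 raw bits down to 16 here is an arbitrary illustrative ratio; the real ratio follows from the min-entropy estimate of the phase-fluctuation source.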

  10. Sociopathic Knowledge Bases: Correct Knowledge Can Be Harmful Even Given Unlimited Computation

    DTIC Science & Technology

    1989-08-01

    University of Illinois, Urbana, IL 61801. Monitoring organization: Artificial Intelligence (Code 1133). August 1989. Submitted for publication: Artificial Intelligence Journal. From the introduction: Reasoning under uncertainty has been widely investigated in artificial intelligence. Probabilistic approaches are of particular relevance.

  11. Using affective knowledge to generate and validate a set of emotion-related, action words

    PubMed Central

    Havelka, Jelena; Brown, Charity; Giner-Sorolla, Roger

    2015-01-01

    Emotion concepts are built through situated experience. Abstract word meaning is grounded in this affective knowledge, giving words the potential to evoke emotional feelings and reactions (e.g., Vigliocco et al., 2009). In the present work we explore whether words differ in the extent to which they evoke ‘specific’ emotional knowledge. Using a categorical approach, in which an affective ‘context’ is created, it is possible to assess whether words proportionally activate knowledge relevant to different emotional states (e.g., ‘sadness’, ‘anger’, Stevenson, Mikels & James, 2007a). We argue that this method may be particularly effective when assessing the emotional meaning of action words (e.g., Schacht & Sommer, 2009). In study 1 we use a constrained feature generation task to derive a set of action words that participants associated with six, basic emotional states (see full list in Appendix S1). Generation frequencies were taken to indicate the likelihood that the word would evoke emotional knowledge relevant to the state to which it had been paired. In study 2 a rating task was used to assess the strength of association between the six most frequently generated, or ‘typical’, action words and corresponding emotion labels. Participants were presented with a series of sentences, in which action words (typical and atypical) and labels were paired e.g., “If you are feeling ‘sad’ how likely would you be to act in the following way?” … ‘cry.’ Findings suggest that typical associations were robust. Participants always gave higher ratings to typical vs. atypical action word and label pairings, even when (a) rating direction was manipulated (the label or verb appeared first in the sentence), and (b) the typical behaviours were to be performed by the rater themselves, or others. Our findings suggest that emotion-related action words vary in the extent to which they evoke knowledge relevant for different emotional states. When measuring

  12. The latent structure of secure base script knowledge.

    PubMed

    Waters, Theodore E A; Fraley, R Chris; Groh, Ashley M; Steele, Ryan D; Vaughn, Brian E; Bost, Kelly K; Veríssimo, Manuela; Coppola, Gabrielle; Roisman, Glenn I

    2015-06-01

    There is increasing evidence that attachment representations abstracted from childhood experiences with primary caregivers are organized as a cognitive script describing secure base use and support (i.e., the secure base script). To date, however, the latent structure of secure base script knowledge has gone unexamined, despite the fact that such basic information about the factor structure and distributional properties of these individual differences has important conceptual implications for our understanding of how representations of early experience are organized and generalized, as well as methodological significance in relation to maximizing statistical power and precision. In this study, we report factor and taxometric analyses that examined the latent structure of secure base script knowledge in 2 large samples. Results suggested that variation in secure base script knowledge, as measured by both the adolescent (N = 674) and adult (N = 714) versions of the Attachment Script Assessment, is generalized across relationships and continuously distributed.

  13. Effects of Delays on 6-Year-Old Children's Self-Generation and Retention of Knowledge through Integration

    ERIC Educational Resources Information Center

    Varga, Nicole L.; Bauer, Patricia J.

    2013-01-01

    The current research was an investigation of the effect of delay on self-generation and retention of knowledge derived through integration by 6-year-old children. Children were presented with novel facts from passages read aloud to them (i.e., "stem" facts) and tested for self-generation of new knowledge through integration of the facts. In…

  14. Extensible knowledge-based architecture for segmenting CT data

    NASA Astrophysics Data System (ADS)

    Brown, Matthew S.; McNitt-Gray, Michael F.; Goldin, Jonathan G.; Aberle, Denise R.

    1998-06-01

    A knowledge-based system has been developed for segmenting computed tomography (CT) images. Its modular architecture includes an anatomical model, image processing engine, inference engine and blackboard. The model contains a priori knowledge of size, shape, X-ray attenuation and relative position of anatomical structures. This knowledge is used to constrain low-level segmentation routines. Model-derived constraints and segmented image objects are both transformed into a common feature space and posted on the blackboard. The inference engine then matches image to model objects, based on the constraints. The transformation to feature space allows the knowledge and image data representations to be independent. Thus a high-level model can be used, with data being stored in a frame-based semantic network. This modularity and explicit representation of knowledge allows for straightforward system extension. We initially demonstrate an application to lung segmentation in thoracic CT, with subsequent extension of the knowledge-base to include tumors within the lung fields. The anatomical model was later augmented to include basic brain anatomy including the skull and blood vessels, to allow automatic segmentation of vascular structures in CT angiograms for 3D rendering and visualization.
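    The constraint-matching step described above, checking segmented image objects against model-derived feature ranges in a common feature space, can be sketched as follows. The anatomical names and value ranges are invented for illustration, not taken from the actual system's model.

```python
# Hypothetical anatomical model: each object constrains features of a
# candidate region to an allowed (low, high) range.
MODEL = {
    "lung":  {"attenuation": (-1000, -400), "area": (5000, 40000)},
    "tumor": {"attenuation": (-100, 200),   "area": (50, 5000)},
}

def match(region):
    """Return the model objects whose constraints the region satisfies."""
    return [name for name, constraints in MODEL.items()
            if all(lo <= region[feature] <= hi
                   for feature, (lo, hi) in constraints.items())]

# A segmented region, already reduced to the shared feature space
region = {"attenuation": -700, "area": 12000}
print(match(region))  # ['lung']
```

    Because both the model and the low-level segmentation results are expressed in the same feature space, extending the system (e.g., adding brain structures) only means adding entries to the model, which is the modularity the abstract emphasizes.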

  15. Adults' Autonomic and Subjective Emotional Responses to Infant Vocalizations: The Role of Secure Base Script Knowledge

    ERIC Educational Resources Information Center

    Groh, Ashley M.; Roisman, Glenn I.

    2009-01-01

    This article examines the extent to which secure base script knowledge--as reflected in an adult's ability to generate narratives in which attachment-related threats are recognized, competent help is provided, and the problem is resolved--is associated with adults' autonomic and subjective emotional responses to infant distress and nondistress…

  16. Pneumatic tire-based piezoelectric power generation

    NASA Astrophysics Data System (ADS)

    Makki, Noaman; Pop-Iliev, Remon

    2011-03-01

    Plug-in Hybrid Electric Vehicles (PHEVs) and Extended Range Electric Vehicles (EREVs) currently rely mainly on Internal Combustion Engines (ICE) utilizing conventional fuels to recharge batteries in order to extend their range. Even though piezo-based power generation devices harvesting vibration energy have surfaced in recent years, their output has only been sufficient to power sensors and other such small devices. The need for a cleaner power generation technique remains. This paper investigates the possibility of using piezoceramics for power generation within the vehicle's wheel assembly by exploiting the rotational motion of the wheel and the continuously variable contact point between the pneumatic tire and the road.

  17. Error Generation in CATS-Based Agents

    NASA Technical Reports Server (NTRS)

    Callantine, Todd

    2003-01-01

    This research presents a methodology for generating errors from a model of nominally preferred correct operator activities, given a particular operational context, and maintaining an explicit link to the erroneous contextual information to support analyses. It uses the Crew Activity Tracking System (CATS) model as the basis for error generation. This report describes how the process works, and how it may be useful for supporting agent-based system safety analyses. The report presents results obtained by applying the error-generation process and discusses implementation issues. The research is supported by the System-Wide Accident Prevention Element of the NASA Aviation Safety Program.

  18. Knowledge-based fault diagnosis system for refuse collection vehicle

    NASA Astrophysics Data System (ADS)

    Tan, CheeFai; Juffrizal, K.; Khalil, S. N.; Nidzamuddin, M. Y.

    2015-05-01

    The refuse collection vehicle is manufactured by a local vehicle body manufacturer. Currently, the company supplies six models of the waste compactor truck to the local authority as well as to waste management companies. The company faces difficulty acquiring knowledge from its expert when the expert is absent. To solve this problem, the expert's knowledge can be stored in an expert system, which can provide the necessary support to the company when the expert is not available. The implementation of the process and tool can thereby be standardized and made more accurate. The knowledge input to the expert system is based on design guidelines and experience from the expert. This project highlights another application of the knowledge-based system (KBS) approach, in troubleshooting the refuse collection vehicle production process. The main aim of the research is to develop a novel expert fault diagnosis system framework for the refuse collection vehicle.

  20. Managing Project Landscapes in Knowledge-Based Enterprises

    NASA Astrophysics Data System (ADS)

    Stantchev, Vladimir; Franke, Marc Roman

    Knowledge-based enterprises are typically conducting a large number of research and development projects simultaneously. This is a particularly challenging task in complex and diverse project landscapes. Project Portfolio Management (PPM) can be a viable framework for knowledge and innovation management in such landscapes. A standardized process with defined functions such as project data repository, project assessment, selection, reporting, and portfolio reevaluation can serve as a starting point. In this work we discuss the benefits a multidimensional evaluation framework can provide for knowledge-based enterprises. Furthermore, we describe a knowledge and learning strategy and process in the context of PPM and evaluate their practical applicability at different stages of the PPM process.

  1. Evaluation of database technologies for the CTBT Knowledge Base prototype

    SciTech Connect

    Keyser, R.; Shepard-Dombroski, E.; Baur, D.; Hipp, J.; Moore, S.; Young, C.; Chael, E.

    1996-11-01

    This document examines a number of different software technologies in the rapidly changing field of database management systems, evaluates these systems in light of the expected needs of the Comprehensive Test Ban Treaty (CTBT) Knowledge Base, and makes some recommendations for the initial prototypes of the Knowledge Base. The Knowledge Base requirements are examined and then used as criteria for evaluation of the database management options. A mock-up of the data expected in the Knowledge Base is used as a basis for examining how four different database technologies deal with the problems of storing and retrieving the data. Based on these requirements and the results of the evaluation, the recommendation is that the Illustra database be considered for the initial prototype of the Knowledge Base. Illustra offers a unique blend of performance, flexibility, and features that will aid in the implementation of the prototype. At the same time, Illustra provides a high level of compatibility with the hardware and software environments present at the US NDC (National Data Center) and the PIDC (Prototype International Data Center).

  2. Knowledge-based engineering of a PLC controlled telescope

    NASA Astrophysics Data System (ADS)

    Pessemier, Wim; Raskin, Gert; Saey, Philippe; Van Winckel, Hans; Deconinck, Geert

    2016-08-01

    As the new control system of the Mercator Telescope is being finalized, we can review some technologies and design methodologies that are advantageous, despite their relative uncommonness in astronomical instrumentation. Particular for the Mercator Telescope is that it is controlled by a single high-end soft-PLC (Programmable Logic Controller). Using off-the-shelf components only, our distributed embedded system controls all subsystems of the telescope such as the pneumatic primary mirror support, the hydrostatic bearing, the telescope axes, the dome, the safety system, and so on. We show how real-time application logic can be written conveniently in typical PLC languages (IEC 61131-3) and in C++ (to implement the pointing kernel) using the commercial TwinCAT 3 programming environment. This software processes the inputs and outputs of the distributed system in real-time via an observatory-wide EtherCAT network, which is synchronized with high precision to an IEEE 1588 (PTP, Precision Time Protocol) time reference clock. Taking full advantage of the ability of soft-PLCs to run both real-time and non real-time software, the same device also hosts the most important user interfaces (HMIs or Human Machine Interfaces) and communication servers (OPC UA for process data, FTP for XML configuration data, and VNC for remote control). To manage the complexity of the system and to streamline the development process, we show how most of the software, electronics and systems engineering aspects of the control system have been modeled as a set of scripts written in a Domain Specific Language (DSL). When executed, these scripts populate a Knowledge Base (KB) which can be queried to retrieve specific information. By feeding the results of those queries to a template system, we were able to generate very detailed "browsable" web-based documentation about the system, but also PLC software code, Python client code, model verification reports, etc. The aim of this paper is to
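    The DSL-to-knowledge-base-to-template pipeline described above can be miniaturized in a few lines. All names here are hypothetical; the actual system models electronics and systems-engineering aspects and generates PLC code and web documentation rather than the Python stubs shown.

```python
# Knowledge base populated by executing declarative "DSL" scripts.
KB = []

def device(name, bus, address):
    """DSL primitive: declaring a device posts a fact to the knowledge base."""
    KB.append({"name": name, "bus": bus, "address": address})

# The "script": pure declarations, no control logic.
device("dome_motor",   bus="EtherCAT", address=0x11)
device("mirror_valve", bus="EtherCAT", address=0x12)

def query(**criteria):
    """Retrieve every fact matching all given field criteria."""
    return [d for d in KB if all(d[k] == v for k, v in criteria.items())]

# Template system consuming query results (here: a Python client stub).
stub = "\n".join(
    f"def read_{d['name']}():  # auto-generated for {d['bus']} node 0x{d['address']:02x}\n"
    f"    return read_register(0x{d['address']:02x})"
    for d in query(bus="EtherCAT"))
print(stub)
```

    The design benefit is that one declarative source feeds many generators (code, documentation, verification reports), so the artifacts can never drift out of sync with each other.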

  3. Knowledge discovery based on experiential learning corporate culture management

    NASA Astrophysics Data System (ADS)

    Tu, Kai-Jan

    2014-10-01

    A good corporate culture based on humanistic theory can make the enterprise's management very effective, giving all of the enterprise's members strong cohesion and a shared sense of purpose. With an experiential learning model, the enterprise can establish an enthusiastic learning spirit in its corporate culture, gain the innovation ability needed for positive knowledge growth, and meet fierce global marketing competition. A case study of Trend's corporate culture offers proof of an industry knowledge growth rate equation as the contribution of experiential learning corporate culture management.

  4. Knowledge management: An abstraction of knowledge base and database management systems

    NASA Technical Reports Server (NTRS)

    Riedesel, Joel D.

    1990-01-01

    Artificial intelligence application requirements demand powerful representation capabilities as well as efficiency for real-time domains. Many tools exist, the most prevalent being expert systems tools such as ART, KEE, OPS5, and CLIPS. Other tools just emerging from the research environment are truth maintenance systems for representing non-monotonic knowledge, constraint systems, object oriented programming, and qualitative reasoning. Unfortunately, as many knowledge engineers have experienced, simply applying a tool to an application requires a large amount of effort to bend the application to fit, and much supporting work goes into making the tool integrate effectively. A Knowledge Management Design System (KNOMAD) is described, which is a collection of tools built in layers. The layered architecture provides two major benefits: the ability to flexibly apply only those tools that are necessary for an application, and the ability to keep overhead, and thus inefficiency, to a minimum. KNOMAD is designed to manage many knowledge bases in a distributed environment, providing maximum flexibility and expressivity to the knowledge engineer while also providing support for efficiency.

  5. Arranging ISO 13606 archetypes into a knowledge base.

    PubMed

    Kopanitsa, Georgy

    2014-01-01

    To enable the efficient reuse of standards-based medical data, we propose to develop a higher-level information model that will complement the archetype model of ISO 13606. This model will make use of the relationships that are specified in UML to connect medical archetypes into a knowledge base within a repository. UML connectors were analyzed for their ability to be applied in the implementation of a higher-level model that will establish relationships between archetypes. An information model was developed using XML Schema notation. The model allows linking different archetypes of one repository into a knowledge base. Presently it supports several relationships and will be advanced in the future.

  6. Intelligent technique for knowledge reuse of dental medical records based on case-based reasoning.

    PubMed

    Gu, Dong-Xiao; Liang, Chang-Yong; Li, Xing-Guo; Yang, Shan-Lin; Zhang, Pei

    2010-04-01

    With the rapid development of both information technology and the management of modern medical regulation, the generation of medical records tends to be increasingly intelligent. In this paper, case-based reasoning is applied to the process of generating records of dental cases. Based on an analysis of the features of dental records, a case base is constructed. A mixed case retrieval method (FAIES) is proposed for the knowledge reuse of dental records, adopting fuzzy mathematics, an improved similarity algorithm based on Euclidean-Lagrangian distance, and a PULL & PUSH weight adjustment strategy. Finally, an intelligent system for generating dental case records (CBR-DENT) is constructed. The effectiveness of the system, the efficiency of the retrieval method, the extent of adaptation and the adaptation efficiency are tested using the constructed case base. It is demonstrated that FAIES is very effective in terms of reducing the time needed to write medical records and improving their efficiency and quality. FAIES is also proven to be an effective aid for diagnosis and provides a new idea for the management of medical records and its applications.
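    The retrieval and weight-adjustment ideas can be sketched as below. The feature names, values, and the exact form of the PULL & PUSH update are illustrative assumptions for a generic case-based-reasoning loop, not FAIES itself.

```python
import math

def similarity(case, query, weights):
    """Weighted Euclidean similarity between two feature dicts, in (0, 1]."""
    d = math.sqrt(sum(w * (case[k] - query[k]) ** 2 for k, w in weights.items()))
    return 1.0 / (1.0 + d)

def retrieve(cases, query, weights):
    """Return the stored case most similar to the query."""
    return max(cases, key=lambda c: similarity(c["features"], query, weights))

def pull_push(weights, retrieved_ok, query, case, rate=0.1):
    """Sketch of a PULL & PUSH adjustment: reward features that matched on a
    correct retrieval, penalize them on an incorrect one."""
    for k in weights:
        delta = rate * (1.0 - abs(case[k] - query[k]))
        weights[k] += delta if retrieved_ok else -delta
    return weights

cases = [{"id": "caries",     "features": {"pain": 0.8, "swelling": 0.1}},
         {"id": "gingivitis", "features": {"pain": 0.3, "swelling": 0.9}}]
weights = {"pain": 1.0, "swelling": 1.0}
query = {"pain": 0.9, "swelling": 0.2}
best = retrieve(cases, query, weights)
print(best["id"])  # caries
```

    After each retrieval, feedback on whether the retrieved record was actually reused drives `pull_push`, so the feature weights gradually reflect which features discriminate well in the case base.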

  7. Category vs. Object Knowledge in Category-Based Induction

    ERIC Educational Resources Information Center

    Murphy, Gregory L.; Ross, Brian H.

    2010-01-01

    In one form of category-based induction, people make predictions about unknown properties of objects. There is a tension between predictions made based on the object's specific features (e.g., objects above a certain size tend not to fly) and those made by reference to category-level knowledge (e.g., birds fly). Seven experiments with artificial…

  8. Malaysia Transitions toward a Knowledge-Based Economy

    ERIC Educational Resources Information Center

    Mustapha, Ramlee; Abdullah, Abu

    2004-01-01

    The emergence of a knowledge-based economy (k-economy) has spawned a "new" notion of workplace literacy, changing the relationship between employers and employees. The traditional covenant where employees expect a stable or lifelong employment will no longer apply. The retention of employees will most probably be based on their skills…

  9. Dynamic Strategic Planning in a Professional Knowledge-Based Organization

    ERIC Educational Resources Information Center

    Olivarius, Niels de Fine; Kousgaard, Marius Brostrom; Reventlow, Susanne; Quelle, Dan Grevelund; Tulinius, Charlotte

    2010-01-01

    Professional, knowledge-based institutions have a particular form of organization and culture that makes special demands on the strategic planning supervised by research administrators and managers. A model for dynamic strategic planning based on a pragmatic utilization of the multitude of strategy models was used in a small university-affiliated…

  10. Developing and Assessing Teachers' Knowledge of Game-Based Learning

    ERIC Educational Resources Information Center

    Shah, Mamta; Foster, Aroutis

    2015-01-01

    Research focusing on the development and assessment of teacher knowledge in game-based learning is in its infancy. A mixed-methods study was undertaken to educate pre-service teachers in game-based learning using the Game Network Analysis (GaNA) framework. Fourteen pre-service teachers completed a methods course, which prepared them in game…

  11. Intelligent Tools for Planning Knowledge base Development and Verification

    NASA Technical Reports Server (NTRS)

    Chien, Steve A.

    1996-01-01

    A key obstacle hampering fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must be able to compare favorably in terms of software lifecycle costs to other means of automation such as scripts or rule-based expert systems.

  12. KBGIS-II: A knowledge-based geographic information system

    NASA Technical Reports Server (NTRS)

    Smith, Terence; Peuquet, Donna; Menon, Sudhakar; Agarwal, Pankaj

    1986-01-01

    The architecture and working of a recently implemented Knowledge-Based Geographic Information System (KBGIS-II), designed to satisfy several general criteria for the GIS, is described. The system has four major functions including query-answering, learning and editing. The main query finds constrained locations for spatial objects that are describable in a predicate-calculus based spatial object language. The main search procedures include a family of constraint-satisfaction procedures that use a spatial object knowledge base to search efficiently for complex spatial objects in large, multilayered spatial data bases. These data bases are represented in quadtree form. The search strategy is designed to reduce the computational cost of search in the average case. The learning capabilities of the system include the addition of new locations of complex spatial objects to the knowledge base as queries are answered, and the ability to learn inductively definitions of new spatial objects from examples. The new definitions are added to the knowledge base by the system. The system is performing all its designated tasks successfully. Future reports will relate performance characteristics of the system.

  13. KBGIS-2: A knowledge-based geographic information system

    NASA Technical Reports Server (NTRS)

    Smith, T.; Peuquet, D.; Menon, S.; Agarwal, P.

    1986-01-01

    The architecture and working of a recently implemented knowledge-based geographic information system (KBGIS-2) that was designed to satisfy several general criteria for the geographic information system are described. The system has four major functions, including query-answering, learning, and editing. The main query finds constrained locations for spatial objects that are describable in a predicate-calculus based spatial objects language. The main search procedures include a family of constraint-satisfaction procedures that use a spatial object knowledge base to search efficiently for complex spatial objects in large, multilayered spatial data bases. These data bases are represented in quadtree form. The search strategy is designed to reduce the computational cost of search in the average case. The learning capabilities of the system include the addition of new locations of complex spatial objects to the knowledge base as queries are answered, and the ability to learn inductively definitions of new spatial objects from examples. The new definitions are added to the knowledge base by the system. The system is currently performing all its designated tasks successfully, although implemented on inadequate hardware. Future reports will detail the performance characteristics of the system, and various new extensions are planned in order to enhance the power of KBGIS-2.
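
    The quadtree-backed search the two abstracts above describe can be illustrated with a small sketch. This is not the KBGIS-2 implementation; the grid, function names, and pruning rule below are invented to show why uniform quadrants let a spatial query skip most of a multilayered data base in the average case.

```python
# Illustrative region-quadtree pruning, in the spirit of KBGIS-2's
# quadtree-backed constraint satisfaction. All names are hypothetical.

def build_quadtree(grid, x, y, size):
    """Recursively summarize a 2^k x 2^k boolean grid.
    A node is True/False if the quadrant is uniform, else a 4-tuple."""
    if size == 1:
        return grid[y][x]
    h = size // 2
    kids = (build_quadtree(grid, x, y, h),
            build_quadtree(grid, x + h, y, h),
            build_quadtree(grid, x, y + h, h),
            build_quadtree(grid, x + h, y + h, h))
    if all(k is True for k in kids):
        return True
    if all(k is False for k in kids):
        return False
    return kids

def any_feature(node, x, y, size, qx0, qy0, qx1, qy1):
    """Does the query window intersect any occupied cell?
    Uniform nodes answer without descending -- the pruning that
    keeps average-case search cost low."""
    if qx1 <= x or qy1 <= y or x + size <= qx0 or y + size <= qy0:
        return False          # window misses this quadrant entirely
    if node is True:
        return True           # uniform occupied quadrant: no descent
    if node is False:
        return False          # uniform empty quadrant: no descent
    h = size // 2
    return (any_feature(node[0], x, y, h, qx0, qy0, qx1, qy1) or
            any_feature(node[1], x + h, y, h, qx0, qy0, qx1, qy1) or
            any_feature(node[2], x, y + h, h, qx0, qy0, qx1, qy1) or
            any_feature(node[3], x + h, y + h, h, qx0, qy0, qx1, qy1))

grid = [[False] * 8 for _ in range(8)]
grid[5][6] = True            # one occupied cell at (x=6, y=5)
tree = build_quadtree(grid, 0, 0, 8)
print(any_feature(tree, 0, 0, 8, 4, 4, 8, 8))   # window covers (6, 5)
print(any_feature(tree, 0, 0, 8, 0, 0, 4, 4))   # empty quarter
```

    On a sparse layer most quadrants collapse to a uniform False node, so a query touches only the branches its window overlaps.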

  14. Coal based electric generation comparative technologies report

    SciTech Connect

    Not Available

    1989-10-26

    Ohio Clean Fuels, Inc. (OCF) has licensed technology that involves Co-Processing (Co-Pro) poor-grade (high-sulfur) coal and residual oil feedstocks to produce clean liquid fuels on a commercial scale. Stone & Webster was requested to perform a comparative technologies report for grassroots plants utilizing coal as a base fuel. In the case of Co-Processing technology, the plant considered is the nth plant in a series of applications. This report presents the results of an economic comparison of this technology with other power generation technologies that use coal. Technologies evaluated were: Co-Processing integrated with simple cycle combustion turbine generators (CSC); Co-Processing integrated with combined cycle combustion turbine generators (CCC); pulverized coal-fired boiler with flue gas desulfurization and steam turbine generator (PC); and circulating fluidized bed boiler and steam turbine generator (CFB). Conceptual designs were developed. Designs were based on approximately equivalent net electrical output for each technology. A base case of 310 MWe net for each technology was established. Sensitivity analyses at other net electrical output sizes varying from 220 MWe to 1770 MWe were also performed. 4 figs., 9 tabs.

  15. Desperately seeking data: knowledge base-database links.

    PubMed Central

    Hripcsak, G.; Johnson, S. B.; Clayton, P. D.

    1993-01-01

    Linking a knowledge-based system (KBS) to a clinical database is a difficult task, but critical if such systems are to achieve widespread use. The Columbia-Presbyterian Medical Center's clinical event monitor provides alerts, interpretations, research screening, and quality assurance functions for the center. Its knowledge base consists of Arden Syntax Medical Logic Modules (MLMs). The knowledge base was analyzed in order to quantify the use and impact of KBS-database links. The MLM data slot, which contains the definition of these links, had almost as many statements (5.8 vs. 8.8, ns with p = 0.15) and more tokens (122 vs. 76, p = 0.037) than the logic slot, which contains the actual medical knowledge. The data slot underwent about twice as many modifications over time as the logic slot (3.0 vs. 1.6 modifications/version, p = 0.010). Database queries and updates accounted for 97.2% of the MLM's total elapsed execution time. Thus, KBS-database links consume substantial resources in an MLM knowledge base, in terms of coding, maintenance, and performance. PMID:8130552

  16. The browser prototype for the CTBT knowledge base

    SciTech Connect

    Armstrong, H.M.; Keyser, R.G.

    1997-07-02

    As part of the United States Department of Energy's (DOE) Comprehensive Test Ban Treaty (CTBT) research and development effort, a Knowledge Base is being developed. This Knowledge Base will store the regional geophysical research results as well as geographic contextual information and make this information available to the Automated Data Processing (ADP) routines as well as to the human analysts involved in CTBT monitoring. This paper focuses on the initial development of a browser prototype to be used to interactively examine the contents of the CTBT Knowledge Base. The browser prototype is intended to be a research tool for experimenting with different ways to display and integrate the datasets. An initial prototype version has been developed using Environmental Systems Research Incorporated's (ESRI) ARC/INFO Geographic Information System (GIS) product. The conceptual requirements, design, initial implementation, current status, and future work plans are discussed. 4 refs., 2 figs.

  17. Assessing an AI knowledge-base for asymptomatic liver diseases.

    PubMed

    Babic, A; Mathiesen, U; Hedin, K; Bodemar, G; Wigertz, O

    1998-01-01

    Discovering previously unseen knowledge in clinical data is of importance in the field of asymptomatic liver diseases. Avoiding liver biopsy, which is used as the ultimate confirmation of diagnosis, by basing the decision on relevant laboratory findings alone would be an essential form of support. A system based on Quinlan's ID3 algorithm was simple and efficient in extracting the sought knowledge. The basic principles of applying such AI systems are therefore described and complemented with a medical evaluation. Some of the diagnostic rules were found to be useful as decision algorithms, i.e., they could be directly applied in clinical work and made part of the knowledge base of the Liver Guide, an automated decision support system.
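
    The core of ID3, the algorithm the abstract credits with extracting diagnostic rules from laboratory findings, is selecting the attribute with the highest information gain at each split. The following sketch shows only that selection step; the toy lab findings, attribute names, and labels are invented for illustration and carry no clinical meaning.

```python
# Minimal sketch of ID3-style attribute selection by information gain.
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, attr):
    """Expected entropy reduction from splitting on one attribute."""
    gain, n = entropy(labels), len(rows)
    for value in set(r[attr] for r in rows):
        subset = [l for r, l in zip(rows, labels) if r[attr] == value]
        gain -= len(subset) / n * entropy(subset)
    return gain

# Invented toy data: which attribute best separates the two outcomes?
rows = [{"alt": "high", "age": "old"},
        {"alt": "high", "age": "young"},
        {"alt": "normal", "age": "old"},
        {"alt": "normal", "age": "young"}]
labels = ["biopsy", "biopsy", "no-biopsy", "no-biopsy"]

best = max(["alt", "age"], key=lambda a: info_gain(rows, labels, a))
print(best)  # "alt" splits the labels perfectly, so ID3 picks it first
```

    ID3 then recurses on each branch until the labels in a branch are pure, and each root-to-leaf path reads off as a rule of the kind the Liver Guide could store.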

  18. A standard based approach for biomedical knowledge representation.

    PubMed

    Farkash, Ariel; Neuvirth, Hani; Goldschmidt, Yaara; Conti, Costanza; Rizzi, Federica; Bianchi, Stefano; Salvi, Erika; Cusi, Daniele; Shabo, Amnon

    2011-01-01

    The new generation of health information standards, where the syntax and semantics of the content are explicitly formalized, allows for interoperability in healthcare scenarios and analysis in clinical research settings. Studies involving clinical and genomic data include accumulating knowledge as relationships between genotypic and phenotypic information as well as associations within the genomic and clinical worlds. Some involve analysis results targeted at a specific disease; others are of a predictive nature specific to a patient and may be used by decision support applications. Representing knowledge is as important as representing data since data is more useful when coupled with relevant knowledge. Any further analysis and cross-research collaboration would benefit from persisting knowledge and data in a unified way. This paper describes a methodology used in Hypergenes, an EC FP7 project targeting Essential Hypertension, which captures data and knowledge using standards such as HL7 CDA and Clinical Genomics, aligned with the CEN EHR 13606 specification. We demonstrate the benefits of such an approach for clinical research as well as in healthcare oriented scenarios.

  19. Structural topology design of container ship based on knowledge-based engineering and level set method

    NASA Astrophysics Data System (ADS)

    Cui, Jin-ju; Wang, De-yu; Shi, Qi-qi

    2015-06-01

    Knowledge-Based Engineering (KBE) is introduced into the ship structural design in this paper. From the implementation of KBE, the design solutions for both Rules Design Method (RDM) and Interpolation Design Method (IDM) are generated. The corresponding Finite Element (FE) models are generated. Topological design of the longitudinal structures is studied where the Gaussian Process (GP) is employed to build the surrogate model for FE analysis. Multi-objective optimization methods inspired by Pareto Front are used to reduce the design tank weight and outer surface area simultaneously. Additionally, an enhanced Level Set Method (LSM) which employs implicit algorithm is applied to the topological design of typical bracket plate which is used extensively in ship structures. Two different sets of boundary conditions are considered. The proposed methods show satisfactory efficiency and accuracy.
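
    The two-objective optimization described above (tank weight vs. outer surface area) rests on the notion of a Pareto front: designs that no other candidate beats on both objectives at once. A minimal sketch of that filter follows; the candidate design points are fabricated for illustration and are not from the paper.

```python
# Hedged sketch of Pareto-front filtering for two objectives,
# both to be minimized. Candidate (weight, area) pairs are invented.

def pareto_front(points):
    """Keep points not dominated by any other point
    (q dominates p if q is <= p in both objectives and q != p)."""
    front = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p
                        for q in points)
        if not dominated:
            front.append(p)
    return front

designs = [(100, 9), (90, 12), (95, 10), (110, 8), (120, 20)]
print(pareto_front(designs))  # (120, 20) is dominated by (100, 9)
```

    The optimizer in the paper searches for such non-dominated designs directly, with the Gaussian-process surrogate standing in for the expensive FE analysis when each candidate is evaluated.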

  20. Reducing a Knowledge-Base Search Space When Data Are Missing

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    This software addresses the problem of how to efficiently execute a knowledge base in the presence of missing data. Computationally, this is an exponentially expensive operation that without heuristics generates a search space of 1 + 2^n possible scenarios, where n is the number of rules in the knowledge base. Even for a knowledge base of the most modest size, say 16 rules, it would produce 65,537 possible scenarios. The purpose of this software is to reduce the complexity of this operation to a more manageable size. The problem that this system solves is to develop an automated approach that can reason in the presence of missing data. This is a meta-reasoning capability that repeatedly calls a diagnostic engine/model to provide prognoses and prognosis tracking. In the big picture, the scenario generator takes as its input the current state of a system, including probabilistic information from Data Forecasting. Using model-based reasoning techniques, it returns an ordered list of fault scenarios that could be generated from the current state, i.e., the plausible future failure modes of the system as it presently stands. The scenario generator models a Potential Fault Scenario (PFS) as a black box, the input of which is a set of states tagged with priorities and the output of which is one or more potential fault scenarios tagged by a confidence factor. The results from the system are used by a model-based diagnostician to predict the future health of the monitored system.
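
    The combinatorics in the abstract can be checked directly (1 + 2^16 = 65,537), and the kind of heuristic cut it motivates can be sketched: enumerate scenarios only while their joint likelihood clears a threshold. The per-rule firing probabilities and the threshold below are invented for illustration; this is not the NASA system's actual heuristic.

```python
# Back-of-the-envelope check of the search-space size, plus a toy
# probability-threshold cut over scenarios (subsets of fired rules).
from itertools import combinations

n = 16
print(1 + 2 ** n)  # 65537 scenarios for a 16-rule knowledge base

probs = [0.9, 0.8, 0.05, 0.02]   # hypothetical per-rule firing probabilities
threshold = 0.1
scenarios = []
for k in range(len(probs) + 1):
    for fired in combinations(range(len(probs)), k):
        p = 1.0
        for i in range(len(probs)):
            # joint likelihood under independence: fired rules contribute
            # probs[i], silent rules contribute 1 - probs[i]
            p *= probs[i] if i in fired else 1 - probs[i]
        if p >= threshold:
            scenarios.append((fired, p))
print(len(scenarios), "of", 2 ** len(probs), "scenarios survive the cut")
```

    Even this crude cut collapses the 16 scenarios of the 4-rule toy example to 2, which is the shape of saving the abstract is after, scaled up.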

  1. A knowledge-based system for prototypical reasoning

    NASA Astrophysics Data System (ADS)

    Lieto, Antonio; Minieri, Andrea; Piana, Alberto; Radicioni, Daniele P.

    2015-04-01

    In this work we present a knowledge-based system equipped with a hybrid, cognitively inspired architecture for the representation of conceptual information. The proposed system aims at extending the classical representational and reasoning capabilities of the ontology-based frameworks towards the realm of the prototype theory. It is based on a hybrid knowledge base, composed of a classical symbolic component (grounded on a formal ontology) with a typicality based one (grounded on the conceptual spaces framework). The resulting system attempts to reconcile the heterogeneous approach to the concepts in Cognitive Science with the dual process theories of reasoning and rationality. The system has been experimentally assessed in a conceptual categorisation task where common sense linguistic descriptions were given in input, and the corresponding target concepts had to be identified. The results show that the proposed solution substantially extends the representational and reasoning 'conceptual' capabilities of standard ontology-based systems.
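
    The typicality-based half of the hybrid architecture can be caricatured as nearest-prototype categorization in a conceptual space: a linguistic description becomes a point, and the system returns the concept whose prototype lies closest. The features, prototype values, and concepts below are invented stand-ins, not the authors' representation.

```python
# Toy nearest-prototype categorization in a conceptual space.
# Feature dimensions and prototype coordinates are hypothetical.
prototypes = {
    "bird": {"flies": 1.0, "size": 0.2, "lays_eggs": 1.0},
    "fish": {"flies": 0.0, "size": 0.3, "lays_eggs": 1.0},
    "dog":  {"flies": 0.0, "size": 0.5, "lays_eggs": 0.0},
}

def categorize(description):
    """Return the concept whose prototype is nearest (Euclidean)."""
    def dist(proto):
        return sum((proto[f] - v) ** 2 for f, v in description.items()) ** 0.5
    return min(prototypes, key=lambda c: dist(prototypes[c]))

# A common-sense description: small, flying, egg-laying animal.
print(categorize({"flies": 0.9, "size": 0.1, "lays_eggs": 1.0}))  # bird
```

    In the actual system this geometric, prototype-driven step is paired with classical ontology-based reasoning, mirroring the dual-process account the abstract cites.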

  2. Document Retrieval Using A Fuzzy Knowledge-Based System

    NASA Astrophysics Data System (ADS)

    Subramanian, Viswanath; Biswas, Gautam; Bezdek, James C.

    1986-03-01

    This paper presents the design and development of a prototype document retrieval system using a knowledge-based systems approach. Both the domain-specific knowledge base and the inferencing schemes are based on a fuzzy set theoretic framework. A query in natural language represents a request to retrieve a relevant subset of documents from a document base. Such a query, which can include both fuzzy terms and fuzzy relational operators, is converted into an unambiguous intermediate form by a natural language interface. Concepts that describe domain topics and the relationships between concepts, such as the synonym relation and the implication relation between a general concept and more specific concepts, have been captured in a knowledge base. The knowledge base enables the system to emulate the reasoning process followed by an expert, such as a librarian, in understanding and reformulating user queries. The retrieval mechanism processes the query in two steps. First it produces a pruned list of documents pertinent to the query. Second, it uses an evidence combination scheme to compute a degree of support between the query and individual documents produced in step one. The front-end component of the system then presents a set of document citations to the user in ranked order as an answer to the information request.
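
    The two-step retrieval the abstract describes (prune, then score a degree of support) can be sketched compactly. The thesaurus expansion, document weights, and max-membership scoring below are invented stand-ins for the paper's fuzzy knowledge base and evidence combination scheme.

```python
# Toy two-step fuzzy retrieval: concept expansion, pruning, then ranking
# by a degree of support. All data and names are hypothetical.

# Synonym/implication expansion, standing in for the concept knowledge base.
thesaurus = {"ai": {"ai", "expert systems", "knowledge base"}}

# Documents indexed by concept with fuzzy membership weights.
documents = {
    "d1": {"knowledge base": 0.9, "databases": 0.4},
    "d2": {"compilers": 0.8},
    "d3": {"expert systems": 0.6, "ai": 0.7},
}

def retrieve(query_term):
    concepts = thesaurus.get(query_term, {query_term})
    # Step 1: prune to documents indexed under at least one matching concept.
    pruned = {d: w for d, w in documents.items() if concepts & w.keys()}
    # Step 2: degree of support = max membership over matching concepts.
    return sorted(((max(w[c] for c in concepts & w.keys()), d)
                   for d, w in pruned.items()), reverse=True)

print(retrieve("ai"))  # [(0.9, 'd1'), (0.7, 'd3')]
```

    Expanding "ai" through the thesaurus lets d1 match on "knowledge base" even though the query never names it, which is the librarian-style query reformulation the paper emulates.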

  3. A knowledge-based information system for monitoring drug levels.

    PubMed

    Wiener, F; Groth, T; Mortimer, O; Hallquist, I; Rane, A

    1989-06-01

    The expert system shell SMR has been enhanced to include information system routines for designing data screens and providing facilities for data entry, storage, retrieval, queries and descriptive statistics. The data for inference making is abstracted from the data base record and inserted into a data array to which the knowledge base is applied to derive the appropriate advice and comments. The enhanced system has been used to develop an intelligent information system for monitoring serum drug levels which includes evaluation of temporal changes and production of specialized printed reports. The module for digoxin has been fully developed and validated. To demonstrate the extension to other drugs a module for phenytoin was constructed with only a rudimentary knowledge base. Data from the request forms together with the S-digoxin results are entered into the data base by the department secretary. The day's results are then reviewed by the clinical pharmacologist. For each case, previous results may be displayed and are taken into account by the system in the decision process. The knowledge base is applied to the data to formulate an evaluative comment on the report returned to the requestor. The report includes a semi-graphic presentation of the current and previous results and either the system's interpretation or one entered by the pharmacologist if he does not agree with it. The pharmacologist's comment is also recorded in the data base for future retrieval, analysis and possible updating of the knowledge base. The system is now undergoing testing and evaluation under routine operations in the clinical pharmacology service. It is a prototype for other applications in both laboratory and clinical medicine currently under development at Uppsala University Hospital. This system may thus provide a vehicle for a more intensive penetration of knowledge-based systems in practical medical applications.
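
    The pattern the abstract describes, applying a knowledge base to a data array abstracted from the record to produce an evaluative comment, can be sketched as a rule function over the current and previous result. The thresholds and wording below are illustrative only, not clinical guidance and not the SMR knowledge base.

```python
# Hypothetical rule-based comment generation for a serum drug level.
# Range and rise threshold are invented for illustration.
THERAPEUTIC = (0.8, 2.0)   # ng/mL, illustrative reference range

def comment(current, previous=None):
    """Classify a result against the range and the temporal trend."""
    lo, hi = THERAPEUTIC
    if current > hi:
        base = "above therapeutic range"
    elif current < lo:
        base = "below therapeutic range"
    else:
        base = "within therapeutic range"
    if previous is not None and current > previous * 1.5:
        base += "; marked rise since previous sample"
    return base

print(comment(2.4, previous=1.2))
```

    Routing such machine comments past the pharmacologist before they reach the report, and recording any override, is exactly the feedback loop the abstract proposes for updating the knowledge base.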

  4. The nature of students' science knowledge base: Using assessment to paint a picture

    NASA Astrophysics Data System (ADS)

    Gotwals, Amelia Wenk

    Goals in inquiry-based science include not only that students understand content knowledge, but also that students be able to utilize this knowledge in complex problem solving situations to work with tasks that involve skills such as interpreting data and formulating scientific explanations. In addition, advancements in the measurement sciences allow for sophisticated and complex ways to score and interpret student responses on assessment tasks. However, while many studies have shown the benefits of scientific inquiry in the classroom and others have described new types of psychometric models available for scoring analysis, few have combined the two to develop a better understanding of how students "know" science. I describe an assessment system used to create items that systematically measure and disentangle three focal aspects of sixth grade students' science knowledge base associated with the BioKIDS: Kids' Inquiry of Diverse Species curriculum. Then, using students' verbal and written responses to the assessment, I analyzed the validity of the assessment tasks and examined the nature of students' science knowledge base when working in science problem solving situations. Overall, the results suggest that the tasks created using the assessment system provided students with opportunities to demonstrate the knowledge and skills about which they were designed, thus indicating that utilizing this assessment system could enable assessment designers to create tasks that allow them to gather information about multiple key aspects of students' science knowledge base. Specifically, tasks can generate information about students' content knowledge, explanation ability and interpreting data ability. In addition, utilizing psychometric models, the results suggest that students have a multidimensional science knowledge base. However, after students have participated in an inquiry-based curricular program, these dimensions are highly related to one another. Despite the

  5. The Latent Structure of Secure Base Script Knowledge

    PubMed Central

    Waters, Theodore E. A.; Fraley, R. Chris; Groh, Ashley M.; Steele, Ryan D.; Vaughn, Brian E.; Bost, Kelly K.; Veríssimo, Manuela; Coppola, Gabrielle; Roisman, Glenn I.

    2015-01-01

    There is increasing evidence that attachment representations abstracted from childhood experiences with primary caregivers are organized as a cognitive script describing secure base use and support (i.e., the secure base script). To date, however, the latent structure of secure base script knowledge has gone unexamined—this despite the fact that such basic information about the factor structure and distributional properties of these individual differences has important conceptual implications for our understanding of how representations of early experience are organized and generalized, as well as methodological significance in relation to maximizing statistical power and precision. In this study, we report factor and taxometric analyses that examined the latent structure of secure base script knowledge in two large samples. Results suggested that variation in secure base script knowledge—as measured by both the adolescent (N = 674) and adult (N = 714) versions of the Attachment Script Assessment—is generalized across relationships and continuously distributed. PMID:25775111

  6. Knowledge of response location alone is not sufficient to generate social inhibition of return.

    PubMed

    Welsh, Timothy N; Manzone, Joseph; McDougall, Laura

    2014-11-01

    Previous research has revealed that the inhibition of return (IOR) effect emerges when individuals respond to a target at the same location as their own previous response or the previous response of a co-actor. The latter social IOR effect is thought to occur because the observation of a co-actor's response evokes a representation of that action in the observer, and this observation-evoked response code subsequently activates the inhibitory mechanisms underlying IOR. The present study was conducted to determine if knowledge of the co-actor's response alone is sufficient to evoke social IOR. Pairs of participants completed responses to targets that appeared at different button locations. Button contact generated location-contingent auditory stimuli (high and low tones in Experiment 1 and colour words in Experiment 2). In the Full condition, the observer saw the response and heard the auditory stimuli. In the Auditory Only condition, the observer did not see the co-actor's response, but heard the auditory stimuli generated via button contact to indicate response endpoint. It was found that, although significant individual and social IOR effects emerged in the Full conditions, there were no social IOR effects in the Auditory Only conditions. These findings suggest that knowledge of the co-actor's response alone via auditory information is not sufficient to activate the inhibitory processes leading to IOR. The activation of the mechanisms that lead to social IOR seems to be dependent on processing channels that code the spatial characteristics of action.

  7. Apprenticeship learning techniques for knowledge-based systems

    SciTech Connect

    Wilkins, D.C.

    1987-01-01

    This thesis describes apprenticeship learning techniques for automation of the transfer of expertise. Apprenticeship learning is a form of learning by watching, in which learning occurs as a byproduct of building explanations of human problem-solving actions. Apprenticeship is the most powerful method that human experts use to refine and debug their expertise in knowledge-intensive domains such as medicine; this motivates giving such capabilities to an expert system. The major accomplishment in this thesis is showing how an explicit representation of the strategy knowledge to solve a general problem class, such as diagnosis, can provide a basis for learning the knowledge that is specific to a particular domain, such as medicine. The Odysseus learning program provides the first demonstration of using the same technique to transfer expertise both to and from an expert system knowledge base. Another major focus of this thesis is the limitations of apprenticeship learning. It is shown that extant techniques for reasoning under uncertainty for expert systems lead to a sociopathic knowledge base.

  8. Knowledge-based simulation using object-oriented programming

    NASA Technical Reports Server (NTRS)

    Sidoran, Karen M.

    1993-01-01

    Simulations have become a powerful mechanism for understanding and modeling complex phenomena. Their results have had substantial impact on a broad range of decisions in the military, government, and industry. Because of this, new techniques are continually being explored and developed to make them even more useful, understandable, extendable, and efficient. One such area of research is the application of the knowledge-based methods of artificial intelligence (AI) to the computer simulation field. The goal of knowledge-based simulation is to facilitate building simulations of greatly increased power and comprehensibility by making use of deeper knowledge about the behavior of the simulated world. One technique for representing and manipulating knowledge that has been enhanced by the AI community is object-oriented programming. Using this technique, the entities of a discrete-event simulation can be viewed as objects in an object-oriented formulation. Knowledge can be factual (i.e., attributes of an entity) or behavioral (i.e., how the entity is to behave in certain circumstances). Rome Laboratory's Advanced Simulation Environment (RASE) was developed as a research vehicle to provide an enhanced simulation development environment for building more intelligent, interactive, flexible, and realistic simulations. This capability will support current and future battle management research and provide a test of the object-oriented paradigm for use in large scale military applications.
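
    The object-oriented framing the abstract describes, entities carrying factual knowledge as attributes and behavioral knowledge as methods inside a discrete-event simulation, can be sketched in a few lines. The Radar entity, its sweep behavior, and the event loop below are invented for illustration and are not part of RASE.

```python
# Minimal discrete-event simulation with knowledge-bearing entities.
import heapq

class Simulation:
    def __init__(self, horizon):
        self.horizon = horizon
        self.queue = []
        self.seq = 0  # tie-breaker so heapq never compares entities

    def schedule(self, time, entity, kind):
        heapq.heappush(self.queue, (time, self.seq, entity, kind))
        self.seq += 1

    def run(self):
        while self.queue:
            time, _, entity, kind = heapq.heappop(self.queue)
            entity.handle(self, time, kind)

class Radar:
    """Factual knowledge lives in attributes; behavioral in methods."""
    def __init__(self, name, sweep_period):
        self.name, self.sweep_period, self.sweeps = name, sweep_period, 0

    def handle(self, sim, time, kind):
        self.sweeps += 1
        nxt = time + self.sweep_period
        if nxt <= sim.horizon:           # behavior: re-schedule own sweep
            sim.schedule(nxt, self, "sweep")

sim = Simulation(horizon=10)
radar = Radar("r1", sweep_period=3)
sim.schedule(0, radar, "sweep")
sim.run()
print(radar.sweeps)  # sweeps at t = 0, 3, 6, 9 -> 4
```

    Because each entity owns its own behavior, adding a new entity type extends the simulation without touching the event loop, which is the comprehensibility benefit the abstract claims for the object-oriented paradigm.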

  9. Ada as an implementation language for knowledge based systems

    NASA Technical Reports Server (NTRS)

    Rochowiak, Daniel

    1990-01-01

    Debates about the selection of programming languages often produce cultural collisions that are not easily resolved. This is especially true in the case of Ada and knowledge based programming. The construction of programming tools provides a desirable alternative for resolving the conflict.

  10. Planning and Implementing a High Performance Knowledge Base.

    ERIC Educational Resources Information Center

    Cortez, Edwin M.

    1999-01-01

    Discusses the conceptual framework for developing a rapid-prototype high-performance knowledge base for the four mission agencies of the United States Department of Agriculture and their university partners. Describes the background of the project and methods used for establishing the requirements; examines issues and problems surrounding semantic…

  11. PLAN-IT - Knowledge-based mission sequencing

    NASA Technical Reports Server (NTRS)

    Biefeld, Eric W.

    1987-01-01

    PLAN-IT (Plan-Integrated Timelines), a knowledge-based approach to assist in mission sequencing, is discussed. PLAN-IT uses a large set of scheduling techniques known as strategies to develop and maintain a mission sequence. The approach implemented by PLAN-IT and the current applications of PLAN-IT for sequencing at NASA are reported.

  12. The Pedagogical Knowledge Base of Four TESOL Teachers

    ERIC Educational Resources Information Center

    Mullock, Barbara

    2006-01-01

    Many researchers have called for a broadening of the theoretical base of language teacher development programs to include gathering information not only on what teachers do in the classroom, but also on what they know, and how this knowledge is transferred to their teaching behavior, especially as they gain more experience in the classroom.…

  13. Toffler's Powershift: Creating New Knowledge Bases in Higher Education.

    ERIC Educational Resources Information Center

    Powers, Patrick James

    This paper examines the creation of new knowledge bases in higher education in light of the ideas of Alvin Toffler, whose trilogy "Future Shock" (1970), "The Third Wave" (1980), and "Powershift" (1990) focus on the processes, directions, and control of change, respectively. It discusses the increasingly important role…

  14. Value Creation in the Knowledge-Based Economy

    ERIC Educational Resources Information Center

    Liu, Fang-Chun

    2013-01-01

    Effective investment strategies help companies form dynamic core organizational capabilities allowing them to adapt and survive in today's rapidly changing knowledge-based economy. This dissertation investigates three valuation issues that challenge managers with respect to developing business-critical investment strategies that can have…

  15. SCU at TREC 2014 Knowledge Base Acceleration Track

    DTIC Science & Technology

    2014-11-01

    SCU at TREC 2014 Knowledge Base Acceleration Track. Hung Nguyen, Yi Fang. Department of Computer Engineering, Santa Clara University, 500 El Camino Real, Santa Clara, CA 95053.

  16. Development of a Knowledge Base for Incorporating Technology into Courses

    ERIC Educational Resources Information Center

    Rath, Logan

    2013-01-01

    This article discusses a project resulting from the request of a group of faculty at The College at Brockport to create a website for best practices in teaching and technology. The project evolved into a knowledge base powered by WordPress. Installation and configuration of WordPress resulted in the creation of custom taxonomies and post types,…

  17. CACTUS: Command and Control Training Using Knowledge-Based Simulations

    ERIC Educational Resources Information Center

    Hartley, Roger; Ravenscroft, Andrew; Williams, R. J.

    2008-01-01

    The CACTUS project was concerned with command and control training of large incidents where public order may be at risk, such as large demonstrations and marches. The training requirements and objectives of the project are first summarized justifying the use of knowledge-based computer methods to support and extend conventional training…

  18. Designing a Knowledge Base for Automatic Book Classification.

    ERIC Educational Resources Information Center

    Kim, Jeong-Hyen; Lee, Kyung-Ho

    2002-01-01

    Reports on the design of a knowledge base for an automatic classification in the library science field by using the facet classification principles of colon classification. Discusses inputting titles or key words into the computer to create class numbers through automatic subject recognition and processing title key words. (Author/LRW)

  19. After the Crash: Research-Based Theater for Knowledge Transfer

    ERIC Educational Resources Information Center

    Colantonio, Angela; Kontos, Pia C.; Gilbert, Julie E.; Rossiter, Kate; Gray, Julia; Keightley, Michelle L.

    2008-01-01

    Introduction: The aim of this project was to develop and evaluate a research-based dramatic production for the purpose of transferring knowledge about traumatic brain injury (TBI) to health care professionals, managers, and decision makers. Methods: Using results drawn from six focus group discussions with key stakeholders (consumers, informal…

  20. Language-Based Prior Knowledge and Transition to Mathematics

    ERIC Educational Resources Information Center

    Dogan-Dunlap, Hamide; Torres, Cristina; Chen, Fan

    2005-01-01

    The paper provides a college mathematics student's concept maps, definitions, and essays to support the thesis that language-based prior knowledge can influence students' cognitive processes of mathematical concepts. A group of intermediate algebra students who displayed terms mainly from the spoken language on the first and the second concept…

  1. Towards an Intelligent Planning Knowledge Base Development Environment

    NASA Technical Reports Server (NTRS)

    Chien, S.

    1994-01-01

    This abstract describes work in developing knowledge base editing and debugging tools for the Multimission VICAR Planner (MVP) system. MVP uses artificial intelligence planning techniques to automatically construct executable complex image processing procedures (using models of the smaller constituent image processing steps) in response to image processing requests made to the JPL Multimission Image Processing Laboratory.

  2. Knowledge Based Engineering for Spatial Database Management and Use

    NASA Technical Reports Server (NTRS)

    Peuquet, D. (Principal Investigator)

    1984-01-01

    The use of artificial intelligence techniques that are applicable to Geographic Information Systems (GIS) are examined. Questions involving the performance and modification to the database structure, the definition of spectra in quadtree structures and their use in search heuristics, extension of the knowledge base, and learning algorithm concepts are investigated.

  3. Design Study: Rocket Based MHD Generator

    NASA Technical Reports Server (NTRS)

    1997-01-01

    This report addresses the technical feasibility and design of a rocket based MHD generator using a sub-scale LOx/RP rocket motor. The design study was constrained by assuming the generator must function within the performance and structural limits of an existing magnet and by assuming realistic limits on (1) the axial electric field, (2) the Hall parameter, (3) current density, and (4) heat flux (given the criteria of heat sink operation). The major results of the work are summarized as follows: (1) A Faraday type of generator with rectangular cross section is designed to operate with a combustor pressure of 300 psi. Based on a magnetic field strength of 1.5 Tesla, the electrical power output from this generator is estimated to be 54.2 KW with potassium seed (weight fraction 3.74%) and 92 KW with cesium seed (weight fraction 9.66%). The former corresponds to an enthalpy extraction ratio of 2.36% while that for the latter is 4.16%; (2) A conceptual design of the Faraday MHD channel is proposed, based on a maximum operating time of 10 to 15 seconds. This concept utilizes a phenolic back wall for inserting the electrodes and inter-electrode insulators. Copper electrodes and aluminum oxide insulators are suggested for this channel; and (3) A testing configuration for the sub-scale rocket based MHD system is proposed. An estimate of performance of an ideal rocket based MHD accelerator is performed. With a current density constraint of 5 Amps/cm^2 and a conductivity of 30 Siemens/m, the push power density can be 250, 431, and 750 MW/m^3 when the induced voltage uB has values of 5, 10, and 15 KV/m, respectively.
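
    The accelerator push-power figures can be sanity-checked against the linear estimate P = J x (uB), the electromagnetic power density at the current-density cap. This reproduces the first and last quoted values; it is only a bound, and the study's other stated limits (conductivity, Hall parameter) apparently bind at the intermediate field value.

```python
# Order-of-magnitude check of push power density, P = J * uB,
# with current density capped at J = 5 A/cm^2 as stated above.
J = 5 * 1e4                    # 5 A/cm^2 -> 50,000 A/m^2
for uB in (5e3, 10e3, 15e3):   # induced field, V/m
    P = J * uB                 # W/m^3
    print(f"uB = {uB/1e3:.0f} kV/m -> {P/1e6:.0f} MW/m^3")
# -> 250, 500, and 750 MW/m^3 at the cap; the report's middle figure
#    (431 MW/m^3) is lower, consistent with another limit binding there.
```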

  4. 'Medical Knowledge' and 'Tradition' of Colonial Korea: Focused on Kudo's "Gynecology"-based Knowledge.

    PubMed

    Hong, Yang Hee

    2013-08-01

    This article attempts to illuminate the ways in which Kudo's medical knowledge, based on 'gynecological science', constructed the cultural 'traditions' of colonial Korea. Kudo appears to have been quite an influential figure in colonial Korea, in that his writings on the relationship between women's crime, gynecological science, and Chosŏn society were granted a significant amount of intellectual authority. Here, I examine Kudo's position within colonial Korea as a producer and propagator of medical knowledge, and then consider how women's bodies were understood according to his gynecological knowledge. The article also traces the ways in which Kudo's gynecological knowledge represents Chosŏn society and in turn invents the 'traditions' of Chosŏn. Kudo's knowledge of "gynecology," formed as it traveled through states such as Japan, Germany, and France, served as an important reference for his representation of colonial Korean society. Kudo was a proponent of biological evolution, particularly the rules of 'atavism' put forth by the criminal anthropologist Cesare Lombroso, and argued that a unique social environment caused an 'alteration of sexual urges' and primitive cruelty in Chosŏn women. According to Kudo, this social environment was none other than the practice of 'early marriage,' which went against the physiology of women. To Kudo, 'early marriage' was an old 'tradition' of Chosŏn and the cause of heinous crimes, as well as an unmistakable indicator of both the primitiveness and savageness of Chosŏn. While Lombroso considered personal factors such as stress to be the cause of women's crimes, Kudo saw Chosŏn women's crimes as a national characteristic. Moreover, he compared the occurrence rate of husband murders by province, on the basis of which he categorized the northern population of Chosŏn as barbaric Manchurians and the southern population as the superior Japanese, in a combination of racism and scientific knowledge. Kudo's writings provide an insight into the

  5. An Empirical Analysis of Knowledge Based Hypertext Navigation

    PubMed Central

    Snell, J.R.; Boyle, C.

    1990-01-01

    Our purpose is to investigate the effectiveness of knowledge-based navigation in a dermatology hypertext network. The chosen domain is a set of dermatology class notes implemented in HyperCard and SINS. The study measured the time, number of moves, and success rates of subjects finding solutions to ten questions. The subjects were required to navigate within a dermatology hypertext network in order to find the solution to each question. Our results indicate that knowledge-based navigation can assist the user in finding information of interest in fewer node visits (moves) than with traditional button-based browsing or keyword searching. The time necessary to find an item of interest was lower for the traditional methods. There was no difference in success rates between the two test groups.

  6. Integrating knowledge and control into hypermedia-based training environments: Experiments with HyperCLIPS

    NASA Technical Reports Server (NTRS)

    Hill, Randall W., Jr.

    1990-01-01

    The issues of knowledge representation and control in hypermedia-based training environments are discussed. The main objective is to integrate the flexible presentation capability of hypermedia with a knowledge-based approach to lesson discourse management. The instructional goals and their associated concepts are represented in a knowledge representation structure called a 'concept network'. Its functional usages are many: it is used to control navigation through a presentation space, generate tests for student evaluation, and model the student. This architecture was implemented in HyperCLIPS, a hybrid system that creates a bridge between HyperCard, a popular hypertext-like system used for building user interfaces to databases and other applications, and CLIPS, a highly portable government-owned expert system shell.
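
    A 'concept network' of the kind described above can be sketched as concepts linked by prerequisite relations, with navigation choosing the next concept whose prerequisites are already mastered. The concept names below are illustrative only, not taken from HyperCLIPS:

```python
prerequisites = {            # concept -> concepts it depends on
    "expert systems": ["rules", "facts"],
    "rules": [],
    "facts": [],
    "inference": ["rules", "facts"],
}

def next_concepts(mastered):
    """Concepts not yet mastered whose prerequisites are all mastered."""
    return [c for c, pre in prerequisites.items()
            if c not in mastered and all(p in mastered for p in pre)]

print(next_concepts({"rules"}))            # ['facts']
print(next_concepts({"rules", "facts"}))   # ['expert systems', 'inference']
```

    The same structure can drive test generation (quiz on the frontier concepts) and student modeling (the `mastered` set).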

  7. The Influence of Self-Regulated Learning and Prior Knowledge on Knowledge Acquisition in Computer-Based Learning Environments

    ERIC Educational Resources Information Center

    Bernacki, Matthew

    2010-01-01

    This study examined how learners construct textbase and situation model knowledge in hypertext computer-based learning environments (CBLEs) and documented the influence of specific self-regulated learning (SRL) tactics, prior knowledge, and characteristics of the learner on posttest knowledge scores from exposure to a hypertext. A sample of 160…

  8. Knowledge-Based Production Management: Approaches, Results and Prospects

    DTIC Science & Technology

    1991-12-01

    In this paper we provide an overview of research in the field of knowledge-based production management. We begin by examining the important sources...of decision-making difficulty in practical production management domains, discussing the requirements implied by each with respect to the development...of effective production management tools, and identifying the general opportunities in this regard provided by AI-based technology. We then categorize

  9. Realizing Relevance: The Influence of Domain-Specific Information on Generation of New Knowledge through Integration in 4- to 8-Year-Old Children

    ERIC Educational Resources Information Center

    Bauer, Patricia J.; Larkina, Marina

    2017-01-01

    In accumulating knowledge, direct modes of learning are complemented by productive processes, including self-generation based on integration of separate episodes. Effects of the number of potentially relevant episodes on integration were examined in 4- to 8-year-olds (N = 121; racially/ethnically heterogeneous sample, English speakers, from large…

  10. Predicting links based on knowledge dissemination in complex network

    NASA Astrophysics Data System (ADS)

    Zhou, Wen; Jia, Yifan

    2017-04-01

    Link prediction is the task of mining missing links in a network or predicting the next vertex pair to be connected by a link. Many link prediction methods have been inspired by the evolutionary processes of networks. In this paper, a new mechanism for the formation of complex networks called knowledge dissemination (KD) is proposed, under the assumption that knowledge disseminates through the paths of a network. Accordingly, a new link prediction method, knowledge dissemination based link prediction (KDLP), is proposed to test KD. KDLP characterizes vertex similarity based on knowledge quantity (KQ), which measures the importance of a vertex through the H-index. Extensive numerical simulations on six real-world networks demonstrate that KDLP is a strong link prediction method which achieves higher prediction accuracy than four well-known similarity measures: common neighbors, the local path index, average commute time, and the matrix forest index. Furthermore, based on the common conclusion that an excellent link prediction method reveals a good evolving mechanism, the experimental results suggest that KD is a credible evolving mechanism for the formation of complex networks.
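
    The H-index ingredient mentioned above can be illustrated with a common H-index-style vertex measure (the lobby index): a vertex scores h if at least h of its neighbours have degree at least h. The exact KQ and KDLP scoring are defined in the paper; this toy network is invented for illustration:

```python
adj = {                       # toy undirected network (adjacency sets)
    "a": {"b", "c", "d"},
    "b": {"a", "c"},
    "c": {"a", "b", "d"},
    "d": {"a", "c"},
}

def kq(v):
    """H-index of the neighbour degree sequence of vertex v."""
    degs = sorted((len(adj[u]) for u in adj[v]), reverse=True)
    h = 0
    for i, d in enumerate(degs, start=1):
        if d >= i:
            h = i
    return h

print(kq("a"))  # 2: at least 2 of a's neighbours have degree >= 2
```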

  11. ISPE: A knowledge-based system for fluidization studies

    SciTech Connect

    Reddy, S.

    1991-01-01

    Chemical engineers use mathematical simulators to design, model, optimize, and refine various engineering plants/processes. This procedure requires the following steps: (1) preparation of an input data file according to the format required by the target simulator; (2) executing the simulation; and (3) analyzing the results of the simulation to determine whether all specified "goals" are satisfied. If the goals are not met, the input data file must be modified and the simulation repeated. This multistep process is continued until satisfactory results are obtained. This research was undertaken to develop a knowledge-based system, IPSE (Intelligent Process Simulation Environment), that can enhance the productivity of chemical engineers/modelers by serving as an intelligent assistant performing a variety of tasks related to process simulation. ASPEN, a simulator widely used by the US Department of Energy (DOE) at Morgantown Energy Technology Center (METC), was selected as the target process simulator for the project. IPSE, written in the C language, was developed using a number of knowledge-based programming paradigms: object-oriented knowledge representation using inheritance and methods, rule-based inferencing (including processing and propagation of probabilistic information), and data-driven programming using demons. It was implemented using the knowledge-based environment LASER. The relationship of IPSE with the user, ASPEN, LASER, and the C language is shown in Figure 1.
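
    The prepare/execute/analyze cycle described above is a generic loop; it can be sketched as follows, with placeholder callables standing in for the simulator interface (these names are not IPSE's actual API):

```python
def run_until_goals_met(write_input, execute, goals_met, revise, max_iter=10):
    """Repeat: write the input deck, run the simulator, check goals, revise."""
    params = {}
    for _ in range(max_iter):
        deck = write_input(params)        # step 1: prepare input data file
        results = execute(deck)           # step 2: execute the simulation
        if goals_met(results):            # step 3: analyze the results
            return results
        params = revise(params, results)  # modify inputs and repeat
    raise RuntimeError("goals not satisfied within iteration limit")

# Toy usage: a 'simulator' whose conversion rises with temperature.
result = run_until_goals_met(
    write_input=lambda p: {"temp": p.get("temp", 300)},
    execute=lambda deck: {"conversion": deck["temp"] / 1000},
    goals_met=lambda r: r["conversion"] >= 0.5,
    revise=lambda p, r: {"temp": p.get("temp", 300) + 100},
)
print(result)  # {'conversion': 0.5}
```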

  12. Participatory approach to the development of a knowledge base for problem-solving in diabetes self-management

    PubMed Central

    Cole-Lewis, Heather J.; Smaldone, Arlene M.; Davidson, Patricia R.; Kukafka, Rita; Tobin, Jonathan N.; Cassells, Andrea; Mynatt, Elizabeth D.; Hripcsak, George; Mamykina, Lena

    2015-01-01

    Objective To develop an expandable knowledge base of reusable knowledge related to self-management of diabetes that can be used as a foundation for patient-centric decision support tools. Materials and methods The structure and components of the knowledge base were created through participatory design with academic diabetes educators using knowledge acquisition methods. The knowledge base was validated using a scenario-based approach with practicing diabetes educators and individuals with diabetes recruited from Community Health Centers (CHCs) serving economically disadvantaged communities and ethnic minorities in New York. Results The knowledge base includes eight glycemic control problems, over 150 behaviors known to contribute to these problems coupled with contextual explanations, and over 200 specific action-oriented self-management goals for correcting problematic behaviors, with corresponding motivational messages. The validation of the knowledge base suggested a high level of completeness and accuracy and identified improvements in cultural appropriateness. These were addressed in new iterations of the knowledge base. Discussion The resulting knowledge base is theoretically grounded, incorporates practical and evidence-based knowledge used by diabetes educators in practice settings, and allows for personally meaningful choices by individuals with diabetes. The participatory design approach helped the researchers to capture the implicit knowledge of practicing diabetes educators and make it explicit and reusable. Conclusion The knowledge base proposed here is an important step towards the development of a new generation of patient-centric decision support tools for facilitating chronic disease self-management. While this knowledge base specifically targets diabetes, its overall structure and composition can be generalized to other chronic conditions. PMID:26547253
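
    The problem -> behavior -> goal structure described above lends itself to a simple nested representation. The entries below are invented examples; the real knowledge base holds eight glycemic problems, 150+ behaviors, and 200+ goals:

```python
from dataclasses import dataclass, field

@dataclass
class Goal:
    text: str
    motivation: str            # corresponding motivational message

@dataclass
class Behavior:
    description: str
    explanation: str           # contextual explanation
    goals: list = field(default_factory=list)

@dataclass
class Problem:
    name: str                  # glycemic control problem
    behaviors: list = field(default_factory=list)

kb = [
    Problem("fasting hyperglycemia", [
        Behavior("late-evening snacking",
                 "raises overnight glucose levels",
                 [Goal("no snacks after 9 pm", "small changes add up")]),
    ]),
]

# Look up candidate self-management goals for a reported problem.
goals = [g.text for p in kb if p.name == "fasting hyperglycemia"
         for b in p.behaviors for g in b.goals]
print(goals)  # ['no snacks after 9 pm']
```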

  13. An Analysis of Three Different Approaches to Student Teacher Mentoring and Their Impact on Knowledge Generation in Practicum Settings

    ERIC Educational Resources Information Center

    Mena, Juanjo; García, Marisa; Clarke, Anthony; Barkatsas, Anastasios

    2016-01-01

    Mentoring in Teacher Education is a key component in the professional development of student teachers. However, little research focuses on the knowledge shared and generated in mentoring conversations. In this paper, we explore the knowledge student teachers articulate in mentoring conversations under three different post-lesson approaches to…

  14. NRV web knowledge base on low-energy nuclear physics

    NASA Astrophysics Data System (ADS)

    Karpov, V.; Denikin, A. S.; Alekseev, A. P.; Zagrebaev, V. I.; Rachkov, V. A.; Naumenko, M. A.; Saiko, V. V.

    2016-09-01

    Principles underlying the organization and operation of the NRV web knowledge base on low-energy nuclear physics (http://nrv.jinr.ru) are described. This base includes a vast body of digitized experimental data on the properties of nuclei and on cross sections for nuclear reactions, combined with a wide set of interconnected computer programs for simulating complex nuclear dynamics that work directly in the browser of a remote user. The current situation in the application of network information technologies in nuclear physics is also surveyed. The potential of the NRV knowledge base is illustrated in detail by applying it to an analysis of the fusion of nuclei followed by the decay of the excited compound nucleus formed.

  15. The Knowledge-Based Software Assistant: Beyond CASE

    NASA Technical Reports Server (NTRS)

    Carozzoni, Joseph A.

    1993-01-01

    This paper will outline the similarities and differences between two paradigms of software development. Both support the whole software life cycle and provide automation for most of the software development process, but have different approaches. The CASE approach is based on a set of tools linked by a central data repository. This tool-based approach is data driven and views software development as a series of sequential steps, each resulting in a product. The Knowledge-Based Software Assistant (KBSA) approach, a radical departure from existing software development practices, is knowledge driven and centers around a formalized software development process. KBSA views software development as an incremental, iterative, and evolutionary process with development occurring at the specification level.

  16. SAFOD Brittle Microstructure and Mechanics Knowledge Base (BM2KB)

    NASA Astrophysics Data System (ADS)

    Babaie, Hassan A.; Broda, Cindi M.; Hadizadeh, Jafar; Kumar, Anuj

    2013-07-01

    Scientific drilling near Parkfield, California has established the San Andreas Fault Observatory at Depth (SAFOD), which provides the solid earth community with short range geophysical and fault zone material data. The BM2KB ontology was developed in order to formalize the knowledge about brittle microstructures in the fault rocks sampled from the SAFOD cores. A knowledge base, instantiated from this domain ontology, stores and presents the observed microstructural and analytical data with respect to implications for brittle deformation and mechanics of faulting. These data can be searched on the knowledge base's Web interface by selecting a set of terms (classes, properties) from different drop-down lists that are dynamically populated from the ontology. In addition to this general search, a query can also be conducted to view data contributed by a specific investigator. A search by sample is done using the EarthScope SAFOD Core Viewer, which allows a user to locate samples on high resolution images of core sections belonging to different runs and holes. The class hierarchy of the BM2KB ontology was initially designed using the Unified Modeling Language (UML), which was used as a visual guide to develop the ontology in OWL applying the Protégé ontology editor. Various Semantic Web technologies, such as the RDF, RDFS, and OWL ontology languages, the SPARQL query language, and the Pellet reasoning engine, were used to develop the ontology. An interactive Web application interface was developed through Jena, a Java-based framework, with AJAX technology, JSP pages, and Java servlets, and deployed via an Apache Tomcat server. The interface allows the registered user to submit data related to their research on a sample of the SAFOD core. The submitted data, after initial review by the knowledge base administrator, are added to the extensible knowledge base and become available in subsequent queries to all types of users. The interface facilitates inference capabilities in the
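
    At bottom, the ontology-backed search described above matches triple patterns against an RDF graph. The real system does this through SPARQL and Jena; the pure-Python toy below only illustrates the pattern-matching idea, with invented sample names:

```python
# A tiny in-memory "graph" of (subject, predicate, object) triples.
triples = [
    ("sample42", "hasMicrostructure", "cataclasite"),
    ("sample42", "contributedBy", "investigatorA"),
    ("sample7",  "hasMicrostructure", "gouge"),
]

def match(pattern):
    """Return triples matching a (s, p, o) pattern; None is a wildcard."""
    return [t for t in triples
            if all(p is None or p == v for p, v in zip(pattern, t))]

# All samples with a recorded microstructure:
print(match((None, "hasMicrostructure", None)))
```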

  17. Using the DOE Knowledge Base for Special Event Analysis

    SciTech Connect

    Armstrong, H.M.; Harris, J.M.; Young, C.J.

    1998-10-20

    The DOE Knowledge Base is a library of detailed information whose purpose is to support the United States National Data Center (USNDC) in its mission to monitor compliance with the Comprehensive Test Ban Treaty (CTBT). One of the important tasks which the USNDC must accomplish is to periodically perform detailed analysis of events of high interest, so-called "Special Events", to provide the national authority with information needed to make policy decisions. In this paper we investigate some possible uses of the Knowledge Base for Special Event Analysis (SEA), and make recommendations for improving Knowledge Base support for SEA. To analyze an event in detail, there are two basic types of data which must be used: sensor-derived data (waveforms, arrivals, events, etc.) and regionalized contextual data (known sources, geological characteristics, etc.). Currently there is no single package which can provide full access to both types of data, so for our study we use a separate package for each: MatSeis, the Sandia Labs-developed MATLAB-based seismic analysis package, for waveform data analysis, and ArcView, an ESRI product, for contextual data analysis. Both packages are well-suited to prototyping because they provide a rich set of currently available functionality and yet are also flexible and easily extensible. Using these tools and Phase I Knowledge Base data sets, we show how the Knowledge Base can improve both the speed and the quality of SEA. Empirically-derived interpolated correction information can be accessed to improve both location estimates and associated error estimates. This information can in turn be used to identify any known nearby sources (e.g. mines, volcanoes), which may then trigger specialized processing of the sensor data. Based on the location estimate, preferred magnitude formulas and discriminants can be retrieved, and any known blockages can be identified to prevent miscalculations. Relevant historic events can be identified either by

  18. Knowledge-based architecture for airborne mine and minefield detection

    NASA Astrophysics Data System (ADS)

    Agarwal, Sanjeev; Menon, Deepak; Swonger, C. W.

    2004-09-01

    One of the primary lessons learned from airborne mid-wave infrared (MWIR) based mine and minefield detection research and development over the last few years has been the fact that no single algorithm or static detection architecture is able to meet mine and minefield detection performance specifications. This is true not only because of the highly varied environmental and operational conditions under which an airborne sensor is expected to perform but also due to the highly data dependent nature of sensors and algorithms employed for detection. Attempts to make the algorithms themselves more robust to varying operating conditions have only been partially successful. In this paper, we present a knowledge-based architecture to tackle this challenging problem. The detailed algorithm architecture is discussed for such a mine/minefield detection system, with a description of each functional block and data interface. This dynamic and knowledge-driven architecture will provide more robust mine and minefield detection for a highly multi-modal operating environment. The acquisition of the knowledge for this system is predominantly data driven, incorporating not only the analysis of historical airborne mine and minefield imagery data collection, but also other "all source data" that may be available such as terrain information and time of day. This "all source data" is extremely important and embodies causal information that drives the detection performance. This information is not being used by current detection architectures. Data analysis for knowledge acquisition will facilitate better understanding of the factors that affect the detection performance and will provide insight into areas for improvement for both sensors and algorithms. Important aspects of this knowledge-based architecture, its motivations and the potential gains from its implementation are discussed, and some preliminary results are presented.

  19. Virus-based piezoelectric energy generation.

    PubMed

    Lee, Byung Yang; Zhang, Jinxing; Zueger, Chris; Chung, Woo-Jae; Yoo, So Young; Wang, Eddie; Meyer, Joel; Ramesh, Ramamoorthy; Lee, Seung-Wuk

    2012-05-13

    Piezoelectric materials can convert mechanical energy into electrical energy, and piezoelectric devices made of a variety of inorganic materials and organic polymers have been demonstrated. However, synthesizing such materials often requires toxic starting compounds, harsh conditions and/or complex procedures. Previously, it was shown that hierarchically organized natural materials such as bones, collagen fibrils and peptide nanotubes can display piezoelectric properties. Here, we demonstrate that the piezoelectric and liquid-crystalline properties of M13 bacteriophage (phage) can be used to generate electrical energy. Using piezoresponse force microscopy, we characterize the structure-dependent piezoelectric properties of the phage at the molecular level. We then show that self-assembled thin films of phage can exhibit piezoelectric strengths of up to 7.8 pm V(-1). We also demonstrate that it is possible to modulate the dipole strength of the phage, hence tuning the piezoelectric response, by genetically engineering the major coat proteins of the phage. Finally, we develop a phage-based piezoelectric generator that produces up to 6 nA of current and 400 mV of potential and use it to operate a liquid-crystal display. Because biotechnology techniques enable large-scale production of genetically modified phages, phage-based piezoelectric materials potentially offer a simple and environmentally friendly approach to piezoelectric energy generation.

  20. Virus-based piezoelectric energy generation

    NASA Astrophysics Data System (ADS)

    Lee, Byung Yang; Zhang, Jinxing; Zueger, Chris; Chung, Woo-Jae; Yoo, So Young; Wang, Eddie; Meyer, Joel; Ramesh, Ramamoorthy; Lee, Seung-Wuk

    2012-06-01

    Piezoelectric materials can convert mechanical energy into electrical energy, and piezoelectric devices made of a variety of inorganic materials and organic polymers have been demonstrated. However, synthesizing such materials often requires toxic starting compounds, harsh conditions and/or complex procedures. Previously, it was shown that hierarchically organized natural materials such as bones, collagen fibrils and peptide nanotubes can display piezoelectric properties. Here, we demonstrate that the piezoelectric and liquid-crystalline properties of M13 bacteriophage (phage) can be used to generate electrical energy. Using piezoresponse force microscopy, we characterize the structure-dependent piezoelectric properties of the phage at the molecular level. We then show that self-assembled thin films of phage can exhibit piezoelectric strengths of up to 7.8 pm V-1. We also demonstrate that it is possible to modulate the dipole strength of the phage, hence tuning the piezoelectric response, by genetically engineering the major coat proteins of the phage. Finally, we develop a phage-based piezoelectric generator that produces up to 6 nA of current and 400 mV of potential and use it to operate a liquid-crystal display. Because biotechnology techniques enable large-scale production of genetically modified phages, phage-based piezoelectric materials potentially offer a simple and environmentally friendly approach to piezoelectric energy generation.

  1. Measuring Knowledge Elaboration Based on a Computer-Assisted Knowledge Map Analytical Approach to Collaborative Learning

    ERIC Educational Resources Information Center

    Zheng, Lanqin; Huang, Ronghuai; Hwang, Gwo-Jen; Yang, Kaicheng

    2015-01-01

    The purpose of this study is to quantitatively measure the level of knowledge elaboration and explore the relationships between prior knowledge of a group, group performance, and knowledge elaboration in collaborative learning. Two experiments were conducted to investigate the level of knowledge elaboration. The collaborative learning objective in…

  2. Knowledge/geometry-based Mobile Autonomous Robot Simulator (KMARS)

    NASA Technical Reports Server (NTRS)

    Cheng, Linfu; Mckendrick, John D.; Liu, Jeffrey

    1990-01-01

    Ongoing applied research is focused on developing guidance systems for robot vehicles. Problems facing the basic research needed to support this development (e.g., scene understanding, real-time vision processing, etc.) are major impediments to progress. Due to the complexity and the unpredictable nature of a vehicle's area of operation, more advanced vehicle control systems must be able to learn about obstacles within the range of their sensor(s). A better understanding of the basic exploration process is needed to provide critical support to developers of both sensor systems and intelligent control systems which can be used in a wide spectrum of autonomous vehicles. Elcee Computek, Inc. has been working under contract to the Flight Dynamics Laboratory, Wright Research and Development Center, Wright-Patterson AFB, Ohio, to develop a Knowledge/Geometry-based Mobile Autonomous Robot Simulator (KMARS). KMARS has two parts: a geometry base and a knowledge base. The knowledge-base part of the system employs the expert-system shell CLIPS ('C' Language Integrated Production System) and the rules necessary to control both the vehicle's use of an obstacle-detecting sensor and the overall exploration process. The initial project phase has focused on the simulation of a point robot vehicle operating in a 2D environment.

  3. Developing Practical Knowledge of the Next Generation Science Standards in Elementary Science Teacher Education

    NASA Astrophysics Data System (ADS)

    Hanuscin, Deborah L.; Zangori, Laura

    2016-12-01

    Just as the Next Generation Science Standards (NGSS) call for change in what students learn and how they are taught, teacher education programs must reconsider courses and curriculum in order to prepare teacher candidates to understand and implement new standards. In this study, we examine the development of prospective elementary teachers' practical knowledge of the NGSS in the context of a science methods course and innovative field experience. We present three themes related to how prospective teachers viewed and utilized the standards: (a) as a useful guide for planning and designing instruction, (b) as a benchmark for student and self-evaluation, and (c) as an achievable vision for teaching and learning. Our findings emphasize the importance of collaborative opportunities for repeated teaching of the same lessons, but question what is achievable in the context of a semester-long experience.

  4. Social studies of volcanology: knowledge generation and expert advice on active volcanoes

    NASA Astrophysics Data System (ADS)

    Donovan, Amy; Oppenheimer, Clive; Bravo, Michael

    2012-04-01

    This paper examines the philosophy and evolution of volcanological science in recent years, particularly in relation to the growth of volcanic hazard and risk science. It uses the lens of Science and Technology Studies to examine the ways in which knowledge generation is controlled and directed by social forces, particularly during eruptions, which constitute landmarks in the development of new technologies and models. It also presents data from a survey of volcanologists carried out during late 2008 and early 2009. These data concern the felt purpose of the science according to the volcanologists who participated and their impressions of the most important eruptions in historical time. It demonstrates that volcanologists are motivated both by the academic science environment and by a social concern for managing the impact of volcanic hazards on populations. Also discussed are the eruptions that have most influenced the discipline and the role of scientists in policymaking on active volcanoes. Expertise in volcanology can become the primary driver of public policy very suddenly when a volcano erupts, placing immense pressure on volcanologists. In response, the epistemological foundations of volcanology are on the move, with an increasing volume of research into risk assessment and management. This requires new, integrated methodologies for knowledge collection that transcend scientific disciplinary boundaries.

  5. Background Knowledge in Learning-Based Relation Extraction

    DTIC Science & Technology

    2012-01-01

    Methods for Natural Language Processing (EMNLP), Edinburgh, Scotland. [Do and Roth, 2010] Do, Q. and Roth, D. (2010). Constraints based taxonomic...University of Illinois at Urbana-Champaign, 2012, Urbana, Illinois. Doctoral Committee: Professor Dan Roth, Chair; Assistant Professor Julia Hockenmaier...more joyful and impassioned. I am deeply grateful to my advisor, Dan Roth, for his knowledgeable advice and support from day one of my PhD journey

  6. A Study of Knowledge-Based Systems for Photo Interpretation.

    DTIC Science & Technology

    1980-06-01

    OIL [15] CAI Electronics SOPHIE [10] Medicine GUIDON [14] Learning Chemistry Meta-DENDRAL [i] Agriculture INDUCE [19] Mathematics AM [40] Intelligent...6. Computer-Aided Instruction: GUIDON Three types of traditional computer-aided instruction (CAI) are often distinguished: frame-oriented drill-and...systems have an obvious contribution to make to CAI. The GUIDON system developed by Clancey at Stanford exploits the MYCIN knowledge base about

  7. A knowledge based model of electric utility operations. Final report

    SciTech Connect

    1993-08-11

    This report consists of an appendix providing a documentation and help capability for an analyst using the developed expert system of electric utility operations, which runs in CLIPS. This capability is provided through a separate package running under the Windows operating system, keyed to provide displays of text, graphics, and mixed text and graphics that explain and elaborate on the specific decisions being made within the knowledge-based expert system.

  8. Knowledge-Based Decision Support in Department of Defense Acquisitions

    DTIC Science & Technology

    2010-09-01

    2005) reviewed and analyzed the National Aeronautics and Space Administration (NASA) project management policies and compared them to the GAO's best...practices on knowledge-based decision making. The study was primarily focused on the Goddard Space Flight Center, the Jet Propulsion Lab, Johnson...Space Center, and Marshall Space Flight Center. During its investigation, the GAO found NASA deficient in key criteria and decision reviews to fully

  9. Knowledge-based vision and simple visual machines.

    PubMed Central

    Cliff, D; Noble, J

    1997-01-01

    The vast majority of work in machine vision emphasizes the representation of perceived objects and events: it is these internal representations that incorporate the 'knowledge' in knowledge-based vision or form the 'models' in model-based vision. In this paper, we discuss simple machine vision systems developed by artificial evolution rather than traditional engineering design techniques, and note that the task of identifying internal representations within such systems is made difficult by the lack of an operational definition of representation at the causal mechanistic level. Consequently, we question the nature and indeed the existence of representations posited to be used within natural vision systems (i.e. animals). We conclude that representations argued for on a priori grounds by external observers of a particular vision system may well be illusory, and are at best place-holders for yet-to-be-identified causal mechanistic interactions. That is, applying the knowledge-based vision approach in the understanding of evolved systems (machines or animals) may well lead to theories and models that are internally consistent, computationally plausible, and entirely wrong. PMID:9304684

  10. Knowledge-based processing for aircraft flight control

    NASA Technical Reports Server (NTRS)

    Painter, John H.; Glass, Emily; Economides, Gregory; Russell, Paul

    1994-01-01

    This Contractor Report documents research in Intelligent Control using knowledge-based processing in a manner dual to methods found in the classic stochastic decision, estimation, and control discipline. Such knowledge-based control has also been called Declarative and Hybrid. Software architectures were sought, employing the parallelism inherent in modern object-oriented modeling and programming. The viewpoint adopted was that Intelligent Control employs a class of domain-specific software architectures having features common over a broad variety of implementations, such as management of aircraft flight, power distribution, etc. As much attention was paid to software engineering issues as to artificial intelligence and control issues. This research considered that particular processing methods from the stochastic and knowledge-based worlds are duals, that is, similar in a broad context. They provide architectural design concepts which serve as bridges between the disparate disciplines of decision, estimation, control, and artificial intelligence. This research was applied to the control of a subsonic transport aircraft in the airport terminal area.

  11. A proven knowledge-based approach to prioritizing process information

    NASA Technical Reports Server (NTRS)

    Corsberg, Daniel R.

    1991-01-01

    Many space-related processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect is rapid analysis of the changing process information. During a disturbance, this task can overwhelm humans as well as computers. Humans deal with this by applying heuristics in determining significant information. A simple, knowledge-based approach to prioritizing information is described. The approach models those heuristics that humans would use in similar circumstances. The approach described has received two patents and was implemented in the Alarm Filtering System (AFS) at the Idaho National Engineering Laboratory (INEL). AFS was first developed for application in a nuclear reactor control room. It has since been used in chemical processing applications, where it has had a significant impact on control room environments. The approach uses knowledge-based heuristics to analyze data from process instrumentation and respond to that data according to knowledge encapsulated in objects and rules. While AFS cannot perform the complete diagnosis and control task, it has proven to be extremely effective at filtering and prioritizing information. AFS was used for over two years as a first level of analysis for human diagnosticians. Given the approach's proven track record in a wide variety of practical applications, it should be useful in both ground- and space-based systems.
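
    The core idea the abstract describes — suppressing alarms that are explained by other, more fundamental active alarms — can be sketched in a few lines. This is an illustrative reconstruction, not the actual AFS rule content; the alarm names and cause relations below are hypothetical.

```python
# Minimal sketch of rule-based alarm filtering in the spirit of AFS.
# The cause-effect relations encoded here are invented for illustration.

def filter_alarms(active, causes):
    """Return only the 'significant' alarms: those not explained by
    another currently active alarm.

    active: set of alarm names currently raised
    causes: dict mapping an alarm to the set of alarms that can cause it
    """
    significant = set()
    for alarm in active:
        upstream = causes.get(alarm, set())
        if upstream & active:
            continue  # a more fundamental active alarm explains this one
        significant.add(alarm)
    return significant

active = {"LOW_FLOW", "PUMP_TRIP", "HIGH_TEMP"}
causes = {"LOW_FLOW": {"PUMP_TRIP"}, "HIGH_TEMP": {"LOW_FLOW"}}
print(sorted(filter_alarms(active, causes)))  # ['PUMP_TRIP']
```

    Here PUMP_TRIP survives as the root event, while LOW_FLOW and HIGH_TEMP are filtered because an active upstream cause explains each of them.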

  12. Shapelearner: Towards Shape-Based Visual Knowledge Harvesting

    NASA Astrophysics Data System (ADS)

    Wang, Zheng; Liang, Ti

    2016-06-01

    The explosion of images on the Web has led to a number of efforts to organize images semantically and compile collections of visual knowledge. While there has been enormous progress on categorizing entire images or bounding boxes, only a few studies have targeted fine-grained image understanding at the level of specific shape contours. For example, given an image of a cat, we would like a system to not merely recognize the existence of a cat, but also to distinguish between the cat's legs, head, tail, and so on. In this paper, we present ShapeLearner, a system that acquires such visual knowledge about object shapes and their parts. ShapeLearner jointly learns this knowledge from sets of segmented images. The space of label and segmentation hypotheses is pruned and then evaluated using Integer Linear Programming. ShapeLearner places the resulting knowledge in a semantic taxonomy based on WordNet and is able to exploit this hierarchy in order to analyze new kinds of objects that it has not observed before. We conduct experiments using a variety of shape classes from several representative categories and demonstrate the accuracy and robustness of our method.

  13. Knowledge-based system for automatic MBR control.

    PubMed

    Comas, J; Meabe, E; Sancho, L; Ferrero, G; Sipma, J; Monclús, H; Rodriguez-Roda, I

    2010-01-01

    MBR technology is currently challenging traditional wastewater treatment systems and is increasingly selected for WWTP upgrading. MBR systems are typically constructed on a smaller footprint and provide superior treated water quality. However, the main drawback of MBR technology is that membrane permeability declines during filtration due to membrane fouling, which is largely responsible for the high aeration requirements of an MBR to counteract it. Because the mechanisms of membrane fouling are complex and still not fully understood, it is neither possible to describe its development clearly by means of a deterministic model, nor to control it with a purely mathematical law. Consequently, the majority of MBR applications are controlled in an "open-loop" way, i.e. with predefined and fixed air scour and filtration/relaxation or backwashing cycles, and scheduled inline or offline chemical cleaning as a preventive measure, without taking into account the real needs of membrane cleaning based on its filtration performance. However, existing theoretical and empirical knowledge about potential cause-effect relations between a number of factors (influent characteristics, biomass characteristics and operational conditions) and MBR operation can be used to build a knowledge-based decision support system (KB-DSS) for the automatic control of MBRs. This KB-DSS contains a knowledge-based control module which, based on real-time comparison of the current permeability trend with "reference trends", aims at optimizing operation and energy costs and decreasing fouling rates. In practice, the proposed automatic control system regulates the set points of the key operational variables controlled in MBR systems (permeate flux, relaxation and backwash times, backwash flows and times, aeration flow rates, chemical cleaning frequency, waste sludge flow rate and recycle flow rates) and identifies their optimal values. This paper describes the concepts and the 3-level architecture
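
    The central control loop the abstract describes — comparing the observed permeability trend against a reference trend and adjusting set points when fouling is faster than expected — might be sketched as below. The slope threshold, tolerance, and suggested action are invented placeholders, not values from the paper.

```python
def fouling_action(permeability, reference_slope, tolerance=0.2):
    """Compare the observed permeability trend with a reference trend
    and suggest a control action. Slopes are per-sample changes; the
    tolerance and action names are illustrative assumptions."""
    n = len(permeability)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(permeability) / n
    # least-squares slope of the observed permeability series
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, permeability)) \
        / sum((x - mean_x) ** 2 for x in xs)
    if slope < reference_slope * (1 + tolerance):
        # permeability declining faster than the reference allows
        return "increase_relaxation_time"
    return "maintain_setpoints"

print(fouling_action([100, 98, 96, 94], reference_slope=-1.0))
# -> increase_relaxation_time
```

    A real KB-DSS would of course weigh many more variables (aeration, backwash timing, chemical cleaning frequency); the sketch only shows the trend-comparison step.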

  14. Governing Long-Term Risks in Radioactive Waste Management: Reversibility and Knowledge Transfer Across Generations

    NASA Astrophysics Data System (ADS)

    Lehtonen, M.

    2014-12-01

    Safe management of the long-lived and high-level radioactive waste originating primarily from nuclear power stations requires isolating and confining the waste for periods of up to 100 000 years. Disposal in deep geological formations is currently the solution advocated by international organisations (e.g. the IAEA and the OECD-NEA) and governments, but nowhere in the world is such a repository for civilian nuclear waste yet in operation. Concerns about the governance of the risks and uncertainties involved over such long periods lie at the heart of the controversies that have slowed down the identification of a solution. In order to draw lessons potentially relevant for the governance of long-term climate risks, this paper examines the ways in which two interrelated aspects have been addressed in nuclear waste management in France, the US, and the Nordic countries. The first issue concerns "reversibility" - i.e. the possibility, on the one hand, to retrieve the waste once it has been disposed of in a repository and, on the other, to return at any point in the decision-making process to the previous decision-making phase. Reversibility today constitutes a fundamental, legally binding requirement in French radioactive waste policy. While itself a strategy for managing risk and uncertainty, reversibility nevertheless poses significant safety challenges of its own. The second topic goes beyond the timescales (max. 300 years) within which reversibility is usually considered applicable, addressing the question of intergenerational knowledge transfer and comparing the Nordic and American approaches to the issue. The key challenge here is ensuring the transfer to future generations - for periods of up to 100 000 years - of sufficient knowledge concerning the siting, characteristics and management of the waste deposited in a repository. Even more fundamentally, instead of knowledge transfer, should we rather aim at "active forgetting", in order to prevent the curious in the

  15. Next Generation Climate Change Experiments Needed to Advance Knowledge and for Assessment of CMIP6

    SciTech Connect

    Katzenberger, John; Arnott, James; Wright, Alyson

    2014-10-30

    The Aspen Global Change Institute hosted a technical science workshop entitled, “Next generation climate change experiments needed to advance knowledge and for assessment of CMIP6,” on August 4-9, 2013 in Aspen, CO. Jerry Meehl (NCAR), Richard Moss (PNNL), and Karl Taylor (LLNL) served as co-chairs for the workshop, which included the participation of 32 scientists representing most of the major climate modeling centers, for a total of 160 participant days. In August 2013, AGCI convened a high-level meeting of representatives from major climate modeling centers around the world to assess achievements and lessons learned from the most recent generation of coordinated modeling experiments, known as the Coupled Model Intercomparison Project - 5 (CMIP5), as well as to scope out the science questions and coordination structure desired for the next anticipated phase of modeling experiments, called CMIP6. The workshop allowed for reflection on the coordination of the CMIP5 process as well as intercomparison of model results, as assessed in the most recent IPCC 5th Assessment Report, Working Group 1. For example, a slide from Masahiro Watanabe examines the performance of a range of models in capturing Atlantic Meridional Overturning Circulation (AMOC).

  16. Development of an Inquiry-Based Learning Support System Based on an Intelligent Knowledge Exploration Approach

    ERIC Educational Resources Information Center

    Wu, Ji-Wei; Tseng, Judy C. R.; Hwang, Gwo-Jen

    2015-01-01

    Inquiry-Based Learning (IBL) is an effective approach for promoting active learning. When inquiry-based learning is incorporated into instruction, teachers provide guiding questions for students to actively explore the required knowledge in order to solve the problems. Although the World Wide Web (WWW) is a rich knowledge resource for students to…

  17. Integrating Research-Based and Practice-Based Knowledge through Workplace Reflection

    ERIC Educational Resources Information Center

    Nilsen, Per; Nordstrom, Gunilla; Ellstrom, Per-Erik

    2012-01-01

    Purpose: This paper seeks to present a theoretical framework with the aim of contributing to improved understanding of how reflection can provide a mechanism to integrate research-based knowledge with the pre-existing practice-based knowledge. Design/methodology/approach: The paper begins with an explanation of important concepts: research-based…

  18. Temporal and contextual knowledge in model-based expert systems

    NASA Technical Reports Server (NTRS)

    Toth-Fejel, Tihamer; Heher, Dennis

    1987-01-01

    A basic paradigm that allows representation of physical systems with a focus on context and time is presented. Paragon provides the capability to quickly capture an expert's knowledge in a cognitively resonant manner. From that description, Paragon creates a simulation model in LISP which, when executed, verifies that the domain expert did not make any mistakes. The Achilles' heel of rule-based systems has been the lack of a systematic methodology for testing, and Paragon's developers are certain that the model-based approach overcomes that problem. The reason this testing is now possible is that software, which is very difficult to test, has in essence been transformed into hardware.

  19. Knowledge-based approach to fault diagnosis and control in distributed process environments

    NASA Astrophysics Data System (ADS)

    Chung, Kwangsue; Tou, Julius T.

    1991-03-01

    This paper presents a new design approach to knowledge-based decision support systems for fault diagnosis and control for quality assurance and productivity improvement in automated manufacturing environments. Based on the observed manifestations, the knowledge-based diagnostic system hypothesizes a set of the most plausible disorders by mimicking the reasoning process of a human diagnostician. The data integration technique is designed to generate error-free hierarchical category files. A novel approach to diagnostic problem solving has been proposed by integrating the PADIKS (Pattern-Directed Knowledge-Based System) concept and the symbolic model of diagnostic reasoning based on the categorical causal model. The combination of symbolic causal reasoning and pattern-directed reasoning produces a highly efficient diagnostic procedure and generates a more realistic expert behavior. In addition, three distinctive constraints are designed to further reduce the computational complexity and to eliminate non-plausible hypotheses involved in the multiple disorders problem. The proposed diagnostic mechanism, which consists of three different levels of reasoning operations, significantly reduces the computational complexity in the diagnostic problem with uncertainty by systematically shrinking the hypotheses space. This approach is applied to the test and inspection data collected from a PCB manufacturing operation.

  20. The COPD Knowledge Base: enabling data analysis and computational simulation in translational COPD research

    PubMed Central

    2014-01-01

    Background Previously we generated a chronic obstructive pulmonary disease (COPD) specific knowledge base (http://www.copdknowledgebase.eu) from clinical and experimental data, text-mining results and public databases. This knowledge base allowed the retrieval of specific molecular networks together with integrated clinical and experimental data. Results The COPDKB has now been extended to integrate over 40 public data sources on functional interaction (e.g. signal transduction, transcriptional regulation, protein-protein interaction, gene-disease association). In addition we integrated COPD-specific expression and co-morbidity networks connecting over 6 000 genes/proteins with physiological parameters and disease states. Three mathematical models describing different aspects of systemic effects of COPD were connected to clinical and experimental data. We have completely redesigned the technical architecture of the user interface and now provide html and web browser-based access and form-based searches. A network search enables the use of interconnecting information and the generation of disease-specific sub-networks from general knowledge. Integration with the Synergy-COPD Simulation Environment enables multi-scale integrated simulation of individual computational models while integration with a Clinical Decision Support System allows delivery into clinical practice. Conclusions The COPD Knowledge Base is the only publicly available knowledge resource dedicated to COPD and combining genetic information with molecular, physiological and clinical data as well as mathematical modelling. Its integrated analysis functions provide overviews about clinical trends and connections while its semantically mapped content enables complex analysis approaches. We plan to further extend the COPDKB by offering it as a repository to publish and semantically integrate data from relevant clinical trials. The COPDKB is freely available after registration at http

  1. NVL - a knowledge representation language based on semantic networks

    SciTech Connect

    Hudli, A.V.

    1989-01-01

    Taxonomic hierarchical networks or semantic networks have been widely used in representing knowledge in AI applications. Semantic networks have been the preferred form of representation in AI, rather than predicate logic, because of the need to represent complex structured knowledge. However, the formal semantics of these networks has not been dealt with adequately in the literature. In this thesis, semantic networks are described by means of a formal relational logic called NVL. The characteristic features of NVL are limitor lists and binary predicates. Limitor lists are similar to restricted quantifiers but are more expressive. Several special binary relations are used to express the key ideas of semantic networks. NVL is based on the principles of semantic networks and taxonomic reasoning. The unification and inference mechanisms of NVL have considerable inherent parallelism, which makes the language suitable for parallel implementation. The current opinion in AI is that semantic networks represent a subset of first order logic. Rather than modify predicate logic by adding features of semantic networks, the approach has been to devise a new form of logic by considering the basic principles and epistemological primitives of semantic networks such as properties, class concepts, relations, and inheritance. The syntax and semantics of NVL are first presented. Rules in the knowledge base are represented by the V relation, which also plays an important role in deriving inferences. The (mathematical) correctness of NVL is proved, and concepts of unification of lists and inference in NVL are introduced. Parallel algorithms for unification and inference are developed.

  2. Matching sensors to missions using a knowledge-based approach

    NASA Astrophysics Data System (ADS)

    Preece, Alun; Gomez, Mario; de Mel, Geeth; Vasconcelos, Wamberto; Sleeman, Derek; Colley, Stuart; Pearson, Gavin; Pham, Tien; La Porta, Thomas

    2008-04-01

    Making decisions on how best to utilise limited intelligence, surveillance and reconnaissance (ISR) resources is a key issue in mission planning. This requires judgements about which kinds of available sensors are more or less appropriate for specific ISR tasks in a mission. A methodological approach to addressing this kind of decision problem in the military context is the Missions and Means Framework (MMF), which provides a structured way to analyse a mission in terms of tasks, and assess the effectiveness of various means for accomplishing those tasks. Moreover, the problem can be defined as knowledge-based matchmaking: matching the ISR requirements of tasks to the ISR-providing capabilities of available sensors. In this paper we show how the MMF can be represented formally as an ontology (that is, a specification of a conceptualisation); we also represent knowledge about ISR requirements and sensors, and then use automated reasoning to solve the matchmaking problem. We adopt the Semantic Web approach and the Web Ontology Language (OWL), allowing us to import elements of existing sensor knowledge bases. Our core ontologies use the description logic subset of OWL, providing efficient reasoning. We describe a prototype tool as a proof-of-concept for our approach. We discuss the various kinds of possible sensor-mission matches, both exact and inexact, and how the tool helps mission planners consider alternative choices of sensors.
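
    The exact-versus-inexact matchmaking idea can be illustrated without an OWL reasoner: a requirement met literally is an exact match, one met only through the capability taxonomy is inexact. The taxonomy, capability names, and grading policy below are invented for illustration; the real system uses description-logic subsumption over its ontologies.

```python
# Hypothetical capability taxonomy: child -> parent.
TAXONOMY = {
    "day_camera": "imaging", "ir_camera": "imaging",
    "imaging": "isr_capability", "acoustic": "isr_capability",
}

def ancestors(cap):
    """All superclasses of a capability in the toy taxonomy."""
    out = set()
    while cap in TAXONOMY:
        cap = TAXONOMY[cap]
        out.add(cap)
    return out

def match(task_requirements, sensor_capabilities):
    """Grade a sensor against a task: 'exact' if every requirement is
    provided literally, 'inexact' if some requirement is met only via a
    taxonomically related capability, 'none' otherwise."""
    grade = "exact"
    for req in task_requirements:
        if req in sensor_capabilities:
            continue
        if any(req in ancestors(cap) or cap in ancestors(req)
               for cap in sensor_capabilities):
            grade = "inexact"
        else:
            return "none"
    return grade

print(match({"imaging"}, {"day_camera"}))  # inexact
```

    A day camera satisfies a generic "imaging" requirement only through the hierarchy, so it grades as inexact; an acoustic sensor grades as none.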

  3. Knowledge-based inference engine for online video dissemination

    NASA Astrophysics Data System (ADS)

    Zhou, Wensheng; Kuo, C.-C. Jay

    2000-10-01

    To facilitate easy access to rich multimedia information over the Internet, we develop a knowledge-based classification system that supports automatic indexing and filtering based on semantic concepts for the dissemination of on-line real-time media. Automatic segmentation, annotation and summarization of media for fast information browsing and updating are achieved at the same time. In the proposed system, a real-time scene-change detection proxy performs an initial video structuring process by splitting a video clip into scenes. Motion and visual features are extracted in real time for every detected scene by using online feature extraction proxies. Higher semantics are then derived through a joint use of low-level features along with inference rules in the knowledge base. Inference rules are derived through a supervised learning process based on representative samples. On-line media filtering based on semantic concepts becomes possible by using the proposed video inference engine. Video streams are either blocked or sent to certain channels depending on whether or not the video stream matches the user's profile. The proposed system is extensively evaluated by applying the engine to video of basketball games.

  4. From science to action: Principles for undertaking environmental research that enables knowledge exchange and evidence-based decision-making.

    PubMed

    Cvitanovic, C; McDonald, J; Hobday, A J

    2016-12-01

    Effective conservation requires knowledge exchange among scientists and decision-makers to enable learning and support evidence-based decision-making. Efforts to improve knowledge exchange have been hindered by a paucity of empirically-grounded guidance to help scientists and practitioners design and implement research programs that actively facilitate knowledge exchange. To address this, we evaluated the Ningaloo Research Program (NRP), which was designed to generate new scientific knowledge to support evidence-based decisions about the management of the Ningaloo Marine Park in north-western Australia. Specifically, we evaluated (1) outcomes of the NRP, including the extent to which new knowledge informed management decisions; (2) the barriers that prevented knowledge exchange among scientists and managers; (3) the key requirements for improving knowledge exchange processes in the future; and (4) the core capacities that are required to support knowledge exchange processes. While the NRP generated expansive and multidisciplinary science outputs directly relevant to the management of the Ningaloo Marine Park, decision-makers are largely unaware of this knowledge and little has been integrated into decision-making processes. A range of barriers prevented efficient and effective knowledge exchange among scientists and decision-makers including cultural differences among the groups, institutional barriers within decision-making agencies, scientific outputs that were not translated for decision-makers and poor alignment between research design and actual knowledge needs. 
We identify a set of principles to be implemented routinely as part of any applied research program, including: (i) stakeholder mapping prior to the commencement of research programs to identify all stakeholders; (ii) research questions co-developed with stakeholders; (iii) implementation of participatory research approaches; (iv) use of a knowledge broker; and (v) tailored knowledge management

  5. Changing the knowledge base in Western herbal medicine.

    PubMed

    Evans, Sue

    2008-12-01

    The project of modernising Western herbal medicine in order to allow it to be accepted by the public and to contribute to contemporary healthcare is now over two decades old. One aspect of this project involves changes to the ways knowledge about medicinal plants is presented. This paper contrasts the models of Evidence-Based Medicine (EBM) and Traditional Knowledge (TK) to illuminate some of the complexities which have arisen consequent to these changes, particularly with regard to the concept of vitalism, the retention or rejection of which may have broad implications for the clinical practice of herbal medicine. Illustrations from two herbals (central texts on the medicinal use of plants) demonstrate the differences between these frameworks in regard to how herbs are understood. Further, a review of articles on herbal therapeutics published in the Australian Journal of Herbal Medicine indicates that practitioners are moving away from TK and towards the use of EBM in their clinical discussions.

  6. A knowledge-based agent prototype for Chinese address geocoding

    NASA Astrophysics Data System (ADS)

    Wei, Ran; Zhang, Xuehu; Ding, Linfang; Ma, Haoming; Li, Qi

    2009-10-01

    Chinese address geocoding is a difficult problem to deal with due to intrinsic complexities in Chinese address systems and a lack of standards in address assignments and usages. In order to improve existing address geocoding algorithms, a spatial knowledge-based agent prototype aimed at validating address geocoding results is built to determine spatial accuracies as well as matching confidence. A portion of the human knowledge used to judge the spatial closeness of two addresses is represented via first order logic, and the corresponding algorithms are implemented in the Prolog language. Preliminary tests conducted using address matching results in the Beijing area showed that the prototype can successfully assess the spatial closeness between the matching address and the query address with 97% accuracy.
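
    A rule of the kind the prototype encodes in Prolog — judging two addresses close when their structured components agree down to a certain level — can be sketched as follows. The address format and the closeness rule are deliberately simplified assumptions; real Chinese addresses are far less regular, which is the problem the paper addresses.

```python
def parse(addr):
    """Toy structured address: 'city/district/street/number' - a stand-in
    for the prototype's much richer Chinese address model."""
    return dict(zip(("city", "district", "street", "number"),
                    addr.split("/")))

def spatially_close(a, b):
    """Illustrative first-order-style closeness rule: two addresses are
    judged close if they agree on city, district and street."""
    pa, pb = parse(a), parse(b)
    return all(pa[k] == pb[k] for k in ("city", "district", "street"))

print(spatially_close("Beijing/Haidian/Zhongguancun St/5",
                      "Beijing/Haidian/Zhongguancun St/12"))  # True
```

    In the actual agent, many such rules would fire together to produce a graded confidence rather than a boolean.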

  7. [Artificial intelligence--the knowledge base applied to nephrology].

    PubMed

    Sancipriano, G P

    2005-01-01

    The idea that efficacy, efficiency, and quality in medicine cannot be achieved without organizing the huge body of knowledge of medical and nursing science is very common. Engineers and computer scientists have developed medical software with great prospects for success, but currently these software applications are not very useful in clinical practice. The medical doctor and the trained nurse live the 'information age' in many daily activities, but the main benefits have not yet reached their working activities. Artificial intelligence and, particularly, expert systems appeal to health staff because of their potential. The first part of this paper summarizes the characteristics of 'weak artificial intelligence' and of expert systems important in clinical practice. The second part discusses medical doctors' requirements and the current nephrologic knowledge bases available for artificial intelligence development.

  8. Knowledge-based system for flight information management. Thesis

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell R.

    1990-01-01

    The use of knowledge-based system (KBS) architectures to manage information on the primary flight display (PFD) of commercial aircraft is described. The PFD information management strategy used tailored the information on the PFD to the tasks the pilot performed. The KBS design and implementation of the task-tailored PFD information management application is described. The knowledge acquisition and subsequent system design of a flight-phase-detection KBS is also described. The flight-phase output of this KBS was used as input to the task-tailored PFD information management KBS. The implementation and integration of this KBS with existing aircraft systems and the other KBS is described. The flight tests of both KBSs, collectively called the Task-Tailored Flight Information Manager (TTFIM), are examined; these tests verified their implementation and integration and validated the software engineering advantages of the KBS approach in an operational environment.

  9. Interoperability-oriented Integration of Failure Knowledge into Functional Knowledge and Knowledge Transformation based on Concepts Mapping

    NASA Astrophysics Data System (ADS)

    Koji, Yusuke; Kitamura, Yoshinobu; Kato, Yoshikiyo; Tsutsui, Yoshio; Mizoguchi, Riichiro

    In conceptual design, it is important to develop functional structures which reflect the rich experience embodied in knowledge from previous design failures. Especially, if a designer learns possible abnormal behaviors from a previous design failure, he or she can add a function which prevents such abnormal behaviors and faults. To do this, it is crucial to share such knowledge about possible faulty phenomena and how to cope with them. In fact, a part of such knowledge is described in FMEA (Failure Mode and Effect Analysis) sheets, function structure models for systematic design, and fault trees for FTA (Fault Tree Analysis).

  10. Dynamic reasoning in a knowledge-based system

    NASA Technical Reports Server (NTRS)

    Rao, Anand S.; Foo, Norman Y.

    1988-01-01

    Any space-based system, whether it is a robot arm assembling parts in space or an onboard system monitoring the space station, has to react to changes which cannot be foreseen. As a result, apart from having domain-specific knowledge as in current expert systems, a space-based AI system should also have general principles of change. This paper presents a modal logic which can not only represent change but also reason with it. Three primitive operations, expansion, contraction and revision, are introduced, and axioms which specify how the knowledge base should change when the external world changes are also specified. Accordingly, the notion of dynamic reasoning is introduced, which, unlike existing forms of reasoning, provides general principles of change. Dynamic reasoning is based on two main principles, namely minimize change and maximize coherence. A possible-world semantics which incorporates the above two principles is also discussed. The paper concludes by discussing how the dynamic reasoning system can be used to specify actions and hence form an integral part of an autonomous reasoning and planning system.
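
    The three primitive operations can be illustrated over a knowledge base of propositional literals. This sketch is only a minimal reconstruction of the standard expansion/contraction/revision scheme, not the paper's modal logic; the literal encoding (a leading "-" marks negation) is an assumption.

```python
def neg(p):
    """Negate a literal: 'alarm' <-> '-alarm'."""
    return p[1:] if p.startswith("-") else "-" + p

def expand(kb, p):
    """Expansion: add p without any consistency check."""
    return kb | {p}

def contract(kb, p):
    """Contraction: give up p, changing nothing else (minimal change)."""
    return kb - {p}

def revise(kb, p):
    """Revision: accept p while keeping the base consistent - first
    contract the negation of p, then expand with p (the Levi identity)."""
    return expand(contract(kb, neg(p)), p)

kb = {"door_open", "-alarm"}
print(sorted(revise(kb, "alarm")))  # ['alarm', 'door_open']
```

    Revising with "alarm" retracts "-alarm" before adding "alarm", so the base never holds a literal together with its negation: minimal change, preserved coherence.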

  11. Addressing the translational dilemma: dynamic knowledge representation of inflammation using agent-based modeling.

    PubMed

    An, Gary; Christley, Scott

    2012-01-01

    Given the panoply of system-level diseases that result from disordered inflammation, such as sepsis, atherosclerosis, cancer, and autoimmune disorders, understanding and characterizing the inflammatory response is a key target of biomedical research. Untangling the complex behavioral configurations associated with a process as ubiquitous as inflammation represents a prototype of the translational dilemma: the ability to translate mechanistic knowledge into effective therapeutics. A critical failure point in the current research environment is a throughput bottleneck at the level of evaluating hypotheses of mechanistic causality; these hypotheses represent the key step toward the application of knowledge for therapy development and design. Addressing the translational dilemma will require utilizing the ever-increasing power of computers and computational modeling to increase the efficiency of the scientific method in the identification and evaluation of hypotheses of mechanistic causality. More specifically, development needs to focus on facilitating the ability of non-computer trained biomedical researchers to utilize and instantiate their knowledge in dynamic computational models. This is termed "dynamic knowledge representation." Agent-based modeling is an object-oriented, discrete-event, rule-based simulation method that is well suited for biomedical dynamic knowledge representation. Agent-based modeling has been used in the study of inflammation at multiple scales. The ability of agent-based modeling to encompass multiple scales of biological process as well as spatial considerations, coupled with an intuitive modeling paradigm, suggest that this modeling framework is well suited for addressing the translational dilemma. 
This review describes agent-based modeling, gives examples of its applications in the study of inflammation, and introduces a proposed general expansion of the use of modeling and simulation to augment the generation and evaluation of knowledge

  12. A knowledge-based framework for image enhancement in aviation security.

    PubMed

    Singh, Maneesha; Singh, Sameer; Partridge, Derek

    2004-12-01

    The main aim of this paper is to present a knowledge-based framework for automatically selecting the best image enhancement algorithm from several available on a per image basis in the context of X-ray images of airport luggage. The approach detailed involves a system that learns to map image features that represent its viewability to one or more chosen enhancement algorithms. Viewability measures have been developed to provide an automatic check on the quality of the enhanced image, i.e., is it really enhanced? The choice is based on ground-truth information generated by human X-ray screening experts. Such a system, for a new image, predicts the best-suited enhancement algorithm. Our research details the various characteristics of the knowledge-based system and shows extensive results on real images.
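
    The learned mapping from viewability features to a best-suited algorithm can be sketched as a nearest-neighbour lookup over expert-labelled exemplars. The feature choices, exemplar values, and algorithm names below are hypothetical placeholders, not the paper's actual features or ground truth.

```python
import math

# Hypothetical ground truth: (contrast, entropy) feature vectors labelled
# by screening experts with the enhancement that worked best for them.
GROUND_TRUTH = [
    ((0.2, 4.1), "histogram_equalization"),
    ((0.7, 6.8), "unsharp_masking"),
    ((0.4, 5.0), "gamma_correction"),
]

def best_enhancement(features):
    """Predict the best-suited enhancement for a new image by returning
    the label of the nearest expert-labelled exemplar (1-NN)."""
    return min(GROUND_TRUTH,
               key=lambda item: math.dist(features, item[0]))[1]

print(best_enhancement((0.21, 4.0)))  # histogram_equalization
```

    A production system would pair this with the paper's viewability measures to verify, after the fact, that the chosen algorithm actually improved the image.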

  13. Case-based reasoning for space applications: Utilization of prior experience in knowledge-based systems

    NASA Technical Reports Server (NTRS)

    King, James A.

    1987-01-01

    The goal is to explain Case-Based Reasoning as a vehicle for establishing knowledge-based systems based on experiential reasoning for possible space applications. This goal will be accomplished through an examination of reasoning based on prior experience in a sample domain, and also through a presentation of proposed space applications which could utilize Case-Based Reasoning techniques.

  14. Verification of Legal Knowledge-base with Conflictive Concept

    NASA Astrophysics Data System (ADS)

    Hagiwara, Shingo; Tojo, Satoshi

    In this paper, we propose a verification methodology for large-scale legal knowledge. With a revision of legal code, we are forced to revise other affected code as well to keep the law consistent. Thus, our task is to revise the affected area properly and to investigate its adequacy. In this study, we extend the notion of inconsistency beyond ordinary logical inconsistency to include conceptual conflicts. We obtain these conflicts from taxonomy data, and thus we can avoid tedious manual declaration of opposing words. In the verification process, we adopt extended disjunctive logic programming (EDLP) to tolerate multiple consequences for a given set of antecedents. In addition, we employ abductive logic programming (ALP), regarding the situations to which the rules are applied as premises. Also, we restrict the legal knowledge base to an acyclic program to avoid circular definitions and to justify the relevance of verdicts. Therefore, detecting cyclic parts of the legal knowledge is one of our objectives. The system is composed of two subsystems: we implement the preprocessor in Ruby to facilitate string manipulation, and the verifier in Prolog to perform the logical inference. We also employ the XML format in the system to retain readability. In this study, we verify actual ordinances of Toyama prefecture and present experimental results.
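
    The acyclicity restriction described above amounts to cycle detection over the dependency graph of legal definitions. A minimal Python sketch (the rule names and the dict encoding are invented for illustration; the actual system works on EDLP programs in Prolog):

```python
def find_cycles(rules):
    """Detect definitional cycles in a rule dependency graph.

    rules: dict mapping a defined concept to the concepts its body uses
    (a toy stand-in for the acyclicity check the verifier performs).
    """
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {c: WHITE for c in rules}
    cyclic = set()

    def dfs(concept):
        color[concept] = GRAY              # currently on the DFS stack
        for dep in rules.get(concept, ()):
            if color.get(dep, WHITE) == GRAY:
                cyclic.add(concept)        # back edge: circular definition
                cyclic.add(dep)
            elif color.get(dep, WHITE) == WHITE and dep in rules:
                dfs(dep)
        color[concept] = BLACK

    for concept in rules:
        if color[concept] == WHITE:
            dfs(concept)
    return cyclic

# Hypothetical legal concepts: "resident" and "taxpayer" define each other.
rules = {"resident": ["taxpayer"], "taxpayer": ["resident"], "minor": ["age"]}
print(sorted(find_cycles(rules)))  # ['resident', 'taxpayer']
```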

  15. Nursing faculties’ knowledge and attitude on evidence-based practice

    PubMed Central

    Mehrdad, Neda; Joolaee, Soodabeh; Joulaee, Azadeh; Bahrani, Naser

    2012-01-01

    Background: Evidence-based practice (EBP) is one of the main professional competencies for health care professionals and a priority for medicine and nursing curricula as well. EBP leads to more effective and efficient care and improved patient outcomes. Nurse educators have the responsibility to teach future nurses, and an opportunity to promote patient outcomes. Therefore, the aim of this study was to describe nurse educators’ knowledge of and attitude toward EBP. Materials and Methods: This was a descriptive study conducted in the nursing faculties of two major universities of medical sciences affiliated with the Ministry of Health and Medical Sciences in Tehran, Iran. Data were gathered using a three-section questionnaire. Content and face validity were further enhanced by submitting it to nursing research and education experts. Statistical analysis was carried out using SPSS 11 software. Results: According to the results, nursing faculties’ knowledge of EBP was mainly moderate (47.1%). A statistically significant relationship was found between the level of knowledge and education and teaching experience in different nursing programs. Nurses generally held positive attitudes toward EBP (88.6%), and there was no statistically significant relationship with demographic variables. Conclusion: Nursing educators are in a position to influence nursing research in clinical practice in the future. Therefore, it is critical to achieve implementation of EBP and be a change agent for a paradigm shift toward EBP. PMID:23922597

  16. Fiber-based flexible thermoelectric power generator

    NASA Astrophysics Data System (ADS)

    Yadav, A.; Pipe, K. P.; Shtein, M.

    Flexible thermoelectric power generators fabricated by evaporating thin films on flexible fiber substrates are demonstrated to be feasible candidates for waste heat recovery. An open-circuit voltage of 19.6 μV/K per thermocouple junction is measured for Ni-Ag thin films, and a maximum power of 2 nW for 7 couples at ΔT = 6.6 K is measured. Heat transfer analysis is used to project performance for several other material systems, with a predicted power output of 1 μW per couple for Bi2Te3/Sb2Te3-based fiber coatings with a hot junction temperature of 100 °C. Considering the performance of woven thermoelectric cloths or fiber composites, relevant properties and dimensions of individual thermoelectric fibers are optimized.
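
    The reported figures follow from the standard series-thermocouple relations V_oc = n·α·ΔT and P_max = V_oc²/(4·R_int). A quick check in Python (the 100 Ω internal resistance is an assumed value chosen only to make the load-matching arithmetic concrete; it is not given in the abstract):

```python
def te_open_circuit_voltage(alpha_v_per_k, n_couples, delta_t_k):
    """Open-circuit voltage of a series thermocouple chain: V = n * alpha * dT."""
    return n_couples * alpha_v_per_k * delta_t_k

def te_matched_load_power(v_oc, r_internal_ohm):
    """Maximum power delivered to a matched load: P = V^2 / (4 * R_int)."""
    return v_oc ** 2 / (4.0 * r_internal_ohm)

# Figures from the abstract: 19.6 uV/K per couple, 7 couples, dT = 6.6 K.
v = te_open_circuit_voltage(19.6e-6, 7, 6.6)
p = te_matched_load_power(v, 100.0)   # 100 ohm: assumed internal resistance
print(round(v * 1e6, 1))              # open-circuit voltage in microvolts: 905.5
```

    With that assumed resistance the matched-load power works out to roughly 2 nW, on the order of the measured value — a plausibility check, not a reconstruction of the authors' analysis.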

  17. Category vs. Object Knowledge in Category-based Induction

    PubMed Central

    Murphy, Gregory L.; Ross, Brian H.

    2009-01-01

    In one form of category-based induction, people make predictions about unknown properties of objects. There is a tension between predictions made based on the object’s specific features (e.g., objects above a certain size tend not to fly) and those made by reference to category-level knowledge (e.g., birds fly). Seven experiments with artificial categories investigated these two sources of induction by looking at whether people used information about correlated features within categories, suggesting that they focused on feature-feature relations rather than summary categorical information. The results showed that people relied heavily on such correlations, even when there was no reason to think that the correlations exist in the population. The results suggested that people’s use of this strategy is largely unreflective, rather than strategically chosen. These findings have important implications for models of category-based induction, which generally ignore feature-feature relations. PMID:20526447

  18. Detection of infrastructure manipulation with knowledge-based video surveillance

    NASA Astrophysics Data System (ADS)

    Muench, David; Hilsenbeck, Barbara; Kieritz, Hilke; Becker, Stefan; Grosselfinger, Ann-Kristin; Huebner, Wolfgang; Arens, Michael

    2016-10-01

    We are living in a world dependent on sophisticated technical infrastructure. Malicious manipulation of such critical infrastructure poses an enormous threat to all its users. Thus, running a critical infrastructure needs special attention to log planned maintenance or to detect suspicious events. Toward this end, we present a knowledge-based surveillance approach capable of logging visually observable events in such an environment. The video surveillance modules are based on appearance-based person detection, which is further used to modulate the outcome of generic processing steps such as change detection or skin detection. A relation between the expected scene behavior and the underlying basic video surveillance modules is established. It will be shown that the combination already provides sufficient expressiveness to describe various everyday situations in indoor video surveillance. The whole approach is qualitatively and quantitatively evaluated on a prototypical scenario in a server room.

  19. Sensor-based diagnosis using knowledge of structure and function

    NASA Technical Reports Server (NTRS)

    Scarl, Ethan A.; Jamieson, John R.; Delaune, Carl I.

    1987-01-01

    A system for fault detection and isolation called LES, developed at the Kennedy Space Center for the Space Shuttle's Launch Processing System, is a well-developed diagnostic system that is simultaneously model-based and sensor-based. This experiment has led to a surprising result: the failure of a sensor can not only be handled in precisely the same way as the failure of any other object, but may present an especially easy case. Classical rule-based diagnostic systems need to find out whether or not their sensors are telling them the truth before they can safely draw inferences from them. By contrast, while LES does treat sensors as a special case, it does so only because there may exist a short cut that allows them to be handled more simply than other objects. LES uses both structural and functional knowledge, and has found cases in which the structural knowledge can be economically replaced by the judicious use of functional relationships; LES' functional relationships are stored in exactly one place, so they must be inverted to determine hypothetical values for possibly faulty objects. The inversion process has been extended to include conditional functions not normally considered to have inverses.

  20. Knowledge-based processing for aircraft flight control

    NASA Technical Reports Server (NTRS)

    Painter, John H.

    1991-01-01

    The purpose is to develop algorithms and architectures for embedding artificial intelligence in aircraft guidance and control systems. With the approach adopted, AI-computing is used to create an outer guidance loop for driving the usual aircraft autopilot. That is, a symbolic processor monitors the operation and performance of the aircraft. Then, based on rules and other stored knowledge, commands are automatically formulated for driving the autopilot so as to accomplish desired flight operations. The focus is on developing a software system which can respond to linguistic instructions, input in a standard format, so as to formulate a sequence of simple commands to the autopilot. The instructions might be a fairly complex flight clearance, input either manually or by data-link. Emphasis is on a software system which responds much like a pilot would, employing not only precise computations, but, also, knowledge which is less precise, but more like common-sense. The approach is based on prior work to develop a generic 'shell' architecture for an AI-processor, which may be tailored to many applications by describing the application in appropriate processor data bases (libraries). Such descriptions include numerical models of the aircraft and flight control system, as well as symbolic (linguistic) descriptions of flight operations, rules, and tactics.

  1. Knowledge-based approach to video content classification

    NASA Astrophysics Data System (ADS)

    Chen, Yu; Wong, Edward K.

    2001-01-01

    A framework for video content classification using a knowledge-based approach is herein proposed. This approach is motivated by the fact that videos are rich in semantic content, which can best be interpreted and analyzed by human experts. We demonstrate the concept by implementing a prototype video classification system using the rule-based programming language CLIPS 6.05. Knowledge for video classification is encoded as a set of rules in the rule base. The left-hand sides of rules contain high-level and low-level features, while the right-hand sides of rules contain intermediate results or conclusions. Our current implementation includes features computed from motion, color, and text extracted from video frames. Our current rule set allows us to classify input video into one of five classes: news, weather reporting, commercial, basketball, and football. We use MYCIN's inexact reasoning method for combining evidence and for handling the uncertainties in the features and in the classification results. We obtained good results in a preliminary experiment, which demonstrated the validity of the proposed approach.
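
    MYCIN's certainty-factor combination rule, which the authors use for evidence pooling, is compact enough to state directly. A sketch of the standard CF formulas (the example evidence values are invented):

```python
def combine_cf(cf1, cf2):
    """MYCIN-style combination of two certainty factors, each in [-1, 1]."""
    if cf1 >= 0 and cf2 >= 0:                 # both pieces of evidence confirm
        return cf1 + cf2 * (1 - cf1)
    if cf1 <= 0 and cf2 <= 0:                 # both disconfirm
        return cf1 + cf2 * (1 + cf1)
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))  # mixed evidence

# Two independent rules supporting the "news" class reinforce each other:
print(round(combine_cf(0.6, 0.5), 2))   # 0.8 -- stronger than either alone
```

    The combined belief never exceeds 1 and grows monotonically with confirming evidence, which is what makes the scheme convenient for accumulating rule firings.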

  3. A Metadata based Knowledge Discovery Methodology for Seeding Translational Research.

    PubMed

    Kothari, Cartik R; Payne, Philip R O

    2015-01-01

    In this paper, we present a semantic, metadata based knowledge discovery methodology for identifying teams of researchers from diverse backgrounds who can collaborate on interdisciplinary research projects: projects in areas that have been identified as high-impact areas at The Ohio State University. This methodology involves the semantic annotation of keywords and the postulation of semantic metrics to improve the efficiency of the path exploration algorithm as well as to rank the results. Results indicate that our methodology can discover groups of experts from diverse areas who can collaborate on translational research projects.

  4. Knowledge-based machine vision systems for space station automation

    NASA Technical Reports Server (NTRS)

    Ranganath, Heggere S.; Chipman, Laure J.

    1989-01-01

    Computer vision techniques which have the potential for use on the space station and related applications are assessed. A knowledge-based vision system (expert vision system) and the development of a demonstration system for it are described. This system implements some of the capabilities that would be necessary in a machine vision system for the robot arm of the laboratory module in the space station. A Perceptics 9200e image processor, on a host VAXstation, was used to develop the demonstration system. In order to use realistic test images, photographs of actual space shuttle simulator panels were used. The system's capabilities of scene identification and scene matching are discussed.

  5. Melody-based knowledge discovery in musical pieces

    NASA Astrophysics Data System (ADS)

    Rybnik, Mariusz; Jastrzebska, Agnieszka

    2016-06-01

    The paper is focused on automated knowledge discovery in musical pieces, based on transformations of digital musical notation. Usually a single musical piece is analyzed to discover its structure as well as the traits of separate voices. Melody and rhythm are processed with the use of three proposed operators that serve as metadata. In this work we focus on melody, so the processed data are labeled using fuzzy labels created for detecting various voice characteristics. A comparative analysis of two musical pieces may be performed as well, comparing them in terms of various rhythmic or melodic traits (as a whole or with voice separation).

  6. Knowledge-Based Reinforcement Learning for Data Mining

    NASA Astrophysics Data System (ADS)

    Kudenko, Daniel; Grzes, Marek

    experts have developed heuristics that help them in planning and scheduling resources in their work place. However, this domain knowledge is often rough and incomplete. When the domain knowledge is used directly by an automated expert system, the solutions are often sub-optimal, due to the incompleteness of the knowledge, the uncertainty of environments, and the possibility of encountering unexpected situations. RL, on the other hand, can overcome the weaknesses of the heuristic domain knowledge and produce optimal solutions. In the talk we propose two techniques, which represent first steps in the area of knowledge-based RL (KBRL). The first technique [1] uses high-level STRIPS operator knowledge in reward shaping to focus the search for the optimal policy. Empirical results show that the plan-based reward shaping approach outperforms other RL techniques, including alternative manual and MDP-based reward shaping when it is used in its basic form. We showed that MDP-based reward shaping may fail, and successful experiments with STRIPS-based shaping suggest modifications which can overcome the encountered problems. The STRIPS-based method we propose allows expressing the same domain knowledge in a different way, and the domain expert can choose whether to define an MDP or a STRIPS planning task. We also evaluated the robustness of the proposed STRIPS-based technique to errors in the plan knowledge. In case STRIPS knowledge is not available, we propose a second technique [2] that shapes the reward with hierarchical tile coding. Where the Q-function is represented with low-level tile coding, a V-function with coarser tile coding can be learned in parallel and used to approximate the potential for ground states. In the context of data mining, our KBRL approaches can also be used for any data collection task where the acquisition of data may incur considerable cost. In addition, observing the data collection agent in specific scenarios may lead to new insights into optimal data
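
    The plan-based shaping in technique [1] is an instance of potential-based reward shaping, F(s, s') = γΦ(s') − Φ(s), which provably leaves the optimal policy unchanged. A Python sketch (the state names and the plan-progress potential are hypothetical; in the talk's setting Φ would be derived from a STRIPS plan):

```python
def shaped_reward(r, s, s_next, potential, gamma=0.99):
    """Potential-based shaping: add F(s, s') = gamma * phi(s') - phi(s)
    to the environment reward r. This form preserves optimal policies
    (Ng, Harada & Russell, 1999)."""
    return r + gamma * potential(s_next) - potential(s)

# Hypothetical potential: number of plan steps already achieved.
plan_progress = {"start": 0, "have_key": 1, "door_open": 2}
phi = lambda s: plan_progress.get(s, 0)

# Progressing along the plan earns a shaping bonus even with zero
# environment reward, focusing exploration toward the plan:
print(shaped_reward(0.0, "start", "have_key", phi))  # 0.99
```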

  7. A rainfall simulator based on multifractal generator

    NASA Astrophysics Data System (ADS)

    Akrour, Nawal; mallet, Cecile; barthes, Laurent; chazottes, Aymeric

    2015-04-01

    Precipitation arises from complex meteorological phenomena and, unlike other geophysical constituents such as water vapour concentration, it presents a relaxation behaviour leading to an alternation of dry and wet periods. Thus, precipitation can be described as an intermittent process. The spatial and temporal variability of this phenomenon is significant and covers large scales. This high variability can cause extreme events which are difficult to observe properly because of their suddenness and their localized character. For all these reasons, precipitation is difficult to model. This study aims to adapt a one-dimensional time series model previously developed by the authors [Akrour et al., 2013, 2014] to a two-dimensional rainfall generator. The original time series model can be divided into 3 major steps: rain support generation, intra-event rain rate generation using multifractals, and finally a calibration process. We use the same kind of methodology in the present study. Based on a dataset obtained from the meteorological radar of Météo France with a spatial resolution of 1 km x 1 km, the approach is as follows. First, the extraction of the rain support (rain/no-rain areas) allows the retrieval of the rain support structure function (variogram) and fractal properties. This leads us to use either the rain support modelisation proposed by ScleissXXX [ref] or real rain supports extracted directly from radar rain maps. Then, the generation (over rain areas) of rain rates is performed with a 2D multifractal Fractionally Integrated Flux (FIF) model [ref]. This second stage is followed by a calibration/forcing step (forcing the average rain rate per event), added in order to provide rain rates coherent with the observed rain-rate distribution. The forcing process is based on a relation identified between the average rain rate of observed events and their surfaces. The presentation will first explain the different steps presented above, then some results
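
    The multifractal intra-event generation step can be illustrated with the simplest member of that family, a discrete multiplicative cascade. This 1-D toy (weights, seed, and depth all invented for illustration) stands in for the 2-D FIF model the authors actually use, and the final rescaling mimics their calibration/forcing of the average rain rate:

```python
import random

def multiplicative_cascade(levels, w_low=0.6, w_high=1.4, seed=42):
    """Toy 1-D multiplicative cascade: each step splits every cell in two
    and multiplies the halves by independent random weights with mean 1,
    producing intermittent, multifractal-like variability."""
    rng = random.Random(seed)
    field = [1.0]
    for _ in range(levels):
        nxt = []
        for v in field:
            nxt.append(v * rng.uniform(w_low, w_high))
            nxt.append(v * rng.uniform(w_low, w_high))
        field = nxt
    return field

rain = multiplicative_cascade(8)        # 256-cell synthetic intensity field
mean = sum(rain) / len(rain)
rain = [r / mean for r in rain]         # "calibration": force unit mean rate
print(len(rain), round(sum(rain) / len(rain), 6))  # 256 1.0
```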

  8. Concept maps: A tool for knowledge management and synthesis in web-based conversational learning

    PubMed Central

    Joshi, Ankur; Singh, Satendra; Jaswal, Shivani; Badyal, Dinesh Kumar; Singh, Tejinder

    2016-01-01

    Web-based conversational learning provides an opportunity for shared knowledge base creation through collaboration and collective wisdom extraction. Usually, the amount of information generated in such forums is very large and multidimensional (in alignment with the desirable preconditions for constructivist knowledge creation), and sometimes the nature of the expected new information may not be anticipated in advance. Thus, concept maps (crafted from constructed data) as “process summary” tools may be a solution to improve critical thinking and learning by making connections between the facts or knowledge shared by the participants during online discussion. This exploratory paper begins with a description of this innovation tried on a web-based interacting platform (email list management software), FAIMER-Listserv, and generated qualitative evidence through peer feedback. This process description is further supported by a theoretical construct which shows how social constructivism (inclusive of autonomy and complexity) affects conversational learning. The paper rationalizes the use of the concept map as a mid-summary tool for extracting information and making further sense out of this apparent intricacy. PMID:27563577

  10. Ontology-Based Empirical Knowledge Verification for Professional Virtual Community

    ERIC Educational Resources Information Center

    Chen, Yuh-Jen

    2011-01-01

    A professional virtual community provides an interactive platform for enterprise experts to create and share their empirical knowledge cooperatively, and the platform contains a tremendous amount of hidden empirical knowledge that knowledge experts have preserved in the discussion process. Therefore, enterprise knowledge management highly…

  11. A knowledge base for Vitis vinifera functional analysis

    PubMed Central

    2015-01-01

    Background Vitis vinifera (Grapevine) is the most important fruit species in the modern world. Wine and table grape sales contribute significantly to the economy of major wine-producing countries. The most relevant goals in wine production concern quality and safety. In order to significantly improve the achievement of these objectives and to gain biological knowledge about cultivars, a genomic approach is the most reliable strategy. The recent grapevine genome sequencing offers the opportunity to study the potential roles of genes and microRNAs in fruit maturation and other physiological and pathological processes. Although several systems allowing the analysis of plant genomes have been reported, none of them has been designed specifically for the functional analysis of grapevine genomes of cultivars under environmental stress in connection with microRNA data. Description Here we introduce a novel knowledge base, called BIOWINE, designed for the functional analysis of Vitis vinifera genomes of cultivars present in Sicily. The system allows the analysis of RNA-seq experiments of two different cultivars, namely Nero d'Avola and Nerello Mascalese. Samples were taken under different climatic conditions of phenological phases, diseases, and geographic locations. The BIOWINE web interface is equipped with data analysis modules for grapevine genomes. In particular, users may analyze the current genome assembly together with the RNA-seq data through a customized version of GBrowse. The web interface allows users to perform gene set enrichment by exploiting third-party databases. Conclusions BIOWINE is a knowledge base implementing a set of bioinformatics tools for the analysis of grapevine genomes. The system aims to increase our understanding of the grapevine varieties and species of Sicilian products, focusing on adaptability to different climatic conditions, phenological phases, diseases, and geographic locations. PMID:26050794

  12. Design of Composite Structures Using Knowledge-Based and Case Based Reasoning

    NASA Technical Reports Server (NTRS)

    Lambright, Jonathan Paul

    1996-01-01

    A method of using knowledge-based and case-based reasoning to assist designers during conceptual design tasks of composite structures was proposed. The cooperative use of heuristics, procedural knowledge, and previous similar design cases suggests a potential reduction in design cycle time and ultimately product lead time. The hypothesis of this work is that the design process of composite structures can be improved by using Case-Based Reasoning (CBR) and Knowledge-Based (KB) reasoning in the early design stages. The technique of using knowledge-based and case-based reasoning facilitates the gathering of disparate information into one location that is easily and readily available. The method suggests that the inclusion of downstream life-cycle issues in the conceptual design phase reduces the potential for defective and sub-optimal composite structures. Three industry experts were interviewed extensively. The experts provided design rules, previous design cases, and test problems. A knowledge-based reasoning system was developed using the CLIPS (C Language Integrated Production System) environment, and a case-based reasoning system was developed using the Design Memory Utility for Sharing Experiences (MUSE) environment. A Design Characteristic State (DCS) was used to document the design specifications, constraints, and problem areas using attribute-value pair relationships. The DCS provided consistent design information between the knowledge base and case base. Results indicated that the use of knowledge-based and case-based reasoning provided a robust design environment for composite structures. The knowledge base provided design guidance from well-defined rules and procedural knowledge. The case base provided suggestions on design and manufacturing techniques based on previous similar designs, and warnings of potential problems and pitfalls. The case base complemented the knowledge base and extended the problem-solving capability beyond the existence of

  13. Hospital nurses' use of knowledge-based information resources.

    PubMed

    Tannery, Nancy Hrinya; Wessel, Charles B; Epstein, Barbara A; Gadd, Cynthia S

    2007-01-01

    The purpose of this study was to evaluate the information-seeking practices of nurses before and after access to a library's electronic collection of information resources. This is a pre/post intervention study of nurses at a rural community hospital. The hospital contracted with an academic health sciences library for access to a collection of online knowledge-based resources. Self-report surveys were used to obtain information about nurses' computer use and how they locate and access information to answer questions related to their patient care activities. In 2001, self-report surveys were sent to the hospital's 573 nurses during implementation of access to online resources with a post-implementation survey sent 1 year later. At the initiation of access to the library's electronic resources, nurses turned to colleagues and print textbooks or journals to satisfy their information needs. After 1 year of access, 20% of the nurses had begun to use the library's electronic resources. The study outcome suggests ready access to knowledge-based electronic information resources can lead to changes in behavior among some nurses.

  14. Knowledge-based topographic feature extraction in medical images

    NASA Astrophysics Data System (ADS)

    Qian, JianZhong; Khair, Mohammad M.

    1995-08-01

    Diagnostic medical imaging often contains variations in patient anatomy, camera mispositioning, or other imperfect imaging conditions. These variations contribute to uncertainty about the shapes and boundaries of objects in images. As a result, image features, such as traditional edges, sometimes may not be identified reliably and completely. We describe a knowledge-based system that is able to reason about such uncertainties and use partial and locally ambiguous information to make inferences about the shapes and locations of objects in an image. The system uses directional topographic features (DTFs), such as ridges and valleys, labeled from the underlying intensity surface, to correlate with intrinsic anatomical information. By using domain-specific knowledge, the reasoning system can deduce significant anatomical landmarks based upon these DTFs, and can cope with uncertainties and fill in missing information. A succession of levels of representation for visual information and an active process of uncertain reasoning about this visual information are employed to reliably achieve the goal of image analysis. These landmarks can then be used in localization of the anatomy of interest, image registration, or other clinical processing. The successful application of this system to a large set of planar cardiac images from nuclear medicine studies has demonstrated its efficiency and accuracy.

  15. The AI Bus architecture for distributed knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Schultz, Roger D.; Stobie, Iain

    1991-01-01

    The AI Bus architecture is a layered, distributed, object-oriented framework developed to support the requirements of advanced technology programs for an order-of-magnitude improvement in software costs. The consequent need for highly autonomous computer systems, adaptable to new technology advances over a long lifespan, led to the design of an open architecture and toolbox for building large-scale, robust, production-quality systems. The AI Bus accommodates a mix of knowledge-based and conventional components, running in heterogeneous, distributed real-world and testbed environments. The concepts and design of the AI Bus architecture are described, along with its current implementation status as a Unix C++ library of reusable objects. Each high-level semiautonomous agent process consists of a number of knowledge sources together with interagent communication mechanisms based on shared blackboards and message-passing acquaintances. Standard interfaces and protocols are followed for combining and validating subsystems. Dynamic probes or demons provide an event-driven means of giving active objects shared access to resources and to each other, while not violating their security.

  16. Network fingerprint: a knowledge-based characterization of biomedical networks

    PubMed Central

    Cui, Xiuliang; He, Haochen; He, Fuchu; Wang, Shengqi; Li, Fei; Bo, Xiaochen

    2015-01-01

    It can be difficult for biomedical researchers to understand complex molecular networks due to their unfamiliarity with the mathematical concepts employed. To represent molecular networks with clear meanings and familiar forms for biomedical researchers, we introduce a knowledge-based computational framework to decipher biomedical networks by making systematic comparisons to well-studied “basic networks”. A biomedical network is characterized as a spectrum-like vector called “network fingerprint”, which contains similarities to basic networks. This knowledge-based multidimensional characterization provides a more intuitive way to decipher molecular networks, especially for large-scale network comparisons and clustering analyses. As an example, we extracted network fingerprints of 44 disease networks in the Kyoto Encyclopedia of Genes and Genomes (KEGG) database. The comparisons among the network fingerprints of disease networks revealed informative disease-disease and disease-signaling pathway associations, illustrating that the network fingerprinting framework will lead to new approaches for better understanding of biomedical networks. PMID:26307246
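
    The fingerprint idea reduces network comparison to vector comparison. A Python sketch (the three-component vectors and the choice of cosine distance are illustrative assumptions; the paper's own similarity measure is not reproduced here):

```python
import math

def network_fingerprint(similarities):
    """A fingerprint is simply a vector of similarities to basic networks."""
    return list(similarities)

def fingerprint_distance(a, b):
    """Cosine distance between two fingerprints (assumed comparison metric)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / (na * nb)

# Hypothetical similarities of two disease networks to 3 basic networks:
disease_a = network_fingerprint([0.9, 0.1, 0.4])
disease_b = network_fingerprint([0.8, 0.2, 0.5])
print(fingerprint_distance(disease_a, disease_b) < 0.05)  # True: near-identical
```

    Once networks are vectors, large-scale comparison and clustering reduce to standard distance computations, which is the practical appeal the abstract highlights.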

  17. Framework Support For Knowledge-Based Software Development

    NASA Astrophysics Data System (ADS)

    Huseth, Steve

    1988-03-01

    The advent of personal engineering workstations has brought substantial information processing power to the individual programmer. Advanced tools and environment capabilities supporting the software lifecycle are just beginning to become generally available. However, many of these tools are addressing only part of the software development problem by focusing on rapid construction of self-contained programs by a small group of talented engineers. Additional capabilities are required to support the development of large programming systems where a high degree of coordination and communication is required among large numbers of software engineers, hardware engineers, and managers. A major player in realizing these capabilities is the framework supporting the software development environment. In this paper we discuss our research toward a Knowledge-Based Software Assistant (KBSA) framework. We propose the development of an advanced framework containing a distributed knowledge base that can support the data representation needs of tools, provide environmental support for the formalization and control of the software development process, and offer a highly interactive and consistent user interface.

  18. TMS for Instantiating a Knowledge Base With Incomplete Data

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    A computer program belonging to the class known among software experts as output truth-maintenance systems (output TMSs) has been devised as one of a number of software tools for reducing the size of the knowledge base that must be searched during execution of rule-based inference-engine artificial-intelligence software when data are missing. This program determines whether the consequences of activating two or more rules can be combined without causing a logical inconsistency. For example, in a case involving hypothetical scenarios that could lead to turning a given device on or off, the program determines whether a scenario involving a given combination of rules could lead to turning the device both on and off at the same time, in which case that combination of rules would not be included in the scenario.
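    The consistency test described above can be illustrated with a minimal sketch: combine the consequents of several rules and reject the combination if any variable receives two different values. The rule and variable names are hypothetical, not taken from the actual software.

```python
def consistent(*rule_consequents):
    """Return True if the combined consequents assign no variable
    two different values (e.g. a device both 'on' and 'off')."""
    combined = {}
    for consequents in rule_consequents:
        for variable, value in consequents.items():
            if combined.setdefault(variable, value) != value:
                return False  # logical inconsistency detected
    return True

# Hypothetical rule consequents for a fluid-system scenario:
rule_a = {"valve": "open", "pump": "on"}
rule_b = {"pump": "on", "heater": "off"}   # compatible with rule_a
rule_c = {"valve": "closed"}               # contradicts rule_a
```

    A combination such as (rule_a, rule_b) would be kept in the scenario, while (rule_a, rule_c) would be excluded.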

  19. Expert operator's associate: A knowledge based system for spacecraft control

    NASA Technical Reports Server (NTRS)

    Nielsen, Mogens; Grue, Klaus; Lecouat, Francois

    1991-01-01

    The Expert Operator's Associate (EOA) project is presented which studies the applicability of expert systems for day-to-day space operations. A prototype expert system is developed, which operates on-line with an existing spacecraft control system at the European Space Operations Centre, and functions as an 'operator's assistant' in controlling satellites. The prototype is demonstrated using an existing real-time simulation model of the MARECS-B2 telecommunication satellite. By developing a prototype system, the extent to which reliability and effectiveness of operations can be enhanced by AI-based support is examined. In addition the study examines the questions of acquisition and representation of the 'knowledge' for such systems, and the feasibility of 'migration' of some (currently) ground-based functions into future spaceborne autonomous systems.

  20. Systems, methods and apparatus for verification of knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Rash, James L. (Inventor); Erickson, John D. (Inventor); Gracinin, Denis (Inventor); Rouff, Christopher A. (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments, domain knowledge is translated into a knowledge-based system. In some embodiments, a formal specification is derived from rules of a knowledge-based system, the formal specification is analyzed, and flaws in the formal specification are used to identify and correct errors in the domain knowledge, from which a knowledge-based system is translated.

  1. Dilemmatic Spaces: High-Stakes Testing and the Possibilities of Collaborative Knowledge Work to Generate Learning Innovations

    ERIC Educational Resources Information Center

    Singh, Parlo; Märtsin, Mariann; Glasswell, Kathryn

    2015-01-01

    This paper examines collaborative researcher-practitioner knowledge work around assessment data in culturally diverse, low socio-economic school communities in Queensland, Australia. Specifically, the paper draws on interview accounts about the work of a cohort of school-based researchers who acted as mediators bridging knowledge flows between a…

  2. A rule-based software test data generator

    NASA Technical Reports Server (NTRS)

    Deason, William H.; Brown, David B.; Chang, Kai-Hsiung; Cross, James H., II

    1991-01-01

    Rule-based software test data generation is proposed as an alternative to either path/predicate analysis or random data generation. A prototype rule-based test data generator for Ada programs is constructed and compared to a random test data generator. Four Ada procedures are used in the comparison. Approximately 2000 rule-based test cases and 100,000 randomly generated test cases are automatically generated and executed. The success of the two methods is compared using standard coverage metrics. Simple statistical tests show that even the primitive rule-based test data generation prototype is significantly better than random data generation. This result demonstrates that rule-based test data generation is feasible and shows great promise in assisting test engineers, especially when the rule base is developed further.
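    The contrast between the two approaches can be sketched with a simple boundary-value rule, which is one common kind of test-generation rule. The rules and the procedure under test below are invented for illustration; the actual prototype targeted Ada programs and a richer rule base.

```python
import random

def rule_based_cases(low, high):
    """Boundary-value rule: probe each edge, just outside it, and the middle."""
    return [low - 1, low, low + 1, (low + high) // 2, high - 1, high, high + 1]

def random_cases(low, high, n):
    """Baseline: n values drawn uniformly from a widened range."""
    return [random.randint(low - 10, high + 10) for _ in range(n)]

def in_range(x, low=1, high=100):
    """Procedure under test: a simple range check with two branches."""
    return low <= x <= high

# Seven rule-based cases deterministically exercise both branches;
# random generation may need many draws to hit the exact boundaries.
cases = rule_based_cases(1, 100)
outcomes = {in_range(x) for x in cases}
```

    Coverage metrics like those used in the study would then be computed over which branches each suite actually exercised.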

  3. Knowledge-based control of an adaptive interface

    NASA Technical Reports Server (NTRS)

    Lachman, Roy

    1989-01-01

    The analysis, development strategy, and preliminary design for an intelligent, adaptive interface are reported. The design philosophy couples knowledge-based system technology with standard human-factors approaches to interface development for computer workstations. An expert system has been designed to drive the interface for application software. The intelligent interface will be linked to application packages, one at a time, that are planned for multiple-application workstations aboard Space Station Freedom. Current requirements call for most Space Station activities to be conducted at the workstation consoles. One set of activities will consist of standard data management services (DMS). DMS software includes text processing, spreadsheets, data base management, etc. Text processing was selected for the first intelligent-interface prototype because text-processing software can be developed initially as fully functional but limited to a small set of commands; the program's complexity can then be increased incrementally. The rule base captures the operator's behavior and three types of instructions to the underlying application software. A conventional expert-system inference engine searches the data base for antecedents to rules and sends the consequents of fired rules as commands to the underlying software. Plans for putting the expert system on top of a second application, a database management system, will be carried out following behavioral research on the first application. The intelligent-interface design is suitable for use with ground-based workstations now common in government, industrial, and educational organizations.
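    The inference cycle described above (match rule antecedents against the data base, fire rules, send consequents onward as commands) is classic forward chaining, and can be sketched as follows. The example rules are hypothetical, not taken from the actual rule base.

```python
def forward_chain(facts, rules):
    """Fire every rule whose antecedents all hold in the fact base and
    collect the consequents as commands to the underlying software."""
    commands = []
    fired = True
    while fired:                      # repeat until no rule fires
        fired = False
        for antecedents, consequent in rules:
            if antecedents <= facts and consequent not in facts:
                facts = facts | {consequent}   # assert the consequent
                commands.append(consequent)    # emit it as a command
                fired = True
    return commands

# Hypothetical rules for an adaptive text-processing interface:
rules = [
    ({"novice_user", "edit_command"}, "show_prompt_help"),
    ({"show_prompt_help"}, "expand_menu"),
]
commands = forward_chain({"novice_user", "edit_command"}, rules)
```

    Chaining matters here: the second rule fires only because the first one asserted its consequent into the fact base.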

  4. Using Teacher-Generated Ecological Models to Assess Knowledge Gained During Teacher Training

    NASA Astrophysics Data System (ADS)

    Dresner, M.; Moldenke, A.

    2005-12-01

    Developing a capacity for systems thinking (ways to understand complex systems) requires both immersion in challenging, real-world problem contexts and exposure to systems-analysis language, tools and procedures, such as ecosystem modeling. Modeling is useful as a means of conveying complex, dynamic interactions. Models of ecosystems can facilitate an ability to be attentive to whole systems by illustrating multiple factors of interaction, feedback, subsystems, and inputs and outputs, which lead to a greater understanding of ecosystem functioning. Concept mapping, which organizes models of students' ideas hierarchically, is used in assessment, but it does not have any outside utility. Ecosystem models, on the other hand, are legitimate end-products in and of themselves. A change made in a learner-generated model that conforms to patterns observed in nature by the learner can be seen as a reflection of his or her understanding. Starting with their own reflections on previous ecological knowledge, teachers model components of the ecosystem they are about to study. 'Teaching models' are used to familiarize learners with the symbolic language of models and to teach some basic ecology concepts. Teachers then work directly with ecologists in conducting research, using the steps of a straightforward study as a guide, and then observe and discuss patterns in the data they have collected. Higher-order thinking skills are practiced through the reflective use of ecological models. Through a series of questions involving analysis, relational reasoning, synthesis, testing, and explaining, pairs of teachers describe to one another the principles and theories about ecology that they think might be operating in their models. They describe the consequences of human-caused impacts and possible causal patterns. They explain any differences in their understanding of ecosystem interactions before and after their research experiences.

  5. Autonomous Cryogenic Load Operations: Knowledge-Based Autonomous Test Engineer

    NASA Technical Reports Server (NTRS)

    Schrading, J. Nicolas

    2013-01-01

    The Knowledge-Based Autonomous Test Engineer (KATE) program has a long history at KSC. Now a part of the Autonomous Cryogenic Load Operations (ACLO) mission, this software system has been sporadically developed over the past 20 years. Originally designed to provide health and status monitoring for a simple water-based fluid system, it was proven to be a capable autonomous test engineer for determining sources of failure in the system. As part of a new goal to provide this same anomaly-detection capability for a complicated cryogenic fluid system, software engineers, physicists, interns and KATE experts are working to upgrade the software capabilities and graphical user interface. Much progress was made during this effort to improve KATE. A display of the entire cryogenic system's graph, with nodes for components and edges for their connections, was added to the KATE software. A searching functionality was added to the new graph display, so that users could easily center their screen on specific components. The GUI was also modified so that it displayed information relevant to the new project goals. In addition, work began on adding new pneumatic and electronic subsystems into the KATE knowledge base, so that it could provide health and status monitoring for those systems. Finally, many fixes for bugs, memory leaks, and memory errors were implemented and the system was moved into a state in which it could be presented to stakeholders. Overall, the KATE system was improved and necessary additional features were added so that a presentation of the program and its functionality in the next few months would be a success.

  6. Knowledge-based modelling of historical surfaces using lidar data

    NASA Astrophysics Data System (ADS)

    Höfler, Veit; Wessollek, Christine; Karrasch, Pierre

    2016-10-01

    Currently, digital elevation models are mainly used in archaeological studies in the form of shaded reliefs for the prospection of archaeological sites. Hesse (2010) provides a supporting software tool for the determination of local relief models during prospection using LiDAR scans; the search for relicts from WW2 is also a focus of his research. In James et al. (2006) determined contour lines were used to reconstruct the locations of archaeological artefacts such as buildings. This study goes further and presents an innovative workflow for determining historical high-resolution terrain surfaces using recent high-resolution terrain models and sedimentological expert knowledge. Based on archaeological field studies (Franconian Saale near Bad Neustadt in Germany), the sedimentological analyses show that archaeologically interesting horizons and geomorphological expert knowledge, in combination with particle-size analyses (Koehn, DIN ISO 11277), are useful components for reconstructing surfaces of the early Middle Ages. Furthermore, the paper traces how additional information extracted from a recent digital terrain model can be used to support the process of determining historical surfaces. Conceptually, this research is based on the methodology of geomorphometry and geostatistics. The basic idea is that the working procedure splits according to the input data: one strand tracks the quantitative data and the other processes the qualitative data. The quantitative data are thus available for further processing first, and are later combined with the qualitative data to convert them into historical heights. In the final stage of the workflow all gathered information is stored in a large data matrix for spatial interpolation using the geostatistical method of Kriging. Besides the historical surface, the algorithm also provides a first estimation of the accuracy of the modelling. 
The presented workflow is characterized by a high
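    The Kriging interpolation step at the end of the workflow can be illustrated with a minimal simple-kriging sketch. The abstract does not specify the kriging variant or covariance model; a Gaussian covariance, simple kriging (known mean), and the sample heights below are all assumptions made for illustration.

```python
import numpy as np

def gaussian_cov(d, sill=1.0, rng=2.0):
    """Gaussian covariance model as a function of separation distance d."""
    return sill * np.exp(-(d / rng) ** 2)

def simple_krige(xy, values, target, mean):
    """Simple kriging of one target point from scattered samples."""
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    K = gaussian_cov(d)                                  # sample-to-sample covariances
    k = gaussian_cov(np.linalg.norm(xy - target, axis=1))  # sample-to-target
    w = np.linalg.solve(K, k)                            # kriging weights
    estimate = mean + w @ (values - mean)
    variance = gaussian_cov(0.0) - w @ k                 # estimation variance
    return estimate, variance

# Hypothetical historical terrain heights (m) at four survey points:
xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
heights = np.array([102.0, 103.0, 101.5, 102.5])
est, var = simple_krige(xy, heights, np.array([0.0, 0.0]), mean=102.0)
```

    Kriging is an exact interpolator: at a sample location the estimate reproduces the sample and the estimation variance vanishes, which is what provides the workflow's "first estimation of accuracy" between samples.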

  7. DRILL: a standardized radiology-teaching knowledge base

    NASA Astrophysics Data System (ADS)

    Rundle, Debra A.; Evers, K.; Seshadri, Sridhar B.; Arenson, Ronald L.

    1991-07-01

    Traditionally, radiologists have collected and saved interesting cases in film format to teach medical students, residents, and physicians. These cases are classified according to various coding schemes, although current schemes alone are insufficient to meet today's educational needs. Teaching methods and cases also vary among institutions, along with the manner in which instructors present information to their students. To address this problem, the authors developed a standardized radiology-teaching knowledge base known as the Digital Radiology Image Learning Library (DRILL). DRILL is a relational image knowledge base providing access to standard mammography cases in digital image format along with a pool of clinical and radiological information on a per-case basis. The development platform chosen is a standard Apple Macintosh-II computer and the Oracle database environment. The data entry and query interfaces are implemented in HyperCard. Images are stored on magnetic disk but could be stored on optical media. Since the personal-computer platform was chosen, a wide variety of course-building tools are available through which a teacher can construct a course, such as authoring and multimedia systems for building computer-based courses, or word processors for writing course outlines and tests. The interface also provides image-conversion tools which convert images into PC-compatible formats.

  8. Compiling knowledge-based systems from KEE to Ada

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    The dominant technology for developing AI applications is to work in a multi-mechanism, integrated, knowledge-based system (KBS) development environment. Unfortunately, systems developed in such environments are inappropriate for delivering many applications; most importantly, they carry the baggage of the entire Lisp environment and are not written in conventional languages. One resolution of this problem would be to compile applications from complex environments to conventional languages. Described here are the first efforts to develop a system for compiling KBSs developed in KEE to Ada (trademark). This system is called KATYDID, for KEE/Ada Translation Yields Development Into Delivery. KATYDID includes early prototypes of a run-time KEE core (object-structure) library module for Ada, and translation mechanisms for knowledge structures, rules, and Lisp code to Ada. Using these tools, part of a simple expert system was compiled (not quite automatically) to run in a purely Ada environment. This experience has yielded various insights on Ada as an artificial intelligence programming language, potential solutions to some of the engineering difficulties encountered in early work, and inspiration for future system development.

  9. Hyperincursion and the Globalization of the Knowledge-Based Economy

    NASA Astrophysics Data System (ADS)

    Leydesdorff, Loet

    2006-06-01

    In biological systems, the capacity of anticipation—that is, entertaining a model of the system within the system—can be considered as naturally given. Human languages enable psychological systems to construct and exchange mental models of themselves and their environments reflexively, that is, provide meaning to the events. At the level of the social system expectations can further be codified. When these codifications are functionally differentiated—like between market mechanisms and scientific research programs—the potential asynchronicity in the update among the subsystems provides room for a second anticipatory mechanism at the level of the transversal information exchange among differently codified meaning-processing subsystems. Interactions between the two different anticipatory mechanisms (the transversal one and the one along the time axis in each subsystem) may lead to co-evolutions and stabilization of expectations along trajectories. The wider horizon of knowledgeable expectations can be expected to meta-stabilize and also globalize a previously stabilized configuration of expectations against the axis of time. While stabilization can be considered as consequences of interaction and aggregation among incursive formulations of the logistic equation, globalization can be modeled using the hyperincursive formulation of this equation. The knowledge-based subdynamic at the global level which thus emerges, enables historical agents to inform the reconstruction of previous states and to co-construct future states of the social system, for example, in a techno-economic co-evolution.
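    For readers unfamiliar with the terminology, the incursive and hyperincursive formulations of the logistic equation referred to here (following Dubois' notation, on which this line of work builds) can be written as follows; the algebraic rearrangements are standard, not taken from this abstract.

```latex
% Incursive logistic equation: the future state x(t+1) appears on both
% sides, and can be solved forward deterministically:
\[
x(t+1) = a\,x(t)\bigl(1 - x(t+1)\bigr)
\quad\Longrightarrow\quad
x(t+1) = \frac{a\,x(t)}{1 + a\,x(t)}
\]
% Hyperincursive formulation: the present state is defined in terms of
% the future, leaving two admissible future states (roots) and hence a
% decision between them:
\[
x(t) = a\,x(t+1)\bigl(1 - x(t+1)\bigr)
\quad\Longrightarrow\quad
x(t+1) = \tfrac{1}{2} \pm \tfrac{1}{2}\sqrt{1 - \frac{4\,x(t)}{a}}
\]
```

    The two roots of the hyperincursive form are what allow the model to represent co-construction of future states rather than a single determined trajectory.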

  10. A knowledge-based system design/information tool

    NASA Technical Reports Server (NTRS)

    Allen, James G.; Sikora, Scott E.

    1990-01-01

    The objective of this effort was to develop a Knowledge Capture System (KCS) for the Integrated Test Facility (ITF) at the Dryden Flight Research Facility (DFRF). The DFRF is a NASA Ames Research Center (ARC) facility. This system was used to capture the design and implementation information for NASA's high angle-of-attack research vehicle (HARV), a modified F/A-18A. In particular, the KCS was used to capture specific characteristics of the design of the HARV fly-by-wire (FBW) flight control system (FCS). The KCS utilizes artificial intelligence (AI) knowledge-based system (KBS) technology. The KCS enables the user to capture the following characteristics of automated systems: the system design; the hardware (H/W) design and implementation; the software (S/W) design and implementation; and the utilities (electrical and hydraulic) design and implementation. A generic version of the KCS was developed which can be used to capture the design information for any automated system. The deliverable items for this project consist of the prototype generic KCS and an application, which captures selected design characteristics of the HARV FCS.

  11. A Theory of Information Genetics: How Four Subforces Generate Information and the Implications for Total Quality Knowledge Management.

    ERIC Educational Resources Information Center

    Tsai, Bor-sheng

    2002-01-01

    Proposes a model called information genetics to elaborate on the origin of information generating. Explains conceptual and data models; and describes a software program that was developed for citation data mining, infomapping, and information repackaging for total quality knowledge management in Web representation. (Contains 112 references.)…

  12. Knowledge-based data analysis comes of age.

    PubMed

    Ochs, Michael F

    2010-01-01

    The emergence of high-throughput technologies for measuring biological systems has introduced problems for data interpretation that must be addressed for proper inference. First, analysis techniques need to be matched to the biological system, reflecting in their mathematical structure the underlying behavior being studied. When this is not done, mathematical techniques will generate answers, but the values and reliability estimates may not accurately reflect the biology. Second, analysis approaches must address the vast excess in variables measured (e.g. transcript levels of genes) over the number of samples (e.g. tumors, time points), known as the 'large-p, small-n' problem. In large-p, small-n paradigms, standard statistical techniques generally fail, and computational learning algorithms are prone to overfit the data. Here we review the emergence of techniques that match mathematical structure to the biology, the use of integrated data and prior knowledge to guide statistical analysis, and the recent emergence of analysis approaches utilizing simple biological models. We show that novel biological insights have been gained using these techniques.
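    The 'large-p, small-n' failure mode is easy to demonstrate: with more variables than samples, least squares can fit pure noise exactly while predicting nothing. A self-contained sketch (the dimensions and random data are arbitrary, chosen only to make the point):

```python
import numpy as np

rng = np.random.default_rng(0)

# 'large-p, small-n': 10 samples, 100 predictors, pure-noise targets.
n, p = 10, 100
X = rng.standard_normal((n, p))
y = rng.standard_normal(n)

# Minimum-norm least squares drives the training error to ~zero even
# though X carries no information about y: a perfect overfit.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
train_error = np.linalg.norm(X @ beta - y)

# On fresh samples from the same (unrelated) process the fit is useless.
X_new = rng.standard_normal((n, p))
y_new = rng.standard_normal(n)
test_error = np.linalg.norm(X_new @ beta - y_new)
```

    This is why the review argues for matching mathematical structure to the biology and constraining analyses with prior knowledge rather than letting flexible models fit the noise.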

  13. Background Knowledge in Learning-Based Relation Extraction

    ERIC Educational Resources Information Center

    Do, Quang Xuan

    2012-01-01

    In this thesis, we study the importance of background knowledge in relation extraction systems. We not only demonstrate the benefits of leveraging background knowledge to improve the systems' performance but also propose a principled framework that allows one to effectively incorporate knowledge into statistical machine learning models for…

  14. Knowledge Base Effects in Children's Number Analogy Solutions.

    ERIC Educational Resources Information Center

    Corsale, Kathleen; Gitomer, Drew

    Developmental and individual differences in mathematical aptitude were investigated as a function of knowledge structure and processing variables. Results indicated the relative importance of knowledge structure and strategy skills in aptitude test performance. Protocol data further elaborated the interrelationships between knowledge and process…

  15. Learning Design Based on Graphical Knowledge-Modelling

    ERIC Educational Resources Information Center

    Paquette, Gilbert; Leonard, Michel; Lundgren-Cayrol, Karin; Mihaila, Stefan; Gareau, Denis

    2006-01-01

    This chapter states and explains that a Learning Design is the result of a knowledge engineering process where knowledge and competencies, learning design and delivery models are constructed in an integrated framework. We present a general graphical language and a knowledge editor that has been adapted to support the construction of learning…

  16. Scientist-Centered Graph-Based Models of Scientific Knowledge

    SciTech Connect

    Chin, George; Stephan, Eric G.; Gracio, Deborah K.; Kuchar, Olga A.; Whitney, Paul D.; Schuchardt, Karen L.

    2005-07-01

    At the Pacific Northwest National Laboratory, we are researching and developing visual models and paradigms that will allow scientists to capture and represent conceptual models in a computational form that may be linked to and integrated with scientific data sets and applications. Captured conceptual models may be logical, conveying how individual concepts tie together to form a higher theory; analytical, conveying intermediate or final analysis results; or temporal, describing the experimental process in which concepts are physically and computationally explored. In this paper, we describe and contrast three different research and development systems that allow scientists to capture and interact with computational graph-based models of scientific knowledge. Through these examples, we explore and examine ways in which researchers may graphically encode and apply scientific theory and practice on computer systems.
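    As a rough illustration of what a computational graph-based model of scientific knowledge might look like, here is a minimal sketch: concepts as nodes, typed relations as edges, and links from concepts to data sets. The class design and the ecological fragment are invented for this sketch, not taken from the PNNL systems.

```python
class ConceptGraph:
    """A minimal graph model of scientific knowledge: concepts as nodes,
    typed relations as (subject, relation, object) edges, with optional
    links from concepts to data-set identifiers."""

    def __init__(self):
        self.edges = []        # (subject, relation, object) triples
        self.data_links = {}   # concept -> list of data-set identifiers

    def relate(self, subject, relation, obj):
        self.edges.append((subject, relation, obj))

    def link_data(self, concept, dataset):
        self.data_links.setdefault(concept, []).append(dataset)

    def related_to(self, concept):
        """Concepts reachable from `concept` in one hop."""
        return sorted({o for s, r, o in self.edges if s == concept})

# A tiny, hypothetical fragment of an ecological theory:
g = ConceptGraph()
g.relate("CO2 level", "influences", "plant growth")
g.relate("plant growth", "influences", "soil carbon")
g.link_data("CO2 level", "co2_monthly.csv")
```

    The data links are the key design point: they are what tie the captured conceptual model back to the scientific data sets and applications it describes.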

  17. VIALACTEA knowledge base homogenizing access to Milky Way data

    NASA Astrophysics Data System (ADS)

    Molinaro, Marco; Butora, Robert; Bandieramonte, Marilena; Becciani, Ugo; Brescia, Massimo; Cavuoti, Stefano; Costa, Alessandro; Di Giorgio, Anna M.; Elia, Davide; Hajnal, Akos; Gabor, Hermann; Kacsuk, Peter; Liu, Scige J.; Molinari, Sergio; Riccio, Giuseppe; Schisano, Eugenio; Sciacca, Eva; Smareglia, Riccardo; Vitello, Fabio

    2016-08-01

    The VIALACTEA project has a work package dedicated to "Tools and Infrastructure" and, within it, a task for the "Database and Virtual Observatory Infrastructure". This task aims at providing an infrastructure to store all the resources needed by the more strictly scientific work packages of the project. This infrastructure combines storage facilities, relational databases, and web services on top of them, and has taken, as a whole, the name of VIALACTEA Knowledge Base (VLKB). This contribution illustrates the current status of the VLKB. It details the set of data resources put together, describes the database that allows data discovery through VO-inspired metadata maintenance, and illustrates the discovery, cutout and access services built on top of the former two for users to exploit the data content.

  18. A model for a knowledge-based system's life cycle

    NASA Technical Reports Server (NTRS)

    Kiss, Peter A.

    1990-01-01

    The American Institute of Aeronautics and Astronautics has initiated a Committee on Standards for Artificial Intelligence. Presented here are the initial efforts of one of the working groups of that committee. The purpose here is to present a candidate model for the development life cycle of Knowledge Based Systems (KBS). The intent is for the model to be used by the Aerospace Community and eventually be evolved into a standard. The model is rooted in the evolutionary model, borrows from the spiral model, and is embedded in the standard Waterfall model for software development. It is intended to support the development of both stand-alone and embedded KBSs. The phases of the life cycle are detailed, as are the review points that constitute the key milestones throughout the development process. The applicability and strengths of the model are discussed, along with areas needing further development and refinement by the aerospace community.

  19. Knowledge-Based Framework: its specification and new related discussions

    NASA Astrophysics Data System (ADS)

    Rodrigues, Douglas; Zaniolo, Rodrigo R.; Branco, Kalinka R. L. J. C.

    2015-09-01

    Unmanned Aerial Vehicles are a common application of critical embedded systems. The heterogeneity prevalent in these vehicles in terms of avionics services is particularly relevant to the elaboration of multi-application missions. Moreover, this heterogeneity in UAV services is often manifested in the form of characteristics such as reliability, security and performance. Different service implementations typically offer different guarantees in terms of these characteristics and their associated costs. In particular, we explore the notion of Service-Oriented Architecture (SOA) in the context of UAVs as safety-critical embedded systems for the composition of services to fulfil application-specified performance and dependability guarantees. We therefore propose a framework for the deployment of these services and their variants, called the Knowledge-Based Framework for Dynamically Changing Applications (KBF), and we specify its services module, discussing all the related issues.

  20. Sentence Topics Based Knowledge Acquisition for Question Answering

    NASA Astrophysics Data System (ADS)

    Oh, Hyo-Jung; Yun, Bo-Hyun

    This paper presents a knowledge acquisition method that uses sentence topics for question answering. We define templates for information extraction semi-automatically using the Korean concept network. Moreover, we propose a two-phase information extraction model based on hybrid machine learning, namely maximum entropy and conditional random fields. In our experiments, we examined the role of sentence topics in the template-filling task for information extraction. Our results show an improvement of 18% in F-score and a 434% improvement in training speed over the plain CRF-based method for the extraction task, and an improvement of 8% in F-score on the subsequent QA task.

  1. Generating connections and learning with SemNet, a tool for constructing knowledge networks

    NASA Astrophysics Data System (ADS)

    Gorodetsky, Malka; Fisher, Kathleen M.; Wyman, Barbara

    1994-09-01

    In this paper we examine the impact of using a Macintosh-based knowledge organization tool, SemNet, with prospective elementary and middle school teachers enrolled in an upper-division biology course. The course models for students the ways in which they will be able to teach hands-on, minds-on science in K-8 classrooms and provides them with an in-depth understanding of a relatively small number of biology topics. This study examines changes in learning habits, metacognitive processes, retention, retrieval, and learning among students enrolled in this course. Students using SemNet tend to exhibit a significant increase in deep processing as measured by self-report. Also on the basis of self-report, SemNet students appear to acquire some cognitive skills that transfer to other courses, such as identifying main ideas and tying ideas together. SemNet students retained and retrieved nearly twice as much information about a topic, the digestive system, as a reference group. Although neither the SemNet nor the reference group exhibited transfer skills as we measured them, there is evidence that SemNet students changed their thinking strategies.

  2. A national knowledge-based crop recognition in Mediterranean environment

    NASA Astrophysics Data System (ADS)

    Cohen, Yafit; Shoshany, Maxim

    2002-08-01

    Population growth, urban expansion, land degradation, civil strife and war may place plant natural resources for food and agriculture at risk. Crop and yield monitoring provides basic information necessary for wise management of these resources. Satellite remote sensing techniques have proven to be cost-effective over widespread agricultural lands in Africa, America, Europe and Australia. However, they have had limited success in Mediterranean regions, which are characterized by a high rate of spatio-temporal ecological heterogeneity and high fragmentation of farming lands. An integrative knowledge-based approach that combines imagery and geographical data within the framework of an intelligent recognition system is needed for this purpose. This paper describes the development of such a crop recognition methodology and its application to an area that comprises approximately 40% of the cropland in Israel. This area contains eight crop types that represent 70% of Israeli agricultural production. Multi-date Landsat TM images representing seasonal vegetation-cover variations were converted to normalized difference vegetation index (NDVI) layers. Field boundaries were delineated by merging Landsat data with SPOT panchromatic images. Crop recognition was then achieved in two phases: clustering the multi-temporal NDVI layers using unsupervised classification, and then applying 'split-and-merge' rules to these clusters. These rules were formalized through comprehensive learning of relationships between crop types, imagery properties (spectral and NDVI) and auxiliary data including agricultural knowledge, precipitation and soil types. Assessment of the recognition results using ground data from the Israeli Agriculture Ministry indicated an average recognition accuracy exceeding 85%, accounting for both omission and commission errors. The two-phase strategy implemented in this study is apparently successful for heterogeneous regions. This is due to the fact that it allows
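    The NDVI conversion mentioned above is a standard band ratio, NDVI = (NIR - Red) / (NIR + Red), computed per pixel from the near-infrared and red reflectance bands. A minimal sketch (the reflectance values below are hypothetical, not from the study's Landsat data):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized difference vegetation index from near-infrared and
    red reflectance bands; eps guards against division by zero."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Hypothetical 2x2 reflectance patches: top row dense crop, bottom row bare soil.
nir = np.array([[0.50, 0.48], [0.20, 0.21]])
red = np.array([[0.08, 0.09], [0.18, 0.17]])
v = ndvi(nir, red)
```

    Healthy vegetation reflects strongly in the near infrared and absorbs red light, so crop pixels score high while bare soil stays near zero; stacking such layers across dates yields the multi-temporal NDVI input to the clustering phase.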

  3. Incremental Knowledge Base Construction Using DeepDive

    PubMed Central

    Shin, Jaeho; Wu, Sen; Wang, Feiran; De Sa, Christopher; Zhang, Ce; Ré, Christopher

    2016-01-01

    Populating a database with unstructured information is a long-standing problem in industry and research that encompasses problems of extraction, cleaning, and integration. Recent names used for this problem include dealing with dark data and knowledge base construction (KBC). In this work, we describe DeepDive, a system that combines database and machine learning ideas to help develop KBC systems, and we present techniques to make the KBC process more efficient. We observe that the KBC process is iterative, and we develop techniques to incrementally produce inference results for KBC systems. We propose two methods for incremental inference, based respectively on sampling and variational techniques. We also study the tradeoff space of these methods and develop a simple rule-based optimizer. DeepDive includes all of these contributions, and we evaluate DeepDive on five KBC systems, showing that it can speed up KBC inference tasks by up to two orders of magnitude with negligible impact on quality. PMID:27144081

  4. Effects of people knowledge on science learning in a computer-based learning environment

    NASA Astrophysics Data System (ADS)

    Hong, Huang-Yao

    A weakness inherent in science education has been, and continues to be, its emphasis principally on the teaching of scientific knowledge, i.e. knowledge of the object (or the observed). Little attention has been directed to the teaching of people knowledge about scientists, i.e. knowledge of the subject (or the observer), who generates scientific knowledge. This study explored the possible effects of people knowledge on science learning. Participants in the study were 323 tenth graders from nine classes in a public school in Taipei, Taiwan. They were randomly assigned to three groups to self-study science in a computer-based learning environment. The control group was instructed to study various scientific laws discovered by three scientists in three science lessons. The other two groups were instructed to study the same science lessons after studying one of two kinds of people knowledge about the three scientists: achievement-oriented people knowledge (APK) and process-oriented people knowledge (PPK). APK profiles scientists' scientific achievements, and PPK describes scientists' struggles before making the scientific discoveries. The main findings were: Firstly, it was found from problem-solving tests that all three groups performed equally well in applying what they learned from the lessons to solve textbook problems. However, in applying what they learned to interpret the relationships between scientific laws, only the PPK group performed better. Secondly, regarding learning interest, among the students who showed high personal interest in science, the APK group tended to consider the lessons as less interesting than the control group. Among the students who demonstrated low personal interest in science, the PPK group tended to consider the science lessons as more interesting than the control group. 
Thirdly, in describing their image of the three scientists, the APK group tended to emphasize the abilities and successes of the scientists, whereas the PPK group

  5. Virus-based piezoelectric energy generator

    SciTech Connect

    2012-01-01

    Lawrence Berkeley National Laboratory scientists have developed a way to generate power using harmless viruses that convert mechanical energy into electricity. The milestone could lead to tiny devices that harvest electrical energy from the vibrations of everyday tasks. The first part of the video shows how Berkeley Lab scientists harness the piezoelectric properties of the virus to convert the force of a finger tap into electricity. The second part reveals the "viral-electric" generators in action, first by pressing only one of the generators, then by pressing two at the same time, which produces more current.

  6. Exploring the Impact of Varying Degrees of Cognitive Conflict in the Generation of both Subject and Pedagogical Knowledge as Primary Trainee Teachers Learn about Shadow Formation

    NASA Astrophysics Data System (ADS)

    Parker, Joan

    2006-10-01

    Primary teacher preparation courses need to support students in developing not only science content knowledge, but also pedagogical knowledge appropriate to the effective translation and representation of subject matter for learners in classrooms. In the case of the generalist primary trainee, this constitutes a considerable challenge. This study explored how a group of 13 primary trainees developed subject and pedagogical knowledge during university-based training as they investigated shadow production in a variety of contexts using cognitive conflict as a strategy for promoting conceptual change. By using a metacognitive approach, students analysed their own learning in response to increasing depth of conflict within a series of shadow investigations. The results indicate that the depth of conflict perceived by the learners in this study was instrumental in inducing conceptual change and generating pedagogical insight within the domain of light.

  7. The Knowledge-Based Economy and E-Learning: Critical Considerations for Workplace Democracy

    ERIC Educational Resources Information Center

    Remtulla, Karim A.

    2007-01-01

    The ideological shift by nation-states to "a knowledge-based economy" (also referred to as "knowledge-based society") is causing changes in the workplace. Brought about by the forces of globalisation and technological innovation, the ideologies of the "knowledge-based economy" are not limited to influencing the…

  8. Strategy Regulation: The Role of Intelligence, Metacognitive Attributions, and Knowledge Base.

    ERIC Educational Resources Information Center

    Alexander, Joyce M.; Schwanenflugel, Paula J.

    1994-01-01

    Studied influence of intelligence, metacognitive attributions, and knowledge base coherence in the regulation of the category-sorting strategy in first and second graders. Knowledge base was a powerful predictor of strategic-looking behavior; metacognitive attribution was most influential in low knowledge base conditions; and intelligence had…

  9. Identification of threats using linguistics-based knowledge extraction.

    SciTech Connect

    Chew, Peter A.

    2008-09-01

    One of the challenges increasingly facing intelligence analysts, along with professionals in many other fields, is the vast amount of data which needs to be reviewed and converted into meaningful information, and ultimately into rational, wise decisions by policy makers. The advent of the world wide web (WWW) has magnified this challenge. A key hypothesis which has guided us is that threats come from ideas (or ideology), and ideas are almost always put into writing before the threats materialize. While in the past the 'writing' might have taken the form of pamphlets or books, today's medium of choice is the WWW, precisely because it is a decentralized, flexible, and low-cost method of reaching a wide audience. However, a factor which complicates matters for the analyst is that material published on the WWW may be in any of a large number of languages. In 'Identification of Threats Using Linguistics-Based Knowledge Extraction', we have sought to use Latent Semantic Analysis (LSA) and other similar text analysis techniques to map documents from the WWW, in whatever language they were originally written, to a common language-independent vector-based representation. This then opens up a number of possibilities. First, similar documents can be found across language boundaries. Secondly, a set of documents in multiple languages can be visualized in a graphical representation. These alone offer potentially useful tools and capabilities to the intelligence analyst whose knowledge of foreign languages may be limited. Finally, we can test the over-arching hypothesis--that ideology, and more specifically ideology which represents a threat, can be detected solely from the words which express the ideology--by using the vector-based representation of documents to predict additional features (such as the ideology) within a framework based on supervised learning. In this report, we present the results of a three-year project of the same name. We believe these results clearly
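
    The core LSA step described above, mapping documents into a low-dimensional latent vector space via a truncated SVD of the term-document matrix, can be sketched as follows. The toy matrix and the number of latent dimensions are invented for illustration; in the multilingual setting, aligned parallel documents would share columns so that terms from different languages land in a common space:

```python
import numpy as np

# Toy term-document matrix: rows = terms, columns = documents.
A = np.array([
    [2, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 2, 0, 1],
    [0, 0, 1, 2],
], dtype=float)

k = 2  # number of latent dimensions to keep (truncation rank)
U, s, Vt = np.linalg.svd(A, full_matrices=False)
doc_vectors = (np.diag(s[:k]) @ Vt[:k]).T   # one k-dim vector per document

def cosine(u, v):
    """Cosine similarity between two latent document vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Similar documents (even across languages, given aligned training data)
# have high cosine similarity in the latent space.
sim = cosine(doc_vectors[0], doc_vectors[1])
```

The latent vectors can also serve directly as features for the supervised classification of ideology mentioned in the abstract.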

  10. Computer game-based and traditional learning method: a comparison regarding students’ knowledge retention

    PubMed Central

    2013-01-01

    Background Educational computer games are examples of computer-assisted learning objects, representing an educational strategy of growing interest. Given the changes in the digital world over the last decades, students of the current generation expect technology to be used in advancing their learning, requiring a need to change traditional passive learning methodologies to an active multisensory experimental learning methodology. The objective of this study was to compare a computer game-based learning method with a traditional learning method, regarding learning gains and knowledge retention, as a means of teaching head and neck Anatomy and Physiology to Speech-Language and Hearing pathology undergraduate students. Methods Students were randomized to one of the learning methods and the data analyst was blinded to which method of learning the students had received. Students’ prior knowledge (i.e. before undergoing the learning method), short-term knowledge retention and long-term knowledge retention (i.e. six months after undergoing the learning method) were assessed with a multiple choice questionnaire. Students’ performance was compared across the three moments of assessment, both for the mean total score and for separate mean scores on the Anatomy questions and the Physiology questions. Results Students that received the game-based method performed better in the post-test assessment only when considering the Anatomy questions section. Students that received the traditional lecture performed better in both post-test and long-term post-test when considering the Anatomy and Physiology questions. Conclusions The game-based learning method is comparable to the traditional learning method in general and in short-term gains, while the traditional lecture still seems to be more effective at improving students’ short- and long-term knowledge retention. PMID:23442203

  11. Recognition Of Partially Occluded Workpieces By A Knowledge-Based System

    NASA Astrophysics Data System (ADS)

    Serpico, S. B.; Vernazza, G.; Dellepiane, S.; Angela, P.

    1987-01-01

    A knowledge-based system is presented that is oriented toward partially occluded 2-D workpiece recognition in TV camera images. The generalized Hough transform is employed to extract elementary edge patterns. Intrinsic and relational information regarding elementary patterns is computed and then stored inside a net of frames. A similar net of frames is employed for workpiece model representation, for easy matching with the previous net. A set of production rules provides the heuristics for locating focus-of-attention regions, while other production rules specify modalities for applying a hypothesis-generation-and-test process. Experimental results on a set of 20 workpieces are reported.
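
    The generalized Hough transform step can be sketched as follows. This is a minimal illustration, not the authors' implementation: an R-table maps edge orientations to displacements toward a model reference point, and scene edge points vote for candidate reference locations. Because each visible edge point votes independently, the scheme tolerates partial occlusion:

```python
import math
from collections import defaultdict

def build_r_table(model_points, reference):
    """R-table: edge orientation -> displacement vectors from the edge
    point to the model's reference point. Orientations are coarsely
    binned here by rounding to two decimals (illustration only)."""
    table = defaultdict(list)
    rx, ry = reference
    for (x, y, theta) in model_points:   # theta = edge orientation (radians)
        table[round(theta, 2)].append((rx - x, ry - y))
    return table

def vote(image_points, r_table):
    """Accumulate votes for candidate reference-point locations."""
    acc = defaultdict(int)
    for (x, y, theta) in image_points:
        for (dx, dy) in r_table.get(round(theta, 2), []):
            acc[(x + dx, y + dy)] += 1
    return acc

# Model: three edge points of a shape, reference point at (0, 0).
model = [(-2, 0, 0.0), (2, 0, math.pi / 2), (0, 3, 1.0)]
r_table = build_r_table(model, (0, 0))

# Same shape translated by (10, 5), partially occluded (one point missing).
scene = [(8, 5, 0.0), (12, 5, math.pi / 2)]
acc = vote(scene, r_table)
best = max(acc, key=acc.get)    # recovered reference-point location
```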

  12. Advanced Coal-Based Power Generations

    NASA Technical Reports Server (NTRS)

    Robson, F. L.

    1982-01-01

    Advanced power-generation systems using coal-derived fuels are evaluated in a two-volume report. Report considers fuel cells, combined gas- and steam-turbine cycles, and magnetohydrodynamic (MHD) energy conversion. Presents technological status of each type of system and analyzes performance of each operating on medium-Btu fuel gas, either delivered via pipeline to powerplant or generated by coal-gasification process at plantsite.

  13. SAFOD Brittle Microstructure and Mechanics Knowledge Base (SAFOD BM2KB)

    NASA Astrophysics Data System (ADS)

    Babaie, H. A.; Hadizadeh, J.; di Toro, G.; Mair, K.; Kumar, A.

    2008-12-01

    We have developed a knowledge base to store and present the data collected by a group of investigators studying the microstructures and mechanics of brittle faulting using core samples from the SAFOD (San Andreas Fault Observatory at Depth) project. The investigations are carried out with a variety of analytical and experimental methods, primarily to better understand the physics of strain localization in fault gouge. The knowledge base instantiates a specially-designed brittle rock deformation ontology developed at Georgia State University. The inference rules embedded in the semantic web languages used in our ontology, such as OWL, RDF, and RDFS, allow the Pellet reasoner used in this application to derive additional truths about the ontology and knowledge of this domain. Access to the knowledge base is via a public website, which is designed to provide the knowledge acquired by all the investigators involved in the project. The stored data will be products of studies such as: experiments (e.g., high-velocity friction experiments), analyses (e.g., microstructural, chemical, mass transfer, mineralogical, surface, image, texture), microscopy (optical, HRSEM, FESEM, HRTEM), tomography, porosity measurement, microprobe, and cathodoluminescence. Data about laboratories, experimental conditions, methods, assumptions, equipment, and the mechanical properties and lithology of the studied samples will also be presented on the website per investigation. The ontology was modeled applying the UML (Unified Modeling Language) in Rational Rose, and implemented in OWL-DL (Web Ontology Language) using the Protégé ontology editor. The UML model was converted to OWL-DL by first mapping it to Ecore (.ecore) and Generator model (.genmodel) files with the help of the EMF (Eclipse Modeling Framework) plugin in Eclipse. The Ecore model was then mapped to a .uml file, which was later converted into an .owl file and subsequently imported into the Protégé ontology editing environment

  14. Knowledge-based approach to multiple-transaction processing and distributed data-base design

    SciTech Connect

    Park, J.T.

    1987-01-01

    The collective processing of multiple transactions in a data-base system has recently received renewed attention due to its capability of improving the overall performance of a data-base system and its applicability to the design of knowledge-based expert systems and extensible data-base systems. This dissertation consists of two parts. The first part presents a new knowledge-based approach to the problems of processing multiple concurrent queries and distributing replicated data objects for further improvement of the overall system performance. The second part deals with distributed database design, i.e., designing horizontal fragments using a semantic knowledge, and allocating data in a distributed environment. The semantic knowledge on data such as functional dependencies and semantic-data-integrity constraints are newly exploited for the identification of subset relationships between intermediate results of query executions involving joins, such that the (intermediate) results of queries can be utilized for the efficient processing of other queries. The expertise on the collective processing of multiple transactions is embodied into the rules of a rule-based expert system, MTP (Multiple Transaction Processor). In the second part, MTP is applied for the determination of horizontal fragments exploiting the semantic knowledge. Heuristics for allocating data in local area networks are developed.

  15. Development of a Prototype Model-Form Uncertainty Knowledge Base

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.

    2016-01-01

    Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form means that, among the choices to be made during a design process within an analysis, there are different forms of the analysis process, each of which gives different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structures analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, so as to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations in the KB, and possible workarounds, are explained.

  16. Speech-Language Pathologists' Knowledge of Genetics: Perceived Confidence, Attitudes, Knowledge Acquisition and Practice-Based Variables

    ERIC Educational Resources Information Center

    Tramontana, G. Michael; Blood, Ingrid M.; Blood, Gordon W.

    2013-01-01

    The purpose of this study was to determine (a) the general knowledge bases demonstrated by school-based speech-language pathologists (SLPs) in the area of genetics, (b) the confidence levels of SLPs in providing services to children and their families with genetic disorders/syndromes, (c) the attitudes of SLPs regarding genetics and communication…

  17. Data/knowledge Base Processing Using Optical Associative Architectures

    NASA Astrophysics Data System (ADS)

    Akyokus, Selim

    Optical storage, communication, and processing technologies will have a great impact on the future data/knowledge base processing systems. The use of optics in data/knowledge base processing requires new design methods, architectures, and algorithms to apply the optical technology successfully. In this dissertation, three optical associative architectures are proposed. The basic data element in the proposed systems is a 2-D data page. Pages of database relations are stored in a page-oriented optical mass memory, retrieved, and processed in parallel. The first architecture uses a 1-D optical content addressable memory (OCAM) as the main functional unit. A 1-D OCAM is basically an optical vector-matrix multiplier which works as a CAM due to the spatial coding used for bit matching and masking. A 1-D OCAM can compare a search argument with a data page in parallel. The second architecture uses a 2-D OCAM as a main functional unit. A 2-D OCAM is an optical matrix-matrix multiplier which enables the comparison of a page of search arguments with a data page in parallel and in a single step. This architecture allows the execution of multiple selection and join operations very fast. The third architecture uses an optical perfect shuffle network for data routing and a processing array for performing parallel logic operations. A processing array based on symbolic substitution logic is introduced, and the use of a smart SLM as processing array is discussed. The symbolic substitution rules and algorithms for the implementation of search and bitonic sort operations are given for the proposed system. The implementation of relational database operations: selection, projection, update, deletion, sorting, duplication removal, aggregation functions, join, and set operations are described for the proposed systems, timing equations are developed for each operation, and their performances are analyzed. The proposed architectures take advantage of one-to-one mapping among the physical
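
    The use of a vector-matrix multiplier as a content-addressable memory rests on the spatial coding mentioned above: with a dual-rail bit code, an exact match between a search argument and a stored word yields a zero product, and the multiply compares the argument against a whole data page in parallel. A minimal numerical sketch with invented page contents (a masked bit could be coded as (0, 0) in the query so it contributes nothing):

```python
import numpy as np

def dual_rail(bits):
    """Dual-rail spatial code for the search argument: bit b -> (b, 1-b)."""
    out = []
    for b in bits:
        out.extend([b, 1 - b])
    return np.array(out)

def complement_rail(bits):
    """Complementary code for stored words: bit b -> (1-b, b)."""
    out = []
    for b in bits:
        out.extend([1 - b, b])
    return np.array(out)

# A "data page" of stored words (rows), as in a page-oriented memory.
page = np.array([[1, 0, 1, 1],
                 [0, 0, 1, 0],
                 [1, 0, 1, 0]])

query = [1, 0, 1, 0]                               # search argument
M = np.stack([complement_rail(w) for w in page])   # coded stored page
q = dual_rail(query)

mismatches = M @ q              # vector-matrix product = mismatch counts
matches = np.where(mismatches == 0)[0]   # rows matching the query exactly
```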

  18. Prospector II: Towards a knowledge base for mineral deposits

    USGS Publications Warehouse

    McCammon, R.B.

    1994-01-01

    What began in the mid-seventies as a research effort in designing an expert system to aid geologists in exploring for hidden mineral deposits has in the late eighties become a full-sized knowledge-based system to aid geologists in conducting regional mineral resource assessments. Prospector II, the successor to Prospector, is interactive-graphics oriented, flexible in its representation of mineral deposit models, and suited to regional mineral resource assessment. In Prospector II, the geologist enters the findings for an area, selects the deposit models or examples of mineral deposits for consideration, and the program compares the findings with the models or the examples selected, noting the similarities, differences, and missing information. The models or the examples selected are ranked according to scores that are based on the comparisons with the findings. Findings can be reassessed and the process repeated if necessary. The results provide the geologist with a rationale for identifying those mineral deposit types that the geology of an area permits. In the future, Prospector II can assist in the creation of new models used in regional mineral resource assessment and in striving toward an ultimate classification of mineral deposits. © 1994 International Association for Mathematical Geology.

  19. Effects of delays on 6-year-old children's self-generation and retention of knowledge through integration.

    PubMed

    Varga, Nicole L; Bauer, Patricia J

    2013-06-01

    The current research was an investigation of the effect of delay on self-generation and retention of knowledge derived through integration by 6-year-old children. Children were presented with novel facts from passages read aloud to them (i.e., "stem" facts) and tested for self-generation of new knowledge through integration of the facts. In Experiment 1, children integrated the stem facts at Session 1 and retained the self-generated memory traces over 1 week. In Experiment 2, 1-week delays were imposed either between the to-be-integrated facts (between-stem delay) or after the stem facts but before the test (before-test delay). Integration performance was diminished in both conditions. Moreover, memory for individual stem facts was lower in Experiment 2 than in Experiment 1, suggesting that self-generation through integration promoted memory for explicitly taught information. The results indicate the importance of tests for promoting self-generation through integration as well as for retaining newly self-generated and explicitly taught information.

  20. Cartesian-cell based grid generation and adaptive mesh refinement

    NASA Technical Reports Server (NTRS)

    Coirier, William J.; Powell, Kenneth G.

    1993-01-01

    Viewgraphs on Cartesian-cell based grid generation and adaptive mesh refinement are presented. Topics covered include: grid generation; cell cutting; data structures; flow solver formulation; adaptive mesh refinement; and viscous flow.
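
    The cell-based adaptive refinement idea listed above can be sketched with a quadtree, where leaf cells form the mesh and cells flagged by an error indicator are split into four children. The class names and the distance-based indicator below are illustrative, not from the viewgraphs:

```python
class Cell:
    """A Cartesian cell in a quadtree; leaf cells form the mesh."""
    def __init__(self, x, y, size, level=0):
        self.x, self.y, self.size, self.level = x, y, size, level
        self.children = []

    def refine(self):
        """Split the cell into four equal Cartesian children."""
        h = self.size / 2
        self.children = [Cell(self.x + i * h, self.y + j * h, h, self.level + 1)
                         for i in (0, 1) for j in (0, 1)]

def adapt(cell, needs_refinement, max_level):
    """Recursively refine cells flagged by the error indicator."""
    if cell.level < max_level and needs_refinement(cell):
        cell.refine()
        for child in cell.children:
            adapt(child, needs_refinement, max_level)

def leaves(cell):
    """Collect the leaf cells, i.e. the current mesh."""
    if not cell.children:
        return [cell]
    return [l for c in cell.children for l in leaves(c)]

# Refine toward a feature near the origin (stand-in for a shock or body).
root = Cell(0.0, 0.0, 1.0)
adapt(root, lambda c: (c.x ** 2 + c.y ** 2) ** 0.5 < c.size, max_level=3)
mesh = leaves(root)
```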

  1. Multilayered Knowledge: Understanding the Structure and Enactment of Teacher Educators' Specialized Knowledge Base

    ERIC Educational Resources Information Center

    Selmer, Sarah; Bernstein, Malayna; Bolyard, Johnna

    2016-01-01

    In order to corroborate and grow teacher educator knowledge (TEK) scholarship, this paper describes an in-depth-focused exploration of a group of teacher educators providing professional development. Our grounded data analysis allowed us to define different major elements, sub-elements, and components that comprise TEK, as well as make explicit…

  2. Socioscientific Issues-Based Instruction: An Investigation of Agriscience Students' Content Knowledge Based on Student Variables

    ERIC Educational Resources Information Center

    Shoulders, Catherine W.; Myers, Brian E.

    2013-01-01

    Numerous researchers in science education have reported student improvement in areas of scientific literacy resulting from socioscientific issues (SSI)-based instruction. The purpose of this study was to describe student agriscience content knowledge following a six-week SSI-based instructional unit focusing on the introduction of cultured meat…

  3. Next generation sequencing based approaches to epigenomics

    PubMed Central

    Marra, Marco A.

    2010-01-01

    Next generation sequencing has brought epigenomic studies to the forefront of current research. The power of massively parallel sequencing coupled to innovative molecular and computational techniques has allowed researchers to profile the epigenome at resolutions that were unimaginable only a few years ago. With early proof of concept studies published, the field is now moving into the next phase where the importance of method standardization and rigorous quality control are becoming paramount. In this review we will describe methodologies that have been developed to profile the epigenome using next generation sequencing platforms. We will discuss these in terms of library preparation, sequence platforms and analysis techniques. PMID:21266347

  4. Increasing levels of assistance in refinement of knowledge-based retrieval systems

    NASA Technical Reports Server (NTRS)

    Baudin, Catherine; Kedar, Smadar; Pell, Barney

    1994-01-01

    The task of incrementally acquiring and refining the knowledge and algorithms of a knowledge-based system in order to improve its performance over time is discussed. In particular, the design of DE-KART, a tool whose goal is to provide increasing levels of assistance in acquiring and refining indexing and retrieval knowledge for a knowledge-based retrieval system, is presented. DE-KART starts with knowledge that was entered manually, and increases its level of assistance in acquiring and refining that knowledge, both in terms of the increased level of automation in interacting with users, and in terms of the increased generality of the knowledge. DE-KART is at the intersection of machine learning and knowledge acquisition: it is a first step towards a system which moves along a continuum from interactive knowledge acquisition to increasingly automated machine learning as it acquires more knowledge and experience.

  5. Quantum mechanical energy-based screening of combinatorially generated library of tautomers. TauTGen: a tautomer generator program.

    PubMed

    Harańczyk, Maciej; Gutowski, Maciej

    2007-01-01

    We describe a procedure for finding low-energy tautomers of a molecule. The procedure consists of (i) combinatorial generation of a library of tautomers, (ii) screening based on the results of geometry optimization of the initial structures performed at the density functional level of theory, and (iii) final refinement of the geometry of the top hits at the second-order Møller-Plesset level of theory, followed by single-point energy calculations at the coupled cluster level of theory with single, double, and perturbative triple excitations. The library of initial structures of the various tautomers is generated with TauTGen, a tautomer generator program. The procedure proved successful for molecular systems for which common chemical knowledge had not been sufficient to predict the most stable structures.
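
    The combinatorial generation and energy-based screening steps can be sketched as follows. The site labels, proton count, and energy function below are invented stand-ins (a real screening would call a quantum chemistry code for each candidate rather than a lookup table):

```python
from itertools import combinations

def generate_tautomers(sites, n_protons):
    """Combinatorially place n_protons movable hydrogens on candidate
    heavy-atom sites; each placement is one tautomer candidate."""
    return [frozenset(c) for c in combinations(sites, n_protons)]

# Hypothetical heterocycle: four protonation sites, two mobile protons.
sites = ["N1", "N3", "O2", "O4"]
library = generate_tautomers(sites, 2)    # 4 choose 2 = 6 candidates

def mock_energy(tautomer):
    """Stand-in for a computed relative energy (lower = more stable)."""
    penalty = {"N1": 0.0, "N3": 0.5, "O2": 1.2, "O4": 1.4}
    return sum(penalty[s] for s in tautomer)

# Screening: rank the library by energy, keep the top hits for refinement.
ranked = sorted(library, key=mock_energy)
top_hits = ranked[:3]
```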

  6. Voice care knowledge by dysphonic and healthy individuals of different generations.

    PubMed

    Moreti, Felipe; Zambon, Fabiana; Behlau, Mara

    2016-01-01

    The purpose of this study was to identify the opinions of both dysphonic and vocally healthy individuals regarding the factors that affect their voices positively and negatively, analyzing them according to the generation to which the participants belong. Eight hundred sixty-six individuals (304 dysphonic and 562 vocally healthy; 196 men and 670 women) categorized by generation: 22 individuals in Silent Generation (1926-1945), 180 in Baby Boomers (1946-1964), 285 in Generation X (1965-1981), and 379 in Generation Y (1982-2003) responded to two open questions: "Cite five things that you believe are good/bad to your voice". Five thousand, two hundred sixty answers were identified (2478 positive and 2782 negative) and organized in 365 factors related to voice care. The three most prevalent positive and negative factors for each generation were as follows: Silent Generation - positive factors: 1 - water, honey and pomegranate, 2 - apple, and 3 - ginger tea, voice exercises and gargling; negative factors: 1 - cold drinks, 2 - excessive speaking, and 3 - alcoholic drinks, smoking and screaming; Baby Boomers - positive factors: 1 - water, 2 - apple, and 3 - sleeping well; negative factors: 1 - cold drinks, 2 - screaming, and 3 - smoking; Generation X - positive factors: 1 - water, 2 - apple, and 3 - vocal warm-up; negative factors: 1 - screaming, 2 - smoking, and 3 - alcoholic drinks; and Generation Y - positive factors: 1 - water, 2 - apple, and 3 - vocal warm-up; negative factors: 1 - screaming, 2 - smoking, and 3 - alcoholic drinks. The impact of generation was greater on the frequency of the responses than on their type. Water and apple were the most frequently cited positive factors for all the generations investigated, whereas screaming and smoking were the most frequently mentioned negative factors. Behavioral aspects related to popular beliefs were reported more frequently by the older generations.

  7. Modeling Rule-Based Item Generation

    ERIC Educational Resources Information Center

    Geerlings, Hanneke; Glas, Cees A. W.; van der Linden, Wim J.

    2011-01-01

    An application of a hierarchical IRT model for items in families generated through the application of different combinations of design rules is discussed. Within the families, the items are assumed to differ only in surface features. The parameters of the model are estimated in a Bayesian framework, using a data-augmented Gibbs sampler. An obvious…

  8. Next Generation Accelerator-Based Light Sources

    SciTech Connect

    Gwyn Williams

    2005-06-26

    We discuss the physics which is driving the evolution of new sources for microscopy and spectroscopy. A new generation of sources, called energy recovery linacs or ERL’s, will be described and reviewed with particular emphasis on the examples of imaging and spectroscopic applications enabled by them.

  9. Computer-Based Arithmetic Test Generation

    ERIC Educational Resources Information Center

    Trocchi, Robert F.

    1973-01-01

    The computer can be a welcome partner in the instructional process, but only if there is man-machine interaction. Man should not compromise system design because of available hardware; the computer must fit the system design for the result to represent an acceptable solution to instructional technology. The Arithmetic Test Generator system fits…

  10. Design and implementation of knowledge-based framework for ground objects recognition in remote sensing images

    NASA Astrophysics Data System (ADS)

    Chen, Shaobin; Ding, Mingyue; Cai, Chao; Fu, Xiaowei; Sun, Yue; Chen, Duo

    2009-10-01

    Advances in image processing have made knowledge-based automatic image interpretation far more realistic than ever. In the domain of remote sensing image processing, the introduction of knowledge increases the confidence with which typical ground objects are recognized. There are two main approaches to employing knowledge: the first scatters knowledge through concrete program code, so that the relevant knowledge of ground objects is fixed at programming time; the second stores knowledge systematically in a knowledge base, offering a unified instruction for each object recognition procedure. In this paper, a knowledge-based framework for ground object recognition in remote sensing images is proposed. This framework takes the second approach, with a hierarchical architecture. The recognition of a typical airport demonstrated the feasibility of the proposed framework.

  11. Widening the Knowledge Acquisition Bottleneck for Constraint-Based Tutors

    ERIC Educational Resources Information Center

    Suraweera, Pramuditha; Mitrovic, Antonija; Martin, Brent

    2010-01-01

    Intelligent Tutoring Systems (ITS) are effective tools for education. However, developing them is a labour-intensive and time-consuming process. A major share of the effort is devoted to acquiring the domain knowledge that underlies the system's intelligence. The goal of this research is to reduce this knowledge acquisition bottleneck and better…

  12. Examining the Mismatch between Pupil and Teacher Knowledge in Acid-Base Chemistry.

    ERIC Educational Resources Information Center

    Erduran, Sibel

    2003-01-01

    Reports a mismatch between teacher and pupil knowledge of acid-base chemistry as a result of controversial episodes from three science lessons. Suggests that the teacher's knowledge is guided by textbook information while the pupil's knowledge is based on direct experimental experience. Proposes that classroom activities should support the…

  13. Users' Attitudes toward Web-Based Collaborative Learning Systems for Knowledge Management

    ERIC Educational Resources Information Center

    Liaw, Shu-Sheng; Chen, Gwo-Dong; Huang, Hsiu-Mei

    2008-01-01

    The Web-based technology is a potential tool for supported collaborative learning that may enrich learning performance, such as individual knowledge construction or group knowledge sharing. Thus, understanding Web-based collaborative learning for knowledge management is a critical issue. The present study is to investigate learners' attitudes…

  14. Estimating evaporative vapor generation from automobiles based on parking activities.

    PubMed

    Dong, Xinyi; Tschantz, Michael; Fu, Joshua S

    2015-07-01

A new approach is proposed to quantify evaporative vapor generation based on real parking activity data. Compared to existing methods, two improvements are applied in this new approach to reduce the uncertainties. First, evaporative vapor generation from diurnal parking events is usually calculated from an estimated average parking duration for the whole fleet; in this study, the vapor generation rate is instead calculated from the distribution of parking activities. Second, rather than using the daily temperature gradient, this study uses hourly temperature observations to derive hourly incremental vapor generation rates. The parking distribution and hourly incremental vapor generation rates are then combined with Wade-Reddy's equation to estimate the weighted average evaporative generation. We find that hourly incremental rates better describe the temporal variation of vapor generation, and that the weighted vapor generation rate is 5-8% lower than a calculation that does not consider parking activity.
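The weighting idea described in this abstract can be sketched in a few lines: instead of one fleet-average parking duration, each hour's incremental generation rate is weighted by the share of vehicles still parked in that hour. The shares and rates below are hypothetical numbers for illustration, not the study's data.

```python
def weighted_vapor_generation(parking_share_by_hour, hourly_rate_g_per_h):
    """Weight hourly incremental vapor-generation rates by the fraction of
    the fleet parked during each hour, then sum over the day."""
    assert len(parking_share_by_hour) == len(hourly_rate_g_per_h)
    return sum(share * rate
               for share, rate in zip(parking_share_by_hour, hourly_rate_g_per_h))

# Illustrative 3-hour window: 80% of vehicles parked in hour 1, 50% in
# hour 2, 20% in hour 3; rates in grams per parked vehicle per hour.
shares = [0.8, 0.5, 0.2]
rates = [1.0, 1.5, 0.5]
print(round(weighted_vapor_generation(shares, rates), 2))  # 0.8 + 0.75 + 0.1 = 1.65
```

A flat-average calculation would apply one rate to one assumed duration; the weighted sum captures how generation falls off as vehicles leave and as the hourly temperature-driven rate changes.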

  15. Knowledge-based operation and management of communications systems

    NASA Technical Reports Server (NTRS)

    Heggestad, Harold M.

    1988-01-01

    Expert systems techniques are being applied in operation and control of the Defense Communications System (DCS), which has the mission of providing reliable worldwide voice, data and message services for U.S. forces and commands. Thousands of personnel operate DCS facilities, and many of their functions match the classical expert system scenario: complex, skill-intensive environments with a full spectrum of problems in training and retention, cost containment, modernization, and so on. Two of these functions are: (1) fault isolation and restoral of dedicated circuits at Tech Control Centers, and (2) network management for the Defense Switched Network (the modernized dial-up voice system currently replacing AUTOVON). An expert system for the first of these is deployed for evaluation purposes at Andrews Air Force Base, and plans are being made for procurement of operational systems. In the second area, knowledge obtained with a sophisticated simulator is being embedded in an expert system. The background, design and status of both projects are described.

  16. Knowledge Based Cloud FE Simulation of Sheet Metal Forming Processes.

    PubMed

    Zhou, Du; Yuan, Xi; Gao, Haoxiang; Wang, Ailing; Liu, Jun; El Fakir, Omer; Politis, Denis J; Wang, Liliang; Lin, Jianguo

    2016-12-13

The use of Finite Element (FE) simulation software to adequately predict the outcome of sheet metal forming processes is crucial to enhancing the efficiency and lowering the development time of such processes, whilst reducing costs involved in trial-and-error prototyping. Recent focus on the substitution of steel components with aluminum alloy alternatives in the automotive and aerospace sectors has increased the need to simulate the forming behavior of such alloys for ever more complex component geometries. However, these alloys, and in particular their high strength variants, exhibit limited formability at room temperature, and high temperature manufacturing technologies have been developed to form them. Consequently, advanced constitutive models are required to reflect the associated temperature and strain rate effects. Simulating such behavior is computationally very expensive using conventional FE simulation techniques. This paper presents a novel Knowledge Based Cloud FE (KBC-FE) simulation technique that combines advanced material and friction models with conventional FE simulations in an efficient manner, thus enhancing the capability of commercial simulation software packages. The application of these methods is demonstrated through two example case studies, namely: the prediction of a material's forming limit under hot stamping conditions, and the tool life prediction under multi-cycle loading conditions.

  17. Knowledge Based Cloud FE Simulation of Sheet Metal Forming Processes

    PubMed Central

    Zhou, Du; Yuan, Xi; Gao, Haoxiang; Wang, Ailing; Liu, Jun; El Fakir, Omer; Politis, Denis J.; Wang, Liliang; Lin, Jianguo

    2016-01-01

The use of Finite Element (FE) simulation software to adequately predict the outcome of sheet metal forming processes is crucial to enhancing the efficiency and lowering the development time of such processes, whilst reducing costs involved in trial-and-error prototyping. Recent focus on the substitution of steel components with aluminum alloy alternatives in the automotive and aerospace sectors has increased the need to simulate the forming behavior of such alloys for ever more complex component geometries. However, these alloys, and in particular their high strength variants, exhibit limited formability at room temperature, and high temperature manufacturing technologies have been developed to form them. Consequently, advanced constitutive models are required to reflect the associated temperature and strain rate effects. Simulating such behavior is computationally very expensive using conventional FE simulation techniques. This paper presents a novel Knowledge Based Cloud FE (KBC-FE) simulation technique that combines advanced material and friction models with conventional FE simulations in an efficient manner, thus enhancing the capability of commercial simulation software packages. The application of these methods is demonstrated through two example case studies, namely: the prediction of a material's forming limit under hot stamping conditions, and the tool life prediction under multi-cycle loading conditions. PMID:28060298

  18. Sensor explication: knowledge-based robotic plan execution through logical objects.

    PubMed

    Budenske, J; Gini, M

    1997-01-01

    Complex robot tasks are usually described as high level goals, with no details on how to achieve them. However, details must be provided to generate primitive commands to control a real robot. A sensor explication concept that makes details explicit from general commands is presented. We show how the transformation from high-level goals to primitive commands can be performed at execution time and we propose an architecture based on reconfigurable objects that contain domain knowledge and knowledge about the sensors and actuators available. Our approach is based on two premises: 1) plan execution is an information gathering process where determining what information is relevant is a great part of the process; and 2) plan execution requires that many details are made explicit. We show how our approach is used in solving the task of moving a robot to and through an unknown, and possibly narrow, doorway; where sonic range data is used to find the doorway, walls, and obstacles. We illustrate the difficulty of such a task using data from a large number of experiments we conducted with a real mobile robot. The laboratory results illustrate how the proper application of knowledge in the integration and utilization of sensors and actuators increases the robustness of plan execution.

  19. Will They Engage? Political Knowledge, Participation and Attitudes of Generations X and Y.

    ERIC Educational Resources Information Center

    Soule, Suzanne

    Most data support the thesis of declining civic engagement among Generations X and Y. If levels of civic engagement remain depressed across the life cycles of Generations X and Y, U.S. democracy may be threatened, for there will be fewer engaged people to fulfill the obligations of democratic citizens. The hope is that youths' indifference to…

  20. Confronting the Technological Pedagogical Knowledge of Finnish Net Generation Student Teachers

    ERIC Educational Resources Information Center

    Valtonen, Teemu; Pontinen, Susanna; Kukkonen, Jari; Dillon, Patrick; Vaisanen, Pertti; Hacklin, Stina

    2011-01-01

    The research reported here is concerned with a critical examination of some of the assumptions concerning the "Net Generation" capabilities of 74 first-year student teachers in a Finnish university. There are assumptions that: (i) Net Generation students are adept at learning through discovery and thinking in a hypertext-like manner…

  1. Next Generation Multimedia Distributed Data Base Systems

    NASA Technical Reports Server (NTRS)

    Pendleton, Stuart E.

    1997-01-01

The paradigm of client/server computing is changing. The model of a server running a monolithic application and supporting clients at the desktop is giving way to a different model that blurs the line between client and server. We are on the verge of plunging into the next generation of computing technology: distributed object-oriented computing. This is not only a change in requirements but a change in opportunities, and requires a new way of thinking for Information System (IS) developers. The information system demands caused by global competition are requiring even more access to decision-making tools. Simply put, object-oriented technology has been developed to supersede the current information systems design process, which is not capable of handling next-generation multimedia.

  2. Knowledge-based optical coatings design and manufacturing

    NASA Astrophysics Data System (ADS)

    Guenther, Karl H.; Gonzalez, Avelino J.; Yoo, Hoi J.

    1990-12-01

The theory of thin film optics is well developed for the spectral analysis of a given optical coating. The inverse synthesis - designing an optical coating for a certain spectral performance - is more complicated. Usually a multitude of theoretical designs is feasible because most design problems are over-determined with the number of layers possible with three variables each (n, k, t). The expertise of a good thin film designer comes in at this point with a mostly intuitive selection of certain designs based on previous experience and current manufacturing capabilities. Manufacturing a designed coating poses yet another subset of multiple solutions, as thin film deposition technology has evolved over the years with a vast variety of different processes. The abundance of published literature may often be more confusing than helpful to the practicing thin film engineer, even if he has time and opportunity to read it. The choice of the right process is also severely limited by the given manufacturing hardware and cost considerations, which may not easily allow for the adoption of a new manufacturing approach, even if it promises to be better technically (it ought to be also cheaper). On the user end of the thin film coating business, the typical optical designer or engineer who needs an optical coating may have limited or no knowledge at all about the theoretical and manufacturing criteria for the optimum selection of what he needs. This can be sensed frequently in overly tight tolerances and requirements for optical performance which sometimes stretch the limits of mother nature. We introduce here a knowledge-based system (KBS) intended to assist expert designers and manufacturers in their task of maximizing results and minimizing errors, trial runs, and unproductive time. It will help the experts to manipulate parameters which are largely determined through heuristic reasoning by employing artificial intelligence techniques. In a later stage, the KBS will include a

  3. Acts of Discovery: Using Collaborative Research to Mobilize and Generate Knowledge about Visual Arts Teaching Practice

    ERIC Educational Resources Information Center

    Mitchell, Donna Mathewson

    2014-01-01

    Visual arts teachers engage in complex work on a daily basis. This work is informed by practical knowledge that is rarely examined or drawn on in research or in the development of policy. Focusing on the work of secondary visual arts teachers, this article reports on a research program conducted in a regional area of New South Wales, Australia.…

  4. Writing a bachelor thesis generates transferable knowledge and skills useable in nursing practice.

    PubMed

    Lundgren, Solveig M; Robertsson, Barbro

    2013-11-01

Generic skills or transferable skills have been discussed in terms of whether or not skills learned in one context can be transferred into another context. The current study aimed to explore nurses' self-perceptions of the knowledge and skills they had obtained while writing a Bachelor's thesis in nursing education, and their experience of the extent to which these transferred into and were utilized in their current work. Responding nurses (N=42) had all worked from 1 to 1.5 years after their final examination and had completed a questionnaire structured with open-ended questions. Only five nurses reported that they were unable to use any of the knowledge and skills they had obtained from writing a thesis. A majority of the nurses (37/42) could give many examples of the practical application of the skills and knowledge they had obtained. Our findings indicate that writing a thesis as part of an undergraduate degree program plays a major role in the acquisition and development of knowledge and skills which can subsequently be transferred into and utilized in nursing practice.

  5. Making Sense of Knowledge Transfer and Social Capital Generation for a Pacific Island Aid Infrastructure Project

    ERIC Educational Resources Information Center

    Manu, Christopher; Walker, Derek H. T.

    2006-01-01

    Purpose: The purpose of this research is to investigate how lessons learned from a case study of a construction project undertaken in the Pacific Islands relates to the interaction between social capital and knowledge transfer. The paper is reflective in nature focusing upon the experiences of one of the authors, being a Pacific Islander and…

  6. New knowledge-based genetic algorithm for excavator boom structural optimization

    NASA Astrophysics Data System (ADS)

    Hua, Haiyan; Lin, Shuwen

    2014-03-01

Because existing genetic algorithms make insufficient use of knowledge to guide the complex optimal search, they fail to solve the excavator boom structural optimization problem effectively. To improve optimization efficiency and quality, a new knowledge-based real-coded genetic algorithm is proposed. A dual evolution mechanism combining knowledge evolution with a genetic algorithm is established to extract, handle and utilize the shallow and deep implicit constraint knowledge, guiding the optimal search of the genetic algorithm cyclically. Based on this dual evolution mechanism, knowledge evolution and population evolution are connected by knowledge influence operators to improve the configurability of knowledge and genetic operators. New knowledge-based selection, crossover and mutation operators are then proposed to integrate optimal process knowledge and domain culture to guide the excavator boom structural optimization. Eight testing algorithms, which include different genetic operators, are taken as examples to solve the structural optimization of a medium-sized excavator boom. A comparison of the optimization results shows that the algorithm including all the new knowledge-based genetic operators improves the evolutionary rate and searching ability more markedly than the other testing algorithms, which demonstrates the effectiveness of knowledge for guiding the optimal search. The proposed knowledge-based genetic algorithm, combining multi-level knowledge evolution with numerical optimization, provides a new effective method for solving complex engineering optimization problems.
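The general idea of knowledge-guided genetic operators can be sketched with a toy real-coded GA in which simple constraint "knowledge" (variable bounds standing in for the boom's structural constraints) steers crossover and mutation back into the feasible region. The objective function, bounds, and repair rule below are illustrative assumptions, not the paper's excavator model.

```python
import random

random.seed(0)

BOUNDS = [(0.0, 1.0)] * 4            # assumed design-variable bounds

def fitness(x):                       # assumed stand-in objective (minimize)
    return sum((xi - 0.5) ** 2 for xi in x)

def apply_knowledge(x):
    """Constraint knowledge as a repair operator: clamp infeasible genes."""
    return [min(max(xi, lo), hi) for xi, (lo, hi) in zip(x, BOUNDS)]

def mutate(x, sigma=0.1):
    # Knowledge-influenced mutation: perturb, then repair to feasibility.
    return apply_knowledge([xi + random.gauss(0, sigma) for xi in x])

def crossover(a, b):
    # Blend crossover followed by the same knowledge-based repair.
    alpha = random.random()
    return apply_knowledge([alpha * ai + (1 - alpha) * bi for ai, bi in zip(a, b)])

pop = [apply_knowledge([random.random() for _ in BOUNDS]) for _ in range(20)]
for _ in range(60):
    pop.sort(key=fitness)
    elite = pop[:10]                  # truncation selection with elitism
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(10)]

best = min(pop, key=fitness)
print(round(fitness(best), 4))
```

The paper's contribution is far richer (extracted implicit knowledge, influence operators, dual evolution); the sketch only shows the basic mechanism of letting constraint knowledge shape each genetic operator instead of searching blindly.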

  7. BJUT at TREC 2015 Microblog Track: Real Time Filtering Using Knowledge Base

    DTIC Science & Technology

    2015-11-20

conference on Knowledge discovery and data mining, pages 133-142. ACM, 2002. ChengXiang Zhai and John Lafferty. Two-stage language models for information...BJUT at TREC 2015 Microblog Track: Real-Time Filtering Using Knowledge Base Luyang Liu1,2,3, Zhen Yang1,2,3,* 1. College of Computer Science, Beijing...classic retrieval model combined with the external knowledge base, i.e., Wikipedia, for query expansion. Besides, we introduced the knowledge

  8. Optimal Test Design with Rule-Based Item Generation

    ERIC Educational Resources Information Center

    Geerlings, Hanneke; van der Linden, Wim J.; Glas, Cees A. W.

    2013-01-01

    Optimal test-design methods are applied to rule-based item generation. Three different cases of automated test design are presented: (a) test assembly from a pool of pregenerated, calibrated items; (b) test generation on the fly from a pool of calibrated item families; and (c) test generation on the fly directly from calibrated features defining…

  9. Coal and Coal/Biomass-Based Power Generation

    EPA Science Inventory

For Frank Princiotta's book, Global Climate Change--The Technology Challenge. Coal is a key, growing component in power generation globally. It generates 50% of U.S. electricity, and criteria emissions from coal-based power generation are being reduced. However, CO2 emissions m...

  10. Rangeland degradation assessment: a new strategy based on the ecological knowledge of indigenous pastoralists

    NASA Astrophysics Data System (ADS)

    Behmanesh, Bahareh; Barani, Hossein; Abedi Sarvestani, Ahmad; Shahraki, Mohammad Reza; Sharafatmandrad, Mohsen

    2016-04-01

    In a changing world, the prevalence of land degradation is becoming a serious problem, especially in countries with arid and semi-arid rangelands. There are many techniques to assess rangeland degradation that rely on scientific knowledge but ignore indigenous people. Indigenous people have accumulated precious knowledge about land management through generations of experience. Therefore, a study was conducted to find out how indigenous people assess rangeland degradation and how their ecological knowledge can be used for rangeland degradation assessment. Interviews were conducted with the pastoralists of two sites (Dasht and Mirza Baylu), where part of both areas is located in Golestan National Park (north-eastern Iran). A structured questionnaire was designed based on 17 indicators taken from literature and also primary discussions with pastoralists in order to evaluate land degradation. A qualitative Likert five-point scale was used for scoring rangeland degradation indicators. The results revealed that pastoralists pay more attention to edaphic indicators than to vegetative and other indicators. There were significant differences between the inside and outside of the park in terms of rangeland degradation indicators for both sites. The results show that the rangelands outside of the park in both sites were degraded compared to those inside of the park, especially in the areas close to villages. It can be concluded that pastoralists have a wealth of knowledge about the vegetation and grazing animal habits that can be used in rangeland degradation assessment. It is therefore necessary to document their ecological indigenous knowledge and involve them in the process of rangeland-degradation assessment.

  11. Evolving Expert Knowledge Bases: Applications of Crowdsourcing and Serious Gaming to Advance Knowledge Development for Intelligent Tutoring Systems

    ERIC Educational Resources Information Center

    Floryan, Mark

    2013-01-01

    This dissertation presents a novel effort to develop ITS technologies that adapt by observing student behavior. In particular, we define an evolving expert knowledge base (EEKB) that structures a domain's information as a set of nodes and the relationships that exist between those nodes. The structure of this model is not the particularly novel…

  12. Subject-Matter Didactics as a Central Knowledge Base for Teachers, or Should It Be Called Pedagogical Content Knowledge?

    ERIC Educational Resources Information Center

    Kansanen, Pertti

    2009-01-01

    This paper compares the concepts of "subject-matter didactics" (Fachdidaktik) with "pedagogical content knowledge". The former is based on German didaktik and has a long tradition. The latter was introduced by Lee Shulman in the late 1980s and has no tradition in the same way as its German counterpart. Both of the concepts deal…

  13. Knowledge Based Components of Expertise in Medical Diagnosis.

    DTIC Science & Technology

    1981-09-01

the psychology of reasoning have been mirrored in artificial intelligence research which has shown an evolution from systems in which knowledge...that is, hypertrophies. An example protocol showing this argument applied to left atrial enlargement and prominent pulmonary vasculature (the x...art of artificial intelligence, Themes and case studies of knowledge engineering. Proceedings IJCAI-5, 1977, 1014-1029. Fikes, R. A heuristic pro m

  14. Impact of knowledge-based software engineering on aerospace systems

    NASA Technical Reports Server (NTRS)

    Peyton, Liem; Gersh, Mark A.; Swietek, Gregg

    1991-01-01

    The emergence of knowledge engineering as a software technology will dramatically alter the use of software by expanding application areas across a wide spectrum of industries. The engineering and management of large aerospace software systems could benefit from a knowledge engineering approach. An understanding of this technology can potentially make significant improvements to the current practice of software engineering, and provide new insights into future development and support practices.

  15. Gaining system design knowledge by systematic design space exploration with graph based design languages

    NASA Astrophysics Data System (ADS)

    Schmidt, Jens; Rudolph, Stephan

    2014-10-01

    The conceptual design phase in the design of complex systems such as satellite propulsion systems heavily relies on an exploration of the feasible design space. This exploration requires both: topological changes in the potential system architecture and consistent parametrical changes in the dimensioning of the existing system components. Since advanced engineering design techniques nowadays advocate a model-based systems engineering (MBSE) approach, graph-based design languages which embed a superset of MBSE-features are consequently used in this work to systematically explore the feasible design space. Design languages allow the design knowledge to be represented, modeled and executed using model-based transformations and combine this among other features with constraint processing techniques. The execution of the design language shown for the satellite propulsion systems in this work yields topologically varied designs (i.e. the selection of a monergol, a diergol or a coldgas system) with consistent parameters. Based on an a posteriori performance analysis of the automatically generated system designs, novel system knowledge (most notably in form of so-called "topology change points") can be gained and extracted from the original point cloud of numerical results.
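The exploration loop this abstract describes (vary the topology, vary the parameters, then inspect performance to find "topology change points") can be sketched in miniature. The system names come from the abstract; the performance model and numbers below are made-up stand-ins, not the paper's satellite models.

```python
from itertools import product

TOPOLOGIES = ["coldgas", "monergol", "diergol"]
COMPLEXITY = {"coldgas": 1.0, "monergol": 2.0, "diergol": 3.5}  # assumed penalty

def performance(topology, thrust):
    # Stand-in model: more complex topologies pay off only at higher thrust.
    return thrust * COMPLEXITY[topology] - 10 * COMPLEXITY[topology] ** 2

# Systematic exploration: every topology x every thrust level.
best_by_thrust = {}
for topo, thrust in product(TOPOLOGIES, range(10, 101, 10)):
    perf = performance(topo, thrust)
    if thrust not in best_by_thrust or perf > best_by_thrust[thrust][1]:
        best_by_thrust[thrust] = (topo, perf)

# A "topology change point" is where the best topology switches as the
# requirement (here, thrust) grows.
print({thrust: topo for thrust, (topo, _) in best_by_thrust.items()})
```

In the paper the topological and parametric variations are generated and kept consistent by the graph-based design language itself; the point of the sketch is only the a-posteriori analysis step, reading the change points off the cloud of generated designs.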

  16. Next Generation Bare Base Waste Processing System (Phase 1)

    DTIC Science & Technology

    1997-08-01

Municipal Solid Waste Generation in the United States in 1994 ... Table 5.3.1.2 Estimated Bare Base Solid Waste Generation...with municipal solid waste (MSW) generation rates reported in contemporary literature. Liquid waste stream estimates were made using generally...Therefore, information regarding municipal solid waste (MSW) generation in the United States was also used to derive estimates of the amount and nature of the

  17. BioGraph: unsupervised biomedical knowledge discovery via automated hypothesis generation

    PubMed Central

    2011-01-01

    We present BioGraph, a data integration and data mining platform for the exploration and discovery of biomedical information. The platform offers prioritizations of putative disease genes, supported by functional hypotheses. We show that BioGraph can retrospectively confirm recently discovered disease genes and identify potential susceptibility genes, outperforming existing technologies, without requiring prior domain knowledge. Additionally, BioGraph allows for generic biomedical applications beyond gene discovery. BioGraph is accessible at http://www.biograph.be. PMID:21696594

  18. Next generation protein based Streptococcus pneumoniae vaccines.

    PubMed

    Pichichero, Michael E; Khan, M Nadeem; Xu, Qingfu

    2016-01-01

All currently available Streptococcus pneumoniae (Spn) vaccines have limitations due to their capsular serotype composition. Both the 23-valent Spn polysaccharide vaccine (PPV) and the 7-, 10-, or 13-valent Spn conjugate vaccines (PCV-7, -10, -13) are serotype-based vaccines and therefore elicit only serotype-specific immunity. Emergence of replacement Spn strains expressing other serotypes has consistently occurred following introduction of capsular serotype-based Spn vaccines. Furthermore, capsular polysaccharide vaccines are less effective in protection against non-bacteremic pneumonia and acute otitis media (AOM) than against invasive pneumococcal disease (IPD). These shortcomings of capsular polysaccharide-based Spn vaccines have created high interest in development of non-serotype-specific protein-based vaccines that could be effective in preventing both IPD and non-IPD infections. This review discusses the progress to date on development of Spn protein vaccine candidates that are highly conserved across all Spn strains and exhibit maximal antigenicity and minimal reactogenicity, to replace or complement the current capsule-based vaccines. Key to development of a protein-based Spn vaccine is an understanding of Spn pathogenesis. Based on pathogenesis, a protein-based Spn vaccine should include one or more ingredients that reduce NP colonization below a pathogenic inoculum. Elimination of all Spn colonization may not be achievable or even advisable. The level of expression of a target protein antigen during pathogenesis is another key to the success of protein-based vaccines. As with virtually all currently licensed vaccines, production of a serum antibody response in response to protein-based vaccines is anticipated to provide protection from Spn infections. A significant advantage that protein vaccine formulations can offer over capsule-based vaccination is their potential benefits associated with natural priming and boosting to all strains of

  19. Next generation protein based Streptococcus pneumoniae vaccines

    PubMed Central

    Pichichero, Michael E; Khan, M Nadeem; Xu, Qingfu

    2016-01-01

All currently available Streptococcus pneumoniae (Spn) vaccines have limitations due to their capsular serotype composition. Both the 23-valent Spn polysaccharide vaccine (PPV) and the 7-, 10-, or 13-valent Spn conjugate vaccines (PCV-7, -10, -13) are serotype-based vaccines and therefore elicit only serotype-specific immunity. Emergence of replacement Spn strains expressing other serotypes has consistently occurred following introduction of capsular serotype-based Spn vaccines. Furthermore, capsular polysaccharide vaccines are less effective in protection against non-bacteremic pneumonia and acute otitis media (AOM) than against invasive pneumococcal disease (IPD). These shortcomings of capsular polysaccharide-based Spn vaccines have created high interest in development of non-serotype-specific protein-based vaccines that could be effective in preventing both IPD and non-IPD infections. This review discusses the progress to date on development of Spn protein vaccine candidates that are highly conserved across all Spn strains and exhibit maximal antigenicity and minimal reactogenicity, to replace or complement the current capsule-based vaccines. Key to development of a protein-based Spn vaccine is an understanding of Spn pathogenesis. Based on pathogenesis, a protein-based Spn vaccine should include one or more ingredients that reduce NP colonization below a pathogenic inoculum. Elimination of all Spn colonization may not be achievable or even advisable. The level of expression of a target protein antigen during pathogenesis is another key to the success of protein-based vaccines. As with virtually all currently licensed vaccines, production of a serum antibody response in response to protein-based vaccines is anticipated to provide protection from Spn infections. A significant advantage that protein vaccine formulations can offer over capsule-based vaccination is their potential benefits associated with natural priming and boosting to all strains of

  20. Meta-data based mediator generation

    SciTech Connect

    Critchlaw, T

    1998-06-28

    Mediators are a critical component of any data warehouse; they transform data from source formats to the warehouse representation while resolving semantic and syntactic conflicts. The close relationship between mediators and databases requires a mediator to be updated whenever an associated schema is modified. Failure to quickly perform these updates significantly reduces the reliability of the warehouse because queries do not have access to the most current data. This may result in incorrect or misleading responses, and reduce user confidence in the warehouse. Unfortunately, this maintenance may be a significant undertaking if a warehouse integrates several dynamic data sources. This paper describes a meta-data framework, and associated software, designed to automate a significant portion of the mediator generation task and thereby reduce the effort involved in adapting to schema changes. By allowing the DBA to concentrate on identifying the modifications at a high level, instead of reprogramming the mediator, turnaround time is reduced and warehouse reliability is improved.
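The core idea of the abstract (generate the mediator from declarative metadata so a schema change means editing a mapping, not reprogramming) can be sketched in a few lines. The field names and unit conversion below are hypothetical examples, not the paper's schemas.

```python
# Hypothetical mapping metadata: warehouse field -> (source field, conversion).
MAPPING = {
    "patient_id": ("PID", str),
    "weight_kg":  ("weight_lb", lambda lb: round(float(lb) * 0.453592, 2)),
}

def make_mediator(mapping):
    """Generate a record-translation function from mapping metadata.
    When a source schema changes, only the mapping is edited; the
    mediator is regenerated rather than reprogrammed."""
    def mediator(source_record):
        return {dst: conv(source_record[src])
                for dst, (src, conv) in mapping.items()}
    return mediator

mediator = make_mediator(MAPPING)
print(mediator({"PID": 42, "weight_lb": "150"}))
# -> {'patient_id': '42', 'weight_kg': 68.04}
```

Keeping the syntactic and semantic transformations in data rather than code is what lets the DBA "identify the modifications at a high level" and shortens the turnaround the abstract describes.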

  1. Optical generation of fuzzy-based rules.

    PubMed

    Gur, Eran; Mendlovic, David; Zalevsky, Zeev

    2002-08-10

    In the last third of the 20th century, fuzzy logic has risen from a mathematical concept to an applicable approach in soft computing. Today, fuzzy logic is used in control systems for various applications, such as washing machines, train-brake systems, automobile automatic gear, and so forth. The approach of optical implementation of fuzzy inferencing was given by the authors in previous papers, giving an extra emphasis to applications with two dominant inputs. In this paper the authors introduce a real-time optical rule generator for the dual-input fuzzy-inference engine. The paper briefly goes over the dual-input optical implementation of fuzzy-logic inferencing. Then, the concept of constructing a set of rules from given data is discussed. Next, the authors show ways to implement this procedure optically. The discussion is accompanied by an example that illustrates the transformation from raw data into fuzzy set rules.
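The paper implements rule generation optically; a plain software sketch of the same general idea (Wang-Mendel-style learning of dual-input IF-THEN rules from data pairs) is given below. The set labels, membership functions, and sample data are all illustrative assumptions.

```python
def tri(a, b, c):
    """Triangular membership function over [a, c] with peak at b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

SETS = {"low": tri(-0.5, 0.0, 0.5), "mid": tri(0.0, 0.5, 1.0), "high": tri(0.5, 1.0, 1.5)}

def best_label(x):
    """Label of the fuzzy set in which x has the highest membership."""
    return max(SETS, key=lambda name: SETS[name](x))

def generate_rules(data):
    """Turn (input1, input2, output) samples into IF-THEN rules, keeping
    the strongest rule per antecedent (the dual-input case)."""
    rules = {}
    for x1, x2, y in data:
        ante = (best_label(x1), best_label(x2))
        strength = SETS[ante[0]](x1) * SETS[ante[1]](x2)
        if ante not in rules or strength > rules[ante][1]:
            rules[ante] = (best_label(y), strength)
    return {ante: cons for ante, (cons, _) in rules.items()}

rules = generate_rules([(0.1, 0.9, 0.4), (0.9, 0.1, 0.6)])
print(rules)  # {('low', 'high'): 'mid', ('high', 'low'): 'mid'}
```

In the optical implementation this rule-construction step is performed in real time by the rule generator feeding the dual-input inference engine; the sketch only shows the data-to-rules transformation the abstract refers to.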

  2. Optical Generation of Fuzzy-Based Rules

    NASA Astrophysics Data System (ADS)

    Gur, Eran; Mendlovic, David; Zalevsky, Zeev

    2002-08-01

    In the last third of the 20th century, fuzzy logic has risen from a mathematical concept to an applicable approach in soft computing. Today, fuzzy logic is used in control systems for various applications, such as washing machines, train-brake systems, automatic automobile gearboxes, and so forth. The authors presented an optical implementation of fuzzy inferencing in previous papers, with particular emphasis on applications with two dominant inputs. In this paper the authors introduce a real-time optical rule generator for the dual-input fuzzy-inference engine. The paper briefly reviews the dual-input optical implementation of fuzzy-logic inferencing. Then, the concept of constructing a set of rules from given data is discussed. Next, the authors show ways to implement this procedure optically. The discussion is accompanied by an example that illustrates the transformation from raw data into fuzzy set rules.

  3. A combined park management framework based on regulatory and behavioral strategies: use of visitors' knowledge to assess effectiveness.

    PubMed

    Papageorgiou, K

    2001-07-01

    In light of the increasing mandate for greater efficiency in the conservation of natural reserves such as national parks, the present study suggests educational approaches as a tool to achieve conservation purposes. Currently, the management of human-wildlife interactions is dominated by regulatory strategies, but considerable potential exists for environmental education to enhance knowledge in the short run and to prompt attitude change in the long run. A framework for conservation based on both traditional regulatory and behavior-oriented strategies is proposed, whereby the level of knowledge that park visitors have acquired constitutes an obvious outcome and establishes a basis upon which the effectiveness of regulatory- and behavior-based regimes can be assessed. The perceptions regarding park-related issues of two distinct visitor groups (locals and nonlocals) are summarized from a survey undertaken in Vikos-Aoos national park. The findings suggest superficial familiarity with certain concepts but little profound understanding of their content, indicating that knowledge-raising efforts should go a long way towards establishing a positive attitude for the resource. Visitors' poor knowledge of the park's operating regulations contests the efficiency of the presently dominant regulatory management regime. While geographical distance did not appear to significantly differentiate knowledge between the two groups, wilderness experience (as evidenced by visits to other parks) proved to be an impetus for generating substantial learner interest in critical park issues among nonlocal visitors. School education and the media were found to be significant knowledge providers.

  4. ADAPT: A knowledge-based synthesis tool for digital signal processing system design

    SciTech Connect

    Cooley, E.S.

    1988-01-01

    A computer-aided synthesis tool for expansion, compression, and filtration of digital images is described. ADAPT, the Autonomous Digital Array Programming Tool, uses an extensive design knowledge base to synthesize a digital signal processing (DSP) system. Input to ADAPT can be either a behavioral description in English or a block-level specification via Petri Nets. The output from ADAPT comprises code to implement the DSP system on an array of processors. ADAPT is constructed using C, Prolog, and X Windows on a SUN 3/280 workstation. ADAPT knowledge encompasses DSP component information and the design algorithms and heuristics of a competent DSP designer. The knowledge is used to form queries for design capture, to generate design constraints from the user's responses, and to examine the design constraints. These constraints direct the search for possible DSP components and target architectures. Constraints are also used for partitioning the target systems into less complex subsystems. The subsystems correspond to architectural building blocks of the DSP design. These subsystems inherit design constraints and DSP characteristics from their parent blocks. Thus, a DSP subsystem or parent block, as designed by ADAPT, must meet the user's design constraints. Design solutions are sought by searching the Components section of the design knowledge base. Component behavior which matches or is similar to that required by the DSP subsystems is sought. Each match, which corresponds to a design alternative, is evaluated in terms of its behavior. When a design is sufficiently close to the behavior required by the user, detailed mathematical simulations may be performed to accurately determine exact behavior.

  5. RegenBase: a knowledge base of spinal cord injury biology for translational research

    PubMed Central

    Callahan, Alison; Abeyruwan, Saminda W.; Al-Ali, Hassan; Sakurai, Kunie; Ferguson, Adam R.; Popovich, Phillip G.; Shah, Nigam H.; Visser, Ubbo; Bixby, John L.; Lemmon, Vance P.

    2016-01-01

    Spinal cord injury (SCI) research is a data-rich field that aims to identify the biological mechanisms resulting in loss of function and mobility after SCI, as well as develop therapies that promote recovery after injury. SCI experimental methods, data and domain knowledge are locked in the largely unstructured text of scientific publications, making large scale integration with existing bioinformatics resources and subsequent analysis infeasible. The lack of standard reporting for experiment variables and results also makes experiment replicability a significant challenge. To address these challenges, we have developed RegenBase, a knowledge base of SCI biology. RegenBase integrates curated literature-sourced facts and experimental details, raw assay data profiling the effect of compounds on enzyme activity and cell growth, and structured SCI domain knowledge in the form of the first ontology for SCI, using Semantic Web representation languages and frameworks. RegenBase uses consistent identifier schemes and data representations that enable automated linking among RegenBase statements and also to other biological databases and electronic resources. By querying RegenBase, we have identified novel biological hypotheses linking the effects of perturbagens to observed behavioral outcomes after SCI. RegenBase is publicly available for browsing, querying and download. Database URL: http://regenbase.org PMID:27055827

  6. A knowledge-based design framework for airplane conceptual and preliminary design

    NASA Astrophysics Data System (ADS)

    Anemaat, Wilhelmus A. J.

    The goal of the work described herein is to develop the second generation of Advanced Aircraft Analysis (AAA) into an object-oriented structure which can be used in different environments. One such environment is the third generation of AAA with its own user interface; the other environment, with the same AAA methods (i.e., the knowledge), is the AAA-AML program. AAA-AML automates the initial airplane design process using current AAA methods in combination with AMRaven methodologies for dependency tracking and knowledge management, using the TechnoSoft Adaptive Modeling Language (AML). This leads to the following benefits: (1) Reduced design time: computer-aided design methods can reduce design and development time and replace tedious hand calculations. (2) Better product through improved design: more alternative designs can be evaluated in the same time span, which can lead to improved quality. (3) Reduced design cost: with less training and fewer calculation errors, substantial savings in design time and related cost can be obtained. (4) Improved efficiency: the design engineer can avoid technically correct but irrelevant calculations on incomplete or out-of-sync information, particularly if the process enables robust geometry earlier. Although numerous advancements in knowledge-based design have been developed for detailed design, no such integrated knowledge-based conceptual and preliminary airplane design system currently exists. The third-generation AAA methods have been tested over a ten-year period on many different airplane designs, demonstrating significant time savings. The AAA-AML system will be exercised and tested using 27 existing airplanes ranging from single-engine propeller aircraft, business jets, airliners, and UAVs to fighters. Data for the varied sizing methods will be compared with AAA results to validate these methods. One new design, a Light Sport Aircraft (LSA), will be developed as an exercise in using the tool to design a new airplane.

  7. The Reliability of Randomly Generated Math Curriculum-Based Measurements

    ERIC Educational Resources Information Center

    Strait, Gerald G.; Smith, Bradley H.; Pender, Carolyn; Malone, Patrick S.; Roberts, Jarod; Hall, John D.

    2015-01-01

    "Curriculum-Based Measurement" (CBM) is a direct method of academic assessment used to screen and evaluate students' skills and monitor their responses to academic instruction and intervention. Interventioncentral.org offers a math worksheet generator at no cost that creates randomly generated "math curriculum-based measures"…

  8. GARN: Sampling RNA 3D Structure Space with Game Theory and Knowledge-Based Scoring Strategies.

    PubMed

    Boudard, Mélanie; Bernauer, Julie; Barth, Dominique; Cohen, Johanne; Denise, Alain

    2015-01-01

    Cellular processes involve large numbers of RNA molecules. The functions of these RNA molecules and their binding to molecular machines are highly dependent on their 3D structures. One of the key challenges in RNA structure prediction and modeling is predicting the spatial arrangement of the various structural elements of RNA. As RNA folding is generally hierarchical, methods involving coarse-grained models hold great promise for this purpose. We present here a novel coarse-grained method for sampling, based on game theory and knowledge-based potentials. This strategy, GARN (Game Algorithm for RNa sampling), is often much faster than previously described techniques and generates large sets of solutions closely resembling the native structure. GARN is thus a suitable starting point for the molecular modeling of large RNAs, particularly those with experimental constraints. GARN is available from: http://garn.lri.fr/.

  9. Knowledge-based approach to de novo design using reaction vectors.

    PubMed

    Patel, Hina; Bodkin, Michael J; Chen, Beining; Gillet, Valerie J

    2009-05-01

    A knowledge-based approach to the de novo design of synthetically feasible molecules is described. The method is based on reaction vectors which represent the structural changes that take place at the reaction center along with the environment in which the reaction occurs. The reaction vectors are derived automatically from a database of reactions which is not restricted by size or reaction complexity. A structure generation algorithm has been developed whereby reaction vectors can be applied to previously unseen starting materials in order to suggest novel syntheses. The approach has been implemented in KNIME and is validated by reproducing known synthetic routes. We then present applications of the method in different drug design scenarios including lead optimization and library enumeration. The method offers great potential for capturing and using the growing body of data on reactions that is becoming available through electronic laboratory notebooks.
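
    The reaction-vector idea admits a toy software illustration: a vector records which fragments the reaction removes at the reaction center and which it adds, and applying it to an unseen starting material succeeds only when the removed fragments are present. The string "fragments" below are invented stand-ins for the paper's structural descriptors.

```python
from collections import Counter

def reaction_vector(reactant, product):
    """Toy reaction vector: the fragments added and the fragments removed,
    stored as two multisets (Counter subtraction drops non-positive counts)."""
    r, p = Counter(reactant), Counter(product)
    return p - r, r - p   # (added, removed)

def apply_vector(structure, vector):
    """Apply a vector to an unseen starting material; returns the product's
    fragment multiset, or None if the required fragments are absent."""
    added, removed = vector
    s = Counter(structure)
    if any(s[frag] < n for frag, n in removed.items()):
        return None  # reaction not applicable to this structure
    return s - removed + added

# Derive a vector from one example reaction (acid group becomes ester group),
# then apply it to a previously unseen structure.
vec = reaction_vector(["COOH", "ring"], ["COOR", "ring"])
product = apply_vector(["COOH", "CH3", "ring"], vec)
# product holds one each of COOR, CH3, and ring
```

    The environment around the reaction center, which the real method also encodes, would enter this sketch as additional context fragments that must match before the vector may fire.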

  10. Knowledge-based deformable surface model with application to segmentation of brain structures in MRI

    NASA Astrophysics Data System (ADS)

    Ghanei, Amir; Soltanian-Zadeh, Hamid; Elisevich, Kost; Fessler, Jeffrey A.

    2001-07-01

    We have developed a knowledge-based deformable surface for segmentation of medical images. This work has been done in the context of segmentation of the hippocampus from brain MRI, due to its challenge and clinical importance. The model has a polyhedral discrete structure and is initialized automatically by analyzing brain MRI slice by slice and finding a few landmark features in each slice using an expert system. The expert system decides on the presence of the hippocampus and its general location in each slice. The landmarks found are connected together by a triangulation method to generate a closed initial surface. The surface then deforms under defined internal and external force terms to generate an accurate and reproducible boundary for the hippocampus. The anterior and posterior (AP) limits of the hippocampus are estimated by automatic analysis of the location of the brain stem and some of the features extracted in the initialization process. These data are combined with a priori knowledge using Bayes' method to estimate a probability density function (pdf) for the length of the structure in the sagittal direction. The hippocampus AP limits are found by optimizing this pdf. The model is tested on real clinical data and the results show very good model performance.
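
    The final estimation step, combining image-derived evidence with a priori knowledge into a length pdf and then optimizing it, can be sketched as a one-dimensional MAP estimate. The Gaussian forms and all numbers below are illustrative assumptions, not values from the paper.

```python
import math

def gaussian(x, mu, sigma):
    """Normal density, used here for both the prior and the likelihood."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def map_length(prior_mu, prior_sigma, meas, meas_sigma, grid):
    """Posterior over structure length is proportional to
    prior(L) * likelihood(measurement | L); maximize it on a grid."""
    return max(grid, key=lambda L: gaussian(L, prior_mu, prior_sigma)
                                   * gaussian(meas, L, meas_sigma))

grid = [l / 10 for l in range(200, 501)]       # candidate lengths in mm
est = map_length(40.0, 4.0, 35.0, 3.0, grid)   # prior 40 +/- 4, image evidence 35 +/- 3
# est == 36.8, the precision-weighted compromise between prior and evidence
```

    For Gaussian prior and likelihood the optimum is simply the precision-weighted mean, (40/16 + 35/9) / (1/16 + 1/9) = 36.8, so the grid search mainly illustrates how an arbitrary, non-Gaussian pdf built from expert-system features would be optimized.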

  11. Generation Of Manufacturing Routing And Operations Using Structured Knowledge As Basis To Application Of Computer Aided In Process Planning

    NASA Astrophysics Data System (ADS)

    Oswaldo, Luiz Agostinho

    2011-01-01

    The development of computer-aided resources for automating the generation of manufacturing routings and operations has mainly been accomplished by searching for similarities between existing ones, resulting in standard process routings grouped by analysis of similarities between parts or routings. This article proposes developing manufacturing routings and detailing operations with a methodology whose steps define the initial, intermediate, and final operations, starting from the rough piece and proceeding to the final specifications, which must have a one-to-one relationship with the part design specifications. Each step uses so-called rules of precedence to link and chain the routing operations. The rules of precedence order and prioritize the knowledge of various manufacturing processes, taking into account the theories of machining, forging, assembly, and heat treatment; they also draw on the theories of tolerance accumulation and process capability, among others. The availability of manufacturing databases covering process tolerances; deviations of the machine tool, cutting tool, fixturing devices, and workpiece; and process capabilities is also reinforced. Stating and applying rules of precedence that link and join manufacturing concepts in a logical and structured way, and applying them in the methodology's steps, makes it viable to use structured knowledge, instead of the tacit knowledge currently available in manufacturing engineering departments, to generate manufacturing routings and operations. Consequently, the development of computer-aided process planning is facilitated by the structured knowledge applied with this methodology.

  12. Analysis of a Knowledge-Management-Based Process of Transferring Project Management Skills

    ERIC Educational Resources Information Center

    Ioi, Toshihiro; Ono, Masakazu; Ishii, Kota; Kato, Kazuhiko

    2012-01-01

    Purpose: The purpose of this paper is to propose a method for the transfer of knowledge and skills in project management (PM) based on techniques in knowledge management (KM). Design/methodology/approach: The literature contains studies on methods to extract experiential knowledge in PM, but few studies exist that focus on methods to convert…

  13. Portal-based knowledge environment for collaborative science.

    SciTech Connect

    Schuchardt, K.; Pancerella, C.; Rahn, L. A.; Didier, B.; Kodeboyina, D.; Leahy, D.; Myers, J. D.; Oluwole, O. O.; Pitz, W.; Ruscic, B.; Song, J.; von Laszewski, G.; Yang, C.; PNNL; SNL; Stanford Univ.; National Center for Supercomputing Applications; MIT; LLNL

    2007-08-25

    The Knowledge Environment for Collaborative Science (KnECS) is an open-source informatics toolkit designed to enable knowledge Grids that interconnect science communities, unique facilities, data, and tools. KnECS features a Web portal with team and data collaboration tools, lightweight federation of data, provenance tracking, and multi-level support for application integration. We identify the capabilities of KnECS and discuss extensions from the Collaboratory for Multi-Scale Chemical Sciences (CMCS) which enable diverse combustion science communities to create and share verified, documented data sets and reference data, thereby demonstrating new methods of community interaction and data interoperability required by systems science approaches. Finally, we summarize the challenges we encountered and foresee for knowledge environments.

  14. Model-based reasoning for system and software engineering: The Knowledge From Pictures (KFP) environment

    NASA Technical Reports Server (NTRS)

    Bailin, Sydney; Paterra, Frank; Henderson, Scott; Truszkowski, Walt

    1993-01-01

    This paper presents a discussion of current work in the area of graphical modeling and model-based reasoning being undertaken by the Automation Technology Section, Code 522.3, at Goddard. The work was initially motivated by the growing realization that the knowledge acquisition process was a major bottleneck in the generation of fault detection, isolation, and repair (FDIR) systems for application in automated Mission Operations. As with most research activities, this work started out with a simple objective: to develop a proof-of-concept system demonstrating that a draft rule base for an FDIR system could be automatically realized by reasoning from a graphical representation of the system to be monitored. This work was called Knowledge From Pictures (KFP) (Truszkowski et al. 1992). As the work has successfully progressed, the KFP tool has become an environment populated by a set of tools that support a more comprehensive approach to model-based reasoning. The paper gives an overview of the graphical modeling objectives of the work, describes the three tools that now populate the KFP environment, briefly discusses related work in the field, and indicates future directions for the KFP environment.

  15. Generating Vocabulary Knowledge for At-Risk Middle School Readers: Contrasting Program Effects and Growth Trajectories

    ERIC Educational Resources Information Center

    Lawrence, Joshua F.; Rolland, Rebecca Givens; Branum-Martin, Lee; Snow, Catherine E.

    2014-01-01

    We tested whether urban middle-school students from mostly low-income homes had improved academic vocabulary when they participated in a freely available vocabulary program, Word Generation (WG). To understand how this program may support students at risk for long-term reading difficulty, we examined treatment interactions with baseline…

  16. Writing and Reading Knowledge of Spanish/English Second-Generation Bilinguals

    ERIC Educational Resources Information Center

    Ardila, Alfredo; Garcia, Krystal; Garcia, Melissa; Mejia, Joselyn; Vado, Grace

    2017-01-01

    Written bilingualism represents a particular type of bilingualism that is not frequently approached. The aim of this study was to investigate the writing and reading abilities of second-generation immigrants, Spanish-English bilinguals in South Florida. 58 participants (36 females, 22 males; 18-39 years of age) were selected. Both parents were…

  17. Designing a Knowledge Representation Approach for the Generation of Pedagogical Interventions by MTTs

    ERIC Educational Resources Information Center

    Paquette, Luc; Lebeau, Jean-François; Beaulieu, Gabriel; Mayers, André

    2015-01-01

    Model-tracing tutors (MTTs) have proven effective for the tutoring of well-defined tasks, but the pedagogical interventions they produce are limited and usually require the inclusion of pedagogical content, such as text message templates, in the model of the task. The capability to generate pedagogical content would be beneficial to MTT…

  18. Realizing Relevance: The Influence of Domain-Specific Information on Generation of New Knowledge Through Integration in 4- to 8-Year-Old Children.

    PubMed

    Bauer, Patricia J; Larkina, Marina

    2017-01-01

    In accumulating knowledge, direct modes of learning are complemented by productive processes, including self-generation based on integration of separate episodes. Effects of the number of potentially relevant episodes on integration were examined in 4- to 8-year-olds (N = 121; racially/ethnically heterogeneous sample, English speakers, from large metropolitan area). Information was presented along with unrelated or related episodes; the latter challenged children to identify the relevant subset of episodes for integration. In Experiment 1, 4- and 6-year-olds integrated in the unrelated context. Six-year-olds also succeeded in the related context in forced-choice testing. In Experiment 2, 8-year-olds succeeded in open-ended and forced-choice testing. Results illustrate a developmental progression in productive extension of knowledge due in part to age-related increases in identification of relevant information.

  19. Generative Models for Similarity-based Classification

    DTIC Science & Technology

    2007-01-01

    problem of estimating the class-conditional similarity probability models is solved by applying the maximum entropy principle, under the constraint that...model. The SDA class-conditional probability models have exponential form, because they are derived as the maximum entropy distributions subject to...exist because the constraints are based on the data. As prescribed by Jaynes' principle of maximum entropy [34], a unique class-conditional joint

  20. A Map-Based Service Supporting Different Types of Geographic Knowledge for the Public

    PubMed Central

    Zhou, Mengjie; Wang, Rui; Tian, Jing; Ye, Ning; Mai, Shumin

    2016-01-01

    The internet enables the rapid and easy creation, storage, and transfer of knowledge; however, services that transfer geographic knowledge and facilitate the public understanding of geographic knowledge are still underdeveloped to date. Existing online maps (or atlases) can support limited types of geographic knowledge. In this study, we propose a framework for map-based services to represent and transfer different types of geographic knowledge to the public. A map-based service provides tools to ensure the effective transfer of geographic knowledge. We discuss the types of geographic knowledge that should be represented and transferred to the public, and we propose guidelines and a method to represent various types of knowledge through a map-based service. To facilitate the effective transfer of geographic knowledge, tools such as auxiliary background knowledge and auxiliary map-reading tools are provided through interactions with maps. An experiment conducted to illustrate our idea and to evaluate the usefulness of the map-based service is described; the results demonstrate that the map-based service is useful for transferring different types of geographic knowledge. PMID:27045314

  1. A Map-Based Service Supporting Different Types of Geographic Knowledge for the Public.

    PubMed

    Zhou, Mengjie; Wang, Rui; Tian, Jing; Ye, Ning; Mai, Shumin

    2016-01-01

    The internet enables the rapid and easy creation, storage, and transfer of knowledge; however, services that transfer geographic knowledge and facilitate the public understanding of geographic knowledge are still underdeveloped to date. Existing online maps (or atlases) can support limited types of geographic knowledge. In this study, we propose a framework for map-based services to represent and transfer different types of geographic knowledge to the public. A map-based service provides tools to ensure the effective transfer of geographic knowledge. We discuss the types of geographic knowledge that should be represented and transferred to the public, and we propose guidelines and a method to represent various types of knowledge through a map-based service. To facilitate the effective transfer of geographic knowledge, tools such as auxiliary background knowledge and auxiliary map-reading tools are provided through interactions with maps. An experiment conducted to illustrate our idea and to evaluate the usefulness of the map-based service is described; the results demonstrate that the map-based service is useful for transferring different types of geographic knowledge.

  2. Mechanically based generative laws of morphogenesis

    NASA Astrophysics Data System (ADS)

    Beloussov, Lev V.

    2008-03-01

    A deep (although at first glance naïve) question which may be addressed to embryonic development is why quite definite and accurately reproduced successions of precise and complicated shapes take place during this process, or why, in several cases, the result of development is highly precise in spite of an extensive variability of intermediate stages. This problem can be attacked in two different ways. One of them, up to now only slightly employed, is to formulate robust macroscopic generative laws from which the observed successions of shapes could be derived. The other, which dominates in modern embryology, regards development as a succession of highly precise 'micropatterns', each of them arising due to the action of specific factors having, as a rule, nothing in common with each other. We argue that the latter view contradicts a great bulk of firmly established data and gives no satisfactory answers to the main problems of development. Therefore we intend to follow the first way. In doing this, we regard developing embryos as self-organized systems transpierced by feedbacks, among which we pay special attention to those linked with mechanical stresses (MS). We formulate a hypothesis of so-called MS hyper-restoration as a common basis for the developmentally important feedback loops. We present a number of examples confirming this hypothesis and use it for reconstructing prolonged chains of developmental events. Finally, we discuss the application of the same set of assumptions to the first steps of egg development and to the internal differentiation of embryonic cells.

  3. Working Knowledge and Work-Based Learning: Research Implications. Working Paper.

    ERIC Educational Resources Information Center

    McIntyre, John

    The research implications of the concepts of working knowledge and work-based learning were examined. A research agenda for work-based learning arising from the analysis of "working knowledge" was presented. The agenda listed questions pertaining to the following areas: (1) the conditions bringing about work-based learning; (2) the…

  4. Investigating Knowledge Integration in Web-Based Thematic Learning Using Concept Mapping Assessment

    ERIC Educational Resources Information Center

    Liu, Ming-Chou; Wang, Jhen-Yu

    2010-01-01

    Theme-based learning (TBL) refers to learning modes which adopt the following sequence: (a) finding the theme; (b) finding a focus of interest based on the theme; (c) finding materials based on the focus of interest; (d) integrating the materials to establish shared knowledge; (e) publishing and sharing the integrated knowledge. We have created an…

  5. Knowledge and Use of Intervention Practices by Community-Based Early Intervention Service Providers

    ERIC Educational Resources Information Center

    Paynter, Jessica M.; Keen, Deb

    2015-01-01

    This study investigated staff attitudes, knowledge and use of evidence-based practices (EBP) and links to organisational culture in a community-based autism early intervention service. An EBP questionnaire was completed by 99 metropolitan and regionally-based professional and paraprofessional staff. Participants reported greater knowledge and use…

  6. Ternary jitter-based true random number generator

    NASA Astrophysics Data System (ADS)

    Latypov, Rustam; Stolov, Evgeni

    2017-01-01

    In this paper a novel family of generators producing true uniform random numbers in ternary logic is presented. The generator consists of a number of identical ternary-logic combinational units connected into a ring. Each unit has a random delay time, assumed to follow an exponential distribution, and all delays are assumed to be independent. The theory of the generator is based on Erlang equations. The generator can be used for test production in various systems. Features of multidimensional random vectors produced by the generator are discussed.
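
    The ring construction can be imitated with a small discrete-event simulation. The "add one modulo 3" unit, all parameters, and the sampling procedure below are assumptions chosen for illustration; the paper's actual circuit and its Erlang-equation analysis are not reproduced here.

```python
import heapq
import random

def sample_trit(n_units=3, t_sample=30.0, rate=1.0, seed=0):
    """Simulate a ring of identical ternary units, each recomputing its
    output from its predecessor after an independent exponentially
    distributed delay, then sample one output at time t_sample."""
    rng = random.Random(seed)
    state = [0] * n_units                      # ternary outputs, all 0 at start
    events = [(rng.expovariate(rate), i) for i in range(n_units)]
    heapq.heapify(events)
    while events[0][0] <= t_sample:
        t, i = heapq.heappop(events)
        state[i] = (state[i - 1] + 1) % 3      # assumed 'increment mod 3' unit
        heapq.heappush(events, (t + rng.expovariate(rate), i))
    return state[0]

# Tally many independent runs; the three trit values should come out
# roughly equally often.
counts = [0, 0, 0]
for s in range(1000):
    counts[sample_trit(seed=s)] += 1
```

    The ring's rotational symmetry is what evens out the sampled value: no unit, and hence no trit value, is privileged once the exponential jitter has decorrelated the ring from its initial state.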

  7. CKB - the compound knowledge base: a text based chemical search system.

    PubMed

    Walker, Matthew J; Hull, Richard D; Singh, Suresh B

    2002-01-01

    The Compound Knowledge Base (CKB) was developed as a means of locating structures and additional relevant information from a given known structural identifier. Any of the Chemical Abstracts Service Registry Number, company code (the internal code number by which the producing company refers to the chemical entity), generic name (trivial or class name), or trade name (the name under which the compound is marketed) can be provided as a query. CKB will return the remaining available information, as well as the corresponding structure, for any matching compound in the database. The interface to the Compound Knowledge Base is Internet/World Wide Web-based, using Netscape Navigator and the ChemDraw Pro Plugin, which allows Merck scientists quick and easy access to the database from their desktops. The design and implementation of the database and the search interface are detailed herein.
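
    The any-identifier lookup CKB describes can be sketched as a single index keyed by every identifier of every compound. The records, field names, and company codes below are illustrative, not CKB's actual schema.

```python
# Toy records: each compound row carries several identifier types.
RECORDS = [
    {"cas": "50-78-2", "code": "MK-0001", "generic": "aspirin", "trade": "Aspro"},
    {"cas": "59-43-8", "code": "MK-0002", "generic": "thiamine", "trade": "Betaxin"},
]

def build_index(records):
    """Index every identifier (CAS number, company code, generic name, and
    trade name) so that a single query string resolves to the full record."""
    index = {}
    for rec in records:
        for value in rec.values():
            index[value.lower()] = rec
    return index

INDEX = build_index(RECORDS)

def lookup(query):
    """Case-insensitive lookup by any known identifier; None if no match."""
    return INDEX.get(query.lower())

rec = lookup("Aspirin")
# rec["cas"] == "50-78-2"
```

    Folding all identifier types into one case-insensitive dictionary is what lets a single query box accept a CAS number, company code, generic name, or trade name interchangeably.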

  8. Of Tacit Knowledge, Texts and Thing-based Learning (TBL)

    ERIC Educational Resources Information Center

    Rangachari, P. K.

    2008-01-01

    Practical knowledge has two dimensions--a visible, codified component that resembles the tip of an iceberg. The larger but crucial tacit component which lies submerged consists of values, procedures and tricks of the trade and cannot be easily documented or codified. Undergraduate science students were given an opportunity to explore this…

  9. Towards a knowledge-based correction of iron chlorosis.

    PubMed

    Abadía, Javier; Vázquez, Saúl; Rellán-Álvarez, Rubén; El-Jendoubi, Hamdi; Abadía, Anunciación; Alvarez-Fernández, Ana; López-Millán, Ana Flor

    2011-05-01

    Iron (Fe) deficiency-induced chlorosis is a major nutritional disorder in crops growing in calcareous soils. Iron deficiency in fruit tree crops causes chlorosis, decreases in vegetative growth and marked fruit yield and quality losses. Therefore, Fe fertilizers, either applied to the soil or delivered to the foliage, are used every year to control Fe deficiency in these crops. On the other hand, a substantial body of knowledge is available on the fundamentals of Fe uptake, long and short distance Fe transport and subcellular Fe allocation in plants. Most of this basic knowledge, however, applies only to Fe deficiency, with studies involving Fe fertilization (i.e., with Fe-deficient plants resupplied with Fe) being still scarce. This paper reviews recent developments in Fe-fertilizer research and the state of the art of knowledge on Fe acquisition, transport and utilization in plants. Also, the effects of Fe fertilization on the plant responses to Fe deficiency are reviewed. Agronomical Fe-fertilization practices should benefit from the basic knowledge on plant Fe homeostasis already available; this should be considered as a long-term goal that can optimize fertilizer inputs, reduce growers' costs and minimize the environmental impact of fertilization.

  10. Knowledge Base of Pronunciation Teaching: Staking out the Territory

    ERIC Educational Resources Information Center

    Baker, Amanda; Murphy, John

    2011-01-01

    Despite decades of advocacy for greater investigative attention, research into pronunciation instruction in the teaching of English as a second language (ESL) and English as a foreign language (EFL) continues to be limited. This limitation is particularly evident in explorations of teacher cognition (e.g., teachers' knowledge, beliefs, and…

  11. Technological Discourse on CAS-Based Operative Knowledge

    ERIC Educational Resources Information Center

    Mann, Giora; Dana-Picard, Thierry; Zehavi, Nurit

    2007-01-01

    This article begins with a comparison of two groups of teachers, working on the same tasks in Analytic Geometry. One group has only basic experience in CAS-assisted problem solving, and the other group has extensive experience. The comparison is discussed in terms of the interplay between reflection, operative knowledge and execution. The findings…

  12. Portal-based Knowledge Environment for Collaborative Science

    SciTech Connect

    Schuchardt, Karen L.; Pancerella, Carmen M.; Rahn, Larry; Didier, Brett T.; Kodeboyina, Deepti; Leahy, David; Myers, James D.; Oluwole, O.; Pitz, William; Ruscic, Branko; Song, Jing; Laszewski, Gregor V.; Yang, Christine

    2007-08-25

The Knowledge Environment for Collaborative Science (KnECS) is an open source portal that integrates collaboration tools, data and metadata management, and scientific applications. We describe KnECS features, numerous science applications and their requirements, the benefits derived, and the integration approaches. Finally, we discuss issues and challenges for the future.

  13. Requirements-Based Knowledge Discovery for Technology Management

    DTIC Science & Technology

    2002-01-01

Cunningham, S.W., Carlisle, J., and Nayak, A., "A Process for Mining Science & Technology Documents Databases, Illustrated for the Case of 'Knowledge Discovery and Data Mining'," Ciencia da Informacao 28(1), p. 1-8, 1999.

  14. Drug Education Based on a Knowledge, Attitude, and Experience Study

    ERIC Educational Resources Information Center

    Grant, John A.

    1971-01-01

Results of a questionnaire concerning factual knowledge of, attitudes toward, and experience with a variety of drugs are reported. It was concluded that marihuana and other drugs are readily available to secondary school students, and widespread experimentation exists; however, a strict dichotomy exists between marihuana and other drugs. (Author/BY)

  15. Gate-based decomposition of index generation functions

    NASA Astrophysics Data System (ADS)

    Łuba, Tadeusz; Borowik, Grzegorz; Jankowski, Cezary

    2016-09-01

Index Generation Functions may be useful in the distribution of IP addresses, virus scanning, or the detection of undesired data. The traditional approach leads to a decomposition based on universal cells. In this paper, an original method is proposed: a multilevel logic synthesis method that uses functional decomposition with gates instead of cells. It preserves the advantages of functional decomposition and is well suited for ROM-based synthesis of Index Generation Functions.
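As a reading aid (not the paper's synthesis method), the object being decomposed can be modelled as a lookup that maps a small set of registered bit-vectors, such as suspicious patterns or IP prefixes, to indices 1..k, and every other input to 0. The registered vectors below are invented for illustration:

```python
# An index generation function over 6-bit inputs: k = 3 registered
# vectors get distinct indices; all other inputs map to 0.
registered = {
    0b101101: 1,
    0b110010: 2,
    0b011111: 3,
}

def index_generation(x: int) -> int:
    """Return the index of a registered vector, or 0 if unregistered."""
    return registered.get(x, 0)

print(index_generation(0b110010))  # 2
print(index_generation(0b000000))  # 0
```

The synthesis problem the paper addresses is implementing exactly this mapping in hardware (gates or ROMs) far more compactly than a full 2^n-entry table.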

  16. Diagnosis by integrating model-based reasoning with knowledge-based reasoning

    NASA Technical Reports Server (NTRS)

    Bylander, Tom

    1988-01-01

    Our research investigates how observations can be categorized by integrating a qualitative physical model with experiential knowledge. Our domain is diagnosis of pathologic gait in humans, in which the observations are the gait motions, muscle activity during gait, and physical exam data, and the diagnostic hypotheses are the potential muscle weaknesses, muscle mistimings, and joint restrictions. Patients with underlying neurological disorders typically have several malfunctions. Among the problems that need to be faced are: the ambiguity of the observations, the ambiguity of the qualitative physical model, correspondence of the observations and hypotheses to the qualitative physical model, the inherent uncertainty of experiential knowledge, and the combinatorics involved in forming composite hypotheses. Our system divides the work so that the knowledge-based reasoning suggests which hypotheses appear more likely than others, the qualitative physical model is used to determine which hypotheses explain which observations, and another process combines these functionalities to construct a composite hypothesis based on explanatory power and plausibility. We speculate that the reasoning architecture of our system is generally applicable to complex domains in which a less-than-perfect physical model and less-than-perfect experiential knowledge need to be combined to perform diagnosis.
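The division of labour described here, with experiential knowledge ranking hypotheses by plausibility and the physical model saying which observations each hypothesis explains, can be sketched as a greedy abductive loop. The hypotheses, plausibility scores, and observations below are invented toy data, not from the paper:

```python
# hypothesis -> (plausibility from experiential knowledge,
#                observations the qualitative model says it explains)
hypotheses = {
    "weak_soleus":       (0.9, {"excess_ankle_dorsiflexion", "crouch_gait"}),
    "hip_restriction":   (0.6, {"reduced_hip_extension"}),
    "mistimed_tibialis": (0.4, {"foot_slap", "crouch_gait"}),
}
observations = {"excess_ankle_dorsiflexion", "crouch_gait",
                "reduced_hip_extension", "foot_slap"}

# Greedily assemble a composite hypothesis: repeatedly add the most
# plausible hypothesis that still explains something unexplained.
composite = []
unexplained = set(observations)
while unexplained:
    best = max((h for h, (_, e) in hypotheses.items() if e & unexplained),
               key=lambda h: hypotheses[h][0], default=None)
    if best is None:
        break  # some observations cannot be explained
    composite.append(best)
    unexplained -= hypotheses[best][1]

print(composite)  # ['weak_soleus', 'hip_restriction', 'mistimed_tibialis']
```

A real system would also score competing composites by explanatory power, as the abstract notes, rather than committing to a single greedy pass.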

  17. Construction of dynamic stochastic simulation models using knowledge-based techniques

    NASA Technical Reports Server (NTRS)

    Williams, M. Douglas; Shiva, Sajjan G.

    1990-01-01

    Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).

  18. An American knowledge base in England - Alternate implementations of an expert system flight status monitor

    NASA Technical Reports Server (NTRS)

    Butler, G. F.; Graves, A. T.; Disbrow, J. D.; Duke, E. L.

    1989-01-01

A joint activity between the Dryden Flight Research Facility of the NASA Ames Research Center (Ames-Dryden) and the Royal Aerospace Establishment (RAE) on knowledge-based systems has been agreed. Under the agreement, a flight status monitor knowledge base developed at Ames-Dryden has been implemented using the real-time AI (artificial intelligence) toolkit MUSE, which was developed in the UK. Here, the background to the cooperation is described and the details of the flight status monitor and a prototype MUSE implementation are presented. It is noted that the capabilities of the expert-system flight status monitor to monitor data downlinked from the flight test aircraft and to generate information on the state and health of the system for the test engineers provide increased safety during flight testing of new systems. Furthermore, the expert-system flight status monitor provides the systems engineers with ready access to the large amount of information required to describe a complex aircraft system.

  19. An application of knowledge-based systems to satellite control

    NASA Astrophysics Data System (ADS)

    Skiffington, B.; Carrig, J.; Kornell, J.

    This paper describes an expert system prototype which approaches some issues of satellite command and control. The task of the prototype system is to assist a spacecraft controller in maneuvering a geosynchronous satellite for the purpose of maintaining an accurate spacecraft pointing angle, i.e., station keeping. From an expert system's point of view, two features of the system are notable. First, a tool for automated knowledge acquisition was employed. Because the domain experts were in Maryland while the AI experts were in California, a means to automate knowledge acquisition was required. Second, the system involves a blend of simulation and expert systems technology distributed between a DEC VAX computer and a LISP machine (a special purpose AI computer). This kind of distribution is a plausible model for potential real-world installations.

  20. The Aries Project for Rule-Based-Design Knowledge Acquisition,

    DTIC Science & Technology

    1995-06-01

because of the project’s association with the RAMS. The ram is the symbol of the astrological sign of Aries. Design rules—those that every...knowledge acquisition technique for future rule capture projects for other design areas; and devise training material. The Participating Companies...in the process. It was determined that design rules would be categorized by Material, Analog, Digital, Global, and Part Selection, while design

  1. Design and Diagnosis Problem Solving with Multifunctional Technical Knowledge Bases

    DTIC Science & Technology

    1992-09-29

Causal ordering theory (Iwasaki & Simon 86) is used to reveal causal dependencies among variables in a set of equations and produce a graph...Knowledge in Device Design...QP is More Than SPQR and Dynamical Systems Theory: Response to...The concept of abduction has been used to frame the problem of diagnosis, scientific theory formation, natural language understanding, and is a more

  2. Designing a Strategic Plan through an Emerging Knowledge Generation Process: The ATM Experience

    ERIC Educational Resources Information Center

    Zanotti, Francesco

    2012-01-01

    Purpose: The aim of this contribution is to describe a new methodology for designing strategic plans and how it was implemented by ATM, a public transportation agency based in Milan, Italy. Design/methodology/approach: This methodology is founded on a new system theory, called "quantum systemics". It is based on models and metaphors both…

  3. [Design of Electrocardiogram Signal Generator Based on Typical Electrocardiogram Database].

    PubMed

    Wang, Yuting; Wang, Xiaofei; Li, Dongshang; Liu, Guili

    2016-02-01

Using LabVIEW programming and the high-speed multifunction data acquisition card PCI-6251, we designed an electrocardiogram (ECG) signal generator based on a Chinese typical ECG database. While the generator outputs ECG signals, it can simultaneously display the corresponding ECG annotations, including waveform data and diagnostic results. It could be a useful assisting tool for automatic ECG diagnosis instruments.

  4. Weather and event generators based on analogues of atmospheric circulation

    NASA Astrophysics Data System (ADS)

    Yiou, Pascal

    2015-04-01

Analogues of atmospheric circulation have had numerous applications in weather prediction, climate reconstructions and detection/attribution analyses. A stochastic weather generator based on circulation analogues was recently proposed by Yiou (2014) to simulate sequences of European temperatures. One of the features of this weather generator is that it preserves the spatial and temporal structures of the climate variables to be simulated. This method is flexible enough to be combined efficiently with a storm detection algorithm in order to generate large catalogues of high-impact extra-tropical storms that hit Europe. I will present the gist of the method of circulation analogues and some performance results. Two promising applications for weather generators based on this method (ensemble climate prediction and extra-tropical storms) will be tested. References Yiou, P.: AnaWEGE: a weather generator based on analogues of atmospheric circulation, Geosci. Model Dev., 7, 531-543, doi:10.5194/gmd-7-531-2014, 2014.
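The core analogue-resampling idea (find the archived days whose circulation most resembles today's, pick one at random, and step to its following day so that day-to-day structure is preserved) can be sketched as follows. The toy archive, feature dimension, and Euclidean distance are assumptions for illustration, not AnaWEGE's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy archive: 1000 days, circulation summarized by 5 features per day,
# plus an observed temperature for each day.
circulation = rng.normal(size=(1000, 5))
temperature = rng.normal(loc=10.0, scale=5.0, size=1000)

def simulate(n_days: int, k: int = 10) -> np.ndarray:
    """Simulate a temperature sequence by analogue resampling."""
    day = int(rng.integers(0, len(circulation) - 1))  # random start day
    out = np.empty(n_days)
    for t in range(n_days):
        out[t] = temperature[day]
        # Distance from today's circulation to every archived day
        d = np.linalg.norm(circulation - circulation[day], axis=1)
        d[day] = np.inf                       # exclude the day itself
        candidates = np.argsort(d)[:k]        # k closest analogues
        # Step to the day *after* a randomly chosen analogue, which
        # preserves temporal persistence in the simulated series
        day = min(int(rng.choice(candidates)) + 1, len(circulation) - 1)
    return out

series = simulate(30)
```

With real reanalysis fields, the distance would be computed on circulation maps (e.g. sea-level pressure anomalies) and the resampled variable could be a whole spatial temperature field, which is how the method preserves spatial structure.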

  5. A JAVA implementation of a medical knowledge base for decision support.

    PubMed

    Ambrosiadou, V; Goulis, D; Shankararaman, V; Shamtani, G

    1999-01-01

    Distributed decision support is a challenging issue requiring the implementation of advanced computer science techniques together with tools of development which offer ease of communication and efficiency of searching and control performance. This paper presents a JAVA implementation of a knowledge base model called ARISTOTELES which may be used in order to support the development of the medical knowledge base by clinicians in diverse specialised areas of interest. The advantages that are evident by the application of such a cognitive model are ease of knowledge acquisition, modular construction of the knowledge base and greater acceptance from clinicians.

  6. Knowledge-based interpretation of toxoplasmosis serology test results including fuzzy temporal concepts--the ToxoNet system.

    PubMed

    Kopecky, D; Hayde, M; Prusa, A R; Adlassnig, K P

    2001-01-01

Transplacental transmission of Toxoplasma gondii from an infected, pregnant woman to the unborn child, which occurs with a probability of about 60 percent [1], results in fetal damage to a degree that depends on the gestational age. The computer system ToxoNet processes the results of serological antibody tests performed during pregnancy by means of a knowledge base containing medical knowledge on the interpretation of Toxoplasmosis serology tests. By applying this knowledge, ToxoNet generates interpretive reports consisting of a diagnostic interpretation and recommendations for therapy and further testing. For that purpose, it matches the results of all serological investigations of maternal blood with the content of the knowledge base, returning complete textual interpretations for all given findings. From these, the interpretation algorithm derives the stage of maternal infection, which is used to infer the degree of fetal threat. To accommodate the varying immune responses of particular patients, certain time intervals have to be kept between two subsequent tests in order to guarantee a correct interpretation of the test results. These time intervals are modelled as fuzzy sets, since they allow the formal description of the temporal uncertainties. ToxoNet comprises the knowledge base, an interpretation system, and a program for the creation and modification of the knowledge base. It is available from the World Wide Web by starting a standard browser such as Internet Explorer or Netscape Navigator. Thus, ToxoNet supports the physician in Toxoplasmosis diagnostics and, in addition, allows the way decisions are made to be adapted to the characteristics of the particular laboratory by modifying the underlying knowledge base.
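A fuzzy time interval of the kind described can be represented by a trapezoidal membership function: the gap between two tests is "adequate" to a degree between 0 and 1 rather than by a hard cutoff. The breakpoints below (10/14/28/35 days) are illustrative assumptions, not values from the paper:

```python
def trapezoid(x: float, a: float, b: float, c: float, d: float) -> float:
    """Trapezoidal membership: 0 below a, rises to 1 on [b, c], falls to 0 at d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)   # rising edge
    return (d - x) / (d - c)       # falling edge

def interval_adequacy(days_between_tests: float) -> float:
    """Degree to which the gap between two serology tests is adequate."""
    return trapezoid(days_between_tests, 10, 14, 28, 35)

print(interval_adequacy(21))  # 1.0 (fully adequate)
print(interval_adequacy(12))  # 0.5 (borderline)
```

An interpretation rule can then weight its conclusion by this degree instead of rejecting a test pair outright when the interval is slightly too short.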

  7. A study of knowledge-based systems for the Space Station

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Swietek, Gregg; Bullock, Bruce

    1989-01-01

    A rapid turnaround study on the potential uses of knowledge-based systems for Space Station Freedom was conducted from October 1987 through January 1988. Participants included both NASA personnel and experienced industrial knowledge engineers. Major results of the study included five recommended systems for the Baseline Configuration of the Space Station, an analysis of sensor hooks and scars, and a proposed plan for evolutionary growth of knowledge-based systems on the Space Station.

  8. BEST: Next-Generation Biomedical Entity Search Tool for Knowledge Discovery from Biomedical Literature

    PubMed Central

    Lee, Kyubum; Choi, Jaehoon; Kim, Seongsoon; Jeon, Minji; Lim, Sangrak; Choi, Donghee; Kim, Sunkyu; Tan, Aik-Choon

    2016-01-01

    As the volume of publications rapidly increases, searching for relevant information from the literature becomes more challenging. To complement standard search engines such as PubMed, it is desirable to have an advanced search tool that directly returns relevant biomedical entities such as targets, drugs, and mutations rather than a long list of articles. Some existing tools submit a query to PubMed and process retrieved abstracts to extract information at query time, resulting in a slow response time and limited coverage of only a fraction of the PubMed corpus. Other tools preprocess the PubMed corpus to speed up the response time; however, they are not constantly updated, and thus produce outdated results. Further, most existing tools cannot process sophisticated queries such as searches for mutations that co-occur with query terms in the literature. To address these problems, we introduce BEST, a biomedical entity search tool. BEST returns, as a result, a list of 10 different types of biomedical entities including genes, diseases, drugs, targets, transcription factors, miRNAs, and mutations that are relevant to a user’s query. To the best of our knowledge, BEST is the only system that processes free text queries and returns up-to-date results in real time including mutation information in the results. BEST is freely accessible at http://best.korea.ac.kr. PMID:27760149

  9. BEST: Next-Generation Biomedical Entity Search Tool for Knowledge Discovery from Biomedical Literature.

    PubMed

    Lee, Sunwon; Kim, Donghyeon; Lee, Kyubum; Choi, Jaehoon; Kim, Seongsoon; Jeon, Minji; Lim, Sangrak; Choi, Donghee; Kim, Sunkyu; Tan, Aik-Choon; Kang, Jaewoo

    2016-01-01

    As the volume of publications rapidly increases, searching for relevant information from the literature becomes more challenging. To complement standard search engines such as PubMed, it is desirable to have an advanced search tool that directly returns relevant biomedical entities such as targets, drugs, and mutations rather than a long list of articles. Some existing tools submit a query to PubMed and process retrieved abstracts to extract information at query time, resulting in a slow response time and limited coverage of only a fraction of the PubMed corpus. Other tools preprocess the PubMed corpus to speed up the response time; however, they are not constantly updated, and thus produce outdated results. Further, most existing tools cannot process sophisticated queries such as searches for mutations that co-occur with query terms in the literature. To address these problems, we introduce BEST, a biomedical entity search tool. BEST returns, as a result, a list of 10 different types of biomedical entities including genes, diseases, drugs, targets, transcription factors, miRNAs, and mutations that are relevant to a user's query. To the best of our knowledge, BEST is the only system that processes free text queries and returns up-to-date results in real time including mutation information in the results. BEST is freely accessible at http://best.korea.ac.kr.

  10. Fuzzy knowledge base construction through belief networks based on Lukasiewicz logic

    NASA Technical Reports Server (NTRS)

    Lara-Rosano, Felipe

    1992-01-01

    In this paper, a procedure is proposed to build a fuzzy knowledge base founded on fuzzy belief networks and Lukasiewicz logic. Fuzzy procedures are developed to do the following: to assess the belief values of a consequent, in terms of the belief values of its logical antecedents and the belief value of the corresponding logical function; and to update belief values when new evidence is available.
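A minimal sketch of the belief update described here, using the standard Lukasiewicz connectives; the rule structure and function names are assumptions, since the paper's actual procedures are not shown in the abstract:

```python
def luk_and(a: float, b: float) -> float:
    """Lukasiewicz t-norm (strong conjunction)."""
    return max(0.0, a + b - 1.0)

def luk_implies(a: float, b: float) -> float:
    """Lukasiewicz implication."""
    return min(1.0, 1.0 - a + b)

def consequent_belief(antecedent_beliefs, rule_belief: float) -> float:
    """Belief in a consequent: conjoin the antecedent beliefs with the
    Lukasiewicz t-norm, then conjoin with the belief in the rule itself."""
    combined = 1.0
    for b in antecedent_beliefs:
        combined = luk_and(combined, b)
    return luk_and(combined, rule_belief)

# Two antecedents believed at 0.9 and 0.8, rule believed at 0.95
print(consequent_belief([0.9, 0.8], 0.95))  # approximately 0.65
```

Note how the Lukasiewicz t-norm degrades belief faster than min: each imperfectly believed antecedent subtracts from the total, which is why chains of weak evidence yield low consequent belief.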

  11. Towards knowledge-based systems in clinical practice: development of an integrated clinical information and knowledge management support system.

    PubMed

    Kalogeropoulos, Dimitris A; Carson, Ewart R; Collinson, Paul O

    2003-09-01

Given that clinicians presented with identical clinical information will act in different ways, there is a need to introduce into routine clinical practice methods and tools to support the scientific homogeneity and accountability of healthcare decisions and actions. The benefits expected from such action include an overall reduction in cost, improved quality of care, and patient and public satisfaction. Computer-based medical data processing has yielded methods and tools for managing the task away from the hospital management level and closer to the desired disease and patient management level. To this end, advanced applications of information and disease process modelling technologies have already demonstrated an ability to significantly augment clinical decision making as a by-product. The widespread acceptance of evidence-based medicine as the basis of cost-conscious and concurrently quality-wise accountable clinical practice suffices as evidence supporting this claim. Electronic libraries are one step towards an online status of this key healthcare delivery quality control environment. Nonetheless, to date, the underlying information and knowledge management technologies have failed to be integrated into any form of pragmatic or marketable online and real-time clinical decision making tool. One of the main obstacles that needs to be overcome is the development of systems that treat both information and knowledge as clinical objects with the same modelling requirements. This paper describes the development of such a system in the form of an intelligent clinical information management system: a system which, at the most fundamental level of clinical decision support, facilitates both the organised acquisition of clinical information and knowledge and provides a test-bed for the development and evaluation of knowledge-based decision support functions.

  12. The Hebrewer: A Web-Based Inflection Generator

    ERIC Educational Resources Information Center

    Foster, James Q.; Harrell, Lane Foster; Raizen, Esther

    2004-01-01

    This paper reports on the grammatical and programmatical production aspects of the "Hebrewer," a cross-platform web-based reference work in the form of a Hebrew inflection generator. The Hebrewer, a Java applet/servlet combination, is currently capable of generating 2,500 nouns in full declension and 500 verbs in full conjugation,…

  13. Virk: an active learning-based system for bootstrapping knowledge base development in the neurosciences.

    PubMed

    Ambert, Kyle H; Cohen, Aaron M; Burns, Gully A P C; Boudreau, Eilis; Sonmez, Kemal

    2013-01-01

The frequency and volume of newly-published scientific literature is quickly making manual maintenance of publicly-available databases of primary data unrealistic and costly. Although machine learning (ML) can be useful for developing automated approaches to identifying scientific publications containing relevant information for a database, developing such tools necessitates manually annotating an unrealistic number of documents. One approach to this problem, active learning (AL), builds classification models by iteratively identifying documents that provide the most information to a classifier. Although this approach has been shown to be effective for related problems, in the context of scientific database curation, it falls short. We present Virk, an AL system that, while being trained, simultaneously learns a classification model and identifies documents having information of interest for a knowledge base. Our approach uses a support vector machine (SVM) classifier with input features derived from neuroscience-related publications from the primary literature. Using our approach, we were able to increase the size of the Neuron Registry, a knowledge base of neuron-related information, by 90% in 3 months. Using standard biocuration methods, it would have taken between 1 and 2 years to make the same number of contributions to the Neuron Registry. Here, we describe the system pipeline in detail, and evaluate its performance against other approaches to sampling in AL.
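The active-learning loop behind systems like this one can be sketched as uncertainty sampling: train on the labelled set, query the pool document the model is least sure about, "annotate" it, and repeat. The toy data and the plain logistic-regression stand-in below are illustrative assumptions (Virk itself uses an SVM over literature-derived features):

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy "document" feature vectors and relevance labels standing in for
# curated neuroscience abstracts.
X = rng.normal(size=(500, 20))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

def train(Xl, yl, steps=500, lr=0.5):
    """Plain logistic regression fitted by gradient descent."""
    w = np.zeros(Xl.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xl @ w))
        w -= lr * Xl.T @ (p - yl) / len(yl)
    return w

labeled = list(range(10))                      # small seed set
pool = [i for i in range(len(X)) if i not in labeled]

for _ in range(20):                            # 20 active-learning rounds
    w = train(X[labeled], y[labeled])
    p = 1.0 / (1.0 + np.exp(-X[pool] @ w))
    query = pool[int(np.argmin(np.abs(p - 0.5)))]  # most uncertain document
    labeled.append(query)                      # "annotate" it
    pool.remove(query)

w = train(X[labeled], y[labeled])
accuracy = float(np.mean((1.0 / (1.0 + np.exp(-X @ w)) > 0.5) == (y > 0.5)))
```

The paper's point is that for curation the queried documents are not only training fodder: each uncertain-but-relevant document is itself a candidate contribution to the knowledge base.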

  14. Public School Teachers' Knowledge, Perception, and Implementation of Brain-Based Learning Practices

    ERIC Educational Resources Information Center

    Wachob, David A.

    2012-01-01

    The purpose of this study was to determine K-12 teachers' knowledge, beliefs, and practices of brain-based learning strategies in western Pennsylvania schools. The following five research questions were explored: (a) What is the extent of knowledge K-12 public school teachers have about the indicators of brain-based learning and Brain Gym?; (b) To…

  15. A Discourse Based Approach to the Language Documentation of Local Ecological Knowledge

    ERIC Educational Resources Information Center

    Odango, Emerson Lopez

    2016-01-01

    This paper proposes a discourse-based approach to the language documentation of local ecological knowledge (LEK). The knowledge, skills, beliefs, cultural worldviews, and ideologies that shape the way a community interacts with its environment can be examined through the discourse in which LEK emerges. 'Discourse-based' refers to two components:…

  16. End-user oriented language to develop knowledge-based expert systems

    SciTech Connect

    Ueno, H.

    1983-01-01

    A description is given of the COMEX (compact knowledge based expert system) expert system language for application-domain users who want to develop a knowledge-based expert system by themselves. The COMEX system was written in FORTRAN and works on a microcomputer. COMEX is being used in several application domains such as medicine, education, and industry. 7 references.

  17. Comparison of LISP and MUMPS as implementation languages for knowledge-based systems

    SciTech Connect

    Curtis, A.C.

    1984-01-01

    Major components of knowledge-based systems are summarized, along with the programming language features generally useful in their implementation. LISP and MUMPS are briefly described and compared as vehicles for building knowledge-based systems. The paper concludes with suggestions for extensions to MUMPS which might increase its usefulness in artificial intelligence applications without affecting the essential nature of the language. 8 references.

  18. Mapping and Managing Knowledge and Information in Resource-Based Learning

    ERIC Educational Resources Information Center

    Tergan, Sigmar-Olaf; Graber, Wolfgang; Neumann, Anja

    2006-01-01

    In resource-based learning scenarios, students are often overwhelmed by the complexity of task-relevant knowledge and information. Techniques for the external interactive representation of individual knowledge in graphical format may help them to cope with complex problem situations. Advanced computer-based concept-mapping tools have the potential…

  19. Preparing Oral Examinations of Mathematical Domains with the Help of a Knowledge-Based Dialogue System.

    ERIC Educational Resources Information Center

    Schmidt, Peter

A conception of discussing mathematical material in the domain of calculus is outlined. Applications include university students working on their knowledge and preparing for their oral examinations by utilizing the dialogue system. The conception is based upon three pillars. One central pillar is a knowledge base containing the collections of…

  20. GUIDON-WATCH: A Graphic Interface for Viewing a Knowledge-Based System. Technical Report #14.

    ERIC Educational Resources Information Center

    Richer, Mark H.; Clancey, William J.

    This paper describes GUIDON-WATCH, a graphic interface that uses multiple windows and a mouse to allow a student to browse a knowledge base and view reasoning processes during diagnostic problem solving. The GUIDON project at Stanford University is investigating how knowledge-based systems can provide the basis for teaching programs, and this…