Ontological Problem-Solving Framework for Dynamically Configuring Sensor Systems and Algorithms
Qualls, Joseph; Russomanno, David J.
2011-01-01
The deployment of ubiquitous sensor systems and algorithms has led to many challenges, such as matching sensor systems to compatible algorithms capable of satisfying a task. Compounding these challenges is the lack of the requisite knowledge models needed to discover sensors and algorithms and to subsequently integrate their capabilities to satisfy a specific task. A novel ontological problem-solving framework has been designed to match sensors to compatible algorithms to form synthesized systems that are capable of satisfying a task, and then to assign the synthesized systems to high-level missions. The approach designed for the ontological problem-solving framework has been instantiated in the context of a persistent-surveillance prototype environment, which includes profiling sensor systems and algorithms, to demonstrate proof-of-concept principles. Even though the problem-solving approach was instantiated with profiling sensor systems and algorithms, the ontological framework may be useful in other heterogeneous sensing-system environments. PMID:22163793
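The core matching step described above can be sketched as a capability-compatibility check. This is a minimal illustration, not the paper's actual model: the sensor and algorithm records, their `provides`/`requires` capability sets, and all identifiers below are invented for the example.

```python
# Hypothetical sketch of capability-based matching between sensors and
# algorithms. An algorithm matches a sensor when every input it requires
# is among the outputs the sensor provides.

def match_sensors_to_algorithms(sensors, algorithms):
    """Pair each algorithm with sensors whose outputs satisfy its inputs."""
    synthesized = []
    for algo in algorithms:
        for sensor in sensors:
            # Subset test: the algorithm's required inputs must all be
            # covered by the sensor's provided outputs.
            if algo["requires"] <= sensor["provides"]:
                synthesized.append((sensor["id"], algo["id"]))
    return synthesized

sensors = [
    {"id": "profiling-sensor-1", "provides": {"height-profile", "timestamp"}},
    {"id": "acoustic-1", "provides": {"audio", "timestamp"}},
]
algorithms = [
    {"id": "object-classifier", "requires": {"height-profile"}},
    {"id": "gait-analyzer", "requires": {"height-profile", "timestamp"}},
]

print(match_sensors_to_algorithms(sensors, algorithms))
```

In a real instantiation the capability descriptions would come from the ontology rather than hand-written dictionaries, and matching would be delegated to an ontology reasoner.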
Qualls, Joseph; Russomanno, David J.
2011-01-01
The lack of knowledge models to represent sensor systems, algorithms, and missions makes it impractical to opportunistically discover a synthesis of systems and algorithms that can satisfy high-level mission specifications. A novel ontological problem-solving framework has been designed that leverages knowledge models describing sensors, algorithms, and high-level missions to facilitate automated inference for assigning systems to subtasks that may satisfy a given mission specification. To demonstrate the efficacy of the ontological problem-solving architecture, a family of persistent-surveillance sensor systems and algorithms has been instantiated in a prototype environment to demonstrate the assignment of systems to subtasks of high-level missions. PMID:22164081
A fuzzy-ontology-oriented case-based reasoning framework for semantic diabetes diagnosis.
El-Sappagh, Shaker; Elmogy, Mohammed; Riad, A M
2015-11-01
Case-based reasoning (CBR) is a problem-solving paradigm that uses past knowledge to interpret or solve new problems. It is suitable for experience-based and theory-less problems. Building a semantically intelligent CBR system that mimics expert thinking can solve many problems, especially medical ones. Knowledge-intensive CBR using formal ontologies is an evolution of this paradigm. Ontologies can be used for case representation and storage, and as background knowledge. Using standard medical ontologies, such as SNOMED CT, enhances interoperability and integration with health care systems. Moreover, utilizing vague or imprecise knowledge further improves the CBR's semantic effectiveness. This paper proposes a fuzzy-ontology-based CBR framework. It proposes a fuzzy case-base OWL2 ontology and a fuzzy semantic retrieval algorithm that handles many feature types. The framework is implemented and tested on the diabetes diagnosis problem. The fuzzy ontology is populated with 60 real diabetic cases. The effectiveness of the proposed approach is illustrated with a set of experiments and case studies. The resulting system can answer complex medical queries related to semantic understanding of medical concepts and handling of vague terms. The resulting fuzzy case-base ontology has 63 concepts, 54 (fuzzy) object properties, 138 (fuzzy) datatype properties, 105 fuzzy datatypes, and 2640 instances. The system achieves an accuracy of 97.67%. We compare our framework with existing CBR systems and a set of five machine-learning classifiers; our system outperforms all of them. Building an integrated CBR system can improve its performance. Representing CBR knowledge using the fuzzy ontology and building a case retrieval algorithm that treats different feature types differently improves the accuracy of the resulting system. Copyright © 2015 Elsevier B.V. All rights reserved.
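The retrieval idea behind fuzzy CBR, treating numeric and categorical features differently and scoring numeric ones with a fuzzy membership, can be sketched as follows. Everything here (feature names such as `fpg`, weights, spreads, and the toy cases) is an invented illustration, not the paper's actual algorithm or data.

```python
# Minimal sketch of fuzzy case retrieval: categorical features use exact
# match, numeric features use a triangular fuzzy similarity centered at
# the query value; a weighted average combines them.

def numeric_sim(q, c, spread):
    """Triangular membership: 1 at the query value, 0 beyond +/- spread."""
    return max(0.0, 1.0 - abs(q - c) / spread)

def case_similarity(query, case, weights, spreads):
    total, wsum = 0.0, 0.0
    for feat, w in weights.items():
        qv, cv = query[feat], case[feat]
        if isinstance(qv, str):
            s = 1.0 if qv == cv else 0.0   # categorical: exact match
        else:
            s = numeric_sim(qv, cv, spreads[feat])  # numeric: fuzzy match
        total += w * s
        wsum += w
    return total / wsum

cases = [
    {"id": 1, "fpg": 130.0, "sex": "F", "diagnosis": "diabetic"},
    {"id": 2, "fpg": 95.0, "sex": "F", "diagnosis": "normal"},
]
query = {"fpg": 128.0, "sex": "F"}
weights = {"fpg": 2.0, "sex": 1.0}
spreads = {"fpg": 40.0}

best = max(cases, key=lambda c: case_similarity(query, c, weights, spreads))
print(best["diagnosis"])
```

The actual framework stores cases and fuzzy datatypes in an OWL2 ontology and retrieves semantically; this sketch only shows the shape of the weighted fuzzy similarity.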
MONTO: A Machine-Readable Ontology for Teaching Word Problems in Mathematics
ERIC Educational Resources Information Center
Lalingkar, Aparna; Ramnathan, Chandrashekar; Ramani, Srinivasan
2015-01-01
The Indian National Curriculum Framework has as one of its objectives the development of mathematical thinking and problem solving ability. However, recent studies conducted in Indian metros have expressed concern about students' mathematics learning. Except in some private coaching academies, regular classroom teaching does not include problem…
A novel way of integrating rule-based knowledge into a web ontology language framework.
Gamberger, Dragan; Krstačić, Goran; Jović, Alan
2013-01-01
Web Ontology Language (OWL), used in combination with the Protégé visual interface, is a modern standard for the development and maintenance of ontologies and a powerful tool for knowledge presentation. In this work, we describe a novel possibility of using OWL also for the conceptualization of knowledge presented by a set of rules. In this approach, rules are represented as a hierarchy of actionable classes with necessary and sufficient conditions defined in the description logic formalism. The advantages are that the set of rules is no longer unordered, that concepts defined in descriptive ontologies can be used directly in the bodies of rules, and that Protégé provides an intuitive tool for editing the rule set. Standard ontology reasoning processes are not applicable in this framework, but experiments conducted on the rule sets have demonstrated that the reasoning problems can be successfully solved.
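The "rules as a hierarchy of actionable classes" idea can be sketched outside OWL: each class carries a defining condition, and an individual belongs to a class only if its condition and those of all its ancestors hold. The class names, patient fields, and thresholds below are invented placeholders, not the paper's rule base.

```python
# Sketch: rules as a hierarchy of actionable classes. Each entry maps a
# class name to (parent class or None, defining condition). Membership
# requires the condition of the class and of every ancestor to hold.

RULES = {
    "Patient":         (None,      lambda p: True),
    "RiskPatient":     ("Patient", lambda p: p["age"] > 60 or p["smoker"]),
    "HighRiskPatient": ("RiskPatient",
                        lambda p: p["age"] > 60 and p["smoker"]),
}

def classify(individual, rules):
    """Return every class whose condition chain is satisfied."""
    members = set()
    for name in rules:
        chain, ok = name, True
        while chain is not None:
            parent, cond = rules[chain]
            if not cond(individual):
                ok = False
                break
            chain = parent
        if ok:
            members.add(name)
    return members

print(classify({"age": 65, "smoker": True}, RULES))
```

In the actual approach the conditions are description logic expressions and classification is done by a DL reasoner; the hierarchy is what makes the rule set ordered and browsable in Protégé.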
A General Architecture for Intelligent Tutoring of Diagnostic Classification Problem Solving
Crowley, Rebecca S.; Medvedeva, Olga
2003-01-01
We report on a general architecture for creating knowledge-based medical training systems to teach diagnostic classification problem solving. The approach is informed by our previous work describing the development of expertise in classification problem solving in Pathology. The architecture envelops the traditional Intelligent Tutoring System design within the Unified Problem-solving Method description Language (UPML) architecture, supporting component modularity and reuse. Based on the domain ontology, the domain task ontology and case data, the abstract problem-solving methods of the expert model create a dynamic solution graph. Student interaction with the solution graph is filtered through an instructional layer, which is created by a second set of abstract problem-solving methods and pedagogic ontologies in response to the current state of the student model. We outline the advantages and limitations of this general approach, and describe its implementation in SlideTutor, a developing Intelligent Tutoring System in Dermatopathology. PMID:14728159
Tu, S W; Eriksson, H; Gennari, J H; Shahar, Y; Musen, M A
1995-06-01
PROTEGE-II is a suite of tools and a methodology for building knowledge-based systems and domain-specific knowledge-acquisition tools. In this paper, we show how PROTEGE-II can be applied to the task of providing protocol-based decision support in the domain of treating HIV-infected patients. To apply PROTEGE-II, (1) we construct a decomposable problem-solving method called episodic skeletal-plan refinement, (2) we build an application ontology that consists of the terms and relations in the domain, and of method-specific distinctions not already captured in the domain terms, and (3) we specify mapping relations that link terms from the application ontology to the domain-independent terms used in the problem-solving method. From the application ontology, we automatically generate a domain-specific knowledge-acquisition tool that is custom-tailored for the application. The knowledge-acquisition tool is used for the creation and maintenance of domain knowledge used by the problem-solving method. The general goal of the PROTEGE-II approach is to produce systems and components that are reusable and easily maintained. This is the rationale for constructing ontologies and problem-solving methods that can be composed from a set of smaller-grained methods and mechanisms. This is also why we tightly couple the knowledge-acquisition tools to the application ontology that specifies the domain terms used in the problem-solving systems. Although our evaluation is still preliminary, for the application task of providing protocol-based decision support, we show that these goals of reusability and easy maintenance can be achieved. We discuss design decisions and the tradeoffs that have to be made in the development of the system.
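The mapping relations that link application-ontology terms to the domain-independent vocabulary of a problem-solving method can be sketched as a dictionary-driven translation layer. The terms below (`protocol`, `skeletal-plan`, the sample record) are invented for illustration and are not PROTEGE-II's actual mappings.

```python
# Hypothetical sketch of mapping relations: domain terms from the
# application ontology are renamed into the method-independent terms
# the problem-solving method (e.g. skeletal-plan refinement) expects.

MAPPINGS = {
    "protocol": "skeletal-plan",
    "treatment-step": "plan-step",
}

def to_method_terms(domain_record, mappings):
    """Translate a domain record's keys; unmapped keys pass through."""
    return {mappings.get(k, k): v for k, v in domain_record.items()}

print(to_method_terms({"protocol": "AZT+ddI", "duration-weeks": 12}, MAPPINGS))
```

The point of keeping the mapping separate is the one the abstract makes: the problem-solving method stays domain-independent, and only the mapping layer changes when the method is reused in a new domain.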
Tcheremenskaia, Olga; Benigni, Romualdo; Nikolova, Ivelina; Jeliazkova, Nina; Escher, Sylvia E; Batke, Monika; Baier, Thomas; Poroikov, Vladimir; Lagunin, Alexey; Rautenberg, Micha; Hardy, Barry
2012-04-24
The OpenTox Framework, developed by the partners in the OpenTox project (http://www.opentox.org), aims at providing unified access to toxicity data, predictive models and validation procedures. Interoperability of resources is achieved using a common information model, based on the OpenTox ontologies, describing predictive algorithms, models and toxicity data. As toxicological data may come from different, heterogeneous sources, a deployed ontology, unifying the terminology and the resources, is critical for the rational and reliable organization of the data and its automatic processing. The following related ontologies have been developed for OpenTox: a) Toxicological ontology - listing the toxicological endpoints; b) Organs system and Effects ontology - addressing organs, targets/examinations and effects observed in in vivo studies; c) ToxML ontology - representing semi-automatic conversion of the ToxML schema; d) OpenTox ontology - representation of OpenTox framework components: chemical compounds, datasets, types of algorithms, models and validation web services; e) ToxLink-ToxCast assays ontology; and f) OpenToxipedia, a community knowledge resource on toxicology terminology. OpenTox components are made available through standardized REST web services, where every compound, data set, and predictive method has a unique resolvable address (URI), used to retrieve its Resource Description Framework (RDF) representation or to initiate the associated calculations and generate new RDF-based resources. The services support the integration of toxicity and chemical data from various sources, the generation and validation of computer models for toxic effects, and the seamless integration of new algorithms and scientifically sound validation routines, and they provide a flexible framework that allows building an arbitrary number of applications tailored to solving different problems by end users (e.g. toxicologists).
The OpenTox toxicological ontology projects may be accessed via the OpenTox ontology development page http://www.opentox.org/dev/ontology; the OpenTox ontology is available as OWL at http://opentox.org/api/1.1/opentox.owl; the ToxML-OWL conversion utility is an open source resource available at http://ambit.svn.sourceforge.net/viewvc/ambit/branches/toxml-utils/
PMID:22541598
NASA Astrophysics Data System (ADS)
Tuminaro, Jonathan
Many introductory, algebra-based physics students perform poorly on mathematical problem solving tasks in physics. There are at least two possible, distinct reasons for this poor performance: (1) students simply lack the mathematical skills needed to solve problems in physics, or (2) students do not know how to apply the mathematical skills they have to particular problem situations in physics. While many students do lack the requisite mathematical skills, a major finding from this work is that the majority of students possess the requisite mathematical skills, yet fail to use or interpret them in the context of physics. In this thesis I propose a theoretical framework to analyze and describe students' mathematical thinking in physics. In particular, I attempt to answer two questions. What are the cognitive tools involved in formal mathematical thinking in physics? And, why do students make the kinds of mistakes they do when using mathematics in physics? According to the proposed theoretical framework there are three major theoretical constructs: mathematical resources, which are the knowledge elements that are activated in mathematical thinking and problem solving; epistemic games, which are patterns of activities that use particular kinds of knowledge to create new knowledge or solve a problem; and frames, which are structures of expectations that determine how individuals interpret situations or events. The empirical basis for this study comes from videotaped sessions of college students solving homework problems. The students are enrolled in an algebra-based introductory physics course. The videotapes were transcribed and analyzed using the aforementioned theoretical framework. 
Two important results from this work are: (1) the construction of a theoretical framework that offers researchers a vocabulary (an ontological classification of cognitive structures) and grammar (the relationships between the cognitive structures) for understanding the nature and origin of mathematics use in the context of physics, and (2) a detailed understanding, in terms of the proposed theoretical framework, of the errors that students make when using mathematics in the context of physics.
A four stage approach for ontology-based health information system design.
Kuziemsky, Craig E; Lau, Francis
2010-11-01
To describe and illustrate a four stage methodological approach to capture user knowledge in a biomedical domain area, use that knowledge to design an ontology, and then implement and evaluate the ontology as a health information system (HIS). A hybrid participatory design-grounded theory (GT-PD) method was used to obtain data and code them for ontology development. Prototyping was used to implement the ontology as a computer-based tool. Usability testing evaluated the computer-based tool. An empirically derived domain ontology and set of three problem-solving approaches were developed as a formalized model of the concepts and categories from the GT coding. The ontology and problem-solving approaches were used to design and implement a HIS that tested favorably in usability testing. The four stage approach illustrated in this paper is useful for designing and implementing an ontology as the basis for a HIS. The approach extends existing ontology development methodologies by providing an empirical basis for theory incorporated into ontology design. Copyright © 2010 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Feng, Shou; Fu, Ping; Zheng, Wenbin
2018-03-01
Predicting gene function based on biological instrumental data is a complicated and challenging hierarchical multi-label classification (HMC) problem. When using local-approach methods to solve this problem, a preliminary-results processing method is usually needed. This paper proposes a novel preliminary-results processing method called the nodes interaction method. The nodes interaction method revises the preliminary results and guarantees that the predictions are consistent with the hierarchy constraint. In its first phase, the method exploits label dependency and considers the hierarchical interaction between nodes when making decisions based on a Bayesian network. In its second phase, the method further adjusts the results according to the hierarchy constraint. Implementing the nodes interaction method in the HMC framework also enhances HMC performance for solving the gene function prediction problem based on the Gene Ontology (GO), whose hierarchy is a directed acyclic graph and therefore more difficult to tackle. The experimental results validate the promising performance of the proposed method compared to state-of-the-art methods on eight benchmark yeast data sets annotated with the GO.
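The hierarchy constraint itself is easy to state: a node's predicted score should never exceed that of its parents in the GO DAG. The sketch below enforces it by repeatedly capping each node at the minimum of its parents' scores. This is a generic consistency pass for illustration, simpler than the paper's Bayesian-network method; the node names and scores are invented.

```python
# Enforce the hierarchy constraint on per-node scores over a DAG:
# cap every node's score at the minimum of its parents' scores,
# repeating until nothing changes (scores only decrease, so this halts).

def enforce_hierarchy(scores, parents):
    fixed = dict(scores)
    changed = True
    while changed:
        changed = False
        for node, ps in parents.items():
            if ps:
                cap = min(fixed[p] for p in ps)
                if fixed[node] > cap:
                    fixed[node] = cap
                    changed = True
    return fixed

scores = {"root": 0.9, "a": 0.8, "b": 0.4, "leaf": 0.7}
parents = {"root": [], "a": ["root"], "b": ["root"], "leaf": ["a", "b"]}
print(enforce_hierarchy(scores, parents))
```

Note that `leaf` has two parents, which is what makes the GO's DAG structure harder than a tree: the cap must respect all parent paths, not just one.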
An Ontology for Learning Services on the Shop Floor
ERIC Educational Resources Information Center
Ullrich, Carsten
2016-01-01
An ontology expresses a common understanding of a domain that serves as a basis of communication between people or systems, and enables knowledge sharing, reuse of domain knowledge, reasoning and thus problem solving. In Technology-Enhanced Learning, especially in Intelligent Tutoring Systems and Adaptive Learning Environments, ontologies serve as…
Modern architectures for intelligent systems: reusable ontologies and problem-solving methods.
Musen, M. A.
1998-01-01
When interest in intelligent systems for clinical medicine soared in the 1970s, workers in medical informatics became particularly attracted to rule-based systems. Although many successful rule-based applications were constructed, development and maintenance of large rule bases remained quite problematic. In the 1980s, an entire industry dedicated to the marketing of tools for creating rule-based systems rose and fell, as workers in medical informatics began to appreciate deeply why knowledge acquisition and maintenance for such systems are difficult problems. During this time period, investigators began to explore alternative programming abstractions that could be used to develop intelligent systems. The notions of "generic tasks" and of reusable problem-solving methods became extremely influential. By the 1990s, academic centers were experimenting with architectures for intelligent systems based on two classes of reusable components: (1) domain-independent problem-solving methods (standard algorithms for automating stereotypical tasks) and (2) domain ontologies that captured the essential concepts (and relationships among those concepts) in particular application areas. This paper will highlight how intelligent systems for diverse tasks can be efficiently automated using these kinds of building blocks. The creation of domain ontologies and problem-solving methods is the fundamental end product of basic research in medical informatics. Consequently, these concepts need more attention by our scientific community. PMID:9929181
Formal ontology for natural language processing and the integration of biomedical databases.
Simon, Jonathan; Dos Santos, Mariana; Fielding, James; Smith, Barry
2006-01-01
The central hypothesis underlying this communication is that the methodology and conceptual rigor of a philosophically inspired formal ontology can bring significant benefits in the development and maintenance of application ontologies [A. Flett, M. Dos Santos, W. Ceusters, Some Ontology Engineering Procedures and their Supporting Technologies, EKAW2002, 2003]. This hypothesis has been tested in the collaboration between Language and Computing (L&C), a company specializing in software for supporting natural language processing especially in the medical field, and the Institute for Formal Ontology and Medical Information Science (IFOMIS), an academic research institution concerned with the theoretical foundations of ontology. In the course of this collaboration L&C's ontology, LinKBase, which is designed to integrate and support reasoning across a plurality of external databases, has been subjected to a thorough auditing on the basis of the principles underlying IFOMIS's Basic Formal Ontology (BFO) [B. Smith, Basic Formal Ontology, 2002. http://ontology.buffalo.edu/bfo]. The goal is to transform a large terminology-based ontology into one with the ability to support reasoning applications. Our general procedure has been the implementation of a meta-ontological definition space in which the definitions of all the concepts and relations in LinKBase are standardized in the framework of first-order logic. In this paper we describe how this principles-based standardization has led to a greater degree of internal coherence of the LinKBase structure, and how it has facilitated the construction of mappings between external databases using LinKBase as translation hub. We argue that the collaboration here described represents a new phase in the quest to solve the so-called "Tower of Babel" problem of ontology integration [F. Montayne, J. Flanagan, Formal Ontology: The Foundation for Natural Language Processing, 2003. http://www.landcglobal.com/].
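The "translation hub" pattern described above avoids pairwise mappings between every database: each database maps its terms to hub concepts once, and any-to-any translation goes through the hub. The database names, terms, and hub concepts below are invented placeholders, not LinKBase content.

```python
# Sketch of a translation hub: with n databases, n mappings to the hub
# replace n*(n-1) pairwise mappings. Each database maps its local terms
# to shared hub concepts.

TO_HUB = {
    "db_a": {"MI": "myocardial-infarction"},
    "db_b": {"heart-attack": "myocardial-infarction"},
}

def translate(term, src, dst):
    """Translate a term from one database's vocabulary to another's."""
    concept = TO_HUB[src][term]                  # local term -> hub concept
    inverse = {v: k for k, v in TO_HUB[dst].items()}  # hub concept -> dst term
    return inverse[concept]

print(translate("MI", "db_a", "db_b"))
```

The real system additionally standardizes the hub's definitions in first-order logic so that the mappings can support reasoning, not just renaming.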
The Semantic Retrieval of Spatial Data Service Based on Ontology in SIG
NASA Astrophysics Data System (ADS)
Sun, S.; Liu, D.; Li, G.; Yu, W.
2011-08-01
Research on SIG (Spatial Information Grid) mainly addresses how to connect different computing resources so that users can use all the resources in the Grid transparently and seamlessly. In SIG, spatial data services are described by various specifications, each using different meta-information. This kind of standardization cannot resolve the problem of semantic heterogeneity, which may prevent users from obtaining the required resources. This paper tries to solve two kinds of semantic heterogeneity (name heterogeneity and structure heterogeneity) in spatial data service retrieval based on ontology; in addition, based on the hierarchical subsumption relationships among concepts in an ontology, query terms can be expanded so that more resources can be matched and found for the user. These applications of ontology in spatial data resource retrieval help to improve keyword matching and to find more related resources.
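Query expansion over a subsumption hierarchy can be sketched in a few lines: a query term is expanded with all of its subconcepts, so resources annotated with narrower terms still match. The concept names below are invented examples, not the paper's ontology.

```python
# Sketch of subsumption-based query expansion: walk the subclass
# hierarchy depth-first and collect the query term plus all narrower terms.

SUBCLASSES = {
    "map-service": ["topographic-map", "thematic-map"],
    "topographic-map": [],
    "thematic-map": ["land-use-map"],
    "land-use-map": [],
}

def expand(term, subclasses):
    """Return the term followed by all of its (transitive) subconcepts."""
    result = [term]
    for child in subclasses.get(term, []):
        result.extend(expand(child, subclasses))
    return result

print(expand("map-service", SUBCLASSES))
```

A query for "map-service" then matches a resource annotated only with "land-use-map", which plain keyword matching would miss.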
An Approach to Information Management for AIR7000 with Metadata and Ontologies
2009-10-01
metadata. We then propose an approach based on Semantic Technologies, including the Resource Description Framework (RDF) and Upper Ontologies, for the...mandating specific metadata schemas can result in interoperability problems. For example, many standards within the ADO mandate the use of XML for metadata...such problems, we propose an architecture in which different metadata schemes can interoperate. By using RDF (Resource Description Framework) as a
BiOSS: A system for biomedical ontology selection.
Martínez-Romero, Marcos; Vázquez-Naya, José M; Pereira, Javier; Pazos, Alejandro
2014-04-01
In biomedical informatics, ontologies are considered a key technology for annotating, retrieving and sharing the huge volume of publicly available data. Due to the increasing amount, complexity and variety of existing biomedical ontologies, choosing the ones to be used in a semantic annotation problem or to design a specific application is a difficult task. As a consequence, the design of approaches and tools that facilitate the selection of biomedical ontologies is becoming a priority. In this paper we present BiOSS, a novel system for the selection of biomedical ontologies. BiOSS evaluates the adequacy of an ontology for a given domain according to three criteria: (1) the extent to which the ontology covers the domain; (2) the semantic richness of the ontology in the domain; (3) the popularity of the ontology in the biomedical community. BiOSS has been applied to five representative ontology-selection problems and compared to existing methods and tools. Results are promising and show the usefulness of BiOSS for solving real-world ontology selection problems. BiOSS is openly available both as a web tool and as a web service. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
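The three-criteria evaluation can be sketched as a weighted score per candidate ontology. The weights, component scores, and candidate names below are invented placeholders; BiOSS computes its coverage, richness, and popularity scores from the domain and the literature, not from hand-entered numbers.

```python
# Sketch of combining the three BiOSS-style criteria (coverage, semantic
# richness, popularity) into one score and ranking candidate ontologies.

def selection_score(coverage, richness, popularity, w=(1/3, 1/3, 1/3)):
    """Weighted combination of the three criteria, each in [0, 1]."""
    return w[0] * coverage + w[1] * richness + w[2] * popularity

# candidate -> (coverage, richness, popularity); illustrative values only
candidates = {
    "SNOMED CT": (0.9, 0.8, 0.95),
    "MeSH": (0.7, 0.5, 0.9),
}

ranked = sorted(candidates,
                key=lambda o: selection_score(*candidates[o]),
                reverse=True)
print(ranked[0])
```

Changing the weight vector `w` lets a user prioritize, say, coverage over popularity for a given annotation task.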
Kernel Methods for Mining Instance Data in Ontologies
NASA Astrophysics Data System (ADS)
Bloehdorn, Stephan; Sure, York
The amount of ontologies and metadata available on the Web is constantly growing. The successful application of machine learning techniques for learning ontologies from textual data, i.e. mining for the Semantic Web, contributes to this trend. However, no principled approaches exist so far for mining from the Semantic Web. We investigate how machine learning algorithms can be made amenable to directly taking advantage of the rich knowledge expressed in ontologies and associated instance data. Kernel methods have been successfully employed in various learning tasks and provide a clean framework for interfacing between non-vectorial data and machine learning algorithms. In this spirit, we express the problem of mining instances in ontologies as the problem of defining valid corresponding kernels. We present a principled framework for designing such kernels by decomposing the kernel computation into specialized kernels for selected characteristics of an ontology, which can be flexibly assembled and tuned. Initial experiments on real-world Semantic Web data show promising results and the usefulness of our approach.
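The decomposition idea can be sketched concretely: define a small kernel per ontology characteristic (here, a set-intersection kernel over class memberships and over property values) and combine them with weights. Sums of weighted intersection kernels are themselves valid kernels. The characteristics and instances below are invented for illustration.

```python
# Sketch of a decomposed instance kernel: one specialized kernel per
# ontology characteristic, combined as a weighted sum.

def set_kernel(a, b):
    """Intersection kernel on sets: the number of shared elements."""
    return float(len(a & b))

def instance_kernel(x, y, weights):
    """Weighted sum of per-characteristic set kernels."""
    return sum(w * set_kernel(x[f], y[f]) for f, w in weights.items())

x = {"classes": {"Person", "Researcher"}, "topics": {"ML", "SemWeb"}}
y = {"classes": {"Person"}, "topics": {"ML"}}

print(instance_kernel(x, y, {"classes": 1.0, "topics": 2.0}))
```

Such a kernel can be handed directly to any kernel-based learner (e.g. an SVM), which is exactly the interfacing role the abstract describes.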
Study on the E-commerce platform based on the agent
NASA Astrophysics Data System (ADS)
Fu, Ruixue; Qin, Lishuan; Gao, Yinmin
2011-10-01
To solve the problem of dynamic integration in e-commerce, a multi-agent architecture for an electronic commerce platform system based on agents and ontology is introduced, comprising three major types of agent, an ontology, and a rule collection. In this architecture, service agents and rules are used to realize business process reengineering, the reuse of software components, and the agility of the electronic commerce platform. To illustrate the architecture, a simulation has been carried out; the results imply that the architecture provides a very efficient way to design and implement a flexible, distributed, open and intelligent electronic commerce platform system that solves the problem of dynamic integration in e-commerce. The objective of this paper is to illustrate the architecture of the electronic commerce platform system and to show how agents and ontology support it.
MENTOR: an enabler for interoperable intelligent systems
NASA Astrophysics Data System (ADS)
Sarraipa, João; Jardim-Goncalves, Ricardo; Steiger-Garcao, Adolfo
2010-07-01
A community whose knowledge organisation is based on ontologies will enable an increase in the computational intelligence of its information systems. However, due to the worldwide diversity of communities, a high number of knowledge representation elements that are not semantically coincident have appeared to represent the same segment of reality, becoming a barrier to business communications. Even if a domain community uses the same kind of technologies in its information systems, such as ontologies, this does not resolve their semantic differences. In order to solve this interoperability problem, one solution is to use a reference ontology as an intermediary in communications between the community enterprises and the outside, while allowing the enterprises to keep their own ontologies and semantics unchanged internally. This work proposes MENTOR, a methodology to support the development of a common reference ontology for a group of organisations sharing the same business domain. The methodology is based on the mediator ontology (MO) concept, which assists the semantic transformations between each enterprise's ontology and the reference one. The MO enables each organisation to keep its own terminology, glossary and ontological structures, while providing seamless communication and interaction with the others.
NASA Astrophysics Data System (ADS)
Li, Y.; Jiang, Y.; Yang, C. P.; Armstrong, E. M.; Huang, T.; Moroni, D. F.; McGibbney, L. J.
2016-12-01
Big oceanographic data have been produced, archived and made available online, but finding the right data for scientific research and application development is still a significant challenge. A long-standing problem in data discovery is how to find the interrelationships between keywords and data, as well as the intra-relationships within each of the two. Most previous research attempted to solve this problem by building domain-specific ontologies either manually or through automatic machine-learning techniques. The former is costly, labor-intensive and hard to keep up to date, while the latter is prone to noise and may be difficult for humans to understand. Large-scale user-behavior data represent a largely untapped, unique, and valuable source for discovering semantic relationships among domain-specific vocabulary. In this article, we propose a search engine framework for mining and utilizing dataset relevancy from oceanographic dataset metadata, user behaviors, and existing ontologies. The objective is to improve the discovery accuracy of oceanographic data and reduce the time scientists need to discover, download and reformat data for their projects. Experiments and a search example show that the proposed search engine helps both scientists and general users search with better ranking, recommendations, and ontology navigation.
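Blending an ontology-derived similarity with a user-behavior signal can be sketched as a simple convex combination. The dataset names, co-click counts, similarity values, and the blending weight `alpha` below are all invented placeholders, not the paper's actual ranking model.

```python
# Sketch of blending ontology-based query similarity with a user-behavior
# signal (normalized co-click counts) into a single relevance score.

def relevance(query_sim, coclicks, max_coclicks, alpha=0.6):
    """alpha weights the semantic signal; (1 - alpha) the behavioral one."""
    behavior = coclicks / max_coclicks if max_coclicks else 0.0
    return alpha * query_sim + (1 - alpha) * behavior

# dataset -> (ontology similarity to the query, co-click count)
datasets = {
    "sst-daily": (0.9, 120),
    "wind-monthly": (0.7, 300),
}

max_clicks = max(c for _, c in datasets.values())
ranked = sorted(datasets,
                key=lambda d: relevance(datasets[d][0], datasets[d][1], max_clicks),
                reverse=True)
print(ranked)
```

The example shows why the blend matters: the semantically weaker dataset wins here because user behavior strongly favors it, which a purely ontology-based ranker would miss.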
Semantic Integration for Marine Science Interoperability Using Web Technologies
NASA Astrophysics Data System (ADS)
Rueda, C.; Bermudez, L.; Graybeal, J.; Isenor, A. W.
2008-12-01
The Marine Metadata Interoperability Project, MMI (http://marinemetadata.org) promotes the exchange, integration, and use of marine data through enhanced data publishing, discovery, documentation, and accessibility. A key effort is the definition of an Architectural Framework and Operational Concept for Semantic Interoperability (http://marinemetadata.org/sfc), which is complemented with the development of tools that realize critical use cases in semantic interoperability. In this presentation, we describe a set of such Semantic Web tools that allow performing important interoperability tasks, ranging from the creation of controlled vocabularies and the mapping of terms across multiple ontologies, to the online registration, storage, and search services needed to work with the ontologies (http://mmisw.org). This set of services uses Web standards and technologies, including Resource Description Framework (RDF), Web Ontology Language (OWL), Web services, and toolkits for Rich Internet Application development. We will describe the following components: MMI Ontology Registry: The MMI Ontology Registry and Repository provides registry and storage services for ontologies. Entries in the registry are associated with projects defined by the registered users. Also, sophisticated search functions, for example according to metadata items and vocabulary terms, are provided. Client applications can submit search requests using the W3C SPARQL Query Language for RDF. Voc2RDF: This component converts an ASCII comma-delimited set of terms and definitions into an RDF file. Voc2RDF facilitates the creation of controlled vocabularies by using a simple form-based user interface. Created vocabularies and their descriptive metadata can be submitted to the MMI Ontology Registry for versioning and community access. VINE: The Vocabulary Integration Environment component allows the user to map vocabulary terms across multiple ontologies.
Various relationships can be established, for example exactMatch, narrowerThan, and subClassOf. VINE can compute inferred mappings based on the given associations. Attributes about each mapping, like comments and a confidence level, can also be included. VINE also supports registering and storing resulting mapping files in the Ontology Registry. The presentation will describe the application of semantic technologies in general, and our planned applications in particular, to solve data management problems in the marine and environmental sciences.
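The Voc2RDF conversion described above (comma-delimited terms and definitions into RDF) can be sketched in a few lines. This is an illustrative reconstruction, not the MMI tool's actual code: the namespace URI is an invented placeholder, and the choice of `rdfs:label` and `skos:definition` as target properties is an assumption.

```python
# Minimal Voc2RDF-style sketch: turn a comma-delimited vocabulary
# (term,definition) into RDF triples serialized as N-Triples.
# BASE is a hypothetical namespace, not the real MMI one.
import csv
import io

BASE = "http://example.org/vocab/"  # hypothetical namespace

def voc2rdf(csv_text: str) -> str:
    """Convert 'term,definition' rows into N-Triples text."""
    triples = []
    for term, definition in csv.reader(io.StringIO(csv_text)):
        subject = f"<{BASE}{term.strip().replace(' ', '_')}>"
        # rdfs:label carries the term; skos:definition carries its meaning
        triples.append(f'{subject} <http://www.w3.org/2000/01/rdf-schema#label> "{term.strip()}" .')
        triples.append(f'{subject} <http://www.w3.org/2004/02/skos/core#definition> "{definition.strip()}" .')
    return "\n".join(triples)

sample = "salinity,Measure of dissolved salts in seawater\ntemperature,Degree of heat in the water column"
print(voc2rdf(sample))
```

The resulting N-Triples text is exactly the kind of artifact that could then be submitted to an ontology registry for versioning and community access.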
NASA Astrophysics Data System (ADS)
Doerr, Martin; Freitas, Fred; Guizzardi, Giancarlo; Han, Hyoil
Ontology is a cross-disciplinary field concerned with the study of concepts and theories that can be used for representing shared conceptualizations of specific domains. Ontological Engineering is a discipline in computer and information science concerned with the development of techniques, methods, languages and tools for the systematic construction of concrete artifacts capturing these representations, i.e., models (e.g., domain ontologies) and metamodels (e.g., upper-level ontologies). In recent years, there has been a growing interest in the application of formal ontology and ontological engineering to solve modeling problems in diverse areas in computer science such as software and data engineering, knowledge representation, natural language processing, information science, among many others.
An Onto-Semiotic Analysis of Combinatorial Problems and the Solving Processes by University Students
ERIC Educational Resources Information Center
Godino, Juan D.; Batanero, Carmen; Roa, Rafael
2005-01-01
In this paper we describe an ontological and semiotic model for mathematical knowledge, using elementary combinatorics as an example. We then apply this model to analyze the solving process of some combinatorial problems by students with high mathematical training, and show its utility in providing a semiotic explanation for the difficulty of…
The Development of Ontology from Multiple Databases
NASA Astrophysics Data System (ADS)
Kasim, Shahreen; Aswa Omar, Nurul; Fudzee, Mohd Farhan Md; Azhar Ramli, Azizul; Aizi Salamat, Mohamad; Mahdin, Hairulnizam
2017-08-01
The halal industry is the fastest growing global business across the world. The halal food industry is thus crucial for Muslims all over the world, as it serves to assure them that the food items they consume daily are syariah compliant. Currently, ontology is widely used in computer science areas such as heterogeneous information processing on the web, the semantic web, and information retrieval. However, ontology has still not been used widely in the halal industry. Today, the Muslim community still has difficulty verifying the halal status of products in the market, especially foods containing E numbers. This research attempts to solve the problem of validating halal status from various halal sources. Various chemical ontologies from multiple databases were found to support this ontology development. The E numbers in this chemical ontology are codes for chemicals that can be used as food additives. With this E-number ontology, the Muslim community can effectively identify and verify the halal status of products in the market.
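The core verification task the abstract describes, checking an E number against several source ontologies and handling disagreement, can be sketched as follows. This is a hedged illustration under invented data; the source names and statuses are hypothetical, not drawn from the paper's actual databases.

```python
# Illustrative sketch (not the authors' actual ontology): verify the halal
# status of an E-number additive by consulting multiple source databases
# and flagging conflicts for expert review.
E_SOURCES = {  # hypothetical data from two halal databases
    "sourceA": {"E100": "halal", "E120": "haram", "E471": "doubtful"},
    "sourceB": {"E100": "halal", "E120": "haram", "E471": "halal"},
}

def verify_status(e_number: str) -> str:
    statuses = {src[e_number] for src in E_SOURCES.values() if e_number in src}
    if not statuses:
        return "unknown"
    if len(statuses) == 1:
        return statuses.pop()
    return "conflict"  # sources disagree; needs expert resolution

print(verify_status("E100"))  # sources agree
print(verify_status("E471"))  # sources disagree
```

An ontology-backed system would replace the flat dictionaries with concept lookups, but the merge-and-reconcile logic stays the same.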
Evidence-based ergonomics: a model and conceptual structure proposal.
Silveira, Dierci Marcio
2012-01-01
In Human Factors and Ergonomics Science (HFES), it is difficult to identify the best approach to tackle the workplace and systems design problems that need to be solved, and the question "How to solve the human factors and ergonomics problems that are identified?" has been advocated as both transdisciplinary and multidisciplinary. The proposition of this study is to combine the theoretical approach of Sustainability Science, the taxonomy of the Human Factors and Ergonomics (HFE) discipline, and the framework of Evidence-Based Medicine, in an attempt to apply them to Human Factors and Ergonomics. Applications of ontologies are known in the fields of medical research and computer science. By scrutinizing the key requirements for the structuring of HFES knowledge, a reference model was designed. First, the important requirements for HFES concept structuring, as regarded by Meister, were identified. Second, an evidence-based ergonomics framework was developed as a reference model composed of six levels based on these requirements. Third, a mapping tool using linguistic resources was devised to translate human work, systems environments and the complexities inherent in their hierarchical relationships, to support future development at Level 2 of the reference model and to meet the two major challenges for HFES: identifying what problems should be addressed in HFE as an autonomous science, and proposing solutions by integrating the concepts and methods applied in HFES for those problems.
Bratsas, Charalampos; Koutkias, Vassilis; Kaimakamis, Evangelos; Bamidis, Panagiotis; Maglaveras, Nicos
2007-01-01
Medical Computational Problem (MCP) solving is related to medical problems and their computerized algorithmic solutions. In this paper, an extension of an ontology-based model to fuzzy logic is presented, as a means to enhance the information retrieval (IR) procedure in semantic management of MCPs. We present herein the methodology followed for the fuzzy expansion of the ontology model, the fuzzy query expansion procedure, as well as an appropriate ontology-based Vector Space Model (VSM) that was constructed for efficient mapping of user-defined MCP search criteria and MCP acquired knowledge. The relevant fuzzy thesaurus is constructed by calculating the simultaneous occurrences of terms and the term-to-term similarities derived from the ontology that utilizes UMLS (Unified Medical Language System) concepts by using Concept Unique Identifiers (CUI), synonyms, semantic types, and broader-narrower relationships for fuzzy query expansion. The current approach constitutes a sophisticated advance for effective, semantics-based MCP-related IR.
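The fuzzy query expansion and vector-space matching described above can be sketched with a toy thesaurus. This is a hedged reconstruction of the general technique, not the authors' system: the terms, similarity weights, and the max-based expansion rule are invented for illustration (a real system would derive them from UMLS concepts and co-occurrence statistics).

```python
# Sketch of fuzzy query expansion over a term-similarity thesaurus:
# query terms are expanded with related terms weighted by similarity,
# then matched against documents with a cosine measure (a simple VSM).
import math

THESAURUS = {  # term -> {related term: similarity in [0, 1]}
    "fever": {"pyrexia": 0.9, "temperature": 0.6},
    "dose": {"dosage": 0.95},
}

def expand(query_terms):
    """Return a weighted query vector including fuzzy-related terms."""
    weights = {t: 1.0 for t in query_terms}
    for t in query_terms:
        for rel, sim in THESAURUS.get(t, {}).items():
            weights[rel] = max(weights.get(rel, 0.0), sim)
    return weights

def cosine(q, d):
    common = set(q) & set(d)
    num = sum(q[t] * d[t] for t in common)
    den = (math.sqrt(sum(v * v for v in q.values()))
           * math.sqrt(sum(v * v for v in d.values())))
    return num / den if den else 0.0

# a document using synonyms only: the raw query would miss it entirely
doc = {"pyrexia": 1.0, "dosage": 1.0}
print(cosine({"fever": 1.0, "dose": 1.0}, doc))  # no overlap without expansion
print(cosine(expand(["fever", "dose"]), doc))    # nonzero after expansion
```

The point of the expansion is visible in the two scores: the unexpanded query shares no terms with the document, while the expanded one retrieves it.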
Design of an ontology for medical image manipulation: an example applied for DICOM extensions
NASA Astrophysics Data System (ADS)
Aubry, Florent; Chameroy, Virginie; Todd-Pokropek, Andrew; Di Paola, Robert
1999-07-01
Currently, various data formats are widely used for medical images, e.g. DICOM for exchange through networks and storage media, and INTERFILE for image exchange in nuclear medicine. These formats are only partly able to solve the problems that arise in accessing and handling images. To solve such problems, an ontology dedicated to the description of the data and knowledge involved in the handling and management of medical images has been designed. The ontology offers a semantic frame of reference to which manipulation tools can refer. It considers various points of view on the data, related to the context of production, the content, and the data quality. It supports several levels of abstraction, going from a declarative level related to the examination type to the implementation level. Moreover, the ontology provides mechanisms allowing the creation and description of new entities. It can thus act as an intermediate language ensuring accurate reuse of the entities. This paper, which presents work in progress, focuses on the description of the ontology and points out how to use it for the description of and access to DICOM or INTERFILE entities, and for the extension of the DICOM or INTERFILE dictionaries by adding new entities, in order to describe complex relationships between images.
The Cyclic Nature of Problem Solving: An Emergent Multidimensional Problem-Solving Framework
ERIC Educational Resources Information Center
Carlson, Marilyn P.; Bloom, Irene
2005-01-01
This paper describes the problem-solving behaviors of 12 mathematicians as they completed four mathematical tasks. The emergent problem-solving framework draws on the large body of research, as grounded by and modified in response to our close observations of these mathematicians. The resulting "Multidimensional Problem-Solving Framework" has four…
Toward Solving the Problem of Problem Solving: An Analysis Framework
ERIC Educational Resources Information Center
Roesler, Rebecca A.
2016-01-01
Teaching is replete with problem solving. Problem solving as a skill, however, is seldom addressed directly within music teacher education curricula, and research in music education has not examined problem solving systematically. A framework detailing problem-solving component skills would provide a needed foundation. I observed problem solving…
Prediction of protein-protein interaction network using a multi-objective optimization approach.
Chowdhury, Archana; Rakshit, Pratyusha; Konar, Amit
2016-06-01
Protein-Protein Interactions (PPIs) are very important as they coordinate almost all cellular processes. This paper attempts to formulate PPI prediction problem in a multi-objective optimization framework. The scoring functions for the trial solution deal with simultaneous maximization of functional similarity, strength of the domain interaction profiles, and the number of common neighbors of the proteins predicted to be interacting. The above optimization problem is solved using the proposed Firefly Algorithm with Nondominated Sorting. Experiments undertaken reveal that the proposed PPI prediction technique outperforms existing methods, including gene ontology-based Relative Specific Similarity, multi-domain-based Domain Cohesion Coupling method, domain-based Random Decision Forest method, Bagging with REP Tree, and evolutionary/swarm algorithm-based approaches, with respect to sensitivity, specificity, and F1 score.
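The nondominated-sorting idea underlying the proposed Firefly Algorithm variant can be shown in miniature. This is a generic sketch of Pareto dominance, not the paper's algorithm; the objective tuples below are invented, standing in for (functional similarity, domain-profile strength, common neighbors), all maximized.

```python
# Pareto dominance and first-front extraction, the core of nondominated
# sorting: a solution survives if no other is at least as good on every
# objective and strictly better on at least one (maximization assumed).
def dominates(a, b):
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def first_front(solutions):
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

# illustrative objective values for three candidate PPI predictions
pop = [(0.9, 0.4, 3), (0.5, 0.8, 5), (0.4, 0.3, 2)]
print(first_front(pop))
```

The third candidate is dominated by both of the others, so the first front keeps only the two trade-off solutions; a full multi-objective optimizer repeats this ranking across successive fronts.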
Best behaviour? Ontologies and the formal description of animal behaviour.
Gkoutos, Georgios V; Hoehndorf, Robert; Tsaprouni, Loukia; Schofield, Paul N
2015-10-01
The development of ontologies for describing animal behaviour has proved to be one of the most difficult of all scientific knowledge domains. Ranging from neurological processes to human emotions, the range and scope needed for such ontologies is highly challenging, but if data integration and computational tools such as automated reasoning are to be fully applied in this important area the underlying principles of these ontologies need to be better established and development needs detailed coordination. Whilst the state of scientific knowledge is always paramount in ontology and formal description framework design, this is a particular problem with neurobehavioural ontologies where our understanding of the relationship between behaviour and its underlying biophysical basis is currently in its infancy. In this commentary, we discuss some of the fundamental problems in designing and using behaviour ontologies, and present some of the best developed tools in this domain.
Workflow Agents vs. Expert Systems: Problem Solving Methods in Work Systems Design
NASA Technical Reports Server (NTRS)
Clancey, William J.; Sierhuis, Maarten; Seah, Chin
2009-01-01
During the 1980s, a community of artificial intelligence researchers became interested in formalizing problem solving methods as part of an effort called "second generation expert systems" (2nd GES). How do the motivations and results of this research relate to building tools for the workplace today? We provide an historical review of how the theory of expertise has developed, a progress report on a tool for designing and implementing model-based automation (Brahms), and a concrete example of how we apply 2nd GES concepts today in an agent-based system for space flight operations (OCAMS). Brahms incorporates an ontology for modeling work practices, what people are doing in the course of a day, characterized as "activities." OCAMS was developed using a simulation-to-implementation methodology, in which a prototype tool was embedded in a simulation of future work practices. OCAMS uses model-based methods to interactively plan its actions and keep track of the work to be done. The problem solving methods of practice are interactive, employing reasoning for and through action in the real world. Analogously, it is as if a medical expert system were charged not just with interpreting culture results, but actually interacting with a patient. Our perspective shifts from building a "problem solving" (expert) system to building an actor in the world. The reusable components in work system designs include entire "problem solvers" (e.g., a planning subsystem), interoperability frameworks, and workflow agents that use and revise models dynamically in a network of people and tools. Consequently, the research focus shifts so "problem solving methods" include ways of knowing that models do not fit the world, and ways of interacting with other agents and people to gain or verify information and (ultimately) adapt rules and procedures to resolve problematic situations.
A unified framework for managing provenance information in translational research
2011-01-01
Background A critical aspect of the NIH Translational Research roadmap, which seeks to accelerate the delivery of "bench-side" discoveries to patient's "bedside," is the management of the provenance metadata that keeps track of the origin and history of data resources as they traverse the path from the bench to the bedside and back. A comprehensive provenance framework is essential for researchers to verify the quality of data, reproduce scientific results published in peer-reviewed literature, validate scientific process, and associate trust value with data and results. Traditional approaches to provenance management have focused on only partial sections of the translational research life cycle and they do not incorporate "domain semantics", which is essential to support domain-specific querying and analysis by scientists. Results We identify a common set of challenges in managing provenance information across the pre-publication and post-publication phases of data in the translational research lifecycle. We define the semantic provenance framework (SPF), underpinned by the Provenir upper-level provenance ontology, to address these challenges in the four stages of provenance metadata: (a) Provenance collection - during data generation (b) Provenance representation - to support interoperability, reasoning, and incorporate domain semantics (c) Provenance storage and propagation - to allow efficient storage and seamless propagation of provenance as the data is transferred across applications (d) Provenance query - to support queries with increasing complexity over large data size and also support knowledge discovery applications We apply the SPF to two exemplar translational research projects, namely the Semantic Problem Solving Environment for Trypanosoma cruzi (T.cruzi SPSE) and the Biomedical Knowledge Repository (BKR) project, to demonstrate its effectiveness. 
Conclusions The SPF provides a unified framework to effectively manage provenance of translational research data during pre and post-publication phases. This framework is underpinned by an upper-level provenance ontology called Provenir that is extended to create domain-specific provenance ontologies to facilitate provenance interoperability, seamless propagation of provenance, automated querying, and analysis. PMID:22126369
A Consideration of Quality-Attribute-Property for Interoperability of Quality Data
NASA Astrophysics Data System (ADS)
Tarumi, Shinya; Kozaki, Kouji; Kitamura, Yoshinobu; Mizoguchi, Riichiro
Descriptions of attribute and quality are essential elements in ontology development. Needless to say, science data are descriptions of attributes of target things, and it is an important role of ontology to support the validity of, and interoperability between, such descriptions. Although some upper ontologies such as DOLCE, BFO, etc. have already been developed and are extensively used, a careful examination reveals some room for improvement. While each ontology covers quality and quantity, mutual interchangeability among these ontologies is not considered, because each has been designed with the intent of developing a "correct" ontology of quality and quantity. Furthermore, owing to the variety of ways of describing data, no single ontology can cover all existing scientific data. In this paper, we investigate "quality" and "value" from an ontological viewpoint and propose a conceptual framework to deal with the attributes, properties and qualities appearing in existing data descriptions in the nanotechnology domain. This framework can be considered a reference ontology for describing quality with existing upper ontologies. Furthermore, on the basis of the results of this consideration, we evaluate and refine a conceptual hierarchy of materials functions built by nanomaterials researchers. Through the evaluation process, we discuss the effect of defining a conceptual framework for building and refining ontologies. Such conceptual consideration of quality and value is not only a problem in the nanomaterials domain but also a first step toward an intelligent sharing of scientific data in e-Science.
Module Extraction for Efficient Object Queries over Ontologies with Large ABoxes
Xu, Jia; Shironoshita, Patrick; Visser, Ubbo; John, Nigel; Kabuka, Mansur
2015-01-01
The extraction of logically-independent fragments out of an ontology ABox can be useful for solving the tractability problem of querying ontologies with large ABoxes. In this paper, we propose a formal definition of an ABox module, such that it guarantees complete preservation of facts about a given set of individuals, and thus can be reasoned independently w.r.t. the ontology TBox. With ABox modules of this type, isolated or distributed (parallel) ABox reasoning becomes feasible, and more efficient data retrieval from ontology ABoxes can be attained. To compute such an ABox module, we present a theoretical approach and also an approximation for SHIQ ontologies. Evaluation of the module approximation on different types of ontologies shows that, on average, extracted ABox modules are significantly smaller than the entire ABox, and the time for ontology reasoning based on ABox modules can be improved significantly. PMID:26848490
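The intuition behind an ABox module, a fragment that preserves all facts about a chosen set of individuals, can be illustrated with a naive reachability sketch. This is a hedged simplification, not the paper's SHIQ-aware algorithm: it ignores the TBox entirely and simply follows role assertions outward from the signature individuals.

```python
# Naive sketch of ABox module extraction: starting from a signature of
# individuals, follow role assertions to collect every reachable
# individual, then keep the assertions that mention them. A real module
# extractor must also account for TBox axioms; this only shows the idea.
ABOX = [  # (individual, role, filler) or (individual, "type", concept)
    ("alice", "type", "Patient"),
    ("alice", "hasDoctor", "bob"),
    ("bob", "type", "Physician"),
    ("carol", "type", "Patient"),  # unrelated to alice's module
]

def extract_module(signature):
    reachable, frontier = set(signature), set(signature)
    while frontier:
        frontier = {o for s, r, o in ABOX
                    if s in reachable and r != "type" and o not in reachable}
        reachable |= frontier
    return [f for f in ABOX if f[0] in reachable]

module = extract_module({"alice"})
print(module)
```

Because the unrelated individual is excluded, reasoning over the module alone touches far less data than reasoning over the full ABox, which is the efficiency gain the abstract reports.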
Studies on Experimental Ontology and Knowledge Service Development in Bio-Environmental Engineering
NASA Astrophysics Data System (ADS)
Zhang, Yunliang
2018-01-01
The existing domain-related ontologies and information service patterns are analyzed, and the main problems faced by experimental-scheme knowledge services are clarified. An ontology framework model for the knowledge service of Bio-environmental Engineering is proposed, covering experimental materials, experimental conditions and experimental instruments, and this ontology is combined with existing knowledge organization systems to organize scientific and technological literature, data and experimental schemes. With similarity and priority calculations, the framework can improve research in the related domain.
Next generation data harmonization
NASA Astrophysics Data System (ADS)
Armstrong, Chandler; Brown, Ryan M.; Chaves, Jillian; Czerniejewski, Adam; Del Vecchio, Justin; Perkins, Timothy K.; Rudnicki, Ron; Tauer, Greg
2015-05-01
Analysts are presented with a never-ending stream of data sources. Often, the subsets of data sources needed to solve a problem are easily identified, but the process of aligning the data sets is time consuming. Semantic technologies, however, allow for fast harmonization of data to overcome these problems. These include ontologies that serve as alignment targets, visual tools and natural language processing that generate semantic graphs in terms of the ontologies, and analytics that leverage these graphs. This research reviews a developed prototype that employs all of these approaches to perform analysis across disparate data sources documenting violent, extremist events.
ERIC Educational Resources Information Center
Artzt, Alice F.; Armour-Thomas, Eleanor
The roles of cognition and metacognition were examined in the mathematical problem-solving behaviors of students as they worked in small groups. As an outcome, a framework that links the literature of cognitive science and mathematical problem solving was developed for protocol analysis of mathematical problem solving. Within this framework, each…
Replacing missing values using trustworthy data values from web data sources
NASA Astrophysics Data System (ADS)
Izham Jaya, M.; Sidi, Fatimah; Mat Yusof, Sharmila; Suriani Affendey, Lilly; Ishak, Iskandar; Jabar, Marzanah A.
2017-09-01
In practice, collected data are usually incomplete and contain missing values. Existing approaches to managing missing values overlook the importance of trustworthy data values in replacing them. Given that trusted, complete data are very important in data analysis, we propose a framework for missing-value replacement using trustworthy data values from web data sources. The proposed framework adopts an ontology to map data values from web data sources to the incomplete dataset. As data from the web may conflict with each other, we propose a trust score measurement based on data accuracy and data reliability. The trust score is then used to select trustworthy data values from web data sources for missing-value replacement. We implemented the proposed framework using a financial dataset and present the findings in this paper. Our experiment shows that replacing missing values with trustworthy data values is important, especially in cases of conflicting data, to solve the missing-value problem.
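The trust-score selection step can be sketched directly. This is an assumed formulation, not the paper's exact measurement: the equal weighting of accuracy and reliability and the candidate values below are invented for illustration.

```python
# Hedged sketch of trust-scored missing-value replacement: combine each
# web source's data accuracy and reliability into one score, then fill
# the missing field with the candidate from the most trusted source.
def trust_score(accuracy: float, reliability: float, w: float = 0.5) -> float:
    # w balances accuracy against reliability; 0.5 is an assumption
    return w * accuracy + (1 - w) * reliability

candidates = [  # hypothetical conflicting values for one missing field
    {"value": 102.5, "accuracy": 0.9, "reliability": 0.7},
    {"value": 98.1, "accuracy": 0.6, "reliability": 0.95},
]
best = max(candidates, key=lambda c: trust_score(c["accuracy"], c["reliability"]))
print(best["value"])
```

Here the first source wins (score 0.80 vs. 0.775), so its value is used for the replacement even though the second source is more reliable in isolation.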
Artificial intelligence and the future.
Clocksin, William F
2003-08-15
We consider some of the ideas influencing current artificial-intelligence research and outline an alternative conceptual framework that gives priority to social relationships as a key component and constructor of intelligent behaviour. The framework starts from Weizenbaum's observation that intelligence manifests itself only relative to specific social and cultural contexts. This is in contrast to a prevailing view, which sees intelligence as an abstract capability of the individual mind based on a mechanism for rational thought. The new approach is not based on the conventional idea that the mind is a rational processor of symbolic information, nor does it require the idea that thought is a kind of abstract problem solving with a semantics that is independent of its embodiment. Instead, priority is given to affective and social responses that serve to engage the whole agent in the life of the communities in which it participates. Intelligence is seen not as the deployment of capabilities for problem solving, but as constructed by the continual, ever-changing and unfinished engagement with the social group within the environment. The construction of the identity of the intelligent agent involves the appropriation or 'taking up' of positions within the conversations and narratives in which it participates. Thus, the new approach argues that the intelligent agent is shaped by the meaning ascribed to experience, by its situation in the social matrix, and by practices of self and of relationship into which intelligent life is recruited. This has implications for the technology of the future, as, for example, classic artificial intelligence models such as goal-directed problem solving are seen as special cases of narrative practices instead of as ontological foundations.
[Analysis of health terminologies for use as ontologies in healthcare information systems].
Romá-Ferri, Maria Teresa; Palomar, Manuel
2008-01-01
Ontologies are a resource that allow the concept of meaning to be represented informatically, thus avoiding the limitations imposed by standardized terms. The objective of this study was to establish the extent to which terminologies could be used for the design of ontologies, which could serve as an aid in resolving problems such as semantic interoperability and knowledge reusability in healthcare information systems. To determine the extent to which terminologies could be used as ontologies, six of the most important terminologies in clinical, epidemiologic, documentation and administrative-economic contexts were analyzed. The following characteristics were verified: conceptual coverage, hierarchical structure, conceptual granularity of the categories, conceptual relations, and the language used for conceptual representation. MeSH, DeCS and UMLS ontologies were considered lightweight. The main differences among these ontologies concern conceptual specification, the types of relation and the restrictions among the associated concepts. SNOMED and GALEN ontologies have declaratory formalism, based on logical descriptions. These ontologies include explicit qualities and show greater restrictions among associated concepts and rule combinations and were consequently considered as heavyweight. Analysis of the declared representation of the terminologies shows the extent to which they could be reused as ontologies. Their degree of usability depends on whether the aim is for healthcare information systems to solve problems of semantic interoperability (lightweight ontologies) or to reuse the systems' knowledge as an aid to decision making (heavyweight ontologies) and for non-structured information retrieval, extraction, and classification.
Scalable software architectures for decision support.
Musen, M A
1999-12-01
Interest in decision-support programs for clinical medicine soared in the 1970s. Since that time, workers in medical informatics have been particularly attracted to rule-based systems as a means of providing clinical decision support. Although developers have built many successful applications using production rules, they also have discovered that creation and maintenance of large rule bases is quite problematic. In the 1980s, several groups of investigators began to explore alternative programming abstractions that can be used to build decision-support systems. As a result, the notions of "generic tasks" and of reusable problem-solving methods became extremely influential. By the 1990s, academic centers were experimenting with architectures for intelligent systems based on two classes of reusable components: (1) problem-solving methods--domain-independent algorithms for automating stereotypical tasks--and (2) domain ontologies that captured the essential concepts (and relationships among those concepts) in particular application areas. This paper highlights how developers can construct large, maintainable decision-support systems using these kinds of building blocks. The creation of domain ontologies and problem-solving methods is the fundamental end product of basic research in medical informatics. Consequently, these concepts need more attention by our scientific community.
Problem Solving Frameworks for Mathematics and Software Development
ERIC Educational Resources Information Center
McMaster, Kirby; Sambasivam, Samuel; Blake, Ashley
2012-01-01
In this research, we examine how problem solving frameworks differ between Mathematics and Software Development. Our methodology is based on the assumption that the words used frequently in a book indicate the mental framework of the author. We compared word frequencies in a sample of 139 books that discuss problem solving. The books were grouped…
OntoPop: An Ontology Population System for the Semantic Web
NASA Astrophysics Data System (ADS)
Thongkrau, Theerayut; Lalitrojwong, Pattarachai
The development of an ontology at the instance level requires the extraction of the terms defining the instances from various data sources. These instances are then linked to the concepts of the ontology, and relationships are created between the instances in the next step. However, before establishing links among data, ontology engineers must classify terms or instances from a web document into an ontology concept. The tool that helps ontology engineers in this task is called an ontology population system. Existing research is not well suited to ontology development applications, suffering from long processing times or difficulty analyzing large or noisy data sets. The OntoPop system introduces a methodology to solve these problems, which comprises two parts. First, we select meaningful features from syntactic relations, which can produce more significant features than other methods. Second, we differentiate feature meanings and reduce noise based on latent semantic analysis. Experimental evaluation demonstrates that OntoPop works well, achieving an accuracy of 49.64%, a learning accuracy of 76.93%, and an execution time of 5.46 seconds per instance.
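The latent semantic analysis step, projecting instance feature vectors into a low-rank space so that related features merge and noise is damped, can be sketched with a truncated SVD. This is a generic illustration of the technique, not OntoPop's code; the toy co-occurrence counts are invented.

```python
# Illustrative LSA step: a truncated SVD of a term-feature co-occurrence
# matrix places instances that share syntactic-relation features close
# together in a low-dimensional latent space.
import numpy as np

# rows: candidate instances; columns: syntactic-relation features (toy counts)
X = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2  # keep the two strongest latent dimensions
X_reduced = U[:, :k] * s[:k]  # instance vectors in latent space

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# instances 0 and 1 share features, so they collapse together in latent
# space, while instance 2 stays orthogonal to both
print(cos(X_reduced[0], X_reduced[1]))
print(cos(X_reduced[0], X_reduced[2]))
```

Classifying an instance into a concept then reduces to comparing latent vectors rather than raw, noisy feature counts.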
NASA Astrophysics Data System (ADS)
Batzias, Dimitris F.; Ifanti, Konstantina
2012-12-01
Process simulation models are usually empirical, so there is an inherent difficulty in their serving as carriers for knowledge acquisition and technology transfer, since their parameters have no physical meaning to facilitate verification of the dependence on the production conditions; in such a case, a 'black box' regression model or a neural network might be used to simply connect input-output characteristics. In several cases, scientific/mechanismic models may prove valid, in which case parameter identification is required to find out the independent/explanatory variables and parameters on which each parameter depends. This is a difficult task, since the phenomenological level at which each parameter is defined is different. In this paper, we have developed a methodological framework in the form of an algorithmic procedure to solve this problem. The main parts of this procedure are: (i) stratification of relevant knowledge in discrete layers immediately adjacent to the layer that the initial model under investigation belongs to, (ii) design of the ontology corresponding to these layers, (iii) elimination of the less relevant parts of the ontology by thinning, (iv) retrieval of the stronger interrelations between the remaining nodes within the revised ontological network, and (v) parameter identification taking into account the most influential interrelations revealed in (iv). The functionality of this methodology is demonstrated with two representative case examples on wastewater treatment.
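Steps (iii) and (iv) of the procedure, thinning the ontological network and then ranking the strongest remaining interrelations, admit a compact sketch. The node names, weights, and threshold below are invented for illustration; the paper's thinning criterion is more elaborate than a single cutoff.

```python
# Hedged sketch of ontology thinning: drop links whose relevance weight
# falls below a threshold, then rank the surviving interrelations so the
# most influential ones feed the parameter-identification step.
edges = {  # (node_a, node_b) -> relevance weight (illustrative values)
    ("pH", "removal_rate"): 0.9,
    ("temperature", "removal_rate"): 0.7,
    ("colour", "removal_rate"): 0.2,
    ("pH", "temperature"): 0.4,
}

def thin(graph, threshold):
    return {pair: w for pair, w in graph.items() if w >= threshold}

strongest = sorted(thin(edges, 0.5).items(), key=lambda kv: -kv[1])
print([pair for pair, _ in strongest])
```

After thinning, only the pH and temperature links to the removal rate survive, so those are the interrelations a subsequent parameter identification would take into account.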
NASA Astrophysics Data System (ADS)
Kuo, Eric; Hallinen, Nicole R.; Conlin, Luke D.
2017-05-01
One aim of school science instruction is to help students become adaptive problem solvers. Though successful at structuring novice problem solving, step-by-step problem-solving frameworks may also constrain students' thinking. This study utilises a paradigm established by Heckler [(2010). Some consequences of prompting novice physics students to construct force diagrams. International Journal of Science Education, 32(14), 1829-1851] to test how cuing the first step in a standard framework affects undergraduate students' approaches and evaluation of solutions in physics problem solving. Specifically, prompting the construction of a standard diagram before problem solving increases the use of standard procedures, decreasing the use of a conceptual shortcut. Providing a diagram prompt also lowers students' ratings of informal approaches to similar problems. These results suggest that reminding students to follow typical problem-solving frameworks limits their views of what counts as good problem solving.
A Problem-Solving Conceptual Framework and Its Implications in Designing Problem-Posing Tasks
ERIC Educational Resources Information Center
Singer, Florence Mihaela; Voica, Cristian
2013-01-01
The links between the mathematical and cognitive models that interact during problem solving are explored with the purpose of developing a reference framework for designing problem-posing tasks. When the process of solving is a successful one, a solver successively changes his/her cognitive stances related to the problem via transformations that…
Anatomics: the intersection of anatomy and bioinformatics
Bard, Jonathan BL
2005-01-01
Computational resources are now using the tissue names of the major model organisms so that tissue-associated data can be archived in and retrieved from databases on the basis of developing and adult anatomy. For this to be done, the set of tissues in that organism (its anatome) has to be organized in a way that is computer-comprehensible. Indeed, such formalization is a necessary part of what is becoming known as systems biology, in which explanations of high-level biological phenomena are not only sought in terms of lower-level events, but are articulated within a computational framework. Lists of tissue names alone, however, turn out to be inadequate for this formalization because tissue organization is essentially hierarchical and thus cannot easily be put into tables, the natural format of relational databases. The solution now adopted is to organize the anatomy of each organism as a hierarchy of tissue names and linking relationships (e.g. the tibia is PART OF the leg, the tibia IS-A bone) within what are known as ontologies. In these, a unique ID is assigned to each tissue and this can be used within, for example, gene-expression databases to link data to tissue organization, and also used to query other data sources (interoperability), while inferences about the anatomy can be made within the ontology on the basis of the relationships. There are now about 15 such anatomical ontologies, many of which are linked to organism databases; these ontologies are now publicly available at the Open Biological Ontologies website (http://obo.sourceforge.net) from where they can be freely downloaded and viewed using standard tools. This review considers how anatomy is formalized within ontologies, together with the problems that have had to be solved for this to be done. It is suggested that the appropriate term for the analysis, computer formulation and use of the anatome is anatomics. PMID:15679867
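The hierarchy-plus-relationships organization described above can be sketched as a tiny ontology in which each tissue has a unique ID and typed links, so a query such as "what is the tibia PART OF?" follows a relation transitively. The IDs below are invented; real anatomical ontologies use curated identifiers:

```python
# Each tissue ID maps to (name, {relation: parent_id}); IDs are invented.
ONTOLOGY = {
    "T1": ("leg",   {}),
    "T2": ("bone",  {}),
    "T3": ("tibia", {"PART_OF": "T1", "IS_A": "T2"}),
}

def ancestors(tid, relation):
    """Follow one relation transitively, e.g. every structure
    that the tibia is PART_OF."""
    out = []
    while relation in ONTOLOGY[tid][1]:
        tid = ONTOLOGY[tid][1][relation]
        out.append(ONTOLOGY[tid][0])
    return out

print(ancestors("T3", "PART_OF"))  # -> ['leg']
print(ancestors("T3", "IS_A"))     # -> ['bone']
```

Because data elsewhere (e.g. gene-expression records) reference only the stable IDs, inferences like these can be made without touching the linked databases.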
NASA Astrophysics Data System (ADS)
Fasni, N.; Turmudi, T.; Kusnandi, K.
2017-09-01
The background of this research is the importance of students' problem-solving abilities. The purpose of this study is to find out whether there are differences in the ability to solve mathematical problems between students who learned mathematics using Ang's Framework for Mathematical Modelling Instruction (AFFMMI) and students who learned using a scientific approach (SA). The method used in this research is a quasi-experimental method with a pretest-posttest control group design. Mathematical problem-solving ability was analyzed using an independent samples test. The results showed that there was a difference in the ability to solve mathematical problems between students who received learning with Ang's Framework for Mathematical Modelling Instruction and students who received learning with a scientific approach. AFFMMI focuses on mathematical modeling; this modeling allows students to solve problems, and its use is able to improve problem-solving ability.
Development and Evaluation of a Low Fertility Ontology for Analyzing Social Data in Korea.
Lee, Ji-Hyun; Park, Hyeoun-Ae; Song, Tae-Min
2016-01-01
The purpose of this study is to develop a low fertility ontology for collecting and analyzing social data. A low fertility ontology was developed according to Ontology Development 101 and formally represented using Protégé. The content coverage of the ontology was evaluated using 1,387 narratives posted by the public and 63 narratives posted by public servants. Six super-classes of the ontology were developed based on Bronfenbrenner's ecological system theory, with an individual in the center and environmental systems impacting that individual as surroundings. In total, 568 unique concepts were extracted from the narratives. Of these, 424 (74.6%) were lexically or semantically mapped to ontology concepts and 67 (11.8%) were broadly or narrowly mapped; the remaining 77 (13.6%) were not mapped to any ontology concept. This ontology can be used as a framework to understand low fertility problems using social data in Korea.
ERIC Educational Resources Information Center
Espinosa, Allen A.; Nueva España, Rebecca C.; Marasigan, Arlyne C.
2016-01-01
The present study investigated pre-service chemistry teachers' problem-solving strategies and alternative conceptions in solving stoichiometric problems and, later on, formulated a teaching framework based on the results of the study. The pre-service chemistry teachers were given four stoichiometric problems of increasing complexity, and they need…
NASA Astrophysics Data System (ADS)
Derakhshani, Maaneli
In this thesis, we consider the implications of solving the quantum measurement problem for the Newtonian description of semiclassical gravity. First we review the formalism of the Newtonian description of semiclassical gravity based on standard quantum mechanics---the Schroedinger-Newton theory---and two well-established predictions that come out of it, namely, gravitational 'cat states' and gravitationally-induced wavepacket collapse. Then we review three quantum theories with 'primitive ontologies' that are well known to solve the measurement problem---Schroedinger's many worlds theory, the GRW collapse theory with matter density ontology, and Nelson's stochastic mechanics. We extend the formalisms of these three quantum theories to Newtonian models of semiclassical gravity and evaluate their implications for gravitational cat states and gravitational wavepacket collapse. We find that (1) Newtonian semiclassical gravity based on Schroedinger's many worlds theory is mathematically equivalent to the Schroedinger-Newton theory and makes the same predictions; (2) Newtonian semiclassical gravity based on the GRW theory differs from Schroedinger-Newton only in the use of a stochastic collapse law, but this law allows it to suppress gravitational cat states so as not to be in contradiction with experiment, while allowing for gravitational wavepacket collapse to happen as well; (3) Newtonian semiclassical gravity based on Nelson's stochastic mechanics differs significantly from Schroedinger-Newton, and predicts neither gravitational cat states nor gravitational wavepacket collapse.
Considering that gravitational cat states are experimentally ruled out, but gravitational wavepacket collapse is testable in the near future, this implies that only the latter two are viable theories of Newtonian semiclassical gravity and that they can be experimentally tested against each other in future molecular interferometry experiments that are anticipated to be capable of testing the gravitational wavepacket collapse prediction.
Student Errors in Dynamic Mathematical Environments
ERIC Educational Resources Information Center
Brown, Molly; Bossé, Michael J.; Chandler, Kayla
2016-01-01
This study investigates the nature of student errors in the context of problem solving and Dynamic Math Environments. This led to the development of the Problem Solving Action Identification Framework; this framework captures and defines all activities and errors associated with problem solving in a dynamic math environment. Found are three…
SNMP-SI: A Network Management Tool Based on Slow Intelligence System Approach
NASA Astrophysics Data System (ADS)
Colace, Francesco; de Santo, Massimo; Ferrandino, Salvatore
The last decade has witnessed an intense spread of computer networks, further accelerated by the introduction of wireless networks. This growth has simultaneously and significantly increased the problems of network management. Especially in small companies, where no personnel are assigned to these tasks, the management of such networks is often complex, and malfunctions can have significant impacts on their businesses. A possible solution is the adoption of the Simple Network Management Protocol (SNMP), a standard protocol used to exchange network management information and part of the Transmission Control Protocol/Internet Protocol (TCP/IP) suite. SNMP provides a tool for network administrators to manage network performance, find and solve network problems, and plan for network growth. SNMP has a big disadvantage, however: its simple design means that the information it deals with is neither detailed nor well organized enough to meet expanding modern networking requirements. Over the past years much effort has been devoted to addressing the shortcomings of SNMP, and new frameworks have been developed; a promising approach involves the use of ontology. This is the starting point of this paper, where a novel approach to network management based on Slow Intelligence System methodologies and ontology-based techniques is proposed. A Slow Intelligence System is a general-purpose system characterized by the ability to improve its performance over time through a process involving enumeration, propagation, adaptation, elimination, and concentration. The proposed approach aims to develop a system able to acquire, according to the SNMP standard, information from the various hosts in the managed networks and apply solutions to the problems found. To check the feasibility of this model, first experimental results in a real scenario are shown.
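As a rough illustration (not the paper's implementation), the enumeration/elimination/concentration cycle of a Slow Intelligence System can be sketched as repeatedly pruning a pool of candidate remedies for a reported fault. The remedy names and success rates below are invented, and the propagation and adaptation phases are reduced to simple re-scoring:

```python
def slow_intelligence_cycle(candidates, score, rounds=3):
    """Toy sketch: each round keeps the better-scoring half of the
    candidate solutions, slowly concentrating on the best one."""
    pool = list(candidates)                   # enumeration
    for _ in range(rounds):
        pool.sort(key=score, reverse=True)    # propagation of scores
        pool = pool[:max(1, len(pool) // 2)]  # elimination
    return pool[0]                            # concentration

# Hypothetical remedies for an SNMP-reported fault, scored by past success.
success_rate = {"restart_service": 0.6, "reset_interface": 0.8,
                "reroute_traffic": 0.4, "ignore": 0.1}
best = slow_intelligence_cycle(success_rate, success_rate.get)
print(best)  # -> reset_interface
```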
ENGAGE: A Game Based Learning and Problem Solving Framework
2012-07-13
Progress, status, and management report (Task 1, Month 4) for ENGAGE: A Game Based Learning and Problem Solving Framework, by Popović; reporting period 6/1/2012 – 6/30/2012; contract number N/A. Related events: Gamification Summit 2012; Mensa Colloquium 2012.2: Social and Video Games; Seattle Science Festival; TED Salon Vancouver (http...).
NASA Astrophysics Data System (ADS)
Prather, Edward E.; Wallace, Colin Scott
2018-06-01
We present an instructional framework that allowed a first-time physics instructor to improve students' quantitative problem-solving abilities by more than a letter grade over what was achieved by students in an experienced instructor's course. This instructional framework uses a Think-Pair-Share (TPS) approach to foster collaborative quantitative problem solving during the lecture portion of a large-enrollment introductory calculus-based mechanics course. Through the development of carefully crafted and sequenced TPS questions, we engage students in rich discussions on key problem-solving issues that we typically only hear about when a student comes for help during office hours. Current work in the sophomore E&M course illustrates that this framework is generalizable to classes beyond the introductory level and to topics beyond mechanics.
Ontology Development and Evolution in the Accident Investigation Domain
NASA Technical Reports Server (NTRS)
Carvalho, Robert; Berrios, Dan; Williams, James
2004-01-01
InvestigationOrganizer (IO) is a collaborative semantic web system designed to support the conduct of mishap investigations. IO provides a common repository for a wide range of mishap-related information, allowing investigators to integrate evidence, causal models, and investigation results. IO has been used to support investigations ranging from a small property damage case to the loss of the Space Shuttle Columbia. Through IO's use in these investigations, we have learned significant lessons about the application of ontologies and semantic systems to solving real-world problems. This paper describes the development of the ontology within IO, from the initial development, through its growth in response to user requests during investigations, to the recent work done to control the results of that growth. The paper also describes the lessons learned from this experience and how they may apply to the implementation of future ontologies and semantic systems.
Front-Stage Stars and Backstage Producers: The Role of Judges in Problem-Solving Courts
Portillo, Shannon; Rudes, Danielle; Viglione, Jill; Nelson, Matthew; Taxman, Faye
2012-01-01
In problem-solving courts judges are no longer neutral arbitrators in adversarial justice processes. Instead, judges directly engage with court participants. The movement towards problem-solving court models emerges from a collaborative therapeutic jurisprudence framework. While most scholars argue judges are the central courtroom actors within problem-solving courts, we find judges are the stars front-stage, but play a more supporting role backstage. We use Goffman's front-stage-backstage framework to analyze 350 hours of ethnographic fieldwork within five problem-solving courts. Problem-solving courts are collaborative organizations with shifting leadership, based on forum. Understanding how the roles of courtroom workgroup actors adapt under the new court model is foundational for effective implementation of these justice processes. PMID:23397430
NASA Astrophysics Data System (ADS)
Peckham, S. D.; Kelbert, A.; Rudan, S.; Stoica, M.
2016-12-01
Standardized metadata for models is the key to reliable and greatly simplified coupling in model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System). This model metadata also helps model users to understand the important details that underpin computational models and to compare the capabilities of different models. These details include simplifying assumptions on the physics, governing equations and the numerical methods used to solve them, discretization of space (the grid) and time (the time-stepping scheme), state variables (input or output), and model configuration parameters. This kind of metadata provides a "deep description" of a computational model that goes well beyond other types of metadata (e.g. author, purpose, scientific domain, programming language, digital rights, provenance, execution) and captures the science that underpins a model. While having this kind of standardized metadata for each model in a repository opens up a wide range of exciting possibilities, it is difficult to collect this information, and a carefully conceived "data model" or schema is needed to store it. Automated harvesting and scraping methods can provide some useful information, but they often result in metadata that is inaccurate or incomplete, and this is not sufficient to enable the desired capabilities. In order to address this problem, we have developed a browser-based tool called the MCM Tool (Model Component Metadata) which runs on notebooks, tablets and smart phones. This tool was partially inspired by the TurboTax software, which greatly simplifies the necessary task of preparing tax documents. It allows a model developer or advanced user to provide a standardized, deep description of a computational geoscience model, including hydrologic models. Under the hood, the tool uses a new ontology for models built on the CSDMS Standard Names, expressed as a collection of RDF files (Resource Description Framework).
This ontology is based on core concepts such as variables, objects, quantities, operations, processes and assumptions. The purpose of this talk is to present details of the new ontology and to then demonstrate the MCM Tool for several hydrologic models.
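The "deep description" metadata can be pictured as RDF-style triples queried by pattern. The sketch below uses a plain list of tuples in place of the real RDF files; the model name and variable names imitate the CSDMS Standard Names style but are invented:

```python
# Minimal triple store standing in for the tool's RDF output;
# the model and variable names below are invented examples.
triples = [
    ("Model:TopoFlow", "hasInputVariable",
     "atmosphere_water__precipitation_leq-volume_flux"),
    ("Model:TopoFlow", "hasOutputVariable",
     "land_surface_water__runoff_volume_flux"),
    ("Model:TopoFlow", "usesTimeStepping", "explicit_euler"),
]

def query(subject=None, predicate=None, obj=None):
    """Return every triple matching the pattern (None = wildcard)."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

print(query(predicate="hasOutputVariable"))
```

A coupling framework can use such patterns to check, for example, that one model's output variable matches another model's required input.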
Liyanage, Harshana; Liaw, Siaw-Teng; Kuziemsky, Craig; de Lusignan, Simon
2013-01-01
There is a growing burden of chronic non-communicable disease (CNCD). Managing CNCDs requires the use of multiple sources of health and social care data, and information about coordination and outcomes. Many people with CNCDs have multimorbidity. Problems with data quality exacerbate the challenges of measuring quality and health outcomes, especially where there is multimorbidity. We have developed an ontological toolkit to support research and quality improvement studies in CNCDs using heterogeneous data, with diabetes mellitus as an exemplar. International experts held a workshop meeting, with follow-up discussions and a consensus-building exercise. We generated conceptual statements about problems with a CNCD that ontologies might support, and a generic reference model. There were varying degrees of consensus. We propose a set of tools and a four-step method: (1) identification and specification of data sources; (2) conceptualisation of semantic meaning; (3) determining how available routine data can be used as a measure of the process or outcome of care; (4) formalisation and validation of the final ontology.
Multicriteria analysis of ontologically represented information
NASA Astrophysics Data System (ADS)
Wasielewska, K.; Ganzha, M.; Paprzycki, M.; Bǎdicǎ, C.; Ivanovic, M.; Lirkov, I.
2014-11-01
Our current work concerns the development of a decision support system for the software selection problem. The main idea is to utilize expert knowledge to help the user in selecting the best software / method / computational resource to solve a computational problem. Obviously, this involves multicriterial decision making, and the key open question is which method to choose. The context of the work is provided by the Agents in Grid (AiG) project, where the software selection (and thus multicriterial analysis) is to be realized when all information concerning the problem, the hardware, and the software is ontologically represented. Initially, we considered the Analytical Hierarchy Process (AHP), which is well suited to hierarchical data structures (e.g., those formulated in terms of ontologies). However, due to its well-known shortcomings, we decided to extend our search for the multicriterial analysis method best suited to the problem in question. In this paper we report the results of our search, which involved: (i) TOPSIS (Technique for Order Preference by Similarity to Ideal Solution), (ii) PROMETHEE, and (iii) GRIP (Generalized Regression with Intensities of Preference). We also briefly argue why other methods were not considered valuable candidates.
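Of the methods compared, TOPSIS is simple enough to sketch directly: alternatives are ranked by relative closeness to an ideal solution constructed per criterion. This is a generic textbook TOPSIS implementation, not the AiG project's code, and the alternatives, weights, and scores below are invented:

```python
from math import sqrt

def topsis(matrix, weights, benefit):
    """Rank alternatives by closeness to the ideal solution.
    matrix[i][j]: score of alternative i on criterion j;
    benefit[j]: True if larger values of criterion j are better."""
    m, n = len(matrix), len(matrix[0])
    # Vector-normalize each criterion column, then apply weights.
    norms = [sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)]
         for i in range(m)]
    # Ideal and anti-ideal points, per criterion direction.
    ideal = [max(col) if benefit[j] else min(col)
             for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col)
            for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = sqrt(sum((x - y) ** 2 for x, y in zip(row, ideal)))
        d_neg = sqrt(sum((x - y) ** 2 for x, y in zip(row, anti)))
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Two hypothetical software alternatives scored on speed (benefit)
# and cost (non-benefit), with equal weights.
scores = topsis([[80, 10], [60, 5]], weights=[0.5, 0.5],
                benefit=[True, False])
print(scores.index(max(scores)))  # -> 1 (slower but much cheaper)
```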
Music and dance make me feel alive: from Mandela's prison songs and dances to public policy.
Buis, Johann S
2013-01-01
How is it possible for song and dance to exist in political incarceration and manifest themselves later as public policy responding to apartheid atrocities? Examining the body of songs, oral history accounts, and eye-witness reports provided by fellow prisoners of Mandela in Robben Island prison, I uncover a psychological environment mediated through music and dance--within the confines of a political prison. This source of prison music-making by political prisoners in detention provides us with the artistic expressions of revolutionary songs, parody songs, praise songs, laments, etc. These music genres reflect ontologies embedded in Mandela's juristic imagination. My framework for explaining these ontologies is what I call an aesthetic of function: internal ontologies that speak to the African cultural ground against which external ontologies are expressed in the jurisprudential redress to apartheid atrocities. Examining his external (jurisprudential) ontologies through song and dance, one realizes that the best way for him to have solved the unprecedented public redress of apartheid atrocities is evident in the songs he sang in Robben Island prison. Retribution could have been a logical solution for him. Instead, he turned to truth-telling and reconciliation as public policy. The Truth and Reconciliation Commission's unprecedented breaking of social and jurisprudential boundaries, the claim of agency for both victims and perpetrators, and the public policy of South Africa's first democratically elected black president lie deeply embedded in cultural practices he testified to in his autobiography, "The Long Walk to Freedom". These cultural practices in prison were singing and dancing. This paper complements the music-as-torture trope: here music in detention carries ontological agency.
Musical evidence of stylistic features, text, and contextual analyses, together with related literary criticism devices, exposes Mandela's embedded internal and external ontological cultural practices. Here, song and dance have the agency to influence public policy despite the constraints of political detention.
ERIC Educational Resources Information Center
O'Keeffe, Shawn Edward
2013-01-01
The author developed a unified nD framework and process ontology for Building Information Modeling (BIM). The research includes a framework developed for 6D BIM, nD BIM, and nD ontology that defines the domain and sub-domain constructs for future nD BIM dimensions. The nD ontology defines the relationships of kinds within any new proposed…
Toxicology ontology perspectives.
Hardy, Barry; Apic, Gordana; Carthew, Philip; Clark, Dominic; Cook, David; Dix, Ian; Escher, Sylvia; Hastings, Janna; Heard, David J; Jeliazkova, Nina; Judson, Philip; Matis-Mitchell, Sherri; Mitic, Dragana; Myatt, Glenn; Shah, Imran; Spjuth, Ola; Tcheremenskaia, Olga; Toldo, Luca; Watson, David; White, Andrew; Yang, Chihae
2012-01-01
The field of predictive toxicology requires the development of open, public, computable, standardized toxicology vocabularies and ontologies to support the applications required by in silico, in vitro, and in vivo toxicology methods and related analysis and reporting activities. In this article we review ontology developments based on a set of perspectives showing how ontologies are being used in predictive toxicology initiatives and applications. Perspectives on resources and initiatives reviewed include OpenTox, eTOX, Pistoia Alliance, ToxWiz, Virtual Liver, EU-ADR, BEL, ToxML, and Bioclipse. We also review existing ontology developments in neighboring fields that can contribute to establishing an ontological framework for predictive toxicology. A significant set of resources is already available to provide a foundation for an ontological framework for 21st century mechanistic-based toxicology research. Ontologies such as ToxWiz provide a basis for application to toxicology investigations, whereas other ontologies under development in the biological, chemical, and biomedical communities could be incorporated in an extended future framework. OpenTox has provided a semantic web framework for the implementation of such ontologies into software applications and linked data resources. Bioclipse developers have shown the benefit of interoperability obtained through ontology by being able to link their workbench application with remote OpenTox web services. Although these developments are promising, an increased international coordination of efforts is greatly needed to develop a more unified, standardized, and open toxicology ontology framework.
Knowledge acquisition and learning process description in context of e-learning
NASA Astrophysics Data System (ADS)
Kiselev, B. G.; Yakutenko, V. A.; Yuriev, M. A.
2017-01-01
This paper investigates the problem of designing e-learning and MOOC systems. It describes instructional design-based approaches to e-learning systems design: IMS Learning Design, MISA, and TELOS. To solve this problem we present Knowledge Field of Educational Environment with Competence boundary conditions (KFEEC), an instructional engineering method for self-learning systems design. It is based on the simplified TELOS approach and enables a user to create their individual learning path by choosing prerequisite and target competencies. The paper provides the ontology model for the described instructional engineering method, real-life use cases, and the classification of the presented model. The ontology model consists of 13 classes and 15 properties. Some of them are inherited from the Knowledge Field of Educational Environment and some are new, describing competence boundary conditions and knowledge validation objects. The ontology model uses logical constraints and is described using the OWL 2 standard. To give TELOS users a better understanding of our approach, we list the mapping between TELOS and KFEEC.
A Practical Ontology Query Expansion Algorithm for Semantic-Aware Learning Objects Retrieval
ERIC Educational Resources Information Center
Lee, Ming-Che; Tsai, Kun Hua; Wang, Tzone I.
2008-01-01
Following the rapid development of Internet, particularly web page interaction technology, distant e-learning has become increasingly realistic and popular. To solve the problems associated with sharing and reusing teaching materials in different e-learning systems, several standard formats, including SCORM, IMS, LOM, and AICC, etc., recently have…
A framework for solving ill-structured community problems
NASA Astrophysics Data System (ADS)
Keller, William Cotesworth
A multifaceted protocol for solving ill-structured community problems has been developed. It embodies lessons learned from the past by refining and extending features of previous models from systems thinkers and from the fields of behavioral decision making and creative problem solving. The protocol also embraces additional features needed to address the unique aspects of community decision situations. The essential elements of the protocol are participants from the community, a problem-solving process, a systems picture, a facilitator, a modified Delphi method of communications, and technical expertise. This interdisciplinary framework has been tested in a quasi-experiment with a real-world community problem (the high cost of electrical power on Long Island, NY). Results indicate the protocol can enable members of the community to understand a complicated, ill-structured problem and guide them to action to solve the issue. However, the framework takes time (over one year in the test case) and will be inappropriate for crises where quick action is needed.
NASA Astrophysics Data System (ADS)
Roy, Satadru
Traditional approaches to design and optimize a new system often use a system-centric objective and do not take into consideration how the operator will use the new system alongside other existing systems. This "hand-off" between the design of the new system and how the new system operates alongside other systems might lead to sub-optimal performance with respect to the operator-level objective. In other words, the system that is optimal for its system-level objective might not be best for the system-of-systems level objective of the operator. Among the few available references that describe attempts to address this hand-off, most follow an MDO-motivated subspace decomposition approach of first designing a very good system and then providing this system to the operator, who decides the best way to use it along with the existing systems. The motivating example in this dissertation presents one such problem that includes aircraft design, airline operations, and revenue management "subspaces". The research here develops an approach that can simultaneously solve these subspaces posed as a monolithic optimization problem. The monolithic approach makes the problem a Mixed-Integer/Discrete Non-Linear Programming (MINLP/MDNLP) problem, a class that is extremely difficult to solve. The presence of expensive, sophisticated engineering analyses further aggravates the problem. To tackle this challenge problem, the work here presents a new optimization framework that simultaneously solves the subspaces to capture the "synergism" in the problem that previous decomposition approaches may not have exploited, addresses mixed-integer/discrete type design variables in an efficient manner, and accounts for computationally expensive analysis tools. The framework combines concepts from efficient global optimization, Kriging partial least squares, and gradient-based optimization.
This approach then demonstrates its ability to solve an 11-route airline network problem consisting of 94 decision variables, including 33 integer and 61 continuous type variables. This application problem is representative of an interacting group of systems and provides key challenges to the optimization framework in solving the MINLP problem, as reflected by the presence of a moderate number of integer and continuous type design variables and an expensive analysis tool. The results indicate that simultaneously solving the subspaces can lead to significant improvement in the fleet-level objective of the airline when compared to the previously developed sequential subspace decomposition approach. In developing the approach to solve the MINLP/MDNLP challenge problem, several test problems provided the ability to explore the performance of the framework. While solving these test problems, the framework showed that it could solve other MDNLP problems, including those with categorically discrete variables, indicating that the framework could have broader application than the new aircraft design-fleet allocation-revenue management problem.
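The efficient-global-optimization ingredient of such frameworks typically selects new sample points by maximizing expected improvement under the Kriging model's mean and standard deviation predictions. A minimal sketch of the standard expected-improvement formula for minimization (not the dissertation's actual code) is:

```python
from math import erf, exp, pi, sqrt

def expected_improvement(mu, sigma, best):
    """Expected improvement of a candidate point under a surrogate model
    (minimization): how much it is predicted to beat the current best,
    given predicted mean mu and standard deviation sigma."""
    if sigma <= 0.0:
        return max(best - mu, 0.0)
    z = (best - mu) / sigma
    phi = exp(-0.5 * z * z) / sqrt(2.0 * pi)   # standard normal pdf
    Phi = 0.5 * (1.0 + erf(z / sqrt(2.0)))     # standard normal cdf
    return (best - mu) * Phi + sigma * phi

# A point predicted at the current best but with high uncertainty still
# has positive expected improvement, which drives exploration.
print(round(expected_improvement(mu=1.0, sigma=0.5, best=1.0), 4))  # -> 0.1995
```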
ERIC Educational Resources Information Center
Scott, Fraser J.
2016-01-01
The "mathematics problem" is a well-known source of difficulty for students attempting numerical problem solving questions in the context of science education. This paper illuminates this problem from a biology education perspective by invoking Hogan's numeracy framework. In doing so, this study has revealed that the contextualisation of…
NASA Astrophysics Data System (ADS)
Vega, Francisco; Pérez, Wilson; Tello, Andrés; Saquicela, Victor; Espinoza, Mauricio; Solano-Quinde, Lizandro; Vidal, Maria-Esther; La Cruz, Alexandra
2015-12-01
Advances in medical imaging have fostered medical diagnosis based on digital images. Consequently, the number of studies based on medical-image diagnosis is increasing; thus, collaborative work and tele-radiology systems are required to effectively scale up to this diagnosis trend. We tackle the problem of collaborative access to medical images and present WebMedSA, a framework to manage large datasets of medical images. WebMedSA relies on a PACS and supports ontological annotation, as well as segmentation and visualization of the images based on their semantic description. Ontological annotations can be performed directly on the volumetric image or at different image planes (e.g., axial, coronal, or sagittal); furthermore, annotations can be complemented after applying a segmentation technique. WebMedSA is based on three main steps: (1) an RDF-ization process for extracting, anonymizing, and serializing metadata comprised in DICOM medical images into RDF/XML; (2) integration of different biomedical ontologies (using the L-MOM library), making this approach ontology independent; and (3) segmentation and visualization of annotated data, which are further used to generate new annotations according to expert knowledge, and validation. Initial user evaluations suggest that WebMedSA facilitates the exchange of knowledge between radiologists and provides a basis for collaborative work among them.
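Step (1) above, anonymize-then-serialize, can be sketched roughly as follows. This is a hedged illustration, not WebMedSA's actual pipeline: the tag names, the namespace URI, and the N-Triples output (instead of RDF/XML) are all simplifying assumptions.

```python
# Sketch of an RDF-ization step: strip identifying DICOM-style tags,
# pseudonymize the patient, and emit the rest as simple N-Triples.
import hashlib

IDENTIFYING_TAGS = {"PatientName", "PatientID", "PatientBirthDate"}
NS = "http://example.org/medimg#"  # hypothetical namespace

def rdfize(metadata):
    # Replace the patient identity with an opaque hash; drop other identifiers.
    pseudo_id = hashlib.sha256(
        str(metadata.get("PatientID", "")).encode()).hexdigest()[:12]
    subject = f"<{NS}study/{pseudo_id}>"
    triples = [f'{subject} <{NS}pseudonym> "{pseudo_id}" .']
    for tag, value in metadata.items():
        if tag not in IDENTIFYING_TAGS:
            triples.append(f'{subject} <{NS}{tag}> "{value}" .')
    return "\n".join(triples)

meta = {"PatientName": "DOE^JANE", "PatientID": "12345",
        "Modality": "MR", "StudyDate": "20150601"}
print(rdfize(meta))  # identifying values are gone; Modality/StudyDate survive
```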
Examining problem solving in physics-intensive Ph.D. research
NASA Astrophysics Data System (ADS)
Leak, Anne E.; Rothwell, Susan L.; Olivera, Javier; Zwickl, Benjamin; Vosburg, Jarrett; Martin, Kelly Norris
2017-12-01
Problem-solving strategies learned by physics undergraduates should prepare them for real-world contexts as they transition from students to professionals. Yet, graduate students in physics-intensive research face problems that go beyond problem sets they experienced as undergraduates and are solved by different strategies than are typically learned in undergraduate coursework. This paper expands the notion of problem solving by characterizing the breadth of problems and problem-solving processes carried out by graduate students in physics-intensive research. We conducted semi-structured interviews with ten graduate students to determine the routine, difficult, and important problems they engage in and problem-solving strategies they found useful in their research. A qualitative typological analysis resulted in the creation of a three-dimensional framework: context, activity, and feature (that made the problem challenging). Problem contexts extended beyond theory and mathematics to include interactions with lab equipment, data, software, and people. Important and difficult contexts blended social and technical skills. Routine problem activities were typically well defined (e.g., troubleshooting), while difficult and important ones were more open ended and had multiple solution paths (e.g., evaluating options). In addition to broadening our understanding of problems faced by graduate students, our findings explore problem-solving strategies (e.g., breaking down problems, evaluating options, using test cases or approximations) and characteristics of successful problem solvers (e.g., initiative, persistence, and motivation). Our research provides evidence of the influence that problems students are exposed to have on the strategies they use and learn. Using this evidence, we have developed a preliminary framework for exploring problems from the solver's perspective. This framework will be examined and refined in future work. 
Understanding problems graduate students face and the strategies they use has implications for improving how we approach problem solving in undergraduate physics and physics education research.
Taking evolution seriously in political science.
Lewis, Orion; Steinmo, Sven
2010-09-01
In this essay, we explore the epistemological and ontological assumptions that have been made to make political science "scientific." We show how political science has generally adopted an ontologically reductionist philosophy of science derived from Newtonian physics and mechanics. This mechanical framework has encountered problems and constraints on its explanatory power, because an emphasis on equilibrium analysis is ill-suited for the study of political change. We outline the primary differences between an evolutionary ontology of social science and the physics-based philosophy commonly employed. Finally, we show how evolutionary thinking adds insight into the study of political phenomena and research questions that are of central importance to the field, such as preference formation.
Step by Step: Biology Undergraduates’ Problem-Solving Procedures during Multiple-Choice Assessment
Prevost, Luanna B.; Lemons, Paula P.
2016-01-01
This study uses the theoretical framework of domain-specific problem solving to explore the procedures students use to solve multiple-choice problems about biology concepts. We designed several multiple-choice problems and administered them on four exams. We trained students to produce written descriptions of how they solved the problem, and this allowed us to systematically investigate their problem-solving procedures. We identified a range of procedures and organized them as domain general, domain specific, or hybrid. We also identified domain-general and domain-specific errors made by students during problem solving. We found that students use domain-general and hybrid procedures more frequently when solving lower-order problems than higher-order problems, while they use domain-specific procedures more frequently when solving higher-order problems. Additionally, the more domain-specific procedures students used, the higher the likelihood that they would answer the problem correctly, up to five procedures. However, if students used just one domain-general procedure, they were as likely to answer the problem correctly as if they had used two to five domain-general procedures. Our findings provide a categorization scheme and framework for additional research on biology problem solving and suggest several important implications for researchers and instructors. PMID:27909021
Shen, Ying; Colloc, Joël; Jacquet-Andrieu, Armelle; Lei, Kai
2015-08-01
This research aims to depict the methodological steps and tools for the combined operation of case-based reasoning (CBR) and a multi-agent system (MAS) to expose the ontological application in the field of clinical decision support. The multi-agent architecture covers the whole cycle of clinical decision-making and is adaptable to many medical aspects, such as the diagnosis, prognosis, treatment, and therapeutic monitoring of gastric cancer. In the multi-agent architecture, the ontological agent type employs the domain knowledge to ease the extraction of similar clinical cases and provide treatment suggestions to patients and physicians. The ontological agent is used for the extension of the domain hierarchy and the interpretation of input requests. Case-based reasoning memorizes and restores experience data for solving similar problems, with the help of a matching approach and defined interfaces of ontologies. A typical case is developed to illustrate the implementation of the knowledge acquisition and restitution of medical experts. Copyright © 2015 Elsevier Inc. All rights reserved.
A Uniform Ontology for Software Interfaces
NASA Technical Reports Server (NTRS)
Feyock, Stefan
2002-01-01
It is universally the case that computer users who are not also computer specialists prefer to deal with computers in terms of a familiar ontology, namely that of their application domains. For example, the well-known Windows ontology assumes that the user is an office worker, and therefore should be presented with a "desktop environment" featuring entities such as (virtual) file folders, documents, appointment calendars, and the like, rather than a world of machine registers and machine-language instructions, or even the DOS command level. The central theme of this research has been the proposition that the user interacting with a software system should have at his disposal both the ontology underlying the system and a model of the system. This information is necessary for understanding the system in use, as well as for the automatic generation of assistance for the user, both in solving the problem for which the application is designed and in providing guidance on the capabilities and use of the system.
NASA Astrophysics Data System (ADS)
Yenaeng, Sasikanchana; Saelee, Somkid; Samai, Wirachai
2018-01-01
The system for evaluating summary report writing skills using a Hybrid Genetic Algorithm-Support Vector Machine (HGA-SVM) with an ontology of medical case studies in Problem Based Learning (PBL) was developed as a scoring guideline for facilitators (medical teachers). The essay answers come from third-year medical students in the Nervous System, Motion and Behavior I and II courses of the medical education program, organized into 20 groups of 9-10 students, at the Faculty of Medicine, Prince of Songkla University (PSU). The audit committee was of the opinion that the ratings given by individual facilitators were inadequate, and this system was developed to solve that problem. This paper proposes the development of this evaluation system, in which the mean machine-learning scores and the mean human (facilitator) scores were not significantly different at the .05 level for all three essay parts: the problem essay part, the hypothesis essay part, and the learning-objective essay part.
Generalizing Backtrack-Free Search: A Framework for Search-Free Constraint Satisfaction
NASA Technical Reports Server (NTRS)
Jonsson, Ari K.; Frank, Jeremy
2000-01-01
Tractable classes of constraint satisfaction problems are of great importance in artificial intelligence. Identifying and taking advantage of such classes can significantly speed up constraint problem solving. In addition, tractable classes are utilized in applications where strict worst-case performance guarantees are required, such as constraint-based plan execution. In this work, we present a formal framework for search-free (backtrack-free) constraint satisfaction. The framework is based on general procedures, rather than specific propagation techniques, and thus generalizes existing techniques in this area. We also relate search-free problem solving to the notion of decision sets and use the result to provide a constructive criterion that is sufficient to guarantee search-free problem solving.
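One classical tractable, search-free case that the framework above generalizes is the tree-structured CSP: after enforcing directional arc consistency from the leaves toward the root, assigning variables root-first never backtracks. The sketch below illustrates that classic procedure (it is not the paper's more general decision-set construction); variable names and domains are invented.

```python
# Backtrack-free solving of a tree-structured CSP (classic tractable class).

def solve_tree_csp(order, parent, domains, constraint):
    # order: variables listed root-first; parent maps each non-root variable
    # to its parent. constraint(child_val, parent_val) -> bool on each edge.
    doms = {v: list(domains[v]) for v in order}
    # Directional arc consistency: prune parent domains bottom-up so that
    # every surviving parent value has a compatible child value.
    for child in reversed(order[1:]):
        p = parent[child]
        doms[p] = [pv for pv in doms[p]
                   if any(constraint(cv, pv) for cv in doms[child])]
        if not doms[p]:
            return None  # no solution exists
    # Search-free assignment pass, root first: a support is guaranteed.
    assignment = {order[0]: doms[order[0]][0]}
    for child in order[1:]:
        pv = assignment[parent[child]]
        assignment[child] = next(cv for cv in doms[child] if constraint(cv, pv))
    return assignment

order = ["A", "B", "C"]              # A is the root; B and C are its children
parent = {"B": "A", "C": "A"}
domains = {v: [1, 2, 3] for v in order}
greater = lambda child_val, parent_val: child_val > parent_val
print(solve_tree_csp(order, parent, domains, greater))
# → {'A': 1, 'B': 2, 'C': 2}: found with zero backtracks
```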
Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; He, Yongqun
2015-01-01
It is time-consuming to build an ontology with many terms and axioms. Thus it is desirable to automate the process of ontology development. Ontology Design Patterns (ODPs) provide a reusable solution to a recurrent modeling problem in the context of ontology engineering. Because ontology terms often follow specific ODPs, the Ontology for Biomedical Investigations (OBI) developers proposed a Quick Term Templates (QTTs) process targeted at generating new ontology classes following the same pattern, using term templates in a spreadsheet format. Inspired by the ODPs and QTTs, the Ontorat web application was developed to automatically generate new ontology terms, annotations of terms, and logical axioms based on a specific ODP(s). The inputs of an Ontorat execution include axiom expression settings, an input data file, ID generation settings, and a target ontology (optional). The axiom expression settings can be saved as a predesigned Ontorat setting-format text file for reuse. The input data file is generated based on a template file created by a specific ODP (text or Excel format). Ontorat is an efficient tool for ontology expansion. Different use cases are described. For example, Ontorat was applied to automatically generate over 1,000 Japan RIKEN cell line cell terms with both logical axioms and rich annotation axioms in the Cell Line Ontology (CLO). Approximately 800 licensed animal vaccines were represented and annotated in the Vaccine Ontology (VO) by Ontorat. The OBI team used Ontorat to add assay and device terms required by the ENCODE project. Ontorat was also used to add missing annotations to all existing Biobank-specific terms in the Biobank Ontology. A collection of ODPs and templates with examples is provided on the Ontorat website and can be reused to facilitate ontology development.
With ever-increasing ontology development and applications, Ontorat provides a timely platform for generating and annotating large numbers of ontology terms by following design patterns. http://ontorat.hegroup.org/.
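The template-driven generation idea behind QTTs and Ontorat can be sketched in a few lines: each spreadsheet row fills a pattern, and term IDs are auto-numbered. The template text, the "EX" prefix, and the column names below are illustrative assumptions, not Ontorat's actual format.

```python
# Hedged sketch of quick-term-template expansion: one template, many rows.

TEMPLATE = """Class: {iri}
  Annotations: rdfs:label "{label}"
  SubClassOf: {parent}"""

def generate_terms(rows, prefix="EX", start_id=1):
    terms = []
    for offset, row in enumerate(rows):
        iri = f"{prefix}_{start_id + offset:07d}"   # e.g. EX_0000001
        terms.append(TEMPLATE.format(iri=iri, label=row["label"],
                                     parent=row["parent"]))
    return terms

# Two input rows standing in for a spreadsheet of cell line terms.
rows = [{"label": "RIKEN cell line X", "parent": "EX_0000000"},
        {"label": "RIKEN cell line Y", "parent": "EX_0000000"}]
for term in generate_terms(rows):
    print(term, end="\n\n")
```

The real tool additionally emits OWL logical axioms and supports configurable axiom expression settings; this sketch only shows the row-to-pattern expansion.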
Ontologies for Effective Use of Context in E-Learning Settings
ERIC Educational Resources Information Center
Jovanovic, Jelena; Gasevic, Dragan; Knight, Colin; Richards, Griff
2007-01-01
This paper presents an ontology-based framework aimed at explicit representation of context-specific metadata derived from the actual usage of learning objects and learning designs. The core part of the proposed framework is a learning object context ontology, that leverages a range of other kinds of learning ontologies (e.g., user modeling…
ERIC Educational Resources Information Center
Lai, Su-Huei
A conceptual framework of the modes of problem-solving action has been developed on the basis of a simple relationship cone to assist individuals in diversified professions in inquiry and implementation of theory and practice in their professional development. The conceptual framework is referred to as the Cone-Deciphered Modes of Problem Solving…
Ochs, Christopher; Geller, James; Perl, Yehoshua; Musen, Mark A.
2016-01-01
Software tools play a critical role in the development and maintenance of biomedical ontologies. One important task that is difficult without software tools is ontology quality assurance. In previous work, we have introduced different kinds of abstraction networks to provide a theoretical foundation for ontology quality assurance tools. Abstraction networks summarize the structure and content of ontologies. One kind of abstraction network that we have used repeatedly to support ontology quality assurance is the partial-area taxonomy. It summarizes structurally and semantically similar concepts within an ontology. However, the use of partial-area taxonomies was ad hoc and not generalizable. In this paper, we describe the Ontology Abstraction Framework (OAF), a unified framework and software system for deriving, visualizing, and exploring partial-area taxonomy abstraction networks. The OAF includes support for various ontology representations (e.g., OWL and SNOMED CT's relational format). A Protégé plugin for deriving “live partial-area taxonomies” is demonstrated. PMID:27345947
Method of transition from 3D model to its ontological representation in aircraft design process
NASA Astrophysics Data System (ADS)
Govorkov, A. S.; Zhilyaev, A. S.; Fokin, I. V.
2018-05-01
This paper proposes a method of transition from a 3D model to its ontological representation and describes its usage in the aircraft design process. The problems of design for manufacturability and design automation are also discussed. The introduced method aims to ease the process of data exchange between important aircraft design phases, namely engineering and design control. The method is also intended to increase design speed and 3D model customizability. This requires careful selection of the complex systems (CAD/CAM/CAE/PDM) that provide the basis for integrating design with the technological preparation of production and that more fully take into account the characteristics of products and the processes for their manufacture. It is important to solve this problem, as investments in automation define a company's competitiveness in the years ahead.
NASA Astrophysics Data System (ADS)
Macioł, Piotr; Regulski, Krzysztof
2016-08-01
We present a process of semantic meta-model development for data management in an adaptable multiscale modeling framework. The main problems in ontology design are discussed, and a solution achieved as a result of the research is presented. The main concepts concerning the application and data management background for multiscale modeling were derived from the AM3 approach—object-oriented Agile multiscale modeling methodology. The ontological description of multiscale models enables validation of semantic correctness of data interchange between submodels. We also present a possibility of using the ontological model as a supervisor in conjunction with a multiscale model controller and a knowledge base system. Multiscale modeling formal ontology (MMFO), designed for describing multiscale models' data and structures, is presented. A need for applying meta-ontology in the MMFO development process is discussed. Examples of MMFO application in describing thermo-mechanical treatment of metal alloys are discussed. Present and future applications of MMFO are described.
An Approach to Folksonomy-Based Ontology Maintenance for Learning Environments
ERIC Educational Resources Information Center
Gasevic, D.; Zouaq, Amal; Torniai, Carlo; Jovanovic, J.; Hatala, Marek
2011-01-01
Recent research in learning technologies has demonstrated many promising contributions from the use of ontologies and semantic web technologies for the development of advanced learning environments. In spite of those benefits, ontology development and maintenance remain the key research challenges to be solved before ontology-enhanced learning…
Gene function prediction based on the Gene Ontology hierarchical structure.
Cheng, Liangxi; Lin, Hongfei; Hu, Yuncui; Wang, Jian; Yang, Zhihao
2014-01-01
The information in the Gene Ontology annotation is helpful in the explanation of life science phenomena and can provide great support for research in the biomedical field. The use of the Gene Ontology is gradually affecting the way people store and understand bioinformatic data. To facilitate the prediction of gene functions with the aid of text-mining methods and existing resources, we transform the task into a multi-label top-down classification problem and develop a method that uses the hierarchical relationships in the Gene Ontology structure to relieve the quantitative imbalance of positive and negative training samples. Meanwhile, the method enhances the discriminating ability of classifiers by retaining and highlighting the key training samples. Additionally, the top-down classifier based on a tree structure takes the relationships of target classes into consideration and thus resolves the incompatibility between the classification results and the Gene Ontology structure. Our experiment on the Gene Ontology annotation corpus achieves an F-value performance of 50.7% (precision: 52.7%, recall: 48.9%). The experimental results demonstrate that when the size of the training set is small, it can be expanded via topological propagation of associated documents between the parent and child nodes in the tree structure. The top-down classification model applies to sets of texts in an ontology structure or with a hierarchical relationship.
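Two small checks on the abstract above. First, the reported F-value is the harmonic mean of the reported precision and recall. Second, a sketch of the top-down idea on a toy GO-like tree: a child term is only considered if its parent was predicted, which keeps predictions consistent with the hierarchy. The tree, term IDs, and classifier scores are invented for illustration; real per-node classifiers would replace the score lookup.

```python
# Check 1: the F-value follows from precision and recall.
p, r = 0.527, 0.489
f_value = 2 * p * r / (p + r)      # harmonic mean
print(round(f_value, 3))           # → 0.507, matching the reported 50.7%

# Check 2: hierarchy-consistent top-down classification on a toy tree.
children = {"GO:root": ["GO:a", "GO:b"], "GO:a": ["GO:a1"],
            "GO:b": [], "GO:a1": []}
score = {"GO:a": 0.9, "GO:b": 0.2, "GO:a1": 0.8}   # stub per-node classifiers

def predict_top_down(node, threshold=0.5):
    predicted = set()
    for child in children[node]:
        if score[child] >= threshold:   # descend only into positive branches
            predicted.add(child)
            predicted |= predict_top_down(child, threshold)
    return predicted

print(sorted(predict_top_down("GO:root")))  # → ['GO:a', 'GO:a1']
```

Because "GO:b" scores below threshold, its whole subtree is skipped, so no predicted term can lack its ancestors.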
Ontological approach for safe and effective polypharmacy prescription
Grando, Adela; Farrish, Susan; Boyd, Cynthia; Boxwala, Aziz
2012-01-01
The intake of multiple medications by patients with various medical conditions challenges the delivery of medical care. Initial empirical studies and pilot implementations seem to indicate that generic principles for safe and effective multi-drug prescription can be defined and reused to reduce adverse drug events and to support compliance with medical guidelines and drug formularies. Given that ontologies are known to provide well-principled, sharable, setting-independent, and machine-interpretable declarative specification frameworks for modeling and reasoning on biomedical problems, we explore here their use in the context of multi-drug prescription. We propose an ontology for modeling drug-related knowledge and a repository of safe and effective generic prescription principles. To test the usability and the level of granularity of the developed ontology-based specification models and heuristics, we implemented a tool that computes the complexity of multi-drug treatments, and a decision aid to check the safety and effectiveness of prescribed multi-drug treatments. PMID:23304299
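The two tools the abstract mentions, a regimen-complexity calculator and a safety check, can be sketched as follows. Everything here is an illustrative assumption: the scoring weights, the drug names, and the interaction list are invented, where a real system would draw prescription principles and drug knowledge from the ontology.

```python
# Hedged sketch: a toy regimen complexity score plus a pairwise safety check.

BAD_PAIRS = {frozenset({"warfarin", "aspirin"})}   # assumed interaction list

def regimen_complexity(regimen):
    # regimen: list of (drug, doses_per_day, special_instructions_count);
    # each drug contributes a base unit plus weighted dosing burden (assumed).
    return sum(1 + doses + 0.5 * special
               for _, doses, special in regimen)

def unsafe_pairs(regimen):
    # Flag any known-bad pair fully contained in the prescribed drug set.
    drugs = {d for d, _, _ in regimen}
    return [pair for pair in BAD_PAIRS if pair <= drugs]

regimen = [("warfarin", 1, 1), ("aspirin", 1, 0), ("metformin", 2, 0)]
print(regimen_complexity(regimen))   # → 7.5
print(unsafe_pairs(regimen))         # flags the warfarin-aspirin pair
```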
ERIC Educational Resources Information Center
Kostousov, Sergei; Kudryavtsev, Dmitry
2017-01-01
Problem solving is a critical competency for modern world and also an effective way of learning. Education should not only transfer domain-specific knowledge to students, but also prepare them to solve real-life problems--to apply knowledge from one or several domains within specific situation. Problem solving as teaching tool is known for a long…
Network planning under uncertainties
NASA Astrophysics Data System (ADS)
Ho, Kwok Shing; Cheung, Kwok Wai
2008-11-01
One of the main focuses of network planning is the optimization of the network resources required to build a network under a certain traffic demand projection. Traditionally, the inputs to this type of network planning problem are treated as deterministic. In reality, varying traffic requirements and fluctuations in network resources can cause uncertainties in the decision models. The failure to include these uncertainties in the network design process can severely affect the feasibility and economics of the network. Therefore, it is essential to find a solution that is insensitive to the uncertain conditions during the network planning process. As early as the 1960s, a network planning problem with varying traffic requirements over time had been studied. Up to now, this kind of network planning problem is still being actively researched, especially for VPN network design. Another kind of network planning problem under uncertainties that has been studied actively in the past decade addresses the fluctuations in network resources. One such hotly pursued research topic is survivable network planning. It considers the design of a network under uncertainties brought by fluctuations in topology, to meet the requirement that the network remains intact for up to a certain number of faults occurring anywhere in the network. Recently, the authors proposed a new planning methodology called the Generalized Survivable Network that tackles the network design problem under both varying traffic requirements and fluctuations of topology. Although all the above network planning problems handle various kinds of uncertainties, it is hard to find a generic framework, under more general uncertainty conditions, that allows a more systematic way to solve the problems. With a unified framework, the seemingly diverse models and algorithms can be intimately related, and possibly more insights and improvements can be brought out for solving the problem.
This motivates us to seek a generic framework for solving the network planning problem under uncertainties. In addition to reviewing the various network planning problems involving uncertainties, we also propose that a unified framework based on robust optimization can be used to solve a rather large segment of network planning problems under uncertainties. Robust optimization was first introduced in the operations research literature and is a framework that incorporates information about the uncertainty sets for the parameters in the optimization model. Even though robust optimization originated from tackling uncertainty in the optimization process, it can serve as a comprehensive and suitable framework for tackling generic network planning problems under uncertainties. In this paper, we begin by explaining the main ideas behind the robust optimization approach. Then we demonstrate the capabilities of the proposed framework by giving some examples of how the robust optimization framework can be applied to common network planning problems under uncertain environments. Next, we list some practical considerations for solving the network planning problem under uncertainties with the proposed framework. Finally, we conclude this article with some thoughts on future directions for applying this framework to solve other network planning problems.
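The core robust-optimization move described above, optimizing against an uncertainty set rather than a single nominal forecast, can be shown on a toy capacity-planning problem. The demand set, capacity cost, and shortage penalty below are invented for illustration; this is not the paper's model.

```python
# Toy robust capacity planning: choose link capacity c to minimize the
# worst-case cost over an uncertainty set of traffic demands.

DEMAND_SET = [80, 100, 130]   # uncertainty set of possible demands (assumed)
CAP_COST = 1                  # cost per unit of installed capacity (assumed)
SHORTAGE_PENALTY = 5          # cost per unit of unserved demand (assumed)

def worst_case_cost(capacity):
    # Adversarial view: the worst demand in the set determines the shortage.
    shortage = max(max(0, d - capacity) for d in DEMAND_SET)
    return CAP_COST * capacity + SHORTAGE_PENALTY * shortage

robust_capacity = min(range(0, 201), key=worst_case_cost)
nominal_capacity = 100        # a plan built for the nominal demand only

print(robust_capacity)                    # → 130: covers the whole set
print(worst_case_cost(robust_capacity))   # → 130
print(worst_case_cost(nominal_capacity))  # → 250: nominal plan, worst case
```

Because the shortage penalty exceeds the capacity cost, the robust plan builds out to the largest demand in the set; the nominal plan looks cheaper up front but pays heavily under the worst-case scenario.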
Mathematical Problem Solving through Sequential Process Analysis
ERIC Educational Resources Information Center
Codina, A.; Cañadas, M. C.; Castro, E.
2015-01-01
Introduction: The macroscopic perspective is one of the frameworks for research on problem solving in mathematics education. Coming from this perspective, our study addresses the stages of thought in mathematical problem solving, offering an innovative approach because we apply sequential relations and global interrelations between the different…
Improving data quality in the linked open data: a survey
NASA Astrophysics Data System (ADS)
Hadhiatma, A.
2018-03-01
The Linked Open Data (LOD) is a "web of data", a different paradigm from the "web of documents" commonly used today. However, the huge LOD still suffers from data quality problems such as completeness, consistency, and accuracy. Data quality problems relate to designing effective methods both to manage and to retrieve information at various data quality levels. Based on a review of papers and journals, addressing data quality requires standards that serve to (1) identify data quality problems, (2) assess data quality for a given context, and (3) correct data quality problems. However, most of the methods and strategies dealing with LOD data quality have not taken an integrative approach. Hence, based on those standards and an integrative approach, there are opportunities to improve LOD data quality in terms of incompleteness, inaccuracy, and inconsistency, with respect to its schema and ontology, namely ontology refinement. Moreover, ontology refinement here means coping not only with improving data quality but also with enriching the LOD. Therefore, it needs (1) a standard for data quality assessment and evaluation that is more appropriate to the LOD, and (2) a framework of methods based on statistical relational learning that can improve the correction of data quality problems as well as enrich the LOD.
ERIC Educational Resources Information Center
Jalan, Sukoriyanto; Nusantara, Toto; Subanji, Subanji; Chandra, Tjang Daniel
2016-01-01
This study aims to explain the thinking process of students in solving combination problems considered from assimilation and accommodation frameworks. This research used a case study approach by classifying students into three categories of capabilities namely high, medium and low capabilities. From each of the ability categories, one student was…
Measurement contextuality is implied by macroscopic realism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen Zeqian; Montina, A.; Perimeter Institute for Theoretical Physics, 31 Caroline Street North, Waterloo, Ontario, N2L 2Y5
2011-04-15
Ontological theories of quantum mechanics provide a realistic description of single systems by means of well-defined quantities conditioning the measurement outcomes. In order to be complete, they should also fulfill the minimal condition of macroscopic realism. Under the assumption of outcome determinism and for Hilbert space dimension greater than 2, they were all proved to be contextual for projective measurements. In recent years a generalized concept of noncontextuality was introduced that applies also to the case of outcome indeterminism and unsharp measurements. It was pointed out that the Beltrametti-Bugajski model is an example of a measurement-noncontextual indeterministic theory. Here we provide a simple proof that this model is the only one with such a feature for projective measurements and Hilbert space dimension greater than 2. In other words, there is no extension of quantum theory providing more accurate predictions of outcomes and simultaneously preserving the minimal labeling of events through projective operators. As a corollary, noncontextuality for projective measurements implies noncontextuality for unsharp measurements. By noting that the condition of macroscopic realism requires an extension of quantum theory, unless a breaking of unitarity is invoked, we arrive at the conclusion that the only way to solve the measurement problem in the framework of an ontological theory is by relaxing the hypothesis of measurement noncontextuality in its generalized sense.
Incubation, Insight, and Creative Problem Solving: A Unified Theory and a Connectionist Model
ERIC Educational Resources Information Center
Helie, Sebastien; Sun, Ron
2010-01-01
This article proposes a unified framework for understanding creative problem solving, namely, the explicit-implicit interaction theory. This new theory of creative problem solving constitutes an attempt at providing a more unified explanation of relevant phenomena (in part by reinterpreting/integrating various fragmentary existing theories of…
Bialas, Andrzej
2011-01-01
Intelligent sensors experience security problems very similar to those inherent to other kinds of IT products or systems. The assurance for these products or systems creation methodologies, like Common Criteria (ISO/IEC 15408) can be used to improve the robustness of the sensor systems in high risk environments. The paper presents the background and results of the previous research on patterns-based security specifications and introduces a new ontological approach. The elaborated ontology and knowledge base were validated on the IT security development process dealing with the sensor example. The contribution of the paper concerns the application of the knowledge engineering methodology to the previously developed Common Criteria compliant and pattern-based method for intelligent sensor security development. The issue presented in the paper has a broader significance in terms that it can solve information security problems in many application domains. PMID:22164064
An ontology for component-based models of water resource systems
NASA Astrophysics Data System (ADS)
Elag, Mostafa; Goodall, Jonathan L.
2013-08-01
Component-based modeling is an approach for simulating water resource systems where a model is composed of a set of components, each with a defined modeling objective, interlinked through data exchanges. Component-based modeling frameworks are used within the hydrologic, atmospheric, and earth surface dynamics modeling communities. While these efforts have been advancing, it has become clear that the water resources modeling community in particular, and arguably the larger earth science modeling community as well, faces a challenge of fully and precisely defining the metadata for model components. The lack of a unified framework for model component metadata limits interoperability between modeling communities and the reuse of models across modeling frameworks due to ambiguity about the model and its capabilities. To address this need, we propose an ontology for water resources model components that describes core concepts and relationships using the Web Ontology Language (OWL). The ontology that we present, which is termed the Water Resources Component (WRC) ontology, is meant to serve as a starting point that can be refined over time through engagement by the larger community until a robust knowledge framework for water resource model components is achieved. This paper presents the methodology used to arrive at the WRC ontology, the WRC ontology itself, and examples of how the ontology can aid in component-based water resources modeling by (i) assisting in identifying relevant models, (ii) encouraging proper model coupling, and (iii) facilitating interoperability across earth science modeling frameworks.
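As a toy illustration of the kind of coupling check that such component metadata enables (the component names, variables, and units below are invented for this sketch, not taken from the WRC ontology itself):

```python
# Minimal sketch: decide whether two model components can be coupled by
# comparing their declared exchange metadata (variable names and units).
from dataclasses import dataclass, field

@dataclass
class Component:
    name: str
    inputs: dict = field(default_factory=dict)   # variable -> unit
    outputs: dict = field(default_factory=dict)  # variable -> unit

def couplable(upstream, downstream):
    """Return the downstream inputs that the upstream component can
    satisfy: same variable name, same unit."""
    return [
        var for var, unit in downstream.inputs.items()
        if upstream.outputs.get(var) == unit
    ]

# Hypothetical components with one shared exchange item.
runoff = Component("RunoffModel", outputs={"streamflow": "m3/s"})
routing = Component("RoutingModel", inputs={"streamflow": "m3/s"})
print(couplable(runoff, routing))  # ['streamflow']
```

A richer ontology would add subclass reasoning (e.g., treating a more specific variable as an acceptable match for a general one), which is exactly where a formal OWL model earns its keep over ad hoc string matching.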
Extending XNAT Platform with an Incremental Semantic Framework
Timón, Santiago; Rincón, Mariano; Martínez-Tomás, Rafael
2017-01-01
Informatics increases the yield from neuroscience due to improved data. Data sharing and accessibility enable joint efforts between different research groups, as well as replication studies, pivotal for progress in the field. Research data archiving solutions are evolving rapidly to address these necessities; however, distributed data integration is still difficult because of the need for explicit agreements on disparate data models. To address these problems, ontologies are widely used in biomedical research to obtain common vocabularies and logical descriptions, but their application may suffer from scalability issues, domain bias, and loss of low-level data access. With the aim of improving the application of semantic models in biobanking systems, an incremental semantic framework that takes advantage of the latest advances in biomedical ontologies and the XNAT platform is designed and implemented. We follow a layered architecture that allows the alignment of multi-domain biomedical ontologies to manage data at different levels of abstraction. To illustrate this approach, the development is integrated in the JPND (EU Joint Program for Neurodegenerative Disease) APGeM project, focused on finding early biomarkers for Alzheimer's and other dementia-related diseases. PMID:28912709
NASA Astrophysics Data System (ADS)
Broman, Karolina; Bernholt, Sascha; Parchmann, Ilka
2015-05-01
Background: Context-based learning approaches are used to enhance students' interest in, and knowledge about, science. According to different empirical studies, students' interest is improved by applying these more non-conventional approaches, while effects on learning outcomes are less coherent. Hence, further insights are needed into the structure of context-based problems in comparison to traditional problems, and into students' problem-solving strategies. Therefore, a suitable framework is necessary, both for the analysis of tasks and strategies. Purpose: The aim of this paper is to explore traditional and context-based tasks as well as students' responses to exemplary tasks to identify a suitable framework for future design and analyses of context-based problems. The paper discusses different established frameworks and applies the Higher-Order Cognitive Skills/Lower-Order Cognitive Skills (HOCS/LOCS) taxonomy and the Model of Hierarchical Complexity in Chemistry (MHC-C) to analyse traditional tasks and students' responses. Sample: Upper secondary students (n=236) at the Natural Science Programme, i.e. possible future scientists, are investigated to explore learning outcomes when they solve chemistry tasks, both more conventional as well as context-based chemistry problems. Design and methods: A typical chemistry examination test has been analysed, first the test items themselves (n=36), and thereafter 236 students' responses to one representative context-based problem. Content analysis using the HOCS/LOCS and MHC-C frameworks has been applied to analyse both quantitative and qualitative data, allowing us to describe different problem-solving strategies. Results: The empirical results show that both frameworks are suitable for identifying students' strategies, which mainly focus on recall of memorized facts when solving chemistry test items. Almost all test items also assessed lower-order thinking. The combination of the frameworks with the chemistry syllabus has proven successful for analysing both the test items and students' responses in a systematic way. The frameworks can therefore be applied in the design of new tasks, the analysis and assessment of students' responses, and as a tool for teachers to scaffold students in their problem-solving process. Conclusions: This paper gives implications for practice and for future research, both to develop new context-based problems in a structured way and to provide analytical tools for investigating students' higher-order thinking in their responses to these tasks.
ERIC Educational Resources Information Center
Sevian, H.; Bernholt, S.; Szteinberg, G. A.; Auguste, S.; Pérez, L. C.
2015-01-01
A perspective is presented on how the representation mapping framework by Hahn and Chater (1998) may be used to characterize reasoning during problem solving in chemistry. To provide examples for testing the framework, an exploratory study was conducted with students and professors from three different courses in the middle of the undergraduate…
Group Problem Solving as a Different Participatory Approach to Citizenship Education
ERIC Educational Resources Information Center
Guérin, Laurence
2017-01-01
Purpose: The main goal of this article is to define and justify group problem solving as an approach to citizenship education. It is demonstrated that the choice of theoretical framework of democracy has consequences for the chosen learning goals, educational approach and learning activities. The framework used here is an epistemic theory…
ERIC Educational Resources Information Center
Lai, Su-Huei
The conceptual framework of the Modes of Problem Solving Action (MPSA) model integrates Dewey's pragmatism, critical science theory, and theory regarding the three modes of inquiry. The MPSA model is formulated in the shape of a matrix. Horizontally, there are the following modes: technical, interpretive, and emancipating. Vertically, there are…
Prospective Teachers' Problem Solving Skills and Self-Confidence Levels
ERIC Educational Resources Information Center
Gursen Otacioglu, Sena
2008-01-01
The basic objective of the research is to determine whether the education that prospective teachers in different fields receive is related to their levels of problem solving skills and self-confidence. Within the mentioned framework, the prospective teachers' problem solving and self-confidence levels have been examined under several variables.…
Backtrack Programming: A Computer-Based Approach to Group Problem Solving.
ERIC Educational Resources Information Center
Scott, Michael D.; Bodaken, Edward M.
Backtrack problem-solving appears to be a viable alternative to current problem-solving methodologies. It appears to have considerable heuristic potential as a conceptual and operational framework for small group communication research, as well as functional utility for the student group in the small group class or the management team in the…
ERIC Educational Resources Information Center
Gu, Xiaoqing; Chen, Shan; Zhu, Wenbo; Lin, Lin
2015-01-01
Considerable effort has been invested in innovative learning practices such as collaborative inquiry. Collaborative problem solving is becoming popular in school settings, but there is limited knowledge on how to develop skills crucial in collaborative problem solving in students. Based on the intervention design in social interaction of…
A Contingency View of Problem Solving in Schools: A Case Analysis.
ERIC Educational Resources Information Center
Hanson, E. Mark; Brown, Michael E.
Patterns of problem-solving activity in one middle-class urban high school are examined and a problem solving model rooted in a conceptual framework of contingency theory is presented. Contingency theory stresses that as political, economic, and social conditions in an organization's environment become problematic, the internal structures of the…
The SWAN biomedical discourse ontology.
Ciccarese, Paolo; Wu, Elizabeth; Wong, Gwen; Ocana, Marco; Kinoshita, June; Ruttenberg, Alan; Clark, Tim
2008-10-01
Developing cures for highly complex diseases, such as neurodegenerative disorders, requires extensive interdisciplinary collaboration and exchange of biomedical information in context. Our ability to exchange such information across sub-specialties today is limited by the current scientific knowledge ecosystem's inability to properly contextualize and integrate data and discourse in machine-interpretable form. This inherently limits the productivity of research and the progress toward cures for devastating diseases such as Alzheimer's and Parkinson's. SWAN (Semantic Web Applications in Neuromedicine) is an interdisciplinary project to develop a practical, common, semantically structured, framework for biomedical discourse initially applied, but not limited, to significant problems in Alzheimer Disease (AD) research. The SWAN ontology has been developed in the context of building a series of applications for biomedical researchers, as well as in extensive discussions and collaborations with the larger bio-ontologies community. In this paper, we present and discuss the SWAN ontology of biomedical discourse. We ground its development theoretically, present its design approach, explain its main classes and their application, and show its relationship to other ongoing activities in biomedicine and bio-ontologies.
Semantics-enabled service discovery framework in the SIMDAT pharma grid.
Qu, Cangtao; Zimmermann, Falk; Kumpf, Kai; Kamuzinzi, Richard; Ledent, Valérie; Herzog, Robert
2008-03-01
We present the design and implementation of a semantics-enabled service discovery framework in the data Grids for process and product development using numerical simulation and knowledge discovery (SIMDAT) Pharma Grid, an industry-oriented Grid environment for integrating thousands of Grid-enabled biological data services and analysis services. The framework consists of three major components: the Web ontology language (OWL)-description logic (DL)-based biological domain ontology, OWL Web service ontology (OWL-S)-based service annotation, and semantic matchmaker based on the ontology reasoning. Built upon the framework, workflow technologies are extensively exploited in the SIMDAT to assist biologists in (semi)automatically performing in silico experiments. We present a typical usage scenario through the case study of a biological workflow: IXodus.
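A subsumption-based matchmaker of the kind described here can be sketched in a few lines. This is a hedged illustration only: the toy class hierarchy below is invented, and the "exact"/"plugin" match degrees follow common OWL-S matchmaking terminology rather than the SIMDAT implementation itself.

```python
# Toy subsumption-based service matchmaker: an advertised service matches
# a request when its output class equals, or is a subclass of, the
# requested class. The hierarchy below is a made-up example.
SUBCLASS_OF = {
    "ProteinSequence": "BioSequence",
    "GeneSequence": "BioSequence",
    "BioSequence": "Data",
}

def ancestors(cls):
    """All superclasses of cls, including cls itself."""
    chain = [cls]
    while cls in SUBCLASS_OF:
        cls = SUBCLASS_OF[cls]
        chain.append(cls)
    return chain

def match(advertised_output, requested_output):
    """'exact' if the classes coincide, 'plugin' if the advertisement is
    more specific than the request, otherwise 'fail'."""
    if advertised_output == requested_output:
        return "exact"
    if requested_output in ancestors(advertised_output):
        return "plugin"
    return "fail"

print(match("ProteinSequence", "BioSequence"))  # plugin
```

A DL reasoner generalizes this walk up a single-parent chain to full class-expression subsumption, which is what makes OWL-DL annotations worth the modeling effort.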
Step by Step: Biology Undergraduates' Problem-Solving Procedures during Multiple-Choice Assessment.
Prevost, Luanna B; Lemons, Paula P
2016-01-01
This study uses the theoretical framework of domain-specific problem solving to explore the procedures students use to solve multiple-choice problems about biology concepts. We designed several multiple-choice problems and administered them on four exams. We trained students to produce written descriptions of how they solved the problem, and this allowed us to systematically investigate their problem-solving procedures. We identified a range of procedures and organized them as domain general, domain specific, or hybrid. We also identified domain-general and domain-specific errors made by students during problem solving. We found that students use domain-general and hybrid procedures more frequently when solving lower-order problems than higher-order problems, while they use domain-specific procedures more frequently when solving higher-order problems. Additionally, the more domain-specific procedures students used, the higher the likelihood that they would answer the problem correctly, up to five procedures. However, if students used just one domain-general procedure, they were as likely to answer the problem correctly as if they had used two to five domain-general procedures. Our findings provide a categorization scheme and framework for additional research on biology problem solving and suggest several important implications for researchers and instructors. © 2016 L. B. Prevost and P. P. Lemons. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
Ontology Mappings to Improve Learning Resource Search
ERIC Educational Resources Information Center
Gasevic, Dragan; Hatala, Marek
2006-01-01
This paper proposes an ontology mapping-based framework that allows searching for learning resources using multiple ontologies. The present applications of ontologies in e-learning use various ontologies (eg, domain, curriculum, context), but they do not give a solution on how to interoperate e-learning systems based on different ontologies. The…
Optimality conditions for the numerical solution of optimization problems with PDE constraints :
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aguilo Valentin, Miguel Alejandro; Ridzal, Denis
2014-03-01
A theoretical framework for the numerical solution of partial differential equation (PDE) constrained optimization problems is presented in this report. This theoretical framework embodies the fundamental infrastructure required to efficiently implement and solve this class of problems. Detailed derivations of the optimality conditions required to accurately solve several parameter identification and optimal control problems are also provided in this report. This will allow the reader to further understand how the theoretical abstraction presented in this report translates to the application.
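As a sketch of what such optimality conditions look like in the standard textbook form (not taken from the report itself): for a problem minimizing an objective $J(u,z)$ over state $u$ and control $z$ subject to the PDE constraint $c(u,z)=0$, the first-order (KKT) conditions follow from stationarity of the Lagrangian:

```latex
\mathcal{L}(u,z,\lambda) = J(u,z) + \langle \lambda,\, c(u,z) \rangle ,
\qquad
\begin{aligned}
&\text{state equation:}   && c(u,z) = 0,\\
&\text{adjoint equation:} && c_u(u,z)^{*}\lambda = -J_u(u,z),\\
&\text{control equation:} && J_z(u,z) + c_z(u,z)^{*}\lambda = 0 ,
\end{aligned}
```

obtained by differentiating $\mathcal{L}$ with respect to $\lambda$, $u$, and $z$ respectively; parameter identification problems take the same form with $z$ playing the role of the unknown parameters.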
A Customizable Language Learning Support System Using Ontology-Driven Engine
ERIC Educational Resources Information Center
Wang, Jingyun; Mendori, Takahiko; Xiong, Juan
2013-01-01
This paper proposes a framework for web-based language learning support systems designed to provide customizable pedagogical procedures based on the analysis of characteristics of both learner and course. This framework employs a course-centered ontology and a teaching method ontology as the foundation for the student model, which includes learner…
Hastings, Janna; Chepelev, Leonid; Willighagen, Egon; Adams, Nico; Steinbeck, Christoph; Dumontier, Michel
2011-01-01
Cheminformatics is the application of informatics techniques to solve chemical problems in silico. There are many areas in biology where cheminformatics plays an important role in computational research, including metabolism, proteomics, and systems biology. One critical aspect in the application of cheminformatics in these fields is the accurate exchange of data, which is increasingly accomplished through the use of ontologies. Ontologies are formal representations of objects and their properties using a logic-based ontology language. Many such ontologies are currently being developed to represent objects across all the domains of science. Ontologies enable the definition, classification, and support for querying objects in a particular domain, enabling intelligent computer applications to be built which support the work of scientists both within the domain of interest and across interrelated neighbouring domains. Modern chemical research relies on computational techniques to filter and organise data to maximise research productivity. The objects which are manipulated in these algorithms and procedures, as well as the algorithms and procedures themselves, enjoy a kind of virtual life within computers. We will call these information entities. Here, we describe our work in developing an ontology of chemical information entities, with a primary focus on data-driven research and the integration of calculated properties (descriptors) of chemical entities within a semantic web context. Our ontology distinguishes algorithmic, or procedural information from declarative, or factual information, and renders of particular importance the annotation of provenance to calculated data. The Chemical Information Ontology is being developed as an open collaborative project. More details, together with a downloadable OWL file, are available at http://code.google.com/p/semanticchemistry/ (license: CC-BY-SA). PMID:21991315
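The provenance idea from this abstract, that a calculated descriptor should carry a record of the procedure that produced it, can be illustrated with a small sketch. The descriptor name, the atomic-weight table, and the provenance fields below are simplified examples, not the actual Chemical Information Ontology terms.

```python
# Toy sketch: a calculated chemical descriptor annotated with provenance
# describing the algorithm and inputs that produced the value.
from dataclasses import dataclass

ATOMIC_WEIGHTS = {"C": 12.011, "H": 1.008, "O": 15.999}

@dataclass
class Descriptor:
    name: str
    value: float
    provenance: dict  # which procedure produced the value, and from what

def molecular_weight(formula):
    """formula: {element: count}, e.g. ethanol is {'C': 2, 'H': 6, 'O': 1}."""
    value = sum(ATOMIC_WEIGHTS[el] * n for el, n in formula.items())
    return Descriptor(
        name="molecular_weight",
        value=round(value, 3),
        provenance={"algorithm": "sum_of_atomic_weights", "input": formula},
    )

d = molecular_weight({"C": 2, "H": 6, "O": 1})
print(d.value)  # 46.069
```

In the ontology proper, the algorithm and the resulting datum would be distinct, linked individuals (procedural vs. declarative information), rather than a dict hanging off the value.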
Typed Linear Chain Conditional Random Fields and Their Application to Intrusion Detection
NASA Astrophysics Data System (ADS)
Elfers, Carsten; Horstmann, Mirko; Sohr, Karsten; Herzog, Otthein
Intrusion detection in computer networks faces the problem of a large number of both false alarms and unrecognized attacks. To improve the precision of detection, various machine learning techniques have been proposed. However, one critical issue is that the amount of reference data that contains serious intrusions is very sparse. In this paper we present an inference process with linear chain conditional random fields that aims to solve this problem by using domain knowledge about the alerts of different intrusion sensors represented in an ontology.
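Inference in a linear-chain CRF reduces to Viterbi decoding over per-position label scores plus label-transition scores. The sketch below assumes the scores are already given (a real system learns them from training data, and in this paper would derive them partly from the ontology); the alert labels and numbers are invented for illustration.

```python
# Viterbi decoding for a linear-chain CRF with pre-computed scores.
def viterbi(obs_scores, trans_scores, labels):
    """obs_scores: list over time steps of {label: score};
    trans_scores: {(prev_label, cur_label): score}.
    Returns the highest-scoring label sequence."""
    best = {y: obs_scores[0][y] for y in labels}  # best score ending in y
    backpointers = []
    for t in range(1, len(obs_scores)):
        new, ptr = {}, {}
        for y in labels:
            prev, score = max(
                ((p, best[p] + trans_scores.get((p, y), 0.0)) for p in labels),
                key=lambda kv: kv[1],
            )
            new[y] = score + obs_scores[t][y]
            ptr[y] = prev
        best = new
        backpointers.append(ptr)
    # Trace the best path backwards from the best final label.
    y = max(best, key=best.get)
    path = [y]
    for ptr in reversed(backpointers):
        y = ptr[y]
        path.append(y)
    return path[::-1]

labels = ["normal", "attack"]
obs = [{"normal": 1.0, "attack": 0.1},
       {"normal": 0.2, "attack": 0.9},
       {"normal": 0.1, "attack": 1.0}]
trans = {("attack", "attack"): 0.5, ("normal", "normal"): 0.5}
print(viterbi(obs, trans, labels))  # ['normal', 'attack', 'attack']
```

The self-transition bonuses make the decoder prefer runs of the same label, which is how a CRF smooths over isolated noisy alerts better than classifying each alert in isolation.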
ERIC Educational Resources Information Center
Rouet, Jean-Francois; Betrancourt, Mirelle; Britt, M. Anne; Bromme, Rainer; Graesser, Arthur C.; Kulikowich, Jonna M.; Leu, Donald J.; Ueno, Naoki; van Oostendorp, Herre
2009-01-01
Governments and other stakeholders have become increasingly interested in assessing the skills of their adult populations for the purposes of monitoring how well prepared they are for the challenges of the new information world. The current paper provides an overview of the conceptual framework developed for the assessment of problem solving in…
ERIC Educational Resources Information Center
Antonenko, Pavlo D.; Jahanzad, Farzaneh; Greenwood, Carmen
2014-01-01
Collaborative problem solving is an essential component of any 21st century science career. Scientists are hired, retained, and promoted for solving problems in dynamic and interdisciplinary teams. They discuss issues, explain and justify their opinions, debate, elaborate, and reflect on their collective knowledge. At the same time, both…
A Problem-Solving Framework to Assist Students and Teachers in STEM Courses
ERIC Educational Resources Information Center
Phillips, Jeffrey A.; Clemmer, Katharine W.; McCallum, Jeremy E. B.; Zachariah, Thomas M.
2017-01-01
Well-developed, problem-solving skills are essential for any student enrolled in a science, technology, engineering, and mathematics (STEM) course as well as for graduates in the workforce. One of the most essential skills is the ability to monitor one's own progress and understanding while solving a problem. Successful monitoring during the…
ERIC Educational Resources Information Center
Currie-Rubin, Rachel
2012-01-01
This dissertation examines the problem-solving processes of seven graduate student novices enrolled in a course in educational assessment and ten educational assessment experts. Using Jonassen's (1997) ill- and well-structured problem-solving frameworks, I analyze think-aloud protocols of experts and novices as they examine ill-structured…
Scientific Digital Libraries, Interoperability, and Ontologies
NASA Technical Reports Server (NTRS)
Hughes, J. Steven; Crichton, Daniel J.; Mattmann, Chris A.
2009-01-01
Scientific digital libraries serve complex and evolving research communities. Justifications for the development of scientific digital libraries include the desire to preserve science data and the promises of information interconnectedness, correlative science, and system interoperability. Shared ontologies are fundamental to fulfilling these promises. We present a tool framework, some informal principles, and several case studies where shared ontologies are used to guide the implementation of scientific digital libraries. The tool framework, based on an ontology modeling tool, was configured to develop, manage, and keep shared ontologies relevant within changing domains and to promote the interoperability, interconnectedness, and correlation desired by scientists.
The neuron classification problem
Bota, Mihail; Swanson, Larry W.
2007-01-01
A systematic account of neuron cell types is a basic prerequisite for determining the vertebrate nervous system global wiring diagram. With comprehensive lineage and phylogenetic information unavailable, a general ontology based on structure-function taxonomy is proposed and implemented in a knowledge management system, and a prototype analysis of select regions (including retina, cerebellum, and hypothalamus) presented. The supporting Brain Architecture Knowledge Management System (BAMS) Neuron ontology is online and its user interface allows queries about terms and their definitions, classification criteria based on the original literature and “Petilla Convention” guidelines, hierarchies, and relations—with annotations documenting each ontology entry. Combined with three BAMS modules for neural regions, connections between regions and neuron types, and molecules, the Neuron ontology provides a general framework for physical descriptions and computational modeling of neural systems. The knowledge management system interacts with other web resources, is accessible in both XML and RDF/OWL, is extendible to the whole body, and awaits large-scale data population requiring community participation for timely implementation. PMID:17582506
An Ontology-Based Framework for Bridging Learning Design and Learning Content
ERIC Educational Resources Information Center
Knight, Colin; Gasevic, Dragan; Richards, Griff
2006-01-01
The paper describes an ontology-based framework for bridging learning design and learning object content. In present solutions, researchers have proposed conceptual models and developed tools for both of those subjects, but without detailed discussions of how they can be used together. In this paper we advocate the use of ontologies to explicitly…
A Formal Theory for Modular ERDF Ontologies
NASA Astrophysics Data System (ADS)
Analyti, Anastasia; Antoniou, Grigoris; Damásio, Carlos Viegas
The success of the Semantic Web is impossible without any form of modularity, encapsulation, and access control. In an earlier paper, we extended RDF graphs with weak and strong negation, as well as derivation rules. The ERDF #n-stable model semantics of the extended RDF framework (ERDF) is defined, extending RDF(S) semantics. In this paper, we propose a framework for modular ERDF ontologies, called modular ERDF framework, which enables collaborative reasoning over a set of ERDF ontologies, while support for hidden knowledge is also provided. In particular, the modular ERDF stable model semantics of modular ERDF ontologies is defined, extending the ERDF #n-stable model semantics. Our proposed framework supports local semantics and different points of view, local closed-world and open-world assumptions, and scoped negation-as-failure. Several complexity results are provided.
Paradigms and Problem-Solving: A Literature Review.
ERIC Educational Resources Information Center
Berner, Eta S.
1984-01-01
Thomas Kuhn's conceptions of the influence of paradigms on the progress of science form the framework for analyzing how medical educators have approached research on medical problem solving. A new paradigm emphasizing multiple types of problems with varied solution strategies is proposed. (Author/MLW)
ERIC Educational Resources Information Center
Winschel, Grace A.; Everett, Renata K.; Coppola, Brian P.; Shultz, Ginger V.
2015-01-01
Cooperative learning was employed as an instructional approach to facilitate student development of spectroscopy problem solving skills. An interactive online environment was used as a framework to structure weekly discussions around spectroscopy problems outside of class. Weekly discussions consisted of modified jigsaw-style problem solving…
A top-level ontology of functions and its application in the Open Biomedical Ontologies.
Burek, Patryk; Hoehndorf, Robert; Loebe, Frank; Visagie, Johann; Herre, Heinrich; Kelso, Janet
2006-07-15
A clear understanding of functions in biology is a key component in accurate modelling of molecular, cellular and organismal biology. Using the existing biomedical ontologies it has been impossible to capture the complexity of the community's knowledge about biological functions. We present here a top-level ontological framework for representing knowledge about biological functions. This framework lends greater accuracy, power and expressiveness to biomedical ontologies by providing a means to capture existing functional knowledge in a more formal manner. An initial major application of the ontology of functions is the provision of a principled way in which to curate functional knowledge and annotations in biomedical ontologies. Further potential applications include the facilitation of ontology interoperability and automated reasoning. A major advantage of the proposed implementation is that it is an extension to existing biomedical ontologies, and can be applied without substantial changes to these domain ontologies. The Ontology of Functions (OF) can be downloaded in OWL format from http://onto.eva.mpg.de/. Additionally, a UML profile and supplementary information and guides for using the OF can be accessed from the same website.
Lenguaje y Proceso de Pensamiento (Language and Thought Processes)
ERIC Educational Resources Information Center
Rimoldi, H. J. A.
1978-01-01
Thought processes as they are observed during problem solving are discussed. A theoretical framework is designed to establish the correspondence between the tactics the subjects use to solve problems and thinking processes. (NCR)
A Cognitive Analysis of Students’ Mathematical Problem Solving Ability on Geometry
NASA Astrophysics Data System (ADS)
Rusyda, N. A.; Kusnandi, K.; Suhendra, S.
2017-09-01
The purpose of this research is to analyze the mathematical problem-solving ability in geometry of students in one secondary school. The research was conducted using a quantitative approach with a descriptive method. The population was all students of that school, and the sample was twenty-five students chosen by purposive sampling. Data on mathematical problem solving were collected through an essay test. The results showed the following percentages of achievement for the mathematical problem-solving indicators: 1) solving closed mathematical problems with contexts in mathematics, 50%; 2) solving closed mathematical problems with contexts beyond mathematics, 24%; 3) solving open mathematical problems with contexts in mathematics, 35%; and 4) solving open mathematical problems with contexts outside mathematics, 44%. Based on these percentages, it can be concluded that the level of achievement of mathematical problem-solving ability in geometry is still low. This is because students are not used to solving problems that measure mathematical problem-solving ability, have weaknesses in recalling prior knowledge, and lack a problem-solving framework. Students' mathematical problem-solving ability therefore needs to be improved by implementing an appropriate learning strategy.
Constructing a Geology Ontology Using a Relational Database
NASA Astrophysics Data System (ADS)
Hou, W.; Yang, L.; Yin, S.; Ye, J.; Clarke, K.
2013-12-01
In the geology community, the creation of a common geology ontology has become a useful means to solve problems of data integration, knowledge transformation and the interoperation of multi-source, heterogeneous and multi-scale geological data. Currently, human-computer interaction methods and relational database-based methods are the primary ontology construction methods. Human-computer interaction methods such as the Geo-rule based method, the ontology life cycle method and the module design method have been proposed for applied geological ontologies. Essentially, the relational database-based method is a reverse engineering of abstracted semantic information from an existing database; the key is to construct rules for the transformation of database entities into the ontology. Relative to human-computer interaction methods, relational database-based methods can reuse existing resources and the stated semantic relationships among geological entities. However, two problems challenge their development and application. One is the transformation of multiple inheritance and nested relationships and their representation in an ontology. The other is that most of these methods do not measure the semantic retention of the transformation process. In this study, we focused on constructing a rule set to convert the semantics in a geological database into a geological ontology. According to the relational schema of a geological database, a conversion approach is presented to convert a geological spatial database into an OWL-based geological ontology, based on identifying semantics such as entities, relationships, inheritance relationships, nested relationships and cluster relationships. The semantic integrity of the transformation was verified using an inverse mapping process.
In the geological ontology, inheritance and union operations between superclasses and subclasses were used to represent the nested relationships in a geochronology and the multiple inheritance relationships. Based on a Quaternary database of the downtown of Foshan city, Guangdong Province, in Southern China, a geological ontology was constructed using the proposed method. To measure how well semantics were maintained in the conversion process and its results, an inverse mapping from the ontology to a relational database was tested based on a proposed conversion rule. The comparison of schemas and entities and the reduction of tables between the inverse database and the original database illustrated that the proposed method retains semantic information well during the conversion process. An application for abstracting sandstone information showed that semantic relationships among concepts in the geological database were successfully reorganized in the constructed ontology. Key words: geological ontology; geological spatial database; multiple inheritance; OWL. Acknowledgement: This research is jointly funded by the Specialized Research Fund for the Doctoral Program of Higher Education of China (RFDP) (20100171120001), NSFC (41102207) and the Fundamental Research Funds for the Central Universities (12lgpy19).
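The table-to-class, column-to-property, and foreign-key-to-object-property rules described above can be sketched as follows; this is a minimal illustration with a hypothetical two-table schema, not the authors' actual rule set or database.

```python
# Minimal sketch of relational-to-ontology conversion rules (hypothetical
# schema, not the authors' implementation): tables map to OWL classes,
# plain columns to datatype properties, and foreign keys to object
# properties whose range is the referenced table's class.

def schema_to_triples(schema):
    """Convert a relational schema description into OWL-style triples."""
    triples = []
    for table, info in schema.items():
        triples.append((table, "rdf:type", "owl:Class"))
        for column in info.get("columns", []):
            prop = f"{table}.{column}"
            triples.append((prop, "rdf:type", "owl:DatatypeProperty"))
            triples.append((prop, "rdfs:domain", table))
        for fk_column, target in info.get("foreign_keys", {}).items():
            prop = f"{table}.{fk_column}"
            triples.append((prop, "rdf:type", "owl:ObjectProperty"))
            triples.append((prop, "rdfs:domain", table))
            triples.append((prop, "rdfs:range", target))
    return triples

# Hypothetical Quaternary-geology schema: strata reference their epoch.
schema = {
    "Stratum": {"columns": ["lithology"], "foreign_keys": {"epoch_id": "Epoch"}},
    "Epoch": {"columns": ["name"]},
}
for t in schema_to_triples(schema):
    print(t)
```

An inverse mapping of the kind the authors use to verify semantic retention would simply run these rules backwards, recovering one table per class and one column per property.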
Ontologies as integrative tools for plant science
Walls, Ramona L.; Athreya, Balaji; Cooper, Laurel; Elser, Justin; Gandolfo, Maria A.; Jaiswal, Pankaj; Mungall, Christopher J.; Preece, Justin; Rensing, Stefan; Smith, Barry; Stevenson, Dennis W.
2012-01-01
Premise of the study Bio-ontologies are essential tools for accessing and analyzing the rapidly growing pool of plant genomic and phenomic data. Ontologies provide structured vocabularies to support consistent aggregation of data and a semantic framework for automated analyses and reasoning. They are a key component of the semantic web. Methods This paper provides background on what bio-ontologies are, why they are relevant to botany, and the principles of ontology development. It includes an overview of ontologies and related resources that are relevant to plant science, with a detailed description of the Plant Ontology (PO). We discuss the challenges of building an ontology that covers all green plants (Viridiplantae). Key results Ontologies can advance plant science in four key areas: (1) comparative genetics, genomics, phenomics, and development; (2) taxonomy and systematics; (3) semantic applications; and (4) education. Conclusions Bio-ontologies offer a flexible framework for comparative plant biology, based on common botanical understanding. As genomic and phenomic data become available for more species, we anticipate that the annotation of data with ontology terms will become less centralized, while at the same time, the need for cross-species queries will become more common, causing more researchers in plant science to turn to ontologies. PMID:22847540
A Framework for Concept-Based Digital Course Libraries
ERIC Educational Resources Information Center
Dicheva, Darina; Dichev, Christo
2004-01-01
This article presents a general framework for building concept-based digital course libraries. The framework is based on the idea of using a conceptual structure that represents a subject domain ontology for classification of the course library content. Two aspects, domain conceptualization, which supports findability and ontologies, which support…
Fernández-Breis, Jesualdo Tomás; Maldonado, José Alberto; Marcos, Mar; Legaz-García, María del Carmen; Moner, David; Torres-Sospedra, Joaquín; Esteban-Gil, Angel; Martínez-Salvador, Begoña; Robles, Montserrat
2013-01-01
Background The secondary use of electronic healthcare records (EHRs) often requires the identification of patient cohorts. In this context, an important problem is the heterogeneity of clinical data sources, which can be overcome with the combined use of standardized information models, virtual health records, and semantic technologies, since each of them contributes to solving aspects related to the semantic interoperability of EHR data. Objective To develop methods allowing for a direct use of EHR data for the identification of patient cohorts leveraging current EHR standards and semantic web technologies. Materials and methods We propose to take advantage of the best features of working with EHR standards and ontologies. Our proposal is based on our previous results and experience working with both technological infrastructures. Our main principle is to perform each activity at the abstraction level with the most appropriate technology available. This means that part of the processing will be performed using archetypes (ie, data level) and the rest using ontologies (ie, knowledge level). Our approach will start working with EHR data in proprietary format, which will be first normalized and elaborated using EHR standards and then transformed into a semantic representation, which will be exploited by automated reasoning. Results We have applied our approach to protocols for colorectal cancer screening. The results comprise the archetypes, ontologies, and datasets developed for the standardization and semantic analysis of EHR data. Anonymized real data have been used and the patients have been successfully classified by the risk of developing colorectal cancer. Conclusions This work provides new insights in how archetypes and ontologies can be effectively combined for EHR-driven phenotyping. The methodological approach can be applied to other problems provided that suitable archetypes, ontologies, and classification rules can be designed. PMID:23934950
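The two-level division of labor described above (archetypes at the data level, ontologies at the knowledge level) can be sketched as follows. The field names and risk rules here are invented purely for illustration; they are not the study's archetypes, ontology, or clinical criteria.

```python
# Minimal sketch of the two-level approach (hypothetical field names and
# risk rules, not the study's archetypes or ontology): proprietary EHR
# records are first normalized to a standard structure (data level), then
# classified by rules that would normally live in an ontology (knowledge
# level).

def normalize(raw):
    """Map a proprietary record to a standardized, archetype-like structure."""
    return {
        "age": int(raw["AgeYrs"]),
        "family_history": raw["FamHistCRC"] == "Y",
        "polyps": raw["PolypCount"] > 0,
    }

def classify_risk(record):
    """Toy knowledge-level rule: illustrative only, not clinical guidance."""
    if record["family_history"] or record["polyps"]:
        return "high"
    if record["age"] >= 50:
        return "medium"
    return "low"

raw_records = [
    {"AgeYrs": "62", "FamHistCRC": "N", "PolypCount": 0},
    {"AgeYrs": "45", "FamHistCRC": "Y", "PolypCount": 2},
]
cohort = [classify_risk(normalize(r)) for r in raw_records]
print(cohort)  # prints ['medium', 'high']
```

In the actual approach, the normalization step would be driven by EHR-standard archetypes and the classification step by description-logic reasoning over an OWL ontology rather than hand-written Python rules.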
Souba, Wiley W
2011-02-24
The ethical foundation of the medical profession, which values service above reward and holds the doctor-patient relationship as inviolable, continues to be challenged by the commercialization of health care. This article contends that a realigned leadership framework - one that distinguishes being a leader as the ontological basis for what leaders know, have, and do - is central to safeguarding medicine's ethical foundation. Four ontological pillars of leadership - awareness, commitment, integrity, and authenticity - are proposed as fundamental elements that anchor this foundation and the basic tenets of professionalism. Ontological leadership is shaped by and accessible through language; what health care leaders create in language "uses" them by providing a point of view (a context) within and from which they orient their conversations, decisions, and conduct such that they are ethically aligned and grounded. This contextual leadership framework exposes for us the limitations imposed by our mental maps, creating new opportunity sets for being and action (previously unavailable) that embody medicine's charter on professionalism. While this leadership methodology contrasts with the conventional results-oriented model where leading is generally equated with a successful clinical practice, a distinguished research program, or a promotion, it is not a replacement for it; indeed, results are essential for performance. Rather, being and action are interrelated and their correlated nature equips leaders with a framework for tackling health care's most complex problems in a manner that preserves medicine's venerable ethical heritage. PMID:21349187
DMTO: a realistic ontology for standard diabetes mellitus treatment.
El-Sappagh, Shaker; Kwak, Daehan; Ali, Farman; Kwak, Kyung-Sup
2018-02-06
Treatment of type 2 diabetes mellitus (T2DM) is a complex problem. A clinical decision support system (CDSS) based on massive and distributed electronic health record data can facilitate the automation of this process and enhance its accuracy. The most important component of any CDSS is its knowledge base. This knowledge base can be formulated using ontologies. The formal description logic of ontology supports the inference of hidden knowledge. Building a complete, coherent, consistent, interoperable, and sharable ontology is a challenge. This paper introduces the first version of the newly constructed Diabetes Mellitus Treatment Ontology (DMTO) as a basis for shared-semantics, domain-specific, standard, machine-readable, and interoperable knowledge relevant to T2DM treatment. It is a comprehensive ontology and provides the highest coverage and the most complete picture of coded knowledge about T2DM patients' current conditions, previous profiles, and T2DM-related aspects, including complications, symptoms, lab tests, interactions, treatment plan (TP) frameworks, and glucose-related diseases and medications. It adheres to the design principles recommended by the Open Biomedical Ontologies Foundry and is based on ontological realism that follows the principles of the Basic Formal Ontology and the Ontology for General Medical Science. DMTO is implemented under Protégé 5.0 in Web Ontology Language (OWL) 2 format and is publicly available through the National Center for Biomedical Ontology's BioPortal at http://bioportal.bioontology.org/ontologies/DMTO . The current version of DMTO includes more than 10,700 classes, 277 relations, 39,425 annotations, 214 semantic rules, and 62,974 axioms. We provide proof of concept for this approach to modeling TPs. The ontology is able to collect and analyze most features of T2DM as well as customize chronic TPs with the most appropriate drugs, foods, and physical exercises. 
DMTO is ready to be used as a knowledge base for semantically intelligent and distributed CDSSs.
Critical Realism and School Effectiveness Research in Colombia: The Difference It Should Make
ERIC Educational Resources Information Center
Parra, Juan David
2018-01-01
This article draws on the case of academic work produced by Colombian scholars, to address the debate on the persistent failure of policy efforts to improve school effectiveness. Realist meta-theory plays a significant role in this research, because it provides a general framework to identify ontological problems and inconsistencies in empirical…
Design and performance frameworks for constructing problem-solving simulations.
Stevens, Ron; Palacio-Cayetano, Joycelin
2003-01-01
Rapid advancements in hardware, software, and connectivity are helping to shorten the times needed to develop computer simulations for science education. These advancements, however, have not been accompanied by corresponding theories of how best to design and use these technologies for teaching, learning, and testing. Such design frameworks ideally would be guided less by the strengths/limitations of the presentation media and more by cognitive analyses detailing the goals of the tasks, the needs and abilities of students, and the resulting decision outcomes needed by different audiences. This article describes a problem-solving environment and associated theoretical framework for investigating how students select and use strategies as they solve complex science problems. A framework is first described for designing on-line problem spaces that highlights issues of content, scale, cognitive complexity, and constraints. While this framework was originally designed for medical education, it has proven robust and has been successfully applied to learning environments from elementary school through medical school. Next, a similar framework is detailed for collecting student performance and progress data that can provide evidence of students' strategic thinking and that could potentially be used to accelerate student progress. Finally, experimental validation data are presented that link strategy selection and use with other metrics of scientific reasoning and student achievement. PMID:14506505
Zhang, Zhizun; Gonzalez, Mila C; Morse, Stephen S
2017-01-01
Background There are increasing concerns about our preparedness and timely coordinated response across the globe to cope with emerging infectious diseases (EIDs). This poses practical challenges that require exploiting novel knowledge management approaches effectively. Objective This work aims to develop an ontology-driven knowledge management framework that addresses the existing challenges in sharing and reusing public health knowledge. Methods We propose a systems engineering-inspired ontology-driven knowledge management approach. It decomposes public health knowledge into concepts and relations and organizes the elements of knowledge based on the teleological functions. Both knowledge and semantic rules are stored in an ontology and retrieved to answer queries regarding EID preparedness and response. Results A hybrid concept extraction was implemented in this work. The quality of the ontology was evaluated using the formal evaluation method Ontology Quality Evaluation Framework. Conclusions Our approach is a potentially effective methodology for managing public health knowledge. Accuracy and comprehensiveness of the ontology can be improved as more knowledge is stored. In the future, a survey will be conducted to collect queries from public health practitioners. The reasoning capacity of the ontology will be evaluated using the queries and hypothetical outbreaks. We suggest the importance of developing a knowledge sharing standard like the Gene Ontology for the public health domain. PMID:29021130
ERIC Educational Resources Information Center
Ge, Xun; Law, Victor; Huang, Kun
2016-01-01
One of the goals for problem-based learning (PBL) is to promote self-regulation. Although self-regulation has been studied extensively, its interrelationships with ill-structured problem solving have been unclear. In order to clarify the interrelationships, this article proposes a conceptual framework illustrating the iterative processes among…
The Structure of Ill-Structured (and Well-Structured) Problems Revisited
ERIC Educational Resources Information Center
Reed, Stephen K.
2016-01-01
In his 1973 article "The Structure of ill structured problems", Herbert Simon proposed that solving ill-structured problems could be modeled within the same information-processing framework developed for solving well-structured problems. This claim is reexamined within the context of over 40 years of subsequent research and theoretical…
CHAMPION: Intelligent Hierarchical Reasoning Agents for Enhanced Decision Support
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hohimer, Ryan E.; Greitzer, Frank L.; Noonan, Christine F.
2011-11-15
We describe the design and development of an advanced reasoning framework employing semantic technologies, organized within a hierarchy of computational reasoning agents that interpret domain specific information. Designed based on an inspirational metaphor of the pattern recognition functions performed by the human neocortex, the CHAMPION reasoning framework represents a new computational modeling approach that derives invariant knowledge representations through memory-prediction belief propagation processes that are driven by formal ontological language specification and semantic technologies. The CHAMPION framework shows promise for enhancing complex decision making in diverse problem domains including cyber security, nonproliferation and energy consumption analysis.
ERIC Educational Resources Information Center
Lin, Wei-Lun; Lien, Yunn-Wen
2013-01-01
This study examined how working memory plays different roles in open-ended versus closed-ended creative problem-solving processes, as represented by divergent thinking tests and insight problem-solving tasks. With respect to the analysis of different task demands and the framework of dual-process theories, the hypothesis was that the idea…
NanoParticle Ontology for Cancer Nanotechnology Research
Thomas, Dennis G.; Pappu, Rohit V.; Baker, Nathan A.
2010-01-01
Data generated from cancer nanotechnology research are so diverse and large in volume that it is difficult to share and efficiently use them without informatics tools. In particular, ontologies that provide a unifying knowledge framework for annotating the data are required to facilitate the semantic integration, knowledge-based searching, unambiguous interpretation, mining and inferencing of the data using informatics methods. In this paper, we discuss the design and development of NanoParticle Ontology (NPO), which is developed within the framework of the Basic Formal Ontology (BFO), and implemented in the Ontology Web Language (OWL) using well-defined ontology design principles. The NPO was developed to represent knowledge underlying the preparation, chemical composition, and characterization of nanomaterials involved in cancer research. Public releases of the NPO are available through the BioPortal website, maintained by the National Center for Biomedical Ontology. Mechanisms for editorial and governance processes are being developed for the maintenance, review, and growth of the NPO. PMID:20211274
Meeting report: advancing practical applications of biodiversity ontologies
2014-01-01
We describe the outcomes of three recent workshops aimed at advancing development of the Biological Collections Ontology (BCO), the Population and Community Ontology (PCO), and tools to annotate data using those and other ontologies. The first workshop gathered use cases to help grow the PCO, agreed upon a format for modeling challenging concepts such as ecological niche, and developed ontology design patterns for defining collections of organisms and population-level phenotypes. The second focused on mapping datasets to ontology terms and converting them to Resource Description Framework (RDF), using the BCO. To follow up, a BCO hackathon was held concurrently with the 16th Genomics Standards Consortium Meeting, during which we converted additional datasets to RDF, developed a Material Sample Core for the Global Biodiversity Information Facility, created a Web Ontology Language (OWL) file for importing Darwin Core classes and properties into BCO, and developed a workflow for converting biodiversity data among formats.
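A dataset-to-RDF conversion of the kind performed in the second workshop can be sketched as follows. This is a simplified N-Triples serialization with illustrative record IRIs; the Darwin Core term IRIs follow the rs.tdwg.org pattern but should be verified against the published term list, and real BCO-based modeling would be richer than this flat mapping.

```python
# Minimal sketch of mapping tabular biodiversity records to RDF triples
# (illustrative record IRIs, simplified N-Triples, not the BCO workflow).
# Darwin Core term IRIs are shown in the commonly published
# rs.tdwg.org/dwc/terms/ form.

DWC = "http://rs.tdwg.org/dwc/terms/"

def rows_to_ntriples(rows, base="http://example.org/occurrence/"):
    """Emit one literal-valued triple per field of each record."""
    lines = []
    for i, row in enumerate(rows):
        subject = f"<{base}{i}>"
        for field, value in row.items():
            predicate = f"<{DWC}{field}>"
            lines.append(f'{subject} {predicate} "{value}" .')
    return "\n".join(lines)

rows = [
    {"scientificName": "Quercus alba", "eventDate": "2013-06-01"},
    {"scientificName": "Pinus strobus", "eventDate": "2013-06-02"},
]
print(rows_to_ntriples(rows))
```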
Standard model of knowledge representation
NASA Astrophysics Data System (ADS)
Yin, Wensheng
2016-09-01
Knowledge representation is the core of artificial intelligence research. Knowledge representation methods include predicate logic, semantic networks, computer programming languages, databases, mathematical models, graphics languages, natural language, etc. To establish the intrinsic links between the various knowledge representation methods, a unified knowledge representation model is necessary. Drawing on ontology, system theory, and control theory, a standard model of knowledge representation that reflects change in the objective world is proposed. The model is composed of input, processing, and output. This knowledge representation method does not contradict the traditional knowledge representation methods. It can express knowledge in multivariate and multidimensional terms, can express process knowledge, and at the same time has a strong problem-solving ability. In addition, the standard model of knowledge representation provides a way to solve problems involving imprecise and inconsistent knowledge.
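The input-processing-output structure of the proposed model can be sketched as follows; this is a hypothetical rendering for illustration, not the author's formal model.

```python
# Minimal sketch of the input-processing-output view of knowledge
# representation described above (hypothetical structure, not the
# author's formal model): each unit of knowledge is a named
# transformation from inputs to outputs.

class KnowledgeUnit:
    """A unit of knowledge modeled as input -> processing -> output."""
    def __init__(self, name, process):
        self.name = name
        self.process = process  # the processing component

    def apply(self, inputs):
        """Run the processing component on an input dictionary."""
        return self.process(inputs)

# Example: process knowledge ("how to compute density") expressed in the
# same input-processing-output form as any other knowledge unit.
density = KnowledgeUnit("density", lambda x: x["mass"] / x["volume"])
print(density.apply({"mass": 10.0, "volume": 4.0}))  # prints 2.5
```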
Multiobjective optimization of temporal processes.
Song, Zhe; Kusiak, Andrew
2010-06-01
This paper presents a dynamic predictive-optimization framework of a nonlinear temporal process. Data-mining (DM) and evolutionary strategy algorithms are integrated in the framework for solving the optimization model. DM algorithms learn dynamic equations from the process data. An evolutionary strategy algorithm is then applied to solve the optimization problem guided by the knowledge extracted by the DM algorithm. The concept presented in this paper is illustrated with the data from a power plant, where the goal is to maximize the boiler efficiency and minimize the limestone consumption. This multiobjective optimization problem can be either transformed into a single-objective optimization problem through preference aggregation approaches or into a Pareto-optimal optimization problem. The computational results have shown the effectiveness of the proposed optimization framework.
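The preference-aggregation route mentioned above, which collapses the two objectives into a single one, can be sketched as a weighted sum. The candidate settings and weights below are synthetic and illustrative; they are not the power-plant data, and the sketch omits the paper's data-mining and evolutionary strategy components.

```python
# Minimal sketch of weighted-sum preference aggregation for the two
# objectives named in the abstract (synthetic candidates, illustrative
# weights; not the plant data or the paper's DM/evolutionary algorithms).

def aggregate(efficiency, limestone, w_eff=0.7, w_lime=0.3):
    """Maximize efficiency, minimize limestone: subtract the weighted cost."""
    return w_eff * efficiency - w_lime * limestone

# Candidate process settings: (boiler efficiency %, limestone use t/h).
candidates = [(88.0, 5.0), (90.0, 9.0), (86.0, 2.0)]
best = max(candidates, key=lambda c: aggregate(*c))
print(best)  # prints (90.0, 9.0)
```

In the Pareto-optimal formulation, by contrast, no weights are chosen up front and the optimizer returns the whole set of non-dominated trade-offs.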
Ontology-Driven Provenance Management in eScience: An Application in Parasite Research
NASA Astrophysics Data System (ADS)
Sahoo, Satya S.; Weatherly, D. Brent; Mutharaju, Raghava; Anantharam, Pramod; Sheth, Amit; Tarleton, Rick L.
Provenance, from the French word "provenir", describes the lineage or history of a data entity. Provenance is critical information in scientific applications to verify experiment process, validate data quality and associate trust values with scientific results. Current industrial scale eScience projects require an end-to-end provenance management infrastructure. This infrastructure needs to be underpinned by formal semantics to enable analysis of large scale provenance information by software applications. Further, effective analysis of provenance information requires well-defined query mechanisms to support complex queries over large datasets. This paper introduces an ontology-driven provenance management infrastructure for biology experiment data, as part of the Semantic Problem Solving Environment (SPSE) for Trypanosoma cruzi (T.cruzi). This provenance infrastructure, called T.cruzi Provenance Management System (PMS), is underpinned by (a) a domain-specific provenance ontology called Parasite Experiment ontology, (b) specialized query operators for provenance analysis, and (c) a provenance query engine. The query engine uses a novel optimization technique based on materialized views called materialized provenance views (MPV) to scale with increasing data size and query complexity. This comprehensive ontology-driven provenance infrastructure not only allows effective tracking and management of ongoing experiments in the Tarleton Research Group at the Center for Tropical and Emerging Global Diseases (CTEGD), but also enables researchers to retrieve the complete provenance information of scientific results for publication in literature.
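The materialized provenance view (MPV) idea can be sketched as follows: the full lineage of each data entity is precomputed once, so repeated provenance queries become lookups instead of graph traversals. The derivation graph below is a toy example, not the T.cruzi PMS implementation or its query engine.

```python
# Minimal sketch of a materialized provenance view (toy derivation graph,
# not the PMS implementation): precompute the transitive lineage of each
# entity so provenance queries are O(1) lookups.

derived_from = {          # entity -> entities it was directly derived from
    "plot": ["stats"],
    "stats": ["cleaned"],
    "cleaned": ["raw_reads"],
}

def materialize_lineage(graph):
    """Precompute the transitive lineage for every entity in the graph."""
    view = {}
    def lineage(entity):
        if entity in view:
            return view[entity]
        result = []
        for parent in graph.get(entity, []):
            result.append(parent)
            result.extend(lineage(parent))
        view[entity] = result
        return result
    for entity in graph:
        lineage(entity)
    return view

mpv = materialize_lineage(derived_from)
print(mpv["plot"])  # prints ['stats', 'cleaned', 'raw_reads']
```

The trade-off is the usual one for materialized views: query time shrinks at the cost of storage and of refreshing the view as new experiment data arrive.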
Ontology to relational database transformation for web application development and maintenance
NASA Astrophysics Data System (ADS)
Mahmudi, Kamal; Inggriani Liem, M. M.; Akbar, Saiful
2018-03-01
Ontology is used as the knowledge representation while a database is used as the fact recorder in a KMS (Knowledge Management System). In most applications, data are managed in a database system and updated through the application, and then transformed into knowledge as needed. Once a domain conceptor defines the knowledge in the ontology, the application and database can be generated from the ontology. Most existing frameworks generate the application from its database. In this research, the ontology is used for generating the application. As the data are updated through the application, a mechanism is designed to trigger an update to the ontology so that the application can be rebuilt based on the newest ontology. By this approach, a knowledge engineer has full flexibility to renew the application based on the latest ontology without depending on a software developer. In many cases, the concept needs to be updated when the data change. The framework was built and tested in a Spring Java environment. A case study was conducted to prove the concept.
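Generating application code from an ontology description can be sketched as follows. The ontology contents here are hypothetical, and the sketch is rendered in Python for brevity even though the framework above was built and tested in a Spring Java environment.

```python
# Minimal sketch of ontology-driven application generation (hypothetical
# ontology; Python stand-in for the Spring Java target): each ontology
# class yields an application class whose attributes mirror the
# ontology's properties, so regenerating from a newer ontology rebuilds
# the application automatically.

ontology = {
    "Employee": {"properties": ["name", "department"]},
    "Department": {"properties": ["name"]},
}

def generate_classes(onto):
    """Build one application class per ontology class, dynamically."""
    generated = {}
    for cls_name, spec in onto.items():
        # Bind the property list via a default argument to avoid the
        # late-binding closure pitfall inside the loop.
        def __init__(self, _props=tuple(spec["properties"]), **kwargs):
            for p in _props:
                setattr(self, p, kwargs.get(p))
        generated[cls_name] = type(cls_name, (object,), {"__init__": __init__})
    return generated

app = generate_classes(ontology)
alice = app["Employee"](name="Alice", department="R&D")
print(alice.name, alice.department)  # prints Alice R&D
```

When the ontology changes, rerunning `generate_classes` rebuilds the application classes, which mirrors the rebuild-from-latest-ontology workflow the abstract describes.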
Modular Knowledge Representation and Reasoning in the Semantic Web
NASA Astrophysics Data System (ADS)
Serafini, Luciano; Homola, Martin
Construction of modular ontologies by combining different modules is becoming a necessity in ontology engineering in order to cope with the increasing complexity of ontologies and the domains they represent. The modular ontology approach takes inspiration from software engineering, where modularization is a widely acknowledged feature. Distributed reasoning is the other side of the coin of modular ontologies: given an ontology comprising a set of modules, it is desirable to perform reasoning by combining multiple reasoning processes performed locally on each of the modules. In the last ten years, a number of approaches for combining logics have been developed in order to formalize modular ontologies. In this chapter, we survey and compare the main formalisms for modular ontologies and distributed reasoning in the Semantic Web. We select four formalisms built on the formal logical grounds of Description Logics: Distributed Description Logics, ℰ-connections, Package-based Description Logics, and Integrated Distributed Description Logics. We concentrate on the expressivity and distinctive modeling features of each framework. We also discuss the reasoning capabilities of each framework.
Physical properties of biological entities: an introduction to the ontology of physics for biology.
Cook, Daniel L; Bookstein, Fred L; Gennari, John H
2011-01-01
As biomedical investigators strive to integrate data and analyses across spatiotemporal scales and biomedical domains, they have recognized the benefits of formalizing languages and terminologies via computational ontologies. Although ontologies for biological entities (molecules, cells, organs) are well-established, there are no principled ontologies of physical properties (energies, volumes, flow rates) of those entities. In this paper, we introduce the Ontology of Physics for Biology (OPB), a reference ontology of classical physics designed for annotating biophysical content of growing repositories of biomedical datasets and analytical models. The OPB's semantic framework, traceable to James Clerk Maxwell, encompasses modern theories of system dynamics and thermodynamics, and is implemented as a computational ontology that references available upper ontologies. In this paper we focus on the OPB classes that are designed for annotating physical properties encoded in biomedical datasets and computational models, and we discuss how the OPB framework will facilitate biomedical knowledge integration. © 2011 Cook et al.
The Open-Ended Approach Framework
ERIC Educational Resources Information Center
Munroe, Lloyd
2015-01-01
This paper describes a pedagogical framework that teachers can use to support students who are engaged in solving open-ended problems, by explaining how two Japanese expert teachers successfully apply open-ended problems in their mathematics class. The Open-Ended Approach (OPA) framework consists of two main sections: Understanding Mathematical…
A Solution Framework for Environmental Characterization Problems
This paper describes experiences developing a grid-enabled framework for solving environmental inverse problems. The solution approach taken here couples environmental simulation models with global search methods and requires readily available computational resources of the grid ...
Towards the Construction of a Framework to Deal with Routine Problems to Foster Mathematical Inquiry
ERIC Educational Resources Information Center
Santos-Trigo, Manuel; Camacho-Machin, Matias
2009-01-01
To what extent does the process of solving textbook problems help students develop a way of thinking that is consistent with mathematical practice? Can routine problems be transformed into problem solving activities that promote students' mathematical reflection? These questions are used to outline and discuss features of an inquiry framework…
Problem Solving in Physics: Undergraduates' Framing, Procedures, and Decision Making
NASA Astrophysics Data System (ADS)
Modir, Bahar
In this dissertation I will start with the broad research question: what does problem solving in upper-division physics look like? My focus in this study is on students' problem solving in physics theory courses. Some mathematical formalisms are common across all physics core courses, such as using the process of separation of variables, doing Taylor series, or using the orthogonality properties of mathematical functions to set terms equal to zero. However, there are slight differences in the use of these mathematical formalisms across different courses, possibly because of how students map different physical systems to these processes. Thus, my first main research question aims to answer how students perform these recurring processes across upper-division physics courses. I break this broad question into three particular research questions: What knowledge pieces do students use to make connections between physics and procedural math? How do students use their knowledge pieces coherently to provide reasoning strategies in estimation problems? How do students look ahead into the problem to read information out of the physical scenario and align their use of math in physics? Building on the previous body of literature, I will use the Knowledge in Pieces theory family and provide evidence to expand this theoretical foundation. I will compare my study with previous studies and provide suggestions on how to generalize these theory expansions for future use. My experimental data mostly come from video-based classroom recordings, in which students in groups of 2-4 collaboratively solve in-class problems in quantum mechanics and electromagnetic fields 1 courses. In addition, I will analyze clinical interviews to demonstrate how a single case-study student plays an epistemic game to estimate the total energy in a hurricane. My second research question is more focused on a particular instructional context: how do students frame problem solving in quantum mechanics?
I will lay out a new theoretical framework, based on epistemic framing, that separates the problem-solving space into four frames divided along two axes. The first axis models students' framing in math and physics; the second axis distinguishes conceptual problem solving from algorithmic problem solving. I use this framework to show how students navigate problem solving. Lastly, I will use the developed framework to interpret existing difficulties in quantum mechanics.
Subjectivity and schizophrenia: another look at incomprehensibility and treatment nonadherence.
Parnas, Josef; Henriksen, Mads Gram
2013-01-01
Psychiatry is in a time of crisis. The absence of significant breakthroughs to actionable etiological knowledge has left the discipline in a state of uncertainty, and worries are being voiced about its status and future. In our view, the stagnation can be, at least in part, ascribed to an excessive, behaviorist-oriented, epistemological, and ontological simplification of psychopathology. The aim of this phenomenological study is to articulate the notion of the 'disordered self' in schizophrenia, a notion that we believe constitutes an important step forward in grasping its essential pathogenetic structures. Through the framework of self-disorders, we analyze two domains of the psychopathology of schizophrenia, seeking to recast their puzzling nature into more useful clinical and scientific terms. First, we examine the so-called schizophrenic incomprehensibility (bizarre gestalt, bizarre delusions, and 'crazy actions') and argue that grasping the altered framework for experiencing, associated with the disordered self, makes these phenomena appear comprehensible to a considerable extent. Second, we explore the issue of treatment noncompliance and provide a novel account of 'poor insight' into illness. We propose that poor insight in schizophrenia is not simply a problem of insufficient self-reflection due to psychological defenses or impaired metacognition, but rather that it is intrinsically expressive of the severity and nature of self-disorders. The instabilities of the first-person perspective throw the patient into a different, often quasi-solipsistic, ontological-existential framework. We argue that interventions seeking to optimize the patients' compliance might prove more efficient if they take the alterations of the patients' ontological-existential framework into account. © 2013 S. Karger AG, Basel.
Utilizing a structural meta-ontology for family-based quality assurance of the BioPortal ontologies.
Ochs, Christopher; He, Zhe; Zheng, Ling; Geller, James; Perl, Yehoshua; Hripcsak, George; Musen, Mark A
2016-06-01
An Abstraction Network is a compact summary of an ontology's structure and content. In previous research, we showed that Abstraction Networks support quality assurance (QA) of biomedical ontologies. The development of an Abstraction Network and its associated QA methodologies, however, is a labor-intensive process that previously was applicable only to one ontology at a time. To improve the efficiency of the Abstraction-Network-based QA methodology, we introduced a QA framework that uses uniform Abstraction Network derivation techniques and QA methodologies that are applicable to whole families of structurally similar ontologies. For the family-based framework to be successful, it is necessary to develop a method for classifying ontologies into structurally similar families. We now describe a structural meta-ontology that classifies ontologies according to certain structural features that are commonly used in the modeling of ontologies (e.g., object properties) and that are important for Abstraction Network derivation. Each class of the structural meta-ontology represents a family of ontologies with identical structural features, indicating which types of Abstraction Networks and QA methodologies are potentially applicable to all of the ontologies in the family. We derive a collection of 81 families, corresponding to classes of the structural meta-ontology, that enable a flexible, streamlined family-based QA methodology, offering multiple choices for classifying an ontology. The structure of 373 ontologies from the NCBO BioPortal is analyzed and each ontology is classified into multiple families modeled by the structural meta-ontology. Copyright © 2016 Elsevier Inc. All rights reserved.
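The family-based idea above can be sketched in a few lines: ontologies that use exactly the same combination of structural features fall into the same family, and each family then shares Abstraction Network derivation techniques and QA methodologies. The feature names and ontology names below are invented, not taken from BioPortal.

```python
# Hypothetical sketch: group ontologies into families by the exact set of
# structural features they use (e.g., object properties), mirroring the
# classification performed by a structural meta-ontology.

def family_of(features):
    # a family is identified by the exact combination of structural features
    return frozenset(features)

ontologies = {
    "OntoA": {"class hierarchy", "object properties"},
    "OntoB": {"class hierarchy", "object properties"},
    "OntoC": {"class hierarchy", "datatype properties"},
}

families = {}
for name, features in ontologies.items():
    families.setdefault(family_of(features), []).append(name)
```

In this sketch OntoA and OntoB land in the same family, so any QA methodology applicable to one is potentially applicable to the other.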
A concept ideation framework for medical device design.
Hagedorn, Thomas J; Grosse, Ian R; Krishnamurty, Sundar
2015-06-01
Medical device design is a challenging process, often requiring collaboration between medical and engineering domain experts. This collaboration can be best institutionalized through systematic knowledge transfer between the two domains coupled with effective knowledge management throughout the design innovation process. Toward this goal, we present the development of a semantic framework for medical device design that unifies a large medical ontology with detailed engineering functional models along with the repository of design innovation information contained in the US Patent Database. As part of our development, existing medical, engineering, and patent document ontologies were modified and interlinked to create a comprehensive medical device innovation and design tool with appropriate properties and semantic relations to facilitate knowledge capture, enrich existing knowledge, and enable effective knowledge reuse for different scenarios. The result is a Concept Ideation Framework for Medical Device Design (CIFMeDD). Key features of the resulting framework include function-based searching and automated inter-domain reasoning to uniquely enable identification of functionally similar procedures, tools, and inventions from multiple domains based on simple semantic searches. The significance and usefulness of the resulting framework for aiding in conceptual design and innovation in the medical realm are explored via two case studies examining medical device design problems. Copyright © 2015 Elsevier Inc. All rights reserved.
Zhang, Zhizun; Gonzalez, Mila C; Morse, Stephen S; Venkatasubramanian, Venkat
2017-10-11
There are increasing concerns about our preparedness and timely coordinated response across the globe to cope with emerging infectious diseases (EIDs). This poses practical challenges that require exploiting novel knowledge management approaches effectively. This work aims to develop an ontology-driven knowledge management framework that addresses the existing challenges in sharing and reusing public health knowledge. We propose a systems engineering-inspired ontology-driven knowledge management approach. It decomposes public health knowledge into concepts and relations and organizes the elements of knowledge based on the teleological functions. Both knowledge and semantic rules are stored in an ontology and retrieved to answer queries regarding EID preparedness and response. A hybrid concept extraction was implemented in this work. The quality of the ontology was evaluated using the formal evaluation method Ontology Quality Evaluation Framework. Our approach is a potentially effective methodology for managing public health knowledge. Accuracy and comprehensiveness of the ontology can be improved as more knowledge is stored. In the future, a survey will be conducted to collect queries from public health practitioners. The reasoning capacity of the ontology will be evaluated using the queries and hypothetical outbreaks. We suggest the importance of developing a knowledge sharing standard like the Gene Ontology for the public health domain. ©Zhizun Zhang, Mila C Gonzalez, Stephen S Morse, Venkat Venkatasubramanian. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 11.10.2017.
Design and Application of an Ontology for Component-Based Modeling of Water Systems
NASA Astrophysics Data System (ADS)
Elag, M.; Goodall, J. L.
2012-12-01
Many Earth system modeling frameworks have adopted an approach of componentizing models so that a large model can be assembled by linking a set of smaller model components. These model components can then be more easily reused, extended, and maintained by a large group of model developers and end users. While there has been a notable increase in component-based model frameworks in the Earth sciences in recent years, there has been less work on creating framework-agnostic metadata and ontologies for model components. Well-defined model component metadata are needed, however, to facilitate sharing, reuse, and interoperability both within and across Earth system modeling frameworks. To address this need, we have designed an ontology for the water resources community, named the Water Resources Component (WRC) ontology, in order to advance the application of component-based modeling frameworks across water-related disciplines. Here we present the design of the WRC ontology and demonstrate its application for the integration of model components used in watershed management. First, we show how the watershed modeling system Soil and Water Assessment Tool (SWAT) can be decomposed into a set of hydrological and ecological components that adopt the Open Modeling Interface (OpenMI) standard. Then, we show how the components can be used to estimate nitrogen losses from land to surface water for the Baltimore Ecosystem Study area. Results of this work are (i) a demonstration of how the WRC ontology advances the conceptual integration between components of water-related disciplines by handling the semantic and syntactic heterogeneity present when describing components from different disciplines and (ii) an investigation of a methodology by which large models can be decomposed into a set of model components that can be well described by populating metadata according to the WRC ontology.
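The component-linking pattern described above (an upstream component's outputs feeding a downstream component's inputs, as in OpenMI-style coupling) can be sketched minimally. The component names, variables, and coefficients below are invented for illustration; they are not taken from SWAT or the WRC ontology.

```python
# Hypothetical sketch of component-based model linking: a hydrology
# component's output feeds an ecology component's input.

class Component:
    def __init__(self, name, compute):
        self.name = name
        self.compute = compute  # maps an input dict to an output dict

def link(upstream, downstream, inputs):
    # run the upstream component, then pass its outputs downstream
    return downstream.compute(upstream.compute(inputs))

hydrology = Component("runoff", lambda d: {"flow": d["rain"] * 0.6})
ecology = Component("nitrogen", lambda d: {"n_load": d["flow"] * 2.0})

result = link(hydrology, ecology, {"rain": 10.0})
```

Because each component exposes only named inputs and outputs, components can be swapped or recombined without changing the linking code, which is the reuse argument the abstract makes.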
Interference thinking in constructing students’ knowledge to solve mathematical problems
NASA Astrophysics Data System (ADS)
Jayanti, W. E.; Usodo, B.; Subanti, S.
2018-04-01
This research aims to describe interference thinking in constructing students' knowledge to solve mathematical problems. Interference thinking in problem solving occurs when a student holds two concepts that interfere with each other. The construction of a solution can be traced using Piaget's assimilation and accommodation framework, which helps reveal the student's thinking structures while solving the problem. The method of this research was qualitative, with a case-study strategy. The data comprise problem-solving results and transcripts of interviews about students' errors in solving the problem. The results focus on a student who experienced proactive interference, in which old information interferes with the ability to recall new information while solving a problem. Interference thinking in constructing knowledge occurs when the student's thinking structures in the assimilation and accommodation process are incomplete. However, after the student was prompted to reflect, the student's thinking process reached an equilibrium condition, even though the result obtained remained wrong.
Initial implementation of a comparative data analysis ontology.
Prosdocimi, Francisco; Chisham, Brandon; Pontelli, Enrico; Thompson, Julie D; Stoltzfus, Arlin
2009-07-03
Comparative analysis is used throughout biology. When entities under comparison (e.g. proteins, genomes, species) are related by descent, evolutionary theory provides a framework that, in principle, allows N-ary comparisons of entities, while controlling for non-independence due to relatedness. Powerful software tools exist for specialized applications of this approach, yet it remains under-utilized in the absence of a unifying informatics infrastructure. A key step in developing such an infrastructure is the definition of a formal ontology. The analysis of use cases and existing formalisms suggests that a significant component of evolutionary analysis involves a core problem of inferring a character history, relying on key concepts: "Operational Taxonomic Units" (OTUs), representing the entities to be compared; "character-state data" representing the observations compared among OTUs; "phylogenetic tree", representing the historical path of evolution among the entities; and "transitions", the inferred evolutionary changes in states of characters that account for observations. Using the Web Ontology Language (OWL), we have defined these and other fundamental concepts in a Comparative Data Analysis Ontology (CDAO). CDAO has been evaluated for its ability to represent token data sets and to support simple forms of reasoning. With further development, CDAO will provide a basis for tools (for semantic transformation, data retrieval, validation, integration, etc.) that make it easier for software developers and biomedical researchers to apply evolutionary methods of inference to diverse types of data, so as to integrate this powerful framework for reasoning into their research.
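The four core CDAO concepts can be sketched as plain data structures: OTUs carrying character-state data, a phylogenetic tree as a parent map, and transitions as state changes along edges. This is a hypothetical illustration only; the tree, states, and ancestral assignments are invented, and ancestral states are given rather than inferred.

```python
# Hypothetical sketch of CDAO's core concepts as plain data structures.
# Leaves ("human", "chimp", "mouse") are the OTUs; internal nodes are
# hypothetical ancestors with pre-assigned states.

parent = {"human": "anc1", "chimp": "anc1", "anc1": "root", "mouse": "root"}
state = {"human": "A", "chimp": "A", "anc1": "A", "mouse": "G", "root": "G"}

# a transition is an edge whose endpoints carry different character states
transitions = [(child, p) for child, p in parent.items()
               if state[child] != state[p]]
```

Here the single inferred change (G to A on the edge above anc1) is exactly the "transition" concept the ontology formalizes: the evolutionary change in a character's state that accounts for the observations at the OTUs.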
Texas two-step: a framework for optimal multi-input single-output deconvolution.
Neelamani, Ramesh; Deffenbaugh, Max; Baraniuk, Richard G
2007-11-01
Multi-input single-output deconvolution (MISO-D) aims to extract a deblurred estimate of a target signal from several blurred and noisy observations. This paper develops a new two-step framework, Texas Two-Step, to solve MISO-D problems with known blurs. Texas Two-Step first reduces the MISO-D problem to a related single-input single-output deconvolution (SISO-D) problem by invoking the concept of sufficient statistics (SSs) and then solves the simpler SISO-D problem using an appropriate technique. The two-step framework enables new MISO-D techniques (both optimal and suboptimal) based on the rich suite of existing SISO-D techniques. In fact, the properties of SSs imply that a MISO-D algorithm is mean-squared-error optimal if and only if it can be rearranged to conform to the Texas Two-Step framework. Using this insight, we construct new wavelet- and curvelet-based MISO-D algorithms with asymptotically optimal performance. Simulated and real-data experiments verify that the framework is indeed effective.
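The reduction step can be sketched numerically. This toy assumes periodic (circular) convolution so the blurs diagonalize in the Fourier domain, and it substitutes a simple Tikhonov-regularized inverse for the paper's wavelet/curvelet SISO estimators; it is a sketch of the two-step idea, not the authors' implementation.

```python
import numpy as np

def blur(x, h):
    # circular (periodic) convolution via FFT; a simplifying assumption
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h, len(x))))

def texas_two_step(observations, blurs, lam=1e-3):
    """Step 1: collapse the MISO-D problem into a SISO-D problem via the
    matched-filter sufficient statistic. Step 2: solve the SISO-D problem
    (here with a Tikhonov-regularized inverse standing in for the paper's
    wavelet/curvelet estimators)."""
    n = len(observations[0])
    H = [np.fft.fft(h, n) for h in blurs]
    Y = [np.fft.fft(y) for y in observations]
    # sufficient statistic S and effective blur G of the reduced problem
    S = sum(np.conj(Hk) * Yk for Hk, Yk in zip(H, Y))
    G = sum(np.abs(Hk) ** 2 for Hk in H)
    return np.real(np.fft.ifft(S / (G + lam)))
```

Combining the observations through the matched filter loses nothing (it is a sufficient statistic), so any good SISO-D solver applied to (S, G) handles the multi-observation problem.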
Assessing Algebraic Solving Ability: A Theoretical Framework
ERIC Educational Resources Information Center
Lian, Lim Hooi; Yew, Wun Thiam
2012-01-01
Algebraic solving ability had been discussed by many educators and researchers. There exists no definite definition for algebraic solving ability as it can be viewed from different perspectives. In this paper, the nature of algebraic solving ability in terms of algebraic processes that demonstrate the ability in solving algebraic problem is…
Computable visually observed phenotype ontological framework for plants
2011-01-01
Background The ability to search for and precisely compare similar phenotypic appearances within and across species has vast potential in plant science and genetic research. The difficulty in doing so lies in the fact that many visual phenotypic data, especially visually observed phenotypes that oftentimes cannot be directly measured quantitatively, are in the form of text annotations, and these descriptions are plagued by semantic ambiguity, heterogeneity, and low granularity. Though several bio-ontologies have been developed to standardize phenotypic (and genotypic) information and permit comparisons across species, these semantic issues persist and prevent precise analysis and retrieval of information. A framework suitable for the modeling and analysis of precise computable representations of such phenotypic appearances is needed. Results We have developed a new framework called the Computable Visually Observed Phenotype Ontological Framework for plants. This work provides a novel quantitative view of descriptions of plant phenotypes that leverages existing bio-ontologies and utilizes a computational approach to capture and represent domain knowledge in a machine-interpretable form. This is accomplished by means of a robust and accurate semantic mapping module that automatically maps high-level semantics to low-level measurements computed from phenotype imagery. The framework was applied to two different plant species, with semantic rules mined and an ontology constructed. Rule quality was evaluated and showed high-quality rules for most semantics. This framework also facilitates automatic annotation of phenotype images and can be adopted by different plant communities to aid in their research. Conclusions The Computable Visually Observed Phenotype Ontological Framework for plants has been developed for more efficient and accurate management of visually observed phenotypes, which play a significant role in plant genomics research. 
The uniqueness of this framework is its ability to bridge the knowledge of informaticians and plant science researchers by translating descriptions of visually observed phenotypes into standardized, machine-understandable representations, thus enabling the development of advanced information retrieval and phenotype annotation analysis tools for the plant science community. PMID:21702966
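The semantic mapping step (translating a low-level measurement computed from an image into a high-level, machine-understandable label) can be sketched with threshold rules. The trait name, thresholds, and labels below are invented examples, not rules mined by the framework.

```python
# Hypothetical sketch of semantic mapping: threshold rules translate a
# low-level image measurement into a high-level semantic phenotype label.

rules = [
    ("leaf width ratio", lambda v: v < 0.2, "narrow leaf"),
    ("leaf width ratio", lambda v: v >= 0.2, "broad leaf"),
]

def annotate(trait, value):
    """Return the semantic label whose rule matches the measured value."""
    for rule_trait, predicate, label in rules:
        if rule_trait == trait and predicate(value):
            return label
    return None
```

Because the labels are standardized terms rather than free text, annotations produced this way can be compared precisely within and across species.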
Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason
2016-01-01
With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to infer analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs. PMID:27983713
Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason
2016-12-15
With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to infer analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs.
Software-engineering challenges of building and deploying reusable problem solvers.
O'Connor, Martin J; Nyulas, Csongor; Tu, Samson; Buckeridge, David L; Okhmatovskaia, Anna; Musen, Mark A
2009-11-01
Problem solving methods (PSMs) are software components that represent and encode reusable algorithms. They can be combined with representations of domain knowledge to produce intelligent application systems. A goal of research on PSMs is to provide principled methods and tools for composing and reusing algorithms in knowledge-based systems. The ultimate objective is to produce libraries of methods that can be easily adapted for use in these systems. Despite the intuitive appeal of PSMs as conceptual building blocks, in practice, these goals are largely unmet. There are no widely available tools for building applications using PSMs and no public libraries of PSMs available for reuse. This paper analyzes some of the reasons for the lack of widespread adoption of PSM techniques and illustrates our analysis by describing our experiences developing a complex, high-throughput software system based on PSM principles. We conclude that many fundamental principles in PSM research are useful for building knowledge-based systems. In particular, the task-method decomposition process, which provides a means for structuring knowledge-based tasks, is a powerful abstraction for building systems of analytic methods. However, despite the power of PSMs in the conceptual modeling of knowledge-based systems, software engineering challenges have been seriously underestimated. The complexity of integrating control knowledge modeled by developers using PSMs with the domain knowledge that they model using ontologies creates a barrier to widespread use of PSM-based systems. Nevertheless, the surge of recent interest in ontologies has led to the production of comprehensive domain ontologies and of robust ontology-authoring tools. These developments present new opportunities to leverage the PSM approach.
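The task-method decomposition the abstract highlights can be sketched in a few lines: a task is either solved directly by a primitive method or decomposed into subtasks that are solved recursively. The task names, methods, and threshold below are invented for illustration; they are not from the authors' system.

```python
# Hypothetical sketch of task-method decomposition: composite tasks name
# their subtasks; primitive tasks map directly to an executable method.

methods = {
    "detect-outbreak": ["aggregate-counts", "test-threshold"],  # decomposition
    "aggregate-counts": sum,                                    # primitive
    "test-threshold": lambda total: total > 100,                # primitive
}

def solve(task, data):
    method = methods[task]
    if callable(method):          # primitive: apply the method directly
        return method(data)
    result = data                 # composite: chain the subtask results
    for subtask in method:
        result = solve(subtask, result)
    return result
```

The decomposition table is the "control knowledge" the abstract refers to: swapping a different primitive into the table changes the behavior without touching the solver.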
Software-engineering challenges of building and deploying reusable problem solvers
O’CONNOR, MARTIN J.; NYULAS, CSONGOR; TU, SAMSON; BUCKERIDGE, DAVID L.; OKHMATOVSKAIA, ANNA; MUSEN, MARK A.
2012-01-01
Problem solving methods (PSMs) are software components that represent and encode reusable algorithms. They can be combined with representations of domain knowledge to produce intelligent application systems. A goal of research on PSMs is to provide principled methods and tools for composing and reusing algorithms in knowledge-based systems. The ultimate objective is to produce libraries of methods that can be easily adapted for use in these systems. Despite the intuitive appeal of PSMs as conceptual building blocks, in practice, these goals are largely unmet. There are no widely available tools for building applications using PSMs and no public libraries of PSMs available for reuse. This paper analyzes some of the reasons for the lack of widespread adoption of PSM techniques and illustrates our analysis by describing our experiences developing a complex, high-throughput software system based on PSM principles. We conclude that many fundamental principles in PSM research are useful for building knowledge-based systems. In particular, the task–method decomposition process, which provides a means for structuring knowledge-based tasks, is a powerful abstraction for building systems of analytic methods. However, despite the power of PSMs in the conceptual modeling of knowledge-based systems, software engineering challenges have been seriously underestimated. The complexity of integrating control knowledge modeled by developers using PSMs with the domain knowledge that they model using ontologies creates a barrier to widespread use of PSM-based systems. Nevertheless, the surge of recent interest in ontologies has led to the production of comprehensive domain ontologies and of robust ontology-authoring tools. These developments present new opportunities to leverage the PSM approach. PMID:23565031
Zheng, Jie; Harris, Marcelline R; Masci, Anna Maria; Lin, Yu; Hero, Alfred; Smith, Barry; He, Yongqun
2016-09-14
Statistics play a critical role in biological and clinical research. However, most reports of scientific results in the published literature make it difficult for the reader to reproduce the statistical analyses performed in achieving those results because they provide inadequate documentation of the statistical tests and algorithms applied. The Ontology of Biological and Clinical Statistics (OBCS) is put forward here as a step towards solving this problem. The terms in OBCS, including 'data collection', 'data transformation in statistics', 'data visualization', 'statistical data analysis', and 'drawing a conclusion based on data', cover the major types of statistical processes used in basic biological research and clinical outcome studies. OBCS is aligned with the Basic Formal Ontology (BFO) and extends the Ontology of Biomedical Investigations (OBI), an OBO (Open Biological and Biomedical Ontologies) Foundry ontology supported by over 20 research communities. Currently, OBCS comprises 878 terms, representing 20 BFO classes, 403 OBI classes, 229 OBCS-specific classes, and 122 classes imported from ten other OBO ontologies. We discuss two examples illustrating how the ontology is being applied. In the first (biological) use case, we describe how OBCS was applied to represent the high-throughput microarray data analysis of immunological transcriptional profiles in human subjects vaccinated with an influenza vaccine. In the second (clinical outcomes) use case, we applied OBCS to represent the processing of electronic health care data to determine the associations between hospital staffing levels and patient mortality. Our case studies were designed to show how OBCS can be used for the consistent representation of statistical analysis pipelines under two different research paradigms. Other ongoing projects using OBCS for statistical data processing are also discussed. The OBCS source code and documentation are available at: https://github.com/obcs/obcs . 
The Ontology of Biological and Clinical Statistics (OBCS) is a community-based open source ontology in the domain of biological and clinical statistics. OBCS is a timely ontology that represents statistics-related terms and their relations in a rigorous fashion, facilitates standard data analysis and integration, and supports reproducible biological and clinical research.
Motivation and Organizational Principles for Anatomical Knowledge Representation
Rosse, Cornelius; Mejino, José L.; Modayur, Bharath R.; Jakobovits, Rex; Hinshaw, Kevin P.; Brinkley, James F.
1998-01-01
Objective: Conceptualization of the physical objects and spaces that constitute the human body at the macroscopic level of organization, specified as a machine-parseable ontology that, in its human-readable form, is comprehensible to both expert and novice users of anatomical information. Design: Conceived as an anatomical enhancement of the UMLS Semantic Network and Metathesaurus, the anatomical ontology was formulated by specifying defining attributes and differentia for classes and subclasses of physical anatomical entities based on their partitive and spatial relationships. The validity of the classification was assessed by instantiating the ontology for the thorax. Several transitive relationships were used for symbolically modeling aspects of the physical organization of the thorax. Results: By declaring Organ as the macroscopic organizational unit of the body, and defining the entities that constitute organs and higher level entities constituted by organs, all anatomical entities could be assigned to one of three top level classes (Anatomical structure, Anatomical spatial entity and Body substance). The ontology accommodates both the systemic and regional (topographical) views of anatomy, as well as diverse clinical naming conventions of anatomical entities. Conclusions: The ontology formulated for the thorax is extendible to microscopic and cellular levels, as well as to other body parts, in that its classes subsume essentially all anatomical entities that constitute the body. Explicit definitions of these entities and their relationships provide the first requirement for standards in anatomical concept representation. Conceived from an anatomical viewpoint, the ontology can be generalized and mapped to other biomedical domains and problem solving tasks that require anatomical knowledge. PMID:9452983
LONI visualization environment.
Dinov, Ivo D; Valentino, Daniel; Shin, Bae Cheol; Konstantinidis, Fotios; Hu, Guogang; MacKenzie-Graham, Allan; Lee, Erh-Fang; Shattuck, David; Ma, Jeff; Schwartz, Craig; Toga, Arthur W
2006-06-01
Over the past decade, the use of informatics to solve complex neuroscientific problems has increased dramatically. Many of these research endeavors involve examining large amounts of imaging, behavioral, genetic, neurobiological, and neuropsychiatric data. Superimposing, processing, visualizing, or interpreting such a complex cohort of datasets frequently becomes a challenge. We developed a new software environment that allows investigators to integrate multimodal imaging data, hierarchical brain ontology systems, on-line genetic and phylogenic databases, and 3D virtual data reconstruction models. The Laboratory of Neuro Imaging visualization environment (LONI Viz) consists of the following components: a sectional viewer for imaging data, an interactive 3D display for surface and volume rendering of imaging data, a brain ontology viewer, and an external database query system. The synchronization of all components according to stereotaxic coordinates, region name, hierarchical ontology, and genetic labels is achieved via a comprehensive BrainMapper functionality, which directly maps between position, structure name, database, and functional connectivity information. This environment is freely available, portable, and extensible, and may prove very useful for neurobiologists, neurogeneticists, brain mappers, and for other clinical, pedagogical, and research endeavors.
Cognitive constraints on high school students' representations of real environmental problems
NASA Astrophysics Data System (ADS)
Barnes, Ervin Kenneth
One class of juniors and seniors was studied through one semester in the investigation of how students think about, learn from, and solve real environmental problems. The intention was to listen to student voices while researching the features of their representations of these problems, the beliefs they held (tenets), the cognitive processes they employed, and the principles of science, ecology, problem solving, and ethics they held as tenets. The focus was upon two self-selected groups as they perceived, engaged, analyzed, and proposed solutions for problems. Analysis of the student representations involved interpretation of the features to include both the perspective tenets and the envisioning processes. These processes included the intentive and attentive constraints as tenet acquisition and volitive and agential constraints as tenet affirmation. The perspective tenets included a variety of conceptual (basic science, ecological, ethical, and problem-solving) constraints as well as ontological, epistemological, and other cultural (role, status, power, and community) constraints. The perspective tenets were interpreted thematically including the ways populations of people cause and care about environmental problems, the magnitude of environmental problems and the science involved, the expectations and limitations students perceive for themselves, and the importance of community awareness and cooperation to addressing these problems. Some of these tenets were interpreted to be principles in that they were rules that were accepted by some people as true. The perspective tenets, along with the envisioning processes, were perceived to be the constraints that determined the environmental problems and limited the solution possibilities. The students thought about environmental problems in mature and principled ways using a repertoire of cognitive processes. They learned from them as they acquired and affirmed tenets. 
They solved them through personal choices and efforts to increase community awareness. The ways students think about, learn from, and solve real environmental problems were all constrained by the perspective tenets (including cultural tenets of role, status, and power) and envisioning processes. It was concluded that students need help from the community to go further in solving these real environmental problems.
Research and simulation of the decoupling transformation in AC motor vector control
NASA Astrophysics Data System (ADS)
He, Jiaojiao; Zhao, Zhongjie; Liu, Ken; Zhang, Yongping; Yao, Tuozhong
2018-04-01
The permanent magnet synchronous motor (PMSM) is a nonlinear, strongly coupled, multivariable plant, and coordinate-transformation decoupling can resolve its coupling problem. This paper presents a mathematical model of the PMSM and introduces the coordinate transformations used in PMSM vector control: by diagonalizing the inductance matrix with a modal matrix, the coupled variables are rendered independent, separating the excitation and torque-producing components of the stator current so that each can be controlled on its own. The corresponding coordinate transformation matrices are derived, providing an approach to the coupling problem of AC motors. Finally, a simulation model of PMSM vector control was built in the Matlab/Simulink environment by combining the PMSM plant model with the coordinate-transformation modules; each part of the model is described and the simulation results are analyzed.
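The decoupling described above hinges on the rotating-frame (Park) coordinate transformation. The sketch below is a minimal amplitude-invariant version in Python; the function name and the numeric example are illustrative, not taken from the paper.

```python
import math

def park_transform(ia, ib, ic, theta):
    """Map three-phase currents (a, b, c) into the rotating d-q frame.

    Amplitude-invariant form: for a balanced three-phase set in step
    with theta, id equals the phase amplitude and iq is zero.
    """
    two_thirds = 2.0 / 3.0
    d = two_thirds * (ia * math.cos(theta)
                      + ib * math.cos(theta - 2.0 * math.pi / 3.0)
                      + ic * math.cos(theta + 2.0 * math.pi / 3.0))
    q = -two_thirds * (ia * math.sin(theta)
                       + ib * math.sin(theta - 2.0 * math.pi / 3.0)
                       + ic * math.sin(theta + 2.0 * math.pi / 3.0))
    return d, q

# A balanced 10 A three-phase set locked to the rotor angle:
theta = 0.7
ia = 10.0 * math.cos(theta)
ib = 10.0 * math.cos(theta - 2.0 * math.pi / 3.0)
ic = 10.0 * math.cos(theta + 2.0 * math.pi / 3.0)
d, q = park_transform(ia, ib, ic, theta)
```

In this frame the d-axis (excitation) and q-axis (torque) components appear as separate DC quantities, which is the decoupled form the vector controller regulates.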
OPPL-Galaxy, a Galaxy tool for enhancing ontology exploitation as part of bioinformatics workflows
2013-01-01
Background Biomedical ontologies are key elements for building up the Life Sciences Semantic Web. Reusing and building biomedical ontologies requires flexible and versatile tools to manipulate them efficiently, in particular for enriching their axiomatic content. The Ontology Pre Processor Language (OPPL) is an OWL-based language for automating the changes to be performed in an ontology. OPPL augments the ontologists’ toolbox by providing a more efficient, and less error-prone, mechanism for enriching a biomedical ontology than that obtained by a manual treatment. Results We present OPPL-Galaxy, a wrapper for using OPPL within Galaxy. The functionality delivered by OPPL (i.e. automated ontology manipulation) can be combined with the tools and workflows devised within the Galaxy framework, resulting in an enhancement of OPPL. Use cases are provided in order to demonstrate OPPL-Galaxy’s capability for enriching, modifying and querying biomedical ontologies. Conclusions Coupling OPPL-Galaxy with other bioinformatics tools of the Galaxy framework results in a system that is more than the sum of its parts. OPPL-Galaxy opens a new dimension of analyses and exploitation of biomedical ontologies, including automated reasoning, paving the way towards advanced biological data analyses. PMID:23286517
The Model Method: Singapore Children's Tool for Representing and Solving Algebraic Word Problems
ERIC Educational Resources Information Center
Ng, Swee Fong; Lee, Kerry
2009-01-01
Solving arithmetic and algebraic word problems is a key component of the Singapore elementary mathematics curriculum. One heuristic taught, the model method, involves drawing a diagram to represent key information in the problem. We describe the model method and a three-phase theoretical framework supporting its use. We conducted 2 studies to…
ERIC Educational Resources Information Center
Kuo, Eric; Hallinen, Nicole R.; Conlin, Luke D.
2017-01-01
One aim of school science instruction is to help students become adaptive problem solvers. Though successful at structuring novice problem solving, step-by-step problem-solving frameworks may also constrain students' thinking. This study utilises a paradigm established by Heckler [(2010). Some consequences of prompting novice physics students to…
Students' Mathematics Word Problem-Solving Achievement in a Computer-Based Story
ERIC Educational Resources Information Center
Gunbas, N.
2015-01-01
The purpose of this study was to investigate the effect of a computer-based story, which was designed in anchored instruction framework, on sixth-grade students' mathematics word problem-solving achievement. Problems were embedded in a story presented on a computer as computer story, and then compared with the paper-based version of the same story…
ENGAGE: A Game Based Learning and Problem Solving Framework
2012-08-15
The multiplayer card game Creature Capture now supports an offline multiplayer mode (sharing a single computer), in response to feedback from teachers that a… Planetopia overworld will be ready for use by a number of physical schools as well as integrated into multiple online teaching resources. The games will be…
Supervision in School Psychology: The Developmental/Ecological/Problem-Solving Model
ERIC Educational Resources Information Center
Simon, Dennis J.; Cruise, Tracy K.; Huber, Brenda J.; Swerdlik, Mark E.; Newman, Daniel S.
2014-01-01
Effective supervision models guide the supervisory relationship and supervisory tasks leading to reflective and purposeful practice. The Developmental/Ecological/Problem-Solving (DEP) Model provides a contemporary framework for supervision specific to school psychology. Designed for the school psychology internship, the DEP Model is also…
Teaming to Teach the Information Problem-Solving Process.
ERIC Educational Resources Information Center
Sine, Lynn; Murphy, Becky
1992-01-01
Explains a problem-solving format developed by a school media specialist and first grade teacher that used the framework of Eisenberg and Berkowitz's "Big Six Skills" for library media programs. The application of the format to a science unit on the senses is described. (two references) (MES)
An investigative framework to facilitate epidemiological thinking during herd problem-solving.
More, Simon J; Doherty, Michael L; O'Grady, Luke
2017-01-01
Veterinary clinicians and students commonly use diagnostic approaches appropriate for individual cases when conducting herd problem-solving. However, these approaches can be problematic, in part because they make limited use of epidemiological principles and methods, which have clear application during the investigation of herd problems. In this paper, we provide an overview of diagnostic approaches that are used when investigating individual animal cases, and the challenges faced when these approaches are directly translated from the individual to the herd. Further, we propose an investigative framework to facilitate epidemiological thinking during herd problem-solving. A number of different approaches are used when making a diagnosis on an individual animal, including pattern recognition, hypothetico-deductive reasoning, and the key abnormality method. Methods commonly applied to individuals are often adapted for herd problem-solving: 'comparison with best practice' being a herd-level adaptation of pattern recognition, and 'differential diagnoses' a herd-level adaptation of hypothetico-deductive reasoning. These approaches can be effective; however, challenges can arise. Herds are complex: a collection of individual cows, but also additional layers relating to environment, management, feeding, etc. It is unrealistic to expect seamless translation of diagnostic approaches from the individual to the herd. Comparison with best practice is time-consuming and prioritisation of actions can be problematic, whereas differential diagnoses can lead to 'pathogen hunting', particularly in complex cases. Epidemiology is the science of understanding disease in populations. The focus is on the population, underpinned by principles and utilising methods that seek to allow us to generate solid conclusions from apparently uncontrolled situations.
In this paper, we argue for the inclusion of epidemiological principles and methods as an additional tool for herd problem-solving, and outline an investigative framework, with examples, to effectively incorporate these principles and methods with other diagnostic approaches during herd problem-solving. Relevant measures of performance are identified, and measures of case frequencies are calculated and compared across time, in space and among animal groupings, to identify patterns, clues and plausible hypotheses, consistent with potential biological processes. With this knowledge, the subsequent investigation (relevant on-farm activities, diagnostic testing and other examinations) can be focused, and actions prioritised (specifically, those actions that are likely to make the greatest difference in addressing the problem if enacted). In our experience, this investigative framework is an effective teaching tool, facilitating epidemiological thinking among students during herd problem-solving. It is a generic and robust process, suited to many herd-based problems.
Insight with hands and things.
Vallée-Tourangeau, Frédéric; Steffensen, Sune Vork; Vallée-Tourangeau, Gaëlle; Sirota, Miroslav
2016-10-01
Two experiments examined whether different task ecologies influenced insight problem solving. The 17 animals problem was employed, a pure insight problem. Its initial formulation encourages the application of a direct arithmetic solution, but its solution requires the spatial arrangement of sets involving some degree of overlap. Participants were randomly allocated to either a tablet condition where they could use a stylus and an electronic tablet to sketch a solution or a model building condition where participants were given material with which to build enclosures and figurines. In both experiments, participants were much more likely to develop a working solution in the model building condition. The difference in performance elicited by different task ecologies was unrelated to individual differences in working memory, actively open-minded thinking, or need for cognition (Experiment 1), although individual differences in creativity were correlated with problem solving success in Experiment 2. The discussion focuses on the implications of these findings for the prevailing metatheoretical commitment to methodological individualism that places the individual as the ontological locus of cognition. Copyright © 2016 Elsevier B.V. All rights reserved.
A Theoretical Framework for Studying Adolescent Contraceptive Use.
ERIC Educational Resources Information Center
Urberg, Kathryn A.
1982-01-01
Presents a theoretical framework for viewing adolescent contraceptive usage. The problem-solving process is used for developmentally examining the competencies that must be present for effective contraceptive use, including: problem recognition, motivation, generation of alternatives, decision making and implementation. Each aspect is discussed…
Neural Meta-Memes Framework for Combinatorial Optimization
NASA Astrophysics Data System (ADS)
Song, Li Qin; Lim, Meng Hiot; Ong, Yew Soon
In this paper, we present a Neural Meta-Memes Framework (NMMF) for combinatorial optimization. NMMF is a framework which models basic optimization algorithms as memes and manages them dynamically when solving combinatorial problems. NMMF encompasses neural networks which serve as the overall planner/coordinator to balance the workload between memes. We show the efficacy of the proposed NMMF through empirical study on a class of combinatorial problems, the quadratic assignment problem (QAP).
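The quadratic assignment problem used as the testbed above can be stated compactly: given a flow matrix F and a distance matrix D, find the permutation p minimizing the sum of F[i][j] * D[p(i)][p(j)] over all pairs (i, j). A minimal brute-force sketch in Python follows; the matrices are made-up toy data, not from the study, and exhaustive search is only viable for very small instances.

```python
from itertools import permutations

def qap_cost(flow, dist, perm):
    """Total cost of assigning facility i to location perm[i]."""
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]]
               for i in range(n) for j in range(n))

def brute_force_qap(flow, dist):
    """Exhaustive search over all assignments (toy-scale only)."""
    n = len(flow)
    return min(permutations(range(n)), key=lambda p: qap_cost(flow, dist, p))

# Hypothetical 3-facility instance: flows between facilities and
# distances between candidate locations.
flow = [[0, 3, 1],
        [3, 0, 2],
        [1, 2, 0]]
dist = [[0, 1, 4],
        [1, 0, 2],
        [4, 2, 0]]
best = brute_force_qap(flow, dist)
```

A metaheuristic framework such as NMMF replaces this exhaustive loop with memes (local searches, perturbations) coordinated across the same cost function.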
Reasoning across Ontologically Distinct Levels: Students' Understandings of Molecular Genetics
ERIC Educational Resources Information Center
Duncan, Ravit Golan; Reiser, Brian J.
2007-01-01
In this article we apply a novel analytical framework to explore students' difficulties in understanding molecular genetics--a domain that is particularly challenging to learn. Our analytical framework posits that reasoning in molecular genetics entails mapping across ontologically distinct levels--an information level containing the genetic…
Modeling patient safety incidents knowledge with the Categorial Structure method.
Souvignet, Julien; Bousquet, Cédric; Lewalle, Pierre; Trombert-Paviot, Béatrice; Rodrigues, Jean Marie
2011-01-01
Following the WHO initiative named World Alliance for Patient Safety (PS), launched in 2004, a conceptual framework developed by PS national reporting experts has summarized the knowledge available. As a second step, the Department of Public Health of the University of Saint Etienne team elaborated a Categorial Structure (a semi-formal structure not related to an upper level ontology) identifying the elements of the semantic structure underpinning the broad concepts contained in the framework for patient safety. This knowledge engineering method has been developed to enable modeling patient safety information as a prerequisite for subsequent full ontology development. The present article describes the semantic dissection of the concepts, the elicitation of the ontology requirements and the domain constraints of the conceptual framework. This ontology includes 134 concepts and 25 distinct relations and will serve as a basis for an Information Model for Patient Safety.
Cultivating Peace through Design Thinking: Problem Solving with PAST Foundation
ERIC Educational Resources Information Center
Deaner, Kat; McCreery-Kellert, Heather
2018-01-01
Design thinking is a methodology that emphasizes reasoning and decision-making as part of the problem-solving process. It is a structured framework for identifying challenges, gathering information, generating potential solutions, refining ideas, and testing solutions. Design thinking offers valuable skills that will serve students well as they…
Assessment for Intervention: A Problem-Solving Approach
ERIC Educational Resources Information Center
Brown-Chidsey, Rachel, Ed.
2005-01-01
This cutting-edge volume offers a complete primer on conducting problem-solving based assessments in school or clinical settings. Presented are an effective framework and up-to-date tools for identifying and remediating the many environmental factors that may contribute to a student's academic, emotional, or behavioral difficulties, and for…
Including Critical Thinking and Problem Solving in Physical Education
ERIC Educational Resources Information Center
Pill, Shane; SueSee, Brendan
2017-01-01
Many physical education curriculum frameworks include statements about the inclusion of critical inquiry processes and the development of creativity and problem-solving skills. The learning environment created by physical education can encourage or limit the application and development of the learners' cognitive resources for critical and creative…
The Relationship of Drawing and Mathematical Problem Solving: "Draw for Math" Tasks
ERIC Educational Resources Information Center
Edens, Kellah; Potter, Ellen
2007-01-01
This study examines a series of children's drawings ("Draw for Math" tasks) to determine the relationship of students' spatial understanding and mathematical problem solving. Level of spatial understanding was assessed by applying the framework of central conceptual structures suggested by Case (1996), a cognitive developmental researcher.…
ERIC Educational Resources Information Center
Eseryel, Deniz; Ge, Xun; Ifenthaler, Dirk; Law, Victor
2011-01-01
Following a design-based research framework, this article reports two empirical studies with an educational MMOG, called "McLarin's Adventures," on facilitating 9th-grade students' complex problem-solving skill acquisition in interdisciplinary STEM education. The article discusses the nature of complex and ill-structured problem solving…
ERIC Educational Resources Information Center
Belland, Brian R.
2011-01-01
Problem solving is an important skill in the knowledge economy. Research indicates that the development of problem solving skills works better in the context of instructional approaches centered on real-world problems. But students need scaffolding to be successful in such instruction. In this paper I present a conceptual framework for…
Instance-Based Ontology Matching for Open and Distance Learning Materials
ERIC Educational Resources Information Center
Cerón-Figueroa, Sergio; López-Yáñez, Itzamá; Villuendas-Rey, Yenny; Camacho-Nieto, Oscar; Aldape-Pérez, Mario; Yáñez-Márquez, Cornelio
2017-01-01
The present work describes an original associative model of pattern classification and its application to align different ontologies containing Learning Objects (LOs), which are in turn related to Open and Distance Learning (ODL) educative content. The problem of aligning ontologies is known as Ontology Matching Problem (OMP), whose solution is…
Structural design using equilibrium programming formulations
NASA Technical Reports Server (NTRS)
Scotti, Stephen J.
1995-01-01
Solutions to increasingly larger structural optimization problems are desired. However, computational resources are strained to meet this need. New methods will be required to solve increasingly larger problems. The present approaches to solving large-scale problems involve approximations for the constraints of structural optimization problems and/or decomposition of the problem into multiple subproblems that can be solved in parallel. An area of game theory, equilibrium programming (also known as noncooperative game theory), can be used to unify these existing approaches from a theoretical point of view (considering the existence and optimality of solutions), and be used as a framework for the development of new methods for solving large-scale optimization problems. Equilibrium programming theory is described, and existing design techniques such as fully stressed design and constraint approximations are shown to fit within its framework. Two new structural design formulations are also derived. The first new formulation is another approximation technique which is a general updating scheme for the sensitivity derivatives of design constraints. The second new formulation uses a substructure-based decomposition of the structure for analysis and sensitivity calculations. Significant computational benefits of the new formulations compared with a conventional method are demonstrated.
Parikh, Priti P; Minning, Todd A; Nguyen, Vinh; Lalithsena, Sarasi; Asiaee, Amir H; Sahoo, Satya S; Doshi, Prashant; Tarleton, Rick; Sheth, Amit P
2012-01-01
Research on the biology of parasites requires a sophisticated and integrated computational platform to query and analyze large volumes of data, representing both unpublished (internal) and public (external) data sources. Effective analysis of an integrated data resource using knowledge discovery tools would significantly aid biologists in conducting their research, for example, through identifying various intervention targets in parasites and in deciding the future direction of ongoing as well as planned projects. A key challenge in achieving this objective is the heterogeneity between the internal lab data, usually stored as flat files, Excel spreadsheets or custom-built databases, and the external databases. Reconciling the different forms of heterogeneity and effectively integrating data from disparate sources is a nontrivial task for biologists and requires a dedicated informatics infrastructure. Thus, we developed an integrated environment using Semantic Web technologies that may provide biologists the tools for managing and analyzing their data, without the need for acquiring in-depth computer science knowledge. We developed a semantic problem-solving environment (SPSE) that uses ontologies to integrate internal lab data with external resources in a Parasite Knowledge Base (PKB), which has the ability to query across these resources in a unified manner. The SPSE includes Web Ontology Language (OWL)-based ontologies, experimental data with its provenance information represented using the Resource Description Framework (RDF), and a visual querying tool, Cuebee, that features integrated use of Web services. We demonstrate the use and benefit of SPSE using example queries for identifying gene knockout targets of Trypanosoma cruzi for vaccine development. Answers to these queries involve looking up multiple sources of data, linking them together and presenting the results.
The SPSE facilitates parasitologists in leveraging the growing, but disparate, parasite data resources by offering an integrative platform that utilizes Semantic Web techniques, while keeping their workload increase minimal.
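The unified querying that SPSE provides is delivered through SPARQL over RDF triples; the pure-Python toy below mimics the core idea of joining triple patterns across integrated data. All identifiers, gene names, and property names here are hypothetical, not drawn from the actual Parasite Knowledge Base.

```python
# Toy triple store illustrating a unified query over integrated data.
triples = {
    ("gene:Tc00.1", "rdf:type", "pkb:Gene"),
    ("gene:Tc00.1", "pkb:expressedIn", "stage:amastigote"),
    ("gene:Tc00.1", "pkb:hasKnockoutPhenotype", "pheno:attenuated"),
    ("gene:Tc00.2", "rdf:type", "pkb:Gene"),
    ("gene:Tc00.2", "pkb:expressedIn", "stage:epimastigote"),
}

def match(pattern):
    """Return triples matching a (subject, predicate, object) pattern;
    None acts as a wildcard, like a variable in a SPARQL pattern."""
    s, p, o = pattern
    return [t for t in sorted(triples)
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# "Which genes are expressed in the amastigote stage AND attenuate the
# parasite when knocked out?" -- a join over two triple patterns.
expressed = {s for s, _, _ in match((None, "pkb:expressedIn", "stage:amastigote"))}
attenuated = {s for s, _, _ in match((None, "pkb:hasKnockoutPhenotype", "pheno:attenuated"))}
candidates = sorted(expressed & attenuated)
```

In SPSE the same join is expressed declaratively in SPARQL and answered across multiple federated sources rather than a single in-memory set.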
NASA Astrophysics Data System (ADS)
Gómez A, Héctor F.; Martínez-Tomás, Rafael; Arias Tapia, Susana A.; Rincón Zamorano, Mariano
2014-04-01
Automatic systems that monitor human behaviour for detecting security problems are a challenge today. Previously, our group defined the Horus framework, which is a modular architecture for the integration of multi-sensor monitoring stages. In this work, the structure and technologies required for the high-level semantic stages of Horus are proposed, and the associated methodological principles are established with the aim of recognising specific behaviours and situations. Our methodology distinguishes three semantic levels of events: low level (compromised with sensors), medium level (compromised with context), and high level (target behaviours). The ontology for surveillance and ubiquitous computing has been used to integrate ontologies from specific domains and, together with semantic technologies, has facilitated the modelling and implementation of scenes and situations by reusing components. A home context and a supermarket context were modelled following this approach, where three suspicious activities were monitored via different virtual sensors. The experiments demonstrate that our proposals facilitate the rapid prototyping of this kind of system.
Ethier, J-F; Curcin, V; Barton, A; McGilchrist, M M; Bastiaens, H; Andreasson, A; Rossiter, J; Zhao, L; Arvanitis, T N; Taweel, A; Delaney, B C; Burgun, A
2015-01-01
This article is part of the Focus Theme of METHODS of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". Primary care data is the single richest source of routine health care data. However its use, both in research and clinical work, often requires data from multiple clinical sites, clinical trials databases and registries. Data integration and interoperability are therefore of utmost importance. TRANSFoRm's general approach relies on a unified interoperability framework, described in a previous paper. We developed a core ontology for an interoperability framework based on data mediation. This article presents how such an ontology, the Clinical Data Integration Model (CDIM), can be designed to support, in conjunction with appropriate terminologies, biomedical data federation within TRANSFoRm, an EU FP7 project that aims to develop the digital infrastructure for a learning healthcare system in European Primary Care. TRANSFoRm utilizes a unified structural/terminological interoperability framework, based on the local-as-view mediation paradigm. Such an approach mandates the global information model to describe the domain of interest independently of the data sources to be explored. Following a requirement analysis process, no ontology focusing on primary care research was identified, and thus we designed a realist ontology based on Basic Formal Ontology to support our framework in collaboration with various terminologies used in primary care. The resulting ontology has 549 classes and 82 object properties and is used to support data integration for TRANSFoRm's use cases. Concepts identified by researchers were successfully expressed in queries using CDIM and pertinent terminologies. As an example, we illustrate how, in TRANSFoRm, the Query Formulation Workbench can capture eligibility criteria in a computable representation, which is based on CDIM.
A unified mediation approach to semantic interoperability provides a flexible and extensible framework for all types of interaction between health record systems and research systems. CDIM, as core ontology of such an approach, enables simplicity and consistency of design across the heterogeneous software landscape and can support the specific needs of EHR-driven phenotyping research using primary care data.
NASA Astrophysics Data System (ADS)
Hoehn, Jessica R.; Finkelstein, Noah D.
2018-06-01
As part of a research study on student reasoning in quantum mechanics, we examine students' use of ontologies, or the way students' categorically organize entities they are reasoning about. In analyzing three episodes of focus group discussions with modern physics students, we present evidence of the dynamic nature of ontologies, and refine prior theoretical frameworks for thinking about dynamic ontologies. We find that in a given reasoning episode ontologies can be dynamic in construction (referring to when the reasoner constructs the ontologies) or application (referring to which ontologies are applied in a given reasoning episode). In our data, we see instances of students flexibly switching back and forth between parallel stable structures as well as constructing and negotiating new ontologies in the moment. Methodologically, we use a collective conceptual blending framework as an analytic tool for capturing student reasoning in groups. In this research, we value the messiness of student reasoning and argue that reasoning in a tentative manner can be productive for students learning quantum mechanics. As such, we shift away from a binary view of student learning which sees students as either having the correct answer or not.
Automated Database Mediation Using Ontological Metadata Mappings
Marenco, Luis; Wang, Rixin; Nadkarni, Prakash
2009-01-01
Objective To devise an automated approach for integrating federated database information using database ontologies constructed from their extended metadata. Background One challenge of database federation is that the granularity of representation of equivalent data varies across systems. Dealing effectively with this problem is analogous to dealing with precoordinated vs. postcoordinated concepts in biomedical ontologies. Model Description The authors describe an approach based on ontological metadata mapping rules defined with elements of a global vocabulary, which allows a query specified at one granularity level to fetch data, where possible, from databases within the federation that use different granularities. This is implemented in OntoMediator, a newly developed production component of our previously described Query Integrator System. OntoMediator's operation is illustrated with a query that accesses three geographically separate, interoperating databases. An example based on SNOMED also illustrates the applicability of high-level rules to support the enforcement of constraints that can prevent inappropriate curator or power-user actions. Summary A rule-based framework simplifies the design and maintenance of systems where categories of data must be mapped to each other, for the purpose of either cross-database query or for curation of the contents of compositional controlled vocabularies. PMID:19567801
Alignment of the UMLS semantic network with BioTop: methodology and assessment.
Schulz, Stefan; Beisswanger, Elena; van den Hoek, László; Bodenreider, Olivier; van Mulligen, Erik M
2009-06-15
For many years, the Unified Medical Language System (UMLS) semantic network (SN) has been used as an upper-level semantic framework for the categorization of terms from terminological resources in biomedicine. BioTop has recently been developed as an upper-level ontology for the biomedical domain. In contrast to the SN, it is founded upon strict ontological principles, using OWL DL as a formal representation language, which has become standard in the semantic Web. In order to make logic-based reasoning available for the resources annotated or categorized with the SN, a mapping ontology was developed aligning the SN with BioTop. The theoretical foundations and the practical realization of the alignment are described, with a focus on the design decisions taken, the problems encountered, and the adaptations of BioTop that became necessary. For evaluation purposes, UMLS concept pairs obtained from MEDLINE abstracts by a named entity recognition system were tested for possible semantic relationships. Furthermore, all semantic-type combinations that occur in the UMLS Metathesaurus were checked for satisfiability. The effort-intensive alignment process required major design changes and enhancements of BioTop and brought up several design errors that could be fixed. A comparison between a human curator and the ontology yielded only low agreement. Ontology reasoning was also used to successfully identify 133 inconsistent semantic-type combinations. BioTop, the OWL DL representation of the UMLS SN, and the mapping ontology are available at http://www.purl.org/biotop/.
Quality assurance of the gene ontology using abstraction networks.
Ochs, Christopher; Perl, Yehoshua; Halper, Michael; Geller, James; Lomax, Jane
2016-06-01
The gene ontology (GO) is used extensively in the field of genomics. Like other large and complex ontologies, quality assurance (QA) efforts for GO's content can be laborious and time consuming. Abstraction networks (AbNs) are summarization networks that reveal and highlight high-level structural and hierarchical aggregation patterns in an ontology. They have been shown to successfully support QA work in the context of various ontologies. Two kinds of AbNs, called the area taxonomy and the partial-area taxonomy, are developed for GO hierarchies and derived specifically for the biological process (BP) hierarchy. Within this framework, several QA heuristics, based on the identification of groups of anomalous terms which exhibit certain taxonomy-defined characteristics, are introduced. Such groups are expected to have higher error rates when compared to other terms. Thus, by focusing QA efforts on anomalous terms one would expect to find relatively more erroneous content. By automatically identifying these potential problem areas within an ontology, time and effort will be saved during manual reviews of GO's content. BP is used as a testbed, with samples of three kinds of anomalous BP terms chosen for a taxonomy-based QA review. Additional heuristics for QA are demonstrated. From the results of this QA effort, it is observed that different kinds of inconsistencies in the modeling of GO can be exposed with the use of the proposed heuristics. For comparison, the results of QA work on a sample of terms chosen from GO's general population are presented.
Imam, Fahim T.; Larson, Stephen D.; Bandrowski, Anita; Grethe, Jeffery S.; Gupta, Amarnath; Martone, Maryann E.
2012-01-01
An initiative of the NIH Blueprint for neuroscience research, the Neuroscience Information Framework (NIF) project advances neuroscience by enabling discovery and access to public research data and tools worldwide through an open source, semantically enhanced search portal. One of the critical components of the overall NIF system, the NIF Standardized Ontologies (NIFSTD), provides an extensive collection of standard neuroscience concepts along with their synonyms and relationships. The knowledge models defined in the NIFSTD ontologies enable an effective concept-based search over heterogeneous types of web-accessible information entities in NIF’s production system. NIFSTD covers major domains in neuroscience, including diseases, brain anatomy, cell types, sub-cellular anatomy, small molecules, techniques, and resource descriptors. Since the first production release in 2008, NIF has grown significantly in content and functionality, particularly with respect to the ontologies and ontology-based services that drive the NIF system. We present here the structure, design principles, community engagement, and current state of the NIFSTD ontologies. PMID:22737162
Tutorial on Protein Ontology Resources
Arighi, Cecilia; Drabkin, Harold; Christie, Karen R.; Ross, Karen; Natale, Darren
2017-01-01
The Protein Ontology (PRO) is the reference ontology for proteins in the Open Biomedical Ontologies (OBO) foundry and consists of three sub-ontologies representing protein classes of homologous genes, proteoforms (e.g., splice isoforms, sequence variants, and post-translationally modified forms), and protein complexes. PRO defines classes of proteins and protein complexes, both species-specific and species non-specific, and indicates their relationships in a hierarchical framework, supporting accurate protein annotation at the appropriate level of granularity, analyses of protein conservation across species, and semantic reasoning. In the first section of this chapter, we describe the PRO framework, including categories of PRO terms and the relationship of PRO to other ontologies and protein resources. Next, we provide a tutorial about the PRO website (proconsortium.org), where users can browse and search the PRO hierarchy, view reports on individual PRO terms, and visualize relationships among PRO terms in a hierarchical table view, a multiple sequence alignment view, and a Cytoscape network view. Finally, we describe several examples illustrating the unique and rich information available in PRO. PMID:28150233
KaBOB: ontology-based semantic integration of biomedical databases.
Livingston, Kevin M; Bada, Michael; Baumgartner, William A; Hunter, Lawrence E
2015-04-23
The ability to query many independent biological databases using a common ontology-based semantic model would facilitate deeper integration and more effective utilization of these diverse and rapidly growing resources. Despite ongoing work moving toward shared data formats and linked identifiers, significant problems persist in semantic data integration in order to establish shared identity and shared meaning across heterogeneous biomedical data sources. We present five processes for semantic data integration that, when applied collectively, solve seven key problems. These processes include making explicit the differences between biomedical concepts and database records, aggregating sets of identifiers denoting the same biomedical concepts across data sources, and using declaratively represented forward-chaining rules to take information that is variably represented in source databases and integrate it into a consistent biomedical representation. We demonstrate these processes and solutions by presenting KaBOB (the Knowledge Base Of Biomedicine), a knowledge base of semantically integrated data from 18 prominent biomedical databases using common representations grounded in Open Biomedical Ontologies. An instance of KaBOB with data about humans and seven major model organisms can be built using on the order of 500 million RDF triples. All source code for building KaBOB is available under an open-source license. KaBOB is an integrated knowledge base of biomedical data representationally based in prominent, actively maintained Open Biomedical Ontologies, thus enabling queries of the underlying data in terms of biomedical concepts (e.g., genes and gene products, interactions and processes) rather than features of source-specific data schemas or file formats.
KaBOB resolves many of the issues that routinely plague biomedical researchers intending to work with data from multiple data sources and provides a platform for ongoing data integration and development and for formal reasoning over a wealth of integrated biomedical data.
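One of the processes described above, aggregating sets of identifiers that denote the same biomedical concept across data sources, can be pictured as a union-find over cross-references. The sketch below is only an illustration of that idea, not KaBOB's actual code or data model; the identifier strings and cross-reference pairs are hypothetical examples.

```python
# Hedged sketch: union-find over cross-database identifiers, in the spirit of
# KaBOB's identifier-aggregation process. All identifiers below are invented
# for illustration, not drawn from KaBOB itself.

class IdentifierSets:
    """Group identifiers asserted to denote the same concept."""

    def __init__(self):
        self.parent = {}

    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def assert_same_concept(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.parent[ra] = rb

    def concept_set(self, x):
        root = self.find(x)
        return {i for i in self.parent if self.find(i) == root}

ids = IdentifierSets()
# Cross-references as they might appear in source records (hypothetical):
ids.assert_same_concept("uniprot:P04637", "hgnc:11998")
ids.assert_same_concept("hgnc:11998", "ncbigene:7157")
print(sorted(ids.concept_set("uniprot:P04637")))
```

Querying any member of the set then yields the full group of identifiers, which is what lets downstream rules treat the records as one biomedical concept.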
Using Ontological Engineering to Overcome AI-ED Problems: Contribution, Impact and Perspectives
ERIC Educational Resources Information Center
Mizoguchi, Riichiro; Bourdeau, Jacqueline
2016-01-01
This article reflects on the ontology engineering methodology discussed by the paper entitled "Using Ontological Engineering to Overcome AI-ED Problems" published in this journal in 2000. We discuss the achievements obtained in the last 10 years, the impact of our work as well as recent trends and perspectives in ontology engineering for…
Reasoning about Resources and Hierarchical Tasks Using OWL and SWRL
NASA Astrophysics Data System (ADS)
Elenius, Daniel; Martin, David; Ford, Reginald; Denker, Grit
Military training and testing events are highly complex affairs, potentially involving dozens of legacy systems that need to interoperate in a meaningful way. There are superficial interoperability concerns (such as two systems not sharing the same messaging formats), but also substantive problems such as different systems not sharing the same understanding of the terrain, positions of entities, and so forth. We describe our approach to facilitating such events: describe the systems and requirements in great detail using ontologies, and use automated reasoning to automatically find and help resolve problems. The complexity of our problem took us to the limits of what one can do with OWL, and we needed to introduce some innovative techniques for using and extending it. We describe our novel ways of using SWRL and discuss its limitations as well as extensions to it that we found necessary or desirable. Another innovation is our representation of hierarchical tasks in OWL, and an engine that reasons about them. Our task ontology has proved to be a very flexible and expressive framework for describing requirements on resources and their capabilities in order to achieve some purpose.
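The paper's reasoning is done with OWL, SWRL, and a real reasoner; as a rough illustration of what a single SWRL-style rule accomplishes, the sketch below applies one hand-written rule ("if a task requires a capability and a resource provides it, the task can use that resource") by naive forward chaining over a toy triple set. All task, resource, and capability names here are invented, and this is not the authors' engine.

```python
# Illustrative sketch only: a single SWRL-like rule applied by naive forward
# chaining to a fixpoint. Names are hypothetical, not from the paper.

def forward_chain(triples):
    """Derive (task, canUse, resource) from requires/provides triples."""
    triples = set(triples)
    changed = True
    while changed:
        changed = False
        derived = {(t, "canUse", r)
                   for (t, p1, c1) in triples if p1 == "requires"
                   for (r, p2, c2) in triples if p2 == "provides" and c1 == c2}
        if not derived <= triples:
            triples |= derived
            changed = True
    return triples

kb = [("TrainingEvent1", "requires", "TerrainModel"),
      ("SimSystemA", "provides", "TerrainModel")]
derived = forward_chain(kb)
print(("TrainingEvent1", "canUse", "SimSystemA") in derived)
```

A real SWRL engine generalizes this pattern to arbitrary Horn-like rules over OWL individuals, with the reasoner handling variable binding and consistency checking.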
A Cognitive Simulator for Learning the Nature of Human Problem Solving
NASA Astrophysics Data System (ADS)
Miwa, Kazuhisa
Problem solving is understood as a process through which states of problem solving are transferred from the initial state to the goal state by applying adequate operators. Within this framework, knowledge and strategies are given as operators for the search. One of the most important points of researchers' interest in the domain of problem solving is to explain the performance of problem solving behavior based on the knowledge and strategies that the problem solver has. We call the interplay between problem solvers' knowledge/strategies and their behavior the causal relation between mental operations and behavior. It is crucially important, we believe, for novice learners in this domain to understand the causal relation between mental operations and behavior. Based on this insight, we have constructed a learning system in which learners can control the mental operations of a computational agent that solves a task, such as its knowledge, heuristics, and cognitive capacity, and can observe its behavior. We also introduce this system in a university class, and discuss which findings were discovered by the participants.
ERIC Educational Resources Information Center
Swanson, H. Lee
1982-01-01
An information processing approach to the assessment of learning disabled students' intellectual performance is presented. The model is based on the assumption that intelligent behavior is comprised of a variety of problem- solving strategies. An account of child problem solving is explained and illustrated with a "thinking aloud" protocol.…
Structuring Video Cases to Support Future Teachers' Problem Solving
ERIC Educational Resources Information Center
Kale, Ugur; Whitehouse, Pamela
2012-01-01
This study examined preservice teachers' problem-solving skills through the use of an online video case study. Eighty preservice teachers participated in the study with a three-level video presentation by a two-grade-level between-subjects factorial design. The study incorporates a content analysis framework to examine both the components and the…
ERIC Educational Resources Information Center
Artzt, Alice F.; Armour-Thomas, Eleanor
1998-01-01
Uses a "teaching as problem solving" perspective to examine the components of metacognition underlying the instructional practice of seven experienced and seven beginning secondary-school mathematics teachers. Data analysis of observations, lesson plans, videotapes, and audiotapes of structured interviews suggests that the metacognition of…
Problem Solving Learning Environments and Assessment: A Knowledge Space Theory Approach
ERIC Educational Resources Information Center
Reimann, Peter; Kickmeier-Rust, Michael; Albert, Dietrich
2013-01-01
This paper explores the relation between problem solving learning environments (PSLEs) and assessment concepts. The general framework of evidence-centered assessment design is used to describe PSLEs in terms of assessment concepts, and to identify similarities between the process of assessment design and of PSLE design. We use a recently developed…
Possibilities: A Framework for Modeling Students' Deductive Reasoning in Physics
ERIC Educational Resources Information Center
Gaffney, Jonathan David Housley
2010-01-01
Students often make errors when trying to solve qualitative or conceptual physics problems, and while many successful instructional interventions have been generated to prevent such errors, the process of deduction that students use when solving physics problems has not been thoroughly studied. In an effort to better understand that reasoning…
A Problem-Solving Template for Integrating Qualitative and Quantitative Physics Instruction
ERIC Educational Resources Information Center
Fink, Janice M.; Mankey, Gary J.
2010-01-01
A problem-solving template enables a methodology of instruction that integrates aspects of both sequencing and conceptual learning. It is designed to enhance critical-thinking skills when used within the framework of a learner-centered approach to teaching, where regular, thorough assessments of student learning are key components of the…
MAUVE: A New Strategy for Solving and Grading Physics Problems
ERIC Educational Resources Information Center
Hill, Nicole Breanne
2016-01-01
MAUVE (magnitude, answer, units, variables, and equations) is a framework and rubric to help students and teachers through the process of clearly solving and assessing solutions to introductory physics problems. Success in introductory physics often derives from an understanding of units, a command over dimensional analysis, and good bookkeeping.…
ERIC Educational Resources Information Center
De Corte, Erik; Verschaffel, Lieven; Masui, Chris
2004-01-01
A major challenge for education and educational research is to build on our present understanding of learning for designing environments for education that are conducive to fostering in students self-regulatory and cooperative learning skills, transferable knowledge, and a disposition toward competent thinking and problem solving. Taking into…
NASA Astrophysics Data System (ADS)
Kelly, Regina; McLoughlin, Eilish; Finlayson, Odilla E.
2016-07-01
An interdisciplinary science course has been implemented at a university with the intention of providing students the opportunity to develop a range of key skills in relation to: real-world connections of science, problem-solving, information and communications technology use, and teamwork, while linking subject knowledge in each of the science disciplines. One of the problems used in this interdisciplinary course has been selected to evaluate whether it affords students the opportunity to explicitly display problem-solving processes. While the benefits of implementing problem-based learning have been well reported, far less research has been devoted to methods of assessing student problem-solving solutions. A problem-solving theoretical framework was used as a tool to assess student written solutions to indicate if problem-solving processes were present. In two academic years, student problem-solving processes were satisfactory for exploring and understanding, representing and formulating, and planning and executing, indicating that student collaboration on problems is a good initiator of developing these processes. In both academic years, students displayed poor monitoring and reflecting (MR) processes at the intermediate level. A key impact of evaluating student work in this way is that it facilitated meaningful feedback about the students' problem-solving process rather than solely assessing the correctness of problem solutions.
ERIC Educational Resources Information Center
Hickendorff, Marian
2013-01-01
The results of an exploratory study into measurement of elementary mathematics ability are presented. The focus is on the abilities involved in solving standard computation problems on the one hand and problems presented in a realistic context on the other. The objectives were to assess to what extent these abilities are shared or distinct, and…
Solving Word Problems using Schemas: A Review of the Literature
Powell, Sarah R.
2011-01-01
Solving word problems is a difficult task for students at-risk for or with learning disabilities (LD). One instructional approach that has emerged as a valid method for helping students at-risk for or with LD to become more proficient at word-problem solving is using schemas. A schema is a framework for solving a problem. With a schema, students are taught to recognize problems as falling within word-problem types and to apply a problem solution method that matches that problem type. This review highlights two schema approaches for 2nd- and 3rd-grade students at-risk for or with LD: schema-based instruction and schema-broadening instruction. A total of 12 schema studies were reviewed and synthesized. Both types of schema approaches enhanced the word-problem skill of students at-risk for or with LD. Based on the review, suggestions are provided for incorporating word-problem instruction using schemas. PMID:21643477
Self-aligned quadruple patterning-compliant placement
NASA Astrophysics Data System (ADS)
Nakajima, Fumiharu; Kodama, Chikaaki; Nakayama, Koichi; Nojima, Shigeki; Kotani, Toshiya
2015-03-01
Self-Aligned Quadruple Patterning (SAQP) will be one of the leading candidates for the sub-14nm node and beyond. However, compared with triple patterning, making a feasible standard cell placement has the following problems. (1) When coloring conflicts occur between two adjoining cells, they may not be solved easily, since an SAQP layout has stronger coloring constraints. (2) An SAQP layout cannot use stitches to solve coloring conflicts. In this paper, we present a framework for SAQP-aware standard cell placement considering the above problems. When a standard cell is placed, the proposed method tries to solve coloring conflicts between two cells by exchanging two of the three colors. If some conflicts remain between adjoining cells, dummy space will be inserted to keep the coloring constraints of SAQP. We show some examples to confirm the effectiveness of the proposed framework. To the best of our knowledge, this is the first framework for SAQP-aware standard cell placement.
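The conflict-resolution step described above (exchange two of the three colors, else insert dummy space) can be sketched abstractly as follows. This is only a toy model under stated assumptions, not the authors' placer: cells are reduced to per-track edge color lists, a "conflict" is modeled as equal colors on abutting tracks, and the color values are invented.

```python
# Hedged sketch of SAQP-style conflict resolution between adjoining cells:
# try recolorings that exchange two of the three colors on the incoming cell;
# if none removes all boundary conflicts, fall back to inserting dummy space.
# The edge-color model here is a simplification invented for illustration.

def resolve(prev_right, cell_left, cell_right):
    """Return (mapped_left, mapped_right, dummy_needed) for the new cell."""
    # Identity (no exchange) plus the three pairwise color exchanges of {0,1,2}.
    mappings = [{0: 0, 1: 1, 2: 2},
                {0: 1, 1: 0, 2: 2},
                {0: 0, 1: 2, 2: 1},
                {0: 2, 1: 1, 2: 0}]
    for m in mappings:
        mapped_left = [m[c] for c in cell_left]
        if all(a != b for a, b in zip(prev_right, mapped_left)):
            return mapped_left, [m[c] for c in cell_right], False
    # No exchange resolves the conflicts: keep colors, insert dummy space.
    return cell_left, cell_right, True

# Hypothetical example: three routing tracks at the cell boundary.
mapped_l, mapped_r, dummy = resolve([0, 1, 2], [0, 2, 1], [1, 0, 2])
print(mapped_l, dummy)
```

In the example, the identity mapping conflicts on the first track, but exchanging colors 0 and 1 clears the boundary, so no dummy space is needed.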
Knowledge Discovery from Biomedical Ontologies in Cross Domains.
Shen, Feichen; Lee, Yugyung
2016-01-01
In recent years, there has been an increasing demand for sharing and integration of medical data in biomedical research. In order to improve a health care system, it is necessary to support the integration of data by facilitating semantic interoperability systems and practices. Semantic interoperability is difficult to achieve in these systems, as the conceptual models underlying datasets are not fully exploited. In this paper, we propose a semantic framework, called Medical Knowledge Discovery and Data Mining (MedKDD), that aims to build a topic hierarchy and serve the semantic interoperability between different ontologies. To this end, we focus on the discovery of semantic patterns about the association of relations in the heterogeneous information network representing different types of objects and relationships in multiple biological ontologies, and on the creation of a topic hierarchy through the analysis of the discovered patterns. These patterns are used to cluster heterogeneous information networks into a set of smaller topic graphs in a hierarchical manner and then to conduct cross domain knowledge discovery from the multiple biological ontologies. Thus, the patterns made a greater contribution to knowledge discovery across multiple ontologies. We have demonstrated cross domain knowledge discovery in the MedKDD framework using a case study with 9 primary biological ontologies from Bio2RDF and compared it with the cross domain query processing approach, namely SLAP. We have confirmed the effectiveness of the MedKDD framework in knowledge discovery from multiple medical ontologies. PMID:27548262
Interoperability between phenotype and anatomy ontologies.
Hoehndorf, Robert; Oellrich, Anika; Rebholz-Schuhmann, Dietrich
2010-12-15
Phenotypic information is important for the analysis of the molecular mechanisms underlying disease. A formal ontological representation of phenotypic information can help to identify, interpret and infer phenotypic traits based on experimental findings. The methods that are currently used to represent data and information about phenotypes fail to make the semantics of the phenotypic trait explicit and do not interoperate with ontologies of anatomy and other domains. Therefore, valuable resources for the analysis of phenotype studies remain unconnected and inaccessible to automated analysis and reasoning. We provide a framework to formalize phenotypic descriptions and make their semantics explicit. Based on this formalization, we provide the means to integrate phenotypic descriptions with ontologies of other domains, in particular anatomy and physiology. We demonstrate how our framework leads to the capability to represent disease phenotypes, perform powerful queries that were not possible before and infer additional knowledge. http://bioonto.de/pmwiki.php/Main/PheneOntology.
Robot, computer problem solving system
NASA Technical Reports Server (NTRS)
Becker, J. D.
1972-01-01
The development of a computer problem solving system is reported that considers physical problems faced by an artificial robot moving around in a complex environment. Fundamental interaction constraints with a real environment are simulated for the robot by visual scan and creation of an internal environmental model. The programming system used in constructing the problem solving system for the simulated robot and its simulated world environment is outlined together with the task that the system is capable of performing. A very general framework for understanding the relationship between an observed behavior and an adequate description of that behavior is included.
Advancing data reuse in phyloinformatics using an ontology-driven Semantic Web approach.
Panahiazar, Maryam; Sheth, Amit P; Ranabahu, Ajith; Vos, Rutger A; Leebens-Mack, Jim
2013-01-01
Phylogenetic analyses can resolve historical relationships among genes, organisms or higher taxa. Understanding such relationships can elucidate a wide range of biological phenomena, including, for example, the importance of gene and genome duplications in the evolution of gene function, the role of adaptation as a driver of diversification, or the evolutionary consequences of biogeographic shifts. Phyloinformaticists are developing data standards, databases and communication protocols (e.g. Application Programming Interfaces, APIs) to extend the accessibility of gene trees, species trees, and the metadata necessary to interpret these trees, thus enabling researchers across the life sciences to reuse phylogenetic knowledge. Specifically, Semantic Web technologies are being developed to make phylogenetic knowledge interpretable by web agents, thereby enabling intelligently automated, high-throughput reuse of results generated by phylogenetic research. This manuscript describes an ontology-driven, semantic problem-solving environment for phylogenetic analyses and introduces artefacts that can support phyloinformatic efforts to promote the accessibility of trees and underlying metadata. PhylOnt is an extensible ontology with concepts describing tree types and tree building methodologies including estimation methods, models and programs. In addition we present the PhylAnt platform for annotating scientific articles and NeXML files with PhylOnt concepts. The novelty of this work is the annotation of NeXML files and phylogenetic related documents with the PhylOnt Ontology. This approach advances data reuse in phyloinformatics.
ERIC Educational Resources Information Center
Sternberg, Robert J.
1979-01-01
An information-processing framework is presented for understanding intelligence. Two levels of processing are discussed: the steps involved in solving a complex intellectual task, and higher-order processes used to decide how to solve the problem. (MH)
Positive Youth Development and Nutrition: Interdisciplinary Strategies to Enhance Student Outcomes
ERIC Educational Resources Information Center
Edwards, Oliver W.; Cheeley, Taylor
2016-01-01
Educational policies require the use of data and progress monitoring frameworks to guide instruction and intervention in schools. As a result, different problem-solving models such as multitiered systems of supports (MTSS) have emerged that use these frameworks to improve student outcomes. However, problem-focused models emphasize negative…
Metaphor and analogy in everyday problem solving.
Keefer, Lucas A; Landau, Mark J
2016-11-01
Early accounts of problem solving focused on the ways people represent information directly related to target problems and possible solutions. Subsequent theory and research point to the role of peripheral influences such as heuristics and bodily states. We discuss how metaphor and analogy similarly influence stages of everyday problem solving: Both processes mentally map features of a target problem onto the structure of a relatively more familiar concept. When individuals apply this structure, they use a well-known concept as a framework for reasoning about real world problems and candidate solutions. Early studies found that analogy use helped people gain insight into novel problems. More recent research on metaphor goes further to show that activating mappings has subtle, sometimes surprising effects on judgment and reasoning in everyday problem solving. These findings highlight situations in which mappings can help or hinder efforts to solve problems. WIREs Cogn Sci 2016, 7:394-405. doi: 10.1002/wcs.1407 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.
Learning Resources Organization Using Ontological Framework
NASA Astrophysics Data System (ADS)
Gavrilova, Tatiana; Gorovoy, Vladimir; Petrashen, Elena
The paper describes the ontological approach to knowledge structuring for e-learning portal design, as it turns out to be efficient and relevant to current domain conditions. It is primarily based on the visual ontology-based description of the content of the learning materials, and this helps to provide productive and personalized access to these materials. The experience of developing an ontology for a Knowledge Engineering course at St. Petersburg State University is discussed, and the “OntolingeWiki” tool for creating ontology-based e-learning portals is described.
Modulated evaluation metrics for drug-based ontologies.
Amith, Muhammad; Tao, Cui
2017-04-24
Research on ontology evaluation is scarce. If biomedical ontological datasets and knowledgebases are to be widely used, there needs to be quality control and evaluation for the content and structure of the ontology. This paper introduces how to effectively utilize a semiotic-inspired approach to ontology evaluation, specifically for drug-related ontologies hosted on the National Center for Biomedical Ontology BioPortal. Using the semiotic-based evaluation framework for drug-based ontologies, we adjusted the quality metrics based on the semiotic features of drug ontologies. Then, we compared the quality scores before and after tailoring. The tailored scores revealed a more precise measurement and a closer distribution than the untailored ones. The results of this study reveal that a tailored semiotic evaluation produced a more meaningful and accurate assessment of drug-based ontologies, lending to the possible usefulness of semiotics in ontology evaluation.
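The idea of "modulating" metrics for a domain can be sketched as re-weighting sub-scores of a semiotic-style evaluation. The sketch below is a guess at the general shape of such a scheme; the sub-score names, values, and weights are invented for illustration and are not the paper's actual metric suite.

```python
# Hedged illustration: an overall ontology quality score as a weighted mean of
# semiotic sub-scores, with weights "modulated" for drug ontologies. All
# numbers and the weighting scheme are hypothetical, not from the paper.

def modulated_score(scores, weights):
    """Weighted mean of sub-scores under a given weight profile."""
    total = sum(weights.values())
    return sum(scores[k] * w for k, w in weights.items()) / total

scores = {"syntactic": 0.9, "semantic": 0.6, "pragmatic": 0.4}
generic = {"syntactic": 1, "semantic": 1, "pragmatic": 1}
drug_tailored = {"syntactic": 1, "semantic": 2, "pragmatic": 1}  # emphasize semantics

print(round(modulated_score(scores, generic), 3))        # ~0.633
print(round(modulated_score(scores, drug_tailored), 3))  # 0.625
```

Tailoring the weights shifts which deficiencies dominate the aggregate score, which is one way a domain-adjusted metric can rank the same ontology differently than a generic one.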
Critical Ontology for an Enactive Music Pedagogy
ERIC Educational Resources Information Center
van der Schyff, Dylan; Schiavio, Andrea; Elliott, David J.
2016-01-01
An enactive approach to music education is explored through the lens of critical ontology. Assumptions central to Western academic music culture are critically discussed; and the concept of "ontological education" is introduced as an alternative framework. We argue that this orientation embraces more primordial ways of knowing and being,…
GeoSciGraph: An Ontological Framework for EarthCube Semantic Infrastructure
NASA Astrophysics Data System (ADS)
Gupta, A.; Schachne, A.; Condit, C.; Valentine, D.; Richard, S.; Zaslavsky, I.
2015-12-01
The CINERGI (Community Inventory of EarthCube Resources for Geosciences Interoperability) project compiles an inventory of a wide variety of earth science resources including documents, catalogs, vocabularies, data models, data services, process models, information repositories, domain-specific ontologies etc. developed by research groups and data practitioners. We have developed a multidisciplinary semantic framework called GeoSciGraph semantic ingration of earth science resources. An integrated ontology is constructed with Basic Formal Ontology (BFO) as its upper ontology and currently ingests multiple component ontologies including the SWEET ontology, GeoSciML's lithology ontology, Tematres controlled vocabulary server, GeoNames, GCMD vocabularies on equipment, platforms and institutions, software ontology, CUAHSI hydrology vocabulary, the environmental ontology (ENVO) and several more. These ontologies are connected through bridging axioms; GeoSciGraph identifies lexically close terms and creates equivalence class or subclass relationships between them after human verification. GeoSciGraph allows a community to create community-specific customizations of the integrated ontology. GeoSciGraph uses the Neo4J,a graph database that can hold several billion concepts and relationships. GeoSciGraph provides a number of REST services that can be called by other software modules like the CINERGI information augmentation pipeline. 1) Vocabulary services are used to find exact and approximate terms, term categories (community-provided clusters of terms e.g., measurement-related terms or environmental material related terms), synonyms, term definitions and annotations. 2) Lexical services are used for text parsing to find entities, which can then be included into the ontology by a domain expert. 
3) Graph services provide the ability to perform traversal-centric operations, e.g., finding paths and neighborhoods, which can be used to perform ontological operations such as computing transitive closure (e.g., finding all subclasses of rocks). 4) Annotation services are used to adorn an arbitrary block of text (e.g., from a NOAA catalog record) with ontology terms. The system has been used to ontologically integrate diverse sources such as ScienceBase, NOAA records, and PetDB.
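The transitive-closure traversal that the abstract's graph service performs over a Neo4j store can be sketched in a few lines over an in-memory subclass graph. The class names below are illustrative stand-ins, not GeoSciGraph's actual vocabulary, and a plain dict replaces the graph database:

```python
from collections import deque

# Toy subclass graph standing in for the Neo4j-backed ontology store
# (illustrative term names; the real GeoSciGraph vocabulary differs).
SUBCLASS_OF = {
    "igneous_rock": "rock",
    "basalt": "igneous_rock",
    "sedimentary_rock": "rock",
    "sandstone": "sedimentary_rock",
    "mineral": "earth_material",
    "rock": "earth_material",
}

def subclasses(term):
    """Transitive closure: every class that is a direct or indirect subclass of `term`."""
    children = {}
    for child, parent in SUBCLASS_OF.items():
        children.setdefault(parent, []).append(child)
    found, queue = set(), deque([term])
    while queue:
        node = queue.popleft()
        for child in children.get(node, []):
            if child not in found:
                found.add(child)
                queue.append(child)
    return found

print(sorted(subclasses("rock")))
# prints ['basalt', 'igneous_rock', 'sandstone', 'sedimentary_rock']
```

A breadth-first walk like this is what a "find all subclasses of rocks" query reduces to once the bridging axioms have linked the component ontologies into one graph.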
Hybrid Optimization Parallel Search PACKage
DOE Office of Scientific and Technical Information (OSTI.GOV)
2009-11-10
HOPSPACK is open source software for solving optimization problems without derivatives. Application problems may have a fully nonlinear objective function, bound constraints, and linear and nonlinear constraints. Problem variables may be continuous, integer-valued, or a mixture of both. The software provides a framework that supports any derivative-free type of solver algorithm. Through the framework, solvers request parallel function evaluation, which may use MPI (multiple machines) or multithreading (multiple processors/cores on one machine). The framework provides a Cache and Pending Cache of saved evaluations that reduces execution time and facilitates restarts. Solvers can dynamically create other algorithms to solve subproblems, a useful technique for handling multiple start points and integer-valued variables. HOPSPACK ships with the Generating Set Search (GSS) algorithm, developed at Sandia as part of the APPSPACK open source software project.
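The flavor of the GSS family can be conveyed with a minimal compass search, the simplest generating-set method: poll the coordinate directions, accept any improvement, and contract the step when every poll fails. This is a toy sketch, not HOPSPACK's implementation (no caching, parallelism, or constraints):

```python
def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=10_000):
    """Minimal derivative-free compass search (a simple member of the GSS family).
    Polls +/- coordinate directions; halves the step when no poll point improves."""
    x = list(x0)
    fx = f(x)
    n = len(x)
    for _ in range(max_iter):
        improved = False
        for i in range(n):
            for sign in (+1, -1):
                trial = x[:]
                trial[i] += sign * step
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5          # contract the poll pattern
            if step < tol:
                break
    return x, fx

# Usage: minimize a smooth bowl without derivatives.
xmin, fmin = compass_search(lambda v: (v[0] - 3) ** 2 + (v[1] + 1) ** 2, [0.0, 0.0])
```

Because only function values are used, the same loop works for simulation-based objectives where derivatives are unavailable, which is the setting HOPSPACK targets.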
Yoo, Min-Jung; Grozel, Clément; Kiritsis, Dimitris
2016-01-01
This paper describes our conceptual framework of closed-loop lifecycle information sharing for product-service in the Internet of Things (IoT). The framework is based on the ontology model of product-service and a type of IoT message standard, Open Messaging Interface (O-MI) and Open Data Format (O-DF), which ensures data communication. (1) Background: Based on an existing product lifecycle management (PLM) methodology, we enhanced the ontology model for the purpose of integrating efficiently the product-service ontology model that was newly developed; (2) Methods: The IoT message transfer layer is vertically integrated into a semantic knowledge framework inside which a Semantic Info-Node Agent (SINA) uses the message format as a common protocol of product-service lifecycle data transfer; (3) Results: The product-service ontology model facilitates information retrieval and knowledge extraction during the product lifecycle, while making more information available for the sake of service business creation. The vertical integration of IoT message transfer, encompassing all semantic layers, helps achieve a more flexible and modular approach to knowledge sharing in an IoT environment; (4) Contribution: A semantic data annotation applied to IoT can contribute to enhancing collected data types, which entails a richer knowledge extraction. The ontology-based PLM model enables as well the horizontal integration of heterogeneous PLM data while breaking traditional vertical information silos; (5) Conclusion: The framework was applied to a fictive case study with an electric car service for the purpose of demonstration. For the purpose of demonstrating the feasibility of the approach, the semantic model is implemented in Sesame APIs, which play the role of an Internet-connected Resource Description Framework (RDF) database. PMID:27399717
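The O-DF payloads that SINA exchanges are hierarchical XML documents. The sketch below builds an O-DF-style message for the paper's electric-car example; the element structure is simplified for illustration (the actual O-DF standard defines namespaces and further attributes):

```python
import xml.etree.ElementTree as ET

def odf_payload(object_id, readings):
    """Build a minimal O-DF-style XML payload (structure simplified; the real
    O-DF schema adds namespaces, types, and timestamps)."""
    objects = ET.Element("Objects")
    obj = ET.SubElement(objects, "Object")
    ET.SubElement(obj, "id").text = object_id
    for name, value in readings.items():
        item = ET.SubElement(obj, "InfoItem", name=name)
        ET.SubElement(item, "value").text = str(value)
    return ET.tostring(objects, encoding="unicode")

# Hypothetical sensor readings for a fictive electric-car service.
xml = odf_payload("ElectricCar-42", {"BatteryLevel": 0.87, "Odometer_km": 15234})
```

On the receiving side, such a payload would be annotated against the product-service ontology before being stored in the RDF database.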
Ontology-Based Retrieval of Spatially Related Objects for Location Based Services
NASA Astrophysics Data System (ADS)
Haav, Hele-Mai; Kaljuvee, Aivi; Luts, Martin; Vajakas, Toivo
Advanced Location Based Service (LBS) applications have to integrate information stored in GIS, information about users' preferences (profiles), as well as contextual information and information about the application itself. Ontology engineering provides methods to semantically integrate several data sources. We propose an ontology-driven LBS development framework: the paper describes the architecture of the ontologies and their usage for retrieval of spatially related objects relevant to the user. Our main contribution is to enable personalised, ontology-driven LBS by providing a novel approach for defining personalised semantic spatial relationships by means of ontologies. The approach is illustrated by an industrial case study.
TNM-O: ontology support for staging of malignant tumours.
Boeker, Martin; França, Fábio; Bronsert, Peter; Schulz, Stefan
2016-11-14
The objectives of this work are to (1) present an ontological framework for the TNM classification system, (2) exemplify this framework by an ontology for colon and rectum tumours, and (3) evaluate this ontology by assigning TNM classes to real-world pathology data. The TNM ontology uses the Foundational Model of Anatomy for anatomical entities and BioTopLite 2 as a domain top-level ontology. General rules for the TNM classification system and the specific TNM classification for colorectal tumours were axiomatised in description logic. Case-based information was collected from tumour documentation practice in the Comprehensive Cancer Centre of a large university hospital. Based on the ontology, a module was developed that classifies pathology data. TNM was represented as an information artefact, which consists of single representational units. Corresponding to every representational unit, tumours and tumour aggregates were defined. Tumour aggregates consist of the primary tumour and, if present, of infiltrated regional lymph nodes and distant metastases. TNM codes depend on the location and certain qualities of the primary tumour (T), the infiltrated regional lymph nodes (N) and the existence of distant metastases (M). Tumour data from clinical and pathological documentation were successfully classified with the ontology. A first version of the TNM Ontology represents the TNM system for the description of the anatomical extent of malignant tumours. The present work demonstrates its representational power and completeness as well as its applicability for classification of instance data.
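The classification step that the paper axiomatises in description logic can be caricatured procedurally. The rules below are an illustrative abridgement of colorectal TNM coding (T by invasion depth, N by count of positive regional nodes, M by distant metastasis), not the paper's full axiomatisation:

```python
def tnm_codes(invasion_depth, positive_nodes, distant_metastasis):
    """Assign simplified TNM codes to a colorectal tumour record.
    Rules are an illustrative abridgement of the classification,
    not the ontology's complete rule set."""
    t = {"submucosa": "T1", "muscularis_propria": "T2",
         "subserosa": "T3", "adjacent_organs": "T4"}[invasion_depth]
    if positive_nodes == 0:
        n = "N0"
    elif positive_nodes <= 3:
        n = "N1"
    else:
        n = "N2"
    m = "M1" if distant_metastasis else "M0"
    return t, n, m

print(tnm_codes("subserosa", 2, False))   # prints ('T3', 'N1', 'M0')
```

Encoding such rules as OWL/DL axioms instead of code is what lets a reasoner classify instance data and expose gaps or contradictions in the documentation.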
Open ended intelligence: the individuation of intelligent agents
NASA Astrophysics Data System (ADS)
Weinbaum Weaver, David; Veitas, Viktoras
2017-03-01
Artificial general intelligence is a field of research aiming to distil the principles of intelligence that operate independently of a specific problem domain and utilise these principles in order to synthesise systems capable of performing any intellectual task a human being is capable of and beyond. While "narrow" artificial intelligence which focuses on solving specific problems such as speech recognition, text comprehension, visual pattern recognition and robotic motion has shown impressive breakthroughs lately, understanding general intelligence remains elusive. We propose a paradigm shift from intelligence perceived as a competence of individual agents defined in relation to an a priori given problem domain or a goal, to intelligence perceived as a formative process of self-organisation. We call this process open-ended intelligence. Starting with a brief introduction of the current conceptual approach, we expose a number of serious limitations that are traced back to the ontological roots of the concept of intelligence. Open-ended intelligence is then developed as an abstraction of the process of human cognitive development, so its application can be extended to general agents and systems. We introduce and discuss three facets of the idea: the philosophical concept of individuation, sense-making and the individuation of general cognitive agents. We further show how open-ended intelligence can be framed in terms of a distributed, self-organising network of interacting elements and how such process is scalable. The framework highlights an important relation between coordination and intelligence and a new understanding of values.
Science Curriculum Components Favored by Taiwanese Biology Teachers
NASA Astrophysics Data System (ADS)
Lin, Chen-Yung; Hu, Reping; Changlai, Miao-Li
2005-09-01
The new 1-9 curriculum framework in Taiwan provides a remarkable change from previous frameworks in terms of the coverage of content and the powers of teachers. This study employs a modified repertory grid technique to investigate biology teachers' preferences with regard to six curriculum components. One hundred and eighty-five in-service and pre-service biology teachers were asked to determine which science curriculum components they liked and disliked most of all to include in their biology classes. The data show that the rank order of these science curriculum components, from top to bottom, was as follows: application of science, manipulation skills, scientific concepts, social/ethical issues, problem-solving skills, and the history of science. They also showed that pre-service biology teachers, as compared with in-service biology teachers, favored problem-solving skills significantly more than manipulative skills, while in-service biology teachers, as compared with pre-service biology teachers, favored manipulative skills significantly more than problem-solving skills. Some recommendations for ensuring the successful implementation of the Taiwanese 1-9 curriculum framework are also proposed.
Technology for a Purpose: Technology for Information Problem-Solving with the Big6[R].
ERIC Educational Resources Information Center
Eisenberg, Mike B
2003-01-01
Explains the Big6 model of information problem solving as a conceptual framework for learning and teaching information and technology skills. Highlights include information skills; examples of integrating technology in Big6 contexts; and the Big6 and the Internet, including email, listservs, chat, Web browsers, search engines, portals, Web…
ERIC Educational Resources Information Center
Yavuz, Ahmet
2015-01-01
This study aims to investigate (1) students' trust in mathematics calculation versus intuition in a physics problem solving and (2) whether this trust is related to achievement in physics in the context of epistemic game theoretical framework. To achieve this research objective, paper-pencil and interview sessions were conducted. A paper-pencil…
The Embodiment of Cases as Alternative Perspective in a Mathematics Hypermedia Learning Environment
ERIC Educational Resources Information Center
Valentine, Keri D.; Kopcha, Theodore J.
2016-01-01
This paper presents a design framework for cases as alternative perspectives (Jonassen in Learning to solve problems: a handbook for designing problem-solving learning environments, 2011a) in the context of K-12 mathematics. Using the design-based research strategy of conjecture mapping, the design of cases for a hypermedia site is described…
ERIC Educational Resources Information Center
Guerra, Norma S.
2009-01-01
Graphic organizers are powerful visual tools. The representation provides dimension and relationship to ideas and a framework for organization and elaboration. The LIBRE Stick Figure Tool is a graphic organizer for the problem-solving application of the LIBRE Model counseling approach. It resembles a "stick person" and offers the teacher and…
ERIC Educational Resources Information Center
McLennan, Natasha A.; Arthur, Nancy
1999-01-01
Outlines an expanded framework of the Cognitive Information Processing (CIP) approach to career problem solving and decision making for career counseling with women. Addresses structural and individual barriers in women's career development and provides practical suggestions for applying and evaluating the CIP approach in career counseling.…
An ontology-based framework for bioinformatics workflows.
Digiampietri, Luciano A; Perez-Alcazar, Jose de J; Medeiros, Claudia Bauzer
2007-01-01
The proliferation of bioinformatics activities brings new challenges: how to understand and organise these resources, how to exchange and reuse successful experimental procedures, and how to provide interoperability among data and tools. This paper describes an effort in these directions. It is based on combining research on ontology management, AI, and scientific workflows to design, reuse and annotate bioinformatics experiments. The resulting framework supports automatic or interactive composition of tasks based on AI planning techniques and takes advantage of ontologies to support the specification and annotation of bioinformatics workflows. We validate our proposal with a prototype running on real data.
NASA Astrophysics Data System (ADS)
Ning, Po; Feng, Zhi-Qiang; Quintero, Juan Antonio Rojas; Zhou, Yang-Jing; Peng, Lei
2018-03-01
This paper deals with elastic and elastic-plastic fretting problems. The wear gap is taken into account along with the initial contact distance to obtain the Signorini conditions. Both the Signorini conditions and the Coulomb friction laws are written in a compact form. Within the bipotential framework, an augmented Lagrangian method is applied to calculate the contact forces. The Archard wear law is then used to calculate the wear gap at the contact surface. The local fretting problems are solved via the Uzawa algorithm. Numerical examples are performed to show the efficiency and accuracy of the proposed approach. The influence of plasticity has been discussed.
Ontology-Based Annotation of Learning Object Content
ERIC Educational Resources Information Center
Gasevic, Dragan; Jovanovic, Jelena; Devedzic, Vladan
2007-01-01
The paper proposes a framework for building ontology-aware learning object (LO) content. Previously ontologies were exclusively employed for enriching LOs' metadata. Although such an approach is useful, as it improves retrieval of relevant LOs from LO repositories, it does not enable one to reuse components of a LO, nor to incorporate an explicit…
Arighi, Cecilia; Shamovsky, Veronica; Masci, Anna Maria; Ruttenberg, Alan; Smith, Barry; Natale, Darren A; Wu, Cathy; D'Eustachio, Peter
2015-01-01
The Protein Ontology (PRO) provides terms for and supports annotation of species-specific protein complexes in an ontology framework that relates them both to their components and to species-independent families of complexes. Comprehensive curation of experimentally known forms and annotations thereof is expected to expose discrepancies, differences, and gaps in our knowledge. We have annotated the early events of innate immune signaling mediated by Toll-Like Receptor 3 and 4 complexes in human, mouse, and chicken. The resulting ontology and annotation data set has allowed us to identify species-specific gaps in experimental data and possible functional differences between species, and to employ inferred structural and functional relationships to suggest plausible resolutions of these discrepancies and gaps.
Particle Filter with State Permutations for Solving Image Jigsaw Puzzles
Yang, Xingwei; Adluru, Nagesh; Latecki, Longin Jan
2016-01-01
We deal with an image jigsaw puzzle problem, which is defined as reconstructing an image from a set of square and non-overlapping image patches. It is known that a general instance of this problem is NP-complete, and it is also challenging for humans, since in the considered setting the original image is not given. Recently a graphical model has been proposed to solve this and related problems. The target label probability function is then maximized using loopy belief propagation. We also formulate the problem as maximizing a label probability function and use exactly the same pairwise potentials. Our main contribution is a novel inference approach in the sampling framework of Particle Filter (PF). Usually in the PF framework it is assumed that the observations arrive sequentially, e.g., the observations are naturally ordered by their time stamps in the tracking scenario. Based on this assumption, the posterior density over the corresponding hidden states is estimated. In the jigsaw puzzle problem all observations (puzzle pieces) are given at once without any particular order. Therefore, we relax the assumption of having ordered observations and extend the PF framework to estimate the posterior density by exploring different orders of observations and selecting the most informative permutations of observations. This significantly broadens the scope of applications of the PF inference. Our experimental results demonstrate that the proposed inference framework significantly outperforms the loopy belief propagation in solving the image jigsaw puzzle problem. In particular, the extended PF inference triples the accuracy of the label assignment compared to that using loopy belief propagation. PMID:27795660
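The core idea of extending the particle filter to unordered observations — keep many partial assignments alive and grow each with the most informative remaining observation — can be reduced to a deterministic beam-search sketch on a one-dimensional toy puzzle. This is a drastic simplification (no sampling, no probabilistic potentials), meant only to illustrate the search over assignment orders:

```python
def solve_strip_puzzle(pieces, beam_width=8):
    """Reconstruct a 1-D strip of puzzle pieces by keeping a beam of partial
    orderings and extending each with candidate pieces, scored by a toy
    pairwise potential -- a deterministic, stripped-down stand-in for the
    paper's particle filter with state permutations."""
    def score(seq):
        # pairwise potential: +1 whenever adjacent edge colors agree
        return sum(1 for a, b in zip(seq, seq[1:]) if pieces[a][1] == pieces[b][0])
    beams = [[i] for i in range(len(pieces))]
    for _ in range(len(pieces) - 1):
        candidates = [seq + [j] for seq in beams
                      for j in range(len(pieces)) if j not in seq]
        beams = sorted(candidates, key=score, reverse=True)[:beam_width]
    return beams[0]

# Pieces as (left_edge, right_edge) colors; the true order is 2, 0, 3, 1.
pieces = [("g", "b"), ("r", "y"), ("x", "g"), ("b", "r")]
order = solve_strip_puzzle(pieces)   # returns [2, 0, 3, 1]
```

The paper's method replaces the greedy pruning here with proper posterior estimation over particles and searches over observation permutations, which is what triples accuracy relative to loopy belief propagation.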
Thematic clustering of text documents using an EM-based approach
2012-01-01
Clustering textual contents is an important step in mining useful information on the web or other text-based resources. The common task in text clustering is to handle text in a multi-dimensional space, and to partition documents into groups, where each group contains documents that are similar to each other. However, this strategy lacks a comprehensive view for humans in general since it cannot explain the main subject of each cluster. Utilizing semantic information can solve this problem, but it needs a well-defined ontology or pre-labeled gold standard set. In this paper, we present a thematic clustering algorithm for text documents. Given text, subject terms are extracted and used for clustering documents in a probabilistic framework. An EM approach is used to ensure documents are assigned to correct subjects, hence it converges to a locally optimal solution. The proposed method is distinctive because its results are sufficiently explanatory for human understanding as well as efficient for clustering performance. The experimental results show that the proposed method provides a competitive performance compared to other state-of-the-art approaches. We also show that the extracted themes from the MEDLINE® dataset represent the subjects of clusters reasonably well. PMID:23046528
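The EM loop at the heart of the approach can be sketched as a mixture of unigram distributions over documents. This sketch omits the paper's subject-term extraction step and works directly on bags of words; the documents and seed are illustrative:

```python
import math
import random

def em_cluster(docs, k, iters=50, seed=0):
    """Soft-cluster bag-of-words documents with an EM-fitted mixture of
    unigram distributions -- a stripped-down analogue of the paper's
    probabilistic framework (no subject-term extraction)."""
    rng = random.Random(seed)
    vocab = sorted({w for d in docs for w in d})
    # random soft assignments to start
    resp = []
    for _ in docs:
        row = [rng.random() + 1e-3 for _ in range(k)]
        s = sum(row)
        resp.append([r / s for r in row])
    for _ in range(iters):
        # M-step: mixture weights and smoothed per-cluster word probabilities
        weights = [sum(resp[d][c] for d in range(len(docs))) for c in range(k)]
        probs = []
        for c in range(k):
            counts = {w: 1.0 for w in vocab}          # Laplace smoothing
            for d, doc in enumerate(docs):
                for w in doc:
                    counts[w] += resp[d][c]
            total = sum(counts.values())
            probs.append({w: counts[w] / total for w in vocab})
        # E-step: recompute responsibilities from log-likelihoods
        for d, doc in enumerate(docs):
            logs = [math.log(weights[c] + 1e-12) +
                    sum(math.log(probs[c][w]) for w in doc) for c in range(k)]
            m = max(logs)
            exps = [math.exp(l - m) for l in logs]
            s = sum(exps)
            resp[d] = [e / s for e in exps]
    return [row.index(max(row)) for row in resp]

docs = [["gene", "protein"], ["protein", "cell"], ["planet", "orbit"], ["orbit", "star"]]
labels = em_cluster(docs, 2)
```

Each iteration provably does not decrease the data likelihood, so the loop converges to a locally optimal soft assignment, as the abstract notes.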
One More Time: The Need for More Mathematical Problem Solving and What the Research Says about It
ERIC Educational Resources Information Center
Woodward, John
2013-01-01
This article reviews recent research in math problem solving for students with learning disabilities. Two recently published syntheses of research on mathematics by the Institute of Education Sciences (IES) are used as frameworks for interpreting this body of work. A significant amount of the work in special education over the last decade is…
Problem Solving and the Use of Digital Technologies within the Mathematical Working Space Framework
ERIC Educational Resources Information Center
Santos-Trigo, Manuel; Moreno-Armella, Luis; Camacho-Machín, Matías
2016-01-01
The aim of this study is to analyze and document the extent to which high school teachers rely on a set of technology affordances to articulate epistemological and cognitive actions in problem solving approaches. Participants were encouraged to construct dynamic representations of tasks and always to look for different ways to identify and support…
ERIC Educational Resources Information Center
Begeny, John C.; Schulte, Ann C.; Johnson, Kent
2012-01-01
This book presents a schoolwide model of instructional support designed to make the most of available time, resources, and personnel--one that is also fully compatible with other problem-solving models, such as response to intervention. The authors provide a comprehensive and cohesive framework for linking assessment and intervention. They show…
ERIC Educational Resources Information Center
Aurah, Catherine Muhonja
2013-01-01
Within the framework of social cognitive theory, the influence of self-efficacy beliefs and metacognitive prompting on genetics problem solving ability among high school students in Kenya was examined through a mixed methods research design. A quasi-experimental study, supplemented by focus group interviews, was conducted to investigate both the…
Problem Solving in Mathematics: Focus for the Future. 1987. Senior High School Monograph.
ERIC Educational Resources Information Center
Alberta Dept. of Education, Edmonton. Curriculum Branch.
This monograph was developed with the intention of addressing the concerns of high school mathematics teachers in Alberta (Canada) who want to base their programs on problem solving but have questions about effective and efficient ways to do so. Considered are the most basic philosophical questions, and a framework is provided to use in solving…
OMOGENIA: A Semantically Driven Collaborative Environment
NASA Astrophysics Data System (ADS)
Liapis, Aggelos
Ontology creation can be thought of as a social procedure. Indeed, the concepts involved generally need to be elicited from communities of domain experts and end-users by teams of knowledge engineers. Many problems in ontology creation appear to resemble certain problems in software design, particularly with respect to the setup of collaborative systems. For instance, the resolution of conceptual conflicts between formalized ontologies is a major engineering problem as ontologies move into widespread use on the semantic web. Such conflict resolution often requires human collaboration and cannot be achieved by automated methods except in simple cases. In this chapter we discuss research in the field of computer-supported cooperative work (CSCW) that focuses on classification and that throws light on ontology building. Furthermore, we present a semantically driven collaborative environment called OMOGENIA as a natural way to display and examine the structure of an evolving ontology in a collaborative setting.
ER2OWL: Generating OWL Ontology from ER Diagram
NASA Astrophysics Data System (ADS)
Fahad, Muhammad
Ontology is a fundamental part of the Semantic Web. The goal of the W3C is to bring the web to its full potential as a semantic web while reusing previous systems and artifacts. Most legacy systems have been documented using structured analysis and structured design (SASD), especially with simple or Extended ER Diagrams (ERDs). Such systems need upgrading to become part of the semantic web. In this paper, we present ERD-to-OWL-DL ontology transformation rules at the concrete level. These rules facilitate an easy and understandable transformation from ERD to OWL. The set of transformation rules is tested on a structured analysis and design example. The framework provides OWL ontologies as a foundation for the semantic web. It helps software engineers upgrade the structured analysis and design artifact, the ERD, into components of the semantic web. Moreover, our transformation tool, ER2OWL, reduces the cost and time of building OWL ontologies by reusing existing entity-relationship models.
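The flavor of such mapping rules can be sketched by emitting Turtle from a toy ER description: entity to owl:Class, attribute to owl:DatatypeProperty, relationship to owl:ObjectProperty. This is a hypothetical subset of the rules; the paper defines a fuller, formally specified set:

```python
def er_to_owl(entities, relationships):
    """Emit OWL (Turtle syntax) from a toy ER description using three
    illustrative mapping rules: entity -> owl:Class, attribute ->
    owl:DatatypeProperty, relationship -> owl:ObjectProperty.
    (Hypothetical rule subset, not ER2OWL's complete rule set.)"""
    lines = ["@prefix : <http://example.org/onto#> .",
             "@prefix owl: <http://www.w3.org/2002/07/owl#> .",
             "@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> ."]
    for entity, attrs in entities.items():
        lines.append(f":{entity} a owl:Class .")
        for a in attrs:
            lines.append(f":{a} a owl:DatatypeProperty ; rdfs:domain :{entity} .")
    for rel, (src, dst) in relationships.items():
        lines.append(f":{rel} a owl:ObjectProperty ; "
                     f"rdfs:domain :{src} ; rdfs:range :{dst} .")
    return "\n".join(lines)

ttl = er_to_owl({"Student": ["studentId"], "Course": ["title"]},
                {"enrolledIn": ("Student", "Course")})
```

Applying rules of this shape to a legacy ERD yields an OWL skeleton that engineers can then refine with cardinalities and axioms the ER model cannot express.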
Semantic Web meets Integrative Biology: a survey.
Chen, Huajun; Yu, Tong; Chen, Jake Y
2013-01-01
Integrative Biology (IB) uses experimental or computational quantitative technologies to characterize biological systems at the molecular, cellular, tissue and population levels. IB typically involves the integration of data, knowledge and capabilities across disciplinary boundaries in order to solve complex problems. We identify a series of bioinformatics problems posed by interdisciplinary integration: (i) data integration that interconnects structured data across related biomedical domains; (ii) ontology integration that brings jargon, terminologies and taxonomies from various disciplines into a unified network of ontologies; (iii) knowledge integration that integrates disparate knowledge elements from multiple sources; (iv) service integration that builds applications out of services provided by different vendors. We argue that IB can benefit significantly from the integration solutions enabled by Semantic Web (SW) technologies. The SW enables scientists to share content beyond the boundaries of applications and websites, resulting in a web of data that is meaningful and understandable to any computer. In this review, we provide insight into how SW technologies can be used to build open, standardized and interoperable solutions for interdisciplinary integration on a global basis. We present a rich set of case studies in systems biology, integrative neuroscience, bio-pharmaceutics and translational medicine to highlight the technical features and benefits of SW applications in IB.
The MGED ontology: a framework for describing functional genomics experiments.
Stoeckert, Christian J; Parkinson, Helen
2003-01-01
The Microarray Gene Expression Data (MGED) society was formed with an initial focus on experiments involving microarray technology. Despite the diversity of applications, there are common concepts used and a common need to capture experimental information in a standardized manner. In building the MGED ontology, it was recognized that it would be impractical to cover all the different types of experiments on all the different types of organisms by listing and defining all the types of organisms and their properties. Our solution was to create a framework for describing microarray experiments with an initial focus on the biological sample and its manipulation. For concepts that are common for many species, we could provide a manageable listing of controlled terms. For concepts that are species-specific or whose values cannot be readily listed, we created an 'OntologyEntry' concept that referenced an external resource. The MGED ontology is a work in progress that needs additional instances and particularly needs constraints to be added. The ontology currently covers the experimental sample and design, and we have begun capturing aspects of the microarrays themselves as well. The primary application of the ontology will be to develop forms for entering information into databases, and consequently allowing queries, taking advantage of the structure provided by the ontology. The application of an ontology of experimental conditions extends beyond microarray experiments and, as the scope of MGED includes other aspects of functional genomics, so too will the MGED ontology.
Dynamic motion planning of 3D human locomotion using gradient-based optimization.
Kim, Hyung Joo; Wang, Qian; Rahmatalla, Salam; Swan, Colby C; Arora, Jasbir S; Abdel-Malek, Karim; Assouline, Jose G
2008-06-01
Since humans can walk with an infinite variety of postures and limb movements, there is no unique solution to the modeling problem to predict human gait motions. Accordingly, we test herein the hypothesis that the redundancy of human walking mechanisms makes solving for human joint profiles and force time histories an indeterminate problem best solved by inverse dynamics and optimization methods. A new optimization-based human-modeling framework is thus described for predicting three-dimensional human gait motions on level and inclined planes. The basic unknowns in the framework are the joint motion time histories of a 25-degree-of-freedom human model and its six global degrees of freedom. The joint motion histories are calculated by minimizing an objective function such as deviation of the trunk from upright posture that relates to the human model's performance. A variety of important constraints are imposed on the optimization problem, including (1) satisfaction of dynamic equilibrium equations by requiring the model's zero moment point (ZMP) to lie within the instantaneous geometrical base of support, (2) foot collision avoidance, (3) limits on ground-foot friction, and (4) vanishing yawing moment. Analytical forms of objective and constraint functions are presented and discussed for the proposed human-modeling framework in which the resulting optimization problems are solved using gradient-based mathematical programming techniques. When the framework is applied to the modeling of bipedal locomotion on level and inclined planes, acyclic human walking motions that are smooth and realistic as opposed to less natural robotic motions are obtained. The aspects of the modeling framework requiring further investigation and refinement, as well as potential applications of the framework in biomechanics, are discussed.
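The framework's pattern — fold constraints such as the ZMP condition into an objective and descend with gradient-based programming — can be sketched in one dimension with a quadratic penalty. The objective and constraint below are stand-ins, far from the 25-DOF gait model:

```python
def minimize_penalized(f, g, x0, mu=100.0, lr=0.001, steps=5000, h=1e-6):
    """Tiny gradient-based sketch of constrained motion optimization:
    fold a constraint g(x) <= 0 (standing in for, e.g., the ZMP staying
    inside the support base) into the objective as a quadratic penalty
    and descend with a central-difference numerical gradient.
    The real framework uses mathematical programming on a 25-DOF model."""
    def penalized(x):
        return f(x) + mu * max(0.0, g(x)) ** 2
    x = list(x0)
    for _ in range(steps):
        grad = []
        for i in range(len(x)):
            xp = x[:]; xp[i] += h
            xm = x[:]; xm[i] -= h
            grad.append((penalized(xp) - penalized(xm)) / (2 * h))
        x = [xi - lr * gi for xi, gi in zip(x, grad)]
    return x

# Usage: minimize (x - 2)^2 subject to x <= 1; the penalized optimum sits near x = 1.
sol = minimize_penalized(lambda v: (v[0] - 2) ** 2, lambda v: v[0] - 1.0, [0.0])
```

In the full problem the decision variables are joint-angle time histories and the penalty terms cover foot collision, friction limits, and yawing moment, but the descent structure is the same.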
Epidemiology and causation: a realist view.
Renton, A
1994-01-01
In this paper the controversy over how to decide whether associations between factors and diseases are causal is placed within a description of the public health and scientific relevance of epidemiology. It is argued that the rise in popularity of the Popperian view of science, together with a perception of the aims of epidemiology as being to identify appropriate public health interventions, have focussed this debate on unresolved questions of inferential logic, leaving largely unanalysed the notions of causation and of disease at the ontological level. A realist ontology of causation of disease and pathogenesis is constructed within the framework of "scientific materialism", and is shown to provide a coherent basis from which to decide causes and to deal with problems of confounding and interaction in epidemiological research. It is argued that a realist analysis identifies a richer role for epidemiology as an integral part of an ontologically unified medical science. It is this unified medical science as a whole rather than epidemiological observation or experiment which decides causes and, in turn, provides a key element to the foundations of rational public health decision making. PMID:8138775
Evaluation of research in biomedical ontologies
Dumontier, Michel; Gkoutos, Georgios V.
2013-01-01
Ontologies are now pervasive in biomedicine, where they serve as a means to standardize terminology, to enable access to domain knowledge, to verify data consistency and to facilitate integrative analyses over heterogeneous biomedical data. For this purpose, research on biomedical ontologies applies theories and methods from diverse disciplines such as information management, knowledge representation, cognitive science, linguistics and philosophy. Depending on the desired applications in which ontologies are being applied, the evaluation of research in biomedical ontologies must follow different strategies. Here, we provide a classification of research problems in which ontologies are being applied, focusing on the use of ontologies in basic and translational research, and we demonstrate how research results in biomedical ontologies can be evaluated. The evaluation strategies depend on the desired application and measure the success of using an ontology for a particular biomedical problem. For many applications, the success can be quantified, thereby facilitating the objective evaluation and comparison of research in biomedical ontology. The objective, quantifiable comparison of research results based on scientific applications opens up the possibility for systematically improving the utility of ontologies in biomedical research. PMID:22962340
NASA Astrophysics Data System (ADS)
Elag, M.; Goodall, J. L.
2013-12-01
Hydrologic modeling often requires the re-use and integration of models from different disciplines to simulate complex environmental systems. Component-based modeling introduces a flexible approach for integrating physical-based processes across disciplinary boundaries. Several hydrologic-related modeling communities have adopted the component-based approach for simulating complex physical systems by integrating model components across disciplinary boundaries in a workflow. However, it is not always straightforward to create these interdisciplinary models due to the lack of sufficient knowledge about a hydrologic process. This shortcoming is a result of using informal methods for organizing and sharing information about a hydrologic process. A knowledge-based ontology provides such standards and is considered the ideal approach for overcoming this challenge. The aims of this research are to present the methodology used in analyzing the basic hydrologic domain in order to identify hydrologic processes, the ontology itself, and how the proposed ontology is integrated with the Water Resources Component (WRC) ontology. The proposed ontology standardizes the definitions of a hydrologic process, the relationships between hydrologic processes, and their associated scientific equations. The objective of the proposed Hydrologic Process (HP) Ontology is to advance the idea of creating a unified knowledge framework for components' metadata by introducing a domain-level ontology for hydrologic processes. The HP ontology is a step toward an explicit and robust domain knowledge framework that can be evolved through the contribution of domain users. Analysis of the hydrologic domain is accomplished using the Formal Concept Approach (FCA), in which the infiltration process, an important hydrologic process, is examined. Two infiltration methods, the Green-Ampt and Philip's methods, were used to demonstrate the implementation of information in the HP ontology. 
Furthermore, a SPARQL service is provided for semantic-based querying of the ontology.
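The kind of semantic query such a SPARQL service would answer can be sketched with a minimal, stdlib-only triple-pattern matcher. All identifiers below (hp:Infiltration, hp:implementsProcess, etc.) are invented for illustration; they are not the HP Ontology's actual terms.

```python
# Toy triple store standing in for the HP ontology's A-box.
TRIPLES = {
    ("hp:GreenAmpt", "rdf:type", "hp:Method"),
    ("hp:Philip", "rdf:type", "hp:Method"),
    ("hp:Infiltration", "rdf:type", "hp:HydrologicProcess"),
    ("hp:GreenAmpt", "hp:implementsProcess", "hp:Infiltration"),
    ("hp:Philip", "hp:implementsProcess", "hp:Infiltration"),
}

def match(pattern, triples):
    """Return variable bindings ('?'-prefixed terms) for each matching triple."""
    results = []
    for triple in triples:
        binding, ok = {}, True
        for term, value in zip(pattern, triple):
            if term.startswith("?"):
                binding[term] = value
            elif term != value:
                ok = False
                break
        if ok:
            results.append(binding)
    return results

# "Which methods implement the infiltration process?" -- the kind of
# question a SPARQL SELECT would pose against the ontology.
methods = sorted(
    b["?m"] for b in match(("?m", "hp:implementsProcess", "hp:Infiltration"), TRIPLES)
)
print(methods)  # ['hp:GreenAmpt', 'hp:Philip']
```

A real deployment would express the same pattern as a SPARQL `SELECT` against the published ontology; this sketch only shows the triple-pattern semantics behind such a query.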
ERIC Educational Resources Information Center
Hanyak, Michael E., Jr.
2015-01-01
In an introductory chemical engineering course, the conceptual framework of a holistic problem-solving methodology in conjunction with a problem-based learning approach has been shown to create a learning environment that nurtures deep learning rather than surface learning. Based on exam scores, student grades are either the same or better than…
Ontology Research and Development. Part 1-A Review of Ontology Generation.
ERIC Educational Resources Information Center
Ding, Ying; Foo, Schubert
2002-01-01
Discusses the role of ontology in knowledge representation, including enabling content-based access, interoperability, communications, and new levels of service on the Semantic Web; reviews current ontology generation studies and projects as well as problems facing such research; and discusses ontology mapping, information extraction, natural…
Internet computer coaches for introductory physics problem solving
NASA Astrophysics Data System (ADS)
Xu Ryan, Qing
The ability to solve problems in a variety of contexts is becoming increasingly important in our rapidly changing technological society. Problem-solving is a complex process that is important for everyday life and crucial for learning physics. Although there is a great deal of effort to improve student problem solving skills throughout the educational system, national studies have shown that the majority of students emerge from such courses having made little progress toward developing good problem-solving skills. The Physics Education Research Group at the University of Minnesota has been developing Internet computer coaches to help students become more expert-like problem solvers. During the Fall 2011 and Spring 2013 semesters, the coaches were introduced into large sections (200+ students) of the calculus-based introductory mechanics course at the University of Minnesota. This dissertation will address the research background of the project, including the pedagogical design of the coaches and the assessment of problem solving. The methodological framework of conducting experiments will be explained. The data collected from the large-scale experimental studies will be discussed from the following aspects: the usage and usability of these coaches; the usefulness perceived by students; and the usefulness measured by final exam and problem solving rubric. It will also address the implications drawn from this study, including using this data to direct future coach design and difficulties in conducting authentic assessment of problem-solving.
Knowledge Management within the Medical University.
Rauzina, Svetlana Ye; Tikhonova, Tatiana A; Karpenko, Dmitriy S; Bogopolskiy, Gennady A; Zarubina, Tatiana V
2015-01-01
The aim of this work is to study the possibilities of ontological engineering for managing medical knowledge, and the practical implementation of a knowledge management system (KMS) in a medical university. An educational process model is established that allows learning results to be analyzed over time. A glossary sub-system has been developed, ontologies of educational disciplines have been constructed, an environment for the setup and solution of situational cases has been established, and an ontological approach to assessing competencies has been developed. The possibilities of the system for solving situational tasks are described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ling Zou; Hongbin Zhang; Jess Gehin
A coupled TH/Neutronics/CRUD framework, which is able to simulate the impact of CRUD deposits on the CIPS phenomenon, is described in this paper. This framework includes the coupling among three essential physics: thermal-hydraulics, CRUD, and neutronics. The overall framework was implemented by using the CFD software STAR-CCM+, developing CRUD codes, and using the neutronics code DeCART. The coupling was implemented by exchanging data between the codes using intermediate exchange files. A typical 3-by-3 PWR fuel pin problem was solved under this framework over a 12-month period. Time-dependent solutions were provided, including the CRUD deposit inventory and its distribution on the fuel, the boron hideout amount inside the CRUD deposits, and the power shape changing over time. The results clearly showed power shape suppression in regions where CRUD deposits exist, which is a strong indication of the CIPS phenomenon.
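The file-mediated coupling loop described in this abstract can be sketched schematically. The field names and the toy update relations below are illustrative assumptions only; they are not taken from the STAR-CCM+/CRUD/DeCART implementation.

```python
import json
import os
import tempfile

# Each "physics module" reads the shared state, updates its own fields,
# and hands the result on. The numerical relations are placeholders.
def thermal_hydraulics(state):
    state["wall_temperature"] = 600.0 + 0.5 * state["power"]
    return state

def crud_model(state):
    # Toy relation: CRUD grows where the wall runs hot.
    state["crud_thickness"] = state.get("crud_thickness", 0.0) + \
        0.001 * max(0.0, state["wall_temperature"] - 610.0)
    return state

def neutronics(state):
    # Toy relation: boron hideout in CRUD suppresses local power.
    state["power"] = 100.0 * (1.0 - 0.2 * state["crud_thickness"])
    return state

workdir = tempfile.mkdtemp()
path = os.path.join(workdir, "exchange.json")
state = {"power": 100.0}

for step in range(12):  # e.g. one coupling step per month of the transient
    for module in (thermal_hydraulics, crud_model, neutronics):
        state = module(state)
        with open(path, "w") as f:   # write the intermediate exchange file
            json.dump(state, f)
        with open(path) as f:        # the next module reads it back in
            state = json.load(f)

print(round(state["power"], 2), round(state["crud_thickness"], 4))
```

The essential design choice this mirrors is loose coupling: each code remains a black box, and only the exchange-file schema is shared among them.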
Engineering neural systems for high-level problem solving.
Sylvester, Jared; Reggia, James
2016-07-01
There is a long-standing, sometimes contentious debate in AI concerning the relative merits of a symbolic, top-down approach vs. a neural, bottom-up approach to engineering intelligent machine behaviors. While neurocomputational methods excel at lower-level cognitive tasks (incremental learning for pattern classification, low-level sensorimotor control, fault tolerance and processing of noisy data, etc.), they are largely non-competitive with top-down symbolic methods for tasks involving high-level cognitive problem solving (goal-directed reasoning, metacognition, planning, etc.). Here we take a step towards addressing this limitation by developing a purely neural framework named galis. Our goal in this work is to integrate top-down (non-symbolic) control of a neural network system with more traditional bottom-up neural computations. galis is based on attractor networks that can be "programmed" with temporal sequences of hand-crafted instructions that control problem solving by gating the activity retention of, communication between, and learning done by other neural networks. We demonstrate the effectiveness of this approach by showing that it can be applied successfully to solve sequential card matching problems, using both human performance and a top-down symbolic algorithm as experimental controls. Solving this kind of problem makes use of top-down attention control and the binding together of visual features in ways that are easy for symbolic AI systems but not for neural networks to achieve. Our model can not only be instructed on how to solve card matching problems successfully, but its performance also qualitatively (and sometimes quantitatively) matches the performance of both human subjects that we had perform the same task and the top-down symbolic algorithm that we used as an experimental control. 
We conclude that the core principles underlying the galis framework provide a promising approach to engineering purely neurocomputational systems for problem-solving tasks that in people require higher-level cognitive functions. Copyright © 2016 Elsevier Ltd. All rights reserved.
Creativity: Creativity in Complex Military Systems
2017-05-25
generation later in the problem-solving process. The design process is an alternative problem-solving framework individuals or groups use to orient… the potential of their formations. Subject terms: Creativity, Divergent Thinking, Design, Systems Thinking, Operational Art
I.M. Sechenov (1829 - 1905) and the scientific self-understanding for medical sciences.
Kofler, Walter
2007-01-01
There is no discussion about the historic relevance of I. Sechenov for physiology and neurosciences as the "father of Russian modern physiology". But he is also relevant for modern natural science because of his basic epistemological and ontological work. He did not accept the then-prevailing paradigm of "Ignorabimus", which can be seen as the reason for excluding even the generalizable aspects of individuality, creativity and spontaneity from natural science. He developed techniques for empirically based science to deal with the materialistic and idealistic aspects of the comprehensive person, the "ignoramus", according to the actual state of knowledge and the acceptable ontologies. He demonstrated that ontologies ("paradigms") and epistemology can be used as tools according to the given problem to be solved. Sechenov can thus be seen as a precursor of the highly efficient philosophical positions of Einstein and Th. Kuhn. The state of the art in physiology and neurosciences has changed dramatically since Sechenov's time. Therefore, the philosophical positions of the nineteenth century should be questioned. Maybe this is indispensable for the needed link between the materialistic and idealistic aspects of a person as a whole. In this respect the proposals of Sechenov are helpful for medical science in the twenty-first century too, but remain nearly unknown.
ERIC Educational Resources Information Center
Mikula, Brendon D.; Heckler, Andrew F.
2017-01-01
We propose a framework for improving accuracy, fluency, and retention of basic skills essential for solving problems relevant to STEM introductory courses, and implement the framework for the case of basic vector math skills over several semesters in an introductory physics course. Using an iterative development process, the framework begins with…
The missions and means framework as an ontology
NASA Astrophysics Data System (ADS)
Deitz, Paul H.; Bray, Britt E.; Michaelis, James R.
2016-05-01
The analysis of warfare frequently suffers from an absence of logical structure for (a) explicitly specifying the military mission and (b) quantitatively evaluating the mission utility of alternative products and services. In 2003, the Missions and Means Framework (MMF) was developed to redress these shortcomings. The MMF supports multiple combatants, levels of war and, in fact, is a formal embodiment of the Military Decision-Making Process (MDMP). A major effect of incomplete analytic discipline in military systems analyses is that they frequently fall into the category of ill-posed problems in which they are under-specified, under-determined, or under-constrained. Critical context is often missing. This is frequently the result of incomplete materiel requirements analyses which have unclear linkages to higher levels of warfare, system-of-systems linkages, tactics, techniques and procedures, and the effect of opposition forces. In many instances the capabilities of materiel are assumed to be immutable. This is a result of not assessing how platform components morph over time due to damage, logistics, or repair. Though ill-posed issues can be found many places in military analysis, probably the greatest challenge comes in the disciplines of C4ISR supported by ontologies in which formal naming and definition of the types, properties, and interrelationships of the entities are fundamental to characterizing mission success. Though the MMF was not conceived as an ontology, over the past decade some workers, particularly in the field of communication, have labelled the MMF as such. This connection will be described and discussed.
Adaptive leadership and person-centered care: a new approach to solving problems.
Corazzini, Kirsten N; Anderson, Ruth A
2014-01-01
Successfully transitioning to person-centered care in nursing homes requires a new approach to solving care issues. The adaptive leadership framework suggests that expert providers must support frontline caregivers in their efforts to develop high-quality, person-centered solutions.
ERIC Educational Resources Information Center
Belser, Christopher T.; Shillingford, M. Ann; Joe, J. Richelle
2016-01-01
The American School Counselor Association (ASCA) National Model and a multi-tiered system of supports (MTSS) both provide frameworks for systematically solving problems in schools, including student behavior concerns. The authors outline a model that integrates overlapping elements of the National Model and MTSS as a support for marginalized…
Towards ontology-driven navigation of the lipid bibliosphere
Baker, Christopher JO; Kanagasabai, Rajaraman; Ang, Wee Tiong; Veeramani, Anitha; Low, Hong-Sang; Wenk, Markus R
2008-01-01
Background The indexing of scientific literature and content is a relevant and contemporary requirement within life science information systems. Navigating information available in legacy formats continues to be a challenge both in enterprise and academic domains. The emergence of semantic web technologies and their fusion with artificial intelligence techniques has provided a new toolkit with which to address these data integration challenges. In the emerging field of lipidomics such navigation challenges are barriers to the translation of scientific results into actionable knowledge, critical to the treatment of diseases such as Alzheimer's syndrome, Mycobacterium infections and cancer. Results We present a literature-driven workflow involving document delivery and natural language processing steps generating tagged sentences containing lipid, protein and disease names, which are instantiated into a custom-designed lipid ontology. We describe the design challenges in capturing lipid nomenclature, the mandate of the ontology and its role as a query model in the navigation of the lipid bibliosphere. We illustrate the extent of the description logic-based A-box query capability provided by the instantiated ontology using a graphical query composer to query sentences describing lipid-protein and lipid-disease correlations. Conclusion As scientists accept the need to readjust the manner in which we search for information and derive knowledge we illustrate a system that can constrain the literature explosion and knowledge navigation problems. Specifically we have focussed on solving this challenge for lipidomics researchers who have to deal with the lack of standardized vocabulary, differing classification schemes, and a wide array of synonyms before being able to derive scientific insights. 
The use of the OWL-DL variant of the Web Ontology Language (OWL) and description logic reasoning is pivotal in this regard, providing the lipid scientist with advanced query access to the results of text mining algorithms instantiated into the ontology. The visual query paradigm assists in the adoption of this technology. PMID:18315858
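The description-logic A-box retrieval this abstract describes can be illustrated with a toy, stdlib-only sketch: T-box subsumption axioms plus A-box class assertions, queried for all instances of a class. The class and individual names below are invented for illustration; they are not drawn from the lipid ontology itself.

```python
SUBCLASS_OF = {                       # T-box: child class -> parent class
    "Glycerophospholipid": "Lipid",
    "Sphingolipid": "Lipid",
    "Ceramide": "Sphingolipid",
}

INSTANCE_OF = {                       # A-box: individual -> asserted class
    "PC(34:1)": "Glycerophospholipid",
    "Cer(d18:1/16:0)": "Ceramide",
    "APP": "Protein",
}

def ancestors(cls):
    """All superclasses of cls, including cls itself."""
    seen = {cls}
    while cls in SUBCLASS_OF:
        cls = SUBCLASS_OF[cls]
        seen.add(cls)
    return seen

def instances_of(cls):
    """A-box retrieval: individuals whose asserted class is subsumed by cls."""
    return sorted(i for i, c in INSTANCE_OF.items() if cls in ancestors(c))

print(instances_of("Lipid"))  # ['Cer(d18:1/16:0)', 'PC(34:1)']
```

An OWL-DL reasoner generalizes this idea well beyond simple subclass chains (property restrictions, defined classes), but instance retrieval over a class hierarchy is the core of the A-box queries described above.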
ERIC Educational Resources Information Center
Bachore, Zelalem
2012-01-01
Ontology not only is considered to be the backbone of the semantic web but also plays a significant role in distributed and heterogeneous information systems. However, ontology still faces limited application and adoption to date. One of the major problems is that prevailing engineering-oriented methodologies for building ontologies do not…
NASA Astrophysics Data System (ADS)
Kuzle, A.
2018-06-01
The important role that metacognition plays as a predictor of student mathematical learning and of mathematical problem-solving has been extensively documented. But only recently has attention turned to the primary grades, and more research is needed at this level. The goals of this paper are threefold: (1) to present a metacognitive framework for mathematics problem-solving, (2) to describe a multi-method interview approach developed to study students' mathematical metacognition, and (3) to empirically evaluate the utility of the framework and the adaptation of the approach in the context of grade 2 and grade 4 mathematics problem-solving. The results are discussed not only with regard to further development of the adapted multi-method interview approach, but also with regard to its theoretical and practical implications.
Shaban-Nejad, Arash; Haarslev, Volker
2015-01-01
The issue of ontology evolution and change management is inadequately addressed by available tools and algorithms, mostly due to the lack of suitable knowledge representation formalisms to deal with temporal abstract notations and the overreliance on human factors. Also most of the current approaches have been focused on changes within the internal structure of ontologies and interactions with other existing ontologies have been widely neglected. In our research, after revealing and classifying some of the common alterations in a number of popular biomedical ontologies, we present a novel agent-based framework, Represent, Legitimate and Reproduce (RLR), to semi-automatically manage the evolution of bio-ontologies, with emphasis on the FungalWeb Ontology, with minimal human intervention. RLR assists and guides ontology engineers through the change management process in general and aids in tracking and representing the changes, particularly through the use of category theory and hierarchical graph transformation.
Work Strategies: The Development and Testing of a Model.
1986-03-01
strategies (e.g., Craik & Lockhart, 1972); hemispheric processing differences (e.g., Seamon & Gazzaniga, 1973); problem-solving strategies (e.g… Charness, N. (1981). Aging and skilled problem solving. Journal of Experimental Psychology: General, 110, 21-38. Craik, F. I. M., & Lockhart, R. S. (1972). Levels of processing: A framework for memory research. Journal of Verbal Learning and Verbal Behavior, 11, 671-684. Dansereau, D. F., McDonald
ERIC Educational Resources Information Center
DeMeo, Stephen
2007-01-01
Common examples of graphic organizers include flow diagrams, concept maps, and decision trees. The author has created a novel type of graphic organizer called a decision map. A decision map is a directional heuristic that helps learners solve problems within a generic framework. It incorporates questions that the user must answer and contains…
Uciteli, Alexandr; Groß, Silvia; Kireyev, Sergej; Herre, Heinrich
2011-08-09
This paper presents an ontologically founded basic architecture for information systems, which are intended to capture, represent, and maintain metadata for various domains of clinical and epidemiological research. Clinical trials exhibit an important basis for clinical research, and the accurate specification of metadata and their documentation and application in clinical and epidemiological study projects represents a significant expense in the project preparation and has a relevant impact on the value and quality of these studies. An ontological foundation of an information system provides a semantic framework for the precise specification of those entities which are presented in this system. This semantic framework should be grounded, according to our approach, on a suitable top-level ontology. Such an ontological foundation leads to a deeper understanding of the entities of the domain under consideration, and provides a common unifying semantic basis, which supports the integration of data and the interoperability between different information systems. The intended information systems will be applied to the field of clinical and epidemiological research and will provide, depending on the application context, a variety of functionalities. In the present paper, we focus on a basic architecture which might be common to all such information systems. The research, set forth in this paper, is included in a broader framework of clinical research and continues the work of the IMISE on these topics.
From Information Society to Knowledge Society: The Ontology Issue
NASA Astrophysics Data System (ADS)
Roche, Christophe
2002-09-01
Information society, virtual enterprise, e-business rely more and more on communication and knowledge sharing between heterogeneous actors. But, no communication is possible, and all the more so no co-operation or collaboration, if those actors do not share the same or at least a compatible meaning for the terms they use. Ontology, understood as an agreed vocabulary of common terms and meanings, is a solution to that problem. Nevertheless, although there is quite a lot of experience in using ontologies, several barriers remain which stand against a real use of ontology. As a matter of fact, it is very difficult to build, reuse and share ontologies. We claim that the ontology problem requires a multidisciplinary approach based on sound epistemological, logical and linguistic principles. This article presents the Ontological Knowledge Station (OK Station©), a software environment for building and using ontologies which relies on such principles. The OK Station is currently being used in several industrial applications.
Development of an Adolescent Depression Ontology for Analyzing Social Data.
Jung, Hyesil; Park, Hyeoun-Ae; Song, Tae-Min; Jeon, Eunjoo; Kim, Ae Ran; Lee, Joo Yun
2015-01-01
Depression in adolescence is associated with significant suicidality. Therefore, it is important to detect the risk for depression and provide timely care to adolescents. This study aims to develop an ontology for collecting and analyzing social media data about adolescent depression. The ontology was developed using the 'Ontology Development 101' methodology. Important terms were extracted from several clinical practice guidelines and from postings on social networking services. We extracted 777 terms, which were categorized into 'risk factors', 'signs and symptoms', 'screening', 'diagnosis', 'treatment', and 'prevention'. The ontology developed in this study can be used as a framework for understanding adolescent depression using unstructured data from social media.
Age-related differences in strategic monitoring during arithmetic problem solving.
Geurten, Marie; Lemaire, Patrick
2017-10-01
We examined the role of metacognitive monitoring in strategic behavior during arithmetic problem solving, a process that is expected to shed light on age-related differences in strategy selection. Young and older adults accomplished better strategy-judgment, better strategy-selection, and strategy-execution tasks. Data showed that participants made better strategy judgments on problems with homogeneous unit digits (i.e., problems with both unit digits smaller or larger than 5; 31×62) relative to problems with heterogeneous unit digits (i.e., problems with one unit digit smaller or larger than 5; 31×67) and when the better strategy was cued on rounding-up problems (e.g., 68×23) compared to rounding-down problems (e.g., 36×53). Results also indicated higher rates of better strategy judgment in young than in older adults. These aging effects differed across problem types. Older adults made more accurate judgments on rounding-up problems than on rounding-down problems when the cued strategy was rounding-up, while young adults did not show such problem-related differences. Moreover, strategy selection correlated with strategy judgment, and even more so in older adults than in young adults. To discuss the implications of these findings, we propose a theoretical framework of how strategy judgments occur in young and older adults and discuss how this framework enables us to understand the relationships between metacognitive monitoring and strategic behaviors when participants solve arithmetic problems. Copyright © 2017 Elsevier B.V. All rights reserved.
A Framework and a Methodology for Developing Authentic Constructivist e-Learning Environments
ERIC Educational Resources Information Center
Zualkernan, Imran A.
2006-01-01
Semantically rich domains require operative knowledge to solve complex problems in real-world settings. These domains provide an ideal environment for developing authentic constructivist e-learning environments. In this paper we present a framework and a methodology for developing authentic learning environments for such domains. The framework is…
Problem-Framing: A perspective on environmental problem-solving
NASA Astrophysics Data System (ADS)
Bardwell, Lisa V.
1991-09-01
The specter of environmental calamity calls for the best efforts of an involved public. Ironically, the way people understand the issues all too often serves to discourage and frustrate rather than motivate them to action. This article draws from problem-solving perspectives offered by cognitive psychology and conflict management to examine a framework for thinking about environmental problems that promises to help rather than hinder efforts to address them. Problem-framing emphasizes focusing on the problem definition. Since how one defines a problem determines one's understanding of and approach to that problem, being able to redefine or reframe a problem and to explore the “problem space” can help broaden the range of alternatives and solutions examined. Problem-framing incorporates a cognitive perspective on how people respond to information. It explains why an emphasis on problem definition is not part of people's typical approach to problems. It recognizes the importance of structure and of having ways to organize that information on one's problem-solving effort. Finally, problem-framing draws on both cognitive psychology and conflict management for strategies to manage information and to create a problem-solving environment that not only encourages participation but can yield better approaches to our environmental problems.
Spatio-structural granularity of biological material entities
2010-01-01
Background With the continuously increasing demands on knowledge- and data-management that databases have to meet, ontologies and the theories of granularity they use become more and more important. Unfortunately, currently used theories and schemes of granularity unnecessarily limit the performance of ontologies due to two shortcomings: (i) they do not allow the integration of multiple granularity perspectives into one granularity framework; (ii) they are not applicable to cumulative-constitutively organized material entities, which cover most of the biomedical material entities. Results The above-mentioned shortcomings are responsible for the major inconsistencies in currently used spatio-structural granularity schemes. By using the Basic Formal Ontology (BFO) as a top-level ontology and Keet's general theory of granularity, a granularity framework is presented that is applicable to cumulative-constitutively organized material entities. It provides a scheme for granulating complex material entities into their constitutive and regional parts by integrating various compositional and spatial granularity perspectives. Within a scale-dependent resolution perspective, it even allows distinguishing different types of representations of the same material entity. Within other scale-dependent perspectives, which are based on specific types of measurements (e.g. weight, volume, etc.), the possibility of organizing instances of material entities independent of their parthood relations and only according to increasing measures is provided as well. All granularity perspectives are connected to one another through overcrossing granularity levels, together forming an integrated whole that uses the compositional object perspective as an integrating backbone. This granularity framework makes it possible to consistently assign structural granularity values to all different types of material entities. 
Conclusions The framework presented here provides a spatio-structural granularity framework for all domain reference ontologies that model cumulative-constitutively organized material entities. With its multi-perspective approach it allows querying an ontology stored in a database at one's desired levels of detail: the contents of a database can be organized according to diverse granularity perspectives, which in turn provide different views on its content (i.e. data, knowledge), each organized into different levels of detail. PMID:20509878
The Relationship between User Expertise and Structural Ontology Characteristics
ERIC Educational Resources Information Center
Waldstein, Ilya Michael
2014-01-01
Ontologies are commonly used to support application tasks such as natural language processing, knowledge management, learning, browsing, and search. Literature recommends considering specific context during ontology design, and highlights that a different context is responsible for problems in ontology reuse. However, there is still no clear…
NASA Astrophysics Data System (ADS)
Milbourne, Jeffrey David
The purpose of this dissertation study was to explore the experiences of high school physics students who were solving complex, ill-structured problems, in an effort to better understand how self-regulatory behavior mediated the project experience. Consistent with Voss, Green, Post, and Penner's (1983) conception of an ill-structured problem in the natural sciences, the 'problems' consisted of scientific research projects that students completed under the supervision of a faculty mentor. Zimmerman and Campillo's (2003) self-regulatory framework of problem solving provided a holistic guide to data collection and analysis of this multi-case study, with five individual student cases. The study's results are explored in two manuscripts, each targeting a different audience. The first manuscript, intended for the Science Education Research community, presents a thick, rich description of the students' project experiences, consistent with a qualitative, case study analysis. Findings suggest that intrinsic interest was an important self-regulatory factor that helped motivate students throughout their project work, and that the self-regulatory cycle of forethought, performance monitoring, and self-reflection was an important component of the problem-solving process. Findings also support the application of Zimmerman and Campillo's framework to complex, ill-structured problems, particularly the cyclical nature of the framework. Finally, this study suggests that scientific research projects, with the appropriate support, can be a mechanism for improving students' self-regulatory behavior. The second manuscript, intended for Physics practitioners, combines the findings of the first manuscript with the perspectives of the primary, on-site research mentor, who has over a decade's worth of experience mentoring students doing physics research.
His experience suggests that a successful research experience requires certain characteristics, including a slow 'on-ramp' to the research experience, space to experience productive failure, and an opportunity for students to enjoy the work they are doing.
Jung, Hyesil; Park, Hyeoun-Ae; Song, Tae-Min
2017-07-24
Social networking services (SNSs) contain abundant information about the feelings, thoughts, interests, and patterns of behavior of adolescents that can be obtained by analyzing SNS postings. An ontology that expresses the shared concepts and their relationships in a specific field could be used as a semantic framework for social media data analytics. The aim of this study was to refine an adolescent depression ontology and terminology as a framework for analyzing social media data and to evaluate description logics between classes and the applicability of this ontology to sentiment analysis. The domain and scope of the ontology were defined using competency questions. The concepts constituting the ontology and terminology were collected from clinical practice guidelines, the literature, and social media postings on adolescent depression. Class concepts, their hierarchy, and the relationships among class concepts were defined. An internal structure of the ontology was designed using the entity-attribute-value (EAV) triplet data model, and superclasses of the ontology were aligned with the upper ontology. Description logics between classes were evaluated by mapping concepts extracted from the answers to frequently asked questions (FAQs) onto the ontology concepts derived from description logic queries. The applicability of the ontology was validated by examining the representability of 1358 sentiment phrases using the ontology EAV model and conducting sentiment analyses of social media data using ontology class concepts. We developed an adolescent depression ontology that comprised 443 classes and 60 relationships among the classes; the terminology comprised 1682 synonyms of the 443 classes. In the description logics test, no error in relationships between classes was found, and about 89% (55/62) of the concepts cited in the answers to FAQs mapped onto the ontology class. 
Regarding applicability, the EAV triplet models of the ontology class represented about 91.4% of the sentiment phrases included in the sentiment dictionary. In the sentiment analyses, "academic stresses" and "suicide" contributed negatively to the sentiment of adolescent depression. The ontology and terminology developed in this study provide a semantic foundation for analyzing social media data on adolescent depression. To be useful in social media data analysis, the ontology, especially the terminology, needs to be updated constantly to reflect rapidly changing terms used by adolescents in social media postings. In addition, more attributes and value sets reflecting depression-related sentiments should be added to the ontology. ©Hyesil Jung, Hyeoun-Ae Park, Tae-Min Song. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 24.07.2017.
Jung, Hyesil; Song, Tae-Min
2017-01-01
Background Social networking services (SNSs) contain abundant information about the feelings, thoughts, interests, and patterns of behavior of adolescents that can be obtained by analyzing SNS postings. An ontology that expresses the shared concepts and their relationships in a specific field could be used as a semantic framework for social media data analytics. Objective The aim of this study was to refine an adolescent depression ontology and terminology as a framework for analyzing social media data and to evaluate description logics between classes and the applicability of this ontology to sentiment analysis. Methods The domain and scope of the ontology were defined using competency questions. The concepts constituting the ontology and terminology were collected from clinical practice guidelines, the literature, and social media postings on adolescent depression. Class concepts, their hierarchy, and the relationships among class concepts were defined. An internal structure of the ontology was designed using the entity-attribute-value (EAV) triplet data model, and superclasses of the ontology were aligned with the upper ontology. Description logics between classes were evaluated by mapping concepts extracted from the answers to frequently asked questions (FAQs) onto the ontology concepts derived from description logic queries. The applicability of the ontology was validated by examining the representability of 1358 sentiment phrases using the ontology EAV model and conducting sentiment analyses of social media data using ontology class concepts. Results We developed an adolescent depression ontology that comprised 443 classes and 60 relationships among the classes; the terminology comprised 1682 synonyms of the 443 classes. In the description logics test, no error in relationships between classes was found, and about 89% (55/62) of the concepts cited in the answers to FAQs mapped onto the ontology class. 
Regarding applicability, the EAV triplet models of the ontology class represented about 91.4% of the sentiment phrases included in the sentiment dictionary. In the sentiment analyses, “academic stresses” and “suicide” contributed negatively to the sentiment of adolescent depression. Conclusions The ontology and terminology developed in this study provide a semantic foundation for analyzing social media data on adolescent depression. To be useful in social media data analysis, the ontology, especially the terminology, needs to be updated constantly to reflect rapidly changing terms used by adolescents in social media postings. In addition, more attributes and value sets reflecting depression-related sentiments should be added to the ontology. PMID:28739560
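The entity-attribute-value (EAV) triplet data model that underpins the ontology's internal structure can be sketched in a few lines. The class names, attributes, and values below are hypothetical illustrations, not terms from the published ontology:

```python
# Minimal sketch of an EAV (entity-attribute-value) triplet store of the
# kind used to represent ontology classes and sentiment phrases.
# All entity, attribute, and value names here are illustrative only.

from collections import defaultdict

class EAVStore:
    def __init__(self):
        # entity -> attribute -> list of values (one entity may carry
        # many attribute-value pairs, hence the nested mapping)
        self.triplets = defaultdict(lambda: defaultdict(list))

    def add(self, entity, attribute, value):
        self.triplets[entity][attribute].append(value)

    def query(self, entity, attribute):
        return self.triplets[entity][attribute]

store = EAVStore()
# Hypothetical triplets linking class concepts to sentiment and synonyms:
store.add("AcademicStress", "hasSentiment", "negative")
store.add("AcademicStress", "hasSynonym", "exam stress")
store.add("Suicide", "hasSentiment", "negative")

print(store.query("AcademicStress", "hasSentiment"))  # ['negative']
```

A triplet layout like this is what lets new attributes be attached to a class without changing any schema, which is why EAV models suit rapidly evolving terminologies.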
Content-oriented Approach to Organization of Theories and Its Utilization
NASA Astrophysics Data System (ADS)
Hayashi, Yusuke; Bourdeau, Jacqueline; Mizoguchi, Riichiro
Although the relation between theory and practice is a foundation of scientific and technological development, the gap between theory and practice has widened in recent years. This gap carries a risk of distrust of science and technology. Ontological engineering, as content-oriented research, is expected to contribute to resolving the gap. This paper presents the feasibility of organizing theoretical knowledge with ontological engineering, and of building new-generation intelligent systems on it, through an application of ontological engineering in the area of learning/instruction support. This area also suffers from the gap between theory and practice, and its resolution is strongly required. We have previously proposed the OMNIBUS ontology, a comprehensive ontology that covers different learning/instructional theories and paradigms, and SMARTIES, a theory-aware and standard-compliant authoring system for making learning/instructional scenarios based on the OMNIBUS ontology. We believe that theory-awareness and standard-compliance bridge the gap between theory and practice, because they link theories to practical use of standard technologies and enable practitioners to easily enjoy theoretical support while using standard technologies in practice. The following goals are set in order to achieve this: computers (1) understand a variety of learning/instructional theories based on an organization of them, (2) use that understanding to help authors make learning/instructional scenarios, and (3) make such theoretically sound scenarios interoperable within the framework of standard technologies. This paper suggests an ontological engineering solution to the achievement of these three goals.
Although the evaluation is far from complete in terms of practical use, we believe that the results of this study address high-level technical challenges from the viewpoint of the current state of the art in artificial intelligence research, not only in education but also in general, and we therefore hope that they constitute a substantial contribution to the organization of theoretical knowledge in many other areas.
Ontology-supported research on vaccine efficacy, safety and integrative biological networks.
He, Yongqun
2014-07-01
While vaccine efficacy and safety research has dramatically progressed with the methods of in silico prediction and data mining, many challenges still exist. A formal ontology is a human- and computer-interpretable set of terms and relations that represent entities in a specific domain and how these terms relate to each other. Several community-based ontologies (including Vaccine Ontology, Ontology of Adverse Events and Ontology of Vaccine Adverse Events) have been developed to support vaccine and adverse event representation, classification, data integration, literature mining of host-vaccine interaction networks, and analysis of vaccine adverse events. The author further proposes minimal vaccine information standards and their ontology representations, ontology-based linked open vaccine data and meta-analysis, an integrative One Network ('OneNet') Theory of Life, and ontology-based approaches to study and apply the OneNet theory. In the Big Data era, these proposed strategies provide a novel framework for advanced data integration and analysis of fundamental biological networks including vaccine immune mechanisms.
Ontology-supported Research on Vaccine Efficacy, Safety, and Integrative Biological Networks
He, Yongqun
2016-01-01
Summary While vaccine efficacy and safety research has dramatically progressed with the methods of in silico prediction and data mining, many challenges still exist. A formal ontology is a human- and computer-interpretable set of terms and relations that represent entities in a specific domain and how these terms relate to each other. Several community-based ontologies (including the Vaccine Ontology, Ontology of Adverse Events, and Ontology of Vaccine Adverse Events) have been developed to support vaccine and adverse event representation, classification, data integration, literature mining of host-vaccine interaction networks, and analysis of vaccine adverse events. The author further proposes minimal vaccine information standards and their ontology representations, ontology-based linked open vaccine data and meta-analysis, an integrative One Network (“OneNet”) Theory of Life, and ontology-based approaches to study and apply the OneNet theory. In the Big Data era, these proposed strategies provide a novel framework for advanced data integration and analysis of fundamental biological networks including vaccine immune mechanisms. PMID:24909153
NASA Astrophysics Data System (ADS)
Hong, Haibo; Yin, Yuehong; Chen, Xing
2016-11-01
Despite the rapid development of computer science and information technology, an efficient human-machine integrated enterprise information system for designing complex mechatronic products has still not been fully realized, partly because of inharmonious communication among collaborators. Therefore, one challenge in human-machine integration is how to establish an appropriate knowledge management (KM) model to support the integration and sharing of heterogeneous product knowledge. To address the diversity of design knowledge, this article proposes an ontology-based model to achieve an unambiguous and normative representation of knowledge. First, an ontology-based human-machine integrated design framework is described; corresponding ontologies and sub-ontologies are then established according to different purposes and scopes. Second, a similarity calculation-based ontology integration method composed of ontology mapping and ontology merging is introduced. An ontology searching-based knowledge sharing method is then developed. Finally, a case of human-machine integrated design of a large ultra-precision grinding machine is used to demonstrate the effectiveness of the method.
Boucher, Philip
2011-09-01
This article builds upon previous discussion of social and technical determinisms as implicit positions in the biofuel debate. To ensure these debates are balanced, it has been suggested that they should be designed to contain a variety of deterministic positions. Whilst it is agreed that determinism does not feature strongly in contemporary academic literatures, it is found that deterministic positions have generally been superseded by an absence of any substantive conceptualisation of how the social shaping of technology may be related to, or occur alongside, an objective or autonomous reality. The problem of determinism emerges at an ontological level and must be resolved in situ. A critical realist approach to technology is presented which may provide a more appropriate framework for debate. In dialogue with previous discussion, the distribution of responsibility is revisited with reference to the role of scientists and engineers.
The Role of Ontologies in Schema-based Program Synthesis
NASA Technical Reports Server (NTRS)
Bures, Tomas; Denney, Ewen; Fischer, Bernd; Nistor, Eugen C.
2004-01-01
Program synthesis is the process of automatically deriving executable code from (non-executable) high-level specifications. It is more flexible and powerful than conventional code generation techniques that simply translate algorithmic specifications into lower-level code or only create code skeletons from structural specifications (such as UML class diagrams). Key to building a successful synthesis system is specializing to an appropriate application domain. The AUTOBAYES and AUTOFILTER systems, under development at NASA Ames, operate in the two domains of data analysis and state estimation, respectively. The central concept of both systems is the schema, a representation of reusable computational knowledge. This can take various forms, including high-level algorithm templates, code optimizations, datatype refinements, or architectural information. A schema also contains applicability conditions that are used to determine when it can be applied safely. These conditions can refer to the initial specification, to intermediate results, or to elements of the partially-instantiated code. Schema-based synthesis uses AI technology to recursively apply schemas to gradually refine a specification into executable code. This process proceeds in two main phases. A front-end gradually transforms the problem specification into a program represented in an abstract intermediate code. A backend then compiles this further down into a concrete target programming language of choice. A core engine applies schemas on the initial problem specification, then uses the output of those schemas as the input for other schemas, until the full implementation is generated. Since there might be different schemas that implement different solutions to the same problem this process can generate an entire solution tree. AUTOBAYES and AUTOFILTER have reached the level of maturity where they enable users to solve interesting application problems, e.g., the analysis of Hubble Space Telescope images. 
They are large (in total around 100 kLoC of Prolog), knowledge-intensive systems that employ complex symbolic reasoning to generate a wide range of non-trivial programs for complex application domains. Their schemas can have complex interactions, which make it hard to change them in isolation or even understand what an existing schema actually does. Adding more capabilities by increasing the number of schemas will only worsen this situation, ultimately leading to the entropy death of the synthesis system. The root cause of this problem is that the domain knowledge is scattered throughout the entire system and only represented implicitly in the schema implementations. In our current work, we are addressing this problem by making explicit the knowledge from different parts of the synthesis system. Here, we discuss how Gruber's definition of an ontology as an explicit specification of a conceptualization matches our efforts in identifying and explicating the domain-specific concepts. We outline the roles ontologies play in schema-based synthesis and argue that they address different audiences and serve different purposes. Their first role is descriptive: they serve as explicit documentation, and help to understand the internal structure of the system. Their second role is prescriptive: they provide the formal basis against which the other parts of the system (e.g., schemas) can be checked. Their final role is referential: ontologies also provide semantically meaningful "hooks" which allow schemas and tools to access the internal state of the program derivation process (e.g., fragments of the generated code) in domain-specific rather than language-specific terms, and thus to modify it in a controlled fashion. For discussion purposes we use AUTOLINEAR, a small synthesis system we are currently experimenting with, which can generate code for solving a system of linear equations, Az = b.
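As a rough illustration of the kind of target code a synthesizer like AUTOLINEAR aims to produce, the sketch below solves Az = b by Gaussian elimination with partial pivoting. It is hand-written illustrative code, not output of the system:

```python
# Illustrative solver for a linear system Az = b using Gaussian
# elimination with partial pivoting. NOT code generated by AUTOLINEAR;
# just an example of the class of programs such a system targets.

def solve_linear(A, b):
    n = len(A)
    # Build the augmented matrix [A | b], working on copies of the rows.
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        # Partial pivoting: swap in the row with the largest entry
        # in this column to improve numerical stability.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Eliminate entries below the pivot.
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    # Back-substitution from the last row upward.
    z = [0.0] * n
    for r in range(n - 1, -1, -1):
        z[r] = (M[r][n] - sum(M[r][c] * z[c] for c in range(r + 1, n))) / M[r][r]
    return z

print(solve_linear([[2.0, 1.0], [1.0, 3.0]], [3.0, 5.0]))  # ≈ [0.8, 1.4]
```

A schema-based synthesizer would choose among algorithm templates like this one (or LU/Cholesky variants) based on applicability conditions over the specification, e.g. whether A is known to be symmetric positive definite.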
Adluru, Nagesh; Yang, Xingwei; Latecki, Longin Jan
2015-05-01
We consider a problem of finding maximum weight subgraphs (MWS) that satisfy hard constraints in a weighted graph. The constraints specify the graph nodes that must belong to the solution as well as mutual exclusions of graph nodes, i.e., pairs of nodes that cannot belong to the same solution. Our main contribution is a novel inference approach for solving this problem in a sequential Monte Carlo (SMC) sampling framework. Usually in an SMC framework there is a natural ordering of the states of the samples. The order typically depends on observations about the states or on the annealing setup used. In many applications (e.g., image jigsaw puzzle problems), all observations (e.g., puzzle pieces) are given at once and it is hard to define a natural ordering. Therefore, we relax the assumption of having ordered observations about states and propose a novel SMC algorithm for obtaining the maximum a posteriori estimate of a high-dimensional posterior distribution. This is achieved by exploring different orders of states and selecting the most informative permutations in each step of the sampling. Our experimental results demonstrate that the proposed inference framework significantly outperforms loopy belief propagation in solving the image jigsaw puzzle problem. In particular, our inference quadruples the accuracy of the puzzle assembly compared to that of loopy belief propagation.
Sequential Monte Carlo for Maximum Weight Subgraphs with Application to Solving Image Jigsaw Puzzles
Adluru, Nagesh; Yang, Xingwei; Latecki, Longin Jan
2015-01-01
We consider a problem of finding maximum weight subgraphs (MWS) that satisfy hard constraints in a weighted graph. The constraints specify the graph nodes that must belong to the solution as well as mutual exclusions of graph nodes, i.e., pairs of nodes that cannot belong to the same solution. Our main contribution is a novel inference approach for solving this problem in a sequential Monte Carlo (SMC) sampling framework. Usually in an SMC framework there is a natural ordering of the states of the samples. The order typically depends on observations about the states or on the annealing setup used. In many applications (e.g., image jigsaw puzzle problems), all observations (e.g., puzzle pieces) are given at once and it is hard to define a natural ordering. Therefore, we relax the assumption of having ordered observations about states and propose a novel SMC algorithm for obtaining the maximum a posteriori estimate of a high-dimensional posterior distribution. This is achieved by exploring different orders of states and selecting the most informative permutations in each step of the sampling. Our experimental results demonstrate that the proposed inference framework significantly outperforms loopy belief propagation in solving the image jigsaw puzzle problem. In particular, our inference quadruples the accuracy of the puzzle assembly compared to that of loopy belief propagation. PMID:26052182
Hoehndorf, Robert; Alshahrani, Mona; Gkoutos, Georgios V; Gosline, George; Groom, Quentin; Hamann, Thomas; Kattge, Jens; de Oliveira, Sylvia Mota; Schmidt, Marco; Sierra, Soraya; Smets, Erik; Vos, Rutger A; Weiland, Claus
2016-11-14
The systematic analysis of a large number of comparable plant trait data can support investigations into phylogenetics and ecological adaptation, with broad applications in evolutionary biology, agriculture, conservation, and the functioning of ecosystems. Floras, i.e., books collecting the information on all known plant species found within a region, are a potentially rich source of such plant trait data. Floras describe plant traits with a focus on morphology and other traits relevant for species identification in addition to other characteristics of plant species, such as ecological affinities, distribution, economic value, health applications, traditional uses, and so on. However, a key limitation in systematically analyzing information in Floras is the lack of a standardized vocabulary for the described traits as well as the difficulties in extracting structured information from free text. We have developed the Flora Phenotype Ontology (FLOPO), an ontology for describing traits of plant species found in Floras. We used the Plant Ontology (PO) and the Phenotype And Trait Ontology (PATO) to extract entity-quality relationships from digitized taxon descriptions in Floras, and used a formal ontological approach based on phenotype description patterns and automated reasoning to generate the FLOPO. The resulting ontology consists of 25,407 classes and is based on the PO and PATO. The classified ontology closely follows the structure of Plant Ontology in that the primary axis of classification is the observed plant anatomical structure, and more specific traits are then classified based on parthood and subclass relations between anatomical structures as well as subclass relations between phenotypic qualities. The FLOPO is primarily intended as a framework based on which plant traits can be integrated computationally across all species and higher taxa of flowering plants. 
Importantly, it is not intended to replace established vocabularies or ontologies, but rather to serve as an overarching framework based on which different application- and domain-specific ontologies, thesauri, and vocabularies of phenotypes observed in flowering plants can be integrated.
Mainstream web standards now support science data too
NASA Astrophysics Data System (ADS)
Richard, S. M.; Cox, S. J. D.; Janowicz, K.; Fox, P. A.
2017-12-01
The science community has developed many models and ontologies for representation of scientific data and knowledge. In some cases these have been built as part of coordinated frameworks. For example, the biomedical community's OBO Foundry federates applications covering various aspects of life sciences, which are united through reference to a common foundational ontology (BFO). The SWEET ontology, originally developed at NASA and now governed through ESIP, is a single large unified ontology for earth and environmental sciences. On a smaller scale, GeoSciML provides a UML and corresponding XML representation of geological mapping and observation data. Some of the key concepts related to scientific data and observations have recently been incorporated into domain-neutral mainstream ontologies developed by the World Wide Web Consortium through their Spatial Data on the Web working group (SDWWG). OWL-Time has been enhanced to support temporal reference systems needed for science, and has been deployed in a linked data representation of the International Chronostratigraphic Chart. The Semantic Sensor Network ontology has been extended to cover samples and sampling, including relationships between samples. Gridded data and time-series are supported by applications of the statistical data-cube ontology (QB) for earth observations (the EO-QB profile) and spatio-temporal data (QB4ST). These standard ontologies and encodings can be used directly for science data, or can provide a bridge to specialized domain ontologies. There are a number of advantages in alignment with the W3C standards. The W3C vocabularies use discipline-neutral language and thus support cross-disciplinary applications directly without complex mappings. The W3C vocabularies are already aligned with the core ontologies that are the building blocks of the semantic web. The W3C vocabularies are each tightly scoped, thus encouraging good practices in the combination of complementary small ontologies.
The W3C vocabularies are hosted on well-known, reliable infrastructure. The W3C SDWWG outputs are being selectively adopted by the general schema.org discovery framework.
NASA Astrophysics Data System (ADS)
Anku, Sitsofe E.
1997-09-01
Using the reform documents of the National Council of Teachers of Mathematics (NCTM) (NCTM, 1989, 1991, 1995), a theory-based multi-dimensional assessment framework (the "SEA" framework) which should help expand the scope of assessment in mathematics is proposed. This framework uses a context based on mathematical reasoning and has components that comprise mathematical concepts, mathematical procedures, mathematical communication, mathematical problem solving, and mathematical disposition.
Pozza, Giandomenico; Borgo, Stefano; Oltramari, Alessandro; Contalbrigo, Laura; Marangon, Stefano
2016-09-08
Ontologies are widely used both in the life sciences and in the management of public and private companies. Typically, the different offices in an organization develop their own models and related ontologies to capture specific tasks and goals. Although there might be an overall coordination, the use of distinct ontologies can jeopardize the integration of data across the organization since data sharing and reusability are sensitive to modeling choices. The paper provides a study of the entities that are typically found at the reception, analysis and report phases in public institutes in the life science domain. Ontological considerations and techniques are introduced and their implementation exemplified by studying the Istituto Zooprofilattico Sperimentale delle Venezie (IZSVe), a public veterinary institute with different geographical locations and several laboratories. Different modeling issues are discussed, such as the identification and characterization of the main entities in these phases; the classification of the (types of) data; and the clarification of the contexts and the roles of the involved entities. The study is based on a foundational ontology and shows how it can be extended to a comprehensive and coherent framework comprising the different institute's roles, processes and data. In particular, it shows how to use notions lying at the borderline between ontology and applications, like that of knowledge object. The paper aims to help the modeler to understand the core viewpoint of the organization and to improve data transparency. The study shows that the entities at play can be analyzed within a single ontological perspective, allowing us to isolate a single ontological framework for the whole organization. This facilitates the development of coherent representations of the entities and related data, and fosters the use of integrated software for data management and reasoning across the organization.
Using Stochastic Spiking Neural Networks on SpiNNaker to Solve Constraint Satisfaction Problems
Fonseca Guerra, Gabriel A.; Furber, Steve B.
2017-01-01
Constraint satisfaction problems (CSP) are at the core of numerous scientific and technological applications. However, CSPs belong to the NP-complete complexity class, for which the existence (or not) of efficient algorithms remains a major unsolved question in computational complexity theory. In the face of this fundamental difficulty, heuristics and approximation methods are used to approach instances of NP (e.g., decision and hard optimization problems). The human brain efficiently handles CSPs both in perception and behavior using spiking neural networks (SNNs), and recent studies have demonstrated that the noise embedded within an SNN can be used as a computational resource to solve CSPs. Here, we provide a software framework for the implementation of such noisy neural solvers on the SpiNNaker massively parallel neuromorphic hardware, further demonstrating their potential to implement a stochastic search that solves instances of P and NP problems expressed as CSPs. This facilitates the exploration of new optimization strategies and the understanding of the computational abilities of SNNs. We demonstrate the basic principles of the framework by solving difficult instances of the Sudoku puzzle and of the map color problem, and explore its application to spin glasses. The solver works as a stochastic dynamical system, which is attracted by the configuration that solves the CSP. The noise allows an optimal exploration of the space of configurations, looking for the satisfiability of all the constraints; if applied discontinuously, it can also force the system to leap to a new random configuration, effectively causing a restart. PMID:29311791
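The noise-driven stochastic search described above is, in conventional software, close in spirit to the classic min-conflicts heuristic with random restarts. The sketch below solves a toy map-coloring CSP under that analogy; it is not the SpiNNaker/SNN implementation:

```python
# Min-conflicts stochastic local search for graph (map) coloring,
# sketched as a conventional-software analogue of the noise-driven
# SNN solver described above; NOT the SpiNNaker implementation.

import random

def min_conflicts(neighbors, colors, max_steps=10000, seed=0):
    rng = random.Random(seed)
    nodes = list(neighbors)
    # Random initial configuration (plays the role of a "restart").
    assign = {n: rng.choice(colors) for n in nodes}

    def conflicts(node, color):
        # Number of neighbors that currently hold the given color.
        return sum(assign[m] == color for m in neighbors[node])

    for _ in range(max_steps):
        bad = [n for n in nodes if conflicts(n, assign[n]) > 0]
        if not bad:
            return assign  # all constraints satisfied
        node = rng.choice(bad)  # noise: pick a random conflicted node
        # Move to the least-conflicted color, breaking ties randomly.
        assign[node] = min(colors,
                           key=lambda c: (conflicts(node, c), rng.random()))
    return None  # give up; caller may restart with a fresh seed

# Tiny "map": a triangle of mutually adjacent regions needs 3 colors.
triangle = {"A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B"]}
solution = min_conflicts(triangle, ["red", "green", "blue"])
ok = all(solution[n] != solution[m] for n in triangle for m in triangle[n])
print(ok)  # True
```

The random choice of conflicted node and random tie-breaking play roughly the role the abstract assigns to neural noise: they keep the search from locking into a cycle, while the greedy color choice attracts the system toward a satisfying configuration.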
A unified approach for debugging is-a structure and mappings in networked taxonomies
2013-01-01
Background With the increased use of ontologies and ontology mappings in semantically-enabled applications such as ontology-based search and data integration, the issue of detecting and repairing defects in ontologies and ontology mappings has become increasingly important. These defects can lead to wrong or incomplete results for the applications. Results We propose a unified framework for debugging the is-a structure of and mappings between taxonomies, the most widely used kind of ontologies. We present theory and algorithms as well as an implemented system, RepOSE, that supports a domain expert in detecting and repairing missing and wrong is-a relations and mappings. We also discuss two experiments performed by domain experts: an experiment on the Anatomy ontologies from the Ontology Alignment Evaluation Initiative, and a debugging session for the Swedish National Food Agency. Conclusions Semantically-enabled applications need high quality ontologies and ontology mappings. One key aspect is the detection and removal of defects in the ontologies and ontology mappings. Our system RepOSE provides an environment that supports domain experts in dealing with this issue. We have shown the usefulness of the approach in two experiments by detecting and repairing approximately 200 and 30 defects, respectively. PMID:23548155
Lesion mapping of social problem solving
Colom, Roberto; Paul, Erick J.; Chau, Aileen; Solomon, Jeffrey; Grafman, Jordan H.
2014-01-01
Accumulating neuroscience evidence indicates that human intelligence is supported by a distributed network of frontal and parietal regions that enable complex, goal-directed behaviour. However, the contributions of this network to social aspects of intellectual function remain to be well characterized. Here, we report a human lesion study (n = 144) that investigates the neural bases of social problem solving (measured by the Everyday Problem Solving Inventory) and examine the degree to which individual differences in performance are predicted by a broad spectrum of psychological variables, including psychometric intelligence (measured by the Wechsler Adult Intelligence Scale), emotional intelligence (measured by the Mayer, Salovey, Caruso Emotional Intelligence Test), and personality traits (measured by the Neuroticism-Extraversion-Openness Personality Inventory). Scores for each variable were obtained, followed by voxel-based lesion–symptom mapping. Stepwise regression analyses revealed that working memory, processing speed, and emotional intelligence predict individual differences in everyday problem solving. A targeted analysis of specific everyday problem solving domains (involving friends, home management, consumerism, work, information management, and family) revealed psychological variables that selectively contribute to each. Lesion mapping results indicated that social problem solving, psychometric intelligence, and emotional intelligence are supported by a shared network of frontal, temporal, and parietal regions, including white matter association tracts that bind these areas into a coordinated system. The results support an integrative framework for understanding social intelligence and make specific recommendations for the application of the Everyday Problem Solving Inventory to the study of social problem solving in health and disease. PMID:25070511
Coping and social problem solving correlates of asthma control and quality of life.
McCormick, Sean P; Nezu, Christine M; Nezu, Arthur M; Sherman, Michael; Davey, Adam; Collins, Bradley N
2014-02-01
In a sample of adults with asthma receiving care and medication in an outpatient pulmonary clinic, this study tested for statistical associations between social problem-solving styles, asthma control, and asthma-related quality of life. These variables were measured cross-sectionally as a first step toward more systematic application of social problem-solving frameworks in asthma self-management training. Recruitment occurred during pulmonology clinic service hours. Forty-four adults with physician-confirmed diagnosis of asthma provided data including age, gender, height, weight, race, income, and comorbid conditions. The Asthma Control Questionnaire, the Mini Asthma Quality of Life Questionnaire (Short Form), and peak expiratory force measures offered multiple views of asthma health at the time of the study. Maladaptive coping (impulsive and careless problem-solving styles), based on transactional stress models of health, was assessed with the Social Problem-Solving Inventory-Revised: Short Form. Controlling for variance associated with gender, age, and income, individuals reporting higher impulsive-careless scores exhibited significantly lower scores on asthma control (β = 0.70, p = 0.001, confidence interval (CI) [0.37-1.04]) and lower asthma-related quality of life (β = 0.79, p = 0.017, CI [0.15-1.42]). These findings suggest that specific maladaptive problem-solving styles may uniquely contribute to asthma health burdens. Because problem-solving coping strategies are both measurable and teachable, behavioral interventions aimed at facilitating adaptive coping and problem solving could positively affect patients' asthma management and quality of life.
Ong, Edison; Xiang, Zuoshuang; Zhao, Bin; Liu, Yue; Lin, Yu; Zheng, Jie; Mungall, Chris; Courtot, Mélanie; Ruttenberg, Alan; He, Yongqun
2017-01-01
Linked Data (LD) aims to achieve interconnected data by representing entities using Unified Resource Identifiers (URIs), and sharing information using Resource Description Frameworks (RDFs) and HTTP. Ontologies, which logically represent entities and relations in specific domains, are the basis of LD. Ontobee (http://www.ontobee.org/) is a linked ontology data server that stores ontology information using RDF triple store technology and supports query, visualization and linkage of ontology terms. Ontobee is also the default linked data server for publishing and browsing biomedical ontologies in the Open Biological Ontology (OBO) Foundry (http://obofoundry.org) library. Ontobee currently hosts more than 180 ontologies (including 131 OBO Foundry Library ontologies) with over four million terms. Ontobee provides a user-friendly web interface for querying and visualizing the details and hierarchy of a specific ontology term. Using the eXtensible Stylesheet Language Transformation (XSLT) technology, Ontobee is able to dereference a single ontology term URI, and then output RDF/eXtensible Markup Language (XML) for computer processing or display the HTML information on a web browser for human users. Statistics and detailed information are generated and displayed for each ontology listed in Ontobee. In addition, a SPARQL web interface is provided for custom advanced SPARQL queries of one or multiple ontologies. PMID:27733503
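The RDF-triple view of ontology terms that Ontobee serves can be mimicked with a toy in-memory triple store; the snippet below is a hypothetical illustration (the shortened CURIEs and the pattern-matching `query` helper are not Ontobee's API, which is reached over HTTP and SPARQL).

```python
# Toy in-memory triple store illustrating RDF-style term lookup;
# CURIEs are shortened and used for illustration only.
triples = [
    ("obo:DOID_1612", "rdfs:label", "breast cancer"),
    ("obo:DOID_1612", "rdfs:subClassOf", "obo:DOID_162"),
    ("obo:DOID_162", "rdfs:label", "cancer"),
]

def query(s=None, p=None, o=None):
    """Match a triple pattern; None acts as a SPARQL-style wildcard."""
    return [(ts, tp, to) for ts, tp, to in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

# "What is the label of this term?" and "what are its superclasses?"
label = query("obo:DOID_1612", "rdfs:label")[0][2]
parents = [o for _, _, o in query("obo:DOID_1612", "rdfs:subClassOf")]
```

A triple-store-backed server answers the same kinds of pattern queries at scale, and dereferencing a term URI amounts to returning all triples with that URI as subject.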
Framework Requirements for MDO Application Development
NASA Technical Reports Server (NTRS)
Salas, A. O.; Townsend, J. C.
1999-01-01
Frameworks or problem solving environments that support application development form an active area of research. The Multidisciplinary Optimization Branch at NASA Langley Research Center is investigating frameworks for supporting multidisciplinary analysis and optimization research. The Branch has generated a list of framework requirements, based on the experience gained from the Framework for Interdisciplinary Design Optimization project and the information acquired during a framework evaluation process. In this study, four existing frameworks are examined against these requirements. The results of this examination suggest several topics for further framework research.
Evaluating Health Information Systems Using Ontologies
Eivazzadeh, Shahryar; Anderberg, Peter; Larsson, Tobias C; Fricker, Samuel A; Berglund, Johan
2016-01-01
Background There are several frameworks that attempt to address the challenges of evaluation of health information systems by offering models, methods, and guidelines about what to evaluate, how to evaluate, and how to report the evaluation results. Model-based evaluation frameworks usually suggest universally applicable evaluation aspects but do not consider case-specific aspects. On the other hand, evaluation frameworks that are case specific, by eliciting user requirements, limit their output to the evaluation aspects suggested by the users in the early phases of system development. In addition, these case-specific approaches extract different sets of evaluation aspects from each case, making it challenging to collectively compare, unify, or aggregate the evaluation of a set of heterogeneous health information systems. Objectives The aim of this paper is to find a method capable of suggesting evaluation aspects for a set of one or more health information systems—whether similar or heterogeneous—by organizing, unifying, and aggregating the quality attributes extracted from those systems and from an external evaluation framework. Methods On the basis of the available literature in semantic networks and ontologies, a method (called Unified eValuation using Ontology; UVON) was developed that can organize, unify, and aggregate the quality attributes of several health information systems into a tree-style ontology structure. The method was extended to integrate its generated ontology with the evaluation aspects suggested by model-based evaluation frameworks. An approach was developed to extract evaluation aspects from the ontology that also considers evaluation case practicalities such as the maximum number of evaluation aspects to be measured or their required degree of specificity. 
The method was applied and tested in Future Internet Social and Technological Alignment Research (FI-STAR), a project of 7 cloud-based eHealth applications that were developed and deployed across European Union countries. Results The relevance of the evaluation aspects created by the UVON method for the FI-STAR project was validated by the corresponding stakeholders of each case. These evaluation aspects were extracted from a UVON-generated ontology structure that reflects both the internally declared required quality attributes in the 7 eHealth applications of the FI-STAR project and the evaluation aspects recommended by the Model for ASsessment of Telemedicine applications (MAST) evaluation framework. The extracted evaluation aspects were used to create questionnaires (for the corresponding patients and health professionals) to evaluate each individual case and the whole of the FI-STAR project. Conclusions The UVON method can provide a relevant set of evaluation aspects for a heterogeneous set of health information systems by organizing, unifying, and aggregating the quality attributes through ontological structures. Those quality attributes can be either suggested by evaluation models or elicited from the stakeholders of those systems in the form of system requirements. The method continues to be systematic, context sensitive, and relevant across a heterogeneous set of health information systems. PMID:27311735
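A minimal sketch of the organize/unify/aggregate idea behind UVON, under the assumption that case-specific attribute wordings can be mapped onto shared parent aspects. The systems, attributes, and `parent_of` mapping below are invented for illustration; UVON derives such groupings from semantic relationships rather than a hand-written table.

```python
from collections import defaultdict

# Hypothetical quality attributes elicited from three eHealth systems.
system_attributes = {
    "app_a": ["response time", "encryption", "usability"],
    "app_b": ["latency", "usability", "audit logging"],
    "app_c": ["encryption", "readability"],
}

# Unification step: map case-specific wording onto shared parent aspects
# (this mapping is illustrative only).
parent_of = {
    "response time": "performance", "latency": "performance",
    "encryption": "security", "audit logging": "security",
    "usability": "usability", "readability": "usability",
}

def aggregate(system_attributes, parent_of):
    """Build a two-level ontology tree: aspect -> attribute -> systems."""
    tree = defaultdict(lambda: defaultdict(set))
    for system, attrs in system_attributes.items():
        for attr in attrs:
            tree[parent_of[attr]][attr].add(system)
    return tree

tree = aggregate(system_attributes, parent_of)
# Aspects covered by many systems are candidates for cross-case evaluation.
coverage = {aspect: {s for systems in attrs.values() for s in systems}
            for aspect, attrs in tree.items()}
```

Aggregating upward like this is what lets heterogeneous systems be compared on a shared set of evaluation aspects, which can then be capped in number or specificity as the method describes.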
Characterization and Developmental History of Problem Solving Methods in Medicine
Harbort, Robert A.
1980-01-01
The central thesis of this paper is the importance of the framework in which information is structured. It is technically important in the design of systems; it is also important in guaranteeing that systems are usable by clinicians. Progress in medical computing depends on our ability to develop a more quantitative understanding of the role of context in our choice of problem solving techniques. This in turn will help us to design more flexible and responsive computer systems. The paper contains an overview of some models of knowledge and problem solving methods, a characterization of modern diagnostic techniques, and a discussion of skill development in medical practice. Diagnostic techniques are examined in terms of how they are taught, what problem solving methods they use, and how they fit together into an overall theory of interpretation of the medical status of a patient.
eClims: An Extensible and Dynamic Integration Framework for Biomedical Information Systems.
Savonnet, Marinette; Leclercq, Eric; Naubourg, Pierre
2016-11-01
Biomedical information systems (BIS) require consideration of three types of variability: data variability induced by new high throughput technologies, schema or model variability induced by large scale studies or new fields of research, and knowledge variability resulting from new discoveries. Beyond data heterogeneity, managing variabilities in the context of BIS requires an extensible and dynamic integration process. In this paper, we focus on data and schema variabilities and we propose an integration framework based on ontologies, master data, and semantic annotations. The framework addresses issues related to: 1) collaborative work through a dynamic integration process; 2) variability among studies using an annotation mechanism; and 3) quality control over data and semantic annotations. Our approach relies on two levels of knowledge: BIS-related knowledge is modeled using an application ontology coupled with UML models that allow controlling data completeness and consistency, and domain knowledge is described by a domain ontology, which ensures data coherence. A system built with the eClims framework has been implemented and evaluated in the context of a proteomics platform.
Kwok, T; Smith, K A
2000-09-01
The aim of this paper is to study both the theoretical and experimental properties of chaotic neural network (CNN) models for solving combinatorial optimization problems. Previously we have proposed a unifying framework which encompasses the three main model types, namely, Chen and Aihara's chaotic simulated annealing (CSA) with decaying self-coupling, Wang and Smith's CSA with decaying timestep, and the Hopfield network with chaotic noise. Each of these models can be represented as a special case under the framework for certain conditions. This paper combines the framework with experimental results to provide new insights into the effect of the chaotic neurodynamics of each model. By solving the N-queen problem of various sizes with computer simulations, the CNN models are compared in different parameter spaces, with optimization performance measured in terms of feasibility, efficiency, robustness and scalability. Furthermore, characteristic chaotic neurodynamics crucial to effective optimization are identified, together with a guide to choosing the corresponding model parameters.
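For comparison with the chaotic models discussed above, the N-queen benchmark can be attacked with a plain noisy local search; the sketch below is a min-conflicts baseline with a random "kick", not an implementation of CSA or of the Hopfield network with chaotic noise.

```python
import random

def n_queens_noisy_search(n, noise=0.1, max_steps=200_000, seed=1):
    """Min-conflicts local search with random noise (a non-chaotic baseline)."""
    rng = random.Random(seed)
    col = [rng.randrange(n) for _ in range(n)]  # one queen per row

    def conflicts(r, c):
        # queens in other rows sharing a column or a diagonal with (r, c)
        return sum(1 for r2 in range(n) if r2 != r and
                   (col[r2] == c or abs(col[r2] - c) == abs(r2 - r)))

    for _ in range(max_steps):
        bad = [r for r in range(n) if conflicts(r, col[r]) > 0]
        if not bad:
            return col  # feasible placement found
        r = rng.choice(bad)
        if rng.random() < noise:            # noise: random jump, a restart-like kick
            col[r] = rng.randrange(n)
        else:                               # greedy: move to a least-conflicted column
            col[r] = min(range(n), key=lambda c: conflicts(r, c))
    return None

solution = n_queens_noisy_search(8)
```

The feasibility/efficiency/robustness/scalability comparison in the paper asks how the chaotic neurodynamics of each CNN variant improves on exactly this kind of noisy search.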
Fan, Jianping; Gao, Yuli; Luo, Hangzai
2008-03-01
In this paper, we have developed a new scheme for automatically generating multilevel annotations of large-scale images. To achieve a more complete representation of the various visual properties of the images, both global and local visual features are extracted for image content representation. To tackle the problem of huge intraconcept visual diversity, multiple types of kernels are integrated to characterize the diverse visual similarity relationships between the images more precisely, and a multiple kernel learning algorithm is developed for SVM image classifier training. To address the problem of huge interconcept visual similarity, a novel multitask learning algorithm is developed to learn the correlated classifiers for the sibling image concepts under the same parent concept and enhance their discrimination and adaptation power significantly. To tackle the problem of huge intraconcept visual diversity for the image concepts at the higher levels of the concept ontology, a novel hierarchical boosting algorithm is developed to learn their ensemble classifiers hierarchically. In order to assist users in selecting more effective hypotheses for image classifier training, we have developed a novel hyperbolic framework for large-scale image visualization and interactive hypothesis assessment. Our experiments on large-scale image collections have obtained very positive results.
Principles for Framing a Healthy Food System.
Hamm, Michael W
2009-07-01
Wicked problems are most simply defined as ones that are impossible to solve. In other words, the range of complex interacting influences and effects, the role of human values in all their variety, and the constantly changing conditions in which the problem exists guarantee that what we strive to do is improve the situation rather than solve the wicked problem. This does not mean that we cannot move a long way toward resolving the problem, but simply that there is no clean endpoint. This commentary outlines principles that could be used to move us toward a healthy food system, framed as a wicked problem.
Two-phase framework for near-optimal multi-target Lambert rendezvous
NASA Astrophysics Data System (ADS)
Bang, Jun; Ahn, Jaemyung
2018-03-01
This paper proposes a two-phase framework to obtain a near-optimal solution of the multi-target Lambert rendezvous problem. The objective of the problem is to determine the minimum-cost rendezvous sequence and trajectories to visit a given set of targets within a maximum mission duration. The first phase solves a series of single-target rendezvous problems for all departure-arrival object pairs to generate the elementary solutions, which provide candidate rendezvous trajectories. The second phase formulates a variant of the traveling salesman problem (TSP) using the elementary solutions prepared in the first phase and determines the final rendezvous sequence and trajectories of the multi-target rendezvous problem. The validity of the proposed optimization framework is demonstrated through an asteroid exploration case study.
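The two-phase structure can be sketched as follows: phase 1 would fill a cost table by solving a Lambert problem per object pair (replaced here by placeholder numbers), and phase 2 searches the resulting open-tour TSP over visit orders, here by brute force for a tiny instance.

```python
from itertools import permutations

# Phase 1 stand-in: pairwise transfer costs between depot 0 and targets 1..3.
# Real elementary solutions would come from solving Lambert problems;
# these numbers are placeholders.
cost = {
    (0, 1): 4.0, (0, 2): 2.0, (0, 3): 7.0,
    (1, 2): 3.0, (1, 3): 1.0, (2, 3): 5.0,
}
cost.update({(b, a): c for (a, b), c in list(cost.items())})  # symmetric table

def best_sequence(targets, start=0):
    """Phase 2: brute-force the open TSP over the phase-1 cost table."""
    best, best_cost = None, float("inf")
    for order in permutations(targets):
        total, prev = 0.0, start
        for t in order:
            total += cost[(prev, t)]
            prev = t
        if total < best_cost:
            best, best_cost = order, total
    return best, best_cost

seq, total = best_sequence([1, 2, 3])
```

For realistic target counts the second phase would use a TSP heuristic rather than full enumeration, but the division of labour, expensive pairwise trajectory costs first, then a combinatorial sequencing search, is the same.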
Ontological interpretation of biomedical database content.
Santana da Silva, Filipe; Jansen, Ludger; Freitas, Fred; Schulz, Stefan
2017-06-26
Biological databases store data about laboratory experiments, together with semantic annotations, in order to support data aggregation and retrieval. The exact meaning of such annotations in the context of a database record is often ambiguous. We address this problem by grounding implicit and explicit database content in a formal-ontological framework. By using a typical extract from the databases UniProt and Ensembl, annotated with content from GO, PR, ChEBI and NCBI Taxonomy, we created four ontological models (in OWL), which generate explicit, distinct interpretations under the BioTopLite2 (BTL2) upper-level ontology. The first three models interpret database entries as individuals (IND), defined classes (SUBC), and classes with dispositions (DISP), respectively; the fourth model (HYBR) is a combination of SUBC and DISP. For the evaluation of these four models, we consider (i) database content retrieval, using ontologies as query vocabulary; (ii) information completeness; and, (iii) DL complexity and decidability. The models were tested under these criteria against four competency questions (CQs). IND does not raise any ontological claim, besides asserting the existence of sample individuals and relations among them. Modelling patterns have to be created for each type of annotation referent. SUBC is interpreted regarding maximally fine-grained defined subclasses under the classes referred to by the data. DISP attempts to extract truly ontological statements from the database records, claiming the existence of dispositions. HYBR is a hybrid of SUBC and DISP and is more parsimonious regarding expressiveness and query answering complexity. For each of the four models, the four CQs were submitted as DL queries. This shows the ability to retrieve individuals with IND, and classes in SUBC and HYBR. DISP does not retrieve anything because the axioms with disposition are embedded in General Class Inclusion (GCI) statements. 
Ambiguity of biological database content is addressed by a method that identifies implicit knowledge behind semantic annotations in biological databases and grounds it in an expressive upper-level ontology. The result is a seamless representation of database structure, content and annotations as OWL models.
Imai, Takeshi; Hayakawa, Masayo; Ohe, Kazuhiko
2013-01-01
Prediction of synergistic or antagonistic effects of drug-drug interaction (DDI) in vivo has been of considerable interest over the years. Formal representation of pharmacological knowledge such as an ontology is indispensable for machine reasoning about possible DDIs. However, current pharmacology knowledge bases are not sufficient to provide formal representation of DDI information. With this background, this paper presents: (1) a description framework of pharmacodynamics ontology; and (2) a methodology to utilize pharmacodynamics ontology to detect different types of possible DDI pairs with supporting information such as underlying pharmacodynamics mechanisms. We also evaluated our methodology on drugs related to the noradrenaline signal transduction process, and 11 different types of possible DDI pairs were detected. The main features of our methodology are the explanation capability for the reason behind possible DDIs and the distinguishability of different types of DDIs. These features will not only be useful for providing supporting information to prescribers, but also for large-scale monitoring of drug safety.
An approach for formalising the supply chain operations
NASA Astrophysics Data System (ADS)
Zdravković, Milan; Panetto, Hervé; Trajanović, Miroslav; Aubry, Alexis
2011-11-01
Reference models play an important role in the knowledge management of various complex collaboration domains (such as supply chain networks). However, they often show a lack of semantic precision and are sometimes incomplete. In this article, we present an approach to overcome semantic inconsistencies and incompleteness of the Supply Chain Operations Reference (SCOR) model and hence improve its usefulness and expand its application domain. First, we describe a literal web ontology language (OWL) specification of SCOR concepts (and related tools) built with the intention of preserving the original approach to the classification of process reference model entities, and hence enabling effective usage in the original contexts. Next, we demonstrate the system for its exploitation, specifically tools for SCOR framework browsing and rapid supply chain process configuration. Then, we describe the SCOR-Full ontology and its relations with relevant domain ontologies, and show how it can be exploited to improve the competence of the SCOR ontological framework. Finally, we elaborate on the potential impact of the presented approach on the interoperability of systems in supply chain networks.
Automated software system for checking the structure and format of ACM SIG documents
NASA Astrophysics Data System (ADS)
Mirza, Arsalan Rahman; Sah, Melike
2017-04-01
Microsoft (MS) Office Word is one of the most commonly used software tools for creating documents. MS Word 2007 and above uses XML to represent the structure of MS Word documents. Metadata about the documents are automatically created using Office Open XML (OOXML) syntax. We develop a new framework, called ADFCS (Automated Document Format Checking System), that takes advantage of the OOXML metadata in order to extract semantic information from MS Office Word documents. In particular, we develop a new ontology for Association for Computing Machinery (ACM) Special Interest Group (SIG) documents, representing the structure and format of these documents using OWL (Web Ontology Language). Then, the metadata is extracted automatically in RDF (Resource Description Framework) according to this ontology using the developed software. Finally, we generate extensive rules in order to infer whether the documents are formatted according to ACM SIG standards. This paper introduces the ACM SIG ontology, the metadata extraction process, the inference engine, the ADFCS online user interface, the system evaluation, and user study evaluations.
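The OOXML-parsing step can be sketched with the Python standard library alone; the fragment below is a minimal stand-in for `word/document.xml` (a real checker would pull that part out of the `.docx` zip with `zipfile`), and the extraction of paragraph styles is illustrative rather than the ADFCS implementation.

```python
import xml.etree.ElementTree as ET

# Minimal stand-in for word/document.xml inside a .docx (OOXML) package.
DOCUMENT_XML = """<w:document
    xmlns:w="http://schemas.openxmlformats.org/wordprocessingml/2006/main">
  <w:body>
    <w:p><w:pPr><w:pStyle w:val="Title"/></w:pPr>
         <w:r><w:t>A Sample ACM SIG Paper</w:t></w:r></w:p>
    <w:p><w:pPr><w:pStyle w:val="Heading1"/></w:pPr>
         <w:r><w:t>Introduction</w:t></w:r></w:p>
  </w:body>
</w:document>"""

NS = {"w": "http://schemas.openxmlformats.org/wordprocessingml/2006/main"}

def extract_styled_paragraphs(xml_text):
    """Return (style, text) pairs for each paragraph in the document part."""
    root = ET.fromstring(xml_text)
    out = []
    for p in root.iter("{%s}p" % NS["w"]):
        style_el = p.find("w:pPr/w:pStyle", NS)
        style = style_el.get("{%s}val" % NS["w"]) if style_el is not None else None
        text = "".join(t.text or "" for t in p.iter("{%s}t" % NS["w"]))
        out.append((style, text))
    return out

paragraphs = extract_styled_paragraphs(DOCUMENT_XML)
```

Pairs like `("Heading1", "Introduction")` are exactly the kind of facts a format checker can translate into RDF statements and test against formatting rules.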
A sensor and video based ontology for activity recognition in smart environments.
Mitchell, D; Morrow, Philip J; Nugent, Chris D
2014-01-01
Activity recognition is used in a wide range of applications including healthcare and security. In a smart environment activity recognition can be used to monitor and support the activities of a user. There have been a range of methods used in activity recognition including sensor-based approaches, vision-based approaches and ontological approaches. This paper presents a novel approach to activity recognition in a smart home environment which combines sensor and video data through an ontological framework. The ontology describes the relationships and interactions between activities, the user, objects, sensors and video data.
Experimental evaluation of ontology-based HIV/AIDS frequently asked question retrieval system.
Ayalew, Yirsaw; Moeng, Barbara; Mosweunyane, Gontlafetse
2018-05-01
This study presents the results of experimental evaluations of an ontology-based frequently asked question retrieval system in the domain of HIV and AIDS. The main purpose of the system is to provide answers to questions on HIV/AIDS using ontology. To evaluate the effectiveness of the frequently asked question retrieval system, we conducted two experiments. The first experiment focused on the evaluation of the quality of the ontology we developed using the OQuaRE evaluation framework which is based on software quality metrics and metrics designed for ontology quality evaluation. The second experiment focused on evaluating the effectiveness of the ontology in retrieving relevant answers. For this we used an open-source information retrieval platform, Terrier, with retrieval models BM25 and PL2. For the measurement of performance, we used the measures mean average precision, mean reciprocal rank, and precision at 5. The results suggest that frequently asked question retrieval with ontology is more effective than frequently asked question retrieval without ontology in the domain of HIV/AIDS.
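The reported measures follow their standard IR definitions, which a short sketch makes concrete; the ranked runs and relevance judgments below are hypothetical, not data from the study.

```python
def mean_reciprocal_rank(results, relevant):
    """MRR over queries: reciprocal rank of the first relevant hit (0 if none)."""
    total = 0.0
    for qid, ranked in results.items():
        rr = 0.0
        for rank, doc in enumerate(ranked, start=1):
            if doc in relevant[qid]:
                rr = 1.0 / rank
                break
        total += rr
    return total / len(results)

def precision_at_k(results, relevant, k=5):
    """Mean fraction of relevant documents among the top-k for each query."""
    return sum(len(set(ranked[:k]) & relevant[qid]) / k
               for qid, ranked in results.items()) / len(results)

# Hypothetical FAQ retrieval runs: ranked answer ids per question.
results = {"q1": ["a3", "a1", "a7", "a2", "a9", "a4"],
           "q2": ["a5", "a6", "a2", "a8", "a1", "a3"]}
relevant = {"q1": {"a1", "a2"}, "q2": {"a2"}}

mrr = mean_reciprocal_rank(results, relevant)      # (1/2 + 1/3) / 2
p5 = precision_at_k(results, relevant, k=5)        # (2/5 + 1/5) / 2
```

Comparing two system configurations (here, with and without the ontology) on the same judged queries with these measures is the core of the second experiment's design.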
Controlling uncertainty: a review of human behavior in complex dynamic environments.
Osman, Magda
2010-01-01
Complex dynamic control (CDC) tasks are a type of problem-solving environment used for examining many cognitive activities (e.g., attention, control, decision making, hypothesis testing, implicit learning, memory, monitoring, planning, and problem solving). Because of their popularity, there have been many findings from diverse domains of research (economics, engineering, ergonomics, human-computer interaction, management, psychology), but they remain largely disconnected from each other. The objective of this article is to review theoretical developments and empirical work on CDC tasks, and to introduce a novel framework (monitoring and control framework) as a tool for integrating theory and findings. The main thesis of the monitoring and control framework is that CDC tasks are characteristically uncertain environments, and subjective judgments of uncertainty guide the way in which monitoring and control behaviors attempt to reduce it. The article concludes by discussing new insights into continuing debates and future directions for research on CDC tasks.
PetIGA: A framework for high-performance isogeometric analysis
Dalcin, Lisandro; Collier, Nathaniel; Vignal, Philippe; ...
2016-05-25
We present PetIGA, a code framework to approximate the solution of partial differential equations using isogeometric analysis. PetIGA can be used to assemble matrices and vectors which come from a Galerkin weak form, discretized with Non-Uniform Rational B-spline basis functions. We base our framework on PETSc, a high-performance library for the scalable solution of partial differential equations, which simplifies the development of large-scale scientific codes, provides a rich environment for prototyping, and separates parallelism from algorithm choice. We describe the implementation of PetIGA, and exemplify its use by solving a model nonlinear problem. To illustrate the robustness and flexibility of PetIGA, we solve some challenging nonlinear partial differential equations that include problems in both solid and fluid mechanics. Lastly, we show strong scaling results on up to 4096 cores, which confirm the suitability of PetIGA for large-scale simulations.
NASA Astrophysics Data System (ADS)
Roth, Wolff-Michael
1995-12-01
The present study was designed to investigate problem- and solution-related activity of elementary students in ill-defined and open-ended settings. One Grade 4/5 class of 28 students engaged in the activities of the “Engineering for Children: Structures” curriculum, designed as a vehicle for introducing science concepts, providing ill-defined problem solving contexts, and fostering positive attitudes towards science and technology. Data included video recordings, ethnographic field notes, student-produced artefacts (projects and engineering logbooks), and interviews with teachers and observers. These data supported the notion of problems, solutions, and courses of action as entities with flexible ontologies. In the course of their negotiations, students demonstrated an uncanny competence to frame and reframe problems and solutions and to decide on courses of action of different complexities in spite of the ambiguous nature of (arte)facts, plans, and language. A case study approach was chosen as the literary device to report these general findings. The discussion focuses on the inevitably ambiguous nature of (arte)facts, plans, and language and the associated notion of “interpretive flexibility.” Suggestions are provided for teachers on how to deal with interpretive flexibility without seeking recourse to the didactic approaches of direct teaching. But what happens when problems and solutions are negotiable, when there are no longer isolated problems which one tries to solve but problems which maintain complex linkages with ensembles of other problems and diverse constraints, or when problems and solutions are simultaneously invented? (Lestel, 1989, p. 692, my translation)
An ontological knowledge framework for adaptive medical workflow.
Dang, Jiangbo; Hedayati, Amir; Hampel, Ken; Toklu, Candemir
2008-10-01
As emerging technologies, the semantic Web and SOA (Service-Oriented Architecture) allow a BPMS (Business Process Management System) to automate business processes that can be described as services, which in turn can be used to wrap existing enterprise applications. A BPMS provides tools and methodologies to compose Web services that can be executed as business processes and monitored by BPM (Business Process Management) consoles. Ontologies are a formal, declarative knowledge representation model. They provide a foundation upon which machine-understandable knowledge can be obtained and, as a result, make machine intelligence possible. Healthcare systems can adopt these technologies to become ubiquitous, adaptive, and intelligent, and thereby serve patients better. This paper presents an ontological knowledge framework that covers the healthcare domains a hospital encompasses, from medical and administrative tasks to hospital assets, medical insurance, patient records, drugs, and regulations. Our ontology thus makes our vision of personalized healthcare possible by capturing all the knowledge necessary for a complex personalized healthcare scenario involving patient care, insurance policies, drug prescriptions, and compliance. For example, our ontology allows a workflow management system to let users, from physicians to administrative assistants, manage, and even create, context-aware medical workflows and execute them on the fly.
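The kind of context-aware check such a workflow engine would run before executing a step can be sketched with a toy role ontology. Everything below (the `ROLE_IS_A` hierarchy, the task names, the `may_execute` helper) is invented for illustration and is far simpler than the paper's actual hospital knowledge model.

```python
# Invented sketch: an ontology-backed authorization check for a medical
# workflow step, using a small role subsumption hierarchy.

ROLE_IS_A = {"oncologist": "physician", "physician": "clinical_staff",
             "admin_assistant": "staff", "clinical_staff": "staff"}

TASK_REQUIRES = {"prescribe_drug": "physician",
                 "schedule_appointment": "staff"}

def has_role(user_role, required):
    """Walk up the role hierarchy to see if `required` subsumes `user_role`."""
    while user_role is not None:
        if user_role == required:
            return True
        user_role = ROLE_IS_A.get(user_role)
    return False

def may_execute(user_role, task):
    return has_role(user_role, TASK_REQUIRES[task])

print(may_execute("oncologist", "prescribe_drug"))         # True
print(may_execute("admin_assistant", "prescribe_drug"))    # False
print(may_execute("admin_assistant", "schedule_appointment"))  # True
```

A production system would express the hierarchy in OWL and delegate the subsumption test to a reasoner; the hand-rolled loop simply makes the inference step visible.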
NASA Astrophysics Data System (ADS)
Gembong, S.; Suwarsono, S. T.; Prabowo
2018-03-01
Schema in the current study refers to a set of action, process, object, and other schemas an individual already possesses that together build his or her ways of thinking to solve a given problem. The current study aims to investigate the schemas built by elementary school students in solving problems related to the addition of fractions. The analyses of schema building were done qualitatively on the basis of the analytical framework of the APOS theory (Action, Process, Object, and Schema). Findings show the following about the schemas built by students of high and middle ability. In the action stage, students were able to add two fractions by drawing a picture or following a procedure. In the process stage, they could add two and three fractions. In the object stage, they could explain the steps of adding two fractions and express a fraction as an addition of fractions. In the last stage, schema, they could add fractions by relating them to another schema they already possessed, i.e., the least common multiple. For students of high and middle mathematical ability, schema building in solving problems involving the addition of fractions worked in line with the framework of the APOS theory. For those of low mathematical ability, however, the schema at each stage did not work properly.
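The schema-stage link to the least common multiple can be made concrete. The hypothetical `add_fractions` helper below (not from the study) adds two fractions exactly the way the students' final schema does: rescale both over the LCM of the denominators, then reduce.

```python
# Illustration of the "schema" stage: fraction addition via the least
# common multiple of the denominators.
from math import gcd

def add_fractions(a, b, c, d):
    """Return a/b + c/d as a reduced (numerator, denominator) pair."""
    lcm = b * d // gcd(b, d)                 # least common multiple
    num = a * (lcm // b) + c * (lcm // d)    # rescale both numerators
    g = gcd(num, lcm)
    return num // g, lcm // g                # reduce to lowest terms

print(add_fractions(1, 6, 1, 4))  # 1/6 + 1/4 = 5/12
```

Python's `fractions.Fraction` does the same job in one line; spelling out the LCM step mirrors the reasoning the APOS analysis describes.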
Mathematics Framework for California Public Schools, Kindergarten Through Grade Twelve.
ERIC Educational Resources Information Center
California State Dept. of Education, Sacramento.
This report, prepared by a statewide Mathematics Advisory Committee, revises the framework in the Second Strands Report of 1972, expanding it to encompass kindergarten through grade 12. Strands for kindergarten through grade 8 are: arithmetic, numbers, and operations; geometry; measurement, problem solving/ applications; probability and…
NASA Astrophysics Data System (ADS)
Westlander, Meghan Joanne
Interactive engagement environments are critical to students' conceptual learning gains, and often the instructor is ultimately responsible for the creation of that environment in the classroom. When those instructors are graduate teaching assistants (GTAs), one of the primary ways in which they can promote interactive engagement is through their interactions with students. Much of the prior research on physics GTA-student interactions focuses on GTA training programs (e.g. Ezrailson (2004); Smith, Ward, and Rosenshein (1977)) or on GTAs' specific actions and beliefs (e.g. West, Paul, Webb, and Potter (2013); Goertzen (2010); Spike and Finkelstein (2012a)). Research on students' ideas and behaviors within and surrounding those interactions is limited but important to obtaining a more complete understanding of how GTAs promote an interactive environment. In order to begin understanding this area, I developed the Issues Framework to examine how GTA-student interactions are situated in students' processes during physics problem-solving activities. Using grounded theory, the Issues Framework emerged from an analysis of the relationships between GTA-student interactions and the students' procedures and expressions of physics content in and surrounding those interactions. This study is focused on introducing the Issues Framework and the insight it can provide into GTA-student interactions and students' processes. The framework is general in nature and has a visually friendly design, making it a useful tool for consolidating complex data and quickly pattern-matching important pieces of a complex process. Four different categories of Issues emerged spanning the problem-solving process: (1) Getting Started, (2) Solution Approach, (3) Unit Conversions, and (4) Other.
The framework allowed for identification of the specific contents of the Issues in each category as well as revealing the common stories of students' processes and how the interactions were situated in those processes in each category. Through the stories, the Issues Framework revealed processes in which students often focused narrowly on procedures, with the physics content expressed through their procedures and only sometimes through conceptual discussions. Interactions with the GTA effected changes in students' processes, typically leading students to correct their procedures. The interactions often focused narrowly on procedures as well but introduced conceptual discussions more often than students did surrounding the interactions. Comparing stories across GTAs instead of across categories revealed one GTA who, more often than other GTAs, used conceptual discussion and encouraged students' participation in the interactions. The Issues Framework still needs continued refinement and testing. However, it represents a significant step toward understanding GTA-student interactions from the perspective of students' processes in physics problem solving.
Engaging the creative to better build science into water resource solutions
NASA Astrophysics Data System (ADS)
Klos, P. Z.
2014-12-01
Psychological thought suggests that social engagement with an environmental problem requires 1) cognitive understanding of the problem, 2) emotional engagement with the problem, and 3) perceived efficacy that there is something we can do to solve the problem. Within the water sciences, we form problem-focused, cross-disciplinary teams to help address complex water resource problems, but often we only seek teammates from other disciplines within the realms of engineering and the natural/social sciences. Here I argue that this science-centric focus fails to fully solve these water resource problems, and often the science goes unheard because it is heavily cognitive and lacks the ability to effectively engage the audience through crucial social-psychological aspects of emotion and efficacy. To solve this, future cross-disciplinary collaborations that seek to include creative actors from the worlds of art, humanities, and design can begin to provide a much stronger overlap of the cognition, emotion, and efficacy needed to communicate the science, engage the audience, and create the solutions needed to solve our world's most complex water resource problems. Disciplines across the arts, sciences, and engineering all bring unique strengths that, through collaboration, allow for uniquely creative modes of art-science overlap that can engage people through additions of emotion and efficacy that complement the science and go beyond the traditional cognitive approach. I highlight examples of this art-science overlap in action and argue that water resource collaborations like these will be more likely to have their hydrologic science accepted and applied by those who decide on water resource solutions. For this Pop-up Talk session, I aim to share the details of this proposed framework in the context of my own research and the work of others.
I hope to incite discussion regarding the utility and relevance of this framework as a future option for other water resource collaboratives working to solve hydrologic issues across the globe.
Building and evaluating an ontology-based tool for reasoning about consent permission
Grando, Adela; Schwab, Richard
2013-01-01
Given the lack of mechanisms for specifying, sharing and checking the compliance of consent permissions, we focus on building and testing novel approaches to address this gap. In our previous work, we introduced a “permission ontology” to capture in a precise, machine-interpretable form informed consent permissions in research studies. Here we explain how we built and evaluated a framework for specifying subject’s permissions and checking researcher’s resource request in compliance with those permissions. The framework is proposed as an extension of an existing policy engine based on the eXtensible Access Control Markup Language (XACML), incorporating ontology-based reasoning. The framework is evaluated in the context of the UCSD Moores Cancer Center biorepository, modeling permissions from an informed consent and a HIPAA form. The resulting permission ontology and mechanisms to check subject’s permission are implementation and institution independent, and therefore offer the potential to be reusable in other biorepositories and data warehouses. PMID:24551354
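The compliance check at the heart of this framework can be sketched without XACML: a researcher's resource request is compliant when some consented permission subsumes it under the permission ontology. The class names, the `IS_A` table, and both helper functions below are invented stand-ins for the actual ontology and policy engine.

```python
# Hypothetical sketch of ontology-backed consent permission checking.

IS_A = {  # child -> parent: a toy subsumption hierarchy of resource classes
    "tumor_tissue": "tissue",
    "tissue": "biospecimen",
    "blood_sample": "biospecimen",
}

def subsumes(general, specific):
    """True if `general` equals or is an ancestor of `specific`."""
    while specific is not None:
        if specific == general:
            return True
        specific = IS_A.get(specific)
    return False

def request_permitted(permissions, requested):
    """A request is compliant if some granted permission subsumes it."""
    return any(subsumes(granted, requested) for granted in permissions)

consent = {"tissue", "demographics"}      # what the subject agreed to share
print(request_permitted(consent, "tumor_tissue"))  # covered via "tissue"
print(request_permitted(consent, "blood_sample"))  # not covered
```

In the real framework the subsumption test is delegated to an OWL reasoner and the decision is rendered by an XACML policy engine; the control flow, however, is essentially the one above.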
NASA Astrophysics Data System (ADS)
Aurah, Catherine Muhonja
Within the framework of social cognitive theory, the influence of self-efficacy beliefs and metacognitive prompting on genetics problem-solving ability among high school students in Kenya was examined through a mixed methods research design. A quasi-experimental study, supplemented by focus group interviews, was conducted to investigate both the outcomes and the processes of students' genetics problem-solving ability. Focus group interviews substantiated and supported findings from the quantitative instruments. The study was conducted in 17 high schools in Western Province, Kenya. A total of 2,138 high school students were purposively sampled. A sub-sample of 48 students participated in focus group interviews to understand their perspectives and experiences during the study so as to corroborate the quantitative data. Quantitative data were analyzed through descriptive statistics, zero-order correlations, a 2 x 2 factorial ANOVA, and sequential hierarchical multiple regressions. Qualitative data were transcribed, coded, and reported thematically. Results revealed metacognitive prompts had significant positive effects on student problem-solving ability independent of gender. Self-efficacy and metacognitive prompting significantly predicted genetics problem-solving ability. Gender differences were revealed, with girls outperforming boys on the genetics problem-solving test. Furthermore, self-efficacy moderated the relationship between metacognitive prompting and genetics problem-solving ability. This study established a foundation for instructional methods for biology teachers, and recommendations are made for implementing metacognitive prompting in a problem-based learning environment in high schools and science teacher education programs in Kenya.
Probabilistic Ontology Architecture for a Terrorist Identification Decision Support System
2014-06-01
…in real-world problems requires probabilistic ontologies, which integrate the inferential reasoning power of probabilistic representations with the first-order expressivity of ontologies. …The Reference Architecture for… Keywords: ontology, terrorism, inferential reasoning, architecture. I. INTRODUCTION. A. Background. Whether by nature or design, the personas of terrorists are…
How Can One Learn Mathematical Word Problems in a Second Language? A Cognitive Load Perspective
ERIC Educational Resources Information Center
Moussa-Inaty, Jase; Causapin, Mark; Groombridge, Timothy
2015-01-01
Language may ordinarily account for difficulties in solving word problems and this is particularly true if mathematical word problems are taught in a language other than one's native language. Research into cognitive load may offer a clear theoretical framework when investigating word problems because memory, specifically working memory, plays a…
Ong, Edison; Xiang, Zuoshuang; Zhao, Bin; Liu, Yue; Lin, Yu; Zheng, Jie; Mungall, Chris; Courtot, Mélanie; Ruttenberg, Alan; He, Yongqun
2017-01-04
Linked Data (LD) aims to achieve interconnected data by representing entities using Uniform Resource Identifiers (URIs) and sharing information using the Resource Description Framework (RDF) and HTTP. Ontologies, which logically represent entities and relations in specific domains, are the basis of LD. Ontobee (http://www.ontobee.org/) is a linked ontology data server that stores ontology information using RDF triple store technology and supports query, visualization and linkage of ontology terms. Ontobee is also the default linked data server for publishing and browsing biomedical ontologies in the Open Biological Ontology (OBO) Foundry (http://obofoundry.org) library. Ontobee currently hosts more than 180 ontologies (including 131 OBO Foundry Library ontologies) with over four million terms. Ontobee provides a user-friendly web interface for querying and visualizing the details and hierarchy of a specific ontology term. Using eXtensible Stylesheet Language Transformation (XSLT) technology, Ontobee is able to dereference a single ontology term URI and then output RDF/eXtensible Markup Language (XML) for computer processing, or display the HTML information in a web browser for human users. Statistics and detailed information are generated and displayed for each ontology listed in Ontobee. In addition, a SPARQL web interface is provided for custom advanced SPARQL queries over one or multiple ontologies. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
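The query model behind a triple store like Ontobee's can be sketched in a few lines. The in-memory store and `query` function below are illustrative only (abbreviated URIs, no real SPARQL engine); `None` plays the role of a variable in a SPARQL basic graph pattern.

```python
# Minimal sketch of an RDF-style triple store with pattern matching.

triples = [
    ("obo:GO_0008150", "rdfs:label", "biological_process"),
    ("obo:GO_0009987", "rdfs:subClassOf", "obo:GO_0008150"),
    ("obo:GO_0009987", "rdfs:label", "cellular process"),
]

def query(s=None, p=None, o=None):
    """Return all triples matching the pattern; None is a wildcard."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# All labels in the store, analogous to SELECT ?s ?label WHERE { ?s rdfs:label ?label }
for subj, _, label in query(p="rdfs:label"):
    print(subj, "->", label)
```

A production server indexes the three triple positions for speed and answers full SPARQL; the linear scan here is only meant to show what a basic graph pattern matches.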
NASA Astrophysics Data System (ADS)
Stevenson, R. Jan
Frameworks for solving environmental problems have been presented over the past 40 years from many organizations and disciplines, often with a strong focus on their own discipline. This paper describes a modification of an existing framework that can be better applied to manage environmental problems. Human well being, environmental policy, human activities, stressors (contaminants and habitat alterations), and ecosystem services are highlighted as five elements of the coupled human and natural system in the proposed framework. Thresholds in relationships among elements in coupled human and natural systems are key attributes of couplings because of their use in development of environmental criteria by facilitating stakeholder consensus and preventing catastrophic changes. Propagation of thresholds through coupled human and natural systems is hypothesized to be a significant driver of policy development. The application of the framework is related to managing eutrophication and algal bloom problems.
Timeline and the Timeline Exchange Infrastructure: a Framework for Exchanging Temporal Information
NASA Technical Reports Server (NTRS)
Donahue, Kenneth; Chung, Seung H.
2013-01-01
The concept of a timeline is used ubiquitously during space mission design and development to specify elements of flight and ground system designs. In this paper we introduce our Timeline Ontology. The Timeline Ontology is grounded in mathematical formalism, thus providing concrete semantics.
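The paper grounds its Timeline Ontology in a mathematical formalism but publishes no code; a toy interval type with a few Allen-style temporal relations illustrates the kind of concrete semantics such a formalism pins down. The class, the relation set, and the mission-phase names below are all invented.

```python
# Sketch: intervals on a timeline with unambiguous temporal relations.
from dataclasses import dataclass

@dataclass
class Interval:
    start: float
    end: float

    def before(self, other):
        """This interval ends strictly before the other begins."""
        return self.end < other.start

    def overlaps(self, other):
        """Starts first, ends inside the other (Allen's 'overlaps')."""
        return self.start < other.start < self.end < other.end

    def during(self, other):
        """Strictly contained inside the other interval."""
        return other.start < self.start and self.end < other.end

launch = Interval(0.0, 2.0)
cruise = Interval(2.5, 10.0)
checkout = Interval(3.0, 4.0)
print(launch.before(cruise))    # True
print(checkout.during(cruise))  # True
```

Defining each relation as a precise predicate, rather than prose, is exactly what "concrete semantics" buys: two tools exchanging timelines cannot disagree about what "before" means.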
Bouaud, Jacques; Guézennec, Gilles; Séroussi, Brigitte
2018-01-01
The integration of clinical information models and termino-ontological models into a unique ontological framework is highly desirable for it facilitates data integration and management using the same formal mechanisms for both data concepts and information model components. This is particularly true for knowledge-based decision support tools that aim to take advantage of all facets of semantic web technologies in merging ontological reasoning, concept classification, and rule-based inferences. We present an ontology template that combines generic data model components with (parts of) existing termino-ontological resources. The approach is developed for the guideline-based decision support module on breast cancer management within the DESIREE European project. The approach is based on the entity attribute value model and could be extended to other domains.
Organizational and Pedagogical Conditions for Training Teachers under Distance Education Framework
ERIC Educational Resources Information Center
Khuziakhmetov, Anvar N.; Suleymanova, Dilyara N.; Nasibullov, Ramis R.; Yarullin, Ilnar F.
2016-01-01
Distance education in a professional higher school is of particular importance in terms of fundamental changes in modern educational institutions. This form of training together with the expansion of information technologies can effectively solve the problem of training students and life-long learning. Distance education is able to solve the…
A bibliometric and visual analysis of global geo-ontology research
NASA Astrophysics Data System (ADS)
Li, Lin; Liu, Yu; Zhu, Haihong; Ying, Shen; Luo, Qinyao; Luo, Heng; Kuai, Xi; Xia, Hui; Shen, Hang
2017-02-01
In this paper, the results of a bibliometric and visual analysis of geo-ontology research articles collected from the Web of Science (WOS) database between 1999 and 2014 are presented. The numbers of national institutions and published papers are visualized and a global research heat map is drawn, illustrating an overview of global geo-ontology research. In addition, we present a chord diagram of countries and perform a visual cluster analysis of a knowledge co-citation network of references, disclosing potential academic communities and identifying key points, main research areas, and future research trends. The International Journal of Geographical Information Science, Progress in Human Geography, and Computers & Geosciences are the most active journals. The USA makes the largest contributions to geo-ontology research by virtue of its highest numbers of independent and collaborative papers, and its dominance was also confirmed in the country chord diagram. The majority of institutions are in the USA, Western Europe, and Eastern Asia. Wuhan University, the University of Munster, and the Chinese Academy of Sciences are notable geo-ontology institutions. Keywords such as "Semantic Web," "GIS," and "space" have attracted a great deal of attention. "Semantic granularity in ontology-driven geographic information systems," "Ontologies in support of activities in geographical space," and "A translation approach to portable ontology specifications" have the highest cited centrality. Geographical space, computer-human interaction, and ontology cognition are the three main research areas of geo-ontology. The semantic mismatch between the producers and users of ontology data, as well as error propagation in interdisciplinary and cross-linguistic data reuse, needs to be solved. In addition, the development of geo-ontology modeling primitives based on OWL (Web Ontology Language) and methods to automatically rework data in the Semantic Web are needed.
Furthermore, the topological relations between geographical entities still require further study.
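The co-citation network underlying this kind of analysis is simple to construct: two references are co-cited whenever one paper cites both, and the pairwise counts weight the network's edges. The paper names and citation sets below are invented for illustration.

```python
# Sketch: building co-citation counts, the raw input to a co-citation
# cluster analysis like the one described above.
from itertools import combinations
from collections import Counter

papers = {  # citing paper -> set of references it cites (toy data)
    "p1": {"Guarino98", "Gruber93", "Fonseca02"},
    "p2": {"Gruber93", "Fonseca02"},
    "p3": {"Gruber93", "Fonseca02"},
}

cocitation = Counter()
for refs in papers.values():
    for pair in combinations(sorted(refs), 2):   # sorted: one key per pair
        cocitation[pair] += 1

print(cocitation.most_common(1))   # the strongest co-citation link
```

Tools such as CiteSpace then cluster this weighted graph and compute centrality; the counting step above is the whole of the data model they start from.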
MachineProse: an Ontological Framework for Scientific Assertions
Dinakarpandian, Deendayal; Lee, Yugyung; Vishwanath, Kartik; Lingambhotla, Rohini
2006-01-01
Objective: The idea of testing a hypothesis is central to the practice of biomedical research. However, the results of testing a hypothesis are published mainly in the form of prose articles. Encoding the results as scientific assertions that are both human and machine readable would greatly enhance the synergistic growth and dissemination of knowledge. Design: We have developed MachineProse (MP), an ontological framework for the concise specification of scientific assertions. MP is based on the idea of an assertion constituting a fundamental unit of knowledge. This is in contrast to current approaches that use discrete concept terms from domain ontologies for annotation and assertions are only inferred heuristically. Measurements: We use illustrative examples to highlight the advantages of MP over the use of the Medical Subject Headings (MeSH) system and keywords in indexing scientific articles. Results: We show how MP makes it possible to carry out semantic annotation of publications that is machine readable and allows for precise search capabilities. In addition, when used by itself, MP serves as a knowledge repository for emerging discoveries. A prototype for proof of concept has been developed that demonstrates the feasibility and novel benefits of MP. As part of the MP framework, we have created an ontology of relationship types with about 100 terms optimized for the representation of scientific assertions. Conclusion: MachineProse is a novel semantic framework that we believe may be used to summarize research findings, annotate biomedical publications, and support sophisticated searches. PMID:16357355
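MachineProse's core idea, assertions as the unit of knowledge, can be sketched as structured records searched by slot rather than by keyword. The triple layout, relation names, and PMIDs below are illustrative inventions, not the paper's actual schema or relationship ontology.

```python
# Hedged sketch: machine-readable scientific assertions with precise search.

assertions = [
    # (agent, relationship, target, source article)
    ("TP53", "inhibits", "MDM2", "PMID:0000001"),
    ("aspirin", "reduces_risk_of", "myocardial_infarction", "PMID:0000002"),
    ("TP53", "regulates", "apoptosis", "PMID:0000003"),
]

def find(agent=None, relation=None, target=None):
    """Slot-wise search over assertions; None matches anything.
    Unlike keyword search on prose, a hit means the paper asserted the claim."""
    return [a for a in assertions
            if (agent is None or a[0] == agent)
            and (relation is None or a[1] == relation)
            and (target is None or a[2] == target)]

print([a[3] for a in find(agent="TP53")])  # articles asserting facts about TP53
```

The contrast with MeSH-style indexing is that the relation is first-class: a search for `relation="inhibits"` cannot be satisfied by a paper that merely mentions both terms.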
Panacea, a semantic-enabled drug recommendations discovery framework.
Doulaverakis, Charalampos; Nikolaidis, George; Kleontas, Athanasios; Kompatsiaris, Ioannis
2014-03-06
Personalized drug prescription can benefit from the use of intelligent information management and sharing. International standard classifications and terminologies have been developed in order to provide unique and unambiguous information representation. Such standards can be used as the basis of automated decision support systems for drug-drug and drug-disease interaction discovery. Additionally, Semantic Web technologies have been proposed in earlier works to support such systems. The paper presents Panacea, a semantic framework capable of offering drug-drug and drug-disease interaction discovery. To enable this kind of service, medical information and terminology had to be translated into ontological terms and be appropriately coupled with medical knowledge of the field. International standard classifications and terminologies provide the backbone of the common representation of medical data, while the medical knowledge of drug interactions is represented by a rule base which makes use of the aforementioned standards. Representation is based on a lightweight ontology. A layered reasoning approach is implemented: at the first layer, ontological inference is used to discover underlying knowledge, while at the second layer a two-step rule selection strategy is followed, resulting in a computationally efficient reasoning approach. Details of the system architecture are presented, along with an outline of the difficulties that had to be overcome. Panacea is evaluated both in terms of quality of recommendations against real clinical data and in terms of performance. The recommendation-quality evaluation gave useful insights regarding requirements for real-world deployment and revealed several parameters that affected the recommendation results.
Performance-wise, Panacea is compared to a previously published work by the authors, a service for drug recommendations named GalenOWL; the paper presents the differences between the two in modeling and approach, while also pinpointing the advantages of Panacea. Overall, the paper presents a framework for providing an efficient drug recommendations service in which Semantic Web technologies are coupled with traditional business rule engines.
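The two-layer reasoning style described above can be sketched with invented data: layer 1 performs a simple ontological inference (expanding each drug to its classes), and layer 2 fires only the interaction rules whose classes appear in the expanded prescription. The drug classes and rules below are toy examples, not Panacea's actual knowledge base.

```python
# Sketch of layered drug-interaction reasoning (all data invented).

DRUG_CLASSES = {  # layer 1: drug -> inferred ontology classes
    "warfarin": {"anticoagulant"},
    "ibuprofen": {"nsaid", "analgesic"},
    "aspirin": {"nsaid", "antiplatelet"},
}

RULES = [  # layer 2: (class_a, class_b, warning)
    ("anticoagulant", "nsaid", "increased bleeding risk"),
    ("nsaid", "nsaid", "duplicate NSAID therapy"),
]

def check(prescription):
    terms = set()
    for drug in prescription:                 # layer 1: class expansion
        terms |= DRUG_CLASSES.get(drug, set()) | {drug}
    warnings = []
    for a, b, msg in RULES:                   # layer 2: rule selection
        if a in terms and b in terms and (a != b or
            sum(a in DRUG_CLASSES.get(d, ()) for d in prescription) > 1):
            warnings.append(msg)
    return warnings

print(check(["warfarin", "ibuprofen"]))   # bleeding-risk rule fires
print(check(["ibuprofen", "aspirin"]))    # duplicate-class rule fires
```

The efficiency claim in the abstract comes from this split: class expansion prunes the rule base before any rule is evaluated, so only relevant rules reach the rule engine.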
Complex collaborative problem-solving processes in mission control.
Fiore, Stephen M; Wiltshire, Travis J; Oglesby, James M; O'Keefe, William S; Salas, Eduardo
2014-04-01
NASA's Mission Control Center (MCC) is responsible for control of the International Space Station (ISS), which includes responding to problems that obstruct the functioning of the ISS and that may pose a threat to the health and well-being of the flight crew. These problems are often complex, requiring individuals, teams, and multiteam systems to work collaboratively. Research is warranted to examine individual and collaborative problem-solving processes in this context. Specifically, focus is placed on how Mission Control personnel, each with their own skills and responsibilities, exchange information to gain a shared understanding of the problem. The Macrocognition in Teams Model describes the processes that individuals and teams undertake in order to solve problems and may be applicable to Mission Control teams. Semistructured interviews centering on a recent complex problem were conducted with seven MCC professionals. In order to compare collaborative problem-solving processes in the MCC with those predicted by the Macrocognition in Teams Model, a coding scheme was developed to analyze the interview transcriptions. Findings are supported with excerpts from participant transcriptions and suggest that team knowledge-building processes accounted for approximately 50% of all coded data and are essential for successful collaborative problem solving in mission control. Support for internalized and externalized team knowledge was also found (19% and 20%, respectively). The Macrocognition in Teams Model was shown to be a useful depiction of collaborative problem solving in mission control, and further research with it as a guiding framework is warranted.
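The frequency analysis behind the 50/19/20% figures is a straightforward tally of coded transcript segments. The segment labels below are invented shorthand (TKB for team knowledge building, ITK/ETK for internalized/externalized team knowledge, OTH for other); the proportions of the toy data merely echo the study's pattern.

```python
# Toy illustration: turning coded transcript segments into the category
# proportions reported in a coding-scheme analysis.
from collections import Counter

coded_segments = ["TKB", "TKB", "ITK", "ETK", "TKB",
                  "ETK", "ITK", "TKB", "TKB", "OTH"]

counts = Counter(coded_segments)
total = len(coded_segments)
for code, n in counts.most_common():
    print(f"{code}: {n / total:.0%}")
```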
The Conceptual Framework for the Development of a Mathematics Performance Assessment Instrument.
ERIC Educational Resources Information Center
Lane, Suzanne
1993-01-01
A conceptual framework is presented for the development of the Quantitative Understanding: Amplifying Student Achievement and Reasoning (QUASAR) Cognitive Assessment Instrument (QCAI), which focuses on the ability of middle-school students to problem solve, reason, and communicate mathematically. The instrument will provide programmatic rather than…
Any Ontological Model of the Single Qubit Stabilizer Formalism must be Contextual
NASA Astrophysics Data System (ADS)
Lillystone, Piers; Wallman, Joel J.
Quantum computers allow us to easily solve some problems classical computers find hard. Non-classical improvements in computational power should be due to some non-classical property of quantum theory. Contextuality, a more general notion of non-locality, is a necessary, but not sufficient, resource for quantum speed-up. Proofs of contextuality can be constructed for the classically simulable stabilizer formalism. Previous proofs of stabilizer contextuality are known for 2 or more qubits, for example the Mermin-Peres magic square. In the work presented we extend these results and prove that any ontological model of the single qubit stabilizer theory must be contextual, as defined by R. Spekkens, and give a relation between our result and the Mermin-Peres square. By demonstrating that contextuality is present in the qubit stabilizer formalism we provide further insight into the contextuality present in quantum theory. Understanding the contextuality of classical sub-theories will allow us to better identify the physical properties of quantum theory required for computational speed up. This research was supported by CIFAR, the Government of Ontario, and the Government of Canada through NSERC and Industry Canada.
Predicting the Extension of Biomedical Ontologies
Pesquita, Catia; Couto, Francisco M.
2012-01-01
Developing and extending a biomedical ontology is a very demanding task that can never be considered complete given our ever-evolving understanding of the life sciences. Extension in particular can benefit from the automation of some of its steps, thus releasing experts to focus on harder tasks. Here we present a strategy to support the automation of change capturing within ontology extension where the need for new concepts or relations is identified. Our strategy is based on predicting areas of an ontology that will undergo extension in a future version by applying supervised learning over features of previous ontology versions. We used the Gene Ontology as our test bed and obtained encouraging results with average f-measure reaching 0.79 for a subset of biological process terms. Our strategy was also able to outperform state of the art change capturing methods. In addition we have identified several issues concerning prediction of ontology evolution, and have delineated a general framework for ontology extension prediction. Our strategy can be applied to any biomedical ontology with versioning, to help focus either manual or semi-automated extension methods on areas of the ontology that need extension. PMID:23028267
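The change-capturing setup can be sketched as a supervised-learning data pipeline: features are computed from one ontology version, and the label records whether a term gained children in the next version. The two toy versions, the feature choice, and the `training_rows` helper are invented; the paper's actual features and learner differ.

```python
# Illustrative sketch: building training data for ontology extension
# prediction from two versions of a (toy) ontology.

v1 = {"A": [], "B": ["A"], "C": ["A"], "D": ["B"]}                       # term -> parents
v2 = {"A": [], "B": ["A"], "C": ["A"], "D": ["B"], "E": ["C"], "F": ["C"]}

def children(onto, term):
    return [t for t, parents in onto.items() if term in parents]

def training_rows(old, new):
    """One row per old term: features from `old`, label from `new`."""
    rows = []
    for term in old:
        features = {"n_children": len(children(old, term)),
                    "n_parents": len(old[term])}
        label = len(children(new, term)) > len(children(old, term))
        rows.append((term, features, label))
    return rows

for term, feats, extended in training_rows(v1, v2):
    print(term, feats, "extended" if extended else "stable")
```

A classifier trained on rows like these from past version pairs can then score the current version's terms, flagging areas likely to need extension.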
Developing a Problem-Based Learning Simulation: An Economics Unit on Trade
ERIC Educational Resources Information Center
Maxwell, Nan L.; Mergendoller, John R.; Bellisimo, Yolanda
2004-01-01
This article argues that the merger of simulations and problem-based learning (PBL) can enhance both active-learning strategies. Simulations benefit by using a PBL framework to promote student-directed learning and problem-solving skills to explain a simulated dilemma with multiple solutions. PBL benefits because simulations structure the…
Lame problem for a multilayer viscoelastic hollow ball with regard to inhomogeneous aging
NASA Astrophysics Data System (ADS)
Davtyan, Z. A.; Mirzoyan, S. Y.; Gasparyan, A. V.
2018-04-01
Determination of characteristics of the stress strain state of compound viscoelastic bodies is of both theoretical and practical interest. In the present paper, the Lamé problem is investigated for an uneven-aged multilayer viscoelastic hollow ball in the framework of N. Kh. Arutyunyan’s theory of creep of nonhomogeneously aging bodies [1, 2]. Solving this problem reduces to solving an inhomogeneous finite-difference equation of second order that contains operators with coordinates of time and space. The obtained formulas allow one to determine the required contact stresses and other mechanical characteristics of the problem related to uneven age of contacting balls.
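The abstract reduces the Lamé problem to an inhomogeneous second-order finite-difference equation; numerically, such an equation is a tridiagonal linear system. The Thomas-algorithm sketch below is a generic solver with invented coefficients (a discretized -x'' = 1), not the paper's actual operator equation.

```python
# Sketch: solving a second-order inhomogeneous finite-difference equation
# a_i x_{i-1} + b_i x_i + c_i x_{i+1} = d_i via the Thomas algorithm.

def solve_tridiagonal(a, b, c, d):
    """Return x for the tridiagonal system; a[0] and c[-1] are unused."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                       # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):              # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# -x'' = 1 on 3 interior points with h = 1: -x_{i-1} + 2x_i - x_{i+1} = 1
print(solve_tridiagonal([0, -1, -1], [2, 2, 2], [-1, -1, 0], [1, 1, 1]))
```

In the paper the coefficients additionally carry Volterra-type creep operators in time; the spatial solve at each step, however, has exactly this tridiagonal shape.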
Semantics-Based Interoperability Framework for the Geosciences
NASA Astrophysics Data System (ADS)
Sinha, A.; Malik, Z.; Raskin, R.; Barnes, C.; Fox, P.; McGuinness, D.; Lin, K.
2008-12-01
Interoperability between heterogeneous data, tools and services is required to transform data to knowledge. To meet geoscience-oriented societal challenges such as forcing of climate change induced by volcanic eruptions, we suggest the need to develop semantic interoperability for data, services, and processes. Because such scientific endeavors require integration of multiple databases associated with global enterprises, implicit semantic-based integration is impossible. Instead, explicit semantics are needed to facilitate interoperability and integration. Although different types of integration models are available (syntactic or semantic), we suggest that semantic interoperability is likely to be the most successful pathway. Clearly, the geoscience community would benefit from utilization of existing XML-based data models, such as GeoSciML, WaterML, etc., to rapidly advance semantic interoperability and integration. We recognize that such integration will require a "meanings-based search, reasoning and information brokering", which will be facilitated through inter-ontology relationships (ontologies defined for each discipline). We suggest that markup languages (MLs) and ontologies can be seen as "data integration facilitators", working at different abstraction levels. Therefore, we propose to use an ontology-based data registration and discovery approach to complement markup languages through semantic data enrichment. Ontologies allow the use of formal and descriptive logic statements which permit expressive query capabilities for data integration through reasoning. We have developed domain ontologies (EPONT) to capture the concepts behind data. EPONT ontologies are associated with existing ontologies such as SUMO, DOLCE and SWEET. Although significant efforts have gone into developing data (object) ontologies, we advance the idea of developing semantic frameworks for additional ontologies that deal with processes and services.
This evolutionary step will facilitate the integrative capabilities of scientists as we examine the relationships between data and external factors such as processes that may influence our understanding of "why" certain events happen. We emphasize the need to go from analysis of data to concepts related to scientific principles of thermodynamics, kinetics, heat flow, mass transfer, etc. Towards meeting these objectives, we report on a pair of related service engines: DIA (Discovery, integration and analysis), and SEDRE (Semantically-Enabled Data Registration Engine) that utilize ontologies for semantic interoperability and integration.
Ontology of Space Physics for e-Science Applications Based on ISO 19156
NASA Astrophysics Data System (ADS)
Galkin, I. A.; Fung, S. F.; Benson, R. F.; Heynderickx, D.; Charisi, A.; Lowe, D.; Ventouras, S.; Ritschel, B.; Hapgood, M. A.; Belehaki, A.; Roberts, D. A.; King, T. A.; Narock, T.
2014-12-01
A structural, ontological presentation of the discipline domain concepts and their relationships is a powerful e-science tool: it enables data search and discovery by the content of the observations. Even a simple classification of the concepts using parent-child hierarchies enables analyses by association, thus bringing greater insight into the data. Ontology specifications have been put to many uses in space physics, primarily to harmonize data analysis across multiple data resources and thus facilitate interoperability. Among the multitude of ontology write-ups, the SPASE data model stands out as a prominent, highly detailed collection of keywords for heliophysics. We will present an ontology design that draws its strengths from SPASE and further enhances it with a greater structural organization of the keyword vocabularies, in particular those related to wave phenomena, and describes a variety of events and activities in the Sun-Earth system beyond the quiet-time behaviour. The new ontology is being developed for the Near Earth Space Data Infrastructure for e-Science (ESPAS) project funded by the 7th European Framework Programme, whose data model is based on a suite of ISO 19156 standards for Observations and Measurements (O&M). The O&M structure and language have driven the ESPAS ontology organization, with the Observed Property vocabulary as its cornerstone. The ontology development has progressed beyond the O&M framework to include domain-specific components required to describe space physics concepts in a dictionary-controlled, unambiguous manner. Not surprisingly, wave phenomena and events presented the greatest challenge to the ESPAS ontology team, as they demanded characterization of the processes involved in wave generation, propagation, modification, and reception, as well as of the propagation medium itself. 
One of the notable outcomes of this effort is the ability of the new ontology schema to accommodate and categorize, for example, the URSI standard ionospheric characteristics such as the O-wave critical frequency of the F2 layer, foF2. Efforts are underway to interface the new ontology definitions with similar designs in other e-Science projects.
Data Reduction Algorithm Using Nonnegative Matrix Factorization with Nonlinear Constraints
NASA Astrophysics Data System (ADS)
Sembiring, Pasukat
2017-12-01
Processing of data with very large dimensions has been a hot topic in recent decades. Various techniques have been proposed to extract the desired information or structure. Non-Negative Matrix Factorization (NMF), based on non-negative data, has become one of the popular methods for dimension reduction. The main strength of the method is its non-negativity: an object is modeled as a combination of basic non-negative parts, which provides a physical interpretation of the object's construction. NMF is a dimension-reduction method that has been used widely in numerous applications, including computer vision, text mining, pattern recognition, and bioinformatics. The mathematical formulation of NMF is not a convex optimization problem, and various types of algorithms have been proposed to solve it. The Alternating Nonnegative Least Squares (ANLS) framework is a block-coordinate formulation approach that has been proven reliable theoretically and efficient empirically. This paper proposes a new algorithm for the NMF problem based on the ANLS framework. The algorithm inherits the convergence property of the ANLS framework for NMF formulations with nonlinear constraints.
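The ANLS-based algorithm with nonlinear constraints proposed in the paper is not reproduced here. As a minimal baseline sketch of what NMF computes, the following uses the classic Lee-Seung multiplicative updates for the Frobenius objective (a different solver than the paper's; the rank and iteration counts are illustrative):

```python
import numpy as np

def nmf(V, rank, iters=200, eps=1e-9, seed=0):
    """Factor a nonnegative matrix V (m x n) as W @ H with W, H >= 0,
    using Lee-Seung multiplicative updates for the Frobenius objective."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(iters):
        # Each factor is multiplied by a nonnegative ratio, so W and H
        # remain elementwise nonnegative by construction.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy usage: compress a 6x5 nonnegative matrix to rank 2.
V = np.random.default_rng(1).random((6, 5))
W, H = nmf(V, rank=2)
err = np.linalg.norm(V - W @ H)
```

Because every update multiplies by a nonnegative ratio, the factors stay nonnegative throughout, which is the "parts-based" interpretability the abstract highlights.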
Software For Genetic Algorithms
NASA Technical Reports Server (NTRS)
Wang, Lui; Bayer, Steve E.
1992-01-01
SPLICER computer program is genetic-algorithm software tool used to solve search and optimization problems. Provides underlying framework and structure for building genetic-algorithm application program. Written in Think C.
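SPLICER itself (written in Think C) is not shown here; the sketch below is a generic minimal genetic algorithm in Python illustrating the components such a framework supplies: a population, tournament selection, one-point crossover, and bit-flip mutation. All parameter values are illustrative.

```python
import random

def genetic_algorithm(fitness, n_bits, pop_size=40, generations=60,
                      crossover_rate=0.9, mutation_rate=0.02, seed=42):
    """Minimal generational GA: tournament selection, one-point
    crossover, bit-flip mutation; returns the best bitstring found."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        def tournament():
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = tournament(), tournament()
            if rng.random() < crossover_rate:
                cut = rng.randrange(1, n_bits)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            else:
                c1, c2 = p1[:], p2[:]
            for c in (c1, c2):
                for i in range(n_bits):
                    if rng.random() < mutation_rate:
                        c[i] ^= 1            # flip the bit
                children.append(c)
        pop = children[:pop_size]
        gen_best = max(pop, key=fitness)
        if fitness(gen_best) > fitness(best):
            best = gen_best[:]
    return best

# OneMax benchmark: maximize the number of 1-bits in the string.
best = genetic_algorithm(sum, n_bits=20)
```

A framework like SPLICER factors the problem-independent machinery (selection, crossover, mutation, generational loop) away from the user-supplied fitness function, here just `sum`.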
Burton, Brett M; Tate, Jess D; Erem, Burak; Swenson, Darrell J; Wang, Dafang F; Steffen, Michael; Brooks, Dana H; van Dam, Peter M; Macleod, Rob S
2012-01-01
Computational modeling in electrocardiography often requires the examination of cardiac forward and inverse problems in order to non-invasively analyze physiological events that are otherwise inaccessible or unethical to explore. The study of these models can be performed in the open-source SCIRun problem solving environment developed at the Center for Integrative Biomedical Computing (CIBC). A new toolkit within SCIRun provides researchers with essential frameworks for constructing and manipulating electrocardiographic forward and inverse models in a highly efficient and interactive way. The toolkit contains sample networks, tutorials and documentation which direct users through SCIRun-specific approaches in the assembly and execution of these specific problems. PMID:22254301
Educational Gymnastics--Stages of Content Development.
ERIC Educational Resources Information Center
Nilges, Lynda M.
1997-01-01
Educational gymnastics uses a problem-solving approach to accommodate multiple correct solutions to open-ended movement problems in gymnastics. A four-stage framework is outlined to help teachers systematically increase and decrease task difficulty in educational gymnastics. Answers to common questions about educational gymnastics are provided.…
Riemann tensor of motion vision revisited.
Brill, M
2001-07-02
This note shows that the Riemann-space interpretation of motion vision developed by Barth and Watson is neither necessary for their results, nor sufficient to handle an intrinsic coordinate problem. Recasting the Barth-Watson framework as a classical velocity-solver (as in computer vision) solves these problems.
NASA Astrophysics Data System (ADS)
Ma, Lin; Wang, Kexin; Xu, Zuhua; Shao, Zhijiang; Song, Zhengyu; Biegler, Lorenz T.
2018-05-01
This study presents a trajectory optimization framework for a lunar rover performing vertical takeoff vertical landing (VTVL) maneuvers in the presence of terrain using variable-thrust propulsion. First, a VTVL trajectory optimization problem with a three-dimensional kinematics and dynamics model, boundary conditions, and path constraints is formulated. Then, a finite-element approach transcribes the formulated trajectory optimization problem into a nonlinear programming (NLP) problem solved by a highly efficient NLP solver. A homotopy-based backtracking strategy is applied to enhance the convergence in solving the formulated VTVL trajectory optimization problem. The optimal thrust solution typically has a "bang-bang" profile, considering that bounds are imposed on the magnitude of engine thrust. An adaptive mesh refinement strategy based on a constant Hamiltonian profile is designed to address the difficulty in locating the breakpoints in the thrust profile. Four scenarios are simulated. Simulation results indicate that the proposed trajectory optimization framework has sufficient adaptability to handle VTVL missions efficiently.
Semantic SenseLab: implementing the vision of the Semantic Web in neuroscience
Samwald, Matthias; Chen, Huajun; Ruttenberg, Alan; Lim, Ernest; Marenco, Luis; Miller, Perry; Shepherd, Gordon; Cheung, Kei-Hoi
2011-01-01
Objective: Integrative neuroscience research needs a scalable informatics framework that enables semantic integration of diverse types of neuroscience data. This paper describes the use of the Web Ontology Language (OWL) and other Semantic Web technologies for the representation and integration of molecular-level data provided by several databases of the SenseLab suite of neuroscience databases. Methods: Based on the original database structure, we semi-automatically translated the databases into OWL ontologies with manual addition of semantic enrichment. The SenseLab ontologies are extensively linked to other biomedical Semantic Web resources, including the Subcellular Anatomy Ontology, the Brain Architecture Management System, the Gene Ontology, BIRNLex and UniProt. The SenseLab ontologies have also been mapped to the Basic Formal Ontology and the Relation Ontology, which helps ease interoperability with many other existing and future biomedical ontologies for the Semantic Web. In addition, approaches to representing contradictory research statements are described. The SenseLab ontologies are designed for use on the Semantic Web, which enables their integration into a growing collection of biomedical information resources. Conclusion: We demonstrate that our approach can yield significant potential benefits and that the Semantic Web is rapidly becoming mature enough to realize its anticipated promises. The ontologies are available online at http://neuroweb.med.yale.edu/senselab/ PMID:20006477
Semantic SenseLab: Implementing the vision of the Semantic Web in neuroscience.
Samwald, Matthias; Chen, Huajun; Ruttenberg, Alan; Lim, Ernest; Marenco, Luis; Miller, Perry; Shepherd, Gordon; Cheung, Kei-Hoi
2010-01-01
Integrative neuroscience research needs a scalable informatics framework that enables semantic integration of diverse types of neuroscience data. This paper describes the use of the Web Ontology Language (OWL) and other Semantic Web technologies for the representation and integration of molecular-level data provided by several databases of the SenseLab suite of neuroscience databases. Based on the original database structure, we semi-automatically translated the databases into OWL ontologies with manual addition of semantic enrichment. The SenseLab ontologies are extensively linked to other biomedical Semantic Web resources, including the Subcellular Anatomy Ontology, the Brain Architecture Management System, the Gene Ontology, BIRNLex and UniProt. The SenseLab ontologies have also been mapped to the Basic Formal Ontology and the Relation Ontology, which helps ease interoperability with many other existing and future biomedical ontologies for the Semantic Web. In addition, approaches to representing contradictory research statements are described. The SenseLab ontologies are designed for use on the Semantic Web, which enables their integration into a growing collection of biomedical information resources. We demonstrate that our approach can yield significant potential benefits and that the Semantic Web is rapidly becoming mature enough to realize its anticipated promises. The ontologies are available online at http://neuroweb.med.yale.edu/senselab/.
Myneni, Sahiti; Amith, Muhammad; Geng, Yimin; Tao, Cui
2015-01-01
Adolescent and Young Adult (AYA) cancer survivors manage an array of health-related issues. Survivorship Care Plans (SCPs) have the potential to empower these young survivors by providing information regarding treatment summary, late-effects of cancer therapies, healthy lifestyle guidance, coping with work-life-health balance, and follow-up care. However, current mHealth infrastructure used to deliver SCPs has been limited in terms of flexibility, engagement, and reusability. The objective of this study is to develop an ontology-driven survivor engagement framework to facilitate rapid development of mobile apps that are targeted, extensible, and engaging. The major components include ontology models, patient engagement features, and behavioral intervention technologies. We apply the proposed framework to characterize individual building blocks ("survivor digilegos"), which form the basis for mHealth tools that address user needs across the cancer care continuum. Results indicate that the framework (a) allows identification of AYA survivorship components, (b) facilitates infusion of engagement elements, and (c) integrates behavior change constructs into the design architecture of survivorship applications. Implications for design of patient-engaging chronic disease management solutions are discussed.
NASA Astrophysics Data System (ADS)
Guo, Sangang
2017-09-01
There are two stages in solving security-constrained unit commitment (SCUC) problems within the Lagrangian framework: one is to obtain feasible unit states (UC), the other is the power economic dispatch (ED) for each unit. For fixed feasible unit states, an accurate solution of the ED is the more important factor in enhancing the efficiency of the solution to SCUC. Two novel methods, the Convex Combinatorial Coefficient Method and the Power Increment Method, both based on linear programming and on a piecewise linear approximation of the nonlinear convex fuel cost functions, are proposed for solving the ED. Numerical testing results show that the methods are effective and efficient.
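Neither of the paper's two LP methods is specified in detail here. The sketch below illustrates the shared ingredient, a piecewise linear approximation of convex fuel costs, combined with a greedy merit-order loading loop that is one plausible reading of a "power increment" scheme; the unit data and segment count are invented for illustration.

```python
def piecewise_segments(cost, p_min, p_max, n_seg):
    """Split [p_min, p_max] into n_seg linear segments of a convex cost
    curve; each segment carries its width and incremental cost (slope)."""
    segs = []
    step = (p_max - p_min) / n_seg
    for k in range(n_seg):
        lo, hi = p_min + k * step, p_min + (k + 1) * step
        segs.append({"width": hi - lo,
                     "slope": (cost(hi) - cost(lo)) / (hi - lo)})
    return segs

def economic_dispatch(units, demand, n_seg=10):
    """Greedy merit-order loading: for convex costs, filling the cheapest
    remaining segments first is optimal for the linearized problem."""
    pool = []
    output = {}
    for name, (cost, p_min, p_max) in units.items():
        output[name] = p_min          # every committed unit runs at least p_min
        demand -= p_min
        for seg in piecewise_segments(cost, p_min, p_max, n_seg):
            pool.append((seg["slope"], name, seg["width"]))
    for slope, name, width in sorted(pool):   # cheapest increments first
        if demand <= 0:
            break
        take = min(width, demand)
        output[name] += take
        demand -= take
    return output

# Two hypothetical quadratic-cost units dispatched to meet 300 MW.
units = {
    "G1": (lambda p: 0.004 * p * p + 8 * p, 50, 200),
    "G2": (lambda p: 0.006 * p * p + 7 * p, 50, 200),
}
dispatch = economic_dispatch(units, 300.0)
```

With convex costs, the greedy segment loading reproduces the equal-incremental-cost condition of classical ED up to the segment resolution.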
Surrogate assisted multidisciplinary design optimization for an all-electric GEO satellite
NASA Astrophysics Data System (ADS)
Shi, Renhe; Liu, Li; Long, Teng; Liu, Jian; Yuan, Bin
2017-09-01
State-of-the-art all-electric geostationary earth orbit (GEO) satellites use electric thrusters to execute all propulsive duties, which significantly differ from the traditional all-chemical ones in orbit-raising, station-keeping, radiation damage protection, and power budget. The design optimization of an all-electric GEO satellite is therefore a complex multidisciplinary design optimization (MDO) problem involving unique design considerations. However, solving the all-electric GEO satellite MDO problem poses significant challenges in disciplinary modeling techniques and efficient optimization strategy. To address these challenges, we present a surrogate-assisted MDO framework consisting of several modules, i.e., MDO problem definition, multidisciplinary modeling, multidisciplinary analysis (MDA), and a surrogate-assisted optimizer. Based on the proposed framework, the all-electric GEO satellite MDO problem is formulated to minimize the total mass of the satellite system under a number of practical constraints. Considerable effort is then devoted to multidisciplinary modeling involving the geosynchronous transfer, GEO station-keeping, power, thermal control, attitude control, and structure disciplines. Since the orbit dynamics models and the finite element structural model are computationally expensive, an adaptive response surface surrogate based optimizer is incorporated in the proposed framework to solve the satellite MDO problem with moderate computational cost, where a response surface surrogate is gradually refined to represent the computationally expensive MDA process. After optimization, the total mass of the studied GEO satellite is decreased by 185.3 kg (i.e., 7.3% of the total mass). Finally, the optimal design is further discussed to demonstrate the effectiveness of the proposed framework in coping with all-electric GEO satellite system design optimization problems. 
This proposed surrogate assisted MDO framework can also provide valuable references for other all-electric spacecraft system design.
Beckwith, Sue; Dickinson, Angela; Kendall, Sally
2008-12-01
This paper draws on the work of Paley and of Duncan et al. in order to extend and engender debate regarding the use of Concept Analysis frameworks. Despite the apparent plethora of Concept Analysis frameworks used in nursing studies, we found that over half of those used were derived from the work of one author. This paper explores the suitability and use of these frameworks at a time when the number of published concept analysis papers is increasing. For the purpose of this study, thirteen commonly used frameworks, identified from the nursing journals 1993 to 2005, were explored to reveal their origins, their ontological and philosophical stance, and any common elements. The frameworks were critiqued and links made between their antecedents. It was noted whether the articles contained discussion of any possible tensions between the ontological perspective of the framework used, the process of analysis, praxis and possible developments in nursing theory. It was found that the thirteen identified frameworks are mainly based on hermeneutic propositions regarding understandings and are interpretive procedures founded on self-reflective modes of discovery. Six frameworks rely on or include the use of casuistry. Seven of the frameworks identified are predicated on, or adapt, the work of Wilson, a schoolmaster writing for his pupils. Wilson's framework has a simplistic eleven-step, binary and reductionist structure. Other frameworks identified include Morse et al.'s framework, which this article suggests employs a contestable theory of concept maturity. Based on the findings revealed through our exploration of the use of concept analysis frameworks in the nursing literature, concerns were raised regarding unjustified adaptations and alterations and the uncritical use of the frameworks. There is little evidence that these frameworks provide the necessary depth, rigor or replicability to enable the developments in nursing theory which they underpin.
From Patient Discharge Summaries to an Ontology for Psychiatry.
Richard, Marion; Aimé, Xavier; Jaulent, Marie-Christine; Krebs, Marie-Odile; Charlet, Jean
2017-01-01
Psychiatry aims at detecting symptoms, providing diagnoses and treating mental disorders. We developed ONTOPSYCHIA, an ontology for psychiatry in three modules: social and environmental factors of mental disorders, mental disorders, and treatments. The use of ONTOPSYCHIA, associated with dedicated tools, will facilitate semantic research in Patient Discharge Summaries (PDS). To develop the first module of the ontology we propose a PDS text analysis in order to make psychiatry concepts explicit. We decided to set aside classifications during the construction of the module, to focus only on the information contained in PDS (bottom-up approach) and to return to domain classifications solely for the enrichment phase (top-down approach). We then focused our work on the development of the LOVMI methodology (Les Ontologies Validées par Méthode Interactive - Ontologies Validated by Interactive Method), which aims to provide a methodological framework to validate the structure and the semantics of an ontology.
Przydzial, Magdalena J; Bhhatarai, Barun; Koleti, Amar; Vempati, Uma; Schürer, Stephan C
2013-12-15
Novel tools need to be developed to help scientists analyze the large amounts of available screening data with the goal of identifying entry points for the development of novel chemical probes and drugs. As the largest class of drug targets, G protein-coupled receptors (GPCRs) remain of particular interest and are pursued by numerous academic and industrial research projects. We report the first GPCR ontology to facilitate integration and aggregation of GPCR-targeting drugs and demonstrate its application to classify and analyze a large subset of the PubChem database. The GPCR ontology, based on the previously reported BioAssay Ontology, depicts available pharmacological, biochemical and physiological profiles of GPCRs and their ligands. The novelty of the GPCR ontology lies in the use of diverse experimental datasets linked by a model to formally define these concepts. Using a reasoning system, the GPCR ontology offers potential for knowledge-based classification of individuals (such as small molecules) as a function of the data. The GPCR ontology is available at http://www.bioassayontology.org/bao_gpcr and the National Center for Biomedical Ontologies Web site.
Formal ontologies in biomedical knowledge representation.
Schulz, S; Jansen, L
2013-01-01
Medical decision support and other intelligent applications in the life sciences depend on increasing amounts of digital information. Knowledge bases as well as formal ontologies are being used to organize biomedical knowledge and data. However, these two kinds of artefacts are not always clearly distinguished. Whereas the popular RDF(S) standard provides an intuitive triple-based representation, it is semantically weak. Description logics based ontology languages like OWL-DL carry a clear-cut semantics, but they are computationally expensive, and they are often misinterpreted to encode all kinds of statements, including those which are not ontological. We distinguish four kinds of statements needed to comprehensively represent domain knowledge: universal statements, terminological statements, statements about particulars and contingent statements. We argue that the task of formal ontologies is solely to represent universal statements, while the non-ontological kinds of statements can nevertheless be connected with ontological representations. To illustrate these four types of representations, we use a running example from parasitology. We finally formulate recommendations for semantically adequate ontologies that can efficiently be used as a stable framework for more context-dependent biomedical knowledge representation and reasoning applications like clinical decision support systems.
Dynamic optimization of chemical processes using ant colony framework.
Rajesh, J; Gupta, K; Kusumakar, H S; Jayaraman, V K; Kulkarni, B D
2001-11-01
The ant colony framework is illustrated by considering the dynamic optimization of six important benchmark examples. This new computational tool is simple to implement and can tackle problems with state as well as terminal constraints in a straightforward fashion. It requires fewer grid points to reach the global optimum at relatively low computational effort. The examples analyzed here, with varying degrees of complexity, illustrate its potential for solving a large class of process optimization problems in chemical engineering.
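The six benchmark problems are not reproduced here; the sketch below shows the skeleton of an ant-colony search over a discretized control profile, applied to a toy reference-tracking objective. The pheromone-update rule and all parameters are illustrative assumptions, not the authors' exact scheme.

```python
import math, random

def aco_control(objective, n_stages, levels, ants=20, iters=60,
                rho=0.1, q=1.0, seed=0):
    """Ant-colony search over a discretized control profile: tau[k][j] is
    the pheromone on choosing level j at grid point k; evaporation rate
    rho, deposit strength q."""
    rng = random.Random(seed)
    tau = [[1.0] * len(levels) for _ in range(n_stages)]
    best_u, best_j = None, float("inf")
    for _ in range(iters):
        for _ in range(ants):
            u = []
            for k in range(n_stages):
                # Sample a level with probability proportional to pheromone.
                total = sum(tau[k])
                r, acc = rng.random() * total, 0.0
                for j, t in enumerate(tau[k]):
                    acc += t
                    if r <= acc:
                        break
                u.append(levels[j])
            val = objective(u)
            if val < best_j:
                best_u, best_j = u[:], val
        # Evaporate, then reinforce the best profile found so far.
        for k in range(n_stages):
            tau[k] = [(1 - rho) * t for t in tau[k]]
            tau[k][levels.index(best_u[k])] += q / (1.0 + best_j)
    return best_u, best_j

# Toy "dynamic optimization": track a reference trajectory over 5 control stages.
ref = [math.sin(k) for k in range(5)]
levels = [-1.0, -0.5, 0.0, 0.5, 1.0]
u, j = aco_control(lambda u: sum((a - b) ** 2 for a, b in zip(u, ref)),
                   5, levels)
```

The grid-point discretization mirrors the abstract's observation that the method works on relatively few grid points; finer `levels` trade accuracy for search effort.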
CNTRO: A Semantic Web Ontology for Temporal Relation Inferencing in Clinical Narratives.
Tao, Cui; Wei, Wei-Qi; Solbrig, Harold R; Savova, Guergana; Chute, Christopher G
2010-11-13
Using Semantic-Web specifications to represent temporal information in clinical narratives is an important step for temporal reasoning and answering time-oriented queries. Existing temporal models are either not compatible with the powerful reasoning tools developed for the Semantic Web, or are designed only for structured clinical data and therefore are not ready to be applied directly to natural-language-based clinical narrative reports. We have developed a Semantic-Web ontology called the Clinical Narrative Temporal Relation Ontology (CNTRO). Using this ontology, temporal information in clinical narratives can be represented as RDF (Resource Description Framework) triples. Additional temporal information and relations can then be inferred by Semantic-Web based reasoning tools. Experimental results show that this ontology can successfully represent temporal information in real clinical narratives.
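As a toy illustration of the idea (not CNTRO's actual vocabulary; the ex:/time: names are invented), clinical events can be held as RDF-style (subject, predicate, object) triples and new relations inferred by forward-chaining the transitivity of a "before" relation:

```python
from itertools import product

# Hypothetical triples extracted from a discharge narrative.
triples = {
    ("ex:admission",   "time:before", "ex:chestXray"),
    ("ex:chestXray",   "time:before", "ex:antibiotics"),
    ("ex:antibiotics", "time:before", "ex:discharge"),
}

def infer_before(triples):
    """Forward-chain the transitivity rule:
    (a before b) and (b before c)  =>  (a before c)."""
    closed = set(triples)
    changed = True
    while changed:
        changed = False
        for (a, p1, b), (b2, p2, c) in product(list(closed), repeat=2):
            if p1 == p2 == "time:before" and b == b2:
                new = (a, "time:before", c)
                if new not in closed:
                    closed.add(new)
                    changed = True
    return closed

inferred = infer_before(triples)
```

A production system would hand the same rule to an OWL/RDF reasoner rather than a hand-rolled loop, but the closure computed is the same: three asserted relations yield three additional inferred ones, including admission-before-discharge.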
NASA Technical Reports Server (NTRS)
Macready, William; Wolpert, David
2005-01-01
We demonstrate a new framework for analyzing and controlling distributed systems by solving constrained optimization problems with an algorithm based on that framework. The framework is an information-theoretic extension of conventional full-rationality game theory that allows bounded rational agents. The associated optimization algorithm is a game in which agents control the variables of the optimization problem. They do this by jointly minimizing a Lagrangian of (the probability distribution of) their joint state. The updating of the Lagrange parameters in that Lagrangian is a form of automated annealing, one that focuses the multi-agent system on the optimal pure strategy. We present computer experiments for the k-SAT constraint satisfaction problem and for unconstrained minimization of NK functions.
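The paper's probability-collectives machinery is not reproduced here; the sketch below captures its flavour under simplifying assumptions: each variable keeps an independent marginal probability, expected costs conditioned on each variable setting are estimated by sampling, and the marginals are nudged toward the lower-cost setting. It is applied to a toy satisfiable k-SAT instance; the update rule and parameters are illustrative, not the authors'.

```python
import random

def pc_ksat(clauses, n_vars, samples=50, iters=40, step=0.2, seed=0):
    """Probability-collectives-flavoured sketch: p[i] is the marginal
    probability that variable i is True. Clauses are lists of signed
    ints (3 means x3 must be True, -3 means x3 must be False)."""
    rng = random.Random(seed)

    def cost(x):  # number of unsatisfied clauses
        return sum(1 for cl in clauses
                   if not any((x[abs(l) - 1] if l > 0 else not x[abs(l) - 1])
                              for l in cl))

    p = [0.5] * n_vars
    best, best_cost = None, float("inf")
    for _ in range(iters):
        # cond[i][v]: running (cost sum, count) when variable i drew value v.
        cond = [[[0.0, 0], [0.0, 0]] for _ in range(n_vars)]
        for _ in range(samples):
            x = [rng.random() < p[i] for i in range(n_vars)]
            c = cost(x)
            if c < best_cost:
                best, best_cost = x[:], c
            for i in range(n_vars):
                s = cond[i][int(x[i])]
                s[0] += c
                s[1] += 1
        for i in range(n_vars):
            (s0, n0), (s1, n1) = cond[i]
            if n0 and n1:
                # Nudge p[i] toward the value with lower average cost.
                target = 1.0 if s1 / n1 < s0 / n0 else 0.0
                p[i] = min(0.95, max(0.05, p[i] + step * (target - p[i])))
    return best, best_cost

# Small satisfiable instance: setting x1 = True satisfies every clause.
clauses = [[1, -2], [1, 3], [1, -4], [1, 2]]
assignment, unsatisfied = pc_ksat(clauses, n_vars=4)
```

The real algorithm minimizes a Lagrangian over the joint distribution with annealed Lagrange parameters; this sketch keeps only the core idea of agents adjusting per-variable probabilities from sampled cost estimates.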
Donnarumma, Francesco; Maisto, Domenico; Pezzulo, Giovanni
2016-01-01
How do humans and other animals face novel problems for which predefined solutions are not available? Human problem solving links to flexible reasoning and inference rather than to slow trial-and-error learning. It has received considerable attention since the early days of cognitive science, giving rise to well known cognitive architectures such as SOAR and ACT-R, but its computational and brain mechanisms remain incompletely known. Furthermore, it is still unclear whether problem solving is a “specialized” domain or module of cognition, in the sense that it requires computations that are fundamentally different from those supporting perception and action systems. Here we advance a novel view of human problem solving as probabilistic inference with subgoaling. In this perspective, key insights from cognitive architectures are retained, such as the importance of using subgoals to split problems into subproblems. However, here the underlying computations use probabilistic inference methods analogous to those that are increasingly popular in the study of perception and action systems. To test our model we focus on the widely used Tower of Hanoi (ToH) task, and show that our proposed method can reproduce characteristic idiosyncrasies of human problem solvers: their sensitivity to the “community structure” of the ToH and their difficulties in executing so-called “counterintuitive” movements. Our analysis reveals that subgoals have two key roles in probabilistic inference and problem solving. First, prior beliefs on (likely) useful subgoals carve the problem space and define an implicit metric for the problem at hand, a metric to which humans are sensitive. Second, subgoals are used as waypoints in the probabilistic problem solving inference and permit finding effective solutions that, when unavailable, lead to problem solving deficits. 
Our study thus suggests that a probabilistic inference scheme enhanced with subgoals provides a comprehensive framework to study problem solving and its deficits. PMID:27074140
Donnarumma, Francesco; Maisto, Domenico; Pezzulo, Giovanni
2016-04-01
How do humans and other animals face novel problems for which predefined solutions are not available? Human problem solving links to flexible reasoning and inference rather than to slow trial-and-error learning. It has received considerable attention since the early days of cognitive science, giving rise to well known cognitive architectures such as SOAR and ACT-R, but its computational and brain mechanisms remain incompletely known. Furthermore, it is still unclear whether problem solving is a "specialized" domain or module of cognition, in the sense that it requires computations that are fundamentally different from those supporting perception and action systems. Here we advance a novel view of human problem solving as probabilistic inference with subgoaling. In this perspective, key insights from cognitive architectures are retained, such as the importance of using subgoals to split problems into subproblems. However, here the underlying computations use probabilistic inference methods analogous to those that are increasingly popular in the study of perception and action systems. To test our model we focus on the widely used Tower of Hanoi (ToH) task, and show that our proposed method can reproduce characteristic idiosyncrasies of human problem solvers: their sensitivity to the "community structure" of the ToH and their difficulties in executing so-called "counterintuitive" movements. Our analysis reveals that subgoals have two key roles in probabilistic inference and problem solving. First, prior beliefs on (likely) useful subgoals carve the problem space and define an implicit metric for the problem at hand, a metric to which humans are sensitive. Second, subgoals are used as waypoints in the probabilistic problem solving inference and permit finding effective solutions that, when unavailable, lead to problem solving deficits. 
Our study thus suggests that a probabilistic inference scheme enhanced with subgoals provides a comprehensive framework to study problem solving and its deficits.
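The probabilistic-inference model itself is not reproduced here, but the role of subgoals in the Tower of Hanoi is easy to illustrate with the classic recursive decomposition: solving for n disks splits into two (n-1)-disk subgoals around a single move of the largest disk.

```python
def hanoi(n, source, spare, target, moves=None):
    """Solve Tower of Hanoi by subgoaling: to move n disks source->target,
    first achieve the subgoal 'n-1 disks on the spare peg', move the
    largest disk, then achieve 'n-1 disks spare->target'."""
    if moves is None:
        moves = []
    if n == 0:
        return moves
    hanoi(n - 1, source, target, spare, moves)   # subgoal 1
    moves.append((source, target))               # move the largest disk
    hanoi(n - 1, spare, source, target, moves)   # subgoal 2
    return moves

moves = hanoi(4, "A", "B", "C")   # optimal plan: 2**4 - 1 = 15 moves
```

Each recursive call is a subgoal in the paper's sense: a waypoint state that carves the exponentially large move space into two tractable subproblems.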
Teachers as Thinking Coaches: Creating Strategic Learners and Problem Solvers.
ERIC Educational Resources Information Center
Gaskins, Irene W.
1989-01-01
An across-the-curriculum program was developed to teach learning, thinking, and problem-solving skills to bright middle-school underachievers. This article describes the pilot program's theoretical basis, axioms of program development, guidelines for teaching metacognitive strategies, and a framework for strategy implementation. (Author/JDD)
Fundamental awareness: A framework for integrating science, philosophy and metaphysics
Theise, Neil D.; Kafatos, Menas C.
2016-01-01
The ontologic framework of Fundamental Awareness proposed here assumes that non-dual Awareness is foundational to the universe, not arising from the interactions or structures of higher level phenomena. The framework allows comparison and integration of views from the three investigative domains concerned with understanding the nature of consciousness: science, philosophy, and metaphysics. In this framework, Awareness is the underlying reality, not reducible to anything else. Awareness and existence are the same. As such, the universe is non-material, self-organizing throughout, a holarchy of complementary, process driven, recursive interactions. The universe is both its own first observer and subject. Considering the world to be non-material and comprised, a priori, of Awareness is to privilege information over materiality, action over agency and to understand that qualia are not a “hard problem,” but the foundational elements of all existence. These views fully reflect mainstream Western philosophical traditions, insights from culturally diverse contemplative and mystical traditions, and are in keeping with current scientific thinking, expressible mathematically. PMID:27489576
Fundamental awareness: A framework for integrating science, philosophy and metaphysics.
Theise, Neil D; Kafatos, Menas C
2016-01-01
The ontologic framework of Fundamental Awareness proposed here assumes that non-dual Awareness is foundational to the universe, not arising from the interactions or structures of higher level phenomena. The framework allows comparison and integration of views from the three investigative domains concerned with understanding the nature of consciousness: science, philosophy, and metaphysics. In this framework, Awareness is the underlying reality, not reducible to anything else. Awareness and existence are the same. As such, the universe is non-material, self-organizing throughout, a holarchy of complementary, process driven, recursive interactions. The universe is both its own first observer and subject. Considering the world to be non-material and comprised, a priori, of Awareness is to privilege information over materiality, action over agency and to understand that qualia are not a "hard problem," but the foundational elements of all existence. These views fully reflect mainstream Western philosophical traditions, insights from culturally diverse contemplative and mystical traditions, and are in keeping with current scientific thinking, expressible mathematically.
Lesion mapping of social problem solving.
Barbey, Aron K; Colom, Roberto; Paul, Erick J; Chau, Aileen; Solomon, Jeffrey; Grafman, Jordan H
2014-10-01
Accumulating neuroscience evidence indicates that human intelligence is supported by a distributed network of frontal and parietal regions that enable complex, goal-directed behaviour. However, the contributions of this network to social aspects of intellectual function remain to be well characterized. Here, we report a human lesion study (n = 144) that investigates the neural bases of social problem solving (measured by the Everyday Problem Solving Inventory) and examine the degree to which individual differences in performance are predicted by a broad spectrum of psychological variables, including psychometric intelligence (measured by the Wechsler Adult Intelligence Scale), emotional intelligence (measured by the Mayer, Salovey, Caruso Emotional Intelligence Test), and personality traits (measured by the Neuroticism-Extraversion-Openness Personality Inventory). Scores for each variable were obtained, followed by voxel-based lesion-symptom mapping. Stepwise regression analyses revealed that working memory, processing speed, and emotional intelligence predict individual differences in everyday problem solving. A targeted analysis of specific everyday problem solving domains (involving friends, home management, consumerism, work, information management, and family) revealed psychological variables that selectively contribute to each. Lesion mapping results indicated that social problem solving, psychometric intelligence, and emotional intelligence are supported by a shared network of frontal, temporal, and parietal regions, including white matter association tracts that bind these areas into a coordinated system. The results support an integrative framework for understanding social intelligence and make specific recommendations for the application of the Everyday Problem Solving Inventory to the study of social problem solving in health and disease. © The Author (2014). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. 
ERIC Educational Resources Information Center
Phan, Huy P.; Ngu, Bing H.; Yeung, Alexander S.
2017-01-01
We recently developed the "Framework of Achievement Bests" to explain the importance of effective functioning, personal growth, and enrichment of well-being experiences. This framework postulates a concept known as "optimal achievement best," which stipulates the idea that individuals may, in general, strive to achieve personal…
A Framework for Analyzing the Collaborative Construction of Arguments and Its Interplay with Agency
ERIC Educational Resources Information Center
Mueller, Mary; Yankelewitz, Dina; Maher, Carolyn
2012-01-01
In this report, we offer a framework for analyzing the ways in which collaboration influences learners' building of mathematical arguments and thus promotes mathematical understanding. Building on a previous model used to analyze discursive practices of students engaged in mathematical problem solving, we introduce three types of collaboration and…
Framework for Implementing Engineering Senior Design Capstone Courses and Design Clinics
ERIC Educational Resources Information Center
Franchetti, Matthew; Hefzy, Mohamed Samir; Pourazady, Mehdi; Smallman, Christine
2012-01-01
Senior design capstone projects for engineering students are essential components of an undergraduate program that enhances communication, teamwork, and problem solving skills. Capstone projects with industry are well established in management, but not as heavily utilized in engineering. This paper outlines a general framework that can be used by…
ERIC Educational Resources Information Center
Nelson, Catherine; van Dijk, Jan; McDonnell, Andrea P.; Thompson, Kristina
2002-01-01
This article describes a framework for assessing young children with severe multiple disabilities. The assessment is child-led and examines underlying processes of learning, including biobehavioral state, orienting response, learning channels, approach-withdrawal, memory, interactions, communication, and problem solving. Case studies and a sample…
ERIC Educational Resources Information Center
Haas, Stephanie W.; Pattuelli, Maria Cristina; Brown, Ron T.
2003-01-01
Describes the Statistical Interactive Glossary (SIG), an enhanced glossary of statistical terms supported by the GovStat ontology of statistical concepts. Presents a conceptual framework whose components articulate different aspects of a term's basic explanation that can be manipulated to produce a variety of presentations. The overarching…
An Ontological Informatics Framework for Pharmaceutical Product Development: Milling as a Case Study
ERIC Educational Resources Information Center
Akkisetty, Venkata Sai Pavan Kumar
2009-01-01
Pharmaceutical product development is an expensive, time-consuming and information-intensive process. Providing the right information at the right time is of great importance in the pharmaceutical industry. To achieve this, knowledge management is the approach used to deal with the vast quantity of information. The ontological approach proposed in Venkat…
An Ontology Infrastructure for an E-Learning Scenario
ERIC Educational Resources Information Center
Guo, Wen-Ying; Chen, De-Ren
2007-01-01
Selecting appropriate learning services for a learner from a large number of heterogeneous knowledge sources is a complex and challenging task. This article illustrates and discusses how Semantic Web technologies such as RDF [resource description framework] and ontology can be applied to e-learning systems to help the learner in selecting an…
FAST: a framework for simulation and analysis of large-scale protein-silicon biosensor circuits.
Gu, Ming; Chakrabartty, Shantanu
2013-08-01
This paper presents a computer aided design (CAD) framework for verification and reliability analysis of protein-silicon hybrid circuits used in biosensors. It is envisioned that similar to integrated circuit (IC) CAD design tools, the proposed framework will be useful for system level optimization of biosensors and for discovery of new sensing modalities without resorting to laborious fabrication and experimental procedures. The framework referred to as FAST analyzes protein-based circuits by solving inverse problems involving stochastic functional elements that admit non-linear relationships between different circuit variables. In this regard, FAST uses a factor-graph netlist as a user interface and solving the inverse problem entails passing messages/signals between the internal nodes of the netlist. Stochastic analysis techniques like density evolution are used to understand the dynamics of the circuit and estimate the reliability of the solution. As an example, we present a complete design flow using FAST for synthesis, analysis and verification of our previously reported conductometric immunoassay that uses antibody-based circuits to implement forward error-correction (FEC).
Supporting Valid Decision Making: Uses and Misuses of Assessment Data within the Context of RtI
ERIC Educational Resources Information Center
Ball, Carrie R.; Christ, Theodore J.
2012-01-01
Within an RtI problem-solving context, assessment and decision making generally center around the tasks of problem identification, problem analysis, progress monitoring, and program evaluation. We use this framework to discuss the current state of the literature regarding curriculum based measurement, its technical properties, and its utility for…
ELM Meets Urban Big Data Analysis: Case Studies
Chen, Huajun; Chen, Jiaoyan
2016-01-01
In recent years, the rapid progress of urban computing has created both opportunities and challenges. The heterogeneity and sheer volume of urban data, together with the large gap between the physical and virtual worlds, make it difficult to solve practical urban-computing problems quickly. In this paper, we propose a general application framework of ELM (extreme learning machine) for urban computing. We present several real case studies of the framework, such as smog-related health-hazard prediction and optimal retail store placement. Experiments involving urban data in China show the efficiency, accuracy, and flexibility of our proposed framework. PMID:27656203
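The extreme learning machine at the centre of this framework trains in one shot: hidden-layer weights are drawn at random and only the output weights are fit by (ridge) least squares. Below is a minimal single-input, single-output sketch in pure Python, not the authors' urban-computing pipeline; the hidden-layer size, weight ranges, and ridge value are illustrative assumptions.

```python
import math
import random

def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def elm_fit(xs, ys, n_hidden=20, ridge=1e-8, seed=0):
    """Fit a single-output ELM: random tanh hidden layer, output
    weights from ridge-regularized least squares (closed form)."""
    rng = random.Random(seed)
    w = [rng.uniform(-2.0, 2.0) for _ in range(n_hidden)]
    b = [rng.uniform(-2.0, 2.0) for _ in range(n_hidden)]

    def features(x):
        # Hidden activations plus a constant bias feature.
        return [math.tanh(wi * x + bi) for wi, bi in zip(w, b)] + [1.0]

    H = [features(x) for x in xs]
    d = n_hidden + 1
    # Normal equations: (H^T H + ridge*I) beta = H^T y.
    A = [[sum(H[k][i] * H[k][j] for k in range(len(xs))) + (ridge if i == j else 0.0)
          for j in range(d)] for i in range(d)]
    rhs = [sum(H[k][i] * ys[k] for k in range(len(xs))) for i in range(d)]
    beta = gauss_solve(A, rhs)
    return lambda x: sum(f * bt for f, bt in zip(features(x), beta))
```

Because training reduces to one linear solve, an ELM fits large datasets far faster than iteratively trained networks, which is the property the urban-computing framework exploits.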
A constrained robust least squares approach for contaminant release history identification
NASA Astrophysics Data System (ADS)
Sun, Alexander Y.; Painter, Scott L.; Wittmeyer, Gordon W.
2006-04-01
Contaminant source identification is an important type of inverse problem in groundwater modeling and is subject to both data and model uncertainty. Model uncertainty was rarely considered in previous studies. In this work, a robust framework for solving contaminant source recovery problems is introduced. The contaminant source identification problem is first cast into one of solving uncertain linear equations, where the response matrix is constructed using a superposition technique. The formulation presented here is general and is applicable to any porous media flow and transport solver. The robust least squares (RLS) estimator, which originated in the field of robust identification, directly accounts for errors arising from model uncertainty and has been shown to significantly reduce the sensitivity of the optimal solution to perturbations in model and data. In this work, a new variant of RLS, the constrained robust least squares (CRLS), is formulated for solving uncertain linear equations. CRLS allows additional constraints, such as nonnegativity, to be imposed. The performance of CRLS is demonstrated through one- and two-dimensional test problems. When the system is ill-conditioned and uncertain, CRLS is found to give much better performance than its classical counterpart, nonnegative least squares. The source identification framework developed in this work thus constitutes a reliable tool for recovering source release histories in real applications.
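CRLS itself is a robust-optimization formulation; as a hedged illustration of just its constraint-handling ingredient, the sketch below solves an ordinary nonnegativity-constrained least-squares problem by projected gradient descent. The matrix, step size, and iteration count are illustrative assumptions, not the authors' method.

```python
def nnls_projected_gradient(A, b, step, iters=2000):
    """Nonnegative least squares: minimize ||Ax - b||^2 subject to
    x >= 0, via projected gradient descent."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # Residual r = Ax - b and gradient g = 2 A^T r.
        r = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(m)]
        g = [2.0 * sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        # Gradient step followed by projection onto the nonnegative orthant.
        x = [max(0.0, x[j] - step * g[j]) for j in range(n)]
    return x
```

The iterates converge geometrically provided the step size is below 1/λmax(AᵀA); components that would go negative in the unconstrained optimum are clamped to zero, exactly the behaviour a nonnegative release history requires.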
SART-Type Half-Threshold Filtering Approach for CT Reconstruction
YU, HENGYONG; WANG, GE
2014-01-01
The ℓ1 regularization problem has been widely used to solve sparsity-constrained problems. To enhance the sparsity constraint for better imaging performance, a promising direction is to use the ℓp norm (0 < p < 1) and solve the ℓp minimization problem. Very recently, Xu et al. developed an analytic solution for the ℓ1∕2 regularization via an iterative thresholding operation, which is also referred to as half-threshold filtering. In this paper, we design a simultaneous algebraic reconstruction technique (SART)-type half-threshold filtering framework to solve the computed tomography (CT) reconstruction problem. In the medical imaging field, the discrete gradient transform (DGT) is widely used to define the sparsity. However, the DGT is noninvertible and cannot be applied to half-threshold filtering for CT reconstruction. To demonstrate the utility of the proposed SART-type half-threshold filtering framework, an emphasis of this paper is to construct a pseudoinverse transform for the DGT. The proposed algorithms are evaluated with numerical and physical phantom data sets. Our results show that the SART-type half-threshold filtering algorithms have great potential to improve the reconstructed image quality from few and noisy projections. They are complementary to the counterparts of the state-of-the-art soft-threshold filtering and hard-threshold filtering. PMID:25530928
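The half-threshold operation referenced above has a closed scalar form. The sketch below implements that operator as it is commonly stated for Xu et al.'s ℓ1∕2 scheme, offered as an illustration rather than the paper's full SART pipeline: inputs below a λ-dependent threshold are zeroed, and larger inputs are only mildly shrunk.

```python
import math

def half_threshold(x, lam):
    """Scalar half-thresholding operator for l^(1/2) regularization
    (analytic form attributed to Xu et al.): zero below a threshold
    proportional to lam^(2/3), mild shrinkage above it."""
    t = (54.0 ** (1.0 / 3.0) / 4.0) * lam ** (2.0 / 3.0)  # threshold
    if abs(x) <= t:
        return 0.0
    phi = math.acos((lam / 8.0) * (abs(x) / 3.0) ** -1.5)
    return (2.0 / 3.0) * x * (1.0 + math.cos(2.0 * math.pi / 3.0 - (2.0 / 3.0) * phi))
```

Compared with soft thresholding, which subtracts a constant from every surviving coefficient, this operator leaves large coefficients nearly unchanged, which is why it biases reconstructions less while still enforcing sparsity.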
Evolving Frameworks for Different Communities of Scientists and End Users
NASA Astrophysics Data System (ADS)
Graves, S. J.; Keiser, K.
2016-12-01
Two evolving frameworks for interdisciplinary science will be described in the context of the Common Data Framework for Earth-Observation Data and the importance of standards and protocols. The Event Data Driven Delivery (ED3) Framework, funded by NASA Applied Sciences, provides the delivery of data based on predetermined subscriptions and associated workflows to various communities of end users. ED3's capabilities are used by scientists, as well as policy and resource managers, when event alerts are triggered to respond to their needs. The EarthCube Integration and Testing Environment (ECITE) Assessment Framework for Technology Interoperability and Integration is being developed to facilitate the EarthCube community's assessment of NSF-funded technologies addressing Earth science problems. ECITE is addressing the translation of geoscience researchers' use cases into technology use cases that apply EarthCube-funded building-block technologies (and other existing technologies) to solving science problems. EarthCube criteria for technology assessment include the use of data, metadata, and service standards to improve interoperability and integration across program components. The long-range benefit will be the growth of a cyberinfrastructure with technology components that have been shown to work together to meet known science objectives.
Stapp's quantum dualism: The James and Heisenberg model of consciousness
NASA Astrophysics Data System (ADS)
Noyes, H. P.
1994-02-01
Henry Stapp attempts to resolve the Cartesian dilemma by introducing what the author would characterize as an ontological dualism between mind and matter. His model for mind comes from William James' description of conscious events and for matter from Werner Heisenberg's ontological model for quantum events (wave function collapse). His demonstration of the isomorphism between the two types of events is successful, but in the author's opinion fails to establish a monistic, scientific theory. The author traces Stapp's failure to his adamant rejection of arbitrariness, or 'randomness.' This makes it impossible for him (or for Bohr and Pauli before him) to understand the power of Darwin's explanation of biology, let alone the triumphs of modern 'neo-Darwinism.' The author notes that the point at issue is a modern version of the unresolved opposition between Leucippus and Democritus on one side and Epicurus on the other. Stapp's views are contrasted with recent discussions of consciousness by two eminent biologists: Crick and Edelman. They locate the problem firmly in the context of natural selection on the surface of the earth. Their approaches provide a sound basis for further scientific work. The author briefly examines the connection between this scientific (rather than ontological) framework and the new fundamental theory based on bit-strings and the combinatorial hierarchy.
Integrating systems biology models and biomedical ontologies
2011-01-01
Background Systems biology is an approach to biology that emphasizes the structure and dynamic behavior of biological systems and the interactions that occur within them. To succeed, systems biology crucially depends on the accessibility and integration of data across domains and levels of granularity. Biomedical ontologies were developed to facilitate such an integration of data and are often used to annotate biosimulation models in systems biology. Results We provide a framework to integrate representations of in silico systems biology with those of in vivo biology as described by biomedical ontologies and demonstrate this framework using the Systems Biology Markup Language. We developed the SBML Harvester software that automatically converts annotated SBML models into OWL and we apply our software to those biosimulation models that are contained in the BioModels Database. We utilize the resulting knowledge base for complex biological queries that can bridge levels of granularity, verify models based on the biological phenomenon they represent and provide a means to establish a basic qualitative layer on which to express the semantics of biosimulation models. Conclusions We establish an information flow between biomedical ontologies and biosimulation models and we demonstrate that the integration of annotated biosimulation models and biomedical ontologies enables the verification of models as well as expressive queries. Establishing a bi-directional information flow between systems biology and biomedical ontologies has the potential to enable large-scale analyses of biological systems that span levels of granularity from molecules to organisms. PMID:21835028
Discovering gene annotations in biomedical text databases
Cakmak, Ali; Ozsoyoglu, Gultekin
2008-01-01
Background Genes and gene products are frequently annotated with Gene Ontology concepts based on the evidence provided in genomics articles. Manually locating and curating information about a genomic entity from the biomedical literature requires vast amounts of human effort. Hence, there is clearly a need for automated computational tools to annotate genes and gene products with Gene Ontology concepts by computationally capturing the related knowledge embedded in textual data. Results In this article, we present an automated genomic entity annotation system, GEANN, which extracts information about the characteristics of genes and gene products in article abstracts from PubMed, and translates the discovered knowledge into Gene Ontology (GO) concepts, a widely used standardized vocabulary of genomic traits. GEANN utilizes textual "extraction patterns" and a semantic matching framework to locate phrases matching a pattern and produce Gene Ontology annotations for genes and gene products. In our experiments, GEANN reached a precision of 78% at a recall of 61%. On a select set of Gene Ontology concepts, GEANN either outperforms or is comparable to two other automated annotation studies. Use of WordNet for semantic pattern matching improves the precision and recall by 24% and 15%, respectively, and the improvement due to semantic pattern matching becomes more apparent as the Gene Ontology terms become more general. Conclusion GEANN is useful for two distinct purposes: (i) automating the annotation of genomic entities with Gene Ontology concepts, and (ii) providing existing annotations with additional "evidence articles" from the literature. The use of textual extraction patterns constructed from the existing annotations achieves high precision.
The semantic pattern matching framework provides a more flexible pattern matching scheme than "exact matching", with the advantage of locating approximate pattern occurrences with similar semantics. The relatively low recall of our pattern-based approach may be improved either by employing a probabilistic annotation framework based on the annotation neighbourhoods in textual data, or by lowering the statistical enrichment threshold for applications that put more value on achieving higher recall. PMID:18325104
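The reported precision (78%) and recall (61%) follow the standard definitions over predicted versus curated annotation sets. A minimal sketch, with illustrative GO identifiers:

```python
def precision_recall(predicted, gold):
    """Precision and recall of a set of predicted annotations
    against a gold-standard (curated) set."""
    tp = len(predicted & gold)                      # true positives
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    return precision, recall
```

Lowering an acceptance threshold, as the conclusion suggests, typically admits more predictions, trading precision for recall.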
Divide et impera: subgoaling reduces the complexity of probabilistic inference and problem solving
Maisto, Domenico; Donnarumma, Francesco; Pezzulo, Giovanni
2015-01-01
It has long been recognized that humans (and possibly other animals) usually break problems down into smaller and more manageable problems using subgoals. Despite a general consensus that subgoaling helps problem solving, it is still unclear what mechanisms guide online subgoal selection during the solution of novel problems for which predefined solutions are not available. Under which conditions does subgoaling lead to optimal behaviour? When is subgoaling better than solving a problem from start to finish? What are the best number and sequence of subgoals to solve a given problem? How are these subgoals selected during online inference? Here, we present a computational account of subgoaling in problem solving. Following Occam's razor, we propose that good subgoals are those that permit planning solutions and controlling behaviour using fewer information resources, thus yielding parsimony in inference and control. We implement this principle using approximate probabilistic inference: subgoals are selected using a sampling method that considers the descriptive complexity of the resulting sub-problems. We validate the proposed method using a standard reinforcement learning benchmark (four-rooms scenario) and show that the proposed method requires fewer inferential steps and permits selecting more compact control programs compared to an equivalent procedure without subgoaling. Furthermore, we show that the proposed method offers a mechanistic explanation of the neuronal dynamics found in the prefrontal cortex of monkeys that solve planning problems. Our computational framework provides a novel integrative perspective on subgoaling and its adaptive advantages for planning, control and learning, such as lowering cognitive effort and working memory load. PMID:25652466
A Modular Framework for Transforming Structured Data into HTML with Machine-Readable Annotations
NASA Astrophysics Data System (ADS)
Patton, E. W.; West, P.; Rozell, E.; Zheng, J.
2010-12-01
There is a plethora of web-based Content Management Systems (CMS) available for maintaining projects, data, and other content. However, each system varies in its capabilities, and content is often stored separately and accessed via non-uniform web interfaces. Moving from one CMS to another (e.g., MediaWiki to Drupal) can be cumbersome, especially if a large quantity of data must be adapted to the new system. To standardize the creation, display, management, and sharing of project information, we have assembled a framework that uses existing web technologies to transform data provided by any service that supports SPARQL Protocol and RDF Query Language (SPARQL) queries into HTML fragments, allowing it to be embedded in any existing website. The framework utilizes a two-tier XML Stylesheet Transformation (XSLT) that uses existing ontologies (e.g., Friend-of-a-Friend, Dublin Core) to interpret query results and render them as HTML documents. These ontologies can be used in conjunction with custom ontologies suited to individual needs (e.g., domain-specific ontologies for describing data records). Furthermore, this transformation process encodes machine-readable annotations, namely the Resource Description Framework in attributes (RDFa), into the resulting HTML, so that capable parsers and search engines can extract the relationships between entities (e.g., people, organizations, datasets). To facilitate editing of content, the framework provides a web-based form system, mapping each query to a dynamically generated form that can be used to modify and create entities, while keeping the native data store up to date. This open framework makes it easy to duplicate data across many different sites, allowing researchers to distribute their data in many different online forums.
In this presentation we will outline the structure of queries and the stylesheets used to transform them, followed by a brief walkthrough that follows the data from storage to human- and machine-accessible web page. We conclude with a discussion on content caching and steps toward performing queries across multiple domains.
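The idea of emitting HTML that still carries machine-readable RDFa triples can be illustrated independently of the XSLT pipeline described above. A minimal Python sketch; the helper name and the CURIE/URI values are illustrative assumptions:

```python
import xml.etree.ElementTree as ET

def rows_to_rdfa(resource_uri, rows):
    """Render (property-CURIE, literal) query-result rows as an HTML
    fragment whose RDFa attributes (about/property) preserve the
    underlying triples for parsers and search engines."""
    div = ET.Element("div", attrib={"about": resource_uri})
    for prop, value in rows:
        span = ET.SubElement(div, "span", attrib={"property": prop})
        span.text = value
    return ET.tostring(div, encoding="unicode")
```

A human reader sees only the literal values, while an RDFa-aware parser can recover subject, predicate, and object from the `about` and `property` attributes.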
Network Community Detection based on the Physarum-inspired Computational Framework.
Gao, Chao; Liang, Mingxin; Li, Xianghua; Zhang, Zili; Wang, Zhen; Zhou, Zhili
2016-12-13
Community detection is a crucial and essential problem in the structural analysis of complex networks, which can help us understand and predict the characteristics and functions of complex networks. Many methods, ranging from optimization-based algorithms to heuristic-based algorithms, have been proposed for solving such a problem. Due to the inherent complexity of identifying network structure, how to design an effective algorithm with higher accuracy and lower computational cost still remains an open problem. Inspired by the computational capability and positive feedback mechanism in the foraging process of Physarum, a large amoeba-like cell consisting of a dendritic network of tube-like pseudopodia, a general Physarum-based computational framework for community detection is proposed in this paper. Based on the proposed framework, inter-community edges can be distinguished from intra-community edges in a network, and the positive feedback of the solving process in an algorithm can be further enhanced; these properties are used to improve the efficiency of original optimization-based and heuristic-based community detection algorithms, respectively. Some typical algorithms (e.g., genetic algorithm, ant colony optimization algorithm, and Markov clustering algorithm) and real-world datasets have been used to evaluate the efficiency of our proposed computational framework. Experiments show that the algorithms optimized by the Physarum-inspired computational framework perform better than the original ones, in terms of accuracy and computational cost. Moreover, a computational complexity analysis verifies the scalability of our framework.
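Optimization-based community detection algorithms of the kind the framework accelerates typically score a candidate partition with Newman modularity. A minimal sketch of that objective (the scoring function only, not the Physarum framework itself):

```python
def modularity(edges, communities):
    """Newman modularity Q of a partition: fraction of intra-community
    edges minus its expectation under degree-preserving random rewiring."""
    m = len(edges)
    comm = {node: ci for ci, group in enumerate(communities) for node in group}
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    q = 0.0
    for ci, group in enumerate(communities):
        intra = sum(1 for u, v in edges if comm[u] == ci and comm[v] == ci)
        d = sum(deg[n] for n in group)
        q += intra / m - (d / (2.0 * m)) ** 2
    return q
```

On two triangles joined by a single bridge edge, splitting the graph into the two triangles scores strictly higher than keeping everything in one community, which is the signal an optimizer climbs.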
A multilevel finite element method for Fredholm integral eigenvalue problems
NASA Astrophysics Data System (ADS)
Xie, Hehu; Zhou, Tao
2015-12-01
In this work, we propose a multigrid finite element (MFE) method for solving Fredholm integral eigenvalue problems. The main motivation for such studies is to compute the Karhunen-Loève expansions of random fields, which play an important role in applications of uncertainty quantification. In our MFE framework, solving the eigenvalue problem is converted into a series of integral iterations together with an eigenvalue solve on the coarsest mesh. Any existing efficient integration scheme can then be used for the associated integration process. Error estimates are provided, and the computational complexity is analyzed. Notably, the total computational work of our method is comparable with a single integration step on the finest mesh. Several numerical experiments are presented to validate the efficiency of the proposed numerical method.
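For intuition, the simplest discretization of such a problem is a single-level Nyström method: replace the integral with a quadrature rule and compute eigenvalues of the resulting matrix. The sketch below (pure Python with power iteration, a stand-in for the paper's multigrid scheme, with grid size and iteration count as illustrative assumptions) recovers the leading Karhunen-Loève eigenvalue of Brownian motion on [0,1], whose covariance kernel min(s,t) has the known eigenvalues 1/((k-1/2)^2 * pi^2).

```python
import math

def fredholm_top_eigenvalue(kernel, n=200, iters=100):
    """Approximate the largest eigenvalue of the integral operator
    (Kf)(s) = integral over [0,1] of kernel(s,t) f(t) dt, by Nystrom
    discretization on a uniform midpoint grid plus power iteration."""
    h = 1.0 / n
    grid = [(i + 0.5) * h for i in range(n)]
    # Quadrature turns the operator into the n x n matrix kernel(s_i, t_j) * h.
    A = [[kernel(s, t) * h for t in grid] for s in grid]
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)       # dominant eigenvalue estimate
        v = [x / lam for x in w]           # renormalize the iterate
    return lam
```

For the Brownian kernel the estimate converges to 4/pi^2 ≈ 0.4053 as the grid is refined; the multigrid method in the abstract reaches comparable accuracy while confining the eigenvalue solve to the coarsest mesh.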
Stucky, Brian J; Guralnick, Rob; Deck, John; Denny, Ellen G; Bolmgren, Kjell; Walls, Ramona
2018-01-01
Plant phenology - the timing of plant life-cycle events, such as flowering or leafing out - plays a fundamental role in the functioning of terrestrial ecosystems, including human agricultural systems. Because plant phenology is often linked with climatic variables, there is widespread interest in developing a deeper understanding of global plant phenology patterns and trends. Although phenology data from around the world are currently available, truly global analyses of plant phenology have so far been difficult because the organizations producing large-scale phenology data are using non-standardized terminologies and metrics during data collection and data processing. To address this problem, we have developed the Plant Phenology Ontology (PPO). The PPO provides the standardized vocabulary and semantic framework that is needed for large-scale integration of heterogeneous plant phenology data. Here, we describe the PPO, and we also report preliminary results of using the PPO and a new data processing pipeline to build a large dataset of phenology information from North America and Europe.
NASA Astrophysics Data System (ADS)
Macris, Aristomenis M.; Georgakellos, Dimitrios A.
Technology selection decisions such as equipment purchasing and supplier selection are decisions of strategic importance to companies. These decisions are usually complex and unstructured, and thus difficult to capture in a way that is efficiently reusable. Knowledge reusability is of paramount importance since it enables users to participate actively in process design/redesign activities stimulated by the changing technology selection environment. This paper addresses the technology selection problem through an ontology-based approach that captures and makes reusable the equipment purchasing process and assists in (a) identifying the specifications requested by the users' organization, (b) identifying those offered by various candidate vendors' organizations, and (c) performing a specifications gap analysis as a prerequisite for effective and efficient technology selection. This approach has practical appeal, operational simplicity, and the potential for both immediate and long-term strategic impact. An example from the iron and steel industry is also presented to illustrate the approach.
A common layer of interoperability for biomedical ontologies based on OWL EL.
Hoehndorf, Robert; Dumontier, Michel; Oellrich, Anika; Wimalaratne, Sarala; Rebholz-Schuhmann, Dietrich; Schofield, Paul; Gkoutos, Georgios V
2011-04-01
Ontologies are essential in biomedical research due to their ability to semantically integrate content from different scientific databases and resources. Their application improves capabilities for querying and mining biological knowledge. An increasing number of ontologies is being developed for this purpose, and considerable effort is invested into formally defining them in order to represent their semantics explicitly. However, current biomedical ontologies do not facilitate data integration and interoperability yet, since reasoning over these ontologies is very complex and cannot be performed efficiently or is even impossible. We propose the use of less expressive subsets of ontology representation languages to enable efficient reasoning and achieve the goal of genuine interoperability between ontologies. We present and evaluate EL Vira, a framework that transforms OWL ontologies into the OWL EL subset, thereby enabling the use of tractable reasoning. We illustrate which OWL constructs and inferences are kept and lost following the conversion and demonstrate the performance gain of reasoning indicated by the significant reduction of processing time. We applied EL Vira to the open biomedical ontologies and provide a repository of ontologies resulting from this conversion. EL Vira creates a common layer of ontological interoperability that, for the first time, enables the creation of software solutions that can employ biomedical ontologies to perform inferences and answer complex queries to support scientific analyses. The EL Vira software is available from http://el-vira.googlecode.com and converted OBO ontologies and their mappings are available from http://bioonto.gen.cam.ac.uk/el-ont.
Ratio Analysis: Where Investments Meet Mathematics.
ERIC Educational Resources Information Center
Barton, Susan D.; Woodbury, Denise
2002-01-01
Discusses ratio analysis by which investments may be evaluated. Requires the use of fundamental mathematics, problem solving, and a comparison of the mathematical results within the framework of industry. (Author/NB)
Possibilities: A framework for modeling students' deductive reasoning in physics
NASA Astrophysics Data System (ADS)
Gaffney, Jonathan David Housley
Students often make errors when trying to solve qualitative or conceptual physics problems, and while many successful instructional interventions have been generated to prevent such errors, the process of deduction that students use when solving physics problems has not been thoroughly studied. In an effort to better understand that reasoning process, I have developed a new framework, which is based on the mental models framework in psychology championed by P. N. Johnson-Laird. My new framework models how students search possibility space when thinking about conceptual physics problems and suggests that errors arise from failing to flesh out all possibilities. It further suggests that instructional interventions should focus on making apparent those possibilities, as well as all physical consequences those possibilities would incur. The possibilities framework emerged from the analysis of data from a unique research project specifically invented for the purpose of understanding how students use deductive reasoning. In the selection task, participants were given a physics problem along with three written possible solutions with the goal of identifying which one of the three possible solutions was correct. Each participant was also asked to identify the errors in the incorrect solutions. For the study presented in this dissertation, participants not only performed the selection task individually on four problems, but they were also placed into groups of two or three and asked to discuss with each other the reasoning they used in making their choices and attempt to reach a consensus about which solution was correct. Finally, those groups were asked to work together to perform the selection task on three new problems. The possibilities framework appropriately models the reasoning that students use, and it makes useful predictions about potentially helpful instructional interventions. 
The study reported in this dissertation emphasizes the useful insight the possibilities framework provides. For example, this framework allows us to detect subtle differences in students' reasoning errors, even when those errors result in the same final answer. It also illuminates how simply mentioning overlooked quantities can instigate new lines of student reasoning. It allows us to better understand how well-known psychological biases, such as the belief bias, affect the reasoning process by preventing reasoners from fleshing out all of the possibilities. The possibilities framework also allows us to track student discussions about physics, revealing the need for all parties in communication to use the same set of possibilities in the conversations to facilitate successful understanding. The framework also suggests some of the influences that affect how reasoners choose between possible solutions to a given problem. This new framework for understanding how students reason when solving conceptual physics problems opens the door to a significant field of research. The framework itself needs to be further tested and developed, but it provides substantial suggestions for instructional interventions. If we hope to improve student reasoning in physics, the possibilities framework suggests that we are perhaps best served by teaching students how to fully flesh out the possibilities in every situation. This implies that we need to ensure students have a deep understanding of all of the implied possibilities afforded by the fundamental principles that are the cornerstones of the models we teach in physics classes.
AMMO-Prot: amine system project 3D-model finder.
Navas-Delgado, Ismael; Montañez, Raúl; Pino-Angeles, Almudena; Moya-García, Aurelio A; Urdiales, José Luis; Sánchez-Jiménez, Francisca; Aldana-Montes, José F
2008-04-25
Amines are biogenic amino acid derivatives, which play pleiotropic and very important yet complex roles in animal physiology. As for many other relevant biomolecules, biochemical and molecular data are accumulating and need to be integrated in order to advance biological knowledge in the field. For this purpose, a multidisciplinary group has started an ontology-based system named the Amine System Project (ASP), for which amine-related information is the validation bench. In this paper, we describe the Ontology-Based Mediator developed in the Amine System Project (http://asp.uma.es) using the infrastructure of Semantic Directories, and how this system has been used to solve a case related to amine metabolism-related protein structures. This infrastructure is used to publish and manage not only ontologies and their relationships, but also metadata relating to the resources committed to the ontologies. The system developed is available at http://asp.uma.es/WebMediator.
NASA Astrophysics Data System (ADS)
Rebello, Carina M.
This study explored the effects of alternative forms of argumentation on undergraduates' physics solutions in introductory calculus-based physics. A two-phase concurrent mixed methods design was employed to investigate relationships between undergraduates' written argumentation abilities, conceptual quality of problem solutions, as well as approaches and strategies for solving argumentative physics problems across multiple physics topics. Participants were assigned via stratified sampling to one of three conditions (control, guided construct, or guided evaluate) based on gender and pre-test scores on a conceptual instrument. The guided construct and guided evaluate groups received tasks and prompts drawn from literature to facilitate argument construction or evaluation. Using a multiple case study design, with each condition serving as a case, interviews were conducted consisting of a think-aloud problem solving session paired with a semi-structured interview. The analysis of problem solving strategies was guided by the theoretical framework on epistemic games adapted by Tuminaro and Redish (2007). This study provides empirical evidence that integrating written argumentation into physics problems can potentially improve the conceptual quality of students' solutions, expand their repertoire of problem solving strategies, and help address the gender gap in physics. The study suggests further avenues for research in this area and implications for designing and implementing argumentation tasks in introductory college physics.
Self-organizing ontology of biochemically relevant small molecules
2012-01-01
Background The advent of high-throughput experimentation in biochemistry has led to the generation of vast amounts of chemical data, necessitating the development of novel analysis, characterization, and cataloguing techniques and tools. Recently, a movement to publicly release such data has advanced biochemical structure-activity relationship research, while providing new challenges, the biggest being the curation, annotation, and classification of this information to facilitate useful biochemical pattern analysis. Unfortunately, the human resources currently employed by the organizations supporting these efforts (e.g. ChEBI) are expanding linearly, while new useful scientific information is being released in a seemingly exponential fashion. Compounding this, currently existing chemical classification and annotation systems are not amenable to automated classification, formal and transparent chemical class definition axiomatization, facile class redefinition, or novel class integration, thus further limiting chemical ontology growth by necessitating human involvement in curation. Clearly, there is a need for the automation of this process, especially for novel chemical entities of biological interest. Results To address this, we present a formal framework based on Semantic Web technologies for the automatic design of chemical ontologies that can be used for automated classification of novel entities. We demonstrate the automatic self-assembly of a structure-based chemical ontology based on 60 MeSH and 40 ChEBI chemical classes. This ontology is then used to classify 200 compounds with an accuracy of 92.7%. We extend these structure-based classes with molecular feature information and demonstrate the utility of our framework for classification of functionally relevant chemicals. Finally, we discuss an iterative approach that we envision for future biochemical ontology development.
Conclusions We conclude that the proposed methodology can ease the burden of chemical data annotators and dramatically increase their productivity. We anticipate that the use of formal logic in our proposed framework will make chemical classification criteria more transparent to humans and machines alike and will thus facilitate predictive and integrative bioactivity model development. PMID:22221313
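The core idea, machine-checkable class definitions that let new compounds classify themselves, can be sketched with deliberately naive rules. Real systems express class definitions as formal OWL axioms and use proper substructure matching; the substring tests over SMILES strings below are purely illustrative.

```python
# Toy sketch of automated structure-based classification: each chemical class
# is defined by a machine-checkable rule, so a novel compound classifies
# itself with no human curation. The rules here are crude string predicates
# over SMILES, standing in for real axioms and substructure matchers.

CLASS_RULES = {
    "carboxylic acid": lambda s: "C(=O)O" in s,
    "nitrile":         lambda s: "C#N" in s,
    "aromatic":        lambda s: any(ch.islower() for ch in s if ch in "cnos"),
}

def classify(smiles):
    """Return the set of class labels whose defining rule the structure satisfies."""
    return {name for name, rule in CLASS_RULES.items() if rule(smiles)}

benzoic_acid = "c1ccccc1C(=O)O"   # aromatic ring + carboxylic acid group
labels = classify(benzoic_acid)
```

Because each class is a predicate rather than a hand-curated list, redefining a class or adding a new one changes the classification of every compound automatically, which is the scalability argument the paper makes.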
Guidance for modeling causes and effects in environmental problem solving
Armour, Carl L.; Williamson, Samuel C.
1988-01-01
Environmental problems are difficult to solve because their causes and effects are not easily understood. When attempts are made to analyze causes and effects, the principal challenge is organizing information into a framework that is logical, technically defensible, and easy to understand and communicate. When decision makers attempt to solve complex problems before an adequate cause and effect analysis is performed, there are serious risks. These risks include: greater reliance on subjective reasoning, a lessened chance of scoping an effective problem solving approach, impaired recognition of the need for supplemental information to attain understanding, an increased chance of making unsound decisions, and a lessened chance of gaining approval and financial support for a program. Cause and effect relationships can be modeled. This type of modeling has been applied to various environmental problems, including cumulative impact assessment (Dames and Moore 1981; Meehan and Weber 1985; Williamson et al. 1987; Raley et al. 1988) and evaluation of effects of quarrying (Sheate 1986). This guidance for field users was written because of the current interest in documenting cause-effect logic as a part of ecological problem solving. Principal literature sources relating to the modeling approach are: Riggs and Inouye (1975a, b), Erickson (1981), and United States Office of Personnel Management (1986).
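Documenting cause-effect logic amounts to building a directed graph and tracing chains through it, which a short sketch can make concrete. The node names are hypothetical, not drawn from the cited studies.

```python
# Sketch of documenting cause-effect logic as a directed graph, so causal
# chains from a root cause to an observed effect can be traced, defended,
# and communicated. Node names are invented for illustration.

def causal_chains(edges, cause, effect, path=None):
    """Enumerate all cause->effect paths in a directed cause-effect graph."""
    path = (path or []) + [cause]
    if cause == effect:
        return [path]
    chains = []
    for nxt in edges.get(cause, []):
        if nxt not in path:          # avoid cycles
            chains.extend(causal_chains(edges, nxt, effect, path))
    return chains

edges = {
    "logging":        ["erosion"],
    "road building":  ["erosion"],
    "erosion":        ["sedimentation"],
    "sedimentation":  ["loss of spawning habitat"],
}
chains = causal_chains(edges, "logging", "loss of spawning habitat")
```

Enumerating every path from a proposed cause to the effect of concern is exactly the kind of explicit, checkable argument the guidance recommends before decisions are made.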
How doctors learn: the role of clinical problems across the medical school-to-practice continuum.
Slotnick, H B
1996-01-01
The author proposes a theory of how physicians learn that uses clinical problem solving as its central feature. His theory, which integrates insights from Maslow, Schön, Norman, and others, claims that physicians-in-training and practicing physicians learn largely by deriving insights from clinical experience. These insights allow the learner to solve future problems and thereby address the learner's basic human needs for security, affiliation, and self-esteem. Ensuring that students gain such insights means that the proper roles of the teacher are (1) to select problems for students to solve and offer guidance on how to solve them, and (2) to serve as a role model of how to reflect on the problem, its solution, and the solution's effectiveness. Three principles guide instruction within its framework for learning: (1) learners, whether physicians-in-training or practicing physicians, seek to solve problems they recognize they have; (2) learners want to be involved in their own learning; and (3) instruction must both be time-efficient and also demonstrate the range of ways in which students can apply what they learn. The author concludes by applying the theory to an aspect of undergraduate education and to the general process of continuing medical education.
The MMI Semantic Framework: Rosetta Stones for Earth Sciences
NASA Astrophysics Data System (ADS)
Rueda, C.; Bermudez, L. E.; Graybeal, J.; Alexander, P.
2009-12-01
Semantic interoperability—the exchange of meaning among computer systems—is needed to successfully share data in Ocean Science and across all Earth sciences. The best approach toward semantic interoperability requires a designed framework, and operationally tested tools and infrastructure within that framework. Currently available technologies make a scientific semantic framework feasible, but its development requires sustainable architectural vision and development processes. This presentation outlines the MMI Semantic Framework, including recent progress on it and its client applications. The MMI Semantic Framework consists of tools, infrastructure, and operational and community procedures and best practices, to meet short-term and long-term semantic interoperability goals. The design and prioritization of the semantic framework capabilities are based on real-world scenarios in Earth observation systems. We describe some key use cases, as well as the associated requirements for building the overall infrastructure, which is realized through the MMI Ontology Registry and Repository. This system includes support for community creation and sharing of semantic content, ontology registration, version management, and seamless integration of user-friendly tools and application programming interfaces. The presentation describes the architectural components for semantic mediation, registry and repository for vocabularies, ontology, and term mappings. We show how the technologies and approaches in the framework can address community needs for managing and exchanging semantic information. We will demonstrate how different types of users and client applications exploit the tools and services for data aggregation, visualization, archiving, and integration. Specific examples from OOSTethys (http://www.oostethys.org) and the Ocean Observatories Initiative Cyberinfrastructure (http://www.oceanobservatories.org) will be cited.
Finally, we show how semantic augmentation of web services standards could be performed using framework tools.
Van Landeghem, Sofie; Van Parys, Thomas; Dubois, Marieke; Inzé, Dirk; Van de Peer, Yves
2016-01-05
Differential networks have recently been introduced as a powerful way to study the dynamic rewiring capabilities of an interactome in response to changing environmental conditions or stimuli. Currently, such differential networks are generated and visualised using ad hoc methods, and are often limited to the analysis of only one condition-specific response or one interaction type at a time. In this work, we present a generic, ontology-driven framework to infer, visualise and analyse an arbitrary set of condition-specific responses against one reference network. To this end, we have implemented novel ontology-based algorithms that can process highly heterogeneous networks, accounting for both physical interactions and regulatory associations, symmetric and directed edges, edge weights and negation. We propose this integrative framework as a standardised methodology that allows a unified view on differential networks and promotes comparability between differential network studies. As an illustrative application, we demonstrate its usefulness on a plant abiotic stress study and we experimentally confirmed a predicted regulator. Diffany is freely available as open-source java library and Cytoscape plugin from http://bioinformatics.psb.ugent.be/supplementary_data/solan/diffany/.
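The basic inference step, comparing a condition-specific network against a reference network and keeping the rewired edges, can be sketched as follows. This toy version uses weighted edge dicts and a fixed cutoff; it omits the interaction-type ontology, edge directedness, and negation handling that Diffany actually provides.

```python
# Toy sketch of inferring a differential network: report edges whose weight
# changes by more than a cutoff between a reference network and a
# condition-specific network. Networks and weights are invented.

def differential(reference, condition, cutoff=0.3):
    """Return edges whose weight shifts by more than `cutoff` between networks."""
    diff = {}
    for edge in set(reference) | set(condition):
        delta = condition.get(edge, 0.0) - reference.get(edge, 0.0)
        if abs(delta) > cutoff:
            diff[edge] = round(delta, 2)   # positive: gained, negative: lost
    return diff

reference = {("A", "B"): 0.9, ("B", "C"): 0.5, ("C", "D"): 0.4}
stress    = {("A", "B"): 0.9, ("B", "C"): 0.1, ("D", "E"): 0.8}
diff_net = differential(reference, stress)
```

The unchanged edge A-B drops out of the result, leaving only the condition-specific rewiring, which is the "differential network" the paper visualises and analyses.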
Using the DPSIR Framework to Develop a Conceptual Model: Technical Support Document
Modern problems (e.g., pollution, urban sprawl, environmental equity) are complex and often transcend spatial and temporal scales. Systems thinking is an approach to problem solving that is based on the belief that the component parts of a system are best understood in the contex...
Ontological simulation for educational process organisation in a higher educational institution
NASA Astrophysics Data System (ADS)
Berestneva, O. G.; Marukhina, O. V.; Bahvalov, S. V.; Fisochenko, O. N.; Berestneva, E. V.
2017-01-01
Following new-generation standards requires forming a task list connected with the planning and organisation of an academic process and with the structure and content formation of degree programmes. Even when planning the structure and content of an academic process, one meets problems concerning the necessity to assess the correlation between degree programmes and the demands of educational and professional standards, and to consider today's job-market and students' demands. The paper presents examples of ontological simulation for solving problems of organising the educational process in a higher educational institution and describes the development of the models. The article presents two examples: ontological simulation for planning an educational process in a higher educational institution, and ontological simulation for describing the competences of an IT specialist. The paper concludes that applying ontologies is a promising approach to formalising the organisation of the educational process in a higher educational institution.
Matos, Ely Edison; Campos, Fernanda; Braga, Regina; Palazzi, Daniele
2010-02-01
The amount of information generated by biological research has led to an intensive use of models. Mathematical and computational modeling needs accurate descriptions to share, reuse and simulate models as formulated by their original authors. In this paper, we introduce the Cell Component Ontology (CelO), expressed in OWL-DL. This ontology captures both the structure of a cell model and the properties of functional components. We use this ontology in a Web project (CelOWS) to describe, query and compose CellML models using semantic web services. It aims to improve the reuse and composition of existing components and to allow semantic validation of new models.
An Algorithmic Framework for Multiobjective Optimization
Ganesan, T.; Elamvazuthi, I.; Shaari, Ku Zilati Ku; Vasant, P.
2013-01-01
Multiobjective (MO) optimization is an emerging field which is increasingly being encountered in many disciplines globally. Various metaheuristic techniques such as differential evolution (DE), genetic algorithm (GA), gravitational search algorithm (GSA), and particle swarm optimization (PSO) have been used in conjunction with scalarization techniques such as the weighted sum approach and the normal-boundary intersection (NBI) method to solve MO problems. Nevertheless, many challenges still arise, especially when dealing with problems with multiple objectives (particularly more than two). In addition, problems with extensive computational overhead emerge when dealing with hybrid algorithms. This paper addresses these issues by proposing an alternative framework that utilizes algorithmic concepts related to the problem structure to generate new high-performance algorithms with minimal computational overhead for MO optimization. PMID:24470795
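The weighted sum scalarization mentioned above can be sketched for a two-objective toy problem: sweep the weight, minimize the scalarized objective at each setting, and collect the winners as an approximation of the Pareto front. The objective functions and candidate grid are invented for illustration.

```python
# Sketch of weighted-sum scalarization for a two-objective problem:
# minimize w*f1 + (1-w)*f2 for several weights w, tracing an
# approximation of the Pareto front. Objectives are toy functions.

def weighted_sum_front(f1, f2, candidates, steps=5):
    """Collect the minimizer of each weighted-sum subproblem."""
    front = []
    for i in range(steps + 1):
        w = i / steps
        best = min(candidates, key=lambda x: w * f1(x) + (1 - w) * f2(x))
        if best not in front:
            front.append(best)
    return front

# Two conflicting objectives on [0, 2]: f1 is minimized at x=0, f2 at x=2.
def f1(x): return x * x
def f2(x): return (x - 2) ** 2

candidates = [i / 10 for i in range(21)]   # grid of candidate solutions
front = weighted_sum_front(f1, f2, candidates)
```

Each weight setting trades the two objectives off differently, so the collected minimizers march across the front; the well-known limitation (and a motivation for NBI) is that a uniform weight sweep need not sample a non-convex front uniformly.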
Sahoo, Satya S.; Valdez, Joshua; Rueschman, Michael
2016-01-01
Scientific reproducibility is key to scientific progress as it allows the research community to build on validated results, protect patients from potentially harmful trial drugs derived from incorrect results, and reduce wastage of valuable resources. The National Institutes of Health (NIH) recently published a systematic guideline titled “Rigor and Reproducibility” for supporting reproducible research studies, which has also been accepted by several scientific journals. These journals will require published articles to conform to these new guidelines. Provenance metadata describes the history or origin of data, and it has long been used in computer science to capture metadata information for ensuring data quality and supporting scientific reproducibility. In this paper, we describe the development of the Provenance for Clinical and healthcare Research (ProvCaRe) framework together with a provenance ontology to support scientific reproducibility by formally modeling a core set of data elements representing the details of a research study. We extend the PROV Ontology (PROV-O), which has been recommended as the provenance representation model by the World Wide Web Consortium (W3C), to represent both (a) data provenance and (b) process provenance. We use 124 study variables from 6 clinical research studies from the National Sleep Research Resource (NSRR) to evaluate the coverage of the provenance ontology. NSRR is the largest repository of NIH-funded sleep datasets, with 50,000 studies from 36,000 participants. The provenance ontology reuses ontology concepts from existing biomedical ontologies, for example the Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT), to model the provenance information of research studies. The ProvCaRe framework is being developed as part of the Big Data to Knowledge (BD2K) data provenance project. PMID:28269904
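Provenance capture of this kind can be sketched as subject-predicate-object triples in the spirit of W3C PROV, with derivation chains traced by following wasDerivedFrom links. The entity and predicate names below are illustrative, not actual ProvCaRe ontology terms.

```python
# Sketch of recording data and process provenance as triples, in the spirit
# of the W3C PROV model the paper extends. Entity names are invented.

class ProvenanceGraph:
    def __init__(self):
        self.triples = set()

    def add(self, subject, predicate, obj):
        self.triples.add((subject, predicate, obj))

    def derivation_chain(self, entity):
        """Follow 'wasDerivedFrom' links back to the original source data."""
        chain = [entity]
        while True:
            sources = [o for s, p, o in self.triples
                       if s == chain[-1] and p == "wasDerivedFrom"]
            if not sources:
                return chain
            chain.append(sources[0])

g = ProvenanceGraph()
g.add("ahi_index", "wasDerivedFrom", "sleep_study_edf")       # data provenance
g.add("ahi_index", "wasGeneratedBy", "scoring_algorithm_v2")  # process provenance
g.add("sleep_study_edf", "wasDerivedFrom", "polysomnography_recording")
chain = g.derivation_chain("ahi_index")
```

A reader (or a reproducibility checker) can then walk the chain from a published study variable back to the raw recording and the process that produced it, which is the reproducibility argument for provenance metadata.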
The Orthology Ontology: development and applications.
Fernández-Breis, Jesualdo Tomás; Chiba, Hirokazu; Legaz-García, María Del Carmen; Uchiyama, Ikuo
2016-06-04
Computational comparative analysis of multiple genomes provides valuable opportunities for biomedical research. In particular, orthology analysis can play a central role in comparative genomics; it guides the establishment of evolutionary relations among genes of organisms and allows functional inference of gene products. However, the wide variations in current orthology databases necessitate research on making the content generated by different tools and stored in different structures shareable. Exchanging the content with other research communities requires making the meaning of the content explicit. The need for a common ontology has led to the creation of the Orthology Ontology (ORTH) following the best practices in ontology construction. Here, we describe our model and the major entities of the ontology, which is implemented in the Web Ontology Language (OWL), followed by an assessment of the quality of the ontology and the application of ORTH to existing orthology datasets. This shareable ontology makes it possible to develop Linked Orthology Datasets and a meta-predictor of orthology through standardization of the representation of orthology databases. ORTH is freely available in OWL format to all users at http://purl.org/net/orth . The Orthology Ontology can serve as a framework for the semantic standardization of orthology content and will contribute to a better exploitation of orthology resources in biomedical research. The results demonstrate the feasibility of developing shareable datasets using this ontology. Further applications will maximize the usefulness of this ontology.
NASA Astrophysics Data System (ADS)
Levin, Alan R.; Zhang, Deyin; Polizzi, Eric
2012-11-01
In a recent article Polizzi (2009) [15], the FEAST algorithm has been presented as a general purpose eigenvalue solver which is ideally suited for addressing the numerical challenges in electronic structure calculations. Here, FEAST is presented beyond the “black-box” solver as a fundamental modeling framework which can naturally address the original numerical complexity of the electronic structure problem as formulated by Slater in 1937 [3]. The non-linear eigenvalue problem arising from the muffin-tin decomposition of the real-space domain is first derived and then reformulated to be solved exactly within the FEAST framework. This new framework is presented as a fundamental and practical solution for performing both accurate and scalable electronic structure calculations, bypassing the various issues of using traditional approaches such as linearization and pseudopotential techniques. A finite element implementation of this FEAST framework along with simulation results for various molecular systems is also presented and discussed.
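The contour-integration idea at the heart of FEAST can be sketched for a small dense symmetric matrix: a quadrature rule approximates the spectral projector onto the eigenvectors whose eigenvalues lie inside an interval, and a Rayleigh-Ritz step then recovers those eigenvalues. This is a bare illustration of the principle, not the FEAST library or the paper's non-linear muffin-tin formulation.

```python
# Sketch of FEAST's core idea: quadrature approximation of the spectral
# projector P = (1/2*pi*i) * contour integral of (zI - A)^-1 dz over a circle
# enclosing the target interval, followed by Rayleigh-Ritz. Dense matrices
# only; parameters are illustrative.
import numpy as np

def feast_sketch(A, lo, hi, m=4, n_quad=16, seed=0):
    """Eigenvalues of symmetric A inside (lo, hi) via an approximate projector."""
    n = A.shape[0]
    c, r = (lo + hi) / 2.0, (hi - lo) / 2.0          # circle center and radius
    Y = np.random.default_rng(seed).standard_normal((n, m))
    Q = np.zeros((n, m))
    for theta in 2 * np.pi * (np.arange(n_quad) + 0.5) / n_quad:
        z = c + r * np.exp(1j * theta)
        # one quadrature node of the contour integral, applied to Y
        Q += (r / n_quad) * np.real(np.exp(1j * theta)
                                    * np.linalg.solve(z * np.eye(n) - A, Y))
    Q, _ = np.linalg.qr(Q)                   # orthonormal basis of the subspace
    ritz = np.linalg.eigvalsh(Q.T @ A @ Q)   # Rayleigh-Ritz values
    return ritz[(ritz > lo) & (ritz < hi)]

A = np.diag([1.0, 2.0, 3.0, 10.0])
vals = feast_sketch(A, 0.5, 3.5, m=3)        # should find 1, 2, 3 but not 10
```

The filter nearly annihilates the eigencomponents outside the contour (here the eigenvalue 10), so the filtered subspace spans only the wanted eigenvectors and the Ritz step recovers their eigenvalues.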
Examples of Linking Codes Within GeoFramework
NASA Astrophysics Data System (ADS)
Tan, E.; Choi, E.; Thoutireddy, P.; Aivazis, M.; Lavier, L.; Quenette, S.; Gurnis, M.
2003-12-01
Geological processes usually encompass a broad spectrum of length and time scales. Traditionally, a modeling code (solver) is written to solve a problem with specific length and time scales in mind. The utility of the solver beyond the designated purpose is usually limited. Furthermore, two distinct solvers, even if each can solve complementary parts of a new problem, are difficult to link together to solve the problem as a whole. For example, a Lagrangian deformation model with a visco-elastoplastic crust is used to study deformation near a plate boundary. Ideally, the driving force of the deformation should be derived from the underlying mantle convection, which requires linking the Lagrangian deformation model with a Eulerian mantle convection model. As our understanding of geological processes evolves, the need for integrated modeling codes, which should reuse existing codes as much as possible, begins to surface. The GeoFramework project addresses this need by developing a suite of reusable and re-combinable tools for the Earth science community. GeoFramework is based on and extends Pyre, a Python-based modeling framework recently developed to link solid (Lagrangian) and fluid (Eulerian) models, as well as mesh generators, visualization packages, and databases, with one another for engineering applications. Under the framework, solvers are aware of one another and can interact by exchanging information across adjacent boundaries. A solver needs to conform to a standard interface and provide its own implementation for exchanging boundary information. The framework also provides facilities to coordinate interacting solvers. We will show an example of linking two solvers within GeoFramework. CitcomS is a finite element code which solves for thermal convection within a 3D spherical shell.
CitcomS can solve problems either within a full spherical (global) domain or within a restricted (regional) domain of a full sphere by using different meshers. We can embed a regional CitcomS solver within a global CitcomS solver. We note that linking instances of the same solver is conceptually equivalent to linking two different solvers. The global solver has a coarser grid and a longer stable time step than the regional solver. Therefore, a global-solver time step consists of several regional-solver time steps. The time-marching scheme is described below. First, the global solver is advanced one global-solver time step. Then, the regional solver is advanced for several regional-solver time steps until it catches up with the global solver. Within each regional-solver time step, the velocity field of the global solver is interpolated in time and imposed on the regional solver as boundary conditions. Finally, the temperature field of the regional solver is extrapolated in space and fed back to the global solver. These two solvers are linked and synchronized by the time-marching scheme. An effort to embed a visco-elastoplastic representation of the crust within viscous mantle flow is underway.
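The time-marching scheme described above can be sketched with stand-in solvers: at each regional substep, the boundary velocity is linearly interpolated between the old and new global-solver values. The scalar "velocities" below are placeholders for CitcomS's actual velocity fields.

```python
# Sketch of the coupled time-marching scheme: one coarse global step is
# subdivided into several regional substeps, with the global boundary
# velocity interpolated linearly in time at each substep. Scalars stand
# in for the full velocity fields of the real solvers.

def coupled_step(v_global_old, v_global_new, n_sub):
    """Return the boundary velocity imposed on the regional solver at each
    of n_sub substeps spanning one global-solver time step."""
    imposed = []
    for j in range(1, n_sub + 1):
        alpha = j / n_sub                     # fraction of the global step elapsed
        v_bc = (1 - alpha) * v_global_old + alpha * v_global_new
        imposed.append(round(v_bc, 6))
    return imposed

# Global boundary velocity goes from 2.0 to 4.0 over one global step,
# resolved by 4 regional substeps.
bcs = coupled_step(2.0, 4.0, 4)
```

The last substep lands exactly on the new global value, so the two solvers are synchronized at the end of every global step, matching the scheme in the abstract (the temperature feedback in the other direction is omitted here).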
Building a semi-automatic ontology learning and construction system for geosciences
NASA Astrophysics Data System (ADS)
Babaie, H. A.; Sunderraman, R.; Zhu, Y.
2013-12-01
We are developing an ontology learning and construction framework that allows continuous, semi-automatic knowledge extraction, verification, validation, and maintenance by a potentially very large group of collaborating domain experts in any geosciences field. The system brings geoscientists from the sidelines to the center stage of ontology building, allowing them to collaboratively construct and enrich new ontologies, and merge, align, and integrate existing ontologies and tools. These constantly evolving ontologies can more effectively address the community's interests, purposes, tools, and change. The goal is to minimize the cost and time of building ontologies, and maximize the quality, usability, and adoption of ontologies by the community. Our system will be a domain-independent ontology learning framework that applies natural language processing, allowing users to enter their ontology in a semi-structured form, and a combined Semantic Web and Social Web approach that allows the direct participation of geoscientists who have no expertise in the design and development of their domain ontologies. A controlled natural language (CNL) interface and an integrated authoring and editing tool automatically convert syntactically correct CNL text into formal OWL constructs. The WebProtege-based system will allow a potentially large group of geoscientists, from multiple domains, to crowdsource and participate in the structuring of their knowledge model by sharing their knowledge through critiquing, testing, verifying, adopting, and updating of the concept models (ontologies). We will use cloud storage for all data and knowledge base components of the system, such as users, domain ontologies, discussion forums, and semantic wikis that can be accessed and queried by geoscientists in each domain. We will use NoSQL databases such as MongoDB as a service in the cloud environment.
MongoDB uses the lightweight JSON format, which makes it convenient and easy to build Web applications using just HTML5 and JavaScript, thereby avoiding the cumbersome server-side coding present in traditional approaches. The JSON format used in MongoDB is also suitable for storing and querying RDF data. We will store the domain ontologies and associated linked data in JSON/RDF formats. Our Web interface will be built upon the open-source and configurable WebProtege ontology editor. We will develop a simplified mobile version of our user interface, which will automatically detect the hosting device and adjust the user interface layout to accommodate different screen sizes. We will also use Semantic MediaWiki, which allows the user to store and query the data within the wiki pages. By using HTML5, JavaScript, and WebGL, we aim to create an interactive, dynamic, and multi-dimensional user interface that presents various geosciences data sets in a natural and intuitive way.
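The JSON/RDF storage idea above can be sketched without a MongoDB server: each RDF triple becomes one JSON document, and a subject query mirrors what a MongoDB `find` call would return. The field names (`s`, `p`, `o`) and the example IRIs are illustrative assumptions, not part of the described system.

```python
import json

# Each RDF triple becomes one JSON document, the shape MongoDB would store.
# (The field names "s", "p", "o" are illustrative, not a standard.)
triples = [
    {"s": "ex:Basalt", "p": "rdf:type", "o": "ex:IgneousRock"},
    {"s": "ex:Basalt", "p": "ex:texture", "o": "ex:Aphanitic"},
    {"s": "ex:Granite", "p": "rdf:type", "o": "ex:IgneousRock"},
]

def find(docs, **filt):
    """Mimic a MongoDB find({...}) over a list of JSON documents."""
    return [d for d in docs if all(d.get(k) == v for k, v in filt.items())]

# All statements about ex:Basalt -- like db.triples.find({"s": "ex:Basalt"})
basalt = find(triples, s="ex:Basalt")
print(json.dumps(basalt, indent=2))
```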
A knowledge-based system for prototypical reasoning
NASA Astrophysics Data System (ADS)
Lieto, Antonio; Minieri, Andrea; Piana, Alberto; Radicioni, Daniele P.
2015-04-01
In this work we present a knowledge-based system equipped with a hybrid, cognitively inspired architecture for the representation of conceptual information. The proposed system aims at extending the classical representational and reasoning capabilities of ontology-based frameworks towards the realm of prototype theory. It is based on a hybrid knowledge base composed of a classical symbolic component (grounded in a formal ontology) and a typicality-based one (grounded in the conceptual spaces framework). The resulting system attempts to reconcile the heterogeneous approach to concepts in Cognitive Science with the dual process theories of reasoning and rationality. The system has been experimentally assessed in a conceptual categorisation task in which common-sense linguistic descriptions were given as input and the corresponding target concepts had to be identified. The results show that the proposed solution substantially extends the representational and reasoning 'conceptual' capabilities of standard ontology-based systems.
Deng, Michelle; Zollanvari, Amin; Alterovitz, Gil
2012-01-01
The immense corpus of biomedical literature existing today poses challenges in information search and integration. Many links between pieces of knowledge occur or are significant only under certain contexts—rather than under the entire corpus. This study proposes using networks of ontology concepts, linked based on their co-occurrences in annotations of abstracts of biomedical literature and descriptions of experiments, to draw conclusions based on context-specific queries and to better integrate existing knowledge. In particular, a Bayesian network framework is constructed to allow for the linking of related terms from two biomedical ontologies under the queried context concept. Edges in such a Bayesian network allow associations between biomedical concepts to be quantified and inference to be made about the existence of some concepts given prior information about others. This approach could potentially be a powerful inferential tool for context-specific queries, applicable to ontologies in other fields as well. PMID:22779044
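The co-occurrence-based linking described above can be sketched in a few lines: concept co-occurrence counts over annotated abstracts yield conditional probabilities, each one an edge weight of the kind of Bayesian network proposed. The toy annotations and concept IDs below are invented for illustration.

```python
from collections import Counter
from itertools import combinations

# Toy corpus: each "abstract" is annotated with ontology concepts.
annotations = [
    {"GO:apoptosis", "HP:tumor"},
    {"GO:apoptosis", "HP:tumor", "GO:cell_cycle"},
    {"GO:cell_cycle", "HP:tumor"},
    {"GO:apoptosis"},
]

single = Counter()
pair = Counter()
for doc in annotations:
    single.update(doc)
    pair.update(frozenset(p) for p in combinations(sorted(doc), 2))

def cond_prob(b, a):
    """Estimate P(b | a) from co-occurrence counts -- one edge weight
    of the kind of network the abstract describes."""
    return pair[frozenset((a, b))] / single[a]

print(cond_prob("HP:tumor", "GO:apoptosis"))  # 2 of 3 apoptosis docs mention tumor
```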
An Approach to Formalizing Ontology Driven Semantic Integration: Concepts, Dimensions and Framework
ERIC Educational Resources Information Center
Gao, Wenlong
2012-01-01
The ontology approach has been accepted as a very promising approach to semantic integration today. However, because of the diversity of focuses and its various connections to other research domains, the core concepts, theoretical and technical approaches, and research areas of this domain still remain unclear. Such ambiguity makes it difficult to…
ERIC Educational Resources Information Center
Fast, Karl V.; Campbell, D. Grant
2001-01-01
Compares the implied ontological frameworks of the Open Archives Initiative Protocol for Metadata Harvesting and the World Wide Web Consortium's Semantic Web. Discusses current search engine technology, semantic markup, indexing principles of special libraries and online databases, and componentization and the distinction between data and…
Block clustering based on difference of convex functions (DC) programming and DC algorithms.
Le, Hoai Minh; Le Thi, Hoai An; Dinh, Tao Pham; Huynh, Van Ngai
2013-10-01
We investigate difference of convex functions (DC) programming and the DC algorithm (DCA) to solve the block clustering problem in the continuous framework, which traditionally requires solving a hard combinatorial optimization problem. DC reformulation techniques and exact penalty in DC programming are developed to build an appropriate equivalent DC program of the block clustering problem. They lead to an elegant and explicit DCA scheme for the resulting DC program. Computational experiments show the robustness and efficiency of the proposed algorithm and its superiority over standard algorithms such as two-mode K-means, two-mode fuzzy clustering, and block classification EM.
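The core DCA idea (minimize f = g - h by repeatedly linearizing the concave part -h and solving the resulting convex subproblem) admits a compact illustration on a toy univariate DC function. The choice g(x) = x^4, h(x) = 2x^2 is our own assumption for illustration, not the block-clustering program of the paper.

```python
def dca(x, steps=60):
    """DCA for f(x) = g(x) - h(x) with g(x) = x**4 and h(x) = 2*x**2.
    Each step linearizes h at x_k and minimizes the convex surrogate
    g(x) - h'(x_k)*x; stationarity 4x**3 = 4*x_k gives the closed-form
    update x_{k+1} = cbrt(x_k). Iterates converge to a minimum of f
    at x = 1 (or x = -1 from a negative start)."""
    for _ in range(steps):
        x = abs(x) ** (1 / 3) * (1 if x >= 0 else -1)  # real cube root
    return x

print(dca(0.2))  # converges towards the stationary point x = 1
```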
Hybrid Metaheuristics for Solving a Fuzzy Single Batch-Processing Machine Scheduling Problem
Molla-Alizadeh-Zavardehi, S.; Tavakkoli-Moghaddam, R.; Lotfi, F. Hosseinzadeh
2014-01-01
This paper deals with a problem of minimizing total weighted tardiness of jobs in a real-world single batch-processing machine (SBPM) scheduling in the presence of fuzzy due date. In this paper, first a fuzzy mixed integer linear programming model is developed. Then, due to the complexity of the problem, which is NP-hard, we design two hybrid metaheuristics called GA-VNS and VNS-SA applying the advantages of genetic algorithm (GA), variable neighborhood search (VNS), and simulated annealing (SA) frameworks. Besides, we propose three fuzzy earliest due date heuristics to solve the given problem. Through computational experiments with several random test problems, a robust calibration is applied on the parameters. Finally, computational results on different-scale test problems are presented to compare the proposed algorithms. PMID:24883359
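Of the metaheuristic ingredients named above, simulated annealing is the easiest to sketch for total weighted tardiness on a single machine. This bare-bones version uses crisp due dates and invented job data, not the paper's fuzzy due dates, batching, or hybrid schemes; the function names are ours.

```python
import math
import random

def total_weighted_tardiness(seq, p, d, w):
    """Total weighted tardiness of a job sequence on a single machine."""
    t = cost = 0
    for j in seq:
        t += p[j]
        cost += w[j] * max(0, t - d[j])
    return cost

def anneal(p, d, w, iters=5000, temp=10.0, cool=0.999, seed=0):
    """Plain simulated annealing over job permutations with swap moves."""
    rng = random.Random(seed)
    seq = list(range(len(p)))
    cur = best = total_weighted_tardiness(seq, p, d, w)
    best_seq = seq[:]
    for _ in range(iters):
        i, j = rng.sample(range(len(seq)), 2)
        seq[i], seq[j] = seq[j], seq[i]          # swap two jobs
        cand = total_weighted_tardiness(seq, p, d, w)
        if cand <= cur or rng.random() < math.exp((cur - cand) / temp):
            cur = cand
            if cur < best:
                best, best_seq = cur, seq[:]
        else:
            seq[i], seq[j] = seq[j], seq[i]      # reject: undo the swap
        temp *= cool                             # geometric cooling
    return best, best_seq

# Processing times, (crisp) due dates, and weights for four jobs.
p, d, w = [4, 2, 6, 3], [5, 3, 14, 7], [2, 3, 1, 2]
best, best_seq = anneal(p, d, w)
print(best, best_seq)
```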
From Walls to Windows: Using Barriers as Pathways to Insightful Solutions
ERIC Educational Resources Information Center
Walinga, Jennifer
2010-01-01
The purpose of this study was to explore and develop a conceptual model for how individuals unlock insight. The concept of insight--the "out of the box" or "aha!" solution to a problem--offers a framework for exploring and understanding how best to enhance problem solving skills due to the cognitive shift insight requires. Creative problem solving…
Computer-Aided Group Problem Solving for Unified Life Cycle Engineering (ULCE)
1989-02-01
defining the problem, generating alternative solutions, evaluating alternatives, selecting alternatives, and implementing the solution. Systems...specialist in group dynamics, assists the group in formulating the problem and selecting a model framework. The analyst provides the group with computer...allocating resources, evaluating and selecting options, making judgments explicit, and analyzing dynamic systems. c. University of Rhode Island Drs. Geoffery
Response to Intervention as a Vehicle for Powerful Mental Health Interventions in the Schools
ERIC Educational Resources Information Center
Froiland, John Mark
2011-01-01
School psychologists can work within a Response to Intervention (RtI) framework to increasingly promote the mental health of students. This article shares the unfolding of two composite case studies that exemplify how a practicing school psychologist can use a problem-solving framework to deliver effective mental health interventions to individual…
ERIC Educational Resources Information Center
OECD Publishing, 2017
2017-01-01
What is important for citizens to know and be able to do? The OECD Programme for International Student Assessment (PISA) seeks to answer that question through the most comprehensive and rigorous international assessment of student knowledge and skills. The PISA 2015 Assessment and Analytical Framework presents the conceptual foundations of the…
Sarntivijai, Sirarat; Zhang, Shelley; Jagannathan, Desikan G.; Zaman, Shadia; Burkhart, Keith K.; Omenn, Gilbert S.; He, Yongqun; Athey, Brian D.; Abernethy, Darrell R.
2016-01-01
Introduction: A translational bioinformatics challenge lies in connecting population- and individual-level clinical phenotypes in various formats to biological mechanisms. The Medical Dictionary for Regulatory Activities (MedDRA®) is the default dictionary for Adverse Event (AE) reporting in the FDA Adverse Event Reporting System (FAERS). The Ontology of Adverse Events (OAE) represents AEs as pathological processes occurring after drug exposures. Objectives: The aim is to establish a semantic framework to link biological mechanisms to phenotypes of AEs by combining OAE with MedDRA® in FAERS data analysis. We investigated the AEs associated with Tyrosine Kinase Inhibitors (TKIs) and monoclonal antibodies (mAbs) targeting tyrosine kinases. The selected 5 TKIs/mAbs (i.e., dasatinib, imatinib, lapatinib, cetuximab, and trastuzumab) are known to induce impaired ventricular function (non-QT) cardiotoxicity. Results: Statistical analysis of FAERS data identified 1,053 distinct MedDRA® terms significantly associated with TKIs/mAbs, where 884 did not have corresponding OAE terms. We manually annotated these terms, added them to OAE by the standard OAE development strategy, and mapped them to MedDRA®. The data integration to provide insights into molecular mechanisms for drug-associated AEs is performed by including linkages in OAE for all related AE terms to MedDRA® and existing ontologies including the Human Phenotype Ontology (HP), Uber Anatomy Ontology (UBERON), and Gene Ontology (GO). Sixteen AEs are shared by all 5 TKIs/mAbs, and each of 17 cardiotoxicity AEs was associated with at least one TKI/mAb. As an example, we analyzed 'cardiac failure' using the relations established in OAE with other ontologies, and demonstrated that one of the biological processes associated with cardiac failure maps to the genes associated with heart contraction. 
Conclusion: By expanding the existing OAE ontological design, our TKI use case demonstrates that the combination of OAE and MedDRA® provides a semantic framework to link clinical phenotypes of adverse drug events to biological mechanisms. PMID:27003817
Miyoshi, Newton Shydeo Brandão; Pinheiro, Daniel Guariz; Silva, Wilson Araújo; Felipe, Joaquim Cezar
2013-06-06
The use of the knowledge produced by sciences to promote human health is the main goal of translational medicine. To make it feasible we need computational methods to handle the large amount of information that arises from bench to bedside and to deal with its heterogeneity. A computational challenge that must be faced is to promote the integration of clinical, socio-demographic and biological data. In this effort, ontologies play an essential role as a powerful artifact for knowledge representation. Chado is a modular ontology-oriented database model that gained popularity due to its robustness and flexibility as a generic platform to store biological data; however it lacks supporting representation of clinical and socio-demographic information. We have implemented an extension of Chado - the Clinical Module - to allow the representation of this kind of information. Our approach consists of a framework for data integration through the use of a common reference ontology. The design of this framework has four levels: data level, to store the data; semantic level, to integrate and standardize the data by the use of ontologies; application level, to manage clinical databases, ontologies and data integration process; and web interface level, to allow interaction between the user and the system. The clinical module was built based on the Entity-Attribute-Value (EAV) model. We also proposed a methodology to migrate data from legacy clinical databases to the integrative framework. A Chado instance was initialized using a relational database management system. The Clinical Module was implemented and the framework was loaded using data from a factual clinical research database. Clinical and demographic data as well as biomaterial data were obtained from patients with tumors of head and neck. 
We implemented the IPTrans tool, which is a complete environment for data migration, comprising: the construction of a model to describe the legacy clinical data, based on an ontology; the Extraction, Transformation and Load (ETL) process to extract the data from the source clinical database and load it into the Clinical Module of Chado; and the development of a web tool and a Bridge Layer to adapt the web tool to Chado, as well as other applications. Open-source computational solutions currently available for translational science do not have a model to represent biomolecular information and are not integrated with existing bioinformatics tools. On the other hand, existing genomic data models do not represent clinical patient data. A framework was developed to support translational research by integrating biomolecular information coming from different "omics" technologies with patients' clinical and socio-demographic data. This framework should present some features: flexibility, compression and robustness. The experiments conducted on a use case demonstrated that the proposed system meets the requirements of flexibility and robustness, leading to the desired integration. The Clinical Module can be accessed at http://dcm.ffclrp.usp.br/caib/pg=iptrans.
Predictive Models and Computational Embryology
EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...
Ontology-Based Multiple Choice Question Generation
Al-Yahya, Maha
2014-01-01
With recent advancements in Semantic Web technologies, a new trend in MCQ item generation has emerged through the use of ontologies. Ontologies are knowledge representation structures that formally describe entities in a domain and their relationships, thus enabling automated inference and reasoning. Ontology-based MCQ item generation is still in its infancy, but substantial research efforts are being made in the field. However, the applicability of these models for use in an educational setting has not been thoroughly evaluated. In this paper, we present an experimental evaluation of an ontology-based MCQ item generation system known as OntoQue. The evaluation was conducted using two different domain ontologies. The findings of this study show that ontology-based MCQ generation systems produce satisfactory MCQ items to a certain extent. However, the evaluation also revealed a number of shortcomings with current ontology-based MCQ item generation systems with regard to the educational significance of an automatically constructed MCQ item, the knowledge level it addresses, and its language structure. Furthermore, for the task to be successful in producing high-quality MCQ items for learning assessments, this study suggests a novel, holistic view that incorporates learning content, learning objectives, lexical knowledge, and scenarios into a single cohesive framework. PMID:24982937
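A minimal sketch of the simplest ontology-based generation strategy surveyed above, an MCQ from class-membership assertions: the correct answer is the entity's class and the distractors are other classes. This is not OntoQue itself; the tiny ontology and function names are illustrative.

```python
import random

# A tiny hand-made ontology: entity -> its class (purely illustrative).
ontology = {
    "Canis lupus": "Mammal",
    "Falco peregrinus": "Bird",
    "Python regius": "Reptile",
    "Salmo salar": "Fish",
}

def make_mcq(entity, rng=random.Random(0)):
    """Generate one MCQ item from class-membership assertions: the key
    is the entity's class, the distractors are the remaining classes."""
    answer = ontology[entity]
    distractors = sorted(set(ontology.values()) - {answer})
    options = rng.sample(distractors, 3) + [answer]
    rng.shuffle(options)
    stem = f"To which class does {entity} belong?"
    return stem, options, answer

stem, options, answer = make_mcq("Falco peregrinus")
print(stem, options)
```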
NASA Astrophysics Data System (ADS)
Mamat, Nur Jumaadzan Zaleha; Jaaman, Saiful Hafizah; Ahmad, Rokiah@Rozita
2016-11-01
Two new methods adopted from methods commonly used in the field of transportation and logistics are proposed to solve a specific issue of investment allocation problem. Vehicle routing problem and capacitated vehicle routing methods are applied to optimize the fund allocation of a portfolio of investment assets. This is done by determining the sequence of the assets. As a result, total investment risk is minimized by this sequence.
NASA Astrophysics Data System (ADS)
Pattke, Marco; Martin, Manuel; Voit, Michael
2017-05-01
Tracking people with cameras in public areas is common today. However, with an increasing number of cameras it becomes harder and harder to view the data manually. Especially in safety-critical areas, automatic image exploitation could help to solve this problem. Setting up such a system can, however, be difficult because of its increased complexity. Sensor placement is critical to ensure that people are detected and tracked reliably. We try to solve this problem using a simulation framework that is able to simulate different camera setups in the desired environment, including animated characters. We combine this framework with our self-developed distributed and scalable system for people tracking to test its effectiveness, and can show the results of the tracking system in real time in the simulated environment.
Cloud-based large-scale air traffic flow optimization
NASA Astrophysics Data System (ADS)
Cao, Yi
The ever-increasing traffic demand makes the efficient use of airspace an imperative mission, and this paper presents an effort in response to this call. Firstly, a new aggregate model, called the Link Transmission Model (LTM), is proposed, which models the nationwide traffic as a network of flight routes identified by origin-destination pairs. The traversal time of a flight route is assumed to be the mode of the distribution of historical flight records, and the mode is estimated by using Kernel Density Estimation. As this simplification abstracts away physical trajectory details, the complexity of modeling is drastically decreased, resulting in efficient traffic forecasting. The predictive capability of LTM is validated against recorded traffic data. Secondly, a nationwide traffic flow optimization problem with airport and en route capacity constraints is formulated based on LTM. The optimization problem aims at alleviating traffic congestion with minimal global delays. This problem is intractable due to millions of variables. A dual decomposition method is applied to decompose the large-scale problem so that the subproblems are solvable. However, the whole problem is still computationally expensive to solve, since each subproblem is a smaller integer programming problem that pursues integer solutions. Solving an integer programming problem is known to be far more time-consuming than solving its linear relaxation. In addition, sequential execution on a standalone computer leads to a linear runtime increase as the problem size increases. To address the computational efficiency problem, a parallel computing framework is designed which accommodates concurrent executions via multithreaded programming. The multithreaded version is compared with its monolithic version to show decreased runtime. Finally, an open-source cloud computing framework, Hadoop MapReduce, is employed for better scalability and reliability. 
This framework is an "off-the-shelf" parallel computing model that can be used for both offline historical traffic data analysis and online traffic flow optimization. It provides an efficient and robust platform for easy deployment and implementation. A small cloud consisting of five workstations was configured and used to demonstrate the advantages of cloud computing in dealing with large-scale parallelizable traffic problems.
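The route-time modeling step described above can be sketched directly: a Gaussian kernel density estimate over historical traversal times, with the grid argmax taken as the mode. The bandwidth, grid resolution, and sample data are illustrative assumptions, not the paper's settings.

```python
import math

def kde_mode(samples, bandwidth=5.0, grid_step=1.0):
    """Estimate the mode of a distribution via a Gaussian kernel density
    estimate, the way LTM picks a route's representative traversal time."""
    def density(x):
        return sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2) for s in samples)
    lo, hi = min(samples), max(samples)
    grid = [lo + i * grid_step for i in range(int((hi - lo) / grid_step) + 1)]
    return max(grid, key=density)

# Flight-time records (minutes) clustered around 120, with two outliers
# that would skew a plain mean but leave the KDE mode untouched.
times = [118, 119, 120, 120, 121, 122, 150, 95]
print(kde_mode(times))
```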
Divide et impera: subgoaling reduces the complexity of probabilistic inference and problem solving.
Maisto, Domenico; Donnarumma, Francesco; Pezzulo, Giovanni
2015-03-06
It has long been recognized that humans (and possibly other animals) usually break problems down into smaller and more manageable problems using subgoals. Despite a general consensus that subgoaling helps problem solving, it is still unclear what the mechanisms guiding online subgoal selection are during the solution of novel problems for which predefined solutions are not available. Under which conditions does subgoaling lead to optimal behaviour? When is subgoaling better than solving a problem from start to finish? Which is the best number and sequence of subgoals to solve a given problem? How are these subgoals selected during online inference? Here, we present a computational account of subgoaling in problem solving. Following Occam's razor, we propose that good subgoals are those that permit planning solutions and controlling behaviour using less information resources, thus yielding parsimony in inference and control. We implement this principle using approximate probabilistic inference: subgoals are selected using a sampling method that considers the descriptive complexity of the resulting sub-problems. We validate the proposed method using a standard reinforcement learning benchmark (four-rooms scenario) and show that the proposed method requires less inferential steps and permits selecting more compact control programs compared to an equivalent procedure without subgoaling. Furthermore, we show that the proposed method offers a mechanistic explanation of the neuronal dynamics found in the prefrontal cortex of monkeys that solve planning problems. Our computational framework provides a novel integrative perspective on subgoaling and its adaptive advantages for planning, control and learning, such as for example lowering cognitive effort and working memory load. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
Duhamel, Karen V
2016-10-01
The purpose of this paper is to explore empirical findings of five studies related to graduate-level nurse educators' and nursing students' perceptions about the roles of creativity and creative problem-solving in traditional and innovative pedagogies, and examines conceptual differences in the value of creativity from teacher and student viewpoints. Five peer-reviewed scholarly articles; professional nursing organizations; conceptual frameworks of noted scholars specializing in creativity and creative problem-solving; business-related sources; primary and secondary sources of esteemed nurse scholars. Quantitative and qualitative studies were examined that used a variety of methodologies, including surveys, focus groups, 1:1 interviews, and convenience sampling of both nursing and non-nursing college students and faculty. Innovative teaching strategies supported student creativity and creative problem-solving development. Teacher personality traits and teaching styles receptive to students' needs led to greater student success in creative development. Adequate time allocation and perceived usefulness of creativity and creative problem-solving by graduate-level nurse educators must be reflected in classroom activities and course design. Findings indicated conservative teaching norms, evident in graduate nursing education today, should be revised to promote creativity and creative problem-solving development in graduate-level nursing students for best practice outcomes. Copyright © 2016 Elsevier Ltd. All rights reserved.
Mixed Integer Programming and Heuristic Scheduling for Space Communication Networks
NASA Technical Reports Server (NTRS)
Cheung, Kar-Ming; Lee, Charles H.
2012-01-01
We developed a framework and the mathematical formulation for optimizing a communication network using mixed integer programming. The design yields a system that is much smaller, in search-space size, when compared to the earlier approach. Our constrained network optimization takes into account the dynamics of link performance within the network, along with mission and operation requirements. A unique penalty function is introduced to transform the mixed integer programming problem into the more manageable problem of searching in a continuous space. The constrained optimization problem is solved in two stages: first using the heuristic Particle Swarm Optimization algorithm to get a good initial starting point, and then feeding the result into the Sequential Quadratic Programming algorithm to achieve the final optimal schedule. We demonstrate the above planning and scheduling methodology with a scenario of 20 spacecraft and 3 ground stations of a Deep Space Network site. Our approach and framework are simple and flexible, so that problems with a larger number of constraints and a larger network can be easily adapted and solved.
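The abstract does not give the paper's penalty function, but a standard quadratic integrality penalty illustrates how such a transformation turns a mixed integer search into a continuous one: each binary decision, relaxed to [0, 1], is charged a term that vanishes exactly at 0/1. The objective and parameters below are toy assumptions.

```python
def penalized(obj, x, mu):
    """Continuous surrogate of a 0/1 program: each binary decision x_i,
    relaxed to [0, 1], is charged mu * x_i * (1 - x_i), which is zero
    exactly at integral points. (A standard quadratic integrality
    penalty, not the paper's specific penalty function.)"""
    return obj(x) + mu * sum(xi * (1 - xi) for xi in x)

# Toy objective over two scheduling decisions.
obj = lambda x: 3 * x[0] + 2 * x[1]
print(penalized(obj, [1.0, 0.0], mu=50))  # integral point, no penalty: 3.0
print(penalized(obj, [0.5, 0.5], mu=50))  # fractional point penalized: 27.5
```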
A Questioning Framework for Supporting Fraction Multiplication Understanding
ERIC Educational Resources Information Center
Johanning, Debra I.
2017-01-01
This research examined the role of the teacher in supporting students to make sense of fraction multiplication when using a problem solving approach. Using a qualitative approach, the teaching of four skillful experienced sixth-grade teachers was examined as they implemented a problem-based unit on fraction multiplication. This paper will present…
Learning Factory--Assembling Learning Content with a Framework
ERIC Educational Resources Information Center
Steininger, Peter
2016-01-01
Many of the challenges currently facing lectures are symptoms of problems with learning content creation, development and presentation. Learning Factory solves these problems by integrating critical innovations that have been proven over the last ten to twenty years in different industrial areas, but have not yet been brought or ported together in…
Relevancy in Problem Solving: A Computational Framework
ERIC Educational Resources Information Center
Kwisthout, Johan
2012-01-01
When computer scientists discuss the computational complexity of, for example, finding the shortest path from building A to building B in some town or city, their starting point typically is a formal description of the problem at hand, e.g., a graph with weights on every edge where buildings correspond to vertices, routes between buildings to…
The Semantic eScience Framework
NASA Astrophysics Data System (ADS)
McGuinness, Deborah; Fox, Peter; Hendler, James
2010-05-01
The goal of this effort is to design and implement a configurable and extensible semantic eScience framework (SESF). Configuration requires research into accommodating different levels of semantic expressivity and user requirements from use cases. Extensibility is being achieved in a modular approach to the semantic encodings (i.e. ontologies) performed in community settings, i.e. an ontology framework into which specific applications all the way up to communities can extend the semantics for their needs. We report on how we are accommodating the rapid advances in semantic technologies and tools and the sustainable software path for the future (certain) technical advances. In addition to a generalization of the current data science interface, we will present plans for an upper-level interface suitable for use by clearinghouses, and/or educational portals, digital libraries, and other disciplines. SESF builds upon previous work in the Virtual Solar-Terrestrial Observatory. The VSTO utilizes leading-edge knowledge representation, query, and reasoning techniques to support knowledge-enhanced search, data access, integration, and manipulation. It encodes term meanings and their inter-relationships in ontologies and uses these ontologies and associated inference engines to semantically enable the data services. The Semantically-Enabled Science Data Integration (SESDI) project implemented data integration capabilities among three sub-disciplines: solar radiation, volcanic outgassing, and atmospheric structure, using extensions to existing modular ontologies, and used the VSTO data framework while adding smart faceted search and semantic data registration tools. The Semantic Provenance Capture in Data Ingest Systems (SPCDIS) has added explanation provenance capabilities to an observational data ingest pipeline for images of the Sun, providing a set of tools to answer diverse end-user questions such as "Why does this image look bad?". http://tw.rpi.edu/portal/SESF
The Semantic eScience Framework
NASA Astrophysics Data System (ADS)
Fox, P. A.; McGuinness, D. L.
2009-12-01
The goal of this effort is to design and implement a configurable and extensible semantic eScience framework (SESF). Configuration requires research into accommodating different levels of semantic expressivity and user requirements from use cases. Extensibility is being achieved in a modular approach to the semantic encodings (i.e. ontologies) performed in community settings, i.e. an ontology framework into which specific applications all the way up to communities can extend the semantics for their needs. We report on how we are accommodating the rapid advances in semantic technologies and tools and the sustainable software path for the future (certain) technical advances. In addition to a generalization of the current data science interface, we will present plans for an upper-level interface suitable for use by clearinghouses, and/or educational portals, digital libraries, and other disciplines. SESF builds upon previous work in the Virtual Solar-Terrestrial Observatory. The VSTO utilizes leading-edge knowledge representation, query, and reasoning techniques to support knowledge-enhanced search, data access, integration, and manipulation. It encodes term meanings and their inter-relationships in ontologies and uses these ontologies and associated inference engines to semantically enable the data services. The Semantically-Enabled Science Data Integration (SESDI) project implemented data integration capabilities among three sub-disciplines: solar radiation, volcanic outgassing, and atmospheric structure, using extensions to existing modular ontologies, and used the VSTO data framework while adding smart faceted search and semantic data registration tools. The Semantic Provenance Capture in Data Ingest Systems (SPCDIS) has added explanation provenance capabilities to an observational data ingest pipeline for images of the Sun, providing a set of tools to answer diverse end-user questions such as "Why does this image look bad?".
Ontology Reuse in Geoscience Semantic Applications
NASA Astrophysics Data System (ADS)
Mayernik, M. S.; Gross, M. B.; Daniels, M. D.; Rowan, L. R.; Stott, D.; Maull, K. E.; Khan, H.; Corson-Rikert, J.
2015-12-01
The tension between local ontology development and wider ontology connections is fundamental to the Semantic web. It is often unclear, however, what the key decision points should be for new semantic web applications in deciding when to reuse existing ontologies and when to develop original ontologies. In addition, with the growth of semantic web ontologies and applications, new semantic web applications can struggle to efficiently and effectively identify and select ontologies to reuse. This presentation will describe the ontology comparison, selection, and consolidation effort within the EarthCollab project. UCAR, Cornell University, and UNAVCO are collaborating on the EarthCollab project to use semantic web technologies to enable the discovery of the research output from a diverse array of projects. The EarthCollab project is using the VIVO Semantic web software suite to increase discoverability of research information and data related to the following two geoscience-based communities: (1) the Bering Sea Project, an interdisciplinary field program whose data archive is hosted by NCAR's Earth Observing Laboratory (EOL), and (2) diverse research projects informed by geodesy through the UNAVCO geodetic facility and consortium. This presentation will outline of EarthCollab use cases, and provide an overview of key ontologies being used, including the VIVO-Integrated Semantic Framework (VIVO-ISF), Global Change Information System (GCIS), and Data Catalog (DCAT) ontologies. We will discuss issues related to bringing these ontologies together to provide a robust ontological structure to support the EarthCollab use cases. It is rare that a single pre-existing ontology meets all of a new application's needs. New projects need to stitch ontologies together in ways that fit into the broader semantic web ecosystem.
Predictive Models and Computational Toxicology (II IBAMTOX)
EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Kyri; Toomey, Bridget
Evolving power systems with increasing levels of stochasticity create a need to solve optimal power flow problems with large numbers of random variables. Weather forecasts, electricity prices, and shifting load patterns introduce higher levels of uncertainty and can yield optimization problems that are difficult to solve efficiently. Solution methods for single chance constraints in optimal power flow problems have been considered in the literature, ensuring individual constraints are satisfied with a prescribed probability; however, joint chance constraints, ensuring multiple constraints are simultaneously satisfied, have predominantly been handled via scenario-based approaches or by utilizing Boole's inequality as an upper bound. In this paper, joint chance constraints are used to solve an AC optimal power flow problem while preventing overvoltages in distribution grids under high penetrations of photovoltaic systems. A tighter version of Boole's inequality is derived and used to provide a new upper bound on the joint chance constraint, and simulation results demonstrate the benefit of the proposed bound. The new framework allows for a less conservative and more computationally efficient treatment of joint chance constraints, specifically for preventing overvoltages.
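The baseline union-bound step that the paper tightens can be sketched as follows. This is the standard Boole allocation under assumed Gaussian voltage magnitudes, with all numbers illustrative rather than taken from the paper:

```python
from statistics import NormalDist

def boole_tightening(means, stds, v_max, eps_joint):
    """Enforce P(all V_i <= v_max) >= 1 - eps_joint via Boole's inequality:
    split the joint violation budget eps_joint uniformly over the m
    constraints, then tighten each one with the Gaussian quantile."""
    m = len(means)
    eps_i = eps_joint / m                 # union-bound budget per constraint
    z = NormalDist().inv_cdf(1 - eps_i)   # one-sided Gaussian quantile
    # mu_i + z*sigma_i <= v_max implies P(V_i > v_max) <= eps_i, so the
    # probability of any violation is at most sum(eps_i) = eps_joint.
    return [mu + z * sd <= v_max for mu, sd in zip(means, stds)]
```

Scenario approaches replace the quantile step with sampled constraints; the paper's contribution is a tighter, less conservative replacement for the uniform budget split.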
Smoothed low rank and sparse matrix recovery by iteratively reweighted least squares minimization.
Lu, Canyi; Lin, Zhouchen; Yan, Shuicheng
2015-02-01
This paper presents a general framework for solving low-rank and/or sparse matrix minimization problems, which may involve multiple nonsmooth terms. The iteratively reweighted least squares (IRLS) method is a fast solver which smooths the objective function and minimizes it by alternately updating the variables and their weights. However, traditional IRLS can only solve sparse-only or low-rank-only minimization problems with squared loss or an affine constraint. This paper generalizes IRLS to solve joint/mixed low-rank and sparse minimization problems, which are essential formulations for many tasks. As a concrete example, we solve the Schatten-p norm and l2,q-norm regularized low-rank representation problem by IRLS, and theoretically prove that the derived solution is a stationary point (globally optimal if p,q ≥ 1). Our convergence proof of IRLS is more general than previous ones, which depend on the special properties of the Schatten-p norm and l2,q-norm. Extensive experiments on both synthetic and real data sets demonstrate that our IRLS is much more efficient.
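A minimal sketch of the reweighting idea, reduced to the sparse-only case with squared loss (the paper's generalization covers joint low-rank and sparse terms); the values of `lam`, `eps`, and the iteration count are illustrative:

```python
import numpy as np

def irls_sparse(A, b, lam=1e-3, q=1.0, eps=1e-6, iters=50):
    """IRLS for min_x ||Ax - b||^2 + lam * ||x||_q^q, using the smoothed
    weights w_i = (x_i^2 + eps)^(q/2 - 1) so that each step is a weighted
    ridge regression with a closed-form solution."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        w = (x**2 + eps) ** (q / 2 - 1)   # reweight from the current iterate
        # normal equations of the weighted ridge step
        x = np.linalg.solve(A.T @ A + lam * np.diag(w), A.T @ b)
    return x
```

Small weights on large entries and large weights on near-zero entries push the iterates toward a sparse solution; the smoothing term `eps` is what keeps the weights bounded, which is the key to the convergence analysis.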
A medical ontology for intelligent web-based skin lesions image retrieval.
Maragoudakis, Manolis; Maglogiannis, Ilias
2011-06-01
Researchers have applied increasing efforts towards providing formal computational frameworks to consolidate the plethora of concepts and relations used in the medical domain. In the domain of skin related diseases, the variability of semantic features contained within digital skin images is a major barrier to the medical understanding of the symptoms and development of early skin cancers. The desideratum of making these standards machine-readable has led to their formalization in ontologies. In this work, in an attempt to enhance an existing Core Ontology for skin lesion images, hand-coded from image features, high quality images were analyzed by an autonomous ontology creation engine. We show that by exploiting agglomerative clustering methods with distance criteria upon the existing ontological structure, the original domain model could be enhanced with new instances, attributes and even relations, thus allowing for better classification and retrieval of skin lesion categories from the web.
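The clustering step can be illustrated with a minimal single-linkage agglomerative procedure over image feature vectors; the feature values and distance cutoff below are hypothetical, and each resulting cluster would be a candidate new instance group for the ontology:

```python
import numpy as np

def agglomerative(points, threshold):
    """Single-linkage agglomerative clustering with a distance cutoff:
    repeatedly merge the two closest clusters until the minimum
    inter-cluster distance exceeds the threshold."""
    clusters = [[i] for i in range(len(points))]
    d = lambda a, b: min(np.linalg.norm(points[i] - points[j])
                         for i in a for j in b)
    while len(clusters) > 1:
        (i, j), dist = min(
            (((i, j), d(clusters[i], clusters[j]))
             for i in range(len(clusters)) for j in range(i + 1, len(clusters))),
            key=lambda t: t[1])
        if dist > threshold:
            break
        clusters[i] += clusters.pop(j)   # merge the closest pair
    return clusters
```

The distance criterion applied on top of the existing ontological structure is what lets a cluster suggest not just new instances but new attributes or relations.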
Intelligent search in Big Data
NASA Astrophysics Data System (ADS)
Birialtsev, E.; Bukharaev, N.; Gusenkov, A.
2017-10-01
An approach to data integration, aimed at ontology-based intelligent search in Big Data, is considered for the case when information objects are represented as relational databases (RDBs), structurally marked by their schemas. The source of information for constructing an ontology and, later on, organizing the search is natural-language text treated as semi-structured data; for the RDBs, this text consists of the comments on the names of tables and their attributes. A formal definition of the RDB integration model in terms of ontologies is given. Within the framework of the model, a universal RDB representation ontology, an oil-production subject-domain ontology, and a linguistic thesaurus of the subject-domain language are built. A technique for the automatic generation of SQL queries for subject-domain specialists is proposed. On this basis, an information system for the RDBs of the TATNEFT oil-producing company was implemented. Operation of the system has shown good relevance for the majority of queries.
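The query-generation step can be sketched as a lookup chain; the thesaurus entries, table names, and column names below are hypothetical stand-ins for the annotations mined from the RDB schema comments:

```python
# hypothetical thesaurus (user term -> ontology concept) and RDB annotations
# (concept -> table, column) derived from schema comments
THESAURUS = {"well output": "oil_production", "bore": "well"}
ONTOLOGY = {"oil_production": ("production", "volume"),
            "well": ("wells", "well_id")}

def generate_sql(measure_term, entity_term):
    """Sketch of ontology-driven SQL generation: normalize the user's terms
    via the thesaurus, resolve the table/column annotations, and emit the
    join query a subject-domain specialist would otherwise write by hand."""
    table, column = ONTOLOGY[THESAURUS.get(measure_term, measure_term)]
    ent_table, ent_key = ONTOLOGY[THESAURUS.get(entity_term, entity_term)]
    return f"SELECT {ent_key}, {column} FROM {table} JOIN {ent_table} USING ({ent_key})"
```

The point of routing through the thesaurus first is that specialists can query in subject-domain vocabulary without knowing the physical schema.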
Bio-ontologies: current trends and future directions
Bodenreider, Olivier; Stevens, Robert
2006-01-01
In recent years, as a knowledge-based discipline, bioinformatics has been made more computationally amenable. After its beginnings as a technology advocated by computer scientists to overcome problems of heterogeneity, ontology has been taken up by biologists themselves as a means to consistently annotate features from genotype to phenotype. In medical informatics, artifacts called ontologies have been used for a longer period of time to produce controlled lexicons for coding schemes. In this article, we review the current position in ontologies and how they have become institutionalized within biomedicine. As the field has matured, the much older philosophical aspects of ontology have come into play. With this and the institutionalization of ontology has come greater formality. We review this trend and what benefits it might bring to ontologies and their use within biomedicine. PMID:16899495
A Monte-Carlo game theoretic approach for Multi-Criteria Decision Making under uncertainty
NASA Astrophysics Data System (ADS)
Madani, Kaveh; Lund, Jay R.
2011-05-01
Game theory provides a useful framework for studying Multi-Criteria Decision Making problems. This paper suggests modeling Multi-Criteria Decision Making problems as strategic games and solving them using non-cooperative game theory concepts. The suggested method can be used both to prescribe non-dominated solutions and to predict the outcome of a decision making problem. Non-cooperative stability definitions for solving the games allow consideration of non-cooperative behaviors, often neglected by other methods which assume perfect cooperation among decision makers. To deal with uncertainty in input variables, a Monte-Carlo Game Theory (MCGT) approach is suggested which maps the stochastic problem into many deterministic strategic games. The games are solved using non-cooperative stability definitions, and the results capture the possible effects of uncertainty in input variables on outcomes. The method can handle multi-criteria multi-decision-maker problems with uncertainty. The suggested method does not require criteria weighting, a compound decision objective, or accurate quantitative (cardinal) information, as it simplifies the decision analysis by solving problems based on qualitative (ordinal) information, reducing the computational burden substantially. The MCGT method is applied to analyze California's Sacramento-San Joaquin Delta problem. The suggested method provides insights, identifies non-dominated alternatives, and predicts likely decision outcomes.
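The MCGT loop itself is simple to sketch: sample a deterministic game from the uncertain payoffs, solve it with a non-cooperative stability definition (pure-strategy Nash here, one of several definitions the paper considers), and tally outcome frequencies. The prisoner's-dilemma payoffs in the usage below are illustrative:

```python
import random
from collections import Counter

def pure_nash(p1, p2):
    """Pure-strategy Nash equilibria of a two-player game with payoff
    matrices p1 (row player) and p2 (column player)."""
    rows, cols = len(p1), len(p1[0])
    return [(i, j) for i in range(rows) for j in range(cols)
            if p1[i][j] == max(p1[k][j] for k in range(rows))
            and p2[i][j] == max(p2[i][l] for l in range(cols))]

def monte_carlo_game(sample_game, n=1000, seed=0):
    """MCGT sketch: map the stochastic problem onto n deterministic games
    and report how often each outcome is an equilibrium."""
    rng = random.Random(seed)
    tally = Counter()
    for _ in range(n):
        tally.update(pure_nash(*sample_game(rng)))
    return {eq: count / n for eq, count in tally.items()}
```

The output frequencies are exactly the "possible effects of uncertainty on outcomes" the abstract describes: an outcome that is stable in 95% of sampled games is a robust prediction, one stable in 30% is not.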
Duque-Ramos, Astrid; Boeker, Martin; Jansen, Ludger; Schulz, Stefan; Iniesta, Miguela; Fernández-Breis, Jesualdo Tomás
2014-01-01
Objective To (1) evaluate the GoodOD guideline for ontology development by applying the OQuaRE evaluation method and metrics to the ontology artefacts that were produced by students in a randomized controlled trial, and (2) informally compare the OQuaRE evaluation method with gold standard and competency questions based evaluation methods, respectively. Background In the last decades many methods for ontology construction and ontology evaluation have been proposed. However, none of them has become a standard and there is no empirical evidence of comparative evaluation of such methods. This paper brings together GoodOD and OQuaRE. GoodOD is a guideline for developing robust ontologies. It was previously evaluated in a randomized controlled trial employing metrics based on gold standard ontologies and competency questions as outcome parameters. OQuaRE is a method for ontology quality evaluation which adapts the SQuaRE standard for software product quality to ontologies and has been successfully used for evaluating the quality of ontologies. Methods In this paper, we evaluate the effect of training in ontology construction based on the GoodOD guideline within the OQuaRE quality evaluation framework and compare the results with those obtained for the previous studies based on the same data. Results Our results show a significant effect of the GoodOD training over developed ontologies by topics: (a) a highly significant effect was detected in three topics from the analysis of the ontologies of untrained and trained students; (b) both positive and negative training effects with respect to the gold standard were found for five topics. Conclusion The GoodOD guideline had a significant effect over the quality of the ontologies developed. Our results show that GoodOD ontologies can be effectively evaluated using OQuaRE and that OQuaRE is able to provide additional useful information about the quality of the GoodOD ontologies. PMID:25148262
Duque-Ramos, Astrid; Boeker, Martin; Jansen, Ludger; Schulz, Stefan; Iniesta, Miguela; Fernández-Breis, Jesualdo Tomás
2014-01-01
To (1) evaluate the GoodOD guideline for ontology development by applying the OQuaRE evaluation method and metrics to the ontology artefacts that were produced by students in a randomized controlled trial, and (2) informally compare the OQuaRE evaluation method with gold standard and competency questions based evaluation methods, respectively. In the last decades many methods for ontology construction and ontology evaluation have been proposed. However, none of them has become a standard and there is no empirical evidence of comparative evaluation of such methods. This paper brings together GoodOD and OQuaRE. GoodOD is a guideline for developing robust ontologies. It was previously evaluated in a randomized controlled trial employing metrics based on gold standard ontologies and competency questions as outcome parameters. OQuaRE is a method for ontology quality evaluation which adapts the SQuaRE standard for software product quality to ontologies and has been successfully used for evaluating the quality of ontologies. In this paper, we evaluate the effect of training in ontology construction based on the GoodOD guideline within the OQuaRE quality evaluation framework and compare the results with those obtained for the previous studies based on the same data. Our results show a significant effect of the GoodOD training over developed ontologies by topics: (a) a highly significant effect was detected in three topics from the analysis of the ontologies of untrained and trained students; (b) both positive and negative training effects with respect to the gold standard were found for five topics. The GoodOD guideline had a significant effect over the quality of the ontologies developed. Our results show that GoodOD ontologies can be effectively evaluated using OQuaRE and that OQuaRE is able to provide additional useful information about the quality of the GoodOD ontologies.
Generic-distributed framework for cloud services marketplace based on unified ontology.
Hasan, Samer; Valli Kumari, V
2017-11-01
Cloud computing is a pattern for delivering ubiquitous and on-demand computing resources based on a pay-as-you-use financial model. Typically, cloud providers advertise cloud service descriptions in various formats on the Internet. On the other hand, cloud consumers use available search engines (Google and Yahoo) to explore cloud service descriptions and find an adequate service. Unfortunately, general-purpose search engines are not designed to provide a small and complete set of results, which makes the process a big challenge. This paper presents a generic-distributed framework for a cloud services marketplace to automate the cloud service discovery and selection process and remove the barriers between service providers and consumers. Additionally, this work implements two instances of the generic framework by adopting two different matching algorithms: a dominant and recessive attributes algorithm borrowed from gene science, and a semantic similarity algorithm based on a unified cloud service ontology. Finally, this paper presents a unified cloud services ontology and models real-life cloud services according to the proposed ontology. To the best of the authors' knowledge, this is the first attempt to build a cloud services marketplace where cloud providers and cloud consumers can trade cloud services as utilities. In comparison with existing work, the semantic approach reduced the execution time by 20% and maintained the same values for all other parameters. On the other hand, the dominant and recessive attributes approach reduced the execution time by 57% but showed a lower value for recall.
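A minimal stand-in for the matching step: rank advertised services by term overlap with the consumer's request. The real framework computes similarity over the unified cloud-service ontology rather than raw set overlap, and the catalogue below is invented:

```python
def jaccard(a, b):
    """Set-overlap similarity between two term collections."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def match_service(request_terms, catalogue):
    """Return the advertised service whose description terms best match
    the consumer's request terms."""
    return max(catalogue, key=lambda name: jaccard(request_terms, catalogue[name]))
```

Replacing `jaccard` with an ontology-aware measure (so that, e.g., "VM" and "compute instance" count as a match) is what separates the paper's semantic approach from keyword search engines.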
Six methodological steps to build medical data warehouses for research.
Szirbik, N B; Pelletier, C; Chaussalet, T
2006-09-01
We propose a simple methodology for heterogeneous data collection and central repository-style database design in healthcare. Our method can be used with or without other software development frameworks, and we argue that its application can save a considerable amount of implementation effort. We also believe that the method can be used in other fields of research, especially those with a strongly interdisciplinary nature. The idea emerged during a healthcare research project which involved, among other things, grouping information from heterogeneous and distributed information sources. We developed this methodology from the lessons learned while building a data repository containing information about flows of elderly patients in the UK's long-term care (LTC) system. We explain thoroughly the aspects that influenced the construction of the methodology. The methodology is defined by six steps, which can be aligned with various iterative development frameworks; we describe here its alignment with the RUP (Rational Unified Process) framework. The methodology emphasizes current trends, such as early identification of critical requirements, data modelling, close and timely interaction with users and stakeholders, ontology building, quality management, and exception handling. Of special interest is the ontological engineering aspect, which had the highest-impact effects after the project: it helped stakeholders perform better collaborative negotiations that brought better solutions for the overall system investigated. Insight into the problems faced by others helps lead negotiators to win-win situations. We consider that this should be the social result of any project that collects data for better decision making, leading ultimately to enhanced global outcomes.
Embodied Interactions in Human-Machine Decision Making for Situation Awareness Enhancement Systems
2016-06-09
characterize differences in spatial navigation strategies in a complex task, the Traveling Salesman Problem (TSP). For the second year, we developed... visual processing, leading to better solutions for spatial optimization problems. I will develop a framework to determine which body expressions best... methods include systematic characterization of gestures during complex problem solving. Subject terms: embodied interaction, gestures, one-shot
Tracing the foundations of a conceptual framework for a patient safety ontology.
Runciman, William B; Baker, G Ross; Michel, Philippe; Dovey, Susan; Lilford, Richard J; Jensen, Natasja; Flin, Rhona; Weeks, William B; Lewalle, Pierre; Larizgoitia, Itziar; Bates, David
2010-12-01
In work for the World Alliance for Patient Safety on research methods and measures and on defining key concepts for an International Patient Safety Classification (ICPS), it became apparent that there was a need to try to understand how the meaning of patient safety and underlying concepts relate to the existing safety and quality frameworks commonly used in healthcare. To unfold the concept of patient safety and how it relates to safety and quality frameworks commonly used in healthcare and to trace the evolution of the ICPS framework as a basis of the electronic capture of the component elements of patient safety. The ICPS conceptual framework for patient safety has its origins in existing frameworks and an international consultation process. Although its 10 classes and their semantic relationships may be used as a reference model for different disciplines, it must remain dynamic in the ever-changing world of healthcare. By expanding the ICPS by examining data from all available sources, and ensuring rigorous compliance with the latest principles of informatics, a deeper interdisciplinary approach will progressively be developed to address the complex, refractory problem of reducing healthcare-associated harm.
A Planning and Evaluation Six-Pack for Sustainable Organizations: The Six-P Framework
ERIC Educational Resources Information Center
Marker, Anthony; Johnsen, Elizabeth; Caswell, Christina
2009-01-01
As performance improvement (PI) practitioners, we seek not only to solve organizational problems but also to add value. For some time, we have focused on financial value. However, we are beginning to be held accountable also for the impact of our interventions on society and the environment. The Six-P framework--proposed here--can help PI…
Science Education as Public and Social Wealth: The Notion of Citizenship from a European Perspective
ERIC Educational Resources Information Center
Siatras, Anastasios; Koumaras, Panagiotis
2013-01-01
In this paper, (a) we present a framework for developing a science content (i.e., science concepts, scientific methods, scientific mindset, and problem-solving strategies for socio-scientific issues) used to design the new Cypriot science curriculum aiming at ensuring a democratic and human society, (b) we use the previous framework to explore the…
Drolet, Marie-Josée; Hudon, Anne
2015-02-01
In the past, several researchers in the field of physiotherapy have asserted that physiotherapy clinicians rarely use ethical knowledge to solve ethical issues raised by their practice. Does this assertion still hold true? Do the theoretical frameworks used by researchers and clinicians allow them to analyze thoroughly the ethical issues they encounter in their everyday practice? In our quest for answers, we conducted a literature review and analyzed the ethical theoretical frameworks used by physiotherapy researchers and clinicians to discuss the ethical issues raised by private physiotherapy practice. Our final analysis corpus consisted of thirty-nine texts. Our main finding is that researchers and clinicians in physiotherapy rarely use ethical knowledge to analyze the ethical issues raised in their practice and that gaps exist in the theoretical frameworks currently used to analyze these issues. Consequently, we developed, for ethical analysis, a four-part prism which we have called the Quadripartite Ethical Tool (QET). This tool can be incorporated into existing theoretical frameworks to enable professionals to integrate ethical knowledge into their ethical analyses. The innovative particularity of the QET is that it encompasses three ethical theories (utilitarianism, deontologism, and virtue ethics) and an axiological ontology (professional values), and draws on both deductive and inductive approaches. It is our hope that this new tool will help researchers and clinicians integrate ethical knowledge into their analysis of ethical issues and contribute to fostering ethical analyses that are grounded in relevant philosophical and axiological foundations.
Generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB.
Lee, Leng-Feng; Umberger, Brian R
2016-01-01
Computer modeling, simulation and optimization are powerful tools that have seen increased use in biomechanics research. Dynamic optimizations can be categorized as either data-tracking or predictive problems. The data-tracking approach has been used extensively to address human movement problems of clinical relevance. The predictive approach also holds great promise, but has seen limited use in clinical applications. Enhanced software tools would facilitate the application of predictive musculoskeletal simulations to clinically-relevant research. The open-source software OpenSim provides tools for generating tracking simulations but not predictive simulations. However, OpenSim includes an extensive application programming interface that permits extending its capabilities with scripting languages such as MATLAB. In the work presented here, we combine the computational tools provided by MATLAB with the musculoskeletal modeling capabilities of OpenSim to create a framework for generating predictive simulations of musculoskeletal movement based on direct collocation optimal control techniques. In many cases, the direct collocation approach can be used to solve optimal control problems considerably faster than traditional shooting methods. Cyclical and discrete movement problems were solved using a simple 1 degree of freedom musculoskeletal model and a model of the human lower limb, respectively. The problems could be solved in reasonable amounts of time (several seconds to 1-2 hours) using the open-source IPOPT solver. The problems could also be solved using the fmincon solver that is included with MATLAB, but the computation times were excessively long for all but the smallest of problems. The performance advantage for IPOPT was derived primarily by exploiting sparsity in the constraints Jacobian. The framework presented here provides a powerful and flexible approach for generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB. This should allow researchers to more readily use predictive simulation as a tool to address clinical conditions that limit human mobility.
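The direct collocation idea can be shown on a toy problem (in Python here rather than MATLAB): transcribe a single-integrator optimal control problem into an equality-constrained QP with trapezoidal defect constraints and solve its KKT system directly, a stand-in for the IPOPT/fmincon step in the paper's framework. The known optimum is the constant control u = 1:

```python
import numpy as np

def collocation_demo(N=10):
    """Direct collocation on a toy problem: single integrator x' = u with
    x(0)=0, x(1)=1, minimizing the trapezoidal integral of u^2. States and
    controls at the N+1 nodes are decision variables; the transcription is an
    equality-constrained QP, solved here via its KKT system."""
    h, n = 1.0 / N, N + 1
    w = np.full(n, h)
    w[0] = w[-1] = h / 2                        # trapezoid quadrature weights
    Q = np.zeros((2 * n, 2 * n))
    Q[n:, n:] = 2 * np.diag(w)                  # cost acts on the u block only
    C = np.zeros((N + 2, 2 * n))
    d = np.zeros(N + 2)
    for k in range(N):                          # trapezoidal defect constraints
        C[k, k + 1], C[k, k] = 1.0, -1.0
        C[k, n + k] = C[k, n + k + 1] = -h / 2
    C[N, 0] = 1.0                               # boundary condition x(0) = 0
    C[N + 1, n - 1] = 1.0                       # boundary condition x(1) = 1
    d[N + 1] = 1.0
    K = np.block([[Q, C.T], [C, np.zeros((N + 2, N + 2))]])
    sol = np.linalg.solve(K, np.concatenate([np.zeros(2 * n), d]))
    return sol[:n], sol[n:2 * n]                # states, controls
```

The sparsity the abstract credits for IPOPT's speed is visible here: each defect row of the constraints Jacobian `C` touches only four of the decision variables.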
Generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB
Lee, Leng-Feng
2016-01-01
Computer modeling, simulation and optimization are powerful tools that have seen increased use in biomechanics research. Dynamic optimizations can be categorized as either data-tracking or predictive problems. The data-tracking approach has been used extensively to address human movement problems of clinical relevance. The predictive approach also holds great promise, but has seen limited use in clinical applications. Enhanced software tools would facilitate the application of predictive musculoskeletal simulations to clinically-relevant research. The open-source software OpenSim provides tools for generating tracking simulations but not predictive simulations. However, OpenSim includes an extensive application programming interface that permits extending its capabilities with scripting languages such as MATLAB. In the work presented here, we combine the computational tools provided by MATLAB with the musculoskeletal modeling capabilities of OpenSim to create a framework for generating predictive simulations of musculoskeletal movement based on direct collocation optimal control techniques. In many cases, the direct collocation approach can be used to solve optimal control problems considerably faster than traditional shooting methods. Cyclical and discrete movement problems were solved using a simple 1 degree of freedom musculoskeletal model and a model of the human lower limb, respectively. The problems could be solved in reasonable amounts of time (several seconds to 1–2 hours) using the open-source IPOPT solver. The problems could also be solved using the fmincon solver that is included with MATLAB, but the computation times were excessively long for all but the smallest of problems. The performance advantage for IPOPT was derived primarily by exploiting sparsity in the constraints Jacobian. The framework presented here provides a powerful and flexible approach for generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB. This should allow researchers to more readily use predictive simulation as a tool to address clinical conditions that limit human mobility. PMID:26835184
The OBO Foundry: coordinated evolution of ontologies to support biomedical data integration
Smith, Barry; Ashburner, Michael; Rosse, Cornelius; Bard, Jonathan; Bug, William; Ceusters, Werner; Goldberg, Louis J; Eilbeck, Karen; Ireland, Amelia; Mungall, Christopher J; Leontis, Neocles; Rocca-Serra, Philippe; Ruttenberg, Alan; Sansone, Susanna-Assunta; Scheuermann, Richard H; Shah, Nigam; Whetzel, Patricia L; Lewis, Suzanna
2010-01-01
The value of any kind of data is greatly enhanced when it exists in a form that allows it to be integrated with other data. One approach to integration is through the annotation of multiple bodies of data using common controlled vocabularies or ‘ontologies’. Unfortunately, the very success of this approach has led to a proliferation of ontologies, which itself creates obstacles to integration. The Open Biomedical Ontologies (OBO) consortium is pursuing a strategy to overcome this problem. Existing OBO ontologies, including the Gene Ontology, are undergoing coordinated reform, and new ontologies are being created on the basis of an evolving set of shared principles governing ontology development. The result is an expanding family of ontologies designed to be interoperable and logically well formed and to incorporate accurate representations of biological reality. We describe this OBO Foundry initiative and provide guidelines for those who might wish to become involved. PMID:17989687
NASA Astrophysics Data System (ADS)
Li, Ni; Huai, Wenqing; Wang, Shaodan
2017-08-01
C2 (command and control) has been understood to be a critical military component in meeting an increasing demand for rapid information gathering and real-time decision-making in a dynamically changing battlefield environment. In this article, to improve a C2 behaviour model's reusability and interoperability, a behaviour modelling framework was proposed to specify a C2 model's internal modules and a set of interoperability interfaces based on the C-BML (coalition battle management language). WTA (weapon target assignment) is a typical C2 autonomous decision-making behaviour modelling problem. Unlike most WTA problem descriptions, here sensors were considered to be available detection resources and the relationship constraints between weapons and sensors were also taken into account, which brought the formulation much closer to actual applications. A modified differential evolution (MDE) algorithm was developed to solve this high-dimension optimisation problem and obtained an optimal assignment plan with high efficiency. In the case study, we built a simulation system to validate the proposed C2 modelling framework and interoperability interface specification. The new optimisation solution was also used to solve the WTA problem efficiently and successfully.
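The WTA objective with sensor constraints can be made concrete with a tiny exhaustive baseline; all probabilities, values, and the compatibility table are invented, and the paper's modified differential evolution replaces the brute-force search at realistic problem sizes:

```python
from itertools import product

# toy kill probabilities P[w][t], target values, and weapon-sensor-target
# compatibility (all numbers hypothetical)
P = [[0.7, 0.2], [0.3, 0.8]]
VALUE = [5.0, 10.0]
SENSOR_OK = [[True, True], [False, True]]  # weapon 1 lacks sensor cover on target 0

def expected_value(assign):
    """Expected destroyed value when weapon w engages target assign[w]."""
    total = 0.0
    for t, v in enumerate(VALUE):
        survive = 1.0
        for w, tw in enumerate(assign):
            if tw == t:
                survive *= 1.0 - P[w][t]   # independent engagements
        total += v * (1.0 - survive)
    return total

def wta_bruteforce():
    """Exhaustive baseline for the constrained WTA problem: enumerate all
    assignments, keep those with sensor coverage, return the best."""
    feasible = (a for a in product(range(len(VALUE)), repeat=len(P))
                if all(SENSOR_OK[w][t] for w, t in enumerate(a)))
    return max(feasible, key=expected_value)
```

Treating sensors as resources prunes the search space before the optimiser runs, which is exactly where the weapon-sensor relationship constraints enter the formulation.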
Efficient computation of optimal actions.
Todorov, Emanuel
2009-07-14
Optimal choice of actions is a fundamental problem relevant to fields as diverse as neuroscience, psychology, economics, computer science, and control engineering. Despite this broad relevance, the abstract setting is similar: we have an agent choosing actions over time, an uncertain dynamical system whose state is affected by those actions, and a performance criterion that the agent seeks to optimize. Solving problems of this kind remains hard, in part, because of overly generic formulations. Here, we propose a more structured formulation that greatly simplifies the construction of optimal control laws in both discrete and continuous domains. An exhaustive search over actions is avoided and the problem becomes linear. This yields algorithms that outperform Dynamic Programming and Reinforcement Learning, and thereby solve traditional problems more efficiently. Our framework also enables computations that were not possible before: composing optimal control laws by mixing primitives, applying deterministic methods to stochastic systems, quantifying the benefits of error tolerance, and inferring goals from behavioral data via convex optimization. Development of a general class of easily solvable problems tends to accelerate progress--as linear systems theory has done, for example. Our framework may have similar impact in fields where optimal choice of actions is relevant.
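In the discrete case, the "problem becomes linear" claim refers to the desirability function z = exp(-v): the Bellman equation turns into a linear fixed-point equation in z. A minimal sketch on a hypothetical three-state chain with an absorbing goal state:

```python
import numpy as np

def desirability(q, P, iters=200):
    """Linearly-solvable MDP sketch: iterate z = exp(-q) * (P @ z), where q
    is the state cost and P the passive (uncontrolled) dynamics. Then
    v = -log(z) is the optimal cost-to-go, and the optimal controlled
    transition is u*(s'|s) proportional to P[s, s'] * z[s']."""
    z = np.ones(len(q))
    for _ in range(iters):
        z = np.exp(-q) * (P @ z)
        z /= z.max()             # rescale; the zero-cost goal state pins the scale
    return z
```

No maximization over actions appears anywhere in the loop, which is the "exhaustive search over actions is avoided" point: the optimal policy is read off from z after the linear iteration converges.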
Klein-Weyl's program and the ontology of gauge and quantum systems
NASA Astrophysics Data System (ADS)
Catren, Gabriel
2018-02-01
We distinguish two orientations in Weyl's analysis of the fundamental role played by the notion of symmetry in physics, namely an orientation inspired by Klein's Erlangen program and a phenomenological-transcendental orientation. By privileging the former to the detriment of the latter, we sketch a group(oid)-theoretical program--that we call the Klein-Weyl program--for the interpretation of both gauge theories and quantum mechanics in a single conceptual framework. This program is based on Weyl's notion of a "structure-endowed entity" equipped with a "group of automorphisms". First, we analyze what Weyl calls the "problem of relativity" in the frameworks provided by special relativity, general relativity, and Yang-Mills theories. We argue that both general relativity and Yang-Mills theories can be understood in terms of a localization of Klein's Erlangen program: while the latter describes the group-theoretical automorphisms of a single structure (such as homogeneous geometries), local gauge symmetries and the corresponding gauge fields (Ehresmann connections) can be naturally understood in terms of the groupoid-theoretical isomorphisms in a family of identical structures. Second, we argue that quantum mechanics can be understood in terms of a linearization of Klein's Erlangen program. This stance leads us to an interpretation of the fact that quantum numbers are "indices characterizing representations of groups" ((Weyl, 1931a), p.xxi) in terms of a correspondence between the ontological categories of identity and determinateness.
Li, Jia; Xia, Yunni; Luo, Xin
2014-01-01
OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether the process meets their quantitative quality requirements. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing the non-Markovian stochastic Petri net (NMSPN) and a time series model. The framework includes the following steps: obtaining the historical response-time series of individual service components; fitting these series with an autoregressive moving-average (ARMA) model and predicting the future firing rates of service components; mapping the OWL-S process into an NMSPN model; and employing the predicted firing rates as the model input of the NMSPN and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between the static model and our approach based on experimental data is presented, showing that our approach achieves higher prediction accuracy.
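The forecasting step can be sketched with an AR(1) fit, a deliberate simplification of the ARMA model used in the paper; the response-time series in the usage below is synthetic:

```python
import numpy as np

def ar1_forecast(series, steps=1):
    """Fit y_t = c + phi * y_{t-1} by least squares and roll the model
    forward. The predicted firing rate fed into the NMSPN model would then
    be 1 / (predicted response time)."""
    phi, c = np.polyfit(series[:-1], series[1:], 1)   # slope, intercept
    forecasts, y = [], series[-1]
    for _ in range(steps):
        y = c + phi * y          # one-step-ahead recursion
        forecasts.append(y)
    return np.array(forecasts)
```

Feeding forecast firing rates, rather than historical averages, into the Petri net is what makes the framework's reliability estimate "dynamic".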
Interoperable cross-domain semantic and geospatial framework for automatic change detection
NASA Astrophysics Data System (ADS)
Kuo, Chiao-Ling; Hong, Jung-Hong
2016-01-01
With the increasingly diverse types of geospatial data established over the last few decades, semantic interoperability in integrated applications has attracted much interest in the field of Geographic Information Systems (GIS). This paper proposes a new strategy and framework to process cross-domain geodata at the semantic level. The framework leverages the semantic equivalence of concepts between domains through a bridge ontology and facilitates the integrated use of data from different domains, which has long been considered an essential strength of GIS but is impeded by the lack of understanding of the semantics implicitly hidden in the data. We choose the task of change detection to demonstrate how the introduction of ontology concepts can effectively make such integration possible. We analyze the common properties of geodata and change detection factors, then construct rules and summarize possible change scenarios for making final decisions. The use of topographic map data to detect changes in land use shows promising success in terms of improved efficiency and level of automation. We believe the ontology-oriented approach will enable a new way of integrating data across different domains from the perspective of semantic interoperability, and even open a new dimension for future GIS.
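The bridge-ontology idea above can be sketched minimally: equivalences between concepts of two domains are recorded once, and a change rule fires when the mapped concepts disagree. The class names and the rule below are hypothetical illustrations, not the authors' actual ontology:

```python
# Hedged sketch: cross-domain change detection via a bridge of concept
# equivalences (topographic-map classes -> land-use classes). All class
# names are invented for illustration.

BRIDGE = {
    "Building": "BuiltUpArea",
    "Paddy": "Agriculture",
    "Forest": "ForestLand",
}

def detect_change(topo_class_old, landuse_class_new):
    """Flag a change when the old topographic class maps to a land-use
    concept different from the newly observed one."""
    expected = BRIDGE.get(topo_class_old)
    if expected is None:
        return "unknown"      # no cross-domain equivalence recorded
    return "changed" if expected != landuse_class_new else "unchanged"

print(detect_change("Paddy", "BuiltUpArea"))  # a paddy became built-up
```

The "unknown" branch matters in practice: a missing equivalence should trigger manual review rather than be silently treated as unchanged.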
Ontology-Driven Search and Triage: Design of a Web-Based Visual Interface for MEDLINE.
Demelo, Jonathan; Parsons, Paul; Sedig, Kamran
2017-02-02
Diverse users need to search health and medical literature to satisfy open-ended goals such as making evidence-based decisions and updating their knowledge. However, doing so is challenging due to at least two major difficulties: (1) articulating information needs using accurate vocabulary and (2) dealing with large document sets returned from searches. Common search interfaces such as PubMed do not provide adequate support for exploratory search tasks. Our objective was to improve support for exploratory search tasks by combining two strategies in the design of an interactive visual interface by (1) using a formal ontology to help users build domain-specific knowledge and vocabulary and (2) providing multi-stage triaging support to help mitigate the information overload problem. We developed a Web-based tool, Ontology-Driven Visual Search and Triage Interface for MEDLINE (OVERT-MED), to test our design ideas. We implemented a custom searchable index of MEDLINE, which comprises approximately 25 million document citations. We chose a popular biomedical ontology, the Human Phenotype Ontology (HPO), to test our solution to the vocabulary problem. We implemented multistage triaging support in OVERT-MED, with the aid of interactive visualization techniques, to help users deal with large document sets returned from searches. Formative evaluation suggests that the design features in OVERT-MED are helpful in addressing the two major difficulties described above. Using a formal ontology seems to help users articulate their information needs with more accurate vocabulary. In addition, multistage triaging combined with interactive visualizations shows promise in mitigating the information overload problem. Our strategies appear to be valuable in addressing the two major problems in exploratory search. Although we tested OVERT-MED with a particular ontology and document collection, we anticipate that our strategies can be transferred successfully to other contexts. 
©Jonathan Demelo, Paul Parsons, Kamran Sedig. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 02.02.2017.
Ontology-Driven Search and Triage: Design of a Web-Based Visual Interface for MEDLINE
2017-01-01
Background Diverse users need to search health and medical literature to satisfy open-ended goals such as making evidence-based decisions and updating their knowledge. However, doing so is challenging due to at least two major difficulties: (1) articulating information needs using accurate vocabulary and (2) dealing with large document sets returned from searches. Common search interfaces such as PubMed do not provide adequate support for exploratory search tasks. Objective Our objective was to improve support for exploratory search tasks by combining two strategies in the design of an interactive visual interface by (1) using a formal ontology to help users build domain-specific knowledge and vocabulary and (2) providing multi-stage triaging support to help mitigate the information overload problem. Methods We developed a Web-based tool, Ontology-Driven Visual Search and Triage Interface for MEDLINE (OVERT-MED), to test our design ideas. We implemented a custom searchable index of MEDLINE, which comprises approximately 25 million document citations. We chose a popular biomedical ontology, the Human Phenotype Ontology (HPO), to test our solution to the vocabulary problem. We implemented multistage triaging support in OVERT-MED, with the aid of interactive visualization techniques, to help users deal with large document sets returned from searches. Results Formative evaluation suggests that the design features in OVERT-MED are helpful in addressing the two major difficulties described above. Using a formal ontology seems to help users articulate their information needs with more accurate vocabulary. In addition, multistage triaging combined with interactive visualizations shows promise in mitigating the information overload problem. Conclusions Our strategies appear to be valuable in addressing the two major problems in exploratory search. 
Although we tested OVERT-MED with a particular ontology and document collection, we anticipate that our strategies can be transferred successfully to other contexts. PMID:28153818
A weak Galerkin generalized multiscale finite element method
Mu, Lin; Wang, Junping; Ye, Xiu
2016-03-31
In this study, we propose a general framework for a weak Galerkin generalized multiscale (WG-GMS) finite element method for elliptic problems with rapidly oscillating or high-contrast coefficients. The general WG-GMS method features high-order accuracy on general meshes and can work with multiscale bases derived from different numerical schemes. A special case is studied within this WG-GMS framework in which the multiscale basis functions are obtained by solving local problems with the weak Galerkin finite element method. Convergence analysis and numerical experiments are presented for this special case.
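The model problem such multiscale methods target is, in generic notation (ours, not reproduced from the paper), the standard second-order elliptic problem with a rough coefficient:

```latex
% Elliptic problem with a rapidly oscillating / high-contrast
% coefficient \kappa (standard setting, notation ours):
-\nabla \cdot \big( \kappa(x)\, \nabla u \big) = f
  \quad \text{in } \Omega,
\qquad
u = 0 \quad \text{on } \partial\Omega .
```

When \kappa varies on a scale far below the mesh size, standard polynomial bases resolve the solution poorly; multiscale bases built from local solves recover the fine-scale behavior on a coarse mesh.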
Solving Homeland Security’s Wicked Problems: A Design Thinking Approach
2015-09-01
This thesis provides a framework for how S&T can incorporate design-thinking principles that are working well in other domains to spur solutions. Galbraith's Star Model was used to analyze how DHS S&T, MindLab, and DARPA apply design-thinking principles to inform the framework.
Sun, Xiao-Qing; Zhu, Rui; Li, Ming; Miao, Wang
2017-01-01
Emergency rescue material reserves are vital for the success of emergency rescue activities. In this study, we consider a situation where a government-owned distribution center and framework agreement suppliers jointly store emergency rescue materials. Using a scenario-based approach to represent demand uncertainty, we propose a comprehensive transportation pattern for the following supply chain: "suppliers—government distribution center—disaster area." Using a joint reserves model that includes the government and framework agreement suppliers, we develop a nonlinear mathematical model that determines the choice of framework suppliers, the corresponding optimal commitment quantities, and the quantity of materials stored at the government distribution center. Finally, we use IBM ILOG CPLEX to solve numerical examples to verify the effectiveness of the model and perform sensitivity analyses on the relevant parameters. PMID:29077722
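The paper solves its nonlinear model with CPLEX; as a toy stand-in under stated assumptions (all probabilities, capacities, and costs below are invented), scenario-based supplier selection can be sketched by brute force, minimizing expected cost over demand scenarios:

```python
# Hedged sketch: choose which framework suppliers to sign so that expected
# procurement-plus-shortage cost over demand scenarios is minimized.
from itertools import combinations

scenarios = [(0.6, 100), (0.3, 300), (0.1, 600)]            # (probability, demand)
suppliers = {"A": (120, 1.0), "B": (250, 1.2), "C": (400, 1.5)}  # capacity, unit cost
GOV_STORE, GOV_COST, SHORTAGE_PENALTY = 150, 0.8, 5.0

def expected_cost(chosen):
    capacity = GOV_STORE + sum(suppliers[s][0] for s in chosen)
    cost = 0.0
    for p, demand in scenarios:
        supplied = min(demand, capacity)
        # draw from the cheapest source first: government stock, then suppliers
        remaining, c = supplied, 0.0
        take = min(remaining, GOV_STORE)
        c += take * GOV_COST
        remaining -= take
        for s in sorted(chosen, key=lambda name: suppliers[name][1]):
            take = min(remaining, suppliers[s][0])
            c += take * suppliers[s][1]
            remaining -= take
        c += (demand - supplied) * SHORTAGE_PENALTY  # penalty for unmet demand
        cost += p * c
    return cost

best = min((subset for r in range(len(suppliers) + 1)
            for subset in combinations(suppliers, r)),
           key=expected_cost)
print(best, round(expected_cost(best), 1))
```

With these toy numbers the high shortage penalty makes signing all three suppliers optimal; in the paper's setting the commitment quantities are decision variables as well, which is what motivates a mathematical-programming solver rather than enumeration.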
Ontology Design Patterns as Interfaces (invited)
NASA Astrophysics Data System (ADS)
Janowicz, K.
2015-12-01
In recent years ontology design patterns (ODPs) have gained popularity among knowledge engineers. ODPs are modular but self-contained building blocks that are reusable and extendible. They minimize ontological commitments and are thereby easier to integrate than large monolithic ontologies. Typically, patterns are not directly used to annotate data or to model certain domain problems but are combined and extended to form data- and purpose-driven local ontologies that serve the needs of specific applications or communities. By relying on a common set of patterns, these local ontologies can be aligned to improve interoperability and enable federated queries without enforcing a top-down model of the domain. In previous work, we introduced ontological views as a layer on top of ontology design patterns to ease the reuse, combination, and integration of patterns. While the literature distinguishes multiple types of patterns, e.g., content patterns or logical patterns, we propose here to use them as interfaces to guide the development of ontology-driven systems.
Possible Solutions as a Concept in Behavior Change Interventions.
Mahoney, Diane E
2018-04-24
Nurses are uniquely positioned to implement behavior change interventions. Yet nursing interventions have traditionally resulted from nurses' problem-solving rather than allowing the patient to self-generate possible solutions for attaining specific health outcomes. The purpose of this review is to clarify the meaning of possible solutions in behavior change interventions. Walker and Avant's method of concept analysis serves as the framework for examining possible solutions. Possible solutions can be defined as continuous strategies initiated by patients and families to overcome existing health problems. As nurses engage in behavior change interventions, supporting patients and families in problem-solving will optimize health outcomes and transform clinical practice. © 2018 NANDA International, Inc.
Stapp's quantum dualism: The James/Heisenberg model of consciousness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noyes, H.P.
1994-02-18
Henry Stapp attempts to resolve the Cartesian dilemma by introducing what the author would characterize as an ontological dualism between mind and matter. His model for mind comes from William James' description of conscious events and for matter from Werner Heisenberg's ontological model for quantum events (wave function collapse). His demonstration of the isomorphism between the two types of events is successful, but in the author's opinion fails to establish a monistic, scientific theory. The author traces Stapp's failure to his adamant rejection of arbitrariness, or "randomness". This makes it impossible for him (or for Bohr and Pauli before him) to understand the power of Darwin's explanation of biology, let alone the triumphs of modern "neo-Darwinism". The author notes that the point at issue is a modern version of the unresolved opposition between Leucippus and Democritus on one side and Epicurus on the other. Stapp's views are contrasted with recent discussions of consciousness by two eminent biologists: Crick and Edelman. They locate the problem firmly in the context of natural selection on the surface of the earth. Their approaches provide a sound basis for further scientific work. The author briefly examines the connection between this scientific (rather than ontological) framework and the new fundamental theory based on bit-strings and the combinatorial hierarchy.
NASA Astrophysics Data System (ADS)
Nomaguch, Yutaka; Fujita, Kikuo
This paper proposes a design support framework, named DRIFT (Design Rationale Integration Framework of Three layers), which dynamically captures and manages hypothesis and verification in the design process. The core of DRIFT is a three-layered design process model of action, model operation, and argumentation. This model integrates various design support tools and captures the design operations performed on them. The action level captures the sequence of design operations. The model operation level captures the transition of design states, recording design snapshots across design tools. The argumentation level captures the process of setting problems and alternatives. Linking the three levels makes it possible to automatically and efficiently capture and manage iterative hypothesis-and-verification processes through design operations over design tools. In DRIFT, such linkage is extracted through templates of design operations, which are derived from the patterns embedded in design tools such as Design-For-X (DFX) approaches, and design tools are integrated through an ontology-based representation of design concepts. An argumentation model, gIBIS (graphical Issue-Based Information System), is used for representing dependencies among problems and alternatives, and a mechanism of TMS (Truth Maintenance System) is used for managing multiple hypothetical design stages. This paper also demonstrates a prototype implementation of DRIFT and its application to a simple design problem, and concludes with a discussion of future issues.
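The argumentation layer follows gIBIS; a minimal sketch of such an issue/position/argument structure (the class names and example content are our own illustration, not DRIFT's actual schema):

```python
# Hedged sketch: gIBIS-style argumentation records as plain dataclasses.
# An Issue holds competing Positions; each Position accumulates supporting
# and objecting Arguments.
from dataclasses import dataclass, field

@dataclass
class Argument:
    text: str
    supports: bool        # True = supports the position, False = objects

@dataclass
class Position:
    text: str
    arguments: list = field(default_factory=list)

@dataclass
class Issue:
    question: str
    positions: list = field(default_factory=list)

issue = Issue("How should the bracket be lightened?")
rib = Position("Add stiffening ribs and thin the walls")
rib.arguments.append(Argument("Keeps stiffness at lower mass", True))
rib.arguments.append(Argument("Raises mold cost", False))
machined = Position("Machine pockets into the solid bracket")
machined.arguments.append(Argument("No new tooling needed", True))
issue.positions.extend([rib, machined])

# pick the position with the best support/objection balance
def score(p):
    return sum(1 if a.supports else -1 for a in p.arguments)

best = max(issue.positions, key=score)
print(best.text, score(best))
```

A truth-maintenance system, as DRIFT uses, would go further: retracting an argument would automatically re-evaluate which hypothetical design stage remains believed.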
ERIC Educational Resources Information Center
Connell, David B.
1982-01-01
General systems theory provides a theoretical framework for understanding stress and formulating problem-solving strategies. Both individuals and schools are systems, and general systems theory enables one to ask whether they are operating harmoniously and communicating effectively. (Author/RW)
A Large-scale Distributed Indexed Learning Framework for Data that Cannot Fit into Memory
2015-03-27
learn a classifier. Integrating three learning techniques (online, semi-supervised, and active learning) with selective sampling, while requiring minimal communication between the server and the clients, solved this problem.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Unseren, M.A.
A general framework for solving the dynamic load distribution when two manipulators hold a rigid object is proposed. The underspecified problem of solving for the contact forces and torques based on the object's equations of motion is transformed into a well-specified problem. This is accomplished by augmenting the object's equations of motion with additional equations which relate a new vector variable quantifying the internal contact force and torque degrees of freedom (DOF) as a linear function of the contact forces and torques. The resulting augmented system yields a well-specified solution for the contact forces and torques in which they are separated into their motion-inducing and internal components. A particular solution is suggested which enables the designer to conveniently specify what portion of the payload's mass each manipulator is to bear. It is also shown that the results of the previous work are just a special case of the general load distribution framework described here.
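In generic notation (ours, not the report's), the decomposition described above can be sketched as follows: the two contact wrenches must jointly produce the object's motion, while their internal components produce no net wrench on the object:

```latex
% Object dynamics shared by the two contact wrenches F_1, F_2
% (M = object inertia, b = velocity-dependent terms; notation ours):
F_1 + F_2 = M\ddot{x} + b(x,\dot{x}),
\qquad
F_i = F_i^{\mathrm{motion}} + F_i^{\mathrm{internal}},
\qquad
F_1^{\mathrm{internal}} + F_2^{\mathrm{internal}} = 0 .
```

The motion equations constrain only the sum of the wrenches, which is why the split into motion-inducing and internal parts leaves a free parameter the designer can use to apportion the payload between the arms.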
Theory and ontology for sharing temporal knowledge
NASA Technical Reports Server (NTRS)
Loganantharaj, Rasiah
1996-01-01
Using current technology, the sharing or re-use of knowledge bases is very difficult, if not impossible. ARPA has correctly recognized the problem and funded a knowledge sharing initiative. One of the outcomes of this project is a formal language called Knowledge Interchange Format (KIF) for representing knowledge that can be translated into other languages. Capturing and representing design knowledge and reasoning with it have become very important for NASA, a pioneer of innovative design of unique products. To upgrade an existing design for changing technology, needs, or requirements, it is essential to understand the design rationale, design choices, options, and other relevant information associated with the design. Capturing such information and presenting it in the appropriate form are part of the ongoing Design Knowledge Capture project of NASA. The behavior of an object and various other aspects related to time are captured by the appropriate temporal knowledge. The captured design knowledge will be represented in such a way that the various groups at NASA interested in different aspects of the design cycle can access and use the design knowledge effectively. To facilitate knowledge sharing among these groups, one has to develop a well-defined ontology. An ontology is a specification of a conceptualization. In the literature, several specific domains have been studied and well-defined ontologies developed for them. However, little or no work has been done in the area of representing temporal knowledge to facilitate sharing. During the ASEE summer program, I investigated several temporal models and proposed a theory of time that is flexible enough to accommodate time elements, such as points and intervals, and that is capable of handling qualitative and quantitative temporal constraints. I also proposed a primitive temporal ontology from which other relevant temporal ontologies can be built.
I have investigated various issues of sharing knowledge and have proposed a formal framework for modeling the concept of knowledge sharing. This work may be implemented and tested in the software environment supplied by Knowledge Based System, Inc.
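A point/interval theory with qualitative constraints is commonly grounded in Allen's interval algebra; the minimal checker below is an illustration of that standard formalism, not the ontology proposed in the report:

```python
# Hedged sketch: classify the qualitative relation between two closed
# intervals a=(start, end), b=(start, end) using Allen's 13 relations.

def allen_relation(a, b):
    (a1, a2), (b1, b2) = a, b
    if a2 < b1:
        return "before"
    if b2 < a1:
        return "after"
    if a2 == b1:
        return "meets"
    if b2 == a1:
        return "met-by"
    if a == b:
        return "equal"
    if a1 == b1:
        return "starts" if a2 < b2 else "started-by"
    if a2 == b2:
        return "finishes" if a1 > b1 else "finished-by"
    if b1 < a1 and a2 < b2:
        return "during"
    if a1 < b1 and b2 < a2:
        return "contains"
    return "overlaps" if a1 < b1 else "overlapped-by"

print(allen_relation((1, 3), (2, 5)))
```

Points fit the same scheme as degenerate intervals with equal endpoints, which is one way a primitive ontology can accommodate both time elements; quantitative constraints additionally bound the endpoint differences.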
Ontology Design of Influential People Identification Using Centrality
NASA Astrophysics Data System (ADS)
Maulana Awangga, Rolly; Yusril, Muhammad; Setyawan, Helmi
2018-04-01
Identifying influential people as nodes in a graph is commonly done through social network analysis, in which users are nodes and relations are edges forming a friendship graph. This research considers the distinct meaning of each relation between nodes in the social network. Ontology is well suited to describing social network data conceptually and by domain, and it captures more of the essential relationships in a social network than a plain graph does. Ontology has been proposed as a standard for knowledge representation for the Semantic Web by the World Wide Web Consortium, with formal data representation using the Resource Description Framework (RDF) and the Web Ontology Language (OWL), which are strategic for open knowledge-based website data. Because ontology describes relationships in the social network semantically, a semantics-based relationship ontology can be developed by adding and modifying various relationships so that influential people can be identified. This research proposes a model using OWL and RDF for identifying influential people in a social network. The study uses degree centrality, betweenness centrality, and closeness centrality measurements for data validation. In conclusion, influential people in Facebook can be identified with the proposed ontology model over the Group, Photos, Photo Tag, Friends, Events, and Works data.
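Degree centrality, one of the validation measures named above, can be computed directly over RDF-style triples; the triples and predicate names below are hypothetical, not the paper's actual vocabulary:

```python
# Hedged sketch: normalized degree centrality over the subgraph induced by
# one relation type in a set of (subject, predicate, object) triples.
from collections import Counter

triples = [
    ("alice", "friendOf", "bob"),
    ("alice", "taggedIn", "photo1"),   # ignored: different relation type
    ("bob",   "friendOf", "carol"),
    ("carol", "friendOf", "alice"),
    ("dave",  "friendOf", "alice"),
]

def degree_centrality(triples, predicate="friendOf"):
    """Degree of each node, normalized by n-1, over one relation type."""
    deg = Counter()
    nodes = set()
    for s, p, o in triples:
        if p != predicate:
            continue
        deg[s] += 1
        deg[o] += 1
        nodes.update((s, o))
    n = len(nodes)
    return {v: deg[v] / (n - 1) for v in nodes}

cent = degree_centrality(triples)
print(max(cent, key=cent.get))  # most influential by degree
```

The ontological point of the paper is visible even in this sketch: because each triple carries a typed predicate, centrality can be computed per relation type (friendship, tagging, events) rather than over one undifferentiated graph.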
Language and Thought in Mathematics Staff Development: A Problem Probing Protocol
ERIC Educational Resources Information Center
Kabasakalian, Rita
2007-01-01
Background/Context: The theoretical framework of the paper comes from research on problem solving, considered by many to be the essence of mathematics; research on the importance of oral language in learning mathematics; and on the importance of the teacher as the primary instrument of learning mathematics for most students. As a nation, we are…
ERIC Educational Resources Information Center
Musanti, Sandra I.; Celedon-Pattichis, Sylvia; Marshall, Mary E.
2009-01-01
This case study investigates a professional development initiative in which a first-grade bilingual teacher engages in learning and teaching Cognitively Guided Instruction, a framework for understanding student thinking through context-rich word-problem lessons. The study explores (a) the impact of classroom-based professional development on a…
An Exploratory Study of a Story Problem Assessment: Understanding Children's Number Sense
ERIC Educational Resources Information Center
Shumway, Jessica F.; Westenskow, Arla; Moyer-Packenham, Patricia S.
2016-01-01
The purpose of this study was to identify and describe students' use of number sense as they solved story problem tasks. Three 8- and 9-year-old students participated in clinical interviews. Through a process of holistic and qualitative coding, researchers used the number sense view as a theoretical framework for exploring how students' number…
A multidimensional framework of conceptual change for developing chemical equilibrium learning
NASA Astrophysics Data System (ADS)
Chanyoo, Wassana; Suwannoi, Paisan; Treagust, David F.
2018-01-01
The purpose of this research is to investigate existing chemical equilibrium lessons in Thailand against the multidimensional framework of conceptual change, to determine how the existing lessons could enhance students' conceptual change. The research was conducted from a qualitative perspective; documents, observations, and interviews were used to collect data. To probe students' conceptions, diagnostic tests were applied, comprising The Chemical Equilibrium Diagnostic Test (the CEDT) and The Chemical Equilibrium Test for Reveal Conceptual Change (the CETforRCC). In addition, to study students' motivation, the Motivated Strategies for Learning Questionnaire (the MSLQ) and students' task engagement were applied. Across each perspective of conceptual change (ontological, epistemological, and social/affective), the results showed that the existing chemical equilibrium unit did not enhance students' conceptual change, and several issues were found. The problems that obstructed students' conceptual change should be remedied under the multidimensional framework of conceptual change. Finally, suggestions are provided for enhancing students' conceptual change in chemical equilibrium effectively.
Investigating and developing engineering students' mathematical modelling and problem-solving skills
NASA Astrophysics Data System (ADS)
Wedelin, Dag; Adawi, Tom; Jahan, Tabassum; Andersson, Sven
2015-09-01
How do engineering students approach mathematical modelling problems and how can they learn to deal with such problems? In the context of a course in mathematical modelling and problem solving, and using a qualitative case study approach, we found that the students had little prior experience of mathematical modelling. They were also inexperienced problem solvers, unaware of the importance of understanding the problem and exploring alternatives, and impeded by inappropriate beliefs, attitudes and expectations. Important impacts of the course belong to the metacognitive domain. The nature of the problems, the supervision and the follow-up lectures were emphasised as contributing to the impacts of the course, where students show major development. We discuss these empirical results in relation to a framework for mathematical thinking and the notion of cognitive apprenticeship. Based on the results, we argue that this kind of teaching should be considered in the education of all engineers.