Sample records for proven knowledge-based approach

  1. A proven knowledge-based approach to prioritizing process information

    NASA Technical Reports Server (NTRS)

    Corsberg, Daniel R.

    1991-01-01

    Many space-related processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect is rapid analysis of the changing process information. During a disturbance, this task can overwhelm humans as well as computers. Humans deal with this by applying heuristics in determining significant information. A simple, knowledge-based approach to prioritizing information is described. The approach models those heuristics that humans would use in similar circumstances. The approach described has received two patents and was implemented in the Alarm Filtering System (AFS) at the Idaho National Engineering Laboratory (INEL). AFS was first developed for application in a nuclear reactor control room. It has since been used in chemical processing applications, where it has had a significant impact on control room environments. The approach uses knowledge-based heuristics to analyze data from process instrumentation and respond to that data according to knowledge encapsulated in objects and rules. While AFS cannot perform the complete diagnosis and control task, it has proven to be extremely effective at filtering and prioritizing information. AFS was used for over two years as a first level of analysis for human diagnosticians. Given the approach's proven track record in a wide variety of practical applications, it should be useful in both ground- and space-based systems.
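The heuristic prioritization this abstract describes can be sketched as a small rule engine. The rules and alarm fields below are illustrative assumptions, not the patented AFS logic:

```python
# Sketch of knowledge-based alarm prioritization (illustrative rules only).

def prioritize(alarms, active):
    """Return alarm ids ordered by priority, suppressing consequence alarms."""
    scored = []
    for alarm in alarms:
        # Heuristic 1: suppress an alarm whose known cause is already active.
        if alarm["caused_by"] in active:
            continue
        # Heuristic 2: weight by severity, boosting newly raised alarms.
        score = alarm["severity"] * (2.0 if alarm["new"] else 1.0)
        scored.append((score, alarm["id"]))
    scored.sort(reverse=True)
    return [alarm_id for _, alarm_id in scored]

alarms = [
    {"id": "LOW_FLOW", "severity": 3, "new": True, "caused_by": "PUMP_TRIP"},
    {"id": "PUMP_TRIP", "severity": 5, "new": True, "caused_by": None},
    {"id": "HI_TEMP", "severity": 2, "new": False, "caused_by": None},
]
print(prioritize(alarms, active={"PUMP_TRIP"}))  # -> ['PUMP_TRIP', 'HI_TEMP']
```

In a real system the rule base would be far richer; the point is that encoded cause-effect knowledge lets the filter discard alarms that add no diagnostic information.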

  2. Bridging the Gap between Scientific Data Producers and Consumers: A Provenance Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephan, Eric G.; Pinheiro da Silva, Paulo; Kleese van Dam, Kerstin

    2013-06-03

    Despite the methodical and painstaking efforts made by scientists to record their scientific findings and protocols, a knowledge gap persists today between producers and consumers of scientific results, because data exchange is performed by technology rather than through direct contact between scientists. Provenance is a means to formalize how this knowledge is transferred. However, for it to be meaningful to scientists, the provenance research community needs continued contributions from the scientific community to extend and leverage provenance-based vocabularies and technologies. Going forward, the provenance community must also be vigilant to meet the scalability needs of data-intensive science.

  3. A Different Approach to the Generation of Patient Management Problems from a Knowledge-Based System

    PubMed Central

    Barriga, Rosa Maria

    1988-01-01

    Several strategies are proposed to approach the generation of Patient Management Problems from a Knowledge Base and avoid inconsistencies in the results. These strategies are based on a different Knowledge Base structure and on the use of case introductions that describe the patient attributes which are not disease-dependent. This methodology has proven effective in a recent pilot test and is on its way to implementation as part of an educational program at the CWRU School of Medicine.

  4. Early Violence Prevention: Tools for Teachers of Young Children.

    ERIC Educational Resources Information Center

    Slaby, Ronald G.; And Others

    Based on the latest knowledge about early violence prevention and effective teaching strategies, this book describes practical ways for early childhood educators to handle children's aggression and shows how to help children become assertive, nonviolent problem solvers. The book's repertoire of proven approaches includes teaching children how to…

  5. Evaluation of Community Health Education Workshops among Chinese Older Adults in Chicago: A Community-Based Participatory Research Approach

    ERIC Educational Resources Information Center

    Dong, Xinqi; Li, Yawen; Chen, Ruijia; Chang, E-Shien; Simon, Melissa

    2013-01-01

    Background: Health education is one of the proven ways to improve knowledge and change health attitudes and behaviors. This study is intended to assess the effectiveness of five health workshops in a Chinese community, focusing on depression, elder abuse, nutrition, breast cancer and stroke. Methods: A community-based participatory research…

  6. Fostering Knowledge and Skills to Teach for Diversity: "There Is Nothing so Practical as a Good Theory"

    ERIC Educational Resources Information Center

    Marchel, Carol A.; Green, Susan K.

    2014-01-01

    Increased use of field-based teacher preparation offers important opportunities to develop skills with diverse learners. However, limited focus on theoretical content restricts understanding and generalization of well-proven theoretical approaches, resulting in fragmented field applications unlikely to result in broad application. Inspired by Kurt…

  7. On acquisition of programming knowledge

    NASA Technical Reports Server (NTRS)

    Amin, Ashok T.

    1987-01-01

    For the evolving discipline of programming, acquisition of programming knowledge is a difficult issue. Common knowledge results from the acceptance of proven techniques based on the results of formal inquiries into the nature of the programming process. This is a rather slow process. In addition, the vast body of common knowledge needs to be explicated to a low enough level of detail for it to be represented in machine-processable form. This is felt to be an impediment to the progress of automatic programming. The importance of formal approaches cannot be overstated, since their contributions lead to quantum leaps in the state of the art.

  8. Don’t Like RDF Reification? Making Statements about Statements Using Singleton Property

    PubMed Central

    Nguyen, Vinh; Bodenreider, Olivier; Sheth, Amit

    2015-01-01

    Statements about RDF statements, or meta triples, provide additional information about individual triples, such as the source, the occurring time or place, or the certainty. Integrating such meta triples into semantic knowledge bases would enable the querying and reasoning mechanisms to be aware of the provenance, time, location, or certainty of triples. However, an efficient RDF representation for such meta knowledge of triples remains challenging. The existing standard reification approach allows such meta knowledge of RDF triples to be expressed in RDF in two steps. The first step is representing the triple by a Statement instance which has its subject, predicate, and object indicated separately in three different triples. The second step is creating assertions about that instance as if it were a statement. While reification is simple and intuitive, this approach does not have formal semantics and is not commonly used in practice, as described in the RDF Primer. In this paper, we propose a novel approach called Singleton Property for representing statements about statements and provide a formal semantics for it. We explain how this singleton property approach fits well with the existing syntax and formal semantics of RDF, and with the syntax of the SPARQL query language. We also demonstrate the use of singleton properties in the representation and querying of meta knowledge in two examples of Semantic Web knowledge bases: YAGO2 and BKR. Our experiments on the BKR show that the singleton property approach gives decent performance in terms of the number of triples, query length, and query execution time compared to existing approaches. This approach, which is also simple and intuitive, can be easily adopted for representing and querying statements about statements in other knowledge bases. PMID:25750938
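The contrast between standard reification and the singleton property can be made concrete. A minimal sketch with triples as Python tuples (prefixes abbreviated; the subject matter is illustrative):

```python
# One base fact plus one piece of meta knowledge (its source),
# expressed two ways. Triples are (subject, predicate, object) tuples.

# Standard reification: a Statement instance described by three triples,
# plus a type triple, before any meta triple can be attached.
reified = [
    (":stmt1", "rdf:type", "rdf:Statement"),
    (":stmt1", "rdf:subject", ":BobDylan"),
    (":stmt1", "rdf:predicate", ":isMarriedTo"),
    (":stmt1", "rdf:object", ":SaraLownds"),
    (":stmt1", ":source", ":Wikipedia"),
]

# Singleton property: a unique property instance stands for the one
# statement, so meta triples attach directly to it.
singleton = [
    (":BobDylan", ":isMarriedTo#1", ":SaraLownds"),
    (":isMarriedTo#1", "rdf:singletonPropertyOf", ":isMarriedTo"),
    (":isMarriedTo#1", ":source", ":Wikipedia"),
]

print(len(reified), len(singleton))  # -> 5 3
```

The singleton form keeps the base fact queryable as an ordinary triple while still carrying per-statement metadata.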

  9. Process-based upscaling of surface-atmosphere exchange

    NASA Astrophysics Data System (ADS)

    Keenan, T. F.; Prentice, I. C.; Canadell, J.; Williams, C. A.; Wang, H.; Raupach, M. R.; Collatz, G. J.; Davis, T.; Stocker, B.; Evans, B. J.

    2015-12-01

    Empirical upscaling techniques such as machine learning and data mining have proven to be invaluable tools for the global scaling of disparate observations of surface-atmosphere exchange, but they are not based on a theoretical understanding of the key processes involved. This makes spatial and temporal extrapolation outside of the training domain difficult at best. There is therefore a clear need to incorporate knowledge of ecosystem function, in combination with the strength of data mining. Here, we present such an approach. We describe a novel diagnostic process-based model of global photosynthesis and ecosystem respiration, which is directly informed by a variety of global datasets relevant to ecosystem state and function. We use the model framework to estimate global carbon cycling both spatially and temporally, with a specific focus on the mechanisms responsible for long-term change. Our results show the importance of incorporating process knowledge into upscaling approaches, and highlight the effect of key processes on the terrestrial carbon cycle.

  10. Knowledge-driven lead discovery.

    PubMed

    Pirard, Bernard

    2005-11-01

    Virtual screening encompasses several computational approaches which have proven valuable for identifying novel leads. These approaches rely on available information. Herein, we review recent successful applications of virtual screening. The extension of virtual screening methodologies to target families is also briefly discussed.

  11. Painful and involuntary Multiple Sclerosis

    PubMed Central

    Bagnato, Francesca; Centonze, Diego; Galgani, Simonetta; Grasso, Maria Grazia; Haggiag, Shalom; Strano, Stefano

    2010-01-01

    Importance of the field: Pain, dysphagia, respiratory problems, sexual and cardiovascular dysfunctions may occur in patients with multiple sclerosis (MS). Areas covered in the field: In the present review we attempt to summarize the current knowledge on the impact that pain, dysphagia, respiratory problems, and sexual and cardiovascular dysfunctions have in patients with MS. What the reader will gain: The current understanding of pain, dysphagia, respiratory problems, and sexual and cardiovascular dysfunctions, and future research perspectives to expand the knowledge of this field. Take home message: To effectively manage MS it is essential that these symptoms are recognised as early as possible and treated by a rehabilitative multidisciplinary approach, based on proven scientific evidence. PMID:21323633

  12. A unified framework for managing provenance information in translational research

    PubMed Central

    2011-01-01

    Background: A critical aspect of the NIH Translational Research roadmap, which seeks to accelerate the delivery of "bench-side" discoveries to the patient's "bedside," is the management of the provenance metadata that keeps track of the origin and history of data resources as they traverse the path from the bench to the bedside and back. A comprehensive provenance framework is essential for researchers to verify the quality of data, reproduce scientific results published in peer-reviewed literature, validate scientific process, and associate trust value with data and results. Traditional approaches to provenance management have focused on only partial sections of the translational research life cycle and they do not incorporate "domain semantics", which is essential to support domain-specific querying and analysis by scientists. Results: We identify a common set of challenges in managing provenance information across the pre-publication and post-publication phases of data in the translational research lifecycle. We define the semantic provenance framework (SPF), underpinned by the Provenir upper-level provenance ontology, to address these challenges in the four stages of provenance metadata: (a) provenance collection, during data generation; (b) provenance representation, to support interoperability, reasoning, and the incorporation of domain semantics; (c) provenance storage and propagation, to allow efficient storage and seamless propagation of provenance as the data is transferred across applications; and (d) provenance query, to support queries of increasing complexity over large data sizes and also support knowledge discovery applications. We apply the SPF to two exemplar translational research projects, namely the Semantic Problem Solving Environment for Trypanosoma cruzi (T.cruzi SPSE) and the Biomedical Knowledge Repository (BKR) project, to demonstrate its effectiveness.
Conclusions: The SPF provides a unified framework to effectively manage the provenance of translational research data during the pre- and post-publication phases. This framework is underpinned by an upper-level provenance ontology called Provenir, which is extended to create domain-specific provenance ontologies to facilitate provenance interoperability, seamless propagation of provenance, automated querying, and analysis. PMID:22126369

  13. Hybrid approach for robust diagnostics of cutting tools

    NASA Astrophysics Data System (ADS)

    Ramamurthi, K.; Hough, C. L., Jr.

    1994-03-01

    A new multisensor-based hybrid technique has been developed for robust diagnosis of cutting tools. The technique combines the concepts of pattern classification and real-time knowledge-based systems (RTKBS) and draws upon their strengths: a learning facility in the case of pattern classification and a higher level of reasoning in the case of RTKBS. It eliminates some of their major drawbacks: false alarms or delayed/lack of diagnosis in the case of pattern classification, and tedious knowledge base generation in the case of RTKBS. It utilizes a dynamic distance classifier, developed upon a new separability criterion and a new definition of robust diagnosis, for achieving these benefits. The promise of this technique has been proven concretely through on-line diagnosis of drill wear. Its suitability for practical implementation is substantiated by the use of practical, inexpensive, machine-mounted sensors and low-cost delivery systems.
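The abstract does not spell out the dynamic distance classifier, but the pattern-classification half can be illustrated with a generic nearest-centroid classifier over sensor features. The feature values below are hypothetical:

```python
import math

def centroid(samples):
    """Mean feature vector of a list of samples."""
    n = len(samples)
    return [sum(col) / n for col in zip(*samples)]

def classify(x, classes):
    """Assign x to the class whose training centroid is nearest (Euclidean)."""
    def dist(a, b):
        return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))
    return min(classes, key=lambda name: dist(x, centroid(classes[name])))

# Hypothetical (thrust, torque) sensor readings per tool condition.
classes = {
    "sharp": [(1.0, 0.9), (1.1, 1.0)],
    "worn":  [(2.0, 1.8), (2.2, 2.0)],
}
print(classify((2.1, 1.9), classes))  # -> worn
```

The hybrid system described above would then pass such classifications to the knowledge-based layer for higher-level reasoning before raising a diagnosis.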

  14. Knowledge-based identification of soluble biomarkers: hepatic fibrosis in NAFLD as an example.

    PubMed

    Page, Sandra; Birerdinc, Aybike; Estep, Michael; Stepanova, Maria; Afendy, Arian; Petricoin, Emanuel; Younossi, Zobair; Chandhoke, Vikas; Baranova, Ancha

    2013-01-01

    The discovery of biomarkers is often performed using high-throughput proteomics-based platforms and is limited to the molecules recognized by a given set of purified and validated antigens or antibodies. Knowledge-based, or systems biology, approaches that involve the analysis of integrated data, predominantly molecular pathways and networks, may infer quantitative changes in the levels of biomolecules not covered by a given assay from the levels of the analytes profiled. In this study we attempted to use a knowledge-based approach to predict biomarkers reflecting the changes in underlying protein phosphorylation events using Nonalcoholic Fatty Liver Disease (NAFLD) as a model. Two soluble biomarkers, CCL-2 and FasL, were inferred in silico as relevant to NAFLD pathogenesis. Predictive performance of these biomarkers was studied using serum samples collected from patients with histologically proven NAFLD. Serum levels of both molecules, in combination with clinical and demographic data, were predictive of hepatic fibrosis in a cohort of NAFLD patients. Our study suggests that (1) NASH-specific disruption of the kinase-driven signaling cascades in visceral adipose tissue leads to detectable changes in the levels of soluble molecules released into the bloodstream, and (2) biomarkers discovered in silico could contribute to predictive models for non-malignant chronic diseases.

  15. Knowledge-Based Identification of Soluble Biomarkers: Hepatic Fibrosis in NAFLD as an Example

    PubMed Central

    Page, Sandra; Birerdinc, Aybike; Estep, Michael; Stepanova, Maria; Afendy, Arian; Petricoin, Emanuel; Younossi, Zobair; Chandhoke, Vikas; Baranova, Ancha

    2013-01-01

    The discovery of biomarkers is often performed using high-throughput proteomics-based platforms and is limited to the molecules recognized by a given set of purified and validated antigens or antibodies. Knowledge-based, or systems biology, approaches that involve the analysis of integrated data, predominantly molecular pathways and networks, may infer quantitative changes in the levels of biomolecules not covered by a given assay from the levels of the analytes profiled. In this study we attempted to use a knowledge-based approach to predict biomarkers reflecting the changes in underlying protein phosphorylation events using Nonalcoholic Fatty Liver Disease (NAFLD) as a model. Two soluble biomarkers, CCL-2 and FasL, were inferred in silico as relevant to NAFLD pathogenesis. Predictive performance of these biomarkers was studied using serum samples collected from patients with histologically proven NAFLD. Serum levels of both molecules, in combination with clinical and demographic data, were predictive of hepatic fibrosis in a cohort of NAFLD patients. Our study suggests that (1) NASH-specific disruption of the kinase-driven signaling cascades in visceral adipose tissue leads to detectable changes in the levels of soluble molecules released into the bloodstream, and (2) biomarkers discovered in silico could contribute to predictive models for non-malignant chronic diseases. PMID:23405244

  16. Enabling Linked Science in Global Climate Uncertainty Quantification (UQ) Research

    NASA Astrophysics Data System (ADS)

    Elsethagen, T.; Stephan, E.; Lin, G.; Williams, D.; Banks, E.

    2012-12-01

    This paper shares a real-world global climate UQ science use case and illustrates how a linked science application called Provenance Environment (ProvEn), currently being developed, enables and facilitates scientific teams to publish, share, link, and discover new links over their UQ research results. UQ results include terascale datasets that are published to an Earth Systems Grid Federation (ESGF) repository. ProvEn demonstrates how a scientific team conducting UQ studies can discover dataset links using its domain knowledge base, allowing them to better understand the UQ study research objectives, the experimental protocol used, the resulting dataset lineage, related analytical findings, and ancillary literature citations, along with the social network of scientists associated with the study. This research claims that this linked science approach not only allows scientists to greatly benefit from understanding a particular dataset within a knowledge context, but also benefits the cross-referencing of knowledge among the numerous UQ studies stored in ESGF. ProvEn collects native forms of data provenance resources as the UQ study is carried out. The native data provenance resources can be collected from a variety of sources such as scripts, workflow engine logs, simulation log files, scientific team members, etc. Schema alignment is used to translate the native forms of provenance into a set of W3C PROV-O semantic statements used as a common interchange format, which also contains URI references back to resources in the UQ study dataset for querying and cross-referencing. ProvEn leverages Fedora Commons' digital object model in a Resource Oriented Architecture (ROA) (i.e., a RESTful framework) to logically organize and partition native and translated provenance resources by UQ study. The ROA also provides scientists the means to search both native and translated forms of provenance.
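Schema alignment from a native log record into W3C PROV-O statements might look like the following sketch; the log fields, URI scheme, and `uq:` prefix are assumptions, not ProvEn's actual format.

```python
# Translate one hypothetical workflow-log record into PROV-O-style triples.

def to_prov(record):
    run = f"uq:run/{record['run_id']}"
    out = f"uq:dataset/{record['output']}"
    agent = f"uq:agent/{record['user']}"
    return [
        (run, "rdf:type", "prov:Activity"),
        (out, "rdf:type", "prov:Entity"),
        (out, "prov:wasGeneratedBy", run),
        (run, "prov:used", f"uq:dataset/{record['input']}"),
        (run, "prov:wasAssociatedWith", agent),
    ]

log = {"run_id": "042", "user": "esg-user", "input": "forcing.nc",
       "output": "ensemble.nc"}
for triple in to_prov(log):
    print(triple)
```

Because the output uses the PROV-O vocabulary as a common interchange format, provenance translated from scripts, workflow logs, and simulation logs can all be queried uniformly.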

  17. Biomedical discovery acceleration, with applications to craniofacial development.

    PubMed

    Leach, Sonia M; Tipney, Hannah; Feng, Weiguo; Baumgartner, William A; Kasliwal, Priyanka; Schuyler, Ronald P; Williams, Trevor; Spritz, Richard A; Hunter, Lawrence

    2009-03-01

    The profusion of high-throughput instruments and the explosion of new results in the scientific literature, particularly in molecular biomedicine, is both a blessing and a curse to the bench researcher. Even knowledgeable and experienced scientists can benefit from computational tools that help navigate this vast and rapidly evolving terrain. In this paper, we describe a novel computational approach to this challenge, a knowledge-based system that combines reading, reasoning, and reporting methods to facilitate analysis of experimental data. Reading methods extract information from external resources, either by parsing structured data or using biomedical language processing to extract information from unstructured data, and track knowledge provenance. Reasoning methods enrich the knowledge that results from reading by, for example, noting two genes that are annotated to the same ontology term or database entry. Reasoning is also used to combine all sources into a knowledge network that represents the integration of all sorts of relationships between a pair of genes, and to calculate a combined reliability score. Reporting methods combine the knowledge network with a congruent network constructed from experimental data and visualize the combined network in a tool that facilitates the knowledge-based analysis of that data. An implementation of this approach, called the Hanalyzer, is demonstrated on a large-scale gene expression array dataset relevant to craniofacial development. The use of the tool was critical in the creation of hypotheses regarding the roles of four genes never previously characterized as involved in craniofacial development; each of these hypotheses was validated by further experimental work.
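The abstract mentions a combined reliability score across evidence sources. One standard way to combine independent per-source reliabilities (an assumption here, not necessarily the Hanalyzer's exact formula) is a noisy-OR:

```python
def combined_reliability(scores):
    """Noisy-OR: P(edge) = 1 - prod_i (1 - p_i), assuming independent sources."""
    remaining = 1.0
    for p in scores:
        remaining *= (1.0 - p)
    return 1.0 - remaining

# Two independent sources each partially supporting the same gene-gene edge.
print(round(combined_reliability([0.6, 0.5]), 2))  # -> 0.8
```

The combined score grows monotonically with each additional supporting source, which matches the intuition that corroborated relationships in the knowledge network deserve more trust.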

  18. Cancer Immunotherapy and Breaking Immune Tolerance-New Approaches to an Old Challenge

    PubMed Central

    Makkouk, Amani; Weiner, George

    2014-01-01

    Cancer immunotherapy has proven to be challenging as it depends on overcoming multiple mechanisms that mediate immune tolerance to self-antigens. A growing understanding of immune tolerance has been the foundation for new approaches to cancer immunotherapy. Adoptive transfer of immune effectors such as antitumor monoclonal antibodies and Chimeric Antigen Receptor T cells bypasses many of the mechanisms involved in immune tolerance by allowing for expansion of tumor specific effectors ex vivo. Vaccination with whole tumor cells, protein, peptide, or dendritic cells has proven challenging, yet may be more useful when combined with other cancer immunotherapeutic strategies. Immunomodulatory approaches to cancer immunotherapy include treatment with agents that enhance and maintain T cell activation. Recent advances in the use of checkpoint blockade to block negative signals and so maintain the antitumor response are particularly exciting. With our growing knowledge of immune tolerance and ways to overcome it, combination treatments are being developed, tested and have particular promise. One example is in situ immunization that is designed to break tolerance within the tumor microenvironment. Progress in all these areas is continuing based on clear evidence that cancer immunotherapy designed to overcome immune tolerance can be useful for a growing number of cancer patients. PMID:25524899

  19. A fully automatic three-step liver segmentation method on LDA-based probability maps for multiple contrast MR images.

    PubMed

    Gloger, Oliver; Kühn, Jens; Stanski, Adam; Völzke, Henry; Puls, Ralf

    2010-07-01

    Automatic 3D liver segmentation in magnetic resonance (MR) data sets has proven to be a very challenging task in the domain of medical image analysis. There exist numerous approaches for automatic 3D liver segmentation on computer tomography data sets that have influenced the segmentation of MR images. In contrast to previous approaches to liver segmentation in MR data sets, we use all available MR channel information of different weightings and formulate liver tissue and position probabilities in a probabilistic framework. We apply multiclass linear discriminant analysis as a fast and efficient dimensionality reduction technique and generate probability maps then used for segmentation. We develop a fully automatic three-step 3D segmentation approach based upon a modified region growing approach and a further threshold technique. Finally, we incorporate characteristic prior knowledge to improve the segmentation results. This novel 3D segmentation approach is modularized and can be applied for normal and fat accumulated liver tissue properties. Copyright 2010 Elsevier Inc. All rights reserved.
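The region-growing step can be sketched in two dimensions on a probability map; the map, seed, and threshold below are illustrative, and the paper's actual method is 3D with additional modelling.

```python
from collections import deque

def region_grow(prob_map, seed, threshold=0.5):
    """Grow a region from seed over 4-connected cells above threshold."""
    rows, cols = len(prob_map), len(prob_map[0])
    region, queue = set(), deque([seed])
    while queue:
        r, c = queue.popleft()
        if (r, c) in region or not (0 <= r < rows and 0 <= c < cols):
            continue
        if prob_map[r][c] < threshold:
            continue
        region.add((r, c))
        queue.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return region

# Toy liver-probability map (values near 1.0 = likely liver tissue).
pmap = [
    [0.1, 0.2, 0.1],
    [0.2, 0.9, 0.8],
    [0.1, 0.7, 0.1],
]
print(sorted(region_grow(pmap, seed=(1, 1))))  # -> [(1, 1), (1, 2), (2, 1)]
```

In the paper's pipeline, the probability map comes from the LDA step and a further threshold plus prior knowledge refines the grown region.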

  20. Capturing domain knowledge from multiple sources: the rare bone disorders use case.

    PubMed

    Groza, Tudor; Tudorache, Tania; Robinson, Peter N; Zankl, Andreas

    2015-01-01

    Lately, ontologies have become a fundamental building block in the process of formalising and storing complex biomedical information. The community-driven ontology curation process, however, ignores the possibility of multiple communities building, in parallel, conceptualisations of the same domain, and thus providing slightly different perspectives on the same knowledge. The individual nature of this effort leads to the need for a mechanism to create an overarching and comprehensive overview of the different perspectives on the domain knowledge. We introduce an approach that enables the loose integration of knowledge emerging from diverse sources under a single coherent interoperable resource. To accurately track the original knowledge statements, we record the provenance at very granular levels. We exemplify the approach in the rare bone disorders domain by proposing the Rare Bone Disorders Ontology (RBDO). Using RBDO, researchers are able to answer queries, such as: "What phenotypes describe a particular disorder and are common to all sources?" or to understand similarities between disorders based on divergent groupings (classifications) provided by the underlying sources. RBDO is available at http://purl.org/skeletome/rbdo. In order to support lightweight query and integration, the knowledge captured by RBDO has also been made available as a SPARQL Endpoint at http://bio-lark.org/se_skeldys.html.

  1. Educational Approach to Seismic Risk Mitigation in Indian Himalayas -Hazard Map Making Workshops at High Schools-

    NASA Astrophysics Data System (ADS)

    Koketsu, K.; Oki, S.; Kimura, M.; Chadha, R. K.; Davuluri, S.

    2014-12-01

    How can we encourage people to take preventive measures against damage risks and empower them to take the right actions in emergencies to save their lives? The conventional approach taken by scientists had been disseminating intelligible information on up-to-date seismological knowledge. However, it has been proven that knowledge alone does not have enough impact to modify people's behaviors in emergencies (Oki and Nakayachi, 2012). On the other hand, the conventional approach taken by practitioners had been to conduct emergency drills at schools or workplaces. The loss of many lives from the 2011 Tohoku earthquake has proven that these emergency drills were not enough to save people's lives, unless they were empowered to assess the given situation on their own and react flexibly. Our challenge is to bridge the gap between knowledge and practice. With reference to best practices observed in Tohoku, such as The Miracles of Kamaishi, our endeavor is to design an effective Disaster Preparedness Education Program that is applicable to other disaster-prone regions in the world, even with different geological, socio-economical and cultural backgrounds. The key concepts for this new approach are 1) empowering individuals to take preventive actions to save their lives, 2) granting community-based understanding of disaster risks and 3) building a sense of reality and relevancy to disasters. With these in mind, we held workshops at some high schools in the Lesser Himalayan Region, combining lectures with an activity called "Hazard Map Making" where students proactively identify and assess the hazards around their living areas and learn practical strategies on how to manage risks. We observed the change of awareness of the students by conducting a preliminary questionnaire survey and interviews after each session. 
Results strongly implied that the significant change in students' attitudes towards disaster preparedness occurred not through the lectures on scientific knowledge, but after completing the whole program of activities. Students closed their presentations by spontaneously adding messages to others about the importance of life and preparedness. In this presentation, we share good practices in terms of program design and facilitation that encouraged the transition of participants from learners to actors.

  2. A proposed solution to integrating cognitive-affective neuroscience and neuropsychiatry in psychiatry residency training: The time is now.

    PubMed

    Torous, John; Stern, Adam P; Padmanabhan, Jaya L; Keshavan, Matcheri S; Perez, David L

    2015-10-01

    Despite increasing recognition of the importance of a strong neuroscience and neuropsychiatry education in the training of psychiatry residents, achieving this competency has proven challenging. In this perspective article, we selectively discuss the current state of these educational efforts and outline how using brain-symptom relationships from a systems-level neural circuit approach in clinical formulations may help residents value, understand, and apply cognitive-affective neuroscience based principles towards the care of psychiatric patients. To demonstrate the utility of this model, we present a case of major depressive disorder and discuss suspected abnormal neural circuits and therapeutic implications. A clinical neural systems-level, symptom-based approach to conceptualize mental illness can complement and expand residents' existing psychiatric knowledge. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. DisGeNET-RDF: harnessing the innovative power of the Semantic Web to explore the genetic basis of diseases.

    PubMed

    Queralt-Rosinach, Núria; Piñero, Janet; Bravo, Àlex; Sanz, Ferran; Furlong, Laura I

    2016-07-15

    DisGeNET-RDF makes available knowledge on the genetic basis of human diseases in the Semantic Web. Gene-disease associations (GDAs) and their provenance metadata are published as human-readable and machine-processable web resources. The information on GDAs included in DisGeNET-RDF is interlinked to other biomedical databases to support the development of bioinformatics approaches for translational research through evidence-based exploitation of rich and fully interconnected linked open data. http://rdf.disgenet.org/ support@disgenet.org. © The Author 2016. Published by Oxford University Press.

  4. A Scientific Data Provenance API for Distributed Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raju, Bibi; Elsethagen, Todd O.; Stephan, Eric G.

    Data provenance has been an active area of research as a means to standardize how the origin of data, process event history, and what or who was responsible for influencing results is explained. There are two approaches to capture provenance information. The first approach is to collect observed evidence produced by an executing application using log files, event listeners, and temporary files that are used by the application or application developer. The provenance translated from these observations is an interpretation of the provided evidence. The second approach is called disclosed because the application provides a firsthand account of the provenancemore » based on the anticipated questions on data flow, process flow, and responsible agents. Most observed provenance collection systems collect lot of provenance information during an application run or workflow execution. The common trend in capturing provenance is to collect all possible information, then attempt to find relevant information, which is not efficient. Existing disclosed provenance system APIs do not work well in distributed environment and have trouble finding where to fit the individual pieces of provenance information. This work focuses on determining more reliable solutions for provenance capture. As part of the Integrated End-to-end Performance Prediction and Diagnosis for Extreme Scientific Workflows (IPPD) project, an API was developed, called Producer API (PAPI), which can disclose application targeted provenance, designed to work in distributed environments by means of unique object identification methods. The provenance disclosure approach used adds additional metadata to the provenance information to uniquely identify the pieces and connect them together. PAPI uses a common provenance model to support this provenance integration across disclosure sources. The API also provides the flexibility to let the user decide what to do with the collected provenance. 
The collected provenance can be sent to a triple store using REST services or logged to a file.
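
    As an illustration of the disclosed approach described above, the following minimal sketch shows what a hypothetical producer-side recorder might look like. The class, method, and field names here are invented for illustration and are not the actual PAPI interface; the point is that each disclosed record carries a globally unique identifier so that pieces reported by distributed components can later be connected.

    ```python
    import json
    import uuid
    from datetime import datetime, timezone

    class ProvenanceRecorder:
        """Minimal sketch of a disclosed-provenance API: the application
        reports its own activities firsthand, and each record gets a
        globally unique identifier so that records produced by distributed
        components can be linked together afterwards."""

        def __init__(self):
            self.records = []

        def disclose(self, activity, inputs, outputs, agent):
            record = {
                "id": f"urn:uuid:{uuid.uuid4()}",  # unique id links pieces across hosts
                "activity": activity,
                "used": list(inputs),
                "generated": list(outputs),
                "agent": agent,
                "time": datetime.now(timezone.utc).isoformat(),
            }
            self.records.append(record)
            return record["id"]

        def to_log(self, path):
            # alternative to a triple-store REST endpoint: log to a file
            with open(path, "w") as fh:
                json.dump(self.records, fh, indent=2)

    rec = ProvenanceRecorder()
    step1 = rec.disclose("filter", ["raw.csv"], ["clean.csv"], "alice")
    step2 = rec.disclose("plot", ["clean.csv"], ["fig.png"], "alice")
    ```

    Because identifiers are generated rather than assigned centrally, two components on different hosts can disclose their pieces independently and the records can still be joined later by matching the artifact names they used and generated.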

  5. Knowledge management for systems biology: a general and visually driven framework applied to translational medicine.

    PubMed

    Maier, Dieter; Kalus, Wenzel; Wolff, Martin; Kalko, Susana G; Roca, Josep; Marin de Mas, Igor; Turan, Nil; Cascante, Marta; Falciani, Francesco; Hernandez, Miguel; Villà-Freixa, Jordi; Losko, Sascha

    2011-03-05

    To enhance our understanding of complex biological systems such as diseases, we need to put all of the available data into context and use it to detect the relations, patterns and rules that allow predictive hypotheses to be defined. Life science has become a data-rich science, with information about the behaviour of millions of entities such as genes, chemical compounds, diseases, cell types and organs, organised in many different databases and/or spread throughout the literature. Existing knowledge such as genotype-phenotype relations or signal transduction pathways must be semantically integrated and dynamically organised into structured networks that are connected with clinical and experimental data. Different approaches to this challenge exist, but so far none has proven entirely satisfactory. To address this challenge we previously developed a generic knowledge management framework, BioXM™, which allows the dynamic, graphic generation of domain-specific knowledge representation models based on specific objects and their relations, supporting annotations and ontologies. Here we demonstrate the utility of BioXM for knowledge management in systems biology as part of the EU FP6 BioBridge project on translational approaches to chronic diseases. From clinical and experimental data, text-mining results and public databases we generate a chronic obstructive pulmonary disease (COPD) knowledge base and demonstrate its use by mining specific molecular networks together with integrated clinical and experimental data. We generate the first semantically integrated COPD-specific public knowledge base and find that, for the integration of clinical and experimental data with pre-existing knowledge, the configuration-based set-up enabled by BioXM reduced implementation time and effort for the knowledge base compared to similar systems implemented as classical software development projects.
The knowledge base enables the retrieval of sub-networks including protein-protein interaction, pathway, gene-disease and gene-compound data, which are used for subsequent data analysis, modelling and simulation. Pre-structured queries and reports enhance usability; establishing their use in everyday clinical settings requires further simplification with a browser-based interface, which is currently under development.

  6. Knowledge management for systems biology: a general and visually driven framework applied to translational medicine

    PubMed Central

    2011-01-01

    Background To enhance our understanding of complex biological systems such as diseases, we need to put all of the available data into context and use it to detect the relations, patterns and rules that allow predictive hypotheses to be defined. Life science has become a data-rich science, with information about the behaviour of millions of entities such as genes, chemical compounds, diseases, cell types and organs, organised in many different databases and/or spread throughout the literature. Existing knowledge such as genotype-phenotype relations or signal transduction pathways must be semantically integrated and dynamically organised into structured networks that are connected with clinical and experimental data. Different approaches to this challenge exist, but so far none has proven entirely satisfactory. Results To address this challenge we previously developed a generic knowledge management framework, BioXM™, which allows the dynamic, graphic generation of domain-specific knowledge representation models based on specific objects and their relations, supporting annotations and ontologies. Here we demonstrate the utility of BioXM for knowledge management in systems biology as part of the EU FP6 BioBridge project on translational approaches to chronic diseases. From clinical and experimental data, text-mining results and public databases we generate a chronic obstructive pulmonary disease (COPD) knowledge base and demonstrate its use by mining specific molecular networks together with integrated clinical and experimental data. Conclusions We generate the first semantically integrated COPD-specific public knowledge base and find that, for the integration of clinical and experimental data with pre-existing knowledge, the configuration-based set-up enabled by BioXM reduced implementation time and effort for the knowledge base compared to similar systems implemented as classical software development projects.
The knowledge base enables the retrieval of sub-networks including protein-protein interaction, pathway, gene-disease and gene-compound data, which are used for subsequent data analysis, modelling and simulation. Pre-structured queries and reports enhance usability; establishing their use in everyday clinical settings requires further simplification with a browser-based interface, which is currently under development. PMID:21375767

  7. Knowledge of Evidence-Based Urinary Catheter Care Practice Recommendations Among Healthcare Workers in Nursing Homes

    PubMed Central

    Mody, Lona; Saint, Sanjay; Galecki, Andrzej; Chen, Shu; Krein, Sarah L.

    2010-01-01

    Objectives This study assessed the knowledge of recommended urinary catheter care practices among nursing home (NH) healthcare workers (HCWs) in Southeast Michigan. Design A self-administered survey. Setting Seven nursing homes in Southeast Michigan. Participants Three hundred and fifty-six healthcare workers. Methods An anonymous, self-administered survey of HCWs (nurses and nurse aides) in seven NHs in 2006. The survey included questions about respondent characteristics and knowledge about indications, care, and personal hygiene pertaining to urinary catheters. The association of knowledge measures with occupation (nurses vs. aides) was assessed using generalized estimating equations. Results A total of 356 of 440 HCWs (81%) responded. Over 90% of HCWs were aware of measures such as cleaning around the catheter daily, glove use, and hand hygiene with catheter manipulation. They were less aware of research-proven recommendations not to disconnect the catheter from its bag (59% nurses vs. 30% aides, P < .001), not to routinely irrigate the catheter (48% nurses vs. 8% aides, P < .001), and to perform hand hygiene even after casual contact (60% nurses vs. 69% aides, P = .07). HCWs were also unaware of recommendations regarding alcohol-based handrub (27% nurses and 32% aides with correct responses, P = .38). HCWs reported both informal (such as nurse supervisors) and formal (in-service) sources of knowledge about catheter care. Conclusion Wide discrepancies remain between research-proven recommendations pertaining to urinary catheter care and HCWs' knowledge. Nurses and aides differ in their knowledge of recommendations against harmful practices, such as disconnecting the catheter from the bag and routinely irrigating catheters. Further research should focus on strategies to enhance dissemination of proven infection control practices in NHs. PMID:20662957

  8. Representing annotation compositionality and provenance for the Semantic Web

    PubMed Central

    2013-01-01

    Background Though the annotation of digital artifacts with metadata has a long history, the bulk of that work focuses on the association of single terms or concepts to single targets. As annotation efforts expand to capture more complex information, annotations will need to be able to refer to knowledge structures formally defined in terms of more atomic knowledge structures. Existing provenance efforts in the Semantic Web domain primarily focus on tracking provenance at the level of whole triples and do not provide enough detail to track how individual triple elements of annotations were derived from triple elements of other annotations. Results We present a task- and domain-independent ontological model for capturing annotations and their linkage to their denoted knowledge representations, which can be singular concepts or more complex sets of assertions. We have implemented this model as an extension of the Information Artifact Ontology in OWL and made it freely available, and we show how it can be integrated with several prominent annotation and provenance models. We present several application areas for the model, ranging from linguistic annotation of text to the annotation of disease-associations in genome sequences. Conclusions With this model, progressively more complex annotations can be composed from other annotations, and the provenance of compositional annotations can be represented at the annotation level or at the level of individual elements of the RDF triples composing the annotations. This in turn allows for progressively richer annotations to be constructed from previous annotation efforts, the precise provenance recording of which facilitates evidence-based inference and error tracking. PMID:24268021
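
    The element-level provenance idea in this record can be sketched in a few lines: rather than attaching provenance only to whole triples, each triple element records which earlier elements it was derived from. This is a simplified stand-in for illustration, not the Information Artifact Ontology extension itself, and the `ex:` identifiers are invented.

    ```python
    from dataclasses import dataclass, field

    @dataclass(frozen=True)
    class Element:
        """One element (subject, predicate, or object) of an RDF-style
        triple, carrying its own provenance: the earlier elements it was
        derived from."""
        value: str
        derived_from: tuple = ()

    @dataclass
    class Annotation:
        triples: list = field(default_factory=list)

        def assert_triple(self, s, p, o):
            self.triples.append((s, p, o))

    # Base annotation: a curator links a gene to a disease.
    gene = Element("ex:BRCA1")
    assoc = Element("ex:associatedWith")
    disease = Element("ex:BreastCancer")
    base = Annotation()
    base.assert_triple(gene, assoc, disease)

    # Compositional annotation: a sequence region reuses the disease
    # element, so provenance is recorded at the level of the individual
    # triple element rather than the whole triple.
    region = Element("ex:chr17:43044295")
    disease_reused = Element("ex:BreastCancer", derived_from=(disease,))
    derived = Annotation()
    derived.assert_triple(region, assoc, disease_reused)
    ```

    With this shape, an error traced to the object of the derived triple can be followed back through `derived_from` to the base annotation it was copied from, which is the kind of element-level tracking the record argues whole-triple provenance cannot express.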

  9. Becoming Chemists through Game-Based Inquiry Learning: The Case of "Legends of Alkhimia"

    ERIC Educational Resources Information Center

    Chee, Yam San; Tan, Kim Chwee Daniel

    2012-01-01

    Traditional modes of chemistry education in schools focus on imparting chemistry knowledge to students via instruction. Consequently, students often acquire the mistaken understanding that scientific knowledge comprises a fixed body of "proven" facts. They fail to comprehend that the construction of scientific understanding is a human…

  10. Designing for Sustained Adoption: A Model of Developing Educational Innovations for Successful Propagation

    ERIC Educational Resources Information Center

    Khatri, Raina; Henderson, Charles; Cole, Renée; Froyd, Jeffrey E.; Friedrichsen, Debra; Stanford, Courtney

    2016-01-01

    The physics education research community has produced a wealth of knowledge about effective teaching and learning of college level physics. Based on this knowledge, many research-proven instructional strategies and teaching materials have been developed and are currently available to instructors. Unfortunately, these intensive research and…

  11. SensePath: Understanding the Sensemaking Process Through Analytic Provenance.

    PubMed

    Nguyen, Phong H; Xu, Kai; Wheat, Ashley; Wong, B L William; Attfield, Simon; Fields, Bob

    2016-01-01

    Sensemaking is described as the process of comprehension, finding meaning and gaining insight from information, producing new knowledge and informing further action. Understanding the sensemaking process allows effective visual analytics tools to be built for making sense of large and complex datasets. Currently, understanding this process is often a manual and time-consuming undertaking: researchers collect observation data, transcribe screen capture videos and think-aloud recordings, identify recurring patterns, and eventually abstract the sensemaking process into a general model. In this paper, we propose a general approach to facilitate such a qualitative analysis process and introduce a prototype, SensePath, to demonstrate the application of this approach with a focus on browser-based online sensemaking. The approach is based on a study of a number of qualitative research sessions, including observations of users performing sensemaking tasks and post hoc analyses to uncover their sensemaking processes. Based on the study results and a follow-up participatory design session with HCI researchers, we decided to focus on the transcription and coding stages of thematic analysis. SensePath automatically captures the user's sensemaking actions, i.e., analytic provenance, and provides multi-linked views to support their further analysis. A number of other requirements elicited from the design session are also implemented in SensePath, such as easy integration with existing qualitative analysis workflows and non-intrusiveness for participants. The tool was used by an experienced HCI researcher to analyze two sensemaking sessions. The researcher found the tool intuitive; it considerably reduced analysis time and allowed a better understanding of the sensemaking process.

  12. Patterns-Based IS Change Management in SMEs

    NASA Astrophysics Data System (ADS)

    Makna, Janis; Kirikova, Marite

    The majority of information systems change management guidelines and standards are either too abstract or too bureaucratic to be easily applicable in small enterprises. This chapter proposes an approach, a method, and a prototype designed specifically for information systems change management in small and medium enterprises. The approach is based on proven patterns of changes in the set of information systems elements. The set of elements was obtained by theoretical analysis of information systems and business process definitions and enterprise architectures. The patterns were derived from a number of information systems theories and tested in 48 information systems change management projects. The prototype presents and helps to handle three basic change patterns, which help to anticipate the overall scope of changes related to particular elementary changes in an enterprise information system. Use of the prototype requires only basic knowledge of organizational business processes and information management.

  13. Construction of dynamic stochastic simulation models using knowledge-based techniques

    NASA Technical Reports Server (NTRS)

    Williams, M. Douglas; Shiva, Sajjan G.

    1990-01-01

    Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).

  14. Awareness of knowledge and practice regarding physical activity: A population-based prospective, observational study among students in Nanjing, China

    PubMed Central

    Xiang, Dandan; Wang, Zhiyong; Ye, Qing; Ware, Robert S.

    2017-01-01

    Background Physical activity (PA) promotion has proven effective in preventing childhood obesity. Increasing children's health knowledge is the most frequently used approach in PA intervention programs targeting childhood obesity prevention. However, little is known about the specific association between the change in a child's knowledge awareness and their PA practice. Methods A one-year follow-up study was conducted among primary and junior high school students in Nanjing, China. At baseline, students' knowledge of healthy behaviour and their PA levels were assessed. Students who were unaware of the association between PA and obesity were followed for one academic year. After nine months their knowledge and PA levels were re-measured using the same validated questionnaire. Mixed-effects regression models were used to estimate the relationship between awareness of knowledge about the link between PA and obesity and PA changes. Results Of the 1899 students who were unaware of the association between PA and obesity at baseline, 1859 (follow-up rate = 97.9%) were successfully followed up. After nine months, 1318 (70.9%) participants had become aware of the PA-obesity association. Compared to their counterparts who remained unaware, students who became aware of the PA-obesity association were more likely to increase both the frequency (odds ratio (OR) = 1.34, 95%CI = 1.09, 1.64) and duration (OR = 1.34, 95%CI = 1.09, 1.65) of PA, after adjusting for potentially confounding variables. Conclusion Becoming aware of the known link between PA and obesity led to positive behavior modification regarding PA in this cohort of Chinese students. Of particular importance, this suggests that knowledge dissemination and health education may be a useful approach to population-based physical activity promotion aimed at childhood obesity prevention in China. PMID:28622354
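
    The adjusted odds ratios above come from mixed-effects regression; for intuition, a crude (unadjusted) odds ratio and its Wald 95% confidence interval can be computed from a 2x2 table as follows. The counts in the example are invented for illustration and are not the study's data.

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Crude odds ratio and Wald 95% CI from a 2x2 table:
            a = exposed with outcome,    b = exposed without outcome,
            c = unexposed with outcome,  d = unexposed without outcome.
        The CI is computed on the log scale, where the standard error of
        log(OR) is sqrt(1/a + 1/b + 1/c + 1/d)."""
        or_ = (a * d) / (b * c)
        se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lower = math.exp(math.log(or_) - z * se_log_or)
        upper = math.exp(math.log(or_) + z * se_log_or)
        return or_, lower, upper

    # e.g. 40/100 "became aware" students increased PA frequency,
    # versus 30/100 of the "remained unaware" students (invented counts)
    or_, lower, upper = odds_ratio_ci(40, 60, 30, 70)
    ```

    An OR above 1 with a lower confidence bound above 1 would indicate, as in the study's adjusted results, that the exposed group (those who became aware) was more likely to increase its PA.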

  15. Climate Data Analytics Workflow Management

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Lee, S.; Pan, L.; Mattmann, C. A.; Lee, T. J.

    2016-12-01

    In this project we aim to create a sustainable building block for Earth science big data analytics and knowledge sharing. By closely studying how Earth scientists conduct data analytics research in their daily work, we have developed a provenance model to record their activities, and a technology to automatically generate workflows for scientists from that provenance. On top of this, we have built a prototype data-centric provenance repository and established a PDSW (People, Data, Service, Workflow) knowledge network to support workflow recommendation. To ensure the scalability and performance of the expected recommendation system, we have leveraged the Apache OODT system technology. A community-approved, metrics-based performance evaluation web service will allow a user to select a metric from a list of several community-approved metrics and to evaluate model performance using that metric and a reference dataset. This service will facilitate the use of reference datasets generated in support of model-data intercomparison projects such as Obs4MIPs and Ana4MIPs. The data-centric repository infrastructure will allow us to capture richer provenance to further facilitate knowledge sharing and scientific collaboration in the Earth science community. This project is part of the Apache incubator CMDA project.

  16. Knowledge Translation Efforts in Child and Youth Mental Health: A Systematic Review

    PubMed Central

    SCHACHTER, HOWARD M.; BENNETT, LINDSAY M.; McGOWAN, JESSIE; LY, MYLAN; WILSON, ANGELA; BENNETT, KATHRYN; BUCHANAN, DON H.; FERGUSSON, DEAN; MANION, IAN

    2012-01-01

    The availability of knowledge translation strategies that have been empirically studied and proven useful is a critical prerequisite to narrowing the research-to-practice gap in child and youth mental health. Through this review the authors sought to determine the current state of scientific knowledge of the effectiveness of knowledge translation approaches in child and youth mental health by conducting a systematic review of the research evidence. The findings and quality of the 12 included studies are discussed. Future work of high methodological quality that explores a broader range of knowledge translation strategies and the practitioners to whom they are applied, and that also attends to the implementation process, is recommended. PMID:22830938

  17. Site-directed nucleases: a paradigm shift in predictable, knowledge-based plant breeding.

    PubMed

    Podevin, Nancy; Davies, Howard V; Hartung, Frank; Nogué, Fabien; Casacuberta, Josep M

    2013-06-01

    Conventional plant breeding exploits existing genetic variability and introduces new variability by mutagenesis. This has proven highly successful in securing food supplies for an ever-growing human population. The use of genetically modified plants is a complementary approach but all plant breeding techniques have limitations. Here, we discuss how the recent evolution of targeted mutagenesis and DNA insertion techniques based on tailor-made site-directed nucleases (SDNs) provides opportunities to overcome such limitations. Plant breeding companies are exploiting SDNs to develop a new generation of crops with new and improved traits. Nevertheless, some technical limitations as well as significant uncertainties on the regulatory status of SDNs may challenge their use for commercial plant breeding. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Contaminant source and release history identification in groundwater: A multi-step approach

    NASA Astrophysics Data System (ADS)

    Gzyl, G.; Zanini, A.; Frączek, R.; Kura, K.

    2014-02-01

    The paper presents a new multi-step approach aiming at source identification and release history estimation. The new approach consists of three steps: performing integral pumping tests, identifying sources, and recovering the release history by means of a geostatistical approach. The present paper shows the results obtained from the application of the approach within a complex case study in Poland in which several areal sources were identified. The investigated site is situated in the vicinity of a former chemical plant in southern Poland in the city of Jaworzno in the valley of the Wąwolnica River; the plant has been in operation since the First World War producing various chemicals. From an environmental point of view the most relevant activity was the production of pesticides, especially lindane. The application of the multi-step approach enabled a significant increase in the knowledge of contamination at the site. Some suspected contamination sources have been proven to have minor effect on the overall contamination. Other suspected sources have been proven to have key significance. Some areas not taken into consideration previously have now been identified as key sources. The method also enabled estimation of the magnitude of the sources, and a list of priority reclamation actions will be drawn up as a result. The multi-step approach has proven to be effective and may be applied to other complicated contamination cases. Moreover, the paper shows the capability of the geostatistical approach to manage a complex real case study.

  19. Plato, Pascal, and the Dynamics of Personal Knowledge

    ERIC Educational Resources Information Center

    Otte, Michael Friedrich; Campos, Tania M. M.; Abido, Alexandre S.

    2013-01-01

    Educational practices are to be based on proven scientific knowledge, not least because the function science has to perform in human culture consists of unifying practical skills and general beliefs, the episteme and the techne (Amsterdamski, 1975, pp. 43-44). Now, modern societies first of all presuppose regular and standardized ways of…

  20. Head in the clouds: Re-imagining the experimental laboratory record for the web-based networked world

    PubMed Central

    2009-01-01

    The means we use to record the process of carrying out research remains tied to the concept of a paginated paper notebook despite the advances over the past decade in web-based communication and publication tools. The development of these tools offers an opportunity to re-imagine what the laboratory record would look like if it were re-built in a web-native form. In this paper I describe a distributed approach to the laboratory record which uses the most appropriate tool available to house and publish each specific object created during the research process, whether that be a physical sample, a digital data object, or the record of how one was created from another. I propose that the web-native laboratory record would act as a feed of relationships between these items. This approach can be seen as complementary to, rather than competitive with, integrative approaches that aim to aggregate relevant objects together to describe knowledge. The potential for the recent announcement of the Google Wave protocol to have a significant impact on realizing this vision is discussed, along with the issues of security and provenance that are raised by such an approach. PMID:20098590

  1. Designing a Knowledge Representation Approach for the Generation of Pedagogical Interventions by MTTs

    ERIC Educational Resources Information Center

    Paquette, Luc; Lebeau, Jean-François; Beaulieu, Gabriel; Mayers, André

    2015-01-01

    Model-tracing tutors (MTTs) have proven effective for the tutoring of well-defined tasks, but the pedagogical interventions they produce are limited and usually require the inclusion of pedagogical content, such as text message templates, in the model of the task. The capability to generate pedagogical content would be beneficial to MTT…

  2. Introducing a Learning Management System at a Russian University: Students' and Teachers' Perceptions

    ERIC Educational Resources Information Center

    Emelyanova, Natalya; Voronina, Elena

    2014-01-01

    Learning management systems (LMS) have been proven to encourage a constructive approach to knowledge acquisition and support active learning. One of the keys to successful and efficient use of LMS is how the stakeholders adopt and perceive this learning tool. The present research is therefore motivated by the importance of understanding teachers'…

  3. Can Inferred Provenance and Its Visualisation Be Used to Detect Erroneous Annotation? A Case Study Using UniProtKB

    PubMed Central

    Bell, Michael J.; Collison, Matthew; Lord, Phillip

    2013-01-01

    A constant influx of new data poses a challenge in keeping the annotation in biological databases current. Most biological databases contain significant quantities of textual annotation, which often contains the richest source of knowledge. Many databases reuse existing knowledge; during the curation process annotations are often propagated between entries. However, this is often not made explicit. Therefore, it can be hard, potentially impossible, for a reader to identify where an annotation originated. Within this work we attempt to identify annotation provenance and track its subsequent propagation. Specifically, we exploit annotation reuse within the UniProt Knowledgebase (UniProtKB), at the level of individual sentences. We describe a visualisation approach for the provenance and propagation of sentences in UniProtKB which enables a large-scale statistical analysis. Initially, levels of sentence reuse within UniProtKB were analysed, showing that reuse is heavily prevalent, which enables the tracking of provenance and propagation. By analysing sentences throughout UniProtKB, a number of interesting propagation patterns were identified, covering over sentences. Over sentences remain in the database after they have been removed from the entries where they originally occurred. Analysing a subset of these sentences suggests that approximately are erroneous, whilst appear to be inconsistent. These results suggest that being able to visualise sentence propagation and provenance can aid in the determination of the accuracy and quality of textual annotation. Source code and supplementary data are available from the authors' website at http://homepages.cs.ncl.ac.uk/m.j.bell1/sentence_analysis/. PMID:24143170
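
    The core of the reuse analysis can be illustrated with a toy index that maps each annotation sentence to the entries containing it; sentences shared by more than one entry are candidates for propagated annotation whose provenance is worth tracing. This is a simplified sketch, not the authors' pipeline, and the entry names and annotation text are invented.

    ```python
    from collections import defaultdict

    def shared_sentences(entries):
        """Index annotation sentences by the entries they occur in, and
        keep those appearing in more than one entry: candidates for
        propagated (reused) annotation."""
        index = defaultdict(set)
        for entry_id, text in entries.items():
            # naive sentence split on "."; a real pipeline would use a
            # proper sentence tokenizer
            for sentence in (s.strip() for s in text.split(".") if s.strip()):
                index[sentence].add(entry_id)
        return {s: ids for s, ids in index.items() if len(ids) > 1}

    entries = {
        "P1": "Binds DNA. Involved in DNA repair.",
        "P2": "Binds DNA. Integral membrane protein.",
        "P3": "Involved in DNA repair.",
    }
    reused = shared_sentences(entries)
    ```

    Comparing such an index across database releases would also reveal the pattern the authors report: a sentence that disappears from the entry where it originated can persist in the entries it was propagated to.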

  4. [Introduction of active learning and student readership in teaching by the pharmaceutical faculty].

    PubMed

    Sekiguchi, Masaki; Yamato, Ippei; Kato, Tetsuta; Torigoe, Kojyun

    2005-07-01

    We have introduced improvements and new approaches into our teaching methods by exploiting four active learning methods for first-year pharmacy students. The four teaching methods, used in lessons or take-home assignments, are as follows: 1) problem-based learning (clinical case), including a student presentation of the clinical case; 2) schematic drawings of the human organs, one drawing done in 15-20 min during the week following a lecture and a second drawing done with reference to a professional textbook; 3) learning of professional themes in take-home assignments; and 4) short tests, on paper or computer, to confirm the understanding of technical terms. These improvements and new methods provide active approaches for pharmacy students, as opposed to passive memorization of words and image study. In combination, they have proven useful as a learning method for acquiring expert knowledge and for shifting pharmacy students from a passive to an active learning approach in the classroom.

  5. Bacterial meningitis - principles of antimicrobial treatment.

    PubMed

    Jawień, Miroslaw; Garlicki, Aleksander M

    2013-01-01

    Bacterial meningitis is associated with significant morbidity and mortality despite the availability of effective antimicrobial therapy. The management approach to patients with suspected or proven bacterial meningitis includes emergent cerebrospinal fluid analysis and initiation of appropriate antimicrobial and adjunctive therapies. The choice of empirical antimicrobial therapy is based on the patient's age and underlying disease status; once the infecting pathogen is isolated, antimicrobial therapy can be modified for optimal treatment. Successful treatment of bacterial meningitis requires knowledge of epidemiology, including the prevalence of antimicrobial-resistant pathogens, the pathogenesis of meningitis, and the pharmacokinetics and pharmacodynamics of antimicrobial agents. The emergence of antibiotic-resistant bacterial strains in recent years has necessitated the development of new strategies for empiric antimicrobial therapy of bacterial meningitis.

  6. Starting Strong: Evidence-Based Early Literacy Practices

    ERIC Educational Resources Information Center

    Blamey, Katrin; Beauchat, Katherine

    2016-01-01

    Four evidence-based instructional approaches create an essential resource for any early literacy teacher or coach. Improve your teaching practice in all areas of early literacy. Use four proven instructional approaches--standards based, evidence based, assessment based, and student based--to improve your teaching practice in all areas of early…

  7. Improving local health workers' knowledge of malaria in the elimination phase-determinants and strategies: a cross-sectional study in rural China.

    PubMed

    Wang, Ruoxi; Tang, Shangfeng; Yang, Jun; Shao, Tian; Shao, Piaopiao; Liu, Chunyan; Feng, Da; Fu, Hang; Chen, Xiaoyu; Hu, Tao; Feng, Zhanchun

    2017-05-19

    The current stage of malaria elimination in China requires experienced local health workers with sufficient knowledge of malaria, who help to keep the public health system vigilant about a possible resurgence. However, the factors influencing local health workers' knowledge level are not fully understood. This study aims to explore the factors with the greatest impact on local health workers' knowledge of malaria and to propose corresponding suggestions. Using a stratified sampling method, a cross-sectional survey was carried out between November 2014 and April 2016. Chi-square tests were performed to identify factors with potential influence on health workers' knowledge level of malaria. Bivariate logistic regression was employed to explore the relationship between the predictors and local health workers' knowledge level of malaria. A layered chi-square test was used to assess the homogeneity of the interaction between training approaches and the percentage of participants with high-level knowledge. The endemic type of county and the type of organization played the most significant roles in influencing local health workers' knowledge level regarding malaria in the sample population. Participants from Type 1 and Type 2 counties were 4.3 times (4.336 and 4.328, respectively) more likely to have high-level knowledge of malaria than those working in Type 3 counties. The probability of having high-level knowledge among participants from county-level facilities (county hospitals and CDCs) was more than 2.2 times higher than for those working in villages. Other socio-demographic factors, such as education and work experience, also affected knowledge regarding malaria. Among the six most-used training approaches, electronic material (OR = 2.356, 95% CI 1.112-4.989), thematic series (OR = 1.784, 95% CI 0.907-3.508) and supervision (OR = 2.788, 95% CI 1.018-7.632) were shown to have a significant positive impact on local health workers' knowledge of malaria.
Village doctors and who served in Type 3 counties were identified as the ones in urgent need of effective training. Three types of training approaches, including electronic material, thematic series and supervision, were proven to be effective in improving local health workers' knowledge. Nevertheless, the coverage of these training approaches was still limited. This study suggests expanding the coverage of training, especially the three particular types of training, to local health workers, particularly to the target populations (village doctors and who served in Type 3 counties). Online training, small group discussion and targeted skill development may be the directions for the future development of training programmes.
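    The odds ratios and 95% confidence intervals reported above can be reproduced from a 2x2 contingency table with the standard Woolf (logit) method. A minimal sketch, using hypothetical counts rather than the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf (logit) 95% CI from a 2x2 table:
    rows = trained / untrained, cols = high-level / low-level knowledge."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)        # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 20/10 trained with high/low knowledge, 10/20 untrained.
or_, lo, hi = odds_ratio_ci(20, 10, 10, 20)
```

A confidence interval whose lower bound crosses 1 (as for the thematic-series OR above) indicates the association is not statistically significant at the 5% level.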

  8. Characterizing Provenance in Visualization and Data Analysis: An Organizational Framework of Provenance Types and Purposes.

    PubMed

    Ragan, Eric D; Endert, Alex; Sanyal, Jibonananda; Chen, Jian

    2016-01-01

    While the primary goal of visual analytics research is to improve the quality of insights and findings, a substantial amount of research in provenance has focused on the history of changes and advances throughout the analysis process. The term provenance has been used in a variety of ways to describe different types of records and histories related to visualization. The existing body of provenance research has grown to a point where the consolidation of design knowledge requires cross-referencing a variety of projects and studies spanning multiple domain areas. We present an organizational framework of the different types of provenance information and the purposes for which they are desired in the field of visual analytics. Our organization is intended to serve as a framework to help researchers specify types of provenance and coordinate design knowledge across projects. We also discuss the relationships between these factors and the methods used to capture provenance information. In addition, our organization can be used to guide the selection of evaluation methodology and the comparison of study outcomes in provenance research.

  9. Characterizing Provenance in Visualization and Data Analysis: An Organizational Framework of Provenance Types and Purposes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ragan, Eric; Endert, Alex; Sanyal, Jibonananda

    While the primary goal of visual analytics research is to improve the quality of insights and findings, a substantial amount of research in provenance has focused on the history of changes and advances throughout the analysis process. The term provenance has been used in a variety of ways to describe different types of records and histories related to visualization. The existing body of provenance research has grown to a point where the consolidation of design knowledge requires cross-referencing a variety of projects and studies spanning multiple domain areas. We present an organizational framework of the different types of provenance information and the purposes for which they are desired in the field of visual analytics. Our organization is intended to serve as a framework to help researchers specify types of provenance and coordinate design knowledge across projects. We also discuss the relationships between these factors and the methods used to capture provenance information. In addition, our organization can be used to guide the selection of evaluation methodology and the comparison of study outcomes in provenance research.

  10. Characterizing Provenance in Visualization and Data Analysis: An Organizational Framework of Provenance Types and Purposes

    DOE PAGES

    Ragan, Eric; Endert, Alex; Sanyal, Jibonananda; ...

    2016-01-01

    While the primary goal of visual analytics research is to improve the quality of insights and findings, a substantial amount of research in provenance has focused on the history of changes and advances throughout the analysis process. The term provenance has been used in a variety of ways to describe different types of records and histories related to visualization. The existing body of provenance research has grown to a point where the consolidation of design knowledge requires cross-referencing a variety of projects and studies spanning multiple domain areas. We present an organizational framework of the different types of provenance information and the purposes for which they are desired in the field of visual analytics. Our organization is intended to serve as a framework to help researchers specify types of provenance and coordinate design knowledge across projects. We also discuss the relationships between these factors and the methods used to capture provenance information. In addition, our organization can be used to guide the selection of evaluation methodology and the comparison of study outcomes in provenance research.

  11. Self-management support interventions that are clinically linked and technology enabled: can they successfully prevent and treat diabetes?

    PubMed

    Kaufman, Neal D; Woodley, Paula D Patnoe

    2011-05-01

    Patients with diabetes need a complex set of services and supports. The challenge of integrating these services into the diabetes regimen can be successfully overcome through self-management support interventions that are clinically linked and technology enabled: self-management support because patients need help mastering the knowledge, attitudes, skills, and behaviors so necessary for good outcomes; interventions because comprehensive theory-based, evidence-proven, long-term, longitudinal interventions work better than direct-to-consumer or nonplanned health promotion approaches; clinically linked because patients are more likely to adopt new behaviors when the approach is in the context of a trusted therapeutic relationship and within an effective medical care system; and technology enabled because capitalizing on the amazing power of information technology leads to the delivery of cost-effective, scalable, engaging solutions that prevent and manage diabetes. © 2011 Diabetes Technology Society.

  12. A data dictionary approach to multilingual documentation and decision support for the diagnosis of acute abdominal pain. (COPERNICUS 555, an European concerted action).

    PubMed

    Ohmann, C; Eich, H P; Sippel, H

    1998-01-01

    This paper describes the design and development of a multilingual documentation and decision support system for the diagnosis of acute abdominal pain. The work was performed within a multi-national COPERNICUS European concerted action dealing with information technology for quality assurance in acute abdominal pain in Europe (EURO-AAP, 555). The software engineering was based on object-oriented analysis, design and programming. The program covers three modules: a data dictionary, a documentation program and a knowledge-based system. National versions of the software were provided and introduced into 16 centers in Central and Eastern Europe. A prospective data collection was performed in which 4020 patients were recruited. The software design proved to be very efficient and useful for the development of multilingual software.

  13. Connecting Provenance with Semantic Descriptions in the NASA Earth Exchange (NEX)

    NASA Astrophysics Data System (ADS)

    Votava, P.; Michaelis, A.; Nemani, R. R.

    2012-12-01

    NASA Earth Exchange (NEX) is a data, modeling and knowledge collaboratory that houses NASA satellite data, climate data and ancillary data, where a focused community may come together to share modeling and analysis codes, scientific results, knowledge and expertise on a centralized platform. Among the main goals of NEX are transparency and repeatability, and to that end we have been adding components that enable tracking of the provenance of both scientific processes and the datasets produced by these processes. As scientific processes become more complex, they are often developed collaboratively, and it becomes increasingly important for the research team to be able to track the development of the process and the datasets that are produced along the way. Additionally, we want to be able to link the processes and datasets developed on NEX to existing information and knowledge, so that users can query and compare the provenance of any dataset or process with regard to component-specific attributes such as data quality, geographic location, related publications, user comments and annotations, etc. We have developed several ontologies that describe datasets and workflow components available on NEX using the OWL ontology language, as well as a simple ontology that provides a linking mechanism to the collected provenance information. The provenance is captured in two ways: we utilize the existing provenance infrastructure of VisTrails, which is used as a workflow engine on NEX, and we extend the captured provenance using the PROV data model expressed through the PROV-O ontology. We do this in order to link and query the provenance more easily in the context of the existing NEX information and knowledge. The captured provenance graph is processed and stored using RDFlib with a MySQL backend and can be queried using either RDFLib or SPARQL. As a concrete example, we show how this information is captured during an anomaly detection process in large satellite datasets.
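    To illustrate the PROV data model mentioned above, the sketch below builds a tiny in-memory provenance graph from plain PROV vocabulary terms and follows a derivation chain back to its sources. It is a self-contained stand-in, with hypothetical resource names, for what the NEX stack does with RDFlib and SPARQL:

```python
# Minimal PROV-style provenance graph: (subject, predicate, object) triples.
PROV_DERIVED = "prov:wasDerivedFrom"
PROV_GENERATED_BY = "prov:wasGeneratedBy"

class ProvGraph:
    def __init__(self):
        self.triples = set()

    def add(self, s, p, o):
        self.triples.add((s, p, o))

    def objects(self, s, p):
        """All objects o such that (s, p, o) is in the graph."""
        return {o for (s2, p2, o) in self.triples if s2 == s and p2 == p}

    def derivation_chain(self, entity):
        """Follow prov:wasDerivedFrom links back to the original source."""
        chain = [entity]
        while True:
            parents = self.objects(chain[-1], PROV_DERIVED)
            if not parents:
                return chain
            chain.append(sorted(parents)[0])  # pick one parent for a linear chain

g = ProvGraph()
g.add("nex:anomaly_map", PROV_GENERATED_BY, "nex:anomaly_detection_run_42")
g.add("nex:anomaly_map", PROV_DERIVED, "nex:ndvi_composite")
g.add("nex:ndvi_composite", PROV_DERIVED, "nex:modis_surface_reflectance")
```

In the real system the same triples would live in an RDF store and the chain query would be a SPARQL property path rather than a Python loop.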

  14. An Ontology-Enabled Natural Language Processing Pipeline for Provenance Metadata Extraction from Biomedical Text (Short Paper).

    PubMed

    Valdez, Joshua; Rueschman, Michael; Kim, Matthew; Redline, Susan; Sahoo, Satya S

    2016-10-01

    Extraction of structured information from biomedical literature is a complex and challenging problem due to the complexity of the biomedical domain and the lack of appropriate natural language processing (NLP) techniques. High-quality domain ontologies model both data and metadata information at a fine level of granularity, which can be effectively used to accurately extract structured information from biomedical text. Extraction of provenance metadata, which describes the history or source of information, from published articles is an important task to support scientific reproducibility. Reproducibility of results reported by previous research studies is a foundational component of scientific advancement. This is highlighted by the recent initiative by the US National Institutes of Health called "Principles of Rigor and Reproducibility". In this paper, we describe an effective approach to extract provenance metadata from published biomedical research literature using an ontology-enabled NLP platform as part of the Provenance for Clinical and Healthcare Research (ProvCaRe) project. The ProvCaRe-NLP tool extends the clinical Text Analysis and Knowledge Extraction System (cTAKES) platform using both provenance and biomedical domain ontologies. We demonstrate the effectiveness of the ProvCaRe-NLP tool using a corpus of 20 peer-reviewed publications. The results of our evaluation demonstrate that the ProvCaRe-NLP tool has significantly higher recall in extracting provenance metadata than existing NLP pipelines such as MetaMap.

  15. Impact of Gender, Ethnicity, Year in School, Social Economic Status, and State Standardized Assessment Scores on Student Content Knowledge Achievement when Using Vee Maps as a Formative Assessment Tool

    ERIC Educational Resources Information Center

    Thoron, Andrew C.; Myers, Brian E.

    2011-01-01

    The National Research Council has recognized the challenge of assessing laboratory investigation and called for the investigation of assessments that are proven through sound research-based studies. The Vee map provides a framework that allows the learners to conceptualize their previous knowledge as they develop success in meaningful learning…

  16. Persistent identifiers for web service requests relying on a provenance ontology design pattern

    NASA Astrophysics Data System (ADS)

    Car, Nicholas; Wang, Jingbo; Wyborn, Lesley; Si, Wei

    2016-04-01

    Delivering provenance information for datasets produced from static inputs is relatively straightforward: we represent the processing actions and data flow using provenance ontologies and link to copies of the inputs stored in repositories. If appropriate detail is given, the provenance information can then describe what actions have occurred (transparency) and enable reproducibility. When web service-generated data is used by a process to create a dataset instead of static inputs, we need to use sophisticated provenance representations of the web service request, as we can no longer just link to data stored in a repository. A graph-based provenance representation, such as the W3C's PROV standard, can be used to model the web service request both as a single conceptual dataset and as a small workflow with a number of components within the same provenance report. This dual representation does more than just allow simplified or detailed views of a dataset's production to be used where appropriate. It also allows persistent identifiers to be assigned to instances of web service requests, thus enabling one form of dynamic data citation, and allows those identifiers to resolve to whatever level of detail implementers think appropriate in order for that web service request to be reproduced. In this presentation we detail our reasoning in representing web service requests as small workflows. In outline, this stems from the idea that web service requests are perdurant things, and in order to most easily persist knowledge of them for provenance, we should represent them as a nexus of relationships between endurant things, such as datasets and knowledge of particular system types, as these endurant things are far easier to persist. We also describe the ontology design pattern that we use to represent workflows in general and how we apply it to different types of web service requests. 
We give examples of specific web service requests instances that were made by systems at Australia's National Computing Infrastructure and show how one can 'click' through provenance interfaces to see the dual representations of the requests using provenance management tooling we have built.
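    One way to read the persistent-identifier idea above: canonicalize the request (endpoint plus sorted parameters), hash it, and use the digest as a stable identifier for the conceptual dataset the request produces. The endpoint, parameter names, and identifier scheme below are illustrative assumptions, not the authors' actual convention:

```python
import hashlib

def request_pid(endpoint, params):
    """Mint a deterministic identifier for a web service request by
    hashing its canonical form (endpoint + sorted key=value pairs)."""
    canonical = endpoint + "?" + "&".join(
        f"{k}={params[k]}" for k in sorted(params))
    digest = hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]
    return f"pid:wsreq/{digest}"

# The same request in any parameter order yields the same identifier,
# so the identifier can be cited and later resolved to reproduce the request.
pid_a = request_pid("https://example.org/wcs", {"bbox": "110,-45,155,-10", "layer": "dem"})
pid_b = request_pid("https://example.org/wcs", {"layer": "dem", "bbox": "110,-45,155,-10"})
```

In a full system the identifier would resolve, via a landing service, to either the single-dataset view or the small-workflow view of the request.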

  17. Knowledge Provenance in Semantic Wikis

    NASA Astrophysics Data System (ADS)

    Ding, L.; Bao, J.; McGuinness, D. L.

    2008-12-01

    Collaborative online environments with a technical Wiki infrastructure are becoming more widespread. One of the strengths of a Wiki environment is that it is relatively easy for numerous users to contribute original content and modify existing content (potentially originally generated by others). As more users begin to depend on informational content that is evolved by Wiki communities, it becomes more important to track the provenance of the information. Semantic Wikis expand upon traditional Wiki environments by adding computationally understandable encodings of some of the terms and relationships in Wikis. We have developed a semantic Wiki environment that augments a semantic Wiki with provenance markup. Provenance of original contributions as well as modifications is encoded using the provenance markup component of the Proof Markup Language. The Wiki environment provides the provenance markup automatically, so users are not required to make specific encodings of author, contribution date, and modification trail. Further, our Wiki environment includes a search component that understands the provenance primitives and thus can be used to provide a provenance-aware search facility. We will describe the knowledge provenance infrastructure of our Semantic Wiki and show how it is being used as the foundation of our group web site as well as a number of project web sites.
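    The automatic provenance capture described above can be sketched in a few lines: every edit records author and timestamp without any action from the user, and a provenance-aware search filters the modification trail. Class and field names here are illustrative; the actual system encodes this trail in the Proof Markup Language rather than Python objects:

```python
from dataclasses import dataclass, field

@dataclass
class Revision:
    author: str
    timestamp: str   # ISO 8601
    content: str

@dataclass
class WikiPage:
    title: str
    history: list = field(default_factory=list)

    def edit(self, author, timestamp, content):
        # Provenance is recorded automatically on every modification.
        self.history.append(Revision(author, timestamp, content))

    def revisions_by(self, author):
        """Provenance-aware search: all modifications made by one author."""
        return [r for r in self.history if r.author == author]

page = WikiPage("Knowledge Provenance")
page.edit("ding", "2008-10-01T09:00:00Z", "Initial draft.")
page.edit("bao", "2008-10-02T14:30:00Z", "Added PML references.")
page.edit("ding", "2008-10-03T11:15:00Z", "Clarified search facility.")
```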

  18. CI-Miner: A Semantic Methodology to Integrate Scientists, Data and Documents through the Use of Cyber-Infrastructure

    NASA Astrophysics Data System (ADS)

    Pinheiro da Silva, P.; CyberShARE Center of Excellence

    2011-12-01

    Scientists today face the challenge of rethinking the manner in which they document and make available their processes and data in an international cyber-infrastructure of shared resources. Some relevant examples of new scientific practices in the realm of computational and data extraction sciences include: large-scale data discovery; data integration; data sharing across distinct scientific domains; systematic management of trust and uncertainty; and comprehensive support for explaining processes and results. This talk introduces CI-Miner, an innovative hands-on, open-source, community-driven methodology to integrate these new scientific practices. It has been developed in collaboration with scientists, with the purpose of capturing, storing and retrieving knowledge about scientific processes and their products, thereby further supporting a new generation of science techniques based on data exploration. CI-Miner uses semantic annotations in the form of W3C Web Ontology Language (OWL)-based ontologies and Proof Markup Language (PML)-based provenance to represent knowledge. This methodology specializes general-purpose ontologies into workflow-driven ontologies (WDOs) and semantic abstract workflows (SAWs). Provenance in PML is CI-Miner's integrative component, which allows scientists to retrieve and reason with the knowledge represented in these new semantic documents. It additionally serves as a platform to share such collected knowledge with the scientific community participating in the international cyber-infrastructure. The integrated semantic documents that are tailored for the use of human epistemic agents may also be utilized by machine epistemic agents, since the documents are based on the W3C Resource Description Framework (RDF) notation. 
This talk is grounded upon interdisciplinary lessons learned through the use of CI-Miner in support of government-funded national and international cyber-infrastructure initiatives in the areas of geo-sciences (NSF-GEON and NSF-EarthScope), environmental sciences (CEON, NSF NEON, NSF-LTER and DOE-Ameri-Flux), and solar physics (VSTO and NSF-SPCDIS). The discussion on provenance is based on the use of PML in support of projects in collaboration with government organizations (DARPA, ARDA, NSF, DHS and DOE), research organizations (NCAR and PNNL), and industries (IBM and SRI International).

  19. Personalization of Learning Activities within a Virtual Environment for Training Based on Fuzzy Logic Theory

    ERIC Educational Resources Information Center

    Mohamed, Fahim; Abdeslam, Jakimi; Lahcen, El Bermi

    2017-01-01

    Virtual Environments for Training (VET) are useful tools for visualization, discovery as well as for training. VETs are based on virtual reality technique to put learners in training situations that emulate genuine situations. VETs have proven to be advantageous in putting learners into varied training situations to acquire knowledge and…

  20. An Analysis of Medical Ethic Practice by Union and Confederate Medical Departments During the American Civil War

    DTIC Science & Technology

    2011-04-05

    Therapeutics: While physicians during this era were unfamiliar with today's term of evidence-based medicine, they did apply their knowledge of the... Evidence-based medicine is the practice of providing interventional medical care that has been proven to lead to positive medical outcomes.

  1. Cross-domain Collaborative Research and People Interoperability: Beyond Knowledge Representation Frameworks

    NASA Astrophysics Data System (ADS)

    Fox, P. A.; Diviacco, P.; Busato, A.

    2016-12-01

    Geo-scientific research collaboration commonly faces complex systems where multiple skills and competences are needed at the same time. The efficacy of collaboration among researchers then becomes of paramount importance. Multidisciplinary studies draw from domains that are far from each other. Researchers also need to understand how to extract the data they need and eventually produce something that can be used by others. The management of information and knowledge in this perspective is non-trivial. Interoperability is frequently sought in computer-to-computer environments, so as to overcome mismatches in vocabulary, data formats, coordinate reference systems and so on. Successful researcher collaboration also relies on interoperability of the people. Smaller, synchronous, face-to-face settings for researchers are known to enhance people interoperability. However, changing settings, whether geographically, temporally, or by increasing team size, diversity, and expertise, requires people-computer-people-computer (...) interoperability. To date, knowledge representation frameworks have been proposed but not proven to be necessary and sufficient to achieve multi-way interoperability. In this contribution, we address the epistemology and sociology of science, advocating for a fluid perspective in which science is largely a social construct conditioned by cognitive issues, especially cognitive bias. Bias cannot be obliterated; on the contrary, it must be carefully taken into consideration. Information-centric interfaces built from different perspectives and ways of thinking by actors with different points of view, approaches and aims are proposed as a means of enhancing people interoperability in computer-based settings. 
The contribution will provide details on the approach of augmenting knowledge representation frameworks and interfacing them to the cognitive-conceptual frameworks for people that are needed to meet and exceed collaborative research goals in the 21st century. A web-based collaborative portal that integrates both approaches has been developed and will be presented. Reports will be given on initial tests, which have encouraging results.

  2. Sampling-Based Motion Planning Algorithms for Replanning and Spatial Load Balancing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boardman, Beth Leigh

    The common theme of this dissertation is sampling-based motion planning, with the two key contributions being in the areas of replanning and spatial load balancing for robotic systems. Here, we begin by recalling two sampling-based motion planners: the asymptotically optimal rapidly-exploring random tree (RRT*), and the asymptotically optimal probabilistic roadmap (PRM*). We also provide a brief background on collision cones and the Distributed Reactive Collision Avoidance (DRCA) algorithm. The next four chapters detail novel contributions for motion replanning in environments with unexpected static obstacles, for multi-agent collision avoidance, and for spatial load balancing. First, we show improved performance of the RRT* when using the proposed Grandparent-Connection (GP) or Focused-Refinement (FR) algorithms. Next, the Goal Tree algorithm for replanning with unexpected static obstacles is detailed and proven to be asymptotically optimal. A multi-agent collision avoidance problem in obstacle environments is approached via the RRT*, leading to the novel Sampling-Based Collision Avoidance (SBCA) algorithm. The SBCA algorithm is proven to guarantee collision-free trajectories for all of the agents, even when subject to uncertainties in the knowledge of the other agents' positions and velocities. Given that a solution exists, we prove that livelocks and deadlocks will lead to the cost to the goal being decreased. We introduce a new deconfliction maneuver that decreases the cost-to-come at each step. This new maneuver removes the possibility of livelocks and allows a result to be formed that proves convergence to the goal configurations. Finally, we present a limited-range Graph-based Spatial Load Balancing (GSLB) algorithm which fairly divides a non-convex space among multiple agents that are subject to differential constraints and have a limited travel distance. The GSLB algorithm is proven to converge to a solution when maximizing the area covered by the agents. 
The analysis for each of the above-mentioned algorithms is confirmed in simulations.
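    For readers unfamiliar with the sampling-based planners named above, a bare-bones RRT (without the asymptotically optimal rewiring that distinguishes RRT*) can be sketched as follows. The workspace, step size, and goal bias are illustrative assumptions, not the dissertation's parameters:

```python
import math
import random

def rrt(start, goal, is_free, bounds, step=0.5, goal_tol=0.5, max_iters=5000, seed=0):
    """Minimal 2D RRT: grow a tree from start by steering toward random samples."""
    rng = random.Random(seed)
    (xmin, xmax), (ymin, ymax) = bounds
    nodes = [start]
    parent = {0: None}
    for _ in range(max_iters):
        # Sample a random point (with a small goal bias to speed convergence).
        q = goal if rng.random() < 0.05 else (rng.uniform(xmin, xmax), rng.uniform(ymin, ymax))
        # Find the nearest tree node and steer one step toward the sample.
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], q))
        d = math.dist(nodes[i], q)
        if d == 0.0:
            continue
        nx, ny = nodes[i]
        new = (nx + step * (q[0] - nx) / d, ny + step * (q[1] - ny) / d)
        if not is_free(new):
            continue                      # reject samples that collide
        nodes.append(new)
        parent[len(nodes) - 1] = i
        if math.dist(new, goal) <= goal_tol:
            # Walk parents back to the root to recover the path.
            path, k = [], len(nodes) - 1
            while k is not None:
                path.append(nodes[k])
                k = parent[k]
            return path[::-1]
    return None

# Plan in an empty 10x10 workspace (every point is collision-free).
path = rrt((0.0, 0.0), (5.0, 5.0), lambda p: True, ((0.0, 10.0), (0.0, 10.0)))
```

RRT* extends this loop by also connecting each new node to the lowest-cost nearby parent and rewiring neighbors, which is what yields asymptotic optimality.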

  3. Restful Implementation of Catalogue Service for Geospatial Data Provenance

    NASA Astrophysics Data System (ADS)

    Jiang, L. C.; Yue, P.; Lu, X. C.

    2013-10-01

    Provenance, also known as lineage, is important in understanding the derivation history of data products. Geospatial data provenance helps data consumers to evaluate the quality and reliability of geospatial data. In a service-oriented environment, where data are often consumed or produced by distributed services, provenance can be managed by following the same service-oriented paradigm. The Open Geospatial Consortium (OGC) Catalogue Service for the Web (CSW) is used for the registration and query of geospatial data provenance by extending the ebXML Registry Information Model (ebRIM). Recent advances in the REpresentational State Transfer (REST) paradigm have shown great promise for the easy integration of distributed resources. RESTful Web services aim to provide a standard way for Web clients to communicate with servers based on REST principles. The existing approach to a provenance catalogue service can be improved by adopting a RESTful design. This paper presents the design and implementation of a catalogue service for geospatial data provenance following the RESTful architectural style. A middleware named REST Converter is added on top of the legacy catalogue service to support a RESTful style interface. The REST Converter is composed of a resource request dispatcher and six resource handlers. A prototype service is developed to demonstrate the applicability of the approach.
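    The dispatcher-plus-handlers design described for the REST Converter can be sketched as a simple routing table keyed by HTTP method and resource type. The handler names, status codes, and payloads below are hypothetical, not the paper's actual interface:

```python
# Map (HTTP method, resource type) to a handler, mirroring a
# resource request dispatcher that delegates to resource handlers.
def get_provenance(args):
    record_id = args[0] if args else None
    if record_id is None:
        return 400, "Missing provenance record id"
    return 200, {"id": record_id, "lineage": ["source_image", "orthorectification"]}

def register_provenance(args):
    return 201, "Provenance record registered"

HANDLERS = {
    ("GET", "provenance"): get_provenance,
    ("POST", "provenance"): register_provenance,
}

def dispatch(method, path):
    """Parse the resource path and delegate to the matching handler."""
    parts = [p for p in path.strip("/").split("/") if p]
    key = (method, parts[0]) if parts else None
    if key not in HANDLERS:
        return 404, "Not Found"
    return HANDLERS[key](parts[1:])
```

A real converter would translate these handler results into CSW transactions against the legacy catalogue service rather than returning them directly.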

  4. Role-playing simulation as an educational tool for health care personnel: developing an embedded assessment framework.

    PubMed

    Libin, Alexander; Lauderdale, Manon; Millo, Yuri; Shamloo, Christine; Spencer, Rachel; Green, Brad; Donnellan, Joyce; Wellesley, Christine; Groah, Suzanne

    2010-04-01

    Simulation- and video game-based role-playing techniques have been proven effective in changing behavior and enhancing positive decision making in a variety of professional settings, including education, the military, and health care. Although the need for developing assessment frameworks for learning outcomes has been clearly defined, there is a significant gap between the variety of existing multimedia-based instruction and technology-mediated learning systems and the number of reliable assessment algorithms. This study, based on a mixed methodology research design, aims to develop an embedded assessment algorithm, a Knowledge Assessment Module (NOTE), to capture both user interaction with the educational tool and knowledge gained from the training. The study is regarded as the first step in developing an assessment framework for a multimedia educational tool for health care professionals, Anatomy of Care (AOC), that utilizes Virtual Experience Immersive Learning Simulation (VEILS) technology. Ninety health care personnel of various backgrounds took part in online AOC training, choosing from five possible scenarios presenting difficult situations of everyday care. The results suggest that although the simulation-based training tool demonstrated partial effectiveness in improving learners' decision-making capacity, a differential learner-oriented approach might be more effective and capable of synchronizing educational efforts with identifiable relevant individual factors such as sociobehavioral profile and professional background.

  5. Validation of highly reliable, real-time knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1988-01-01

    Knowledge-based systems have the potential to greatly increase the capabilities of future aircraft and spacecraft and to significantly reduce support manpower needed for the space station and other space missions. However, a credible validation methodology must be developed before knowledge-based systems can be used for life- or mission-critical applications. Experience with conventional software has shown that the use of good software engineering techniques and static analysis tools can greatly reduce the time needed for testing and simulation of a system. Since exhaustive testing is infeasible, reliability must be built into the software during the design and implementation phases. Unfortunately, many of the software engineering techniques and tools used for conventional software are of little use in the development of knowledge-based systems. Therefore, research at Langley is focused on developing a set of guidelines, methods, and prototype validation tools for building highly reliable, knowledge-based systems. The use of a comprehensive methodology for building highly reliable, knowledge-based systems should significantly decrease the time needed for testing and simulation. A proven record of delivering reliable systems at the beginning of the highly visible testing and simulation phases is crucial to the acceptance of knowledge-based systems in critical applications.

  6. Targeted versus statistical approaches to selecting parameters for modelling sediment provenance

    NASA Astrophysics Data System (ADS)

    Laceby, J. Patrick

    2017-04-01

    One effective field-based approach to modelling sediment provenance is the source fingerprinting technique. Arguably, one of the most important steps in this approach is selecting the appropriate suite of parameters, or fingerprints, used to model source contributions. Accordingly, approaches to selecting parameters for sediment source fingerprinting will be reviewed. Thereafter, opportunities and limitations of these approaches and some future research directions will be presented. For properties to be effective tracers of sediment, they must discriminate between sources whilst behaving conservatively. Conservative behavior is characterized by constancy in sediment properties: the properties of sediment sources remain constant, or at the very least, any variation in these properties occurs in a predictable and measurable way. Therefore, properties selected for sediment source fingerprinting should remain constant through sediment detachment, transportation and deposition processes, or vary in a predictable and measurable way. One approach to selecting conservative properties for sediment source fingerprinting is to identify targeted tracers, such as caesium-137, that provide specific source information (e.g. surface versus subsurface origins). A second approach is to use statistical tests to select an optimal suite of conservative properties capable of modelling sediment provenance. In general, statistical approaches use a combination of discrimination tests (e.g. the Kruskal-Wallis H-test or Mann-Whitney U-test) and parameter selection statistics (e.g. Discriminant Function Analysis or Principal Component Analysis). The challenge is that modelling sediment provenance is often not straightforward, and there is increasing debate in the literature surrounding the most appropriate approach to selecting elements for modelling. 
Moving forward, it would be beneficial for researchers to test their results with multiple modelling approaches, artificial mixtures, and multiple lines of evidence to provide secondary support for their initial modelling results. Indeed, element selection can greatly impact modelling results, and having multiple lines of evidence will help provide confidence when modelling sediment provenance.
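    The discrimination-test step described above can be sketched directly: compute a Kruskal-Wallis H statistic for each candidate tracer across the source groups and keep tracers whose H exceeds the critical value (5.99 for three groups, df = 2, alpha = 0.05). The tracer values below are synthetic, and this implementation omits the tie correction for simplicity:

```python
def ranks(values):
    """Average ranks (ties share the mean of their positions)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    out = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        mean_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            out[order[k]] = mean_rank
        i = j + 1
    return out

def kruskal_h(groups):
    """Kruskal-Wallis H statistic (no tie correction, for simplicity)."""
    flat = [v for g in groups for v in g]
    r = ranks(flat)
    n = len(flat)
    h, pos = 0.0, 0
    for g in groups:
        rsum = sum(r[pos:pos + len(g)])
        h += rsum * rsum / len(g)
        pos += len(g)
    return 12.0 / (n * (n + 1)) * h - 3 * (n + 1)

def select_tracers(tracer_groups, critical=5.99):
    """Keep tracers that discriminate between the source groups."""
    return [name for name, groups in tracer_groups.items()
            if kruskal_h(groups) > critical]

# Synthetic example: one tracer that separates three sources, one that does not.
tracers = {
    "Cs-137":  [[1.0, 2.0, 3.0], [11.0, 12.0, 13.0], [21.0, 22.0, 23.0]],
    "Total-C": [[5.0, 6.0, 7.0], [5.1, 6.1, 6.9], [4.9, 6.2, 7.1]],
}
```

In practice this screening step would be followed by a parameter selection statistic (e.g. stepwise Discriminant Function Analysis) to reduce the passing tracers to an optimal suite.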

  7. Principles of cancer prevention.

    PubMed

    Meyskens, Frank L; Tully, Patricia

    2005-11-01

    To summarize the scientific principles underlying cancer prevention. Articles, textbooks, personal communications, and experience. The scientific basis of cancer prevention is complex and involves experimental and epidemiologic approaches and clinical trials. As more information becomes available regarding proven and potential cancer-prevention strategies, oncology nurses are regularly called upon to guide patients and others in making choices regarding preventive options. It is important for oncology nurses to stay abreast of this growing body of knowledge.

  8. Prevention and mental illness: a new era for a healthier tomorrow.

    PubMed

    Buck, Steven

    2010-07-01

    The Department of Mental Health and Substance Abuse Services strives to provide the best possible care for Oklahoma communities through preventative programs and approaches such as QPR, Mental Health First Aid and mental health screenings. All of these approaches have proven effective in building knowledge of risk factors for mental health disorders in Oklahoma communities and in helping to prevent those predisposed to mental illness from experiencing an onset of the disorder.

  9. Natural Gas Value-Chain and Network Assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kobos, Peter H.; Outkin, Alexander V.; Beyeler, Walter E.

    2015-09-01

    The current expansion of natural gas (NG) development in the United States requires an understanding of how this change will affect the natural gas industry, downstream consumers, and economic growth in order to promote effective planning and policy development. The impact of this expansion may propagate through the NG system and US economy via changes in manufacturing, electric power generation, transportation, commerce, and increased exports of liquefied natural gas. We conceptualize this problem as supply shock propagation that pushes the NG system and the economy away from the current state of infrastructure development and level of natural gas use. To illustrate this, the project developed two core modeling approaches. The first is an Agent-Based Modeling (ABM) approach which addresses shock propagation throughout the existing natural gas distribution system. The second approach uses a System Dynamics-based model to illustrate the feedback mechanisms related to finding new supplies of natural gas - notably shale gas - and how those mechanisms affect exploration investments in the natural gas market with respect to proven reserves. The ABM illustrates several stylized scenarios of large liquefied natural gas (LNG) exports from the U.S. The preliminary ABM results demonstrate that such a scenario is likely to have substantial effects on NG prices and on pipeline capacity utilization. Our preliminary results indicate that the price of natural gas in the U.S. may rise by about 50% when LNG exports represent 15% of system-wide demand. The main findings of the System Dynamics model indicate that proven reserves for coalbed methane, conventional gas and now shale gas can be adequately modeled based on a combination of geologic, economic and technology-based variables. A base case scenario matches historical proven reserves data for these three types of natural gas.
An environmental scenario, based on implementing a $50/tonne CO2 tax, results in fewer proven reserves being developed in the coming years, while demand may decrease in the absence of acceptable substitutes, incentives or changes in consumer behavior. An increase in demand of 25% increases the proven reserves developed by only a very small amount by the end of the forecast period of 2025.
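The System Dynamics feedback described in the record (production draws reserves down, scarcity raises price, and higher price spurs exploration that books new proven reserves) can be sketched as a toy stock-flow loop. All parameters and functional forms below are invented for illustration; they are not the report's calibration.

```python
def simulate_reserves(years=10, reserves=100.0, demand=10.0,
                      base_price=3.0, price_elasticity=0.5,
                      discovery_response=2.0):
    """Toy stock-flow sketch of proven reserves under a
    price-exploration feedback (illustrative parameters only)."""
    history = []
    for _ in range(years):
        # Scarcity proxy: demand relative to a reserve-to-production norm.
        scarcity = demand * 10.0 / max(reserves, 1e-9)
        price = base_price * (1.0 + price_elasticity * (scarcity - 1.0))
        # Exploration investment responds to price above the base level.
        discoveries = discovery_response * max(price - base_price, 0.0)
        production = min(demand, reserves)
        reserves += discoveries - production
        history.append(reserves)
    return history

trajectory = simulate_reserves()
```

With these assumed parameters, reserves decline toward an equilibrium where price-driven discoveries balance production, mirroring the qualitative behavior the report attributes to its base case.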

  10. Systematic reviews and knowledge translation.

    PubMed Central

    Tugwell, Peter; Robinson, Vivian; Grimshaw, Jeremy; Santesso, Nancy

    2006-01-01

    Proven effective interventions exist that would enable all countries to meet the Millennium Development Goals. However, uptake and use of these interventions in the poorest populations is at least 50% less than in the richest populations within each country. Also, we have recently shown that community effectiveness of interventions is lower for the poorest populations due to a "staircase" effect of lower coverage/access, worse diagnostic accuracy, less provider compliance and less consumer adherence. We propose an evidence-based framework for equity-oriented knowledge translation to enhance community effectiveness and health equity. This framework is represented as a cascade of steps to assess and prioritize barriers and thus choose effective knowledge translation interventions that are tailored for relevant audiences (public, patient, practitioner, policy-maker, press and private sector), as well as the evaluation, monitoring and sharing of these strategies. We have used two examples of effective interventions (insecticide-treated bednets to prevent malaria and childhood immunization) to illustrate how this framework can provide a systematic method for decision-makers to ensure the application of evidence-based knowledge in disadvantaged populations. Future work to empirically validate and evaluate the usefulness of this framework is needed. We invite researchers and implementers to use the cascade for equity-oriented knowledge translation as a guide when planning implementation strategies for proven effective interventions. We also encourage policy-makers and health-care managers to use this framework when deciding how effective interventions can be implemented in their own settings. PMID:16917652

  11. Systematic reviews and knowledge translation.

    PubMed

    Tugwell, Peter; Robinson, Vivian; Grimshaw, Jeremy; Santesso, Nancy

    2006-08-01

    Proven effective interventions exist that would enable all countries to meet the Millennium Development Goals. However, uptake and use of these interventions in the poorest populations is at least 50% less than in the richest populations within each country. Also, we have recently shown that community effectiveness of interventions is lower for the poorest populations due to a "staircase" effect of lower coverage/access, worse diagnostic accuracy, less provider compliance and less consumer adherence. We propose an evidence-based framework for equity-oriented knowledge translation to enhance community effectiveness and health equity. This framework is represented as a cascade of steps to assess and prioritize barriers and thus choose effective knowledge translation interventions that are tailored for relevant audiences (public, patient, practitioner, policy-maker, press and private sector), as well as the evaluation, monitoring and sharing of these strategies. We have used two examples of effective interventions (insecticide-treated bednets to prevent malaria and childhood immunization) to illustrate how this framework can provide a systematic method for decision-makers to ensure the application of evidence-based knowledge in disadvantaged populations. Future work to empirically validate and evaluate the usefulness of this framework is needed. We invite researchers and implementers to use the cascade for equity-oriented knowledge translation as a guide when planning implementation strategies for proven effective interventions. We also encourage policy-makers and health-care managers to use this framework when deciding how effective interventions can be implemented in their own settings.

  12. Biomechanical differences in the stem straightening process among Pinus pinaster provenances. A new approach for early selection of stem straightness.

    PubMed

    Sierra-de-Grado, Rosario; Pando, Valentín; Martínez-Zurimendi, Pablo; Peñalvo, Alejandro; Báscones, Esther; Moulia, Bruno

    2008-06-01

    Stem straightness is an important selection trait in Pinus pinaster Ait. breeding programs. Despite the stability of stem straightness rankings in provenance trials, the efficiency of breeding programs based on a quantitative index of stem straightness remains low. An alternative approach is to analyze biomechanical processes that underlie stem form. The rationale for this selection method is that genetic differences in the biomechanical processes that maintain stem straightness in young plants will continue to control stem form throughout the life of the tree. We analyzed the components contributing most to genetic differences among provenances in stem straightening processes by kinetic analysis and with a biomechanical model defining the interactions between the variables involved (Fournier's model). This framework was tested on three P. pinaster provenances differing in adult stem straightness and growth. One-year-old plants were tilted at 45 degrees, and individual stem positions and sizes were recorded weekly for 5 months. We measured the radial extension of reaction wood and the anatomical features of wood cells in serial stem cross sections. The integral effect of reaction wood on stem leaning was computed with Fournier's model. Responses driven by both primary and secondary growth were involved in the stem straightening process, but secondary-growth-driven responses accounted for most differences among provenances. Plants from the straight-stemmed provenance showed a greater capacity for stem straightening than plants from the sinuous provenances mainly because of (1) more efficient reaction wood (higher maturation strains) and (2) more pronounced secondary-growth-driven autotropic decurving. These two process-based traits are thus good candidates for early selection of stem straightness, but additional tests on a greater number of genotypes over a longer period are required.

  13. BioFed: federated query processing over life sciences linked open data.

    PubMed

    Hasnain, Ali; Mehmood, Qaiser; Sana E Zainab, Syeda; Saleem, Muhammad; Warren, Claude; Zehra, Durre; Decker, Stefan; Rebholz-Schuhmann, Dietrich

    2017-03-15

    Biomedical data, e.g. from knowledge bases and ontologies, is increasingly made available following linked open data principles, at best as RDF triple data. This is a necessary step towards unified access to biological data sets, but it still requires solutions for querying multiple endpoints for their heterogeneous data to eventually retrieve all the meaningful information. Suggested solutions are based on query federation approaches, which require the submission of SPARQL queries to endpoints. Due to the size and complexity of available data, these solutions have to be optimised for efficient retrieval times and for users in life sciences research. Last but not least, the reliability of data resources, in terms of access and quality, has to be monitored over time. Our solution (BioFed) federates data over 130 SPARQL endpoints in life sciences and tailors query submission according to the provenance information. BioFed has been evaluated against the state-of-the-art solution FedX and forms an important benchmark for the life science domain. The efficient cataloguing approach of the federated query processing system 'BioFed', the triple-pattern-wise source selection and the semantic source normalisation form the core of our solution. It gathers and integrates data from newly identified public endpoints for federated access. Basic provenance information is linked to the retrieved data. Finally, BioFed makes use of the latest SPARQL standard (i.e., 1.1) to leverage the full benefits of query federation. The evaluation is based on 10 simple and 10 complex queries, which address data in 10 major and very popular data sources (e.g., DrugBank, SIDER). BioFed is a solution for a single point of access to a large number of SPARQL endpoints providing life science data. It facilitates efficient query generation for data access and provides basic provenance information in combination with the retrieved data.
BioFed fully supports SPARQL 1.1 and gives access to endpoints' availability based on the EndpointData graph. Our evaluation of BioFed against FedX is based on 20 heterogeneous federated SPARQL queries and shows competitive execution performance in comparison to FedX, which can be attributed to the provision of provenance information for the source selection. Developing and testing federated query engines for life sciences data is still a challenging task. According to our findings, it is advantageous to optimise the source selection. The cataloguing of SPARQL endpoints, including type and property indexing, leads to efficient querying of data resources over the Web of Data. This could be improved even further through the use of ontologies, e.g., for abstract normalisation of query terms.
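The triple-pattern-wise source selection at the core of such federated engines can be illustrated with a simplified sketch. This is not BioFed's actual implementation, and the endpoint names and predicate catalogue below are invented: each triple pattern is routed only to endpoints whose indexed predicates can match it, so irrelevant endpoints are never queried.

```python
def select_sources(triple_patterns, catalogue):
    """Route each (subject, predicate, object) pattern to the
    endpoints whose predicate index can match it; a pattern with
    an unbound predicate ('?p') must go to every endpoint."""
    routing = {}
    for s, p, o in triple_patterns:
        if p.startswith("?"):
            routing[(s, p, o)] = sorted(catalogue)
        else:
            routing[(s, p, o)] = sorted(ep for ep, predicates in catalogue.items()
                                        if p in predicates)
    return routing

# Hypothetical predicate catalogue built by indexing two endpoints.
catalogue = {"drugbank": {"dc:title", "db:target"},
             "sider": {"dc:title", "sider:sideEffect"}}
routing = select_sources([("?d", "db:target", "?t"),
                          ("?d", "dc:title", "?name")], catalogue)
```

A real engine would build the catalogue by probing endpoints, then rewrite the query so each pattern is evaluated only at its selected sources.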

  14. A fuzzy-logic based decision-making approach for identification of groundwater quality based on groundwater quality indices.

    PubMed

    Vadiati, M; Asghari-Moghaddam, A; Nakhaei, M; Adamowski, J; Akbarzadeh, A H

    2016-12-15

    Due to inherent uncertainties in measurement and analysis, groundwater quality assessment is a difficult task. Artificial intelligence techniques, specifically fuzzy inference systems, have proven useful in evaluating groundwater quality in uncertain and complex hydrogeological systems. In the present study, a Mamdani fuzzy-logic-based decision-making approach was developed to assess groundwater quality based on relevant indices. To develop a set of new hybrid fuzzy indices for groundwater quality assessment, a Mamdani fuzzy inference model was built around widely accepted groundwater quality indices: the Groundwater Quality Index (GQI), the Water Quality Index (WQI), and the Ground Water Quality Index (GWQI). To keep the hybrid fuzzy indices generalizable, the well-known acceptability ranges of these indices were used as the fuzzy model's output ranges, rather than relying on expert knowledge to fuzzify the output parameters. The proposed approach was evaluated for its ability to assess the drinking water quality of 49 samples collected seasonally from groundwater resources in Iran's Sarab Plain during 2013-2014. Input membership functions were defined as "desirable", "acceptable" and "unacceptable" based on expert knowledge and the standard and permissible limits prescribed by the World Health Organization. Output data were categorized into multiple categories based on the GQI (5 categories), WQI (5 categories), and GWQI (3 categories). Given the potential of fuzzy models to minimize uncertainties, the hybrid fuzzy-based indices produce significantly more accurate assessments of groundwater quality than traditional indices. The developed models' accuracy was assessed, and a comparison of the performance indices demonstrated the Fuzzy Groundwater Quality Index model to be more accurate than both the Fuzzy Water Quality Index and Fuzzy Ground Water Quality Index models.
This suggests that the new hybrid fuzzy indices developed in this research are reliable and flexible when used in groundwater quality assessment for drinking purposes. Copyright © 2016 Elsevier Ltd. All rights reserved.
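The Mamdani inference underlying such hybrid indices can be sketched in a few lines. This is a generic single-input illustration, not the study's model: the membership ranges, the two rules, and the 0-100 output universe are all assumptions. Rule firing uses min, aggregation uses max, and defuzzification is a centroid over a sampled universe.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def mamdani(value, rules, universe):
    """One Mamdani step for a single crisp input: each rule's firing
    strength (input membership) clips its output set (min), clipped
    sets are aggregated (max), and the centroid over the sampled
    output universe gives the crisp score."""
    num = den = 0.0
    for y in universe:
        mu = max(min(in_mf(value), out_mf(y)) for in_mf, out_mf in rules)
        num += y * mu
        den += mu
    return num / den if den else None

# Illustrative rule base: low contamination -> good quality, high -> poor.
in_low = lambda x: tri(x, 0, 25, 50)
in_high = lambda x: tri(x, 50, 75, 100)
out_good = lambda y: tri(y, 60, 80, 100)
out_poor = lambda y: tri(y, 0, 20, 40)
RULES = [(in_low, out_good), (in_high, out_poor)]
score = mamdani(25, RULES, range(101))  # a clearly "low" input
```

A real model of this kind would use several inputs (one per water-quality parameter) and map the defuzzified score onto the published index acceptability ranges.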

  15. Using the Knowledge, Process, Practice (KPP) model for driving the design and development of online postgraduate medical education.

    PubMed

    Shaw, Tim; Barnet, Stewart; Mcgregor, Deborah; Avery, Jennifer

    2015-01-01

    Online learning is a primary delivery method for continuing health education programs. It is critical that programs have curricular objectives linked to educational models that support learning. Using a proven educational modelling process ensures that curricular objectives are met and that a solid basis for learning and assessment is achieved. The aim was to develop an educational design model that produces an educationally sound program development plan for use by anyone involved in online course development. We describe the development of a generic educational model designed for continuing health education programs. The Knowledge, Process, Practice (KPP) model is founded on recognised educational theory and online education practice. This paper presents a step-by-step guide to using the model for program development that ensures reliable learning and evaluation. The model supports a three-step approach, KPP, based on learning outcomes and supporting appropriate assessment activities. It provides a program structure for online or blended learning that is explicit, educationally defensible, and supports multiple assessment points for health professionals. The KPP model is based on best-practice educational design, using a structure that can be adapted for a variety of online or flexibly delivered postgraduate medical education programs.

  16. A novel approach for connecting temporal-ontologies with blood flow simulations.

    PubMed

    Weichert, F; Mertens, C; Walczak, L; Kern-Isberner, G; Wagner, M

    2013-06-01

    In this paper an approach for developing a temporal domain ontology for biomedical simulations is introduced. The ideas are presented in the context of simulations of blood flow in aneurysms using the Lattice Boltzmann Method. The advantages of using ontologies are manifold. On the one hand, ontologies have proven able to provide specialised medical knowledge, e.g. key parameters for simulations. On the other hand, based on a set of rules and the use of a reasoner, a system for checking the plausibility of medical simulations, as well as tracking their outcomes, can be constructed. Likewise, results of simulations, including data derived from them, can be stored and communicated in a way that can be understood by computers, and this set of results can later be analyzed. At the same time, the ontologies provide a way to exchange knowledge between researchers. Lastly, this approach can be seen as a black-box abstraction of the internals of the simulation for the biomedical researcher. The approach is able to provide the complete parameter sets for simulations, part of the corresponding results and part of their analysis, as well as, e.g., geometry and boundary conditions. These inputs can be transferred to different simulation methods for comparison, and variations on the provided parameters can automatically drive further simulations. Using a rule base, unphysical inputs or outputs of the simulation can be detected and communicated to the physician in a suitable and familiar way. An example of an instantiation of the blood flow simulation ontology and exemplary rules for plausibility checking are given. Copyright © 2013 Elsevier Inc. All rights reserved.
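The rule-based plausibility checking described above can be illustrated with a minimal sketch. The parameter names and bounds below are invented placeholders; in the paper this role is played by the ontology's rule base and a reasoner rather than hard-coded ranges.

```python
# Illustrative plausibility rules: parameter -> (lower, upper) bound.
BOUNDS = {"inlet_velocity_m_per_s": (0.1, 2.0),
          "blood_viscosity_pa_s": (0.003, 0.005)}

def check_plausibility(params, bounds=BOUNDS):
    """Flag simulation inputs that violate their plausibility rule,
    so implausible runs can be reported back to the physician."""
    return [name for name, value in params.items()
            if name in bounds and not bounds[name][0] <= value <= bounds[name][1]]

flags = check_plausibility({"inlet_velocity_m_per_s": 3.5,
                            "blood_viscosity_pa_s": 0.004})
```

Encoding such rules in an ontology rather than in code is what lets a reasoner apply them uniformly to inputs, outputs and derived data.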

  17. Inquiry-Based Approach to a Carbohydrate Analysis Experiment

    NASA Astrophysics Data System (ADS)

    Senkbeil, Edward G.

    1999-01-01

    The analysis of an unknown carbohydrate in an inquiry-based learning format has proven to be a valuable and interesting undergraduate biochemistry laboratory experiment. Students are given a list of carbohydrates and a list of references for carbohydrate analysis. The references contain a variety of well-characterized wet chemistry and instrumental techniques for carbohydrate identification, but the students must develop an appropriate sequential protocol for unknown identification. The students are required to provide a list of chemicals and procedures and a flow chart for identification before the lab. During the 3-hour laboratory period, they utilize their accumulated information and knowledge to classify and identify their unknown. Advantages of the inquiry-based format are (i) students must be well prepared in advance to be successful in the laboratory, (ii) students feel a sense of accomplishment in both designing and carrying out a successful experiment, and (iii) the carbohydrate background information digested by the students significantly decreases the amount of lecture time required for this topic.

  18. Adult Bronchoscopy Training

    PubMed Central

    Wahidi, Momen M.; Read, Charles A.; Buckley, John D.; Addrizzo-Harris, Doreen J.; Shah, Pallav L.; Herth, Felix J. F.; de Hoyos Parra, Alberto; Ornelas, Joseph; Yarmus, Lonny; Silvestri, Gerard A.

    2015-01-01

    BACKGROUND: The determination of competency of trainees in programs performing bronchoscopy is quite variable. Some programs provide didactic lectures with hands-on supervision, other programs incorporate advanced simulation centers, whereas others have a checklist approach. Although no single method has been proven best, the variability alone suggests that outcomes are variable. Program directors and certifying bodies need guidance to create standards for training programs. Little well-developed literature on the topic exists. METHODS: To provide credible and trustworthy guidance, rigorous methodology has been applied to create this bronchoscopy consensus training statement. All panelists were vetted and approved by the CHEST Guidelines Oversight Committee. Each topic group drafted questions in a PICO (population, intervention, comparator, outcome) format. MEDLINE data through PubMed and the Cochrane Library were systematically searched. Manual searches also supplemented the searches. All gathered references were screened for consideration based on inclusion criteria, and all statements were designated as an Ungraded Consensus-Based Statement. RESULTS: We suggest that professional societies move from a volume-based certification system to skill acquisition and knowledge-based competency assessment for trainees. Bronchoscopy training programs should incorporate multiple tools, including simulation. We suggest that ongoing quality and process improvement systems be introduced and that certifying agencies move from a volume-based certification system to skill acquisition and knowledge-based competency assessment for trainees. We also suggest that assessment of skill maintenance and improvement in practice be evaluated regularly with ongoing quality and process improvement systems after initial skill acquisition. CONCLUSIONS: The current methods used for bronchoscopy competency in training programs are variable. 
We suggest that professional societies and certifying agencies move from a volume-based certification system to a standardized skill acquisition and knowledge-based competency assessment for pulmonary and thoracic surgery trainees. PMID:25674901

  19. Novel scenarios of early animal evolution--is it time to rewrite textbooks?

    PubMed

    Dohrmann, Martin; Wörheide, Gert

    2013-09-01

    Understanding how important phenotypic, developmental, and genomic features of animals originated and evolved is essential for many fields of biological research, but such understanding depends on robust hypotheses about the phylogenetic interrelationships of the higher taxa to which the studied species belong. Molecular approaches to phylogenetics have proven able to revolutionize our knowledge of organismal evolution. However, with respect to the deepest splits in the metazoan Tree of Life, namely the relationships between Bilateria and the four non-bilaterian phyla (Porifera, Placozoa, Ctenophora, and Cnidaria), no consensus has been reached yet, since a number of different, often contradictory, hypotheses with sometimes spectacular implications have been proposed in recent years. Here, we review the recent literature on the topic and contrast it with more classical perceptions based on analyses of morphological characters. We conclude that the time is not yet ripe to rewrite zoological textbooks and advocate a conservative approach when it comes to developing scenarios of the early evolution of animals.

  20. Gradient Dynamics and Entropy Production Maximization

    NASA Astrophysics Data System (ADS)

    Janečka, Adam; Pavelka, Michal

    2018-01-01

    We compare two methods for modeling dissipative processes, namely gradient dynamics and entropy production maximization. Both methods require similar physical inputs: how energy (or entropy) is stored and how it is dissipated. Gradient dynamics describes irreversible evolution by means of a dissipation potential and entropy; it automatically satisfies Onsager reciprocal relations as well as their nonlinear generalization (Maxwell-Onsager relations), and it has a statistical interpretation. Entropy production maximization is based on knowledge of free energy (or another thermodynamic potential) and entropy production. It also leads to the linear Onsager reciprocal relations and has proven successful in the thermodynamics of complex materials. Both methods are thermodynamically sound as they ensure approach to equilibrium. We compare them and discuss their advantages and shortcomings. In particular, conditions under which the two approaches coincide and are capable of providing the same constitutive relations are identified. In addition, a commonly used but rarely mentioned step in entropy production maximization is pinpointed, and the condition of incompressibility is incorporated into gradient dynamics.
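The link between gradient dynamics and the linear Onsager relations mentioned in the abstract can be made explicit. The following is a standard formulation in common notation, not reproduced from the paper: gradient dynamics generates the irreversible evolution of state variables $x_i$ from a dissipation potential $\Xi$ evaluated at the conjugate variables $x^{*}$, which are set to the entropy gradient.

```latex
% gradient dynamics: evolution generated by a dissipation potential \Xi
\dot{x}_i = \left.\frac{\partial \Xi}{\partial x_i^{*}}\right|_{x^{*}=\frac{\partial S}{\partial x}},
\qquad
\Xi(x^{*}) = \tfrac{1}{2}\sum_{i,j} L_{ij}\, x_i^{*} x_j^{*},
\quad L_{ij}=L_{ji}
% for this quadratic choice the evolution reduces to the
% linear Onsager reciprocal relations:
\dot{x}_i = \sum_j L_{ij}\,\frac{\partial S}{\partial x_j}
```

A convex but non-quadratic $\Xi$ yields the nonlinear generalizations referred to in the abstract while still guaranteeing non-negative entropy production.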

  1. Active Provenance in Data-intensive Research

    NASA Astrophysics Data System (ADS)

    Spinuso, Alessandro; Mihajlovski, Andrej; Filgueira, Rosa; Atkinson, Malcolm

    2017-04-01

    Scientific communities are building platforms where data-intensive workflows are crucial to conducting their research campaigns. However, managing and effectively supporting the understanding of 'live' processes, and fostering computational steering and the sharing and re-use of data and methods, face several bottlenecks. These are often caused by the poor documentation of the methods and the data, and of how users interact with them. This work explores how, in such systems, flexibility in the management of provenance and its adaptation to different users and application contexts can lead to new opportunities for its exploitation, improving productivity. In particular, this work illustrates a conceptual and technical framework enabling tunable and actionable provenance in data-intensive workflow systems in support of reproducible science. It introduces the concept of Agile data-intensive systems to define the characteristics of our target platform. It shows a novel approach to the integration of provenance mechanisms, offering flexibility in the scale and in the precision of the provenance data collected, ensuring its relevance to the domain of the data-intensive task and fostering its rapid exploitation. The contributions address the scale of the provenance records and their usability and active role in the research life-cycle. We discuss the use of dynamically generated provenance types as the approach for integrating provenance mechanisms into a data-intensive workflow system. Enabling provenance can be transparent to the workflow user and developer, as well as fully controllable and customisable, depending on their expertise and the application's reproducibility, monitoring and validation requirements. The API that allows the realisation and adoption of a provenance type is presented, especially as concerns the support of provenance profiling, contextualisation and precision.
An actionable approach to provenance management is also discussed, enabling provenance-driven operations at runtime, regardless of the enactment technologies and connectivity impediments. We propose a framework based on concepts such as provenance clusters and provenance sensors, envisaging new potential for exploiting large quantities of provenance traces at runtime. Finally, the work introduces how the underlying provenance model can be explored with big-data visualization techniques, aiming to produce comprehensive and interactive views on top of large and heterogeneous provenance data. We demonstrate the adoption of alternative visualisation methods, from detailed and localised interactive graphs to radial views, serving different purposes and levels of expertise. Combining provenance types, selective rules and extensible metadata with reactive clustering opens a new and more versatile role for lineage information in the research life-cycle, thanks to its improved usability. The flexible profiling of the proposed framework aids the human analysis of the process, with the support of advanced and intuitive interactive graphical tools. The Active Provenance methods are discussed in the context of a real implementation for a data-intensive library (dispel4py) and its adoption within use cases for computational seismology, climate studies and generic correlation analysis.
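The idea of tunable provenance types with selective rules can be illustrated generically. The sketch below is not dispel4py's actual API; the class and method names are invented for illustration. A "type" bundles a selection rule that decides which lineage records a workflow component retains, which is one way to tune the scale and precision of what is collected.

```python
class ProvenanceType:
    """A provenance 'type' as a configurable capture policy:
    keep_if decides which lineage records are retained, so the
    same workflow can collect coarse or fine-grained traces."""
    def __init__(self, keep_if):
        self.keep_if = keep_if
        self.records = []

    def capture(self, record):
        # Selective rule: only relevant records are stored.
        if self.keep_if(record):
            self.records.append(record)
        return record  # pass the record through unchanged

# A type that keeps only failed operations or large intermediate data.
detailed = ProvenanceType(lambda r: r.get("status") == "error"
                          or r.get("bytes", 0) > 1_000_000)
detailed.capture({"op": "read", "status": "ok", "bytes": 512})
detailed.capture({"op": "correlate", "status": "error"})
```

Swapping the rule, rather than the workflow, is what makes the capture transparent to the user while remaining fully customisable.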

  2. A rights-based approach to science literacy using local languages: Contextualising inquiry-based learning in Africa

    NASA Astrophysics Data System (ADS)

    Babaci-Wilhite, Zehlia

    2017-06-01

    This article addresses the importance of teaching and learning science in local languages. The author argues that acknowledging local knowledge and using local languages in science education while emphasising inquiry-based learning improve teaching and learning science. She frames her arguments with the theory of inquiry, which draws on perspectives of both dominant and non-dominant cultures with a focus on science literacy as a human right. She first examines key assumptions about knowledge which inform mainstream educational research and practice. She then argues for an emphasis on contextualised learning as a right in education. This means accounting for contextualised knowledge and resisting the current trend towards de-contextualisation of curricula. This trend is reflected in Zanzibar's recent curriculum reform, in which English replaced Kiswahili as the language of instruction (LOI) in the last two years of primary school. The author's own research during the initial stage of the change (2010-2015) revealed that the effect has in fact proven to be counterproductive, with educational quality deteriorating further rather than improving. Arguing that language is essential to inquiry-based learning, she introduces a new didactic model which integrates alternative assumptions about the value of local knowledge and local languages in the teaching and learning of science subjects. In practical terms, the model is designed to address key science concepts through multiple modalities - "do it, say it, read it, write it" - a "hands-on" experiential combination which, she posits, may form a new platform for innovation based on a unique mix of local and global knowledge, and facilitate genuine science literacy. She provides examples from cutting-edge educational research and practice that illustrate this new model of teaching and learning science. 
This model has the potential to improve learning while supporting local languages and culture, giving local languages their rightful place in all aspects of education.

  3. Distance learning education for mitigation/adaptation policy: a case study

    NASA Astrophysics Data System (ADS)

    Slini, T.; Giama, E.; Papadopoulou, Ch.-O.

    2016-02-01

    The efficient training of young environmental scientists has proven to be a challenging goal in recent years, while several dynamic initiatives have been developed that aim to provide complete and consistent education. A successful example is the e-learning course 'Development of mitigation/adaptation policy portfolios', aimed mainly at participants from emerging-economy countries and organised in the frame of the project Promitheas4: Knowledge transfer and research needs for preparing mitigation/adaptation policy portfolios. The course aimed to provide knowledge transfer and to build new skills and competencies using modern didactic approaches and learning technologies. The present paper addresses the experience and the results of these actions, which seem promising and encouraging and were broadly welcomed by the participants.

  4. Provenance in Data Interoperability for Multi-Sensor Intercomparison

    NASA Technical Reports Server (NTRS)

    Lynnes, Chris; Leptoukh, Greg; Berrick, Steve; Shen, Suhung; Prados, Ana; Fox, Peter; Yang, Wenli; Min, Min; Holloway, Dan; Enloe, Yonsook

    2008-01-01

    As our inventory of Earth science data sets grows, the ability to compare, merge and fuse multiple datasets grows in importance. This requires a deeper data interoperability than we have now. Efforts such as Open Geospatial Consortium and OPeNDAP (Open-source Project for a Network Data Access Protocol) have broken down format barriers to interoperability; the next challenge is the semantic aspects of the data. Consider the issues when satellite data are merged, cross-calibrated, validated, inter-compared and fused. We must match up data sets that are related, yet different in significant ways: the phenomenon being measured, measurement technique, location in space-time or quality of the measurements. If subtle distinctions between similar measurements are not clear to the user, results can be meaningless or lead to an incorrect interpretation of the data. Most of these distinctions trace to how the data came to be: sensors, processing and quality assessment. For example, monthly averages of satellite-based aerosol measurements often show significant discrepancies, which might be due to differences in spatio- temporal aggregation, sampling issues, sensor biases, algorithm differences or calibration issues. Provenance information must be captured in a semantic framework that allows data inter-use tools to incorporate it and aid in the intervention of comparison or merged products. Semantic web technology allows us to encode our knowledge of measurement characteristics, phenomena measured, space-time representation, and data quality attributes in a well-structured, machine-readable ontology and rulesets. An analysis tool can use this knowledge to show users the provenance-related distrintions between two variables, advising on options for further data processing and analysis. An additional problem for workflows distributed across heterogeneous systems is retrieval and transport of provenance. 
Provenance may be either embedded within the data payload, or transmitted from server to client via an out-of-band mechanism. The out-of-band mechanism is more flexible in the richness of provenance information that can be accommodated, but it relies on a persistent framework and can be difficult for legacy clients to use. We are prototyping the embedded model, incorporating provenance within metadata objects in the data payload. Thus, it always remains with the data. The downside is a limit to the size of provenance metadata that we can include, an issue that will eventually need resolution to encompass the richness of provenance information required for data intercomparison and merging.
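    The embedded model described above can be sketched in a few lines, assuming a JSON-like payload. The helper `attach_provenance` and the metadata fields (`provenance`, `data_digest`) are hypothetical names chosen for illustration, not the prototype's actual schema:

```python
import hashlib
import json
from datetime import datetime, timezone

def attach_provenance(payload: dict, process: str, inputs: list, params: dict) -> dict:
    """Embed a compact provenance record inside the data payload's own metadata."""
    record = {
        "process": process,          # name of the processing step
        "inputs": inputs,            # identifiers of source datasets
        "parameters": params,        # algorithm settings used
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    meta = payload.setdefault("metadata", {})
    meta.setdefault("provenance", []).append(record)  # append-only lineage chain
    # a checksum ties the lineage to the data values it describes
    meta["data_digest"] = hashlib.sha256(
        json.dumps(payload["data"], sort_keys=True).encode()
    ).hexdigest()
    return payload
```

    Because the lineage travels inside the payload, any client that can read the data can also read its history; the trade-off, as noted above, is the size limit on embedded metadata.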

  5. Estimating impacts of plantation forestry on plant biodiversity in southern Chile-a spatially explicit modelling approach.

    PubMed

    Braun, Andreas Christian; Koch, Barbara

    2016-10-01

    Monitoring the impacts of land-use practices is of particular importance with regard to biodiversity hotspots in developing countries. Here, conserving the high level of unique biodiversity is challenged by limited possibilities for data collection on site. Especially for such scenarios, assisting biodiversity assessments by remote sensing has proven useful. Remote sensing techniques can be applied to interpolate between biodiversity assessments taken in situ. Through this approach, estimates of biodiversity for entire landscapes can be produced, relating land-use intensity to biodiversity conditions. Such maps are a valuable basis for developing biodiversity conservation plans. Several approaches have been published so far to interpolate local biodiversity assessments in remote sensing data. In the following, a new approach is proposed. Instead of inferring biodiversity using environmental variables or the variability of spectral values, a hypothesis-based approach is applied. Empirical knowledge about biodiversity in relation to land-use is formalized and applied as ascription rules for image data. The method is exemplified for a large study site (over 67,000 km²) in central Chile, where forest industry heavily impacts plant diversity. The proposed approach yields a coefficient of correlation of 0.73 and produces a convincing estimate of regional biodiversity. The framework is broad enough to be applied to other study sites.

  6. Bridging the provenance gap: opportunities and challenges tracking in and ex silico provenance in sUAS workflows

    NASA Astrophysics Data System (ADS)

    Thomer, A.

    2017-12-01

    Data provenance - the record of the varied processes that went into the creation of a dataset, as well as the relationships between resulting data objects - is necessary to support the reusability, reproducibility and reliability of earth science data. In sUAS-based research, capturing provenance can be particularly challenging because of the breadth and distributed nature of the many platforms used to collect, process and analyze data. In any given project, multiple drones, controllers, computers, software systems, sensors, cameras, imaging processing algorithms and data processing workflows are used over sometimes long periods of time. These platforms and processing result in dozens - if not hundreds - of data products in varying stages of readiness-for-analysis and sharing. Provenance tracking mechanisms are needed to make the relationships between these many data products explicit, and therefore more reusable and shareable. In this talk, I discuss opportunities and challenges in tracking provenance in sUAS-based research, and identify gaps in current workflow-capture technologies. I draw on prior work conducted as part of the IMLS-funded Site-Based Data Curation project, in which we developed methods of documenting in and ex silico (that is, computational and non-computational) workflows, and demonstrate this approach's applicability to research with sUASes. I conclude with a discussion of ontologies and other semantic technologies that have potential application in sUAS research.

  7. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    NASA Technical Reports Server (NTRS)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  8. Preliminary Climate Uncertainty Quantification Study on Model-Observation Test Beds at Earth Systems Grid Federation Repository

    NASA Astrophysics Data System (ADS)

    Lin, G.; Stephan, E.; Elsethagen, T.; Meng, D.; Riihimaki, L. D.; McFarlane, S. A.

    2012-12-01

    Uncertainty quantification (UQ) is the science of quantitative characterization and reduction of uncertainties in applications. It determines how likely certain outcomes are if some aspects of the system are not exactly known. UQ studies, such as those on atmosphere datasets, have greatly increased in size and complexity because they now comprise additional complex iterative steps, involve numerous simulation runs, and can include additional analytical products such as charts, reports, and visualizations to explain levels of uncertainty. These new requirements greatly expand the need for metadata support beyond the NetCDF convention and vocabulary, and as a result an additional formal data provenance ontology is required to provide a historical explanation of the origin of the dataset that includes references between the explanations and components within the dataset. This work shares a climate observation data UQ science use case and illustrates how to reduce climate observation data uncertainty and use a linked science application called Provenance Environment (ProvEn) to enable and facilitate scientific teams to publish, share, link, and discover knowledge about the UQ research results. UQ results include terascale datasets that are published to an Earth Systems Grid Federation (ESGF) repository. Uncertainty exists in observation data sets due to sensor data processing (such as time averaging), sensor failure in extreme weather conditions, sensor manufacturing error, and so on. To reduce the uncertainty in the observation data sets, a method based on Principal Component Analysis (PCA) was proposed to recover the missing values in observation data. Several large principal components (PCs) of data with missing values are computed based on available values using an iterative method.
The computed PCs can approximate the true PCs with high accuracy provided a condition on the missing values is met; the iterative method greatly improves the computational efficiency in computing PCs. Moreover, noise removal is done at the same time during the process of computing missing values, by using only the several largest PCs. The uncertainty quantification is done through statistical analysis of the distribution of different PCs. To record the above UQ process, and provide an explanation of the uncertainty before and after the UQ process on the observation data sets, an additional data provenance ontology, such as ProvEn, is necessary. In this study, we demonstrate how to reduce observation data uncertainty on climate model-observation test beds and use ProvEn to record the UQ process on ESGF. ProvEn demonstrates how a scientific team conducting UQ studies can discover dataset links using its domain knowledgebase, allowing them to better understand and convey the UQ study research objectives, the experimental protocol used, the resulting dataset lineage, related analytical findings, ancillary literature citations, along with the social network of scientists associated with the study. Climate scientists will not only benefit from understanding a particular dataset within a knowledge context, but also benefit from the cross reference of knowledge among the numerous UQ studies being stored in ESGF.

  9. Game-Based Assessment of Persistence

    ERIC Educational Resources Information Center

    DiCerbo, Kristen E.

    2014-01-01

    Interest in 21st century skills has brought concomitant interest in ways to teach and measure them. Games hold promise in these areas, but much of their potential has yet to be proven, and there are few examples of how to use the rich data from games to make inferences about players' knowledge, skills, and attributes. This article builds an…

  10. Is islet transplantation a realistic approach to curing diabetes?

    PubMed

    Jin, Sang-Man; Kim, Kwang-Won

    2017-01-01

    Since the report of type 1 diabetes reversal in seven consecutive patients by the Edmonton protocol in 2000, pancreatic islet transplantation has been reappraised based on accumulated clinical evidence. Although initially expected to therapeutically target long-term insulin independence, islet transplantation is now indicated for more specific clinical benefits. With the long-awaited report of the first phase 3 clinical trial in 2016, allogeneic islet transplantation is now transitioning from an experimental to a proven therapy for type 1 diabetes with problematic hypoglycemia. Islet autotransplantation has already been therapeutically proven in chronic pancreatitis with severe abdominal pain refractory to conventional treatments, and it holds promise for preventing diabetes after partial pancreatectomy due to benign pancreatic tumors. Based on current evidence, this review focuses on islet transplantation as a realistic approach to treating diabetes.

  11. Application of Lightweight Formal Methods to Software Security

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.; Powell, John D.; Bishop, Matt

    2005-01-01

    Formal specification and verification of security has proven a challenging task. There is no single method that has proven feasible. Instead, an integrated approach which combines several formal techniques can increase the confidence in the verification of software security properties. Such an approach, which specifies security properties in a library that can be reused, and two instruments and their methodologies developed for the National Aeronautics and Space Administration (NASA) at the Jet Propulsion Laboratory (JPL), are described herein. The Flexible Modeling Framework (FMF) is a model-based verification instrument that uses Promela and the SPIN model checker. The Property Based Tester (PBT) uses TASPEC and a Test Execution Monitor (TEM). They are used to reduce vulnerabilities and unwanted exposures in software during the development and maintenance life cycles.

  12. A Clinical Framework to Facilitate Risk Stratification When Considering an Active Surveillance Alternative to Immediate Biopsy and Surgery in Papillary Microcarcinoma.

    PubMed

    Brito, Juan P; Ito, Yasuhiro; Miyauchi, Akira; Tuttle, R Michael

    2016-01-01

    The 2015 American Thyroid Association thyroid cancer management guidelines endorse an active surveillance management approach as an alternative to immediate biopsy and surgery in subcentimeter thyroid nodules with highly suspicious ultrasonographic characteristics and in cytologically confirmed very low risk papillary thyroid cancer (PTC). However, the guidelines provide no specific recommendations with regard to the optimal selection of patients for an active surveillance management approach. This article describes a risk-stratified clinical decision-making framework that was developed by the thyroid cancer disease management team at Memorial Sloan Kettering Cancer Center as the lessons learned from Kuma Hospital in Japan were applied to a cohort of patients with probable or proven papillary microcarcinoma (PMC) who were being evaluated for an active surveillance management approach in the United States. A risk-stratified approach to the evaluation of patients with probable or proven PMC being considered for an active surveillance management approach requires an evaluation of three interrelated but distinct domains: (i) tumor/neck ultrasound characteristics (e.g., size of the primary tumor, the location of the tumor within the thyroid gland); (ii) patient characteristics (e.g., age, comorbidities, willingness to accept observation); and (iii) medical team characteristics (e.g., availability and experience of the multidisciplinary team). Based on an analysis of the critical factors within each of these domains, patients with probable or proven PTC can then be classified as ideal, appropriate, or inappropriate candidates for active surveillance. 
Risk stratification utilizing the proposed decision-making framework will improve the ability of clinicians to recognize individual patients with proven or probable PMC who are most likely to benefit from an active surveillance management option while at the same time identifying patients with proven or probable PMC that would be better served with an upfront biopsy and surgical management approach.

  13. Applying the Kanban method in problem-based project work: a case study in a manufacturing engineering bachelor's programme at Aalborg University Copenhagen

    NASA Astrophysics Data System (ADS)

    Balve, Patrick; Krüger, Volker; Tolstrup Sørensen, Lene

    2017-11-01

    Problem-based learning (PBL) has proven to be highly effective for educating students in an active and self-motivated manner in various disciplines. Student projects carried out following PBL principles are very dynamic and carry a high level of uncertainty, both conditions under which agile project management approaches are assumed to be highly supportive. The paper describes an empirical case study carried out at Aalborg University Copenhagen involving students from two different semesters of a Bachelor of Science programme. While executing the study, compelling examples of how PBL blends with the agile project management method Kanban were identified. A final survey reveals that applying Kanban produces noticeable improvements with respect to creating, assigning and coordinating project tasks. Other improvements were found in group communication, knowledge about work progress with regard to both the individual and the collective, and the students' way of continuously improving their own teamwork.

  14. Membranes in Lithium Ion Batteries

    PubMed Central

    Yang, Min; Hou, Junbo

    2012-01-01

    Lithium ion batteries have proven themselves the main choice of power sources for portable electronics. Besides consumer electronics, lithium ion batteries are also growing in popularity for military, electric vehicle, and aerospace applications. The present review attempts to summarize the knowledge about some selected membranes in lithium ion batteries. Based on the type of electrolyte used, literature concerning ceramic-glass and polymer solid ion conductors, microporous filter type separators and polymer gel based membranes is reviewed. PMID:24958286

  15. The development of a classification schema for arts-based approaches to knowledge translation.

    PubMed

    Archibald, Mandy M; Caine, Vera; Scott, Shannon D

    2014-10-01

    Arts-based approaches to knowledge translation are emerging as powerful interprofessional strategies with potential to facilitate evidence uptake, communication, knowledge, attitude, and behavior change across healthcare provider and consumer groups. These strategies are in the early stages of development. To date, no classification system for arts-based knowledge translation exists, which limits development and understandings of effectiveness in evidence syntheses. We developed a classification schema of arts-based knowledge translation strategies based on two mechanisms by which these approaches function: (a) the degree of precision in key message delivery, and (b) the degree of end-user participation. We demonstrate how this classification is necessary to explore how context, time, and location shape arts-based knowledge translation strategies. Classifying arts-based knowledge translation strategies according to their core attributes extends understandings of the appropriateness of these approaches for various healthcare settings and provider groups. The classification schema developed may enhance understanding of how, where, and for whom arts-based knowledge translation approaches are effective, and enable theorizing of essential knowledge translation constructs, such as the influence of context, time, and location on utilization strategies. The classification schema developed may encourage systematic inquiry into the effectiveness of these approaches in diverse interprofessional contexts. © 2014 Sigma Theta Tau International.

  16. Foundation: Transforming data bases into knowledge bases

    NASA Technical Reports Server (NTRS)

    Purves, R. B.; Carnes, James R.; Cutts, Dannie E.

    1987-01-01

    One approach to transforming information stored in relational data bases into knowledge based representations and back again is described. This system, called Foundation, allows knowledge bases to take advantage of vast amounts of pre-existing data. A benefit of this approach is inspection, and even population, of data bases through an intelligent knowledge-based front-end.
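    The abstract gives no implementation detail, but the core translation in both directions, relational tuples into assertional facts a knowledge base can reason over, and back, can be sketched roughly as follows. The helper names and the quadruple shape are illustrative assumptions, not Foundation's actual representation:

```python
import sqlite3

def rows_to_facts(conn, table):
    """Translate each row of a relational table into (table, key, attribute, value) facts."""
    cur = conn.execute(f"SELECT * FROM {table}")
    cols = [d[0] for d in cur.description]
    facts = []
    for row in cur:
        key = row[0]  # assume the first column identifies the entity
        for col, val in zip(cols[1:], row[1:]):
            facts.append((table, key, col, val))
    return facts

def facts_to_rows(facts):
    """Inverse direction: regroup facts by entity so they could repopulate the table."""
    rows = {}
    for table, key, col, val in facts:
        rows.setdefault((table, key), {})[col] = val
    return rows
```

    The round trip is what lets a knowledge-based front-end both inspect and populate the underlying database, as the abstract describes.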

  17. Integrating prediction, provenance, and optimization into high energy workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schram, M.; Bansal, V.; Friese, R. D.

    We propose a novel approach for efficient execution of workflows on distributed resources. The key components of this framework include: performance modeling to quantitatively predict workflow component behavior; optimization-based scheduling such as choosing an optimal subset of resources to meet demand and assignment of tasks to resources; distributed I/O optimizations such as prefetching; and provenance methods for collecting performance data. In preliminary results, these techniques improve throughput on a small Belle II workflow by 20%.

  18. Knowledge Engineering Approach to the Geotectonic Discourse

    NASA Astrophysics Data System (ADS)

    Pshenichny, Cyril

    2014-05-01

    The intellectual challenge of geotectonics is, and always was, much harder than that of most sciences: geotectonics has to say much when there is objectively not much to say. As the target of study (the genesis of regional and planetary geological structures) is vast and multidisciplinary, and is more or less generic for many geological disciplines, a more or less complete description of it is practically unachievable. Hence, the normal pathway of natural-scientific research - first acquire data, then draw conclusions - is unlikely to work here. Geotectonics does quite the opposite; its approach is purely abductive: first suggest a conceptualization (hypothesis) based on some external grounds (either general planetary/cosmic/philosophic/religious considerations, or experience gained from research on other structures/regions/planets), and then acquire data that either support or refute it. In fact, geotectonics defines the context for data acquisition, and hence the paradigm for the entire body of geology.
Being an obvious necessity for a descriptive science, this nevertheless creates a number of threats: • Like any people, scientists like simplicity and unity, and therefore a single geotectonic hypothesis may seem preferable once based on the data available at the moment, and suppress other views which may acquire evidence in the future; • As impartial data acquisition is rather a myth than reality even in most of the natural sciences, in a study like geology this process becomes strongly biased by the reigning hypothesis and controlled to supply only supportive evidence; • It becomes collectively agreed that any, or great many, domains of geological knowledge are determined by a geotectonic concept, which is, in turn, offered by a reigning hypothesis (sometimes reclassified as theory) - e.g., exploration geologists must invoke the global geotectonic terminology in their technical reports on assessment of mineral or hydrocarbon resources, and sessions and conferences are entitled like "Geochemical signatures of postcollisional magmas", thus assuming that the concept of collision (i) has been proven to reflect reality and (ii) surely has something to do with the geochemistry of rocks; tectonic terminology becomes a ubiquitous language with no warranty of its correctness and appropriateness to the case. These issues fall into the scope of the field defined as reasoning research in the geosciences (Pshenichny, 2002; 2003). One of its main tools is knowledge engineering (Feigenbaum, 1984). As has been suggested by Anokhin and Longhinos (2013), knowledge engineering, especially its rapidly evolving dynamic part, may offer remedies to handle the abovementioned problems.
The following solutions will be reported: • Development of an integrated geotectonic context and language shared by communities that follow contrasting geotectonic views; making concepts more or less inter-hypothesis; studying the "anatomy and physiology" of geotectonic hypotheses and fixing the points of concordance, compatibility and disagreement; computation of logical probabilities of the views given a number of hypotheses (Pshenichny, 2004); • Constructing ontologies, conceptual graphs and event bushes for data acquisition, to impartially define the semantics of data and data provenance in geology; • Building ensembles of event bushes for related domains of geological knowledge (petrology, volcanology, sedimentology and others) to track the actual influence of geotectonic concepts and views on geo-knowledge. Following these lines of research would create a better environment for the flourishing of scientific thought in geology and make it more efficient and responsive to its traditional tasks (impartial geological mapping, mineral and hydrocarbon exploration, geological education and knowledge transfer) and to the challenges of today, such as natural hazard assessment, sustainable regional development, and so forth. Moreover, this would make a significant contribution to the creation of a knowledge-based society, which is seen as one of the key priorities of Europe and the civilization in general.

  19. ProvenCare: Geisinger's Model for Care Transformation through Innovative Clinical Initiatives and Value Creation.

    PubMed

    2009-04-01

    Geisinger's system of care can be seen as a microcosm of the national delivery of healthcare, with implications for decision makers in other health plans. In this interview, Dr Ronald A. Paulus focuses on Geisinger's unique approach to patient care. At its core, this approach represents a system of quality and value initiatives based on 3 major programs-Proven Health Navigation (medical home); the ProvenCare model; and transitions of care. The goal of such an approach is to optimize disease management by using a rational reimbursement paradigm for appropriate interventions, providing innovative incentives, and engaging patients in their own care as part of any intervention. Dr Paulus explains the reasons why, unlike Geisinger, other stakeholders, including payers, providers, patients, and employers, have no intrinsic reasons to be concerned with quality and value initiatives. In addition, he says, an electronic infrastructure that could be modified as management paradigms evolve is a necessary tool to ensure the healthcare delivery system's ability to adapt to new clinical realities quickly to ensure the continuation of delivering best value for all stakeholders.

  20. [Why controlled studies may lead to misleading and unconfirmed therapeutic concepts--a critical view of evidence-based medicine].

    PubMed

    Flachskampf, F A

    2002-03-01

    The concept of evidence-based medicine has gathered widespread support during recent years. While this concept has clear merits in compiling and qualifying up-to-date information for clinical decisions, it should be viewed with caution as the sole valid knowledge source for clinical decision-making. The limitations of such an approach are particularly striking when reviewing two key developments in modern cardiology, fibrinolysis and acute percutaneous intervention in acute myocardial infarction. In both cases, early studies and meta-analyses showed no benefit for these therapeutic interventions over earlier treatment. Only after further refinement (mainly in dosage, time window, and concomitant heparin therapy for fibrinolysis, and the introduction of stents and IIb/IIIa inhibitors for acute intervention) did these therapies become universally acknowledged. It is therefore crucial to understand that, especially for physicians actively participating in the development of a clinical field, clinical decisions cannot be exclusively based on published evidence. Another important problem to consider is the time gap between the emergence of new therapies and their publication and reception by the medical audience, in particular in rapidly evolving fields such as cardiology. While it is clear that clinical decision-making must be backed by solid knowledge of the published evidence, the specialist involved in-depth in the field in particular may use not-yet-proven therapeutic concepts and measures to the patient's advantage.

  1. Intrusion Detection Systems with Live Knowledge System

    DTIC Science & Technology

    2016-05-31

    This work proposes a novel approach that uses Ripple-Down Rules (RDR) to maintain knowledge from human experts, combined with a knowledge base generated by Induct RDR, a machine-learning-based variant of RDR. The proposed Induct RDR approach is applied to build a phishing detection model, allowing phishing detection knowledge to be acquired.
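    The RDR mechanism the abstract relies on can be illustrated with a toy classifier: each rule carries an exception branch (consulted when the rule fires) and an alternative branch (consulted when it does not), so corrections are added as new exception rules rather than edits to existing ones. The node structure and the phishing-style conditions below are illustrative assumptions, not the paper's system:

```python
class RDRNode:
    """A single ripple-down rule: a condition, a conclusion, and two branches."""

    def __init__(self, condition, conclusion):
        self.condition = condition      # predicate over a case (a dict of features)
        self.conclusion = conclusion    # label returned if the rule fires
        self.except_branch = None       # tried when this rule fires (may override)
        self.else_branch = None         # tried when this rule does not fire

    def classify(self, case, default=None):
        if self.condition(case):
            # the rule fires, but a more specific exception may refine the verdict
            if self.except_branch is not None:
                refined = self.except_branch.classify(case, default=None)
                if refined is not None:
                    return refined
            return self.conclusion
        if self.else_branch is not None:
            return self.else_branch.classify(case, default)
        return default
```

    When an expert sees a misclassification, a new `RDRNode` is hung off the `except_branch` (or `else_branch`) of the rule that produced it, which is what makes RDR maintenance incremental.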

  2. RigFit: a new approach to superimposing ligand molecules.

    PubMed

    Lemmen, C; Hiller, C; Lengauer, T

    1998-09-01

    If structural knowledge of a receptor under consideration is lacking, drug design approaches focus on similarity or dissimilarity analysis of putative ligands. In this context the mutual ligand superposition is of utmost importance. Methods that are rapid enough to facilitate interactive usage, that allow processing of sets of conformers, and that enable database screening are of special interest here. The ability to superpose molecular fragments instead of entire molecules has proven to be helpful too. The RIGFIT approach meets these requirements and has several additional advantages. In three distinct test applications, we evaluated how closely we can approximate the observed relative orientation for a set of known crystal structures, we employed RIGFIT as a fragment placement procedure, and we performed a fragment-based database screening. The run time of RIGFIT can be traded off against its accuracy. To be competitive in accuracy with another state-of-the-art alignment tool, with which we compare our method explicitly, computing times of about 6 s per superposition on a typical workstation are required. If longer run times can be afforded, the accuracy increases significantly. RIGFIT is part of the flexible superposition software FLEXS, which can be accessed on the WWW [http://cartan.gmd.de/FlexS].

  3. TARGET: Rapid Capture of Process Knowledge

    NASA Technical Reports Server (NTRS)

    Ortiz, C. J.; Ly, H. V.; Saito, T.; Loftin, R. B.

    1993-01-01

    TARGET (Task Analysis/Rule Generation Tool) represents a new breed of tool that blends graphical process flow modeling capabilities with the function of a top-down reporting facility. Since NASA personnel frequently perform tasks that are primarily procedural in nature, TARGET models mission or task procedures and generates hierarchical reports as part of the process capture and analysis effort. Historically, capturing knowledge has proven to be one of the greatest barriers to the development of intelligent systems. Current practice generally requires lengthy interactions between the expert whose knowledge is to be captured and the knowledge engineer whose responsibility is to acquire and represent the expert's knowledge in a useful form. Although much research has been devoted to the development of methodologies and computer software to aid in the capture and representation of some types of knowledge, procedural knowledge has received relatively little attention. In essence, TARGET is one of the first tools of its kind, commercial or institutional, that is designed to support this type of knowledge capture undertaking. This paper will describe the design and development of TARGET for the acquisition and representation of procedural knowledge. The strategies employed by TARGET to support use by knowledge engineers, subject matter experts, programmers and managers will be discussed. This discussion includes the method by which the tool employs its graphical user interface to generate a task hierarchy report. Next, the approach to generating production rules for incorporation in and development of a CLIPS-based expert system will be elaborated. TARGET also permits experts to visually describe procedural tasks as a common medium for knowledge refinement by the expert community and knowledge engineer, making knowledge consensus possible. The paper briefly touches on the verification and validation issues facing the CLIPS rule generation aspects of TARGET.
A description of efforts to support TARGET's interoperability issues on PCs, Macintoshes and UNIX workstations concludes the paper.

  4. Security and Dependability Solutions for Web Services and Workflows

    NASA Astrophysics Data System (ADS)

    Kokolakis, Spyros; Rizomiliotis, Panagiotis; Benameur, Azzedine; Sinha, Smriti Kumar

    In this chapter we present an innovative approach towards the design and application of Security and Dependability (S&D) solutions for Web services and service-based workflows. Recently, several standards have been published that prescribe S&D solutions for Web services, e.g. OASIS WS-Security. However, the application of these solutions in specific contexts has proven problematic. We propose a new framework for the application of such solutions based on the SERENITY S&D Pattern concept. An S&D Pattern comprises all the necessary information for the implementation, verification, deployment, and active monitoring of an S&D Solution. Thus, system developers may rely on proven solutions that are dynamically deployed and monitored by the Serenity Runtime Framework. Finally, we further extend this approach to cover the case of executable workflows which are realised through the orchestration of Web services.

  5. Distributed fault-tolerant time-varying formation control for high-order linear multi-agent systems with actuator failures.

    PubMed

    Hua, Yongzhao; Dong, Xiwang; Li, Qingdong; Ren, Zhang

    2017-11-01

    This paper investigates the fault-tolerant time-varying formation control problems for high-order linear multi-agent systems in the presence of actuator failures. Firstly, a fully distributed formation control protocol is presented to compensate for the influences of both bias fault and loss of effectiveness fault. Using the adaptive online updating strategies, no global knowledge about the communication topology is required and the bounds of actuator failures can be unknown. Then an algorithm is proposed to determine the control parameters of the fault-tolerant formation protocol, where the time-varying formation feasible conditions and an approach to expand the feasible formation set are given. Furthermore, the stability of the proposed algorithm is proven based on the Lyapunov-like theory. Finally, two simulation examples are given to demonstrate the effectiveness of the theoretical results. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  6. Usage and Attitudes Towards Natural Remedies and Homeopathy in General Pediatrics

    PubMed Central

    Beer, André-Michael; Burlaka, Ievgeniia; Buskin, Stephen; Kamenov, Borislav; Pettenazzo, Andrea; Popova, Diana; Riveros Huckstadt, María Pilar; Sakalinskas, Virgilijus; Oberbaum, Menachem

    2016-01-01

    In order to better understand the global approach and country differences in physicians’ usage, knowledge, and attitudes towards natural remedies and homeopathy in pediatric practice, an online survey involving 582 general pediatricians and general practitioners treating pediatric diseases was conducted in 6 countries. Overall, 17% of the pediatric prescriptions refer to phytotherapy and 15% refer to homeopathic preparations. Natural remedies and homeopathic preparations are more frequently used in upper respiratory tract infections, infant colic, sleep disturbances, and recurrent infections. In the majority of cases, they are used together with chemical drugs. Both treatment options are typically used if parents are concerned about side effects of conventional drugs or prefer natural remedies for themselves. Physicians express high interest in natural remedies and homeopathy; however, their knowledge is variable. Lack of proven efficacy, knowledge on mechanism of action, and information on indications are main factors that limit their usage. PMID:27493983

  7. Artificial Intelligence (AI) Based Tactical Guidance for Fighter Aircraft

    NASA Technical Reports Server (NTRS)

    McManus, John W.; Goodrich, Kenneth H.

    1990-01-01

    A research program investigating the use of Artificial Intelligence (AI) techniques to aid in the development of a Tactical Decision Generator (TDG) for Within Visual Range (WVR) air combat engagements is discussed. The application of AI programming and problem solving methods in the development and implementation of the Computerized Logic For Air-to-Air Warfare Simulations (CLAWS), a second generation TDG, is presented. The Knowledge-Based Systems used by CLAWS to aid in the tactical decision-making process are outlined in detail, and the results of tests to evaluate the performance of CLAWS versus a baseline TDG developed in FORTRAN to run in real-time in the Langley Differential Maneuvering Simulator (DMS) are presented. To date, these test results have shown significant performance gains with respect to the TDG baseline in one-versus-one air combat engagements, and the AI-based TDG software has proven to be much easier to modify and maintain than the baseline FORTRAN TDG programs. Alternate computing environments and programming approaches, including the use of parallel algorithms and heterogeneous computer networks, are discussed, and the design and performance of a prototype concurrent TDG system are presented.

  8. Rasmussen's model of human behavior in laparoscopy training.

    PubMed

    Wentink, M; Stassen, L P S; Alwayn, I; Hosman, R J A W; Stassen, H G

    2003-08-01

    Compared to aviation, where virtual reality (VR) training has been standardized and simulators have proven their benefits, the objectives, needs, and means of VR training in minimally invasive surgery (MIS) still have to be established. The aim of the study presented is to introduce Rasmussen's model of human behavior as a practical framework for the definition of the training objectives, needs, and means in MIS. Rasmussen distinguishes three levels of human behavior: skill-, rule-, and knowledge-based behavior. The training needs of a laparoscopic novice can be determined by identifying the specific skill-, rule-, and knowledge-based behavior that is required for performing safe laparoscopy. Future objectives of VR laparoscopy trainers should address all three levels of behavior. Although most commercially available simulators for laparoscopy aim at training skill-based behavior, it is especially the training of knowledge-based behavior during surgical complications that will improve safety levels. However, the cost and complexity of a training means increase as the training objectives proceed from the training of skill-based behavior to the training of complex knowledge-based behavior. In aviation, human behavior models have been used successfully to integrate the training of skill-, rule-, and knowledge-based behavior in a full flight simulator. Understanding surgeon behavior is one of the first steps towards a future full-scale laparoscopy simulator.

  9. Knowledge-based assistance for science visualization and analysis using large distributed databases

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Jacobson, Allan S.; Doyle, Richard J.; Collins, Donald J.

    1993-01-01

    Within this decade, the growth in complexity of exploratory data analysis and the sheer volume of space data require new and innovative approaches to support science investigators in achieving their research objectives. To date, there have been numerous efforts addressing the individual issues involved in inter-disciplinary, multi-instrument investigations. However, while successful in small scale, these efforts have not proven to be open and scalable. This proposal addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within this proposal is the integration of three automation technologies, namely, knowledge-based expert systems, science visualization and science data management. This integration is based on a concept called the DataHub. With the DataHub concept, NASA will be able to apply a more complete solution to all nodes of a distributed system. Both computation nodes and interactive nodes will be able to effectively and efficiently use the data services (access, retrieval, update, etc.) within a distributed, interdisciplinary information system in a uniform and standard way. This will allow the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis will be on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to publishable scientific results. In addition, the proposed work includes all the required end-to-end components and interfaces to demonstrate the completed concept.

  10. Knowledge-based assistance for science visualization and analysis using large distributed databases

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Jacobson, Allan S.; Doyle, Richard J.; Collins, Donald J.

    1992-01-01

    Within this decade, the growth in complexity of exploratory data analysis and the sheer volume of space data require new and innovative approaches to support science investigators in achieving their research objectives. To date, there have been numerous efforts addressing the individual issues involved in inter-disciplinary, multi-instrument investigations. However, while successful in small scale, these efforts have not proven to be open and scalable. This proposal addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within this proposal is the integration of three automation technologies, namely, knowledge-based expert systems, science visualization and science data management. This integration is based on the concept called the Data Hub. With the Data Hub concept, NASA will be able to apply a more complete solution to all nodes of a distributed system. Both computation nodes and interactive nodes will be able to effectively and efficiently use the data services (access, retrieval, update, etc.) within a distributed, interdisciplinary information system in a uniform and standard way. This will allow the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis will be on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to publishable scientific results. In addition, the proposed work includes all the required end-to-end components and interfaces to demonstrate the completed concept.

  11. Shedding light on cell compartmentation in the candidate phylum Poribacteria by high resolution visualisation and transcriptional profiling

    NASA Astrophysics Data System (ADS)

    Jahn, Martin T.; Markert, Sebastian M.; Ryu, Taewoo; Ravasi, Timothy; Stigloher, Christian; Hentschel, Ute; Moitinho-Silva, Lucas

    2016-10-01

    Assigning functions to uncultivated environmental microorganisms continues to be a challenging endeavour. Here, we present a new microscopy protocol for fluorescence in situ hybridisation-correlative light and electron microscopy (FISH-CLEM) that enabled, to our knowledge for the first time, the identification of single cells within their complex microenvironment at electron microscopy resolution. Members of the candidate phylum Poribacteria, common and uncultivated symbionts of marine sponges, were used towards this goal. Cellular 3D reconstructions revealed bipolar, spherical granules of low electron density, which likely represent carbon reserves. Poribacterial activity profiles were retrieved from prokaryotic enriched sponge metatranscriptomes using simulation-based optimised mapping. We observed high transcriptional activity for proteins related to bacterial microcompartments (BMC) and we resolved their subcellular localisation by combining FISH-CLEM with immunohistochemistry (IHC) on ultra-thin sponge tissue sections. In terms of functional relevance, we propose that the BMC-A region may be involved in 1,2-propanediol degradation. The FISH-IHC-CLEM approach proved to be an effective toolkit for combining -omics approaches with functional studies, and it should be widely applicable in environmental microbiology.

  12. An Improved Approach for Analyzing the Oxygen Compatibility of Solvents and other Oxygen-Flammable Materials for Use in Oxygen Systems

    NASA Technical Reports Server (NTRS)

    Harper, Susan A.; Juarez, Alfredo; Peralta, Stephen F.; Stoltzfus, Joel; Arpin, Christina Pina; Beeson, Harold D.

    2016-01-01

    Solvents used to clean oxygen system components must be assessed for oxygen compatibility, as incompatible residue or fluid inadvertently left behind within an oxygen system can pose a flammability risk. The most recent approach focused on solvent ignition susceptibility to assess the flammability risk associated with these materials. Previous evaluations included Ambient Pressure Liquid Oxygen (LOX) Mechanical Impact Testing (ASTM G86) and Autogenous Ignition Temperature (AIT) Testing (ASTM G72). The goal in this approach was to identify a solvent material that was not flammable in oxygen. As environmental policies restrict the available options of acceptable solvents, it has proven difficult to identify one that is not flammable in oxygen. A more rigorous oxygen compatibility approach is needed in an effort to select a new solvent for NASA applications. NASA White Sands Test Facility proposed an approach that acknowledges oxygen flammability, yet selects solvent materials based on their relative oxygen compatibility ranking, similar to that described in ASTM G63-99. Solvents are selected based on their ranking with respect to minimal ignition susceptibility, damage and propagation potential, as well as their relative ranking when compared with other solvent materials that are successfully used in oxygen systems. Test methods used in this approach included ASTM G86 (Ambient Pressure LOX Mechanical Impact Testing and Pressurized Gaseous Oxygen (GOX) Mechanical Impact Testing), ASTM G72 (AIT Testing), and ASTM D240 (Heat of Combustion (HOC) Testing). Only four solvents were tested through the full battery of tests for evaluation of oxygen compatibility: AK-225G as a baseline comparison, Solstice PF, L-14780, and Vertrel MCA. Baseline solvent AK-225G exhibited the lowest HOC and highest AIT of solvents tested. 
Nonetheless, Solstice PF, L-14780, and Vertrel MCA HOCs all fell well within the range of properties that are associated with proven oxygen system materials. Tested AITs for these solvents fell only slightly lower than the AIT for the proven AK-225G solvent. Based on these comparisons in which solvents exhibited properties within those ranges seen with proven oxygen system materials, it is believed that Solstice PF, L-14780, and Vertrel MCA would perform well with respect to oxygen compatibility.

  13. ProvenCare-Psoriasis: A disease management model to optimize care.

    PubMed

    Gionfriddo, Michael R; Pulk, Rebecca A; Sahni, Dev R; Vijayanagar, Sonal G; Chronowski, Joseph J; Jones, Laney K; Evans, Michael A; Feldman, Steven R; Pride, Howard

    2018-03-15

    There are a variety of evidence-based treatments available for psoriasis. The transition of this evidence into practice is challenging. In this article, we describe the design of our disease management approach for psoriasis (ProvenCare®) and present preliminary evidence of the effect of its implementation. In designing our approach, we identified three barriers to optimal care: 1) lack of a standardized and discrete disease activity measure within the electronic health record, 2) lack of a system-wide, standardized approach to care, and 3) non-uniform financial access to appropriate non-pharmacologic treatments. We implemented several solutions, which collectively form our approach. We standardized the documentation of clinical data such as body surface area (BSA), created a disease management algorithm for psoriasis, and aligned incentives to facilitate the implementation of the algorithm. This approach provides more coordinated, cost-effective care for psoriasis, while being acceptable to key stakeholders. Future work will examine the effect of the implementation of our approach on important clinical and patient outcomes.

  14. On Improving Efficiency of Differential Evolution for Aerodynamic Shape Optimization Applications

    NASA Technical Reports Server (NTRS)

    Madavan, Nateri K.

    2004-01-01

    Differential Evolution (DE) is a simple and robust evolutionary strategy that has been proven effective in determining the global optimum for several difficult optimization problems. Although DE offers several advantages over traditional optimization approaches, its use in applications such as aerodynamic shape optimization, where the objective function evaluations are computationally expensive, is limited by the large number of function evaluations often required. In this paper various approaches for improving the efficiency of DE are reviewed and discussed. Several approaches that have proven effective for other evolutionary algorithms are modified and implemented in a DE-based aerodynamic shape optimization method that uses a Navier-Stokes solver for the objective function evaluations. Parallelization techniques on distributed computers are used to reduce turnaround times. Results are presented for standard test optimization problems and for the inverse design of a turbine airfoil. The efficiency improvements achieved by the different approaches are evaluated and compared.
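    The core DE strategy summarized in this record is compact enough to sketch. Below is a minimal, generic DE/rand/1/bin implementation in Python; it is illustrative only (the paper couples DE to a Navier-Stokes solver with parallel evaluation, none of which is reproduced here) and is shown minimizing a simple sphere function:

    ```python
    import random

    def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                               generations=200, seed=0):
        """Minimal DE/rand/1/bin: mutation by a scaled difference of two
        population members, binomial crossover, greedy selection."""
        rng = random.Random(seed)
        dim = len(bounds)
        pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
        fitness = [f(x) for x in pop]
        for _ in range(generations):
            for i in range(pop_size):
                # pick three distinct members, all different from i
                a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
                j_rand = rng.randrange(dim)  # guarantees at least one mutated gene
                trial = list(pop[i])
                for j in range(dim):
                    if rng.random() < CR or j == j_rand:
                        lo, hi = bounds[j]
                        v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                        trial[j] = min(max(v, lo), hi)  # clamp to bounds
                f_trial = f(trial)
                if f_trial <= fitness[i]:  # greedy replacement
                    pop[i], fitness[i] = trial, f_trial
        best = min(range(pop_size), key=fitness.__getitem__)
        return pop[best], fitness[best]
    ```

    The greedy replacement step (a trial vector only displaces its parent if it is at least as good) is what makes DE robust; the expense the paper targets is the f(trial) call, which here is trivial but in aerodynamic design is a full flow solution.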

  15. A semantic-based method for extracting concept definitions from scientific publications: evaluation in the autism phenotype domain.

    PubMed

    Hassanpour, Saeed; O'Connor, Martin J; Das, Amar K

    2013-08-12

    A variety of informatics approaches have been developed that use information retrieval, natural language processing (NLP), and text-mining techniques to identify biomedical concepts and relations within scientific publications or their sentences. These approaches have not typically addressed the challenge of extracting more complex knowledge such as biomedical definitions. In our efforts to facilitate knowledge acquisition of rule-based definitions of autism phenotypes, we have developed a novel semantic-based text-mining approach that can automatically identify such definitions within text. Using an existing knowledge base of 156 autism phenotype definitions and an annotated corpus of 26 source articles containing such definitions, we evaluated and compared the average rank of the correctly identified rule definition or corresponding rule template using both our semantic-based approach and a standard term-based approach. We examined three separate scenarios: (1) the snippet of text contained a definition already in the knowledge base; (2) the snippet contained an alternative definition for a concept in the knowledge base; and (3) the snippet contained a definition not in the knowledge base. Our semantic-based approach achieved a better (lower) average rank than the term-based approach in each of the three scenarios (scenario 1: 3.8 vs. 5.0; scenario 2: 2.8 vs. 4.9; and scenario 3: 4.5 vs. 6.2), with each comparison significant at the p-value of 0.05 using the Wilcoxon signed-rank test. Our work shows that leveraging existing domain knowledge in the information extraction of biomedical definitions significantly improves the correct identification of such knowledge within sentences. Our method can thus help researchers rapidly acquire knowledge about biomedical definitions that are specified and evolving within an ever-growing corpus of scientific publications.
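    The rank comparison in this record relies on the Wilcoxon signed-rank test. A stdlib-only sketch of the test statistic is shown below: rank the nonzero absolute paired differences (tied magnitudes get average ranks), then take the smaller of the positive- and negative-rank sums. The sample values in the test reuse the three scenario averages purely as illustration; they are not the study's per-snippet data.

    ```python
    def wilcoxon_signed_rank(xs, ys):
        """Wilcoxon signed-rank statistic for paired samples: drop zero
        differences, rank the |differences| with average ranks for ties,
        and return min(W+, W-), the smaller signed-rank sum."""
        diffs = [x - y for x, y in zip(xs, ys) if x != y]
        order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
        ranks = [0.0] * len(diffs)
        i = 0
        while i < len(order):
            j = i
            # extend the block while |d| values are tied
            while j + 1 < len(order) and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
                j += 1
            avg_rank = (i + j) / 2 + 1  # average 1-based rank of the tied block
            for k in range(i, j + 1):
                ranks[order[k]] = avg_rank
            i = j + 1
        w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
        w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
        return min(w_plus, w_minus)
    ```

    When one method wins on every pair, as in the scenario averages here, the statistic is 0, the most extreme value the test can produce; significance is then read from the Wilcoxon critical-value table for the given n.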

  16. Corporate knowledge repository: Adopting academic LMS into corporate environment

    NASA Astrophysics Data System (ADS)

    Bakar, Muhamad Shahbani Abu; Jalil, Dzulkafli

    2017-10-01

    The growth of the knowledge economy has made human capital the vital asset of the 21st-century business organization. Arguably, due to its white-collar nature, knowledge-based industry is more favorable than traditional manufacturing business. However, over-dependency on human capital can also be a major challenge, as any worker will inevitably leave the company or retire. This situation can create a knowledge gap that may impact the business continuity of the enterprise. Knowledge retention in the corporate environment has been the subject of much research interest. A Learning Management System (LMS) refers to the system that provides the delivery, assessment and management tools for an organization to handle its knowledge repository. Drawing on a proven LMS implemented in an academic environment, this paper proposes an LMS model that can be used to enable peer-to-peer knowledge capture and sharing in a knowledge-based organization. Cloud Enterprise Resource Planning (ERP), which refers to an ERP solution delivered in the internet cloud environment, was chosen as the knowledge domain. The complexity of the Cloud ERP business and its knowledge makes it very vulnerable to the knowledge retention problem. This paper discusses how the company's essential knowledge can be retained using an LMS derived from the academic environment and adapted into the corporate model.

  17. The Natural Provenance: Ecoliteracy in Higher Education in Mississippi

    ERIC Educational Resources Information Center

    Hammond, Sarah Wheeless; Herron, Sherry S.

    2012-01-01

    Researchers suggest there is increasing apathy toward the study of natural history in academic settings and in the scientific community. However, most studies of environmental knowledge do not address knowledge of local flora and fauna; they are concerned with knowledge of environmental issues or broad ecological knowledge. Ecoliteracy…

  18. Knowledge of and Attitudes Toward Evidence-Based Guidelines for and Against Clinical Preventive Services: Results from a National Survey.

    PubMed

    Lantz, Paula M; Evans, W Douglas; Mead, Holly; Alvarez, Carmen; Stewart, Lisa

    2016-03-01

    Both the underuse and overuse of clinical preventive services relative to evidence-based guidelines are a public health concern. Informed consumers are an important foundation of many components of the Affordable Care Act, including coverage mandates for proven clinical preventive services recommended by the US Preventive Services Task Force. Across sociodemographic groups, however, knowledge of and positive attitudes toward evidence-based guidelines for preventive care are extremely low. Given the demonstrated low levels of consumers' knowledge of and trust in guidelines, coupled with their strong preference for involvement in preventive care decisions, better education and decision-making support for evidence-based preventive services are greatly needed. Both the underuse and overuse of clinical preventive services are a serious public health problem. The goal of our study was to produce population-based national data that could assist in the design of communication strategies to increase knowledge of and positive attitudes toward evidence-based guidelines for clinical preventive services (including the US Preventive Services Task Force, USPSTF) and to reduce uncertainty among patients when guidelines change or are controversial. In late 2013 we implemented an Internet-based survey of a nationally representative sample of 2,529 adults via KnowledgePanel, a probability-based survey panel of approximately 60,000 adults, statistically representative of the US noninstitutionalized population. African Americans, Hispanics, and those with less than a high school education were oversampled. We then conducted descriptive statistics and multivariable logistic regression analysis to identify the prevalence of and sociodemographic characteristics associated with key knowledge and attitudinal variables. 
While 36.4% of adults reported knowing that the Affordable Care Act requires insurance companies to cover proven preventive services without cost sharing, only 7.7% had heard of the USPSTF. Approximately 1 in 3 (32.6%) reported trusting that a government task force would make fair guidelines for preventive services, and 38.2% believed that the government uses guidelines to ration health care. Most of the respondents endorsed the notion that research/scientific evidence and expert medical opinion are important for the creation of guidelines and that clinicians should follow guidelines based on evidence. But when presented with patient vignettes in which a physician made a guideline-based recommendation against a cancer-screening test, less than 10% believed that this recommendation alone, without further dialogue and/or the patient's own research, was sufficient to make such a decision. Given these demonstrated low levels of knowledge and mistrust regarding guidelines, coupled with a strong preference for shared decision making, better consumer education and decision supports for evidence-based guidelines for clinical preventive services are greatly needed. © 2016 Milbank Memorial Fund.

  19. Burn Resuscitation

    DTIC Science & Technology

    2009-01-01

    The modified Brooke formula may not be effective in preventing all complications of fluid loading in all patients. The important concept is that if...addition to complicated mathematical computations [39]. They used their knowledge of expected fluid loss rates to devise a formula, based on trials of...burn resuscitations in order to prevent such a complication. This has proven to be a difficult task. The use of colloid has been examined in

  20. Successfully Transitioning Science Research to Space Weather Applications

    NASA Technical Reports Server (NTRS)

    Spann, James

    2012-01-01

    The awareness of potentially significant impacts of space weather on space- and ground-based technological systems has generated a strong desire in many sectors of government and industry to effectively transform knowledge and understanding of the variable space environment into useful tools and applications for use by those entities responsible for systems that may be vulnerable to space weather impacts. Essentially, effectively transitioning science knowledge to useful applications relevant to space weather has become important. This talk will present methodologies that have proven effective and show how, in the current environment, they can be applied to space weather transition efforts.

  1. Discovering indigenous science: Implications for science education

    NASA Astrophysics Data System (ADS)

    Snively, Gloria; Corsiglia, John

    2001-01-01

    Indigenous science relates both to the science knowledge of long-resident, usually oral culture peoples and to the science knowledge of all peoples who, as participants in culture, are affected by the worldview and relativist interests of their home communities. This article explores aspects of multicultural science and pedagogy and describes a rich and well-documented branch of indigenous science known to biologists and ecologists as traditional ecological knowledge (TEK). Although TEK has been generally inaccessible, educators can now use a burgeoning science-based TEK literature that documents numerous examples of time-proven, ecologically relevant, and cost-effective indigenous science. Disputes regarding the universality of the standard scientific account are of critical importance for science educators because the definition of science is a de facto gatekeeping device for determining what can be included in a school science curriculum and what cannot. When Western modern science (WMS) is defined as universal it does displace revelation-based knowledge (i.e., creation science); however, it also displaces pragmatic local indigenous knowledge that does not conform with formal aspects of the standard account. Thus, in most science classrooms around the globe, Western modern science has been taught at the expense of indigenous knowledge. However, because WMS has been implicated in many of the world's ecological disasters, and because the traditional wisdom component of TEK is particularly rich in time-tested approaches that foster sustainability and environmental integrity, it is possible that the universalist gatekeeper can be seen as increasingly problematic and even counterproductive. This paper describes many examples from Canada and around the world of indigenous people's contributions to science, environmental understanding, and sustainability. 
The authors argue that Western or modern science is just one of many sciences that need to be addressed in the science classroom. We conclude by presenting instructional strategies that can help all science learners negotiate border crossings between Western modern science and indigenous science.

  2. Improving Medical Students' Application of Knowledge and Clinical Decision-Making Through a Porcine-Based Integrated Cardiac Basic Science Program.

    PubMed

    Stott, Martyn Charles; Gooseman, Michael Richard; Briffa, Norman Paul

    2016-01-01

    Despite the concerted effort of modern undergraduate curriculum designers, integrating basic sciences into clinical rotations remains an ongoing problem in medical education. Students and newly qualified doctors themselves report concern about the effect this has on their clinical performance. There are examples in the literature to support attempts at such integration, but this "vertical integration" has proven to be difficult. We designed an expert-led integrated program using dissection of porcine hearts to improve the use of cardiac basic sciences in clinical medical students' decision-making processes. To our knowledge, this is the first time in the United Kingdom that an animal model has been used to teach undergraduate clinical anatomy to medical students to direct wider application of knowledge. Action research methodology was used to evaluate the local curriculum and assess learners' needs, and the agreed teaching outcomes, methods, and delivery outline were established. A total of 18 students in the clinical years of their degree program attended, completing precourse and postcourse multiple-choice question examinations and questionnaires to assess learners' development. Students' knowledge scores improved by 17.5% (p = 0.01; Student's t-test). Students also felt more confident at applying underlying knowledge to decision-making and diagnosis in clinical medicine. An expert teacher (consultant surgeon) was seen as beneficial to students' understanding and appreciation. This study outlines how the development of a teaching intervention using porcine-based methods successfully improved both students' knowledge and application of cardiac basic sciences. We recommend that clinicians fully engage with integrating previously learnt underlying sciences to aid students in developing decision-making and diagnostic skills as well as a deeper approach to learning. Copyright © 2016 Association of Program Directors in Surgery. 
Published by Elsevier Inc. All rights reserved.
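    The precourse/postcourse comparison in this record is a paired Student's t-test. A minimal sketch of the paired t statistic follows; the scores in the example are invented for illustration and are not the study's data:

    ```python
    import math

    def paired_t(pre, post):
        """Paired Student's t statistic: t = mean(d) / (s_d / sqrt(n)),
        where d are the per-student post-minus-pre score differences
        and s_d is their sample standard deviation."""
        d = [b - a for a, b in zip(pre, post)]
        n = len(d)
        mean = sum(d) / n
        var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance
        return mean / math.sqrt(var / n)
    ```

    The resulting t is compared against the t distribution with n - 1 degrees of freedom to obtain the p-value; pairing each student with their own baseline removes between-student variability from the comparison.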

  3. Quantitative Resistance to Plant Pathogens in Pyramiding Strategies for Durable Crop Protection.

    PubMed

    Pilet-Nayel, Marie-Laure; Moury, Benoît; Caffier, Valérie; Montarry, Josselin; Kerlan, Marie-Claire; Fournet, Sylvain; Durel, Charles-Eric; Delourme, Régine

    2017-01-01

    Quantitative resistance has gained interest in plant breeding for pathogen control in low-input cropping systems. Although quantitative resistance frequently has only a partial effect and is difficult to select, it is considered more durable than major resistance (R) genes. With the exponential development of molecular markers over the past 20 years, resistance QTL have been more accurately detected and better integrated into breeding strategies for resistant varieties with increased potential for durability. This review summarizes current knowledge on the genetic inheritance, molecular basis, and durability of quantitative resistance. Based on this knowledge, we discuss how strategies that combine major R genes and QTL in crops can maintain the effectiveness of plant resistance to pathogens. Combining resistance QTL with complementary modes of action appears to be an interesting strategy for breeding effective and potentially durable resistance. Combining quantitative resistance with major R genes has proven to be a valuable approach for extending the effectiveness of major genes. In the plant genomics era, improved tools and methods are becoming available to better integrate quantitative resistance into breeding strategies. Nevertheless, optimal combinations of resistance loci will still have to be identified to preserve resistance effectiveness over time for durable crop protection.

  4. Workflow Automation: A Collective Case Study

    ERIC Educational Resources Information Center

    Harlan, Jennifer

    2013-01-01

    Knowledge management has proven to be a sustainable competitive advantage for many organizations. Knowledge management systems are abundant, with multiple functionalities. The literature reinforces the use of workflow automation with knowledge management systems to benefit organizations; however, it was not known if process automation yielded…

  5. Validation of a Crowdsourcing Methodology for Developing a Knowledge Base of Related Problem-Medication Pairs.

    PubMed

    McCoy, A B; Wright, A; Krousel-Wood, M; Thomas, E J; McCoy, J A; Sittig, D F

    2015-01-01

    Clinical knowledge bases of problem-medication pairs are necessary for many informatics solutions that improve patient safety, such as clinical summarization. However, developing these knowledge bases can be challenging. We sought to validate a previously developed crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large, non-university health care system with a widely used, commercially available electronic health record. We first retrieved medications and problems entered in the electronic health record by clinicians during routine care during a six month study period. Following the previously published approach, we calculated the link frequency and link ratio for each pair then identified a threshold cutoff for estimated problem-medication pair appropriateness through clinician review; problem-medication pairs meeting the threshold were included in the resulting knowledge base. We selected 50 medications and their gold standard indications to compare the resulting knowledge base to the pilot knowledge base developed previously and determine its recall and precision. The resulting knowledge base contained 26,912 pairs, had a recall of 62.3% and a precision of 87.5%, and outperformed the pilot knowledge base containing 11,167 pairs from the previous study, which had a recall of 46.9% and a precision of 83.3%. We validated the crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large non-university health care system with a widely used, commercially available electronic health record, indicating that the approach may be generalizable across healthcare settings and clinical systems. Further research is necessary to better evaluate the knowledge, to compare crowdsourcing with other approaches, and to evaluate if incorporating the knowledge into electronic health records improves patient outcomes.
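    The link-frequency/link-ratio step in this record can be sketched in a few lines. This is a hedged reading of the method, not the published algorithm: here "link frequency" is taken as the co-occurrence count of a (problem, medication) pair across patient records, "link ratio" as that count divided by the medication's total occurrences, and the clinician-derived threshold is applied to the ratio; the study's exact definitions and threshold-selection procedure may differ, and the drug/problem names below are invented for illustration.

    ```python
    from collections import Counter

    def build_knowledge_base(records, threshold=0.5):
        """Build a problem-medication knowledge base from patient records.
        records: iterable of (problems, medications) pairs, one per patient.
        A pair enters the knowledge base when its link ratio
        (pair co-occurrence count / medication occurrence count)
        meets the threshold."""
        pair_freq = Counter()
        med_freq = Counter()
        for problems, medications in records:
            for med in medications:
                med_freq[med] += 1
                for prob in problems:
                    pair_freq[(prob, med)] += 1
        return {
            pair: freq / med_freq[pair[1]]
            for pair, freq in pair_freq.items()
            if freq / med_freq[pair[1]] >= threshold
        }
    ```

    Normalizing by the medication's total count is what filters out incidental co-occurrences: a drug that appears alongside many unrelated problems yields low ratios for each, while a drug consistently documented with its indication clears the threshold.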

  6. Validation of a Crowdsourcing Methodology for Developing a Knowledge Base of Related Problem-Medication Pairs

    PubMed Central

McCoy, A. B.; Wright, A.; Krousel-Wood, M.; Thomas, E. J.; McCoy, J. A.; Sittig, D. F.

    2015-01-01

Background: Clinical knowledge bases of problem-medication pairs are necessary for many informatics solutions that improve patient safety, such as clinical summarization. However, developing these knowledge bases can be challenging. Objective: We sought to validate a previously developed crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large, non-university health care system with a widely used, commercially available electronic health record. Methods: We first retrieved medications and problems entered in the electronic health record by clinicians during routine care over a six-month study period. Following the previously published approach, we calculated the link frequency and link ratio for each pair, then identified a threshold cutoff for estimated problem-medication pair appropriateness through clinician review; problem-medication pairs meeting the threshold were included in the resulting knowledge base. We selected 50 medications and their gold standard indications to compare the resulting knowledge base to the pilot knowledge base developed previously and determine its recall and precision. Results: The resulting knowledge base contained 26,912 pairs, had a recall of 62.3% and a precision of 87.5%, and outperformed the pilot knowledge base containing 11,167 pairs from the previous study, which had a recall of 46.9% and a precision of 83.3%. Conclusions: We validated the crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large non-university health care system with a widely used, commercially available electronic health record, indicating that the approach may be generalizable across healthcare settings and clinical systems. Further research is necessary to better evaluate the knowledge, to compare crowdsourcing with other approaches, and to evaluate whether incorporating the knowledge into electronic health records improves patient outcomes. PMID:26171079

  7. Multiobjective Optimization Using a Pareto Differential Evolution Approach

    NASA Technical Reports Server (NTRS)

    Madavan, Nateri K.; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    Differential Evolution is a simple, fast, and robust evolutionary algorithm that has proven effective in determining the global optimum for several difficult single-objective optimization problems. In this paper, the Differential Evolution algorithm is extended to multiobjective optimization problems by using a Pareto-based approach. The algorithm performs well when applied to several test optimization problems from the literature.
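    A minimal sketch of the two ingredients this abstract combines: the classic DE/rand/1/bin trial-vector step and a Pareto-dominance test used for selection (minimization assumed). This is an illustrative reconstruction, not Madavan's published algorithm; parameter values and names are assumptions.

    ```python
    import random

    def dominates(a, b):
        """True if objective vector a Pareto-dominates b:
        no worse in every objective, strictly better in at least one (minimization)."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def de_trial(pop, i, f=0.5, cr=0.9):
        """DE/rand/1/bin: perturb a random base vector with a scaled difference
        vector, then binomially cross over with parent i."""
        r1, r2, r3 = random.sample([j for j in range(len(pop)) if j != i], 3)
        jrand = random.randrange(len(pop[i]))  # guarantee at least one mutated gene
        return [pop[r1][k] + f * (pop[r2][k] - pop[r3][k])
                if (random.random() < cr or k == jrand) else pop[i][k]
                for k in range(len(pop[i]))]
    ```

    In a Pareto-based DE, a trial vector would replace its parent only when it dominates it; non-dominated trials are typically retained in an external archive instead of being discarded.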

  8. Oligomer formation in the troposphere: from experimental knowledge to 3-D modeling

    NASA Astrophysics Data System (ADS)

    Lemaire, V.; Coll, I.; Couvidat, F.; Mouchel-Vallon, C.; Seigneur, C.; Siour, G.

    2015-10-01

The organic fraction of atmospheric aerosols has proven to be a critical element of air quality and climate issues. However, its composition and the aging processes it undergoes remain insufficiently understood. This work builds on laboratory knowledge to simulate the formation of oligomers from biogenic secondary organic aerosol (BSOA) in the troposphere at the continental scale. We compare the results of two different modeling approaches, a first-order kinetic process and a pH-dependent parameterization, both implemented in the CHIMERE air quality model (AQM), to simulate the spatial and temporal distribution of oligomerized SOA over western Europe. Our results show a strong dependence on the selected modeling approach: while the irreversible kinetic process leads to the oligomerization of about 50 % of the total BSOA mass, the pH-dependent approach shows a broader range of impacts, with a strong dependency on environmental parameters (pH and nature of aerosol) and the possibility for the process to be reversible. In parallel, we investigated the sensitivity of each modeling approach to the representation of SOA precursor solubility (Henry's law constant values). Finally, the pros and cons of each approach for the representation of SOA aging are discussed and recommendations are provided to improve current representations of oligomer formation in AQMs.

  9. An adaptive sharing elitist evolution strategy for multiobjective optimization.

    PubMed

    Costa, Lino; Oliveira, Pedro

    2003-01-01

    Almost all approaches to multiobjective optimization are based on Genetic Algorithms (GAs), and implementations based on Evolution Strategies (ESs) are very rare. Thus, it is crucial to investigate how ESs can be extended to multiobjective optimization, since they have, in the past, proven to be powerful single objective optimizers. In this paper, we present a new approach to multiobjective optimization, based on ESs. We call this approach the Multiobjective Elitist Evolution Strategy (MEES) as it incorporates several mechanisms, like elitism, that improve its performance. When compared with other algorithms, MEES shows very promising results in terms of performance.

  10. Endoscopic saphenous vein and radial harvest: state-of-the-art.

    PubMed

    Bisleri, Gianluigi; Muneretto, Claudio

    2015-11-01

Over the past decade, there has been an increased adoption of minimally invasive techniques for saphenous vein and radial artery procurement during coronary artery bypass surgery, although concerns have been raised about the potential detrimental effects of the endoscopic approach when compared with the conventional 'open' technique. The aim of the present review is to analyse the current available techniques and evidence about the impact of an endoscopic approach on conduit quality and clinical outcomes. At present, the available techniques for endoscopic vessel harvesting can be based on a sealed or non-sealed concept, for both saphenous vein and radial artery procurement. Despite the proven advantages of a minimally invasive approach in terms of reduced incidence of wound complications, pain reduction and improved cosmetic results, some studies questioned the impact of this technique in terms of potential graft damage, thus impairing the longevity of the graft itself. Endoscopic conduit harvesting can be performed safely and effectively with the currently available techniques, although careful knowledge of the pitfalls of each technique is mandatory. Since there is ample evidence in the literature that a minimally invasive approach for saphenous vein and radial artery procurement is not associated with an increased risk of graft damage and related failure in the mid- to long term, the endoscopic technique should be adopted as the approach of choice for saphenous vein and radial artery harvesting in coronary artery bypass graft surgery.

  11. Developing instruments concerning scientific epistemic beliefs and goal orientations in learning science: a validation study

    NASA Astrophysics Data System (ADS)

    Lin, Tzung-Jin; Tsai, Chin-Chung

    2017-11-01

    The purpose of this study was to develop and validate two survey instruments to evaluate high school students' scientific epistemic beliefs and goal orientations in learning science. The initial relationships between the sampled students' scientific epistemic beliefs and goal orientations in learning science were also investigated. A final valid sample of 600 volunteer Taiwanese high school students participated in this survey by responding to the Scientific Epistemic Beliefs Instrument (SEBI) and the Goal Orientations in Learning Science Instrument (GOLSI). Through both exploratory and confirmatory factor analyses, the SEBI and GOLSI were proven to be valid and reliable for assessing the participants' scientific epistemic beliefs and goal orientations in learning science. The path analysis results indicated that, by and large, the students with more sophisticated epistemic beliefs in various dimensions such as Development of Knowledge, Justification for Knowing, and Purpose of Knowing tended to adopt both Mastery-approach and Mastery-avoidance goals. Some interesting results were also found. For example, the students tended to set a learning goal to outperform others or merely demonstrate competence (Performance-approach) if they had more informed epistemic beliefs in the dimensions of Multiplicity of Knowledge, Uncertainty of Knowledge, and Purpose of Knowing.

  12. On the Faceting and Linking of PROV for Earth Science Data Systems

    NASA Astrophysics Data System (ADS)

    Hua, H.; Manipon, G.; Wilson, B. D.; Tan, D.; Starch, M.

    2015-12-01

    Faceted search has yielded powerful capabilities for discovery of information by applying multiple filters to explore information. This is often more effective when the information is decomposed into faceted components that can be sliced and diced during faceted navigation. We apply this approach to the representation of PROV for Earth Science (PROV-ES) to facilitate more atomic units of provenance for discovery. Traditional bundles of PROV are then decomposed to enable finer-grain discovery of provenance. Linkages across provenance components can then be explored across seemingly disparate bundles. We will show how mappings into this provenance approach can be used to explore more data life-cycle relationships from observation to data to findings. We will also show examples of how this approach can be used to improve the discovery, access, and transparency of NASA datasets and the science data systems that were used to capture, manage, and produce the provenance information.

  13. Defining ischemic burden after traumatic brain injury using 15O PET imaging of cerebral physiology.

    PubMed

    Coles, Jonathan P; Fryer, Tim D; Smielewski, Peter; Rice, Kenneth; Clark, John C; Pickard, John D; Menon, David K

    2004-02-01

Whereas postmortem ischemic damage is common in head injury, antemortem demonstration of ischemia has proven to be elusive. Although 15O positron emission tomography may be useful in this area, the technique has traditionally analyzed data within regions of interest (ROIs) to improve statistical accuracy. In head injury, such techniques are limited because of the lack of a priori knowledge regarding the location of ischemia, the coexistence of hyperaemia, and the difficulty in defining ischemic cerebral blood flow (CBF) and cerebral oxygen metabolism (CMRO2) levels. We report a novel method for defining disease pathophysiology following head injury. Voxel-based approaches are used to define the distribution of oxygen extraction fraction (OEF) across the entire brain; the standard deviation of this distribution provides a measure of the variability of OEF. These data are also used to integrate voxels above a threshold OEF value to produce an ROI based upon coherent physiology rather than spatial contiguity (the ischemic brain volume; IBV). However, such approaches may suffer from poor statistical accuracy, particularly in regions with low blood flow. The magnitude of these errors has been assessed in modeling experiments using the Hoffman brain phantom and modified control datasets. We conclude that this technique is a valid and useful tool for quantifying ischemic burden after traumatic brain injury.
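    The core voxel-based computation described here (characterize the whole-brain OEF distribution by its spread, then pool supra-threshold voxels into a physiology-defined ROI) can be sketched in a few lines. The voxel size, threshold, and function names are illustrative assumptions, not values from the paper.

    ```python
    import statistics

    def ischemic_brain_volume(oef_values, threshold, voxel_volume_mm3=8.0):
        """oef_values: per-voxel oxygen extraction fraction across the brain.
        Returns (spread of the OEF distribution, volume of voxels above threshold)."""
        spread = statistics.stdev(oef_values)            # variability of OEF
        above = [v for v in oef_values if v > threshold] # coherent-physiology ROI
        ibv = len(above) * voxel_volume_mm3              # ischemic brain volume
        return spread, ibv
    ```

    Note the ROI is defined by a physiological criterion (OEF above a cutoff) rather than by spatial contiguity, which is the point the abstract makes.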

  14. Noncontact Sleep Study by Multi-Modal Sensor Fusion.

    PubMed

    Chung, Ku-Young; Song, Kwangsub; Shin, Kangsoo; Sohn, Jinho; Cho, Seok Hyun; Chang, Joon-Hyuk

    2017-07-21

Polysomnography (PSG) is considered the gold standard for determining sleep stages, but due to the obtrusiveness of its sensor attachments, sleep stage classification algorithms using noninvasive sensors have been developed throughout the years. However, these previous studies have not yet proven reliable. In addition, most of the products are designed for healthy customers rather than for patients with sleep disorders. We present a novel approach to classify sleep stages via low-cost and noncontact multi-modal sensor fusion, which extracts sleep-related vital signals from radar signals and a sound-based context-awareness technique. This work is uniquely designed based on the PSG data of sleep disorder patients, which were received and certified by professionals at Hanyang University Hospital. The proposed algorithm further incorporates medical/statistical knowledge to determine personal-adjusted thresholds and devise post-processing. The efficiency of the proposed algorithm is highlighted by contrasting sleep stage classification performance between single sensor and sensor-fusion algorithms. To validate the possibility of commercializing this work, the classification results of this algorithm were compared with the commercialized sleep monitoring device, ResMed S+. The proposed algorithm was investigated with random patients following PSG examination, and results show a promising novel approach for determining sleep stages in a low-cost and unobtrusive manner.

  15. Noncontact Sleep Study by Multi-Modal Sensor Fusion

    PubMed Central

    Chung, Ku-young; Song, Kwangsub; Shin, Kangsoo; Sohn, Jinho; Cho, Seok Hyun; Chang, Joon-Hyuk

    2017-01-01

Polysomnography (PSG) is considered the gold standard for determining sleep stages, but due to the obtrusiveness of its sensor attachments, sleep stage classification algorithms using noninvasive sensors have been developed throughout the years. However, these previous studies have not yet proven reliable. In addition, most of the products are designed for healthy customers rather than for patients with sleep disorders. We present a novel approach to classify sleep stages via low-cost and noncontact multi-modal sensor fusion, which extracts sleep-related vital signals from radar signals and a sound-based context-awareness technique. This work is uniquely designed based on the PSG data of sleep disorder patients, which were received and certified by professionals at Hanyang University Hospital. The proposed algorithm further incorporates medical/statistical knowledge to determine personal-adjusted thresholds and devise post-processing. The efficiency of the proposed algorithm is highlighted by contrasting sleep stage classification performance between single sensor and sensor-fusion algorithms. To validate the possibility of commercializing this work, the classification results of this algorithm were compared with the commercialized sleep monitoring device, ResMed S+. The proposed algorithm was investigated with random patients following PSG examination, and results show a promising novel approach for determining sleep stages in a low-cost and unobtrusive manner. PMID:28753994

  16. A knowledge based approach to matching human neurodegenerative disease and animal models

    PubMed Central

    Maynard, Sarah M.; Mungall, Christopher J.; Lewis, Suzanna E.; Imam, Fahim T.; Martone, Maryann E.

    2013-01-01

Neurodegenerative diseases present a wide and complex range of biological and clinical features. Animal models are key to translational research, yet typically only exhibit a subset of disease features rather than being precise replicas of the disease. Consequently, connecting animal to human conditions using direct data-mining strategies has proven challenging, particularly for diseases of the nervous system, with its complicated anatomy and physiology. To address this challenge we have explored the use of ontologies to create formal descriptions of structural phenotypes across scales that are machine processable and amenable to logical inference. As proof of concept, we built a Neurodegenerative Disease Phenotype Ontology (NDPO) and an associated Phenotype Knowledge Base (PKB) using an entity-quality model that incorporates descriptions for both human disease phenotypes and those of animal models. Entities are drawn from community ontologies made available through the Neuroscience Information Framework (NIF) and qualities are drawn from the Phenotype and Trait Ontology (PATO). We generated ~1200 structured phenotype statements describing structural alterations at the subcellular, cellular and gross anatomical levels observed in 11 human neurodegenerative conditions and associated animal models. PhenoSim, an open source tool for comparing phenotypes, was used to issue a series of competency questions to compare individual phenotypes among organisms and to determine which animal models recapitulate phenotypic aspects of the human disease in aggregate. Overall, the system was able to use relationships within the ontology to bridge phenotypes across scales, returning non-trivial matches based on common subsumers that were meaningful to a neuroscientist with an advanced knowledge of neuroanatomy. The system can be used both to compare individual phenotypes and also phenotypes in aggregate. This proof of concept suggests that expressing complex phenotypes using formal ontologies provides considerable benefit for comparing phenotypes across scales and species. PMID:23717278

  17. From Provenance Standards and Tools to Queries and Actionable Provenance

    NASA Astrophysics Data System (ADS)

    Ludaescher, B.

    2017-12-01

The W3C PROV standard provides a minimal core for sharing retrospective provenance information for scientific workflows and scripts. PROV extensions such as DataONE's ProvONE model are necessary for linking runtime observables in retrospective provenance records with conceptual-level prospective provenance information, i.e., workflow (or dataflow) graphs. Runtime provenance recorders, such as DataONE's RunManager for R and noWorkflow for Python, capture retrospective provenance automatically. YesWorkflow (YW) is a toolkit that allows researchers to declare high-level prospective provenance models of scripts via simple inline comments (YW-annotations), revealing the computational modules and dataflow dependencies in the script. By combining and linking both forms of provenance, important queries and use cases can be supported that neither provenance model can support on its own. We present existing and emerging provenance tools developed for the DataONE and SKOPE (Synthesizing Knowledge of Past Environments) projects. We show how the different tools can be used individually and in combination to model, capture, share, query, and visualize provenance information. We also present challenges and opportunities for making provenance information more immediately actionable for the researchers who create it in the first place. We argue that such a shift towards "provenance-for-self" is necessary to accelerate the creation, sharing, and use of provenance in support of transparent, reproducible computational and data science.
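    The inline YW-annotations mentioned above look roughly like this: structured comments that YesWorkflow parses to recover a prospective dataflow graph without executing the script. The annotation keywords follow the YesWorkflow convention (@begin/@in/@out/@end); the script content itself is a made-up example.

    ```python
    # A tiny script carrying YesWorkflow-style prospective-provenance annotations.
    # YW reads only the comments; the code runs normally either way.

    # @begin clean_and_average
    # @in readings
    # @out mean_temp
    def clean_and_average(readings, low=-40.0, high=60.0):
        """Drop out-of-range sensor values, then average the rest."""
        valid = [r for r in readings if low <= r <= high]
        return sum(valid) / len(valid)
    # @end clean_and_average
    ```

    A runtime recorder such as noWorkflow would separately log which concrete files and values flowed through `readings` and `mean_temp`, and linking the two views is what enables the combined queries the abstract describes.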

  18. Interoperable Data Sharing for Diverse Scientific Disciplines

    NASA Astrophysics Data System (ADS)

    Hughes, John S.; Crichton, Daniel; Martinez, Santa; Law, Emily; Hardman, Sean

    2016-04-01

For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework using ontologies and ISO level archive and metadata registry reference models. This framework provides multi-level governance, evolves independent of implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation framework is populated through knowledge acquisition from discipline experts. It is also extended to meet specific discipline requirements. The result is a formalized and rigorous knowledge base that addresses data representation, integrity, provenance, context, quantity, and their relationships within the community. The contents of the knowledge base are translated and written to files in appropriate formats to configure system software and services, provide user documentation, validate ingested data, and support data analytics. This presentation will provide an overview of the framework, present the Planetary Data System's PDS4 as a use case that has been adopted by the international planetary science community, describe how the framework is being applied to other disciplines, and share some important lessons learned.

  19. A Working Framework for Enabling International Science Data System Interoperability

    NASA Astrophysics Data System (ADS)

    Hughes, J. Steven; Hardman, Sean; Crichton, Daniel J.; Martinez, Santa; Law, Emily; Gordon, Mitchell K.

    2016-07-01

For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework that leverages ISO level reference models for metadata registries and digital archives. This framework provides multi-level governance, evolves independent of the implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation is captured in an ontology through a process of knowledge acquisition. Discipline experts in the role of stewards at the common, discipline, and project levels work to design and populate the ontology model. The result is a formal and consistent knowledge base that provides requirements for data representation, integrity, provenance, context, identification, and relationships. The contents of the knowledge base are translated and written to files in suitable formats to configure system software and services, provide user documentation, validate input, and support data analytics. This presentation will provide an overview of the framework, present a use case that has been adopted by an entire science discipline at the international level, and share some important lessons learned.

  20. Coupling online effects-based monitoring with physicochemical, optical, and spectroscopy methods to assess quality at a surface water intake

    EPA Science Inventory

    Effects-based monitoring of water quality is a proven approach to monitoring the status of a water source. Only biological material can integrate factors which dictate toxicity. Online Toxicity Monitors (OTMs) provide a means to digitize sentinel organism responses to dynamic wa...

  1. A Blueprint for a Strengths-Based Level System in Schools

    ERIC Educational Resources Information Center

    Rubin, Ron

    2005-01-01

    In spite of the proven research studies that cite the beneficial effects of a positive, assets-based approach to child development and discipline (Scales, 2000; Jones & Jones, 1998; Benson, Galbraith, & Espeland, 1994), numerous school systems adhere to the articulation of tiered levels of misconduct, which identify minor to severe types of…

  2. Issues in Quantitative Analysis of Ultraviolet Imager (UVI) Data: Airglow

    NASA Technical Reports Server (NTRS)

    Germany, G. A.; Richards, P. G.; Spann, J. F.; Brittnacher, M. J.; Parks, G. K.

    1999-01-01

    The GGS Ultraviolet Imager (UVI) has proven to be especially valuable in correlative substorm, auroral morphology, and extended statistical studies of the auroral regions. Such studies are based on knowledge of the location, spatial, and temporal behavior of auroral emissions. More quantitative studies, based on absolute radiometric intensities from UVI images, require a more intimate knowledge of the instrument behavior and data processing requirements and are inherently more difficult than studies based on relative knowledge of the oval location. In this study, UVI airglow observations are analyzed and compared with model predictions to illustrate issues that arise in quantitative analysis of UVI images. These issues include instrument calibration, long term changes in sensitivity, and imager flat field response as well as proper background correction. Airglow emissions are chosen for this study because of their relatively straightforward modeling requirements and because of their implications for thermospheric compositional studies. The analysis issues discussed here, however, are identical to those faced in quantitative auroral studies.

  3. Vibration-based angular speed estimation for multi-stage wind turbine gearboxes

    NASA Astrophysics Data System (ADS)

    Peeters, Cédric; Leclère, Quentin; Antoni, Jérôme; Guillaume, Patrick; Helsen, Jan

    2017-05-01

Most processing tools based on frequency analysis of vibration signals are applicable only to stationary speed regimes. Speed variation causes the spectral content to smear, which encumbers most conventional fault detection techniques. To solve the problem of non-stationary speed conditions, the instantaneous angular speed (IAS) is estimated. Wind turbine gearboxes, however, are typically multi-stage gearboxes, consisting of multiple shafts rotating at different speeds. Fitting a sensor (e.g. a tachometer) to every single stage is not always feasible. As such, there is a need to estimate the IAS of every single shaft based on the vibration signals measured by the accelerometers. This paper investigates the performance of the multi-order probabilistic approach (MOPA) for IAS estimation on experimental case studies of wind turbines. This method takes into account the meshing orders of the gears present in the system and has the advantage that a priori it is not necessary to associate harmonics with a certain periodic mechanical event, which increases the robustness of the method. It is found that the MOPA has the potential to easily outperform standard band-pass filtering techniques for speed estimation. More knowledge of the gearbox kinematics is beneficial for the MOPA performance, but even with very little knowledge about the meshing orders, the MOPA still performs sufficiently well to compete with the standard speed estimation techniques. This observation is demonstrated on two different data sets, both originating from vibration measurements on the gearbox housing of a wind turbine.
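    A toy, deterministic stand-in for the idea behind the multi-order probabilistic approach: score candidate shaft speeds by how well the known meshing orders predict the observed spectral peaks, and pick the best-scoring candidate. All names, the scoring rule, and the grid search are illustrative assumptions; the published MOPA builds a genuine probability density over speed from multiple orders.

    ```python
    def estimate_speed(spectrum_peaks_hz, meshing_orders, candidates_hz):
        """Pick the candidate shaft speed whose predicted meshing frequencies
        (speed * order) lie closest to the observed spectral peaks."""
        def mismatch(speed):
            # Sum, over orders, of the distance from the predicted meshing
            # frequency to the nearest observed peak.
            return sum(min(abs(speed * o - p) for p in spectrum_peaks_hz)
                       for o in meshing_orders)
        return min(candidates_hz, key=mismatch)
    ```

    Because the score pools evidence from all meshing orders at once, no individual harmonic ever needs to be labeled with a specific mechanical event, which mirrors the robustness argument made in the abstract.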

  4. An overview of CAM: components and clinical uses.

    PubMed

    Kiefer, David; Pitluk, Jessica; Klunk, Kathryn

    2009-01-01

    Complementary and alternative medicine (CAM), more recently known as integrative health or integrative medicine, is a diverse field comprising numerous treatments and practitioners of various levels of training. This review defines several of the main CAM modalities and reviews some of the research relevant to their clinical application. The goal is to provide healthcare providers with a basic understanding of CAM to start the incorporation of proven treatments into their clinical practice as well as guide them to working with CAM providers; ultimately, such knowledge is a fundamental part of a collaborative approach to optimal patient health and wellness.

  5. Ontology development for provenance tracing in National Climate Assessment of the US Global Change Research Program

    NASA Astrophysics Data System (ADS)

    Fu, Linyun; Ma, Xiaogang; Zheng, Jin; Goldstein, Justin; Duggan, Brian; West, Patrick; Aulenbach, Steve; Tilmes, Curt; Fox, Peter

    2014-05-01

    This poster will show how we used a case-driven iterative methodology to develop an ontology to represent the content structure and the associated provenance information in a National Climate Assessment (NCA) report of the US Global Change Research Program (USGCRP). We applied the W3C PROV-O ontology to implement a formal representation of provenance. We argue that the use case-driven, iterative development process and the application of a formal provenance ontology help efficiently incorporate domain knowledge from earth and environmental scientists in a well-structured model interoperable in the context of the Web of Data.
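    To make the PROV-O usage concrete, here is a minimal sketch of derivation triples and lineage tracing over them, using plain (subject, predicate, object) strings rather than an RDF library. The `prov:wasDerivedFrom` property is from the W3C PROV-O vocabulary; the entity names and helper functions are invented for illustration.

    ```python
    RDF_TYPE = "http://www.w3.org/1999/02/22-rdf-syntax-ns#type"
    PROV = "http://www.w3.org/ns/prov#"

    def derivation(entity, source):
        """Type the entity and record one prov:wasDerivedFrom link."""
        return [
            (entity, RDF_TYPE, PROV + "Entity"),
            (entity, PROV + "wasDerivedFrom", source),
        ]

    def trace_sources(triples, entity):
        """Follow wasDerivedFrom links transitively to recover an entity's lineage,
        e.g. report figure -> dataset -> raw observations."""
        frontier, lineage = [entity], []
        while frontier:
            e = frontier.pop()
            for s, p, o in triples:
                if s == e and p == PROV + "wasDerivedFrom":
                    lineage.append(o)
                    frontier.append(o)
        return lineage
    ```

    In a real deployment these triples would be serialized as RDF and queried with SPARQL; the point of the sketch is that a formal provenance ontology reduces lineage questions to graph traversal.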

  6. Cystic fibrosis: myths, mistakes, and dogma.

    PubMed

    Rubin, Bruce K

    2014-03-01

As a student I recall being told that half of what we would learn in medical school would be proven to be wrong. The challenges were to identify the incorrect half and, often more challenging, be willing to give up our entrenched ideas. Myths have been defined as traditional concepts or practices with no basis in fact. A misunderstanding is a mistaken approach or incomplete knowledge that can be resolved with better evidence, while firmly established misunderstandings can become dogma: a point of view put forth as authoritative without basis in fact. In this paper, I explore a number of myths, mistakes, and dogma related to cystic fibrosis disease and care. Many of these are myths that have long been vanquished and even forgotten, while others are controversial. In the future, many things taken as either fact or "clinical experience" today will be proven wrong. Let us examine these myths with an open mind and willingness to change our beliefs when justified. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Coherent exchange of healthcare knowledge in open systems.

    PubMed

    Buchan, I; Hanka, R

    1997-01-01

    This paper outlines design philosophies and methods for healthcare knowledge systems. Clinical priorities for knowledge are discussed in terms of temporal and individual needs. Book centred organisation of healthcare knowledge, which has proven effective in clinical practice, is proposed as the basis of virtual libraries available at the point of care for target groups of healthcare professionals.

  8. An Investigation of the Effectiveness of Family-Centred Positive Behaviour Support of Young Children with Disabilities

    ERIC Educational Resources Information Center

    Chu, Szu-Yin

    2015-01-01

    Positive Behaviour Intervention and Support (PBIS) is an evidence-based approach that has been proven to be effective in remediating problem behaviours in children. The purpose of this study was to evaluate the effectiveness of the family-centred PBIS approach when involving Taiwanese families in the treatment of off-task and non-compliant…

  9. The Evolution of the DARWIN System

    NASA Technical Reports Server (NTRS)

    Walton, Joan D.; Filman, Robert E.; Korsmeyer, David J.; Norvig, Peter (Technical Monitor)

    1999-01-01

DARWIN is a web-based system for presenting the results of wind-tunnel testing and computational model analyses to aerospace designers. DARWIN captures the data, maintains the information, and manages derived knowledge (e.g. visualizations) of large quantities of aerospace data. In addition, it provides tools and an environment for distributed collaborative engineering. We are currently constructing the third version of the DARWIN software system. DARWIN's development history has, in some sense, tracked the development of web applications. The 1995 DARWIN reflected the latest web technologies--CGI scripts, Java applets and a three-layer architecture--available at that time. The 1997 version of DARWIN expanded on this base, making extensive use of a plethora of web technologies, including Java/JavaScript and Dynamic HTML. While more powerful, this multiplicity has proven to be a maintenance and development headache. The year 2000 version of DARWIN will provide a more stable and uniform foundation environment, composed primarily of Java mechanisms. In this paper, we discuss this evolution, comparing the strengths and weaknesses of the various architectural approaches and describing the lessons learned about building complex web applications.

  10. The Effect on Pupils' Science Performance and Problem-Solving Ability through Lego: An Engineering Design-Based Modeling Approach

    ERIC Educational Resources Information Center

    Li, Yanyan; Huang, Zhinan; Jiang, Menglu; Chang, Ting-Wen

    2016-01-01

    Incorporating scientific fundamentals via engineering through a design-based methodology has proven to be highly effective for STEM education. Engineering design can be instantiated for learning, as it involves mental and physical stimulation and develops practical skills, especially in problem solving. Lego bricks, as a set of toys based on design…

  11. Citation and Recognition of contributions using Semantic Provenance Knowledge Captured in the OPeNDAP Software Framework

    NASA Astrophysics Data System (ADS)

    West, P.; Michaelis, J.; Lebot, T.; McGuinness, D. L.; Fox, P. A.

    2014-12-01

    Providing proper citation and attribution for published data, derived data products, and the software tools used to generate them has always been an important aspect of scientific research. However, it is often the case that this type of detailed citation and attribution is lacking. This is in part because it often requires manual markup, since dynamic generation of this type of provenance information is not typically done by the tools used to access, manipulate, transform, and visualize data. In addition, the tools themselves lack the information needed to be properly cited. The OPeNDAP Hyrax Software Framework is a tool that provides access to, and the ability to constrain, manipulate, and transform, different types of data from different data formats into a common format, the DAP (Data Access Protocol), in order to derive new data products. A user, or another software client, specifies an HTTP URL in order to access a particular piece of data and appropriately transform it to suit a specific purpose of use. The resulting data products, however, do not contain any information about what data was used to create them, or the software process used to generate them, let alone information that would allow proper citation and attribution for downstream researchers and tool developers. We will present our approach to provenance capture in Hyrax, including a mechanism that can be used to report back to the hosting site any derived products, such as publications and reports, using the W3C PROV recommendation's pingback service. We will demonstrate our utilization of Semantic Web and Web standards, the development of an information model that extends the PROV model for provenance capture, and the development of the pingback service. We will present our findings, as well as our practices for providing provenance information, visualizing it, and developing pingback services, to better enable scientists and tool developers to be recognized and properly cited for their contributions.
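    A provenance pingback of the kind described above can be sketched in a few lines. Per the W3C PROV-AQ note, a client POSTs a `text/uri-list` of provenance records to the pingback endpoint the data host advertises; the endpoint URL and product URI below are hypothetical, not from the Hyrax deployment.

```python
# Sketch of a PROV-AQ style provenance pingback request. The endpoint URL
# and provenance URI are invented for illustration.
from urllib import request

def build_pingback(pingback_url, provenance_uris):
    """Build the HTTP POST that reports derived products back to the data host."""
    body = "\n".join(provenance_uris).encode("utf-8")
    return request.Request(
        pingback_url,
        data=body,
        headers={"Content-Type": "text/uri-list"},  # media type per PROV-AQ
        method="POST",
    )

req = build_pingback(
    "https://data.example.org/prov/pingback/dataset42",   # hypothetical endpoint
    ["https://papers.example.org/report-7#prov"],         # hypothetical product
)
# request.urlopen(req) would deliver it; a 2xx response signals acceptance.
```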

  12. Resilience Simulation for Water, Power & Road Networks

    NASA Astrophysics Data System (ADS)

    Clark, S. S.; Seager, T. P.; Chester, M.; Eisenberg, D. A.; Sweet, D.; Linkov, I.

    2014-12-01

    The increasing frequency, scale, and damages associated with recent catastrophic events have called for a shift in focus from evading losses through risk analysis to improving threat preparation, planning, absorption, recovery, and adaptation through resilience. However, neither underlying theory nor analytic tools have kept pace with resilience rhetoric. As a consequence, current approaches to engineering resilience analysis often conflate resilience with robustness, or collapse into a deeper commitment to the very risk-analytic paradigm that proved problematic in the first place. This research seeks a generalizable understanding of resilience that is applicable in multiple disciplinary contexts. We adopt a unique investigative perspective by coupling social and technical analysis with human subjects research to discover the adaptive actions, ideas, and decisions that contribute to resilience in three socio-technical infrastructure systems: electric power, water, and roadways. Our research integrates physical models representing network objects with examination of the knowledge systems and social interactions revealed by human subjects making decisions in a simulated crisis environment. To ensure a diversity of contexts, we model electric power, water, roadway, and knowledge networks for Phoenix, AZ and Indianapolis, IN. We synthesize this in a new computer-based Resilient Infrastructure Simulation Environment (RISE) that allows individuals, groups (including students), and experts to test different network design configurations and crisis response approaches. By observing simulated failures and best performances, we expect a generalizable understanding of resilience to emerge that yields a measurable understanding of the sensing, anticipating, adapting, and learning processes that are essential to resilient organizations.

  13. Problem-Oriented Corporate Knowledge Base Models on the Case-Based Reasoning Approach Basis

    NASA Astrophysics Data System (ADS)

    Gluhih, I. N.; Akhmadulin, R. K.

    2017-07-01

    One promising direction for improving the efficiency of production processes and enterprise management is the creation and use of corporate knowledge bases. The article suggests a concept of problem-oriented corporate knowledge bases (PO CKB), in which knowledge is arranged around possible problem situations and serves as a tool for making and implementing decisions in such situations. For knowledge representation in a PO CKB, a case-based reasoning approach is proposed. Under this approach, the content of a case as a knowledge base component has been defined; based on a situation tree, a PO CKB knowledge model has been developed in which knowledge about typical situations, as well as specific examples of situations and solutions, is represented. A generalized structural chart of a problem-oriented corporate knowledge base and possible modes of its operation have been suggested. The obtained models allow creating and using corporate knowledge bases to support decision making and implementation, training, staff skill upgrading, and analysis of the decisions taken. The universal interpretation of the terms “situation” and “solution” adopted in the work allows the suggested models to be used to develop problem-oriented corporate knowledge bases in different subject domains. It has been suggested to use the developed models for building corporate knowledge bases for enterprises that operate engineering systems and networks at large production facilities.
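    The situation-tree organization described in this record can be sketched as a small data structure: typical situation classes form the tree, and concrete cases (situation plus solution) hang off the matching node. All names below are illustrative, not taken from the paper.

```python
# Minimal sketch of a problem-oriented case base organized around a
# situation tree. Node and case names are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Case:
    situation: str   # description of a concrete problem situation
    solution: str    # the decision that resolved it

@dataclass
class SituationNode:
    name: str                                    # typical situation class
    cases: list = field(default_factory=list)    # concrete examples
    children: list = field(default_factory=list) # more specific situations

def find_cases(node, situation_name):
    """Depth-first lookup of stored cases for a typical situation class."""
    if node.name == situation_name:
        return node.cases
    for child in node.children:
        found = find_cases(child, situation_name)
        if found:
            return found
    return []

root = SituationNode("equipment failure", children=[
    SituationNode("pump failure",
                  cases=[Case("pump P-101 cavitation", "throttle suction valve")]),
])
print([c.solution for c in find_cases(root, "pump failure")])
```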

  14. A convergent diffusion and social marketing approach for disseminating proven approaches to physical activity promotion.

    PubMed

    Dearing, James W; Maibach, Edward W; Buller, David B

    2006-10-01

    Approaches from diffusion of innovations and social marketing are used here to propose efficient means to promote and enhance the dissemination of evidence-based physical activity programs. While both approaches have traditionally been conceptualized as top-down, center-to-periphery, centralized efforts at social change, their operational methods have usually differed. The operational methods of diffusion theory have a strong relational emphasis, while the operational methods of social marketing have a strong transactional emphasis. Here, we argue for a convergence of diffusion of innovation and social marketing principles to stimulate the efficient dissemination of proven-effective programs. In general terms, we are encouraging a focus on societal sectors as a logical and efficient means for enhancing the impact of dissemination efforts. This requires an understanding of complex organizations and the functional roles played by different individuals in such organizations. In specific terms, ten principles are provided for working effectively within societal sectors and enhancing user involvement in the processes of adoption and implementation.

  15. Space Weather Monitoring for ISS Space Environments Engineering and Crew Auroral Observations

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.; Pettit, Donald R.; Hartman, William A.

    2012-01-01

    The awareness of potentially significant impacts of space weather on space- and ground-based technological systems has generated a strong desire in many sectors of government and industry to effectively transform knowledge and understanding of the variable space environment into useful tools and applications for use by those entities responsible for systems that may be vulnerable to space weather impacts. Essentially, effectively transitioning science knowledge to useful applications relevant to space weather has become important. This talk will present methodologies that have proven effective, and how, in the current environment, they can be applied to space weather transition efforts.

  16. SEE: structured representation of scientific evidence in the biomedical domain using Semantic Web techniques

    PubMed Central

    2014-01-01

    Background Accounts of evidence are vital to evaluate and reproduce scientific findings and integrate data on an informed basis. Currently, such accounts are often inadequate, unstandardized and inaccessible for computational knowledge engineering even though computational technologies, among them those of the semantic web, are ever more employed to represent, disseminate and integrate biomedical data and knowledge. Results We present SEE (Semantic EvidencE), an RDF/OWL-based approach for detailed representation of evidence in terms of the argumentative structure of the supporting background for claims even in complex settings. We derive design principles and identify minimal components for the representation of evidence. We specify the Reasoning and Discourse Ontology (RDO), an OWL representation of the model of scientific claims, their subjects, their provenance and their argumentative relations underlying the SEE approach. We demonstrate the application of SEE and illustrate its design patterns in a case study by providing an expressive account of the evidence for certain claims regarding the isolation of the enzyme glutamine synthetase. Conclusions SEE is suited to provide coherent and computationally accessible representations of evidence-related information such as the materials, methods, assumptions, reasoning and information sources used to establish a scientific finding by adopting a consistently claim-based perspective on scientific results and their evidence. SEE allows for extensible evidence representations, in which the level of detail can be adjusted and which can be extended as needed. It supports representation of arbitrarily many consecutive layers of interpretation and attribution and different evaluations of the same data. SEE and its underlying model could be a valuable component in a variety of use cases that require careful representation or examination of evidence for data presented on the semantic web or in other formats. PMID:25093070

  17. SEE: structured representation of scientific evidence in the biomedical domain using Semantic Web techniques.

    PubMed

    Bölling, Christian; Weidlich, Michael; Holzhütter, Hermann-Georg

    2014-01-01

    Accounts of evidence are vital to evaluate and reproduce scientific findings and integrate data on an informed basis. Currently, such accounts are often inadequate, unstandardized and inaccessible for computational knowledge engineering even though computational technologies, among them those of the semantic web, are ever more employed to represent, disseminate and integrate biomedical data and knowledge. We present SEE (Semantic EvidencE), an RDF/OWL-based approach for detailed representation of evidence in terms of the argumentative structure of the supporting background for claims even in complex settings. We derive design principles and identify minimal components for the representation of evidence. We specify the Reasoning and Discourse Ontology (RDO), an OWL representation of the model of scientific claims, their subjects, their provenance and their argumentative relations underlying the SEE approach. We demonstrate the application of SEE and illustrate its design patterns in a case study by providing an expressive account of the evidence for certain claims regarding the isolation of the enzyme glutamine synthetase. SEE is suited to provide coherent and computationally accessible representations of evidence-related information such as the materials, methods, assumptions, reasoning and information sources used to establish a scientific finding by adopting a consistently claim-based perspective on scientific results and their evidence. SEE allows for extensible evidence representations, in which the level of detail can be adjusted and which can be extended as needed. It supports representation of arbitrarily many consecutive layers of interpretation and attribution and different evaluations of the same data. SEE and its underlying model could be a valuable component in a variety of use cases that require careful representation or examination of evidence for data presented on the semantic web or in other formats.

  18. Quantum rewinding via phase estimation

    NASA Astrophysics Data System (ADS)

    Tabia, Gelo Noel

    2015-03-01

    In cryptography, the notion of a zero-knowledge proof was introduced by Goldwasser, Micali, and Rackoff. An interactive proof system is said to be zero-knowledge if any verifier interacting with an honest prover learns nothing beyond the validity of the statement being proven. With recent advances in quantum information technologies, it has become interesting to ask if classical zero-knowledge proof systems remain secure against adversaries with quantum computers. The standard approach to show the zero-knowledge property involves constructing a simulator for a malicious verifier that can be rewinded to a previous step when the simulation fails. In the quantum setting, the simulator can be described by a quantum circuit that takes an arbitrary quantum state as auxiliary input but rewinding becomes a nontrivial issue. Watrous proposed a quantum rewinding technique in the case where the simulation's success probability is independent of the auxiliary input. Here I present a more general quantum rewinding scheme that employs the quantum phase estimation algorithm. This work was funded by institutional research grant IUT2-1 from the Estonian Research Council and by the European Union through the European Regional Development Fund.

  19. Development of an Inquiry-Based Learning Support System Based on an Intelligent Knowledge Exploration Approach

    ERIC Educational Resources Information Center

    Wu, Ji-Wei; Tseng, Judy C. R.; Hwang, Gwo-Jen

    2015-01-01

    Inquiry-Based Learning (IBL) is an effective approach for promoting active learning. When inquiry-based learning is incorporated into instruction, teachers provide guiding questions for students to actively explore the required knowledge in order to solve the problems. Although the World Wide Web (WWW) is a rich knowledge resource for students to…

  20. Speechlinks: Robust Cross-Lingual Tactical Communication Aids

    DTIC Science & Technology

    2008-06-01

    domain, the ontology based translation has proven to be challenging to build in this domain, however recent developments show promising results...assignments, and the effect of domain knowledge on those requirements. • Improving the front end of the speech recognizer remains one of the most challenging ...users by being very selective. 4.2.3.2 Analysis of the Normal user type inference result Figure 4.11 shows one of the most challenging users to

  1. What to Teach and How to Teach It: Elementary Teachers' Views on Teaching Inquiry-Based, Interdisciplinary Science and Social Studies in Urban Settings

    ERIC Educational Resources Information Center

    Santau, Alexandra O.; Ritter, Jason K.

    2013-01-01

    Inquiry-based and interdisciplinary teaching practices exemplify constructivist approaches to education capable of facilitating authentic student learning; however, their implementation has proven particularly challenging within certain contexts in the United States. This qualitative study considers one such context via an investigation of…

  2. Modular Cognitive-Behavioral Therapy for Childhood Anxiety Disorders. Guides to Individualized Evidence-Based Treatment Series

    ERIC Educational Resources Information Center

    Chorpita, Bruce F.

    2006-01-01

    This clinically wise and pragmatic book presents a systematic approach for treating any form of childhood anxiety using proven exposure-based techniques. What makes this rigorously tested modular treatment unique is that it is explicitly designed with flexibility and individualization in mind. Developed in a real-world, highly diverse community…

  3. Placing "Knowledge" in Teacher Education in the English Further Education Sector: An Alternative Approach Based on Collaboration and Evidence-Based Research

    ERIC Educational Resources Information Center

    Loo, Sai Y.

    2014-01-01

    This paper focuses on teacher education in the English further education sector, where the teaching of disciplinary and pedagogic knowledge is an issue. Using research findings, the paper advocates an approach based on collaboration and informed research to emphasize and integrate knowledge(s) in situated teaching contexts despite working in a…

  4. Next Generation Cloud-based Science Data Systems and Their Implications on Data and Software Stewardship, Preservation, and Provenance

    NASA Astrophysics Data System (ADS)

    Hua, H.; Manipon, G.; Starch, M.

    2017-12-01

    NASA's upcoming missions are expected to generate data volumes at least an order of magnitude larger than those of current missions. A significant increase in data processing, data rates, data volumes, and long-term data archive capabilities is needed. Consequently, new challenges are emerging that impact traditional data and software management approaches. At large scales, next-generation science data systems are exploring the move onto cloud computing paradigms to support these increased needs. New implications, such as costs, data movement, collocation of data systems and archives, and moving processing closer to the data, may result in changes to the stewardship, preservation, and provenance of science data and software. With more science data systems being onboarded onto cloud computing facilities, we can expect more Earth science data records to be both generated and kept in the cloud. But at large scales, the cost of processing and storing global data may impact architectural and system designs. Data systems will trade the cost of keeping data in the cloud against data life-cycle approaches that move "colder" data back to traditional on-premise facilities. How will this impact data citation and the stewardship of processing software? What are the effects of cloud-based on-demand processing on reproducibility and provenance? Similarly, with more science processing software being moved onto cloud, virtual machine, and container-based approaches, more opportunities arise for improved stewardship and preservation. But will the science community trust data reprocessed years or decades later? We will also explore emerging questions about the stewardship of the science data system software that generates the science data records, both during and after the life of the mission.

  5. Pilot Field Test: Results of Tandem Walk Performance Following Long-Duration Spaceflight

    NASA Technical Reports Server (NTRS)

    Cerisano, J. M.; Reschke, M. F.; Kofman, I. S.; Fisher, E. A.; Gadd, N. E.; Phillips, T. R.; Lee, S. M. C.; Laurie, S. S.; Stenger, M. B.; Bloomberg, J. J.; hide

    2016-01-01

    Coordinated locomotion has proven to be challenging for many astronauts following long duration spaceflight. As NASA's vision for spaceflight points toward interplanetary travel and missions to distant objects, astronauts will not have assistance once they land. Thus, it is vital to develop a knowledge base from which operational guidelines can be written that define when astronauts can be expected to safely perform certain tasks. Data obtained during the Field Test experiment will add important insight to this knowledge base. Specifically, we aim to develop a recovery timeline of functional sensorimotor performance during the first 24 hours and several days after landing. A forerunner of the full Field Test study, the Pilot Field Test (PFT) comprised a subset of the tasks and measurements to be included in the ultimate set.

  6. Identification of provenance rocks based on EPMA analyses of heavy minerals

    NASA Astrophysics Data System (ADS)

    Shimizu, M.; Sano, N.; Ueki, T.; Yonaga, Y.; Yasue, K. I.; Masakazu, N.

    2017-12-01

    Information on mountain building is significant in the field of geological disposal of high-level radioactive waste, because it affects the long-term stability of the groundwater flow system. Provenance analysis is one of the effective approaches for understanding the building process of mountains. Chemical compositions of heavy minerals, as well as their chronological data, can serve as an index for the identification of provenance rocks. Accurate identification requires the measurement of as many grains as possible. In order to achieve an efficient provenance analysis, we developed a method for quick identification of heavy minerals using an Electron Probe Micro Analyzer (EPMA). In this method, heavy mineral grains extracted from a sample were aligned on a glass slide and mounted in a resin. Concentrations of 28 elements were measured for 300-500 grains per sample using EPMA. To measure as many grains as possible, we prioritized swiftness of measurement over precision, configuring a measurement time of about 3.5 minutes for each grain. Identification of heavy minerals was based on their chemical composition. We developed a Microsoft® Excel® spreadsheet that encodes mineral-identification criteria as typical ranges of chemical compositions for each mineral. Grains with totals of <80 wt.% or >110 wt.% were rejected. The criteria of mineral identification were revised through comparison between mineral identification by optical microscopy and the chemical compositions of grains classified as "unknown minerals". Provenance rocks can be identified based on the abundance ratios of the identified minerals. If no significant difference in abundance ratio was found among source rocks, the chemical composition of specific minerals was used as another index. This method was applied to the sediments of some regions in Japan where provenance rocks had lithological variations but similar formation ages. Consequently, the provenance rocks were identified based on the chemical compositions of heavy minerals resistant to weathering, such as zircon and ilmenite. This study was carried out under a contract with the Ministry of Economy, Trade and Industry of Japan as part of its R&D supporting program for developing geological disposal technology.
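    The classification step this record describes (reject grains with poor analytical totals, then match composition ranges to mineral names) can be sketched as follows. The oxide thresholds below are invented for illustration; the actual criteria live in the authors' spreadsheet.

```python
# Sketch of EPMA grain classification: reject poor totals, then match
# composition ranges. Thresholds are illustrative only.
def classify_grain(oxides):
    """Classify one EPMA analysis (dict of oxide -> wt.%) into a mineral name."""
    total = sum(oxides.values())
    if total < 80 or total > 110:      # reject poor analyses, as in the record
        return "rejected"
    if oxides.get("ZrO2", 0) > 55:     # illustrative threshold for zircon
        return "zircon"
    if oxides.get("TiO2", 0) > 40 and oxides.get("FeO", 0) > 30:
        return "ilmenite"              # illustrative threshold
    return "unknown"

grains = [
    {"ZrO2": 65.0, "SiO2": 32.0},
    {"TiO2": 52.0, "FeO": 44.0},
    {"SiO2": 40.0},                    # low total -> rejected
]
counts = {}
for g in grains:
    name = classify_grain(g)
    counts[name] = counts.get(name, 0) + 1
print(counts)  # abundance ratios of identified minerals feed the comparison
```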

  7. Participatory approach to the development of a knowledge base for problem-solving in diabetes self-management.

    PubMed

    Cole-Lewis, Heather J; Smaldone, Arlene M; Davidson, Patricia R; Kukafka, Rita; Tobin, Jonathan N; Cassells, Andrea; Mynatt, Elizabeth D; Hripcsak, George; Mamykina, Lena

    2016-01-01

    To develop an expandable knowledge base of reusable knowledge related to self-management of diabetes that can be used as a foundation for patient-centric decision support tools. The structure and components of the knowledge base were created in participatory design with academic diabetes educators using knowledge acquisition methods. The knowledge base was validated using a scenario-based approach with practicing diabetes educators and individuals with diabetes recruited from Community Health Centers (CHCs) serving economically disadvantaged communities and ethnic minorities in New York. The knowledge base includes eight glycemic control problems, over 150 behaviors known to contribute to these problems coupled with contextual explanations, and over 200 specific action-oriented self-management goals for correcting problematic behaviors, with corresponding motivational messages. The validation of the knowledge base suggested a high level of completeness and accuracy, and identified improvements in cultural appropriateness; these were addressed in new iterations of the knowledge base. The resulting knowledge base is theoretically grounded, incorporates practical and evidence-based knowledge used by diabetes educators in practice settings, and allows for personally meaningful choices by individuals with diabetes. The participatory design approach helped researchers to capture the implicit knowledge of practicing diabetes educators and make it explicit and reusable. The knowledge base proposed here is an important step towards the development of a new generation of patient-centric decision support tools for facilitating chronic disease self-management. While this knowledge base specifically targets diabetes, its overall structure and composition can be generalized to other chronic conditions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
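    The layered structure the record describes (problems, contributing behaviors with contextual explanations, corrective goals, motivational messages) can be sketched as nested data. The problem, behavior, goal, and message below are invented examples, not content from the actual knowledge base.

```python
# Hypothetical sketch of the knowledge-base layering: problem -> behaviors
# -> goals with motivational messages. All content is invented.
knowledge_base = {
    "morning hyperglycemia": {
        "behaviors": [
            {
                "description": "large carbohydrate load at dinner",
                "context": "slows overnight glucose clearance",
                "goals": [
                    {
                        "goal": "swap half of dinner starch for vegetables",
                        "message": "Small swaps tonight pay off tomorrow morning.",
                    }
                ],
            }
        ]
    }
}

def goals_for(problem):
    """Return all action-oriented goals linked to a glycemic control problem."""
    return [g["goal"]
            for b in knowledge_base.get(problem, {}).get("behaviors", [])
            for g in b["goals"]]

print(goals_for("morning hyperglycemia"))
```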

  8. Participatory approach to the development of a knowledge base for problem-solving in diabetes self-management

    PubMed Central

    Cole-Lewis, Heather J.; Smaldone, Arlene M.; Davidson, Patricia R.; Kukafka, Rita; Tobin, Jonathan N.; Cassells, Andrea; Mynatt, Elizabeth D.; Hripcsak, George; Mamykina, Lena

    2015-01-01

    Objective To develop an expandable knowledge base of reusable knowledge related to self-management of diabetes that can be used as a foundation for patient-centric decision support tools. Materials and methods The structure and components of the knowledge base were created in participatory design with academic diabetes educators using knowledge acquisition methods. The knowledge base was validated using a scenario-based approach with practicing diabetes educators and individuals with diabetes recruited from Community Health Centers (CHCs) serving economically disadvantaged communities and ethnic minorities in New York. Results The knowledge base includes eight glycemic control problems, over 150 behaviors known to contribute to these problems coupled with contextual explanations, and over 200 specific action-oriented self-management goals for correcting problematic behaviors, with corresponding motivational messages. The validation of the knowledge base suggested a high level of completeness and accuracy, and identified improvements in cultural appropriateness; these were addressed in new iterations of the knowledge base. Discussion The resulting knowledge base is theoretically grounded, incorporates practical and evidence-based knowledge used by diabetes educators in practice settings, and allows for personally meaningful choices by individuals with diabetes. The participatory design approach helped researchers to capture the implicit knowledge of practicing diabetes educators and make it explicit and reusable. Conclusion The knowledge base proposed here is an important step towards the development of a new generation of patient-centric decision support tools for facilitating chronic disease self-management. While this knowledge base specifically targets diabetes, its overall structure and composition can be generalized to other chronic conditions. PMID:26547253

  9. Probabilistic Sizing and Verification of Space Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Denaux, David; Ballhause, Dirk; Logut, Daniel; Lucarelli, Stefano; Coe, Graham; Laine, Benoit

    2012-07-01

    Sizing of ceramic parts is best optimised using a probabilistic approach, which takes into account the pre-existing flaw distribution in the ceramic part to compute a probability of failure of the part depending on the applied load, instead of a maximum allowable load as for a metallic part. This requires not only extensive knowledge of the material itself but also accurate control of the manufacturing process. In the end, risk reduction approaches such as proof testing may be used to lower the final probability of failure of the part. Sizing and verification of ceramic space structures have been performed by Astrium for more than 15 years, both with Zerodur and SiC: the Silex telescope structure, the Seviri primary mirror, the Herschel telescope, the Formosat-2 instrument, and other ceramic structures flying today. Throughout this period, Astrium has investigated and developed experimental ceramic analysis tools based on the Weibull probabilistic approach. In the scope of the ESA/ESTEC study “Mechanical Design and Verification Methodologies for Ceramic Structures”, to be concluded at the beginning of 2012, existing theories, the technical state of the art from international experts, and Astrium's experience with probabilistic analysis tools have been synthesized into a comprehensive sizing and verification method for ceramics. Both classical deterministic and more optimised probabilistic methods are available, depending on the criticality of the item and on optimisation needs. The methodology, based on proven theory, has been successfully applied to demonstration cases and has shown its practical feasibility.
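    The Weibull probabilistic approach the record refers to can be illustrated with the standard two-parameter weakest-link formula, P_f = 1 - exp(-(sigma/sigma_0)^m). The numbers below are illustrative only, not material data from the study.

```python
# Standard two-parameter Weibull failure probability for a ceramic part
# under uniform stress (weakest-link model). Parameters are illustrative.
import math

def weibull_failure_probability(stress, sigma0, m):
    """P_f = 1 - exp(-(stress/sigma0)^m); sigma0 is the characteristic
    strength, m the Weibull modulus describing flaw-size scatter."""
    return 1.0 - math.exp(-((stress / sigma0) ** m))

# Illustrative numbers: a part loaded well below its characteristic strength.
pf = weibull_failure_probability(stress=100.0, sigma0=250.0, m=10.0)
print(f"P_f = {pf:.2e}")
```

A high Weibull modulus (narrow strength scatter) makes P_f drop steeply below the characteristic strength, which is why proof testing, by truncating the flaw distribution, lowers the final probability of failure.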

  10. Integrating artificial and human intelligence into tablet production process.

    PubMed

    Gams, Matjaž; Horvat, Matej; Ožek, Matej; Luštrek, Mitja; Gradišek, Anton

    2014-12-01

    We developed a new machine learning-based method to facilitate the manufacturing processes of pharmaceutical products, such as tablets, in accordance with the Process Analytical Technology (PAT) and Quality by Design (QbD) initiatives. Our approach combines data available from prior production runs with machine learning algorithms that are assisted by a human operator with expert knowledge of the production process. The process parameters encompass those that relate to the attributes of the precursor raw materials and those that relate to the manufacturing process itself. During manufacturing, our method allows the production operator to inspect the impacts of various settings of process parameters within their proven acceptable range, with the purpose of choosing the most promising values in advance of the actual batch manufacture. The interaction between the human operator and the artificial intelligence system provides improved performance and quality. We successfully implemented the method on data provided by a pharmaceutical company for a particular product, a tablet, under development. We tested the accuracy of the method in comparison with several other machine learning approaches. The method is especially suitable for analyzing manufacturing processes characterized by a limited amount of data.

  11. Integrating Genetic Studies of Nicotine Addiction into Public Health Practice: Stakeholder Views on Challenges, Barriers and Opportunities

    PubMed Central

    Dingel, M.J.; Hicks, A.D.; Robinson, M.E.; Koenig, B.A.

    2011-01-01

    Objective: Will emerging genetic research strengthen tobacco control programs? In this empirical study, we interview stakeholders in tobacco control to illuminate debates about the role of genomics in public health. Methods: The authors performed open-ended interviews with 86 stakeholders from 5 areas of tobacco control: basic scientists, clinicians, tobacco prevention specialists, health payers, and pharmaceutical industry employees. Interviews were qualitatively analyzed using standard techniques. Results: The central tension is between the hope that an expanding genomic knowledge base will improve prevention and smoking cessation therapies and the fear that genetic research might siphon resources away from traditional and proven public health programs. While showing strong support for traditional public health approaches to tobacco control, stakeholders recognize weaknesses, specifically the difficulty of countering the powerful voice of the tobacco industry when mounting public campaigns and the problem of individuals who are resistant to treatment and continue smoking. Conclusions: In order for genetic research to be effectively translated into efforts to minimize the harm of smoking-related disease, the views of key stakeholders must be voiced and disagreements reconciled. Effective translation requires honest evaluation of both the strengths and limitations of genetic approaches. PMID:21757875

  12. A Discourse Based Approach to the Language Documentation of Local Ecological Knowledge

    ERIC Educational Resources Information Center

    Odango, Emerson Lopez

    2016-01-01

    This paper proposes a discourse-based approach to the language documentation of local ecological knowledge (LEK). The knowledge, skills, beliefs, cultural worldviews, and ideologies that shape the way a community interacts with its environment can be examined through the discourse in which LEK emerges. 'Discourse-based' refers to two components:…

  13. In-context query reformulation for failing SPARQL queries

    NASA Astrophysics Data System (ADS)

    Viswanathan, Amar; Michaelis, James R.; Cassidy, Taylor; de Mel, Geeth; Hendler, James

    2017-05-01

    Knowledge bases for decision support systems are growing increasingly complex through continued advances in data ingest and management approaches. However, humans do not possess the cognitive capability to retain a bird's-eye view of such knowledge bases, and may end up issuing unsatisfiable queries to such systems. This work focuses on the implementation of a query reformulation approach for graph-based knowledge bases, specifically designed to support the Resource Description Framework (RDF). The reformulation approach presented is instance- and schema-aware. Thus, in contrast to relaxation techniques found in the state of the art, the presented approach produces in-context query reformulations.
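    The contrast between blind relaxation and instance- and schema-aware reformulation can be illustrated with a toy example. The class hierarchy, instances, and entity names below are invented, and real systems would operate on RDF/SPARQL rather than Python dictionaries; the point is only that a failing type constraint is widened along the schema until the widened pattern actually has matching instances.

```python
# Toy sketch of schema- and instance-aware relaxation (hypothetical data,
# not the paper's implementation): a failing class constraint is widened
# to a superclass only when the widened pattern is actually satisfiable.

schema = {"ex:Sedan": "ex:Car", "ex:Car": "ex:Vehicle"}   # subclass -> superclass
instances = {"ex:v1": "ex:Car", "ex:v2": "ex:Truck"}      # instance -> type

def matches(cls):
    """Instances whose type is cls or any subclass of cls."""
    def is_sub(t):
        while t is not None:
            if t == cls:
                return True
            t = schema.get(t)
        return False
    return [i for i, t in instances.items() if is_sub(t)]

def reformulate(cls):
    """Walk up the class hierarchy until the pattern is satisfiable."""
    while cls is not None and not matches(cls):
        cls = schema.get(cls)          # schema-aware generalisation
    return cls                         # None if no satisfiable rewrite exists

print(reformulate("ex:Sedan"))  # prints ex:Car (nearest class with instances)
```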

  14. Microglia-Neuron Communication in Epilepsy.

    PubMed

    Eyo, Ukpong B; Murugan, Madhuvika; Wu, Long-Jun

    2017-01-01

    Epilepsy has remained a significant social concern and financial burden globally. Current therapeutic strategies are based primarily on neurocentric mechanisms that have not proven successful in at least a third of patients, raising the need for novel alternative and complementary approaches. Recent evidence implicates glial cells and neuroinflammation in the pathogenesis of epilepsy with the promise of targeting these cells to complement existing strategies. Specifically, the involvement of microglia, the major inflammatory cells in the epileptic brain, has been poorly studied. In this review, we highlight microglial reaction to experimental seizures, discuss microglial control of neuronal activities, and propose the functions of microglia during acute epileptic phenotypes, delayed neurodegeneration, and aberrant neurogenesis. Future research that would help fill in the current gaps in our knowledge includes epilepsy-induced alterations in basic microglial functions, neuro-microglial interactions during chronic epilepsy, and microglial contribution to developmental seizures. Studying the role of microglia in epilepsy could inform therapies to better alleviate the disease. GLIA 2016;65:5-18. © 2016 Wiley Periodicals, Inc.

  15. Participatory/problem-based methods and techniques for training in health and safety.

    PubMed

    Rosskam, E

    2001-01-01

    More knowledgeable and trained people are needed in the area of occupational health, safety, and environment (OSHE) if work-related fatalities, accidents, and diseases are to be reduced. Established systems have been largely ineffective, with few employers taking voluntary measures to protect workers and the environment and too few labor inspectors available. Training techniques using participatory methods and a worker empowerment philosophy have proven value. There is a demonstrated need for the use of education for action, promoting the involvement of workers in all levels of decision-making and problem-solving in the workplace. OSH risks particular to women's jobs are virtually unstudied and not addressed at policy levels in most countries. Trade unions and health and safety professionals need to demystify technical areas, empower workers, and encourage unions to dedicate special activities around women's jobs. Trained women are excellent motivators and transmitters of safety culture. Particular emphasis is given to train-the-trainer approaches.

  16. Identity-Based Verifiably Encrypted Signatures without Random Oracles

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Wu, Qianhong; Qin, Bo

    Fair exchange protocol plays an important role in electronic commerce in the case of exchanging digital contracts. Verifiably encrypted signatures provide an optimistic solution to these scenarios with an off-line trusted third party. In this paper, we propose an identity-based verifiably encrypted signature scheme. The scheme is non-interactive to generate verifiably encrypted signatures and the resulting encrypted signature consists of only four group elements. Based on the computational Diffie-Hellman assumption, our scheme is proven secure without using random oracles. To the best of our knowledge, this is the first identity-based verifiably encrypted signature scheme provably secure in the standard model.

  17. Dynamic Parameter Identification of Subject-Specific Body Segment Parameters Using Robotics Formalism: Case Study Head Complex.

    PubMed

    Díaz-Rodríguez, Miguel; Valera, Angel; Page, Alvaro; Besa, Antonio; Mata, Vicente

    2016-05-01

    Accurate knowledge of body segment inertia parameters (BSIP) improves the assessment of dynamic analysis based on biomechanical models, which is of paramount importance in fields such as sports activities or impact crash tests. Early approaches to BSIP identification relied on experiments conducted on cadavers or on imaging techniques applied to living subjects. Recent approaches for BSIP identification rely on inverse dynamic modeling. However, most of the approaches are focused on the entire body, and verification of BSIP for dynamic analysis of a distal segment or chain of segments, which has proven to be of significant importance in impact test studies, is rarely established. Previous studies have suggested that BSIP should be obtained by using subject-specific identification techniques. To this end, our paper develops a novel approach for estimating subject-specific BSIP based on static and dynamic identification models (SIM, DIM). We test the validity of SIM and DIM by comparing the results using parameters obtained from a regression model proposed by De Leva (1996, "Adjustments to Zatsiorsky-Seluyanov's Segment Inertia Parameters," J. Biomech., 29(9), pp. 1223-1230). Both SIM and DIM are developed considering robotics formalism. First, the static model allows the mass and center of gravity (COG) to be estimated. Second, the results from the static model are included in the dynamics equation, allowing us to estimate the moment of inertia (MOI). As a case study, we applied the approach to evaluate the dynamics modeling of the head complex. Findings provide some insight into the validity not only of the proposed method but also of the application proposed by De Leva (1996, "Adjustments to Zatsiorsky-Seluyanov's Segment Inertia Parameters," J. Biomech., 29(9), pp. 1223-1230) for dynamic modeling of body segments.
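    The two-stage static/dynamic identification scheme can be sketched in a minimal planar form. This is our illustration under strong simplifying assumptions (a single revolute joint, noise-free synthetic torques, invented ground-truth values), not the paper's full robotics formalism: the static trials yield the first mass moment, which is then substituted into the dynamic equation to recover the moment of inertia.

```python
import math

# Minimal planar sketch: identify a segment's first mass moment m*r from
# static gravity torques, then its moment of inertia I from a dynamic trial.
g = 9.81

def lsq1(a, b):
    """One-parameter least squares: minimise ||a*x - b||."""
    return sum(ai * bi for ai, bi in zip(a, b)) / sum(ai * ai for ai in a)

# Static trials: joint torque tau = (m*r) * g * cos(theta)
thetas = [0.0, 0.4, 0.8, 1.2]
true_mr = 0.35                                 # kg*m, ground truth for the demo
taus = [true_mr * g * math.cos(t) for t in thetas]
mr = lsq1([g * math.cos(t) for t in thetas], taus)

# Dynamic trial: tau = I*alpha + (m*r)*g*cos(theta), with m*r now known
alpha, theta = 3.0, 0.5
tau_dyn = 0.02 * alpha + true_mr * g * math.cos(theta)
I = (tau_dyn - mr * g * math.cos(theta)) / alpha

print(round(mr, 3), round(I, 3))  # recovers 0.35 and 0.02
```

    In the real study the same structure is written in regressor form over many samples, so both stages are linear least-squares problems.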

  18. About the Complexities of Video-Based Assessments: Theoretical and Methodological Approaches to Overcoming Shortcomings of Research on Teachers' Competence

    ERIC Educational Resources Information Center

    Kaiser, Gabriele; Busse, Andreas; Hoth, Jessica; König, Johannes; Blömeke, Sigrid

    2015-01-01

    Research on the evaluation of the professional knowledge of mathematics teachers (comprising for example mathematical content knowledge, mathematics pedagogical content knowledge and general pedagogical knowledge) has become prominent in the last decade; however, the development of video-based assessment approaches is a more recent topic. This…

  19. A Knowledge-Based Approach to Information Fusion for the Support of Military Intelligence

    DTIC Science & Technology

    2004-03-01

    and most reliable an appropriate picture of the battlespace. The presented approach of knowledge-based information fusion is focusing on the...incomplete and imperfect information of military reports and background knowledge can be supported substantially in an automated system.

  20. An advanced artificial intelligence tool for menu design.

    PubMed

    Khan, Abdus Salam; Hoffmann, Achim

    2003-01-01

    Computer-assisted menu design remains a difficult task. Usually, the knowledge that aids menu design by a computer is hard-coded, and because of that a computerised menu planner cannot handle the menu design problem for an unanticipated client. To address this problem we developed a menu design tool, MIKAS (menu construction using incremental knowledge acquisition system), an artificial intelligence system that allows the incremental development of a knowledge base for menu design. We allow an incremental knowledge acquisition process in which the expert is only required to provide hints to the system in the context of actual problem instances during menu design, using menus stored in a so-called Case Base. Our system incorporates Case-Based Reasoning (CBR), an Artificial Intelligence (AI) technique developed to mimic human problem-solving behaviour. Ripple Down Rules (RDR) are a proven technique for acquiring classification knowledge from experts directly while they are using the system, and they complement CBR in a very fruitful way. This combination allows the incremental improvement of the menu design system while it is already in routine use. We believe MIKAS enables better dietary practice by leveraging a dietitian's skills and expertise. As such, MIKAS has the potential to be helpful for any institution where dietary advice is practised.
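    The Ripple Down Rules idea, adding an exception branch to refine an existing rule instead of editing it, can be sketched as follows. The rule conditions and conclusions here are invented for illustration and are not taken from MIKAS.

```python
# Sketch of a Ripple Down Rules node with an exception branch, the
# incremental-acquisition mechanism described above (rules are made up).

class RDRNode:
    def __init__(self, cond, conclusion, except_=None, else_=None):
        self.cond, self.conclusion = cond, conclusion
        self.except_, self.else_ = except_, else_

    def classify(self, case, default=None):
        if self.cond(case):
            if self.except_:
                # An exception, added later in context, refines the rule.
                refined = self.except_.classify(case)
                return refined if refined is not None else self.conclusion
            return self.conclusion
        if self.else_:
            return self.else_.classify(case, default)
        return default

# Base rule for diabetic clients, later refined by an exception for vegans:
base = RDRNode(lambda c: c.get("diabetic"), "low_sugar_menu",
               except_=RDRNode(lambda c: c.get("vegan"),
                               "low_sugar_vegan_menu"))
print(base.classify({"diabetic": True, "vegan": True}))
```

    The key property is that the original rule is never modified, so earlier behaviour is preserved while the knowledge base grows incrementally.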

  1. Cognitive task analysis for instruction in single-injection ultrasound guided-regional anesthesia

    NASA Astrophysics Data System (ADS)

    Gucev, Gligor V.

    Cognitive task analysis (CTA) is a methodology for eliciting knowledge from subject matter experts. CTA has been used to capture the cognitive processes, decision-making, and judgments that underlie expert behaviors. A review of the literature revealed that CTA had not yet been used to capture the knowledge required to perform ultrasound-guided regional anesthesia (UGRA). The purpose of this study was to utilize CTA to extract knowledge from UGRA experts and to determine whether instruction based on CTA of UGRA would produce results superior to those of traditional training. This study adds to the knowledge base of CTA in being the first to effectively capture the expert knowledge of UGRA. The derived protocol was used in a randomized, double-blinded experiment involving UGRA instruction to 39 novice learners. The results of this study strongly support the hypothesis that CTA-based instruction in UGRA is more effective than conventional clinical instruction, as measured by conceptual pre- and post-tests, performance of a simulated UGRA procedure, and time necessary for the task performance. This study adds to the number of studies that have proven the superiority of CTA-informed instruction. Finally, it produced several validated instruments that can be used in instructing and evaluating UGRA.

  2. The implementation of a community-based aerobic walking program for mild to moderate knee osteoarthritis: A knowledge translation randomized controlled trial: Part II: Clinical outcomes

    PubMed Central

    2012-01-01

    Background Osteoarthritis (OA) is the most common joint disorder in the world, as it appears to be prevalent among 80% of individuals over the age of 75. Although physical activities such as walking have been scientifically proven to improve physical function and arthritic symptoms, individuals with OA tend to adopt a sedentary lifestyle. There is therefore a need to improve knowledge translation in order to influence individuals to adopt effective self-management interventions, such as an adapted walking program. Methods A single-blind, randomized controlled trial was conducted. Subjects (n = 222) were randomized to one of three knowledge translation groups: 1) Walking and Behavioural intervention (WB) (18 males, 57 females), which included the supervised community-based aerobic walking program combined with a behavioural intervention and an educational pamphlet on the benefits of walking; 2) Walking intervention (W) (24 males, 57 females), wherein participants only received the supervised community-based aerobic walking program intervention and the educational pamphlet; 3) Self-directed control (C) (32 males, 52 females), wherein participants only received the educational pamphlet. One-way analyses of variance were used to test for differences in quality of life, adherence, confidence, and clinical outcomes among the study groups at each 3-month assessment during the 12-month intervention period and 6-month follow-up period. Results The clinical and quality of life outcomes improved among participants in each of the three comparative groups. However, there were few statistically significant differences observed for quality of life and clinical outcomes at the long-term measurements: at 12 months (end of intervention) and at 6 months post-intervention (18-month follow-up). Outcome results varied among the three groups.
Conclusion The three groups were equivalent when determining the effectiveness of knowledge uptake and improvements in quality of life and other clinical outcomes. OA can be managed through the implementation of a proven effective walking program in existing community-based walking clubs. Trial registration Current Controlled Trials IRSCTNO9193542 PMID:23234575

  3. Towards improving the NASA standard soil moisture retrieval algorithm and product

    NASA Astrophysics Data System (ADS)

    Mladenova, I. E.; Jackson, T. J.; Njoku, E. G.; Bindlish, R.; Cosh, M. H.; Chan, S.

    2013-12-01

    Soil moisture mapping using passive microwave remote sensing techniques has proven to be one of the most effective ways of acquiring reliable global soil moisture information on a routine basis. An important step in this direction was made by the launch of the Advanced Microwave Scanning Radiometer on NASA's Earth Observing System Aqua satellite (AMSR-E). Along with the standard NASA algorithm and operational AMSR-E product, the easy access and availability of the AMSR-E data promoted the development and distribution of alternative retrieval algorithms and products. Several evaluation studies have demonstrated issues with the standard NASA AMSR-E product, such as a dampened temporal response and a limited range of the final retrievals, and noted that the available global passive-based algorithms, even though based on the same electromagnetic principles, produce different results in terms of accuracy and temporal dynamics. Our goal is to identify the theoretical causes that determine the reduced sensitivity of the NASA AMSR-E product and outline ways to improve the operational NASA algorithm, if possible. Properly identifying the underlying reasons that cause the above-mentioned features of the NASA AMSR-E product and the differences between the alternative algorithms requires a careful examination of the theoretical basis of each approach, specifically the simplifying assumptions and parametrization approaches adopted by each algorithm to reduce the dimensionality of the unknowns and to characterize the observing system. Statistically based error analyses, which are useful and necessary, provide information on the relative accuracy of each product but give very little information on the theoretical causes, knowledge that is essential for algorithm improvement.
Thus, we are currently examining the possibility of improving the standard NASA AMSR-E global soil moisture product by conducting a thorough, theoretically based review of, and inter-comparisons between, several well-established global retrieval techniques. A detailed discussion focused on the theoretical basis of each approach and on each algorithm's sensitivity to assumptions and parametrization choices will be presented. USDA is an equal opportunity provider and employer.

  4. Symmetry breaking in occupation number based slave-particle methods

    NASA Astrophysics Data System (ADS)

    Georgescu, Alexandru B.; Ismail-Beigi, Sohrab

    2017-10-01

    We describe a theoretical approach to finding spontaneously symmetry-broken electronic phases due to strong electronic interactions when using recently developed slave-particle (slave-boson) approaches based on occupation numbers. We describe why, to date, spontaneous symmetry breaking has proven difficult to achieve in such approaches. We then provide a total energy based approach for introducing auxiliary symmetry-breaking fields into the solution of the slave-particle problem that leads to lowered total energies for symmetry-broken phases. We point out that not all slave-particle approaches yield energy lowering: the slave-particle model being used must explicitly describe the degrees of freedom that break symmetry. Finally, our total energy approach permits us to greatly simplify the formalism used to achieve a self-consistent solution between spinon and slave modes while increasing the numerical stability and greatly speeding up the calculations.

  5. Executive Functions as Moderators of the Worked Example Effect: When Shifting Is More Important than Working Memory Capacity

    ERIC Educational Resources Information Center

    Schwaighofer, Matthias; Bühner, Markus; Fischer, Frank

    2016-01-01

    Worked examples have proven to be effective for knowledge acquisition compared with problem solving, particularly when prior knowledge is low (e.g., Kalyuga, 2007). However, in addition to prior knowledge, executive functions and fluid intelligence might be potential moderators of the effectiveness of worked examples. The present study examines…

  6. S-ProvFlow: provenance model and tools for scalable and adaptive analysis pipelines in geoscience.

    NASA Astrophysics Data System (ADS)

    Spinuso, A.; Mihajlovski, A.; Atkinson, M.; Filgueira, R.; Klampanos, I.; Sanchez, S.

    2017-12-01

    The reproducibility of scientific findings is essential to improve the quality and application of modern data-driven research. Delivering such reproducibility is challenging in the context of systems handling large data-streams with sophisticated computational methods. Similarly, the SKA (Square Kilometer Array) will collect an unprecedented volume of radio-wave signals that will have to be reduced and transformed into derived products, with impact on space-weather research. This highlights the importance of having cross-disciplinary mechanisms at the producer's side that rely on usable lineage data to support validation and traceability of the new artifacts. To be informative, provenance has to describe each method's abstractions and their implementation as mappings onto distributed platforms and their concurrent execution, capturing relevant internal dependencies at runtime. Producers and intelligent toolsets should be able to exploit the produced provenance, steering real-time monitoring activities and inferring adaptations of methods at runtime. We present a model of provenance (S-PROV) that extends W3C PROV and ProvONE, broadening the coverage of provenance to aspects related to distribution, scale-up and steering of stateful streaming operators in analytic pipelines. This is supported by a technical framework for tuneable and actionable lineage, ensuring its relevance to the users' interests and fostering its rapid exploitation to facilitate research practices. By applying concepts such as provenance typing and profiling, users define rules to capture common provenance patterns and activate selective controls based on domain metadata. The traces are recorded in a document store with index optimisation, and a web API serves advanced interactive tools (S-ProvFlow, https://github.com/KNMI/s-provenance). These allow different classes of consumers to rapidly explore the provenance data.
The system, which contributes to the SKA-Link initiative within technology and knowledge transfer events, will be discussed in the context of an existing data-intensive service for seismology (VERCE) and the newly funded project DARE (Delivering Agile Research Excellence), a generic solution for extreme data and methods in the geosciences that domain experts can understand, change and use effectively.

  7. Thermo-rheological behaviour of polymer melts in microinjection moulding

    NASA Astrophysics Data System (ADS)

    Vasco, J. C.; Maia, J. M.; Pouzada, A. S.

    2009-10-01

    Microinjection has proven to be one of the most efficient replication methods for microcomponents and microsystems in various domains of microengineering. The use of available commercial microinjection equipment to evaluate polymeric flow in microchannels would surely contribute to enhancing knowledge of such flow at the microscale under industrial conditions. This approach is appropriate since rheological phenomena such as wall slip, surface tension, melt pressure drop and polymer flow length can be studied. These aspects are not fully dealt with in current commercial simulation software packages. In this study a micromould was designed to assess and characterize the flow in microchannels under realistic industrial conditions.

  8. Holonomy, quantum mechanics and the signal-tuned Gabor approach to the striate cortex

    NASA Astrophysics Data System (ADS)

    Torreão, José R. A.

    2016-02-01

    It has been suggested that an appeal to holographic and quantum properties will be ultimately required for the understanding of higher brain functions. On the other hand, successful quantum-like approaches to cognitive and behavioral processes bear witness to the usefulness of quantum prescriptions as applied to the analysis of complex non-quantum systems. Here, we show that the signal-tuned Gabor approach for modeling cortical neurons, although not based on quantum assumptions, also admits a quantum-like interpretation. Recently, the equation of motion for the signal-tuned complex cell response has been derived and proven equivalent to the Schrödinger equation for a dissipative quantum system whose solutions come under two guises: as plane-wave and Airy-packet responses. By interpreting the squared magnitude of the plane-wave solution as a probability density, in accordance with the quantum mechanics prescription, we arrive at a Poisson spiking probability — a common model of neuronal response — while spike propagation can be described by the Airy-packet solution. The signal-tuned approach is also proven consistent with holonomic brain theories, as it is based on Gabor functions which provide a holographic representation of the cell’s input, in the sense that any restricted subset of these functions still allows stimulus reconstruction.
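    The quantum-like reading of the plane-wave solution can be made concrete with a small numeric sketch. The amplitude value is illustrative only; the point is the mapping the abstract describes, where the squared magnitude acts as a firing rate feeding a Poisson spiking model.

```python
import math

# Hedged sketch of the interpretation above: treat the squared magnitude
# of a plane-wave response as a spike rate and model counts as Poisson.

def poisson_pmf(k, lam):
    """Probability of k spikes in a window with mean rate lam."""
    return lam ** k * math.exp(-lam) / math.factorial(k)

amplitude = 1.5                      # |psi| of the plane-wave solution (made up)
rate = amplitude ** 2                # squared magnitude -> firing rate
probs = [poisson_pmf(k, rate) for k in range(4)]
print([round(p, 3) for p in probs])  # probabilities of 0..3 spikes
```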

  9. Development of a general-purpose, integrated knowledge capture and delivery system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, A.G.; Freer, E.B.

    1991-01-01

    KATIE (Knowledge-Based Assistant for Troubleshooting Industrial Equipment) was first conceived as a solution for maintenance problems. In the area of process control, maintenance technicians have become responsible for increasingly complicated equipment and an overwhelming amount of associated information. The sophisticated distributed control systems have proven to be such a drastic change for technicians that they are forced to rely on the engineer for troubleshooting guidance. Because it is difficult for a knowledgeable engineer to be readily available for troubleshooting,maintenance personnel wish to capture the information provided by the engineer. The solution provided has two stages. First, a specific complicated systemmore » was chosen as a test case. An effort was made to gather all available system information in some form. Second, a method of capturing and delivering this collection of information was developed. Several features were desired for this knowledge capture/delivery system (KATIE). Creation of the knowledge base needed to be independent of the delivery system. The delivery path need to be as simple as possible for the technician, and the capture, or authoring, system could provide very sophisticated features. It was decided that KATIE should be as general as possible, not internalizing specifics about the first implementation. The knowledge bases created needed to be completely separate from KATIE needed to have a modular structure so that each type of information (rules, procedures, manuals, symptoms) could be encapsulated individually.« less

  10. Comparing fusion techniques for the ImageCLEF 2013 medical case retrieval task.

    PubMed

    G Seco de Herrera, Alba; Schaer, Roger; Markonis, Dimitrios; Müller, Henning

    2015-01-01

    Retrieval systems can supply similar cases with a proven diagnosis to a new example case under observation to help clinicians during their work. The ImageCLEFmed evaluation campaign proposes a framework where research groups can compare case-based retrieval approaches. This paper focuses on the case-based task and adds results of the compound figure separation and modality classification tasks. Several fusion approaches are compared to identify the approaches best adapted to the heterogeneous data of the task. Fusion of visual and textual features is analyzed, demonstrating that the selection of the fusion strategy can improve the best performance on the case-based retrieval task. Copyright © 2014 Elsevier Ltd. All rights reserved.
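    Two late-fusion rules commonly compared in such benchmarks can be sketched briefly. The scores, document names, and the 0.3 weight below are invented; this is a generic illustration of score fusion, not the specific strategies evaluated in the paper.

```python
# Sketch of two common late-fusion rules for combining retrieval runs:
# CombSUM (sum of scores) and weighted linear fusion of two modalities.

def combsum(runs):
    """Sum each document's scores across retrieval runs, rank descending."""
    fused = {}
    for run in runs:
        for doc, score in run.items():
            fused[doc] = fused.get(doc, 0.0) + score
    return sorted(fused, key=fused.get, reverse=True)

def weighted(visual, textual, w=0.3):
    """Linear fusion: w * visual score + (1 - w) * textual score."""
    docs = set(visual) | set(textual)
    fused = {d: w * visual.get(d, 0.0) + (1 - w) * textual.get(d, 0.0)
             for d in docs}
    return sorted(fused, key=fused.get, reverse=True)

visual  = {"case1": 0.9, "case2": 0.4}
textual = {"case1": 0.2, "case2": 0.8, "case3": 0.5}
print(combsum([visual, textual]))   # ['case2', 'case1', 'case3']
print(weighted(visual, textual))
```

    Score normalisation across runs, which such comparisons must also address, is omitted here for brevity.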

  11. Drug-disease modeling in the pharmaceutical industry - where mechanistic systems pharmacology and statistical pharmacometrics meet.

    PubMed

    Helmlinger, Gabriel; Al-Huniti, Nidal; Aksenov, Sergey; Peskov, Kirill; Hallow, Karen M; Chu, Lulu; Boulton, David; Eriksson, Ulf; Hamrén, Bengt; Lambert, Craig; Masson, Eric; Tomkinson, Helen; Stanski, Donald

    2017-11-15

    Modeling & simulation (M&S) methodologies are established quantitative tools, which have proven to be useful in supporting the research, development (R&D), regulatory approval, and marketing of novel therapeutics. Applications of M&S help design efficient studies and interpret their results in context of all available data and knowledge to enable effective decision-making during the R&D process. In this mini-review, we focus on two sets of modeling approaches: population-based models, which are well-established within the pharmaceutical industry today, and fall under the discipline of clinical pharmacometrics (PMX); and systems dynamics models, which encompass a range of models of (patho-)physiology amenable to pharmacological intervention, of signaling pathways in biology, and of substance distribution in the body (today known as physiologically-based pharmacokinetic models) - which today may be collectively referred to as quantitative systems pharmacology models (QSP). We next describe the convergence - or rather selected integration - of PMX and QSP approaches into 'middle-out' drug-disease models, which retain selected mechanistic aspects, while remaining parsimonious, fit-for-purpose, and able to address variability and the testing of covariates. We further propose development opportunities for drug-disease systems models, to increase their utility and applicability throughout the preclinical and clinical spectrum of pharmaceutical R&D. Copyright © 2017 Elsevier B.V. All rights reserved.
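    The flavour of a 'middle-out' model, a mechanistic core wrapped with PMX-style statistical variability, can be sketched with a toy example. The one-compartment model, parameter values, and variability magnitude below are textbook illustrations of our own choosing, not models from the paper.

```python
import math
import random

# Toy sketch: a mechanistic one-compartment PK model (systems side)
# combined with log-normal between-subject variability on clearance
# (pharmacometrics side). All values are illustrative.

def concentration(dose, v, cl, t):
    """C(t) after an IV bolus in a one-compartment model."""
    return (dose / v) * math.exp(-(cl / v) * t)

def simulate_population(n, dose=100.0, v=10.0, cl_pop=2.0, omega=0.3):
    """Simulate n subjects, each with individual clearance cl_i."""
    rng = random.Random(0)               # fixed seed for reproducibility
    out = []
    for _ in range(n):
        cl_i = cl_pop * math.exp(rng.gauss(0.0, omega))  # log-normal IIV
        out.append(concentration(dose, v, cl_i, t=4.0))
    return out

cs = simulate_population(5)
print(all(c > 0 for c in cs))  # True: concentrations stay positive
```

    Adding covariates (e.g. making clearance a function of body weight) is the usual next step in such models.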

  12. On the acquisition and representation of procedural knowledge

    NASA Technical Reports Server (NTRS)

    Saito, T.; Ortiz, C.; Loftin, R. B.

    1992-01-01

    Historically, knowledge acquisition has proven to be one of the greatest barriers to the development of intelligent systems. Current practice generally requires lengthy interactions between the expert whose knowledge is to be captured and the knowledge engineer whose responsibility is to acquire and represent that knowledge in a useful form. Although much research has been devoted to the development of methodologies and computer software to aid in the capture and representation of some types of knowledge, little attention has been devoted to procedural knowledge. NASA personnel frequently perform tasks that are primarily procedural in nature. Previous work in the field of knowledge acquisition is reviewed, with a focus on knowledge acquisition for procedural tasks and special attention devoted to the Navy's VISTA tool. The design and development of a system for the acquisition and representation of procedural knowledge, TARGET (Task Analysis and Rule Generation Tool), is then described. TARGET is intended as a tool that permits experts to visually describe procedural tasks and as a common medium for knowledge refinement by the expert and knowledge engineer. The system is designed to represent the acquired knowledge in the form of production rules. Systems such as TARGET have the potential to profoundly reduce the time, difficulty, and cost of developing knowledge-based systems for the performance of procedural tasks.
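    The production-rule representation mentioned above can be sketched in a few lines. The rule contents (valves, pressures) are invented for illustration and do not come from TARGET itself.

```python
# Minimal sketch of procedural knowledge captured as production rules:
# each rule fires its action when all of its conditions hold.

rules = [
    {"if": {"valve": "closed", "pressure": "high"}, "then": "open_relief"},
    {"if": {"valve": "open"}, "then": "monitor_flow"},
]

def fire(state):
    """Return the actions of every rule whose conditions all match state."""
    return [r["then"] for r in rules
            if all(state.get(k) == v for k, v in r["if"].items())]

print(fire({"valve": "closed", "pressure": "high"}))  # ['open_relief']
```

    A tool like TARGET would generate such rules from an expert's visual task description rather than have them hand-written.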

  13. MMKG: An approach to generate metallic materials knowledge graph based on DBpedia and Wikipedia

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoming; Liu, Xin; Li, Xin; Pan, Dongyu

    2017-02-01

    The research and development of metallic materials play an important role in today's society, and meanwhile a great deal of metallic materials knowledge is generated and made available on the Web (e.g., Wikipedia) for materials experts. However, due to the diversity and complexity of metallic materials knowledge, utilizing that knowledge can be inconvenient. The idea of a knowledge graph (e.g., DBpedia) provides a good way to organize knowledge into a comprehensive entity network. Therefore, the motivation of our work is to generate a metallic materials knowledge graph (MMKG) using knowledge available on the Web. In this paper, an approach is proposed to build the MMKG based on DBpedia and Wikipedia. First, we use an algorithm based on directly linked sub-graph semantic distance (DLSSD) to preliminarily extract metallic materials entities from DBpedia according to some predefined seed entities; then, based on the results of the preliminary extraction, we use an algorithm that considers both semantic distance and string similarity (SDSS) to achieve the further extraction. Second, due to the absence of materials properties in DBpedia, we use an ontology-based method to extract property knowledge from the HTML tables of the corresponding Wikipedia Web pages to enrich the MMKG. A materials ontology is used to locate materials property tables as well as to identify the structure of those tables. The proposed approach is evaluated by precision, recall, F1 and time performance, and the appropriate thresholds for the algorithms in our approach are determined through experiments. The experimental results show that our approach achieves the expected performance. A tool prototype is also designed to facilitate the process of building the MMKG and to demonstrate the effectiveness of our approach.
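    The idea of combining a graph-based semantic distance with string similarity can be sketched as follows. The link graph, entity names, weighting, and the use of Jaccard overlap as the semantic component are our invented stand-ins, not the paper's DLSSD/SDSS definitions.

```python
from difflib import SequenceMatcher

# Toy sketch of the SDSS idea: score a candidate entity against a seed
# by mixing a graph-based semantic similarity with string similarity.

links = {  # directly linked DBpedia-style entities (hypothetical)
    "Steel": {"Alloy", "Iron"},
    "Stainless_steel": {"Steel", "Alloy"},
    "Poetry": {"Literature"},
}

def semantic_sim(a, b):
    """Jaccard overlap of directly linked neighbours."""
    na, nb = links.get(a, set()), links.get(b, set())
    return len(na & nb) / len(na | nb) if na | nb else 0.0

def sdss(seed, cand, w=0.5):
    """Weighted mix of semantic and string similarity."""
    string_sim = SequenceMatcher(None, seed, cand).ratio()
    return w * semantic_sim(seed, cand) + (1 - w) * string_sim

print(sdss("Steel", "Stainless_steel") > sdss("Steel", "Poetry"))  # True
```

    A threshold on this combined score, tuned experimentally as in the paper, would then decide which candidates enter the graph.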

  14. Web 2.0 collaboration tool to support student research in hydrology - an opinion

    NASA Astrophysics Data System (ADS)

    Pathirana, A.; Gersonius, B.; Radhakrishnan, M.

    2012-08-01

    A growing body of evidence suggests that it is unwise to make the a priori assumption that university students are ready and eager to embrace modern online technologies employed to enhance the educational experience. We present our opinion on employing Wiki, a popular Web 2.0 technology, in small student groups, based on a case study of using it, customized to work as a personal learning environment (PLE¹) (Fiedler and Väljataga, 2011), for supporting thesis research in hydrology. Since its inception in 2006, the system presented has proven to facilitate knowledge construction and peer communication within and across groups of students of different academic years and to stimulate learning. Being an open-ended and egalitarian system, it was a minimal burden to maintain, as all students became content authors and shared responsibility. A number of unintended uses of the system were also observed, like using it as a backup medium and mobile storage. We attribute the success and sustainability of the proposed Web 2.0-based approach to the fact that the efforts were not limited to the application of the technology, but comprised the creation of a supporting environment with educational activities organized around it. We propose that Wiki-based PLEs are much more suitable than traditional learning management systems for supporting non-classroom education activities like thesis research in hydrology. ¹Here we use the term PLE to refer to the conceptual framework to make the process of knowledge construction a personalized experience, rather than to refer to the technology (in this case Wiki) used to attempt implementing such a system.

  15. The elements of design knowledge capture

    NASA Technical Reports Server (NTRS)

    Freeman, Michael S.

    1988-01-01

    This paper will present the basic constituents of a design knowledge capture effort, including a discussion of the types of knowledge to be captured and the differences between design knowledge capture and more traditional knowledge base construction. These differences include both knowledge base structure and knowledge acquisition approach. The motivation for establishing a design knowledge capture effort as an integral part of major NASA programs will be outlined, along with the current NASA position on that subject. Finally, the approach taken in design knowledge capture for Space Station will be contrasted with that used in the HSTDEK project.

  16. Promoting Early Reading: Research, Resources, and Best Practices

    ERIC Educational Resources Information Center

    McKenna, Michael C., Ed.; Walpole, Sharon, Ed.; Conradi, Kristin, Ed.

    2010-01-01

    Bringing together leading scholars, this book describes proven ways to enhance early literacy skills in 3- and 4-year-olds, especially those from low-income families. Presented are scientifically based methods and approaches that are being applied in Early Reading First programs around the country. Important topics include promoting oral language…

  17. Designing a Pedagogical Model for Web Engineering Education: An Evolutionary Perspective

    ERIC Educational Resources Information Center

    Hadjerrouit, Said

    2005-01-01

    In contrast to software engineering, which relies on relatively well established development approaches, there is a lack of a proven methodology that guides Web engineers in building reliable and effective Web-based systems. Currently, Web engineering lacks process models, architectures, suitable techniques and methods, quality assurance, and a…

  18. Developing a European Practitioner Qualification: The TRAVORS2 Project

    ERIC Educational Resources Information Center

    Lester, Stan

    2013-01-01

    The TRAVORS projects, supported by the European Union's Lifelong Learning Programme, ran between 2008 and 2012. Their object was to develop training programmes for disability employment practitioners across nine countries based on proven approaches both to vocational rehabilitation and to skills training. The second of the two projects aimed to…

  19. A Thematic Instruction Approach to Teaching Technology and Engineering

    ERIC Educational Resources Information Center

    Moyer, Courtney D.

    2016-01-01

    Thematic instruction offers flexible opportunities to engage students with real-world experiences in the technology and engineering community. Whether used in a broad unifying theme or specific project-based theme, research has proven that thematic instruction has the capacity to link cross-curricular subjects, facilitate active learning, and…

  20. Federated provenance of oceanographic research cruises: from metadata to data

    NASA Astrophysics Data System (ADS)

    Thomas, Rob; Leadbetter, Adam; Shepherd, Adam

    2016-04-01

    The World Wide Web Consortium's Provenance Data Model and associated Semantic Web ontology (PROV-O) have created much interest in the Earth and Space Science Informatics community (Ma et al., 2014). Indeed, PROV-O has recently been posited as an upper ontology for the alignment of various data models (Cox, 2015). Similarly, PROV-O has been used as the building blocks of a data release lifecycle ontology (Leadbetter & Buck, 2015). In this presentation we show that alignment between different local data descriptions of an oceanographic research cruise can be achieved through alignment with PROV-O, and that descriptions of the funding bodies, organisations and researchers involved in a cruise and its associated data release lifecycle can be modelled within a PROV-O based environment. We show that, at first order, this approach is scalable by presenting results from three endpoints (the Biological and Chemical Oceanography Data Management Office at Woods Hole Oceanographic Institution, USA; the British Oceanographic Data Centre at the National Oceanography Centre, UK; and the Marine Institute, Ireland). Current advances in ontology engineering provide pathways to resolving reasoning issues arising from varying perspectives on implementing PROV-O. This includes the use of the Information Object design pattern, under which such edge cases as research cruise scheduling efforts can be considered. PROV-O describes only things which have happened, but the Information Object design pattern allows for the description of planned research cruises through its statement that the local data description is not the entity itself (in this case the planned research cruise), and therefore the local data description itself can be described using the PROV-O model.
In particular, we present the use of the data lifecycle ontology to show the connection between research cruise activities and their associated datasets, and the publication of those datasets online with Digital Object Identifiers and, more formally, in data journals. Use of the SPARQL 1.1 standard allows queries to be federated across these endpoints to create a distributed network of provenance documents. Future research directions will add further nodes to the federated network of oceanographic research cruise provenance to determine the true scalability of this approach, and will involve analysis of, and possible evolution of, the data release lifecycle ontology. References Nitin Arora et al., 2006. Information object design pattern for modeling domain specific knowledge. 1st ECOOP Workshop on Domain-Specific Program Development. Simon Cox, 2015. Pitfalls in alignment of observation models resolved using PROV as an upper ontology. Abstract IN33F-07 presented at the American Geophysical Union Fall Meeting, 14-18 December, San Francisco. Adam Leadbetter & Justin Buck, 2015. "Where did my data layer come from?" The semantics of data release. Geophysical Research Abstracts 17, EGU2015-3746-1. Xiaogang Ma et al., 2014. Ontology engineering in provenance enablement for the National Climate Assessment. Environmental Modelling & Software 61, 191-205. http://dx.doi.org/10.1016/j.envsoft.2014.08.002
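
    As a rough illustration of the federated querying mentioned above, the snippet below assembles a single SPARQL 1.1 query whose SERVICE clauses fan out to multiple endpoints. The endpoint URLs, the cruise URI, and the use of prov:wasGeneratedBy to link datasets to a cruise activity are placeholder assumptions, not the project's actual endpoints or data model; in practice the query would be executed by a SPARQL 1.1-compliant engine that dispatches each SERVICE block remotely.

```python
# Sketch of a SPARQL 1.1 federated query in the spirit of the approach above.
# All URLs and the graph pattern are hypothetical placeholders.
ENDPOINTS = [
    "http://example.org/bco-dmo/sparql",     # placeholder for BCO-DMO
    "http://example.org/bodc/sparql",        # placeholder for BODC
    "http://example.org/marine-inst/sparql", # placeholder for Marine Institute
]

def federated_cruise_query(cruise_uri):
    """Assemble one query whose SERVICE clauses fan out across endpoints."""
    services = "\n".join(
        f"  OPTIONAL {{ SERVICE <{ep}> {{ ?dataset prov:wasGeneratedBy <{cruise_uri}> . }} }}"
        for ep in ENDPOINTS
    )
    return (
        "PREFIX prov: <http://www.w3.org/ns/prov#>\n"
        "SELECT DISTINCT ?dataset WHERE {\n" + services + "\n}"
    )

print(federated_cruise_query("http://example.org/cruise/CE14003"))
```

    Wrapping each SERVICE block in OPTIONAL lets the query succeed even when one endpoint holds no description of the cruise, which is the expected situation in a federation of independently curated provenance stores.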

  1. Effects of a Peer Assessment System Based on a Grid-Based Knowledge Classification Approach on Computer Skills Training

    ERIC Educational Resources Information Center

    Hsu, Ting-Chia

    2016-01-01

    In this study, a peer assessment system using the grid-based knowledge classification approach was developed to improve students' performance during computer skills training. To evaluate the effectiveness of the proposed approach, an experiment was conducted in a computer skills certification course. The participants were divided into three…

  2. A viewpoint-based case-based reasoning approach utilising an enterprise architecture ontology for experience management

    NASA Astrophysics Data System (ADS)

    Martin, Andreas; Emmenegger, Sandro; Hinkelmann, Knut; Thönssen, Barbara

    2017-04-01

    The accessibility of project knowledge obtained from experience is a crucial issue in enterprises. The information needed about project knowledge can differ from one person to another, depending on the roles he or she has. Therefore, a new ontology-based case-based reasoning (OBCBR) approach that utilises an enterprise ontology is introduced in this article to improve the accessibility of this project knowledge. Utilising an enterprise ontology improves the case-based reasoning (CBR) system through the systematic inclusion of enterprise-specific knowledge. This enterprise-specific knowledge is captured using the overall structure given by the enterprise ontology named ArchiMEO, which is a partial ontological realisation of the enterprise architecture framework (EAF) ArchiMate. This ontological representation, containing historical cases and specific enterprise domain knowledge, is applied in a new OBCBR approach. To support the different information needs of different stakeholders, the OBCBR approach has been built in such a way that different views, viewpoints, concerns and stakeholders can be considered. This is realised using a case viewpoint model derived from the ISO/IEC/IEEE 42010 standard. The introduced approach was implemented as a demonstrator and evaluated using an application case elicited from a business partner in a Swiss research project.

  3. Optimal filter design with progressive genetic algorithm for local damage detection in rolling bearings

    NASA Astrophysics Data System (ADS)

    Wodecki, Jacek; Michalak, Anna; Zimroz, Radoslaw

    2018-03-01

    Harsh industrial conditions in underground mining cause many difficulties for local damage detection in heavy-duty machinery. For vibration signals, one of the most intuitive approaches to obtaining a signal with the expected properties, such as clearly visible informative features, is prefiltration with an appropriately prepared filter. The design of such a filter is a broad field of research in its own right. In this paper the authors propose a novel approach to dedicated optimal filter design using a progressive genetic algorithm. The presented method is fully data-driven and requires no prior knowledge of the signal. It has been tested against a set of real and simulated data, and its effectiveness has been proven for both the healthy and the damaged case. A termination criterion for the evolution process was developed, and a diagnostic decision-making feature has been proposed for determining the final result.
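
    The overall evolutionary loop can be sketched as follows. This is a generic genetic-algorithm sketch, not the authors' progressive GA: the toy impulsive signal, population size, crossover/mutation scheme, and the use of kurtosis of the filtered signal as the fitness function (a common impulsiveness criterion in local damage detection) are all illustrative assumptions.

```python
# Minimal, generic GA sketch: evolve FIR filter coefficients so that the
# filtered signal maximises kurtosis. All parameters are toy assumptions.
import random

random.seed(0)

# Toy signal: Gaussian noise plus periodic impulses imitating a local fault.
signal = [random.gauss(0, 1) + (8.0 if i % 50 == 0 else 0.0) for i in range(500)]

def filtered(coeffs, x):
    """Plain FIR filtering by direct convolution."""
    n = len(coeffs)
    return [sum(c * x[i - j] for j, c in enumerate(coeffs)) for i in range(n - 1, len(x))]

def kurtosis(x):
    """Fourth standardized moment: high for impulsive signals."""
    m = sum(x) / len(x)
    var = sum((v - m) ** 2 for v in x) / len(x)
    return sum((v - m) ** 4 for v in x) / (len(x) * var ** 2)

def fitness(coeffs):
    return kurtosis(filtered(coeffs, signal))

def evolve(pop_size=20, genes=8, generations=30):
    pop = [[random.uniform(-1, 1) for _ in range(genes)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]            # elitist selection
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, genes)        # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.3:               # Gaussian mutation
                child[random.randrange(genes)] += random.gauss(0, 0.2)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print(fitness(best) > 0.0)  # prints True
```

    A real implementation would replace the toy signal with measured vibrations and add a termination criterion on fitness convergence rather than a fixed generation count, as the paper proposes.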

  4. How to Measure the Intervention Process? An Assessment of Qualitative and Quantitative Approaches to Data Collection in the Process Evaluation of Organizational Interventions

    PubMed Central

    Abildgaard, Johan S.; Saksvik, Per Ø.; Nielsen, Karina

    2016-01-01

    Organizational interventions aimed at improving employee health and wellbeing have proven challenging to evaluate. To analyze intervention processes, two methodological approaches have been widely used: quantitative (often questionnaire data) or qualitative (often interviews). Both methods are established tools, but their distinct epistemological properties enable them to illuminate different aspects of organizational interventions. In this paper, we use the quantitative and qualitative process data from an organizational intervention conducted in a national postal service, where the Intervention Process Measure questionnaire (N = 285) as well as an extensive interview study (N = 50) were used. We analyze what type of knowledge about intervention processes these two methodologies provide and discuss strengths and weaknesses as well as potentials for mixed-methods evaluation methodologies. PMID:27713707

  5. How to Measure the Intervention Process? An Assessment of Qualitative and Quantitative Approaches to Data Collection in the Process Evaluation of Organizational Interventions.

    PubMed

    Abildgaard, Johan S; Saksvik, Per Ø; Nielsen, Karina

    2016-01-01

    Organizational interventions aimed at improving employee health and wellbeing have proven challenging to evaluate. To analyze intervention processes, two methodological approaches have been widely used: quantitative (often questionnaire data) or qualitative (often interviews). Both methods are established tools, but their distinct epistemological properties enable them to illuminate different aspects of organizational interventions. In this paper, we use the quantitative and qualitative process data from an organizational intervention conducted in a national postal service, where the Intervention Process Measure questionnaire (N = 285) as well as an extensive interview study (N = 50) were used. We analyze what type of knowledge about intervention processes these two methodologies provide and discuss strengths and weaknesses as well as potentials for mixed-methods evaluation methodologies.

  6. An Adaptive Approach to Managing Knowledge Development in a Project-Based Learning Environment

    ERIC Educational Resources Information Center

    Tilchin, Oleg; Kittany, Mohamed

    2016-01-01

    In this paper we propose an adaptive approach to managing the development of students' knowledge in a comprehensive project-based learning (PBL) environment. Subject study is realized by two-stage PBL. It shapes the adaptive knowledge management (KM) process and promotes the correct balance between personalized and collaborative learning. The…

  7. Comparative economic performance and carbon footprint of two farming models for producing atlantic salmon (salmo salar): Land-based closed containment system in freshwater and open pen in seawater

    USDA-ARS?s Scientific Manuscript database

    Ocean net pen production of Atlantic salmon is approaching 2 million metric tons (MT) annually and has proven to be cost- and energy-efficient. Recently, with technology improvements, freshwater aquaculture of Atlantic salmon from eggs to a harvestable size of 4-5 kg in land-based closed containmen...

  8. Performance of the definitions of the systemic inflammatory response syndrome and sepsis in neonates.

    PubMed

    Hofer, Nora; Zacharias, Eva; Müller, Wilhelm; Resch, Bernhard

    2012-09-01

    The aim of this study was to examine the applicability of the definitions of the systemic inflammatory response syndrome (SIRS) and sepsis to neonates during the first 3 days of life. This is a retrospective study of all term neonates hospitalized within the first 24 h of life from 2004 to 2010 at our neonatal intensive care unit. Of 476 neonates, 30 (6 %) had a diagnosis of culture-proven early-onset sepsis (EOS) and 81 (17 %) had culture-negative clinical EOS or suspected EOS. The SIRS and sepsis criteria applied to 116 (24 %) and 61 (13 %) neonates, respectively. Of the 30 neonates with culture-proven EOS, 14 (53 %) fulfilled the SIRS and sepsis criteria. The individual SIRS criteria applied to 20 % (hypothermia or fever), 43 % (white blood cell count/immature-to-total neutrophil ratio), 87 % (respiratory symptoms), and 33 % (cardiocirculatory symptoms) of all neonates with culture-proven EOS. The definitions of SIRS and sepsis did not apply to about half of all cases of culture-proven EOS. An evidence-based approach to finding appropriate criteria for defining EOS in the neonate is needed.

  9. The Knowledge-Based Software Assistant: Beyond CASE

    NASA Technical Reports Server (NTRS)

    Carozzoni, Joseph A.

    1993-01-01

    This paper will outline the similarities and differences between two paradigms of software development. Both support the whole software life cycle and provide automation for most of the software development process, but have different approaches. The CASE approach is based on a set of tools linked by a central data repository. This tool-based approach is data driven and views software development as a series of sequential steps, each resulting in a product. The Knowledge-Based Software Assistant (KBSA) approach, a radical departure from existing software development practices, is knowledge driven and centers around a formalized software development process. KBSA views software development as an incremental, iterative, and evolutionary process with development occurring at the specification level.

  10. Learning Machine, Vietnamese Based Human-Computer Interface.

    ERIC Educational Resources Information Center

    Northwest Regional Educational Lab., Portland, OR.

    The sixth session of IT@EDU98 consisted of seven papers on the topic of the learning machine--Vietnamese based human-computer interface, and was chaired by Phan Viet Hoang (Informatics College, Singapore). "Knowledge Based Approach for English Vietnamese Machine Translation" (Hoang Kiem, Dinh Dien) presents the knowledge base approach,…

  11. Nonvolatile RRAM cells from polymeric composites embedding recycled SiC powders.

    PubMed

    De Girolamo Del Mauro, Anna; Nenna, Giuseppe; Miscioscia, Riccardo; Freda, Cesare; Portofino, Sabrina; Galvagno, Sergio; Minarini, Carla

    2014-10-21

    Silicon carbide powders have been synthesized from tires using a patented recycling process. Dynamic light scattering, Raman spectroscopy, SEM microscopy, and X-ray diffraction were carried out to characterize the powders and the final composite structure. The obtained powder has been proven to induce resistive switching in a PMMA polymer-based composite device. A memory effect has been detected in two-terminal devices with coplanar contacts and quantified by read-write-erase measurements in terms of level separation and persistence.

  12. Computational neuroanatomy: ontology-based representation of neural components and connectivity.

    PubMed

    Rubin, Daniel L; Talos, Ion-Florin; Halle, Michael; Musen, Mark A; Kikinis, Ron

    2009-02-05

    A critical challenge in neuroscience is organizing, managing, and accessing the explosion in neuroscientific knowledge, particularly anatomic knowledge. We believe that explicit knowledge-based approaches to making neuroscientific knowledge computationally accessible will be helpful in tackling this challenge and will enable a variety of applications exploiting this knowledge, such as surgical planning. We developed ontology-based models of neuroanatomy to enable symbolic lookup, logical inference, and mathematical modeling of neural systems. We built a prototype model of the motor system that integrates descriptive anatomic and qualitative functional neuroanatomical knowledge. In addition to modeling normal neuroanatomy, our approach provides an explicit representation of abnormal neural connectivity in disease states, such as common movement disorders. The ontology-based representation encodes both structural and functional aspects of neuroanatomy. The ontology-based models can be evaluated computationally, enabling the development of automated computer reasoning applications. Neuroanatomical knowledge can be represented in machine-accessible format using ontologies. Computational neuroanatomical approaches such as those described in this work could become a key tool in translational informatics, leading to decision support applications that inform and guide surgical planning and personalized care for neurological disease in the future.
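
    A minimal sketch of the kind of symbolic, computationally evaluable representation described above, assuming invented structure names and a single projects_to relation rather than any real neuroanatomical ontology:

```python
# Toy symbolic representation of neural connectivity as triples, with a
# simple transitive reachability check over the projects_to relation.
# Structure names are illustrative, not drawn from a real ontology.
CONNECTIONS = {
    ("motor_cortex", "projects_to", "internal_capsule"),
    ("internal_capsule", "projects_to", "spinal_cord"),
    ("spinal_cord", "projects_to", "peripheral_nerve"),
}

def reaches(source, target, triples):
    """Is there a chain of projects_to relations from source to target?"""
    frontier, seen = {source}, set()
    while frontier:
        node = frontier.pop()
        if node == target:
            return True
        seen.add(node)
        frontier |= {o for s, p, o in triples
                     if s == node and p == "projects_to" and o not in seen}
    return False

print(reaches("motor_cortex", "peripheral_nerve", CONNECTIONS))  # prints True
```

    This is the essence of the symbolic lookup and inference the abstract mentions: once connectivity is explicit as machine-readable relations, reasoning such as path-finding (or, in a disease model, detecting abnormal links) becomes a computation rather than an expert judgment.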

  13. Indigenous knowledges driving technological innovation

    Treesearch

    Lilian Alessa; Carlos Andrade; Phil Cash Cash; Christian P. Giardina; Matt Hamabata; Craig Hammer; Kai Henifin; Lee Joachim; Jay T. Johnson; Kekuhi Kealiikanakaoleohaililani; Deanna Kingston; Andrew Kliskey; Renee Pualani Louis; Amanda Lynch; Daryn McKenny; Chels Marshall; Mere Roberts; Taupouri Tangaro; Jyl Wheaton-Abraham; Everett Wingert

    2011-01-01

    This policy brief explores the use of geospatial technologies to represent Indigenous cultural knowledge and expands the conversation on their ability to do so. Indigenous peoples' use of geospatial technologies has already proven to be a critical step for protecting tribal self-determination. However, the ontological frameworks and techniques of Western geospatial...

  14. Benefiting from Customer and Competitor Knowledge: A Market-Based Approach to Organizational Learning

    ERIC Educational Resources Information Center

    Hoe, Siu Loon

    2008-01-01

    Purpose: The purpose of this paper is to review the organizational learning, market orientation and learning orientation concepts, highlight the importance of market knowledge to organizational learning and recommend ways of adopting a market-based approach to organizational learning. Design/methodology/approach: The extant organizational learning…

  15. Preliminary Structural Design Using Topology Optimization with a Comparison of Results from Gradient and Genetic Algorithm Methods

    NASA Technical Reports Server (NTRS)

    Burt, Adam O.; Tinker, Michael L.

    2014-01-01

    In this paper, genetic-algorithm-based and gradient-based topology optimization are presented in application to a real hardware design problem. Preliminary design of a planetary lander mockup structure is accomplished using these methods, which provide major weight savings by addressing structural efficiency during the design cycle. This paper presents two alternative formulations of the topology optimization problem. The first is the widely used gradient-based implementation using commercially available algorithms. The second is formulated using genetic algorithms and internally developed capabilities. These two approaches are applied to a practical design problem for hardware that has been built, tested, and proven to be functional. Both formulations converged on similar solutions and were therefore shown to be equally valid implementations of the process. This paper discusses both of these formulations at a high level.

  16. Knowledge of cervical cancer, attitude and husband’s support of Pap smear among multiparous women which have Pap’s smear examination in Aviati clinic Padang Bulan Medan

    NASA Astrophysics Data System (ADS)

    Feriyawati, L.; Anggraini, D. R.; Fitrie, A. A.; Anggreini, R. N.

    2018-03-01

    Cervical cancer is a serious health problem and is stated to be the second leading cause of death among women worldwide. Several studies have noted a higher incidence of cervical cancer with increasing parity. Early detection with the Pap smear is proven to reduce patient mortality. Knowledge, attitude, and husband's support contribute to leading women to undergo Pap smear examination. This study explores knowledge of cervical cancer, attitude, and husband's support of the Pap smear in multiparous women who have had a Pap smear examination. This research is a quantitative study with a cross-sectional approach that recruited 50 respondents, all multiparous women who had had a Pap smear examination at the Aviati Clinic, Padang Bulan, Medan. The data were collected by self-report using structured questionnaires. The results showed that 66% of respondents had high knowledge of cervical cancer and 76% had a positive attitude toward the Pap smear, but most had low husband's support, including informational support (62%), emotional support (46%), and real support (50%). This study reveals that multiparous women who had a Pap smear examination generally had high knowledge of cervical cancer and a positive attitude toward the Pap smear, even though most had low husband's support.

  17. Approaching Knowledge Management through the Lens of the Knowledge Life Cycle: A Case Study Investigation

    ERIC Educational Resources Information Center

    Fowlin, Julaine M.; Cennamo, Katherine S.

    2017-01-01

    More organizational leaders are recognizing that their greatest competitive advantage is the knowledge base of their employees and for organizations to thrive knowledge management (KM) systems need to be in place that encourage the natural interplay and flow of tacit and explicit knowledge. Approaching KM through the lens of the knowledge life…

  18. Combating desertification: building on traditional knowledge systems of the Thar Desert communities.

    PubMed

    Gaur, Mahesh K; Gaur, Hemlata

    2004-12-01

    The Thar Desert of western India is known for its rich and ancient culture and traditions. The communities have long been part of the Thar Desert ecosystem and have evolved specific strategies to live in harmony with its hostile environment. This culture has provided several miracle plants of immense food and medicinal value to modern civilisation. The ancient rural livelihood knowledge system reflects time-tested techno-scientific knowledge with a proven track record of sustainability, especially during natural hazards such as droughts and famines. In addition, several of the traditional skills of local communities in arts and crafts, music and instruments have made modern man aware of the art and techniques of sustainably utilising local biological resources and preserving their biodiversity, along with using waste products of the forests, without harming the desert ecosystem. Traditional cultural and socio-religious values are fast dwindling under the impact of a materialistic approach, industrialisation and development. This paper endeavours to illustrate the need to assist and propagate indigenous rural livelihood systems rather than mindlessly replace or abandon them as a result of state bureaucracies.

  19. Adaptive Knowledge Management of Project-Based Learning

    ERIC Educational Resources Information Center

    Tilchin, Oleg; Kittany, Mohamed

    2016-01-01

    The goal of an approach to Adaptive Knowledge Management (AKM) of project-based learning (PBL) is to intensify subject study through guiding, inducing, and facilitating the development of students' knowledge, accountability skills, and collaborative skills. Knowledge development is attained by knowledge acquisition, knowledge sharing, and knowledge…

  20. Science Illiteracy: Breaking the Cycle

    NASA Astrophysics Data System (ADS)

    Lebofsky, L. A.; Lebofsky, N. R.

    2003-12-01

    At the University of Arizona, as at many state universities and colleges, the introductory science classes for non-science majors may be the only science classes that future K-8 teachers will take. The design of the UA's General Education program requires all future non-science certified teachers to take the General Education science classes. These classes are therefore an ideal venue for the training of the state's future teachers. Many students, often including future teachers, are ill-prepared for college, i.e., they lack basic science content knowledge, basic mathematics skills, and reading and writing skills. They also lack basic critical thinking skills and study skills. It is within this context that our future teachers are trained. How do we break the cycle of science illiteracy? There is no simple solution, and certainly not a one-size-fits-all panacea that complements every professor's style of instruction. However, there are several programs at the University of Arizona, and also principles that I apply in my own classes, that may be adaptable in other classrooms. Assessment of K-12 students' learning supports the use of inquiry-based science instruction. This approach can be incorporated in college classes. Modeling proven and productive teaching methods for the future teachers provides far more than "just the facts," and all students gain from the inquiry approach. Providing authentic research opportunities employs an inquiry-based approach. Reading (outside the textbook) and writing provide feedback to students with poor writing and critical thinking skills. Using peer tutors and an instant-messaging hotline gives experience to the tutors and offers "comfortable" assistance to students.

  1. Field-based education and indigenous knowledge: Essential components of geoscience education for native American communities

    NASA Astrophysics Data System (ADS)

    Riggs, Eric M.

    2005-03-01

    The purpose of this study is to propose a framework drawing on theoretical and empirical science education research that explains the common prominent field-based components of the handful of persistent and successful Earth science education programs designed for indigenous communities in North America. These programs are primarily designed for adult learners, either in a postsecondary or in a technical education setting and all include active collaboration between local indigenous communities and geoscientists from nearby universities. Successful Earth science curricula for indigenous learners share in common an explicit emphasis on outdoor education, a place and problem-based structure, and the explicit inclusion of traditional indigenous knowledge in the instruction. Programs sharing this basic design have proven successful and popular for a wide range of indigenous cultures across North America. We present an analysis of common field-based elements to yield insight into indigenous Earth science education. We provide an explanation for the success of this design based in research on field-based learning, Native American learning styles research, and theoretical and empirical research into the nature and structure of indigenous knowledge. We also provide future research directions that can test and further refine our understanding of best practices in indigenous Earth science education.

  2. Knowledge Management as an Approach to Learning and Instructing Sector University Students in Post-Soviet Professional Education

    ERIC Educational Resources Information Center

    Volegzhanina, Irina S.; Chusovlyanova, Svetlana V.; Adolf, Vladimir A.; Bykadorova, Ekaterina S.; Belova, Elena N.

    2017-01-01

    The relevance of the study stems from its addressing the issue of knowledge management in learning and instructing students of post-Soviet sector universities. In this regard, the article is intended to reveal the nature of the knowledge management approach compared to the knowledge-based one that predominated in Soviet education. The flagship approach of…

  3. Charting Collective Knowledge: Supporting Self-Regulated Learning in the Workplace

    ERIC Educational Resources Information Center

    Littlejohn, Allison; Milligan, Colin; Margaryan, Anoush

    2012-01-01

    Purpose: This study aims to outline an approach to improving the effectiveness of work-based learning through knowledge creation and enhancing self-regulated learning. The paper presents a case example of a novel approach to learning through knowledge creation in the workplace. This case example is based on empirical data collected through a study…

  4. Course Ontology-Based User's Knowledge Requirement Acquisition from Behaviors within E-Learning Systems

    ERIC Educational Resources Information Center

    Zeng, Qingtian; Zhao, Zhongying; Liang, Yongquan

    2009-01-01

    User's knowledge requirement acquisition and analysis are very important for a personalized or user-adaptive learning system. Two approaches to capture user's knowledge requirement about course content within an e-learning system are proposed and implemented in this paper. The first approach is based on the historical data accumulated by an…

  5. Using the Knowledge Transfer Partnership Approach in Undergraduate Education and Practice-Based Training to Encourage Employer Engagement

    ERIC Educational Resources Information Center

    Harris, Margaret; Chisholm, Colin; Burns, George

    2013-01-01

    Purpose: The purpose of this paper is to provide a conceptual viewpoint which proposes the use of the post graduate Knowledge Transfer Partnership (KTP) approach to learning in undergraduate education and practice-based training. Design/methodology/approach: This is an examination of the KTP approach and how this could be used effectively in…

  6. A knowledge-based approach to improving and homogenizing intensity modulated radiation therapy planning quality among treatment centers: an example application to prostate cancer planning.

    PubMed

    Good, David; Lo, Joseph; Lee, W Robert; Wu, Q Jackie; Yin, Fang-Fang; Das, Shiva K

    2013-09-01

    Intensity modulated radiation therapy (IMRT) treatment planning can vary widely among treatment centers. We propose a system to leverage the IMRT planning experience of larger institutions to automatically create high-quality plans for outside clinics. We explore feasibility by adapting plans from our institution to patient datasets from an outside institution. A knowledge database was created from 132 IMRT treatment plans for prostate cancer at our institution. The outside institution, a community hospital, provided the datasets for 55 prostate cancer cases, including their original treatment plans. For each "query" case from the outside institution, a similar "match" case was identified in the knowledge database, and the match case's plan parameters were then adapted and optimized to the query case by use of a semiautomated approach that required no expert planning knowledge. The plans generated with this knowledge-based approach were compared with the original treatment plans at several dose cutpoints. Compared with the original plan, the knowledge-based plan had a significantly more homogeneous dose to the planning target volume and a significantly lower maximum dose. The volumes of the rectum, bladder, and femoral heads above all cutpoints were nominally lower for the knowledge-based plan; the reductions were significant for the rectum. In 40% of cases, the knowledge-based plan had overall superior (lower) dose-volume histograms for rectum and bladder; in 54% of cases, the comparison was equivocal; in 6% of cases, the knowledge-based plan was inferior for both bladder and rectum. Knowledge-based planning was superior or equivalent to the original plan in 95% of cases. The knowledge-based approach shows promise for homogenizing plan quality by transferring planning expertise from more experienced to less experienced institutions. Copyright © 2013 Elsevier Inc. All rights reserved.
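
    The case-matching step described above can be sketched as a simple nearest-neighbour lookup over a knowledge database. The feature names, values, and scaled Euclidean metric below are invented for illustration; a real knowledge-based planning system would use clinically validated anatomical and dosimetric features.

```python
# Hedged sketch of knowledge-based case matching: find the most similar
# prior case and reuse its plan parameters as a starting point. All
# features and values here are hypothetical.
import math

# Hypothetical knowledge database: geometry features -> plan identifier.
KNOWLEDGE_DB = [
    {"features": {"ptv_volume": 80.0, "rectum_overlap": 0.12}, "plan": "plan_A"},
    {"features": {"ptv_volume": 120.0, "rectum_overlap": 0.25}, "plan": "plan_B"},
    {"features": {"ptv_volume": 95.0, "rectum_overlap": 0.18}, "plan": "plan_C"},
]

def distance(query, case, scales):
    """Scaled Euclidean distance over the shared feature set."""
    return math.sqrt(sum(
        ((query[k] - case[k]) / scales[k]) ** 2 for k in query
    ))

def best_match(query, db):
    # Normalize each feature by its maximum magnitude in the database so
    # large-valued features do not dominate the metric.
    scales = {k: max(abs(c["features"][k]) for c in db) or 1.0 for k in query}
    return min(db, key=lambda c: distance(query, c["features"], scales))

query = {"ptv_volume": 90.0, "rectum_overlap": 0.17}
match = best_match(query, KNOWLEDGE_DB)
print(match["plan"])  # prints "plan_C"
```

    In the workflow the abstract describes, the matched case's plan parameters would then be adapted and re-optimized to the query anatomy; the lookup itself requires no expert planning knowledge, which is the point of the approach.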

  7. The Silverton Field Experience: A Model Geography Course for Achieving High-Impact Educational Practices (HEPs)

    ERIC Educational Resources Information Center

    Vogt, Brandon J.; Skop, Emily

    2017-01-01

    High-Impact Educational Practices (HEPs) are a set of specific teaching and learning approaches proven effective in university education. This paper focuses on the benefits derived from utilizing three particular HEPs (inquiry-based collaborative activities, undergraduate research, and experiential learning) while teaching a snow and ice field…

  8. Teaching Music to Students with Special Needs: A Label-Free Approach

    ERIC Educational Resources Information Center

    Hammel, Alice; Hourigan, Ryan

    2011-01-01

    A practical guide & reference manual, "Teaching Music to Students with Special Needs" addresses special needs in the broadest possible sense to equip teachers with proven, research-based curricular strategies that are grounded in both best practice and current special education law. Chapters address the full range of topics and issues music…

  9. The All-Day Kindergarten and Pre-K Curriculum: A Dynamic-Themes Approach

    ERIC Educational Resources Information Center

    Fromberg, Doris Pronin

    2011-01-01

    Grounded in theory and research, "The All-Day Kindergarten and Pre-K Curriculum" provides an activity-based and classroom-proven curriculum for educators to consider as they plan and interact with pre-k and kindergarten children. Allowing young children the opportunities to become independent, caring, critical thinkers who feel comfortable asking…

  10. EnvironMentors: Mentoring At-Risk High School Students through University Partnerships

    ERIC Educational Resources Information Center

    Monk, Melissa H.; Baustian, Melissa M.; Saari, Courtney R.; Welsh, Susan; D'Elia, Christopher F.; Powers, Joseph E.; Gaston, Suzan; Francis, Pamela

    2014-01-01

    Informal place-based environmental education is a proven approach for increasing environmental awareness for students in urban cities. This article describes and qualitatively evaluates the first two academic years of the EnvironMentors program at Louisiana State University (LSU-EM), which is part of a national network of EnvironMentors programs.…

  11. Provenance-Based Approaches to Semantic Web Service Discovery and Usage

    ERIC Educational Resources Information Center

    Narock, Thomas William

    2012-01-01

    The World Wide Web Consortium defines a Web Service as "a software system designed to support interoperable machine-to-machine interaction over a network." Web Services have become increasingly important both within and across organizational boundaries. With the recent advent of the Semantic Web, web services have evolved into semantic…

  12. An Evidence-Based Approach to an Adolescent with Emotional and Behavioral Dysregulation

    ERIC Educational Resources Information Center

    McClellan, Jon M.; Hamilton, John D.

    2006-01-01

    Children and adolescents in community mental health settings often present with multiple overlapping syndromes, as well as confounding issues such as family turmoil, abuse and neglect, and involvement with social welfare and juvenile justice systems. Most interventions proven to have efficacy in randomized, controlled trials have nevertheless not…

  13. Provenance-Powered Automatic Workflow Generation and Composition

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Lee, S.; Pan, L.; Lee, T. J.

    2015-12-01

    In recent years, scientists have learned how to codify tools into reusable software modules that can be chained into multi-step executable workflows. Existing scientific workflow tools, created by computer scientists, require domain scientists to meticulously design their multi-step experiments before analyzing data. However, this is oftentimes contradictory to a domain scientist's daily routine of conducting research and exploration. We aim to resolve this mismatch. Imagine this: An Earth scientist starts her day applying NASA Jet Propulsion Laboratory (JPL) published climate data processing algorithms over ARGO deep ocean temperature and AMSRE sea surface temperature datasets. Throughout the day, she tunes the algorithm parameters to study various aspects of the data. Suddenly, she notices some interesting results. She then turns to a computer scientist and asks, "can you reproduce my results?" By tracking and reverse engineering her activities, the computer scientist creates a workflow. The Earth scientist can now rerun the workflow to validate her findings, modify the workflow to discover further variations, or publish the workflow to share the knowledge. In this way, we aim to revolutionize computer-supported Earth science. We have developed a prototyping system to realize the aforementioned vision, in the context of service-oriented science. We have studied how Earth scientists conduct service-oriented data analytics research in their daily work, developed a provenance model to record their activities, and developed a technology to automatically generate workflows from recorded user behavior, supporting the adaptation and reuse of these workflows for replicating and improving scientific studies. A data-centric repository infrastructure is established to capture richer provenance and further facilitate collaboration in the science community. We have also established a Petri net-based verification instrument for provenance-based automatic workflow generation and recommendation.
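    The core idea of deriving a workflow from recorded activities can be sketched in a few lines. This is a minimal illustration under assumed names (the tools, file names, and log format below are hypothetical, not the authors' provenance model): each analysis step is logged with its inputs and outputs, and a replayable workflow is reconstructed by ordering steps so that each runs only after its inputs exist.

```python
# Illustrative sketch: a provenance log of analysis activities (tool invoked,
# data consumed, data produced), from which an ordered workflow is derived.
provenance_log = [
    {"tool": "fetch_argo_temps", "inputs": [],                      "outputs": ["argo.nc"]},
    {"tool": "fetch_amsre_sst",  "inputs": [],                      "outputs": ["amsre.nc"]},
    {"tool": "regrid",           "inputs": ["argo.nc", "amsre.nc"], "outputs": ["merged.nc"]},
    {"tool": "detect_anomalies", "inputs": ["merged.nc"],           "outputs": ["result.csv"]},
]

def derive_workflow(log):
    """Topologically order the recorded steps so each runs after its inputs exist."""
    available, ordered, pending = set(), [], list(log)
    while pending:
        # pick any step whose inputs have all been produced already
        step = next(s for s in pending if all(i in available for i in s["inputs"]))
        ordered.append(step["tool"])
        available.update(step["outputs"])
        pending.remove(step)
    return ordered

print(derive_workflow(provenance_log))
```

    A verification layer (Petri net-based in the paper) would then check such a derived workflow for properties like reachability and absence of deadlock before recommending it for reuse.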

  14. Rational design of gene-based vaccines.

    PubMed

    Barouch, Dan H

    2006-01-01

    Vaccine development has traditionally been an empirical discipline. Classical vaccine strategies include the development of attenuated organisms, whole killed organisms, and protein subunits, followed by empirical optimization and iterative improvements. While these strategies have been remarkably successful for a wide variety of viruses and bacteria, these approaches have proven more limited for pathogens that require cellular immune responses for their control. In this review, current strategies to develop and optimize gene-based vaccines are described, with an emphasis on novel approaches to improve plasmid DNA vaccines and recombinant adenovirus vector-based vaccines. Copyright 2006 Pathological Society of Great Britain and Ireland. Published by John Wiley & Sons, Ltd.

  15. Applying organizational science to health care: a framework for collaborative practice.

    PubMed

    Dow, Alan W; DiazGranados, Deborah; Mazmanian, Paul E; Retchin, Sheldon M

    2013-07-01

    Developing interprofessional education (IPE) curricula that improve collaborative practice across professions has proven challenging. A theoretical basis for understanding collaborative practice in health care settings is needed to guide the education and evaluation of health professions trainees and practitioners and support the team-based delivery of care. IPE should incorporate theory-driven, evidence-based methods and build competency toward effective collaboration. In this article, the authors review several concepts from the organizational science literature and propose using these as a framework for understanding how health care teams function. Specifically, they outline the team process model of action and planning phases in collaborative work; discuss leadership and followership, including how locus (a leader's integration into a team's usual work) and formality (a leader's responsibility conferred by the traditional hierarchy) affect team functions; and describe dynamic delegation, an approach to conceptualizing escalation and delegation within health care teams. For each concept, they identify competencies for knowledge, attitudes, and behaviors to aid in the development of innovative curricula to improve collaborative practice. They suggest that gaining an understanding of these principles will prepare health care trainees, whether team leaders or members, to analyze team performance, adapt behaviors that improve collaboration, and create team-based health care delivery processes that lead to improved clinical outcomes.

  16. Applying Organizational Science to Health Care: A Framework for Collaborative Practice

    PubMed Central

    Dow, Alan W.; DiazGranados, Deborah; Mazmanian, Paul E.; Retchin, Sheldon M.

    2013-01-01

    Developing interprofessional education (IPE) curricula that improve collaborative practice across professions has proven challenging. A theoretical basis for understanding collaborative practice in health care settings is needed to guide the education and evaluation of health professions trainees and practitioners and support the team-based delivery of care. IPE should incorporate theory-driven, evidence-based methods and build competency toward effective collaboration. In this article, the authors review several concepts from the organizational science literature and propose using these as a framework for understanding how health care teams function. Specifically, they outline the team process model of action and planning phases in collaborative work; discuss leadership and followership, including how locus (a leader’s integration into a team’s usual work) and formality (a leader’s responsibility conferred by the traditional hierarchy) affect team functions; and describe dynamic delegation, an approach to conceptualizing escalation and delegation within health care teams. For each concept, they identify competencies for knowledge, attitudes, and behaviors to aid in the development of innovative curricula to improve collaborative practice. They suggest that gaining an understanding of these principles will prepare health care trainees, whether team leaders or members, to analyze team performance, adapt behaviors that improve collaboration, and create team-based health care delivery processes that lead to improved clinical outcomes. PMID:23702530

  17. Web 2.0 collaboration tools to support student research in hydrology - an opinion

    NASA Astrophysics Data System (ADS)

    Pathirana, A.; Gersonius, B.; Radhakrishnan, M.

    2012-02-01

    A growing body of evidence suggests that it is unwise to make the a priori assumption that university students are ready and eager to embrace modern online technologies employed to enhance the educational experience. We present an opinion on employing Wiki, a popular Web 2.0 technology, in small student groups, based on a case study in which it was customized as a personal learning environment (PLE) to support thesis research in hydrology. Since its inception in 2006, the system presented here has proven to facilitate knowledge construction and peer-communication within and across groups of students of different academic years and to stimulate learning. Being an open-ended and egalitarian system, it was a minimal burden to maintain, as all students became content authors and shared responsibility. A number of unintended uses of the system were also observed, such as using it as a backup medium and mobile storage. We attribute the success and sustainability of the proposed Web 2.0-based approach to the fact that the efforts were not limited to the application of the technology, but comprised the creation of a supporting environment with educational activities organized around it. We propose that Wiki-based PLEs are much more suitable than traditional learning management systems for supporting non-classroom education activities like thesis research in hydrology.

  18. Three Metacognitive Approaches to Training Pre-Service Teachers in Different Learning Phases of Technological Pedagogical Content Knowledge

    ERIC Educational Resources Information Center

    Kramarski, Bracha; Michalsky, Tova

    2009-01-01

    Our study investigated 3 metacognitive approaches provided during different phases of learning technological pedagogical content knowledge (TPCK) in a Web-based learning environment. These metacognitive approaches were based on self-question prompts (Kramarski & Mevarech, 2003) which appeared in pop-up screens and fostered the Self-Regulated…

  19. Conceptual information processing: A robust approach to KBS-DBMS integration

    NASA Technical Reports Server (NTRS)

    Lazzara, Allen V.; Tepfenhart, William; White, Richard C.; Liuzzi, Raymond

    1987-01-01

    Integrating the respective functionality and architectural features of knowledge base and data base management systems is a topic of considerable interest. Several aspects of this topic and associated issues are addressed. The significance of integration and the problems associated with accomplishing that integration are discussed. The shortcomings of current approaches to integration and the need to fuse the capabilities of both knowledge base and data base management systems motivate the investigation of information processing paradigms. One such paradigm is concept based processing, i.e., processing based on concepts and conceptual relations. An approach to robust knowledge and data base system integration is discussed by addressing progress made in the development of an experimental model for conceptual information processing.

  20. PSYCHE: An Object-Oriented Approach to Simulating Medical Education

    PubMed Central

    Mullen, Jamie A.

    1990-01-01

    Traditional approaches to computer-assisted instruction (CAI) do not provide realistic simulations of medical education, in part because they do not utilize heterogeneous knowledge bases for their source of domain knowledge. PSYCHE, a CAI program designed to teach hypothetico-deductive psychiatric decision-making to medical students, uses an object-oriented implementation of an intelligent tutoring system (ITS) to model the student, domain expert, and tutor. It models the transactions between the participants in complex transaction chains, and uses heterogeneous knowledge bases to represent both domain and procedural knowledge in clinical medicine. This object-oriented approach is a flexible and dynamic approach to modeling, and represents a potentially valuable tool for the investigation of medical education and decision-making.

  1. Advancing nursing practice: redefining the theoretical and practical integration of knowledge.

    PubMed

    Christensen, Martin

    2011-03-01

    The aim of this paper is to offer an alternative knowing-how knowing-that framework of nursing knowledge, which in the past has been accepted as the provenance of advanced practice. The concept of advancing practice is central to the development of nursing practice and has been seen to take on many different forms depending on its use in context. To many it has become synonymous with the work of the advanced or expert practitioner; others have viewed it as a process of continuing professional development and skills acquisition. Moreover, it is becoming closely linked with practice development. However, there is much discussion as to what constitutes the knowledge necessary for advancing and advanced practice, and it has been suggested that theoretical and practical knowledge form the cornerstone of advanced knowledge. The design of this article takes a discursive approach to the meaning and integration of knowledge within the context of advancing nursing practice. A thematic analysis of the current discourse relating to knowledge integration models in an advancing and advanced practice arena was used to identify concurrent themes relating to the knowing-how knowing-that framework, which is commonly used to classify the knowledge necessary for advanced nursing practice. There is a dichotomy as to what constitutes knowledge for advanced and advancing practice. Several authors have offered a variety of differing models, yet it is the application and integration of theoretical and practical knowledge that defines and develops the advancement of nursing practice. An alternative framework offered here may allow differences in the way that nursing knowledge important for advancing practice is perceived, developed and coordinated. What has inevitably been neglected is that there are various other variables which, when transposed into the existing knowing-how knowing-that framework, allow advanced knowledge to be better defined. One of the more notable variables is pattern recognition, which became the focus of Benner's work on expert practice. Therefore, if this is included in the knowing-how knowing-that framework, the knowing-how becomes the knowledge that contributes to advancing and advanced practice and the knowing-that becomes the governing action based on a deeper understanding of the problem or issue. © 2011 Blackwell Publishing Ltd.

  2. Anti-inflammatory drugs and prediction of new structures by comparative analysis.

    PubMed

    Bartzatt, Ronald

    2012-01-01

    Nonsteroidal anti-inflammatory drugs (NSAIDs) are a group of agents important for their analgesic, anti-inflammatory, and antipyretic properties. This study presents several approaches to predict and elucidate new molecular structures of NSAIDs based on 36 known and proven anti-inflammatory compounds. Based on the 36 known NSAIDs, the mean value of Log P is found to be 3.338 (standard deviation = 1.237), the mean value of polar surface area is 63.176 Å² (standard deviation = 20.951 Å²), and the mean value of molecular weight is 292.665 (standard deviation = 55.627). Nine molecular properties are determined for these 36 NSAID agents, including Log P, number of -OH and -NHn groups, violations of the Rule of 5, number of rotatable bonds, and number of oxygens and nitrogens. Statistical analysis of these nine molecular properties provides numerical parameters to guide the design of novel NSAID drug candidates. Multiple regression analysis is accomplished using these properties of the 36 agents, followed by examples of predicted molecular weight based on minimum and maximum property values. Hierarchical cluster analysis indicated that licofelone, tolfenamic acid, meclofenamic acid, droxicam, and aspirin are substantially distinct from all remaining NSAIDs. Analysis of similarity (ANOSIM) produced R = 0.4947, which indicates a low to moderate level of dissimilarity between these 36 NSAIDs. Non-hierarchical K-means cluster analysis separated the 36 NSAIDs into four groups having members of greatest similarity. Likewise, discriminant analysis divided the 36 agents into two groups indicating the greatest level of distinction (discrimination) based on the nine properties. These two multivariate methods together provide investigators a means to compare novel drug designs with the 36 proven compounds and ascertain which of those compounds are most analogous in pharmacodynamics. In addition, artificial neural network modeling, trained on the 36 proven NSAIDs, is demonstrated as an approach to predict numerous molecular properties of new drug designs. Comprehensive and effective approaches are presented in this study for the design of new NSAID-type agents, which are important for inhibition of the COX-2 and COX-1 isoenzymes.
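    The screening logic described above (summary statistics over a reference set, then checking whether a candidate conforms) can be sketched briefly. The Log P values below are illustrative, not the paper's dataset.

```python
# Minimal sketch of the descriptive-statistics screen: compute mean and
# standard deviation of a property (here Log P) over known reference NSAIDs,
# then test whether a candidate falls within k standard deviations of the mean.
import statistics

reference_logp = [3.97, 3.12, 4.51, 2.26, 3.47, 1.19, 4.00, 3.28]  # hypothetical values
mean = statistics.mean(reference_logp)
sd = statistics.stdev(reference_logp)

def within_reference_range(candidate_logp, k=1.0):
    """True if the candidate's Log P lies within k standard deviations of the mean."""
    return abs(candidate_logp - mean) <= k * sd

print(round(mean, 3), round(sd, 3))
print(within_reference_range(3.3))
```

    The same pattern extends to each of the nine properties in the study; clustering or discriminant analysis then compares a candidate's full property vector against the reference compounds rather than one property at a time.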

  3. Applying the Kanban Method in Problem-Based Project Work: A Case Study in A Manufacturing Engineering Bachelor's Programme at Aalborg University Copenhagen

    ERIC Educational Resources Information Center

    Balve, Patrick; Krüger, Volker; Tolstrup Sørensen, Lene

    2017-01-01

    Problem-based learning (PBL) has proven to be highly effective for educating students in an active and self-motivated manner in various disciplines. Student projects carried out following PBL principles are very dynamic and carry a high level of uncertainty, both conditions under which agile project management approaches are assumed to be highly…

  4. Premenstrual syndrome: current knowledge and management.

    PubMed Central

    Robinson, G E

    1989-01-01

    Premenstrual syndrome (PMS) has become a popular self-diagnosis. Faulty research has led to confusion about the diagnosis, epidemiologic features, causes and treatment of this disorder. There is no proof that the premenstrual period is a time of increased violence. An association between menstrually related mood disorders and other psychiatric illness is also unproven. Despite many theories no definitive cause of PMS has been established, and controlled studies of various treatments have failed to find a universally effective approach. Conservative measures involving support, diet and exercise seem to help in most cases. The use of alprazolam and mefenamic acid may help some women. Rectal or vaginal progesterone therapy has been proven ineffective and should not be used. PMID:2645986

  5. Online Educational Tool to Promote Bone Health in Cancer Survivors.

    PubMed

    des Bordes, Jude K A; Suarez-Almazor, Maria E; Volk, Robert J; Lu, Huifang; Edwards, Beatrice; Lopez-Olivo, Maria A

    2017-10-01

    Osteoporosis burden is significant in cancer survivors. Websites providing health information abound, but their development, quality, and source of information remain unclear. Our aim was to use a systematic and transparent approach to create an educational website on bone health, and to evaluate its potential to improve knowledge, self-management, and awareness in prostate cancer (PCa) and breast cancer (BCa) survivors. Guided by the Health Belief Model, we created a website using international standards and evaluated it in 10 PCa and 10 BCa survivors with a self-administered questionnaire before, immediately after, and 1 month after navigating the website. The mean scores on the knowledge questionnaire at baseline, postintervention, and 1 month were, respectively, 5.1 (±2.0), 6.9 (±2.5), and 6.7 (±2.4), p < .008, in PCa and 3.4 (±2.7), 7.6 (±3.0), and 6.5 (±3.8), p = .016, in BCa survivors. Acceptability ratings ranged from 60% to 100%. Participants found the website useful, helpful, and able to raise bone health awareness. Our website improved bone health knowledge in both PCa and BCa survivors. A systematic and transparent approach to the development of online educational websites could result in a tool capable of meeting the educational needs of targeted consumers. Cancer survivors could benefit from proven online educational tools.

  6. [Summary of the existing knowledge about electronic cigarettes].

    PubMed

    Cselkó, Zsuzsa; Pénzes, Melinda

    2016-06-19

    The decreasing proportion of smokers due to smoking restrictions has led producers to invent and disseminate electronic cigarettes (e-cigarettes) worldwide as a new form of nicotine enjoyment. This review summarizes the existing knowledge about e-cigarettes based on publications in PubMed, and on reviews and research data published by national and international scientific institutions. Present knowledge about the composition of e-cigarettes confirms that they are harmful products, since their vapor is detrimental to the health of users and bystanders alike. Their benefit in smoking cessation has still not been justified by adequate scientific evidence; however, it has been proven that e-cigarettes sustain nicotine addiction and may increase the risk of starting conventional cigarette use by youth. In order to protect the results of tobacco control policy and to assist smoking cessation, the same regulations should be applied to e-cigarettes as to conventional tobacco products.

  7. Combining Open-domain and Biomedical Knowledge for Topic Recognition in Consumer Health Questions.

    PubMed

    Mrabet, Yassine; Kilicoglu, Halil; Roberts, Kirk; Demner-Fushman, Dina

    2016-01-01

    Determining the main topics in consumer health questions is a crucial step in their processing as it allows narrowing the search space to a specific semantic context. In this paper we propose a topic recognition approach based on biomedical and open-domain knowledge bases. In the first step of our method, we recognize named entities in consumer health questions using an unsupervised method that relies on a biomedical knowledge base, UMLS, and an open-domain knowledge base, DBpedia. In the next step, we cast topic recognition as a binary classification problem of deciding whether a named entity is the question topic or not. We evaluated our approach on a dataset from the National Library of Medicine (NLM), introduced in this paper, and another from the Genetic and Rare Disease Information Center (GARD). The combination of knowledge bases outperformed the results obtained by individual knowledge bases by up to 16.5% F1 and achieved state-of-the-art performance. Our results demonstrate that combining open-domain knowledge bases with biomedical knowledge bases can lead to a substantial improvement in understanding user-generated health content.
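    The second step above, casting topic recognition as a binary decision over recognized entities, can be sketched as follows. The features, weights, and entity records here are illustrative stand-ins for a trained classifier, not the authors' system.

```python
# Hypothetical sketch: each recognized entity is scored with simple features
# (presence in a biomedical KB such as UMLS, presence in an open-domain KB
# such as DBpedia, position in the question) and the top-scoring entity is
# returned as the question topic.  Weights below are assumptions.
def classify_topic(entities, question):
    """Pick the entity most likely to be the question's topic."""
    def score(entity):
        pos = question.lower().find(entity["text"].lower())
        s = 0.0
        if entity["in_biomedical_kb"]:
            s += 1.0                              # biomedical KB match
        if entity["in_open_domain_kb"]:
            s += 0.5                              # open-domain KB match
        s += 1.0 - pos / max(len(question), 1)    # earlier mentions score higher
        return s
    return max(entities, key=score)["text"]

q = "What are the symptoms of Marfan syndrome in children?"
entities = [
    {"text": "Marfan syndrome", "in_biomedical_kb": True,  "in_open_domain_kb": True},
    {"text": "children",        "in_biomedical_kb": False, "in_open_domain_kb": True},
]
print(classify_topic(entities, q))
```

    The paper's reported gain (up to 16.5% F1 from combining knowledge bases) corresponds to the intuition in this sketch: an entity grounded in both a biomedical and an open-domain KB is stronger evidence of the question topic than either source alone.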

  8. Automated knowledge base development from CAD/CAE databases

    NASA Technical Reports Server (NTRS)

    Wright, R. Glenn; Blanchard, Mary

    1988-01-01

    Knowledge base development requires a substantial investment in time, money, and resources in order to capture the knowledge and information necessary for anything other than trivial applications. This paper addresses a means to integrate the design and knowledge base development process through automated knowledge base development from CAD/CAE databases and files. Benefits of this approach include a more efficient means of knowledge engineering, resulting in the timely creation of large knowledge-based systems that are inherently free of error.

  9. Preventing childhood obesity in Asia: an overview of intervention programmes.

    PubMed

    Uijtdewilligen, L; Waters, C N; Müller-Riemenschneider, F; Lim, Y W

    2016-11-01

    The rapid economic growth in Asia in the past few decades has contributed to the global increase in childhood obesity prevalence. Yet, little is known about obesity prevention efforts in this region. This systematic review provides an overview of child obesity prevention programmes in Asia. Searches were performed in six electronic databases. Out of 4,234 studies, 17 were included, among them 11 controlled trials (of which five were randomized). Only one study was published before 2007. Identified studies were predominantly conducted in China and Thailand and targeted primary school children in a school setting. Most studies implemented different programmes, frequently targeting behavioural modification through nutrition/health education lectures and/or physical activity sessions. Programme effects related to obesity outcome measures were mixed. The most substantial effects were found for outcomes such as improved health knowledge and/or favourable lifestyle practices. The relatively small number of relevant publications in Asia highlights the need for scientific evaluations of existing and future programmes. This will help ensure the implementation and dissemination of evidence-based approaches that have been proven to be effective in the Asian context. Targeting preschool settings and applying a comprehensive multisectoral approach may increase the effectiveness and sustainability of childhood obesity prevention programmes. © 2016 World Obesity.

  10. A brief review of clinical trials involving manipulation of invariant NKT cells as a promising approach in future cancer therapies

    PubMed Central

    Bojarska-Junak, Agnieszka; Roliński, Jacek

    2017-01-01

    In recent years, researchers have put a lot of emphasis on possible immunotherapeutic strategies able to target tumors. Many studies have proven that the key role in the recognition and eradication of cancer cells, in both mice and humans, is played by invariant natural killer T (iNKT) cells. This small subpopulation of lymphocytes can kill other cells, either directly or indirectly, through activation of natural killer (NK) cells. They can also swiftly release cytokines, engaging elements of the innate and acquired immune system. With the discovery of α-galactosylceramide (α-GalCer), the first known agonist for iNKT cells, and its subsequent analogs, it became possible to effectively stimulate iNKT cells and thereby keep tumor progression under control. This article refers to the current knowledge concerning iNKT cells and the most important aspects of their antitumor activity. It also highlights clinical trials that aim at increasing the number of iNKT cells both overall and in the tumor microenvironment. The iNKT-based immunotherapeutic approach holds great potential and is likely to become a part of cancer immunotherapy in the future. PMID:28860937

  11. Clay pigment structure characterisation as a guide for provenance determination--a comparison between laboratory powder micro-XRD and synchrotron radiation XRD.

    PubMed

    Švarcová, Silvie; Bezdička, Petr; Hradil, David; Hradilová, Janka; Žižak, Ivo

    2011-01-01

    Application of X-ray diffraction (XRD)-based techniques in the analysis of painted artworks is beneficial not only for the indisputable identification of crystalline constituents in colour layers; it can also bring insight into the crystal structure of the materials, which can be affected by their geological formation, manufacturing procedure, or secondary changes. This knowledge might be helpful for the art-historical evaluation of an artwork as well as for its conservation. Using kaolinite as an example, we show that classifying its crystal structure order from XRD data is useful for estimating its provenance. We found kaolinite in the preparation layer of a Gothic wall painting in a Czech church situated near Karlovy Vary, where there are important kaolin deposits. Comparing reference kaolin materials from eight various Czech deposits, we found that these can be differentiated solely according to kaolinite crystallinity. Within this study, we compared laboratory powder X-ray micro-diffraction (micro-XRD) with synchrotron radiation X-ray diffraction by analysing the same real sample. We found that both techniques led to the same results.

  12. Using ePortfolio-Based Learning Approach to Facilitate Knowledge Sharing and Creation among College Students

    ERIC Educational Resources Information Center

    Chang, Chi-Cheng; Chou, Pao-Nan; Liang, Chaoyan

    2018-01-01

    The purpose of the present study was to examine the effects of the ePortfolio-based learning approach (ePBLA) on knowledge sharing and creation with 92 college students majoring in electrical engineering as the participants. Multivariate analysis of covariance (MANCOVA) with a covariance of pretest on knowledge sharing and creation was conducted…

  13. A multidisciplinary approach to trace Asian dust storms from source to sink

    NASA Astrophysics Data System (ADS)

    Yan, Yan; Sun, Youbin; Ma, Long; Long, Xin

    2015-03-01

    Tracing the sources of dust storms (DS) in the mega-cities of northern China currently suffers from ambiguities among different approaches, including source-sink proxy comparison, air mass back trajectory modeling, and satellite image monitoring. By integrating the advantages of all three methods, we present a multidisciplinary approach to trace the provenance of dust fall in Xi'an during the spring season (March to May) of 2012. We collected daily dust fall to calculate dust flux variation, and detected eight DS events with remarkably high flux values based on meteorological comparison and an extreme detection algorithm. By combining MODIS images and accompanying real-time air mass back trajectories, we classify four of them as natural DS events and the other four as anthropogenic DS events, suggesting the importance of both natural and anthropogenic processes in supplying long-range transported dust. The primary sources of these DS events were constrained to three possible areas: the northern Chinese deserts, the Taklimakan desert, and the Gurbantunggut desert. Proxy comparisons based on the quartz crystallinity index and oxygen isotopes further confirmed the source-to-sink linkage between the natural DS events in Xi'an and the dust emissions from the northern Chinese deserts. The integration of geochemical and meteorological tracing approaches favors a dominant contribution of short-distance transport to modern dust fall on the Chinese Loess Plateau. Our study shows that the multidisciplinary approach could permit better source identification of modern dust and should be applied properly for tracing the provenance fluctuations of geological dust deposits.
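    A simple form of the extreme-detection step can be sketched as a threshold rule on the daily flux series. The paper does not specify its algorithm or thresholds, so the rule, flux values, and cutoff below are illustrative assumptions.

```python
# Illustrative sketch: flag dust-storm candidate days whose dust flux exceeds
# the series mean by more than k standard deviations.
import statistics

daily_flux = [1.2, 0.9, 1.1, 1.0, 6.5, 1.3, 0.8, 7.2, 1.1, 1.0]  # hypothetical g/m^2/day

def detect_extremes(flux, k=1.5):
    """Return indices of days with flux above mean + k * standard deviation."""
    mu = statistics.mean(flux)
    sd = statistics.stdev(flux)
    return [i for i, f in enumerate(flux) if f > mu + k * sd]

print(detect_extremes(daily_flux))
```

    In the study, days flagged this way would then be cross-checked against meteorological records, MODIS imagery, and back trajectories before being classified as natural or anthropogenic DS events.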

  14. Knowledge-Based Runway Assignment for Arrival Aircraft in the Terminal Area

    DOT National Transportation Integrated Search

    1997-01-01

    A knowledge-based system for scheduling arrival traffic in the terminal area, referred to as the Final Approach Spacing Tool (FAST), has been implemented and operationally tested at the Dallas/Fort Worth Terminal Radar Approach Control (TRACON)...

  15. Computational neuroanatomy: ontology-based representation of neural components and connectivity

    PubMed Central

    Rubin, Daniel L; Talos, Ion-Florin; Halle, Michael; Musen, Mark A; Kikinis, Ron

    2009-01-01

    Background A critical challenge in neuroscience is organizing, managing, and accessing the explosion in neuroscientific knowledge, particularly anatomic knowledge. We believe that explicit knowledge-based approaches to making neuroscientific knowledge computationally accessible will be helpful in tackling this challenge and will enable a variety of applications exploiting this knowledge, such as surgical planning. Results We developed ontology-based models of neuroanatomy to enable symbolic lookup, logical inference and mathematical modeling of neural systems. We built a prototype model of the motor system that integrates descriptive anatomic and qualitative functional neuroanatomical knowledge. In addition to modeling normal neuroanatomy, our approach provides an explicit representation of abnormal neural connectivity in disease states, such as common movement disorders. The ontology-based representation encodes both structural and functional aspects of neuroanatomy. The ontology-based models can be evaluated computationally, enabling the development of automated computer reasoning applications. Conclusion Neuroanatomical knowledge can be represented in machine-accessible format using ontologies. Computational neuroanatomical approaches such as those described in this work could become a key tool in translational informatics, leading to decision support applications that inform and guide surgical planning and personalized care for neurological disease in the future. PMID:19208191

  16. Reproducibility and Knowledge Capture Architecture for the NASA Earth Exchange (NEX)

    NASA Astrophysics Data System (ADS)

    Votava, P.; Michaelis, A.; Nemani, R. R.

    2015-12-01

    NASA Earth Exchange (NEX) is a data, supercomputing and knowledge collaboratory that houses NASA satellite, climate and ancillary data, where a focused community can come together to address large-scale challenges in Earth sciences. As NEX has grown into a platform for analysis, experiments and production of data on the order of petabytes in volume, it has become increasingly important to enable users to easily retrace their steps, identify which datasets were produced by which process or chain of processes, and give them the ability to readily reproduce their results. This can be a tedious and difficult task even for a small project, and it is almost impossible on large processing pipelines. For example, the NEX Landsat pipeline is deployed to process hundreds of thousands of Landsat scenes in a non-linear production workflow with many-to-many mappings of files between 40 separate processing stages, in which over 100 million processes get executed. At this scale it is almost impossible to examine the entire provenance of each file, let alone reproduce it. We have developed an initial solution for the NEX system: a transparent knowledge capture and reproducibility architecture that does not require any special code instrumentation or other actions on the user's part. Users can automatically capture their work through a transparent provenance tracking system, and the information can subsequently be queried and/or converted into workflows. The provenance information is streamed to a MongoDB document store, and a subset is converted to RDF format and inserted into our triple-store. The triple-store already contains semantic information about other aspects of the NEX system, and adding provenance enhances the ability to relate workflows and data to users, locations, projects and other NEX concepts that can be queried in a standard way. The provenance system has the ability to track data throughout NEX and across any number of executions, and can recreate and re-execute the entire history to reproduce the results. The information can also be used to automatically create individual workflow components and full workflows that can be visually examined, modified, executed and extended by researchers. This provides a key component for accelerating research through knowledge capture and scientific reproducibility on NEX.
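
    The lineage-tracking idea in this record can be sketched as a toy in-memory triple store. This is an illustrative stand-in only: NEX streams provenance to MongoDB and an RDF triple-store, and every name below (the `ProvenanceStore` class, the file names, the `used`/`wasGeneratedBy` predicates borrowed from a PROV-style vocabulary) is hypothetical.

```python
from collections import defaultdict

class ProvenanceStore:
    """Toy in-memory triple store for file lineage (illustration only)."""

    def __init__(self):
        self.triples = set()                 # (subject, predicate, object)
        self.generated_by = defaultdict(set)

    def record_process(self, process, inputs, outputs):
        """Record one execution as used / wasGeneratedBy triples."""
        for f in inputs:
            self.triples.add((process, "used", f))
        for f in outputs:
            self.triples.add((f, "wasGeneratedBy", process))
            self.generated_by[f].add(process)

    def lineage(self, artifact):
        """Walk wasGeneratedBy/used edges back to the original source files."""
        sources, frontier, seen = set(), {artifact}, set()
        while frontier:
            f = frontier.pop()
            if f in seen:
                continue
            seen.add(f)
            procs = self.generated_by.get(f)
            if not procs:
                sources.add(f)               # nothing generated it: a raw input
                continue
            for p in procs:
                frontier.update(o for (s, pred, o) in self.triples
                                if s == p and pred == "used")
        return sources

store = ProvenanceStore()
store.record_process("calibrate", ["scene_raw.tif"], ["scene_cal.tif"])
store.record_process("mosaic", ["scene_cal.tif", "dem_5m.tif"], ["mosaic.tif"])
print(sorted(store.lineage("mosaic.tif")))   # ['dem_5m.tif', 'scene_raw.tif']
```

    Querying lineage is then just a backward walk over the two edge types, which is the same question a researcher asks when deciding whether a derived file can be trusted or reproduced.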

  17. Hybrid forecasting of chaotic processes: Using machine learning in conjunction with a knowledge-based model

    NASA Astrophysics Data System (ADS)

    Pathak, Jaideep; Wikner, Alexander; Fussell, Rebeckah; Chandra, Sarthak; Hunt, Brian R.; Girvan, Michelle; Ott, Edward

    2018-04-01

    A model-based approach to forecasting chaotic dynamical systems utilizes knowledge of the mechanistic processes governing the dynamics to build an approximate mathematical model of the system. In contrast, machine learning techniques have demonstrated promising results for forecasting chaotic systems purely from past time series measurements of system state variables (training data), without prior knowledge of the system dynamics. The motivation for this paper is the potential of machine learning for filling in the gaps in our underlying mechanistic knowledge that cause widely used knowledge-based models to be inaccurate. Thus, we here propose a general method that leverages the advantages of these two approaches by combining a knowledge-based model and a machine learning technique to build a hybrid forecasting scheme. Potential applications for such an approach are numerous (e.g., improving weather forecasting). We demonstrate and test the utility of this approach using a particular illustrative version of a machine learning technique known as reservoir computing, and we apply the resulting hybrid forecaster to a low-dimensional chaotic system, as well as to a high-dimensional spatiotemporal chaotic system. These tests yield extremely promising results in that our hybrid technique is able to accurately predict for a much longer period of time than either its machine-learning component or its model-based component alone.
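
    The hybrid scheme described in this record can be sketched in miniature: an imperfect knowledge-based model supplies one feature, a small echo-state reservoir supplies the rest, and a ridge-regressed readout combines them. This is a didactic toy under assumed settings (a logistic map as the "true" system, an arbitrarily sized reservoir), not the authors' actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# "True" system: logistic map x' = 4 x (1 - x).  The knowledge-based
# model uses an imperfect parameter (3.8), standing in for missing physics.
def f_true(x):  return 4.0 * x * (1.0 - x)
def f_model(x): return 3.8 * x * (1.0 - x)

# Tiny echo-state reservoir (sizes are arbitrary assumptions).
N = 200
W_in = rng.uniform(-0.5, 0.5, N)
W = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # scale spectral radius to 0.9

def features(x_series):
    """Drive the reservoir; return rows [model prediction, reservoir state]."""
    r = np.zeros(N)
    rows = []
    for x in x_series:
        r = np.tanh(W @ r + W_in * x)
        rows.append(np.concatenate(([f_model(x)], r)))
    return np.array(rows)

# Training series generated by the true system.
xs = [0.4]
for _ in range(1500):
    xs.append(f_true(xs[-1]))
xs = np.array(xs)

X, y = features(xs[:-1]), xs[1:]           # predict the next state
beta = 1e-6                                # ridge regularization
W_out = np.linalg.solve(X.T @ X + beta * np.eye(X.shape[1]), X.T @ y)

hybrid_rmse = np.sqrt(np.mean((X @ W_out - y) ** 2))
model_rmse = np.sqrt(np.mean((f_model(xs[:-1]) - y) ** 2))
print(f"hybrid RMSE {hybrid_rmse:.4f} vs model-only RMSE {model_rmse:.4f}")
```

    The readout is free to lean on the imperfect model where it is good and on the reservoir where it is not, which is the essential design choice of the hybrid forecaster.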

  18. Object-based landslide mapping on satellite images from different sensors

    NASA Astrophysics Data System (ADS)

    Hölbling, Daniel; Friedl, Barbara; Eisank, Clemens; Blaschke, Thomas

    2015-04-01

    Several studies have proven that object-based image analysis (OBIA) is a suitable approach for landslide mapping using remote sensing data. Mostly, optical satellite images are utilized in combination with digital elevation models (DEMs) for semi-automated mapping. The ability to consider spectral, spatial, morphometric and contextual features in OBIA constitutes a significant advantage over pixel-based methods, especially when analysing non-uniform natural phenomena such as landslides. However, many of the existing knowledge-based OBIA approaches for landslide mapping are rather complex and tailored to specific data sets. These restraints lead to a lack of transferability of OBIA mapping routines. The objective of this study is to develop an object-based approach for landslide mapping that is robust against changing input data with different resolutions, i.e. optical satellite imagery from various sensors. Two study sites in Taiwan were selected for developing and testing the landslide mapping approach. One site is located around the Baolai village in the Huaguoshan catchment in the southern-central part of the island; the other is a sub-area of the Taimali watershed in Taitung County near the south-eastern Pacific coast. Both areas are regularly affected by severe landslides and debris flows. A range of very high resolution (VHR) optical satellite images was used for the object-based mapping of landslides and for testing the transferability across different sensors and resolutions: (I) SPOT-5, (II) Formosat-2, (III) QuickBird, and (IV) WorldView-2. Additionally, a digital elevation model (DEM) with 5 m spatial resolution and its derived products (e.g. slope, plan curvature) were used to support the semi-automated mapping, particularly for differentiating source areas and accumulation areas according to their morphometric characteristics. A focus was placed on the identification of comparatively stable parameters (e.g. relative indices) that could be transferred to the different satellite images. The presence of bare ground was taken as evidence of the occurrence of landslides. For separating vegetated from non-vegetated areas, the Normalized Difference Vegetation Index (NDVI) was primarily used. Each image was divided into two respective parts based on an automatically calculated NDVI threshold value in eCognition (Trimble) software, combining the homogeneity criterion of multiresolution segmentation with histogram-based methods so that heterogeneity is increased to a maximum. Expert knowledge models, which depict the features and thresholds usually used by experts for digital landslide mapping, were considered for refining the classification. The results were compared to the respective results from visual image interpretation (i.e. manually digitized reference polygons for each image) produced by an independent local expert. In this way, spatial overlaps as well as under- and over-estimated areas were identified, and the performance of the approach in relation to each sensor was evaluated. The presented method can complement traditional manual mapping efforts. Moreover, it contributes to current developments for increasing the transferability of semi-automated OBIA approaches and for improving the efficiency of change detection approaches across multi-sensor imagery.
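
    The automatic NDVI split described in this record can be approximated with a histogram-based threshold. The sketch below uses Otsu's method as a simplified analogue of eCognition's automatic threshold calculation (which the abstract does not fully specify), applied to synthetic reflectance values; all numbers are assumptions.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index, guarded against division by zero."""
    return (nir - red) / np.maximum(nir + red, 1e-9)

def otsu_threshold(values, bins=256):
    """Histogram threshold maximizing between-class variance (Otsu's method)."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                               # class-0 probability
    w1 = 1.0 - w0
    m0 = np.cumsum(p * centers) / np.maximum(w0, 1e-12)
    m1 = ((p * centers).sum() - np.cumsum(p * centers)) / np.maximum(w1, 1e-12)
    return centers[np.argmax(w0 * w1 * (m0 - m1) ** 2)]

# Synthetic scene: vegetated pixels (high NIR) vs. bare ground (low NIR).
rng = np.random.default_rng(1)
nir = np.concatenate([rng.normal(0.50, 0.05, 5000),   # vegetation
                      rng.normal(0.15, 0.03, 2000)])  # bare ground
red = np.concatenate([rng.normal(0.10, 0.02, 5000),
                      rng.normal(0.12, 0.02, 2000)])
v = ndvi(nir, red)
t = otsu_threshold(v)
bare = v < t            # non-vegetated pixels: candidate landslide areas
print(f"threshold={t:.2f}, bare-ground fraction={bare.mean():.2f}")
```

    Because the threshold is computed per image from its own NDVI histogram, the same rule carries over between sensors without re-tuning, which is the transferability property the study is after.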

  19. Big data analytics in immunology: a knowledge-based approach.

    PubMed

    Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and to bridge the knowledge gap and the application gap. We report a knowledge-based approach built on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily both on the availability of accurate, up-to-date, and well-organized data and on proper analytics tools. We propose the use of knowledge-based approaches: developing knowledgebases that combine well-annotated data with specialized analytical tools and integrating them into analytical workflows. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. Using KB-builder, we streamlined the normally time-consuming process of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflows.

  20. Niger Delta play types, Nigeria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akinpelu, A.O.

    Exploration databases can be more valuable when sorted by play type. Play-specific databases provide a system for organizing E & P data used in evaluating the range of parameter values for reserve estimation and risk assessment. This is important both in focusing the knowledge base and in orienting research effort. A play in this context is any unique combination of trap, reservoir and source properties with the right dynamics of migration and preservation that results in hydrocarbon accumulation. This definition helps us discriminate the subtle differences found within these accumulation settings. About 20 play types were identified around the Niger Delta oil province in Nigeria. These are grouped into three parts: (1) the proven plays, constituting the bulk of exploration prospects in Nigeria today; (2) the unproven or semi-proven plays, usually with some successes recorded in a few tries but where knowledge is still inadequate; and (3) the unproven or analogous play concepts, untested but geologically sound ideas which may or may not have been tried elsewhere. With classification and subgrouping of these play types into specific databases, the intrinsic attributes and uniqueness of each of them with respect to the four major risk elements and the eight parameters for reserve estimation can be better understood.

  1. Development of a Premium Quality Plasma-derived IVIg (IQYMUNE®) Utilizing the Principles of Quality by Design-A Worked-through Case Study.

    PubMed

    Paolantonacci, Philippe; Appourchaux, Philippe; Claudel, Béatrice; Ollivier, Monique; Dennett, Richard; Siret, Laurent

    2018-01-01

    Polyvalent human normal immunoglobulins for intravenous use (IVIg), indicated for rare and often severe diseases, are complex plasma-derived protein preparations. A quality by design approach was used to develop the Laboratoire Français du Fractionnement et des Biotechnologies new-generation IVIg, targeting a high level of purity to generate an enhanced safety profile while maintaining a high level of efficacy. A modular quality by design approach was implemented, consisting of five consecutive steps covering all stages from product design to the final product control strategy. A well-defined target product profile was translated into 27 product quality attributes that formed the basis of the process design. In parallel, a product risk analysis was conducted and identified 19 critical quality attributes among the product quality attributes. A process risk analysis was then carried out to establish the links between process parameters and critical quality attributes. Twelve critical steps were identified, and for each of these steps a risk mitigation plan was established. Among the different process risk mitigation exercises, five process robustness studies were conducted at qualified small scale with a design of experiments approach. For each process step, critical process parameters were identified and, for each critical process parameter, proven acceptable ranges were established. The quality risk management and risk mitigation outputs, including verification of the proven acceptable ranges, were used to design the process verification exercise at industrial scale. Finally, the control strategy was established using a mix, or hybrid, of the traditional approach plus elements of the quality by design enhanced approach, as illustrated, to more robustly assign material and process controls and to securely meet product specifications. The advantages of this quality by design approach were improved process knowledge for industrial design and process validation, and a clear justification of the process and product specifications as a basis for the control strategy and future comparability exercises. © PDA, Inc. 2018.

  2. Nurses' perceptions of evidence-based practice: a quantitative study at a teaching hospital in Iran.

    PubMed

    Shafiei, Ebrahim; Baratimarnani, Ahmad; Goharinezhad, Salime; Kalhor, Rohollah; Azmal, Mohammad

    2014-01-01

    Evidence-based practice (EBP) provides nurses with a method for using critically appraised and scientifically proven evidence to deliver quality health care and make the best decisions, leading to quality outcomes. The purpose of this study was to measure the practice, attitude and knowledge/skill of evidence-based practice among nurses in a teaching hospital in Iran. This cross-sectional study was conducted in 2011. The study sample was composed of 195 nurses who were working at the Fatemeh Zahra Hospital affiliated with Bushehr University of Medical Sciences (BPUMS). The survey instrument was a questionnaire based on Upton and Upton's study. This tool measures nurses' perceptions in the three subscales of practice, attitude and knowledge/skill of evidence-based practice. Descriptive statistical analysis was used to analyze the data, and Pearson correlation coefficients were used to examine the relationships between subscales. The overall mean score of evidence-based practice in this study was 4.48±1.26 out of 7, and the scores on the three subscales of practice, attitude and knowledge/skill were 4.58±1.24, 4.57±1.35 and 4.39±1.20, respectively. There was a strong relationship between the knowledge and performance subscales (r=0.73, p<0.01). The findings indicate that more training and education are required for evidence-based nursing. Successful implementation of evidence-based nursing depends on organizational plans and empowerment programs in hospitals; hence, hospital managers should formulate a comprehensive strategy for improving EBP.

  3. Biorefineries of carbon dioxide: From carbon capture and storage (CCS) to bioenergies production.

    PubMed

    Cheah, Wai Yan; Ling, Tau Chuan; Juan, Joon Ching; Lee, Duu-Jong; Chang, Jo-Shu; Show, Pau Loke

    2016-09-01

    Greenhouse gas emissions have several adverse environmental effects, such as pollution and climate change. Currently applied carbon capture and storage (CCS) methods are not cost-effective and have not been proven safe for long-term sequestration. Another attractive approach is CO2 valorization, whereby CO2 is captured in the form of biomass via photosynthesis and subsequently converted into various forms of bioenergy. This article summarizes current carbon sequestration and utilization technologies, while emphasizing the value of the bioconversion of CO2. In particular, CO2 sequestration by terrestrial plants, microalgae and other microorganisms is discussed, and prospects and challenges for CO2 conversion are addressed. The aim of this review is to provide comprehensive knowledge and updated information on current advances in biological CO2 sequestration and valorization, which are essential if this approach is to achieve environmental sustainability and economic feasibility. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. A chemometric approach to the characterisation of historical mortars

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rampazzi, L.; Pozzi, A.; Sansonetti, A.

    2006-06-15

    Knowledge of the composition of historical mortars is of great value in provenance and dating investigations and in conservation work, since the nature of the raw materials suggests the most compatible conservation products. Classic characterisation usually requires several analytical determinations, while conservation laboratories call for simple and quick analyses able to reveal the nature of mortars, usually in terms of the binder fraction. A chemometric approach to the matter is undertaken here. Specimens of mortars were prepared with calcitic and dolomitic binders and analysed by atomic spectroscopy. Principal Components Analysis (PCA) was used to investigate the features of the specimens and samples. A Partial Least Squares (PLS1) regression was performed to predict the binder/aggregate ratio. The model was applied to historical mortars from the churches of St. Lorenzo (Milan) and St. Abbondio (Como). The agreement between the predictive model and the real samples is discussed.
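
    The PLS1 calibration described in this record can be illustrated with a minimal NIPALS implementation on synthetic "elemental composition" data. The element roles, noise levels and sample sizes below are hypothetical assumptions for the sketch, not the authors' data or model.

```python
import numpy as np

def pls1_fit(X, y, n_comp=2):
    """Minimal PLS1 (NIPALS); returns regression coefficients on centered data."""
    X = X - X.mean(axis=0)
    y = y - y.mean()
    Xr, yr = X.copy(), y.copy()
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xr.T @ yr                 # weight vector for this component
        w /= np.linalg.norm(w)
        t = Xr @ w                    # scores
        tt = t @ t
        p = Xr.T @ t / tt             # X loadings
        q = (yr @ t) / tt             # y loading
        Xr = Xr - np.outer(t, p)      # deflate
        yr = yr - t * q
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    return W @ np.linalg.solve(P.T @ W, Q)   # B = W (P^T W)^-1 q

# Synthetic mortar data: 6 measured element concentrations, two of which
# track the binder/aggregate ratio (roles are invented for illustration).
rng = np.random.default_rng(2)
ratio = rng.uniform(0.2, 0.8, 40)            # binder/aggregate ratio
X = rng.normal(0.0, 0.1, (40, 6))
X[:, 0] += 2.0 * ratio                       # e.g. Ca tracks a calcitic binder
X[:, 1] += 1.0 * ratio                       # e.g. Mg tracks a dolomitic binder

coef = pls1_fit(X, ratio, n_comp=2)
pred = (X - X.mean(axis=0)) @ coef + ratio.mean()
rmse = np.sqrt(np.mean((pred - ratio) ** 2))
print(f"calibration RMSE: {rmse:.3f}")
```

    PLS is preferred over ordinary regression here because the predictors (element concentrations) are collinear and few latent factors drive them, exactly the situation in chemometric calibration.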

  5. Promising molecular targets and biomarkers for male BPH and LUTS.

    PubMed

    Gharaee-Kermani, Mehrnaz; Macoska, Jill A

    2013-12-01

    Benign prostatic hyperplasia (BPH) is a major health concern for aging men. BPH is associated with urinary voiding dysfunction and lower urinary tract symptoms (LUTS), which negatively affects quality of life. Surgical resection and medical approaches have proven effective for improving urinary flow and relieving LUTS but are not effective for all men and can produce adverse effects that require termination of the therapeutic regimen. Thus, there is a need to explore other therapeutic targets to treat BPH/LUTS. Complicating the treatment of BPH/LUTS is the lack of biomarkers to effectively identify pathobiologies contributing to BPH/LUTS or to gauge successful response to therapy. This review will briefly discuss current knowledge and will highlight new studies that illuminate the pathobiologies contributing to BPH/LUTS, potential new therapeutic strategies for successfully treating BPH/LUTS, and new approaches for better defining these pathobiologies and response to therapeutics through the development of biomarkers and phenotyping strategies.

  6. Development of a culturally appropriate computer-delivered tailored Internet-based health literacy intervention for Spanish-dominant Hispanics living with HIV.

    PubMed

    Jacobs, Robin J; Caballero, Joshua; Ownby, Raymond L; Kane, Michael N

    2014-11-30

    Low health literacy is associated with poor medication adherence in persons with human immunodeficiency virus (HIV), which can lead to poor health outcomes. As linguistic minorities, Spanish-dominant Hispanics (SDH) face challenges such as difficulties in obtaining and understanding accurate information about HIV and its treatment. Traditional health education methods (e.g., pamphlets, talking) may not be as effective as delivery through alternate venues. Technology-based health information interventions have the potential to be readily available on desktop computers or over the Internet. The purpose of this research was to adapt a theoretically based computer application (initially developed for English-speaking HIV-positive persons) to provide linguistically and culturally appropriate tailored health education to Spanish-dominant Hispanics with HIV (HIV + SDH). A mixed methods approach using quantitative and qualitative interviews with 25 HIV + SDH and 5 key informants, guided by the Information-Motivation-Behavioral (IMB) Skills model, was used to investigate cultural factors influencing medication adherence in HIV + SDH. We used a triangulation approach to identify major themes within cultural contexts relevant to understanding factors related to motivation to adhere to treatment. The culture-specific motivational factors for treatment adherence in HIV + SDH persons that emerged from the data were stigma, familismo (family), mood, and social support. Using these data, we developed a culturally and linguistically adapted, tailored intervention that provides information about HIV infection, treatment, and medication-related problem-solving skills (proven effective in English-speaking populations) that can be delivered using touch-screen computers, tablets, and smartphones, to be tested in a future study. 
Using a theoretically-grounded Internet-based eHealth education intervention that builds on knowledge and also targets core cultural determinants of adherence may prove a highly effective approach to improve health literacy and medication decision-making in this group.

  7. Role of E-Learning in Capacity Building: An Alumni View

    ERIC Educational Resources Information Center

    Zaheer, Muhammad; Jabeen, Sadia; Qadri, Mubasher Majeed

    2015-01-01

    The concept of knowledge sharing has now expanded because of sophisticated communication tools. A common consensus has been generated for spreading knowledge beyond boundaries and making collective efforts for the development of individuals as well as nations. E-learning has proven its authenticity in this regard. In developing countries, access…

  8. Using Funds of Knowledge to Build Trust between a Teacher and Parents of Language-Delayed Preschoolers

    ERIC Educational Resources Information Center

    Gonzalez, Alissa Zoraida

    2014-01-01

    Preschool children with language delays often struggle to learn new concepts. Proven strategies such as modeling, prompting, reinforcing responses, direct teaching, and hands-on experience matter to young children with language delays. Also important are social interactions and shared experiences with more knowledgeable persons. Within a cultural…

  9. Beneficial and adverse effects of testosterone on the cardiovascular system in men.

    PubMed

    Ruige, Johannes B; Ouwens, D Margriet; Kaufman, Jean-Marc

    2013-11-01

    The widespread use of T therapy, particularly in aging males, necessitates knowledge of the relationship between T and the cardiovascular system. This review is based on a 1970 to 2013 PubMed search with terms related to androgens in combination with cardiovascular disease, including T, dihydrotestosterone, trial, mortality, cardiovascular disease, myocardial infarction, blood pressure, endothelial function, dyslipidemia, thrombosis, ventricular function, and arrhythmia. Original articles, systematic reviews and meta-analyses, and relevant citations were screened. Low T has been linked to increased blood pressure, dyslipidemia, atherosclerosis, arrhythmia, thrombosis, endothelial dysfunction, and impaired left ventricular function. On the one hand, a modest association is suggested between low endogenous T and incident cardiovascular disease or cardiovascular mortality, implying unrecognized beneficial T effects, residual confounding, or a relationship with health status. On the other hand, treatments with T to restore "normal concentrations" have so far not been proven beneficial with respect to cardiovascular disease; neither have they definitively shown specific adverse cardiovascular effects. The cardiovascular risk-benefit profile of T therapy remains largely elusive in view of the lack of well-designed and adequately powered randomized clinical trials. This important knowledge gap as to the exact relationship between T and cardiovascular disease supports a cautious, restrained approach to T therapy in aging men, pending clarification of benefits and risks by adequately powered clinical trials of sufficient duration.

  10. Trust from the past: Bayesian Personalized Ranking based Link Prediction in Knowledge Graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Baichuan; Choudhury, Sutanay; Al-Hasan, Mohammad

    2016-02-01

    Estimating the confidence of a link is a critical task in knowledge graph construction. Link prediction, or predicting the likelihood of a link in a knowledge graph based on its prior state, is a key research direction within this area. We propose a latent feature embedding based link recommendation model for the prediction task and utilize a Bayesian Personalized Ranking based optimization technique for learning models for each predicate. Experimental results on large-scale knowledge bases such as YAGO2 show that our approach achieves substantially higher performance than several state-of-the-art approaches. Furthermore, we also study the performance of the link prediction algorithm in terms of topological properties of the knowledge graph and present a linear regression model to reason about its expected level of accuracy.
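
    The core of this record's approach, latent feature embeddings trained with a Bayesian Personalized Ranking (BPR) style pairwise loss, can be sketched on a toy graph. This simplification assumes a single predicate and dot-product scores; the paper's actual model, negative sampling scheme and optimizer are not reproduced.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_bpr(n_entities, positives, dim=16, epochs=200, lr=0.05, reg=1e-4, seed=0):
    """Latent-feature link scoring trained with a BPR-style pairwise loss."""
    rng = np.random.default_rng(seed)
    E = rng.normal(0.0, 0.1, (n_entities, dim))
    for _ in range(epochs):
        for s, o in positives:
            o_neg = int(rng.integers(n_entities))   # sampled negative object
            if (s, o_neg) in positives or o_neg == o:
                continue
            x = E[s] @ E[o] - E[s] @ E[o_neg]       # score margin
            g = sigmoid(-x)                         # gradient of -log sigmoid(x)
            E[s] += lr * (g * (E[o] - E[o_neg]) - reg * E[s])
            E[o] += lr * (g * E[s] - reg * E[o])
            E[o_neg] -= lr * (g * E[s] + reg * E[o_neg])
    return E

# Toy graph: links within cluster {0, 1, 2} and within cluster {3, 4, 5}.
positives = {(0, 1), (1, 2), (2, 0), (3, 4), (4, 5), (5, 3)}
E = train_bpr(6, positives)
score = lambda s, o: float(E[s] @ E[o])
print(f"in-cluster score(0,2)={score(0, 2):.2f}, "
      f"cross-cluster score(0,4)={score(0, 4):.2f}")
```

    The pairwise objective only requires that an observed link outrank a sampled non-link, which matches the ranking (rather than classification) framing used for confidence estimation in knowledge graphs.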

  11. Visual analysis of online social media to open up the investigation of stance phenomena

    PubMed Central

    Kucher, Kostiantyn; Schamp-Bjerede, Teri; Kerren, Andreas; Paradis, Carita; Sahlgren, Magnus

    2015-01-01

    Online social media are a perfect text source for stance analysis. Stance in human communication is concerned with speaker attitudes, beliefs, feelings and opinions. Expressions of stance are associated with the speakers' view of what they are talking about and what is up for discussion and negotiation in the intersubjective exchange. Taking stance is thus crucial for the social construction of meaning. Increased knowledge of stance can be useful for many application fields such as business intelligence, security analytics, or social media monitoring. In order to process large amounts of text data for stance analyses, linguists need interactive tools to explore the textual sources as well as the processed data based on computational linguistics techniques. Both original texts and derived data are important for refining the analyses iteratively. In this work, we present a visual analytics tool for online social media text data that can be used to open up the investigation of stance phenomena. Our approach complements traditional linguistic analysis techniques and is based on the analysis of utterances associated with two stance categories: sentiment and certainty. Our contributions include (1) the description of a novel web-based solution for analyzing the use and patterns of stance meanings and expressions in human communication over time; and (2) specialized techniques used for visualizing analysis provenance and corpus overview/navigation. We demonstrate our approach by means of text media on a highly controversial scandal with regard to expressions of anger and provide an expert review from linguists who have been using our tool. PMID:29249903

  12. Visual analysis of online social media to open up the investigation of stance phenomena.

    PubMed

    Kucher, Kostiantyn; Schamp-Bjerede, Teri; Kerren, Andreas; Paradis, Carita; Sahlgren, Magnus

    2016-04-01

    Online social media are a perfect text source for stance analysis. Stance in human communication is concerned with speaker attitudes, beliefs, feelings and opinions. Expressions of stance are associated with the speakers' view of what they are talking about and what is up for discussion and negotiation in the intersubjective exchange. Taking stance is thus crucial for the social construction of meaning. Increased knowledge of stance can be useful for many application fields such as business intelligence, security analytics, or social media monitoring. In order to process large amounts of text data for stance analyses, linguists need interactive tools to explore the textual sources as well as the processed data based on computational linguistics techniques. Both original texts and derived data are important for refining the analyses iteratively. In this work, we present a visual analytics tool for online social media text data that can be used to open up the investigation of stance phenomena. Our approach complements traditional linguistic analysis techniques and is based on the analysis of utterances associated with two stance categories: sentiment and certainty. Our contributions include (1) the description of a novel web-based solution for analyzing the use and patterns of stance meanings and expressions in human communication over time; and (2) specialized techniques used for visualizing analysis provenance and corpus overview/navigation. We demonstrate our approach by means of text media on a highly controversial scandal with regard to expressions of anger and provide an expert review from linguists who have been using our tool.

  13. Relating GTE and Knowledge-Based Courseware Engineering: Some Epistemological Issues.

    ERIC Educational Resources Information Center

    De Diana, Italo P. F.; Ladhani, Al-Noor

    1998-01-01

    Discusses GTE (Generic Tutoring Environment) and knowledge-based courseware engineering from an epistemological point of view and suggests some combination of the two approaches. Topics include intelligent tutoring; courseware authoring; application versus acquisition of knowledge; and domain knowledge. (LRW)

  14. Direct model reference adaptive control of a flexible robotic manipulator

    NASA Technical Reports Server (NTRS)

    Meldrum, D. R.

    1985-01-01

    Quick, precise control of a flexible manipulator in a space environment is essential for future Space Station repair and satellite servicing. Numerous control algorithms have proven successful in controlling rigid manipulators with colocated sensors and actuators; however, few have been tested on a flexible manipulator with noncolocated sensors and actuators. In this thesis, a model reference adaptive control (MRAC) scheme based on command generator tracker theory is designed for a flexible manipulator. Quicker, more precise tracking results are expected over nonadaptive control laws for this MRAC approach. Equations of motion in modal coordinates are derived for a single-link, flexible manipulator with an actuator at the pinned end and a sensor at the free end. An MRAC is designed with the objective of controlling the torquing actuator so that the tip position follows a trajectory that is prescribed by the reference model. An appealing feature of this direct MRAC law is that it allows the reference model to have fewer states than the plant itself. Direct adaptive control also adjusts the controller parameters directly with knowledge of only the plant output and input signals.
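
    The direct-MRAC idea of adjusting controller parameters from plant input/output alone can be illustrated with the classic textbook MIT-rule example of adapting a feedforward gain for a first-order plant; this is a generic sketch, not the thesis's command-generator-tracker design for the flexible link, and all parameter values below are illustrative.

```python
# MIT-rule adaptive feedforward gain: plant y' = -y + kp*u with unknown gain kp,
# reference model ym' = -ym + r. Control u = theta*r; theta adapts so y tracks ym.
dt, gamma, kp = 0.01, 0.5, 2.0
y = ym = theta = 0.0
r = 1.0  # constant reference command
for _ in range(int(60 / dt)):
    u = theta * r
    e = y - ym                      # tracking error, from plant output only
    theta -= gamma * e * ym * dt    # MIT-rule gradient update
    y += (-y + kp * u) * dt         # plant (gain kp unknown to the controller)
    ym += (-ym + r) * dt            # reference model

print(round(theta, 2))  # ~0.5, the ideal gain 1/kp
```

    With a small adaptation gain the linearized error dynamics are a stable spiral, so the tracking error and the parameter error both decay to zero.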

  15. Sophisticated Cloning, Fermentation, and Purification Technologies for an Enhanced Therapeutic Protein Production: A Review

    PubMed Central

    Gupta, Sanjeev K.; Shukla, Pratyoosh

    2017-01-01

    Protein production strategies are crucial to the development of application-based research and to elucidating novel purification strategies for industrial production. Currently, several innovative avenues are being studied for cloning, upstream processing, and purification through efficient bioprocess development. Such strategies are beneficial for industry and have proven vital for effective therapeutic protein development. Although these techniques are well documented, there is scope to extend current knowledge with new approaches, which will pave new avenues in the production of recombinant microbial and non-microbial proteins, including secondary metabolites. In this review, we focus on recent developments in clone selection, modern fermentation and purification technologies, and future directions in these emerging areas. Moreover, we highlight notable perspectives and challenges involved in the bioengineering of such proteins, including quality by design, gene editing and other pioneering ideas. The biopharmaceutical industry continues to shift towards more flexible, automated platforms and economical product development, which in turn can help in developing cost-effective processes and affordable drugs for a large community. PMID:28725194

  16. Web-Based Personalised System of Instruction: An Effective Approach for Diverse Cohorts with Virtual Learning Environments?

    ERIC Educational Resources Information Center

    Rae, Andrew; Samuels, Peter

    2011-01-01

    The Personalised System of Instruction is a form of mastery learning which, though it has been proven to be educationally effective, has never seriously challenged the dominant lecture-tutorial teaching method in higher education and has largely fallen into disuse. An information and communications technology assisted version of the Personalised…

  17. TEACH: An Ethogram-Based Method to Observe and Record Teaching Behavior

    ERIC Educational Resources Information Center

    Kline, Michelle Ann

    2017-01-01

    Teaching has attracted growing research attention in studies of human and animal behavior as a crucial behavior that coevolved with human cultural capacities. However, the synthesis of data on teaching across species and across human populations has proven elusive because researchers use a variety of definitions and methods to approach the topic.…

  18. Multi-criteria comparative evaluation of spallation reaction models

    NASA Astrophysics Data System (ADS)

    Andrianov, Andrey; Andrianova, Olga; Konobeev, Alexandr; Korovin, Yury; Kuptsov, Ilya

    2017-09-01

    This paper presents an approach to a comparative evaluation of the predictive ability of spallation reaction models based on widely used, well-proven multiple-criteria decision analysis methods (MAVT/MAUT, AHP, TOPSIS, PROMETHEE), together with the results of such a comparison for 17 spallation reaction models for the interaction of high-energy protons with natPb.
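
    TOPSIS, one of the MCDA methods named above, ranks alternatives by their relative closeness to an ideal solution; the sketch below implements the standard method on hypothetical model scores (the study's actual criteria and 17 models are not reproduced here).

```python
import math

def topsis(matrix, weights, benefit):
    """Score alternatives (rows) on criteria (cols); benefit[j]=True if larger is better."""
    m = len(weights)
    # Vector-normalize each criterion column, then apply criterion weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(m)]
    v = [[weights[j] * row[j] / norms[j] for j in range(m)] for row in matrix]
    # Ideal and anti-ideal solutions per criterion.
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    def dist(row, ref):
        return math.sqrt(sum((row[j] - ref[j]) ** 2 for j in range(m)))
    # Closeness coefficient in [0, 1]; larger means closer to the ideal.
    return [dist(r, anti) / (dist(r, anti) + dist(r, ideal)) for r in v]

# Hypothetical scores of three models on two benefit criteria, equally weighted.
scores = topsis([[0.9, 0.8], [0.5, 0.6], [0.2, 0.3]], [0.5, 0.5], [True, True])
# The first model dominates on both criteria, so it receives the highest score.
```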

  19. Proof of Economic Viability of Blended Learning Business Models

    ERIC Educational Resources Information Center

    Druhmann, Carsten; Hohenberg, Gregor

    2014-01-01

    The discussion on economically sustainable business models with respect to information technology is lacking in many aspects of proven approaches. In the following contribution the economic viability is valued based on a procedural model for design and evaluation of e-learning business models in the form of a case study. As a case study object a…

  20. Treating ALK-positive non-small cell lung cancer

    PubMed Central

    Tsiara, Anna; Tsironis, Georgios; Lykka, Maria; Liontos, Michalis; Bamias, Aristotelis; Dimopoulos, Meletios-Athanasios

    2018-01-01

    Targeting genomic alterations, such as epidermal growth factor receptor (EGFR) mutations and anaplastic lymphoma kinase (ALK) gene rearrangements, has radically changed the treatment of patients with non-small cell lung cancer (NSCLC). In the case of the ALK-rearranged gene, the subsequent rapid development of effective genotype-directed therapies with ALK tyrosine kinase inhibitors (TKIs) triggered major advances in the personalized, molecularly based approach to NSCLC. Crizotinib was the first-in-class ALK TKI with proven superiority over standard platinum-based chemotherapy for the 1st-line therapy of ALK-rearranged NSCLC patients. However, the acquired resistance to crizotinib and its diminished efficacy against central nervous system (CNS) relapse led to the development of several novel ALK inhibitors, more potent and with different selectivity compared to crizotinib. To date, four ALK TKIs, crizotinib, ceritinib, alectinib and brigatinib, have received approval from the Food and Drug Administration (FDA) and/or the European Medicines Agency (EMA), and even more agents are currently under investigation for the treatment of ALK-rearranged NSCLC. However, the optimal frontline approach and the exact sequence of ALK inhibitors are still under consideration. Recently announced results of phase III trials recognized higher efficacy of alectinib compared to crizotinib in the first-line setting, even in patients with CNS involvement. In this review, we discuss the current knowledge regarding the biology of ALK-positive NSCLC and the available therapeutic inhibitors, and we focus on the issues raised by their use in clinical practice. PMID:29862230

  1. Network-based study of Lagrangian transport and mixing

    NASA Astrophysics Data System (ADS)

    Padberg-Gehle, Kathrin; Schneide, Christiane

    2017-10-01

    Transport and mixing processes in fluid flows are crucially influenced by coherent structures and the characterization of these Lagrangian objects is a topic of intense current research. While established mathematical approaches such as variational methods or transfer-operator-based schemes require full knowledge of the flow field or at least high-resolution trajectory data, this information may not be available in applications. Recently, different computational methods have been proposed to identify coherent behavior in flows directly from Lagrangian trajectory data, that is, numerical or measured time series of particle positions in a fluid flow. In this context, spatio-temporal clustering algorithms have been proven to be very effective for the extraction of coherent sets from sparse and possibly incomplete trajectory data. Inspired by these recent approaches, we consider an unweighted, undirected network, where Lagrangian particle trajectories serve as network nodes. A link is established between two nodes if the respective trajectories come close to each other at least once in the course of time. Classical graph concepts are then employed to analyze the resulting network. In particular, local network measures such as the node degree, the average degree of neighboring nodes, and the clustering coefficient serve as indicators of highly mixing regions, whereas spectral graph partitioning schemes allow us to extract coherent sets. The proposed methodology is very fast to run and we demonstrate its applicability in two geophysical flows - the Bickley jet as well as the Antarctic stratospheric polar vortex.
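
    The network construction described above, with trajectories as nodes and links whenever two trajectories come within a threshold distance, can be sketched directly; the two well-separated bundles below are synthetic stand-ins for coherent sets, not the paper's Bickley-jet or polar-vortex data.

```python
import numpy as np

def trajectory_network(traj, eps):
    """traj: array (n_particles, n_times, dim). Link i, j if they ever come within eps."""
    n = len(traj)
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(traj[i] - traj[j], axis=1).min() < eps:
                A[i, j] = A[j, i] = 1.0
    return A

# Two well-separated bundles of three trajectories each (toy coherent sets).
times = np.linspace(0.0, 1.0, 20)
traj = np.array([[[10.0 * g + 0.05 * k, t] for t in times]
                 for g in (0, 1) for k in range(3)])

A = trajectory_network(traj, eps=1.0)
degree = A.sum(axis=1)           # local measure: node degree as a mixing indicator
L = np.diag(degree) - A          # graph Laplacian for spectral partitioning
lam = np.linalg.eigvalsh(L)      # ascending eigenvalues
# A near-zero second eigenvalue (algebraic connectivity) signals two coherent sets.
```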

  2. Control of Origin of Sesame Oil from Various Countries by Stable Isotope Analysis and DNA Based Markers—A Pilot Study

    PubMed Central

    Horacek, Micha; Hansel-Hohl, Karin; Burg, Kornel; Soja, Gerhard; Okello-Anyanga, Walter; Fluch, Silvia

    2015-01-01

    The indication of origin of sesame seeds and sesame oil is one of the important factors influencing its price, as it is produced in many regions worldwide and certain provenances are especially sought after. We joined stable carbon and hydrogen isotope analysis with DNA based molecular marker analysis to study their combined potential for the discrimination of different origins of sesame seeds. For the stable carbon and hydrogen isotope data a positive correlation between both isotope parameters was observed, indicating a dominant combined influence of climate and water availability. This enabled discrimination between sesame samples from tropical and subtropical/moderate climatic provenances. Carbon isotope values also showed differences between oil from black and white sesame seeds from identical locations, indicating higher water use efficiency of plants producing black seeds. DNA based markers gave independent evidence for geographic variation as well as provided information on the genetic relatedness of the investigated samples. Depending on the differences in ambient environmental conditions and in the genotypic fingerprint, a combination of both analytical methods is a very powerful tool to assess the declared geographic origin. To our knowledge this is the first paper on food authenticity combining the stable isotope analysis of bio-elements with DNA based markers and their combined statistical analysis. PMID:25831054

  3. Control of origin of sesame oil from various countries by stable isotope analysis and DNA based markers--a pilot study.

    PubMed

    Horacek, Micha; Hansel-Hohl, Karin; Burg, Kornel; Soja, Gerhard; Okello-Anyanga, Walter; Fluch, Silvia

    2015-01-01

    The indication of origin of sesame seeds and sesame oil is one of the important factors influencing its price, as it is produced in many regions worldwide and certain provenances are especially sought after. We joined stable carbon and hydrogen isotope analysis with DNA based molecular marker analysis to study their combined potential for the discrimination of different origins of sesame seeds. For the stable carbon and hydrogen isotope data a positive correlation between both isotope parameters was observed, indicating a dominant combined influence of climate and water availability. This enabled discrimination between sesame samples from tropical and subtropical/moderate climatic provenances. Carbon isotope values also showed differences between oil from black and white sesame seeds from identical locations, indicating higher water use efficiency of plants producing black seeds. DNA based markers gave independent evidence for geographic variation as well as provided information on the genetic relatedness of the investigated samples. Depending on the differences in ambient environmental conditions and in the genotypic fingerprint, a combination of both analytical methods is a very powerful tool to assess the declared geographic origin. To our knowledge this is the first paper on food authenticity combining the stable isotope analysis of bio-elements with DNA based markers and their combined statistical analysis.

  4. Full Life Cycle of Data Analysis with Climate Model Diagnostic Analyzer (CMDA)

    NASA Astrophysics Data System (ADS)

    Lee, S.; Zhai, C.; Pan, L.; Tang, B.; Zhang, J.; Bao, Q.; Malarout, N.

    2017-12-01

    We have developed a system that supports the full life cycle of a data analysis process, from data discovery, to data customization, to analysis, to reanalysis, to publication, and to reproduction. The system called Climate Model Diagnostic Analyzer (CMDA) is designed to demonstrate that the full life cycle of data analysis can be supported within one integrated system for climate model diagnostic evaluation with global observational and reanalysis datasets. CMDA has four subsystems that are highly integrated to support the analysis life cycle. Data System manages datasets used by CMDA analysis tools, Analysis System manages CMDA analysis tools which are all web services, Provenance System manages the meta data of CMDA datasets and the provenance of CMDA analysis history, and Recommendation System extracts knowledge from CMDA usage history and recommends datasets/analysis tools to users. These four subsystems are not only highly integrated but also easily expandable. New datasets can be easily added to Data System and scanned to be visible to the other subsystems. New analysis tools can be easily registered to be available in the Analysis System and Provenance System. With CMDA, a user can start a data analysis process by discovering datasets of relevance to their research topic using the Recommendation System. Next, the user can customize the discovered datasets for their scientific use (e.g. anomaly calculation, regridding, etc) with tools in the Analysis System. Next, the user can do their analysis with the tools (e.g. conditional sampling, time averaging, spatial averaging) in the Analysis System. Next, the user can reanalyze the datasets based on the previously stored analysis provenance in the Provenance System. Further, they can publish their analysis process and result to the Provenance System to share with other users. Finally, any user can reproduce the published analysis process and results. 
    By supporting the full life cycle of climate data analysis, CMDA improves the research productivity and collaboration level of its users.

  5. How the provenance of electronic health record data matters for research: a case example using system mapping.

    PubMed

    Johnson, Karin E; Kamineni, Aruna; Fuller, Sharon; Olmstead, Danielle; Wernli, Karen J

    2014-01-01

    The use of electronic health records (EHRs) for research is proceeding rapidly, driven by computational power, analytical techniques, and policy. However, EHR-based research is limited by the complexity of EHR data and a lack of understanding about data provenance, meaning the context under which the data were collected. This paper presents system flow mapping as a method to help researchers more fully understand the provenance of their EHR data as it relates to local workflow. We provide two specific examples of how this method can improve data identification, documentation, and processing. EHRs store clinical and administrative data, often in unstructured fields. Each clinical system has a unique and dynamic workflow, as well as an EHR customized for local use. The EHR customization may be influenced by a broader context such as documentation required for billing. We present a case study with two examples of using system flow mapping to characterize EHR data for a local colorectal cancer screening process. System flow mapping demonstrated that information entered into the EHR during clinical practice required interpretation and transformation before it could be accurately applied to research. We illustrate how system flow mapping shaped our knowledge of the quality and completeness of data in two examples: (1) determining colonoscopy indication as recorded in the EHR, and (2) discovering a specific EHR form that captured family history. Researchers who do not consider data provenance risk compiling data that are systematically incomplete or incorrect. For example, researchers who are not familiar with the clinical workflow under which data were entered might miss or misunderstand patient information or procedure and diagnostic codes. Data provenance is a fundamental characteristic of research data from EHRs. 
Given the diversity of EHR platforms and system workflows, researchers need tools for evaluating and reporting data availability, quality, and transformations. Our case study illustrates how system mapping can inform researchers about the provenance of their data as it pertains to local workflows.

  6. Integration of evidence-based knowledge management in microsystems: a tele-ICU experience.

    PubMed

    Rincon, Teresa A

    2012-01-01

    The Institute of Medicine's proposed 6 aims to improve health care are timely, safe, effective, efficient, equitable, and patient-centered care. Unfortunately, it also asserts that improvements in these 6 dimensions cannot be achieved within the existing framework of care systems. These systems are based on unrealistic expectations of human cognition and vigilance, and demonstrate a lack of dependence on computerized systems to support care processes and put information at the point of use. Knowledge-based care and evidence-based clinical decision-making need to replace the unscientific care that is being delivered in health care. Building care practices on evidence within an information technology platform is needed to support sound clinical decision-making and to influence organizational adoption of evidence-based practice in health care. Despite medical advances and evidence-based recommendations for treatment of severe sepsis, it remains a significant cause of mortality and morbidity in the world. It is a complex disease state that has proven difficult to define, diagnose, and treat. Supporting bedside teams with real-time knowledge and expertise to target early identification of severe sepsis and compliance with the Surviving Sepsis Campaign's evidence-based practice bundles is important to improving outcomes. The use of a centralized, remote team of expert nurses and an open-source software application to advance clinical decision-making and execution of the severe sepsis bundle will be examined.

  7. Chances of short-term cooling trends over Canada for the next decades

    NASA Astrophysics Data System (ADS)

    Grenier, Patrick; de Elia, Ramon; Chaumont, Diane

    2014-05-01

    As climate services continue to develop in Quebec, Canada, an increasing number of requests are made for information relevant to the near term. In response, one approach has been to consider short-term cooling trends as a basis for climate products. This project comprises different aspects: technical steps, knowledge transfer, and societal use. Each step represents a different challenge. The technical part, i.e. producing probabilistic distributions of short-term temperature trends, involves relatively complex scenario construction methods including bias-related post-processing, and access to wide simulation and observation databases. Calculations are performed on 60 CMIP5-based scenarios on a grid covering Canada during the period 2006-2035, and for 5, 10, 15, 20 and 25-year trend durations. Knowledge transfer implies overcoming misinterpretation, given that probabilistic projections based on simulation ensembles are not perfectly related to real-Earth possible outcomes. Finally, societal use of this information remains the biggest challenge. On the one hand, users clearly state their interest in near-term relevant information, and intuitively it seems clear that short-term cooling trends embedded within the long-term warming path should be considered in adaptation plans, to avoid over-adaptation. On the other hand, the exact way of incorporating such information within a decision-making process has proven not to be obvious. Irrespective of that, the study and communication of short-term cooling chances is necessary to prevent decision-makers from inferring, should such a trend occur, that global warming is not happening. The presentation will discuss the three aforementioned aspects.
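
    The core technical step, estimating how often short trend windows show cooling despite a long-term warming signal, can be illustrated on a single synthetic temperature series with internal variability; this toy stand-in is not the project's 60 CMIP5-based scenarios, and all numbers are illustrative.

```python
import math

def ols_slope(y):
    """Least-squares slope of y against time indices 0..n-1."""
    n = len(y)
    t_mean = (n - 1) / 2
    y_mean = sum(y) / n
    num = sum((t - t_mean) * (v - y_mean) for t, v in enumerate(y))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

def frac_cooling(series, window):
    """Fraction of sliding windows whose linear trend is negative."""
    slopes = [ols_slope(series[s:s + window])
              for s in range(len(series) - window + 1)]
    return sum(sl < 0 for sl in slopes) / len(slopes)

# Synthetic annual temperatures: 0.02 K/yr warming plus sinusoidal internal variability.
temps = [0.02 * t + 0.3 * math.sin(2 * math.pi * t / 8) for t in range(30)]

# Short windows often show cooling even though every 25-year trend is warming.
print(frac_cooling(temps, 5), frac_cooling(temps, 25))
```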

  8. Sustaining Knowledge Building as a Principle-Based Innovation at an Elementary School

    ERIC Educational Resources Information Center

    Zhang, Jianwei; Hong, Huang-Yao; Scardamalia, Marlene; Teo, Chew Lee; Morley, Elizabeth A.

    2011-01-01

    This study explores Knowledge Building as a principle-based innovation at an elementary school and makes a case for a principle- versus procedure-based approach to educational innovation, supported by new knowledge media. Thirty-nine Knowledge Building initiatives, each focused on a curriculum theme and facilitated by nine teachers over eight…

  9. Knowledge Translation: The Bridging Function of Cochrane Rehabilitation.

    PubMed

    Negrini, Stefano; Gimigliano, Francesca; Arienti, Chiara; Kiekens, Carlotte

    2018-06-01

    Cochrane Rehabilitation aims to ensure that all rehabilitation professionals can apply Evidence Based Clinical Practice and take decisions according to the best and most appropriate evidence in this specific field, combining the best available evidence as gathered by high-quality Cochrane systematic reviews with their own clinical expertise and the values of patients. This mission can be pursued through knowledge translation. The aim of this article is to briefly present what knowledge translation is, how and why Cochrane (previously known as the Cochrane Collaboration) is trying to reorganize itself in light of knowledge translation, and the relevance that this process has for Cochrane Rehabilitation and, in the end, for the whole world of rehabilitation. It is well known how difficult it is to apply in everyday practice what we intend to do, and to apply scientific knowledge in the clinical field: this is called the know-do gap. In the field of evidence-based medicine, where Cochrane belongs, it has been proven that high-quality evidence is not consistently applied in practice. A solution to these problems is so-called knowledge translation. In this context, Cochrane Rehabilitation is organized to provide the best possible knowledge translation in both directions (a bridging function): toward the world of rehabilitation (spreading reviews), but also toward the Cochrane community (production of reviews significant for rehabilitation). Cochrane is now strongly pushing to improve its knowledge translation activities, and this creates a strong base for Cochrane Rehabilitation's work, focused not only on spreading the evidence but also on improving its production to make it more meaningful for the world of rehabilitation. Copyright © 2017 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  10. A knowledge-based potential with an accurate description of local interactions improves discrimination between native and near-native protein conformations.

    PubMed

    Ferrada, Evandro; Vergara, Ismael A; Melo, Francisco

    2007-01-01

    The correct discrimination between native and near-native protein conformations is essential for achieving accurate computer-based protein structure prediction. However, this has proven to be a difficult task, since currently available physical energy functions, empirical potentials and statistical scoring functions are still limited in achieving this goal consistently. In this work, we assess and compare the ability of different full atom knowledge-based potentials to discriminate between native protein structures and near-native protein conformations generated by comparative modeling. Using a benchmark of 152 near-native protein models and their corresponding native structures that encompass several different folds, we demonstrate that the incorporation of close non-bonded pairwise atom terms improves the discriminating power of the empirical potentials. Since the direct and unbiased derivation of close non-bonded terms from current experimental data is not possible, we obtained and used those terms from the corresponding pseudo-energy functions of a non-local knowledge-based potential. It is shown that this methodology significantly improves the discrimination between native and near-native protein conformations, suggesting that a proper description of close non-bonded terms is important to achieve a more complete and accurate description of native protein conformations. Some external knowledge-based energy functions that are widely used in model assessment performed poorly, indicating that the benchmark of models and the specific discrimination task tested in this work constitutes a difficult challenge.
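
    The statistical idea behind such knowledge-based potentials, scoring conformations by comparing observed atom-pair distance distributions against a reference state, is commonly formalized as an inverse-Boltzmann potential. The sketch below uses synthetic distance data and a uniform reference; it is a generic illustration, not the authors' full-atom potential or their close non-bonded terms.

```python
import numpy as np

def inverse_boltzmann(obs, ref, bins, pseudo=1.0):
    """Pairwise potential (in kT units): -ln(p_observed / p_reference) per distance bin."""
    h_obs, edges = np.histogram(obs, bins=bins)
    h_ref, _ = np.histogram(ref, bins=edges)
    # Pseudo-counts avoid log(0) in sparsely populated bins.
    p_obs = (h_obs + pseudo) / (h_obs.sum() + pseudo * len(h_obs))
    p_ref = (h_ref + pseudo) / (h_ref.sum() + pseudo * len(h_ref))
    return edges, -np.log(p_obs / p_ref)

# Synthetic data: native-like structures strongly favor contacts near 4.5 A,
# while the reference state is uniform over 2-10 A.
obs = np.concatenate([np.full(80, 4.5), np.linspace(2.1, 9.9, 20)])
ref = np.linspace(2.1, 9.9, 200)
edges, pot = inverse_boltzmann(obs, ref, bins=np.linspace(2, 10, 9))
# pot is low (favorable) in the enriched 4-5 A bin and higher elsewhere;
# a conformation's score would sum pot over its pairwise distances.
```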

  11. Competency-based teacher training: A systematic revision of a proven programme in medical didactics.

    PubMed

    Griewatz, Jan; Simon, Melanie; Lammerding-Koeppel, Maria

    2017-01-01

    Objectives: Competency-based medical education (CBME) requires factual knowledge to be practically applied together with skills and attitudes. With the National Competence-Based Learning Objectives for Undergraduate Medical Education (NKLM) representing a strong official demand for competence-orientation, it is generally important to explicitly outline its characteristics and review its realisation in teacher trainings. Further requirements are given by the core competencies for medical teachers (KLM). As an example the MQ programme ("Medizindidaktische Qualifikation") in Baden-Wuerttemberg, a long established and well-accepted training, has been critically revised on this basis, concerning its suitability for the demands of CBME, its needs for adjustment and the efforts to be undertaken for its implementation. Methods: In a systematic quality management process the MQ curriculum and its organisational framing were analysed and further developed in a step-wise comprehensive approach, using the six-step cycle by Kern. The procedures included a thorough needs assessment (e.g. literature research, programme mapping), strategic decisions on structure and content, piloting and evaluation. During the process essential elements of project and change management were considered. Results: The experiences of the MQ example revealed helpful information for key factors to be considered in the pending change process any training provider will be confronted with. Guiding questions were developed related to the process phases. Our analyses showed persistent key points of proven value as stable foundation for change, as well as components needing special consideration to foster competence-oriented aims and transfer into practice: reflection, feedback, application-oriented methods and transparent competence development. These aspects have to be consciously perceived and experienced by participants. Taking this into account, we re-designed the course evidence-based. 
    Besides visualising competencies and their progress, the occasions for reflection and feedback as well as the number of typical, practice-oriented tasks were extended to facilitate self-directed learning, critical self-reflection and individualised solutions. It is shown at what point, in what form and with which purpose these aspects were integrated in the MQ programme. Piloting showed good acceptance by participants and trainers. Preliminary assessment of the outcome is promising. Conclusion: Given their high workload, medical teachers will most likely not put CBME concepts into practice without impulses and support. Therefore, in didactical trainings, medical teachers should practice in a competency-based teaching setting and reflect on themselves in different professional roles to be able to transfer the experiences to their own educational approach. Trainers and training can serve as models for CBME realisation.

  12. Competency-based teacher training: A systematic revision of a proven programme in medical didactics

    PubMed Central

    Griewatz, Jan; Simon, Melanie; Lammerding-Koeppel, Maria

    2017-01-01

    Objectives: Competency-based medical education (CBME) requires factual knowledge to be practically applied together with skills and attitudes. With the National Competence-Based Learning Objectives for Undergraduate Medical Education (NKLM) representing a strong official demand for competence-orientation, it is generally important to explicitly outline its characteristics and review its realisation in teacher trainings. Further requirements are given by the core competencies for medical teachers (KLM). As an example the MQ programme (“Medizindidaktische Qualifikation”) in Baden-Wuerttemberg, a long established and well-accepted training, has been critically revised on this basis, concerning its suitability for the demands of CBME, its needs for adjustment and the efforts to be undertaken for its implementation. Methods: In a systematic quality management process the MQ curriculum and its organisational framing were analysed and further developed in a step-wise comprehensive approach, using the six-step cycle by Kern. The procedures included a thorough needs assessment (e.g. literature research, programme mapping), strategic decisions on structure and content, piloting and evaluation. During the process essential elements of project and change management were considered. Results: The experiences of the MQ example revealed helpful information for key factors to be considered in the pending change process any training provider will be confronted with. Guiding questions were developed related to the process phases. Our analyses showed persistent key points of proven value as stable foundation for change, as well as components needing special consideration to foster competence-oriented aims and transfer into practice: reflection, feedback, application-oriented methods and transparent competence development. These aspects have to be consciously perceived and experienced by participants. Taking this into account, we re-designed the course evidence-based. 
    Besides visualising competencies and their progress, the occasions for reflection and feedback as well as the number of typical, practice-oriented tasks were extended to facilitate self-directed learning, critical self-reflection and individualised solutions. It is shown at what point, in what form and with which purpose these aspects were integrated in the MQ programme. Piloting showed good acceptance by participants and trainers. Preliminary assessment of the outcome is promising. Conclusion: Given their high workload, medical teachers will most likely not put CBME concepts into practice without impulses and support. Therefore, in didactical trainings, medical teachers should practice in a competency-based teaching setting and reflect on themselves in different professional roles to be able to transfer the experiences to their own educational approach. Trainers and training can serve as models for CBME realisation. PMID:29085888

  13. Using fuzzy rule-based knowledge model for optimum plating conditions search

    NASA Astrophysics Data System (ADS)

    Solovjev, D. S.; Solovjeva, I. A.; Litovka, Yu V.; Arzamastsev, A. A.; Glazkov, V. P.; L’vov, A. A.

    2018-03-01

    The paper discusses existing approaches to modeling the plating process with the aim of reducing the thickness nonuniformity of the plated coating. However, these approaches do not take into account the experience, knowledge, and intuition of decision-makers when searching for the optimal conditions of the electroplating process. An original approach to the search for optimal conditions for applying electroplated coatings is proposed; it uses a rule-based knowledge model and allows one to reduce the unevenness of the product's thickness distribution. Block diagrams of a conventional galvanic process control system as well as a system based on a production knowledge model are considered. It is shown that a fuzzy production knowledge model in the control system makes it possible to obtain galvanic coatings of a given thickness unevenness with a high degree of adequacy to the experimental data. The described experimental results confirm the theoretical conclusions.
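
    A fuzzy production rule of the kind described can be sketched with triangular membership functions and weighted-average defuzzification; the rules, variable names, and numbers below are purely illustrative, not the paper's plating model.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_current_adjust(dev):
    """dev: relative thickness deviation (target - measured) / target.
    Hypothetical rules: coating too thick -> lower current; on target -> hold;
    too thin -> raise current. Output is a percentage adjustment."""
    rules = [
        (tri(dev, -1.0, -0.5, 0.0), -10.0),   # too thick: lower current by 10%
        (tri(dev, -0.25, 0.0, 0.25), 0.0),    # on target: hold
        (tri(dev, 0.0, 0.5, 1.0), 10.0),      # too thin: raise current by 10%
    ]
    num = sum(mu * out for mu, out in rules)   # weighted-average defuzzification
    den = sum(mu for mu, _ in rules)
    return num / den if den else 0.0

print(fuzzy_current_adjust(-0.5), fuzzy_current_adjust(0.0), fuzzy_current_adjust(0.1))
```

    Between rule peaks the output interpolates smoothly, which is how such a model encodes an operator's graded intuition rather than hard thresholds.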

  14. A knowledge engineering framework towards clinical support for adverse drug event prevention: the PSIP approach.

    PubMed

    Koutkias, Vassilis; Stalidis, George; Chouvarda, Ioanna; Lazou, Katerina; Kilintzis, Vassilis; Maglaveras, Nicos

    2009-01-01

    Adverse Drug Events (ADEs) are currently considered as a major public health issue, endangering patients' safety and causing significant healthcare costs. Several research efforts are currently concentrating on the reduction of preventable ADEs by employing Information Technology (IT) solutions, which aim to provide healthcare professionals and patients with relevant knowledge and decision support tools. In this context, we present a knowledge engineering approach towards the construction of a Knowledge-based System (KBS) regarded as the core part of a CDSS (Clinical Decision Support System) for ADE prevention, all developed in the context of the EU-funded research project PSIP (Patient Safety through Intelligent Procedures in Medication). In the current paper, we present the knowledge sources considered in PSIP and the implications they pose to knowledge engineering, the methodological approach followed, as well as the components defining the knowledge engineering framework based on relevant state-of-the-art technologies and representation formalisms.

  15. Scientific Reproducibility in Biomedical Research: Provenance Metadata Ontology for Semantic Annotation of Study Description.

    PubMed

    Sahoo, Satya S; Valdez, Joshua; Rueschman, Michael

    2016-01-01

Scientific reproducibility is key to scientific progress as it allows the research community to build on validated results, protect patients from potentially harmful trial drugs derived from incorrect results, and reduce wastage of valuable resources. The National Institutes of Health (NIH) recently published a systematic guideline titled "Rigor and Reproducibility" for supporting reproducible research studies, which has also been accepted by several scientific journals. These journals will require published articles to conform to these new guidelines. Provenance metadata describes the history or origin of data, and it has long been used in computer science to capture metadata information for ensuring data quality and supporting scientific reproducibility. In this paper, we describe the development of the Provenance for Clinical and healthcare Research (ProvCaRe) framework together with a provenance ontology to support scientific reproducibility by formally modeling a core set of data elements representing the details of a research study. We extend the PROV Ontology (PROV-O), which has been recommended as the provenance representation model by the World Wide Web Consortium (W3C), to represent both: (a) data provenance, and (b) process provenance. We use 124 study variables from 6 clinical research studies from the National Sleep Research Resource (NSRR) to evaluate the coverage of the provenance ontology. NSRR is the largest repository of NIH-funded sleep datasets, with 50,000 studies from 36,000 participants. The provenance ontology reuses ontology concepts from existing biomedical ontologies, for example the Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT), to model the provenance information of research studies. The ProvCaRe framework is being developed as part of the Big Data to Knowledge (BD2K) data provenance project.

  16. Scientific Reproducibility in Biomedical Research: Provenance Metadata Ontology for Semantic Annotation of Study Description

    PubMed Central

    Sahoo, Satya S.; Valdez, Joshua; Rueschman, Michael

    2016-01-01

Scientific reproducibility is key to scientific progress as it allows the research community to build on validated results, protect patients from potentially harmful trial drugs derived from incorrect results, and reduce wastage of valuable resources. The National Institutes of Health (NIH) recently published a systematic guideline titled “Rigor and Reproducibility” for supporting reproducible research studies, which has also been accepted by several scientific journals. These journals will require published articles to conform to these new guidelines. Provenance metadata describes the history or origin of data, and it has long been used in computer science to capture metadata information for ensuring data quality and supporting scientific reproducibility. In this paper, we describe the development of the Provenance for Clinical and healthcare Research (ProvCaRe) framework together with a provenance ontology to support scientific reproducibility by formally modeling a core set of data elements representing the details of a research study. We extend the PROV Ontology (PROV-O), which has been recommended as the provenance representation model by the World Wide Web Consortium (W3C), to represent both: (a) data provenance, and (b) process provenance. We use 124 study variables from 6 clinical research studies from the National Sleep Research Resource (NSRR) to evaluate the coverage of the provenance ontology. NSRR is the largest repository of NIH-funded sleep datasets, with 50,000 studies from 36,000 participants. The provenance ontology reuses ontology concepts from existing biomedical ontologies, for example the Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT), to model the provenance information of research studies. The ProvCaRe framework is being developed as part of the Big Data to Knowledge (BD2K) data provenance project. PMID:28269904
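The data/process provenance modeling described in this record boils down to PROV-style subject-predicate-object triples. A minimal, dependency-free sketch follows; only the `prov:` terms are standard PROV-O vocabulary, while the `nsrr:` and `provcare:` identifiers are hypothetical stand-ins, not actual ProvCaRe or NSRR names:

```python
# Toy triple store sketching PROV-O-style provenance for a study variable.
# "prov:" terms are real PROV-O vocabulary; the "nsrr:" and "provcare:"
# identifiers are hypothetical illustrations of the ProvCaRe extension.

triples = set()

def add(s, p, o):
    triples.add((s, p, o))

# Data provenance: a study variable derived from a recorded signal
add("nsrr:ahi", "rdf:type", "prov:Entity")
add("nsrr:ahi", "prov:wasDerivedFrom", "nsrr:polysomnogram")

# Process provenance: the activity that generated the variable
add("nsrr:scoring", "rdf:type", "prov:Activity")
add("nsrr:ahi", "prov:wasGeneratedBy", "nsrr:scoring")
add("nsrr:scoring", "prov:wasAssociatedWith", "provcare:SleepTechnician")

def objects(subject, predicate):
    """All objects linked to a subject by a predicate."""
    return {o for s, p, o in triples if s == subject and p == predicate}
```

Evaluating ontology coverage, as the paper does over 124 study variables, then amounts to checking how many variables' provenance details can be expressed with the available classes and properties.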

  17. Enhancing Learning Outcomes with an Interactive Knowledge-Based Learning Environment Providing Narrative Feedback

    ERIC Educational Resources Information Center

    Stranieri, Andrew; Yearwood, John

    2008-01-01

    This paper describes a narrative-based interactive learning environment which aims to elucidate reasoning using interactive scenarios that may be used in training novices in decision-making. Its design is based on an approach to generating narrative from knowledge that has been modelled in specific decision/reasoning domains. The approach uses a…

  18. Is Student Knowledge of Anatomy Affected by a Problem-Based Learning Approach? A Review

    ERIC Educational Resources Information Center

    Williams, Jonathan M.

    2014-01-01

    A fundamental understanding of anatomy is critical for students on many health science courses. It has been suggested that a problem-based approach to learning anatomy may result in deficits in foundation knowledge. The aim of this review is to compare traditional didactic methods with problem-based learning methods for obtaining anatomy…

  19. A Knowledge Based Approach to VLSI CAD

    DTIC Science & Technology

    1983-09-01

A Knowledge Based Approach to VLSI CAD, Louis L. Steinberg and…major issues lies in building up and managing the knowledge base of design expertise. We expect that, as with many recent expert systems, in order to

  20. A Computer-Based Approach for Deriving and Measuring Individual and Team Knowledge Structure from Essay Questions

    ERIC Educational Resources Information Center

    Clariana, Roy B.; Wallace, Patricia

    2007-01-01

    This proof-of-concept investigation describes a computer-based approach for deriving the knowledge structure of individuals and of groups from their written essays, and considers the convergent criterion-related validity of the computer-based scores relative to human rater essay scores and multiple-choice test scores. After completing a…

  1. Some requirements and suggestions for a methodology to develop knowledge based systems.

    PubMed

    Green, D W; Colbert, M; Long, J

    1989-11-01

    This paper describes an approach to the creation of a methodology for the development of knowledge based systems. It specifies some requirements and suggests how these requirements might be met. General requirements can be satisfied using a systems approach. More specific ones can be met by viewing an organization as a network of consultations for coordinating expertise. The nature of consultations is described and the form of a possible cognitive model using a blackboard architecture is outlined. The value of the approach is illustrated in terms of certain knowledge elicitation methods.

  2. Miners, Silica and Disability: The Bi-National Interplay Between South Africa and the United Kingdom, c1900–1930s

    PubMed Central

    McIvor, Arthur

    2016-01-01

This paper investigates silicosis as a disabling disease in underground mining in the United Kingdom (UK) before the Second World War, exploring the important connections between South Africa and the UK and examining some of the issues raised at the 1930 International Labour Office Conference on silicosis in Johannesburg in a British context. The evidence suggests there were significant paradoxes and much contestation in medical knowledge creation, advocacy, and policy-making relating to this occupational disease. It is argued here that whilst there was an international exchange of scientific knowledge on silicosis in the early decades of the twentieth century, it was insufficient to challenge the British government's traditional defense of demanding proof beyond all scientific doubt before intervening effectively in coal mining. This circumspect approach reflected dominant business interests, and despite relatively robust trade union campaigning and eventual reform, the outcome was a cumulative legacy of respiratory disease and disability that blighted coalfield communities. PMID:26509751

  3. Distributed, cooperating knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt

    1991-01-01

Some current research in the development and application of distributed, cooperating knowledge-based systems technology is addressed. The focus of the current research is the spacecraft ground operations environment. The underlying hypothesis is that, because of the increasing size, complexity, and cost of planned systems, conventional procedural approaches to the architecture of automated systems will give way to a more comprehensive knowledge-based approach. A hallmark of these future systems will be the integration of multiple knowledge-based agents which understand the operational goals of the system and cooperate with each other and the humans in the loop to attain the goals. The current work includes the development of a reference model for knowledge-base management, the development of a formal model of cooperating knowledge-based agents, the use of a testbed for prototyping and evaluating various knowledge-based concepts, and beginning work on the establishment of an object-oriented model of an intelligent end-to-end (spacecraft to user) system. An introductory discussion of these activities is presented, the major concepts and principles being investigated are highlighted, and their potential use in other application domains is indicated.

  4. A framework for teaching medical students and residents about practice-based learning and improvement, synthesized from a literature review.

    PubMed

    Ogrinc, Greg; Headrick, Linda A; Mutha, Sunita; Coleman, Mary T; O'Donnell, Joseph; Miles, Paul V

    2003-07-01

    To create a framework for teaching the knowledge and skills of practice-based learning and improvement to medical students and residents based on proven, effective strategies. The authors conducted a Medline search of English-language articles published between 1996 and May 2001, using the term "quality improvement" (QI), and cross-matched it with "medical education" and "health professions education." A thematic-synthesis method of review was used to compile the information from the articles. Based on the literature review, an expert panel recommended educational objectives for practice-based learning and improvement. Twenty-seven articles met the inclusion criteria. The majority of studies were conducted in academic medical centers and medical schools and 40% addressed experiential learning of QI. More than 75% were qualitative case reports capturing educational outcomes, and 7% included an experimental study design. The expert panel integrated data from the literature review with the Dreyfus model of professional skill acquisition, the Institute for Healthcare Improvement's (IHI) knowledge domains for improving health care, and the ACGME competencies and generated a framework of core educational objectives about teaching practice-based learning and improvement to medical students and residents. Teaching the knowledge and skills of practice-based learning and improvement to medical students and residents is a necessary and important foundation for improving patient care. The authors present a framework of learning objectives-informed by the literature and synthesized by the expert panel-to assist educational leaders when integrating these objectives into a curriculum. This framework serves as a blueprint to bridge the gap between current knowledge and future practice needs.

  5. Knowledge Resources - A Knowledge Management Approach for Digital Ecosystems

    NASA Astrophysics Data System (ADS)

    Kurz, Thomas; Eder, Raimund; Heistracher, Thomas

    The paper at hand presents an innovative approach for the conception and implementation of knowledge management in Digital Ecosystems. Based on a reflection of Digital Ecosystem research of the past years, an architecture is outlined which utilizes Knowledge Resources as the central and simplest entities of knowledge transfer. After the discussion of the related conception, the result of a first prototypical implementation is described that helps the transformation of implicit knowledge to explicit knowledge for wide use.

  6. Particle-based vaccines for HIV-1 infection.

    PubMed

    Young, Kelly R; Ross, Ted M

    2003-06-01

The use of live-attenuated viruses as vaccines has been successful for the control of viral infections. However, the development of an effective vaccine against the human immunodeficiency virus (HIV) has proven to be a challenge. HIV infects cells of the immune system and results in a severe immunodeficiency. In addition, the ability of the virus to adapt to immune pressure and to reside in an integrated form in host cells presents hurdles for vaccinologists to overcome. A particle-based vaccine strategy has promise for eliciting high-titer, long-lived immune responses to a diverse number of viral epitopes from different HIV antigens. Live-attenuated viruses are effective at generating both cellular and humoral immunity; however, a live-attenuated vaccine for HIV is problematic. The possibility of a live-attenuated vaccine reverting to a pathogenic form or recombining with a wild-type or defective virus in an infected individual is a drawback to this approach. Therefore, these vaccines are currently only being tested in non-human primate models. Live-attenuated vaccines are effective in stimulating immunity; however, challenged animals rarely clear viral infection, and the degree of attenuation directly correlates with the protection of animals from disease. Another particle-based vaccine approach for HIV involves the use of virus-like particles (VLPs). VLPs mimic the viral particle without causing an immunodeficiency disease. HIV-like particles (HIV-LP) are defined as self-assembling, non-replicating, nonpathogenic, genomeless particles that are similar in size and conformation to intact virions. A variety of VLPs for both HIV and SIV are currently in pre-clinical and clinical trials. This review focuses on the current knowledge regarding the immunogenicity and safety of particle-based vaccine strategies for HIV-1.

  7. 76 FR 4452 - Privacy Act of 1974; Report of Modified or Altered System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-25

    ... Disease Control and Prevention (CDC) for more complete knowledge of the disease/condition in the following... the light of future discoveries and proven associations so that relevant data collected at the time of... professional staff at the Centers for Disease Control and Prevention (CDC) for more complete knowledge of the...

  8. Inductive knowledge acquisition experience with commercial tools for space shuttle main engine testing

    NASA Technical Reports Server (NTRS)

    Modesitt, Kenneth L.

    1990-01-01

    Since 1984, an effort has been underway at Rocketdyne, manufacturer of the Space Shuttle Main Engine (SSME), to automate much of the analysis procedure conducted after engine test firings. Previously published articles at national and international conferences have contained the context of and justification for this effort. Here, progress is reported in building the full system, including the extensions of integrating large databases with the system, known as Scotty. Inductive knowledge acquisition has proven itself to be a key factor in the success of Scotty. The combination of a powerful inductive expert system building tool (ExTran), a relational data base management system (Reliance), and software engineering principles and Computer-Assisted Software Engineering (CASE) tools makes for a practical, useful and state-of-the-art application of an expert system.

  9. Engineering Knowledge for Assistive Living

    NASA Astrophysics Data System (ADS)

    Chen, Liming; Nugent, Chris

    This paper introduces a knowledge based approach to assistive living in smart homes. It proposes a system architecture that makes use of knowledge in the lifecycle of assistive living. The paper describes ontology based knowledge engineering practices and discusses mechanisms for exploiting knowledge for activity recognition and assistance. It presents system implementation and experiments, and discusses initial results.

  10. Oligomer formation in the troposphere: from experimental knowledge to 3-D modeling

    NASA Astrophysics Data System (ADS)

    Lemaire, Vincent; Coll, Isabelle; Couvidat, Florian; Mouchel-Vallon, Camille; Seigneur, Christian; Siour, Guillaume

    2016-04-01

    The organic fraction of atmospheric aerosols has proven to be a critical element of air quality and climate issues. However, its composition and the aging processes it undergoes remain insufficiently understood. This work builds on laboratory knowledge to simulate the formation of oligomers from biogenic secondary organic aerosol (BSOA) in the troposphere at the continental scale. We compare the results of two different modeling approaches, a first-order kinetic process and a pH-dependent parameterization, both implemented in the CHIMERE air quality model (AQM) (www.lmd.polytechnique.fr/chimere), to simulate the spatial and temporal distribution of oligomerized secondary organic aerosol (SOA) over western Europe. We also included a comparison of organic carbon (OC) concentrations at two EMEP (European Monitoring and Evaluation Programme) stations. Our results show that there is a strong dependence of the results on the selected modeling approach: while the irreversible kinetic process leads to the oligomerization of about 50 % of the total BSOA mass, the pH-dependent approach shows a broader range of impacts, with a strong dependency on environmental parameters (pH and nature of aerosol) and the possibility for the process to be reversible. In parallel, we investigated the sensitivity of each modeling approach to the representation of SOA precursor solubility (Henry's law constant values). Finally, the pros and cons of each approach for the representation of SOA aging are discussed and recommendations are provided to improve current representations of oligomer formation in AQMs.
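The irreversible first-order kinetic approach mentioned in this record amounts to exponential decay of the monomeric SOA mass. A toy sketch follows; the rate constant is an arbitrary illustrative value, not the CHIMERE parameterization:

```python
import math

# Irreversible first-order oligomerization: monomeric BSOA mass decays as
# dM/dt = -k * M, and the lost mass accumulates as oligomer.
# k is an arbitrary illustrative value, not the CHIMERE parameterization.

def oligomerized_fraction(k, t):
    """Fraction of the initial monomer mass converted to oligomer by time t."""
    return 1.0 - math.exp(-k * t)

# With an illustrative conversion half-life of 20 h, half of the monomeric
# mass has been oligomerized after 20 h.
k = math.log(2) / 20.0   # h^-1
frac = oligomerized_fraction(k, 20.0)
```

A pH-dependent parameterization, by contrast, would make `k` (and possibly a reverse rate) a function of aerosol acidity, which is what produces the broader, environment-dependent range of impacts reported above.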

  11. Comparison of clinical knowledge bases for summarization of electronic health records.

    PubMed

    McCoy, Allison B; Sittig, Dean F; Wright, Adam

    2013-01-01

Automated summarization tools that create condition-specific displays may improve clinician efficiency. These tools require new kinds of knowledge that are difficult to obtain. We compared five problem-medication pair knowledge bases generated using four previously described knowledge base development approaches. The number of pairs in the resulting mapped knowledge bases varied widely due to differing mapping techniques from the source terminologies, ranging from 2,873 to 63,977,738 pairs. The number of overlapping pairs across knowledge bases was low, with one knowledge base having half of its pairs overlapping with another knowledge base, and most having less than a third overlapping. Further research is necessary to better evaluate the knowledge bases independently in additional settings, and to identify methods to integrate the knowledge bases.
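The pairwise overlap comparison described here can be sketched with plain set operations; the problem-medication pairs below are invented examples, not entries from the compared knowledge bases:

```python
# Sketch: comparing problem-medication pair knowledge bases by overlap.
# The pairs are invented (problem, drug) examples, not real mappings.

kb_a = {("hypertension", "lisinopril"),
        ("hypertension", "amlodipine"),
        ("diabetes", "metformin")}
kb_b = {("hypertension", "lisinopril"),
        ("asthma", "albuterol")}

def overlap_fraction(kb, other):
    """Fraction of kb's pairs that also appear in other (asymmetric)."""
    return len(kb & other) / len(kb)

def jaccard(kb, other):
    """Symmetric similarity: intersection over union."""
    return len(kb & other) / len(kb | other)
```

The asymmetric measure matches the finding quoted above ("half of its pairs overlapping with another"), which depends on which knowledge base is taken as the reference.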

  12. Knowledge-intensive software design systems: Can too much knowledge be a burden?

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1992-01-01

    While acknowledging the considerable benefits of domain-specific, knowledge-intensive approaches to automated software engineering, it is prudent to carefully examine the costs of such approaches, as well. In adding domain knowledge to a system, a developer makes a commitment to understanding, representing, maintaining, and communicating that knowledge. This substantial overhead is not generally associated with domain-independent approaches. In this paper, I examine the downside of incorporating additional knowledge, and illustrate with examples based on our experience in building the SIGMA system. I also offer some guidelines for developers building domain-specific systems.

  13. Knowledge-intensive software design systems: Can too much knowledge be a burden?

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1992-01-01

    While acknowledging the considerable benefits of domain-specific, knowledge-intensive approaches to automated software engineering, it is prudent to carefully examine the costs of such approaches, as well. In adding domain knowledge to a system, a developer makes a commitment to understanding, representing, maintaining, and communicating that knowledge. This substantial overhead is not generally associated with domain-independent approaches. In this paper, I examine the downside of incorporating additional knowledge, and illustrate with examples based on our experiences building the SIGMA system. I also offer some guidelines for developers building domain-specific systems.

  14. An approach to combining heuristic and qualitative reasoning in an expert system

    NASA Technical Reports Server (NTRS)

    Jiang, Wei-Si; Han, Chia Yung; Tsai, Lian Cheng; Wee, William G.

    1988-01-01

An approach to combining the heuristic reasoning from shallow knowledge and the qualitative reasoning from deep knowledge is described. The shallow knowledge is represented in production rules and is under the direct control of the inference engine. The deep knowledge is represented in frames, which may be stored in a relational database management system. This approach takes advantage of both reasoning schemes and results in improved efficiency as well as expanded problem-solving ability.
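The division of labour described in this record, shallow production rules firing over deep frame knowledge, can be sketched as follows; the frames, slots, and rules are invented for illustration:

```python
# Sketch of combining shallow production rules with deep frame knowledge.
# Frames hold structured device knowledge; rules fire heuristics against them.
# All frame slots and rules here are invented for illustration.

frames = {
    "pump1": {"type": "pump", "flow": 0.0, "power": "on"},
    "valve3": {"type": "valve", "position": "closed"},
}

def rule_blocked_line(f):
    # Heuristic: a powered pump with no flow suggests a blocked line.
    if f["type"] == "pump" and f["power"] == "on" and f["flow"] == 0.0:
        return "suspect blocked line upstream of pump"

def rule_closed_valve(f):
    if f["type"] == "valve" and f["position"] == "closed":
        return "valve closed: check whether this is intended"

RULES = [rule_blocked_line, rule_closed_valve]

def infer(frames):
    """Run every rule against every frame; collect fired conclusions."""
    findings = []
    for name, frame in frames.items():
        for rule in RULES:
            conclusion = rule(frame)
            if conclusion:
                findings.append((name, conclusion))
    return findings
```

In a fuller system the frames would live in the database and the rule engine would query slots on demand, which is the efficiency gain the abstract alludes to.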

  15. The Use of GIS for the Application of the Phenomenological Approach to the Seismic Risk Analysis: the Case of the Italian Fortified Architecture

    NASA Astrophysics Data System (ADS)

    Lenticchia, E.; Coïsson, E.

    2017-05-01

    The present paper proposes the use of GIS for the application of the so-called phenomenological approach to the analysis of the seismic behaviour of historical buildings. This approach is based on the awareness that the different masonry building typologies are characterized by different, recurring vulnerabilities. Thus, the observation and classification of the real damage is seen as the first step for recognizing and classifying these vulnerabilities, in order to plan focused preventive interventions. For these purposes, the GIS has proven to be a powerful instrument to collect and manage this type of information on a large number of cases. This paper specifically focuses on the application of the phenomenological approach to the analysis of the seismic behaviour of fortified buildings, including castles, fortresses, citadels, and all the typical historical constructions characterized by the presence of massive towers and defensive walls. The main earthquakes which struck Italy in the last 40 years (up to the recent Central Italy seismic swarm) were taken into consideration and described by means of shake maps. A previously published work has been continued with the addition of new data and some improvements, including a specific symbology for the description of building typologies and conservation status on the maps, the indications of damage levels and the comparison between shake maps in terms of pga and in terms of pseudo-acceleration. The increase in knowledge obtained and the broader frame given by the analysis of the data are here directed to the primary aim of cultural heritage preservation.

  16. Genetic therapy for vein bypass graft disease: current perspectives.

    PubMed

    Simosa, Hector F; Conte, Michael S

    2004-01-01

    Although continued progress in endovascular technology holds promise for less invasive approaches to arterial diseases, surgical bypass grafting remains the mainstay of therapy for patients with advanced coronary and peripheral ischemia. In the United States, nearly 400,000 coronary and 100,000 lower extremity bypass procedures are performed annually. The autogenous vein, particularly the greater saphenous vein, has proven to be a durable and versatile arterial substitute, with secondary patency rates at 5 years of 70 to 80% in the extremity. However, vein graft failure is a common occurrence that incurs significant morbidity and mortality, and, to date, pharmacologic approaches to prolong vein graft patency have produced limited results. Dramatic advances in genetics, coupled with a rapidly expanding knowledge of the molecular basis of vascular diseases, have set the stage for genetic interventions. The attraction of a genetic approach to vein graft failure is based on the notion that the tissue at risk is readily accessible to the clinician prior to the onset of the pathologic process and the premise that genetic reprogramming of cells in the wall of the vein can lead to an improved healing response. Although the pathophysiology of vein graft failure is incompletely understood, numerous relevant molecular targets have been elucidated. Interventions designed to influence cell proliferation, thrombosis, inflammation, and matrix remodeling at the genetic level have been described, and many have been tested in animal models. Both gene delivery and gene blockade strategies have been investigated, with the latter reaching the stage of advanced clinical trials.

  17. Building Better Decision-Support by Using Knowledge Discovery.

    ERIC Educational Resources Information Center

    Jurisica, Igor

    2000-01-01

    Discusses knowledge-based decision-support systems that use artificial intelligence approaches. Addresses the issue of how to create an effective case-based reasoning system for complex and evolving domains, focusing on automated methods for system optimization and domain knowledge evolution that can supplement knowledge acquired from domain…

  18. COM3/369: Knowledge-based Information Systems: A new approach for the representation and retrieval of medical information

    PubMed Central

    Mann, G; Birkmann, C; Schmidt, T; Schaeffler, V

    1999-01-01

Introduction Present solutions for the representation and retrieval of medical information from online sources are not very satisfying. Either the retrieval process lacks precision and completeness, or the representation does not support the update and maintenance of the represented information. Most efforts are currently put into improving the combination of search engines and HTML-based documents. However, due to the current shortcomings of methods for natural language understanding, there are clear limitations to this approach. Furthermore, this approach does not solve the maintenance problem. At least medical information exceeding a certain complexity seems to call for approaches that rely on structured knowledge representation and corresponding retrieval mechanisms. Methods Knowledge-based information systems are based on the following fundamental ideas. The representation of information is based on ontologies that define the structure of the domain's concepts and their relations. Views on domain models are defined and represented as retrieval schemata. Retrieval schemata can be interpreted as canonical query types focussing on specific aspects of the provided information (e.g. diagnosis or therapy centred views). Based on these retrieval schemata it can be decided which parts of the information in the domain model must be represented explicitly and formalised to support the retrieval process. Propositional logic is used as the representation language. All other information can be represented in a structured but informal way using text, images etc. Layout schemata are used to assign layout information to retrieved domain concepts. Depending on the target environment, HTML or XML can be used. Results Based on this approach two knowledge-based information systems have been developed. 
The 'Ophthalmologic Knowledge-based Information System for Diabetic Retinopathy' (OKIS-DR) provides information on diagnoses, findings, examinations, guidelines, and reference images related to diabetic retinopathy. OKIS-DR uses combinations of findings to specify the information that must be retrieved. The second system focuses on nutrition related allergies and intolerances. Information on allergies and intolerances of a patient are used to retrieve general information on the specified combination of allergies and intolerances. As a special feature the system generates tables showing food types and products that are tolerated or not tolerated by patients. Evaluation by external experts and user groups showed that the described approach of knowledge-based information systems increases the precision and completeness of knowledge retrieval. Due to the structured and non-redundant representation of information the maintenance and update of the information can be simplified. Both systems are available as WWW based online knowledge bases and CD-ROMs (cf. http://mta.gsf.de topic: products).
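The retrieval-schema idea in this record, propositional conditions over findings selecting domain concepts, can be sketched as follows; the findings and conditions are invented examples, not OKIS-DR content:

```python
# Sketch: propositional retrieval schemata over a small domain model.
# Each concept carries a propositional condition over findings; a retrieval
# schema returns every concept whose condition the findings satisfy.
# Findings and conditions are invented examples, not OKIS-DR content.

domain_model = {
    "background retinopathy info": lambda f: f["microaneurysms"] and not f["neovascularisation"],
    "proliferative retinopathy info": lambda f: f["neovascularisation"],
    "referral guideline": lambda f: f["neovascularisation"] or f["macular_edema"],
}

def retrieve(findings):
    """Evaluate each concept's propositional condition against the findings."""
    return sorted(name for name, cond in domain_model.items() if cond(findings))

findings = {"microaneurysms": True, "neovascularisation": False, "macular_edema": True}
```

Because only the formalised conditions drive retrieval, the bulk of each concept (text, images, layout) can stay informal and easy to maintain, which is the maintenance advantage the abstract claims.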

  19. A time-responsive tool for informing policy making: rapid realist review.

    PubMed

    Saul, Jessie E; Willis, Cameron D; Bitz, Jennifer; Best, Allan

    2013-09-05

    A realist synthesis attempts to provide policy makers with a transferable theory that suggests a certain program is more or less likely to work in certain respects, for particular subjects, in specific kinds of situations. Yet realist reviews can require considerable and sustained investment over time, which does not always suit the time-sensitive demands of many policy decisions. 'Rapid Realist Review' methodology (RRR) has been developed as a tool for applying a realist approach to a knowledge synthesis process in order to produce a product that is useful to policy makers in responding to time-sensitive and/or emerging issues, while preserving the core elements of realist methodology. Using examples from completed RRRs, we describe key features of the RRR methodology, the resources required, and the strengths and limitations of the process. All aspects of an RRR are guided by both a local reference group, and a group of content experts. Involvement of knowledge users and external experts ensures both the usability of the review products, as well as their links to current practice. RRRs have proven useful in providing evidence for and making explicit what is known on a given topic, as well as articulating where knowledge gaps may exist. From the RRRs completed to date, findings broadly adhere to four (often overlapping) classifications: guiding rules for policy-making; knowledge quantification (i.e., the amount of literature available that identifies context, mechanisms, and outcomes for a given topic); understanding tensions/paradoxes in the evidence base; and, reinforcing or refuting beliefs and decisions taken. 'Traditional' realist reviews and RRRs have some key differences, which allow policy makers to apply each type of methodology strategically to maximize its utility within a particular local constellation of history, goals, resources, politics and environment. 
In particular, the RRR methodology is explicitly designed to engage knowledge users and review stakeholders in defining the research questions, and to streamline the review process. In addition, results are presented with a focus on context-specific explanations for what works within a particular set of parameters, rather than on explanations that are potentially transferable across contexts and populations. For policy makers faced with making difficult decisions in short time frames, for which there is sufficient (if limited) published research and practice-based evidence available, RRR provides a practical, outcomes-focused knowledge synthesis method.

  20. A relational data-knowledge base system and its potential in developing a distributed data-knowledge system

    NASA Technical Reports Server (NTRS)

    Rahimian, Eric N.; Graves, Sara J.

    1988-01-01

    A new approach to constructing a relational data-knowledge base system is described. The relational database is well suited for distribution owing to its support for data fragmentation and fragmentation transparency. An example of a simple relational data-knowledge base is formulated, which may be generalized for use in developing a relational distributed data-knowledge base system. The efficiency and ease of application of such a data-knowledge base management system are briefly discussed, along with the potential of the developed model for sharing the data-knowledge base and the possible areas of difficulty in implementing a relational data-knowledge base management system.

  1. Self-organizing maps in geothermal exploration-A new approach for understanding geochemical processes and fluid evolution

    NASA Astrophysics Data System (ADS)

    Brehme, Maren; Bauer, Klaus; Nukman, Mochamad; Regenspurg, Simona

    2017-04-01

    Understanding geochemical processes is an important part of geothermal exploration, providing information about the source and evolution of geothermal fluids. In most cases, however, knowledge of fluid properties is based on a few parameters determined in samples from the shallow subsurface. This study presents a new approach that combines a variety of such data to draw conclusions about processes occurring at depth in a geothermal reservoir. The neural network clustering technique called "self-organizing maps" (SOMs) successfully distinguished two different geothermal settings based on a hydrochemical database and disclosed the source, evolution and flow pathways of geothermal fluids. Scatter plots, as shown in this study, are appropriate presentations of element concentrations and of the chemical interaction of water and rock at depth. One geological setting presented here is marked by fault-dominated fluid pathways and a minor influence of volcanically affected fluids, with high concentrations of HCO3, Ca and Sr. The second is a magmatically dominated setting that shows strong alteration features in volcanic rocks and accommodates acidic fluids with high SO4 and Si concentrations. Earlier studies, e.g., Giggenbach (1988), suggested Cl, HCO3 and SO4 to be generally the most important elements for understanding hydrochemical processes in geothermal reservoirs, and their relation has been widely used to classify water types in geothermal fields. This study shows, however, that non-standard elements are of at least the same importance in revealing different fluid types in geothermal systems. The study therefore extends the water classification approach by using SOMs for element correlations. SOMs have proven to be a successful method for analyzing even relatively small hydrochemical datasets in geothermal applications.
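    The SOM clustering idea described above can be sketched in a few lines. This is a generic, minimal self-organizing map in pure NumPy; the feature vectors, grid size, and training schedule are illustrative assumptions, not the study's actual data or configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Each row: a water sample described by e.g. HCO3, Ca, Sr, SO4, Si
# concentrations, assumed standardized to zero mean and unit variance.
samples = rng.normal(size=(60, 5))

grid_h, grid_w, n_feat = 4, 4, samples.shape[1]
weights = rng.normal(size=(grid_h * grid_w, n_feat))
coords = np.array([(i, j) for i in range(grid_h) for j in range(grid_w)])

def best_matching_unit(x):
    """Index of the map node whose weight vector is closest to x."""
    return int(np.argmin(np.linalg.norm(weights - x, axis=1)))

n_epochs, lr0, sigma0 = 50, 0.5, 2.0
for epoch in range(n_epochs):
    lr = lr0 * (1 - epoch / n_epochs)              # learning rate decays
    sigma = sigma0 * (1 - epoch / n_epochs) + 0.5  # neighborhood shrinks
    for x in samples:
        bmu = best_matching_unit(x)
        # Gaussian neighborhood: nodes near the BMU on the grid move more.
        d2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
        h = np.exp(-d2 / (2 * sigma ** 2))
        weights += lr * h[:, None] * (x - weights)

# Assign every sample to its BMU; groups of neighboring nodes then
# correspond to water types (e.g. fault-dominated vs. magmatic).
labels = np.array([best_matching_unit(x) for x in samples])
```

    After training, samples mapped to adjacent grid nodes are hydrochemically similar, which is what makes the map useful for distinguishing fluid types.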

  2. "I Have Just Understood It from the Story …": Using Vignettes in Educational Research to Investigate Cultural Tolerance

    ERIC Educational Resources Information Center

    Al Sadi, Fatma H.; Basit, Tehmina N.

    2017-01-01

    The vignettes approach has emerged as a popular tool in quantitative and qualitative research. It has proven to be particularly effective in measuring sensitive topics. This paper focuses on the construction and validation process of questionnaire-based vignettes, which were used as an instrument to examine Omani secondary school girls' cultural…

  3. Development of a component centered fault monitoring and diagnosis knowledge based system for space power system

    NASA Technical Reports Server (NTRS)

    Lee, S. C.; Lollar, Louis F.

    1988-01-01

    The overall approach currently being taken in the development of AMPERES (Autonomously Managed Power System Extendable Real-time Expert System), a knowledge-based expert system for fault monitoring and diagnosis of space power systems, is discussed. The system architecture, knowledge representation, and fault monitoring and diagnosis strategy are examined. A 'component-centered' approach developed in this project is described. Critical issues requiring further study are identified.

  4. Microbial composition analyses by 16S rRNA sequencing: A proof of concept approach to provenance determination of archaeological ochre.

    PubMed

    Lenehan, Claire E; Tobe, Shanan S; Smith, Renee J; Popelka-Filcoff, Rachel S

    2017-01-01

    Many archaeological science studies rely on the concept of "provenance", whereby the origin of cultural material can be determined through physical or chemical properties that relate back to the material's source. DNA profiling of bacteria has recently been used in forensic soil analysis to determine geographic origin. This manuscript presents a novel approach to the provenance of archaeological minerals and related materials through 16S rRNA sequencing analysis of microbial DNA. Through microbial DNA characterization of ochre combined with multivariate statistics, we demonstrate clear discrimination between four distinct Australian cultural ochre sites.

  5. Automated structure solution, density modification and model building.

    PubMed

    Terwilliger, Thomas C

    2002-11-01

    The approaches that form the basis of automated structure solution in SOLVE and RESOLVE are described. The use of a scoring scheme to convert decision making in macromolecular structure solution to an optimization problem has proven very useful, and in many cases a single clear heavy-atom solution can be obtained and used for phasing. Statistical density modification is well suited to an automated approach to structure solution because the method is relatively insensitive to the choice of number of cycles and solvent content. The detection of non-crystallographic symmetry (NCS) in heavy-atom sites and the checking of potential NCS operations against the electron-density map has proven to be a reliable method for identifying NCS in most cases. An FFT-based search for helices and sheets has been successful in automated model building for maps with resolutions as low as 3 Å. The entire process can be carried out in a fully automatic fashion in many cases.

  6. Moiré deflectometry-based position detection for optical tweezers.

    PubMed

    Khorshad, Ali Akbar; Reihani, S Nader S; Tavassoly, Mohammad Taghi

    2017-09-01

    Optical tweezers have proven to be indispensable tools for pico-Newton range force spectroscopy. A quadrant photodiode (QPD) positioned at the back focal plane of an optical tweezers' condenser is commonly used for locating the trapped object. In this Letter, for the first time, to the best of our knowledge, we introduce a moiré pattern-based detection method for optical tweezers. We show, both theoretically and experimentally, that this detection method could provide considerably better position sensitivity compared to the commonly used detection systems. For instance, position sensitivity for a trapped 2.17 μm polystyrene bead is shown to be 71% better than the commonly used QPD-based detection method. Our theoretical and experimental results are in good agreement.

  7. Multi-viewpoint clustering analysis

    NASA Technical Reports Server (NTRS)

    Mehrotra, Mala; Wild, Chris

    1993-01-01

    In this paper, we address the feasibility of partitioning rule-based systems into meaningful units to enhance the comprehensibility, maintainability and reliability of expert systems software. Preliminary results have shown that no single structuring principle or abstraction hierarchy is sufficient to understand complex knowledge bases. We therefore propose the Multi-ViewPoint Clustering Analysis (MVP-CA) methodology to provide multiple views of the same expert system. We present the results of using this approach to partition a deployed knowledge-based system that navigates the Space Shuttle's entry, and we discuss the impact of the approach on verification and validation of knowledge-based systems.

  8. A Comparison of Books and Hypermedia for Knowledge-based Sports Coaching.

    ERIC Educational Resources Information Center

    Vickers, Joan N.; Gaines, Brian R.

    1988-01-01

    Summarizes and illustrates the knowledge-based approach to instructional material design. A series of sports coaching handbooks and hypermedia presentations of the same material are described and the different instantiations of the knowledge and training structures are compared. Figures show knowledge structures for badminton and the architecture…

  9. Optimizing the Use of Aripiprazole Augmentation in the Treatment of Major Depressive Disorder: From Clinical Trials to Clinical Practice

    PubMed Central

    Han, Changsu; Wang, Sheng-Min; Lee, Soo-Jung; Jun, Tae-Youn

    2015-01-01

    Major depressive disorder (MDD) is a recurrent, chronic, and devastating disorder leading to serious impairment in functional capacity as well as increasing public health care costs. In the previous decade, switching therapy and dose adjustment of ongoing antidepressants was the most frequently chosen subsequent treatment option for MDD. However, such recommendations were not based on firmly proven efficacy data from well-designed, placebo-controlled, randomized clinical trials (RCTs) but on practical grounds and clinical reasoning. Aripiprazole augmentation has been dramatically increasing in clinical practice owing to its unique action mechanisms as well as proven efficacy and safety from adequately powered and well-controlled RCTs. Despite the increased use of aripiprazole in depression, limited clinical information and knowledge interfere with proper and efficient use of aripiprazole augmentation for MDD. The objective of the present review was to enhance clinicians' current understanding of aripiprazole augmentation and how to optimize the use of this therapy in the treatment of MDD. PMID:26306301

  10. A knowledge engineering approach to recognizing and extracting sequences of nucleic acids from scientific literature.

    PubMed

    García-Remesal, Miguel; Maojo, Victor; Crespo, José

    2010-01-01

    In this paper we present a knowledge engineering approach to automatically recognizing and extracting genetic sequences from scientific articles. To carry out this task, we use a preliminary recognizer based on a finite state machine to extract all candidate DNA/RNA sequences. These candidates are then fed into a knowledge-based system that automatically discards false positives and refines noisy and incorrectly merged sequences. We created the knowledge base by manually analyzing different manuscripts containing genetic sequences. Our approach was evaluated on a test set of 211 full-text articles in PDF format containing 3134 genetic sequences, achieving 87.76% precision and 97.70% recall. This method can facilitate research tasks in text mining, information extraction, and information retrieval over large collections of documents containing genetic sequences.
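    The first stage of such a pipeline, the candidate recognizer, can be approximated with a regular expression (a compiled finite-state machine). This sketch is a stand-in, not the paper's implementation: the pattern and the minimum run length of 8 are assumptions, and the knowledge-based false-positive filtering stage is omitted.

```python
import re

# Runs of 8 or more nucleotide letters (DNA: ACGT, RNA: ACGU) delimited
# by word boundaries are treated as candidate sequences.
CANDIDATE = re.compile(r"\b[ACGTU]{8,}\b")

def extract_candidates(text):
    """Return candidate nucleotide sequences found in free text."""
    return [m.group(0) for m in CANDIDATE.finditer(text)]

text = ("The primer GATTACAGATTACA was used, whereas the acronym "
        "NASA and the word CATALOG should be ignored.")
print(extract_candidates(text))  # -> ['GATTACAGATTACA']
```

    Short uppercase words like "NASA" or "CATALOG" fail the pattern because they contain letters outside ACGTU or fall below the length threshold; in the paper's system, harder false positives are handled by the downstream knowledge base.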

  11. Report: Unsupervised identification of malaria parasites using computer vision.

    PubMed

    Khan, Najeed Ahmed; Pervaz, Hassan; Latif, Arsalan; Musharaff, Ayesha

    2017-01-01

    Malaria is a serious and often fatal tropical disease in humans, caused by Plasmodium species transmitted by infected Anopheles mosquitoes. A clinical diagnosis of malaria based on history, symptoms and clinical findings must always be confirmed by laboratory diagnosis, which involves identifying the malaria parasite or its antigens/products in the patient's blood. Manual identification of the malaria parasite by pathologists has proven cumbersome, so there is a need for automatic, efficient and accurate identification. In this paper, we propose a computer vision based approach to identifying the malaria parasite in light microscopy images, and we address the challenges involved in the automatic detection of malaria parasite tissues. Our method operates at the pixel level, using K-means clustering (an unsupervised approach) to segment and identify malaria parasite tissues.
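    The pixel-level K-means idea can be sketched as follows. This is a generic k-means on synthetic "pixel" colors standing in for stained microscopy data; the intensity values, cluster count, and the darkest-cluster heuristic are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic two-population RGB pixels: bright background vs. dark
# stained parasite tissue.
background = rng.normal(200.0, 10.0, size=(300, 3))
parasite = rng.normal(60.0, 10.0, size=(50, 3))
pixels = np.vstack([background, parasite])

def kmeans(x, k=2, n_iter=20):
    """Plain k-means with a deterministic brightness-based init."""
    order = x.mean(axis=1).argsort()
    idx = np.linspace(0, len(x) - 1, k).astype(int)
    centers = x[order[idx]].copy()  # seed from darkest..brightest pixels
    for _ in range(n_iter):
        # Assign each pixel to its nearest center.
        d = np.linalg.norm(x[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each center to the mean of its assigned pixels.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean(axis=0)
    return labels, centers

labels, centers = kmeans(pixels, k=2)
# Heuristic: the cluster with the darker mean color is the parasite
# candidate, since stained parasite tissue absorbs more light.
parasite_cluster = int(centers.mean(axis=1).argmin())
```

    On real slides, the cluster assignments form a segmentation mask over the image, which downstream steps would refine before counting parasites.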

  12. A formal approach to validation and verification for knowledge-based control systems

    NASA Technical Reports Server (NTRS)

    Castore, Glen

    1987-01-01

    As control systems become more complex in response to desires for greater system flexibility, performance and reliability, the promise is held out that artificial intelligence might provide the means for building such systems. An obstacle to the use of symbolic processing constructs in this domain is the need for verification and validation (V and V) of the systems. Techniques currently in use do not seem appropriate for knowledge-based software. An outline of a formal approach to V and V for knowledge-based control systems is presented.

  13. Towards integration of clinical decision support in commercial hospital information systems using distributed, reusable software and knowledge components.

    PubMed

    Müller, M L; Ganslandt, T; Eich, H P; Lang, K; Ohmann, C; Prokosch, H U

    2001-12-01

    Clinicians' acceptance of clinical decision support depends on its workflow-oriented, context-sensitive accessibility and availability at the point of care, integrated into the Electronic Patient Record (EPR). Commercially available Hospital Information Systems (HIS) often focus on administrative tasks and mostly do not provide additional knowledge-based functionality, and their traditionally monolithic and closed software architecture encumbers integration of, and interaction with, external software modules. Our aim was to develop methods and interfaces to integrate knowledge sources into two different commercial hospital information systems, to provide the best decision support possible within the context of available patient data. An existing, proven standalone scoring system for acute abdominal pain was supplemented with a communication interface. In both HIS we defined data entry forms and developed individual, reusable mechanisms for data exchange with external software modules. We designed an additional knowledge support frontend that controls data exchange between the HIS and the knowledge modules. Finally, we added guidelines and algorithms to the knowledge library. Despite some major drawbacks, resulting mainly from the closed software architectures of the HIS, we showed by example how external knowledge support can be integrated almost seamlessly into different commercial HIS. This paper describes the prototypical design and current implementation and discusses our experiences.

  14. Enhancing Users' Participation in Business Process Modeling through Ontology-Based Training

    NASA Astrophysics Data System (ADS)

    Macris, A.; Malamateniou, F.; Vassilacopoulos, G.

    Successful business process design requires the active participation of users who are familiar with organizational activities and business process modelling concepts. Hence, there is a need to provide users with reusable, flexible, agile and adaptable training material that enables them to instil their knowledge and expertise into business process design and automation activities. Knowledge reusability is of paramount importance in designing training material on process modelling, since it enables users to participate actively in process design/redesign activities stimulated by the changing business environment. This paper presents a prototype approach to the design and use of training material that provides significant advantages to both the designer (knowledge-content reusability and semantic web enabling) and the user (semantic search, knowledge navigation and knowledge dissemination). The approach is based on externalizing domain knowledge in the form of ontology-based knowledge networks (i.e., training scenarios serving specific training needs) so that it is made reusable.

  15. Knowledge acquisition and representation using fuzzy evidential reasoning and dynamic adaptive fuzzy Petri nets.

    PubMed

    Liu, Hu-Chen; Liu, Long; Lin, Qing-Lian; Liu, Nan

    2013-06-01

    The two most important issues in expert systems are the acquisition of domain experts' professional knowledge and the representation of, and reasoning over, the knowledge rules that have been identified. First, during expert knowledge acquisition, the members of a domain expert panel often differ from one another in experience and knowledge, and produce different types of information: complete and incomplete, precise and imprecise, known and unknown, owing to the panel's cross-functional and multidisciplinary nature. Second, as a promising tool for knowledge representation and reasoning, fuzzy Petri nets (FPNs) still suffer from two deficiencies: the parameters in current FPN models cannot accurately represent increasingly complex knowledge-based systems, and the rules in most existing knowledge inference frameworks cannot be adjusted dynamically as propositions vary, in the way human cognition and thinking can. In this paper, we present a knowledge acquisition and representation approach that uses fuzzy evidential reasoning and dynamic adaptive FPNs to solve the problems mentioned above. As illustrated by a numerical example, the proposed approach can capture experts' diverse experience, enhance knowledge representation power, and reason over rule-based knowledge more intelligently.
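    The classic FPN firing rule that this record builds on can be shown in miniature: places carry truth degrees, and a transition for the rule "IF p1 AND p2 THEN q (certainty CF)" fires with degree min(inputs) * CF. The rule base and numbers below are invented for illustration and do not come from the paper.

```python
def fire(truth, rules, threshold=0.1):
    """One forward-reasoning sweep over a fuzzy rule set.

    truth : dict mapping proposition -> truth degree in [0, 1]
    rules : list of (antecedents, consequent, certainty_factor)
    """
    updated = dict(truth)
    for antecedents, consequent, cf in rules:
        # A transition is enabled if all input places exceed the threshold.
        degrees = [truth.get(p, 0.0) for p in antecedents]
        if all(d > threshold for d in degrees):
            # Classic FPN firing: min of inputs, weighted by the rule's CF.
            inferred = min(degrees) * cf
            updated[consequent] = max(updated.get(consequent, 0.0), inferred)
    return updated

rules = [
    (["high_vibration", "high_temperature"], "bearing_fault", 0.9),
    (["bearing_fault"], "shutdown_advised", 0.8),
]
state = {"high_vibration": 0.8, "high_temperature": 0.7}
state = fire(state, rules)  # infers bearing_fault = min(0.8, 0.7)*0.9 ~ 0.63
state = fire(state, rules)  # then shutdown_advised = 0.63*0.8 ~ 0.504
```

    The paper's contribution is to make the parameters (thresholds, certainty factors) adaptive rather than fixed as they are in this static sketch.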

  16. Dissemination of sustainable irrigation strategies for almond and olive orchards via a participatory approach. Project LIFE+IRRIMAN

    NASA Astrophysics Data System (ADS)

    Garcia-Vila, Margarita; Gamero-Ojeda, Pablo; Ascension Carmona, Maria; Berlanga, Jose; Fereres, Elias

    2017-04-01

    Spain is the world's largest producer of olive oil and its third-largest producer of almonds. Despite major efforts in recent years by the production sector towards intensification, cultural attachment to traditional rain-fed crop management know-how prevents farmers from adopting sustainable irrigation management practices. Consequently, even though there has been progress in irrigation management research for these two crops, adoption of modern irrigation techniques by farmers has been slow. Sustainable irrigation strategies for olive and almond orchards are being designed, implemented, validated and disseminated under the framework of the LIFE+ IRRIMAN project, through a participatory approach. The project's innovative and demonstrative actions have been carried out in an irrigation district of Southern Spain (Genil-Cabra Irrigation Scheme, Andalusia). The approach has four phases: i) design and implementation of sustainable irrigation strategies in demonstration farms; ii) dissemination of the best irrigation practices tested in the initial year throughout the irrigation scheme by the irrigation advisory service; iii) assessment of the degree of adoption and re-design of the dissemination strategies; and iv) based on the results obtained, elaboration of sustainable irrigation guidelines for knowledge transfer within the district and at regional and national levels, to promote changes in irrigation practices. Participatory approaches have proven to be effective tools for designing and diffusing successful irrigation strategies, especially for traditionally rain-fed crops such as olive and almond in Mediterranean countries. Acknowledgements: This work has been funded by the European Union LIFE+ project IRRIMAN (LIFE13 ENV/ES/000539).

  17. Semantics-based plausible reasoning to extend the knowledge coverage of medical knowledge bases for improved clinical decision support.

    PubMed

    Mohammadhassanzadeh, Hossein; Van Woensel, William; Abidi, Samina Raza; Abidi, Syed Sibte Raza

    2017-01-01

    Capturing complete medical knowledge is challenging, often because of incomplete patient Electronic Health Records (EHRs), but also because valuable, tacit medical knowledge is hidden away in physicians' experience. To extend the coverage of incomplete medical knowledge-based systems beyond their deductive closure, and thus enhance their decision-support capabilities, we argue that innovative, multi-strategy reasoning approaches should be applied. In particular, plausible reasoning mechanisms apply patterns from human thought processes, such as generalization, similarity and interpolation, based on attributional, hierarchical, and relational knowledge. They include inductive reasoning, which generalizes commonalities among the data to induce new rules, and analogical reasoning, which is guided by data similarities to infer new facts. By further leveraging rich, biomedical Semantic Web ontologies to represent medical knowledge, both known and tentative, we increase the accuracy and expressivity of plausible reasoning, and cope with issues such as data heterogeneity, inconsistency and interoperability. In this paper, we present a Semantic Web-based, multi-strategy reasoning approach that integrates deductive and plausible reasoning and exploits Semantic Web technology to solve complex clinical decision support queries. We evaluated our system on a real-world medical dataset of patients with hepatitis, from which we randomly removed different percentages of data (5%, 10%, 15%, and 20%) to reflect scenarios with increasing amounts of incomplete medical knowledge. To increase the reliability of the results, we generated 5 independent datasets for each percentage of missing values, resulting in 20 experimental datasets in addition to the original.
The results show that plausibly inferred knowledge extends the coverage of the knowledge base by, on average, 2%, 7%, 12%, and 16% for datasets with, respectively, 5%, 10%, 15%, and 20% missing values. This expansion in coverage allowed complex disease diagnostic queries that were previously unresolvable to be solved without losing the correctness of the answers, although, compared to deductive reasoning, the data-intensive plausible reasoning mechanisms incur a significant performance overhead. We observed, first, that plausible reasoning approaches, by generating tentative inferences and leveraging experts' domain knowledge, allow us to extend the coverage of medical knowledge bases, resulting in improved clinical decision support. Second, by leveraging OWL ontological knowledge, we are able to increase the expressivity and accuracy of plausible reasoning methods. Third, our approach is applicable to clinical decision support systems for a range of chronic diseases.

  18. PLAN-IT: Knowledge-Based Mission Sequencing

    NASA Astrophysics Data System (ADS)

    Biefeld, Eric W.

    1987-02-01

    Mission sequencing consumes a large amount of time and manpower during a space exploration effort. Plan-It is a knowledge-based approach to assist in mission sequencing. Plan-It uses a combined frame and blackboard architecture. This paper reports on the approach implemented by Plan-It and the current applications of Plan-It for sequencing at NASA.

  19. Strategies and Resources for Contextualising the Curriculum Based on the Funds of Knowledge Approach: A Literature Review

    ERIC Educational Resources Information Center

    Llopart, Mariona; Esteban-Guitart, Moisès

    2017-01-01

    This article aims to describe and illustrate how the curriculum can be contextualised through different educational experiences based on the funds of knowledge approach. Educational contextualisation is understood to be the linking of curricular content (literacy, science, mathematics, social sciences) with students' lives, including prior…

  20. Guided Work-Based Learning: Sharing Practical Teaching Knowledge with Student Teachers

    ERIC Educational Resources Information Center

    van Velzen, Corinne; Volman, Monique; Brekelmans, Mieke; White, Simone

    2012-01-01

    Building quality work-based learning opportunities for student teachers is a challenge for schools in school-university partnerships. This study focused on the guidance of student teachers by means of a mentoring approach aimed at sharing practical knowledge, with student teachers' learning needs as an emphasis. The approach was built on…

  1. STEM-based workbook: Enhancing students' STEM competencies on lever system

    NASA Astrophysics Data System (ADS)

    Sejati, Binar Kasih; Firman, Harry; Kaniawati, Ida

    2017-05-01

    The twenty-first century is a century of technology, in which science and technology develop rapidly and rely heavily on each other. This research investigated the effect of a STEM-based workbook on students' STEM competencies in terms of knowledge understanding, problem-solving skill, innovative ability, and responsibility. The workbook was tried with 24 students, who applied engineering design processes together with mathematics and science knowledge to design and create an egg cracker. The results show that implementing the STEM-based workbook on the lever system in the human body effectively improves students' STEM competencies. Knowledge understanding improved with a normalized gain score of 0.41, a medium improvement, and problem-solving skill likewise showed a medium improvement, with a normalized gain of 0.45. Innovative ability also improved: the workbook analysis yielded higher scores, indicating that students became more innovative after finishing the workbook. Finally, students' responsibility kept improving day by day, with students' effort gaining the highest score, indicating that students became more responsible after the intervention. These results are supported by students' responses to the STEM-based workbook, which were positive on all indicators.
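    The normalized gain figures quoted above follow Hake's standard definition: the fraction of the possible improvement that students actually achieved. The pre/post scores below are illustrative, not the study's raw data.

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain: g = (post - pre) / (max - pre)."""
    return (post - pre) / (max_score - pre)

# A class averaging 40 on the pre-test and 64.6 on the post-test gives
# g = 24.6 / 60 = 0.41, which falls in the "medium" band (0.3 <= g < 0.7).
g = normalized_gain(40.0, 64.6)
print(round(g, 2))  # -> 0.41
```

    Because the denominator is the room left for improvement, g compares classes fairly even when their pre-test scores differ.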

  2. Theory in Practice: Why "Good Medicine" and "Scientific Medicine" Are Not Necessarily the Same Thing

    ERIC Educational Resources Information Center

    De Camargo, Kenneth, Jr.; Coeli, Claudia Medina

    2006-01-01

    The term "scientific medicine", ubiquitous in medical literature although poorly defined, can be traced to a number of assumptions, three of which are examined in this paper: that medicine is a form of knowledge-driven practice, where the established body of proven medical knowledge determines what doctors do; if what doctors do is either…

  3. The Avian Knowledge Network : A partnership to organize, analyze, and visualize bird observation data for education, conservation, research, and land management

    Treesearch

    Marshall Iliff; Leo Salas; Ernesto Ruelas Inzunza; Grant Ballard; Denis Lepage; Steve Kelling

    2009-01-01

    The Avian Knowledge Network (AKN) is an international collaboration of academic, nongovernment, and government institutions with the goal of organizing observations of birds into an interoperable format to enhance access, data visualization and exploration, and scientific analyses. The AKN uses proven cyberinfrastructure and informatics techniques as the foundation of...

  4. SSME fault monitoring and diagnosis expert system

    NASA Technical Reports Server (NTRS)

    Ali, Moonis; Norman, Arnold M.; Gupta, U. K.

    1989-01-01

    An expert system, called LEADER, has been designed and implemented for automatic learning, detection, identification, verification, and correction of anomalous propulsion system operations in real time. LEADER employs a set of sensors to monitor engine component performance and to detect, identify, and validate abnormalities with respect to varying engine dynamics and behavior. Two diagnostic approaches are adopted in the architecture of LEADER. In the first approach, fault diagnosis is performed by learning and identifying engine behavior patterns; LEADER generates a few hypotheses about the possible abnormalities, which are then validated against SSME design and functional knowledge. The second approach directs the processing of engine sensory data and performs reasoning based on SSME design knowledge, functional knowledge, and deep-level knowledge, i.e., the first principles (physics and mechanics) of SSME subsystems and components. This paper describes LEADER's architecture, which integrates a design-based reasoning approach with neural network-based fault pattern matching techniques. The fault diagnosis results obtained through analyses of SSME ground test data are presented and discussed.

  5. Discovery radiomics via evolutionary deep radiomic sequencer discovery for pathologically proven lung cancer detection.

    PubMed

    Shafiee, Mohammad Javad; Chung, Audrey G; Khalvati, Farzad; Haider, Masoom A; Wong, Alexander

    2017-10-01

    While lung cancer is the second most diagnosed form of cancer in men and women, a sufficiently early diagnosis can be pivotal in patient survival rates. Imaging-based, or radiomics-driven, detection methods have been developed to aid diagnosticians, but largely rely on hand-crafted features that may not fully encapsulate the differences between cancerous and healthy tissue. Recently, the concept of discovery radiomics was introduced, where custom abstract features are discovered from readily available imaging data. We propose an evolutionary deep radiomic sequencer discovery approach based on evolutionary deep intelligence. Motivated by patient privacy concerns and the idea of operational artificial intelligence, the evolutionary deep radiomic sequencer discovery approach organically evolves increasingly more efficient deep radiomic sequencers that produce significantly more compact yet similarly descriptive radiomic sequences over multiple generations. As a result, this framework improves operational efficiency and enables diagnosis to be run locally at the radiologist's computer while maintaining detection accuracy. We evaluated the evolved deep radiomic sequencer (EDRS) discovered via the proposed evolutionary deep radiomic sequencer discovery framework against state-of-the-art radiomics-driven and discovery radiomics methods using clinical lung CT data with pathologically proven diagnostic data from the LIDC-IDRI dataset. The EDRS shows improved sensitivity (93.42%), specificity (82.39%), and diagnostic accuracy (88.78%) relative to previous radiomics approaches.

  6. A knowledge-driven approach to cluster validity assessment.

    PubMed

    Bolshakova, Nadia; Azuaje, Francisco; Cunningham, Pádraig

    2005-05-15

    This paper presents an approach to assessing cluster validity based on similarity knowledge extracted from the Gene Ontology. The program is freely available for non-profit use on request from the authors.

  7. Knowledge-based diagnosis for aerospace systems

    NASA Technical Reports Server (NTRS)

    Atkinson, David J.

    1988-01-01

    The need for automated diagnosis in aerospace systems and the approach of using knowledge-based systems are examined. Research issues in knowledge-based diagnosis which are important for aerospace applications are treated along with a review of recent relevant research developments in Artificial Intelligence. The design and operation of some existing knowledge-based diagnosis systems are described. The systems described and compared include the LES expert system for liquid oxygen loading at NASA Kennedy Space Center, the FAITH diagnosis system developed at the Jet Propulsion Laboratory, the PES procedural expert system developed at SRI International, the CSRL approach developed at Ohio State University, the StarPlan system developed by Ford Aerospace, the IDM integrated diagnostic model, and the DRAPhys diagnostic system developed at NASA Langley Research Center.

  8. A knowledge-based machine vision system for space station automation

    NASA Technical Reports Server (NTRS)

    Chipman, Laure J.; Ranganath, H. S.

    1989-01-01

    A simple knowledge-based approach to the recognition of objects in man-made scenes is being developed. Specifically, the system under development is a proposed enhancement to a robot arm for use in the space station laboratory module. The system will take a request from a user to find a specific object, and locate that object by using its camera input and information from a knowledge base describing the scene layout and attributes of the object types included in the scene. In order to use realistic test images in developing the system, researchers are using photographs of actual NASA simulator panels, which provide similar types of scenes to those expected in the space station environment. Figure 1 shows one of these photographs. In traditional approaches to image analysis, the image is transformed step by step into a symbolic representation of the scene. Often the first steps of the transformation are done without any reference to knowledge of the scene or objects. Segmentation of an image into regions generally produces a counterintuitive result in which regions do not correspond to objects in the image. After segmentation, a merging procedure attempts to group regions into meaningful units that will more nearly correspond to objects. Here, researchers avoid segmenting the image as a whole, and instead use a knowledge-directed approach to locate objects in the scene. The knowledge-based approach to scene analysis is described and the categories of knowledge used in the system are discussed.

  9. Suggestions for the New Social Entrepreneurship Initiative: Focus on Building a Body of Research-Proven Programs, Shown to Produce Major Gains in Education, Poverty Reduction, Crime Prevention, and Other Areas

    ERIC Educational Resources Information Center

    Coalition for Evidence-Based Policy, 2009

    2009-01-01

    This paper outlines a possible approach to implementing the Social Entrepreneurship initiative, focused on building a body of research-proven program models/strategies, and scaling them up, so as to produce major progress in education, poverty reduction, crime prevention, and other areas. The paper summarizes the rationale for this approach, then…

  10. Knowledge-based prediction of protein backbone conformation using a structural alphabet.

    PubMed

    Vetrivel, Iyanar; Mahajan, Swapnil; Tyagi, Manoj; Hoffmann, Lionel; Sanejouand, Yves-Henri; Srinivasan, Narayanaswamy; de Brevern, Alexandre G; Cadet, Frédéric; Offmann, Bernard

    2017-01-01

    Libraries of structural prototypes that abstract protein local structures are known as structural alphabets and have proven very useful in various aspects of protein structure analysis and prediction. One such library, Protein Blocks, is composed of 16 standard five-residue structural prototypes; a protein's structure can then be described as a string of Protein Blocks. Predicting the local structure of a protein in terms of Protein Blocks is the general objective of this work. A new approach, PB-kPRED, is proposed towards this aim. It involves (i) organizing structural knowledge in the form of a database of pentapeptide fragments extracted from all protein structures in the PDB and (ii) applying a knowledge-based algorithm that does not rely on any secondary structure predictions and/or sequence alignment profiles to scan this database and predict the most probable backbone conformations for protein local structures. PB-kPRED preferentially uses structural information from homologues, when available. The predictions were evaluated rigorously on 15,544 query proteins representing a non-redundant subset of the PDB filtered at a 30% sequence identity cut-off. We have shown that the kPRED method achieved mean accuracies ranging from 40.8% to 66.3% depending on the availability of homologues. The impact of different database scanning strategies on the prediction was evaluated and is discussed. Our results highlight the usefulness of the method for proteins without any known structural homologues. A scoring function that gives a good estimate of the accuracy of prediction was further developed; it estimates the accuracy of the algorithm very well (R2 of 0.82). An online version of the tool is provided freely for non-commercial usage at http://www.bo-protscience.fr/kpred/.

  11. Does an outcome-based approach to continuing medical education improve physicians' competences in rational prescribing?

    PubMed

    Esmaily, Hamideh M; Savage, Carl; Vahidi, Rezagoli; Amini, Abolghasem; Dastgiri, Saeed; Hult, Hakan; Dahlgren, Lars Owe; Wahlstrom, Rolf

    2009-11-01

    Continuing medical education (CME) is compulsory in Iran; traditionally it is lecture-based, an approach that is mostly unsuccessful. Outcome-based education has been proposed for CME programs. The aim was to evaluate the effectiveness of an educational intervention with a new outcome-based approach and aligned teaching methods on the knowledge and skills of general physicians (GPs) working in primary care, compared with a concurrent CME program in the field of rational prescribing. A cluster-randomized controlled design was used. All GPs working in six cities in one province of Iran were invited to participate. The cities were matched and randomly divided into an intervention arm, receiving education on rational prescribing with an outcome-based approach, and a control arm, receiving a traditional program on the same topic. Knowledge and skills were assessed using a pre- and post-test, including case scenarios. In total, 112 GPs participated. There were significant improvements in knowledge and prescribing skills after the training in the intervention arm, as well as in comparison with the changes in the control arm. The overall intervention effect was 26 percentage units. The introduction of an outcome-based approach in CME appears to be effective when creating programs to improve GPs' knowledge and skills.

  12. Using knowledge brokering to promote evidence-based policy-making: The need for support structures.

    PubMed Central

    van Kammen, Jessika; de Savigny, Don; Sewankambo, Nelson

    2006-01-01

    Knowledge brokering is a promising strategy to close the "know-do gap" and foster greater use of research findings and evidence in policy-making. It focuses on organizing the interactive process between the producers and users of knowledge so that they can co-produce feasible and research-informed policy options. We describe a recent successful experience with this novel approach in the Netherlands and discuss the requirements for effective institutionalization of knowledge brokering. We also discuss the potential of this approach to assist health policy development in low-income countries based on the experience of developing the Regional East-African Health (REACH)-Policy Initiative. We believe that intermediary organizations, such as regional networks, dedicated institutional mechanisms and funding agencies, can play key roles in supporting knowledge brokering. We recommend the need to support and learn from the brokerage approach to strengthen the relationship between the research and policy communities and hence move towards a stronger culture of evidence-based policy and policy-relevant research. PMID:16917647

  13. How Structure Shapes Dynamics: Knowledge Development in Wikipedia - A Network Multilevel Modeling Approach

    PubMed Central

    Halatchliyski, Iassen; Cress, Ulrike

    2014-01-01

    Using a longitudinal network analysis approach, we investigate the structural development of the knowledge base of Wikipedia in order to explain the appearance of new knowledge. The data consists of the articles in two adjacent knowledge domains: psychology and education. We analyze the development of networks of knowledge consisting of interlinked articles at seven snapshots from 2006 to 2012 with an interval of one year between them. Longitudinal data on the topological position of each article in the networks is used to model the appearance of new knowledge over time. Thus, the structural dimension of knowledge is related to its dynamics. Using multilevel modeling as well as eigenvector and betweenness measures, we explain the significance of pivotal articles that are either central within one of the knowledge domains or boundary-crossing between the two domains at a given point in time for the future development of new knowledge in the knowledge base. PMID:25365319
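
    Eigenvector centrality, one of the two topological measures the study uses to identify pivotal articles, can be computed by power iteration over the article link graph. A minimal sketch, assuming a toy adjacency structure (this is not the authors' code, which would typically rely on a network library):

    ```python
    def eigenvector_centrality(adj, iters=100):
        """Power-iteration eigenvector centrality for an undirected graph
        given as {node: [neighbours]}. Illustrative sketch only."""
        x = {v: 1.0 for v in adj}
        for _ in range(iters):
            # Repeatedly multiply by the adjacency matrix, then renormalize.
            x = {v: sum(x[u] for u in adj[v]) for v in adj}
            norm = sum(val * val for val in x.values()) ** 0.5
            x = {v: val / norm for v, val in x.items()}
        return x

    # Toy "link network": a triangle of articles plus one peripheral article.
    toy = {"a": ["b", "c", "d"], "b": ["a", "c"], "c": ["a", "b"], "d": ["a"]}
    scores = eigenvector_centrality(toy)
    ```

    On this toy graph the well-connected article "a" scores highest and the peripheral "d" lowest, which is exactly the kind of ranking used to flag central versus boundary articles.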

  14. Taxonomy-Based Approaches to Quality Assurance of Ontologies

    PubMed Central

    Perl, Yehoshua; Ochs, Christopher

    2017-01-01

    Ontologies are important components of health information management systems, so the quality of their content is of paramount importance. It has proven practical to develop quality assurance (QA) methodologies based on automated identification of sets of concepts expected to have a higher likelihood of errors. Four kinds of such sets (called QA-sets), organized around the themes of complex and uncommonly modeled concepts, are introduced. A survey of different methodologies based on these QA-sets and the results of applying them to various ontologies are presented. Overall, following these approaches leads to higher QA yields and better utilization of QA personnel. The formulation of additional QA-set methodologies will further enhance the suite of available ontology QA tools. PMID:29158885

  15. Knowledge-based nonuniform sampling in multidimensional NMR.

    PubMed

    Schuyler, Adam D; Maciejewski, Mark W; Arthanari, Haribabu; Hoch, Jeffrey C

    2011-07-01

    The full resolution afforded by high-field magnets is rarely realized in the indirect dimensions of multidimensional NMR experiments because of the time cost of uniformly sampling to long evolution times. Emerging methods utilizing nonuniform sampling (NUS) enable high resolution along indirect dimensions by sampling long evolution times without sampling at every multiple of the Nyquist sampling interval. While the earliest NUS approaches matched the decay of sampling density to the decay of the signal envelope, recent approaches based on coupled evolution times attempt to optimize sampling by choosing projection angles that increase the likelihood of resolving closely-spaced resonances. These approaches employ knowledge about chemical shifts to predict optimal projection angles, whereas prior applications of tailored sampling employed only knowledge of the decay rate. In this work we adapt the matched filter approach as a general strategy for knowledge-based nonuniform sampling that can exploit prior knowledge about chemical shifts and is not restricted to sampling projections. Based on several measures of performance, we find that exponentially weighted random sampling (envelope matched sampling) performs better than shift-based sampling (beat matched sampling). While shift-based sampling can yield small advantages in sensitivity, the gains are generally outweighed by diminished robustness. Our observation that more robust sampling schemes are only slightly less sensitive than schemes highly optimized using prior knowledge about chemical shifts has broad implications for any multidimensional NMR study employing NUS. The results derived from simulated data are demonstrated with a sample application to PfPMT, the phosphoethanolamine methyltransferase of the human malaria parasite Plasmodium falciparum.
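
    Envelope-matched (exponentially weighted) random sampling amounts to drawing Nyquist-grid indices with probability proportional to the decaying signal envelope. The function below is an illustrative sketch with hypothetical names and parameters, not the authors' implementation:

    ```python
    import math
    import random

    def envelope_matched_schedule(n_grid, n_samples, t2, seed=0):
        """Choose n_samples distinct indices from an n_grid-point Nyquist grid,
        with sampling density matched to the envelope exp(-t/T2), t in grid
        units. Illustrative sketch of envelope-matched NUS."""
        rng = random.Random(seed)
        weights = [math.exp(-t / t2) for t in range(n_grid)]
        chosen = set()
        while len(chosen) < n_samples:
            # Weighted draw with replacement; the set keeps indices distinct.
            chosen.add(rng.choices(range(n_grid), weights=weights, k=1)[0])
        return sorted(chosen)

    schedule = envelope_matched_schedule(n_grid=256, n_samples=64, t2=64.0)
    ```

    Because the weights decay with evolution time, short evolution times are sampled densely and long ones sparsely, which is the matched-filter intuition the paper generalizes.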

  16. Knowledge Sharing in an American Multinational Company Based in Malaysia

    ERIC Educational Resources Information Center

    Ling, Chen Wai; Sandhu, Manjit S.; Jain, Kamal Kishore

    2009-01-01

    Purpose: This paper seeks to examine the views of executives working in an American based multinational company (MNC) about knowledge sharing, barriers to knowledge sharing, and strategies to promote knowledge sharing. Design/methodology/approach: This study was carried out in phases. In the first phase, a topology of organizational mechanisms for…

  17. A Vision for Spaceflight Reliability: NASA's Objectives Based Strategy

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Evans, John; Hall, Tony

    2015-01-01

    In defining the direction for a new Reliability and Maintainability standard, OSMA has extracted the essential objectives that our programs need in order to undertake a reliable mission. These objectives have been structured to lead mission planning through construction of an objective hierarchy, which defines the critical approaches for achieving high reliability and maintainability (R&M). Creating a hierarchy as a basis for assurance implementation is a proven approach; yet it also holds the opportunity to enable new directions as NASA moves forward in tackling the challenges of space exploration.

  18. Restoring Consistency In Subjective Information For Groundwater Driven Health Risk Assessment

    NASA Astrophysics Data System (ADS)

    Ozbek, M. M.; Pinder, G. F.

    2004-12-01

    In an earlier work (Ozbek and Pinder, 2003), we constructed a fuzzy rule-based knowledge base that uses subjective expert opinion to calculate risk-based design constraints (i.e., dose and pattern of exposure) that sustain groundwater-driven individual health risk at a desired level. Ideally, the system must be capable of producing, for any individual, a meaningful risk result, or for any given risk, a meaningful design constraint, in the sense that the result is neither the empty set nor the whole domain of the variable of interest; otherwise we consider the system inconsistent. We present a method based on fuzzy similarity relations to restore consistency in the implicative fuzzy rule-based system used for the risk-based groundwater remediation design problem. Both a global and a local approach are considered. Though straightforward and computationally less demanding, the global approach can affect pieces of knowledge negatively by inducing unwarranted imprecision into the knowledge base. The local approach, given a family of parameterized similarity relations, determines a parameter for each inference such that consistent results are computed, which may not be feasible in real-time applications of the knowledge base. Several scenarios comparing the two approaches suggest that, for specific applications, approaches ranging from completely global to completely local will each be more suitable than others when calculating the design constraints.
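
    The inconsistency the abstract describes, an input for which no rule fires and the result is empty, shows up even in a minimal Sugeno-style fuzzy rule system. The membership shapes, names, and numbers below are hypothetical and unrelated to the actual knowledge base:

    ```python
    def tri(x, a, b, c):
        """Triangular membership: rises from a to a peak at b, falls to c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def infer_risk(dose):
        """Two toy rules: 'dose is low -> risk 0.1', 'dose is high -> risk 0.9'.
        Returns the firing-strength-weighted average of rule outputs, or None
        when no rule fires, i.e. the empty (inconsistent) result the paper is
        concerned with."""
        w_low = tri(dose, -1.0, 0.0, 5.0)    # membership in "low dose"
        w_high = tri(dose, 3.0, 10.0, 11.0)  # membership in "high dose"
        if w_low + w_high == 0.0:
            return None                      # no rule fires: inconsistent system
        return (w_low * 0.1 + w_high * 0.9) / (w_low + w_high)
    ```

    A dose of 20 falls outside both membership supports and yields `None`, which is the kind of gap the similarity-relation method is designed to repair.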

  19. Towards Semantic e-Science for Traditional Chinese Medicine

    PubMed Central

    Chen, Huajun; Mao, Yuxin; Zheng, Xiaoqing; Cui, Meng; Feng, Yi; Deng, Shuiguang; Yin, Aining; Zhou, Chunying; Tang, Jinming; Jiang, Xiaohong; Wu, Zhaohui

    2007-01-01

    Background Recent advances in Web and information technologies, together with the increasing decentralization of organizational structures, have resulted in massive amounts of information resources and domain-specific services in Traditional Chinese Medicine (TCM). The massive volume and diversity of the information and services available have made it difficult to achieve seamless and interoperable e-Science for knowledge-intensive disciplines like TCM. Information integration and service coordination are therefore two major challenges in e-Science for TCM, and we still lack sophisticated approaches to integrating scientific data and services for TCM e-Science. Results We present a comprehensive approach to building dynamic and extendable e-Science applications for knowledge-intensive disciplines like TCM based on semantic and knowledge-based techniques. The semantic e-Science infrastructure for TCM supports large-scale database integration and service coordination in a virtual organization. We use domain ontologies to integrate TCM database resources and services in a semantic cyberspace and deliver a semantically superior experience, including browsing, searching, querying and knowledge discovery, to users. We have developed a collection of semantic-based toolkits to facilitate information sharing and collaborative research among TCM scientists and researchers. Conclusion Semantic and knowledge-based techniques are well suited to knowledge-intensive disciplines like TCM, and it is possible to build an on-demand e-Science system for TCM based on existing semantic and knowledge-based techniques. The presented approach integrates heterogeneous distributed TCM databases and services, and provides scientists with a semantically superior experience to support collaborative research in the TCM discipline. PMID:17493289

  20. Characterization of GM events by insert knowledge adapted re-sequencing approaches

    PubMed Central

    Yang, Litao; Wang, Congmao; Holst-Jensen, Arne; Morisset, Dany; Lin, Yongjun; Zhang, Dabing

    2013-01-01

    Detection methods and data from the molecular characterization of genetically modified (GM) events are needed by stakeholders, including public risk assessors and regulators. Generally, the molecular characteristics of GM events are incompletely revealed by current approaches, which are biased towards detecting transformation-vector-derived sequences. GM events are classified based on available knowledge of the sequences of vectors and inserts (insert knowledge). Herein we present three insert-knowledge-adapted approaches for the characterization of GM events (TT51-1 and T1c-19 rice as examples) based on paired-end re-sequencing, with the advantages of comprehensiveness, accuracy, and automation. The comprehensive molecular characteristics of the two rice events were revealed, with additional unintended insertions compared with the results from PCR and Southern blotting. Comprehensive transgene characterization of TT51-1 and T1c-19 is shown to be independent of a priori knowledge of the insert and vector sequences when employing the developed approaches. This provides an opportunity to identify and characterize unknown GM events as well. PMID:24088728

  1. Characterization of GM events by insert knowledge adapted re-sequencing approaches.

    PubMed

    Yang, Litao; Wang, Congmao; Holst-Jensen, Arne; Morisset, Dany; Lin, Yongjun; Zhang, Dabing

    2013-10-03

    Detection methods and data from the molecular characterization of genetically modified (GM) events are needed by stakeholders, including public risk assessors and regulators. Generally, the molecular characteristics of GM events are incompletely revealed by current approaches, which are biased towards detecting transformation-vector-derived sequences. GM events are classified based on available knowledge of the sequences of vectors and inserts (insert knowledge). Herein we present three insert-knowledge-adapted approaches for the characterization of GM events (TT51-1 and T1c-19 rice as examples) based on paired-end re-sequencing, with the advantages of comprehensiveness, accuracy, and automation. The comprehensive molecular characteristics of the two rice events were revealed, with additional unintended insertions compared with the results from PCR and Southern blotting. Comprehensive transgene characterization of TT51-1 and T1c-19 is shown to be independent of a priori knowledge of the insert and vector sequences when employing the developed approaches. This provides an opportunity to identify and characterize unknown GM events as well.

  2. Applying AI tools to operational space environmental analysis

    NASA Technical Reports Server (NTRS)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges in meeting the needs of their growing user community. These centers provide current space environmental information and short-term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general-purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools that analyze incoming events for indications of particular situations and predict future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain-specific event analysis abstractions.
The prototype system defines events covering reports of natural phenomena such as solar flares, bursts, geomagnetic storms, and five others pertinent to space environmental analysis. With our preliminary event definitions we experimented with TAS's support for temporal pattern analysis using X-ray flare and geomagnetic storm forecasts as case studies. We are currently working on a framework for integrating advanced graphics and space environmental models into this analytical environment.

  3. Anatomy education for the YouTube generation.

    PubMed

    Barry, Denis S; Marzouk, Fadi; Chulak-Oglu, Kyrylo; Bennett, Deirdre; Tierney, Paul; O'Keeffe, Gerard W

    2016-01-01

    Anatomy remains a cornerstone of medical education despite challenges that have seen a significant reduction in contact hours over recent decades; however, the rise of the "YouTube Generation" or "Generation Connected" (Gen C), offers new possibilities for anatomy education. Gen C, which consists of 80% Millennials, actively interact with social media and integrate it into their education experience. Most are willing to merge their online presence with their degree programs by engaging with course materials and sharing their knowledge freely using these platforms. This integration of social media into undergraduate learning, and the attitudes and mindset of Gen C, who routinely creates and publishes blogs, podcasts, and videos online, has changed traditional learning approaches and the student/teacher relationship. To gauge this, second year undergraduate medical and radiation therapy students (n = 73) were surveyed regarding their use of online social media in relation to anatomy learning. The vast majority of students had employed web-based platforms to source information with 78% using YouTube as their primary source of anatomy-related video clips. These findings suggest that the academic anatomy community may find value in the integration of social media into blended learning approaches in anatomy programs. This will ensure continued connection with the YouTube generation of students while also allowing for academic and ethical oversight regarding the use of online video clips whose provenance may not otherwise be known. © 2015 American Association of Anatomists.

  4. Antiquity versus modern times in hydraulics - a case study

    NASA Astrophysics Data System (ADS)

    Stroia, L.; Georgescu, S. C.; Georgescu, A. M.

    2010-08-01

    Water supply and water management in Antiquity reveal more than the modern world might imagine about how people of that period thought about and exploited the resources they had, aiming to develop and improve their society and their own lives. This paper points out examples of how they handled different situations and how they managed to cope with the growing population of urban areas by adapting or improving their water supply systems. The paper emphasizes the engineering contribution of Rome and the Roman Empire, mainly in the capital but also in the provinces, for instance the present-day territory of France, by analysing some aqueducts from the point of view of modern hydraulic engineering. A third-order polynomial regression is proposed to compute the water flow rate, based on the flow cross-sectional area measured in quinaria. The paper also highlights contradictions between what we thought we knew about Ancient Roman civilization and what could actually be proven, whether by a modern engineering approach, a documentary approach, or by common sense where neither of the above could be used. It is certain that the world we live in is the heritage of Greco-Roman culture, and we therefore owe an acknowledgement of their contribution, especially considering the limited knowledge and resources of that time.
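
    The proposed third-order polynomial regression amounts to a least-squares cubic fit of flow rate against cross-sectional area. A self-contained sketch via the normal equations, where the data and resulting coefficients are made up for illustration and do not reproduce the paper's actual quinaria-to-flow fit:

    ```python
    def polyfit3(xs, ys):
        """Least-squares cubic fit y ~ c0 + c1*x + c2*x**2 + c3*x**3 via the
        normal equations, solved by Gaussian elimination with partial pivoting.
        Illustrative of the regression described in the paper."""
        n = 4
        # Normal equations: (X^T X) c = X^T y for the cubic design matrix X.
        A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
        b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
        for col in range(n):
            piv = max(range(col, n), key=lambda r: abs(A[r][col]))
            A[col], A[piv] = A[piv], A[col]
            b[col], b[piv] = b[piv], b[col]
            for r in range(col + 1, n):
                f = A[r][col] / A[col][col]
                for c in range(col, n):
                    A[r][c] -= f * A[col][c]
                b[r] -= f * b[col]
        coeffs = [0.0] * n
        for i in reversed(range(n)):
            coeffs[i] = (b[i] - sum(A[i][j] * coeffs[j]
                                    for j in range(i + 1, n))) / A[i][i]
        return coeffs

    # Made-up (area, flow) data lying exactly on a known cubic, as a sanity check.
    xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
    ys = [1 + 2 * x - x ** 2 + 0.5 * x ** 3 for x in xs]
    coeffs = polyfit3(xs, ys)
    ```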

  5. From Cultural Knowledge to Intercultural Communicative Competence: Changing Perspectives on the Role of Culture in Foreign Language Teaching

    ERIC Educational Resources Information Center

    Piatkowska, Katarzyna

    2015-01-01

    Approaches to the concept of culture and teaching cultural competence in a foreign language classroom have been changing over the last decades. The paper summarises, compares, contrasts and evaluates four major approaches to teaching cultural competence in foreign language teaching, that is, knowledge-based approach, contrastive approach,…

  6. Using CLIPS in the domain of knowledge-based massively parallel programming

    NASA Technical Reports Server (NTRS)

    Dvorak, Jiri J.

    1994-01-01

    The Program Development Environment (PDE) is a tool for massively parallel programming of distributed-memory architectures. Adopting a knowledge-based approach, the PDE eliminates the complexity introduced by parallel hardware with distributed memory and offers complete transparency with respect to parallelism exploitation. The knowledge-based part of the PDE is realized in CLIPS. Its principal task is to find an efficient parallel realization of the application specified by the user in a comfortable, abstract, domain-oriented formalism. A large collection of fine-grain parallel algorithmic skeletons, represented as COOL objects in a tree hierarchy, contains the algorithmic knowledge. A hybrid knowledge base with rule modules and procedural parts, encoding expertise about the application domain, parallel programming, software engineering, and parallel hardware, enables a high degree of automation in the software development process. In this paper, important aspects of the implementation of the PDE using CLIPS and COOL are shown, including the embedding of CLIPS with the C++-based parts of the PDE. The appropriateness of the chosen approach and of the CLIPS language for knowledge-based software engineering is discussed.

  7. Developmental Therapy- Developmental Teaching: An Outreach Project for Young Children with Social-Emotional-Behavioral Disabilities (October 1, 1997-September 30, 2000). Final Performance Report.

    ERIC Educational Resources Information Center

    Georgia Univ., Athens. Coll. of Family and Consumer Sciences.

    This outreach project is based on the validated Developmental Therapy-Developmental Teaching model originally designed for young children with severe emotional/behavioral problems and their families. It is an approach that emphasizes the teaching skills that foster a child's social-emotional-behavioral competence. The model has proven effective in…

  8. Popular Explanations of Physical Phenomena: Broken Ruler, Oxygen in the Air and Water Attracted by Electric Charges

    ERIC Educational Resources Information Center

    Riveros, Héctor G.

    2012-01-01

    The inquiry-based approach to learning has proven quite effective since the time of Socrates, but it is difficult to find good questions to induce reasoning. Many sources wrongly explain some experimental results, and these can be used as discrepant events. Some use the breaking of a ruler with a newspaper to "show" that the atmospheric pressure…

  9. Mathematical Knowledge for Teaching, Standards-Based Mathematics Teaching Practices, and Student Achievement in the Context of the "Responsive Classroom Approach"

    ERIC Educational Resources Information Center

    Ottmar, Erin R.; Rimm-Kaufman, Sara E.; Larsen, Ross A.; Berry, Robert Q.

    2015-01-01

    This study investigates the effectiveness of the Responsive Classroom (RC) approach, a social and emotional learning intervention, on changing the relations between mathematics teacher and classroom inputs (mathematical knowledge for teaching [MKT] and standards-based mathematics teaching practices) and student mathematics achievement. Work was…

  10. Exploring the structure and function of temporal networks with dynamic graphlets

    PubMed Central

    Hulovatyy, Y.; Chen, H.; Milenković, T.

    2015-01-01

    Motivation: With the increasing availability of temporal real-world networks, how can these data be studied efficiently? One can model a temporal network as a single aggregate static network, or as a series of time-specific snapshots, each an aggregate static network over the corresponding time window. One can then use established methods for static analysis on the resulting aggregate network(s), but this loses valuable temporal information, either completely or at the interfaces between snapshots, respectively. Here, we develop a novel approach for studying a temporal network more explicitly, by capturing inter-snapshot relationships. Results: We base our methodology on well-established graphlets (subgraphs), which have proven useful in numerous contexts in static network research. We develop new theory to allow for graphlet-based analyses of temporal networks. Our new notion of dynamic graphlets differs from existing dynamic network approaches that are based on temporal motifs (statistically significant subgraphs). The latter have limitations: their results depend on the choice of a null network model required to evaluate the significance of a subgraph, and choosing a good null model is non-trivial. Our dynamic graphlets overcome the limitations of temporal motifs. Also, when we aim to characterize the structure and function of an entire temporal network or of individual nodes, our dynamic graphlets outperform static graphlets. Clearly, accounting for temporal information helps. We apply dynamic graphlets to temporal age-specific molecular network data to deepen our limited knowledge about human aging. Availability and implementation: http://www.nd.edu/∼cone/DG. Contact: tmilenko@nd.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26072480

  11. Object-oriented productivity metrics

    NASA Technical Reports Server (NTRS)

    Connell, John L.; Eller, Nancy

    1992-01-01

    Software productivity metrics are useful for sizing and costing proposed software and for measuring development productivity. Estimating and measuring source lines of code (SLOC) has proven to be a bad idea because it encourages writing more lines of code and using lower level languages. Function Point Analysis is an improved software metric system, but it is not compatible with newer rapid prototyping and object-oriented approaches to software development. A process is presented here for counting object-oriented effort points, based on a preliminary object-oriented analysis. It is proposed that this approach is compatible with object-oriented analysis, design, programming, and rapid prototyping. Statistics gathered on actual projects are presented to validate the approach.

  12. Major accident prevention through applying safety knowledge management approach.

    PubMed

    Kalatpour, Omid

    2016-01-01

    Many scattered resources of knowledge are available for chemical accident prevention purposes. The common approach to managing process safety, which includes using databases and referring to the available knowledge, has some drawbacks. The main goal of this article was to devise a new knowledge base (KB) for the chemical accident prevention domain. The scattered sources of safety knowledge were identified and scanned. Then, the collected knowledge was formalized through a computerized program. The Protégé software was used to formalize and represent the stored safety knowledge. Domain knowledge can be retrieved from the KB, as well as data and information. This optimized approach improved the safety and health knowledge management (KM) process and resolved some typical problems in the KM process. Upgrading traditional safety databases into KBs can improve the interaction between users and the knowledge repository.

  13. Assessing the role of detrital zircon sorting on provenance interpretations in an ancient fluvial system using paleohydraulics - Permian Cutler Group, Paradox Basin, Utah and Colorado

    NASA Astrophysics Data System (ADS)

    Findlay, C. P., III; Ewing, R. C.; Perez, N. D.

    2017-12-01

    Detrital zircon age signatures used in provenance studies are assumed to be representative of entire catchments from which the sediment was derived, but the extent to which hydraulic sorting can bias provenance interpretations is poorly constrained. Sediment and mineral sorting occurs with changes in hydraulic conditions driven by both allogenic and autogenic processes. Zircon is sorted from less dense minerals due to the difference in density, and any age dependence on zircon size could potentially bias provenance interpretations. In this study, a coupled paleohydraulic and geochemical provenance approach is used to identify changes in paleohydraulic conditions and relate them to spatial variations in provenance signatures from samples collected along an approximately time-correlative source-to-sink pathway in the Permian Cutler Group of the Paradox Basin. Samples proximal to the uplift have a paleoflow direction to the southwest. In the medial basin, paleocurrent directions indicate that salt movement caused fluvial pathways to divert to the north and northwest on the flanks of anticlines. Channel depth, flow velocity, and discharge, derived from field measurements of grain size and of dune and bar cross-stratification, indicate that the competency of the fluvial system decreased from the proximal to the medial basin by up to a factor of 12. Based upon the paleohydraulic calculations, zircon size fractionation would occur along the transect such that the larger zircons are removed from the system prior to reaching the medial basin. Analysis of the size and age distribution of zircons from the proximal and distal fluvial system of the Cutler Group tests whether this hydraulic sorting affects the expected Uncompahgre Uplift age distribution.
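The paleohydraulic quantities mentioned in this record are tied together by the continuity relation Q = w · h · v (discharge = channel width × depth × mean flow velocity). The sketch below uses that relation with made-up numbers; the study's actual values and velocity estimates come from its grain-size and cross-stratification measurements:

```python
def discharge(width_m, depth_m, velocity_m_s):
    """Continuity relation: discharge Q = cross-sectional area x mean flow velocity."""
    return width_m * depth_m * velocity_m_s

# Illustrative (made-up) numbers for a proximal vs. a medial channel; only the
# direction of change (weaker flow downstream) mirrors the study's finding.
q_proximal = discharge(50.0, 3.0, 1.2)  # 180.0 m^3/s
q_medial = discharge(40.0, 1.5, 0.6)    # 36.0 m^3/s
print(q_proximal, q_medial)
```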

  14. Bridging the gap between formal and experience-based knowledge for context-aware laparoscopy.

    PubMed

    Katić, Darko; Schuck, Jürgen; Wekerle, Anna-Laura; Kenngott, Hannes; Müller-Stich, Beat Peter; Dillmann, Rüdiger; Speidel, Stefanie

    2016-06-01

    Computer assistance is increasingly common in surgery. However, the amount of information is bound to overload processing abilities of surgeons. We propose methods to recognize the current phase of a surgery for context-aware information filtering. The purpose is to select the most suitable subset of information for surgical situations which require special assistance. We combine formal knowledge, represented by an ontology, and experience-based knowledge, represented by training samples, to recognize phases. For this purpose, we have developed two different methods. Firstly, we use formal knowledge about possible phase transitions to create a composition of random forests. Secondly, we propose a method based on cultural optimization to infer formal rules from experience to recognize phases. The proposed methods are compared with a purely formal knowledge-based approach using rules and a purely experience-based one using regular random forests. The comparative evaluation on laparoscopic pancreas resections and adrenalectomies employs a consistent set of quality criteria on clean and noisy input. The rule-based approaches proved best with noise-free data. The random forest-based ones were more robust in the presence of noise. Formal and experience-based knowledge can be successfully combined for robust phase recognition.

  15. Visualisation methods for large provenance collections in data-intensive collaborative platforms

    NASA Astrophysics Data System (ADS)

    Spinuso, Alessandro; Filgueira, Rosa; Atkinson, Malcolm; Gemuend, Andre

    2016-04-01

    This work investigates improving the methods of visually representing provenance information in the context of modern data-driven scientific research. It explores scenarios where data-intensive workflow systems are serving communities of researchers within collaborative environments, supporting the sharing of data and methods, and offering a variety of computation facilities, including HPC, HTC and Cloud. It focuses on the exploration of big-data visualization techniques aiming at producing comprehensive and interactive views on top of large and heterogeneous provenance data. The same approach is applicable to control-flow and data-flow workflows or to combinations of the two. This flexibility is achieved using the W3C-PROV recommendation as a reference model, especially its workflow-oriented profiles such as D-PROV (Missier et al. 2013). Our implementation is based on the provenance records produced by the dispel4py data-intensive processing library (Filgueira et al. 2015). dispel4py is an open-source Python framework for describing abstract stream-based workflows for distributed data-intensive applications, developed during the VERCE project. dispel4py enables scientists to develop their scientific methods and applications on their laptop and then run them at scale on a wide range of e-Infrastructures (Cloud, Cluster, etc.) without making changes. Users can therefore focus on designing their workflows at an abstract level, describing actions, input and output streams, and how they are connected. The dispel4py system then maps these descriptions to the enactment platforms, such as MPI, Storm, and multiprocessing. It provides a mechanism which allows users to determine the provenance information to be collected and to analyze it at runtime. For this work we consider alternative visualisation methods for provenance data, from infinite lists and localised interactive graphs to radial views.
The latter technique has been positively explored in many fields, from text data visualisation to genomics and social networking analysis. Its adoption for provenance has been presented in the literature (Borkin et al. 2013) in the context of parent-child relationships across processes, constructed from control-flow information. Computer graphics research has focused on the advantage of this radial distribution of interlinked information and on ways to improve the visual efficiency and tunability of such representations, such as the Hierarchical Edge Bundles visualisation method (Holten 2006), which aims at reducing visual clutter of highly connected structures via the generation of bundles. Our approach explores the potential of the combination of these methods. It serves environments where the size of the provenance collection, coupled with the diversity of the infrastructures and the domain metadata, makes the extrapolation of usage trends extremely challenging. Applications of such visualisation systems can engage groups of scientists, data providers and computational engineers, by serving visual snapshots that highlight relationships between an item and its connected processes. We will present examples of comprehensive views on the distribution of processing and data transfers during a workflow's execution in HPC, as well as cross-workflow interactions and internal dynamics. The latter are shown in the context of faceted searches over domain-metadata value ranges. These are obtained from the analysis of real provenance data generated by the processing of seismic traces performed through the VERCE platform.
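A radial view of the kind discussed in this record can be sketched by placing provenance nodes on concentric rings around a focus item. The layout below is a toy illustration with invented node names; it is neither dispel4py's actual provenance model nor the Hierarchical Edge Bundles method (which additionally routes edges along a hierarchy):

```python
import math

def radial_layout(rings):
    """Place each ring of nodes evenly on a circle of increasing radius;
    ring 0 is a single focus node at the origin."""
    positions = {}
    for depth, nodes in enumerate(rings):
        if depth == 0:
            positions[nodes[0]] = (0.0, 0.0)
            continue
        for i, node in enumerate(nodes):
            theta = 2 * math.pi * i / len(nodes)
            positions[node] = (depth * math.cos(theta), depth * math.sin(theta))
    return positions

# Toy provenance neighbourhood of one data item: the processes that touched
# it (inner ring) and the data items those processes produced (outer ring).
rings = [
    ["trace.mseed"],
    ["preprocess", "spectrogram"],
    ["clean.mseed", "spec.png", "report.json"],
]
pos = radial_layout(rings)
print(pos["trace.mseed"], pos["preprocess"])  # (0.0, 0.0) (1.0, 0.0)
```

Rings by provenance depth keep the focus item central while pushing derived artefacts outward, which is the basic geometry the bundling techniques then refine.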

  16. An analysis of the influence of deep neural network (DNN) topology in bottleneck feature based language recognition.

    PubMed

    Lozano-Diez, Alicia; Zazo, Ruben; Toledano, Doroteo T; Gonzalez-Rodriguez, Joaquin

    2017-01-01

    Language recognition systems based on bottleneck features have recently become the state-of-the-art in this research field, showing their success in the last Language Recognition Evaluation (LRE 2015) organized by NIST (U.S. National Institute of Standards and Technology). This type of system is based on a deep neural network (DNN) trained to discriminate between phonetic units, i.e. trained for the task of automatic speech recognition (ASR). This DNN aims to compress information in one of its layers, known as the bottleneck (BN) layer, which is used to obtain a new frame representation of the audio signal. This representation has been proven to be useful for the task of language identification (LID). Thus, bottleneck features are used as input to the language recognition system, instead of a classical parameterization of the signal based on cepstral feature vectors such as MFCCs (Mel Frequency Cepstral Coefficients). Despite the success of this approach in language recognition, there is a lack of studies analyzing in a systematic way how the topology of the DNN influences the performance of bottleneck feature-based language recognition systems. In this work, we try to fill in this gap, analyzing language recognition results with different topologies for the DNN used to extract the bottleneck features, comparing them with one another and against a reference system based on a more classical cepstral representation of the input signal with a total variability model. This way, we obtain useful knowledge about how the DNN configuration influences the performance of bottleneck feature-based language recognition systems.
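The bottleneck-feature extraction described in this record can be sketched as a forward pass that stops at the narrow middle layer. The topology, weights, and dimensions below are arbitrary stand-ins (the paper's point is precisely that this topology choice matters), and the network is untrained:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical topology: 39-dim input frames -> 512 -> 64 (bottleneck) -> 512
# -> 1000 phonetic targets. Sizes and random weights are illustrative only.
sizes = [39, 512, 64, 512, 1000]
weights = [(rng.standard_normal((a, b)) * 0.1, np.zeros(b))
           for a, b in zip(sizes[:-1], sizes[1:])]
BOTTLENECK = 1  # index of the layer whose output is the 64-unit BN representation

def bottleneck_features(frames):
    """Forward pass that stops at the bottleneck layer and returns its activations."""
    h = frames
    for i, (W, b) in enumerate(weights):
        h = np.maximum(h @ W + b, 0.0)  # affine layer + ReLU
        if i == BOTTLENECK:
            return h
    return h

frames = rng.standard_normal((10, 39))  # 10 MFCC-like audio frames
bn = bottleneck_features(frames)
print(bn.shape)  # (10, 64)
```

In a real system the weights come from ASR training on phonetic units; the 64-dim activations then replace cepstral vectors as input to the language recognizer.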

  17. Improving the learning of clinical reasoning through computer-based cognitive representation.

    PubMed

    Wu, Bian; Wang, Minhong; Johnson, Janice M; Grotzer, Tina A

    2014-01-01

    Objective: Clinical reasoning is usually taught using a problem-solving approach, which is widely adopted in medical education. However, learning through problem solving is difficult as a result of the contextualization and dynamic aspects of actual problems. Moreover, knowledge acquired from problem-solving practice tends to be inert and fragmented. This study proposed a computer-based cognitive representation approach that externalizes and facilitates the complex processes in learning clinical reasoning. The approach is operationalized in a computer-based cognitive representation tool that involves argument mapping to externalize the problem-solving process and concept mapping to reveal the knowledge constructed from the problems. Methods: Twenty-nine Year 3 or higher students from a medical school in east China participated in the study. Participants used the proposed approach implemented in an e-learning system to complete four learning cases in 4 weeks on an individual basis. For each case, students interacted with the problem to capture critical data, generate and justify hypotheses, make a diagnosis, recall relevant knowledge, and update their conceptual understanding of the problem domain. Meanwhile, students used the computer-based cognitive representation tool to articulate and represent the key elements and their interactions in the learning process. Results: A significant improvement was found in students' learning products from the beginning to the end of the study, consistent with students' report of close-to-moderate progress in developing problem-solving and knowledge-construction abilities. No significant differences were found between the pretest and posttest scores within the 4-week period. The cognitive representation approach was found to provide more formative assessment. Conclusions: The computer-based cognitive representation approach improved the learning of clinical reasoning in both problem solving and knowledge construction.

  18. Improving the learning of clinical reasoning through computer-based cognitive representation

    PubMed Central

    Wu, Bian; Wang, Minhong; Johnson, Janice M.; Grotzer, Tina A.

    2014-01-01

    Objective: Clinical reasoning is usually taught using a problem-solving approach, which is widely adopted in medical education. However, learning through problem solving is difficult as a result of the contextualization and dynamic aspects of actual problems. Moreover, knowledge acquired from problem-solving practice tends to be inert and fragmented. This study proposed a computer-based cognitive representation approach that externalizes and facilitates the complex processes in learning clinical reasoning. The approach is operationalized in a computer-based cognitive representation tool that involves argument mapping to externalize the problem-solving process and concept mapping to reveal the knowledge constructed from the problems. Methods: Twenty-nine Year 3 or higher students from a medical school in east China participated in the study. Participants used the proposed approach implemented in an e-learning system to complete four learning cases in 4 weeks on an individual basis. For each case, students interacted with the problem to capture critical data, generate and justify hypotheses, make a diagnosis, recall relevant knowledge, and update their conceptual understanding of the problem domain. Meanwhile, students used the computer-based cognitive representation tool to articulate and represent the key elements and their interactions in the learning process. Results: A significant improvement was found in students’ learning products from the beginning to the end of the study, consistent with students’ report of close-to-moderate progress in developing problem-solving and knowledge-construction abilities. No significant differences were found between the pretest and posttest scores within the 4-week period. The cognitive representation approach was found to provide more formative assessment. Conclusions: The computer-based cognitive representation approach improved the learning of clinical reasoning in both problem solving and knowledge construction. PMID:25518871

  19. Improving the learning of clinical reasoning through computer-based cognitive representation.

    PubMed

    Wu, Bian; Wang, Minhong; Johnson, Janice M; Grotzer, Tina A

    2014-01-01

    Clinical reasoning is usually taught using a problem-solving approach, which is widely adopted in medical education. However, learning through problem solving is difficult as a result of the contextualization and dynamic aspects of actual problems. Moreover, knowledge acquired from problem-solving practice tends to be inert and fragmented. This study proposed a computer-based cognitive representation approach that externalizes and facilitates the complex processes in learning clinical reasoning. The approach is operationalized in a computer-based cognitive representation tool that involves argument mapping to externalize the problem-solving process and concept mapping to reveal the knowledge constructed from the problems. Twenty-nine Year 3 or higher students from a medical school in east China participated in the study. Participants used the proposed approach implemented in an e-learning system to complete four learning cases in 4 weeks on an individual basis. For each case, students interacted with the problem to capture critical data, generate and justify hypotheses, make a diagnosis, recall relevant knowledge, and update their conceptual understanding of the problem domain. Meanwhile, students used the computer-based cognitive representation tool to articulate and represent the key elements and their interactions in the learning process. A significant improvement was found in students' learning products from the beginning to the end of the study, consistent with students' report of close-to-moderate progress in developing problem-solving and knowledge-construction abilities. No significant differences were found between the pretest and posttest scores within the 4-week period. The cognitive representation approach was found to provide more formative assessment. The computer-based cognitive representation approach improved the learning of clinical reasoning in both problem solving and knowledge construction.

  20. Analysis of a Knowledge-Management-Based Process of Transferring Project Management Skills

    ERIC Educational Resources Information Center

    Ioi, Toshihiro; Ono, Masakazu; Ishii, Kota; Kato, Kazuhiko

    2012-01-01

    Purpose: The purpose of this paper is to propose a method for the transfer of knowledge and skills in project management (PM) based on techniques in knowledge management (KM). Design/methodology/approach: The literature contains studies on methods to extract experiential knowledge in PM, but few studies exist that focus on methods to convert…

  1. A hybrid multimodal non-rigid registration of MR images based on diffeomorphic demons.

    PubMed

    Lu, Huanxiang; Cattin, Philippe C; Reyes, Mauricio

    2010-01-01

    In this paper we present a novel hybrid approach for multimodal medical image registration based on diffeomorphic demons. Diffeomorphic demons have proven to be a robust and efficient approach to intensity-based image registration. A very recent extension even allows mutual information (MI) to be used as a similarity measure to register multimodal images. However, due to the uncertainty of intensity correspondences in some anatomical parts, it is difficult for a purely intensity-based algorithm to solve the registration problem. Therefore, we propose to combine the resulting transformations from both intensity-based and landmark-based methods for multimodal non-rigid registration based on diffeomorphic demons. Several experiments on different types of MR images were conducted, for which we show that a better anatomical correspondence between the images can be obtained using the hybrid approach than using either intensity information or landmarks alone.
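One simple way to picture combining intensity-based and landmark-based results is a per-voxel weighted blend of their displacement fields. This is only an illustration of the idea, with invented fields and weights; it is not the authors' diffeomorphic composition scheme:

```python
import numpy as np

def blend_displacements(d_intensity, d_landmark, confidence):
    """Per-voxel weighted blend of two displacement fields (H x W x 2).
    confidence in [0, 1]: 1 trusts the landmark-derived field fully, e.g.
    near annotated landmarks where intensity correspondence is uncertain."""
    w = confidence[..., None]
    return (1.0 - w) * d_intensity + w * d_landmark

d_int = np.zeros((4, 4, 2)); d_int[..., 0] = 1.0  # intensity field: shift +1 in x
d_lm = np.zeros((4, 4, 2)); d_lm[..., 1] = 2.0    # landmark field: shift +2 in y
conf = np.zeros((4, 4)); conf[0, 0] = 1.0         # trust landmarks at one voxel
d = blend_displacements(d_int, d_lm, conf)
print(d[0, 0], d[1, 1])  # [0. 2.] [1. 0.]
```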

  2. Critical Analysis of Textbooks: Knowledge-Generating Logics and the Emerging Image of "Global Economic Contexts"

    ERIC Educational Resources Information Center

    Thoma, Michael

    2017-01-01

    This paper presents an approach to the critical analysis of textbook knowledge, which, working from a discourse theory perspective (based on the work of Foucault), refers to the performative nature of language. The critical potential of the approach derives from an analysis of knowledge-generating logics, which produce particular images of reality…

  3. Potential and Impact Factors of the Knowledge and Information Awareness Approach for Fostering Net-Based Collaborative Problem-Solving: An Overview

    ERIC Educational Resources Information Center

    Engelmann, Tanja

    2014-01-01

    For effective communication and collaboration in learning situations, it is important to know what the collaboration partners know. However, the acquisition of this knowledge is difficult, especially in collaborating groups with spatially distributed members. One solution is the "Knowledge and Information Awareness" approach developed by…

  4. Partnering with Youth to Map Their Neighborhood Environments: A Multi-Layered GIS Approach

    PubMed Central

    Topmiller, Michael; Jacquez, Farrah; Vissman, Aaron T.; Raleigh, Kevin; Miller-Francis, Jenni

    2014-01-01

    Mapping approaches offer great potential for community-based participatory researchers interested in displaying youth perceptions and advocating for change. We describe a multi-layered approach for gaining local knowledge of neighborhood environments that engages youth as co-researchers and active knowledge producers. By integrating geographic information systems (GIS) with environmental audits, an interactive focus group, and sketch mapping, the approach provides a place-based understanding of physical activity resources from the situated experience of youth. Youth report safety and a lack of recreational resources as inhibiting physical activity. Maps reflecting youth perceptions aid policy-makers in making place-based improvements for youth neighborhood environments. PMID:25423245

  5. Tools for Implementing an Evidence-Based Approach in Public Health Practice

    PubMed Central

    Jacobs, Julie A.; Jones, Ellen; Gabella, Barbara A.; Spring, Bonnie

    2012-01-01

    Increasing disease rates, limited funding, and the ever-growing scientific basis for intervention demand the use of proven strategies to improve population health. Public health practitioners must be ready to implement an evidence-based approach in their work to meet health goals and sustain necessary resources. We researched easily accessible and time-efficient tools for implementing an evidence-based public health (EBPH) approach to improve population health. Several tools have been developed to meet EBPH needs, including free online resources in the following topic areas: training and planning tools, US health surveillance, policy tracking and surveillance, systematic reviews and evidence-based guidelines, economic evaluation, and gray literature. Key elements of EBPH are engaging the community in assessment and decision making; using data and information systems systematically; making decisions on the basis of the best available peer-reviewed evidence (both quantitative and qualitative); applying program-planning frameworks (often based in health-behavior theory); conducting sound evaluation; and disseminating what is learned. PMID:22721501

  6. Consistent visualizations of changing knowledge

    PubMed Central

    Tipney, Hannah J.; Schuyler, Ronald P.; Hunter, Lawrence

    2009-01-01

    Networks are increasingly used in biology to represent complex data in uncomplicated symbolic form. However, as biological knowledge is continually evolving, so must those networks representing this knowledge. Capturing and presenting this type of knowledge change over time is particularly challenging due to the intimate manner in which researchers customize those networks they come into contact with. The effective visualization of this knowledge is important as it creates insight into complex systems and stimulates hypothesis generation and biological discovery. Here we highlight how the retention of user customizations, and the collection and visualization of knowledge-associated provenance, support effective and productive network exploration. We also present an extension of the Hanalyzer system, ReOrient, which supports network exploration and analysis in the presence of knowledge change. PMID:21347184

  7. What Do We Know and How Well Do We Know It? Identifying Practice-Based Insights in Education

    ERIC Educational Resources Information Center

    Miller, Barbara; Pasley, Joan

    2012-01-01

    Knowledge derived from practice forms a significant portion of the knowledge base in the education field, yet is not accessible using existing empirical research methods. This paper describes a systematic, rigorous, grounded approach to collecting and analysing practice-based knowledge using the authors' research in teacher leadership as an…

  8. Mental health interventions in schools in low-income and middle-income countries.

    PubMed

    Fazel, Mina; Patel, Vikram; Thomas, Saji; Tol, Wietse

    2014-10-01

    Increasing enrolment rates could place schools in a crucial position to support mental health in low-income and middle-income countries. In this Review, we provide evidence for mental health interventions in schools in accordance with a public mental health approach spanning promotion, prevention, and treatment. We identified a systematic review for mental health promotion, and identified further prevention and treatment studies. Present evidence supports schools as places for promotion of positive aspects of mental health using a whole-school approach. Knowledge of effectiveness of prevention and treatment interventions is more widely available for conflict-affected children and adolescents. More evidence is needed to identify the many elements likely to be associated with effective prevention and treatment for children exposed to a range of adversity and types of mental disorders. Dissemination and implementation science is crucial to establish how proven effective interventions could be scaled up and implemented in schools.

  9. A Holistic School-Based Nutrition Program Fails to Improve Teachers' Nutrition-Related Knowledge, Attitudes and Behaviour in Rural China

    ERIC Educational Resources Information Center

    Wang, Dongxu; Stewart, Donald; Chang, Chun

    2016-01-01

    Purpose: The purpose of this paper is to examine the effectiveness of a holistic school-based nutrition programme using the health-promoting school (HPS) approach, on teachers' knowledge, attitudes and behaviour in relation to nutrition in rural China. Design/methodology/approach: A cluster-randomised intervention trial design was employed. Two…

  10. Enhancing the Teaching-Learning Process: A Knowledge Management Approach

    ERIC Educational Resources Information Center

    Bhusry, Mamta; Ranjan, Jayanthi

    2012-01-01

    Purpose: The purpose of this paper is to emphasize the need for knowledge management (KM) in the teaching-learning process in technical educational institutions (TEIs) in India, and to assert the impact of information technology (IT) based KM intervention in the teaching-learning process. Design/methodology/approach: The approach of the paper is…

  11. Refining Automatically Extracted Knowledge Bases Using Crowdsourcing.

    PubMed

    Li, Chunhua; Zhao, Pengpeng; Sheng, Victor S; Xian, Xuefeng; Wu, Jian; Cui, Zhiming

    2017-01-01

    Machine-constructed knowledge bases often contain noisy and inaccurate facts. There exists significant work in developing automated algorithms for knowledge base refinement. Automated approaches improve the quality of knowledge bases but are far from perfect. In this paper, we leverage crowdsourcing to improve the quality of automatically extracted knowledge bases. As human labelling is costly, an important research challenge is how we can use limited human resources to maximize the quality improvement for a knowledge base. To address this problem, we first introduce a concept of semantic constraints that can be used to detect potential errors and do inference among candidate facts. Then, based on semantic constraints, we propose rank-based and graph-based algorithms for crowdsourced knowledge refining, which judiciously select the most beneficial candidate facts to conduct crowdsourcing and prune unnecessary questions. Our experiments show that our method improves the quality of knowledge bases significantly and outperforms state-of-the-art automatic methods under a reasonable crowdsourcing cost.
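The selection problem described in this record, spending a limited crowdsourcing budget where it helps most, can be sketched with a simple uncertainty ranking. The scoring rule and the candidate facts below are illustrative stand-ins, not the paper's rank-based or graph-based algorithms:

```python
def pick_for_crowd(candidates, budget):
    """Rank candidate facts by uncertainty (confidence closest to 0.5) and
    return the `budget` facts whose crowd labels would be most informative."""
    ranked = sorted(candidates, key=lambda fact_conf: abs(fact_conf[1] - 0.5))
    return [fact for fact, _ in ranked[:budget]]

candidates = [
    ("born_in(Einstein, Ulm)", 0.97),          # near-certain: not worth asking
    ("capital_of(Sydney, Australia)", 0.55),   # ambiguous: ask the crowd
    ("type_of(Paris, Person)", 0.04),          # near-certain error: not worth asking
    ("spouse_of(M. Curie, P. Curie)", 0.62),   # somewhat uncertain: ask the crowd
]
print(pick_for_crowd(candidates, budget=2))
```

The paper's semantic constraints go further, using crowd answers on one fact to prune or infer answers for related facts; that propagation step is not shown here.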

  12. Knowledge-based approach to video content classification

    NASA Astrophysics Data System (ADS)

    Chen, Yu; Wong, Edward K.

    2001-01-01

    A framework for video content classification using a knowledge-based approach is herein proposed. This approach is motivated by the fact that videos are rich in semantic contents, which can best be interpreted and analyzed by human experts. We demonstrate the concept by implementing a prototype video classification system using the rule-based programming language CLIPS 6.05. Knowledge for video classification is encoded as a set of rules in the rule base. The left-hand sides of rules contain high-level and low-level features, while the right-hand sides contain intermediate results or conclusions. Our current implementation includes features computed from motion, color, and text extracted from video frames. Our current rule set allows us to classify input video into one of five classes: news, weather reporting, commercial, basketball, and football. We use MYCIN's inexact reasoning method for combining evidence and for handling the uncertainties in the features and in the classification results. We obtained good results in a preliminary experiment, which demonstrated the validity of the proposed approach.
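MYCIN's inexact reasoning mentioned in this record uses the standard certainty-factor combination rule: for two non-negative CFs, CF = CF1 + CF2·(1 − CF1), with mirrored and normalized forms for negative and mixed signs. The rule below is that published scheme; its application to video-feature evidence, and the certainty values, are made up for illustration:

```python
def combine_cf(cf1, cf2):
    """MYCIN's rule for combining two certainty factors in [-1, 1]."""
    if cf1 >= 0 and cf2 >= 0:
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 * (1 + cf1)
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

# Illustrative only: two rules each lend some certainty to the class "news",
# e.g. a low-motion anchor shot and detected caption text.
cf = combine_cf(0.6, 0.4)
print(round(cf, 2))  # 0.76
```

Combined evidence never exceeds 1, and each additional supporting rule closes part of the remaining gap, which is why rule order does not matter for same-sign evidence.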

  13. Knowledge-based approach to video content classification

    NASA Astrophysics Data System (ADS)

    Chen, Yu; Wong, Edward K.

    2000-12-01

    A framework for video content classification using a knowledge-based approach is herein proposed. This approach is motivated by the fact that videos are rich in semantic contents, which can best be interpreted and analyzed by human experts. We demonstrate the concept by implementing a prototype video classification system using the rule-based programming language CLIPS 6.05. Knowledge for video classification is encoded as a set of rules in the rule base. The left-hand sides of rules contain high-level and low-level features, while the right-hand sides contain intermediate results or conclusions. Our current implementation includes features computed from motion, color, and text extracted from video frames. Our current rule set allows us to classify input video into one of five classes: news, weather reporting, commercial, basketball, and football. We use MYCIN's inexact reasoning method for combining evidence and for handling the uncertainties in the features and in the classification results. We obtained good results in a preliminary experiment, which demonstrated the validity of the proposed approach.

  14. Zero to Three: National Center for Infants, Toddlers and Families

    MedlinePlus


  15. Visualizing complex processes using a cognitive-mapping tool to support the learning of clinical reasoning.

    PubMed

    Wu, Bian; Wang, Minhong; Grotzer, Tina A; Liu, Jun; Johnson, Janice M

    2016-08-22

    Practical experience with clinical cases has played an important role in supporting the learning of clinical reasoning. However, learning through practical experience involves complex processes difficult to be captured by students. This study aimed to examine the effects of a computer-based cognitive-mapping approach that helps students to externalize the reasoning process and the knowledge underlying the reasoning process when they work with clinical cases. A comparison between the cognitive-mapping approach and the verbal-text approach was made by analyzing their effects on learning outcomes. Fifty-two third-year or higher students from two medical schools participated in the study. Students in the experimental group used the computer-based cognitive-mapping approach, while the control group used the verbal-text approach, to make sense of their thinking and actions when they worked with four simulated cases over 4 weeks. For each case, students in both groups reported their reasoning process (involving data capture, hypotheses formulation, and reasoning with justifications) and the underlying knowledge (involving identified concepts and the relationships between the concepts) using the given approach. The learning products (cognitive maps or verbal text) revealed that students in the cognitive-mapping group outperformed those in the verbal-text group in the reasoning process, but not in making sense of the knowledge underlying the reasoning process. No significant differences were found in a knowledge posttest between the two groups. The computer-based cognitive-mapping approach has shown a promising advantage over the verbal-text approach in improving students' reasoning performance. Further studies are needed to examine the effects of the cognitive-mapping approach in improving the construction of subject-matter knowledge on the basis of practical experience.

  16. A collaborative filtering-based approach to biomedical knowledge discovery.

    PubMed

    Lever, Jake; Gakkhar, Sitanshu; Gottlieb, Michael; Rashnavadi, Tahereh; Lin, Santina; Siu, Celia; Smith, Maia; Jones, Martin R; Krzywinski, Martin; Jones, Steven J M; Wren, Jonathan

    2018-02-15

    The increase in publication rates makes it challenging for an individual researcher to stay abreast of all relevant research in order to find novel research hypotheses. Literature-based discovery methods make use of knowledge graphs built using text mining and can infer future associations between biomedical concepts that will likely occur in new publications. These predictions are a valuable resource for researchers to explore a research topic. Current methods for prediction are based on the local structure of the knowledge graph. A method that uses global knowledge from across the knowledge graph needs to be developed in order to make knowledge discovery a frequently used tool by researchers. We propose an approach based on the singular value decomposition (SVD) that is able to combine data from across the knowledge graph through a reduced representation. Using cooccurrence data extracted from published literature, we show that SVD performs better than the leading methods for scoring discoveries. We also show the diminishing predictive power of knowledge discovery as we compare our predictions with real associations that appear further into the future. Finally, we examine the strengths and weaknesses of the SVD approach against another well-performing system using several predicted associations. All code and results files for this analysis can be accessed at https://github.com/jakelever/knowledgediscovery. sjones@bcgsc.ca. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
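    The SVD-based scoring described above can be sketched as follows. This is not the authors' implementation; the concept names and co-occurrence counts are invented for illustration, and the real system operates on a far larger matrix mined from the literature:

    ```python
    import numpy as np

    # Toy co-occurrence counts between biomedical concepts; the names and
    # numbers are invented, not taken from the paper's data.
    concepts = ["geneA", "diseaseB", "drugC", "pathwayD"]
    C = np.array([
        [0, 4, 0, 2],
        [4, 0, 0, 3],
        [0, 0, 0, 5],
        [2, 3, 5, 0],
    ], dtype=float)

    # Truncated SVD: a rank-k reconstruction lets the score for a pair of
    # concepts draw on global graph structure, not just shared neighbours.
    k = 2
    U, s, Vt = np.linalg.svd(C)
    C_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

    # Rank never-co-mentioned pairs (C == 0, off-diagonal) by their
    # reconstructed score: high scores are candidate discoveries.
    candidates = sorted(
        ((concepts[i], concepts[j], C_k[i, j])
         for i in range(len(concepts))
         for j in range(i + 1, len(concepts))
         if C[i, j] == 0),
        key=lambda t: t[2], reverse=True)
    for a, b, score in candidates:
        print(f"{a} -- {b}: {score:.3f}")
    ```

    The reduced representation is what distinguishes this from purely local methods: a pair can score highly even when the two concepts share no direct neighbour, because the low-rank factors summarize the whole graph.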

  17. The Semantic eScience Framework

    NASA Astrophysics Data System (ADS)

    McGuinness, Deborah; Fox, Peter; Hendler, James

    2010-05-01

    The goal of this effort is to design and implement a configurable and extensible semantic eScience framework (SESF). Configuration requires research into accommodating different levels of semantic expressivity and user requirements from use cases. Extensibility is being achieved through a modular approach to the semantic encodings (i.e., ontologies) performed in community settings, i.e., an ontology framework into which specific applications, all the way up to communities, can extend the semantics for their needs. We report on how we are accommodating the rapid advances in semantic technologies and tools and the sustainable software path for future (certain) technical advances. In addition to a generalization of the current data science interface, we will present plans for an upper-level interface suitable for use by clearinghouses, educational portals, digital libraries, and other disciplines. SESF builds upon previous work in the Virtual Solar-Terrestrial Observatory (VSTO). The VSTO utilizes leading-edge knowledge representation, query, and reasoning techniques to support knowledge-enhanced search, data access, integration, and manipulation. It encodes term meanings and their inter-relationships in ontologies and uses these ontologies and associated inference engines to semantically enable the data services. The Semantically-Enabled Science Data Integration (SESDI) project implemented data integration capabilities among three sub-disciplines (solar radiation, volcanic outgassing, and atmospheric structure) using extensions to existing modular ontologies, and used the VSTO data framework while adding smart faceted search and semantic data registration tools. The Semantic Provenance Capture in Data Ingest Systems (SPCDIS) project has added explanation provenance capabilities to an observational data ingest pipeline for images of the Sun, providing a set of tools to answer diverse end-user questions such as "Why does this image look bad?" http://tw.rpi.edu/portal/SESF

  18. The Semantic eScience Framework

    NASA Astrophysics Data System (ADS)

    Fox, P. A.; McGuinness, D. L.

    2009-12-01

    The goal of this effort is to design and implement a configurable and extensible semantic eScience framework (SESF). Configuration requires research into accommodating different levels of semantic expressivity and user requirements from use cases. Extensibility is being achieved through a modular approach to the semantic encodings (i.e., ontologies) performed in community settings, i.e., an ontology framework into which specific applications, all the way up to communities, can extend the semantics for their needs. We report on how we are accommodating the rapid advances in semantic technologies and tools and the sustainable software path for future (certain) technical advances. In addition to a generalization of the current data science interface, we will present plans for an upper-level interface suitable for use by clearinghouses, educational portals, digital libraries, and other disciplines. SESF builds upon previous work in the Virtual Solar-Terrestrial Observatory (VSTO). The VSTO utilizes leading-edge knowledge representation, query, and reasoning techniques to support knowledge-enhanced search, data access, integration, and manipulation. It encodes term meanings and their inter-relationships in ontologies and uses these ontologies and associated inference engines to semantically enable the data services. The Semantically-Enabled Science Data Integration (SESDI) project implemented data integration capabilities among three sub-disciplines (solar radiation, volcanic outgassing, and atmospheric structure) using extensions to existing modular ontologies, and used the VSTO data framework while adding smart faceted search and semantic data registration tools. The Semantic Provenance Capture in Data Ingest Systems (SPCDIS) project has added explanation provenance capabilities to an observational data ingest pipeline for images of the Sun, providing a set of tools to answer diverse end-user questions such as "Why does this image look bad?"

  19. Knowledge-rich temporal relation identification and classification in clinical notes

    PubMed Central

    D’Souza, Jennifer; Ng, Vincent

    2014-01-01

    Motivation: We examine the task of temporal relation classification for the clinical domain. Our approach to this task departs from existing ones in that it is (i) ‘knowledge-rich’, employing sophisticated knowledge derived from discourse relations as well as both domain-independent and domain-dependent semantic relations, and (ii) ‘hybrid’, combining the strengths of rule-based and learning-based approaches. Evaluation results on the i2b2 Clinical Temporal Relations Challenge corpus show that our approach yields a 17–24% and 8–14% relative reduction in error over a state-of-the-art learning-based baseline system when gold-standard and automatically identified temporal relations are used, respectively. Database URL: http://www.hlt.utdallas.edu/~jld082000/temporal-relations/ PMID:25414383

  20. Physician leadership: influence on practice-based learning and improvement.

    PubMed

    Prather, Stephen E; Jones, David N

    2003-01-01

    In response to the technology and information explosion, practice-based learning and improvement is emerging within the medical field to deliver systematic practice-linked improvements. However, its emergence has been inhibited by the slow acceptance of evidence-based medicine among physicians, who are reluctant to embrace proven high-performance leadership principles long established in other high-risk fields. This reluctance may be attributable to traditional medical training, which encourages controlling leadership styles that magnify the resistance common to all change efforts. To overcome this resistance, physicians must develop the same leadership skills that have proven to be critical to success in other service and high-performance industries. Skills such as self-awareness, shared authority, conflict resolution, and nonpunitive critique can emerge in practice only if they are taught. A dramatic shift away from control and blame has become a requirement for achieving success in other industries based on complex group process. This approach is so mainstream that the burden of proof that cooperative leadership is not a requirement for medical improvement falls to those institutions perpetuating the outmoded paradigm of the past. Cooperative leadership skills that have proven central to implementing change in the information era are suggested as a core cultural support for practice-based learning and improvement. Complex adaptive systems theory, long used as a way to understand evolutionary biology, and more recently computer science and economics, predicts that behavior emerging among some groups of providers will be selected for duplication by the competitive environment. A curriculum framework needed to teach leadership skills to expand the influence of practice-based learning and improvement is offered as a guide to accelerate change.

  1. NOAA Ecosystem Data Assembly Center for the Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Parsons, A. R.; Beard, R. H.; Arnone, R. A.; Cross, S. L.; Comar, P. G.; May, N.; Strange, T. P.

    2006-12-01

    Through research programs at the NOAA Northern Gulf of Mexico Cooperative Institute (CI), NOAA is establishing an Ecosystem Data Assembly Center (EDAC) for the Gulf of Mexico. The EDAC demonstrates the utility of integrating many heterogeneous data types and streams used to characterize and identify ecosystems, for the purpose of determining ecosystem health and identifying applications of the data within coastal resource management activities. Data streams include meteorological, physical oceanographic, ocean color, benthic, biogeochemical survey, and fishery data, as well as fresh water fluxes (rainfall and river flow). Additionally, the EDAC will provide an interface to the ecosystem data through an ontology based on the Coastal/Marine Ecological Classification System (CMECS). The ontological approach within the EDAC will be applied to increase public knowledge of habitats and ecosystem awareness. The EDAC plans to leverage companion socioeconomic studies to identify the essential data needed for continued EDAC operations. All data-management architectures and practices within the EDAC ensure interoperability with the Integrated Ocean Observing System (IOOS) national backbone by incorporating the IOOS Data Management and Communications Plan. Proven data protocols, standards, formats, applications, practices, and architectures developed by the EDAC will be transitioned to the NOAA National Data Centers.

  2. Electrospraying of polymer solutions: Study of formulation and process parameters.

    PubMed

    Smeets, Annelies; Clasen, Christian; Van den Mooter, Guy

    2017-10-01

    Over the past decade, electrospraying has proven to be a promising method for the preparation of amorphous solid dispersions, an established formulation strategy to improve the oral bioavailability of poorly soluble drug compounds. Due to the lack of fundamental knowledge concerning adequate single-nozzle electrospraying conditions, a trial-and-error approach is currently the only option. The objective of this paper is to investigate the influence of the different formulation and process parameters, as well as their interplay, on the formation of a stable cone-jet mode as a prerequisite for the reproducible production of monodisperse micro- and nanoparticles. For this purpose, different polymers commonly used in the formulation of solid dispersions were electrosprayed to map out the workable parameter ranges of the process. The experiments evaluate the importance of experimental parameters such as flow rate, electric potential difference, and the distance between the tip of the nozzle and the collector. Based on this, the type of solvent and the concentration of the polymer solutions, along with their viscosity and conductivity, were identified as determinative formulation parameters. This information is of utmost importance to rationally design further electrospraying methods for the preparation of amorphous solid dispersions. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Smoking cessation and the cardiovascular patient.

    PubMed

    Prochaska, Judith J; Benowitz, Neal L

    2015-09-01

    Smoking remains the leading cause of preventable morbidity and mortality. Our review highlights research from 2013 to 2015 on the treatment of cigarette smoking, with a focus on heart patients and cardiovascular outcomes. Seeking to maximize the reach and effectiveness of existing cessation medications, current tobacco control research has demonstrated the safety and efficacy of combination treatment, extended use, reduce-to-quit strategies, and personalized approaches to treatment matching. Further, cytisine has gained interest as a lower-cost strategy for addressing the global tobacco epidemic. On the harm reduction front, snus and electronic nicotine delivery systems are being widely distributed and promoted with major gaps in knowledge of the safety of long-term and dual use. Quitlines, comparable in outcome to in-person treatment, make cessation counseling available on a national scale, though use rates remain relatively low. Employee reward programs are gaining attention given the high costs of tobacco use to employers; sustaining quit rates postpayment, however, has proven challenging. Evidence-based cessation treatments exist. Broader dissemination, adoption, and implementation are key to addressing the tobacco epidemic. The cardiology team has a professional obligation to advance tobacco control efforts and can play an important role in achieving a smoke-free future.

  4. Investigational drug therapies for coeliac disease - where to from here?

    PubMed

    Haridy, James; Lewis, Diana; Newnham, Evan D

    2018-03-01

    Despite decades of research and a detailed knowledge of the immunopathological basis of coeliac disease (CD), adherence to a lifelong gluten-free diet (GFD) remains the single proven and available treatment. The increasing prevalence of CD combined with variable adherence to the GFD in a significant proportion of patients demands new therapeutic strategies. Areas covered: Trial registries, clinicaltrials.gov, pharmaceutical company website searches as well as published data from PubMed and conference proceedings were used to extract the most recent outcomes for CD therapeutics. This article aims to review the available therapies from a pathophysiological approach, and propose future directions in this interesting yet largely unfulfilled area of research. Expert opinion: Increasingly, the GFD is being challenged by its availability, palatability, practicality and now even efficacy in some populations. Whilst the causative antigens have been well described, it is clear that treatment based on the removal of these immunostimulatory peptides from the diet is far more complex than early experience in CD treatment implied. Despite burgeoning interest and research in experimental therapies for CD over the past twenty years, the only therapy showing promise as a true alternative to a GFD is that of the induction of tolerance via a vaccine.

  5. Towards a knowledge-based system to assist the Brazilian data-collecting system operation

    NASA Technical Reports Server (NTRS)

    Rodrigues, Valter; Simoni, P. O.; Oliveira, P. P. B.; Oliveira, C. A.; Nogueira, C. A. M.

    1988-01-01

    A study is reported which was carried out to show how a knowledge-based approach would lead to a flexible tool to assist the operation task in a satellite-based environmental data collection system. Some characteristics of a hypothesized system comprised of a satellite and a network of Interrogable Data Collecting Platforms (IDCPs) are pointed out. The Knowledge-Based Planning Assistant System (KBPAS) and some aspects about how knowledge is organized in the IDCP's domain are briefly described.

  6. An Ontology-Based Conceptual Model For Accumulating And Reusing Knowledge In A DMAIC Process

    NASA Astrophysics Data System (ADS)

    Nguyen, ThanhDat; Kifor, Claudiu Vasile

    2015-09-01

    DMAIC (Define, Measure, Analyze, Improve, and Control) is an important process used to enhance the quality of processes based on knowledge. However, DMAIC knowledge is difficult to access: conventional approaches struggle to structure and reuse it, mainly because it is not represented and organized systematically. In this article, we address the problem with a conceptual model that combines the DMAIC process, knowledge management, and ontology engineering. The main idea of our model is to use ontologies to represent the knowledge generated by each DMAIC phase. We build five different knowledge bases for storing all knowledge of the DMAIC phases, with the support of necessary tools and appropriate techniques from the information technology area. Consequently, these knowledge bases make knowledge available to experts, managers, and web users during or after DMAIC execution, in order to share and reuse existing knowledge.

  7. Teen pregnancy and the achievement gap among urban minority youth.

    PubMed

    Basch, Charles E

    2011-10-01

    To outline the prevalence and disparities of teen pregnancy among school-aged urban minority youth, causal pathways through which nonmarital teen births adversely affects academic achievement, and proven or promising approaches for schools to address this problem. Literature review. In 2006, the birth rate among 15- to 17-year-old non-Hispanic Blacks (36.1 per 1000) was more than three times as high, and the birth rate among Hispanics (47.9 per 1000) was more than four times as high as the birth rate among non-Hispanic Whites (11.8 per 1000). Compared with women who delay childbearing until age 30, teen mothers' education is estimated to be approximately 2 years shorter. Teen mothers are 10-12% less likely to complete high school and have 14-29% lower odds of attending college. School-based programs have the potential to help teens acquire the knowledge and skills needed to postpone sex, practice safer sex, avoid unintended pregnancy, and if pregnant, to complete high school and pursue postsecondary education. Most students in US middle and high schools receive some kind of sex education. Federal policies and legislation have increased use of the abstinence-only-until-marriage approach, which is disappointing considering the lack of evidence that this approach is effective. Nonmarital teen births are highly and disproportionately prevalent among school-aged urban minority youth, have a negative impact on educational attainment, and effective practices are available for schools to address this problem. Teen pregnancy exerts an important influence on educational attainment among urban minority youth. Decisions about what will be taught should be informed by empirical data documenting the effectiveness of alternative approaches. © 2011, American School Health Association.

  8. A semantic proteomics dashboard (SemPoD) for data management in translational research.

    PubMed

    Jayapandian, Catherine P; Zhao, Meng; Ewing, Rob M; Zhang, Guo-Qiang; Sahoo, Satya S

    2012-01-01

    One of the primary challenges in translational research data management is breaking down the barriers between multiple data silos and integrating 'omics data with clinical information to complete the cycle from bench to bedside. Contextual metadata, also called provenance information, is a key factor in effective data integration, reproducibility of results, correct attribution of the original source, and answering research queries involving "What", "Where", "When", "Which", "Who", "How", and "Why" (also known as the W7 model). At present, however, there is limited or no effective approach to managing and leveraging provenance information for integrating data across studies or projects. Hence, there is an urgent need for a paradigm shift toward a "provenance-aware" informatics platform to address this challenge. We introduce an ontology-driven, intuitive Semantic Proteomics Dashboard (SemPoD) that uses provenance together with domain information (semantic provenance) to enable researchers to query, compare, and correlate different types of data across multiple projects, and to allow integration with legacy data to support their ongoing research. The SemPoD platform, currently in use at the Case Center for Proteomics and Bioinformatics (CPB), consists of three components: (a) an ontology-driven Visual Query Composer, (b) a Result Explorer, and (c) a Query Manager. Currently, SemPoD allows provenance-aware querying of 1153 mass-spectrometry experiments from 20 different projects. SemPoD uses the systems molecular biology provenance ontology (SysPro) to support a dynamic query-composition interface, which automatically updates the components of the query interface based on previous user selections and efficiently prunes the result set using a "smart filtering" approach. 
The SysPro ontology re-uses terms from the PROV ontology (PROV-O) being developed by the World Wide Web Consortium (W3C) provenance working group, the minimum information required for reporting a molecular interaction experiment (MIMIx), and the minimum information about a proteomics experiment (MIAPE) guidelines. SemPoD was evaluated in terms of both user feedback and system scalability. SemPoD is an intuitive and powerful provenance-ontology-driven data access and query platform that uses the MIAPE and MIMIx metadata guidelines to create an integrated view over large-scale systems molecular biology datasets. It leverages the SysPro ontology to create an intuitive dashboard for biologists to compose queries, explore the results, and store queries for later use via the Query Manager. SemPoD can be deployed over many existing database applications storing 'omics data, including, as illustrated here, the LabKey data-management system. Initial user feedback evaluating the usability and functionality of SemPoD has been very positive, and it is being considered for wider deployment beyond the proteomics domain and in other 'omics centers.
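    The provenance-chain traversal that such a provenance-aware platform performs can be sketched with a toy triple store. The entity names below are invented for illustration; only the predicate names echo PROV-O terms, and this is not SemPoD's actual data model:

    ```python
    # A toy provenance store as (subject, predicate, object) triples.
    triples = {
        ("peaklist_7", "prov:wasDerivedFrom", "rawspectrum_7"),
        ("rawspectrum_7", "prov:wasGeneratedBy", "ms_run_7"),
        ("ms_run_7", "prov:wasAssociatedWith", "lab_tech_jane"),
        ("ms_run_7", "ex:usedInstrument", "LTQ_Orbitrap"),
        ("peaklist_9", "prov:wasDerivedFrom", "rawspectrum_9"),
    }

    def objects_of(subject, predicate):
        """All objects linked to `subject` via `predicate`."""
        return {o for s, p, o in triples if s == subject and p == predicate}

    def provenance_chain(entity):
        """Walk wasDerivedFrom/wasGeneratedBy links back to the origin,
        answering 'where did this result come from?' (the W7 'Where')."""
        chain, frontier = [], [entity]
        while frontier:
            current = frontier.pop()
            chain.append(current)
            for pred in ("prov:wasDerivedFrom", "prov:wasGeneratedBy"):
                frontier.extend(objects_of(current, pred))
        return chain

    print(provenance_chain("peaklist_7"))
    ```

    A production system would store such triples in an RDF store and express the traversal as a query, but the principle, following typed provenance links rather than joining ad hoc tables, is the same.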

  9. A Logical Framework for Service Migration Based Survivability

    DTIC Science & Technology

    2016-06-24

    platforms; Service Migration Strategy Fuzzy Inference System Knowledge Base Fuzzy rules representing domain expert knowledge about implications of...service migration strategy. Our approach uses expert knowledge as linguistic reasoning rules and takes service programs damage assessment, service...programs complexity, and available network capability as input. The fuzzy inference system includes four components as shown in Figure 5: (1) a knowledge
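    The snippet above describes a fuzzy inference system that maps damage assessment, program complexity, and network capability to a migration strategy. A minimal Sugeno-style sketch of that idea follows; the membership functions, rules, and output values are invented for illustration, since the report's actual rule base is not shown here:

    ```python
    # Illustrative only: names and numbers below are assumptions,
    # not the report's actual fuzzy rule base.

    def tri(x, a, b, c):
        """Triangular membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def migration_priority(damage, complexity, capability):
        """Inputs in [0, 1]; returns a crisp migration priority in [0, 1]."""
        # Fuzzify crisp inputs into linguistic terms.
        damage_high = tri(damage, 0.3, 1.0, 1.7)
        complexity_high = tri(complexity, 0.3, 1.0, 1.7)
        capability_low = tri(capability, -0.7, 0.0, 0.7)

        # Linguistic rules (min acts as AND); each proposes a crisp output.
        rules = [
            (min(damage_high, capability_low), 0.9),   # badly damaged, weak net
            (min(damage_high, complexity_high), 0.6),  # damaged but complex
            (1.0 - damage_high, 0.1),                  # little damage: stay put
        ]

        # Sugeno-style weighted-average defuzzification.
        total = sum(w for w, _ in rules)
        return sum(w * out for w, out in rules) / total if total else 0.0

    print(round(migration_priority(0.9, 0.2, 0.1), 3))
    ```

    The actual system presumably uses Mamdani inference with a richer rule base, but the pipeline (fuzzify, apply linguistic rules, defuzzify) is the standard structure the snippet alludes to.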

  10. Risk Management of New Microelectronics for NASA: Radiation Knowledge-base

    NASA Technical Reports Server (NTRS)

    LaBel, Kenneth A.

    2004-01-01

    Contents include the following: NASA missions - implications for reliability and radiation constraints. Approach to insertion of new technologies. Technology knowledge-base development. Technology model/tool development and validation. Summary comments.

  11. Password-only authenticated three-party key exchange proven secure against insider dictionary attacks.

    PubMed

    Nam, Junghyun; Choo, Kim-Kwang Raymond; Paik, Juryon; Won, Dongho

    2014-01-01

    While a number of protocols for password-only authenticated key exchange (PAKE) in the 3-party setting have been proposed, it still remains a challenging task to prove the security of a 3-party PAKE protocol against insider dictionary attacks. To the best of our knowledge, there is no 3-party PAKE protocol that carries a formal proof, or even definition, of security against insider dictionary attacks. In this paper, we present the first 3-party PAKE protocol proven secure against both online and offline dictionary attacks as well as insider and outsider dictionary attacks. Our construct can be viewed as a protocol compiler that transforms any 2-party PAKE protocol into a 3-party PAKE protocol with 2 additional rounds of communication. We also present a simple and intuitive approach of formally modelling dictionary attacks in the password-only 3-party setting, which significantly reduces the complexity of proving the security of 3-party PAKE protocols against dictionary attacks. In addition, we investigate the security of the well-known 3-party PAKE protocol, called GPAKE, due to Abdalla et al. (2005, 2006), and demonstrate that the security of GPAKE against online dictionary attacks depends heavily on the composition of its two building blocks, namely a 2-party PAKE protocol and a 3-party key distribution protocol.

  12. Validation and detection of vessel landmarks by using anatomical knowledge

    NASA Astrophysics Data System (ADS)

    Beck, Thomas; Bernhardt, Dominik; Biermann, Christina; Dillmann, Rüdiger

    2010-03-01

    The detection of anatomical landmarks is an important prerequisite for analyzing medical images fully automatically. Several machine learning approaches have been proposed to parse 3D CT datasets and to determine the location of landmarks with associated uncertainty. However, it is a challenging task to incorporate high-level anatomical knowledge to improve these classification results. We propose a new approach to validate candidates for vessel bifurcation landmarks, which is also applied to systematically search for missed landmarks and to validate ambiguous ones. A knowledge base is trained to provide human-readable geometric information about the vascular system, mainly vessel lengths, radii, and curvature information, for validating landmarks and guiding the search process. To analyze the bifurcation area surrounding a vessel landmark of interest, a new approach is proposed that is based on Fast Marching and incorporates anatomical information from the knowledge base. Using the proposed algorithms, an anatomical knowledge base has been generated from 90 manually annotated CT images covering different parts of the body. To evaluate the landmark validation, a set of 50 carotid datasets was tested in combination with a state-of-the-art landmark detector, with excellent results. Besides the carotid bifurcation, the algorithm is designed to handle a wide range of vascular landmarks, e.g., the celiac, superior mesenteric, renal, aortic, iliac, and femoral bifurcations.
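    The validation step described here, checking a candidate's measured geometry against ranges stored in a human-readable knowledge base, can be sketched as follows. The landmark names, ranges, and measurements are invented for illustration, not taken from the paper's trained knowledge base:

    ```python
    # Hypothetical geometric knowledge base:
    # landmark -> (radius range in mm, parent-vessel length range in mm).
    KNOWLEDGE_BASE = {
        "carotid_bifurcation": ((2.0, 5.5), (40.0, 120.0)),
        "renal_bifurcation": ((1.5, 4.0), (20.0, 80.0)),
    }

    def validate(landmark, radius_mm, vessel_length_mm):
        """Accept a detector candidate only if its measured geometry
        falls inside the ranges stored for that landmark type."""
        (r_lo, r_hi), (l_lo, l_hi) = KNOWLEDGE_BASE[landmark]
        return r_lo <= radius_mm <= r_hi and l_lo <= vessel_length_mm <= l_hi

    print(validate("carotid_bifurcation", 3.2, 85.0))  # plausible geometry
    print(validate("carotid_bifurcation", 9.0, 85.0))  # radius implausible
    ```

    In the paper, the measurements themselves come from a Fast Marching analysis of the bifurcation area; the sketch above only illustrates the range-checking side of the validation.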

  13. Facilitating Emergent Literacy Skills: A Literature-Based, Multiple Intelligence Approach

    ERIC Educational Resources Information Center

    Brand, Susan Trostle

    2006-01-01

    Educators have continually sought to achieve a balance between a phonics-based, code-emphasis program and a more holistic, meaning-based approach to emergent literacy instruction. This article describes an integrated phonics and literature-based approach to developing children's emergent literacy skills. These skills included alphabet knowledge,…

  14. ProphTools: general prioritization tools for heterogeneous biological networks.

    PubMed

    Navarro, Carmen; Martínez, Victor; Blanco, Armando; Cano, Carlos

    2017-12-01

    Networks have been proven effective representations for the analysis of biological data. As such, there exist multiple methods to extract knowledge from biological networks. However, these approaches usually limit their scope to a single biological entity type of interest or they lack the flexibility to analyze user-defined data. We developed ProphTools, a flexible open-source command-line tool that performs prioritization on a heterogeneous network. ProphTools prioritization combines a Flow Propagation algorithm similar to a Random Walk with Restarts and a weighted propagation method. A flexible model for the representation of a heterogeneous network allows the user to define a prioritization problem involving an arbitrary number of entity types and their interconnections. Furthermore, ProphTools provides functionality to perform cross-validation tests, allowing users to select the best network configuration for a given problem. ProphTools core prioritization methodology has already been proven effective in gene-disease prioritization and drug repositioning. Here we make ProphTools available to the scientific community as flexible, open-source software and perform a new proof-of-concept case study on long noncoding RNAs (lncRNAs) to disease prioritization. ProphTools is robust prioritization software that provides the flexibility not present in other state-of-the-art network analysis approaches, enabling researchers to perform prioritization tasks on any user-defined heterogeneous network. Furthermore, the application to lncRNA-disease prioritization shows that ProphTools can reach the performance levels of ad hoc prioritization tools without losing its generality. © The Authors 2017. Published by Oxford University Press.
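    A flow propagation algorithm "similar to a Random Walk with Restarts", as the abstract puts it, can be sketched on a toy heterogeneous network. The node names, edges, and restart parameter below are invented for illustration and do not reflect ProphTools' actual weighting scheme:

    ```python
    import numpy as np

    # Toy heterogeneous network: lncRNAs, diseases, and a gene lumped
    # into one adjacency matrix; names and edges are invented.
    nodes = ["lncRNA_1", "lncRNA_2", "disease_A", "disease_B", "gene_X"]
    edges = [(0, 4), (1, 4), (4, 2), (2, 3)]

    A = np.zeros((5, 5))
    for i, j in edges:
        A[i, j] = A[j, i] = 1.0
    W = A / A.sum(axis=0)  # column-normalise: column-stochastic transitions

    def rwr(seed_index, restart=0.3, iters=100):
        """Random Walk with Restarts: propagate from the seed, returning
        with probability `restart` each step; scores rank relevance."""
        p0 = np.zeros(len(nodes))
        p0[seed_index] = 1.0
        p = p0.copy()
        for _ in range(iters):
            p = (1 - restart) * W @ p + restart * p0
        return p

    scores = rwr(nodes.index("disease_A"))
    for name, s in sorted(zip(nodes, scores), key=lambda t: -t[1]):
        print(f"{name}: {s:.3f}")
    ```

    Seeding at a disease and ranking the lncRNA nodes by their steady-state score is the essence of the lncRNA-to-disease prioritization case study; ProphTools additionally weights propagation across the different sub-networks of the heterogeneous graph.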

  15. Challenging the Non-Science Majors with Inquiry-based Laboratory Environmental Geoscience Courses

    NASA Astrophysics Data System (ADS)

    Humphreys, R. R.; Hall, C.; Colgan, M. W.

    2009-12-01

    Although there is proven rationale for teaching inquiry-based/problem-based lessons in the undergraduate classroom, very few non-major geoscience courses implement these instructional strategies in their laboratory sections. The College of Charleston Department of Geology and Environmental Geosciences has developed an introductory Environmental Geology laboratory course for undergraduate non-majors that remedies this omission. The Environmental Geology lab activities employ an inquiry-based approach, in which the students take control of their own learning; a cooperative learning approach, in which each member of a team is responsible not only for learning what is taught but also for helping their peers learn; and a problem/case-study-based learning approach, in which activities are abstracted from a real-life scenario. In these lab sessions, students actively engage in mastering course content and develop essential skills while exploring real-world scenarios through case studies. For example, during the two-week section on earthquakes, teams of students study the effects of seismic motion on the various types of sediments underlying the Charleston, South Carolina region. Students discover the areas where the greatest damage occurred during the 1886 7.4 MM earthquake through a walking tour of downtown Charleston. Extracting information from historical and topographic maps, as well as aerial and satellite imagery, provides students with the information needed to produce an earthquake hazard-zone map of the Charleston Peninsula. These exercises and laboratory activities allow the students to apply scientific reasoning and concepts to develop solutions to environmental scenarios such as volcanic eruptions, coastal flooding or landslide hazards, and groundwater contamination. The newly implemented labs began in Fall 2008 and have been undergoing adaptations throughout the Spring and Fall of 2009. 
Qualitative data will be gathered and analyzed to show the effectiveness of moving beyond traditional laboratory teaching methods to methods that require and promote deeper learning and retention of content. The qualitative data will be based on student engagement, depth of questioning, and faculty engagement, among other measures, and will be acquired through personal responses and end-of-course surveys. For the Spring 2009 semester, the department will develop a more quantitative means of assessment by integrating pre- and post-surveys into this course as well as the traditionally taught introductory course. Students' acquisition and depth of knowledge in both types of courses will be measured and compared to assess the effectiveness of this teaching strategy in a laboratory setting. These data will encourage the faculty teaching Environmental Geology labs, as well as the standard introductory labs, to redesign the remaining lab courses. In addition, the method used here may serve as a model for laboratory courses in other disciplines.

  16. Designing a theory-informed, contextually appropriate intervention strategy to improve delivery of paediatric services in Kenyan hospitals.

    PubMed

    English, Mike

    2013-03-28

    District hospital services in Kenya and many low-income countries should deliver proven, effective interventions that could substantially reduce child and newborn mortality. However, such services are often of poor quality. Researchers have therefore been challenged to identify intervention strategies that go beyond addressing knowledge, skill, or resource inadequacies to support health systems in delivering better services at scale. An effort to develop a system-oriented intervention, tailored to local needs and context and drawing on theory, is described. An intervention to improve district hospital services for children was designed around four main strategies: a reflective process to distill root causes of the observed problems with service delivery; development of a set of possible intervention approaches to address these problems; a search of the literature for theory that provided the most appropriate basis for intervention design; and repeated movement back and forth between identified causes, proposed interventions, identified theory, and knowledge of the existing context to develop an overarching intervention that seemed feasible, likely to be acceptable, and potentially sustainable. In addition to human and resource constraints, key problems included: failures of relevant professionals to take responsibility for, or ownership of, the challenge of paediatric service delivery; inadequately prepared, poorly supported leaders of service units (mid-level managers), who are often professionally and geographically isolated; and an almost complete lack of useful information for routinely monitoring or understanding service delivery practice or outcomes. A system-oriented intervention recognizing the pivotal role of leaders of service units, but also addressing the outer and inner setting of hospitals, was designed to help shape and support an appropriate role for these professionals.
It aims to foster a sense of ownership while providing the understanding, knowledge, and skills mid-level managers need to work effectively with senior managers and frontline staff to improve services. The intervention will include development of an information system, feedback mechanisms, and discussion fora that promote positive change. The vehicle for such an intervention is a collaborative network partnering government and national professional associations. This case is presented to promote discussion on approaches to developing context-appropriate interventions, particularly in international health.

  17. From Visual Exploration to Storytelling and Back Again.

    PubMed

    Gratzl, S; Lex, A; Gehlenborg, N; Cosgrove, N; Streit, M

    2016-06-01

    The primary goal of visual data exploration tools is to enable the discovery of new insights. To justify and reproduce insights, the discovery process needs to be documented and communicated. A common approach to documenting and presenting findings is to capture visualizations as images or videos. Images, however, are insufficient for telling the story of a visual discovery, as they lack full provenance information and context. Videos are difficult to produce and edit, particularly due to the non-linear nature of the exploratory process. Most importantly, however, neither approach provides the opportunity to return to any point in the exploration in order to review the state of the visualization in detail or to conduct additional analyses. In this paper we present CLUE (Capture, Label, Understand, Explain), a model that tightly integrates data exploration and presentation of discoveries. Based on provenance data captured during the exploration process, users can extract key steps, add annotations, and author "Vistories", visual stories based on the history of the exploration. These Vistories can be shared for others to view, but also to retrace and extend the original analysis. We discuss how the CLUE approach can be integrated into visualization tools and provide a prototype implementation. Finally, we demonstrate the general applicability of the model in two usage scenarios: a Gapminder-inspired visualization to explore public health data and an example from molecular biology that illustrates how Vistories could be used in scientific journals. (see Figure 1 for visual abstract).
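The Capture-Label-Understand-Explain cycle this abstract describes can be sketched in a few lines of Python; the class and method names below are illustrative stand-ins, not CLUE's actual API:

```python
# Hypothetical sketch of the CLUE idea: record every exploration state in a
# provenance record, then author a "Vistory" from the annotated key states.
# Unlike a static image, each story step keeps a reference to the full
# captured state, so a reader can jump back into the analysis.

class ProvenanceRecord:
    def __init__(self):
        self.states = []          # ordered exploration states
        self.annotations = {}     # state index -> author's note

    def capture(self, action, visualization_state):
        """Record one exploration step (Capture)."""
        self.states.append({"action": action, "state": visualization_state})
        return len(self.states) - 1

    def annotate(self, index, note):
        """Attach an author annotation to a captured state (Label)."""
        self.annotations[index] = note

    def vistory(self):
        """Extract the annotated key steps as a shareable story (Explain)."""
        return [
            {"note": self.annotations[i], **self.states[i]}
            for i in sorted(self.annotations)
        ]

prov = ProvenanceRecord()
prov.capture("load", {"dataset": "gapminder"})
i = prov.capture("filter", {"year": 2007})
prov.capture("zoom", {"region": "Asia"})
prov.annotate(i, "Life expectancy vs. income, 2007")
story = prov.vistory()
```

Because the story items point back into the full state list, a viewer can retrace or extend the original exploration from any story step, which is the property images and videos lack.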

  18. From Visual Exploration to Storytelling and Back Again

    PubMed Central

    Gratzl, S.; Lex, A.; Gehlenborg, N.; Cosgrove, N.; Streit, M.

    2016-01-01

    The primary goal of visual data exploration tools is to enable the discovery of new insights. To justify and reproduce insights, the discovery process needs to be documented and communicated. A common approach to documenting and presenting findings is to capture visualizations as images or videos. Images, however, are insufficient for telling the story of a visual discovery, as they lack full provenance information and context. Videos are difficult to produce and edit, particularly due to the non-linear nature of the exploratory process. Most importantly, however, neither approach provides the opportunity to return to any point in the exploration in order to review the state of the visualization in detail or to conduct additional analyses. In this paper we present CLUE (Capture, Label, Understand, Explain), a model that tightly integrates data exploration and presentation of discoveries. Based on provenance data captured during the exploration process, users can extract key steps, add annotations, and author “Vistories”, visual stories based on the history of the exploration. These Vistories can be shared for others to view, but also to retrace and extend the original analysis. We discuss how the CLUE approach can be integrated into visualization tools and provide a prototype implementation. Finally, we demonstrate the general applicability of the model in two usage scenarios: a Gapminder-inspired visualization to explore public health data and an example from molecular biology that illustrates how Vistories could be used in scientific journals. (see Figure 1 for visual abstract) PMID:27942091

  19. Key Provenance of Earth Science Observational Data Products

    NASA Astrophysics Data System (ADS)

    Conover, H.; Plale, B.; Aktas, M.; Ramachandran, R.; Purohit, P.; Jensen, S.; Graves, S. J.

    2011-12-01

    As the sheer volume of data increases, particularly in the earth and environmental sciences, local arrangements for sharing data need to be replaced with reliable records of the what, who, how, and where of a data set or collection. This is frequently called the provenance of a data set. While observational data processing systems in the earth sciences have a long history of capturing metadata about the processing pipeline, current processes are limited in both what is captured and how it is disseminated to the science community. Provenance capture plays a role in scientific data preservation and stewardship precisely because it can automatically capture and represent a coherent picture of the what, how, and who of a particular scientific collection. It reflects the transformations that a data collection underwent prior to reaching its current form: the sequence of tasks that were executed and the data products that were applied to generate a new product. In the NASA-funded Instant Karma project, we examine provenance capture in earth science applications, specifically the Advanced Microwave Scanning Radiometer - Earth Observing System (AMSR-E) Science Investigator-led Processing System (SIPS). The project is integrating the Karma provenance collection and representation tool into the AMSR-E SIPS production environment, with an initial focus on sea ice. This presentation will describe capture and representation of provenance guided by the Open Provenance Model (OPM). Several things have become clear during the course of the project to date. One is that the core OPM entities and relationships are not adequate for expressing the kinds of provenance that are of interest in the science domain. OPM supports name-value pair annotations that can be used to augment what is known about provenance entities and relationships, but in Karma, annotations cannot be added during capture, only after the fact.
This limits the capture system's ability to record something it learned about an entity after the event of its creation in the provenance record. We will discuss extensions to the Open Provenance Model and modifications to the Karma tool suite that address this issue, more efficient representations of earth science provenance, and the definition of metadata structures for capturing related knowledge about the data products and the science algorithms used to generate them. Use scenarios for provenance information are an active topic of investigation. It has also become clear through the project that not all provenance is created equal: in processing pipelines, some provenance is repetitive and uninteresting, and given the sheer volume of provenance, this repetition obscures the interesting pieces. Methodologies to reveal science-relevant provenance will be presented, along with a preview of the AMSR-E Provenance Browser.
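The capture-time annotation issue discussed in this record can be made concrete with a small sketch: OPM-style artifacts and processes whose name-value annotations are attached while the causal edge is recorded, rather than only after the fact. All identifiers below are invented for illustration; OPM itself defines artifacts, processes, and causal edges such as "used" and "wasGeneratedBy".

```python
# Minimal OPM-style provenance record, extended so annotations can be
# supplied at capture time (the limitation the project found in Karma).

from dataclasses import dataclass, field

@dataclass
class Artifact:
    id: str
    annotations: dict = field(default_factory=dict)   # name-value pairs

@dataclass
class Process:
    id: str
    annotations: dict = field(default_factory=dict)

class ProvenanceRecord:
    def __init__(self):
        self.edges = []   # (relation, subject id, object id) triples

    def used(self, process, artifact):
        self.edges.append(("used", process.id, artifact.id))

    def was_generated_by(self, artifact, process, **notes):
        # Annotations passed here are recorded during capture, not after.
        artifact.annotations.update(notes)
        self.edges.append(("wasGeneratedBy", artifact.id, process.id))

rec = ProvenanceRecord()
l1b = Artifact("AMSR-E_L1B_granule")
sea_ice = Artifact("sea_ice_product")
algo = Process("sea_ice_algorithm_v12")
rec.used(algo, l1b)
rec.was_generated_by(sea_ice, algo, algorithm_version="12", qa_flag="good")
```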

  20. Linking science and decision making to promote an ecology for the city: practices and opportunities

    Treesearch

    Morgan Grove; Daniel L. Childers; Michael Galvin; Sarah J. Hines; Tischa Munoz-Erickson; Erika S. Svendsen

    2016-01-01

    To promote urban sustainability and resilience, there is an increasing demand for actionable science that links science and decision making based on social–ecological knowledge. Approaches, frameworks, and practices for such actionable science are needed and have only begun to emerge. We propose that approaches based on the co-design and co-production of knowledge...

  1. School-Based Intervention for Nutrition Promotion in Mi Yun County, Beijing, China: Does a Health-Promoting School Approach Improve Parents' Knowledge, Attitudes and Behaviour?

    ERIC Educational Resources Information Center

    Wang, Dongxu; Stewart, Donald; Chang, Chun

    2016-01-01

    Purpose: The purpose of this paper is to assess whether the school-based nutrition programme using the health-promoting school (HPS) framework was effective to improve parents' knowledge, attitudes and behaviour (KAB) in relation to nutrition in rural Mi Yun County, Beijing. Design/methodology/approach: A cluster-randomised intervention trial…

  2. Work-Centered Approach to Insurgency Campaign Analysis

    DTIC Science & Technology

    2007-06-01

    a constructivist or sensemaking philosophy by defining data, information, situation awareness, and situation understanding in the following manner...present paper explores a new approach to understanding transnational insurgency movements, an approach based on a fundamental analysis of the knowledge...country or region. By focusing at the fundamental level of knowledge creation, the resulting framework allows an understanding of insurgency

  3. A Scientific Data Provenance Harvester for Distributed Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephan, Eric G.; Raju, Bibi; Elsethagen, Todd O.

    Data provenance provides a way for scientists to observe how experimental data originates, conveys process history, and explains influential factors such as experimental rationale and associated environmental factors from system metrics measured at runtime. The US Department of Energy Office of Science Integrated end-to-end Performance Prediction and Diagnosis for Extreme Scientific Workflows (IPPD) project has developed a provenance harvester that is capable of collecting observations from file-based evidence typically produced by distributed applications. To achieve this, file-based evidence is extracted and transformed into an intermediate data format, inspired in part by the W3C CSV on the Web recommendations, called the Harvester Provenance Application Interface (HAPI) syntax. This syntax provides a general means to pre-stage provenance into messages that are both human readable and capable of being written to a provenance store, Provenance Environment (ProvEn). HAPI is being applied to harvest provenance from climate ensemble runs for the Accelerated Climate Modeling for Energy (ACME) project, funded under the U.S. Department of Energy's Office of Biological and Environmental Research (BER) Earth System Modeling (ESM) program. ACME informally provides provenance in a native form through configuration files, directory structures, and log files that contain success/failure indicators, code traces, and performance measurements. Because of its generic format, HAPI is also being applied to harvest tabular job management provenance from Belle II DIRAC scheduler relational database tables, as well as from other scientific applications that log provenance-related information.
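The harvesting step this record describes can be illustrated with a toy example: extract key-value evidence from a log file and pre-stage it as a human-readable, tabular message. The log format, field names, and message layout below are invented for illustration and are not the actual HAPI syntax.

```python
# Sketch of a file-based provenance harvester: parse log lines into uniform
# rows, then serialize them as CSV in the spirit of W3C "CSV on the Web",
# readable by humans and writable to a provenance store.

import csv
import io
import re

LOG = """\
2024-01-10T08:00:01 run=ens042 step=init status=success
2024-01-10T09:13:55 run=ens042 step=atm_model status=success walltime=4434
2024-01-10T09:14:02 run=ens042 step=post status=failure
"""

def harvest(log_text):
    """Turn timestamped key=value log lines into provenance rows."""
    rows = []
    for line in log_text.splitlines():
        timestamp, rest = line.split(" ", 1)
        fields = dict(re.findall(r"(\w+)=(\S+)", rest))
        fields["timestamp"] = timestamp
        rows.append(fields)
    return rows

def to_csv_message(rows):
    """Serialize rows as a tabular message; missing fields stay blank."""
    cols = ["timestamp", "run", "step", "status", "walltime"]
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=cols, restval="")
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

message = to_csv_message(harvest(LOG))
```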

  4. Knowledge-driven genomic interactions: an application in ovarian cancer.

    PubMed

    Kim, Dokyoon; Li, Ruowang; Dudek, Scott M; Frase, Alex T; Pendergrass, Sarah A; Ritchie, Marylyn D

    2014-01-01

    Effective cancer clinical outcome prediction for understanding the mechanisms of various types of cancer has been pursued using molecular data such as gene expression profiles, an approach that has promise for providing better diagnostics and supporting further therapies. However, clinical outcome prediction based on gene expression profiles varies between independent data sets. Further, single-gene expression outcome prediction is limited for cancer evaluation, since genes do not act in isolation but rather interact with other genes in complex signaling or regulatory networks. In addition, since pathways are more likely to cooperate, it would be desirable to incorporate expert knowledge to combine pathways in a useful and informative manner. Thus, we propose a novel approach for identifying knowledge-driven genomic interactions and apply it to discover models associated with cancer clinical phenotypes using grammatical evolution neural networks (GENN). To demonstrate the utility of the proposed approach, an ovarian cancer data set from The Cancer Genome Atlas (TCGA) was used to predict clinical stage as a pilot project. We identified knowledge-driven genomic interactions associated with cancer stage not only from single knowledge bases, such as sources of pathway-pathway interactions, but also across different sets of knowledge bases, such as pathway-protein family interactions, by integrating different types of information. Notably, an integration model drawing on different sources of biological knowledge achieved 78.82% balanced accuracy and outperformed the top models based on gene expression or single knowledge-based data types alone. Furthermore, the results from these models are more interpretable because they are framed in the context of specific biological pathways or other expert knowledge.
The success of the pilot study we have presented herein will allow us to pursue further identification of models predictive of clinical cancer survival and recurrence. Understanding the underlying tumorigenesis and progression in ovarian cancer through the global view of interactions within/between different biological knowledge sources has the potential for providing more effective screening strategies and therapeutic targets for many types of cancer.

  5. Evidence-based management.

    PubMed

    Pfeffer, Jeffrey; Sutton, Robert I

    2006-01-01

    For the most part, managers looking to cure their organizational ills rely on obsolete knowledge they picked up in school, long-standing but never proven traditions, patterns gleaned from experience, methods they happen to be skilled in applying, and information from vendors. They could learn a thing or two from practitioners of evidence-based medicine, a movement that has taken the medical establishment by storm over the past decade. A growing number of physicians are eschewing the usual, flawed resources and are instead identifying, disseminating, and applying research that is soundly conducted and clinically relevant. It's time for managers to do the same. The challenge is, quite simply, to ground decisions in the latest and best knowledge of what actually works. In some ways, that's more difficult to do in business than in medicine. The evidence is weaker in business; almost anyone can (and many people do) claim to be a management expert; and a motley crew of sources (Shakespeare, Billy Graham, Jack Welch, Attila the Hun) are used to generate management advice. Still, it makes sense that when managers act on better logic and strong evidence, their companies will beat the competition. Like medicine, management is learned through practice and experience. Yet managers (like doctors) can practice their craft more effectively if they relentlessly seek new knowledge and insight, from both inside and outside their companies, so they can keep updating their assumptions, skills, and knowledge.

  6. Smith predictor-based sliding mode controller for integrating processes with elevated deadtime.

    PubMed

    Camacho, Oscar; De la Cruz, Francisco

    2004-04-01

    An approach to controlling integrating processes with elevated deadtime using a Smith predictor sliding mode controller is presented. A PID sliding surface and an integrating first-order-plus-deadtime model have been used to synthesize the controller. Since the performance of existing controllers with a Smith predictor decreases in the presence of modeling errors, this paper presents a simple approach to combining the Smith predictor with the sliding mode concept, which is a proven, simple, and robust procedure. The proposed scheme has a set of tuning equations expressed as functions of the characteristic parameters of the model. The proposed approach can be implemented on computer-based industrial controllers that execute PID algorithms. The performance and robustness of the proposed controller are compared with the Matausek-Micić scheme for linear systems using simulations.
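The two ingredients this abstract combines can be sketched separately; the gains, parameters, and smoothing below are made-up illustrations, not the paper's tuning equations.

```python
# Sketch of (1) a Smith predictor, which removes the deadtime from the
# feedback path, and (2) a sliding mode law built on a PID-type surface.

def smith_predictor_feedback(y_meas, model_no_delay, model_delayed):
    """Deadtime-compensated feedback: the measured output plus the
    difference between the undelayed and delayed model predictions."""
    return y_meas + (model_no_delay - model_delayed)

def sliding_mode_control(e, e_int, e_dot, lam=1.0, lam0=0.25, kd=2.0, delta=0.7):
    """PID sliding surface s = de/dt + lam*e + lam0*integral(e); the
    discontinuous reaching term is smoothed as s/(|s| + delta) to
    limit chattering."""
    s = e_dot + lam * e + lam0 * e_int
    return kd * s / (abs(s) + delta)

# A positive error (setpoint above the predicted output) yields a
# positive control action that drives the state toward the surface.
u = sliding_mode_control(e=1.0, e_int=0.5, e_dot=0.0)
```

In the scheme described above, the error fed to the sliding mode law would come from the deadtime-compensated feedback rather than the raw measurement, which is what lets the sliding surface be designed as if the delay were absent.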

  7. Knowledge about smoking, reasons for smoking, and reasons for wishing to quit in inner-city African Americans.

    PubMed

    Ahluwalia, J S; Resnicow, K; Clark, W S

    1998-01-01

    To determine knowledge about smoking, reasons for smoking, and reasons for wishing to quit, and the association of these variables with abstinence at 10 weeks and 6 months. Descriptive study and longitudinal intervention. Inner-city public hospital clinics. 410 African-American cigarette smokers interested in quitting were surveyed at baseline and subsequently enrolled into a double-blind, placebo-controlled, randomized trial of the transdermal nicotine patch. Descriptive information about smoking knowledge, reasons for smoking, and reasons for wishing to quit, and the association of these variables with abstinence at 10 weeks and 6 months. Among the 410 patients randomized, mean age was 48 years, 61% were female, 41% had less than a high school education, 51% had an annual household income below $8,000, and the average number of cigarettes smoked a day was 20. The average number of questions answered correctly was nine out of eleven (84%). The most cited reason for smoking was relaxation/tension reduction, and the least cited were stimulation and handling of the cigarette. Ninety-nine percent of patients stated they wished to quit for health reasons. Knowledge, reasons for smoking, and reasons for wishing to quit were not significantly associated with 10-week or 6-month abstinence. In this group of inner-city African-American smokers, knowledge about cigarette smoking was high. Reasons for smoking were related to relaxation, craving, and pleasure, and reasons for wishing to quit were largely health related. Since knowledge about smoking is already high, future efforts should be directed at promoting cessation through proven behavioral and pharmacological approaches rather than didactic patient education.

  8. Effectiveness of e-learning in continuing medical education for occupational physicians.

    PubMed

    Hugenholtz, Nathalie I R; de Croon, Einar M; Smits, Paul B; van Dijk, Frank J H; Nieuwenhuijsen, Karen

    2008-08-01

    Within a clinical context, e-learning is comparable to traditional approaches to continuing medical education (CME). However, the occupational health context differs, and until now the effect of postgraduate e-learning among occupational physicians (OPs) has not been evaluated. To evaluate the effect of e-learning on knowledge of mental health issues as compared with lecture-based learning in a CME programme for OPs. Within the context of a postgraduate meeting for 74 OPs, a randomized controlled trial was conducted. Knowledge was assessed before and immediately after an educational session with either e-learning or lecture-based learning. In both groups, a significant gain in knowledge of mental health care was found (P < 0.05). However, there was no significant difference between the two educational approaches. The effect of e-learning on OPs' mental health care knowledge is comparable to that of a lecture-based approach. Therefore, e-learning can be beneficial for the CME of OPs.

  9. Model compilation: An approach to automated model derivation

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Baudin, Catherine; Iwasaki, Yumi; Nayak, Pandurang; Tanaka, Kazuo

    1990-01-01

    An approach to automated model derivation for knowledge-based systems is introduced. The approach, model compilation, involves procedurally generating the set of domain models used by a knowledge-based system. An implemented example illustrates how this approach can be used to derive models of different precision and abstraction, tailored to different tasks, from a given set of base domain models. In particular, two implemented model compilers are described, each of which takes as input a base model that describes the structure and behavior of a simple electromechanical device, the Reaction Wheel Assembly of NASA's Hubble Space Telescope. The compilers transform this relatively general base model into simple task-specific models for troubleshooting and redesign, respectively, by applying a sequence of model transformations. Each transformation in this sequence produces an increasingly more specialized model. The compilation approach lessens the burden of updating and maintaining consistency among models by enabling their automatic regeneration.
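The transformation-sequence idea can be rendered as a toy pipeline: each step takes a model and returns a more specialized one. The device components and transformations below are simplified stand-ins, not the actual compilers or the Reaction Wheel Assembly model.

```python
# Toy model compilation: apply a sequence of transformations to a general
# base model to derive a task-specific model.

base_model = {
    "components": {
        "motor":   {"behavior": "ode", "params": {"inertia": 0.2, "friction": 0.01}},
        "bearing": {"behavior": "ode", "params": {"friction": 0.03}},
    }
}

def abstract_behavior(model):
    """Replace detailed ODE behavior with qualitative ok/failed states,
    which is the precision a troubleshooting task needs."""
    return {"components": {
        name: {"behavior": "qualitative", "states": ["ok", "failed"]}
        for name in model["components"]
    }}

def prune_for_task(model, relevant):
    """Drop components the task does not reason about."""
    return {"components": {
        n: c for n, c in model["components"].items() if n in relevant
    }}

# Compile: each transformation yields an increasingly specialized model,
# regenerable automatically whenever the base model changes.
troubleshooting_model = prune_for_task(abstract_behavior(base_model), {"motor"})
```

The point of the pattern is the last comment: because the specialized model is a pure function of the base model, updating the base model and re-running the pipeline keeps all derived models consistent.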

  10. Evolutionary computing for the design search and optimization of space vehicle power subsystems

    NASA Technical Reports Server (NTRS)

    Kordon, Mark; Klimeck, Gerhard; Hanks, David; Hua, Hook

    2004-01-01

    Evolutionary computing has proven to be a straightforward and robust approach for optimizing a wide range of difficult analysis and design problems. This paper discusses the application of these techniques to an existing space vehicle power subsystem resource and performance analysis simulation in a parallel processing environment. Our preliminary results demonstrate that this approach has the potential to improve the space system trade study process by allowing engineers to statistically weight subsystem goals of mass, cost, and performance, and then automatically size power elements based on the anticipated performance of the subsystem rather than on worst-case estimates.

  11. Closed-Tube Barcoding.

    PubMed

    Sirianni, Nicky M; Yuan, Huijun; Rice, John E; Kaufman, Ronit S; Deng, John; Fulton, Chandler; Wangh, Lawrence J

    2016-11-01

    Here, we present a new approach for increasing the rate and lowering the cost of identifying, cataloging, and monitoring global biodiversity. These advances, which we call Closed-Tube Barcoding, are one application of a suite of proven PCR-based technologies invented in our laboratory. Closed-Tube Barcoding builds on and aims to enhance the profoundly important efforts of the International Barcode of Life initiative. Closed-Tube Barcoding promises to be particularly useful when large numbers of small or rare specimens need to be screened and characterized at an affordable price. This approach is also well suited for automation and for use in portable devices.

  12. Tracking Provenance of Earth Science Data

    NASA Technical Reports Server (NTRS)

    Tilmes, Curt; Yesha, Yelena; Halem, Milton

    2010-01-01

    Tremendous volumes of data have been captured, archived and analyzed. Sensors, algorithms and processing systems for transforming and analyzing the data are evolving over time. Web Portals and Services can create transient data sets on-demand. Data are transferred from organization to organization with additional transformations at every stage. Provenance in this context refers to the source of data and a record of the process that led to its current state. It encompasses the documentation of a variety of artifacts related to particular data. Provenance is important for understanding and using scientific datasets, and critical for independent confirmation of scientific results. Managing provenance throughout scientific data processing has gained interest lately and there are a variety of approaches. Large scale scientific datasets consisting of thousands to millions of individual data files and processes offer particular challenges. This paper uses the analogy of art history provenance to explore some of the concerns of applying provenance tracking to earth science data. It also illustrates some of the provenance issues with examples drawn from the Ozone Monitoring Instrument (OMI) Data Processing System (OMIDAPS) run at NASA's Goddard Space Flight Center by the first author.

  13. Predicting Mycobacterium tuberculosis Complex Clades Using Knowledge-Based Bayesian Networks

    PubMed Central

    Bennett, Kristin P.

    2014-01-01

    We develop a novel approach for incorporating expert rules into Bayesian networks for classification of Mycobacterium tuberculosis complex (MTBC) clades. The proposed knowledge-based Bayesian network (KBBN) treats sets of expert rules as prior distributions on the classes. Unlike prior knowledge-based support vector machine approaches which require rules expressed as polyhedral sets, KBBN directly incorporates the rules without any modification. KBBN uses data to refine rule-based classifiers when the rule set is incomplete or ambiguous. We develop a predictive KBBN model for 69 MTBC clades found in the SITVIT international collection. We validate the approach using two testbeds that model knowledge of the MTBC obtained from two different experts and large DNA fingerprint databases to predict MTBC genetic clades and sublineages. These models represent strains of MTBC using high-throughput biomarkers called spacer oligonucleotide types (spoligotypes), since these are routinely gathered from MTBC isolates of tuberculosis (TB) patients. Results show that incorporating rules into problems can drastically increase classification accuracy if data alone are insufficient. The SITVIT KBBN is publicly available for use on the World Wide Web. PMID:24864238
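The core KBBN idea, expert rules entering as class priors that data-driven likelihoods then refine, can be sketched with a naive Bayes-style calculation. The spoligotype features, clade names, and probabilities below are invented for illustration and do not come from the SITVIT model.

```python
# Sketch of rules-as-priors classification: expert rules set the prior over
# clades directly (no polyhedral re-encoding, unlike knowledge-based SVMs),
# and likelihoods learned from data break ties where the rules are ambiguous.

def classify(features, rule_priors, likelihoods):
    """Posterior proportional to prior(clade) * P(features | clade)."""
    posteriors = {}
    for clade, prior in rule_priors.items():
        p = prior
        for f in features:
            p *= likelihoods[clade].get(f, 0.01)   # small default for unseen
        posteriors[clade] = p
    total = sum(posteriors.values())
    return {c: p / total for c, p in posteriors.items()}

# An ambiguous rule matches two clades equally; the data resolve it.
rule_priors = {"cladeA": 0.5, "cladeB": 0.5}
likelihoods = {
    "cladeA": {"spacer33_absent": 0.9, "spacer12_absent": 0.2},
    "cladeB": {"spacer33_absent": 0.3, "spacer12_absent": 0.8},
}
post = classify(["spacer33_absent"], rule_priors, likelihoods)
```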

  14. Measuring Knowledge Elaboration Based on a Computer-Assisted Knowledge Map Analytical Approach to Collaborative Learning

    ERIC Educational Resources Information Center

    Zheng, Lanqin; Huang, Ronghuai; Hwang, Gwo-Jen; Yang, Kaicheng

    2015-01-01

    The purpose of this study is to quantitatively measure the level of knowledge elaboration and explore the relationships between prior knowledge of a group, group performance, and knowledge elaboration in collaborative learning. Two experiments were conducted to investigate the level of knowledge elaboration. The collaborative learning objective in…

  15. The motivational and informational basis of attitudes toward foods with health claims.

    PubMed

    Žeželj, Iris; Milošević, Jasna; Stojanović, Žaklina; Ognjanov, Galjina

    2012-12-01

    This research explored the effects of food choice motives, nutritional knowledge, and the use of food labels on attitudes toward food with health claims. Food with health claims was chosen as a relatively novel category of products designed to be beneficial for health. We identified eight motives served by food in general and tested whether they serve as motivations to evaluate functional food positively. The questionnaire was administered to nationally representative samples of 3085 respondents from six Western Balkan countries. We proposed two structural models relating an extensive list of eight and, alternatively, a restricted list of three food-choice motives (health, mood, and sensory appeal) to attitude toward functional food. We also expected an indirect association between the health motive and attitude, through nutritional knowledge and use of food labels. The results revealed a highly positive, although undifferentiated, attitude toward functional food, with no significant differences between the countries. The restricted model provided a better fit than the exhaustive model; the health motive proved to have an indirect influence on attitude through knowledge and label use. The implications of these findings for the functional approach to attitudes, understanding the demand for functional food, and overcoming barriers to dietary change are discussed.

  16. An application of object-oriented knowledge representation to engineering expert systems

    NASA Technical Reports Server (NTRS)

    Logie, D. S.; Kamil, H.; Umaretiya, J. R.

    1990-01-01

    The paper describes an object-oriented knowledge representation and its application to engineering expert systems. The object-oriented approach promotes efficient handling of the problem data by allowing knowledge to be encapsulated in objects and organized by defining relationships between the objects. An Object Representation Language (ORL) was implemented as a tool for building and manipulating the object base. Rule-based knowledge representation is then used to simulate engineering design reasoning. Using a common object base, very large expert systems can be developed, comprised of small, individually processed, rule sets. The integration of these two schemes makes it easier to develop practical engineering expert systems. The general approach to applying this technology to the domain of the finite element analysis, design, and optimization of aerospace structures is discussed.

  17. Delivering spacecraft control centers with embedded knowledge-based systems: The methodology issue

    NASA Technical Reports Server (NTRS)

    Ayache, S.; Haziza, M.; Cayrac, D.

    1994-01-01

    Matra Marconi Space (MMS) occupies a leading place in Europe in the domain of satellite and space data processing systems. The maturity of knowledge-based system (KBS) technology, together with the theoretical and practical experience acquired in the development of prototype, pre-operational, and operational applications, makes it possible today to consider the wide operational deployment of KBSs in space applications. In this perspective, MMS has to prepare for the introduction of the new methods and support tools that will form the basis of the development of such systems. This paper introduces elements of the MMS methodology initiatives in this domain and the main rationale that motivated the approach. These initiatives develop along two main axes: knowledge engineering methods and tools, and a hybrid-method approach for coexisting knowledge-based and conventional developments.

  18. HPPD: ligand- and target-based virtual screening on a herbicide target.

    PubMed

    López-Ramos, Miriam; Perruccio, Francesca

    2010-05-24

    Hydroxyphenylpyruvate dioxygenase (HPPD) has proven to be a very successful target for the development of herbicides with bleaching properties, and today HPPD inhibitors are well established in the agrochemical market. Syngenta has a long history of HPPD-inhibitor research, and HPPD was chosen as a case study for the validation of diverse ligand- and target-based virtual screening approaches to identify compounds with inhibitory properties. Two-dimensional extended connectivity fingerprints, three-dimensional shape-based tools (ROCS, EON, and Phase-shape) and a pharmacophore approach (Phase) were used as ligand-based methods; Glide and Gold were used as target-based methods. Both the virtual screening utility and the scaffold-hopping ability of the screening tools were assessed. Particular emphasis was put on the specific pitfalls to take into account for the design of a virtual screening campaign in an agrochemical context, as compared to a pharmaceutical environment.
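
    Ligand-based screens of the kind described rank candidates by fingerprint similarity to known actives, most commonly the Tanimoto coefficient. The sketch below uses hypothetical hashed-feature sets as stand-ins for real extended connectivity fingerprints; the compound names and feature IDs are invented for illustration.

```python
def tanimoto(fp_a: set, fp_b: set) -> float:
    """Tanimoto (Jaccard) similarity between two fingerprint feature sets."""
    if not fp_a and not fp_b:
        return 0.0
    inter = len(fp_a & fp_b)
    return inter / (len(fp_a) + len(fp_b) - inter)

# Hypothetical hashed substructure features for a query ligand and two candidates.
query = {1, 4, 7, 9, 12}
cand_close = {1, 4, 7, 9, 15}   # shares four features with the query
cand_far = {2, 5, 20}           # shares none

ranked = sorted([("cand_close", cand_close), ("cand_far", cand_far)],
                key=lambda kv: tanimoto(query, kv[1]), reverse=True)
print(ranked[0][0])  # the more similar candidate ranks first
```

    In a real campaign the sets would come from a cheminformatics toolkit's fingerprint generator, and the ranked list would feed the hit-selection step.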

  19. An interaural-correlation-based approach that accounts for a wide variety of binaural detection data.

    PubMed

    Bernstein, Leslie R; Trahiotis, Constantine

    2017-02-01

    Interaural cross-correlation-based models of binaural processing have accounted successfully for a wide variety of binaural phenomena, including binaural detection, binaural discrimination, and measures of extents of laterality based on interaural temporal disparities, interaural intensitive disparities, and their combination. This report focuses on quantitative accounts of data obtained from binaural detection experiments published over five decades. Particular emphasis is placed on stimulus contexts for which commonly used correlation-based approaches fail to provide adequate explanations of the data. One such context concerns binaural detection of signals masked by certain noises that are narrow-band and/or interaurally partially correlated. It is shown that a cross-correlation-based model that includes stages of peripheral auditory processing can, when coupled with an appropriate decision variable, account well for a wide variety of classic and recently published binaural detection data including those that have, heretofore, proven to be problematic.
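
    The decision variables in such models are built from the normalized interaural cross-correlation as a function of internal delay. The sketch below is a deliberately simplified version with synthetic tone signals and no peripheral auditory stages, showing only how the correlation peak recovers an interaural temporal disparity.

```python
import math

def crosscorr(left, right, max_lag):
    """Normalized cross-correlation of two ear signals over a range of lags."""
    n = len(left)
    norm = math.sqrt(sum(x * x for x in left) * sum(x * x for x in right))
    return {lag: sum(left[i] * right[i + lag]
                     for i in range(n) if 0 <= i + lag < n) / norm
            for lag in range(-max_lag, max_lag + 1)}

fs = 8000
delay = 4  # the right-ear signal lags by 4 samples (0.5 ms ITD)
tone = [math.sin(2 * math.pi * 500 * i / fs) for i in range(400)]
left, right = tone[delay:], tone[:-delay]

cc = crosscorr(left, right, max_lag=10)
print(max(cc, key=cc.get))  # the peak lag estimates the interaural delay
```

    A full model of the kind the paper describes would insert bandpass filtering, envelope compression and other peripheral stages before this correlation step.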

  20. An architecture for rule based system explanation

    NASA Technical Reports Server (NTRS)

    Fennel, T. R.; Johannes, James D.

    1990-01-01

    A system architecture is presented which incorporates both graphics and text into explanations provided by rule-based expert systems. This architecture facilitates explanation of the knowledge base content, the control strategies employed by the system, and the conclusions made by the system. The suggested approach combines hypermedia and inference engine capabilities. Advantages include: closer integration of user interface, explanation system, and knowledge base; the ability to embed links to deeper knowledge underlying the compiled knowledge used in the knowledge base; and allowing for more direct control of explanation depth and duration by the user. User models are suggested to control the type, amount, and order of information presented.

  1. Knowledge Cultures and the Shaping of Work-Based Learning: The Case of Computer Engineering

    ERIC Educational Resources Information Center

    Nerland, Monika

    2008-01-01

    This paper examines how the knowledge culture of computer engineering--that is, the ways in which knowledge is produced, distributed, accumulated and collectively approached within this profession--serve to construct work-based learning in specific ways. Typically, the epistemic infrastructures take the form of information structures with a global…

  2. A quasi-likelihood approach to non-negative matrix factorization

    PubMed Central

    Devarajan, Karthik; Cheung, Vincent C.K.

    2017-01-01

    A unified approach to non-negative matrix factorization based on the theory of generalized linear models is proposed. This approach embeds a variety of statistical models, including the exponential family, within a single theoretical framework and provides a unified view of such factorizations from the perspective of quasi-likelihood. Using this framework, a family of algorithms for handling signal-dependent noise is developed and its convergence proven using the Expectation-Maximization algorithm. In addition, a measure to evaluate the goodness-of-fit of the resulting factorization is described. The proposed methods allow modeling of non-linear effects via appropriate link functions and are illustrated using an application in biomedical signal processing. PMID:27348511
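
    The classical special case of the factorization family described above is the Lee-Seung multiplicative update for V ≈ WH under Euclidean loss, which is itself derived by an EM-style majorization argument. The pure-Python sketch below uses toy data and is only an illustration of that baseline, not the paper's quasi-likelihood algorithms.

```python
import random

def matmul(A, B):
    """Multiply two matrices stored as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def nmf(V, k, iters=200, eps=1e-9):
    """Lee-Seung multiplicative updates; entries stay non-negative throughout."""
    random.seed(0)
    n, m = len(V), len(V[0])
    W = [[random.random() + 0.1 for _ in range(k)] for _ in range(n)]
    H = [[random.random() + 0.1 for _ in range(m)] for _ in range(k)]
    for _ in range(iters):
        WT = transpose(W)
        num, den = matmul(WT, V), matmul(matmul(WT, W), H)
        H = [[H[i][j] * num[i][j] / (den[i][j] + eps) for j in range(m)] for i in range(k)]
        HT = transpose(H)
        num, den = matmul(V, HT), matmul(W, matmul(H, HT))
        W = [[W[i][j] * num[i][j] / (den[i][j] + eps) for j in range(k)] for i in range(n)]
    return W, H

def sq_err(V, W, H):
    R = matmul(W, H)
    return sum((V[i][j] - R[i][j]) ** 2 for i in range(len(V)) for j in range(len(V[0])))

V = [[1.0, 0.0, 2.0], [2.0, 0.0, 4.0], [0.0, 3.0, 0.0]]  # non-negative, rank 2
W, H = nmf(V, k=2)
print(round(sq_err(V, W, H), 6))  # reconstruction error shrinks toward zero
```

    The quasi-likelihood framework swaps the Euclidean loss for other members of the exponential family (and link functions) while keeping this multiplicative-update structure.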

  3. Quantitative and Qualitative Analysis of Flavonoids and Phenolic Acids in Snow Chrysanthemum (Coreopsis tinctoria Nutt.) by HPLC-DAD and UPLC-ESI-QTOF-MS.

    PubMed

    Yang, Yinjun; Sun, Xinguang; Liu, Jinjun; Kang, Liping; Chen, Sibao; Ma, Baiping; Guo, Baolin

    2016-09-30

    A simple, accurate and reliable high performance liquid chromatography coupled with photodiode array detection (HPLC-DAD) method was developed and then successfully applied for simultaneous quantitative analysis of eight compounds, including chlorogenic acid (1), (R/S)-flavanomarein (2), butin-7-O-β-d-glucopyranoside (3), isookanin (4), taxifolin (5), 5,7,3',5'-tetrahydroxyflavanone-7-O-β-d-glucopyranoside (6), marein (7) and okanin (8), in 23 batches of snow chrysanthemum of different seed provenance and from various habitats. The results showed total contents of the eight compounds in the samples with seed provenance from Keliyang (Xinjiang, China) are higher than in samples from the other five provenances by 52.47%, 15.53%, 19.78%, 21.17% and 5.06%, respectively, which demonstrated that provenance has a great influence on the constituents in snow chrysanthemum. Meanwhile, ultra performance liquid chromatography coupled with electrospray ionization quadrupole time-of-flight mass spectrometry (UPLC-ESI-QTOF-MS) was also employed to rapidly separate and identify flavonoids and phenolic acids in snow chrysanthemum from Keliyang. As a result, a total of 30 constituents, including 26 flavonoids and four phenolic acids, were identified or tentatively identified based on the exact mass information, the fragmentation characteristics, and retention times of eight reference standards. This work may provide an efficient approach to comprehensively evaluate the quality of snow chrysanthemum.

  4. New markers to identify the provenance of lapis lazuli: trace elements in pyrite by means of micro-PIXE

    NASA Astrophysics Data System (ADS)

    Re, A.; Angelici, D.; Lo Giudice, A.; Maupas, E.; Giuntini, L.; Calusi, S.; Gelli, N.; Massi, M.; Borghi, A.; Gallo, L. M.; Pratesi, G.; Mandò, P. A.

    2013-04-01

    Lapis lazuli has been used for glyptics and carving since the fifth millennium BC to produce jewels, amulets, seals, inlays, etc.; the identification of the origin of the stone used for carving artworks may be valuable for reconstructing old trade routes. Since ancient lapis lazuli art objects are precious, only non-destructive techniques can be used to identify their provenance, and ion beam analysis (IBA) techniques allow us to characterise this stone in a fully non-invasive way. In addition, by using an ion microprobe, we have been able to focus the analysis on single crystals, as their typical dimensions may range from a few microns to hundreds of microns. Provenance markers, identified in previous IBA studies and already presented elsewhere, were based on the presence/absence of mineral phases, on the presence/quantity of trace elements inside a phase and on characteristic features of the luminescence spectra. In this work, a systematic study on pyrite crystals, a common accessory mineral in lapis lazuli, was carried out, following a multi-technique approach: optical microscopy and SEM-EDX to select crystals for successive trace element micro-PIXE measurements at two Italian facilities, the INFN Laboratori Nazionali di Legnaro and the INFN LABEC laboratory in Firenze. The results of this work allowed us to obtain new markers for lapis lazuli provenance identification.

  5. Measuring and Visualizing Group Knowledge Elaboration in Online Collaborative Discussions

    ERIC Educational Resources Information Center

    Zheng, Yafeng; Xu, Chang; Li, Yanyan; Su, You

    2018-01-01

    Knowledge elaboration plays a critical role in promoting knowledge acquisition and facilitating the retention of target knowledge in online collaborative discussions. Adopting a key-term-based automated analysis approach, we proposed an indicator framework to measure the level of knowledge elaboration in terms of coverage, activation, and…

  6. A deep convolutional neural network using directional wavelets for low-dose X-ray CT reconstruction.

    PubMed

    Kang, Eunhee; Min, Junhong; Ye, Jong Chul

    2017-10-01

    Due to the potential risk of inducing cancer, radiation exposure by X-ray CT devices should be reduced for routine patient scanning. However, in low-dose X-ray CT, severe artifacts typically occur due to photon starvation, beam hardening, and other causes, all of which decrease the reliability of the diagnosis. Thus, a high-quality reconstruction method from low-dose X-ray CT data has become a major research topic in the CT community. Conventional model-based de-noising approaches are, however, computationally very expensive, and image-domain de-noising approaches cannot readily remove CT-specific noise patterns. To tackle these problems, we want to develop a new low-dose X-ray CT algorithm based on a deep-learning approach. We propose an algorithm which uses a deep convolutional neural network (CNN) which is applied to the wavelet transform coefficients of low-dose CT images. More specifically, using a directional wavelet transform to extract the directional component of artifacts and exploit the intra- and inter-band correlations, our deep network can effectively suppress CT-specific noise. In addition, our CNN is designed with a residual learning architecture for faster network training and better performance. Experimental results confirm that the proposed algorithm effectively removes complex noise patterns from CT images derived from a reduced X-ray dose. In addition, we show that the wavelet-domain CNN is efficient when used to remove noise from low-dose CT compared to existing approaches. Our results were rigorously evaluated by several radiologists at the Mayo Clinic and won second place at the 2016 "Low-Dose CT Grand Challenge." To the best of our knowledge, this work is the first deep-learning architecture for low-dose CT reconstruction which has been rigorously evaluated and proven to be effective.
In addition, the proposed algorithm, in contrast to existing model-based iterative reconstruction (MBIR) methods, has considerable potential to benefit from large data sets. Therefore, we believe that the proposed algorithm opens a new direction in the area of low-dose CT research. © 2017 American Association of Physicists in Medicine.
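
    The paper's contribution is a trained CNN operating on wavelet coefficients; the classical ancestor of that idea is wavelet shrinkage, where detail coefficients are soft-thresholded to suppress noise. The 1-D Haar sketch below, on a synthetic signal, illustrates only that wavelet-domain denoising principle, not the proposed network.

```python
import math, random

def haar_forward(x):
    """One-level Haar transform: approximation and detail coefficients."""
    approx = [(x[2 * i] + x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return approx, detail

def haar_inverse(approx, detail):
    x = []
    for a, d in zip(approx, detail):
        x += [(a + d) / math.sqrt(2), (a - d) / math.sqrt(2)]
    return x

def soft(v, t):
    """Soft-threshold a coefficient toward zero by t."""
    return math.copysign(max(abs(v) - t, 0.0), v)

random.seed(1)
clean = [math.sin(2 * math.pi * i / 32) for i in range(64)]
noisy = [c + random.gauss(0, 0.3) for c in clean]

approx, detail = haar_forward(noisy)
detail = [soft(d, 0.4) for d in detail]  # shrink the noise-dominated details
denoised = haar_inverse(approx, detail)

mse = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
print(round(mse(noisy, clean), 3), round(mse(denoised, clean), 3))
```

    The CNN approach replaces the fixed threshold with a learned, data-driven mapping over directional wavelet bands, which is what lets it target CT-specific artifact patterns.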

  7. Leading Change Step-by-Step: Tactics, Tools, and Tales

    ERIC Educational Resources Information Center

    Spiro, Jody

    2010-01-01

    "Leading Change Step-by-Step" offers a comprehensive and tactical guide for change leaders. Spiro's approach has been field-tested for more than a decade and proven effective in a wide variety of public sector organizations including K-12 schools, universities, international agencies and non-profits. The book is filled with proven tactics for…

  8. Developing Evidence for Public Health Policy and Practice: The Implementation of a Knowledge Translation Approach in a Staged, Multi-Methods Study in England, 2007-09

    ERIC Educational Resources Information Center

    South, Jane; Cattan, Mima

    2014-01-01

    Effective knowledge translation processes are critical for the development of evidence-based public health policy and practice. This paper reports on the design and implementation of an innovative approach to knowledge translation within a mixed methods study on lay involvement in public health programme delivery. The study design drew on…

  9. Knowledge-Based Object Detection in Laser Scanning Point Clouds

    NASA Astrophysics Data System (ADS)

    Boochs, F.; Karmacharya, A.; Marbs, A.

    2012-07-01

    Object identification and object processing in 3D point clouds have always posed challenges in terms of effectiveness and efficiency. In practice, this process is highly dependent on human interpretation of the scene represented by the point cloud data, as well as the set of modeling tools available for use. Such modeling algorithms are data-driven and concentrate on specific features of the objects that are accessible to numerical models. We present an approach that brings the human expert knowledge about the scene, the objects inside, their representation by the data, and the behavior of algorithms to the machine. This "understanding" enables the machine to assist human interpretation of the scene inside the point cloud. Furthermore, it allows the machine to understand possibilities and limitations of algorithms and to take this into account within the processing chain. This not only assists the researchers in defining optimal processing steps, but also provides suggestions when certain changes or new details emerge from the point cloud. Our approach benefits from the advancement in knowledge technologies within the Semantic Web framework, which has provided a strong base for applications based on knowledge management. In the article we present and describe the knowledge technologies used for our approach, such as the Web Ontology Language (OWL), used for formulating the knowledge base, and the Semantic Web Rule Language (SWRL) with 3D processing and topologic built-ins, aiming to combine geometrical analysis of 3D point clouds with specialists' knowledge of the scene and algorithmic processing.

  10. Genomic-based multiple-trait evaluation in Eucalyptus grandis using dominant DArT markers.

    PubMed

    Cappa, Eduardo P; El-Kassaby, Yousry A; Muñoz, Facundo; Garcia, Martín N; Villalba, Pamela V; Klápště, Jaroslav; Marcucci Poltri, Susana N

    2018-06-01

    We investigated the impact of combining the pedigree- and genomic-based relationship matrices in a multiple-trait individual-tree mixed model (a.k.a., multiple-trait combined approach) on the estimates of heritability and on the genomic correlations between growth and stem straightness in an open-pollinated Eucalyptus grandis population. Additionally, the added advantage of incorporating genomic information on the theoretical accuracies of parents and offspring breeding values was evaluated. Our results suggested that the use of the combined approach for estimating heritabilities and additive genetic correlations in multiple-trait evaluations is advantageous and including genomic information increases the expected accuracy of breeding values. Furthermore, the multiple-trait combined approach was proven to be superior to the single-trait combined approach in predicting breeding values, in particular for low-heritability traits. Finally, our results advocate the use of the combined approach in forest tree progeny testing trials, specifically when a multiple-trait individual-tree mixed model is considered. Copyright © 2018 Elsevier B.V. All rights reserved.
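
    The "combined approach" blends the expected (pedigree-based) relationship matrix A with the realized (marker-based) matrix G. A common simplified form is a weighted blend; the matrices and the weight below are illustrative assumptions, and real single-step evaluations construct the combined matrix in a more involved, inverse-based way.

```python
# Toy blend of pedigree (A) and genomic (G) relationship matrices for three trees.
w = 0.8  # assumed weight on genomic information

A = [[1.00, 0.25, 0.25],
     [0.25, 1.00, 0.50],
     [0.25, 0.50, 1.00]]  # expected relationships from the pedigree

G = [[1.02, 0.18, 0.31],
     [0.18, 0.98, 0.41],
     [0.31, 0.41, 1.05]]  # realized relationships from DArT markers

H = [[w * G[i][j] + (1 - w) * A[i][j] for j in range(3)] for i in range(3)]
print([round(v, 3) for v in H[0]])
```

    Blending preserves symmetry while pulling the relationship estimates toward the marker data, which is what drives the accuracy gains reported for the combined model.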

  11. Tracing catchment fine sediment sources using the new SIFT (SedIment Fingerprinting Tool) open source software.

    PubMed

    Pulley, S; Collins, A L

    2018-09-01

    The mitigation of diffuse sediment pollution requires reliable provenance information so that measures can be targeted. Sediment source fingerprinting represents one approach for supporting these needs, but recent methodological developments have resulted in an increasing complexity of data processing methods rendering the approach less accessible to non-specialists. A comprehensive new software programme (SIFT; SedIment Fingerprinting Tool) has therefore been developed which guides the user through critical data analysis decisions and automates all calculations. Multiple source group configurations and composite fingerprints are identified and tested using multiple methods of uncertainty analysis. This aims to explore the sediment provenance information provided by the tracers more comprehensively than a single model, and allows for model configurations with high uncertainties to be rejected. This paper provides an overview of its application to an agricultural catchment in the UK to determine if the approach used can provide a reduction in uncertainty and increase in precision. Five source group classifications were used; three formed using a k-means cluster analysis containing 2, 3 and 4 clusters, and two a-priori groups based upon catchment geology. Three different composite fingerprints were used for each classification and bi-plots, range tests, tracer variability ratios and virtual mixtures tested the reliability of each model configuration. Some model configurations performed poorly when apportioning the composition of virtual mixtures, and different model configurations could produce different sediment provenance results despite using composite fingerprints able to discriminate robustly between the source groups. Despite this uncertainty, dominant sediment sources were identified, and those in close proximity to each sediment sampling location were found to be of greatest importance. 
This new software, by integrating recent methodological developments in tracer data processing, guides users through key steps. Critically, by applying multiple model configurations and uncertainty assessment, it delivers more robust solutions for informing catchment management of the sediment problem than many previously used approaches. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
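
    At the heart of any sediment fingerprinting tool is an un-mixing model: find the source proportions whose combined tracer signature best matches the sampled sediment. The two-source, three-tracer example below uses invented concentrations and a simple least-squares grid search; SIFT itself adds multiple source configurations and uncertainty analysis on top of this idea.

```python
# Illustrative tracer concentrations (not from the SIFT case study).
sources = {"topsoil": [12.0, 3.5, 40.0], "channel_bank": [4.0, 9.0, 10.0]}
mixture = [8.0, 6.25, 25.0]  # measured tracer concentrations in the sediment

def misfit(p):
    """Sum of squared tracer residuals for proportion p of topsoil."""
    pred = [p * a + (1 - p) * b
            for a, b in zip(sources["topsoil"], sources["channel_bank"])]
    return sum((m - q) ** 2 for m, q in zip(mixture, pred))

# Grid-search the topsoil proportion on [0, 1].
best = min((i / 1000 for i in range(1001)), key=misfit)
print(best)  # → 0.5: the mixture is consistent with equal source contributions
```

    Real applications solve the same optimization with more sources and tracers, and repeat it across model configurations to expose the uncertainty the paper discusses.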

  12. Integrating HL7 RIM and ontology for unified knowledge and data representation in clinical decision support systems.

    PubMed

    Zhang, Yi-Fan; Tian, Yu; Zhou, Tian-Shu; Araki, Kenji; Li, Jing-Song

    2016-01-01

    The broad adoption of clinical decision support systems within clinical practice has been hampered mainly by the difficulty in expressing domain knowledge and patient data in a unified formalism. This paper presents a semantic-based approach to the unified representation of healthcare domain knowledge and patient data for practical clinical decision making applications. A four-phase knowledge engineering cycle is implemented to develop a semantic healthcare knowledge base based on an HL7 reference information model, including an ontology to model domain knowledge and patient data and an expression repository to encode clinical decision making rules and queries. A semantic clinical decision support system is designed to provide patient-specific healthcare recommendations based on the knowledge base and patient data. The proposed solution is evaluated in the case study of type 2 diabetes mellitus inpatient management. The knowledge base is successfully instantiated with relevant domain knowledge and testing patient data. Ontology-level evaluation confirms model validity. Application-level evaluation of diagnostic accuracy reaches a sensitivity of 97.5%, a specificity of 100%, and a precision of 98%; an acceptance rate of 97.3% is given by domain experts for the recommended care plan orders. The proposed solution has been successfully validated in the case study as providing clinical decision support at a high accuracy and acceptance rate. The evaluation results demonstrate the technical feasibility and application prospect of our approach. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
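
    The evaluation metrics reported above reduce to confusion-matrix counts. The counts below are hypothetical and chosen only to illustrate the formulas, not to reproduce the study's figures.

```python
# Assumed test outcomes: true/false positives and negatives.
tp, fn, tn, fp = 49, 1, 49, 1

sensitivity = tp / (tp + fn)  # true positive rate
specificity = tn / (tn + fp)  # true negative rate
precision = tp / (tp + fp)    # positive predictive value

print(sensitivity, specificity, precision)
```

    With these counts all three metrics come out at 0.98; the study's reported values (97.5% sensitivity, 100% specificity, 98% precision) correspond to a different, unpublished set of counts.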

  13. First principles prediction of amorphous phases using evolutionary algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nahas, Suhas, E-mail: shsnhs@iitk.ac.in; Gaur, Anshu, E-mail: agaur@iitk.ac.in; Bhowmick, Somnath, E-mail: bsomnath@iitk.ac.in

    2016-07-07

    We discuss the efficacy of the evolutionary method for the purpose of structural analysis of amorphous solids. At present, the ab initio molecular dynamics (MD) based melt-quench technique is used, and this deterministic approach has proven to be successful for studying amorphous materials. We show that a stochastic approach motivated by Darwinian evolution can also be used to simulate amorphous structures. Applying this method, in conjunction with density functional theory based electronic, ionic and cell relaxation, we re-investigate two well known amorphous semiconductors, namely silicon and indium gallium zinc oxide. We find that characteristic structural parameters like average bond length and bond angle are within ∼2% of those reported by ab initio MD calculations and experimental studies.
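
    The evolutionary loop behind such structure searches is selection plus random variation against an energy function. In the sketch below a simple quadratic surrogate stands in for the DFT total-energy evaluation, and all parameters are illustrative; it shows only the generic algorithm, not the paper's implementation.

```python
import random
random.seed(0)

def energy(x):
    """Surrogate 'energy': minimized at x = (1.5, 1.5, 1.5, 1.5)."""
    return sum((xi - 1.5) ** 2 for xi in x)

def mutate(x, scale=0.3):
    """Random variation: perturb each coordinate with Gaussian noise."""
    return [xi + random.gauss(0, scale) for xi in x]

# Random initial population of candidate structures (4 degrees of freedom each).
pop = [[random.uniform(-5, 5) for _ in range(4)] for _ in range(20)]
for gen in range(60):
    pop.sort(key=energy)
    parents = pop[:5]                                  # selection: keep the fittest
    children = [mutate(random.choice(parents)) for _ in range(15)]
    pop = parents + children                           # elitism + variation

best = min(pop, key=energy)
print(round(energy(best), 3))
```

    In the actual workflow each "energy evaluation" is a DFT relaxation, which is why the stochastic search competes with melt-quench MD on cost rather than on the optimizer itself.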

  14. Exploring arts-based knowledge translation: sharing research findings through performing the patterns, rehearsing the results, staging the synthesis.

    PubMed

    Rieger, Kendra; Schultz, Annette S H

    2014-04-01

    Cultivation of knowledge translation (KT) strategies that actively engage health professionals in critical reflection of their practice and research-based evidence are imperative to address the research-practice gap. While research-based evidence is exponentially growing, our ability to facilitate uptake by nurses and other health professionals has not kept pace. Innovative approaches that extend epistemological bias beyond a singular standpoint of postpositivism, such as the utilization of arts-based methods, expand the possibility to address the complexities of context, engage audience members, promote dissemination within communities of practice, and foster new audiences interested in research findings. In this paper, we address the importance of adopting a social constructivist epistemological stance to facilitate knowledge translation to diverse audiences, explore various arts-based knowledge translation (ABKT) strategies, and open a dialogue concerning evaluative tenets of ABKT. ABKT utilizes various art forms to disseminate research knowledge to diverse audiences and promote evidence-informed practice. ABKT initiatives translate knowledge not based upon a linear model, which views knowledge as an objective entity, but rather operate from the premise that knowledge is socially situated, which demands acknowledging and engaging the learner within their context. Theatre, dance, photography, and poetry are art forms that are commonly used to communicate research findings to diverse audiences. Given the emerging interest and importance of utilizing this KT strategy situated within a social constructivist epistemology, potential challenges and plausible evaluative criteria specific to ABKT are presented. ABKT is an emerging KT strategy that is grounded in social constructivist epistemological tenets, and holds potential for meaningfully sharing new research knowledge with diverse audiences. 
ABKT is an innovative and synergistic approach to traditional dissemination strategies. This creative KT approach is emerging as potent transformational learning tools that are congruent with the relational nature of nursing practice. ABKT facilitates learning about new research findings in an engaging and critical reflective manner that promotes learning within communities of practice. © 2014 Sigma Theta Tau International.

  15. Knowledge transfer and exchange frameworks in health and their applicability to palliative care: scoping review protocol.

    PubMed

    Prihodova, Lucia; Guerin, Suzanne; Kernohan, W George

    2015-07-01

    To review knowledge transfer and exchange frameworks used in health, to analyse the core concepts of these frameworks and appraise their potential applicability to palliative care. Although there are over 60 different models of knowledge transfer and exchange designed for various areas of the fields of health care, many remain largely unrefined and untested. There is a lack of studies that create guidelines for scaling up successful implementation of research findings and of proven models ensuring that patients have access to optimal health care, guided by current research. The protocol for this scoping review was devised according to the guidelines proposed by Arksey and O'Malley (2005) and Levac et al. (2010). The protocol includes decisions about the review objectives, inclusion criteria, search strategy, study selection, data extraction, quality assessment, data synthesis and plans for dissemination. The review will allow us to identify the currently used models of knowledge transfer and exchange in healthcare settings and analyse their applicability to the complex demands of palliative care. Results from this review will identify effective ways of translating different types of knowledge to different PC providers and could be used in hospital, community and home-based PC and future research. © 2015 John Wiley & Sons Ltd.

  16. A knowledge-based approach to automated flow-field zoning for computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Vogel, Alison Andrews

    1989-01-01

    An automated three-dimensional zonal grid generation capability for computational fluid dynamics is shown through the development of a demonstration computer program capable of automatically zoning the flow field of representative two-dimensional (2-D) aerodynamic configurations. The applicability of a knowledge-based programming approach to the domain of flow-field zoning is examined. Several aspects of flow-field zoning make the application of knowledge-based techniques challenging: the need for perceptual information, the role of individual bias in the design and evaluation of zonings, and the fact that the zoning process is modeled as a constructive, design-type task (for which there are relatively few examples of successful knowledge-based systems in any domain). Engineering solutions to the problems arising from these aspects are developed, and a demonstration system is implemented which can design, generate, and output flow-field zonings for representative 2-D aerodynamic configurations.

  17. Improved regional-scale Brazilian cropping systems' mapping based on a semi-automatic object-based clustering approach

    NASA Astrophysics Data System (ADS)

    Bellón, Beatriz; Bégué, Agnès; Lo Seen, Danny; Lebourgeois, Valentine; Evangelista, Balbino Antônio; Simões, Margareth; Demonte Ferraz, Rodrigo Peçanha

    2018-06-01

    Cropping systems' maps at fine scale over large areas provide key information for further agricultural production and environmental impact assessments, and thus represent a valuable tool for effective land-use planning. There is, therefore, a growing interest in mapping cropping systems in an operational manner over large areas, and remote sensing approaches based on vegetation index time series analysis have proven to be an efficient tool. However, supervised pixel-based approaches are commonly adopted, requiring resource consuming field campaigns to gather training data. In this paper, we present a new object-based unsupervised classification approach tested on an annual MODIS 16-day composite Normalized Difference Vegetation Index time series and a Landsat 8 mosaic of the State of Tocantins, Brazil, for the 2014-2015 growing season. Two variants of the approach are compared: a hyperclustering approach, and a landscape-clustering approach involving a previous stratification of the study area into landscape units on which the clustering is then performed. The main cropping systems of Tocantins, characterized by the crop types and cropping patterns, were efficiently mapped with the landscape-clustering approach. Results show that stratification prior to clustering significantly improves the classification accuracies for underrepresented and sparsely distributed cropping systems. This study illustrates the potential of unsupervised classification for large area cropping systems' mapping and contributes to the development of generic tools for supporting large-scale agricultural monitoring across regions.
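
    Unsupervised mapping of cropping systems amounts to clustering annual vegetation-index profiles. The toy k-means sketch below separates synthetic single-peak from double-peak "NDVI" profiles (values invented, not MODIS data) to illustrate the clustering step the paper builds on.

```python
import random
random.seed(0)

def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans2(points, iters=10):
    """Two-cluster k-means with deterministic seeds from opposite list ends."""
    centers = [points[0], points[-1]]
    for _ in range(iters):
        groups = ([], [])
        for p in points:
            groups[0 if dist(p, centers[0]) <= dist(p, centers[1]) else 1].append(p)
        centers = [[sum(c) / len(g) for c in zip(*g)] for g in groups]
    return centers, groups

single = [0.2, 0.3, 0.7, 0.8, 0.5, 0.2]  # one growing-season peak
double = [0.2, 0.7, 0.3, 0.3, 0.7, 0.2]  # two peaks (double cropping)
points = ([[v + random.gauss(0, 0.03) for v in single] for _ in range(10)] +
          [[v + random.gauss(0, 0.03) for v in double] for _ in range(10)])

centers, groups = kmeans2(points)
print(sorted(len(g) for g in groups))  # the two cropping patterns separate
```

    The landscape-clustering variant described in the paper would first stratify the area and then run this kind of clustering within each landscape unit.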

  18. Text mining of cancer-related information: review of current status and future directions.

    PubMed

    Spasić, Irena; Livsey, Jacqueline; Keane, John A; Nenadić, Goran

    2014-09-01

    This paper reviews the research literature on text mining (TM) with the aim to find out (1) which cancer domains have been the subject of TM efforts, (2) which knowledge resources can support TM of cancer-related information and (3) to what extent systems that rely on knowledge and computational methods can convert text data into useful clinical information. These questions were used to determine the current state of the art in this particular strand of TM and suggest future directions in TM development to support cancer research. A review of the research on TM of cancer-related information was carried out. A literature search was conducted on the Medline database as well as IEEE Xplore and ACM digital libraries to address the interdisciplinary nature of such research. The search results were supplemented with the literature identified through Google Scholar. A range of studies have proven the feasibility of TM for extracting structured information from clinical narratives such as those found in pathology or radiology reports. In this article, we provide a critical overview of the current state of the art for TM related to cancer. The review highlighted a strong bias towards symbolic methods, e.g. named entity recognition (NER) based on dictionary lookup and information extraction (IE) relying on pattern matching. The F-measure of NER ranges between 80% and 90%, while that of IE for simple tasks is in the high 90s. To further improve the performance, TM approaches need to deal effectively with idiosyncrasies of the clinical sublanguage such as non-standard abbreviations as well as a high degree of spelling and grammatical errors. This requires a shift from rule-based methods to machine learning following the success of similar trends in biological applications of TM. Machine learning approaches require large training datasets, but clinical narratives are not readily available for TM research due to privacy and confidentiality concerns. 
This issue remains the main bottleneck for progress in this area. In addition, there is a need for a comprehensive cancer ontology that would enable semantic representation of textual information found in narrative reports. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
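
    The symbolic baseline the review says dominates is dictionary-lookup NER, scored with precision, recall and F-measure. The lexicon and sample sentence below are invented; the lexicon is deliberately missing one gold entity so the metrics are non-trivial.

```python
lexicon = {"adenocarcinoma", "braf"}  # toy dictionary; "tamoxifen" is missing

def ner(text):
    """Dictionary-lookup NER: return lowercased tokens found in the lexicon."""
    return [tok for tok in text.lower().replace(".", " ").split() if tok in lexicon]

text = "Invasive adenocarcinoma with BRAF mutation; tamoxifen was discontinued."
gold = {"adenocarcinoma", "braf", "tamoxifen"}
found = set(ner(text))

precision = len(found & gold) / len(found)
recall = len(found & gold) / len(gold)
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 3))
```

    Dictionary lookup gives perfect precision here but misses the entity absent from the lexicon, which is exactly the coverage limitation that motivates the shift toward machine learning.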

  19. Facilitators for the development and implementation of health promoting policy and programs - a scoping review at the local community level.

    PubMed

    Weiss, Daniel; Lillefjell, Monica; Magnus, Eva

    2016-02-11

    Health promotion, with a focus on multidimensional upstream factors and an ecological, life-course approach, is establishing itself as the guiding philosophy for addressing public health. Action at the political and programmatic level on the Social Determinants of Health has proven effective for promoting and building public health at all levels but has been particularly evident at the national and international levels - due in large part to available documents and guidelines. Although research and experience establish that health promotion is most effective when settings-based, the development of health promoting policies and programs at the local level is still difficult. This study intended to investigate available knowledge on the development and implementation of health promoting policies and programs at the local level and identify factors most important for facilitating capacity building and outcome achievement. We used a scoping review in order to review the current literature on local policy development and program implementation. Keywords were chosen based on results of a previous literature review. A total of 53 articles were divided into two categories: policy and implementation. Critical analysis was conducted for each article and a summary assembled. Data was charted with specific focus on the aims of the study, data acquisition, key theories/concepts/frameworks used, outcome measures, results, and conclusions. The articles included in this study primarily focused on discussing factors that facilitate the development of health promoting policy and the implementation of health promotion programs. Most significant facilitators included: collaborative decision-making, agreement of objectives and goals, local planning and action, effective leadership, building and maintaining trust, availability of resources, a dynamic approach, a realistic time-frame, and trained and knowledgeable staff. 
Within each of these important facilitating factors, various elements supporting implementation were discussed and highlighted in this study. Our results indicate that clear and consistent facilitators exist for supporting health promoting policy development and program implementation at the local level. These results offer a starting point for local action on the Social Determinants of Health and have the potential to contribute to the development of a framework for improving action at the local level.

  20. Refining Automatically Extracted Knowledge Bases Using Crowdsourcing

    PubMed Central

    Xian, Xuefeng; Cui, Zhiming

    2017-01-01

    Machine-constructed knowledge bases often contain noisy and inaccurate facts. There exists significant work in developing automated algorithms for knowledge base refinement. Automated approaches improve the quality of knowledge bases but are far from perfect. In this paper, we leverage crowdsourcing to improve the quality of automatically extracted knowledge bases. As human labelling is costly, an important research challenge is how we can use limited human resources to maximize the quality improvement for a knowledge base. To address this problem, we first introduce a concept of semantic constraints that can be used to detect potential errors and do inference among candidate facts. Then, based on semantic constraints, we propose rank-based and graph-based algorithms for crowdsourced knowledge refining, which judiciously select the most beneficial candidate facts to conduct crowdsourcing and prune unnecessary questions. Our experiments show that our method improves the quality of knowledge bases significantly and outperforms state-of-the-art automatic methods under a reasonable crowdsourcing cost. PMID:28588611
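    The selection step described above can be sketched in a few lines. This is a minimal illustration with invented facts and confidence scores: a functional constraint (one value per subject-relation pair) flags conflicting candidates, and conflict groups are ranked by how many candidate facts a single crowd answer would resolve, a crude stand-in for the paper's benefit score.

```python
# Minimal sketch (hypothetical data) of constraint-based candidate selection:
# a functional semantic constraint ("an entity has exactly one value for this
# relation") flags mutually exclusive candidate facts, and the group whose
# resolution settles the most candidates is sent to the crowd first.
from collections import defaultdict

candidate_facts = [
    ("Einstein", "born_in", "Ulm", 0.7),
    ("Einstein", "born_in", "Munich", 0.6),   # conflicts with the fact above
    ("Einstein", "field", "physics", 0.9),
]

# Group by (subject, relation); any group with >1 value violates the constraint.
groups = defaultdict(list)
for s, r, o, conf in candidate_facts:
    groups[(s, r)].append((o, conf))

conflicts = {k: v for k, v in groups.items() if len(v) > 1}

# Rank conflicting groups by how many candidate facts one crowd answer resolves.
ranked = sorted(conflicts.items(), key=lambda kv: len(kv[1]), reverse=True)
ask_first = ranked[0][0]
print(ask_first)  # ('Einstein', 'born_in')
```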

  1. Pursuing common agendas: a collaborative model for knowledge translation between research and practice in clinical settings.

    PubMed

    Baumbusch, Jennifer L; Kirkham, Sheryl Reimer; Khan, Koushambhi Basu; McDonald, Heather; Semeniuk, Pat; Tan, Elsie; Anderson, Joan M

    2008-04-01

    There is an emerging discourse of knowledge translation that advocates a shift away from unidirectional research utilization and evidence-based practice models toward more interactive models of knowledge transfer. In this paper, we describe how our participatory approach to knowledge translation developed during an ongoing program of research concerning equitable care for diverse populations. At the core of our approach is a collaborative relationship between researchers and practitioners, which underpins the knowledge translation cycle, and occurs simultaneously with data collection/analysis/synthesis. We discuss lessons learned including: the complexities of translating knowledge within the political landscape of healthcare delivery, the need to negotiate the agendas of researchers and practitioners in a collaborative approach, and the kinds of resources needed to support this process.

  2. Un Programa Personalizado De Lectura En Un Aula Primaria: Reporte De Un Modelo Comprobado (A Personalized Program of Reading in a First Grade Classroom: Report From a Proven Model.)

    ERIC Educational Resources Information Center

    Mendenhall, Susie B.; And Others

    This document, written in Spanish, describes a personalized reading program and discusses the results of its implementation. This approach to reading focuses on the individual child and his feelings. This model personalizes the child's reading material in the classroom. In personalizing the reading material, the child's attitudes…

  3. Teaching Thinking and Problem Solving.

    ERIC Educational Resources Information Center

    Bransford, John; And Others

    1986-01-01

    This article focuses on two approaches to teaching reasoning and problem solving. One emphasizes the role of domain-specific knowledge; the other emphasizes general strategic and metacognitive knowledge. Many instructional programs are based on the latter approach. The article concludes that these programs can be strengthened by focusing on domain…

  4. Semantics driven approach for knowledge acquisition from EMRs.

    PubMed

    Perera, Sujan; Henson, Cory; Thirunarayan, Krishnaprasad; Sheth, Amit; Nair, Suhas

    2014-03-01

    Semantic computing technologies have matured to be applicable to many critical domains such as national security, life sciences, and health care. However, the key to their success is the availability of a rich domain knowledge base. The creation and refinement of domain knowledge bases pose difficult challenges. The existing knowledge bases in the health care domain are rich in taxonomic relationships, but they lack nontaxonomic (domain) relationships. In this paper, we describe a semiautomatic technique for enriching existing domain knowledge bases with causal relationships gleaned from Electronic Medical Records (EMR) data. We determine missing causal relationships between domain concepts by validating domain knowledge against EMR data sources and leveraging semantic-based techniques to derive plausible relationships that can rectify knowledge gaps. Our evaluation demonstrates that semantic techniques can be employed to improve the efficiency of knowledge acquisition.
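    The kind of validation described above can be sketched with invented patient records: a disease and a symptom that co-occur above a frequency threshold, but lack a relationship in the knowledge base, become candidate causal relations for review. The records, threshold, and relation names here are illustrative assumptions, not the paper's actual method.

```python
# Hedged sketch (invented records): propose candidate causal relations for
# disease-symptom pairs that co-occur often in EMRs but are missing from the
# knowledge base.
records = [
    {"pneumonia", "cough", "fever"},
    {"pneumonia", "cough"},
    {"migraine", "nausea"},
    {"pneumonia", "fever"},
]

known_relations = {("migraine", "causes", "nausea")}

def cooccurrence(a, b):
    """Fraction of records in which both concepts appear."""
    return sum(1 for r in records if a in r and b in r) / len(records)

candidates = []
for disease in ("pneumonia", "migraine"):
    for symptom in ("cough", "fever", "nausea"):
        if cooccurrence(disease, symptom) >= 0.5 and \
           (disease, "causes", symptom) not in known_relations:
            candidates.append((disease, "causes", symptom))
print(candidates)  # [('pneumonia', 'causes', 'cough'), ('pneumonia', 'causes', 'fever')]
```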

  5. Secure Location Provenance for Mobile Devices

    DTIC Science & Technology

    2015-07-01

    Final technical report, University of Alabama at Birmingham, July 2015. Author: Ragib Hasan. Contract number FA8750-12-2-0254; program element number 69220K. Location-based services allow mobile device users to access various services based on the users' current physical location information. Path-critical applications…

  6. Intermittent kangaroo mother care: a NICU protocol.

    PubMed

    Davanzo, Riccardo; Brovedani, Pierpaolo; Travan, Laura; Kennedy, Jacqueline; Crocetta, Anna; Sanesi, Cecilia; Strajn, Tamara; De Cunto, Angela

    2013-08-01

    The practice of kangaroo mother care (KMC) is steadily increasing in high-tech settings due to its proven benefits for both infants and parents. In spite of that, clear guidelines about how to implement this method of care are lacking, and as a consequence, some restrictions are applied in many neonatal intensive care units (NICUs), preventing its practice. Based on recommendations from the Expert Group of the International Network on Kangaroo Mother Care, we developed a hospital protocol in the neonatal unit of the Institute for Maternal and Child Health in Trieste, Italy, a level 3 unit, aimed to facilitate and promote KMC implementation in high-tech settings. Our guideline is therefore proposed, based both on current scientific literature and on practical considerations and experience. Future adjustments and improvements would be considered based on increasing clinical KMC use and further knowledge.

  7. Knowledge Integration in Global R&D Networks

    NASA Astrophysics Data System (ADS)

    Erkelens, Rose; van den Hooff, Bart; Vlaar, Paul; Huysman, Marleen

    This paper reports a qualitative study conducted at multinational organizations' R&D departments about their process of knowledge integration. Taking into account the knowledge-based view (KBV) of the firm and the practice-based view of knowledge, and building on the literatures concerning specialization and integration of knowledge in organizations, we explore which factors may have a significant influence on the integration process of knowledge between R&D units. The findings indicate (1) which factors influence knowledge integration processes and (2) that a thoughtful balance between engineering and emergent approaches helps in understanding and overcoming knowledge integration issues.

  8. Novel keyword co-occurrence network-based methods to foster systematic reviews of scientific literature.

    PubMed

    Radhakrishnan, Srinivasan; Erbis, Serkan; Isaacs, Jacqueline A; Kamarthi, Sagar

    2017-01-01

    Systematic reviews of scientific literature are important for mapping the existing state of research and highlighting further growth channels in a field of study, but systematic reviews are inherently tedious, time-consuming, and manual in nature. In recent years, keyword co-occurrence networks (KCNs) have been exploited for knowledge mapping. In a KCN, each keyword is represented as a node and each co-occurrence of a pair of words is represented as a link. The number of times that a pair of words co-occurs in multiple articles constitutes the weight of the link connecting the pair. The network constructed in this manner represents cumulative knowledge of a domain and helps to uncover meaningful knowledge components and insights based on the patterns and strength of links between keywords that appear in the literature. In this work, we propose a KCN-based approach that can be implemented prior to undertaking a systematic review to guide and accelerate the review process. The novelty of this method lies in the new metrics used for statistical analysis of a KCN that differ from those typically used for KCN analysis. The approach is demonstrated through its application to nano-related Environmental, Health, and Safety (EHS) risk literature. The KCN approach identified the knowledge components, knowledge structure, and research trends that match with those discovered through a traditional systematic review of the nanoEHS field. Because KCN-based analyses can be conducted more quickly to explore a vast amount of literature, this method can provide a knowledge map and insights prior to undertaking a rigorous traditional systematic review. This two-step approach can significantly reduce the effort and time required for a traditional systematic literature review. The proposed KCN-based pre-systematic review method is universal. It can be applied to any scientific field of study to prepare a knowledge map.
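    The KCN construction described above is straightforward to sketch. In this minimal example (keywords invented for illustration), each article contributes one count to the edge between every pair of its keywords:

```python
# Sketch of building a keyword co-occurrence network (KCN): keywords are
# nodes, and the weight of an edge is the number of articles in which the
# two keywords co-occur. Sample articles are hypothetical.
from collections import Counter
from itertools import combinations

articles = [
    ["nanoparticle", "toxicity", "risk assessment"],
    ["nanoparticle", "toxicity", "exposure"],
    ["exposure", "risk assessment", "nanoparticle"],
]

edge_weights = Counter()
for keywords in articles:
    # Sort each pair so (a, b) and (b, a) map to the same edge.
    for a, b in combinations(sorted(set(keywords)), 2):
        edge_weights[(a, b)] += 1

nodes = sorted({kw for art in articles for kw in art})
print(nodes)
print(edge_weights[("nanoparticle", "toxicity")])  # co-occurs in 2 articles
```

Statistical analysis of the resulting weighted graph (degree, link strength, and the paper's new metrics) then yields the knowledge map.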

  9. Novel keyword co-occurrence network-based methods to foster systematic reviews of scientific literature

    PubMed Central

    Isaacs, Jacqueline A.

    2017-01-01

    Systematic reviews of scientific literature are important for mapping the existing state of research and highlighting further growth channels in a field of study, but systematic reviews are inherently tedious, time-consuming, and manual in nature. In recent years, keyword co-occurrence networks (KCNs) have been exploited for knowledge mapping. In a KCN, each keyword is represented as a node and each co-occurrence of a pair of words is represented as a link. The number of times that a pair of words co-occurs in multiple articles constitutes the weight of the link connecting the pair. The network constructed in this manner represents cumulative knowledge of a domain and helps to uncover meaningful knowledge components and insights based on the patterns and strength of links between keywords that appear in the literature. In this work, we propose a KCN-based approach that can be implemented prior to undertaking a systematic review to guide and accelerate the review process. The novelty of this method lies in the new metrics used for statistical analysis of a KCN that differ from those typically used for KCN analysis. The approach is demonstrated through its application to nano-related Environmental, Health, and Safety (EHS) risk literature. The KCN approach identified the knowledge components, knowledge structure, and research trends that match with those discovered through a traditional systematic review of the nanoEHS field. Because KCN-based analyses can be conducted more quickly to explore a vast amount of literature, this method can provide a knowledge map and insights prior to undertaking a rigorous traditional systematic review. This two-step approach can significantly reduce the effort and time required for a traditional systematic literature review. The proposed KCN-based pre-systematic review method is universal. It can be applied to any scientific field of study to prepare a knowledge map. PMID:28328983

  10. Shared knowledge or shared affordances? Insights from an ecological dynamics approach to team coordination in sports.

    PubMed

    Silva, Pedro; Garganta, Júlio; Araújo, Duarte; Davids, Keith; Aguiar, Paulo

    2013-09-01

    Previous research has proposed that team coordination is based on shared knowledge of the performance context, responsible for linking teammates' mental representations for collective, internalized action solutions. However, this representational approach raises many questions including: how do individual schemata of team members become reformulated together? How much time does it take for this collective cognitive process to occur? How do different cues perceived by different individuals sustain a general shared mental representation? This representational approach is challenged by an ecological dynamics perspective of shared knowledge in team coordination. We argue that the traditional shared knowledge assumption is predicated on 'knowledge about' the environment, which can be used to share knowledge and influence intentions of others prior to competition. Rather, during competitive performance, the control of action by perceiving surrounding informational constraints is expressed in 'knowledge of' the environment. This crucial distinction emphasizes perception of shared affordances (for others and of others) as the main communication channel between team members during team coordination tasks. From this perspective, the emergence of coordinated behaviours in sports teams is based on the formation of interpersonal synergies between players resulting from collective actions predicated on shared affordances.

  11. An Investigation of Knowledge-Building Activities in an Online Community of Practice at Subaru of America

    ERIC Educational Resources Information Center

    Land, Susan M.; Draper, Darryl C.; Ma, Ziyan; Hsieh, Hsiu-Wei; Smith, Brian K.; Jordan, Robert

    2009-01-01

    Current approaches to workplace learning emphasize designing communities of practice that are intended to support both formal and informal knowledge acquisition. This article presents the design and research of a knowledge-based community of practice for Subaru, based on principles outlined by Scardamalia (2002) and Zhang, Scardamalia, Lamon,…

  12. Epistemological Beliefs and Knowledge Sharing in Work Teams: A New Model and Research Questions

    ERIC Educational Resources Information Center

    Weinberg, Frankie J.

    2015-01-01

    Purpose: The purpose of this paper is to present a knowledge-sharing model that explains individual members' motivation to share knowledge (knowledge donation and knowledge collection). Design/methodology/approach: The model is based on social-constructivist theories of epistemological beliefs, learning and distributed cognition, and is organized…

  13. Passing Decisions in Football: Introducing an Empirical Approach to Estimating the Effects of Perceptual Information and Associative Knowledge.

    PubMed

    Steiner, Silvan

    2018-01-01

    The importance of various information sources in decision-making in interactive team sports is debated. While some highlight the role of the perceptual information provided by the current game context, others point to the role of knowledge-based information that athletes have regarding their team environment. Recently, an integrative perspective considering the simultaneous involvement of both of these information sources in decision-making in interactive team sports has been presented. In a theoretical example concerning passing decisions, the simultaneous involvement of perceptual and knowledge-based information has been illustrated. However, no precast method of determining the contribution of these two information sources empirically has been provided. The aim of this article is to bridge this gap and present a statistical approach to estimating the effects of perceptual information and associative knowledge on passing decisions. To this end, a sample dataset of scenario-based passing decisions is analyzed. This article shows how the effects of perceivable team positionings and athletes' knowledge about their fellow team members on passing decisions can be estimated. Ways of transferring this approach to real-world situations and implications for future research using more representative designs are presented.

  14. Passing Decisions in Football: Introducing an Empirical Approach to Estimating the Effects of Perceptual Information and Associative Knowledge

    PubMed Central

    Steiner, Silvan

    2018-01-01

    The importance of various information sources in decision-making in interactive team sports is debated. While some highlight the role of the perceptual information provided by the current game context, others point to the role of knowledge-based information that athletes have regarding their team environment. Recently, an integrative perspective considering the simultaneous involvement of both of these information sources in decision-making in interactive team sports has been presented. In a theoretical example concerning passing decisions, the simultaneous involvement of perceptual and knowledge-based information has been illustrated. However, no precast method of determining the contribution of these two information sources empirically has been provided. The aim of this article is to bridge this gap and present a statistical approach to estimating the effects of perceptual information and associative knowledge on passing decisions. To this end, a sample dataset of scenario-based passing decisions is analyzed. This article shows how the effects of perceivable team positionings and athletes' knowledge about their fellow team members on passing decisions can be estimated. Ways of transferring this approach to real-world situations and implications for future research using more representative designs are presented. PMID:29623057

  15. Cooperative knowledge evolution: a construction-integration approach to knowledge discovery in medicine.

    PubMed

    Schmalhofer, F J; Tschaitschian, B

    1998-11-01

    In this paper, we perform a cognitive analysis of knowledge discovery processes. As a result of this analysis, the construction-integration theory is proposed as a general framework for developing cooperative knowledge evolution systems. We thus suggest that for the acquisition of new domain knowledge in medicine, one should first construct pluralistic views on a given topic which may contain inconsistencies as well as redundancies. Only thereafter does this knowledge become consolidated into a situation-specific circumscription and the early inconsistencies become eliminated. As a proof for the viability of such knowledge acquisition processes in medicine, we present the IDEAS system, which can be used for the intelligent documentation of adverse events in clinical studies. This system provides a better documentation of the side-effects of medical drugs. Thereby, knowledge evolution occurs by achieving consistent explanations in increasingly larger contexts (i.e., more cases and more pharmaceutical substrates). Finally, it is shown how prototypes, model-based approaches and cooperative knowledge evolution systems can be distinguished as different classes of knowledge-based systems.

  16. Design and Implementation of Hydrologic Process Knowledge-base Ontology: A case study for the Infiltration Process

    NASA Astrophysics Data System (ADS)

    Elag, M.; Goodall, J. L.

    2013-12-01

    Hydrologic modeling often requires the re-use and integration of models from different disciplines to simulate complex environmental systems. Component-based modeling introduces a flexible approach for integrating physically-based processes across disciplinary boundaries. Several hydrologic-related modeling communities have adopted the component-based approach for simulating complex physical systems by integrating model components across disciplinary boundaries in a workflow. However, it is not always straightforward to create these interdisciplinary models due to the lack of sufficient knowledge about a hydrologic process. This shortcoming is a result of using informal methods for organizing and sharing information about a hydrologic process. A knowledge-based ontology provides such standards and is considered the ideal approach for overcoming this challenge. The aims of this research are to present the methodology used in analyzing the basic hydrologic domain in order to identify hydrologic processes, the ontology itself, and how the proposed ontology is integrated with the Water Resources Component (WRC) ontology. The proposed ontology standardizes the definitions of a hydrologic process, the relationships between hydrologic processes, and their associated scientific equations. The objective of the proposed Hydrologic Process (HP) Ontology is to advance the idea of creating a unified knowledge framework for components' metadata by introducing a domain-level ontology for hydrologic processes. The HP ontology is a step toward an explicit and robust domain knowledge framework that can be evolved through the contribution of domain users. Analysis of the hydrologic domain is accomplished using the Formal Concept Approach (FCA), in which the infiltration process, an important hydrologic process, is examined. Two infiltration methods, the Green-Ampt and Philip's methods, were used to demonstrate the implementation of information in the HP ontology. 
Furthermore, a SPARQL service is provided for semantic-based querying of the ontology.
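    The kind of question such a SPARQL service answers can be illustrated with a toy in-memory triple store. The process and method names below follow the abstract's examples, but the relation names and query mechanics are simplifications, not the actual HP ontology:

```python
# Toy triple store with single-pattern matching, illustrating the kind of
# query a SPARQL service over a process ontology answers. Relation names
# are invented for illustration.
triples = [
    ("Infiltration", "isA", "HydrologicProcess"),
    ("Infiltration", "hasMethod", "GreenAmpt"),
    ("Infiltration", "hasMethod", "Philip"),
    ("Runoff", "isA", "HydrologicProcess"),
]

def match(pattern):
    """Return all triples matching a pattern; None acts as a wildcard."""
    s0, p0, o0 = pattern
    return [
        (s, p, o)
        for s, p, o in triples
        if (s0 is None or s0 == s)
        and (p0 is None or p0 == p)
        and (o0 is None or o0 == o)
    ]

# "Which methods implement infiltration?", analogous to
# SELECT ?m WHERE { :Infiltration :hasMethod ?m }
methods = [o for _, _, o in match(("Infiltration", "hasMethod", None))]
print(methods)  # ['GreenAmpt', 'Philip']
```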

  17. Circular epidemiology.

    PubMed

    Kuller, L H

    1999-11-01

    Circular epidemiology can be defined as the continuation of specific types of epidemiologic studies beyond the point of reasonable doubt of the true existence of an important association or the absence of such an association. Circular epidemiology is an extreme example of studies of the consistency of associations. A basic problem for epidemiology is the lack of a systematic approach to acquiring new knowledge to reach a goal of improving public health and preventive medicine. For epidemiologists, research support unfortunately is biased toward the continued study of already proven hypotheses. Circular epidemiology, however, freezes at one point in the evolution of epidemiologic studies, failing to move from descriptive to analytical case-control and longitudinal studies, for example, to experimental, clinical trials. Good epidemiology journals are filled with very well-conducted epidemiologic studies that primarily repeat the obvious or are variations on the theme.

  18. Knowledge Driven Variable Selection (KDVS) – a new approach to enrichment analysis of gene signatures obtained from high–throughput data

    PubMed Central

    2013-01-01

    Background High–throughput (HT) technologies provide huge amount of gene expression data that can be used to identify biomarkers useful in the clinical practice. The most frequently used approaches first select a set of genes (i.e. gene signature) able to characterize differences between two or more phenotypical conditions, and then provide a functional assessment of the selected genes with an a posteriori enrichment analysis, based on biological knowledge. However, this approach comes with some drawbacks. First, gene selection procedure often requires tunable parameters that affect the outcome, typically producing many false hits. Second, a posteriori enrichment analysis is based on mapping between biological concepts and gene expression measurements, which is hard to compute because of constant changes in biological knowledge and genome analysis. Third, such mapping is typically used in the assessment of the coverage of gene signature by biological concepts, that is either score–based or requires tunable parameters as well, limiting its power. Results We present Knowledge Driven Variable Selection (KDVS), a framework that uses a priori biological knowledge in HT data analysis. The expression data matrix is transformed, according to prior knowledge, into smaller matrices, easier to analyze and to interpret from both computational and biological viewpoints. Therefore KDVS, unlike most approaches, does not exclude a priori any function or process potentially relevant for the biological question under investigation. Differently from the standard approach where gene selection and functional assessment are applied independently, KDVS embeds these two steps into a unified statistical framework, decreasing the variability derived from the threshold–dependent selection, the mapping to the biological concepts, and the signature coverage. We present three case studies to assess the usefulness of the method. 
Conclusions We showed that KDVS not only enables the selection of known biological functionalities with accuracy, but also identification of new ones. An efficient implementation of KDVS was devised to obtain results in a fast and robust way. Computing time is drastically reduced by the effective use of distributed resources. Finally, integrated visualization techniques immediately increase the interpretability of results. Overall, KDVS approach can be considered as a viable alternative to enrichment–based approaches. PMID:23302187
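    The core KDVS transformation, splitting one large expression matrix into per-concept submatrices according to prior knowledge, can be sketched as follows (gene names, concepts, and values are invented for illustration):

```python
# Illustrative sketch (hypothetical gene sets) of the KDVS idea: prior
# knowledge partitions a large expression matrix into per-concept
# submatrices, each of which is then analyzed as a unit.
samples = ["s1", "s2", "s3"]
genes = ["BRCA1", "TP53", "EGFR", "VEGFA"]
expression = [          # rows = samples, columns = genes
    [2.1, 0.5, 1.8, 3.3],
    [1.9, 0.7, 2.0, 3.1],
    [2.3, 0.4, 1.7, 3.4],
]

# Prior knowledge: biological concepts mapped to member genes (made up).
concepts = {
    "DNA repair": ["BRCA1", "TP53"],
    "angiogenesis": ["VEGFA"],
}

col = {g: i for i, g in enumerate(genes)}
submatrices = {
    name: [[row[col[g]] for g in members] for row in expression]
    for name, members in concepts.items()
}
print(submatrices["DNA repair"][0])  # expression of BRCA1, TP53 in sample s1
```

Variable selection and functional assessment then run on each submatrix jointly, rather than selecting genes first and mapping to concepts afterwards.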

  19. Are Antimalarial Hybrid Molecules a Close Reality or a Distant Dream?

    PubMed

    Agarwal, Drishti; Gupta, Rinkoo D; Awasthi, Satish K

    2017-05-01

    Emergence of drug-resistant Plasmodium falciparum strains has led to a situation of haste in the scientific and pharmaceutical communities. Hence, all their efforts are redirected toward finding alternative chemotherapeutic agents that are capable of combating multidrug-resistant parasite strains. In light of this situation, scientists have come up with the concept of hybridization of two or more active pharmacophores into a single chemical entity, resulting in "antimalarial hybrids." The approach has been applied widely for generation of lead compounds against deadly diseases such as cancer and AIDS, with a proven potential for use as novel drugs, but is comparatively new in the sphere of antimalarial drug discovery. A sudden surge has been evidenced in the number of studies on the design and synthesis of hybrids for treating malaria and may be regarded as proof of their potential advantages over artemisinin-based combination therapy (ACT). However, it is evident from recent studies that most of the potential advantages of antimalarial hybrids, such as lower toxicity, better pharmacokinetics, and easier formulation, have yet to be realized. A number of questions left unaddressed at present need to be answered before this approach can progress to the late stages of clinical development and prove their worth in the clinic. To the best of our knowledge, this compilation is the first attempt to shed light on the shortcomings that are surfacing as more and more studies on molecular hybridization of the active pharmacophores of known antimalarials are being published. Copyright © 2017 American Society for Microbiology.

  20. Are Antimalarial Hybrid Molecules a Close Reality or a Distant Dream?

    PubMed Central

    Agarwal, Drishti; Gupta, Rinkoo D.

    2017-01-01

    ABSTRACT Emergence of drug-resistant Plasmodium falciparum strains has led to a situation of haste in the scientific and pharmaceutical communities. Hence, all their efforts are redirected toward finding alternative chemotherapeutic agents that are capable of combating multidrug-resistant parasite strains. In light of this situation, scientists have come up with the concept of hybridization of two or more active pharmacophores into a single chemical entity, resulting in “antimalarial hybrids.” The approach has been applied widely for generation of lead compounds against deadly diseases such as cancer and AIDS, with a proven potential for use as novel drugs, but is comparatively new in the sphere of antimalarial drug discovery. A sudden surge has been evidenced in the number of studies on the design and synthesis of hybrids for treating malaria and may be regarded as proof of their potential advantages over artemisinin-based combination therapy (ACT). However, it is evident from recent studies that most of the potential advantages of antimalarial hybrids, such as lower toxicity, better pharmacokinetics, and easier formulation, have yet to be realized. A number of questions left unaddressed at present need to be answered before this approach can progress to the late stages of clinical development and prove their worth in the clinic. To the best of our knowledge, this compilation is the first attempt to shed light on the shortcomings that are surfacing as more and more studies on molecular hybridization of the active pharmacophores of known antimalarials are being published. PMID:28289029

  1. How Knowledge Organizations Work: The Case of Detectives

    ERIC Educational Resources Information Center

    Gottschalk, Petter; Holgersson, Stefan; Karlsen, Jan Terje

    2009-01-01

    Purpose: The purpose of this paper is to conceptualize detectives in police investigations as knowledge workers. Design/methodology/approach: The paper is based on a literature review covering knowledge organizations, police organizations, police investigations, and detectives as knowledge workers. Findings: The paper finds that the changing role…

  2. An expert knowledge-based approach to landslide susceptibility mapping using GIS and fuzzy logic

    NASA Astrophysics Data System (ADS)

    Zhu, A.-Xing; Wang, Rongxun; Qiao, Jianping; Qin, Cheng-Zhi; Chen, Yongbo; Liu, Jing; Du, Fei; Lin, Yang; Zhu, Tongxin

    2014-06-01

    This paper presents an expert knowledge-based approach to landslide susceptibility mapping in an effort to overcome the deficiencies of data-driven approaches. The proposed approach consists of three generic steps: (1) extraction of knowledge on the relationship between landslide susceptibility and predisposing factors from domain experts, (2) characterization of predisposing factors using GIS techniques, and (3) prediction of landslide susceptibility under fuzzy logic. The approach was tested in two study areas in China - the Kaixian study area (about 250 km2) and the Three Gorges study area (about 4600 km2). The Kaixian study area was used to develop the approach and to evaluate its validity. The Three Gorges study area was used to test both the portability and the applicability of the developed approach for mapping landslide susceptibility over large study areas. Performance was evaluated by examining if the mean of the computed susceptibility values at landslide sites was statistically different from that of the entire study area. A z-score test was used to examine the statistical significance of the difference. The computed z for the Kaixian area was 3.70 and the corresponding p-value was less than 0.001. This suggests that the computed landslide susceptibility values are good indicators of landslide occurrences. In the Three Gorges study area, the computed z was 10.75 and the corresponding p-value was less than 0.001. In addition, we divided the susceptibility value into four levels: low (0.0-0.25), moderate (0.25-0.5), high (0.5-0.75) and very high (0.75-1.0). No landslides were found for areas of low susceptibility. Landslide density was about three times higher in areas of very high susceptibility than that in the moderate susceptibility areas, and more than twice as high as that in the high susceptibility areas. 
The results from the Three Gorges study area suggest that the extracted expert knowledge can be extrapolated to another study area and the developed approach can be used in large-scale projects. Results from these case studies suggest that the expert knowledge-based approach is effective in mapping landslide susceptibility and that its performance is maintained when it is moved to a new area from the model development area without changes to the knowledge base.
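    The z-score test described above compares the mean susceptibility at landslide sites against the distribution over the whole study area. A sketch with invented values follows (the paper's actual data yield z = 3.70 for Kaixian and z = 10.75 for Three Gorges):

```python
# Hedged sketch of the z-score test: is the mean susceptibility at
# landslide sites significantly higher than the study-area mean?
# All values below are made up for illustration.
import math
import statistics

area_values = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8]   # all map cells
landslide_values = [0.6, 0.7, 0.8, 0.9]                   # cells with landslides

mu = statistics.mean(area_values)
sigma = statistics.pstdev(area_values)   # population std. dev. of the area
n = len(landslide_values)

# z-score for the mean of n samples drawn from the area's distribution
z = (statistics.mean(landslide_values) - mu) / (sigma / math.sqrt(n))
print(round(z, 2))  # 2.62 for these illustrative values
```

A large positive z (small p-value) indicates the susceptibility map concentrates high values where landslides actually occurred.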

  3. Querying Provenance Information: Basic Notions and an Example from Paleoclimate Reconstruction

    NASA Astrophysics Data System (ADS)

    Stodden, V.; Ludaescher, B.; Bocinsky, K.; Kintigh, K.; Kohler, T.; McPhillips, T.; Rush, J.

    2016-12-01

    Computational models are used to reconstruct and explain past environments and to predict likely future environments. For example, Bocinsky and Kohler have performed a 2,000-year reconstruction of the rain-fed maize agricultural niche in the US Southwest. The resulting academic publications not only contain traditional method descriptions, figures, etc., but also links to code and data for basic transparency and reproducibility. Examples include ResearchCompendia.org and the new project "Merging Science and Cyberinfrastructure Pathways: The Whole Tale." Provenance information provides a further critical element to understand a published study and to possibly extend or challenge the findings of the original authors. We present different notions and uses of provenance information using a computational archaeology example, e.g., the common use of "provenance for others" (for transparency and reproducibility), but also the more elusive but equally important use of "provenance for self". To this end, we distinguish prospective provenance (a.k.a. workflow) from retrospective provenance (a.k.a. data lineage) and show how combinations of both forms of provenance can be used to answer different kinds of important questions about a workflow and its execution. Since many workflows are developed using scripting or special-purpose languages such as Python and R, we employ an approach and toolkit called YesWorkflow that brings provenance modeling, capture, and querying into the realm of scripting. YesWorkflow employs the basic W3C PROV standard, as well as the ProvONE extension, for sharing and exchanging retrospective and prospective provenance information, respectively. Finally, we argue that the utility of provenance information should be maximized by developing different kinds of provenance questions and queries during the early phases of computational workflow design and implementation.
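    The prospective/retrospective distinction can be sketched with plain Python records rather than the full PROV/ProvONE vocabularies; the workflow steps and file names below are invented for illustration and simplify the real terminology considerably.

```python
# Prospective provenance: the planned workflow (what *should* happen).
prospective = [
    ("load_rainfall", "rainfall.csv", "rainfall_table"),
    ("fit_niche_model", "rainfall_table", "niche_map"),
]

# Retrospective provenance: what actually happened in one execution.
retrospective = [
    {"step": "load_rainfall", "used": "rainfall.csv",
     "generated": "rainfall_table", "at": "2016-07-01"},
    {"step": "fit_niche_model", "used": "rainfall_table",
     "generated": "niche_map", "at": "2016-07-01"},
]

def lineage(artifact, records):
    """Data-lineage query: trace an artifact back through the steps that
    produced it, in execution order."""
    for r in records:
        if r["generated"] == artifact:
            return lineage(r["used"], records) + [r["step"]]
    return []

def conforms(plan, run):
    """Combine both forms: did the execution follow the planned workflow?"""
    return [(r["step"], r["used"], r["generated"]) for r in run] == plan

print(lineage("niche_map", retrospective))  # ['load_rainfall', 'fit_niche_model']
print(conforms(prospective, retrospective))  # True
```

    Real PROV-based tooling answers these same two kinds of questions (lineage queries over retrospective provenance, conformance checks against prospective provenance) over far richer graph structures.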

  4. Obesity, Health at Every Size, and Public Health Policy

    PubMed Central

    2014-01-01

    Obesity is associated with chronic diseases that may negatively affect individuals’ health and the sustainability of the health care system. Despite increasing emphasis on obesity as a major health care issue, little progress has been made in its treatment or prevention. Individual approaches to obesity treatment, largely composed of weight-loss dieting, have not proven effective. Little direct evidence supports the notion of reforms to the “obesogenic environment.” Both these individualistic and environmental approaches to obesity have important limitations and ethical implications. The low levels of success associated with these approaches may necessitate a new non–weight-centric public health strategy. Evidence is accumulating that a weight-neutral, nutrition- and physical activity–based, Health at Every Size (HAES) approach may be a promising chronic disease-prevention strategy. PMID:24328657

  5. Obesity, health at every size, and public health policy.

    PubMed

    Bombak, Andrea

    2014-02-01

    Obesity is associated with chronic diseases that may negatively affect individuals' health and the sustainability of the health care system. Despite increasing emphasis on obesity as a major health care issue, little progress has been made in its treatment or prevention. Individual approaches to obesity treatment, largely composed of weight-loss dieting, have not proven effective. Little direct evidence supports the notion of reforms to the "obesogenic environment." Both these individualistic and environmental approaches to obesity have important limitations and ethical implications. The low levels of success associated with these approaches may necessitate a new non-weight-centric public health strategy. Evidence is accumulating that a weight-neutral, nutrition- and physical activity-based, Health at Every Size (HAES) approach may be a promising chronic disease-prevention strategy.

  6. Translational Scholarship and a Palliative Approach: Enlisting the Knowledge-As-Action Framework.

    PubMed

    Reimer-Kirkham, Sheryl; Doane, Gweneth Hartrick; Antifeau, Elisabeth; Pesut, Barbara; Porterfield, Pat; Roberts, Della; Stajduhar, Kelli; Wikjord, Nicole

    2015-01-01

    Based on a retheorized epistemology for knowledge translation (KT) that problematizes the "know-do gap" and conceptualizes the knower, knowledge, and action as inseparable, this paper describes the application of the Knowledge-As-Action Framework. When applied as a heuristic device to support an inquiry process, the framework, with its metaphor of a kite, facilitates a responsiveness to the complexities that characterize KT. Examples from a KT demonstration project on the integration of a palliative approach at 3 clinical sites illustrate the interrelatedness of six dimensions: the local context, processes, people, knowledge, fluctuating realities, and values.

  7. GSA-PCA: gene set generation by principal component analysis of the Laplacian matrix of a metabolic network

    PubMed Central

    2012-01-01

    Background Gene Set Analysis (GSA) has proven to be a useful approach to microarray analysis. However, most of the method development for GSA has focused on the statistical tests to be used rather than on the generation of sets that will be tested. Existing methods of set generation are often overly simplistic. The creation of sets from individual pathways (in isolation) is a poor reflection of the complexity of the underlying metabolic network. We have developed a novel approach to set generation via the use of Principal Component Analysis of the Laplacian matrix of a metabolic network. We have analysed a relatively simple data set to show the difference in results between our method and the current state-of-the-art pathway-based sets. Results The sets generated with this method are semi-exhaustive and capture much of the topological complexity of the metabolic network. The semi-exhaustive nature of this method has also allowed us to design a hypergeometric enrichment test to determine which genes are likely responsible for set significance. We show that our method finds significant aspects of biology that would be missed (i.e. false negatives) and addresses the false positive rates found with the use of simple pathway-based sets. Conclusions The set generation step for GSA is often neglected but is a crucial part of the analysis as it defines the full context for the analysis. As such, set generation methods should be robust and yield as complete a representation of the extant biological knowledge as possible. The method reported here achieves this goal and is demonstrably superior to previous set analysis methods. PMID:22876834
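    The hypergeometric enrichment test mentioned in the Results can be sketched from first principles; the gene counts below are invented for illustration, not taken from the paper.

```python
from math import comb

def hypergeom_enrichment_p(N, K, n, k):
    """Upper-tail hypergeometric test P(X >= k): the probability of seeing
    k or more 'significant' genes in a set of n drawn from N genes of which
    K are significant overall. Used to ask which genes drive set significance."""
    return sum(
        comb(K, i) * comb(N - K, n - i) for i in range(k, min(K, n) + 1)
    ) / comb(N, n)

# Hypothetical numbers: 1000 genes, 50 significant genome-wide, and a set of
# 20 genes containing 5 significant members (expected by chance: 1).
p = hypergeom_enrichment_p(N=1000, K=50, n=20, k=5)
print(p < 0.01)  # True: the set is enriched well beyond chance
```

    By Vandermonde's identity the tail from k = 0 sums to exactly 1, which makes a convenient sanity check on the implementation.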

  8. Visual affective classification by combining visual and text features.

    PubMed

    Liu, Ningning; Wang, Kai; Jin, Xin; Gao, Boyang; Dellandréa, Emmanuel; Chen, Liming

    2017-01-01

    Affective analysis of images in social networks has drawn much attention, and the texts surrounding images have been shown to provide valuable semantic meanings about image content, which can hardly be represented by low-level visual features. In this paper, we propose a novel approach for the visual affective classification (VAC) task. This approach combines visual representations with novel text features through a fusion scheme based on Dempster-Shafer (D-S) Evidence Theory. Specifically, we not only investigate different types of visual features and fusion methods for VAC, but also propose textual features to effectively capture emotional semantics from the short text associated with images based on word similarity. Experiments are conducted on three publicly available databases: the International Affective Picture System (IAPS), the Artistic Photos and the MirFlickr Affect set. The results demonstrate that the proposed approach combining visual and textual features provides promising results for the VAC task.
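    Dempster-Shafer fusion of two classifiers can be sketched with Dempster's rule of combination; the mass assignments for the hypothetical visual and text classifiers below are invented for illustration and are not the paper's exact scheme.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets: intersect focal elements, accumulate products,
    and renormalize by the non-conflicting mass."""
    combined, conflict = {}, 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            inter = b & c
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mb * mc
            else:
                conflict += mb * mc  # mass assigned to disjoint hypotheses
    norm = 1.0 - conflict
    return {a: v / norm for a, v in combined.items()}

POS, NEG = frozenset({"positive"}), frozenset({"negative"})
ALL = POS | NEG  # the whole frame: mass here expresses ignorance

# Hypothetical evidence: each modality assigns belief mass to the affect
# classes while keeping some mass on ignorance (ALL).
visual = {POS: 0.6, NEG: 0.1, ALL: 0.3}
text = {POS: 0.5, NEG: 0.2, ALL: 0.3}
fused = dempster_combine(visual, text)
print(max(fused, key=fused.get) == POS)  # True: agreement reinforces belief
```

    Because both sources lean toward the positive class, the fused mass on it (about 0.76) exceeds either source alone, which is the behavior that makes D-S theory attractive for combining modalities with explicit uncertainty.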

  9. Visual affective classification by combining visual and text features

    PubMed Central

    Liu, Ningning; Wang, Kai; Jin, Xin; Gao, Boyang; Dellandréa, Emmanuel; Chen, Liming

    2017-01-01

    Affective analysis of images in social networks has drawn much attention, and the texts surrounding images have been shown to provide valuable semantic meanings about image content, which can hardly be represented by low-level visual features. In this paper, we propose a novel approach for the visual affective classification (VAC) task. This approach combines visual representations with novel text features through a fusion scheme based on Dempster-Shafer (D-S) Evidence Theory. Specifically, we not only investigate different types of visual features and fusion methods for VAC, but also propose textual features to effectively capture emotional semantics from the short text associated with images based on word similarity. Experiments are conducted on three publicly available databases: the International Affective Picture System (IAPS), the Artistic Photos and the MirFlickr Affect set. The results demonstrate that the proposed approach combining visual and textual features provides promising results for the VAC task. PMID:28850566

  10. Optimal phase estimation with arbitrary a priori knowledge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demkowicz-Dobrzanski, Rafal

    2011-06-15

    The optimal-phase estimation strategy is derived when partial a priori knowledge on the estimated phase is available. The solution is found with the help of the most famous result from entanglement theory: the positive partial transpose criterion. The structure of the optimal measurements, estimators, and the optimal probe states is analyzed. This Rapid Communication provides a unified framework bridging the gap in the literature on the subject, which until now dealt almost exclusively with two extreme cases: almost perfect knowledge (local approach based on Fisher information) and no a priori knowledge (global approach based on covariant measurements). Special attention is paid to a natural a priori probability distribution arising from a diffusion process.

  11. Hypoxia-based strategies for regenerative dentistry-Views from the different dental fields.

    PubMed

    Müller, Anna Sonja; Janjić, Klara; Lilaj, Bledar; Edelmayer, Michael; Agis, Hermann

    2017-09-01

    The understanding of the cell biological processes underlying development and regeneration of oral tissues leads to novel regenerative approaches. Over the past years, knowledge on key roles of the hypoxia-based response has become more profound. Based on these findings, novel regenerative approaches for dentistry are emerging, which target cellular oxygen sensors. These approaches include hypoxia pre-conditioning and pharmacologically simulated hypoxia. The increase in studies on hypoxia and hypoxia-based strategies in regenerative dentistry highlights the growing attention to hypoxia's role in regeneration and its underlying biology, as well as its application in a therapeutic setting. In this narrative review, we present the current knowledge on the role of hypoxia in oral tissues and review the proposed hypoxia-based approaches in different fields of dentistry, including endodontics, orthodontics, periodontics, and oral surgery. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. The KATE shell: An implementation of model-based control, monitor and diagnosis

    NASA Technical Reports Server (NTRS)

    Cornell, Matthew

    1987-01-01

    The conventional control and monitor software currently used by the Space Center for Space Shuttle processing has many limitations, such as high maintenance costs, limited diagnostic capabilities, and limited simulation support. These limitations motivated the development of a knowledge-based (or model-based) shell to generically control and monitor electro-mechanical systems. The knowledge base describes the system's structure and function and is used by a software shell to do real-time constraint checking, low-level control of components, diagnosis of detected faults, sensor validation, automatic generation of schematic diagrams, and automatic recovery from failures. This approach is more versatile and more powerful than the conventional hard-coded approach and offers many advantages over it, although knowledge-based control and monitor systems may not be appropriate for systems that require high-speed reaction times or are not well understood.

  13. Integrative pathway knowledge bases as a tool for systems molecular medicine.

    PubMed

    Liang, Mingyu

    2007-08-20

    There exists a sense of urgency to begin to generate a cohesive assembly of biomedical knowledge as the pace of knowledge accumulation accelerates. The urgency is in part driven by the emergence of systems molecular medicine that emphasizes the combination of systems analysis and molecular dissection in the future of medical practice and research. A potentially powerful approach is to build integrative pathway knowledge bases that link organ systems function with molecules.

  14. Documenting Art Therapy Clinical Knowledge Using Interviews

    ERIC Educational Resources Information Center

    Regev, Dafna

    2017-01-01

    Practicing art therapists have vast stores of knowledge and experience, but in most cases, their work is not documented, and their clinical knowledge does not enter the academic discourse. This article proposes a systematic approach to the collection of practice knowledge about art therapy based on conducting interviews with art therapists who…

  15. Community-based participatory research and integrated knowledge translation: advancing the co-creation of knowledge.

    PubMed

    Jull, Janet; Giles, Audrey; Graham, Ian D

    2017-12-19

    Better use of research evidence (one form of "knowledge") in health systems requires partnerships between researchers and those who contend with the real-world needs and constraints of health systems. Community-based participatory research (CBPR) and integrated knowledge translation (IKT) are research approaches that emphasize the importance of creating partnerships between researchers and the people for whom the research is ultimately meant to be of use ("knowledge users"). The ways in which these approaches converge and diverge are poorly understood. A better understanding of the similarities and differences between CBPR and IKT will enable researchers to use these approaches appropriately and to leverage best practices and knowledge from each. The co-creation of knowledge holds the promise of significant social impact, and further understanding of how to engage and involve knowledge users in research is needed. We examine the histories and traditions of CBPR and IKT, as well as their points of convergence and divergence. We critically evaluate the ways in which both have the potential to contribute to the development and integration of knowledge in health systems. As distinct research traditions, the underlying drivers and rationale for CBPR and IKT have similarities and differences across the areas of motivation, social location, and ethics; nevertheless, the practices of CBPR and IKT converge upon a common aim: the co-creation of knowledge that is the result of knowledge user and researcher expertise. We argue that while CBPR and IKT both have the potential to contribute evidence to implementation science and practices for collaborative research, clarity about the purpose of the research (social change or application) is a critical feature in the selection of an appropriate collaborative approach to build knowledge. CBPR and IKT bring distinct strengths to a common aim: to foster democratic processes in the co-creation of knowledge.
As research approaches, they create opportunities to challenge assumptions about for whom, how, and what is defined as knowledge, and to develop and integrate research findings into health systems. When used appropriately, CBPR and IKT both have the potential to contribute to and advance implementation science about the conduct of collaborative health systems research.

  16. Improving Professional Practice through Practice-Based Research: VaKE (Values "and" Knowledge Education) in University-Based Teacher Education

    ERIC Educational Resources Information Center

    Weinberger, Alfred; Patry, Jean-Luc; Weyringer, Sieglinde

    2016-01-01

    Evidence suggests that in the professional education of teachers the moral goals are currently a neglected topic in favor of the subject matter and knowledge. The constructivist instructional approach VaKE (Values "and" Knowledge Education) addresses this problem by combining the moral and epistemic goals through the discussion of moral…

  17. Knowledge Acquisition of Generic Queries for Information Retrieval

    PubMed Central

    Seol, Yoon-Ho; Johnson, Stephen B.; Cimino, James J.

    2002-01-01

    Several studies have identified clinical questions posed by health care professionals to understand the nature of information needs during clinical practice. To support access to digital information sources, these information needs must be represented in a computer system. We have developed a conceptual guidance approach to information retrieval, based on a knowledge base that contains the patterns of information needs. The knowledge base uses a formal representation of clinical questions based on the UMLS knowledge sources, called the Generic Query model. To improve the coverage of the knowledge base, we investigated a method for extracting plausible clinical questions from the medical literature. This poster presents the Generic Query model, shows how it is used to represent the patterns of clinical questions, and describes the framework used to extract knowledge from the medical literature.

  18. Combining Knowledge and Data Driven Insights for Identifying Risk Factors using Electronic Health Records

    PubMed Central

    Sun, Jimeng; Hu, Jianying; Luo, Dijun; Markatou, Marianthi; Wang, Fei; Edabollahi, Shahram; Steinhubl, Steven E.; Daar, Zahra; Stewart, Walter F.

    2012-01-01

    Background: The ability to identify the risk factors related to an adverse condition, e.g., a heart failure (HF) diagnosis, is very important for improving care quality and reducing cost. Existing approaches for risk factor identification are either knowledge driven (from guidelines or the literature) or data driven (from observational data). No existing method provides a model to effectively combine expert knowledge with data-driven insight for risk factor identification. Methods: We present a systematic approach to enhance known knowledge-based risk factors with additional potential risk factors derived from data. The core of our approach is a sparse regression model with regularization terms that correspond to both knowledge- and data-driven risk factors. Results: The approach is validated using a large dataset containing 4,644 heart failure cases and 45,981 controls. The outpatient electronic health records (EHRs) for these patients include diagnoses, medications, and lab results from 2003–2010. We demonstrate that the proposed method can identify complementary risk factors that are not among the existing known factors and can better predict the onset of HF. We quantitatively compare different sets of risk factors in the context of predicting the onset of HF using the area under the ROC curve (AUC) as the performance metric. The combined knowledge- and data-driven risk factors significantly outperform knowledge-based risk factors alone. Furthermore, the additional risk factors were confirmed to be clinically meaningful by a cardiologist. Conclusion: We present a systematic framework for combining knowledge- and data-driven insights for risk factor identification. We demonstrate the power of this framework in the context of predicting the onset of HF, where our approach can successfully identify intuitive and predictive risk factors beyond a set of known HF risk factors. PMID:23304365
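    The core idea of a sparse regression that treats knowledge-driven and data-driven risk factors differently can be sketched as L1-penalized least squares with per-feature penalty weights: known factors get a light penalty, candidate factors a heavy one. This is a simplified stand-in for the paper's model, fit by proximal gradient descent on synthetic data.

```python
import numpy as np

def weighted_lasso(X, y, lam, n_iter=2000, lr=0.01):
    """Least squares with per-feature L1 penalties, solved by proximal
    gradient descent (gradient step, then per-feature soft-thresholding)."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n
        w -= lr * grad
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = X[:, 0] * 2.0 + X[:, 1] * 1.5 + rng.normal(scale=0.1, size=200)

# Feature 0 plays the role of a 'known' risk factor (light penalty);
# features 1-3 are data-driven candidates (heavy penalty).
lam = np.array([0.01, 0.5, 0.5, 0.5])
w = weighted_lasso(X, y, lam)
print(abs(w[0] - 2.0) < 0.1, abs(w[3]) < 0.05)
```

    The known factor is recovered almost unshrunk, a genuinely predictive candidate (feature 1) survives with some shrinkage, and the irrelevant candidate (feature 3) is driven to zero, which mirrors how such a model keeps guideline factors while admitting only strongly supported new ones.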

  19. Translating knowledge into practice: An exploratory study of dementia-specific training for community-based service providers.

    PubMed

    O'Sullivan, Grace; Hocking, Clare; McPherson, Kathryn

    2017-08-01

    Objective To develop, deliver, and evaluate dementia-specific training designed to inform service delivery by enhancing the knowledge of community-based service providers. Methods This exploratory qualitative study used an interdisciplinary, interuniversity team approach to develop and deliver dementia-specific training. Participants included management, care staff, and clients from three organizations funded to provide services in the community. Data on the acceptability, applicability, and perceived outcomes of the training were gathered through focus group discussions and individual interviews. Transcripts were analyzed to generate open codes which were clustered into themes and sub-themes addressing the content, delivery, and value of the training. Findings Staff valued up-to-date knowledge and "real stories" grounded in practice. Clients welcomed the strengths-based approach. Contractual obligations impact on the application of knowledge in practice. Implications The capacity to implement new knowledge may be limited by the legislative policies which frame service provision, to the detriment of service users.

  20. Treating the Juvenile Offender

    ERIC Educational Resources Information Center

    Hoge, Robert D., Ed.; Guerra, Nancy G., Ed.; Boxer, Paul, Ed.

    2008-01-01

    This authoritative, highly readable reference and text is grounded in the latest knowledge on how antisocial and criminal behavior develops in youth and how it can effectively be treated. Contributors describe proven ways to reduce juvenile delinquency by targeting specific risk factors and strengthening young people's personal, family, and…
