Sample records for design knowledge capture

  1. The elements of design knowledge capture

    NASA Technical Reports Server (NTRS)

    Freeman, Michael S.

    1988-01-01

    This paper will present the basic constituents of a design knowledge capture effort. This will include a discussion of the types of knowledge to be captured in such an effort and the difference between design knowledge capture and more traditional knowledge base construction. These differences include both knowledge base structure and knowledge acquisition approach. The motivation for establishing a design knowledge capture effort as an integral part of major NASA programs will be outlined, along with the current NASA position on that subject. Finally the approach taken in design knowledge capture for Space Station will be contrasted with that used in the HSTDEK project.

  2. Design knowledge capture for the space station

    NASA Technical Reports Server (NTRS)

    Crouse, K. R.; Wechsler, D. B.

    1987-01-01

    The benefits of design knowledge availability are identifiable and pervasive. The implementation of design knowledge capture and storage using current technology increases the probability of success, while providing for a degree of access compatibility with future applications. The space station design definition should be expanded to include design knowledge. Design knowledge should be captured. A critical timing relationship exists between the space station development program and the implementation of this project.

  3. Case-Based Capture and Reuse of Aerospace Design Rationale

    NASA Technical Reports Server (NTRS)

    Leake, David B.

    2001-01-01

    The goal of this project was to apply artificial intelligence techniques to facilitate capture and reuse of aerospace design rationale. The project combined case-based reasoning (CBR) and concept maps (CMaps) to develop methods for capturing, organizing, and interactively accessing records of experiences encapsulating the methods and rationale underlying expert aerospace design, in order to bring the captured knowledge to bear in support of future reasoning. The project's results contribute both principles and methods for effective design-aiding systems that aid capture and access of useful design knowledge. The project has been guided by the tenets that design-aiding systems must: (1) Leverage a designer's knowledge, rather than attempting to replace it; (2) Be able to reflect different designers' differing conceptualizations of the design task, and to clarify those conceptualizations to others; (3) Include capabilities to capture information both by interactive knowledge modeling and during normal use; and (4) Integrate into normal designer tasks as naturally and unobtrusively as possible.
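    The core CBR step the abstract describes can be sketched as follows. This is an illustrative sketch only, not the project's actual code: the case library, feature names, and similarity measure (Jaccard overlap) are assumptions chosen to show how a past design experience might be retrieved for reuse.

```python
# Minimal case-based retrieval sketch: past design "cases" are indexed by
# feature sets, and the closest match to a new design problem is retrieved.

def jaccard(a, b):
    """Similarity between two feature sets, in [0, 1]."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def retrieve(case_library, query_features):
    """Return the stored case whose features best match the query."""
    return max(case_library, key=lambda case: jaccard(case["features"], query_features))

# Hypothetical case library of past design experiences.
cases = [
    {"name": "wing-spar-layout", "features": {"structure", "aluminum", "fatigue"}},
    {"name": "actuator-sizing",  "features": {"hydraulics", "control", "sizing"}},
]

best = retrieve(cases, {"structure", "fatigue", "inspection"})
print(best["name"])  # -> wing-spar-layout
```

    In a fuller system of the kind described, each retrieved case would also carry its rationale and links into a concept map, so the designer sees why the past solution took the form it did.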

  4. Study of Design Knowledge Capture (DKC) schemes implemented in magnetic bearing applications

    NASA Technical Reports Server (NTRS)

    1990-01-01

    A design knowledge capture (DKC) scheme was implemented using frame-based techniques. The objective of such a system is to capture not only the knowledge which describes a design, but also that which explains how the design decisions were reached. These knowledge types were labelled definitive and explanatory, respectively. Examination of the design process helped determine what knowledge to retain and at what stage that knowledge is used. A discussion of frames resulted in the recognition of their value to knowledge representation and organization. The FORMS frame system was used as a basis for further development, and for examples using magnetic bearing design. The specific contributions made by this research include: determination that frame-based systems provide a useful methodology for management and application of design knowledge; definition of specific user interface requirements (a window-based browser); specification of syntax for DKC commands; and demonstration of the feasibility of DKC by applications to existing designs. It was determined that design knowledge capture could become an extremely valuable engineering tool for complicated, long-life systems, but that further work was needed, particularly the development of a graphic, window-based interface.
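    The definitive/explanatory split the abstract draws can be made concrete with a toy frame. This is a sketch, not the FORMS system itself; the slot names, values, and rationale text are hypothetical, and only frame inheritance of definitive slots is shown.

```python
# Frame sketch: "definitive" slots describe what the design is; "explanatory"
# slots record why a decision was made. Child frames inherit definitive slots.

class Frame:
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.definitive = {}    # slots describing the design itself
        self.explanatory = {}   # slots recording decision rationale

    def get(self, slot):
        """Look up a definitive slot, inheriting from parent frames."""
        if slot in self.definitive:
            return self.definitive[slot]
        return self.parent.get(slot) if self.parent else None

bearing = Frame("magnetic-bearing")
bearing.definitive["suspension"] = "active electromagnetic"

rotor_bearing = Frame("rotor-bearing", parent=bearing)
rotor_bearing.definitive["load-capacity-N"] = 450  # hypothetical value
rotor_bearing.explanatory["load-capacity-N"] = (
    "sized for worst-case rotor imbalance plus margin"  # hypothetical rationale
)

print(rotor_bearing.get("suspension"))  # inherited -> active electromagnetic
```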

  5. Hubble Space Telescope Design Engineering Knowledgebase (HSTDEK)

    NASA Technical Reports Server (NTRS)

    Johannes, James D.; Everetts, Clark

    1989-01-01

    The research covered here pays specific attention to the development of tools to assist knowledge engineers in acquiring knowledge and to assist other technical, engineering, and management personnel in automatically performing knowledge capture as part of their everyday work without adding any extra work to what they already do. Requirements for data products, the knowledge base, and methods for mapping knowledge in the documents onto the knowledge representations are discussed, as are some of the difficulties of capturing in the knowledge base the structure of the design process itself, along with a model of the system designed. The capture of knowledge describing the interactions of different components is also discussed briefly.

  6. An approach to design knowledge capture for the space station

    NASA Technical Reports Server (NTRS)

    Wechsler, D. B.; Crouse, K. R.

    1986-01-01

    The design of NASA's space station has begun. During the design cycle, and after activation of the space station, the recurring need will exist to access not only designs, but also deeper knowledge about the designs, which is only hinted at in the design definition. Areas benefiting from this knowledge include training, fault management, and onboard automation. NASA's Artificial Intelligence Office at Johnson Space Center and The MITRE Corporation have conceptualized an approach for capture and storage of design knowledge.

  7. An Approach To Design Knowledge Capture For The Space Station

    NASA Astrophysics Data System (ADS)

    Wechsler, D. B.; Crouse, K. R.

    1987-02-01

    Design of NASA's Space Station has begun. During the design cycle, and after activation of the Space Station, the recurring need will exist to access not only designs, but also deeper knowledge about the designs, which is only hinted at in the design definition. Areas benefiting from this knowledge include training, fault management, and onboard automation. NASA's Artificial Intelligence Office at Johnson Space Center and The MITRE Corporation have conceptualized an approach for capture and storage of design knowledge.

  8. An approach to design knowledge capture for the space station

    NASA Technical Reports Server (NTRS)

    Wechsler, D. B.; Crouse, K. R.

    1987-01-01

    The design of NASA's space station has begun. During the design cycle, and after activation of the space station, the recurring need will exist to access not only designs, but also deeper knowledge about the designs, which is only hinted at in the design definition. Areas benefiting from this knowledge include training, fault management, and onboard automation. NASA's Artificial Intelligence Office at Johnson Space Center and The MITRE Corporation have conceptualized an approach for capture and storage of design knowledge.

  9. A knowledge-based system design/information tool

    NASA Technical Reports Server (NTRS)

    Allen, James G.; Sikora, Scott E.

    1990-01-01

    The objective of this effort was to develop a Knowledge Capture System (KCS) for the Integrated Test Facility (ITF) at the Dryden Flight Research Facility (DFRF). The DFRF is a NASA Ames Research Center (ARC) facility. This system was used to capture the design and implementation information for NASA's high angle-of-attack research vehicle (HARV), a modified F/A-18A. In particular, the KCS was used to capture specific characteristics of the design of the HARV fly-by-wire (FBW) flight control system (FCS). The KCS utilizes artificial intelligence (AI) knowledge-based system (KBS) technology. The KCS enables the user to capture the following characteristics of automated systems: the system design; the hardware (H/W) design and implementation; the software (S/W) design and implementation; and the utilities (electrical and hydraulic) design and implementation. A generic version of the KCS was developed which can be used to capture the design information for any automated system. The deliverable items for this project consist of the prototype generic KCS and an application, which captures selected design characteristics of the HARV FCS.

  10. Engineering design knowledge recycling in near-real-time

    NASA Technical Reports Server (NTRS)

    Leifer, Larry; Baya, Vinod; Toye, George; Baudin, Catherine; Underwood, Jody Gevins

    1994-01-01

    It is hypothesized that the capture and reuse of machine readable design records is cost beneficial. This informal engineering notebook design knowledge can be used to model the artifact and the design process. Design rationale is, in part, preserved and available for examination. Redesign cycle time is significantly reduced (Baya et al, 1992). These factors contribute to making it less costly to capture and reuse knowledge than to recreate comparable knowledge (current practice). To test the hypothesis, we have focused on validation of the concept and tools in two 'real design' projects this past year: (1) a short (8 month) turnaround project for NASA life science bioreactor researchers was done by a team of three mechanical engineering graduate students at Stanford University (in a class, ME210abc 'Mechatronic Systems Design and Methodology' taught by one of the authors, Leifer); and (2) a long range (8 to 20 year) international consortium project for NASA's Space Science program (STEP: satellite test of the equivalence principle). Design knowledge capture was supported this year by assigning the use of a Team-Design PowerBook. Design records were cataloged in near-real time. These records were used to qualitatively model the artifact design as it evolved. Dedal, an 'intelligent librarian' developed at NASA-ARC, was used to navigate and retrieve captured knowledge for reuse.

  11. A Design Rationale Capture Tool to Support Design Verification and Re-use

    NASA Technical Reports Server (NTRS)

    Hooey, Becky Lee; Da Silva, Jonny C.; Foyle, David C.

    2012-01-01

    A design rationale tool (DR tool) was developed to capture design knowledge to support design verification and design knowledge re-use. The design rationale tool captures design drivers and requirements, and documents the design solution including: intent (why it is included in the overall design); features (why it is designed the way it is); information about how the design components support design drivers and requirements; and, design alternatives considered but rejected. For design verification purposes, the tool identifies how specific design requirements were met and instantiated within the final design, and which requirements have not been met. To support design re-use, the tool identifies which design decisions are affected when design drivers and requirements are modified. To validate the design tool, the design knowledge from the Taxiway Navigation and Situation Awareness (T-NASA; Foyle et al., 1996) system was captured and the DR tool was exercised to demonstrate its utility for validation and re-use.
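    The two queries the abstract attributes to the DR tool, tracing which decisions a requirement change affects and listing unmet requirements, can be sketched with plain sets. This is an illustrative sketch, not the DR tool's API; the decision and requirement identifiers are hypothetical.

```python
# Sketch of requirement-to-decision traceability: each design decision
# records which requirements it meets, plus its rationale.

decisions = {
    "D1-head-up-display": {"meets": {"R1-taxi-awareness"},
                           "rationale": "keeps pilot's eyes out the window"},
    "D2-moving-map":      {"meets": {"R1-taxi-awareness", "R2-route-display"},
                           "rationale": "shows the cleared taxi route"},
}
requirements = {"R1-taxi-awareness", "R2-route-display", "R3-low-visibility-ops"}

def affected_by(req):
    """Decisions to revisit if a requirement is modified."""
    return {d for d, info in decisions.items() if req in info["meets"]}

def unmet():
    """Requirements no current decision satisfies (for verification)."""
    met = set().union(*(info["meets"] for info in decisions.values()))
    return requirements - met

print(sorted(affected_by("R1-taxi-awareness")))
print(unmet())  # -> {'R3-low-visibility-ops'}
```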

  12. Theory and ontology for sharing temporal knowledge

    NASA Technical Reports Server (NTRS)

    Loganantharaj, Rasiah

    1996-01-01

    Using current technology, the sharing or re-using of knowledge bases is very difficult, if not impossible. ARPA has correctly recognized the problem and funded a knowledge sharing initiative. One of the outcomes of this project is a formal language called Knowledge Interchange Format (KIF) for representing knowledge that can be translated into other languages. Capturing and representing design knowledge, and reasoning with it, have become very important for NASA, a pioneer of innovative design of unique products. For upgrading an existing design to meet changing technology, needs, or requirements, it is essential to understand the design rationale, design choices, options, and other relevant information associated with the design. Capturing such information and presenting it in the appropriate form are part of NASA's ongoing Design Knowledge Capture project. The behavior of an object, and various other aspects related to time, are captured by the appropriate temporal knowledge. The captured design knowledge will be represented in such a way that the various NASA groups interested in different aspects of the design cycle can access and use it effectively. To facilitate knowledge sharing among these groups, one has to develop a well-defined ontology. An ontology is a specification of a conceptualization. In the literature, several specific domains have been studied and well-defined ontologies developed for them. However, little or no work has been done on representing temporal knowledge to facilitate sharing. During the ASEE summer program, I investigated several temporal models and proposed a theory of time that is flexible enough to accommodate time elements such as points and intervals, and that is capable of handling qualitative and quantitative temporal constraints.
    I also proposed a primitive temporal ontology from which other relevant temporal ontologies can be built. I have investigated various issues of knowledge sharing and have proposed a formal framework for modeling the concept of knowledge sharing. This work may be implemented and tested in the software environment supplied by Knowledge Based System, Inc.
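    The kind of temporal primitive the abstract proposes, intervals that support both quantitative checks (durations) and qualitative relations, can be sketched as below. The relation names follow Allen's standard interval algebra, which is assumed here as a reference point; the report's own theory and vocabulary may differ.

```python
# Sketch of a temporal primitive: intervals with numeric endpoints.
# Quantitative knowledge = durations; qualitative knowledge = named
# relations between intervals ("before", "during", "equal", ...).

from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    start: float
    end: float

    def duration(self):
        return self.end - self.start

def relation(a, b):
    """A few of Allen's thirteen qualitative interval relations."""
    if a.end < b.start:
        return "before"
    if a.start > b.start and a.end < b.end:
        return "during"
    if a.start == b.start and a.end == b.end:
        return "equal"
    return "overlaps-or-other"

design_review = Interval(0, 5)
fabrication = Interval(6, 20)
print(relation(design_review, fabrication))  # -> before
```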

  13. Expert knowledge maps for knowledge management: a case study in Traditional Chinese Medicine research.

    PubMed

    Cui, Meng; Yang, Shuo; Yu, Tong; Yang, Ce; Gao, Yonghong; Zhu, Haiyan

    2013-10-01

    To design a model to capture information on the state and trends of knowledge creation, at both an individual and an organizational level, in order to enhance knowledge management. We designed a graph-theoretic knowledge model, the expert knowledge map (EKM), based on literature-based annotation. A case study in the domain of Traditional Chinese Medicine research was used to illustrate the usefulness of the model. The EKM successfully captured various aspects of knowledge and enhanced knowledge management within the case-study organization through the provision of knowledge graphs, expert graphs, and expert-knowledge biography. Our model could help to reveal the hot topics, trends, and products of the research done by an organization. It can potentially be used to facilitate knowledge learning, sharing and decision-making among researchers, academicians, students, and administrators of organizations.
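    The graph-theoretic idea behind the EKM can be illustrated with a tiny bipartite expert-topic graph built from literature annotations. This is a sketch under assumptions, not the published model: the annotation pairs are invented, and "hot topics" are read off as simple degree counts.

```python
# Sketch of an expert knowledge map: (expert, topic) annotation pairs form
# a bipartite graph; topic degree approximates research "hotness".

from collections import Counter

# Hypothetical annotations mined from an organization's publications.
annotations = [
    ("Cui", "knowledge graphs"), ("Yang", "knowledge graphs"),
    ("Yu", "herbal formulas"), ("Cui", "herbal formulas"),
    ("Gao", "knowledge graphs"),
]

topic_degree = Counter(topic for _, topic in annotations)
hot_topic, count = topic_degree.most_common(1)[0]
print(hot_topic, count)  # -> knowledge graphs 3

# Expert graph: who works on a given topic (basis for an expert biography).
experts_on = {t: {e for e, tt in annotations if tt == t} for _, t in annotations}
print(sorted(experts_on["knowledge graphs"]))  # -> ['Cui', 'Gao', 'Yang']
```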

  14. TARGET: Rapid Capture of Process Knowledge

    NASA Technical Reports Server (NTRS)

    Ortiz, C. J.; Ly, H. V.; Saito, T.; Loftin, R. B.

    1993-01-01

    TARGET (Task Analysis/Rule Generation Tool) represents a new breed of tool that blends graphical process flow modeling capabilities with the function of a top-down reporting facility. Since NASA personnel frequently perform tasks that are primarily procedural in nature, TARGET models mission or task procedures and generates hierarchical reports as part of the process capture and analysis effort. Historically, capturing knowledge has proven to be one of the greatest barriers to the development of intelligent systems. Current practice generally requires lengthy interactions between the expert whose knowledge is to be captured and the knowledge engineer whose responsibility is to acquire and represent the expert's knowledge in a useful form. Although much research has been devoted to the development of methodologies and computer software to aid in the capture and representation of some types of knowledge, procedural knowledge has received relatively little attention. In essence, TARGET is one of the first tools of its kind, commercial or institutional, that is designed to support this type of knowledge capture undertaking. This paper will describe the design and development of TARGET for the acquisition and representation of procedural knowledge. The strategies employed by TARGET to support use by knowledge engineers, subject matter experts, programmers, and managers will be discussed. This discussion includes the method by which the tool employs its graphical user interface to generate a task hierarchy report. Next, the approach to generating production rules for incorporation in and development of a CLIPS-based expert system will be elaborated. TARGET also permits experts to visually describe procedural tasks, serving as a common medium for knowledge refinement by the expert community and knowledge engineer and making knowledge consensus possible. The paper briefly touches on the verification and validation issues facing the CLIPS rule generation aspects of TARGET.
A description of efforts to support TARGET's interoperability issues on PCs, Macintoshes and UNIX workstations concludes the paper.
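    The translation TARGET performs, from a captured step sequence to production rules, can be sketched as below. This is illustrative only: the rule template uses generic CLIPS `defrule`/`assert`/`retract` syntax chained by a step-counter fact, and the task content is hypothetical; TARGET's actual generated rules may look different.

```python
# Sketch: turn a captured procedural step sequence into CLIPS-style
# production rules, one rule per step, chained by a (current-step ...) fact.

def steps_to_clips(task, steps):
    """Emit one CLIPS rule string per procedural step."""
    rules = []
    for i, step in enumerate(steps):
        rules.append(
            f"(defrule {task}-step-{i + 1}\n"
            f"   ?f <- (current-step {task} {i})\n"
            f"   =>\n"
            f"   (retract ?f)\n"
            f"   (assert (current-step {task} {i + 1}))\n"
            f"   (printout t \"{step}\" crlf))"
        )
    return rules

rules = steps_to_clips("power-up", ["Verify breakers closed", "Apply main bus power"])
print(rules[0])
```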

  15. Software support environment design knowledge capture

    NASA Technical Reports Server (NTRS)

    Dollman, Tom

    1990-01-01

    The objective of this task is to assess the potential for using the software support environment (SSE) workstations and associated software for design knowledge capture (DKC) tasks. This assessment will include the identification of required capabilities for DKC and hardware/software modifications needed to support DKC. Several approaches to achieving this objective are discussed and interim results are provided: (1) research into the problem of knowledge engineering in a traditional computer-aided software engineering (CASE) environment, like the SSE; (2) research into the problem of applying SSE CASE tools to develop knowledge based systems; and (3) direct utilization of SSE workstations to support a DKC activity.

  16. Knowledge Capture and Management for Space Flight Systems

    NASA Technical Reports Server (NTRS)

    Goodman, John L.

    2005-01-01

    The incorporation of knowledge capture and knowledge management strategies early in the development phase of an exploration program is necessary for safe and successful missions of human and robotic exploration vehicles over the life of a program. Following the transition from the development to the flight phase, loss of underlying theory and rationale governing design and requirements occur through a number of mechanisms. This degrades the quality of engineering work resulting in increased life cycle costs and risk to mission success and safety of flight. Due to budget constraints, concerned personnel in legacy programs often have to improvise methods for knowledge capture and management using existing, but often sub-optimal, information technology and archival resources. Application of advanced information technology to perform knowledge capture and management would be most effective if program wide requirements are defined at the beginning of a program.

  17. Design knowledge capture for a corporate memory facility

    NASA Technical Reports Server (NTRS)

    Boose, John H.; Shema, David B.; Bradshaw, Jeffrey M.

    1990-01-01

    Currently, much of the information regarding decision alternatives and trade-offs made in the course of a major program development effort is not represented or retained in a way that permits computer-based reasoning over the life cycle of the program. The loss of this information results in problems in tracing design alternatives to requirements, in assessing the impact of changes in requirements, and in configuration management. To address these problems, the problem was studied of building an intelligent, active corporate memory facility which would provide for the capture of the requirements and standards of a program, analyze the design alternatives and trade-offs made over the program's lifetime, and examine relationships between requirements and design trade-offs. Early phases of the work have concentrated on design knowledge capture for the Space Station Freedom. Tools which help automate and document engineering trade studies are being demonstrated and extended, and another tool is being developed to help designers interactively explore design alternatives and constraints.

  18. Conservation of design knowledge [of large complex spaceborne systems]

    NASA Technical Reports Server (NTRS)

    Sivard, Cecilia; Zweben, Monte; Cannon, David; Lakin, Fred; Leifer, Larry

    1989-01-01

    This paper presents an approach for acquiring knowledge about a design during the design process. The objective is to increase the efficiency of the lifecycle management of a space-borne system by providing operational models of the system's structure and behavior, as well as the design rationale, to human and automated operators. A design knowledge acquisition system is under development that compares how two alternative design versions meet the system requirements as a means for automatically capturing rationale for design changes.

  19. Describing content in middle school science curricula

    NASA Astrophysics Data System (ADS)

    Schwarz-Ballard, Jennifer A.

    As researchers and designers, we intuitively recognize differences between curricula and describe them in terms of design strategy: project-based, laboratory-based, modular, traditional, and textbook, among others. We assume that practitioners recognize the differences in how each requires that students use knowledge; however, these intuitive differences have not been captured or systematically described by the existing languages for describing learning goals. In this dissertation I argue that we need new ways of capturing relationships among elements of content, and propose a theory that describes some of the important differences in how students reason in differently designed curricula and activities. Educational researchers and curriculum designers have taken a variety of approaches to laying out learning goals for science. Through an analysis of existing descriptions of learning goals, I argue that to describe differences in the understanding students come away with, they need to (1) be specific about the form of knowledge, (2) incorporate both the processes through which knowledge is used and its form, and (3) capture content development across a curriculum. To show the value of inquiry curricula, learning goals need to incorporate distinctions among the variety of ways we ask students to use knowledge. Here I propose the Epistemic Structures Framework as one way to describe differences in students' reasoning that are not captured by existing descriptions of learning goals. The usefulness of the Epistemic Structures Framework is demonstrated in the four curriculum case study examples in Part II of this work. The curricula in the case studies represent a range of content coverage, curriculum structure, and design rationale. They serve both to illustrate the Epistemic Structures analysis process and to make the case that it does in fact describe learning goals in a way that captures important differences in students' reasoning in differently designed curricula.
Describing learning goals in terms of Epistemic Structures provides one way to define what we mean when we talk about "project-based" curricula and demonstrate its "value added" to educators, administrators and policy makers.

  20. E-Learning as a Knowledge Management Approach for Intellectual Capital Utilization

    ERIC Educational Resources Information Center

    Shehabat, Issa; Mahdi, Saad A.; Khoualdi, Kamel

    2008-01-01

    This paper addresses human resources utilization at the university environment. We address the design issues of e-learning courses that can capture the teacher knowledge. The underlying objective is that e-learning is a key knowledge and major resources for many universities. Therefore, the design of e-learning should be an important part of the…

  1. Concurrent engineering design and management knowledge capture

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The topics are presented in viewgraph form and include the following: real-time management, personnel management, project management, conceptual design and decision making; the SITRF design problem; and the electronic-design notebook.

  2. U.S. Spacesuit Knowledge Capture Accomplishments in Fiscal Year 2015

    NASA Technical Reports Server (NTRS)

    Chullen, Cinda; Oliva, Vladenka R.

    2016-01-01

    The NASA U.S. Spacesuit Knowledge Capture (SKC) Program continues to capture, share, and archive significant spacesuit-related knowledge with engineers and other technical staff and invested entities. Since its 2007 inception, the SKC Program has hosted and recorded more than 75 events. By the end of Fiscal Year (FY) 2015, 40 of these were processed and uploaded to a publicly accessible NASA Web site where viewers can expand their knowledge about the spacesuit's evolution, known capabilities and limitations, and lessons learned. Sharing this knowledge with entities beyond NASA not only increases people's understanding of the technical effort and importance involved in designing a spacesuit, it also expands the interest and support in this valuable program, which ensures significant knowledge is retained and accessible. This paper discusses the FY 2015 SKC events, the release and accessibility of the approved events, and the program's future plans.

  3. U.S. Spacesuit Knowledge Capture Accomplishments in Fiscal Year 2015

    NASA Technical Reports Server (NTRS)

    Chullen, Cinda; Oliva, Vladenka R.

    2016-01-01

    The NASA U.S. Spacesuit Knowledge Capture (SKC) Program continues to capture, share, and archive significant spacesuit-related knowledge with engineers and other technical staff and invested entities. Since its 2007 inception, the SKC Program has hosted and recorded more than 65 events. By the end of Fiscal Year (FY) 2015, 40 of these were processed and uploaded to a publicly accessible NASA Web site where viewers can expand their knowledge about the spacesuit's evolution, known capabilities and limitations, and lessons learned. Sharing this knowledge with entities beyond NASA not only increases people's understanding of the technical effort and importance involved in designing a spacesuit, it also expands the interest and support in this valuable program, which ensures significant knowledge is retained and accessible. This paper discusses the FY 2015 SKC events, the release and accessibility of the approved events, and the program's future plans.

  4. US Spacesuit Knowledge Capture

    NASA Technical Reports Server (NTRS)

    Chullen, Cinda; Thomas, Ken; McMann, Joe; Dolan, Kristi; Bitterly, Rose; Lewis, Cathleen

    2011-01-01

    The ability to learn from both the mistakes and successes of the past is vital to assuring success in the future. Due to the close physical interaction between spacesuit systems and human beings as users, spacesuit technology and usage lend themselves rather uniquely to the benefits realized from the skillful organization of historical information; its dissemination; the collection and identification of artifacts; and the education of those in the field. The National Aeronautics and Space Administration (NASA), other organizations, and individuals have been performing United States (U.S.) Spacesuit Knowledge Capture since the beginning of space exploration. Avenues used to capture the knowledge have included publication of reports; conference presentations; specialized seminars; and classes usually given by veterans in the field. More recently the effort has become more concentrated and formalized: a new avenue of spacesuit knowledge capture has been added to the archives, in which both current and retired specialists in the field are videotaped presenting technical material specifically for education and preservation of knowledge. With video archiving, all these avenues of learning can now be brought to life, with the real experts presenting their wealth of knowledge on screen for future learners to enjoy. Scope and topics of U.S. spacesuit knowledge capture have included lessons learned in spacesuit technology; experience from the Gemini, Apollo, Skylab, and Shuttle programs; hardware certification, design, development, and other program components; spacesuit evolution and experience; failure analysis and resolution; and aspects of program management. Concurrently, U.S. spacesuit knowledge capture activities have progressed to a level where NASA, the National Air and Space Museum (NASM), Hamilton Sundstrand (HS), and the spacesuit community are now working together to provide a comprehensive closed-loop spacesuit knowledge capture system which includes

  5. Preliminary Design of a Consultation Knowledge-Based System for the Minimization of Distortion in Welded Structures

    DTIC Science & Technology

    1989-02-01

    which capture the knowledge of such experts. These Expert Systems, or Knowledge-Based Systems, differ from the usual computer programming techniques...their applications in the fields of structural design and welding is reviewed. Expert Systems, or KBES, are computer programs using AI...procedurally constructed as conventional computer programs usually are; the knowledge base of such systems is executable, unlike databases

  6. Using Modern Technologies to Capture and Share Indigenous Astronomical Knowledge

    NASA Astrophysics Data System (ADS)

    Nakata, Martin; Hamacher, Duane W.; Warren, John; Byrne, Alex; Pagnucco, Maurice; Harley, Ross; Venugopal, Srikumar; Thorpe, Kirsten; Neville, Richard; Bolt, Reuben

    2014-06-01

    Indigenous Knowledge is important for Indigenous communities across the globe and for the advancement of our general scientific knowledge. In particular, Indigenous astronomical knowledge integrates many aspects of Indigenous Knowledge, including seasonal calendars, navigation, food economics, law, ceremony, and social structure. Capturing, managing, and disseminating this knowledge in the digital environment poses a number of challenges, which we aim to address using a collaborative project emerging between experts in the higher education, library, archive and industry sectors. Using Microsoft's WorldWide Telescope and Rich Interactive Narratives technologies, we propose to develop software, media design, and archival management solutions to allow Indigenous communities to share their astronomical knowledge with the world on their terms and in a culturally sensitive manner.

  7. Space Shuttle Guidance, Navigation, and Rendezvous Knowledge Capture Reports. Revision 1

    NASA Technical Reports Server (NTRS)

    Goodman, John L.

    2011-01-01

    This document is a catalog and reader's guide to lessons learned, experience, and technical history reports, as well as compilation volumes prepared by United Space Alliance personnel for the NASA/Johnson Space Center (JSC) Flight Dynamics Division. It is intended to make it easier for future generations of engineers to locate knowledge capture documentation from the Shuttle Program. The first chapter covers observations on documentation quality and research challenges encountered during the Space Shuttle and Orion programs. The second chapter covers the knowledge capture approach used to create many of the reports covered in this document. These chapters are intended to provide future flight programs with insight that could be used to formulate knowledge capture and management strategies. The following chapters contain descriptions of each knowledge capture report. The majority of the reports concern the Space Shuttle. Three are included that were written in support of the Orion Program. Most of the reports were written from the years 2001 to 2011. Lessons learned reports concern primarily the shuttle Global Positioning System (GPS) upgrade and the knowledge capture process. Experience reports on navigation and rendezvous provide examples of how challenges were overcome and how best practices were identified and applied. Some reports are of a more technical history nature covering navigation and rendezvous. They provide an overview of mission activities and the evolution of operations concepts and trajectory design. The lessons learned, experience, and history reports would be considered secondary sources by historians and archivists.

  8. Pedagogical-Research Designs to Capture the Symbiotic Nature of Professional Knowledge and Learning about e-Learning in Initial Teacher Education in the UK

    ERIC Educational Resources Information Center

    Turvey, Keith

    2010-01-01

    This paper argues that if new communications technologies and online spaces are to yield "new relationship[s] with learners" then research that is tuned to recognize, capture and explain the pedagogical processes at the center of such interactions is vital. This has implications for the design of pedagogical activities within Initial…

  9. On the acquisition and representation of procedural knowledge

    NASA Technical Reports Server (NTRS)

    Saito, T.; Ortiz, C.; Loftin, R. B.

    1992-01-01

    Historically, knowledge acquisition has proven to be one of the greatest barriers to the development of intelligent systems. Current practice generally requires lengthy interactions between the expert whose knowledge is to be captured and the knowledge engineer whose responsibility is to acquire and represent the knowledge in a useful form. Although much research has been devoted to the development of methodologies and computer software to aid in the capture and representation of some types of knowledge, little attention has been devoted to procedural knowledge. NASA personnel frequently perform tasks that are primarily procedural in nature. Previous work in the field of knowledge acquisition is reviewed, and the focus then turns to knowledge acquisition for procedural tasks, with special attention devoted to the Navy's VISTA tool. The design and development of a system for the acquisition and representation of procedural knowledge, TARGET (Task Analysis and Rule Generation Tool), are described. TARGET is intended as a tool that permits experts to visually describe procedural tasks and as a common medium for knowledge refinement by the expert and knowledge engineer. The system is designed to represent the acquired knowledge in the form of production rules. Systems such as TARGET have the potential to profoundly reduce the time, difficulty, and cost of developing knowledge-based systems for the performance of procedural tasks.

  10. ASIS 2000: Knowledge Innovations: Celebrating Our Heritage, Designing Our Future. Proceedings of the ASIS Annual Meeting (63rd, Chicago, Illinois, November 12-16, 2000). Volume 37.

    ERIC Educational Resources Information Center

    Kraft, Donald H., Ed.

    The 2000 ASIS (American Society for Information Science) conference explored knowledge innovation. The tracks in the conference program included knowledge discovery, capture, and creation; classification and representation; information retrieval; knowledge dissemination; and social, behavioral, ethical, and legal aspects. This proceedings is…

  11. Information Integration for Concurrent Engineering (IICE) IDEF3 Process Description Capture Method Report

    DTIC Science & Technology

    1992-05-01

    methodology, knowledge acquisition, requirements definition, information systems, information engineering, systems engineering...and knowledge resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be...evolve towards an information-integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key

  12. Teaching Knowledge Management by Combining Wikis and Screen Capture Videos

    ERIC Educational Resources Information Center

    Makkonen, Pekka; Siakas, Kerstin; Vaidya, Shakespeare

    2011-01-01

    Purpose: This paper aims to report on the design and creation of a knowledge management course aimed at facilitating student creation and use of social interactive learning tools for enhanced learning. Design/methodology/approach: The era of social media and web 2.0 has enabled a bottom-up collaborative approach and new ways to publish work on the…

  13. Information Integration for Concurrent Engineering (IICE) IDEF3 Process Description Capture Method Report

    DTIC Science & Technology

    1995-09-01

    vital processes of a business. process, IDEF, method, methodology, modeling, knowledge acquisition, requirements definition, information systems...knowledge resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be leveraged to...integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key enablers for high quality systems

  14. Towards Knowledge Management for Smart Manufacturing.

    PubMed

    Feng, Shaw C; Bernstein, William Z; Hedberg, Thomas; Feeney, Allison Barnard

    2017-09-01

    The need for capturing knowledge in digital form in design, process planning, production, and inspection has increasingly become an issue in manufacturing industries as the variety and complexity of product lifecycle applications increase. Both knowledge and data need to be well managed for quality assurance, lifecycle-impact assessment, and design improvement. Some technical barriers exist today that inhibit industry from fully utilizing design, planning, processing, and inspection knowledge. The primary barrier is the lack of a well-accepted mechanism that enables users to integrate data and knowledge. This paper prescribes knowledge management to address the lack of mechanisms for integrating, sharing, and updating domain-specific knowledge in smart manufacturing. Aspects of the knowledge constructs include conceptual design, detailed design, process planning, material property, production, and inspection. The main contribution of this paper is a methodology for determining what knowledge manufacturing organizations should access, update, and archive in the context of smart manufacturing. The case study in this paper provides some example knowledge objects to enable smart manufacturing.

  15. U.S. Spacesuit Knowledge Capture Status and Initiatives

    NASA Technical Reports Server (NTRS)

    Chullen, Cinda; Woods, Ron; Jairala, Juniper; Bitterly, Rose; McMann, Joe; Lewis, Cathleen

    2011-01-01

    The National Aeronautics and Space Administration (NASA), other organizations, and individuals have been performing United States (U.S.) spacesuit knowledge capture since the beginning of space exploration via publication of reports, conference presentations, specialized seminars, and classes instructed by veterans in the field. Recently, the effort has become more concentrated and formalized, whereby a new avenue of spacesuit knowledge capture has been added to the archives through videotaping, engaging both current and retired specialists in the field, either presenting technical scope specifically for education and preservation of knowledge or being interviewed to archive their significance to NASA's history. Now with video archiving, all these avenues of learning are brought to life with the real experts presenting their wealth of knowledge on screen for future learners to enjoy. U.S. spacesuit knowledge capture topics have included lessons learned in spacesuit technology; experience from the Gemini, Apollo, Skylab, and Shuttle programs; hardware certification, design, development, and other program components; spacesuit evolution and experience; failure analysis and resolution; aspects of program management; and personal interviews. These archives of actual spacesuit legacy now reflect its rich history and will provide a wealth of knowledge which will greatly enhance the chances for the success of future and more ambitious spacesuit system programs. In this paper, NASA's formal spacesuit knowledge capture efforts will be reviewed and a status will be provided to reveal initiatives and accomplishments since the inception of the more formal U.S. spacesuit knowledge program. A detailed itemization of the actual archives will be addressed along with topics that are now available to the general NASA community and the public. Additionally, the latest developments in the archival relationship with the Smithsonian will be discussed.

  16. U.S. Spacesuit Knowledge Capture Status and Initiatives

    NASA Technical Reports Server (NTRS)

    Chullen, Cinda; Woods, Ron; Jairala, Juniper; Bitterly, Rose; McMann, Joe; Lewis, Cathleen

    2012-01-01

    The National Aeronautics and Space Administration (NASA), other organizations, and individuals have been performing United States (U.S.) spacesuit knowledge capture since the beginning of space exploration via publication of reports, conference presentations, specialized seminars, and classes instructed by veterans in the field. Recently, the effort has become more concentrated and formalized, whereby a new avenue of spacesuit knowledge capture has been added to the archives through videotaping, engaging both current and retired specialists in the field, either presenting technical scope specifically for education and preservation of knowledge or being interviewed to archive their significance to NASA's history. Now with video archiving, all these avenues of learning are brought to life with the real experts presenting their wealth of knowledge on screen for future learners to enjoy. U.S. spacesuit knowledge capture topics have included lessons learned in spacesuit technology; experience from the Gemini, Apollo, Skylab, and Shuttle programs; hardware certification, design, development, and other program components; spacesuit evolution and experience; failure analysis and resolution; aspects of program management; and personal interviews. These archives of actual spacesuit legacy now reflect its rich history and will provide a wealth of knowledge which will greatly enhance the chances for the success of future and more ambitious spacesuit system programs. In this paper, NASA's formal spacesuit knowledge capture efforts will be reviewed and a status will be provided to reveal initiatives and accomplishments since the inception of the more formal U.S. spacesuit knowledge program. A detailed itemization of the actual archives will be addressed along with topics that are now available to the general NASA community and the public. Additionally, the latest developments in the archival relationship with the Smithsonian will be discussed.

  17. Capturing, using, and managing quality assurance knowledge for shuttle post-MECO flight design

    NASA Technical Reports Server (NTRS)

    Peters, H. L.; Fussell, L. R.; Goodwin, M. A.; Schultz, Roger D.

    1991-01-01

    Ascent initialization values used by the Shuttle's onboard computer for nominal and abort mission scenarios are verified by a six-degrees-of-freedom computer simulation. The procedure that the Ascent Post Main Engine Cutoff (Post-MECO) group uses to perform quality assurance (QA) of the simulation is time consuming. Also, the QA data, checklists, and associated rationale, though known by the group members, are not sufficiently documented, hindering transfer of knowledge and problem resolution. A new QA procedure which retains the current high level of integrity while reducing the time required to perform QA is needed to support the increasing Shuttle flight rate. Documenting the knowledge is also needed to increase its availability for training and problem resolution. To meet these needs, a knowledge capture process, embedded into the group's activities, was initiated to verify the existing QA checks, define new ones, and document all rationale. The resulting checks were automated in a conventional software program to achieve the desired standardization, integrity, and time reduction. A prototype electronic knowledge base was developed with Apple's HyperCard to serve as a knowledge capture tool and data store.
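
    The codification of expert checks described above can be sketched as a table of named predicates with documented rationale. This is a hypothetical illustration: the check names, result fields, and thresholds are invented for the sketch, not the actual Post-MECO criteria.

```python
# Hypothetical sketch of checklist-style QA automation: each expert check is
# codified as a named predicate over simulation output, with its rationale
# recorded alongside so the knowledge survives personnel turnover.
# Check names, fields, and thresholds are illustrative only.

CHECKS = [
    # (name, rationale, predicate over the simulation result dict)
    ("cutoff_velocity_in_band",
     "Velocity at MECO must fall within the targeted insertion band.",
     lambda r: 7350.0 <= r["cutoff_velocity_mps"] <= 7900.0),
    ("propellant_reserve_positive",
     "A negative reserve means the simulated burn exhausted propellant.",
     lambda r: r["propellant_reserve_kg"] > 0.0),
]

def run_qa(result):
    """Return (passed, report): report lists each check, its outcome, and rationale."""
    report = []
    passed = True
    for name, rationale, predicate in CHECKS:
        ok = bool(predicate(result))
        passed = passed and ok
        report.append((name, ok, rationale))
    return passed, report

sim_result = {"cutoff_velocity_mps": 7650.0, "propellant_reserve_kg": 120.0}
ok, report = run_qa(sim_result)
print(ok)  # True for this sample result
```

    Because every check carries its rationale, the same table doubles as the documentation the abstract says was previously held only in group members' heads.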

  18. Knowledge management in the engineering design environment

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    2006-01-01

    The Aerospace and Defense industry is experiencing an increasing loss of knowledge through workforce reductions associated with business consolidation and retirement of senior personnel. Significant effort is being placed on process definition as part of ISO certification and, more recently, CMMI certification. The process knowledge in these efforts represents the simplest of engineering knowledge and many organizations are trying to get senior engineers to write more significant guidelines, best practices and design manuals. A new generation of design software, known as Product Lifecycle Management systems, has many mechanisms for capturing and deploying a wider variety of engineering knowledge than simple process definitions. These hold the promise of significant improvements through reuse of prior designs, codification of practices in workflows, and placement of detailed how-tos at the point of application.

  19. U.S. Spacesuit Knowledge Capture

    NASA Technical Reports Server (NTRS)

    Chullen, Cinda; Thomas, Ken; McMann, Joe; Dolan, Kristi; Bitterly, Rose; Lewis, Cathleen

    2010-01-01

    The ability to learn from both the mistakes and successes of the past is vital to assuring success in the future. Due to the close physical interaction between spacesuit systems and human beings as users, spacesuit technology and usage lends itself rather uniquely to the benefits realized from the skillful organization of historical information; its dissemination; the collection and identification of artifacts; and the education of individuals and groups working in the field. The National Aeronautics and Space Administration (NASA), other organizations and individuals have been performing United States (U.S.) spacesuit knowledge capture since the beginning of space exploration. Avenues used to capture the knowledge have included publication of reports; conference presentations; specialized seminars; and classes usually given by veterans in the field. Recently, the effort has been more concentrated and formalized whereby a new avenue of spacesuit knowledge capture has been added to the archives through which videotaping occurs, engaging both current and retired specialists in the field presenting technical scope specifically for education and preservation of knowledge. Now with video archiving, all these avenues of learning can be brought to life with the real experts presenting their wealth of knowledge on screen for future learners to enjoy. U.S. spacesuit knowledge capture topics have included lessons learned in spacesuit technology, experience from the Gemini, Apollo, Skylab and Shuttle programs, hardware certification, design, development and other program components, spacesuit evolution and experience, failure analysis and resolution, and aspects of program management. Concurrently, U.S. 
spacesuit knowledge capture activities have progressed to a level where NASA, the National Air and Space Museum (NASM), Hamilton Sundstrand (HS), and the spacesuit community are now working together to provide a rather closed-loop spacesuit knowledge capture system which includes specific attention to spacesuit system artifacts as well. A NASM report has recently been created that allows the cross-reference of history to the artifacts and the artifacts to the history, including spacesuit manufacturing details with current condition and location. NASA has examined spacesuits in the NASM collection for evidence of wear during their operational life. NASA's formal spacesuit knowledge capture efforts now make use of both the NASM spacesuit preservation collection and report to enhance its efforts to educate NASA personnel and contribute to spacesuit history. Be it archiving of human knowledge or archiving of the actual spacesuit legacy hardware with its rich history, the joining together of spacesuit system artifact history with that of development and use during past programs will provide a wealth of knowledge which will greatly enhance the chances for the success of future and more ambitious spacesuit system programs.

  20. Automated knowledge base development from CAD/CAE databases

    NASA Technical Reports Server (NTRS)

    Wright, R. Glenn; Blanchard, Mary

    1988-01-01

    Knowledge base development requires a substantial investment in time, money, and resources in order to capture the knowledge and information necessary for anything other than trivial applications. This paper addresses a means to integrate the design and knowledge base development process through automated knowledge base development from CAD/CAE databases and files. Benefits of this approach include the development of a more efficient means of knowledge engineering, resulting in the timely creation of large knowledge based systems that are inherently free of error.
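
    As a rough illustration of the idea (a sketch under assumed record fields and rule forms, not the authors' actual system), component records from a CAD database might be translated mechanically into production-rule text, so the knowledge base is generated from the design data rather than hand-entered:

```python
# Illustrative sketch: derive production-rule text from CAD component records.
# Because the rules are generated from the design database, they stay
# consistent with it and avoid manual transcription errors.
# The field names and rule forms below are invented for this sketch.

cad_components = [
    {"id": "R1", "type": "resistor", "max_power_w": 0.25},
    {"id": "C3", "type": "capacitor", "max_voltage_v": 16.0},
]

def component_to_rules(c):
    """Emit one overstress rule per rated limit found in the record."""
    rules = []
    if "max_power_w" in c:
        rules.append(f"IF power({c['id']}) > {c['max_power_w']} THEN flag_overstress({c['id']})")
    if "max_voltage_v" in c:
        rules.append(f"IF voltage({c['id']}) > {c['max_voltage_v']} THEN flag_overstress({c['id']})")
    return rules

knowledge_base = [r for c in cad_components for r in component_to_rules(c)]
for rule in knowledge_base:
    print(rule)
```

    Regenerating the rules whenever the CAD database changes is what makes the resulting knowledge base "inherently free of error" in the sense the abstract describes: it cannot drift from the design data.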

  1. Structure and properties of visible-light absorbing homodisperse nanoparticle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benedict, Jason

    Broadly, the scientific progress from this award focused on two main areas: developing time-resolved X-ray diffraction methods, and the synthesis and characterization of molecular systems relevant to solar energy harvesting. Knowledge of photo-induced non-equilibrium states is central to our understanding of processes involved in solar-energy capture. More specifically, knowledge of the geometry changes upon excitation, and of their relation to lifetimes and their variation with adsorption of chromophores on the substrates, is important for the design of molecular devices used in light capture.

  2. Job Knowledge Test Design: A Cognitively-Oriented Approach

    DTIC Science & Technology

    1993-07-01

    protocol analyses and related methods. We employed a plan-goal graph representation to capture the knowledge content and goal structure of the studied task...between job knowledge and hands-on performance from previous studies was .38. For the subset of Marines in this sample who had recently been examined...the job knowledge test provided similar results to conventional, total-number-correct scoring. Conclusion: The evidence provided by this study supports

  3. Advanced software development workstation. Knowledge base design: Design of knowledge base for flight planning application

    NASA Technical Reports Server (NTRS)

    Izygon, Michel E.

    1992-01-01

    The development process of the knowledge base for the generation of Test Libraries for Mission Operations Computer (MOC) Command Support focused on a series of information-gathering interviews. These knowledge capture sessions supported the development of a prototype for evaluating the capabilities of INTUIT on such an application. The prototype includes functions related to POCC (Payload Operation Control Center) processing. It prompts the end users for input through a series of panels and then generates the Meds associated with the initialization and the update of hazardous command tables for a POCC Processing TLIB.

  4. Job Knowledge Test Design: A Cognitively-Oriented Approach. Institute Report No. 241.

    ERIC Educational Resources Information Center

    DuBois, David; And Others

    Selected cognitive science methods were used to modify existing test development procedures so that the modified procedures could in turn be used to improve the usefulness of job knowledge tests as a proxy for hands-on performance. A plan-goal graph representation was used to capture the knowledge content and goal structure of the task of using a…

  5. Exploring the Downside of Open Knowledge Resources: The Case of Indigenous Knowledge Systems and Practices in the Philippines

    ERIC Educational Resources Information Center

    Flor, Alexander Gonzalez

    2013-01-01

    The paper is based on the challenges encountered by the researcher while conducting a study titled "Design, Development and Testing of an Indigenous Knowledge Management System Using Mobile Device Video Capture and Web 2.0 Protocols." During the conduct of the study the researcher observed a marked reluctance from organized indigenous…

  6. Information Integration for Concurrent Engineering (IICE) Compendium of Methods Report

    DTIC Science & Technology

    1995-06-01

    technological, economic, and strategic benefits can be attained through the effective capture, control, and management of information and knowledge ...resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be leveraged to achieve...integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key enablers for high quality systems that

  7. The Environmental Control and Life Support System (ECLSS) advanced automation project

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.; Carnes, Ray

    1990-01-01

    The objective of the Environmental Control and Life Support System (ECLSS) Advanced Automation Project is to influence the design of the initial and evolutionary Space Station Freedom Program (SSFP) ECLSS toward a man-made closed environment in which minimal flight and ground manpower is needed. Another objective is capturing ECLSS design and development knowledge for future missions. Our approach has been to (1) analyze the SSFP ECLSS, (2) envision as our goal a fully automated evolutionary environmental control system - an augmentation of the baseline, and (3) document the advanced software systems, hooks, and scars which will be necessary to achieve this goal. From this analysis, prototype software is being developed and will be tested using air and water recovery simulations and hardware subsystems. In addition, the advanced software is being designed, developed, and tested using an automation software management plan and lifecycle tools. Automated knowledge acquisition, engineering, verification, and testing tools are being used to develop the software. In this way, we can capture ECLSS development knowledge for future use, develop more robust and complex software, provide feedback to the knowledge-based system tool community, and ensure proper visibility of our efforts.

  8. Design of customer knowledge management system for Aglaonema Nursery in South Tangerang, Indonesia

    NASA Astrophysics Data System (ADS)

    Sugiarto, D.; Mardianto, I.; Dewayana, TS; Khadafi, M.

    2017-12-01

    The purpose of this paper is to describe the design of a customer knowledge management system to support customer relationship management activities for an aglaonema nursery in South Tangerang, Indonesia. The steps were knowledge identification (knowledge about customers, knowledge from customers, knowledge for customers), knowledge capture, codification, analysis of system requirements, and creation of use case and activity diagrams. The results showed that some key knowledge concerned supporting customers in plant care (know-how) and the types of aglaonema together with their prices (know-what). That knowledge for customers was then codified and shared on a knowledge portal website integrated with social media. Knowledge about customers concerned the customers and their behaviour in purchasing aglaonema. Knowledge from customers covered feedback, favorites, and customer experience. Codified knowledge was placed and shared using a content management system based on WordPress.
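
    The about/from/for split used in the knowledge identification step could be codified in a simple schema along these lines (a sketch; the field and category names are illustrative, not from the paper):

```python
# Sketch of a codification schema for the three CKM knowledge flows the
# abstract distinguishes: knowledge about, from, and for customers.
# Field names, categories, and sample items are invented for this sketch.
from dataclasses import dataclass, field

@dataclass
class KnowledgeItem:
    flow: str          # "about", "from", or "for" the customer
    topic: str         # e.g. "plant care" (know-how), "pricing" (know-what)
    content: str
    channels: list = field(default_factory=list)  # where the item is shared

portal_items = [
    KnowledgeItem("for", "plant care", "Watering schedule for aglaonema",
                  ["portal", "social media"]),
    KnowledgeItem("from", "feedback", "Customer reports on leaf discoloration",
                  ["portal"]),
    KnowledgeItem("about", "purchasing behaviour",
                  "Repeat buyers prefer rare cultivars"),
]

# Only "for customer" items are published outward; the other flows feed CRM.
for_customer = [i for i in portal_items if i.flow == "for"]
print(len(for_customer))  # 1
```

    Tagging each item with its flow makes the routing decision explicit: "for" items go to the public portal and social media, while "about" and "from" items stay internal to relationship management.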

  9. User Acceptance of Mobile Knowledge Management Learning System: Design and Analysis

    ERIC Educational Resources Information Center

    Chen, Hong-Ren; Huang, Hui-Ling

    2010-01-01

    Thanks to advanced developments in wireless technology, learners can now utilize digital learning websites at anytime and anywhere. Mobile learning captures more and more attention in the wave of digital learning. Evolving use of knowledge management plays an important role to enhance problem solving skills. Recently, innovative approaches for…

  10. Safety and Mission Assurance Knowledge Management Retention: Managing Knowledge for Successful Mission Operations

    NASA Technical Reports Server (NTRS)

    Johnson, Teresa A.

    2006-01-01

    Knowledge management is a proactive pursuit for the future success of any large organization faced with the imminent possibility that its senior managers and engineers, with their accumulated experience and lessons learned, plan to retire in the near term. Safety and Mission Assurance (S&MA) is proactively pursuing unique mechanisms to ensure that knowledge is retained and lessons learned are captured and documented. Knowledge capture events, activities, and management help to provide a gateway between future retirees and our next generation of managers and engineers. S&MA hosted two Knowledge Capture Events during 2005 featuring three of its retiring fellows (Axel Larsen, Dave Whittle, and Gary Johnson). The first Knowledge Capture Event, on February 24, 2005, focused on two Safety and Mission Assurance safety panels (the Space Shuttle System Safety Review Panel (SSRP) and the Payload Safety Review Panel (PSRP)); the latter event, on December 15, 2005, featured lessons learned during Apollo, Skylab, and Space Shuttle which could be applicable to the newly created Crew Exploration Vehicle (CEV)/Constellation development program. Gemini, Apollo, Skylab, and the Space Shuttle promised and delivered exciting human advances in space and benefits of space in people's everyday lives on Earth. Johnson Space Center's Safety & Mission Assurance team's work over the last 20 years has been focused mostly on operations; we are now beginning the Exploration development program. S&MA will promote an atmosphere of knowledge sharing in its formal and informal cultures and work processes, and reward the open dissemination and sharing of information; we ask, "Why embrace relearning the lessons learned of the past?" On the Exploration program the focus will be on Design, Development, Test, and Evaluation (DDT&E); therefore, it is critical to understand the lessons from these past programs during the DDT&E phase.

  11. Relationship Between Health Literacy, Knowledge of Health Status, and Beliefs about HIV/AIDS Transmission among Ryan White Clients in Miami

    ERIC Educational Resources Information Center

    Mooss, Angela; Brock-Getz, Petra; Ladner, Robert; Fiano, Theresa

    2013-01-01

    Objective: The aim of this study was to examine the relationships between health literacy, knowledge of health status, and human immunodeficiency virus/acquired immune deficiency syndrome (HIV/AIDS) transmission beliefs among recipients of Ryan White care. Design: Quota and convenience sampled, quantitative analysis captured with closed and…

  12. XML-Based SHINE Knowledge Base Interchange Language

    NASA Technical Reports Server (NTRS)

    James, Mark; Mackey, Ryan; Tikidjian, Raffi

    2008-01-01

    The SHINE Knowledge Base Interchange Language software has been designed to more efficiently send new knowledge bases to spacecraft that have been embedded with the Spacecraft Health Inference Engine (SHINE) tool. The intention of the behavioral model is to capture most of the information generally associated with a spacecraft functional model, while specifically addressing the needs of execution within SHINE and Livingstone. As such, it has some constructs that are based on one or the other.

  13. EDNA: Expert fault digraph analysis using CLIPS

    NASA Technical Reports Server (NTRS)

    Dixit, Vishweshwar V.

    1990-01-01

    Traditionally, fault models are represented by trees. Recently, digraph models have been proposed (Sack). Digraph models closely imitate the real system dependencies and hence are easy to develop, validate, and maintain. However, they can also contain directed cycles, and analysis algorithms are hard to find. Available algorithms tend to be complicated and slow. On the other hand, tree analysis (VGRH, Tayl) is well understood and rooted in a vast research effort and analytical techniques. The tree analysis algorithms are sophisticated and orders of magnitude faster. Transformation of a digraph (cyclic) into trees (CLP, LP) is a viable approach to blend the advantages of the two representations. Neither the digraphs nor the trees provide the ability to handle heuristic knowledge. An expert system, to capture the engineering knowledge, is essential. We propose an approach here, namely, expert network analysis, combining the digraph representation with tree algorithms. The models are augmented by probabilistic and heuristic knowledge. CLIPS, an expert system shell from NASA-JSC, will be used to develop a tool. The technique provides the ability to handle probabilities and heuristic knowledge. Mixed analysis, with some nodes carrying probabilities, is possible. The tool provides a graphics interface for input, query, and update. With the combined approach it is expected to be a valuable tool in the design process as well as in the capture of final design knowledge.
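
    The digraph-to-tree idea depends on first eliminating directed cycles. A generic way to do that (not the specific CLP/LP transformation cited in the abstract) is to condense each strongly connected component into a supernode, after which the model is acyclic and tree-style algorithms apply:

```python
# Sketch: collapse the directed cycles of a fault digraph into supernodes
# (strongly connected components), yielding an acyclic condensation amenable
# to tree-style analysis. Iterative Tarjan's algorithm; illustrative only.

def strongly_connected_components(graph):
    """graph maps node -> list of successors; returns a list of frozensets."""
    index, lowlink, on_stack, stack = {}, {}, set(), []
    sccs, counter = [], [0]

    def strongconnect(v):
        work = [(v, 0)]
        while work:
            node, i = work.pop()
            if i == 0:                       # first visit: assign DFS index
                index[node] = lowlink[node] = counter[0]
                counter[0] += 1
                stack.append(node)
                on_stack.add(node)
            descended = False
            for j in range(i, len(graph.get(node, []))):
                w = graph[node][j]
                if w not in index:           # recurse into unvisited successor
                    work.append((node, j + 1))
                    work.append((w, 0))
                    descended = True
                    break
                elif w in on_stack:          # back edge within current SCC
                    lowlink[node] = min(lowlink[node], index[w])
            if descended:
                continue
            if lowlink[node] == index[node]:  # node is the root of an SCC
                scc = []
                while True:
                    w = stack.pop()
                    on_stack.discard(w)
                    scc.append(w)
                    if w == node:
                        break
                sccs.append(frozenset(scc))
            if work:                         # propagate lowlink to parent
                parent = work[-1][0]
                lowlink[parent] = min(lowlink[parent], lowlink[node])

    for v in graph:
        if v not in index:
            strongconnect(v)
    return sccs

# A fault digraph with a feedback cycle B -> C -> B:
fault_digraph = {"A": ["B"], "B": ["C"], "C": ["B", "D"], "D": []}
sccs = strongly_connected_components(fault_digraph)
print(sorted(sorted(s) for s in sccs))  # [['A'], ['B', 'C'], ['D']]
```

    Treating each component ({B, C} above) as a single node leaves an acyclic structure on which the fast, well-understood tree algorithms the abstract mentions can run.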

  14. Space Telecommunications Radio System (STRS) Application Repository Design and Analysis

    NASA Technical Reports Server (NTRS)

    Handler, Louis M.

    2013-01-01

    The Space Telecommunications Radio System (STRS) Application Repository Design and Analysis document describes the STRS application repository for software-defined radio (SDR) applications intended to be compliant with the STRS Architecture Standard. The document provides information about the submission of artifacts to the STRS application repository, informs potential users of those artifacts, and helps the systems engineer understand the requirements, concepts, and approach of the STRS application repository. The STRS application repository is intended to capture knowledge, documents, and other artifacts for each waveform application or other application outside of its project so that when the project ends, the knowledge is retained. The document describes the transmission of technology from mission to mission, capturing lessons learned that are used for continuous improvement across projects and supporting NASA Procedural Requirements (NPRs) for performing software engineering projects and NASA's release process.

  15. The influence of capture-recapture methodology on the evolution of the North American Bird Banding Program

    USGS Publications Warehouse

    Tautin, J.; Lebreton, J.-D.; North, P.M.

    1993-01-01

    Capture-recapture methodology has advanced greatly in the last twenty years and is now a major factor driving the continuing evolution of the North American bird banding program. Bird banding studies are becoming more scientific with improved study designs and analytical procedures. Researchers and managers are gaining more reliable knowledge which in turn betters the conservation of migratory birds. The advances in capture-recapture methodology have benefited gamebird studies primarily, but nongame bird studies will benefit similarly as they expand greatly in the next decade. Further theoretical development of capture-recapture methodology should be encouraged, and, to maximize benefits of the methodology, work on practical applications should be increased.

  16. Cognitive task analysis-based design and authoring software for simulation training.

    PubMed

    Munro, Allen; Clark, Richard E

    2013-10-01

    The development of more effective medical simulators requires a collaborative team effort where three kinds of expertise are carefully coordinated: (1) exceptional medical expertise focused on providing complete and accurate information about the medical challenges (i.e., critical skills and knowledge) to be simulated; (2) instructional expertise focused on the design of simulation-based training and assessment methods that produce maximum learning and transfer to patient care; and (3) software development expertise that permits the efficient design and development of the software required to capture expertise, present it in an engaging way, and assess student interactions with the simulator. In this discussion, we describe a method of capturing more complete and accurate medical information for simulators and combine it with new instructional design strategies that emphasize the learning of complex knowledge. Finally, we describe three different types of software support (Development/Authoring, Run Time, and Post Run Time) required at different stages in the development of medical simulations and the instructional design elements of the software required at each stage. We describe the contributions expected of each kind of software and the different instructional control authoring support required. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.

  17. WE-F-BRB-01: The Power of Ontologies and Standardized Terminologies for Capturing Clinical Knowledge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gabriel, P.

    2015-06-15

    Advancements in informatics in radiotherapy are opening up opportunities to improve our ability to assess treatment plans. Models on individualizing patient dose constraints from prior patient data and shape relationships have been extensively researched and are now making their way into commercial products. New developments in knowledge-based treatment planning involve understanding the impact of the radiation dosimetry on the patient. Akin to radiobiology models that have driven intensity modulated radiotherapy optimization, toxicity and outcome predictions based on treatment plans and prior patient experiences may be the next step in knowledge-based planning. In order to realize these predictions, it is necessary to understand how the clinical information can be captured, structured, and organized with ontologies and databases designed for recall. Large databases containing radiation dosimetry and outcomes present the opportunity to evaluate treatment plans against predictions of toxicity and disease response. Such evaluations can be based on the dose volume histogram or even the full 3-dimensional dose distribution and its relation to the critical anatomy. This session will provide an understanding of ontologies and standard terminologies used to capture clinical knowledge in structured databases; how data can be organized and accessed to utilize the knowledge in planning; and examples of research and clinical efforts to incorporate that clinical knowledge into planning for improved care for our patients. Learning Objectives: (1) understand the role of standard terminologies, ontologies, and data organization in oncology; (2) understand methods to capture clinical toxicity and outcomes in a clinical setting; (3) understand opportunities to learn from clinical data and its application to treatment planning. Todd McNutt receives funding from Philips, Elekta and Toshiba for some of the work presented.
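    The session abstract above describes evaluating treatment plans against dose-volume histogram (DVH) predictions. As a minimal illustrative sketch (not from the session itself), a DVH-style metric such as V20 can be computed from a flat per-voxel dose array; the function name and toy values here are invented for illustration.

```python
def dvh_metric(doses, threshold_gy):
    """Fraction of structure volume receiving at least `threshold_gy`.

    `doses` is a flat list of per-voxel dose values (Gy) for one
    structure; equal voxel volumes are assumed for simplicity.
    """
    if not doses:
        raise ValueError("empty dose list")
    hits = sum(1 for d in doses if d >= threshold_gy)
    return hits / len(doses)

# Toy lung dose sample: V20 is the fraction of voxels at or above 20 Gy.
lung_doses = [5.0, 12.0, 18.0, 22.0, 25.0, 30.0, 8.0, 21.0]
print(dvh_metric(lung_doses, 20.0))  # 4 of 8 voxels >= 20 Gy -> 0.5
```

    A knowledge-based planning system would compare such metrics against outcome predictions learned from prior patients rather than against a fixed threshold.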

  18. The Study on Collaborative Manufacturing Platform Based on Agent

    NASA Astrophysics Data System (ADS)

    Zhang, Xiao-yan; Qu, Zheng-geng

    To address the increasingly knowledge-intensive nature of collaborative manufacturing development, we describe a multi-agent architecture supporting a knowledge-based collaborative manufacturing development platform. By virtue of the wrapper services and communication capabilities the agents provide, the proposed architecture facilitates the organization and collaboration of multi-disciplinary individuals and tools. By effectively supporting the formal representation, capture, retrieval, and reuse of manufacturing knowledge, a generalized knowledge repository built on an ontology library enables engineers to meaningfully exchange information and pass knowledge across boundaries. Intelligent agent technology increases the efficiency and interoperability of traditional knowledge-based engineering (KBE) systems and provides a comprehensive design environment for engineers.

  19. A concept ideation framework for medical device design.

    PubMed

    Hagedorn, Thomas J; Grosse, Ian R; Krishnamurty, Sundar

    2015-06-01

    Medical device design is a challenging process, often requiring collaboration between medical and engineering domain experts. This collaboration can be best institutionalized through systematic knowledge transfer between the two domains coupled with effective knowledge management throughout the design innovation process. Toward this goal, we present the development of a semantic framework for medical device design that unifies a large medical ontology with detailed engineering functional models along with the repository of design innovation information contained in the US Patent Database. As part of our development, existing medical, engineering, and patent document ontologies were modified and interlinked to create a comprehensive medical device innovation and design tool with appropriate properties and semantic relations to facilitate knowledge capture, enrich existing knowledge, and enable effective knowledge reuse for different scenarios. The result is a Concept Ideation Framework for Medical Device Design (CIFMeDD). Key features of the resulting framework include function-based searching and automated inter-domain reasoning to uniquely enable identification of functionally similar procedures, tools, and inventions from multiple domains based on simple semantic searches. The significance and usefulness of the resulting framework for aiding in conceptual design and innovation in the medical realm are explored via two case studies examining medical device design problems. Copyright © 2015 Elsevier Inc. All rights reserved.
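    The CIFMeDD abstract above highlights function-based searching across medical and engineering domains. As a toy sketch of that idea only (the real framework uses interlinked OWL ontologies, and all entry and function names below are invented), cross-domain retrieval by shared function tag can be illustrated as:

```python
# Hypothetical mini knowledge base: entries from medical and engineering
# domains, each tagged with the functions it performs.
KB = [
    {"name": "scalpel",      "domain": "medical",     "functions": {"separate solid"}},
    {"name": "laser cutter", "domain": "engineering", "functions": {"separate solid", "transmit energy"}},
    {"name": "stent",        "domain": "medical",     "functions": {"support solid", "guide liquid"}},
    {"name": "pipe brace",   "domain": "engineering", "functions": {"support solid"}},
]

def search_by_function(kb, function):
    """Return entries from any domain that share the given function tag."""
    return [e["name"] for e in kb if function in e["functions"]]

print(search_by_function(KB, "support solid"))  # ['stent', 'pipe brace']
```

    The point of the sketch is that a functional vocabulary, rather than domain-specific terminology, lets a device designer surface analogies from an unrelated field.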

  20. Designing Albaha Internet of Farming Architecture

    NASA Astrophysics Data System (ADS)

    Alahmadi, A.

    2017-04-01

    Up to now, most farmers in Albaha, Saudi Arabia still practice traditional methods that are not optimized in terms of water usage, product quality, and other factors. At the same time, ICT has become a key driver of innovation in farming. In this project, we propose a smart Internet of Farming system to help farmers in Albaha optimize their farm productivity by providing accurate, timely predictions of when to harvest, fertilize, water, and perform other activities related to farming/agriculture technology. The proposed system utilizes a wireless sensor cloud to remotely capture important data such as temperature, humidity, and soil condition (moisture, water level), which are then sent to storage servers in the Albaha University cloud. An adaptive knowledge engine processes the captured data into knowledge, which the farmers can retrieve using their smartphones via the Internet.
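    The pipeline described above turns raw sensor readings into actionable knowledge. A minimal sketch of such a knowledge-engine rule, with an invented function name and an uncalibrated placeholder threshold (a real agronomic system would learn or tune these per crop and soil), might look like:

```python
def irrigation_advice(readings, moisture_threshold=0.30):
    """Map raw sensor readings to a simple recommendation.

    `readings` maps sensor names to values; the threshold is an
    illustrative placeholder, not a calibrated agronomic value.
    """
    moisture = readings.get("soil_moisture")
    if moisture is None:
        return "no data"
    if moisture < moisture_threshold:
        return "irrigate"
    return "hold"

print(irrigation_advice({"soil_moisture": 0.18, "temperature_c": 31.0}))  # irrigate
print(irrigation_advice({"soil_moisture": 0.42}))                         # hold
```

    An adaptive engine would replace the fixed threshold with rules updated from accumulated data, but the sensor-to-recommendation flow is the same.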

  1. The Power of a Question: A Case Study of Two Organizational Knowledge Capture Systems

    NASA Technical Reports Server (NTRS)

    Cooper, Lynn P.

    2003-01-01

    This document represents a presentation regarding organizational knowledge capture systems which was delivered at the HICSS-36 conference held from January 6-9, 2003. An exploratory case study of two knowledge resources is offered. Then, two organizational knowledge capture systems are briefly described: knowledge transfer from practitioner and the use of questions to represent knowledge. Finally, the creation of a database of peer review questions is suggested as a method of promoting organizational discussions and knowledge representation and exchange.

  2. Knowledge-based nursing diagnosis

    NASA Astrophysics Data System (ADS)

    Roy, Claudette; Hay, D. Robert

    1991-03-01

    Nursing diagnosis is an integral part of the nursing process and determines the interventions leading to outcomes for which the nurse is accountable. Diagnosis under the time constraints of modern nursing can benefit from computer assistance. A knowledge-based engineering approach was developed to address these problems. A number of design issues extending beyond knowledge capture had to be addressed to make the system practical. The issues involved in implementing a professional knowledge base in a clinical setting are discussed, including system functions, structure, interfaces, the health care environment, and terminology and taxonomy. An integrated system concept from assessment through intervention and evaluation is outlined.

  3. Case-Based Capture and Reuse of Aerospace Design Rationale

    NASA Technical Reports Server (NTRS)

    Leake, David B.

    1998-01-01

    The goal of this project is to apply artificial intelligence techniques to facilitate capture and reuse of aerospace design rationale. The project applies case-based reasoning (CBR) and concept mapping (CMAP) tools to the task of capturing, organizing, and interactively accessing experiences or "cases" encapsulating the methods and rationale underlying expert aerospace design. As stipulated in the award, Indiana University and Ames personnel are collaborating on performance of research and determining the direction of research, to assure that the project focuses on high-value tasks. In the first five months of the project, we have made two visits to Ames Research Center to consult with our NASA collaborators, to learn about the advanced aerospace design tools being developed there, and to identify specific needs for intelligent design support. These meetings identified a number of task areas for applying CBR and concept mapping technology. We jointly selected a first task area to focus on: Acquiring the convergence criteria that experts use to guide the selection of useful data from a set of numerical simulations of high-lift systems. During the first funding period, we developed two software systems. First, we have adapted a CBR system developed at Indiana University into a prototype case-based reasoning shell to capture and retrieve information about design experiences, with the sample task of capturing and reusing experts' intuitive criteria for determining convergence (work conducted at Indiana University). Second, we have also adapted and refined existing concept mapping tools that will be used to clarify and capture the rationale underlying those experiences, to facilitate understanding of the expert's reasoning and guide future reuse of captured information (work conducted at the University of West Florida). 
The tools we have developed are designed to be the basis for a general framework for facilitating tasks within systems developed by the Advanced Design Technologies Testbed (ADTT) project at ARC. The tenets of our framework are (1) that the systems developed should leverage a designer's knowledge, rather than attempting to replace it; (2) that learning and user feedback must play a central role, so that the system can adapt to how it is used, and (3) that the learning and feedback processes must be as natural and as unobtrusive as possible. In the second funding period we will extend our current work, applying the tools to capturing higher-level design rationale.
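    The record above describes a case-based reasoning shell for capturing and retrieving design experiences, such as experts' convergence criteria for high-lift simulations. The core CBR retrieval step can be sketched as a toy nearest-neighbor lookup over numeric features (all names and values below are invented for illustration; the project's shell is far richer):

```python
def retrieve(case_base, query, k=2):
    """Return the labels of the k stored cases most similar to the query.

    Similarity is plain Euclidean distance over the query's numeric
    features -- the simplest possible stand-in for CBR retrieval.
    """
    def dist(features):
        return sum((features[f] - query[f]) ** 2 for f in query) ** 0.5
    ranked = sorted(case_base, key=lambda c: dist(c["features"]))
    return [c["label"] for c in ranked[:k]]

# Hypothetical captured experiences: simulation runs judged by an expert.
cases = [
    {"label": "converged", "features": {"residual": 1e-6, "iterations": 200}},
    {"label": "diverged",  "features": {"residual": 5.0,  "iterations": 50}},
    {"label": "converged", "features": {"residual": 2e-6, "iterations": 300}},
]
print(retrieve(cases, {"residual": 1e-5, "iterations": 250}, k=1))  # ['converged']
```

    In a real shell, the retrieved case would carry its concept-map rationale along with the label, so the designer sees why the expert judged similar runs as they did.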

  4. Designing and measuring the progress and impact of health research capacity strengthening initiatives

    PubMed Central

    2015-01-01

    Strengthening capacity in poorer countries to generate multi-disciplinary health research and to utilise research findings is one of the most effective ways of advancing the countries' health and development. This paper explores current knowledge about how to design health research capacity strengthening (RCS) programmes and how to measure their progress and impact. It describes a systematic, evidence-based approach for designing such programmes and highlights some of the key challenges that will be faced in the next 10 years. These include designing and implementing common frameworks to facilitate comparisons among capacity strengthening projects, and developing monitoring indicators that can capture their interactions with knowledge users and their impact on changes in health systems. PMID:28281707

  5. De Novo Design of Bioactive Small Molecules by Artificial Intelligence

    PubMed Central

    Merk, Daniel; Friedrich, Lukas; Grisoni, Francesca

    2018-01-01

    Abstract Generative artificial intelligence offers a fresh view on molecular design. We present the first‐time prospective application of a deep learning model for designing new druglike compounds with desired activities. For this purpose, we trained a recurrent neural network to capture the constitution of a large set of known bioactive compounds represented as SMILES strings. By transfer learning, this general model was fine‐tuned on recognizing retinoid X and peroxisome proliferator‐activated receptor agonists. We synthesized five top‐ranking compounds designed by the generative model. Four of the compounds revealed nanomolar to low‐micromolar receptor modulatory activity in cell‐based assays. Apparently, the computational model intrinsically captured relevant chemical and biological knowledge without the need for explicit rules. The results of this study advocate generative artificial intelligence for prospective de novo molecular design, and demonstrate the potential of these methods for future medicinal chemistry. PMID:29319225
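    The abstract above trains a recurrent neural network on SMILES strings. As a dependency-free stand-in for illustration only (the paper uses an RNN with long context, not a bigram model), the idea of "capturing the constitution" of SMILES can be sketched as learning which character may follow which:

```python
from collections import defaultdict
import random

def train_bigram(smiles_list):
    """Count character-to-character transitions over SMILES strings."""
    counts = defaultdict(lambda: defaultdict(int))
    for s in smiles_list:
        s = "^" + s + "$"          # start/end markers
        for a, b in zip(s, s[1:]):
            counts[a][b] += 1
    return counts

def sample(counts, rng, max_len=40):
    """Sample one string from the learned transition counts."""
    out, ch = [], "^"
    for _ in range(max_len):
        nxt = counts.get(ch)
        if not nxt:
            break
        chars = list(nxt)
        ch = rng.choices(chars, weights=[nxt[c] for c in chars])[0]
        if ch == "$":
            break
        out.append(ch)
    return "".join(out)

corpus = ["CCO", "CCN", "CCOC", "CCCC"]   # tiny toy "training set"
model = train_bigram(corpus)
print(sample(model, random.Random(0)))
```

    Transfer learning in the paper corresponds to continuing training of the general model on a small, focused set of agonist SMILES, biasing generation toward that chemotype; a bigram model is far too weak to produce valid druglike molecules, but the train-then-sample loop is structurally the same.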

  6. The Ontology of Clinical Research (OCRe): An Informatics Foundation for the Science of Clinical Research

    PubMed Central

    Sim, Ida; Tu, Samson W.; Carini, Simona; Lehmann, Harold P.; Pollock, Brad H.; Peleg, Mor; Wittkowski, Knut M.

    2013-01-01

    To date, the scientific process for generating, interpreting, and applying knowledge has received less informatics attention than operational processes for conducting clinical studies. The activities of these scientific processes — the science of clinical research — are centered on the study protocol, which is the abstract representation of the scientific design of a clinical study. The Ontology of Clinical Research (OCRe) is an OWL 2 model of the entities and relationships of study design protocols for the purpose of computationally supporting the design and analysis of human studies. OCRe’s modeling is independent of any specific study design or clinical domain. It includes a study design typology and a specialized module called ERGO Annotation for capturing the meaning of eligibility criteria. In this paper, we describe the key informatics use cases of each phase of a study’s scientific lifecycle, present OCRe and the principles behind its modeling, and describe applications of OCRe and associated technologies to a range of clinical research use cases. OCRe captures the central semantics that underlies the scientific processes of clinical research and can serve as an informatics foundation for supporting the entire range of knowledge activities that constitute the science of clinical research. PMID:24239612
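    OCRe's ERGO Annotation module, mentioned above, captures the meaning of eligibility criteria in machine-readable form. As an illustrative sketch only (the real module is an OWL 2 ontology, and the field names below are invented, not ERGO's), a structured numeric criterion and its evaluation might look like:

```python
# Toy structured eligibility criterion in the spirit of ERGO Annotation.
criterion = {
    "variable": "age",
    "unit": "years",
    "constraint": {"op": ">=", "value": 18},
}

def eligible(patient, criterion):
    """Evaluate one numeric criterion against a patient record."""
    value = patient[criterion["variable"]]
    op = criterion["constraint"]["op"]
    bound = criterion["constraint"]["value"]
    return {
        ">=": value >= bound,
        "<=": value <= bound,
        "==": value == bound,
    }[op]

print(eligible({"age": 54}, criterion))  # True
print(eligible({"age": 12}, criterion))  # False
```

    The payoff of such structuring is that criteria become queryable and comparable across studies, instead of living only in free-text protocol documents.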

  7. Competent Systems: Effective, Efficient, Deliverable.

    ERIC Educational Resources Information Center

    Abramson, Bruce

    Recent developments in artificial intelligence and decision analysis suggest reassessing the approaches commonly taken to the design of knowledge-based systems. Competent systems are based on models known as influence diagrams, which graphically capture a domain's basic objects and their interrelationships. Among the benefits offered by influence…
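    The abstract above describes influence diagrams as graphs capturing a domain's objects and interrelationships. A minimal data-structure sketch (illustrative only; real influence diagrams also attach probability tables to chance nodes and utilities to value nodes) is a DAG with typed nodes:

```python
# A minimal influence diagram as a typed directed graph.
NODE_TYPES = {"decision", "chance", "value"}

class InfluenceDiagram:
    def __init__(self):
        self.nodes, self.arcs = {}, []

    def add_node(self, name, kind):
        assert kind in NODE_TYPES
        self.nodes[name] = kind

    def add_arc(self, src, dst):
        assert src in self.nodes and dst in self.nodes
        self.arcs.append((src, dst))

    def parents(self, name):
        return [s for s, d in self.arcs if d == name]

# Hypothetical example: does testing influence the outcome's value?
d = InfluenceDiagram()
d.add_node("test?", "decision")
d.add_node("disease", "chance")
d.add_node("outcome", "value")
d.add_arc("test?", "outcome")
d.add_arc("disease", "outcome")
print(d.parents("outcome"))  # ['test?', 'disease']
```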

  8. Linking departmental priorities to knowledge management: the experiences of Santa Cruz County's Human Services Department.

    PubMed

    Lindberg, Arley

    2012-01-01

    Federal welfare reform, local service collaborations, and the evolution of statewide information systems inspired agency interest in evidence-informed practice and knowledge sharing systems. Four agency leaders, including the Director, Deputy Director, Director of Planning and Evaluation, and Staff Development Program Manager championed the development of a learning organization based on knowledge management throughout the agency. Internal department restructuring helped to strengthen the Planning and Evaluation, Staff Development, and Personnel units, which have become central to supporting knowledge sharing activities. The Four Pillars of Knowledge framework was designed to capture agency directions in relationship to future knowledge management goals. Featuring People, Practice, Technology and Budget, the framework links the agency's services, mission and goals to the process of becoming a learning organization. Built through an iterative process, the framework was created by observing existing activities in each department rather than being designed from the top down. Knowledge management can help the department to fulfill its mission despite reduced resources. Copyright © Taylor & Francis Group, LLC

  9. Managing clinical failure: a complex adaptive system perspective.

    PubMed

    Matthews, Jean I; Thomas, Paul T

    2007-01-01

    The purpose of this article is to explore the knowledge capture process at the clinical level. It aims to identify factors that enable or constrain learning. The study applies complex adaptive system thinking principles to reconcile learning within the NHS. The paper uses a qualitative exploratory study with an interpretative methodological stance set in a secondary care NHS Trust. Semi-structured interviews were conducted with healthcare practitioners and managers involved in both strategic and operational risk management processes. A network structure is revealed that exhibits the communication and interdependent working practices to support knowledge capture and adaptive learning. Collaborative multidisciplinary communities, whose values reflect local priorities and promote open dialogue and reflection, are featured. The main concern is that the characteristics of bureaucracy; rational-legal authority, a rule-based culture, hierarchical lines of communication and a centralised governance focus, are hindering clinical learning by generating barriers. Locally emergent collaborative processes are a key strategic resource to capture knowledge, potentially fostering an environment that could learn from failure and translate lessons between contexts. What must be addressed is that reporting mechanisms serve not only the governance objectives, but also supplement learning by highlighting the potential lessons in context. Managers must nurture a collaborative infrastructure using networks in a co-evolutionary manner. Their role is not to direct and design processes but to influence, support and create effective knowledge capture. Although the study only investigated one site, the findings and conclusions may well translate to other trusts--such as the risk of not enabling a learning environment at clinical levels.

  10. Knowledge Preservation for Design of Rocket Systems

    NASA Technical Reports Server (NTRS)

    Moreman, Douglas

    2002-01-01

    An engineer at NASA Lewis RC presented a challenge to us at Southern University. Our response to that challenge, stated circa 1993, has evolved into the Knowledge Preservation Project which is here reported. The stated problem was to capture some of the knowledge of retiring NASA engineers and make it useful to younger engineers via computers. We evolved that initial challenge to this - design a system of tools such that, with this system, people might efficiently capture and make available via commonplace computers, deep knowledge of retiring NASA engineers. In the process of proving some of the concepts of this system, we would (and did) capture knowledge from some specific engineers and, so, meet the original challenge along the way to meeting the new. Some of the specific knowledge acquired, particularly that on the RL-10 engine, was directly relevant to design of rocket engines. We considered and rejected some of the techniques popular in the days we began - specifically "expert systems" and "oral histories". We judged that these old methods had too high a cost per sentence preserved. That cost could be measured in hours of labor of a "knowledge professional". We did spend, particularly in the grant preceding this one, some time creating a couple of "concept maps", one of the latest ideas of the day, but judged this also to be costly in time of a specially trained knowledge-professional. We reasoned that the cost in specialized labor could be lowered if less time were spent being selective about sentences from the engineers and in crafting replacements for those sentences. The trade-off would seem to be that our set of sentences would be less dense in information, but we found a computer-based way around this seeming defect. 
Our plan, details of which we have been carrying out, was to find methods of extracting information from experts which would be capable of gaining cooperation, and interest, of senior engineers and using their time in a way they would find worthy (and, so, they would give more of their time and recruit time of other engineers as well). We studied these four ways of creating text: 1) the old way, via interviews and discussions - one of our team working with one expert, 2) a group-discussion led by one of the experts themselves and on a topic which inspires interaction of the experts, 3) a spoken dissertation by one expert practiced in giving talks, 4) expropriating, and modifying for our system, some existing reports (such as "oral histories" from the Smithsonian Institution).

  11. 76 FR 15224 - Reducing Regulatory Burden; Retrospective Review Under E.O. 13563

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-21

    ... Department's Regulations Portal at http://www.dol.gov/regulations/regreview.htm . All comments will be.... The Department recognizes the knowledge of programs and their implementing regulations that exists... portal specifically designed to capture your input and suggestions, http://www.dol.gov/regulations...

  12. 76 FR 18104 - Reducing Regulatory Burden; Retrospective Review Under E.O. 13563

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-01

    ... April 8, 2011. ADDRESSES: You may submit comments through the Department's Regulations Portal at http... minimizing the burden on regulated entities. The Department recognizes the knowledge of programs and their... Department has created an Internet portal specifically designed to capture your input and suggestions, http...

  13. Screen-Capture Instructional Technology: A Cognitive Tool for Designing a Blended Multimedia Curriculum

    ERIC Educational Resources Information Center

    Smith, Jeffrey G.; Smith, Rita L.

    2012-01-01

    Online instruction has been demonstrated to increase the academic achievement for post-secondary students; however, little empirical investigation has been conducted on high school students learning from online multimedia instruction in the traditional classroom. This study investigated the knowledge acquisition, transfer, and favorability of…

  14. Measuring Knowledge Integration Learning of Energy Topics: A Two-Year Longitudinal Study

    ERIC Educational Resources Information Center

    Liu, Ou Lydia; Ryoo, Kihyun; Linn, Marcia C.; Sato, Elissa; Svihla, Vanessa

    2015-01-01

    Although researchers call for inquiry learning in science, science assessments rarely capture the impact of inquiry instruction. This paper reports on the development and validation of assessments designed to measure middle-school students' progress in gaining integrated understanding of energy while studying an inquiry-oriented curriculum. The…

  15. A Working Framework for Enabling International Science Data System Interoperability

    NASA Astrophysics Data System (ADS)

    Hughes, J. Steven; Hardman, Sean; Crichton, Daniel J.; Martinez, Santa; Law, Emily; Gordon, Mitchell K.

    2016-07-01

    For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework that leverages ISO level reference models for metadata registries and digital archives. This framework provides multi-level governance, evolves independent of the implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation is captured in an ontology through a process of knowledge acquisition. Discipline experts in the role of stewards at the common, discipline, and project levels work to design and populate the ontology model. The result is a formal and consistent knowledge base that provides requirements for data representation, integrity, provenance, context, identification, and relationship. The contents of the knowledge base are translated and written to files in suitable formats to configure system software and services, provide user documentation, validate input, and support data analytics. This presentation will provide an overview of the framework, present a use case that has been adopted by an entire science discipline at the international level, and share some important lessons learned.
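    The framework above translates ontology contents into files that configure software and validate input. As a toy sketch of that "model drives validation" idea (attribute names and rules below are invented, not drawn from the actual ontology), a small attribute dictionary can be turned directly into a record validator:

```python
# Hypothetical attribute dictionary standing in for ontology output.
ONTOLOGY = {
    "product_id":   {"type": str, "required": True},
    "release_date": {"type": str, "required": True},
    "version":      {"type": int, "required": False},
}

def validate(record, ontology=ONTOLOGY):
    """Return a list of violations of the ontology-derived rules."""
    errors = []
    for attr, rules in ontology.items():
        if attr not in record:
            if rules["required"]:
                errors.append(f"missing required attribute: {attr}")
            continue
        if not isinstance(record[attr], rules["type"]):
            errors.append(f"wrong type for {attr}")
    return errors

print(validate({"product_id": "urn:123", "release_date": "2016-07-01"}))  # []
print(validate({"product_id": 42}))
```

    Because the rules live in the model rather than in hand-written code, stewards at the common, discipline, and project levels can evolve governance without touching the validator itself.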

  16. U.S. Spacesuit Knowledge Capture Status and Initiatives in Fiscal Year 2014

    NASA Technical Reports Server (NTRS)

    Chullen, Cinda; Oliva, Vladenka R.

    2015-01-01

    Since its 2008 inception, the NASA U.S. Spacesuit Knowledge Capture (KC) program has shared historical spacesuit information with engineers and other technical team members to expand their understanding of the spacesuit's evolution, known capability and limitations, and future desires and needs for its use. As part of the U.S. Spacesuit KC program, subject-matter experts have delivered presentations, held workshops, and participated in interviews to share valuable spacesuit lessons learned to ensure this vital information will survive for existing and future generations to use. These events have included spacesuit knowledge from the inception of NASA's first spacesuit to current spacesuit design. To ensure that this information is shared with the entire NASA community and other interested or invested entities, these KC events were digitally recorded and transcribed to be uploaded onto several applicable NASA Web sites. This paper discusses the various Web sites that the KC events are uploaded to and possible future sites that will channel this information.

  17. U.S. Spacesuit Knowledge Capture Accomplishments in Fiscal Year 2014

    NASA Technical Reports Server (NTRS)

    Chullen, Cinda; Oliva, Vladenka R.

    2015-01-01

    Since its 2008 inception, the NASA U.S. Spacesuit Knowledge Capture (KC) program has shared historical spacesuit information with engineers and other technical team members to expand their understanding of the spacesuit's evolution, known capability and limitations, and future desires and needs for its use. As part of the U.S. Spacesuit KC program, subject-matter experts have delivered presentations, held workshops, and participated in interviews to share valuable spacesuit lessons learned to ensure this vital information will survive for existing and future generations to use. These events have included spacesuit knowledge from the inception of NASA's first spacesuit to current spacesuit design. To ensure that this information is shared with the entire NASA community and other interested or invested entities, these KC events were digitally recorded and transcribed to be uploaded onto several applicable NASA Web sites. This paper discusses the various Web sites that the KC events are uploaded to and possible future sites that will channel this information.

  18. Derate Mitigation Options for Pulverized Coal Power Plant Carbon Capture Retrofits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffmann, Jeffrey W.; Hackett, Gregory A.; Lewis, Eric G.

    Carbon capture and storage (CCS) technologies available in the near-term for pulverized coal-fueled power plants (i.e., post combustion solvent technologies) require substantial capital investment and result in marked decrease in electricity available for sale to the grid. The impact to overall plant economics can be mitigated for new plant designs (where the entire plant can be optimized around the CCS system). However, existing coal-fueled power plants were designed without the knowledge or intent to retrofit a CCS process, and it is simply not possible to re-engineer an existing plant in a manner that it could achieve the same performance as if it was originally designed and optimized for CCS technology. Pairing an auxiliary steam supply to the capture system is a technically feasible option to mitigate the derate resulting from diverting steam away from an existing steam turbine and continuing to run that turbine at steam flow rates and properties outside of the original design specifications. The results of this analysis strongly support the merits of meeting the steam and power requirements for a retrofitted post-combustion solvent based carbon dioxide (CO2) capture system with an auxiliary combined heat and power (CHP) plant rather than robbing the base plant (i.e., diverting steam from the existing steam cycle and electricity from sale to the grid).

  19. WE-F-BRB-00: New Developments in Knowledge-Based Treatment Planning and Automation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2015-06-15

    Advancements in informatics in radiotherapy are opening up opportunities to improve our ability to assess treatment plans. Models on individualizing patient dose constraints from prior patient data and shape relationships have been extensively researched and are now making their way into commercial products. New developments in knowledge-based treatment planning involve understanding the impact of the radiation dosimetry on the patient. Akin to radiobiology models that have driven intensity modulated radiotherapy optimization, toxicity and outcome predictions based on treatment plans and prior patient experiences may be the next step in knowledge-based planning. In order to realize these predictions, it is necessary to understand how the clinical information can be captured, structured, and organized with ontologies and databases designed for recall. Large databases containing radiation dosimetry and outcomes present the opportunity to evaluate treatment plans against predictions of toxicity and disease response. Such evaluations can be based on the dose volume histogram or even the full 3-dimensional dose distribution and its relation to the critical anatomy. This session will provide an understanding of ontologies and standard terminologies used to capture clinical knowledge in structured databases; how data can be organized and accessed to utilize the knowledge in planning; and examples of research and clinical efforts to incorporate that clinical knowledge into planning for improved care for our patients. Learning Objectives: (1) understand the role of standard terminologies, ontologies, and data organization in oncology; (2) understand methods to capture clinical toxicity and outcomes in a clinical setting; (3) understand opportunities to learn from clinical data and its application to treatment planning. Todd McNutt receives funding from Philips, Elekta and Toshiba for some of the work presented.

  20. De Novo Design of Bioactive Small Molecules by Artificial Intelligence.

    PubMed

    Merk, Daniel; Friedrich, Lukas; Grisoni, Francesca; Schneider, Gisbert

    2018-01-01

    Generative artificial intelligence offers a fresh view on molecular design. We present the first-time prospective application of a deep learning model for designing new druglike compounds with desired activities. For this purpose, we trained a recurrent neural network to capture the constitution of a large set of known bioactive compounds represented as SMILES strings. By transfer learning, this general model was fine-tuned on recognizing retinoid X and peroxisome proliferator-activated receptor agonists. We synthesized five top-ranking compounds designed by the generative model. Four of the compounds revealed nanomolar to low-micromolar receptor modulatory activity in cell-based assays. Apparently, the computational model intrinsically captured relevant chemical and biological knowledge without the need for explicit rules. The results of this study advocate generative artificial intelligence for prospective de novo molecular design, and demonstrate the potential of these methods for future medicinal chemistry. © 2018 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  1. Modeling and optimal design of CO2 Direct Air Capture systems in large arrays

    NASA Astrophysics Data System (ADS)

    Sadri Irani, Samaneh; Luzzatto-Fegiz, Paolo

    2017-11-01

    As noted by the 2014 IPCC report, while the rise in atmospheric CO2 would be slowed by emissions reductions, removing atmospheric CO2 is an important part of possible paths to climate stabilization. Direct Air Capture of CO2 with chemicals (DAC) is one of several proposed carbon capture technologies. There is an ongoing debate on whether DAC is an economically viable approach to alleviating climate change. In addition, like all air capture strategies, DAC is strongly constrained by the net-carbon problem, namely the need to control CO2 emissions associated with the capture process (for example, if DAC is not powered by renewables). Research to date has focused on the chemistry and economics of individual DAC devices. However, the fluid mechanics of their large-scale deployment has not been examined in the literature, to the best of our knowledge. In this presentation, we develop a model for flow through an array of DAC devices, varying their lateral extent and their separation. We build on a recent theory of canopy flows, introducing terms for CO2 entrainment into the array boundary layer and transport into the farm. In addition, we examine the possibility of driving the flow passively by wind, thereby reducing energy consumption. The optimal operational design is established considering the total cost, drag force, energy consumption, and total CO2 capture.

  2. Beyond knowledge capture: creating useful work-centric systems

    NASA Technical Reports Server (NTRS)

    Cooper, L. P.; Majchrzak, A.

    2001-01-01

    Once you have successfully captured knowledge, the challenge becomes creating an effective way to use that knowledge. Two high-knowledge-content systems developed at the Jet Propulsion Laboratory are presented as examples of work-centric systems, where the primary value to the user is in the content.

  3. A knowledge-based patient assessment system: conceptual and technical design.

    PubMed Central

    Reilly, C. A.; Zielstorff, R. D.; Fox, R. L.; O'Connell, E. M.; Carroll, D. L.; Conley, K. A.; Fitzgerald, P.; Eng, T. K.; Martin, A.; Zidik, C. M.; Segal, M.

    2000-01-01

    This paper describes the design of an inpatient patient assessment application that captures nursing assessment data using a wireless laptop computer. The primary aim of this system is to capture structured information for facilitating decision support and quality monitoring. The system also aims to improve efficiency of recording patient assessments, reduce costs, and improve discharge planning and early identification of patient learning needs. Object-oriented methods were used to elicit functional requirements and to model the proposed system. A tools-based development approach is being used to facilitate rapid development and easy modification of assessment items and rules for decision support. Criteria for evaluation include perceived utility by clinician users, validity of decision support rules, time spent recording assessments, and perceived utility of aggregate reports for quality monitoring. PMID:11079970

  4. VibroCV: a computer vision-based vibroarthrography platform with possible application to Juvenile Idiopathic Arthritis.

    PubMed

    Wiens, Andrew D; Prahalad, Sampath; Inan, Omer T

    2016-08-01

    Vibroarthrography, a method for interpreting the sounds emitted by a knee during movement, has been studied for several joint disorders since 1902. However, to our knowledge, the usefulness of this method for management of Juvenile Idiopathic Arthritis (JIA) has not been investigated. To study joint sounds as a possible new biomarker for pediatric cases of JIA we designed and built VibroCV, a platform to capture vibroarthrograms from four accelerometers; electromyograms (EMG) and inertial measurements from four wireless EMG modules; and joint angles from two Sony Eye cameras and six light-emitting diodes with commercially-available off-the-shelf parts and computer vision via OpenCV. This article explains the design of this turn-key platform in detail, and provides a sample recording captured from a pediatric subject.

  5. A knowledge-based patient assessment system: conceptual and technical design.

    PubMed

    Reilly, C A; Zielstorff, R D; Fox, R L; O'Connell, E M; Carroll, D L; Conley, K A; Fitzgerald, P; Eng, T K; Martin, A; Zidik, C M; Segal, M

    2000-01-01

    This paper describes the design of an inpatient patient assessment application that captures nursing assessment data using a wireless laptop computer. The primary aim of this system is to capture structured information for facilitating decision support and quality monitoring. The system also aims to improve efficiency of recording patient assessments, reduce costs, and improve discharge planning and early identification of patient learning needs. Object-oriented methods were used to elicit functional requirements and to model the proposed system. A tools-based development approach is being used to facilitate rapid development and easy modification of assessment items and rules for decision support. Criteria for evaluation include perceived utility by clinician users, validity of decision support rules, time spent recording assessments, and perceived utility of aggregate reports for quality monitoring.

  6. Assessment of the NASA Space Shuttle Program's Problem Reporting and Corrective Action System

    NASA Technical Reports Server (NTRS)

    Korsmeryer, D. J.; Schreiner, J. A.; Norvig, Peter (Technical Monitor)

    2001-01-01

    This paper documents the general findings and recommendations of the Design for Safety Program's study of the Space Shuttle Program's (SSP) Problem Reporting and Corrective Action (PRACA) System. The goals of this study were to evaluate and quantify the technical aspects of the SSP's PRACA systems and to recommend enhancements addressing specific deficiencies in preparation for future system upgrades. The study determined that the extant SSP PRACA systems provided project-level support through the use of a large pool of domain experts and a variety of distributed formal and informal database systems. This operational model is vulnerable to staff turnover and to the loss of the vast corporate knowledge that is not currently captured by the PRACA system. This study defined the need for a Program-level PRACA system providing improved insight, unification, knowledge capture, and collaborative tools.

  7. Tacit Knowledge Capture and the Brain-Drain at Electrical Utilities

    NASA Astrophysics Data System (ADS)

    Perjanik, Nicholas Steven

    As a consequence of an aging workforce, electric utilities are at risk of losing their most experienced and knowledgeable electrical engineers. In this research, the problem was a lack of understanding of what electric utilities were doing to capture the tacit knowledge, or know-how, of these engineers. The purpose of this qualitative research study was to explore the tacit knowledge capture strategies currently used in the industry by conducting a case study of 7 U.S. electrical utilities that have demonstrated an industry commitment to improving operational standards. The research question addressed the strategies implemented to capture the tacit knowledge of retiring electrical engineers and technical personnel. The research methodology involved a qualitative embedded case study. The theories used in this study included knowledge creation theory, resource-based theory, and organizational learning theory. Data were collected through one-time interviews of a senior electrical engineer or technician within each utility and a workforce planning or training professional within 2 of the 7 utilities. The analysis included the use of triangulation and content analysis strategies. Ten tacit knowledge capture strategies were identified: (a) formal and informal on-boarding mentorship and apprenticeship programs, (b) formal and informal off-boarding mentorship programs, (c) formal and informal training programs, (d) using lessons learned during training sessions, (e) communities of practice, (f) technology-enabled tools, (g) storytelling, (h) exit interviews, (i) rehiring of retirees as consultants, and (j) knowledge risk assessments. This research contributes to social change by offering strategies to capture the know-how needed to ensure operational continuity in the delivery of safe, reliable, and sustainable power.

  8. The Training and Field Work Experiences of Community Health Workers conducting non-invasive, population-based screening for Cardiovascular Disease in Four Communities in Low and Middle-Income Settings

    PubMed Central

    Denman, Catalina A.; Montano, Carlos Mendoza; Gaziano, Thomas A.; Levitt, Naomi; Rivera-Andrade, Alvaro; Carrasco, Diana Munguía; Zulu, Jabu; Khanam, Masuma Akter; Puoane, Thandi

    2015-01-01

    Background Cardiovascular disease (CVD) is on the rise in low- and middle-income countries (LMIC) and is proving difficult to combat due to the emphasis on improving outcomes in maternal and child health and infectious diseases, against a backdrop of severe human resource and infrastructure constraints. Effective task-sharing from physicians or nurses to community health workers (CHWs) to conduct population-based screening for persons at risk has the potential to mitigate the impact of CVD on vulnerable populations. CHWs in Bangladesh, Guatemala, Mexico, and South Africa were trained to conduct non-invasive population-based screening for persons at high risk for CVD. Objectives The objectives of this study were to quantitatively assess the performance of CHWs during training and to qualitatively capture their training and fieldwork experiences while conducting non-invasive screening for cardiovascular disease (CVD) risk in their communities. Methods Written tests were used to assess CHWs' acquisition of content knowledge during training, and focus group discussions were conducted to capture their training and fieldwork experiences. Results Training was effective at increasing the CHWs' content knowledge of cardiovascular disease (CVD), and this knowledge was largely retained up to six months after the completion of field work. Common themes that need to be addressed when designing task-sharing with CHWs in chronic diseases are identified, including language, respect, and compensation. The importance of having intimate knowledge of the community receiving services, from design to implementation, is underscored. Conclusions Effective training for screening for CVD in community settings should have a strong didactic core that is supplemented with culture-specific adaptations in the delivery of instruction. The incorporation of expert and intimate knowledge of the communities themselves is critical, from the design to implementation phases of training.
Challenges such as role definition, defining career paths, and providing adequate remuneration, must be addressed. PMID:25754566

  9. The training and fieldwork experiences of community health workers conducting population-based, noninvasive screening for CVD in LMIC.

    PubMed

    Abrahams-Gessel, Shafika; Denman, Catalina A; Montano, Carlos Mendoza; Gaziano, Thomas A; Levitt, Naomi; Rivera-Andrade, Alvaro; Carrasco, Diana Munguía; Zulu, Jabu; Khanam, Masuma Akter; Puoane, Thandi

    2015-03-01

    Cardiovascular disease (CVD) is on the rise in low- and middle-income countries and is proving difficult to combat due to the emphasis on improving outcomes in maternal and child health and infectious diseases against a backdrop of severe human resource and infrastructure constraints. Effective task-sharing from physicians or nurses to community health workers (CHW) to conduct population-based screening for persons at risk has the potential to mitigate the impact of CVD on vulnerable populations. CHW in Bangladesh, Guatemala, Mexico, and South Africa were trained to conduct noninvasive population-based screening for persons at high risk for CVD. This study sought to quantitatively assess the performance of CHW during training and to qualitatively capture their training and fieldwork experiences while conducting noninvasive screening for CVD risk in their communities. Written tests were used to assess CHW's acquisition of content knowledge during training, and focus group discussions were conducted to capture their training and fieldwork experiences. Training was effective at increasing the CHW's content knowledge of CVD, and this knowledge was largely retained up to 6 months after the completion of fieldwork. Common themes that need to be addressed when designing task-sharing with CHW in chronic diseases are identified, including language, respect, and compensation. The importance of having intimate knowledge of the community receiving services from design to implementation is underscored. Effective training for screening for CVD in community settings should have a strong didactic core that is supplemented with culture-specific adaptations in the delivery of instruction. The incorporation of expert and intimate knowledge of the communities themselves is critical, from the design to implementation phases of training. Challenges such as role definition, defining career paths, and providing adequate remuneration must be addressed. 
Copyright © 2015 World Heart Federation (Geneva). Published by Elsevier B.V. All rights reserved.

  10. Reasoning with case histories of process knowledge for efficient process development

    NASA Technical Reports Server (NTRS)

    Bharwani, Seraj S.; Walls, Joe T.; Jackson, Michael E.

    1988-01-01

    The significance of compiling case histories of empirical process knowledge and the role of such histories in improving the efficiency of manufacturing process development is discussed in this paper. Methods of representing important investigations as cases and using the information from such cases to eliminate redundancy of empirical investigations in analogous process development situations are also discussed. A system is proposed that uses such methods to capture the problem-solving framework of the application domain. A conceptual design of the system is presented and discussed.

  11. WE-F-BRB-02: Setting the Stage for Incorporation of Toxicity Measures in Treatment Plan Assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayo, C.

    2015-06-15

    Advancements in informatics in radiotherapy are opening up opportunities to improve our ability to assess treatment plans. Models for individualizing patient dose constraints from prior patient data and shape relationships have been extensively researched and are now making their way into commercial products. New developments in knowledge-based treatment planning involve understanding the impact of the radiation dosimetry on the patient. Akin to the radiobiology models that have driven intensity-modulated radiotherapy optimization, toxicity and outcome predictions based on treatment plans and prior patient experiences may be the next step in knowledge-based planning. In order to realize these predictions, it is necessary to understand how clinical information can be captured, structured, and organized with ontologies and databases designed for recall. Large databases containing radiation dosimetry and outcomes present the opportunity to evaluate treatment plans against predictions of toxicity and disease response. Such evaluations can be based on the dose-volume histogram or even the full 3-dimensional dose distribution and its relation to the critical anatomy. This session will provide an understanding of the ontologies and standard terminologies used to capture clinical knowledge in structured databases; how data can be organized and accessed to utilize that knowledge in planning; and examples of research and clinical efforts to incorporate that clinical knowledge into planning for improved care for our patients. Learning Objectives: Understand the role of standard terminologies, ontologies, and data organization in oncology. Understand methods to capture clinical toxicity and outcomes in a clinical setting. Understand opportunities to learn from clinical data and its application to treatment planning. Todd McNutt receives funding from Philips, Elekta, and Toshiba for some of the work presented.

  12. WE-F-BRB-03: Inclusion of Data-Driven Risk Predictions in Radiation Treatment Planning in the Context of a Local Level Learning Health System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McNutt, T.

    Advancements in informatics in radiotherapy are opening up opportunities to improve our ability to assess treatment plans. Models for individualizing patient dose constraints from prior patient data and shape relationships have been extensively researched and are now making their way into commercial products. New developments in knowledge-based treatment planning involve understanding the impact of the radiation dosimetry on the patient. Akin to the radiobiology models that have driven intensity-modulated radiotherapy optimization, toxicity and outcome predictions based on treatment plans and prior patient experiences may be the next step in knowledge-based planning. In order to realize these predictions, it is necessary to understand how clinical information can be captured, structured, and organized with ontologies and databases designed for recall. Large databases containing radiation dosimetry and outcomes present the opportunity to evaluate treatment plans against predictions of toxicity and disease response. Such evaluations can be based on the dose-volume histogram or even the full 3-dimensional dose distribution and its relation to the critical anatomy. This session will provide an understanding of the ontologies and standard terminologies used to capture clinical knowledge in structured databases; how data can be organized and accessed to utilize that knowledge in planning; and examples of research and clinical efforts to incorporate that clinical knowledge into planning for improved care for our patients. Learning Objectives: Understand the role of standard terminologies, ontologies, and data organization in oncology. Understand methods to capture clinical toxicity and outcomes in a clinical setting. Understand opportunities to learn from clinical data and its application to treatment planning. Todd McNutt receives funding from Philips, Elekta, and Toshiba for some of the work presented.

  13. U.S. Spacesuit Knowledge Capture Series Catalog

    NASA Technical Reports Server (NTRS)

    Bitterly, Rose; Oliva, Vladenka

    2012-01-01

    The National Aeronautics and Space Administration (NASA) and other organizations have been performing U.S. Spacesuit Knowledge Capture (USSKC) since the beginning of space exploration through published reports, conference presentations, specialized seminars, and classes taught by veterans in the field. The close physical interaction between spacesuit systems and human beings makes them among the most personally evocative pieces of space hardware. Consequently, spacesuit systems have required nearly constant engineering refinements to do their jobs without impinging on human activity. Since 2008, spacesuit knowledge capture has occurred through video recording, engaging both current and former specialists to present technical material, specifically to educate individuals and preserve knowledge. These archives of the spacesuit's legacy reflect its rich history and will provide knowledge that enhances the chances of success for future, more ambitious spacesuit system programs. The scope and topics of USSKC have included lessons learned in spacesuit technology; experience from the Gemini, Apollo, Skylab, and Shuttle Programs; the processes of hardware certification, design, development, and other program components; spacesuit evolution and experience; failure analysis and resolution; and aspects of program management. USSKC activities have progressed to a level where NASA, the National Air and Space Museum (NASM), Hamilton Sundstrand (HS), and the spacesuit community are now working together to provide a comprehensive way to organize and archive intra-agency information related to the development of spacesuit systems. These video recordings are currently being reviewed for public release using NASA export control processes. 
After a decision is made for either public or non-public release (internal NASA only), the videos and presentations will be available through the NASA Johnson Space Center Engineering Directorate (EA) Engineering Academy, the NASA Technical Reports Server (NTRS), the NASA Aeronautics & Space Database (NA&SD), or NASA YouTube. Event availability is duly noted in this catalog.

  14. U.S. Spacesuit Knowledge Capture Accomplishments in Fiscal Years 2012 and 2013

    NASA Technical Reports Server (NTRS)

    Chullen, Cinda; Oliva, Vladenka R.

    2014-01-01

    The NASA U.S. spacesuit knowledge capture (KC) program has been in operation since the beginning of 2008. The program was designed to provide engineers and others with historical information about spacesuits. A multitude of seminars have captured spacesuit history and knowledge over the six years of the program's existence. Subject matter experts have given lectures and been interviewed to help bring the spacesuit to life so that lessons learned will never be lost. The program has also reached out to the public and industry by making the recorded events part of the public domain through the NASA technical library and YouTube. The U.S. spacesuit KC topics have included lessons learned from some of the most prominent spacesuit experts and spacesuit users, including current and former astronauts. The events have enriched the spacesuit legacy knowledge from the Gemini, Apollo, Skylab, Space Shuttle, and International Space Station Programs. As well, expert engineers and scientists have shared their challenges and successes to be remembered. The last few years have been some of the most successful of the KC program's life, with numerous recordings and releases to the public, as evidenced by the thousands who have viewed the recordings online. This paper reviews the events accomplished and archived over Fiscal Years 2012 and 2013 and highlights a few of the most memorable ones. This paper also communicates ways to access the events, which are available both internally to NASA and in the public domain.

  15. Sketching for Knowledge Capture: A Progress Report

    DTIC Science & Technology

    2002-01-16

    understanding, qualitative modeling, knowledge acquisition, analogy, diagrammatic reasoning, spatial reasoning. INTRODUCTION Sketching is often used... main limits of sKEA's expressivity are (a) the predicate vocabulary in its knowledge base and (b) how natural it is to express a piece of information... Sketching for knowledge capture: A progress report. Kenneth D. Forbus, Qualitative Reasoning Group, Northwestern University, 1890 Maple Avenue

  16. The Sydney West Knowledge Portal: Evaluating the Growth of a Knowledge Portal to Support Translational Research.

    PubMed

    Janssen, Anna; Robinson, Tracy Elizabeth; Provan, Pamela; Shaw, Tim

    2016-06-29

    The Sydney West Translational Cancer Research Centre is an organization funded to build capacity for translational research in cancer. Translational research is essential for ensuring the integration of best available evidence into practice and for improving patient outcomes. However, there is a low level of awareness regarding what it is and how to conduct it optimally. One solution to addressing this gap is the design and deployment of web-based knowledge portals to disseminate new knowledge and to engage with and connect dispersed networks of researchers. A knowledge portal is a web-based platform for increasing knowledge dissemination and management in a specialized area. The aim of this study was to measure the design and growth of a web-based knowledge portal for increasing individual awareness of translational research and building organizational capacity for the delivery of translational research projects in cancer. An adaptive methodology was used to capture the design and growth of a web-based knowledge portal in cancer. This involved stakeholder consultations to inform the initial design of the portal. Once the portal was live, site analytics were reviewed to evaluate member usage of the portal and to measure growth in membership. Knowledge portal membership grew consistently for the first 18 months after deployment before leveling out. Analysis of site metrics revealed that members were most likely to visit portal pages with community-generated content, particularly pages with a focus on translational research. This was closely followed by pages that disseminated educational material about translational research. Preliminary data from this study suggest that knowledge portals may be beneficial tools for translating new evidence and fostering an environment of communication and collaboration.

  17. The Sydney West Knowledge Portal: Evaluating the Growth of a Knowledge Portal to Support Translational Research

    PubMed Central

    2016-01-01

    Background The Sydney West Translational Cancer Research Centre is an organization funded to build capacity for translational research in cancer. Translational research is essential for ensuring the integration of best available evidence into practice and for improving patient outcomes. However, there is a low level of awareness regarding what it is and how to conduct it optimally. One solution to addressing this gap is the design and deployment of web-based knowledge portals to disseminate new knowledge and to engage with and connect dispersed networks of researchers. A knowledge portal is a web-based platform for increasing knowledge dissemination and management in a specialized area. Objective To measure the design and growth of a web-based knowledge portal for increasing individual awareness of translational research and building organizational capacity for the delivery of translational research projects in cancer. Methods An adaptive methodology was used to capture the design and growth of a web-based knowledge portal in cancer. This involved stakeholder consultations to inform the initial design of the portal. Once the portal was live, site analytics were reviewed to evaluate member usage of the portal and to measure growth in membership. Results Knowledge portal membership grew consistently for the first 18 months after deployment before leveling out. Analysis of site metrics revealed that members were most likely to visit portal pages with community-generated content, particularly pages with a focus on translational research. This was closely followed by pages that disseminated educational material about translational research. Conclusions Preliminary data from this study suggest that knowledge portals may be beneficial tools for translating new evidence and fostering an environment of communication and collaboration. PMID:27357641

  18. Digital Humanities: Envisioning a Collaborative Tool for Mapping, Evaluating, and Sharing Reconstructed Colonial American Parcel Maps

    ERIC Educational Resources Information Center

    Ruvane, Mary Brent

    2012-01-01

    The use of GIS technology for the humanities has opened up new avenues for visually exploring and asking questions of our nation's historical record. The potential to harness new knowledge with tools designed to capture and preserve geographic links to the artifacts of our past is within our grasp. This research explores the common information…

  19. Introduction of knowledge bases in patient's data management system: role of the user interface.

    PubMed

    Chambrin, M C; Ravaux, P; Jaborska, A; Beugnet, C; Lestavel, P; Chopin, C; Boniface, M

    1995-02-01

    As the number of signals and data items to be handled in the intensive care unit grows, it is necessary to design more powerful computing systems that integrate and summarize all of this information. The manual input of data such as clinical signs and drug prescriptions, and the synthetic representation of these data, require an ever more sophisticated user interface. The introduction of knowledge bases into data management makes it possible to build contextual interfaces. The objective of this paper is to show the importance of user-interface design in the daily use of a clinical information system. We then describe a methodology that uses man-machine interaction to capture clinician knowledge during clinical practice. The steps are an audit of the user's actions, the elaboration of statistical models allowing the definition of new knowledge, and a validation performed before complete integration. Part of this knowledge can be used to improve the user interface. Finally, we describe the implementation of these concepts on a UNIX platform using the OSF/MOTIF graphical interface.

  20. Gene regulation knowledge commons: community action takes care of DNA binding transcription factors

    PubMed Central

    Tripathi, Sushil; Vercruysse, Steven; Chawla, Konika; Christie, Karen R.; Blake, Judith A.; Huntley, Rachael P.; Orchard, Sandra; Hermjakob, Henning; Thommesen, Liv; Lægreid, Astrid; Kuiper, Martin

    2016-01-01

    A large gap remains between the amount of knowledge in the scientific literature and the fraction that gets curated into standardized databases, despite many curation initiatives. Yet the availability of comprehensive knowledge in databases is crucial for exploiting existing background knowledge, both for designing follow-up experiments and for interpreting new experimental data. Structured resources also underpin the computational integration and modeling of regulatory pathways, which further aids our understanding of regulatory dynamics. We argue how cooperation between the scientific community and professional curators can increase the capacity for capturing precise knowledge from the literature. We demonstrate this with a project in which we mobilize biological domain experts to curate a large number of DNA-binding transcription factors, and show that they, although new to the field of curation, can make valuable contributions by harvesting reported knowledge from scientific papers. Such community curation can enhance the scientific epistemic process. Database URL: http://www.tfcheckpoint.org PMID:27270715

  1. Discussion of “Bayesian design of experiments for industrial and scientific applications via gaussian processes”

    DOE PAGES

    Anderson-Cook, Christine M.; Burke, Sarah E.

    2016-10-18

    First, we would like to commend Dr. Woods on his thought-provoking paper and insightful presentation at the 4th Annual Stu Hunter conference. We think that the material presented highlights some important needs in the area of design of experiments for generalized linear models (GLMs). In addition, we agree with Dr. Woods that design of experiments for GLMs does implicitly require expert judgement about model parameters, and hence using a Bayesian approach to capture this knowledge is a natural strategy for summarizing what is known, with the opportunity to incorporate associated uncertainty about that information.

  2. Discussion of “Bayesian design of experiments for industrial and scientific applications via gaussian processes”

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson-Cook, Christine M.; Burke, Sarah E.

    First, we would like to commend Dr. Woods on his thought-provoking paper and insightful presentation at the 4th Annual Stu Hunter conference. We think that the material presented highlights some important needs in the area of design of experiments for generalized linear models (GLMs). In addition, we agree with Dr. Woods that design of experiments for GLMs does implicitly require expert judgement about model parameters, and hence using a Bayesian approach to capture this knowledge is a natural strategy for summarizing what is known, with the opportunity to incorporate associated uncertainty about that information.

  3. Detailed seafloor habitat mapping to enhance marine-resource management

    USGS Publications Warehouse

    Zawada, David G.; Hart, Kristen M.

    2010-01-01

    Pictures of the seafloor capture important information about the sediments, exposed geologic features, submerged aquatic vegetation, and animals found in a given habitat. With the emergence of marine protected areas (MPAs) as a favored tactic for preserving coral reef resources, knowledge of essential habitat components is paramount to designing effective management strategies. Surprisingly, detailed information on seafloor habitat components is not available in many areas that are being considered for MPA designation or that are already designated as MPAs. A task of the U.S. Geological Survey Coral Reef Ecosystem STudies (USGS CREST) project is addressing this issue.

  4. Linking Earth Observations and Models to Societal Information Needs: The Case of Coastal Flooding

    NASA Astrophysics Data System (ADS)

    Buzzanga, B. A.; Plag, H. P.

    2016-12-01

    Coastal flooding is expected to increase in many areas due to sea level rise (SLR). Many societal applications such as emergency planning and designing public services depend on information on how the flooding spectrum may change as a result of SLR. To identify the societal information needs, a conceptual model is needed that identifies the key stakeholders, applications, and information and observation needs. In the context of the development of the Global Earth Observation System of Systems (GEOSS), which is implemented by the Group on Earth Observations (GEO), the Socio-Economic and Environmental Information Needs Knowledge Base (SEE-IN KB) is being developed as part of the GEOSS Knowledge Base. A core function of the SEE-IN KB is to facilitate the linkage of societal information needs to observations, models, information and knowledge. To achieve this, the SEE-IN KB collects information on objects such as user types, observational requirements, societal goals, models, and datasets. Comprehensive information concerning the interconnections between instances of these objects is used to capture the connectivity and to establish a conceptual model as a network of networks. The captured connectivity can be used in searches to allow users to discover products and services for their information needs, and providers to search for users and applications benefiting from their products. It also allows users to answer "What if?" questions and supports knowledge creation. We have used the SEE-IN KB to develop a conceptual model capturing the stakeholders in coastal flooding and their information needs, and to link these elements to objects. We show how the knowledge base enables the transition of scientific data to usable information by connecting individuals such as city managers to flood maps. Within the knowledge base, these same users can request information that improves their ability to make specific planning decisions. These needs are linked to entities within research institutions that have the capabilities to meet them. Further, current research such as that investigating precipitation-induced flooding under different SLR scenarios is linked to the users who benefit from the knowledge, effectively creating a bi-directional channel between science and society that increases knowledge and improves foresight.
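
    The "network of networks" idea can be illustrated with a toy link structure. All node names below are hypothetical, and the SEE-IN KB itself is far richer than a plain adjacency list; the point is only that linked typed objects let a search walk from a user to the datasets serving that user's needs.

```python
from collections import deque

# Hypothetical mini knowledge base: typed nodes ("kind:name") and
# directed links between instances.
links = {
    "user:city_manager": ["need:flood_extent_maps"],
    "need:flood_extent_maps": ["model:slr_inundation", "dataset:lidar_dem"],
    "model:slr_inundation": ["dataset:tide_gauge_records"],
    "dataset:lidar_dem": [],
    "dataset:tide_gauge_records": [],
}

def reachable(start, kind):
    """Traverse the link network and collect nodes of a given type,
    e.g. every dataset that ultimately serves a user's information need."""
    seen, out, queue = {start}, [], deque([start])
    while queue:
        node = queue.popleft()
        if node.startswith(kind + ":"):
            out.append(node)
        for nxt in links.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return sorted(out)

print(reachable("user:city_manager", "dataset"))
# → ['dataset:lidar_dem', 'dataset:tide_gauge_records']
```

    Reversing the link direction would support the provider-side search described above: finding the users and applications that benefit from a given product.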

  5. Sensory evaluation based fuzzy AHP approach for material selection in customized garment design and development process

    NASA Astrophysics Data System (ADS)

    Hong, Y.; Curteza, A.; Zeng, X.; Bruniaux, P.; Chen, Y.

    2016-06-01

    Material selection is the most difficult step in the customized garment product design and development process. This study aims to create a hierarchical framework for material selection. The analytic hierarchy process and fuzzy set theory have been applied to reconcile the diverse requirements from the customer and the inherent interactions/interdependencies among these requirements. Sensory evaluation ensures a quick and effective selection without complex laboratory tests such as KES and FAST, using the professional knowledge of the designers. A real empirical application for physically disabled people is carried out to demonstrate the proposed method. Both the theoretical and practical results of this paper indicate that the fuzzy analytic hierarchy process can capture experts' knowledge existing in the form of incomplete, ambiguous and vague information about the mutual influence of the attributes and criteria of material selection.
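
    As a rough illustration of the underlying AHP machinery (simplified here to crisp judgements, without the fuzzy extension used in the paper), priority weights can be derived from a pairwise comparison matrix by the geometric-mean method. The criteria and numbers below are invented.

```python
import numpy as np

# Hypothetical 3-criterion pairwise comparison matrix for fabric
# selection (comfort vs durability vs cost); A[i][j] states how much
# criterion i is preferred over criterion j on Saaty's 1-9 scale.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Geometric-mean method: take each row's geometric mean, then
# normalize so the priority weights sum to 1.
gm = A.prod(axis=1) ** (1.0 / A.shape[1])
weights = gm / gm.sum()
print(weights.round(3))   # largest weight goes to the first criterion
```

    A fuzzy AHP replaces the crisp entries with fuzzy numbers (e.g. triangular ones from sensory evaluation) and defuzzifies the resulting weights, but the normalization step is the same in spirit.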

  6. Automating a human factors evaluation of graphical user interfaces for NASA applications: An update on CHIMES

    NASA Technical Reports Server (NTRS)

    Jiang, Jian-Ping; Murphy, Elizabeth D.; Bailin, Sidney C.; Truszkowski, Walter F.

    1993-01-01

    Capturing human factors knowledge about the design of graphical user interfaces (GUI's) and applying this knowledge on-line are the primary objectives of the Computer-Human Interaction Models (CHIMES) project. The current CHIMES prototype is designed to check a GUI's compliance with industry-standard guidelines, general human factors guidelines, and human factors recommendations on color usage. Following the evaluation, CHIMES presents human factors feedback and advice to the GUI designer. The paper describes the approach to modeling human factors guidelines, the system architecture, a new method developed to convert quantitative RGB primaries into qualitative color representations, and the potential for integrating CHIMES with user interface management systems (UIMS). Both the conceptual approach and its implementation are discussed. This paper updates the presentation on CHIMES at the first International Symposium on Ground Data Systems for Spacecraft Control.
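
    The paper's method for converting RGB primaries into qualitative color representations is not detailed here; one common way to sketch such a mapping is via HSV hue bands, as below. The thresholds and color names are illustrative assumptions, not those used by CHIMES.

```python
import colorsys

def qualitative_color(r, g, b):
    """Map RGB primaries (each 0-1) to a coarse qualitative name via
    HSV: low value -> black, low saturation -> gray/white, otherwise
    bucket the hue angle into named bands."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    if v < 0.2:
        return "black"
    if s < 0.15:
        return "white" if v > 0.85 else "gray"
    hue = h * 360.0
    for name, upper in [("red", 20), ("yellow", 70), ("green", 160),
                        ("cyan", 200), ("blue", 260), ("magenta", 335)]:
        if hue < upper:
            return name
    return "red"   # hue wraps around past magenta

print(qualitative_color(1.0, 0.1, 0.1))  # → red
print(qualitative_color(0.2, 0.2, 0.9))  # → blue
```

    A qualitative representation like this is what lets guideline rules ("avoid red text on blue backgrounds") be checked against arbitrary RGB values.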

  7. Ontologies and Information Systems: A Literature Survey

    DTIC Science & Technology

    2011-06-01

    Science and Technology Organisation DSTO–TN–1002 ABSTRACT An ontology captures in a computer-processable language the important concepts in a...knowledge sharability, reusability and scalability, and that support collaborative and distributed construction of ontologies, the DOGMA and DILIGENT...and assemble the received information). In the second stage, the designers determine how ontologies should be used in the process of adding

  8. Interactive Business Development, Capturing Business Knowledge and Practice: A Case Study

    ERIC Educational Resources Information Center

    McKelvie, Gregor; Dotsika, Fefie; Patrick, Keith

    2007-01-01

    Purpose: The purpose of this paper is to follow the planning and development of MapaWiki, a Knowledge Management System for Mapa, an independent research company that specialises in competitor benchmarking. Starting with the standard requirements to capture, store and share information and knowledge, a system was sought that would allow growth and…

  9. The cure: design and evaluation of a crowdsourcing game for gene selection for breast cancer survival prediction.

    PubMed

    Good, Benjamin M; Loguercio, Salvatore; Griffith, Obi L; Nanis, Max; Wu, Chunlei; Su, Andrew I

    2014-07-29

    Molecular signatures for predicting breast cancer prognosis could greatly improve care through personalization of treatment. Computational analyses of genome-wide expression datasets have identified such signatures, but these signatures leave much to be desired in terms of accuracy, reproducibility, and biological interpretability. Methods that take advantage of structured prior knowledge (e.g., protein interaction networks) show promise in helping to define better signatures, but most knowledge remains unstructured. Crowdsourcing via scientific discovery games is an emerging methodology that has the potential to tap into human intelligence at scales and in modes unheard of before. The main objective of this study was to test the hypothesis that knowledge linking expression patterns of specific genes to breast cancer outcomes could be captured from players of an open, Web-based game. We envisioned capturing knowledge both from the players' prior experience and from their ability to interpret text related to candidate genes presented to them in the context of the game. We developed and evaluated an online game called The Cure that captured information from players regarding genes for use as predictors of breast cancer survival. Information gathered from game play was aggregated using a voting approach, and used to create rankings of genes. The top genes from these rankings were evaluated using annotation enrichment analysis, comparison to prior predictor gene sets, and by using them to train and test machine learning systems for predicting 10-year survival. Between its launch in September 2012 and September 2013, The Cure attracted more than 1000 registered players, who collectively played nearly 10,000 games. Gene sets assembled through aggregation of the collected data showed significant enrichment for genes known to be related to key concepts such as cancer, disease progression, and recurrence.
In terms of the predictive accuracy of models trained using this information, these gene sets provided comparable performance to gene sets generated using other methods, including those used in commercial tests. The Cure is available on the Internet. The principal contribution of this work is to show that crowdsourcing games can be developed as a means to address problems involving domain knowledge. While most prior work on scientific discovery games and crowdsourcing in general takes as a premise that contributors have little or no expertise, here we demonstrated a crowdsourcing system that succeeded in capturing expert knowledge.
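
    The abstract describes aggregating players' gene selections by voting to produce rankings. A simple stand-in for such a scheme is a Borda count over per-game picks; the actual aggregation used in The Cure may differ, and the gene lists below are invented.

```python
from collections import defaultdict

def borda_rank(games):
    """Aggregate per-game gene selections into a global ranking.
    Each game contributes points by position (a simple Borda count):
    a gene picked first in a game of n picks earns n points, the
    second n-1, and so on."""
    scores = defaultdict(float)
    for picks in games:                 # picks: genes ordered best-first
        n = len(picks)
        for pos, gene in enumerate(picks):
            scores[gene] += n - pos
    return sorted(scores, key=scores.get, reverse=True)

games = [
    ["BRCA1", "TP53", "ESR1"],
    ["TP53", "BRCA1", "MKI67"],
    ["BRCA1", "ESR1", "TP53"],
]
print(borda_rank(games))  # → ['BRCA1', 'TP53', 'ESR1', 'MKI67']
```

    Positional voting like this rewards genes that many players consistently rank highly, which is the property the enrichment analysis above then checks against known cancer biology.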

  10. Development of a general-purpose, integrated knowledge capture and delivery system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, A.G.; Freer, E.B.

    1991-01-01

    KATIE (Knowledge-Based Assistant for Troubleshooting Industrial Equipment) was first conceived as a solution for maintenance problems. In the area of process control, maintenance technicians have become responsible for increasingly complicated equipment and an overwhelming amount of associated information. The sophisticated distributed control systems have proven to be such a drastic change for technicians that they are forced to rely on the engineer for troubleshooting guidance. Because it is difficult for a knowledgeable engineer to be readily available for troubleshooting, maintenance personnel wish to capture the information provided by the engineer. The solution provided has two stages. First, a specific complicated system was chosen as a test case, and an effort was made to gather all available system information in some form. Second, a method of capturing and delivering this collection of information was developed. Several features were desired for this knowledge capture/delivery system (KATIE). Creation of the knowledge base needed to be independent of the delivery system. The delivery path needed to be as simple as possible for the technician, while the capture, or authoring, system could provide very sophisticated features. It was decided that KATIE should be as general as possible, not internalizing specifics about the first implementation. The knowledge bases created needed to be completely separate from KATIE and to have a modular structure so that each type of information (rules, procedures, manuals, symptoms) could be encapsulated individually.

  11. The design and implementation of the immune epitope database and analysis resource

    PubMed Central

    Peters, Bjoern; Sidney, John; Bourne, Phil; Bui, Huynh-Hoa; Buus, Soeren; Doh, Grace; Fleri, Ward; Kronenberg, Mitch; Kubo, Ralph; Lund, Ole; Nemazee, David; Ponomarenko, Julia V.; Sathiamurthy, Muthu; Schoenberger, Stephen P.; Stewart, Scott; Surko, Pamela; Way, Scott; Wilson, Steve; Sette, Alessandro

    2016-01-01

    Epitopes are defined as parts of antigens interacting with receptors of the immune system. Knowledge about their intrinsic structure and how they affect the immune response is required to continue development of techniques that detect, monitor, and fight diseases. Their scientific importance is reflected in the vast amount of epitope-related information gathered, ranging from interactions between epitopes and major histocompatibility complex molecules determined by X-ray crystallography to clinical studies analyzing correlates of protection for epitope-based vaccines. Our goal is to provide a central resource capable of capturing this information, allowing users to access and connect realms of knowledge that are currently separated and difficult to access. Here, we portray a new initiative, “The Immune Epitope Database and Analysis Resource.” We describe how we plan to capture, structure, and store this information, what query interfaces we will make available to the public, and what additional predictive and analytical tools we will provide. PMID:15895191

  12. Ontology-Based Gap Analysis for Technology Selection: A Knowledge Management Framework for the Support of Equipment Purchasing Processes

    NASA Astrophysics Data System (ADS)

    Macris, Aristomenis M.; Georgakellos, Dimitrios A.

    Technology selection decisions such as equipment purchasing and supplier selection are decisions of strategic importance to companies. The nature of these decisions usually is complex, unstructured and thus, difficult to be captured in a way that will be efficiently reusable. Knowledge reusability is of paramount importance since it enables users to participate actively in process design/redesign activities stimulated by the changing technology selection environment. This paper addresses the technology selection problem through an ontology-based approach that captures and makes reusable the equipment purchasing process and assists in identifying (a) the specifications requested by the users' organization, (b) those offered by various candidate vendors' organizations and (c) in performing specifications gap analysis as a prerequisite for effective and efficient technology selection. This approach has practical appeal, operational simplicity, and the potential for both immediate and long-term strategic impact. An example from the iron and steel industry is also presented to illustrate the approach.
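
    At its simplest, the specifications gap analysis at the heart of the approach is a comparison of requested versus offered attribute values. The attributes and values below are hypothetical.

```python
# Hypothetical gap analysis: the users' requested equipment
# specifications vs. one vendor's offer, each expressed as
# attribute -> value.
requested = {"capacity_t": 50, "max_temp_c": 1600, "automation": "full"}
offered   = {"capacity_t": 50, "max_temp_c": 1450, "automation": "full"}

# Keep every attribute where the offer misses or mismatches the request.
gaps = {k: (requested[k], offered.get(k))
        for k in requested if offered.get(k) != requested[k]}
print(gaps)  # → {'max_temp_c': (1600, 1450)}
```

    An ontology adds value over this bare comparison by aligning vendor and user vocabularies first, so that semantically equivalent specifications are matched rather than flagged as gaps.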

  13. U.S. Spacesuit Knowledge Capture Accomplishments in Fiscal Years 2012 and 2013

    NASA Technical Reports Server (NTRS)

    Chullen, Cinda; Oliva, Vladenka R.

    2014-01-01

    The NASA U.S. Spacesuit Knowledge Capture (KC) program has existed since the beginning of 2008. The program was designed to provide engineers and other technical team members with historical spacesuit information to add to their understanding of the spacesuit, its evolution, its limitations, and its capabilities. Over 40 seminars have captured spacesuit history and knowledge over the six years of the program's existence. Subject matter experts have provided lectures, and some were interviewed to help bring the spacesuit to life so that lessons learned will never be lost. The program has also concentrated on reaching out to the public and industry by releasing the recorded events into the public domain through the NASA technical library and YouTube. The U.S. Spacesuit KC topics have included lessons learned from some of the most prominent spacesuit experts and spacesuit users, including current and former astronauts. The events have enriched the spacesuit legacy knowledge from the Gemini, Apollo, Skylab, Space Shuttle, and International Space Station Programs. Expert engineers and scientists have likewise shared their challenges and successes to be remembered. Judging by the thousands of people who have viewed the recordings online, the last few years have been some of the most successful of the KC program's life, with numerous digital recordings and public releases. This paper reviews the events accomplished and archived over Fiscal Years 2012 and 2013 and highlights a few of the most memorable ones. It also communicates ways to access the events, both those available internally on the NASA domain and those released to the public.

  14. Automation of the Environmental Control and Life Support System

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.; Carnes, J. Ray

    1990-01-01

    The objective of the Environmental Control and Life Support System (ECLSS) Advanced Automation Project is to recommend and develop advanced software for the initial and evolutionary Space Station Freedom (SSF) ECLS system which will minimize the crew and ground manpower needed for operations. Another objective is to capture ECLSS design and development knowledge for future missions. This report summarizes our results from Phase I, the ECLSS domain analysis phase, which we broke down into three steps: (1) analyze and document the baselined ECLS system; (2) envision as our goal an evolution to a fully automated regenerative life support system, built upon an augmented baseline; and (3) document the augmentations (hooks and scars) and advanced software systems which we see as necessary to achieve minimal manpower support for ECLSS operations. In addition, Phase I included development of an advanced software life cycle plan in preparation for Phases II and III, the development and integration phases, respectively. Automated knowledge acquisition, engineering, verification, and testing tools will be used in the development of the software. In this way, we can capture ECLSS development knowledge for future use, develop more robust and complex software, provide feedback to the KBS tool community, and ensure proper visibility of our efforts.

  15. Usability and Acceptance of the Librarian Infobutton Tailoring Environment: An Open Access Online Knowledge Capture, Management, and Configuration Tool for OpenInfobutton.

    PubMed

    Jing, Xia; Cimino, James J; Del Fiol, Guilherme

    2015-11-30

    The Librarian Infobutton Tailoring Environment (LITE) is a Web-based knowledge capture, management, and configuration tool with which users can build profiles used by OpenInfobutton, an open source infobutton manager, to provide electronic health record users with context-relevant links to online knowledge resources. We conducted a multipart evaluation study to explore users' attitudes and acceptance of LITE and to guide future development. The evaluation consisted of an initial online survey to all LITE users, followed by an observational study of a subset of users in which evaluators' sessions were recorded while they conducted assigned tasks. The observational study was followed by administration of a modified System Usability Scale (SUS) survey. Fourteen users responded to the survey and indicated good acceptance of LITE; feedback was mostly positive. Six users participated in the observational study, demonstrating an average task completion time of less than 6 minutes and an average SUS score of 72, which is considered good relative to published SUS benchmarks. LITE can be used to fulfill its designated tasks quickly and successfully. Evaluators proposed suggestions for improvements in LITE functionality and user interface.
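
    For readers unfamiliar with the SUS scores cited above, the standard System Usability Scale computation is as follows (the study used a modified SUS whose exact changes are not given here): each of ten 1-5 Likert items contributes score - 1 if odd-numbered (positively worded) or 5 - score if even-numbered (negatively worded), and the sum is multiplied by 2.5 to yield a 0-100 score.

```python
def sus_score(responses):
    """Standard System Usability Scale score from ten 1-5 Likert
    responses: odd items (positively worded) contribute score - 1,
    even items (negatively worded) contribute 5 - score, and the
    total is scaled by 2.5 onto 0-100."""
    assert len(responses) == 10
    contrib = [(r - 1) if i % 2 == 0 else (5 - r)   # i=0 is item 1 (odd)
               for i, r in enumerate(responses)]
    return sum(contrib) * 2.5

print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # → 75.0
```

    On this scale a score around 68 is typically cited as average, which is why the study's mean of 72 is characterized as good.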

  16. A four stage approach for ontology-based health information system design.

    PubMed

    Kuziemsky, Craig E; Lau, Francis

    2010-11-01

    To describe and illustrate a four stage methodological approach to capture user knowledge in a biomedical domain area, use that knowledge to design an ontology, and then implement and evaluate the ontology as a health information system (HIS). A hybrid participatory design-grounded theory (GT-PD) method was used to obtain data and code them for ontology development. Prototyping was used to implement the ontology as a computer-based tool. Usability testing evaluated the computer-based tool. An empirically derived domain ontology and a set of three problem-solving approaches were developed as a formalized model of the concepts and categories from the GT coding. The ontology and problem-solving approaches were used to design and implement a HIS that tested favorably in usability testing. The four stage approach illustrated in this paper is useful for designing and implementing an ontology as the basis for a HIS. The approach extends existing ontology development methodologies by providing an empirical basis for the theory incorporated into ontology design. Copyright © 2010 Elsevier B.V. All rights reserved.

  17. A model to capture and manage tacit knowledge using a multiagent system

    NASA Astrophysics Data System (ADS)

    Paolino, Lilyam; Paggi, Horacio; Alonso, Fernando; López, Genoveva

    2014-10-01

    This article presents a model to capture and register business tacit knowledge from different sources, using an expert multiagent system which enables the entry of incidents and captures the tacit knowledge that could resolve them. This knowledge and its sources are evaluated by trust algorithms that lead to the registration in the database of the best of each. Through its intelligent software agents, the system interacts with the administrator, with users, with the knowledge sources, and with any communities of practice which might exist in the business. Both the sources and the knowledge are evaluated continually, before registration and afterwards, in order to decide whether their original weighting should stand or be modified. If better knowledge becomes available, new entries are registered in place of the old ones. This work is part of an ongoing investigation into knowledge management methodologies for managing tacit business knowledge, with the aim of facilitating business competitiveness and innovation learning.
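
    The registration logic described (evaluate sources and knowledge, keep the best) might be sketched as a trust-weighted best-entry registry. Every name, score, and the scoring rule below are hypothetical; the paper's trust algorithms are not specified here.

```python
# Illustrative sketch: register tacit-knowledge entries per incident
# type, keeping only the best-scored entry, where an entry's score
# combines the source's trust with the knowledge's own rating.
registry = {}

def register(incident, fix, source_trust, rating):
    score = source_trust * rating          # simple trust-weighted score
    best = registry.get(incident)
    if best is None or score > best["score"]:
        registry[incident] = {"fix": fix, "score": score}

register("pump_overheat", "check coolant valve", source_trust=0.9, rating=0.8)
register("pump_overheat", "replace bearing", source_trust=0.5, rating=0.9)
print(registry["pump_overheat"]["fix"])  # → check coolant valve
```

    Re-running the evaluation later with updated trust values, as the abstract describes, would simply re-score stored entries and possibly replace them.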

  18. The Acquisition Process as a Vehicle for Enabling Knowledge Management in the Lifecycle of Complex Federal Systems

    NASA Technical Reports Server (NTRS)

    Stewart, Helen; Spence, Matt Chew; Holm, Jeanne; Koga, Dennis (Technical Monitor)

    2001-01-01

    This white paper explores how to increase the success and operation of critical, complex, national systems by effectively capturing knowledge management requirements within the federal acquisition process. Although we focus on aerospace flight systems, the principles outlined within may have a general applicability to other critical federal systems as well. Fundamental design deficiencies in federal, mission-critical systems have contributed to recent, highly visible system failures, such as the V-22 Osprey and the Delta rocket family. These failures indicate that the current mechanisms for knowledge management and risk management are inadequate to meet the challenges imposed by the rising complexity of critical systems. Failures of aerospace system operations and vehicles may have been prevented or lessened through utilization of better knowledge management and information management techniques.

  19. Participatory approach to the development of a knowledge base for problem-solving in diabetes self-management.

    PubMed

    Cole-Lewis, Heather J; Smaldone, Arlene M; Davidson, Patricia R; Kukafka, Rita; Tobin, Jonathan N; Cassells, Andrea; Mynatt, Elizabeth D; Hripcsak, George; Mamykina, Lena

    2016-01-01

    To develop an expandable knowledge base of reusable knowledge related to self-management of diabetes that can be used as a foundation for patient-centric decision support tools. The structure and components of the knowledge base were created in participatory design with academic diabetes educators using knowledge acquisition methods. The knowledge base was validated using a scenario-based approach with practicing diabetes educators and individuals with diabetes recruited from Community Health Centers (CHCs) serving economically disadvantaged communities and ethnic minorities in New York. The knowledge base includes eight glycemic control problems, over 150 behaviors known to contribute to these problems coupled with contextual explanations, and over 200 specific action-oriented self-management goals for correcting problematic behaviors, with corresponding motivational messages. The validation of the knowledge base suggested a high level of completeness and accuracy, and identified improvements in cultural appropriateness. These were addressed in new iterations of the knowledge base. The resulting knowledge base is theoretically grounded, incorporates practical and evidence-based knowledge used by diabetes educators in practice settings, and allows for personally meaningful choices by individuals with diabetes. The participatory design approach helped researchers to capture implicit knowledge of practicing diabetes educators and make it explicit and reusable. The knowledge base proposed here is an important step towards development of new generation patient-centric decision support tools for facilitating chronic disease self-management. While this knowledge base specifically targets diabetes, its overall structure and composition can be generalized to other chronic conditions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  20. Participatory approach to the development of a knowledge base for problem-solving in diabetes self-management

    PubMed Central

    Cole-Lewis, Heather J.; Smaldone, Arlene M.; Davidson, Patricia R.; Kukafka, Rita; Tobin, Jonathan N.; Cassells, Andrea; Mynatt, Elizabeth D.; Hripcsak, George; Mamykina, Lena

    2015-01-01

    Objective To develop an expandable knowledge base of reusable knowledge related to self-management of diabetes that can be used as a foundation for patient-centric decision support tools. Materials and methods The structure and components of the knowledge base were created in participatory design with academic diabetes educators using knowledge acquisition methods. The knowledge base was validated using a scenario-based approach with practicing diabetes educators and individuals with diabetes recruited from Community Health Centers (CHCs) serving economically disadvantaged communities and ethnic minorities in New York. Results The knowledge base includes eight glycemic control problems, over 150 behaviors known to contribute to these problems coupled with contextual explanations, and over 200 specific action-oriented self-management goals for correcting problematic behaviors, with corresponding motivational messages. The validation of the knowledge base suggested a high level of completeness and accuracy, and identified improvements in cultural appropriateness. These were addressed in new iterations of the knowledge base. Discussion The resulting knowledge base is theoretically grounded, incorporates practical and evidence-based knowledge used by diabetes educators in practice settings, and allows for personally meaningful choices by individuals with diabetes. The participatory design approach helped researchers to capture implicit knowledge of practicing diabetes educators and make it explicit and reusable. Conclusion The knowledge base proposed here is an important step towards development of new generation patient-centric decision support tools for facilitating chronic disease self-management. While this knowledge base specifically targets diabetes, its overall structure and composition can be generalized to other chronic conditions. PMID:26547253

  1. Evaluation of a Gait Assessment Module Using 3D Motion Capture Technology

    PubMed Central

    Baskwill, Amanda J.; Belli, Patricia; Kelleher, Leila

    2017-01-01

    Background Gait analysis is the study of human locomotion. In massage therapy, this observation is part of an assessment process that informs treatment planning. Massage therapy students must apply the theory of gait assessment to simulated patients. At Humber College, the gait assessment module traditionally consists of a textbook reading and a three-hour, in-class session in which students perform gait assessment on each other. In 2015, Humber College acquired a three-dimensional motion capture system. Purpose The purpose was to evaluate the use of 3D motion capture in a gait assessment module compared to the traditional gait assessment module. Participants Semester 2 massage therapy students who were enrolled in Massage Theory 2 (n = 38). Research Design Quasi-experimental, wait-list comparison study. Intervention The intervention group participated in an in-class session with a Qualisys motion capture system. Main Outcome Measure(s) The outcomes included knowledge and application of gait assessment theory as measured by quizzes, and students’ satisfaction as measured through a questionnaire. Results There were no statistically significant differences in baseline and post-module knowledge between both groups (pre-module: p = .46; post-module: p = .63). There was also no difference between groups on the final application question (p = .13). The intervention group enjoyed the in-class session because they could visualize the content, whereas the comparison group enjoyed the interactivity of the session. The intervention group recommended adding the assessment of gait on their classmates to their experience. Both groups noted more time was needed for the gait assessment module. Conclusions Based on the results of this study, it is recommended that the gait assessment module combine both the traditional in-class session and the 3D motion capture system. PMID:28293329

  2. From rational numbers to algebra: separable contributions of decimal magnitude and relational understanding of fractions.

    PubMed

    DeWolf, Melissa; Bassok, Miriam; Holyoak, Keith J

    2015-05-01

    To understand the development of mathematical cognition and to improve instructional practices, it is critical to identify early predictors of difficulty in learning complex mathematical topics such as algebra. Recent work has shown that performance with fractions on a number line estimation task predicts algebra performance, whereas performance with whole numbers on similar estimation tasks does not. We sought to distinguish more specific precursors to algebra by measuring multiple aspects of knowledge about rational numbers. Because fractions are the first numbers that are relational expressions to which students are exposed, we investigated how understanding the relational bipartite format (a/b) of fractions might connect to later algebra performance. We presented middle school students with a battery of tests designed to measure relational understanding of fractions, procedural knowledge of fractions, and placement of fractions, decimals, and whole numbers onto number lines as well as algebra performance. Multiple regression analyses revealed that the best predictors of algebra performance were measures of relational fraction knowledge and ability to place decimals (not fractions or whole numbers) onto number lines. These findings suggest that at least two specific components of knowledge about rational numbers--relational understanding (best captured by fractions) and grasp of unidimensional magnitude (best captured by decimals)--can be linked to early success with algebraic expressions. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Scotland's Knowledge Network: translating knowledge into action to improve quality of care.

    PubMed

    Wales, A; Graham, S; Rooney, K; Crawford, A

    2012-11-01

    The Knowledge Network (www.knowledge.scot.nhs.uk) is Scotland's online knowledge service for health and social care. It is designed to support practitioners to apply knowledge in frontline delivery of care, helping to translate knowledge into better health-care outcomes through safe, effective, person-centred care. The Knowledge Network helps to combine the worlds of evidence-based practice and quality improvement by providing access to knowledge about the effectiveness of clinical interventions ('know-what') and knowledge about how to implement this knowledge to support individual patients in working health-care environments ('know-how'). An 'evidence and guidance' search enables clinicians to quickly access quality-assured evidence and best practice, while point of care and mobile solutions provide knowledge in actionable formats to embed in clinical workflow. This research-based knowledge is complemented by social networking services and improvement tools which support the capture and exchange of knowledge from experience, facilitating practice change and systems improvement. In these cases, the Knowledge Network supports key components of the knowledge-to-action cycle--acquiring, creating, sharing and disseminating knowledge to improve performance and innovate. It provides a vehicle for implementing the recommendations of the national Knowledge into Action review, which outlines a new national approach to embedding knowledge in frontline practice and systems improvement.

  4. Accelerating learning for pro-poor health markets.

    PubMed

    Bennett, Sara; Lagomarsino, Gina; Knezovich, Jeffrey; Lucas, Henry

    2014-06-24

    Given the rapid evolution of health markets, learning is key to promoting the identification and uptake of health market policies and practices that better serve the needs of the poor. However there are significant challenges to learning about health markets. We discuss the different forms that learning takes, from the development of codified scientific knowledge, through to experience-based learning, all in relationship to health markets. Notable challenges to learning in health markets include the difficulty of acquiring data from private health care providers, designing evaluations that capture the complex dynamics present within health markets and developing communities of practice that encompass the diverse actors present within health markets, and building trust and mutual understanding across these groups. The paper proposes experimentation with country-specific market data platforms that can integrate relevant evidence from different data sources, and simultaneously exploring strategies to secure better information on private providers and health markets. Possible approaches to adapting evaluation designs so that they are better able to take account of different and changing contexts as well as producing real time findings are discussed. Finally capturing informal knowledge about health markets is key. Communities of practice that bridge different health market actors can help to share such experience-based knowledge and in so doing, may help to formalize it. More geographically-focused communities of practice are needed, and such communities may be supported by innovation brokers and/or be built around member-based organizations. Strategic investments in and support to learning about health markets can address some of the challenges experienced to-date, and accelerate learning that supports health markets that serve the poor.

  5. Semantic Analysis of Email Using Domain Ontologies and WordNet

    NASA Technical Reports Server (NTRS)

    Berrios, Daniel C.; Keller, Richard M.

    2005-01-01

    The problem of capturing and accessing knowledge in paper form has been supplanted by the problem of providing structure to vast amounts of electronic information. Systems that can automatically construct semantic links for natural-language documents such as email messages will be a crucial element of semantic email tools. We have designed an information extraction process that can leverage the knowledge already contained in an existing semantic web, recognizing references in email to existing nodes in a network of ontology instances by using linguistic knowledge and knowledge of the structure of the semantic web. We developed a heuristic score that uses several forms of evidence to detect references in email to existing nodes in the SemanticOrganizer repository's network. While these scores cannot directly support automated probabilistic inference, they can be used to rank nodes by relevance and link those deemed most relevant to email messages.
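
    As a rough illustration of the kind of evidence-weighting heuristic the abstract describes (the feature names, node names, and weights below are invented, not taken from the Berrios and Keller system), candidate ontology nodes can be ranked by a weighted sum of match evidence:

```python
# Hypothetical evidence-weighting sketch: score each candidate ontology node
# against an email by a weighted sum of match evidence, then rank by score.
# Feature names and weights are illustrative assumptions.

WEIGHTS = {"exact_name_match": 3.0, "synonym_match": 1.5, "linked_node_mentioned": 1.0}

def score(evidence):
    """Weighted sum over counts of each kind of evidence found in the email."""
    return sum(WEIGHTS[kind] * count for kind, count in evidence.items())

# Evidence tallies for two hypothetical repository nodes.
candidates = {
    "node:ProjectMeeting": {"exact_name_match": 1, "linked_node_mentioned": 2},
    "node:WindTunnelTest": {"synonym_match": 1},
}

# Rank nodes from most to least relevant.
ranked = sorted(candidates, key=lambda n: score(candidates[n]), reverse=True)
print(ranked)
```

    As the abstract notes, such scores are not probabilities; they only order candidates, after which the top-ranked nodes can be linked to the message.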

  6. The Cure: Design and Evaluation of a Crowdsourcing Game for Gene Selection for Breast Cancer Survival Prediction

    PubMed Central

    Loguercio, Salvatore; Griffith, Obi L; Nanis, Max; Wu, Chunlei; Su, Andrew I

    2014-01-01

    Background Molecular signatures for predicting breast cancer prognosis could greatly improve care through personalization of treatment. Computational analyses of genome-wide expression datasets have identified such signatures, but these signatures leave much to be desired in terms of accuracy, reproducibility, and biological interpretability. Methods that take advantage of structured prior knowledge (eg, protein interaction networks) show promise in helping to define better signatures, but most knowledge remains unstructured. Crowdsourcing via scientific discovery games is an emerging methodology that has the potential to tap into human intelligence at scales and in modes unheard of before. Objective The main objective of this study was to test the hypothesis that knowledge linking expression patterns of specific genes to breast cancer outcomes could be captured from players of an open, Web-based game. We envisioned capturing knowledge both from the player’s prior experience and from their ability to interpret text related to candidate genes presented to them in the context of the game. Methods We developed and evaluated an online game called The Cure that captured information from players regarding genes for use as predictors of breast cancer survival. Information gathered from game play was aggregated using a voting approach, and used to create rankings of genes. The top genes from these rankings were evaluated using annotation enrichment analysis, comparison to prior predictor gene sets, and by using them to train and test machine learning systems for predicting 10 year survival. Results Between its launch in September 2012 and September 2013, The Cure attracted more than 1000 registered players, who collectively played nearly 10,000 games. Gene sets assembled through aggregation of the collected data showed significant enrichment for genes known to be related to key concepts such as cancer, disease progression, and recurrence. 
In terms of the predictive accuracy of models trained using this information, these gene sets provided comparable performance to gene sets generated using other methods, including those used in commercial tests. The Cure is available on the Internet. Conclusions The principal contribution of this work is to show that crowdsourcing games can be developed as a means to address problems involving domain knowledge. While most prior work on scientific discovery games and crowdsourcing in general takes as a premise that contributors have little or no expertise, here we demonstrated a crowdsourcing system that succeeded in capturing expert knowledge. PMID:25654473
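
    The voting-based aggregation mentioned above can be sketched in a few lines. The gene selections and the simple count-and-sort rule are illustrative assumptions, since the abstract specifies only "a voting approach":

```python
from collections import Counter

# Each completed game contributes a set of genes the player selected as
# candidate survival predictors (gene names here are illustrative).
plays = [
    {"BRCA1", "TP53", "ESR1"},
    {"TP53", "ERBB2"},
    {"BRCA1", "TP53"},
]

def rank_genes(plays):
    """Aggregate selections by vote count; break ties alphabetically."""
    votes = Counter(g for selection in plays for g in selection)
    return sorted(votes, key=lambda g: (-votes[g], g))

print(rank_genes(plays))  # ['TP53', 'BRCA1', 'ERBB2', 'ESR1']
```

    The top of such a ranking is then what the study evaluated via enrichment analysis and as features for survival-prediction models.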

  7. Making Sense of Rocket Science - Building NASA's Knowledge Management Program

    NASA Technical Reports Server (NTRS)

    Holm, Jeanne

    2002-01-01

    The National Aeronautics and Space Administration (NASA) has launched a range of KM activities-from deploying intelligent "know-bots" across millions of electronic sources to ensuring tacit knowledge is transferred across generations. The strategy and implementation focuses on managing NASA's wealth of explicit knowledge, enabling remote collaboration for international teams, and enhancing capture of the key knowledge of the workforce. An in-depth view of the work being done at the Jet Propulsion Laboratory (JPL) shows the integration of academic studies and practical applications to architect, develop, and deploy KM systems in the areas of document management, electronic archives, information lifecycles, authoring environments, enterprise information portals, search engines, experts directories, collaborative tools, and in-process decision capture. These systems, together, comprise JPL's architecture to capture, organize, store, and distribute key learnings for the U.S. exploration of space.

  8. DataHub: Knowledge-based data management for data discovery

    NASA Astrophysics Data System (ADS)

    Handley, Thomas H.; Li, Y. Philip

    1993-08-01

    Currently available database technology is largely designed for business data-processing applications and seems inadequate for scientific applications. The research described in this paper, the DataHub, will address the issues associated with this shortfall in technology utilization and development. The DataHub development is addressing the key issues in scientific data management of scientific database models and resource sharing in a geographically distributed, multi-disciplinary, science research environment. Thus, the DataHub will be a server between the data suppliers and data consumers to facilitate data exchanges, to assist science data analysis, and to provide a systematic approach to science data management. More specifically, the DataHub's objectives are to provide support for (1) exploratory data analysis (i.e., data-driven analysis); (2) data transformations; (3) data semantics capture and usage; (4) analysis-related knowledge capture and usage; and (5) data discovery, ingestion, and extraction. Applying technologies ranging from deductive databases, semantic data models, data discovery, knowledge representation and inferencing, and exploratory data analysis techniques to modern man-machine interfaces, the DataHub will provide a prototype, integrated environment to support research scientists' needs in multiple disciplines (i.e., oceanography, geology, and atmospheric science) while addressing more general science data management issues. Additionally, the DataHub will provide data management services to exploratory data analysis applications such as LinkWinds and NCSA's XIMAGE.

  9. Process Materialization Using Templates and Rules to Design Flexible Process Models

    NASA Astrophysics Data System (ADS)

    Kumar, Akhil; Yao, Wen

    The main idea in this paper is to show how flexible processes can be designed by combining generic process templates and business rules. We instantiate a process by applying rules to specific case data and running a materialization algorithm. The customized process instance is then executed in an existing workflow engine. We present an architecture and give an algorithm for process materialization. The rules are written in a logic-based language like Prolog. Our focus is on capturing deeper process knowledge and achieving a holistic approach to robust process design that encompasses control flow, resources, and data, and makes it easier to accommodate changes to business policy.
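
    A minimal sketch of the template-plus-rules idea, with invented task names and rules (the paper's own rules are written in a Prolog-like language, not Python):

```python
# Illustrative materialization sketch: start from a generic process template
# and apply case-specific business rules to produce a customized process
# instance. Task names and rules are hypothetical.

template = ["receive_order", "credit_check", "ship", "invoice"]

rules = [
    # (condition on case data, task to drop when the condition holds)
    (lambda case: case["amount"] < 100, "credit_check"),  # small orders skip credit check
    (lambda case: case["digital_goods"], "ship"),         # digital goods need no shipping
]

def materialize(template, rules, case):
    """Return the customized task list for this case's data."""
    dropped = {task for cond, task in rules if cond(case)}
    return [t for t in template if t not in dropped]

print(materialize(template, rules, {"amount": 50, "digital_goods": True}))
# ['receive_order', 'invoice']
```

    The resulting instance would then be handed to a workflow engine for execution, as the abstract describes.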

  10. Knowledge represented using RDF semantic network in the concept of semantic web

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lukasova, A., E-mail: alena.lukasova@osu.cz; Vajgl, M., E-mail: marek.vajgl@osu.cz; Zacek, M., E-mail: martin.zacek@osu.cz

    The RDF(S) model has been declared the basic model for capturing knowledge on the semantic web. It provides a common and flexible way to decompose composed knowledge into elementary statements, which can be represented by RDF triples or by RDF graph vectors. From the logical point of view, elements of knowledge can be expressed using at most binary predicates, which can be converted to RDF triples or graph vectors. However, the model is not able to capture implicit knowledge representable by logical formulas. This contribution shows how existing approaches (semantic networks and clausal form logic) can be combined with RDF to obtain an RDF-compatible system with the ability to represent implicit knowledge and perform inference over a knowledge base.
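
    The decomposition into elementary triples, and the gap the authors address (implicit knowledge derivable by rules), can be illustrated with a toy forward-chaining closure. The vocabulary and rules below are standard RDFS idioms, not the authors' combined system:

```python
# Knowledge decomposed into RDF-style (subject, predicate, object) triples,
# plus two forward-chaining rules that derive statements the base triples do
# not state explicitly: transitivity of rdfs:subClassOf, and propagation of
# rdf:type up the class hierarchy. Example individuals are illustrative.

triples = {
    ("Eagle", "rdfs:subClassOf", "Bird"),
    ("Bird", "rdfs:subClassOf", "Animal"),
    ("harry", "rdf:type", "Eagle"),
}

def infer(kb):
    """Apply the two RDFS-style rules until no new triples are derived."""
    kb = set(kb)
    changed = True
    while changed:
        changed = False
        derived = set()
        for (a, p1, b) in kb:
            for (c, p2, d) in kb:
                if p1 == p2 == "rdfs:subClassOf" and b == c:
                    derived.add((a, "rdfs:subClassOf", d))
                if p1 == "rdf:type" and p2 == "rdfs:subClassOf" and b == c:
                    derived.add((a, "rdf:type", d))
        new = derived - kb
        if new:
            kb |= new
            changed = True
    return kb

closed = infer(triples)
# The implicit statement (harry, rdf:type, Animal) is now explicit.
print(("harry", "rdf:type", "Animal") in closed)
```

    Plain RDF stores only the base triples; it is rule machinery of roughly this kind, expressed in clausal form logic, that the contribution layers on top.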

  11. Ares Knowledge Capture: Summary and Key Themes Presentation

    NASA Technical Reports Server (NTRS)

    Coates, Ralph H.

    2011-01-01

    This report has been developed by the National Aeronautics and Space Administration (NASA) Human Exploration and Operations Mission Directorate (HEOMD) Risk Management team in close coordination with the MSFC Chief Engineer's Office. This document provides a point-in-time, cumulative summary of actionable key lessons learned derived from the Ares design project. Lessons learned invariably address challenges and risks and the way in which these areas have been addressed. Accordingly, the risk management thread is woven throughout the document.

  12. Capturing information needs of care providers to support knowledge sharing and distributed decision making.

    PubMed

    Rogers, M; Zach, L; An, Y; Dalrymple, P

    2012-01-01

    This paper reports on work carried out to elicit information needs at a trans-disciplinary, nurse-managed health care clinic that serves a medically disadvantaged urban population. The trans-disciplinary model provides a "one-stop shop" for patients who can receive a wide range of services beyond traditional primary care. However, this model of health care presents knowledge sharing challenges because little is known about how data collected from the non-traditional services can be integrated into the traditional electronic medical record (EMR) and shared with other care providers. There is also little known about how health information technology (HIT) can be used to support the workflow in such a practice. The objective of this case study was to identify the information needs of care providers in order to inform the design of HIT to support knowledge sharing and distributed decision making. A participatory design approach is presented as a successful technique to specify requirements for HIT applications that can support a trans-disciplinary model of care. Using this design approach, the researchers identified the information needs of care providers working at the clinic and suggested HIT improvements to integrate non-traditional information into the EMR. These modifications allow knowledge sharing among care providers and support better health decisions. We have identified information needs of care providers as they are relevant to the design of health information systems. As new technology is designed and integrated into various workflows it is clear that understanding information needs is crucial to acceptance of that technology.

  13. Cognitive Task Analysis for Instruction in Single-Injection Ultrasound Guided-Regional Anesthesia

    ERIC Educational Resources Information Center

    Gucev, Gligor V.

    2012-01-01

    Cognitive task analysis (CTA) is methodology for eliciting knowledge from subject matter experts. CTA has been used to capture the cognitive processes, decision-making, and judgments that underlie expert behaviors. A review of the literature revealed that CTA has not yet been used to capture the knowledge required to perform ultrasound guided…

  14. MSL Lessons Learned and Knowledge Capture

    NASA Technical Reports Server (NTRS)

    Buxbaum, Karen L.

    2012-01-01

    The Mars Program has recently been informed of the Planetary Protection Subcommittee (PPS) recommendation, which was endorsed by the NAC, concerning Mars Science Lab (MSL) lessons learned and knowledge capture. The Mars Program has not had an opportunity to consider any decisions specific to the PPS recommendation. Some of the activities recommended by the PPS would involve members of the MSL flight team who are focused on cruise, entry descent & landing, and early surface operations; those activities would have to wait. Members of the MSL planetary protection team at JPL are still available to support MSL lessons learned and knowledge capture; some of the specifically recommended activities have already begun. The Mars Program shares the PPS/NAC concerns about loss of potential information & expertise in planetary protection practice.

  15. Formalizing Knowledge in Multi-Scale Agent-Based Simulations

    PubMed Central

    Somogyi, Endre; Sluka, James P.; Glazier, James A.

    2017-01-01

    Multi-scale, agent-based simulations of cellular and tissue biology are increasingly common. These simulations combine and integrate a range of components from different domains. Simulations continuously create, destroy and reorganize constituent elements causing their interactions to dynamically change. For example, the multi-cellular tissue development process coordinates molecular, cellular and tissue scale objects with biochemical, biomechanical, spatial and behavioral processes to form a dynamic network. Different domain specific languages can describe these components in isolation, but cannot describe their interactions. No current programming language is designed to represent in human readable and reusable form the domain specific knowledge contained in these components and interactions. We present a new hybrid programming language paradigm that naturally expresses the complex multi-scale objects and dynamic interactions in a unified way and allows domain knowledge to be captured, searched, formalized, extracted and reused. PMID:29338063

  16. Knowledge management for chronic patient control and monitoring

    NASA Astrophysics Data System (ADS)

    Pedreira, Nieves; Aguiar-Pulido, Vanessa; Dorado, Julián; Pazos, Alejandro; Pereira, Javier

    2014-10-01

    Knowledge Management (KM) can be seen as the process of capturing, developing, sharing, and effectively using organizational knowledge. In this context, the work presented here proposes a KM system to be used in the scope of chronic patient control and monitoring for distributed research projects. It was designed to enable communication between patients and doctors, as well as to be used by the researchers involved in the project for its management. The proposed model integrates all the information concerning every patient and every project management task in the Institutional Memory of a KM system and uses an ontology to maintain the information and its categorization independently. Furthermore, following the philosophy of intelligent agents, the system interacts with the user to present information according to the user's preferences and access rights. Finally, three different scenarios of application are described.

  17. Formalizing Knowledge in Multi-Scale Agent-Based Simulations.

    PubMed

    Somogyi, Endre; Sluka, James P; Glazier, James A

    2016-10-01

    Multi-scale, agent-based simulations of cellular and tissue biology are increasingly common. These simulations combine and integrate a range of components from different domains. Simulations continuously create, destroy and reorganize constituent elements causing their interactions to dynamically change. For example, the multi-cellular tissue development process coordinates molecular, cellular and tissue scale objects with biochemical, biomechanical, spatial and behavioral processes to form a dynamic network. Different domain specific languages can describe these components in isolation, but cannot describe their interactions. No current programming language is designed to represent in human readable and reusable form the domain specific knowledge contained in these components and interactions. We present a new hybrid programming language paradigm that naturally expresses the complex multi-scale objects and dynamic interactions in a unified way and allows domain knowledge to be captured, searched, formalized, extracted and reused.

  18. NATO Human View Architecture and Human Networks

    NASA Technical Reports Server (NTRS)

    Handley, Holly A. H.; Houston, Nancy P.

    2010-01-01

    The NATO Human View is a system architectural viewpoint that focuses on the human as part of a system. Its purpose is to capture the human requirements and to inform on how the human impacts the system design. The viewpoint contains seven static models that include different aspects of the human element, such as roles, tasks, constraints, training and metrics. It also includes a Human Dynamics component to perform simulations of the human system under design. One of the static models, termed Human Networks, focuses on the human-to-human communication patterns that occur as a result of ad hoc or deliberate team formation, especially teams distributed across space and time. Parameters of human teams that affect system performance can be captured in this model. Human-centered aspects of networks, such as differences in operational tempo (sense of urgency), priorities (common goal), and team history (knowledge of the other team members), can be incorporated. The information captured in the Human Network static model can then be included in the Human Dynamics component so that the impact of distributed teams is represented in the simulation. As the NATO militaries transform to a more networked force, the Human View architecture is an important tool that can be used to make recommendations on the proper mix of technological innovations and human interactions.

  19. Factors shaping the evolution of electronic documentation systems

    NASA Technical Reports Server (NTRS)

    Dede, Christopher J.; Sullivan, Tim R.; Scace, Jacque R.

    1990-01-01

    The main goal is to prepare the space station technical and managerial structure for likely changes in the creation, capture, transfer, and utilization of knowledge. By anticipating advances, the design of Space Station Project (SSP) information systems can be tailored to facilitate a progression of increasingly sophisticated strategies as the space station evolves. Future generations of advanced information systems will use increases in power to deliver environmentally meaningful, contextually targeted, interconnected data (knowledge). The concept of a Knowledge Base Management System is emerging when the problem is focused on how information systems can perform such a conversion of raw data. Such a system would include traditional management functions for large space databases. Added artificial intelligence features might encompass co-existing knowledge representation schemes; effective control structures for deductive, plausible, and inductive reasoning; means for knowledge acquisition, refinement, and validation; explanation facilities; and dynamic human intervention. The major areas covered include: alternative knowledge representation approaches; advanced user interface capabilities; computer-supported cooperative work; the evolution of information system hardware; standardization, compatibility, and connectivity; and organizational impacts of information intensive environments.

  20. System diagnostic builder: a rule-generation tool for expert systems that do intelligent data evaluation

    NASA Astrophysics Data System (ADS)

    Nieten, Joseph L.; Burke, Roger

    1993-03-01

    The system diagnostic builder (SDB) is an automated knowledge acquisition tool using state-of-the-art artificial intelligence (AI) technologies. The SDB uses an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert (SME). Thus, data is captured from the subject system, classified by an expert, and used to drive the rule generation process. These rule-bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The rule-bases can be used in any knowledge based system which monitors or controls a physical system or simulation. The SDB has demonstrated the utility of using inductive machine learning technology to generate reliable knowledge bases. In fact, we have discovered that the knowledge captured by the SDB can be used in any number of applications. For example, the knowledge bases captured from the SMS can be used as black box simulations by intelligent computer aided training devices. We can also use the SDB to construct knowledge bases for the process control industry, such as chemical production, or oil and gas production. These knowledge bases can be used in automated advisory systems to ensure safety, productivity, and consistency.
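
    In the spirit of the SDB's inductive approach (its actual algorithm is not described here), a toy one-rule learner can induce a classification rule from expert-labelled data. The sensor attributes and labels below are invented:

```python
# Illustrative one-rule induction: for each attribute, map each of its values
# to the majority expert label, then keep the attribute whose rule makes the
# fewest training errors. Attributes and labels are hypothetical.

from collections import Counter, defaultdict

examples = [  # (observed features, expert's classification)
    ({"pressure": "high", "temp": "hot"},  "fault"),
    ({"pressure": "high", "temp": "cold"}, "fault"),
    ({"pressure": "low",  "temp": "hot"},  "nominal"),
    ({"pressure": "low",  "temp": "cold"}, "nominal"),
]

def one_rule(examples):
    """Return (attribute, value->class mapping, training errors) for the best attribute."""
    best = None
    for attr in examples[0][0]:
        by_value = defaultdict(Counter)
        for features, label in examples:
            by_value[features[attr]][label] += 1
        rule = {v: c.most_common(1)[0][0] for v, c in by_value.items()}
        errors = sum(1 for f, l in examples if rule[f[attr]] != l)
        if best is None or errors < best[2]:
            best = (attr, rule, errors)
    return best

attr, rule, errors = one_rule(examples)
print(attr, rule, errors)
```

    Real inductive learners build multi-attribute rule sets, but the workflow matches the abstract: expert-classified data in, behavior-describing rules out.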

  1. A Lyapunov based approach to energy maximization in renewable energy technologies

    NASA Astrophysics Data System (ADS)

    Iyasere, Erhun

    This dissertation describes the design and implementation of Lyapunov-based control strategies for the maximization of the power captured by renewable energy harnessing technologies such as (i) a variable speed, variable pitch wind turbine, (ii) a variable speed wind turbine coupled to a doubly fed induction generator, and (iii) a solar power generating system charging a constant voltage battery. First, a torque control strategy is presented to maximize wind energy captured in variable speed, variable pitch wind turbines at low to medium wind speeds. The proposed strategy applies control torque to the wind turbine pitch and rotor subsystems to simultaneously control the blade pitch and tip speed ratio, via the rotor angular speed, to an optimum point at which the capture efficiency is maximum. The control method allows for aerodynamic rotor power maximization without exact knowledge of the wind turbine model. A series of numerical results show that the wind turbine can be controlled to achieve maximum energy capture. Next, a control strategy is proposed to maximize the wind energy captured in a variable speed wind turbine, with an internal induction generator, at low to medium wind speeds. The proposed strategy controls the tip speed ratio, via the rotor angular speed, to an optimum point at which the efficiency constant (or power coefficient) is maximal for a particular blade pitch angle and wind speed by using the generator rotor voltage as a control input. This control method allows for aerodynamic rotor power maximization without exact wind turbine model knowledge. Representative numerical results demonstrate that the wind turbine can be controlled to achieve near maximum energy capture. Finally, a power system consisting of a photovoltaic (PV) array panel and a dc-to-dc switching converter charging a battery is considered, wherein the environmental conditions are time-varying.
A backstepping PWM controller is developed to maximize the power of the solar generating system. The controller tracks a desired array voltage, designed online using an incremental conductance extremum-seeking algorithm, by varying the duty cycle of the switching converter. The stability of the control algorithm is demonstrated by means of Lyapunov analysis. Representative numerical results demonstrate that the grid power system can be controlled to track the maximum power point of the photovoltaic array panel in varying atmospheric conditions. Additionally, the performance of the proposed strategy is compared to the typical maximum power point tracking (MPPT) method of perturb and observe (P&O), where the converter dynamics are ignored, and is shown to yield better results.
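
    For comparison, the perturb-and-observe (P&O) baseline mentioned in the abstract can be sketched against a made-up power-voltage curve. The curve, step size, and iteration count below are illustrative assumptions, not the dissertation's incremental-conductance design:

```python
# Toy perturb-and-observe (P&O) MPPT: perturb the operating voltage by a
# fixed step; if measured power fell, reverse the perturbation direction.
# The P-V curve, step size, and iteration count are illustrative.

def pv_power(v):
    """Hypothetical concave P-V characteristic with its maximum at v = 17.0 V."""
    return max(0.0, 100.0 - (v - 17.0) ** 2)

def perturb_and_observe(v=12.0, step=0.5, iterations=60):
    p_prev = pv_power(v)
    direction = 1.0
    for _ in range(iterations):
        v += direction * step
        p = pv_power(v)
        if p < p_prev:          # power fell: reverse the perturbation
            direction = -direction
        p_prev = p
    return v

v_mpp = perturb_and_observe()
print(round(v_mpp, 1))  # settles into oscillation near the 17.0 V maximum
```

    The fixed-step oscillation around the maximum is exactly the behavior that motivates the dissertation's extremum-seeking alternative, which accounts for the converter dynamics that P&O ignores.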

  2. Patients' and physicians' understanding of health and biomedical concepts: relationship to the design of EMR systems.

    PubMed

    Patel, Vimla L; Arocha, José F; Kushniruk, André W

    2002-02-01

    The aim of this paper is to examine knowledge organization and reasoning strategies involved in physician-patient communication and to consider how these are affected by the use of computer tools, in particular, electronic medical record (EMR) systems. In the first part of the paper, we summarize results from a study in which patients were interviewed before their interactions with physicians and where physician-patient interactions were recorded and analyzed to evaluate patients' and physicians' understanding of the patient problem. We give a detailed presentation of one such interaction, with characterizations of physician and patient models. In a second set of studies, the contents of both paper and EMRs were compared and in addition, physician-patient interactions (involving the use of EMR technology) were video recorded and analyzed to assess physicians' information gathering and knowledge organization for medical decision-making. Physicians explained the patient problems in terms of causal pathophysiological knowledge underlying the disease (disease model), whereas patients explained them in terms of narrative structures of illness (illness model). The data-driven nature of the traditional physician-patient interaction allows physicians to capture the temporal flow of events and to document key aspects of the patients' narratives. Use of electronic medical records was found to influence the way patient data were gathered, resulting in information loss and disruption of the temporal sequence of events in assessing the patient problem. The physician-patient interview allows physicians to capture crucial aspects of the patient's illness model, which are necessary for understanding the problem from the patient's perspective. Use of computer-based patient record technology may lead to a loss of this relevant information.
    As a consequence, designers of such systems should take into account information relevant to patients' comprehension of medical problems, which will influence their compliance.

  3. Accelerating learning for pro-poor health markets

    PubMed Central

    2014-01-01

    Background Given the rapid evolution of health markets, learning is key to promoting the identification and uptake of health market policies and practices that better serve the needs of the poor. However there are significant challenges to learning about health markets. We discuss the different forms that learning takes, from the development of codified scientific knowledge, through to experience-based learning, all in relationship to health markets. Discussion Notable challenges to learning in health markets include the difficulty of acquiring data from private health care providers, designing evaluations that capture the complex dynamics present within health markets and developing communities of practice that encompass the diverse actors present within health markets, and building trust and mutual understanding across these groups. The paper proposes experimentation with country-specific market data platforms that can integrate relevant evidence from different data sources, and simultaneously exploring strategies to secure better information on private providers and health markets. Possible approaches to adapting evaluation designs so that they are better able to take account of different and changing contexts as well as producing real time findings are discussed. Finally capturing informal knowledge about health markets is key. Communities of practice that bridge different health market actors can help to share such experience-based knowledge and in so doing, may help to formalize it. More geographically-focused communities of practice are needed, and such communities may be supported by innovation brokers and/or be built around member-based organizations. Summary Strategic investments in and support to learning about health markets can address some of the challenges experienced to-date, and accelerate learning that supports health markets that serve the poor. PMID:24961671

  4. Overview of the Design, Development, and Application of Nickel-hydrogen Batteries

    NASA Technical Reports Server (NTRS)

    Thaller, Lawrence H.; Zimmerman, Albert H.

    2003-01-01

    This document provides an overview of the design, development, and application of nickel-hydrogen (Ni-H2) battery technology for aerospace applications. It complements and updates the information presented in NASA RP-1314, NASA Handbook for Nickel- Hydrogen Batteries, published in 1993. Since that time, nickel-hydrogen batteries have become widely accepted for aerospace energy storage requirements and much more has been learned. The intent of this document is to capture some of that additional knowledge. This document addresses various aspects of nickel-hydrogen technology including the electrochemical reactions, cell component design, and selection considerations; overall cell and battery design considerations; charge control considerations; and manufacturing issues that have surfaced over the years that nickel-hydrogen battery technology has been the major energy storage technology for geosynchronous and low-Earth-orbiting satellites.

  5. DAWN (Design Assistant Workstation) for advanced physical-chemical life support systems

    NASA Technical Reports Server (NTRS)

    Rudokas, Mary R.; Cantwell, Elizabeth R.; Robinson, Peter I.; Shenk, Timothy W.

    1989-01-01

    This paper reports the results of a project supported by the National Aeronautics and Space Administration, Office of Aeronautics and Space Technology (NASA-OAST) under the Advanced Life Support Development Program. It is an initial attempt to integrate artificial intelligence techniques (via expert systems) with conventional quantitative modeling tools for advanced physical-chemical life support systems. The addition of artificial intelligence techniques will assist the designer in the definition and simulation of loosely/well-defined life support processes/problems as well as assist in the capture of design knowledge, both quantitative and qualitative. Expert system and conventional modeling tools are integrated to provide a design workstation that assists the engineer/scientist in creating, evaluating, documenting and optimizing physical-chemical life support systems for short-term and extended duration missions.

  6. Organizational culture and knowledge management in the electric power generation industry

    NASA Astrophysics Data System (ADS)

    Mayfield, Robert D.

    Scarcity of knowledge and expertise is a challenge in the electric power generation industry. Today's most pervasive knowledge issues result from employee turnover and the constant movement of employees from project to project inside organizations. To address scarcity of knowledge and expertise, organizations must enable employees to capture, transfer, and use mission-critical explicit and tacit knowledge. The purpose of this qualitative grounded theory research was to examine the relationship between and among organizations within the electric power generation industry developing knowledge management processes designed to retain, share, and use the industry, institutional, and technical knowledge upon which the organizations depend. The research findings show that knowledge management is a business problem within the domain of information systems and management. The risks associated with losing mission-critical knowledge can be measured using metrics on employee retention, recruitment, productivity, training, and benchmarking. Certain enablers must be in place in order to engage people, encourage cooperation, create a knowledge-sharing culture, and, ultimately, change behavior. The research revealed the following change enablers that support knowledge management strategies: (a) training - blended learning, (b) communities of practice, (c) cross-functional teams, (d) rewards and recognition programs, (e) active senior management support, (f) communication and awareness, (g) succession planning, and (h) team organizational culture.

  7. A method of computer aided design with self-generative models in NX Siemens environment

    NASA Astrophysics Data System (ADS)

    Grabowik, C.; Kalinowski, K.; Kempa, W.; Paprocka, I.

    2015-11-01

    In current CAD/CAE/CAM systems it is possible to create 3D virtual design models that capture a certain amount of knowledge. Such models are especially useful in automating routine design tasks. They are known as self-generative or auto-generative models, and they can behave in an intelligent way. The main difference between auto-generative and fully parametric models is the auto-generative models' ability to self-organize. Here, self-organizing means that, apart from being able to make automatic changes to a model's quantitative features, these models possess knowledge of how those changes should be made; moreover, they are able to change qualitative features according to specific knowledge. Despite their undoubted strengths, self-generative models are not often used in the constructional design process, mainly because of their usually great complexity, which makes their creation time- and labour-consuming and demands considerable investment. The creation of a self-generative model consists of three stages: knowledge and information acquisition, model type selection, and model implementation. This paper presents methods of computer-aided design with self-generative models in NX Siemens CAD/CAE/CAM software. Five methods of self-generative model preparation in NX are described, based on: parametric relations models, part families, GRIP language applications, knowledge fusion, and the OPEN API mechanism. Examples of each type of self-generative model are presented. These methods make the constructional design process much faster, and preparing such models is recommended whenever design variants must be created. The conducted research on the usefulness of the elaborated models showed that they are highly recommended for automating routine tasks.
    It remains difficult, however, to single out one preparation method as the most preferable; the choice always depends on the complexity of the problem. The easiest approach to model preparation is the parametric relations model, while the hardest is the OPEN API mechanism. From a knowledge-processing point of view, the best choice is knowledge fusion.
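
The distinction the abstract draws between parametric and auto-generative models can be sketched in a few lines. The following is a hypothetical illustration (the part, dimensions, and design rules are invented, not taken from the paper): a plain parametric model only scales quantities, while an auto-generative model also embeds rules that switch qualitative features when the driving parameters change.

```python
# Hypothetical sketch of an "auto-generative" design model: beyond plain
# parametrics, the model embeds design rules that change a qualitative
# feature (here, the flange bolt pattern) as quantitative inputs change.
# All dimensions and rules below are invented for illustration.
from dataclasses import dataclass

@dataclass
class FlangeModel:
    shaft_diameter: float  # driving parameter, mm

    @property
    def flange_diameter(self) -> float:
        # quantitative rule: the flange scales with the shaft
        return 2.5 * self.shaft_diameter

    @property
    def bolt_count(self) -> int:
        # qualitative rule: larger flanges switch to a denser bolt pattern
        return 4 if self.shaft_diameter < 50 else 8

small = FlangeModel(shaft_diameter=30)
large = FlangeModel(shaft_diameter=80)
print(small.flange_diameter, small.bolt_count)  # 75.0 4
print(large.flange_diameter, large.bolt_count)  # 200.0 8
```

Changing the single driving parameter regenerates both the derived dimension and the qualitative feature, which is the self-organizing behavior the paper attributes to such models.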

  8. D3: A Collaborative Infrastructure for Aerospace Design

    NASA Technical Reports Server (NTRS)

    Walton, Joan; Filman, Robert E.; Knight, Chris; Korsmeyer, David J.; Lee, Diana D.; Clancy, Daniel (Technical Monitor)

    2001-01-01

    DARWIN is a NASA-developed, Internet-based system for enabling aerospace researchers to securely and remotely access and collaborate on the analysis of aerospace vehicle design data, primarily the results of wind-tunnel testing and numeric (e.g., computational fluid dynamics) model executions. DARWIN captures, stores, and indexes data; manages derived knowledge (such as visualizations across multiple data sets); and provides an environment for designers to collaborate in the analysis of test results. DARWIN is an interesting application because it supports high volumes of data, integrates multiple modalities of data display (e.g., images and data visualizations), and provides non-trivial access control mechanisms. DARWIN enables collaboration by allowing users to share not only visualizations of data but also commentary about, and views of, the data.

  9. Monitoring Agents for Assisting NASA Engineers with Shuttle Ground Processing

    NASA Technical Reports Server (NTRS)

    Semmel, Glenn S.; Davis, Steven R.; Leucht, Kurt W.; Rowe, Danil A.; Smith, Kevin E.; Boeloeni, Ladislau

    2005-01-01

    The Spaceport Processing Systems Branch at NASA Kennedy Space Center has designed, developed, and deployed a rule-based agent to monitor the Space Shuttle's ground processing telemetry stream. The NASA Engineering Shuttle Telemetry Agent increases situational awareness for system and hardware engineers during ground processing of the Shuttle's subsystems. The agent provides autonomous monitoring of the telemetry stream and automatically alerts system engineers when user-defined conditions are satisfied. Efficiency and safety are improved through increased automation. Sandia National Labs' Java Expert System Shell is employed as the agent's rule engine. The shell's predicate logic lends itself well to capturing the heuristics and specifying the engineering rules within this domain. The declarative paradigm of the rule-based agent yields a highly modular and scalable design spanning multiple subsystems of the Shuttle. Several hundred monitoring rules have been written thus far, with corresponding notifications sent to Shuttle engineers. This chapter discusses the rule-based telemetry agent used for Space Shuttle ground processing. We present the problem domain along with design and development considerations such as information modeling, knowledge capture, and the deployment of the product. We also present ongoing work with other condition monitoring agents.
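
The declarative style described here can be illustrated schematically. This is not the actual Jess rule base; the parameter names, thresholds, and messages below are invented to show why pairing each condition with its notification keeps subsystem rules modular and independently testable.

```python
# Illustrative sketch (not the actual Jess rules): each declarative
# monitoring rule pairs a condition over a telemetry snapshot with an
# alert message. Parameter names and limits are invented examples.
rules = [
    ("hydraulic pressure out of range",
     lambda t: not (2800 <= t["hyd_pressure_psi"] <= 3200)),
    ("fuel cell temperature high",
     lambda t: t["fuel_cell_temp_f"] > 200),
]

def evaluate(telemetry):
    """Return the alert messages whose conditions fire on this snapshot."""
    return [msg for msg, cond in rules if cond(telemetry)]

snapshot = {"hyd_pressure_psi": 3400, "fuel_cell_temp_f": 180}
print(evaluate(snapshot))  # ['hydraulic pressure out of range']
```

Adding coverage for another subsystem means appending rules to the list, which mirrors the modularity and scalability the abstract attributes to the rule-based design.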

  10. DREAMS and IMAGE: A Model and Computer Implementation for Concurrent, Life-Cycle Design of Complex Systems

    NASA Technical Reports Server (NTRS)

    Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.

    1995-01-01

    Computing architectures are being assembled that extend concurrent engineering practices by providing more efficient execution and collaboration on distributed, heterogeneous computing networks. Built on the successes of initial architectures, requirements for a next-generation design computing infrastructure can be developed. These requirements concentrate on those needed by a designer in decision-making processes from product conception to recycling and can be categorized in two areas: design process and design information management. A designer both designs and executes design processes throughout design time to achieve better product and process capabilities while expending fewer resources. In order to accomplish this, information, or more appropriately design knowledge, needs to be adequately managed during product and process decomposition as well as recomposition. A foundation has been laid that captures these requirements in a design architecture called DREAMS (Developing Robust Engineering Analysis Models and Specifications). In addition, a computing infrastructure, called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment), is being developed that satisfies design requirements defined in DREAMS and incorporates enabling computational technologies.

  11. A design space of visualization tasks.

    PubMed

    Schulz, Hans-Jörg; Nocke, Thomas; Heitzler, Magnus; Schumann, Heidrun

    2013-12-01

    Knowledge about visualization tasks plays an important role in choosing or building suitable visual representations to pursue them. Yet, tasks are a multi-faceted concept and it is thus not surprising that the many existing task taxonomies and models all describe different aspects of tasks, depending on what these task descriptions aim to capture. This results in a clear need to bring these different aspects together under the common hood of a general design space of visualization tasks, which we propose in this paper. Our design space consists of five design dimensions that characterize the main aspects of tasks and that have so far been distributed across different task descriptions. We exemplify its concrete use by applying our design space in the domain of climate impact research. To this end, we propose interfaces to our design space for different user roles (developers, authors, and end users) that allow users of different levels of expertise to work with it.

  12. Lessons Learned in Building the Ares Projects

    NASA Technical Reports Server (NTRS)

    Sumrall, John Phil

    2010-01-01

    Since being established in 2005, the Ares Projects at Marshall Space Flight Center have been making steady progress designing, building, testing, and flying the next generation of exploration launch vehicles. Ares is committed to rebuilding crucial capabilities from the Apollo era that made the first human flights to the Moon possible, as well as incorporating the latest in computer technology and changes in management philosophy. One example of an Apollo-era practice has been giving NASA overall authority over vehicle integration activities, giving civil service engineers hands-on experience in developing rocket hardware. This knowledge and experience help make the agency a "smart buyer" of products and services. More modern practices have been added to the management tool belt to improve efficiency, cost effectiveness, and institutional knowledge, including knowledge management/capture to gain better insight into design and decision making; earned value management, where Ares won a NASA award for its practice and implementation; designing for operability; and Lean Six Sigma applications to identify and eliminate wasted time and effort. While it is important to learn technical lessons like how to fly and control unique rockets like the Ares I-X flight test vehicle, the Ares management team also has been learning important lessons about how to manage large, long-term projects.

  13. A METHODOLOGY FOR INTEGRATING IMAGES AND TEXT FOR OBJECT IDENTIFICATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paulson, Patrick R.; Hohimer, Ryan E.; Doucette, Peter J.

    2006-02-13

    Often text and imagery contain information that must be combined to solve a problem. One approach begins with transforming the raw text and imagery into a common structure that contains the critical information in a usable form. This paper presents an application in which the imagery of vehicles and the text from police reports were combined to demonstrate the power of data fusion to correctly identify the target vehicle--e.g., a red 2002 Ford truck identified in a police report--from a collection of diverse vehicle images. The imagery was abstracted into a common signature by first capturing the conceptual models of the imagery experts in software. Our system then (1) extracted fundamental features (e.g., wheel base, color), (2) made inferences about the information (e.g., it’s a red Ford) and then (3) translated the raw information into an abstract knowledge signature that was designed to both capture the important features and account for uncertainty. Likewise, the conceptual models of text analysis experts were instantiated into software that was used to generate an abstract knowledge signature that could be readily compared to the imagery knowledge signature. While this experiment's primary focus was to demonstrate the power of text and imagery fusion for a specific example, it also suggested several ways that text and geo-registered imagery could be combined to help solve other types of problems.
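
The core fusion idea above can be sketched as follows. This is a hypothetical simplification, not the paper's system: the feature names, values, and confidence weights are invented, and the real knowledge signatures handled uncertainty more richly than a single weight per feature.

```python
# Hypothetical sketch of the fusion idea: reduce both the image analysis
# and the police-report text to a common feature "signature", attach a
# confidence weight per feature, and score how well the two agree.
# All features, values, and weights below are invented.
image_sig = {"color": "red", "make": "Ford", "type": "truck"}
text_sig = {"color": "red", "make": "Ford", "year": "2002"}
confidence = {"color": 0.9, "make": 0.7, "type": 0.6, "year": 0.8}

def match_score(a, b):
    """Confidence-weighted agreement over the features both signatures share."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    agree = sum(confidence[f] for f in shared if a[f] == b[f])
    return agree / sum(confidence[f] for f in shared)

print(match_score(image_sig, text_sig))  # 1.0 -- every shared feature agrees
```

Scoring candidate vehicle images against the text-derived signature this way would rank the red Ford truck highest, which is the comparison step the abstract describes.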

  14. A prospective cohort study to assess seroprevalence, incidence, knowledge, attitudes and practices, willingness to pay for vaccine and related risk factors in dengue in a high incidence setting.

    PubMed

    Martínez-Vega, Ruth Aralí; Rodriguez-Morales, Alfonso J; Bracho-Churio, Yalil Tomás; Castro-Salas, Mirley Enith; Galvis-Ovallos, Fredy; Díaz-Quijano, Ronald Giovanny; Luna-González, María Lucrecia; Castellanos, Jaime E; Ramos-Castañeda, José; Diaz-Quijano, Fredi Alexander

    2016-11-25

    Dengue is one of the most important vector-borne diseases in the world, causing significant morbidity and economic impact. In Colombia, dengue is a major public health problem. The departments of La Guajira, Cesar and Magdalena are dengue endemic areas. The objective of this research is to determine the seroprevalence and the incidence of dengue virus infection in the participating municipalities from these departments, and also to establish the association between individual and housing factors and vector indices with seroprevalence and incidence. We will also assess knowledge, attitudes and practices, and willingness-to-pay for a dengue vaccine. A cohort study will be assembled with clustered multistage sampling in 11 endemic municipalities. Approximately 1000 homes will be visited to enroll people older than one year who live in these areas, who will be followed for 1 year. Dengue virus infections will be evaluated using IgG indirect ELISA and IgM and IgG capture ELISA. Additionally, vector indices will be measured, and adult mosquitoes will be captured with aspirators. Ovitraps will be used for continuous estimation of vector density. This research will generate the knowledge necessary to design and implement strategies with a multidimensional approach that reduce dengue morbidity and mortality in La Guajira and other departments of the Colombian Caribbean.

  15. Capturing and portraying science student teachers' pedagogical content knowledge through CoRe construction

    NASA Astrophysics Data System (ADS)

    Thongnoppakun, Warangkana; Yuenyong, Chokchai

    2018-01-01

    Pedagogical content knowledge (PCK) is an essential kind of knowledge that teachers have for teaching particular content to particular students in ways that enhance students' understanding; teachers with adequate PCK can present content to their students in an understandable way rather than merely transferring subject matter knowledge to learners. This study explored science student teachers' PCK for teaching science using a Content Representation-based methodology. The research participants were 68 fourth-year science student teachers from the Department of General Science, Faculty of Education, Phuket Rajabhat University. The PCK conceptualization for teaching science by Magnusson et al. (1999) was applied as the theoretical framework, and Content Representation (CoRe) by Loughran et al. (2004) was employed as the research methodology in the lesson preparation process. A CoRe consists of eight questions (CoRe prompts) designed to elicit and portray a teacher's PCK for teaching science. Data were collected from the science student teachers' CoRe designs for teaching a given topic and student grade: they were asked to create CoRe designs for teaching the topic 'Motion in one direction' to 7th grade students, followed by class discussion. The science student teachers mostly created groups of science concepts matching the subunits of the school science textbook rather than planning and arranging content to support students' understanding. They did, however, describe the effects of students' prior knowledge and learning difficulties, such as students' knowledge of scalar and vector quantities and their calculating skills. These responses portrayed the science student teachers' knowledge of students' understanding of science and their own content knowledge; however, they still had inadequate knowledge of instructional strategies and activities for enhancing student learning.
    In summary, CoRe designs can represent holistic overviews of science student teachers' PCK related to the teaching of a particular topic and can also help them gain a better understanding of how to teach for understanding. Research implications are given for teacher education and educational research, offering a potential way to enhance science student teachers' PCK for teaching science and to support their professional learning.

  16. DeMO: An Ontology for Discrete-event Modeling and Simulation.

    PubMed

    Silver, Gregory A; Miller, John A; Hybinette, Maria; Baramidze, Gregory; York, William S

    2011-09-01

    Several fields have created ontologies for their subdomains. For example, the biological sciences have developed extensive ontologies such as the Gene Ontology, which is considered a great success. Ontologies could provide similar advantages to the Modeling and Simulation community. They provide a way to establish common vocabularies and capture knowledge about a particular domain with community-wide agreement. Ontologies can support significantly improved (semantic) search and browsing, integration of heterogeneous information sources, and improved knowledge discovery capabilities. This paper discusses the design and development of an ontology for Modeling and Simulation called the Discrete-event Modeling Ontology (DeMO), and it presents prototype applications that demonstrate various uses and benefits that such an ontology may provide to the Modeling and Simulation community.
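
The subsumption reasoning that makes an ontology more than a vocabulary can be shown in miniature. The class names below are invented for illustration and are not taken from DeMO's actual hierarchy: once model types are linked by is-a relations, a search for a general class can automatically retrieve its specializations.

```python
# Minimal illustration of what an ontology's class taxonomy enables:
# is-a links among community-agreed terms support subsumption queries,
# the basis of the semantic search the abstract mentions.
# Class names here are invented, not DeMO's.
is_a = {
    "MarkovChain": "StochasticProcessModel",
    "QueueingModel": "StochasticProcessModel",
    "StochasticProcessModel": "DiscreteEventModel",
}

def ancestors(cls):
    """Walk is-a links upward, returning every superclass of cls."""
    out = []
    while cls in is_a:
        cls = is_a[cls]
        out.append(cls)
    return out

print(ancestors("MarkovChain"))
# ['StochasticProcessModel', 'DiscreteEventModel']
```

A query for "DiscreteEventModel" resources can then match anything whose ancestor chain contains that class, which is how ontology-backed search improves on keyword matching.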

  17. DeMO: An Ontology for Discrete-event Modeling and Simulation

    PubMed Central

    Silver, Gregory A; Miller, John A; Hybinette, Maria; Baramidze, Gregory; York, William S

    2011-01-01

    Several fields have created ontologies for their subdomains. For example, the biological sciences have developed extensive ontologies such as the Gene Ontology, which is considered a great success. Ontologies could provide similar advantages to the Modeling and Simulation community. They provide a way to establish common vocabularies and capture knowledge about a particular domain with community-wide agreement. Ontologies can support significantly improved (semantic) search and browsing, integration of heterogeneous information sources, and improved knowledge discovery capabilities. This paper discusses the design and development of an ontology for Modeling and Simulation called the Discrete-event Modeling Ontology (DeMO), and it presents prototype applications that demonstrate various uses and benefits that such an ontology may provide to the Modeling and Simulation community. PMID:22919114

  18. A framework for collecting inclusive design data for the UK population.

    PubMed

    Langdon, Pat; Johnson, Daniel; Huppert, Felicia; Clarkson, P John

    2015-01-01

    Successful inclusive product design requires knowledge about the capabilities, needs and aspirations of potential users and should cater for the different scenarios in which people will use products, systems and services. This should include: the individual at home; in the workplace; for businesses, and for products in these contexts. It needs to reflect the development of theory, tools and techniques as research moves on. It must also draw in wider psychological, social, and economic considerations in order to gain a more accurate understanding of users' interactions with products and technology. However, recent research suggests that although a number of national disability surveys have been carried out, no such knowledge currently exists as information to support the design of products, systems and services for heterogeneous users. This paper outlines the strategy behind specific inclusive design research that is aimed at creating the foundations for measuring inclusion in product designs. A key outcome of this future research will be specifying and operationalising capability, and psychological, social and economic context measures for inclusive design. This paper proposes a framework for capturing such information, describes an early pilot study, and makes recommendations for better practice. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  19. Students Approach to Learning and Their Use of Lecture Capture

    ERIC Educational Resources Information Center

    Vajoczki, Susan; Watt, Susan; Marquis, Nick; Liao, Rose; Vine, Michelle

    2011-01-01

    This study examined lecture capture as a way of enhancing university education, and explored how students with different learning approaches used lecture capturing (i.e., podcasts and vodcasts). Results indicate that both deep and surface learners report increased course satisfaction and better retention of knowledge in courses with traditional…

  20. Operationalizing Levels of Academic Mastery Based on Vygotsky’s Theory

    PubMed Central

    Nezhnov, Peter; Kardanova, Elena; Ludlow, Larry

    2014-01-01

    The present study tested the possibility of operationalizing levels of knowledge acquisition based on Vygotsky’s theory of cognitive growth. An assessment tool (SAM-Math) was developed to capture a hypothesized hierarchical structure of mathematical knowledge consisting of procedural, conceptual, and functional levels. In Study 1, SAM-Math was administered to 4th-grade students (N = 2,216). The results of Rasch analysis indicated that the test provided an operational definition for the construct of mathematical competence that included the three levels of mastery corresponding to the theoretically based hierarchy of knowledge. In Study 2, SAM-Math was administered to students in 4th, 6th, 8th, and 10th grades (N = 396) to examine developmental changes in the levels of mathematics knowledge. The results showed that the mastery of mathematical concepts presented in elementary school continued to deepen beyond elementary school, as evidenced by a significant growth in conceptual and functional levels of knowledge. The findings are discussed in terms of their implications for psychological theory, test design, and educational practice. PMID:29795820
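
The Rasch analysis used to validate SAM-Math rests on a simple logistic model: each item has a difficulty b, each student an ability theta, and the probability of a correct response depends only on their difference. The sketch below states that standard formula; the numeric values are illustrative, not data from the study.

```python
# The Rasch model: P(correct | theta, b) = exp(theta - b) / (1 + exp(theta - b)).
# A hierarchical test like SAM-Math expects functional-level items to carry
# larger b than procedural ones. Values below are invented illustrations.
import math

def rasch_p(theta: float, b: float) -> float:
    """Probability that a student of ability theta answers an item of difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# An item matched to the student's ability is a 50/50 proposition;
# harder items (larger b) are correspondingly less likely to be solved.
print(rasch_p(0.0, 0.0))  # 0.5
print(rasch_p(0.0, 1.5) < rasch_p(0.0, 0.5))  # True
```

Fitting b for every item and checking that the estimated difficulties cluster into the three hypothesized bands is, in outline, how such an analysis operationalizes the procedural-conceptual-functional hierarchy.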

  1. Effect of the space environment on materials flown on the EURECA/TICCE-HVI experiment

    NASA Technical Reports Server (NTRS)

    Maag, Carl R.; Stevenson, Tim J.; Tanner, William G.; Borg, Janet

    1995-01-01

    The primary benefit of accurately quantifying and characterizing the space environmental effects on materials is longer instrument and spacecraft life. Knowledge of the limits of materials allows the designer to optimize the spacecraft design so that the required life is achieved. Materials such as radiator coatings that have excellent durability result in the design of smaller radiators than a radiator coated with a lower durability coating. This may reduce the weight of the spacecraft due to a more optimum design. Another benefit of characterizing materials is the quantification of outgassing properties. Spacecraft which have ultraviolet or visible sensor payloads are susceptible to contamination by outgassed volatile materials. Materials with known outgassing characteristics can be restricted in these spacecraft. Finally, good data on material characteristics improves the ability of analytical models to predict material performance. A flight experiment was conducted on the European Space Agency's European Retrievable Carrier (EuReCa) as part of the Timeband Capture Cell Experiment (TICCE). Our main objective was to gather additional data on the dust and debris environments, with the focus on understanding growth as a function of size (mass) for hypervelocity particles 1E-06 cm and larger. In addition to enumerating particle impacts, hypervelocity particles were to be captured and returned intact. Measurements were performed post-flight to determine the flux density, diameters, and subsequent effects on various optical, thermal control and structural materials. In addition to these principal measurements, the experiment also provided a structure and sample holders for the exposure of passive material samples to the space environment, e.g., the effects of thermal cycling, atomic oxygen, etc. Preliminary results are presented, including the techniques used for intact capture of particles.

  2. Effect of the space environment on materials flown on the EURECA/TICCE-HVI experiment

    NASA Astrophysics Data System (ADS)

    Maag, Carl R.; Stevenson, Tim J.; Tanner, William G.; Borg, Janet

    1995-02-01

    The primary benefit of accurately quantifying and characterizing the space environmental effects on materials is longer instrument and spacecraft life. Knowledge of the limits of materials allows the designer to optimize the spacecraft design so that the required life is achieved. Materials such as radiator coatings that have excellent durability result in the design of smaller radiators than a radiator coated with a lower durability coating. This may reduce the weight of the spacecraft due to a more optimum design. Another benefit of characterizing materials is the quantification of outgassing properties. Spacecraft which have ultraviolet or visible sensor payloads are susceptible to contamination by outgassed volatile materials. Materials with known outgassing characteristics can be restricted in these spacecraft. Finally, good data on material characteristics improves the ability of analytical models to predict material performance. A flight experiment was conducted on the European Space Agency's European Retrievable Carrier (EuReCa) as part of the Timeband Capture Cell Experiment (TICCE). Our main objective was to gather additional data on the dust and debris environments, with the focus on understanding growth as a function of size (mass) for hypervelocity particles 1E-06 cm and larger. In addition to enumerating particle impacts, hypervelocity particles were to be captured and returned intact. Measurements were performed post-flight to determine the flux density, diameters, and subsequent effects on various optical, thermal control and structural materials. In addition to these principal measurements, the experiment also provided a structure and sample holders for the exposure of passive material samples to the space environment, e.g., the effects of thermal cycling, atomic oxygen, etc. Preliminary results are presented, including the techniques used for intact capture of particles.

  3. Marketing practitioner’s tacit knowledge acquisition using Repertory Grid Technique (RTG)

    NASA Astrophysics Data System (ADS)

    Azmi, Afdhal; Adriman, Ramzi

    2018-05-01

    The tacit knowledge of expert marketing practitioners is an excellent and priceless resource. It brings their experience, skills, ideas, belief systems, insights and speculation into management decision-making; this expertise consists of individual intuitive judgments and personal shortcuts for completing work efficiently. The tacit knowledge of expert marketing practitioners offers some of the best problem solutions in marketing strategy, environmental analysis, product management and partner relationships. This paper proposes a method for acquiring tacit knowledge from marketing practitioners using the Repertory Grid Technique (RGT). The RGT is a software application for tacit knowledge acquisition that provides a systematic approach to capturing and acquiring constructs from an individual. The results show that an understanding of RGT enables TKE and MPE to obtain good results in capturing and acquiring the tacit knowledge of expert marketing practitioners.
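
A repertory grid itself is just a ratings table: elements (here, marketing strategies) scored against bipolar constructs elicited from the expert. The sketch below is a generic illustration of one standard grid analysis, comparing rating columns to see which elements the expert construes as similar; the elements, constructs, and ratings are all invented.

```python
# Schematic repertory grid: the expert rates each element (invented
# marketing strategies A, B, C) against each elicited bipolar construct
# on a 1-5 scale. Comparing rating columns is a standard RGT analysis.
grid = {  # construct (pole ... pole): ratings per element
    "low cost ... premium": {"A": 1, "B": 4, "C": 5},
    "mass market ... niche": {"A": 2, "B": 4, "C": 5},
    "short term ... long term": {"A": 1, "B": 5, "C": 4},
}

def distance(e1, e2):
    """Sum of absolute rating differences between two elements."""
    return sum(abs(ratings[e1] - ratings[e2]) for ratings in grid.values())

# Elements B and C are construed most alike by this (invented) expert.
pairs = [("A", "B"), ("A", "C"), ("B", "C")]
print(min(pairs, key=lambda p: distance(*p)))  # ('B', 'C')
```

Patterns like this, which the expert may never have stated explicitly, are what make the grid a vehicle for surfacing tacit knowledge.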

  4. Rapid Development of Custom Software Architecture Design Environments

    DTIC Science & Technology

    1999-08-01

    the tools themselves. This dissertation describes a new approach to capturing and using architectural design expertise in software architecture design environments...A language and tools are presented for capturing and encapsulating software architecture design expertise within a conceptual framework...of architectural styles and design rules. The design expertise thus captured is supported with an incrementally configurable software architecture

  5. A conceptual framework for the domain of evidence-based design.

    PubMed

    Ulrich, Roger S; Berry, Leonard L; Quan, Xiaobo; Parish, Janet Turner

    2010-01-01

    The physical facilities in which healthcare services are performed play an important role in the healing process. Evidence-based design in healthcare is a developing field of study that holds great promise for benefiting key stakeholders: patients, families, physicians, and nurses, as well as other healthcare staff and organizations. In this paper, the authors present and discuss a conceptual framework intended to capture the current domain of evidence-based design in healthcare. In this framework, the built environment is represented by nine design variable categories: audio environment, visual environment, safety enhancement, wayfinding system, sustainability, patient room, family support spaces, staff support spaces, and physician support spaces. Furthermore, a series of matrices is presented that indicates knowledge gaps concerning the relationship between specific healthcare facility design variable categories and participant and organizational outcomes. From this analysis, the authors identify fertile research opportunities from the perspectives of key stakeholders.

  6. Fan Fuel Casting Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Imhoff, Seth D.

    LANL was approached to provide material and design guidance for a fan-shaped fuel element. A total of at least three castings were planned. The first casting is a simple billet mold to be made from high carbon DU-10Mo charge material. The second and third castings are for optimization of the actual fuel plate mold. The experimental scope for optimization is only broad enough for a second iteration of the mold design. It is important to note that partway through FY17, this project was cancelled by the sponsor. This report is being written in order to capture the knowledge gained should this project resume at a later date.

  7. Preserving Knowledge

    ERIC Educational Resources Information Center

    Taintor, Spence

    2008-01-01

    Every year, teachers leave the profession and take valuable experience and knowledge with them. An increasing retirement rate makes schools vulnerable to a significant loss of knowledge. This article describes how implementing a knowledge management process will ensure that valuable assets are captured and shared. (Contains 3 online resources.)

  8. Reducing the cognitive workload - Trouble managing power systems

    NASA Technical Reports Server (NTRS)

    Manner, David B.; Liberman, Eugene M.; Dolce, James L.; Mellor, Pamela A.

    1993-01-01

    The complexity of space-based systems makes monitoring them and diagnosing their faults taxing for human beings. When a problem arises, immediate attention and quick resolution is mandatory. To aid humans in these endeavors we have developed an automated advisory system. Our advisory expert system, Trouble, incorporates the knowledge of the power system designers for Space Station Freedom. Trouble is designed to be a ground-based advisor for the mission controllers in the Control Center Complex at Johnson Space Center (JSC). It has been developed at NASA Lewis Research Center (LeRC) and tested in conjunction with prototype flight hardware contained in the Power Management and Distribution testbed and the Engineering Support Center, ESC, at LeRC. Our work will culminate with the adoption of these techniques by the mission controllers at JSC. This paper elucidates how we have captured power system failure knowledge, how we have built and tested our expert system, and what we believe its potential uses are.

  9. Knowledge Acquisition and Management for the NASA Earth Exchange (NEX)

    NASA Astrophysics Data System (ADS)

    Votava, P.; Michaelis, A.; Nemani, R. R.

    2013-12-01

    NASA Earth Exchange (NEX) is a data, computing and knowledge collaboratory that houses NASA satellite, climate and ancillary data, where a focused community can come together to share modeling and analysis codes, scientific results, knowledge and expertise on a centralized platform with access to large supercomputing resources. As more and more projects are executed on NEX, we are increasingly focusing on capturing the knowledge of NEX users and providing mechanisms for sharing it with the community in order to facilitate reuse and accelerate research. There are many possible knowledge contributions to NEX: a wiki entry on the NEX portal contributed by a developer, information extracted from a publication in an automated way, or a workflow captured during code execution on the supercomputing platform. The goal of the NEX knowledge platform is to capture and organize this information and make it easily accessible to the NEX community and beyond. The knowledge acquisition process consists of three main facets - data and metadata, workflows and processes, and web-based information. Once the knowledge is acquired, it is processed in a number of ways, ranging from custom metadata parsers to entity extraction using natural language processing techniques. The processed information is linked with existing taxonomies and aligned with an internal ontology (which heavily reuses a number of external ontologies). This forms a knowledge graph that can then be used to improve users' search query results as well as provide additional analytics capabilities to the NEX system. Such a knowledge graph will be an important building block in creating a dynamic knowledge base for the NEX community where knowledge is both generated and easily shared.
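    The pipeline this abstract describes (entity extraction, taxonomy linking, knowledge-graph construction) can be illustrated with a toy sketch. The taxonomy terms and the co-occurrence linking rule below are invented for illustration and are not the actual NEX implementation.

```python
# Hypothetical sketch of an entity-extraction and knowledge-graph step:
# match known taxonomy terms in text, then link entities that co-occur
# in the same document. Terms and matching rule are illustrative only.
import re
from collections import defaultdict

TAXONOMY = {"MODIS": "satellite_data", "TOPS": "model", "NDVI": "index"}

def extract_entities(text):
    """Return taxonomy terms found in the text, in order of appearance."""
    pattern = r"\b(" + "|".join(map(re.escape, TAXONOMY)) + r")\b"
    return [m.group(1) for m in re.finditer(pattern, text)]

def build_graph(documents):
    """Link entities that co-occur within the same document."""
    graph = defaultdict(set)
    for doc in documents:
        entities = set(extract_entities(doc))
        for a in entities:
            for b in entities - {a}:
                graph[a].add(b)
    return graph

docs = ["TOPS workflow computed NDVI from MODIS surface reflectance."]
graph = build_graph(docs)
print(sorted(graph["MODIS"]))  # → ['NDVI', 'TOPS']
```

    A production system would use proper named-entity recognition and ontology alignment rather than exact term matching, but the flow from text to linked graph is the same.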

  10. A Design Support Framework through Dynamic Deployment of Hypothesis and Verification in the Design Process

    NASA Astrophysics Data System (ADS)

    Nomaguchi, Yutaka; Fujita, Kikuo

    This paper proposes a design support framework, named DRIFT (Design Rationale Integration Framework of Three layers), which dynamically captures and manages hypothesis and verification in the design process. The core of DRIFT is a three-layered design process model of action, model operation and argumentation. This model integrates various design support tools and captures design operations performed on them. The action level captures the sequence of design operations. The model operation level captures the transition of design states, recording a design snapshot across design tools. The argumentation level captures the process of setting problems and alternatives. Linking the three levels makes it possible to automatically and efficiently capture and manage iterative hypothesis and verification processes through design operations across design tools. In DRIFT, such linkage is extracted through templates of design operations, which are derived from the patterns embedded in design tools such as Design-For-X (DFX) approaches, and design tools are integrated through an ontology-based representation of design concepts. An argumentation model, gIBIS (graphical Issue-Based Information System), is used for representing dependencies among problems and alternatives. A mechanism of TMS (Truth Maintenance System) is used for managing multiple hypothetical design stages. This paper also demonstrates a prototype implementation of DRIFT and its application to a simple design problem. It concludes with a discussion of future issues.

  11. Temperature dependence of carrier capture by defects in gallium arsenide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wampler, William R.; Modine, Normand A.

    2015-08-01

    This report examines the temperature dependence of the capture rate of carriers by defects in gallium arsenide and compares two previously published theoretical treatments based on multiphonon emission (MPE). The objective is to reduce uncertainty in atomistic simulations of gain degradation in III-V HBTs from neutron irradiation. A major source of uncertainty in those simulations is poor knowledge of carrier capture rates, whose values can differ by several orders of magnitude between various defect types. Most of this variation is due to differing dependence on temperature, which is closely related to the relaxation of the defect structure that occurs as a result of the change in charge state of the defect. The uncertainty in capture rate can therefore be greatly reduced by better knowledge of the defect relaxation.

  12. The Antarctic Search for Meteorites: The Future of Space, on Earth Today - EVA Knowledge Capture Outbrief

    NASA Technical Reports Server (NTRS)

    Love, Stan

    2013-01-01

    NASA astronaut Stan Love shared his experiences with the Antarctic Search for Meteorites (ANSMET), an annual expedition to the southern continent to collect valuable samples for research in planetary science. ANSMET teams operate from isolated, remote field camps on the polar plateau, where windchill factors often reach -40 F. Several astronaut participants have noted ANSMET's similarity to a space mission. Some of the operational concepts, tools, and equipment employed by ANSMET teams may offer valuable insights to designers of future planetary surface exploration hardware.

  13. Reinventing Design Principles for Developing Low-Viscosity Carbon Dioxide-Binding Organic Liquids for Flue Gas Clean Up

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    2017-01-11

    Anthropogenic carbon dioxide (CO2) emissions from point sources, such as coal-fired power plants, account for the majority of the greenhouse gases in the atmosphere. Capture, storage and utilization are required to mitigate adverse environmental effects. Aqueous amine-based CO2 capture solvents are currently considered the industry standard, but deployment to market is limited by their high regeneration energy demand. In that context, energy-efficient and less-viscous water-lean transformational solvent systems known as CO2-binding organic liquids (CO2BOLs) are being developed in our group to advance this technology to commercialization. Herein, we present a logical design approach based on fundamental concepts of organic chemistry and computer simulations aimed at lowering solvent viscosity. Conceptually, viscosity reduction would be achieved by systematic methods such as introducing steric hindrance on the anion to minimize intermolecular cation-anion interactions, and fine-tuning the electronics, hydrogen-bonding orientation and strength, and charge solvation. Conventional trial-and-error approaches, while effective, are time consuming and economically expensive. Herein, we rethink the metrics and design principles of low-viscosity CO2 capture solvents using a combined synthesis and computational modeling approach. We critically study the impacts of modifying factors such as orientation of hydrogen bonding, introduction of higher degrees of freedom, and cation or anion charge solvation, and assess whether or how each factor impacts the viscosity of CO2BOL CO2 capture solvents. Ultimately, we found that hydrogen bond orientation and strength predominantly influence the viscosity of CO2BOL solvents. With this knowledge, a new CO2BOL variant, 1-MEIPADM-2-BOL, was synthesized and tested, resulting in a solvent that is approximately 60% less viscous at 25 mol% CO2 loading with respect to our base compound 1-IPADM-2-BOL. The insights gained from the current study redefine the fundamental concepts and understanding of what influences viscosity in concentrated organic CO2 capture solvents.

  14. Tools for knowledge acquisition within the NeuroScholar system and their application to anatomical tract-tracing data

    PubMed Central

    Burns, Gully APC; Cheng, Wei-Cheng

    2006-01-01

    Background: Knowledge bases that summarize the published literature provide useful online references for specific areas of systems-level biology that are not otherwise supported by large-scale databases. In the field of neuroanatomy, small focused teams have constructed medium-size knowledge bases to summarize the literature describing tract-tracing experiments in several species. Despite years of collation and curation, these databases only provide partial coverage of the available published literature. Given that the scientists reading these papers must all generate the interpretations that would normally be entered into such a system, we attempt here to provide general-purpose annotation tools to make it easy for members of the community to contribute to the task of data collation. Results: In this paper, we describe an open-source, freely available knowledge management system called 'NeuroScholar' that allows straightforward structured markup of PDF files according to a well-designed schema to capture the essential details of this class of experiment. Although the example worked through in this paper is quite specific to neuroanatomical connectivity, the design is freely extensible and could conceivably be used to construct local knowledge bases for other experiment types. Knowledge representations of the experiment are also directly linked to the contributing textual fragments from the original research article. Through the use of this system, not only could members of the community contribute to the collation task, but input data can also be gathered for automated approaches to knowledge acquisition through the use of Natural Language Processing (NLP). Conclusion: We present a functional, working tool that permits users to populate knowledge bases for neuroanatomical connectivity data from the literature through the use of structured questionnaires. This system is open-source, fully functional and available for download from [1]. PMID:16895608

  15. A Tailored Ontology Supporting Sensor Implementation for the Maintenance of Industrial Machines.

    PubMed

    Maleki, Elaheh; Belkadi, Farouk; Ritou, Mathieu; Bernard, Alain

    2017-09-08

    The longtime productivity of an industrial machine is improved by condition-based maintenance strategies. To do this, the integration of sensors and other cyber-physical devices is necessary in order to capture and analyze a machine's condition through its lifespan. Thus, choosing the best sensor is a critical step to ensure the efficiency of the maintenance process. Indeed, considering the variety of sensors, and their features and performance, a formal classification of a sensor's domain knowledge is crucial. This classification facilitates the search for and reuse of solutions during the design of a new maintenance service. Following a Knowledge Management methodology, the paper proposes and develops a new sensor ontology that structures the domain knowledge, covering both theoretical and experimental sensor attributes. An industrial case study is conducted to validate the proposed ontology and to demonstrate its utility as a guideline to ease the search of suitable sensors. Based on the ontology, the final solution will be implemented in a shared repository connected to legacy CAD (computer-aided design) systems. The selection of the best sensor is, firstly, obtained by the matching of application requirements and sensor specifications (that are proposed by this sensor repository). Then, it is refined from the experimentation results. The achieved solution is recorded in the sensor repository for future reuse. As a result, the time and cost of the design process of new condition-based maintenance services is reduced.
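    The requirement-to-specification matching this abstract describes can be sketched as a simple repository filter. The sensor records and attribute names below are hypothetical illustrations, not the paper's actual ontology or repository schema.

```python
# Hypothetical sketch of matching application requirements against a
# sensor repository: keep only sensors whose attributes satisfy every
# stated requirement predicate. Records and attributes are invented.
SENSORS = [
    {"name": "acc-A", "type": "accelerometer", "range_g": 16, "cost": 40},
    {"name": "acc-B", "type": "accelerometer", "range_g": 8,  "cost": 15},
    {"name": "mic-C", "type": "acoustic",      "range_g": 0,  "cost": 25},
]

def match_sensors(requirements, repository):
    """Keep sensors whose attributes satisfy every stated requirement."""
    return [
        s for s in repository
        if all(pred(s.get(attr)) for attr, pred in requirements.items())
    ]

reqs = {"type": lambda v: v == "accelerometer",
        "range_g": lambda v: v >= 10}
print([s["name"] for s in match_sensors(reqs, SENSORS)])  # → ['acc-A']
```

    In the paper's workflow this first match by specification would then be refined using experimentation results before the chosen solution is recorded for reuse.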

  16. Enabling Security, Stability, Transition, and Reconstruction Operations through Knowledge Management

    DTIC Science & Technology

    2009-03-18

    strategy. Overall, the cultural barriers to knowledge sharing center on knowledge creation and capture. The primary barrier to knowledge sharing is lack ... Lacking a shared identity decreases the likelihood of knowledge sharing, which is essential to effective collaboration.84 Related to collaboration...to adapt, develop, and change based on experience-derived knowledge.90 A second cultural barrier to knowledge acquisition is the lack receptiveness

  17. Captured Knowledge: Presentations and Notes of the KMWorld Conference and Exposition (4th, Santa Clara, California, September 13-15, 2000).

    ERIC Educational Resources Information Center

    Jones, Rebecca, Ed.; Nixon, Carol, Comp.; Burmood, Jennifer, Comp.

    This publication contains presentations, notes, and illustrative materials used in the annual KMWorld Conference and Exposition, "Knowledge Nets: Defining and Driving the E-Enterprise." Presentations include: "Knowledge Management Applied to the Manufacturing Enterprise" (Matthew Artibee); "Ryder Knowledge Center: Building…

  18. The Spelling Sensitivity Score: Noting Developmental Changes in Spelling Knowledge

    ERIC Educational Resources Information Center

    Masterson, Julie J.; Apel, Kenn

    2010-01-01

    Spelling is a language skill supported by several linguistic knowledge sources, including phonemic, orthographic, and morphological knowledge. Typically, however, spelling assessment procedures do not capture the development and use of these linguistic knowledge sources. The purpose of this article is to describe a new assessment system, the…

  19. Exploratory research for the development of a computer aided software design environment with the software technology program

    NASA Technical Reports Server (NTRS)

    Hardwick, Charles

    1991-01-01

    Field studies were conducted by MCC to determine areas of research of mutual interest to MCC and JSC. NASA personnel from the Information Systems Directorate and research faculty from UHCL/RICIS visited MCC in Austin, Texas to examine tools and applications under development in the MCC Software Technology Program. MCC personnel presented workshops in hypermedia, design knowledge capture, and design recovery on site at JSC for ISD personnel. The following programs were installed on workstations in the Software Technology Lab, NASA/JSC: (1) GERM (Graphic Entity Relations Modeler); (2) gIBIS (Graphic Issues Based Information System); and (3) DESIRE (Design Recovery tool). These applications were made available to NASA for inspection and evaluation. Programs developed in the MCC Software Technology Program run on the SUN workstation. The programs do not require special configuration, but they will require larger than usual amounts of disk space and RAM to operate properly.

  20. Reinventing Design Principles for Developing Low-Viscosity Carbon Dioxide-Binding Organic Liquids for Flue Gas Clean Up.

    PubMed

    Malhotra, Deepika; Koech, Phillip K; Heldebrant, David J; Cantu, David C; Zheng, Feng; Glezakou, Vassiliki-Alexandra; Rousseau, Roger

    2017-02-08

    Anthropogenic CO2 emissions from point sources (e.g., coal-fired power plants) account for the majority of the greenhouse gases in the atmosphere. Water-lean solvent systems such as CO2-binding organic liquids (CO2BOLs) are being developed to reduce the energy requirement for CO2 capture. Many water-lean solvents such as CO2BOLs are currently limited by the high viscosities of concentrated electrolyte solvents, thus many of these solvents have yet to move toward commercialization. Conventional trial-and-error approaches for viscosity reduction, while effective, are time consuming and economically expensive. We rethink the metrics and design principles of low-viscosity CO2-capture solvents using a combined synthesis and computational modeling approach. We critically study the effects of viscosity-reducing factors such as orientation of hydrogen bonding, introduction of higher degrees of freedom, and cation or anion charge solvation, and assess whether or how each factor affects the viscosity of CO2BOL CO2 capture solvents. Ultimately, we found that hydrogen bond orientation and strength is the predominant factor influencing the viscosity in CO2BOL solvents. With this knowledge, a new CO2BOL variant, 1-MEIPADM-2-BOL, was synthesized and tested, resulting in a solvent that is approximately 60% less viscous at 25 mol% CO2 loading than our base compound 1-IPADM-2-BOL. The insights gained from the current study redefine the fundamental concepts and understanding of what influences viscosity in concentrated organic CO2-capture solvents. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Requirements' Role in Mobilizing and Enabling Design Conversation

    NASA Astrophysics Data System (ADS)

    Bergman, Mark

    Requirements play a critical role in a design conversation about systems and products. Product and system design exists at the crossroads of problems, solutions and requirements. Requirements contextualize problems and solutions, pointing the way to feasible outcomes. These are captured with models and detailed specifications. Still, stakeholders need to be able to understand one another using shared design representations in order to mobilize bias and transform knowledge towards legitimized, desired results. Many modern modeling languages, including UML, as well as detailed, logic-based specifications, are beyond the comprehension of key stakeholders. Hence, they inhibit, rather than promote, design conversation. Improved design boundary objects (DBOs), especially design requirements boundary objects (DRBOs), need to be created and refined to improve communication between principals. Four key features of design boundary objects that improve and promote design conversation are discussed in detail. A systems analysis and design case study is presented which demonstrates these features in action. It describes how a small team of analysts worked with key stakeholders to mobilize and guide a complex system design discussion towards an unexpected, yet desired, outcome within a short time frame.

  2. What Hansel and Gretel’s Trail Teach Us about Knowledge Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wayne Simpson; Troy Hiltbrand

    Background: At Idaho National Laboratory (INL), we are on the cusp of a significant era of change. INL is the lead Department of Energy Nuclear Research and Development Laboratory, focused on finding innovative solutions to the nation’s energy challenges. Not only has the Laboratory grown at an unprecedented rate over the last five years, but a significant segment of its workforce is also ready for retirement. Over the next 10 years, it is anticipated that upwards of 60% of the current workforce at INL will be eligible for retirement. Since the Laboratory is highly dependent on the intellectual capabilities of its scientists and engineers and their efforts to ensure the future of the nation’s energy portfolio, this attrition of resources has the potential of seriously impacting the ability of the Laboratory to sustain itself and the growth that it has achieved in past years. Similar to Germany in the early nineteenth century, we face the challenge of our self-identity and must find a way to solidify our legacy to propel us into the future. Approach: As the Brothers Grimm set out to collect their fairy tales, they focused on gathering information from the people who were most knowledgeable in the subject. For them, it was the peasants, with their rich knowledge of the region’s subculture of folklore that was passed down from generation to generation around the evening fire. As we look to capture this tacit knowledge, it is requisite that we also seek this information from those individuals who are most versed in it. In our case, it is the scientists and researchers who have dedicated their lives to providing the nation with nuclear energy. This information comes in many forms, both digital and non-digital. Some of this information still resides in the minds of these scientists and researchers who are close to retirement, or who have already retired.
    Once the information has been collected, it has to be sorted through to identify where the “shining stones” can be found. The quantity of this information makes it improbable for an individual or set of individuals to sort through it and pick out those ideas which are most important. To accomplish both information capture and classification, modern advancements in technology give us the tools that we need to successfully capture this tacit knowledge. To assist in this process, we have evaluated multiple tools and methods that will help us to unlock the power of tacit knowledge. Tools: The first challenge that stands in the way of success is the capture of information. More than 50 years of nuclear research is captured in log books, microfiche, and other non-digital formats. Transforming this information from its current form into a format that can “shine” requires a number of different tools. These tools fall into three major categories: Information Capture, Content Retrieval, and Information Classification. Information Capture: The first step is to capture the information from a myriad of sources. With knowledge existing in multiple formats, this step requires multiple approaches to be successful. Some of the sources that require consideration include handwritten documents, typed documents, microfiche, images, audio and video feeds, and electronic images. Making this step feasible for a large body of knowledge requires automation.

  3. Explicit instructions and consolidation promote rewiring of automatic behaviors in the human mind.

    PubMed

    Szegedi-Hallgató, Emese; Janacsek, Karolina; Vékony, Teodóra; Tasi, Lia Andrea; Kerepes, Leila; Hompoth, Emőke Adrienn; Bálint, Anna; Németh, Dezső

    2017-06-29

    One major challenge in human behavior and brain sciences is to understand how we can rewire already existing perceptual, motor, cognitive, and social skills or habits. Here we aimed to characterize one aspect of rewiring, namely, how we can update our knowledge of sequential/statistical regularities when they change. The dynamics of rewiring was explored from learning to consolidation using a unique experimental design which is suitable to capture the effect of implicit and explicit processing and the proactive and retroactive interference. Our results indicate that humans can rewire their knowledge of such regularities incidentally, and consolidation has a critical role in this process. Moreover, old and new knowledge can coexist, leading to effective adaptivity of the human mind in the changing environment, although the execution of the recently acquired knowledge may be more fluent than the execution of the previously learned one. These findings can contribute to a better understanding of the cognitive processes underlying behavior change, and can provide insights into how we can boost behavior change in various contexts, such as sports, educational settings or psychotherapy.

  4. Designing capture trajectories to unstable periodic orbits around Europa

    NASA Technical Reports Server (NTRS)

    Russell, Ryan P.; Lam, Try

    2006-01-01

    The hostile environment of third body perturbations restricts a mission designer's ability to find well-behaved reproducible capture trajectories when dealing with limited control authority as is typical with low-thrust missions. The approach outlined in this paper confronts this shortcoming by utilizing dynamical systems theory and an extensive preexisting database of Restricted Three Body Problem (RTBP) periodic orbits. The stable manifolds of unstable periodic orbits are utilized to attract a spacecraft towards Europa. By selecting an appropriate periodic orbit, a mission designer can control important characteristics of the captured state including stability, minimum altitudes, characteristic inclinations, and characteristic radii among others. Several free parameters are optimized in the non-trivial mapping from the RTBP to a more realistic model. Although the ephemeris capture orbit is ballistic by design, low-thrust is used to target the state that leads to the capture orbit, control the spacecraft after arriving on the unstable quasi-periodic orbit, and begin the spiral down towards the science orbit. The approach allows a mission designer to directly target fuel efficient captures at Europa in an ephemeris model. Furthermore, it provides structure and controllability to the design of capture trajectories that reside in a chaotic environment.
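    The dynamics underlying this approach can be illustrated with the planar Restricted Three Body Problem equations of motion in the rotating frame; periodic orbits and their stable manifolds are found by integrating these equations. The mass-ratio value below is an approximate Jupiter-Europa figure used for illustration only, not a mission value.

```python
# Minimal sketch of planar RTBP accelerations for a massless spacecraft
# in the rotating, nondimensional frame. mu is the secondary/total mass
# ratio; the Jupiter-Europa value here is approximate and illustrative.
def rtbp_accel(x, y, vx, vy, mu=2.528e-5):
    """Planar RTBP acceleration (Coriolis + centrifugal + two-body pulls)."""
    r1 = ((x + mu) ** 2 + y ** 2) ** 1.5       # distance to primary, cubed
    r2 = ((x - 1 + mu) ** 2 + y ** 2) ** 1.5   # distance to secondary, cubed
    ax = 2 * vy + x - (1 - mu) * (x + mu) / r1 - mu * (x - 1 + mu) / r2
    ay = -2 * vx + y - (1 - mu) * y / r1 - mu * y / r2
    return ax, ay

# Halfway between the primaries the net acceleration points toward the
# far more massive primary (ax < 0), as expected for such a small mu.
print(rtbp_accel(0.5, 0.0, 0.0, 0.0))
```

    A capture-trajectory study would integrate these equations backward from a periodic orbit to trace its stable manifold; that machinery is beyond this sketch.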

  5. Safety in numbers 3: Authenticity, Building knowledge & skills and Competency development & assessment: the ABC of safe medication dosage calculation problem-solving pedagogy.

    PubMed

    Weeks, Keith W; Meriel Hutton, B; Coben, Diana; Clochesy, John M; Pontin, David

    2013-03-01

    When designing learning and assessment environments it is essential to articulate the underpinning education philosophy, theory, model and learning style support mechanisms that inform their structure and content. We elaborate on original PhD research that articulates the design rationale of authentic medication dosage calculation problem-solving (MDC-PS) learning and diagnostic assessment environments. These environments embody the principles of authenticity, building knowledge and skills and competency assessment and are designed to support development of competence and bridging of the theory-practice gap. Authentic learning and diagnostic assessment environments capture the features and expert practices that are located in real world practice cultures and recreate them in authentic virtual clinical environments. We explore how this provides students with a safe virtual authentic environment to actively experience, practice and undertake MDC-PS learning and assessment activities. We argue that this is integral to the construction and diagnostic assessment of schemata validity (mental constructions and frameworks that are an individual's internal representation of their world), bridging of the theory-practice gap and cognitive and functional competence development. We illustrate these principles through the underpinning pedagogical design of two online virtual authentic learning and diagnostic assessment environments (safeMedicate and eDose™). Copyright © 2012. Published by Elsevier Ltd.

  6. Cognitive task analysis for instruction in single-injection ultrasound-guided regional anesthesia

    NASA Astrophysics Data System (ADS)

    Gucev, Gligor V.

    Cognitive task analysis (CTA) is a methodology for eliciting knowledge from subject matter experts. CTA has been used to capture the cognitive processes, decision-making, and judgments that underlie expert behaviors. A review of the literature revealed that CTA has not yet been used to capture the knowledge required to perform ultrasound-guided regional anesthesia (UGRA). The purpose of this study was to utilize CTA to extract knowledge from UGRA experts and to determine whether instruction based on a CTA of UGRA would produce results superior to those of traditional training. This study adds to the knowledge base of CTA in being the first to effectively capture the expert knowledge of UGRA. The derived protocol was used in a randomized, double-blinded experiment involving UGRA instruction to 39 novice learners. The results of this study strongly support the hypothesis that CTA-based instruction in UGRA is more effective than conventional clinical instruction, as measured by conceptual pre- and post-tests, performance of a simulated UGRA procedure, and the time necessary for task performance. This study adds to the number of studies that have demonstrated the superiority of CTA-informed instruction. Finally, it produced several validated instruments that can be used in instructing and evaluating UGRA.

  7. Empirical study using network of semantically related associations in bridging the knowledge gap.

    PubMed

    Abedi, Vida; Yeasin, Mohammed; Zand, Ramin

    2014-11-27

    The data overload has created a new set of challenges in finding meaningful and relevant information with minimal cognitive effort. However, designing robust and scalable knowledge discovery systems remains a challenge. Recent innovations in (biological) literature mining tools have opened new avenues for understanding the confluence of various diseases, genes, risk factors as well as biological processes, bridging the gaps between the massive amounts of scientific data and harvesting useful knowledge. In this paper, we highlight some of the findings using a text analytics tool called ARIANA (Adaptive Robust and Integrative Analysis for finding Novel Associations). An empirical study using ARIANA reveals knowledge discovery instances that illustrate the efficacy of such a tool. For example, ARIANA can capture the connection between the drug hexamethonium and pulmonary inflammation and fibrosis that caused the tragic death of a healthy volunteer in a 2001 Johns Hopkins asthma study, even though the abstract of the study was not part of the semantic model. An integrated system such as ARIANA could assist the human expert in exploratory literature search by bringing forward hidden associations, promoting data reuse and knowledge discovery as well as stimulating interdisciplinary projects by connecting information across disciplines.
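    Literature-based association mining of the kind this abstract describes can be sketched with a simple co-occurrence score over abstracts, for example pointwise mutual information (PMI). The toy corpus and scoring rule below are illustrative assumptions; ARIANA's actual semantic model is far richer.

```python
# Hypothetical sketch of association scoring over a literature corpus:
# score term pairs by co-occurrence using PMI. Corpus is a toy example.
import math

corpus = [
    {"hexamethonium", "pulmonary", "fibrosis"},
    {"hexamethonium", "hypertension"},
    {"pulmonary", "fibrosis", "inflammation"},
]

def pmi(a, b, docs):
    """Pointwise mutual information of terms a and b over document sets."""
    n = len(docs)
    pa = sum(a in d for d in docs) / n
    pb = sum(b in d for d in docs) / n
    pab = sum(a in d and b in d for d in docs) / n
    return math.log(pab / (pa * pb)) if pab else float("-inf")

# Terms that never co-occur directly may still be bridged via shared
# neighbors, which is the "hidden association" idea in the abstract.
print(pmi("pulmonary", "fibrosis", corpus))
```

    A system like ARIANA would combine such direct scores with transitive (A-B, B-C) evidence to surface associations that no single abstract states outright.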

  8. "JOB SEEKER"(Job Shadowing for Employee Engagement through Knowledge and Experience Retention).

    DOT National Transportation Integrated Search

    2016-05-01

    The main objective of this study was to explore how to optimally use the particular knowledge retention/transfer technique of job shadowing as an informal method for knowledge capture and transfer as well as increasing communication and emp...

  9. Importance of Knowledge Management in the Higher Educational Institutes

    ERIC Educational Resources Information Center

    Namdev Dhamdhere, Sangeeta

    2015-01-01

    Every academic institution contributes to knowledge. The generated information and knowledge is to be compiled at a central place and disseminated among the society for further growth. It is observed that the generated knowledge in the academic institute is not stored or captured properly. It is also observed that many a times generated…

  10. New Knowledge Derived from Learned Knowledge: Functional-Anatomic Correlates of Stimulus Equivalence

    ERIC Educational Resources Information Center

    Schlund, Michael W.; Hoehn-Saric, Rudolf; Cataldo, Michael F.

    2007-01-01

    Forming new knowledge based on knowledge established through prior learning is a central feature of higher cognition that is captured in research on stimulus equivalence (SE). Numerous SE investigations show that reinforcing behavior under control of distinct sets of arbitrary conditional relations gives rise to stimulus control by new, "derived"…

  11. A Diagram Editor for Efficient Biomedical Knowledge Capture and Integration

    PubMed Central

    Yu, Bohua; Jakupovic, Elvis; Wilson, Justin; Dai, Manhong; Xuan, Weijian; Mirel, Barbara; Athey, Brian; Watson, Stanley; Meng, Fan

    2008-01-01

    Understanding the molecular mechanisms underlying complex disorders requires the integration of data and knowledge from different sources including free text literature and various biomedical databases. To facilitate this process, we created the Biomedical Concept Diagram Editor (BCDE) to help researchers distill knowledge from data and literature and aid the process of hypothesis development. A key feature of BCDE is the ability to capture information with a simple drag-and-drop. This is a vast improvement over manual methods of knowledge and data recording and greatly increases the efficiency of the biomedical researcher. BCDE also provides a unique concept matching function to enforce consistent terminology, which enables conceptual relationships deposited by different researchers in the BCDE database to be mined and integrated for intelligible and useful results. We hope BCDE will promote the sharing and integration of knowledge from different researchers for effective hypothesis development. PMID:21347131

  12. Interprofessional Education on Adverse Childhood Experiences for Associate Degree Nursing Students.

    PubMed

    Olsen, Jeanette M; Warring, Sarah L

    2018-02-01

    The health impact of adverse childhood experiences (ACEs) is significant. Nurses need knowledge and must work in multidisciplinary teams to address this problem. This study examined the effects of an interprofessional education (IPE) activity with nonhealth care students on associate degree nursing (ADN) students' ACEs knowledge and perspectives on IPE. The mixed-methods approach used a quasi-experimental pretest-posttest design with an intervention and control group and thematic analysis of focus group data. Readiness for Interprofessional Learning scale mean scores indicated positive baseline IPE perspectives. Scores changed minimally for most measures in both the intervention and control groups on posttest. However, four major relevant themes related to ACEs knowledge and two related to interprofessional learning were identified. IPE with nonhealth care students is an effective way to teach ADN students about ACEs and infuse interprofessional learning in a nonuniversity setting. However, outcomes are best captured with qualitative data. [J Nurs Educ. 2018;57(2):101-105.]. Copyright 2018, SLACK Incorporated.

  13. An integrated science-based methodology to assess potential risks and implications of engineered nanomaterials.

    PubMed

    Tolaymat, Thabet; El Badawy, Amro; Sequeira, Reynold; Genaidy, Ash

    2015-11-15

    There is an urgent need for broad and integrated studies that address the risks of engineered nanomaterials (ENMs) along the different endpoints of the society, environment, and economy (SEE) complex adaptive system. This article presents an integrated science-based methodology to assess the potential risks of engineered nanomaterials. To achieve the study objective, two major tasks are accomplished: knowledge synthesis and an algorithmic computational methodology. The knowledge synthesis task is designed to capture "what is known" and to outline the gaps in knowledge from an ENMs risk perspective. The algorithmic computational methodology is geared toward the provision of decisions and an understanding of the risks of ENMs along different endpoints for the constituents of the SEE complex adaptive system. The approach presented herein allows for addressing the formidable task of assessing the implications and risks of exposure to ENMs, with the long-term goal to build a decision-support system to guide key stakeholders in the SEE system towards building sustainable ENMs and nano-enabled products. Published by Elsevier B.V.

  14. Printed Circuit Board Design (PCB) with HDL Designer

    NASA Technical Reports Server (NTRS)

    Winkert, Thomas K.; LaFourcade, Teresa

    2004-01-01

    Contents include the following: PCB design with HDL designer, design process and schematic capture - symbols and diagrams: 1. Motivation: time savings, money savings, simplicity. 2. Approach: use single tool PCB for FPGA design, more FPGA designs than PCB designers. 3. Use HDL designer for schematic capture.

  15. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models emerging in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting collaborative design of such workflows: they do not support real-time co-design; they cannot track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists; and they cannot capture and retrieve collaboration knowledge on workflow design (the discussions that lead to a design). To address these challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool that supports an elastic number of groups of Earth scientists collaboratively designing and composing data analytics workflows through the Internet. Instead of recreating the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.

  16. Marshall Space Flight Center Propulsion Systems Department (PSD) Knowledge Management (KM) Initiative

    NASA Technical Reports Server (NTRS)

    Caraccioli, Paul; Varnedoe, Tom; Smith, Randy; McCarter, Mike; Wilson, Barry; Porter, Richard

    2006-01-01

    NASA Marshall Space Flight Center's Propulsion Systems Department (PSD) is four months into a fifteen-month Knowledge Management (KM) initiative to support enhanced engineering decision making and analyses, faster resolution of anomalies (near-term) and effective, efficient knowledge-infused engineering processes, reduced knowledge attrition, and reduced anomaly occurrences (long-term). The near-term objective of this initiative is developing a KM Pilot project, within the context of a 3-5 year KM strategy, to introduce and evaluate the use of KM within PSD. An internal NASA/MSFC PSD KM team was established early in project formulation to maintain a practitioner, user-centric focus throughout the conceptual development, planning, and deployment of KM technologies and capabilities within the PSD. The PSD internal team is supported by the University of Alabama's Aging Infrastructure Systems Center of Excellence (AISCE), Intergraph Corporation, and The Knowledge Institute. The principal product of the initial four-month effort has been strategic planning of PSD KM implementation, by first determining the "as is" state of KM capabilities and then developing, planning, and documenting the roadmap to achieve the desired "to be" state. Activities undertaken to support the planning phase have included data gathering (cultural surveys, group work sessions, interviews, documentation review, and independent research). Assessments and analyses have been performed, including industry benchmarking, related local and Agency initiatives, specific tools and techniques used, and strategies for leveraging existing resources, people, and technology to achieve common KM goals. Key findings captured in the PSD KM Strategic Plan include the system vision, purpose, stakeholders, prioritized strategic objectives mapped to the top ten practitioner needs, and analysis of current resource usage. Opportunities identified from research, analyses, cultural/KM surveys, and practitioner interviews include: executive and senior management sponsorship; KM awareness, promotion, and training; cultural change management; process improvement; and leveraging existing resources and new innovative technologies to align with other NASA KM initiatives (convergence: the big picture). To enable results-based incremental implementation and future growth of the KM initiative, key performance measures have been identified, including stakeholder value, system utility, learning and growth (knowledge capture, sharing, reduced anomaly recurrence), cultural change, process improvement, and return on investment. The next steps for the initial implementation spiral (focused on SSME Turbomachinery) have been identified, largely based on the organization and compilation of summary-level engineering process models, data capture matrices, functional models, and conceptual-level systems architecture. Key elements include detailed KM requirements definition; KM technology architecture assessment, evaluation, and selection; deployable KM Pilot design, development, implementation, and evaluation; and justifying full implementation (estimated return on investment). Features identified for the notional system architecture include the knowledge presentation layer (and its components), knowledge network layer (and its components), knowledge storage layer (and its components), user interface, and capabilities. This paper provides a snapshot of the progress to date, the near-term planning for deploying the KM pilot project, and a forward look at results-based growth of KM capabilities within the MSFC PSD.

  17. Integrating patient voices into health information for self-care and patient-clinician partnerships: Veterans Affairs design recommendations for patient-generated data applications.

    PubMed

    Woods, Susan S; Evans, Neil C; Frisbee, Kathleen L

    2016-05-01

    Electronic health record content is created by clinicians and is driven largely by intermittent and brief encounters with patients. Collecting data directly from patients in the form of patient-generated data (PGD) provides an unprecedented opportunity to capture personal, contextual patient information that can supplement clinical data and enhance patients' self-care. The US Department of Veterans Affairs (VA) is striving to implement the enterprise-wide capability to collect and use PGD in order to partner with patients in their care, improve the patient healthcare experience, and promote shared decision making. Through knowledge gained from Veterans' and healthcare teams' perspectives, VA created a taxonomy and an evolving framework on which to design and develop applications that capture and help physicians utilize PGD. Ten recommendations for effectively collecting and integrating PGD into patient care are discussed, addressing health system culture, data value, architecture, policy, data standards, clinical workflow, data visualization, and analytics and population reach. Published by Oxford University Press on behalf of the American Medical Informatics Association 2016. This work is written by US Government employees and is in the public domain in the US.

  18. A Technology-Enhanced Unit of Modeling Static Electricity: Integrating scientific explanations and everyday observations

    NASA Astrophysics Data System (ADS)

    Shen, Ji; Linn, Marcia C.

    2011-08-01

    What trajectories do students follow as they connect their observations of electrostatic phenomena to atomic-level visualizations? We designed an electrostatics unit, using the knowledge integration framework to help students link observations and scientific ideas. We analyze how learners integrate ideas about charges, charged particles, energy, and observable events. We compare learning enactments in a typical school and a magnet school in the USA. We use pre-tests, post-tests, embedded notes, and delayed post-tests to capture the trajectories of students' knowledge integration. We analyze how visualizations help students grapple with abstract electrostatics concepts such as induction. We find that overall students gain more sophisticated ideas. They can interpret dynamic, interactive visualizations, and connect charge- and particle-based explanations to interpret observable events. Students continue to have difficulty in applying the energy-based explanation.

  19. Evaluation of Cabin Crew Technical Knowledge

    NASA Technical Reports Server (NTRS)

    Dunbar, Melisa G.; Chute, Rebecca D.; Jordan, Kevin

    1998-01-01

    Accident and incident reports have indicated that flight attendants have numerous opportunities to provide the flight-deck crew with operational information that may prevent or lessen the severity of a potential problem. Additionally, as carrier fleets transition from three-person to two-person flight-deck crews, the reliance upon the cabin crew for the transfer of this information may increase further. Recent research (Chute & Wiener, 1996) indicates that flight attendants do not feel confident in their ability to describe mechanical parts or malfunctions of the aircraft, and the lack of flight attendant technical training has been referenced in a number of recent reports (National Transportation Safety Board, 1992; Transportation Safety Board of Canada, 1995; Chute & Wiener, 1996). The present study explored both flight attendant technical knowledge and flight attendant and pilot expectations of flight attendant technical knowledge. To assess the technical knowledge of cabin crewmembers, 177 current flight attendants from two U.S. carriers voluntarily completed a 13-item technical quiz. To investigate expectations of flight attendant technical knowledge, 181 pilots and a second sample of 96 flight attendants, from the same two airlines, completed surveys designed to capture each group's expectations of operational knowledge required of flight attendants. Analyses revealed several discrepancies between these expectations and the present level of flight attendants' technical knowledge.

  20. An information model to support user-centered design of medical devices.

    PubMed

    Hagedorn, Thomas J; Krishnamurty, Sundar; Grosse, Ian R

    2016-08-01

    The process of engineering design requires the product development team to balance the needs and limitations of many stakeholders, including those of the user, regulatory organizations, and the designing institution. This is particularly true in medical device design, where additional consideration must be given for a much more complex user-base that can only be accessed on a limited basis. Given this inherent challenge, few projects exist that consider design domain concepts, such as aspects of a detailed design, a detailed view of various stakeholders and their capabilities, along with the user-needs simultaneously. In this paper, we present a novel information model approach that combines a detailed model of design elements with a model of the design itself, customer requirements, and of the capabilities of the customer themselves. The information model is used to facilitate knowledge capture and automated reasoning across domains with a minimal set of rules by adopting a terminology that treats customer and design specific factors identically, thus enabling straightforward assessments. A uniqueness of this approach is that it systematically provides an integrated perspective on the key usability information that drive design decisions towards more universal or effective outcomes with the very design information impacted by the usability information. This can lead to cost-efficient optimal designs based on a direct inclusion of the needs of customers alongside those of business, marketing, and engineering requirements. Two case studies are presented to show the method's potential as a more effective knowledge management tool with built-in automated inferences that provide design insight, as well as its overall effectiveness as a platform to develop and execute medical device design from a holistic perspective. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. ScienceOrganizer System and Interface Summary

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Norvig, Peter (Technical Monitor)

    2001-01-01

    ScienceOrganizer is a specialized knowledge management tool designed to enhance the information storage, organization, and access capabilities of distributed NASA science teams. Users access ScienceOrganizer through an intuitive Web-based interface that enables them to upload, download, and organize project information - including data, documents, images, and scientific records associated with laboratory and field experiments. Information in ScienceOrganizer is "threaded", or interlinked, to enable users to locate, track, and organize interrelated pieces of scientific data. Linkages capture important semantic relationships among information resources in the repository, and these assist users in navigating through the information related to their projects.

  2. Fourth Conference on Artificial Intelligence for Space Applications

    NASA Technical Reports Server (NTRS)

    Odell, Stephen L. (Compiler); Denton, Judith S. (Compiler); Vereen, Mary (Compiler)

    1988-01-01

    Proceedings of a conference held in Huntsville, Alabama, on November 15-16, 1988. The Fourth Conference on Artificial Intelligence for Space Applications brings together diverse technical and scientific work in order to help those who employ AI methods in space applications to identify common goals and to address issues of general interest in the AI community. Topics include the following: space applications of expert systems in fault diagnostics, in telemetry monitoring and data collection, in design and systems integration; and in planning and scheduling; knowledge representation, capture, verification, and management; robotics and vision; adaptive learning; and automatic programming.

  3. The SMART Study, a Mobile Health and Citizen Science Methodological Platform for Active Living Surveillance, Integrated Knowledge Translation, and Policy Interventions: Longitudinal Study

    PubMed Central

    Bhawra, Jasmin; Leatherdale, Scott T; Ferguson, Leah; Longo, Justin; Rainham, Daniel; Larouche, Richard; Osgood, Nathaniel

    2018-01-01

    Background Physical inactivity is the fourth leading cause of death worldwide, costing approximately US $67.5 billion per year to health care systems. To curb the physical inactivity pandemic, it is time to move beyond traditional approaches and engage citizens by repurposing sedentary behavior (SB)–enabling ubiquitous tools (eg, smartphones). Objective The primary objective of the Saskatchewan, let’s move and map our activity (SMART) Study was to develop a mobile and citizen science methodological platform for active living surveillance, knowledge translation, and policy interventions. This methodology paper enumerates the SMART Study platform’s conceptualization, design, implementation, data collection procedures, analytical strategies, and potential for informing policy interventions. Methods This longitudinal investigation was designed to engage participants (ie, citizen scientists) in Regina and Saskatoon, Saskatchewan, Canada, in four different seasons across 3 years. In spring 2017, pilot data collection was conducted, where 317 adult citizen scientists (≥18 years) were recruited in person and online. Citizen scientists used a custom-built smartphone app, Ethica (Ethica Data Services Inc), for 8 consecutive days to provide a complex series of objective and subjective data. Citizen scientists answered a succession of validated surveys that were assigned different smartphone triggering mechanisms (eg, user-triggered and schedule-triggered). The validated surveys captured physical activity (PA), SB, motivation, perception of outdoor and indoor environment, and eudaimonic well-being. Ecological momentary assessments were employed on each day to capture not only PA but also physical and social contexts along with barriers and facilitators of PA, as relayed by citizen scientists using geo-coded pictures and audio files. 
To obtain a comprehensive objective picture of participant location, motion, and compliance, 6 types of sensor-based (eg, global positioning system and accelerometer) data were surveilled for 8 days. Initial descriptive analyses were conducted using geo-coded photographs and audio files. Results Pictures and audio files (ie, community voices) showed that the barriers and facilitators of active living included intrinsic or extrinsic motivations, social contexts, and outdoor or indoor environment, with pets and favorable urban design featuring as the predominant facilitators, and work-related screen time proving to be the primary barrier. Conclusions The preliminary pilot results show the flexibility of the SMART Study surveillance platform in identifying and addressing limitations based on empirical evidence. The results also show the successful implementation of a platform that engages participants to catalyze policy interventions. Although SMART Study is currently geared toward surveillance, using the same platform, active living interventions could be remotely implemented. SMART Study is the first mobile, citizen science surveillance platform utilizing a rigorous, longitudinal, and mixed-methods investigation to temporally capture behavioral data for knowledge translation and policy interventions. PMID:29588267

  4. Information Technology Management Strategies to Implement Knowledge Management Systems

    ERIC Educational Resources Information Center

    McGee, Mary Jane Christy

    2017-01-01

    More than 38% of the U.S. public workforce will likely retire by 2030, which may result in a labor shortage. Business leaders may adopt strategies to mitigate knowledge loss within their organizations by capturing knowledge in a knowledge management system (KMS). The purpose of this single case study was to explore strategies that information…

  5. Characterizing Provenance in Visualization and Data Analysis: An Organizational Framework of Provenance Types and Purposes.

    PubMed

    Ragan, Eric D; Endert, Alex; Sanyal, Jibonananda; Chen, Jian

    2016-01-01

    While the primary goal of visual analytics research is to improve the quality of insights and findings, a substantial amount of research in provenance has focused on the history of changes and advances throughout the analysis process. The term, provenance, has been used in a variety of ways to describe different types of records and histories related to visualization. The existing body of provenance research has grown to a point where the consolidation of design knowledge requires cross-referencing a variety of projects and studies spanning multiple domain areas. We present an organizational framework of the different types of provenance information and purposes for why they are desired in the field of visual analytics. Our organization is intended to serve as a framework to help researchers specify types of provenance and coordinate design knowledge across projects. We also discuss the relationships between these factors and the methods used to capture provenance information. In addition, our organization can be used to guide the selection of evaluation methodology and the comparison of study outcomes in provenance research.

  6. Characterizing Provenance in Visualization and Data Analysis: An Organizational Framework of Provenance Types and Purposes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ragan, Eric; Endert, Alex; Sanyal, Jibonananda

    While the primary goal of visual analytics research is to improve the quality of insights and findings, a substantial amount of research in provenance has focused on the history of changes and advances throughout the analysis process. The term, provenance, has been used in a variety of ways to describe different types of records and histories related to visualization. The existing body of provenance research has grown to a point where the consolidation of design knowledge requires cross-referencing a variety of projects and studies spanning multiple domain areas. We present an organizational framework of the different types of provenance information and purposes for why they are desired in the field of visual analytics. Our organization is intended to serve as a framework to help researchers specify types of provenance and coordinate design knowledge across projects. We also discuss the relationships between these factors and the methods used to capture provenance information. In addition, our organization can be used to guide the selection of evaluation methodology and the comparison of study outcomes in provenance research.

  7. Characterizing Provenance in Visualization and Data Analysis: An Organizational Framework of Provenance Types and Purposes

    DOE PAGES

    Ragan, Eric; Endert, Alex; Sanyal, Jibonananda; ...

    2016-01-01

    While the primary goal of visual analytics research is to improve the quality of insights and findings, a substantial amount of research in provenance has focused on the history of changes and advances throughout the analysis process. The term, provenance, has been used in a variety of ways to describe different types of records and histories related to visualization. The existing body of provenance research has grown to a point where the consolidation of design knowledge requires cross-referencing a variety of projects and studies spanning multiple domain areas. We present an organizational framework of the different types of provenance information and purposes for why they are desired in the field of visual analytics. Our organization is intended to serve as a framework to help researchers specify types of provenance and coordinate design knowledge across projects. We also discuss the relationships between these factors and the methods used to capture provenance information. In addition, our organization can be used to guide the selection of evaluation methodology and the comparison of study outcomes in provenance research.

  8. Techniques for capturing expert knowledge - An expert systems/hypertext approach

    NASA Technical Reports Server (NTRS)

    Lafferty, Larry; Taylor, Greg; Schumann, Robin; Evans, Randy; Koller, Albert M., Jr.

    1990-01-01

    The knowledge-acquisition strategy developed for the Explosive Hazards Classification (EHC) Expert System is described, in which expert systems and hypertext are combined, and broad applications are proposed. The EHC expert system is based on rapid prototyping in which primary knowledge acquisition from experts is not emphasized; the explosive hazards technical bulletin, technical guidance, and minimal interviewing are used to develop the knowledge-based system. Hypertext is used to capture the technical information with respect to four issues: procedural, materials, test, and classification. The hypertext display allows the integration of multiple knowledge representations, such as clarifications or opinions, and thereby allows the performance of a broad range of tasks on a single machine. Among other recommendations, it is suggested that the integration of hypertext and expert systems makes the resulting synergistic system highly efficient.

  9. System Diagnostic Builder - A rule generation tool for expert systems that do intelligent data evaluation. [applied to Shuttle Mission Simulator

    NASA Technical Reports Server (NTRS)

    Nieten, Joseph; Burke, Roger

    1993-01-01

    Consideration is given to the System Diagnostic Builder (SDB), an automated knowledge acquisition tool using state-of-the-art AI technologies. The SDB employs an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert. Thus, data are captured from the subject system, classified, and used to drive the rule generation process. These rule bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The knowledge bases captured from the Shuttle Mission Simulator can be used as black box simulations by the Intelligent Computer Aided Training devices. The SDB can also be used to construct knowledge bases for the process control industry, such as chemical production or oil and gas production.
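    The inductive rule generation the SDB abstract describes — deriving rules from data sets classified by a subject matter expert — can be illustrated with a minimal one-rule (1R) learner. This is a generic sketch, not the SDB's actual algorithm; the telemetry features and fault labels are hypothetical:

```python
from collections import Counter, defaultdict

def one_rule(records, classes):
    """Induce a single-feature rule set (Holte's 1R) from expert-classified
    records: for each feature, map each observed value to its majority
    class, and keep the feature whose rule table makes the fewest errors
    on the training data."""
    best = None
    for f in range(len(records[0])):
        table = defaultdict(Counter)
        for rec, cls in zip(records, classes):
            table[rec[f]][cls] += 1
        # Majority class for each value of feature f.
        rule = {v: c.most_common(1)[0][0] for v, c in table.items()}
        errors = sum(cls != rule[rec[f]] for rec, cls in zip(records, classes))
        if best is None or errors < best[2]:
            best = (f, rule, errors)
    return best  # (feature index, value -> class table, training errors)

# Toy telemetry snapshots (pressure level, pump state) classified by an expert.
data = [("high", "on"), ("high", "off"), ("low", "on"), ("low", "off")]
labels = ["fault", "fault", "nominal", "nominal"]
feature, rule, errs = one_rule(data, labels)
```

    Here the learner selects the pressure feature, since it alone separates the expert's classes; a real rule-generation tool would iterate this idea over many features and rule forms.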

  10. Capturing flight system test engineering expertise: Lessons learned

    NASA Technical Reports Server (NTRS)

    Woerner, Irene Wong

    1991-01-01

    Within a few years, JPL will be challenged by the most active mission set in history. Concurrently, flight systems are increasingly more complex. Presently, the knowledge to conduct integration and test of spacecraft and large instruments is held by a few key people, each with many years of experience. JPL is in danger of losing a significant amount of this critical expertise, through retirement, during a period when demand for this expertise is rapidly increasing. The most critical issue at hand is to collect and retain this expertise and develop tools that would ensure the ability to successfully perform the integration and test of future spacecraft and large instruments. The proposed solution was to capture and codify a subset of existing knowledge, and to utilize this captured expertise in knowledge-based systems. First year results and activities planned for the second year of this ongoing effort are described. Topics discussed include lessons learned in knowledge acquisition and elicitation techniques, life-cycle paradigms, and rapid prototyping of a knowledge-based advisor (Spacecraft Test Assistant) and a hypermedia browser (Test Engineering Browser). The prototype Spacecraft Test Assistant supports a subset of integration and test activities for flight systems. Browser is a hypermedia tool that allows users easy perusal of spacecraft test topics. A knowledge acquisition tool called ConceptFinder, developed to search through large volumes of data for related concepts, is also described; it is being modified to semi-automate the process of creating hypertext links.

  11. Using local ecological knowledge to monitor threatened Mekong megafauna in Lao PDR

    PubMed Central

    Phommachak, Amphone; Vannachomchan, Kongseng; Guegan, Francois

    2017-01-01

    Pressures on freshwater biodiversity in Southeast Asia are accelerating, yet the status and conservation needs of many of the region's threatened fish species are unclear. This impacts the ability to implement conservation activities and to understand the effects of infrastructure developments and other hydrological changes. We used Local Ecological Knowledge from fishing communities on the Mekong River in the Siphandone waterscape, Lao PDR to estimate mean and mode last capture dates of eight rare or culturally significant fish species in order to provide conservation monitoring baselines. One hundred and twenty fishermen, from six villages, were interviewed. All eight species had been captured, by at least one of the interviewees, within the waterscape within the past year. However, the mean and mode last capture dates varied between the species. Larger species, and those with higher Red List threat status, were caught less recently than smaller species of less conservation concern. The status of the Critically Endangered Pangasius sanitwongsei (mean last capture date 116.4 months) is particularly worrying, suggesting severe population decline, although cultural issues may have caused this species to be under-reported. This highlights that studies making use of Local Ecological Knowledge need to understand the cultural background and context from which data are collected. Nevertheless, we suggest that our approach of stratified random interviews to establish mean last capture dates may be an effective methodology for monitoring freshwater fish species of conservation concern within artisanal fisheries. If fishing effort remains relatively constant, or if changes in fishing effort are accounted for, differences over time in mean last capture dates are likely to represent changes in the status of species. We plan to repeat our interview surveys within the waterscape as part of a long-term fish-monitoring program. PMID:28820901

  12. Using local ecological knowledge to monitor threatened Mekong megafauna in Lao PDR.

    PubMed

    Gray, Thomas N E; Phommachak, Amphone; Vannachomchan, Kongseng; Guegan, Francois

    2017-01-01

    Pressures on freshwater biodiversity in Southeast Asia are accelerating, yet the status and conservation needs of many of the region's threatened fish species are unclear. This impacts the ability to implement conservation activities and to understand the effects of infrastructure developments and other hydrological changes. We used Local Ecological Knowledge from fishing communities on the Mekong River in the Siphandone waterscape, Lao PDR to estimate mean and mode last capture dates of eight rare or culturally significant fish species in order to provide conservation monitoring baselines. One hundred and twenty fishermen, from six villages, were interviewed. All eight species had been captured, by at least one of the interviewees, within the waterscape within the past year. However, the mean and mode last capture dates varied between the species. Larger species, and those with higher Red List threat status, were caught less recently than smaller species of less conservation concern. The status of the Critically Endangered Pangasius sanitwongsei (mean last capture date 116.4 months) is particularly worrying, suggesting severe population decline, although cultural issues may have caused this species to be under-reported. This highlights that studies making use of Local Ecological Knowledge need to understand the cultural background and context from which data are collected. Nevertheless, we suggest that our approach of stratified random interviews to establish mean last capture dates may be an effective methodology for monitoring freshwater fish species of conservation concern within artisanal fisheries. If fishing effort remains relatively constant, or if changes in fishing effort are accounted for, differences over time in mean last capture dates are likely to represent changes in the status of species. We plan to repeat our interview surveys within the waterscape as part of a long-term fish-monitoring program.
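    The mean and mode last-capture-date baselines used in the study above amount to a simple summary over interview responses, which can be sketched as follows. The species names and month counts here are invented illustrations, not the study's data:

```python
from statistics import mean, mode

# Hypothetical interview responses: months since each fisherman last
# caught the species (one entry per interviewee who reported it).
last_capture_months = {
    "Pangasius sanitwongsei": [96, 120, 130, 120],
    "small cyprinid": [1, 2, 1, 3],
}

def capture_baselines(responses):
    """Mean and mode months-since-last-capture per species; larger values
    suggest the species is encountered less often by fishers."""
    return {sp: (mean(m), mode(m)) for sp, m in responses.items()}

baselines = capture_baselines(last_capture_months)
```

    Repeating the same survey later and comparing these per-species baselines is what the authors propose as a monitoring signal, provided fishing effort is roughly constant or accounted for.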

  13. Feasibility, Acceptability and Findings from a Pilot Randomized Controlled Intervention Study on the Impact of a Book Designed to Inform Patients about Cancer Clinical Trials

    PubMed Central

    Carney, Patricia A.; Tucker, Erin K.; Newby, Timothy A.; Beer, Tomasz M.

    2014-01-01

    Objective To assess the feasibility, acceptability and changes in knowledge among cancer patients assigned to receive a 160-page book on experimental cancer therapies and clinical trials. Methods We enrolled 20 patients with cancer who had never participated in a clinical trial, and randomly assigned them to receive the book either during Week 1 or Week 4 of the study. We collected baseline patient demographic and cancer-related information as well as knowledge about cancer clinical trials at Week 0. Follow-up surveys were administered at Weeks 3 and 6 for both study groups. Comparisons were made within and between groups randomized to receive the book Early (at Week 1) and those who received it Later (at Week 4). Results One hundred percent of data were captured in both groups at baseline, which decreased to 77.8% by Week 6. The vast majority of participants found the book moderately or very useful (89% in the Early Group at Week 3 and 95.5% in the Late Group at Week 6). Within-group pair-wise comparisons found a significant difference between baseline and Week 6 in content-specific knowledge scores among participants in the Late Group (79% vs. 92.1%, p=0.01). Global knowledge scores increased significantly for variables reflecting knowledge that promotes decisions to participate in clinical trials. Conclusions Providing published reading material to patients with cancer is both feasible and acceptable. Offering information to patients about cancer clinical trials, using a book designed for patients with cancer, may influence knowledge related to the decision to participate in clinical trials. PMID:24127249

  14. Feasibility, acceptability and findings from a pilot randomized controlled intervention study on the impact of a book designed to inform patients about cancer clinical trials.

    PubMed

    Carney, Patricia A; Tucker, Erin K; Newby, Timothy A; Beer, Tomasz M

    2014-03-01

    This study was conducted to assess the feasibility, acceptability, and changes in knowledge among cancer patients assigned to receive a 160-page book on experimental cancer therapies and clinical trials. We enrolled 20 patients with cancer who had never participated in a clinical trial and randomly assigned them to receive the book either during week 1 or week 4 of the study. We collected baseline patient demographic and cancer-related information as well as knowledge about cancer clinical trials at week 0. Follow-up surveys were administered at weeks 3 and 6 for both study groups. Comparisons were made within and between groups randomized to receive the book early (at week 1) and those who received it later (at week 4). One hundred percent of data were captured in both groups at baseline, which decreased to 77.8% by week 6. The vast majority of participants found the book moderately or very useful (89% in the Early Group at week 3 and 95.5% in the Late Group at week 6). Within-group pairwise comparisons found a significant difference between baseline and week 6 in content-specific knowledge scores among participants in the Late Group (79% versus 92.1%, p = 0.01). Global knowledge scores increased significantly for variables reflecting knowledge that promotes decisions to participate in clinical trials. Providing published reading material to patients with cancer is both feasible and acceptable. Offering information to patients about cancer clinical trials, using a book designed for patients with cancer, may influence knowledge related to the decision to participate in clinical trials.

  15. A Qualitative Approach to Assessing Technological Pedagogical Content Knowledge

    ERIC Educational Resources Information Center

    Groth, Randall; Spickler, Donald; Bergner, Jennifer; Bardzell, Michael

    2009-01-01

    Because technological pedagogical content knowledge is becoming an increasingly important construct in the field of teacher education, there is a need for assessment mechanisms that capture teachers' development of this portion of the knowledge base for teaching. The paper describes a proposal drawing on qualitative data produced during lesson…

  16. Constraining OCT with Knowledge of Device Design Enables High Accuracy Hemodynamic Assessment of Endovascular Implants.

    PubMed

    O'Brien, Caroline C; Kolandaivelu, Kumaran; Brown, Jonathan; Lopes, Augusto C; Kunio, Mie; Kolachalama, Vijaya B; Edelman, Elazer R

    2016-01-01

    Stacking cross-sectional intravascular images permits three-dimensional rendering of endovascular implants, yet introduces between-frame uncertainties that limit characterization of device placement and the hemodynamic microenvironment. In a porcine coronary stent model, we demonstrate enhanced OCT reconstruction with preservation of between-frame features through fusion with angiography and a priori knowledge of stent design. Strut positions were extracted from sequential OCT frames. Reconstruction with standard interpolation generated discontinuous stent structures. By computationally constraining interpolation to known stent skeletons fitted to 3D 'clouds' of OCT-Angio-derived struts, implant anatomy was resolved, accurately rendering features from implant diameter and curvature (n = 1 vessel, r2 = 0.91, 0.90, respectively) to individual strut-wall configurations (average displacement error ~15 μm). This framework facilitated hemodynamic simulation (n = 1 vessel), showing the critical importance of accurate anatomic rendering in characterizing both quantitative and basic qualitative flow patterns. Discontinuities with standard approaches systematically introduced noise and bias, poorly capturing regional flow effects. In contrast, the enhanced method preserved multi-scale (local strut to regional stent) flow interactions, demonstrating the impact of regional contexts in defining the hemodynamic consequence of local deployment errors. Fusion of planar angiography and knowledge of device design permits enhanced OCT image analysis of in situ tissue-device interactions. Given emerging interests in simulation-derived hemodynamic assessment as surrogate measures of biological risk, such fused modalities offer a new window into patient-specific implant environments.

  17. A Tailored Ontology Supporting Sensor Implementation for the Maintenance of Industrial Machines

    PubMed Central

    Belkadi, Farouk; Bernard, Alain

    2017-01-01

    The longtime productivity of an industrial machine is improved by condition-based maintenance strategies. To do this, the integration of sensors and other cyber-physical devices is necessary in order to capture and analyze a machine’s condition through its lifespan. Thus, choosing the best sensor is a critical step to ensure the efficiency of the maintenance process. Indeed, considering the variety of sensors, and their features and performance, a formal classification of a sensor’s domain knowledge is crucial. This classification facilitates the search for and reuse of solutions during the design of a new maintenance service. Following a Knowledge Management methodology, the paper proposes and develops a new sensor ontology that structures the domain knowledge, covering both theoretical and experimental sensor attributes. An industrial case study is conducted to validate the proposed ontology and to demonstrate its utility as a guideline to ease the search for suitable sensors. Based on the ontology, the final solution will be implemented in a shared repository connected to legacy CAD (computer-aided design) systems. The selection of the best sensor is, firstly, obtained by the matching of application requirements and sensor specifications (that are proposed by this sensor repository). Then, it is refined from the experimentation results. The achieved solution is recorded in the sensor repository for future reuse. As a result, the time and cost of the design process of new condition-based maintenance services are reduced. PMID:28885592

  18. Knowledge Framework Implementation with Multiple Architectures - 13090

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Upadhyay, H.; Lagos, L.; Quintero, W.

    2013-07-01

    Multiple kinds of knowledge management systems are operational in public and private enterprises, large and small organizations with a variety of business models that make the design, implementation and operation of integrated knowledge systems very difficult. In recent years, there has been a sweeping advancement in the information technology area, leading to the development of sophisticated frameworks and architectures. These platforms need to be used for the development of integrated knowledge management systems which provide a common platform for sharing knowledge across the enterprise, thereby reducing operational inefficiencies and delivering cost savings. This paper discusses the knowledge framework and architecture that can be used for system development and its application to the real-life needs of the nuclear industry. A case study of deactivation and decommissioning (D and D) is discussed with the Knowledge Management Information Tool platform and framework. D and D work is a high priority activity across the Department of Energy (DOE) complex. Subject matter specialists (SMS) associated with DOE sites, the Energy Facility Contractors Group (EFCOG) and the D and D community have gained extensive knowledge and experience over the years in the cleanup of the legacy waste from the Manhattan Project. To prevent the D and D knowledge and expertise from being lost over time from the evolving and aging workforce, DOE and the Applied Research Center (ARC) at Florida International University (FIU) proposed to capture and maintain this valuable information in a universally available and easily usable system. (authors)

  19. Learning from the Mars Rover Mission: Scientific Discovery, Learning and Memory

    NASA Technical Reports Server (NTRS)

    Linde, Charlotte

    2005-01-01

    Purpose: Knowledge management for space exploration is part of a multi-generational effort. Each mission builds on knowledge from prior missions, and learning is the first step in knowledge production. This paper uses the Mars Exploration Rover mission as a site to explore this process. Approach: Observational study and analysis of the work of the MER science and engineering team during rover operations, to investigate how learning occurs, how it is recorded, and how these representations might be made available for subsequent missions. Findings: Learning occurred in many areas: planning science strategy; using instruments within the constraints of the martian environment, the Deep Space Network, and the mission requirements; using software tools effectively; and running two teams on Mars time for three months. This learning is preserved in many ways. Primarily it resides in individuals' memories. It is also encoded in stories, procedures, programming sequences, published reports, and lessons learned databases. Research implications: Shows the earliest stages of knowledge creation in a scientific mission, and demonstrates that knowledge management must begin with an understanding of knowledge creation. Practical implications: Shows that studying learning and knowledge creation suggests proactive ways to capture and use knowledge across multiple missions and generations. Value: This paper provides a unique analysis of the learning process of a scientific space mission, relevant for knowledge management researchers and designers, as well as demonstrating in detail how new learning occurs in a learning organization.

  20. GATOR: Requirements capturing of telephony features

    NASA Technical Reports Server (NTRS)

    Dankel, Douglas D., II; Walker, Wayne; Schmalz, Mark

    1992-01-01

    We are developing a natural language-based, requirements gathering system called GATOR (for the GATherer Of Requirements). GATOR assists in the development of more accurate and complete specifications of new telephony features. GATOR interacts with a feature designer who describes a new feature, set of features, or capability to be implemented. The system aids this individual in the specification process by asking for clarifications when potential ambiguities are present, by identifying potential conflicts with other existing features, and by presenting its understanding of the feature to the designer. Through user interaction with a model of the existing telephony feature set, GATOR constructs a formal representation of the new, 'to be implemented' feature. Ultimately GATOR will produce a requirements document and will maintain an internal representation of this feature to aid in future design and specification. This paper consists of three sections that describe (1) the structure of GATOR, (2) POND, GATOR's internal knowledge representation language, and (3) current research issues.

  1. Marshall Space Flight Center Propulsion Systems Department (PSD) KM Initiative

    NASA Technical Reports Server (NTRS)

    Caraccioli, Paul; Varnadoe, Tom; McCarter, Mike

    2006-01-01

    NASA Marshall Space Flight Center's Propulsion Systems Department (PSD) is four months into a fifteen-month Knowledge Management (KM) initiative to support enhanced engineering decision making and analyses, faster resolution of anomalies (near-term) and effective, efficient knowledge-infused engineering processes, reduced knowledge attrition, and reduced anomaly occurrences (long-term). The near-term objective of this initiative is developing a KM Pilot project, within the context of a 3-5 year KM strategy, to introduce and evaluate the use of KM within PSD. An internal NASA/MSFC PSD KM team was established early in project formulation to maintain a practitioner, user-centric focus throughout the conceptual development, planning and deployment of KM technologies and capabilities within the PSD. The PSD internal team is supported by the University of Alabama's Aging Infrastructure Systems Center Of Excellence (AISCE), Intergraph Corporation, and The Knowledge Institute. The principal product of the initial four-month effort has been strategic planning of PSD KM implementation by first determining the "as is" state of KM capabilities and developing, planning and documenting the roadmap to achieve the desired "to be" state. Activities undertaken to support the planning phase have included data gathering: cultural surveys, group work-sessions, interviews, documentation review, and independent research. Assessments and analyses have been performed including industry benchmarking, related local and Agency initiatives, specific tools and techniques used, and strategies for leveraging existing resources, people and technology to achieve common KM goals. Key findings captured in the PSD KM Strategic Plan include the system vision, purpose, stakeholders, prioritized strategic objectives mapped to the top ten practitioner needs, and analysis of current resource usage.
    Opportunities identified from research, analyses, cultural/KM surveys and practitioner interviews include: executive and senior management sponsorship, KM awareness, promotion and training, cultural change management, process improvement, leveraging existing resources and new innovative technologies to align with other NASA KM initiatives (convergence: the big picture). To enable results-based incremental implementation and future growth of the KM initiative, key performance measures have been identified including stakeholder value, system utility, learning and growth (knowledge capture, sharing, reduced anomaly recurrence), cultural change, process improvement and return-on-investment. The next steps for the initial implementation spiral (focused on SSME Turbomachinery) have been identified, largely based on the organization and compilation of summary-level engineering process models, data capture matrices, functional models and conceptual-level systems architecture. Key elements include detailed KM requirements definition, KM technology architecture assessment, evaluation and selection, deployable KM Pilot design, development, implementation and evaluation, and justifying full implementation (estimated return-on-investment). Features identified for the notional system architecture include the knowledge presentation layer (and its components), knowledge network layer (and its components), knowledge storage layer (and its components), user interface and capabilities. This paper provides a snapshot of the progress to date, the near-term planning for deploying the KM pilot project and a forward look at results-based growth of KM capabilities within the MSFC PSD.

  2. Bench-Scale Process for Low-Cost Carbon Dioxide (CO2) Capture Using a Phase-Changing Absorbent

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Westendorf, Tiffany; Caraher, Joel; Chen, Wei

    2015-03-31

    The objective of this project is to design and build a bench-scale process for a novel phase-changing aminosilicone-based CO2-capture solvent. The project will establish scalability and technical and economic feasibility of using a phase-changing CO2-capture absorbent for post-combustion capture of CO2 from coal-fired power plants with 90% capture efficiency and 95% CO2 purity at a cost of $40/tonne of CO2 captured by 2025 and a cost of <$10/tonne of CO2 captured by 2035. In the first budget period of this project, the bench-scale phase-changing CO2 capture process was designed using data and operating experience generated under a previous project (ARPA-E project DE-AR0000084). Sizing and specification of all major unit operations was completed, including detailed process and instrumentation diagrams. The system was designed to operate over a wide range of operating conditions to allow for exploration of the effect of process variables on CO2 capture performance.

  3. Architectural Blueprint for Plate Boundary Observatories based on interoperable Data Management Platforms

    NASA Astrophysics Data System (ADS)

    Kerschke, D. I.; Häner, R.; Schurr, B.; Oncken, O.; Wächter, J.

    2014-12-01

    Interoperable data management platforms play an increasing role in the advancement of knowledge and technology in many scientific disciplines. Through high quality services they support the establishment of efficient and innovative research environments. Well-designed research environments can facilitate the sustainable utilization, exchange, and re-use of scientific data and functionality by using standardized community models. Together with innovative 3D/4D visualization, these concepts provide added value in improving scientific knowledge-gain, even across the boundaries of disciplines. A project benefiting from the added value is the Integrated Plate boundary Observatory in Chile (IPOC). IPOC is a European-South American network to study earthquakes and deformation at the Chilean continental margin and to monitor the plate boundary system for capturing an anticipated great earthquake in a seismic gap. In contrast to conventional observatories that monitor individual signals only, IPOC captures a large range of different processes through various observation methods (e.g., seismographs, GPS, magneto-telluric sensors, creep-meter, accelerometer, InSAR). For IPOC a conceptual design has been devised that comprises an architectural blueprint for a data management platform based on common and standardized data models, protocols, and encodings as well as on an exclusive use of Free and Open Source Software (FOSS) including visualization components. Following the principles of event-driven service-oriented architectures, the design enables novel processes by sharing and re-using functionality and information on the basis of innovative data mining and data fusion technologies. This platform can help to improve the understanding of the physical processes underlying plate deformations as well as the natural hazards induced by them. 
    Through the use of standards, this blueprint can be adapted not only to other plate observing systems (e.g., the European Plate Observing System, EPOS); it also supports integrated approaches that include sensor networks providing complementary processes for dynamic monitoring. Moreover, the integration of such observatories into superordinate research infrastructures (federation of virtual observatories) will be enabled.

  4. Structuring and extracting knowledge for the support of hypothesis generation in molecular biology

    PubMed Central

    Roos, Marco; Marshall, M Scott; Gibson, Andrew P; Schuemie, Martijn; Meij, Edgar; Katrenko, Sophia; van Hage, Willem Robert; Krommydas, Konstantinos; Adriaans, Pieter W

    2009-01-01

    Background Hypothesis generation in molecular and cellular biology is an empirical process in which knowledge derived from prior experiments is distilled into a comprehensible model. The requirement of automated support is exemplified by the difficulty of considering all relevant facts that are contained in the millions of documents available from PubMed. Semantic Web provides tools for sharing prior knowledge, while information retrieval and information extraction techniques enable its extraction from literature. Their combination makes prior knowledge available for computational analysis and inference. While some tools provide complete solutions that limit the control over the modeling and extraction processes, we seek a methodology that supports control by the experimenter over these critical processes. Results We describe progress towards automated support for the generation of biomolecular hypotheses. Semantic Web technologies are used to structure and store knowledge, while a workflow extracts knowledge from text. We designed minimal proto-ontologies in OWL for capturing different aspects of a text mining experiment: the biological hypothesis, text and documents, text mining, and workflow provenance. The models fit a methodology that allows focus on the requirements of a single experiment while supporting reuse and posterior analysis of extracted knowledge from multiple experiments. Our workflow is composed of services from the 'Adaptive Information Disclosure Application' (AIDA) toolkit as well as a few others. The output is a semantic model with putative biological relations, with each relation linked to the corresponding evidence. Conclusion We demonstrated a 'do-it-yourself' approach for structuring and extracting knowledge in the context of experimental research on biomolecular mechanisms. The methodology can be used to bootstrap the construction of semantically rich biological models using the results of knowledge extraction processes. 
Models specific to particular experiments can be constructed that, in turn, link with other semantic models, creating a web of knowledge that spans experiments. Mapping mechanisms can link to other knowledge resources such as OBO ontologies or SKOS vocabularies. AIDA Web Services can be used to design personalized knowledge extraction procedures. In our example experiment, we found three proteins (NF-Kappa B, p21, and Bax) potentially playing a role in the interplay between nutrients and epigenetic gene regulation. PMID:19796406

  5. Knowledge-based requirements analysis for automating software development

    NASA Technical Reports Server (NTRS)

    Markosian, Lawrence Z.

    1988-01-01

    We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally-stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable, and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention is in the form of providing problem- and domain-specific engineering knowledge, not in writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.
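    The derivation step described above can be caricatured in a few lines: a requirement captured as data, with an implementation derived from it by a transformation rather than written by hand. This is a toy sketch, not the prototype's actual mechanism, and every name in it is hypothetical.

    ```python
    from collections import deque

    # A declarative requirement on a buffer, "captured" as data.
    requirement = {"name": "telemetry_buffer", "capacity": 3, "policy": "fifo"}

    def synthesize_buffer(req):
        """Derive a concrete buffer implementation from the requirement."""
        if req["policy"] != "fifo":
            raise NotImplementedError("only FIFO buffers in this sketch")
        # A bounded deque satisfies the capacity and FIFO requirements.
        return deque(maxlen=req["capacity"])

    buf = synthesize_buffer(requirement)
    for sample in [1, 2, 3, 4]:
        buf.append(sample)  # oldest item is dropped once capacity is hit
    print(list(buf))  # → [2, 3, 4]
    ```

    The point of the sketch is the division of labor: the human supplies domain knowledge (the requirement and the synthesis rule), while the detailed implementation is produced mechanically.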

  6. Using Ontologies to Formalize Services Specifications in Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Breitman, Karin Koogan; Filho, Aluizio Haendchen; Haeusler, Edward Hermann

    2004-01-01

    One key issue in multi-agent systems (MAS) is their ability to interact and exchange information autonomously across applications. To secure agent interoperability, designers must rely on a communication protocol that allows software agents to exchange meaningful information. In this paper we propose using ontologies as such a communication protocol. Ontologies capture the semantics of the operations and services provided by agents, allowing interoperability and information exchange in a MAS. Ontologies are a formal, machine-processable representation that captures the semantics of a domain and allows meaningful information to be derived by way of logical inference. In our proposal we use a formal knowledge representation language (OWL) that translates into Description Logics (a subset of first-order logic), thus eliminating ambiguities and providing a solid base for machine-based inference. The main contribution of this approach is to make the requirements explicit and centralize the specification in a single document (the ontology itself), while providing a formal, unambiguous representation that can be processed by automated inference machines.
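    As a loose illustration of the kind of inference this record relies on, the sketch below represents a tiny ontology as a set of (subject, predicate, object) triples and applies a single subclass-closure rule, mimicking in plain Python (not OWL or a real Description Logics reasoner) how an agent can derive facts that were never stated explicitly. All names are invented.

    ```python
    # A miniature shared ontology: one class axiom and two asserted facts.
    triples = {
        ("TranslationService", "subClassOf", "Service"),
        ("translator42", "type", "TranslationService"),
        ("agent1", "provides", "translator42"),
    }

    def infer_types(triples):
        """Close instance types under subClassOf (one simple DL-style rule)."""
        inferred = set(triples)
        changed = True
        while changed:
            changed = False
            snapshot = list(inferred)
            for s, p, o in snapshot:
                if p != "type":
                    continue
                for s2, p2, o2 in snapshot:
                    if p2 == "subClassOf" and s2 == o and (s, "type", o2) not in inferred:
                        inferred.add((s, "type", o2))
                        changed = True
        return inferred

    kb = infer_types(triples)
    # Another agent can now discover that a generic Service is on offer,
    # even though that fact was never asserted directly:
    print(("translator42", "type", "Service") in kb)  # → True
    ```

    In the paper's setting, the shared OWL ontology plays the role of `triples`, and a DL reasoner plays the role of `infer_types`, so the agents agree on both vocabulary and entailments.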

  7. Assessing the Relative Risk of Aerocapture Using Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Percy, Thomas K.; Bright, Ellanee; Torres, Abel O.

    2005-01-01

    A recent study performed for the Aerocapture Technology Area in the In-Space Propulsion Technology Projects Office at the Marshall Space Flight Center investigated the relative risk of various capture techniques for Mars missions. Aerocapture has been proposed as a possible capture technique for future Mars missions but has been perceived by many in the community as a higher risk option as compared to aerobraking and propulsive capture. By performing a probabilistic risk assessment on aerocapture, aerobraking and propulsive capture, a comparison was made to uncover the projected relative risks of these three maneuvers. For mission planners, this knowledge will allow them to decide if the mass savings provided by aerocapture warrant any incremental risk exposure. The study focuses on a Mars Sample Return mission currently under investigation at the Jet Propulsion Laboratory (JPL). In each case (propulsive, aerobraking and aerocapture), the Earth return vehicle is inserted into Martian orbit by one of the three techniques being investigated. A baseline spacecraft was established through initial sizing exercises performed by JPL's Team X. While Team X design results provided the baseline and common thread between the spacecraft, in each case the Team X results were supplemented by historical data as needed. Propulsion, thermal protection, guidance, navigation and control, software, solar arrays, navigation and targeting and atmospheric prediction were investigated. A qualitative assessment of human reliability was also included. Results show that different risk drivers contribute significantly to each capture technique. For aerocapture, the significant drivers include propulsion system failures and atmospheric prediction errors. Software and guidance hardware contribute the most to aerobraking risk. Propulsive capture risk is mainly driven by anomalous solar array degradation and propulsion system failures. 
While each subsystem contributes differently to the risk of each technique, results show that there exists little relative difference in the reliability of these capture techniques although uncertainty for the aerocapture estimates remains high given the lack of in-space demonstration.

  8. Overview of NRC Proactive Management of Materials Degradation (PMMD) Program

    NASA Astrophysics Data System (ADS)

    Carpenter, C. E. Gene; Hull, Amy; Oberson, Greg

    Materials degradation phenomena, if not appropriately managed, have the potential to adversely impact the design functionality and safety margins of nuclear power plant (NPP) systems, structures and components (SSCs). Therefore, the U.S. Nuclear Regulatory Commission (NRC) has initiated an over-the-horizon, multi-year Proactive Management of Materials Degradation (PMMD) Research Program, which is presently evaluating longer time frames (i.e., 80 or more years) and including passive long-lived SSCs beyond the primary piping and core internals, such as concrete containment and cable insulation. This will allow the NRC to (1) identify significant knowledge gaps and new forms of degradation; (2) capture the current knowledge base; and (3) prioritize materials degradation research needs and directions for future efforts. This effort is being accomplished in collaboration with the U.S. Department of Energy's (DOE) LWR Sustainability (LWRS) program. This presentation will discuss the activities to date, including results, and the path forward.

  9. Measuring Knowledge Integration Learning of Energy Topics: A two-year longitudinal study

    NASA Astrophysics Data System (ADS)

    Liu, Ou Lydia; Ryoo, Kihyun; Linn, Marcia C.; Sato, Elissa; Svihla, Vanessa

    2015-05-01

    Although researchers call for inquiry learning in science, science assessments rarely capture the impact of inquiry instruction. This paper reports on the development and validation of assessments designed to measure middle-school students' progress in gaining integrated understanding of energy while studying an inquiry-oriented curriculum. The assessment development was guided by the knowledge integration framework. Over 2 years of implementation, more than 4,000 students from 4 schools participated in the study, including a cross-sectional and a longitudinal cohort. Results from item response modeling analyses revealed that: (a) the assessments demonstrated satisfactory psychometric properties in terms of reliability and validity; (b) both the cross-sectional and longitudinal cohorts made progress on integrating their understanding of energy concepts; and (c) among many factors (e.g. gender, grade, school, and home language) associated with students' science performance, unit implementation was the strongest predictor.
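    For readers unfamiliar with item response modeling, the simplest member of that family (the Rasch model) relates a student's latent ability and an item's difficulty to the probability of a correct response. The record does not say which model the authors fitted, so the one-function sketch below is purely illustrative.

    ```python
    import math

    def rasch_p(theta, b):
        """Rasch-model probability that a student with ability theta
        answers an item of difficulty b correctly."""
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    # A student whose ability equals the item's difficulty has a 50% chance;
    # higher ability raises the probability, higher difficulty lowers it.
    print(rasch_p(0.5, 0.5))  # → 0.5
    print(rasch_p(1.0, 0.0) > 0.5 > rasch_p(-1.0, 0.0))  # → True
    ```

    Fitting such a model to all responses yields the reliability and validity diagnostics the abstract refers to.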

  10. Investigating the Knowledge Needed for Teaching Mathematics: An Exploratory Validation Study Focusing on Teaching Practices

    ERIC Educational Resources Information Center

    Charalambous, Charalambos Y.

    2016-01-01

    Central in the frameworks proposed to capture the knowledge needed for teaching mathematics is the assumption that teachers need more than pure subject-matter knowledge. Validation studies exploring this assumption by recruiting contrasting populations are relatively scarce. Drawing on a sample of 644 Greek-Cypriot preservice and inservice…

  11. Knowledge as a Resource--Networks Do Matter: A Study of SME Firms in Rural Illinois.

    ERIC Educational Resources Information Center

    Solymossy, Emeric

    2000-01-01

    Networks among people and businesses facilitate the capture and diffusion of technical and organizational knowledge and can be classified by type of knowledge being exchanged. Types include buyer-supplier information, technical problem-solving information, and informal community information. A survey of 141 small and medium-sized enterprises…

  12. Operationalizing Levels of Academic Mastery Based on Vygotsky's Theory: The Study of Mathematical Knowledge

    ERIC Educational Resources Information Center

    Nezhnov, Peter; Kardanova, Elena; Vasilyeva, Marina; Ludlow, Larry

    2015-01-01

    The present study tested the possibility of operationalizing levels of knowledge acquisition based on Vygotsky's theory of cognitive growth. An assessment tool (SAM-Math) was developed to capture a hypothesized hierarchical structure of mathematical knowledge consisting of procedural, conceptual, and functional levels. In Study 1, SAM-Math was…

  13. A Knowledge Base for FIA Data Uses

    Treesearch

    Victor A. Rudis

    2005-01-01

    Knowledge management provides a way to capture the collective wisdom of an organization, facilitate organizational learning, and foster opportunities for improvement. This paper describes a knowledge base compiled from uses of field observations made by the U.S. Department of Agriculture Forest Service, Forest Inventory and Analysis program and a citation database of...

  14. If we only knew what we know: principles for knowledge sharing across people, practices, and platforms.

    PubMed

    Dearing, James W; Greene, Sarah M; Stewart, Walter F; Williams, Andrew E

    2011-03-01

    The improvement of health outcomes for both individual patients and entire populations requires improvement in the array of structures that support decisions and activities by healthcare practitioners. Yet, many gaps remain in how even sophisticated healthcare organizations manage knowledge. Here we describe the value of a trans-institutional network for identifying and capturing how-to knowledge that contributes to improved outcomes. Organizing and sharing on-the-job experience would concentrate and organize the activities of individual practitioners and subject their rapid cycle improvement testing and refinement to a form of collective intelligence for subsequent diffusion back through the network. We use the existing Cancer Research Network as an example of how a loosely structured consortium of healthcare delivery organizations could create and grow an implementation registry to foster innovation and implementation success by communicating what works, how, and which practitioners are using each innovation. We focus on the principles and parameters that could be used as a basis for infrastructure design. As experiential knowledge from across institutions builds within such a system, the system could ultimately motivate rapid learning and adoption of best practices. Implications for research about healthcare IT, invention, and organizational learning are discussed.

  15. A Survey of Knowledge Management Research & Development at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This chapter catalogs knowledge management research and development activities at NASA Ames Research Center as of April 2002. A general categorization scheme for knowledge management systems is first introduced. This categorization scheme divides knowledge management capabilities into five broad categories: knowledge capture, knowledge preservation, knowledge augmentation, knowledge dissemination, and knowledge infrastructure. Each of nearly 30 knowledge management systems developed at Ames is then classified according to this system. Finally, a capsule description of each system is presented along with information on deployment status, funding sources, contact information, and both published and internet-based references.

  16. Knowledge translation strategies for enhancing nurses' evidence-informed decision making: a scoping review.

    PubMed

    Yost, Jennifer; Thompson, David; Ganann, Rebecca; Aloweni, Fazila; Newman, Kristine; McKibbon, Ann; Dobbins, Maureen; Ciliska, Donna

    2014-06-01

    Nurses are increasingly expected to engage in evidence-informed decision making (EIDM): the use of research evidence, together with information about patient preferences, clinical context and resources, and their clinical expertise, in decision making. Strategies for enhancing EIDM have been synthesized in high-quality systematic reviews, yet most relate to physicians or mixed disciplines. Existing reviews, specific to nursing, have not captured a broad range of strategies for promoting the knowledge and skills for EIDM, patient outcomes as a result of EIDM, or contextual information for why these strategies "work." The objective was to conduct a scoping review to identify and map the literature related to strategies implemented among nurses in tertiary care for promoting EIDM knowledge, skills, and behaviours, as well as patient outcomes and contextual implementation details. A search strategy was developed and executed to identify relevant research evidence. Participants included registered nurses, clinical nurse specialists, nurse practitioners, and advanced practice nurses. Strategies were those enhancing nurses' EIDM knowledge, skills, or behaviours, as well as patient outcomes. Relevant studies included systematic reviews, randomized controlled trials, cluster randomized controlled trials, non-randomized trials (including controlled before-and-after studies), cluster non-randomized trials, interrupted time series designs, prospective cohort studies, mixed-method studies, and qualitative studies. Two reviewers performed study selection and data extraction using standardized forms. Disagreements were resolved through discussion or third-party adjudication. Using a narrative synthesis, the body of research was mapped by design, clinical areas, strategies, and provider and patient outcomes to determine areas appropriate for a systematic review.
There are a sufficiently high number of studies to conduct a more focused systematic review by care settings, study design, implementation strategies, or outcomes. A focused review could assist in determining which strategies can be recommended for enhancing EIDM knowledge, skills, and behaviours among nurses in tertiary care. © 2014 The Authors. Worldviews on Evidence-Based Nursing published by Wiley Periodicals, Inc. on behalf of Sigma Theta Tau International.

  17. Assessment of the role of small mammals in the transmission cycle of tegumentary leishmaniasis and first report of natural infection with Leishmania braziliensis in two sigmodontines in northeastern Argentina.

    PubMed

    Fernández, María S; Fraschina, Jimena; Acardi, Soraya; Liotta, Domingo J; Lestani, Eduardo; Giuliani, Magalí; Busch, María; Salomón, O Daniel

    2018-02-01

    To contribute to the knowledge of the role of small mammals in the transmission cycle of tegumentary leishmaniasis caused by Leishmania braziliensis, we studied the small mammal community and its temporal and spatial association with phlebotominae, as well as small mammal infection by Leishmania spp. by PCR-RFLP analyses in an endemic area of northeastern Argentina. Ten small mammal samplings were conducted (2007-2009, 7506 Sherman trap nights and 422 cage trap nights). In two of these samplings, 16 capture stations, each consisting of a CDC light trap to capture phlebotominae, two to four Sherman traps, and two cage traps, were placed. We found co-occurrence of phlebotominae and small mammal captures in four stations, which were all the stations with small mammal captures and yielded 97% (2295 specimens, including 21 gravid females) of the total phlebotominae captures, suggesting that small mammals may provide a potential source of blood for phlebotominae females. One Didelphis albiventris and two Rattus rattus were associated with high captures of Nyssomyia whitmani, vector of L. braziliensis in the study area. The PCR-RFLP analyses confirm the presence of L. braziliensis in two sigmodontine small mammals (Akodon sp. and Euryoryzomys russatus) for the first time in Argentina, to our knowledge.

  18. Nitrosamines and Nitramines in Amine-Based Carbon Dioxide Capture Systems: Fundamentals, Engineering Implications, and Knowledge Gaps.

    PubMed

    Yu, Kun; Mitch, William A; Dai, Ning

    2017-10-17

    Amine-based absorption is the primary contender for postcombustion CO2 capture from fossil fuel-fired power plants. However, significant concerns have arisen regarding the formation and emission of toxic nitrosamine and nitramine byproducts from amine-based systems. This paper reviews the current knowledge regarding these byproducts in CO2 capture systems. In the absorber, flue gas NOx drives nitrosamine and nitramine formation after its dissolution into the amine solvent. The reaction mechanisms are reviewed based on CO2 capture literature as well as biological and atmospheric chemistry studies. In the desorber, nitrosamines are formed under high temperatures by amines reacting with nitrite (a hydrolysis product of NOx), but they can also thermally decompose following pseudo-first-order kinetics. The effects of amine structure, primarily amine order, on nitrosamine formation and the corresponding mechanisms are discussed. Washwater units, although intended to control emissions from the absorber, can contribute to additional nitrosamine formation when accumulated amines react with residual NOx. Nitramines are much less studied than nitrosamines in CO2 capture systems. Mitigation strategies based on the reaction mechanisms in each unit of the CO2 capture systems are reviewed. Lastly, we highlight research needs in clarifying reaction mechanisms, developing analytical methods for both liquid and gas phases, and integrating different units to quantitatively predict the accumulation and emission of nitrosamines and nitramines.
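    The pseudo-first-order decomposition mentioned above reduces to the standard exponential decay law, C(t) = C0 * exp(-k * t). A minimal sketch follows; the rate constant is an arbitrary illustrative value, not one drawn from the CO2-capture literature:

```python
import math

def first_order_decay(c0: float, k: float, t: float) -> float:
    """Concentration remaining at time t under pseudo-first-order
    kinetics: C(t) = C0 * exp(-k * t)."""
    return c0 * math.exp(-k * t)

def half_life(k: float) -> float:
    """Half-life of a first-order process: t_half = ln(2) / k."""
    return math.log(2.0) / k

# Hypothetical rate constant (per hour), for illustration only.
k = 0.10
remaining = first_order_decay(c0=1.0, k=k, t=half_life(k))
print(remaining)  # half of the initial concentration remains
```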

  19. The role of ethnography in STI and HIV/AIDS education and promotion with traditional healers in Zimbabwe

    PubMed Central

    Simmons, David

    2011-01-01

    This article explores the utility of ethnography in accounting for healers’ understandings of HIV/AIDS—and more generally sexually transmitted infections—and the planning of HIV/AIDS education interventions targeting healers in urban Zimbabwe. I argue that much of the information utilized for planning and implementing such programs is actually based on rapid research procedures (usually single-method survey-based approaches) that do not fully capture healers’ explanatory frameworks. This incomplete information then becomes authoritative knowledge about local ‘traditions’ and forms the basis for the design and implementation of training programs. Such decontextualization may, in turn, affect program effectiveness. PMID:21343161

  20. A community effort towards a knowledge-base and mathematical model of the human pathogen Salmonella Typhimurium LT2.

    PubMed

    Thiele, Ines; Hyduke, Daniel R; Steeb, Benjamin; Fankam, Guy; Allen, Douglas K; Bazzani, Susanna; Charusanti, Pep; Chen, Feng-Chi; Fleming, Ronan M T; Hsiung, Chao A; De Keersmaecker, Sigrid C J; Liao, Yu-Chieh; Marchal, Kathleen; Mo, Monica L; Özdemir, Emre; Raghunathan, Anu; Reed, Jennifer L; Shin, Sook-il; Sigurbjörnsdóttir, Sara; Steinmann, Jonas; Sudarsan, Suresh; Swainston, Neil; Thijs, Inge M; Zengler, Karsten; Palsson, Bernhard O; Adkins, Joshua N; Bumann, Dirk

    2011-01-18

    Metabolic reconstructions (MRs) are common denominators in systems biology and represent biochemical, genetic, and genomic (BiGG) knowledge-bases for target organisms by capturing currently available information in a consistent, structured manner. Salmonella enterica subspecies I serovar Typhimurium is a human pathogen, causes various diseases and its increasing antibiotic resistance poses a public health problem. Here, we describe a community-driven effort, in which more than 20 experts in S. Typhimurium biology and systems biology collaborated to reconcile and expand the S. Typhimurium BiGG knowledge-base. The consensus MR was obtained starting from two independently developed MRs for S. Typhimurium. Key results of this reconstruction jamboree include i) development and implementation of a community-based workflow for MR annotation and reconciliation; ii) incorporation of thermodynamic information; and iii) use of the consensus MR to identify potential multi-target drug therapy approaches. Taken together, with the growing number of parallel MRs a structured, community-driven approach will be necessary to maximize quality while increasing adoption of MRs in experimental design and interpretation.

  1. Model-based diagnostics for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Fesq, Lorraine M.; Stephan, Amy; Martin, Eric R.; Lerutte, Marcel G.

    1991-01-01

    An innovative approach to fault management was recently demonstrated for the NASA LeRC Space Station Freedom (SSF) power system testbed. This project capitalized on research in model-based reasoning, which uses knowledge of a system's behavior to monitor its health. The fault management system (FMS) can isolate failures online, or in a post-analysis mode, and requires no knowledge of failure symptoms to perform its diagnostics. An in-house tool called MARPLE was used to develop and run the FMS. MARPLE's capabilities are similar to those available from commercial expert system shells, although MARPLE is designed to build model-based, as opposed to rule-based, systems. These capabilities include functions for capturing behavioral knowledge, a reasoning engine that implements a model-based technique known as constraint suspension, and a tool for quickly generating new user interfaces. The prototype produced by applying MARPLE to SSF not only demonstrated that model-based reasoning is a valuable diagnostic approach, but it also suggested several new applications of MARPLE, including an integration and testing aid, and a complement to state estimation.
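    Constraint suspension, the model-based technique named in the abstract, can be sketched in miniature: a component is a fault candidate if ignoring (suspending) its constraint leaves the remaining constraints consistent with the observations. The components and values below are purely illustrative, not taken from MARPLE or the SSF testbed:

```python
# Minimal constraint-suspension sketch: each component contributes one
# constraint over the observed variables; a component is a fault
# candidate if suspending its constraint makes every remaining
# constraint consistent with the observations.

def diagnose(components, observations):
    """Return the components whose suspension explains the observations."""
    candidates = []
    for suspect in components:
        others_consistent = all(
            check(observations)
            for name, check in components.items()
            if name != suspect
        )
        if others_consistent:
            candidates.append(suspect)
    return candidates

# Two adders sharing variable y; adder1's output disagrees with its inputs.
components = {
    "adder1": lambda v: v["x"] + v["y"] == v["f"],
    "adder2": lambda v: v["y"] + v["z"] == v["g"],
}
observations = {"x": 1, "y": 2, "z": 3, "f": 4, "g": 5}  # f should be 3
print(diagnose(components, observations))  # → ['adder1']
```

    Because diagnosis needs only the behavioral model and the observations, no catalog of failure symptoms is required, which matches the FMS property described above.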

  2. Creating and testing the concept of an academic NGO for enhancing health equity: a new mode of knowledge production?

    PubMed

    Robinson, Vivian; Tugwell, Peter; Walker, Peter; Ter Kuile, Aleida A; Neufeld, Vic; Hatcher-Roberts, Janet; Amaratunga, Carol; Andersson, Neil; Doull, Marion; Labonte, Ron; Muckle, Wendy; Murangira, Felicite; Nyamai, Caroline; Ralph-Robinson, Dawn; Simpson, Don; Sitthi-Amorn, Chitr; Turnbull, Jeff; Walker, Joelle; Wood, Chris

    2007-08-01

    Collaborative action is required to address persistent and systematic health inequities which exist for most diseases in most countries of the world. The Academic NGO initiative (ACANGO) described in this paper was set up as a focused network giving priority to twinned partnerships between Academic research centres and community-based NGOs. ACANGO aims to capture the strengths of both in order to build consensus among stakeholders, engage the community, focus on leadership training, shared management and resource development and deployment. A conceptual model was developed through a series of community consultations. This model was tested with four academic-community challenge projects based in Kenya, Canada, Thailand and Rwanda and an online forum and coordinating hub based at the University of Ottawa. Between February 2005 and February 2007, each of the four challenge projects was able to show specific outputs, outcomes and impacts related to enhancing health equity through the relevant production and application of knowledge. The ACANGO initiative model and network has demonstrated success in enhancing the production and use of knowledge in program design and implementation for vulnerable populations.

  3. A community effort towards a knowledge-base and mathematical model of the human pathogen Salmonella Typhimurium LT2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thiele, Ines; Hyduke, Daniel R.; Steeb, Benjamin

    2011-01-01

    Metabolic reconstructions (MRs) are common denominators in systems biology and represent biochemical, genetic, and genomic (BiGG) knowledge-bases for target organisms by capturing currently available information in a consistent, structured manner. Salmonella enterica subspecies I serovar Typhimurium is a human pathogen, causes various diseases and its increasing antibiotic resistance poses a public health problem. Here, we describe a community-driven effort, in which more than 20 experts in S. Typhimurium biology and systems biology collaborated to reconcile and expand the S. Typhimurium BiGG knowledge-base. The consensus MR was obtained starting from two independently developed MRs for S. Typhimurium. Key results of this reconstruction jamboree include i) development and implementation of a community-based workflow for MR annotation and reconciliation; ii) incorporation of thermodynamic information; and iii) use of the consensus MR to identify potential multi-target drug therapy approaches. Taken together, with the growing number of parallel MRs a structured, community-driven approach will be necessary to maximize quality while increasing adoption of MRs in experimental design and interpretation.

  4. The prediction of crystal structure by merging knowledge methods with first principles quantum mechanics

    NASA Astrophysics Data System (ADS)

    Ceder, Gerbrand

    2007-03-01

    The prediction of structure is a key problem in computational materials science that forms the platform on which rational materials design can be performed. Finding structure by traditional optimization methods on quantum mechanical energy models is not possible due to the complexity and high dimensionality of the coordinate space. An unusual but efficient solution to this problem can be obtained by merging ideas from heuristic and ab initio methods: in the same way that scientists build empirical rules by observation of experimental trends, we have developed machine learning approaches that extract knowledge from a large set of experimental information and a database of over 15,000 first principles computations, and used these to rapidly direct accurate quantum mechanical techniques to the lowest energy crystal structure of a material. Knowledge is captured in a Bayesian probability network that relates the probability of finding a particular crystal structure at a given composition to structure and energy information at other compositions. We show that this approach is highly efficient in finding the ground states of binary metallic alloys and can be easily generalized to more complex systems.
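    As a drastically simplified stand-in for the Bayesian network described above, one can rank candidate crystal-structure prototypes by their empirical frequency in a database of known compounds, then direct expensive quantum-mechanical calculations at the most probable prototypes first. The prototype labels and counts below are illustrative only:

```python
from collections import Counter

# Toy knowledge base: structure prototypes reported for known binary
# alloys near 50/50 composition (labels and counts are illustrative).
observed = ["B2", "B2", "L1_0", "B2", "L1_0", "B19"]

def ranked_candidates(db):
    """Rank candidate prototypes by empirical probability, so that
    expensive quantum-mechanical calculations can be directed at the
    most likely structures first."""
    counts = Counter(db)
    total = sum(counts.values())
    return [(structure, n / total) for structure, n in counts.most_common()]

print(ranked_candidates(observed))  # most probable prototype first
```

    The real method conditions these probabilities on structure and energy information at other compositions rather than using raw frequencies.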

  5. Developing a geoscience knowledge framework for a national geological survey organisation

    NASA Astrophysics Data System (ADS)

    Howard, Andrew S.; Hatton, Bill; Reitsma, Femke; Lawrie, Ken I. G.

    2009-04-01

    Geological survey organisations (GSOs) are established by most nations to provide a geoscience knowledge base for effective decision-making on mitigating the impacts of natural hazards and global change, and on sustainable management of natural resources. The value of the knowledge base as a national asset is continually enhanced by the exchange of knowledge between GSOs as data and information providers and the stakeholder community as knowledge 'users and exploiters'. Geological maps and associated narrative texts typically form the core of national geoscience knowledge bases, but have some inherent limitations as methods of capturing and articulating knowledge. Much knowledge about the three-dimensional (3D) spatial interpretation and its derivation and uncertainty, and the wider contextual value of the knowledge, remains intangible in the minds of the mapping geologist in implicit and tacit form. To realise the value of these knowledge assets, the British Geological Survey (BGS) has established a workflow-based cyber-infrastructure to enhance its knowledge management and exchange capability. Future geoscience surveys in the BGS will contribute to a national, 3D digital knowledge base on UK geology, with the associated implicit and tacit information captured as metadata, qualitative assessments of uncertainty, and documented workflows and best practice. Knowledge-based decision-making at all levels of society requires both the accessibility and reliability of knowledge to be enhanced in the grid-based world. Establishment of collaborative cyber-infrastructures and ontologies for geoscience knowledge management and exchange will ensure that GSOs, as knowledge-based organisations, can make their contribution to this wider goal.

  6. MSFC Propulsion Systems Department Knowledge Management Project

    NASA Technical Reports Server (NTRS)

    Caraccioli, Paul A.

    2007-01-01

    This slide presentation reviews the Knowledge Management (KM) project of the Propulsion Systems Department at Marshall Space Flight Center. KM is needed to support knowledge capture, preservation and to support an information sharing culture. The presentation includes the strategic plan for the KM initiative, the system requirements, the technology description, the User Interface and custom features, and a search demonstration.

  7. Course Ontology-Based User's Knowledge Requirement Acquisition from Behaviors within E-Learning Systems

    ERIC Educational Resources Information Center

    Zeng, Qingtian; Zhao, Zhongying; Liang, Yongquan

    2009-01-01

    User's knowledge requirement acquisition and analysis are very important for a personalized or user-adaptive learning system. Two approaches to capture user's knowledge requirement about course content within an e-learning system are proposed and implemented in this paper. The first approach is based on the historical data accumulated by an…

  8. 3D Geological Mapping - uncovering the subsurface to increase environmental understanding

    NASA Astrophysics Data System (ADS)

    Kessler, H.; Mathers, S.; Peach, D.

    2012-12-01

    Geological understanding is required for many disciplines studying natural processes from hydrology to landscape evolution. The subsurface structure of rocks and soils and their properties occupies three-dimensional (3D) space and geological processes operate in time. Traditionally, geologists have captured their spatial and temporal knowledge in two-dimensional (2D) maps and cross-sections and through narrative, because paper maps, and later 2D geographical information systems (GIS), were the only tools available to them. Another major constraint on using more explicit and numerical systems to express geological knowledge is the fact that a geologist only ever observes and measures a fraction of the system they study. Only on rare occasions does the geologist have access to enough real data to generate meaningful predictions of the subsurface without the input of conceptual understanding developed from knowledge of the geological processes responsible for the deposition, emplacement and diagenesis of the rocks. This in turn has led to geology becoming an increasingly marginalised science as other disciplines have embraced the digital world and have increasingly turned to implicit numerical modelling to understand environmental processes and interactions. Recent developments in geoscience methodology and technology have gone some way to overcoming these barriers, and geologists across the world are beginning to routinely capture their knowledge and combine it with all available subsurface data (of often highly varying spatial distribution and quality) to create regional and national 3D geological maps. This is re-defining the way geologists interact with other science disciplines, as their concepts and knowledge are now expressed in an explicit form that can be used downstream to inform the structure of process models.
For example, groundwater modellers can refine their understanding of groundwater flow in three dimensions or even directly parameterize their numerical models using outputs from 3D mapping. In some cases, model code is being re-designed to deal with the increasing geological complexity expressed by geologists. These 3D maps have inherent uncertainty, just as their predecessors, 2D geological maps, did, and a significant body of work remains to quantify and effectively communicate this uncertainty. Here we present examples of regional and national 3D maps from Geological Survey Organisations worldwide and how these are being used to better solve real-life environmental problems. The future challenge for geologists is to make these 3D maps easily available in an accessible and interoperable form so that the environmental science community can truly integrate the hidden subsurface into a common understanding of the whole geosphere.

  9. Factors associated with (un)willingness to be an organ donor: importance of public exposure and knowledge.

    PubMed

    Haustein, Silke V; Sellers, Marty T

    2004-04-01

    Transplantation is increasingly limited by the supply of donor organs. Identifying subgroups that do not support organ donation will allow targeted efforts to increase organ donation. A total of 185 non-acutely ill outpatients visiting a community physician's office voluntarily completed a survey designed to capture views and general knowledge/misconceptions about cadaveric organ donation/transplantation. Of 185 patients, 86 were willing to donate, 42 were unwilling, and 57 were unsure. Willingness to donate was significantly associated with: having discussed the topic with family; having known a cadaveric organ donor; age 55 yr; having graduated high school; recognizing the organ shortage as the primary problem in transplantation; having received a post-high school degree; having seen public information within 30 d; and having a family member in health care (all p

  10. Safety and Mission Assurance Knowledge Management Retention

    NASA Technical Reports Server (NTRS)

    Johnson, Teresa A.

    2006-01-01

    This viewgraph presentation reviews the issues surrounding the management of knowledge with regard to safety and mission assurance. The JSC workers who were hired in the 1960s are slated to retire in the next two to three years. The experiences and knowledge of these NASA workers must be identified and disseminated. This paper reviews some of the strategies that S&MA is developing to capture that valuable institutional knowledge.

  11. Knowledge Engineering for Preservation and Future use of Institutional Knowledge

    NASA Technical Reports Server (NTRS)

    Moreman, Douglas; Dyer, John

    1996-01-01

    This project has two main thrusts: the preservation of special knowledge and its useful representation via computers. NASA is losing the expertise of its engineers and scientists who put together the great missions of the past. We no longer are landing men on the moon. Some of the equipment still used today (such as the RL-10 rocket) was designed decades ago by people who are now retiring. Furthermore, there has been a lack, in some areas of technology, of new projects that overlap with the old and that would have provided opportunities for monitoring by senior engineers of the young ones. We are studying this problem and trying out a couple of methods of soliciting and recording rare knowledge from experts. One method is that of Concept Maps, which produces a graphical interface to knowledge even as it helps solicit that knowledge. We arranged for experienced help in this method from John Coffey of the Institute of Human and Machine Technology at the University of West Florida. A second method, which we plan to try out in May, is a video-taped review of selected failed missions (e.g., the craft tumbled and blew up). Five senior engineers (most already retired from NASA) will, as a team, analyze available data, illustrating their thought processes as they try to solve the problem of why a spacecraft failed to complete its mission. The session will be captured in high quality audio and with at least two video cameras. The video can later be used to plan future concept mapping interviews and, in edited form, be a product in itself. Our computer representations of the amassed knowledge may eventually, via the methods of expert systems, be joined with other software being prepared as a suite of tools to aid future engineers designing rocket engines. In addition to representation by multimedia concept maps, we plan to consider linking vast bodies of text (and other media) by hypertexting methods.

  12. Photography Basics. Capturing the Essence of Physical Education and Sport Programs.

    ERIC Educational Resources Information Center

    Kluka, Darlene A.; Mitchell, Carolyn B.

    1990-01-01

    The physical educator or coach may be responsible for marketing programs to the public, and skill in 35mm photography can help. Ingredients necessary for successful 35mm movement photography are discussed: knowledge of the movement and the appropriate equipment; techniques for capturing movement; positioning for the ultimate shot; and practice.…

  13. Student Perceptions of Online Tutoring Videos

    ERIC Educational Resources Information Center

    Sligar, Steven R.; Pelletier, Christopher D.; Bonner, Heidi Stone; Coghill, Elizabeth; Guberman, Daniel; Zeng, Xiaoming; Newman, Joyce J.; Muller, Dorothy; Dennis, Allen

    2017-01-01

    Online tutoring is made possible by using videos to replace or supplement face-to-face services. The purpose of this research was to examine student reactions to the use of lecture capture technology in a university tutoring setting and to assess student knowledge of some features of Tegrity lecture capture software. A survey was administered to…

  14. Knowledge of healthcare professionals about rights of patient’s images

    PubMed Central

    Caires, Bianca Rodrigues; Lopes, Maria Carolina Barbosa Teixeira; Okuno, Meiry Fernanda Pinto; Vancini-Campanharo, Cássia Regina; Batista, Ruth Ester Assayag

    2015-01-01

    Objective To assess knowledge of healthcare professionals about capture and reproduction of images of patients in a hospital setting. Methods A cross-sectional and observational study among 360 healthcare professionals (nursing staff, physical therapists, and physicians), working at a teaching hospital in the city of São Paulo (SP). A questionnaire with sociodemographic information was distributed, and data were correlated with the capture and reproduction of images at hospitals. Results Of the 360 respondents, 142 had captured images of patients in the last year, and 312 reported seeing other professionals taking photographs of patients. Of the participants who captured images, 61 said they used them for studies and presentation of clinical cases, and 168 professionals reported not knowing of any legislation in the Brazilian Penal Code regarding collection and use of images. Conclusion There is a gap in the training of healthcare professionals regarding the use of patients' images. It is necessary to include subjects that address this theme in the syllabus of undergraduate courses, and healthcare organizations should regulate this issue. PMID:26267838

  15. Smart homes and ambient assisted living applications: from data to knowledge-empowering or overwhelming older adults? Contribution of the IMIA Smart Homes and Ambient Assisted Living Working Group.

    PubMed

    Demiris, G; Thompson, H

    2011-01-01

    As health care systems face limited resources and workforce shortages to address the complex needs of older adult populations, innovative approaches utilizing information technology can support aging. Smart Home and Ambient Assisted Living (SHAAL) systems utilize advanced and ubiquitous technologies including sensors and other devices that are integrated in the residential infrastructure or wearable, to capture data describing activities of daily living and health related events. This paper highlights how data from SHAAL systems can lead to information and knowledge that ultimately improves clinical outcomes and quality of life for older adults as well as quality of health care services. We conducted a review of personal health record applications specifically for older adults and approaches to using information to improve elder care. We present a framework that showcases how data captured from SHAAL systems can be processed to provide meaningful information that becomes part of a personal health record. Synthesis and visualization of information resulting from SHAAL systems can lead to knowledge and support education, delivery of tailored interventions and if needed, transitions in care. Such actions can involve multiple stakeholders as part of shared decision making. SHAAL systems have the potential to support aging and improve quality of life and decision making for older adults and their families. The framework presented in this paper demonstrates how emphasis needs to be placed into extracting meaningful information from new innovative systems that will support decision making. The challenge for informatics designers and researchers is to facilitate an evolution of SHAAL systems expanding beyond demonstration projects to actual interventions that will improve health care for older adults.

  16. Fishers' knowledge and seahorse conservation in Brazil

    PubMed Central

    Rosa, Ierecê ML; Alves, Rômulo RN; Bonifácio, Kallyne M; Mourão, José S; Osório, Frederico M; Oliveira, Tacyana PR; Nottingham, Mara C

    2005-01-01

    From a conservationist perspective, seahorses are threatened fishes. Concomitantly, from a socioeconomic perspective, they represent a source of income to many fishing communities in developing countries. An integration between these two views requires, among other things, the recognition that seahorse fishers have knowledge and abilities that can assist the implementation of conservation strategies and of management plans for seahorses and their habitats. This paper documents the knowledge held by Brazilian fishers on the biology and ecology of the longsnout seahorse Hippocampus reidi. Its aims were to explore collaborative approaches to seahorse conservation and management in Brazil; to assess fishers' perception of seahorse biology and ecology, in the context of evaluating potential management options; and to increase fishers' involvement with seahorse conservation in Brazil. Data were obtained through questionnaires and interviews conducted during field surveys in fishing villages located in the States of Piauí, Ceará, Paraíba, Maranhão, Pernambuco and Pará. We consider the following aspects as positive for the conservation of seahorses and their habitats in Brazil: fishers were willing to dialogue with researchers; although captures and/or trade of brooding seahorses occurred, most interviewees recognized the importance of reproduction to the maintenance of seahorses in the wild (and therefore of their source of income), and expressed concern over population declines; fishers associated the presence of a ventral pouch with reproduction in seahorses (regardless of whether they knew which sex bears the pouch), and this may facilitate the construction of collaborative management options designed to eliminate captures of brooding specimens; fishers recognized microhabitats of importance to the maintenance of seahorse wild populations; fishers who kept seahorses in captivity tended to recognize the conditions as poor, and as being a cause of seahorse mortality. 
PMID:16336660

  17. Fishers' knowledge and seahorse conservation in Brazil.

    PubMed

    Rosa, Ierecê Ml; Alves, Rômulo Rn; Bonifácio, Kallyne M; Mourão, José S; Osório, Frederico M; Oliveira, Tacyana Pr; Nottingham, Mara C

    2005-12-08

    From a conservationist perspective, seahorses are threatened fishes. Concomitantly, from a socioeconomic perspective, they represent a source of income to many fishing communities in developing countries. An integration between these two views requires, among other things, the recognition that seahorse fishers have knowledge and abilities that can assist the implementation of conservation strategies and of management plans for seahorses and their habitats. This paper documents the knowledge held by Brazilian fishers on the biology and ecology of the longsnout seahorse Hippocampus reidi. Its aims were to explore collaborative approaches to seahorse conservation and management in Brazil; to assess fishers' perception of seahorse biology and ecology, in the context of evaluating potential management options; and to increase fishers' involvement with seahorse conservation in Brazil. Data were obtained through questionnaires and interviews conducted during field surveys in fishing villages located in the States of Piauí, Ceará, Paraíba, Maranhão, Pernambuco and Pará. We consider the following aspects as positive for the conservation of seahorses and their habitats in Brazil: fishers were willing to dialogue with researchers; although captures and/or trade of brooding seahorses occurred, most interviewees recognized the importance of reproduction to the maintenance of seahorses in the wild (and therefore of their source of income), and expressed concern over population declines; fishers associated the presence of a ventral pouch with reproduction in seahorses (regardless of whether they knew which sex bears the pouch), and this may facilitate the construction of collaborative management options designed to eliminate captures of brooding specimens; fishers recognized microhabitats of importance to the maintenance of seahorse wild populations; fishers who kept seahorses in captivity tended to recognize the conditions as poor, and as being a cause of seahorse mortality.

  18. Promotion of Influenza Prevention Beliefs and Behaviors through Primary School Science Education

    PubMed Central

    Koep, TH; Jenkins, S; M Hammerlund, ME; Clemens, C; Fracica, E; Ekker, SC; Enders, FT; Huskins, WC; Pierret, C

    2016-01-01

    Background: School-based campaigns to improve student health have demonstrated short-term success across various health topics. However, evidence of the effectiveness of programs in promoting healthy beliefs and behaviors is limited. We hypothesized that educational curricula teaching the science behind health promotion would increase student knowledge, beliefs and adherence to healthy behaviors, in this case related to influenza. Methods: Integrated Science Education Outreach (InSciEd Out) is a successful education intervention in Rochester, Minnesota public schools that has demonstrated improvements in student learning. Within this program, we designed novel curricula and assessments to determine if gains in knowledge extended to influenza prevention. Further, we coupled InSciEd Out programming with a clinical intervention, Influenza Prevention Prescription Education (IPPE), to compare students' attitudes, intentions and healthy behaviors utilizing surveys and hand hygiene monitoring equipment. Results: 95 students participated in IPPE in the intervention school. Talking drawings captured improvement in influenza prevention understanding related to hand washing [pre n=17 (43%); post n=30 (77%)] and vaccination [pre n=2 (5%); post n=15 (38%)]. Findings from 1024 surveys from 566 students revealed strong baseline understanding and attitudes related to hand washing and cough etiquette (74% or greater positive responses). Automated hand hygiene monitoring in school bathrooms and classrooms estimated compliance for both soap (overall median 63%, IQR 38% to 100%) and hand sanitizer use (0.04 to 0.24 uses per student per day) but did not show significant pre/post IPPE differences. Conclusions: Student understanding of principles of influenza prevention was reasonably high. Even with this baseline, InSciEd Out and IPPE improved students' unprompted knowledge of behaviors to prevent influenza, as reflected by talking drawings. 
This novel metric may be more sensitive in capturing knowledge among students than traditional assessment methods. However, IPPE did not produce further significant differences in student attitudes and behaviors regarding the flu. PMID:27525193

  19. Promotion of Influenza Prevention Beliefs and Behaviors through Primary School Science Education.

    PubMed

    Koep, T H; Jenkins, S; M Hammerlund, M E; Clemens, C; Fracica, E; Ekker, S C; Enders, F T; Huskins, W C; Pierret, C

    2016-06-01

    School-based campaigns to improve student health have demonstrated short-term success across various health topics. However, evidence of the effectiveness of programs in promoting healthy beliefs and behaviors is limited. We hypothesized that educational curricula teaching the science behind health promotion would increase student knowledge, beliefs and adherence to healthy behaviors, in this case related to influenza. Integrated Science Education Outreach (InSciEd Out) is a successful education intervention in Rochester, Minnesota public schools that has demonstrated improvements in student learning. Within this program, we designed novel curricula and assessments to determine if gains in knowledge extended to influenza prevention. Further, we coupled InSciEd Out programming with a clinical intervention, Influenza Prevention Prescription Education (IPPE), to compare students' attitudes, intentions and healthy behaviors utilizing surveys and hand hygiene monitoring equipment. 95 students participated in IPPE in the intervention school. Talking drawings captured improvement in influenza prevention understanding related to hand washing [pre n=17 (43%); post n=30 (77%)] and vaccination [pre n=2 (5%); post n=15 (38%)]. Findings from 1024 surveys from 566 students revealed strong baseline understanding and attitudes related to hand washing and cough etiquette (74% or greater positive responses). Automated hand hygiene monitoring in school bathrooms and classrooms estimated compliance for both soap (overall median 63%, IQR 38% to 100%) and hand sanitizer use (0.04 to 0.24 uses per student per day) but did not show significant pre/post IPPE differences. Student understanding of principles of influenza prevention was reasonably high. Even with this baseline, InSciEd Out and IPPE improved students' unprompted knowledge of behaviors to prevent influenza, as reflected by talking drawings. 
This novel metric may be more sensitive in capturing knowledge among students than traditional assessment methods. However, IPPE did not produce further significant differences in student attitudes and behaviors regarding the flu.

  20. Applicability of Kerker preconditioning scheme to the self-consistent density functional theory calculations of inhomogeneous systems

    NASA Astrophysics Data System (ADS)

    Zhou, Yuzhi; Wang, Han; Liu, Yu; Gao, Xingyu; Song, Haifeng

    2018-03-01

    The Kerker preconditioner, based on the dielectric function of the homogeneous electron gas, is designed to accelerate the self-consistent field (SCF) iteration in density functional theory calculations. However, a question remains regarding its applicability to inhomogeneous systems. We develop a modified Kerker preconditioning scheme which captures the long-range screening behavior of inhomogeneous systems and thus improves SCF convergence. Its effectiveness and efficiency are shown by tests on long-z slabs of metals, insulators, and metal-insulator contacts. For situations without a priori knowledge of the system, we design an a posteriori indicator to monitor whether the preconditioner has suppressed charge sloshing during the iterations. Based on this a posteriori indicator, we demonstrate two schemes of self-adaptive configuration for the SCF iteration.
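    The standard Kerker scheme that this paper modifies damps the long-wavelength part of the SCF density residual in reciprocal space by the factor G²/(G² + q0²), which is what suppresses charge sloshing. A minimal NumPy sketch of that standard preconditioner on a periodic 1D grid (the screening parameter q0, grid size, and mixing factor are illustrative assumptions, not values from the paper):

```python
import numpy as np

def kerker_precondition(residual, cell_length, q0=1.0):
    """Apply the standard Kerker preconditioner to a real-space SCF
    density residual sampled on a uniform periodic 1D grid.

    The residual is damped in reciprocal space by G^2 / (G^2 + q0^2),
    which attenuates long-wavelength (small-G) components responsible
    for charge sloshing. q0 plays the role of a Thomas-Fermi screening
    wave vector (an assumed parameter here).
    """
    n = residual.size
    # reciprocal-lattice vectors G for the periodic cell
    g = 2.0 * np.pi * np.fft.fftfreq(n, d=cell_length / n)
    r_g = np.fft.fft(residual)
    g2 = g * g
    factor = g2 / (g2 + q0 * q0)  # -> 0 as G -> 0, -> 1 for large G
    factor[0] = 0.0               # G = 0: total charge is conserved
    return np.real(np.fft.ifft(factor * r_g))

# one simple mixing step: rho_new = rho_in + alpha * preconditioned residual
rho_in = np.ones(64)
rho_out = rho_in + 0.1 * np.sin(2 * np.pi * np.arange(64) / 64)
step = kerker_precondition(rho_out - rho_in, cell_length=10.0, q0=1.5)
rho_new = rho_in + 0.8 * step
```

    Because the toy residual here is a single long-wavelength mode, the preconditioned step is strongly damped relative to plain mixing; the paper's modification replaces the homogeneous-gas factor with one reflecting the inhomogeneous screening of slabs and interfaces.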

  1. SPICE: An innovative, flexible instrument concept

    NASA Technical Reports Server (NTRS)

    Nishioka, Kenji; Cauffman, D. P.; Jurcevich, B.; Mendez, David J.; Ryder, James T.

    1994-01-01

    Studies and plans for orbital capture of cosmic dust and interplanetary dust particles (IDP's) looked very bright with the advent of space station Freedom (SSF) and the formal selection of the Cosmic Dust Collection Facility (CDCF) as an attached payload in 1990. Unfortunately it has been downhill since its selection, culminating in CDCF being dropped as an attached payload in the SSF redesign process this year. This action was taken without any input from the science or cosmic dust communities. The Exobiology Intact Capture Experiment (Exo-ICE), an experiment on CDCF, was also lost. Without CDCF, no facility-class instrument for cosmic dust studies is available or planned. When CDCF (and Exo-ICE) was selected as an SSF attached payload, an exercise called the small particle intact capture experiment (SPICE) was started for Exo-ICE to develop an understanding and early testing of the necessary expertise and technology for intact capture of cosmic dust and IDP's. The SPICE activity aims to fly small collection-area experiments, a square meter or less, on early orbital platforms of opportunity such as EURECA, MIR, WESTAR, and others, including the shuttle. The SPICE activity has focused on developing techniques and instrument concepts to capture particles intact and without inadvertent contamination. It began with a survey and screening of available capture media concepts and then focused on the development of a capture medium that can meet these requirements. Evaluation and development of the chosen capture medium, aerogel (a silicon oxide gel), has so far lived up to the expectations of meeting the requirements and is highlighted in a companion paper at this workshop. Others, such as McDonnell's Timeband Capture Cell Experiment (TICCE) on EuReCa and Tsou's GAS-CAN lid experiments on STS 47 and 57, have flown aerogel, but without addressing the contamination issue/requirement, especially regarding organics. 
Horz, Zolensky, and others have studied aerogel and have also been advocates for its development. The SPICE instrument's experiment design builds on the knowledge gained from these efforts to meet the intact-capture and noncontamination requirements. An overview of a possible SPICE experimental instrument concept using the MIR space station as a host platform for cosmic dust collection is provided in this paper. The SPICE concept is not platform-specific and can fly on any platform that provides a mode for experiment recovery.

  2. The SMART Study, a Mobile Health and Citizen Science Methodological Platform for Active Living Surveillance, Integrated Knowledge Translation, and Policy Interventions: Longitudinal Study.

    PubMed

    Katapally, Tarun Reddy; Bhawra, Jasmin; Leatherdale, Scott T; Ferguson, Leah; Longo, Justin; Rainham, Daniel; Larouche, Richard; Osgood, Nathaniel

    2018-03-27

    Physical inactivity is the fourth leading cause of death worldwide, costing approximately US $67.5 billion per year to health care systems. To curb the physical inactivity pandemic, it is time to move beyond traditional approaches and engage citizens by repurposing sedentary behavior (SB)-enabling ubiquitous tools (eg, smartphones). The primary objective of the Saskatchewan, let's move and map our activity (SMART) Study was to develop a mobile and citizen science methodological platform for active living surveillance, knowledge translation, and policy interventions. This methodology paper enumerates the SMART Study platform's conceptualization, design, implementation, data collection procedures, analytical strategies, and potential for informing policy interventions. This longitudinal investigation was designed to engage participants (ie, citizen scientists) in Regina and Saskatoon, Saskatchewan, Canada, in four different seasons across 3 years. In spring 2017, pilot data collection was conducted, where 317 adult citizen scientists (≥18 years) were recruited in person and online. Citizen scientists used a custom-built smartphone app, Ethica (Ethica Data Services Inc), for 8 consecutive days to provide a complex series of objective and subjective data. Citizen scientists answered a succession of validated surveys that were assigned different smartphone triggering mechanisms (eg, user-triggered and schedule-triggered). The validated surveys captured physical activity (PA), SB, motivation, perception of outdoor and indoor environment, and eudaimonic well-being. Ecological momentary assessments were employed on each day to capture not only PA but also physical and social contexts along with barriers and facilitators of PA, as relayed by citizen scientists using geo-coded pictures and audio files. 
To obtain a comprehensive objective picture of participant location, motion, and compliance, 6 types of sensor-based (eg, global positioning system and accelerometer) data were surveilled for 8 days. Initial descriptive analyses were conducted using geo-coded photographs and audio files. Pictures and audio files (ie, community voices) showed that the barriers and facilitators of active living included intrinsic or extrinsic motivations, social contexts, and outdoor or indoor environment, with pets and favorable urban design featuring as the predominant facilitators, and work-related screen time proving to be the primary barrier. The preliminary pilot results show the flexibility of the SMART Study surveillance platform in identifying and addressing limitations based on empirical evidence. The results also show the successful implementation of a platform that engages participants to catalyze policy interventions. Although SMART Study is currently geared toward surveillance, using the same platform, active living interventions could be remotely implemented. SMART Study is the first mobile, citizen science surveillance platform utilizing a rigorous, longitudinal, and mixed-methods investigation to temporally capture behavioral data for knowledge translation and policy interventions. ©Tarun Reddy Katapally, Jasmin Bhawra, Scott T Leatherdale, Leah Ferguson, Justin Longo, Daniel Rainham, Richard Larouche, Nathaniel Osgood. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 27.03.2018.

  3. Cutting Silica Aerogel for Particle Extraction

    NASA Technical Reports Server (NTRS)

    Tsou, P.; Brownlee, D. E.; Glesias, R.; Grigoropoulos, C. P.; Weschler, M.

    2005-01-01

    The detailed laboratory analyses of extraterrestrial particles have revolutionized our knowledge of planetary bodies in the last three decades. This knowledge of the chemical composition, morphology, mineralogy, and isotopics of particles cannot be provided by remote sensing. In order to acquire this detailed information in the laboratory, the samples need to be intact and unmelted. Such intact capture of hypervelocity particles was developed in 1996. Subsequently, silica aerogel was introduced as the preferred medium for intact capture of hypervelocity particles and was later shown to be particularly suitable for the space environment. STARDUST, the fourth NASA Discovery mission, which will capture samples from 81P/Wild 2 and contemporary interstellar dust, is the culmination of these new technologies. In early laboratory experiments launching hypervelocity projectiles into aerogel, there was a need to cut the aerogel to isolate or extract captured particles/tracks. This is especially challenging for space captures, since there will be many particles/tracks of widely ranging scales closely located, or even collocated. It is critical to isolate and extract one particle without compromising its neighbors, since the full significance of a particle is not known until it is extracted and analyzed. To date, three basic techniques have been explored: mechanical cutting, laser cutting, and ion beam milling. We report the current findings.

  4. Reducing the cognitive workload: Trouble managing power systems

    NASA Technical Reports Server (NTRS)

    Manner, David B.; Liberman, Eugene M.; Dolce, James L.; Mellor, Pamela A.

    1993-01-01

    The complexity of space-based systems makes monitoring them and diagnosing their faults taxing for human beings. Mission control operators are well-trained experts, but they cannot afford to have their attention diverted by extraneous information. Under normal operating conditions, monitoring the status of the components of a complex system is alone a big task. When a problem arises, immediate attention and quick resolution are mandatory. To aid humans in these endeavors, we have developed an automated advisory system. Our advisory expert system, Trouble, incorporates the knowledge of the power system designers for Space Station Freedom. Trouble is designed to be a ground-based advisor for the mission controllers in the Control Center Complex at Johnson Space Center (JSC). It has been developed at NASA Lewis Research Center (LeRC) and tested in conjunction with prototype flight hardware contained in the Power Management and Distribution testbed and the Engineering Support Center (ESC) at LeRC. Our work will culminate with the adoption of these techniques by the mission controllers at JSC. This paper elucidates how we have captured power system failure knowledge, how we have built and tested our expert system, and what we believe are its potential uses.

  5. Why children are not vaccinated: a review of the grey literature.

    PubMed

    Favin, Michael; Steinglass, Robert; Fields, Rebecca; Banerjee, Kaushik; Sawhney, Monika

    2012-12-01

    In collaboration with WHO, IMMUNIZATIONbasics analyzed 126 documents from the global grey literature to identify reasons why eligible children had incomplete or no vaccinations. The main reasons for under-vaccination were related to immunization services and to parental knowledge and attitudes. The most frequently cited factors were: access to services, health staff attitudes and practices, reliability of services, false contraindications, parents' practical knowledge of vaccination, fear of side effects, conflicting priorities and parental beliefs. Some family demographic characteristics were strong, but underlying, risk factors for under-vaccination. Studies must be well designed to capture a complete picture of the simultaneous causes of under-vaccination and to avoid biased results. Although the grey literature contains studies of varying quality, it includes many well-designed studies. Every immunization program should strive to provide quality services that are accessible, convenient, reliable, friendly, affordable and acceptable, and should solicit feedback from families and community leaders. Every program should monitor missed and under-vaccinated children and assess and address the causes. Although global reviews, such as this one, can play a useful role in identifying key questions for local study, local enquiry and follow-up remain essential.

  6. Principal Leadership for Technology-enhanced Learning in Science

    NASA Astrophysics Data System (ADS)

    Gerard, Libby F.; Bowyer, Jane B.; Linn, Marcia C.

    2008-02-01

    Reforms such as technology-enhanced instruction require principal leadership. Yet, many principals report that they need help to guide implementation of science and technology reforms. We identify strategies for helping principals provide this leadership. A two-phase design is employed. In the first phase we elicit principals' varied ideas about the Technology-enhanced Learning in Science (TELS) curriculum materials being implemented by teachers in their schools, and in the second phase we engage principals in a leadership workshop designed based on the ideas they generated. Analysis uses an emergent coding scheme to categorize principals' ideas, and a knowledge integration framework to capture the development of these ideas. The analysis suggests that principals frame their thinking about the implementation of TELS in terms of: principal leadership, curriculum, educational policy, teacher learning, student outcomes and financial resources. They seek to improve their own knowledge to support this reform. The principals organize their ideas around individual school goals and current political issues. Principals prefer professional development activities that engage them in reviewing curricula and student work with other principals. Based on the analysis, this study offers guidelines for creating learning opportunities that enhance principals' leadership abilities in technology and science reform.

  7. Cognitive knowledge, attitude toward science, and skill development in virtual science laboratories

    NASA Astrophysics Data System (ADS)

    Babaie, Mahya

    The purpose of this quantitative, descriptive, single-group, pretest-posttest design study was to explore the influence of a Virtual Science Laboratory (VSL) on middle school students' cognitive knowledge, skill development, and attitudes toward science. This study involved two eighth-grade Physical Science classrooms at a large urban charter middle school located in Southern California. The Buoyancy and Density Test (BDT), a computer-generated test, assessed students' scientific knowledge in the areas of buoyancy and density. The Attitude Toward Science Inventory (ATSI), a multidimensional survey assessment, measured students' attitudes toward science in the areas of value of science in society, motivation in science, enjoyment of science, self-concept regarding science, and anxiety toward science. A Virtual Laboratory Packet (VLP), generated by the researcher, captured students' mathematical and scientific skills. Data collection was conducted over a period of five days. BDT and ATSI assessments were administered twice: once before the Buoyancy and Density VSL to serve as baseline data (pre) and again after the VSL (post). The findings of this study revealed that students' cognitive knowledge and attitudes toward science changed positively as expected; however, the results from paired-sample t-tests found no statistical significance. Analyses indicated that VSLs were effective in supporting students' scientific knowledge and attitudes toward science. The attitudes most changed were value of science in society and enjoyment of science, with mean differences of 1.71 and 0.88, respectively. Researchers and educational practitioners are urged to further examine VSLs, covering a variety of topics, with more middle school students to assess their learning outcomes. 
Additionally, it is recommended that publishers in charge of designing the VSLs communicate with science instructors and research practitioners to further improve the design and analytic components of these virtual learning environments. The results of this study contribute to the existing body of knowledge in an effort to raise awareness about the inclusion of VSLs in secondary science classrooms. With the advancement of technological tools in secondary science classrooms, instructional practices should consider including VSLs especially if providing real science laboratories is a challenge.

  8. Risk Information Management Resource (RIMR): modeling an approach to defending against military medical information assurance brain drain

    NASA Astrophysics Data System (ADS)

    Wright, Willie E.

    2003-05-01

    As Military Medical Information Assurance organizations face modern pressures to downsize and outsource, they battle with losing knowledgeable people who leave and take with them what they know. This knowledge is increasingly being recognized as an important resource, and organizations are now taking steps to manage it. In addition, as the pressures for globalization (Castells, 1998) increase, collaboration and cooperation are becoming more distributed and international. Knowledge sharing in a distributed international environment is becoming an essential part of knowledge management. This is a major shortfall in the current approach to capturing and sharing knowledge in Military Medical Information Assurance. This paper addresses this challenge by exploring the Risk Information Management Resource (RIMR) as a tool for sharing knowledge using the concept of communities of practice. RIMR is based on a framework of sharing and using knowledge, realized through three major components: people, process and technology. The people aspect enables remote collaboration, supports communities of practice, and rewards and recognizes knowledge sharing while encouraging storytelling. The process aspect enhances knowledge capture and manages information. The technology aspect enhances system integration and data mining, and also utilizes intelligent agents and exploits expert systems. These components, coupled with the supporting activities of education and training, technology infrastructure, and information security, enable effective information assurance collaboration.

  9. Making PCK Explicit--Capturing Science Teachers' Pedagogical Content Knowledge (PCK) in the Science Classroom

    ERIC Educational Resources Information Center

    Nilsson, Pernilla; Vikström, Anna

    2015-01-01

    One way for teachers to develop their professional knowledge, which also focuses on specific science content and the ways students learn, is through being involved in researching their own practice. The aim of this study was to examine how science teachers changed (or not) their professional knowledge of teaching after inquiring into their own…

  10. Protecting and Promoting Indigenous Knowledge: Environmental Adult Education and Organic Agriculture

    ERIC Educational Resources Information Center

    Sumner, Jennifer

    2008-01-01

    Given today's pressing environmental issues, environmental adult educators can help us learn to live more sustainably. One of the models for more sustainable ways of life is organic agriculture, based in a knowledge system that works with nature, not against it. In order to understand this knowledge, we need to frame it in a way that captures all…

  11. A Foundation for Understanding Knowledge Sharing: Organizational Culture, Informal Workplace Learning, Performance Support, and Knowledge Management

    ERIC Educational Resources Information Center

    Caruso, Shirley J.

    2017-01-01

    This paper serves as an exploration into some of the ways in which organizations can promote, capture, share, and manage the valuable knowledge of their employees. The problem is that employees typically do not share valuable information, skills, or expertise with other employees or with the entire organization. The author uses research as well as…

  12. Tool for Constructing Data Albums for Significant Weather Events

    NASA Astrophysics Data System (ADS)

    Kulkarni, A.; Ramachandran, R.; Conover, H.; McEniry, M.; Goodman, H.; Zavodsky, B. T.; Braun, S. A.; Wilson, B. D.

    2012-12-01

    Case study analysis and climatology studies are common approaches used in Atmospheric Science research. Research based on case studies involves a detailed description of specific weather events using data from different sources, to characterize the physical processes in play for a given event. Climatology-based research tends to focus on the representativeness of a given event, by studying the characteristics and distribution of a large number of events. Gathering relevant data and information for case studies and climatology analysis is tedious and time consuming; current Earth Science data systems are not suited to assembling multi-instrument, multi-mission datasets around specific events. For example, in hurricane science, finding airborne or satellite data relevant to a given storm requires searching through web pages and data archives. Background information related to damages, deaths, and injuries requires extensive online searches for news reports and official storm summaries. We will present a knowledge synthesis engine to create curated "Data Albums" to support case study analysis and climatology studies. The technological challenges in building such a reusable and scalable knowledge synthesis engine are several. First, how to encode domain knowledge in a machine-usable form? This knowledge must capture what information and data resources are relevant and the semantic relationships between the various fragments of information and data. Second, how to extract semantic information from various heterogeneous sources, including unstructured texts, using the encoded knowledge? Finally, how to design a structured database from the encoded knowledge to store all information and to support querying? The structured database must allow both knowledge overviews of an event as well as the drill-down capability needed for detailed analysis. An application ontology driven framework is being used to design the knowledge synthesis engine. 
The knowledge synthesis engine is being applied to build a portal for hurricane case studies at the Global Hydrology and Resource Center (GHRC), a NASA Data Center. This portal will auto-generate Data Albums for specific hurricane events, compiling information from distributed resources such as NASA field campaign collections, relevant data sets, storm reports, pictures, videos and other useful sources.
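    As one illustration of the third challenge above, a hypothetical event-centric schema (an assumption for illustration, not the actual GHRC design) might link each weather event to its heterogeneous resources so that a single database supports both an overview query and a drill-down query:

```python
import sqlite3

# Illustrative event-centric schema: one row per weather event, with a
# linked table of heterogeneous resources (datasets, reports, media) so
# a "Data Album" can be assembled by querying around the event.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE event (
    id INTEGER PRIMARY KEY,
    name TEXT, start_date TEXT, end_date TEXT, category TEXT
);
CREATE TABLE resource (
    id INTEGER PRIMARY KEY,
    event_id INTEGER REFERENCES event(id),
    kind TEXT,        -- e.g. 'dataset', 'storm_report', 'image', 'news'
    source TEXT, uri TEXT
);
""")
con.execute("INSERT INTO event VALUES (1, 'Hurricane Example', "
            "'2012-10-22', '2012-10-31', 'hurricane')")
con.executemany(
    "INSERT INTO resource (event_id, kind, source, uri) VALUES (1, ?, ?, ?)",
    [("dataset", "field campaign", "file:///campaign/fc01"),
     ("storm_report", "official summary", "file:///reports/r01")])

# Overview: resource counts per event; drill-down: one kind of resource.
overview = con.execute(
    "SELECT e.name, COUNT(r.id) FROM event e "
    "JOIN resource r ON r.event_id = e.id GROUP BY e.id").fetchall()
datasets = con.execute(
    "SELECT source, uri FROM resource WHERE event_id = 1 "
    "AND kind = 'dataset'").fetchall()
```

    In the ontology-driven framework the paper describes, such tables would be generated from the encoded domain knowledge rather than written by hand; the sketch only shows why one schema can serve both overview and drill-down access patterns.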

  13. Knowledge Management and Reference Services

    ERIC Educational Resources Information Center

    Gandhi, Smiti

    2004-01-01

    Many corporations are embracing knowledge management (KM) to capture the intellectual capital of their employees. This article focuses on KM applications for reference work in libraries. It defines key concepts of KM, establishes a need for KM for reference services, and reviews various KM initiatives for reference services.

  14. Event-related potentials reveal the effect of prior knowledge on competition for representation and attentional capture.

    PubMed

    Hilimire, Matthew R; Corballis, Paul M

    2014-01-01

    Objects compete for representation in our limited capacity visual system. We examined how this competition is influenced by top-down knowledge using event-related potentials. Competition was manipulated by presenting visual search arrays in which the target or distractor was the only color singleton compared to displays in which both singletons were presented. Experiments 1 and 2 manipulated whether the observer knew the color of the target in advance. Experiment 3 ruled out low-level sensory explanations. Results show that, under conditions of competition, the distractor does not elicit an N2pc when the target color is known. However, the N2pc elicited by the target is reduced in the presence of a distractor. These findings suggest that top-down knowledge can prevent the capture of attention by distracting information, but this prior knowledge does not eliminate the competitive influence of the distractor on the target. Copyright © 2013 Society for Psychophysiological Research.

  15. Promoting Collaborative Practice and Reciprocity in Initial Teacher Education: Realising a "Dialogic Space" through Video Capture Analysis

    ERIC Educational Resources Information Center

    Youens, Bernadette; Smethem, Lindsey; Sullivan, Stefanie

    2014-01-01

    This paper explores the potential of video capture to generate a collaborative space for teacher preparation; a space in which traditional hierarchies and boundaries between actors (student teacher, school mentor and university tutor) and knowledge (academic, professional and practical) are disrupted. The study, based in a teacher education…

  16. e-Learning Content Design for Corrective Maintenance of Toshiba BMC 80.5 based on Knowledge Conversion using SECI Method: A Case Study in Aerospace Company

    NASA Astrophysics Data System (ADS)

    Permata Shabrina, Ayu; Pramuditya Soesanto, Rayinda; Kurniawati, Amelia; Teguh Kurniawan, Mochamad; Andrawina, Luciana

    2018-03-01

    Knowledge is a combination of experience, values, and information, grounded in intuition, that allows an organization to evaluate and incorporate new information. In an organization, knowledge is not only attached to documents but also embedded in routine value-creating activities; knowledge is therefore an important asset for the organization. X Corp is a company focused on manufacturing aerospace components. The production process is supported by various machines, one of which is the Toshiba BMC 80.5. The machine is used occasionally, and maintenance activity is therefore needed, especially corrective maintenance, which is performed to return a broken-down machine to working order. Corrective maintenance is done by a maintenance operator who is close to retirement. The operator's long-term experience needs to be captured by the organization and shared across the maintenance division. E-learning is one type of media that can support and assist knowledge sharing. The purpose of this research is to create e-learning content for best-practice corrective maintenance of the Toshiba BMC 80.5 by extracting knowledge and experience from the operator, based on knowledge conversion using the SECI method. The knowledge sources in this research are a maintenance supervisor and a senior maintenance engineer. Evaluation of the e-learning content showed that the average test score of respondents who used the e-learning increased from 77.5 to 87.5.

  17. A Three-Lesson Teaching Unit Significantly Increases High School Students’ Knowledge about Epilepsy and Positively Influences Their Attitude towards This Disease

    PubMed Central

    Simon, Uwe K.; Gesslbauer, Lisa; Fink, Andreas

    2016-01-01

    Epilepsy is not a regular topic in many countries’ schools. Thus many people harbor misconceptions about people suffering from this disease. It was our aim to a) examine what grade ten students know and believe about epilepsy, and b) to develop and test a teaching unit to improve their knowledge and attitude. The test group comprised eight grade ten classes from six different Austrian high schools (54 girls and 51 boys aged 14–17), the control group (no intervention) five grade ten classes from the same schools (26 girls and 37 boys aged 14–17). The teaching unit consisted of three 45-min lessons using different methods and material. Changes in knowledge about and attitude towards epilepsy as a result of the intervention were psychometrically assessed in a pre-test intervention post-test design (along with a follow-up assessment two months after the intervention) by means of a questionnaire capturing different facets of epilepsy-related knowledge and attitude. Across all knowledge/attitude domains, students of the test group had a significantly improved knowledge about and a more positive attitude towards epilepsy and people suffering from it after the teaching unit. However, starting levels were different between the five knowledge/attitude domains tested. Medical background knowledge was lowest and consequently associated with the highest increase after the intervention. This study shows that epilepsy-related knowledge of many grade ten high school students is fragmentary and that some harbor beliefs and attitudes which require improvement. Our comprehensive but concise teaching unit significantly increased knowledge about epilepsy and positively influenced attitude towards individuals with epilepsy. Thus we recommend implementing this unit into regular school curricula. PMID:26919557

  18. Design and implementation of GaAs HBT circuits with ACME

    NASA Technical Reports Server (NTRS)

    Hutchings, Brad L.; Carter, Tony M.

    1993-01-01

    GaAs HBT circuits offer high performance (5-20 GHz) and radiation hardness (500 Mrad) that are attractive for space applications. ACME is a CAD tool specifically developed for HBT circuits. ACME implements a novel physical schematic-capture design technique in which designers simultaneously view the structure and physical organization of a circuit. ACME's design interface is similar to schematic capture; however, unlike conventional schematic capture, designers can directly control the physical placement of both function and interconnect at the schematic level. In addition, ACME provides design-time parasitic extraction, complex wire models, and extensions to Multi-Chip Modules (MCMs). A GaAs HBT gate array and semi-custom circuits have been developed with ACME; several circuits have been fabricated and found to be fully functional.

  19. Integration of E-Learning and Knowledge Management.

    ERIC Educational Resources Information Center

    Woelk, Darrell; Agarwal, Shailesh

    E-Learning technology today is used primarily to handcraft training courses about carefully selected topics for delivery to employees registered for those courses. This paper investigates the integration of e-learning and knowledge management technology to improve the capture, organization and delivery of both traditional training courses and…

  20. A domain-specific design architecture for composite material design and aircraft part redesign

    NASA Technical Reports Server (NTRS)

    Punch, W. F., III; Keller, K. J.; Bond, W.; Sticklen, J.

    1992-01-01

    Advanced composites have been targeted as a 'leapfrog' technology that would provide a unique global competitive position for U.S. industry. Composites are unique in their requirement for an integrated approach to the design, manufacture, and marketing of products developed using these new materials of construction. Numerous studies extending across the entire economic spectrum of the United States, from aerospace to military to durable goods, have identified composites as a 'key' technology. In general, there have been two approaches to composite construction: build models of a given composite material, then determine the material's characteristics via numerical simulation and empirical testing; and experience-directed construction of fabrication plans for building composites with given properties. The first route aims to capture a basic understanding of a device (the composite) through a rigorous mathematical model; the second attempts to capture expertise about the process of fabricating a composite, to date at a surface level typically expressed in a rule-based system. From an AI perspective, these two research lines attack distinctly different problems, and both tracks have current limitations. The mathematical modeling approach has yielded a wealth of data, but a large number of simplifying assumptions are needed to make numerical simulation tractable. Likewise, although surface-level expertise about how to build a particular composite may yield important results, recent trends in the KBS area are toward augmenting surface-level problem solving with deeper knowledge. Many of the relative advantages of composites, e.g., the strength-to-weight ratio, are most prominent when the entire component is designed as a unitary piece. The bottleneck in undertaking such unitary design lies in the difficulty of the redesign task: designing the fabrication protocols for a complex-shaped, thick-section composite is currently very difficult. It is in fact this difficulty that our research will address.

  1. A methodology for uncertainty quantification in quantitative technology valuation based on expert elicitation

    NASA Astrophysics Data System (ADS)

    Akram, Muhammad Farooq Bin

    The management of technology portfolios is an important element of aerospace system design. New technologies are often applied to new product designs to ensure their competitiveness at the time they are introduced to market. The future performance of yet-to-be-designed components is inherently uncertain, necessitating subject-matter-expert knowledge, statistical methods, and financial forecasting. Estimates of the appropriate parameter settings often come from disciplinary experts, who may disagree with each other because of varying experience and backgrounds. Due to the inherently uncertain nature of expert elicitation in the technology valuation process, appropriate uncertainty quantification and propagation are critical. The uncertainty in defining the impact of an input on the performance parameters of a system makes it difficult to use traditional probability theory. Often the available information is not enough to assign appropriate probability distributions to uncertain inputs. Another problem faced during technology elicitation pertains to technology interactions in a portfolio: when multiple technologies are applied simultaneously to a system, their cumulative impact is often non-linear. Current methods assume that technologies are either incompatible or linearly independent. When knowledge about the problem is lacking, epistemic uncertainty is the most suitable representation of the process: it reduces the number of assumptions required during elicitation, where experts would otherwise be forced to assign probability distributions to their opinions without sufficient knowledge. Epistemic uncertainty can be quantified by many techniques; in the present research it is proposed that interval analysis and the Dempster-Shafer theory of evidence are better suited for quantification of epistemic uncertainty in the technology valuation process. 
The proposed technique seeks to offset some of the problems encountered when using deterministic or traditional probabilistic approaches for uncertainty propagation. Non-linear behavior in technology interactions is captured through expert-elicitation-based technology synergy matrices (TSMs). The proposed TSMs increase the fidelity of current technology forecasting methods by including higher-order technology interactions. A large-scale problem, a combined-cycle power generation system, was selected as a test case for quantification of epistemic uncertainty, and a detailed multidisciplinary modeling and simulation environment was adopted for it. Results have shown that the evidence-theory-based technique provides more insight into the uncertainties arising from incomplete information or lack of knowledge than deterministic or probability-theory methods. Margin analysis was also carried out for both techniques. A detailed description of TSMs and their usage in conjunction with technology impact matrices and technology compatibility matrices is given. Various combination methods are also proposed for higher-order interactions, which can be applied according to expert opinion or historical data. The introduction of the technology synergy matrix enabled the capture of higher-order technology interactions and an improvement in predicted system performance.
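    The Dempster-Shafer machinery the abstract refers to can be sketched briefly: two experts assign belief mass to sets of outcomes (including the whole frame, which represents ignorance), and Dempster's rule fuses them while renormalizing away conflicting mass. The frame of discernment and the mass values below are invented for illustration and are not taken from the dissertation.

    ```python
    # Dempster's rule of combination for two expert mass functions whose focal
    # elements are frozensets of outcomes. Masses assigned to disjoint sets
    # count as conflict K and are renormalized away by dividing by (1 - K).
    from itertools import product

    def combine(m1, m2):
        combined = {}
        conflict = 0.0
        for (a, ma), (b, mb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass that would go to the empty set
        return {s: v / (1.0 - conflict) for s, v in combined.items()}

    # Hypothetical frame: a technology's impact is "low" or "high".
    LOW, HIGH = frozenset({"low"}), frozenset({"high"})
    EITHER = LOW | HIGH  # the expert cannot distinguish (ignorance)

    expert1 = {HIGH: 0.6, EITHER: 0.4}
    expert2 = {HIGH: 0.5, LOW: 0.2, EITHER: 0.3}
    fused = combine(expert1, expert2)
    ```

    Unlike forcing each expert to state a full probability distribution, the mass on EITHER lets an expert explicitly withhold judgment, which is the appeal of the evidence-theory representation for sparse elicitation data.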

  2. Development of user-centered interfaces to search the knowledge resources of the Virginia Henderson International Nursing Library.

    PubMed

    Jones, Josette; Harris, Marcelline; Bagley-Thompson, Cheryl; Root, Jane

    2003-01-01

    This poster describes the development of user-centered interfaces to extend the functionality of the Virginia Henderson International Nursing Library (VHINL) from a library to a web-based portal to nursing knowledge resources. The existing knowledge structure and computational models are revised and made complementary. Nurses' search behavior is captured and analyzed, and the resulting search models are mapped to the revised knowledge structure and computational model.

  3. AMModels: An R package for storing models, data, and metadata to facilitate adaptive management

    PubMed Central

    Katz, Jonathan E.

    2018-01-01

    Agencies are increasingly called upon to implement their natural resource management programs within an adaptive management (AM) framework. This article provides the background and motivation for the R package, AMModels. AMModels was developed under R version 3.2.2. The overall goal of AMModels is simple: To codify knowledge in the form of models and to store it, along with models generated from numerous analyses and datasets that may come our way, so that it can be used or recalled in the future. AMModels facilitates this process by storing all models and datasets in a single object that can be saved to an .RData file and routinely augmented to track changes in knowledge through time. Through this process, AMModels allows the capture, development, sharing, and use of knowledge that may help organizations achieve their mission. While AMModels was designed to facilitate adaptive management, its utility is far more general. Many R packages exist for creating and summarizing models, but to our knowledge, AMModels is the only package dedicated not to the mechanics of analysis but to organizing analysis inputs, analysis outputs, and preserving descriptive metadata. We anticipate that this package will assist users hoping to preserve the key elements of an analysis so they may be more confidently revisited at a later date. PMID:29489825

  4. AMModels: An R package for storing models, data, and metadata to facilitate adaptive management.

    PubMed

    Donovan, Therese M; Katz, Jonathan E

    2018-01-01

    Agencies are increasingly called upon to implement their natural resource management programs within an adaptive management (AM) framework. This article provides the background and motivation for the R package, AMModels. AMModels was developed under R version 3.2.2. The overall goal of AMModels is simple: To codify knowledge in the form of models and to store it, along with models generated from numerous analyses and datasets that may come our way, so that it can be used or recalled in the future. AMModels facilitates this process by storing all models and datasets in a single object that can be saved to an .RData file and routinely augmented to track changes in knowledge through time. Through this process, AMModels allows the capture, development, sharing, and use of knowledge that may help organizations achieve their mission. While AMModels was designed to facilitate adaptive management, its utility is far more general. Many R packages exist for creating and summarizing models, but to our knowledge, AMModels is the only package dedicated not to the mechanics of analysis but to organizing analysis inputs, analysis outputs, and preserving descriptive metadata. We anticipate that this package will assist users hoping to preserve the key elements of an analysis so they may be more confidently revisited at a later date.
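    AMModels itself is an R package; as a language-neutral illustration of the idea it describes, bundling models, datasets, and descriptive metadata in a single serializable object that can be augmented over time, here is a hypothetical Python sketch. The class, field names, and example entries are invented and are not the AMModels API.

    ```python
    # Hypothetical analogue of the AMModels idea: one object holds models,
    # the data behind them, and free-form metadata, and can be saved to disk
    # and reloaded later. Illustration only; not the AMModels R interface.
    import pickle

    class ModelLibrary:
        def __init__(self):
            self.models = {}    # name -> fitted model or parameter set
            self.data = {}      # name -> dataset the model was fit to
            self.metadata = {}  # name -> descriptive notes (analyst, date, ...)

        def add(self, name, model, data=None, **meta):
            self.models[name] = model
            if data is not None:
                self.data[name] = data
            self.metadata[name] = meta

        def save(self, path):
            with open(path, "wb") as f:
                pickle.dump(self, f)

        @staticmethod
        def load(path):
            with open(path, "rb") as f:
                return pickle.load(f)

    lib = ModelLibrary()
    lib.add("occupancy_2018", model={"psi": 0.42, "p": 0.31},
            data=[1, 0, 1, 1], analyst="TMD", note="illustrative entry")
    ```

    The point, as in AMModels, is not the analysis itself but keeping inputs, outputs, and their provenance together so an analysis can be confidently revisited later.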

  5. The effect of a knowledge-based ergonomic intervention amongst administrators at Aga Khan University Hospital, Nairobi.

    PubMed

    Wanyonyi, Nancy; Frantz, Jose; Saidi, Hassan

    2015-01-01

    Low back pain (LBP) and neck pain are among the common work-related musculoskeletal disorders, with a large impact on the affected person. Despite having a multifactorial aetiology, ergonomic factors play a major role, thus necessitating workers' education. To determine the prevalence of ergonomic-related LBP and neck pain, and describe the effect of a knowledge-based ergonomic intervention amongst administrators at Aga Khan University Hospital, Nairobi. This study applied a mixed-methods design utilizing a survey and two focus group discussions (FGDs). A self-administered questionnaire was distributed to 208 participants through systematic sampling. A one-hour knowledge-based ergonomic session founded on the survey results was thereafter administered to interested participants, followed a month later by two FGDs with purposive selection of eight participants to explore their experience of the ergonomic intervention. Quantitative data was captured and analyzed using SPSS by means of descriptive and inferential statistics, whereas thematic content analysis was used for qualitative data. Most participants were knowledgeable about ergonomic-related LBP and neck pain, with twelve-month prevalences of 75.5% and 67.8% respectively. Continual ergonomic education is necessary for adherence to the health-related behaviours that will prevent work-related LBP and neck pain.

  6. Application of Knowledge Management: Pressing questions and practical answers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    FROMM-LEWIS,MICHELLE

    2000-02-11

    Sandia National Laboratories is working on ways to increase production using Knowledge Management. Knowledge Management is: finding ways to create, identify, capture, and distribute organizational knowledge to the people who need it; helping information and knowledge flow to the right people at the right time so they can act more efficiently and effectively; recognizing, documenting, and distributing explicit knowledge (explicit knowledge is quantifiable and definable; it makes up reports, manuals, instructional materials, etc.) and tacit knowledge (tacit knowledge is doing and performing; it is a combination of experience, hunches, intuition, emotions, and beliefs) in order to improve organizational performance; and taking a systematic approach to finding, understanding, and using knowledge to create value.

  7. Neutron capture therapies

    DOEpatents

    Yanch, Jacquelyn C.; Shefer, Ruth E.; Klinkowstein, Robert E.

    1999-01-01

    In one embodiment there is provided an application of the ¹⁰B(n,α)⁷Li nuclear reaction or other neutron capture reactions for the treatment of rheumatoid arthritis. This application, called Boron Neutron Capture Synovectomy (BNCS), places substantially different demands on neutron beam design than, for instance, treatment of deep-seated tumors. Considerations for neutron beam design for the treatment of arthritic joints via BNCS are provided, and comparisons with the design requirements for Boron Neutron Capture Therapy (BNCT) of tumors are made. In addition, exemplary moderator/reflector assemblies are provided which produce intense, high-quality neutron beams based on (p,n) accelerator-based reactions. In another embodiment there is provided the use of deuteron-based charged-particle reactions as sources of epithermal or thermal neutron beams for neutron capture therapies. Many (d,n) reactions (e.g., using deuterium, tritium, or beryllium targets) are very prolific at relatively low deuteron energies.

  8. Portraiture in the Large Lecture: Storying One Chemistry Professor's Practical Knowledge

    NASA Astrophysics Data System (ADS)

    Eddleton, Jeannine E.

    Practical knowledge, as defined by Freema Elbaz (1983), is a complex, practically oriented set of understandings which teachers use to actively shape and direct their work. The goal of this study is the construction of a social science portrait that illuminates the practical knowledge of a large-lecture professor of general chemistry at a public research university in the southeast. This study continues Elbaz's (1981) work on practical knowledge with the incorporation of a qualitative and intentionally interventionist methodology which "blurs the boundaries of aesthetics and empiricism in an effort to capture the complexity, dynamics, and subtlety of human experience and organizational life" (Lawrence-Lightfoot & Davis, 1997). This collection of interviews, observations, writings, and reflections is designed for an eclectic audience with the intent of initiating conversation on the topic of the large lecture, and is a purposeful attempt to link research and practice. Social science portraiture is uniquely suited to this intersection of researcher and researched, the perfect combination of methodology and analysis for a project that is both product and praxis. The following research questions guide the study. • Are aspects of Elbaz's practical knowledge identifiable in the research conversations conducted with a large-lecture college professor? • Is practical knowledge identifiable during observations of Patricia's large lecture? Freema Elbaz conducted research conversations with Sarah, a high school classroom and writing resource teacher who conducted much of her teaching work one-on-one with students. Patricia's practice differs significantly from Sarah's with respect to subject matter and to scale.

  9. Model authoring system for fail safe analysis

    NASA Technical Reports Server (NTRS)

    Sikora, Scott E.

    1990-01-01

    The Model Authoring System is a prototype software application for generating fault tree analyses and failure mode and effects analyses for circuit designs. Utilizing established artificial intelligence and expert system techniques, the circuits are modeled as a frame-based knowledge base in an expert system shell, which allows the use of object oriented programming and an inference engine. The behavior of the circuit is then captured through IF-THEN rules, which then are searched to generate either a graphical fault tree analysis or failure modes and effects analysis. Sophisticated authoring techniques allow the circuit to be easily modeled, permit its behavior to be quickly defined, and provide abstraction features to deal with complexity.
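    The frame-plus-rules idea the abstract describes can be shown in miniature: components as frame-like records and IF-THEN rules whose satisfied conditions yield failure effects. The component names, states, and rules below are hypothetical illustrations, not the Model Authoring System's actual knowledge base.

    ```python
    # Tiny sketch of rule-based failure-effect generation: circuit components
    # are frames (dicts) and each rule pairs an IF-condition on the knowledge
    # base with the effect it implies. All names here are hypothetical.
    components = {
        "R1": {"type": "resistor", "state": "open"},
        "Q1": {"type": "transistor", "state": "ok"},
    }

    rules = [
        (lambda kb: kb["R1"]["state"] == "open",
         "no bias current to Q1 -> amplifier stage dead"),
        (lambda kb: kb["Q1"]["state"] == "short",
         "supply rail loaded -> possible overcurrent"),
    ]

    def failure_effects(kb):
        """Collect the effects of every rule whose IF part holds."""
        return [effect for cond, effect in rules if cond(kb)]

    effects = failure_effects(components)
    ```

    A real system would chain such effects (one rule's conclusion updating the frames that another rule tests) to build a full fault tree rather than a flat effect list.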

  10. Influence of trap modifications and environmental predictors on capture success of southern flying squirrels

    USGS Publications Warehouse

    Jacques, Christopher N.; Zweep, James S.; Scheihing, Mary E.; Rechkemmer, Will T.; Jenkins, Sean E.; Klaver, Robert W.; Dubay, Shelli A.

    2017-01-01

    Sherman traps are the most commonly used live traps in studies of small mammals and have been successfully used in the capture of arboreal species such as the southern flying squirrel (Glaucomys volans). However, southern flying squirrels spend proportionately less time foraging on the ground, which necessitates above-ground trapping methods and modifications of capture protocols. Further, quantitative estimates of the factors affecting capture success of flying squirrel populations have focused solely on effects of trapping methodologies. We developed and evaluated the efficacy of a portable Sherman trap design for capturing southern flying squirrels during 2015–2016 at the Alice L. Kibbe Field Station, Illinois, USA. Additionally, we used logistic regression to quantify potential effects of time-dependent (e.g., weather) and time-independent (e.g., habitat, extrinsic) factors on capture success of southern flying squirrels. We recorded 165 capture events (119 F, 44 M, 2 unknown) using our modified Sherman trap design. Probability of capture success decreased by 0.10 per 1°C increase in daily maximum temperature and by 0.09 per km/hr increase in wind speed. Conversely, probability of capture success increased by 1.2 per 1°C increase in daily minimum temperature. The probability of capturing flying squirrels was negatively associated with trap orientation. When tree-mounted traps are required, our modified trap design is a safe, efficient, and cost-effective method of capturing animals when moderate weather (temperature and wind speed) conditions prevail. Further, we believe that strategic placement of traps (e.g., on the northeast side of trees) and quantitative information on site-specific characteristics (e.g., trap location, topographical features, slope, aspect, climatologic factors) could increase southern flying squirrel capture success. © 2017 The Wildlife Society.
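    The kind of model described above (capture success as a binary outcome regressed on weather covariates) can be sketched with a plain logistic regression fitted by gradient descent. The data and resulting coefficients below are synthetic and purely illustrative; they are not the study's observations or estimates.

    ```python
    # Illustrative logistic regression: capture (1/0) vs. centered weather
    # covariates (temperature - 26 C, wind speed - 16 km/h). Synthetic data.
    import math

    def sigmoid(z):
        # Numerically stable logistic function.
        if z >= 0:
            return 1.0 / (1.0 + math.exp(-z))
        ez = math.exp(z)
        return ez / (1.0 + ez)

    def fit_logistic(xs, ys, lr=0.05, epochs=1000):
        """Fit weights (w[0] is the intercept) by stochastic gradient descent."""
        w = [0.0] * (len(xs[0]) + 1)
        for _ in range(epochs):
            for x, y in zip(xs, ys):
                p = sigmoid(w[0] + sum(wi * xi for wi, xi in zip(w[1:], x)))
                err = y - p
                w[0] += lr * err
                for i, xi in enumerate(x):
                    w[i + 1] += lr * err * xi
        return w

    # Synthetic sessions: cool, calm days -> captured; hot, windy days -> not.
    xs = [(-8, -11), (-6, -8), (4, 4), (7, 9), (-4, -6), (9, 14)]
    ys = [1, 1, 0, 0, 1, 0]
    w = fit_logistic(xs, ys)

    def predict(x):
        return sigmoid(w[0] + sum(wi * xi for wi, xi in zip(w[1:], x)))
    ```

    On this toy data the fitted weights on temperature and wind come out negative, mirroring the direction (though not the magnitude) of the effects reported in the abstract.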

  11. Evaluation of Flight Attendant Technical Knowledge

    NASA Technical Reports Server (NTRS)

    Dunbar, Melisa G.; Chute, Rebecca D.; Rosekind, Mark (Technical Monitor)

    1997-01-01

    Accident and incident reports have indicated that flight attendants have numerous opportunities to provide the flight-deck crew with operational information that may prevent or lessen the severity of a potential problem. Additionally, as carrier fleets transition from three person to two person flight-deck crews, the reliance upon the cabin crew for the transfer of this information may increase further. Recent research indicates that flight attendants do not feel confident in their ability to describe mechanical parts or malfunctions of the aircraft, and the lack of flight attendant technical training has been referenced in a number of recent reports. Chute and Wiener describe five factors which may produce communication barriers between cockpit and cabin crews: the historical background of aviation, the physical separation of the two crews, psychosocial issues, regulatory factors, and organizational factors. By examining these areas of division we can identify possible bridges and address the implications of deficient cockpit/cabin communication on flight safety. Flight attendant operational knowledge may provide some mitigation of these barriers. The present study explored both flight attendant technical knowledge and flight attendant and pilot expectations of flight attendant technical knowledge. To assess the technical knowledge of cabin crewmembers, 177 current flight attendants from two U.S. carriers voluntarily completed a 13-item technical quiz. To investigate expectations of flight attendant technical knowledge, 181 pilots and a second sample of 96 flight attendants, from the same two airlines, completed surveys designed to capture each group's expectations of operational knowledge required of flight attendants. Analyses revealed several discrepancies between the present level of flight attendant operational knowledge and pilots' and flight attendants' expected and desired levels of technical knowledge. Implications for training will be discussed.

  12. Mining Hesitation Information by Vague Association Rules

    NASA Astrophysics Data System (ADS)

    Lu, An; Ng, Wilfred

    In many online shopping applications, such as Amazon and eBay, traditional Association Rule (AR) mining has limitations as it only deals with the items that are sold but ignores the items that are almost sold (for example, those items that are put into the basket but not checked out). We say that those almost sold items carry hesitation information, since customers are hesitating to buy them. The hesitation information of items is valuable knowledge for the design of good selling strategies. However, there is no conceptual model that is able to capture different statuses of hesitation information. Herein, we apply and extend vague set theory in the context of AR mining. We define the concepts of attractiveness and hesitation of an item, which represent the overall information of a customer's intent on an item. Based on the two concepts, we propose the notion of Vague Association Rules (VARs). We devise an efficient algorithm to mine the VARs. Our experiments show that our algorithm is efficient and the VARs capture more specific and richer information than do the traditional ARs.
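    The attractiveness/hesitation idea can be sketched numerically: a customer's intent toward an item is a vague interval [α, 1 − β], where α is the evidence for buying and β the evidence against. The formulas below are one plausible reading of those definitions, and the session counts are invented; neither is taken from the paper.

    ```python
    # Sketch of vague-set intent on an item from session counts:
    # bought, abandoned (put in basket but not bought), ignored (never touched).
    # Illustrative reading of the VAR concepts, with invented data.
    def vague_value(bought, abandoned, ignored):
        n = bought + abandoned + ignored
        alpha = bought / n       # support for intent to buy
        beta = ignored / n       # support against intent to buy
        return alpha, 1 - beta   # vague interval [alpha, 1 - beta]

    def attractiveness(alpha, one_minus_beta):
        # Midpoint of the interval: overall pull of the item.
        return (alpha + one_minus_beta) / 2

    def hesitation(alpha, one_minus_beta):
        # Width of the interval: the undecided (basket-but-not-bought) part.
        return one_minus_beta - alpha

    a, ub = vague_value(bought=30, abandoned=20, ignored=50)
    ```

    Mining VARs then amounts to running association-rule mining over these interval-valued intents instead of crisp bought/not-bought flags, so rules can distinguish items customers buy from items they merely hover over.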

  13. Acute Short-Term Sleep Deprivation Does Not Affect Metacognitive Monitoring Captured by Confidence Ratings: A Systematic Literature Review

    ERIC Educational Resources Information Center

    Jackson, Simon A.; Martin, Gregory D.; Aidman, Eugene; Kleitman, Sabina

    2018-01-01

    This article presents the results of a systematic review of the literature surrounding the effects that acute sleep deprivation has on metacognitive monitoring. Metacognitive monitoring refers to the ability to accurately assess one's own performance and state of knowledge. The mechanism behind this assessment is captured by subjective feelings of…

  14. A community effort towards a knowledge-base and mathematical model of the human pathogen Salmonella Typhimurium LT2

    USDA-ARS?s Scientific Manuscript database

    Metabolic reconstructions (MRs) are common denominators in systems biology and represent biochemical, genetic, and genomic (BiGG) knowledge-bases for target organisms by capturing currently available information in a consistent, structured manner. Salmonella enterica subspecies I serovar Typhimurium...

  15. Non-Contact Measurement of Thermal Diffusivity in Ion-Implanted Nuclear Materials

    NASA Astrophysics Data System (ADS)

    Hofmann, F.; Mason, D. R.; Eliason, J. K.; Maznev, A. A.; Nelson, K. A.; Dudarev, S. L.

    2015-11-01

    Knowledge of mechanical and physical property evolution due to irradiation damage is essential for the development of future fission and fusion reactors. Ion-irradiation provides an excellent proxy for studying irradiation damage, allowing high damage doses without sample activation. Limited ion-penetration-depth means that only few-micron-thick damaged layers are produced. Substantial effort has been devoted to probing the mechanical properties of these thin implanted layers. Yet, whilst key to reactor design, their thermal transport properties remain largely unexplored due to a lack of suitable measurement techniques. Here we demonstrate non-contact thermal diffusivity measurements in ion-implanted tungsten for nuclear fusion armour. Alloying with transmutation elements and the interaction of retained gas with implantation-induced defects both lead to dramatic reductions in thermal diffusivity. These changes are well captured by our modelling approaches. Our observations have important implications for the design of future fusion power plants.

  16. Flight elements: Fault detection and fault management

    NASA Technical Reports Server (NTRS)

    Lum, H.; Patterson-Hine, A.; Edge, J. T.; Lawler, D.

    1990-01-01

    Fault management for an intelligent computational system must be developed using a top-down, integrated engineering approach. The proposed approach includes integrating the overall environment involving sensors and their associated data; design knowledge capture; operations; fault detection, identification, and reconfiguration; testability; causal models, including digraph matrix analysis; and overall performance impacts on the hardware and software architecture. Implementation of the concept to achieve a real-time intelligent fault detection and management system will be accomplished through several objectives: development of fault-tolerance/FDIR requirements and specifications at the systems level, carried from conceptual design through implementation and mission operations; implementation of monitoring, diagnosis, and reconfiguration at all system levels, providing fault isolation and system integration; optimization of system operations to manage degraded system performance through system integration; and lowering of development and operations costs through the implementation of an intelligent real-time fault detection and fault management system and an information management system.
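    The digraph causal models mentioned above reduce, at their simplest, to reachability: an edge means "a fault here can propagate there", and the impact set of a failed component is everything reachable from it. The component names and edges below are hypothetical.

    ```python
    # Sketch of digraph-based fault propagation: breadth-first reachability
    # from a failed component gives the set of affected components.
    # Node names and edges are hypothetical illustrations.
    from collections import deque

    edges = {
        "sensor_A": ["voter"],
        "sensor_B": ["voter"],
        "voter": ["controller"],
        "controller": ["actuator"],
    }

    def affected(start):
        """All components downstream of (reachable from) a failed component."""
        seen, queue = set(), deque([start])
        while queue:
            node = queue.popleft()
            for nxt in edges.get(node, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return seen

    impact = affected("sensor_A")
    ```

    Digraph matrix analysis works with the adjacency-matrix form of the same graph, which also exposes common-cause paths (here, both sensors feeding the voter) useful for fault isolation.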

  17. Expert Users' Perceptions of Racing Wheelchair Design and Setup: The Knowns, Unknowns, and Next Steps.

    PubMed

    Bundon, Andrea; Mason, Barry S; Goosey-Tolfrey, Victoria L

    2017-04-01

    This paper demonstrates how a qualitative methodology can be used to gain novel insights into the demands of wheelchair racing and the impact of particular racing chair configurations on optimal sport performance via engagement with expert users (wheelchair racers, coaches, and manufacturers). We specifically explore how expert users understand how wheels, tires, and bearings impact sport performance and how they engage, implement, or reject evidence-based research pertaining to these components. We identify areas where participants perceive there to be an immediate need for more research especially pertaining to the ability to make individualized recommendations for athletes. The findings from this project speak to the value of a qualitative research design for capturing the embodied knowledge of expert users and also make suggestions for "next step" projects pertaining to wheels, tires, and bearings drawn directly from the comments of participants.

  18. Non-Contact Measurement of Thermal Diffusivity in Ion-Implanted Nuclear Materials

    DOE PAGES

    Hofmann, F.; Mason, D. R.; Eliason, J. K.; ...

    2015-11-03

Knowledge of mechanical and physical property evolution due to irradiation damage is essential for the development of future fission and fusion reactors. Ion-irradiation provides an excellent proxy for studying irradiation damage, allowing high damage doses without sample activation. Limited ion-penetration-depth means that only few-micron-thick damaged layers are produced. Substantial effort has been devoted to probing the mechanical properties of these thin implanted layers. Yet, whilst key to reactor design, their thermal transport properties remain largely unexplored due to a lack of suitable measurement techniques. Here we demonstrate non-contact thermal diffusivity measurements in ion-implanted tungsten for nuclear fusion armour. Alloying with transmutation elements and the interaction of retained gas with implantation-induced defects both lead to dramatic reductions in thermal diffusivity. These changes are well captured by our modelling approaches. Our observations have important implications for the design of future fusion power plants.

  19. Non-Contact Measurement of Thermal Diffusivity in Ion-Implanted Nuclear Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hofmann, F.; Mason, D. R.; Eliason, J. K.

Knowledge of mechanical and physical property evolution due to irradiation damage is essential for the development of future fission and fusion reactors. Ion-irradiation provides an excellent proxy for studying irradiation damage, allowing high damage doses without sample activation. Limited ion-penetration-depth means that only few-micron-thick damaged layers are produced. Substantial effort has been devoted to probing the mechanical properties of these thin implanted layers. Yet, whilst key to reactor design, their thermal transport properties remain largely unexplored due to a lack of suitable measurement techniques. Here we demonstrate non-contact thermal diffusivity measurements in ion-implanted tungsten for nuclear fusion armour. Alloying with transmutation elements and the interaction of retained gas with implantation-induced defects both lead to dramatic reductions in thermal diffusivity. These changes are well captured by our modelling approaches. Our observations have important implications for the design of future fusion power plants.

  20. Non-Contact Measurement of Thermal Diffusivity in Ion-Implanted Nuclear Materials

    PubMed Central

    Hofmann, F.; Mason, D. R.; Eliason, J. K.; Maznev, A. A.; Nelson, K. A.; Dudarev, S. L.

    2015-01-01

    Knowledge of mechanical and physical property evolution due to irradiation damage is essential for the development of future fission and fusion reactors. Ion-irradiation provides an excellent proxy for studying irradiation damage, allowing high damage doses without sample activation. Limited ion-penetration-depth means that only few-micron-thick damaged layers are produced. Substantial effort has been devoted to probing the mechanical properties of these thin implanted layers. Yet, whilst key to reactor design, their thermal transport properties remain largely unexplored due to a lack of suitable measurement techniques. Here we demonstrate non-contact thermal diffusivity measurements in ion-implanted tungsten for nuclear fusion armour. Alloying with transmutation elements and the interaction of retained gas with implantation-induced defects both lead to dramatic reductions in thermal diffusivity. These changes are well captured by our modelling approaches. Our observations have important implications for the design of future fusion power plants. PMID:26527099

  1. Designed amyloid fibers as materials for selective carbon dioxide capture

    PubMed Central

    Li, Dan; Furukawa, Hiroyasu; Deng, Hexiang; Liu, Cong; Yaghi, Omar M.; Eisenberg, David S.

    2014-01-01

    New materials capable of binding carbon dioxide are essential for addressing climate change. Here, we demonstrate that amyloids, self-assembling protein fibers, are effective for selective carbon dioxide capture. Solid-state NMR proves that amyloid fibers containing alkylamine groups reversibly bind carbon dioxide via carbamate formation. Thermodynamic and kinetic capture-and-release tests show the carbamate formation rate is fast enough to capture carbon dioxide by dynamic separation, undiminished by the presence of water, in both a natural amyloid and designed amyloids having increased carbon dioxide capacity. Heating to 100 °C regenerates the material. These results demonstrate the potential of amyloid fibers for environmental carbon dioxide capture. PMID:24367077

  2. Anodal right ventricular capture during left ventricular stimulation in CRT-implantable cardioverter defibrillators.

    PubMed

    Thibault, Bernard; Roy, Denis; Guerra, Peter G; Macle, Laurent; Dubuc, Marc; Gagné, Pierre; Greiss, Isabelle; Novak, Paul; Furlani, Aldo; Talajic, Mario

    2005-07-01

    Cardiac resynchronization therapy (CRT) has been shown to improve symptoms of patients with moderate to severe heart failure. Optimal CRT involves biventricular or left ventricular (LV) stimulation alone, atrio-ventricular (AV) delay optimization, and possibly interventricular timing adjustment. Recently, anodal capture of the right ventricle (RV) has been described for patients with CRT-pacemakers. It is unknown whether the same phenomenon exists in CRT systems associated with defibrillators (CRT-ICD). The RV leads used in these systems are different from pacemaker leads: they have a larger diameter and shocking coils, which may affect the occurrence of anodal capture. We looked for anodal RV capture during LV stimulation in 11 consecutive patients who received a CRT-ICD system with RV leads with a true bipolar design. Fifteen patients who had RV leads with an integrated design were used as controls. Anodal RV and LV thresholds were determined at pulse width (pw) durations of 0.2, 0.5, and 1.0 ms. RV anodal capture during LV pacing was found in 11/11 patients at some output with true bipolar RV leads versus 0/15 patients with RV leads with an integrated bipolar design. Anodal RV capture threshold was more affected by changes in pw duration than LV capture threshold. In CRT-ICD systems, RV leads with a true bipolar design with the proximal ring also used as the anode for LV pacing are associated with a high incidence of anodal RV capture during LV pacing. This may affect the clinical response to alternative resynchronization methods using single LV stimulation or interventricular delay programming.

  3. [INVITED] Computational intelligence for smart laser materials processing

    NASA Astrophysics Data System (ADS)

    Casalino, Giuseppe

    2018-03-01

Computational intelligence (CI) uses computer algorithms to capture hidden knowledge from data and to use it to train an "intelligent machine" to make complex decisions without human intervention. As simulation becomes more prevalent from design and planning to manufacturing and operations, laser material processing can also benefit from computer-generated knowledge through soft computing. This work reviews the state of the art in the methodology and applications of CI in laser materials processing (LMP), which is receiving increasing interest from world-class manufacturers and Industry 4.0. The focus is on methods that have proven effective and robust in solving several problems in welding, cutting, drilling, surface treating, and additive manufacturing using the laser beam. After a basic description of the computational intelligence techniques most commonly employed in manufacturing, four sections, namely laser joining, machining, surface treatment, and additive manufacturing, cover the most recent applications in the already extensive literature on CI in LMP. Finally, emerging trends and future challenges are identified and discussed.

  4. NASA Human Health and Performance Center (NHHPC)

    NASA Technical Reports Server (NTRS)

    Davis, J. R.; Richard, E. E.

    2010-01-01

The NASA Human Health and Performance Center (NHHPC) will provide a collaborative, virtual forum that integrates all disciplines of the human system to address spaceflight, aviation, and terrestrial human health and performance topics and issues. The NHHPC will serve a vital role as integrator, convening members to share information and capture a diverse knowledge base while allowing the parties to collaborate on the human health and performance topics of greatest interest to members. The Center and its member organizations will address high-priority risk reduction strategies, including research and technology development, improved medical and environmental health diagnostics and therapeutics, and state-of-the-art design approaches for human factors and habitability. Once fully established in 2011, the NHHPC will undertake a number of collaborative projects in human health and performance, including workshops, education and outreach, information sharing and knowledge management, and research and technology development, to advance the study of the human system for spaceflight and other national and international priorities.

  5. The use of automatic programming techniques for fault tolerant computing systems

    NASA Technical Reports Server (NTRS)

    Wild, C.

    1985-01-01

It is conjectured that the production of software for ultra-reliable computing systems, such as those required by the Space Station, aircraft, nuclear power plants, and the like, will require a high degree of automation as well as fault tolerance. In this paper, the relationship between automatic programming techniques and fault-tolerant computing systems is explored. Initial efforts in the automatic synthesis of code from assertions to be used for error detection, as well as the automatic generation of assertions and test cases from abstract data type specifications, are outlined. Speculation on the ability to generate truly diverse designs capable of recovery from errors by exploring alternate paths in the program synthesis tree is discussed. Some initial thoughts are given on the use of knowledge-based systems for the global detection of abnormal behavior using expectations and for the goal-directed reconfiguration of resources to meet critical mission objectives. One of the sources of information for these systems would be the knowledge captured during the automatic programming process.

  6. Development of Korean Rare Disease Knowledge Base

    PubMed Central

    Seo, Heewon; Kim, Dokyoon; Chae, Jong-Hee; Kang, Hee Gyung; Lim, Byung Chan; Cheong, Hae Il

    2012-01-01

Objectives Rare disease research requires a broad range of disease-related information for discovering the causes of genetic disorders, maladies caused by abnormalities in genes or chromosomes. The rarity of cases makes it difficult for researchers to establish a definite origin. This knowledge base will be a major resource not only for clinicians but also for the general public, who are otherwise unable to find consistent information on rare diseases in a single location. Methods We designed a compact database schema for faster querying; its structure is optimized to store heterogeneous data sources. Clinicians at Seoul National University Hospital (SNUH) then reviewed and revised those resources. Additionally, we integrated other sources to capture genomic resources and clinical trials in detail in the Korean Rare Disease Knowledge base (KRDK). Results As a result, we have developed a Web-based knowledge base, KRDK, suitable for the study of Mendelian diseases that commonly occur among Koreans. This knowledge base comprises a disease summary and review, a causal gene list, a laboratory and clinic directory, a patient registry, and so on. Furthermore, a database for analyzing and accessing human biological information and a clinical trial management system are integrated into KRDK. Conclusions We expect that KRDK, the first rare disease knowledge base in Korea, will contribute to collaborative research and serve as a reliable reference for clinical trials. Additionally, the knowledge base supports queries on drug information, so that visitors can search for rare diseases related to specific drugs. KRDK is accessible via http://www.snubi.org/software/raredisease/. PMID:23346478

  7. Intelligence by design in an entropic power grid

    NASA Astrophysics Data System (ADS)

    Negrete-Pincetic, Matias Alejandro

In this work, the term Entropic Grid is coined to describe a power grid with increased levels of uncertainty and dynamics. These new features will require reconsidering well-established paradigms in how the grid and its associated markets are planned and operated. New tools and models able to handle uncertainty and dynamics will form the scaffolding required to properly capture the behavior of the physical system, along with the value of new technologies and policies. Leveraging this knowledge will facilitate the design of new architectures for organizing power and energy systems and their associated markets. This work presents several results, tools, and models that contribute to that design objective. A central idea of this thesis is that the definition of products is critical in electricity markets. When markets are constructed with appropriate product definitions in mind, the interference between the physical and the market/financial systems seen in today's markets can be reduced. A key element of evaluating market designs is understanding the impact that the salient features of an entropic grid (uncertainty, dynamics, constraints) can have on electricity markets. Dynamic electricity market models tailored to capture such features are developed in this work. Using a multi-settlement dynamic electricity market, the impact of volatility is investigated. The results show the need to implement policies and technologies able to cope with the volatility of renewable sources. Similarly, using a dynamic electricity market model in which ramping costs are considered, the impacts of those costs on electricity markets are investigated. The key conclusion is that those additional ramping costs are, on average, not reflected in electricity prices. These results reveal several difficulties with today's real-time markets. Elements of an alternative architecture for organizing these markets are also discussed.

  8. How Do Clinicians Learn About Knowledge Translation? An Investigation of Current Web-Based Learning Opportunities

    PubMed Central

    Tieman, Jennifer J

    2017-01-01

Background Clinicians are important stakeholders in the translation of well-designed research evidence into clinical practice for optimal patient care. However, the application of knowledge translation (KT) theories and processes may present conceptual and practical challenges for clinicians. Online learning platforms are an effective means of delivering KT education, providing an interactive, time-efficient, and affordable alternative to face-to-face education programs. Objective This study investigates the availability and accessibility of online KT learning opportunities for health professionals. It also provides an analysis of the types of resources and associated disciplines retrieved by a range of KT synonyms. Methods We searched a range of bibliographic databases and the Internet (Google advanced option) using 9 KT terms to identify online KT learning resources. To be eligible, resources had to be free, aimed at clinicians, educational in intent, and interactive in design. Each term was searched using two different search engines. The details of the first 100 websites captured per browser (ie, n=200 results per term) were entered into EndNote. Each site was subsequently visited to determine its status as a learning resource. Eligible websites were appraised for quality using the AACODS (Authority, Accuracy, Coverage, Objectivity, Date, Significance) tool. Results We identified 971 unique websites via our multiple search strategies. Of these, 43 were health-related and educational in intent. Once these sites were evaluated for interactivity, a single website matched our inclusion criteria (Dementia Knowledge Translation Learning Centre). Conclusions KT is an important but complex system of processes. These processes overlap with knowledge, practice, and improvement processes that go by a range of different names. For clinicians to be informed and competent in KT, they require better access to free learning opportunities. These resources should be designed from the viewpoint of the clinician, presenting KT's multifaceted theories and processes in an engaging, interactive way. This learning should empower clinicians to contextualize and apply KT strategies within their own care settings. PMID:28705788

  9. Development of environmental impact monitoring protocol for offshore carbon capture and storage (CCS): A biological perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Hyewon, E-mail: hyewon@ldeo.columbia.edu; Kim, Yong Hoon, E-mail: Yong.Kim@rpsgroup.com; Kang, Seong-Gil, E-mail: kangsg@kriso.re.kr

Offshore geologic storage of carbon dioxide (CO{sub 2}), known as offshore carbon capture and sequestration (CCS), has been under active investigation as a safe, effective mitigation option for reducing CO{sub 2} levels from anthropogenic fossil fuel burning and climate change. Along with increasing trends in implementation plans and related logistics for offshore CCS, thorough risk assessment (i.e., environmental impact monitoring) needs to be conducted to evaluate potential risks, such as CO{sub 2} gas leakage at injection sites. Gas leaks from offshore CCS may affect the physiology of marine organisms and disrupt certain ecosystem functions, thereby posing an environmental risk. Here, we synthesize current knowledge on environmental impact monitoring of offshore CCS with an emphasis on biological aspects and provide suggestions for better practice. Based on our critical review of the preexisting literature, this paper: 1) discusses key variables sensitive to or indicative of gas leakage by summarizing physico-chemical and ecological variables measured on previous monitoring cruises for offshore CCS; 2) lists ecosystem and organism responses to environmental conditions similar to CO{sub 2} leakage and associated impacts, such as ocean acidification and hypercapnia, to predict how they may serve as responsive indicators of short- and long-term gas exposure; and 3) discusses the design of artificial gas release experiments in the field and the best model simulations to produce realistic leakage scenarios in marine ecosystems. Based on our analysis, we suggest that proper incorporation of biological aspects will provide successful and robust long-term monitoring strategies with earlier detection of gas leakage, thus reducing the risks associated with offshore CCS. - Highlights: • This paper synthesizes current knowledge on environmental impact monitoring of offshore Carbon Capture and Sequestration (CCS). • Impacts of CO{sub 2} leakage (ocean acidification, hypercapnia) on marine organisms and ecosystems are discussed. • Insights and recommendations on EIA monitoring for CCS operations are proposed, specifically from a marine ecosystem perspective.

  10. Tropical Rainfall Measuring Mission (TRMM). Phase B: Data capture facility definition study

    NASA Technical Reports Server (NTRS)

    1990-01-01

The National Aeronautics and Space Administration (NASA) and the National Space Development Agency of Japan (NASDA) initiated the Tropical Rainfall Measuring Mission (TRMM) to obtain more accurate measurements of tropical rainfall than ever before. The measurements are to improve scientific understanding and knowledge of the mechanisms affecting the intra-annual and interannual variability of the Earth's climate. The TRMM is largely dependent upon the handling and processing of the data by the TRMM Ground System supporting the mission. The objective of the TRMM is to obtain three years of climatological determinations of rainfall in the tropics, culminating in data sets of 30-day average rainfall over 5-degree square areas, and associated estimates of the vertical distribution of latent heat release. The scope of this study is limited to the functions performed by the TRMM Data Capture Facility (TDCF). These functions include capturing the TRMM spacecraft return link data stream; processing the data in the real-time, quick-look, and routine production modes, as appropriate; and distributing real-time, quick-look, and production data products to users. The following topics are addressed: (1) TRMM end-to-end system description; (2) TRMM mission operations concept; (3) baseline requirements; (4) assumptions related to mission requirements; (5) external interfaces; (6) TDCF architecture and design options; (7) critical issues and tradeoffs; and (8) recommendations for the final TDCF selection process.

  11. Codman Award Paper: self-efficacy of staff nurses for health promotion counselling of patients at risk for stroke.

    PubMed

    Mayer, Cheryl; Andrusyszyn, Mary-Anne; Iwasiw, Carroll

    2005-06-01

The effect of nurses' confidence on counseling patients at risk of stroke in selected health promotion areas (smoking cessation, exercise, and nutrition) was examined. Bandura's (1986) self-efficacy theory and Knowles' adult learning theory provided the theoretical underpinnings for the study. This was a quasi-experimental design in which neuroscience nurses (N = 23) from a quaternary hospital completed questionnaires prior to, immediately after, and 2 months after completing a self-directed learning (SDL) manual. The researcher-developed manual was designed to enhance learning about the risk factors for stroke and the importance of stroke prevention. Along with reflective activities and a pre-post test, strategies for counseling high-risk, stroke-prone individuals in the areas of smoking cessation, exercise, and nutrition were also integrated. The Health Promotion Counseling Self-Efficacy Scale (Tresolini, Saluja, and Stritter, 1995), consisting of 10 self-efficacy subscales relating to self-confidence in knowledge and ability to counsel in health promotion areas, was used to capture the nurses' self-reported self-efficacy. Using a 5-point Likert scale, nurses also rated their agreement or disagreement about health promotion counseling in practice. Overall, self-efficacy levels for both knowledge and counseling increased significantly (p < .01) from pre-completion to immediately post-completion of the manual, and decreased slightly at the two-month follow-up. This pattern was evident in all health promotion areas measured except for knowledge of exercise (p = .015). Nurses' attitudes about aspects of health promotion practices correlated significantly (p < .05) at the two-month follow-up with all health promotion areas. The results of this study support the usefulness of a self-directed learning manual as a teaching strategy for health promotion counseling of individuals at risk of stroke.

  12. Computational materials chemistry for carbon capture using porous materials

    NASA Astrophysics Data System (ADS)

    Sharma, Abhishek; Huang, Runhong; Malani, Ateeque; Babarao, Ravichandar

    2017-11-01

    Control over carbon dioxide (CO2) release is extremely important to decrease its hazardous effects on the environment such as global warming, ocean acidification, etc. For CO2 capture and storage at industrial point sources, nanoporous materials offer an energetically viable and economically feasible approach compared to chemisorption in amines. There is a growing need to design and synthesize new nanoporous materials with enhanced capability for carbon capture. Computational materials chemistry offers tools to screen and design cost-effective materials for CO2 separation and storage, and it is less time consuming compared to trial and error experimental synthesis. It also provides a guide to synthesize new materials with better properties for real world applications. In this review, we briefly highlight the various carbon capture technologies and the need of computational materials design for carbon capture. This review discusses the commonly used computational chemistry-based simulation methods for structural characterization and prediction of thermodynamic properties of adsorbed gases in porous materials. Finally, simulation studies reported on various potential porous materials, such as zeolites, porous carbon, metal organic frameworks (MOFs) and covalent organic frameworks (COFs), for CO2 capture are discussed.

  13. Intelligent systems technology infrastructure for integrated systems

    NASA Technical Reports Server (NTRS)

    Lum, Henry

    1991-01-01

    A system infrastructure must be properly designed and integrated from the conceptual development phase to accommodate evolutionary intelligent technologies. Several technology development activities were identified that may have application to rendezvous and capture systems. Optical correlators in conjunction with fuzzy logic control might be used for the identification, tracking, and capture of either cooperative or non-cooperative targets without the intensive computational requirements associated with vision processing. A hybrid digital/analog system was developed and tested with a robotic arm. An aircraft refueling application demonstration is planned within two years. Initially this demonstration will be ground based with a follow-on air based demonstration. System dependability measurement and modeling techniques are being developed for fault management applications. This involves usage of incremental solution/evaluation techniques and modularized systems to facilitate reuse and to take advantage of natural partitions in system models. Though not yet commercially available and currently subject to accuracy limitations, technology is being developed to perform optical matrix operations to enhance computational speed. Optical terrain recognition using camera image sequencing processed with optical correlators is being developed to determine position and velocity in support of lander guidance. The system is planned for testing in conjunction with Dryden Flight Research Facility. Advanced architecture technology is defining open architecture design constraints, test bed concepts (processors, multiple hardware/software and multi-dimensional user support, knowledge/tool sharing infrastructure), and software engineering interface issues.

  14. Capturing domain knowledge from multiple sources: the rare bone disorders use case.

    PubMed

    Groza, Tudor; Tudorache, Tania; Robinson, Peter N; Zankl, Andreas

    2015-01-01

    Lately, ontologies have become a fundamental building block in the process of formalising and storing complex biomedical information. The community-driven ontology curation process, however, ignores the possibility of multiple communities building, in parallel, conceptualisations of the same domain, and thus providing slightly different perspectives on the same knowledge. The individual nature of this effort leads to the need of a mechanism to enable us to create an overarching and comprehensive overview of the different perspectives on the domain knowledge. We introduce an approach that enables the loose integration of knowledge emerging from diverse sources under a single coherent interoperable resource. To accurately track the original knowledge statements, we record the provenance at very granular levels. We exemplify the approach in the rare bone disorders domain by proposing the Rare Bone Disorders Ontology (RBDO). Using RBDO, researchers are able to answer queries, such as: "What phenotypes describe a particular disorder and are common to all sources?" or to understand similarities between disorders based on divergent groupings (classifications) provided by the underlying sources. RBDO is available at http://purl.org/skeletome/rbdo. In order to support lightweight query and integration, the knowledge captured by RBDO has also been made available as a SPARQL Endpoint at http://bio-lark.org/se_skeldys.html.

  15. Representations of everyday life: a proposal for capturing social values from the Marxist perspective of knowledge production.

    PubMed

    Soares, Cássia Baldini; Santos, Vilmar Ezequiel Dos; Campos, Célia Maria Sivalli; Lachtim, Sheila Aparecida Ferreira; Campos, Fernanda Cristina

    2011-12-01

From the Marxist perspective on the construction of knowledge, we propose a theoretical and methodological framework for understanding social values by capturing everyday representations. We assume that scientific research brings together different dimensions (epistemological, theoretical, and methodological) and, consistent with these, proposes a set of operating procedures and techniques for capturing and analyzing the reality under study in order to expose the investigated object. The study of values reveals how essential they are to the formation of judgments and choices: some values reflect the dominant ideology, spanning all social classes, while others reflect class interests; the latter are not universal, but are formed in social relationships and activities. Basing ourselves on the Marxist theory of consciousness, representations are discursive formulations of everyday life (opinions or convictions) issued by subjects about their reality, and are a coherent way of understanding and exposing social values: focus groups prove suitable for grasping opinions, while interviews show potential for exposing convictions.

  16. Stratospheric controlled perturbation experiment: a small-scale experiment to improve understanding of the risks of solar geoengineering

    PubMed Central

    Dykema, John A.; Keith, David W.; Anderson, James G.; Weisenstein, Debra

    2014-01-01

    Although solar radiation management (SRM) through stratospheric aerosol methods has the potential to mitigate impacts of climate change, our current knowledge of stratospheric processes suggests that these methods may entail significant risks. In addition to the risks associated with current knowledge, the possibility of ‘unknown unknowns’ exists that could significantly alter the risk assessment relative to our current understanding. While laboratory experimentation can improve the current state of knowledge and atmospheric models can assess large-scale climate response, they cannot capture possible unknown chemistry or represent the full range of interactive atmospheric chemical physics. Small-scale, in situ experimentation under well-regulated circumstances can begin to remove some of these uncertainties. This experiment—provisionally titled the stratospheric controlled perturbation experiment—is under development and will only proceed with transparent and predominantly governmental funding and independent risk assessment. We describe the scientific and technical foundation for performing, under external oversight, small-scale experiments to quantify the risks posed by SRM to activation of halogen species and subsequent erosion of stratospheric ozone. The paper's scope includes selection of the measurement platform, relevant aspects of stratospheric meteorology, operational considerations and instrument design and engineering. PMID:25404681

  17. Stratospheric controlled perturbation experiment: a small-scale experiment to improve understanding of the risks of solar geoengineering.

    PubMed

    Dykema, John A; Keith, David W; Anderson, James G; Weisenstein, Debra

    2014-12-28

    Although solar radiation management (SRM) through stratospheric aerosol methods has the potential to mitigate impacts of climate change, our current knowledge of stratospheric processes suggests that these methods may entail significant risks. In addition to the risks associated with current knowledge, the possibility of 'unknown unknowns' exists that could significantly alter the risk assessment relative to our current understanding. While laboratory experimentation can improve the current state of knowledge and atmospheric models can assess large-scale climate response, they cannot capture possible unknown chemistry or represent the full range of interactive atmospheric chemical physics. Small-scale, in situ experimentation under well-regulated circumstances can begin to remove some of these uncertainties. This experiment-provisionally titled the stratospheric controlled perturbation experiment-is under development and will only proceed with transparent and predominantly governmental funding and independent risk assessment. We describe the scientific and technical foundation for performing, under external oversight, small-scale experiments to quantify the risks posed by SRM to activation of halogen species and subsequent erosion of stratospheric ozone. The paper's scope includes selection of the measurement platform, relevant aspects of stratospheric meteorology, operational considerations and instrument design and engineering.

  18. Specification, Design, and Analysis of Advanced HUMS Architectures

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi

    2004-01-01

    During the two-year project period, we worked on several aspects of domain-specific architectures for HUMS. In particular, we adopted a scenario-based approach for the design and defined a language for describing such architectures; the language is now used in all aspects of our HUMS design. We have made contributions in the following areas. 1) We employed scenarios in the development of HUMS in three main areas: (a) to improve reusability by using scenarios as a library indexing tool and as a domain analysis tool; (b) to improve maintainability by recording design rationales from two perspectives, the problem domain and the solution domain; (c) to evaluate the software architecture. 2) We defined a new architectural language called HADL, the HUMS Architectural Definition Language. It is a customized version of xArch/xADL. Because it is based on XML, it is easily portable from domain to domain, application to application, and machine to machine. Specifications written in HADL can be read and parsed using currently available XML parsers, so there is no need to develop a plethora of supporting software for HADL. 3) We developed an automated design process that involves two main techniques: (a) selection of solutions from a large space of designs; (b) synthesis of designs. The automation is not a purely Artificial Intelligence (AI) approach, although it uses a knowledge-based system that captures a specific HUMS domain; the process uses a database of solutions as an aid to solving problems rather than creating a new design in the literal sense. Since searching is the main technique, the challenges are: (a) to minimize the effort of searching a database in which a very large number of possibilities exist; (b) to develop representations that conveniently depict design knowledge that has evolved over many years; (c) to capture the information required to aid the automation process.
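Since HADL is XML-based, its specifications can be read with any standard XML parser, as the record notes. The sketch below illustrates this with Python's standard library; the element and attribute names are invented for illustration, since the record does not give the actual HADL schema.

```python
# Hypothetical sketch: parsing a HADL-style (xADL-like) architecture
# description with Python's standard XML parser. The element and
# attribute names are illustrative assumptions, not the real HADL schema.
import xml.etree.ElementTree as ET

HADL_SPEC = """
<humsArchitecture name="EngineMonitor">
  <component id="sensorBus" type="acquisition"/>
  <component id="diagnoser" type="reasoning"/>
  <connector source="sensorBus" target="diagnoser"/>
</humsArchitecture>
"""

def list_components(xml_text):
    """Return (id, type) pairs for every component in the spec."""
    root = ET.fromstring(xml_text)
    return [(c.get("id"), c.get("type")) for c in root.iter("component")]

print(list_components(HADL_SPEC))
# [('sensorBus', 'acquisition'), ('diagnoser', 'reasoning')]
```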

  19. How do we Remain Us in a Time of Change: Culture and Knowledge Management at NASA

    NASA Technical Reports Server (NTRS)

    Linde, Charlotte

    2003-01-01

    This viewgraph presentation provides an overview of the findings of a NASA agency-wide Knowledge Management Team considering culture and knowledge management issues at the agency. Specific issues identified by the team include: (1) NASA must move from a knowledge-hoarding culture to a knowledge-sharing culture; (2) NASA must move from being center focused to being Agency focused; (3) NASA must capture the knowledge of a departing workforce. Topics considered include: what NASA must know to remain NASA, what the previous forms of knowledge reproduction were and how technological innovations have changed these systems, and how changes in funding and in relationships between contractors and NASA have affected knowledge reproduction.

  20. A case of malignant hyperthermia captured by an anesthesia information management system.

    PubMed

    Maile, Michael D; Patel, Rajesh A; Blum, James M; Tremper, Kevin K

    2011-04-01

    Many cases of malignant hyperthermia triggered by volatile anesthetic agents have been described. However, to our knowledge, there has not been a report describing the precise changes in physiologic data of a human suffering from this process. Here we describe a case of malignant hyperthermia in which monitoring information was frequently and accurately captured by an anesthesia information management system.

  1. Connecting Provenance with Semantic Descriptions in the NASA Earth Exchange (NEX)

    NASA Astrophysics Data System (ADS)

    Votava, P.; Michaelis, A.; Nemani, R. R.

    2012-12-01

    NASA Earth Exchange (NEX) is a data, modeling and knowledge collaboratory that houses NASA satellite data, climate data and ancillary data, where a focused community may come together to share modeling and analysis codes, scientific results, knowledge and expertise on a centralized platform. Among the main goals of NEX are transparency and repeatability, and to that end we have been adding components that enable tracking of provenance for both scientific processes and the datasets they produce. As scientific processes become more complex, they are often developed collaboratively, and it becomes increasingly important for the research team to be able to track the development of the process and the datasets produced along the way. Additionally, we want to link the processes and datasets developed on NEX to existing information and knowledge, so that users can query and compare the provenance of any dataset or process with regard to component-specific attributes such as data quality, geographic location, related publications, user comments and annotations, etc. We have developed several ontologies that describe datasets and workflow components available on NEX using the OWL ontology language, as well as a simple ontology that provides a linking mechanism to the collected provenance information. The provenance is captured in two ways: we utilize the existing provenance infrastructure of VisTrails, which is used as a workflow engine on NEX, and we extend the captured provenance using the PROV data model expressed through the PROV-O ontology. We do this in order to link and query the provenance more easily in the context of existing NEX information and knowledge. The captured provenance graph is processed and stored using RDFlib with a MySQL backend, and can be queried using either RDFLib or SPARQL. As a concrete example, we show how this information is captured during an anomaly detection process in large satellite datasets.
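The provenance store described above links entities to the activities that generated them (the PROV pattern) and supports pattern-style queries. As a minimal illustration, the sketch below mimics that pattern with a plain Python triple store standing in for the RDFlib/MySQL/SPARQL stack; all dataset and activity identifiers are invented.

```python
# Minimal stand-in for the PROV-O provenance pattern described in the
# record (entities generated by activities), using a plain Python
# triple store instead of RDFlib/SPARQL. The nex:* identifiers are
# illustrative, not real NEX names.
PROV = "http://www.w3.org/ns/prov#"

triples = {
    ("nex:anomaly_map_2012", PROV + "wasGeneratedBy", "nex:anomaly_detection_run"),
    ("nex:anomaly_detection_run", PROV + "used", "nex:modis_ndvi_tiles"),
}

def query(s=None, p=None, o=None):
    """Return triples matching a (subject, predicate, object) pattern;
    None acts as a wildcard, like a SPARQL variable."""
    return [(ts, tp, to) for ts, tp, to in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

# Which activity produced the anomaly map?
print(query(s="nex:anomaly_map_2012", p=PROV + "wasGeneratedBy"))
```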

  2. Electronic Portfolios as Capstone Experiences in a Graduate Program in Organizational Leadership

    ERIC Educational Resources Information Center

    Goertzen, Brent J.; McRay, Jeni; Klaus, Kaley

    2016-01-01

    Assessment of student learning in graduate education often takes the form of a summative measure by way of written comprehensive exams. However, written examinations, while suitable for evaluating cognitive knowledge, may not fully capture students' abilities to transfer and apply leadership related knowledge and skills into real-world practice.…

  3. ScienceDesk Project Overview

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Norvig, Peter (Technical Monitor)

    2000-01-01

    NASA's ScienceDesk Project at the Ames Research Center is responsible for scientific knowledge management, which includes ensuring the capture, preservation, and traceability of scientific knowledge. Other responsibilities include: 1) maintaining uniform information access, achieved through intelligent indexing and visualization; 2) supporting both asynchronous and synchronous science teamwork; 3) monitoring and controlling semi-autonomous remote experimentation.

  4. From Data to Knowledge – Promising Analytical Tools and Techniques for Capture and Reuse of Corporate Knowledge and to Aid in the State Evaluation Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Danielson, Gary R.; Augustenborg, Elsa C.; Beck, Andrew E.

    2010-10-29

    The IAEA is challenged with limited availability of human resources for inspection and data analysis while proliferation threats increase. PNNL has a variety of IT solutions and techniques (at varying levels of maturity and development) that take raw data closer to useful knowledge, thereby assisting with and standardizing the analytical processes. This paper highlights some PNNL tools and techniques which are applicable to the international safeguards community, including: • Intelligent in-situ triage of data prior to reliable transmission to an analysis center, resulting in the transmission of smaller and more relevant data sets • Capture of expert knowledge in re-usable search strings tailored to specific mission outcomes • Image-based searching fused with text-based searching • Use of gaming to discover unexpected proliferation scenarios • Process modeling (e.g. Physical Model) as the basis for an information integration portal, which links to data storage locations along with analyst annotations, categorizations, geographic data, search strings and visualization outputs.

  5. A top-level ontology of functions and its application in the Open Biomedical Ontologies.

    PubMed

    Burek, Patryk; Hoehndorf, Robert; Loebe, Frank; Visagie, Johann; Herre, Heinrich; Kelso, Janet

    2006-07-15

    A clear understanding of functions in biology is a key component in accurate modelling of molecular, cellular and organismal biology. Using the existing biomedical ontologies it has been impossible to capture the complexity of the community's knowledge about biological functions. We present here a top-level ontological framework for representing knowledge about biological functions. This framework lends greater accuracy, power and expressiveness to biomedical ontologies by providing a means to capture existing functional knowledge in a more formal manner. An initial major application of the ontology of functions is the provision of a principled way in which to curate functional knowledge and annotations in biomedical ontologies. Further potential applications include the facilitation of ontology interoperability and automated reasoning. A major advantage of the proposed implementation is that it is an extension to existing biomedical ontologies, and can be applied without substantial changes to these domain ontologies. The Ontology of Functions (OF) can be downloaded in OWL format from http://onto.eva.mpg.de/. Additionally, a UML profile and supplementary information and guides for using the OF can be accessed from the same website.

  6. A process for capturing CO2 from the atmosphere

    DOE PAGES

    Keith, David W.; Holmes, Geoffrey; St. Angelo, David; ...

    2018-06-07

    Here, we describe a process for capturing CO2 from the atmosphere in an industrial plant. The design captures ~1 Mt-CO2/year in a continuous process using an aqueous KOH sorbent coupled to a calcium caustic recovery loop. We describe the design rationale, summarize performance of the major unit operations, and provide a capital cost breakdown developed with an independent consulting engineering firm. We report results from a pilot plant which provides data on performance of the major unit operations. We summarize the energy and material balance computed using an Aspen process simulation. When CO2 is delivered at 15 MPa, the design requires either 8.81 GJ of natural gas, or 5.25 GJ of gas and 366 kWh of electricity, per ton of CO2 captured. Depending on financial assumptions, energy costs, and the specific choice of inputs and outputs, the levelized cost per ton of CO2 captured from the atmosphere ranges from 94 to 232 $/t-CO2.
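The gas-only energy figure quoted in the record (8.81 GJ of natural gas per ton of CO2 captured) can be turned into an energy-cost component with simple arithmetic. A minimal sketch; the gas price is an assumed illustrative value, not a number from the paper.

```python
# Back-of-envelope energy-cost component of direct air capture,
# using the gas-only figure from the record. The gas price is an
# assumption for illustration only.
GJ_GAS_PER_TON = 8.81        # GJ natural gas per t-CO2 (CO2 delivered at 15 MPa)
GAS_PRICE_USD_PER_GJ = 3.5   # assumed illustrative gas price

energy_cost_per_ton = GJ_GAS_PER_TON * GAS_PRICE_USD_PER_GJ
print(f"gas energy cost: ${energy_cost_per_ton:.2f}/t-CO2")
```

At this assumed gas price, energy alone contributes roughly $31/t-CO2, a fraction of the 94 to 232 $/t-CO2 levelized-cost range quoted above, consistent with capital and other costs dominating.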

  7. Tilted pillar array fabrication by the combination of proton beam writing and soft lithography for microfluidic cell capture: Part 1 Design and feasibility.

    PubMed

    Rajta, Istvan; Huszánk, Robert; Szabó, Atilla T T; Nagy, Gyula U L; Szilasi, Szabolcs; Fürjes, Peter; Holczer, Eszter; Fekete, Zoltan; Járvás, Gabor; Szigeti, Marton; Hajba, Laszlo; Bodnár, Judit; Guttman, Andras

    2016-02-01

    Design, fabrication, integration, and feasibility test results of a novel microfluidic cell capture device are presented, exploiting the advantages of proton beam writing to make lithographic irradiations under multiple target tilting angles and of UV lithography to easily reproduce large-area structures. A cell capture device is demonstrated with a unique doubly tilted micropillar array design for cell manipulation in microfluidic applications. Tilting the pillars increased their functional surface and therefore enhanced fluidic interaction when a special bioaffinity coating was used, and improved fluid dynamic behavior during cell culture injection. The proposed microstructures were capable of supporting adequate distribution of body fluids, such as blood and spinal fluid, between the inlet and outlet of the microfluidic sample reservoirs, offering advanced cell capture capability on the functionalized surfaces. The hydrodynamic characteristics of the microfluidic systems were tested for efficient capture with yeast cells (similar in size to red blood cells). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. A process for capturing CO2 from the atmosphere

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keith, David W.; Holmes, Geoffrey; St. Angelo, David

    Here, we describe a process for capturing CO2 from the atmosphere in an industrial plant. The design captures ~1 Mt-CO2/year in a continuous process using an aqueous KOH sorbent coupled to a calcium caustic recovery loop. We describe the design rationale, summarize performance of the major unit operations, and provide a capital cost breakdown developed with an independent consulting engineering firm. We report results from a pilot plant which provides data on performance of the major unit operations. We summarize the energy and material balance computed using an Aspen process simulation. When CO2 is delivered at 15 MPa, the design requires either 8.81 GJ of natural gas, or 5.25 GJ of gas and 366 kWh of electricity, per ton of CO2 captured. Depending on financial assumptions, energy costs, and the specific choice of inputs and outputs, the levelized cost per ton of CO2 captured from the atmosphere ranges from 94 to 232 $/t-CO2.

  9. Solar Electric Propulsion Triple-Satellite-Aided Capture With Mars Flyby

    NASA Astrophysics Data System (ADS)

    Patrick, Sean

    Triple-satellite-aided capture sequences use gravity assists at three of Jupiter's four massive Galilean moons to reduce the DeltaV required to enter Jupiter orbit. A triple-satellite-aided capture at Callisto, Ganymede, and Io is proposed to capture a SEP spacecraft into Jupiter orbit from an interplanetary Earth-Jupiter trajectory that employs low-thrust maneuvers. The principal advantage of this method is that it combines the specific-impulse (Isp) efficiency of ion propulsion with nearly impulsive but propellant-free gravity assists. In this thesis, two main chapters are devoted to the exploration of low-thrust triple-flyby capture trajectories, with heavy emphasis on their design and optimization. The first chapter explores the design of two solar electric propulsion (SEP) low-thrust trajectories developed using JPL's MALTO software. Together, the two trajectories represent a full Earth-to-Jupiter capture, split into a heliocentric trajectory from Earth to Jupiter's sphere of influence (SOI) and a Joviocentric capture trajectory. The Joviocentric trajectory makes use of gravity-assist flybys of Callisto, Ganymede, and Io to capture into Jupiter orbit with a period of 106.3 days. In chapter two, three more SEP low-thrust trajectories were developed based upon those in chapter one. These trajectories, devised using the high-fidelity Mystic software, also developed by JPL, improve upon the originals; each is a separate, full Earth-to-Jupiter capture. As in chapter one, a Mars gravity assist is used to augment the heliocentric trajectories, and gravity-assist flybys of Callisto, Ganymede, and Io or Europa are used to capture into Jupiter orbit. With periods between 89.8 and 137.2 days, the orbits developed in chapters one and two are shorter than most Jupiter capture orbits achieved using low-thrust propulsion techniques.
Finally, chapter 3 presents an original trajectory design for a Very-Long-Baseline Interferometry (VLBI) satellite constellation. The design was created for the 8th Global Trajectory Optimization Competition (GTOC8), in which participants are tasked with creating and optimizing low-thrust trajectories to place a series of three spacecraft into formation to map given radio sources.

  10. Verification and Validation of Digitally Upgraded Control Rooms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, Ronald; Lau, Nathan

    2015-09-01

    As nuclear power plants undertake main control room modernization, a challenge is the lack of a clearly defined human factors process to follow. Verification and validation (V&V) as applied in the nuclear power community has tended to involve efforts such as integrated system validation, which comes at the tail end of the design stage. To fill in guidance gaps and create a step-by-step process for control room modernization, we have developed the Guideline for Operational Nuclear Usability and Knowledge Elicitation (GONUKE). This approach builds on best practices in the software industry, which prescribe an iterative, user-centered approach featuring multiple cycles of design and evaluation. Nuclear regulatory guidance for control room design emphasizes summative evaluation, which occurs after the design is complete. In the GONUKE approach, evaluation is also performed at the formative stage, early in the design cycle, using mockups and prototypes. The evaluation may involve expert review (e.g., software heuristic evaluation at the formative stage and design verification against human factors standards like NUREG-0700 at the summative stage). It may also involve user testing (e.g., usability testing at the formative stage and integrated system validation at the summative stage). An additional, often overlooked component of evaluation is knowledge elicitation, which captures operator insights into the system. In this report we outline these evaluation types across the design phases that support the overall modernization process. The objective is to provide industry-suitable guidance for the steps to be taken in support of the design and evaluation of a new human-machine interface (HMI) in the control room. We suggest the value of early-stage V&V and highlight how it can help improve the design process for control room modernization. We argue that there is a need to overcome two shortcomings of V&V in current practice: the propensity for late-stage V&V and the use of increasingly complex psychological assessment measures for V&V.

  11. Direction discovery: A science enrichment program for high school students.

    PubMed

    Sikes, Suzanne S; Schwartz-Bloom, Rochelle D

    2009-03-01

    Launch into education about pharmacology (LEAP) is an inquiry-based science enrichment program designed to enhance competence in biology and chemistry and foster interest in science careers especially among under-represented minorities. The study of how drugs work, how they enter cells, alter body chemistry, and exit the body engages students to conceptualize fundamental precepts in biology, chemistry, and math. Students complete an intensive three-week course in the fundamentals of pharmacology during the summer followed by a mentored research component during the school year. Following a 5E learning paradigm, the summer course captures student interest by introducing controversial topics in pharmacology and provides a framework that guides them to explore topics in greater detail. The 5E learning cycle is recapitulated as students extend their knowledge to design and to test an original research question in pharmacology. LEAP students demonstrated significant gains in biology and chemistry knowledge and interests in pursuing science. Several students earned honors for the presentation of their research in regional and state science fairs. Success of the LEAP model in its initial 2 years argues that coupling college-level coursework of interest to teens with an authentic research experience enhances high school student success in and enthusiasm for science. Copyright © 2009 International Union of Biochemistry and Molecular Biology, Inc.

  12. Promoting innovation in pediatric nutrition.

    PubMed

    Bier, Dennis M

    2010-01-01

    Truly impactful innovation can only be recognized in retrospect. Moreover, almost by definition, developing algorithmic paths or roadmaps for innovation is likely to be unsuccessful because innovators do not generally follow established routes. Nonetheless, environments can be established within Departments of Pediatrics that promote innovative thinking. The environmental factors necessary to do so include: (1) demand that academic Pediatrics Departments function in an aggressively scholarly mode; (2) capture the most fundamental science in postnatal developmental biology; (3) focus education and training on the boundaries of our knowledge, rather than the almost exclusive attention to what we think we already know; (4) devote mentoring, time and resources to only the most compelling unanswered questions in the pediatric sciences, including nutrition; (5) accept only systematic, evidence-based answers to clinical questions; (6) if systematic, evidence-based data are not available, design the proper studies to get them; (7) prize questioning the answers to further move beyond the knowledge limit; (8) support the principle that experiments in children will be required to convincingly answer clinical questions important to children; and (9) establish the multicenter resources in pediatric scientist training, clinical study design and implementation, and laboratory and instrument technologies required to answer today's questions with tomorrow's methods. Copyright © 2010 S. Karger AG, Basel.

  13. Do different attention capture paradigms measure different types of capture?

    PubMed

    Roque, Nelson A; Wright, Timothy J; Boot, Walter R

    2016-10-01

    When something captures our attention, why does it do so? This topic has been hotly debated, with some arguing that attention is captured only by salient stimuli (bottom-up view) and others arguing that capture is always due to a match between a stimulus and our goals (top-down view). Many different paradigms have provided evidence for one view or the other. If either of these strong views is correct, then capture represents a unitary phenomenon, and there should be a high correlation between capture in these paradigms. But if there are different types of capture (top-down, bottom-up), then some attention capture effects should be correlated and some should not. In 2 studies, we collected data from several paradigms used in support of claims of top-down and bottom-up capture in relatively large samples of participants. Contrary to either prediction, measures of capture were not strongly correlated. Results suggest that capture may in fact be strongly determined by idiosyncratic task demands and strategies. Relevant to this lack of relations among tasks, we observed that classic measures of attention capture demonstrated low reliability, especially among measures used to support bottom-up capture. Implications of the low reliability of capture measures are discussed. We also observed that the proportion of participants demonstrating a pattern of responses consistent with capture varied widely among classic measures of capture. Overall, results demonstrate that, even for relatively simple laboratory measures of attention, there are still important gaps in knowledge regarding what these paradigms measure and how they are related.
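The cross-paradigm analysis described above reduces to correlating per-participant capture scores (e.g., response-time costs) between tasks. A minimal sketch with made-up scores, not the study's data:

```python
# Illustrative sketch of correlating capture-effect scores across two
# paradigms. The score lists are invented numbers for demonstration.
def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

paradigm_a = [12, 35, 8, 40, 22, 15]   # per-participant capture cost (ms), task A
paradigm_b = [30, 5, 28, 11, 19, 33]   # per-participant capture cost (ms), task B

print(round(pearson_r(paradigm_a, paradigm_b), 3))
```

A correlation near zero (or, as in this toy example, strongly negative) across participants would argue against a single unitary capture mechanism, which is the logic the study applies.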

  14. Designing Ionic Liquids for CO2 Capture: What’s the role for computation?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brennecke, Joan F.

    Presentation on the computational aspects of ionic liquid selection for carbon dioxide capture, delivered to attendees of New Vistas in Molecular Thermodynamics: Experimentation, Molecular Modeling, and Inverse Design, Berkeley, CA, January 7-9, 2018.

  15. Automatic programming of arc welding robots

    NASA Astrophysics Data System (ADS)

    Padmanabhan, Srikanth

    Automatic programming of arc welding robots requires the geometric description of a part from a solid modeling system, expert weld process knowledge, and the kinematic arrangement of the robot and positioner. Current commercial solid modelers are incapable of storing explicit product and process definitions of weld features. This work presents a paradigm for a computer-aided engineering environment that supports complete weld feature information in a solid model, and for an automatic programming system for robotic arc welding. In the first part, welding features are treated as properties or attributes of an object, features which are portions of the object surface, i.e., the topological boundary. The structure for representing the features and attributes is a graph called the Welding Attribute Graph (WAGRAPH). The method associates appropriate weld features with geometric primitives, adds welding attributes, and checks the validity of welding specifications. A systematic structure is provided to incorporate welding attributes and coordinate system information in a CSG tree. The specific implementation of this structure using a hybrid solid modeler (IDEAS) and an object-oriented programming paradigm is described. The second part provides a comprehensive methodology for acquiring and representing the weld process knowledge required for the proper selection of welding schedules. A methodology of knowledge acquisition using statistical methods is proposed. It is shown that these procedures did little to capture the private knowledge of experts (heuristics), but helped in determining general dependencies and trends. A need was established for building the knowledge-based system from handbook knowledge and allowing experts to extend the system; a methodology to check the consistency and validity of such knowledge additions is proposed. A mapping shell designed to transform design features into application-specific weld process schedules is described. In the final section, a new approach using fixed-path modified continuation methods is proposed for continuously planning the trajectory of weld seams in an integrated welding robot and positioner environment. The joint displacement, velocity, and acceleration histories along the path, as a function of the path parameter for the best possible welding condition, are provided for the robot and the positioner to track the various paths normally encountered in arc welding.

  16. Image processing system design for microcantilever-based optical readout infrared arrays

    NASA Astrophysics Data System (ADS)

    Tong, Qiang; Dong, Liquan; Zhao, Yuejin; Gong, Cheng; Liu, Xiaohua; Yu, Xiaomei; Yang, Lei; Liu, Weiyu

    2012-12-01

    Compared with traditional infrared imaging technology, the new optical-readout uncooled infrared imaging technology based on MEMS has many advantages, such as low cost, small size, and simple fabrication. In addition, theory predicts high thermal detection sensitivity for the technology, so it has very broad application prospects in the field of high-performance infrared detection. The paper focuses on an image capturing and processing system for this technology. The system consists of software and hardware. We build the core image-processing hardware platform on TI's high-performance DSP chip, the TMS320DM642, and design the image capturing board around the MT9P031, Micron's high-frame-rate, low-power-consumption CMOS sensor. Finally, we use Intel's network transceiver device, the LXT971A, to design the network output board. The software system is built on the real-time operating system DSP/BIOS. We design the video capture driver based on TI's class-mini driver model and the network output program based on the NDK kit, covering image capture, processing, and transmission. Experiments show that the system achieves high capture resolution and fast processing speed; network transmission speed reaches 100 Mbps.

  17. Sampling designs matching species biology produce accurate and affordable abundance indices

    PubMed Central

    Farley, Sean; Russell, Gareth J.; Butler, Matthew J.; Selinger, Jeff

    2013-01-01

    Wildlife biologists often use grid-based designs to sample animals and generate abundance estimates. Although sampling in grids is theoretically sound, in application the method can be logistically difficult and expensive when sampling elusive species inhabiting extensive areas. These factors make it challenging to sample animals and meet the statistical assumption that all individuals have an equal probability of capture. Violating this assumption biases results. Does an alternative exist? Perhaps sampling only where resources attract animals (i.e., targeted sampling) would provide accurate abundance estimates more efficiently and affordably. However, biases from this approach would also arise if individuals have an unequal probability of capture, especially if some failed to visit the sampling area. Since most biological programs are resource limited, and acquiring abundance data drives many conservation and management applications, it becomes imperative to identify economical and informative sampling designs. Therefore, we evaluated abundance estimates generated from grid and targeted sampling designs using simulations based on global positioning system (GPS) data from 42 Alaskan brown bears (Ursus arctos). Migratory salmon drew brown bears from the wider landscape, concentrating them at anadromous streams; this provided a scenario for testing the targeted approach. Grid and targeted sampling varied by number of traps and location (traps placed randomly, systematically, or by expert opinion), and by whether traps were stationary or moved between capture sessions. We began by identifying when to sample and whether bears had equal probability of capture. We compared abundance estimates against seven criteria: bias, precision, accuracy, effort, plus encounter rates and probabilities of capture and recapture. One grid (49 km2 cells) and one targeted configuration provided the most accurate results.
Both placed traps by expert opinion and moved traps between capture sessions, which raised capture probabilities. The grid design was least biased (−10.5%) but imprecise (CV 21.2%), and used the most effort (16,100 trap-nights). The targeted configuration was more biased (−17.3%) but most precise (CV 12.3%), with the least effort (7,000 trap-nights). Targeted sampling generated encounter rates four times higher, and capture and recapture probabilities 11% and 60% higher, than grid sampling, in a sampling frame 88% smaller. Bears had unequal probability of capture with both sampling designs, partly because some bears never had traps available to sample them. Hence, grid and targeted sampling generated abundance indices, not estimates. Overall, targeted sampling provided the most accurate and affordable design to index abundance. Targeted sampling may offer an alternative method to index the abundance of other species inhabiting expansive and inaccessible landscapes elsewhere, provided they are attracted to resource concentrations. PMID:24392290

  18. Clustering of host-seeking activity of Anopheles gambiae mosquitoes at the top surface of a human-baited bed net

    PubMed Central

    2013-01-01

    Background Knowledge of the interactions between mosquitoes and humans, and how vector control interventions affect them, is sparse. A study exploring host-seeking behaviour at a human-occupied bed net, a key event in such interactions, is reported here. Methods Host-seeking female Anopheles gambiae activity was studied using a human-baited ‘sticky-net’ (a bed net without insecticide, coated with non-setting adhesive) to trap mosquitoes. The numbers and distribution of mosquitoes captured on each surface of the bed net were recorded and analysed using non-parametric statistical methods and random effects regression analysis. To confirm sticky-net reliability, the experiment was repeated using a pitched sticky-net (tilted sides converging at the apex, i.e., neither horizontal nor vertical). The capture efficiency of horizontal and vertical sticky surfaces was compared, and the potential repellency of the adhesive was investigated. Results In a semi-field experiment, more mosquitoes were caught on the top (74-87%) than on the sides of the net (p < 0.001). In laboratory experiments, more mosquitoes were caught on the top than on the sides in human-baited tests (p < 0.001), significantly different from unbaited controls (p < 0.001), where most mosquitoes were on the sides (p = 0.047). In both experiments, approximately 70% of mosquitoes captured on the top surface were clustered within a 90 × 90 cm (or smaller) area directly above the head and chest (p < 0.001). In pitched net tests, similar clustering occurred over the sleeper’s head and chest in baited tests only (p < 0.001). Capture rates at horizontal and vertical surfaces were not significantly different, and the sticky-net was not repellent. Conclusion This study demonstrated that An. gambiae activity occurs predominantly within a limited area of the top surface of bed nets. The results provide support for the two-in-one bed net design for managing pyrethroid-resistant vector populations. 
Further exploration of vector behaviour at the bed net interface could contribute to additional improvements in insecticide-treated bed net design or the development of novel vector control tools. PMID:23902661

  19. Knowledge Management in healthcare libraries: the current picture.

    PubMed

    Hopkins, Emily

    2017-06-01

    Knowledge management has seen something of a resurgence in attention amongst health librarians recently. Of course it has never ceased to exist, but now many library staff are becoming more involved in organisational knowledge management, and positioning themselves as key players in the sphere. No single model of knowledge management predominates; instead, organisations adopt approaches that best fit their size, structure and culture, blending evidence-based practice with knowledge sharing. Whatever it is called and whatever models are used, it is clear that for librarians and information professionals, putting knowledge and evidence into practice, sharing knowledge well and capturing it effectively remain central to what we do. © 2017 Health Libraries Group.

  20. Intelligent nursing: accounting for knowledge as action in practice.

    PubMed

    Purkis, Mary E; Bjornsdottir, Kristin

    2006-10-01

    This paper provides an analysis of nursing as a knowledgeable discipline. We examined ways in which knowledge operates in the practice of home care nursing and explored how knowledge might be fruitfully understood within the ambiguous spaces and competing temporalities characterizing contemporary healthcare services. Two popular metaphors of knowledge in nursing practice were identified and critically examined: evidence-based practice and the nurse as an intuitive worker. Pointing to faults in these conceptualizations, we suggest a different way of conceptualizing the relationship between knowledge and practice, namely practice as being activated by contextualized knowledge. This conceptualization is captured in an understanding of the intelligent creation of context by the nurse for nursing practice to be ethical and effective.

  1. Knowledge Value Creation Characteristics of Virtual Teams: A Case Study in the Construction Sector

    NASA Astrophysics Data System (ADS)

    Vorakulpipat, Chalee; Rezgui, Yacine

    Any knowledge environment aimed at virtual teams should promote identification, access, capture and retrieval of relevant knowledge anytime/anywhere, while nurturing the social activities that underpin the knowledge sharing and creation process. In fact, socio-cultural issues play a critical role in the successful implementation of Knowledge Management (KM), and constitute a milestone towards value creation. The findings indicate that Knowledge Management Systems (KMS) promote value creation when they embed and nurture the social conditions that bind and bond team members together. Furthermore, technology assets, human networks, social capital, intellectual capital, and change management are identified as essential ingredients that have the potential to ensure effective knowledge value creation.

  2. Design Rules and Analysis of a Capture Mechanism for Rendezvous between a Space Tether and Payload

    NASA Technical Reports Server (NTRS)

    Sorensen, Kirk F.; Canfield, Stephen L.; Norris, Marshall A.

    2006-01-01

    Momentum-exchange/electrodynamic reboost (MXER) tether systems have been proposed to serve as an "upper stage in space". A MXER tether station would boost spacecraft from low Earth orbit to a high-energy orbit quickly, like a high-thrust rocket. Then, it would slowly rebuild its orbital momentum through electrodynamic thrust, minimizing the use of propellant. One of the primary challenges in developing a momentum-exchange/electrodynamic reboost tether system, as identified by the 2003 MXER Technology Assessment Group, is the development of a mechanism that will enable the processes of capture, carry and release of a payload by the rotating tether as required by the MXER tether approach. This paper will present a concept that will achieve the desired goals of the capture system. This solution is presented as a multi-DOF (degree-of-freedom) capture mechanism with nearly passive operation that features matching of the capture space and expected window of capture error, efficient use of mass and nearly passive actuation during the capture process. This paper will describe the proposed capture mechanism concept and provide an evaluation of the concept through a dynamic model and experimental tests performed on a prototype article of the mechanism in a dynamically similar environment. This paper will also develop a set of rules to guide the design of such a capture mechanism based on analytical and experimental analyses. The primary contributions of this paper will be a description of the proposed capture mechanism concept, a collection of rules to guide its design, and empirical and model information that can be used to evaluate the capability of the concept.

  3. Integrating Video-Capture Virtual Reality Technology into a Physically Interactive Learning Environment for English Learning

    ERIC Educational Resources Information Center

    Yang, Jie Chi; Chen, Chih Hung; Jeng, Ming Chang

    2010-01-01

    The aim of this study is to design and develop a Physically Interactive Learning Environment, the PILE system, by integrating video-capture virtual reality technology into a classroom. The system is designed for elementary school level English classes where students can interact with the system through physical movements. The system is designed to…

  4. AMUC: Associated Motion capture User Categories.

    PubMed

    Norman, Sally Jane; Lawson, Sian E M; Olivier, Patrick; Watson, Paul; Chan, Anita M-A; Dade-Robertson, Martyn; Dunphy, Paul; Green, Dave; Hiden, Hugo; Hook, Jonathan; Jackson, Daniel G

    2009-07-13

    The AMUC (Associated Motion capture User Categories) project consisted of building a prototype sketch retrieval client for exploring motion capture archives. High-dimensional datasets reflect the dynamic process of motion capture and comprise high-rate sampled data of a performer's joint angles; in response to multiple query criteria, these data can potentially yield different kinds of information. The AMUC prototype harnesses graphic input via an electronic tablet as a query mechanism, time and position signals obtained from the sketch being mapped to the properties of data streams stored in the motion capture repository. As well as proposing a pragmatic solution for exploring motion capture datasets, the project demonstrates the conceptual value of iterative prototyping in innovative interdisciplinary design. The AMUC team was composed of live performance practitioners and theorists conversant with a variety of movement techniques, bioengineers who recorded and processed motion data for integration into the retrieval tool, and computer scientists who designed and implemented the retrieval system and server architecture, scoped for Grid-based applications. Creative input on information system design and navigation, and digital image processing, underpinned implementation of the prototype, which has undergone preliminary trials with diverse users, allowing identification of rich potential development areas.

  5. An evaluation of the efficiency of minnow traps for estimating the abundance of minnows in desert spring systems

    USGS Publications Warehouse

    Peterson, James T.; Scheerer, Paul D.; Clements, Shaun

    2015-01-01

    Desert springs are sensitive aquatic ecosystems that pose unique challenges to natural resource managers and researchers. Among the most important of these is the need to accurately quantify population parameters for resident fish, particularly when the species are of special conservation concern. We evaluated the efficiency of baited minnow traps for estimating the abundance of two at-risk species, Foskett Speckled Dace Rhinichthys osculus ssp. and Borax Lake Chub Gila boraxobius, in desert spring systems in southeastern Oregon. We evaluated alternative sample designs using simulation and found that capture–recapture designs with four capture occasions would maximize the accuracy of estimates and minimize fish handling. We implemented the design and estimated capture and recapture probabilities using the Huggins closed-capture estimator. Trap capture probabilities averaged 23% and 26% for Foskett Speckled Dace and Borax Lake Chub, respectively, but differed substantially among sample locations, through time, and nonlinearly with fish body size. Recapture probabilities for Foskett Speckled Dace were, on average, 1.6 times greater than (first) capture probabilities, suggesting “trap-happy” behavior. Comparison of population estimates from the Huggins model with the commonly used Lincoln–Petersen estimator indicated that the latter underestimated Foskett Speckled Dace and Borax Lake Chub population size by 48% and by 20%, respectively. These biases were due to variability in capture and recapture probabilities. Simulation of fish monitoring that included the range of capture and recapture probabilities observed indicated that variability in capture and recapture probabilities over time negatively affected the ability to detect annual decreases of up to 20% in fish population size. Failure to account for variability in capture and recapture probabilities can lead to poor quality data and study inferences. 
Therefore, we recommend that fishery researchers and managers employ sample designs and estimators that can account for this variability.
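    The underestimation this record attributes to variable capture probabilities can be reproduced in a minimal two-session simulation (illustrative only: the Beta distribution of individual capture probabilities and all numbers are assumptions, and Chapman's bias-corrected Lincoln–Petersen estimator stands in for the full Huggins model):

```python
import random

def chapman_estimate(n1, n2, m2):
    """Chapman's bias-corrected form of the Lincoln-Petersen estimator."""
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

def mean_estimate(true_n, probs, reps=500, seed=42):
    """Average two-session abundance estimate given per-individual capture probabilities."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(reps):
        s1 = [rng.random() < p for p in probs]   # session 1 capture histories
        s2 = [rng.random() < p for p in probs]   # session 2 capture histories
        n1, n2 = sum(s1), sum(s2)
        m2 = sum(a and b for a, b in zip(s1, s2))  # marked recaptures
        estimates.append(chapman_estimate(n1, n2, m2))
    return sum(estimates) / reps

TRUE_N = 200
# Equal capture probability: the estimator's core assumption holds.
homo = mean_estimate(TRUE_N, [0.2] * TRUE_N)
# Heterogeneous capture probability (same mean, some individuals nearly untrappable).
rng = random.Random(1)
het_probs = [rng.betavariate(0.5, 2.0) for _ in range(TRUE_N)]
het = mean_estimate(TRUE_N, het_probs)
print(f"true N = {TRUE_N}, equal-p estimate ~ {homo:.0f}, heterogeneous-p estimate ~ {het:.0f}")
```

    The heterogeneous case is biased low because easily caught individuals dominate the recaptures, inflating m2 relative to a homogeneous population of the same size.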

  6. Mission Design, Guidance, and Navigation of a Callisto-Io-Ganymede Triple Flyby Jovian Capture

    NASA Astrophysics Data System (ADS)

    Didion, Alan M.

    Use of a triple-satellite-aided capture maneuver to enter Jovian orbit reduces insertion DeltaV and provides close flyby science opportunities at three of Jupiter's four large Galilean moons. This capture can be performed while maintaining appropriate Jupiter standoff distance and setting up a suitable apojove for plotting an extended tour. This paper has three main chapters, the first of which discusses the design and optimization of a triple-flyby capture trajectory. A novel triple-satellite-aided capture uses sequential flybys of Callisto, Io, and Ganymede to reduce the DeltaV required to capture into orbit about Jupiter. An optimal broken-plane maneuver is added between Earth and Jupiter to form a complete chemical/impulsive interplanetary trajectory from Earth to Jupiter. Such a trajectory can yield significant fuel savings over single and double-flyby capture schemes while maintaining a brief and simple interplanetary transfer phase. The second chapter focuses on the guidance and navigation of such trajectories in the presence of spacecraft navigation errors, ephemeris errors, and maneuver execution errors. A powered-flyby trajectory correction maneuver (TCM) is added to the nominal trajectory at Callisto and the nominal Jupiter orbit insertion (JOI) maneuver is modified to both complete the capture and target the Ganymede flyby. A third TCM is employed after all the flybys to act as a JOI cleanup maneuver. A Monte Carlo simulation shows that the statistical DeltaV required to correct the trajectory is quite manageable and the flyby characteristics are very consistent. The developed methods maintain flexibility for adaptation to similar launch, cruise, and capture conditions. 
The third chapter details the methodology and results behind a completely separate project to design and optimize an Earth-orbiting three satellite constellation to perform very long baseline interferometry (VLBI) as part of the 8th annual Global Trajectory Optimisation Competition (GTOC8). A script is designed to simulate the prescribed constellation and record its observations; the observations made are scored according to a provided performance index.
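    The fuel savings described above come from shedding Jupiter-relative energy to the moons before the insertion burn. A minimal vis-viva sketch (not from the thesis; the perijove radius, arrival v-infinity values, and capture-orbit period are assumed purely for illustration, with the flybys modeled as a reduced effective arrival speed) shows why a lower approach energy shrinks the impulsive JOI burn:

```python
import math

MU_JUPITER = 1.26687e8   # km^3/s^2, Jupiter's gravitational parameter
R_JUPITER = 71492.0      # km, equatorial radius

def insertion_dv(v_inf, r_p, period_days):
    """Impulsive delta-v at perijove to capture from a hyperbolic approach
    into an elliptical orbit of the given period (vis-viva on both legs)."""
    # Speed at perijove on the incoming hyperbola
    v_hyp = math.sqrt(v_inf**2 + 2.0 * MU_JUPITER / r_p)
    # Semi-major axis of the target capture ellipse, from its period
    T = period_days * 86400.0
    a = (MU_JUPITER * (T / (2.0 * math.pi))**2) ** (1.0 / 3.0)
    # Speed at perijove on the capture ellipse
    v_ell = math.sqrt(MU_JUPITER * (2.0 / r_p - 1.0 / a))
    return v_hyp - v_ell

r_p = 4.0 * R_JUPITER  # assumed perijove standoff distance
dv_direct = insertion_dv(v_inf=5.6, r_p=r_p, period_days=200.0)  # unaided capture
dv_aided = insertion_dv(v_inf=3.0, r_p=r_p, period_days=200.0)   # energy shed to flybys
print(f"direct capture: {dv_direct:.2f} km/s, flyby-aided: {dv_aided:.2f} km/s")
```

    Because the burn is applied deep in Jupiter's gravity well, even a modest reduction in approach energy translates into a substantially smaller insertion delta-v.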

  7. The use of a robust capture-recapture design in small mammal population studies: A field example with Microtus pennsylvanicus

    USGS Publications Warehouse

    Nichols, James D.; Pollock, Kenneth H.; Hines, James E.

    1984-01-01

    The robust design of Pollock (1982) was used to estimate parameters of a Maryland M. pennsylvanicus population. Closed model tests provided strong evidence of heterogeneity of capture probability, and model Mh (Otis et al., 1978) was selected as the most appropriate model for estimating population size. The Jolly-Seber model goodness-of-fit test indicated rejection of the model for this data set, and the Mh estimates of population size were all higher than the Jolly-Seber estimates. Both of these results are consistent with the evidence of heterogeneous capture probabilities. The authors thus used Mh estimates of population size, Jolly-Seber estimates of survival rate, and estimates of birth-immigration based on a combination of the population size and survival rate estimates. Advantages of the robust design estimates for certain inference procedures are discussed, and the design is recommended for future small mammal capture-recapture studies directed at estimation.

  8. Culture, Interface Design, and Design Methods for Mobile Devices

    NASA Astrophysics Data System (ADS)

    Lee, Kun-Pyo

    Aesthetic differences and similarities among cultures are obviously one of the very important issues in cultural design. However, ever since products became knowledge-supporting tools, the visible elements of products have become more universal, so that the invisible parts of products such as interface and interaction are becoming more important. Therefore, cultural design should be extended to the invisible elements of culture, such as people's conceptual models, beyond material and phenomenal culture. This chapter aims to explain how we address the invisible cultural elements in interface design and design methods by exploring users' cognitive styles and communication patterns in different cultures. Regarding cultural interface design, we examined users' conceptual models while interacting with mobile phone and website interfaces, and observed cultural differences in task performance and viewing patterns, which appeared to agree with the cultural cognitive styles known as holistic vs. analytic thought. Regarding design methods for culture, we explored how to localize design methods such as the focus group interview and generative session for specific cultural groups; the results of comparative experiments revealed cultural differences in participants' behaviors and performance in each design method and led us to suggest how to conduct them in East Asian cultures. Mobile Observation Analyzer and Wi-Pro, user research tools we invented to capture user behaviors and needs especially in their mobile context, are also introduced.

  9. An exploratory mixed-methods crossover study comparing DVD- vs. Web-based patient decision support in three conditions: The importance of patient perspectives.

    PubMed

    Halley, Meghan C; Rendle, Katharine A S; Gillespie, Katherine A; Stanley, Katherine M; Frosch, Dominick L

    2015-12-01

    The last 15 years have witnessed considerable progress in the development of decision support interventions (DESIs). However, fundamental questions about design and format of delivery remain. An exploratory, randomized mixed-method crossover study was conducted to compare a DVD- and Web-based DESI. Randomized participants used either the Web or the DVD first, followed by the alternative format. Participants completed a questionnaire to assess decision-specific knowledge at baseline and a questionnaire and structured qualitative interview after viewing each format. Tracking software was used to capture Web utilization. Transcripts were analyzed using integrated inductive and deductive approaches. Quantitative data were analyzed using exploratory bivariate and multivariate analyses. Exploratory knowledge analyses suggest that both formats increased knowledge, with limited evidence that the DVD increased knowledge more than the Web. Format preference varied across participants: 44% preferred the Web, 32% preferred the DVD and 24% preferred 'both'. Patient discussions of preferences for DESI information structure and the importance of a patient's stage of a given decision suggest these characteristics may be important factors underlying variation in utilization, format preferences and knowledge outcomes. Our results suggest that both DESI formats effectively increase knowledge. Patients' perceptions of these two formats further suggest that there may be no single 'best' format for all patients. These results have important implications for understanding why different DESI formats might be preferable to and more effective for different patients. Further research is needed to explore the relationship between these factors and DESI utilization outcomes across diverse patient populations. © 2014 John Wiley & Sons Ltd.

  10. Baseline evidence-based practice use, knowledge, and attitudes of allied health professionals: a survey to inform staff training and organisational change.

    PubMed

    Wilkinson, Shelley A; Hinchliffe, Fiona; Hough, Judith; Chang, Anne

    2012-01-01

    Evidence-based practice (EBP) is fundamental to improving patient outcomes. Universal adoption of EBP into the allied health clinical setting has not yet occurred. The primary aim of this project was to capture baseline measurements of the level of EBP self-efficacy, outcome expectancy, knowledge and use at our health service prior to training and organisational changes to support EBP. All allied health staff (n=252) employed across the campus were invited to participate in an online survey consisting of a battery of validated and reliable survey tools. Professional background, knowledge and previous training in EBP and research processes were collected. One hundred eighty-two allied health staff completed the survey (response rate 72%). One-way ANOVAs were used to compare levels of self-efficacy, outcome expectancy, knowledge and use, according to allied health discipline and experience with EBP and research processes. Mean scores for EBP attitudes (self-efficacy and outcome expectancy) and knowledge were higher than for use. Professional group differences were noted in the post-hoc analysis of the significant EBP constructs. Regression analyses indicated that EBP course attendance as well as training in research design and analysis impacted positively on EBP construct scores. Despite positive attitudes toward, belief in, and knowledge of EBP, self-reports of EBP processes do not indicate systematic application in the allied health workplace. The results of this research will inform a targeted intervention to foster ongoing training in EBP and research activity for allied health staff.

  11. Research Needs in Fire Safety for the Human Exploration and Utilization of Space: Proceedings and Research Plan

    NASA Technical Reports Server (NTRS)

    Ruff, Gary A.

    2003-01-01

    The purpose of the workshop documented in this publication was to bring together personnel responsible for the design and operations of the International Space Station (ISS) and the fire protection research community to review the current knowledge in fire safety relative to spacecraft. From this review, research needs were identified that were then used to formulate a research plan with specific objectives. In this document, I have attempted to capture the very informative and lively discussions that occurred in the plenary sessions and the working groups. I hope that it will be useful to readers and serve as a significant step in assuring fire protection for the crews of current and future spacecraft.

  12. U.S. Spacesuit Legacy: Maintaining it for the Future

    NASA Technical Reports Server (NTRS)

    Chullen, Cinda; McMann, Joe; Thomas, Ken; Kosmo, Joe; Lewis, Cathleen; Wright, Rebecca; Bitterly, Rose; Olivia, Vladenka Rose

    2013-01-01

    The history of U.S. spacesuit development and its use are rich with information on lessons learned, and constitutes a valuable legacy to those designing spacesuits for the future, as well as to educators, students, and the general public. The genesis of lessons learned is best understood by studying the evolution of past spacesuit programs - how the challenges and pressures of the times influenced the direction of the various spacesuit programs. This paper shows how the legacy of various spacesuit-related programs evolved in response to these forces. Important aspects of how this U.S. spacesuit legacy is being preserved today is described, including the archiving of spacesuit hardware, important documents, videos, oral history, and the rapidly expanding U.S. Spacesuit Knowledge Capture program.

  13. Advanced Supersonic Nozzle Concepts: Experimental Flow Visualization Results Paired With LES

    NASA Astrophysics Data System (ADS)

    Berry, Matthew; Magstadt, Andrew; Stack, Cory; Gaitonde, Datta; Glauser, Mark; Syracuse University Team; The Ohio State University Team

    2015-11-01

    Advanced supersonic nozzle concepts are currently under investigation, utilizing multiple bypass streams and airframe integration to bolster performance and efficiency. This work focuses on the parametric study of a supersonic, multi-stream jet with aft deck. The rectangular nozzle, which has a single plane of symmetry, displays very complex and unique flow characteristics. Flow visualization techniques in the form of PIV and schlieren capture flow features at various deck lengths and Mach numbers. LES is compared to the experimental results to both validate the computational model and identify limitations of the simulation. By comparing experimental results to LES, this study will help create a foundation of knowledge for advanced nozzle designs in future aircraft. SBIR Phase II with Spectral Energies, LLC under direction of Barry Kiel.

  14. IDEF3 formalization report

    NASA Technical Reports Server (NTRS)

    Menzel, Christopher; Mayer, Richard J.; Edwards, Douglas D.

    1991-01-01

    The Process Description Capture Method (IDEF3) is one of several Integrated Computer-Aided Manufacturing (ICAM) DEFinition methods developed by the Air Force to support systems engineering activities, and in particular, to support information systems development. These methods have evolved as a distillation of 'good practice' experience by information system developers and are designed to raise the performance level of the novice practitioner to one comparable with that of an expert. IDEF3 is meant to serve as a knowledge acquisition and requirements definition tool that structures the user's understanding of how a given process, event, or system works around process descriptions. A special purpose graphical language accompanying the method serves to highlight temporal precedence and causality relationships relative to the process or event being described.

  15. U.S. Spacesuit Legacy: Maintaining it for the Future

    NASA Technical Reports Server (NTRS)

    Chullen, Cinda; McMann, Joe; Thomas, Ken; Kosmo, Joe; Lewis, Cathleen; Wright, Rebecca; Bitterly, Rose; Oliva, Vladenka

    2012-01-01

    The history of US Spacesuit development and use is rich with information on lessons learned, and constitutes a valuable legacy to those designing spacesuits for the future, as well as educators, students and the general public. The genesis of lessons learned is best understood by studying the evolution of past spacesuit programs - how the challenges and pressures of the times influenced the direction of the various spacesuit programs. This paper will show how the legacy of various programs evolved in response to these forces. Important aspects of how this rich U.S. spacesuit legacy is being preserved today will be described, including the archiving of spacesuit hardware, important documents, videos, oral history, and the rapidly expanding US Spacesuit Knowledge Capture program.

  16. An automated framework for hypotheses generation using literature.

    PubMed

    Abedi, Vida; Zand, Ramin; Yeasin, Mohammed; Faisal, Fazle Elahi

    2012-08-29

    In bio-medicine, exploratory studies and hypothesis generation often begin with researching existing literature to identify a set of factors and their association with diseases, phenotypes, or biological processes. Many scientists are overwhelmed by the sheer volume of literature on a disease when they plan to generate a new hypothesis or study a biological phenomenon. The situation is even worse for junior investigators who often find it difficult to formulate new hypotheses or, more importantly, corroborate whether their hypothesis is consistent with existing literature. It is a daunting task to keep abreast of so much being published and also to remember all combinations of direct and indirect associations. Fortunately there is a growing trend of using literature mining and knowledge discovery tools in biomedical research. However, there is still a large gap between the huge amount of effort and resources invested in disease research and the little effort in harvesting the published knowledge. The proposed hypothesis generation framework (HGF) finds "crisp semantic associations" among entities of interest - that is a step towards bridging such gaps. The proposed HGF shares end goals similar to those of SWAN but is more holistic in nature; it was designed and implemented using scalable and efficient computational models of disease-disease interaction. The integration of mapping ontologies with latent semantic analysis is critical in capturing domain specific direct and indirect "crisp" associations, and making assertions about entities (such as disease X is associated with a set of factors Z). Pilot studies were performed using two diseases. A comparative analysis of the computed "associations" and "assertions" with curated expert knowledge was performed to validate the results. It was observed that the HGF is able to capture "crisp" direct and indirect associations, and provide knowledge discovery on demand. 
The proposed framework is fast, efficient, and robust in generating new hypotheses to identify factors associated with a disease. A full integrated Web service application is being developed for wide dissemination of the HGF. A large-scale study by the domain experts and associated researchers is underway to validate the associations and assertions computed by the HGF.
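    The role latent semantic analysis plays in surfacing indirect associations can be pictured with a toy example (the co-occurrence matrix and the entity names disease_X, gene_Y, factor_Z are invented for illustration, not drawn from the HGF):

```python
import numpy as np

# Toy entity-document co-occurrence matrix (rows: entities, columns: abstracts).
# "disease_X" and "factor_Z" never co-occur directly, but both co-occur with
# the intermediate entity "gene_Y".
terms = ["disease_X", "gene_Y", "factor_Z"]
A = np.array([
    [1, 0, 1, 0],   # disease_X
    [1, 1, 1, 1],   # gene_Y
    [0, 1, 0, 1],   # factor_Z
], dtype=float)

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Direct association is zero: the two entities share no documents.
direct = cosine(A[0], A[2])

# Latent association: project entities into a rank-1 LSA space via truncated SVD.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 1
term_vecs = U[:, :k] * s[:k]
latent = cosine(term_vecs[0], term_vecs[2])

print(f"direct cosine = {direct:.2f}, latent cosine = {latent:.2f}")
```

    The rank reduction collapses the shared intermediate onto a common latent dimension, so the indirect disease-factor link becomes visible even though the raw vectors are orthogonal.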

  17. Population dynamics of Microtus pennsylvanicus in corridor-linked patches

    USGS Publications Warehouse

    Coffman, C.J.; Nichols, J.D.; Pollock, K.H.

    2001-01-01

    Corridors have become a key issue in the discussion of conservation planning; however, few empirical data exist on the use of corridors and their effects on population dynamics. The objective of this replicated, population-level, capture-recapture experiment on meadow voles was to estimate and compare population characteristics of voles between (1) corridor-linked fragments, (2) isolated or non-linked fragments, and (3) unfragmented areas. We conducted two field experiments involving 22,600 captures of 5,700 individuals. In the first, the maintained corridor study, corridors were maintained at the time of fragmentation, and in the second, the constructed corridor study, we constructed corridors between patches that had been fragmented for some period of time. We applied multistate capture-recapture models with the robust design to estimate adult movement and survival rates, population size, temporal variation in population size, recruitment, and juvenile survival rates. Movement rates increased to a greater extent on constructed corridor-linked grids than on the unfragmented or non-linked fragmented grids between the pre- and post-treatment periods. We found significant differences in local survival on the treated (corridor-linked) grids compared to survival on the fragmented and unfragmented grids between the pre- and post-treatment periods. We found no clear pattern of treatment effects on population size or recruitment in either study. However, in both studies, we found that unfragmented grids were more stable than the fragmented grids based on lower temporal variability in population size. To our knowledge, this is the first experimental study demonstrating that corridors constructed between existing fragmented populations can indeed cause increases in movement and associated changes in demography, supporting the use of constructed corridors for this purpose in conservation biology.

  18. Creation of an Accurate Algorithm to Detect Snellen Best Documented Visual Acuity from Ophthalmology Electronic Health Record Notes.

    PubMed

    Mbagwu, Michael; French, Dustin D; Gill, Manjot; Mitchell, Christopher; Jackson, Kathryn; Kho, Abel; Bryar, Paul J

    2016-05-04

    Visual acuity is the primary measure used in ophthalmology to determine how well a patient can see. Visual acuity for a single eye may be recorded in multiple ways for a single patient visit (eg, Snellen vs. Jäger units vs. font print size), and be recorded for either distance or near vision. Capturing the best documented visual acuity (BDVA) of each eye in an individual patient visit is an important step for making electronic ophthalmology clinical notes useful in research. Currently, there is limited methodology for capturing BDVA in an efficient and accurate manner from electronic health record (EHR) notes. We developed an algorithm to detect BDVA for right and left eyes from defined fields within electronic ophthalmology clinical notes. We designed an algorithm to detect the BDVA from defined fields within 295,218 ophthalmology clinical notes with visual acuity data present. A total of 5668 unique responses were identified, and an algorithm was developed to map all of the unique responses to a structured list of Snellen visual acuities. Visual acuity was captured from a total of 295,218 ophthalmology clinical notes during the study dates. The algorithm identified all visual acuities in the defined visual acuity section for each eye and returned a single BDVA for each eye. A clinician chart review of 100 random patient notes showed 99% accuracy in detecting BDVA from these records, with a 1% observed error rate. Our algorithm successfully captures best documented Snellen distance visual acuity from ophthalmology clinical notes and transforms a variety of inputs into a structured Snellen equivalent list. Our work, to the best of our knowledge, represents the first attempt at capturing visual acuity accurately from large numbers of electronic ophthalmology notes. Use of this algorithm can benefit research groups interested in assessing visual acuity for patient-centered outcomes.
All codes used for this study are currently available, and will be made available online at https://phekb.org.
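
    The core of the method described above is a normalization step (free-text entries mapped onto an ordered Snellen scale) followed by a best-value selection per eye. The sketch below illustrates that idea; the lookup table, entry strings, and function name are invented for illustration and are not the published algorithm's actual mappings.

```python
# Sketch of best-documented-visual-acuity (BDVA) selection. The notation
# table below is a hypothetical toy subset, not the published mapping of
# all 5668 unique responses.

# Ordered Snellen scale, best vision first.
SNELLEN_ORDER = ["20/20", "20/25", "20/30", "20/40", "20/70", "20/200"]

# Map raw EHR strings to a Snellen equivalent (illustrative entries only;
# Jaeger-to-Snellen conversions are approximate).
NORMALIZE = {
    "20/20": "20/20",
    "20/25+2": "20/25",   # plus/minus letter scores collapse to the base line
    "20/40-1": "20/40",
    "J1": "20/25",
    "J5": "20/40",
}

def best_documented_va(raw_entries):
    """Return the best Snellen equivalent among recognized entries for one eye."""
    mapped = [NORMALIZE[e] for e in raw_entries if e in NORMALIZE]
    if not mapped:
        return None
    return min(mapped, key=SNELLEN_ORDER.index)

# One visit may record several measurements per eye; keep the best.
right_eye = best_documented_va(["20/40-1", "20/25+2", "J5"])  # "20/25"
```

    A full implementation would also have to cover count-fingers/hand-motion entries and the long tail of free-text variants the paper reports.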

  19. Creation of an Accurate Algorithm to Detect Snellen Best Documented Visual Acuity from Ophthalmology Electronic Health Record Notes

    PubMed Central

    French, Dustin D; Gill, Manjot; Mitchell, Christopher; Jackson, Kathryn; Kho, Abel; Bryar, Paul J

    2016-01-01

    Background: Visual acuity is the primary measure used in ophthalmology to determine how well a patient can see. Visual acuity for a single eye may be recorded in multiple ways for a single patient visit (eg, Snellen vs. Jäger units vs. font print size), and be recorded for either distance or near vision. Capturing the best documented visual acuity (BDVA) of each eye in an individual patient visit is an important step for making electronic ophthalmology clinical notes useful in research. Objective: Currently, there is limited methodology for capturing BDVA in an efficient and accurate manner from electronic health record (EHR) notes. We developed an algorithm to detect BDVA for right and left eyes from defined fields within electronic ophthalmology clinical notes. Methods: We designed an algorithm to detect the BDVA from defined fields within 295,218 ophthalmology clinical notes with visual acuity data present. A total of 5668 unique responses were identified, and an algorithm was developed to map all of the unique responses to a structured list of Snellen visual acuities. Results: Visual acuity was captured from a total of 295,218 ophthalmology clinical notes during the study dates. The algorithm identified all visual acuities in the defined visual acuity section for each eye and returned a single BDVA for each eye. A clinician chart review of 100 random patient notes showed 99% accuracy in detecting BDVA from these records, with a 1% observed error rate. Conclusions: Our algorithm successfully captures best documented Snellen distance visual acuity from ophthalmology clinical notes and transforms a variety of inputs into a structured Snellen equivalent list. Our work, to the best of our knowledge, represents the first attempt at capturing visual acuity accurately from large numbers of electronic ophthalmology notes. Use of this algorithm can benefit research groups interested in assessing visual acuity for patient-centered outcomes.
All codes used for this study are currently available, and will be made available online at https://phekb.org. PMID:27146002

  20. The Collective Knowledge of Social Tags: Direct and Indirect Influences on Navigation, Learning, and Information Processing

    ERIC Educational Resources Information Center

    Cress, Ulrike; Held, Christoph; Kimmerle, Joachim

    2013-01-01

    Tag clouds generated in social tagging systems can capture the collective knowledge of communities. Using as a basis spreading activation theories, information foraging theory, and the co-evolution model of cognitive and social systems, we present here a model for an "extended information scent," which proposes that both collective and individual…

  1. Interactions between Knowledge and Testimony in Children's Reality-Status Judgments

    ERIC Educational Resources Information Center

    Lopez-Mobilia, Gabriel; Woolley, Jacqueline D.

    2016-01-01

    In 2 studies, we attempted to capture the information-processing abilities underlying children's reality-status judgments. Forty 5- to 6-year-olds and 53 7- to 8-year-olds heard about novel entities (animals) that varied in their fit with children's world knowledge. After hearing about each entity, children could either guess reality status…

  2. Performance Factors Analysis -- A New Alternative to Knowledge Tracing

    ERIC Educational Resources Information Center

    Pavlik, Philip I., Jr.; Cen, Hao; Koedinger, Kenneth R.

    2009-01-01

    Knowledge tracing (KT)[1] has been used in various forms for adaptive computerized instruction for more than 40 years. However, despite its long history of application, it is difficult to use in domain model search procedures, has not been used to capture learning where multiple skills are needed to perform a single action, and has not been used…

  3. Integrating Social Activity Theory and Critical Discourse Analysis: A Multilayered Methodological Model for Examining Knowledge Mediation in Mentoring

    ERIC Educational Resources Information Center

    Becher, Ayelet; Orland-Barak, Lily

    2016-01-01

    This study suggests an integrative qualitative methodological framework for capturing complexity in mentoring activity. Specifically, the model examines how historical developments of a discipline direct mentors' mediation of professional knowledge through the language that they use. The model integrates social activity theory and a framework of…

  4. Neutron capture cross sections of Kr

    NASA Astrophysics Data System (ADS)

    Fiebiger, Stefan; Baramsai, Bayarbadrakh; Couture, Aaron; Krtička, Milan; Mosby, Shea; Reifarth, René; O'Donnell, John; Rusev, Gencho; Ullmann, John; Weigand, Mario; Wolf, Clemens

    2018-01-01

    Neutron capture and β−-decay are competing branches of the s-process nucleosynthesis path at 85Kr [1], which makes it an important branching point. Knowledge of its neutron capture cross section is therefore essential to constrain stellar models of nucleosynthesis. Despite its importance for different fields, no direct measurement of the cross section of 85Kr in the keV regime has been performed; the currently reported uncertainties are still on the order of 50% [2, 3]. Neutron capture cross section measurements on a 4% enriched 85Kr gas enclosed in a stainless steel cylinder were performed at Los Alamos National Laboratory (LANL) using the Detector for Advanced Neutron Capture Experiments (DANCE). 85Kr is a radioactive isotope with a half-life of 10.8 years. Because this was a low-enrichment sample, the main contaminants, the stable krypton isotopes 83Kr and 86Kr, were also investigated; for these measurements, highly enriched material contained in pressurized stainless steel spheres was used.

  5. Applying Knowledge Management to an Organization's Transformation

    NASA Technical Reports Server (NTRS)

    Potter, Shannon; Gill, Tracy; Fritsche, Ralph

    2008-01-01

    Although workers in the information age have more information at their fingertips than ever before, the ability to effectively capture and reuse actual knowledge remains a daunting challenge for many organizations. As high-tech organizations transform from providing complex products and services in an established domain to providing them in new domains, knowledge remains an increasingly valuable commodity. This paper explores the supply and demand elements of the "knowledge market" within the International Space Station and Spacecraft Processing Directorate (ISSSPD) of NASA's Kennedy Space Center (KSC). It examines how knowledge supply and knowledge demand determine the success of an organization's knowledge management (KM) activities, and how the elements of a KM infrastructure (tools, culture, and training) can be used to create and sustain knowledge supply and demand.

  6. An integrated science-based methodology to assess potential ...

    EPA Pesticide Factsheets

    There is an urgent need for broad and integrated studies that address the risks of engineered nanomaterials (ENMs) along the different endpoints of the society, environment, and economy (SEE) complex adaptive system. This article presents an integrated science-based methodology to assess the potential risks of engineered nanomaterials. To achieve the study objective, two major tasks are accomplished: knowledge synthesis and algorithmic computational methodology. The knowledge synthesis task is designed to capture “what is known” and to outline the gaps in knowledge from an ENMs risk perspective. The algorithmic computational methodology is geared toward the provision of decisions and an understanding of the risks of ENMs along different endpoints for the constituents of the SEE complex adaptive system. The approach presented herein allows for addressing the formidable task of assessing the implications and risks of exposure to ENMs, with the long-term goal of building a decision-support system to guide key stakeholders in the SEE system towards building sustainable ENMs and nano-enabled products. The following specific aims are formulated to achieve the study objective: (1) to propose a system of systems (SoS) architecture that builds a network management among the different entities in the large SEE system to track the flow of ENMs emission, fate and transport from the source to the receptor; (2) to establish a staged approach for knowledge synthesis methodology…

  7. AMModels: An R package for storing models, data, and metadata to facilitate adaptive management

    USGS Publications Warehouse

    Donovan, Therese M.; Katz, Jonathan

    2018-01-01

    Agencies are increasingly called upon to implement their natural resource management programs within an adaptive management (AM) framework. This article provides the background and motivation for the R package, AMModels. AMModels was developed under R version 3.2.2. The overall goal of AMModels is simple: To codify knowledge in the form of models and to store it, along with models generated from numerous analyses and datasets that may come our way, so that it can be used or recalled in the future. AMModels facilitates this process by storing all models and datasets in a single object that can be saved to an .RData file and routinely augmented to track changes in knowledge through time. Through this process, AMModels allows the capture, development, sharing, and use of knowledge that may help organizations achieve their mission. While AMModels was designed to facilitate adaptive management, its utility is far more general. Many R packages exist for creating and summarizing models, but to our knowledge, AMModels is the only package dedicated not to the mechanics of analysis but to organizing analysis inputs, analysis outputs, and preserving descriptive metadata. We anticipate that this package will assist users hoping to preserve the key elements of an analysis so they may be more confidently revisited at a later date.
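
    The package's central idea, bundling models, datasets, and descriptive metadata into one object that can be saved and reloaded later, translates readily outside R. The following is a minimal Python analog under stated assumptions; it is not the AMModels API, and all class, method, and example names are invented for illustration.

```python
import os
import pickle
import tempfile

class ModelLibrary:
    """Minimal Python analog of an AMModels-style store (hypothetical
    design, not the AMModels API): models, datasets, and metadata are
    bundled in one object that can be saved and reloaded."""

    def __init__(self):
        self.models = {}     # name -> model object (e.g. fitted coefficients)
        self.datasets = {}   # name -> data used to fit or update models
        self.metadata = {}   # name -> who/when/why notes for future reuse

    def add_model(self, name, model, note=""):
        self.models[name] = model
        self.metadata[name] = note

    def add_dataset(self, name, data, note=""):
        self.datasets[name] = data
        self.metadata[name] = note

    def save(self, path):
        with open(path, "wb") as f:
            pickle.dump(self, f)

    @staticmethod
    def load(path):
        with open(path, "rb") as f:
            return pickle.load(f)

# Round trip: store a model with its provenance note, save, and reload.
lib = ModelLibrary()
lib.add_model("vole_survival", {"phi": 0.42}, note="fit to 2018 capture data")
path = os.path.join(tempfile.mkdtemp(), "amlib.pkl")
lib.save(path)
restored = ModelLibrary.load(path)
```

    The point of the single-object design, as in AMModels, is that the provenance note travels with the model, so an analysis can be confidently revisited years later.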

  8. Research on Capturing of Customer Requirements Based on Innovation Theory

    NASA Astrophysics Data System (ADS)

Junwu, Ding; Dongtao, Yang; Zhenqiang, Bao

    To capture customer requirements information exactly and effectively, a new customer requirements capturing modeling method was proposed. Based on the analysis of function requirement models of previous products and the application of the technology system evolution laws of the Theory of Inventive Problem Solving (TRIZ), customer requirements could be evolved from existing product designs by modifying the functional requirement unit and confirming the direction of evolution design. Finally, a case study was provided to illustrate the feasibility of the proposed approach.

  9. Enhancing health policymakers' information literacy knowledge and skill for policymaking on control of infectious diseases of poverty in Nigeria.

    PubMed

    Uneke, Chigozie Jesse; Ezeoha, Abel Ebeh; Uro-Chukwu, Henry; Ezeonu, Chinonyelum Thecla; Ogbu, Ogbonnaya; Onwe, Friday; Edoga, Chima

    2015-01-01

    In Nigeria, one of the major challenges associated with the evidence-to-policy link in the control of infectious diseases of poverty (IDP) is deficient information literacy knowledge and skill among policymakers. There is a need for policymakers to acquire the skill to discover relevant information, accurately evaluate retrieved information, and apply it correctly. To use the information literacy tool of the International Network for the Availability of Scientific Publications (INASP) to enhance policymakers' knowledge and skill for policymaking on the control of IDP in Nigeria. A modified "before and after" intervention study design was used in which outcomes were measured on target participants both before and after the intervention was implemented. This study was conducted in Ebonyi State, south-eastern Nigeria, and participants were career health policymakers. A two-day health-policy information literacy training workshop was organized to enhance participants' information literacy capacity. Topics covered included: introduction to information literacy; defining the information problem; searching for information online; evaluating information; science information; knowledge-sharing interviews; and training skills. A total of 52 policymakers attended the workshop. The pre-workshop mean rating (MNR) of knowledge and capacity for information literacy ranged from 2.15 to 2.97, while the post-workshop MNR ranged from 3.34 to 3.64 on a 4-point scale. The percentage increase in MNR of knowledge and capacity at the end of the workshop ranged from 22.6% to 55.3%. The results of this study suggest that, through an information literacy training workshop, policymakers can acquire the knowledge and skill to identify, capture and share the right kind of information in the right contexts to influence relevant action or a policy decision.

  10. Enhancing health policymakers' information literacy knowledge and skill for policymaking on control of infectious diseases of poverty in Nigeria

    PubMed Central

    Uneke, Chigozie Jesse; Ezeoha, Abel Ebeh; Uro-Chukwu, Henry; Ezeonu, Chinonyelum Thecla; Ogbu, Ogbonnaya; Onwe, Friday; Edoga, Chima

    2015-01-01

    Background: In Nigeria, one of the major challenges associated with the evidence-to-policy link in the control of infectious diseases of poverty (IDP) is deficient information literacy knowledge and skill among policymakers. There is a need for policymakers to acquire the skill to discover relevant information, accurately evaluate retrieved information, and apply it correctly. Objectives: To use the information literacy tool of the International Network for the Availability of Scientific Publications (INASP) to enhance policymakers' knowledge and skill for policymaking on the control of IDP in Nigeria. Methods: A modified "before and after" intervention study design was used in which outcomes were measured on target participants both before and after the intervention was implemented. This study was conducted in Ebonyi State, south-eastern Nigeria, and participants were career health policymakers. A two-day health-policy information literacy training workshop was organized to enhance participants' information literacy capacity. Topics covered included: introduction to information literacy; defining the information problem; searching for information online; evaluating information; science information; knowledge-sharing interviews; and training skills. Results: A total of 52 policymakers attended the workshop. The pre-workshop mean rating (MNR) of knowledge and capacity for information literacy ranged from 2.15 to 2.97, while the post-workshop MNR ranged from 3.34 to 3.64 on a 4-point scale. The percentage increase in MNR of knowledge and capacity at the end of the workshop ranged from 22.6% to 55.3%. Conclusion: The results of this study suggest that, through an information literacy training workshop, policymakers can acquire the knowledge and skill to identify, capture and share the right kind of information in the right contexts to influence relevant action or a policy decision. PMID:26284149

  11. Ontology-based configuration of problem-solving methods and generation of knowledge-acquisition tools: application of PROTEGE-II to protocol-based decision support.

    PubMed

    Tu, S W; Eriksson, H; Gennari, J H; Shahar, Y; Musen, M A

    1995-06-01

    PROTEGE-II is a suite of tools and a methodology for building knowledge-based systems and domain-specific knowledge-acquisition tools. In this paper, we show how PROTEGE-II can be applied to the task of providing protocol-based decision support in the domain of treating HIV-infected patients. To apply PROTEGE-II, (1) we construct a decomposable problem-solving method called episodic skeletal-plan refinement, (2) we build an application ontology that consists of the terms and relations in the domain, and of method-specific distinctions not already captured in the domain terms, and (3) we specify mapping relations that link terms from the application ontology to the domain-independent terms used in the problem-solving method. From the application ontology, we automatically generate a domain-specific knowledge-acquisition tool that is custom-tailored for the application. The knowledge-acquisition tool is used for the creation and maintenance of domain knowledge used by the problem-solving method. The general goal of the PROTEGE-II approach is to produce systems and components that are reusable and easily maintained. This is the rationale for constructing ontologies and problem-solving methods that can be composed from a set of smaller-grained methods and mechanisms. This is also why we tightly couple the knowledge-acquisition tools to the application ontology that specifies the domain terms used in the problem-solving systems. Although our evaluation is still preliminary, for the application task of providing protocol-based decision support, we show that these goals of reusability and easy maintenance can be achieved. We discuss design decisions and the tradeoffs that have to be made in the development of the system.
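
    Step (3) above, the mapping relations, can be pictured with a toy sketch: a table that re-expresses domain-specific terms in the domain-independent vocabulary the problem-solving method expects. All term names below are invented for illustration; PROTEGE-II's actual ontology representation is considerably richer.

```python
# Toy illustration of PROTEGE-II-style mapping relations. All term names
# are invented for illustration.

# Domain-independent slots expected by the problem-solving method
# (episodic skeletal-plan refinement).
METHOD_SLOTS = {"plan_step", "eligibility_condition", "revision_rule"}

# Mapping relations: application-ontology term -> method term.
MAPPINGS = {
    "drug_regimen": "plan_step",
    "cd4_threshold": "eligibility_condition",
    "toxicity_response": "revision_rule",
}

def translate(domain_facts):
    """Re-express domain knowledge in the method's generic vocabulary."""
    out = {}
    for term, value in domain_facts.items():
        slot = MAPPINGS.get(term)
        if slot not in METHOD_SLOTS:
            raise KeyError(f"no mapping relation for domain term {term!r}")
        out[slot] = value
    return out

generic = translate({"drug_regimen": "AZT+3TC", "cd4_threshold": "< 200"})
```

    Keeping the mapping table separate from both the domain ontology and the method is what lets either side be reused independently, which is the maintainability goal the abstract describes.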

  12. Knowledge Management tools integration within DLR's concurrent engineering facility

    NASA Astrophysics Data System (ADS)

    Lopez, R. P.; Soragavi, G.; Deshmukh, M.; Ludtke, D.

    The complexity of space endeavors has increased the need for Knowledge Management (KM) tools. The concept of KM involves not only the electronic storage of knowledge, but also the process of making this knowledge available, reusable and traceable. Establishing a KM concept within the Concurrent Engineering Facility (CEF) has been a research topic of the German Aerospace Centre (DLR). This paper presents the current KM tools of the CEF: the Software Platform for Organizing and Capturing Knowledge (S.P.O.C.K.), the data model Virtual Satellite (VirSat), and the Simulation Model Library (SimMoLib), and how their usage has improved the Concurrent Engineering (CE) process. This paper also presents the lessons learned from the introduction of KM practices into the CEF and lays out a roadmap for the further development of KM in CE activities at DLR. The results of applying the Knowledge Management tools have shown the potential of merging the three software platforms and their functionalities as the next step towards the full integration of KM practices into the CE process. VirSat will remain the main software platform used within a CE study, and S.P.O.C.K. and SimMoLib will be integrated into VirSat. These tools will support the data model as a reference and documentation source, and as an access point to simulation and calculation models. The use of KM tools in the CEF aims to become a standard practice during the CE process. Establishing this practice will result in a much broader exchange of knowledge and experience within the Concurrent Engineering environment and, consequently, in higher-quality designs of space systems.

  13. Smart Companies.

    ERIC Educational Resources Information Center

    Galagan, Patricia A.

    1997-01-01

    Capturing and leveraging knowledge is an important new management trend that is as yet undefined. Some companies are accounting for their intellectual capital and applying it to the company balance sheets. (JOW)

  14. The relation between cognitive and metacognitive strategic processing during a science simulation.

    PubMed

    Dinsmore, Daniel L; Zoellner, Brian P

    2018-03-01

    This investigation was designed to uncover the relations between students' cognitive and metacognitive strategies used during a complex climate simulation. While cognitive strategy use during science inquiry has been studied, the factors related to this strategy use, such as concurrent metacognition, prior knowledge, and prior interest, have not been investigated in a multidimensional fashion. This study addressed current issues in strategy research by examining not only how metacognitive, surface-level, and deep-level strategies influence performance, but also how these strategies related to each other during a contextually relevant science simulation. The sample for this study consisted of 70 undergraduates from a mid-sized Southeastern university in the United States. These participants were recruited from both physical and life science (e.g., biology) and education majors to obtain a sample with variance in terms of their prior knowledge, interest, and strategy use. Participants completed measures of prior knowledge and interest about global climate change. Then, they were asked to engage in an online climate simulator for up to 30 min while thinking aloud. Finally, participants were asked to answer three outcome questions about global climate change. Results indicated a poor fit for the statistical model of the frequency and level of processing predicting performance. However, a statistical model that independently examined the influence of metacognitive monitoring and control of cognitive strategies showed a very strong relation between the metacognitive and cognitive strategies. Finally, smallest space analysis results provided evidence that strategy use may be better captured in a multidimensional fashion, particularly with attention paid towards the combination of strategies employed. 
Conclusions drawn from the evidence point to the need for more dynamic, multidimensional models of strategic processing that account for the patterns of optimal and non-optimal strategy use. Additionally, analyses that can capture these complex patterns need to be further explored. © 2017 The British Psychological Society.

  15. Component Database for the APS Upgrade

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veseli, S.; Arnold, N. D.; Jarosz, D. P.

    The Advanced Photon Source Upgrade (APS-U) project will replace the existing APS storage ring with a multi-bend achromat (MBA) lattice to provide extreme transverse coherence and extreme brightness x-rays to its users. As the time to replace the existing storage ring accelerator is of critical concern, an aggressive one-year removal/installation/testing period is being planned. To aid in the management of the thousands of components to be installed in such a short time, the Component Database (CDB) application is being developed with the purpose to identify, document, track, locate, and organize components in a central database. Three major domains are being addressed: Component definitions (which together make up an exhaustive "Component Catalog"), Designs (groupings of components to create subsystems), and Component Instances (“Inventory”). Relationships between the major domains offer additional "system knowledge" to be captured that will be leveraged with future tools and applications. It is imperative to provide sub-system engineers with a functional application early in the machine design cycle. Topics discussed in this paper include the initial design and deployment of CDB, as well as future development plans.
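
    The three domains and the relationships between them can be sketched as a small data model. The sketch below is a hypothetical Python illustration; the class and field names are assumptions, not the CDB schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CatalogItem:
    """Component definition: one entry in the 'Component Catalog' domain."""
    part_number: str
    description: str

@dataclass
class Design:
    """Design domain: a grouping of catalog items forming a subsystem."""
    name: str
    components: List[CatalogItem] = field(default_factory=list)

@dataclass
class InventoryItem:
    """Component Instance domain ('Inventory'): a physical unit of a
    catalog definition, trackable to a location."""
    serial: str
    definition: CatalogItem
    location: str = "unassigned"

# Relationships across the three domains carry the "system knowledge":
magnet = CatalogItem("MBA-QUAD-01", "quadrupole magnet")       # definition
cell = Design("arc cell", components=[magnet])                 # grouping
unit = InventoryItem("SN-0042", magnet, location="sector 3")   # instance
```

    The key design point is that an inventory unit references its catalog definition rather than copying it, so a change to the definition is visible from every instance and every design that uses it.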

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldsmith, Steven Y.; Spires, Shannon V.

    There are currently two proposed standards for agent communication languages, namely, KQML (Finin, Labrou, and Mayfield 1994) and the FIPA ACL. Neither standard has yet achieved primacy, and neither has been evaluated extensively in an open environment such as the Internet. It seems prudent therefore to design a general-purpose agent communications facility for new agent architectures that is flexible yet provides an architecture that accepts many different specializations. In this paper we exhibit the salient features of an agent communications architecture based on distributed metaobjects. This architecture captures design commitments at a metaobject level, leaving the base-level design and implementation up to the agent developer. The scope of the metamodel is broad enough to accommodate many different communication protocols, interaction protocols, and knowledge sharing regimes through extensions to the metaobject framework. We conclude that with a powerful distributed object substrate that supports metaobject communications, a general framework can be developed that will effectively enable different approaches to agent communications in the same agent system. We have implemented a KQML-based communications protocol and have several special-purpose interaction protocols under development.

  17. DMS augmented monitoring and diagnosis application (DMS AMDA) prototype

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Boyd, Mark A.; Iverson, David L.; Donnell, Brian; Lauritsen, Janet; Doubek, Sharon; Gibson, Jim; Monahan, Christine; Rosenthal, Donald A.

    1993-01-01

    The Data Management System Augmented Monitoring and Diagnosis Application (DMS AMDA) is currently under development at NASA Ames Research Center (ARC). It will provide automated monitoring and diagnosis capabilities for the Space Station Freedom (SSF) Data Management System (DMS) in the Control Center Complex (CCC) at NASA Johnson Space Center. Several advanced automation applications are under development for use in the CCC for other SSF subsystems. The DMS AMDA, however, is the first application to utilize digraph failure analysis techniques and the Extended Realtime FEAT (ERF) application as the core of its diagnostic system design, since the other projects were begun before the digraph tools were available. Model-based diagnosis and expert systems techniques will provide additional capabilities and augment ERF where appropriate. Utilization of system knowledge captured in the design phase of a system in digraphs should result in both a cost savings and a technical advantage during implementation of the diagnostic software. This paper addresses both the programmatic and technical considerations of this approach, and describes the software design and initial prototyping effort.

  18. 47 CFR 43.21 - Transactions with affiliates.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... file, by April 1 of each year, a report designed to capture trends in service quality under price cap... report designed to capture trends in service quality under price cap regulation. The report shall contain...) REPORTS OF COMMUNICATION COMMON CARRIERS AND CERTAIN AFFILIATES § 43.21 Transactions with affiliates. (a...

  19. 47 CFR 43.21 - Transactions with affiliates.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... file, by April 1 of each year, a report designed to capture trends in service quality under price cap... report designed to capture trends in service quality under price cap regulation. The report shall contain...) REPORTS OF COMMUNICATION COMMON CARRIERS AND CERTAIN AFFILIATES § 43.21 Transactions with affiliates. (a...

  20. 40 CFR 65.113 - Standards: Sampling connection systems.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... be collected or captured. (c) Equipment design and operation. Each closed-purge, closed-loop, or... system; or (2) Collect and recycle the purged process fluid to a process; or (3) Be designed and operated to capture and transport all the purged process fluid to a control device that meets the requirements...

  1. 40 CFR 65.113 - Standards: Sampling connection systems.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... be collected or captured. (c) Equipment design and operation. Each closed-purge, closed-loop, or... system; or (2) Collect and recycle the purged process fluid to a process; or (3) Be designed and operated to capture and transport all the purged process fluid to a control device that meets the requirements...

  2. 40 CFR 65.113 - Standards: Sampling connection systems.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... be collected or captured. (c) Equipment design and operation. Each closed-purge, closed-loop, or... system; or (2) Collect and recycle the purged process fluid to a process; or (3) Be designed and operated to capture and transport all the purged process fluid to a control device that meets the requirements...

  3. 40 CFR 63.1013 - Sampling connection systems standards.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (CONTINUED) National Emission Standards for Equipment Leaks-Control Level 1 § 63.1013 Sampling connection... container are not required to be collected or captured. (c) Equipment design and operation. Each closed... process fluid to a process; or (3) Be designed and operated to capture and transport all the purged...

  4. Capturing Individual Uptake: Toward a Disruptive Research Methodology

    ERIC Educational Resources Information Center

    Bastian, Heather

    2015-01-01

    This article presents and illustrates a qualitative research methodology for studies of uptake. It does so by articulating a theoretical framework for qualitative investigations of uptake and detailing a research study designed to invoke and capture students' uptakes in a first-year writing classroom. The research design sought to make uptake…

  5. Estimating juvenile Chinook salmon (Oncorhynchus tshawytscha) abundance from beach seine data collected in the Sacramento–San Joaquin Delta and San Francisco Bay, California

    USGS Publications Warehouse

    Perry, Russell W.; Kirsch, Joseph E.; Hendrix, A. Noble

    2016-06-17

    Resource managers rely on abundance or density metrics derived from beach seine surveys to make vital decisions that affect fish population dynamics and assemblage structure. However, abundance and density metrics may be biased by imperfect capture and lack of geographic closure during sampling. Currently, there is considerable uncertainty about the capture efficiency of juvenile Chinook salmon (Oncorhynchus tshawytscha) by beach seines. Heterogeneity in capture can occur through unrealistic assumptions of closure and from variation in the probability of capture caused by environmental conditions. We evaluated the assumptions of closure and the influence of environmental conditions on capture efficiency and abundance estimates of Chinook salmon from beach seining within the Sacramento–San Joaquin Delta and the San Francisco Bay. Beach seine capture efficiency was measured using a stratified random sampling design combined with open and closed replicate depletion sampling. A total of 56 samples were collected during the spring of 2014. To assess variability in capture probability and the absolute abundance of juvenile Chinook salmon, beach seine capture efficiency data were fitted to the paired depletion design using modified N-mixture models. These models allowed us to explicitly test the closure assumption and estimate environmental effects on the probability of capture. We determined that our updated method allowing for lack of closure between depletion samples drastically outperformed traditional data analysis that assumes closure among replicate samples. The best-fit model (lowest-valued Akaike Information Criterion model) included the probability of fish being available for capture (relaxed closure assumption), capture probability modeled as a function of water velocity and percent coverage of fine sediment, and abundance modeled as a function of sample area, temperature, and water velocity. 
Given that beach seining is a ubiquitous sampling technique for many species, our improved sampling design and analysis could provide significant improvements in density and abundance estimation.
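The depletion logic described above can be made concrete with a deliberately simplified sketch. This is not the authors' modified N-mixture model (which relaxes the closure assumption and adds covariates such as water velocity and fine-sediment cover); it is only the classical closed-population two-pass removal estimator, with hypothetical catch numbers, shown to illustrate how capture probability and abundance are jointly estimated from declining catches.

```python
# Illustrative sketch only: closed-population two-pass removal (depletion)
# estimation, in the spirit of the capture-efficiency sampling described
# above. The study's actual models relax closure and add covariates.

def two_pass_removal(c1: int, c2: int) -> tuple:
    """Zippin-style estimator from two depletion passes.

    c1, c2 -- fish removed on the first and second seine hauls.
    Returns (n_hat, p_hat): estimated abundance and per-pass capture
    probability. Requires c1 > c2 (catches must decline).
    """
    if c1 <= c2:
        raise ValueError("catch must decline between passes (c1 > c2)")
    p_hat = (c1 - c2) / c1        # E[c2]/E[c1] = 1 - p, so p = (c1 - c2)/c1
    n_hat = c1 ** 2 / (c1 - c2)   # N = c1 / p
    return n_hat, p_hat

# Hypothetical example: 80 fish on the first haul, 20 on the second.
n_hat, p_hat = two_pass_removal(80, 20)
print(n_hat, p_hat)  # ~106.7 fish, capture probability 0.75
```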

  6. An air-liquid contactor for large-scale capture of CO2 from air.

    PubMed

    Holmes, Geoffrey; Keith, David W

    2012-09-13

    We present a conceptually simple method for optimizing the design of a gas-liquid contactor for capture of carbon dioxide from ambient air, or 'air capture'. We apply the method to a slab geometry contactor that uses components, design and fabrication methods derived from cooling towers. We use mass transfer data appropriate for capture using a strong NaOH solution, combined with engineering and cost data derived from engineering studies performed by Carbon Engineering Ltd, and find that the total costs for air contacting alone (no regeneration) can be of the order of $60 per tonne of CO2. We analyse the reasons why our cost estimate diverges from that of other recent reports and conclude that the divergence arises from fundamental design choices rather than from differences in costing methodology. Finally, we review the technology risks and conclude that they can be readily addressed by prototype testing.

  7. Multi-Disciplinary Design Optimization Using WAVE

    NASA Technical Reports Server (NTRS)

    Irwin, Keith

    2000-01-01

    The current preliminary design tools lack the product performance, quality, and cost prediction fidelity required to design Six Sigma products. They are also frequently incompatible with the tools used in detailed design, leading to a great deal of rework and lost or discarded data in the transition from preliminary to detailed design. Thus, enhanced preliminary design tools are needed in order to produce adequate financial returns to the business. To achieve this goal, GEAE has focused on building the preliminary design system around the same geometric 3D solid model that will be used in detailed design. With this approach, the preliminary designer will no longer convert a flowpath sketch into an engine cross section but rather will automatically create 3D solid geometry for structural integrity, life, weight, cost, complexity, producibility, and maintainability assessments. Likewise, both the preliminary design and the detailed design can benefit from the use of the same preliminary part sizing routines. The design analysis tools will also be integrated with the 3D solid model to eliminate manual transfer of data between programs. GEAE has aggressively pursued the computerized control of engineering knowledge for many years. Through its study and validation of 3D CAD programs and processes, GEAE concluded that total system control was not feasible at that time. Prior CAD tools focused exclusively on detail part geometry, and knowledge-based engineering systems concentrated on rules input and data output. A system was needed to bridge the gap between the two and capture the total system. With the introduction of WAVE Engineering from UGS, the possibility of an engineering system control device began to take shape. GEAE decided to investigate the new WAVE functionality to accomplish this task. NASA joined GEAE in funding this validation project through Task Order No. 1. With the validation project complete, a second phase under Task Order No. 2 was established to develop an associative control structure (framework) in the UG WAVE environment enabling multi-disciplinary design of turbine propulsion systems. The capabilities of WAVE were evaluated to assess its use as a rapid optimization and productivity tool. This project also identified future WAVE product enhancements that will make the tool still more beneficial for product development.

  8. The Choice between MapMan and Gene Ontology for Automated Gene Function Prediction in Plant Science

    PubMed Central

    Klie, Sebastian; Nikoloski, Zoran

    2012-01-01

    Since the introduction of the Gene Ontology (GO), the analysis of high-throughput data has become tightly coupled with the use of ontologies to establish associations between knowledge and data in an automated fashion. Ontologies provide a systematic description of knowledge by a controlled vocabulary of defined structure in which ontological concepts are connected by pre-defined relationships. In plant science, MapMan and GO offer two alternatives for ontology-driven analyses. Unlike GO, initially developed to characterize microbial systems, MapMan was specifically designed to cover plant-specific pathways and processes. While the dependencies between concepts in MapMan are modeled as a tree, in GO these are captured in a directed acyclic graph. Therefore, the difference in ontologies may cause discrepancies in data reduction, visualization, and hypothesis generation. Here we provide the first systematic comparative analysis of GO and MapMan for the case of the model plant species Arabidopsis thaliana (Arabidopsis) with respect to their structural properties and differences in distributions of information content. In addition, we investigate the effect of the two ontologies on the specificity and sensitivity of automated gene function prediction via the coupling of co-expression networks and the guilt-by-association principle. Automated gene function prediction is particularly needed for the model plant Arabidopsis, in which only half of the genes have been functionally annotated based on sequence similarity to known genes. The results highlight the need for structured representation of species-specific biological knowledge and warrant caution in the design principles employed in future ontologies. PMID:22754563

  9. Teaching Evidence-Based Approaches to Suicide Risk Assessment and Prevention that Enhance Psychiatric Training

    PubMed Central

    Zisook, Sidney; Anzia, Joan; Atri, Ashutosh; Baroni, Argelinda; Clayton, Paula; Haller, Ellen; Lomax, Jim; Mann, J. John; Oquendo, Maria A.; Pato, Michele; Perez-Rodriguez, M. Mercedes; Prabhakar, Deepak; Sen, Srijan; Thrall, Grace; Yaseen, Zimri S.

    2012-01-01

    This report describes one in a series of National Institutes of Health (NIH)-supported conferences aimed at enhancing the ability of leaders of psychiatry residency training to teach research literacy and produce both clinician-scholars and physician-scientists in their home programs. Most psychiatry training directors would not consider themselves research scholars or even well-schooled in evidence-based practice. Yet they are the front-line educators who prepare tomorrow’s psychiatrists to keep up with, critically evaluate, and in some cases actually participate in the discovery of new and emerging psychiatric knowledge. This annual conference is meant to help psychiatry training directors become more enthusiastic, knowledgeable, and pedagogically prepared to create research-friendly environments at their home institutions, so that more trainees will, in turn, become research literate, practice evidence-based psychiatry, and enter research fellowships and careers. The overall design of each year’s meeting is a series of plenary sessions introducing participants to new information pertaining to the core theme of that year’s meeting, integrated with highly interactive small group teaching sessions designed to consolidate knowledge and provide pragmatic teaching tools appropriate for residents at various levels of training. The theme of each meeting, selected to be a compelling and contemporary clinical problem, serves as a vehicle to capture training directors’ attention while teaching relevant brain science, research literacy and effective pedagogy. This report describes the content and assessment of the 2011 annual pre-meeting, “Evidence-based Approaches to Suicide Risk Assessment and Prevention: Insights from the Neurosciences and Behavioral Sciences for use in Psychiatry Residency Training.” PMID:22995449

  10. Exploring the opinions of registered nurses working in a clinical transfusion environment on the contribution of e-learning to personal learning and clinical practice: results of a small scale educational research study.

    PubMed

    Cottrell, Susan; Donaldson, Jayne H

    2013-05-01

    To explore the opinions of registered nurses on the Learnbloodtransfusion Module 1: Safe Transfusion Practice e-learning programme in meeting personal learning styles and learning needs. A qualitative research methodology based on the principles of phenomenology was applied. A convenience sampling plan supported the recruitment of participants who had successfully completed the e-learning course. Thematic analysis of the semi-structured interviews identified common emerging themes through application of Colaizzi's framework. Seven participants out of the total sample population (89) volunteered to participate in the study. Five themes emerged: learning preferences, interactive learning, course design, patient safety, and future learning needs. The findings show that the e-learning programme captures the learning styles and needs of learners. In particular, learners with reflector, theorist, and activist learning styles, as well as visual learners, can actively engage in the online learning experience. In an attempt to bridge the knowledge-practice gap, further opinions are offered on the course design and the application of knowledge to practice following completion of the course. The findings of this small-scale research study show that the e-learning course does meet the diverse learning styles and needs of nurses working in a clinical transfusion environment. However, technology alone is not sufficient, and a blended approach to learning must be adopted to bridge the theory-practice gap, supporting the integration of knowledge into clinical practice. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. Design framework for a spectral mask for a plenoptic camera

    NASA Astrophysics Data System (ADS)

    Berkner, Kathrin; Shroff, Sapna A.

    2012-01-01

    Plenoptic cameras are designed to capture different combinations of light rays from a scene, sampling its lightfield. Such camera designs, which capture directional ray information, enable applications such as digital refocusing, rotation, or depth estimation. Only a few also address capturing spectral information of the scene. It has been demonstrated that by modifying a plenoptic camera with a filter array containing different spectral filters inserted in the pupil plane of the main lens, sampling of the spectral dimension of the plenoptic function is performed. As a result, the plenoptic camera is turned into a single-snapshot multispectral imaging system that trades off spatial for spectral information captured with a single sensor. Little work has been performed so far on analyzing the effects of diffraction and aberrations of the optical system on the performance of the spectral imager. In this paper we demonstrate simulation of a spectrally-coded plenoptic camera optical system via wave propagation analysis, evaluate the quality of the spectral measurements captured at the detector plane, and demonstrate opportunities for optimization of the spectral mask for a few sample applications.

  12. A Bayesian Model of the Memory Colour Effect.

    PubMed

    Witzel, Christoph; Olkkonen, Maria; Gegenfurtner, Karl R

    2018-01-01

    According to the memory colour effect, the colour of a colour-diagnostic object is not perceived independently of the object itself. Instead, it has been shown through an achromatic adjustment method that colour-diagnostic objects still appear slightly in their typical colour, even when they are colourimetrically grey. Bayesian models provide a promising approach to capture the effect of prior knowledge on colour perception and to link these effects to more general effects of cue integration. Here, we model memory colour effects using prior knowledge about typical colours as priors for the grey adjustments in a Bayesian model. This simple model does not involve any fitting of free parameters. The Bayesian model roughly captured the magnitude of the measured memory colour effect for photographs of objects. To some extent, the model predicted observed differences in memory colour effects across objects. The model could not account for the differences in memory colour effects across different levels of realism in the object images. The Bayesian model provides a particularly simple account of memory colour effects, capturing some of the multiple sources of variation of these effects.
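The prior-combination mechanism described above can be sketched as a minimal Gaussian cue-integration computation: a prior centred on the object's typical colour is combined with a sensory likelihood centred on grey, shifting the percept slightly toward the typical colour. This is an illustration of the general Bayesian idea only; the variances and the +10 "typical colour" offset below are invented for the example and are not values from the paper's (parameter-free) model.

```python
# Illustrative sketch: precision-weighted combination of a Gaussian prior
# (typical object colour along one colour axis) with a Gaussian sensory
# observation (colourimetric grey). All numbers are hypothetical.

def combine(prior_mean, prior_var, obs_mean, obs_var):
    """Posterior mean and variance for a Gaussian prior x Gaussian likelihood."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    post_mean = post_var * (prior_mean / prior_var + obs_mean / obs_var)
    return post_mean, post_var

# A broad prior at +10 colour units (e.g. a banana's typical yellowness,
# var 100) combined with a reliable grey signal (mean 0, var 4): the
# posterior is pulled slightly away from grey toward the typical colour.
mean, var = combine(10.0, 100.0, 0.0, 4.0)
print(mean)  # ~0.38: grey objects appear faintly tinged with typical colour
```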

  13. A Bayesian Model of the Memory Colour Effect

    PubMed Central

    Olkkonen, Maria; Gegenfurtner, Karl R.

    2018-01-01

    According to the memory colour effect, the colour of a colour-diagnostic object is not perceived independently of the object itself. Instead, it has been shown through an achromatic adjustment method that colour-diagnostic objects still appear slightly in their typical colour, even when they are colourimetrically grey. Bayesian models provide a promising approach to capture the effect of prior knowledge on colour perception and to link these effects to more general effects of cue integration. Here, we model memory colour effects using prior knowledge about typical colours as priors for the grey adjustments in a Bayesian model. This simple model does not involve any fitting of free parameters. The Bayesian model roughly captured the magnitude of the measured memory colour effect for photographs of objects. To some extent, the model predicted observed differences in memory colour effects across objects. The model could not account for the differences in memory colour effects across different levels of realism in the object images. The Bayesian model provides a particularly simple account of memory colour effects, capturing some of the multiple sources of variation of these effects. PMID:29760874

  14. Failure of communication and capture: The perils of temporary unipolar pacing system.

    PubMed

    Sahinoglu, Efe; Wool, Thomas J; Wool, Kenneth J

    2015-06-01

    We present a case of a patient with pacemaker dependence secondary to complete heart block who developed loss of capture of her temporary pacemaker. The patient developed torsades de pointes and then ventricular fibrillation, requiring CPR and external cardioversion. After the patient was stabilized, it was noticed that the loss of capture corresponded with nursing care, when the pulse generator was lifted off the patient's chest wall, and that the patient's temporary pacing system had been programmed to unipolar mode without the knowledge of the attending cardiologist. This case highlights the importance of communication in ensuring that all caregivers are aware of the mode of the temporary pacing system.

  15. Effect of fossil fuels on the parameters of CO2 capture.

    PubMed

    Nagy, Tibor; Mizsey, Peter

    2013-08-06

    Carbon dioxide capture is an increasingly important issue in the design and operation of boilers and/or power stations because of growing environmental considerations. Such capture processes (absorber-desorber) should be able to cope with flue gases from different fossil primary energy sources in order to guarantee flexible, stable, and secure energy supply operation. The changing flue gases have a significant influence on the optimal operation of the capture process, that is, the point at which the required heating of the desorber is minimal. Therefore, special consideration must be devoted to the proper design and control of boilers and/or power stations equipped with a CO2 capture process.

  16. Reusing Design Knowledge Based on Design Cases and Knowledge Map

    ERIC Educational Resources Information Center

    Yang, Cheng; Liu, Zheng; Wang, Haobai; Shen, Jiaoqi

    2013-01-01

    Design knowledge was reused for innovative design work to support designers with product design knowledge and help designers who lack rich experiences to improve their design capacity and efficiency. First, based on the ontological model of product design knowledge constructed by taxonomy, implicit and explicit knowledge was extracted from some…

  17. Capturing field-scale variability in crop performance across a regional-scale climosequence

    NASA Astrophysics Data System (ADS)

    Brooks, E. S.; Poggio, M.; Anderson, T. R.; Gasch, C.; Yourek, M. A.; Ward, N. K.; Magney, T. S.; Brown, D. J.; Huggins, D. R.

    2014-12-01

    With the increasing availability of variable rate technology for applying fertilizers and other agrichemicals in dryland agricultural production systems, there is a growing need to better capture and understand the processes driving field-scale variability in crop yield and soil water. This need for a better understanding of field-scale variability has led to the recent designation of the R. J. Cook Agronomy Farm (CAF) (Pullman, WA, USA) as a United States Department of Agriculture Long-Term Agro-Ecosystem Research (LTAR) site. Field-scale variability at the CAF is closely monitored using extensive environmental sensor networks and intensive hand sampling. While investigating land-soil-water dynamics at the CAF is essential for improving precision agriculture, transferring this knowledge across the regional-scale climosequence is challenging. In this study we describe the hydropedologic functioning of the CAF in relation to five extensively instrumented field sites located within 50 km in the same climatic region. The formation of restrictive argillic soil horizons in the wetter, cooler eastern edge of the region results in the development of extensive perched water tables, surface saturation, and surface runoff, whereas excess water is not an issue in the warmer, drier western edge of the region. Crop and tillage management varies across the region as well. We discuss the implications of these regional differences for field-scale management decisions and demonstrate how we are using proximal soil sensing and remote sensing imagery to better understand and capture field-scale variability at a particular field site.

  18. Consistent visualizations of changing knowledge

    PubMed Central

    Tipney, Hannah J.; Schuyler, Ronald P.; Hunter, Lawrence

    2009-01-01

    Networks are increasingly used in biology to represent complex data in uncomplicated symbolic form. However, as biological knowledge is continually evolving, so must the networks representing this knowledge. Capturing and presenting this type of knowledge change over time is particularly challenging due to the intimate manner in which researchers customize the networks they come into contact with. The effective visualization of this knowledge is important as it creates insight into complex systems and stimulates hypothesis generation and biological discovery. Here we highlight how the retention of user customizations, together with the collection and visualization of knowledge-associated provenance, supports effective and productive network exploration. We also present an extension of the Hanalyzer system, ReOrient, which supports network exploration and analysis in the presence of knowledge change. PMID:21347184

  19. Toward more transparent and reproducible omics studies through a common metadata checklist and data publications.

    PubMed

    Kolker, Eugene; Özdemir, Vural; Martens, Lennart; Hancock, William; Anderson, Gordon; Anderson, Nathaniel; Aynacioglu, Sukru; Baranova, Ancha; Campagna, Shawn R; Chen, Rui; Choiniere, John; Dearth, Stephen P; Feng, Wu-Chun; Ferguson, Lynnette; Fox, Geoffrey; Frishman, Dmitrij; Grossman, Robert; Heath, Allison; Higdon, Roger; Hutz, Mara H; Janko, Imre; Jiang, Lihua; Joshi, Sanjay; Kel, Alexander; Kemnitz, Joseph W; Kohane, Isaac S; Kolker, Natali; Lancet, Doron; Lee, Elaine; Li, Weizhong; Lisitsa, Andrey; Llerena, Adrian; Macnealy-Koch, Courtney; Marshall, Jean-Claude; Masuzzo, Paola; May, Amanda; Mias, George; Monroe, Matthew; Montague, Elizabeth; Mooney, Sean; Nesvizhskii, Alexey; Noronha, Santosh; Omenn, Gilbert; Rajasimha, Harsha; Ramamoorthy, Preveen; Sheehan, Jerry; Smarr, Larry; Smith, Charles V; Smith, Todd; Snyder, Michael; Rapole, Srikanth; Srivastava, Sanjeeva; Stanberry, Larissa; Stewart, Elizabeth; Toppo, Stefano; Uetz, Peter; Verheggen, Kenneth; Voy, Brynn H; Warnich, Louise; Wilhelm, Steven W; Yandl, Gregory

    2014-01-01

    Biological processes are fundamentally driven by complex interactions between biomolecules. Integrated high-throughput omics studies enable multifaceted views of cells, organisms, or their communities. With the advent of new post-genomics technologies, omics studies are becoming increasingly prevalent; yet the full impact of these studies can only be realized through data harmonization, sharing, meta-analysis, and integrated research. These essential steps require consistent generation, capture, and distribution of metadata. To ensure transparency, facilitate data harmonization, and maximize reproducibility and usability of life sciences studies, we propose a simple common omics metadata checklist. The proposed checklist is built on the rich ontologies and standards already in use by the life sciences community. The checklist will serve as a common denominator to guide experimental design, capture important parameters, and be used as a standard format for stand-alone data publications. The omics metadata checklist and data publications will create efficient linkages between omics data and knowledge-based life sciences innovation and, importantly, allow for appropriate attribution to data generators and infrastructure science builders in the post-genomics era. We ask that the life sciences community test the proposed omics metadata checklist and data publications and provide feedback for their use and improvement.
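As a hedged illustration of how such a checklist could be made machine-checkable, the sketch below validates a metadata record against a set of required fields. The field names are assumptions invented for the example, not the checklist actually proposed in the article.

```python
# Hypothetical sketch: enforcing a minimal omics metadata checklist in code.
# The required field names below are illustrative assumptions only.

REQUIRED_FIELDS = {
    "study_id", "organism", "sample_type", "assay_type",
    "instrument", "protocol_ref", "data_repository_accession",
}

def validate_metadata(record: dict) -> list:
    """Return the sorted list of required checklist fields missing from a record."""
    return sorted(REQUIRED_FIELDS - record.keys())

# A partially completed record fails validation with a concrete to-do list.
record = {
    "study_id": "OMX-001",
    "organism": "Homo sapiens",
    "sample_type": "plasma",
    "assay_type": "proteomics",
}
print(validate_metadata(record))
# -> ['data_repository_accession', 'instrument', 'protocol_ref']
```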

  20. Toward More Transparent and Reproducible Omics Studies Through a Common Metadata Checklist and Data Publications.

    PubMed

    Kolker, Eugene; Özdemir, Vural; Martens, Lennart; Hancock, William; Anderson, Gordon; Anderson, Nathaniel; Aynacioglu, Sukru; Baranova, Ancha; Campagna, Shawn R; Chen, Rui; Choiniere, John; Dearth, Stephen P; Feng, Wu-Chun; Ferguson, Lynnette; Fox, Geoffrey; Frishman, Dmitrij; Grossman, Robert; Heath, Allison; Higdon, Roger; Hutz, Mara H; Janko, Imre; Jiang, Lihua; Joshi, Sanjay; Kel, Alexander; Kemnitz, Joseph W; Kohane, Isaac S; Kolker, Natali; Lancet, Doron; Lee, Elaine; Li, Weizhong; Lisitsa, Andrey; Llerena, Adrian; MacNealy-Koch, Courtney; Marshall, Jean-Claude; Masuzzo, Paola; May, Amanda; Mias, George; Monroe, Matthew; Montague, Elizabeth; Mooney, Sean; Nesvizhskii, Alexey; Noronha, Santosh; Omenn, Gilbert; Rajasimha, Harsha; Ramamoorthy, Preveen; Sheehan, Jerry; Smarr, Larry; Smith, Charles V; Smith, Todd; Snyder, Michael; Rapole, Srikanth; Srivastava, Sanjeeva; Stanberry, Larissa; Stewart, Elizabeth; Toppo, Stefano; Uetz, Peter; Verheggen, Kenneth; Voy, Brynn H; Warnich, Louise; Wilhelm, Steven W; Yandl, Gregory

    2013-12-01

    Biological processes are fundamentally driven by complex interactions between biomolecules. Integrated high-throughput omics studies enable multifaceted views of cells, organisms, or their communities. With the advent of new post-genomics technologies, omics studies are becoming increasingly prevalent; yet the full impact of these studies can only be realized through data harmonization, sharing, meta-analysis, and integrated research. These essential steps require consistent generation, capture, and distribution of metadata. To ensure transparency, facilitate data harmonization, and maximize reproducibility and usability of life sciences studies, we propose a simple common omics metadata checklist. The proposed checklist is built on the rich ontologies and standards already in use by the life sciences community. The checklist will serve as a common denominator to guide experimental design, capture important parameters, and be used as a standard format for stand-alone data publications. The omics metadata checklist and data publications will create efficient linkages between omics data and knowledge-based life sciences innovation and, importantly, allow for appropriate attribution to data generators and infrastructure science builders in the post-genomics era. We ask that the life sciences community test the proposed omics metadata checklist and data publications and provide feedback for their use and improvement.

  1. Summary Report of H- Injection Session II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weiren Chou

    1999-06-28

    H- injection was invented many years ago and has since been successfully applied in many machines over the last decades. The challenge for high-intensity machines is how to reduce the injection loss, which is usually the major part of total beam losses in a machine. Painting, both longitudinal and transverse, is an effective way to reduce space charge effects and to minimize losses. RF capture of a chopped beam also gives better efficiency than adiabatic capture. Employing a 2nd harmonic rf system to flatten the rf bucket shape is another commonly used scheme. Compensating the capacitive space charge impedance with an inductive insert could be a new venture, but this was not discussed at the workshop due to time limitation. The foil physics is well understood. Simulations seem to be able to include all the important effects, including the space charge. The general feeling is that we are in a good position concerning H- injection studies. Although a number of design issues remain, the knowledge, experience, and tools in our hands should be able to address each of them properly.

  2. Developing Pre-Service Teachers' Subject Matter Knowledge of Electromagnetism by Integrating Concept Maps and Collaborative Learning

    ERIC Educational Resources Information Center

    Govender, Nadaraj

    2015-01-01

    This case study explored the development of two pre-service teachers' subject matter knowledge (SMK) of electromagnetism while integrating the use of concept maps (CM) and collaborative learning (CL) strategies. The study aimed at capturing how these pre-service teachers' SMK in electromagnetism was enhanced after having been taught SMK in a…

  3. Hanging with the Right Crowd: Crowdsourcing as a New Business Practice for Innovation, Productivity, Knowledge Capture, and Marketing

    ERIC Educational Resources Information Center

    Erickson, Lisa B.

    2013-01-01

    In today's connected world, the reach of the Internet and collaborative social media tools have opened up new opportunities for individuals, regardless of their location, to share their knowledge, expertise, and creativity with others. These tools have also opened up opportunities for organizations to connect with new sources of innovation to…

  4. OER (Re)Use and Language Teachers' Tacit Professional Knowledge: Three Vignettes

    ERIC Educational Resources Information Center

    Beaven, Tita

    2015-01-01

    The pedagogic practical knowledge that teachers use in their lessons is very difficult to make visible and often remains tacit. This chapter draws on data from a recent study and closely analyses a number of Open Educational Resources used by three language teachers at the UK Open University in order to try to capture how their use of the…

  5. Developing an Advanced Environment for Collaborative Computing

    NASA Technical Reports Server (NTRS)

    Becerra-Fernandez, Irma; Stewart, Helen; DelAlto, Martha; Knight, Chris

    1999-01-01

    Knowledge management in general tries to organize and make available important know-how whenever and wherever it is needed. Today, organizations rely on decision-makers to produce "mission critical" decisions that are based on inputs from multiple domains. The ideal decision-maker has a profound understanding of the specific domains that influence the decision-making process, coupled with the experience that allows them to act quickly and decisively on the information. In addition, learning companies benefit by not repeating costly mistakes and by reducing time-to-market in Research & Development projects. Group decision-making tools can help companies make better decisions by capturing the knowledge from groups of experts. Furthermore, companies that capture their customers' preferences can improve their customer service, which translates to larger profits. Therefore, collaborative computing provides a common communication space, improves sharing of knowledge, provides a mechanism for real-time feedback on the tasks being performed, helps to optimize processes, and results in a centralized knowledge warehouse. This paper presents the research directions of a project which seeks to augment an advanced collaborative web-based environment called Postdoc with workflow capabilities. Postdoc is a "government-off-the-shelf" document management software package developed at NASA Ames Research Center (ARC).

  6. The Power of Story: Dressing Up the Naked Truth

    NASA Technical Reports Server (NTRS)

    Simmons, Annette

    2004-01-01

    ASK Magazine is not alone when it comes to using storytelling to capture lessons learned and share knowledge. Several other practitioners have successfully introduced this approach to knowledge management within organizations. This article by Annette Simmons marks the first in a series by authors whose work on storytelling has been widely recognized. We hope these features illuminate why ASK contributors use the story form to share their knowledge, and how you can do the same. Annette Simmons spoke at the February 2002 APPL Masters Forum.

  7. Artificial intelligence techniques for scheduling Space Shuttle missions

    NASA Technical Reports Server (NTRS)

    Henke, Andrea L.; Stottler, Richard H.

    1994-01-01

    Planning and scheduling of NASA Space Shuttle missions is a complex, labor-intensive process requiring the expertise of experienced mission planners. We have developed a planning and scheduling system using combinations of artificial intelligence knowledge representations and planning techniques to capture mission planning knowledge and automate the multi-mission planning process. Our integrated object oriented and rule-based approach reduces planning time by orders of magnitude and provides planners with the flexibility to easily modify planning knowledge and constraints without requiring programming expertise.

  8. The power of techknowledgy.

    PubMed

    Kabachinski, Jeff

    2010-01-01

    Knowledge can range from complex, accumulated expertise (tacit knowledge) to structured explicit content like service procedures. For most of us, knowledge management should only be one of many collaborative means to an end, not the end in itself (unless you are the corporate knowledge management director or chief knowledge officer). For that reason, KM is important only to the extent that it improves an organization's capability and capacity to deal with, and develop in, the four dimensions of capturing, codifying, storing, and using knowledge. Knowledge that is more or less explicit can be embedded in procedures or represented in documents and databases and transferred with reasonable accuracy. Tacit knowledge transfer generally requires extensive personal contact. Take for example troubleshooting circuits. While troubleshooting can be procedural to an extent, it is still somewhat of an art that pulls from experience and training. This is the kind of tacit knowledge where partnerships, mentoring, or an apprenticeship, are most effective. The most successful organizations are those where knowledge management is part of everyone's job. Tacit, complex knowledge that is developed and internalized over a long period of time is almost impossible to reproduce in a document, database, or expert system. Even before the days of "core competencies", the learning organization, expert systems, and strategy focus, good managers valued the experience and know-how of employees. Today, many are recognizing that what is needed is more than a casual approach to corporate knowledge if they are to succeed. In addition, the aging population of the baby boomers may require means to capture their experience and knowledge before they leave the workforce. There is little doubt that knowledge is one of any organization's most important resources, or that knowledge workers' roles will grow in importance in the years ahead. 
Why would an organization believe that knowledge and knowledge workers are important, yet not advocate active management of knowledge itself? Taking advantage of already accumulated corporate intellectual property is by far the lowest-cost way to increase capability and competitive stature. These are all good reasons why it might pay to take a look at your KM usage.

  9. Design of Stratified Functional Nanoporous Materials for CO2 Capture and Conversion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, J. Karl; Ye, Jingyun

    The objective of this project is to develop novel nanoporous materials for CO2 capture and conversion. The motivation of this work is that capture of CO2 from flue gas or the atmosphere, coupled with catalytic hydrogenation of CO2 into valuable chemicals and fuels, can reduce the net amount of CO2 in the atmosphere while providing liquid transportation fuels and other commodity chemicals. One approach to increasing the economic viability of carbon capture and conversion is to design a single material that can be used for both the capture and catalytic conversion of CO2, because such a material could increase efficiency through process intensification. We have used density functional theory (DFT) methods to design catalytic moieties that can be incorporated into various metal organic framework (MOF) materials. We chose to work with MOFs because they are highly tailorable, can be functionalized, and have been shown to selectively adsorb CO2 over N2, which is a requirement for CO2 capture from flue gas. Moreover, the incorporation of molecular catalytic moieties into MOFs, through covalent bonding, produces a heterogeneous catalytic material having activities and selectivities close to those of homogeneous catalysts, but without the drawbacks associated with homogeneous catalysis.

  10. Derivation of Rigid Body Analysis Models from Vehicle Architecture Abstractions

    DTIC Science & Technology

    2011-06-17

    models of every type have their basis in some type of physical representation of the design domain. Rather than describing three-dimensional continua of...arrangement, while capturing just enough physical detail to be used as the basis for a meaningful representation of the design, and eventually, analyses that...permit architecture assessment. The design information captured by the abstractions is available at the very earliest stages of the vehicle

  11. An automated rendezvous and capture system design concept for the cargo transfer vehicle and Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Fuchs, Ron; Marsh, Steven

    1991-01-01

    A rendezvous sensor system concept was developed for the cargo transfer vehicle (CTV) to autonomously rendezvous with and be captured by Space Station Freedom (SSF). The development of requirements, the design of a unique Lockheed-developed sensor concept to meet these requirements, and the system design to place this sensor on the CTV and rendezvous with the SSF are described.

  12. W.A. Parish Post Combustion CO2 Capture and Sequestration Project Final Public Design Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Armpriester, Anthony

    The Petra Nova Project is a commercial scale post-combustion carbon dioxide capture project that is being developed by a joint venture between NRG Energy (NRG) and JX Nippon Oil and Gas Exploration (JX). The project is designed to separate and capture carbon dioxide from an existing coal-fired unit's flue gas slipstream at NRG's W.A. Parish Generation Station located southwest of Houston, Texas. The captured carbon dioxide will be transported by pipeline and injected into the West Ranch oil field to boost oil production. The project, which is partially funded by financial assistance from the U.S. Department of Energy, will use Mitsubishi Heavy Industries of America, Inc.'s Kansai Mitsubishi Carbon Dioxide Recovery (KM-CDR®) advanced amine-based carbon dioxide absorption technology to treat and capture at least 90% of the carbon dioxide from a 240 megawatt equivalent flue gas slipstream off of Unit 8 at W.A. Parish. The project will capture approximately 5,000 tons of carbon dioxide per day, or 1.5 million tons per year, that Unit 8 would otherwise emit, representing the largest commercial scale deployment of post-combustion carbon dioxide capture at a coal power plant to date. The joint venture issued full notice to proceed in July 2014 and, when complete, the project is expected to be the world's largest post-combustion carbon dioxide capture facility on an existing coal plant. The detailed engineering is sufficiently complete to prepare and issue the Final Public Design Report.

  13. Evaluation of Solid Sorbents as a Retrofit Technology for CO2 Capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sjostrom, Sharon

    2016-06-02

    ADA completed a DOE-sponsored program titled Evaluation of Solid Sorbents as a Retrofit Technology for CO2 Capture under program DE-FE0004343. During this program, sorbents were analyzed for use in a post-combustion CO2 capture process. A supported amine sorbent was selected based upon its superior performance in adsorbing a greater amount of CO2 than the activated carbon sorbents tested. Once the most promising sorbent had been selected, it was characterized and used to create a preliminary techno-economic analysis (TEA). A preliminary 550 MW coal-fired power plant using Illinois #6 bituminous coal was designed with a solid sorbent CO2 capture system using the selected supported amine sorbent, both to facilitate the TEA and to create the necessary framework to scale down the design to a 1 MWe equivalent slipstream pilot facility. The preliminary techno-economic analysis showed promising results and potential for improved performance for CO2 capture compared to conventional MEA systems. As a result, a 1 MWe equivalent solid sorbent system was designed, constructed, and then installed at a coal-fired power plant in Alabama. The pilot was designed to capture 90% of the CO2 from the incoming flue gas at a 1 MWe net electrical generating equivalent. Testing was not possible at the design conditions due to changes in sorbent handling characteristics at post-regenerator temperatures that were not properly incorporated into the pilot design. Thus, severe pluggage occurred at nominally 60% of the design sorbent circulation rate with heated sorbent, although no handling issues were noted when the system was operated prior to bringing the regenerator to operating temperature. Testing within the constraints of the pilot plant resulted in 90% capture of the incoming CO2 at a flow rate equivalent to 0.2 to 0.25 MWe net electrical generating equivalent.
The reduction in equivalent flow rate at 90% capture was primarily the result of sorbent circulation limitations at operating temperatures combined with pre-loading of the sorbent with CO2 prior to entering the adsorber. Specifically, CO2-rich gas was used to convey sorbent from the regenerator to the adsorber; this gas was nominally 45°C below the regenerator temperature during testing. ADA's post-combustion capture system, with modifications to overcome the pilot constraints, and incorporating a sorbent with a CO2 working capacity of 15 g CO2/100 g sorbent and a contact time with flue gas of 10 to 15 minutes or less, could provide significant cost and performance benefits when compared to an MEA system.
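
    The working-capacity figure above implies a simple mass balance: the required sorbent circulation rate scales inversely with how much CO2 each kilogram of sorbent carries per adsorb/regenerate cycle. A rough sketch follows; the 15 g CO2/100 g sorbent capacity and the 90% target come from the abstract, while the flue gas CO2 flow is a hypothetical placeholder.

```python
# Back-of-envelope sorbent circulation estimate. Only the working capacity
# (15 g CO2 / 100 g sorbent) and the 90% capture target come from the
# abstract; the CO2 flow below is an illustrative assumption.

def sorbent_circulation_rate(co2_capture_rate_kg_s, working_capacity):
    """Sorbent mass flow (kg/s) needed to absorb a given CO2 mass flow.

    working_capacity: kg of CO2 captured per kg of circulating sorbent
    per adsorb/regenerate cycle.
    """
    return co2_capture_rate_kg_s / working_capacity

co2_flow = 0.25           # hypothetical: kg CO2/s entering the adsorber
target = 0.90 * co2_flow  # 90% capture target from the abstract
capacity = 0.15           # 15 g CO2 per 100 g sorbent

print(sorbent_circulation_rate(target, capacity))  # kg sorbent per second
```

    The inverse dependence is the point: halving the working capacity doubles the circulation rate, which is why the circulation limitations described above directly cap the achievable capture rate.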

  14. Large Pilot Scale Testing of Linde/BASF Post-Combustion CO2 Capture Technology at the Abbott Coal-Fired Power Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Brien, Kevin C.

    The work summarized in this report is the first step towards a project that will re-train and create jobs for personnel in the coal industry and continue regional economic development to benefit regions impacted by previous downturns. The larger project is aimed at capturing ~300 tons/day (272 metric tonnes/day) of CO2 at a 90% capture rate from existing coal-fired boilers at the Abbott Power Plant on the campus of the University of Illinois (UI). It will employ the Linde-BASF novel amine-based advanced CO2 capture technology, which has already shown the potential to be cost-effective, energy efficient, and compact at the 0.5-1.5 MWe pilot scales. The overall objective of the project is to design and install a scaled-up system of nominal 15 MWe size, integrate it with the Abbott Power Plant flue gas, steam, and other utility systems, and demonstrate the viability of continuous operation under realistic conditions with high efficiency and capacity. The project will also begin to build a workforce that understands how to operate and maintain the capture plants by including students from regional community colleges and universities in the operation and evaluation of the capture system. This project will also lay the groundwork for follow-on projects that pilot utilization of the captured CO2 from coal-fired power plants. The net impact will be to demonstrate a replicable means to (1) use a standardized procedure to evaluate power plants for their ability to be retrofitted with a pilot capture unit; (2) design and construct reliable capture systems based on the Linde-BASF technology; (3) operate and maintain these systems; (4) implement training programs with local community colleges and universities to establish a workforce to operate and maintain the systems; and (5) prepare to evaluate at the large pilot scale various methods to utilize the resulting captured CO2.
Towards the larger project goal, the UI-led team, together with Linde, has completed a preliminary design for the carbon capture pilot plant with basic engineering and cost estimates, established permitting needs, identified approaches to address Environmental, Health, and Safety concerns related to pilot plant installation and operation, developed approaches for long-term use of the captured carbon, and established strategies for workforce development and job creation that will re-train coal operators to operate carbon capture plants. This report describes Phase 1 accomplishments and demonstrates that the project team is well-prepared for full implementation of Phase 2: to design, build, and operate the carbon capture pilot plant.

  15. Informatics — EDRN Public Portal

    Cancer.gov

    The EDRN provides a comprehensive informatics activity which includes a number of tools and an integrated knowledge environment for capturing, managing, integrating, and sharing results from across EDRN's cancer biomarker research network.

  16. Global Dynamic Exposure and the OpenBuildingMap - Communicating Risk and Involving Communities

    NASA Astrophysics Data System (ADS)

    Schorlemmer, Danijel; Beutin, Thomas; Hirata, Naoshi; Hao, Ken; Wyss, Max; Cotton, Fabrice; Prehn, Karsten

    2017-04-01

    Detailed understanding of local risk factors regarding natural catastrophes requires in-depth characterization of the local exposure. Current exposure capture techniques have to find a balance between resolution and coverage. We aim to bridge this gap by employing a crowd-sourced approach to exposure capturing, focusing on risk related to earthquake hazard. OpenStreetMap (OSM), the rich and constantly growing geographical database, is an ideal foundation for this task. More than 3.5 billion geographical nodes, more than 200 million building footprints (growing by 100,000 per day), and a plethora of information about schools, hospitals, and other critical facilities allow us to exploit this dataset for risk-related computations. We are combining the strengths of crowd-sourced data collection with the knowledge of experts in extracting the most information from these data. Besides relying on the very active OpenStreetMap community and the Humanitarian OpenStreetMap Team, which are collecting building information at a high pace, we are providing a tailored building capture tool for mobile devices. This tool facilitates simple and fast capturing of building properties for OpenStreetMap by any person or interested community. With our OpenBuildingMap system, we are harvesting this dataset by processing every building in near-realtime. We are collecting exposure and vulnerability indicators from explicitly provided data (e.g. hospital locations), implicitly provided data (e.g. building shapes and positions), and semantically derived data, i.e. interpretation applying expert knowledge. The expert knowledge is needed to translate the simple building properties captured by OpenStreetMap users into vulnerability and exposure indicators and subsequently into building classifications as defined in the Building Taxonomy 2.0 developed by the Global Earthquake Model (GEM) and the European Macroseismic Scale (EMS98).
With this approach, we increase the resolution of existing exposure models from aggregated exposure information to building-by-building vulnerability. We report on our method, on the software development for the mobile application and the server-side analysis system, and on the OpenBuildingMap (www.openbuildingmap.org), our global Tile Map Service focusing on building properties. The free/open framework we provide can be used on commodity hardware for local to regional exposure capturing, for stakeholders in disaster management and mitigation for communicating risk, and for communities to understand their risk.
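
    The translation step described above, from simple OSM building properties to vulnerability indicators, can be pictured as a rule-based lookup. The sketch below is hypothetical: the tag keys follow OSM conventions, but the classes and rules are illustrative placeholders, not the actual GEM Building Taxonomy 2.0 mapping used by OpenBuildingMap.

```python
# Hypothetical rule-based mapping from OpenStreetMap building tags to a
# coarse vulnerability class. The real OpenBuildingMap mapping to the GEM
# Building Taxonomy 2.0 / EMS98 is far richer; this only sketches the idea.

def classify_building(tags):
    """Map a dict of OSM key/value tags to a coarse vulnerability label."""
    material = tags.get("building:material", "unknown")
    levels = int(tags.get("building:levels", 1))
    if material in ("reinforced_concrete", "steel"):
        base = "low"
    elif material in ("brick", "masonry"):
        base = "medium"
    elif material in ("adobe", "mud"):
        base = "high"
    else:
        base = "unknown"
    # Taller masonry buildings get bumped one class (illustrative rule only).
    if base == "medium" and levels >= 5:
        base = "high"
    return base

print(classify_building({"building:material": "brick", "building:levels": "6"}))
```

    The point of the expert-knowledge layer is exactly this kind of encoding: crowd-sourced tags are simple, but the rules that interpret them must come from engineering judgment.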

  17. An Expert System-Driven Method for Parametric Trajectory Optimization During Conceptual Design

    NASA Technical Reports Server (NTRS)

    Dees, Patrick D.; Zwack, Mathew R.; Steffens, Michael; Edwards, Stephen; Diaz, Manuel J.; Holt, James B.

    2015-01-01

    During the early phases of engineering design, the costs committed are high, costs incurred are low, and the design freedom is high. It is well documented that decisions made in these early design phases drive the entire design's life cycle cost. In a traditional paradigm, key design decisions are made when little is known about the design. As the design matures, design changes become more difficult in both cost and schedule to enact. The current capability-based paradigm, which has emerged because of the constrained economic environment, calls for the infusion of knowledge usually acquired during later design phases into earlier design phases, i.e., bringing knowledge acquired during preliminary and detailed design into pre-conceptual and conceptual design. An area of critical importance to launch vehicle design is the optimization of its ascent trajectory, as the optimal trajectory will be able to take full advantage of the launch vehicle's capability to deliver a maximum amount of payload into orbit. Hence, the optimal ascent trajectory plays an important role in the vehicle's affordability posture, yet little of the information required to successfully optimize a trajectory is known early in the design phase. Thus, the current paradigm of optimizing ascent trajectories involves generating point solutions for every change in a vehicle's design parameters. This is often a very tedious, manual, and time-consuming task for the analysts. Moreover, the trajectory design space is highly non-linear and multi-modal due to the interaction of various constraints. When these obstacles are coupled with the Program to Optimize Simulated Trajectories (POST), an industry-standard program to optimize ascent trajectories that is difficult to use, expert trajectory analysts are required to effectively optimize a vehicle's ascent trajectory.
Over the course of this paper, the authors discuss a methodology developed at NASA Marshall's Advanced Concepts Office to address these issues. The methodology is two-fold: first, capture the heuristics developed by human analysts over their many years of experience; and second, leverage the power of modern computing to evaluate multiple trajectories simultaneously and thereby enable the exploration of the trajectory's design space early during the pre-conceptual and conceptual phases of design. This methodology is coupled with design of experiments in order to train surrogate models, which enables trajectory design space visualization and parametric optimal ascent trajectory information to be available when early design decisions are being made.
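
    The surrogate-modeling step described above can be sketched as follows: sample a design of experiments over the design variables, evaluate each point with the expensive trajectory code (here a hypothetical stand-in function, since POST itself is not available), and fit a cheap response surface that can then be queried freely during design-space exploration.

```python
import numpy as np

# Sketch of surrogate modeling over a design of experiments. The payload
# function is an entirely hypothetical stand-in for an expensive trajectory
# evaluation; in the paper's methodology, POST would supply these values.

rng = np.random.default_rng(0)

def payload(x1, x2):
    """Hypothetical payload-to-orbit model of two design variables."""
    return 100 - 3 * (x1 - 2) ** 2 - 2 * (x2 - 1) ** 2

# Random design of experiments over the two-variable design space
X = rng.uniform(low=[0, 0], high=[4, 2], size=(50, 2))
y = payload(X[:, 0], X[:, 1])

# Fit a quadratic response surface: basis 1, x1, x2, x1^2, x2^2, x1*x2
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Query the cheap surrogate anywhere, e.g. at (x1, x2) = (2, 1)
pred = coef @ [1, 2, 1, 4, 1, 2]
print(round(float(pred), 3))
```

    Because the stand-in function here is itself quadratic, the least-squares fit recovers it essentially exactly; real trajectory responses are multi-modal, which is why the paper pairs surrogates with captured analyst heuristics rather than relying on the surface alone.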

  18. A Discussion of Knowledge Based Design

    NASA Technical Reports Server (NTRS)

    Wood, Richard M.; Bauer, Steven X. S.

    1999-01-01

    A discussion of knowledge and Knowledge-Based design as related to the design of aircraft is presented. The paper discusses the perceived problem with existing design studies and introduces the concepts of design and knowledge for a Knowledge-Based design system. A review of several Knowledge-Based design activities is provided. A Virtual Reality, Knowledge-Based system is proposed and reviewed. The feasibility of Virtual Reality to improve the efficiency and effectiveness of aerodynamic and multidisciplinary design, evaluation, and analysis of aircraft through the coupling of virtual reality technology and a Knowledge-Based design system is also reviewed. The final section of the paper discusses future directions for design and the role of Knowledge-Based design.

  19. Exploring creative activity: a software environment for multimedia systems

    NASA Astrophysics Data System (ADS)

    Farrett, Peter W.; Jardine, David A.

    1992-03-01

    This paper examines various issues related to the theory, design, and implementation of a system that supports creative activity for a multimedia environment. The system incorporates artificial intelligence notions to acquire concepts of the problem domain. This paper investigates this environment by considering a model that is a basis for a system, which supports a history of user interaction. A multimedia system that supports creative activity is problematic. It must function as a tool allowing users to experiment dynamically with their own creative reasoning process--a very nebulous task environment. It should also support the acquisition of domain knowledge so that empirical observation can be further evaluated. This paper aims to illustrate that, via the reuse of domain-specific knowledge, closely related ideas can be quickly developed. This approach is useful in the following sense: Multimedia navigational systems hardcode referential links with respect to a web or network. Although users can access or control navigation in a nonlinear (static) manner, these referential links are 'frozen' and cannot capture their creative actions, which are essential in tutoring or learning applications. This paper describes a multimedia assistant based on the notion of knowledge-links, which allows users to navigate through creative information in a nonlinear (dynamic) fashion. A selection of prototype code based on object-oriented techniques and logic programming partially demonstrates this.

  20. Market Characteristics and Awareness of Managed Care Options Among Elderly Beneficiaries Enrolled in Traditional Medicare

    PubMed Central

    Mittler, Jessica N.; Landon, Bruce E.; Zaslavsky, Alan M.; Cleary, Paul D.

    2011-01-01

    Background Medicare beneficiaries' awareness of Medicare managed care plans is critical for realizing the potential benefits of coverage choices. Objectives To assess the relationships of the number of Medicare risk plans, managed care penetration, and stability of plans in an area with traditional Medicare beneficiaries' awareness of the program. Research Design Cross-sectional analysis of Medicare Current Beneficiary Survey data about beneficiaries' awareness and knowledge of Medicare managed care plan availability. Logistic regression models were used to assess the relationships between awareness and market characteristics. Subjects Traditional Medicare beneficiaries (n = 3,597) who had never been enrolled in Medicare managed care, but had at least one plan available in their area in 2002, and excluding beneficiaries under 65, receiving Medicaid, or with end stage renal disease. Measures Traditional Medicare beneficiaries' knowledge of Medicare managed care plans in general and in their area. Results Having more Medicare risk plans available was significantly associated with greater awareness, and having an intermediate number of plans (2-4) was significantly associated with more accurate knowledge of Medicare risk plan availability than was having fewer or more plans. Conclusions Medicare may have more success engaging consumers in choice and capturing the benefits of plan competition by more actively selecting and managing the plan choice set. PMID:22340776
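
    A minimal sketch of the kind of logistic regression the study describes: plan awareness modeled as a function of the number of available plans. The data below are synthetic stand-ins, not the Medicare Current Beneficiary Survey sample, and the single predictor omits the study's other market characteristics.

```python
import numpy as np

# Logistic regression via plain gradient ascent on the average
# log-likelihood. Data are synthetic: awareness probability rises with the
# number of plans available (true intercept -1.0, true slope 0.4).

rng = np.random.default_rng(1)
n_plans = rng.integers(1, 8, size=2000).astype(float)
true_logit = -1.0 + 0.4 * n_plans
aware = rng.random(2000) < 1 / (1 + np.exp(-true_logit))

X = np.column_stack([np.ones_like(n_plans), n_plans])  # intercept + plans
w = np.zeros(2)
for _ in range(5000):
    p = 1 / (1 + np.exp(-X @ w))          # predicted awareness probability
    w += 0.1 * X.T @ (aware - p) / len(aware)  # gradient ascent step

print(np.round(w, 2))  # fitted coefficients; slope should be near 0.4
```

    A positive fitted slope corresponds to the study's finding that more available plans are associated with greater awareness; the actual analysis adjusts for penetration, plan stability, and beneficiary characteristics.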

  1. Small Particles Intact Capture Experiment (SPICE)

    NASA Technical Reports Server (NTRS)

    Nishioka, Ken-Ji; Carle, G. C.; Bunch, T. E.; Mendez, David J.; Ryder, J. T.

    1994-01-01

    The Small Particles Intact Capture Experiment (SPICE) will develop technologies and engineering techniques necessary to capture nearly intact, uncontaminated cosmic and interplanetary dust particles (IDP's). Successful capture of such particles will benefit the exobiology and planetary science communities by providing particulate samples that may have survived unaltered since the formation of the solar system. Characterization of these particles may contribute fundamental data to our knowledge of how these particles could have formed into our planet Earth and, perhaps, contributed to the beginnings of life. The term 'uncontaminated' means that captured cosmic and IDP particles are free of organic contamination from the capture process and the term 'nearly intact capture' means that their chemical and elemental components are not materially altered during capture. The key to capturing cosmic and IDP particles that are organic-contamination free and nearly intact is the capture medium. Initial screening of capture media included organic foams, multiple thin foil layers, and aerogel (a silica gel); but, with the exception of aerogel, the requirements of no contamination or nearly intact capture were not met. To ensure no contamination of particles in the capture process, high-purity aerogel was chosen. High-purity aerogel results in high clarity (visual clearness), a useful quality in detection and recovery of embedded captured particles from the aerogel. P. Tsou at the Jet Propulsion Laboratory (JPL) originally described the use of aerogel for this purpose and reported laboratory test results. He has flown aerogel as a 'GAS-can Lid' payload on STS-47 and is evaluating the results. The Timeband Capture Cell Experiment (TICCE), a Eureca 1 experiment, is also flying aerogel and is scheduled for recovery in late April.

  2. Improving information recognition and performance of recycling chimneys.

    PubMed

    Durugbo, Christopher

    2013-01-01

    The aim of this study was to assess and improve how recyclers (individuals carrying out the task of recycling) make use of visual cues to carry out recycling tasks in relation to 'recycling chimneys' (repositories for recycled waste). An initial task analysis was conducted through an activity sampling study and an eye tracking experiment using a mobile eye tracker to capture fixations of recyclers during recycling tasks. Following data collection using the eye tracker, a set of recommendations for improving information representation was identified using the widely researched skills, rules, and knowledge framework, and a comparative study was conducted to assess the performance of improved interfaces for recycling chimneys based on Ecological Interface Design principles. Information representation on recycling chimneys determines how we recycle waste. This study describes an eco-ergonomics-based approach to improve the design of interfaces for recycling chimneys. The results are valuable for improving the performance of waste collection processes in terms of minimising contamination and increasing the quantity of recyclables.

  3. Loss of local capture of the pulmonary vein myocardium after antral isolation: prevalence and clinical significance.

    PubMed

    Squara, Fabien; Liuba, Ioan; Chik, William; Santangeli, Pasquale; Zado, Erica S; Callans, David J; Marchlinski, Francis E

    2015-03-01

    Capture of the myocardial sleeves of the pulmonary veins (PV) during PV pacing is mandatory for assessing exit block after PV isolation (PVI). However, previous studies reported that a significant proportion of PVs failed to demonstrate local capture after PVI. We designed this study to evaluate the prevalence and the clinical significance of loss of PV capture after PVI. Thirty patients (14 redo) undergoing antral PVI were included. Before and after PVI, local PV capture was assessed during circumferential pacing (10 mA/2 milliseconds) with a circular multipolar catheter (CMC), using EGM analysis from each dipole of the CMC and from the ablation catheter placed in ipsilateral PV. Pacing output was varied to optimize identification of sleeve capture. All PVs demonstrated sleeve capture before PVI, but only 81% and 40% after first time and redo PVI, respectively (P < 0.001 vs. before PVI). In multivariate analysis, absence of spontaneous PV depolarizations after PVI and previous PVI procedures were associated with less PV sleeve capture after PVI (40% sleeve capture, P < 0.001 for both). Loss of PV local capture by design was coincident with the development of PV entrance block and importantly predicted absence of acute reconnection during adenosine challenge with 96% positive predictive value (23% negative predictive value). Loss of PV local capture is common after antral PVI resulting in entrance block, and may be used as a specific alternate endpoint for PV electrical isolation. Additionally, loss of PV local capture may identify PVs at very low risk of acute reconnection during adenosine challenge. © 2014 Wiley Periodicals, Inc.

  4. EU Climate-KIC Innovation Blue Green Dream Project: Creation of Educational Experience, Communication and Dissemination

    NASA Astrophysics Data System (ADS)

    Tchiguirinskaia, Ioulia; Gires, Auguste; Vicari, Rosa; Schertzer, Daniel; Maksimovic, Cedo

    2013-04-01

    The combined effects of climate change and increasing urbanization call for a change of paradigm for planning, maintenance and management of new urban developments and retrofitting of existing ones to maximize ecosystem services and increase resilience to the adverse climate change effects. This presentation will discuss synergies of the EU Climate-KIC Innovation Blue Green Dream (BGD) Project in promoting the BGD demonstration and training sites established in participating European countries. The BGD demonstration and training sites show clear benefits when blue and green infrastructures are considered together. These sites present a unique opportunity for community learning and dissemination. Their development and running acts as a hub for engineers, architects, planners and modellers to come together in their design and implementation stage. This process, being captured in a variety of media, creates a corpus of knowledge, anchored in specific examples of different scales, types and dimensions. During the EU Climate-KIC Innovation Blue Green Dream Project, this corpus of knowledge will be used to develop dissemination and training materials whose content will be customised to fit urgent societal needs.

  5. Using expert knowledge for test linking.

    PubMed

    Bolsinova, Maria; Hoijtink, Herbert; Vermeulen, Jorine Adinda; Béguin, Anton

    2017-12-01

    Linking and equating procedures are used to make the results of different test forms comparable. In cases where no assumption of randomly equivalent groups can be made, some form of linking design is used. In practice, the amount of data available to link the two tests is often very limited due to logistic and security reasons, which affects the precision of linking procedures. This study proposes to enhance the quality of linking procedures based on sparse data by using Bayesian methods which combine the information in the linking data with background information captured in informative prior distributions. We propose two methods for the elicitation of prior knowledge about the difference in difficulty of two tests from subject-matter experts and explain how these results can be used in the specification of priors. To illustrate the proposed methods and evaluate the quality of linking with and without informative priors, an empirical example of linking primary school mathematics tests is presented. The results suggest that informative priors can increase the precision of linking without decreasing the accuracy. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
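
    The core idea, combining an expert-elicited prior on the difficulty difference with a sparse data-based estimate, can be illustrated with a conjugate normal-normal update. All numbers below are hypothetical, and the paper's actual models are more elaborate than this precision-weighted average.

```python
# Conjugate normal-normal update: an informative prior on the difficulty
# shift between two test forms is combined with a noisy estimate from a
# small linking sample. All numbers are hypothetical.

def posterior(mu_prior, sd_prior, d_hat, se_hat):
    """Precision-weighted combination of prior and linking-data estimate."""
    w_prior = 1 / sd_prior ** 2
    w_data = 1 / se_hat ** 2
    mean = (w_prior * mu_prior + w_data * d_hat) / (w_prior + w_data)
    sd = (w_prior + w_data) ** -0.5
    return mean, sd

# Experts believe form B is ~0.5 logits harder (sd 0.2); the sparse linking
# sample alone estimates 0.9 with a large standard error of 0.4.
mean, sd = posterior(0.5, 0.2, 0.9, 0.4)
print(round(mean, 3), round(sd, 3))
```

    The posterior standard deviation is smaller than either source alone, which mirrors the paper's conclusion: informative priors can raise the precision of linking from sparse data without, if well elicited, hurting accuracy.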

  6. Mixture class recovery in GMM under varying degrees of class separation: frequentist versus Bayesian estimation.

    PubMed

    Depaoli, Sarah

    2013-06-01

    Growth mixture modeling (GMM) represents a technique that is designed to capture change over time for unobserved subgroups (or latent classes) that exhibit qualitatively different patterns of growth. The aim of the current article was to explore the impact of latent class separation (i.e., how similar growth trajectories are across latent classes) on GMM performance. Several estimation conditions were compared: maximum likelihood via the expectation maximization (EM) algorithm and the Bayesian framework implementing diffuse priors, "accurate" informative priors, weakly informative priors, data-driven informative priors, priors reflecting partial-knowledge of parameters, and "inaccurate" (but informative) priors. The main goal was to provide insight about the optimal estimation condition under different degrees of latent class separation for GMM. Results indicated that optimal parameter recovery was obtained though the Bayesian approach using "accurate" informative priors, and partial-knowledge priors showed promise for the recovery of the growth trajectory parameters. Maximum likelihood and the remaining Bayesian estimation conditions yielded poor parameter recovery for the latent class proportions and the growth trajectories. (PsycINFO Database Record (c) 2013 APA, all rights reserved).
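
    The class-separation issue can be illustrated on a simpler cousin of GMM: EM for a two-class univariate Gaussian mixture (a growth mixture model adds a per-class trajectory model, but the estimation logic is analogous). All data and starting values below are illustrative; with well-separated class means EM recovers the classes, and as the means approach each other the likelihood surface flattens and recovery degrades.

```python
import numpy as np

# EM for a two-component univariate Gaussian mixture on synthetic data
# with well-separated classes (means 0 and 4, both sd 1, 30%/70% mix).

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0.0, 1.0, 300),   # class 1
                    rng.normal(4.0, 1.0, 700)])  # class 2

pi, mu, sd = 0.5, np.array([-1.0, 5.0]), np.array([1.0, 1.0])
for _ in range(200):
    # E-step: responsibility of class 2 for each point (normalizing
    # constants of the Gaussian densities cancel in the ratio)
    d1 = np.exp(-0.5 * ((x - mu[0]) / sd[0]) ** 2) / sd[0]
    d2 = np.exp(-0.5 * ((x - mu[1]) / sd[1]) ** 2) / sd[1]
    r = pi * d2 / ((1 - pi) * d1 + pi * d2)
    # M-step: update mixing proportion, means, and standard deviations
    pi = r.mean()
    mu = np.array([np.average(x, weights=1 - r), np.average(x, weights=r)])
    sd = np.sqrt(np.array([np.average((x - mu[0]) ** 2, weights=1 - r),
                           np.average((x - mu[1]) ** 2, weights=r)]))

print(round(float(pi), 2), np.round(mu, 1))  # recovered proportion and means
```

    Rerunning this with the class means moved close together (e.g. 0 and 1) shows the degradation the article studies, and is where informative priors on the class parameters can stabilize estimation.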

  7. Status of NTD Ge bolometer material and devices

    NASA Technical Reports Server (NTRS)

    Haller, E. E.; Haegel, N. M.; Park, I. S.

    1986-01-01

    The first IR Detector Technology Workshop took place at NASA Ames Research Center on July 12 and 13, 1983. The conclusions presented at that meeting are still valid. More was learned about the physics of hopping conduction at very low temperatures, which will be important for bolometer design and operation at ever decreasing temperatures. Resistivity measurements were extended down to 50 mK. At such low temperatures, precise knowledge of the neutron capture cross sections sigma (sub n) of the various Ge isotopes is critical if one is to make an accurate prediction of the dopant concentrations and compensation, and therefore resistivity, that will result from a given irradiation. An empirical approach for obtaining the desired resistivity material is described and the process of conducting a set of experiments which will improve the knowledge of the effective sigma (sub n) values for a given location in a particular reactor is discussed. A wider range of NTD Ge samples is now available. Noise measurements on bolometers with ion implanted contacts show that no 1/f noise component appears down to 1 Hz and probably lower.

  8. Atmospheric inverse modeling via sparse reconstruction

    NASA Astrophysics Data System (ADS)

    Hase, Nils; Miller, Scot M.; Maaß, Peter; Notholt, Justus; Palm, Mathias; Warneke, Thorsten

    2017-10-01

    Many applications in atmospheric science involve ill-posed inverse problems. A crucial component of many inverse problems is the proper formulation of a priori knowledge about the unknown parameters. In most cases, this knowledge is expressed as a Gaussian prior. This formulation often performs well at capturing smoothed, large-scale processes but is often ill equipped to capture localized structures like large point sources or localized hot spots. Over the last decade, scientists from a diverse array of applied mathematics and engineering fields have developed sparse reconstruction techniques to identify localized structures. In this study, we present a new regularization approach for ill-posed inverse problems in atmospheric science. It is based on Tikhonov regularization with sparsity constraint and allows bounds on the parameters. We enforce sparsity using a dictionary representation system. We analyze its performance in an atmospheric inverse modeling scenario by estimating anthropogenic US methane (CH4) emissions from simulated atmospheric measurements. Different measures indicate that our sparse reconstruction approach is better able to capture large point sources or localized hot spots than other methods commonly used in atmospheric inversions. It captures the overall signal equally well but adds details on the grid scale. This feature can be of value for any inverse problem with point or spatially discrete sources. We show an example for source estimation of synthetic methane emissions from the Barnett shale formation.
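
    A minimal sketch of the sparsity-promoting regularization idea, using iterative soft-thresholding (ISTA) to solve min over x of (1/2)||Ax - b||^2 + lambda*||x||_1 on toy data. The paper's method additionally uses a dictionary representation system and bound constraints, which this sketch omits.

```python
import numpy as np

# ISTA (iterative soft-thresholding) for L1-regularized least squares.
# Toy setup: recover a few localized "sources" from underdetermined
# linear measurements, standing in for the emission-estimation problem.

rng = np.random.default_rng(3)
m, n = 60, 200
A = rng.normal(size=(m, n)) / np.sqrt(m)        # random forward operator
x_true = np.zeros(n)
x_true[[10, 50, 120]] = [3.0, -2.0, 4.0]        # sparse point sources
b = A @ x_true + 0.01 * rng.normal(size=m)      # noisy measurements

lam = 0.05
step = 1.0 / np.linalg.norm(A, 2) ** 2          # 1 / Lipschitz constant
x = np.zeros(n)
for _ in range(2000):
    g = x - step * A.T @ (A @ x - b)            # gradient step on the fit term
    x = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)  # soft-threshold

support = np.flatnonzero(np.abs(x) > 0.5)
print(support)  # should pick out the indices of the point sources
```

    The soft-threshold step is what a Gaussian prior lacks: it drives most coefficients exactly to zero, so large point sources stand out instead of being smoothed into the background.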

  9. Simulation of aerosolized oil droplets capture in a range hood exhaust using coupled CFD-population balance method

    NASA Astrophysics Data System (ADS)

    Liu, Shuyuan; Zhang, Yong; Feng, Yu; Shi, Changbin; Cao, Yong; Yuan, Wei

    2018-02-01

    A population balance sectional method (PBSM) coupled with computational fluid dynamics (CFD) is presented to simulate the capture of aerosolized oil droplets (AODs) in a range hood exhaust. The homogeneous nucleation and coagulation processes are modeled and simulated with this CFD-PBSM method. As the design angle α of the range hood exhaust varies from 60° to 30°, AOD capture increases, while the pressure drop between the inlet and the outlet of the range hood also increases, from 8.38 Pa to 175.75 Pa. Increasing inlet flow velocities also result in less AOD capture, although the total suction increases due to higher flow rates to the range hood. The CFD-PBSM method thus provides insight into the formation and capture of AODs as well as their impact on the operation and design of the range hood exhaust.
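    The population-balance half of the coupled method above can be sketched in its simplest discrete form, the Smoluchowski coagulation equation with a constant collision kernel. A generic illustration under stated simplifications, not the paper's sectional formulation or its nucleation model:

```python
# One explicit Euler step of the discrete Smoluchowski coagulation
# equation: n[k] is the number density of droplet size class k+1,
# K a constant collision kernel, dt the time step.

def coagulation_step(n, K, dt):
    """Advance the size distribution n by one time step dt."""
    m = len(n)
    total = sum(n)
    out = []
    for k in range(m):
        # birth: classes i+1 and j+1 merging with (i+1)+(j+1) = k+1
        birth = 0.5 * K * sum(n[i] * n[k - 1 - i] for i in range(k))
        death = K * n[k] * total    # loss of class k+1 by any collision
        out.append(n[k] + dt * (birth - death))
    return out
```

    Starting from a monodisperse population, total droplet mass is conserved across a step up to the truncation of the largest size class.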

  10. Physiological ramifications for loggerhead turtles captured in pelagic longlines

    PubMed Central

    Williard, Amanda; Parga, Mariluz; Sagarminaga, Ricardo; Swimmer, Yonat

    2015-01-01

    Bycatch of endangered loggerhead turtles in longline fisheries results in high rates of post-release mortality that may negatively impact populations. The factors contributing to post-release mortality have not been well studied, but traumatic injuries and physiological disturbances experienced as a result of capture are thought to play a role. The goal of our study was to gauge the physiological status of loggerhead turtles immediately upon removal from longline gear in order to refine our understanding of the impacts of capture and the potential for post-release mortality. We analysed blood samples collected from longline- and hand-captured loggerhead turtles, and discovered that capture in longline gear results in blood loss, induction of the systemic stress response, and a moderate increase in lactate. The method by which turtles are landed and released, particularly if released with the hook or line still attached, may exacerbate stress and lead to chronic injuries, sublethal effects or delayed mortality. Our study is the first, to the best of our knowledge, to document the physiological impacts of capture in longline gear, and our findings underscore the importance of best-practice gear removal to promote post-release survival in longline-captured turtles. PMID:26490415

  11. Physiological ramifications for loggerhead turtles captured in pelagic longlines.

    PubMed

    Williard, Amanda; Parga, Mariluz; Sagarminaga, Ricardo; Swimmer, Yonat

    2015-10-01

    Bycatch of endangered loggerhead turtles in longline fisheries results in high rates of post-release mortality that may negatively impact populations. The factors contributing to post-release mortality have not been well studied, but traumatic injuries and physiological disturbances experienced as a result of capture are thought to play a role. The goal of our study was to gauge the physiological status of loggerhead turtles immediately upon removal from longline gear in order to refine our understanding of the impacts of capture and the potential for post-release mortality. We analysed blood samples collected from longline- and hand-captured loggerhead turtles, and discovered that capture in longline gear results in blood loss, induction of the systemic stress response, and a moderate increase in lactate. The method by which turtles are landed and released, particularly if released with the hook or line still attached, may exacerbate stress and lead to chronic injuries, sublethal effects or delayed mortality. Our study is the first, to the best of our knowledge, to document the physiological impacts of capture in longline gear, and our findings underscore the importance of best-practice gear removal to promote post-release survival in longline-captured turtles. © 2015 The Author(s).

  12. Effects of using structured templates for recalling chemistry experiments.

    PubMed

    Willoughby, Cerys; Logothetis, Thomas A; Frey, Jeremy G

    2016-01-01

    The way that we recall information is dependent upon both the knowledge in our memories and the conditions under which we recall the information. Electronic Laboratory Notebooks can provide a structured interface for the capture of experiment records through the use of forms and templates. These templates can be useful by providing cues to help researchers to remember to record particular aspects of their experiment, but they may also constrain the information that is recorded by encouraging them to record only what is asked for. It is therefore unknown whether using structured templates for capturing experiment records will have positive or negative effects on the quality and usefulness of the records for assessment and future use. In this paper we report on the results of a set of studies investigating the effects of different template designs on the recording of experiments by undergraduate students and academic researchers. The results indicate that using structured templates to write up experiments does make a significant difference to the information that is recalled and recorded. These differences have both positive and negative effects, with templates prompting the capture of specific information that is otherwise forgotten, but also apparently losing some of the personal elements of the experiment experience such as observations and explanations. Other unexpected effects were seen with templates that can change the information that is captured, but also interfere with the way an experiment is conducted. Our results showed that using structured templates can improve the completeness of the experiment context information captured but can also cause a loss of personal elements of the experiment experience when compared with allowing the researcher to structure their own record. 
The results suggest that interfaces for recording information about chemistry experiments, whether paper-based questionnaires or templates in Electronic Laboratory Notebooks, can be an effective way to improve the quality of experiment write-ups, but that care needs to be taken to ensure that the correct cues are provided. Graphical abstract: Scientists have traditionally recorded their research in paper notebooks, a format that provides great flexibility for capturing information. In contrast, Electronic Laboratory Notebooks frequently make use of forms or structured templates for capturing experiment records. Structured templates can provide cues that can improve record quality by increasing the amount of information captured and encouraging consistency. However, using the wrong cues can lead to a loss of personal elements of the experiment experience and frustrate users. This image shows two participants from one of our studies recording their experiment using a computer-based template.

  13. Modularising ontology and designing inference patterns to personalise health condition assessment: the case of obesity.

    PubMed

    Sojic, Aleksandra; Terkaj, Walter; Contini, Giorgia; Sacco, Marco

    2016-05-04

    The public health initiatives for obesity prevention are increasingly exploiting the advantages of smart technologies that can register various kinds of data related to physical, physiological, and behavioural conditions. Since individual features and habits vary among people, the design of appropriate intervention strategies for motivating changes in behavioural patterns towards a healthy lifestyle requires the interpretation and integration of collected information, while considering individual profiles in a personalised manner. The ontology-based modelling is recognised as a promising approach in facing the interoperability and integration of heterogeneous information related to characterisation of personal profiles. The presented ontology captures individual profiles across several obesity-related knowledge-domains structured into dedicated modules in order to support inference about health condition, physical features, behavioural habits associated with a person, and relevant changes over time. The modularisation strategy is designed to facilitate ontology development, maintenance, and reuse. The domain-specific modules formalised in the Web Ontology Language (OWL) integrate the domain-specific sets of rules formalised in the Semantic Web Rule Language (SWRL). The inference rules follow a modelling pattern designed to support personalised assessment of health condition as age- and gender-specific. The test cases exemplify a personalised assessment of the obesity-related health conditions for the population of teenagers. The paper addresses several issues concerning the modelling of normative concepts related to obesity and depicts how the public health concern impacts classification of teenagers according to their phenotypes. The modelling choices regarding the ontology-structure are explained in the context of the modelling goal to integrate multiple knowledge-domains and support reasoning about the individual changes over time. 
The presented modularisation pattern enhances reusability of the domain-specific modules across various health care domains.
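    The age- and gender-specific inference pattern described above can be approximated outside OWL/SWRL as a plain rule table keyed on the personal profile. This is a hedged sketch: the function name and BMI cut-offs are invented placeholders, not the ontology's normative values, which are defined per age/gender percentile:

```python
# Toy personalised assessment: the same BMI can yield different verdicts
# depending on the age/gender profile, mimicking the SWRL rule pattern.
# All cut-off values are hypothetical placeholders.

def assess_weight_status(bmi, age, gender):
    """Classify a profile as normal / overweight / obese by rule table."""
    # hypothetical rules: ((min_age, max_age), gender, overweight_cut, obese_cut)
    rules = [
        ((13, 15), "male",   22.6, 27.2),
        ((13, 15), "female", 23.3, 28.2),
        ((16, 19), "male",   24.2, 29.0),
        ((16, 19), "female", 24.7, 29.7),
    ]
    for (lo, hi), g, over, obese in rules:
        if lo <= age <= hi and g == gender:
            if bmi >= obese:
                return "obese"
            if bmi >= over:
                return "overweight"
            return "normal"
    raise ValueError("no rule covers this age/gender profile")
```

    In the ontology the analogous rules live in domain-specific SWRL modules, so the table can be maintained and reused independently of the rest of the model.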

  14. Advancing cognitive engineering methods to support user interface design for electronic health records.

    PubMed

    Thyvalikakath, Thankam P; Dziabiak, Michael P; Johnson, Raymond; Torres-Urquidy, Miguel Humberto; Acharya, Amit; Yabes, Jonathan; Schleyer, Titus K

    2014-04-01

    Despite many decades of research on the effective development of clinical systems in medicine, the adoption of health information technology to improve patient care continues to be slow, especially in ambulatory settings. This applies to dentistry as well, a primary care discipline with approximately 137,000 practitioners in the United States. A critical reason for slow adoption is the poor usability of clinical systems, which makes it difficult for providers to navigate through the information and obtain an integrated view of patient data. In this study, we documented the cognitive processes and information management strategies used by dentists during a typical patient examination. The results will inform the design of a novel electronic dental record interface. We conducted a cognitive task analysis (CTA) study to observe ten dentists (five general dental practitioners and five general dental faculty members, each with more than two years of clinical experience) examining three simulated patient cases using a think-aloud protocol. Dentists first reviewed the patient's demographics, chief complaint, medical history and dental history to determine the general status of the patient. Subsequently, they proceeded to examine the patient's intraoral status using radiographs, intraoral images, hard tissue and periodontal tissue information. The results also identified dentists' patterns of navigation through the patient's information and additional information needs during a typical clinician-patient encounter. This study reinforced the significance of applying cognitive engineering methods to inform the design of a clinical system. Second, applying CTA to a scenario closely simulating an actual patient encounter helped with capturing participants' knowledge states and decision-making when diagnosing and treating a patient. 
The resultant knowledge of dentists' patterns of information retrieval and review will significantly contribute to designing flexible and task-appropriate information presentation in electronic dental records. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  15. Exp 38 Patch Design - decal file allpath Replacement Pantones 4-

    NASA Image and Video Library

    2013-04-30

    ISS038-S-001 (April 2013) --- As the International Space Station (ISS) has become a stepping stone to future space exploration, the Expedition 38 mission patch design paints a visual roadmap of exploration beyond low Earth orbit, most prominently represented by the design's flowing Expedition 38 mission numbers that wrap around Earth, the moon and Mars. Just as the sun is a guiding light in the galaxy, the ISS illuminates the bottom of the design as it is a shining beacon of the advancement of science, knowledge, and technology carried out aboard the Space Station. To visually capture the idea of the ISS being a foundation for infinite discovery, the space station's iconic solar arrays span upwards, providing the number 38 and its exploration roadmap a symbolic pedestal to rest on. Finally, the overall use of red, white, and blue in the design acknowledges the flags of the countries of origin for Expedition 38's crew: the United States, Russia, and Japan. The NASA insignia design for space shuttle flights is reserved for use by the astronauts and for other official use as the NASA Administrator may authorize. Public availability has been approved only in the forms of illustrations by the various news media. When and if there is any change in this policy, which is not anticipated, the change will be publicly announced. Photo credit: NASA

  16. Teacher- or Learner-Centred? Science Teacher Beliefs Related to Topic Specific Pedagogical Content Knowledge: A South African Case Study

    NASA Astrophysics Data System (ADS)

    Mavhunga, Elizabeth; Rollnick, Marissa

    2016-12-01

    In science education, learner-centred classroom practices are widely accepted as desirable and are associated with responsive and reformed kinds of teacher beliefs. They are further associated with high-quality Pedagogical Content Knowledge (PCK). Topic-Specific Pedagogical Content Knowledge (TSPCK), a version of PCK defined at topic level, is known to enable the transformation of topic content into a form accessible to learners. However, little is known about teacher science beliefs in relation to TSPCK and therefore the nature of likely associated classroom practices. In this study, we investigated the relationship between TSPCK and underlying science teacher beliefs following an intervention targeting the improvement of TSPCK in the topic chemical equilibrium. Sixteen final year pre-service chemistry teachers were exposed to an intervention that explicitly focussed on knowledge for transforming the content of chemical equilibrium using the five knowledge components of TSPCK. A specially designed TSPCK instrument in chemical equilibrium and the Teacher Belief Instrument (TBI) were used to capture written responses in pre- and post-tests. Additional qualitative data was collected from audio-recorded discussions and written responses from an open-ended question asked before and after the intervention. Two key findings emerged from the study. Firstly, the development of TSPCK was linked to shifts in underlying science teacher beliefs in the direction of learner-centred teaching for the majority of pre-service teachers. Secondly, this shift was not evident for all, as for some there was development of TSPCK without a shift from teacher-centred beliefs about science teaching.

  17. Early Design Choices: Capture, Model, Integrate, Analyze, Simulate

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2004-01-01

    I. Designs are constructed incrementally to meet requirements and solve problems: a) Requirement types: objectives, scenarios, constraints, 'ilities', etc. b) Problem/issue types: risk/safety, cost/difficulty, interaction, conflict, etc. II. Capture requirements, problems and solutions: a) Collect design and analysis products and make them accessible for integration and analysis; b) Link changes in design requirements, problems and solutions; and c) Harvest design data for design models and choice structures. III. System designs are constructed by multiple groups designing interacting subsystems: a) Diverse problems, choice criteria, analysis methods and point solutions. IV. Support integration and global analysis of repercussions: a) System implications of point solutions; b) Broad analysis of interactions beyond totals of mass, cost, etc.
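    The capture-and-link activities in items II and IV above can be sketched as a small linked-record structure: requirements, problems, and solutions as nodes whose links are traversed to see what a point solution touches. The record type and fields are hypothetical, purely to illustrate the idea:

```python
# Minimal sketch of linked design-knowledge records and a crude
# repercussion traversal.  Field names are illustrative, not any
# actual design-capture schema.
from dataclasses import dataclass, field

@dataclass
class DesignItem:
    kind: str                                  # "requirement", "problem", or "solution"
    text: str
    links: list = field(default_factory=list)  # related DesignItem objects

    def link(self, other):
        self.links.append(other)
        other.links.append(self)

def repercussions(item, seen=None):
    """Collect every item reachable from `item` via links."""
    seen = seen if seen is not None else set()
    if id(item) in seen:
        return []
    seen.add(id(item))
    out = [item]
    for other in item.links:
        out.extend(repercussions(other, seen))
    return out
```

    Traversing from any node then surfaces the system implications of a point solution, which is the kind of global analysis item IV calls for.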

  18. Measuring θ13 in the Double Chooz experiment

    NASA Astrophysics Data System (ADS)

    Crum, Keith

    2013-04-01

    Double Chooz measures θ13 by searching for the disappearance of reactor electron antineutrinos (ν̄e) interacting via inverse beta decay (IBD) in a liquid scintillator-based detector. The signature of IBD is the coincidence of positron annihilation followed by the capture of a neutron. Although Double Chooz was primarily designed to detect ν̄e by searching for neutron capture on gadolinium, we can also search for neutron capture on hydrogen. We developed separate analyses for neutron capture on hydrogen and gadolinium, as the two elements have different capture energies, capture lifetimes, and spatial distributions within our detector.

  19. Trust and technology: the social foundations of aviation regulation.

    PubMed

    Downer, John

    2010-03-01

    This paper looks at the dilemmas posed by 'expertise' in high-technology regulation by examining the US Federal Aviation Administration's (FAA) 'type-certification' process, through which they evaluate new designs of civil aircraft. It observes that the FAA delegate a large amount of this work to the manufacturers themselves, and discusses why they do this by invoking arguments from the sociology of science and technology. It suggests that - contrary to popular portrayal - regulators of high technologies face an inevitable epistemic barrier when making technological assessments, which forces them to delegate technical questions to people with more tacit knowledge, and hence to 'regulate' at a distance by evaluating 'trust' rather than 'technology'. It then unravels some of the implications of this and its relation to our theories of regulation and 'regulatory capture'.

  20. How Do Clinicians Learn About Knowledge Translation? An Investigation of Current Web-Based Learning Opportunities.

    PubMed

    Damarell, Raechel A; Tieman, Jennifer J

    2017-07-13

    Clinicians are important stakeholders in the translation of well-designed research evidence into clinical practice for optimal patient care. However, the application of knowledge translation (KT) theories and processes may present conceptual and practical challenges for clinicians. Online learning platforms are an effective means of delivering KT education, providing an interactive, time-efficient, and affordable alternative to face-to-face education programs. This study investigates the availability and accessibility of online KT learning opportunities for health professionals. It also provides an analysis of the types of resources and associated disciplines retrieved by a range of KT synonyms. We searched a range of bibliographic databases and the Internet (Google advanced option) using 9 KT terms to identify online KT learning resources. To be eligible, resources had to be free, aimed at clinicians, educational in intent, and interactive in design. Each term was searched using two different search engines. The details of the first 100 websites captured per browser (ie, n=200 results per term) were entered into EndNote. Each site was subsequently visited to determine its status as a learning resource. Eligible websites were appraised for quality using the AACODS (Authority, Accuracy, Coverage, Objectivity, Date, Significance) tool. We identified 971 unique websites via our multiple search strategies. Of these, 43 were health-related and educational in intent. Once these sites were evaluated for interactivity, a single website matched our inclusion criteria (Dementia Knowledge Translation Learning Centre). KT is an important but complex system of processes. These processes overlap with knowledge, practice, and improvement processes that go by a range of different names. For clinicians to be informed and competent in KT, they require better access to free learning opportunities. 
These resources should be designed from the viewpoint of the clinician, presenting KT's multifaceted theories and processes in an engaging, interactive way. This learning should empower clinicians to contextualize and apply KT strategies within their own care settings. ©Raechel A Damarell, Jennifer J Tieman. Originally published in JMIR Medical Education (http://mededu.jmir.org), 13.07.2017.

  1. The Temporal Attentive Observation (TAO) Scale: Development of an Instrument to Assess Attentive Behavior Sequences during Serious Gameplay

    ERIC Educational Resources Information Center

    Folkestad, James E.; McKernan, Brian; Train, Stephanie; Martey, Rosa Mikeal; Rhodes, Matthew G.; Kenski, Kate; Shaw, Adrienne; Stromer-Galley, Jennifer; Clegg, Benjamin A.; Strzalkowski, Tomek

    2018-01-01

    The engaging nature of video games has intrigued learning professionals attempting to capture and retain learners' attention. Designing learning interventions that not only capture the learner's attention, but also are designed around the natural cycle of attention will be vital for learning. This paper introduces the temporal attentive…

  2. Study on Capturing Functional Requirements of the New Product Based on Evolution

    NASA Astrophysics Data System (ADS)

    Liu, Fang; Song, Liya; Bai, Zhonghang; Zhang, Peng

    In order to survive in an increasingly competitive global marketplace, it is important for corporations to forecast the evolutionary direction of new products rapidly and effectively. Most products in the world are developed based on the design of existing products, and in product design, capturing functional requirements is a key step. Function continuously evolves, driven by the evolution of needs and technologies, so the functional requirements of a new product can be forecast from the functions of an existing product. Eight laws of function evolution are put forward in this paper, and a process model for capturing the functional requirements of a new product based on function evolution is proposed. An example illustrates the design process.

  3. Low-energy near Earth asteroid capture using Earth flybys and aerobraking

    NASA Astrophysics Data System (ADS)

    Tan, Minghu; McInnes, Colin; Ceriotti, Matteo

    2018-04-01

    Since the Sun-Earth libration points L1 and L2 are regarded as ideal locations for space science missions and candidate gateways for future crewed interplanetary missions, capturing near-Earth asteroids (NEAs) around the Sun-Earth L1/L2 points has generated significant interest. This paper therefore proposes the concept of coupling an Earth flyby with the capture of small NEAs onto Sun-Earth L1/L2 periodic orbits. In this capture strategy, the Sun-Earth circular restricted three-body problem (CRTBP) is used to calculate target Lyapunov orbits and their invariant manifolds. A periapsis map is then employed to determine the required perigee of the Earth flyby. Moreover, depending on the perigee distance of the flyby, Earth flybys with and without aerobraking are investigated to design a transfer trajectory capturing a small NEA from its initial orbit to the stable manifolds associated with Sun-Earth L1/L2 periodic orbits. Finally, a global optimization is carried out, based on a detailed design procedure for NEA capture using an Earth flyby. Results show that the NEA capture strategies using an Earth flyby with and without aerobraking both have the potential to be of lower cost, in terms of energy requirements, than a direct NEA capture strategy without the Earth flyby. Moreover, NEA capture with an Earth flyby also has the potential for a shorter flight time compared to the NEA capture strategy without the Earth flyby.
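    The CRTBP machinery invoked above starts from the rotating-frame geometry. As a minimal sketch, the collinear L1/L2 points (about which the target periodic orbits and their manifolds are computed) can be located by bisection on the x-axis force balance; nondimensional units and an approximate Sun-Earth mass parameter are assumed:

```python
# Locate the Sun-Earth collinear points L1 and L2 in the CRTBP by finding
# zeros of the rotating-frame effective-potential gradient on the x-axis.
# Nondimensional units; MU is an approximate Sun-Earth mass parameter.

MU = 3.0035e-6   # m_Earth / (m_Sun + m_Earth), approximate

def fx(x, mu=MU):
    """x-axis component of the effective-potential gradient (zero at L1/L2)."""
    r1 = abs(x + mu)          # distance to the Sun, located at (-mu, 0)
    r2 = abs(x - 1 + mu)      # distance to the Earth, located at (1 - mu, 0)
    return x - (1 - mu) * (x + mu) / r1**3 - mu * (x - 1 + mu) / r2**3

def bisect(f, a, b, tol=1e-12):
    while b - a > tol:
        m = 0.5 * (a + b)
        if f(a) * f(m) <= 0.0:
            b = m
        else:
            a = m
    return 0.5 * (a + b)

L1 = bisect(fx, 0.9, 1 - MU - 1e-9)   # between the Earth and the Sun
L2 = bisect(fx, 1 - MU + 1e-9, 1.1)   # on the far side of the Earth
```

    Both points come out roughly 0.01 nondimensional units (about 1.5 million km) from the Earth, consistent with the Hill-radius estimate (μ/3)^(1/3).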

  4. A case study of the knowledge transfer practices from the perspectives of highly experienced engineers in the aerospace industry

    NASA Astrophysics Data System (ADS)

    Martin, Deloris

    Purpose. The purpose of this study was to describe the existing knowledge transfer practices in selected aerospace companies as perceived by highly experienced engineers retiring from the company. Specifically it was designed to investigate and describe (a) the processes and procedures used to transfer knowledge, (b) the systems that encourage knowledge transfer, (c) the impact of management actions on knowledge transfer, and (d) constraining factors that might impede knowledge transfer. Methodology. A descriptive case study was the methodology applied in this study. Qualitative data were gathered from highly experienced engineers from 3 large aerospace companies in Southern California. A semistructured interview was conducted face-to-face with each participant in a private or semiprivate, non-workplace setting to obtain each engineer's perspectives on his or her company's current knowledge transfer practices. Findings. The participants in this study preferred to transfer knowledge using face-to-face methods, one-on-one, through actual troubleshooting and problem-solving scenarios. Managers in these aerospace companies were observed as having knowledge transfer as a low priority; they tend not to promote knowledge transfer among their employees. While mentoring is the most common knowledge transfer system these companies offer, it is not the preferred method of knowledge transfer among the highly experienced engineers. Job security and schedule pressures are the top constraints that impede knowledge transfer between the highly experienced engineers and their coworkers. Conclusions. The study data support the conclusion that the highly experienced engineers in the study's aerospace companies would more likely transfer their knowledge to those remaining in the industry if the transfer could occur face-to-face with management support and acknowledgement of their expertise and if their job security is not threatened. 
The study also supports the conclusion that managers should be responsible for the leadership in developing a knowledge-sharing culture and rewarding those who do share. Recommendations. It is recommended that a quantitative study of highly experienced engineers in aerospace be conducted to determine the degree to which knowledge-sharing methods, processes, and procedures may be effective in capturing their knowledge. It is also recommended that a replication of this study be undertaken to include the perspectives of first-line managers on developing a knowledge-sharing culture for the aerospace industry.

  5. Multi-phase CFD modeling of solid sorbent carbon capture system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryan, E. M.; DeCroix, D.; Breault, R.

    2013-07-01

    Computational fluid dynamics (CFD) simulations are used to investigate a low-temperature post-combustion carbon capture reactor. The CFD models are based on a small scale solid sorbent carbon capture reactor design from ADA-ES and Southern Company. The reactor is a fluidized bed design based on a silica-supported amine sorbent. CFD models using both Eulerian–Eulerian and Eulerian–Lagrangian multi-phase modeling methods are developed to investigate the hydrodynamics and adsorption of carbon dioxide in the reactor. Models developed in both FLUENT® and BARRACUDA are presented to explore the strengths and weaknesses of state of the art CFD codes for modeling multi-phase carbon capture reactors. The results of the simulations show that the FLUENT® Eulerian–Lagrangian simulations (DDPM) are unstable for the given reactor design, while the BARRACUDA Eulerian–Lagrangian model is able to simulate the system given appropriate simplifying assumptions. FLUENT® Eulerian–Eulerian simulations also provide a stable solution for the carbon capture reactor given the appropriate simplifying assumptions.

  6. Multi-Phase CFD Modeling of Solid Sorbent Carbon Capture System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryan, Emily M.; DeCroix, David; Breault, Ronald W.

    2013-07-30

    Computational fluid dynamics (CFD) simulations are used to investigate a low-temperature post-combustion carbon capture reactor. The CFD models are based on a small scale solid sorbent carbon capture reactor design from ADA-ES and Southern Company. The reactor is a fluidized bed design based on a silica-supported amine sorbent. CFD models using both Eulerian-Eulerian and Eulerian-Lagrangian multi-phase modeling methods are developed to investigate the hydrodynamics and adsorption of carbon dioxide in the reactor. Models developed in both FLUENT® and BARRACUDA are presented to explore the strengths and weaknesses of state of the art CFD codes for modeling multi-phase carbon capture reactors. The results of the simulations show that the FLUENT® Eulerian-Lagrangian simulations (DDPM) are unstable for the given reactor design, while the BARRACUDA Eulerian-Lagrangian model is able to simulate the system given appropriate simplifying assumptions. FLUENT® Eulerian-Eulerian simulations also provide a stable solution for the carbon capture reactor given the appropriate simplifying assumptions.

  7. 75 FR 28024 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-19

    ... the data-capturing process. SAMHSA will place Web site registration information into a Knowledge Management database and will place email subscription information into a database maintained by a third-party...

  8. Analysis of the supply chain and conservation status of sharks (Elasmobranchii: Superorder Selachimorpha) based on fisher knowledge.

    PubMed

    Martins, Ana Paula Barbosa; Feitosa, Leonardo Manir; Lessa, Rosangela Paula; Almeida, Zafira Silva; Heupel, Michelle; Silva, Wagner Macedo; Tchaicka, Ligia; Nunes, Jorge Luiz Silva

    2018-01-01

    Increasing fishing effort has caused declines in shark populations worldwide. Understanding biological and ecological characteristics of sharks is essential to effectively implement management measures, but to fully understand drivers of fishing pressure social factors must be considered through multidisciplinary and integrated approaches. The present study aimed to use fisher and trader knowledge to describe the shark catch and product supply chain in Northeastern Brazil, and evaluate perceptions regarding the regional conservation status of shark species. Non-systematic observations and structured individual interviews were conducted with experienced fishers and traders. The demand and economic value of shark fins has reportedly decreased over the last 10 years while the shark meat trade has increased slightly, including a small increase in the average price per kilogram of meat. Several threatened shark species were reportedly often captured offshore and traded at local markets. This reported and observed harvest breaches current Brazilian environmental laws. Fishing communities are aware of population declines of several shark species, but rarely take action to avoid capture of sharks. The continuing capture of sharks is mainly due to a lack of knowledge of environmental laws, lack of enforcement by responsible authorities, and difficulties encountered by fishers in finding alternative income streams. National and regional conservation measures are immediately required to reduce overfishing on shark populations in Northeastern Brazil. Social and economic improvements for poor fishing communities must also be implemented to achieve sustainable fisheries.

  9. Analysis of the supply chain and conservation status of sharks (Elasmobranchii: Superorder Selachimorpha) based on fisher knowledge

    PubMed Central

    Almeida, Zafira Silva; Heupel, Michelle; Silva, Wagner Macedo; Tchaicka, Ligia

    2018-01-01

    Increasing fishing effort has caused declines in shark populations worldwide. Understanding the biological and ecological characteristics of sharks is essential to effectively implement management measures, but to fully understand the drivers of fishing pressure, social factors must be considered through multidisciplinary and integrated approaches. The present study aimed to use fisher and trader knowledge to describe the shark catch and product supply chain in Northeastern Brazil, and to evaluate perceptions regarding the regional conservation status of shark species. Non-systematic observations and structured individual interviews were conducted with experienced fishers and traders. The demand and economic value of shark fins have reportedly decreased over the last 10 years, while the shark meat trade has increased slightly, including a small increase in the average price per kilogram of meat. Several threatened shark species were reportedly often captured offshore and traded at local markets. This reported and observed harvest breaches current Brazilian environmental laws. Fishing communities are aware of population declines of several shark species, but rarely take action to avoid capture of sharks. The continuing capture of sharks is mainly due to a lack of knowledge of environmental laws, lack of enforcement by responsible authorities, and difficulties encountered by fishers in finding alternative income streams. National and regional conservation measures are immediately required to reduce overfishing of shark populations in Northeastern Brazil. Social and economic improvements for poor fishing communities must also be implemented to achieve sustainable fisheries. PMID:29534100

  10. Interplanetary mission design techniques for flagship-class missions

    NASA Astrophysics Data System (ADS)

    Kloster, Kevin W.

    Trajectory design, given the current level of propulsive technology, requires knowledge of orbital mechanics, computational resources, and extensive use of techniques such as gravity assists and V-infinity leveraging, as well as insight and finesse. Designing robust, affordable missions that deliver a capable science package to a celestial body of interest is a difficult task. Techniques are presented here that assist the mission designer in constructing trajectories for flagship-class missions in the outer Solar System. These techniques are applied in this work to spacecraft that are currently in flight or in the planning stages. By escaping the Saturnian system, the Cassini spacecraft can reach other destinations in the Solar System while satisfying planetary quarantine. The patched-conic method was used to search for trajectories that depart Saturn via gravity assist at Titan. Trajectories were found that fly by Jupiter to reach Uranus or Neptune, capture at Jupiter or Neptune, escape the Solar System, fly by Uranus during its 2049 equinox, or encounter Centaurs. A "grand tour," which visits Jupiter, Uranus, and Neptune, departs Saturn in 2014. New tools were built to search for encounters with Centaurs, small Solar System bodies between the orbits of Jupiter and Neptune, and to minimize the DeltaV to target these encounters. Cassini could reach Chiron, the first-discovered Centaur, in 10.5 years after a 2022 Saturn departure. For a Europa Orbiter mission, the strategy for designing Jovian System tours that include Io flybys differs significantly from schemes developed for previous versions of the mission. If the closest approach distance of the incoming hyperbola at Jupiter is below the orbit of Io, an Io gravity assist gives the greatest energy pump-down for the least decrease in perijove radius. Using Io to help capture the spacecraft can increase the savings in Jupiter orbit insertion DeltaV over a Ganymede-aided capture.
The tour design is guided by Tisserand graphs overlaid with a simple and accurate radiation model so that tours including Io flybys can maintain an acceptable radiation dosage. While Io flybys increase the duration of tours that are ultimately bound for Europa, they offer DeltaV savings and greater scientific return, including the possibility of flying through the plume of one of Io's volcanoes. Different combinations of interplanetary trajectories and gravity-assist paths are considered, with a focus on options that could enable flagship-class missions to Uranus. A patched-conic method is used to identify trajectories to Uranus with launch dates between 2015 and 2050. Flight time is constrained to be less than 14 years. A graphical technique is introduced to identify the most efficient launch opportunities and gravity-assist paths to Uranus. Several trajectories emerge as attractive options, including classical paths such as Venus-Earth-Earth-Jupiter, with launch V-infinity as low as 3.6 km/s. A baseline DeltaV cost is established for capture at Uranus via chemical propulsion. Ballistic reduction of orbital inclination using flybys of the satellites of Uranus is investigated; Oberon is shown to have greater inclination-change capability than Titania despite Oberon being less massive.
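
    The pump-down and capture arguments above all rest on patched-conic flyby geometry: the hyperbolic excess velocity vector is rotated through an angle set by the flyby body's gravitational parameter and the closest-approach radius. A minimal sketch of that relation (the Titan flyby numbers below are illustrative, not mission values):

```python
import math

def turn_angle_deg(v_inf_kms, r_p_km, mu_km3s2):
    """Deflection of the hyperbolic excess velocity during a flyby.

    Patched-conic relation: delta = 2*asin(1/e), where the hyperbolic
    eccentricity is e = 1 + r_p * v_inf^2 / mu.
    """
    e = 1.0 + r_p_km * v_inf_kms**2 / mu_km3s2
    return math.degrees(2.0 * math.asin(1.0 / e))

# Illustrative numbers only: a Titan flyby with v_inf = 3 km/s at
# 2000 km altitude above Titan's ~2575 km radius; mu_Titan ~ 8978 km^3/s^2.
delta = turn_angle_deg(3.0, 2575.0 + 2000.0, 8978.0)
```

    Lower v-infinity or a lower flyby altitude bends the trajectory more, which is the lever a tour designer trades against radiation dose and timing constraints.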

  11. An integrated decision support system for wastewater nutrient recovery and recycling to agriculture

    NASA Astrophysics Data System (ADS)

    Roy, E. D.; Bomeisl, L.; Cornbrooks, P.; Mo, W.

    2017-12-01

    Nutrient recovery and recycling has become a key research topic within the wastewater engineering and nutrient management communities. Several technologies now exist that can effectively capture nutrients from wastewater, and innovation in this area continues to be an important research pursuit. However, practical nutrient recycling solutions require more than capable nutrient capture technologies. We also need to understand the role that wastewater nutrient recovery and recycling can play within broader nutrient management schemes at the landscape level, including important interactions at the nexus of food, energy, and water. We are developing an integrated decision support system that combines wastewater treatment data, agricultural data, spatial nutrient balance modeling, life cycle assessment, stakeholder knowledge, and multi-criteria decision making. Our goals are to: (1) help guide design decisions related to the implementation of sustainable nutrient recovery technology, (2) support innovations in watershed nutrient management that operate at the interface of the built environment and agriculture, and (3) aid efforts to protect aquatic ecosystems while supporting human welfare in a circular nutrient economy. These goals will be realized partly through the assessment of plausible alternative scenarios for the future. In this presentation, we will describe the tool and focus on nutrient balance results for the New England region. These results illustrate that both centralized and decentralized wastewater nutrient recovery schemes have potential to transform nutrient flows in many New England watersheds, diverting wastewater N and P away from aquatic ecosystems and toward local or regional agricultural soils where they can offset a substantial percentage of imported fertilizer. We will also highlight feasibility criteria and next steps to integrate stakeholder knowledge, economics, and life cycle assessment into the tool.
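
    Multi-criteria decision making of the kind described above is commonly operationalized as a weighted sum over normalized criteria scores. A toy sketch of that scoring step; the criterion names, weights, and scores below are hypothetical illustrations, not values from the tool:

```python
def weighted_score(criteria, weights):
    """Weighted-sum score for one nutrient-recovery alternative.

    criteria: dict of criterion -> normalized score in [0, 1]
    weights:  dict of criterion -> weight (weights must sum to 1)
    """
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * criteria[k] for k in weights)

# Hypothetical weights and scores, for illustration only.
weights = {"nutrient_capture": 0.4, "cost": 0.3, "life_cycle_impact": 0.3}
centralized = {"nutrient_capture": 0.8, "cost": 0.7, "life_cycle_impact": 0.5}
decentralized = {"nutrient_capture": 0.6, "cost": 0.5, "life_cycle_impact": 0.8}
```

    In a real decision support system the weights would be elicited from stakeholders and the scores drawn from the nutrient balance, cost, and life cycle assessment modules.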

  12. Pacific Islands Families: First Two Years of Life Study--design and methodology.

    PubMed

    Paterson, Janis; Tukuitonga, Colin; Abbott, Max; Feehan, Michael; Silva, Phil; Percival, Teuila; Carter, Sarnia; Cowley-Malcolm, Esther; Borrows, Jim; Williams, Maynard; Schluter, Philip

    2006-01-27

    Knowledge about the health, psychosocial, and behavioural characteristics of Pacific peoples with young children resident in New Zealand is limited. The Pacific Islands Families: First Two Years of Life (PIF) Study was designed to redress this knowledge gap. This paper describes the design and methodology of the PIF Study. Mothers of Pacific infants born at Middlemore Hospital between 15 March and 17 December 2000 were recruited. Maternal home interviews covering sociodemographic, cultural, environmental, child development, family and household dynamics, childcare, lifestyle, and health issues were undertaken at approximately 6-weeks, 12-months, and 24-months postpartum. Paternal home interviews and child development assessments were conducted at approximately 12-months and 24-months postpartum. Information from Middlemore Hospital's Discharge Summary records and Plunket's 6-week and 6-month assessments was also captured. 1708 mothers were identified, 1657 were invited to participate, and 1590 (96%) consented to a home visit; of these, 1477 (93%) were eligible for the PIF study. Of those eligible, 1376 (93%) participated at 6-weeks, 1224 (83%) participated at 12-months, and 1144 (77%) participated at 24-months. No important differential attrition was observed. Paternal interviews and child assessments were conducted on 825 fathers and 1241 infants at 12-months and on 757 fathers and 1064 children at 24-months. The PIF study is a large, scientifically and culturally robust longitudinal study that has achieved respectable participation rates in a historically hard-to-reach population. We believe that results from this study will inform future policy development within New Zealand.
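
    The participation rates quoted above can be recomputed directly from the raw counts in the abstract, which is a quick consistency check on the reported percentages:

```python
def pct(numerator, denominator):
    """Participation rate as a whole-number percentage."""
    return round(100 * numerator / denominator)

eligible = 1477
consent_rate = pct(1590, 1657)  # consented among those invited
rate_6wk = pct(1376, eligible)  # retention among the eligible cohort
rate_12mo = pct(1224, eligible)
rate_24mo = pct(1144, eligible)
```

    Each value matches the percentage reported in the record (96%, 93%, 83%, and 77% respectively).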

  13. Pilot testing of a membrane system for postcombustion CO2 capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merkel, Tim; Kniep, Jay; Wei, Xiaotong

    2015-09-30

    This final report summarizes work conducted for the U.S. Department of Energy, National Energy Technology Laboratory (DOE) to scale up an efficient post-combustion CO2 capture membrane process to the small pilot test stage (award number DE-FE0005795). The primary goal of this research program was to design, fabricate, and operate a membrane CO2 capture system to treat coal-derived flue gas containing 20 tonnes CO2/day (20 TPD). Membrane Technology and Research (MTR) conducted this project in collaboration with Babcock and Wilcox (B&W), the Electric Power Research Institute (EPRI), WorleyParsons (WP), the Illinois Sustainable Technology Center (ISTC), Enerkem (EK), and the National Carbon Capture Center (NCCC). In addition to the small pilot design, build, and slipstream testing at NCCC, other project efforts included laboratory membrane and module development at MTR, validation field testing on a 1 TPD membrane system at NCCC, boiler modeling and testing at B&W, a techno-economic analysis (TEA) by EPRI/WP, a case study of the membrane technology applied to a ~20 MWe power plant by ISTC, and an industrial CO2 capture test at an Enerkem waste-to-biofuel facility. The 20 TPD small pilot membrane system built in this project successfully completed over 1,000 hours of operation treating flue gas at NCCC. The Polaris™ membranes used on this system demonstrated stable performance, and when combined with over 10,000 hours of operation at NCCC on a 1 TPD system, the risk associated with uncertainty in the durability of postcombustion capture membranes has been greatly reduced. Moreover, next-generation Polaris membranes with higher performance and lower cost were validation tested on the 1 TPD system. The 20 TPD system also demonstrated successful operation of a new low-pressure-drop sweep module that will reduce parasitic energy losses at full scale by as much as 10 MWe.
In modeling and pilot boiler testing, B&W confirmed the viability of CO2 recycle to the boiler as envisioned in the MTR process design. The impact of this CO2 recycle on boiler efficiency was quantified and incorporated into a TEA of the membrane capture process applied to a full-scale power plant. As with previous studies, the TEA showed the membrane process to be lower cost than the conventional solvent capture process even at 90% CO2 capture. A sensitivity study indicates that the membrane capture cost decreases significantly if the 90% capture requirement is relaxed. Depending on the process design, a minimum capture cost is achieved at 30-60% capture, values that would meet proposed CO2 emission regulations for coal-fired power plants. In summary, this project has successfully advanced the MTR membrane capture process through small pilot testing (technology readiness level 6). The technology is ready for future scale-up to the 10 MWe size.
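
    Membrane capture of the sort piloted here is driven by the CO2 partial-pressure difference across the membrane. A back-of-envelope flux sketch using the solution-diffusion relation; the permeance and stream compositions below are illustrative assumptions, not MTR's reported membrane data:

```python
GPU = 3.35e-10  # 1 gas permeation unit, in mol/(m^2 s Pa)

def co2_flux(permeance_gpu, p_feed_pa, x_co2, p_perm_pa, y_co2):
    """Solution-diffusion CO2 flux across a membrane (mol m^-2 s^-1).

    The driving force is the CO2 partial-pressure difference between
    the feed side (p_feed * x) and the permeate side (p_perm * y).
    """
    return permeance_gpu * GPU * (p_feed_pa * x_co2 - p_perm_pa * y_co2)

# Assumed values: a ~1000 GPU CO2 permeance, flue gas at ~1 bar with
# 13% CO2, and a sweep-assisted permeate at 0.2 bar with 60% CO2.
flux = co2_flux(1000, 1.0e5, 0.13, 0.2e5, 0.60)
```

    The small residual driving force in this example is why sweep modules and permeate vacuum levels matter so much to the parasitic energy load.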

  14. Evaluation of Trap Designs and Deployment Strategies for Capturing Halyomorpha halys (Hemiptera: Pentatomidae)

    PubMed Central

    Morrison, William R.; Cullum, John P.; Leskey, Tracy C.

    2015-01-01

    Halyomorpha halys (Stål) is an invasive pest that attacks numerous crops. For growers to make informed management decisions against H. halys, an effective monitoring tool must be in place. We evaluated various trap designs baited with the two-component aggregation pheromone of H. halys and synergist and deployed in commercial apple orchards. We compared our current experimental standard trap, a black plywood pyramid trap 1.22 m in height deployed between border row apple trees with other trap designs for two growing seasons. These included a black lightweight coroplast pyramid trap of similar dimension, a smaller (29 cm) pyramid trap also ground deployed, a smaller limb-attached pyramid trap, a smaller pyramid trap hanging from a horizontal branch, and a semipyramid design known as the Rescue trap. We found that the coroplast pyramid was the most sensitive, capturing more adults than all other trap designs including our experimental standard. Smaller pyramid traps performed equally in adult captures to our experimental standard, though nymphal captures were statistically lower for the hanging traps. Experimental standard plywood and coroplast pyramid trap correlations were strong, suggesting that standard plywood pyramid traps could be replaced with lighter, cheaper coroplast pyramid traps. Strong correlations with small ground- and limb-deployed pyramid traps also suggest that these designs offer promise as well. Growers may be able to adopt alternative trap designs that are cheaper, lighter, and easier to deploy to monitor H. halys in orchards without a significant loss in sensitivity. PMID:26470309
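
    The trap-to-trap comparisons above hinge on correlating paired capture counts between designs. A minimal Pearson correlation sketch; the weekly counts below are invented for illustration, only the method is real:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between paired trap capture counts."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical weekly adult counts from paired plywood vs. coroplast traps.
plywood = [12, 30, 7, 45, 22, 16]
coroplast = [18, 41, 11, 60, 30, 25]
r = pearson_r(plywood, coroplast)
```

    A strong correlation like this is what justifies substituting the cheaper trap: it tracks the same population signal even if absolute counts differ.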

  15. Fostering creativity in product and service development: validation in the domain of information technology.

    PubMed

    Zeng, Liang; Proctor, Robert W; Salvendy, Gavriel

    2011-06-01

    This research is intended to empirically validate a general model of creative product and service development proposed in the literature. A current research gap inspired construction of a conceptual model to capture fundamental phases and pertinent facilitating metacognitive strategies in the creative design process. The model also depicts the mechanism by which design creativity affects consumer behavior. The validity and assets of this model have not yet been investigated. Four laboratory studies were conducted to demonstrate the value of the proposed cognitive phases and associated metacognitive strategies in the conceptual model. Realistic product and service design problems were used in creativity assessment to ensure ecological validity. Design creativity was enhanced by explicit problem analysis, whereby one formulates problems from different perspectives and at different levels of abstraction. Remote association in conceptual combination spawned more design creativity than did near association. Abstraction led to greater creativity in conducting conceptual expansion than did specificity, which induced mental fixation. Domain-specific knowledge and experience enhanced design creativity, indicating that design can be of a domain-specific nature. Design creativity added integrated value to products and services and positively influenced customer behavior. The validity and value of the proposed conceptual model is supported by empirical findings. The conceptual model of creative design could underpin future theory development. Propositions advanced in this article should provide insights and approaches to facilitate organizations pursuing product and service creativity to gain competitive advantage.

  16. Experimental and Theoretical Understanding of Neutron Capture on Uranium Isotopes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ullmann, John Leonard

    2017-09-21

    Neutron capture cross sections on uranium isotopes are important quantities needed to model nuclear explosion performance, nuclear reactor design, nuclear test diagnostics, and nuclear forensics. It has been difficult to calculate capture accurately, and discrepancies of a factor of 2 or more between calculation and measurement are not uncommon, although normalization to measurements of the average capture width and nuclear level density can improve the result. The calculations of capture for 233,235,237,239U are further complicated by the need to accurately include the fission channel.
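
    For context, at low energies capture cross sections follow the textbook 1/v baseline before the resonance region, which dominates uranium capture and is the hard part to calculate, takes over. A sketch of the smooth 1/v piece only; the ~2.7 b thermal value for 238U is a commonly quoted figure, and this is in no way a substitute for evaluated data:

```python
import math

E_TH_EV = 0.0253  # thermal neutron energy, eV

def sigma_1_over_v(sigma_thermal_barns, energy_ev):
    """Capture cross section under the textbook 1/v approximation.

    Real uranium capture is dominated by resonances, so this gives only
    the smooth low-energy baseline, not an evaluated cross section.
    """
    return sigma_thermal_barns * math.sqrt(E_TH_EV / energy_ev)

# Extrapolate a ~2.7 b thermal capture value up to 1 eV.
sigma_1ev = sigma_1_over_v(2.7, 1.0)
```

    The departure of measured cross sections from this baseline is exactly the resonance structure that statistical-model calculations must reproduce.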

  17. Low-cost floating emergence net and bottle trap: Comparison of two designs

    USGS Publications Warehouse

    Cadmus, Pete; Pomeranz, Justin; Kraus, Johanna M.

    2016-01-01

    Sampling emergent aquatic insects is of interest to many freshwater ecologists. Many quantitative emergence traps require aspiration for collection; however, aspiration is infeasible in studies with the large amounts of replication often required in large biomonitoring projects. We designed an economical, collapsible, pyramid-shaped floating emergence trap with an external collection bottle that avoids the need for aspiration. This design was compared experimentally to a design of similar dimensions that relied on aspiration, to ensure comparable results. The pyramid-shaped design captured twice as many total emerging insects. When a preservative was used in the bottle collectors, >95% of the emergent abundance was collected in the bottle. When no preservative was used, >81% of the total insects were collected from the bottle. In addition to capturing fewer emergent insects, the traps that required aspiration took significantly longer to sample. Large studies and studies sampling remote locations could benefit from the economical construction, speed of sampling, and capture efficiency of this design.

  18. Thermal Propulsion Capture System Heat Exchanger Design

    NASA Technical Reports Server (NTRS)

    Richard, Evan M.

    2016-01-01

    One of the biggest challenges of manned spaceflight beyond low Earth orbit and the Moon is the harmful radiation that astronauts would be exposed to on their long journey to Mars and further destinations. Using nuclear energy has the potential to be a more effective means of propulsion compared to traditional chemical engines (higher specific impulse). An upper-stage nuclear engine would allow astronauts to reach their destination faster and more fuel-efficiently. Testing these engines poses engineering challenges due to the need to totally capture the engine exhaust. The Thermal Propulsion Capture System is a concept for cost-effectively and safely testing nuclear thermal engines. Nominally, hydrogen exhausted from the engine is not radioactive, but it is treated as such in case of fuel element failure. In this ground-test concept, the hydrogen exhaust is injected with liquid oxygen and burned to form steam; that steam must then be cooled to saturation temperature and condensed into liquid water for storage. A crossflow heat exchanger using water as the working fluid will be designed to accomplish this goal. The design objectives for this heat exchanger are to eliminate the need for water-injection cooling, cool the steam from 5800 °F to saturation temperature, and remain efficient while minimizing the water requirement.
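
    A first-pass sizing of such an exchanger typically runs through the heat duty Q = m_dot·cp·ΔT and the rating relation Q = U·A·F·LMTD. A rough sketch with assumed (not design-document) mass flow, heat-transfer coefficient, and coolant temperatures; treating cp as constant is itself a crude simplification at these temperatures:

```python
import math

def heat_duty_w(m_dot_kg_s, cp_j_kgk, t_in_k, t_out_k):
    """Sensible heat that must be removed from the steam stream (W)."""
    return m_dot_kg_s * cp_j_kgk * (t_in_k - t_out_k)

def lmtd(dt_hot_end_k, dt_cold_end_k):
    """Log-mean temperature difference between the two streams."""
    return (dt_hot_end_k - dt_cold_end_k) / math.log(dt_hot_end_k / dt_cold_end_k)

def required_area_m2(q_w, u_w_m2k, lmtd_k, f_crossflow=0.9):
    """Area from Q = U*A*F*LMTD; F < 1 corrects the LMTD for crossflow."""
    return q_w / (u_w_m2k * f_crossflow * lmtd_k)

# Assumed numbers: 1 kg/s of steam (cp ~ 2300 J/kg-K, held constant)
# cooled from 5800 F (~3478 K) to ~420 K, against water entering at
# 300 K and leaving at 310 K, with U ~ 200 W/m^2-K.
q = heat_duty_w(1.0, 2300.0, 3478.0, 420.0)
dt_lm = lmtd(3478.0 - 310.0, 420.0 - 300.0)  # counterflow end differences
area = required_area_m2(q, 200.0, dt_lm)
```

    The point of the sketch is the structure of the calculation; a real design would use temperature-dependent steam properties and a crossflow correction factor taken from charts for the actual pass arrangement.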

  19. CHEMICAL EFFECTS IN BIOLOGICAL SYSTEMS – DATA DICTIONARY (CEBS-DD): A COMPENDIUM OF TERMS FOR THE CAPTURE AND INTEGRATION OF BIOLOGICAL STUDY DESIGN DESCRIPTION, CONVENTIONAL PHENOTYPES AND ‘OMICS’ DATA

    EPA Science Inventory

    A critical component in the design of the Chemical Effects in Biological Systems (CEBS) Knowledgebase is a strategy to capture toxicogenomics study protocols and the toxicity endpoint data (clinical pathology and histopathology). A Study is generally an experiment carried out du...

  20. Sample Acquisition Drilling System for the Resource Prospector Mission

    NASA Astrophysics Data System (ADS)

    Zacny, K.; Paulsen, G.; Quinn, J.; Smith, J.; Kleinhenz, J.

    2015-12-01

    The goal of the Lunar Resource Prospector Mission (RPM) is to capture and identify volatile species within the top meter of the lunar regolith. The RPM drill has been designed to 1. Generate cuttings and place them on the surface for analysis by the Near InfraRed Volatiles Spectrometer Subsystem (NIRVSS), and 2. Capture cuttings and transfer them to the Oxygen and Volatile Extraction Node (OVEN) coupled with the Lunar Advanced Volatiles Analysis (LAVA) subsystem. The RPM drill is based on the Mars Icebreaker drill developed for capturing samples of ice and ice-cemented ground on Mars. The drill weighs approximately 10 kg and is rated at approximately 300 W. It is a rotary-percussive, fully autonomous system designed to capture cuttings for analysis. The drill consists of: 1. Rotary-Percussive Drill Head, 2. Sampling Auger, 3. Brushing station, 4. Z-stage, 5. Deployment stage. To reduce sample handling complexity, the drill auger is designed to capture cuttings as opposed to cores. High sampling efficiency is possible through a dual design of the auger: the lower section has deep, low-pitch flutes for retaining cuttings, while the upper section is designed to efficiently move cuttings out of the hole. The drill uses a "bite" sampling approach where samples are captured in ~10 cm intervals. The first-generation drill was tested in a Mars chamber as well as in Antarctica and the Arctic. It demonstrated drilling at the 1-1-100-100 level (1 meter in 1 hour with 100 W and 100 N weight on bit) in ice, ice-cemented ground, soil, and rocks. The second-generation drill was deployed on a Carnegie Mellon University rover, called Zoe, and tested in the Atacama in 2012. The tests demonstrated fully autonomous sample acquisition and delivery to a carousel. The third-generation drill was tested in NASA GRC's vacuum chamber, VF13, at 10^-5 torr and approximately 200 K. It demonstrated successful capture and transfer of icy samples to a crucible.
The drill has been modified and integrated onto the NASA JSC RPM rover. It has been undergoing testing in a lab and in the field during the Summer of 2015.

  1. Linking product design to consumer behavior: the moderating role of consumption experience.

    PubMed

    Gilal, Naeem Gul; Zhang, Jing; Gilal, Faheem Gul

    2018-01-01

    Previous investigations of product design broadly link aesthetic, functional, and symbolic designs to sales growth, high turnover, and market share. However, the effect of product design dimensions on consumer willingness-to-buy (WTB) and word-of-mouth (WOM) has been virtually ignored by consumer researchers. Similarly, whether the consumption experience can differentiate the effect of the three product design dimensions on WTB and WOM is completely unknown. Using categorization theory as a lens, our study aims to explore the effect of product design dimensions on consumer WTB and WOM directly and indirectly through the moderation of the consumption experience. A convenience sample of Chinese (n = 357) and Korean (n = 277) shoppers was utilized to test the hypotheses in the fashion apparel industry. Our results showed that the aesthetic design was more prominent in capturing consumer WTB for both Chinese and Koreans. Similarly, the aesthetic design was more salient in enhancing WOM for Chinese, whereas the symbolic design was more promising in terms of improving WOM for Koreans. Further, our moderation results demonstrated that the consumption experience could differentiate the effects of the three product design dimensions on consumer WTB and WOM for Chinese. By contrast, the consumption experience could only interact with the aesthetic design to improve WOM for South Koreans. To the best of the authors' knowledge, the present study is one of the initial attempts to link the three product design dimensions with consumer WTB and WOM in the fashion apparel context and to explore whether the consumption experience competes with or complements the three product design dimensions in shaping consumer WTB and WOM for Chinese and Koreans.

  2. A population-based tissue probability map-driven level set method for fully automated mammographic density estimations.

    PubMed

    Kim, Youngwoo; Hong, Byung Woo; Kim, Seung Ja; Kim, Jong Hyo

    2014-07-01

    A major challenge when distinguishing glandular tissues on mammograms, especially for area-based estimations, lies in determining a boundary within the hazy transition zone from adipose to glandular tissue. This stems from the nature of mammography, which is a projection of superimposed tissues consisting of different structures. In this paper, the authors present a novel segmentation scheme which incorporates the learned prior knowledge of experts into a level set framework for fully automated mammographic density estimations. The authors modeled the learned knowledge as a population-based tissue probability map (PTPM) designed to capture the classification behavior of experts' visual systems. The PTPM was constructed using an image database of a selected population consisting of 297 cases. Three mammography experts extracted regions of dense and fatty tissue on these digital mammograms, an independent subset used to create a tissue probability map for each ROI based on its local statistics. This tissue class probability was taken as a prior in the Bayesian formulation and incorporated into a level set framework as an additional term controlling the evolution, so that the contour followed an energy surface designed to reflect experts' knowledge as well as the regional statistics inside and outside of the evolving contour. A subset of 100 digital mammograms, not used in constructing the PTPM, was used to validate the performance. The energy was minimized when the initial contour reached the boundary of the dense and fatty tissues as defined by experts. The correlation coefficient between mammographic density measurements made by experts and measurements by the proposed method was 0.93, while that with the conventional level set was 0.47. The proposed method showed a marked improvement over the conventional level set method in terms of accuracy and reliability.
This result suggests that the proposed method successfully incorporated the learned knowledge of the experts' visual systems and has potential to be used as an automated and quantitative tool for estimations of mammographic breast density levels.
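
    The core mechanism described here, a region-competition level set evolution with a per-pixel prior pulling the contour toward expert-labeled dense tissue, can be sketched crudely. This is a simplified Chan-Vese-style variant with an additive prior term on a synthetic image, not the authors' exact energy or data:

```python
import numpy as np

def level_set_step(phi, img, prior, lam=1.0, dt=0.5):
    """One gradient step of a Chan-Vese-style region energy plus a
    population tissue-probability prior (a crude sketch of the idea).

    phi:   level set function ("dense" region where phi > 0)
    img:   image intensities, float array
    prior: per-pixel probability of dense tissue, in [0, 1]
    """
    inside = phi > 0
    c_in = img[inside].mean() if inside.any() else img.mean()
    c_out = img[~inside].mean() if (~inside).any() else img.mean()
    # Region term grows phi where the pixel better matches the inside
    # mean; the prior term pushes toward pixels the map calls dense.
    region = (img - c_out) ** 2 - (img - c_in) ** 2
    prior_force = lam * (prior - 0.5)
    return phi + dt * (region + prior_force)

# Synthetic demo: a bright "dense" square with a confident prior there.
img = np.zeros((20, 20))
img[5:15, 5:15] = 1.0
prior = np.where(img > 0.5, 0.9, 0.1)
phi = np.zeros_like(img)
for _ in range(5):
    phi = level_set_step(phi, img, prior)
# After a few steps the zero level set locks onto the square.
```

    In the paper's formulation the prior is the learned PTPM rather than a synthetic map, and the evolution also includes regularization terms omitted here for brevity.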

  3. Using the Web as a Higher Order Thinking Partner: Case Study of an Advanced Learner Creatively Synthesizing Knowledge on the Web

    ERIC Educational Resources Information Center

    DeSchryver, Michael

    2017-01-01

    Previous work provided foundations for the theory of web-mediated knowledge synthesis, a framework for using the web in more creative and generative ways. This article explores specific instances of the various elements of that theory in a single case from the initial study. That is, a thorough exploration of think-aloud, screen capture, and…

  4. 75 FR 46943 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-04

    ...-capturing process. SAMHSA will place Web site registration information into a Knowledge Management database... September 3, 2010 to: SAMHSA Desk Officer, Human Resources and Housing Branch, Office of Management and...

  5. Balancing Act: How to Capture Knowledge without Killing It.

    ERIC Educational Resources Information Center

    Brown, John Seely; Duguid, Paul

    2000-01-01

    Top-down processes for institutionalizing ideas can stifle creativity. Xerox researchers learned how to combine process-based and practice-based methods in order to disseminate best practices from a community of repair technicians. (JOW)

  6. Exercise Sensing and Pose Recovery Inference Tool (ESPRIT) - A Compact Stereo-based Motion Capture Solution For Exercise Monitoring

    NASA Technical Reports Server (NTRS)

    Lee, Mun Wai

    2015-01-01

    Crew exercise is important during long-duration space flight not only for maintaining health and fitness but also for preventing adverse health problems, such as losses in muscle strength and bone density. Monitoring crew exercise via motion capture and kinematic analysis aids understanding of the effects of microgravity on exercise and helps ensure that exercise prescriptions are effective. Intelligent Automation, Inc., has developed ESPRIT to monitor exercise activities, detect body markers, extract image features, and recover three-dimensional (3D) kinematic body poses. The system relies on prior knowledge and modeling of the human body and on advanced statistical inference techniques to achieve robust and accurate motion capture. In Phase I, the company demonstrated motion capture of several exercises, including walking, curling, and deadlifting. Phase II efforts focused on enhancing algorithms and delivering an ESPRIT prototype for testing and demonstration.

  7. Destination pluto: New horizons performance during the approach phase

    NASA Astrophysics Data System (ADS)

    Flanigan, Sarah H.; Rogers, Gabe D.; Guo, Yanping; Kirk, Madeline N.; Weaver, Harold A.; Owen, William M.; Jackman, Coralie D.; Bauman, Jeremy; Pelletier, Frederic; Nelson, Derek; Stanbridge, Dale; Dumont, Phillip J.; Williams, Bobby; Stern, S. Alan; Olkin, Cathy B.; Young, Leslie A.; Ennico, Kimberly

    2016-11-01

    The New Horizons spacecraft began its journey to the Pluto-Charon system on January 19, 2006 on-board an Atlas V rocket from Cape Canaveral, Florida. As the first mission in NASA's New Frontiers program, the objective of the New Horizons mission is to perform the first exploration of ice dwarfs in the Kuiper Belt, extending knowledge of the solar system to include the icy "third zone" for the first time. Arriving at the correct time and correct position relative to Pluto on July 14, 2015 depended on the successful execution of a carefully choreographed sequence of events. The Core command sequence, which was developed and optimized over multiple years and included the highest-priority science observations during the closest approach period, was contingent on precise navigation to the Pluto-Charon system and nominal performance of the guidance and control (G&C) subsystem. The flyby and gravity assist of Jupiter on February 28, 2007 was critical in placing New Horizons on the path to Pluto. Once past Jupiter, trajectory correction maneuvers (TCMs) became the sole source of trajectory control since the spacecraft did not encounter any other planetary bodies along its flight path prior to Pluto. During the Pluto approach phase, which formally began on January 15, 2015, optical navigation images were captured primarily with the Long Range Reconnaissance Imager to refine spacecraft and Pluto-Charon system trajectory knowledge, which in turn was used to design TCMs. Orbit determination solutions were also used to update the spacecraft's on-board trajectory knowledge throughout the approach phase. Nominal performance of the G&C subsystem, accurate TCM designs, and high-quality orbit determination solutions resulted in final Pluto-relative B-plane arrival conditions that facilitated a successful first reconnaissance of the Pluto-Charon system.

  8. Device Scale Modeling of Solvent Absorption using MFIX-TFM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carney, Janine E.; Finn, Justin R.

    Recent climate change is largely attributed to greenhouse gases (e.g., carbon dioxide, methane), and fossil fuels account for a large majority of global CO2 emissions. That said, fossil fuels will continue to play a significant role in the generation of power for the foreseeable future. The extent to which CO2 is emitted needs to be reduced; at the same time, carbon capture and sequestration are necessary actions to tackle climate change. Different approaches exist for CO2 capture, including post-combustion and pre-combustion technologies, oxy-fuel combustion, and chemical looping combustion. The focus of this effort is on post-combustion solvent-absorption technology. To apply CO2 technologies at commercial scale, the availability, maturity, and potential for scalability of the technology need to be considered. Solvent absorption is a proven technology, but not at the scale needed by a typical power plant. The scale-up, scale-down, and design of laboratory and commercial packed-bed reactors depend heavily on specific knowledge of two-phase pressure drop, liquid holdup, wetting efficiency, and mass-transfer efficiency as a function of operating conditions. Simple scaling rules often fail to yield proper designs. Conventional reactor design modeling approaches generally characterize complex non-ideal flow and mixing patterns using simplified and/or mechanistic flow assumptions. While these approaches vary in complexity, none of them resolve the local velocity fields. Consequently, they are unable to account for important design factors such as flow maldistribution and channeling from a fundamental perspective. Ideally, design would be aided by predictive models based on a truer representation of the physical and chemical processes that occur at different scales. Computational fluid dynamics (CFD) models are based on multidimensional flow equations with first-principles foundations. CFD models can include a more accurate physical description of flow processes and can be modified to include more complex behavior. Wetting performance and spatial liquid distribution inside the absorber are recognized as weak areas of knowledge requiring further investigation. CFD tools offer a possible method of investigating such topics and gaining a better understanding of their influence on reactor performance. This report focuses first on describing a hydrodynamic model for countercurrent gas-liquid flow through a packed column, and then on the chemistry, heat transfer, and mass transfer specific to CO2 absorption using monoethanolamine (MEA). The model is implemented in MFIX, an open-source CFD software package. The user-defined functions needed to build this model are described in detail, along with the keywords for the corresponding input file. A test case is outlined along with a few results. The example serves to briefly illustrate the developed CFD tool and its potential capability to investigate solvent absorption.
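    The gas-liquid mass-transfer coupling described above can be illustrated with a back-of-the-envelope two-film estimate of CO2 flux into aqueous MEA. This sketch is not the MFIX model; every parameter name and number below is a hypothetical placeholder chosen only to show the resistance-in-series structure:

```python
import math

def co2_flux(p_co2, k_g, k_l, H, k2, c_mea, D_co2):
    """Two-film estimate of CO2 absorption flux (mol/m^2/s) into aqueous MEA,
    with a pseudo-first-order enhancement factor on the liquid side.

    p_co2 : CO2 partial pressure in the bulk gas (Pa)
    k_g   : gas-side mass-transfer coefficient (mol/m^2/s/Pa)
    k_l   : liquid-side coefficient without reaction (m/s)
    H     : Henry's constant for CO2 in the solvent (Pa m^3/mol)
    k2    : second-order CO2-MEA rate constant (m^3/mol/s)
    c_mea : free MEA concentration (mol/m^3)
    D_co2 : CO2 diffusivity in the liquid (m^2/s)
    """
    # Enhancement of liquid-side transfer by fast reaction (Hatta number regime)
    E = math.sqrt(k2 * c_mea * D_co2) / k_l
    # Gas-film and reaction-enhanced liquid-film resistances in series
    K_g_inv = 1.0 / k_g + H / (E * k_l)
    return p_co2 / K_g_inv
```

The flux is always bounded above by the pure gas-film limit `k_g * p_co2`, which is one quick sanity check on any such implementation.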

  9. Towards Modeling False Memory With Computational Knowledge Bases.

    PubMed

    Li, Justin; Kohanyi, Emma

    2017-01-01

    One challenge to creating realistic cognitive models of memory is the inability to account for the vast common-sense knowledge of human participants. Large computational knowledge bases such as WordNet and DBpedia may offer a solution to this problem but may pose other challenges. This paper explores some of these difficulties through a semantic network spreading activation model of the Deese-Roediger-McDermott false memory task. In three experiments, we show that these knowledge bases only capture a subset of human associations, while irrelevant information introduces noise and makes efficient modeling difficult. We conclude that the contents of these knowledge bases must be augmented and, more important, that the algorithms must be refined and optimized, before large knowledge bases can be widely used for cognitive modeling. Copyright © 2016 Cognitive Science Society, Inc.
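    The paper's model is not specified in the abstract; a minimal spreading-activation sketch over a toy DRM-style semantic network (all words, weights, and parameters below are hypothetical placeholders) shows how an unpresented critical lure can end up more activated than the studied words:

```python
def spread(network, sources, decay=0.5, steps=2):
    """Simple spreading activation over a weighted directed semantic network.

    network : dict mapping a word to {neighbor: association weight}
    sources : words initially activated (the studied list), activation 1.0
    Each step, every node passes (activation * weight * decay) to its neighbors.
    """
    activation = {w: 1.0 for w in sources}
    for _ in range(steps):
        new = dict(activation)
        for node, act in activation.items():
            for nbr, weight in network.get(node, {}).items():
                new[nbr] = new.get(nbr, 0.0) + act * weight * decay
        activation = new
    return activation

# Toy DRM-style list: studied words all associate with the lure "sleep".
network = {
    "bed":   {"sleep": 0.8, "rest": 0.5},
    "dream": {"sleep": 0.9},
    "tired": {"sleep": 0.7, "rest": 0.6},
}
```

With this toy network, the never-presented lure "sleep" accumulates activation from all three studied words, mirroring the false-memory effect the task is designed to elicit.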

  10. Methodology for identifying and representing knowledge in the scope of CMM inspection resource selection

    NASA Astrophysics Data System (ADS)

    Martínez, S.; Barreiro, J.; Cuesta, E.; Álvarez, B. J.; González, D.

    2012-04-01

    This paper focuses on the elicitation and structuring of knowledge related to the selection of inspection resources. The final goal is to obtain an informal knowledge model oriented toward inspection planning on coordinate measuring machines. In the initial tasks, where knowledge is captured, it is necessary to use tools that ease the analysis and structuring of knowledge, so that selection rules can be easily stated to configure the inspection resources. To store the knowledge, a so-called Onto-Process ontology has been developed. This ontology may be applicable to diverse processes in manufacturing engineering. This paper describes the decomposition of the ontology into general units of knowledge and more specific units for the selection of sensor assemblies in inspection planning with touch sensors.

  11. Penetration of multiple thin films in micrometeorite capture cells

    NASA Technical Reports Server (NTRS)

    Simon, Charles G.

    1994-01-01

    As part of a continuing effort to develop cosmic dust detectors/collectors for use in space, we performed a series of hypervelocity impact experiments on combined sensor/capture-cell assemblies using 10-200-micron-diameter glass projectiles and olivine crystals at velocities of 0.9-14.4 km/s. The design objective of the space-flight instrument is to measure the trajectories of individual particles with sufficient accuracy to permit identification of their parent bodies and to capture enough impactor material to allow chemical and isotopic analyses of samples returned to Earth. Three different multiple-film small-particle capture cell designs (0.1-100-micron-thick Al foils with approx. 10, 100, and 1800 micron spacing) were evaluated for their ability to capture impactor fragments and residue. Their performance was compared to that of two other types of capture cells: foil-covered Ge crystals, and 0.50 and 0.120 g/cm³ aerogels. All capture cells were tested behind multifilm (1.4-6.0-micron-thick) polyvinylidene fluoride (PVDF) velocity/trajectory sensor devices. Several tests were also done without the PVDF sensors for comparison. The results of this study were reported by Simon in a comprehensive report discussing the morphology of impacts and impactor residues in various types of capture cells after passage through two PVDF sensor films. Impactor fragments in selected capture cells from impacts at velocities up to 6.4 km/s were identified using scanning electron microscopy with energy-dispersive spectroscopy (SEM/EDS).

  12. An overview of DANCE: a 4π BaF₂ detector for neutron capture measurements at LANSCE.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ullmann, J. L.

    2004-01-01

    The Detector for Advanced Neutron Capture Experiments (DANCE) is a 162-element, 4π BaF₂ array designed to make neutron capture cross-section measurements on rare or radioactive targets with masses as small as 1 mg. Accurate capture cross sections are needed in many research areas, including stellar nucleosynthesis, advanced nuclear fuel cycles, waste transmutation, and other applied programs. These cross sections are difficult to calculate accurately and must be measured. Up to now, except for a few long-lived nuclides, there have been essentially no differential capture measurements on radioactive nuclei. The DANCE array is located at the Lujan Neutron Scattering Center at LANSCE, which is a continuous-spectrum neutron source with usable energies from below thermal to about 100 keV. Data acquisition is done with 320 fast waveform digitizers. The design and initial performance results, including background minimization, will be discussed.

  13. Platform control for space-based imaging: the TOPSAT mission

    NASA Astrophysics Data System (ADS)

    Dungate, D.; Morgan, C.; Hardacre, S.; Liddle, D.; Cropp, A.; Levett, W.; Price, M.; Steyn, H.

    2004-11-01

    This paper describes the imaging-mode ADCS design for the TOPSAT satellite, an Earth observation demonstration mission targeted at military applications. The baselined orbit for TOPSAT is a 600-700 km sun-synchronous orbit from which images up to 30° off track can be captured. For this baseline, the imaging camera provides a resolution of 2.5 m and a nominal image size of 15 x 15 km. The ADCS design solution for the imaging mode uses a moving-demand approach to enable a single control algorithm for both the preparatory reorientation prior to image capture and the post-capture return to nadir pointing. During image capture proper, control is suspended to minimise the disturbances experienced by the satellite from the wheels. Prior to each imaging sequence, the moving demand attitude and rate profiles are calculated such that the correct attitude and rate are achieved at the correct orbital position, enabling the correct target area to be captured.
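    The abstract describes attitude and rate profiles that must meet a demanded attitude and rate at a fixed time. The TOPSAT implementation is not given; one common way to realize such a profile, sketched here for a hypothetical single axis, is a cubic polynomial matched to the boundary conditions at the start and end of the slew:

```python
def cubic_profile(theta0, omega0, thetaT, omegaT, T):
    """Coefficients of theta(t) = a0 + a1*t + a2*t**2 + a3*t**3 that meet
    attitude/rate boundary conditions at t=0 and t=T (single axis, radians)."""
    d_theta = thetaT - theta0 - omega0 * T  # attitude still to cover beyond coasting
    d_omega = omegaT - omega0               # rate change still required
    a2 = 3.0 * d_theta / T**2 - d_omega / T
    a3 = d_omega / T**2 - 2.0 * d_theta / T**3
    return theta0, omega0, a2, a3

def evaluate(coeffs, t):
    """Attitude and rate of the cubic profile at time t."""
    a0, a1, a2, a3 = coeffs
    theta = a0 + a1 * t + a2 * t**2 + a3 * t**3
    omega = a1 + 2.0 * a2 * t + 3.0 * a3 * t**2
    return theta, omega
```

For example, a 30° (0.52 rad) reorientation over 60 s that starts and ends at rest hits both the demanded attitude and zero rate exactly at t = T.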

  14. Performance Characteristics of a Kernel-Space Packet Capture Module

    DTIC Science & Technology

    2010-03-01

    This AFIT thesis (AFIT/GCO/ENG/10-03) presents the design, development, and comparative performance analysis of a kernel-space packet capture module as a proof of concept. The prototype requires changes to kernel code and can be used for both user-space and kernel-space capture applications, enabling controlled comparative performance analysis.

  15. An effective box trap for capturing lynx

    Treesearch

    Jay A. Kolbe; John R. Squires; Thomas W. Parker

    2003-01-01

    We designed a box trap for capturing lynx (Lynx lynx) that is lightweight, safe, effective, and less expensive than many commercial models. It can be constructed in approximately 3-4 hours from readily available materials. We used this trap to capture 40 lynx 89 times (96% of lynx entering traps) and observed no trapping-related injuries. We compare our box...

  16. Realized detection and capture probabilities for giant gartersnakes (Thamnophis gigas) using modified floating aquatic funnel traps

    USGS Publications Warehouse

    Halstead, Brian J.; Skalos, Shannon M.; Casazza, Michael L.; Wylie, Glenn D.

    2015-01-01

    Detection and capture probabilities for giant gartersnakes (Thamnophis gigas) are very low, and successfully evaluating the effects of variables or experimental treatments on giant gartersnake populations will require greater detection and capture probabilities than those that had been achieved with standard trap designs. Previous research identified important trap modifications that can increase the probability of snakes entering traps and help prevent the escape of captured snakes. The purpose of this study was to quantify detection and capture probabilities obtained using the most successful modification to commercially available traps to date (2015), and examine the ability of realized detection and capture probabilities to achieve benchmark levels of precision in occupancy and capture-mark-recapture studies.
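    As a back-of-the-envelope illustration of why the per-survey detection probability drives study design (the numbers below are hypothetical, not values from this study), the probability of detecting a snake at an occupied site at least once grows with repeated surveys:

```python
import math

def cumulative_detection(p, n):
    """Probability of at least one detection at an occupied site over n
    independent surveys, each with per-survey detection probability p."""
    return 1.0 - (1.0 - p) ** n

def surveys_needed(p, target):
    """Smallest number of surveys so cumulative detection reaches target."""
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p))
```

With a per-survey detection probability of only 0.05, thirty surveys still leave roughly a one-in-five chance of missing a snake that is present, which is why trap modifications that raise p matter so much for precision in occupancy and capture-mark-recapture studies.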

  17. Adaptation of Hybridization Capture of Chromatin-associated Proteins for Proteomics to Mammalian Cells.

    PubMed

    Guillen-Ahlers, Hector; Rao, Prahlad K; Perumalla, Danu S; Montoya, Maria J; Jadhav, Avinash Y L; Shortreed, Michael R; Smith, Lloyd M; Olivier, Michael

    2018-06-01

    The hybridization capture of chromatin-associated proteins for proteomics (HyCCAPP) technology was initially developed to uncover novel DNA-protein interactions in yeast. It allows analysis of a target region of interest without the need for prior knowledge about likely proteins bound to the target region. This, in theory, allows HyCCAPP to be used to analyze any genomic region of interest, and it provides sufficient flexibility to work in different cell systems. This method is not meant to study binding sites of known transcription factors, a task better suited for Chromatin Immunoprecipitation (ChIP) and ChIP-like methods. The strength of HyCCAPP lies in its ability to explore DNA regions for which there is limited or no knowledge about the proteins bound to it. It can also be a convenient method to avoid biases (present in ChIP-like methods) introduced by protein-based chromatin enrichment using antibodies. Potentially, HyCCAPP can be a powerful tool to uncover truly novel DNA-protein interactions. To date, the technology has been predominantly applied to yeast cells or to high copy repeat sequences in mammalian cells. In order to become the powerful tool we envision, HyCCAPP approaches need to be optimized to efficiently capture single-copy loci in mammalian cells. Here, we present our adaptation of the initial yeast HyCCAPP capture protocol to human cell lines, and show that single-copy chromatin regions can be efficiently isolated with this modified protocol.

  18. Characterizing scientific production and consumption in Physics

    PubMed Central

    Zhang, Qian; Perra, Nicola; Gonçalves, Bruno; Ciulla, Fabio; Vespignani, Alessandro

    2013-01-01

    We analyze the entire publication database of the American Physical Society, generating longitudinal (50-year) citation networks geolocalized at the level of single urban areas. We define a knowledge diffusion proxy and a scientific production ranking algorithm to capture the spatio-temporal dynamics of Physics knowledge worldwide. By using the knowledge diffusion proxy, we identify the key cities in the production and consumption of knowledge in Physics as a function of time. The results from the scientific production ranking algorithm allow us to characterize the top cities for scholarly research in Physics. Although we focus on a single dataset concerning a specific field, the methodology presented here opens the path to comparative studies of the dynamics of knowledge across disciplines and research areas. PMID:23571320

  19. Primary Care Outcomes Questionnaire: psychometric testing of a new instrument.

    PubMed

    Murphy, Mairead; Hollinghurst, Sandra; Cowlishaw, Sean; Salisbury, Chris

    2018-06-01

    Patients attend primary care for many reasons and to achieve a range of possible outcomes. There is currently no Patient Reported Outcome Measure (PROM) designed to capture these diverse outcomes, and trials of interventions in primary care may thus fail to detect beneficial effects. This study describes the psychometric testing of the Primary Care Outcomes Questionnaire (PCOQ), which was designed to capture a broad range of outcomes relevant to primary care. Questionnaires were administered in primary care in South West England. Patients completed the PCOQ in GP waiting rooms before a consultation, and a second questionnaire, including the PCOQ and seven comparator PROMs, after 1 week. Psychometric testing included exploratory factor analysis on the PCOQ, internal consistency, correlation coefficients between domain scores and comparator measures, and repeated measures effect sizes indicating change across 1 week. In total, 602 patients completed the PCOQ at baseline, and 264 (44%) returned the follow-up questionnaire. Exploratory factor analysis suggested four dimensions underlying the PCOQ items: health and wellbeing, health knowledge and self-care, confidence in health provision, and confidence in health plan. Each dimension was internally consistent and correlated as expected with comparator PROMs, providing evidence of construct validity. Patients reporting an improvement in their main problem exhibited small to moderate improvements in relevant domain scores on the PCOQ. The PCOQ was acceptable, feasible, showed strong psychometric properties, and was responsive to change. It is a promising new tool for assessment of outcomes of primary care interventions from a patient perspective. © British Journal of General Practice 2018.

  20. SensePath: Understanding the Sensemaking Process Through Analytic Provenance.

    PubMed

    Nguyen, Phong H; Xu, Kai; Wheat, Ashley; Wong, B L William; Attfield, Simon; Fields, Bob

    2016-01-01

    Sensemaking is described as the process of comprehension, finding meaning and gaining insight from information, producing new knowledge and informing further action. Understanding the sensemaking process allows building effective visual analytics tools to make sense of large and complex datasets. Currently, it is often a manual and time-consuming undertaking to comprehend this: researchers collect observation data, transcribe screen capture videos and think-aloud recordings, identify recurring patterns, and eventually abstract the sensemaking process into a general model. In this paper, we propose a general approach to facilitate such a qualitative analysis process, and introduce a prototype, SensePath, to demonstrate the application of this approach with a focus on browser-based online sensemaking. The approach is based on a study of a number of qualitative research sessions, including observations of users performing sensemaking tasks and post hoc analyses to uncover their sensemaking processes. Based on the study results and a follow-up participatory design session with HCI researchers, we decided to focus on the transcription and coding stages of thematic analysis. SensePath automatically captures the user's sensemaking actions, i.e., analytic provenance, and provides multi-linked views to support their further analysis. A number of other requirements elicited from the design session are also implemented in SensePath, such as easy integration with existing qualitative analysis workflows and non-intrusiveness for participants. The tool was used by an experienced HCI researcher to analyze two sensemaking sessions. The researcher found the tool intuitive and reported that it considerably reduced analysis time, allowing a better understanding of the sensemaking process.

  1. Designing for designers: insights into the knowledge users of inclusive design.

    PubMed

    Dong, Hua; McGinley, Chris; Nickpour, Farnaz; Cifter, Abdusselam Selami

    2015-01-01

    Over the last twenty years, research on inclusive design has delivered a wealth of publications and initiatives, forming an emerging knowledge base for inclusive design. The inclusive design knowledge base breaks down into two discrete areas - understanding end users from many different perspectives, and understanding the information needs of the knowledge users (e.g. designers) who are involved in promoting and delivering inclusive design solutions. Much research has focused on the end users, but in recent years, understanding the needs and the characteristics of knowledge users has added a new dimension to the research task. This paper focuses on the knowledge users of inclusive design. It discusses the different types of knowledge users and their knowledge needs. The research programmes undertaken by the Inclusive Design Research Group (IDRG) are used to illustrate the process of understanding knowledge needs of designers, developing different types of tools to meet those needs and evaluating their effectiveness. The paper concludes with a discussion on how to adopt an inclusive design research methodology to effectively engage the knowledge users in the development of inclusive design tools. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.

  2. Application of free energy minimization to the design of adaptive multi-agent teams

    NASA Astrophysics Data System (ADS)

    Levchuk, Georgiy; Pattipati, Krishna; Fouse, Adam; Serfaty, Daniel

    2017-05-01

    Many novel DoD missions, from disaster relief to cyber reconnaissance, require teams of humans and machines with diverse capabilities. Current solutions do not account for the heterogeneity of agent capabilities, the uncertainty of team knowledge, or the dynamics of and dependencies between tasks and agent roles, resulting in brittle teams. Most importantly, state-of-the-art team design solutions are either centralized, imposing role and relation assignments onto agents, or completely distributed, suitable only for homogeneous organizations such as swarms. Centralized design models cannot provide insights into a team's self-organization, i.e., adapting the team structure over time in a distributed, collaborative manner by team members with diverse expertise and responsibilities. In this paper we present an information-theoretic formalization of team composition and structure adaptation using minimization of variational free energy. The structure adaptation is obtained in an iterative, distributed, and collaborative manner without the need for centralized control. We show that our model is lightweight and predictive, and that it produces team structures that theoretically approximate an optimal policy for team adaptation. Our model also provides a unique coupling between structure and action policy, and captures three essential processes: learning, perception, and control.

  3. Computational design of soft materials for the capture of Cs-137 in contaminated environments: From 2D covalent cucurbituril networks to 3D supramolecular materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pichierri, Fabio, E-mail: fabio@che.tohoku.ac.jp

    Using computational quantum chemistry methods we design novel 2D and 3D soft materials made of cucurbituril macrocycles covalently connected with each other via rigid linkers. Such covalent cucurbituril networks might be useful for the capture of radioactive Cs-137 (present as Cs{sup +}) in the contaminated environment.

  4. TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0

    NASA Technical Reports Server (NTRS)

    Ortiz, C. J.

    1994-01-01

    The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert organize and understand the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue-box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels or minimal steps to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C-language for IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2Mb of RAM, a Microsoft compatible mouse, a VGA display and an 80286, 386 or 486 processor machine. 
The standard distribution medium for TARGET is one 5.25 inch 360K MS-DOS format diskette. TARGET was developed in 1991.

  5. Intellectual Capital and YOU.

    ERIC Educational Resources Information Center

    Gordon, Jack

    1999-01-01

    Discusses the differences between corporate universities and training departments and suggests that marketing is a big part of it. Defines knowledge management as the effort to capture an organization's collective experience and wisdom and to make it useful to everyone. (JOW)

  6. Through the eyes of the Informationist: Identifying information needs of the Breast Imaging Service at a tertiary medical center specializing in cancer.

    PubMed

    DeRosa, Antonio P; Gibson, Donna S; Morris, Elizabeth A

    2017-09-01

    The information services offered by Embedded Librarians over the years have led to the more modern, domain-knowledge-specific role of the Informationist. A 10-point questionnaire was developed and used to interview 12 attending physicians and three fellows chosen at random. The participants are either on the research track (n = 3) or the clinical track (n = 9). A two-part schematic was also created to capture more detailed feedback about the information needs and information-seeking behavior of clinicians regarding patient care (clinical) and research activities. Bibliographic management tool use and time-related factors were also captured in the interviews and written schematics. The role of the Informationist is an emerging yet valuable one to assigned clinical groups. Clinicians' knowledge base, current awareness, productivity, and evidence-based care can be improved by use of Informationist services.

  7. Multi-frame knowledge based text enhancement for mobile phone captured videos

    NASA Astrophysics Data System (ADS)

    Ozarslan, Suleyman; Eren, P. Erhan

    2014-02-01

    In this study, we explore automated text recognition and enhancement using mobile phone captured videos of store receipts. We propose a method that includes Optical Character Recognition (OCR) enhanced by our proposed Row Based Multiple Frame Integration (RB-MFI) and Knowledge Based Correction (KBC) algorithms. In this method, first, the trained OCR engine is used for recognition; then, the RB-MFI algorithm is applied to the output of the OCR. The RB-MFI algorithm determines and combines the most accurate rows of the text outputs extracted using OCR from multiple frames of the video. After RB-MFI, the KBC algorithm is applied to these rows to correct erroneous characters. Results of the experiments show that the proposed video-based approach, which includes the RB-MFI and KBC algorithms, increases the word recognition rate to 95% and the character recognition rate to 98%.
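    The abstract does not specify how RB-MFI scores rows; one plausible sketch, assuming OCR rows are aligned by index across frames and the "most accurate" version of each row is taken by simple majority vote, is:

```python
from collections import Counter

def integrate_rows(frame_texts):
    """Combine OCR output rows from multiple video frames of the same receipt.

    frame_texts: list of per-frame OCR results, each a list of row strings
    aligned by row index. For each row, keep the version that the largest
    number of frames agree on (majority vote across frames).
    """
    n_rows = max(len(rows) for rows in frame_texts)
    merged = []
    for i in range(n_rows):
        candidates = [rows[i] for rows in frame_texts if i < len(rows)]
        merged.append(Counter(candidates).most_common(1)[0][0])
    return merged
```

With three frames in which different rows are corrupted by motion blur, the vote recovers the clean text of each row as long as a majority of frames read it correctly.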

  8. The Knowledge Program: an Innovative, Comprehensive Electronic Data Capture System and Warehouse

    PubMed Central

    Katzan, Irene; Speck, Micheal; Dopler, Chris; Urchek, John; Bielawski, Kay; Dunphy, Cheryl; Jehi, Lara; Bae, Charles; Parchman, Alandra

    2011-01-01

    Data contained in the electronic health record (EHR) present a tremendous opportunity to improve quality of care and enhance research capabilities. However, the EHR is not structured to provide data for such purposes: most clinical information is entered as free text, and content varies substantially between providers. Discrete information on patients' functional status is typically not collected. Data extraction tools are often unavailable. We have developed the Knowledge Program (KP), a comprehensive initiative to improve the collection of discrete clinical information in the EHR and the retrievability of data for use in research, quality, and patient care. A distinct feature of the KP is the systematic collection of patient-reported outcomes, which are captured discretely, allowing more refined analyses of care outcomes. The KP capitalizes on features of the Epic EHR and utilizes an external IT infrastructure distinct from Epic for enhanced functionality. Here, we describe the development and implementation of the KP. PMID:22195124

  9. Evaluation and comparison of an adaptive method technique for improved performance of linear Fresnel secondary designs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hack, Madeline; Zhu, Guangdong; Wendelin, Timothy J.

    As a line-focus concentrating solar power (CSP) technology, linear Fresnel collectors have the potential to become a low-cost solution for electricity production and a variety of thermal energy applications. However, this technology often suffers from relatively low performance. A secondary reflector is a key component used to improve the optical performance of a linear Fresnel collector. The shape of the secondary reflector is particularly critical in determining the solar power captured by the absorber tube(s) and, thus, the collector's optical performance. However, to the authors' knowledge, no well-established process existed to derive the optimal secondary shape prior to the development of a new adaptive method to optimize the secondary reflector shape. The adaptive method does not assume any pre-defined analytical form; rather, it constructs an optimum shape through an adaptive process that maximizes the energy collected by the absorber tube. In this paper, the adaptive method is compared with popular secondary-reflector designs with respect to the collector's optical performance under various scenarios. For the first time, a comprehensive, in-depth comparison was conducted of all popular secondary designs for CSP applications. In conclusion, it is shown that the adaptive design exhibits the best optical performance.

  10. Evaluation and comparison of an adaptive method technique for improved performance of linear Fresnel secondary designs

    DOE PAGES

    Hack, Madeline; Zhu, Guangdong; Wendelin, Timothy J.

    2017-09-13

    As a line-focus concentrating solar power (CSP) technology, linear Fresnel collectors have the potential to become a low-cost solution for electricity production and a variety of thermal energy applications. However, this technology often suffers from relatively low performance. A secondary reflector is a key component used to improve the optical performance of a linear Fresnel collector. The shape of the secondary reflector is particularly critical in determining the solar power captured by the absorber tube(s) and, thus, the collector's optical performance. However, to the authors' knowledge, no well-established process existed to derive the optimal secondary shape prior to the development of a new adaptive method to optimize the secondary reflector shape. The adaptive method does not assume any pre-defined analytical form; rather, it constructs an optimum shape through an adaptive process that maximizes the energy collected by the absorber tube. In this paper, the adaptive method is compared with popular secondary-reflector designs with respect to the collector's optical performance under various scenarios. For the first time, a comprehensive, in-depth comparison was conducted of all popular secondary designs for CSP applications. In conclusion, it is shown that the adaptive design exhibits the best optical performance.

  11. Nutrition warnings as front-of-pack labels: influence of design features on healthfulness perception and attentional capture.

    PubMed

    Cabrera, Manuel; Machín, Leandro; Arrúa, Alejandra; Antúnez, Lucía; Curutchet, María Rosa; Giménez, Ana; Ares, Gastón

    2017-12-01

    Warnings are a new directive front-of-pack (FOP) nutrition labelling scheme that highlights products with high content of key nutrients. The design of warnings influences their ability to catch consumers' attention and to clearly communicate their intended meaning, which are key determinants of their effectiveness. The aim of the present work was to evaluate the influence of design features of warnings as a FOP nutrition labelling scheme on perceived healthfulness and attentional capture. Five studies with a total of 496 people were carried out. In the first study, the association of colour and perceived healthfulness was evaluated in an online survey in which participants had to rate their perceived healthfulness of eight colours. In the second study, the influence of colour, shape and textual information on perceived healthfulness was evaluated using choice-conjoint analysis. The third study focused on implicit associations between two design features (shape and colour) on perceived healthfulness. The fourth and fifth studies used visual search to evaluate the influence of colour, size and position of the warnings on attentional capture. Perceived healthfulness was significantly influenced by shape, colour and textual information. Colour was the variable with the largest contribution to perceived healthfulness. Colour, size and position of the warnings on the labels affected attentional capture. Results from the experiments provide recommendations for the design of warnings to identify products with unfavourable nutrient profile.

  12. Towards more transparent and reproducible omics studies through a common metadata checklist and data publications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolker, Eugene; Ozdemir, Vural; Martens, Lennart

    Biological processes are fundamentally driven by complex interactions between biomolecules. Integrated high-throughput omics studies enable multifaceted views of cells, organisms, or their communities. With the advent of new post-genomics technologies, omics studies are becoming increasingly prevalent; yet the full impact of these studies can only be realized through data harmonization, sharing, meta-analysis, and integrated research. These essential steps require consistent generation, capture, and distribution of metadata. To ensure transparency, facilitate data harmonization, and maximize reproducibility and usability of life sciences studies, we propose a simple common omics metadata checklist. The proposed checklist is built on the rich ontologies and standards already in use by the life sciences community. The checklist will serve as a common denominator to guide experimental design, capture important parameters, and be used as a standard format for stand-alone data publications. This omics metadata checklist and data publications will create efficient linkages between omics data and knowledge-based life sciences innovation and, importantly, allow for appropriate attribution to data generators and infrastructure science builders in the post-genomics era. We ask that the life sciences community test the proposed omics metadata checklist and data publications and provide feedback for their use and improvement.

  13. Toward More Transparent and Reproducible Omics Studies Through a Common Metadata Checklist and Data Publications

    PubMed Central

    Özdemir, Vural; Martens, Lennart; Hancock, William; Anderson, Gordon; Anderson, Nathaniel; Aynacioglu, Sukru; Baranova, Ancha; Campagna, Shawn R.; Chen, Rui; Choiniere, John; Dearth, Stephen P.; Feng, Wu-Chun; Ferguson, Lynnette; Fox, Geoffrey; Frishman, Dmitrij; Grossman, Robert; Heath, Allison; Higdon, Roger; Hutz, Mara H.; Janko, Imre; Jiang, Lihua; Joshi, Sanjay; Kel, Alexander; Kemnitz, Joseph W.; Kohane, Isaac S.; Kolker, Natali; Lancet, Doron; Lee, Elaine; Li, Weizhong; Lisitsa, Andrey; Llerena, Adrian; MacNealy-Koch, Courtney; Marshall, Jean-Claude; Masuzzo, Paola; May, Amanda; Mias, George; Monroe, Matthew; Montague, Elizabeth; Mooney, Sean; Nesvizhskii, Alexey; Noronha, Santosh; Omenn, Gilbert; Rajasimha, Harsha; Ramamoorthy, Preveen; Sheehan, Jerry; Smarr, Larry; Smith, Charles V.; Smith, Todd; Snyder, Michael; Rapole, Srikanth; Srivastava, Sanjeeva; Stanberry, Larissa; Stewart, Elizabeth; Toppo, Stefano; Uetz, Peter; Verheggen, Kenneth; Voy, Brynn H.; Warnich, Louise; Wilhelm, Steven W.; Yandl, Gregory

    2014-01-01

    Abstract Biological processes are fundamentally driven by complex interactions between biomolecules. Integrated high-throughput omics studies enable multifaceted views of cells, organisms, or their communities. With the advent of new post-genomics technologies, omics studies are becoming increasingly prevalent; yet the full impact of these studies can only be realized through data harmonization, sharing, meta-analysis, and integrated research. These essential steps require consistent generation, capture, and distribution of metadata. To ensure transparency, facilitate data harmonization, and maximize reproducibility and usability of life sciences studies, we propose a simple common omics metadata checklist. The proposed checklist is built on the rich ontologies and standards already in use by the life sciences community. The checklist will serve as a common denominator to guide experimental design, capture important parameters, and be used as a standard format for stand-alone data publications. The omics metadata checklist and data publications will create efficient linkages between omics data and knowledge-based life sciences innovation and, importantly, allow for appropriate attribution to data generators and infrastructure science builders in the post-genomics era. We ask that the life sciences community test the proposed omics metadata checklist and data publications and provide feedback for their use and improvement. PMID:24456465
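    A metadata checklist of this kind lends itself to a simple machine-readable check. The sketch below is hypothetical: the field names are illustrative placeholders, not the actual checklist items defined by the community proposal.

    ```python
    # Hypothetical required fields for an omics data publication record;
    # the real checklist items come from the community proposal.
    REQUIRED_FIELDS = {
        "study_title", "organism", "omics_type",
        "instrument", "protocol_ref", "data_location", "contact",
    }

    def validate_metadata(record):
        """Return the required fields that are absent or empty in a record."""
        present = {k for k, v in record.items() if v}
        return REQUIRED_FIELDS - present

    record = {
        "study_title": "Example proteomics study",
        "organism": "Homo sapiens",
        "omics_type": "proteomics",
        "instrument": "Q Exactive",
        "data_location": "",   # empty values count as missing
    }
    missing = validate_metadata(record)
    ```

    Running such a check at submission time is one way a journal or repository could enforce the checklist before accepting a stand-alone data publication.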

  14. NASA Materials Related Lessons Learned

    NASA Technical Reports Server (NTRS)

    Garcia, Danny; Gill, Paul S.; Vaughan, William W.

    2003-01-01

    Lessons Learned have been the basis for our accomplishments throughout the ages. They have been passed down from father to son, mother to daughter, teacher to pupil, and older to younger worker. Lessons Learned have also been the basis for the nation's accomplishments for more than 200 years. Both government and industry have long recognized the need to systematically document and utilize the knowledge gained from past experiences in order to avoid the repetition of failures and mishaps. Through the knowledge captured and recorded in Lessons Learned from more than 80 years of flight in the Earth's atmosphere, NASA's materials researchers are constantly working to develop stronger, lighter, and more durable materials that can withstand the challenges of space. The Agency's talented materials engineers and scientists continue to build on that rich tradition by using the knowledge and wisdom gained from past experiences to create futuristic materials and technologies that will be used in the next generation of advanced spacecraft and satellites that may one day enable mankind to land men on another planet or explore our nearest star. These same materials may also have application here on Earth to make commercial aircraft more economical to build and fly. With the explosion in technical accomplishments over the last decade, the ability to capture knowledge and have the capability to rapidly communicate this knowledge at lightning speed throughout an organization like NASA has become critical. Use of Lessons Learned is a principal component of an organizational culture committed to continuous improvement.

  15. NASA Materials Related Lessons Learned

    NASA Technical Reports Server (NTRS)

    Garcia, Danny; Gill, Paul S.; Vaughan, William W.; Parker, Nelson C. (Technical Monitor)

    2002-01-01

    Lessons Learned have been the basis for our accomplishments throughout the ages. They have been passed down from father to son, mother to daughter, teacher to pupil, and older to younger worker. Lessons Learned have also been the basis for the nation's accomplishments for more than 200 years. Both government and industry have long recognized the need to systematically document and utilize the knowledge gained from past experiences in order to avoid the repetition of failures and mishaps. Through the knowledge captured and recorded in Lessons Learned from more than 80 years of flight in the Earth's atmosphere, NASA's materials researchers are constantly working to develop stronger, lighter, and more durable materials that can withstand the challenges of space. The Agency's talented materials engineers and scientists continue to build on that rich tradition by using the knowledge and wisdom gained from past experiences to create futuristic materials and technologies that will be used in the next generation of advanced spacecraft and satellites that may one day enable mankind to land men on another planet or explore our nearest star. These same materials may also have application here on Earth to make commercial aircraft more economical to build and fly. With the explosion in technical accomplishments over the last decade, the ability to capture knowledge and have the capability to rapidly communicate this knowledge at lightning speed throughout an organization like NASA has become critical. Use of Lessons Learned is a principal component of an organizational culture committed to continuous improvement.

  16. Dialogue-Games: Meta-Communication Structures for Natural Language Interaction

    DTIC Science & Technology

    1977-01-01

    Dialogue-games, as described here, are not necessarily competitive, consciously pursued, or zero-sum. Recurring patterns in natural language interaction have been represented by a set of knowledge structures called Dialogue-games, capturing shared conventional knowledge. James A. Levin; James A. Moore. ARPA Order No. 2930, NR 134 374, ISI/RR-77-53, January 1977.

  17. Quantum Chemical Topology: Knowledgeable atoms in peptides

    NASA Astrophysics Data System (ADS)

    Popelier, Paul L. A.

    2012-06-01

    The need to improve atomistic biomolecular force fields remains acute. Fortunately, the abundance of contemporary computing power enables an overhaul of the architecture of current force fields, which typically base their electrostatics on fixed atomic partial charges. We discuss the principles behind the electrostatics of a more realistic force field under construction, called QCTFF. At the heart of QCTFF lies the so-called topological atom, which is a malleable box, whose shape and electrostatics changes in response to a changing environment. This response is captured by a machine learning method called Kriging. Kriging directly predicts each multipole moment of a given atom (i.e. the output) from the coordinates of the nuclei surrounding this atom (i.e. the input). This procedure yields accurate interatomic electrostatic energies, which form the basis for future-proof progress in force field design.
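    The Kriging step the abstract describes — predicting an atomic multipole moment (output) from the coordinates of surrounding nuclei (input) — is, at its core, Gaussian-process regression. A minimal sketch under stated assumptions: the kernel choice, feature layout, and training data below are illustrative placeholders, not the actual QCTFF models.

    ```python
    import numpy as np

    def rbf_kernel(A, B, length_scale=1.0):
        """Squared-exponential kernel between rows of A and rows of B."""
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * length_scale ** 2))

    def kriging_predict(X_train, y_train, X_new, length_scale=1.0, noise=1e-8):
        """Simple Kriging (GP regression) mean predictor: inputs are
        flattened neighbour coordinates, output a scalar moment."""
        K = rbf_kernel(X_train, X_train, length_scale)
        K += noise * np.eye(len(X_train))        # jitter for stability
        weights = np.linalg.solve(K, y_train)    # K^-1 y
        return rbf_kernel(X_new, X_train, length_scale) @ weights

    # Hypothetical training set: 20 geometries, 6 neighbour coordinates each,
    # one scalar "moment" per geometry (synthetic, for illustration only).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(20, 6))
    y = np.sin(X.sum(axis=1))
    pred = kriging_predict(X, y, X[:3])  # querying training points
    ```

    With near-zero noise the predictor interpolates the training data, which is the property that lets each trained model reproduce an atom's multipole moments for geometries it has seen and generalise smoothly nearby.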

  18. Measurement of Flow Patterns and Dispersion in the Human Airways

    NASA Astrophysics Data System (ADS)

    Fresconi, Frank E.; Prasad, Ajay K.

    2006-03-01

    A detailed knowledge of the flow and dispersion within the human respiratory tract is desirable for numerous reasons. Both risk assessments of exposure to toxic particles in the environment and the design of medical delivery systems targeting both lung-specific conditions (asthma, cystic fibrosis, and chronic obstructive pulmonary disease (COPD)) and system-wide ailments (diabetes, cancer, hormone replacement) would profit from such an understanding. The present work features experimental efforts aimed at elucidating the fluid mechanics of the lung. Particle image velocimetry (PIV) and laser induced fluorescence (LIF) measurements of oscillatory flows were undertaken in anatomically accurate models (single and multi-generational) of the conductive region of the lung. PIV results captured primary and secondary velocity fields. LIF was used to determine the amount of convective dispersion across an individual generation of the lung.

  19. “Test me and treat me”—attitudes to vitamin D deficiency and supplementation: a qualitative study

    PubMed Central

    Kotta, Siddharth; Gadhvi, Dev; Jakeways, Niki; Saeed, Maryum; Sohanpal, Ratna; Hull, Sally; Famakin, Olufunke; Martineau, Adrian; Griffiths, Chris

    2015-01-01

    Objective Lay interest in vitamin D and the potential benefits of supplementation is considerable, but little information exists concerning lay knowledge, beliefs and attitudes towards vitamin D to inform public health initiatives and professional guidance. Design Qualitative focus group study. Participants 58 adults capturing diversity in disease status, gender, age and ethnicity. Setting A large general practice in east London. Results Many respondents lacked knowledge about vitamin D, including dietary sources and government recommendations. Most were positive about sun exposure, but confused by ambiguous health messages about risks and benefits of sunshine. Medicalised views of vitamin D were prominent, notably from those in favour of supplementation, who talked of “doses”, “side effects” and “regular testing.” Fortification of food with vitamin D was controversial, with opposing utilitarian (better overall for the majority) and libertarian (freedom to choose) views. Conclusions Knowledge about vitamin D was limited. Clearer messages are needed about risks and benefits of sun exposure. Testing and supplementation by health professionals, while potentially useful in some high-risk groups, have contributed to a medicalised view of vitamin D. Health policy should address the public's need for clear information on sources and effects of vitamin D, including risks and benefits of sun exposure, and take account of divergent views on fortification. Professional guidance is needed on testing and supplementation to counter inappropriate medicalisation. PMID:26173717

  20. Developing a Cyberinfrastructure for integrated assessments of environmental contaminants.

    PubMed

    Kaur, Taranjit; Singh, Jatinder; Goodale, Wing M; Kramar, David; Nelson, Peter

    2005-03-01

    The objective of this study was to design and implement prototype software for capturing field data and automating the process for reporting and analyzing the distribution of mercury. The four phase process used to design, develop, deploy and evaluate the prototype software is described. Two different development strategies were used: (1) design of a mobile data collection application intended to capture field data in a meaningful format and automate transfer into user databases, followed by (2) a re-engineering of the original software to develop an integrated database environment with improved methods for aggregating and sharing data. Results demonstrated that innovative use of commercially available hardware and software components can lead to the development of an end-to-end digital cyberinfrastructure that captures, records, stores, transmits, compiles and integrates multi-source data as it relates to mercury.
